Reaching Limits

Dave

For processors, the GHz limit has been reached, assuming one feels
heat is bad, and I have a feeling that Intel will run multi-core
technology down a rabbit hole the same as they did with upping the
processor speed.

Your assumption is based on silicon wafer technology to produce
microprocessors. The industry is always looking for alternatives, and
several have already been produced/discovered. WHICH ONE will replace
silicon is still undecided, afaik. But when that replacement hits the
fab, the GHz limit will be greatly extended, I'm sure.

I'm guessing that the only reason silicon is still the norm is that the
major chip manufacturers are milking it for all it's worth. That, and
a new fab costs billions to set up, so the silicon needs to pay for
itself before it can be replaced. :)

For displays, 20-inch and 22-inch LCDs are pretty cheap now, so where
does the industry go from here? I sit about 2.5 to 3 feet from my
monitor, but a 24-incher would be too large and a 24+ incher would
definitely be too large.


Careful. I've posted this before, but I'll have to post it again. I
think the biggest scam in computer hardware right now is the way
monitors are spec'd for size. The trend is widescreen, and there's
nothing wrong with that; the widescreen aspect ratio is a good thing.
HOWEVER, most monitor buyers still don't realize that a widescreen
monitor is smaller than an old-style 4:3 monitor with the same quoted
diagonal. That's because the viewing area is squashed vertically (the
Y axis is really short). So while the price of monitors is always
going down, it is not going down as fast as the numbers alone would
suggest. The 24" monitor you mention is probably widescreen, so in
height it is comparable to a standard monitor of 19 or 20". Both are
cheap, so it seems that monitors are cheap when you can buy a 24"
monitor for a low price. But you need to keep it in perspective. That
is, to replace an older 24" 4:3 monitor, you'd have to look at
something like a 32" or larger widescreen monitor, which is not as
cheap. If you just replace a 24" monitor with a 24" widescreen, it
will be pretty cheap, but you will likely feel that you've downgraded
in size. -Dave
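
For anyone who wants to check the numbers, here is a quick
back-of-the-envelope sketch (plain Python, nothing vendor-specific) that
turns a quoted diagonal and aspect ratio into viewable width, height, and
area, using the 24" widescreen vs. 20" 4:3 comparison above as the example:

import math

def panel_dimensions(diagonal_in, aspect_w, aspect_h):
    # Return (width, height, area) in inches / square inches.
    scale = diagonal_in / math.hypot(aspect_w, aspect_h)
    width, height = aspect_w * scale, aspect_h * scale
    return width, height, width * height

# Compare a 24" 16:10 widescreen against an old-style 20" 4:3 panel.
for label, diag, w, h in [('24" 16:10', 24, 16, 10), ('20" 4:3  ', 20, 4, 3)]:
    width, height, area = panel_dimensions(diag, w, h)
    print(f'{label}: {width:.1f}" wide x {height:.1f}" tall, {area:.0f} sq in')

# Roughly: the 24" widescreen is about 12.7" tall and 259 sq in, while the
# 20" 4:3 is about 12.0" tall and 192 sq in, so their heights really are
# comparable even though the quoted diagonals differ by four inches.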
 
geoff

For processors, the GHz limit has been reached, assuming one feels heat is
bad, and I have a feeling that Intel will run multi-core technology down
a rabbit hole the same as they did with upping the processor speed.

For displays, 20-inch and 22-inch LCDs are pretty cheap now, so where does the
industry go from here? I sit about 2.5 to 3 feet from my monitor, but a
24-incher would be too large and a 24+ incher would definitely be too large.

Maybe sitting in an egg where the entire interior is a monitor, lol.

Someone here posted a website with pics of people's monitor setups. Most
had 3, 4, or 5 monitors, and in one pic a guy who likes flight simulators
had 5 monitors on the bottom row and 3 monitors on the top row.

However, I think most average users are not going to buy a bunch of monitors,
but the industry does not want to sit around either.

--g
 
Marty

For processors, the GHz limit has been reached, assuming one feels heat
is bad, and I have a feeling that Intel will run multi-core
technology down a rabbit hole the same as they did with upping the
processor speed.

For displays, 20-inch and 22-inch LCDs are pretty cheap now, so where does
the industry go from here? I sit about 2.5 to 3 feet from my monitor,
but a 24-incher would be too large and a 24+ incher would definitely be
too large.

I think a major challenge is to reduce power consumption. The coming few
years will see the cost of electricity rise significantly as governments
exploit the global warming thing for all its revenue potential.
 
John Doe

geoff said:
For processors, the GHz limit has been reached, assuming one feels
heat is bad

Engineers have run up against limits lots of times, only to surpass
them.
and I have a feeling that Intel will run
multi-core technology down a rabbit hole the same as they did with
upping the processor speed.

Please explain what the "rabbit hole" phrase means.

The multiple core CPU technology is excellent IMO.






--
My big wheel in-line street skates (a.k.a. rollerblades).
http://www.flickr.com/photos/27532210@N04/2565924423/

Google Groups is destroying the USENET archive, to hell with
Google.
 
Moe

geoff said:
For processors, the GHz limit has been reached, assuming one feels heat is
bad, and I have a feeling that Intel will run multi-core technology down
a rabbit hole the same as they did with upping the processor speed.

For displays, 20-inch and 22-inch LCDs are pretty cheap now, so where does the
industry go from here? I sit about 2.5 to 3 feet from my monitor, but a
24-incher would be too large and a 24+ incher would definitely be too large.

Maybe sitting in an egg where the entire interior is a monitor, lol.

Someone here posted a website with pics of people's monitor setups. Most
had 3, 4, or 5 monitors, and in one pic a guy who likes flight simulators
had 5 monitors on the bottom row and 3 monitors on the top row.

However, I think most average users are not going to buy a bunch of monitors,
but the industry does not want to sit around either.

--g
I use multiple monitors for the markets.
http://www.multiplemonitors.org/
 
geoff

I'm guessing that the only reason silicon is still the norm is that the
major chip manufacturers are milking it for all it's worth. That, and
a new fab costs billions to set up, so the silicon needs to pay for
itself before it can be replaced. :)

. . . that is exactly the article in Forbes Magazine where they plotted out
the profits versus the cost of new fab setups.

There was a point where the two lines crossed, meaning that economically there
was no way Intel could continue with the current technology and would have
to find something new.

The thing is, that article was written around the mid-'90s, and the crossover
point was about 2001.

. . . and yet, here we are, 10 years later, with the same technology. In the
'90s, Intel used to talk about bio-chips, etc. My feeling is that whatever is
new out there, Intel feels it is too risky to try since it would require
pumping a lot of money into it.

The same thing happened with Mazda when they bought the Wankel engine; it
nearly drove them into bankruptcy.

So, the question is, if Forbes' analysis was correct, how is Intel able to
maintain profits?

--g
 
Ed Medlin

geoff said:
. . . that is exactly the article in Forbes Magazine where they plotted
out the profits versus the cost of new fab setups.

There was a point where the two lines crossed meaning economically, there
is no way Intel could continue with the current technology and they would
have to find something new.

The thing is, that article was written around the mid 90s and the
crossover point was about 2001.

. . . and yet, here we are, 10 years later, same technology. In the 90s,
Intel used to talk about bio-chips, etc. My feeling is, whatever is new
out there, Intel feels it is too risky to try since it would require
pumping a lot of money into it.

The same thing happened with Mazda when they bought the Wankel engine; it
nearly drove them into bankruptcy.
The rotary engine is still going strong. The RX series is still Mazda's
flagship performance car. There is also a whole open-wheel racing series
based on the Mazda two-rotor RX-8 engine. Folks still like speed, even
though it costs money.
So, the question is, if Forbes analysis was correct, how is Intel able to
maintain profits?

--g

Folks still like speed and Intel, at least at this time, is providing it
cheaply. I agree that it is about time to get away from the x86
architecture, but THAT would cost a ton.


Ed
 
John Doe

how come?

explain

Who knows? Who cares? Maybe he views the computer while on one of
those chin rests like at an optometrist.

One main benefit of a big monitor is that you can move around
without losing focus. Like at a movie theater, it's a much more
relaxed viewing environment.
 
Flasherly

For processors, the GHz limit has been reached, assuming one feels heat is
bad, and I have a feeling that Intel will run multi-core technology down
a rabbit hole the same as they did with upping the processor speed.

For displays, 20-inch and 22-inch LCDs are pretty cheap now, so where does the
industry go from here? I sit about 2.5 to 3 feet from my monitor, but a
24-incher would be too large and a 24+ incher would definitely be too large.

Maybe sitting in an egg where the entire interior is a monitor, lol.

Someone here posted a website with pics of people's monitor setups. Most
had 3, 4, or 5 monitors, and in one pic a guy who likes flight simulators
had 5 monitors on the bottom row and 3 monitors on the top row.

However, I think most average users are not going to buy a bunch of monitors,
but the industry does not want to sit around either.

I haven't decided where to go on multi-core, if at all (a 4 GHz or
faster single core may appeal, if it's stable and not a scorcher), for my
next upgrade. After my first 32" TEEVEE (an Olevia HDTV), I will admit
that the -WOW- factor overwhelmed me, and when I chanced upon a 37" NEC
HDTV for $600 it didn't take much persuading to adapt the 32" into a
desktop monitor. I'm even getting by with AGP and SVGA connectors on
both (look, Ma, no HDMI), both running off PCs for PC multimedia content
as much as for whatever NTSC occasionally offers (no cable subscriptions
here). The old saying -- moving from CRTs to TFTs means no looking
back -- applies as much to my thinking: getting a bigger desk isn't a
serious consideration at a minimum of a 32" monitor.
 
geoff

Please explain what the "rabbit hole" phrase means.

Many people, from the context it is used in, feel it means unsuccessful.
However, more precisely, it means impractical.

After Alice fell down the rabbit hole, she saw many things that were
impractical and/or could not be mapped back to the real world.

With Intel, the processor worked, no issues; Intel upped the clock speed, and
still no issues. Eventually, the clock speed was so high that too much heat
was generated and the idea of 'upping the clock speed' became impractical.
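
As a rough, back-of-the-envelope illustration of why higher clocks hit a heat
wall, here is a small sketch using the textbook dynamic-power relation
P ~ C * V^2 * f. The capacitance and voltage figures below are made-up
illustrative numbers, not actual Intel specs:

def dynamic_power_watts(switched_capacitance_f, voltage_v, frequency_hz):
    # Classic CMOS dynamic power: P = C * V^2 * f
    return switched_capacitance_f * voltage_v**2 * frequency_hz

C = 35e-9  # hypothetical effective switched capacitance, in farads
for freq_ghz, volts in [(2.0, 1.20), (3.0, 1.30), (4.0, 1.45)]:
    watts = dynamic_power_watts(C, volts, freq_ghz * 1e9)
    print(f"{freq_ghz:.1f} GHz at {volts:.2f} V -> roughly {watts:.0f} W")

# Because higher clocks generally need higher voltage, power (and heat) grows
# much faster than linearly with frequency, which is why simply 'upping the
# clock speed' stopped being practical.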

One would think with their engineers and testing they would know this before
releasing a processor to the public for sale.

Fortunately, Intel had a skunk works in Israel that suggested the use of
2 cores. Now it is four and eight cores. A motherboard only has so much real
estate, and I assume Intel will run this technology down a rabbit hole as
well.

What will force them to change is when a new company (or AMD) switches to a
new technology that offers high end performance but low power consumption
and low heat, IMHO.

--g
 
John Doe

....

I appreciate the explanation for what "running it down a rabbit
hole" means to you.
Fortunately, Intel had a skunk works in Israel that
suggested the use of 2 cores. Now it is four and eight cores.

As far as the end user PC market goes, currently it's up to only
four cores.
A motherboard only has so much real estate, and I assume Intel will run this
technology down a rabbit hole as well.

My dual-core CPU is no bigger than my prior single-core CPU. They have
only just started the multiple-core stuff. Also, progress is continuously
being made on miniaturization and reducing power consumption.

The multiple core CPU thing is a major improvement. It's hardly a
time when we are reaching limits CPU wise. Then again, I guess that
depends partly on what kind of timeframe you're talking about.
 
Dave

geoff said:
. . . that is exactly the article in Forbes Magazine where they plotted
out the profits versus the cost of new fab setups.

There was a point where the two lines crossed meaning economically, there
is no way Intel could continue with the current technology and they would
have to find something new.

The thing is, that article was written around the mid 90s and the
crossover point was about 2001.

. . . and yet, here we are, 10 years later, same technology. In the 90s,
Intel used to talk about bio-chips, etc. My feeling is, whatever is new
out there, Intel feels it is too risky to try since it would require
pumping a lot of money into it.

The same thing happened with Mazda when they bought the Wankel engine; it
nearly drove them into bankruptcy.

So, the question is, if Forbes analysis was correct, how is Intel able to
maintain profits?

--g

OK, I can't comment on the Forbes article, as I haven't read it. However,
a fab is so expensive to build that it would have to be operated for
decades before a major overhaul... or before a whole new fab is built.

THAT was my point on why silicon is still used in microchips. I believe
that the silicon fabs still haven't paid for themselves, yet. -Dave
 
Fred

geoff said:
Many people, from the context it is used in, feel it means
unsuccessful. However, more precisely, it means impractical.

After Alice fell down the rabbit hole, she saw many things that were
impractical and/or could not be mapped back to the real world.

With Intel, the processor worked, no issues; Intel upped the clock
speed, and still no issues. Eventually, the clock speed was so high
that too much heat was generated and the idea of 'upping the clock
speed' became impractical.
One would think with their engineers and testing they would know this
before releasing a processor to the public for sale.

Fortunately, Intel had a skunk works in Israel that suggested
the use of 2 cores. Now it is four and eight cores. A motherboard only has
so much real estate, and I assume Intel will run this technology down
a rabbit hole as well.

The next-generation Intel chip, Nehalem, will be a scalable design. They are
moving to a 45 nm manufacturing process, bringing the memory controller
on-die, including a GPU in the CPU package for mobile computing, and using a
new pin count. Nehalem processors will have up to eight cores and 16 threads.
The follow-up to Nehalem, Westmere, should be a facelift: a die shrink and
new instructions.
What will force them to change is when a new company (or AMD)
switches to a new technology that offers high end performance but low
power consumption and low heat, IMHO.

Nehalem/Westmere is scheduled to be a relatively short-lived platform.
Its successor, Sandy Bridge, is already in development:
http://en.wikipedia.org/wiki/Sandy_Bridge_(microarchitecture)
 
geoff

24" is pure bliss!!!

The Dell I got, a 2408WFP, is way too much screen for normal work. I solved
it by switching down to 1600x1200 and by following the advice on
anandtech.com, which is to reduce the RGB values from 100 to 80.
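
For reference, the 2408WFP is a 24" panel with a native 1920x1200 (16:10)
mode, so a quick sketch of the pixel-density and aspect-ratio arithmetic
shows why a 1600x1200 (4:3) mode has to be scaled or pillarboxed on it (the
numbers here are plain geometry, not measured values):

import math

def ppi(h_pixels, v_pixels, diagonal_in):
    # Pixels per inch along the panel diagonal.
    return math.hypot(h_pixels, v_pixels) / diagonal_in

native = (1920, 1200)    # 2408WFP native mode, 24" diagonal
fallback = (1600, 1200)  # the non-native mode mentioned above

print(f"native pixel density:  {ppi(*native, 24):.0f} PPI")       # ~94 PPI
print(f"native aspect ratio:   {native[0] / native[1]:.2f}")      # 1.60 (16:10)
print(f"fallback aspect ratio: {fallback[0] / fallback[1]:.2f}")  # 1.33 (4:3)

# A 4:3 mode on a 16:10 panel is either stretched to fill the width or shown
# pillarboxed, and any non-native mode goes through the scaler, so text is a
# bit softer than at 1920x1200.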

At native resolution, it is too bright and I am probably a victim of pixel
walk. You can test this by going here:

http://www.lagom.nl/lcd-test/

. . . and selecting 'Inversion (pixel-walk)'. At native resolution, my
monitor flickers with panels 2a and 4a. My old Samsung flickered slightly
with panel 4a. I am using a DVI connection. Also, if I look away from the
monitor, I can see a flicker out of the corner of my eye when working.

With the resolution at 1600x1200, there is only a slight flicker with panel
4a, the same as my Samsung, and no flickering out of the corner of my eye.

I've read that Dell would build monitors for reviewers and then switch to
lower-quality panels for general sales, the so-called 'panel lottery'. Maybe
I am a victim of that; not sure.

I always crank up the resolution to native mode for games.

--g
 
