PC 4GB RAM limit

  • Thread starter: Tim Anderson
Mxsmanic said:
David Maynard writes:

Mainly because disk and network delays prevent it.

That sentence doesn't make any sense no matter what one thinks it might mean.
The vast majority of computer users are not encoding video.

You must be a real busy beaver, taking all these polls of "the vast
majority."

Not to mention that, even if it were true, encoding video, regardless of
how few or many do it, is still proof that your claim of processor power
being 'all consumed by the GUI' is simply not true.
The market is limited largely to adolescent boys; most other computer
users (including the vast majority of women) don't play video games with
any significant frequency.

More polls again?

Still shows the processor power is not all consumed by the GUI.
The fact that things like DirectX are needed to even get games to run
demonstrates how completely they exhaust available horsepower, no matter
how high that horsepower is.

Utterly nonsensical. Needing graphics support for graphics isn't 'proof' of
anything.
However, machines that are woefully
inadequate for gaming are often generously dimensioned for just about
any other type of application short of weather prediction or nuclear
simulations.

A low power computer might well be fine for light web/office applications
but that says nothing at all about the GUI consuming all the processor
power of more powerful machines. It simply means one doesn't need a lot of
power for low power applications. And I tend to agree.
Most people can resolve considerably less than 100 ms.

Actually, most people can't resolve a single event to that degree. They can
'detect' its occurrence (visual motion detection, like a 'whiz by': "what was
that?") but not resolve the timing.

On average, humans have a reaction time of 0.25 seconds to a visual
stimulus, 0.17 for an audio stimulus, and 0.15 seconds for a touch stimulus.
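
For what it's worth, one can get a rough feel for those numbers with a
trivial reaction-time test. A minimal sketch in Python (standard library
only; the prompt text and delay range are arbitrary, and keyboard/terminal
latency adds some tens of milliseconds, so treat the result as an upper
bound):

import random
import time

# Wait a random interval so the "GO!" prompt can't be anticipated.
time.sleep(random.uniform(2.0, 5.0))
start = time.perf_counter()
input("GO! Press Enter: ")
elapsed = time.perf_counter() - start
print(f"Visual reaction time: {elapsed:.3f} s")  # typically around 0.25 s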

Others generally don't care.

Simply not true, as the giga mega hunka boat load of theme and GUI
enhancement download sites attest.

Tell Stardock that no one cares.
These programs are not just doing I/O to load; they are doing I/O
constantly.

I have no idea what 'these programs' are but common applications are not
doing "I/O constantly," except for the ones you claim "the vast majority"
don't do, like video editing.

It's not shared by geeks, but it is shared by end users, and they are a
much larger part of the user community.

No offense, but I seriously doubt you have all that good a handle on what
the 'user community' thinks about bells, whistles, games, HTPC, PVRs, video
and the rest. For one, there are just too many of them, with varying needs,
to talk about 'the user community' as if it were some monolithic,
single-minded block that 'thinks like you do'.

The processor is usually the strongest link in the chain, even though it
spends most of its time waiting for memory modules to respond.

Now it's memory. Sooner or later you're going to rail against every
component, aren't you?
But this becomes a problem if we don't have to wait for network and disk
I/O. Then we end up waiting for the bells and whistles.

I don't. Why do you?
That's not an appropriate analogy,

It's certainly as inappropriate as yours are.
but you can accurately say that cars
are effectively no faster than 50 years ago because it still takes the
same amount of time to get to work in them.

No, but it's a good example of another appropriately inappropriate measure
because even if getting to work is 'no faster' it has nothing to do with
the *car*.

And the REASON these things matter is if one has even the slightest
inclination to improve any of these supposed 'problems' you're not going to
get anywhere trying to 'fix' the wrong bloody thing.
Much of what modern, well-written applications require is within the
capabilities of a 286. The 286 is indeed slower but an application
written for maximum efficiency on a 286 might well match a bloated
application on a modern system in terms of response time.

As long as you strip out features that, contrary to your exhaustive
polling, many users want, and restrict it to simple mundane tasks, such as
your 'text editing' (without auto correction, images, fancy formatting,
version tracking, contextual help, hot fax, email, and the rest), and
remove firewall protection, and graphics support, and antivirus
protection, and automatic system repair, and change history tracking, and
file journalling, and file system security, multi-user support, internet,
auto update, and on and on, then yes. But then you no longer have a modern
PC with modern apps.

You're simply barking up the wrong tree with 286 comparisons because I used
to *have* an 8 MHz 286 at work, my notebook was a 12 MHz 286 (so I used it
instead of the office machine most of the time), and a 16 MHz 286 at home;
and I still have the apps. So don't tell ME a modern computer isn't any
'faster', more useful, more convenient, easier to use, or any other nonsense.
From five MHz to 3200 MHz, plus optimizations that increase the number
of instructions that can be executed on average per clock cycle.

'The processor' is not 'the computer'.
Anything that requires computing power.

Same silliness.
I'd expect argument without personal attacks from the computer literate.
But some of them are angry young males.

Then don't suggest that processor speed is the sole determinant of system
speed when you know darn good and well it isn't.
 
Mxsmanic said:
David Maynard writes:

They are trivial compared to 1000000:1.

Which is, again, an appropriately inappropriate comparison.

You can pound the table all day long, you can threaten the designers, you
can pass laws, you can gather an army and have a revolution but it won't
help one damn bit because silicon manufacturing processes DO NOT APPLY to
mechanical devices.

There's simply no point to 'complaining' that disk drives have become
'only' a thousand times faster and a hundred thousand times less
expensive. And even less of a point in using processor speed as the
basis of the 'complaint'.
If other requirements expand faster than disk drives improve, then disk
drives become a bottleneck.

Yes, everything would be faster if everything were faster. It still isn't
_slower_.

Yes, but what most people overlook is that many systems are heavily
dependent on those mechanical devices, and so they are often
bottlenecks.

If you want to inform 'most people' about the impact of disk speed on
system performance then fine, but inappropriate and misleading statements,
like you've been making, aren't helpful, and the people you claim need the
information are precisely the ones who will be misled.
 
Mxsmanic said:
David Maynard writes:

The context you snipped dealt with hardware, in particular mainframe cost
of manufacture and, in particular, the cost of dedicated I/O processors and
other hardware architectural features (in the context of PCs using memory
mapped I/O being "really stupid" because "mainframes have done it [I/O
processors] for decades") and, in that context, you said "UNIX is close to a
mainframe system, though."

UNIX is not hardware.


UNIX has traditionally run on minicomputers, so UNIX systems are not
mainframes (and at least in the past they were not PCs, either). Saying
UNIX in a historical hardware context thus implies minicomputer hardware
like the PDPs.

And so? Still isn't hardware. All you added is it isn't 'mainframe' either.
 
Mxsmanic said:
David Maynard writes:

That is still largely true today. An important limiting factor on PC
sales today is the fact that many people just don't want a PC.

I had intended to explain that quote later but I had no idea the
explanation would be so convenient.

From Olsen himself:

"[That interpretation of my comment] is, of course, ridiculous because the
business we were in was making PCs, and almost from the start I had them at
home and my wife played Scrabble with time-sharing machines, and my
sixth-grade son was networking the MIT computers and the DEC computers
together, hopefully without doing mischief, using the computers I had at
home. Home computers were a natural continuum of the "personal computers"
that people had at work, in the laboratory, in the military. "

Olsen was talking about the popular sci-fi notion that computers
would/should 'control' the home and everything in it, including the people.
 
Conor said:
You need a faster graphics card if you can watch it happen.

Only on initial loading. I run Windowblinds. Once it's up and in RAM, it
doesn't affect the speed at all.

OMG. bells and whistles
 
David said:
The context you snipped dealt with hardware, in particular mainframe cost
of manufacture and, in particular, the cost of dedicated I/O processors and
other hardware architectural features (in the context of PCs using memory
mapped I/O being "really stupid" because "mainframes have done it [I/O
processors] for decades") and, in that context, you said "UNIX is close to a
mainframe system, though."

UNIX is not hardware.

UNIX has traditionally run on minicomputers, so UNIX systems are not
mainframes (and at least in the past they were not PCs, either). Saying
UNIX in a historical hardware context thus implies minicomputer hardware
like the PDPs.


Unix has been running on million dollar mainframe configurations for
about 10 years. It's featured on IBM's top of the line z/390 these
days. Look at SGI, also.
 
That is still largely true today. An important limiting factor on PC
sales today is the fact that many people just don't want a PC.


Most don't "want" their own 21 inch TV, either. One or two per
household is a huge market.

BTW: the famous Olsen quote is out of context according to Schein (who
worked for Olsen for about 30 years) in his recent book _DEC Is Dead,
Long Live DEC_. What Olsen said was that people didn't want PCs doing
stupid things like keeping track of what's in your refrigerator. That
being said, he did effectively veto product proposals that might have
beaten Compaq at its own game (Compaq did not yet exist when Olsen made
this quote.)

(Don't argue with me about this. I'm only quoting the book)
 
Well, a 5 megapixel digital still camera produces compressed JPEG images in
the range of 2 to 4 MBytes at the 'super fine' quality level, or RAW images
of about 10 MBytes. No two are the same size. It is possible to delete
images from the memory card 'out of order', leaving 'holes'. Write speed
between the camera's fixed memory and the storage card is important to
prevent delays between successive images after the internal buffer is full.

I admit that perhaps I am being overenthusiastic about defragging (my latest
card has four times the capacity of my previous purchases) ... it may not
really be much of an issue:
- defragging a camera memory card would only come into play if you did
not move all images to a PC before reinserting the card
- fragmented file space may not be a factor in the way the camera stores
each successive image (the file system is FAT32); a toy sketch below
illustrates the 'holes' point
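
The sketch is a hypothetical simulation of first-fit, cluster-at-a-time
allocation in the spirit of FAT; the real FAT32 allocator, and the
camera's, may behave differently:

# Toy card with 16 fixed-size clusters; None marks a free cluster.
card = [None] * 16

def allocate(name, n_clusters):
    """First-fit: grab the first free clusters, one at a time."""
    placed = 0
    for i, c in enumerate(card):
        if c is None and placed < n_clusters:
            card[i] = name
            placed += 1

for name, size in [("img1", 4), ("img2", 3), ("img3", 4)]:
    allocate(name, size)
card[:] = [None if c == "img2" else c for c in card]  # delete out of order
allocate("img4", 5)  # fills img2's hole, then spills past img3
print(card)          # img4 ends up split across two non-contiguous runs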

It is difficult to find much information on how well Compact Flash cards
perform ... many brands don't even provide the write rate.

Though I may ultimately find that my use pattern doesn't require defragging,
I don't consider the capability a 'frill'. And ultimately I have the choice
of whether to use it or not. I see that the largest Compact Flash capacity
has reached 12 GBytes (at a REALLY high price). I'm considering the
possibility of using my older Compact Flash cards for backup of data more
compact than images. Sooner or later I'm sure I WILL use the defragging
function. And if I don't, a few extra programs, totaling 3 or so MBytes,
stored on a hard drive doesn't cost much.

Phil Weldon
 
They are trivial compared to 1000000:1.

I think you are referring to a claim that CPUs are 1000000x
faster. Wrong.

I just came across my 1992 specs for the then-new DEC model 10000AXP,
featuring the new 64 bit Alpha CPU chip. It runs at 200 MHz. (This
machine could be purchased with as much as 2 GB of memory.)

Today's 64 bit chips top out at 2-3 GHz (10-15x), and HT and the new
dual core processors prove that there will not be a major increase in
clock speed soon.


SPECint92: 12 (200 MHz version)
SPECint2000 for the Opteron: 1,200

A factor of 100.


OTOH, a 2 GB mainframe CPU configuration by itself might cost $500k in 1993,
while you can buy an AMD CPU, 2 GB of RAM, and a mobo and chassis for
about $500.

A factor of a thousand.
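
Back-of-the-envelope, using the figures above (a sketch only; SPECint92
and SPECint2000 are scored against different reference machines, so the
100x is rough at best):

specint92_alpha = 12          # DEC 10000AXP, 200 MHz
specint2000_opteron = 1200    # Opteron
speed_factor = specint2000_opteron / specint92_alpha   # ~100x

price_1993 = 500_000          # ~$500k mainframe-class config, 1993
price_today = 500             # AMD CPU + 2 GB RAM + mobo + chassis
price_factor = price_1993 / price_today                # ~1000x

print(f"speed ~{speed_factor:.0f}x, price ~{price_factor:.0f}x")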

What has really changed is that we can all own an AMD system if we
need to, so just like disks, the "speed" is a factor of all of us having
our own and not needing to share.
 
Mxsmanic said:
The graphics card is not what slows the process.
So what does? I'm running Windowblinds with DesktopX. If anything is
going to slow down the GUI, that's it.
 
Al said:
Unix has been running on million dollar mainframe configurations for
about 10 years. It's featured on IBM's top of the line z/390 these
days. Look at SGI, also.

Yes, but it's not really a mainframe OS.
 
So what does? I'm running Windowblinds with DesktopX. If anything is
going to slow down the GUI, that's it.

A severely misconfigured card/driver might cause it, but even
then it's doubtful; any 8 year old card can manage over
30 FPS in 2D. In default configurations it might simply be
Windows' GUI effects, which deliberately slow it down to
the point of it being visually obvious, which was the whole
point of the effect.
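
Those deliberate effects can be queried and switched off
programmatically. A minimal Windows-only sketch using Python's
ctypes and the Win32 SystemParametersInfo call, with menu fade
as the example effect (a sketch, not a tuning guide; other
effects have their own SPI_* constants):

import ctypes

SPI_GETMENUFADE = 0x1012   # constants from winuser.h
SPI_SETMENUFADE = 0x1013
SPIF_SENDCHANGE = 0x0002

user32 = ctypes.windll.user32

# Query whether the menu-fade effect is currently enabled.
enabled = ctypes.c_int(0)
user32.SystemParametersInfoW(SPI_GETMENUFADE, 0, ctypes.byref(enabled), 0)
print("menu fade enabled:", bool(enabled.value))

# Pass 0 as pvParam to turn the fade off (the deliberate slow-down above).
user32.SystemParametersInfoW(SPI_SETMENUFADE, 0, 0, SPIF_SENDCHANGE)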
 
So: Which is the weakest link here? Answer: Disk access times, all
else being equal. Worse yet, modern applications do a lot more disk I/O
than old applications did.

If we were to step back and look at the real limitation, the
bottleneck is now the user, not the disk. How are you going
to upgrade the user on your system?

Even so, I agree that for simple, common tasks they "can" be
disk bound, if we ignore caching. Even then they may be
somewhat disk bound, BUT on these simple tasks, it's not
always necessary for the performance to increase more than
the user can perceive it, except with excessive software
bloat scenarios, and even then, caching helps.
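
The caching effect is easy to see directly: read the same file
twice, and the second pass comes from the OS cache. A minimal
Python sketch (the filename is hypothetical, and a true "cold"
first read assumes the file isn't already cached from earlier
activity):

import time

path = "bigfile.bin"  # hypothetical large file on disk
for label in ("cold (disk)", "warm (cache)"):
    t0 = time.perf_counter()
    with open(path, "rb") as f:
        f.read()
    print(label, round(time.perf_counter() - t0, 3), "s")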


No, it's everything. Clicking on a link in MSIE just now, for example,
required more than 100 disk reads and just under 60 disk writes. The
rest of the delay was rendering time, which is mainly compute-bound.

Even if that link was to a local resource instead of on the
internet, the bottleneck was as likely to be the network
speed (or server latencies). Even so, previously I did
mention use of a ramdisk; you can use one for browser
caching if you choose to do so. As for the remaining reads,
a large portion may have been from the registry for user
configuration information, leading one to think that if you
dislike IE you can simply choose another, more secure browser
with a behavior you like more, IF you can accept a change
from the user-preferences standpoint, and if you can't,
you somewhat have to accept that IE has supported your user
config by making those I/Os.
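
Anyone who wants to reproduce that kind of read/write count can
do so. A minimal sketch assuming Python with the third-party
psutil package; the counters are system-wide, so quiesce other
activity first, and the sleep stands in for whatever action
(like the link click) is being measured:

import time
import psutil

before = psutil.disk_io_counters()
time.sleep(10)  # perform the action being measured during this window
after = psutil.disk_io_counters()

print("disk reads: ", after.read_count - before.read_count)
print("disk writes:", after.write_count - before.write_count)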
 
That is still largely true today. An important limiting factor on PC
sales today is the fact that many people just don't want a PC.


I think it far more likely to be the cost and the learning
curve involved. It's exceedingly rare when I hear someone
claim they don't "want a PC", compared to one of several
other reasons they don't currently have one such as space,
time available to use it, or that they actually DO have one
but it's partially broken and non-operational.
 
Most people still don't require them.

Most people don't technically "require" electric can
openers, microwaves, or more than one pair of shoes either.
"Require" is a pointless word to use in this context, it may
even be inhumane to expect someone in a modern society to
only have in their possessions what they "require" to stay
alive.

If they needed mobile access that badly, they'd all have laptops and
WiFi by now. But they don't.

"All"?
There will always be late-adopters and some that don't adopt
newer tech so long as older fits their needs. Again we're
back to "need"... "need" can also be definited as it relates
to "want". In fact some (if not most) dictionaries do so.

Many don't have wifi simply because they're waiting for the
next upgrade cycle to incorporate it... it hasn't been
cost-effective for many years relative to most other
consumer products.

It could also be that they're waiting for exactly the
type(s) of devices I mentioned, a fuller featured device
that isn't as costly and large/heavy as a laptop. Such
current and past technology is still, like all early and
less-evolved technology, less desirable the less mature it
is. Quite a few products were this way; people waited until
the product was suited to their needs, which is always a
matter of designers better knowing what those needs are,
which requires time and studies... the moment a given
technology is available and possible to mass-produce in a
cost effective manner, there is still going to be a lag in
integrating that tech into products consumers want.

Again, similar to other devices consumers already DO buy.
Not 100% of consumers, I don't recall anyone claiming 100%
(or anywhere near that) adoption rates.

Hard for a lot of people. Ask the average non-geek on the street.

Ask them what? Whether they could use a laptop?
I'll bet quite a few would answer "yes".
Some might not yet have a laptop, but if you had that laptop
with you and ran a trial to see if those who are
computer-literate (now a very large percentage of the modern
world's populace) could use it, again I suspect most if not
all of those would be able to.

Anything one has not learned yet might be considered "hard",
but as many learned to ride a bike, use a fork and spoon,
etc., so they learn everything... what separates humans from
apes.

Tons, but their penetration is still very light, except for mobile
phones, and mobile phones are not being used for computing by average
users.

They are being used for data retrieval (phone numbers), text
messaging, movement of audio data streams and pictures.
They have much more in common with a computer than
preceding phones, but due to the intentional mirroring of
design and ergonomics into a familiar user interface,
they're computers that have a gentle learning curve beyond
the basic functions familiar from the old analog telephone.

A cellphone is not an exception in mobile devices; it is a
computer quite fit for its designed tasks.

The distinction between those who would and could use new
mobile devices has more to do with their willingness to
learn them and cost. If someone simply 'decides' they won't
use one as you seem to have done, then of course they won't.
Similarly you could choose not to use a fork and spoon to
eat.

Again, similar arguments have been made in the past, yet
the devices were made, and it was (in general) profitable to do
so, as people DO adopt these technologies and will continue to
do so.

I have a lot of experience with real-world users, as opposed to geeks,
and they are worlds apart. Geeks see the world through very distorting
glasses, and what they consider "essential" and "normal" is often
completely unknown to everyone else.


Very few people don't. The only people who use computers when they
don't have to are geeks. Almost no one is interested in computers for
their own sake.

More due to the benefits (or lack thereof). That's part of
what mobile computing is all about: providing the benefits
of smaller devices that can be used without being tethered
to a desk, being able to more transparently integrate the
interface to information and communication into other, more
traditional tasks.

You cite one moment in time, as if what particular users do
with a computer "right now" is some evidence about
the future. In the past, even fewer people used desktop
systems. So it will be with any new devices; there will not
be a 100% adoption rate among those that will adopt things,
and for those that do, it will be a progressive adoption
among different types of users. Just like with any other
technology, it happens over the course of time, not as a binary
"true" or "false".
 
Conor said:
Not just bells and whistles but flags and a fog horn too.

Hehe.

Yes, I've played with the 'free' Stardock versions but they generally crash
my win98 boxen.
 
kony said:
A severely misconfigured card/driver might cause it, but even
then it's doubtful; any 8 year old card can manage over
30 FPS in 2D. In default configurations it might simply be
Windows' GUI effects, which deliberately slow it down to
the point of it being visually obvious, which was the whole
point of the effect.

If memory serves that really was at least part of the motive behind
'scroll' and 'fade' presentation as an instantaneous 'POW' popup can be
startling and mildly disturbing.
 
kony said:
Most people don't technically "require" electric can
openers, microwaves, or more than one pair of shoes either.
"Require" is a pointless word to use in this context, it may
even be inhumane to expect someone in a modern society to
only have in their possessions what they "require" to stay
alive.

It's even more fundamental because there's no way to predict what
'unneeded' thing will lead to improvements in the 'necessary'. The first
question asked at Bell's telephone demonstration was "of what use is it?"
and, at the time, it could be said that no one 'needed' a telephone,
especially when one can't even envision a use for it, but being able to
call a doctor when ill, even if nothing else, can be a life saving thing.
It could be said that no one really 'needed' radio, but radio technology
enables remote medical monitoring, not to mention being able to contact a
doctor from places where there are no land lines, or from ships at sea that
strike icebergs. Same thing for television technology, which enables all
manner of medical scanning devices, remote surgery, and training.

And those are just examples of 'critical' needs, saving lives, but, after
all, other than propagating the species, what's the purpose of living if
not to enjoy it?
 