Pagefile Size

Citizen Bob

I have a 2.4 GHz Celeron D with 512 MB RAM running Win2K/SP4.

I am running compute-intensive graphics applications that are taking
100% of the "CPU Usage" and 730 MB "MEM Usage" in the Performance
section of Task Manager.

Currently I am running a 512MB/1.5GB pagefile. What should the correct
size be for this environment?


--

Government's view of the economy could be summed up in a
few short phrases: If it moves, tax it. If it keeps moving,
regulate it. And if it stops moving, subsidize it.
--Ronald Reagan
 
Rod Speed

Citizen Bob said:
I have a 2.4 GHz Celeron D with 512 MB RAM running Win2K/SP4.
I am running compute-intensive graphics applications that are
taking 100% of the "CPU Usage" and 730 MB "MEM Usage"
in the Performance section of Task Manager.

What matters is how typical that 730MB is with that machine.
Currently I am running a 512MB/1.5GB pagefile.
What should the correct size be for this environment?

There is no such animal. That size is a bit excessive given that 730MB,
but the only downside is a bit more hard drive space used.
 
Joel

I have a 2.4 GHz Celeron D with 512 MB RAM running Win2K/SP4.

I am running compute-intensive graphics applications that are taking
100% of the "CPU Usage" and 730 MB "MEM Usage" in the Performance
section of Task Manager.

Currently I am running a 512MB/1.5GB pagefile. What should the correct
size be for this environment?

You don't say what graphics program you use, but if you use something like
Photoshop then you may want to increase the memory to at least 2GB or so
(more if you wish, but going from 512MB to 2GB would make the hair on your
back stand up <g>).

Also, you may want to check Task Manager to see what is eating up the CPU and
then try to control it. But I would increase the memory, and make sure
there is plenty of free disk space for swapping.
 
Robert Heiling

Joel said:
You don't say what graphics program you use, but if you use something like
Photoshop then you may want to increase the memory to at least 2GB or so
(more if you wish, but going from 512MB to 2GB would make the hair on your
back stand up <g>).

Also, you may want to check Task Manager to see what is eating up the CPU and
then try to control it. But I would increase the memory, and make sure
there is plenty of free disk space for swapping.

Letting Win2k manage the size starting at its recommended minimum of 1132MB
would be better all around than what he's currently at if you consider the fact
that they must *know something* when they recommend that value. My own Win2k is
set for a fixed 4095MB (the max permitted) on each of the 2 drives in
consideration of the size of drives that are now available. That way I put it
all to bed and don't worry about the size and don't worry about any pagefile
fragmentation either. If I ever find that I need some of those 8GB for other
purposes, I'll know it's time for a larger drive or time to clean house. <g>

Bob
 
Joel

Robert Heiling said:
Letting Win2k manage the size starting at its recommended minimum of 1132MB
would be better all around than what he's currently at if you consider the fact
that they must *know something* when they recommend that value. My own Win2k is
set for a fixed 4095MB (the max permitted) on each of the 2 drives in
consideration of the size of drives that are now available. That way I put it
all to bed and don't worry about the size and don't worry about any pagefile
fragmentation either. If I ever find that I need some of those 8GB for other
purposes, I'll know it's time for a larger drive or time to clean house. <g>

Bob

I dunno, I have never used Win2K enough to know much about it. And I didn't say
anything about 4095MB or 8GB to agree or disagree with whatever you are
saying. I only suggested increasing the *memory* from 512MB to around 2GB to
enjoy the speed (when using a graphics program like Photoshop).

That's all I said, and I can't comment on things I didn't say.
 
kony

Letting Win2k manage the size starting at its recommended minimum of 1132MB
would be better all around than what he's currently at if you consider the fact
that they must *know something* when they recommend that value.

They don't actually "know something" as it applies here.
They have no way whatsoever to determine what the user might
be running; it is a random guess, and a several-years-old
guess at that, which didn't take into account more modern
programs. Since win2k itself fits in less than 70MB of memory,
it'll be entirely dependent on what the user is running.
 
kony

I have a 2.4 GHz Celeron D with 512 MB RAM running Win2K/SP4.

I am running compute-intensive graphics applications that are taking
100% of the "CPU Usage" and 730 MB "MEM Usage" in the Performance
section of Task Manager.

There is no "MEM Usage" figure on the Performance tab of
Task Manager. Which figure did you mean specifically?

The best answer is don't put a moment's thought into this
until you've added enough real memory such that your
regularly recurring Commit Charge "Peak" value
during/after the demanding tasks is a lower figure than
that real memory; ideally the figure will be significantly
lower, as the remainder can be put to use as a System Cache
that substantially speeds up the system by reducing HDD
access.

Currently I am running a 512MB/1.5GB pagefile. What should the correct
size be for this environment?

1024MB/1.5GB (needs more real memory)

If you get out-of-memory messages or errors, increase the
1.5GB to 2GB. If your particular applications are few in
number of concurrent tasks, but large in manipulated data
files, you will tend to need less memory than if you had far
more applications running with smaller data files to attain
that 730MB figure, IF it is the "Peak" figure I'd mentioned
above. Forget about the other figures; have enough real
memory to handle the peak with at least a couple hundred MB
(ideally more as budget allows) to spare. If you can have
1GB of memory devoted to that system cache, so much the
better when doing some kinds of work with large files
involving subsequent access beyond a mere linear read/write
process.
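
To put some very rough numbers on that heuristic, here is a small illustrative
sketch in Python (my own illustration, not anything from Task Manager or
Windows; the 200MB headroom figure and the list of common RAM totals are just
assumptions for the example):

# Illustrative sketch only: pick the smallest common RAM total that covers
# the observed Commit Charge Peak plus some spare for the system cache.
# The 200MB headroom and the size list are assumptions, not Windows values.
def suggest_ram_mb(peak_commit_mb, headroom_mb=200):
    common_totals = [512, 768, 1024, 1536, 2048, 3072, 4096]
    needed = peak_commit_mb + headroom_mb
    for total in common_totals:
        if total >= needed:
            return total
    return common_totals[-1]

# Example with a peak Commit Charge of roughly 810MB (829328 KB / 1024):
print(suggest_ram_mb(829328 // 1024))   # -> 1024, i.e. about 1GB of RAM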
 
Shep©

I have a 2.4 GHz Celeron D with 512 MB RAM running Win2K/SP4.

I am running compute-intensive graphics applications that are taking
100% of the "CPU Usage" and 730 MB "MEM Usage" in the Performance
section of Task Manager.

Currently I am running a 512MB/1.5GB pagefile. What should the correct
size be for this environment?

http://www.aumha.org/win5/a/xpvm.php

HTH :)
 
Robert Heiling

Joel said:
I dunno, I have never used Win2K enough to know much about it. And I didn't say
anything about 4095MB or 8GB to agree or disagree with whatever you are
saying. I only suggested increasing the *memory* from 512MB to around 2GB to
enjoy the speed (when using a graphics program like Photoshop).

That's all I said, and I can't comment on things I didn't say.

Please don't seem so defensive. I was basically agreeing with you.

Bob
 
Robert Heiling

kony said:
They don't actually "know something" as it applies here.

I sort of figured that Microsoft just might know what values worked well for a
typical cross section of application use. But, in any case, we're faced with the
situation that too little can hurt and too much can do no harm. I had mentioned
that recommended minimum only as an aside. It was not my main point, which was
to simply set it at 4095 MB and forget it. Micro-managing that sort of thing has
no real payback.
They have no way whatsoever to determine what the user might
be running, it is a random guess and a several-years-old
guess at that, which didn't take into account more modern
programs.

There isn't a system here with Vista on it, but one that has WinXP Media Center
Edition 2005, and it recommends a 1437 MB minimum.
Since win2k itself fits in less than 70MB of memory,
it'll be entirely dependent on what the user is running.

It apparently doesn't make a whole lot of difference from past to present and,
don't forget, the OP is running Win2k. But if losing a few gig on a large hard
drive bothers anyone, then go ahead and worry about it.

Bob
 
Citizen Bob

There is no "MEM Usage" figure on the Performance tab of
Task Manager.

Yes there is.
Which figure did you mean specifically?

The one I am staring at right now. It's on the left side under the
"CPU Usage". Both are graphical representations.
The best answer is don't put a moment's thought into this
until you've added enough real memory such that your
regularly recurring Commit Charge "Peak" value
during/after the demanding tasks is a lower figure than
that real memory; ideally the figure will be significantly
lower, as the remainder can be put to use as a System Cache
that substantially speeds up the system by reducing HDD
access.

At this moment, the "MEM Usage" is around 700MB and the Commit Vharge
Peak is 829328K. According to what you are saying, I should increase
the RAM from 512MB to 1GB.


 
kony

Yes there is.

My mistake, yes they did label the bar graph that way, but
it is actually the Commit Charge, "Total" figure. "Mem
Usage" as a term is meaningless since it doesn't tell us
what portion of real vs virtual and how it's allocated.

The Commit Charge, Total is only an instantaneous figure,
one that will easily be several hundreds of MB higher or
lower depending on what's running. That is why I suggested you
look at the Commit Charge Peak, which is the peak amount the
"Total" figure rose to during the entire uptime of the
system/OS. Unless you'd had a rare usage during that uptime
that used an extraordinary amount of memory compared to what
you'd usually use, planning (having installed) that much
real memory plus a bit more is the desired goal.

The one I am staring at right now. It's on the left side under the
"CPU Usage". Both are graphical representations.

Yeah, ignore the graph since it's redundant and
nondescript.

At this moment, the "MEM Usage" is around 700MB and the Commit Vharge
Peak is 829328K. According to what you are saying, I should increase
the RAM from 512MB to 1GB.

Actually a little more than 1GB, but 1GB gets you most of
the benefits, IF that ~825MB peak is a common peak value for
your uses... if you routinely exceed that you'd need even
more than 1GB, or less if this computing session was
abnormally demanding compared to the usual jobs.
 
Rod Speed

Shep© said:
(e-mail address removed) (Citizen Bob) wrote

The very fundamental problem with that second one is that it perpetuates
the mindlessly silly line that the page file needs to be a minimum of 1.5 times
the size of the physical RAM. That is just plain silly, because when the
amount of physical RAM is doubled from, say, 1GB to 2GB, there will inevitably
be LESS need for page file space, not more.

The first one got that right.
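
To put made-up numbers on that point (purely illustrative; the 1.2GB peak
commit charge below is invented for the example, not taken from this thread):

# The workload's peak commit charge stays the same, so doubling the RAM shrinks
# the amount of pagefile actually needed, while the "1.5 x RAM" rule of thumb
# tells you to grow it instead. All figures here are made up.
peak_commit_gb = 1.2
for ram_gb in (1, 2):
    needed_pagefile_gb = max(0.0, peak_commit_gb - ram_gb)  # roughly what must be backed on disk
    rule_of_thumb_gb = 1.5 * ram_gb                         # what the 1.5 x RAM advice says
    print(f"{ram_gb}GB RAM: ~{needed_pagefile_gb:.1f}GB pagefile needed, "
          f"rule says {rule_of_thumb_gb:.1f}GB")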
 
Robert Heiling

Rod said:
The very fundamental problem with that second one is that it perpetuates
the mindlessly silly line that the page file needs to be a minimum of 1.5 times
the size of the physical RAM. That is just plain silly, because when the
amount of physical RAM is doubled from, say, 1GB to 2GB, there will inevitably
be LESS need for page file space, not more.

The first one got that right.

But just what is it that people are actually trying to accomplish with this
"optimization"? When stripped down to the bottom line, all they're doing is
attempting to save some HD space. That's the part that is silly.

Bob
 
Citizen Bob


There are some items in that article that have been controversial on
this forum before:

"Have the initial size be at least 1.5 times bigger than the amount of
physical RAM. Do NOT make the Pagefile smaller than the amount of
physical RAM you've got installed on your system."

I was advised in an earlier thread to set my initial pagefile size to
512MB, which is the size of my RAM. I was also advised to set my
maximum pagefile size to 1.5GB. I had it a lot higher but I was
running into NTFS corruption problems that went away when I lowered it to
the current values of 512MB/1.5GB.

"Make its initial size as big as the maximum size. Although this will
cause the Pagefile to occupy more HD space, we do not want it to start
off small, then having to constantly grow on the HD. Writing large
files (and the Pagefile is indeed large) to the HD will cause a lot of
disk activity that will cause performance degradation. Also, since the
Pagefile only grows in increments, you will probably cause Pagefile
fragmentation, adding more overhead to the already stressed HD."

I was advised to make the maximum 1.5GB even though the initial was
512MB. I have more than ample disk space, so that is not an issue.



 
kony

But just what is it that people are actually trying to accomplish with this
"optimization"? When stripped down to the bottom line, all they're doing is
attempting to save some HD space. That's the part that is silly.


What is it that you are trying to accomplish by setting a
pagefile extremely large without any reason to think it
should be that large?

If it is merely to be sure you don't run out of virtual
memory space, do you see a lot of people reporting they have
that problem? No, and it's not at all usual for them to
have set a 4GB pagefile.

IOW, you have a solution for a problem that doesn't usually
exist.
 
kony

There are some items in that article that have been controversial on
this forum before.:

"Have the initial size be at least 1.5 times bigger than the amount of
physical RAM. Do NOT make the Pagefile smaller than the amount of
physical RAM you've got installed on your system."

I was advised in an earlier thread to set my initial pagefile size to
512MB, which is the size of my RAM. I was also advised to set my
maximum pagefile size to 1.5GB. I had it a lot higher but I was
running into NTFS corruption problems that went away when I lowered it to
the current values of 512MB/1.5GB.

With 780-odd MB of memory allocated as you reported earlier
(but really, you should consider the peak value, not the
momentary one as you did), a 512MB pagefile should work. You are
continually overlooking that there is no one generic answer
that fits all systems; you have to actually look at YOUR
system usage. Don't tell us what you have the pagefile set
to, tell us what your PEAK Commit Charge is.

Remember that if you have too small a pagefile set, it won't
just slow down your use; you will see a warning message.
You could continue to have the system set to something like
a 512MB minimum (which minimizes fragmentation: the file is
contiguous if the disk space is available, though even then
access to it may be fragmented because that's how paging
works, only what's needed is read back), and a larger
maximum. You'd want your minimum large enough that your big
jobs don't exceed it, and the larger maximum is just a
failsafe should you do something very unusual. Keep in mind
that if you did such an unusual task and suddenly needed
another GB of virtual memory, you'd be sitting around for
ages waiting for the system to stop thrashing the HDD
swapping it all back and forth from disk to real memory.

You don't ever want to run jobs like that. To give you an
example, I used to try to edit audio on a P2 box with 32MB in
it; several minutes would pass by waiting on the swapping to
get done. When a pair of 128MB DIMMs was added to that box
later, similar jobs took under 20 seconds.


"Make its initial size as big as the maximum size. Although this will
cause the Pagefile to occupy more HD space, we do not want it to start
off small, then having to constantly grow on the HD. Writing large
files (and the Pagefile is indeed large) to the HD will cause a lot of
disk activity that will cause performance degradation. Also, since the
Pagefile only grows in increments, you will probably cause Pagefile
fragmentation, adding more overhead to the already stressed HD."

You don't need to set the initial the same as the max, just set the initial
large enough that you don't "expect" it to ever be
exceeded... for example, you could set a 1GB min and a 2GB max.
Whether it would fragment past 1GB makes little difference,
because you don't want to use the system on anything that
would actually make use of an additional 1GB of virtual
memory; it is merely there for allocation purposes, not for
reading and writing data.
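
For what it's worth, on Win2k (and XP) those initial/maximum figures are stored
in the PagingFiles value under
HKLM\SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management,
one "path initial maximum" string per pagefile, with the sizes in MB. A minimal
read-only sketch for checking it (this uses modern Python's winreg module as an
illustration; on a Win2k-era box you'd simply look at the same value in
regedit):

# Read-only sketch: print the configured pagefile initial/maximum sizes (MB)
# from the standard Memory Management registry key. Run on Windows.
import winreg

KEY = r"SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY) as key:
    paging_files, _value_type = winreg.QueryValueEx(key, "PagingFiles")

# Each entry looks like "C:\pagefile.sys 512 1536" (path, initial MB, maximum MB).
for entry in paging_files:
    print(entry)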

I was advised to make the maximum 1.5GB even though the initial was
512MB. I have more than ample disk space, so that is not an issue.

If it ain't broke don't fix it.
 
Rod Speed

Robert Heiling said:
Rod Speed wrote
But just what is it that people are actually trying to accomplish with this
"optimization"? When stripped down to the bottom line, all they're doing
is attempting to save some HD space. That's the part that is silly.

Sure, and I said that in my first post in this thread.
 
