Frank said:
I came across the article below; unfortunately the author is now
deceased, otherwise I would email him.
One question I had is whether there is a performance gain from
having no page file at all. The laptop I use has 2 GB of RAM, and I
never seem to use more than about 1 GB, so I had turned the page
file off.
No.
In particular, in the article, Alex Nichol says:
Strictly speaking Virtual Memory is always in operation and cannot
be "turned off." What is meant by such wording is "set the system
to use no page file space at all."
Doing this would waste a lot of the RAM. The reason is that when
programs ask for an allocation of Virtual memory space, they may
ask for a great deal more than they ever actually bring into use -
the total may easily run to hundreds of megabytes. These addresses
have to be assigned to somewhere by the system. If there is a page
file available, the system can assign them to it - if there is not,
they have to be assigned to RAM, locking it out from any actual use.
The two questions I have about this are:
1) Does it actually "waste" RAM? In some architectures I've seen,
this is true if you have insufficient paging space or no paging
file at all. However, from what I have observed, Windows XP does
not. I have many processes (Cygwin, for example) that have 1/2 GB
VM sizes but resident sizes of ~20K.
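The reserve-versus-use distinction behind that observation can be seen directly. Here is a minimal sketch in Python on a Unix-like system (an assumption - the thread is about Windows XP, where the same idea is exposed through VirtualAlloc's MEM_RESERVE/MEM_COMMIT flags, but the measurement call below is POSIX-only):

```python
import mmap
import resource

# Map 512 MiB of anonymous memory. The OS hands out address space,
# but the pages are demand-zero: no physical RAM is used until a
# page is actually touched.
SIZE = 512 * 1024 * 1024
buf = mmap.mmap(-1, SIZE)

rss_before = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss

buf[0] = 1  # touch a single page; only that page becomes resident

rss_after = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss

# Resident-set growth is a page or two, nowhere near 512 MiB, even
# though the process's virtual size jumped by the full amount -
# exactly the "1/2 GB VM size, tiny resident size" pattern above.
print("RSS growth (KiB on Linux):", rss_after - rss_before)
```

Whether those untouched reserved pages must still be charged against RAM or a page file (the "commit charge") is the accounting question the Nichol quote is really about.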
If you gain anything by having no page file - even if you have an
enormous amount of RAM, and on a 32-bit architecture in any case - the
gain is so small you won't notice it.
2) Is there a performance benefit from not manipulating the paging
file? In some architectures this avoids a disk I/O and some
internal OS manipulation of the VM paging tables.
For Windows XP - which the article was written around - I would venture
a 'no'. You should either 'set it and forget it' or, better yet, let
Windows XP manage it for you.
The observation I have is that Windows XP does not appear to be
reserving space within the paging file for executable text, as Alex
Nichol states above. So what is actually in memory is text that has
been faulted in, any allocated memory that has been faulted in, and
any dynamic data (stack, etc.). I have yet to see my laptop use more
than ~1.3 GB of memory.
Yes - so? The point is never to 'use all your memory' - it is to have
enough memory that you never use all your memory. Think about it. If
you are utilizing all of the memory in your computer at all times -
okay, fine - but what happens when you add a new task? It has no memory
to use. In other words, just because you have memory that never gets
used, don't assume there is a problem. What you have is a balance, and
as long as you have that free (unused) memory, that is *GOOD*.
Just curious as to how the VM in XP actually works.
The posts you responded to had some of the best information out there.
The problem, I believe, is that you are worrying about something where
there is no problem. On a 32-bit architecture you can only use so much
address space for applications anyway; the rest is reserved for the OS.
For best results, just allow Windows XP to manage the virtual memory,
and make sure you have more memory than you will need. It's not like
RAM is that expensive, and it is *not* a bad thing to have unused
memory. Now, if you had 4 GB of memory and never went above 768 MB in
use - ever - you might have wasted that couple hundred dollars, but
it's not like that's *bad* unless you couldn't afford it.