Managing pagefile and hiberfil.sys and disk resize

  • Thread starter Robert Carnegie

Robert Carnegie

I got myself a Gigabyte M912 little-notebook touchscreen PC, with
Windows XP Home, which is probably enough for now. I'm looking
mainly for credible pagefile advice: so hit me, folks ;-)

Quite a big hard disk for the category. 1 GByte of RAM, replaceable
with a 2 GByte module instead: not so good.

Now, as to pagefile (swap file):

I plan to back up the machine (have done so once) by booting a Linux
"live CD" such as Knoppix 6.0.1 from an external drive, and copying
drive C into compressed files on D, then backing those up elsewhere
(DVD). With its "experimental" NTFS support, only actual file space on C
is backed up. But I suspect this includes pagefile.sys. So part 1: I
want at most a small pagefile on C, and more elsewhere, unless there's
a strong counter-argument.
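
Roughly what I have in mind from the Knoppix side - a sketch only, and
the device names are assumptions until I check with "fdisk -l" (say C
is /dev/sda1 and D is /dev/sda2):

    sudo mkdir -p /media/d && sudo ntfs-3g /dev/sda2 /media/d
    # ntfsclone copies only the clusters NTFS marks as in use - which
    # still includes pagefile.sys and hiberfil.sys, since they are
    # ordinary files
    sudo ntfsclone --save-image -o - /dev/sda1 | gzip > /media/d/c-drive.img.gz

Restoring would be the reverse pipe into
"ntfsclone --restore-image --overwrite /dev/sda1 -".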

The disk is large enough to have any appropriate permanent pagefile
allowance on another partition, dedicated or not. The machine also
has an ExpressCard slot, SD card slot (SD-HC?), and three USB ports.

There's a lot of advice online that is contradictory and that I'm
suspicious of.
<http://www.aumha.org/win5/a/xpvm.php> was written by Alex Nichol, but
at least one reference says he's passed on. Other stuff just gets
passed around.

Like this:

01. The more RAM you have, the smaller pagefile you need.

02. The more RAM you have, the larger pagefile you need.

03. If you have enough RAM, you don't want a pagefile.

04. You should have a pagefile on C even if you have a pagefile on
another drive. What if the other drive breaks?

05. However much RAM you have, you should set a pagefile of at least
50 MB on C.

06. However much RAM you have, you should set a pagefile of at least
500 MB on C.

07. A pagefile can't be defragmented.

08. To avoid pagefile fragmentation, set its minimum size and maximum
size equal. The pagefile will immediately take up that amount of disk
space.

09. A pagefile can be defragmented, for instance with free download
PageDefrag.

10. But you may as well stick with point 08 anyway.

11. Pagefile should be 0.5 times / 1.5 times / 2 times / 2.5 times / 3
times RAM, and no more.

12. Pagefile should be no more than 4 gigabytes minus the physical RAM
size.

13. Pagefile doesn't need to be on fault-tolerant storage.

14. A disk fault in the pagefile is liable to crash your computer.

15. Pagefile on a separate disk - not separate partition of same disk
- is likely to improve performance.

16. Pagefile on a separate disk will hurt performance if the bus to
its disk (USB) is slower than for the built-in hard disk.

I think that's about it. The size issue is my main interest.

Other current concerns:

21. If - when - hibernation is switched on, a file \hiberfil.sys is
generated on C, the same size as RAM. You can't do anything about
that, except disable hibernation and maybe reboot.

22. How large should C be for Windows XP and a few major
applications, such as OpenOffice and Firefox, and with a view to
copying a very compressed version onto, say, one DVD? On my last
machine I made C about 14 gigabytes. It's nearly full now, which I
guess means I did it pretty right.

23. I can't remember what I actually did the last time I resized a
partition. Vista does it, XP doesn't. Currently my plan is:
a. Back up Windows XP (done)
b. Download and use MyDefrag to move files on C (NTFS) out of the MFT
zone and consolidate them at the start of C.
c. Use Knoppix 6.0.1 and the tool "parted" to impose a smaller
partition size for C (a rough sketch of this step follows below).
d. Immediately use Windows CHKDSK or other built-in tools to correct
the consequences of doing that.
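
A sketch of step (c), assuming C turns out to be /dev/sda1 (to be
checked first; GParted, if it's on the disc, wraps both halves of the
job in one go):

    sudo ntfsresize --info /dev/sda1      # reports the smallest safe size
    sudo ntfsresize --size 14G /dev/sda1  # shrink the NTFS filesystem
    # then shrink the partition table entry to match with parted/fdisk,
    # keeping it no smaller than the new filesystem; ntfsresize may ask
    # for a CHKDSK in Windows first, and Windows will want one on the
    # next boot anyway - that's step (d)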
 

C.Joseph Drayton


Hello Robert,

The questions and points you raise are good ones, but I would like to
point out that your question is somewhat subjective.

The hardware you're using, the software you run and what you actually do
on your computer will affect the answer.

Let me give you an example:

On my development machine, I do not run a pagefile at all. I have 2GB of
RAM and when I am programming, I tend to not be doing anything else.
When I have to manipulate graphics for an application that I am running,
I have a VM with nothing but Adobe Design Studio CS3 installed on it, in
that VM I run a pagefile that is 2.5 times the size of the allocated RAM
for the VM (in this case 1GB).

The development machine and the VM are set up so differently because
Adobe loves a big pagefile, whereas the compiler I use is CPU intensive
and I have it do most of its work in RAM. With 2GB of RAM, I can set up
a RAM disk that holds compiler and resource files. That way I can do
very fast compiles.

In short, there really is no ONE answer. I would say that you should
experiment and take good notes to find out what works BEST for YOU.

Sincerely,
C.Joseph Drayton, Ph.D. AS&T

CSD Computer Services

Web site: http://csdcs.site90.net/
E-mail: (e-mail address removed)
 

Twayne

What does "not so good" mean?

With only one disk drive, there is zero advantage to having the pf
(pagefile) located on another drive letter. It's only helpful when the
pf can reside on a separate, different physical hard drive. Using a
different partition on the same hard drive will give you no benefit and,
depending on where it gets created physically in the platter/track
system, could even have the opposite effect and slow things down.
Since this is a single-disk only machine, forget about moving or
splitting the pf; it will be of no benefit.
You are quite right; there are a lot of varying opinions on the pf usage
et al, but everyone will pretty much tell you that it's of no benefit on
a single-drive system, regardless of how many drive letters (partitions)
that drive is split into.

Well, the less it gets used. The pf is only used for RAM "overflow", if
you will; when there is not enough RAM to hold everything that needs to
be in RAM, the hard drive is then used as an extension of RAM to get
the additional needed space.
In an ideal situation the pf would never get used, because everything
ever needed would fit into RAM, so it won't "overflow". In practice
it's not that simple, but it's close. With 2 or 3 Gig of RAM one should
never see any pf activity for normal day-to-day tasks. But if, say,
large graphic or video editing/rendering, or some very intensive number
crunching has to be done, THEN it might spill over into the pf because
everything might not fit in RAM at the same time.

That comes from the idiotic myth that you should have a pf 1.5 times the
amount of RAM you have, which is not at all what MS intended. 1.5 times
RAM is a starting figure for 512 Meg of system RAM. The LESS RAM you
have, the more you MIGHT need a larger pagefile. Again, it depends on
what the machine is doing.

One thing you should do is avail yourself of one of the many free
pagefile monitors available on the 'net. PC Mag, Eldergeek, all kinds
of reputable places have them available. That way you get a definite
look at how much, if any, the pf is actually being used.
Normally the pf will contain a couple hundred or so k of data. It
keeps enough minimum space available for mini-dumps in case something
goes wrong. But if the system never needs the pf, those numbers don't
change more than a few tens of bytes as various things go on.
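
If you'd rather not download anything, XP's own performance counters
can give the same picture; something like this from a command prompt
(check "typeperf -q" for the exact counter name if it complains):

    typeperf "\Paging File(_Total)\% Usage" -si 60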

No idea where you got that; it's total malarkey.
In addition, it's technically impossible to not have a pf. Even if
you turn it off, an amount of space on the drive will still be allocated
for pf activity and will be used if needed, rather than allow Windows to
crash just because it needs another k or so of RAM space. Windows tries
to protect itself in that manner. Likewise, if you spec too small a pf
and Windows wants more, it'll make the pf larger on its own and simply
advise you with a dialog box that it's doing it. So you don't crash,
but you suffer the annoyance of the dialogs popping up when it's too
small. Set it too large and it just wastes space for nothing.

The pf on C will be reallocated for what it needs and you'll get error
messages. See the previous item.

Yes. I usually use 250 and have never seen 50, but ... in general
that's true. Memory dumps can only be written to C so you need to keep
something there.

Same as previous. I suspect you're making up these numbers now.

False. There are several ways to defrag a pf. It just can't be
defragged with XP or most defraggers, that's all.

Yup. Forever. And it's a waste of space. It's always best to let the
system manage it. There are a lot of opinions about this one. Six of
one, half a dozen of the other IMO. I only use system managed size and
only have ever had 2 fragments in my pf. That's because, as long as the
drive isn't full, and the pf size was set well before it started to
become full, XP is smart enough to keep space allocated for it to grow
and that pf area will be the last area of the drive to become occupied
with file storage. If the drive starts to fill up though, you may well
get fragmentation of the pf. Whether it will be enough to bother
anything is anyone's guess. The pf prefers to live in the center area
of the disk, equidistant from both ends of the disk with respect to
seek times for the heads.

You just stated it couldn't above. You're getting monotonous.

Forget the rules of thumb. Get as much RAM as you can, then a pf
monitor, and run your most intensive programs. See how much it gets
used, if at all. With 1.5 to 2 Gig of RAM, most people will seldom see
their pf being used.

Yes, on a separate PHYSICAL hard drive.

No; the same size as occupied RAM. How else could you store the data?

Better to get a bigger hard drive; they're cheap. Allow plenty of room
for expansion. Don't get caught having to resize the partitions down
the road; it can get pretty messy.

You'd be a hell of a lot further ahead buying a terabyte external drive
to hold your sequential backups and employing a disk imaging application
such as Norton's Ghost or Acronis' True Image. It takes me a total of
23 minutes to pop in the boot CD, tell it where the images are stored,
and to end up with a fully functional machine as it was on the date the
image was made.
You don't seem to do scheduled backups, but Norton and TI will do
those for you so that your data is never more than 12 hours old or
whatever figure you decide to use. That includes the operating system
and all data on all the drives attached to the machine except the
external storage drive in my case. I use Ghost and it does an
incremental backup every night or whenever anything is uninstalled or
installed and when more than 50 Meg of data has been erased or added to
any of my drives. I also use XXCopy to make special, extra-important
backups of things like development history, things like that. The idea
is to never lose any more data than absolutely necessary at any time.
Then monthly, after the full backups have run, I transfer the full
backup to DVDs for permanent storage; one for me and one for my sister
(we trade image sets so we always have a set stored "offsite").

And finally, I don't have any argument with the backup method you
proposed either, except that I don't think it covers all the bases. By
having the pf on and set to system managed, it's there should it ever
need to be used. No farting around with annoying messages when Windows
decides it has to have one or crash; no one wants the crash.<g>

HTH,

Twayne`
 

Robert Carnegie

C.Joseph Drayton said:
On my development machine, I do not run a pagefile at all. I have 2GB of
RAM and when I am programming, I tend to not be doing anything else.
When I have to manipulate graphics for an application that I am running,
I have a VM with nothing but Adobe Design Studio CS3 installed on it, in
that VM I run a pagefile that is 2.5 times the size of the allocated RAM
for the VM (in this case 1GB).

The development machine and the VM are set up so differently because
Adobe loves a big pagefile, whereas the compiler I use is CPU intensive
and I have it do most of its work in RAM. With 2GB of RAM, I can set up
a RAM disk that holds compiler and resource files. That way I can do
very fast compiles.

In short, there really is no ONE answer. I would say that you should
experiment and take good notes to find out what works BEST for YOU.

That does go against some stuff that I thought I did know, and maybe
should have put into the list, with more contradictions.

For regular computing - word processing, spreadsheeting, web browsing
obviously - I think the computer will use real RAM if it's there, not
pagefile - almost entirely. Until it runs out of RAM.

Now:
<http://support.microsoft.com/kb/314482/>

"The optimal solution is to create one paging file that is stored on
the boot partition, and then create one paging file on another
partition that is less frequently accessed on a different physical
hard disk if a different physical hard disk is available.
Additionally, it is optimal to create the second paging file so that
it exists on its own partition, with no data or operating-system-
specific files. By design, Windows uses the paging file on the less
frequently accessed partition over the paging file on the more heavily
accessed boot partition. An internal algorithm is used to determine
which paging file to use for virtual memory management.

"However, if you remove the paging file from the boot partition,
Windows cannot create a dump file (Memory.dmp) in which to write
debugging information in the event that a kernel mode Stop Error
message occurs. This could lead to extended downtime if you must debug
to troubleshoot the Stop error message."

Same goes, apparently, if C:\Pagefile.sys is not at least a few
megabytes larger than RAM, some say +50 MB. But if you're like me,
you'll figure that if Microsoft supplied an electron microscope to
examine the state of memory in the event of a crash, it wouldn't help
to troubleshoot the Stop error. So just plan to unplug all USB
devices and reboot, then add them, one by one. Then try adding them in
a different order.

If you don't want that benefit, there seems to be no reason to prefer
putting /any/ pagefile on C instead of its own partition. If you're
happy to give it a partition.
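
Either way, it looks like the split can be scripted instead of clicked
through System Properties - a sketch only, with the drive letters and
sizes as placeholders, using the WMI page-file settings alias (takes
effect after a reboot):

    wmic pagefileset where name="C:\\pagefile.sys" set InitialSize=50,MaximumSize=250
    wmic pagefileset create name="D:\pagefile.sys"
    wmic pagefileset where name="D:\\pagefile.sys" set InitialSize=2048,MaximumSize=4095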

<http://members.shaw.ca/bsanders/WindowsGeneralWeb/RAMVirtualMemoryPageFileEtc.htm>

Each process can have 2 GBytes of virtual memory exclusively its own.
That is, each program. So 100 greedy programs running simultaneously
could require 200 GBytes of pagefile? Hey, I don't know.

<http://support.microsoft.com/kb/237740>

Each hard disk partition can have its own pagefile, see above. A
pagefile can't be larger than 4095 megabytes. This article provides a
trick - method - to put more than one pagefile on the same hard disk
partition.
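
As I read it, the trick boils down to pointing extra entries of the
PagingFiles registry value at sub-folders of the same partition. A
sketch, with folder names and sizes as placeholders - note the whole
multi-string value gets replaced, so every pagefile wanted has to be
listed, and a reboot is needed afterwards:

    md D:\pf1 D:\pf2
    reg add "HKLM\SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management" ^
        /v PagingFiles /t REG_MULTI_SZ ^
        /d "C:\pagefile.sys 50 250\0D:\pf1\pagefile.sys 4095 4095\0D:\pf2\pagefile.sys 4095 4095" /f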

<http://reviews.cnet.com/4520-10165_7-5554402-1.html>

actually tells you how to switch /off/ a feature that writes strings
of 0s into the pagefile at shutdown. Besides security (imperfect),
the feature also means that my design to back up all of disk C (minus
free space) using Linux and compression would have the advantage of a
highly compressible pagefile on the disk. Large but empty.

On the downside, apparently I couldn't stop it cleaning /all/ my
pagefiles.
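
The switch in question, if I have the right one, is a single registry
value: 1 means zero the pagefile at every shutdown (slower shutdowns,
but the file then compresses to nearly nothing in a backup image),
0 means leave its contents alone.

    reg add "HKLM\SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management" ^
        /v ClearPageFileAtShutdown /t REG_DWORD /d 1 /f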

Conclusion:

1. I can afford to put 4 GBytes of pagefile on a partition that isn't
C. That's 4 times current RAM, 2 times possible upgraded RAM.
Windows isn't designed to make it easy to get more than that, and
there's no particular doubt of it being enough. If it isn't enough, I
now know a trick to add more page space on the same partition by
creating another pagefile.

And since there isn't one right answer, I may as well go for this
one! I think!
 

Robert Carnegie

What does "not so good" mean?

Well, it could have been more. The PC only has one memory slot, and
if I put in 2 GBytes then I have a 1 GByte memory unused and probably
no warranty - it isn't a part for a customer to install.
With only one disk drive, there is zero advantage to having the pf
(pagefile) located on another drive letter.  It's only helpful when the
pf can reside on a separate, different physical hard drive.  Using a
different partition on the same hard drive will give you no benefit and
depending on where it gets created physically in the platter/track
system, could even result in the opposite condition; slowing things
down.
   Since this is a single-disk only machine, forget about moving or
splitting the pf; it will be of no benefit.

It's for convenience of backup. If I am supposed to have a large
pagefile, I don't want to back up the pagefile when I back up the
system partition. I want to back up the system partition so that I
can get my computer back quickly if the disk fails.
That comes from the idiotic myth that you should have a pf 1.5 times the
amount of RAM you have, which is not at all what MS intended.  1.5 time
RAM is a starting figure for a 512 Meg of system RAM.  The LESS RAM you
have, the more you MIGHT need a larger pagefile.  Again, it depends on
what the machine is doing.

On reflection, maybe the logic is that you specify your computer with
RAM to suit the programs you want to run - or else the other way
around - and the useful amount of page file is proportional to that.
But that assumes that programs have similar behaviour and needs in
using memory.
Yes.  I usually use 250 and have never seen 50, but ... in general
that's true.  Memory dumps can only be written to C so you need to keep
something there.




Same as previous.  I suspect you're making up these numbers now.

Well, that one may be a typo, but I've seen it. I think these are
garbled echoes of "Must have a pagefile equal to RAM size to receive
dump in event of a crash." People see a size reflecting their RAM -
actually, 50 MB is more likely to be the typo - and they repeat
online, "For a crash dump you need a 126 MB swap file." And as I say,
who /does/ want a crash dump?

This is a new netbook, so I'm not looking to fit upgrades in it -
except maybe RAM - it /is/ the upgrade.

Here's my thinking now: a notebook hard disk is priced maybe 1 USD per
gigabyte. So if I assign maximum permitted size of one pagefile, 4
gigabytes, I am spending 4 dollars on it. I can afford four dollars
and I get to stop fussing and just enjoy my new computer. On the
other hand, if the pagefile gets full of digital junk then it'll take
up to one DVD on its own to back up that file, and time, and I don't /
need/ to back it up.

Of course if a product such as Ghost can back up system but skip the
pagefile and the hibernation file - well, can they?

With a Linux live CD backup process, the tool comes to me free and I
don't have to consider how many machines of my own - or friends if it
comes to that - I'm licensed on. But it may be worth paying the money
there, too.

Thank you for giving your opinions! And hey, what do you think about
anti-virus and security software? This netbook came with a trial of
Norton's "2009" offering, but I like F-Secure, and at the store
there's a steep discount on Kaspersky, for up to three PCs...
 

Twayne

Just for grins I took a look at that; not bad. Kind of a neat little
thing, eh?
Well, it could have been more. The PC only has one memory slot, and
if I put in 2 GBytes then I have a 1 GByte memory unused and probably
no warranty - it isn't a part for a customer to install.

I see.
It's for convenience of backup. If I am supposed to have a large
pagefile, I don't want to back up the pagefile when I back up the
system partition. I want to back up the system partition so that I
can get my computer back quickly if the disk fails.

Aha, I see. Well, if the disk drive is less than 50% occupied, you
could probably move it to another partition on the same drive. BUT:
I don't think that's really necessary, though, as any decent backup
program I'm aware of, except maybe XP's ntbackup.exe, will let you skip
such files. Some set them to be skipped automatically for you. You also
don't want things like the System Volume Information folder (restore
points - they'll be useless), the GoBack file if you use GoBack,
temporary internet files, backup files, etc. There may be others you
don't want once you get going.
On reflection, maybe the logic is that you specify your computer with
RAM to suit the programs you want to run - or else the other way
around - and the useful amount of page file is proportional to that.
But that assumes that programs have similar behaviour and needs in
using memory.

That's not a bad way to look at it. It's a subjective thing, and I
personally like to think I specced enough RAM for what I plan to do, and
a little more for whatever happens "tomorrow".
This is a case of "more is better" up to the point of diminishing
returns, which occurs in XP 32-bit systems at 2 or 3 Gig, depending. It's
really difficult to predict what you may actually want for RAM, so I
generally say "as much as you can afford, up to 3 Gig, 4 if it's the
only way your machine will accept it."
RAM didn't use to be cheap, but it's pretty inexpensive nowadays for
most PCs. Notebooks can be the exception.
Well, that one may be a typo, but I've seen it. I think these are
garbled echoes of "Must have a pagefile equal to RAM size to receive
dump in event of a crash." People see a size reflecting their RAM -
actually, 50 MB is more likely to be the typo - and they repeat
online, "For a crash dump you need a 126 MB swap file." And as I say,
who /does/ want a crash dump?

You "might", depending on the kind of trouble you get into. If you take
your machine to a shop for repair, it can be pretty useful to them. A
"dump" is a description of the states of the computer at the point where
it crashed so it's really a trouble-shooting tool.
That's why there is a mini-dump and a full dump. You size the pf to
accept a mini-dump; a couple hundred k or so.
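
If you ever want to check which kind of dump the machine is set for,
that lives in the registry under CrashControl, as far as I recall:
CrashDumpEnabled = 3 is the small mini-dump, 2 is a kernel dump, and 1
is a complete dump (the one that needs a C: pagefile at least the size
of RAM).

    reg query "HKLM\SYSTEM\CurrentControlSet\Control\CrashControl" /v CrashDumpEnabled
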
This is a new netbook, so I'm not looking to fit upgrades in it -
except maybe RAM - it /is/ the upgrade.

Ahh, you've been afflicted with NTS (New Toy Syndrome)! Been there,
done that!
Here's my thinking now: a notebook hard disk is priced maybe 1 USD per
gigabyte. So if I assign maximum permitted size of one pagefile, 4
gigabytes, I am spending 4 dollars on it. I can afford four dollars
and I get to stop fussing and just enjoy my new computer. On the
other hand, if the pagefile gets full of digital junk then it'll take
up to one DVD on its own to back up that file, and time, and I don't /
need/ to back it up.

True! Being forward-thinking enough to account for "tomorrow" is the
trick you need. The hard drive should be large enough that you can
foresee it still having plenty of room at 80% full for, say, the next 3
years. That's under "current plans". But if, say, tomorrow you get a
new digital camera and suddenly start storing hundreds, maybe thousands
of pics on it, well, it could get small pretty quick!
IMO again, the best situation is 500 Gig drives for internals and a 1
TB external drive for backup storage. This machine originally came
with an 80 Gig drive several years ago. Then I added a 500 Gig internal
drive. All 7200 rpm drives. Since then I've split the first 80 Gig
drive into C: and D: - the OS on C:, and on D: everything I've
downloaded from the internet and want to keep: research, programs,
troubleshooting tips, articles, papers, and so on. If it's on drive D,
it came from Internet downloads. The second drive is E through H, each
for its own specific
purpose, two of which are for video work because those drives require
frequent defrags when they get used. So I can start a defrag or
whatever on the video drives and go do other work on the different drives
without impacting the ongoing defrag.
I now also have two external 1 TB drives that I rotate on a monthly
basis. That seems wasteful until you find out that I lost a LOT of data
when my first one went belly up on me; catastrophic, unrecoverable
failure. Not only did I not have DVDs of the most recent monthly, but I
also had a lot of data on it that was unique, not saved in other places.
I was stupid, I admit it! So now I have two external drives. They get
traded each month as to which one is connected and powered, so there
should be no way to lose more than a month's worth of data, ever. Even
if lightning takes out the whole room of equipment, I still have that
unconnected drive with good data on it less than a month old. At first
that made me lazy about creating my DVD sets, but I've since fixed that
by automating the process and annoying myself with frequent notices that
it's time to do it.
Of course if a product such as Ghost can back up system but skip the
pagefile and the hibernation file - well, can they?

YES, Ghost can skip ANY file or folder during backups. The most common
ones are selected for you, like the pagefile, the System Volume
Information folder, *.bak and *.wbk files, and a few more I forget
right now. It's very handy IMO.
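
As for the hibernation file: if you don't want it in an image at all,
XP should let you drop it before the backup and bring it back after,
from an administrator command prompt (hiberfil.sys goes away while
hibernation is off):

    powercfg /hibernate off
    rem ... take the image ...
    powercfg /hibernate on
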
With a Linux live CD backup process, the tool comes to me free and I
don't have to consider how many machines of my own - or friends if it
comes to that - I'm licensed on. But it may be worth paying the money
there, too.

Umm, I don't know. It sounds subjective to me and would depend on what
you wanted vs what it can give you. So I don't have any way of comparing
the two.

I'm trying to move to Linux myself but there are still drivers missing
that I absolutely have to have or some expensive equipment will need
replacement, not something I can afford. I am sure however that XP is
my last Microsoft operating system even if they find a way to
forcefully make me abandon it, as they've done with some of their
development tools. I haven't yet found a Linux replacement for
PaintShop Pro either, but I'm always looking. The common ones like Gimp,
etc. just don't cut it for the kind of work I do but patience may pay
off, I hope. I see a lot of projects going on.
Thank you for giving your opinions! And hey, what do you think about
anti-virus and security software? This netbook came with a trial of
Norton's "2009" offering, but I like F-Secure, and at the store
there's a steep discount on Kaspersky, for up to three PCs...

Again, that's subjective; I'm currently using a bundle I purchased at a
cut rate from Symantec: Norton 2009 NAV (antivirus), NIS (Internet
Security) and SystemWorks. The 2009 version works well, has a much
smaller footprint than it used to and IMO more useful bells & whistles
than anyone else. BUT ... their virus subscriptions have gotten so
expensive, they're putting me off at the moment. BUT again, with Norton
you get a turn-key application. With the others, many of which are also
free so far, you end up with three to five or more separate applications
to achieve what Norton is giving me all in one app. I've had occasion to
use their tech support last winter too and they were top notch, besides
being quick. But we all know that sort of thing varies with the culture
at any place, so ... who knows what it's like today<g>? Or tomorrow?
One of the biggies I like with Norton is their auto update seems to be
rock solid and never requires attention.
Only downside I can think of is that when you switch off scanning
incoming emails you get an error indication in the system tray. They
claim, and apparently it's true, to have fixed the problems of messing
up emails by scanning them. I've had it turned back on for the last
several months and nary a problem, even though I do e-mails sometimes
when the system is very busy and the CPU is maxed out at near 100%. But
the good news is, it's not using my pagefile<G>!! lol, sorry, couldn't
resist.

HTH a little, & enjoy!

Twayne`
 
