Son Goku (No lover of dogma)
- Joined: 14 Jun 2004
- Messages: 1,980
Unless they've changed it (and actually there are now defrag programs that can defrag the pagefile), there is a reason to set the min and max to the same amount. I always do that myself...
On the other hand, with 512 MB of RAM you need to plan on paging in winXP, and 1.5x physical memory has been the recommendation since back in the Windows NT 4.0 days. BTW, Windows XP is built on the NT code base (as was win2k before it).
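To put numbers on that rule of thumb, here's a quick sketch. The 1.5x multiplier is the NT-era recommendation mentioned above, and setting min equal to max is the static-sizing advice from this thread; nothing here is enforced by Windows itself:

```python
# Sketch of the classic NT-era pagefile sizing rule discussed above:
# size = 1.5x physical RAM, with initial (min) and maximum set equal
# so the file never has to grow (and so never fragments from growth).

def pagefile_size_mb(ram_mb, multiplier=1.5):
    """Return (min_mb, max_mb) for a static pagefile: both the same value."""
    size = int(ram_mb * multiplier)
    return size, size

print(pagefile_size_mb(512))  # 512 MB RAM -> (768, 768), i.e. a 768 MB static file
```

With 512 MB of RAM that works out to a 768 MB pagefile; more if your workload demands it, as noted below.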
In fact, the description of what's happening here sounds a hell of a lot like what some were describing from the NT 4 days, and like what happened when I patched Unreal from 2.19 to 2.20 on NT 4 some years back. The patch had issues, and it used swathes of memory that a straight out-of-box install patched directly to 2.20 didn't... 'Course when Unreal came out (in the days before Unreal Tournament), our computers were smaller, with less memory and smaller hard drives as well...
What happens (and what did happen) is it would approach the limit for physical and virtual, and decide to increase it. But unlike the way win9x would dynamically grow/shrink, it would only increase it a few MB at a time. Just enough to get it out of hot water (with the warning message it gave). As requirements increased, it would repeat, again, and again...
Once it was done, even after the paging file had shrunk back down, Diskeeper showed pagefile.sys having something like 4,000 to 5,000 file fragments. With such a heavily fragmented file (and in those days there wasn't a means to defrag the paging file), one was stuck with p*ss-poor performance until after a reboot. My performance was negatively impacted even with a 10,000 rpm SCSI drive...
A few notes on this:
- With 512 MB physical memory, you need more than a 512 MB swap file. This is also obvious from what is being seen here.
- 1.5x is recommended (you might need more, depending on your needs). I have 512 MB, but haven't loaded BF2, so can't comment on what the need here will be.
- Having a static swapfile on a winNT-based system (which, given the code base and also what's being seen here, would still seem to apply to XP) can be a good thing. It isn't necessarily about "saving disk space", but rather about preventing extreme fragmentation of the file...
Have you checked the amount of fragmentation on your drive since the swapfile expanded, to see if it has gotten very high?
- There are now tools to defragment it (which didn't exist back in the day). If it is heavily fragmented and you decide to use a static swapfile, check it with a defragmenter capable of paging-file defragmentation, and if the fragmentation is there, defrag the pagefile... Setting it static now would probably do little if it's already fragmented as badly as I've seen in the past and that isn't taken care of.
- The algorithm that winNT (and, I'm gathering, 2k and XP) uses for expanding the pagefile is, IMO, not as usable. This is one area where I think win9x handled the situation better. It's also less bothersome to users not to have to click on dialog boxes the whole time it's expanding...
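For reference, the static min = max setting argued for above normally goes in through the Virtual Memory dialog (System Properties > Advanced), but it ends up stored in the registry under HKLM\SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management, in the REG_MULTI_SZ value PagingFiles, one "path initial maximum" entry per pagefile with sizes in MB. The helper below just builds such an entry as a string for illustration; it doesn't touch the registry:

```python
# Illustrative only: format a PagingFiles registry entry ("path min max",
# sizes in MB). Making min == max is what gives you the static pagefile.

def paging_files_entry(path, min_mb, max_mb):
    return f"{path} {min_mb} {max_mb}"

print(paging_files_entry(r"C:\pagefile.sys", 768, 768))
```

A change there (or in the dialog) takes effect after a reboot, which is also when a freshly preallocated pagefile gets a chance to be laid down contiguously.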