Memory Paging - Tweak Request

Unless they've changed it (and actually there are defrag programs that can defrag the pagefile) there is a reason to set the min and max to the same amount. I always do that myself...

On the other hand, with 512 MB of RAM, you need to plan on paging in winXP, and 1.5x physical memory has tended to be the recommendation going back to the Windows NT 4.0 days. BTW, Windows XP is based on the NT 4 code base (as was win2k before it).

In fact the description of what's happening here sounds a hell of a lot like what some were describing from the NT 4 days, and like what happened when I went to patch Unreal from 2.19 to 2.20, installed on NT 4, some years back. The patch had issues, and it used swathes of memory that a straight out-of-box install patched directly to 2.20 didn't... Course when Unreal came out (and in the days before Unreal Tournament) our computers were smaller, with less memory and hard drive space as well...

What happens (and what did happen) is it would approach the limit for physical and virtual memory, and decide to increase the pagefile. But unlike the way win9x would dynamically grow/shrink it, NT would only increase it a few MB at a time. Just enough to get it out of hot water (with the warning message it gave). As requirements increased, it would repeat, again, and again...

Once it was done, even after the paging file had shrunk back down, Diskeeper showed pagefile.sys having something like 4,000 to 5,000 file fragments. With this heavily fragmented file (and in those days there wasn't a means to defrag the paging file) one was stuck with p*ss poor performance until after a reboot. My performance was negatively impacted, even with a 10,000 rpm SCSI drive...
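As an illustrative sketch (not Microsoft's actual algorithm; the increment size and figures are my own assumptions), the difference between growing a pagefile a few MB at a time and pre-sizing it can be modeled like this:

```python
def count_growth_extents(start_mb, needed_mb, increment_mb):
    """Model conservative NT 4-style pagefile growth (hypothetical).

    Each time demand exceeds the current size, the file grows by a
    small fixed increment, and each growth is assumed to land in a
    new on-disk extent (a potential fragment).
    """
    size = start_mb
    extents = 1  # the original contiguous file
    while size < needed_mb:
        size += increment_mb
        extents += 1  # each expansion adds at least one new extent
    return extents

# Growing 128 MB -> 512 MB in 2 MB steps: 1 + 192 = 193 extents.
print(count_growth_extents(128, 512, 2))  # 193
# A file pre-sized at 512 MB never grows: it stays at 1 extent.
print(count_growth_extents(512, 512, 2))  # 1
```

Under these assumptions the extent count scales with the number of growth events, which is why a file that grew a couple of MB at a time under sustained memory pressure can end up with thousands of fragments.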

A few notes on this:

- With 512 MB physical memory, you need more than a 512 MB swap file. This is also obvious from what is being seen here

- 1.5x is recommended (you might need more, depending on your needs). I have 512 MB, but haven't loaded BF2, so can't comment on what the need here will be.

- Having a static swapfile in a winNT based system (which given the code base, and also the suggestion here, would still seem to apply to XP) can be a good thing. It isn't about "saving disk space" necessarily, but rather preventing extreme file fragmentation on the thing...

Have you checked the amount of fragmentation on your drive after it had increased as such, to see if it has gotten very high since the swapfile expansion?

- There are now tools to defragment it (which didn't exist back in the day). If it is heavily fragmented, and you decide to use a static swapfile, check it with a defragmenter (capable of paging file defragmentation), and if fragmentation is there, defrag that page file... Setting it static now would probably do little if it's already fragmented as badly as I have noticed in the past, and that isn't taken care of

- The algorithm that winNT (and I'm gathering 2k and XP) had for expanding the pagefile is, IMO, not as usable. This is one area where I think win9x handled the situation better. It's also less bothersome to users not to have to click on dialog boxes the whole time it's expanding...
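The 1.5x rule of thumb in the notes above is just arithmetic; here is a minimal sketch (the multiplier is the conventional figure, not a hard requirement):

```python
def recommended_pagefile_mb(physical_mb, multiplier=1.5):
    """Classic NT-era rule of thumb: pagefile = 1.5x physical RAM.

    Workloads that page heavily may need a larger multiplier.
    """
    return int(physical_mb * multiplier)

# 512 MB of RAM -> a 768 MB pagefile; setting both the initial (min)
# and maximum size to this value keeps the file static.
print(recommended_pagefile_mb(512))   # 768
print(recommended_pagefile_mb(1024))  # 1536
```

With 512 MB of RAM that works out to 768 MB, which also illustrates the first note: a 512 MB pagefile is smaller than what the rule suggests.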
 
Son Goku said:
Unless they've changed it (and actually there are defrag programs that can defrag the pagefile) there is a reason to set the min and max to the same amount. I always do that myself...
there is no good reason; none of the reasons anyone has ever mentioned are correct

Once it was done, even after the paging file had shrunk back down, Diskeeper showed pagefile.sys having something like 4,000 to 5,000 file fragments
in XP the pagefile is in exactly the same place and condition once extents are discarded on restart... if it was contiguous before expansion it is contiguous after expansion on restart, provided the drive is healthy

now, if the initial minimum is so small that XP needs a bigger file and the OS immediately expands the pagefile, then of course the new extents are fragmented due to the incorrect settings... but that's a too-small pagefile, which never stays at the initial minimum setting

Having a static swap file in a winNT based system (which given the code base, and also the suggestion here, would still seem to apply to XP) can be a good thing. It isn't about "saving disk space" necessarily, but rather preventing extreme file fragmentation on the thing...
there is no fragmentation of the original extent of the pagefile once it's contiguous

Have you checked the amount of fragmentation on your drive after it had increased as such, to see if it has gotten very high since the swapfile expansion?
it's only fragmented because of the added extent, this is discarded on restart...the original extent has not moved and must remain in the same condition it was in before expansion

please click the link in my sig
 
Most of those pagefile tweaks were written back when people had small hard drives, so saving disk space was an issue. Back when I had a 10 GB drive I would set the page file manually, but now with large drives it really is not an issue.

I really feel there is no reason to mess with the pagefile anymore, since people on average have 512 MB of RAM and hard drives of at least 80 GB or larger. I just set the page file to system managed, and then I never have to worry about pagefile errors.

Windows sets the minimum to 1534 MB since I have 1 GB of RAM, and that minimum is large enough that it will probably never be expanded. The highest I have ever seen in Task Manager under commit charge peak was 800 MB, and that was when I was encoding videos.

Going from that number I could probably set my page file minimum lower, but life is too short to worry about stuff like that in my opinion. XP does a good job at managing memory, so I let it do its job and I worry about more important things.

Losing 1 GB or so of hard drive space means nothing to me as I have over 250 left and that is just one of my drives. :up:
 
winXP might have made a change wrt this. One can actually hope, as Windows NT 4's handling of virtual memory was not really how I would like to have seen it done. The system had its pluses over win95/98, but that wasn't one of them IMO... What I mentioned is exactly what had occurred in the days of winNT 4.0. Since that one experience with the high degree of fragmentation, I never wanted to try that one again :)

Then again I have since purchased a second Cheetah (but from the third generation, rather than the second), and as such can divide the pagefile between them (so virtual memory gets the cumulative read/write bandwidth). The newer is 2x faster than the older in contiguous reads, when I last benched them both in SCSI bench 32. I try to keep 2/3 on the faster drive...

On win9x, I always left it for the system to manage.

I do wonder what Sinister was seeing with

I keep getting VM is low warning when I play BF2.

From my experience in NT 4, I'm reading this as him getting this message repeatedly per gaming session. This is because, in the past at least, it just grew the pagefile enough so that it wasn't almost out, but not enough to account for just how high usage could go. It was extremely conservative about growing, and threw up a dialog box every time it did so.

Again, this might have changed (as I never did bother to put myself in a situation to see this again), but what Sinister is describing could sound like this.

BTW, I hate getting swathes of popups (one reason peeps don't like popup ads at websites :) ), including dialog boxes that one has to keep clicking OK on. If you're gonna expand it, just do so... don't interrupt me with a popup. I was getting these when I applied the 2.20 patch in Unreal, back in the day, every 30 seconds. It was a constant stream of clicking "OK", "OK", "OK", etc :eek: :(

At least it was a patch though... Popup dialogs when in game (and one could be close to death) are a tad annoying :mad: :) I had that when Azureus wanted to upgrade and kept pestering me (though only once a day, thank gawd), but the then-newer version ate up 100% of my CPU like there was no tomorrow. The latest version (out there currently) doesn't do that :up:
 
I do wonder what Sinister was seeing with

I keep getting VM is low warning when I play BF2.

my guess is that he listened to the irresponsible advice in alex's paper and made a 512 initial minimum... WELL below the settings MS insists should be the least to set the pf to for best performance
 
perris said:
my guess is that he listened to the irresponsible advice in alex's paper and made a 512 initial minimum... WELL below the settings MS insists should be the least to set the pf to for best performance

Actually I took his advice by setting my Min and Max the same...

as far as the 512 goes, I plugged that in myself... I haven't had problems like that for the past year or so, so I really never dug into it until last night. Yes, I've read to set it to 1.5x before, and read elsewhere to go off what you are using in Task Manager. I usually take information from everyone and try what works best for me.
 
Sinster said:
Actually I took his advice by setting my Min and Max the same...

as far as the 512 goes, I plugged that in myself... I haven't had problems like that for the past year or so, so I really never dug into it until last night. Yes, I've read to set it to 1.5x before, and read elsewhere to go off what you are using in Task Manager. I usually take information from everyone and try what works best for me.

Makes sense to look at different viewpoints on something and compare them to see what will work best.

As to taking what one is seeing in Task Manager, I would contend that a problem can arise because the set of software installed on the HD isn't fixed. First off, not everything is loaded into memory when one first launches a program. If one monitors it for a while, one will notice that depending on what one does, the resource requirements can go up as it runs.

But more to the point, in adding new software there is no guarantee that, for instance, a new game like BF2 won't use more memory than what one has run previously. Some headroom is a good idea to account for the possibility of bigger programs.

BTW, with 512 MB RAM, I'm not exactly strapped, but am beginning to see a situation on my own PC where I would definitely buy more whenever I upgrade. I'll probably move to a GB or something on my next comp, and keep the virtual at 1.5x physical...

perris said:
my guess is that he listened to the irresponsible advice in alex's paper and made a 512 initial minimum... WELL below the settings MS insists should be the least to set the pf to for best performance

Actually, I meant more wrt the dialog boxes coming up to indicate he's running out of memory, and whether what he's seeing here is similar to what I saw in NT 4: getting a dialog box that says "you're almost out of memory" every so many seconds :down: Microsoft might have changed the way they grow the swap file since then, however.

Personally, if it could grow and wouldn't fragment, I'd prefer that it do so seamlessly, without interrupting the user. A message could be left in the system log to indicate this has happened, so one could check Event Viewer on their own time. That said, they might also be thinking "not everyone checks Event Viewer"...
 
Son Goku said:
Personally, if it could grow and wouldn't fragment, I'd prefer that it do so seamlessly

it's an important notice; you MUST raise the initial minimum for best performance if this notice appears... you need to avoid expansion by keeping the pagefile bigger than the memory in use... usually, once you have more memory in use than pagefile, the pf is going to need to expand

the extents of the pf ARE usually fragmented when the pf grows, but on reboot the added extents are tossed out... the original pf is in the same condition as it was before it expanded as soon as you reboot... this expansion is necessary for the memory management model to be efficient... if there isn't enough pagefile then recently used data will be swapped out faster than it should be

if the pf has ever grown it means your initial minimum is way too small; increase the initial minimum to the point that the OS will never need to expand it... usually the default is more than enough... never go below that
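One way to sanity-check an initial minimum against the commit-charge peak seen in Task Manager, as a sketch (the 25% headroom figure is my own illustrative assumption, not an MS recommendation):

```python
def initial_minimum_mb(peak_commit_mb, default_min_mb, headroom=1.25):
    """Pick an initial pagefile minimum intended to avoid expansion.

    Takes the commit-charge peak observed in Task Manager, adds some
    headroom, and never goes below the system default minimum.
    """
    return max(default_min_mb, int(peak_commit_mb * headroom))

# An 800 MB peak with a 1534 MB default: keep the 1534 MB default.
print(initial_minimum_mb(800, 1534))   # 1534
# A 2000 MB peak would call for raising the minimum, here to 2500 MB.
print(initial_minimum_mb(2000, 1534))  # 2500
```

This matches the advice above: if the observed peak stays comfortably under the default minimum, leave the default alone; only raise it when usage has actually approached or exceeded it.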


PLEASE read the link in my sig; though there are a few errors concerning such things as network information not needing an area on the disk for rem backup, the concepts are explained
 
perris said:
it's an important notice; you MUST raise the initial minimum for best performance if this notice appears... you need to avoid expansion by keeping the pagefile bigger than the memory in use...

Fair enough, but they could do it just as effectively through the system log as through a dialog box that grabs the user's attention at the moment and forces them to keep clicking OK again and again... Especially where it happens upon each incremental, and rather conservative, growth of the page file (and this is where I was wondering if he was seeing the same behaviour here that I did in NT 4, or if MS changed this).

Let's take playing a game for instance, as this is where it's mentioned: the pagefile fragments, performance takes a bit of a nose dive, the person gripes a bit about lag...

On the other hand, a person is playing and a dialog keeps coming up. If it minimizes the game, it can be worse... (I never really did let things hit this point, and on that one buggy Unreal patch, I filed a bug report on what I was seeing... which did get addressed in the notes for a later patch for the game.)

Slow performance, vs "where did my game go", get back, "I'M DEAD!!!" :mad:

:D

Yeah, point taken: not everyone checks their system log routinely. But there are some things (if they came up for me) I'd rather address on my own time...
 
the point is you should not touch pagefile settings; if you left them as they were after a Windows install, or even set it to system managed, you'd never get a performance hit and your game would never disappear while Windows tells you "nyah, thought you were being smart eh??"

Do as perris says: set your minimum to the recommended minimum or set it to system managed, and leave it alone. Everyone - even if you think you know better :) Just because you read an article on the internet about virtual memory doesn't mean it's true.
 
BTW, what I had mentioned, I mentioned from first-hand experience. I had not mentioned it simply from having read an article or two :D

Truth be told, I have always tended to set 1.5x physical (albeit, given a person's uses, they might need to set it higher). Our uses of our computers are not all the same, and some programs are more intensive than others. That said, someone running a real memory hog of a program probably should get more RAM.

The problem with the Unreal patch was more to do with a bug in the patch (and issues with an upgrade from 2.19). I suspect it was constantly allocating memory without ever freeing it when finished (until the program finally terminated). Needless to say, Epic listed that very problem as a bug fix in their next version...

That was NT 4, and though XP is based on the same code base (several generations later), improvements to this situation are certainly possible. That's why I said I wondered what Sinister meant... not that he shouldn't have decreased it to a 512 MB minimum, but rather as a "does it manage the expansion the same way as NT 4?"...

But such mention has already gone beyond the scope of a simple setting and into the realm of a suggestion to Microsoft (aka their "wish list") for changes to how the software operates and is coded...

From a settings perspective, to avoid that, I would set the min and max high enough (but the same) to avoid it ever having to grow. He is saying he would set the min high enough so as to avoid the same thing, but leave the max alone...

And yes, arguments could be had both ways. As to a need to increase it, either situation, "it's growing and fragmenting" or "it's hit the ceiling and is having to pull stuff out of memory to put back in", would indicate a need to increase it further (or possibly get more physical RAM and increase it further)... Neither scenario is ideal, though it's good to hear the swapfile doesn't remain fragmented (as it had on my NT 4 system where this happened). :up:

The other is more of a potential gripe about how Microsoft tries to alert us (and something I would need to check up on): that their expansion in past versions of the NT code base was as conservative as it was... Personally, I find the system log preferable, as I don't think it's something the user has to be "immediately interrupted about", especially on a continual basis with multiple dialog boxes each succeeding the other. As long as it functions, one can get back to it when finished with what one is doing... If they had made it one box, and left it at that, it wouldn't have been so bad. But there comes a point where it no longer feels like notification, but rather harassment by the software, as obnoxious or more so than the infamous Mr. Clippy :)
 
