Virtual Memory Question?


lilsnoop40

Guest
hi, i have 1.0 GB of DDR 333 MHz memory and i am getting a virtual memory error saying it is too low. i know how to adjust it, but i don't know what to set it to to make the error go away.

thanks
 
control panel >system>advanced>performance settings>advanced>virtual memory

set min and max to 512 MB
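For reference, that dialog ultimately stores its settings in the registry, in the REG_MULTI_SZ value `PagingFiles` under `HKLM\SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management`; each entry has the form "path initial maximum" with sizes in MB. A minimal sketch of the entry format (the helper name is made up for illustration, and using the dialog is safer than editing the registry directly):

```python
# Sketch of the PagingFiles registry entry format used by the
# Virtual Memory dialog: "<path> <initial MB> <maximum MB>".
def pagefile_entry(path, initial_mb, maximum_mb):
    """Build one PagingFiles entry string, e.g. a 512 MB fixed pagefile."""
    if initial_mb > maximum_mb:
        raise ValueError("initial size cannot exceed maximum size")
    return f"{path} {initial_mb} {maximum_mb}"

print(pagefile_entry(r"C:\pagefile.sys", 512, 512))
# -> C:\pagefile.sys 512 512
```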
 
look in my "favourite tweaks" thread, this is one of Dealer's favourite subjects. So far from what I can figure, it is best to just let Windows manage your virtual mem.
 
It sounds like the page file may be fragmented, and the O/S cannot allocate a single contiguous chunk large enough for a particular operation.

Try deleting the page file (or setting its min and max to 0), defragmenting the hard disk, and then resetting the page file to your preferred settings.

Here's why:
After you have defragged, all of your available free space will be a single contiguous chunk. If you create a fixed-size page file (min and max the same), it will sit as a single allocated file in the newly defragmented space. If it is larger than your available physical RAM, there will be plenty of space for all operations.

However (there's always a however), this could be down to resource-hungry applications not releasing resources once they've finished with them. Rebooting clears the swap file, hence the issue goes away. There's really not much that can be done about this except to track down which are the leaky apps and try to use alternatives, or else use these apps only when necessary.

Also... how much free space have you got? You can safely lower the pagefile setting to about 150 MB or so for both Min and Max values. Keep them the same so that Windows doesn't expand it. When it complains about not having enough, look in Task Manager under Processes for the biggest hog... that will most probably be your problem... it might be some app you don't know is running that's hogging up everything.

Have you ever tried Cacheman? It works really well for those kinds of problems. You can download the program from here:
http://www.outertech.com/downloads.php

Although I don't have the pagefile problem, I did know someone who had the zero-pagefile problem... He found out, talking to the boys at MicroSquish, that certain Intel chipsets have problems with XP. He downloaded a patch file and the problem was gone...
Supposedly, to fix the problem, you download the Intel Application Accelerator at http://support.intel.com/support/chipsets/iaa/

Straight from the guys at Microsoft.

I hope this works for you :)
 
thanx dreamliner...hehe

ya, do not set the max and minimum to 512 MB.

just about every user will suffer a performance hit with this setting, especially a person who's already getting a low memory warning.

since you're getting this warning, and assuming you haven't already touched the page file, you must increase the initial minimum to at least 2x ram (2048 MB for your 1.0 GB), and you should set the maximum to 4096 MB.

that should solve your problem.

if you ever get the warning again, then simply increase the initial minimum. do not make the maximum the same size as the minimum... ever... it does not make sense in xp, or any nt kernel
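The sizing rule above can be written out as a small helper. To be clear, this is just the poster's rule of thumb (initial minimum of twice physical RAM, maximum of 4096 MB), not an official Microsoft formula, and the function name is invented for illustration:

```python
def recommended_pagefile_mb(ram_mb, max_mb=4096):
    """Pagefile sizing per the advice above: initial minimum of twice
    physical RAM, with a larger maximum so Windows can still expand it."""
    initial = 2 * ram_mb
    return initial, max(initial, max_mb)

# For the original poster's 1.0 GB (1024 MB) of RAM:
print(recommended_pagefile_mb(1024))  # -> (2048, 4096)
```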
 
gonaads, i'm very surprised at your post... there is no user that can safely lower the pagefile to that small a setting. what will happen with a setting that small is the os will find areas that you did not allocate, and it will page there, creating more hard drive activity and a more fragmented environment, not a less fragmented one.

sometimes we suffer performance hits that we don't even realize, and with a pagefile this small the os is definitely slower. some people might not notice, but the slowdown is there regardless.

you cannot stop xp from expanding the pagefile by setting a static max and min, even though you try. as soon as the commit charge reaches the commit limit, which is the only time the pagefile even wants to expand, and of course is the very time YOU NEED the pagefile to be bigger (and, obviously, THE COMMIT CHARGE REACHES THE COMMIT LIMIT SOONER WITH A SMALL PAGEFILE THAN WITH A BIG PAGEFILE), the os will find other areas on the hard drive and page to those areas. it will do it more, it will do it sooner, and it will do it less efficiently than if you allow the os and the pagefile to do the job they were well designed to do
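The commit-charge argument is easy to put in numbers. In a simplified model (real Windows accounting is more involved), the commit limit is physical RAM plus the pagefile's current size, so a smaller pagefile means the limit is hit after fewer megabytes of allocations:

```python
def commit_limit_mb(ram_mb, pagefile_mb):
    """Simplified model: the commit limit is RAM plus pagefile size."""
    return ram_mb + pagefile_mb

ram = 1024  # the original poster's 1.0 GB
for pagefile in (150, 512, 2048):
    limit = commit_limit_mb(ram, pagefile)
    print(f"{pagefile:>5} MB pagefile -> commit limit {limit} MB")

# With a 150 MB pagefile the commit charge hits the limit at 1174 MB of
# committed memory, versus 3072 MB with a 2048 MB pagefile.
```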

you can easily prove this to yourself by lowering your pagefile to the setting you suggest and taking a look at perfmon... you will see more pagefile activity with this setting, not less.

all of this was already documented by microsoft, way back when nt was first released... here's the paragraph and reference:

"...A pagefile that's set too small can lead to overactive disk swapping, or "disk thrashing." The only real drawback with a relatively large swapfile is that you might not have as much disk space available for other uses as you would if you'd followed the pagefile setup recommendations."...

for your reference: document number Q102020.

i'm amazed there are people that still believe there can possibly be a benefit to a small pagefile

now, since that document, microsoft has greatly increased the minimum recommendation, but the facts of that document remain: there is no slowdown whatsoever with a big pagefile, and quite a performance hit if the pagefile is too small

and now, about cacheman... no, cacheman will make this problem worse. it will release ram that is in use, and if it's not in use, then it's already released by xp.

cacheman is for millennium and 9x... hardly for nt... that's exactly what the pagefile is for in the first place
 
by the way, the simplest way to defrag the pagefile is with this free pagefile defrag program...

once your pagefile is contiguous, it will never, ever get fragmented again, until you increase the size of the pagefile yourself manually or increase your ram
 
This was only to find a possible memory hog, nothing more than that.

And as for Cacheman, this is directly from their site:

All programs except of CpuUsage System Service support Windows 95, 98, 98 SE, ME, NT4, 2000 and XP.
 
ah... I didn't realize you just wanted to track down a memory hog, and that this was the purpose of lowering the pagefile.

there is an easier way to do it;

For every program running on a computer, the operating system allocates a portion of physical memory. This is called the working set. Even if the program is not generating any activity, the operating system allocates memory for the program's working set.

if you watch perfmon when closing any program, you will see the corresponding pagefile use decrease as you turn off each program... you can monitor the working set of any program in that fashion
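The effect described above, memory use dropping as a program releases its allocations, can also be demonstrated inside a single process with Python's standard-library tracemalloc module. This is only an analogy for what perfmon shows, not the Windows working-set counter itself:

```python
import tracemalloc

tracemalloc.start()

data = [0] * 1_000_000  # allocate several MB, like a running program
while_allocated, _ = tracemalloc.get_traced_memory()

del data                # "close the program", releasing its memory
after_release, _ = tracemalloc.get_traced_memory()

print(f"traced while allocated: {while_allocated} bytes")
print(f"traced after release:   {after_release} bytes")
assert after_release < while_allocated  # usage drops, as perfmon would show
```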

now, as far as that quote you posted claiming xp support for cacheman:

gonaads, that is an intel site, not a microsoft site... microsoft advises against the use of memory programs in xp, and they do not support the use of cacheman in xp
 
