I have a Windows 7 Professional computer with 12GB of RAM. On a fresh start-up, the amount of RAM is more than adequate for my needs. However, a couple of weeks into the up-time, the memory appears to just disappear, and the system starts using virtual memory, causing my performance to degrade.
For example, I typically have 3 instances of Visual Studio open, which early on in the up-time is perfectly fine. However, it gets to the point where even one instance of Visual Studio is too much load to handle, even after restarting Visual Studio.
Here are some screenshots that show you the differences in the task manager:
On Fresh Restart:
2 Weeks Up-time:
I understand that Windows 7 will try to use all available RAM over time to allow quick access to programs and other services (caching, basically). But this doesn't explain why, after 2 weeks of up-time, I have to close tons of programs/processes just to use a single program without screen-freezing lag, whereas with barely any up-time I can run multiple programs just fine.
What exactly is happening here?
Answer
You are looking at Private Working Set. I highly recommend adding Commit Size to your Task Manager columns.
RAMMap is the bomb, but most of the time you just need Commit Size, as opposed to all the other memory counters available.
Commit Size, in my experience, has always more closely matched the Memory meter (the old "PF Usage" counter on XP).
If I worked for MS, I would make it the default; it would save many wasted hours of head-scratching.
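If you want to spot the commit-hungry processes without clicking through Task Manager columns, something along these lines in PowerShell should work as a rough sketch (`PagedMemorySize64` on a process object approximates the Commit Size column; the rounding expression is just for readability):

```powershell
# Sketch: list the top 10 processes by approximate commit size.
# PagedMemorySize64 roughly corresponds to Task Manager's "Commit Size" column.
Get-Process |
    Sort-Object PagedMemorySize64 -Descending |
    Select-Object -First 10 Name, Id,
        @{Name = 'CommitMB'; Expression = { [math]::Round($_.PagedMemorySize64 / 1MB, 1) }}
```

Running this after a couple of weeks of up-time, and comparing against a fresh boot, should make it obvious which processes (or drivers leaking nonpaged pool, which RAMMap will show) are eating the commit charge.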