
[SOLVED] Typical Cached Memory

jayrod12

Well-known member
Joined: Aug 1, 2013
Posts: 87
Good Day All,

Quick question: what is a typical amount of memory that should show as cached when the computer is idle?

Under the Performance tab in Task Manager I currently have 8 GB total, but 5.2 GB cached, 5.3 GB available, and 189 MB free. And all I currently have running is Nightly, which is using about 500 MB.

Maybe it's just me, but it seems like there really isn't much free. Am I worried over nothing?
 
There is no typical. For one, your computer is rarely ever idle. When YOU, the user, start slacking, Windows starts doing housekeeping. Your anti-malware program scans, files are "indexed", updaters look for Windows and program updates, email programs look for new email, social networking programs look for new content, background or scheduled defragging may occur (if not disabled), and more.

For another, there's hardly such a thing as a "typical user". That is, within seconds of firing up a Windows computer for the very first time, users begin customizing Windows to their liking, installing their favorite programs, setting up their security, etc.

and 189 MB free
Where do you see that? And what version of Windows are you using? If Windows 7, and that is the amount of physical RAM, then it just means that is the amount of RAM that is totally uncommitted. It may not sound like much, and it isn't, but since you have 5.3 GB available, you have nothing to worry about.
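If you are curious where those numbers come from, you can pull them straight from the Win32 GlobalMemoryStatusEx API. Below is a minimal sketch in Python (via ctypes, Windows only) - it reports "available" physical memory, the free + standby figure that actually matters, while Task Manager's tiny "Free" number only counts pages on the free and zeroed lists.

Code:
import ctypes
from ctypes import wintypes

# Mirrors the Win32 MEMORYSTATUSEX structure.
class MEMORYSTATUSEX(ctypes.Structure):
    _fields_ = [
        ("dwLength", wintypes.DWORD),
        ("dwMemoryLoad", wintypes.DWORD),      # percent of RAM in use
        ("ullTotalPhys", ctypes.c_ulonglong),  # total physical RAM
        ("ullAvailPhys", ctypes.c_ulonglong),  # "available" = free + standby
        ("ullTotalPageFile", ctypes.c_ulonglong),
        ("ullAvailPageFile", ctypes.c_ulonglong),
        ("ullTotalVirtual", ctypes.c_ulonglong),
        ("ullAvailVirtual", ctypes.c_ulonglong),
        ("ullAvailExtendedVirtual", ctypes.c_ulonglong),
    ]

status = MEMORYSTATUSEX()
status.dwLength = ctypes.sizeof(MEMORYSTATUSEX)
ctypes.windll.kernel32.GlobalMemoryStatusEx(ctypes.byref(status))

GB = 1024 ** 3
print(f"Memory load:    {status.dwMemoryLoad}%")
print(f"Total physical: {status.ullTotalPhys / GB:.1f} GB")
print(f"Available:      {status.ullAvailPhys / GB:.1f} GB")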

Of course, there is a caveat - what matters is what is using your RAM. I am assuming you have scanned your system with updated anti-malware solutions to ensure your system is clean, and you have reviewed your startup programs to ensure unwanted junk is not running.
 
Yeah, Windows 7 Pro.

I'm running Security Essentials and Spybot S&D. The computer itself is not running slowly; I've just never noticed that the majority of the RAM is marked as standby as opposed to free.

I try to keep my computer fairly clean of startup programs and malware. Whether I am succeeding at that goal or not is another question.
 
I am not a fan of Spybot S&D anymore. Years ago with XP, yes. But today, with Windows 7/8, I prefer to verify MSE (WD in W8) is keeping me safe by running supplemental scans with Malwarebytes Anti-Malware (MBAM). If you really like Spybot, disable its real-time protection (if you are not using the free version) and stick with MSE.

I try to keep my computer fairly clean of startup programs and malware. Whether I am succeeding at that goal or not is another question.
Understand there is no harm in having lots of things start with Windows. The only real downsides are that boot times may increase a little and some extra system resources will be consumed - especially when Windows first starts. But, almost as quickly, most of the unused startup programs will go idle, and the consumed resources will be released. This can easily be confirmed in Task Manager as CPU use drops to, or near, 0% and most of your RAM shows as "available".

So keeping your system "clean of startup programs" is one thing - keeping it clean of malware is another story.
 
That's exactly how I've got mine set up. Real-time protection from Spybot is turned off and I'm using MSE for the majority of my stuff.

If only I actually knew how I was doing with my battle against malware. MSE scans regularly (quick scans, plus a full scan weekly) and is updated daily. As far as I remember, it always comes back clean.
 
Back in the early days of Windows, RAM that wasn't in use just sat idle; it wasn't until a process needed something that Windows would seek out free RAM and get it straightened out and ready for that process. Now, Windows seeks out all available RAM and puts it to use, often preemptively loading files into memory for the processes you commonly run (like your browser) - this happens through the Superfetch service. RAM is used a lot more intelligently now, instead of leaving large portions of it collecting dust like back then.
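If you want to watch those lists yourself, the standard Memory performance counters break the standby cache down by priority. Here's a quick one-shot sample using the built-in typeperf tool (a Python sketch, Windows Vista/7 or later; RAMMap shows the same data graphically):

Code:
import subprocess

# Take a single sample of the standby (cached) and free page lists.
subprocess.run([
    "typeperf",
    r"\Memory\Standby Cache Normal Priority Bytes",
    r"\Memory\Standby Cache Reserve Bytes",
    r"\Memory\Standby Cache Core Bytes",
    r"\Memory\Free & Zero Page List Bytes",
    "-sc", "1",   # one sample, then exit
])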

It's only when the standby and other available lists start getting sapped dry of their pages by processes constantly requesting more and more that your memory starts running low. So think of it like this: low memory isn't because more and more of your RAM is being used up - the majority of it is always in use by Windows, if only for prepping. Rather, low memory is because more and more is being used by your drivers, applications, and services, the very stuff Windows is prepping and handing memory to. When they request more than what Windows can give from RAM, it starts using the paging file on the hard drive, at least for the memory that is allowed to be paged out (most driver/kernel memory can't be). When the paging file maxes out, the system starts giving up and telling them all that it's out of memory and can't offer any more because it's already in use by some process(es). That is when the system starts bugging out.
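To put rough numbers on that, the system-wide commit charge and commit limit (RAM plus page file) can be read from the GetPerformanceInfo API. A minimal Python/ctypes sketch (Windows only) - when CommitTotal bumps into CommitLimit and the page file can't grow any further, that's the "out of memory" wall described above:

Code:
import ctypes
from ctypes import wintypes

# Mirrors the Win32 PERFORMANCE_INFORMATION structure; counts are in pages.
class PERFORMANCE_INFORMATION(ctypes.Structure):
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("CommitTotal", ctypes.c_size_t),    # committed pages, system-wide
        ("CommitLimit", ctypes.c_size_t),    # RAM + current page file
        ("CommitPeak", ctypes.c_size_t),     # highest commit since boot
        ("PhysicalTotal", ctypes.c_size_t),
        ("PhysicalAvailable", ctypes.c_size_t),
        ("SystemCache", ctypes.c_size_t),
        ("KernelTotal", ctypes.c_size_t),
        ("KernelPaged", ctypes.c_size_t),
        ("KernelNonpaged", ctypes.c_size_t),
        ("PageSize", ctypes.c_size_t),
        ("HandleCount", wintypes.DWORD),
        ("ProcessCount", wintypes.DWORD),
        ("ThreadCount", wintypes.DWORD),
    ]

info = PERFORMANCE_INFORMATION()
info.cb = ctypes.sizeof(PERFORMANCE_INFORMATION)
ctypes.windll.psapi.GetPerformanceInfo(ctypes.byref(info), info.cb)

GB = 1024 ** 3
print(f"Commit charge: {info.CommitTotal * info.PageSize / GB:.1f} GB")
print(f"Commit limit:  {info.CommitLimit * info.PageSize / GB:.1f} GB")
print(f"Commit peak:   {info.CommitPeak * info.PageSize / GB:.1f} GB")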

Check this article for further details on how Windows rummages through memory for use.
 
That's exactly how I've got mine set up. Real-time protection from Spybot is turned off and I'm using MSE for the majority of my stuff.
That sounds just fine to me.
If only I actually knew how I was doing with my battle against malware.
If MSE and the supplemental scans come out clean, then that's your evidence you are doing well. For another warm fuzzy, download and install MBAM as a double (or triple) check - the free version is on-demand only. MBAM is the ONLY security program I have no problem recommending folks purchase. The paid version has a real-time component you can safely run WITH your current anti-malware program. Typically, running two real-time anti-malware solutions at the same time is not recommended because of potential conflicts (two dogs guarding the same bone) and unnecessary consumption of system resources. But MBAM plays well with other programs and has a small resource footprint.

That said, the user is always the weakest link in security, so it is still up to you to keep Windows and your programs fully patched and updated, to avoid risky behavior - like being click-happy with unsolicited downloads, attachments, or links - and to avoid participating in illegal filesharing via Torrents and P2P sites, and other illegal activities. Those are the places where bad guys tend to wallow and release their new malicious code - code the anti-malware providers have not yet had time to create definition/signature updates for.
 
RAM is used a lot more intelligently now
And much more efficiently and effectively too.
When the paging file maxes out, the system starts giving up and telling them all that it's out of memory and can't offer any more because it's already in use by some process(es). That is when the system starts bugging out.
Very true - but that's another reason to just let Windows manage your page file, even if you have gobs of RAM. After 20+ years, Microsoft has memory management figured out.

If you let Windows manage the page file and the page file maxes out, that is probably a good sign you need to free up some disk space, or buy more disk space.
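An easy way to check how close the page file is to maxing out is Win32_PageFileUsage, which reports the allocated size against current and peak usage. A quick sketch using the built-in wmic tool (Python, Windows only; all sizes in MB):

Code:
import subprocess

# AllocatedBaseSize, CurrentUsage, and PeakUsage are reported in MB.
output = subprocess.check_output(
    ["wmic", "pagefile", "get",
     "Name,AllocatedBaseSize,CurrentUsage,PeakUsage", "/format:list"],
    text=True,
)
for line in output.splitlines():
    if line.strip():
        print(line.strip())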
 
Actually, it's usually a sign you need to buy more memory or reevaluate how you use the system. The system shouldn't ever really get to the point of using up the paging file, as its primary purpose has never been as a reservoir for overflowing memory but to store stale memory that isn't accessed frequently. The best system setup is with plenty of RAM and very little paging file!

Also, Windows paging file management still gets criticized by people like Mark Russinovich and joked about as poorly built. I don't know how Windows 8 handles it, but even Windows 7 still uses the ooooold method of sizing it as a percentage of your RAM, making the paging file vastly larger than it should be. The memory manager is superb, but the paging file sizing is still poorly done.
 
The system shouldn't ever really get to the point of using up the paging file
Nope. I disagree with that. Windows will use the PF even if you have 32 GB of RAM installed. And that's fine. It does NOT mean Windows improperly stuffs data onto a slow HD instead of keeping it in fast RAM. Windows 7/8 knows how to prioritize.
Also, Windows paging file management still gets criticized by people like Mark Russinovich
Still? No. 5-10 years ago? Yeah. Windows 7 (and now 8) memory management (and much of the kernel itself) is nothing like that of Vista, XP or before.

The best system setup is with plenty of RAM and very little paging file!
Nope - don't agree with that either - at least not IF you are suggesting users should manually limit the PF. I certainly agree 100% the best system should have plenty of RAM (8 GB is my recommended minimum for new dual-channel computers today), but leave the PF settings alone unless you are an expert and know how to properly analyze your memory use.

I've met Mark and have a great deal of respect for him and what he has to say. But what he said in 2008 before Windows 7 came out does not apply today. Users should not dink with the Page File setting unless they too are an expert in Windows memory management. And Mark is the only one of those I know.

If you have seen ANY recent documentation by Mark that suggests users should manually set PF settings in Windows 7/8, I would be very interested in seeing it. But I don't think you will find any because Windows 7 is not XP.
 
Yeah, for sure the paging file is used consistently, but as a means to page out stale memory - either the application says it won't be used frequently, or Windows has checked the memory and seen it hasn't been touched in a good while and so pages it out. That's its primary purpose. The problem with a large paging file on a system with ample RAM is that the file is expanded well beyond its actual usage. The system may end up using 10% or even just 5% of the bulk of the paging file, and that's only considering the initial minimum size that Windows establishes! Again, all because of a poorly constructed algorithm. So you have this hulking lump of hard drive space that hardly ever gets used.

As for the complaints about paging file management: yes, Mark himself, as well as speakers in other dev talks (Microsoft conferences, no less!), joke about how Windows uses an ugly formula to calculate the min and max page file sizes - and this was even in reference to Windows 7. He mentions it in his Memory Management talk and in his Pushing the Limits of Windows series, including a TechEd talk from 2010 where he was using Windows 7.

Now, one thing I certainly agree with is that no average Joe should be touching the thing. The process of figuring it out isn't all that complicated, but unless the average user has the knowledge to do it (they don't), then yes, that paging file should not be touched. My complaint isn't so much that people aren't adjusting their paging files and therefore have these huge masses of untouched HD space, but rather that Windows decided to make them into huge masses of untouched HD space. The formula generates ridiculously safe sizes for the paging file, as if the people who set it up thought, "No one would ever reach even half this amount!" In fact, it makes problem scenarios worse: if there's a memory leak, the disk thrashes needlessly for a long time until it eventually fills that huge page file. A smaller one that still fits normal memory usage patterns will fill up a lot quicker, causing less stress on the HD and prompting faster servicing - people are more likely to get a system serviced if it's crashing than if it's just sluggish. Large page files are especially nightmares for SSDs, whose longevity is considerably affected by writes.

When working on people's systems, the minimum size I set for their paging file is actually around 75% of their total RAM. This gives them a large region of freedom while still allowing kernel dumps to be made without problem. If it weren't for the size needed for kernel dumps, I would drop it even further, but those dump files are important for diagnostics, as we well know, so it's best not to complicate things by failing to produce them just because the paging file's too small.
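For illustration only, here is the arithmetic behind that 75% policy as a Python sketch (Windows only, using the built-in wmic tool) - though note the replies below make the case for leaving the page file Windows-managed:

Code:
import subprocess

# Total physical RAM, in bytes, from Win32_ComputerSystem.
out = subprocess.check_output(
    ["wmic", "computersystem", "get", "TotalPhysicalMemory", "/value"],
    text=True,
)
total_bytes = int(out.strip().split("=")[1])

# The 75%-of-RAM minimum described above, expressed in MB.
minimum_mb = int(total_bytes * 0.75 / 1024 ** 2)
print(f"Page file minimum at 75% of RAM: {minimum_mb} MB")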
 
The problem with a large paging file on a system with ample RAM is that the file is expanded well beyond its actual usage.
No, sorry, but that is not true. In Windows 7 and Windows 8, the page file will expand and contract as needed. It does not start out at the maximum possible size and then stay there. As for taking up large amounts of disk space - so what? Disk space is cheap. But does it take up huge amounts of space? No!

Note the screenshot of my W8 system below, which has 16 GB of RAM installed - a pretty large amount. And yet you can see I have Windows managing my PF on my boot (System) drive (256 GB SSD), the "Recommended" size is just 5.6 GB, and only 2.3 GB of disk space is currently allocated. Hardly a "hulking lump" or "huge masses of untouched HD space".

Large page files are especially nightmares for SSDs, whose longevity is considerably affected by writes.
I have a problem with that too. For one, today's SSDs are not affected by writes the way early SSDs were. And beyond that, most PF access is reads, not writes, so SSDs are ideally suited for page files (see the SSD FAQs, "Should the pagefile be placed on SSDs?").

[Attachment: PF.PNG - screenshot of the Virtual Memory settings dialog described above]

Curious - did you listen to the talk you linked above and note where he was talking about private committed virtual memory (which is what is saved to the PF)? Or where he says you MUST evaluate the system over time if you expect to set the PF manually? That is, you cannot just look at your RAM and your disk space and arbitrarily set numbers.

When working on people's systems, the minimum size I set for their paging file is actually around 75% of their total RAM
I contend that is a mistake, and you should just let Windows manage it. Why? Because that is an arbitrary number. It is highly unlikely you sat down with the user (or all the users of that machine) for several days running to determine how that computer is used (the workload) day in and day out, then looked at peak usage over that entire period. Analyzing over an extended period is essential for a true analysis and a proper fixed PF setting, because many users don't perform the same tasks every day - they may only perform a resource-intensive task once a month. So simply looking at the amount of RAM and the disk size is not a proper analysis.

Not only that, you must re-evaluate workloads every so often to ensure requirements have not changed. This means if you, as a technician, set a client's machine to 75% today, that may be too small 6 months from now. Letting Windows manage it eliminates that problem.

So, if you want to manually set the PF size, as Mark notes in that link, you need to perform an extensive, extended workload analysis, or just let Windows manage it.
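As a sketch of what that extended analysis looks like in practice: log the commit charge against the commit limit over days of representative use, then size against the observed peak. The counter names below are the standard ones; the interval, sample count, and output path are just examples (Windows only):

Code:
import subprocess

# Sample the system-wide commit charge and limit once a minute for a day,
# writing a CSV you can inspect for the peak. Repeat over several days
# (and workloads) before trusting the result. Output path is an example.
subprocess.run([
    "typeperf",
    r"\Memory\Committed Bytes",
    r"\Memory\Commit Limit",
    "-si", "60",      # sample interval in seconds
    "-sc", "1440",    # number of samples (one day at 60 s)
    "-f", "CSV",
    "-o", r"C:\temp\commit_log.csv",
])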

Is Windows 7/8 perfect? Heck no! Will Windows set a size larger than you need? Probably. But it is not outrageously too large as suggested, and certainly, it is not too small, either.

With today's disk prices, claiming the PF takes up too much disk space is no excuse - or rather, it is a perfect excuse to buy a bigger drive.

Believe me when I say I felt the same way you did - until Windows 7 came out.
 
I understand what you're getting at, and while I don't think it's entirely accurate, I do believe it's a viable position on the topic. I won't debate the matter further. Pardon for letting it get to that point.
 
Pardon for letting it get to that point.
I see no need for apologies. It was a good, healthy discussion on a controversial topic! :) A topic that needs to be re-addressed and discussed in light of the advances made in memory management in Windows 7 (and beyond).

It doesn't help that there is no clear-cut guideline, but I cannot fault MS. They have no way of knowing how their users will configure their hardware, what programs they will use, or how disk-intensive their tasks will be.

And to be sure, I don't have a problem with informed users setting a fixed size using Mark's guideline of analyzing commit sizes over extended periods of use. I am just saying, as a statement of fact, that manual settings are not set-and-forget. And it is my opinion (for whatever that's worth) that the only reason to set a fixed page file size is to free up critical disk space on a crowded drive. But that is a temporary solution only. The real solution is to uninstall programs to free up space, or buy more disk space.

Also, there is no performance advantage to a smaller, fixed-size PF - except on a very crowded disk. But again, that is a temporary solution.

As I noted before, with XP and earlier I always set a fixed-size PF manually. But part of my reasoning was that back then drives were smaller and cost much more per byte of storage. And being smaller, they were more susceptible to performance issues from file fragmentation as they filled up. A fluctuating PF on a crowded drive can contribute to increased fragmentation.
 
Readdressing them also forces me to go back and see what has changed since I last formed my opinions, and to recheck and verify facts to make sure I am not blowing smoke up my own, let alone someone else's butt! ;)
 