On Wed, Feb 3, 2016 at 3:53 PM, Howard Chu <hyc at symas.com> wrote:
> No. Windows will toss clean pages out even in the total absence of memory
> pressure. It is moronic, but true. You can repeat the steps I outlined for
> yourself and see.
>
> https://groups.google.com/d/msg/comp.os.ms-windows.nt.misc/449tdNYPoX0/it4cx8Gvv2AJ
I did repeat something similar. I ran md5sum over a source tree containing many files. I'm not using that source tree for anything else right now, nor have I in many days. The first time I ran the script, it took 2 minutes. The second time, 30 minutes later, the same script took 30 seconds, implying a substantial amount of data was still in the filesystem cache. I had been using the system for other things during those 30 minutes, including compiling code in another source tree.

Indeed, if I run RAMMap ( https://technet.microsoft.com/en-us/sysinternals/ff700229.aspx ), I can see many of the source files from my md5sum test still in the cache. The same tool reports that the majority of my non-process memory is on the standby list for mapped files, i.e., in use by the file system cache.

No doubt Windows isn't perfect here, but I do wonder whether something else was at play in the GCC test you did.
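For reference, the two-pass test can be sketched as a small script (a minimal sketch, assuming md5sum and find are available on the Windows box, e.g. via Cygwin or MSYS2; the throwaway temp tree and file names here are stand-ins for the real source tree):

```shell
#!/bin/bash
set -e

# Build a small throwaway tree standing in for the real source tree.
SRC=$(mktemp -d)
for i in 1 2 3; do
  head -c 1048576 /dev/urandom > "$SRC/file$i"   # ~1 MiB per file
done

# Cold pass: the first read has to pull every file from disk.
cold=$(find "$SRC" -type f -exec md5sum {} + | sort)

# Warm pass: if the pages survived on the standby list, this pass is served
# from the file system cache (2 minutes vs 30 seconds in the test above).
warm=$(find "$SRC" -type f -exec md5sum {} + | sort)

[ "$cold" = "$warm" ] && echo "checksums identical across passes"
rm -rf "$SRC"
```

On the real tree you would prefix each pass with `time` and compare wall-clock durations; RAMMap then shows which of those files are still resident in the cache.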