On 27/07/21 12:19 am, Paul Ripke wrote:
> On Mon, Jul 26, 2021 at 05:53:19PM +1200, Lloyd Parkes wrote:
>> That's 12GB of RAM in use and 86MB of RAM free. Sounds pretty awful to me.
> Sounds normal to me - I don't expect to see any free RAM unless I've just
> - exited a large process
> - deleted a large file with large cache footprint
> - released a large chunk of RAM by other means (mmap, madvise, semctl, etc).
I haven't run NetBSD on a desktop for a while now, but I still think
12GB is a lot of memory in use. Maybe I'll get a new MacBook when they
start shipping 32GB Apple CPU ones and then put NetBSD on my current
MacBook.
> A big chunk of it is in file cache, which is unsurprising when reading
> thru a 400GiB file...
Page activity lasts 20s and at 30MB/s that means you should have 600MB
of file data active. Add 50% for inactive pages and that's still only
900MB. I'm willing to bet money that zstd only reads each block of data
once (sequentially in fact) and so it doesn't need any file data cache
at all. File metadata is a different matter, but that probably stays
active and there won't be much of it.
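That back-of-envelope estimate can be restated as a quick sketch (the 30MB/s
rate and 20s activity window are the figures from this thread, nothing more):

```shell
# Rough sizing of the file cache a sequential reader can actually use.
rate_mb_per_s=30   # observed sequential read rate
window_s=20        # how long a page stays "active" after being touched

active_mb=$((rate_mb_per_s * window_s))   # file data active at any moment
total_mb=$((active_mb * 3 / 2))           # add 50% for inactive pages

echo "active=${active_mb}MB total=${total_mb}MB"   # active=600MB total=900MB
```

Anything the file cache holds beyond that is wasted on data zstd will never
touch again.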
I suspect that your vm.filemax is set to more memory than you have
available for the file cache and once that happens anonymous pages start
to get swapped out. My experience is that while anonymous pages sound
unimportant, they are in fact the most important pages to keep in RAM.
Thinking about it, they are the irreplaceable bits of all our running
software.
Try setting vm.filemin=5 and vm.filemax=10. Really. I did it when
processing vast numbers of files in CVS and it worked for me.
Out of curiosity, what are you doing with zstd? You mentioned backups.
Is this dump or restore? dump implements its own file cache, which won't
help with the memory burden.
"top -ores" will tell you what programs are using the most anonymous
pages, which might help identify where all this memory pressure is
coming from.
Cheers,
Lloyd