Thorbjørn Ravn Andersen wrote:
maybe this should be directed to some other list... I am on a quest for a library to store key-based byte streams of very varying sizes in large quantities. I need multithreaded reads and writes, and the byte streams range from about 50 bytes up to 3 megabytes.

Files work, of course, but many of the fragments are very small, around 100 bytes or so. That is the solution I am using now, and I have a directory of about 27,000 files at the moment... ugh.
Perhaps you could elaborate a bit on which key you use to look up the byte streams and how those keys are generated.

If the filenames are evenly distributed, you could use a two-level hierarchy such as K/KE/KEYOFDATA.xml to take the load off the filesystem.
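Something along these lines, assuming the key is a string of at least two characters that is safe to use as a filename (the class and file names here are just for illustration):

    import java.io.File;

    public class KeyedFileStore {
        private final File root;

        public KeyedFileStore(File root) {
            this.root = root;
        }

        // Map key "KEYOFDATA" to root/K/KE/KEYOFDATA.xml so that no
        // single directory has to hold every entry.
        public File fileFor(String key) {
            File dir = new File(new File(root, key.substring(0, 1)),
                                key.substring(0, 2));
            dir.mkdirs();
            return new File(dir, key + ".xml");
        }
    }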

You might also want to look into using a filesystem appropriate for the task. Linux, for example, has several, with different properties.
Thanks for the hint about splitting the directory in a way similar to a hash table; I hadn't thought of it, and it would solve the horrible 50,000-files-per-directory problem. Choosing the host platform is not an option though, so neither is the filesystem.
Overall, those 50,000 files are only a bother if you actually look at the directory.

Where I am really hoping for help is with the small chunks, for which writing a full file each is a waste on filesystems whose minimum allocation unit is 512 bytes, or even 4 KB on some. Of course I could write my own library for this, most probably with java.nio, but I was hoping VFS or some other library could help me there.
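To make the idea concrete, here is roughly what I picture writing myself with java.nio if nothing exists. The class and method names are mine, the index lives only in memory, and there is no deletion or index persistence; it is just a sketch of packing small values into one large file:

    import java.io.File;
    import java.io.IOException;
    import java.io.RandomAccessFile;
    import java.nio.ByteBuffer;
    import java.nio.channels.FileChannel;
    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;

    // Append-only pack file: small values share one big file so each
    // entry does not cost a whole filesystem block of its own.
    public class PackFile {
        private final FileChannel channel;
        private final Map<String, long[]> index =
                new ConcurrentHashMap<String, long[]>();

        public PackFile(File file) throws IOException {
            channel = new RandomAccessFile(file, "rw").getChannel();
        }

        // Appends are serialized so concurrent writers don't interleave.
        public synchronized void put(String key, byte[] value) throws IOException {
            long offset = channel.size();
            ByteBuffer buf = ByteBuffer.wrap(value);
            long pos = offset;
            while (buf.hasRemaining()) {
                pos += channel.write(buf, pos);
            }
            index.put(key, new long[] { offset, value.length });
        }

        // Positional reads leave the channel position untouched, so
        // concurrent readers need no extra locking.
        public byte[] get(String key) throws IOException {
            long[] entry = index.get(key);
            if (entry == null) {
                return null;
            }
            ByteBuffer buf = ByteBuffer.allocate((int) entry[1]);
            long pos = entry[0];
            while (buf.hasRemaining()) {
                int n = channel.read(buf, pos);
                if (n < 0) {
                    break; // unexpected end of file
                }
                pos += n;
            }
            return buf.array();
        }
    }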

paul
