>I don't know how to translate this into English, but in Greek it's called
>the "kobogianitiki" method! (e.g. eat until you puke to find out how much
>you can eat without puking!)

yes, that does describe the process fairly well

>It sucks first of all because it does not determine anything. What kind of
>files does it create?
>- If the contents are random, like some kind of encrypted data, then you
>might just have a good guess of the minimum free space

a good guess is all that this code fragment is capable of, quite apart from its 
other limitations.

>assuming you haven't
>run into some other limitation of the filesystem, like maximum files per
>directory!

this is correct

>Nowadays filesystems report less and less accurately their free space just
>because they are optimized to reduce resource consumption when they are
>actually doing something, depending on what and how they are asked. So the
>free space you see is usually only something approximate just good enough
>for human consumption. I guess that's mostly the reason Java developers
>didn't want to mess with this gray (dark gray?) area.

a good point. but still, a guess is better than knowing nothing

>> it's just to prove that there IS a way from java to determine 
>> disk space. please delete this mail as soon as you get it :D
>
>You haven't proved anything until you've done experiments! Try it on your
>system and it might just prove a nice way to crash it!

is there really a need for me to implement this abstract code? it's a concept, 
not a program!

i know it is not well suited to determining free disk space.

but i don't know what a crashing filesystem has to do with it.
maybe the createANewFile() routine creates subdirectories to avoid a limit of 
files-per-dir? i have not described the routine in detail.
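a minimal sketch of what such a createANewFile() could look like — the subdirectory scheme, the class name and the 1000-files-per-dir limit are pure assumptions on my part, not anything from the actual routine:

```java
import java.io.File;
import java.io.IOException;

/** Hypothetical sketch: spreads probe files over numbered
 *  subdirectories to dodge a files-per-directory limit. */
public class ProbeFiles {
    private static final int FILES_PER_DIR = 1000; // assumed fs limit
    private final File root;
    private int count = 0;

    public ProbeFiles(File root) {
        this.root = root;
    }

    /** Creates the next empty probe file, opening a fresh
     *  subdirectory every FILES_PER_DIR files. */
    public File createANewFile() throws IOException {
        File dir = new File(root, "probe-" + (count / FILES_PER_DIR));
        dir.mkdirs();
        File f = new File(dir, "file-" + (count % FILES_PER_DIR));
        f.createNewFile();
        count++;
        return f;
    }
}
```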

the file size in this example was fixed at 1mb, but perhaps
- you could create files starting at 10gb as long as they fit; if that fails, 
create one of 5gb, then 2.5gb and so on, and sum up the sizes of the files that 
could be created (a kind of binary-search approach, logarithmic complexity)
- you could use the same idea but start with a 10mb file and halve from there, 
so the algorithm will be ***fairly*** fast (cough) and suitable for detecting 
free space below a limit of <20mb. that makes it usable for emitting a 
"warning, low disk space: x mb free" once below the alert limit 
(10+5+2.5+1.25+... mb ~= 20mb. assuming transfer rates of 0.5mb/s (roughly a 
10 mbit/s network-mounted directory) or 7mb/s (my system's benchmark), the 
measurement takes about 40s or 3s, halved if the alert limit is around 10mb)
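the halving idea above could be sketched roughly like this — class, method and file names are made up, and note that it really consumes the probed space while it runs, so this is illustration only, not something to ship:

```java
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;

/** Rough free-space probe: tries ever-halved file sizes
 *  and sums up the sizes that actually fit on disk. */
public class FreeSpaceProbe {

    /** Creates probe files in dir, starting at startSize bytes and
     *  halving down to minSize; returns the total bytes written. */
    public static long estimate(File dir, long startSize, long minSize) {
        long total = 0;
        byte[] chunk = new byte[64 * 1024]; // filler data
        int n = 0;
        for (long size = startSize; size >= minSize; size /= 2) {
            File probe = new File(dir, "probe" + (n++) + ".tmp");
            try {
                writeFile(probe, size, chunk);
                total += size;      // fits: count it, leave it on disk
            } catch (IOException e) {
                probe.delete();     // didn't fit: drop the partial file
            }
        }
        for (int i = 0; i < n; i++)  // clean up all probe files
            new File(dir, "probe" + i + ".tmp").delete();
        return total;
    }

    private static void writeFile(File f, long size, byte[] chunk)
            throws IOException {
        FileOutputStream out = new FileOutputStream(f);
        try {
            long written = 0;
            while (written < size) {
                int len = (int) Math.min(chunk.length, size - written);
                out.write(chunk, 0, len);
                written += len;
            }
        } finally {
            out.close();
        }
    }
}
```

with a 10mb start and 1mb floor this writes at most 10+5+2.5+... ~= 20mb before cleaning up, which is where the 40s/3s estimate above comes from.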

the content of the files was assumed to be fairly random. perhaps an even 
better approach would be to use some files from within the datastore, so the 
data structure is closer to reality. but they're encrypted, so any random 
generator would produce nearly the same kind of output and will do fine...
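filling the probes that way is a one-liner — a trivial sketch (class and method name invented), the point being that pseudo-random bytes are as incompressible as the encrypted store data, so a transparently compressing filesystem can't fake the result:

```java
import java.util.Random;

public class RandomFiller {
    /** Returns a buffer of pseudo-random bytes; like encrypted data,
     *  it won't be squeezed by any filesystem-level compression. */
    public static byte[] randomChunk(int size) {
        byte[] buf = new byte[size];
        new Random().nextBytes(buf);
        return buf;
    }
}
```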

again: this has to be seen as a..... hint of concept.... and i strongly 
disagree with the way this works and discourage everyone from thinking about 
using this approach ;)




_______________________________________________
devl mailing list
devl at freenetproject.org
http://hawk.freenetproject.org:8080/cgi-bin/mailman/listinfo/devl
