On Mon, 23 Apr 2001, Brandon wrote:
> > This is unnecessary complexity. If the problem that you're trying to avoid
> > is files dropping out of Freenet, then stop dancing around it and deal
> > with ordinary split files, and fix the real problem.
>
> I'm not dancing around the problem. Split files inherently suck. They suck
> because the parts can fall out independently, and what we want is for them
> all to fall out at the same time. They also suck because they compromise
> deniability. No one seems to have ideas as to how to solve either of these
> problems.
>
Reduced deniability is not really a problem. If someone is
downloading illegal files off Freenet, the cops are going to find the
illegal data in their filesystem in one piece, not in a bunch of pieces
in their encrypted Freenet cache.
The part about pieces falling out independently is no more of a problem
than files that are part of a web site falling out independently. If
it's really a problem, redundancy can help. If redundancy can't fix the
problem, then the file is gone, and hopefully a file surviving only in
part won't happen that often. One way to add redundancy is sketched
below.
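
As a minimal sketch of the idea (not necessarily how Freenet would do
it, and all names here are illustrative): split the data into k chunks
plus one XOR parity chunk, so any single lost chunk can be rebuilt from
the survivors. A real forward-error-correction scheme such as
Reed-Solomon would tolerate more losses.

    from functools import reduce

    def xor_bytes(a: bytes, b: bytes) -> bytes:
        return bytes(x ^ y for x, y in zip(a, b))

    def split_with_parity(data: bytes, k: int):
        """Split data into k padded chunks plus one XOR parity chunk."""
        size = -(-len(data) // k)              # ceiling division
        # Pad the last chunk; the original length must be stored
        # separately to strip the padding on reassembly.
        chunks = [data[i * size:(i + 1) * size].ljust(size, b'\x00')
                  for i in range(k)]
        return chunks, reduce(xor_bytes, chunks)

    def rebuild(chunks, parity, lost: int) -> bytes:
        """Recover one lost chunk by XORing parity with the survivors."""
        survivors = [c for i, c in enumerate(chunks) if i != lost]
        return reduce(xor_bytes, survivors, parity)

With k + 1 inserted parts, any single part can drop out of the network
and the file is still recoverable; the cost is one extra part's worth
of storage.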
> One solution is to find a way that you can request parts of a file but
> still treat the file as an atomic unit rather than as a loose collection
> of independent files.
>
I don't like this kind of solution. Splitfiles are perfectly natural.
Treating a collection of files as one is called "making a tarball" and
inserting that as one file; it's the opposite of splitfiles.
Freenet can, and will, go both ways. Whatever works best for people.
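
To make the contrast concrete, here is a minimal sketch of the tarball
direction using Python's standard tarfile module; the file and
directory names are hypothetical.

    import tarfile

    # Bundle an entire directory into one archive, so the whole
    # collection lives or dies in Freenet as a single file.
    with tarfile.open("site.tar.gz", "w:gz") as tar:
        tar.add("my_site/", arcname="my_site")

The resulting site.tar.gz is then inserted as one ordinary file, and
readers retrieve and unpack it as a unit.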
> Although if you have a solution to these problems using normal split
> files, then I'd be interested to hear it.
I've responded as best I can to your issues above. Poke holes in my
arguments all you want, but we will just have to wait and see how well
splitfiles work in practice before deciding not to use them.
Thelema
--
E-mail: [EMAIL PROTECTED] If you love something, set it free.
GPG 1536g/B9C5D1F7 fpr:075A A3F7 F70B 1397 345D A67E 70AA 820B A806 F95D