[freenet-dev] Node specialization vs. attack on specific content

2003-05-03 Thread Mark J Roberts
Edward J. Huff:
> Not at all. This is a data compression scheme which avoids
> the problem of incompressibility. Any 1 meg file can be
> represented as a CHK. (It fails only when there are hash
> collisions.)

I don't understand how this relates to compression. It doubles the size of all
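Roberts' objection can be made concrete. A minimal sketch, assuming (the thread never fixes an encoding, so every name here is hypothetical) that the scheme stores a new block XOR'd against an existing one, and that the "formula" simply names both blocks by hash:

    import hashlib

    BLOCK = 1 << 20  # 1 meg, the file size used in the thread's example

    def entangle(plaintext: bytes, existing: bytes):
        assert len(plaintext) == len(existing) == BLOCK
        # The stored block is plaintext XOR existing -- random-looking alone.
        new_block = bytes(a ^ b for a, b in zip(plaintext, existing))
        # The "formula" names both blocks by hash (stand-ins for CHKs).
        formula = (hashlib.sha1(existing).hexdigest(),
                   hashlib.sha1(new_block).hexdigest())
        return new_block, formula

Retrieval must fetch both 1 meg blocks to recover 1 meg of content, so storage and transfer roughly double; nothing is compressed, which appears to be the point of the reply.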

[freenet-dev] Node specialization vs. attack on specific content

2003-05-03 Thread Hui Zhang
> How do I trust the legitimacy of a given formula?
> What prevents an
> attacker from advertising tons of false formulas for
> a file?

This is a very convincing argument. If multiple versions are allowed for a single file, then it introduces the problem of inauthentic uploads.
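What a requester can check cheaply is that a formula reproduces content matching a hash it already trusts. A minimal sketch of that check, assuming the formula is a list of blocks plus an expected SHA-1 of the result (hypothetical names throughout):

    import hashlib
    from functools import reduce

    def xor(a: bytes, b: bytes) -> bytes:
        return bytes(x ^ y for x, y in zip(a, b))

    def verify_formula(blocks: list, expected_sha1: bytes) -> bool:
        # Recombine the blocks the formula names and compare the result
        # against a hash obtained from a source the requester already
        # trusts, such as a signed index. A bogus formula fails here.
        return hashlib.sha1(reduce(xor, blocks)).digest() == expected_sha1

The catch Zhang raises remains: this only authenticates a formula against a hash you already trust, and nothing stops an attacker from flooding the network with formulas that each verify against a different, worthless hash.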

[freenet-dev] Node specialization vs. attack on specific content

2003-05-03 Thread Edward J. Huff
BTW, in my second post, which you haven't replied to yet, I didn't make a clear distinction between files as CHKs of, say, 1 meg of apparently random data vs. files (documents) as content (not random).

On Sat, 2003-05-03 at 14:31, Mark J Roberts wrote:
> > If the police don't know which of the set of

[freenet-dev] Node specialization vs. attack on specific content

2003-05-03 Thread Edward J. Huff
On Sat, 2003-05-03 at 13:34, Edward J. Huff wrote:
> On Sat, 2003-05-03 at 12:56, Mark J Roberts wrote:
> > If you entangle your file with my illegal document, which is later
> > suppressed, you have nobody to blame but yourself when your file
> > must be reinserted. It's sort of like announcing

[freenet-dev] Node specialization vs. attack on specific content

2003-05-03 Thread Edward J. Huff
On Sat, 2003-05-03 at 12:56, Mark J Roberts wrote:
> Edward J. Huff:
> > Also, this idea addresses the node specialization problem by
> > separating the routing problem from the defense against attack
> > on specific content. The defense problem is moved out of the

[freenet-dev] Node specialization vs. attack on specific content

2003-05-03 Thread Mark J Roberts
Edward J. Huff:
> The idea is your illegal document would have been entangled
> with other legal documents. None of the CHK files would be
> usable by themselves (or at least, those which are wouldn't
> be used for entanglement). Every CHK would be entangled into
> lots of different documents,
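A hedged sketch of that property, assuming (the construction is left open in the thread) that each insertion XORs the plaintext against several blocks already in a shared pool, so every block ends up named by many formulas:

    import hashlib, random
    from functools import reduce

    def xor(a: bytes, b: bytes) -> bytes:
        return bytes(x ^ y for x, y in zip(a, b))

    def entangle(plaintext: bytes, pool: dict, k: int = 2) -> list:
        # Reuse k blocks already in the pool; only one new block is stored.
        names = random.sample(sorted(pool), k)
        new_block = reduce(xor, (pool[n] for n in names), plaintext)
        new_name = hashlib.sha1(new_block).hexdigest()
        pool[new_name] = new_block
        # The formula lists every block needed to reconstruct the document.
        return names + [new_name]

Because pool blocks are shared across formulas, suppressing any one of them breaks many unrelated documents at once, not just the targeted one.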

[freenet-dev] Node specialization vs. attack on specific content

2003-05-03 Thread Edward J. Huff
I suppose this belongs on the TECH list, but that list seems to carry nothing but spam, while this one doesn't. Also, this idea addresses the node specialization problem by separating the routing problem from the defense against attack on specific content. The defense problem is moved out

[freenet-dev] Node specialization vs. attack on specific content

2003-05-03 Thread Mark J Roberts
Edward J. Huff:
> Also, this idea addresses the node specialization problem by
> separating the routing problem from the defense against attack
> on specific content. The defense problem is moved out of the
> freenet protocol to the application, removing some constraints
> and making it easier to

[freenet-dev] Insert NPE build 593

2003-05-03 Thread Edward J. Huff
Hi, I've been running a freenet node for a week now and reading this list, and I've noticed a few of the things discussed. It's on my firewall machine, a 300 MHz AMD K6, which sometimes gets a load average of 30. (Running build 593 just now, on Red Hat 7.3.) 256MB was far too small a store. I'm running
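For context, store size and thread limits were set in freenet.conf at the time; the excerpt below is a sketch from memory, and the parameter names should be treated as assumptions rather than values checked against build 593:

    # freenet.conf (sketch -- parameter names are assumptions)
    storeSize=2G         # 256M proved far too small in practice
    maximumThreads=120   # fewer worker threads can tame load on a slow CPU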

[freenet-dev] Gateway

2003-05-03 Thread en...@despammed.com
I suggest detaching "bookmarks" from freenet.conf, putting them in a file like bookmarks.xml, and having the distributed bookmarks.xml contain only mostly harmless freesites and/or indexes. Users could then download bookmarks or make their own (possibly with additional tools) without dealing
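A sketch of what such a detached file might look like; the schema below is invented for illustration, since the post only proposes "a file like bookmarks.xml":

    <!-- bookmarks.xml: hypothetical schema, shipped apart from freenet.conf -->
    <bookmarks>
      <bookmark>
        <title>Example Freesite</title>
        <key>SSK@exampleKeyPAgM/example//</key>
      </bookmark>
      <!-- users append entries or fetch curated lists without ever
           editing freenet.conf -->
    </bookmarks>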