Matthew Toseland wrote:
> However, if the network is large and the attacker's capacity for 
> connecting to many nodes at once is limited, or if the attacker needs to 
> trace the originator very quickly, an adaptive search as I have described 
> will be more feasible / yield results much faster.

True - I wasn't trying to say the adaptive attack was weak, just that 
the non-adaptive attack was strong too.

> Right. But if the opennet is a million nodes, each of which has an uplink of 
> 100kB/sec and 20 peers (presumably uplink settings being limited by what your 
> ISP will tolerate rather than your actual upstream bandwidth), the attacker 
> will need on the order of 5GB/sec of bandwidth, and some ridiculously large 
> computers to go with it... or a botnet of comparable size to the whole 
> Freenet network.
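(For what it's worth, the 5GB/sec figure checks out under the stated assumptions - each node's uplink shared evenly across its peers, and the attacker holding one full-rate connection to every node:)

```python
# Back-of-envelope check of the figures quoted above.
# Assumptions: 100kB/sec uplink split evenly across 20 peers,
# attacker connects to all million nodes at the full per-peer rate.
nodes = 1_000_000
uplink_kB = 100          # kB/sec uplink per node
peers = 20

per_peer_kB = uplink_kB / peers        # 5 kB/sec per connection
attacker_kB = nodes * per_peer_kB      # total bandwidth the attacker needs
print(attacker_kB / 1_000_000, "GB/sec")  # -> 5.0 GB/sec
```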

But the opennet doesn't have a million nodes, and maybe it never will. 
Why would a million people join the network when it's vulnerable in its 
current form? There are fifty machines in the undergrad lab at my 
university that sit idle most of the time... if I ran an opennet node on 
each one they'd have 1,000 peers between them.

> On the other hand, on darknet, each connection may cost a 
> measurable dollar amount, so it would again be expensive even if the network 
> is small.

Agreed, darknet is the best defence against both attacks (adaptive and 
non-adaptive).

> Either way, an adaptive search is much cheaper than a brute force 
> search, although connecting to everyone is tempting if you have the 
> resources.

Adaptive search on a darknet would mean tricking users or compromising 
their machines, right? That sounds more dangerous and expensive than 
just paying for some hosting.

> As the paper argues, the space savings may not be very large. However there 
> are special benefits for Freenet e.g. a user may be waiting for a specific 
> file, regularly rerequesting it; if another user happens to have it, it would 
> be good if the first user would find it immediately and not have to pick up 
> the announcement from the second user.

If the first user knows the key, she must have learned it (maybe 
anonymously or indirectly) from the second user, right? So it doesn't 
matter how the second user generates the key.

> Similarly, if a user is able to get 
> most of a file, but not the last 20%, he may ask for it to be reinserted; it 
> would be best if the reinsert doesn't result in a completely new key, but 
> reuses the existing blocks.

Good point - maybe the salt should be derived from a password or the 
node's private key, so the same user can reinsert the data under the 
same key. Or would that allow an attacker to identify the inserter by 
asking for the file to be reinserted?
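(A minimal sketch of what I mean - the HMAC construction and names here are my own illustration, not anything in the current code: derive the key from the plaintext hash keyed with a per-user secret, so the same user reinserting the same data gets the same key, while anyone without the secret gets an unrelated one.)

```python
import hashlib
import hmac

def convergent_key(plaintext: bytes, user_secret: bytes) -> bytes:
    # Plain convergent encryption would use SHA-256(plaintext) directly
    # as the key; keying the hash with a per-user secret keeps the key
    # deterministic for that user but unlinkable for everyone else.
    return hmac.new(user_secret, plaintext, hashlib.sha256).digest()

secret = b"derived from password or node private key"  # illustrative only
data = b"file contents"

# Same user, same data -> same key, so a reinsert reuses existing blocks.
assert convergent_key(data, secret) == convergent_key(data, secret)
# A different secret yields an unrelated key for the same data.
assert convergent_key(data, secret) != convergent_key(data, b"other secret")
print("ok")
```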

> Obviously a few unusual applications would 
> benefit greatly from convergent encryption (e.g. daily system snapshotting).

True, but as you pointed out before it's unlikely that people will use 
Freenet for backups because data falls out.

Cheers,
Michael
_______________________________________________
Devl mailing list
Devl@freenetproject.org
http://emu.freenetproject.org/cgi-bin/mailman/listinfo/devl