On 01/05/16 22:20, Arne Babenhauserheide wrote:
> Not quite a paper, just some calculations which show their math is
> wrong:
>
> http://127.0.0.1:8888/freenet:USK@sUm3oJISSEU4pl2Is9qa1eRoCLyz6r2LPkEqlXc3~oc,yBEbf-IJrcB8Pe~gAd53DEEHgbugUkFSHtzzLqnYlbs,AQACAAE/random_babcom/210/#Iamdarknetonlyagain
>
> There is now a detailed report on how law enforcement tracks opennet
> downloaders (though the statistics are pretty badly flawed¹).
>
> I’m not allowed to upload the report here, so I can only give a
> clearnet link to the white paper:
> https://www.ncjtc.org/ICAC/Courses/trngres/Freenet%20Investigations%20White%20Paper%20-Black%20Ice%20%20%28090413%29.pdf
>
> ¹: the HTL18 vulnerability they exploit was already addressed in 2008,
> so any probability they derive from it is false. For every connection
> there is a 50% chance that all the requests (not only a single one)
> did not originate at the node from which we received them but were
> forwarded one step. So for 10 connections (the lowest value), there
> are on average 5 other nodes whose HTL18 requests are forwarded still
> at HTL18, and the probability that a given HTL18 request originated at
> the node from which we received it is only about 17% (1 in 6). This
> probability does not get better when gathering more requests for
> chunks from a specific file or a specific kind of files, because they
> can all plausibly have been forwarded from a different node: the one
> which really sent them. The only way to get good statistics would be
> to connect to this node over and over again at different times, after
> its peers have changed (that requires waiting at least 2 hours for a
> significant number of peers to change; the only way to be sure would
> be to wait for the other node to go offline for more than 5 minutes
> and then to connect to it again). However, screening out every node
> which ever sent an HTL17 or HTL16 request could improve the
> reliability a lot, though at significant cost. That doesn’t change
> that their probabilities are calculated incorrectly, but it could give
> them a pretty good hit rate on people downloading a large volume of
> material.
>
> - Code:
> https://github.com/freenet/fred/blob/next/src/freenet/node/PeerNode.java#L1603
> - Commit:
> https://github.com/freenet/fred/commit/4aaa08f11656af1dd857e45612763c9bd2d89fc2
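To put rough numbers on the 1-in-6 figure above, here is a quick
back-of-the-envelope sketch (Python; the function name, the peer counts
and the assumption that every node injects requests at the same rate
are mine for illustration, not anything taken from the paper or from
fred):

    # Chance that an HTL18 request really originated at the node we got
    # it from, given the 50% per-connection "don't decrement at maximum
    # HTL" behaviour added in 2008. Crude approximation: treats the
    # expected number of non-decrementing connections as fixed.
    def p_origin_is_observed_node(n_peers, p_no_decrement=0.5):
        expected_forwarders = n_peers * p_no_decrement
        # 1 genuine originator vs. the peers whose requests get passed
        # on still at HTL 18:
        return 1.0 / (1.0 + expected_forwarders)

    for n in (10, 20, 40):
        print(n, round(p_origin_is_observed_node(n), 3))
    # 10 -> 0.167 (the "about 17%, 1 in 6" above)
    # 20 -> 0.091
    # 40 -> 0.048

Note that in this toy model the attacker's odds get *worse* as the
target has more connections, not better.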
You can still do a classic correlation attack: connect to the node for
the whole duration of the request and count the proportion of the file
they've fetched from you (a toy version is sketched below). There are
probably other attacks, but this one is *the* reason why we need
darknet.

And yes, there's some plausible deniability re leaf darknet peers,
though there may be ways around that. For the threat models I care
about (China, whistleblowers etc.), that's not a serious problem; and
leaf nodes suck, very few people use them. It may not even be a problem
for LEAs who play by the rules; they're good at the
find/squeeze-his-friends game.

But the correlation attack does take much more (electronic!) resources
than the "just search anyone who ever requests an illegal block at HTL
18" attack described above. Nonetheless, we should do something about
this. The paper is clearly wrong, and if they're using that argument in
court, something is very rotten, even if it's only their maths.
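To get a feel for why the correlation attack is so much stronger than
the HTL18 guesswork, here is a toy calculation (Python; B, k and the
even spread of requests over peers are illustrative assumptions on my
part, real routing is location-biased):

    import math

    # Assume the target originates a fetch of B blocks and spreads the
    # requests roughly evenly over its k peers, one of which is ours.
    B = 3000   # illustrative block count for a large file
    k = 20     # illustrative peer count
    p = 1.0 / k

    mean = B * p                      # blocks of this file we expect to see
    std = math.sqrt(B * p * (1 - p))  # binomial spread
    print(f"expect {mean:.0f} +/- {std:.0f} blocks of one file at our node")
    # expect 150 +/- 12 blocks of one file at our node

Seeing on the order of 150 blocks of a single file, spread over the
whole duration of the fetch, is a statistically overwhelming signal in
this toy model; a node merely forwarding someone else's requests would
only show the slice that happened to route through it. That is also why
it costs the attacker far more: they have to stay connected for the
whole download instead of logging a single request.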
