Not quite a paper, just some calculations that show their math is
wrong:

http://127.0.0.1:8888/freenet:USK@sUm3oJISSEU4pl2Is9qa1eRoCLyz6r2LPkEqlXc3~oc,yBEbf-IJrcB8Pe~gAd53DEEHgbugUkFSHtzzLqnYlbs,AQACAAE/random_babcom/210/#Iamdarknetonlyagain

There is now a detailed report on how law enforcement tracks opennet
downloaders (though its statistics are pretty badly flawed¹).

I’m not allowed to upload the report here, so I can only give a clearnet link 
to the white paper: 
https://www.ncjtc.org/ICAC/Courses/trngres/Freenet%20Investigations%20White%20Paper%20-Black%20Ice%20%20%28090413%29.pdf

¹: the HTL18 vulnerability they exploit was already addressed in 2008,
so any probability they derive from it is wrong. For every connection
there is a 50% chance that all the requests (not only a single one) did
not originate at the node from which we received them but were forwarded
one step. So with 10 connections (the lowest value), there are on
average 5 other nodes whose requests reach us still at HTL18, so the
probability that a given HTL18 request originated at the node from which
we received it is only about 17% (1 in 6). This probability does not
improve by gathering more requests for chunks of a specific file or a
specific kind of file, because they can all plausibly have been
forwarded from a different node — the one which really sent them. The
only way to get good statistics would be to connect to the node over and
over again at different times, after its peers have changed (that
requires waiting at least 2 hours for a significant number of peers to
change — the only way to be sure would be to wait for the other node to
go offline for more than 5 minutes and then to connect to it again).
However, screening out every node that ever sent an HTL17 or HTL16
request could improve reliability a lot, though at significant cost.
That does not change the fact that their probabilities are calculated
incorrectly, but it could give them a pretty good hit rate on people
downloading a large volume of material.
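The arithmetic above can be sketched as a quick back-of-the-envelope
calculation (a hypothetical illustration, not Freenet code; the 50%
per-connection forwarding chance and the peer counts are taken from the
footnote, and the function name is mine):

```python
# Back-of-the-envelope check of the footnote's probability argument.
# Each of an observed node's connections has a 50% chance of passing
# requests onward still at HTL18 (per the 2008 fix), so on average
# half of its peers look identical to local originators.

def origin_probability(peer_count: int, forward_chance: float = 0.5) -> float:
    """Chance that an HTL18 request received from a node actually
    originated there, assuming each of its peer_count connections
    forwards at HTL18 with probability forward_chance."""
    expected_forwarders = forward_chance * peer_count
    # One real originator candidate (the node itself) plus the
    # expected number of peers whose requests arrive at HTL18.
    return 1.0 / (1.0 + expected_forwarders)

for peers in (10, 20, 40):
    print(peers, round(origin_probability(peers), 3))
# With 10 connections this gives 1/6 ≈ 0.167, matching the ~17% above;
# with more peers the attacker's confidence only gets worse.
```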

- Code: 
https://github.com/freenet/fred/blob/next/src/freenet/node/PeerNode.java#L1603
- Commit: 
https://github.com/freenet/fred/commit/4aaa08f11656af1dd857e45612763c9bd2d89fc2

I also wrote something on paper (notebook, not scientific paper)
addressing that, but it’s (a) not yet typed up, and (b) not really
article quality (we’d need a journalist to polish it).

CC-ing Stef from TU Darmstadt, likely the one who can provide the most
convincing background information.

BCC-ing Glyn Moody who has the skills to write a solid article from
this, though I don’t know whether he has time.

Best wishes,
Arne

[email protected] writes:

> AFAIK Arne has been working on a paper in reply to some recent law
> enforcement actions.
> -> CCing this to him.

Ian Clarke wrote:

> > Some chatter on Reddit in the past few days about law enforcement apparently
> > providing incorrect information to courts about how Freenet works in order to
> > create probable cause to search people's computers:
> >
> > https://www.reddit.com/r/Freenet/comments/4ebw9w/more_information_on_law_enforcements_freenet/
> >
> > https://www.reddit.com/r/Freenet/comments/4es8lv/law_enforcement_freenet_project_links/
> >
> > If law enforcement are indeed lying to courts in order to establish probable
> > cause to search computers, then we need to set the record straight.
> > Would anyone be interested in writing an article on this subject? If it is
> > well-written we might even be able to get a site like http://arstechnica.com/
> > to publish it. Otherwise Medium.com might be another good vehicle.
> >
> > Ian.
> > Ian Clarke
> > Founder, The Freenet Project
> > Email: [email protected]

-- 
To be unpolitical
means to be political
without noticing it


_______________________________________________
Devl mailing list
[email protected]
https://emu.freenetproject.org/cgi-bin/mailman/listinfo/devl
