On 02/21/2020 03:41 AM, Roger Dingledine wrote:

<SNIP>

> Services on the internet are inherently harder to make safe than clients,
> (a) because they stay at the same place for long periods of time, and
> (b) because the attacker can induce them to generate or receive traffic,
> in a way that's harder to reliably do to clients.

Yep. That's the fundamental problem.

> Most identification problems with Tor users, and with onion services,
> have turned out to be opsec mistakes, or flaws in the application
> software at one end or the other. That is, nothing to do with the Tor
> protocol at all. But of course in the "layers of conspiracy" world we
> live in nowadays, you can never be quite sure, because maybe "they"
> used a complex attack on Tor and then covered it up by pointing to an
> opsec flaw. One hopefully productive way forward is to point out that
> even if we don't know how every successful attack really started, we
> know that opsec flaws are sufficient to explain most of them.

I've looked at many of them, and I generally agree. The only exception
I'm sure of is the relay-early attack, which took down at least PlayPen,
and ~1000 of its users, directly or indirectly. I'm not sorry about them,
but we don't know who else exploited it, against whom, or for how long.

> When I'm doing talks about Tor these days, I list these four areas
> of concern, ordered by how useful or usable they are to attackers in
> practice: (1) Opsec mistakes, (2) Browser metadata fingerprints / proxy
> bypass bugs, (3) Browser / webserver exploits, and (4) Traffic analysis.

There's also Freedom Hosting and Freedom Hosting II. Although I haven't
seen anything clear about how they were compromised, it seems arguable
(even obvious, in retrospect) that servers hosting numerous onion URLs
are far^N more vulnerable. Not to say doomed.

<SNIP>

-- 
tor-talk mailing list - tor-talk@lists.torproject.org
To unsubscribe or change other settings go to
https://lists.torproject.org/cgi-bin/mailman/listinfo/tor-talk
> Services on the internet are inherently harder to make safe than clients, > (a) because they stay at the same place for long periods of time, and > (b) because the attacker can induce them to generate or receive traffic, > in a way that's harder to reliably do to clients. Yep. That's the fundamental problem. > Most identification problems with Tor users, and with onion services, > have turned out to be opsec mistakes, or flaws in the application > software at one end or the other. That is, nothing to do with the Tor > protocol at all. But of course in the "layers of conspiracy" world we > live in nowadays, you can never be quite sure, because maybe "they" > used a complex attack on Tor and then covered it up by pointing to an > opsec flaw. One hopefully productive way forward is to point out that > even if we don't know how every successful attack really started, we > know that opsec flaws are sufficient to explain most of them. I've looked at many of them, and I generally agree. The only exception I'm sure of is relay early, which took down at least PlayPen, and ~1000 of its users, directly or indirectly. I'm not sorry about them, but we don't know who else exploited it, against whom, or for how long. > When I'm doing talks about Tor these days, I list these four areas > of concern, ordered by how useful or usable they are to attackers in > practice: (1) Opsec mistakes, (2) Browser metadata fingerprints / proxy > bypass bugs, (3) Browser / webserver exploits, and (4) Traffic analysis. There's also Freedom Hosting and Freedom Hosting II. Although I haven't seen anything clear about how they were compromised, it seems arguable (even obvious, in retrospect) that servers with numerous onion URLs are far^N more vulnerable. Not to say, doomed. <SNIP> -- tor-talk mailing list - tor-talk@lists.torproject.org To unsubscribe or change other settings go to https://lists.torproject.org/cgi-bin/mailman/listinfo/tor-talk