On Sun, Mar 26, 2017 at 6:21 PM, Rusty Bird <rustyb...@openmailbox.org> wrote:
> When the attacker is infecting the user's computer, they could add some
> code to copy the sealed encrypted keyfile into the nooks and crannies
> (firmware, reserved disk sectors, ...) of that computer. A multi-stage
> attack would go like this:
>
> 1. Visually capture the boot passwords (or retroactively access CCTV)
> 2. Infect the computer's boot code (but back up the uninfected version)
> 3. Wait for the user to connect the AEM stick on the next boot attempt.
>    Even if they immediately destroy the stick after noticing
>    something's up, they're still screwed:
> 4. Seize computer, restore sealed encrypted keyfile, restore uninfected
>    boot code, replay captured passwords, decrypt disk
>
> So when the computer fails to authenticate itself, not only must the
> user have the discipline to stop _using_ the computer; if multi-stage
> attacks are part of their threat model, they must also _destroy_ the
> computer
>
> - completely: because who knows where exactly the sealed encrypted
>   keyfile has been copied to
> - quickly: before the attacker can get to it
>
> Whereas, if we put the encrypted keyfile on a separate stick that's only
> ever connected _after_ the computer has authenticated itself to the
> user, getting rid of that is more realistic. It's just a dirt cheap
> little commodity device that doesn't contain any irreplaceable data.

For such a multi-stage attack, it could be much more effective and
still perfectly feasible to implant a passive hardware device into
the target computer that silently captures and records the relevant
USB traffic. Such a device would be invisible to all existing measured
boot mechanisms and thus undetectable by the user -- the computer would
successfully authenticate itself, so there would be no apparent reason
not to insert the secondary USB stick holding the user's LUKS keyfile.

However, if hardware implant attacks are outside the scope of the
user's threat model, then you're of course correct and the original
proposal provides better protection against multi-stage evil maid
attacks.


> A nice setup would be two microSD cards, in tiny USB adapters[1] if
> necessary, on a key chain. We could call them "verification stick" and
> "decryption stick", sort of like how people are used to having a
> building key and an apartment key?
>
>
> That said, although the scheme you're proposing wouldn't prevent the
> multi-stage attack, it still has _a lot_ of good things going:
>
> - It improves security and UX, compared to the AEM status quo
> - UX is vastly better than the two sticks + TOTP device scheme
> - No need to trust the TOTP device (probably a smartphone - eew!) to
>   authenticate the computer
> - No need for conspicuous TOTP verification

As I was thinking about the original proposal, I realized that most
users would probably opt to use a smartphone as a TOTP token -- and
since smartphones are not air-gapped, they could potentially be tricked
into rolling back their clock (e.g. via a spoofed cellular network
serving fake time synchronization data; automatic network time can be
explicitly disabled in most, if not all, phones' settings, but that is
a setting the user has to remember to change). TOTP code verification
can also become cumbersome once the clock difference between the
computer and the TOTP device grows large (perhaps a few seconds out of
the 30-second TOTP interval) -- which also increases the maintenance
burden placed on the user.
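
To make the clock-drift point concrete, here is a minimal TOTP sketch
(the standard RFC 6238 construction with a 30-second step, not AEM's
actual code; the secret is just a placeholder). Both sides derive the
code from floor(unix_time / 30), so any offset that crosses a step
boundary makes the two codes disagree:

    # Minimal TOTP sketch (standard RFC 6238 construction: HMAC-SHA1,
    # 30-second step, 6 digits). Placeholder secret, not AEM's real code.
    import hmac, hashlib, struct, time

    def totp(secret: bytes, unix_time: float, step: int = 30, digits: int = 6) -> str:
        counter = int(unix_time // step)          # both sides compute floor(t / step)
        mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
        offset = mac[-1] & 0x0F                   # dynamic truncation (RFC 4226)
        code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
        return str(code % 10 ** digits).zfill(digits)

    secret = b"placeholder-seed"                  # hypothetical shared seed
    now = time.time()
    print(totp(secret, now))                      # code shown by the computer
    print(totp(secret, now + 29))                 # a 29 s skew may already fall into the next step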

That, together with the hardware implants undetectable by the trusted
boot process mentioned above, was one of the reasons why I devised the
alternative scheme (which is much simpler and has a near-zero
maintenance cost).


> We could support both schemes and let users choose either or none:
>
> - If secret.luks.encrypted.sealed2 exists on the verification stick:
>   unseal it, decrypt it, and use it as a keyfile
> - Otherwise, if secret.totp.sealed2 exists and we're not in "I don't
>   have my TOTP device" fallback mode: unseal it and show the code
> - Otherwise, unseal and show secret.png.sealed2/secret.txt.sealed2
> - If we don't already have a keyfile, and the user inserts a decryption
>   stick before unplugging the verification stick, unseal
>   secret.luks.encrypted.sealed2 from there, decrypt it, and use that
>
> I don't know if this combination would be too much work for a GSoC
> project though?

In case you deem the probability of software-based (though still
requiring prior physical access) multi-stage evil maid attacks to be
much higher than that of hardware-based ones, I could implement both
schemes (probably not within the two-month time frame of GSoC, but I
could work on it in my free time before and/or after GSoC).
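
If it helps to picture the combined behaviour, here is a rough
Python-style sketch of the decision logic from your list above. It is
only an illustration of the control flow: the unseal/decrypt/show
helpers are hypothetical stand-ins (trivial stubs here), and only the
secret.*.sealed2 file names come from your proposal.

    # Sketch of the combined unsealing logic from the proposal above.
    # The helpers below are hypothetical stubs, not existing AEM code;
    # only the file names are taken from the proposal.
    from typing import Optional

    def unseal(blob: bytes) -> bytes:           # stand-in for TPM unsealing
        return blob

    def decrypt_keyfile(blob: bytes) -> bytes:  # stand-in for decrypting the keyfile
        return blob

    def show(secret: bytes) -> None:            # stand-in for displaying the TOTP/text/image secret
        print(secret)

    def obtain_keyfile(verification_stick: dict,
                       decryption_stick: Optional[dict] = None,
                       totp_fallback: bool = False) -> Optional[bytes]:
        # 1. Keyfile on the verification stick: unseal, decrypt, use it.
        if "secret.luks.encrypted.sealed2" in verification_stick:
            return decrypt_keyfile(unseal(verification_stick["secret.luks.encrypted.sealed2"]))

        # 2. Otherwise show the TOTP code, unless the user is in
        #    "I don't have my TOTP device" fallback mode.
        if "secret.totp.sealed2" in verification_stick and not totp_fallback:
            show(unseal(verification_stick["secret.totp.sealed2"]))
        # 3. Otherwise show the static image/text secret.
        elif "secret.png.sealed2" in verification_stick:
            show(unseal(verification_stick["secret.png.sealed2"]))
        elif "secret.txt.sealed2" in verification_stick:
            show(unseal(verification_stick["secret.txt.sealed2"]))

        # 4. Still no keyfile: if the user inserted a decryption stick before
        #    unplugging the verification stick, take the keyfile from there.
        if decryption_stick and "secret.luks.encrypted.sealed2" in decryption_stick:
            return decrypt_keyfile(unseal(decryption_stick["secret.luks.encrypted.sealed2"]))

        return None  # fall back to the plain passphrase prompt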
