Re: Intent to deprecate: Insecure HTTP

2015-04-24 Thread Roger Hågensen
On Tuesday, April 21, 2015 at 2:56:21 PM UTC+2, Gervase Markham wrote:
 Very briefly:
 
 On 21/04/15 12:43, Roger Hågensen wrote:
  1. User downloads a browser (be it Firefox, Chrome, Opera, etc.)
  securely (https?) from the official download location. 2. Upon
  installation a private key is created for that browser installation
  and signed by the browser's certificate server. 
 
 This makes checking in with the browser maker a necessary prerequisite
 for secure connections. That has problems.

How so? Certificates have to be checked today as well (whether they have 
been revoked or not).
Also, it would only happen at installation time for the user.
The server itself would check whether it has been revoked or not.
StartSSL uses client certificates for logins, as do several other sites.

If you can have a client-server connection where only the server has a 
certificate, then the opposite is also possible: a client-server connection 
secured with only the client having a certificate.

  3. When the user
  later connect to a server that support automatic encryption, the
  browser sends a (public) session key that the server should use, this
  key is signed with the browser installation key, the server can
  verify the signature and that this key is not modified by checking
  the certificate server.
 
 What you just built is a unique identifier for every browser which can
 be tracked across sites.

How can this be tracked? It can be tracked just like any other client 
certificate can be tracked today, no difference.

  4. The server exchanges it's session key with
  the browser. 5. A secure/encrypted connection is now possible.
 
 Except that the browser has not yet identified the site. It is important
 for the user to check the site is genuine before the user sends any
 important information to it.
 
  The benefit is that there is no server side certificates needed to
  establish a encrypted connection. 
 
 They are needed if the user wants to have any confidence in who they are
 actually talking to.

DNSSEC exists and should help mitigate the "who are you talking to" issue.
Also, certificates have been falsified (didn't Mozilla just distrust all 
certificates from a certain certificate issuer recently that allowed fake 
Google.com certificates to be made?).

Also, with certificates like the free ones from StartSSL, the only site 
identity you can see is "identity not verified", yet the connection is still 
HTTPS. Just look at https://skuldwyrm.no/ which uses a free StartSSL 
certificate.
Do note however that this .no domain does have DNSSEC enabled (do all the 
latest browsers support that?), so one can be relatively sure of talking to 
skuldwyrm.no even without HTTPS.

What I'm proposing is no worse than automatically issued domain-validated 
certificates currently are.
The important thing here is that the connection is encrypted, not whether 
the site is trusted or not.
Heck, there are sites with a green URL bar that do rip people off, so trust, 
or making sure you don't get fooled, is not automagic with any type of HTTPS 
in that regard.


Re: Intent to deprecate: Insecure HTTP

2015-04-24 Thread Roger Hågensen
On Tuesday, April 21, 2015 at 3:56:31 PM UTC+2, Mike Hoye wrote:
 On 2015-04-21 6:43 AM, Roger Hågensen wrote:
  I know, not that well explained and over simplified. But the concept 
  is hopefully clear, but in case it's not...
 For what it's worth, a lot of really smart people have been thinking 
 about this problem for a while and there aren't a lot of easy buckets 
 left on this court. Even if we had the option of starting with a clean 
 slate it's not clear how much better we could do, and scrubbing the 
 internet's security posture down to the metal and starting over isn't 
 really an option. We have to work to improve the internet as we find it, 
 imperfections and tradeoffs and all.

How about HTTP/2?
Also, a lot of smart minds completely ignored HTTP Digest Authentication for 
years, allowing Basic (plain text) passwords to be sent when logging in on 
sites.

I hate plain text logins. How many blogs and forums out there have plain 
text logins right now? The number is scary, I'm sure.
MITM attacks are one thing; what is worse is network eavesdropping. Log in 
to your blog or forum from a cafe and you are basically screwed. It has been 
shown that despite using WPA2 to the router, others on the same router can 
capture packets and decrypt them, and then they have your login/password.

Now, when I make logins for web projects I use a JavaScript client-side 
HMAC and a challenge-response, so I do not even send the HMAC/hash over the 
network.

The server gives the JavaScript client a challenge and a nonce; the 
password, which the user knows and the server knows (actually the server 
only knows an HMAC of the password and salt), is used with the challenge, 
and the result is sent back as the answer.
An eavesdropper will not be able to get the password.
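
Roughly, it works along these lines. A simplified sketch using the browser's 
Web Crypto API; the /login-challenge and /login-answer endpoints and the 
exact HMAC layering are just placeholders for illustration, not my actual 
code:

// Browser-side sketch. Assumes the server sends { nonce, salt } for the user,
// and that at signup it stored HMAC-SHA256(key = salt, message = password).
async function answerChallenge(username, password) {
  const enc = new TextEncoder();
  const res = await fetch("/login-challenge?user=" + encodeURIComponent(username));
  const { nonce, salt } = await res.json();

  // Recreate the stored value: HMAC-SHA256 keyed with the salt over the password.
  const saltKey = await crypto.subtle.importKey("raw", enc.encode(salt),
      { name: "HMAC", hash: "SHA-256" }, false, ["sign"]);
  const stored = await crypto.subtle.sign("HMAC", saltKey, enc.encode(password));

  // Bind that secret to this login attempt's nonce; neither the password nor
  // the stored HMAC crosses the network, only this one-time answer does.
  const storedKey = await crypto.subtle.importKey("raw", stored,
      { name: "HMAC", hash: "SHA-256" }, false, ["sign"]);
  const answer = await crypto.subtle.sign("HMAC", storedKey, enc.encode(nonce));

  const hex = Array.from(new Uint8Array(answer))
      .map(b => b.toString(16).padStart(2, "0")).join("");
  return fetch("/login-answer", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ user: username, response: hex })
  });
}

The nonce has to be single-use and short-lived, otherwise a captured answer 
could simply be replayed.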

Now, there are other attacks that could be used, like session exploits, but 
this is true even for HTTPS connections.

And a JavaScript/client solution like this is open to a MITM attack since 
it's not encrypted or signed in any way (code signing certificates are even 
more expensive than site certificates).

I'd like to see a client-based HMAC challenge-response built in, and a way 
for a browser and a server-side script to establish an encrypted connection 
without the need for certificates.
This would solve the plaintext login headache, and would be attractive to 
sites that only have HTTP (no HTTPS option) but have, for example, PHP 
support or some other scripting language.

HTTP/2 could be extended to improve the way HTTP Digest Authentication 
works, adding an HMAC(PSWD+SALT) + Challenge(NONCE) = Response(HASH) method.
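
The server side of such a method is small too. Something like this (a 
Node.js sketch matching the browser sketch above, reading the formula as 
HASH = HMAC(HMAC(PSWD+SALT), NONCE); the in-memory user and nonce tables are 
placeholders for whatever storage a site already has):

// Server-side sketch (Node.js). users[name].stored is the hex of
// HMAC-SHA256(key = salt, message = password), created at registration.
const crypto = require("crypto");

const users = { alice: { salt: "per-user-salt", stored: "..." } }; // placeholder
const nonces = {}; // one outstanding nonce per user

function makeChallenge(user) {
  if (!users[user]) return null;
  const nonce = crypto.randomBytes(16).toString("hex");
  nonces[user] = nonce;
  return { nonce, salt: users[user].salt };
}

function verifyResponse(user, response) {
  const rec = users[user];
  const nonce = nonces[user];
  if (!rec || !nonce) return false;
  delete nonces[user]; // single use, so a captured answer cannot be replayed
  const expected = crypto
      .createHmac("sha256", Buffer.from(rec.stored, "hex"))
      .update(nonce).digest("hex");
  return expected.length === response.length &&
         crypto.timingSafeEqual(Buffer.from(expected), Buffer.from(response));
}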