On Sunday, September 22, 2019 at 7:49:14 AM UTC-7, Gijs Kruitbosch wrote:

[snip]

> On 22/09/2019 00:52, Kirk Hall wrote:
> > (1) *97%* of respondents agreed or strongly agreed with the statement: 
> > "Customers / users have the right to know which organization is running a 
> > website if the website asks the user to provide sensitive data."
> 
> Although I intuitively would like to think that we have a right to know 
> "who is running a website", this doesn't mean that EV certificate 
> information is an appropriate vehicle for this information. Even without 
> all the significant issues that EV certification has, if we pretended it 
> was perfect, it still only shows UI for the tls connection made for the 
> toplevel document, whereas other resources and subframes could easily 
> have (and usually do) come from other domains that either do not have an 
> EV cert or have one belonging to a different entity. And even if that 
> were not the case, the entity controlling the website does not 
> necessarily control the data in a legal sense.*** So the EV UI does not, 
> in the legal sense, always indicate who will control the "sensitive 
> data" that users/customers submit.

[PW] I agree with some of this. When I co-instigated the creation of the W3C 
Standard for URL Classification and Content Labeling that replaced PICS in 
2009, it was for this reason; PICS didn’t support assertions about folders - 
only domains. Furthermore, when I co-founded the W3C Mobile Web Initiative I 
helped to write the first draft of the “mobileOK” specification - the ability 
to make assertions about any part of a URI was also a priority then. So, I 
agree with general observations about the importance of being able to 
distinguish between domains, sub-domains, folders etc. when making assertions 
about the content or content creator. 

However, there’s a lot to unpack here, and I think you’re drawing the wrong 
conclusions. Allow me first to paint the picture of the problem that browser 
vendors are making worse by scrapping website identity instead of fixing what 
they got wrong with the UI and UX.

According to Verizon, phishing represents 93% of all data breaches.

According to Proofpoint, in the first quarter of 2019, cyberattacks using 
dangerous links outnumbered those with malicious attachments by five to one.

According to Webroot, nearly 1.5 million new phishing sites are created each 
month.

According to Wombat Security, 76% of businesses reported being a victim of a 
phishing attack in the last year.

According to IBM, phishing attacks increased 250% in 2018.

According to Palo Alto Networks, 70% of all newly registered domains are 
malicious, suspicious or not safe for work.

New tools such as Modlishka now automate phishing attacks in a way that is 
virtually impossible for any browser or security solution to detect, and they 
bypass 2FA. Google has admitted that it’s unable to detect these phishing 
scams: they use a phishing domain, but instead of serving a fake website they 
proxy the legitimate website to steal credentials, including 2FA codes. This 
is why Google banned its users from signing into its own websites from mobile 
apps that use a WebView. Even if Google can prevent these attacks on its own 
services, Mozilla can’t. 

What’s the common thread? Almost all the cybersecurity problems we read about 
start with one user falling for a counterfeit website. 

According to Webroot, 93% of all new phishing domains display a padlock thanks 
to free, automatically issued DV certificates. According to MetaCert and some 
CAs, 98% of DV certs used for phishing were issued for free by Let’s Encrypt. 
Given that Let’s Encrypt's growth is exploding, this problem can only get 
worse. 
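To make the distinction concrete: a DV certificate’s subject typically 
contains nothing but a common name, while an OV/EV certificate carries a 
verified organization name. A minimal sketch using Python’s standard ssl 
module (the helper names and example hostnames are mine, purely for 
illustration):

```python
import socket
import ssl

def subject_organization(cert):
    """Return the organizationName (O) from a getpeercert()-style
    subject, or None if absent - the typical case for a DV cert."""
    for rdn in cert.get("subject", ()):
        for key, value in rdn:
            if key == "organizationName":
                return value
    return None

def fetch_cert(hostname, port=443, timeout=10):
    """Connect over TLS and return the server's validated certificate
    as the dict that ssl.SSLSocket.getpeercert() produces."""
    ctx = ssl.create_default_context()
    with socket.create_connection((hostname, port), timeout=timeout) as sock:
        with ctx.wrap_socket(sock, server_hostname=hostname) as tls:
            return tls.getpeercert()

# Example shapes of a parsed subject:
dv_cert = {"subject": ((("commonName", "example.com"),),)}
ev_cert = {"subject": ((("countryName", "US"),),
                       (("organizationName", "Example Corp"),),
                       (("commonName", "example.com"),))}
```

Run against a phishing domain with a free DV cert, `subject_organization` 
comes back empty - there is simply no verified identity in the certificate 
for a browser to display, padlock or not.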

If browser vendors designed proper UI and UX for website identity in the first 
place, the web would be a much safer place. CAs are not responsible for browser 
UI. EV certs aren’t responsible for browser UI. Browser vendors designed the UI 
and UX and it’s totally broken. 

People who say website identity is broken are in fact pointing the finger at 
browser vendors, even if they don’t realize it.

It’s pretty easy to make assertions about different parts of a website; it’s 
not rocket science. If you want to talk about certificate issuance that’s 
broken, look at how Let’s Encrypt has issued more than 14,000 DV certs for 
domains with “PayPal” in them. What’s weird is that the same people who think 
CAs are doing things wrong and EV certs are bad are the ones who say it’s not 
Let’s Encrypt’s responsibility to fight phishing. Pot, kettle, black. 

And Google doesn’t agree with Google: “keep URLs simple and only show the 
domain name for safety, but allow me to introduce you to AMP”, where URLs and 
brand identity go to die. A big company is only as good as the few people who 
work on a given project. I have seen zero data from Mozilla (or Google) to 
support their decision to remove website identity from the UI. I can dig out 
many instances where browser vendors contradict themselves in what they say 
and what they do.

[snip]
 
> > (3) When respondents were asked “How important is it that your website has 
> > an SSL certificate that tells customers they are at your company's official 
> > website via a unique and consistent UI in the URL bar?” *74%* said it was 
> > either extremely important or very important to them. Another *13%* said it 
> > was somewhat important (total: *87%*).
> 
> This again sounds very nice, but surely the actually important thing is 
> that (potential) customers realize when they are *not* at that official 
> website when some other website tries to persuade them to part with 
> their data/money (so that they don't, or if they do, don't blame the 
> "real" company later)? As has been pointed out repeatedly in this forum, 
> we have pretty good evidence that customers do not, in fact, realize the 
> absence of the EV indicator, as well as evidence that such indicators 
> can be "spoofed", viz. the Stripe Inc. work.

[PW] You’re mixing up what website owners want / believe, and what end-users 
want / believe. Website owners want to protect their brand and they want their 
customers to be safe. And they want to stop wasting time and money trying to 
play whack-a-mole with phishing sites that impersonate them. If they think 
website identity is important, who are you or I to argue? 

End-users want to be safe. They’re not safe because it’s virtually impossible 
to tell which sites are real and which are impersonating them. The fix is to 
tell them when a site owner’s identity has been verified, and this can be 
achieved with better-designed UI and UX on the browser side. 


> 
> If anything, this survey shows that the 87% of people who thought this 
> was important misunderstood where the risks of digital identity 
> confusion lie.
> 
> > (4) When respondents were asked “Do you believe that positive visual 
> > signals in the browser UI (such as the EV UI for EV sites) are important to 
> > encourage website owners to choose EV certificates and undergo the EV 
> > validation process for their organization?” *73%* said it was either 
> > extremely important or very important to them. Another *17%* said it was 
> > somewhat important (total *90%*).
> 
> This implies that the UI is the/a main motivator for people to get these 
> certificates, but doesn't by itself have any implications for the 
> importance of that UI in keeping consumers and businesses safe.
> 
> If 90% of people surveyed think that people should wear helmets when 
> cycling, that's good for people selling bicycle helmets but doesn't have 
> anything to do with how effective those helmets are at preventing 
> injuries in cyclists.

[PW] If 90% of people wore a helmet when cycling, 90% of all cyclists would be 
safer when riding a bike. In my book, that’s a good thing. Separately, people 
who sell helmets, car drivers, hospitals, insurance companies and family 
members also benefit from the sale of helmets. 

Reading your comment, it would appear you’re saying that it’s wrong to sell 
helmets because the people who sell them stand to benefit. Should we stop 
selling helmets? No, of course not. 

Could the sale of identity services be improved, accelerated, and made more 
accessible through lower cost? Yes, of course. But let’s tackle those things 
separately and stop mixing them up with the browser’s responsibility to keep 
its users safe.

I’d suggest the CA/Browser Forum, but that’s hardly the right forum anymore - 
some browser vendors dig out the fine print of the bylaws when it suits them, 
then say it’s not an official entity when it suits them, and tell others that 
their understanding of the bylaws is wrong whenever they disagree with 
something. Something new and different is needed.


> 
> > (5) *92%* agreed or strongly agreed with the statement: “Web browser 
> > security indicators should be standardized across different browsers to 
> > make the UI easier for users to understand.”
> > 
> > (6) Finally, when asked “Do you think browsers should standardize among 
> > themselves on a common Extended Validation UI so that it appears roughly 
> > the same in all browsers?” *91%* said yes.
> 
> Both of these actually appear to be arguments for Firefox not to 
> reinstate its in-address-bar EV UI, given that all the other browsers 
> have moved this information out of there. The most consistent UI is only 
> providing this information when activating (clicking/tapping/...) the 
> lock icon, which is what browsers have now pretty universally implemented.

[PW] If Mozilla implemented useful UI and UX, it’s possible that enterprises 
and government agencies would switch vendors for better protection - I really 
believe this. Phishing is the single biggest cybersecurity problem we face 
today. There’s a lot that Mozilla got wrong over the years - mobile first, 
then the OS. That didn’t stop other companies from moving forward with their 
mobile browser or OS strategies.

> 
> > We again recommend the binary Apple UI to all browsers, which works in both 
> > desktop and mobile environments and distinguishes between EV/identity sites 
> > (with a green lock symbol and URL) and DV/anonymous sites (with a black 
> > lock symbol and URL) – check it out in an iPhone.  (Apple did not eliminate 
> > the EV UI, as some has erroneously said.)  This is easy for users to 
> > understand at a glance.
> 
> With due respect to the good folks at Apple, I do not believe this is an 
> accessible solution (distinguishing information only by colour, 
> https://www.w3.org/TR/WCAG20/#visual-audio-contrast ).

[PW] I agree that Apple has it wrong, but not just because of the color 
differences. The visual indicator needs to be a separate icon, far away from 
the padlock. 

> 
> Additionally, (even if we presuppose EV certs were perfect) it does not 
> help address the requests made in your survey's questions #1 and #3, ie 
> which organization is actually running this website or controlling your 
> data? It only establishes that *some* organization got an EV certificate 
> for this site… 

[PW] This alone is enough to combat one of the biggest cybersecurity problems 
the world faces today - provided there’s a more meaningful design 
implementation from Mozilla. 


you'd have to click/tap through to see, and your own 
> recommendation text here suggests this is "easy for users to understand 
> at a glance", glossing over the fact that they would actually have to 
> click through to see the identity information that you think is so 
> important, and that even then they may be vulnerable to confusion given 
> all the prior research into how poorly enforced restrictions in company 
> registers are in many countries, the possibility for confusion across 
> jurisdictions, etc.

[PW] I love this one. Excluding theoretical work conducted by researchers, 
can you point me to a single resource that describes an actual attack that 
used an EV certificate? Just because something can be done doesn’t mean it 
will be. Theoretically, someone can break into any home, but burglars pick 
the ones that are easiest. I could ask a social engineer to compromise almost 
anyone, but that doesn’t mean those people will be compromised. Cost versus 
reward.

Identity verification for a company requires incorporating a real company, 
followed by the cost of a verification process. When a company is found to be 
a bad actor, its identity information is revoked and can never be used again. 
The time, cost and effort make this attack cost prohibitive. That said, my 
company has a solution that makes this potential attack vector even less 
attractive, so it’s possible to reduce the risk further. 

If we saw amazing adoption of a new visual indicator that helped reduce 
phishing scams, threat actors might then pay for identity verification - but 
that’s a long way off, which gives the industry enough time to improve the 
verification process. No process is perfect or ever will be. No security 
system is perfect. 

Removing something because it didn’t work due to poor design is not the way to 
build a product. You iterate until you get it right. 

BTW, with 85,000 active crypto traders and investors in our social 
experiment, we didn’t see a single victim of the problems described above - 
all thanks to a new shield that turns green inside a browser add-on. Visual 
indicators do work: 63% of users said they would be extremely disappointed if 
they could no longer rely on the green shield to keep them safe. 

Regards,
Paul


> 
> In other words, it is not "easy to understand" at all...
> 
> ~ Gijs
> 
> *** This may be a confusing point. In the EU, under GDPR, it appears 
> (IANAL) to be legal for an organization to run a database and front it 
> with a website allowing modification, on behalf of some other entity. In 
> this case, that other entity is the data controller, the website 
> operator is "merely" the "data processor". For a practical example, the 
> UK electoral register (or "electoral roll") is considered held/"owned" 
> by individual councils, but usually updating their records is contracted 
> out to private companies as it's felt they'd do a better job than the 
> small council's own IT department in managing/securing this data. An 
> example is ERS, whose privacy policy is here 
> https://householdresponse.com/Home/Policy . The certificate is for 
> "Electoral Reform Services Ltd (GB)", but the data controller is 
> actually the respective city/town/borough/county councils, and if I 
> wanted to request copies or corrections of the information held on me 
> from the register, under GDPR I'd have to contact my council, not the 
> company running the website; ditto for requests to "stop processing [my] 
> information".

_______________________________________________
dev-security-policy mailing list
dev-security-policy@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-security-policy
