> Can you remind me of the use case here? Who wants to load HTML pages from local disk and have JavaScript in that HTML have local disk access?

That specific case isn't a requirement. The use case I'm defending is this one:

Developer creates a web page on local disk and is able to
load that file direct into the browser without special
configuration or responding to any popups or dialogs.

In the bugs discussed, a safer user experience (which I also
vote for) has thrown up several options that make this
use-case more difficult to support. So I'm looking
for an option that's inclusive of this use-case.

> Surely files Mozilla has saved to disk are the ones _not_ to be trusted, because they came from
> the web rather than being locally authored?

Mozilla can't save files to disk; only users can do that
using Mozilla as a tool. I'm not recommending it, but you
can imagine a browser that records all files saved to
local disk, and allows those files to be reloaded on the
assumption that they're "safer" (since they're blessed by
the user).

In general, I don't see why Mozilla should deem the
Web to be a more hostile place than local disk. They
merely have different risk profiles, to be handled differently.

I've said marking the web as unsafe is suicidal technology
politics, and I'm happy to defend that view under
cross-examination somewhere else where it's on-topic.

> > One of the proposed solutions for this problem, that of
> > marking the web page with meta-data, is a weak form of this
> > stronger solution.
>
> Why is it weaker?

Because the Mozilla-generated credential in roc's scenario (an HTML comment) is easily forged; it's only trivially "encrypted". Anyone can manufacture that mark and put it in a local disk-based HTML document. In fact, saving via IE automates the forgery.

A stronger credential would be a checksum of the page content
signed with a private key. That's harder to forge.
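To make the contrast concrete, here's a minimal sketch of such a keyed credential. It uses an HMAC with a browser-held secret as a stand-in for the private-key signature described above (real asymmetric signing would need a crypto library); all names are hypothetical:

```python
import hashlib
import hmac

# Hypothetical sketch: the browser holds a secret and stamps saved pages
# with a keyed checksum of their content. An attacker who can't read the
# secret can't forge a valid mark for altered or new content, unlike a
# plain HTML comment, which anyone can type in.
BROWSER_SECRET = b"secret-held-in-the-browser-profile"  # assumption

def make_credential(page_bytes: bytes) -> str:
    return hmac.new(BROWSER_SECRET, page_bytes, hashlib.sha256).hexdigest()

def verify_credential(page_bytes: bytes, credential: str) -> bool:
    expected = make_credential(page_bytes)
    return hmac.compare_digest(expected, credential)

page = b"<html><body>saved from the web</body></html>"
mark = make_credential(page)
assert verify_credential(page, mark)                     # genuine mark passes
assert not verify_credential(page + b" tampered", mark)  # edits break the mark
```

Note the checksum also binds the mark to the page content, so editing the file invalidates it, which a bare comment-based mark cannot do.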

> Only if you use an encrypted string.

Right; I maintain you have to do that if you want to reliably export and import credentials from the Mozilla security model. (This isn't my proposal, though.)

> Files saved in Word do require extra security - Word's macro virus protection. Files saved in PDF might, actually, as I think you can embed script in them. Not sure how Acrobat Reader handles this. Files viewed in vi don't because they don't contain executable code.

Sure, it's an uncertain world. My point is just this: viewed from the form-factor perspective, it's desirable that HTML content manipulated with Mozilla be as easy to manipulate as PDF documents manipulated with Acrobat Reader. If we start insisting on extra hobbles for HTML, hobbles that slow developer workflow (dialog boxes, etc.), other document types will look increasingly attractive to developers (like XAML).

> Executable code downloaded from the web needs extra security.
> This isn't discriminatory, it's sensible.

It's the user that needs security, not the files. Mozilla
currently allows you to save your downloaded .exe in the
StartUp folder on Windows, for example (he says without testing).
Or you can save a file as C:\autoexec.bat. Where's the
"extra security" in that?

We all agree some file formats are risky; the question is,
how and when to mitigate the risk. I'm saying leave the
files alone, do the mitigation at runtime using Mozilla
software only.

> I don't understand this either. What's wrong with marking files downloaded from the web as "untrusted" in this way? They are, after all.

It's a road to hell even to think that way. I'll post my argument separately in this ng.

> > As roc points out, all options except robust versions of 3. can
> > be undone by local disk manipulation via the desktop.
>
> By whom?

By anyone with access to the desktop or command line; the user.

"Best effort" is probably not the best phrase to use here. It implies "we'll try and protect you, but sometimes we won't manage it", rather than "We'll try and load the content silently, but sometimes we may have to ask you".

Good point; I turn to the audience for a better name.

> > iv) content. If the page contains no JS, PASS.
>
> That's extremely hard to assess.

Indeed. It's just an example. This topic's about "policy". I agree it has to be feasible before any policy can be upheld. Obviously some analysis at parse time might help.

> > i) URL. If, according to a small database maintained by
> >    Mozilla, this file: URL matches no http: URL ever
> >    saved to disk, then PASS.
>
> How does this avoid the problem outlined by Jesse in comment 16 of bug 273419:
> "As a result, roc's solution would make our message to users *more* complicated than it is now."
Since it's a PASS, there's no message. Fewer messages = less complexity. WHY there's no message is more complicated, I agree. But will users or developers pause to wonder why it "just worked"? In the limit, a status bar icon could light up with the semantics "I worked out for you that this file is safe to display".
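Rule i) amounts to a tiny save-log consulted at load time. A hypothetical sketch (class and method names are mine, not a proposal for the actual implementation):

```python
import os

# Hypothetical sketch of rule i): the browser records the local path of
# every file it saves from an http: URL; at load time, a file: URL whose
# path was never saved from the web gets a silent PASS.
class SaveLog:
    def __init__(self) -> None:
        self._saved: dict[str, str] = {}  # local path -> originating http: URL

    def record_save(self, http_url: str, local_path: str) -> None:
        self._saved[os.path.abspath(local_path)] = http_url

    def passes_rule_i(self, local_path: str) -> bool:
        # PASS (load silently) only if this path was never saved from the web.
        return os.path.abspath(local_path) not in self._saved

log = SaveLog()
log.record_save("http://example.com/page.html", "/home/dev/downloaded.html")
assert not log.passes_rule_i("/home/dev/downloaded.html")  # came from the web
assert log.passes_rule_i("/home/dev/hand-authored.html")   # locally authored
```

A hand-authored developer file never enters the log, so it loads with no message at all, which is the "just worked" outcome described above.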

I imagine, though, that each of these rules that I tossed about
would have to be hotly debated (not in this thread). The question
is: is such a set of tests (whatever they are) a reasonable
approach? I suppose a useful minimum number of tests would have
to be found as a necessary condition for the overall policy to
be useful.

iv) "Mark of the web". If the content contains this MS
    feature, then FAIL. It could have been auto-generated
    by an IDE, and will probably be discredited as a
    security measure any day.

Again, I'm confused here. The MOTW is, as I understand it, used to apply security restrictions to the page (e.g. so it can't read from local disk). Why would an IDE ever want to put one in? And, if it did, what's the security problem with the page having less abilities than it otherwise would?

"Put a doctor in a leper colony, and all you get is another leper". (with apologies to those with Hodgkin's disease). Neither you not I can rely on all the IDEs, servers, browsers and misguided Visual Basic programmers out there to use the mark of the web correctly. Just stay away; it's a Microsoft dependancy akin to file extension sniffing. And then there's political arguments.

"To break it, just catch a virus that scans local .htm files and removes this metadata."
roc sensibly replied:
"Once an attacker is running arbitrary code on your system, all dominos have fallen.

Sure, no point in arguing that. But we still need a policy for local disk interactions because of conflicts between user and developer needs.

> I don't understand how 2 and 4 exclude phishing attacks, I'm afraid.

A bit hasty of me to say "all". At least this scenario, perhaps: a website tricks a user into downloading a file to local disk, and then into viewing that file under carefully arranged circumstances. That requires tricking the user into a series of steps that will probably only work if they're done in sequence and not too far apart in time. Perhaps by preventing recently saved files from being re-viewed straight away, this chain of steps can be broken. But I admit it was a hasty example, perhaps better disregarded.

- N.
_______________________________________________
Mozilla-security mailing list
Mozilla-security@mozilla.org
http://mail.mozilla.org/listinfo/mozilla-security
