[long post]
I've been trying to progress bug 273419 (disclosure of local files) and bug 230606 (same origin for local files). Some notes.
Where I'm coming from:
Firefox's "smooth user experience" makes Fx a popular product for end users. A similarly smooth experience will help make moz/xulrunner/Fx a popular product for app developers. Developers, however, use local disk a lot and that puts them at odds with some security goals. In particular, they want to "do whatever they want" without annoying warnings or restrictions, whereas users depend on warnings and restrictions for safety. There has to be a clever solution somewhere that keeps both parties happy.
In these two bugs, the fundamental architectural problems are:
a) no reasonable same-origin policy for local files.
b) intrinsic security problems across borders.
I outline a possible option for a) and briefly cover a few tactics commonly used for b). First, b).
Inside Mozilla, where "inside" is defined as "how the runtime engine perceives the universe", content sits inside a security model that is reasonably restrictive.
Developers, and those maintaining HTML content on local disk, operate in a separate secure environment. A particularly nice attribute of the filesystem security environment is that you can alter any file owned by your user.
I'm suggesting that these two security models have equal status in terms of use. Neither is subordinate.
When files pass across the security boundary, both sides need to ensure that they are not compromised during the exchange; the only way to do that is to check the credentials of the information that's in transit, or for the two sides to have a "handshake" agreement separate from the file content. There's no handshake protocol support in most common filesystems.
If the credentials check fails, the content in transit is suspect. If the security checks pass, the content in transit passes through without incident.
The case of mozilla->local disk (saving pages) is easy; weaker security on local disk means that the filesystem need perform no checks other than folder permissions (cannot write) and reporting denial-of-service attacks (write failed: disk full). That's all in place.
The case of local disk->mozilla is harder. Somehow, any file raised from local disk into the mozilla environment must present credentials sufficient to persuade Mozilla that Mozilla's security model is not compromised. The question is: what are sufficient credentials, and where do they come from?
The current arrangement is that the credential test for disk->mozilla is weak - in fact virtually nonexistent.
Here is a rundown of some of the credential test solutions that are available.
1) Move the border
In this radical solution, Mozilla's security borders are extended to include some portion of the local file system. Files that lie within that new circle of trust are automatically trusted without additional credentials.
This kind of thing can be achieved by layering mozilla security on top of the local file system, either actually or virtually.
In my view this is highly problematic, because you end up taking on all the problems of operating systems. In the limit you force the user to allocate a raw partition to mozilla for local file storage, or something equally stupid. Mozilla is not a database or a filesystem per se. This doesn't help developers either - the OS filesystem is a useful tool in its own right and Mozilla should integrate with it, not try to overcome it.
2) Maintain a whitelist
In this solution, which is analogous to nightclub membership, only those files known to Mozilla are admitted from disk to mozilla's secure environment. If users save files to disk then those files can be reloaded anytime, based on a URL check against some small database maintained by Mozilla. If the file's not found then the user gets a prompt. The problem with this system is that it's bureaucratic and high-maintenance. If the maintenance problem is automated away, then its value as a solution is reduced.
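To make the membership idea concrete, here's a rough sketch in TypeScript syntax. Everything in it is hypothetical - savedPages stands in for whatever small database Mozilla would actually keep, and the function names are mine, not an existing API.

type SaveRecord = { sourceUrl: string; savedAt: Date };

// Hypothetical store: stands in for the small database maintained by Mozilla,
// keyed by the file: URL that the save produced.
const savedPages = new Map<string, SaveRecord>();

// Called whenever the user saves a page from the web to local disk.
function recordSave(fileUrl: string, sourceUrl: string): void {
  savedPages.set(fileUrl, { sourceUrl, savedAt: new Date() });
}

// Membership check on reload: admit silently if we saved it ourselves,
// otherwise fall back to prompting the user.
function admitLocalFile(fileUrl: string): "load" | "prompt" {
  return savedPages.has(fileUrl) ? "load" : "prompt";
}

The lookup itself is trivial; the cost is in keeping savedPages accurate as files are moved, copied and edited outside Mozilla, which is exactly the maintenance problem above.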
3) Manufacture trusted credentials
One of the proposed solutions for this problem, that of marking the web page with meta-data, is a weak form of this stronger solution. When the page is emitted to local disk, an encrypted string is also emitted, either as a separate file or inline. When the file is loaded again, the string is checked against a private key or similar, and the file allowed if there is a match. Since the encryption is generated inside the secure mozilla environment, and survives export and import to local disk, it is trustworthy. This is the same argument that national mints use: a dollar note contains extensive credentials from the government (a "trusted" environment) that says that it is authentic, no matter what untrustworthy person offers it. The note's intrinsic value is preserved when it is exported to an insecure environment (an economy).
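To illustrate the minting step - and this is only an illustration, since no particular mechanism is specified; the key handling, the names and the use of a Node-style crypto module are all assumptions of mine - the credential could be an HMAC over the saved content, generated with a per-install key:

import { createHmac, randomBytes, timingSafeEqual } from "crypto";

// Hypothetical per-install secret: this is the "mint".
const installKey = randomBytes(32);

// Emitted alongside the saved page, either as a separate file or inline.
function mintCredential(pageBytes: Buffer): string {
  return createHmac("sha256", installKey).update(pageBytes).digest("hex");
}

// On reload from disk, the page is trusted only if the credential still matches,
// i.e. the content hasn't been altered since this install minted it.
function verifyCredential(pageBytes: Buffer, credential: string): boolean {
  const expected = mintCredential(pageBytes);
  return (
    expected.length === credential.length &&
    timingSafeEqual(Buffer.from(expected), Buffer.from(credential))
  );
}

Note that installKey is precisely the unique per-install secret that causes the distribution and privacy problems described next.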
The problem with this approach is that it doesn't help developers at all; and also every Mozilla install becomes a mint. That creates a problem where each download of Fx (e.g.) must be accompanied by a unique private key or a private key generator. It's debatable whether each mozilla runtime should be uniquely identified like that; that's possibly a privacy risk for consumers.
Also, I've said elsewhere that this kind of thing is an attack on the web itself. Files saved in Word or vi or PDF don't require extra security in order to be usable - why should web pages? Microsoft's "Mark of the web" solution is very distasteful in this respect. It's anti-Web. If the web is a shared commons, then files on the web shouldn't have to wear armbands.
4) Proposed solution - best effort with fallback
As roc points out, all options except robust versions of 3) can be undone by local disk manipulation via the desktop.
In such an environment everything is up for grabs, but I suggest that we can perform quite well just by applying "best effort" checks. In a banking environment (for example) best effort identity checks are used over the counter and over the phone all the time. Using "best effort" we can reduce the impact of implementing robust security by finding many cases where the page can "pass through" and be displayed without annoying the user or developer.
Without modification, a saved web page contains quite a lot of information. I propose that mozilla use this information to make a best effort attempt at assessing credentials when pages transit across security models from the local disk->mozilla.
There are many pieces of information that can be inputs to a "best effort" policy for a given local HTML file.
i) file: URL
ii) filepath
iii) filesystem attributes: datestamps, ownership
iv) content
From this information can be extracted a number of tests that can be used to PASS the file for local display without user security warnings. If no PASS is derived, then the user is prompted with a robust security warning. That warning is based on the assumption that all local files are unsafe to load.
This is the inverse of the information bar, where users are protected from miscellaneous attacks by default, with the user whitelisting in as they go. Here, users are also protected from miscellaneous attacks by default, but specific exceptions are whitelisted in by design.
Here are some of the checks that can make the display of web pages accessed by the file: protocol seamless and smooth without compromising security.
iv) content. If the page contains no JS, PASS.
i) URL. If, according to a small database maintained by Mozilla, this file: URL corresponds to no http: URL ever saved to disk, then PASS. (file created by a developer or deliberately copied to the filesystem by the user)
ii) filepath. If the stem of the filepath doesn't match any filepath used before, then FAIL. (no precedent for loading files from this part of the filesystem).
iii) datestamp. If this file hasn't been touched in a month, then it's unlikely to be a phishing or social engineering attack. PASS.
iv) "Mark of the web". If the content contains this MS feature, then FAIL. It could have been auto-generated by an IDE, and will probably be discredited as a security measure any day.
Some of these could be used in combination; for example, the 2nd and 4th tests can be used to exclude phishing attacks along the lines of bug 273419.
I'm sure there are other tests that might be useful. By putting these tests in the right order, we maximise the number of cases where the page can be displayed.
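To show what I mean by ordering, here's one possible arrangement, again as a TypeScript-syntax sketch. The stores, the month threshold and the content checks are hypothetical stand-ins for the small database and for real parsing; and whether the unfamiliar-filepath FAIL should beat the never-downloaded PASS (a fresh developer project triggers both) is exactly the ordering question to get right.

import { dirname } from "path";

// Each test returns PASS, FAIL, or no opinion. PASS loads the page silently;
// FAIL or no decision at all falls back to the robust default: prompt.
type Verdict = "PASS" | "FAIL" | null;

interface LocalFileInfo {
  fileUrl: string;      // i)   file: URL
  filePath: string;     // ii)  filepath
  lastModified: Date;   // iii) filesystem attributes
  content: string;      // iv)  content
}

// Hypothetical stand-ins for the small database discussed above.
const savedFromHttp = new Set<string>();   // file: URLs of pages saved from http:
const knownPathStems = new Set<string>();  // directories we've loaded from before

const ONE_MONTH_MS = 30 * 24 * 60 * 60 * 1000;

const tests: Array<(f: LocalFileInfo) => Verdict> = [
  // iv) "Mark of the web" present (the IE "saved from url=" comment): FAIL.
  f => (/saved from url=/i.test(f.content) ? "FAIL" : null),
  // ii) no precedent for loading files from this part of the filesystem: FAIL.
  f => (!knownPathStems.has(dirname(f.filePath)) ? "FAIL" : null),
  // iv) no JS at all (a crude check; real parsing would do better): PASS.
  f => (!/<script/i.test(f.content) ? "PASS" : null),
  // i) never saved from an http: URL, so authored or copied locally: PASS.
  f => (!savedFromHttp.has(f.fileUrl) ? "PASS" : null),
  // iii) untouched for a month, unlikely to be a fresh phishing page: PASS.
  f => (Date.now() - f.lastModified.getTime() > ONE_MONTH_MS ? "PASS" : null),
];

function bestEffort(f: LocalFileInfo): "load" | "prompt" {
  for (const test of tests) {
    const verdict = test(f);
    if (verdict === "PASS") return "load";
    if (verdict === "FAIL") return "prompt";
  }
  return "prompt"; // robust default when no test is confident: always prompt
}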
For example, separately I suggested that for one specific case (form submission + recent modification + never downloaded) a "local submit" feature might also increase the utility of mozilla for developers. That kind of nice-to-have feature can certainly be constrained or exposed by a set of best effort tests like this.
Looking at my own disk, I'm a fairly heavy user of the Web; I have 1,708 web pages saved to local disk, not including fileserver content. Even so, that's not many. Mozilla can easily manage a small database about what I've downloaded, and that info can assist the best effort analysis, as in the 2nd and 3rd cases above. For non technical users, that small database would save their lives many times over.
As far as I can tell, there's no way out of requesting credentials from the file to be displayed. At least this way, we work hard to reduce the number of times that those credentials have to be supplied by the user in response to a prompt. Underneath all that, we have a robust default to fall back on: always prompt.
- Nigel.
ps. Each of these tests is effectively a deduced capability.