[EMAIL PROTECTED] wrote:
> Analyzed, no... but I agree that the Request-Source checks should only
> be made for non-safe methods.  The proposal includes that statement,
> though perhaps it could have been made more prominently:
> http://people.mozilla.com/~bsterne/site-security-policy/details.html#non-safe

Yes; I think the current write-up is confusing on this point.

>> This means that all script has to be in external .js files. (This is one
>> of the differences in approach from Content-Restrictions.) While this is
>> an encouragement of best practice in JS authorship (unobtrusive script,
>> addition of event handlers using addEventListener() and so on), would
>> site authors find it too restrictive?
> 
> This is admittedly the most cumbersome aspect of the proposal for site
> authors.  The assumptions I have been going on that led to this model
> are twofold:
> 1.  It is extremely difficult to differentiate "intended" inline
> script from "injected" inline script and a clear boundary can be
> established if we simply require that all JavaScript be included from
> external files from white-listed hosts.

My attempt at making such a differentiation is here:
http://www.gerv.net/security/script-keys/

But perhaps it's unnecessary. It would be useful to get web developer
feedback on this point. If it is too cumbersome, the proposal will not
be adopted. Perhaps we could take it to the WASP or other
web-development forums for discussion?
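
For illustration, here is a minimal sketch of the script-keys idea (the
write-up above has the details; the function names here are mine, not
from any spec): the server attaches a fresh random key to each intended
inline script, and a key-aware browser refuses to run any inline script
that doesn't carry the page's key — an injected script can't know it.

```python
import hmac
import secrets

def make_script_key():
    # Fresh random key generated per response; never reused across pages.
    return secrets.token_hex(16)

def tag_script(script_body, key):
    # The server emits each *intended* inline script with the key attached.
    # (Attribute name "key" is illustrative only.)
    return '<script key="%s">%s</script>' % (key, script_body)

def browser_should_run(script_key_attr, page_key):
    # A key-aware browser runs an inline script only if its key matches
    # the one declared for the page. Injected scripts have no key, or a
    # wrong one, and are refused.
    return (script_key_attr is not None
            and hmac.compare_digest(script_key_attr, page_key))
```

The essential property is that the key is unguessable and changes on
every response, so an attacker who can inject markup still can't mark
their script as "intended".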

> 2.  Sites that wish to utilize Site Security Policy will perhaps be
> willing to do more work in reorganizing their pages, at least for
> those resources that they consider "sensitive" enough to justify using
> SSP.

My desire in writing Content-Restrictions (although this may have got
somewhat obscured as the spec evolved) was that the gains were
incremental - the more work you did, the more benefit you'd gain. So you
could just add a header encoding what your site does now, for some
protection, or you could rearrange the site for greater protection. Half
a loaf is better than no bread.

>> - Can you more carefully define the relative priority and order of
>> application of allow and deny rules in e.g. Script-Source?
> 
> Yes.  I made comments in the add-on code that does this, but you're
> right that it should be explained in the proposal as well.  For
> example, if a host matches any deny rule, that rule will take
> precedence over any rule allowing the host.

OK. When might we expect an update?
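
For what it's worth, the precedence rule as I understand it from the
above ("a matching deny always beats a matching allow") can be sketched
like this; the host matcher is deliberately simplistic and hypothetical:

```python
def host_allowed(host, allow_rules, deny_rules):
    # Sketch of the stated precedence: if any deny rule matches the
    # host, the host is refused regardless of any allow rule for it.
    def matches(host, rule):
        # Toy matcher: exact host, or a "*.example.com" wildcard that
        # matches any subdomain of example.com.
        if rule.startswith("*."):
            return host.endswith(rule[1:])  # e.g. ".example.com"
        return host == rule

    if any(matches(host, r) for r in deny_rules):
        return False
    return any(matches(host, r) for r in allow_rules)
```

So a host must be positively allowed, and a deny entry can carve an
exception out of a broader allow entry.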

>> - Do you plan to permit these policies to also be placed in <meta
>> http-equiv=""> tags? There are both pros and cons to this, of course.
> 
> Yes, I've thought about this as well, and I think http-equiv will
> probably be useful for the set of users who don't have CGI privileges
> on their server and can't set custom headers the traditional way.

Right. It would require a parsing restart, just as a charset change
still does, I believe. But that's an unavoidable penalty.
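
For concreteness, http-equiv delivery might look something like this —
note that the header name and the policy syntax here are my invention
for illustration, not taken from the proposal:

```html
<!-- Hypothetical header name and policy syntax, for illustration only -->
<meta http-equiv="X-Site-Security-Policy"
      content="Script-Source: allow self; deny *">
```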

> Also, I'm seeing a lot of suggestions/requests (off newsgroup mostly)
> that policies be defined in an external file rather than via headers.
> This would obviously be closer to Adobe's Flash model.  I have heard
> you, in this discussion and previously, bring up the issue of the log
> spam created by all the 404s generated in sites that don't have a
> policy file.  What about the following idea (from Dan Veditz):  If a
> server wants to set Site Security Policy, it sends an HTTP header or
> http-equiv meta tag that points the user agent to the location where
> the policy file sits.

Could do. Although if we are doing that, perhaps the <link> tag would be
a more appropriate mechanism in HTML.
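
If we went the <link> route, it might look roughly like this (the rel
value and policy-file location are hypothetical, by loose analogy with
Adobe's crossdomain.xml model):

```html
<!-- Hypothetical rel value and policy location -->
<link rel="site-security-policy" href="/site-policy.xml">
```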

Have you thought about how far SSP applies to non-HTML (e.g. XML)
content? I tried to make Content Restrictions generalizable, at least in
principle, to other sorts of content that contain embedded script or
embedded documents.

Gerv
_______________________________________________
dev-security mailing list
dev-security@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-security
