On Tue, 27 Mar 2007 15:24:23 +0200, Thomas Roessler <[EMAIL PROTECTED]> wrote:
>> The advantages of this proposal are that each header rule and
>> each processing instruction contributes one item which is
>> individually analyzed. It's not really clear why this is needed
>> or desirable though, especially as it also allows scenarios as
>> pointed out above. The main problem with this approach is that
>> it's quite complex to grasp.
> What's complex about it again?
(1) It's hard to explain how it works. The WG has repeatedly
misunderstood the model.
(2) It's unclear what problem having the rules grouped per header
or processing instruction solves. All problems that are solved
by this proposal are also solved by the other (two global
lists) proposal.
(3) People can easily misunderstand it. The example I included
should demonstrate that, I think.
>> The other idea, which was specified initially, is that all rules
>> specified by HTTP headers and processing instructions are
>> combined into two global lists: one list of allow rules and one
>> list of exceptions to those allow rules. (The latter could
>> probably be called "deny" as it would be effectively the same.)
>> The algorithm for this would be that once both lists are
>> constructed you first match the request URL against the items in
>> the allow list, and if there's a match and there's no match in
>> the exception / deny list you grant access. Otherwise access is
>> denied. (Assuming that the access control read policy is
>> applicable to the requested resource.)
> So this is equivalent to the one-pair special case of the first
> proposal, right?
Yes, except that the one pair is formed using all HTTP headers and
processing instructions from the resource. The rest should remain
equivalent imo.
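
To make that concrete, here's a rough Python sketch of the
two-global-lists matching as I understand it. The rule
representation (plain URL prefixes) and all the names are made up
for illustration only; the actual rule syntax is whatever the
draft defines.

def gather_rules(header_rules, pi_rules):
    # Merge rules from all HTTP headers and processing
    # instructions into one flat allow list and one flat deny
    # (exception) list; grouping per source is deliberately lost.
    allow, deny = [], []
    for kind, pattern in header_rules + pi_rules:
        (allow if kind == "allow" else deny).append(pattern)
    return allow, deny

def access_granted(request_url, allow, deny):
    # Grant access iff the URL matches some allow rule and no
    # deny rule. (This assumes the read policy applies at all.)
    def matches(patterns):
        return any(request_url.startswith(p) for p in patterns)
    return matches(allow) and not matches(deny)

# Rules contributed by two headers and one processing instruction
# all end up in the same pair of lists before any matching happens:
header_rules = [("allow", "http://example.org/"),
                ("deny", "http://example.org/private/")]
pi_rules = [("allow", "http://example.com/public/")]
allow, deny = gather_rules(header_rules, pi_rules)
print(access_granted("http://example.org/data", allow, deny))       # True
print(access_granted("http://example.org/private/x", allow, deny))  # False

The point of the sketch is that a deny rule from one source
suppresses an allow rule from any other source, which is exactly
the behavior the grouped model would not give you.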
--
Anne van Kesteren
<http://annevankesteren.nl/>
<http://www.opera.com/>