On Wed, Feb 11, 2009 at 2:36 PM, Adam Barth <w...@adambarth.com> wrote:
> On Wed, Feb 11, 2009 at 2:15 PM, Breno de Medeiros <br...@google.com> wrote:
> > 1. The mechanism is not sufficiently strong to protect against
> > defacement attacks.
>
> We're not worried about defacement attacks. We're worried about Web
> servers that explicitly allow their users to upload content. For
> example:
>
> 1) Webmail providers (e.g., Gmail) let users upload attachments.
> 2) Forums let users upload avatar images.
> 3) Wikipedia lets users upload various types of content.
>
> > An attacker that can upload a file and choose how to set the
> > content-type would be able to implement the attack. If servers are
> > willing to let users upload files willy-nilly, and do not worry about
> > magical paths, will they worry about magical content types?
>
> In fact, none of these servers let users specify arbitrary content
> types. They restrict the content type of resources to protect
> themselves from XSS attacks and to ensure that they function
> properly.

For some purposes, such as the ones you describe, allowing a host-meta
file to appear almost anywhere in a site could expose a browser to
attacks similar to the ones that crossdomain.xml presented. However,
such applications could make a specific Content-Type a requirement, as
Eran rightly pointed out. (A consumer-side check along these lines is
sketched after this message.)

> > 2. This technique may prevent legitimate uses of the spec by
> > developers who do not have the ability to set the appropriate header.
>
> Many developers can control Content-Type headers using .htaccess files
> (and their ilk).

And many others cannot. This is particularly irksome in outsourcing
situations where you have only partial control of the hosting
environment or depend on non-technical users to perform administrative
tasks. (For developers who do have that control, the .htaccess fragment
after this message shows how little is needed.)

> > Is this more likely to prevent legitimate developers from getting
> > things done than to prevent attackers from spoofing said magical
> > paths? I would say yes.
>
> What is your evidence for this claim? My evidence for this being a
> serious security issue is the experience of Adobe with their
> crossdomain.xml file. They started out with the same design you
> currently use and were forced to add strict Content-Type handling to
> protect Web sites from this very attack. What is different about your
> policy file system that will prevent you from falling into the same
> trap?

The difference is that crossdomain.xml is intended primarily for
browser use, so optimizing for that case sounds legitimate. That is not
the case here. Again, there is an application layer where browsers can
implement such policies.

> > Defacing attacks are a threat to applications relying on this spec,
>
> We're not talking about defacement attacks.
>
> > and they should be explicitly aware of it rather than have a false
> > sense of security based on ad-hoc mitigation techniques.
>
> This mechanism does not provide a false sense of security. In fact,
> it provides real security today for Adobe's crossdomain.xml policy
> file and for a similar Gears feature. (Gears also started with your
> design and was forced to patch their users.)
>
> Adam

--
--Breno

+1 (650) 214-1007 desk
+1 (408) 212-0135 (Grand Central)
MTV-41-3 : 383-A
PST (GMT-8) / PDT(GMT-7)
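The check Adam is advocating is small on the consumer side: refuse to
treat a fetched document as a policy file unless the server declares
the mandated media type. A minimal sketch in Python, assuming the spec
pins host-meta to a single type (the application/xrd+xml value and the
/host-meta path below are purely illustrative; the actual values are
whatever the spec ends up mandating):

    from urllib.request import urlopen
    from urllib.error import URLError

    # Assumption: illustrative stand-in for the spec-mandated media type.
    EXPECTED_TYPE = "application/xrd+xml"

    def fetch_host_meta(host):
        """Fetch the host-meta document, refusing it unless the server
        explicitly declares the expected media type. Uploaded user
        content served as text/plain, image/png, etc. is then rejected
        before any parsing happens."""
        try:
            resp = urlopen("https://%s/host-meta" % host)
        except URLError:
            return None      # unreachable host: treat as "no policy"
        declared = resp.headers.get("Content-Type", "")
        # Content-Type may carry parameters (e.g. "; charset=utf-8");
        # compare only the media type itself.
        media_type = declared.split(";")[0].strip().lower()
        if media_type != EXPECTED_TYPE:
            return None      # wrong type: fail closed, ignore the file
        return resp.read()

Failing closed on a wrong Content-Type is the behavior Adobe ended up
shipping for crossdomain.xml: a mislabeled file is simply treated as if
no policy existed.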
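On the producer side, for developers who do control their Apache
configuration, setting the header takes two lines in .htaccess
(ForceType is a stock Apache directive; the media type is again
illustrative):

    <Files "host-meta">
        ForceType application/xrd+xml
    </Files>

Note that this only works where the host permits FileInfo overrides,
which is precisely the limitation Breno raises: on shared or outsourced
hosting, even this fragment may be out of reach.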