Re: Practical use of CsrfPreventionFilter

2023-12-15 Thread Christopher Schultz

All,

I've opened a PR for this:

https://github.com/apache/tomcat/pull/681

Comments welcome.

-chris


Re: Practical use of CsrfPreventionFilter

2023-12-15 Thread Christopher Schultz

Mark,

On 12/15/23 04:53, Mark Thomas wrote:

On 13/12/2023 22:05, Christopher Schultz wrote:

All,

I've been playing with this Filter recently, and I have some concerns 
for its practical use. I'm considering adding some features to it in 
order to make it more practical to use, and I'm interested to see what 
others think about these "problems" and whether my potential additions 
have merit.


These are in no particular order (e.g. importance).

1. Killing caching

If used as prescribed, all URLs will have CSRF tokens added to them 
even when they are unnecessary. Specifically, I'm thinking about 
static resources such as images, scripts, and CSS.


Then don't map the filter to those resources? Ah, I've now read your 
comment below.


It sounds like we need more sophisticated URL-mapping functionality 
than we can get with just the filter mapping.


Yep. My initial code, here, was to add some configuration to the filter 
to simply direct the response-wrapper to NOT adjust certain URLs sent to 
encodeURL and encodeRedirectURL. I'm open to other suggestions.



2. <form>s with either GET or POST. Or, really... "Forms"

When using <form method="get" action="/foo?csrf=blahblah">, different 
browsers do different things with the query string. I'm mostly testing 
on Firefox and I don't have time to check this on every browser. It's 
enough that Firefox behaves in this way.

<form method="get" action="/foo?csrf=blahblah">
   <input type="submit" name="btn" value="Go!" />
</form>

Submitting this form causes a 403 response. The browser shows that the 
URL requested from the server is /foo?btn=Go! and the CSRF token has 
been removed from the action. (Surprise!) I don't see anything in 
WHATWG's description of how form-submission is supposed to work that 
indicates that GET versus POST should behave differently with respect 
to the 'action' attribute, but this is what the current version of 
Firefox actually does.


There are three options I can think of to get around this:

a. Switch to POST:

<form method="post" action="/foo?csrf=blahblah">
   <input type="submit" name="btn" value="Go!" />
</form>

You get HTTP POST /foo?csrf=blahblah with an 
application/x-www-form-urlencoded entity containing the form <input> 
element(s).


b. Manually add the CSRF token to the form:

<form method="get" action="/foo">
[ if request.getAttribute("org.apache.catalina.filters.CSRF_NONCE_PARAM_NAME") ]
   <input type="hidden" name="[the nonce parameter name]" value="[the nonce value]" />
[ end-if ]
   <input type="submit" name="btn" value="Go!" />
</form>


In this case, you get a GET request with the csrf parameter in there 
because, well, you asked for it.


c. Hack the CsrfPreventionFilter to allow all POST methods. This is 
one of my proposals: allow users to configure the CSRF prevention 
filter to always-allow HTTP POSTs. This cannot be done in web.xml 
using e.g. <http-method>GET</http-method> for the <filter-mapping> 
because Filters cannot be configured in this way. And if they could 
be, the Filter would not be invoked and all the URLs on your page 
would lack the required CSRF tokens to make /those/ links work.


Doesn't option c) defeat the purpose of the CSRF Filter? It looks like 
a) or b) are the better choices here.


I don't think (c) defeats the purpose of CSRF protection -- at least 
not completely. Here's my thought process.


First, the biggest threat from CSRF is not tricking a user on one site 
into submitting a form to the victim application (which would be a POST, 
which always requires user interaction from a browser). The biggest 
threat is triggering a non-idempotent GET request from one site to the 
victim site. This can be done in a variety of ways, but the simplest is 
something like this:


https://attacker.example.com/evilpage.html

...
<img src="https://victim.example.com/user/delete?id=1" />
...

Assuming that CSP, etc. hasn't been configured for these sites (or it 
has been configured intentionally for an attack), the browser will make 
that request to the victim application without any user interaction.


So, without protecting POST, we still get a lot of mileage out of CSRF 
protections only on GET (or at least non-POST).


Why not protect POST? Well, this gets back to the "practical" part.

It would be great to tell people "just enable CSRF protection and you 
are good". But the reality is that you have to enable it and then test 
every corner of your application. This process for me has been 
agonizing. We have lots of places in our nearly 20-year-old application 
(which has been continually maintained, improved, etc. and not just left 
to rot) where CSRF breaks everything.


By ignoring POST, we can almost just enable it and fix each instance of 
problems as they come up, because the number of times we have forgotten 
to use response.encodeURL() is infinitesimal. This gets us a huge bang 
for our buck even if it isn't 100% protection. I'll take 95% protection 
over 0% any day.


Had Tomcat's CSRF protection allowed this, it would probably be in 
production already instead of sitting in a "we'll see how this works 
out" pile of things to do.


This is very related to the "non-enforcement" suggestions below, because 
the idea is that you'd /eventually/ get all the POSTs working in your 
application, and then remove the "unsafe" non-enforcement of CSRF 
protection for POST.




Re: Practical use of CsrfPreventionFilter

2023-12-15 Thread Mark Thomas

On 13/12/2023 22:05, Christopher Schultz wrote:


3. Default token-cache size

The default token cache size is 5. If you use applications like I do, 
you often open multiple links into separate tabs for various reasons. Do 
a search of products in your application, and there are several of them 
you'd like to get details for. So you middle-click (or whatever does it 
in your environment) on each of those links to open separate tabs, 
leaving the search-results page visible in the current tab.


Then you want to click the "next" button to view the next page of the 
results. An HTTP 403 response is returned because, if you opened more 
than 4 extra tabs, the tokens used on the search-results page have 
already been evicted from the token cache, and you can't use your page 
anymore.


I'm not sure what the best token cache size is. It might depend upon the 
behavior of your users. If I use your site, you might want a very large 
token cache size. But I think maybe 5 isn't quite big enough. At least 
it's trivially settable, so I don't really have a recommendation here 
other than "maybe let's set the default to something higher"?


There is a trade-off here with memory usage but I wouldn't object to a 
higher default. Users with memory pressure concerns can always lower it.
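The multi-tab failure described above is easy to reproduce with a plain 
LRU map. This is an illustrative stand-in for the filter's nonce cache, 
not its actual LruCache implementation:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class NonceCacheDemo {

    // Stand-in for the filter's nonce cache: a LinkedHashMap that
    // evicts its eldest entry once it holds more than maxSize nonces.
    static Map<String, String> lruCache(final int maxSize) {
        return new LinkedHashMap<String, String>() {
            @Override
            protected boolean removeEldestEntry(Map.Entry<String, String> eldest) {
                return size() > maxSize;
            }
        };
    }

    public static void main(String[] args) {
        Map<String, String> cache = lruCache(5); // the default size
        cache.put("nonce-search-page", "valid"); // token on the results page

        // Middle-click five result links; each new tab's request
        // generates (and caches) a fresh nonce.
        for (int i = 1; i <= 5; i++) {
            cache.put("nonce-tab-" + i, "valid");
        }

        // The search page's token has been evicted, so clicking
        // "next" on that page now fails with a 403.
        System.out.println(cache.containsKey("nonce-search-page"));
    }
}
```

With the default size of 5, the sixth nonce pushes the search page's 
token out, which is exactly the scenario above.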



4. No "test mode"

It might be really helpful to have a non-enforcement mode that generates 
logging without any errors. This would allow developers to enable the 
CSRF prevention filter in a mode where the nonces are generated and 
used, but failed checks are only logged: any time a token would have 
been rejected, the rejection is logged and the request is allowed.


The logging is already there, but requests are rejected.

I am having a problem at $work where anytime I enable this in a 
development environment, I get loads of complaints from users being 
locked-out of things that I haven't had the time to test and/or mitigate 
(such as <form> submissions). If there were a non-enforcement mode, I 
could enable it, then look at the logs after a week and fix anything I 
find in there without disturbing anybody.


+1
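To make the ask concrete, a hypothetical web.xml sketch of such a mode 
might look like this. The init-param names "enforce" and 
"skipNonceCheckMethods" are invented for illustration; they are not 
existing CsrfPreventionFilter parameters:

```xml
<!-- Hypothetical configuration; "enforce" and "skipNonceCheckMethods"
     are invented names, not existing CsrfPreventionFilter init-params. -->
<filter>
  <filter-name>CsrfPreventionFilter</filter-name>
  <filter-class>org.apache.catalina.filters.CsrfPreventionFilter</filter-class>
  <init-param>
    <!-- Log-only mode: failed nonce checks are logged, not rejected. -->
    <param-name>enforce</param-name>
    <param-value>false</param-value>
  </init-param>
  <init-param>
    <!-- Always allow POSTs through, per proposal (c) above. -->
    <param-name>skipNonceCheckMethods</param-name>
    <param-value>POST</param-value>
  </init-param>
</filter>
<filter-mapping>
  <filter-name>CsrfPreventionFilter</filter-name>
  <url-pattern>/*</url-pattern>
</filter-mapping>
```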


5. All or nothing

This may be 4(b) instead of "5", but...

When enabling the CSRF prevention filter, it's impractical to configure 
it for anything but <url-pattern>/*</url-pattern>. If you choose any 
other filter-pattern, you will find that you need large numbers of 
"entry points", which are request URLs that are allowed to be requested 
/without/ providing a CSRF token.


This means that your *entire application* needs to be working perfectly 
in order to 

Practical use of CsrfPreventionFilter

2023-12-13 Thread Christopher Schultz

All,

I've been playing with this Filter recently, and I have some concerns 
for its practical use. I'm considering adding some features to it in 
order to make it more practical to use, and I'm interested to see what 
others think about these "problems" and whether my potential additions 
have merit.


These are in no particular order (e.g. importance).

1. Killing caching

If used as prescribed, all URLs will have CSRF tokens added to them even 
when they are unnecessary. Specifically, I'm thinking about static 
resources such as images, scripts, and CSS.


A sample page I'm using for testing uses a standard template for the web 
application I'm working with, and it normally requests 3 CSS files, 6 
Javascript files, and a site-logo. The browser very much likes to use 
the "favicons" that are defined in the page, and so it (Firefox) will 
load 2 different sizes of favicon for various purposes.


Simply clicking into the title bar and pressing ENTER shows that /all of 
those resources are re-fetched from the server for every request/. The 
browser cache has been defeated.


Each of these resources has been configured to return the following HTTP 
response headers to encourage caching:


Cache-Control: public
ETag: [the etag, which is constant]

The same of course happens if you perform a "quick" page reload (i.e. 
CTRL-R/CMD-R/click-reload-button) rather than performing a FULL page 
reload (i.e. SHIFT-CTRL-R/SHIFT-CMD-R/SHIFT-click-reload-button).


This happens because every page-access generates a new CSRF token for 
that request which is used for all the URL references on that page.
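Concretely, here is a small sketch of the effect the response wrapper 
has on every URL it encodes. The parameter name shown is the filter's 
default; the nonce values are placeholders:

```java
public class NonceUrlDemo {

    // Sketch of the effect of the CSRF response wrapper's encodeURL():
    // the per-page nonce is appended as a query parameter.
    // "org.apache.catalina.filters.CSRF_NONCE" is the filter's default
    // parameter name; the nonce values below are placeholders.
    static String addNonce(String url, String nonce) {
        char sep = (url.indexOf('?') < 0) ? '?' : '&';
        return url + sep + "org.apache.catalina.filters.CSRF_NONCE=" + nonce;
    }

    public static void main(String[] args) {
        // The same stylesheet URL differs on every page view, so the
        // browser (or an intermediate) cache can never get a hit.
        System.out.println(addNonce("/styles/site.css", "6A3F9C"));
        System.out.println(addNonce("/styles/site.css", "D81B02"));
    }
}
```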


Intermediate caches such as true caching HTTP servers will likely have 
the same problem. For caching HTTP servers, this represents a mild DOS 
attack since those caches will be wasting their cache space on one-time 
URLs. They could surely be tuned to ignore query-strings for certain URL 
patterns, but then cache performance (i.e. speed) will be decreased due 
to that URL-processing being added to the pipeline.


I think it might make sense to add some filtering to the 
CsrfResponseWrapper (which is where HttpServletResponse.encodeURL and 
HttpServletResponse.encodeRedirectURL are hooked to add these token 
parameters to the URLs) to allow it to STOP adding tokens to certain URL 
patterns. Prefix (/images/*) or suffix (*.css) matching seems to make 
sense to me for inexpensive checks to get caching working again.
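A sketch of the kind of inexpensive check I have in mind; the class and 
method names here are illustrative, not the actual CsrfResponseWrapper 
API:

```java
public class NoNonceMatcher {

    // Illustrative prefix/suffix matching for deciding whether
    // encodeURL() should leave a URL alone (no CSRF token added).
    private final String[] prefixes; // e.g. "/images/"
    private final String[] suffixes; // e.g. ".css"

    public NoNonceMatcher(String[] prefixes, String[] suffixes) {
        this.prefixes = prefixes;
        this.suffixes = suffixes;
    }

    /** Returns true if the URL should be left alone (no token added). */
    public boolean skipNonce(String url) {
        // Match against the path only, ignoring any query string.
        int q = url.indexOf('?');
        String path = (q < 0) ? url : url.substring(0, q);
        for (String p : prefixes) {
            if (path.startsWith(p)) return true;
        }
        for (String s : suffixes) {
            if (path.endsWith(s)) return true;
        }
        return false;
    }

    public static void main(String[] args) {
        NoNonceMatcher m = new NoNonceMatcher(
                new String[] { "/images/" }, new String[] { ".css", ".js" });
        System.out.println(m.skipNonce("/images/logo.png")); // true
        System.out.println(m.skipNonce("/styles/site.css")); // true
        System.out.println(m.skipNonce("/user/delete?id=1")); // false
    }
}
```

Static resources keep stable, cacheable URLs while dynamic links still 
get tokens.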

