On Tue, Jul 9, 2013 at 6:27 AM, Craig Jones craig.jo...@utas.edu.au wrote:
Hi All,
We are getting the exception below when downloading a large WFS response
in CSV or GML format using gzip compression. We are using GeoServer
2.1.1 in production, but this also happens in 2.3.3.
Hey Craig,
I reduced the fetch size to 200 but this didn't make any difference.
Thanks for the suggestion.
What Andrea said. I've a feeling the gzip compression isn't what's causing
this. I had a look at how GeoServer does its compression a while ago and it's
pretty robust.
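For context on why streaming gzip by itself shouldn't blow the heap: a gzip encoder only needs a small fixed buffer, so memory use stays constant regardless of response size. A minimal standalone sketch (plain JDK, not GeoServer code) compressing ~8 MB in fixed-size chunks:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.zip.GZIPOutputStream;

public class GzipStreamDemo {
    public static void main(String[] args) throws IOException {
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        // Wrap the sink in a streaming gzip encoder; input is compressed
        // chunk by chunk, so heap use does not grow with the input size.
        try (GZIPOutputStream gz = new GZIPOutputStream(sink)) {
            byte[] chunk = new byte[8192]; // all zeros: highly compressible
            for (int i = 0; i < 1024; i++) { // ~8 MB of input total
                gz.write(chunk);
            }
        }
        long raw = 8192L * 1024;
        System.out.println("raw=" + raw + " compressed=" + sink.size());
    }
}
```

If compression alone were the problem, memory pressure would scale with the gzip buffer, not with the multi-gigabyte response, which is why a buffering filter (or something holding the whole response) is the more likely suspect.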
Hi Andrea,
We're connecting to a PostgreSQL 9.1 / PostGIS 2.0 database.
I can download as much data as I want after removing the gzip filter
(several gigabytes).
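For anyone wanting to reproduce the "filter removed" setup: in GeoServer releases of that era the gzip filter is declared in the webapp's WEB-INF/web.xml, so disabling it means commenting out the filter and its mapping. A sketch only — the exact filter name and class vary by version, so check your own web.xml:

```xml
<!-- WEB-INF/web.xml: comment out the gzip filter and its mapping -->
<!--
<filter>
  <filter-name>GZIP Compression Filter</filter-name>
  <filter-class>org.geoserver.filters.GZIPFilter</filter-class>
</filter>
<filter-mapping>
  <filter-name>GZIP Compression Filter</filter-name>
  <url-pattern>/*</url-pattern>
</filter-mapping>
-->
```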
I was looking at the revision history in Subversion, which I'm assuming
is correct for those earlier changes, but yes, I did read that
I think an alternative would be to keep the gzip filter off and just let
your servlet container do the gzipping, or the HTTP server in front of it
if you have one. See, for example:
http://viralpatel.net/blogs/enable-gzip-compression-in-tomcat/
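The linked post boils down to enabling compression on Tomcat's HTTP connector in conf/server.xml. A sketch assuming a standard connector on port 8080 — the attribute names are Tomcat's, but the threshold and MIME list here are only illustrative (and note newer Tomcat versions spell the last attribute compressibleMimeType):

```xml
<!-- conf/server.xml: let Tomcat gzip responses instead of GeoServer -->
<Connector port="8080" protocol="HTTP/1.1"
           connectionTimeout="20000"
           compression="on"
           compressionMinSize="2048"
           compressableMimeType="text/csv,text/xml,application/xml" />
```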
I feel like that's now the general practice on the web?
Hi Chris,
Thanks, we'll have a look at the alternatives.
CraigJ
On 10/07/13 13:02, Chris Holmes wrote:
I think an alternative would be to keep the gzip filter off and just
let your servlet container do gzipping. Or the http server in front of
it if you have one. See like
Hi All,
We are getting the exception below when downloading a large WFS response
in CSV or GML format using gzip compression. We are using GeoServer
2.1.1 in production, but this also happens in 2.3.3.
java.lang.OutOfMemoryError: Java heap space
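For completeness: the immediate failure is heap exhaustion, so whichever component ends up doing the compression, it may also be worth giving the JVM more headroom. A sketch for a Tomcat install using bin/setenv.sh — the heap sizes are placeholders, not recommendations:

```shell
# $CATALINA_HOME/bin/setenv.sh -- placeholder sizes, tune for your host
export CATALINA_OPTS="$CATALINA_OPTS -Xms512m -Xmx2048m"
```

This only postpones the problem if something genuinely buffers the whole multi-gigabyte response; fixing the buffering is the real cure.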