Hi Jim,

The data is proprietary at the moment, but the application will be 
launched in a few months.

I think the problem lies in Geoserver's handling of the CQL_FILTER=... 
IN (...) request. OpenLayers 2 offers some client-side filtering, which 
appears to be faster, but its overall map performance is much poorer 
than in OpenLayers 3.

I would like to keep the database, since the queries can be quite 
complex (searching by species names, species properties, geographical 
constraints and so on). Also, the data needs to be added to or edited 
all the time.
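
To give an idea of the kind of query involved, here is a rough sketch
(the schema, column names and the PostGIS-style spatial predicate are
illustrative assumptions, not the actual production query):

     <?php
     // Illustrative only: table/column names and the spatial predicate
     // are assumptions made up for this sketch.
     $pdo = new PDO('pgsql:host=localhost;dbname=species_db', 'user', 'pass');

     $sql = "SELECT s.species_id
             FROM   species s
             JOIN   locations l ON l.species_id = s.species_id
             WHERE  s.name ILIKE :name                 -- species name
               AND  s.habitat = :habitat               -- species property
               AND  ST_Within(l.geom,                  -- geographic constraint
                      ST_MakeEnvelope(:minx, :miny, :maxx, :maxy, 4326))";

     $stmt = $pdo->prepare($sql);
     $stmt->execute([
         'name'    => '%falco%',
         'habitat' => 'wetland',
         'minx'    => 5.0, 'miny' => 45.0, 'maxx' => 15.0, 'maxy' => 55.0,
     ]);
     $ids = $stmt->fetchAll(PDO::FETCH_COLUMN); // matching species ids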

Sincerely,

Eva

On 27.06.2016 at 15:26, Jim Hughes wrote:
> Hi Eva,
>
> Which database are you using?  There may be an optimization which can be
> made for your database.
>
> By chance is the data something which could be shared publicly? Another
> possibility is that the slowness is a bug in a GeoTools datastore
> implementation.  If so, it'd be great to have a bug report.
>
> It may be counterintuitive, but one really, really simple option may be
> to export your database as a CSV or Shapefile and skip using a
> database.  Some of the fastest queries may become a little slower, but
> you'd never run into a 2 minute bottleneck.
>
> Cheers,
>
> Jim
>
> On 06/27/2016 08:38 AM, emgerstner wrote:
>> Hello,
>>
>> I'm developing a web application based on OpenLayers 3 and Geoserver
>> 2.9.0. In this application, the user can search in a database containing
>> about 8000 entries of species with their locations. The search result is
>> then displayed in a map with the locations of the found species.
>>
>> To this end, I first do a database query based on the search parameters
>> (using PHP), which returns a list of species ids. I then forward this
>> list to Geoserver with a WFS request using CQL_FILTER such as
>>
>>      CQL_FILTER=SpeciesID IN (16,17,18,40,41,...)
>>
>> Geoserver then returns a map with only these species in the search
>> result, which is fine. However, I run into performance problems when the
>> list of ids becomes too large. For 8000 ids, it takes Geoserver about 2
>> minutes to draw the map, which is way too long. The database query may
>> return anything between 0 and 8000 results.
>>
>> Is there another way to solve this problem more efficiently or some way
>> I can optimize Geoserver for this? I would be glad for any help.
>>
>> Sincerely,
>>
>> Eva
>>
>>

