Python array -> geometry(POINT, [SRID])[] -> unnest: you can use unnest in a CTE, something like
WITH cte AS ( SELECT unnest(array['POINT(1 1)'::geometry ......]) pnt ) SELECT * FROM t JOIN cte ON ST_....

Millions of points aren't big data; if it is slow, maybe you should check your indexes.

On Mon, Jun 13, 2016 at 08:08:34AM -0700, Michal Seidl wrote:
> Hello,
> I have a probably simple question, but I am stuck. I have a table with about 1E6
> points. I have a few hundred points in a Python array. How do I
> pass these points into a PostGIS SQL query to check them against the table
> (does the same point exist?) with millions of points?
>
> I started with a Python loop checking every point individually, but it is slow.
> I have found lots of recipes for dealing with large data, but they assume the
> data is already in the database.
>
> Which way is good for this situation?
> - create a temporary table and first insert the points into the DB
> - try to keep the Python connection open and send an SQL query for each
> point quickly
> - try to push all points in one query? How?
> - write a stored procedure that accepts an array of data?
> - anything else?
>
> Maybe it is a more general SQL question, but I am working with points.
>
> Thanks for the advice,
> Michal
>
>
>
> --
> View this message in context:
> http://postgis.17.x6.nabble.com/Performance-external-loop-WHERE-IN-values-stored-procedure-or-else-to-pass-data-to-SQL-query-tp5010138.html
> Sent from the PostGIS - User mailing list archive at Nabble.com.
> _______________________________________________
> postgis-users mailing list
> postgis-users@lists.osgeo.org
> http://lists.osgeo.org/mailman/listinfo/postgis-users
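To make the one-query unnest idea concrete, here is a minimal Python sketch that assembles such a query. The table name "points", column "geom", and SRID 4326 are assumptions; in real code you would bind the array as a query parameter (psycopg2, for example, adapts Python lists to SQL arrays) instead of interpolating WKT strings, which is unsafe for untrusted input.

```python
# Sketch only: build one query that ships all points as a geometry array
# and joins it against the table via unnest(), instead of one query per point.
# "points", "geom", and SRID 4326 are hypothetical -- adjust to your schema.
wkt_points = ["POINT(1 1)", "POINT(2 2)", "POINT(3 3)"]  # your Python array

# Turn each WKT string into a geometry constructor inside an SQL array literal.
array_literal = "ARRAY[" + ", ".join(
    f"ST_GeomFromText('{w}', 4326)" for w in wkt_points
) + "]"

# CTE unnests the array; the join finds rows whose geometry matches a point.
query = (
    "WITH cte AS (SELECT unnest(" + array_literal + ") AS pnt) "
    "SELECT t.* FROM points t JOIN cte ON ST_Equals(t.geom, cte.pnt)"
)
print(query)
```

With a GiST index on `geom`, the planner can use the index for each unnested point, so the whole batch runs as a single round trip.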