On 17.12.2009 15:46, Carbonari, Katie (IS) wrote:
Good morning. I have a lot of data I need to visualize in QGIS (142 time
steps, each with over 200,000 points). I need a way to quickly and
easily load this data in QGIS. All of my data is in separate .csv files
(one file per time step). I load each file via the csv plug-in and then
use join attributes to connect my csv data to the polygon data that
defines my grid. Join attributes takes forever (trying to do 4,000
points took an hour). Does anyone know of another, faster way to do this?
Or a way to speed up my join attributes? I know 200,000 points is a lot
to ask and can easily cut that down a bit, but an hour to do 4000 points
seems very slow. At that rate, it would take 3 weeks to load the entire
time series into QGIS. I'm running the latest version of QGIS (Mimas) on
Mac OS X.

Good afternoon ;-)

I think with this amount of data the best approach would be to work with a spatially enabled database - most prominently PostgreSQL/PostGIS - and to do all the data management there; that is what databases exist for...

So:
1. export your polygon data to PostgreSQL/PostGIS
2. push your CSV data into the database as well
3. create the objects you need (tables or views joining 1 and 2) inside the database
4. map the result in QGIS
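
Just as a rough, untested sketch of steps 2 and 3 in Python with psycopg2 - all names here (database "mydb", tables "grid" and "readings", the join column "cell_id", the CSV columns and file pattern) are only assumptions for illustration; the grid polygons would be imported first, e.g. with shp2pgsql:

    import glob
    import psycopg2

    conn = psycopg2.connect("dbname=mydb user=katie")
    cur = conn.cursor()

    # step 2: one staging table for all 142 time steps (columns assumed)
    cur.execute("""
        CREATE TABLE readings (
            cell_id  integer,
            timestep integer,
            value    double precision
        )
    """)

    # bulk-load each CSV with COPY - much faster than row-by-row inserts
    for path in sorted(glob.glob("timestep_*.csv")):
        with open(path) as f:
            cur.copy_expert(
                "COPY readings (cell_id, timestep, value) FROM STDIN WITH CSV HEADER",
                f)

    # step 3: index the join key and let the database do the join once;
    # 'grid' is the polygon table from step 1
    cur.execute("CREATE INDEX readings_cell_idx ON readings (cell_id)")
    cur.execute("""
        CREATE TABLE readings_geom AS
        SELECT g.the_geom, r.cell_id, r.timestep, r.value
        FROM grid g JOIN readings r USING (cell_id)
    """)
    conn.commit()
    conn.close()

The resulting table (or a view per time step) can then be loaded directly as a PostGIS layer in QGIS, so the expensive join happens only once in the database; depending on the PostGIS version you may still have to register the geometry column of the new table in geometry_columns.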

The same applies if you do it with SpatiaLite, which is also possible, but I have no experience with that database system...

Hope this helps,
greetings,
Albin

--
---------------------------------------------------------------------
| Albin Blaschka, Mag. rer.nat - Salzburg, Austria
| http://www.albinblaschka.info   http://www.thinkanimal.info
| It's hard to live in the mountains, hard, but not hopeless!
---------------------------------------------------------------------

