Hi All,

 I'm now very impressed with PostGIS and what it can do with spatial
data. Given the easy setup on Ubuntu, the easy admin with pgAdmin3,
and the great interface with QGIS, I'm totally won over.

 Of course I still need to read my spatial data into R (until I do all
my analysis in Python....).

 I'm trying to follow best practice with my database. So for any set
of polygons I have one table with the geometry and other tables with
the attribute data. I can create maps of the data by creating views
where a data table is joined to the geometry table on the polygon id.
However, some of my views have many rows per polygon...
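
 For concreteness, such a view might be created along these lines. I've
run the SQL from R via RPostgreSQL just to keep everything in one
place, and the table, column and view names are all invented:

library(RPostgreSQL)

con <- dbConnect(PostgreSQL(), dbname = "ethiopia")

# Join the data table to the geometry table on the polygon id.
# The geometry column is repeated for every matching data row.
res <- dbSendQuery(con, "
  CREATE VIEW rain_map AS
  SELECT g.gid, g.geom, d.year, d.month, d.rainfall
  FROM   regions g
  JOIN   monthly_rain d ON d.region_id = g.gid")
dbClearResult(res)

dbDisconnect(con)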

 For example, I have monthly rainfall for 100 regions in Ethiopia over
5 years - so that's a data table with 12*100*5 = 6000 rows. If I read
that in as an sp object (via readOGR) I will end up with a 6000-row sp
object containing 60 copies of each of the 100 polygon boundaries.
That doesn't sound good to me.

 What I want to do with this data is draw 60 choropleth maps - one for
each month. I can think of four approaches:

 1. Suck it up - read the whole thing into an sp object and then
subset that by month for the map. Hope R doesn't fall over.

 2. Read the map (without attached data) into an sp object and read
the data (without map geometry) into a data frame (using RPgSQL). Then
attach each month to the sp object in turn (sketched below).

 3. As (2), but attach the whole data frame to the sp object and then
subset. Probably not much different from (2) really.

 4. Can I apply an SQL select to the readOGR call, so I only select a
map with a single month's rainfall? I've just thought of this one. I'm
not sure if the specification of a PostGIS layer allows it.
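
 In case it helps, here's roughly what I have in mind for (2). It's
only a sketch: I've written it with rgdal and RPostgreSQL (rather than
RPgSQL), and the table and column names are invented.

library(sp)
library(rgdal)
library(RPostgreSQL)

# The polygons, read once, with no monthly data attached
dsn     <- "PG:dbname=ethiopia host=localhost"
regions <- readOGR(dsn, layer = "regions")

# One month's rainfall as an ordinary data frame: 100 rows, not 6000
con  <- dbConnect(PostgreSQL(), dbname = "ethiopia")
rain <- dbGetQuery(con,
          "SELECT region_id, rainfall
             FROM monthly_rain
            WHERE year = 2008 AND month = 6")
dbDisconnect(con)

# match() rather than merge(), so the polygon order of the sp object
# is preserved (merge() can silently reorder the rows of @data)
regions$rainfall <- rain$rainfall[match(regions$region_id, rain$region_id)]

spplot(regions, "rainfall")   # one choropleth, for that month

Looping the query-and-attach part over year and month would give the
60 maps.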

Any thoughts?

Barry

