I'm building a national database of agricultural information, and one of
the layers is a bit more than a gigabyte per state. That's 1-2 million
records per state, each with a multipolygon geometry, and I've got about
40 states' worth of data. I'm trying to store everything in a single PG table.
What I'm concerned about is this: if I combine every state into one big
table, will performance be terrible, even with indexes? On the other
hand, if I store the data in several smaller files, then when a user
zooms in on a multi-state region, I have to build or find a much more
complicated way to query across all of them.
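
To make the single-table option concrete, here's roughly the schema and the
kind of multi-state query I have in mind. This is only a sketch: it assumes a
reasonably recent PostGIS, and the table name, column names, SRID, and
envelope coordinates are all placeholders.

    -- Single national table with a GiST spatial index (sketch only; all
    -- names, the SRID, and the envelope coordinates are placeholders).
    CREATE TABLE ag_layer (
        gid      serial PRIMARY KEY,
        statefp  char(2) NOT NULL,              -- state FIPS code
        geom     geometry(MultiPolygon, 4326)   -- the multipolygon layer
    );
    CREATE INDEX ag_layer_geom_idx    ON ag_layer USING gist (geom);
    CREATE INDEX ag_layer_statefp_idx ON ag_layer (statefp);

    -- Zooming in on a multi-state region is then one bounding-box query:
    SELECT gid, statefp
    FROM   ag_layer
    WHERE  geom && ST_MakeEnvelope(-105.0, 40.0, -96.0, 43.0, 4326);
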
So I'm wondering: should I be concerned about building a single
national-sized table like the one sketched above (possibly 80-100 GB) for
all of these records, or should I keep the files smaller and hope there's
something like ogrtindex out there for PG tables (roughly the per-state
layout sketched below)? What do you all recommend in this case? I just
moved over to Postgres to handle big files, but I don't know its limits.
With a background working with MS Access and bitter memories of what
happens when you get near Access's two-gigabyte database size limit, I'm a
little nervous about these much bigger files. So I'd appreciate anyone's
advice here.
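
In case the alternative isn't clear, the per-state fallback I have in mind
looks roughly like the sketch below: one table per state, each with its own
GiST index, stitched together with a UNION ALL view so a multi-state zoom can
still be written as a single query. Again, all names are made up and a
reasonably recent PostGIS is assumed.

    -- One table per state (names invented), each with a spatial index:
    CREATE TABLE ag_layer_08 (        -- Colorado, FIPS 08
        gid  serial PRIMARY KEY,
        geom geometry(MultiPolygon, 4326)
    );
    CREATE TABLE ag_layer_31 (        -- Nebraska, FIPS 31
        gid  serial PRIMARY KEY,
        geom geometry(MultiPolygon, 4326)
    );
    CREATE INDEX ag_layer_08_geom_idx ON ag_layer_08 USING gist (geom);
    CREATE INDEX ag_layer_31_geom_idx ON ag_layer_31 USING gist (geom);
    -- ...and so on for the other states...

    -- A view over the per-state tables, so the zoom query looks the same
    -- as in the single-table case (each branch can use its own index):
    CREATE VIEW ag_layer_all AS
        SELECT '08' AS statefp, gid, geom FROM ag_layer_08
      UNION ALL
        SELECT '31' AS statefp, gid, geom FROM ag_layer_31;

    SELECT statefp, gid
    FROM   ag_layer_all
    WHERE  geom && ST_MakeEnvelope(-105.0, 40.0, -96.0, 43.0, 4326);

Whether maintaining forty-some of those plus the view actually beats one big
indexed table is exactly what I can't tell.
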
TIA,
- Bill Thoen