Having that many instances is not practical at all, so I'll have as many
databases as I have 'realms'. I'll use pg_dump | nc and nc | psql to
move databases.
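The pg_dump | nc pipeline mentioned above can be sketched roughly as follows; the database name (realm_db), the port (9000), and the receiving host name are placeholders, and the exact listen flags vary between netcat variants:

```shell
# On the receiving host: create an empty database, then listen on a TCP
# port and feed everything that arrives straight into psql.
# ("realm_db", port 9000, and "receiving-host" are placeholders.)
createdb realm_db
nc -l 9000 | psql realm_db

# On the sending host: dump the realm's database and stream it over TCP,
# so the dump never touches disk on either side.
pg_dump realm_db | nc receiving-host 9000
```

Note that the stream is unauthenticated and unencrypted, so this only makes sense on a trusted network; piping the dump through ssh instead of nc avoids that.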
Mario
Then you can use schemas, too, it'll be easier.
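A minimal sketch of the schema-per-realm layout suggested here, with a hypothetical realm name and illustrative column lists; each realm gets its own copy of the three tables inside a single database:

```sql
-- One schema per realm, all inside one database.
-- ("realm_1" and the column lists are illustrative placeholders.)
CREATE SCHEMA realm_1;
CREATE TABLE realm_1.samples (sample_id bigint PRIMARY KEY);
CREATE TABLE realm_1.drones (drone_id bigint PRIMARY KEY);
CREATE TABLE realm_1.drones_history (drone_id bigint,
                                     recorded_at timestamptz);

-- Point a session at one realm; queries then need no table renaming.
SET search_path TO realm_1;
```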
--
Sent via pgsql-performance mailing list (pgsql-performance@po
I saw a presentation from Heroku where they discussed using a similar
paradigm, and they ran into trouble once they hit a couple thousand
databases. If memory serves, this was on an older version of
PostgreSQL and may not be relevant with 9.0 (or even 8.4?), but you
may want to try to track down on
On 11/30/2010 12:45 PM, Dimitri Fontaine wrote:
Mario Splivalo writes:
I have a simple database schema, containing just three tables:
samples, drones, drones_history.
Now, those tables hold data for the drones for a simulation. Each simulation
dataset will grow to around 10 GB in around 6 months.
I have a simple database schema, containing just three tables:
samples, drones, drones_history.
Now, those tables hold data for the drones for a simulation. Each
simulation dataset will grow to around 10 GB in around 6 months.
Since the data is not related in any way I was thinking of separating