[GENERAL] Pushing the Limits

2006-11-12 Thread Cabbar Duzayak

Hi,

We have a huge amount of data, and we are planning to use logical
partitioning to divide it over multiple machine instances. We are
planning to use Intel-based machines, and the workload is mostly
selects with very few updates. The main table that accounts for most
of this data has about 5 columns, rows are about 50 bytes in size, and
3 columns in this table need to be indexed.

So, what I want to find out is how far we can push the limits on a
single machine with about 2 GB of RAM. Do you think PostgreSQL can
handle ~700-800 GB on a single machine? And is it OK to put this much
data in a single table, or should we divide it over multiple tables?
If so, what would be a reasonable size limit for a single table?
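
For concreteness, here is a rough sketch of the sort of thing I mean
by dividing it over multiple tables, using PostgreSQL's
inheritance-based partitioning with constraint exclusion (8.1 and
later). The table and column names below are just placeholders, not
our actual schema:

-- Parent table; names and types are placeholders only.
CREATE TABLE big_table (
    id         bigint    NOT NULL,
    account_id integer   NOT NULL,
    kind       smallint  NOT NULL,
    created_at timestamp NOT NULL,
    value      integer
);

-- One child table per date range; the CHECK constraint lets the
-- planner skip partitions that cannot match.
CREATE TABLE big_table_2006_11 (
    CHECK (created_at >= '2006-11-01' AND created_at < '2006-12-01')
) INHERITS (big_table);

-- The three indexed columns, indexed per child table.
CREATE INDEX big_table_2006_11_account_idx ON big_table_2006_11 (account_id);
CREATE INDEX big_table_2006_11_kind_idx    ON big_table_2006_11 (kind);
CREATE INDEX big_table_2006_11_created_idx ON big_table_2006_11 (created_at);

-- Enable pruning of non-matching partitions at plan time.
SET constraint_exclusion = on;

-- Queries against the parent transparently cover all children.
SELECT count(*)
  FROM big_table
 WHERE created_at >= '2006-11-01' AND created_at < '2006-11-08';

Each child table then stays at a manageable size and can be loaded,
reindexed, or dropped on its own.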

Any help/input on this is greatly appreciated.

Thanks.


[GENERAL] Queue Implementation for/with PostgreSQL

2005-11-27 Thread Cabbar Duzayak

Hi,

Does PostgreSQL support queues? I mean, does it have something like
Oracle's Advanced Queuing? If not, can you please recommend an
open-source, lightweight and, more importantly, RELIABLE
point-to-point queue implementation built on PostgreSQL? I don't need
publish/subscribe or a priority-based point-to-point queue, just a
simple one that provides a reliable way of enqueuing and dequeuing in
LIFO order...
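
To make it concrete, here is roughly the kind of thing I have in mind:
a plain table plus SELECT ... FOR UPDATE inside a transaction. The
table and column names below are made up, and LIFO just means taking
the newest id first:

-- Queue table; names are placeholders only.
CREATE TABLE message_queue (
    id          bigserial PRIMARY KEY,
    payload     text      NOT NULL,
    enqueued_at timestamp NOT NULL DEFAULT now()
);

-- Enqueue: just an INSERT; it becomes durable on COMMIT.
INSERT INTO message_queue (payload) VALUES ('work item 1');

-- Dequeue (LIFO: newest row first), done inside one transaction so a
-- crashed consumer leaves the item in the queue.
BEGIN;
SELECT id, payload
  FROM message_queue
 ORDER BY id DESC
 LIMIT 1
   FOR UPDATE;   -- keeps other consumers from grabbing the same row
-- process the payload in the application, then delete the row,
-- e.g. if the SELECT above returned id = 42:
DELETE FROM message_queue WHERE id = 42;
COMMIT;

The obvious limitation is that concurrent consumers block each other
on the same newest row, so this only suits a modest dequeue rate.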

Thanks...
