On Wed, May 23, 2012 at 5:11 PM, Herouth Maoz <hero...@unicell.co.il> wrote:
> A replication solution is not very good, either, because of course I can't 
> define indexes differently, I don't want *all* transactions in all tables to 
> be sent, and also, because I may want to cross reference data from different 
> systems. So ideally, I want to have a reporting database, where specific 
> tables (or maybe even just specific columns) from various databases are 
> collected, and have a reporting tool connect to this database. But I want to 
> push the data into that database as close to real time as possible.

Take a look at PgQ from Skytools. You can queue your OLTP data changes
and apply only the specific columns you need to your OLAP database.
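As a rough sketch of that setup (table and queue names here are hypothetical, and this assumes Skytools/PgQ is installed on the OLTP side):

```sql
-- 1. Create an event queue on the source (OLTP) database:
SELECT pgq.create_queue('reporting_queue');

-- 2. Attach PgQ's row-logging trigger to the table of interest.
--    pgq.logutriga serializes each insert/update/delete into an event;
--    columns you don't want shipped can be excluded with ignore=
--    (column names here are made up for illustration):
CREATE TRIGGER transactions_pgq_trig
AFTER INSERT OR UPDATE OR DELETE ON transactions
FOR EACH ROW
EXECUTE PROCEDURE pgq.logutriga('reporting_queue', 'ignore=internal_notes');
```

A consumer process (e.g. a small script built on the pgq Python module, or a londiste replica) then reads batches from the queue and writes just the columns you care about into the reporting database, which can carry its own indexes tuned for reporting queries.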

>
> The most important data I am currently considering are two tables which have 
> an average of 7,600 transactions per hour (standard deviation 10,000, maximum 
> in May is 62,000 transactions per hour). There may be similar pairs of tables 
> collected from more than one database.
>
> I assume this is not an uncommon scenario. What solutions would you recommend?
>
>
> Herouth
> --
> Sent via pgsql-general mailing list (pgsql-general@postgresql.org)
> To make changes to your subscription:
> http://www.postgresql.org/mailpref/pgsql-general



-- 
Sergey Konoplev

a database and software architect
http://www.linkedin.com/in/grayhemp

Jabber: gray...@gmail.com Skype: gray-hemp Phone: +79160686204

