On Wed, Jul 1, 2015 at 7:08 AM, ben.play <benjamin.co...@playrion.com>
wrote:

> In fact, the cron job will:
> -> select about 10,000 rows from a big table (>100 GB of data); one user
> has about 10 rows.
> -> each row is examined by an algorithm
> -> after each row, the cron job updates a few parameters for the user
> (adding some points, for example)
> -> then it inserts a row in another table to record each transaction for
> the user.
>
> All updates and inserts can be done ONLY by the cron job ...
> Therefore the merge can be done easily: nothing else can update this new
> data.
>
> But ... how do big companies like Facebook or YouTube run such
> calculations on dedicated servers without impacting users?


I'm assuming this query is really HUGE, otherwise I can't see why it'd
bring your database to a halt, especially with that amount of main memory.
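
If the big SELECT itself is what hurts, one option is to stream it with a
server-side cursor so the client never holds the whole result set at once.
A minimal sketch, assuming Python with psycopg2; the connection string,
table, and column names are made up, not your actual schema:

    import psycopg2

    def process(row):
        """Placeholder for your per-line algorithm."""
        ...

    # Hypothetical connection and schema -- adjust to yours.
    conn = psycopg2.connect("dbname=app host=replica")

    # A named cursor is server-side: psycopg2 fetches itersize rows per
    # round trip instead of materializing the full result set client-side.
    cur = conn.cursor(name="cron_scan")
    cur.itersize = 1000
    cur.execute("SELECT id, user_id, payload FROM big_table")

    for row in cur:
        process(row)

    cur.close()
    conn.close()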

That aside, I don't see why you can't send the inserts back to the master
DB in small batches.
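
A minimal sketch of what I mean, again with psycopg2 and invented names
(users.points and a user_transactions log -- substitute your real schema).
execute_values() folds many rows into one statement, and each flush is one
short transaction on the master:

    import psycopg2
    from psycopg2.extras import execute_values

    BATCH_SIZE = 500  # arbitrary; tune to what the master absorbs

    def compute():
        """Placeholder: yields (user_id, delta, detail) per processed line."""
        yield from ()

    # Hypothetical connection -- point it at the master, not the replica.
    master = psycopg2.connect("dbname=app host=master")

    def flush(point_updates, transaction_rows):
        """Write one batch to the master in a single short transaction."""
        with master, master.cursor() as cur:
            # One UPDATE covering the whole batch of point changes.
            execute_values(
                cur,
                "UPDATE users AS u SET points = u.points + v.delta "
                "FROM (VALUES %s) AS v(user_id, delta) "
                "WHERE u.id = v.user_id",
                point_updates,
            )
            # One INSERT covering the whole batch of transaction rows.
            execute_values(
                cur,
                "INSERT INTO user_transactions (user_id, detail) VALUES %s",
                transaction_rows,
            )

    point_updates, transaction_rows = [], []
    for user_id, delta, detail in compute():
        point_updates.append((user_id, delta))
        transaction_rows.append((user_id, detail))
        if len(point_updates) >= BATCH_SIZE:
            flush(point_updates, transaction_rows)
            point_updates, transaction_rows = [], []
    if point_updates:  # final partial batch
        flush(point_updates, transaction_rows)

Since each batch commits in its own short transaction, locks on the users
rows are held only briefly, so regular traffic on the master barely
notices.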

Regards.
