Hi Greg,

On 2009-06-16 12:13:20, Greg Smith wrote:
> The first level of problems you'll run into are how to keep up with
> loading data every day. The main way to get bulk data in PostgreSQL,
> COPY, isn't particularly fast, and you'll be hard pressed to keep up with
> 250GB/day unless you write a custom data loader that keeps multiple cores
AFAIK he was talking about 250 GByte/month, which is around 8 GByte a
day, or roughly 350 MByte per hour.
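The per-day and per-hour figures can be sanity-checked with a few lines of Python (assuming a 30-day month and 1 GByte = 1024 MByte):

```python
# Break 250 GByte/month down into per-day and per-hour rates.
gbyte_per_month = 250
gbyte_per_day = gbyte_per_month / 30         # ~8.3 GByte/day
mbyte_per_hour = gbyte_per_day * 1024 / 24   # ~356 MByte/hour

print(f"{gbyte_per_day:.1f} GByte/day, {mbyte_per_hour:.0f} MByte/hour")
# → 8.3 GByte/day, 356 MByte/hour
```

Either way, that is orders of magnitude below the 250 GB/day figure quoted above.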
Thanks, greetings and a nice day/evening,
Michelle Konzack
Systemadministrator
Tamay Dogan Network
Debian GNU/Linux Consultant
--
Linux-User #280138 with the Linux Counter, http://counter.li.org/
##################### Debian GNU/Linux Consultant #####################
<http://www.tamay-dogan.net/> Michelle Konzack
<http://www.can4linux.org/> c/o Vertriebsp. KabelBW
<http://www.flexray4linux.org/> Blumenstrasse 2
Jabber [email protected] 77694 Kehl/Germany
IRC #Debian (irc.icq.com) Tel. DE: +49 177 9351947
ICQ #328449886 Tel. FR: +33 6 61925193
