On Fri, Aug 19, 2016 at 4:58 PM Thomas Güttler <guettl...@thomas-guettler.de>
wrote:

>
>
> Am 19.08.2016 um 09:42 schrieb John R Pierce:
> > On 8/19/2016 12:32 AM, Thomas Güttler wrote:
> >> What do you think?
> >
> > I store most of my logs in flat text files, syslog style, and use grep for
> > ad hoc querying.
> >
> >  200K rows/day, that's 1.4 million/week, 6 million/month; pretty soon
> > you're talking big tables.
> >
> > in fact that's several rows/second on a 24/7 basis
>
> There is no need to store them more than 6 weeks in my current use case.
>
> I think indexing in postgres is much faster than grep.
>
> And queries involving JSON data are not possible with grep (or at least
> very hard to write)
>
> My concern is which DB (or indexing) to use ...
>

How will you be using the logs? What kinds of queries and searches do you
anticipate? Correlating events and logs from various sources can be really
easy with joins, counts, and summary operations.
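
For example, if each log record lands in a jsonb column, a GIN index lets
you run containment searches that grep cannot express, and aggregation comes
for free. A minimal sketch (the table name, columns, and query below are
made up for illustration, not taken from your setup):

    CREATE TABLE app_logs (
        id        bigserial PRIMARY KEY,
        logged_at timestamptz NOT NULL DEFAULT now(),
        source    text NOT NULL,
        payload   jsonb NOT NULL
    );

    -- GIN index to speed up containment (@>) searches on the JSON payload
    CREATE INDEX app_logs_payload_gin ON app_logs USING gin (payload);
    -- btree index for the time-range filter almost every log query has
    CREATE INDEX app_logs_logged_at_idx ON app_logs (logged_at);

    -- Count errors per source over the last 24 hours
    SELECT source, count(*)
      FROM app_logs
     WHERE payload @> '{"level": "error"}'
       AND logged_at > now() - interval '24 hours'
     GROUP BY source
     ORDER BY count(*) DESC;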

The kind of volume you are anticipating should be fine with Postgres, but
before you really decide which one to use, you need to figure out what you
want to do with this data once it is in Postgres.
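
Retention is worth planning at the same time. Since you only need six weeks
of history, a common trick is one child table per week via table inheritance,
so that expiring old data is a cheap DROP TABLE instead of a big DELETE.
A rough sketch, reusing the hypothetical app_logs table from above (each
child needs its own indexes, since inheritance does not copy them):

    -- One child table per ISO week, constrained to that week's range
    CREATE TABLE app_logs_2016_w34 (
        CHECK (logged_at >= '2016-08-22' AND logged_at < '2016-08-29')
    ) INHERITS (app_logs);
    CREATE INDEX app_logs_2016_w34_payload_gin
        ON app_logs_2016_w34 USING gin (payload);

    -- Expiring a week that has aged out of the 6-week window is instant:
    DROP TABLE app_logs_2016_w28;

With constraint_exclusion enabled, queries that filter on logged_at will
skip the weekly children they cannot match.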


> Regards,
>   Thomas
>
>
> --
> Thomas Guettler http://www.thomas-guettler.de/
>
>
-- 
Best Regards
Sameer Kumar | DB Solution Architect
*ASHNIK PTE. LTD.*

101 Cecil Street, #11-11 Tong Eng Building, Singapore 069 533

T: +65 6438 3504 | M: +65 8110 0350

Skype: sameer.ashnik | www.ashnik.com
