On Thu, Aug 16, 2012 at 3:54 PM, Wells Oliver <wellsoli...@gmail.com> wrote:
> Hey folks, a question. We have a table that's getting large (6 million rows
> right now, but hey, no end in sight). It's wide-ish, too: 98 columns.
>
> The problem is that each of these columns needs to be searchable quickly at
> an application level, and I'm far too responsible an individual to put 98
> indexes on a table. Wondering what you folks have come across in terms of
> creative solutions that might be native to postgres. I can build something
> that indexes the data and caches it and runs separately from PG, but I
> wanted to exhaust all native options first.

Well, you could explore normalizing your table, particularly if many
of your 98 columns are null most of the time.  Another option is to
store the attributes as key/value pairs in an hstore column and index
it with GIN or GiST -- especially useful if you need to filter on
multiple columns at once.  Organizing big data for fast searching is
a complicated topic and requires significant thought about how the
data will actually be queried.
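For example, here's a minimal sketch of the hstore approach (the
table and column names are illustrative, not from the original post):

    CREATE EXTENSION IF NOT EXISTS hstore;

    -- Store the sparse attributes as key/value pairs in one column.
    CREATE TABLE measurements (
        id    bigserial PRIMARY KEY,
        attrs hstore NOT NULL DEFAULT ''
    );

    -- A single GIN index covers containment lookups on any key,
    -- so you don't need a separate index per attribute.
    CREATE INDEX measurements_attrs_idx
        ON measurements USING gin (attrs);

    -- Filter on several attributes at once with the containment
    -- operator; the GIN index serves this query:
    SELECT id
    FROM measurements
    WHERE attrs @> 'color => red, size => large'::hstore;

The tradeoff is that every value is stored as text, so you lose
per-column types and constraints; whether that's acceptable depends
on your data.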

merlin

