On 12/7/2016 4:02 PM, metaresolve wrote:
> I used to use Access to do my data crunching, matching, and cleaning at my
> old job. I worked with a max of 600k records, so Access could handle it. I
> know, lame, but it's what I knew.

Access is really two completely different things bundled together. One is the rather weak "Jet" sorta-relational database engine, which implements only a subset of SQL. The other is a database application development system oriented around forms and reports.


> My thought was to use PostgreSQL as a kind of more advanced Access that I
> could similarly use to crunch numbers. However, my file has 1.1M records
> and pgAdmin seems to be choking on it.

Postgres, on a properly scaled and tuned database server, can handle billions of records. Obviously, doing something silly like querying all those billions at once will never be fast; that's a lot of data to marshal and process.
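
The usual cure for a GUI choking is to stop dragging all 1.1M rows over to the client in the first place: bulk-load the file with COPY (or psql's \copy, which reads the file client-side), then do the crunching in SQL so only the summary rows come back. A rough sketch, where the table, columns, and file path are all made up for illustration:

    -- hypothetical table for illustration
    CREATE TABLE customers (
        id     integer,
        name   text,
        email  text
    );

    -- bulk-load the file; from psql, use \copy instead if the file
    -- is on your workstation rather than on the database server
    COPY customers FROM '/path/to/file.csv' WITH (FORMAT csv, HEADER true);

    -- find duplicate emails server-side: only the summary rows come back
    SELECT email, count(*) AS dupes
      FROM customers
     GROUP BY email
    HAVING count(*) > 1
     ORDER BY dupes DESC
     LIMIT 100;

pgAdmin then only has to render a hundred result rows instead of the whole table. Stick a LIMIT on any ad-hoc SELECT when you just want to eyeball the data.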



--
john r pierce, recycling bits in santa cruz


