I'm new to SQLite: not a programmer, not a DBA, just an end-user with no
dev support, working on a pilot project (single user, no updates, just queries).

 

I want to analyze the data contained in a 44 GB CSV file with 44M rows x 600
columns (all fields under 15 characters). It seems a DBMS would let me query
it in a variety of ways to analyze the data.

 

I have the data files and SQLite on my laptop: a 64-bit Win7 Intel dual-proc
with 4GB RAM + 200GB free disk space.

End-user tools like Excel & Access failed due to lack of memory. I
downloaded SQLite version 3.6.23.1. I tried the Firefox Data Manager add-on,
but it would not load the CSV files ('csv worker failed'). Then I tried
Database Master from Nucleon, but it failed after loading ~57,000 rows
(which took 100 minutes) with the error message 'database or disk is full'.
I tried to create another table in the same DB but could not, with the same
error message. The DB size shows as 10,000 KB (that looks suspiciously like
a size setting?).
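For what it's worth, the sqlite3 command-line shell can import a CSV directly (".mode csv" followed by ".import file.csv tablename"), which sidesteps GUI-tool limits; alternatively, a short script can stream the file row by row so it never has to fit in RAM. Below is a minimal sketch using only Python's standard library; the file, database, and table names are hypothetical, and it assumes the first CSV line holds column names:

```python
import csv
import sqlite3
from itertools import islice

def load_csv(csv_path, db_path, table, batch=50000):
    """Stream rows from a CSV into SQLite in batches so memory use stays flat."""
    con = sqlite3.connect(db_path)
    with open(csv_path, newline="") as f:
        reader = csv.reader(f)
        header = next(reader)                       # first line = column names
        cols = ", ".join('"%s"' % c for c in header)
        marks = ", ".join("?" * len(header))
        con.execute('CREATE TABLE IF NOT EXISTS "%s" (%s)' % (table, cols))
        while True:
            rows = list(islice(reader, batch))      # read at most `batch` rows
            if not rows:
                break
            con.executemany(
                'INSERT INTO "%s" VALUES (%s)' % (table, marks), rows)
            con.commit()                            # one commit per batch
    con.close()

# Hypothetical usage:
# load_csv("bigfile.csv", "mydata.db", "mytable")
```

Batching the inserts inside explicit commits matters here: one transaction per row would make a 44M-row load extremely slow.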

 

From what I've read, SQLite can handle a DB of this size. So it seems that
either I do not have enough RAM, or there are default memory/storage limits
or time-out issues that prevent loading this large file, or the two GUI
tools I tried have size limits. I do have a fast server available (16 GB
RAM, 12 procs, 64-bit Intel, Windows Server) and an iMac.

 

1.       Is SQLite the wrong tool for this project? (I don't want the
overkill and admin overhead of a large MySQL or SQL Server, etc.)

2.       If SQLite will work, are there configuration settings in SQLite or
Win7 that will permit the load, or is there a better tool for this project?
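On the configuration side of question 2, SQLite itself exposes tunables through PRAGMA statements rather than Windows settings. A sketch of settings commonly suggested for a one-time bulk load follows; they trade crash safety for speed (acceptable when the CSV can simply be reloaded), "mydata.db" is a hypothetical file name, and the values are illustrative, not recommendations:

```python
import sqlite3

con = sqlite3.connect("mydata.db")          # hypothetical database file
con.execute("PRAGMA journal_mode = OFF")    # skip the rollback journal
con.execute("PRAGMA synchronous = OFF")     # don't fsync on every commit
con.execute("PRAGMA cache_size = -200000")  # ~200 MB page cache (negative = KiB)
con.execute("PRAGMA temp_store = MEMORY")   # keep temp tables/indices in RAM
```

Note also that 'database or disk is full' can be raised when SQLite's temporary storage location fills up, not only the database file's drive, so it is worth checking where temp files are being written.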

 

Thanks much for helping a newbie!

 

peterK

 

_______________________________________________
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users
