Thank you all. 
Looks like I'm stuck with the CLI, though I have contacted Nucleon software
support ... tried the CLI yesterday but need more practice.
Is there a good reference book you would recommend for SQLite?

peter

-----Original Message-----
From: sqlite-users-boun...@sqlite.org
[mailto:sqlite-users-boun...@sqlite.org] On Behalf Of Black, Michael (IS)
Sent: Tuesday, May 01, 2012 4:22 PM
To: General Discussion of SQLite Database
Subject: Re: [sqlite] is SQLite the right tool to analyze a 44GB file

You need to try to do the import from the shell.  GUIs seem to have way too
many limits.

http://sqlite.org/download.html
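In the shell the import itself is just two dot-commands (`.mode csv`, then `.import yourfile.csv tablename`). If you'd rather script it, here's a minimal sketch of the same load using Python's built-in sqlite3 module. The file, table, and db names are made up, and it writes a tiny stand-in csv so it runs on its own; point it at the real file in practice:

```python
import csv
import sqlite3

# Tiny stand-in CSV so the sketch is self-contained;
# substitute the real 44 GB file in practice.
with open("sample.csv", "w", newline="") as f:
    csv.writer(f).writerows([["id", "name"], ["1", "ann"], ["2", "bob"]])

conn = sqlite3.connect("pilot.db")  # hypothetical db file name
with open("sample.csv", newline="") as f:
    reader = csv.reader(f)
    header = next(reader)  # first row holds the column names
    cols = ", ".join('"%s"' % c for c in header)
    marks = ", ".join("?" for _ in header)
    conn.execute("DROP TABLE IF EXISTS data")
    conn.execute("CREATE TABLE data (%s)" % cols)
    # executemany streams the remaining rows in; commit once at the end
    conn.executemany("INSERT INTO data VALUES (%s)" % marks, reader)
    conn.commit()

n = conn.execute("SELECT COUNT(*) FROM data").fetchone()[0]
print(n)
conn.close()
```

With the two-row stand-in file this loads 2 rows; the same loop handles the 44M-row file, just slowly.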



Don't create any indexes up front; add them afterwards if they'll help your
queries.  Indexes will slow down your import noticeably.
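To make that concrete, here's a sketch with Python's built-in sqlite3 module (table and index names are made up): bulk-load first with no indexes, build the index afterwards, then check with EXPLAIN QUERY PLAN that queries actually use it.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (a INTEGER, b TEXT)")
# Bulk-load with no indexes in place...
conn.executemany("INSERT INTO t VALUES (?, ?)",
                 ((i, "row%d" % i) for i in range(10000)))
conn.commit()
# ...then build only the indexes your queries will actually use.
conn.execute("CREATE INDEX idx_t_a ON t(a)")
# The query planner should now use the index for lookups on column a.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT b FROM t WHERE a = 42").fetchone()
print(plan[-1])  # the plan detail names idx_t_a
conn.close()
```

Building the index once over loaded data is one sort; maintaining it during 44M inserts is 44M incremental updates, which is where the slowdown comes from.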



I don't think you're anywhere near the limits of SQLite, since its
documentation talks about terabytes.

http://sqlite.org/limits.html



Somebody else can answer for sure, but wrapping your .import inside a
transaction may be a good thing.

I don't know if that's done by default.
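For what it's worth, here's the difference sketched with Python's built-in sqlite3 module. It uses an in-memory db, so the timing gap won't show here, but on a disk-backed file the one-transaction-per-row version is dramatically slower because every commit forces a sync to disk:

```python
import sqlite3


def load(rows, one_txn):
    # isolation_level=None puts the connection in autocommit mode, so
    # without an explicit BEGIN every INSERT is its own transaction.
    conn = sqlite3.connect(":memory:", isolation_level=None)
    conn.execute("CREATE TABLE t (a INTEGER)")
    if one_txn:
        conn.execute("BEGIN")
    for r in rows:
        conn.execute("INSERT INTO t VALUES (?)", (r,))
    if one_txn:
        conn.execute("COMMIT")
    n = conn.execute("SELECT COUNT(*) FROM t").fetchone()[0]
    conn.close()
    return n


rows = range(5000)
n_auto = load(rows, one_txn=False)  # one transaction per row
n_txn = load(rows, one_txn=True)    # one transaction for the whole load
print(n_auto, n_txn)  # prints: 5000 5000
```

Both loads produce the same table; only the commit pattern differs, and that pattern is what dominates bulk-load time on disk.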



Your queries are liable to be pretty slow depending on what you have to do.



Michael D. Black

Senior Scientist

Advanced Analytics Directorate

Advanced GEOINT Solutions Operating Unit

Northrop Grumman Information Systems

________________________________
From: sqlite-users-boun...@sqlite.org [sqlite-users-boun...@sqlite.org] on
behalf of peter korinis [kori...@earthlink.net]
Sent: Tuesday, May 01, 2012 3:06 PM
To: sqlite-users@sqlite.org
Subject: EXT :[sqlite] is SQLite the right tool to analyze a 44GB file

I'm new to SQLite: not a programmer, not a DBA, just an end-user with no
dev support, for a pilot project (single user, no updates, just queries).



I want to analyze the data contained in a 44GB csv file with 44M rows x 600
columns (fields all <15 char). Seems like a DBMS will allow me to query it
in a variety of ways to analyze the data.



I have the data files and SQLite on my laptop: a 64-bit Win7 Intel dual-proc
with 4GB RAM + 200GB free disk space.

End-user tools like Excel & Access failed due to lack of memory. I
downloaded SQLite ver. 3.6.23.1. I tried to use the Firefox Data Manager
add-on but it would not load the csv files ('csv worker failed'). So I tried
Database Master from Nucleon, but it failed with the error 'database or disk
is full' after loading ~57,000 rows (which took 100 minutes). I tried to
create another table in the same db but could not, with the same error
message. The DB size shows as 10,000KB (that looks suspiciously like a size
setting?).



From what I've read, SQLite can handle a DB of this size. So it seems that
either I do not have enough RAM, there are (default) memory/storage limits or
time-out issues that prevent loading this large file, or the 2 GUI tools I
tried have size limits. I do have a fast server (16GB, 12 procs, 64-bit
Intel, Win server) and an iMac available.



1.       Is SQLite the wrong tool for this project? (I don't want the
overkill and admin overhead of a large MySQL or SQL Server, etc.)

2.       If SQLite will work, are there configuration settings in SQLite or
Win7 that will permit the load, or is there a better tool for this project?



Thanks much for helping a newbie!



peterK



_______________________________________________
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users