There was a small error in my previous post.
The 'begin transaction;' line was dropped when
I copied the text out of the script and edited
it to remove the surrounding code.

On Sat, 24 Jun 2006 20:10:58 -0700 (PDT), onemind wrote:

>I am using the sqlite gui and click import table from csv. I select a txt
>file that contain over 200,000 words in a list. Sqlite works fine with a
>smaller list of 200-300 words but when i import my big list, it hangs for
>ages or completely crashes my computer.

>Does anyone know how i can import this list into a table successfully?

The following text file, "word.txt", was created by a small script:

create table wordlist (word text);
begin transaction;
select current_time;
insert into wordlist values ('test0');
insert into wordlist values ('test1');
...
insert into wordlist values ('test199998');
insert into wordlist values ('test199999');
commit transaction;
select current_time;
select count(*) from wordlist;
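
The generating script itself wasn't included above; a minimal sketch
(Python here, purely as an illustration -- the original could have been
written in any scripting language) that writes an equivalent word.txt
would look something like this:

# generate_word_txt.py -- hypothetical sketch: write an SQL script
# equivalent to the word.txt listed above.
with open("word.txt", "w") as f:
    f.write("create table wordlist (word text);\n")
    f.write("begin transaction;\n")
    f.write("select current_time;\n")
    # one insert per word; single quotes are the portable SQL
    # string-literal syntax
    for i in range(200000):
        f.write("insert into wordlist values ('test%d');\n" % i)
    f.write("commit transaction;\n")
    f.write("select current_time;\n")
    f.write("select count(*) from wordlist;\n")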

This text file was executed within the sqlite3 command line utility
by typing the command:

        .read word.txt
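
(Equivalently, the same file can be fed to the shell non-interactively,
for example

        sqlite3 words.db < word.txt

where words.db is just a placeholder database file name.)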

The resulting screen output was

11:20:40
11:20:48
200000

indicating the insertion of 200,000 words into a simple database
required 8 seconds of clock time on a 1.5 GHz AMD Sempron CPU
running MS-Windows XP.

Using sequential numbers as the trailing part of the 'word'
results in a continual rebalancing of the b-tree with each insertion.
For performance reasons, this is probably the worst kind of data
to insert into a database. Inserting random words instead results
in many b-tree leaves that are only partially filled and far fewer
rebalancing operations.

Eight seconds to insert the lot is pretty good.

You could create a similar text file from the CSV file using your
editor.
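
For example -- assuming the CSV really is just one word per line -- a
throwaway script along these lines (again Python, just as a sketch;
the file names are placeholders) would produce the same kind of file:

# csv_to_sql.py -- hypothetical sketch: turn a one-word-per-line file
# (words.csv) into an SQL script (words.txt) wrapped in one transaction.
with open("words.csv") as src, open("words.txt", "w") as dst:
    dst.write("create table wordlist (word text);\n")
    dst.write("begin transaction;\n")
    for line in src:
        word = line.strip()
        if not word:
            continue
        # double any single quotes so the SQL literal stays valid
        word = word.replace("'", "''")
        dst.write("insert into wordlist values ('%s');\n" % word)
    dst.write("commit transaction;\n")

The resulting words.txt can then be loaded with '.read words.txt'
exactly as above; the single transaction around the inserts is what
keeps 200,000 of them fast.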

Chris
