Gilles Ganault wrote:

> But then, how many people use SQLite to handle 68 million rows 
> in a table?

That's a good question.  I don't know, and I don't know if there 
is a recommended maximum size for SQLite databases.  But I was 
able to create the index in 12 minutes once I set a suitable 
cache_size.  The usage I foresee for SQLite is structured data 
provided for download and personal analysis, as an alternative to 
XML dumps or Excel spreadsheets.  And today I can easily download 
a 3 gigabyte database file, which is only 600 megabytes 
compressed.
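For reference, the sequence looks roughly like this (the pragma 
value and the table, column, and index names below are 
illustrative, not the exact ones I used):

    -- Enlarge the page cache before building the index.  The old
    -- default of 2000 pages (about 2 MB at a 1 KB page size) is
    -- far too small for a 68-million-row table; 100000 pages is
    -- roughly 100 MB and lets the index B-tree stay in memory
    -- while it is being built.
    PRAGMA cache_size = 100000;

    -- Hypothetical table and column names; the point is only
    -- that CREATE INDEX runs after the cache has been enlarged.
    CREATE INDEX idx_page_title ON page(title);

Note that cache_size applies per connection, so the pragma has to 
be issued in the same session that creates the index.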

Over on the Unison mailing list (Unison is a tool for two-way 
file synchronization, something like a bidirectional rsync) we're 
constantly amazed that people complain they are unable to 
transfer terabytes in a single run.  Who could have imagined? 
Maybe Gordon Moore.

My 68 million rows come from the Swedish Wikipedia, which is my 
small experimentation base before I try this on the full-size 
German or English Wikipedia.  But I might have to back down to 
the Estonian or Faroese Wikipedia to get an even smaller dataset.


-- 
  Lars Aronsson ([EMAIL PROTECTED])
  Aronsson Datateknik - http://aronsson.se
