On 11 Dec 2014, at 11:51am, Paul <de...@ukr.net> wrote:

> I understand, that having them is a must for a decent performance. 
> In my specific case I have millions of individual database files. 
> This is one, among other reasons that I can't keep them open all the time. 
> Just too many of them. These databases are being opened frequently. 
> Let's say 500 times per second. In most cases, just to query a single row. 
> Ironically, querying takes only a handful of microseconds, and most 
> CPU time is spent reading the same database structure over and over again.
> 
> Can you please offer some advice on what can be done to reduce this overhead?

The problem with this is that it cannot be solved by SQLite's programmers, 
because most of the time is taken by operating system calls.  Merely opening a 
file (which, as you no doubt know, is not actually done by sqlite3_open() but 
deferred until the first access) is a time-consuming procedure.  Once SQLite 
has access to the data it is, as you have shown, very fast.

You explain that you have millions of individual database files.  Is that the 
only reason you can't open each database and keep it open, or are there others? 
Also, do all these separate databases have the same tables with the same 
columns in them?
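If keeping every handle open is impossible but some databases are queried far 
more often than others, one workaround (my own suggestion, not something from 
the thread; the cache size of 256 is an assumed figure to be tuned per 
workload) is a bounded cache of open connections, sketched here with Python's 
built-in sqlite3 module:

```python
import functools
import sqlite3

# Hypothetical helper: keep up to 256 connections open so that hot
# databases skip the expensive re-open on every query.  A production
# version would also close connections as the cache evicts them.
@functools.lru_cache(maxsize=256)
def get_connection(path: str) -> sqlite3.Connection:
    return sqlite3.connect(path)
```

Repeated lookups for the same path then reuse the already-open handle instead 
of paying the open-and-parse-the-header cost again.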

My normal advice would be that, before you start querying, you merge your 
millions of separate database files into one big one.  Judging by the degree 
of technical information in your question, you don't need me to suggest ways 
of programming or scripting this, or of keeping a merged central copy 
up-to-date.  The only question is whether it is appropriate to your 
circumstances.
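For illustration only, one common way to do such a merge is with ATTACH 
DATABASE, copying each small file's rows into the central database.  This 
sketch assumes every per-item database holds an identically-shaped table 
`t(key, value)`; the table and column names are my invention, not from the 
original question:

```python
import sqlite3

def merge_into(central: sqlite3.Connection, source_path: str) -> None:
    """Copy all rows of table t from one small database file into
    the central database, which must already contain table t."""
    central.execute("ATTACH DATABASE ? AS src", (source_path,))
    central.execute("INSERT INTO t SELECT * FROM src.t")
    central.commit()
    central.execute("DETACH DATABASE src")
```

After merging, every lookup hits a single already-open database, so the 
per-query open/close overhead disappears entirely.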

Simon.
_______________________________________________
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users