Ah!
The database tier didn't cross my mind.
I have never used SQLite myself.
I am using MySQL; the table structure is partially normalised and
suitably indexed.
Because of these optimizations, even queries on a huge table (>1
million rows) come back very fast.
SQLite does ship with a command-line shell (the sqlite3 tool; typing
.timer on inside it makes it report query timings).
Run your query from that shell & see whether the speed is OK.
If it is fast there, then the problem lies in your Python code.
Otherwise, you need to optimize the table structure and indexes at the
database level.
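
You can run the same check from Python too, bypassing web2py's DAL,
with the standard-library sqlite3 module. A rough sketch -- the
database path, table name (itemEntry, from your post), and the tid
column are placeholders you'd adjust to your app:

    import sqlite3
    import time

    # Placeholder path -- point this at your app's actual SQLite file.
    conn = sqlite3.connect('applications/myapp/databases/storage.sqlite')
    cur = conn.cursor()

    # Check whether the query can use an index; a full table scan on
    # >1 million rows is the usual cause of slowness.
    cur.execute("EXPLAIN QUERY PLAN SELECT * FROM itemEntry WHERE tid = ?",
                (12345,))
    print(cur.fetchall())  # look for 'USING INDEX' rather than 'SCAN TABLE'

    # If it scans, creating the missing index is cheap to try:
    # cur.execute("CREATE INDEX IF NOT EXISTS idx_itemEntry_tid "
    #             "ON itemEntry (tid)")

    t0 = time.time()
    cur.execute("SELECT * FROM itemEntry WHERE tid = ?", (12345,))
    rows = cur.fetchall()
    print("%d rows in %.3f seconds" % (len(rows), time.time() - t0))

If this is fast but the same query through the DAL is slow, the
overhead is in the Python/DAL layer rather than in SQLite.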

My advice is to use a production database server such as MySQL,
MariaDB, or PostgreSQL.

If you have any questions about using MySQL from Python, please feel
free to ask.
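
For example, with web2py the switch is usually just a new DAL
connection string in your model, and from plain Python any DB-API
driver works much the same way. A minimal sketch using the pymysql
driver -- the host, credentials, database name, and the tid column
below are all placeholders:

    # In a web2py model, only the connection string changes:
    #   db = DAL('mysql://appuser:secret@localhost/mydb')

    # From plain Python, via the pymysql driver (pip install pymysql):
    import pymysql

    conn = pymysql.connect(host='localhost', user='appuser',
                           password='secret', database='mydb')
    cur = conn.cursor()
    # Parameterised query; pymysql uses %s placeholders.
    cur.execute("SELECT COUNT(*) FROM itemEntry WHERE tid = %s", (12345,))
    print("%d matching rows" % cur.fetchone()[0])
    conn.close()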

Cheers!
:-)

On Jul 23, 11:56 am, Luis Goncalves <lgoncal...@gmail.com> wrote:
> Thanks, Vineet!  Lots of good info there!
>
> I don't have actual code yet, because I couldn't even get the db queries to
> work in a reasonable amount of time.
>
> The little code I showed in my initial post already runs slowly (my DB
> records are db.itemEntry, not db.table ...).
>
> The slowness (so far) is due to doing a query/select on a very large DB.
> I need to figure out how to query/select more efficiently.
>
> I wonder if the problem is with sqlite3 itself, since it stores the entire
> DB in a single file.
>
> I have constructed index tables for the fields I am searching on, but it is
> still incredibly slow.
> (see post below in reply to Massimo too!).
>
> Thanks!
> Luis.
