Until now I have only used small, simple databases in Python with sqlite3.
Now I have a large and rather complex database.

The simplest query (returning just 100 rows)
takes about 70 seconds,
and all of that time is spent in "cursor.fetchall".

Using the same database in Delphi,
with the same query,
takes less than 5 seconds (including displaying the full table in a grid).

While it may seem obvious, are you doing anything time-consuming with those results? Or have you tested just doing the fetchall() without any further processing? I'm curious about the timing of

  from time import time

  sql = "..."
  start = time()
  cursor.execute(sql)
  rows = cursor.fetchall()
  end = time()
  print(end - start)

with no other processing. I regularly write SQL that's fairly complex and brings back somewhat large datasets (sometimes in sqlite), and have never seen "simple quer[ies] (with just a result of 100 rows)" take such extraordinary times.
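For reference, here's a self-contained version of that harness against a throwaway in-memory table (the schema is invented just so the snippet runs end to end; swap in your own connection and query):

```python
import sqlite3
from time import time

# Throwaway in-memory database standing in for the real one; the table
# and its contents are made up purely so this runs as-is.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, val TEXT)")
conn.executemany("INSERT INTO t (val) VALUES (?)",
                 [("row%d" % i,) for i in range(100)])

sql = "SELECT * FROM t"
cursor = conn.cursor()

start = time()
cursor.execute(sql)
rows = cursor.fetchall()
end = time()

print(len(rows))      # 100
print(end - start)    # a 100-row fetch should take a tiny fraction of a second
```

If this pattern is fast against your real database too, the problem is elsewhere in your application code.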

The timing from the above code will help determine whether it's the sqlite portion that's crazy (and might need some well-placed indexes; though if your Delphi code is fine, I suspect not), or whether it's your application code that goes off into left field with the resulting data.
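On the index point: sqlite3 can show you the query plan directly, so you can check whether adding an index actually turns a full table scan into an index search. A minimal sketch with a made-up table (the names here are hypothetical, not from the original post):

```python
import sqlite3

# Toy schema invented for this sketch -- just to show EXPLAIN QUERY PLAN
# before and after adding an index.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL)")
conn.executemany("INSERT INTO orders (customer, total) VALUES (?, ?)",
                 [("cust%d" % (i % 50), i * 1.5) for i in range(1000)])

query = "SELECT * FROM orders WHERE customer = ?"

# Without an index, SQLite has to scan the whole table.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query, ("cust7",)).fetchall()
print(plan_before[0][-1])   # detail column mentions a table SCAN

# A well-placed index turns the scan into an index search.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query, ("cust7",)).fetchall()
print(plan_after[0][-1])    # now a SEARCH ... USING INDEX idx_orders_customer
```

The exact wording of the detail column varies between SQLite versions, but the SCAN-versus-SEARCH distinction is what to look for.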

-tkc

--
http://mail.python.org/mailman/listinfo/python-list