On 10 Jul 2015, at 4:14pm, Andy Rahn <andy.rahn at gmail.com> wrote:

> I don't think
> there is a way I can just concatenate an extensive list of WHERE clauses
> into one monolithic SQLite query and also guarantee that SQLite will be
> able to SEARCH via indexes instead of falling back to slow SCANS.

You are engaged in premature optimization of your code.  The optimizer built 
into SQLite is extremely clever.  Take this query as an example:

SELECT id FROM documents WHERE keyword LIKE '%,animal,%' AND rating > 3

If you have an index on the 'rating' column, then SQLite knows that it can 
satisfy the second condition very easily.  So it will use the index to build 
its own list of rows which satisfy 'rating > 3'.  Then it will check only 
those rows against the other part of the clause.

In other words, SQLite does internally what you were expecting to do yourself.
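You can watch the optimizer make this decision with EXPLAIN QUERY PLAN.  Here is a minimal sketch using Python's built-in sqlite3 module (the table, columns, and index name are illustrative assumptions, not anything from your schema):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE documents (id INTEGER PRIMARY KEY, keyword TEXT, rating INTEGER)"
)
con.execute("CREATE INDEX idx_documents_rating ON documents(rating)")

# Ask SQLite how it intends to run the query -- no data needed for this.
plan = con.execute(
    "EXPLAIN QUERY PLAN "
    "SELECT id FROM documents WHERE keyword LIKE '%,animal,%' AND rating > 3"
).fetchall()

for row in plan:
    # The last column is the human-readable plan detail, e.g. something like
    # "SEARCH documents USING INDEX idx_documents_rating (rating>?)"
    print(row[-1])
```

If the plan says SEARCH ... USING INDEX rather than SCAN, the optimizer has already done the narrowing you were planning to do by hand.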

This extends to longer and more complicated combinations of ANDs and ORs in 
your WHERE clause.  Equipped with the results of ANALYZE the optimizer can 
explore each subclause, figure out how much it will narrow the search path, and 
choose the order of doing things which should yield the fastest path to the 
smallest number of rows to scan.  In a few milliseconds.

So there's no need to build the cleverness into your own code.  Instead

1) Make up a database with a convincing set of data in it.  It doesn't have to 
be final, just get lots of rows of plausible data into the tables.
2) Create some indexes which would facilitate your most-used searches.
3) Run the SQL command "ANALYZE".  The result is stored in the database and 
unless the nature of the data in the tables changes, there's no need to rerun 
ANALYZE each time you add/update rows.
4) Make up your SQL commands using a standard SQL WHERE clause.

Simon.
