While I am opposed to premature optimization as well, this is not clearly
a case of that.  As described, the user will be able to create arbitrarily
complex queries, and the SQL will be generated dynamically from that
complicated user input.  Since it is impossible to know in advance which
indexes might be required, breaking the query down into individual
sub-queries, each with a simple WHERE clause that is virtually guaranteed
to use a simple index, can provide fast enough queries for the general
case.
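
For example (the column names other than 'rating' are hypothetical), two
simple indexed sub-queries can be combined with a compound operator:

SELECT id FROM documents WHERE rating > 3
INTERSECT
SELECT id FROM documents WHERE owner = 'andy';

Each sub-query has a single-column WHERE clause, so each can use its own
single-column index; the INTERSECT plays the role of the AND.
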
On Jul 10, 2015 9:36 AM, "Simon Slavin" <slavins at bigfraud.org> wrote:

>
> On 10 Jul 2015, at 4:14pm, Andy Rahn <andy.rahn at gmail.com> wrote:
>
> > I don't think
> > there is a way I can just concatenate an extensive list of WHERE clauses
> > into one monolithic SQLite query and also guarantee that SQLite will be
> > able to SEARCH via indexes instead of falling back to slow SCANS.
>
> You are engaged in premature optimization of your code.  The optimizer
> built into SQLite is extremely clever.  Take this query as an example:
>
> SELECT id FROM documents WHERE keyword LIKE '%,animal,%' AND rating > 3
>
> If you have an index on the 'rating' column, then SQLite knows that it can
> satisfy the second condition very easily.  So it will use the index to
> build its own list of rows which satisfy 'rating > 3'.  Then it will
> check only those rows against the other part of the clause.
>
> In other words, SQLite does internally what you were expecting to do
> yourself.
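>
> A quick way to confirm which plan SQLite picks is to prefix the query
> with EXPLAIN QUERY PLAN (the index name below is illustrative):
>
> CREATE INDEX documents_rating ON documents (rating);
> EXPLAIN QUERY PLAN
> SELECT id FROM documents WHERE keyword LIKE '%,animal,%' AND rating > 3;
>
> A "SEARCH ... USING INDEX" line in the output means the index is being
> used; a "SCAN" line means a full table scan.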
>
> This extends to longer and more complicated combinations of ANDs and ORs
> in your WHERE clause.  Equipped with the results of ANALYZE, the optimizer
> can explore each subclause, figure out how much it will narrow the search,
> and choose the order of operations that should yield the fastest path to
> the smallest number of rows to scan.  All in a few milliseconds.
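>
> For instance, with a compound clause like this (the columns other than
> 'rating' are made up for illustration):
>
> SELECT id FROM documents
> WHERE (rating > 3 AND type = 'image') OR owner = 'andy';
>
> the optimizer can weigh each subclause and may even use a different
> index for each side of the OR.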
>
> So there's no need to build the cleverness into your own code.  Instead
>
> 1) Make up a database with a convincing set of data in it.  It doesn't
> have to be final; just get lots of rows of plausible data into the tables.
> 2) Create some indexes which would facilitate most-used searches.
> 3) Run the SQL command "ANALYZE".  The result is stored in the database
> and unless the nature of the data in the tables changes, there's no need to
> rerun ANALYZE each time you add/update rows.
> 4) Make up your SQL commands using a standard SQL WHERE clause.
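>
> Putting those four steps together (schema and values are illustrative):
>
> -- 1) load plausible data
> INSERT INTO documents (keyword, rating) VALUES (',animal,dog,', 4);
> -- 2) index a commonly-searched column
> CREATE INDEX documents_rating ON documents (rating);
> -- 3) gather statistics for the optimizer
> ANALYZE;
> -- 4) then just issue a plain WHERE clause
> SELECT id FROM documents WHERE rating > 3 AND keyword LIKE '%,animal,%';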
>
> Simon.
