I just uploaded the output from sqlite3_analyzer to Dropbox.

On Thu, Aug 8, 2013 at 9:40 AM, Christopher W. Steenwyk <csteen...@gmail.com> wrote:

> Here is the data from stat1:
>
> "tbl", "idx", "stat"
> "metrics", "metrics_idx", "68682102 2 2 2"
> "metrics", "metrics_frame_idx", "68682102 2"
> "metrics", "sqlite_autoindex_mobileye_ldw_metrics_1", "68682102 230 1"
> "object_characteristics", "object_characteristics_idx", "1148344 164050
> 31899 1"
> "object_characteristics", "sqlite_autoindex_object_characteristics_1",
> "1148344 4 1"
> "metadata", "metadata_type_value_idx", "510520 11873 2716 1 1"
> "metadata", "metadata_idx", "510520 4 3 1 1"
> "metadata", "sqlite_autoindex_metadata_1", "510520 4 3 1"
> "attribute_values", "sqlite_autoindex_attribute_values_1", "198 1"
> "attribute_types", "sqlite_autoindex_attribute_types_1", "50 1"
> "frames", "frames_idx", "51896158 11492 4052 1"
> "frames", "sqlite_autoindex_frames_1", "51896158 11492 4052 1"
> "objects", "objects_grade_idx", "350060 14"
>
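> For reference, the listing above is just the contents of the statistics
> table that ANALYZE maintains, so something along these lines should
> regenerate it (a sketch, run from an open sqlite3 session):
>
>     ANALYZE;
>     SELECT tbl, idx, stat FROM sqlite_stat1 ORDER BY tbl, idx;
>
> In each "stat" string the first number is the number of rows in the
> index and each following number is the average number of rows matching a
> prefix of that many indexed columns, so metrics_frame_idx at "68682102 2"
> means a lookup on its single column narrows roughly 68 million rows down
> to about 2.
>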
> I am working on running sqlite3_analyzer.
>
> So, looking into EAV, I understand the drawbacks, but how would you
> recommend I restructure my schema to be more efficient?
>
> Perhaps have an "Objects" table that has a column per attribute, and then
> a "frames" table that also contains the contents of metadata as separate
> columns for each value? That would reduce the schema to three tables. Or
> should I just dump everything into one massive table?
>
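> As a concrete sketch of that "column per attribute" layout (the column
> names here are made up, since the real ones are in the attached schema),
> I am picturing something like:
>
>     CREATE TABLE objects (
>         object_id INTEGER PRIMARY KEY,
>         grade     INTEGER,   -- formerly an attribute lookup
>         width     REAL,      -- hypothetical attribute
>         color     TEXT       -- hypothetical attribute
>     );
>
>     CREATE TABLE frames (
>         frame_id  INTEGER PRIMARY KEY,
>         object_id INTEGER REFERENCES objects(object_id),
>         timestamp REAL,      -- hypothetical metadata value
>         speed     REAL       -- hypothetical metadata value
>     );
>
> With that layout a filter that currently needs a join against metadata
> becomes a plain WHERE clause on an ordinary, indexable column.
>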
> This database is generated once and then queried and interrogated many
> times, so I am mostly concerned with read speed rather than write or
> update performance.
>
>
> On Wed, Aug 7, 2013 at 6:44 PM, Richard Hipp <d...@sqlite.org> wrote:
>
>> On Wed, Aug 7, 2013 at 2:54 PM, Christopher W. Steenwyk <csteen...@gmail.com> wrote:
>>
>> > Hi,
>> >
>> > I have been working on a large database and its queries for several
>> > weeks now and just can't figure out why my query is so slow. I've
>> > attached the schema, my query, and the results of EXPLAIN QUERY PLAN
>> > from sqliteman.
>> >
>> > A few notes about this database:
>> > It's approximately 10 GB in size, but I have it on an SSD on a Linux
>> > machine with 12 GB of RAM on a 24-core PC.
>> >
>> > As for data in the tables...
>> > 'Objects' has 350000 rows
>> > 'frames' has 51896158 rows
>> > 'attribute_types' has 50 rows
>> > 'attribute_values' has 200 rows
>> > 'metrics' has 68682102 rows
>> >
>> > For every object there are approximately 5 rows that relate to it in
>> > 'object_characteristics' and another 20 rows in 'metadata'.
>> >
>> > The attached query takes over 6 days to run.
>> >
>> > Any help or suggestions would be greatly appreciated.
>> >
>>
>> Can you please send the output of ".dump sqlite_stat1".
>>
>> Also, if you can run sqlite3_analyzer on the database file and send us
>> that, so much the better.
>>
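>> For instance, from inside the sqlite3 shell (the output filename here is
>> only a placeholder):
>>
>>     .output stat1_dump.sql
>>     .dump sqlite_stat1
>>     .output stdout
>>
>> sqlite3_analyzer takes the database filename as its argument and writes
>> its report to standard output, so redirect that to a file as well.
>>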
>> Thanks.
>>
>>
>> >
>> > I noticed that the part of the query for 'frames' is not using a
>> > covering index, but I can't figure out why, and I was wondering if that
>> > is why it is so slow. In the query I am only referencing columns that
>> > are within an index, but it keeps using the PRIMARY KEY. So that was
>> > one thought I had.
>> >
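>> > My understanding is that SQLite only reports a covering index when every
>> > column of 'frames' that the query touches is in the index (the rowid is
>> > included implicitly), so here is the kind of thing I expected to see,
>> > with made-up column names since the real schema is in the attachment:
>> >
>> >     CREATE INDEX frames_covering_idx
>> >         ON frames(object_id, frame_number, timestamp);
>> >
>> >     EXPLAIN QUERY PLAN
>> >     SELECT frame_number, timestamp
>> >       FROM frames
>> >      WHERE object_id = ?;
>> >     -- hoped-for plan: SEARCH TABLE frames USING COVERING INDEX
>> >     --                 frames_covering_idx (object_id=?)
>> >
>> > If even one referenced column is missing from the index, the planner has
>> > to go back to the table through the PRIMARY KEY / rowid for it.
>> >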
>> > Thanks in advance!
>> > Chris
>> >
>>
>>
>> --
>> D. Richard Hipp
>> d...@sqlite.org
>
>
_______________________________________________
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users
