Hello!
I'm still struggling to use SQLite with a 20GB database. Now I'm trying
to update a table with 980M rows, but SQLite is eating all my memory
(2GB) and locking up my computer (too much swapping).
I'm starting to doubt whether SQLite can really manage databases above 1GB.
Does anyone have experience with big databases?
Cheers!
The query:
--explain query plan
update bolsas_familia set favorecido_id=(select id from favorecidos
where nis=favorecido_id);
Query plan:
selectid|order|from|detail
0|0|0|SCAN TABLE bolsas_familia
0|0|0|EXECUTE CORRELATED SCALAR SUBQUERY 0
0|0|0|SEARCH TABLE favorecidos USING COVERING INDEX favorecidos_nis_idx (nis=?)
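One thing I'm thinking of trying (just an untested sketch) is to run the
update in rowid batches, committing between batches so the transaction and
page cache stay small; the :lo parameter and the 250000 batch size are my
own guesses:
-- Untested sketch: update one id range per transaction, commit, then
-- advance :lo by 250000 and repeat until max(id) of bolsas_familia.
UPDATE bolsas_familia
SET favorecido_id=(SELECT id FROM favorecidos WHERE nis=favorecido_id)
WHERE id BETWEEN :lo AND :lo + 249999;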
The schema is:
CREATE TABLE bolsas_familia(
id integer primary key,
favorecido_id integer not null,
valor_parcela numeric,
mes_competencia integer
); -- 980M rows
CREATE TABLE "favorecidos"(
id integer primary key,
nis integer,
nome varchar collate nocase_slna,
municipio_id integer,
programa_id integer not null,
fonte_finalidade_id integer not null,
first_ano_mes integer
); //14M rows
CREATE UNIQUE INDEX favorecidos_nis_idx ON favorecidos(nis);
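Another option I'm considering (again just a sketch, not something I've run)
is to rebuild the table with a join instead of updating in place; the
bolsas_familia_new name is made up, and it assumes every favorecido_id
currently matches a nis in favorecidos:
CREATE TABLE bolsas_familia_new(
id integer primary key,
favorecido_id integer not null,
valor_parcela numeric,
mes_competencia integer
);
-- Inner join drops any row whose favorecido_id has no matching nis,
-- so this only works if every value matches.
INSERT INTO bolsas_familia_new
SELECT b.id, f.id, b.valor_parcela, b.mes_competencia
FROM bolsas_familia b JOIN favorecidos f ON f.nis = b.favorecido_id;
-- then drop bolsas_familia and rename bolsas_familia_new in its place.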
Cheers!