Hello.

I wonder if anyone has used SQLite extensively with big datasets and
could provide some insight into its performance?
In a nutshell, I am writing an ETL framework and need a good (read:
performant) engine for the "T"ransform part.
I suppose I could use flat files for that, but I'd like to have some SQL
capabilities at my disposal, which is why I'm poking around file-based,
serverless engines.
The question is, how does SQLite perform when faced with huge datasets,
where "huge" means 10s of gigabytes in size (typical for a Data
Warehouse's staging area)?
The most common operations (after unload) would include multi-table
joins (mostly merge joins), field transformations (concatenation,
casting), and record filtering.
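
For illustration, here is a rough sketch of the kind of transform query
I have in mind (the table and column names are made up, purely to show
the shape of the workload):

  -- Hypothetical staging tables, names invented for illustration.
  CREATE TABLE stg_customers (cust_id INTEGER, first_name TEXT, last_name TEXT);
  CREATE TABLE stg_orders (order_id INTEGER, cust_id INTEGER, amount TEXT, order_date TEXT);

  -- Join, concatenate, cast, and filter in a single pass.
  SELECT o.order_id,
         c.first_name || ' ' || c.last_name AS customer_name, -- concatenation
         CAST(o.amount AS REAL) AS amount,                    -- casting
         o.order_date
  FROM stg_orders o
  JOIN stg_customers c ON c.cust_id = o.cust_id               -- multi-table join
  WHERE CAST(o.amount AS REAL) > 100.0;                       -- record filtering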

Regards,
Misza