In <[EMAIL PROTECTED]>, frevol nicolas
wrote:

> I am programming a filter for websites. To check if a URL is in the
> blacklist, I have used a CSV file. I have tried SQLite, but it seems to
> be about as fast as the CSV file.
> 
> Can you tell me whether SQLite is faster than CSV files or not?

Depends on the number of records/lines and how you access them.

If you have some kind of long-running server process, read the csv file
into a set or dictionary once, and then query it thousands of times, that
is faster than querying a database the same number of times.
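
Something like this rough sketch, for example (the file name
"blacklist.csv" and the one-URL-per-row layout are just assumptions here):

import csv

def load_blacklist(path):
    # Read the blacklist once; a set gives O(1) membership tests.
    with open(path, newline="") as f:
        return {row[0] for row in csv.reader(f) if row}

blacklist = load_blacklist("blacklist.csv")

def is_blocked_set(url):
    return url in blacklist

The expensive part is loading the file; after that each lookup is just a
hash probe.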

If the blacklist has a really large number of entries and you only query a
few of them per program run, say in a CGI script, then a database with an
index will be faster than a linear search through a csv file.
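
Again just a sketch, assuming a file "blacklist.db" with a table
blacklist(url TEXT PRIMARY KEY); the primary key is what gives you an
indexed lookup instead of a full scan:

import sqlite3

conn = sqlite3.connect("blacklist.db")
conn.execute("CREATE TABLE IF NOT EXISTS blacklist (url TEXT PRIMARY KEY)")

def is_blocked_db(url):
    # Indexed lookup via the PRIMARY KEY on url.
    row = conn.execute(
        "SELECT 1 FROM blacklist WHERE url = ? LIMIT 1", (url,)
    ).fetchone()
    return row is not None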

Just try both and measure to know for sure.
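
For example with timeit, reusing the two lookup functions from the
sketches above (the test URLs here are made up):

import timeit

test_urls = ["http://example.com/page%d" % i for i in range(1000)]

print("set:   ", timeit.timeit(
    lambda: [is_blocked_set(u) for u in test_urls], number=10))
print("sqlite:", timeit.timeit(
    lambda: [is_blocked_db(u) for u in test_urls], number=10))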

Ciao,
        Marc 'BlackJack' Rintsch