Hello,

I am writing a shared library which uses SQLite as its file format. The
library has a learning subsystem which can learn a word and all possible
ways to type that word. Usually the word is UTF-8 encoded Indic text and
the patterns are words in Latin characters. I am using the SQLite C API
with WAL as the journal mode. All of this works really well on a local
machine.
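
For reference, the way I open the database and switch it to WAL looks
roughly like this (the file name and function name are simplified for the
example, not the actual code):

#include <sqlite3.h>
#include <stdio.h>

int open_learning_db(sqlite3 **db)
{
    int rc = sqlite3_open_v2("learn.db", db,
                             SQLITE_OPEN_READWRITE | SQLITE_OPEN_CREATE,
                             NULL);
    if (rc != SQLITE_OK) {
        fprintf(stderr, "open failed: %s\n", sqlite3_errmsg(*db));
        return rc;
    }
    /* Switch the journal mode to WAL; the setting is persistent
       in the database file. */
    rc = sqlite3_exec(*db, "PRAGMA journal_mode=WAL;", NULL, NULL, NULL);
    if (rc != SQLITE_OK)
        fprintf(stderr, "pragma failed: %s\n", sqlite3_errmsg(*db));
    return rc;
}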

Recently, I implemented a web version of my program which internally uses
the shared library. REST URLs are exposed for the "learn" API call,
something like http://websitename.com/learn, with the word to learn in the
request parameters. Since the web server allows concurrent requests, two
learn requests may end up executing in parallel. In this case SQLite fails
with the error message "database is locked", because one writer is already
in progress.

I am looking for the best way to work around this problem.

Currently, I have implemented a queue on the server side which holds all
the learn requests. A separate worker process reads this queue and calls my
library routine for each word sequentially (see the sketch below). This
works well, but I am wondering: is this the right way to work around the
problem?
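
The worker side is essentially the following loop (pop_request() and the
request struct stand in for the real queue code):

#include <stdlib.h>

/* Placeholder for the library's learn entry point. */
int learn(const char *word);

struct learn_request {
    char *word;            /* UTF-8 encoded word to learn */
};

/* Blocks until a request is available; stands in for the real queue. */
struct learn_request *pop_request(void);

void worker_loop(void)
{
    for (;;) {
        struct learn_request *req = pop_request();
        if (req == NULL)
            break;
        /* Only this single worker ever writes, so two writers can
           never collide and "database is locked" does not occur. */
        learn(req->word);
        free(req->word);
        free(req);
    }
}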

Any help would be great!

-- 
-n