MongoDB isn't a supported backend for the scheduler.
And to answer your previous question: db2 is there just to support the 
scheduler "internals". If your task still wants to write to db, there's 
nothing wrong with it.
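To make that concrete, here is a minimal sketch outside web2py (plain sqlite3; the file and table names are made up): because SQLite locks are per database file, a task can write to the application's db even while the scheduler holds a write transaction on db2, so there is nothing to copy back when the job finishes.

```python
import os
import sqlite3
import tempfile

tmp = tempfile.mkdtemp()
app_db = os.path.join(tmp, "storage.db")        # analogue of web2py's `db`
sched_db = os.path.join(tmp, "scheduler.db")    # analogue of `db2`

app = sqlite3.connect(app_db)
app.execute("CREATE TABLE messages (id INTEGER PRIMARY KEY, body TEXT, status TEXT)")
app.commit()

sched = sqlite3.connect(sched_db)
sched.execute("CREATE TABLE scheduler_run (id INTEGER PRIMARY KEY, status TEXT)")
# Leave a write transaction open on the scheduler db, as a running worker would.
sched.execute("INSERT INTO scheduler_run (status) VALUES ('RUNNING')")

# The task can still write to the app db: SQLite locks one file at a time,
# so the open transaction on scheduler.db does not block storage.db.
task_conn = sqlite3.connect(app_db)
task_conn.execute("INSERT INTO messages (body, status) VALUES ('hello', 'sent')")
task_conn.commit()

count = task_conn.execute("SELECT COUNT(*) FROM messages").fetchone()[0]
print(count)  # 1
```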

On Thursday, October 15, 2015 at 4:39:17 AM UTC+2, kenny c wrote:
>
> If SQLite cannot handle many concurrent users reading and writing, should I 
> move to Postgres or MongoDB? There will be over 10k rows added every 10 
> minutes.
>
> Thanks.
>
> On Wednesday, October 14, 2015 at 10:03:30 PM UTC-4, kenny c wrote:
>>
>> Hi Niphlod,
>>
>> Thank you for this information. 
>>
>> Could you tell me how you manage to copy new data into db from db2 after 
>> finishing the scheduler job?
>>
>> Thank you.
>>
>> On Friday, November 9, 2012 at 10:31:03 AM UTC-5, Niphlod wrote:
>>>
>>> If you can, use a separate db for the scheduler. SQLite doesn't handle 
>>> concurrent writes well (with its default settings), so having the 
>>> scheduler operate on the same database as your message insertions can 
>>> lead to locks.
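The failure mode is easy to reproduce outside web2py with plain sqlite3 (file and table names here are illustrative): with a zero busy timeout, a second writer on the same file fails with the same OperationalError, and succeeds again once the first writer commits.

```python
import os
import sqlite3
import tempfile

path = os.path.join(tempfile.mkdtemp(), "whatever.db")

writer = sqlite3.connect(path)
writer.execute("CREATE TABLE messages (id INTEGER PRIMARY KEY, status TEXT)")
writer.commit()
# Start a write transaction and leave it open, as a busy scheduler would.
writer.execute("INSERT INTO messages (status) VALUES ('unsent')")

other = sqlite3.connect(path, timeout=0)  # fail immediately instead of waiting
err = ""
try:
    other.execute("INSERT INTO messages (status) VALUES ('unsent')")
except sqlite3.OperationalError as e:
    err = str(e)
print(err)  # database is locked

writer.commit()  # release the lock; now the second writer can proceed
other.execute("INSERT INTO messages (status) VALUES ('unsent')")
other.commit()
total = other.execute("SELECT COUNT(*) FROM messages").fetchone()[0]
print(total)  # 2
```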
>>>
>>> Just do 
>>>
>>> db = DAL('whatever.db')
>>> db2 = DAL('whatever_scheduler.db')
>>>
>>> db.define_table('messages', .....)
>>>
>>> from gluon.scheduler import Scheduler
>>> mysched = Scheduler(db2)
>>>
>>> and to queue tasks you can then use
>>>
>>> db2.scheduler_tasks.validate_and_insert(****)
>>>
>>> or, with the new API
>>>
>>> mysched.queue_task(***)
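As a hedged example of the new API (the task name, period, and repeats values here are made up; this assumes a function send_unsent_messages defined in a model, and needs a running web2py environment):

```python
# Run send_unsent_messages every 60 seconds, indefinitely (repeats=0).
# The bookkeeping rows land in db2; the task body is still free to use db.
mysched.queue_task('send_unsent_messages', period=60, repeats=0)
```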
>>>
>>>
>>>
>>> On Friday, November 9, 2012 3:56:59 PM UTC+1, Mike Anson wrote:
>>>>
>>>> Greetings...
>>>>
>>>> I may have a problem with my database (sqlite) locking.
>>>>
>>>> Traceback (most recent call last):
>>>>   File "/home/web2py/src/web2py/gluon/scheduler.py", line 218, in 
>>>> executor
>>>>     result = dumps(_function(*args,**vars))
>>>>   File "applications/ircmessage/models/tasks.py", line 57, in 
>>>> send_unsent_messages
>>>>     for row in db(db.messages.status=='unsent').select(db.messages.id, 
>>>> db.messages.message, db.messages.uid):
>>>>   File "/home/web2py/src/web2py/gluon/dal.py", line 8787, in select
>>>>     return adapter.select(self.query,fields,attributes)
>>>>   File "/home/web2py/src/web2py/gluon/dal.py", line 2127, in select
>>>>     return super(SQLiteAdapter, self).select(query, fields, attributes)
>>>>   File "/home/web2py/src/web2py/gluon/dal.py", line 1615, in select
>>>>     return self._select_aux(sql,fields,attributes)
>>>>   File "/home/web2py/src/web2py/gluon/dal.py", line 1580, in _select_aux
>>>>     self.execute(sql)
>>>>   File "/home/web2py/src/web2py/gluon/dal.py", line 1693, in execute
>>>>     return self.log_execute(*a, **b)
>>>>   File "/home/web2py/src/web2py/gluon/dal.py", line 1687, in log_execute
>>>>     ret = self.cursor.execute(*a, **b)
>>>> OperationalError: database is locked
>>>>
>>>> I have a scheduler task that runs every minute and gets records 
>>>> (messages) out of the DB with a status of unsent. The messages then get 
>>>> posted to an IRC channel and updated to sent so they are not included 
>>>> the next time the scheduler runs the script.
>>>>
>>>> This seems to work just fine.
>>>>
>>>> It's when I inject a new message (which by default has its status set to 
>>>> unsent) via a bash script that things go wrong. The message inserts 
>>>> fine, but my script doesn't post anything to the IRC channel; it simply 
>>>> updates the message's status to sent without actually sending it. By 
>>>> sending it I mean posting a message over sockets.
>>>>
>>>> Here is my model for sending unsent messages:
>>>>         for row in db(db.messages.status == 'unsent').select(
>>>>                 db.messages.id, db.messages.message, db.messages.uid):
>>>>             message_id = row.id
>>>>             message_message = row.message
>>>>             message_uid = row.uid
>>>>
>>>>             # socket connection already opened earlier in the script
>>>>             s.send("PRIVMSG %s :%s - %s\r\n" % (channel, message_uid, message_message))
>>>>             print "message %s has been sent" % message_id
>>>>
>>>>             ## Only seems to print the message when a delay is here.
>>>>             time.sleep(5)
>>>>
>>>>             ## Set message record to sent and update the modified field
>>>>             modified_stamp = strftime("%Y-%m-%d %H:%M:%S")
>>>>             db.messages[message_id] = dict(status='sent', modified=modified_stamp)
>>>>
>>>>             db.commit()
>>>>
>>>> Inserting a message via jsonrpc (shown) and cURL (not shown):
>>>>
>>>> @service.jsonrpc
>>>> def savemessage(message, uid):
>>>>     db.messages.insert(message=message, uid=uid)
>>>>     db.commit()
>>>>     
>>>>     message = {"status":"saved"}
>>>>
>>>>     return message
>>>>
>>>>
>>>> Should I simply switch to a PostgreSQL or MySQL database to prevent this 
>>>> locking?
>>>>
>>>> Thanks for any help/advice in advance
>>>>
>>>

-- 
Resources:
- http://web2py.com
- http://web2py.com/book (Documentation)
- http://github.com/web2py/web2py (Source code)
- https://code.google.com/p/web2py/issues/list (Report Issues)