In addition to what Mike said... 

My guess is that the issue is with rq.  I ran into a similar problem with 
celery, where I spawned too many background processes.  Any given "web 
request" ended up requiring 2 database connections -- one for the web 
request itself, and a second one in the celery task.  I had to use the 
pooling features and rethink how I moved data around.
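
For what it's worth, here's roughly the shape of the pooling setup I mean 
-- just a sketch, the URLs and sizes are made up, and rq workers would get 
the same treatment as the celery/worker engine here:

    from sqlalchemy import create_engine
    from sqlalchemy.pool import NullPool

    # web process: a small, bounded pool so a burst of requests can't
    # exhaust the database's connection limit
    web_engine = create_engine(
        "postgresql://app@localhost/appdb",   # hypothetical URL
        pool_size=5,
        max_overflow=5,
        pool_recycle=3600,
    )

    # worker process (celery/rq): connections are opened per task and
    # closed right away, so don't hold a pool at all
    worker_engine = create_engine(
        "postgresql://app@localhost/appdb",
        poolclass=NullPool,
    )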

If you're just handling log data, I would not suggest your current 
approach.  Instead I would suggest creating "batches" of data and having 
another process handle them.  You could also stash your logging routine 
somewhere in the wsgi pipeline, /after/ everything has been sent to the user.  
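
A rough sketch of the wsgi idea, assuming a plain PEP 3333 stack -- the 
queue and the fields logged are just placeholders; the point is that the 
callback fires when the server closes the response iterable, i.e. after 
the body has gone out:

    import queue

    log_queue = queue.Queue()   # drained by a separate thread/process

    class ClosingIterator(object):
        """Wrap the response iterable so a callback fires on close()."""
        def __init__(self, iterable, callback):
            self._iterable = iterable
            self._callback = callback

        def __iter__(self):
            return iter(self._iterable)

        def close(self):
            try:
                close = getattr(self._iterable, "close", None)
                if close is not None:
                    close()
            finally:
                self._callback()

    class AfterResponseLogger(object):
        """WSGI middleware: queue a log record once the response is done."""
        def __init__(self, app):
            self.app = app

        def __call__(self, environ, start_response):
            def record():
                log_queue.put({
                    "path": environ.get("PATH_INFO"),
                    "method": environ.get("REQUEST_METHOD"),
                })
            return ClosingIterator(self.app(environ, start_response), record)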

FWIW, I handle logging 2 ways (rough sketches of both are below):
- a separate database connection/pool with its own logging 
credentials/setup.  it's not transaction safe, and just writes.
- data is logged to a flatfile.  the logfile is rotated and then processed 
into sql (using 1 connection in a background task)
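
Sketches of both, with made-up table and credential names.  The first is 
the separate, autocommitting logging engine:

    from sqlalchemy import create_engine, text

    # separate engine with logging-only credentials; AUTOCOMMIT means each
    # write goes out immediately and never joins the app's transactions
    log_engine = create_engine(
        "postgresql://logwriter@localhost/appdb",
        isolation_level="AUTOCOMMIT",
        pool_size=2,
        max_overflow=0,
    )

    def write_log(event, payload):
        with log_engine.connect() as conn:
            conn.execute(
                text("INSERT INTO request_log (event, payload) "
                     "VALUES (:event, :payload)"),
                {"event": event, "payload": payload},
            )

The second is the flatfile route, assuming the rotated file is a simple 
two-column csv:

    import csv
    from sqlalchemy import create_engine, text

    engine = create_engine("postgresql://app@localhost/appdb")

    def load_rotated_logfile(path):
        # one connection, one transaction, one executemany for the batch
        with open(path) as fh, engine.begin() as conn:
            rows = [{"event": e, "payload": p} for e, p in csv.reader(fh)]
            if rows:
                conn.execute(
                    text("INSERT INTO request_log (event, payload) "
                         "VALUES (:event, :payload)"),
                    rows,
                )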
