We use a custom SQLAlchemy+Pyramid backed client for requesting and 
managing LetsEncrypt SSL certificates.  It centrally stores/manages the 
certificates, which can then be deployed to various servers on a network, 
with built-in support for PostgreSQL and SQLite data storage.

I'm working on an update right now to integrate rate-limit awareness, and 
I'm hitting a conceptual roadblock with SQLite.  While the main work runs 
within the scope of a single transaction, I need to independently read 
from and write to the database for some logging.  With PostgreSQL I would 
just open a secondary connection, but I'm not sure how safe that is with 
SQLite.
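
Roughly, the setup I'm picturing is two engines against the same SQLite 
file, each with its own sessionmaker - one for the normal 
transaction-scoped work, one for the logging (the URL and names here are 
placeholders, not the real code):

    from sqlalchemy import create_engine
    from sqlalchemy.orm import sessionmaker

    # the normal transaction-scoped session used by the main work
    main_engine = create_engine("sqlite:////path/to/store.sqlite")
    MainSession = sessionmaker(bind=main_engine)

    # a second engine/session with its own connection, whose commits
    # would happen independently of the main transaction
    log_engine = create_engine("sqlite:////path/to/store.sqlite")
    LogSession = sessionmaker(bind=log_engine)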

A good example of what I'm dealing with is a certificate request.

The transaction-scoped work looks like this:

   [Begin] -> [Auth Domain 1] -> [Auth Domain 2] -> [Auth Domain 3] -> [Sign Certificate] -> [Commit]

The transactionless autocommit stuff looks like this:

    Auth Domain 1:
        Log requesting an auth
        Log validation request
        Log validation result (retry, pass, fail)
    Auth Domain 2: (repeat the above)
    Auth Domain 3: (repeat the above)
    Sign Cert:
        Log requesting a cert, then update the entry with valid/invalid
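
Put together, what I have in mind is roughly the following - using 
MainSession / LogSession from the sketch above; the OperationsEvent model 
and the authorize/sign functions are placeholders for my own code, not 
anything real:

    def log_event(log_session, message):
        # write a log row through the secondary session and commit it
        # right away, independent of the main transaction
        log_session.add(OperationsEvent(message=message))
        log_session.commit()

    main_session = MainSession()
    log_session = LogSession()
    try:
        for domain in ("a.example.com", "b.example.com", "c.example.com"):
            log_event(log_session, "requesting auth for %s" % domain)
            log_event(log_session, "requesting validation of %s" % domain)
            # the real work happens in the main, transaction-scoped session
            authorize_domain(main_session, domain)
            log_event(log_session, "validation result for %s: pass/fail/retry" % domain)
        log_event(log_session, "requesting certificate")
        sign_certificate(main_session)
        main_session.commit()
    except Exception:
        main_session.rollback()
        raise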

Does a secondary autocommit session seem OK for this sort of SQLite usage?

