>
> 1) What caching strategy should I consider while using SQLAlchemy? 
> Currently, the only option I see is to have a duplicated declaration of 
> entities in the form of simple classes and use them when I don't need 
> modification. Needless to say, it's a lot of code duplication.


I cache SQLAlchemy objects into Redis by turning them into normal dicts 
first (iterating over the columns), and saving the dicts into Redis via 
msgpack and `dogpile.cache`.  Dogpile is great, because you can use it as a 
read-through cache -- a cache miss will load the info from SQLAlchemy 
directly.  The dicts are then loaded from Redis, and injected into a generic 
class that overrides __getattr__ to offer a 'dotted attribute' style syntax.  
After a while, we extended that caching layer to load related cached 
objects from Redis; the columns and relations to be used in the cache are 
managed in our SQLAlchemy class declarations too.
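
A minimal sketch of that pattern might look like the following (the names 
`to_dict` and `CachedRecord` are mine for illustration, not from our code):

    from sqlalchemy import inspect

    def to_dict(obj):
        # Flatten a mapped SQLAlchemy instance into a plain dict by
        # iterating over its column attributes.
        return {attr.key: getattr(obj, attr.key)
                for attr in inspect(obj).mapper.column_attrs}

    class CachedRecord(object):
        # Read-only wrapper giving the cached dict a 'dotted attribute'
        # style syntax, e.g. record.name instead of record['name'].
        def __init__(self, data):
            self._data = data

        def __getattr__(self, name):
            try:
                return self._data[name]
            except KeyError:
                raise AttributeError(name)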

99% of our application uses the read-through Redis cache. Only write and 
permission/auth-based operations will use SQLAlchemy objects.  
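
For the read-through part, a rough sketch with dogpile's Redis backend, 
building on the to_dict/CachedRecord sketch above (the region settings, 
key scheme, and the `User` class are placeholders, not our actual setup):

    import msgpack
    from dogpile.cache import make_region

    region = make_region().configure(
        'dogpile.cache.redis',
        expiration_time=3600,
        arguments={'host': 'localhost', 'port': 6379, 'db': 0},
    )

    def get_cached_user(session, user_id):
        # On a cache miss the creator runs: load via SQLAlchemy, flatten
        # to a dict, and store it as msgpack bytes.  Cache hits skip the DB.
        def creator():
            user = session.get(User, user_id)    # placeholder mapped class
            return msgpack.packb(to_dict(user))  # assumes msgpack-friendly values
        raw = region.get_or_create('user:%d' % user_id, creator)
        return CachedRecord(msgpack.unpackb(raw, raw=False))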
