On Feb 13, 2010, at 9:24 PM, Kent wrote:

> Forgive the lack of understanding... still learning the framework:
> 
> It seems to me that will set the value on the class instead of the
> instance??

scoped_session() returns an object that is a proxy to an actual session.   The methods you call on it are invoked on an actual Session object that is referenced from a thread-local variable.   The reason we all think it's a class is that the docs show it with a CamelCase name, and that it can be "called" to get an actual session, i.e. Session() - but that's just a __call__() method.

There's not much going on in the ScopedSession source code, so it's worth a peek in order to de-mystify it (you can skip the _ScopedExt stuff).
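
Here's a very rough sketch of the idea, just to illustrate the proxying - hypothetical names, not the actual ScopedSession source:

    import threading

    class ScopedSessionSketch(object):
        """Illustration only: forwards everything to a per-thread Session."""

        def __init__(self, session_factory):
            self.session_factory = session_factory   # e.g. a sessionmaker()
            self.registry = threading.local()

        def __call__(self):
            # "Session()" - return the real Session for this thread,
            # creating it on first use
            if not hasattr(self.registry, 'session'):
                self.registry.session = self.session_factory()
            return self.registry.session

        def __getattr__(self, name):
            # "Session.query(...)", "Session.flush()", etc. - forward the
            # attribute lookup to this thread's real Session
            return getattr(self(), name)

The real thing adds configure(), remove() and so on, but the proxy-to-a-threadlocal part is essentially the above.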


> 
> If this is on a webserver with multiple connections, will that affect
> any other connections for the period that it is set to false?

The autoflush setting applies to the current thread's session.  Any connections that have been enlisted in the current transaction for that session will be subject to the autoflush setting.
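
So for the 0.5.8 workaround, something along these lines should do what you're after - this is just a sketch based on the code in your post, using the DBSession().autoflush approach from the earlier reply:

    # grab the real Session behind the scoped_session proxy for this thread
    session = DBSession()
    session.autoflush = False
    try:
        # queries no longer trigger an implicit flush
        rules = DBSession.query(Order).all()
        merged.ordersite = u'AA'

        # flush explicitly, when you actually want it
        DBSession.flush()
    finally:
        # restore autoflush for the rest of the request
        session.autoflush = True

That keeps you on the single session TurboGears has already set up, so there's no need to create and dispose of extra sessions just to get around the flag.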



> 
> 
> 
> 
> On Feb 13, 8:39 pm, Michael Bayer <mike...@zzzcomputing.com> wrote:
>> On Feb 13, 2010, at 7:05 PM, Kent wrote:
>> 
>> 
>> 
>>> # want to do more queries and change more, so set autoflush to False
>>> DBSession.autoflush = False
>> 
>> yeah sorry here, "autoflush" isn't propagated in 0.5.8 to the actual session 
>> when using scoped_session().  That was fixed in 0.6.   Here you'd say 
>> DBSession().autoflush = False.
>> 
>> 
>> 
>>> #however the following still causes flush
>>> rules=DBSession.query(Order).all()
>> 
>>> merged.ordersite=u'AA'
>> 
>>> #now is when I really want to do the flush
>>> DBSession.flush()
>> 
>>> =============================================
>>> (Output below)
>> 
>>> As a work-around I can instantiate a new session, but I didn't want
>>> more sessions than I need and I am not sure at what point I would
>>> dispose of the extra session (do I just .close() it after my queries?)
>> 
>>> It seems more efficient to use the same session the turbogears
>>> framework has set up... is that the case or can I create and close
>>> sessions often without concern of efficiency?  If I fail to close
>>> them, do they tie up database resources?
>> 
>>> In general, autoflush staying on is fine, so I'd rather turn autoflush
>>> off and back on when I am finished with certain transactions where I
>>> know I want it off.  Maybe I'll turn autoflush off in the
>>> sessionmaker, which seems to work, but I still wonder:
>> 
>>> Am I doing something wrong?  What is my best workaround?
>> 
>>> Output:
>> 
>>>>>> from sqlalchemy import *
>>>>>> from sqlalchemy.orm import *
>>>>>> from zope.sqlalchemy import ZopeTransactionExtension
>> 
>>>>>> engine = create_engine('postgres://user:p...@localhost:5444/name',echo=True)
>>>>>> metadata = MetaData()
>>>>>> maker = sessionmaker(bind=engine, autoflush=True, autocommit=False,
>>> ...                      extension=ZopeTransactionExtension())
>>>>>> DBSession = scoped_session(maker)
>> 
>>>>>> order_table = Table("orders", metadata,
>>> ...     Column("orderid", Unicode, primary_key=True),
>>> ...     Column("ordersite", Unicode)
>>> ... )
>> 
>>>>>> class Order(object):
>>> ...     pass
>>> ...
>>>>>> order_mapper = mapper(Order, order_table)
>> 
>>>>>> o=Order()
>>>>>> o.orderid = u'SALE25863'  #this order exists in the database
>>>>>> o.ordersite = u'00'
>> 
>>>>>> merged=DBSession.merge(o)
>>> 2010-02-13 06:48:46,816 INFO sqlalchemy.engine.base.Engine.0x...2510
>>> BEGIN
>>> 2010-02-13 06:48:46,820 INFO sqlalchemy.engine.base.Engine.0x...2510
>>> SELECT orders.orderid AS orders_orderid, orders.ordersite AS orders_ordersite
>>> FROM orders
>>> WHERE orders.orderid = %(param_1)s
>>> 2010-02-13 06:48:46,820 INFO sqlalchemy.engine.base.Engine.0x...2510
>>> {'param_1': 'SALE25863'}
>> 
>>>>>> # want to do more queries and change more, so set autoflush to False
>>> ... DBSession.autoflush = False
>> 
>>>>>> #however the following still causes flush
>>> ... rules=DBSession.query(Order).all()
>>> 2010-02-13 06:48:54,620 INFO sqlalchemy.engine.base.Engine.0x...2510
>>> UPDATE orders SET ordersite=%(ordersite)s WHERE orders.orderid = %(orders_orderid)s
>>> 2010-02-13 06:48:54,620 INFO sqlalchemy.engine.base.Engine.0x...2510
>>> {'ordersite': '00', 'orders_orderid': 'SALE25863'}
>>> 2010-02-13 06:48:54,622 INFO sqlalchemy.engine.base.Engine.0x...2510
>>> SELECT orders.orderid AS orders_orderid, orders.ordersite AS orders_ordersite
>>> FROM orders
>>> 2010-02-13 06:48:54,622 INFO sqlalchemy.engine.base.Engine.0x...2510
>>> {}
>> 
>>>>>> merged.ordersite=u'AA'
>> 
>>>>>> #now is when I really want to do the flush
>>> ... DBSession.flush()
>>> 2010-02-13 06:49:23,429 INFO sqlalchemy.engine.base.Engine.0x...2510
>>> UPDATE orders SET ordersite=%(ordersite)s WHERE orders.orderid = %(orders_orderid)s
>>> 2010-02-13 06:49:23,429 INFO sqlalchemy.engine.base.Engine.0x...2510
>>> {'ordersite': 'AA', 'orders_orderid': 'SALE25863'}