Re: [sqlalchemy] Idempotence of adding an event listener

2012-07-17 Thread Michael Bayer
Ideally it would be once, though there are implementation complexities to 
this. The function that is ultimately added to the list of listeners is not 
always the one that you passed - in the case of mapper and attribute events, 
the given function is more often than not wrapped in an adapting wrapper before 
being passed to event registration. So at the very least, an OrderedSet 
wouldn't work here; it would need to be an ordered dictionary where at least 
the given function acts as the key.

But an ordered dictionary with the listener function as the key is probably not 
enough.  If the same event function were registered more than once with 
different modifying arguments, that suggests the function should in fact be 
registered more than once, as it will behave differently with each set of 
modifying arguments passed.

There's also the question of scopes. If an event function were registered onto 
a class, as well as onto a particular instance of that class, which one would 
prevail here, the per-class listener or the per-instance listener?  Say we 
have the per-class listener result in the removal of the per-instance listener; 
later, we implement removal of events.  What happens when the per-class event 
is removed?  Should the per-instance listener come online again?  This would 
suggest we aren't using sets or dicts at all; instead it suggests some system 
of cascades where all the listeners remain in place but we have some 
rule-based system of favoring certain listeners.

It seems that keeping the system simple and free of more complex promises like 
'idempotence' allows us to avoid some thorny situations for now. I took a look 
at jQuery, which is probably the most widely used event system, and it does not 
appear to have idempotent behavior either.
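
For illustration, a minimal sketch of the current, non-idempotent behavior, 
using a pool-level "connect" listener (the listener body here is a 
placeholder):

from sqlalchemy import create_engine, event

def on_connect(dbapi_conn, conn_record):
    # fires once per new DBAPI connection
    print("connect event fired")

engine = create_engine("sqlite://")

# registering the same function twice means it is invoked twice
event.listen(engine, "connect", on_connect)
event.listen(engine, "connect", on_connect)

engine.connect()  # prints "connect event fired" twice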



On Jul 17, 2012, at 9:55 AM, Pedro Romano wrote:

 Hello group,
 
 This is sort of a philosophical / design question.
 
 I have already noticed that the current behaviour when adding the same event 
 listener more than once is that it will be invoked as many times as it was 
 added. It is a perfectly good and valid design decision.
 
 Would it make sense to make adding a specific event handling function an 
 idempotent operation? I.e. make the event dispatcher have the behaviour of an 
 ordered set instead of a queue? Or are there cases where it is useful that 
 the same event handling function be invoked more than once?
 
 This issue came up because I have a model mixin that adds an event handler in 
 '__declared_last__' (this may not be the ideal place to put it, but I 
 couldn't find a better one). My unit test suite will redefine the models 
 several times in the same interpreter execution. Everything else about a 
 model declaration is idempotent, except for the addition of the event 
 handlers in '__declared_last__', which keep accumulating.
 
 As I said in the beginning, this is more of a philosophical question, because 
 there are obvious easy workarounds to make sure that '__declared_last__' is 
 only invoked once.
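 
 One such workaround, sketched minimally below. The names are hypothetical; 
 note that the declarative hook is spelled '__declare_last__' in SQLAlchemy 
 itself, and this sketch assumes the handler is attached to a shared target 
 such as all mappers:
 
 from sqlalchemy import event
 from sqlalchemy.orm import mapper
 
 _listener_installed = False
 
 def _on_before_insert(mapper_, connection, target):
     pass  # whatever the mixin's handler does
 
 class AuditMixin(object):
     @classmethod
     def __declare_last__(cls):
         global _listener_installed
         if _listener_installed:
             return  # already registered by a previous model definition
         _listener_installed = True
         # listen for the event on all mappers at once
         event.listen(mapper, 'before_insert', _on_before_insert)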
 
 Thanks for any comments regarding this.
 
 --Pedro.
 




Re: [sqlalchemy] Custom in-query-only rendering for TypeDecorator types?

2012-07-17 Thread Russ
For future reference, it is not actually a great idea to use @compiles to 
render with AT TIME ZONE as I did above.  When done this way, SQLAlchemy 
renders all references to that column using this, *including* any 
references in a WHERE clause.  E.g., when looking for log events later than 
some date you would get:

SELECT
  log.blah
 ,log.time AT TIME ZONE 'EST' -- intended use
FROM
  log
WHERE
  log.time AT TIME ZONE 'EST' > foo   -- not intended

As said above, the problem here is that the custom compilation happened in 
both the column specification *and* the WHERE clause.  This is not 
surprising (with hindsight), but it prevents any index on log.time from 
being used (unless there is an appropriate functional index).

For this case, I only wanted it applied to the column spec, not the WHERE 
clause, but I don't think it is currently possible to differentiate the two 
and compile differently in each location...  or is it?  I looked into 
compiler.statement et al. to figure out the compilation context, but could 
not find a way.

Russ

PS: For what it's worth, for this specific case in PostgreSQL, this type of 
functionality is better suited to appropriate use of the timestamp with 
time zone data type, together with correct session-level use of SET TIME ZONE 
and/or the PGTZ environment variable.  I'm currently wrestling with the use 
cases here instead.
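
A rough sketch of that session-level approach, assuming a pool "connect" 
listener is acceptable (the connection URL and time zone are placeholders):

from sqlalchemy import create_engine, event

engine = create_engine("postgresql://scott:tiger@localhost/test")

@event.listens_for(engine, "connect")
def set_time_zone(dbapi_conn, conn_record):
    # apply the session time zone once per new DBAPI connection,
    # instead of rewriting each column reference with AT TIME ZONE
    cursor = dbapi_conn.cursor()
    cursor.execute("SET TIME ZONE 'EST'")
    cursor.close()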





Re: [sqlalchemy] Custom in-query-only rendering for TypeDecorator types?

2012-07-17 Thread Michael Bayer

On Jul 17, 2012, at 2:25 PM, Russ wrote:

 
 For this case, I only wanted it applied to the column spec, not the WHERE 
 clause, but I don't think it is currently possible to differentiate the two 
 and compile differently in each location...  or is it?  I looked into 
 compiler.statement et al. to figure out the compilation context, but could 
 not find a way.

It is: you need to look in the **kw passed to your custom compile function for 
the flag within_columns_clause=True, which indicates that the columns clause 
is being rendered.
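
For illustration, a rough sketch of checking that flag, assuming a custom 
ColumnClause-style construct (the LocalizedTime name and its rendering are 
hypothetical, not Russ's actual code):

from sqlalchemy.sql.expression import ColumnClause
from sqlalchemy.ext.compiler import compiles

class LocalizedTime(ColumnClause):
    """hypothetical column construct localized at render time"""

@compiles(LocalizedTime)
def compile_localized_time(element, compiler, **kw):
    # render the plain column via the standard visitor
    rendered = compiler.visit_column(element, **kw)
    if kw.get('within_columns_clause'):
        # only the columns clause of the SELECT gets the conversion,
        # leaving WHERE-clause references index-friendly
        return "%s AT TIME ZONE 'EST'" % rendered
    return rendered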







Re: [sqlalchemy] fractional second precision - mysql

2012-07-17 Thread James
I have created a feature request ticket for MySQLdb to add fractional 
second support:

http://sourceforge.net/tracker/?func=detail&aid=3545195&group_id=22307&atid=374935

Currently, I am still using my patched version of MySQLdb/times.py; however, 
I did notice a slight formatting issue with my original patch. With the fix, 
the return statement of 'def format_TIMEDELTA(v)' now reads:

return '%d %d:%d:%d.%06d' % (v.days, hours, minutes, seconds, microseconds)
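
For context, a sketch of the full patched helper as it would sit in 
MySQLdb/times.py, reconstructed around the stock implementation (only the 
microseconds handling reflects the patch):

def format_TIMEDELTA(v):
    seconds = int(v.seconds) % 60
    minutes = int(v.seconds / 60) % 60
    hours = int(v.seconds / 3600) % 24
    # keep the fractional part instead of discarding it
    microseconds = v.microseconds
    return '%d %d:%d:%d.%06d' % (v.days, hours, minutes, seconds, microseconds)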

Hopefully, MySQLdb will add support on their end, so you can proceed with 
committing the hypothetical changes that you suggested, which do work for 
me.

Thank you for your help.




[sqlalchemy] Re: Set-based association proxy through AppenderQuery?

2012-07-17 Thread Jon Parise
I have a similar use case, and aside from introducing a duplicate non-lazy
relationship to back the association_proxy, I haven't found a solution.

Does anyone have a more elegant approach?

On Saturday, February 11, 2012 12:15:38 PM UTC-8, Mark Friedenbach wrote:

 Hi, 

 Is it possible to have an association_proxy (in the association object 
 pattern) that emulates a set-based collection if it goes through a 
 lazy='dynamic' relationship? I can't for the life of me find a way to 
 make this work (setting collection_class on the dynamic relationship 
 doesn't seem to do anything). 

 Here's some example code of what I'm trying to do, extracted from the 
 actual project: 

 class ProofOfWork(object):
     blocks = association_proxy('Intermediatory_nodes', 'block')

 proof_of_work = Table('proof_of_work', db.metadata)
 mapper(ProofOfWork, proof_of_work, properties={
     'Intermediatory_nodes': relationship(lambda: Intermediatory,
                                          lazy='dynamic'),
 })

 class Block(object):
     proof_of_works = association_proxy('Intermediatory_nodes',
                                        'proof_of_work')

 block = Table('block', db.metadata)
 mapper(Block, block, properties={
     'Intermediatory_nodes': relationship(lambda: Intermediatory,
                                          lazy='dynamic'),
 })

 class Intermediatory(object):
     pass

 intermediatory = Table('intermediatory', db.metadata,
     Column('proof_of_work_id', Integer,
            ForeignKey('proof_of_work.id'),
            nullable=False),
     Column('block_id', Integer,
            ForeignKey('block.id')),
 )

 mapper(Intermediatory, intermediatory, properties={
     'proof_of_work': relationship(lambda: ProofOfWork,
                                   back_populates='Intermediatory_nodes',
                                   remote_side=lambda: proof_of_work.c.id),
     'block': relationship(lambda: Block,
                           back_populates='Intermediatory_nodes',
                           remote_side=lambda: block.c.id),
 })

 How can I make ProofOfWork.blocks and Block.proof_of_works return an 
 _AssociationSet instead of _AssociationList? 

 Cheers, 
 Mark




Re: [sqlalchemy] Re: Set-based association proxy through AppenderQuery?

2012-07-17 Thread Michael Bayer
The association proxy documents the proxy_factory attribute for this purpose; 
see below.


from sqlalchemy import *
from sqlalchemy.orm import *
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.ext.associationproxy import association_proxy, _AssociationSet
import operator

Base = declarative_base()

class AppenderAssociationSet(_AssociationSet):
    """Subclass _AssociationSet to adapt some set methods to those of
    AppenderQuery.

    """

    def add(self, object_):
        self.col.append(self._create(object_))

    def extend(self, objects):
        for obj in objects:
            self.col.append(self._create(obj))

    def clear(self):
        """The set assignment needs 'clear', but we don't
        really have a consistent way to do that with
        AppenderQuery."""

def set_factory(lazy_collection, creator, value_attr, assoc_prox):
    """Factory for associationproxy collections."""

    # does "return MyObject.value_attr"
    getter = operator.attrgetter(value_attr)

    # does "MyObject.value_attr = v"
    setter = lambda o, v: setattr(o, value_attr, v)

    return AppenderAssociationSet(lazy_collection, creator,
                                  getter, setter, assoc_prox)

class A(Base):
    __tablename__ = "a"

    id = Column(Integer, primary_key=True)
    bs = relationship("B", lazy="dynamic")

    cs = association_proxy("bs", "c", proxy_factory=set_factory)

class B(Base):
    __tablename__ = "b"

    id = Column(Integer, primary_key=True)

    def __init__(self, c):
        self.c = c

    a_id = Column(Integer, ForeignKey('a.id'))
    c_id = Column(Integer, ForeignKey('c.id'))
    c = relationship("C")

class C(Base):
    __tablename__ = "c"
    id = Column(Integer, primary_key=True)

e = create_engine("sqlite://", echo=True)

Base.metadata.create_all(e)

c1, c2, c3 = C(), C(), C()
s = Session(e)
s.add_all([
A(cs=set([c1, c2]))
])
s.commit()

a1 = s.query(A).first()
print a1.cs
a1.cs.add(c3)

s.commit()

print a1.cs.difference([c1])









On Jul 17, 2012, at 6:26 PM, Jon Parise wrote:

 I have a similar use case, and aside from introducing a duplicate non-lazy
 relationship to back the association_proxy, I haven't found a solution.
 
 Does anyone have a more elegant approach?
 
 
