Re: [sqlalchemy] attribute_mapped_collection use as a key-list dictionary

2013-09-04 Thread Paul Balomiri
Hi,
First off, and again, thanks for the support here. I think I can work for
the moment with your proposed implementation.

My problem is related to the fact that the list values of the GBK
defaultdict are plain list types, and thus cannot fire events for
operations on them. The testcase below does not work, and, as you
mentioned, no other operation on GBK's list values will fire the
corresponding events.

Now my attempt (admittedly without enough knowledge of SQLAlchemy
internals) was to create a list which *forwards* append/remove events
to the GBK collection, which could in turn add/remove them in its capacity
as a true InstrumentedAttribute (thus handling the DB part). More
specifically, I used prepare_instrumentation() hoping to be able to
instantiate an InstrumentedList with event capabilities. The
InstrumentedLists would not need to be first-class InstrumentedAttributes ( --
perhaps could not be, because they appear after reflection time? --).

I see now that it was a far-off long shot.

This approach would also remove the immutability constraint on keyfunc's
return value. Keyfunc would be a read/writable attribute instead.

I hope to find time to get more accustomed to SQLAlchemy's internals and to
implement this idea sometime in January or February.

By the way, I'm implementing Rails-style DB binding (table 'people' maps to
class Person, etc.) using the SQLA inspector interface. I'll post the lib on
GitHub as soon as I'm eating my own dog food (i.e. it's usable).

--
Paul

Here is a test case where I would like the list
p1._addresses_by_role['home'] to fire a remove event which removes the person
from the GBK:

def test_del_item(self):
    sess = self.sess
    p1 = Person()
    a1 = Address(name='Bucharest')

    # here, p2a already refers to a1/p1; the _addresses_by_role
    # will be set up when it loads after a commit
    p2a = PersonToAddress(address=a1, person=p1, role='home')
    sess.add(p1)

    sess.commit()
    self._assertAssociated(p1, a1, p2a)

    del p1._addresses_by_role['home'][0]
    sess.commit()
    import pdb
    pdb.set_trace()
    self.assertFalse('home' in p1._addresses_by_role)
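For contrast, a minimal sketch of the same removal routed through the
collection's own remove() method, which (per the replies below) is the entry
point that actually fires the ORM remove event; the names are taken from the
test case above:

    p2a = p1._addresses_by_role['home'][0]
    # GBK.remove() is the @collection.remover, so this path is instrumented
    p1._addresses_by_role.remove(p2a)
    sess.commit()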




2013/9/4 Michael Bayer mike...@zzzcomputing.com


 On Sep 3, 2013, at 8:47 AM, Paul Balomiri paulbalom...@gmail.com wrote:

  I would like to install
 
  event.listen(list, 'append', append_listener)
  event.listen(list, 'remove', rm_listener)
 
  on those lists, such that the GroupByKeyCollection can modify added
 objects according to the relationship it implements:
  * set the appropriate foreign key constraints
  * insert a removed object with its new value for the key attribute
 after a change (announced by append_listener)
  * reset the FKs upon item removal.

 using event.listen with GBK doesn't make sense.  events can only be used
 with specific target types; the "remove" and "append" events only apply to an
 ORM-produced InstrumentedAttribute, such as Person._addresses_by_role here
 (note, we mean the class-bound attribute, not the collection on an
 instance).  There is no need to use event.listen with the collection
 itself, as remove/append are produced originally by the add()/remove()
 methods on GBK itself; any extra logic which should take place would be
 invoked directly from there (and in fact my original example fails to fire
 off the event with remove()).

 Additionally, all the usage of prepare_instrumentation() etc. should not
 be necessary, that's all internal stuff which is called automatically.

 As mentioned before, the behavior of this collection is completely outside
 the realm of a normal collection so it needs to implement the
 append/remove events directly, which isn't something a new user to
 SQLAlchemy would typically be able to handle without a much deeper
 understanding of how the attribute system works.

 I've implemented your test case below, as well as some other variants, in
 association with the original code I gave you - for the "remove" case I've
 added the necessary code to the custom collection. All foreign key
 constraints are set correctly as a function of the ORM's normal operation,
 and as far as "reset", when an association between Person and Address is
 removed, we want to just delete the association, so cascade is used for
 that.   I'm not sure what "insert a removed object with its new value for
 the key attribute after a change" means; add a test to the TestPA class
 illustrating the behavior you want and I'll add it.

 from sqlalchemy import *
 from sqlalchemy.orm import *
 from sqlalchemy.ext.declarative import declarative_base
 import collections
 from sqlalchemy.orm.collections import collection, collection_adapter
 from sqlalchemy.ext.associationproxy import association_proxy, _AssociationCollection
 Base = declarative_base()

 class GroupByKeyCollection(collections.defaultdict):
     def __init__(self, keyfunc):
         super(GroupByKeyCollection, self).__init__(list)
         self.keyfunc = keyfunc
 

Re: [sqlalchemy] attribute_mapped_collection use as a key-list dictionary

2013-09-04 Thread Michael Bayer

On Sep 4, 2013, at 7:08 AM, Paul Balomiri paulbalom...@gmail.com wrote:

 
 My problem is related to the fact that the list values of the GBK defaultdict
 are plain list types, and thus cannot fire events for operations on them. The
 testcase below does not work, and, as you mentioned, no other operation on
 GBK's list values will fire the corresponding events.

 Now my attempt (admittedly without enough knowledge of SQLAlchemy internals)
 was to create a list which *forwards* append/remove events to the GBK
 collection, which could in turn add/remove them in its capacity as a true
 InstrumentedAttribute (thus handling the DB part). More specifically, I used
 prepare_instrumentation() hoping to be able to instantiate an
 InstrumentedList with event capabilities. The InstrumentedLists would not
 need to be first-class InstrumentedAttributes ( -- perhaps could not be,
 because they appear after reflection time? --).

OK let's go back to my original reply, this is the key sentence:

 In particular I stopped at getting 
Person.addresses_by_role['role'].append(Address()) to work, since that means 
we'd need two distinctly instrumented collections, it's doable but is more 
complex. 

that is, when we work with the list collections that are in the dict, *that* 
part is not instrumented, it is a plain list.   That's why the implementation 
has the methods add(key, value) and remove(key, value), which do the 
dictionary + list access in one step - direct mutation of the contained list is 
not part of any instrumented system and was omitted from my original example.  
For those to work, we need two new collections - there is no magic within the 
existing collection system that has any clue about these completely different 
kinds of collections, so the event system, prepare_instrumentation, etc, none 
of that applies here.   The event system is not relevant here, as that system 
allows consumers to be associated with producers, but in this case, we're the 
producer!  there is no shortcut and this is not anything built in to SQLAlchemy 
- these are totally new kinds of collections and all four interfaces (dict, 
list, proxied dict, proxied list) must be fully implemented, if you truly want 
completely transparent behavior.  Here are all four, not completed yet, might 
have bugs, but implementing instrumented (and proxied) access on the sub-list 
element.   


from sqlalchemy import *
from sqlalchemy.orm import *
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm.collections import collection, collection_adapter
from sqlalchemy.ext.associationproxy import association_proxy, 
_AssociationCollection, _AssociationList
Base = declarative_base()

class GroupByKeyCollection(dict):
    def __init__(self, keyfunc):
        self.keyfunc = keyfunc

    def __missing__(self, key):
        l = GroupByKeyList(self, key)
        dict.__setitem__(self, key, l)
        return l

    @collection.appender
    def add(self, value, _sa_initiator=None):
        key = self.keyfunc(value)
        self[key].append(value, _sa_initiator)

    @collection.remover
    def remove(self, value, _sa_initiator=None):
        key = self.keyfunc(value)
        self[key].remove(value, _sa_initiator)

    @collection.internally_instrumented
    def __setitem__(self, key, value):
        if key in self:
            old_list = self[key]
            to_remove = set(old_list).difference(value)
            for item in old_list:
                if item in to_remove:
                    old_list._remove_event(item)
        else:
            old_list = []

        to_add = set(value).difference(old_list)
        new_list = self.__missing__(key)
        for item in value:
            if item in to_add:
                new_list._append_event(item)

    @collection.internally_instrumented
    def __delitem__(self, key):
        if key in self:
            existing = self[key]
            for value in existing:
                existing._remove_event(value)
            dict.__delitem__(self, key)
        else:
            raise KeyError(key)

    @collection.iterator
    def iterate(self):
        for collection in self.values():
            for item in collection:
                yield item

    @collection.converter
    def _convert(self, target):
        for collection in target.values():
            for item in collection:
                yield item

    def update(self, k):
        raise NotImplementedError()


class GroupByKeyList(list):
    def __init__(self, parent, key):
        self.parent = parent
        self.key = key

    def append(self, value, _sa_initiator=None):
        value = self._append_event(value, _sa_initiator)
        list.append(self, value)

    def remove(self, value, _sa_initiator=None):
        self._remove_event(value, _sa_initiator)
        list.remove(self, value)

    def _append_event(self, value, _sa_initiator=None):
        adapter = collection_adapter(self.parent)
        return
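The archived message is truncated at _append_event above. A plausible
completion, offered only as an assumption: mirroring the fire_append_event /
fire_remove_event adapter calls used in the 2013-09-03 version of the
collection further down, the two event helpers might look like this:

    def _append_event(self, value, _sa_initiator=None):
        # assumption: forward to the parent collection's adapter so the ORM
        # records the append on the instrumented GBK collection
        adapter = collection_adapter(self.parent)
        if adapter is not None:
            return adapter.fire_append_event(value, _sa_initiator)
        return value

    def _remove_event(self, value, _sa_initiator=None):
        adapter = collection_adapter(self.parent)
        if adapter is not None:
            adapter.fire_remove_event(value, _sa_initiator)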

Re: [sqlalchemy] attribute_mapped_collection use as a key-list dictionary

2013-09-03 Thread Paul Balomiri
Hi,

Trying to advance on this issue, I wrote an InstrumentedList which shall:
* hold only values sharing the same key, as defined by a property on the
values
* change that property to the list's key upon insertion
* set the property to whatever null value is defined when the value is
removed from the list
* listen for changes on the property of the value object, so as to ensure that
the value is removed when its key property does not match the list key
any more

The GroupByKeyCollection is thus a collection of GroupedByKeyList objects
with the key being the constant key of each such list.

I would like to install

event.listen(list, 'append', append_listener)
event.listen(list, 'remove', rm_listener)

on those lists, such that the GroupByKeyCollection can modify added objects
according to the relationship it implements:
* set the appropriate foreign key constraints
* insert a removed object with its new value for the key attribute after a
change (announced by append_listener)
* reset the FKs upon item removal.

My current problem is that I have not successfully instrumented the event
dispatch mechanism for GroupedByKeyList.
I have done the following (full code is attached; I refrained from
including it here):

class GroupedByKeyList(InstrumentedList):
    def __init__(self, key):
        ...
    def append(self, obj):
        ...
    def remove(self, obj):
        ...

__instance_for_instrumentation = GroupedByKeyList("Just for the sake of prepare_instrumentation", key_attribute="blah")
'''Accommodate for no zero-argument initialization'''

prepare_instrumentation(lambda: __instance_for_instrumentation)
del __instance_for_instrumentation
_instrument_class(GroupedByKeyList)

Now when I try to attach the event listeners in GroupByKeyCollection, I get:

 Traceback (most recent call last):
   File "/home/paul/maivic-server/libs/sqlalchemy_keyed_relationship/test/test_sqla_grouped_collection2.py", line 60, in test_append_to_list
     p1._addresses_by_role.add(p2a)
   File "/home/paul/maivic-server/env/local/lib/python2.7/site-packages/sqlalchemy/orm/collections.py", line 1008, in wrapper
     return method(*args, **kw)
   File "/home/paul/maivic-server/libs/sqlalchemy_keyed_relationship/python/sqla_keyed_relationship/sqla_grouped_collection2.py", line 129, in add
     self[key].append(value)
   File "/home/paul/maivic-server/libs/sqlalchemy_keyed_relationship/python/sqla_keyed_relationship/sqla_grouped_collection2.py", line 122, in __missing__
     event.listen(l, 'remove', remove_listener)
   File "/home/paul/maivic-server/env/local/lib/python2.7/site-packages/sqlalchemy/event.py", line 40, in listen
     tgt.dispatch._listen(tgt, identifier, fn, *args, **kw)
 AttributeError: 'GroupedByKeyList' object has no attribute 'dispatch'



Seemingly, instrumenting the list is not enough to install the event
dispatch mechanism.

Where is relationship(collection_class=) implemented?
I could not quite follow the code. collection_class is saved on the
RelationshipProperty object, but I could not figure out where it is picked
up again, in order to see how instrumentation is done for standard
collections.

Should I use my own instrumentation outside of SQLAlchemy?

I have attached the code as I have it now.

To run it, I created a test case based on your previous answer, which is
included in this mail:

Thank you,
Paul


from sqlalchemy.ext.declarative.api import declarative_base
from sqlalchemy.schema import Column, ForeignKey
from sqlalchemy.types import Integer, String
from sqlalchemy.orm import relationship
from sqla_keyed_relationship.sqla_grouped_collection2 import GroupByKeyCollection, AssociationGBK
from sqlalchemy.ext.associationproxy import association_proxy
from sqlalchemy.engine import create_engine
from sqlalchemy.orm.session import Session
from unittest.case import TestCase

Base = declarative_base()

class Person(Base):
    __tablename__ = 'person'

    id = Column(Integer, primary_key=True)

    _addresses_by_role = relationship("PersonToAddress",
                            collection_class=
                                lambda: GroupByKeyCollection(
                                    key_attribute="role"
                                )
                            )
    addresses_by_role = association_proxy(
                            "_addresses_by_role",
                            "address",
                            proxy_factory=AssociationGBK,
                            creator=lambda k, v: PersonToAddress(role=k,
                                                                 address=v))

class PersonToAddress(Base):
    __tablename__ = 'person_to_address'

    id = Column(Integer, primary_key=True)
    person_id = Column(Integer, ForeignKey('person.id'))
    address_id = Column(Integer, ForeignKey('address.id'))
    role = Column(String)
    person = relationship("Person", backref="p2as")
    address = relationship("Address")

class Address(Base):
    __tablename__ = 'address'

    id = Column(Integer, primary_key=True)
    name = Column(String)

class Test(TestCase):
    def setUp(self):

Re: [sqlalchemy] attribute_mapped_collection use as a key-list dictionary

2013-09-03 Thread Michael Bayer

On Sep 3, 2013, at 8:47 AM, Paul Balomiri paulbalom...@gmail.com wrote:

 I would like to install
 
 event.listen(list, 'append', append_listener)
 event.listen(list, 'remove', rm_listener)
 
 on those lists, such that the GroupByKeyCollection can modify added objects 
 according to the relationship it implements:
 * set the appropriate foreign key constraints
 * insert a removed object with its new value for the key attribute after a
 change (announced by append_listener)
 * reset the FKs upon item removal.

using event.listen with GBK doesn't make sense.  events can only be used with
specific target types; the "remove" and "append" events only apply to an
ORM-produced InstrumentedAttribute, such as Person._addresses_by_role here
(note, we mean the class-bound attribute, not the collection on an instance).
There is no need to use event.listen with the collection itself, as
remove/append are produced originally by the add()/remove() methods on GBK
itself; any extra logic which should take place would be invoked directly from
there (and in fact my original example fails to fire off the event with
remove()).
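As an illustration of that distinction, here is a minimal sketch (not part of
the original exchange) of listening on the class-bound attribute itself, which
is where these events can be attached; the listener name and the print are
just placeholders:

    from sqlalchemy import event

    @event.listens_for(Person._addresses_by_role, 'append')
    def on_append(target, value, initiator):
        # 'target' is the Person instance, 'value' the appended PersonToAddress
        print("appended %r onto %r" % (value, target))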

Additionally, all the usage of prepare_instrumentation() etc. should not be 
necessary, that's all internal stuff which is called automatically.

As mentioned before, the behavior of this collection is completely outside the 
realm of a normal collection so it needs to implement the append/remove 
events directly, which isn't something a new user to SQLAlchemy would typically 
be able to handle without a much deeper understanding of how the attribute 
system works.

I've implemented your test case below, as well as some other variants, in
association with the original code I gave you - for the "remove" case I've
added the necessary code to the custom collection. All foreign key constraints
are set correctly as a function of the ORM's normal operation, and as far as
"reset", when an association between Person and Address is removed, we want to
just delete the association, so cascade is used for that.   I'm not sure what
"insert a removed object with its new value for the key attribute after a
change" means; add a test to the TestPA class illustrating the behavior you
want and I'll add it.

from sqlalchemy import *
from sqlalchemy.orm import *
from sqlalchemy.ext.declarative import declarative_base
import collections
from sqlalchemy.orm.collections import collection, collection_adapter
from sqlalchemy.ext.associationproxy import association_proxy, _AssociationCollection
Base = declarative_base()

class GroupByKeyCollection(collections.defaultdict):
    def __init__(self, keyfunc):
        super(GroupByKeyCollection, self).__init__(list)
        self.keyfunc = keyfunc

    @collection.appender
    def add(self, value, _sa_initiator=None):
        key = self.keyfunc(value)
        self[key].append(value)

    @collection.remover
    def remove(self, value, _sa_initiator=None):
        key = self.keyfunc(value)
        adapter = collection_adapter(self)
        adapter.fire_remove_event(value, None)
        self[key].remove(value)

    @collection.internally_instrumented
    def __setitem__(self, key, value):
        adapter = collection_adapter(self)
        # the collection API usually provides these events transparently, but
        # due to the unusual structure, we pretty much have to fire them
        # ourselves for each item.
        for item in value:
            item = adapter.fire_append_event(item, None)
        collections.defaultdict.__setitem__(self, key, value)

    @collection.internally_instrumented
    def __delitem__(self, key):
        adapter = collection_adapter(self)
        for item in self[key]:
            adapter.fire_remove_event(item, None)
        collections.defaultdict.__delitem__(self, key)

    @collection.iterator
    def iterate(self):
        for collection in self.values():
            for item in collection:
                yield item

    @collection.converter
    def _convert(self, target):
        for collection in target.values():
            for item in collection:
                yield item

    def update(self, k):
        raise NotImplementedError()


class AssociationGBK(_AssociationCollection):
    def __init__(self, lazy_collection, creator, value_attr, parent):
        getter, setter = parent._default_getset(parent.collection_class)
        super(AssociationGBK, self).__init__(
            lazy_collection, creator, getter, setter, parent)

    def _create(self, key, value):
        return self.creator(key, value)

    def _get(self, object):
        return self.getter(object)

    def _set(self, object, key, value):
        return self.setter(object, key, value)

    def __getitem__(self, key):
        return [self._get(item) for item in self.col[key]]

    def __setitem__(self, key, value):
        self.col[key] = [self._create(key, item) for item in value]

    def add(self, key, item):
        self.col.add(self._create(key, item))

Re: [sqlalchemy] attribute_mapped_collection use as a key-list dictionary

2013-08-17 Thread Michael Bayer

On Aug 15, 2013, at 7:32 AM, Paul Balomiri paulbalom...@gmail.com wrote:

 Hi,
 
 Thank you for the elaborate answer!
 
 I am trying to implement a general solution for the key-list problem
 using events.
 
 Basically I want to instrument, for GroupByKeyCollection, any changes
 relevant to the keyfunc.
 
 say we have
 p= Person()
 p._address_by_role['r1']= [PersonToAddress(address=Address(name='a1'),
 role='r1')  ]
 
 My problem is that I cannot access the parent object (PersonToAddress)
 from the ScalarAttributeImpl supplied by the events framework as the
 initiator parameter of the "set" callback. What I want is to remove an
 object from a key-associated list when its keying function result
 mutates. For this I have to fetch the PersonToAddress from the
 PersonToAddress.role "set" event. Can you hint me at a way to fetch a
 mapped object from its attribute "set" event?

that's what the target argument is, it's the object

@event.listens_for(PersonToAddress.role, "set")
def set(target, value, oldvalue, initiator):
    person = target.person
    if person:
        # do the manipulation directly...
        collection = person._addresses_by_role[oldvalue]
        collection.remove(target)
        person._addresses_by_role[value].append(target)

        # or let the collection do it:
        # target.person = None
        # set the value early
        # target.role = value
        # target.person = person


what's person?  a backref, easy enough...

_addresses_by_role = relationship("PersonToAddress",
                        collection_class=
                            lambda: GroupByKeyCollection(
                                keyfunc=lambda item: item.role
                            ),
                        backref="person"
                    )


test

some_role = p1._addresses_by_role['r1'][1]
some_role.role = 'r5'

assert p1._addresses_by_role['r5'] == [some_role]



 

 2013/8/13 Michael Bayer mike...@zzzcomputing.com:
 
 On Aug 13, 2013, at 11:44 AM, Paul Balomiri paulbalom...@gmail.com wrote:
 
 I would like to get a list as value for the dict, such that i can
 assign more than one entity to any one key. The output should look
 like this:
  {u'home': [<Address object at 0x29568d0>, <Address object at ...>],
  u'work': [<Address object at 0x2a3eb90>]}
 
 Now in the database whenever i set a new value for a key(=role), the
 entry in PersonToAddress' table is replaced (not added). This is
 consistent with having a 1-key to 1-value mapping. Can I however
  change the behaviour in such a way that more than one Address is
  allowed for one Person using the same key (= role in this example)?
 
 
 OK, an attribute_mapped_collection is just an adapter for what is basically 
 a sequence.  Instead of a sequence of objects, it's a sequence of (key, 
 object).   So by itself, attribute_mapped_collection can only store mapped 
 objects, not collections as values.
 
 When using the association proxy, there is a way to get a dictionary of 
 values, but the association proxy only knows how to close two hops into 
 one.  So to achieve that directly, you'd need one relationship that is a 
 key/value mapping to a middle object, then that middle object has a 
 collection of things.So here PersonToAddress would be more like 
 PersonAddressCollection, and then each Address object would have a 
 person_address_collection_id.   That's obviously not the traditional 
 association object pattern - instead of a collection of associations to 
 scalars, it's a collection of collections, since that's really the structure 
 you're looking for here.
 
 To approximate the collection of collections on top of a traditional 
 association pattern is tricky.  The simplest way is probably to make a 
 read-only @property that just fabricates a dictionary of collections on the 
 fly, reading from the pure collection of PersonToAddress objects.  If you 
 want just a quick read-only system, I'd go with that.
 
 Otherwise, we need to crack open the collection mechanics completely, and 
 since you want association proxying, we need to crack that open as well.  
 I've worked up a proof of concept for this idea which is below, and it was 
 not at all trivial to come up with.  In particular I stopped at getting 
 Person.addresses_by_role['role'].append(Address()) to work, since that means 
 we'd need two distinctly instrumented collections, it's doable but is more 
 complex.Below I adapted collections.defaultdict() to provide us with a 
 collection of collections over a single collection and also the 
 association proxy's base collection adapter in order to reduce the hops:
 
 from sqlalchemy import *
 from sqlalchemy.orm import *
 from sqlalchemy.ext.declarative import declarative_base
 import collections
 from sqlalchemy.orm.collections import collection, collection_adapter
  from sqlalchemy.ext.associationproxy import association_proxy, _AssociationCollection
 Base = declarative_base()
 
 class 

Re: [sqlalchemy] attribute_mapped_collection use as a key-list dictionary

2013-08-15 Thread Paul Balomiri
Hi,

Thank you for the elaborate answer!

I am trying to implement a general solution for the key-list problem
using events.

Basically I want to instrument, for GroupByKeyCollection, any changes
relevant to the keyfunc.

say we have
p= Person()
p._address_by_role['r1']= [PersonToAddress(address=Address(name='a1'),
role='r1')  ]

My problem is that I cannot access the parent object (PersonToAddress)
from the ScalarAttributeImpl supplied by the events framework as the
initiator parameter of the "set" callback. What I want is to remove an
object from a key-associated list when its keying function result
mutates. For this I have to fetch the PersonToAddress from the
PersonToAddress.role "set" event. Can you hint me at a way to fetch a
mapped object from its attribute "set" event?

The following describes how I see the path to the solution. Do you think I
am on the right track?

The keying function shall be reapplied whenever keying attributes are
mutated on PersonToAddress. Upon detecting a changed value I want to
reorganize the _address_by_role structure.

The second step would be to implement callbacks on the instrumented
lists which form the values of the GroupByKeyCollection.

The behavior I target is such that:

p._address_by_role.append(PersonToAddress(address=Address(name='a1'),
role='r1'))  # OK
p._address_by_role['r2'].append(PersonToAddress(address=Address(name='a1'),
role='r1'))  # OK, but PersonToAddress.role is changed to 'r2'
p._address_by_role['r2'].append(PersonToAddress(address=Address(name='a1')))
# OK, PersonToAddress.role is set to 'r2'
del p._address_by_role['r2'][0]  # OK, the first element is removed,
# and its role value is set to the default value

p._address_by_role['r2'][1] = p._address_by_role['r1'][0]
# OK, but many steps should happen here:
#   - p._address_by_role['r1'][0] is put into p._address_by_role['r2']
#   - this changes the attr. value p._address_by_role['r1'][0].role to 'r2'
#     this triggers the removal from p._address_by_role['r1']
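A minimal sketch of the key rewrite wished for in the second and third lines
above, offered as an illustration only (KeyedList and key_attribute are
hypothetical names, not taken from the thread's actual code):

    class KeyedList(list):
        # each sub-list knows its constant key and the keying attribute name
        def __init__(self, key, key_attribute):
            self.key = key
            self.key_attribute = key_attribute

        def append(self, value):
            # rewrite the keying attribute so the appended value matches
            # this list's key, e.g. PersonToAddress.role becomes 'r2'
            setattr(value, self.key_attribute, self.key)
            list.append(self, value)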

Thank you
Paul

2013/8/13 Michael Bayer mike...@zzzcomputing.com:

 On Aug 13, 2013, at 11:44 AM, Paul Balomiri paulbalom...@gmail.com wrote:

 I would like to get a list as value for the dict, such that i can
 assign more than one entity to any one key. The output should look
 like this:
 {u'home': [<Address object at 0x29568d0>, <Address object at ...>],
 u'work': [<Address object at 0x2a3eb90>]}

 Now in the database whenever i set a new value for a key(=role), the
 entry in PersonToAddress' table is replaced (not added). This is
 consistent with having a 1-key to 1-value mapping. Can I however
 change the behaviour in such a way that more than one Address is
 allowed for one Person using the same key (= role in this example)?


 OK, an attribute_mapped_collection is just an adapter for what is basically a 
 sequence.  Instead of a sequence of objects, it's a sequence of (key, 
 object).   So by itself, attribute_mapped_collection can only store mapped 
 objects, not collections as values.

 When using the association proxy, there is a way to get a dictionary of 
 values, but the association proxy only knows how to close two hops into 
 one.  So to achieve that directly, you'd need one relationship that is a 
 key/value mapping to a middle object, then that middle object has a 
 collection of things.So here PersonToAddress would be more like 
 PersonAddressCollection, and then each Address object would have a 
 person_address_collection_id.   That's obviously not the traditional 
 association object pattern - instead of a collection of associations to 
 scalars, it's a collection of collections, since that's really the structure 
 you're looking for here.

 To approximate the collection of collections on top of a traditional 
 association pattern is tricky.  The simplest way is probably to make a 
 read-only @property that just fabricates a dictionary of collections on the 
 fly, reading from the pure collection of PersonToAddress objects.  If you 
 want just a quick read-only system, I'd go with that.

 Otherwise, we need to crack open the collection mechanics completely, and 
 since you want association proxying, we need to crack that open as well.  
 I've worked up a proof of concept for this idea which is below, and it was 
 not at all trivial to come up with.  In particular I stopped at getting 
 Person.addresses_by_role['role'].append(Address()) to work, since that means 
 we'd need two distinctly instrumented collections, it's doable but is more 
 complex.Below I adapted collections.defaultdict() to provide us with a 
 collection of collections over a single collection and also the association 
 proxy's base collection adapter in order to reduce the hops:

 from sqlalchemy import *
 from sqlalchemy.orm import *
 from sqlalchemy.ext.declarative import declarative_base
 import collections
 from sqlalchemy.orm.collections import collection, collection_adapter
 from sqlalchemy.ext.associationproxy import association_proxy, _AssociationCollection
 Base = 

Re: [sqlalchemy] attribute_mapped_collection use as a key-list dictionary

2013-08-13 Thread Michael Bayer

On Aug 13, 2013, at 11:44 AM, Paul Balomiri paulbalom...@gmail.com wrote:

 I would like to get a list as value for the dict, such that i can
 assign more than one entity to any one key. The output should look
 like this:
 {u'home': [<Address object at 0x29568d0>, <Address object at ...>],
 u'work': [<Address object at 0x2a3eb90>]}
 
 Now in the database whenever i set a new value for a key(=role), the
 entry in PersonToAddress' table is replaced (not added). This is
 consistent with having a 1-key to 1-value mapping. Can I however
 change the behaviour in such a way that more than one Address is
 allowed for one Person using the same key (= role in this example)?
 

OK, an attribute_mapped_collection is just an adapter for what is basically a 
sequence.  Instead of a sequence of objects, it's a sequence of (key, object).  
 So by itself, attribute_mapped_collection can only store mapped objects, not 
collections as values.

When using the association proxy, there is a way to get a dictionary of values, 
but the association proxy only knows how to close two hops into one.  So to 
achieve that directly, you'd need one relationship that is a key/value mapping 
to a middle object, then that middle object has a collection of things.So 
here PersonToAddress would be more like PersonAddressCollection, and then each 
Address object would have a person_address_collection_id.   That's obviously 
not the traditional association object pattern - instead of a collection of 
associations to scalars, it's a collection of collections, since that's really 
the structure you're looking for here.

To approximate the collection of collections on top of a traditional 
association pattern is tricky.  The simplest way is probably to make a 
read-only @property that just fabricates a dictionary of collections on the 
fly, reading from the pure collection of PersonToAddress objects.  If you want 
just a quick read-only system, I'd go with that.
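As a minimal sketch of that read-only approach (assuming a plain relationship
collection of PersonToAddress objects named 'p2as', as in the test case
elsewhere in this thread; the helper name is not from the original code):

    import collections

    def addresses_by_role(person):
        # fabricate {role: [Address, ...]} on the fly from the plain
        # PersonToAddress collection; read-only by construction
        grouped = collections.defaultdict(list)
        for p2a in person.p2as:
            grouped[p2a.role].append(p2a.address)
        return dict(grouped)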

Otherwise, we need to crack open the collection mechanics completely, and since 
you want association proxying, we need to crack that open as well.  I've worked 
up a proof of concept for this idea which is below, and it was not at all 
trivial to come up with.  In particular I stopped at getting 
Person.addresses_by_role['role'].append(Address()) to work, since that means 
we'd need two distinctly instrumented collections, it's doable but is more 
complex.Below I adapted collections.defaultdict() to provide us with a 
collection of collections over a single collection and also the association 
proxy's base collection adapter in order to reduce the hops:

from sqlalchemy import *
from sqlalchemy.orm import *
from sqlalchemy.ext.declarative import declarative_base
import collections
from sqlalchemy.orm.collections import collection, collection_adapter
from sqlalchemy.ext.associationproxy import association_proxy, _AssociationCollection
Base = declarative_base()

class GroupByKeyCollection(collections.defaultdict):
    def __init__(self, keyfunc):
        super(GroupByKeyCollection, self).__init__(list)
        self.keyfunc = keyfunc

    @collection.appender
    def add(self, value, _sa_initiator=None):
        key = self.keyfunc(value)
        self[key].append(value)

    @collection.remover
    def remove(self, value, _sa_initiator=None):
        key = self.keyfunc(value)
        self[key].remove(value)

    @collection.internally_instrumented
    def __setitem__(self, key, value):
        adapter = collection_adapter(self)
        # the collection API usually provides these events transparently, but
        # due to the unusual structure, we pretty much have to fire them
        # ourselves for each item.
        for item in value:
            item = adapter.fire_append_event(item, None)
        collections.defaultdict.__setitem__(self, key, value)

    @collection.internally_instrumented
    def __delitem__(self, key):
        adapter = collection_adapter(self)
        for item in self[key]:
            item = adapter.fire_remove_event(item, None)
        collections.defaultdict.__delitem__(self, key)

    @collection.iterator
    def iterate(self):
        for collection in self.values():
            for item in collection:
                yield item

    @collection.converter
    def _convert(self, target):
        for collection in target.values():
            for item in collection:
                yield item

    def update(self, k):
        raise NotImplementedError()


class AssociationGBK(_AssociationCollection):
    def __init__(self, lazy_collection, creator, value_attr, parent):
        getter, setter = parent._default_getset(parent.collection_class)
        super(AssociationGBK, self).__init__(
            lazy_collection, creator, getter, setter, parent)

    def _create(self, key, value):
        return self.creator(key, value)

    def _get(self, object):
        return self.getter(object)

    def _set(self, object, key, value):
        return