Re: [sqlalchemy] attribute_mapped_collection use as a key-list dictionary

2013-09-04 Thread Paul Balomiri
Hi,
First off, and again, thanks for the support here. I think I can work for
the moment with your proposed implementation.

My problem is that the list values of the GBK defaultdict are plain list
types, and thus cannot fire events for operations on them. The test case
below does not work, and, as you mentioned, no other operation on GBK's
list values will fire the corresponding events.

Now my attempt (admittedly without enough knowledge of sqlalchemy
internals) was to create a list which *forwards* append/remove events
to the GBK collection, which could in turn add/remove the items in its
capacity as a true InstrumentedAttribute (thus handling the DB part). So
more specifically I used prepare_instrumentation(), hoping to be able to
instantiate an InstrumentedList with event capabilities. The
InstrumentedLists would not need to be first-class InstrumentedAttributes
(-- perhaps could not be, because they appear after reflection time? --).

I see now that it was a long shot.

This approach would also remove the immutability constraint on keyfunc's
return value; keyfunc would be a read/writable attribute instead.

I hope to find time to get more accustomed to sqlalchemy's internals and to
implement this idea sometime in January or February.

By the way, I'm implementing Rails-style DB binding (table people -> class
Person, etc.) using the SQLA inspector interface. I'll post the lib on
GitHub as soon as I'm eating my own dog food (= it's usable).

--
Paul

Here is a test case where I would like the list
p1._addresses_by_role["home"] to fire a remove event which removes the
person from the GBK:

def test_del_item(self):
    sess = self.sess
    p1 = Person()
    a1 = Address(name="Bucharest")

    # here, p2a already refers to a1/p1; the _addresses_by_role
    # will be set up when it loads after a commit
    p2a = PersonToAddress(address=a1, person=p1, role="home")
    sess.add(p1)

    sess.commit()
    self._assertAssociated(p1, a1, p2a)

    del p1._addresses_by_role["home"][0]
    sess.commit()
    import pdb
    pdb.set_trace()
    self.assertFalse("home" in p1._addresses_by_role)




2013/9/4 Michael Bayer mike...@zzzcomputing.com


 On Sep 3, 2013, at 8:47 AM, Paul Balomiri paulbalom...@gmail.com wrote:

  I would like to install
 
  event.listen(list, 'append', append_listener)
  event.listen(list, 'remove', rm_listener)
 
  on those lists, such that the GroupByKeyCollection can modify added
  objects according to the relationship it implements:
  * set the appropriate foreign key constraints
  * insert a removed object with its new value for the key attribute
    after a change (announced by append_listener)
  * reset the fks upon item removal.

 using event.listen with GBK doesn't make sense.  Events can only be used
 with specific target types; the remove/append events only apply to an
 ORM-produced InstrumentedAttribute, such as Person._addresses_by_role here
 (note, we mean the class-bound attribute, not the collection on an
 instance).  There is no need to use event.listen with the collection
 itself, as remove/append are produced originally by the add()/remove()
 methods on GBK itself; any extra logic which should take place would be
 invoked directly from there (and in fact my original example fails to fire
 off the event with remove()).

 Additionally, all the usage of prepare_instrumentation() etc. should not
 be necessary, that's all internal stuff which is called automatically.

 As mentioned before, the behavior of this collection is completely outside
 the realm of a normal collection so it needs to implement the
 append/remove events directly, which isn't something a new user to
 SQLAlchemy would typically be able to handle without a much deeper
 understanding of how the attribute system works.
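 (For readers following along: "implementing the events directly" goes
 through the @collection decorators, which mark the methods the ORM routes
 appends and removals through. A minimal sketch mirroring the
 GroupByKeyCollection below -- not the full implementation from this
 thread, whose event handling is more involved:)

 import collections
 from sqlalchemy.orm.collections import collection

 class GroupByKeyCollection(collections.defaultdict):
     def __init__(self, keyfunc):
         super(GroupByKeyCollection, self).__init__(list)
         self.keyfunc = keyfunc

     @collection.appender
     def add(self, value):
         # instrumented by the ORM; appends land here and fire the event
         self[self.keyfunc(value)].append(value)

     @collection.remover
     def remove(self, value):
         # likewise for removals
         self[self.keyfunc(value)].remove(value)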

 I've implemented your test case as below, as well as some other variants,
 in association with the original code I gave you - for the remove case I've
 added the necessary code to the custom collection.  All foreign key
 constraints are set correctly as a function of the ORM's normal operation,
 and as far as "reset", when an association between Person and Address is
 removed, we want to just delete the association, so cascade is used for
 that.   I'm not sure what "insert a removed object with its new value for
 the key attribute after a change" means; add a test to the TestPA class
 illustrating the behavior you want and I'll add it.

 from sqlalchemy import *
 from sqlalchemy.orm import *
 from sqlalchemy.ext.declarative import declarative_base
 import collections
 from sqlalchemy.orm.collections import collection, collection_adapter
 from sqlalchemy.ext.associationproxy import association_proxy, _AssociationCollection

 Base = declarative_base()

 class GroupByKeyCollection(collections.defaultdict):
     def __init__(self, keyfunc):
         super(GroupByKeyCollection, self).__init__(list)
         self.keyfunc = keyfunc
Re: [sqlalchemy] attribute_mapped_collection use as a key-list dictionary

2013-09-03 Thread Paul Balomiri
def setUp(self):
    self.e = create_engine("sqlite://", echo=True)
    Base.metadata.create_all(self.e)

def tearDown(self):
    self.e.dispose()

def test_append_to_list(self):
    sess = Session(self.e)
    p1 = Person()
    a1 = Address(name="Bucharest")
    p2a = PersonToAddress(person=p1, address=a1, role="home")
    sess.add_all([p1, a1, p2a])
    # this fails:
    p1._addresses_by_role.add(p2a)
    sess.commit()




2013/8/15 Paul Balomiri paulbalom...@gmail.com

 Hi,

 Thank you for the elaborate answer!

 I am trying to implement a general solution to the key-list problem
 using events.

 Basically, I want to instrument, for GroupByKeyCollection, any changes
 relevant to the keyfunc.

 say we have

 p = Person()
 p._address_by_role['r1'] = [PersonToAddress(address=Address(name='a1'),
                             role='r1')]

 My problem is that I cannot access the parent object (PersonToAddress)
 from the ScalarAttributeImpl supplied by the events framework as the
 initiator parameter of the set callback. What I want is to remove an
 object from a key-associated list when its keying function result
 mutates. For this I have to fetch the PersonToAddress from the
 PersonToAddress.role "set" event. Can you suggest a way to fetch a
 mapped object from its attribute "set" event?

 The following describes how I see the path to the solution. Do you think
 I am on the right track?

 The keying function shall be reapplied whenever keying attributes are
 mutated on PersonToAddress. Upon detecting a changed value I want to
 reorganize the _address_by_role structure.

 The second step would be to implement callbacks on the instrumented
 lists which form the values of the GroupByKeyCollection.

 The behavior I am targeting is such that:

 p._address_by_role.append(PersonToAddress(address=Address(name='a1'),
 role='r1'))  # OK
 p._address_by_role['r2'].append(PersonToAddress(address=Address(name='a1'),
 role='r1'))  # OK, but PersonToAddress.role is changed to 'r2'

 p._address_by_role['r2'].append(PersonToAddress(address=Address(name='a1')))
 # OK, PersonToAddress.role is set to 'r2'
 del p._address_by_role['r2'][0]
 # OK, the first element is removed, and its role value is set to the default

 p._address_by_role['r2'][1] = p._address_by_role['r1'][0]
 # OK, but many steps should happen here:
 #   - p._address_by_role['r1'][0] is put into p._address_by_role['r2']
 #   - this changes the attr. value p._address_by_role['r1'][0].role to 'r2'
 #   - this triggers the removal from p._address_by_role['r1']

 Thank you
 Paul

 2013/8/13 Michael Bayer mike...@zzzcomputing.com:
 
  On Aug 13, 2013, at 11:44 AM, Paul Balomiri paulbalom...@gmail.com wrote:
 
  I would like to get a list as value for the dict, such that I can
  assign more than one entity to any one key. The output should look
  like this:
  {u'home': [<Address object at 0x29568d0>, <Address object at ...>],
  u'work': [<Address object at 0x2a3eb90>]}

  Now in the database, whenever I set a new value for a key (=role), the
  entry in PersonToAddress' table is replaced (not added). This is
  consistent with having a 1-key to 1-value mapping. Can I however
  change the behaviour in such a way that more than one Address is
  allowed for one Person using the same key (=role in this example)?
 
 
  OK, an attribute_mapped_collection is just an adapter for what is
 basically a sequence.  Instead of a sequence of objects, it's a sequence of
 (key, object).   So by itself, attribute_mapped_collection can only store
 mapped objects, not collections as values.
 
  When using the association proxy, there is a way to get a dictionary of
 values, but the association proxy only knows how to close two hops into
 one.  So to achieve that directly, you'd need one relationship that is a
 key/value mapping to a middle object, then that middle object has a
 collection of things.So here PersonToAddress would be more like
 PersonAddressCollection, and then each Address object would have a
 person_address_collection_id.   That's obviously not the traditional
 association object pattern - instead of a collection of associations to
 scalars, it's a collection of collections, since that's really the
 structure you're looking for here.
 
  To approximate the collection of collections on top of a traditional
 association pattern is tricky.  The simplest way is probably to make a
 read-only @property that just fabricates a dictionary of collections on the
 fly, reading from the pure collection of PersonToAddress objects.  If you
 want just a quick read-only system, I'd go with that.
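 (A minimal sketch of that read-only @property, assuming the underlying
 relationship is a plain list of PersonToAddress objects named
 person_to_addresses -- the names are illustrative:)

 class Person(Base):
     # ... columns and a plain list-based relationship elided ...

     @property
     def addresses_by_role(self):
         by_role = {}
         for p2a in self.person_to_addresses:
             by_role.setdefault(p2a.role, []).append(p2a.address)
         return by_role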
 
  Otherwise, we need to crack open the collection mechanics completely,
 and since you want association proxying, we need to crack that open as
 well.  I've worked up a proof of concept for this idea which is below, and
 it was not at all trivial to come up with.  In particular I stopped at
 getting Person.addresses_by_role['role'].append(Address()) to work, since
 that means we'd need two distinctly instrumented collections, it's doable

Re: [sqlalchemy] attribute_mapped_collection use as a key-list dictionary

2013-08-15 Thread Paul Balomiri
Hi,

Thank you for the elaborate answer!

I am trying to implement a general solution to the key-list problem
using events.

Basically, I want to instrument, for GroupByKeyCollection, any changes
relevant to the keyfunc.

say we have

p = Person()
p._address_by_role['r1'] = [PersonToAddress(address=Address(name='a1'),
                            role='r1')]

My problem is that I cannot access the parent object (PersonToAddress)
from the ScalarAttributeImpl supplied by the events framework as the
initiator parameter of the set callback. What I want is to remove an
object from a key-associated list when its keying function result
mutates. For this I have to fetch the PersonToAddress from the
PersonToAddress.role "set" event. Can you suggest a way to fetch a
mapped object from its attribute "set" event?
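(As an aside: the ORM-level attribute "set" event does receive the parent
instance -- its first argument is the object being mutated, not the
attribute implementation. A minimal sketch against the thread's
PersonToAddress class, using the 0.7+ event API:)

from sqlalchemy import event

@event.listens_for(PersonToAddress.role, "set")
def role_set(target, value, oldvalue, initiator):
    # 'target' is the PersonToAddress instance whose role is changing;
    # the _address_by_role structure could be reorganized from here
    print(target, value, oldvalue)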

The following describes how I see the path to the solution. Do you think
I am on the right track?

The keying function shall be reapplied whenever keying attributes are
mutated on PersonToAddress. Upon detecting a changed value I want to
reorganize the _address_by_role structure.

The second step would be to implement callbacks on the instrumented
lists which form the values of the GroupByKeyCollection.

The behavior I am targeting is such that:

p._address_by_role.append(PersonToAddress(address=Address(name='a1'),
role='r1'))  # OK
p._address_by_role['r2'].append(PersonToAddress(address=Address(name='a1'),
role='r1'))  # OK, but PersonToAddress.role is changed to 'r2'
p._address_by_role['r2'].append(PersonToAddress(address=Address(name='a1')))
# OK, PersonToAddress.role is set to 'r2'
del p._address_by_role['r2'][0]
# OK, the first element is removed, and its role value is set to the default

p._address_by_role['r2'][1] = p._address_by_role['r1'][0]
# OK, but many steps should happen here:
#   - p._address_by_role['r1'][0] is put into p._address_by_role['r2']
#   - this changes the attr. value p._address_by_role['r1'][0].role to 'r2'
#   - this triggers the removal from p._address_by_role['r1']

Thank you
Paul

2013/8/13 Michael Bayer mike...@zzzcomputing.com:

 On Aug 13, 2013, at 11:44 AM, Paul Balomiri paulbalom...@gmail.com wrote:

 I would like to get a list as value for the dict, such that I can
 assign more than one entity to any one key. The output should look
 like this:
 {u'home': [<Address object at 0x29568d0>, <Address object at ...>],
 u'work': [<Address object at 0x2a3eb90>]}

 Now in the database, whenever I set a new value for a key (=role), the
 entry in PersonToAddress' table is replaced (not added). This is
 consistent with having a 1-key to 1-value mapping. Can I however
 change the behaviour in such a way that more than one Address is
 allowed for one Person using the same key (=role in this example)?


 OK, an attribute_mapped_collection is just an adapter for what is basically a 
 sequence.  Instead of a sequence of objects, it's a sequence of (key, 
 object).   So by itself, attribute_mapped_collection can only store mapped 
 objects, not collections as values.

 When using the association proxy, there is a way to get a dictionary of 
 values, but the association proxy only knows how to close two hops into 
 one.  So to achieve that directly, you'd need one relationship that is a 
 key/value mapping to a middle object, then that middle object has a 
 collection of things.So here PersonToAddress would be more like 
 PersonAddressCollection, and then each Address object would have a 
 person_address_collection_id.   That's obviously not the traditional 
 association object pattern - instead of a collection of associations to 
 scalars, it's a collection of collections, since that's really the structure 
 you're looking for here.

 To approximate the collection of collections on top of a traditional 
 association pattern is tricky.  The simplest way is probably to make a 
 read-only @property that just fabricates a dictionary of collections on the 
 fly, reading from the pure collection of PersonToAddress objects.  If you 
 want just a quick read-only system, I'd go with that.

 Otherwise, we need to crack open the collection mechanics completely, and 
 since you want association proxying, we need to crack that open as well.  
 I've worked up a proof of concept for this idea which is below, and it was 
 not at all trivial to come up with.  In particular I stopped at getting 
 Person.addresses_by_role['role'].append(Address()) to work, since that means 
 we'd need two distinctly instrumented collections, it's doable but is more 
 complex.Below I adapted collections.defaultdict() to provide us with a 
 collection of collections over a single collection and also the association 
 proxy's base collection adapter in order to reduce the hops:

 from sqlalchemy import *
 from sqlalchemy.orm import *
 from sqlalchemy.ext.declarative import declarative_base
 import collections
 from sqlalchemy.orm.collections import collection, collection_adapter
 from sqlalchemy.ext.associationproxy import association_proxy, _AssociationCollection
 Base = declarative_base()

[sqlalchemy]

2013-08-13 Thread Paul Balomiri
Hi,

I am trying to build an attribute_mapped_collection reference from
table people (the mapped class is called Person). However, I would like to
get a list of entities for each key.

I have the following tables, with the relevant PK and FK listed:

Person:
  - id

PersonToAddress:
  - id
  - person_id
  - address_id
  - role  # this is the mapped special key

Address:
  - id

To establish a relationship I do the following (only the relationships
are included in the listing):

class PersonToAddress:
    person = relationship(
        __table_to_classnames__['people'],
        backref=backref('people_to_addresses',
                        collection_class=attribute_mapped_collection("role")))

class Person:
    addresses_by_role = association_proxy(
        'people_to_addresses', 'address',
        creator=lambda k, v: PersonToAddress(role=k, address=v))

Now querying yields this result:

p = Session.query(Person).get(1)
print p.addresses_by_role
{u'home': <Address object at 0x29568d0>, u'work': <Address object at 0x2a3eb90>}

I would like to get a list as value for the dict, such that I can
assign more than one entity to any one key. The output should look
like this:
{u'home': [<Address object at 0x29568d0>, <Address object at ...>],
u'work': [<Address object at 0x2a3eb90>]}

Now in the database, whenever I set a new value for a key (=role), the
entry in PersonToAddress' table is replaced (not added). This is
consistent with having a 1-key to 1-value mapping. Can I however
change the behaviour in such a way that more than one Address is
allowed for one Person using the same key (=role in this example)?

I should note that I tried supplying the uselist=True parameter in the
backref argument to PersonToAddress.person. This, however, does
nothing.
Adding uselist=True to the parameters of the relationship (as opposed
to the backref) does create a list in both the backref and
addresses_by_role's values. The list, however, only ever contains one
element, and if a new one is added, the entry in the db is changed.
Still, only one element of the list is ever present in
PersonToAddress' table.

Am I overlooking something in the way attribute_mapped_collection
should be used?

Paul

-- 
paulbalom...@gmail.com



[sqlalchemy] attribute_mapped_collection use as a key-list dictionary

2013-08-13 Thread Paul Balomiri
Sorry for having forgotten to add a subject.

2013/8/13 Paul Balomiri paulbalom...@gmail.com:
 Hi,

 I am trying to build an attribute_mapped_collection reference from
 table people (the mapped class is called Person). However, I would like to
 get a list of entities for each key.

 I have the following tables, with the relevant PK and FK listed:

 Person:
   - id

 PersonToAddress:
   - id
   - person_id
   - address_id
   - role  # this is the mapped special key

 Address:
   - id

 To establish a relationship I do the following (only the relationships
 are included in the listing):

 class PersonToAddress:
     person = relationship(
         __table_to_classnames__['people'],
         backref=backref('people_to_addresses',
                         collection_class=attribute_mapped_collection("role")))

 class Person:
     addresses_by_role = association_proxy(
         'people_to_addresses', 'address',
         creator=lambda k, v: PersonToAddress(role=k, address=v))

 Now querying yields this result:

 p = Session.query(Person).get(1)
 print p.addresses_by_role
 {u'home': <Address object at 0x29568d0>, u'work': <Address object at 0x2a3eb90>}

 I would like to get a list as value for the dict, such that I can
 assign more than one entity to any one key. The output should look
 like this:
 {u'home': [<Address object at 0x29568d0>, <Address object at ...>],
 u'work': [<Address object at 0x2a3eb90>]}

 Now in the database, whenever I set a new value for a key (=role), the
 entry in PersonToAddress' table is replaced (not added). This is
 consistent with having a 1-key to 1-value mapping. Can I however
 change the behaviour in such a way that more than one Address is
 allowed for one Person using the same key (=role in this example)?

 I should note that I tried supplying the uselist=True parameter in the
 backref argument to PersonToAddress.person. This, however, does
 nothing.
 Adding uselist=True to the parameters of the relationship (as opposed
 to the backref) does create a list in both the backref and
 addresses_by_role's values. The list, however, only ever contains one
 element, and if a new one is added, the entry in the db is changed.
 Still, only one element of the list is ever present in
 PersonToAddress' table.

 Am I overlooking something in the way attribute_mapped_collection
 should be used?

 Paul

 --
 paulbalom...@gmail.com




-- 
paulbalom...@gmail.com



[sqlalchemy] sql particle as bind parameter option:

2010-04-23 Thread Paul Balomiri

Hi,

Several days ago I had a similar question, which was perhaps a bit too
theoretical. Here is a simple case:

select(text("select * from f(:time)"))

Can I alternatively supply either

{"time": datetime.datetime(1999, 9, 9, 9, 9, 9)}

or

{"time": "now()"}

I cannot get around the fact that sqlalchemy interprets "now()" as a
string, while I am trying to use the postgres (or other backends') now()
function.




Paul Balomiri
paulbalom...@gmail.com






Re: [sqlalchemy] sql particle as bind parameter option:

2010-04-23 Thread Paul Balomiri

Hi,

Thanks for your reply. I was aware that it is not just a string
substitution, and that the bindparams also map the types of the
parameters, etc.

But is it possible to supply a type / set of types which are, or
represent, a lexical sub-structure? I might go about doing it, but first
I want to check that I'm not reinventing the wheel.

I noticed that the where clause can be supplied several times, so
select([my_table_metadata]).where(...).where(...) is valid; thus it is
possible for this special case to insert a logical expression after the
initial select() definition.

I am looking for a similar but more general case, not necessarily
using bindparam, where func.now() would be a valid substitution or
insertion element into a preexisting expression.


regards
Paul Balomiri
paulbalom...@gmail.com



On 23.04.2010, at 15:26, Michael Bayer wrote:



On Apr 23, 2010, at 6:58 AM, Paul Balomiri wrote:


Hi,

Several days ago I had a similar question, which was perhaps a bit too
theoretical. Here is a simple case:

select(text("select * from f(:time)"))

Can I alternatively supply either
{"time": datetime.datetime(1999, 9, 9, 9, 9, 9)}
or
{"time": "now()"}
I cannot get around the fact that sqlalchemy interprets "now()" as a
string, while I am trying to use the postgres (or other backends') now()
function.


bind parameters exist only for the purpose of specifying literal
data to be injected as the values into a SQL statement.   They
cannot be used to modify the lexical structure of a statement.   So
a lexical unit like "CURRENT TIMESTAMP" or "NOW()" is part of the
statement's lexical source and cannot be injected as a bind parameter.

To put it another way, bind parameters are not just string
substitutions the way something like "foo %s" % ('bar') is in
Python, even though they may look that way.
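(To make the distinction concrete: the function argument can be chosen
when the statement is constructed, rather than at bind time. A minimal
sketch, assuming a table-valued function f as in the original post:)

import datetime
from sqlalchemy import select, func, text, bindparam

def select_from_f(time_arg):
    # the argument becomes part of the statement's structure
    return select([text("*")]).select_from(func.f(time_arg))

stmt_literal = select_from_f(bindparam("time"))  # value supplied at execute()
stmt_now = select_from_f(func.now())             # now() rendered into the SQL

# conn.execute(stmt_literal, time=datetime.datetime(1999, 9, 9, 9, 9, 9))
# conn.execute(stmt_now)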










Re: [sqlalchemy] sql particle as bind parameter option:

2010-04-23 Thread Paul Balomiri




On 23.04.2010, at 17:03, Michael Bayer wrote:



On Apr 23, 2010, at 9:59 AM, Paul Balomiri paulbalom...@gmail.com wrote:



Hi,

Thanks for your reply. I was aware that it is not just a string
substitution, and that the bindparams also map the types of the
parameters, etc.

But is it possible to supply a type / set of types which are, or
represent, a lexical sub-structure? I might go about doing it, but first
I want to check that I'm not reinventing the wheel.

I noticed that the where clause can be supplied several times, so
select([my_table_metadata]).where(...).where(...) is valid; thus it is
possible for this special case to insert a logical expression after the
initial select() definition.

I am looking for a similar but more general case, not necessarily
using bindparam, where func.now() would be a valid substitution or
insertion element into a preexisting expression.




Why not just create a python function that generatively produces the  
desired statement based on arguments?  I don't see the advantage to  
something more magical than that.

Well, this is what I already do:

lazy_sql_partial = functools.partial(lambda x: select([...],
    from_obj=func.a(bindparam('a'), bindparam('b'), x)))

and then I name the binding like so:

lazy_sql_unbound = (lazy_sql_partial, (funcarg))

At the time when I generate the sql expression I do not know the
parameters, not even a default. Those are generated based on user input.
In case parameters are missing for a whole expression, the whole
expression is skipped.


An even simpler expression would be "a=1 or a=2 or ...". At the moment
I cannot generate such a construct, which would allow me to define a
select in one place, and then later add an or_(1, 2, ...) clause at a
certain point. As a special case I could use select().where(1).where(2)...
to get an and_(1, 2, ...).
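(To illustrate the chaining being described: successive where() calls are
AND-ed together, and an or_() group can be appended later as a single
term; t stands for some preexisting Table object:)

from sqlalchemy import select, or_

stmt = select([t]).where(t.c.a == 1).where(t.c.b == 2)  # WHERE a = 1 AND b = 2
stmt = stmt.where(or_(t.c.a == 1, t.c.a == 2))          # ... AND (a = 1 OR a = 2)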



An example:
Whenever the map window changes, I need to calculate the datasets which
fall out of the current window, and the ones which drop in.
Additionally, I want to query a modified sql expression whenever
the table changes (modified = with an additional in_ constraining to
pks in the table).

To do both I need to regenerate parts of the sql, but not all of it.

The way I do it now seems rather ugly because:
1) (partial_bound_func, (funcarg)) is error prone (I know I could do
more reflection to find out argument numbers and names, but it would
in the end duplicate the bindparam mechanism)

2) to execute I have to:
	- first look for parameters which are arguments to the partial
generative functions: myselect = lazy_sql_unbound(user input params
for partial)
	- then use the remaining parameters in session.execute()

It just feels like all of this belongs in the lib, because it
could all be solved by allowing bindparam to have a value of type
SQLExpression.

Moreover, the expression substitutions could accept bindparams as well.

As a last point, the compiler could check the validity, as it already
does.

regards
Paul Balomiri
paulbalom...@gmail.com



On 23.04.2010, at 15:26, Michael Bayer wrote:



On Apr 23, 2010, at 6:58 AM, Paul Balomiri wrote:


Hi,

Several days ago I had a similar question, which was perhaps a bit too
theoretical. Here is a simple case:

select(text("select * from f(:time)"))

Can I alternatively supply either
{"time": datetime.datetime(1999, 9, 9, 9, 9, 9)}
or
{"time": "now()"}
I cannot get around the fact that sqlalchemy interprets "now()" as a
string, while I am trying to use the postgres (or other backends') now()
function.


bind parameters exist only for the purpose of specifying literal
data to be injected as the values into a SQL statement.   They
cannot be used to modify the lexical structure of a statement.
So a lexical unit like "CURRENT TIMESTAMP" or "NOW()" is part of
the statement's lexical source and cannot be injected as a bind
parameter.

To put it another way, bind parameters are not just string
substitutions the way something like "foo %s" % ('bar') is in
Python, even though they may look that way.












Re: [sqlalchemy] list filter

2010-04-23 Thread Paul Balomiri

Hi,

That would be table_metadata.c.column_name.in_([1, 2, 3]),
if you are looking for an "in" relation. You could also use
MappedObjectClass.property instead of the column object.

Otherwise, if you are really testing for array equality, your
expression can be used. Also, in this latter case, make sure that
there is a DB array type with the db you are using.


Look here for some examples and for the API docs:
http://www.sqlalchemy.org/docs/_06/reference/sqlalchemy/expressions.html#functions
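(A quick sketch of both spellings, assuming a mapped class Some with a
value column, its underlying Table some_table, and an existing session:)

from sqlalchemy import select

# Core, against the Table object:
stmt = select([some_table]).where(some_table.c.value.in_([1, 2, 3]))

# ORM, against the mapped attribute:
session.query(Some).filter(Some.value.in_([1, 2, 3])).all()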

Paul Balomiri
paulbalom...@gmail.com



On 23.04.2010, at 16:33, Alexander Zhabotinskiy wrote:


Hello.

How do I filter by a list?
Maybe like:

.filter(Some.value == [1, 2, 3])








[sqlalchemy] sql clauses as bound parameters

2010-04-21 Thread Paul Balomiri

Hi,
Is it possible to supply bindparam a clause type?

I want to execute a query of the form

select * from f(:a, :b, :c)

where :b might be either a string or a function.
If :b is a function, I'd like to do something like:

s = select([col1, col2, col3], from_obj=func.f(bindparam('a'),
    bindparam('b', type=Clause)))

and

clause_variant_1 = func.bbox(bindparam('x1'), bindparam('y1'),
    bindparam('x2'), bindparam('y2'))

clause_variant_2 = text("some funny value")

When I execute, I'd like to do something like:

connection.execute(s, a=vala, c=valc, b=clause_variant_1, x1=11,
    x2=12, y1=2, y2=3)

or, alternatively:

connection.execute(s, a=vala, c=valc, b=clause_variant_2)

I already realized that I could set parameters with params, and then
compile SQL expressions.

Would perhaps this be a more accurate approach?:

connection.execute(s, a=vala, c=valc,
    b=clause_variant_1.params(x1=11, x2=12, y1=2, y2=3).compile())


I'm kind of stuck here

cheers,
Paul







[sqlalchemy] Re: sql clauses as bound parameters

2010-04-21 Thread Paul Balomiri
Alternatively, a possibility to replace a bindparam expression with
some other expression would do it too...

2010/4/21 Paul Balomiri paulbalom...@gmail.com:
 Hi,
 Is it possible to supply bindparam a clause type?

 I want to execute a query of the form

 select * from f(:a, :b, :c)

 where :b might be either a string or a function.
 If :b is a function, I'd like to do something like:

 s = select([col1, col2, col3], from_obj=func.f(bindparam('a'),
     bindparam('b', type=Clause)))

 and

 clause_variant_1 = func.bbox(bindparam('x1'), bindparam('y1'),
     bindparam('x2'), bindparam('y2'))

 clause_variant_2 = text("some funny value")

 When I execute, I'd like to do something like:

 connection.execute(s, a=vala, c=valc, b=clause_variant_1, x1=11,
     x2=12, y1=2, y2=3)

 or, alternatively:

 connection.execute(s, a=vala, c=valc, b=clause_variant_2)

 I already realized that I could set parameters with params, and then
 compile SQL expressions.

 Would perhaps this be a more accurate approach?:

 connection.execute(s, a=vala, c=valc,
     b=clause_variant_1.params(x1=11, x2=12, y1=2, y2=3).compile())

 I'm kind of stuck here

 cheers,
 Paul








-- 
paulbalom...@gmail.com




[sqlalchemy] session.merge(obj, load=False) yields "Parent instance <obj at 0x18817d0> is not bound to a Session; lazy load operation of attribute 'predecessor' cannot proceed"

2010-04-16 Thread Paul Balomiri
Hi,

I was just trying to load a relation (obj.predecessor) for obj.
obj was created in a session, which has been committed and closed afterwards.
I merged obj into the session using

session.merge(obj, load=False)

but doing so yields this exception:

Parent instance <obj at 0x18817d0> is not bound to a Session; lazy
load operation of attribute 'predecessor' cannot proceed

Without load=False everything is O.K.

I cannot quite understand this behaviour. I thought that load=False
does not update the object from the db, but merges it as-is into the
session.
But perhaps I'm missing some insight...

Paul

-- 
paulbalom...@gmail.com




[sqlalchemy] Re: session.merge(obj, load=False) yields "Parent instance <obj at 0x18817d0> is not bound to a Session; lazy load operation of attribute 'predecessor' cannot proceed"

2010-04-16 Thread Paul Balomiri
My fault:

session.merge(obj, load=False) should be

obj = session.merge(obj, load=False)
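(In other words, merge() returns the session-bound copy rather than
attaching the object passed in; a minimal sketch:)

merged = session.merge(detached_obj, load=False)
# 'merged' is attached to the session; 'detached_obj' stays detached,
# which is why the lazy load on the original raised the error
print(merged.predecessor)  # the lazy load proceeds on the merged copy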



2010/4/17 Paul Balomiri paulbalom...@gmail.com:
 Hi,

 I was just trying to load a relation (obj.predecessor) for obj.
 obj was created in a session, which has been committed and closed afterwards.
 I merged obj into the session using

 session.merge(obj, load=False)

 but doing so yields this exception:

 Parent instance <obj at 0x18817d0> is not bound to a Session; lazy
 load operation of attribute 'predecessor' cannot proceed

 Without load=False everything is O.K.

 I cannot quite understand this behaviour. I thought that load=False
 does not update the object from the db, but merges it as-is into the
 session.
 But perhaps I'm missing some insight...

 Paul

 --
 paulbalom...@gmail.com




-- 
paulbalom...@gmail.com




Re: [sqlalchemy] Notification framework for table changes

2010-03-28 Thread Paul Balomiri





On 28.03.2010, at 17:58, Michael Bayer wrote:



On Mar 28, 2010, at 11:38 AM, paul wrote:


Hi,

Sorry for the last mail, which I sent in error before finishing it.
If you (Michael) still have it in the filter, please just reject it.

I am working on a notification framework which saves the dirty, new
and deleted states of objects in sessions to an event log at commit
time.
The target is to have asynchronous agents listening to table changes,
and a sqla extension to notify them when data is written to the db.

The notification sender core code is written as a SessionExtension at
the moment.
The design uses the postgres NOTIFY / LISTEN statements, and a table
(event_log) for carrying the per-commit information of primary ids
which have been added/deleted/modified. I know that NOTIFY is not sql
standard, but many dbs have some sort of notification capability, so I
think a general case can be abstracted from this db-specific proposal
(=pg's NOTIFY).
It is even conceivable to send the notification event through OS
services, while still keeping the notification data in the db.
services, while still keep the notification data in the db.

I have already implemented this simple protocol:

before_commit(self, session):
    event_data = ...  # get all pks for session's 'dirty', 'new', 'deleted' sets
    for each x of ['dirty', 'new', 'deleted']:
        session.add(Event(data=event_data[x]))

    session.add(Event(data=event_data))  # general event
    for each object type:
        for each notification type:  # one of 'dirty', 'new', 'deleted'
            self.pending_notifications = [self.get_table_name(class_)]
    self.pending_notifications += general event for table

First question: how does SQLAlchemy map the call to
session.query(MappedObject) to a MetaData.tables[x]? Is there any way
to do this without using a custom mapper function? I would prefer
using the same mechanism as SQLAlchemy.

after_commit(self, session):
    send notifications on a separate connection (in pg I need
    to set a different isolation level)
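(A runnable sketch of that shape, using the 0.6-era SessionExtension hook;
the Event mapped class, its columns, and the single-column pk are
assumptions made for illustration:)

from sqlalchemy.orm.interfaces import SessionExtension

class NotifyExtension(SessionExtension):
    def __init__(self):
        self.pending_notifications = []

    def before_commit(self, session):
        # snapshot primary keys while the identity sets are still populated
        for label, states in (("new", session.new),
                              ("dirty", session.dirty),
                              ("deleted", session.deleted)):
            pks = [obj.id for obj in states]  # assumes a single 'id' pk
            if pks:
                session.add(Event(kind=label, data=pks))
                self.pending_notifications.append((label, pks))

    def after_commit(self, session):
        # issue NOTIFY on a separate connection here, then clear
        self.pending_notifications = []

# wired up via: Session = sessionmaker(extension=NotifyExtension())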


My second concern is about what session.execute("UPDATE table_x SET
pop = 'newval'") actually does.
Is the sql statement parsed for table names? My interest would be in
marking the objects which are changed by this update as dirty. So what
I am really looking for is a method to generate
(note that I use pk = primary key)

'SELECT pk1, pk2, change_field FROM tablename WHERE prop = x'

from an update:

'UPDATE tablename SET change_field = ... WHERE prop = x'

This would enable me to generate a dirty-pk list from any (text) sql
statement accepted by sqla.

Please let me know if you feel that this design is not the 'sqla way
to do things'. I want to open-source the code, but I wanted to get
short feedback on whether this is of any use for anyone out there. So
please comment :)


some details here are unclear, but it appears you are attempting to  
save Event objects to the database in response to flushed changes,  
and also sending out notifications.   A SessionExtension is an  
appropriate place to do that.If I were writing a generic change  
event system, I would probably use rabbitmq instead of NOTIFY.  If  
I wanted to also have a log of changes in the database, I'd probably  
forego the usage of Session.add() and just do a series of INSERT  
statements using executemany() syntax, as it would have the minimal  
performance/complexity impact.

Thanks for the pointer to rabbitmq!


To get a Table from a mapped class you can use
class_mapper(MappedClass).mapped_table.   A Session.execute() given
a literal string-based statement does not parse the string - usually
it's a given that high-level ORM-enabled features, like notifying
listeners of newly changed objects, aren't going to work if the user
drops down to raw SQL.


That last point leads naturally towards: if you really want to
NOTIFY when any changes whatsoever occur on a table, then you should
be using triggers.

Yes, but I plan to introduce these at table creation time, and only to
write them to the db when the lib user adds a specific ddl extension. DDLs
would have to be written for all dialects available, so this is a bit
far-fetched.
At the moment I just try to stick with sqla for change management, and
just abstract the NOTIFY concept. The event log can also be omitted,
so basically there are 3 levels (in increasing order of intrusiveness
into the data model):

1) pure change notification = each listener must maintain its own
table state management; no modification to the db model whatsoever.
Just a 'wake up' event is sent.
2) change notification + event_log = the changeset is published
through an additional table, but the overall datamodel is not affected.
3) trigger-based change notification = the changeset is maintained by
triggers, and their definition for each DB is handled by sqla ddl
extensions at creation time.


  Postgresql also allows stored procedures to be written in many
languages, including Python, so even using rabbitmq for notification
is quite doable entirely

Re: [sqlalchemy] Notification framework for table changes

2010-03-28 Thread Paul Balomiri

Hi,
How do I get a list of class mappers from a MetaData table?




On 28.03.2010, at 17:58, Michael Bayer wrote:


mapped_table

