[sqlalchemy] overriding collection methods
hi, i need to have a list collection whose list appender (in SA 0.4 terms) accepts either one positional arg as the value, or keyword args from which it creates the value. Each collection instance knows what type of values to create, so i do:

    class MyCollection(list):
        factory = None

        @collection.appender
        def append(me, obj=_NOTSET, **kargs):
            if obj is _NOTSET:  # marker for not-set
                obj = me.factory(**kargs)
            list.append(me, obj)
            return obj

        @classmethod
        def myCollectionFactory(klas):
            m = Association.MyCollection()
            m.factory = klas
            return m

and in the mapper:

    relation(..., uselist=True, collection_class=assoc_klas.myCollectionFactory)

well, it does not work. All is well until the ABC decoration kicks in inside _instrument_class() and sets up a preset append-wrapping decorator with a different signature, as in _list_decorators():

    def append(self, item, _sa_initiator=None): ...

Any idea how to fix/enhance this, letting **kwargs through to my function? The dynamic wrapper() can do this, while these preset ones cannot... while they should be equally powerful. There are 2 (different) users of an appender: one is SA itself, the other is the programmer. SA will always use a single arg/positionals, while i could use this or that or a combination.

===
a couple of comments on orm/collections.py:
- there are several lines like setattr(fn, '_sa_instrumented', True); why not just use fn._sa_instrumented = True ?
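Stripped of the ORM wiring, the dual-interface appender behaves like this. A minimal pure-Python sketch (ItemCollection/Item are invented names, and there is no SA instrumentation involved here):

```python
_NOTSET = object()  # sentinel: distinguishes "no positional arg" from an explicit None

class ItemCollection(list):
    """Append either a ready value, or keyword args handed to a per-instance factory."""
    factory = None

    def append(self, obj=_NOTSET, **kwargs):
        if obj is _NOTSET:
            obj = self.factory(**kwargs)  # build the value from keyword args
        list.append(self, obj)
        return obj

class Item(object):
    def __init__(self, name=None):
        self.name = name

coll = ItemCollection()
coll.factory = Item
coll.append(Item(name='a'))   # positional form - how SA itself calls the appender
made = coll.append(name='b')  # keyword form - programmer convenience
```

Under instrumentation, the preset list decorators replace this append with a `(self, item, _sa_initiator=None)` wrapper, which is exactly where the **kwargs get lost.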
- the repeated check/setup in _instrument_class() can be looped:

    # ensure all roles are present, and apply implicit instrumentation if needed
    for rolename, eventname in dict(
            appender='fire_append_event',
            remover='fire_remove_event',
            iterator=None,
            ).iteritems():
        roler = roles.get(rolename, None)
        if not roler or not hasattr(cls, roler):
            typename = cls.__name__
            raise exceptions.ArgumentError(
                "Type %(typename)s must elect an %(rolename)s method to be "
                "a collection class" % locals())
        elif (eventname and roler not in methods and
              not hasattr(getattr(cls, roler), '_sa_instrumented')):
            methods[roler] = (eventname, 1, None)

patch attached.

--~--~-~--~~~---~--~~
You received this message because you are subscribed to the Google Groups "sqlalchemy" group.
To post to this group, send email to sqlalchemy@googlegroups.com
To unsubscribe from this group, send email to [EMAIL PROTECTED]
For more options, visit this group at http://groups.google.com/group/sqlalchemy?hl=en
-~--~~~~--~~--~--~---

Index: orm/collections.py
===
--- orm/collections.py (revision )
+++ orm/collections.py (working copy)
@@ -647,35 +655,32 @@
     # ensure all roles are present, and apply implicit instrumentation if
     # needed
-    if 'appender' not in roles or not hasattr(cls, roles['appender']):
+    for rolename, eventname in dict(
+            appender='fire_append_event',
+            remover='fire_remove_event',
+            iterator=None,
+            ).iteritems():
+        roler = roles.get(rolename, None)
+        if not role or not hasattr(cls, roler):
+            typename = cls.__name__
             raise exceptions.ArgumentError(
-                "Type %s must elect an appender method to be "
-                "a collection class" % cls.__name__)
-    elif (roles['appender'] not in methods and
-          not hasattr(getattr(cls, roles['appender']), '_sa_instrumented')):
-        methods[roles['appender']] = ('fire_append_event', 1, None)
-
-    if 'remover' not in roles or not hasattr(cls, roles['remover']):
-        raise exceptions.ArgumentError(
-            "Type %s must elect a remover method to be "
-            "a collection class" % cls.__name__)
-    elif (roles['remover'] not in methods and
-          not hasattr(getattr(cls, roles['remover']), '_sa_instrumented')):
-        methods[roles['remover']] = ('fire_remove_event', 1, None)
-
-    if 'iterator' not in roles or not hasattr(cls, roles['iterator']):
-        raise exceptions.ArgumentError(
-            "Type %s must elect an iterator method to be "
-            "a collection class" % cls.__name__)
+                "Type %(typename)s must elect an %(role)s method to be "
+                "a collection class" % locals())
+        elif (eventname and
+              roler not in methods and
+              not hasattr(getattr(cls, roler), '_sa_instrumented')):
+            methods[roler] = (eventname, 1, None)
 
     # apply ad-hoc instrumentation from decorators, class-level defaults
     # and implicit role declarations
[sqlalchemy] Re: overriding collection methods
sorry, fixed patch

Index: orm/collections.py
===
--- orm/collections.py (revision )
+++ orm/collections.py (working copy)
@@ -647,35 +648,32 @@
     # ensure all roles are present, and apply implicit instrumentation if
     # needed
-    if 'appender' not in roles or not hasattr(cls, roles['appender']):
+    for rolename, eventname in dict(
+            appender='fire_append_event',
+            remover='fire_remove_event',
+            iterator=None,
+            ).iteritems():
+        roler = roles.get(rolename, None)
+        if not rolename or not hasattr(cls, roler):
+            typename = cls.__name__
             raise exceptions.ArgumentError(
-                "Type %s must elect an appender method to be "
-                "a collection class" % cls.__name__)
-    elif (roles['appender'] not in methods and
-          not hasattr(getattr(cls, roles['appender']), '_sa_instrumented')):
-        methods[roles['appender']] = ('fire_append_event', 1, None)
-
-    if 'remover' not in roles or not hasattr(cls, roles['remover']):
-        raise exceptions.ArgumentError(
-            "Type %s must elect a remover method to be "
-            "a collection class" % cls.__name__)
-    elif (roles['remover'] not in methods and
-          not hasattr(getattr(cls, roles['remover']), '_sa_instrumented')):
-        methods[roles['remover']] = ('fire_remove_event', 1, None)
-
-    if 'iterator' not in roles or not hasattr(cls, roles['iterator']):
-        raise exceptions.ArgumentError(
-            "Type %s must elect an iterator method to be "
-            "a collection class" % cls.__name__)
+                "Type %(typename)s must elect an %(role)s method to be "
+                "a collection class" % locals())
+        elif (eventname and
+              roler not in methods and
+              not hasattr(getattr(cls, roler), '_sa_instrumented')):
+            methods[roler] = (eventname, 1, None)
 
     # apply ad-hoc instrumentation from decorators, class-level defaults
     # and implicit role declarations
[sqlalchemy] Re: Ascii codec instead of unicode ?
So in short, if I specify use_unicode=True at the SA engine level then I can skip specifying use_unicode and specify only charset=utf8 at the MySQLdb level?

On 8/19/07, jason kirtland [EMAIL PROTECTED] wrote:
> Arun Kumar PG wrote:
> > > Ok, you need to get that charset to the driver. Try removing SET
> > > NAMES from your init_command, and instead pass charset=utf8 and
> > > use_unicode=0 in your database connection URL.
> >
> > why do we want to say use_unicode=0 instead of use_unicode=True
> > here?
>
> You can go either way with that. The MySQLdb driver's default
> behavior when given a 'charset' is to also turn on its "return all
> strings in Unicode" mode. If you want all of your strings as Unicode
> that's just dandy, but if you were expecting them to come back as
> regular strings encoded in the charset you requested, you'd be in for
> a surprise... In my own code I enable use_unicode and I don't specify
> any Unicode options or column types at the SQLAlchemy level.
>
> -j

--
Cheers,
- A
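For reference, these driver options travel in the connection URL's query string. A sketch (with hypothetical credentials) of the combination suggested above, pulling the parameters back out just to show what MySQLdb would receive:

```python
from urllib.parse import urlparse, parse_qs

# hypothetical URL: charset sets the connection encoding,
# use_unicode=0 keeps returned strings as encoded bytes
url = 'mysql://user:secret@localhost/mydb?charset=utf8&use_unicode=0'
params = parse_qs(urlparse(url).query)
# params now maps each option name to a list of its values
```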
[sqlalchemy] adjacency list: filtering relations eagerly
Hi. I am using a self-referential mapper to represent a multi-level tree of parent-child relationships. Typically I've been querying each parent for children that I am interested in. Up till now I have made a new query for each child that I am looking for, which doesn't seem like the most efficient way to approach this.

I've found that 0.4's aliased joins are great for selecting parents based on their children, and can be eager-loaded to grab entire subtrees with a single query. However, each parent in my table can have many children (and trees can be many levels deep), so eager loading root nodes can be a bit slow too. Ideally I would like to be able to eager load only the children that were involved in the join, as those are the only ones that I am interested in, so that something like parent_instance.children, or equivalent, only loads children that were part of the original join rather than all of parent_instance's children.

Is something like this possible? The closest that I've found is using add_entity after each join in my query.

Thanks
Stephen
[sqlalchemy] Re: overriding collection methods
another thing noted: the collections instrumentation fails over old-style python classes (not inheriting object), e.g.

    class myX: ...whatever...

it fails at _instrument_class(), because type(myX()) being <type 'instance'> is recognized as builtin, and apart from that util.duck_type_collection() may fail because issubclass does not work just straightforwardly; e.g. it must be:

    import types
    isa = isinstance(specimen, (type, types.ClassType)) and issubclass or isinstance

neither types.ClassType (<type 'classobj'>) nor the above-mentioned <type 'instance'> is otherwise accessible as some predefined python type - only as type(someclass) or type(someclass()).

ciao
svil
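The fix boils down to resolving a class first and only then calling issubclass. Old-style classes are a Python 2-only concept, but the same guard can be sketched in a version-neutral way (this helper is illustrative, not SA's actual duck_type_collection code):

```python
import inspect

def collection_interface(specimen):
    """Guess which builtin collection interface a class or instance matches."""
    # accept either a class or an instance; issubclass demands a class
    cls = specimen if inspect.isclass(specimen) else type(specimen)
    for base in (list, set, dict):
        if issubclass(cls, base):
            return base
    return None
```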
[sqlalchemy] Re: Join vs. 2 selects?
Thank you both for your help! I tend toward using 2 select statements, because the most frequent use case will be a user who views the overview page (containing several blog posts, possibly summarized) and then selects a specific one to read, including comments. Thus, the blog post will most probably be in the cache already, and I only need to query the comments by parent_id, without a join.

Cheers,
Martin
[sqlalchemy] Re: overriding collection methods
a suggestion about _list_decorators() and similar: they can easily be made into classes, i.e. non-dynamic (and overloadable/patchable :-).

    class _list_decorators(object):  # instead of def _list_decorators()
        # all contents/decorators stay the same...
        _funcs = _get_decorators(locals())
        def __new__(klas):
            return klas._funcs

    def _get_decorators(d):
        skip = '__module__', '__doc__',
        # whichever way is preferred:
        r = dict((k, v) for k, v in d.iteritems() if k not in skip)
        # or:
        r = d.copy()
        for s in skip:
            r.pop(s)
        return r

    def _tidy(fn): ...  # becomes global

the only prerequisite for this is to rename __del() and __set() into _del/_set, or else they get looked up as private-named identifiers (that's Python's class-private name mangling: any name with two leading underscores referenced inside a class body gets rewritten to _ClassName__name).

ciao
svilen
[sqlalchemy] Re: overriding collection methods
a patch, it got even tidier ;-) - no more _tidy() calls, all automated.

On Monday 20 August 2007 16:41:30 svilen wrote:
> a suggestion about _list_decorators() and similar: they can easily be
> made into classes, i.e. non-dynamic (and overloadable/patchable :-).
>
>     class _list_decorators(object):

Index: sqlalchemy/orm/collections.py
===
--- sqlalchemy/orm/collections.py (revision 3375)
+++ sqlalchemy/orm/collections.py (working copy)
@@ -746,7 +749,7 @@
         pass
     return wrapper
 
-def __set(collection, item, _sa_initiator=None):
+def _set(collection, item, _sa_initiator=None):
     """Run set events, may eventually be inlined into decorators."""
     if _sa_initiator is not False and item is not None:
@@ -754,7 +757,7 @@
         if executor:
             getattr(executor, 'fire_append_event')(item, _sa_initiator)
 
-def __del(collection, item, _sa_initiator=None):
+def _del(collection, item, _sa_initiator=None):
     """Run del events, may eventually be inlined into decorators."""
     if _sa_initiator is not False and item is not None:
@@ -762,17 +765,28 @@
         if executor:
             getattr(executor, 'fire_remove_event')(item, _sa_initiator)
 
-def _list_decorators():
+def _tidy_(fn, base):
+    fn._sa_instrumented = True
+    fn.__doc__ = getattr(base, fn.__name__).__doc__
+    return fn
+
+def _tider(func, base):
+    def f(fn): return _tidy_(func(fn), base)
+    return f
+
+def _get_decorators(d, base):
+    skip = '__module__', '__doc__',
+    return dict((k, _tider(v, base)) for k, v in d.iteritems() if k not in skip)
+
+
+class _list_decorators(object):  # def _list_decorators():
     """Hand-turned instrumentation wrappers that can decorate any list-like
     class."""
 
-    def _tidy(fn):
-        setattr(fn, '_sa_instrumented', True)
-        fn.__doc__ = getattr(getattr(list, fn.__name__), '__doc__')
 
     def append(fn):
         def append(self, item, _sa_initiator=None):
-            # FIXME: example of fully inlining __set and adapter.fire
+            # FIXME: example of fully inlining _set and adapter.fire
             # for critical path
             if _sa_initiator is not False and item is not None:
                 executor = getattr(self, '_sa_adapter', None)
@@ -780,21 +794,18 @@
                 executor.attr.fire_append_event(executor._owner(), item,
                                                 _sa_initiator)
             fn(self, item)
-        _tidy(append)
         return append
 
     def remove(fn):
         def remove(self, value, _sa_initiator=None):
             fn(self, value)
-            __del(self, value, _sa_initiator)
-        _tidy(remove)
+            _del(self, value, _sa_initiator)
         return remove
 
     def insert(fn):
         def insert(self, index, value):
-            __set(self, value)
+            _set(self, value)
             fn(self, index, value)
-        _tidy(insert)
         return insert
 
     def __setitem__(fn):
@@ -802,8 +813,8 @@
         def __setitem__(self, index, value):
             if not isinstance(index, slice):
                 existing = self[index]
                 if existing is not None:
-                    __del(self, existing)
-                __set(self, value)
+                    _del(self, existing)
+                _set(self, value)
                 fn(self, index, value)
             else:
                 # slice assignment requires __delitem__, insert, __len__
@@ -830,114 +841,101 @@
                            len(rng)))
                 for i, item in zip(rng, value):
                     self.__setitem__(i, item)
-        _tidy(__setitem__)
         return __setitem__
 
     def __delitem__(fn):
         def __delitem__(self, index):
             if not isinstance(index, slice):
                 item = self[index]
-                __del(self, item)
+                _del(self, item)
                 fn(self, index)
             else:
                 # slice deletion requires __getslice__ and a slice-groking
                 # __getitem__ for stepped deletion
                 # note: not breaking this into atomic dels
                 for item in self[index]:
-                    __del(self, item)
+                    _del(self, item)
                 fn(self, index)
-        _tidy(__delitem__)
         return __delitem__
 
     def __setslice__(fn):
         def __setslice__(self, start, end, values):
             for value in self[start:end]:
-
[patch truncated]
[sqlalchemy] Re: overriding collection methods
and no need for that __new__ replacement either - just use _list_decorators._funcs instead of _list_decorators()

On Monday 20 August 2007 17:05:32 svilen wrote:
> a patch, it got even tidier ;-) - no more _tidy() calls, all
> automated.
>
> On Monday 20 August 2007 16:41:30 svilen wrote:
> > a suggestion about _list_decorators() and similar: they can easily
> > be made into classes, i.e. non-dynamic (and
> > overloadable/patchable :-).
> >
> >     class _list_decorators(object):
[sqlalchemy] Re: Using 0.4 in anger
On Aug 19, 2007, at 9:13 AM, Paul Johnston wrote:
> Hi,
>
> Couple more things:
>
> 1) When specifying foreign_keys manually on a relation, you have to
> use table.c.column; table.column doesn't work.

yes, we haven't taken away the "c" attribute on Table; that only applies to class-mapped attributes.

> 2) I used to be able to do obj.delete() but now obj.query.delete()
> isn't available.

well, the idea is that you'd have your Session object hanging around, and you'd call Session.delete(obj). But I had mentioned to Mike Orr that I'd want to wait and see what the backlash is to my removing all the other assignmapper methods... so delete() might still be a good candidate to be restored.
[sqlalchemy] Re: Editing row objects
manipulating and holding onto RowProxy objects is a little squirrely. you probably want to convert them to dicts first:

    r = t.select().execute()
    for row in r:
        print dict(row)

On Aug 19, 2007, at 12:59 PM, Brendan Arnold wrote:
> hi there,
>
> i'm treating the row objects like dictionaries and i'd like to 'tag'
> an extra key/value pair on each entry.
>
> i have experimented with the __setattr__ function but this doesn't
> seem to be bulletproof. is there a way to do this?
>
> brendan
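The dict(row) conversion makes the tagging trivial, since dict() accepts extra keyword pairs at construction. A sketch with plain dicts standing in for the RowProxy objects a real result would yield:

```python
# stand-ins for rows fetched from t.select().execute()
rows = [{'id': 1, 'name': 'ann'}, {'id': 2, 'name': 'bob'}]

# dict(mapping, **extra) copies the row and tags the new pair in one step,
# leaving the original row untouched
tagged = [dict(row, flagged=(row['id'] > 1)) for row in rows]
```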
[sqlalchemy] Re: sqlalchemy with turbogears and mapper-part2 connect and select
Lukasz Szybalski ha scritto:
> Got another error here, but I guess it's fixed in a newer version of
> sqlalchemy via ticket 482.

Yes. I wasn't aware of 482 because I usually try to avoid table names that _must_ be escaped (mixed caps, reserved words, etc).

> > put assign_mapper() in place of mapper()
>
> Are these two the same? assign_mapper in TG is the same as mapper in
> SA?

No, SA provides both mapper and assign_mapper. assign_mapper does the work of the former, plus attaches several methods to your mapped classes and binds a default session context. SA 0.4 deprecates this and uses a scoped_session, but you should not worry about it now.

> How can I iterate through myuser fields?

It's "columns", btw. If you call them fields, baby Jesus cries.

    for column in myuser.c.keys():
        print getattr(myuser, column)

> Thanks for the help. I have moved ahead in these few emails more than
> I was able to in a week.

I know how it feels ;-)
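That column-iteration snippet relies only on getattr plus the list of column names; the same mechanics can be exercised against a stand-in object, since .c.keys() just yields the mapped column names (MyUser and the column list here are invented for the demo):

```python
class MyUser(object):
    """Stand-in for a mapped instance with one attribute per column."""
    def __init__(self, **kw):
        self.__dict__.update(kw)

columns = ['user_id', 'user_name']  # what myuser_table.c.keys() would yield
myuser = MyUser(user_id=7, user_name='lukasz')

# pull each column's value off the instance by name
values = dict((col, getattr(myuser, col)) for col in columns)
```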
[sqlalchemy] Re: overriding collection methods
On Monday 20 August 2007 17:29:52 jason kirtland wrote:
> [EMAIL PROTECTED] wrote:
> > hi i need to have a list collection with list.appender (in SA 0.4
> > terms) that accepts either one positional arg as the value, or
> > keyword args which it uses to create the value. Each collection
> > instance knows what type of values to create.
> > [...]
> > Any idea to fix/enhance this, letting **kwargs through to my
> > function? The dynamic wrapper() can do this, while these preset
> > ones cannot... while they should be equally powerful.
>
> Hi Svil,
>
>     @collections.appender
>     @collections.internally_instrumented
>     def append(self, obj=_NOTSET, **kargs):
>         ...
>
> > There are 2 (different) uses of an appender, one is the SA itself,
> > but the other is the programmer. SA will always use single
> > arg/positionals, while i could use this or that or combination.
>
> SQLAlchemy's appender doesn't have to be the programmer's appender.
> You can add a method solely for the orm's use if you like. That's one
> of the points of the decorator syntax: divorcing function names from
> the interface. If you want to keep 'append' for your own use, just
> tag another method as the @appender.

thanks for the suggestion. This would work if the default decorators were not force-wrapped anyway in that ABC decoration part; i'm looking now to see why that is so. And anyway, i need to first create the object and only then append it (the decorators will first fire the event on the object and only then call append(), that is, call me), so i may have to look further/deeper. Maybe i can make my append create objects first and then call the actual appender - so yes, this is the way.
[sqlalchemy] Re: overriding collection methods
svilen wrote:
> a suggestion about _list_decorators() and similar: they can easily be
> made into classes, i.e. non-dynamic (and overloadable/patchable :-).

The stdlib decorators end up in a static, module-level dictionary that can be manipulated if you want to. Wouldn't this be replacing a dict with some_cls.__dict__?
[sqlalchemy] Re: overriding collection methods
On Monday 20 August 2007 18:01:49 jason kirtland wrote:
> svilen wrote:
> > a suggestion about _list_decorators() and similar: they can easily
> > be made into classes, i.e. non-dynamic (and
> > overloadable/patchable :-).
>
> The stdlib decorators end up in a static, module-level dictionary
> that can be manipulated if you want to. Wouldn't this be replacing a
> dict with some_cls.__dict__?

well, more or less... i use similar function-made locals() namespaces a _lot_, and maybe that's why i avoid using them whenever i can - their contents are not easily changeable/inheritable/splittable... programmatically, piece by piece. whatever.
[sqlalchemy] Re: adjacency list: filtering relations eagerly
On Aug 20, 2007, at 5:35 AM, stephen emslie wrote:
> Hi. I am using a self-referential mapper to represent a multi-level
> tree of parent-child relationships. Typically I've been querying each
> parent for children that I am interested in. Up till now I have made
> a new query for each child that I am looking for, which doesn't seem
> like the most efficient way to approach this.
>
> I've found that 0.4's aliased joins are great for selecting parents
> based on their children, and can be eager-loaded to grab entire
> subtrees with a single query. However, each parent in my table can
> have many children (and trees can be many levels deep), so eager
> loading root nodes can be a bit slow too. Ideally I would like to be
> able to eager load only the children that were involved in the join,
> so that something like parent_instance.children, or equivalent, only
> loads children that were part of the original join rather than all of
> parent_instance's children.
>
> Is something like this possible? The closest that I've found is using
> add_entity after each join in my query.

so, for me to get this straight: say you are searching for a node N such that a child node, two levels deep, has an attribute "foo" which equals "bar":

    N.children.children.foo == "bar"

Then when you're done loading, you have N, and you'd like to load exactly one child node on N, but then the full set of children on *that* child.

The only thing we have which is like this (without writing out a query) is the eagerload option:

    query.options(eagerload('children.children'))

what that means is, the "children" attribute is lazy loading on the parent, but the "children" attribute on each of that parent's nodes is *eager* loading. but I don't think this is exactly what you want.

The other way is to construct your own eager-loading statement and set it up. But you'd have to set up contains_eager options all the way down the chain, as deep as you'd like nodes to be assembled. So *maybe* combining contains_eager() with eagerload() can do it, such as:

    my_eager_alias = mynodes.alias('mynodes')
    query = query.filter(...whatever criterion you're using...)
    query = query.from_statement(
        select([mynodes, my_eager_alias], query._criterion)).\
        options(contains_eager('children', alias=my_eager_alias)).\
        options(eagerload('children.children'))

That will set the "children" attribute to eagerly load against your my_eager_alias criterion and also set children.children to eager load. but the above is not something I've tested; it's pretty exotic.

I think there are probably some relatively simple options we could be adding to Query here to support what you're trying to do, such as:

    contains_eager('children', alias=myalias, add_columns=True)

which would automatically add the columns to the Query's SELECT statement without needing to construct your from_statement() like we're doing above. I'd be curious, though, if you could play around a little with the approach I just suggested to see if it's at all workable?

- mike
[sqlalchemy] Re: overriding collection methods
svilen wrote:
> And anyway, i need to first create the object and only then append it
> (the decorators will first fire the event on the object and only then
> call append(), that is, call me), so i may have to look
> further/deeper. Maybe i can make my append create objects first and
> then call the actual appender - so yes, this is the way.

Either way. The @internally_instrumented is there exactly for that flexibility on ORM interface methods like 'append', and as an override for ABC decoration on python interface methods. (The regular recipes also override ABC decoration.) Or you can do your work elsewhere and forward to an instrumented method for event service.
[sqlalchemy] Re: Ascii codec instead of unicode ?
Arun wrote:
> So in short, if I specify use_unicode=True at the SA engine level
> then I can skip specifying use_unicode and specify only charset=utf8
> at the MySQLdb level?

If you configure this DB-API driver for all-Unicode (which is what happens when you only give it a 'charset'), all strings will come back from the database to SQLAlchemy as Unicode. You can ask the Engine and/or types to convert_unicode=True, but it won't do anything except add processing overhead - the strings are already Unicode from the driver.

Try playing with the following to find a combination that suits your needs. The first two engine configurations aren't options for you obviously, but they make a good demo.

    from sqlalchemy import *

    e = create_engine('mysql:///test')
    #e = create_engine('mysql:///test', convert_unicode=True)
    #e = create_engine('mysql:///test?charset=utf8')
    #e = create_engine('mysql:///test?charset=utf8',
    #                  convert_unicode=True)
    #e = create_engine('mysql:///test?charset=utf8&use_unicode=0')
    #e = create_engine('mysql:///test?charset=utf8&use_unicode=0',
    #                  convert_unicode=True)

    m = MetaData(e)
    t = Table('unicodings', m,
              Column('string', String(32)),
              Column('unicode', Unicode(32)))
    if not t.exists():
        t.create()

    t.insert().execute({'string': 'foo', 'unicode': u'bar'})
    print repr(list(t.select().execute()))
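The mechanics underneath all of these switches are plain encode/decode; what use_unicode and convert_unicode really toggle is *who* performs the step below (shown in modern Python 3 terms, where str is the Unicode type):

```python
s = 'caf\u00e9'            # what the application wants to handle (Unicode)
wire = s.encode('utf8')    # what travels to/from the server under charset=utf8
back = wire.decode('utf8') # the conversion use_unicode / convert_unicode performs
```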
[sqlalchemy] Re: overriding collection methods
On Monday 20 August 2007 18:09:41 jason kirtland wrote:
> svilen wrote:
> > And anyway, i need to first create the object and only then append
> > it (the decorators will first fire the event on the object and only
> > then call append(), that is, call me), so i may have to look
> > further/deeper. Maybe i can make my append create objects first and
> > then call the actual appender - so yes, this is the way.
>
> Either way. The @internally_instrumented is there exactly for that
> flexibility on ORM interface methods like 'append', and as an
> override for ABC decoration on python interface methods. (The
> regular recipes also override ABC decoration.) Or you can do your
> work elsewhere and forward to an instrumented method for event
> service.

an example on this? i can't figure it out; whatever i do, that ABC-auto-decorators loop kicks in and bye-bye my nice append - even if the appender is not append() at all. Why should append() be instrumented regardless of whether or not it is the used appender?

@internally_instrumented - u mean my append() should do everything, even the firing of events? uhm, ugly for my simple case. and are there some fire-this-event() funcs? those __del()/__set() do not seem very exposed...

    class _NotSet: pass

    class MyCollection(list):
        factory = None
        def append(me, obj=_NotSet, **kwargs):
            if obj is _NotSet:
                print 'success!', kwargs
                obj = me.factory(**kwargs)
            me._append(obj)
            return obj

        @sqlalchemy.orm.collections.collection.appender
        def _append(me, *a, **k):
            return list.append(me, *a, **k)

    m = mapper(A, ..., rel=relation(collection_class=MyCollection), ...)
    a = A()
    a.rel.append(key1=val1, key2=val2)
    # this always fails with a TypeError, as the append called is not
    # mine but a wrapped/instrumented one, and that one takes no kwargs

btw, i noted the sequence fire-event() / original-func-call() is not consistent in those auto-decorators, e.g. for remove() and del() etc.
[sqlalchemy] Re: overriding collection methods
On Monday 20 August 2007 20:58:45 [EMAIL PROTECTED] wrote:
> On Monday 20 August 2007 18:09:41 jason kirtland wrote:
> > The @internally_instrumented is there exactly for that flexibility
> > on ORM interface methods like 'append', and as an override for ABC
> > decoration on python interface methods. (The regular recipes also
> > override ABC decoration.) Or you can do your work elsewhere and
> > forward to an instrumented method for event service.
>
> an example of this? i can't figure it out; whatever i do, that
> ABC-auto-decorator loop kicks in and bye-bye my nice append - even if
> the appender is not append() at all. Why should append() be
> instrumented regardless of whether it is the used appender?

so @internally_instrumented prevents any automatic wrapping - ok, i
got it. (The docstrings in collections.py should make their way into
the normal docs...)

it is still not at all obvious why, if there is a chosen appender, the
default one (append() or whatever) has to be instrumented too...
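[Editorial note] A sketch of the opt-out mechanism being discussed, assuming the `_sa_instrumented` marker semantics visible in orm/collections.py; the helper names here (`event_wrapper`, `instrument_class`, `KwargCollection`) are hypothetical stand-ins, not SQLAlchemy's real implementation:

```python
# @internally_instrumented only plants a marker attribute; the
# instrumentation pass skips any method already carrying it, mirroring
# the hasattr(fn, '_sa_instrumented') checks in orm/collections.py.

def internally_instrumented(fn):
    fn._sa_instrumented = True          # marker: "leave this method alone"
    return fn

def event_wrapper(fn):
    # models the preset decorators: fixed signature, events fired around fn
    def wrapper(self, item, _sa_initiator=None):
        return fn(self, item)
    wrapper._sa_instrumented = True
    return wrapper

def instrument_class(cls):
    # wrap interface methods unless they are already marked
    for name in ('append',):
        fn = getattr(cls, name, None)
        if fn is not None and not getattr(fn, '_sa_instrumented', False):
            setattr(cls, name, event_wrapper(fn))

class KwargCollection(list):
    @internally_instrumented            # opted out: keeps its **kwargs
    def append(self, obj=None, **kw):
        list.append(self, obj if obj is not None else kw)
        return obj if obj is not None else kw

instrument_class(KwargCollection)
c = KwargCollection()
print(c.append(key1=1))                 # -> {'key1': 1}: no wrapper in the way
```

The trade-off Jason describes follows directly: once a method opts out, it must arrange for event firing itself (or forward to an instrumented method).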
[sqlalchemy] Re: overriding collection methods
[EMAIL PROTECTED] wrote:
> an example of this? i can't figure it out; whatever i do, that
> ABC-auto-decorator loop kicks in and bye-bye my nice append - even if
> the appender is not append() at all.

For your example, something like:

    class MyCollection(list):
        @collection.internally_instrumented
        def append(self, options, **kw):
            # do factory stuff, then forward to _append
            new_obj = factory_stuff(options, **kw)
            # _append is the elected appender and fires the events
            self._append(new_obj)

        @collection.appender
        def _append(self, item):
            ...

But tacking a factory method onto a regular Python list is much
simpler, with a separation of concerns:

    class FactoryCollection(list):
        def create(self, options, **kw):
            obj = factory_stuff(options, **kw)
            self.append(obj)
            return obj

No decorators needed. For a while there was a no-op decorator that did
the same job as @internally_instrumented in the first example, just
with a different name for clarity. It could easily come back if this
pattern becomes common - I yanked it after working with the collections
for a while and finding the second form much more common in my work.

> Why should append() be instrumented regardless of whether it is the
> used appender?
If you use an object with a Python list interface, all the list
interface methods will be instrumented so that relation collections
have natural, Pythonic behavior. If you don't want automatic
instrumentation on list methods, you don't have to have it - see the
0.4 docs for how to opt out via __emulates__.
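[Editorial note] A runnable, SQLAlchemy-free version of the FactoryCollection pattern from the reply above. `Item` is a hypothetical stand-in for the mapped class each collection instance would build; under real mapping, `append` would be the instrumented list method firing the ORM events:

```python
# Separation of concerns: factory logic lives on a distinct create()
# method, so the list's own append() keeps its normal one-argument
# signature and can be instrumented without any signature clash.

class Item:
    def __init__(self, **kw):
        self.__dict__.update(kw)

class FactoryCollection(list):
    factory = Item   # the class this collection knows how to build

    def create(self, **kw):
        obj = self.factory(**kw)
        self.append(obj)   # plain list.append; SA would instrument this
        return obj

coll = FactoryCollection()
item = coll.create(key1=1, key2=2)
print(item.key1, len(coll))   # -> 1 1
```

Distinct per-collection factories fall out naturally: subclass, or set `factory` on the instance, as the original poster's `myCollectionFactory` classmethod does.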
[sqlalchemy] Re: overriding collection methods
> But tacking a factory method onto a regular Python list is much
> simpler, with a separation of concerns:
>
>     class FactoryCollection(list):
>         def create(self, options, **kw):

eh, sorry, i want it the hard way... now as i think of it, it's just
me being lazy and fancy - preferring implicitness and fewer method
names to remember.

> For a while there was a no-op decorator that did the same job as
> @internally_instrumented in the first example, just with a different
> name for clarity. It could easily come back if this pattern becomes
> common - I yanked it after working with the collections for a while
> and finding the second form much more common in my work.

@do_not_instrument might be a better name - but regardless of the
name, if the opting-out concept is not advertised, one does not know
what to look for...
[sqlalchemy] Newbie question: Too many connections
Hi,

I keep getting the error:

    DBAPIError: (Connection failed) (OperationalError)
    (1040, 'Too many connections')

At no time do I really need more than a few connections, so I must be
doing something basically wrong. From reading the docs I concluded
that pooling in the simple cases is automatic; I must have
misunderstood.

Questions:
1) What is the suggested method to debug such a problem, e.g. to
   retrieve the number of open connections?
2) When is a connection closed? For that matter, when is it opened,
   and when is it taken from the pool?

TIA
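[Editorial note] A SQLAlchemy-free sketch of the usual cause of MySQL error 1040: connections that are checked out but never returned keep occupying slots until the server-side limit is hit. `TinyPool` is a hypothetical toy, not SQLAlchemy's pool implementation; in SQLAlchemy, returning a connection is what `conn.close()` (or closing the Session) does:

```python
# Each checkout that is never checked back in permanently occupies a
# slot; once the limit is reached, further checkouts fail, just like
# MySQL's 'Too many connections'.

class TooManyConnections(Exception):
    pass

class TinyPool:
    def __init__(self, max_conn=3):
        self.max_conn = max_conn
        self.idle = []     # connections available for reuse
        self.total = 0     # connections ever opened

    def checkout(self):
        if self.idle:
            return self.idle.pop()          # reuse a returned connection
        if self.total >= self.max_conn:
            raise TooManyConnections("1040: Too many connections")
        self.total += 1
        return object()                     # stand-in for a DBAPI connection

    def checkin(self, conn):
        self.idle.append(conn)              # what conn.close() effectively does

pool = TinyPool(max_conn=3)

# leak pattern: checkouts without checkins exhaust the pool
leaked = [pool.checkout() for _ in range(3)]
try:
    pool.checkout()
except TooManyConnections as e:
    print('exhausted:', e)

# returning a connection makes checkout work again
pool.checkin(leaked.pop())
assert pool.checkout() is not None
```

So the things to audit are: every `engine.connect()` paired with a `close()`, and whether multiple engines (each with its own pool) are being created where one was intended.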