[sqlalchemy] Object inheritance

2012-02-22 Thread Andrea
Hi all, I have some objects in a pre-existing model. Now we want to add a persistence layer, and SQLAlchemy/SQLite will be our choice. When I add an object to the session, an UnmappedInstanceError is raised: Class 'try_sqlalchemy.example2.applib_model.DescriptorBean' is mapped, but this instance lacks

[sqlalchemy] Re: Object inheritance

2012-02-22 Thread Andrea
Another update! Maybe the bad thing is overriding self.__dict__. Now I set the values without replacing it all, and it seems to work: class _Struct(dict): def __init__(self, **kw): dict.__init__(self, kw) for k, v in kw.iteritems(): self.__dict__[k] = v On Feb 22, 1:25 pm,
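A readable restatement of that workaround, as a hedged sketch in the Python 2 idiom of the post (not Andrea's exact file): the keywords are copied into the instance dict one at a time instead of rebinding self.__dict__ wholesale.

    class _Struct(dict):
        def __init__(self, **kw):
            dict.__init__(self, kw)
            # copy each keyword into the instance dict individually,
            # rather than replacing self.__dict__ and discarding
            # whatever state already lives there
            for k, v in kw.iteritems():
                self.__dict__[k] = v

As Michael's reply below notes, a dict subclass cannot be mapped at all, so this only papers over part of the problem.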

Re: [sqlalchemy] Re: Working with large IN lists

2012-02-22 Thread Michael Bayer
When we want to test if a Python program has a leak, we do that by seeing how many uncollected objects are present. This is done via gc: import gc; print "total number of objects:", len(gc.get_objects()) That's the only real way to measure whether the memory used by Python objects is growing
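A minimal sketch of that check; do_work() is a hypothetical stand-in for one iteration of the application (e.g. one processed transaction), and the gc.collect() call is an assumption so that only objects the collector could not reclaim are counted.

    import gc

    def count_objects():
        gc.collect()                    # drop anything collectable first
        return len(gc.get_objects())    # objects still tracked by the GC

    baseline = count_objects()
    for i in range(10):
        do_work()                       # placeholder for the real workload
        current = count_objects()
        print "iteration %d: %d objects (+%d)" % (i, current, current - baseline)

A count that keeps climbing across iterations points at a reference leak; a count that levels off suggests the growth is happening elsewhere (caches, the C heap, fragmentation).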

Re: [sqlalchemy] Object inheritance

2012-02-22 Thread Michael Bayer
A few things: 1. the Python dict class cannot be mapped. Classes can only extend from object or other classes that in turn extend from object. 2. SQLAlchemy instrumentation relies upon Python descriptors (see http://docs.python.org/howto/descriptor.html) to intercept changes in state on an
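A minimal declarative sketch of both points, assuming the 0.7-era SQLAlchemy API (the DescriptorBean name is taken from the error in the first post, not from Andrea's actual model): the class derives from the declarative Base, which is a new-style class, and the Column attributes are the descriptors that intercept state changes.

    from sqlalchemy import Column, Integer, String, create_engine
    from sqlalchemy.ext.declarative import declarative_base
    from sqlalchemy.orm import sessionmaker

    Base = declarative_base()          # new-style base derived from object

    class DescriptorBean(Base):
        __tablename__ = 'descriptor_bean'
        id = Column(Integer, primary_key=True)
        name = Column(String(50))      # Column descriptors instrument attribute access

    engine = create_engine('sqlite://')
    Base.metadata.create_all(engine)
    session = sessionmaker(bind=engine)()
    session.add(DescriptorBean(name='example'))   # instance carries instrumentation
    session.commit()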

Re: [sqlalchemy] Re: Working with large IN lists

2012-02-22 Thread Vlad K.
Hi, thanks for your reply. I haven't yet tested this with a profiler to see exactly what is happening, but the bottom line is that the overall memory use grows with each iteration (or transaction processed), to the point of grinding the server to a halt, and top shows only the

Re: [sqlalchemy] Re: Working with large IN lists

2012-02-22 Thread Vlad K.
Yes, definitely growing at a rate of 700-800 objects per iteration. .oO V Oo. On 02/22/2012 07:23 PM, Michael Bayer wrote: When we want to test if a Python program has a leak, we do that by seeing how many uncollected objects are present. This is done via gc: import gc print total number of

[sqlalchemy] problem with dynamic tables/classes and inheritance

2012-02-22 Thread lars van gemerden
I am trying to generate tables/classes dynamically. The code below is my latest attempt, but I cannot get it to work: class TableName(object): @declared_attr def __tablename__(cls): return cls.__name__ class Inherit(object):
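A hedged sketch of the mixin pattern the snippet is reaching for, assuming the 0.7-era declarative API; the TableName mixin is from the post, while Person/Engineer are illustrative classes showing joined-table inheritance on top of it (the post's Inherit mixin is truncated, so it is not reproduced here).

    from sqlalchemy import Column, Integer, String, ForeignKey
    from sqlalchemy.ext.declarative import declarative_base, declared_attr

    Base = declarative_base()

    class TableName(object):
        # every mapped class derives its table name from its class name
        @declared_attr
        def __tablename__(cls):
            return cls.__name__.lower()

    class Person(TableName, Base):
        id = Column(Integer, primary_key=True)
        name = Column(String(50))
        discriminator = Column('type', String(20))
        __mapper_args__ = {'polymorphic_on': discriminator,
                           'polymorphic_identity': 'person'}

    class Engineer(Person):
        # joined-table inheritance: the subclass gets its own table ('engineer')
        # whose primary key is also a foreign key to the base table
        id = Column(Integer, ForeignKey('person.id'), primary_key=True)
        specialty = Column(String(50))
        __mapper_args__ = {'polymorphic_identity': 'engineer'}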

Re: [sqlalchemy] Re: Working with large IN lists

2012-02-22 Thread Claudio Freire
On Wed, Feb 22, 2012 at 4:29 PM, Michael Bayer mike...@zzzcomputing.com wrote: thanks for your reply. I haven't yet tested this with a profiler to see exactly what is happening, but the bottom line is that the overall memory use grows with each iteration (or transaction processed), to

Re: [sqlalchemy] Re: Working with large IN lists

2012-02-22 Thread Michael Bayer
On Feb 22, 2012, at 2:46 PM, Claudio Freire wrote: On Wed, Feb 22, 2012 at 4:29 PM, Michael Bayer mike...@zzzcomputing.com wrote: thanks for your reply. I haven't yet tested this with a profiler to see exactly what is happening, but the bottom line is that the overall memory use

Re: [sqlalchemy] Re: Working with large IN lists

2012-02-22 Thread Michael Bayer
On Feb 22, 2012, at 3:28 PM, Claudio Freire wrote: Like I said, it's not a leak situation so much as a fragmentation situation, where long-lived objects in high memory positions can prevent the process's heap from shrinking. [0]

Re: [sqlalchemy] Re: Working with large IN lists

2012-02-22 Thread Claudio Freire
On Wed, Feb 22, 2012 at 5:40 PM, Michael Bayer mike...@zzzcomputing.com wrote: Saw that a bit, but looking at the tips at the bottom, concrete implementation changes are not coming to mind. An eternal structure is ubiquitous in any programming language. sys.modules is a big list of all the

Re: [sqlalchemy] Re: Working with large IN lists

2012-02-22 Thread Claudio Freire
On Wed, Feb 22, 2012 at 5:51 PM, Claudio Freire klaussfre...@gmail.com wrote: Such caches, for instance, are better given a limited lifespan (say, making them expire, or actively cleaning them from time to time). Structures that are truly required to be eternal are
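A hedged sketch of the kind of bounded cache being described; the TTLCache name, the ttl parameter and the prune() helper are illustrative, not from the thread.

    import time

    class TTLCache(object):
        # dict-like cache whose entries expire after ttl seconds
        def __init__(self, ttl=300):
            self.ttl = ttl
            self._data = {}            # key -> (timestamp, value)

        def get(self, key, default=None):
            entry = self._data.get(key)
            if entry is None:
                return default
            stamp, value = entry
            if time.time() - stamp > self.ttl:
                del self._data[key]    # expired: drop it so it can be reclaimed
                return default
            return value

        def set(self, key, value):
            self._data[key] = (time.time(), value)

        def prune(self):
            # the "actively cleaning them from time to time" part
            now = time.time()
            for key in list(self._data):
                if now - self._data[key][0] > self.ttl:
                    del self._data[key]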

Re: [sqlalchemy] Re: Working with large IN lists

2012-02-22 Thread Michael Bayer
On Feb 22, 2012, at 3:51 PM, Claudio Freire wrote: On Wed, Feb 22, 2012 at 5:40 PM, Michael Bayer mike...@zzzcomputing.com wrote: Saw that a bit, but looking at the tips at the bottom, concrete implementation changes are not coming to mind. An eternal structure is ubiquitous in any

Re: [sqlalchemy] Re: Working with large IN lists

2012-02-22 Thread Claudio Freire
On Wed, Feb 22, 2012 at 6:21 PM, Michael Bayer mike...@zzzcomputing.com wrote: IMHO the whole point of using a high-level, interpreted language like Python is that we don't have to be bogged down thinking like C programmers. How come I've never had a memory fragmentation issue before?

Re: [sqlalchemy] Re: Working with large IN lists

2012-02-22 Thread Vlad K.
Okay, thanks to this article: http://neverfear.org/blog/view/155/Investigating_memory_leaks_in_Python I made a similar plot of object counts over time, showing the top 50 types. The resulting PDF is here (you might wish to download it first, Google messes it up for me):
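A hedged sketch of how such per-type counts can be collected (the linked article takes a similar approach); collections.Counter assumes Python 2.7+, and the printing loop is illustrative, standing in for whatever records the series for plotting.

    import gc
    from collections import Counter

    def top_object_types(n=50):
        gc.collect()
        # tally every live object the GC tracks, keyed by type name
        counts = Counter(type(obj).__name__ for obj in gc.get_objects())
        return counts.most_common(n)

    # call once per iteration and record the results to build the
    # per-type time series shown in the plot
    for name, count in top_object_types(10):
        print "%-30s %d" % (name, count)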

Re: [sqlalchemy] Re: Working with large IN lists

2012-02-22 Thread Michael Bayer
On Feb 22, 2012, at 6:36 PM, Vlad K. wrote: Okay, thanks to this article: http://neverfear.org/blog/view/155/Investigating_memory_leaks_in_Python I made a similar plot of object counts over time, showing the top 50 types. The resulting PDF is here (you might wish to download it first,