Re: [sqlalchemy] newest address for each user

2013-10-21 Thread Sebastian Elsner
Thank you for the help! Additionally, I was wondering if it would be
possible to make newest_address an attribute on the User class, which can
then be used in a query with .options(joinedload('newest_address')).
My goal is an attribute that returns the newest address datetime on
normal access and can also be eagerly loaded/joined in a query object.
I read up in the docs and either column_property or the Correlated
Subquery Relationship Hybrid seems to be made for this. Which one
should I use?



On 20.10.2013 04:41, Michael Bayer wrote:
 On Oct 19, 2013, at 4:24 PM, Sebastian Elsner sebast...@risefx.com wrote:

 Hello,

 using the Address and User example, where the Address is connected to
 the User via a many-to-many relationship, I want to get all users with
 the date of their newest address. This is what I have now:

 s.query(User, s.query(func.max(Address.created)).\
 filter(Address.users.any()).correlate(User).as_scalar()).\
outerjoin(User.addresses).all()

 But this is giving me all users with the newest address in the whole
 address table. I think the error is in the subquery's filter, but I fail
 to see how I can fix it. I am also not tied to this query, so if you
 know a better way to get a list of all Users and their newest address
 date, shoot!
 the format for this is to select user rows plus an aggregate of a related 
 table; this format is illustrated here: 
 http://docs.sqlalchemy.org/en/rel_0_8/orm/tutorial.html#using-subqueries  
 where we illustrate the count of address rows per user. 

 I see here, though, that you have an association table in between them, so 
 that just has to be added to the subquery to create a row that goes across 
 Address and UserAddresses. Same idea though: use a subquery with aggregate + 
 group_by, then (outer) join to that:

 subq = session.query(
     func.max(Address.created).label('created'),
     UserAddresses.user_id).join(UserAddresses).\
     group_by(UserAddresses.user_id).subquery()

 q = session.query(User, subq.c.created).outerjoin(subq)
 print q.all()
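
 If the join condition between the subquery and User cannot be inferred from
 the useraddress foreign keys, a variant with an explicit ON clause should
 work as well (a sketch added for illustration, not part of the original
 reply; it assumes the subq and session names from the snippet above):

 # same query, spelling out how the subquery joins back to User
 q = session.query(User, subq.c.created).\
     outerjoin(subq, subq.c.user_id == User.id)
 print q.all()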




 Here is a working example. As you can see if you run it, even Users with
 no Addresses assigned will get the newest address date in the query.

import datetime
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.engine import create_engine
from sqlalchemy.orm.session import sessionmaker
from sqlalchemy.schema import Column, ForeignKey
from sqlalchemy.types import Integer, DateTime, String
from sqlalchemy.orm import relationship
from sqlalchemy.sql.expression import func

Base = declarative_base()


class Address(Base):
    __tablename__ = 'address'
    id = Column(Integer, primary_key=True)
    created = Column(DateTime)
    users = relationship('User', back_populates='addresses',
                         secondary='useraddress')

    def __repr__(self):
        return "Address: %s, %s" % (self.id, self.created)


class User(Base):
    __tablename__ = 'user'
    id = Column(Integer, primary_key=True)
    name = Column(String)
    addresses = relationship('Address', back_populates='users',
                             secondary='useraddress')

    def __repr__(self):
        return "User: " + self.name


class UserAddresses(Base):
    __tablename__ = 'useraddress'
    user_id = Column(Integer, ForeignKey('user.id'), primary_key=True)
    address_id = Column(Integer, ForeignKey('address.id'), primary_key=True)


engine = create_engine('sqlite://')
Base.metadata.create_all(engine)
session = sessionmaker(engine)()

u1 = User(name="Foo")
u2 = User(name="Bar")
u1.addresses.append(Address(created=datetime.datetime.now()))
u1.addresses.append(Address(created=datetime.datetime.now() -
                            datetime.timedelta(days=1)))
session.add(u1)
session.add(u2)
session.commit()
print u1, u1.addresses
print u2, u2.addresses
print session.query(User,
    session.query(func.max(Address.created)).filter(
        Address.users.any()).correlate(User).as_scalar()).\
    outerjoin(User.addresses).all()

 Cheers

 Sebastian



Re: [sqlalchemy] Occasional InvalidRequestError: Table 'xxx' is already defined for this MetaData instance.

2013-10-21 Thread Joseph Casale
Hi Michael,
I spoke with the developer of the application that launches the plugins I
am writing and each
python script is run by starting an entirely new process with
the interpreter and each plugin.

It should be a clean environment. From the plugins, I simply import which
ever tables from
the module below that I would need.

In the event I am doing something wrong, my code looks like:

from os import path
from sqlalchemy import (
    Column, CheckConstraint, DDL, ForeignKey, Index, Integer, String,
    create_engine, event
)
from sqlalchemy.engine import Engine
from sqlalchemy.ext.declarative import declared_attr, declarative_base
from sqlalchemy.orm import sessionmaker


db_name = path.join('path', 'config', 'sqlite.db')
engine = create_engine('sqlite:///{}'.format(db_name), echo=False)
Session = sessionmaker(bind=engine, autoflush=False, autocommit=False)
Base = declarative_base()


@event.listens_for(Engine, 'connect')
def set_sqlite_pragma(dbapi_connection, connection_record):
    cursor = dbapi_connection.cursor()
    cursor.execute('PRAGMA foreign_keys=ON')
    cursor.close()


# table defs...


if __name__ == '__main__':
    Base.metadata.create_all(engine)
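
For context, a rough sketch of how one of the plugin scripts consumes this
module (the module name appears later in the thread; SomeTable and the rest
are placeholders, not taken from the real application). Each plugin process
imports the module fresh, so the declarative classes and MetaData are built
once per interpreter:

# hypothetical plugin script
from idm_config import Session, SomeTable  # SomeTable is a placeholder name

session = Session()
rows = session.query(SomeTable).all()
session.close()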


Thanks so much for the help,
jlc


On Sun, Oct 20, 2013 at 11:01 PM, Michael Bayer mike...@zzzcomputing.com wrote:

 the MetaData object holds one Table object per unique name given.   If you
 use the Table constructor more than once with the same name and the same
 MetaData, you get this error.

 That's how the error is caused; the fact that the error is "occasional"
 points strongly to a race condition of some kind: more than one thread
 calling the constructor for Table with the same name.   Patterns that could
 cause this include some kind of unsynchronized global registry or singleton
 object that, when called, produces a new Table object.

 The recommended pattern is for Table objects (as well as mapped classes)
 to generally be declared at the module level, so that these names are
 produced only at module import time, which itself should occur before the
 application starts any threads in addition to the main application thread.
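
 A minimal illustration of the failure mode described above (not code from
 the thread; the table name is borrowed from the traceback that appears later
 in this thread): constructing the same Table name twice against one MetaData
 raises InvalidRequestError, while a single module-level definition, or
 extend_existing=True, avoids it.

 from sqlalchemy import MetaData, Table, Column, Integer

 metadata = MetaData()
 Table('attribute_config', metadata, Column('id', Integer, primary_key=True))

 # constructing the same name again with columns, against the same MetaData,
 # raises InvalidRequestError unless extend_existing=True is passed
 Table('attribute_config', metadata, Column('id', Integer, primary_key=True))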




 On Oct 20, 2013, at 11:04 PM, Joseph Casale jcas...@gmail.com wrote:

 I have a module that is imported by several Python scripts run by an
 application that fires up a new interpreter session for each invocation.
 When not under load, or when running the code manually at the CLI, things
 work fine, but once the concurrency rises and the application starts seeing
 some load it emits InvalidRequestError exceptions on one table.

 After searching I am not sure the results relate to my issue, or maybe my
 lack of familiarity with SQLAlchemy has gotten the better of me.

 Any guidance would be greatly appreciated,
 jlc







[sqlalchemy] qualified/disambiguated column names in query

2013-10-21 Thread lars van gemerden
Hello,

Say that I have a table with reports with a column 'title' and a one-to-many
relationship 'chapters' to chapters, also with a column 'title'. If I join
these tables in a query, like:

q = self.session.query(Report).join(Report.chapters).add_columns(
    Report.title, Chapter.title, Chapter.text)
for p in q.all():
    print p.title

then print p.title prints the title of the chapter.

Is there a way to let query.all() return NamedTuples with
qualified/disambiguated names (either 'title' and 'chapters.title', or
'Report.title' and 'Chapter.title', or perhaps with underscores)? Right now
I see no way to distinguish the columns (apart maybe from the order).

p.keys() returns 'title' twice.

Cheers, Lars

PS: I am using sqla 0.7.5



Re: [sqlalchemy] Occasional InvalidRequestError: Table 'xxx' is already defined for this MetaData instance.

2013-10-21 Thread Michael Bayer
how about a stack trace?







Re: [sqlalchemy] qualified/disambiguated column names in query

2013-10-21 Thread Michael Bayer


use labels:

add_columns(Report.title.label('report_title'),
            Chapter.title.label('chapter_title'))
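
With the labels applied, each row in the result exposes distinct keys (a
quick sketch assuming the Report/Chapter mapping from the question; not part
of the original reply):

q = self.session.query(Report).join(Report.chapters).add_columns(
    Report.title.label('report_title'),
    Chapter.title.label('chapter_title'),
    Chapter.text)
for p in q.all():
    # the labeled columns are now addressable by their label names
    print p.report_title, p.chapter_title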




 





Re: [sqlalchemy] newest address for each user

2013-10-21 Thread Michael Bayer
if we're talking about just the timestamp, then that would be a
column_property(), and if you don't want it to load normally it would go
under a deferred().

An analogue of the subquery example using count() is here: 
http://docs.sqlalchemy.org/en/rel_0_8/orm/mapper_config.html#using-column-property

again, you can adapt this to look like your max() + association table:


class User(Base):
    __tablename__ = 'user'
    id = Column(Integer, primary_key=True)

    address_created = column_property(
        select([func.max(Address.created)]).\
            where(Address.id == UserAddresses.address_id).\
            where(UserAddresses.user_id == id).\
            correlate_except(Address, UserAddresses)
    )
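
If the value should only load on demand, as mentioned above, the same
column_property can be marked deferred and then requested per query. This is
a sketch added for illustration, assuming the deferred flag of
column_property and the undefer query option together with the mapping shown
above; it is not code from the original reply:

from sqlalchemy.orm import column_property, undefer

# inside the User class body shown above:
address_created = column_property(
    select([func.max(Address.created)]).\
        where(Address.id == UserAddresses.address_id).\
        where(UserAddresses.user_id == id).\
        correlate_except(Address, UserAddresses),
    deferred=True  # do not load this column unless asked for
)

# then opt in when the value is wanted for a particular query:
# session.query(User).options(undefer('address_created')).all()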


Re: [sqlalchemy] newest address for each user

2013-10-21 Thread Sebastian Elsner
This is just amazing! I was tinkering with the select statement, but
using correlate_except would never have come to my mind. Thank you!

Re: [sqlalchemy] Occasional InvalidRequestError: Table 'xxx' is already defined for this MetaData instance.

2013-10-21 Thread Joseph Casale
Hey Michael,
For reasons beyond my control at the moment, logging was utterly butchered
in a completely incomprehensible way by another developer.

Traceback (most recent call last):
  File "D:\app_folder\plugin\idm_rw.py", line 6, in <module>
    from idm_config import Attribute, Session
  File "D:\app_folder\include\idm_config.py", line 478, in <module>
    class PortalConfig(Base, EnvironmentMixin):
  File "D:\Python33\lib\site-packages\sqlalchemy\ext\declarative\api.py", line 50, in __init__
    _as_declarative(cls, classname, cls.__dict__)
  File "D:\Python33\lib\site-packages\sqlalchemy\ext\declarative\base.py", line 222, in _as_declarative
    **table_kw)
  File "D:\Python33\lib\site-packages\sqlalchemy\schema.py", line 319, in __new__
    "existing Table object." % key)
sqlalchemy.exc.InvalidRequestError: Table 'attribute_config' is already
defined for this MetaData instance.  Specify 'extend_existing=True' to
redefine options and columns on an existing Table object.

I think that's what it looked like before he had his way :)

Thanks!
jlc



Re: [sqlalchemy] Occasional InvalidRequestError: Table 'xxx' is already defined for this MetaData instance.

2013-10-21 Thread Michael Bayer

OK, what's idm_rw.py, and is it the only module in the app that specifies a
table named attribute_config? Is idm_rw.py always imported right when the
app starts up, or is there some kind of loading going on?


