[sqlalchemy] TLS 1.2
I am connecting to a MySQL database using SQLAlchemy, and TLS 1.2 is enforced server-side. I connect by setting my URI to the correct connection string and appending the ?ssl=true flag. However, I am getting an error that the TLS version does not meet the server's minimum requirements. My question is: how do I force SQLAlchemy to connect using TLS 1.2? App env: Python 3.6 using Flask.

--
SQLAlchemy - The Python SQL Toolkit and Object Relational Mapper
http://www.sqlalchemy.org/
To post example code, please provide an MCVE: Minimal, Complete, and Verifiable Example. See http://stackoverflow.com/help/mcve for a full description.
You received this message because you are subscribed to the Google Groups "sqlalchemy" group.
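TLS settings are handled by the DBAPI driver, not by SQLAlchemy itself, so one common approach is to pass them through create_engine's connect_args instead of the URL flag. A minimal sketch, assuming the PyMySQL driver (the URL, credentials, and host below are placeholders, not from the original post):

```python
import ssl

# Build a client-side SSL context restricted to TLS 1.2
# (available on Python 3.6 as ssl.PROTOCOL_TLSv1_2).
ctx = ssl.SSLContext(ssl.PROTOCOL_TLSv1_2)
ctx.load_default_certs()

# Hand the context to the DBAPI driver; for PyMySQL the keyword is "ssl".
connect_args = {"ssl": ctx}

# Then (hypothetical URL):
# engine = create_engine("mysql+pymysql://user:pass@host/db",
#                        connect_args=connect_args)
```

Other drivers (mysqlclient, mysql-connector) take different ssl-related keys, so check the driver's own documentation for the exact names.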
Re: many to many relation with foreign key to composite primary key
> sa.PrimaryKeyConstraint('id'),
> sa.UniqueConstraint('email')
> )
> op.create_table('roles_permissions',
>     sa.Column('permissions_id', sa.BigInteger(), nullable=False),
>     sa.Column('roles_id', sa.BigInteger(), nullable=False),
>     sa.ForeignKeyConstraint(['permissions_id'], ['permissions.id'], ),
>     sa.ForeignKeyConstraint(['roles_id'], ['roles.id'], ),
>     sa.PrimaryKeyConstraint('permissions_id', 'roles_id')
> )
> op.create_table('users_permissions',
>     sa.Column('users_id', sa.BigInteger(), nullable=False),
>     sa.Column('permissions_id', sa.BigInteger(), nullable=False),
>     sa.ForeignKeyConstraint(['permissions_id'], ['permissions.id'], ),
>     sa.ForeignKeyConstraint(['users_id'], ['users.id'], ),
>     sa.PrimaryKeyConstraint('users_id', 'permissions_id')
> )
> op.create_table('users_roles',
>     sa.Column('users_id', sa.BigInteger(), nullable=False),
>     sa.Column('roles_id', sa.BigInteger(), nullable=False),
>     sa.ForeignKeyConstraint(['roles_id'], ['roles.id'], ),
>     sa.ForeignKeyConstraint(['users_id'], ['users.id'], ),
>     sa.PrimaryKeyConstraint('users_id', 'roles_id')
> )
>
> then I run upgrade, SQL script output matches what you specified:
>
> INFO  [alembic.runtime.migration] Running upgrade  -> 678e61b9e311, rev1
> INFO  [sqlalchemy.engine.base.Engine]
> CREATE TABLE permissions (
>     id BIGSERIAL NOT NULL,
>     name VARCHAR(100) NOT NULL,
>     description VARCHAR(255),
>     PRIMARY KEY (id),
>     UNIQUE (name)
> )
>
> INFO  [sqlalchemy.engine.base.Engine] {}
> INFO  [sqlalchemy.engine.base.Engine]
> CREATE TABLE roles (
>     id BIGSERIAL NOT NULL,
>     name VARCHAR(100) NOT NULL,
>     PRIMARY KEY (id),
>     UNIQUE (name)
> )
>
> INFO  [sqlalchemy.engine.base.Engine] {}
> INFO  [sqlalchemy.engine.base.Engine]
> CREATE TABLE users (
>     id BIGSERIAL NOT NULL,
>     email VARCHAR(100) NOT NULL,
>     name VARCHAR(100) NOT NULL,
>     hashed_password VARCHAR(100) NOT NULL,
>     is_admin BOOLEAN,
>     is_active BOOLEAN,
>     created TIMESTAMP WITHOUT TIME ZONE,
>     modified TIMESTAMP WITHOUT TIME ZONE,
>     PRIMARY KEY (id),
>     UNIQUE (email)
> )
>
> INFO  [sqlalchemy.engine.base.Engine] {}
> INFO  [sqlalchemy.engine.base.Engine]
> CREATE TABLE roles_permissions (
>     permissions_id BIGINT NOT NULL,
>     roles_id BIGINT NOT NULL,
>     PRIMARY KEY (permissions_id, roles_id),
>     FOREIGN KEY(permissions_id) REFERENCES permissions (id),
>     FOREIGN KEY(roles_id) REFERENCES roles (id)
> )
>
> INFO  [sqlalchemy.engine.base.Engine] {}
> INFO  [sqlalchemy.engine.base.Engine]
> CREATE TABLE users_permissions (
>     users_id BIGINT NOT NULL,
>     permissions_id BIGINT NOT NULL,
>     PRIMARY KEY (users_id, permissions_id),
>     FOREIGN KEY(permissions_id) REFERENCES permissions (id),
>     FOREIGN KEY(users_id) REFERENCES users (id)
> )
>
> INFO  [sqlalchemy.engine.base.Engine] {}
> INFO  [sqlalchemy.engine.base.Engine]
> CREATE TABLE users_roles (
>     users_id BIGINT NOT NULL,
>     roles_id BIGINT NOT NULL,
>     PRIMARY KEY (users_id, roles_id),
>     FOREIGN KEY(roles_id) REFERENCES roles (id),
>     FOREIGN KEY(users_id) REFERENCES users (id)
> )
>
> On Thu, Dec 13, 2018 at 12:18 PM patrick payet wrote:
> > Hi Mike,
> > The message has been truncated.
> > [...]
Re: many to many relation with foreign key to composite primary key
Hi Mike,

The message has been truncated. The CREATE TABLE statements are:

CREATE TABLE users (
    id BIGSERIAL NOT NULL,
    email VARCHAR(100) NOT NULL,
    name VARCHAR(100) NOT NULL,
    hashed_password VARCHAR(100) NOT NULL,
    is_admin BOOLEAN,
    is_active BOOLEAN,
    created TIMESTAMP WITHOUT TIME ZONE,
    modified TIMESTAMP WITHOUT TIME ZONE,
    PRIMARY KEY (id),
    UNIQUE (email)
)

CREATE TABLE permissions (
    id BIGSERIAL NOT NULL,
    name VARCHAR(100) NOT NULL,
    description VARCHAR(255),
    PRIMARY KEY (id),
    UNIQUE (name)
)

CREATE TABLE roles (
    id BIGSERIAL NOT NULL,
    name VARCHAR(100) NOT NULL,
    PRIMARY KEY (id),
    UNIQUE (name)
)

CREATE TABLE users_roles (
    users_id BIGINT NOT NULL,
    roles_id BIGINT NOT NULL,
    PRIMARY KEY (users_id, roles_id),
    FOREIGN KEY(users_id) REFERENCES users (id),
    FOREIGN KEY(roles_id) REFERENCES roles (id)
)

CREATE TABLE roles_permissions (
    roles_id BIGINT NOT NULL,
    permissions_id BIGINT NOT NULL,
    PRIMARY KEY (roles_id, permissions_id),
    FOREIGN KEY(roles_id) REFERENCES roles (id),
    FOREIGN KEY(permissions_id) REFERENCES permissions (id)
)

CREATE TABLE users_permissions (
    users_id BIGINT NOT NULL,
    permissions_id BIGINT NOT NULL,
    PRIMARY KEY (users_id, permissions_id),
    FOREIGN KEY(users_id) REFERENCES users (id),
    FOREIGN KEY(permissions_id) REFERENCES permissions (id)
)

My mistake was to have mixed up, in my mind, the object-mapping information and the relational constraints of the database. But why can't Alembic create this model in autogeneration mode?

Best regards

Le jeu. 13 déc.
2018 à 18:12, patrick payet a écrit :
> [...]
>
> My mistake was to have mixed up, in my mind, the object-mapping
> information and the relational constraints of the database.
> But why can't Alembic create this model in auto-generation mode?
>
> Best regards
>
> Le jeu. 13 déc. 2018 à 14:44, Mike Bayer a écrit :
>> On Thu, Dec 13, 2018 at 7:11 AM patrick payet wrote:
>> > [...]
>>
>> the link is the users_roles table and the FOREIGN KEY constraints it
>> has between the users and roles tables. You can't locate a unique row
>> in users_roles given only the users table, because it has no column that
>> refers to users_roles.roles_id. At least, I'm trying to parse your
>> words as given; however, CREATE TABLE statements will show exactly
>> what you mean in case it is someth
Re: many to many relation with foreign key to composite primary key
Hi Mike,

Thanks for your answer. My problem is a classic one in application development: I want to implement authentication and authorization for a web application. I want to understand why Alembic can't create the model in autogeneration mode, and how to modify the model to resolve the problem. In this model I have a join table providing a many-to-many relationship between users and roles (user groups), for example. This allows me to access all roles for one user. For this, I declare a roles attribute in the users class, and I want to make a foreign key from this join table to reach the roles. This join table has a composite primary key (a users.id foreign key and a roles.id foreign key), and I don't know how to make the link (foreign key) with Alembic between users.roles and this composite primary key. When I declare just one element of this primary key, I get the error quoted previously (there is no unique constraint matching given keys for referenced table "users_roles"). I will send you the CREATE TABLE statements by email in a moment.

Best regards,
Patrick

Le mer. 12 déc.
2018 à 19:40, Mike Bayer a écrit :
> On Wed, Dec 12, 2018 at 1:01 PM patrick payet wrote:
> > I had a SQLAlchemy model like - [...]
many to many relation with foreign key to composite primary key
I had a SQLAlchemy model like:

class User(DeclarativeBase):
    __tablename__ = 'users'

    id = Column(BigInteger, primary_key=True)
    email = Column(String(100), unique=True, nullable=False)
    name = Column(String(100), nullable=False)
    hashed_password = Column(String(100), nullable=False)
    is_admin = Column(BOOLEAN, default=False)
    is_active = Column(BOOLEAN, default=True)
    created = Column(DateTime, default=datetime.now)
    modified = Column(DateTime, default=datetime.now, onupdate=datetime.now)
    roles = relationship('Role', secondary=users_roles,
                         back_populates='users', cascade="all, delete-orphan")
    permissions = relationship('Permission', secondary=users_permissions,
                               back_populates='users', cascade="all, delete-orphan")

    def __repr__(self):
        return "" % (self.name, self.email, self.hashed_password)


class Role(DeclarativeBase):
    __tablename__ = 'roles'

    id = Column(BigInteger, primary_key=True)
    name = Column(String(100), unique=True, nullable=False)
    users = relationship('User', secondary=users_roles,
                         back_populates='roles', cascade="all, delete-orphan")
    permissions = relationship('Permission', secondary=roles_permissions,
                               back_populates='roles', cascade="all, delete-orphan")

    def __repr__(self):
        return "" % self.name


class Permission(DeclarativeBase):
    __tablename__ = 'permissions'

    id = Column(BigInteger, primary_key=True)
    name = Column(String(100), unique=True, nullable=False)
    description = Column(String(255))
    users = relationship('User', secondary=users_permissions,
                         back_populates='permissions', cascade="all, delete-orphan")
    roles = relationship('Role', secondary=roles_permissions,
                         back_populates='permissions', cascade="all, delete-orphan")

    def __repr__(self):
        return "" % (self.name, self.description)

However, Alembic is not generating the correct upgrade, so I tried to write it myself:

def upgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    # op.drop_table('users')
    # ### end Alembic commands ###

    op.create_table('roles',
        sa.Column('id', sa.BIGINT(), autoincrement=True, nullable=False),
        sa.Column('name', sa.String(100), autoincrement=False, nullable=False),
        sa.Column('users', BigInteger),
        sa.Column('permissions', BigInteger),
        sa.PrimaryKeyConstraint('id'),
        sa.UniqueConstraint('name', name='roles_name_key')
        # sa.ForeignKeyConstraint(['users'], ['users_roles.users_id'], ondelete='CASCADE'),
        # sa.ForeignKeyConstraint(['permissions'], ['roles_permissions.permissions_id'], ondelete='CASCADE')
    )
    op.create_table('permissions',
        sa.Column('id', sa.BIGINT(), autoincrement=True, nullable=False),
        sa.Column('name', sa.String(100), autoincrement=False, nullable=False),
        sa.Column('description', sa.String(255)),
        sa.Column('users', BigInteger),
        sa.Column('roles', BigInteger),
        sa.PrimaryKeyConstraint('id')
        # sa.ForeignKeyConstraint(['users'], ['users_permissions.users_id'], ondelete='CASCADE'),
        # sa.ForeignKeyConstraint(['roles'], ['role_permissions.roles_id'], ondelete='CASCADE')
    )
    op.create_table('users_roles',
        sa.Column('users_id', BigInteger, sa.ForeignKey('users.id'), primary_key=True),
        sa.Column('roles_id', BigInteger, sa.ForeignKey('roles.id'), primary_key=True)
    )
    op.create_table('users_permissions',
        sa.Column('users_id', BigInteger, sa.ForeignKey('users.id'), primary_key=True),
        sa.Column('permissions_id', BigInteger, sa.ForeignKey('permissions.id'), primary_key=True)
    )
    op.create_table('roles_permissions',
        sa.Column('roles_id', BigInteger, sa.ForeignKey('roles.id'), primary_key=True),
        sa.Column('permissions_id', BigInteger, sa.ForeignKey('permissions.id'), primary_key=True)
    )
    op.drop_constraint('users_name_key', 'users')
    op.add_column('users', sa.Column('is_admin', sa.BOOLEAN, autoincrement=False, nullable=False, default=False))
    op.add_column('users', sa.Column('is_active', sa.BOOLEAN, autoincrement=False, nullable=False, default=True))
    op.add_column('users', sa.Column('created', sa.DateTime, autoincrement=False, nullable=False, default=datetime.datetime.now))
    op.add_column('users', sa.Column('modified', sa.DateTime, autoincrement=False, nullable=False, default=datetime.datetime.now))
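For reference, a minimal sketch of the conventional many-to-many setup this thread converges on, trimmed to just users and roles. The association table is a plain Table with a composite primary key of two foreign keys, and neither users nor roles carries a column pointing back at it. The delete-orphan cascade from the model above is omitted, since it is not supported on a secondary (association-table) relationship:

```python
from sqlalchemy import Table, Column, BigInteger, String, ForeignKey
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import relationship

Base = declarative_base()

# Association table: composite PK made of the two foreign keys.
users_roles = Table(
    "users_roles", Base.metadata,
    Column("users_id", BigInteger, ForeignKey("users.id"), primary_key=True),
    Column("roles_id", BigInteger, ForeignKey("roles.id"), primary_key=True),
)

class User(Base):
    __tablename__ = "users"
    id = Column(BigInteger, primary_key=True)
    email = Column(String(100), unique=True, nullable=False)
    # The relationship rides on the association table; no FK column here.
    roles = relationship("Role", secondary=users_roles, back_populates="users")

class Role(Base):
    __tablename__ = "roles"
    id = Column(BigInteger, primary_key=True)
    name = Column(String(100), unique=True, nullable=False)
    users = relationship("User", secondary=users_roles, back_populates="roles")
```

With this layout, Alembic autogenerate emits exactly the three CREATE TABLE statements with the composite primary key, because no mapped class tries to hold a column for the relationship itself.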
[sqlalchemy] pandas read_csv returning "nan" - sqlalchemy inserting "~"
For some reason pandas is returning NULL values from Oracle as "nan" instead of "NaN" or "None", so I have to check for this and change it to "None", or else SQLAlchemy inserts a "~" instead of NULL into my Oracle database. What's up with that? Anybody else have this happen?

if pd.isnull(dfZZ['smgref'][0]):
    dfZZ['smgref'] = None   # so it will insert NULL
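A hedged sketch of the usual workaround: pandas represents SQL NULL as a float NaN, which the DBAPI will not treat as NULL, so convert NaN to None for the whole frame before handing rows to SQLAlchemy. The column name follows the post; the DataFrame below is a stand-in, not actual Oracle output:

```python
import numpy as np
import pandas as pd

# Stand-in for the Oracle result; NULLs come back as NaN floats.
df = pd.DataFrame({"smgref": [1.0, np.nan, 3.0]})

# Cast to object dtype first so None survives, then swap NaN for None.
# SQLAlchemy binds None as a true NULL.
clean = df.astype(object).where(pd.notnull(df), None)
records = clean.to_dict(orient="records")
```

This avoids checking each column by hand as in the snippet above.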
Re: [sqlalchemy] Updates in after_flush_postexec
On Monday, November 9, 2015 at 1:50:36 PM UTC-8, Michael Bayer wrote:
>
> On 11/09/2015 03:10 PM, Patrick Lewis wrote:
> > Is this an issue, or am I misunderstanding the doc?
>
> the documentation was ambiguous.  I have corrected the paragraph as follows:
>
> diff --git a/doc/build/orm/session_events.rst b/doc/build/orm/session_events.rst
> index 50c63e6..ecfc517 100644
> --- a/doc/build/orm/session_events.rst
> +++ b/doc/build/orm/session_events.rst
> @@ -72,7 +72,10 @@ for finalized objects and possibly emit additional SQL.  In this hook,
>  there is the ability to make new changes on objects, which means the
>  :class:`.Session` will again go into a "dirty" state; the mechanics of the
>  :class:`.Session` here will cause it to flush **again** if new changes
> -are detected in this hook. A counter ensures that an endless loop in this
> +are detected in this hook if the flush were invoked in the context of
> +:meth:`.Session.commit`; otherwise, the pending changes will be bundled
> +as part of the next normal flush. When the hook detects new changes within
> +a :meth:`.Session.commit`, a counter ensures that an endless loop in this
>  regard is stopped after 100 iterations, in the case that an
>  :meth:`.SessionEvents.after_flush_postexec`
>  hook continually adds new state to be flushed each time it is called.

That makes sense. When I change my test to use session.commit(), it works as described. Thanks for your help.

- Patrick
[sqlalchemy] Updates in after_flush_postexec
I have a sample gist that makes a change to an object in the after_flush_postexec event: https://gist.github.com/p-lewis/deb498bdfb6172bf0013 (SQLAlchemy==1.0.9).

As I read the documentation (http://docs.sqlalchemy.org/en/rel_1_0/orm/session_events.html#after-flush-postexec), I expect the event infrastructure to detect that the object is dirty again and reprocess the flush, but that does not appear to happen. My output (excluding schema creation):

2015-11-09 20:00:30,044 INFO sqlalchemy.engine.base.Engine BEGIN (implicit)
2015-11-09 20:00:30,045 INFO sqlalchemy.engine.base.Engine INSERT INTO person (name) VALUES (?)
2015-11-09 20:00:30,045 INFO sqlalchemy.engine.base.Engine ('P Body',)
In handler, p: session.dirty: IdentitySet([]) session.info: {'RENAMED': True}

Is this an issue, or am I misunderstanding the doc?

Thanks,
Patrick
[sqlalchemy] relation auto-correlation
Hi,

I'm trying to use the secondary argument to create a relation that joins two tables through an intermediary. SQLAlchemy says that it can't figure out the correlation automatically. There are other relations and mappings going on with these tables, but the base of it is as follows. Relations don't have any correlate keyword like queries do. Any input?

class Sequence(BaseStruct, Base):
    __tablename__ = 'sequence'
    __table_args__ = ({'autoload': True})

class FamilySequence(BaseStruct, Base):
    __tablename__ = 'family_sequence'
    __table_args__ = (
        ForeignKeyConstraint(['sequence_key'], ['protein.sequence_key']),
        ForeignKeyConstraint(['family_key'], ['family.id']),
        {'autoload': True}
    )
    sequence = relation(Sequence,
                        primaryjoin=FamilySequence.sequence_key == Sequence.id,
                        lazy=True)

class Family(BaseStruct, Base):
    __tablename__ = 'family'
    __table_args__ = ({'autoload': True})
    proteins = relation(Protein, secondary=FamilySequence.__table__,
                        primaryjoin=Family.id == FamilySequence.family_key,
                        secondaryjoin=FamilySequence.sequence_key == Protein.sequence_key,
                        lazy=True, uselist=True, backref='families')

InvalidRequestError: Select statement 'SELECT count(family_sequence.sequence_key) AS count_1 FROM family_sequence, family WHERE family.id = family_sequence.family_key' returned no FROM clauses due to auto-correlation; specify correlate() to control correlation manually.

-- Patrick
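relation()/relationship() has no correlate keyword, but as a point of reference, here is a minimal sketch of what the error message is asking for at the Core level: select().correlate() pins which table is taken from the enclosing query. The tables are simplified stand-ins for the poster's autoloaded schema, and the modern select() calling style is used:

```python
from sqlalchemy import Column, Integer, MetaData, Table, func, select

md = MetaData()
family = Table("family", md, Column("id", Integer, primary_key=True))
family_sequence = Table(
    "family_sequence", md,
    Column("family_key", Integer),
    Column("sequence_key", Integer),
)

# Count family_sequence rows for the enclosing family row. correlate()
# declares that "family" comes from the enclosing statement, so the
# subquery keeps only family_sequence in its own FROM clause.
count_stmt = (
    select(func.count(family_sequence.c.sequence_key))
    .where(family.c.id == family_sequence.c.family_key)
    .correlate(family)
)

outer = select(family.c.id).where(count_stmt.scalar_subquery() > 0)
```

When the correlated select is used as a subquery of a statement that does select from family, the "no FROM clauses" error goes away.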
[sqlalchemy] Re: insert defaults
Thanks. I might have figured out a solution. I can get it to assign func.compress to the column attribute, but I can't get it to load the text uncompressed now.

class BaseStruct(object):
    def _set_text(self, text):
        self.compressed = func.compress(text)
    def _get_text(self):
        return select([uncompress(compressed)])
    text = property(_get_text, _set_text)

class Matrix(BaseStruct, Base):
    __tablename__ = 'matrix'
    __table_args__ = ({'autoload': True})
    compressed = deferred(Column(Binary()))
    text = BaseStruct.text

>>> s = Session()
>>> m = s.query(Matrix).get(1)
>>> m.compressed
<read-only buffer for 0x1e67000, size -1, offset 0 at 0x12e7680>
>>> m.text = 'Blah'
>>> m.compressed
<sqlalchemy.sql.expression.Function at 0x12e76b0; compress>

So after assigning the text, I get what I want: the column-mapped attribute is assigned func.compress(text). But I also need to uncompress the field to read it sometimes.

>>> s = Session()
>>> m = s.query(Matrix).get(7)
>>> m.text
<sqlalchemy.sql.expression.Function at 0x1392cf0; uncompress>

In the past I assigned Matrix.text = column_property(select([uncompress(compressed)]), deferred=True). I could probably write a little SqlSoup to do this select statement, but is there a way I can integrate this all well? The way I'm doing this feels very sloppy. SQLAlchemy is powerful, but can be very complicated.

On Mar 5, 8:30 pm, Michael Bayer <mike...@zzzcomputing.com> wrote:
> so do a before_insert() mapper extension and set the attribute as needed
> to func.compress(instance.text). Or do the same at the object level, i.e.
> user sets myobject.foo, foo is a descriptor-enabled method which then sets
> myobject.bar = func.compress(foo) or whatever. the technique here is
> http://www.sqlalchemy.org/docs/session.html#embedding-sql-insert-upda
> [...]
[sqlalchemy] Re: insert defaults
Yes, but then my descriptor is washed away. I'm trying to make it like a column property on 'get' and a descriptor-enabled property on 'set'.

On Mar 25, 3:27 pm, Michael Bayer <mike...@zzzcomputing.com> wrote:
> patrick wrote:
> > In the past I assigned
> > Matrix.text = column_property(select([uncompress(compressed)]), deferred=True).
> > I could probably write a little SqlSoup to do this select statement,
> > but is there a way I can integrate this all well? The way I'm doing
> > this feels very sloppy. SQLAlchemy is powerful, but can be very
> > complicated.
>
> you should map an attribute directly to
> column_property(uncompress(table.c.compressed)). if you want it only to
> fire when you read it, use deferred() instead of column_property().
> [...]
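A hedged sketch of the column_property() suggestion, using a simplified standalone mapping rather than the thread's autoloaded tables: the "text" attribute maps directly to an UNCOMPRESS() expression, so reading it emits the function on the SQL side. MySQL's UNCOMPRESS is only constructed as an expression here, so no live database is involved:

```python
from sqlalchemy import Column, Integer, LargeBinary, func, select
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import column_property

Base = declarative_base()

class Matrix(Base):
    __tablename__ = "matrix"
    id = Column(Integer, primary_key=True)
    compressed = Column(LargeBinary)
    # deferred=True: the UNCOMPRESS() SELECT fires only when .text is read.
    text = column_property(func.uncompress(compressed), deferred=True)
```

Querying Matrix and touching .text then issues SELECT uncompress(matrix.compressed) rather than returning the raw bytes.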
[sqlalchemy] Re: insert defaults
Well, it's something between the two. The instance variable compressed will always be NULL when adding or updating an instance, but I want it to infer a value from another instance variable. When inserting, the value of 'compressed' in the SQL statement needs to be the raw SQL string COMPRESS('%s') % instance.text. It isn't simply running it through a function... it's dynamically creating the column value from the object's instance variables upon insert or update. MySQL has some funky compression function, so the insert value for the column has to be raw SQL.

On Mar 4, 3:30 pm, Michael Bayer mike...@zzzcomputing.com wrote:
> are you trying to create a *default* value for an INSERT/UPDATE when NULL would otherwise be passed, or are you trying to run all incoming/outgoing data through a SQL function? those are two completely separate topics.
[sqlalchemy] insert defaults
Hey, I'm trying to create dynamic defaults for columns ala http://www.sqlalchemy.org/docs/metadata.html#context-sensitive-default-functions. MySQL has COMPRESS and UNCOMPRESS functions that I'm trying to leverage. I don't want to compress with python's zlib because I have legacy tables that were compressed using MySQL (which has a weird non-standard zip header and body), and I need to interface with them. Anyway, during an insert or update, I want to grab the 'text' variable from the instance object and insert it into the database like: COMPRESS(the text value). Obviously context.current_parameters is not the appropriate object, but I can't figure out if it's possible to access the instance being inserted/updated.

def compress_text(context):
    return "COMPRESS('%s')" % context.current_parameters['text']

class Tree(BaseStruct, Base):
    __tablename__ = 'tree'
    __table_args__ = ({'autoload': True})
    compressed = deferred(Column(Binary(), default=compress_text, onupdate=compress_text))
    text = column_property(select([UNCOMPRESS(compressed)]), deferred=True)

Is this possible with 0.5.7?
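The raw-SQL route can be sidestepped if the Python side reproduces MySQL's framing: per the MySQL docs, COMPRESS() output is a four-byte little-endian uncompressed length followed by an ordinary zlib stream, which is exactly the "non-standard header" mentioned above. A hedged stdlib sketch (function names are mine, not SQLAlchemy's or MySQL's), which would let a plain Python column default interoperate with the legacy MySQL-compressed rows:

```python
import struct
import zlib

def mysql_compress(data: bytes) -> bytes:
    """Emulate MySQL COMPRESS(): 4-byte LE uncompressed length + zlib stream."""
    if not data:
        return b""  # MySQL returns the empty string for empty input
    return struct.pack("<I", len(data)) + zlib.compress(data)

def mysql_uncompress(blob: bytes) -> bytes:
    """Emulate MySQL UNCOMPRESS(): read the length header, inflate the rest."""
    if not blob:
        return b""
    expected_len = struct.unpack("<I", blob[:4])[0]
    out = zlib.decompress(blob[4:])
    assert len(out) == expected_len  # header must agree with the payload
    return out

round_trip = mysql_uncompress(mysql_compress(b"hello world"))
```

With helpers like these, `compress_text` could return real bytes instead of a raw SQL string, at the cost of verifying the framing against the actual legacy rows first.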
[sqlalchemy] Noob WARNING, sqlalchemy for Desktop Apps?
I am just learning the basics of SQLAlchemy; I am reading through the O'Reilly book. I was just wondering if anyone is using SQLAlchemy for desktop applications? Does anyone know of specific problems I might have doing so? Thanks in advance -Patrick
[sqlalchemy] Re: Python 2.6 hash behavior change
On Mar 3, 2008, at 11:23 AM, Michael Bayer wrote: On Mar 3, 2008, at 11:56 AM, Patrick Hartling wrote:

hash(Works())
hash(Works2())
# Raises TypeError with Python 2.6 because Fails is deemed unhashable.
hash(Fails())

Does anyone know of a workaround for this issue? So far, sqlalchemy.schema.PrimaryKeyConstraint and sqlalchemy.schema.Column are the two classes that have tripped me up. I have added __hash__() methods to both, but I cannot vouch for the correctness of the implementations.

hmm, subtle difference between the 2.5 docs:

If a class does not define a __cmp__() method it should not define a __hash__() operation either; if it defines __cmp__() or __eq__() but not __hash__(), its instances will not be usable as dictionary keys. If a class defines mutable objects and implements a __cmp__() or __eq__() method, it should not implement __hash__(), since the dictionary implementation requires that a key's hash value is immutable (if the object's hash value changes, it will be in the wrong hash bucket).

and the 2.6 docs:

If a class does not define a __cmp__() or __eq__() method it should not define a __hash__() operation either; if it defines __cmp__() or __eq__() but not __hash__(), its instances will not be usable as dictionary keys. If a class defines mutable objects and implements a __cmp__() or __eq__() method, it should not implement __hash__(), since the dictionary implementation requires that a key's hash value is immutable (if the object's hash value changes, it will be in the wrong hash bucket).

both claim that if we define __eq__() but not __hash__(), it won't be usable as a dictionary key. but in the case of 2.5 this seems to be incorrect, or at least not enforced. The only difference here is that 2.6 says if we don't define __cmp__() or __eq__(), then we shouldn't define __hash__() either, whereas 2.5 only mentions __cmp__() in that regard. Subtle indeed.
We define __eq__() all over the place so that would be a lot of __hash__() methods to add, all of which return id(self). I wonder if we shouldn't just make a util.Mixin called Hashable so that we can centralize the idea.

That sounds like a reasonable approach. I have not looked much at the SQLAlchemy internals before today, but if I follow your idea correctly, I could apply that solution wherever I run into problems during testing.

-Patrick
--
Patrick L. Hartling
Senior Software Engineer, Priority 5
http://www.priority5.com/
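A minimal sketch of that Hashable idea (class names here are illustrative, not SQLAlchemy's actual code). One wrinkle worth noting: on Python 3 a class that defines __eq__ gets its __hash__ implicitly set to None in its own namespace, so the subclass must re-point __hash__ at the mixin explicitly; on the 2.x line discussed above, the bare mixin would suffice:

```python
class Hashable(object):
    """Centralize identity hashing for classes that define __eq__()."""
    def __hash__(self):
        return id(self)

class FakeColumn(Hashable):
    """Toy stand-in for a value-comparable schema object."""
    def __init__(self, name):
        self.name = name

    def __eq__(self, other):
        # value comparison, which under 2.6+ rules would otherwise
        # make instances unusable as dictionary keys
        return isinstance(other, FakeColumn) and other.name == self.name

    # needed on Python 3 only: defining __eq__ set __hash__ to None here
    __hash__ = Hashable.__hash__

c = FakeColumn("id")
registry = {c: "indexed"}  # works: hashing falls back to id(self)
```

Identity hashing means two equal-by-value instances hash differently, which is fine for the "use objects as dict keys internally" case but would surprise code that expects `a == b` to imply `hash(a) == hash(b)`.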
[sqlalchemy] Date only (no time stamp) selects not working with datetime/mx.DateTime objects
I used to be able to select from a mysql date column using a datetime object, where the timestamp would be ignored because the DB schema isn't using the datetime column type.

query = Table.query().filter(Table.c.date == datetime(2007, 1, 1))

The mysql log shows the select being generated like this:

... table.date = '2007-01-01 00:00:00.00' ...

Which doesn't return any results because of the timestamp. If however, I string format my date in the query, the proper MySQL query is generated:

query = Table.query().filter(Table.c.date == "%4d-%02d-%02d" % (2007, 1, 1))

Am I doing something wrong? Thanks!
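One workaround that avoids string formatting altogether: compare against a datetime.date rather than a datetime.datetime, since a date stringifies with no time component (whether the dialect then renders it that way depends on the driver, so treat this as a sketch):

```python
from datetime import date, datetime

dt = datetime(2007, 1, 1)   # carries a midnight timestamp
d = date(2007, 1, 1)        # bare date, no time component

# what each value looks like once stringified for a SQL statement
assert str(dt) == "2007-01-01 00:00:00"
assert str(d) == "2007-01-01"

# equivalent to the manual formatting in the post, minus the tuple juggling
formatted = "%4d-%02d-%02d" % (2007, 1, 1)
```

So `Table.query().filter(Table.c.date == date(2007, 1, 1))` should behave like the manually formatted string while keeping the comparison typed.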
[sqlalchemy] polymorphic inheritance with multiple foreign keys
Howdy, I'm trying to create a fairly generic graph structure in a database using SQLAlchemy. Basically, I'd like to create a root object of class Element, and have everything inherit from Element. Connections between Elements are done through a generic class Link, which is an Element itself, and points to two other elements as the source and the target of the link. In this way I'll be able to attach arbitrary elements to links, which I can't do if Link doesn't inherit from Element. Anyway, I've read the entry on better ways to do some polymorphic unions at http://groups.google.com/group/sqlalchemy/browse_thread/thread/4ef1ce9f7acbd494/98591e1a03b4bd71?lnk=gstq=inheritancernum=44 and have been modeling much of my work off that; however, the model seems to fall apart when you have multiple foreign keys. Specifically, I've attached a file to this message which gets the following error:

Traceback (most recent call last):
  File "poly.py", line 34, in <module>
    link_mapper.compile()
  File "build/bdist.cygwin-1.5.24-i686/egg/sqlalchemy/orm/mapper.py", line 219, in compile
  File "build/bdist.cygwin-1.5.24-i686/egg/sqlalchemy/orm/mapper.py", line 234, in _compile_all
  File "build/bdist.cygwin-1.5.24-i686/egg/sqlalchemy/orm/mapper.py", line 260, in _do_compile
  File "build/bdist.cygwin-1.5.24-i686/egg/sqlalchemy/orm/mapper.py", line 324, in _compile_inheritance
  File "build/bdist.cygwin-1.5.24-i686/egg/sqlalchemy/sql.py", line 109, in join
  File "build/bdist.cygwin-1.5.24-i686/egg/sqlalchemy/sql.py", line 2424, in __init__
  File "build/bdist.cygwin-1.5.24-i686/egg/sqlalchemy/sql.py", line 2516, in _match_primaries
sqlalchemy.exceptions.ArgumentError: Can't determine join between 'element' and 'link'; tables have more than one foreign key constraint relationship between them. Please specify the 'onclause' of this join explicitly.

Unfortunately, when creating the mapper, I can't specify an onclause to the mapper. Any help on how I would accomplish this and get such a structure in SQLAlchemy? Thanks!
--Patrick

from sqlalchemy import *
from sqlalchemy.orm import *

metadata = MetaData('sqlite://')

element_table = Table('element', metadata,
    Column('id', Integer, primary_key=True, default=None),
    Column('type', CHAR(1)),
)

link_table = Table('link', metadata,
    Column('id', Integer, ForeignKey('element.id'), primary_key=True),
    Column('source_id', Integer, ForeignKey('element.id')),
    Column('target_id', Integer, ForeignKey('element.id')),
)

metadata.create_all()

element_join = element_table.outerjoin(link_table, onclause=link_table.c.id == element_table.c.id)

class Element(object):
    def __init__(self, **kwargs):
        for key, value in kwargs.iteritems():
            setattr(self, key, value)
    def __repr__(self):
        return "%s(%s)" % (self.__class__.__name__,
            ','.join(["%s=%s" % (k, repr(v)) for k, v in self.__dict__.iteritems() if k[0] != '_']))

class Link(Element): pass

element_mapper = mapper(Element, element_table, select_table=element_join,
    polymorphic_on=element_table.c.type, polymorphic_identity='e')
element_mapper.compile()

link_mapper = mapper(Link, link_table, inherits=element_mapper, polymorphic_identity='l')
link_mapper.compile()

sess = create_session()

# objs = [Page(page_no=5), MagazinePage(page_no=6, orders='some text'), ClassifiedPage(page_no=7, orders='some other text', titles='classified titles')]
# for o in objs:
#     sess.save(o)
# sess.flush()
# sess.clear()
# print sess.query(Page).list()
# print sess.query(MagazinePage).list()
# print sess.query(ClassifiedPage).list()
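For what it's worth, later SQLAlchemy versions resolve exactly this ambiguity with the mapper's inherit_condition argument, which names the one foreign-key pair that joins the child table to the parent. A hedged sketch against the modern declarative API (not the 0.3-era mapper()/select_table API used above; class layout mirrors the attached file):

```python
from sqlalchemy import CHAR, Column, ForeignKey, Integer, create_engine
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class Element(Base):
    __tablename__ = "element"
    id = Column(Integer, primary_key=True)
    type = Column(CHAR(1))
    __mapper_args__ = {"polymorphic_on": type, "polymorphic_identity": "e"}

class Link(Element):
    __tablename__ = "link"
    id = Column(Integer, ForeignKey("element.id"), primary_key=True)
    source_id = Column(Integer, ForeignKey("element.id"))
    target_id = Column(Integer, ForeignKey("element.id"))
    __mapper_args__ = {
        "polymorphic_identity": "l",
        # three FKs point at element.id; this names the inheritance join
        "inherit_condition": id == Element.id,
    }

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)

with Session(engine) as session:
    a, b = Element(), Element()
    session.add_all([a, b])
    session.flush()  # assigns primary keys
    session.add(Link(source_id=a.id, target_id=b.id))
    session.commit()
    links = session.query(Link).all()
    link_types = [link.type for link in links]
```

The source_id/target_id relationships would each also need an explicit primaryjoin for the same reason, but the inherit_condition is what clears the "more than one foreign key constraint relationship" error above.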
[sqlalchemy] Re: Can't locate any foreign key columns in primary join condition (one-to-many tree)
I'm still having problems with this (sorry about the huge delay, I got side-tracked for a while). After creating parent/child/attribute table rows, I get one of two errors when I try to print parent.attributes. If I use the column name id in my child table I get this error:

sqlalchemy.exceptions.InvalidRequestError: Column 'child.id' is not available, due to conflicting property 'id': <sqlalchemy.orm.properties.ColumnProperty object at 0x2b8d77fea410>

Perhaps using id is a bad idea? I see most people use something like child_id; I'm just not a fan of the redundancy. When I use child_id instead I get this error:

sqlalchemy.exceptions.InvalidRequestError: No column child.child_id is configured on mapper Mapper|Parent|parent...

These exceptions are the two possible error paths when _getpropbycolumn receives a key error when doing prop = self.columntoproperty(column) (sqlalchemy/orm/mapper.py:983). Any help would be much appreciated! Here is the code:

from sqlalchemy import *

metadata = BoundMetaData("sqlite:///:memory:")
metadata.engine.echo = True
session = create_session(bind_to=metadata.engine)

parent_table = Table('parent', metadata,
    Column('id', Integer, primary_key=True),
    Column('name', Unicode),
)

child_table = Table('child', metadata,
    Column('child_id', Integer, primary_key=True),
    Column('parent_id', Integer, ForeignKey('parent.id')),
)

attribute_table = Table('attribute', metadata,
    Column('id', Integer, primary_key=True),
    Column('child_id', Integer, ForeignKey('child.child_id')),
)

class Parent(object): pass
class Child(object): pass
class Attribute(object): pass

parent_mapper = mapper(Parent, parent_table, properties=dict(
    children=relation(Child, backref='parent'),
    attributes=relation(Attribute,
        primaryjoin=and_(attribute_table.c.child_id == child_table.c.child_id,
                         child_table.c.parent_id == parent_table.c.id),
        foreign_keys=[attribute_table.c.child_id],
        backref=backref('parent',
            foreign_keys=[attribute_table.c.child_id],
            primaryjoin=and_(attribute_table.c.child_id == child_table.c.child_id,
                             child_table.c.parent_id == parent_table.c.id)),
        viewonly=True,
    )))

child_mapper = mapper(Child, child_table)

attribute_mapper = mapper(Attribute, attribute_table, properties=dict(
    child=relation(Child, backref=backref('attributes'), viewonly=True),
))

metadata.create_all()

parent = Parent()
child = Child()
child.parent = parent
attr = Attribute()
attr.child = child
session.save(parent)
session.flush()
print parent.attributes

On May 24, 10:19 am, Michael Bayer [EMAIL PROTECTED] wrote: On May 24, 2007, at 10:02 AM, Patrick McKinnon wrote: sqlalchemy.exceptions.ArgumentError: Can't locate any foreign key columns in primary join condition 'attribute.child_id = child.id AND child.parent_id = parent.id' for relationship 'Parent.attributes (Attribute)'. Specify 'foreign_keys' argument to indicate which columns in the join condition are foreign. 0.3.7 is more strict about analyzing relationships and in this case nothing in the join condition expresses a foreign key relationship between the parent and attribute table. when only primaryjoin is present, it assumes a one-to-many or many-to-one relationship between two tables. what you're doing here is trying to skip over.
parent_mapper = mapper(Parent, parent_table, properties=dict(
    children=relation(Child, backref='parent'),
    attributes=relation(Attribute,
        primaryjoin=and_(attribute_table.c.child_id == child_table.c.id,
                         child_table.c.parent_id == parent_table.c.id),
        backref='parent',
    )))

so to force it:

parent_mapper = mapper(Parent, parent_table, properties=dict(
    children=relation(Child, backref='parent'),
    attributes=relation(Attribute,
        primaryjoin=and_(attribute_table.c.child_id == child_table.c.id,
                         child_table.c.parent_id == parent_table.c.id),
        foreign_keys=[attribute_table.c.child_id],
        backref=backref('parent',
            foreign_keys=[attribute_table.c.child_id],
            primaryjoin=and_(attribute_table.c.child_id == child_table.c.id,
                             child_table.c.parent_id == parent_table.c.id)),
    )))

however you want to put viewonly=True for the attributes relationship, since that relationship will not persist correctly at all. the correct way would be to just deal with the attributes explicitly as they are attached to each child object. Normally I'd try to use the associationproxy extension here but your attributes property is actually the product of many individual Child objects, which is a little unusual.
[sqlalchemy] Can't locate any foreign key columns in primary join condition (one-to-many tree)
Hi Everybody, New to the forum. I'm having a problem setting up a relation. In this hypothetical example, I have a tree with one parent having many children, and each child having many attributes. I want to create a relation where each Parent has a list of all attributes that any of its children have. This relation works fine using 0.3.4, but breaks with the following error with 0.3.7 when I try to instantiate a Parent object:

sqlalchemy.exceptions.ArgumentError: Can't locate any foreign key columns in primary join condition 'attribute.child_id = child.id AND child.parent_id = parent.id' for relationship 'Parent.attributes (Attribute)'. Specify 'foreign_keys' argument to indicate which columns in the join condition are foreign.

I tried adding various foreign_keys attributes without much luck; I'm guessing I just don't know what I'm doing... Here is the example:

from sqlalchemy import *

metadata = BoundMetaData("sqlite:///relation.db")
metadata.engine.echo = True
session = create_session(bind_to=metadata.engine)

parent_table = Table('parent', metadata,
    Column('id', Integer, primary_key=True),
)

child_table = Table('child', metadata,
    Column('id', Integer, primary_key=True),
    Column('parent_id', Integer, ForeignKey('parent.id')),
)

attribute_table = Table('attribute', metadata,
    Column('id', Integer, primary_key=True),
    Column('child_id', Integer, ForeignKey('child.id')),
)

class Parent(object): pass
class Child(object): pass
class Attribute(object): pass

parent_mapper = mapper(Parent, parent_table, properties=dict(
    children=relation(Child, backref='parent'),
    attributes=relation(Attribute,
        primaryjoin=and_(attribute_table.c.child_id == child_table.c.id,
                         child_table.c.parent_id == parent_table.c.id),
        backref='parent',
    )))

child_mapper = mapper(Child, child_table)

attribute_mapper = mapper(Attribute, attribute_table, properties=dict(
    child=relation(Child, backref=backref('attributes')),
))

metadata.create_all()

p = Parent()

Any ideas? Thanks!
-Patrick
[sqlalchemy] Re: Inspect and __init__ of a mapped class
On Feb 4, 5:54 am, [EMAIL PROTECTED] wrote:
> I'm trying to create a mapped object where I don't know what the exact constructor arguments of the object might be. ...
> maybe do it before compiling the mappers, e.g. before creating any instance? store/rename it somewhere under the class, then use that one...

Sorry, I don't think that's an option. This is for a library that doesn't have any control over how the classes are declared or mapped. By the time I get a reference to the class, it will have almost certainly been mapped.
[sqlalchemy] Inspect and __init__ of a mapped class
Hi, I'm trying to create a mapped object where I don't know what the exact constructor arguments of the object might be. I was trying to use the inspect module to get the right arguments, but it looks like the mapper is redefining the original class __init__. Any thoughts as to how I might ferret out what the original class's arguments may have been? I see that the original __init__ is stored in a local variable 'oldinit' inside of Mapper._compile_class, but I can't think of any way to get to that. Any suggestions?
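The mapper of that era offered no hook for this, but the general fix is for the instrumenting wrapper to advertise the function it wraps: inspect.signature() follows the __wrapped__ attribute that functools.wraps leaves behind. A stdlib-only sketch of the mechanism (Widget is a hypothetical mapped class, and the wrapper here is mine, not SQLAlchemy's):

```python
import functools
import inspect

class Widget:
    """Stand-in for a user class about to be instrumented (hypothetical)."""
    def __init__(self, name, size=3):
        self.name, self.size = name, size

original = Widget.__init__

@functools.wraps(original)  # copies metadata and sets __wrapped__
def instrumented(self, *args, **kwargs):
    # a mapper-style hook could do its bookkeeping here before delegating
    original(self, *args, **kwargs)

Widget.__init__ = instrumented

# inspect.signature() follows __wrapped__, recovering the real parameters
# even though Widget.__init__ has been replaced
params = list(inspect.signature(Widget.__init__).parameters)
# params == ['self', 'name', 'size']
```

If the instrumenting code can't be changed, the only options are the ones discussed in the thread: capture cls.__init__ before mapping, or keep an external registry of signatures.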
[sqlalchemy] Re: Test data apparently not saving
After upgrading sqlite to the most recent version (3.3.10), my problem went away and everything works as expected. Thanks for the help, and sorry for the noise.
[sqlalchemy] Re: Test data apparently not saving
I reran the above with a postgres database, and it all worked as expected.
[sqlalchemy] Re: autoloading oracle tables and the owner parameter
Thanks, I will give that a try.
[sqlalchemy] Re: Test data apparently not saving
Ok, that works for me, too. But, if I rework it how I think the test suite is running things, I get the same error. http://paste.turbogears.org/paste/832
[sqlalchemy] Re: Test data apparently not saving
A minor revision (made user1 global) http://paste.turbogears.org/paste/833