Re: [sqlalchemy] How can i use LIKE with an association proxy?

2015-04-28 Thread Mike Bayer


First off, this is unusual because most people would just use the 
primary attribute; the association proxy isn't buying you anything here:


s.query(User).filter(User._all_emails.any(UserEmail.email.like('foo')))

vs.

s.query(User).filter(User.all_emails.any(UserEmail.email.like('foo')))

same amount of typing!

But anyway, sure, it's a bug; this is 
https://bitbucket.org/zzzeek/sqlalchemy/issue/3397/association-proxy-any-on-o2m-non-object, 
fixed in 4f6e9ccae93b9c50298b04135.
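For context, a minimal sketch of the kind of mapping the two queries above assume (the class, table and column names here are my guesses for illustration, not taken from Adrian's gist):

from sqlalchemy import Column, ForeignKey, Integer, String
from sqlalchemy.orm import relationship
from sqlalchemy.ext.associationproxy import association_proxy
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

class User(Base):
    __tablename__ = 'users'
    id = Column(Integer, primary_key=True)
    _all_emails = relationship('UserEmail')
    # User.all_emails proxies to the .email attribute of each UserEmail
    all_emails = association_proxy('_all_emails', 'email')

class UserEmail(Base):
    __tablename__ = 'user_emails'
    id = Column(Integer, primary_key=True)
    user_id = Column(Integer, ForeignKey('users.id'))
    email = Column(String)

Both .any() query forms shown above would apply to a mapping along these lines once the linked fix is in place.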




On 4/28/15 1:26 PM, Adrian wrote:
Ugh, somehow my reply sent by email never arrived here... here's my 
code: https://gist.github.com/ThiefMaster/40cd1f91e2a792150496





Re: [sqlalchemy] [ORM] Fetching columns that were assigned a SQL expression as part of INSERT...RETURNING and UPDATE...RETURNING

2015-04-28 Thread Mike Bayer

use the eager_defaults flag:


http://docs.sqlalchemy.org/en/rel_1_0/orm/mapping_api.html?highlight=eager_defaults#sqlalchemy.orm.mapper.params.eager_defaults

http://docs.sqlalchemy.org/en/rel_0_9/changelog/migration_09.html#orm-can-efficiently-fetch-just-generated-insert-update-defaults-using-returning


Not sure if this works for the ad-hoc set-up of updated_at; try it out.
If it's a server_default on the Column, it should definitely work.
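In case it helps, a minimal sketch of the flag in the mapper arguments (the table name and the server_default are my additions for illustration, not from the thread):

from sqlalchemy import Column, DateTime, Integer, func
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

class Foo(Base):
    __tablename__ = 'foo'
    # ask the ORM to fetch server-generated defaults eagerly,
    # using RETURNING on backends that support it (e.g. PostgreSQL)
    __mapper_args__ = {'eager_defaults': True}

    id = Column(Integer, primary_key=True)
    updated_at = Column(DateTime, server_default=func.now())

With this in place, a flush of a new Foo on PostgreSQL should emit a single INSERT ... RETURNING that populates both id and updated_at.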





On 4/28/15 7:35 PM, univerio wrote:

Suppose I have a PostgreSQL backend and I have the following class:

class Foo(Base):
    id = Column(Integer, primary_key=True)
    updated_at = Column(DateTime)

and I do

foo = Foo(updated_at=func.now())
session.add(foo)
session.flush()
foo.id  # this is already loaded, no additional query emitted
foo.updated_at  # this is not loaded, will cause an additional query to be emitted


Is it possible to have the SQLAlchemy ORM fetch the actual value of 
updated_at as part of the INSERT...RETURNING statement like it does 
for id, instead of leaving it unloaded and having to issue a second 
query when I access it?



Jack




[sqlalchemy] [ORM] Fetching columns that were assigned a SQL expression as part of INSERT...RETURNING and UPDATE...RETURNING

2015-04-28 Thread univerio
Suppose I have a PostgreSQL backend and I have the following class:

class Foo(Base):
    id = Column(Integer, primary_key=True)
    updated_at = Column(DateTime)

and I do

foo = Foo(updated_at=func.now())
session.add(foo)
session.flush()
foo.id  # this is already loaded, no additional query emitted
foo.updated_at  # this is not loaded, will cause an additional query to be emitted

Is it possible to have the SQLAlchemy ORM fetch the actual value of 
updated_at as part of the INSERT...RETURNING statement like it does for id, 
instead of leaving it unloaded and having to issue a second query when I 
access it?


Jack



Re: [sqlalchemy] Why is an association object marked as dirty instead of deleted when removed?

2015-04-28 Thread Mike Bayer



On 4/28/15 6:57 PM, st...@canary.md wrote:

Hi,

Background information: I am trying to implement functionality similar 
to the history_meta.py example 
(http://docs.sqlalchemy.org/en/rel_0_9/_modules/examples/versioned_history/history_meta.html). 
I am listening for after_flush events to create an audit record, and I am 
having problems with association objects. Here is an example:


class User(Auditable, self.Base, ComparableEntity):
    __tablename__ = 'usertable'
    id = Column(Integer, primary_key=True)
    name = Column(String)
    keywords = association_proxy('assocs', 'keyword')

class Keyword(Auditable, self.Base, ComparableEntity):
    __tablename__ = 'keywordtable'
    id = Column(Integer, primary_key=True)
    word = Column(String)

class UserKeyword(Auditable, self.Base, ComparableEntity):
    __tablename__ = 'userkeywordtable'
    user_id = Column(Integer, ForeignKey('usertable.id'),
                     primary_key=True)
    keyword_id = Column(Integer, ForeignKey('keywordtable.id'),
                        primary_key=True)
    user = relationship(User,
                        backref=backref('assocs',
                                        cascade='all, delete-orphan'))
    keyword = relationship(Keyword)

    def __init__(self, keyword=None, user=None):
        self.user = user
        self.keyword = keyword


apple = Keyword(word='apple')
pear = Keyword(word='pear')
bob = User(name='bob')
bob.keywords = [apple, pear]
sess.add(bob)
sess.commit()

bob.keywords.remove(apple)   # <== this is what my question is about
sess.commit()
When we remove the keyword, it marks the UserKeyword association 
object as "dirty" instead of "deleted". Why is that? Since the row is 
being removed, I would expect it to be marked as deleted, so that I 
could make an audit record indicating it was deleted.


Does the row actually get deleted?  The calculation of "orphan" isn't 
done until flush time, because theoretically you could be associating 
the UserKeyword with another User.


It doesn't look like the versioned rows recipe has support for this use 
case right now.  You could force the up-front delete using a "remove" 
attribute event on that collection.
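A minimal sketch of that idea (the User.assocs collection name comes from the example above; the rest is my own illustration, not a tested recipe):

from sqlalchemy import event
from sqlalchemy.orm import object_session

@event.listens_for(User.assocs, 'remove')
def _delete_removed_assoc(user, user_keyword, initiator):
    # mark the association row deleted up front, instead of waiting
    # for the orphan calculation that normally happens at flush time
    sess = object_session(user)
    if sess is not None:
        sess.delete(user_keyword)

With that in place, the UserKeyword should appear in session.deleted before the flush, where an audit listener can see it.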





Re: [sqlalchemy] [ORM] Fetching columns that were assigned a SQL expression as part of INSERT...RETURNING and UPDATE...RETURNING

2015-04-28 Thread Jack Zhou
Great! That worked, though I had to add server_default=FetchedValue()
and server_onupdate=FetchedValue().
It would be nice if SQLAlchemy auto-detected this case, but it's no big
deal otherwise.
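
For reference, a minimal sketch of the combination Jack describes (the table name is my placeholder; everything else mirrors the thread):

from sqlalchemy import Column, DateTime, Integer
from sqlalchemy.schema import FetchedValue
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

class Foo(Base):
    __tablename__ = 'foo'
    __mapper_args__ = {'eager_defaults': True}

    id = Column(Integer, primary_key=True)
    # marking the column as server-generated lets eager_defaults include
    # it in the INSERT ... RETURNING / UPDATE ... RETURNING clause
    updated_at = Column(DateTime,
                        server_default=FetchedValue(),
                        server_onupdate=FetchedValue())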


Thanks!

Jack

On Tue, Apr 28, 2015 at 4:42 PM Mike Bayer mike...@zzzcomputing.com wrote:

  use the eager_defaults flag:



 http://docs.sqlalchemy.org/en/rel_1_0/orm/mapping_api.html?highlight=eager_defaults#sqlalchemy.orm.mapper.params.eager_defaults


 http://docs.sqlalchemy.org/en/rel_0_9/changelog/migration_09.html#orm-can-efficiently-fetch-just-generated-insert-update-defaults-using-returning


 Not sure if this works for the ad-hoc set-up of updated_at; try it out.   If
 it's a server_default on the Column, it should definitely work.





 On 4/28/15 7:35 PM, univerio wrote:

 Suppose I have a PostgreSQL backend and I have the following class:

 class Foo(Base):
     id = Column(Integer, primary_key=True)
     updated_at = Column(DateTime)

  and I do

 foo = Foo(updated_at=func.now())
 session.add(foo)
 session.flush()
 foo.id  # this is already loaded, no additional query emitted
 foo.updated_at  # this is not loaded, will cause an additional query to be emitted

  Is it possible to have the SQLAlchemy ORM fetch the actual value of
 updated_at as part of the INSERT...RETURNING statement like it does for id,
 instead of leaving it unloaded and having to issue a second query when I
 access it?


  Jack




Re: [sqlalchemy] Automap not reflecting tables in Postgres schemas

2015-04-28 Thread Sam Zhang
Just hit the issue tracker with this, along with the two snags I encountered 
so far while doing it. Hopefully it's not because I missed some glaring 
instructions about how to build the documentation?

On Tuesday, April 28, 2015 at 3:46:11 PM UTC-4, Michael Bayer wrote:

  

 On 4/28/15 3:02 PM, Sam Zhang wrote:
  
 Thanks Michael! it was the lack of a primary key. I see references to it 
 now that I know what to look for 
 - a very interesting explanation: 
 http://docs.sqlalchemy.org/en/latest/faq/ormconfiguration.html#how-do-i-map-a-table-that-has-no-primary-key
 - 
 http://stackoverflow.com/questions/23765681/sqlalchemy-automap-does-not-create-class-for-tables-without-primary-key

 It looks like there's no mention of this requirement in the automap 
 documentation page though: 
 http://docs.sqlalchemy.org/en/latest/orm/extensions/automap.html. I'd be 
 happy to add a brief note about it and submit a pull request if you'd like.
  

 sure thing!


   
  Sam

 On Monday, April 27, 2015 at 6:54:13 PM UTC-4, Michael Bayer wrote: 

  

 On 4/27/15 4:29 PM, Sam Zhang wrote:
  
 Hello, 

 I'm following the documentation for reflecting database tables using 
 `automap`: 
 http://docs.sqlalchemy.org/en/latest/orm/extensions/automap.html#generating-mappings-from-an-existing-metadata.

 When I don't specify a schema, and Postgres uses the default `public` 
 schema, this works as expected, and I find the names of my tables:

 >>> m = MetaData()
 >>> b = automap_base(bind=engine, metadata=m)
 >>> b.prepare(engine, reflect=True)
 >>> b.classes.keys()
 ['ads', 'spatial_ref_sys', 'income']

 But when I specify an explicit schema, I don't have access to the 
 tables in `Base.classes` anymore.

 >>> m = MetaData(schema='geography')
 >>> b = automap_base(bind=engine, metadata=m)
 >>> b.prepare(engine, reflect=True)
 >>> b.classes.keys()
 []

 The MetaData reflected correctly though:

 >>> b.metadata.tables
 immutabledict({'geography.usa_cbsa_centroids':
 Table('usa_cbsa_centroids',
 MetaData(bind=Engine(postgresql://asteroids:***@localhost:5432/asteroids)),
 Column('GEOID', VARCHAR(length=5), table=<usa_cbsa_centroids>, nullable=False), ...})

  Note that the tables and columns are only known at runtime.
  
 Here's a demo that works for me.  Does it work for you?  Do all your 
 tables have primary keys defined?


 from sqlalchemy.ext.automap import automap_base
 from sqlalchemy.orm import Session
 from sqlalchemy import create_engine, MetaData


 engine = create_engine("postgresql://scott:tiger@localhost/test",
                        echo=True)
 engine.execute("""
     create table if not exists test_schema.user (
         id serial primary key, name varchar(30)
     )
 """)
 engine.execute("""
     create table if not exists test_schema.address (
         id serial primary key,
         email_address varchar(30),
         user_id integer references test_schema.user(id)
     )
 """)

 m = MetaData(schema="test_schema")

 Base = automap_base(bind=engine, metadata=m)

 # reflect the tables
 Base.prepare(engine, reflect=True)

 assert Base.classes.keys() == ['user', 'address']

 User = Base.classes.user
 Address = Base.classes.address


 session = Session(engine)

 session.add(Address(email_address="f...@bar.com", user=User(name="foo")))
 session.commit()

 u1 = session.query(User).first()
 print(u1.address_collection)






  
  Any thoughts?

 This is duplicated from 
 http://stackoverflow.com/questions/29905160/automap-reflect-tables-within-a-postgres-schema-with-sqlalchemy, 
 feel free to answer there as well.

  Thanks,
 Sam


Re: [sqlalchemy] polymorphic objects

2015-04-28 Thread Richard Gerd Kuesters
Yeah, I know. Sorry for the late question, I think I was a bit too tired 
to continue coding. Anyway, I circumvented the problem by using a select 
against A.__table__, which works for what I need (I have a heavily 
modified hierarchical extension based on Mariano Mara's code 
(https://pypi.python.org/pypi/sqla-hierarchy)). When it comes to 
polymorphism, I forgot that I had to use select instead of query to 
retrieve my objects :)


best regards,
richard.

On 04/27/2015 07:41 PM, Mike Bayer wrote:



On 4/27/15 4:31 PM, Richard Gerd Kuesters wrote:
Well, I'm having trouble dealing with polymorphic objects. I mean, 
the functionality is fine, I just don't know how to obtain the main 
object.


Let me be clear: I have A, which is my main object, and it is inherited 
by B and C. I would like to work with the A object, even though its 
polymorphic identity refers to B or C. I know it's not the default 
behavior, but can I obtain A from a query?


Perhaps you could be more specific.

If a query returns B and C objects, and those inherit from A, they 
*are* A objects.
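
To illustrate the point, a minimal single-table-inheritance sketch (the class and column names are mine, not Richard's actual model):

from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import Session
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

class A(Base):
    __tablename__ = 'a'
    id = Column(Integer, primary_key=True)
    type = Column(String(20))
    __mapper_args__ = {'polymorphic_on': type, 'polymorphic_identity': 'a'}

class B(A):
    __mapper_args__ = {'polymorphic_identity': 'b'}

class C(A):
    __mapper_args__ = {'polymorphic_identity': 'c'}

engine = create_engine('sqlite://')
Base.metadata.create_all(engine)
session = Session(engine)
session.add_all([B(), C()])
session.commit()

for obj in session.query(A):
    # each row comes back as a B or C instance, and each one is also an A
    print(type(obj).__name__, isinstance(obj, A))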






thanks a lot!
richard.

[sqlalchemy] Why is an association object marked as dirty instead of deleted when removed?

2015-04-28 Thread steve
Hi,

Background information: I am trying to implement functionality similar to 
the history_meta.py example 
(http://docs.sqlalchemy.org/en/rel_0_9/_modules/examples/versioned_history/history_meta.html).
I am listening for after_flush events to create an audit record, and I am 
having problems with association objects. Here is an example:

class User(Auditable, self.Base, ComparableEntity):
    __tablename__ = 'usertable'
    id = Column(Integer, primary_key=True)
    name = Column(String)
    keywords = association_proxy('assocs', 'keyword')

class Keyword(Auditable, self.Base, ComparableEntity):
    __tablename__ = 'keywordtable'
    id = Column(Integer, primary_key=True)
    word = Column(String)

class UserKeyword(Auditable, self.Base, ComparableEntity):
    __tablename__ = 'userkeywordtable'
    user_id = Column(Integer, ForeignKey('usertable.id'),
                     primary_key=True)
    keyword_id = Column(Integer, ForeignKey('keywordtable.id'),
                        primary_key=True)
    user = relationship(User,
                        backref=backref('assocs',
                                        cascade='all, delete-orphan'))
    keyword = relationship(Keyword)

    def __init__(self, keyword=None, user=None):
        self.user = user
        self.keyword = keyword


apple = Keyword(word='apple')
pear = Keyword(word='pear')
bob = User(name='bob')
bob.keywords = [apple, pear]
sess.add(bob)
sess.commit()

bob.keywords.remove(apple)   # <== this is what my question is about
sess.commit()

When we remove the keyword, it marks the UserKeyword association object as 
"dirty" instead of "deleted". Why is that? Since the row is being removed, 
I would expect it to be marked as deleted, so that I could make an audit 
record indicating it was deleted.

Thanks,
Steve



Re: [sqlalchemy] Automap not reflecting tables in Postgres schemas

2015-04-28 Thread Mike Bayer



On 4/28/15 10:08 PM, Sam Zhang wrote:
Just hit the issue tracker with this, along with the two snags I encountered 
so far while doing it. Hopefully it's not because I missed some glaring 
instructions about how to build the documentation?
There are no instructions right now.  It's a Sphinx build, plus the things 
that are in requirements.txt.   The themes and plugins are all extremely 
custom, so you can't modify the themes or anything like that.





On Tuesday, April 28, 2015 at 3:46:11 PM UTC-4, Michael Bayer wrote:



On 4/28/15 3:02 PM, Sam Zhang wrote:

Thanks Michael! it was the lack of a primary key. I see
references to it now that I know what to look for
- a very interesting explanation:
http://docs.sqlalchemy.org/en/latest/faq/ormconfiguration.html#how-do-i-map-a-table-that-has-no-primary-key
-
http://stackoverflow.com/questions/23765681/sqlalchemy-automap-does-not-create-class-for-tables-without-primary-key

It looks like there's no mention of this requirement in the
automap documentation page though:
http://docs.sqlalchemy.org/en/latest/orm/extensions/automap.html.
I'd be happy to add a brief note about it and submit a pull
request if you'd like.


sure thing!




Sam

On Monday, April 27, 2015 at 6:54:13 PM UTC-4, Michael Bayer wrote:



On 4/27/15 4:29 PM, Sam Zhang wrote:

Hello,

I'm following the documentation for reflecting database
tables using `automap`:

http://docs.sqlalchemy.org/en/latest/orm/extensions/automap.html#generating-mappings-from-an-existing-metadata

When I don't specify a schema, and Postgres uses the
default `public` schema, this works as expected, and I find
the names of my tables:

>>> m = MetaData()
>>> b = automap_base(bind=engine, metadata=m)
>>> b.prepare(engine, reflect=True)
>>> b.classes.keys()
['ads', 'spatial_ref_sys', 'income']

But when I specify an explicit schema, I don't have access
to the tables in `Base.classes` anymore.

>>> m = MetaData(schema='geography')
>>> b = automap_base(bind=engine, metadata=m)
>>> b.prepare(engine, reflect=True)
>>> b.classes.keys()
[]

The MetaData reflected correctly though:

>>> b.metadata.tables
immutabledict({'geography.usa_cbsa_centroids':
Table('usa_cbsa_centroids',
MetaData(bind=Engine(postgresql://asteroids:***@localhost:5432/asteroids)),
Column('GEOID', VARCHAR(length=5), table=<usa_cbsa_centroids>, nullable=False), ...})

Note that the tables and columns are only known at runtime.

Here's a demo that works for me.  Does it work for you?  Do
all your tables have primary keys defined?


from sqlalchemy.ext.automap import automap_base
from sqlalchemy.orm import Session
from sqlalchemy import create_engine, MetaData


engine = create_engine("postgresql://scott:tiger@localhost/test",
                       echo=True)
engine.execute("""
    create table if not exists test_schema.user (
        id serial primary key, name varchar(30)
    )
""")
engine.execute("""
    create table if not exists test_schema.address (
        id serial primary key,
        email_address varchar(30),
        user_id integer references test_schema.user(id)
    )
""")

m = MetaData(schema="test_schema")

Base = automap_base(bind=engine, metadata=m)

# reflect the tables
Base.prepare(engine, reflect=True)

assert Base.classes.keys() == ['user', 'address']

User = Base.classes.user
Address = Base.classes.address


session = Session(engine)

session.add(Address(email_address="f...@bar.com",
                    user=User(name="foo")))
session.commit()

u1 = session.query(User).first()
print(u1.address_collection)








Any thoughts?

This is duplicated from

http://stackoverflow.com/questions/29905160/automap-reflect-tables-within-a-postgres-schema-with-sqlalchemy,
feel free to answer there as well.

Thanks,
Sam

[sqlalchemy] Re: pandas.DataFrame.to_sql method: how to speed up exporting to Microsoft SQL Server (6 minutes for 11 MB!)

2015-04-28 Thread Sam Mourad
Joe - I'm having the same issue. DataFrame.to_sql performance is fine with 
SQLite and MySQL, but through a SQLAlchemy engine against SQL Server it is 
too slow. After a few days of research and trial and error, I was able to 
improve performance by using pymssql. However, the performance is still way 
below standards: 35000 rows of an account relation table (accountid - int, 
relatedaccountid - int) take 2 minutes to finish.

After a lot of research and reading, I found an article explaining that the 
SQLAlchemy ORM being used by df.to_sql is VERY SLOW because it uses a Unit 
of Work pattern. SQLAlchemy Core is supposed to be much faster. Check this 
link out:
http://docs.sqlalchemy.org/en/rel_0_8/faq.html#i-m-inserting-400-000-rows-with-the-orm-and-it-s-really-slow

My issue is still not solved, because my goal is to insert a dataframe into 
a SQL table. If I use SQLAlchemy Core, I'll have to iterate through the 
dataframe rows and insert them into the SQL table, which ends up being even 
slower than to_sql.
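
For what it's worth, a Core insert doesn't have to go row by row. A minimal sketch (the connection URL and table name are placeholders of mine; whether this beats to_sql still depends on the driver's executemany behavior):

import pandas as pd
from sqlalchemy import create_engine, MetaData, Table

engine = create_engine('mssql+pymssql://user:password@host/dbname')  # placeholder URL
df = pd.DataFrame({'accountid': [1, 2], 'relatedaccountid': [3, 4]})  # sample data

meta = MetaData()
table = Table('account_relation', meta, autoload=True, autoload_with=engine)  # placeholder table name

# one Core insert() executed with a list of row dicts (executemany style)
with engine.begin() as conn:
    conn.execute(table.insert(), df.to_dict(orient='records'))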

Did you find a solution? If so, please share.

Thanks,


On Wednesday, April 22, 2015 at 5:11:50 PM UTC-5, John Doe wrote:



 On Wednesday, April 22, 2015 at 7:19:08 PM UTC+1, Jonathan Vanasco wrote:

 [...]

 I'm guessing that this issue is with the driver.

 Here's a semi-related thread:
 * http://stackoverflow.com/questions/5693885/pyodbc-very-slow-bulk-insert-speed

 It looks like the pyodbc driver handles executemany in a not very ideal 
 manner.

 There were also some threads that noted ODBC tracing being on, and others 
 that noted most python drivers are just painfully slow. 

 In any event, if you're talking about a single insert statement that 
 sounds like an executemany context and a driver issue.


 Tracing is off.
 If I had to place a bet, my money would be on pyodbc having too slow a 
 network connection, for some reason that's totally beyond me.
 On my home PC I generated a dataframe of random numbers in Python, then 
 used the to_sql() method to transfer it to a SQL Server express running on 
 the same machine, and it was fast. This suggests that SQL server has no 
 issue with the data per se.
 When I ran the same code on my work PC, trying to export to a SQL Server 
 2014 machine which is part of the same company network and only a few miles 
 away, it took ages.

 I'll try having Python installed on the SQL server, and running it from 
 there, to see if this theory is correct. 

 I also tried pymssql, but it took ages.
 Some stack overflow users had luck with adodb, but sqlalchemy no longer 
 supports it.

 I miss Matlab's database toolbox! Yes, it's expensive, and Matlab has tons 
 of flaws, but at least Matlab's documentation is excellent and doesn't 
 cause you to lose the will to live wasting hours trying to figure out how 
 to carry out a banal task like exporting a table...

 The bottom line is that the pandas to_sql() method is basically unusable if 
 you're trying to export more than 2 MB of data.

 Anyway, thanks a lot for your help, Jonathan and Michael.




Re: [sqlalchemy] How can i use LIKE with an association proxy?

2015-04-28 Thread Adrian
Ugh, somehow my reply sent by email never arrived here... here's my code: 
https://gist.github.com/ThiefMaster/40cd1f91e2a792150496



Re: [sqlalchemy] Automap not reflecting tables in Postgres schemas

2015-04-28 Thread Sam Zhang
Thanks Michael! It was the lack of a primary key. I see references to it 
now that I know what to look for:
- a very interesting explanation: 
http://docs.sqlalchemy.org/en/latest/faq/ormconfiguration.html#how-do-i-map-a-table-that-has-no-primary-key
- 
http://stackoverflow.com/questions/23765681/sqlalchemy-automap-does-not-create-class-for-tables-without-primary-key

It looks like there's no mention of this requirement in the automap 
documentation page 
though: http://docs.sqlalchemy.org/en/latest/orm/extensions/automap.html. 
I'd be happy to add a brief note about it and submit a pull request if 
you'd like.

Sam

On Monday, April 27, 2015 at 6:54:13 PM UTC-4, Michael Bayer wrote:

  

 On 4/27/15 4:29 PM, Sam Zhang wrote:
  
 Hello, 

 I'm following the documentation for reflecting database tables using 
 `automap`: 
 http://docs.sqlalchemy.org/en/latest/orm/extensions/automap.html#generating-mappings-from-an-existing-metadata.

 When I don't specify a schema, and Postgres uses the default `public` 
 schema, this works as expected, and I find the names of my tables:

 >>> m = MetaData()
 >>> b = automap_base(bind=engine, metadata=m)
 >>> b.prepare(engine, reflect=True)
 >>> b.classes.keys()
 ['ads', 'spatial_ref_sys', 'income']

 But when I specify an explicit schema, I don't have access to the 
 tables in `Base.classes` anymore.

 >>> m = MetaData(schema='geography')
 >>> b = automap_base(bind=engine, metadata=m)
 >>> b.prepare(engine, reflect=True)
 >>> b.classes.keys()
 []

 The MetaData reflected correctly though:

 >>> b.metadata.tables
 immutabledict({'geography.usa_cbsa_centroids':
 Table('usa_cbsa_centroids',
 MetaData(bind=Engine(postgresql://asteroids:***@localhost:5432/asteroids)),
 Column('GEOID', VARCHAR(length=5), table=<usa_cbsa_centroids>, nullable=False), ...})

  Note that the tables and columns are only known at runtime.
  
 Here's a demo that works for me.  Does it work for you?  Do all your 
 tables have primary keys defined?


 from sqlalchemy.ext.automap import automap_base
 from sqlalchemy.orm import Session
 from sqlalchemy import create_engine, MetaData


 engine = create_engine("postgresql://scott:tiger@localhost/test",
                        echo=True)
 engine.execute("""
     create table if not exists test_schema.user (
         id serial primary key, name varchar(30)
     )
 """)
 engine.execute("""
     create table if not exists test_schema.address (
         id serial primary key,
         email_address varchar(30),
         user_id integer references test_schema.user(id)
     )
 """)

 m = MetaData(schema="test_schema")

 Base = automap_base(bind=engine, metadata=m)

 # reflect the tables
 Base.prepare(engine, reflect=True)

 assert Base.classes.keys() == ['user', 'address']

 User = Base.classes.user
 Address = Base.classes.address


 session = Session(engine)

 session.add(Address(email_address="f...@bar.com",
                     user=User(name="foo")))
 session.commit()

 u1 = session.query(User).first()
 print(u1.address_collection)






  
  Any thoughts?

  This is duplicated from 
 http://stackoverflow.com/questions/29905160/automap-reflect-tables-within-a-postgres-schema-with-sqlalchemy,
  
 feel free to answer there as well.

  Thanks,
 Sam


Re: [sqlalchemy] when do I have to do a rollback so the app keep working fine

2015-04-28 Thread David Chavez
Thank you, I looked at it and it is quite a bit more complicated than I
thought it was; what you have there is really extensive. If it's not too
much to ask, can anyone explain a little bit how to use it?

On 4/27/15, Mike Bayer mike...@zzzcomputing.com wrote:


 On 4/27/15 7:40 AM, David Allouche wrote:
 On 18 Apr 2015, at 02:20, dcgh...@gmail.com wrote:

 Hi everyone, I have a considerable amount of code that uses SQLAlchemy,
 and I want to implement a decorator that captures the SQLAlchemy
 exceptions, then calls session.rollback() and re-calls the decorated
 function, so I don't have to write a try/except statement whenever I
 use SQLAlchemy.
 To implement such a decorator, I need to know which exceptions I can safely
 capture, call session.rollback() on, and have the app keep working fine,
 because there are exceptions that will cause an endless loop and should
 never be captured (e.g., generating the same primary key due to a bug and
 always raising IntegrityError).
 So, can anyone tell me which exceptions are safe to handle with
 session.rollback()?
 I guess that your intent is to retry transactions that failed because of a
 serialisation error in the SERIALIZABLE isolation level.

 My understanding is that, to SQLAlchemy, this is a database-specific
 issue. You can expect the SQLAlchemy exception to be an OperationalError,
 but you would need to add additional checks to specifically identify the
 kind of error returned by your database driver. An argument could be made
 that serialisation errors should be wrapped in a more specific exception
 class by SQLAlchemy, but I do not believe that is the case at the moment.

 I am no expert, so please someone correct me if I am wrong.

 that's pretty much the current situation - OperationalError refers to
 "something went wrong with the connection" and IntegrityError means
 "something went wrong with the data the query is attempting to modify".

 In Openstack we have an elaborate system of catching those exceptions we
 care about across many backends; this is probably more than you need but
 this is sort of what is needed:
 https://github.com/openstack/oslo.db/blob/master/oslo_db/sqlalchemy/exc_filters.py
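
As a much smaller illustration of the retry-decorator idea from the original question, here is a minimal sketch that only retries OperationalError (the function and parameter names are mine; deciding which errors are genuinely safe to retry is database- and driver-specific, as discussed above):

import functools
from sqlalchemy.exc import OperationalError

def with_rollback_retry(session, retries=3):
    # retry connection-level failures, rolling back the session in between
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            for attempt in range(retries):
                try:
                    return fn(*args, **kwargs)
                except OperationalError:
                    session.rollback()
                    if attempt == retries - 1:
                        raise
        return wrapper
    return decorator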




[sqlalchemy] How to view (get actual print-out) of the contents of db tables?

2015-04-28 Thread Ivan Evstegneev
Greetings everyone!!!

I have a little question.

Is there some way (tool) to print the contents of the tables in sqlalchemy?

My work is mostly with Python: I load data from Excel files and then view 
their contents via the console (for now), just printing the data in some 
organized way (like lists and dictionaries).

Recently I started to play with sqlalchemy.

I'm looking for some kind of data visualization in order to verify the way 
I put my data inside db.

I googled a lot, but somehow have been stuck and going rounds...

There is SQLAlchemy reflection, but from reading the documentation this is 
not what I'm looking for, or at least that is how it seems to me, because 
there is a lack of examples (IMHO) for newbies.

Then I found a way, by using the mysql console:

mysql> show databases;
mysql> show tables;
mysql> show columns from TABLE_NAME;

Here the example of what I'm talking 
about: http://i.ytimg.com/vi/XiDnK9Lq-Ng/maxresdefault.jpg  (just googled 
some images)

But in order to use this I need to install additional software and so on...
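
For comparison, a minimal sketch of getting a similar overview with SQLAlchemy's inspector plus a plain SELECT, with no extra software (the connection URL is a placeholder of mine):

from sqlalchemy import create_engine, inspect, MetaData, Table, select

engine = create_engine('sqlite:///mydata.db')  # placeholder URL; use your own
inspector = inspect(engine)

for table_name in inspector.get_table_names():
    print('table:', table_name)
    for col in inspector.get_columns(table_name):
        print('  column:', col['name'], col['type'])

    # print each table's rows in a simple organized way
    table = Table(table_name, MetaData(), autoload=True, autoload_with=engine)
    for row in engine.execute(select([table])):
        print(' ', dict(row))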

Any help will be greatly appreciated.

Thanks in advance,

Ivan.

 



Re: [sqlalchemy] Automap not reflecting tables in Postgres schemas

2015-04-28 Thread Mike Bayer



On 4/28/15 3:02 PM, Sam Zhang wrote:
Thanks Michael! it was the lack of a primary key. I see references to 
it now that I know what to look for
- a very interesting 
explanation: http://docs.sqlalchemy.org/en/latest/faq/ormconfiguration.html#how-do-i-map-a-table-that-has-no-primary-key

- 
http://stackoverflow.com/questions/23765681/sqlalchemy-automap-does-not-create-class-for-tables-without-primary-key

It looks like there's no mention of this requirement in the automap 
documentation page 
though: http://docs.sqlalchemy.org/en/latest/orm/extensions/automap.html. 
I'd be happy to add a brief note about it and submit a pull request if 
you'd like.


sure thing!




Sam

On Monday, April 27, 2015 at 6:54:13 PM UTC-4, Michael Bayer wrote:



On 4/27/15 4:29 PM, Sam Zhang wrote:

Hello,

I'm following the documentation for reflecting database tables
using `automap`:

http://docs.sqlalchemy.org/en/latest/orm/extensions/automap.html#generating-mappings-from-an-existing-metadata

When I don't specify a schema, and Postgres uses the default
`public` schema, this works as expected, and I find the names of
my tables:

>>> m = MetaData()
>>> b = automap_base(bind=engine, metadata=m)
>>> b.prepare(engine, reflect=True)
>>> b.classes.keys()
['ads', 'spatial_ref_sys', 'income']

But when I specify an explicit schema, I don't have access to
the tables in `Base.classes` anymore.

>>> m = MetaData(schema='geography')
>>> b = automap_base(bind=engine, metadata=m)
>>> b.prepare(engine, reflect=True)
>>> b.classes.keys()
[]

The MetaData reflected correctly though:

>>> b.metadata.tables
immutabledict({'geography.usa_cbsa_centroids':
Table('usa_cbsa_centroids',
MetaData(bind=Engine(postgresql://asteroids:***@localhost:5432/asteroids)),
Column('GEOID', VARCHAR(length=5), table=<usa_cbsa_centroids>, nullable=False), ...})

Note that the tables and columns are only known at runtime.

Here's a demo that works for me.  Does it work for you?  Do all
your tables have primary keys defined?


from sqlalchemy.ext.automap import automap_base
from sqlalchemy.orm import Session
from sqlalchemy import create_engine, MetaData


engine = create_engine("postgresql://scott:tiger@localhost/test",
                       echo=True)
engine.execute("""
    create table if not exists test_schema.user (
        id serial primary key, name varchar(30)
    )
""")
engine.execute("""
    create table if not exists test_schema.address (
        id serial primary key,
        email_address varchar(30),
        user_id integer references test_schema.user(id)
    )
""")

m = MetaData(schema="test_schema")

Base = automap_base(bind=engine, metadata=m)

# reflect the tables
Base.prepare(engine, reflect=True)

assert Base.classes.keys() == ['user', 'address']

User = Base.classes.user
Address = Base.classes.address


session = Session(engine)

session.add(Address(email_address="f...@bar.com",
                    user=User(name="foo")))
session.commit()

u1 = session.query(User).first()
print(u1.address_collection)








Any thoughts?

This is duplicated from
http://stackoverflow.com/questions/29905160/automap-reflect-tables-within-a-postgres-schema-with-sqlalchemy,
feel free to answer there as well.

Thanks,
Sam