No DBAPI I've tested does it; at most it would only be possible for the PG and SQL Server dialects.
On Mar 27, 2014, at 2:30 AM, Cosmia Luna wrote:
Wow, I didn't know that... is it a bug?
But, the RETURNING clause will work though :)
stmt = P.__table__.insert(returning=[P.id], values=[{"val": 1}, {"val": 2}])
with engine.connect() as conn:
    result = conn.execute(stmt)
    # do something with result
hmm, maybe performance suffers from huge ...
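As a rough sketch of the "do something with result" part (not from the original mail; it assumes the same P model and engine as above), the generated ids can be read straight off that result, since a multi-VALUES insert is a single statement:

stmt = (P.__table__.insert()
        .values([{"val": 1}, {"val": 2}])   # multi-VALUES: one statement, not executemany
        .returning(P.__table__.c.id))
with engine.connect() as conn:
    result = conn.execute(stmt)
    new_ids = [row[0] for row in result]    # e.g. [1, 2]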
RETURNING doesn't work with DBAPI's "executemany" style of execution, however,
which is what conn.execute(stmt, [list of parameter sets]) calls.
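Concretely, this is the form that routes to cursor.executemany() (a sketch assuming the same P table as above, not taken from the thread):

stmt = P.__table__.insert()
with engine.connect() as conn:
    # a list of parameter sets triggers the DBAPI's executemany();
    # no result rows come back, so RETURNING can't be used on this path
    conn.execute(stmt, [{"val": 1}, {"val": 2}, {"val": 3}])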
On Mar 24, 2014, at 5:33 AM, Cosmia Luna wrote:
The INSERT statement of PostgreSQL supports RETURNING; read this:
http://docs.sqlalchemy.org/en/rel_0_8/core/dml.html#sqlalchemy.sql.expression.Insert.returning
On Monday, March 24, 2014 2:43:46 PM UTC+8, James Meneghello wrote:
Oops, I should add - the reason I can't use an itertools counter to
pre-assign IDs is that the table is potentially being dumped to by
multiple scripts, which is why I have to commit the parts prior to the
segments (since engine.execute can't return multiple insert_ids).
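That two-phase shape might look roughly like this (a sketch only; parts_table, segments_table and the incoming_* names are assumptions, not the poster's code):

# phase 1: insert the parts one at a time so each generated id is on hand
part_ids = {}
with engine.begin() as conn:
    for part in incoming_parts:                          # assumed: list of dicts
        result = conn.execute(parts_table.insert().values(name=part["name"]))
        part_ids[part["name"]] = result.inserted_primary_key[0]

# phase 2: bulk-insert the segments as one executemany, referencing those ids
with engine.begin() as conn:
    conn.execute(segments_table.insert(), [
        {"part_id": part_ids[seg["part"]], "data": seg["data"]}
        for seg in incoming_segments
    ])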
Thanks for the quick reply!
This seems to work pretty well. I took out the batching (as it's already
batched at a higher level) and modified it to suit the insertion of
children as well (and reduced the unique to a single field), and it
appears to work.
with db_session() as db:
    existing_
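A rough sketch of how that filter-then-insert can look once uniqueness is reduced to a single field (the Part model, hash column and new_rows are assumed names, not the poster's actual code; db_session() is assumed to commit on exit):

with db_session() as db:
    # dedupe the incoming batch against itself first
    incoming = {row["hash"]: row for row in new_rows}
    # one query pulls back the keys that already exist
    existing = set(
        h for (h,) in db.query(Part.hash).filter(Part.hash.in_(list(incoming)))
    )
    to_insert = [row for h, row in incoming.items() if h not in existing]
    if to_insert:
        db.execute(Part.__table__.insert(), to_insert)   # single executemany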
On Mar 23, 2014, at 11:33 AM, James Meneghello wrote:
I'm having a few issues with unique constraints and bulk inserts. The
software I'm writing takes data from an external source (a lot of it,
anywhere from 1,000 rows per minute to 100-200k+), crunches it down into
its hierarchy and saves it to the DB, to be aggregated in the background.
The func