On Fri, Feb 12, 2021, at 2:06 PM, Cristian Bulgaru wrote:
> Hi Vineet, Mike,
>
> @Vineet, thank you for the interesting blog post on bulk insert with
> SQLAlchemy ORM:
>
I'm not familiar with this exactly, but I have a bit of experience in this
area. I just took a look at this module (nice work!). It's VERY well
documented in the docstrings (even nicer work!).
I think the core bit of this technique looks to be in
`_get_next_sequence_values` -
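For context on what that pre-fetch looks like, here is a minimal sketch of a helper in the spirit of `_get_next_sequence_values` (the name comes from the module under discussion, but this body is an illustrative assumption, not the actual implementation):

```python
# Hypothetical sketch: fetch N values from a Postgres sequence in one
# round trip, instead of issuing one `SELECT nextval(...)` per row.
# The helper name mirrors the module being discussed; the body is an
# assumption for illustration only.

def build_next_values_query(sequence_name: str, count: int) -> str:
    """Build a single query that pulls `count` values from a sequence."""
    # generate_series(1, N) produces N rows, so nextval() runs N times
    # and the result set is the next N sequence values.
    return (
        f"SELECT nextval('{sequence_name}') "
        f"FROM generate_series(1, {count})"
    )

print(build_next_values_query("mytable_seq", 3))
```

Executing that query (e.g. via `session.execute`) and assigning the returned values as primary keys up front is what lets the inserts be batched.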
Hi Vineet, Mike,
@Vineet, thank you for the interesting blog post on bulk insert with
SQLAlchemy ORM:
https://benchling.engineering/sqlalchemy-batch-inserts-a-module-for-when-youre-inserting-thousands-of-rows-and-it-s-slow-16ece0ef5bf7
A few questions:
1. Do we need to get the incremented IDs
Hi Vineet -
glad that worked! I'll have to find some time to recall what we worked out
here and how it came out for you. I wonder where on the site this kind of
thing could be mentioned; we have 3rd party dialects listed out in the docs
but not yet a place for extensions.
On Wed, Feb 19,
>
> if you're using Postgresql, there's a vastly easier technique to use
> which is just to pre-fetch from the sequence:
> identities = [
>     val for val, in session.execute(
>         "select nextval('mytable_seq') from "
>         "generate_series(1,%s)" % len(my_objects))
> ]
>
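To show how those pre-fetched identities get used, here is a minimal, self-contained sketch of the pattern. Since SQLite has no sequences, `itertools.count` stands in for `nextval()`; against Postgres you would run the quoted `generate_series` query instead. Table and column names are made up for illustration.

```python
import itertools
import sqlite3

# Stand-in for the Postgres sequence: in real use, the IDs would be
# pre-fetched with the quoted nextval()/generate_series query. SQLite
# has no sequences, so a plain counter simulates one here.
sequence = itertools.count(start=1)

rows = [("alice",), ("bob",), ("carol",)]

# Pre-fetch one identity per row, before any INSERT happens.
identities = [next(sequence) for _ in rows]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")

# Because every primary key is already known, all rows can go into a
# single batched executemany() instead of one round trip per row to
# read the ID back after each INSERT.
conn.executemany(
    "INSERT INTO users (id, name) VALUES (?, ?)",
    [(pk, name) for pk, (name,) in zip(identities, rows)],
)
conn.commit()

print(conn.execute("SELECT id, name FROM users ORDER BY id").fetchall())
# → [(1, 'alice'), (2, 'bob'), (3, 'carol')]
```

The key point is the ordering: identities are allocated first, then assigned to the in-memory objects, so the flush can be one batched statement.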
On Mon, Oct 9, 2017 at 4:15 AM, wrote:
> Hello! I've spent some time looking at SQLAlchemy's ability to batch
> inserts, and have a few questions about bulk_save_objects (and flushing in
> general).
>
> Two statements that I think are true:
>
> Right now, bulk_save_objects
Hello! I've spent some time looking at SQLAlchemy's ability to batch
inserts, and have a few questions about bulk_save_objects (and flushing in
general).
Two statements that I think are true:
1. Right now, bulk_save_objects does not fetch primary keys for inserted
rows (unless