Re: [pylons-discuss] Saving a lot of data to Postgresql database in Pyramid using SQLAlchemy

2020-03-20 Thread Emerson Barea
Thank you Michael Merickel and Theron Luhn for the answers. I'll try to follow bulk_insert_mappings procedure, but I don't know how to pass database parameters (servername, username, password, database schema) in a connection script like Theron posted. I know that is a silly doubt, but can you

Re: [pylons-discuss] Saving a lot of data to Postgresql database in Pyramid using SQLAlchemy

2020-03-20 Thread Theron Luhn
For CLI scripts like this, I usually skip Pyramid entirely and instantiate the SQLAlchemy session manually. Won’t help your performance woes, but it is less machinery to deal with. I second Michael’s bulk_insert_mappings, it's what I usually reach for in cases like this and the performance is

Re: [pylons-discuss] Saving a lot of data to Postgresql database in Pyramid using SQLAlchemy

2020-03-20 Thread Michael Merickel
dbsession.add on each row is pretty much worse case scenario. Start with https://docs.sqlalchemy.org/en/13/_modules/examples/performance/bulk_inserts.html which shows you how to use

[pylons-discuss] Saving a lot of data to Postgresql database in Pyramid using SQLAlchemy

2020-03-20 Thread Emerson Barea
Hi there. Some times my app needs to create and save almost a million records in a Postgres database. In that case, I'm looking for the best way to do this, because the procedures I've used so far are very slow. I will present some ways that I tried that were very slow, as well the workarounds