On Friday, November 2, 2018 at 7:48:22 PM UTC-4, Ruben Di Battista wrote:
>
> ... 
>
> I was asked to further optimise that query, and the solution that was 
> found was to build the query up textually. I really hate it for a multitude 
> of reasons, but well… I’m not the one making the decisions! :/
> ...
>
 

> The CLI version does not work for us since the resulting data come from a 
> fairly complex optimisation procedure. 
>
>
I think you may have misunderstood my suggestion.  My idea is to use Python 
to generate the INSERT data as a txt/csv file, then use the `mysql` client 
on the server to handle it as a batch import from a file. That procedure 
can be automated across machines using Python libraries like `Fabric`, and 
it will usually get around a lot of bottlenecks, such as foreign key checks, 
DBAPI/cursor overhead, and general network traffic.
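As a minimal sketch of that idea (the rows, column layout, table, and file names here are all hypothetical stand-ins for your optimisation output):

```python
import csv
import os
import tempfile

# Hypothetical rows standing in for the result of the optimisation procedure.
rows = [(1, "alpha", 3.14), (2, "beta", 2.71)]

# Dump them to a plain CSV file that can be shipped to the server.
path = os.path.join(tempfile.gettempdir(), "batch.csv")
with open(path, "w", newline="") as f:
    csv.writer(f).writerows(rows)

# On the server, the file can then be bulk-loaded with the mysql client,
# e.g. (table/database names are assumptions):
#   mysql mydb -e "LOAD DATA LOCAL INFILE 'batch.csv' INTO TABLE my_table
#                  FIELDS TERMINATED BY ',' LINES TERMINATED BY '\r\n'"
```

A Fabric task would just scp the file over and run that `mysql` command remotely.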

I LOVE SQLAlchemy... but if you're talking about a batch job which inserts 
2-3MM rows daily, with a potential scale to 20-30MM... there's only so much 
you can accomplish within SQLAlchemy or even Python. You're at a volume 
where, IMHO, I would use Python to pre-generate SQL to be uploaded onto a 
server for the fastest 'load-in' times possible.
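If you do go the pre-generated-SQL route, it can be as simple as rendering one multi-row INSERT per batch (the `measurements` table and its columns are made up for illustration):

```python
# CAUTION: building SQL textually is only reasonable here because the values
# are trusted numeric data produced by your own code; anything string-valued
# or user-supplied must be escaped or parameterized instead.
rows = [(1, 10.5), (2, 20.25), (3, 30.0)]

values = ",\n".join(f"({pk}, {val})" for pk, val in rows)
sql = f"INSERT INTO measurements (id, value) VALUES\n{values};"

# The resulting statement can be written to a .sql file and piped into the
# server-side client:  mysql mydb < batch.sql
```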

-- 
SQLAlchemy - 
The Python SQL Toolkit and Object Relational Mapper

http://www.sqlalchemy.org/

To post example code, please provide an MCVE: Minimal, Complete, and Verifiable 
Example.  See  http://stackoverflow.com/help/mcve for a full description.
--- 
You received this message because you are subscribed to the Google Groups 
"sqlalchemy" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to sqlalchemy+unsubscr...@googlegroups.com.
To post to this group, send email to sqlalchemy@googlegroups.com.
Visit this group at https://groups.google.com/group/sqlalchemy.
For more options, visit https://groups.google.com/d/optout.
