Hi James, thank you for the info. I'll give this a try. Still, it does 
seem to require that the data be read into Python and then written out to 
the DB, albeit quickly. What I'd prefer is a way to issue a call to the DB 
to have it 'suck in' a CSV file directly, rather than having to read it 
into Python first. Most databases have a bulk loader to handle such jobs, 
so I would think it would be possible to provide a vendor-neutral way to 
call this functionality, but I could be missing something. Perhaps I 
should look at adding this to the SQLAlchemy GitHub project, but I'm not 
sure that the world would want to rely on my code :) 

In any case, thank you so much for taking the time to reply.

Ben

On Sunday, April 19, 2020 at 1:23:06 PM UTC-4, James Fennell wrote:
>
> Hi Ben,
>
> Have you checked out bulk operations? 
> https://docs.sqlalchemy.org/en/13/orm/persistence_techniques.html#bulk-operations
>
> Bulk operations provide a DB-agnostic API for doing large numbers of 
> inserts and/or updates. The speed-up you see, compared to session.add(), 
> depends on your database and the database connection arguments. With 
> Postgres, I've needed to enable batch mode to see the full benefit: 
> https://docs.sqlalchemy.org/en/13/dialects/postgresql.html#psycopg2-batch-mode
>
> Personally, I've seen huge performance improvements in my applications 
> after migrating big operations from session.add() over to the bulk API.
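>
> Something along these lines is what I mean; the model, table, values, 
> and connection URL here are just placeholders:
>
>     from sqlalchemy import Column, Integer, String, create_engine
>     from sqlalchemy.ext.declarative import declarative_base
>     from sqlalchemy.orm import sessionmaker
>
>     Base = declarative_base()
>
>     class Widget(Base):
>         __tablename__ = "widgets"
>         id = Column(Integer, primary_key=True)
>         name = Column(String)
>
>     # executemany_mode="batch" turns on psycopg2's batch helpers
>     # (SQLAlchemy 1.3.7+; earlier 1.x releases use use_batch_mode=True).
>     engine = create_engine(
>         "postgresql://user:pass@localhost/mydb",
>         executemany_mode="batch",
>     )
>     Base.metadata.create_all(engine)
>     session = sessionmaker(bind=engine)()
>
>     # One bulk call instead of thousands of individual session.add()s.
>     session.bulk_insert_mappings(
>         Widget,
>         [{"name": "widget %d" % i} for i in range(10000)],
>     )
>     session.commit()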
>
> Of course, as you say, you can drop down to lower-level SQL calls to get 
> it even faster, but then you run into a bunch of other issues.
>
> James
>
>
> On Sun, Apr 19, 2020, 12:46 PM Ben <benjam...@dataspace.com> 
> wrote:
>
>> I hope this is the right place for this... I need to load large files 
>> into my database. As I understand it, I can do this in one of two ways with 
>> SQLAlchemy Core: 1) Bring the data into Python and then write it out with 
>> the *add* method or, alternatively, 2) Use SQLAlchemy to issue a command 
>> to the DB to use its native bulk loader to read data from my file. I would 
>> expect this second approach to be faster, to require less code, and to 
>> avoid issues such as trying to put too much in memory at one time. However, 
>> it is DB-vendor-specific (i.e. I believe the command I send to a MySQL DB 
>> will differ from that I send to a Postgres DB). 
>>
>> So, 
>>
>>    - Do I properly understand SQLAlchemy's capabilities here or am I 
>>    missing something?
>>    - If I do have this right, is generic access to bulk loaders 
>>    something that is on the upgrade / new development list?
>>
>> Sorry if this isn't the right place for this.
>> Thanks!
>> Ben
>>
>
