In our application we use SQLAlchemy to retrieve data from our Oracle 
database. However, we have a policy that all INSERTs into the Oracle database 
must go through stored procedures. We now have approximately 1000 rows (each 
consisting of 3-4 simple scalars such as numbers or strings) that we need to 
insert.

Question 1 - Is there a more sophisticated approach, such as building an 
array of these 1000 items and submitting the whole collection to Oracle in a 
single round trip? If so, are there any references on how to do this?

Question 2 - If we adopt the naive approach of opening a transaction, making 
1000 calls to this stored procedure (each doing a simple insert), and 
committing to the database only after the 1000th call (a rough sketch follows 
the list below), does this create a bottleneck on database or application 
resources of some sort? Specifically:
a - will multiple threads be created for this?
b - presuming the default connection pool settings 
<https://docs.sqlalchemy.org/en/13/core/pooling.html>, how many connections 
from the pool will be used for the 1000 calls to the inserting stored 
procedure before we commit?
c - would 1000 synchronous calls to this stored procedure pose a serious 
performance or system-overhead issue?



