Hi,

Has anyone tried combining Spark with some of the other Python JDBC/ODBC
packages to create an end-to-end ETL framework? The framework would
enable update, delete, and other DML operations, along with stored
procedure / function calls, across a variety of databases. Ideally the
setup would be easy to use.

I only know of a few ODBC Python packages that are production-ready and
widely used, such as pyodbc or SQLAlchemy.

JayDeBeApi, which can interface with JDBC, is still in beta.
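
For reference, the kind of DML and stored procedure calls I mean look
roughly like this with plain pyodbc (the DSN, table, and procedure name
are placeholders, not from any real setup):

    import pyodbc

    # Hypothetical DSN, table, and procedure, for illustration only.
    conn = pyodbc.connect("DSN=mydsn;UID=user;PWD=secret")
    cursor = conn.cursor()
    # Parameterized DML statement.
    cursor.execute("UPDATE orders SET status = ? WHERE id = ?", "shipped", 42)
    # Stored procedure call via the ODBC call escape syntax.
    cursor.execute("{CALL refresh_daily_totals (?)}", "2017-01-01")
    conn.commit()
    conn.close()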

Would it be a bad use case to attempt this with foreachPartition in
Spark? If not, what would be a good stack for such an implementation in
Python?
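
What I have in mind is something like the sketch below: open one
connection per partition on the executor and run the DML there. This is
only a sketch, assuming a DataFrame df with (id, status) columns and a
placeholder SQL Server connection string:

    import pyodbc

    def update_partition(rows):
        # One connection per partition, opened on the executor,
        # so connections are never serialized from the driver.
        conn = pyodbc.connect(
            "DRIVER={ODBC Driver 17 for SQL Server};"
            "SERVER=myserver;DATABASE=mydb;UID=user;PWD=secret"
        )
        cursor = conn.cursor()
        for row in rows:
            cursor.execute(
                "UPDATE orders SET status = ? WHERE id = ?",
                row.status, row.id,
            )
        conn.commit()
        conn.close()

    df.rdd.foreachPartition(update_partition)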

Regards,
Upkar
