Re: Creating JDBC source table schema (DDL) dynamically

2018-07-12 Thread Kadam, Gangadhar (GE Aviation, Non-GE)
Ok. Thanks. On 7/12/18, 11:12 AM, "Thakrar, Jayesh" wrote: Unless the tables are very small (< 1000 rows), the impact of hitting the catalog tables is negligible. Furthermore, the catalog tables (or views) are usually in memory because they are needed for query compilation,

Re: Creating JDBC source table schema (DDL) dynamically

2018-07-12 Thread Thakrar, Jayesh
Unless the tables are very small (< 1000 rows), the impact of hitting the catalog tables is negligible. Furthermore, the catalog tables (or views) are usually in memory because they are needed for query compilation, query execution (for triggers, referential integrity, etc.) and even to

Re: Creating JDBC source table schema (DDL) dynamically

2018-07-12 Thread Kadam, Gangadhar (GE Aviation, Non-GE)
Thanks Jayesh. I was aware of the catalog table approach, but I was avoiding it because I would hit the database twice for each table: once to create the DDL and once to read the data. I have lots of tables to transport from one environment to the other and I don’t want to create unnecessary load on
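
As a side note, the metadata lookup can ride on the same JDBC connection that later reads the data, via the standard java.sql.DatabaseMetaData API. A minimal sketch, where the URL, credentials, schema, and table name are placeholders:

import java.sql.DriverManager

// Sketch: fetch column metadata through the driver's DatabaseMetaData,
// reusing one connection instead of issuing a separate catalog query.
// URL, credentials, schema, and table name are placeholders.
object ColumnMetadata {
  def main(args: Array[String]): Unit = {
    val conn = DriverManager.getConnection(
      "jdbc:postgresql://source-host:5432/sourcedb", "user", "password")
    try {
      val rs = conn.getMetaData.getColumns(null, "public", "my_table", null)
      while (rs.next()) {
        // TYPE_NAME is the driver-reported SQL type, e.g. varchar or int4.
        println(s"${rs.getString("COLUMN_NAME")}: ${rs.getString("TYPE_NAME")}")
      }
    } finally conn.close()
  }
}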

Re: Creating JDBC source table schema (DDL) dynamically

2018-07-12 Thread Thakrar, Jayesh
One option is to use plain JDBC to interrogate the PostgreSQL catalog for the source table and generate the DDL to create the destination table. Then, using plain JDBC again, create the table at the destination. See the link below for some pointers…
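
A minimal sketch of that approach, assuming an information_schema.columns query (connection details, schema, and table names are placeholders, and the type mapping is deliberately simplified):

import java.sql.DriverManager

// Sketch: derive a CREATE TABLE statement from the source's
// information_schema.columns, then run it at the destination.
// URLs, credentials, schema, and table names are placeholders.
object DdlFromCatalog {
  def main(args: Array[String]): Unit = {
    val src = DriverManager.getConnection(
      "jdbc:postgresql://source-host:5432/sourcedb", "user", "password")
    val ddl =
      try {
        val ps = src.prepareStatement(
          "SELECT column_name, data_type, character_maximum_length, is_nullable " +
          "FROM information_schema.columns " +
          "WHERE table_schema = ? AND table_name = ? ORDER BY ordinal_position")
        ps.setString(1, "public")
        ps.setString(2, "my_table")
        val rs = ps.executeQuery()
        val cols = scala.collection.mutable.ArrayBuffer[String]()
        while (rs.next()) {
          val len = rs.getInt("character_maximum_length") // 0 when NULL
          val sqlType = rs.getString("data_type") + (if (len > 0) s"($len)" else "")
          val notNull = if (rs.getString("is_nullable") == "NO") " NOT NULL" else ""
          cols += s"${rs.getString("column_name")} $sqlType$notNull"
        }
        cols.mkString("CREATE TABLE public.my_table (\n  ", ",\n  ", "\n)")
      } finally src.close()

    // Create the table at the destination with plain JDBC again.
    val tgt = DriverManager.getConnection(
      "jdbc:postgresql://target-host:5432/targetdb", "user", "password")
    try tgt.createStatement().execute(ddl) finally tgt.close()
  }
}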

Creating JDBC source table schema (DDL) dynamically

2018-07-11 Thread Kadam, Gangadhar (GE Aviation, Non-GE)
Hi All, I am trying to build a Spark application which will read the data from PostgreSQL (source) in one environment and write it to PostgreSQL/Aurora (target) in a different environment (e.g. PROD to QA or QA to PROD) using Spark JDBC. When I am loading the dataframe back to
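
A minimal sketch of the transport path described above, using Spark's JDBC reader and writer (URLs, credentials, and the table name are placeholders):

import org.apache.spark.sql.{SaveMode, SparkSession}

// Sketch: read a table from the source PostgreSQL over JDBC and write it
// to the target (e.g. Aurora PostgreSQL). URLs, credentials, and the
// table name are placeholders.
object JdbcTransport {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("jdbc-transport").getOrCreate()

    val df = spark.read.format("jdbc")
      .option("url", "jdbc:postgresql://source-host:5432/sourcedb")
      .option("dbtable", "public.my_table")
      .option("user", "src_user")
      .option("password", "src_password")
      .load()

    // With SaveMode.Overwrite, Spark drops and recreates the target table
    // using column types inferred from the DataFrame schema, so the source
    // DDL is not carried over as-is.
    df.write.format("jdbc")
      .option("url", "jdbc:postgresql://target-host:5432/targetdb")
      .option("dbtable", "public.my_table")
      .option("user", "tgt_user")
      .option("password", "tgt_password")
      .mode(SaveMode.Overwrite)
      .save()

    spark.stop()
  }
}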