The other approach would be to write to a temporary table and then merge the data.
But that may be an expensive solution.
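
A rough sketch of that idea (the table names, date column, JDBC URL, and credentials below are made-up placeholders, not from the original thread): write the DataFrame to an Oracle staging table with the date still as a string, then run a MERGE that converts it with TO_DATE on the database side.

import java.sql.DriverManager
import java.util.Properties

import org.apache.spark.sql.{DataFrame, SaveMode}

def writeViaStaging(df: DataFrame, jdbcUrl: String, props: Properties): Unit = {
  // 1) Land the DataFrame as-is (date column still a string) in a staging table.
  df.write
    .mode(SaveMode.Overwrite)
    .jdbc(jdbcUrl, "STG_TARGET_TABLE", props)

  // 2) Merge staging into the real table, converting the string to DATE in Oracle.
  val conn = DriverManager.getConnection(jdbcUrl, props)
  try {
    conn.createStatement().execute(
      """MERGE INTO TARGET_TABLE t
        |USING (SELECT ID, TO_DATE(EVENT_DT, 'YYYY-MM-DD') AS EVENT_DT
        |         FROM STG_TARGET_TABLE) s
        |ON (t.ID = s.ID)
        |WHEN MATCHED THEN UPDATE SET t.EVENT_DT = s.EVENT_DT
        |WHEN NOT MATCHED THEN INSERT (ID, EVENT_DT) VALUES (s.ID, s.EVENT_DT)""".stripMargin)
  } finally {
    conn.close()
  }
}

The expense is mainly the double write plus the MERGE itself, so this tends to pay off only when the database-side conversion is more involved than a simple cast.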

Thanks
Deepak

On Mon, Mar 19, 2018, 08:04 Gurusamy Thirupathy <thirug...@gmail.com> wrote:

> Hi,
>
> I am trying to read data from Hive as a DataFrame and then write the DF
> into an Oracle database. In this case, the date field/column in Hive has
> type Varchar(20), but the corresponding column type in Oracle is Date.
> While reading from Hive, the Hive table names are decided dynamically
> (read from another table) based on some job condition (e.g. Job1). There
> are multiple tables like this, so the column and table names are known
> only at run time, and I can't do the type conversion explicitly when
> reading from Hive.
>
> So is there any utility/API available in Spark to handle this conversion
> issue?
>
>
> Thanks,
> Guru
>
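
For the conversion question quoted above, one possibility (a sketch only; the table names, JDBC URL, and credentials are placeholders) is to let Spark read the Oracle target's schema at run time and cast the Hive DataFrame's columns to match before writing, so nothing has to be hard-coded per table:

import java.util.Properties

import org.apache.spark.sql.{DataFrame, SaveMode, SparkSession}
import org.apache.spark.sql.functions.col

def copyHiveTableToOracle(spark: SparkSession,
                          hiveTable: String,    // decided at run time
                          oracleTable: String,  // decided at run time
                          jdbcUrl: String,
                          props: Properties): Unit = {
  val source = spark.table(hiveTable)

  // Let Spark resolve the Oracle table's schema; only metadata should be
  // fetched here, no rows.
  val targetSchema = spark.read.jdbc(jdbcUrl, oracleTable, props).schema

  // Cast every source column that also exists in the target to the target's
  // type, so Varchar "date" strings become date/timestamp values.
  val casted = targetSchema.fields.foldLeft(source) { (df, field) =>
    if (df.columns.contains(field.name))
      df.withColumn(field.name, col(field.name).cast(field.dataType))
    else df
  }

  casted.write.mode(SaveMode.Append).jdbc(jdbcUrl, oracleTable, props)
}

Note that the string-to-date cast only succeeds if the Varchar values are in a format Spark's cast understands (e.g. yyyy-MM-dd); otherwise to_date with an explicit format string would be needed for those columns.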
