There was an inherent bug in my code which caused this.

On Wed, Aug 24, 2016 at 8:07 PM, sujeet jog <sujeet....@gmail.com> wrote:
> Hi,
>
> I have a table with the definition below. When I write any records to this
> table, the varchar(20) gets changed to text, and it also loses the
> primary key index. Any idea how to write data with Spark SQL without
> losing the primary key index and data types?
>
> MariaDB [analytics]> show columns from fcast;
> +--------------+-------------+------+-----+-------------------+-----------------------------+
> | Field        | Type        | Null | Key | Default           | Extra                       |
> +--------------+-------------+------+-----+-------------------+-----------------------------+
> | TimeSeriesID | varchar(20) | NO   | PRI |                   |                             |
> | TimeStamp    | timestamp   | NO   | PRI | CURRENT_TIMESTAMP | on update CURRENT_TIMESTAMP |
> | Forecast     | double      | YES  |     | NULL              |                             |
> +--------------+-------------+------+-----+-------------------+-----------------------------+
>
> I'm just doing DF.write.mode("append").jdbc
>
> Thanks,
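For the archive, a sketch of the likely fix. Spark's JDBC writer maps StringType to TEXT whenever it creates the target table itself, and it never emits primary keys, so the usual remedy is to pre-create the table in MariaDB and only ever append to it; later Spark versions (2.2+) also added a `createTableColumnTypes` option to override the default type mapping. The connection URL, credentials, and source DataFrame below are placeholders:

```scala
import java.util.Properties
import org.apache.spark.sql.{SaveMode, SparkSession}

object WriteForecast {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("fcast-write").getOrCreate()
    val df = spark.table("forecasts") // placeholder source DataFrame

    val props = new Properties()
    props.setProperty("user", "analytics")   // placeholder credentials
    props.setProperty("password", "secret")
    props.setProperty("driver", "org.mariadb.jdbc.Driver")

    // Option 1: the table already exists (created with CREATE TABLE,
    // including the PRIMARY KEY). SaveMode.Append only INSERTs rows and
    // never touches the DDL, so varchar(20) and the index survive.
    df.write
      .mode(SaveMode.Append)
      .jdbc("jdbc:mariadb://dbhost:3306/analytics", "fcast", props)

    // Option 2 (Spark 2.2+ only): let Spark create the table, but override
    // the default StringType -> TEXT mapping. Note this still does NOT
    // create a primary key; only column types can be overridden.
    df.write
      .mode(SaveMode.Overwrite)
      .option("createTableColumnTypes",
        "TimeSeriesID VARCHAR(20), TimeStamp TIMESTAMP, Forecast DOUBLE")
      .jdbc("jdbc:mariadb://dbhost:3306/analytics", "fcast", props)
  }
}
```

Option 1 matches the reported symptom: losing the varchar width and the key is what happens when the table is dropped and recreated (e.g. with overwrite mode), not when appending to an existing table.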