Hi,
I came across some strange behavior in Apache Spark 1.6.1: when I read a
MySQL table into a Spark DataFrame, a column of data type float got mapped
to double.

dataframe schema:

root
 |-- id: long (nullable = true)
 |-- ctime: double (nullable = true)
 |-- atime: double (nullable = true)

mysql schema:

mysql> desc test.user_action_2;
+-------+------------------+------+-----+---------+-------+
| Field | Type             | Null | Key | Default | Extra |
+-------+------------------+------+-----+---------+-------+
| id    | int(10) unsigned | YES  |     | NULL    |       |
| ctime | float            | YES  |     | NULL    |       |
| atime | double           | YES  |     | NULL    |       |
+-------+------------------+------+-----+---------+-------+
I wonder if anyone has seen this behavior before.
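For what it's worth, widening seems harmless in itself: MySQL's FLOAT is single precision, and every single-precision value is exactly representable as a double, so no data is lost. The widening does become visible when the value prints, since a double shows more decimal digits of the underlying binary32 rounding. A small plain-Python sketch of that effect (the value 1.1 is just an illustration, not from my table):

```python
import struct

# Simulate a MySQL FLOAT column value: round 1.1 to 4-byte single precision,
# then read it back (Python floats are doubles, mirroring Spark's DoubleType).
single = struct.unpack('<f', struct.pack('<f', 1.1))[0]

# The widened value is the nearest binary32 to 1.1, not 1.1 itself,
# and the double's extra precision makes that rounding visible.
print(single)
print(single == 1.1)
```

So a ctime stored as 1.1 in MySQL can come back as something like 1.10000002... in the DataFrame; that is the stored single-precision value shown at double precision, not corruption introduced by Spark.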
