Can you select 10 sample rows for the columns id and ctime from each table
(MySQL and Spark) and post the output, please?
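
For what it's worth, widening a 32-bit MySQL FLOAT into a 64-bit double is
lossless in itself (every float32 value is exactly representable as a
float64), so any mismatch you see in the sample rows comes from the original
32-bit storage, not from the widening. A minimal Python sketch of that effect
(plain struct round-trip, no Spark or MySQL involved):

```python
import struct

def as_float32(x):
    """Round-trip a value through 32-bit float storage and return it as a
    Python float (which is a 64-bit double), similar to reading a MySQL
    FLOAT column into a double-typed DataFrame column."""
    return struct.unpack('f', struct.pack('f', x))[0]

# 0.5 is exactly representable in 32 bits: widening changes nothing.
print(as_float32(0.5) == 0.5)   # True

# 0.1 is not: the double now shows the float32 rounding of 0.1.
print(as_float32(0.1) == 0.1)   # False
print(as_float32(0.1))          # 0.10000000149011612
```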

HTH

Dr Mich Talebzadeh



LinkedIn:
https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com


*Disclaimer:* Use it at your own risk. Any and all responsibility for any
loss, damage or destruction of data or any other property which may arise
from relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from
such loss, damage or destruction.



On 25 June 2016 at 13:36, 刘虓 <ipf...@gmail.com> wrote:

> Hi,
> I came across this strange behavior of Apache Spark 1.6.1:
> when I was reading a MySQL table into a Spark DataFrame, a column of data
> type float got mapped to double.
>
> dataframe schema:
>
> root
>  |-- id: long (nullable = true)
>  |-- ctime: double (nullable = true)
>  |-- atime: double (nullable = true)
>
> mysql schema:
>
> mysql> desc test.user_action_2;
> +-------+------------------+------+-----+---------+-------+
> | Field | Type             | Null | Key | Default | Extra |
> +-------+------------------+------+-----+---------+-------+
> | id    | int(10) unsigned | YES  |     | NULL    |       |
> | ctime | float            | YES  |     | NULL    |       |
> | atime | double           | YES  |     | NULL    |       |
> +-------+------------------+------+-----+---------+-------+
>
> I wonder if anyone has seen this behavior before.
