[GitHub] spark issue #22888: SPARK-25881

2018-10-30 Thread 351zyf
Github user 351zyf commented on the issue: https://github.com/apache/spark/pull/22888 OK

[GitHub] spark issue #22888: SPARK-25881

2018-10-30 Thread HyukjinKwon
Github user HyukjinKwon commented on the issue: https://github.com/apache/spark/pull/22888 I would close this, @351zyf.

[GitHub] spark issue #22888: SPARK-25881

2018-10-30 Thread HyukjinKwon
Github user HyukjinKwon commented on the issue: https://github.com/apache/spark/pull/22888 You're introducing a flag to convert. I think it's virtually the same: enabling the flag vs. calling a function to convert.
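(A minimal sketch of the function-based equivalent being alluded to here; the helper name decimals_to_double is hypothetical and not part of the PR:)

    from pyspark.sql.functions import col
    from pyspark.sql.types import DecimalType

    def decimals_to_double(df):
        # Cast every DecimalType column to double, which is what the
        # proposed flag would do implicitly inside toPandas().
        for field in df.schema.fields:
            if isinstance(field.dataType, DecimalType):
                df = df.withColumn(field.name, col(field.name).cast("double"))
        return df

    pdf = decimals_to_double(df).toPandas()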

[GitHub] spark issue #22888: SPARK-25881

2018-10-30 Thread 351zyf
Github user 351zyf commented on the issue: https://github.com/apache/spark/pull/22888 > Then you can convert the type into double or floats in the Spark DataFrame. This is super easy to work around in a Pandas DataFrame or Spark's DataFrame. I don't think we should add this flag.

[GitHub] spark issue #22888: SPARK-25881

2018-10-30 Thread HyukjinKwon
Github user HyukjinKwon commented on the issue: https://github.com/apache/spark/pull/22888 Then you can convert the type into double or floats in the Spark DataFrame. This is super easy to work around in a Pandas DataFrame or Spark's DataFrame. I don't think we should add this flag.
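(A minimal sketch of the Spark-side workaround described above; the column name price is a placeholder:)

    from pyspark.sql.functions import col

    # Cast the decimal column to double before converting, so toPandas()
    # yields a float64 column instead of object dtype.
    pdf = df.withColumn("price", col("price").cast("double")).toPandas()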

[GitHub] spark issue #22888: SPARK-25881

2018-10-30 Thread 351zyf
Github user 351zyf commented on the issue: https://github.com/apache/spark/pull/22888 And this also has no effect on timestamp values. Tested.

[GitHub] spark issue #22888: SPARK-25881

2018-10-30 Thread 351zyf
Github user 351zyf commented on the issue: https://github.com/apache/spark/pull/22888 > I think you can just manually convert from Pandas DataFrame, no? If I'm using the function toPandas, I don't think decimal to object is right. Aren't decimal values usually a value to …
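(A minimal sketch reproducing the behavior 351zyf describes, assuming an active SparkSession named spark; the column name price is a placeholder:)

    from decimal import Decimal

    df = spark.createDataFrame([(Decimal("1.23"),)], "price decimal(10, 2)")
    pdf = df.toPandas()
    # The decimal column arrives as object dtype, holding decimal.Decimal
    # values rather than float64.
    print(pdf.dtypes)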

[GitHub] spark issue #22888: SPARK-25881

2018-10-30 Thread HyukjinKwon
Github user HyukjinKwon commented on the issue: https://github.com/apache/spark/pull/22888 I think you can just manually convert from Pandas DataFrame, no?
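(A minimal sketch of the Pandas-side workaround suggested here; the column name price is a placeholder:)

    # Convert the object-dtype decimal column to float64 after toPandas().
    pdf = df.toPandas()
    pdf["price"] = pdf["price"].astype("float64")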

[GitHub] spark issue #22888: SPARK-25881

2018-10-30 Thread AmplabJenkins
Github user AmplabJenkins commented on the issue: https://github.com/apache/spark/pull/22888 Can one of the admins verify this patch?
