[GitHub] spark issue #22888: SPARK-25881
Github user 351zyf commented on the issue: https://github.com/apache/spark/pull/22888

OK

---
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org
Github user HyukjinKwon commented on the issue: https://github.com/apache/spark/pull/22888

I would close this, @351zyf.
Github user HyukjinKwon commented on the issue: https://github.com/apache/spark/pull/22888

You're introducing a flag to convert. I think it's virtually the same: enabling the flag versus calling a function to convert.
Github user 351zyf commented on the issue: https://github.com/apache/spark/pull/22888

> Then, you can convert the type into double or floats in the Spark DataFrame. This is super easy to work around in a Pandas DataFrame or Spark DataFrame. I don't think we should add this flag.
>
> BTW, the same handling should be added for when Arrow optimization is enabled as well.

Or can we correct this conversion in the function dataframe._to_corrected_pandas_type? Converting the decimal type manually every time doesn't sound good.
Github user HyukjinKwon commented on the issue: https://github.com/apache/spark/pull/22888

Then, you can convert the type into double or floats in the Spark DataFrame. This is super easy to work around in a Pandas DataFrame or Spark DataFrame. I don't think we should add this flag.

BTW, the same handling should be added for when Arrow optimization is enabled as well.
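The workaround discussed above can be sketched as follows. This is a hedged illustration, not code from the PR: the column name `amount` and the values are made up, and the Spark-side cast is shown only as a comment because it needs a live SparkSession. It demonstrates that `toPandas()` hands back a `DecimalType` column as dtype `object` holding `decimal.Decimal` instances, and that a one-line `astype` on the pandas side fixes it.

```python
import pandas as pd
from decimal import Decimal

# Spark-side workaround (sketch, requires a SparkSession), cast before collecting:
#   df.withColumn("amount", df["amount"].cast("double")).toPandas()

# Pandas-side workaround: simulate what toPandas() returns for a
# DecimalType column -- an object-dtype column of decimal.Decimal values.
pdf = pd.DataFrame({"amount": [Decimal("1.10"), Decimal("2.25"), Decimal("3.50")]})
assert pdf["amount"].dtype == object  # Decimals arrive as Python objects

# Cast the object column to a numeric dtype after collecting.
pdf["amount"] = pdf["amount"].astype("float64")
```

Either side works; the thread's disagreement is only about whether Spark should do this automatically behind a flag.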
Github user 351zyf commented on the issue: https://github.com/apache/spark/pull/22888

And this also has no effect on timestamp values; tested.
Github user 351zyf commented on the issue: https://github.com/apache/spark/pull/22888

> I think you can just manually convert from the Pandas DataFrame, no?

If I'm using the toPandas function, I don't think decimal-to-object is the right conversion. Aren't decimal values usually meant for calculation? I mean, numbers.
Github user HyukjinKwon commented on the issue: https://github.com/apache/spark/pull/22888

I think you can just manually convert from the Pandas DataFrame, no?
Github user AmplabJenkins commented on the issue: https://github.com/apache/spark/pull/22888

Can one of the admins verify this patch?