[GitHub] [incubator-livy] wypoon commented on a change in pull request #296: [LIVY-771][THRIFT] Do not remove trailing zeros from decimal values.
wypoon commented on a change in pull request #296: URL: https://github.com/apache/incubator-livy/pull/296#discussion_r452339853

## File path: thriftserver/session/src/main/java/org/apache/livy/thriftserver/session/ResultSet.java

```
@@ -88,7 +88,7 @@ private String toHiveString(Object value, boolean quoteStrings) {
     } else if (quoteStrings && value instanceof String) {
       return "\"" + value + "\"";
     } else if (value instanceof BigDecimal) {
-      return ((BigDecimal) value).stripTrailingZeros().toString();
+      return ((BigDecimal) value).toString();
```

Review comment: Ok, I see that you need a global temp view:

```
scala> df.createGlobalTempView("aa")
Hive Session ID = fe4de24f-f7dc-496c-8dde-460c24f68548

scala> val spark2 = spark.newSession()
spark2: org.apache.spark.sql.SparkSession = org.apache.spark.sql.SparkSession@59ede173

scala> spark2.sql("select * from global_temp.aa").show()
+----+
|  id|
+----+
|0E+1|
|0E+1|
|0E+1|
|0E+1|
|0E+1|
|1E+1|
|1E+1|
|1E+1|
|1E+1|
|1E+1|
+----+
```

In any case, I still stand by my argument. There is exactly zero chance that any actual users out there are doing this.

This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org
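To illustrate why the removed `stripTrailingZeros()` call produces output like `1E+1` in the session above, here is a minimal standalone sketch of the underlying `java.math.BigDecimal` behavior (this is not Livy code, just a demonstration of the JDK semantics):

```java
import java.math.BigDecimal;

public class StripTrailingZerosDemo {
    public static void main(String[] args) {
        BigDecimal ten = new BigDecimal("10"); // unscaled value 10, scale 0

        // stripTrailingZeros() rewrites 10 as unscaled value 1 with scale -1,
        // and BigDecimal.toString() renders negative scales in scientific notation.
        System.out.println(ten.stripTrailingZeros()); // 1E+1

        // Without stripping, the original representation is preserved.
        System.out.println(ten); // 10
    }
}
```

This is why dropping the `stripTrailingZeros()` call changes the rendered result from `1E+1` back to `10` for such values.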
[GitHub] [incubator-livy] jahstreet commented on pull request #167: [LIVY-588]: Full support for Spark on Kubernetes
jahstreet commented on pull request #167: URL: https://github.com/apache/incubator-livy/pull/167#issuecomment-656291933

Thanks @groodt, I really appreciate your feedback.

> Hopefully someday we can have the ability to run interactive Spark jobs that autoscale on a Kubernetes cluster.

We already can. Please refer to the [Helm chart repo](https://github.com/jahstreet/spark-on-kubernetes-helm).

> Does your branch support Spark 3.0?

This branch is rebased onto master, which already has Spark 3.0 support merged in #300. In the meantime I'm preparing an update of the Docker images and the corresponding Helm charts mentioned above.