HyukjinKwon commented on a change in pull request #32365:
URL: https://github.com/apache/spark/pull/32365#discussion_r660245441



##########
File path: sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala
##########
@@ -3167,6 +3167,16 @@ object SQLConf {
       .booleanConf
       .createWithDefault(true)
 
+  val LEGACY_CAST_TO_STRING =
+    buildConf("spark.sql.legacy.castToString.enabled")
+      .internal()
+      .doc("When true, df.show() will print string as origin format." +
+        " Otherwise, if this is false, which is default, df.show() will print data " +
+        "as HiveResult's format.")

Review comment:
    ```suggestion
        "as Hive's result format.")
    ```
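The diff above uses SQLConf's fluent builder DSL. As a self-contained illustration of how that chain fits together, here is a minimal stand-in for the builder pattern; the `ConfEntry`/`ConfBuilder` names and structure are illustrative only, not Spark's actual `ConfigBuilder` implementation.

```scala
// Minimal sketch of the buildConf(...).internal().doc(...).booleanConf
// .createWithDefault(...) chain seen in the diff. Illustrative names only.
final case class ConfEntry(key: String, doc: String, default: Boolean, isInternal: Boolean)

class ConfBuilder(key: String) {
  private var docText: String = ""
  private var internalFlag: Boolean = false

  def internal(): ConfBuilder = { internalFlag = true; this }
  def doc(d: String): ConfBuilder = { docText = d; this }
  def booleanConf: ConfBuilder = this  // no-op here; in Spark it fixes the value type
  def createWithDefault(d: Boolean): ConfEntry = ConfEntry(key, docText, d, internalFlag)
}

def buildConf(key: String): ConfBuilder = new ConfBuilder(key)

val legacyCastToString = buildConf("spark.sql.legacy.castToString.enabled")
  .internal()
  .doc("When true, df.show() prints the output in the legacy format.")
  .booleanConf
  .createWithDefault(false)
```

Each builder call mutates and returns the same builder, so the chain reads top-down until `createWithDefault` materializes the entry.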

##########
File path: sql/core/src/main/scala/org/apache/spark/sql/Column.scala
##########
@@ -1190,6 +1190,22 @@ class Column(val expr: Expression) extends Logging {
    */
   def cast(to: String): Column = cast(CatalystSqlParser.parseDataType(to))
 
+  /**
+   * Casts the column to a pretty string.
+   * {{{
+   *   // Casts colA to pretty string.
+   *   df.select(df("colA").toPrettyString())
+   * }}}
+   *
+   * @group expr_ops
+   * @since 3.2.0
+   */
+  def toPrettyString: Column = withExpr {

Review comment:
       Why do we need this API?

##########
File path: sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala
##########
@@ -3167,6 +3167,16 @@ object SQLConf {
       .booleanConf
       .createWithDefault(true)
 
+  val LEGACY_CAST_TO_STRING =
+    buildConf("spark.sql.legacy.castToString.enabled")
+      .internal()
+      .doc("When true, df.show() will print string as origin format." +

Review comment:
    ```suggestion
      .doc("When true, df.show() prints the output in the legacy format." +
    ```

##########
File path: sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala
##########
@@ -3167,6 +3167,16 @@ object SQLConf {
       .booleanConf
       .createWithDefault(true)
 
+  val LEGACY_CAST_TO_STRING =
+    buildConf("spark.sql.legacy.castToString.enabled")
+      .internal()
+      .doc("When true, df.show() will print string as origin format." +
+        " Otherwise, if this is false, which is default, df.show() will print data " +

Review comment:
    ```suggestion
        " Otherwise, if this is false, df.show() prints the output " +
    ```

##########
File path: sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala
##########
@@ -3167,6 +3167,16 @@ object SQLConf {
       .booleanConf
       .createWithDefault(true)
 
+  val LEGACY_CAST_TO_STRING =
+    buildConf("spark.sql.legacy.castToString.enabled")
+      .internal()
+      .doc("When true, df.show() will print string as origin format." +
+        " Otherwise, if this is false, which is default, df.show() will print data " +
+        "as HiveResult's format.")

Review comment:
    ```suggestion
        "in the Hive compatible format.")
    ```

##########
File path: sql/core/src/main/scala/org/apache/spark/sql/Column.scala
##########
@@ -1190,6 +1190,22 @@ class Column(val expr: Expression) extends Logging {
    */
   def cast(to: String): Column = cast(CatalystSqlParser.parseDataType(to))
 
+  /**
+   * Casts the column to a pretty string.
+   * {{{
+   *   // Casts colA to pretty string.
+   *   df.select(df("colA").toPrettyString())
+   * }}}
+   *
+   * @group expr_ops
+   * @since 3.2.0
+   */
+  def toPrettyString: Column = withExpr {

Review comment:
       Shall we make this a proper SQL expression, e.g. `to_hive_string`? It makes less sense to expose an API specific to SQL compatibility on `Column` for the DSL.
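To make the `to_hive_string` idea concrete, here is a hedged sketch of what Hive-style rendering could look like: `NULL` for nulls and bracketed collections. The rules shown are illustrative assumptions, not Spark's actual `HiveResult` logic.

```scala
// Hypothetical rendering rules for a to_hive_string-style function.
// Not Spark's HiveResult implementation; an assumed, simplified model.
def toHiveString(v: Any): String = v match {
  case null         => "NULL"                                      // nulls render as NULL
  case s: Seq[_]    => s.map(toHiveString).mkString("[", ",", "]") // arrays in brackets
  case m: Map[_, _] =>                                             // maps as {k:v,...}
    m.map { case (k, x) => s"${toHiveString(k)}:${toHiveString(x)}" }
      .mkString("{", ",", "}")
  case other        => other.toString
}
```

Registering such a function as a proper SQL expression would keep the compatibility behavior addressable from SQL without widening the `Column` DSL surface.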

##########
File path: sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala
##########
@@ -3167,6 +3167,16 @@ object SQLConf {
       .booleanConf
       .createWithDefault(true)
 
+  val LEGACY_CAST_TO_STRING =
+    buildConf("spark.sql.legacy.castToString.enabled")

Review comment:
       Do we plan to drop the legacy string format in the future?

##########
File path: sql/core/src/main/scala/org/apache/spark/sql/Column.scala
##########
@@ -1190,6 +1190,22 @@ class Column(val expr: Expression) extends Logging {
    */
   def cast(to: String): Column = cast(CatalystSqlParser.parseDataType(to))
 
+  /**
+   * Casts the column to a pretty string.
+   * {{{
+   *   // Casts colA to pretty string.
+   *   df.select(df("colA").toPrettyString())
+   * }}}
+   *
+   * @group expr_ops
+   * @since 3.2.0
+   */
+  def toPrettyString: Column = withExpr {

Review comment:
       If this is only for internal purposes, we can hide it via `private[sql]` or similar.
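The visibility point can be shown in miniature: a `private[sql]` member compiles for callers inside the enclosing `sql` scope and is hidden from everyone else. Here an object named `sql` stands in for Spark's `org.apache.spark.sql` package, and `render` is a hypothetical caller.

```scala
// `sql` stands in for the org.apache.spark.sql package in this sketch.
object sql {
  class Column(name: String) {
    // Visible only within the enclosing `sql` scope, as the review suggests.
    private[sql] def toPrettyString: String = s"pretty($name)"
  }
  // Same scope, so this call compiles; outside `sql` it would not.
  def render(c: Column): String = c.toPrettyString
}
```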

##########
File path: sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala
##########
@@ -3167,6 +3167,16 @@ object SQLConf {
       .booleanConf
       .createWithDefault(true)
 
+  val LEGACY_CAST_TO_STRING =
+    buildConf("spark.sql.legacy.castToString.enabled")

Review comment:
       If we want to make this consistent, I think we should add the configuration to `Cast` too.
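The consistency point boils down to `Cast` also consulting the flag when rendering strings. A hedged sketch, with both output formats invented for illustration rather than taken from Spark:

```scala
// Hypothetical cast-to-string that branches on the proposed legacy flag,
// the way Cast could consult the configuration. Formats are stand-ins.
def castToString(v: Any, legacyFormat: Boolean): String = v match {
  case s: Seq[_] if legacyFormat => s.mkString("[", ",", "]")  // assumed legacy style
  case s: Seq[_]                 => s.mkString("[", ", ", "]") // assumed new style
  case other                     => other.toString
}
```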




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


