Github user HyukjinKwon commented on a diff in the pull request:

    https://github.com/apache/spark/pull/22626#discussion_r229981376
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/functions.scala ---
    @@ -3905,6 +3905,47 @@ object functions {
         withExpr(SchemaOfCsv(csv.expr, options.asScala.toMap))
       }
     
    +  /**
    +   * (Scala-specific) Converts a column containing a `StructType` into a CSV string
    +   * with the specified schema. Throws an exception in the case of an unsupported type.
    +   *
    +   * @param e a column containing a struct.
    +   * @param options options to control how the struct column is converted into a CSV string.
    +   *                It accepts the same options as the CSV data source.
    +   *
    +   * @group collection_funcs
    +   * @since 3.0.0
    +   */
    +  def to_csv(e: Column, options: Map[String, String]): Column = withExpr {
    --- End diff --
    
    Let's get rid of this Scala version for now.
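
    For readers following the thread: the pattern in question is a Scala-`Map` overload paired with a Java-friendly `java.util.Map` one that delegates via `asScala.toMap` (visible in the quoted diff). A minimal sketch of that mechanics in plain Scala, with no Spark dependency — `toCsvLike` and its behavior are purely illustrative, not the real `to_csv`:

    ```scala
    import scala.collection.JavaConverters._

    object OverloadSketch {
      // Scala-specific variant: takes an immutable scala.collection.Map.
      def toCsvLike(options: Map[String, String]): String =
        options.map { case (k, v) => s"$k=$v" }.mkString(",")

      // Java-friendly variant: converts the java.util.Map and delegates,
      // mirroring the `options.asScala.toMap` call in functions.scala.
      def toCsvLike(options: java.util.Map[String, String]): String =
        toCsvLike(options.asScala.toMap)
    }
    ```

    Dropping the Scala-specific overload means the remaining `java.util.Map` variant carries the conversion itself; Scala callers can still reach it via `mapAsJavaMap`.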


---
