Adrian Ionescu created SPARK-20193:
--------------------------------------

             Summary: Selecting empty struct causes ExpressionEncoder error.
                 Key: SPARK-20193
                 URL: https://issues.apache.org/jira/browse/SPARK-20193
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 2.1.0
            Reporter: Adrian Ionescu


{{def struct(cols: Column*): Column}}
Given the above signature, and given that the docs contain no note saying that a struct 
with no columns is unsupported, I would expect the following to work:
{{spark.range(3).select(col("id"), struct().as("empty_struct")).collect}}

However, this results in:
{quote}
java.lang.AssertionError: assertion failed: each serializer expression should contains at least one `BoundReference`
  at scala.Predef$.assert(Predef.scala:170)
  at org.apache.spark.sql.catalyst.encoders.ExpressionEncoder$$anonfun$11.apply(ExpressionEncoder.scala:240)
  at org.apache.spark.sql.catalyst.encoders.ExpressionEncoder$$anonfun$11.apply(ExpressionEncoder.scala:238)
  at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
  at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
  at scala.collection.immutable.List.foreach(List.scala:381)
  at scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:241)
  at scala.collection.immutable.List.flatMap(List.scala:344)
  at org.apache.spark.sql.catalyst.encoders.ExpressionEncoder.<init>(ExpressionEncoder.scala:238)
  at org.apache.spark.sql.catalyst.encoders.RowEncoder$.apply(RowEncoder.scala:63)
  at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:64)
  at org.apache.spark.sql.Dataset.org$apache$spark$sql$Dataset$$withPlan(Dataset.scala:2837)
  at org.apache.spark.sql.Dataset.select(Dataset.scala:1131)
  ... 39 elided
{quote}
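
For reference, a self-contained sketch that reproduces the problem outside the REPL (the object and app names below are illustrative only; it assumes Spark 2.1.0 on the classpath):
{code:scala}
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, struct}

// Illustrative standalone reproduction; names are arbitrary.
object EmptyStructRepro {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("SPARK-20193-repro")
      .getOrCreate()

    // struct() is called with zero columns, which the signature permits,
    // but the RowEncoder assertion above fails as soon as select is called.
    spark.range(3).select(col("id"), struct().as("empty_struct")).collect()

    spark.stop()
  }
}
{code}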
