[jira] [Commented] (SPARK-16097) Encoders.tuple should handle null object correctly
[ https://issues.apache.org/jira/browse/SPARK-16097?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15344662#comment-15344662 ]

Xiangrui Meng commented on SPARK-16097:
----------------------------------------

Changed the fix versions to 2.0.1 and 2.1.0 since 2.0.0-RC1 is in vote.

> Encoders.tuple should handle null object correctly
> ---------------------------------------------------
>
>                 Key: SPARK-16097
>                 URL: https://issues.apache.org/jira/browse/SPARK-16097
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.0.0
>            Reporter: Wenchen Fan
>            Assignee: Wenchen Fan
>             Fix For: 2.1.0, 2.0.1
>
> val enc = Encoders.tuple(Encoders.tuple(Encoders.STRING, Encoders.STRING),
>   Encoders.STRING)
> val data = Seq((("a", "b"), "c"), (null, "d"))
> val ds = spark.createDataset(data)(enc)
> checkDataset(ds, (("a", "b"), "c"), (null, "d"))
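Note on the repro quoted above: it depends on an existing SparkSession named spark and on the checkDataset helper from Spark's internal test suite, so it does not compile standalone. Below is a minimal, self-contained sketch of the same scenario, assuming a local Spark 2.x build on the classpath; the object name Spark16097Repro and the plain collect-and-assert check are illustrative stand-ins, not part of the original report.

import org.apache.spark.sql.{Encoders, SparkSession}

object Spark16097Repro {
  def main(args: Array[String]): Unit = {
    // Local session; a build containing the SPARK-16097 fix should pass the assert.
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("SPARK-16097 repro")
      .getOrCreate()

    // Nested tuple encoder for ((String, String), String).
    val enc = Encoders.tuple(
      Encoders.tuple(Encoders.STRING, Encoders.STRING),
      Encoders.STRING)

    // The second row has a null inner tuple; the encoder must round-trip it
    // instead of throwing or producing a corrupted value.
    val data = Seq((("a", "b"), "c"), (null, "d"))
    val ds = spark.createDataset(data)(enc)

    // Plain collect-and-compare in place of the test suite's checkDataset helper.
    val result = ds.collect().toSeq
    assert(result == data, s"expected $data but got $result")

    spark.stop()
  }
}

With the fix in place the null inner tuple survives the round trip; on an unpatched 2.0.0 build it does not, which is what the issue reports.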
[jira] [Commented] (SPARK-16097) Encoders.tuple should handle null object correctly
[ https://issues.apache.org/jira/browse/SPARK-16097?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15341604#comment-15341604 ]

Apache Spark commented on SPARK-16097:
---------------------------------------

User 'cloud-fan' has created a pull request for this issue:
https://github.com/apache/spark/pull/13807

> Encoders.tuple should handle null object correctly
> ---------------------------------------------------
>
>                 Key: SPARK-16097
>                 URL: https://issues.apache.org/jira/browse/SPARK-16097
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.0.0
>            Reporter: Wenchen Fan
>            Assignee: Wenchen Fan
>
> val enc = Encoders.tuple(Encoders.tuple(Encoders.STRING, Encoders.STRING),
>   Encoders.STRING)
> val data = Seq((("a", "b"), "c"), (null, "d"))
> val ds = spark.createDataset(data)(enc)
> checkDataset(ds, (("a", "b"), "c"), (null, "d"))