[ https://issues.apache.org/jira/browse/SPARK-11894?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15020601#comment-15020601 ]
Xiao Li commented on SPARK-11894:
---------------------------------

[~marmbrus] [~rxin] [~cloud_fan] I am stuck on this again. To be honest, I am not familiar with the CodeGen. When the row type is UnsafeRow, the Dataset's collect function returns the right answer. For example:

val ds1 = Seq((null.asInstanceOf[java.lang.Integer], "1"), (new java.lang.Integer(22), "2")).toDS()
ds1.collect()

When the row type is GenericMutableRow, the Dataset's collect function is unable to return the right answer:

val newDS = ds1.joinWith(ds1, lit(true))
newDS.collect()

I suspect it is caused by getStruct. I do not think I can complete this in a short period. Could you take a look at it? Thank you!

> Incorrect results are returned when using null
> ----------------------------------------------
>
>                 Key: SPARK-11894
>                 URL: https://issues.apache.org/jira/browse/SPARK-11894
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>    Affects Versions: 1.6.0
>            Reporter: Xiao Li
>
> In the Dataset APIs, the following two datasets end up being the same:
>
> Seq((new java.lang.Integer(0), "1"), (new java.lang.Integer(22), "2")).toDS()
> Seq((null.asInstanceOf[java.lang.Integer], "1"), (new java.lang.Integer(22), "2")).toDS()
>
> Note: java.lang.Integer is nullable.
> This can generate an incorrect result. For example:
>
> val ds1 = Seq((null.asInstanceOf[java.lang.Integer], "1"), (new java.lang.Integer(22), "2")).toDS()
> val ds2 = Seq((null.asInstanceOf[java.lang.Integer], "1"), (new java.lang.Integer(22), "2")).toDS() // toDF("key", "value").as('df2)
> val res1 = ds1.joinWith(ds2, lit(true)).collect()
>
> The expected result should be:
>
> ((null,1),(null,1))
> ((22,2),(null,1))
> ((null,1),(22,2))
> ((22,2),(22,2))
>
> The actual result is:
>
> ((0,1),(0,1))
> ((22,2),(0,1))
> ((0,1),(22,2))
> ((22,2),(22,2))
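
For reference, a self-contained sketch of the two code paths discussed in the comment above, assuming a Spark 1.6 spark-shell session (in the shell, sqlContext and the imports below may already be in scope; the printed outputs in the comments follow the expected/actual results listed in this issue):

// Minimal reproduction sketch, assuming a Spark 1.6 spark-shell session
// where sqlContext has already been created.
import sqlContext.implicits._
import org.apache.spark.sql.functions.lit

val ds1 = Seq((null.asInstanceOf[java.lang.Integer], "1"),
              (new java.lang.Integer(22), "2")).toDS()

// UnsafeRow path: the null Integer survives collect().
ds1.collect().foreach(println)    // (null,1) and (22,2)

// joinWith nests each side in a struct and goes through GenericMutableRow;
// on an affected build the null field is read back as 0.
val joined = ds1.joinWith(ds1, lit(true))
joined.collect().foreach(println) // ((0,1),(0,1)), ... instead of ((null,1),(null,1)), ...

The first collect reads the flat top-level row directly, while the joinWith result reads each tuple back out of a nested struct, which is the getStruct path the comment above suspects of dropping the null bit.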