[jira] [Comment Edited] (SPARK-8786) Create a wrapper for BinaryType
[ https://issues.apache.org/jira/browse/SPARK-8786?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14647248#comment-14647248 ]

Takeshi Yamamuro edited comment on SPARK-8786 at 7/30/15 6:26 AM:
--

Sorry for the confusion; the current master branch of Spark handles this:

```
import org.apache.spark.sql._
import org.apache.spark.sql.types._

val schema = StructType(StructField("x", BinaryType, nullable = false) :: Nil)
val data = sc.parallelize(
  Row(Array[Byte](1.toByte)) ::
  Row(Array[Byte](1.toByte)) ::
  Row(Array[Byte](2.toByte)) :: Nil)
val df = sqlContext.createDataFrame(data, schema)
df.registerTempTable("test")
sqlContext.sql("SELECT DISTINCT x FROM test").show()

+---+
|  x|
+---+
|[1]|
|[2]|
+---+
```

Create a wrapper for BinaryType
---

Key: SPARK-8786
URL: https://issues.apache.org/jira/browse/SPARK-8786
Project: Spark
Issue Type: Bug
Components: SQL
Reporter: Davies Liu

The hashCode and equals() of Array[Byte] do not check the bytes; we should create a wrapper (internally) that does.

--
This message was sent by Atlassian JIRA (v6.3.4#6332)
-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
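The root cause is that on the JVM, byte[] inherits identity-based equals() and hashCode() from Object, so two arrays with identical contents neither compare equal nor hash alike, which breaks hash-based operations such as DISTINCT and GROUP BY. A minimal sketch of the kind of content-based wrapper the issue asks for (the class name ByteArrayWrapper is illustrative, not necessarily Spark's internal name):

```java
import java.util.Arrays;

// Demonstrates why a wrapper is needed: byte[] inherits Object's
// identity-based equals()/hashCode(), so equal contents don't compare equal.
public class BinaryWrapperDemo {

    // Hypothetical wrapper: delegates equals/hashCode to the
    // content-based java.util.Arrays methods.
    static final class ByteArrayWrapper {
        final byte[] bytes;
        ByteArrayWrapper(byte[] bytes) { this.bytes = bytes; }

        @Override public boolean equals(Object o) {
            return o instanceof ByteArrayWrapper
                && Arrays.equals(bytes, ((ByteArrayWrapper) o).bytes);
        }
        @Override public int hashCode() { return Arrays.hashCode(bytes); }
    }

    public static void main(String[] args) {
        byte[] a = {1};
        byte[] b = {1};
        // Raw arrays: identity comparison, contents ignored.
        System.out.println(a.equals(b));  // false
        // Wrapped arrays: content comparison, safe as hash-map keys.
        System.out.println(new ByteArrayWrapper(a).equals(new ByteArrayWrapper(b)));  // true
    }
}
```

Wrapping values of BinaryType this way (internally, before they are used as keys) makes aggregation and deduplication see equal byte sequences as equal values.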