[jira] [Commented] (SPARK-40678) JSON conversion of ArrayType is not properly supported in Spark 3.2/2.13

2023-02-12 Thread Wei Guo (Jira)


[ https://issues.apache.org/jira/browse/SPARK-40678?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17687573#comment-17687573 ]

Wei Guo commented on SPARK-40678:
-

Fixed by PR 38154: https://github.com/apache/spark/pull/38154

> JSON conversion of ArrayType is not properly supported in Spark 3.2/2.13
> 
>
> Key: SPARK-40678
> URL: https://issues.apache.org/jira/browse/SPARK-40678
> Project: Spark
>  Issue Type: Bug
>  Components: Input/Output
>Affects Versions: 3.2.0
>Reporter: Cédric Chantepie
>Priority: Major
>
> In Spark 3.2 (Scala 2.13), values with {{ArrayType}} are no longer properly 
> supported with JSON; e.g.
> {noformat}
> import org.apache.spark.sql.SparkSession
> case class KeyValue(key: String, value: Array[Byte])
> val spark = SparkSession.builder().master("local[1]").appName("test").getOrCreate()
> import spark.implicits._
> val df = Seq(Array(KeyValue("foo", "bar".getBytes))).toDF()
> df.foreach(r => println(r.json))
> {noformat}
> Expected:
> {noformat}
> [{foo, bar}]
> {noformat}
> Encountered:
> {noformat}
> java.lang.IllegalArgumentException: Failed to convert value ArraySeq([foo,[B@dcdb68f]) (class of class scala.collection.mutable.ArraySeq$ofRef}) with the type of ArrayType(Seq(StructField(key,StringType,false), StructField(value,BinaryType,false)),true) to JSON.
>   at org.apache.spark.sql.Row.toJson$1(Row.scala:604)
>   at org.apache.spark.sql.Row.jsonValue(Row.scala:613)
>   at org.apache.spark.sql.Row.jsonValue$(Row.scala:552)
>   at org.apache.spark.sql.catalyst.expressions.GenericRow.jsonValue(rows.scala:166)
>   at org.apache.spark.sql.Row.json(Row.scala:535)
>   at org.apache.spark.sql.Row.json$(Row.scala:535)
>   at org.apache.spark.sql.catalyst.expressions.GenericRow.json(rows.scala:166)
> {noformat}



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-40678) JSON conversion of ArrayType is not properly supported in Spark 3.2/2.13

2022-10-06 Thread Jira


[ https://issues.apache.org/jira/browse/SPARK-40678?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17613529#comment-17613529 ]

Cédric Chantepie commented on SPARK-40678:
--

In Scala 2.13, the default `Seq` is `scala.collection.immutable.Seq`, so the pattern `Seq[_]` in `Row.toJson` no longer matches `scala.collection.mutable.ArraySeq` (the type seen in the error above).

```
...
  case (s: Seq[_], ArrayType(elementType, _)) =>
    iteratorToJsonArray(s.iterator, elementType)
  case (m: Map[String @unchecked, _], MapType(StringType, valueType, _)) =>
    new JObject(m.toList.sortBy(_._1).map {
      case (k, v) => k -> toJson(v, valueType)
    })
  case (m: Map[_, _], MapType(keyType, valueType, _)) =>
    new JArray(m.iterator.map {
      case (k, v) =>
        new JObject("key" -> toJson(k, keyType) :: "value" -> toJson(v, valueType) :: Nil)
    }.toList)
  case (r: Row, _) => r.jsonValue
  case (v: Any, udt: UserDefinedType[Any @unchecked]) =>
    val dataType = udt.sqlType
    toJson(CatalystTypeConverters.convertToScala(udt.serialize(v), dataType), dataType)
  case _ =>
    throw new IllegalArgumentException(s"Failed to convert value $value " +
      s"(class of ${value.getClass}}) with the type of $dataType to JSON.")
...
```
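The mismatch can be reproduced outside Spark. A minimal sketch, assuming Scala 2.13 (`describe` is a hypothetical helper for illustration, not Spark code):

```scala
import scala.collection.mutable

// In Scala 2.13 the default Seq is scala.collection.immutable.Seq, so a
// `case s: Seq[_]` pattern no longer matches mutable collections such as
// mutable.ArraySeq. Matching on scala.collection.Seq covers both hierarchies.
def describe(v: Any): String = v match {
  case _: Seq[_]                  => "immutable Seq"  // misses mutable.ArraySeq in 2.13
  case _: scala.collection.Seq[_] => "collection.Seq" // matches both
  case _                          => "other"
}

describe(List(1, 2, 3))                   // "immutable Seq"
describe(mutable.ArraySeq.make(Array(1))) // "collection.Seq" under 2.13
```

Under Scala 2.12 both calls take the first branch, since `Seq` there aliases `scala.collection.Seq`; that is why the `Row.json` code only breaks on 2.13.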



