Hello community,

I can't get the `from_json` function to work with the "columnNameOfCorruptRecord"
option.
```
    import org.apache.spark.sql.functions._
    import org.apache.spark.sql.types._
    import spark.implicits._

    // One valid and one malformed JSON record
    val data = Seq(
      "{'number': 1}",
      "{'number': }"
    )

    // Schema explicitly includes the corrupt-record column
    val schema = new StructType()
      .add($"number".int)
      .add($"_corrupt_record".string)

    val sourceDf = data.toDF("column")

    val jsonedDf = sourceDf
      .select(from_json(
        $"column",
        schema,
        Map("mode" -> "PERMISSIVE", "columnNameOfCorruptRecord" -> "_corrupt_record")
      ) as "data")
      .selectExpr("data.number", "data._corrupt_record")

    jsonedDf.show()
```
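For comparison, here is a sketch of the same data going through the DataFrameReader's JSON source instead of `from_json` (it assumes an active `SparkSession` named `spark`); as far as I can tell, this path does fill in the corrupt-record column when the schema contains it:

```scala
// Sketch for comparison: same option via spark.read.json on a Dataset[String].
// Assumes an active SparkSession named `spark`.
import org.apache.spark.sql.types._
import spark.implicits._

val ds = Seq("{'number': 1}", "{'number': }").toDS()

// Schema must include the corrupt-record column for it to be populated
val schema = new StructType()
  .add("number", IntegerType)
  .add("_corrupt_record", StringType)

val df = spark.read
  .schema(schema)
  .option("mode", "PERMISSIVE")
  .option("columnNameOfCorruptRecord", "_corrupt_record")
  .json(ds)

df.show(false)
```

So my question is really whether `from_json` is expected to behave the same way here.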
Can anybody help me get `_corrupt_record` to come out non-empty?

Thanks in advance.



--
Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/
