On 7/1/2016 6:42 AM, Akhil Das wrote:
> case class Holder(str: String, js: JsValue)

Hello,

Thanks!

I tried that before posting the question to the list, but I keep getting the error below even after the map() operation converting (String, JsValue) -> Holder and then calling toDF().

I am simply invoking the following:

val rddDF: DataFrame = rdd.map(x => Holder(x._1, x._2)).toDF()
rddDF.registerTempTable("rddf")

rddDF.schema.mkString(",")


And I get the following:

[2016-07-01 11:57:02,720] WARN .jobserver.JobManagerActor [] [akka://JobServer/user/context-supervisor/test] - Exception from job d4c9d145-92bf-4c64-8904-91c917bd61d3:
java.lang.UnsupportedOperationException: Schema for type play.api.libs.json.JsValue is not supported
        at org.apache.spark.sql.catalyst.ScalaReflection$class.schemaFor(ScalaReflection.scala:718)
        at org.apache.spark.sql.catalyst.ScalaReflection$.schemaFor(ScalaReflection.scala:30)
        at org.apache.spark.sql.catalyst.ScalaReflection$$anonfun$schemaFor$1.apply(ScalaReflection.scala:693)
        at org.apache.spark.sql.catalyst.ScalaReflection$$anonfun$schemaFor$1.apply(ScalaReflection.scala:691)
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
        at scala.collection.immutable.List.foreach(List.scala:318)
        at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
        at scala.collection.AbstractTraversable.map(Traversable.scala:105)
        at org.apache.spark.sql.catalyst.ScalaReflection$class.schemaFor(ScalaReflection.scala:691)
        at org.apache.spark.sql.catalyst.ScalaReflection$.schemaFor(ScalaReflection.scala:30)
        at org.apache.spark.sql.catalyst.ScalaReflection$class.schemaFor(ScalaReflection.scala:630)
        at org.apache.spark.sql.catalyst.ScalaReflection$.schemaFor(ScalaReflection.scala:30)
        at org.apache.spark.sql.SQLContext.createDataFrame(SQLContext.scala:414)
        at org.apache.spark.sql.SQLImplicits.rddToDataFrameHolder(SQLImplicits.scala:94)
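
For context, the trace shows the failure is inside toDF() itself: ScalaReflection.schemaFor has no mapping for play.api.libs.json.JsValue, so wrapping the pair in Holder cannot help while the field type stays JsValue. One workaround I can sketch (hedged: StringHolder is just an illustrative name, and this assumes Play JSON's Json.stringify plus the usual import sqlContext.implicits._ already in scope for toDF, as in my snippet above) is to carry the JSON as a plain String:

import play.api.libs.json.{JsValue, Json}

// Spark SQL can derive a schema for String but not for JsValue,
// so serialize the JSON text before converting to a DataFrame.
case class StringHolder(str: String, js: String)

val stringDF: DataFrame = rdd
  .map(x => StringHolder(x._1, Json.stringify(x._2))) // JsValue -> JSON text
  .toDF()
stringDF.registerTempTable("rddf")

The trade-off is that consumers of the table would have to reparse the column with Json.parse on the way back out.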


