Akhil wrote:
> You can do it like this:
> 
>     lines.foreachRDD { jsonRDD =>
>       // Turn this batch of JSON strings into a DataFrame, inferring the schema.
>       val data = sqlContext.read.json(jsonRDD)
>       data.registerTempTable("mytable")
>       // Query the registered table with SQL.
>       sqlContext.sql("SELECT * FROM mytable")
>     }

See
http://spark.apache.org/docs/latest/streaming-programming-guide.html#dataframe-and-sql-operations
and
http://spark.apache.org/docs/latest/sql-programming-guide.html#json-datasets
for more information.
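
For completeness, here is roughly the pattern the streaming guide describes, sketched under the assumption that `lines` is a DStream[String] of JSON records (for example, the message values of a Kafka direct stream). The wrapper name `processJson` is only illustrative, and the table name `mytable` just follows Akhil's snippet:

    import org.apache.spark.rdd.RDD
    import org.apache.spark.sql.SQLContext
    import org.apache.spark.streaming.dstream.DStream

    def processJson(lines: DStream[String]): Unit = {
      lines.foreachRDD { (jsonRDD: RDD[String]) =>
        // Lazily get or create the singleton SQLContext inside foreachRDD,
        // as the streaming guide shows.
        val sqlContext = SQLContext.getOrCreate(jsonRDD.sparkContext)

        if (!jsonRDD.isEmpty()) {
          // Infer the schema from this batch of JSON strings.
          val data = sqlContext.read.json(jsonRDD)
          data.registerTempTable("mytable")

          // Run SQL over the batch; show() is just a placeholder action.
          sqlContext.sql("SELECT * FROM mytable").show()
        }
      }
    }

Getting the SQLContext via SQLContext.getOrCreate inside foreachRDD, rather than closing over one created once on the driver, is the approach the guide uses so the same code keeps working if the streaming context is recovered from a checkpoint.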


