Thanks for the reply. Sorry I couldn't follow up earlier. Trying to use a Parquet file is not working at all for me.
case class Rec(name: String, pv: Int)

val sqlContext = new org.apache.spark.sql.SQLContext(sc)
import sqlContext.createSchemaRDD

val d1 = sc.parallelize(Array(("a", 10), ("b", 3))).map(e => Rec(e._1, e._2))
d1.saveAsParquetFile("p1.parquet")

val d2 = sc.parallelize(Array(("a", 10), ("b", 3), ("c", 5))).map(e => Rec(e._1, e._2))
d2.saveAsParquetFile("p2.parquet")

val f1 = sqlContext.parquetFile("p1.parquet")
val f2 = sqlContext.parquetFile("p2.parquet")

f1.registerAsTable("logs")
f2.insertInto("logs")

This gives the error:

java.lang.AssertionError: assertion failed: No plan for InsertIntoTable Map(), false

It is the same error I got when trying to insert from an RDD into a table, so I guess inserting into Parquet tables is also not supported?
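If insert really isn't supported, I assume the workaround would be to union the two SchemaRDDs and rewrite the data as a new file instead of inserting, something like the sketch below ("p3.parquet" is just a placeholder path I picked, and this rewrites everything rather than appending):

// Union the two SchemaRDDs and persist the combined rows as a new Parquet file
val merged = f1.unionAll(f2)
merged.saveAsParquetFile("p3.parquet")

// Re-read the merged file and register it for SQL queries
val f3 = sqlContext.parquetFile("p3.parquet")
f3.registerAsTable("logs")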