Thanks for your quick replies. I was wrong about HiveContext. Here is what I ran:

    // create a HiveContext from the existing SparkContext (sc)
    val hive = new org.apache.spark.sql.hive.HiveContext(sc)
    // run a HiveQL query against the Hive table and count the result
    val sample = hive.hql("select * from sample10")
    val countHive = sample.count()
    // register the resulting SchemaRDD as a temporary table and query it with SQL
    hive.registerRDDAsTable(sample, "temp")
    hive.sql("select * from temp").count()
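For what it's worth, in later Spark 1.x releases the registration step can be done on the SchemaRDD itself rather than through the context: hql was deprecated in Spark 1.1 in favor of sql (which defaults to HiveQL on a HiveContext), and SchemaRDD gained registerTempTable. A minimal sketch of the same flow under those assumptions, run from a Spark shell built with Hive support where sc is the SparkContext:

```scala
import org.apache.spark.sql.hive.HiveContext

val hive = new HiveContext(sc)
// on a HiveContext, sql() parses HiveQL by default (spark.sql.dialect = "hiveql")
val sample = hive.sql("select * from sample10")
// register directly on the SchemaRDD instead of via registerRDDAsTable
sample.registerTempTable("temp")
hive.sql("select * from temp").count()
```

This avoids the deprecated hql call while keeping the same query-register-requery pattern.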
It works fine now.

Thanks,
Kevin

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/SparkSQL-can-not-use-SchemaRDD-from-Hive-tp10841p10847.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.