Hi all,

I have a few tables in Hive and I want to run queries against them with
Spark as the execution engine.

Can I load these tables directly in the Spark shell and run queries
against them?

I tried:

  val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)
  sqlContext.sql("FROM event_impressions SELECT count(*)")

where event_impressions is the table name.

It gives me the error "org.apache.spark.sql.AnalysisException: no such
table event_impressions; line 1 pos 5".
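
For reference, here is a minimal, self-contained version of what I am
running in spark-shell. (This sketch assumes a Hive-enabled Spark build
with Hive's hive-site.xml on Spark's classpath, e.g. in $SPARK_HOME/conf,
and that event_impressions is an existing table in my Hive metastore.)

  // Run inside spark-shell, where sc is the predefined SparkContext.
  import org.apache.spark.sql.hive.HiveContext

  val sqlContext = new HiveContext(sc)

  // The standard SELECT ... FROM ordering is equivalent HiveQL:
  val result = sqlContext.sql("SELECT count(*) FROM event_impressions")
  result.show()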

Has anybody hit similar issues?


Regards,
jeetendra
