I was trying to implement this example:
http://spark.apache.org/docs/1.3.1/sql-programming-guide.html#hive-tables

It works well when I build Spark from the terminal using the command specified here:
http://spark.apache.org/docs/1.3.1/building-spark.html#building-with-hive-and-jdbc-support
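
For reference, the terminal build command was along these lines (the profile names follow the linked build guide; the Hadoop profile and version are from my setup and may differ for yours):

```shell
# Build Spark 1.3.1 with Hive and JDBC support, per the linked build guide.
# -Phadoop-2.4 / hadoop.version are from my environment; adjust as needed.
mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 \
    -Phive -Phive-thriftserver -DskipTests clean package
```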

But when I try to run it in IntelliJ, following the setup described here:
https://cwiki.apache.org/confluence/display/SPARK/Useful+Developer+Tools#UsefulDeveloperTools-IntelliJ

It throws the error:
val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)
<console>:21: error: object hive is not a member of package org.apache.spark.sql

Can anyone help me get past this issue?

Regards
Akhil
