On Fri, Mar 6, 2015 at 2:47 PM, nitinkak001 <nitinkak...@gmail.com> wrote:
> I am trying to run a Hive query from Spark using HiveContext. Here is the
> code
>
> val conf = new SparkConf().setAppName("HiveSparkIntegrationTest")
>
>     conf.set("spark.executor.extraClassPath",
> "/opt/cloudera/parcels/CDH-5.2.0-1.cdh5.2.0.p0.36/lib/hive/lib");
>     conf.set("spark.driver.extraClassPath",
> "/opt/cloudera/parcels/CDH-5.2.0-1.cdh5.2.0.p0.36/lib/hive/lib");
>     conf.set("spark.yarn.am.waitTime", "300000L")

You're missing "/*" at the end of your classpath entries; without it, the
JVM treats the directory as a plain classpath entry and does not pick up
the jars inside it. Also, since you're on CDH 5.2, you'll probably need
to filter out the guava jar from Hive's lib directory, since its Guava
version conflicts with the one Spark ships; otherwise things might
break. So things will get a little more complicated.

With CDH 5.3 you shouldn't need to filter out the guava jar.
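A rough sketch of the guava filtering, assuming the CDH paths from your
config (here faked with a temp directory so the snippet runs anywhere):
build an explicit colon-separated jar list instead of relying on the
"/*" wildcard, skipping any guava jar, and pass it as the value of
spark.executor.extraClassPath / spark.driver.extraClassPath.

```shell
# On a real cluster this would be the actual Hive lib dir, e.g.
# /opt/cloudera/parcels/CDH-5.2.0-1.cdh5.2.0.p0.36/lib/hive/lib.
# We fake it here just to demonstrate the filtering.
HIVE_LIB=$(mktemp -d)
touch "$HIVE_LIB/hive-exec.jar" "$HIVE_LIB/hive-metastore.jar" \
      "$HIVE_LIB/guava-11.0.2.jar"

# Every jar except guava, joined with ":" into a classpath string.
CP=$(ls "$HIVE_LIB"/*.jar | grep -v guava | paste -sd: -)
echo "$CP"
```

You'd then use it like:
spark-submit --conf spark.executor.extraClassPath="$CP" \
             --conf spark.driver.extraClassPath="$CP" ...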

-- 
Marcelo

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org