Hi,

I think you just need to put hive-site.xml in the spark/conf directory, and it
will be loaded onto the Spark classpath.
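For example, with hive-site.xml copied into $SPARK_HOME/conf on the machine you
submit from, a HiveContext created as in the sketch below should connect to the
MySQL-backed metastore rather than spinning up a local Derby one. This is only a
minimal sketch against the Spark 1.x Java API; the class name, app name, and
table name are placeholders:

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.hive.HiveContext;

public class HiveQueryExample {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("HiveQueryExample");
        JavaSparkContext jsc = new JavaSparkContext(conf);

        // HiveContext reads hive-site.xml from the classpath, so with the file
        // in spark/conf it should use the configured MySQL metastore, not Derby.
        HiveContext hiveContext = new HiveContext(jsc.sc());

        // "my_table" is a placeholder for an existing Hive table.
        hiveContext.sql("SELECT * FROM my_table LIMIT 10").show();

        jsc.stop();
    }
}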

Best,
Sun.



fightf...@163.com
 
From: Chandra Mohan, Ananda Vel Murugan
Date: 2015-11-27 15:04
To: user
Subject: error while creating HiveContext
Hi, 
 
I am building a Spark SQL application in Java. I created a Maven project in
Eclipse and added all dependencies, including spark-core and spark-sql. I am
creating a HiveContext in my Spark program and then trying to run SQL queries
against my Hive table. When I submit this job to Spark, for some reason it
tries to create a Derby metastore, but my hive-site.xml clearly specifies the
JDBC URL of my MySQL database. So I think my hive-site.xml is not getting
picked up by my Spark program. I specified the hive-site.xml path using the
"--files" argument of spark-submit. I also tried placing the hive-site.xml
file in my jar. I even tried creating a Configuration object with the
hive-site.xml path and updating my HiveContext by calling the addResource()
method.
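(For reference, that last attempt was roughly along the lines of the sketch
below; the helper method and the local path are placeholders, not my actual
code verbatim.)

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.spark.sql.hive.HiveContext;

// Sketch of the attempt: add hive-site.xml as a resource to the Hadoop
// Configuration carried by the HiveContext's underlying SparkContext.
static void addHiveSite(HiveContext hiveContext) {
    Configuration hadoopConf = hiveContext.sparkContext().hadoopConfiguration();
    hadoopConf.addResource(new Path("/path/to/hive-site.xml"));
}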
 
I want to know where I should put the Hive config files (in my jar, in my
Eclipse project, or on my cluster) so that they are picked up correctly by my
Spark program.
 
Thanks for any help. 
 
Regards,
Anand.C
 
