I checked the Spark web UI and confirmed that the conf directory is on the
CLASSPATH. One odd thing: whenever I start spark-shell, I always see the
following message:
WARN NativeCodeLoader: Unable to load native-hadoop library for your
platform... using builtin-java classes where applicable
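For what it's worth, this warning only means the JVM cannot locate libhadoop.so, so Spark falls back to the built-in Java implementations. A quick way to check whether the native library is even present (the install path below is an assumption; adjust it to your setup):

```shell
# Check whether Hadoop's native library is present. The HADOOP_HOME
# default below is an assumption -- point it at your real install.
HADOOP_HOME=${HADOOP_HOME:-/usr/local/hadoop-2.4.1}
if [ -e "$HADOOP_HOME/lib/native/libhadoop.so" ]; then
    echo "native library found"
else
    echo "native library missing"
fi

# If it exists, exporting this (e.g. in conf/spark-env.sh) lets the JVM load it:
# export LD_LIBRARY_PATH="$HADOOP_HOME/lib/native:$LD_LIBRARY_PATH"
```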

At first I thought it was because my Hadoop version was incompatible with the
pre-built Spark: my Hadoop is 2.4.1, while the pre-built Spark is built
against Hadoop 2.2.0. So I built Spark from source against Hadoop 2.4.1, but
I still get the same message.
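For reference, the from-source build I ran was along these lines (the profile and flag names are from the Spark 1.x "Building Spark" docs; double-check them for your release):

```shell
# Build Spark against Hadoop 2.4.1 -- profile/flag names per the Spark 1.x
# "Building Spark" guide; verify them for the release you are on.
mvn -Phadoop-2.4 -Dhadoop.version=2.4.1 -DskipTests clean package
```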

Also, when I set log4j.rootCategory to DEBUG, I got an exception saying
"HADOOP_HOME or hadoop.home.dir are not set", even though I have set
HADOOP_HOME.
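In case it helps with diagnosis: that message comes from Hadoop's org.apache.hadoop.util.Shell, which checks the hadoop.home.dir system property first and then the HADOOP_HOME environment variable, so exporting the variable only in an interactive shell may never reach the JVM that spark-shell launches. A sketch of two ways to set it (the install path is an assumption):

```shell
# Option 1: export it in conf/spark-env.sh so every Spark JVM inherits it.
# The path below is an assumption -- use your actual Hadoop install.
export HADOOP_HOME=/usr/local/hadoop-2.4.1

# Option 2: pass it as a JVM system property when launching the shell:
# spark-shell --driver-java-options "-Dhadoop.home.dir=/usr/local/hadoop-2.4.1"
```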



alee526 wrote
> Could you enable the HistoryServer and provide the properties and CLASSPATH
> for the spark-shell? And run 'env' to list your environment variables?
> 
> By the way, what do the Spark logs say? Enable debug mode to see what's
> going on in spark-shell when it tries to interact with and init the
> HiveContext.


--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/HiveContext-is-creating-metastore-warehouse-locally-instead-of-in-hdfs-tp10838p11147.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.