Hi all, I'm running into trouble with Spark's HBaseTest example. It fails with the exception below:

14/02/14 14:41:58 DEBUG util.Shell: Failed to detect a valid hadoop home directory
java.io.IOException: HADOOP_HOME or hadoop.home.dir are not set.
        at org.apache.hadoop.util.Shell.checkHadoopHome(Shell.java:225)
        at org.apache.hadoop.util.Shell.<clinit>(Shell.java:250)
        at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:76)
        at org.apache.hadoop.yarn.conf.YarnConfiguration.<clinit>(YarnConfiguration.java:345)
        at org.apache.spark.deploy.yarn.YarnSparkHadoopUtil.newConfiguration(YarnSparkHadoopUtil.scala:36)
        at org.apache.spark.deploy.SparkHadoopUtil.<init>(SparkHadoopUtil.scala:32)
        at org.apache.spark.deploy.yarn.YarnSparkHadoopUtil.<init>(YarnSparkHadoopUtil.scala:29)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
        at java.lang.Class.newInstance0(Class.java:355)
        at java.lang.Class.newInstance(Class.java:308)
        at org.apache.spark.deploy.SparkHadoopUtil$.liftedTree1$1(SparkHadoopUtil.scala:76)
        at org.apache.spark.deploy.SparkHadoopUtil$.<init>(SparkHadoopUtil.scala:74)
        at org.apache.spark.deploy.SparkHadoopUtil$.<clinit>(SparkHadoopUtil.scala)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:178)
        at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
        at java.lang.Thread.run(Thread.java:662)

I have set HADOOP_HOME in /root/.bash_profile and /root/.bashrc, but that doesn't help. I also changed the source of the example to pass HADOOP_HOME through the SparkContext environment map, like this:

    val sc = new SparkContext(args(0), "HBaseTest",
      System.getenv("SPARK_HOME"), SparkContext.jarOfClass(this.getClass),
      Map("HADOOP_HOME" -> "/usr/lib/cdh5/hadoop"))

It still doesn't work. What am I doing wrong, and how can I run this example successfully? Thank you for any help.
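For completeness, here is roughly what my modified example looks like as a self-contained unit. The import, the object/main wrapper, and the elided body are just how I have it locally; apart from the SparkContext line above, the rest of the HBaseTest code is unchanged from the Spark distribution:

    import org.apache.spark.SparkContext

    object HBaseTest {
      def main(args: Array[String]) {
        // args(0) is the master URL I pass on the command line
        val sc = new SparkContext(
          args(0),
          "HBaseTest",
          System.getenv("SPARK_HOME"),
          SparkContext.jarOfClass(this.getClass),
          // environment map I hoped the executors would pick up
          Map("HADOOP_HOME" -> "/usr/lib/cdh5/hadoop"))

        // ... rest of the original HBaseTest body unchanged ...

        sc.stop()
      }
    }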
--------------------------------------------------------------------------------
Best Regards!