Which Spark release are you using?
Are there any other clues in the logs? If so, please pastebin them.
Cheers
On Thu, Feb 4, 2016 at 2:49 AM, Valentin Popov
wrote:
> Hi all,
>
> I’m trying to run Spark in local mode, using this code:
>
> SparkConf conf = new
> SparkConf().setAppName("JavaWord2VecExample").setMaster("local[*]");
It is 1.6.0, built from source.
I’m trying it in my Eclipse project and want to use Spark there, so I put
the libraries on the classpath and get no ClassNotFoundException:
akka-actor_2.10-2.3.11.jar
akka-remote_2.10-2.3.11.jar
akka-slf4j_2.10-2.3.11.jar
config-1.2.1.jar
hadoop-auth-2.7.1.jar
Hi all,
I’m trying to run Spark in local mode, using this code:
SparkConf conf = new
SparkConf().setAppName("JavaWord2VecExample").setMaster("local[*]");
JavaSparkContext sc = new JavaSparkContext(conf);
but after a while (~10 sec) I get an exception; here is the stack trace:
I think this is the answer:
HADOOP_HOME or hadoop.home.dir are not set.
Sorry.
2016-02-04 14:10:08 o.a.h.u.Shell [DEBUG] Failed to detect a valid hadoop home
directory
java.io.IOException: HADOOP_HOME or hadoop.home.dir are not set.
at
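The log above points at the fix: tell the Hadoop `Shell` utility where a Hadoop home lives before the SparkContext is created. A minimal sketch of the workaround, assuming `/opt/hadoop` is a placeholder path to a directory containing the Hadoop binaries (on Windows this must contain `bin\winutils.exe`):

```java
public class HadoopHomeWorkaround {
    public static void main(String[] args) {
        // Set hadoop.home.dir BEFORE any Spark/Hadoop class touches
        // o.a.h.u.Shell, which is what logged the DEBUG message above.
        // "/opt/hadoop" is a hypothetical path; point it at a real
        // directory with the Hadoop binaries (bin/winutils.exe on Windows).
        System.setProperty("hadoop.home.dir", "/opt/hadoop");
        System.out.println(System.getProperty("hadoop.home.dir"));

        // Then proceed as in the original code (requires spark-core
        // on the classpath, so shown here as a comment only):
        // SparkConf conf = new SparkConf()
        //     .setAppName("JavaWord2VecExample").setMaster("local[*]");
        // JavaSparkContext sc = new JavaSparkContext(conf);
    }
}
```

Equivalently, you can export `HADOOP_HOME` in the environment that launches Eclipse instead of setting the system property in code. Note that in a local-mode run this message is typically logged at DEBUG and is often harmless unless your job actually needs the native Hadoop utilities.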