What version of Hadoop are you using?

Is that version consistent with the one that was used to build Spark 1.4.0?
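If you are not sure, one quick way to check (just a sketch, and it assumes a local spark-shell can get far enough to start on your Windows box) is to print the Hadoop version that is actually on Spark's classpath:

// typed at the spark-shell prompt; VersionInfo lives in hadoop-common,
// so this shows the Hadoop version Spark is running against
scala> org.apache.hadoop.util.VersionInfo.getVersion

and then compare that with what `hadoop version` prints on the command line for your local Hadoop install.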

Cheers

On Mon, Sep 28, 2015 at 4:36 PM, Renyi Xiong <renyixio...@gmail.com> wrote:

> I tried to run the HdfsTest example on Windows with Spark 1.4.0:
>
> bin\run-example org.apache.spark.examples.HdfsTest <file>
>
> but got the exception below. Does anybody have an idea what went wrong here?
>
> 15/09/28 16:33:56.565 ERROR SparkContext: Error initializing SparkContext.
> java.lang.NullPointerException
>         at java.lang.ProcessBuilder.start(ProcessBuilder.java:1010)
>         at org.apache.hadoop.util.Shell.runCommand(Shell.java:445)
>         at org.apache.hadoop.util.Shell.run(Shell.java:418)
>         at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:650)
>         at org.apache.hadoop.util.Shell.execCommand(Shell.java:739)
>         at org.apache.hadoop.util.Shell.execCommand(Shell.java:722)
>         at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:633)
>         at org.apache.hadoop.fs.FilterFileSystem.setPermission(FilterFileSystem.java:467)
>         at org.apache.spark.scheduler.EventLoggingListener.start(EventLoggingListener.scala:130)
>         at org.apache.spark.SparkContext.<init>(SparkContext.scala:515)
>         at org.apache.spark.examples.HdfsTest$.main(HdfsTest.scala:32)
>         at org.apache.spark.examples.HdfsTest.main(HdfsTest.scala)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
>         at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
>         at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>
