I am running Spark 1.5.0 (pre-built for Hadoop 2.6) with JDK 1.7.

NOTE: I do not have a Hadoop installation.

Whenever I start spark-shell, I get the following error:

Caused by: java.lang.NullPointerException
        at java.lang.ProcessBuilder.start(ProcessBuilder.java:1010)
        at org.apache.hadoop.util.Shell.runCommand(Shell.java:482)
        at org.apache.hadoop.util.Shell.run(Shell.java:455)
        at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:715)
        at org.apache.hadoop.util.Shell.execCommand(Shell.java:808)
        at org.apache.hadoop.util.Shell.execCommand(Shell.java:791)
        at org.apache.hadoop.fs.FileUtil.execCommand(FileUtil.java:1097)
        at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.loadPermissionInfo(RawLocalFileSystem.java:582)
        at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.getPermission(RawLocalFileSystem.java:557)
        at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:599)
        at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:554)
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:508)
        ... 56 more
<console>:10: error: not found: value sqlContext
       import sqlContext.implicits._
              ^
<console>:10: error: not found: value sqlContext
       import sqlContext.sql
              ^
scala>
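
For context, the "not found: value sqlContext" errors appear to be a consequence of the exception above: in Spark 1.5 the shell builds sqlContext (a HiveContext, since Hive support is compiled in) at startup, and starting the Hive SessionState is where the NullPointerException is thrown, so the variable is never bound. Roughly, a sketch of what the shell attempts at startup (assuming the default 1.5 startup sequence):

import org.apache.spark.sql.hive.HiveContext

// The shell binds `sc` first, then constructs the SQL context. Building
// the HiveContext starts a Hive SessionState, which is where the
// NullPointerException above originates.
val sqlContext = new HiveContext(sc)

// These are the two imports the shell then runs; they fail only because
// `sqlContext` was never successfully defined.
import sqlContext.implicits._
import sqlContext.sql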

Can someone point out how we could resolve this issue?
