GitHub user cko commented on the issue:

    https://github.com/apache/spark/pull/19739
  
    When I use the prebuilt version for Hadoop 2.7 on NixOS, I get the exception below everywhere. The underlying issue is patched in Hadoop 2.8, and everything works fine when I build Spark against 2.8.
    Since building Spark is a bit tedious, I assume the most feasible approach is to use the "Pre-built with user-provided Apache Hadoop" version and set the environment variables to point at Hadoop 2.8 [1]?
    
    [1] http://spark.apache.org/docs/2.2.0/hadoop-provided.html
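
    For reference, a minimal sketch of what that setup might look like in `conf/spark-env.sh`, based on the linked docs (the Hadoop install path below is a placeholder, not from the original report):

    ```
    # conf/spark-env.sh -- point a "Hadoop free" Spark build at a user-provided Hadoop
    # If the Hadoop 2.8 'hadoop' binary is on the PATH:
    export SPARK_DIST_CLASSPATH=$(hadoop classpath)
    # Or with an explicit path to the 'hadoop' binary (placeholder path):
    export SPARK_DIST_CLASSPATH=$(/opt/hadoop-2.8.2/bin/hadoop classpath)
    ```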
    
    ```
    [nix-shell:~]$ spark-shell

    17/11/14 22:38:43 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    17/11/14 22:39:12 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
    17/11/14 22:39:12 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
    java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionState':
      at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:981)
      at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
      at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
      at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
      at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
      at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
      at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
      at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
      at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
      at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
      at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)
      at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)
      ... 47 elided
    Caused by: java.lang.reflect.InvocationTargetException: java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveExternalCatalog':
      at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
      at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
      at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
      at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
      at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
      ... 58 more
    Caused by: java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveExternalCatalog':
      at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:169)
      at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
      at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
      at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
      at scala.Option.getOrElse(Option.scala:121)
      at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
      at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
      at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
      at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
      ... 63 more
    Caused by: java.lang.reflect.InvocationTargetException: java.lang.reflect.InvocationTargetException: java.lang.RuntimeException: java.lang.RuntimeException: Error while running command to get file permissions : java.io.IOException: Cannot run program "/bin/ls": error=2, No such file or directory
            at java.lang.ProcessBuilder.start(ProcessBuilder.java:1048)
            at org.apache.hadoop.util.Shell.runCommand(Shell.java:448)
            at org.apache.hadoop.util.Shell.run(Shell.java:418)
            at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:650)
            at org.apache.hadoop.util.Shell.execCommand(Shell.java:739)
            at org.apache.hadoop.util.Shell.execCommand(Shell.java:722)
            at org.apache.hadoop.fs.FileUtil.execCommand(FileUtil.java:1097)
            at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.loadPermissionInfo(RawLocalFileSystem.java:559)
            at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.getPermission(RawLocalFileSystem.java:534)
            at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:599)
            at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:554)
            at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:508)
            at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:192)
            at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
            at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
            at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
            at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
            at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
            at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:366)
            at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:270)
            at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:65)
            at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
            at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
            at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
            at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
            at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
            at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
            at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
            at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
            at scala.Option.getOrElse(Option.scala:121)
            at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
            at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
            at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
            at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
            at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
            at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
            at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
            at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
            at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
            at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
            at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
            at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
            at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
            at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
            at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
            at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
            at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
            at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
            at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)
            at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)
            at <init>(<console>:15)
            at <init>(<console>:42)
            at <init>(<console>:44)
            at .<init>(<console>:48)
            at .<clinit>(<console>)
            at .$print$lzycompute(<console>:7)
            at .$print(<console>:6)
            at $print(<console>)
            at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
            at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
            at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
            at java.lang.reflect.Method.invoke(Method.java:498)
            at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
            at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
            at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
            at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
            at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
            at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
            at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
            at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
            at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
            at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
            at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
            at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
            at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38)
            at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
            at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
            at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
            at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37)
            at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:105)
            at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)
            at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
            at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
            at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
            at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
            at org.apache.spark.repl.Main$.doMain(Main.scala:68)
            at org.apache.spark.repl.Main$.main(Main.scala:51)
            at org.apache.spark.repl.Main.main(Main.scala)
            at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
            at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
            at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
            at java.lang.reflect.Method.invoke(Method.java:498)
            at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
            at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
            at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
            at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
            at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
    Caused by: java.io.IOException: error=2, No such file or directory
            at java.lang.UNIXProcess.forkAndExec(Native Method)
            at java.lang.UNIXProcess.<init>(UNIXProcess.java:247)
            at java.lang.ProcessImpl.start(ProcessImpl.java:134)
            at java.lang.ProcessBuilder.start(ProcessBuilder.java:1029)
            ... 96 more
    ```
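
    The root cause is visible in the last nested exception: Hadoop 2.7's `RawLocalFileSystem$DeprecatedRawLocalFileStatus.loadPermissionInfo` shells out to `/bin/ls` to read file permissions, and NixOS does not ship `/bin/ls` (essentially only `/bin/sh` exists under `/bin` there). A quick way to confirm this on the affected machine (a hypothetical check, not from the original session):

    ```
    # On NixOS, /bin contains little more than sh, so forking /bin/ls fails with error=2 (ENOENT)
    test -x /bin/ls && echo "/bin/ls present" || echo "/bin/ls missing"   # prints "missing" on NixOS
    ```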

