[jira] [Updated] (SPARK-7162) Launcher error in yarn-client

2015-04-27 Thread Sean Owen (JIRA)

 [ https://issues.apache.org/jira/browse/SPARK-7162?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen updated SPARK-7162:
-----------------------------
Priority: Minor  (was: Major)

Looks OK but these dirs typically do not contain subdirectories.
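
For reference, the trace below fails in Client.createConfArchive, which copies every entry of HADOOP_CONF_DIR with Guava's Files.copy; a directory entry such as /etc/hadoop/hadoop makes the underlying FileInputStream throw. A minimal sketch of the kind of guard being discussed, pasteable into spark-shell (collectConfFiles is a hypothetical helper, not the actual Spark patch):

{code:scala}
import java.io.File

// Minimal sketch: pick up only regular files from a Hadoop/YARN conf dir,
// skipping subdirectories that FileInputStream cannot open.
// collectConfFiles is a hypothetical helper, not the actual Spark change.
def collectConfFiles(confDir: File): Seq[File] =
  Option(confDir.listFiles())            // listFiles() returns null on a non-directory
    .getOrElse(Array.empty[File])
    .filter(_.isFile)                    // the guard: directories are skipped
    .toSeq
{code}

With a guard like this, an entry such as /etc/hadoop/hadoop is simply left out of the conf archive instead of aborting submitApplication.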

> Launcher error in yarn-client
> -----------------------------
>
> Key: SPARK-7162
> URL: https://issues.apache.org/jira/browse/SPARK-7162
> Project: Spark
>  Issue Type: Bug
>  Components: YARN
>Affects Versions: 1.4.0
>Reporter: Guoqiang Li
>Priority: Minor
>
> {code:none}
> HADOOP_CONF_DIR=/usr/local/CDH5/hadoop-2.3.0-cdh5.0.1/etc/hadoop/ \
>   ./bin/spark-shell --master yarn-client --driver-memory 8g \
>   --driver-library-path $LD_LIBRARY_PATH:$JAVA_LIBRARY_PATH \
>   --jars lib/hadoop-lzo-0.4.15-gplextras5.0.1-SNAPSHOT.jar
> {code}
> =>
> {code:none}
> Welcome to
>       ____              __
>      / __/__  ___ _____/ /__
>     _\ \/ _ \/ _ `/ __/  '_/
>    /___/ .__/\_,_/_/ /_/\_\   version 1.4.0-SNAPSHOT
>       /_/
> Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_55)
> Type in expressions to have them evaluated.
> Type :help for more information.
> spark.yarn.driver.memoryOverhead is set but does not apply in client mode.
> java.io.FileNotFoundException: /etc/hadoop/hadoop (Is a directory)
> at java.io.FileInputStream.open(Native Method)
> at java.io.FileInputStream.<init>(FileInputStream.java:146)
> at org.spark-project.guava.io.Files$FileByteSource.openStream(Files.java:124)
> at org.spark-project.guava.io.Files$FileByteSource.openStream(Files.java:114)
> at org.spark-project.guava.io.ByteSource.copyTo(ByteSource.java:182)
> at org.spark-project.guava.io.Files.copy(Files.java:417)
> at org.apache.spark.deploy.yarn.Client$$anonfun$createConfArchive$2.apply(Client.scala:374)
> at org.apache.spark.deploy.yarn.Client$$anonfun$createConfArchive$2.apply(Client.scala:372)
> at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
> at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
> at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:226)
> at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:39)
> at scala.collection.mutable.HashMap.foreach(HashMap.scala:98)
> at org.apache.spark.deploy.yarn.Client.createConfArchive(Client.scala:372)
> at org.apache.spark.deploy.yarn.Client.prepareLocalResources(Client.scala:288)
> at org.apache.spark.deploy.yarn.Client.createContainerLaunchContext(Client.scala:466)
> at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:106)
> at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:58)
> at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:141)
> at org.apache.spark.SparkContext.<init>(SparkContext.scala:469)
> at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1016)
> at $iwC$$iwC.<init>(<console>:9)
> at $iwC.<init>(<console>:18)
> at <init>(<console>:20)
> at .<init>(<console>:24)
> at .<clinit>(<console>)
> at .<init>(<console>:7)
> at .<clinit>(<console>)
> at $print(<console>)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)
> at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
> at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
> at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
> at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
> at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
> at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:856)
> at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:901)
> at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:813)
> at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:123)
> at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:122)
> at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
> at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:122)
> at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
> at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$proc
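
The exception itself is easy to reproduce outside Spark: Guava's Files.copy opens its source with a FileInputStream, and a FileInputStream constructed on a directory always throws FileNotFoundException. A minimal standalone sketch (DirOpenRepro and the /tmp path are illustrative, not from the report):

{code:scala}
import java.io.{File, FileInputStream, FileNotFoundException}

// Minimal repro sketch: FileInputStream refuses directories, which is what
// Files.copy hits when HADOOP_CONF_DIR contains a subdirectory.
object DirOpenRepro {
  def main(args: Array[String]): Unit = {
    val dir = new File("/tmp")  // any existing directory stands in for /etc/hadoop/hadoop
    try {
      new FileInputStream(dir).close()
    } catch {
      case e: FileNotFoundException =>
        println(e.getMessage)  // on Linux: "/tmp (Is a directory)", localized by the JVM
    }
  }
}
{code}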

[jira] [Updated] (SPARK-7162) Launcher error in yarn-client

2015-04-27 Thread Guoqiang Li (JIRA)

 [ https://issues.apache.org/jira/browse/SPARK-7162?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Guoqiang Li updated SPARK-7162:
-------------------------------
Summary: Launcher error in yarn-client  (was: spark-shell launcher error in yarn-client)
