[ https://issues.apache.org/jira/browse/SPARK-9485?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14648233#comment-14648233 ]
Philip Adetiloye edited comment on SPARK-9485 at 7/30/15 8:16 PM:
------------------------------------------------------------------

[~srowen] Thanks for the quick reply. It's actually consistent (every time); here are the details of my configuration.

conf/spark-env.sh basically has these settings:

    #!/usr/bin/env bash
    HADOOP_CONF_DIR="/usr/local/hadoop/etc/hadoop"
    SPARK_YARN_QUEUE="dev"

and my conf/slaves:

    10.0.0.204
    10.0.0.205

~/.profile contains my settings here:

    export JAVA_HOME=$(readlink -f /usr/share/jdk1.8.0_45/bin/java | sed "s:bin/java::")
    export HADOOP_INSTALL=/usr/local/hadoop
    export PATH=$PATH:$HADOOP_INSTALL/bin
    export PATH=$PATH:$HADOOP_INSTALL/sbin
    export HADOOP_MAPRED_HOME=$HADOOP_INSTALL
    export HADOOP_COMMON_HOME=$HADOOP_INSTALL
    export HADOOP_HDFS_HOME=$HADOOP_INSTALL
    export YARN_HOME=$HADOOP_INSTALL
    export HADOOP_YARN_HOME=$HADOOP_INSTALL
    export HADOOP_HOME=$HADOOP_INSTALL
    export HADOOP_CONF_DIR=${HADOOP_HOME}"/etc/hadoop"
    export HADOOP_COMMON_HOME=$HADOOP_INSTALL
    export YARN_CONF_DIR=$HADOOP_INSTALL
    export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
    export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"
    export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=/usr/local/hadoop/lib/native"
    export PATH=$PATH:/usr/local/spark/sbin
    export PATH=$PATH:/usr/local/spark/bin
    export LD_LIBRARY_PATH=/usr/local/hadoop/lib/native/:/usr/local/hadoop/lib/native/
    export SCALA_HOME=/usr/local/scala-2.10.4
    export PATH=$SCALA_HOME/bin:$PATH

(A quick sanity check for these config paths is sketched below, after the quoted issue.)

Hope this helps.

Thanks,
- Phil

> Failed to connect to yarn / spark-submit --master yarn-client
> -------------------------------------------------------------
>
>                 Key: SPARK-9485
>                 URL: https://issues.apache.org/jira/browse/SPARK-9485
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Shell, Spark Submit, YARN
>    Affects Versions: 1.4.1
>        Environment: DEV
>            Reporter: Philip Adetiloye
>            Priority: Minor
>
> Spark-submit throws an exception when connecting to YARN, but it works in standalone mode.
> I'm using spark-1.4.1-bin-hadoop2.6 and also tried compiling from source, but got the same exception below.
> spark-submit --master yarn-client
>
> Here is a stack trace of the exception:
>
> 15/07/29 17:32:15 INFO scheduler.DAGScheduler: Stopping DAGScheduler
> 15/07/29 17:32:15 INFO cluster.YarnClientSchedulerBackend: Shutting down all executors
> Exception in thread "Yarn application state monitor" org.apache.spark.SparkException: Error asking standalone scheduler to shut down executors
>         at org.apache.spark.scheduler.cluster.CoarseGrainedSchedulerBackend.stopExecutors(CoarseGrainedSchedulerBackend.scala:261)
>         at org.apache.spark.scheduler.cluster.CoarseGrainedSchedulerBackend.stop(CoarseGrainedSchedulerBackend.scala:266)
>         at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.stop(YarnClientSchedulerBackend.scala:158)
>         at org.apache.spark.scheduler.TaskSchedulerImpl.stop(TaskSchedulerImpl.scala:416)
>         at org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:1411)
>         at org.apache.spark.SparkContext.stop(SparkContext.scala:1644)
>         at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend$$anon$1.run(YarnClientSchedulerBackend.scala:139)
> Caused by: java.lang.InterruptedException
>         at java.util.concurrent.locks.AbstractQueuedSynchronizer.tryAcquireSharedNanos(AbstractQueuedSynchronizer.java:1326)
>         at scala.concurrent.impl.Promise$DefaultPromise.tryAwait(Promise.scala:208)
>         at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:218)
>         at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
>         at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
>         at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
>         at scala.concurrent.Await$.result(package.scala:107)
>         at org.apache.spark.rpc.RpcEndpointRef.askWithRetry(RpcEndpointRef.scala:102)
>         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
>         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
>         at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
>         at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
>         at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
>         at org.apache.spark.repl.Main$.main(Main.scala:31)
>         at org.apache.spark.repl.Main.main(Main.scala)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:497)
>         at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:665)
>         at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:170)
>         at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:193)
>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:112)
>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> java.lang.NullPointerException
>         at org.apache.spark.sql.SQLContext.<init>(SQLContext.scala:193)
>         at org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1033)
>         at $iwC$$iwC.<init>(<console>:9)
>         at $iwC.<init>(<console>:18)
>         at <init>(<console>:20)
>         at .<init>(<console>:24)
>         at .<clinit>(<console>)
>         at .<init>(<console>:7)
>         at .<clinit>(<console>)
>         at $print(<console>)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:497)
>         at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
>         at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
>         at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
>         at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
>         at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
>         at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
>         at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
>         at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
>         at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:130)
>         at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:122)
>         at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
>         at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:122)
>         at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
>         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
>         at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:157)
>         at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
>         at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:106)
>         at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
>         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
>         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
>         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
>         at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
>         at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
>         at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
>         at org.apache.spark.repl.Main$.main(Main.scala:31)
>         at org.apache.spark.repl.Main.main(Main.scala)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:497)
>         at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:665)
>         at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:170)
>         at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:193)
>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:112)
>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> <console>:10: error: not found: value sqlContext
>        import sqlContext.implicits._
>               ^
> <console>:10: error: not found: value sqlContext
>        import sqlContext.sql
>               ^
> Any ideas?
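For reference, yarn-client mode resolves the cluster's ResourceManager from the client-side Hadoop configuration that HADOOP_CONF_DIR / YARN_CONF_DIR point to. Below is a minimal sanity-check sketch, not part of the original setup: the variable names and site-file names are the usual Hadoop defaults, and the paths are assumed to match the ~/.profile above (adjust for your layout). It just confirms each directory actually contains the expected files before running spark-submit:

    #!/usr/bin/env bash
    # Sanity-check sketch (assumed paths, mirroring the ~/.profile above):
    # verify that the directories Spark on YARN reads its client-side
    # Hadoop configuration from actually contain the expected site files.
    for var in HADOOP_CONF_DIR YARN_CONF_DIR; do
        dir="${!var}"                   # indirect expansion: the value of $HADOOP_CONF_DIR / $YARN_CONF_DIR
        echo "$var=${dir:-<unset>}"
        for f in core-site.xml yarn-site.xml; do
            if [ -f "$dir/$f" ]; then
                echo "  found $f"
            else
                echo "  MISSING $f"     # yarn-client mode needs yarn-site.xml to locate the ResourceManager
            fi
        done
    done
    # Show whatever ResourceManager settings yarn-client mode would pick up:
    grep -A 1 'yarn.resourcemanager' "$HADOOP_CONF_DIR/yarn-site.xml" 2>/dev/null

If a directory is missing yarn-site.xml, the YARN client falls back to the default ResourceManager address (0.0.0.0:8032) and the connection attempt fails, which would be consistent with the yarn-client failure reported here.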