Hello everyone. I am fairly sure I am running into a configuration error. I am
trying to use my Spark cluster in cluster mode without success, and so far
search results have not yielded any clues. If I use the same submit command
but with client mode specified, everything works fine. I have tried changing
up the configs with little success. Any suggestions would be greatly
appreciated. The setup currently looks like this: one master on a dedicated
machine and one worker on another machine. I have verified all DNS names and
IP addresses, and that the hosts files are correctly set.

****Launch Command****
bin/spark-submit --class=class.path.Program --deploy-mode cluster --master spark://master:7077 -v /tmp/streaming-0.1.0.jar
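
For comparison, the variant that does work is identical except for the deploy
mode (client instead of cluster):

bin/spark-submit --class=class.path.Program --deploy-mode client --master spark://master:7077 -v /tmp/streaming-0.1.0.jar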

****Driver STDERR on Worker01****

Launch Command: "/opt/java/bin/java" "-cp"
"/data/spark/worker1/work/driver-20150129104531-0000/streaming-0.1.0.jar:::/opt/spark-1.2.0-bin-hadoop2.4/sbin/../conf:/opt/spark-1.2.0-bin-hadoop2.4/lib/spark-assembly-1.2.0-hadoop2.4.0.jar:/opt/spark-1.2.0-bin-hadoop2.4/lib/datanucleus-api-jdo-3.2.6.jar:/opt/spark-1.2.0-bin-hadoop2.4/lib/datanucleus-core-3.2.10.jar:/opt/spark-1.2.0-bin-hadoop2.4/lib/datanucleus-rdbms-3.2.9.jar"
"-Dspark.akka.askTimeout=10" "-Dspark.eventLog.enabled=true"
"-Dspark.app.name=class.path.Program"
"-Dspark.jars=file:/tmp/eas-streaming-0.1.0.jar"
"-Dspark.master=spark://master:7077" "-Dakka.loglevel=WARNING" "-Xms512M"
"-Xmx512M" "org.apache.spark.deploy.worker.DriverWrapper"
"akka.tcp://sparkWorker@worker01:45065/user/Worker" "class.path.Program"
========================================

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/data/spark/worker01/work/driver-20150129104531-0000/eas-streaming-0.1.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/spark-1.2.0-bin-hadoop2.4/lib/spark-assembly-1.2.0-hadoop2.4.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
15/01/29 10:45:36 INFO spark.SecurityManager: Changing view acls to: userspark,
15/01/29 10:45:36 INFO spark.SecurityManager: Changing modify acls to: userspark,
15/01/29 10:45:36 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(userspark, ); users with modify permissions: Set(userspark, )
Exception in thread "main" com.typesafe.config.ConfigException$Missing: No
configuration setting found for key 'akka.remote.gate-invalid-addresses-for'
        at com.typesafe.config.impl.SimpleConfig.findKey(SimpleConfig.java:124)
        at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:145)
        at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:151)
        at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:151)
        at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:159)
        at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:164)
        at 
com.typesafe.config.impl.SimpleConfig.getDuration(SimpleConfig.java:260)
        at
com.typesafe.config.impl.SimpleConfig.getMilliseconds(SimpleConfig.java:249)
        at akka.remote.RemoteSettings.<init>(RemoteSettings.scala:60)
        at
akka.remote.RemoteActorRefProvider.<init>(RemoteActorRefProvider.scala:114)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:408)
        at
akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
        at scala.util.Try$.apply(Try.scala:161)
        at
akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)
        at
akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
        at
akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
        at scala.util.Success.flatMap(Try.scala:200)
        at
akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:84)
        at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:550)
        at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
        at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
        at
org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121)
        at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
        at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
        at
org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1446)
        at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
        at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1442)
        at 
org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56)
        at
org.apache.spark.deploy.worker.DriverWrapper$.main(DriverWrapper.scala:33)
        at 
org.apache.spark.deploy.worker.DriverWrapper.main(DriverWrapper.scala)
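
From what I can tell, the missing key 'akka.remote.gate-invalid-addresses-for'
is something Akka's RemoteSettings reads out of akka-remote's reference.conf,
so my guess is the driver is loading an Akka class whose matching
reference.conf is missing or was clobbered when the application jar was
assembled (or the fat jar bundles a different Akka version than Spark's).
Note that in cluster mode the launch command above puts streaming-0.1.0.jar
ahead of the spark-assembly jar on the driver's classpath, which might
explain why the exact same jar runs fine in client mode. One way to check
whether the fat jar bundles its own Akka or Typesafe config (paths as in the
launch command; adjust as needed):

jar tf /tmp/streaming-0.1.0.jar | grep -E 'akka|reference.conf'
unzip -p /tmp/streaming-0.1.0.jar reference.conf | grep gate-invalid-addresses-for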
        
****Master env.sh****

export SPARK_MASTER_HOST=`hostname -f`
export SPARK_MASTER_IP=$SPARK_MASTER_HOST
export SPARK_LOCAL_IP=$SPARK_MASTER_HOST
export SPARK_PUBLIC_DNS=$SPARK_MASTER_HOST

****Master/Worker spark-defaults.conf****

spark.master                     spark://prdslhlcsyacm01.myfamilysouth.com:7077
spark.eventLog.enabled           true

****Worker env.sh****

SPARK_MASTER_HOST=master
SPARK_MASTER_IP=$SPARK_MASTER_HOST
SPARK_PUBLIC_DNS=`hostname -f`
SPARK_LOCAL_IP=$SPARK_PUBLIC_DNS
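
In case the problem really is the assembly, my next step is to rebuild the
jar with Spark (and therefore Akka) marked as provided, and with the
per-module reference.conf files concatenated instead of overwritten. A sketch
of the change, assuming the jar is built with sbt-assembly (our actual build
may differ):

// build.sbt -- sketch only; assumes the sbt-assembly plugin is applied
// Keep Spark (and its Akka) out of the fat jar entirely.
libraryDependencies += "org.apache.spark" %% "spark-streaming" % "1.2.0" % "provided"

// Concatenate reference.conf files from all modules instead of letting one
// module's copy overwrite the others, which silently drops Akka settings.
assemblyMergeStrategy in assembly := {
  case "reference.conf" => MergeStrategy.concat
  case x =>
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)
}

With Spark marked provided, the Akka classes and their reference.conf would
come only from the spark-assembly jar on the worker, so the driver classpath
ordering in cluster mode should no longer matter.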




