I am running ./bin/spark-class from the workers.
I have added my slaves in the conf/slaves file. Both ./sbin/start-all.sh and
./sbin/start-slaves.sh return the "Failed to launch Worker" exception with the
log in the first mail.
I am using a standalone Spark cluster with Hadoop 1.2.
Where are you running spark-class from? Hopefully also on the workers.
If you're trying to centrally start/stop all workers, you can add a
"slaves" file to the Spark conf/ directory which is just a list of your
hosts, one per line. Then you can just use "./sbin/start-slaves.sh" to
start the workers on all of those hosts.
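For reference, a minimal conf/slaves file and the corresponding command might look
like the following (the hostnames are placeholders, not from this thread):

  # conf/slaves -- one worker host per line
  worker1.example.com
  worker2.example.com

  # then, on the master node:
  ./sbin/start-slaves.sh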
Yes.
Thanks & Regards,
Meethu M
On Tuesday, 1 July 2014 6:14 PM, Akhil Das wrote:
Is this command working??
java -cp ::/usr/local/spark-1.0.0/conf:/usr/local/spark-1.0.0/assembly/target/scala-2.10/spark-assembly-1.0.0-hadoop1.2.1.jar -XX:MaxPermSize=128m -Dspark.akka.logLifecycleEvents=true -Xms512m -Xmx512m org.apache.spark.deploy.worker.Worker spark://x.x.x.174:7077
Thanks
Hi,
I am using Spark Standalone mode with one master and 2 slaves. I am not able to
start the workers and connect them to the master using
./bin/spark-class org.apache.spark.deploy.worker.Worker spark://x.x.x.174:7077
The log says
Exception in thread "main" org.jboss.netty.channel.ChannelException ...