I'm trying to attach a Spark slave to an already-running master, using: $SPARK_HOME/sbin/start-slave.sh spark://ip-172-31-32-12:7077
However, the command fails because it cannot read the log file it was supposed to create:

failed to launch org.apache.spark.deploy.worker.Worker:
tail: cannot open '/opt/spark/spark-1.0.0-bin-hadoop1/sbin/../logs/spark-ubuntu-org.apache.spark.deploy.worker.Worker-spark://ip-172-31-32-12:7077-ip-172-31-36-80.out' for reading: No such file or directory
full log in /opt/spark/spark-1.0.0-bin-hadoop1/sbin/../logs/spark-ubuntu-org.apache.spark.deploy.worker.Worker-spark://ip-172-31-32-12:7077-ip-172-31-36-80.out

(Ignore the "full log in ..." line, since the log file is not there.) What happened here?

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Spark-slave-fail-to-start-with-wierd-error-information-tp8203.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
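One clue worth noting: the unreadable log filename has the master URL embedded in the middle of it. In Spark 1.x, start-slave.sh forwarded its arguments to spark-daemon.sh, whose first positional argument was a worker *instance number* that gets spliced into the log filename. If that is what happened here, passing the master URL as the first argument would produce a filename containing "spark://..." and thus slashes, which no single path component can hold. The sketch below is a hypothetical reconstruction of that filename assembly (the variable names and exact filename template are assumptions, not the actual script):

```shell
# Hypothetical sketch of how the daemon script seems to build the worker
# log filename:  spark-<user>-<class>-<instance>-<hostname>.out
# Here the master URL has landed in the <instance> slot.
instance="spark://ip-172-31-32-12:7077"   # first argument, misread as instance number
log="spark-ubuntu-org.apache.spark.deploy.worker.Worker-${instance}-ip-172-31-36-80.out"
echo "$log"
# The result contains '/' characters, so it cannot be created as a file,
# which matches the "No such file or directory" from tail.
```

If this guess is right, supplying an explicit worker number before the master URL (e.g. "start-slave.sh 1 spark://ip-172-31-32-12:7077") should yield a sane log filename; check your Spark version's sbin scripts to confirm the expected argument order.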