Hi,

we are trying to set up Apache Spark on a Raspberry Pi cluster for educational
use.
Spark is installed in a Docker container and all necessary ports are exposed.

After we start the master and the workers, all workers are listed as ALIVE in
the master web UI (http://master:8080).

I want to run the SimpleApp example from the Spark quick-start guide
(http://spark.apache.org/docs/latest/quick-start.html#self-contained-applications)
on the cluster to verify that everything is working, but I cannot get it to run.
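
For reference, the application is essentially unchanged from the quick start;
the path to the text file is a placeholder for a file that exists inside our
containers:

import org.apache.spark.SparkConf
import org.apache.spark.SparkContext

object SimpleApp {
  def main(args: Array[String]) {
    // Placeholder path: any text file reachable from all nodes will do
    val logFile = "YOUR_SPARK_HOME/README.md"
    val conf = new SparkConf().setAppName("Simple Application")
    val sc = new SparkContext(conf)
    val logData = sc.textFile(logFile, 2).cache()
    val numAs = logData.filter(line => line.contains("a")).count()
    val numBs = logData.filter(line => line.contains("b")).count()
    println("Lines with a: %s, Lines with b: %s".format(numAs, numBs))
    sc.stop()
  }
}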

I built a jar file with sbt package and submitted the application with
spark-submit.
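The build definition is roughly the simple.sbt from the quick start; the Scala
and Spark versions below are my assumption of what our image uses:

name := "Simple Project"
version := "1.0"
scalaVersion := "2.10.4"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.1"

Submitting gives the following output: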
spark-submit --master spark://master:6066 --deploy-mode cluster \
  --class SimpleApp target/simple-project-1.0.jar
Running Spark using the REST application submission protocol.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
15/10/28 12:54:43 INFO RestSubmissionClient: Submitting a request to launch an
application in spark://localhost:6066.
15/10/28 12:54:43 INFO RestSubmissionClient: Submission successfully created as 
driver-20151028115443-0002. Polling submission state...
15/10/28 12:54:43 INFO RestSubmissionClient: Submitting a request for the
status of submission driver-20151028115443-0002 in spark://localhost:6066.
15/10/28 12:54:43 INFO RestSubmissionClient: State of driver 
driver-20151028115443-0002 is now SUBMITTED.
15/10/28 12:54:43 INFO RestSubmissionClient: Server responded with 
CreateSubmissionResponse:
{
  "action" : "CreateSubmissionResponse",
  "message" : "Driver successfully submitted as driver-20151028115443-0002",
  "serverSparkVersion" : "1.5.1",
  "submissionId" : "driver-20151028115443-0002",
  "success" : true
}

The driver is created correctly, but it never starts the application.
What am I missing?

Regards,
Mark
