Hello,

I'm new to Spark and tried to set up a Spark cluster with one master VM
(SparkV1) and one worker VM (SparkV4); the error is the same if I have
two workers. They now connect to each other without any problem. But
when I submit a job from the master (following
https://spark.apache.org/docs/latest/quick-start.html):

>spark-submit --master spark://SparkV1:7077 examples/src/main/python/pi.py

it seems to run fine and prints "Pi is roughly...", but the worker logs
the following error:

15/02/07 15:22:33 ERROR EndpointWriter: AssociationError
[akka.tcp://sparkWorker@SparkV4:47986] <-
[akka.tcp://sparkExecutor@SparkV4:46630]: Error [Shut down address:
akka.tcp://sparkExecutor@SparkV4:46630] [
akka.remote.ShutDownAssociation: Shut down address:
akka.tcp://sparkExecutor@SparkV4:46630
Caused by: akka.remote.transport.Transport$InvalidAssociationException: The
remote system terminated the association because it is shutting down.
]
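
In case it matters, pi.py is unmodified from the spark-1.2.0
distribution. As far as I can tell, the shipped example is essentially
this Monte Carlo estimator (reproduced here from memory, so it may
differ slightly from the exact file):

import sys
from random import random
from operator import add

from pyspark import SparkContext

if __name__ == "__main__":
    sc = SparkContext(appName="PythonPi")
    # number of partitions; defaults to 2 if not given on the command line
    partitions = int(sys.argv[1]) if len(sys.argv) > 1 else 2
    n = 100000 * partitions

    def f(_):
        # sample a random point in the 2x2 square and count it
        # if it falls inside the unit circle
        x = random() * 2 - 1
        y = random() * 2 - 1
        return 1 if x ** 2 + y ** 2 < 1 else 0

    count = sc.parallelize(xrange(1, n + 1), partitions).map(f).reduce(add)
    print "Pi is roughly %f" % (4.0 * count / n)
    sc.stop()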

More about the setup: each VM has only 4GB of RAM and runs Ubuntu; I'm
using spark-1.2.0, built for Hadoop 2.6.0.
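
In case the configuration matters: I have barely customized anything.
conf/slaves on the master just lists SparkV4, and conf/spark-env.sh on
both VMs is essentially only the following (values reproduced from
memory, so treat them as illustrative):

# conf/spark-env.sh (Spark 1.2 reads SPARK_MASTER_IP, not SPARK_MASTER_HOST)
export SPARK_MASTER_IP=SparkV1
# leave some headroom on the 4GB VMs
export SPARK_WORKER_MEMORY=2g
export SPARK_WORKER_CORES=2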

I have struggled with this error for a few days. Could anyone please tell me
what the problem is and how to fix it?

Thanks,
Lan



