1. It happens to all the classes inside the jar package.
2. I didn't change anything:
   - I have three nodes: one master and two slaves listed in the conf/slaves file.
   - In spark-env.sh I only set the HADOOP_CONF_DIR parameter.
   - In spark-defaults.conf I didn't change anything.
3. The container doesn't even start.
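For reference, the only non-default pieces of the setup described above would look roughly like this (the hostnames and the Hadoop config path are assumptions, not taken from my actual files):

```shell
# conf/slaves -- one worker hostname per line (names are placeholders)
slave1
slave2

# conf/spark-env.sh -- point Spark at the Hadoop/YARN client configs
export HADOOP_CONF_DIR=/etc/hadoop/conf
```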
It seems like there is some problem when sending the jar files. I have just realised I get the following message:

Diagnostics: java.io.IOException: Resource file:/opt/spark/BenchMark-1.0-SNAPSHOT.jar changed on src filesystem (expected 1455792343000, was 1455793100000)

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Error-when-executing-Spark-application-on-YARN-tp26248p26264.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
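For what it's worth, the two numbers in that diagnostic are millisecond Unix timestamps: YARN records the jar's modification time at submission and refuses to localize the resource if the file changes afterwards. A quick sketch to decode them (the interpretation of "expected" vs "was" is my reading of the message, not documented behaviour I've verified):

```python
from datetime import datetime, timezone

# Timestamps copied from the diagnostic message (milliseconds since epoch)
expected = 1455792343000  # mtime YARN recorded when the jar was submitted
was      = 1455793100000  # jar's current mtime on the source filesystem

for ms in (expected, was):
    print(datetime.fromtimestamp(ms / 1000, tz=timezone.utc).isoformat())

# The jar's mtime moved forward between submission and localization --
# consistent with the jar being rebuilt or copied over while the job ran
print((was - expected) / 1000, "seconds apart")
```

The gap is about 12 minutes, which would fit the jar being rebuilt (or re-copied) after spark-submit had already registered it; if that's the case here, re-submitting against the finished jar should avoid the mismatch.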