Your jars are not delivered to the workers. Have a look at this:
http://stackoverflow.com/questions/24052899/how-to-make-it-easier-to-deploy-my-jar-to-spark-cluster-in-standalone-mode
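Concretely, one common way to ship jars to the workers is through spark-submit. A sketch, where the master URL, class name, and jar paths are illustrative placeholders, not taken from this thread:

```shell
# Ship the application jar plus any dependency jars to every executor.
# All paths and names below are hypothetical examples.
spark-submit \
  --master spark://localhost:7077 \
  --class com.example.MyJob \
  --jars deps/lib-a.jar,deps/lib-b.jar \
  target/my-app.jar
```

If you build the context programmatically instead, the equivalent is passing the jar paths to SparkConf.setJars before creating the SparkContext.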
--
Here are reports of the same issue:
[1]
http://stackoverflow.com/questions/28186607/java-lang-classcastexception-using-lambda-expressions-in-spark-job-on-remote-ser
[2]
Update: I deployed a standalone Spark on localhost, set the master to
spark://localhost:7077, and hit the same issue.
I don't know how to solve it.
--
Hi, I am trying to run Spark on a YARN cluster by setting the master to yarn-client in Java
code. It works fine for a count task but fails for other operations.
It throws a ClassCastException:
java.lang.ClassCastException: cannot assign instance of
java.lang.invoke.SerializedLambda to field
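For context on why this exception appears: a Java lambda is serialized as a java.lang.invoke.SerializedLambda proxy, and deserializing it requires the class that defined the lambda to be present on the receiving JVM's classpath. On a Spark worker that never received the application jar, resolution fails with exactly this "cannot assign instance of java.lang.invoke.SerializedLambda" error. A minimal plain-JDK sketch of the round trip (no Spark involved; the class name is a made-up example):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.util.function.Function;

public class LambdaRoundTrip {
    public static void main(String[] args) throws Exception {
        // A lambda is only serializable if its target type is Serializable;
        // here we use an intersection cast to make it so.
        Function<Integer, Integer> f =
            (Function<Integer, Integer> & Serializable) x -> x + 1;

        // Serialize: the JVM writes a java.lang.invoke.SerializedLambda proxy.
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bos)) {
            out.writeObject(f);
        }

        // Deserialize: resolving the proxy back into a lambda needs the
        // defining class (here LambdaRoundTrip) on the classpath. In this
        // single JVM that always succeeds; on a worker missing the
        // application jar, this is the step that throws ClassCastException.
        try (ObjectInputStream in = new ObjectInputStream(
                new ByteArrayInputStream(bos.toByteArray()))) {
            @SuppressWarnings("unchecked")
            Function<Integer, Integer> g =
                (Function<Integer, Integer>) in.readObject();
            System.out.println(g.apply(41)); // prints 42
        }
    }
}
```

This is why a count over a plain RDD can succeed while jobs whose closures reference your own lambdas fail: the failure only surfaces when a worker has to deserialize a lambda defined in a class it does not have.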