You need to add that jar to the classpath. When submitting the job, you
can use the --jars, --driver-class-path, etc. options to add the jar. Apart
from that, if you are running the job as a standalone application, then you
can use sc.addJar to add the jar (which will ship this jar to the executors).
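For example, a spark-submit invocation along these lines would cover both the driver and the executors (the paths, master, and class name below are placeholders, not taken from this thread):

```shell
# Ship the application jar plus its dependency jars to the cluster.
# --jars distributes the listed jars to the driver and every executor;
# --driver-class-path additionally prepends them to the driver's classpath.
spark-submit \
  --master yarn \
  --deploy-mode client \
  --class com.example.StreamingApp \
  --jars /path/to/dependency.jar \
  --driver-class-path /path/to/dependency.jar \
  /path/to/app-with-dependencies.jar
```

Note that --jars takes effect at submit time, whereas sc.addJar can be called from application code after the SparkContext is created.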
After changing to '--deploy-mode client' the program seems to work;
however, it looks like there is a bug in Spark when using --deploy-mode
with 'yarn'. Should I open a bug?
On Tue, Aug 11, 2015 at 3:02 PM, Mohit Anchlia mohitanch...@gmail.com
wrote:
I see the following line in the log 15/08/11 17:59:12 ERROR
spark.SparkContext: Jar not found at
file:/home/ec2-user/./spark-streaming-test-0.0.1-SNAPSHOT-jar-with-dependencies.jar,
however I do see that this file exists on all the nodes at that path. Not
sure what's happening here. Please note I