I've seen this exact same problem too and I've been ignoring it, but I wonder if
I'm losing data. Can anyone at least comment on this?
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/ClassNotFound-for-user-class-in-uber-jar-tp10613p11902.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
Not sure if this problem reached the Spark developers, because Nabble shows
that "This post has NOT been accepted by the mailing list yet":
http://apache-spark-user-list.1001560.n3.nabble.com/ClassNotFound-for-user-class-in-uber-jar-td10613.html#a11902
I'm resubmitting.
Greetings,
I'm currently running Spark 1.0.0 and I see a similar problem when using the
socketTextStream receiver. The ReceiverTracker task sticks around after an
ssc.stop(false).
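For reference, a minimal sketch of the pattern being described: stopping the StreamingContext while keeping the SparkContext alive. The host, port, app name, and timeout below are placeholders, not taken from the original report.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object StopReceiverSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("stop-sketch").setMaster("local[2]")
    val ssc = new StreamingContext(conf, Seconds(1))

    // socketTextStream starts a receiver task managed by the ReceiverTracker
    val lines = ssc.socketTextStream("localhost", 9999)
    lines.print()

    ssc.start()
    ssc.awaitTermination(10000) // run briefly for illustration

    // stop(false): shut down the streaming machinery but keep the SparkContext;
    // the report above is that the ReceiverTracker task lingers after this call
    ssc.stop(stopSparkContext = false)
    ssc.sparkContext.stop()
  }
}
```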
Are there any workarounds for this? Seems to be a dead end so far.
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Using-sbt-pack-with-Spark-1-0-0-tp6649p11502.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
I'm running a 1.0.0 standalone cluster based on amplab/dockerscripts with 3
workers. I'm testing out spark-submit and I get the following error when using
*--deploy-mode cluster* with an http:// URL to my JAR:
Sending launch command to spark://master:7077
Driver
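For context, an invocation matching the setup described above might look like this. The class name and JAR URL are placeholders, not taken from the original report.

```shell
# Submit the driver to the standalone master in cluster deploy mode,
# pointing at a JAR served over HTTP (placeholders throughout).
spark-submit \
  --master spark://master:7077 \
  --deploy-mode cluster \
  --class com.example.Main \
  http://some-host/path/to/app-assembly.jar
```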