... that uses accumulators.
Otherwise, you have to change their code so that the issue above no longer
appears.
Best,
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Bug-in-Accumulators-tp17263p19567.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
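One way to read the advice above: accumulator updates placed inside lazy Spark transformations (map, mapPartitions) only happen when an action forces the computation, so reading the accumulator too early, or re-running the transformation, gives surprising values. The following is a hypothetical sketch, not code from this thread: it uses a plain Scala Iterator and a var counter to reproduce the same laziness trap without Spark or a cluster.

```scala
// Hypothetical sketch (plain Scala, no Spark): Spark transformations such as
// map and mapPartitions are lazy, so a counter incremented inside one is only
// updated once an action forces the computation. A plain Iterator shows the
// same trap:
object LazyAccumulatorDemo {
  def main(args: Array[String]): Unit = {
    var counter = 0                                  // stands in for an accumulator
    val mapped = Iterator(1, 2, 3, 4).map { x =>     // lazy, like rdd.map
      counter += 1
      x * 2
    }
    println(counter)              // 0 -- nothing has run yet
    val result = mapped.toList    // forcing the iterator plays the role of an action
    println(counter)              // 4 -- the updates appear only now
  }
}
```

The safe pattern in Spark is to update accumulators only in code that an action is guaranteed to run exactly once, and to read their values on the driver only after that action has completed.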
    at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:164)
    ... 14 more
It seems that there is a problem with mapPartitions ...
Thanks for your suggestion,
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Bug-in-Accumulators-tp17263p19579.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
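The stack frame above, ClosureCleaner.ensureSerializable, is the check Spark runs before shipping a task. A common cause of that failure with mapPartitions is a closure that captures a non-serializable object (often an enclosing `this`). The following is a hypothetical sketch, not code from this thread: plain JVM serialization reproduces the same failure without a cluster.

```scala
import java.io.{ByteArrayOutputStream, NotSerializableException, ObjectOutputStream}

// Hypothetical sketch: a closure that captures a non-serializable object
// cannot be serialized, which is exactly what ensureSerializable rejects.
object ClosureSerializationDemo {
  class Helper { def tag(x: Int): Int = x + 1 }      // deliberately NOT Serializable

  def canSerialize(obj: AnyRef): Boolean =
    try {
      new ObjectOutputStream(new ByteArrayOutputStream).writeObject(obj)
      true
    } catch {
      case _: NotSerializableException => false
    }

  def main(args: Array[String]): Unit = {
    val helper = new Helper
    val bad: Int => Int = x => helper.tag(x)         // captures `helper`
    println(canSerialize(bad))                       // false

    // Fix: capture only serializable values, or construct the helper inside
    // the task (e.g. at the top of the mapPartitions body) instead of outside.
    val good: Int => Int = x => x + 1
    println(canSerialize(good))                      // true
  }
}
```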
... that no one from the people who maintain Spark is willing to solve, or at
least to tell us about ...
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Bug-in-Accumulators-tp17263p17372.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
Sorry, I forgot to say that this gives the above error only when run on a
cluster, not in local mode.
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Bug-in-Accumulators-tp17263p17277.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
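One plausible reason, offered here only as a hedged guess, why behaviour can differ between local and cluster mode: a cluster task works on a serialized copy of driver-side state, so mutations made in the task never reach the driver, which is the gap accumulators exist to bridge. The following hypothetical sketch (plain Scala, no Spark) imitates shipping an object to an executor by round-tripping it through Java serialization.

```scala
import java.io.{ByteArrayInputStream, ByteArrayOutputStream, ObjectInputStream, ObjectOutputStream}

// Hypothetical sketch: serialize/deserialize produces an independent copy,
// like the copy of driver state a remote task receives.
object CopySemanticsDemo {
  case class Counter(var n: Int)               // case classes are Serializable

  def roundTrip[T <: AnyRef](obj: T): T = {
    val bytes = new ByteArrayOutputStream
    val out = new ObjectOutputStream(bytes)
    out.writeObject(obj)
    out.close()
    new ObjectInputStream(new ByteArrayInputStream(bytes.toByteArray))
      .readObject().asInstanceOf[T]
  }

  def main(args: Array[String]): Unit = {
    val driverSide = Counter(0)
    val executorSide = roundTrip(driverSide)   // what a remote task would see
    executorSide.n += 10                       // the "task" updates its own copy
    println(driverSide.n)                      // 0 -- the driver never sees the update
  }
}
```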
There is for sure a bug in the Accumulators code.
More specifically, the following code works well, as expected:

def main(args: Array[String]) {
  val conf = new SparkConf().setAppName("EL LBP SPARK")
  val sc = new SparkContext(conf)
  val accum = sc.accumulator(0)
  sc.parallelize ...
Works fine for me on the Spark 1.1.0 REPL.

On Sat, Oct 25, 2014 at 1:41 PM, octavian.ganea octavian.ga...@inf.ethz.ch
wrote:
> There is for sure a bug in the Accumulators code.
> More specifically, the following code works well as expected:
> def main(args: Array[String]) {
> val conf = new SparkConf