I am trying to implement counters in Spark, and I guess Accumulators are the
way to do it.
My motive is to update a counter in a map function and access/reset it in the
driver code. However, the /println/ statement at the end still prints 0 (it
should be 9). Am I doing something wrong?
import org.apache.spark.{SparkConf, SparkContext}

def main(args: Array[String]) {
  val conf = new SparkConf().setAppName("SortedNeighbourhoodMatching")
  val sc = new SparkContext(conf)
  var counter = sc.accumulable(0, "Counter")
  var inputFilePath = args(0)
  val inputRDD = sc.textFile(inputFilePath)
  inputRDD.map { x =>
    counter += 1
  }
  println(counter.value)
}
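
For reference, map is a transformation, and transformations are evaluated
lazily: the closure never runs unless an action is invoked on the resulting
RDD, so the accumulator stays at 0. Below is a minimal sketch of what a
working version might look like (assuming the Spark 1.x API; the
CounterExample object name is just an illustrative wrapper), using the
foreach action so the job actually executes:

  import org.apache.spark.{SparkConf, SparkContext}

  // illustrative wrapper object (name is arbitrary)
  object CounterExample {
    def main(args: Array[String]) {
      val conf = new SparkConf().setAppName("SortedNeighbourhoodMatching")
      val sc = new SparkContext(conf)
      // sc.accumulator is the simpler API for a plain Int counter
      val counter = sc.accumulator(0, "Counter")
      val inputRDD = sc.textFile(args(0))
      // foreach is an action, so the job runs and the executors
      // update the accumulator once per input line
      inputRDD.foreach(_ => counter += 1)
      println(counter.value) // prints the number of lines in the input
    }
  }

Alternatively, keeping the map and calling an action on its result (e.g.
count()) would also force evaluation, though Spark only guarantees that
accumulator updates made inside actions are applied exactly once; updates
made inside transformations can be re-applied if a task is retried.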