.map is just a transformation, so no work is actually performed until an 
action is invoked on the RDD.  Try adding a .count(), like so:

inputRDD.map { x =>
  counter += 1
}.count()

In case it is helpful, here are the docs on what exactly the transformations 
and actions are:
http://spark.apache.org/docs/1.2.0/programming-guide.html#transformations
http://spark.apache.org/docs/1.2.0/programming-guide.html#actions
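
In case a self-contained illustration helps, the same laziness can be seen 
with a plain Scala Iterator, no Spark required (a minimal sketch, by way of 
analogy): map only registers the function, and nothing runs until something 
consumes the iterator, just as an RDD's map does nothing until an action 
such as count() runs.

```scala
// Plain-Scala analogy for Spark's lazy transformations: Iterator.map
// is lazy, so the side effect in the function does not happen until
// the iterator is consumed.
object LazyMapDemo {
  def main(args: Array[String]): Unit = {
    var counter = 0
    val it = Iterator(1, 2, 3).map { x => counter += 1; x }
    println(counter) // still 0: map has not run yet
    it.size          // consuming the iterator forces the mapped function
    println(counter) // now 3
  }
}
```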

Cheers,

Sean


On Feb 13, 2015, at 9:50 AM, nitinkak001 <[email protected]> wrote:

I am trying to implement counters in Spark and I guess Accumulators are the
way to do it.

My motive is to update a counter in a map function and access/reset it in the
driver code. However, the /println/ statement at the end still yields value
0 (it should be 9). Am I doing something wrong?

def main(args : Array[String]){

   val conf = new SparkConf().setAppName("SortedNeighbourhoodMatching")
   val sc = new SparkContext(conf)
   var counter = sc.accumulable(0, "Counter")
   var inputFilePath = args(0)
   val inputRDD = sc.textFile(inputFilePath)

   inputRDD.map { x => {
     counter += 1
   } }
   println(counter.value)
}

--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Counters-in-Spark-tp21646.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]

