I'm running a simple streaming application that reads from Kafka, maps the
events, and prints them, and I'm trying to use accumulators to count the
number of mapped records.

While this works in standalone mode (IDE), when submitting to YARN I get a
NullPointerException on accumulator.add(1) or accumulator += 1.
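For reference, here is roughly what the job looks like. This is a minimal sketch of the setup described above, not the exact code: the broker address, topic name, and the toUpperCase transform are placeholders.

```scala
// Minimal sketch of the streaming job (Spark 1.5 Scala API).
// Broker address, topic name, and the map transform are placeholders.
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils
import kafka.serializer.StringDecoder

object AccumulatorRepro {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("AccumulatorRepro")
    val ssc = new StreamingContext(conf, Seconds(5))

    // Accumulator created on the driver, referenced inside map() on the executors
    val mappedRecords = ssc.sparkContext.accumulator(0L, "mappedRecords")

    val kafkaParams = Map("metadata.broker.list" -> "broker:9092")
    val stream = KafkaUtils
      .createDirectStream[String, String, StringDecoder, StringDecoder](
        ssc, kafkaParams, Set("events"))

    val mapped = stream.map { case (_, value) =>
      mappedRecords += 1 // works in local/IDE, NPE here when running on YARN
      value.toUpperCase
    }
    mapped.print()

    ssc.start()
    ssc.awaitTermination()
  }
}
```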

Is anyone using accumulators inside .map() with Spark 1.5 on YARN?

Thanks,
Amit


