Hi All,

I was able to resolve this matter with a simple fix.  It seems that in order
to process the reduceByKey and flatMap operations at the same time as the
receiver, the number of local threads has to be greater than 1; with a single
thread the receiver occupies it and the batch processing never runs.

Since I'm developing on my personal machine for speed, I simply updated the
sparkURL argument to:
   private static String sparkURL = "local[2]";  //Instead of "local"

This value is then passed to JavaStreamingContext as the master URL parameter.

After I made this change, the reduceByKey values were properly aggregated
and counted.
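
For anyone hitting the same thing, here is a minimal sketch of how the fix
fits into a streaming word count.  The socketTextStream source, port 9999,
1-second batch interval, and class name are just my own illustration choices,
not from my actual job, and the flatMap lambda assumes the newer Java API
where flatMap returns an Iterator (older versions expect an Iterable):

   import java.util.Arrays;

   import org.apache.spark.streaming.Duration;
   import org.apache.spark.streaming.api.java.JavaDStream;
   import org.apache.spark.streaming.api.java.JavaPairDStream;
   import org.apache.spark.streaming.api.java.JavaReceiverInputDStream;
   import org.apache.spark.streaming.api.java.JavaStreamingContext;

   import scala.Tuple2;

   public class StreamingWordCount {

       // "local[2]" gives the local driver two threads: one runs the
       // receiver, the other runs the flatMap/reduceByKey processing.
       private static String sparkURL = "local[2]";  //Instead of "local"

       public static void main(String[] args) throws InterruptedException {
           // Master URL and a 1-second batch interval passed to the context.
           JavaStreamingContext jssc =
                   new JavaStreamingContext(sparkURL, "StreamingWordCount", new Duration(1000));

           // Example text source (feed it with e.g. `nc -lk 9999`).
           JavaReceiverInputDStream<String> lines =
                   jssc.socketTextStream("localhost", 9999);

           // Split lines into words, pair each word with 1, then sum per key.
           JavaDStream<String> words =
                   lines.flatMap(line -> Arrays.asList(line.split(" ")).iterator());
           JavaPairDStream<String, Integer> pairs =
                   words.mapToPair(word -> new Tuple2<>(word, 1));
           JavaPairDStream<String, Integer> counts =
                   pairs.reduceByKey((a, b) -> a + b);

           counts.print();

           jssc.start();
           jssc.awaitTermination();
       }
   }

With "local" there is only one thread, so the receiver starves the
reduceByKey stage; with "local[2]" the counts show up as expected.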

Best Regards,

D


