I was able to get it working. Instead of using customers.flatMap to return
alerts, I had to use the following:

customers.foreachRDD(new Function<JavaPairRDD<String, Iterable<QueueEvent>>, Void>() {
    @Override
    public Void call(final JavaPairRDD<String, Iterable<QueueEvent>> rdd) throws Exception {
        rdd.foreachPartition(new VoidFunction<Iterator<Tuple2<String, Iterable<QueueEvent>>>>() {
            @Override
            public void call(final Iterator<Tuple2<String, Iterable<QueueEvent>>> i)
                    throws Exception {
                // process each (customer, events) pair here
            }
        });
        return null;
    }
});

This made sure that we only sent one alert per event per customer. My unit
test showed that there was a single RDD containing both customers with their
events, spread across its partitions.
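
For reference, the body of the partition function ends up looking roughly
like the sketch below. AlertSender.send(customerId, event) is a hypothetical
stand-in for whatever transport actually delivers the alert; QueueEvent is
the same event type used above.

import java.util.Iterator;
import org.apache.spark.api.java.function.VoidFunction;
import scala.Tuple2;

// Sketch only: AlertSender is a placeholder for your own alert-delivery code.
public class SendAlertsPerPartition
        implements VoidFunction<Iterator<Tuple2<String, Iterable<QueueEvent>>>> {
    @Override
    public void call(Iterator<Tuple2<String, Iterable<QueueEvent>>> partition)
            throws Exception {
        while (partition.hasNext()) {
            Tuple2<String, Iterable<QueueEvent>> customer = partition.next();
            String customerId = customer._1();
            // One alert per event, per customer; each partition is processed
            // once on a single executor, so no duplicate alerts go out.
            for (QueueEvent event : customer._2()) {
                AlertSender.send(customerId, event);
            }
        }
    }
}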


