mpty());
> }
> };
>
> The windowed operation is then:
>
> JavaPairDStream<String, String> cdr_kv =
> cdr_filtered.reduceByKeyAndWindow(appendString, removeString,
> Durations.seconds(WINDOW_DURATION), Durations.seconds(SLIDE_DURATION),
> PARTITIONS, filterEmptyRecords);
During execution, this function raises the following exception:
"Neither previous window has value for key, nor new values found. Are you sure
your key class hashes consistently?"
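One thing worth verifying before suspecting the key class: `reduceByKeyAndWindow` with an inverse function requires that the inverse exactly undoes the reduce for every value. This contract can be checked in isolation, without Spark. A minimal sketch (the `appendString`/`removeString` bodies below are assumptions, since the originals are not shown in full):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Sketch of an append/remove pair like the one passed to
// reduceByKeyAndWindow above. The method names come from the post;
// the bodies are assumed implementations for illustration.
public class InverseCheck {

    // Forward reduce: fold a new record into the accumulator.
    static String appendString(String acc, String v) {
        return acc.isEmpty() ? v : acc + "," + v;
    }

    // Inverse reduce: undo one appendString(acc, v).
    static String removeString(String acc, String v) {
        List<String> parts = new ArrayList<>(Arrays.asList(acc.split(",")));
        parts.remove(v); // drops the first occurrence only
        return String.join(",", parts);
    }

    public static void main(String[] args) {
        String acc = "a,b";
        // reduceByKeyAndWindow relies on the inverse law:
        //   removeString(appendString(acc, v), v) equals acc
        String roundTrip = removeString(appendString(acc, "c"), "c");
        System.out.println(roundTrip.equals(acc)); // prints true
    }
}
```

If the pair fails this round-trip law for some input, the window state for a key can drift away from what the forward reduce would produce, which might also surface as missing values for a key when the old window is subtracted.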
I've found this post from 2013:
https://groups.google.com/forum/#!msg/spark-users/9OM1YvWzwgE/PhFgdSTP2OQJ
which, however, doesn't seem to apply here, since I use plain Strings
to represent keys, which I'm pretty sure hash consistently.
Any clue why this happens and possible suggestions to fix it?
Thanks!
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Neither-previous-window-has-value-for-key-nor-new-values-found-tp27140.ht