Try adding a filter to remove or replace the null elements before (or within) the
map operation.
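A minimal sketch of that idea, assuming a hypothetical DStream of (key, value) pairs whose values may be null (the names here are illustrative, not from the original code):

```scala
import org.apache.spark.streaming.dstream.DStream

// Drop pairs with null values before mapping, so the map never
// dereferences a null.
def dropNulls(events: DStream[(String, String)]): DStream[(String, String)] =
  events
    .filter { case (_, value) => value != null } // remove nulls first
    .map { case (key, value) => (key, value.trim) } // now safe to use value
```

Alternatively, replace the nulls instead of dropping them, e.g. `events.map { case (k, v) => (k, Option(v).getOrElse("")) }`.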
Thanks
Best Regards
On Mon, Sep 7, 2015 at 3:34 PM, ZhengHanbin wrote:
> Hi,
>
> I am using spark streaming to join every RDD of a DStream to a stand alone
> RDD to generate a new
Probably, the problem here is that the recovered StreamingContext is trying
to refer to the pre-failure static RDD, which does not exist after the failure.
The solution: when the driver process restarts from the checkpoint, you need to
recreate the static RDD explicitly and make the transformations refer to that
recreated RDD.
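One way to sketch that, assuming a hypothetical `loadContentFeature` helper that (re)builds the static RDD from its source: rebuilding it inside `transform` ties it to the current SparkContext, so it exists again after the driver recovers from the checkpoint.

```scala
import org.apache.spark.SparkContext
import org.apache.spark.rdd.RDD
import org.apache.spark.streaming.dstream.DStream

// Sketch: loadContentFeature is an assumed helper that rebuilds the
// static RDD; it is called with the live SparkContext of each batch,
// so the RDD is recreated after checkpoint recovery instead of
// pointing at a pre-failure RDD.
def withRecreatedStatic(stream: DStream[(String, String)],
                        loadContentFeature: SparkContext => RDD[(String, String)]) =
  stream.transform { rdd =>
    val contentFeature = loadContentFeature(rdd.sparkContext)
    rdd.join(contentFeature)
  }
```

In practice you would cache the recreated RDD (e.g. behind a lazily initialized singleton) rather than rebuild it on every batch.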
Hi,
I am using Spark Streaming to join every RDD of a DStream to a standalone RDD
to generate a new DStream as follows:
def joinWithBatchEvent(contentFeature: RDD[(String, String)],
                       batchEvent: DStream[((String, String), (Long, Double, Double))]) = {
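The quoted body is cut off here. One plausible shape for such a join — a sketch, not the original code; the re-keying by the first String of the pair is an assumption — is:

```scala
import org.apache.spark.rdd.RDD
import org.apache.spark.streaming.dstream.DStream

def joinWithBatchEvent(contentFeature: RDD[(String, String)],
                       batchEvent: DStream[((String, String), (Long, Double, Double))]) =
  batchEvent.transform { rdd =>
    // Re-key the batch events by the first String so they share a key
    // type with contentFeature, then join per batch.
    rdd.map { case ((id, tag), stats) => (id, (tag, stats)) }
       .join(contentFeature) // yields (id, ((tag, stats), feature))
  }
```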