Hi,
I finally got it working.
I was using the updateStateByKey() function to maintain the previous
value of the state, and I found that the event list passed to the
update function was empty. Handling that empty event list with an
isEmpty check sorted out the problem.
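
For anyone hitting the same thing, here is a minimal sketch of the two
guards (the state update and the HDFS write). The word-count-style
state, the checkpoint directory and the input/output paths are just
placeholders, not my actual job:

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object EmptyRddGuard {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("EmptyRddGuard")
    val ssc  = new StreamingContext(conf, Seconds(10))
    ssc.checkpoint("hdfs:///tmp/checkpoint")   // required by updateStateByKey

    val lines = ssc.textFileStream("hdfs:///input/dir")

    // The Seq of new events handed to the update function can be empty
    // for a key in a given batch, so check isEmpty before using it.
    val counts = lines
      .flatMap(_.split("\\s+"))
      .map(word => (word, 1))
      .updateStateByKey[Int] { (events: Seq[Int], state: Option[Int]) =>
        if (events.isEmpty) state                 // no new events: keep old state
        else Some(state.getOrElse(0) + events.sum)
      }

    // Skip the write when the batch produced nothing, so no empty
    // part files end up in HDFS.
    counts.foreachRDD { (rdd, time) =>
      if (!rdd.isEmpty()) {
        rdd.saveAsTextFile(s"hdfs:///output/batch-${time.milliseconds}")
      }
    }

    ssc.start()
    ssc.awaitTermination()
  }
}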

On Sun, May 22, 2016 at 7:59 PM, Ted Yu <yuzhih...@gmail.com> wrote:
> You mean when rdd.isEmpty() returned false, saveAsTextFile still produced
> an empty file?
>
> Can you show code snippet that demonstrates this ?
>
> Cheers
>
> On Sun, May 22, 2016 at 5:17 AM, Yogesh Vyas <informy...@gmail.com> wrote:
>>
>> Hi,
>> I am reading files using textFileStream, performing some action on
>> them, and then saving the result to HDFS using saveAsTextFile.
>> But whenever there is no file to read, Spark writes an empty RDD
>> ([]) to HDFS.
>> So, how do I handle the empty RDD?
>>
>> I checked rdd.isEmpty() and rdd.count > 0, but neither of them works.
>>
>

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
