You mean that when rdd.isEmpty() returned false, saveAsTextFile still
produced an empty file?

Can you show a code snippet that demonstrates this?
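
For what it's worth, here is a minimal sketch (Scala, assuming a typical
textFileStream / foreachRDD setup; the input and output paths and the
batch-suffixed output directory are hypothetical) of where the isEmpty()
guard is usually placed so that batches with no input files don't produce
empty output:

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object SaveNonEmptyBatches {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("SaveNonEmptyBatches")
    val ssc = new StreamingContext(conf, Seconds(10))

    // Hypothetical input directory, for illustration only.
    val lines = ssc.textFileStream("hdfs:///tmp/streaming-input")

    lines.foreachRDD { (rdd, time) =>
      // Check on the driver before triggering the save, so batches
      // with no input do not create empty output directories.
      if (!rdd.isEmpty()) {
        rdd.saveAsTextFile(s"hdfs:///tmp/streaming-output/batch-${time.milliseconds}")
      }
    }

    ssc.start()
    ssc.awaitTermination()
  }
}

If isEmpty() really does return false and the output is still empty, then
the records themselves must be empty strings, which is why a snippet
reproducing the behavior would help.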

Cheers

On Sun, May 22, 2016 at 5:17 AM, Yogesh Vyas <informy...@gmail.com> wrote:

> Hi,
> I am reading files using textFileStream, performing some actions on
> them, and then saving the results to HDFS using saveAsTextFile.
> But whenever there is no file to read, Spark writes an empty RDD
> ( [] ) to HDFS.
> So, how do I handle the empty RDD?
>
> I checked rdd.isEmpty() and rdd.count > 0, but neither of them works.
>
