Re: Spark Streaming dealing with broken files without dying

2015-08-11 Thread Akhil Das
You can do something like this:

    val fStream = ssc.textFileStream("/sigmoid/data/")
      .map(x => {
        try {
          // Move all the transformations within a try..catch
        } catch {
          case e: Exception => {
            logError("Whoops!!")
            null
          }
        }
      })

Thanks
Best Regards
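For completeness, a self-contained sketch in the same spirit, assuming the per-record transformation is just parsing each line as an integer (a stand-in for the real logic) and that dropping unparseable lines is acceptable; it wraps the transformation in scala.util.Try and filters out the failures instead of returning null. Note that this only catches exceptions thrown by the transformation itself, not an IOException raised while the input reader opens a broken file:

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import scala.util.Try

    object SkipBrokenRecords {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("skip-broken-records")
        val ssc = new StreamingContext(conf, Seconds(10))

        val parsed = ssc.textFileStream("/sigmoid/data/")
          // Wrap the per-line transformation so a bad line becomes a
          // Failure instead of an exception that kills the batch.
          // line.trim.toInt is only a placeholder for the real parsing.
          .map(line => Try(line.trim.toInt))
          // Drop the records that failed to parse.
          .filter(_.isSuccess)
          .map(_.get)

        parsed.print()

        ssc.start()
        ssc.awaitTermination()
      }
    }

Using Try plus a filter keeps nulls out of the downstream stages, which is usually easier to reason about than mapping failures to null and ignoring them later.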

Spark Streaming dealing with broken files without dying

2015-08-10 Thread Mario Pastorelli
Hey Sparkers,

I would like to use Spark Streaming in production to observe a directory and process the files that are put inside it. The problem is that some of those files can be broken, leading to an IOException from the input reader. This should be fine for the framework, I think: the exception should