The laziness is hard to deal with in these situations. I would suggest
trying to handle expected cases like "FileNotFound", etc., using other
methods before even starting a Spark job. If you really want to try/catch a
specific portion of a Spark job, one way is to just follow it with an
action. You can ev
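
A minimal sketch of that "follow it with an action" approach (the input
path, the app name, and the choice of take(1) as the cheap action are
illustrative assumptions, not from this thread):

import org.apache.spark.{SparkConf, SparkContext}

val sparkContext = new SparkContext(
  new SparkConf().setAppName("eager-input-check").setMaster("local[*]"))

try {
  // textFile is lazy, so nothing is read at this point
  val lines = sparkContext.textFile("file:///home/myUser/input.txt")
  // a cheap action forces evaluation now, so a bad path fails inside this try block
  lines.take(1)
} catch {
  // for a missing path this is typically org.apache.hadoop.mapred.InvalidInputException
  case e: Exception => println(s"Could not read input: ${e.getMessage}")
}
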
Hi Burak, thanks for your answer.
I have a "new MyResultFunction()(sparkContext, inputPath).collect" in the
unit test (in order to evaluate the actual result), and there I can observe
and catch the exception. Even considering Spark's laziness, shouldn't I
catch the exception when it occurs in the try..
textFile is a lazy operation. It doesn't evaluate until you call an action
on it, such as .count(). Therefore, you won't catch the exception there.
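
For example (assuming an existing SparkContext sc and a made-up path):

// returns immediately, even if the path does not exist
val rdd = sc.textFile("file:///no/such/path")

try {
  rdd.count()   // evaluation happens here, so this is where the failure shows up
} catch {
  case e: Exception => println(s"caught at action time: ${e.getMessage}")
}
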
Best,
Burak
On Mon, Aug 24, 2015 at 9:09 AM, Roberto Coluccio <
roberto.coluc...@gmail.com> wrote:
> Hello folks,
>
> I'm experiencing an unexpected
Hello folks,
I'm experiencing an unexpected behaviour that makes me think I'm missing
some notions about how Spark works. Let's say I have a Spark driver that
invokes a function like:
- in myDriver -
val sparkContext = new SparkContext(mySparkConf)
val inputPath = "file://home/myUser