does throw a Spark exception, THEN, as far as I am concerned, checkmate.

From: Steve Lewis [mailto:lordjoe2...@gmail.com]
Sent: Sunday, April 19, 2015 8:16 PM
To: Evo Eftimov
Cc: Olivier Girardot; user@spark.apache.org
Subject: Re: Can a map function return null

So you imagine something like
null
Sent from Samsung Mobile
Original message
From: Olivier Girardot
Date: 2015/04/18 22:04 (GMT+00:00)
To: Steve Lewis, user@spark.apache.org
Subject: Re: Can a map function return null
You can return an RDD with null values inside, and afterwards filter on
item != null
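The "return null, then filter" approach can be sketched as follows. This is a minimal illustration on a plain Scala collection rather than an RDD (the `map`/`filter` calls are the same on both); `parseEven` is a hypothetical function standing in for whatever map logic might produce null.

```scala
// Sketch of "return null, then filter it out afterwards".
// parseEven is a hypothetical stand-in for a map function that
// cannot produce a value for every input, so it returns null.
object NullFilterExample {
  def parseEven(n: Int): Integer =
    if (n % 2 == 0) Int.box(n) else null   // null marks "no result"

  def main(args: Array[String]): Unit = {
    // On an RDD this would be rdd.map(parseEven).filter(_ != null)
    val results = (1 to 10).map(parseEven).filter(_ != null)
    println(results.mkString(","))          // 2,4,6,8,10
  }
}
```

The drawback of this style is that nulls silently travel through the pipeline until the filter runs, which is why the Option-based alternative below is usually preferred.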
In scala (or even in Java 8) you'd rather use Option/Optional, and in Scala
they're directly usable from Spark.
Example:

sc.parallelize(1 to 1000).flatMap(item => if (item % 2 == 0) Some(item) else None)
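The Option/flatMap pattern from the message can be run as-is on a plain Scala collection, since `RDD.flatMap` behaves the same way: each element maps to `Some(value)` to keep it or `None` to drop it, so no nulls ever enter the dataset.

```scala
// The flatMap + Option pattern: Some(x) keeps an element, None drops it.
// Shown on a Range for a self-contained run; with Spark the only change
// is wrapping the range in sc.parallelize(...).
object OptionFlatMapExample {
  def main(args: Array[String]): Unit = {
    val evens = (1 to 10).flatMap(item => if (item % 2 == 0) Some(item) else None)
    println(evens.mkString(","))   // 2,4,6,8,10
  }
}
```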