You forgot a return statement in the 'else' clause, which is what the
compiler is telling you. There's nothing more to it than that. Your
function can also be written much more simply (negated, so that it
still returns false for the header lines):

Function<String, Boolean> checkHeaders2 =
    x -> !(x.startsWith("npi") || x.startsWith("CPT"));
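
For reference, here is a minimal sketch of the multi-statement form,
with a return on every code path. It assumes spark-core is on the
classpath; the class name, the sample strings and the commented-out
lines RDD are only there for illustration:

import org.apache.spark.api.java.function.Function;

public class CheckHeadersExample {
    public static void main(String[] args) throws Exception {
        // Block-bodied lambda: every branch must return a Boolean.
        Function<String, Boolean> checkHeaders = x -> {
            if (x.startsWith("npi") || x.startsWith("CPT")) {
                return Boolean.FALSE;   // header line
            } else {
                return Boolean.TRUE;    // data line
            }
        };

        System.out.println(checkHeaders.call("npi,provider"));  // false
        System.out.println(checkHeaders.call("12345,99213"));   // true

        // Given an existing JavaRDD<String> called lines, you could keep
        // only the data lines with:
        // JavaRDD<String> data = lines.filter(checkHeaders);
    }
}

The same Function can be passed straight to JavaRDD.filter, since
filter takes a Function<T, Boolean> in the Java API.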

On Thu, Dec 24, 2015 at 1:13 AM, rdpratti <deprat...@easternct.edu> wrote:
> I am trying to pass lambda expressions to Spark JavaRDD methods.
>
> Having used lambda expressions in Java in general, I was hoping for
> similar behaviour and coding patterns, but am finding confusing compile
> errors.
>
> The use case is a lambda expression that has a number of statements,
> returning a boolean from various points in the logic.
>
> I have tried both inline, as well as defining a Function functional type
> with no luck.
>
> Here is an example:
>
> Function<String, Boolean> checkHeaders2 = x -> {
>     if (x.startsWith("npi") || x.startsWith("CPT"))
>         return new Boolean(false);
>     else
>         new Boolean(true);
> };
>
> This code gets a compile error stating that the method must return a Boolean.
>
> I know that the lambda expression can be shortened to a single return
> statement, but in plain (non-Spark) Java 8, with a Predicate functional
> type, this would compile and be usable.
>
> What am I missing, and how do I use the Spark Function type to define
> lambda expressions made up of multiple Java statements?
>
> Thanks
>
> rd
