Hi,

val test = persons.value.map { tuple =>
  (tuple._1, tuple._2.filter { event =>
    *****inactiveIDs.filter(event2 => event2._1 == tuple._1).count() != 0*****
  })
}

Your problem is right between the asterisks. You can't perform an RDD
operation inside another RDD operation, because RDDs can't be serialized
and shipped to the executors; the nested RDD reference ends up null on the
workers, which is why you are receiving the NullPointerException. Try
joining the RDDs on the shared key instead, and then filter based on the
join result.
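A minimal sketch of the join-based approach, assuming hypothetical types for the two RDDs (`persons: RDD[(Long, Seq[String])]` keyed by person ID, and `inactiveIDs: RDD[(Long, String)]` also keyed by person ID — your actual element types may differ):

```scala
import org.apache.spark.rdd.RDD

// Reduce inactiveIDs to a keyed "marker" RDD with one entry per ID.
val inactiveKeys: RDD[(Long, Unit)] =
  inactiveIDs.mapValues(_ => ()).distinct()

// leftOuterJoin keeps every person; the Option tells us whether the
// person's ID appeared in inactiveIDs. No RDD is touched inside a closure,
// so nothing non-serializable is shipped to the executors.
val test: RDD[(Long, Seq[String])] =
  persons.leftOuterJoin(inactiveKeys).mapValues {
    case (events, Some(_)) => events     // ID present in inactiveIDs: keep events
    case (events, None)    => Seq.empty  // ID absent: drop events
  }
```

If `inactiveIDs` is small, an alternative is to `collect()` its keys into a `Set`, broadcast it, and check membership inside the `filter` — that also avoids the nested-RDD problem.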

Best,
Burak

----- Original Message -----
From: "Blackeye" <black...@iit.demokritos.gr>
To: u...@spark.incubator.apache.org
Sent: Tuesday, September 9, 2014 3:34:58 AM
Subject: Re: Filter function problem

In order to help anyone answer, I can say that I checked the
inactiveIDs.filter operation separately, and I found that it doesn't return
null in any case. In addition, I don't know how to handle (or check) whether
an RDD is null. I find the debugging too complicated to pinpoint the error.
Any ideas how to find the null pointer?



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Filter-function-problem-tp13787p13789.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org


