Here's a simplified example:

        SparkConf conf = new SparkConf().setAppName(
                "Sigmoid").setMaster("local");
        JavaSparkContext sc = new JavaSparkContext(conf);

        List<String> user = new ArrayList<String>();

        user.add("Jack");
        user.add("Jill");
        user.add("Jack");
        user.add("Bob");

        JavaRDD<String> userRDD = sc.parallelize(user);

        //Now let's filter all the Jacks!
        JavaRDD<String> jackRDD = userRDD
                .filter(new Function<String, Boolean>() {

                    public Boolean call(String v1) throws Exception {
                        return v1.equals("Jack");
                    }

                });


        //Let's print all the Jacks!
        for (String s : jackRDD.collect()) {
            System.out.println(s);
        }
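
The same pattern extends to a custom object, which is what you are after. Below is a minimal, self-contained sketch assuming a simple User class with a getName() getter (both the class and the getter are placeholders of mine, not from your code). The main points are to use the Java API's Function instead of Scala's AbstractFunction1, and to make User Serializable so Spark can ship the objects to the executors:

    import java.io.Serializable;
    import java.util.Arrays;
    import java.util.List;

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;
    import org.apache.spark.api.java.function.Function;

    public class UserFilterExample {

        // Placeholder POJO; it must be Serializable because Spark
        // sends the objects to the executors.
        public static class User implements Serializable {
            private final String name;

            public User(String name) { this.name = name; }

            public String getName() { return name; }
        }

        public static void main(String[] args) {
            SparkConf conf = new SparkConf().setAppName("UserFilter")
                    .setMaster("local");
            JavaSparkContext sc = new JavaSparkContext(conf);

            List<User> users = Arrays.asList(new User("Jack"),
                    new User("Jill"), new User("Jack"), new User("Bob"));

            JavaRDD<User> userRDD = sc.parallelize(users);

            // filter() on the Java API takes a Function<User, Boolean>,
            // not a scala.Function1, so no AbstractFunction1 is needed.
            JavaRDD<User> jackRDD = userRDD
                    .filter(new Function<User, Boolean>() {
                        public Boolean call(User u) throws Exception {
                            return "Jack".equals(u.getName());
                        }
                    });

            // Print the filtered users back on the driver.
            for (User u : jackRDD.collect()) {
                System.out.println(u.getName());
            }

            sc.stop();
        }
    }

If you are on Java 8, the same filter collapses to a lambda, since Function has a single call() method: userRDD.filter(u -> "Jack".equals(u.getName())).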




Thanks
Best Regards

On Tue, Jul 7, 2015 at 5:39 PM, Hafsa Asif <hafsa.a...@matchinguu.com>
wrote:

> I have also tried this stupid code snippet, just hoping that it might at
> least compile:
> Function1<User, Object> FILTER_USER = new AbstractFunction1<User, Object>() {
>     public Object apply(User user) {
>         return user;
>     }
> };
>
>
> FILTER_USER compiles fine, but applying it in either of the following two
> ways gives no results:
>                 User[] filterUsr = (User[]) rdd.rdd().retag(User.class).filter(FILTER_USER);
>
>                 User userFilter = (User) rdd.rdd().filter(FILTER_USER);
>
> Giving issue: Inconvertible types
> I really need proper code related to this issue.
>
>
>
>
