Hi Tenglong,

In addition to trsell's reply: you can add methods to an RDD without
modifying Spark's source code at all.

This can be achieved with an implicit class in your own client code, e.g.:

implicit class ExtendRDD[T](rdd: RDD[T]) {

  def foo(): Unit = { /* your logic here */ }

}

Then you just need to import this implicit class into the scope where you
want to call the new foo method.
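To make this concrete, here is a minimal, self-contained sketch of the implicit-class pattern. It demonstrates the mechanism on a plain Seq[T] so it compiles and runs without a Spark dependency; the same definition works against org.apache.spark.rdd.RDD[T] once spark-core is on the classpath. The object name Extensions and the method countWhere are purely illustrative, not part of any Spark or Scala API.

```scala
// Extension methods via an implicit class: the compiler rewrites
// xs.countWhere(p) into new ExtendedSeq(xs).countWhere(p) whenever
// the implicit class is in scope.
object Extensions {
  implicit class ExtendedSeq[T](val xs: Seq[T]) extends AnyVal {
    // Hypothetical new method: count elements matching a predicate.
    def countWhere(p: T => Boolean): Int = xs.count(p)
  }
}

object Demo extends App {
  import Extensions._  // brings countWhere into scope

  println(Seq(1, 2, 3, 4).countWhere(_ % 2 == 0))  // prints 2
}
```

Without the import, the call fails to compile with exactly the kind of "value countWhere is not a member of Seq[Int]" error reported below, which is why the import step matters.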

Thanks
Tarun Kumar

On Mon, 5 Dec 2016 at 6:59 AM, <trs...@gmail.com> wrote:

> How does your application fetch the spark dependency? Perhaps list your
> project dependencies and check it's using your dev build.
>
> On Mon, 5 Dec 2016, 08:47 tenglong, <longteng...@gmail.com> wrote:
>
> Hi,
>
> Apparently, I've already tried adding a new method to RDD,
>
> for example,
>
> class RDD {
>   def foo() // this is the one I added
>
>   def map()
>
>   def collect()
> }
>
> I can build Spark successfully, but I can't compile my application code
> which calls rdd.foo(), and the error message says
>
> value foo is not a member of org.apache.spark.rdd.RDD[String]
>
> So I am wondering whether there is some mechanism preventing me from doing
> this, or whether I'm doing something wrong?
>
>
>
>
> --
> View this message in context:
> http://apache-spark-developers-list.1001551.n3.nabble.com/Can-I-add-a-new-method-to-RDD-class-tp20100.html
> Sent from the Apache Spark Developers List mailing list archive at
> Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>
>
