How does your application fetch the Spark dependency? Perhaps list your
project's dependencies and check that it's using your dev build.
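
For example, if your project is built with sbt, something along these lines
should make it resolve your locally installed Spark build (the versions below
are placeholders; use whatever your dev build actually publishes, e.g. after
./build/mvn -DskipTests install, which puts the artifacts in ~/.m2):

// build.sbt -- minimal sketch, assuming the dev build was installed into the
// local Maven repository
scalaVersion := "2.11.8"

// resolve locally installed artifacts before the public repositories
resolvers += Resolver.mavenLocal

// placeholder version string; match the one your build produces
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.0-SNAPSHOT"

You can then check which jar actually ends up on the compile classpath with
something like

sbt "show compile:dependencyClasspath"

(or mvn dependency:tree for a Maven project). If that shows a released
spark-core instead of your snapshot, the compiler never sees the method you
added, which would explain the "value foo is not a member" error.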

On Mon, 5 Dec 2016, 08:47 tenglong, <longteng...@gmail.com> wrote:

> Hi,
>
> To be clear, I've already tried adding a new method to the RDD class,
>
> for example,
>
> class RDD[T] {                       // simplified sketch of the class
>   def foo(): Unit = ???              // this is the one I added
>
>   def map[U](f: T => U): RDD[U] = ???
>
>   def collect(): Array[T] = ???
> }
>
> I can build Spark successfully, but I can't compile my application code
> that calls rdd.foo(); the error message says:
>
> value foo is not a member of org.apache.spark.rdd.RDD[String]
>
> So I am wondering: is there some mechanism that prevents me from doing
> this, or am I doing something wrong?
>
