Yes, I think changing the <version> property (line 29) in Spark's root
pom.xml should be sufficient. However, keep in mind that you'll also
need to publish Spark locally before you can access it in your test
application.
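
As a sketch, the steps above might look like this (the custom version
string "2.1.0-custom" is just an illustration; pick your own):

```shell
# Set a custom version across all Spark modules. The Maven versions
# plugin rewrites the <version> in every module pom.xml in one step,
# so you don't have to edit each file by hand.
./build/mvn versions:set -DnewVersion=2.1.0-custom -DgenerateBackupPoms=false

# Publish the custom build to the local Maven repository (~/.m2),
# skipping tests to speed up the build.
./build/mvn -DskipTests install
```

Your test application can then depend on spark-core with version
"2.1.0-custom", and Maven/sbt will resolve it from the local repository
instead of the remote one.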

On Tue, Dec 6, 2016 at 2:50 AM, Teng Long <longteng...@gmail.com> wrote:
> Thank you Jakob for clearing things up for me.
>
> Before, I thought my application was compiled against my local build, since I
> can see all the logs I just added in spark-core. But it was all along using
> Spark downloaded from the remote Maven repository, and that's why I "cannot"
> add new RDD methods in.
>
> How can I specify a custom version? Modify the version numbers in all the
> pom.xml files?
>
>
>
> On Dec 5, 2016, at 9:12 PM, Jakob Odersky <ja...@odersky.com> wrote:
>
> m rdds in an "org.apache.spark" package as well
>
>

---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
