It looks like you're having issues with including your custom Spark version (with the extensions) in your test project. To use your local Spark version:

1) make sure it has a custom version (let's call it 2.1.0-CUSTOM)
2) publish it to your local machine with `sbt publishLocal`
3) include the modified version of Spark in your test project with `libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.0-CUSTOM"`
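Concretely, the sbt side of those steps might look like the sketch below (2.1.0-CUSTOM is just the example version from above, and this is a simplified illustration of the idea, not Spark's actual build definition):

```scala
// In the forked Spark build: give the artifacts a distinguishable version
// before running `sbt publishLocal`, so you never shadow a real release.
version := "2.1.0-CUSTOM"

// In the test project's build.sbt: depend on the locally published artifact.
// The %% operator appends the Scala binary version (e.g. spark-core_2.11).
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.0-CUSTOM"
```

`publishLocal` puts the jars in your local ivy repository (`~/.ivy2/local`), which sbt resolves from by default, so no extra resolver is needed.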
However, as others have said, it can be quite a lot of work to maintain a custom fork of Spark. If you're planning on contributing these changes back to Spark, then forking is the way to go (although I would recommend keeping an ongoing discussion with the maintainers, to make sure your work will be merged back). Otherwise, I would recommend using "implicit extensions" to enrich your RDDs instead. An easy workaround to access Spark-private fields is to simply define your custom RDDs in an "org.apache.spark" package as well ;)
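A minimal sketch of the "implicit extension" pattern, shown here on `Seq[T]` so it is self-contained; with spark-core on the classpath you would enrich `org.apache.spark.rdd.RDD[T]` the same way. The object name `RddExtensions` and the operation `countDistinct` are hypothetical stand-ins for your own extensions:

```scala
object RddExtensions {
  // Implicit value class: adds methods to Seq[T] with no runtime wrapper
  // allocation. For Spark, change `Seq[T]` to `RDD[T]` and implement the
  // operation in terms of RDD transformations.
  implicit class RichCollection[T](val self: Seq[T]) extends AnyVal {
    // Hypothetical custom operation; replace with your own logic.
    def countDistinct: Long = self.distinct.size.toLong
  }
}
```

After `import RddExtensions._`, a call like `mySeq.countDistinct` compiles as if the method were defined on the collection itself, so callers get your extensions without you having to maintain a fork.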