Oh, yes. Thanks much.
> On Oct 14, 2015, at 18:47, Akhil Das wrote:
>
> com.holdenkarau.spark.testing
Did a quick search and found the following; I haven't tested it myself.
Add the following to your build.sbt:
libraryDependencies += "com.holdenkarau" % "spark-testing-base_2.10" % "1.5.0_1.4.0_1.4.1_0.1.2"
Create a test class extending com.holdenkarau.spark.testing.SharedSparkContext, and you'll have a SparkContext (sc) available in your tests.
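To make the steps above concrete, here is a minimal sketch of such a test, assuming ScalaTest is also on the test classpath (the class name WordCountSpec and the RDD contents are illustrative, not from the thread):

```scala
import com.holdenkarau.spark.testing.SharedSparkContext
import org.scalatest.FunSuite

// SharedSparkContext mixes in a SparkContext named `sc`
// that is shared across the tests in this suite.
class WordCountSpec extends FunSuite with SharedSparkContext {
  test("count elements of a small RDD") {
    val rdd = sc.parallelize(Seq("a", "b", "a"))
    assert(rdd.count() === 3)
  }
}
```

Reusing one SparkContext across tests via the trait avoids the cost (and port conflicts) of starting a fresh context per test.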
Hi,
How do I add the dependency in build.sbt if I want to use SharedSparkContext?
I've added spark-core, but it doesn't work (cannot find SharedSparkContext).