> see discussions about Spark not really liking multiple contexts in the
> same JVM
Speaking of this - is there a standard way to write unit tests that
require a SparkContext?
We've ended up copying the SharedSparkContext code into our own testing
hierarchy, but it occurs to me that someone would have published a test
jar by now if that were the best way.
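For reference, the pattern we copied boils down to something like the
sketch below. The trait name (`LocalSparkContext`) and the `local[2]`
master are just illustrative, and it assumes ScalaTest's
`BeforeAndAfterAll` - one context per suite, stopped after all tests so
a second context is never created in the same JVM:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.scalatest.{BeforeAndAfterAll, Suite}

// Sketch of a shared-context trait: create one local SparkContext
// before the suite runs and stop it afterwards, so each suite only
// ever holds a single context in the JVM.
trait LocalSparkContext extends BeforeAndAfterAll { self: Suite =>

  @transient private var _sc: SparkContext = _
  def sc: SparkContext = _sc

  override def beforeAll(): Unit = {
    super.beforeAll()
    _sc = new SparkContext(
      new SparkConf().setMaster("local[2]").setAppName(suiteName))
  }

  override def afterAll(): Unit = {
    try {
      if (_sc != null) _sc.stop()
      _sc = null
    } finally {
      super.afterAll()
    }
  }
}
```

A suite then just mixes it in and uses `sc` directly, e.g.
`class MyRDDSuite extends FunSuite with LocalSparkContext`.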
-Nathan
