You can write unit tests with a local Spark context by mixing in the
LocalSparkContext trait.
See
https://github.com/apache/spark/blob/master/mllib/src/test/scala/org/apache/spark/mllib/classification/LogisticRegressionSuite.scala
https://github.com/apache/spark/blob/master/mllib/src/test/scala/org/apache/spark/mllib/util/LocalSparkContext.scala
as examples.
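A minimal sketch of such a test, assuming ScalaTest and the
LocalSparkContext trait from Spark's MLlib test sources linked above;
the suite name and the computation under test are illustrative, not from
the original thread:

```scala
import org.apache.spark.mllib.util.LocalSparkContext
import org.scalatest.FunSuite

// LocalSparkContext provides a local SparkContext as `sc`, started
// before the suite and stopped after it, so each suite is isolated.
class WordCountSuite extends FunSuite with LocalSparkContext {
  test("word count on a small in-memory dataset") {
    val counts = sc.parallelize(Seq("a", "b", "a"))
      .map(word => (word, 1))
      .reduceByKey(_ + _)
      .collectAsMap()
    assert(counts("a") === 2)
    assert(counts("b") === 1)
  }
}
```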
Sincerely,
DB Tsai
---
My Blog: https://www.dbtsai.com
LinkedIn: https://www.linkedin.com/in/dbtsai
On Sun, Nov 9, 2014 at 9:12 PM, Kevin Burton bur...@spinn3r.com wrote:
What’s the best way to embed Spark to run local mode in unit tests?
Some of our jobs are mildly complex and I want to keep verifying that they
work, including during schema changes / migration.
I think for some of this I would just run local mode, read from a few text
files via resources, and then write to /tmp …
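The approach described above could be sketched as follows, assuming a
test resource file at `/data/input.txt` on the classpath; the paths,
object name, and transformation are illustrative:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object LocalModeExample {
  def main(args: Array[String]): Unit = {
    // Run Spark in local mode with two worker threads.
    val conf = new SparkConf().setMaster("local[2]").setAppName("local-test")
    val sc = new SparkContext(conf)
    try {
      // Resolve the bundled test resource to a path Spark can read.
      val input = getClass.getResource("/data/input.txt").getPath
      val upper = sc.textFile(input).map(_.toUpperCase)
      // Write results under /tmp; the directory must not already exist.
      upper.saveAsTextFile("/tmp/spark-test-output")
    } finally {
      sc.stop()
    }
  }
}
```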
--
Founder/CEO Spinn3r.com
Location: San Francisco, CA
blog: http://burtonator.wordpress.com
… or check out my Google+ profile
-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org