Add "-Dspark.master=local[*]" to the VM properties of your test run.

On Mon, Sep 26, 2016 at 2:25 PM, Mohit Jaggi <mohitja...@gmail.com> wrote:

> I want to use the following API, SparkILoop.run(...). I am writing a test
> case that passes some Scala code to the Spark interpreter and receives the
> result as a string.
>
> I couldn't figure out how to pass the right settings into the run()
> method. I get an error about "master" not being set.
>
> object SparkILoop {
>
>   /**
>    * Creates an interpreter loop with default settings and feeds
>    * the given code to it as input.
>    */
>   def run(code: String, sets: Settings = new Settings): String = {
>     import java.io.{ BufferedReader, StringReader, OutputStreamWriter }
>
>     stringFromStream { ostream =>
>       Console.withOut(ostream) {
>         val input = new BufferedReader(new StringReader(code))
>         val output = new JPrintWriter(new OutputStreamWriter(ostream), true)
>         val repl = new SparkILoop(input, output)
>
>         if (sets.classpath.isDefault) {
>           sets.classpath.value = sys.props("java.class.path")
>         }
>         repl process sets
>       }
>     }
>   }
>   def run(lines: List[String]): String = run(lines.map(_ + "\n").mkString)
> }
>
>
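
For reference, once the master is set, a call along these lines should
exercise that API. This is only a sketch; it assumes the loop binds sc the
way spark-shell does:

    sys.props("spark.master") = "local[*]"
    val out: String = SparkILoop.run(List(
      "val rdd = sc.parallelize(1 to 100)",
      "println(rdd.count)"))
    // The REPL's console output is captured into the returned string,
    // so the printed count should appear in it.
    assert(out.contains("100"))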
