I think you have two options:
- To run your code locally, you can use local mode via the 'local'
master, like so: new SparkConf().setMaster("local[4]"), where 4 is the
number of cores assigned to local mode.
- To run your code remotely, you need to build the jar with dependencies
and ship it to the cluster, e.g. via SparkConf.setJars or spark-submit
(see the sketch below).
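A minimal sketch of both options; the app name, the jar path, and the
master port are placeholders, and the jar is assumed to be a fat jar
built with something like the Maven assembly plugin:

import org.apache.spark.{SparkConf, SparkContext}

object RunModes {
  def main(args: Array[String]): Unit = {
    // Option 1: local mode with 4 worker threads, handy for debugging.
    val localConf = new SparkConf()
      .setAppName("MyApp")                 // hypothetical app name
      .setMaster("local[4]")

    // Option 2: remote master; ship the assembled jar to the workers.
    // The master URL and jar path below are placeholders.
    val clusterConf = new SparkConf()
      .setAppName("MyApp")
      .setMaster("spark://xxx:7077")
      .setJars(Seq("target/myapp-jar-with-dependencies.jar"))

    val sc = new SparkContext(localConf)   // swap in clusterConf to run remotely
    println(sc.parallelize(1 to 10).sum()) // trivial job to verify the setup
    sc.stop()
  }
}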
For debugging, I run locally inside Eclipse without Maven.
I just add the Spark assembly jar to my Eclipse project build path and click
'Run As... Scala Application'.
I have done the same with Java and ScalaTest; it's quick and easy.
I didn't see any third-party jar dependencies in your code, so the
assembly jar on the build path should be all you need.
Hi,
I am trying to write and debug Spark applications with scala-ide and
Maven, and in my code I target a Spark instance at spark://xxx
import org.apache.spark.SparkConf

object App {
  def main(args: Array[String]) {
    println("Hello World!")
    val sparkConf = new SparkConf().setMaster("spark://xxx")
  }
}