Hi Everyone,
I am developing a Scala app in which the main object does not create the
SparkContext; instead, another object defined in the same package creates
it, runs the Spark operations, and stops it. The jar builds successfully
with Maven, but when I call spark-submit with this jar, Spark does not
seem to execute any code.
My code looks like this:
[Main.scala]
object Main {
  def main(args: Array[String]): Unit = {
    /* check parameters */
    Component1.start(parameters)
  }
}
[Component1.scala]
import org.apache.spark.SparkContext

object Component1 {
  def start(parameters: Array[String]): Unit = {
    val sc = new SparkContext(conf)
    /* do Spark operations */
    sc.stop()  // SparkContext has stop(), not close()
  }
}
The above code compiles into Main.jar, but spark-submit does not execute
anything and does not show me any error (nothing in the logs either):
spark-submit --master spark://.... Main.jar
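For reference, the general form of the invocation looks like the sketch below. Note that spark-submit usually needs a --class flag naming the entry-point object unless the jar's manifest declares a main class; the class name and master URL here are placeholders, not values from my setup:

```
# General spark-submit form (placeholders: Main, spark://host:7077).
# Without --class, spark-submit may silently find no entry point
# if the jar manifest does not declare one.
spark-submit \
  --class Main \
  --master spark://host:7077 \
  Main.jar arg1 arg2
```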
I had all of this code working before, when it was in a single Scala file,
but now that I have split it into multiple Scala source files, something
isn't running right.
Any advice on this would be greatly appreciated!
Regards,
Aung