Hello everyone,

I'm trying to develop a web service that launches Spark jobs. The web service 
runs on Tomcat, and I'm working with Spark 2.1.0.
The SparkLauncher provides two methods to launch a job: 
SparkLauncher.launch() and 
SparkLauncher.startApplication(SparkAppHandle.Listener... listeners). The 
latter is preferred in the documentation since it provides much more control 
over the application.
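For reference, here is a minimal sketch of the second approach as I'm calling it (the jar path, main class, and master URL below are placeholders, not my real values):

```java
import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

public class LaunchExample {
    public static void main(String[] args) throws Exception {
        // startApplication() returns a handle that reports state transitions
        // through the listener, which is why the docs recommend it.
        SparkAppHandle handle = new SparkLauncher()
                .setAppResource("/path/to/my-job.jar")      // placeholder
                .setMainClass("com.example.MyJob")          // placeholder
                .setMaster("spark://master:7077")           // placeholder
                .startApplication(new SparkAppHandle.Listener() {
                    @Override
                    public void stateChanged(SparkAppHandle h) {
                        System.out.println("State: " + h.getState());
                    }

                    @Override
                    public void infoChanged(SparkAppHandle h) {
                        // app id etc. become available here
                    }
                });
        // The handle would let the web service poll getState() or kill the job.
    }
}
```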

In my case, the first one works fine. However, the second one fails when 
creating the SparkContext.
From my investigation, it might be related to the handle. startApplication 
creates a launcher server, running as the tomcat user, on a specific port. The 
SparkContext then tries to open a Socket to the just-created launcher server, 
but the connection is refused and the job ends in the FAILED state. Since the 
driver runs as a different user than tomcat, it seems it cannot connect to 
the socket on the given port (I even tried starting the cluster as the root 
user, with no success).

I did some checks before calling "new JavaSparkContext();" to create the 
context. The Socket is created in org.apache.spark.launcher.LauncherBackend:
val port = sys.env.get(LauncherProtocol.ENV_LAUNCHER_PORT).map(_.toInt)
val s = new Socket(InetAddress.getLoopbackAddress(), port.get)
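As a sanity check, this is the kind of plain-Java probe I used to see whether the launcher port is reachable at all. (The environment variable name "_SPARK_LAUNCHER_PORT" is what LauncherProtocol.ENV_LAUNCHER_PORT resolves to, as far as I can tell from the source.)

```java
import java.net.InetAddress;
import java.net.Socket;

public class LauncherPortCheck {
    // Tries to open a client Socket to the loopback address on the given
    // port, mirroring what LauncherBackend does; returns false on refusal.
    public static boolean canConnect(int port) {
        try (Socket s = new Socket(InetAddress.getLoopbackAddress(), port)) {
            return s.isConnected();
        } catch (Exception e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // The launcher server's port is handed to the driver via this env var.
        int port = Integer.parseInt(
                System.getenv().getOrDefault("_SPARK_LAUNCHER_PORT", "0"));
        System.out.println(port > 0 && canConnect(port)
                ? "launcher port reachable" : "launcher port not reachable");
    }
}
```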

I am able to create a local ServerSocket on a different port and connect a 
Socket to it.
I cannot create a local ServerSocket on the same loopback address and port as 
Spark: the error says a LauncherServer is already listening on that 
address / port.
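That bind experiment can be reproduced with a few lines of plain Java:

```java
import java.net.BindException;
import java.net.InetAddress;
import java.net.ServerSocket;

public class BindCheck {
    // Returns true if we can bind a ServerSocket on the loopback
    // address/port, false if something (e.g. the LauncherServer) already
    // holds it, which surfaces as a BindException.
    public static boolean canBind(int port) {
        try (ServerSocket ss =
                     new ServerSocket(port, 1, InetAddress.getLoopbackAddress())) {
            return true;
        } catch (BindException e) {
            return false;
        } catch (Exception e) {
            return false;
        }
    }
}
```

In my situation canBind() on the launcher's port returns false, confirming the port is taken, while the driver's client connection to the same port is still refused.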

I'm not a socket expert at all, but this situation leads me to believe that 
Spark cannot handle this use case yet, and that I should stay with the less 
convenient method SparkLauncher.launch().

Did I do something wrong? Or has anyone faced a similar issue and found a 
workaround?
Also, I'm quite new to the Spark community. Should I file an issue on JIRA or 
somewhere else?

Yohann
