Thanks for the clarification, Marcelo.
On Tue, Nov 15, 2016 at 6:20 PM Marcelo Vanzin wrote:
> On Tue, Nov 15, 2016 at 5:57 PM, Elkhan Dadashov wrote:
> > This is confusing in the sense that the client needs to stay alive for
> > the Spark job to finish successfully.
> >
> > Actually the client can die or finish (in Yarn-cluster mode), and the
> > Spark job will successfully finish.

On Tue, Nov 15, 2016 at 5:57 PM, Elkhan Dadashov wrote:
> This is confusing in the sense that the client needs to stay alive for
> the Spark job to finish successfully.
>
> Actually the client can die or finish (in Yarn-cluster mode), and the
> Spark job will successfully finish.
That's an internal class; its comment describes the connection between the handle and the application, not a requirement on the application itself. In cluster mode the job keeps running if the client goes away, but the handle can no longer monitor or control it.
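For concreteness, a minimal sketch of that fire-and-forget pattern in yarn-cluster mode (the jar path and main class are placeholders; assumes Spark 1.6+, where startApplication() and SparkAppHandle.disconnect() are available):

import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

public class FireAndForget {
  public static void main(String[] args) throws Exception {
    SparkAppHandle handle = new SparkLauncher()
        .setAppResource("/path/to/my-spark-job.jar")  // placeholder jar
        .setMainClass("com.example.MySparkJob")       // placeholder class
        .setMaster("yarn")
        .setDeployMode("cluster")
        .startApplication();

    // Wait only until the cluster has accepted the application,
    // i.e. an application ID has been assigned.
    while (handle.getAppId() == null && !handle.getState().isFinal()) {
      Thread.sleep(1000);
    }
    System.out.println("Submitted as " + handle.getAppId());

    // Detach without killing the application. In cluster mode the driver
    // runs inside YARN, so the job keeps running after this JVM exits;
    // the handle just loses the ability to track it.
    handle.disconnect();
  }
}
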
Hi Marcelo,
This part of the JavaDoc is confusing:
https://github.com/apache/spark/blob/master/launcher/src/main/java/org/apache/spark/launcher/LauncherServer.java
"
* In *cluster mode*, this means that the client that launches the
* application *must remain alive for the duration of the applica
I figured out JOB id returned from sparkAppHandle.getAppId(), is unique
ApplicationId which looks like these:
for Local mode Spark env: local-1477184581895
for Distributed Spark mode: application_1477504900821_0005
ApplicationId represents the globally unique identifier for an application. The globally unique nature of the identifier is achieved by using the cluster timestamp, i.e. the start time of the ResourceManager, along with a monotonically increasing counter for the application.
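To make the two formats concrete, a small illustrative helper (hypothetical, not part of the Spark API) that classifies whatever getAppId() returns:

import org.apache.spark.launcher.SparkAppHandle;

final class AppIdInspector {  // illustrative helper only
  static void describe(SparkAppHandle handle) {
    String appId = handle.getAppId();  // null until the backend reports it
    if (appId == null) {
      System.out.println("No application ID reported yet");
    } else if (appId.startsWith("application_")) {
      // YARN format: application_<clusterTimestamp>_<sequenceNumber>,
      // e.g. application_1477504900821_0005. Usable with YARN tooling
      // such as "yarn logs -applicationId <id>" once the app finishes.
      System.out.println("YARN application: " + appId);
    } else if (appId.startsWith("local-")) {
      // Local mode format: local-<timestamp>, e.g. local-1477184581895
      System.out.println("Local run: " + appId);
    }
  }
}
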
I found the answer regarding logging in the JavaDoc of SparkLauncher:
"Currently, all applications are launched as child processes. The child's
stdout and stderr are merged and written to a logger (see
java.util.logging)."
One last question. sparkAppHandle.getAppId() - does this function return org.apache.hadoop.yarn.api.records.ApplicationId, or just its String representation?
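For reference, a sketch of capturing that merged output. The assumption here (worth verifying against the SparkLauncher javadoc of your Spark version) is that the value of SparkLauncher.CHILD_PROCESS_LOGGER_NAME ("spark.launcher.childProcLoggerName") is used as the java.util.logging logger name, so a handler can be attached to it; jar path, class name, and logger name are placeholders:

import java.util.logging.ConsoleHandler;
import java.util.logging.Level;
import java.util.logging.Logger;
import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

public class ChildOutputCapture {
  public static void main(String[] args) throws Exception {
    // Keep a strong reference to the logger so it is not garbage-collected.
    Logger childLog = Logger.getLogger("my.spark.child");  // placeholder name
    ConsoleHandler handler = new ConsoleHandler();
    handler.setLevel(Level.ALL);
    childLog.addHandler(handler);
    childLog.setLevel(Level.ALL);

    SparkAppHandle handle = new SparkLauncher()
        .setAppResource("/path/to/my-spark-job.jar")  // placeholder
        .setMainClass("com.example.MySparkJob")       // placeholder
        .setMaster("yarn")
        // Route the child's merged stdout/stderr to the logger above.
        .setConf(SparkLauncher.CHILD_PROCESS_LOGGER_NAME, "my.spark.child")
        .startApplication();

    while (!handle.getState().isFinal()) {
      Thread.sleep(1000);
    }
  }
}
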
Thanks, Marcelo.
One more question regarding getting logs.
In the previous implementation of SparkLauncher we could read logs from:
sparkLauncher.getInputStream()
sparkLauncher.getErrorStream()
What is the recommended way of getting logs and logging Spark execution
while using sparkLauncher#startApplication()?
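One possibility, if you are on Spark 2.x (which added redirect methods to SparkLauncher), is to send the child process output to a file instead of java.util.logging. A sketch with placeholder paths:

import java.io.File;
import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

public class RedirectToFile {
  public static void main(String[] args) throws Exception {
    SparkAppHandle handle = new SparkLauncher()
        .setAppResource("/path/to/my-spark-job.jar")    // placeholder
        .setMainClass("com.example.MySparkJob")         // placeholder
        .setMaster("yarn")
        .redirectError()                                // merge stderr into stdout
        .redirectOutput(new File("/tmp/spark-job.log")) // placeholder path
        .startApplication();

    while (!handle.getState().isFinal()) {
      Thread.sleep(1000);
    }
  }
}
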
On Tue, Oct 18, 2016 at 3:01 PM, Elkhan Dadashov wrote:
> Does my map task need to wait until Spark job finishes ?
No...
> Or is there any way, my map task finishes after launching Spark job, and I
> can still query and get status of Spark job outside of map task (or failure
> reason, if it has failed)?

Hi,
Does the delegator map task of SparkLauncher need to stay alive until Spark
job finishes ?
1)
Currently, I have mapper tasks which launch a Spark job via
SparkLauncher#startApplication().
Does my map task need to wait until Spark job finishes ?
Or is there any way, my map task finishes after launching Spark job, and I can still query and get status of Spark job outside of map task (or failure reason, if it has failed)?
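A minimal sketch of that setup (jar path and class name are placeholders): register a SparkAppHandle.Listener so the launching task is notified of state changes instead of blocking. Note the callbacks only fire while the launching JVM is alive; in yarn-cluster mode the job itself survives the client.

import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

public class LaunchWithListener {
  public static void main(String[] args) throws Exception {
    SparkAppHandle handle = new SparkLauncher()
        .setAppResource("/path/to/my-spark-job.jar")  // placeholder
        .setMainClass("com.example.MySparkJob")       // placeholder
        .setMaster("yarn")
        .setDeployMode("cluster")
        .startApplication(new SparkAppHandle.Listener() {
          @Override
          public void stateChanged(SparkAppHandle h) {
            // Final states include FINISHED, FAILED, and KILLED.
            System.out.println("State changed to " + h.getState());
          }

          @Override
          public void infoChanged(SparkAppHandle h) {
            System.out.println("Application ID: " + h.getAppId());
          }
        });

    // Optional: block until the job reaches a final state. If the task
    // exits earlier, the job keeps running but can no longer be tracked
    // through this handle.
    while (!handle.getState().isFinal()) {
      Thread.sleep(1000);
    }
  }
}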