First, it's kinda confusing to change subjects in the middle of a thread...

On Tue, Jul 28, 2015 at 1:44 PM, Elkhan Dadashov <elkhan8...@gmail.com>
wrote:

> @Marcelo
> *Question1*:
> Do you know why, when launching a Spark job through SparkLauncher in Java,
> the stdout logs (i.e., INFO Yarn.Client) are written to the error stream
> (spark.getErrorStream()) instead of the output stream?
>

All Spark jobs write that log output to stderr; that's where Spark's default
log4j configuration sends it. If you run "spark-submit ... 2>/dev/null" you
won't see any of those logs.
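
To capture those logs when using SparkLauncher, one option is to read the
launched process's error stream on a separate thread, roughly like the
sketch below (the app resource, main class and "yarn-cluster" master are
just placeholders):

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import org.apache.spark.launcher.SparkLauncher;

    public class LaunchExample {
      public static void main(String[] args) throws Exception {
        Process spark = new SparkLauncher()
            .setAppResource("/path/to/app.jar")   // placeholder path
            .setMainClass("com.example.MyApp")    // placeholder class
            .setMaster("yarn-cluster")
            .launch();

        // Drain stderr on its own thread so the child process never blocks
        // on a full pipe buffer; the launcher/driver logs show up here.
        Thread logDrainer = new Thread(() -> {
          try (BufferedReader reader = new BufferedReader(
              new InputStreamReader(spark.getErrorStream()))) {
            String line;
            while ((line = reader.readLine()) != null) {
              System.out.println("[spark] " + line);
            }
          } catch (Exception e) {
            e.printStackTrace();
          }
        });
        logDrainer.start();

        int exitCode = spark.waitFor();  // 0 means spark-submit exited cleanly
        System.out.println("spark-submit exited with code " + exitCode);
      }
    }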


> *Question2*:
>
> What is the best way to track a Spark job's progress & final status in
> Java?
>

There's no API for that. You'd have to write something yourself, probably by
implementing a SparkListener.
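
A rough sketch of what that could look like in Java (here extending
JavaSparkListener, the Java-friendly base class; in later Spark versions you
can extend SparkListener directly -- the class name and messages below are
just illustrative):

    import org.apache.spark.JavaSparkListener;
    import org.apache.spark.scheduler.SparkListenerJobEnd;
    import org.apache.spark.scheduler.SparkListenerJobStart;
    import org.apache.spark.scheduler.SparkListenerStageCompleted;

    public class ProgressListener extends JavaSparkListener {
      @Override
      public void onJobStart(SparkListenerJobStart jobStart) {
        System.out.println("Job " + jobStart.jobId() + " started");
      }

      @Override
      public void onStageCompleted(SparkListenerStageCompleted stageCompleted) {
        System.out.println("Stage " + stageCompleted.stageInfo().stageId()
            + " done (" + stageCompleted.stageInfo().numTasks() + " tasks)");
      }

      @Override
      public void onJobEnd(SparkListenerJobEnd jobEnd) {
        // jobResult() is either JobSucceeded or JobFailed(exception)
        System.out.println("Job " + jobEnd.jobId() + " finished: "
            + jobEnd.jobResult());
      }
    }

You'd register it inside your application with
sparkContext.addSparkListener(new ProgressListener()), or by passing the
class name through the spark.extraListeners configuration.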

-- 
Marcelo
