The Spark UI runs on port 4040 by default, and if the application crashes
(assuming you mean the job launched via spark-submit failed), the UI goes
away with the driver.

The UI is not really meant for diagnostics; it reports runtime statistics.
My inclination would be to look at the YARN log files (assuming you are
using YARN as your resource manager), or at the spark-submit output that
you piped to a file.
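For scheduled jobs, a simple and reliable check is the exit code of
spark-submit itself rather than the UI. A minimal sketch (the spark-submit
command line, class name, jar, and log paths below are illustrative
assumptions, not your actual job):

```shell
#!/bin/sh
# run_and_check: run a submit command, capture its output to a log file,
# and report success or failure based on the exit code.
run_and_check() {
    cmd="$1"
    log="$2"
    if $cmd > "$log" 2>&1; then
        echo "succeeded"
    else
        # $? here is the exit status of the failed command above
        echo "failed with exit code $?"
    fi
}

# Intended usage (hypothetical job; adjust class/jar/log path):
# run_and_check "spark-submit --class com.example.MyJob myjob.jar" /tmp/myjob.log
#
# On failure you can then pull the YARN container logs, e.g.:
# yarn logs -applicationId <appId> > /tmp/myjob-yarn.log
```

You can wire the failure branch into whatever alerting your scheduler
already supports.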

HTH

Dr Mich Talebzadeh



LinkedIn:
https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com



On 17 June 2016 at 14:49, Sumona Routh <sumos...@gmail.com> wrote:

> Hi there,
> Our Spark job had an error (specifically the Cassandra table definition
> did not match what was in Cassandra), which threw an exception that logged
> out to our spark-submit log.
> However, the UI never showed any failed stage or job. It appeared as if
> the job finished without error, which is not correct.
>
> We are trying to define our monitoring for our scheduled jobs, and we
> intended to use the Spark UI to catch issues. Can we explain why the UI
> would not report an exception like this? Is there a better approach we
> should use for tracking failures in a Spark job?
>
> We are currently on 1.2 standalone, however we do intend to upgrade to 1.6
> shortly.
>
> Thanks!
> Sumona
>
