[ https://issues.apache.org/jira/browse/SPARK-21013?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Marcelo Vanzin resolved SPARK-21013.
------------------------------------
    Resolution: Duplicate

You need the MapReduce (MR) JobHistory Server running for aggregated logs to 
show up; this is already explained in Spark's documentation.
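
For reference, a minimal sketch of the relevant yarn-site.xml settings, assuming 
the MR JobHistory Server runs on a host called jhs-host on its default web port 
(the host name and port here are placeholders, not values from this report):

  <!-- yarn-site.xml -->
  <property>
    <!-- Aggregate container logs off the NodeManagers when an application finishes -->
    <name>yarn.log-aggregation-enable</name>
    <value>true</value>
  </property>
  <property>
    <!-- Where the NodeManager redirects requests for aggregated logs -->
    <name>yarn.log.server.url</name>
    <value>http://jhs-host:19888/jobhistory/logs</value>
  </property>

The JobHistory Server itself can be started with something like 
mr-jobhistory-daemon.sh start historyserver (Hadoop 2.x); once 
yarn.log.server.url points at it, the redirect from the Spark History Server's 
Executors tab should resolve instead of failing as described below.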

> Spark History Server does not show the logs of completed Yarn Jobs
> ------------------------------------------------------------------
>
>                 Key: SPARK-21013
>                 URL: https://issues.apache.org/jira/browse/SPARK-21013
>             Project: Spark
>          Issue Type: Bug
>          Components: Web UI
>    Affects Versions: 1.6.1, 2.0.1, 2.1.0
>            Reporter: Hari Ck
>            Priority: Minor
>              Labels: historyserver, ui
>
> I am facing an issue when accessing the container logs of a completed Spark 
> (YARN) application from the History Server.
> Repro Steps:
> 1) Run spark-shell in yarn-client mode, or run the Pi example job on YARN 
> (example commands are given after the error output below). 
> 2) Once the job has completed (in the case of spark-shell, exit after doing 
> some simple operations), try to access the STDOUT or STDERR logs of the 
> application from the Executors tab in the Spark History Server UI. 
> 3) If YARN log aggregation is enabled, the logs are no longer available in the 
> NodeManager's log location, but the History Server still tries to fetch them 
> from there, giving the error below in the UI:
> Failed redirect for container_e31_1496881617682_0003_01_000002
> Failed while trying to construct the redirect url to the log server. Log 
> Server url may not be configured
> java.lang.Exception: Unknown container. Container either has not started or 
> has already completed or doesn't belong to this node at all.
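
For reference, steps 1) and 2) above correspond to commands along these lines 
(the examples jar path is a placeholder and depends on the Spark version and 
layout):

  ./bin/spark-shell --master yarn --deploy-mode client
  ./bin/spark-submit --class org.apache.spark.examples.SparkPi \
      --master yarn --deploy-mode client \
      examples/jars/spark-examples*.jar 100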



