Tathagata Das created SPARK-1940:
------------------------------------

             Summary: Enable rolling of executor logs (stdout / stderr)
                 Key: SPARK-1940
                 URL: https://issues.apache.org/jira/browse/SPARK-1940
             Project: Spark
          Issue Type: Improvement
          Components: Spark Core
            Reporter: Tathagata Das


Currently, with the default log4j configuration, all the executor logs get sent 
to the file <code>[executor-working-dir]/stderr</code>. This does not allow the 
log file to be rolled over, so old logs cannot be removed. 

Using log4j's RollingFileAppender allows the log4j output to be rolled, but the 
logs are then written to a different set of files rather than to 
<code>stdout</code> and <code>stderr</code>. As a result, the logs are no 
longer visible in the Spark web UI, since the web UI reads only the files 
<code>stdout</code> and <code>stderr</code>. Furthermore, this still does not 
allow stdout and stderr themselves to be cleared periodically when a large 
amount of output gets written to them (e.g., by explicit println calls inside a 
map function).
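For illustration, a minimal log4j 1.x properties sketch of the workaround described above (file name, size limit, and backup count are arbitrary assumptions, not a recommended configuration):

```
# Route executor logging through a RollingFileAppender instead of the console.
log4j.rootCategory=INFO, rolling
log4j.appender.rolling=org.apache.log4j.RollingFileAppender
# Logs land in executor.log, executor.log.1, executor.log.2, ...
# rather than in the stderr file the Spark web UI reads.
log4j.appender.rolling.File=executor.log
log4j.appender.rolling.MaxFileSize=10MB
log4j.appender.rolling.MaxBackupIndex=5
log4j.appender.rolling.layout=org.apache.log4j.PatternLayout
log4j.appender.rolling.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
```

With this configuration the rolled files are retained and old ones are pruned, but the web UI shows nothing, and output written directly to stdout/stderr is still unbounded.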

Solving this requires rolling the logs in such a way that the Spark web UI is 
aware of the rolling and can retrieve the logs across the rolled-over files.



--
This message was sent by Atlassian JIRA
(v6.2#6252)
