What does the Web UI show? What do you see when you click on the "stderr" and
"stdout" links? These links should contain the stdout and stderr output for each
executor.
About your custom logging in the executor: are you sure you checked
"${spark.yarn.app.container.log.dir}/spark-app.log"?
What is the actual location of this file?
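For reference, a minimal sketch of a log4j.properties that writes executor logs under the YARN container log directory discussed above. This assumes log4j 1.x (which Spark 1.x/2.x ship with); the file name "spark-app.log" just matches the path mentioned in this thread.

```
# Hedged sketch: route everything to a file inside the YARN container log dir.
log4j.rootLogger=INFO, file
log4j.appender.file=org.apache.log4j.FileAppender
log4j.appender.file.File=${spark.yarn.app.container.log.dir}/spark-app.log
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c: %m%n
```

Files written under spark.yarn.app.container.log.dir are picked up by YARN log aggregation, so they also become visible through the NodeManager/ResourceManager web UIs.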
Hi Ted & Nguyen,
@Ted, I was under the impression that the log4j.properties file would be
taken from the application classpath if a file path is not specified.
Please correct me if I am wrong. I tried your approach as well, but I still
couldn't find the logs.
@Nguyen, I am running it on a YARN cluster.
Please use the following syntax:
--conf
"spark.executor.extraJavaOptions=-Dlog4j.configuration=file:///local/file/log4j.properties"
FYI
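Putting the pieces of this advice together, a hedged sketch of a full spark-submit invocation. The jar name and the local path to log4j.properties are placeholders. Note that with `file:///...` the file must already exist at that path on every node; an alternative (not stated in this thread, but supported by spark-submit) is to ship it with `--files` and reference it by bare name, since shipped files land in each container's working directory.

```shell
# Hedged sketch: ship a custom log4j.properties to every YARN container
# and point both driver and executor JVMs at it. my-app.jar and the
# /local/file/ path are placeholders.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --files /local/file/log4j.properties \
  --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=log4j.properties" \
  --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=log4j.properties" \
  my-app.jar
```

This is a configuration fragment rather than something runnable outside a YARN cluster, so treat the exact flags as a starting point and adjust for your deployment.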
On Fri, Apr 29, 2016 at 6:03 AM, dev loper wrote:
> Hi Spark Team,
>
> I have asked the same question on Stack Overflow, no luck yet.
>
>
> http://stackov
These are the executor's logs, not the driver logs. To see these log files, you
have to go to the executor machines where the tasks are running. To see what you
print to stdout or stderr, you can either go to the executor machines
directly (the output is stored in "stdout" and "stderr" files somewhere in the
executor's working directory)
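Rather than visiting each executor machine, the aggregated container logs can usually be pulled in one step once the application has finished, provided YARN log aggregation is enabled (an assumption; it is on by default in many distributions). The application ID below is a placeholder; find yours in the ResourceManager UI or in the spark-submit output.

```shell
# Hedged sketch: fetch stdout/stderr of every container for a finished
# application. The application ID is a placeholder.
yarn logs -applicationId application_1461894905_0001
```

This prints each container's log files (including stdout, stderr, and any file you wrote into the container log directory) to the terminal, so custom executor logging shows up here as well.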
Hi Spark Team,
I have asked the same question on Stack Overflow, no luck yet.
http://stackoverflow.com/questions/36923949/where-to-find-logs-within-spark-rdd-processing-function-yarn-cluster-mode?noredirect=1#comment61419406_36923949
I am running my Spark application on a YARN cluster. No matter