Yes, spark.yarn.historyServer.address is used to access the spark history
server from yarn; it is not needed if you use only the yarn history server.
It may be possible to have both history servers running, but I have not tried
that yet.
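For reference, here is a minimal sketch of where that property is usually set
(spark-defaults.conf); the hostnames, ports, and HDFS paths below are
placeholders, not values taken from this thread:

```
# spark-defaults.conf (sketch; hostnames/ports/paths are placeholders)
# Write spark event logs so the spark history server can replay them:
spark.eventLog.enabled            true
spark.eventLog.dir                hdfs:///spark-logs
# Where the yarn RM web UI links "History" for finished spark apps:
spark.yarn.historyServer.address  sparkhistory.example.com:18080
# The spark history server reads event logs from the same directory:
spark.history.fs.logDirectory     hdfs:///spark-logs
```

Note this only affects the spark history server; it has no effect on yarn log
aggregation or the yarn history server.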
Besides, as far as I have understood, the yarn and spark history servers are
independent of each other.
You can see this information in the yarn web UI using the configuration I
provided in my former mail (click on the application id, then on logs; you will
then be automatically redirected to the yarn history server UI).
On 24/02/2015 19:49, Colin Kincaid Williams wrote:
Hi Colin,
Here is how I have configured my hadoop cluster to have yarn logs available
through both the yarn CLI and the _yarn_ history server (with gzip compression
and 10-day retention):
1. Add the following properties in the yarn-site.xml on each node manager and
on the resource manager:
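The properties themselves did not survive in this excerpt. A plausible minimal
sketch matching the setup described (log aggregation enabled, gzip compression,
10-day retention, plus yarn.log.server.url to make the RM web UI redirect to
the yarn history server) would be; the history-server hostname is a
placeholder:

```
<!-- yarn-site.xml (sketch; history-server hostname is a placeholder) -->
<property>
  <name>yarn.log-aggregation-enable</name>
  <value>true</value>
</property>
<property>
  <!-- 10 days, in seconds -->
  <name>yarn.log-aggregation.retain-seconds</name>
  <value>864000</value>
</property>
<property>
  <!-- gzip-compress aggregated logs -->
  <name>yarn.nodemanager.log-aggregation.compression-type</name>
  <value>gz</value>
</property>
<property>
  <!-- "logs" links in the RM web UI redirect to the yarn history server -->
  <name>yarn.log.server.url</name>
  <value>http://historyserver.example.com:19888/jobhistory/logs</value>
</property>
```

With these in place, `yarn logs -applicationId <appId>` reads the aggregated
logs from HDFS, and the same logs are browsable through the yarn history
server UI.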
Looks like in my tired state, I didn't mention spark the whole time.
However, it might be implied by the application log above. Spark log
aggregation appears to be working, since I can run the yarn command above.
I do have yarn logging set up for the yarn history server. I was trying to
use the spark history server.
The spark history server and the yarn history server are totally
independent. Spark knows nothing about yarn logs, and vice versa, so
unfortunately there isn't any way to get all the info in one place.
On Tue, Feb 24, 2015 at 12:36 PM, Colin Kincaid Williams disc...@uw.edu wrote:
So back to my original question.
I can see the spark logs using the example above:
yarn logs -applicationId application_1424740955620_0009
This shows yarn log aggregation working. I can see the stdout and stderr
in that container information above. Then how can I get this information
in a history server UI?
Hi,
I have been trying to get my yarn logs to display in the spark
history-server or yarn history-server. I can see the log information using:
yarn logs -applicationId application_1424740955620_0009
15/02/23 22:15:14 INFO client.ConfiguredRMFailoverProxyProvider: Failing over to