Hello,

I am new to Spark and am trying to add some monitoring for Spark applications, specifically to handle the two situations below.

1 - Forwarding Spark event logs to Elasticsearch via log4j, to catch critical events such as job start, executor failures, and job failures. However, I could not find any way to forward the event log through the log4j configuration. Is there another recommended approach to track these application events? (A rough sketch of what I have in mind follows.)
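For question 1, this is a minimal sketch of the kind of SparkListener I was considering instead of parsing the event log: it just logs the interesting events through log4j so that an appender can ship them on to Elasticsearch. The package and class name are placeholders, and I am assuming the log4j 1.x API bundled with Spark 2.x:

package com.example.monitoring  // placeholder package

import org.apache.log4j.Logger
import org.apache.spark.scheduler._

// Logs a few "critical" application events via log4j; a suitable appender
// (e.g. one forwarding to Logstash/Elasticsearch) would pick these up.
class MonitoringListener extends SparkListener {
  private val log = Logger.getLogger(classOf[MonitoringListener])

  override def onJobStart(jobStart: SparkListenerJobStart): Unit =
    log.info(s"Job ${jobStart.jobId} started with ${jobStart.stageIds.size} stage(s)")

  override def onJobEnd(jobEnd: SparkListenerJobEnd): Unit = jobEnd.jobResult match {
    case JobSucceeded => log.info(s"Job ${jobEnd.jobId} succeeded")
    case other        => log.error(s"Job ${jobEnd.jobId} failed: $other")
  }

  override def onExecutorRemoved(removed: SparkListenerExecutorRemoved): Unit =
    log.warn(s"Executor ${removed.executorId} removed: ${removed.reason}")

  override def onApplicationEnd(end: SparkListenerApplicationEnd): Unit =
    log.info(s"Application ended at ${end.time}")
}

The idea would be to register it at submit time with --conf spark.extraListeners=com.example.monitoring.MonitoringListener. Does that sound like a reasonable substitute for forwarding the event log itself?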
2 - For Spark Streaming jobs, is there any way to identify that data from Kafka is not being consumed for whatever reason, or that the offsets are not progressing as expected, and also forward that to Elasticsearch via log4j for monitoring? A sketch of the kind of check I am thinking of is below.
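For question 2, assuming Structured Streaming, this is roughly the StreamingQueryListener I had in mind: it warns through log4j when a micro-batch consumes no rows and records the per-source offset ranges, so stalled progress would show up in Elasticsearch. The class name is a placeholder; for the older DStream API a StreamingListener would presumably be the analogue:

package com.example.monitoring  // placeholder package

import org.apache.log4j.Logger
import org.apache.spark.sql.streaming.StreamingQueryListener
import org.apache.spark.sql.streaming.StreamingQueryListener._

// Warns when a micro-batch reads nothing from its sources and logs the
// per-source offset ranges, so a log4j appender can forward them.
class KafkaProgressListener extends StreamingQueryListener {
  private val log = Logger.getLogger(classOf[KafkaProgressListener])

  override def onQueryStarted(event: QueryStartedEvent): Unit =
    log.info(s"Query ${event.id} started")

  override def onQueryProgress(event: QueryProgressEvent): Unit = {
    val p = event.progress
    if (p.numInputRows == 0)
      log.warn(s"Query ${p.id} batch ${p.batchId}: no input rows consumed")
    p.sources.foreach { s =>
      log.info(s"Source ${s.description}: startOffset=${s.startOffset} endOffset=${s.endOffset}")
    }
  }

  override def onQueryTerminated(event: QueryTerminatedEvent): Unit =
    log.error(s"Query ${event.id} terminated: ${event.exception.getOrElse("no exception")}")
}

It would be registered once per application with spark.streams.addListener(new KafkaProgressListener). Is this a sensible way to catch stalled Kafka consumption, or is there a more standard approach?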
Thanks,
Raymond



