Hi
I am working on implementing my idea. Here is how it goes:
1. Use this library: https://github.com/groupon/spark-metrics
2. Have a cron job that periodically curls the /metrics/json endpoint on the
driver and all the other nodes
3. Parse the response and send the data through an installed Telegraf agent
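Steps 2 and 3 above could be sketched roughly as follows. This is a minimal sketch, not the actual setup: the driver UI port 4040, the Dropwizard-style `{"gauges": {name: {"value": ...}}}` payload shape, and emitting InfluxDB line protocol for a Telegraf socket listener are all assumptions.

```python
# Sketch: poll Spark's /metrics/json servlet and reshape the gauges into
# InfluxDB line protocol that a local Telegraf listener could ingest.
# The URL, port, and payload shape below are assumptions, not confirmed.
import json
import urllib.request


def to_line_protocol(metrics: dict, measurement: str = "spark") -> list:
    """Flatten a {"gauges": {name: {"value": v}}} payload into line protocol."""
    lines = []
    for name, gauge in metrics.get("gauges", {}).items():
        value = gauge.get("value")
        # Only numeric gauges translate cleanly; metric names are assumed
        # to contain no spaces or commas (e.g. "app-1.driver.BlockManager...").
        if isinstance(value, (int, float)):
            lines.append(f"{measurement},metric={name} value={value}")
    return lines


def poll(url: str = "http://localhost:4040/metrics/json") -> list:
    # Assumed driver UI port; each worker node would get its own URL.
    with urllib.request.urlopen(url) as resp:
        return to_line_protocol(json.load(resp))


if __name__ == "__main__":
    for line in poll():
        print(line)
```

A cron entry would then run this script every minute and pipe its output to the Telegraf agent's socket.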
Hi Subramanian,
Did you find a solution for this?
I am looking for something similar too.
Regards,
Chandan
On Wed, Jun 27, 2018 at 9:47 AM subramgr wrote:
> I am planning to send these metrics to our KairosDB. Let me know if there
> are any examples that I can take a look at.
I am planning to send these metrics to our KairosDB. Let me know if there are
any examples that I can take a look at.
--
Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/
In our Spark Structured Streaming job we listen to Kafka and filter out
messages that we consider malformed.
Currently we log that information using the LOGGER.
Is there a way to emit a metric each time such a malformed message is seen
in Structured Streaming?
Thanks