Thanks for your reply. Your help is very valuable and all these links are helpful (especially your example).

Best Regards

--Iacovos

On 3/27/19 10:42 PM, Luca Canali wrote:

I find that the Spark metrics system is quite useful for gathering resource utilization metrics of Spark applications, including CPU, memory, and I/O.
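For example, here is a minimal sketch (untested, with placeholder host/port) of pointing the built-in metrics system at a Graphite-compatible endpoint, such as InfluxDB with its Graphite plugin, directly from the Spark configuration; depending on your Spark version you may need to put the same keys in a metrics.properties file instead:

    import org.apache.spark.sql.SparkSession

    // Minimal sketch: route Spark's built-in metrics to a Graphite-compatible
    // sink. graphite.example.com:2003 is a placeholder, not a real endpoint.
    val spark = SparkSession.builder()
      .appName("metrics-sink-example")
      .config("spark.metrics.conf.*.sink.graphite.class",
              "org.apache.spark.metrics.sink.GraphiteSink")
      .config("spark.metrics.conf.*.sink.graphite.host", "graphite.example.com")
      .config("spark.metrics.conf.*.sink.graphite.port", "2003")
      .config("spark.metrics.conf.*.sink.graphite.period", "10")
      .config("spark.metrics.conf.*.sink.graphite.unit", "seconds")
      .getOrCreate()

With a sink like this in place, executor and driver metrics (CPU, memory, I/O) can be charted in a dashboard tool such as Grafana.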

If you are interested, there is an example of how this works for us at: https://db-blog.web.cern.ch/blog/luca-canali/2019-02-performance-dashboard-apache-spark

If instead you are looking for ways to instrument your Spark code with performance metrics, Spark task metrics and event listeners are quite useful for that. See also https://github.com/apache/spark/blob/master/docs/monitoring.md and https://github.com/LucaCanali/sparkMeasure; a short sparkMeasure sketch follows below.
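To give a flavor of sparkMeasure, here is a short sketch along the lines of its README; it assumes the spark-measure package is on the classpath (e.g. spark-shell started with --packages ch.cern.sparkmeasure:spark-measure_2.12:<version>), and the SQL query is just a stand-in workload:

    // Measure stage-level task metrics (executor run time, CPU time,
    // shuffle and I/O bytes, ...) around a workload and print a report.
    val stageMetrics = ch.cern.sparkmeasure.StageMetrics(spark)
    stageMetrics.runAndMeasure {
      spark.sql("select count(*) from range(1000) cross join range(1000)").show()
    }

The same API also offers begin()/end() plus printReport() if you prefer to bracket a larger section of your code manually.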

Regards,

Luca

*From:* manish ranjan <cse1.man...@gmail.com>
*Sent:* Tuesday, March 26, 2019 15:24
*To:* Jack Kolokasis <koloka...@ics.forth.gr>
*Cc:* user <user@spark.apache.org>
*Subject:* Re: Spark Profiler

I have found Ganglia very helpful in understanding network I/O, CPU, and memory usage for a given Spark cluster.

I have not used it, but I have heard good things about Dr. Elephant (which I think was contributed by LinkedIn, but I am not 100% sure).

On Tue, Mar 26, 2019, 5:59 AM Jack Kolokasis <koloka...@ics.forth.gr> wrote:

    Hello all,

         I am looking for a Spark profiler to trace my application and
    find the bottlenecks. I need to trace CPU usage, memory usage, and
    I/O usage.

    I am looking forward to your reply.

    --Iacovos

