Hi Jack,

You can try Sparklens (https://github.com/qubole/sparklens). I don't think it will give details at as low a level as you're looking for, but it can help you identify and remove performance bottlenecks.
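For what it's worth, a minimal sketch of attaching Sparklens at submit time. The package version and the application jar/class names here are placeholders, not from this thread; check the Sparklens README for the current coordinates:

```shell
# Attach Sparklens as an extra Spark listener at submit time.
# "0.3.2-s_2.11" and "my-app.jar" are illustrative placeholders.
./bin/spark-submit \
  --packages qubole:sparklens:0.3.2-s_2.11 \
  --conf spark.extraListeners=com.qubole.sparklens.QuboleJobListener \
  --class com.example.MyApp \
  my-app.jar
```

After the application finishes, Sparklens prints its analysis (executor utilization, critical path, simulated run times at other executor counts) to the driver log.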
~ Hariharan

On Fri, Mar 29, 2019 at 12:01 AM bo yang <bobyan...@gmail.com> wrote:

> Yeah, these options are very valuable. Just to add another option :) We
> built a JVM profiler (https://github.com/uber-common/jvm-profiler) to
> monitor and profile Spark applications at large scale (e.g. sending
> metrics to Kafka / Hive for batch analysis). People could try it as well.
>
> On Wed, Mar 27, 2019 at 1:49 PM Jack Kolokasis <koloka...@ics.forth.gr> wrote:
>
>> Thanks for your reply. Your help is very valuable and all these links
>> are helpful (especially your example).
>>
>> Best Regards
>>
>> --Iacovos
>>
>> On 3/27/19 10:42 PM, Luca Canali wrote:
>>
>> I find that the Spark metrics system is quite useful for gathering
>> resource utilization metrics of Spark applications, including CPU,
>> memory, and I/O.
>>
>> If you are interested, there is an example of how this works for us at:
>> https://db-blog.web.cern.ch/blog/luca-canali/2019-02-performance-dashboard-apache-spark
>>
>> If instead you are looking for ways to instrument your Spark code with
>> performance metrics, Spark task metrics and event listeners are quite
>> useful for that. See also
>> https://github.com/apache/spark/blob/master/docs/monitoring.md and
>> https://github.com/LucaCanali/sparkMeasure
>>
>> Regards,
>>
>> Luca
>>
>> *From:* manish ranjan <cse1.man...@gmail.com>
>> *Sent:* Tuesday, March 26, 2019 15:24
>> *To:* Jack Kolokasis <koloka...@ics.forth.gr>
>> *Cc:* user <user@spark.apache.org>
>> *Subject:* Re: Spark Profiler
>>
>> I have found Ganglia very helpful in understanding network I/O, CPU,
>> and memory usage for a given Spark cluster.
>>
>> I have not used it, but have heard good things about Dr. Elephant
>> (which I think was contributed by LinkedIn, though I'm not 100% sure).
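As a reference for the jvm-profiler mentioned above: it is attached as a Java agent rather than a Spark listener. A rough sketch based on my reading of the project README; the jar path, version, Kafka broker address, and topic prefix are all placeholders:

```shell
# Attach the Uber jvm-profiler to driver and executors as a javaagent,
# reporting metrics to Kafka for batch analysis.
# "hdfs://namenode/lib/jvm-profiler-1.0.0.jar", "broker:9092", and
# "profiling_" are illustrative placeholders.
spark-submit \
  --deploy-mode cluster \
  --conf spark.jars=hdfs://namenode/lib/jvm-profiler-1.0.0.jar \
  --conf "spark.driver.extraJavaOptions=-javaagent:jvm-profiler-1.0.0.jar=reporter=com.uber.profiling.reporters.KafkaOutputReporter,brokerList=broker:9092,topicPrefix=profiling_" \
  --conf "spark.executor.extraJavaOptions=-javaagent:jvm-profiler-1.0.0.jar=reporter=com.uber.profiling.reporters.KafkaOutputReporter,brokerList=broker:9092,topicPrefix=profiling_" \
  --class com.example.MyApp \
  my-app.jar
```

Without the reporter arguments, the agent falls back to writing metrics to the console, which is handy for a first test.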
>> On Tue, Mar 26, 2019, 5:59 AM Jack Kolokasis <koloka...@ics.forth.gr> wrote:
>>
>> Hello all,
>>
>> I am looking for a Spark profiler to trace my application and find the
>> bottlenecks. I need to trace CPU usage, memory usage, and I/O usage.
>>
>> I am looking forward to your reply.
>>
>> --Iacovos
>>
>> ---------------------------------------------------------------------
>> To unsubscribe e-mail: user-unsubscr...@spark.apache.org