Re: How to use HProf to profile Spark CPU overhead

2015-12-12 Thread Ted Yu
Have you tried adding the option below through spark.executor.extraJavaOptions ?
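For example (a rough sketch only: the hprof agent string below is the one from your mail, while the application class and jar names are just placeholders), the same option can be passed to every executor JVM via spark-submit instead of the spark-class script:

  # Illustrative only: class and jar names are placeholders.
  ./bin/spark-submit \
    --class com.example.YourApp \
    --conf "spark.executor.extraJavaOptions=-Xrunhprof:cpu=samples,depth=100,interval=20,lineno=y,thread=y,file=/home/ubuntu/out.hprof" \
    your-app.jar

  # Or equivalently, set it once in conf/spark-defaults.conf (single line):
  spark.executor.extraJavaOptions  -Xrunhprof:cpu=samples,depth=100,interval=20,lineno=y,thread=y,file=/home/ubuntu/out.hprof

Note that the file path is resolved on each worker node, so each executor JVM should write its own out.hprof there rather than everything going through SparkSubmit on the driver.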

Cheers

> On Dec 13, 2015, at 3:36 AM, Jia Zou  wrote:
> 
> My goal is to use hprof to profile where the bottleneck is.
> Is there any way to do this without modifying and rebuilding the Spark source code?
> 
> I've tried to add
> "-Xrunhprof:cpu=samples,depth=100,interval=20,lineno=y,thread=y,file=/home/ubuntu/out.hprof"
> to the spark-class script, but that only profiles the CPU usage of the
> org.apache.spark.deploy.SparkSubmit class and cannot provide insight into
> other classes such as BlockManager, or into user classes.
> 
> Any suggestions? Thanks a lot!
> 
> Best Regards,
> Jia

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: How to use HProf to profile Spark CPU overhead

2015-12-12 Thread Jia Zou
Hi Ted, it works, thanks a lot for your help!

--Jia
