You can see how Uber's JVM profiler handles this at
https://github.com/uber-common/jvm-profiler :

--conf spark.jars=hdfs://hdfs_url/lib/jvm-profiler-1.0.0.jar
--conf spark.executor.extraJavaOptions=-javaagent:jvm-profiler-1.0.0.jar
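The same pattern should work for the New Relic agent: distribute the jar with the job itself so every container gets a local copy, then reference it by bare file name, since distributed files land in each container's working directory before the JVM starts. A minimal sketch, assuming YARN cluster mode and hypothetical HDFS paths and class names (none of these are from the original thread):

```shell
# Ship the agent jar (and its config) with the job; on YARN, files passed
# via --files are localized into each container's working directory, so a
# relative -javaagent path resolves for both driver and executors.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --files hdfs:///libs/newrelic.jar,hdfs:///libs/newrelic.yml \
  --conf "spark.driver.extraJavaOptions=-javaagent:newrelic.jar" \
  --conf "spark.executor.extraJavaOptions=-javaagent:newrelic.jar" \
  --class com.example.MyApp \
  my-application.jar
```

This avoids installing the jar on every node: new machines need nothing pre-staged, because the jar travels with each submission. Note that file localization behavior can differ on other cluster managers (e.g. standalone or Mesos), so the relative-path trick should be verified there.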


    -- Oleg

On Wed, May 15, 2019 at 6:28 AM Anton Puzanov <antonpuzdeve...@gmail.com>
wrote:

> Hi everyone,
>
> I want to run my spark application with javaagent, specifically I want to
> use newrelic with my application.
>
> When I run spark-submit I must pass --conf
> "spark.driver.extraJavaOptions=-javaagent:<full path to newrelic jar>"
>
> My problem is that I can't specify the full path as I run in cluster mode
> and I don't know the exact host which will serve as the driver.
> *Important:* I know I can upload the jar to every node, but it seems like
> a fragile solution as machines will be added and removed later.
>
> I have tried specifying the jar with --files, but I couldn't make it work,
> as I didn't know exactly where the javaagent option should point.
>
> Any suggestions on the best practice for handling this kind of
> problem? What can I do?
>
> Thanks a lot,
> Anton
>
