Hi all,

I am using *Spark 1.5.1* in *yarn-client* mode along with *CDH 5.5*.

As per the documentation, enabling Dynamic Allocation of Executors in Spark
requires adding the shuffle service jar to the YARN NodeManager's classpath
and restarting the YARN NodeManager.
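For reference, the documented NodeManager-side setup (which I would like to
avoid touching) is roughly: copy spark-1.5.1-yarn-shuffle.jar onto the
NodeManager classpath, register the auxiliary service in yarn-site.xml, and
restart the NodeManagers, along these lines:

    <property>
      <name>yarn.nodemanager.aux-services</name>
      <value>mapreduce_shuffle,spark_shuffle</value>
    </property>
    <property>
      <name>yarn.nodemanager.aux-services.spark_shuffle.class</name>
      <value>org.apache.spark.network.yarn.YarnShuffleService</value>
    </property>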

Is there any way to dynamically supply the shuffle service jar from the
application itself and avoid disturbing the running YARN service?

I tried a couple of options, such as uploading the jar to HDFS and setting
*yarn.application.classpath*, but they did not work. On executor container
launch, it fails to recognize the shuffle service.
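For context, the application-side configuration looks roughly like this (the
HDFS path and the way the classpath is overridden below are only illustrative
of what was attempted):

    spark-submit --master yarn-client \
      --conf spark.dynamicAllocation.enabled=true \
      --conf spark.shuffle.service.enabled=true \
      --conf spark.hadoop.yarn.application.classpath="<default yarn classpath>,hdfs:///tmp/spark-1.5.1-yarn-shuffle.jar" \
      ...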

Any help would be greatly appreciated.

-- 
*Thanks and regards*
*Vinay Kashyap*
