Hi Spark Users,

I am trying to execute a bash command from my Spark application. I can run the
command below without issues from spark-shell, but when I put it in the app and
submit with spark-submit, the container is not able to find the directories.

import scala.sys.process._

val result = Seq("bash", "-c",
  "export LD_LIBRARY_PATH=/binaries/ && /binaries/generatedata simulate -rows 1000 -payload 32 -file MyFile1").!!


Any inputs on how to make the script visible to the Spark executors?
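For concreteness, the kind of setup I am after would be distributing the binary
with the job and resolving its executor-local path at runtime. This is only a
sketch with hypothetical paths, assuming the binary were shipped via
`spark-submit --files /local/binaries/generatedata`:

```scala
import org.apache.spark.SparkFiles
import scala.sys.process._

// SparkFiles.get resolves the executor-local copy of a file that was
// distributed with --files (or SparkContext.addFile).
val bin = SparkFiles.get("generatedata")

// Run through an explicit shell so chaining with && works; the file may
// need its execute bit restored after distribution.
val output = Seq("bash", "-c",
  s"chmod +x $bin && $bin simulate -rows 1000 -payload 32 -file MyFile1").!!
```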


Thanks,
Nasrulla
