Are local paths not exposed in containers?

Thanks,
Nasrulla
From: Nasrulla Khan Haris
Sent: Thursday, July 23, 2020 6:13 PM
To: user@spark.apache.org
Subject: Unable to run bash script when using spark-submit in cluster mode.
Importance: High

Hi Spark Users,

I am trying to execute a bash script from my Spark app. I can run the command below without issues from spark-shell; however, when I use it in the Spark app and submit with spark-submit, the container is not able to find the directories.

    val result = "export LD_LIBRARY_PATH=/binaries/ && /binaries/generatedata simulate -rows 1000 -payload 32 -file MyFile1" !!

Any inputs on how to make the script visible in the Spark executor?

Thanks,
Nasrulla
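One thing worth noting about the command above: shell features such as `export` and `&&` are only interpreted by a shell, while Scala's `sys.process` runs a bare String with `!!` by splitting it on whitespace and executing it directly, with no shell involved. A minimal sketch of the shell-wrapping idea, using an illustrative `echo` in place of the real `/binaries/generatedata` binary (which may not exist on the machine running this):

```shell
# Shell operators like `export` and `&&` are understood only by a shell,
# so the whole command line is handed to `bash -c` as a single string.
# `echo` stands in for the real binary here.
cmd='export GREETING=hello && echo "$GREETING world"'
bash -c "$cmd"
# prints: hello world
```

In the Spark app this would correspond to something like `Seq("bash", "-c", cmd).!!` rather than calling `!!` on the String directly. Separately, in cluster mode the `/binaries` directory must actually exist on every executor node, or be shipped to the containers (for example via spark-submit's `--files`/`--archives` options), for the executors to find it.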