Hi Spark Users,
I am trying to execute a bash script from my Spark app. I can run the command
below without issues from spark-shell, but when I use it in the Spark app and
submit with spark-submit, the container is not able to find the directories.
val result = "export LD_LIBRARY_PATH=/ binaries/
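For context, here is a minimal sketch of how the command can be invoked from the driver with `scala.sys.process` (the LD_LIBRARY_PATH value and the trailing command are placeholders, not the real paths from the app):

```scala
import scala.sys.process._

// Minimal sketch -- "/path/to/binaries" is a placeholder, not the real location.
// Running through "bash -c" so the export applies in the same shell as the command.
val cmd = Seq("bash", "-c", "export LD_LIBRARY_PATH=/path/to/binaries && echo ok")
val exitCode = cmd.!   // returns the command's exit code (0 on success)
```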
Are local paths not exposed in containers?
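My understanding (please correct me if wrong) is that in cluster mode the driver itself runs inside a container on a remote node, so paths that exist only on the submitting machine are not visible there. One common workaround is to ship the script with `--files`, which places it in each container's working directory; a sketch with placeholder app and file names:

```shell
# Placeholder jar, class, and script names -- not the real ones from my app.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --files /local/path/myscript.sh \
  --class com.example.MyApp \
  myapp.jar
# Inside the app the script is then reachable as ./myscript.sh
# in each container's working directory.
```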
Thanks,
Nasrulla
From: Nasrulla Khan Haris
Sent: Thursday, July 23, 2020 6:13 PM
To: user@spark.apache.org
Subject: Unable to run bash script when using spark-submit in cluster mode.
Importance: High