Actually, that did work, thanks.
What I previously tried, which did not work, was
#BSUB -env "all,SPARK_LOCAL_DIRS=/tmp,/share/,SPARK_PID_DIR=..."
However, I am still getting "No space left on device" errors. It seems that
I need hierarchical directories, and round-robin distribution is not good
enough. Trying it without spaces was the first thing I did; the information
in the pdf file inspired me to try the space.
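For comparison, a minimal LSF job-script sketch of the form that worked, with the export in the script body rather than in `#BSUB -env` (the job options and the `SPARK_PID_DIR` path are placeholders). Exporting in the body keeps the commas inside the value away from `-env`, which, as far as I can tell, splits its argument on commas:

```shell
#!/bin/bash
#BSUB -J spark-job          # placeholder job name
#BSUB -n 8                  # placeholder core count

# Export in the script body instead of via '#BSUB -env', so the commas
# inside the value cannot be mistaken for -env's own separators.
# Spark spreads its shuffle/spill files across the listed directories.
export SPARK_LOCAL_DIRS="/tmp,/share"
export SPARK_PID_DIR="$HOME/spark-pids"   # placeholder path

echo "SPARK_LOCAL_DIRS=$SPARK_LOCAL_DIRS"
```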
On Fri, Jan 12, 2024 at 10:23 PM Koert Kuipers wrote:
> try it without spaces?
> export SPARK_LOCAL_DIRS="/tmp,/share/"
>
> On Fri, Jan 12, 2024 at 5:00 PM Andrew Petersen
> wrote:
>
> Hello Spark community
>
> SPARK_LOCAL_DIRS or
> spark.local.dir
> is supposed to accept a list.
>
> I want to list one local (fast) drive, followed by a gpfs network drive,
> similar to what is done here:
> https://cug.org/proceedings/cug2016_proceedings/includes/files/pap129s2-file1.pdf
> "Thus it is preferable t
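As a sketch, the list can be passed in either of two equivalent forms (directory paths are placeholders; per the Spark configuration docs, the environment variable takes precedence over the property when both are set):

```shell
# Environment-variable form, read by Spark's launch scripts:
export SPARK_LOCAL_DIRS="/tmp,/share"

# Property form, e.g. in conf/spark-defaults.conf or on the command line:
#   spark-submit --conf spark.local.dir="/tmp,/share" ...

# Spark splits the value on commas, using one scratch root per entry;
# this prints /tmp and /share on separate lines:
echo "$SPARK_LOCAL_DIRS" | tr ',' '\n'
```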

Hello,
I was hoping to use a distribution of GraphFrames for AWS Glue 4, which has
Spark 3.3, but there is no distribution for Spark 3.3 at this location:
https://spark-packages.org/package/graphframes/graphframes
Do you have any advice on the best compatible version to use with Spark 3.3?
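For reference, packages from spark-packages.org are normally attached with `spark-submit --packages`; a sketch of the general coordinate shape, with the version left as a placeholder since the right artifact for Spark 3.3 is exactly the open question (published GraphFrames versions embed the target Spark release in the tag, e.g. the 0.8.2-spark3.2-s_2.12 form):

```shell
# <version> is a placeholder -- the release compatible with Spark 3.3
# is what this question is asking about; app.py is a placeholder too.
spark-submit \
  --packages graphframes:graphframes:<version>-s_2.12 \
  app.py
```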