Hi
Spark provides the spark.local.dir configuration to specify the work
directory on the pod. You can set spark.local.dir to your mount path.
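For example, a minimal sketch (the mount path /mnt/spark-work is a placeholder; substitute the path where your volume is actually mounted):

```shell
# Point Spark's scratch/spill directory at the mounted volume.
# /mnt/spark-work and the application jar path are illustrative.
spark-submit \
  --conf spark.local.dir=/mnt/spark-work \
  local:///opt/spark/app/my_job.py
```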
Best regards
Manoj GEORGE wrote on Thu, Sep 1, 2022 at 21:16:
> CONFIDENTIAL & RESTRICTED
>
> Hi Team,
>
> I am new to spark, so please excuse my ignorance.
>
Hi George,
You can try mounting a larger PersistentVolume to the work directory, as
described here, instead of using spark.local.dir on the node's local disk,
which might have site-specific size constraints:
https://spark.apache.org/docs/latest/running-on-kubernetes.html#using-kubernetes-volumes
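As a sketch based on the doc page above: naming the volume with the spark-local-dir- prefix makes Spark use it for scratch space (Spark 3.1+ supports on-demand PVCs per executor). The claim name, storage class, size, and mount path below are placeholders to adapt to your cluster:

```shell
# Mount a PersistentVolumeClaim as executor scratch space.
# Volume name "spark-local-dir-1" tells Spark to use it as a local dir;
# storageClass/sizeLimit/mount.path are illustrative values.
spark-submit \
  --master k8s://https://<api-server>:6443 \
  --conf spark.kubernetes.executor.volumes.persistentVolumeClaim.spark-local-dir-1.options.claimName=OnDemand \
  --conf spark.kubernetes.executor.volumes.persistentVolumeClaim.spark-local-dir-1.options.storageClass=standard \
  --conf spark.kubernetes.executor.volumes.persistentVolumeClaim.spark-local-dir-1.options.sizeLimit=100Gi \
  --conf spark.kubernetes.executor.volumes.persistentVolumeClaim.spark-local-dir-1.mount.path=/data \
  --conf spark.kubernetes.executor.volumes.persistentVolumeClaim.spark-local-dir-1.mount.readOnly=false \
  local:///opt/spark/app/my_job.py
```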
-Matt
> On Sep 1, 2022, at
CONFIDENTIAL & RESTRICTED
Hi Team,
I am new to spark, so please excuse my ignorance.
Currently we are trying to run PySpark on a Kubernetes cluster. The setup is
working fine for some jobs, but when we are processing a large file (36 GB),
we run into disk space issues.
Based on what was