This suggests you have mixed two versions of the Spark libraries. Did you
perhaps package Spark itself inside your Spark application jar?
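The usual fix is to mark the Spark artifacts as "provided" so the application jar does not bundle its own copy of Spark and the cluster's version is used at runtime. A minimal sbt sketch (the version numbers are illustrative; match them to whatever your cluster runs):

```scala
// build.sbt -- keep Spark out of the assembled jar; the cluster supplies it.
// "3.2.1" is a placeholder version, not a recommendation.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "3.2.1" % Provided,
  "org.apache.spark" %% "spark-sql"  % "3.2.1" % Provided
)
```

With Maven the equivalent is `<scope>provided</scope>` on the spark-core/spark-sql dependencies.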
On Thu, Aug 25, 2022 at 4:56 PM Elliot Metsger wrote:
> Howdy folks,
>
> Relative newbie to Spark, and super new to Beam. (I've asked this
> question on Beam lists, but this seems like a Spark-related issue so I'm
> trying my query here, too). I'm attempting to get a simple Beam pipeline
> (using the Go SDK)
Elliot Metsger
9:48 AM (7 hours ago)
to dev
Howdy folks,
Relative newbie to Spark, and super new to Beam. (I've asked this
question on Beam lists, but this seems like a Spark-related issue so I'm
trying my query here, too). I'm attempting to get a simple Beam pipeline
(using the Go SDK)
Hi, vtygoss
As I recall, memoryOverhead in Spark 2.3 covers all memory that is not
executor on-heap memory, including the memory used by Spark's off-heap memory
pool (executorOffHeapMemory; this concept also exists in Spark 2.3), the
PySpark worker, PipeRDD, the Netty memory pool, JVM
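For reference, in Spark 2.3 the requested container size is just the executor heap plus memoryOverhead, where the default overhead is max(0.10 * executorMemory, 384 MiB). A minimal Scala sketch of that arithmetic (object and method names are my own illustration, not Spark's actual Client code):

```scala
// Hypothetical illustration of Spark 2.3's executor container sizing.
// Defaults: overhead factor 0.10, floor 384 MiB (spark.executor.memoryOverhead).
object ContainerSizing {
  val MinOverheadMiB = 384L    // minimum memoryOverhead
  val OverheadFactor = 0.10    // default fraction of executor memory

  // Container request in MiB for a given executor heap size in MiB.
  def executorContainerMiB(executorMemoryMiB: Long): Long = {
    val overhead =
      math.max((executorMemoryMiB * OverheadFactor).toLong, MinOverheadMiB)
    executorMemoryMiB + overhead // matches: executorMemory + executorMemoryOverhead
  }

  def main(args: Array[String]): Unit =
    println(s"4 GiB executor -> ${executorContainerMiB(4096)} MiB container")
}
```

So a 4 GiB executor requests a 4505 MiB container (4096 + 409), while a 1 GiB executor hits the 384 MiB floor and requests 1408 MiB.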
Hi, community!
I noticed a change in how the memory of a YARN container is computed between
spark-2.3.0 and spark-3.2.1 when requesting containers from YARN, in
org.apache.spark.deploy.yarn.Client#verifyClusterResources:
```scala
// spark-2.3.0
val executorMem = executorMemory + executorMemoryOverhead