Re: Java object serialization error, java.io.InvalidClassException: org.apache.spark.deploy.ApplicationDescription; local class incompatible

2022-08-25 Thread Sean Owen
This suggests you have mixed two versions of Spark libraries. You probably packaged Spark itself in your Spark app? On Thu, Aug 25, 2022 at 4:56 PM Elliot Metsger wrote:
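The usual fix for Sean's diagnosis is to stop bundling Spark's own classes into the application jar, so the cluster's Spark jars are the only ones on the classpath at runtime. A minimal sketch for an sbt build (the artifact coordinates are real Spark modules, but the version is illustrative and must match the cluster you submit to):

```scala
// build.sbt -- mark Spark as "provided" so sbt-assembly does not package it;
// the cluster supplies these classes at runtime. Version is an assumption:
// set it to whatever your cluster actually runs.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "3.2.1" % "provided",
  "org.apache.spark" %% "spark-sql"  % "3.2.1" % "provided"
)
```

The Maven equivalent is `<scope>provided</scope>` on the same dependencies.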

Java object serialization error, java.io.InvalidClassException: org.apache.spark.deploy.ApplicationDescription; local class incompatible

2022-08-25 Thread Elliot Metsger
Howdy folks, Relative newbie to Spark, and super new to Beam. (I've asked this question on Beam lists, but this seems like a Spark-related issue so I'm trying my query here, too). I'm attempting to get a simple Beam pipeline (using the Go SDK)
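For readers unfamiliar with the exception in the subject line: `InvalidClassException: ... local class incompatible` means the 8-byte serialVersionUID embedded in the serialized stream does not match the UID of the class on the receiving side's classpath, which is exactly what happens when driver and executors run different Spark versions. A self-contained sketch that reproduces the failure by corrupting the UID in the byte stream (the demo class and helper are hypothetical, invented for illustration; the corruption simulates a peer compiled from a different version):

```scala
import java.io._

object UidMismatchDemo {
  @SerialVersionUID(1L)
  class Payload extends Serializable { val value: Int = 42 }

  def triggersMismatch(): Boolean = {
    // Serialize a Payload to bytes.
    val bos = new ByteArrayOutputStream()
    val oos = new ObjectOutputStream(bos)
    oos.writeObject(new Payload)
    oos.close()
    val bytes = bos.toByteArray

    // In the Java serialization stream format, the 8-byte serialVersionUID
    // immediately follows the UTF-encoded class name in the class descriptor.
    // Flip its low bit so the stream's UID no longer matches the local class.
    val name = classOf[Payload].getName.getBytes("UTF-8")
    val i = bytes.toIndexedSeq.indexOfSlice(name.toIndexedSeq)
    val uidEnd = i + name.length + 7
    bytes(uidEnd) = (bytes(uidEnd) ^ 1).toByte

    // Deserializing now fails the same way mismatched Spark jars do.
    try {
      new ObjectInputStream(new ByteArrayInputStream(bytes)).readObject()
      false
    } catch {
      case _: InvalidClassException => true // "local class incompatible"
    }
  }
}
```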

Re: memory module of yarn container

2022-08-25 Thread Yang,Jie(INF)
Hi, vtygoss As I recall, memoryOverhead in Spark 2.3 covers all memory that is not executor on-heap memory, including memory used by Spark's off-heap memory pool (executorOffHeapMemory, a concept that also exists in Spark 2.3), PySpark workers, PipeRDD, the Netty memory pool, JVM
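For context, when `spark.executor.memoryOverhead` is not set explicitly, the 2.3-era default is the larger of 10% of executor memory and 384 MiB. A minimal sketch of that rule (object and method names are illustrative; the constants match my reading of YarnSparkHadoopUtil in that era):

```scala
// Sketch of Spark 2.3's default executor memoryOverhead when the user has
// not set spark.executor.memoryOverhead: max(0.10 * executorMemory, 384 MiB).
object OverheadSketch {
  val MemoryOverheadFactor = 0.10
  val MemoryOverheadMinMiB = 384L

  def defaultOverheadMiB(executorMemoryMiB: Long): Long =
    math.max((MemoryOverheadFactor * executorMemoryMiB).toLong, MemoryOverheadMinMiB)
}
```

So a 4 GiB executor defaults to roughly 409 MiB of overhead, while a 1 GiB executor is clamped up to the 384 MiB floor.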

memory module of yarn container

2022-08-25 Thread vtygoss
Hi, community! I notice a change in the memory module of the yarn container between spark-2.3.0 and spark-3.2.1 when requesting containers from yarn. org.apache.spark.deploy.yarn.Client.scala # verifyClusterResources ``` spark-2.3.0 val executorMem = executorMemory + executorMemoryOverhead
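To make the difference concrete, here is a hedged sketch of the two per-executor request formulas: the 2.3 line comes from the snippet above, and the 3.2 line reflects my reading that off-heap and PySpark worker memory became separate explicit terms rather than being folded into memoryOverhead (object and method names are illustrative):

```scala
// Sketch of the per-executor container request in Client.verifyClusterResources.
object ContainerMemSketch {
  // Spark 2.3: off-heap usage is expected to fit inside memoryOverhead.
  def executorMem23(executorMemoryMiB: Long, memoryOverheadMiB: Long): Long =
    executorMemoryMiB + memoryOverheadMiB

  // Spark 3.2 (my reading): off-heap and PySpark worker memory are requested
  // explicitly, in addition to memoryOverhead.
  def executorMem32(executorMemoryMiB: Long, offHeapMiB: Long,
                    memoryOverheadMiB: Long, pysparkMemMiB: Long): Long =
    executorMemoryMiB + offHeapMiB + memoryOverheadMiB + pysparkMemMiB
}
```

The practical consequence: a job that sets `spark.memory.offHeap.size` asks YARN for a visibly larger container under 3.2 than the same configuration did under 2.3.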