[https://issues.apache.org/jira/browse/SPARK-46032?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17788583#comment-17788583]

Bobby Wang commented on SPARK-46032:
------------------------------------
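
For context on the trace in the logs below: MapPartitionsRDD.f is a scala.Function3, and a "cannot assign instance of java.lang.invoke.SerializedLambda" ClassCastException during task deserialization typically means the class that defines the lambda is not visible to the classloader the executor uses to deserialize the task, so the SerializedLambda placeholder is never resolved back into a function instance. A minimal sketch of the kind of Spark Connect client job that java-serializes such a lambda into MapPartitionsRDD.f (hypothetical; the reporter's actual repro code is not part of this comment, and the connection string is an assumption):
{code:scala}
// Hypothetical repro sketch, NOT the reporter's actual code.
// Assumes the Spark Connect Scala client on the classpath:
//   org.apache.spark:spark-connect-client-jvm_2.12:3.5.0
import org.apache.spark.sql.SparkSession

object ConnectLambdaRepro {
  def main(args: Array[String]): Unit = {
    // Connect to a Spark Connect server backed by a standalone cluster
    // like the one in the logs (sc:// endpoint is an assumption).
    val spark = SparkSession.builder().remote("sc://localhost:15002").getOrCreate()
    import spark.implicits._

    // A typed map: the lambda below is serialized on the server-side driver
    // and must be deserialized inside each executor task, which is where the
    // executor log below shows the ClassCastException (12 partitions -> 12 tasks).
    val doubled = spark.range(0, 12, 1, 12).map(i => i * 2).collect()
    println(doubled.mkString(", "))
  }
}
{code}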

Executor Logs:
{code:java}
Spark Executor Command: "/usr/lib/jvm/java-17-openjdk-amd64/bin/java" "-cp" "/home/xxx/github/mytools/spark.home/spark-3.5.0-bin-hadoop3/conf/:/home/xxx/github/mytools/spark.home/spark-3.5.0-bin-hadoop3/jars/*:/etc/hadoop" "-Xmx30720M" "-Dspark.driver.port=36847" "-Djava.net.preferIPv6Addresses=false" "-XX:+IgnoreUnrecognizedVMOptions" "--add-opens=java.base/java.lang=ALL-UNNAMED" "--add-opens=java.base/java.lang.invoke=ALL-UNNAMED" "--add-opens=java.base/java.lang.reflect=ALL-UNNAMED" "--add-opens=java.base/java.io=ALL-UNNAMED" "--add-opens=java.base/java.net=ALL-UNNAMED" "--add-opens=java.base/java.nio=ALL-UNNAMED" "--add-opens=java.base/java.util=ALL-UNNAMED" "--add-opens=java.base/java.util.concurrent=ALL-UNNAMED" "--add-opens=java.base/java.util.concurrent.atomic=ALL-UNNAMED" "--add-opens=java.base/sun.nio.ch=ALL-UNNAMED" "--add-opens=java.base/sun.nio.cs=ALL-UNNAMED" "--add-opens=java.base/sun.security.action=ALL-UNNAMED" "--add-opens=java.base/sun.util.calendar=ALL-UNNAMED" "--add-opens=java.security.jgss/sun.security.krb5=ALL-UNNAMED" "-Djdk.reflect.useDirectMethodHandle=false" "org.apache.spark.executor.CoarseGrainedExecutorBackend" "--driver-url" "spark://CoarseGrainedScheduler@192.168.31.236:36847" "--executor-id" "0" "--hostname" "192.168.31.236" "--cores" "12" "--app-id" "app-20231122082518-0000" "--worker-url" "spark://Worker@192.168.31.236:40955" "--resourceProfileId" "0" "--resourcesFile" "/home/xxx/github/mytools/spark.home/spark-3.5.0-bin-hadoop3/work/app-20231122082518-0000/0/resource-executor-10321350722431298323.json"
========================================

Using Spark's default log4j profile: org/apache/spark/log4j2-defaults.properties
23/11/22 08:25:19 INFO CoarseGrainedExecutorBackend: Started daemon with process name: 6758@spark-bobby
23/11/22 08:25:19 INFO SignalUtils: Registering signal handler for TERM
23/11/22 08:25:19 INFO SignalUtils: Registering signal handler for HUP
23/11/22 08:25:19 INFO SignalUtils: Registering signal handler for INT
23/11/22 08:25:19 WARN Utils: Your hostname, spark-bobby resolves to a loopback address: 127.0.1.1; using 192.168.31.236 instead (on interface wlp82s0)
23/11/22 08:25:19 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
23/11/22 08:25:19 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
23/11/22 08:25:19 INFO SecurityManager: Changing view acls to: xxx
23/11/22 08:25:19 INFO SecurityManager: Changing modify acls to: xxx
23/11/22 08:25:19 INFO SecurityManager: Changing view acls groups to: 
23/11/22 08:25:19 INFO SecurityManager: Changing modify acls groups to: 
23/11/22 08:25:19 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: xxx; groups with view permissions: EMPTY; users with modify permissions: xxx; groups with modify permissions: EMPTY
23/11/22 08:25:19 INFO TransportClientFactory: Successfully created connection to /192.168.31.236:36847 after 36 ms (0 ms spent in bootstraps)
23/11/22 08:25:20 INFO SecurityManager: Changing view acls to: xxx
23/11/22 08:25:20 INFO SecurityManager: Changing modify acls to: xxx
23/11/22 08:25:20 INFO SecurityManager: Changing view acls groups to: 
23/11/22 08:25:20 INFO SecurityManager: Changing modify acls groups to: 
23/11/22 08:25:20 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: xxx; groups with view permissions: EMPTY; users with modify permissions: xxx; groups with modify permissions: EMPTY
23/11/22 08:25:20 INFO TransportClientFactory: Successfully created connection to /192.168.31.236:36847 after 1 ms (0 ms spent in bootstraps)
23/11/22 08:25:20 INFO DiskBlockManager: Created local directory at /tmp/spark-cc85e94e-6b00-416a-b1c9-df7699aa96aa/executor-8ff0fcba-d88e-49f0-bb75-d2ed95ea1f26/blockmgr-6b86d421-7d6c-4776-9c43-801254892526
23/11/22 08:25:20 INFO MemoryStore: MemoryStore started with capacity 17.8 GiB
23/11/22 08:25:20 INFO CoarseGrainedExecutorBackend: Connecting to driver: spark://CoarseGrainedScheduler@192.168.31.236:36847
23/11/22 08:25:20 INFO WorkerWatcher: Connecting to worker spark://Worker@192.168.31.236:40955
23/11/22 08:25:20 INFO TransportClientFactory: Successfully created connection to /192.168.31.236:40955 after 1 ms (0 ms spent in bootstraps)
23/11/22 08:25:20 INFO WorkerWatcher: Successfully connected to spark://Worker@192.168.31.236:40955
23/11/22 08:25:20 INFO ResourceUtils: ==============================================================
23/11/22 08:25:20 INFO ResourceUtils: Custom resources for spark.executor:
gpu -> [name: gpu, addresses: 0]
23/11/22 08:25:20 INFO ResourceUtils: ==============================================================
23/11/22 08:25:20 INFO CoarseGrainedExecutorBackend: Successfully registered with driver
23/11/22 08:25:20 INFO Executor: Starting executor ID 0 on host 192.168.31.236
23/11/22 08:25:20 INFO Executor: OS info Linux, 6.2.0-36-generic, amd64
23/11/22 08:25:20 INFO Executor: Java version 17.0.8.1
23/11/22 08:25:20 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 44825.
23/11/22 08:25:20 INFO NettyBlockTransferService: Server created on 192.168.31.236:44825
23/11/22 08:25:20 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
23/11/22 08:25:20 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(0, 192.168.31.236, 44825, None)
23/11/22 08:25:20 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(0, 192.168.31.236, 44825, None)
23/11/22 08:25:20 INFO BlockManager: Initialized BlockManager: BlockManagerId(0, 192.168.31.236, 44825, None)
23/11/22 08:25:20 INFO Executor: Starting executor with user classpath (userClassPathFirst = false): ''
23/11/22 08:25:20 INFO Executor: Created or updated repl class loader org.apache.spark.util.MutableURLClassLoader@cb2d124 for default.
23/11/22 08:25:20 INFO Executor: Fetching spark://192.168.31.236:36847/files/org.apache.spark_spark-connect_2.12-3.5.0.jar with timestamp 1700612718053
23/11/22 08:25:20 INFO TransportClientFactory: Successfully created connection to /192.168.31.236:36847 after 1 ms (0 ms spent in bootstraps)
23/11/22 08:25:20 INFO Utils: Fetching spark://192.168.31.236:36847/files/org.apache.spark_spark-connect_2.12-3.5.0.jar to /tmp/spark-cc85e94e-6b00-416a-b1c9-df7699aa96aa/executor-8ff0fcba-d88e-49f0-bb75-d2ed95ea1f26/spark-5c0e4363-3217-4bf7-b66c-8aaa4d9e6d83/fetchFileTemp17176133017062523071.tmp
23/11/22 08:25:20 INFO Utils: Copying /tmp/spark-cc85e94e-6b00-416a-b1c9-df7699aa96aa/executor-8ff0fcba-d88e-49f0-bb75-d2ed95ea1f26/spark-5c0e4363-3217-4bf7-b66c-8aaa4d9e6d83/21403806221700612718053_cache to /home/xxx/github/mytools/spark.home/spark-3.5.0-bin-hadoop3/work/app-20231122082518-0000/0/./org.apache.spark_spark-connect_2.12-3.5.0.jar
23/11/22 08:25:20 INFO Executor: Fetching spark://192.168.31.236:36847/files/org.spark-project.spark_unused-1.0.0.jar with timestamp 1700612718053
23/11/22 08:25:20 INFO Utils: Fetching spark://192.168.31.236:36847/files/org.spark-project.spark_unused-1.0.0.jar to /tmp/spark-cc85e94e-6b00-416a-b1c9-df7699aa96aa/executor-8ff0fcba-d88e-49f0-bb75-d2ed95ea1f26/spark-5c0e4363-3217-4bf7-b66c-8aaa4d9e6d83/fetchFileTemp17001851547018168957.tmp
23/11/22 08:25:20 INFO Utils: Copying /tmp/spark-cc85e94e-6b00-416a-b1c9-df7699aa96aa/executor-8ff0fcba-d88e-49f0-bb75-d2ed95ea1f26/spark-5c0e4363-3217-4bf7-b66c-8aaa4d9e6d83/-7438553561700612718053_cache to /home/xxx/github/mytools/spark.home/spark-3.5.0-bin-hadoop3/work/app-20231122082518-0000/0/./org.spark-project.spark_unused-1.0.0.jar
23/11/22 08:25:20 INFO Executor: Fetching spark://192.168.31.236:36847/jars/org.apache.spark_spark-connect_2.12-3.5.0.jar with timestamp 1700612718053
23/11/22 08:25:20 INFO Utils: Fetching spark://192.168.31.236:36847/jars/org.apache.spark_spark-connect_2.12-3.5.0.jar to /tmp/spark-cc85e94e-6b00-416a-b1c9-df7699aa96aa/executor-8ff0fcba-d88e-49f0-bb75-d2ed95ea1f26/spark-5c0e4363-3217-4bf7-b66c-8aaa4d9e6d83/fetchFileTemp10184976634287199568.tmp
23/11/22 08:25:20 INFO Utils: /tmp/spark-cc85e94e-6b00-416a-b1c9-df7699aa96aa/executor-8ff0fcba-d88e-49f0-bb75-d2ed95ea1f26/spark-5c0e4363-3217-4bf7-b66c-8aaa4d9e6d83/-20788328491700612718053_cache has been previously copied to /home/xxx/github/mytools/spark.home/spark-3.5.0-bin-hadoop3/work/app-20231122082518-0000/0/./org.apache.spark_spark-connect_2.12-3.5.0.jar
23/11/22 08:25:20 INFO Executor: Adding file:/home/xxx/github/mytools/spark.home/spark-3.5.0-bin-hadoop3/work/app-20231122082518-0000/0/./org.apache.spark_spark-connect_2.12-3.5.0.jar to class loader default
23/11/22 08:25:20 INFO Executor: Fetching spark://192.168.31.236:36847/jars/org.spark-project.spark_unused-1.0.0.jar with timestamp 1700612718053
23/11/22 08:25:20 INFO Utils: Fetching spark://192.168.31.236:36847/jars/org.spark-project.spark_unused-1.0.0.jar to /tmp/spark-cc85e94e-6b00-416a-b1c9-df7699aa96aa/executor-8ff0fcba-d88e-49f0-bb75-d2ed95ea1f26/spark-5c0e4363-3217-4bf7-b66c-8aaa4d9e6d83/fetchFileTemp10223084169908292672.tmp
23/11/22 08:25:20 INFO Utils: /tmp/spark-cc85e94e-6b00-416a-b1c9-df7699aa96aa/executor-8ff0fcba-d88e-49f0-bb75-d2ed95ea1f26/spark-5c0e4363-3217-4bf7-b66c-8aaa4d9e6d83/-17196869091700612718053_cache has been previously copied to /home/xxx/github/mytools/spark.home/spark-3.5.0-bin-hadoop3/work/app-20231122082518-0000/0/./org.spark-project.spark_unused-1.0.0.jar
23/11/22 08:25:20 INFO Executor: Adding file:/home/xxx/github/mytools/spark.home/spark-3.5.0-bin-hadoop3/work/app-20231122082518-0000/0/./org.spark-project.spark_unused-1.0.0.jar to class loader default
23/11/22 08:26:21 INFO CoarseGrainedExecutorBackend: Got assigned task 0
23/11/22 08:26:21 INFO CoarseGrainedExecutorBackend: Got assigned task 1
23/11/22 08:26:21 INFO Executor: Starting executor with user classpath (userClassPathFirst = false): ''
23/11/22 08:26:21 INFO CoarseGrainedExecutorBackend: Got assigned task 2
23/11/22 08:26:21 INFO CoarseGrainedExecutorBackend: Got assigned task 3
23/11/22 08:26:21 INFO Executor: Using REPL class URI: spark://192.168.31.236:36847/artifacts/7e184283-72c4-4ee4-b3fd-43c7c1453639/classes/
23/11/22 08:26:21 INFO CoarseGrainedExecutorBackend: Got assigned task 4
23/11/22 08:26:21 INFO CoarseGrainedExecutorBackend: Got assigned task 5
23/11/22 08:26:21 INFO CoarseGrainedExecutorBackend: Got assigned task 6
23/11/22 08:26:21 INFO Executor: Created or updated repl class loader org.apache.spark.executor.ExecutorClassLoader@2f2f0074 for 7e184283-72c4-4ee4-b3fd-43c7c1453639.
23/11/22 08:26:21 INFO CoarseGrainedExecutorBackend: Got assigned task 7
23/11/22 08:26:21 INFO CoarseGrainedExecutorBackend: Got assigned task 8
23/11/22 08:26:21 INFO CoarseGrainedExecutorBackend: Got assigned task 9
23/11/22 08:26:21 INFO CoarseGrainedExecutorBackend: Got assigned task 10
23/11/22 08:26:21 INFO CoarseGrainedExecutorBackend: Got assigned task 11
23/11/22 08:26:21 INFO Executor: Running task 3.0 in stage 0.0 (TID 3)
23/11/22 08:26:21 INFO Executor: Running task 7.0 in stage 0.0 (TID 7)
23/11/22 08:26:21 INFO Executor: Running task 5.0 in stage 0.0 (TID 5)
23/11/22 08:26:21 INFO Executor: Running task 10.0 in stage 0.0 (TID 10)
23/11/22 08:26:21 INFO Executor: Running task 11.0 in stage 0.0 (TID 11)
23/11/22 08:26:21 INFO Executor: Running task 2.0 in stage 0.0 (TID 2)
23/11/22 08:26:21 INFO Executor: Running task 4.0 in stage 0.0 (TID 4)
23/11/22 08:26:21 INFO Executor: Running task 9.0 in stage 0.0 (TID 9)
23/11/22 08:26:21 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
23/11/22 08:26:21 INFO Executor: Running task 1.0 in stage 0.0 (TID 1)
23/11/22 08:26:21 INFO Executor: Running task 6.0 in stage 0.0 (TID 6)
23/11/22 08:26:21 INFO Executor: Running task 8.0 in stage 0.0 (TID 8)
23/11/22 08:26:21 INFO TorrentBroadcast: Started reading broadcast variable 0 with 1 pieces (estimated total size 4.0 MiB)
23/11/22 08:26:21 INFO TransportClientFactory: Successfully created connection to /192.168.31.236:43439 after 1 ms (0 ms spent in bootstraps)
23/11/22 08:26:21 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 5.6 KiB, free 17.8 GiB)
23/11/22 08:26:21 INFO TorrentBroadcast: Reading broadcast variable 0 took 70 ms
23/11/22 08:26:22 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 11.9 KiB, free 17.8 GiB)
23/11/22 08:26:22 ERROR Executor: Exception in task 11.0 in stage 0.0 (TID 11)
java.lang.ClassCastException: cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.rdd.MapPartitionsRDD.f of type scala.Function3 in instance of org.apache.spark.rdd.MapPartitionsRDD
        at java.base/java.io.ObjectStreamClass$FieldReflector.setObjFieldValues(ObjectStreamClass.java:2096)
        at java.base/java.io.ObjectStreamClass$FieldReflector.checkObjectFieldValueTypes(ObjectStreamClass.java:2060)
        at java.base/java.io.ObjectStreamClass.checkObjFieldValueTypes(ObjectStreamClass.java:1347)
        at java.base/java.io.ObjectInputStream$FieldValues.defaultCheckFieldValues(ObjectInputStream.java:2679)
        at java.base/java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2486)
        at java.base/java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2257)
        at java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1733)
        at java.base/java.io.ObjectInputStream$FieldValues.<init>(ObjectInputStream.java:2606)
        at java.base/java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2457)
        at java.base/java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2257)
        at java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1733)
        at java.base/java.io.ObjectInputStream.readObject(ObjectInputStream.java:509)
        at java.base/java.io.ObjectInputStream.readObject(ObjectInputStream.java:467)
        at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:87)
        at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:129)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:86)
        at org.apache.spark.TaskContext.runTaskWithListeners(TaskContext.scala:161)
        at org.apache.spark.scheduler.Task.run(Task.scala:141)
        at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$4(Executor.scala:620)
        at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally(SparkErrorUtils.scala:64)
        at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally$(SparkErrorUtils.scala:61)
        at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:94)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:623)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
        at java.base/java.lang.Thread.run(Thread.java:833)
23/11/22 08:26:22 ERROR Executor: Exception in task 7.0 in stage 0.0 (TID 7)
23/11/22 08:26:22 ERROR Executor: Exception in task 4.0 in stage 0.0 (TID 4)
23/11/22 08:26:22 ERROR Executor: Exception in task 1.0 in stage 0.0 (TID 1)
23/11/22 08:26:22 ERROR Executor: Exception in task 6.0 in stage 0.0 (TID 6)
23/11/22 08:26:22 ERROR Executor: Exception in task 10.0 in stage 0.0 (TID 10)
23/11/22 08:26:22 ERROR Executor: Exception in task 8.0 in stage 0.0 (TID 8)
23/11/22 08:26:22 ERROR Executor: Exception in task 0.0 in stage 0.0 (TID 0)
23/11/22 08:26:22 ERROR Executor: Exception in task 2.0 in stage 0.0 (TID 2)
23/11/22 08:26:22 ERROR Executor: Exception in task 5.0 in stage 0.0 (TID 5)
23/11/22 08:26:22 ERROR Executor: Exception in task 9.0 in stage 0.0 (TID 9)
23/11/22 08:26:22 ERROR Executor: Exception in task 3.0 in stage 0.0 (TID 3)
        [each task fails with the same java.lang.ClassCastException and stack trace as TID 11 above]
23/11/22 08:26:22 INFO CoarseGrainedExecutorBackend: Got assigned task 12
23/11/22 08:26:22 INFO Executor: Running task 5.1 in stage 0.0 (TID 12)
23/11/22 08:26:22 INFO CoarseGrainedExecutorBackend: Got assigned task 13
23/11/22 08:26:22 INFO Executor: Running task 11.1 in stage 0.0 (TID 13)
23/11/22 08:26:22 INFO CoarseGrainedExecutorBackend: Got assigned task 14
23/11/22 08:26:22 INFO Executor: Running task 6.1 in stage 0.0 (TID 14)
23/11/22 08:26:22 INFO CoarseGrainedExecutorBackend: Got assigned task 15
23/11/22 08:26:22 INFO Executor: Running task 3.1 in stage 0.0 (TID 15)
23/11/22 08:26:22 INFO CoarseGrainedExecutorBackend: Got assigned task 16
23/11/22 08:26:22 INFO Executor: Running task 10.1 in stage 0.0 (TID 16)
23/11/22 08:26:22 INFO CoarseGrainedExecutorBackend: Got assigned task 17
23/11/22 08:26:22 INFO Executor: Running task 7.1 in stage 0.0 (TID 17)
23/11/22 08:26:22 ERROR Executor: Exception in task 5.1 in stage 0.0 (TID 12)
        [same java.lang.ClassCastException and stack trace as above]
23/11/22 08:26:22 INFO CoarseGrainedExecutorBackend: Got assigned task 18
23/11/22 08:26:22 INFO CoarseGrainedExecutorBackend: Got assigned task 19
23/11/22 08:26:22 INFO Executor: Running task 4.1 in stage 0.0 (TID 19)
23/11/22 08:26:22 INFO Executor: Running task 0.1 in stage 0.0 (TID 18)
23/11/22 08:26:22 ERROR Executor: Exception in task 3.1 in stage 0.0 (TID 15)
        [same java.lang.ClassCastException and stack trace as above]
23/11/22 08:26:22 INFO CoarseGrainedExecutorBackend: Got assigned task 20
23/11/22 08:26:22 INFO CoarseGrainedExecutorBackend: Got assigned task 21
23/11/22 08:26:22 ERROR Executor: Exception in task 6.1 in stage 0.0 (TID 14)
        [same java.lang.ClassCastException and stack trace as above]
23/11/22 08:26:22 INFO Executor: Running task 2.1 in stage 0.0 (TID 21)
23/11/22 08:26:22 INFO CoarseGrainedExecutorBackend: Got assigned task 22
23/11/22 08:26:22 INFO CoarseGrainedExecutorBackend: Got assigned task 23
23/11/22 08:26:22 INFO Executor: Running task 1.1 in stage 0.0 (TID 20)
23/11/22 08:26:22 ERROR Executor: Exception in task 11.1 in stage 0.0 (TID 13)
        [same java.lang.ClassCastException and stack trace as above]
23/11/22 08:26:22 INFO Executor: Running task 8.1 in stage 0.0 (TID 22)
23/11/22 08:26:22 INFO Executor: Running task 9.1 in stage 0.0 (TID 23)
23/11/22 08:26:22 ERROR Executor: Exception in task 4.1 in stage 0.0 (TID 19)
java.lang.ClassCastException: cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.rdd.MapPartitionsRDD.f of type scala.Function3 in instance of org.apache.spark.rdd.MapPartitionsRDD
        ... (stack trace identical to TID 13 above)
23/11/22 08:26:22 ERROR Executor: Exception in task 0.1 in stage 0.0 (TID 18)
java.lang.ClassCastException: cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.rdd.MapPartitionsRDD.f of type scala.Function3 in instance of org.apache.spark.rdd.MapPartitionsRDD
        ... (stack trace identical to TID 13 above)
23/11/22 08:26:22 ERROR Executor: Exception in task 7.1 in stage 0.0 (TID 17)
java.lang.ClassCastException: cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.rdd.MapPartitionsRDD.f of type scala.Function3 in instance of org.apache.spark.rdd.MapPartitionsRDD
        ... (stack trace identical to TID 13 above)
23/11/22 08:26:22 ERROR Executor: Exception in task 10.1 in stage 0.0 (TID 16)
java.lang.ClassCastException: cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.rdd.MapPartitionsRDD.f of type scala.Function3 in instance of org.apache.spark.rdd.MapPartitionsRDD
        ... (stack trace identical to TID 13 above)
23/11/22 08:26:22 ERROR Executor: Exception in task 8.1 in stage 0.0 (TID 22)
java.lang.ClassCastException: cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.rdd.MapPartitionsRDD.f of type scala.Function3 in instance of org.apache.spark.rdd.MapPartitionsRDD
        ... (stack trace identical to TID 13 above)
23/11/22 08:26:22 ERROR Executor: Exception in task 1.1 in stage 0.0 (TID 20)
java.lang.ClassCastException: cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.rdd.MapPartitionsRDD.f of type scala.Function3 in instance of org.apache.spark.rdd.MapPartitionsRDD
        ... (stack trace identical to TID 13 above)
23/11/22 08:26:22 INFO CoarseGrainedExecutorBackend: Got assigned task 24
23/11/22 08:26:22 INFO CoarseGrainedExecutorBackend: Got assigned task 25
23/11/22 08:26:22 ERROR Executor: Exception in task 9.1 in stage 0.0 (TID 23)
java.lang.ClassCastException: cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.rdd.MapPartitionsRDD.f of type scala.Function3 in instance of org.apache.spark.rdd.MapPartitionsRDD
        ... (stack trace identical to TID 13 above)
23/11/22 08:26:22 INFO Executor: Running task 5.2 in stage 0.0 (TID 24)
23/11/22 08:26:22 ERROR Executor: Exception in task 2.1 in stage 0.0 (TID 21)
java.lang.ClassCastException: cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.rdd.MapPartitionsRDD.f of type scala.Function3 in instance of org.apache.spark.rdd.MapPartitionsRDD
        ... (stack trace identical to TID 13 above)
23/11/22 08:26:22 INFO CoarseGrainedExecutorBackend: Got assigned task 26
23/11/22 08:26:22 INFO CoarseGrainedExecutorBackend: Got assigned task 27
23/11/22 08:26:22 INFO Executor: Running task 6.2 in stage 0.0 (TID 26)
23/11/22 08:26:22 INFO Executor: Running task 3.2 in stage 0.0 (TID 25)
23/11/22 08:26:22 INFO Executor: Running task 11.2 in stage 0.0 (TID 27)
23/11/22 08:26:22 ERROR Executor: Exception in task 5.2 in stage 0.0 (TID 24)
java.lang.ClassCastException: cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.rdd.MapPartitionsRDD.f of type scala.Function3 in instance of org.apache.spark.rdd.MapPartitionsRDD
        ... (stack trace identical to TID 13 above)
23/11/22 08:26:22 INFO CoarseGrainedExecutorBackend: Got assigned task 28
23/11/22 08:26:22 INFO Executor: Running task 4.2 in stage 0.0 (TID 28)
23/11/22 08:26:22 INFO CoarseGrainedExecutorBackend: Got assigned task 29
23/11/22 08:26:22 INFO Executor: Running task 7.2 in stage 0.0 (TID 29)
23/11/22 08:26:22 ERROR Executor: Exception in task 11.2 in stage 0.0 (TID 27)
java.lang.ClassCastException: cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.rdd.MapPartitionsRDD.f of type scala.Function3 in instance of org.apache.spark.rdd.MapPartitionsRDD
        ... (stack trace identical to TID 13 above)
23/11/22 08:26:22 INFO CoarseGrainedExecutorBackend: Got assigned task 30
23/11/22 08:26:22 INFO Executor: Running task 0.2 in stage 0.0 (TID 30)
23/11/22 08:26:22 INFO CoarseGrainedExecutorBackend: Got assigned task 31
23/11/22 08:26:22 INFO Executor: Running task 10.2 in stage 0.0 (TID 31)
23/11/22 08:26:22 ERROR Executor: Exception in task 3.2 in stage 0.0 (TID 25)
java.lang.ClassCastException: cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.rdd.MapPartitionsRDD.f of type scala.Function3 in instance of org.apache.spark.rdd.MapPartitionsRDD
        ... (stack trace identical to TID 13 above)
23/11/22 08:26:22 ERROR Executor: Exception in task 4.2 in stage 0.0 (TID 28)
java.lang.ClassCastException: cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.rdd.MapPartitionsRDD.f of type scala.Function3 in instance of org.apache.spark.rdd.MapPartitionsRDD
        ... (stack trace identical to TID 13 above)
23/11/22 08:26:22 ERROR Executor: Exception in task 6.2 in stage 0.0 (TID 26)
java.lang.ClassCastException: cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.rdd.MapPartitionsRDD.f of type scala.Function3 in instance of org.apache.spark.rdd.MapPartitionsRDD
        ... (stack trace identical to TID 13 above)
23/11/22 08:26:22 ERROR Executor: Exception in task 7.2 in stage 0.0 (TID 29)
java.lang.ClassCastException: cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.rdd.MapPartitionsRDD.f of type scala.Function3 in instance of org.apache.spark.rdd.MapPartitionsRDD
        ... (stack trace identical to TID 13 above)
23/11/22 08:26:22 ERROR Executor: Exception in task 10.2 in stage 0.0 (TID 31)
java.lang.ClassCastException: cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.rdd.MapPartitionsRDD.f of type scala.Function3 in instance of org.apache.spark.rdd.MapPartitionsRDD
        ... (stack trace identical to TID 13 above)
23/11/22 08:26:22 INFO CoarseGrainedExecutorBackend: Got assigned task 32
23/11/22 08:26:22 INFO Executor: Running task 1.2 in stage 0.0 (TID 32)
23/11/22 08:26:22 ERROR Executor: Exception in task 0.2 in stage 0.0 (TID 30)
java.lang.ClassCastException: cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.rdd.MapPartitionsRDD.f of type scala.Function3 in instance of org.apache.spark.rdd.MapPartitionsRDD
        ... (stack trace identical to TID 13 above)
23/11/22 08:26:22 INFO CoarseGrainedExecutorBackend: Got assigned task 33
23/11/22 08:26:22 INFO Executor: Running task 9.2 in stage 0.0 (TID 33)
23/11/22 08:26:22 ERROR Executor: Exception in task 1.2 in stage 0.0 (TID 32)
java.lang.ClassCastException: cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.rdd.MapPartitionsRDD.f of type scala.Function3 in instance of org.apache.spark.rdd.MapPartitionsRDD
        ... (stack trace identical to TID 13 above)
23/11/22 08:26:22 INFO CoarseGrainedExecutorBackend: Got assigned task 34
23/11/22 08:26:22 INFO CoarseGrainedExecutorBackend: Got assigned task 35
23/11/22 08:26:22 INFO Executor: Running task 8.2 in stage 0.0 (TID 34)
23/11/22 08:26:22 INFO CoarseGrainedExecutorBackend: Got assigned task 36
23/11/22 08:26:22 INFO Executor: Running task 5.3 in stage 0.0 (TID 36)
23/11/22 08:26:22 ERROR Executor: Exception in task 9.2 in stage 0.0 (TID 33)
java.lang.ClassCastException: cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.rdd.MapPartitionsRDD.f of type scala.Function3 in instance of org.apache.spark.rdd.MapPartitionsRDD
        ... (stack trace identical to TID 13 above)
23/11/22 08:26:22 ERROR Executor: Exception in task 5.3 in stage 0.0 (TID 36)
java.lang.ClassCastException: cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.rdd.MapPartitionsRDD.f of type scala.Function3 in instance of org.apache.spark.rdd.MapPartitionsRDD
        ... (stack trace identical to TID 13 above)
23/11/22 08:26:22 INFO CoarseGrainedExecutorBackend: Got assigned task 37
23/11/22 08:26:22 INFO Executor: Running task 11.3 in stage 0.0 (TID 37)
23/11/22 08:26:22 INFO Executor: Running task 2.2 in stage 0.0 (TID 35)
23/11/22 08:26:22 INFO CoarseGrainedExecutorBackend: Got assigned task 38
23/11/22 08:26:22 ERROR Executor: Exception in task 8.2 in stage 0.0 (TID 34)
java.lang.ClassCastException: cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.rdd.MapPartitionsRDD.f of type scala.Function3 in instance of org.apache.spark.rdd.MapPartitionsRDD
        ... (stack trace identical to TID 13 above)
23/11/22 08:26:22 INFO Executor: Running task 3.3 in stage 0.0 (TID 38)
23/11/22 08:26:22 ERROR Executor: Exception in task 2.2 in stage 0.0 (TID 35)
        [same ClassCastException and stack trace as for TID 34 above]
23/11/22 08:26:22 ERROR Executor: Exception in task 11.3 in stage 0.0 (TID 37)
        [same ClassCastException and stack trace as for TID 34 above]
23/11/22 08:26:22 ERROR Executor: Exception in task 3.3 in stage 0.0 (TID 38)
        [same ClassCastException and stack trace as for TID 34 above]
23/11/22 08:26:22 INFO CoarseGrainedExecutorBackend: Got assigned task 39
23/11/22 08:26:22 INFO Executor: Running task 4.3 in stage 0.0 (TID 39)
23/11/22 08:26:22 INFO CoarseGrainedExecutorBackend: Got assigned task 40
23/11/22 08:26:22 INFO CoarseGrainedExecutorBackend: Got assigned task 41
23/11/22 08:26:22 INFO Executor: Running task 7.3 in stage 0.0 (TID 40)
23/11/22 08:26:22 INFO CoarseGrainedExecutorBackend: Got assigned task 42
23/11/22 08:26:22 INFO Executor: Running task 0.3 in stage 0.0 (TID 42)
23/11/22 08:26:22 INFO Executor: Running task 10.3 in stage 0.0 (TID 41)
23/11/22 08:26:22 INFO CoarseGrainedExecutorBackend: Got assigned task 43
23/11/22 08:26:22 INFO Executor: Running task 6.3 in stage 0.0 (TID 43)
23/11/22 08:26:22 ERROR Executor: Exception in task 7.3 in stage 0.0 (TID 40)
        [same ClassCastException and stack trace as for TID 34 above]
23/11/22 08:26:22 ERROR Executor: Exception in task 0.3 in stage 0.0 (TID 42)
        [same ClassCastException and stack trace as for TID 34 above]
23/11/22 08:26:22 INFO CoarseGrainedExecutorBackend: Got assigned task 44
23/11/22 08:26:22 INFO Executor: Running task 1.3 in stage 0.0 (TID 44)
23/11/22 08:26:22 ERROR Executor: Exception in task 6.3 in stage 0.0 (TID 43)
        [same ClassCastException and stack trace as for TID 34 above]
23/11/22 08:26:22 ERROR Executor: Exception in task 10.3 in stage 0.0 (TID 41)
        [same ClassCastException and stack trace as for TID 34 above]
23/11/22 08:26:22 ERROR Executor: Exception in task 4.3 in stage 0.0 (TID 39)
        [same ClassCastException and stack trace as for TID 34 above]
23/11/22 08:26:22 ERROR Executor: Exception in task 1.3 in stage 0.0 (TID 44)
        [same ClassCastException and stack trace as for TID 34 above]
{code}
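
The repeated ClassCastException above is the usual signature of a classloader visibility problem during Java deserialization: a serialized lambda can only be resolved back into its functional-interface type (here scala.Function3) when the class that defines it is visible to the deserializing JVM; otherwise the raw java.lang.invoke.SerializedLambda ends up assigned to the typed field and fails exactly as in this log. Because the connect server in the report below is launched with --packages, the spark-connect jar is resolved for the driver but may never reach the standalone executors' classpath. A minimal sketch of one way to rule this out is to hand the jar to the executors explicitly; the jar path is an illustrative assumption (wherever spark-connect_2.12-3.5.0.jar actually resolved on this machine), not taken from the report:
{code:java}
# Sketch only, not a confirmed fix: point the executors at the connect jar
# directly instead of relying on --packages resolution. Adjust CONNECT_JAR
# to the real location of the resolved jar (assumed here to be the default
# ivy cache).
CONNECT_JAR="$HOME/.ivy2/jars/org.apache.spark_spark-connect_2.12-3.5.0.jar"

start-connect-server.sh \
    --master spark://10.19.183.93:7077 \
    --jars "$CONNECT_JAR" \
    --conf spark.executor.extraClassPath="$CONNECT_JAR" \
    --conf spark.executor.cores=12 \
    --conf spark.task.cpus=1 \
    --executor-memory 30G \
    --driver-memory 1G
{code}
Since the master and the only worker run on the same node here, a local filesystem path is enough for spark.executor.extraClassPath; on a multi-node cluster the jar would have to exist at that path on every worker. If the job succeeds under this launch, the failure is a packaging/classpath issue rather than a bug in the job itself.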

> connect: cannot assign instance of java.lang.invoke.SerializedLambda to field 
> org.apache.spark.rdd.MapPartitionsRDD.f
> ---------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-46032
>                 URL: https://issues.apache.org/jira/browse/SPARK-46032
>             Project: Spark
>          Issue Type: Bug
>          Components: Connect
>    Affects Versions: 3.5.0
>            Reporter: Bobby Wang
>            Priority: Major
>
> I downloaded Spark 3.5 from the official Spark website and then started a 
> Spark Standalone cluster in which both the master and the only worker run 
> on the same node. 
>  
> Then I started the connect server by 
> {code:java}
> start-connect-server.sh \
>     --master spark://10.19.183.93:7077 \
>     --packages org.apache.spark:spark-connect_2.12:3.5.0 \
>     --conf spark.executor.cores=12 \
>     --conf spark.task.cpus=1 \
>     --executor-memory 30G \
>     --conf spark.executor.resource.gpu.amount=1 \
>     --conf spark.task.resource.gpu.amount=0.08 \
>     --driver-memory 1G{code}
>  
> From the web UI I can confirm that the Spark standalone cluster, the 
> connect server, and the Spark driver have all started.
>  
> Finally, I tried to run a very simple Spark job 
> (spark.range(100).filter("id>2").collect()) from the Spark Connect client 
> using pyspark, but I got the error below.
>  
> pyspark --remote sc://localhost
> Python 3.10.0 (default, Mar  3 2022, 09:58:08) [GCC 7.5.0] on linux
> Type "help", "copyright", "credits" or "license" for more information.
> Welcome to
>       ____              __
>      / __/__  ___ _____/ /__
>     _\ \/ _ \/ _ `/ __/  '_/
>    /__ / .__/\_,_/_/ /_/\_\   version 3.5.0
>       /_/
> 
> Using Python version 3.10.0 (default, Mar  3 2022 09:58:08)
> Client connected to the Spark Connect server at localhost
> SparkSession available as 'spark'.
> >>> spark.range(100).filter("id > 3").collect()
> Traceback (most recent call last):
>   File "<stdin>", line 1, in <module>
>   File "/home/xxx/github/mytools/spark.home/spark-3.5.0-bin-hadoop3/python/pyspark/sql/connect/dataframe.py", line 1645, in collect
>     table, schema = self._session.client.to_table(query)
>   File "/home/xxx/github/mytools/spark.home/spark-3.5.0-bin-hadoop3/python/pyspark/sql/connect/client/core.py", line 858, in to_table
>     table, schema, _, _, _ = self._execute_and_fetch(req)
>   File "/home/xxx/github/mytools/spark.home/spark-3.5.0-bin-hadoop3/python/pyspark/sql/connect/client/core.py", line 1282, in _execute_and_fetch
>     for response in self._execute_and_fetch_as_iterator(req):
>   File "/home/xxx/github/mytools/spark.home/spark-3.5.0-bin-hadoop3/python/pyspark/sql/connect/client/core.py", line 1263, in _execute_and_fetch_as_iterator
>     self._handle_error(error)
>   File "/home/xxx/github/mytools/spark.home/spark-3.5.0-bin-hadoop3/python/pyspark/sql/connect/client/core.py", line 1502, in _handle_error
>     self._handle_rpc_error(error)
>   File "/home/xxx/github/mytools/spark.home/spark-3.5.0-bin-hadoop3/python/pyspark/sql/connect/client/core.py", line 1538, in _handle_rpc_error
>     raise convert_exception(info, status.message) from None
> pyspark.errors.exceptions.connect.SparkConnectGrpcException: (org.apache.spark.SparkException) Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 (TID 35) (10.19.183.93 executor 0): java.lang.ClassCastException: cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.rdd.MapPartitionsRDD.f of type scala.Function3 in instance of org.apache.spark.rdd.MapPartitionsRDD
> at java.io.ObjectStreamClass$FieldReflector.setObjFieldValues(ObjectStreamClass.java:2301)
> at java.io.ObjectStreamClass.setObjFieldValues(ObjectStreamClass.java:1431)
> at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2437)
> at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
> at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
> at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2431)
> at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
> at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
> at java.io.ObjectInputStream.readObject(ObjectInputStream.java:503)
> at java.io.ObjectInputStream.readObject(ObjectInputStream.java:461)
> at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:87)
> at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:129)
> at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:86)
> at org.apache.spark.TaskContext.runTaskWithListeners(TaskContext.scala:161)
> at org.apache.spark.scheduler.Task.run(Task.scala:141)
> at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$4(Executor.scala:620)
> at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally(SparkErrorUtils.scala:64)
> at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally$(SparkErrorUtils.scala:61)
> at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:94)
> at org.apache.spark.executor.Executor$TaskRunner...
>  
>  
>  
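
For completeness, the same repro can be driven from a plain Python script rather than the interactive shell. A minimal sketch, assuming pyspark 3.5.0 with the Connect client dependencies installed and the server from the report reachable at sc://localhost:
{code:java}
# Minimal repro sketch against a running Spark Connect server.
from pyspark.sql import SparkSession

# Connect to the server started by start-connect-server.sh above.
spark = SparkSession.builder.remote("sc://localhost").getOrCreate()

# collect() forces execution on the executors; with the classpath mismatch
# described above, this is where the SerializedLambda error surfaces.
print(spark.range(100).filter("id > 3").collect())
{code}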


