Hi all, I encountered an issue when reading from Kafka as a stream and then sinking into HDFS (using Delta Lake format).
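For context, the job is launched roughly as follows. This is only a sketch: the Kafka package coordinate matches the one mentioned below, but the Delta Lake coordinate, application file name, and configuration lines are illustrative assumptions.

```shell
# Launch sketch, assuming Spark 3.3.0 / Scala 2.12; names and paths are illustrative.
# --packages pulls the Kafka source and (assumed) Delta Lake dependency at runtime.
spark-submit \
  --packages org.apache.spark:spark-sql-kafka-0-10_2.12:3.3.0,io.delta:delta-core_2.12:2.1.0 \
  --conf spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension \
  --conf spark.sql.catalog.spark_catalog=org.apache.spark.sql.delta.catalog.DeltaCatalog \
  my_streaming_job.py
```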
java.lang.NoSuchMethodError: org.apache.spark.sql.kafka010.consumer.InternalKafkaConsumerPool$PoolConfig.setMinEvictableIdleTime(Ljava/time/Duration;)V

I looked into the details and found that this occurs because Spark's built-in jars include version 1.5.4 (commons-pool-1.5.4.jar), while the package org.apache.spark:spark-sql-kafka-0-10_2.12:3.3.0 relies on commons-pool2 version 2.11.1, and the two versions are not compatible. As a workaround, for now I have placed the newer version on my Spark class path, as the following page documents: java.lang.NoSuchMethodError: PoolConfig.setMinEvictableIdleTime<https://kontext.tech/article/1178/javalangnosuchmethoderror-poolconfigsetminevictableidletime>

My questions:

* Can we bump up the version in future Spark releases to avoid issues like this?
* Will this workaround cause any side effects, based on your knowledge?

I'm a frequent user of Spark, but I don't have much detailed knowledge of Spark's underlying code (I only look into it when I need to debug a complex problem).

Thanks and Regards,
Raymond