I am using the following dependency for Spark 3 in the POM file (my Scala version is 2.12.14):

    <dependency>
        <groupId>org.elasticsearch</groupId>
        <artifactId>elasticsearch-spark-30_2.12</artifactId>
        <version>7.12.0</version>
        <scope>provided</scope>
    </dependency>

The code throws an error at this line:

    df.write.format("es").mode("overwrite").options(elasticOptions).save("index_name")

The same code works with Spark 2.4.0 and the following dependency:

    <dependency>
        <groupId>org.elasticsearch</groupId>
        <artifactId>elasticsearch-spark-20_2.12</artifactId>
        <version>7.12.0</version>
    </dependency>

On Mon, 28 Aug 2023 at 12:17 AM, Holden Karau <hol...@pigscanfly.ca> wrote:

> What's the version of the ES connector you are using?
>
> On Sat, Aug 26, 2023 at 10:17 AM Dipayan Dev <dev.dipaya...@gmail.com>
> wrote:
>
>> Hi All,
>>
>> We're using Spark 2.4.x to write a dataframe into an Elasticsearch index.
>> As we're upgrading to Spark 3.3.0, it throws this error:
>>
>> Caused by: java.lang.ClassNotFoundException: es.DefaultSource
>>     at java.base/java.net.URLClassLoader.findClass(URLClassLoader.java:476)
>>     at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:589)
>>     at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:522)
>>
>> Looking at a few responses on Stack Overflow
>> <https://stackoverflow.com/a/66452149>, it seems this is not yet
>> supported by elasticsearch-hadoop.
>>
>> Does anyone have experience with this? Or has anyone faced/resolved this
>> issue in Spark 3?
>>
>> Thanks in advance!
>>
>> Regards,
>> Dipayan
>
> --
> Twitter: https://twitter.com/holdenkarau
> Books (Learning Spark, High Performance Spark, etc.):
> https://amzn.to/2MaRAG9
> YouTube Live Streams: https://www.youtube.com/user/holdenkarau
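
A hedged note on the stack trace above: `ClassNotFoundException: es.DefaultSource` usually means Spark could not find the connector on the runtime classpath at all, so it treated `"es"` as a package name and looked for a class `es.DefaultSource`. Since the Spark 3 dependency above is declared with `<scope>provided</scope>`, the jar is excluded from the application jar and must be supplied at submit time. A minimal sketch of doing that with `--packages` (the class name `com.example.EsWriteJob` and the jar path are placeholders, not taken from the thread):

```shell
# Sketch: pull the Spark 3 ES connector at submit time so it is on the
# runtime classpath, instead of relying on the provided-scope POM entry.
# The main class and application jar below are illustrative placeholders.
spark-submit \
  --packages org.elasticsearch:elasticsearch-spark-30_2.12:7.12.0 \
  --class com.example.EsWriteJob \
  target/my-job.jar
```

Alternatively, the connector can be addressed by its fully qualified source name, `df.write.format("org.elasticsearch.spark.sql")`, which elasticsearch-hadoop registers alongside the `es` alias; if that also fails with a ClassNotFoundException, the jar is genuinely absent from the classpath.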