@Alfie Davidson <alfie.davids...@gmail.com> : Awesome, it worked with
"org.elasticsearch.spark.sql".
But as soon as I switched to elasticsearch-spark-20_2.12, "es" also
worked.
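
For anyone who lands on this thread later, a minimal sketch of the write call
that worked on Spark 3 with the elasticsearch-spark-30_2.12 dependency from
the thread below (df and elasticOptions are the same placeholders as in the
earlier snippet; only the format name changed):

// Spark 3: use the fully qualified data source instead of the short "es" alias
df.write
  .format("org.elasticsearch.spark.sql")
  .mode("overwrite")
  .options(elasticOptions)
  .save("index_name")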


On Fri, Sep 8, 2023 at 12:45 PM Dipayan Dev <dev.dipaya...@gmail.com> wrote:

>
> Let me try that and get back. Just wondering, is there a change in the
> way we pass the format to the connector from Spark 2 to 3?
>
>
> On Fri, 8 Sep 2023 at 12:35 PM, Alfie Davidson <alfie.davids...@gmail.com>
> wrote:
>
>> I am pretty certain you need to change the write.format from “es” to
>> “org.elasticsearch.spark.sql”
>>
>> Sent from my iPhone
>>
>> On 8 Sep 2023, at 03:10, Dipayan Dev <dev.dipaya...@gmail.com> wrote:
>>
>> 
>>
>> ++ Dev
>>
>> On Thu, 7 Sep 2023 at 10:22 PM, Dipayan Dev <dev.dipaya...@gmail.com>
>> wrote:
>>
>>> Hi,
>>>
>>> Can you please elaborate on your last response? I don’t have any external
>>> dependencies added; I just updated the Spark version as mentioned below.
>>>
>>> Can someone help me with this?
>>>
>>> On Fri, 1 Sep 2023 at 5:58 PM, Koert Kuipers <ko...@tresata.com> wrote:
>>>
>>>> could the provided scope be the issue?
>>>>
>>>> On Sun, Aug 27, 2023 at 2:58 PM Dipayan Dev <dev.dipaya...@gmail.com>
>>>> wrote:
>>>>
>>>>> Using the following dependency for Spark 3 in the POM file (my Scala
>>>>> version is 2.12.14):
>>>>>
>>>>> <dependency>
>>>>>     <groupId>org.elasticsearch</groupId>
>>>>>     <artifactId>elasticsearch-spark-30_2.12</artifactId>
>>>>>     <version>7.12.0</version>
>>>>>     <scope>provided</scope>
>>>>> </dependency>
>>>>>
>>>>>
>>>>> The code throws an error at this line:
>>>>> df.write.format("es").mode("overwrite").options(elasticOptions).save("index_name")
>>>>> The same code works with Spark 2.4.0 and the following dependency:
>>>>>
>>>>> <dependency>
>>>>>     <groupId>org.elasticsearch</groupId>
>>>>>     <artifactId>elasticsearch-spark-20_2.12</artifactId>
>>>>>     <version>7.12.0</version>
>>>>> </dependency>
>>>>>
>>>>>
>>>>> On Mon, 28 Aug 2023 at 12:17 AM, Holden Karau <hol...@pigscanfly.ca>
>>>>> wrote:
>>>>>
>>>>>> What’s the version of the ES connector you are using?
>>>>>>
>>>>>> On Sat, Aug 26, 2023 at 10:17 AM Dipayan Dev <dev.dipaya...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>>> Hi All,
>>>>>>>
>>>>>>> We're using Spark 2.4.x to write a DataFrame into an Elasticsearch
>>>>>>> index.
>>>>>>> As we're upgrading to Spark 3.3.0, it throws the following error:
>>>>>>> Caused by: java.lang.ClassNotFoundException: es.DefaultSource
>>>>>>> at
>>>>>>> java.base/java.net.URLClassLoader.findClass(URLClassLoader.java:476)
>>>>>>> at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:589)
>>>>>>> at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:522)
>>>>>>>
>>>>>>> Looking at a few responses on Stack Overflow
>>>>>>> <https://stackoverflow.com/a/66452149>, it seems this is not yet
>>>>>>> supported by elasticsearch-hadoop.
>>>>>>>
>>>>>>> Does anyone have experience with this? Or faced/resolved this issue
>>>>>>> in Spark 3?
>>>>>>>
>>>>>>> Thanks in advance!
>>>>>>>
>>>>>>> Regards
>>>>>>> Dipayan
>>>>>>>
>>>>>> --
>>>>>> Twitter: https://twitter.com/holdenkarau
>>>>>> Books (Learning Spark, High Performance Spark, etc.):
>>>>>> https://amzn.to/2MaRAG9
>>>>>> YouTube Live Streams: https://www.youtube.com/user/holdenkarau
>>>>>>
>>>>>
>>>
>>>
