Please see my earlier reply: Spark 3.1.1 was tested and worked in a Google
Dataproc environment.
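If you want the jars baked into the image, one option is to pull them straight
from Maven Central during the Docker build. A minimal sketch follows; the
versions must match your own Spark (3.2.1) and Scala (2.12) build, and the
extra artifacts listed (spark-token-provider-kafka-0-10, kafka-clients,
commons-pool2) are my assumptions about the connector's dependencies, worth
verifying against its POM on Maven Central:

```shell
#!/bin/sh
# Sketch: compose the Maven Central URLs for the Kafka integration jars so
# they can be fetched during the Docker image build. Versions are
# assumptions - align them with your own Spark and Scala build.
MAVEN=https://repo1.maven.org/maven2

# jar_url <group-path> <artifactId> <version> -> prints the jar's URL,
# following the standard Maven repository layout.
jar_url() {
  printf '%s/%s/%s/%s/%s-%s.jar\n' "$MAVEN" "$1" "$2" "$3" "$2" "$3"
}

SPARK_VERSION=3.2.1
SCALA_VERSION=2.12

jar_url org/apache/spark "spark-sql-kafka-0-10_${SCALA_VERSION}" "$SPARK_VERSION"
jar_url org/apache/spark "spark-token-provider-kafka-0-10_${SCALA_VERSION}" "$SPARK_VERSION"
jar_url org/apache/kafka kafka-clients 2.8.0     # assumed kafka-clients line for Spark 3.2.x
jar_url org/apache/commons commons-pool2 2.6.2   # used for pooled Kafka consumers

# In the Dockerfile, feed each URL to curl so the jars land in the image,
# for example:
#   RUN curl -fSL "<url>" -o /opt/spark/jars/<artifact>-<version>.jar
```

Dropping the jars into Spark's own jars directory means no extra
--jars/--packages flags are needed at submit time.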

Also, this article of mine may be useful:

Processing Change Data Capture with Spark Structured Streaming
<https://www.linkedin.com/pulse/processing-change-data-capture-spark-structured-talebzadeh-ph-d-/>

HTH



   View my LinkedIn profile
<https://www.linkedin.com/in/mich-talebzadeh-ph-d-5205b2/>


 https://en.everybodywiki.com/Mich_Talebzadeh



*Disclaimer:* Use it at your own risk. Any and all responsibility for any
loss, damage or destruction of data or any other property which may arise
from relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from
such loss, damage or destruction.




On Fri, 25 Feb 2022 at 20:30, Michael Williams (SSI) <
michael.willi...@ssigroup.com> wrote:

> The use case is Spark Structured Streaming: a Spark app will be launched
> by a worker service that monitors the Kafka topic for new messages, and
> once the messages are consumed, the Spark app will terminate. One hitch
> is that the Spark environment includes the MS .NET for Spark wrapper,
> which means each Spark app will consume from one Kafka topic and will be
> written in C#.  If possible, I’d really like to be able to manually
> download the necessary jars and do the Kafka client installation as part
> of the Docker image build, so that the dependencies already exist on
> disk.  If that makes any sense.
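For what it's worth, once the jars are staged in the image, the per-topic
launch described above could look roughly like this. DotnetRunner is the
documented spark-submit entry point for .NET for Spark apps, but the jar
names, paths, topic, and app name here are placeholders, and the command is
printed rather than executed so the sketch can be inspected without a Spark
installation:

```shell
#!/bin/sh
# Sketch: launch one .NET for Spark app per Kafka topic, with the Kafka
# connector jar already on disk in the image. Paths, versions, and the app
# name are placeholders - substitute your own.
SPARK_JARS_DIR=/opt/spark/jars
TOPIC=my-topic   # placeholder topic name

# Dry run: the command is built into a variable and echoed; eval "$CMD"
# (or drop the variable entirely) would actually run it.
CMD="spark-submit --class org.apache.spark.deploy.dotnet.DotnetRunner --master local[*] --jars ${SPARK_JARS_DIR}/spark-sql-kafka-0-10_2.12-3.2.1.jar microsoft-spark-3-2_2.12-2.1.1.jar ./MyKafkaConsumerApp ${TOPIC}"
echo "$CMD"
```

Since each app exits after draining its topic, the worker service only has to
re-run this command whenever new messages appear.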
>
>
>
> Thank you
>
>
>
> *From:* Mich Talebzadeh [mailto:mich.talebza...@gmail.com]
> *Sent:* Friday, February 25, 2022 2:16 PM
> *To:* Michael Williams (SSI) <michael.willi...@ssigroup.com>
> *Cc:* user@spark.apache.org
> *Subject:* Re: Spark Kafka Integration
>
>
>
> What is the use case? Is this for Spark Structured Streaming?
>
>
>
> HTH
>
>
>
>
>
>
>
>
>
>
>
>
> On Fri, 25 Feb 2022 at 19:38, Michael Williams (SSI) <
> michael.willi...@ssigroup.com> wrote:
>
> After reviewing Spark's Kafka Integration guide, it indicates that
> spark-sql-kafka-0-10_2.12-3.2.1.jar and its dependencies are needed for
> Spark 3.2.1 (+ Scala 2.12) to work with Kafka.  Can anybody clarify the
> cleanest, most repeatable (reliable) way to acquire these jars for
> inclusion in a Spark Docker image without introducing version
> compatibility issues?
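One alternative to hand-placing jars: spark-submit can resolve the connector
and its transitive dependencies from Maven at launch time via --packages. A
sketch, with the coordinate assumed for Spark 3.2.1 / Scala 2.12 (note this
needs network access at startup, unlike baking the jars into the image):

```shell
#!/bin/sh
# Sketch: build the Maven coordinate for the Kafka connector and hand it to
# spark-submit's --packages flag, which resolves transitive dependencies
# through Ivy at launch time.
SPARK_VERSION=3.2.1
SCALA_VERSION=2.12
PKG="org.apache.spark:spark-sql-kafka-0-10_${SCALA_VERSION}:${SPARK_VERSION}"
echo "$PKG"
# Dropped into a launch script:
#   spark-submit --packages "$PKG" <your app> ...
```

Because the version is derived from the Spark/Scala versions, keeping those
in one place avoids the compatibility drift the question is worried about.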
>
>
>
> Thank you,
>
> Mike
>
>
>
> This electronic message may contain information that is Proprietary,
> Confidential, or legally privileged or protected. It is intended only for
> the use of the individual(s) and entity named in the message. If you are
> not an intended recipient of this message, please notify the sender
> immediately and delete the material from your computer. Do not deliver,
> distribute or copy this message and do not disclose its contents or take
> any action in reliance on the information it contains. Thank You.
>
>
>
