Hi Darshan,
Did you try passing the config directly as an option, like this:
.option("kafka.sasl.jaas.config", saslConfig)
Where saslConfig can look like:
com.sun.security.auth.module.Krb5LoginModule required \
useKeyTab=true \
storeKey=true \
keyTab="/etc/security/key
Hi Burak,
Well, it turns out it works fine when I submit in cluster mode. I also tried
converting my app to DStreams. With DStreams, too, it works only when
deployed in cluster mode.
Here is how I configured the stream:
val lines = spark.readStream
.format("kafka")
.option("kafka.bootstrap.se
Hi Darshan,
How are you creating your Kafka stream? Can you please share the options
you provide?
spark.readStream.format("kafka")
.option(...) // all these please
.load()
On Sat, Oct 14, 2017 at 1:55 AM, Darshan Pandya wrote:
> Hello,
>
> I'm using Spark 2.1.0 on CDH 5.8 with kafka 0.10.
Hello,
I'm using Spark 2.1.0 on CDH 5.8 with Kafka 0.10.0.1 + Kerberos.
I am unable to connect to the Kafka broker; it fails with the following message:
17/10/14 14:29:10 WARN clients.NetworkClient: Bootstrap broker
10.197.19.25:9092 disconnected
and it is unable to consume any messages.
And I am using it as