Thanks,
>Pradeep
>
>> On May 19, 2016, at 12:17 AM, Ramaswamy, Muthuraman
>> <muthuraman.ramasw...@viasat.com> wrote:
>>
>> I am using Spark 1.6.1 and Kafka 0.9+. It works in both receiver and
>> receiver-less modes.
>>
>> One thing I notic
From: Mail.com [pradeep.mi...@mail.com]
Sent: Monday, May 16, 2016 9:33 PM
To: Ramaswamy, Muthuraman
Cc: Cody Koeninger; spark users
Subject: Re: KafkaUtils.createDirectStream Not Fetching Messages with Confluent Serializers as Value Decoder.
Hi Muthu,
Are you on Spark 1.4.1 and Kafka 0.8.2? I have
From: "j...@insidin.com" <j...@insidin.com>
Date: Tuesday, May 17, 2016 at 3:18 AM
To: "Ramaswamy, Muthuraman"
<muthuraman.ramasw...@viasat.com>
Cc: spark users <
I am trying to consume Avro-formatted messages through
KafkaUtils.createDirectStream. I followed the example below (see link),
but the messages are not being fetched by the stream.
http://stackoverflow.com/questions/30339636/spark-python-avro-kafka-deserialiser
Is there any code missing
~Muthu
On 5/16/16, 10:49 AM, "Cody Koeninger" <c...@koeninger.org> wrote:
>Have you checked to make sure you can receive messages just using a
>byte array for value?
>
>On Mon, May 16, 2016 at 12:33 PM, Ramaswamy, Muthuraman
><muthuraman.ramasw...@viasat.com>
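One way to act on Cody's suggestion is to first consume the raw bytes, then strip the 5-byte Confluent wire-format header (a 0x00 magic byte followed by a 4-byte big-endian schema id) before Avro-decoding the body. A minimal sketch of the header handling; the commented-out Spark wiring and all names in it (topic, broker, `ssc`) are placeholder assumptions:

```python
import struct

def split_confluent_frame(payload):
    """Split a Confluent-serialized Kafka value into (schema_id, avro_bytes).

    Confluent's wire format is: one magic byte (0x00), a 4-byte
    big-endian schema id, then the Avro-encoded body.
    """
    if payload is None or len(payload) < 5 or payload[0:1] != b"\x00":
        raise ValueError("not a Confluent-framed message")
    schema_id = struct.unpack(">I", payload[1:5])[0]
    return schema_id, payload[5:]

# Hypothetical wiring (needs pyspark and a running Kafka; identifiers
# below are placeholders, not from the original thread):
# stream = KafkaUtils.createDirectStream(
#     ssc, ["my_topic"], {"metadata.broker.list": "broker:9092"},
#     valueDecoder=lambda v: split_confluent_frame(v)[1])
```

If raw byte-array values arrive but the decoded stream is empty, the decoder (rather than the Kafka connectivity) is the likely culprit.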
I would like to develop Custom Source and Sink. So, I have a couple of
questions:
1. Do I have to use Scala or Java to develop these custom sources/sinks?
2. Also, once the source/sink has been developed, to use it in PySpark/Python,
do I have to develop any Py4J modules? Any pointers or
Hi All,
I am exploring PySpark Structured Streaming and the documentation says the
Foreach Sink is not supported in Python and is available only with Java/Scala.
Given the unavailability of this sink, what options are there for the following:
1. Will there be support for Foreach Sink in
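For readers hitting the same limitation: later Spark releases (2.4+) added both `foreach` and `foreachBatch` to PySpark's `writeStream`, and `foreachBatch` is often the simpler workaround because it hands the callback an ordinary batch DataFrame. A minimal sketch; the writer target in the comment and the wiring names are placeholders:

```python
# Sketch of a foreachBatch callback (available in PySpark from Spark 2.4).
# The callback receives a plain batch DataFrame, so any batch writer works.
processed = []

def process_batch(batch_df, epoch_id):
    # In a real job, something like:
    #   batch_df.write.format("parquet").mode("append").save("/tmp/out")
    # Here we only record what was passed in, for illustration.
    processed.append((epoch_id, batch_df))

# Hypothetical wiring (needs a streaming DataFrame `df`):
# query = df.writeStream.foreachBatch(process_batch).start()
```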
All,
Currently, I am using PySpark Streaming (classic DStream style, not
Structured Streaming). Now, our remote Kafka is secured with Kerberos.
To enable PySpark Streaming to access the secured Kafka, what steps should I
perform? Can I pass the principal/keytab and jaas config in
Hi All,
I would like to use PySpark Streaming with secured Kafka as the source stream.
What options or arguments should I pass in the spark-submit command?
A sample spark-submit command with all the required options/arguments to access
a remote-secured Kafka will help.
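One commonly used shape for this, sketched with placeholder file names, paths, and principal (verify the exact flags against your Spark and Kafka versions): ship a JAAS file and keytab to the driver and executors with `--files`, then point each JVM at the JAAS config via a system property.

```shell
# jaas.conf (placeholder principal and keytab path):
# KafkaClient {
#   com.sun.security.auth.module.Krb5LoginModule required
#   useKeyTab=true
#   keyTab="./user.keytab"
#   principal="user@EXAMPLE.COM";
# };

spark-submit \
  --files jaas.conf,user.keytab \
  --driver-java-options "-Djava.security.auth.login.config=./jaas.conf" \
  --conf "spark.executor.extraJavaOptions=-Djava.security.auth.login.config=./jaas.conf" \
  your_streaming_job.py
```

Because `--files` stages the keytab in each executor's working directory, the relative `./user.keytab` path in jaas.conf resolves on every node.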
Thank you,
~Muthu R
Hi All,
I am using PySpark Direct Streaming to connect to a remote Kafka broker
that is secured with Kerberos authentication. The KafkaUtils.createDirectStream
python call gives me the following error:
18/11/27 18:20:05 WARN VerifiableProperties: Property sasl.mechanism is not
valid
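A likely cause of that warning: the Kafka 0.8-era client behind the DStream `KafkaUtils.createDirectStream` predates SASL support, so its `VerifiableProperties` rejects `sasl.mechanism` as an unknown property. The Structured Streaming Kafka source, by contrast, forwards `kafka.`-prefixed options to the modern consumer. A hedged sketch; the broker host, topic, and mechanism below are placeholder assumptions to adapt to your cluster:

```python
# Option set for the Structured Streaming Kafka source; "kafka."-prefixed
# keys are passed straight through to the underlying Kafka consumer.
kafka_options = {
    "kafka.bootstrap.servers": "broker.example.com:9093",  # placeholder host
    "subscribe": "my_topic",                               # placeholder topic
    "kafka.security.protocol": "SASL_PLAINTEXT",
    "kafka.sasl.mechanism": "GSSAPI",                      # Kerberos
    "kafka.sasl.kerberos.service.name": "kafka",
}

# Hypothetical wiring (needs a SparkSession `spark`):
# df = spark.readStream.format("kafka").options(**kafka_options).load()
```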