No, I haven't enabled Kerberos. I am just making the calls as specified in the
Stack Overflow thread on how to use the Schema Registry based serializer.
~Muthu
On 5/19/16, 5:25 PM, "Mail.com" wrote:
Hi Muthu,
Do you have Kerberos enabled?
Thanks,
Pradeep
On May 19, 2016, at 12:17 AM, Ramaswamy, Muthuraman wrote:
I am using Spark 1.6.1 and Kafka 0.9+. It works for both receiver and
receiver-less modes.
One thing I noticed: when you specify an invalid topic name, KafkaUtils doesn't
fetch any messages, so check that you have specified the topic name correctly.
~Muthu
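Given the silent-empty-stream behaviour described above, a pre-flight check on the topic list is worthwhile. A minimal sketch in plain Python, no Spark required; the list of available topics is assumed to come from the broker (e.g. via the kafka-topics.sh --list tool):

```python
def missing_topics(requested, available):
    """Return requested topic names the broker does not report.

    As noted above, KafkaUtils silently fetches nothing for a
    misspelled topic, so comparing the names up front catches the
    typo before the streaming job starts.
    """
    return sorted(set(requested) - set(available))

# A misspelled topic ("my_topic" vs "my-topic") shows up immediately:
print(missing_topics(["my_topic"], ["my-topic", "other-topic"]))
# -> ['my_topic']
```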
Adding back users.
> On May 18, 2016, at 11:49 AM, Mail.com wrote:
>
> Hi Uladzimir,
>
> I run it as below.
>
> spark-submit --class com.test --num-executors 4 --executor-cores 5 --queue
> Dev --master yarn-client --driver-memory 512M --executor-memory 512M test.jar
Thank you for the input.
Apparently, I was referring to an incorrect Schema Registry server. Once the
correct Schema Registry server IP was used, the serializer worked for me.
Thanks again,
~Muthu
From: Jan Uyttenhove
I think that if the Confluent deserializer cannot fetch the schema for the
Avro message (key and/or value), you end up with no data. You should check
the logs of the Schema Registry; it shows the HTTP requests it receives, so
you can check whether the deserializer can connect to it and, if so, what it
is requesting.
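The connectivity check suggested above can be done against the Schema Registry's REST API: GET /subjects lists the registered subjects and GET /schemas/ids/&lt;id&gt; returns a schema by its numeric id. A sketch that only builds the URLs (the host and port are placeholders, not taken from this thread — substitute your own registry address):

```python
def subjects_url(base):
    # A GET on this endpoint lists every subject the registry knows about.
    return base + "/subjects"

def schema_url(base, schema_id):
    # A GET on this endpoint returns the schema registered under a numeric id.
    return "%s/schemas/ids/%d" % (base, schema_id)

# Placeholder address -- point this at your own Schema Registry:
registry = "http://schema-registry.example.com:8081"
print(subjects_url(registry))   # probe this with curl or urllib
print(schema_url(registry, 1))
```

If a curl against the /subjects URL fails from the hosts running the Spark executors, the deserializer will fail in the same way, which matches the empty-stream symptom in this thread.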
Hi Muthu,
Are you on Spark 1.4.1 and Kafka 0.8.2? I have a similar issue even for simple
string messages.
The console producer and consumer work fine, but Spark always returns an empty
RDD. I am using the receiver-based approach.
Thanks,
Pradeep
On May 16, 2016, at 8:19 PM, Ramaswamy, Muthuraman wrote:
Yes, I can see the messages. Also, I wrote a quick custom decoder for Avro and
it works fine for the following:
kvs = KafkaUtils.createDirectStream(ssc, [topic], {"metadata.broker.list":
brokers}, valueDecoder=decoder)
But when I use the Confluent serializers to leverage the Schema Registry, the
messages are not fetched.
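One likely reason a plain Avro decoder works while the Confluent path does not: the Confluent serializer does not write bare Avro. It prefixes each message with one magic byte (0) and a 4-byte big-endian schema id pointing into the Schema Registry. A minimal sketch of stripping that framing before handing the body to an Avro reader (the Avro decode itself is left out, since it needs the schema fetched for that id):

```python
import struct

MAGIC_BYTE = 0  # Confluent wire format: 1 magic byte + 4-byte schema id

def strip_confluent_framing(payload):
    """Split a Confluent-framed message into (schema_id, avro_bytes).

    The returned avro_bytes still need a real Avro decode against the
    schema fetched from the registry for schema_id.
    """
    if payload is None or len(payload) < 5:
        raise ValueError("message too short for Confluent wire format")
    magic, schema_id = struct.unpack(">bI", payload[:5])
    if magic != MAGIC_BYTE:
        raise ValueError("unexpected magic byte: %r" % magic)
    return schema_id, payload[5:]

# Round-trip a fake message framed under schema id 42:
framed = struct.pack(">bI", MAGIC_BYTE, 42) + b"\x06foo"
print(strip_confluent_framing(framed))  # -> (42, b'\x06foo')
```

A function like this, wrapped with the actual Avro decode, can be passed as valueDecoder to KafkaUtils.createDirectStream, along the lines of the custom decoder mentioned above.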
Have you checked to make sure you can receive messages just using a
byte array for value?
On Mon, May 16, 2016 at 12:33 PM, Ramaswamy, Muthuraman wrote:
I am trying to consume AVRO formatted messages through
KafkaUtils.createDirectStream. I followed the example listed below (refer to
the link), but the messages are not being fetched by the stream.
http://stackoverflow.com/questions/30339636/spark-python-avro-kafka-deserialiser
Is there any code missing?