Re: KafkaUtils.createDirectStream Not Fetching Messages with Confluent Serializers as Value Decoder.

2016-05-20 Thread Ramaswamy, Muthuraman
No, I haven’t enabled Kerberos. I am just making the calls as described in the 
Stack Overflow thread on how to use the schema-registry-based serializer.

~Muthu




On 5/19/16, 5:25 PM, "Mail.com"  wrote:

>Hi Muthu,
>
>Do you have Kerberos enabled?
>
>Thanks,
>Pradeep
>
>> On May 19, 2016, at 12:17 AM, Ramaswamy, Muthuraman 
>>  wrote:
>> 
>> I am using Spark 1.6.1 and Kafka 0.9+. It works for both receiver and 
>> receiver-less mode.
>> 
>> One thing I noticed: when you specify an invalid topic name, KafkaUtils doesn't 
>> fetch any messages. So, check that you have specified the topic name correctly.
>> 
>> ~Muthu
>> 
>> From: Mail.com [pradeep.mi...@mail.com]
>> Sent: Monday, May 16, 2016 9:33 PM
>> To: Ramaswamy, Muthuraman
>> Cc: Cody Koeninger; spark users
>> Subject: Re: KafkaUtils.createDirectStream Not Fetching Messages with 
>> Confluent Serializers as Value Decoder.
>> 
>> Hi Muthu,
>> 
>> Are you on spark 1.4.1 and Kafka 0.8.2? I have a similar issue even for 
>> simple string messages.
>> 
>> Console producer and consumer work fine, but Spark always returns an empty RDD. 
>> I am using the receiver-based approach.
>> 
>> Thanks,
>> Pradeep
>> 
>>> On May 16, 2016, at 8:19 PM, Ramaswamy, Muthuraman 
>>>  wrote:
>>> 
>>> Yes, I can see the messages. Also, I wrote a quick custom decoder for avro 
>>> and it works fine for the following:
>>> 
> kvs = KafkaUtils.createDirectStream(ssc, [topic], 
> {"metadata.broker.list": brokers}, valueDecoder=decoder)
>>> 
>>> But, when I use the Confluent serializers to leverage the Schema Registry 
>>> (based on the link shown below), it doesn’t work for me. I am not sure 
>>> whether I need to configure any more details to consume the Schema 
>>> Registry. I can fetch the schema from the registry based on its IDs, but 
>>> the decoder method is not returning any values for me.
>>> 
>>> ~Muthu
>>> 
>>> 
>>> 
 On 5/16/16, 10:49 AM, "Cody Koeninger"  wrote:
 
 Have you checked to make sure you can receive messages just using a
 byte array for value?
 
 On Mon, May 16, 2016 at 12:33 PM, Ramaswamy, Muthuraman
  wrote:
> I am trying to consume AVRO formatted message through
> KafkaUtils.createDirectStream. I followed the listed below example (refer
> link) but the messages are not being fetched by the Stream.
> 
> http://stackoverflow.com/questions/30339636/spark-python-avro-kafka-deserialiser
> 
> Is there any code missing that I must add to make the above sample work? 
> Say, I am not sure how the Confluent serializers would know the Avro schema 
> info, as they know only the Schema Registry URL.
> 
> Appreciate your help.
> 
> ~Muthu
>> 
>> -
>> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>> For additional commands, e-mail: user-h...@spark.apache.org
>> 


Re: KafkaUtils.createDirectStream Not Fetching Messages with Confluent Serializers as Value Decoder.

2016-05-19 Thread Mail.com
Hi Muthu,

Do you have Kerberos enabled?

Thanks,
Pradeep

> On May 19, 2016, at 12:17 AM, Ramaswamy, Muthuraman 
>  wrote:
> 
> I am using Spark 1.6.1 and Kafka 0.9+. It works for both receiver and 
> receiver-less mode.
> 
> One thing I noticed: when you specify an invalid topic name, KafkaUtils doesn't 
> fetch any messages. So, check that you have specified the topic name correctly.
> 
> ~Muthu
> 
> From: Mail.com [pradeep.mi...@mail.com]
> Sent: Monday, May 16, 2016 9:33 PM
> To: Ramaswamy, Muthuraman
> Cc: Cody Koeninger; spark users
> Subject: Re: KafkaUtils.createDirectStream Not Fetching Messages with 
> Confluent Serializers as Value Decoder.
> 
> Hi Muthu,
> 
> Are you on spark 1.4.1 and Kafka 0.8.2? I have a similar issue even for 
> simple string messages.
> 
> Console producer and consumer work fine, but Spark always returns an empty RDD. 
> I am using the receiver-based approach.
> 
> Thanks,
> Pradeep
> 
>> On May 16, 2016, at 8:19 PM, Ramaswamy, Muthuraman 
>>  wrote:
>> 
>> Yes, I can see the messages. Also, I wrote a quick custom decoder for avro 
>> and it works fine for the following:
>> 
 kvs = KafkaUtils.createDirectStream(ssc, [topic], {"metadata.broker.list": 
 brokers}, valueDecoder=decoder)
>> 
>> But, when I use the Confluent serializers to leverage the Schema Registry 
>> (based on the link shown below), it doesn’t work for me. I am not sure 
>> whether I need to configure any more details to consume the Schema Registry. 
>> I can fetch the schema from the registry based on its IDs, but the decoder 
>> method is not returning any values for me.
>> 
>> ~Muthu
>> 
>> 
>> 
>>> On 5/16/16, 10:49 AM, "Cody Koeninger"  wrote:
>>> 
>>> Have you checked to make sure you can receive messages just using a
>>> byte array for value?
>>> 
>>> On Mon, May 16, 2016 at 12:33 PM, Ramaswamy, Muthuraman
>>>  wrote:
 I am trying to consume AVRO formatted message through
 KafkaUtils.createDirectStream. I followed the listed below example (refer
 link) but the messages are not being fetched by the Stream.
 
 http://stackoverflow.com/questions/30339636/spark-python-avro-kafka-deserialiser
 
 Is there any code missing that I must add to make the above sample work? 
 Say, I am not sure how the Confluent serializers would know the Avro schema 
 info, as they know only the Schema Registry URL.
 
 Appreciate your help.
 
 ~Muthu
> 
> -
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
> 

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



RE: KafkaUtils.createDirectStream Not Fetching Messages with Confluent Serializers as Value Decoder.

2016-05-18 Thread Ramaswamy, Muthuraman
I am using Spark 1.6.1 and Kafka 0.9+. It works for both receiver and 
receiver-less mode.

One thing I noticed: when you specify an invalid topic name, KafkaUtils doesn't 
fetch any messages. So, check that you have specified the topic name correctly.

~Muthu

From: Mail.com [pradeep.mi...@mail.com]
Sent: Monday, May 16, 2016 9:33 PM
To: Ramaswamy, Muthuraman
Cc: Cody Koeninger; spark users
Subject: Re: KafkaUtils.createDirectStream Not Fetching Messages with Confluent 
Serializers as Value Decoder.

Hi Muthu,

Are you on spark 1.4.1 and Kafka 0.8.2? I have a similar issue even for simple 
string messages.

Console producer and consumer work fine, but Spark always returns an empty RDD. 
I am using the receiver-based approach.

Thanks,
Pradeep

> On May 16, 2016, at 8:19 PM, Ramaswamy, Muthuraman 
>  wrote:
>
> Yes, I can see the messages. Also, I wrote a quick custom decoder for avro 
> and it works fine for the following:
>
>>> kvs = KafkaUtils.createDirectStream(ssc, [topic], {"metadata.broker.list": 
>>> brokers}, valueDecoder=decoder)
>
> But, when I use the Confluent serializers to leverage the Schema Registry 
> (based on the link shown below), it doesn’t work for me. I am not sure 
> whether I need to configure any more details to consume the Schema Registry. 
> I can fetch the schema from the registry based on its IDs, but the decoder 
> method is not returning any values for me.
>
> ~Muthu
>
>
>
>> On 5/16/16, 10:49 AM, "Cody Koeninger"  wrote:
>>
>> Have you checked to make sure you can receive messages just using a
>> byte array for value?
>>
>> On Mon, May 16, 2016 at 12:33 PM, Ramaswamy, Muthuraman
>>  wrote:
>>> I am trying to consume AVRO formatted message through
>>> KafkaUtils.createDirectStream. I followed the listed below example (refer
>>> link) but the messages are not being fetched by the Stream.
>>>
>>> http://stackoverflow.com/questions/30339636/spark-python-avro-kafka-deserialiser
>>>
>>> Is there any code missing that I must add to make the above sample work? 
>>> Say, I am not sure how the Confluent serializers would know the Avro schema 
>>> info, as they know only the Schema Registry URL.
>>>
>>> Appreciate your help.
>>>
>>> ~Muthu

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: KafkaUtils.createDirectStream Not Fetching Messages with Confluent Serializers as Value Decoder.

2016-05-18 Thread Mail.com
Adding back users.



> On May 18, 2016, at 11:49 AM, Mail.com  wrote:
> 
> Hi Uladzimir,
> 
> I run it as below.
> 
> Spark-submit --class com.test --num-executors 4 --executor-cores 5 --queue 
> Dev --master yarn-client --driver-memory 512M --executor-memory 512M test.jar
> 
> Thanks,
> Pradeep
> 
> 
>> On May 18, 2016, at 5:45 AM, Vova Shelgunov  wrote:
>> 
>> Hi Pradeep,
>> 
>> How do you run your spark application? What is spark master? How many cores 
>> do you allocate?
>> 
>> Regards,
>> Uladzimir
>> 
>>> On May 17, 2016 7:33 AM, "Mail.com"  wrote:
>>> Hi Muthu,
>>> 
>>> Are you on spark 1.4.1 and Kafka 0.8.2? I have a similar issue even for 
>>> simple string messages.
>>> 
>>> Console producer and consumer work fine, but Spark always returns an empty RDD. 
>>> I am using the receiver-based approach.
>>> 
>>> Thanks,
>>> Pradeep
>>> 
>>> > On May 16, 2016, at 8:19 PM, Ramaswamy, Muthuraman 
>>> >  wrote:
>>> >
>>> > Yes, I can see the messages. Also, I wrote a quick custom decoder for 
>>> > avro and it works fine for the following:
>>> >
>>> >>> kvs = KafkaUtils.createDirectStream(ssc, [topic], 
>>> >>> {"metadata.broker.list": brokers}, valueDecoder=decoder)
>>> >
>>> > But, when I use the Confluent serializers to leverage the Schema Registry 
>>> > (based on the link shown below), it doesn’t work for me. I am not sure 
>>> > whether I need to configure any more details to consume the Schema 
>>> > Registry. I can fetch the schema from the registry based on its IDs, but 
>>> > the decoder method is not returning any values for me.
>>> >
>>> > ~Muthu
>>> >
>>> >
>>> >
>>> >> On 5/16/16, 10:49 AM, "Cody Koeninger"  wrote:
>>> >>
>>> >> Have you checked to make sure you can receive messages just using a
>>> >> byte array for value?
>>> >>
>>> >> On Mon, May 16, 2016 at 12:33 PM, Ramaswamy, Muthuraman
>>> >>  wrote:
>>> >>> I am trying to consume AVRO formatted message through
>>> >>> KafkaUtils.createDirectStream. I followed the listed below example 
>>> >>> (refer
>>> >>> link) but the messages are not being fetched by the Stream.
>>> >>>
>>> >>> http://stackoverflow.com/questions/30339636/spark-python-avro-kafka-deserialiser
>>> >>>
>>> >>> Is there any code missing that I must add to make the above sample work? 
>>> >>> Say, I am not sure how the Confluent serializers would know the Avro 
>>> >>> schema info, as they know only the Schema Registry URL.
>>> >>>
>>> >>> Appreciate your help.
>>> >>>
>>> >>> ~Muthu
>>> 
>>> -
>>> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>>> For additional commands, e-mail: user-h...@spark.apache.org


Re: KafkaUtils.createDirectStream Not Fetching Messages with Confluent Serializers as Value Decoder.

2016-05-17 Thread Ramaswamy, Muthuraman
Thank you for the input.

Apparently, I was pointing to the wrong Schema Registry server. Once the 
correct Schema Registry server IP was used, the serializer worked for me.

Thanks again,

~Muthu
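
[Editorial note: for anyone hitting the same wall, a quick way to confirm you are pointing at the right Schema Registry is to fetch a schema by ID over its REST API (the Confluent Schema Registry exposes `GET /schemas/ids/{id}`). A minimal sketch; the helper names and example host are illustrative, not from the thread:]

```python
import json
import urllib.request


def schema_by_id_url(registry_base, schema_id):
    """Build the Schema Registry REST endpoint for looking up a schema by ID."""
    return "%s/schemas/ids/%d" % (registry_base.rstrip("/"), schema_id)


def fetch_schema(registry_base, schema_id, timeout=5):
    """Fetch the Avro schema string registered under the given schema ID.

    A connection error or non-200 response here usually means the registry
    URL/IP is wrong -- the same symptom as in this thread.
    """
    url = schema_by_id_url(registry_base, schema_id)
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        return json.loads(resp.read().decode("utf-8"))["schema"]
```

If `fetch_schema("http://<registry-host>:8081", some_known_id)` succeeds from the machine running the Spark executors, the decoder's registry configuration is at least reachable and correct.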

From: Jan Uyttenhove
Reply-To: "j...@insidin.com"
Date: Tuesday, May 17, 2016 at 3:18 AM
To: "Ramaswamy, Muthuraman"
Cc: spark users
Subject: Re: KafkaUtils.createDirectStream Not Fetching Messages with Confluent 
Serializers as Value Decoder.

I think that if the Confluent deserializer cannot fetch the schema for the Avro 
message (key and/or value), you end up with no data. You should check the logs 
of the Schema Registry; it should show the HTTP requests it receives, so you can 
check whether the deserializer can connect to it and, if so, what the response 
code looks like.

If you use the Confluent serializer, each Avro message is serialized and the 
schema ID is prepended to it. This way, the Confluent deserializer can read the 
schema ID first and use it to look up the schema in the Schema Registry.


On Tue, May 17, 2016 at 2:19 AM, Ramaswamy, Muthuraman wrote:
Yes, I can see the messages. Also, I wrote a quick custom decoder for avro and 
it works fine for the following:

>> kvs = KafkaUtils.createDirectStream(ssc, [topic], {"metadata.broker.list": 
>> brokers}, valueDecoder=decoder)

But, when I use the Confluent serializers to leverage the Schema Registry 
(based on the link shown below), it doesn’t work for me. I am not sure whether 
I need to configure any more details to consume the Schema Registry. I can 
fetch the schema from the registry based on its IDs, but the decoder method 
is not returning any values for me.

~Muthu



On 5/16/16, 10:49 AM, "Cody Koeninger" wrote:

>Have you checked to make sure you can receive messages just using a
>byte array for value?
>
>On Mon, May 16, 2016 at 12:33 PM, Ramaswamy, Muthuraman wrote:
>> I am trying to consume AVRO formatted message through
>> KafkaUtils.createDirectStream. I followed the listed below example (refer
>> link) but the messages are not being fetched by the Stream.
>>
>> http://stackoverflow.com/questions/30339636/spark-python-avro-kafka-deserialiser
>>
>> Is there any code missing that I must add to make the above sample work? 
>> Say, I am not sure how the Confluent serializers would know the Avro schema 
>> info, as they know only the Schema Registry URL.
>>
>> Appreciate your help.
>>
>> ~Muthu
>>
>>
>>



--
Jan Uyttenhove
Streaming data & digital solutions architect @ Insidin bvba

j...@insidin.com
+32 474 56 24 39

https://twitter.com/xorto
https://www.linkedin.com/in/januyttenhove

This e-mail and any files transmitted with it are intended solely for the use 
of the individual or entity to whom they are addressed. It may contain 
privileged and confidential information. If you are not the intended recipient 
please notify the sender immediately and destroy this e-mail. Any form of 
reproduction, dissemination, copying, disclosure, modification, distribution 
and/or publication of this e-mail message is strictly prohibited. Whilst all 
efforts are made to safeguard e-mails, the sender cannot guarantee that 
attachments are virus free or compatible with your systems and does not accept 
liability in respect of viruses or computer problems experienced.


Re: KafkaUtils.createDirectStream Not Fetching Messages with Confluent Serializers as Value Decoder.

2016-05-17 Thread Jan Uyttenhove
I think that if the Confluent deserializer cannot fetch the schema for the
Avro message (key and/or value), you end up with no data. You should check
the logs of the Schema Registry; it should show the HTTP requests it
receives, so you can check whether the deserializer can connect to it and,
if so, what the response code looks like.

If you use the Confluent serializer, each Avro message is serialized and the
schema ID is prepended to it. This way, the Confluent deserializer can read
the schema ID first and use it to look up the schema in the Schema Registry.
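
[Editorial note: the framing described above can be sketched in plain Python. The Confluent wire format is a 0x00 magic byte, followed by a big-endian 4-byte schema ID, followed by the Avro payload; the helper name below is illustrative:]

```python
import struct

MAGIC_BYTE = 0  # first byte of every Confluent-framed Kafka message


def split_confluent_message(raw):
    """Split a Confluent-framed Kafka message into (schema_id, avro_payload).

    Raises ValueError if the message does not start with the magic byte,
    which usually means it was produced without the Confluent serializer.
    """
    if len(raw) < 5 or raw[0] != MAGIC_BYTE:
        raise ValueError("not a Confluent-framed message")
    # Bytes 1-4 hold the schema ID as a big-endian unsigned 32-bit int.
    schema_id = struct.unpack(">I", raw[1:5])[0]
    return schema_id, raw[5:]


# Example: frame a fake payload the way the Confluent serializer would.
framed = b"\x00" + struct.pack(">I", 42) + b"avro-bytes"
schema_id, payload = split_confluent_message(framed)
```

A decoder that gets a `ValueError` here (or an unexpected schema ID) points at the producer side; a clean split followed by a failed registry lookup points at the registry URL, which was the problem in this thread.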


On Tue, May 17, 2016 at 2:19 AM, Ramaswamy, Muthuraman <
muthuraman.ramasw...@viasat.com> wrote:

> Yes, I can see the messages. Also, I wrote a quick custom decoder for avro
> and it works fine for the following:
>
> >> kvs = KafkaUtils.createDirectStream(ssc, [topic],
> {"metadata.broker.list": brokers}, valueDecoder=decoder)
>
> But, when I use the Confluent serializers to leverage the Schema Registry
> (based on the link shown below), it doesn’t work for me. I am not sure
> whether I need to configure any more details to consume the Schema
> Registry. I can fetch the schema from the registry based on its IDs, but
> the decoder method is not returning any values for me.
>
> ~Muthu
>
>
>
> On 5/16/16, 10:49 AM, "Cody Koeninger"  wrote:
>
> >Have you checked to make sure you can receive messages just using a
> >byte array for value?
> >
> >On Mon, May 16, 2016 at 12:33 PM, Ramaswamy, Muthuraman
> > wrote:
> >> I am trying to consume AVRO formatted message through
> >> KafkaUtils.createDirectStream. I followed the listed below example
> (refer
> >> link) but the messages are not being fetched by the Stream.
> >>
> >>
> http://stackoverflow.com/questions/30339636/spark-python-avro-kafka-deserialiser
> >>
> >> Is there any code missing that I must add to make the above sample work?
> >> Say, I am not sure how the Confluent serializers would know the Avro
> >> schema info, as they know only the Schema Registry URL.
> >>
> >> Appreciate your help.
> >>
> >> ~Muthu
> >>
> >>
> >>
>



-- 
Jan Uyttenhove
Streaming data & digital solutions architect @ Insidin bvba

j...@insidin.com
+32 474 56 24 39

https://twitter.com/xorto
https://www.linkedin.com/in/januyttenhove



Re: KafkaUtils.createDirectStream Not Fetching Messages with Confluent Serializers as Value Decoder.

2016-05-16 Thread Mail.com
Hi Muthu,

Are you on spark 1.4.1 and Kafka 0.8.2? I have a similar issue even for simple 
string messages.

Console producer and consumer work fine, but Spark always returns an empty RDD. 
I am using the receiver-based approach.

Thanks,
Pradeep

> On May 16, 2016, at 8:19 PM, Ramaswamy, Muthuraman 
>  wrote:
> 
> Yes, I can see the messages. Also, I wrote a quick custom decoder for avro 
> and it works fine for the following:
> 
>>> kvs = KafkaUtils.createDirectStream(ssc, [topic], {"metadata.broker.list": 
>>> brokers}, valueDecoder=decoder)
> 
> But, when I use the Confluent serializers to leverage the Schema Registry 
> (based on the link shown below), it doesn’t work for me. I am not sure 
> whether I need to configure any more details to consume the Schema Registry. 
> I can fetch the schema from the registry based on its IDs, but the decoder 
> method is not returning any values for me.
> 
> ~Muthu
> 
> 
> 
>> On 5/16/16, 10:49 AM, "Cody Koeninger"  wrote:
>> 
>> Have you checked to make sure you can receive messages just using a
>> byte array for value?
>> 
>> On Mon, May 16, 2016 at 12:33 PM, Ramaswamy, Muthuraman
>>  wrote:
>>> I am trying to consume AVRO formatted message through
>>> KafkaUtils.createDirectStream. I followed the listed below example (refer
>>> link) but the messages are not being fetched by the Stream.
>>> 
>>> http://stackoverflow.com/questions/30339636/spark-python-avro-kafka-deserialiser
>>> 
>>> Is there any code missing that I must add to make the above sample work? 
>>> Say, I am not sure how the Confluent serializers would know the Avro schema 
>>> info, as they know only the Schema Registry URL.
>>> 
>>> Appreciate your help.
>>> 
>>> ~Muthu

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: KafkaUtils.createDirectStream Not Fetching Messages with Confluent Serializers as Value Decoder.

2016-05-16 Thread Ramaswamy, Muthuraman
Yes, I can see the messages. Also, I wrote a quick custom decoder for avro and 
it works fine for the following:

>> kvs = KafkaUtils.createDirectStream(ssc, [topic], {"metadata.broker.list": 
>> brokers}, valueDecoder=decoder)

But, when I use the Confluent serializers to leverage the Schema Registry 
(based on the link shown below), it doesn’t work for me. I am not sure whether 
I need to configure any more details to consume the Schema Registry. I can 
fetch the schema from the registry based on its IDs, but the decoder method 
is not returning any values for me.

~Muthu



On 5/16/16, 10:49 AM, "Cody Koeninger"  wrote:

>Have you checked to make sure you can receive messages just using a
>byte array for value?
>
>On Mon, May 16, 2016 at 12:33 PM, Ramaswamy, Muthuraman
> wrote:
>> I am trying to consume AVRO formatted message through
>> KafkaUtils.createDirectStream. I followed the listed below example (refer
>> link) but the messages are not being fetched by the Stream.
>>
>> http://stackoverflow.com/questions/30339636/spark-python-avro-kafka-deserialiser
>>
>> Is there any code missing that I must add to make the above sample work?
>> Say, I am not sure how the Confluent serializers would know the Avro schema
>> info, as they know only the Schema Registry URL.
>>
>> Appreciate your help.
>>
>> ~Muthu
>>
>>
>>


Re: KafkaUtils.createDirectStream Not Fetching Messages with Confluent Serializers as Value Decoder.

2016-05-16 Thread Cody Koeninger
Have you checked to make sure you can receive messages just using a
byte array for value?
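
[Editorial note: this sanity check amounts to passing a decoder that returns the raw bytes unchanged, so any remaining failure can be isolated to the Avro/Registry layer rather than the Kafka connection. A minimal sketch; the decoder name is illustrative, and the PySpark call is shown only as a comment:]

```python
def raw_bytes_decoder(raw):
    """Identity valueDecoder: hand back the raw Kafka message bytes.

    If a stream built with this decoder still yields no records, the
    problem is the Kafka connection or topic name, not the serializer.
    """
    return raw

# With PySpark (not executed here), this would plug in as:
# kvs = KafkaUtils.createDirectStream(
#     ssc, [topic], {"metadata.broker.list": brokers},
#     valueDecoder=raw_bytes_decoder)
```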

On Mon, May 16, 2016 at 12:33 PM, Ramaswamy, Muthuraman
 wrote:
> I am trying to consume AVRO formatted message through
> KafkaUtils.createDirectStream. I followed the listed below example (refer
> link) but the messages are not being fetched by the Stream.
>
> http://stackoverflow.com/questions/30339636/spark-python-avro-kafka-deserialiser
>
> Is there any code missing that I must add to make the above sample work?
> Say, I am not sure how the Confluent serializers would know the Avro schema
> info, as they know only the Schema Registry URL.
>
> Appreciate your help.
>
> ~Muthu
>
>
>

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org