Adding back users.


> On May 18, 2016, at 11:49 AM, Mail.com <pradeep.mi...@mail.com> wrote:
> 
> Hi Uladzimir,
> 
> I run it as below.
> 
> spark-submit --class com.test --num-executors 4 --executor-cores 5 --queue 
> Dev --master yarn-client --driver-memory 512M --executor-memory 512M test.jar
> 
> Thanks,
> Pradeep
> 
> 
>> On May 18, 2016, at 5:45 AM, Vova Shelgunov <vvs...@gmail.com> wrote:
>> 
>> Hi Pradeep,
>> 
>> How do you run your spark application? What is spark master? How many cores 
>> do you allocate?
>> 
>> Regards,
>> Uladzimir
>> 
>>> On May 17, 2016 7:33 AM, "Mail.com" <pradeep.mi...@mail.com> wrote:
>>> Hi Muthu,
>>> 
>>> Are you on Spark 1.4.1 and Kafka 0.8.2? I have a similar issue even for 
>>> simple string messages.
>>> 
>>> Console producer and consumer work fine, but Spark always returns an empty 
>>> RDD. I am using the receiver-based approach.
>>> 
>>> Thanks,
>>> Pradeep
>>> 
>>> > On May 16, 2016, at 8:19 PM, Ramaswamy, Muthuraman 
>>> > <muthuraman.ramasw...@viasat.com> wrote:
>>> >
>>> > Yes, I can see the messages. Also, I wrote a quick custom decoder for 
>>> > avro and it works fine for the following:
>>> >
>>> >>> kvs = KafkaUtils.createDirectStream(ssc, [topic], 
>>> >>> {"metadata.broker.list": brokers}, valueDecoder=decoder)
>>> >
>>> > But when I use the Confluent serializers to leverage the Schema Registry 
>>> > (based on the link shown below), it doesn't work for me. I am not sure 
>>> > whether I need to configure anything more to consume from the Schema 
>>> > Registry. I can fetch the schema from the Schema Registry based on its 
>>> > ids, but the decoder method is not returning any values for me.
>>> >
>>> > ~Muthu
>>> >
>>> >
>>> >
>>> >> On 5/16/16, 10:49 AM, "Cody Koeninger" <c...@koeninger.org> wrote:
>>> >>
>>> >> Have you checked to make sure you can receive messages just using a
>>> >> byte array for value?
>>> >>
>>> >> On Mon, May 16, 2016 at 12:33 PM, Ramaswamy, Muthuraman
>>> >> <muthuraman.ramasw...@viasat.com> wrote:
>>> >>> I am trying to consume Avro-formatted messages through
>>> >>> KafkaUtils.createDirectStream. I followed the example listed below 
>>> >>> (see link), but the messages are not being fetched by the stream.
>>> >>>
>>> >>> http://stackoverflow.com/questions/30339636/spark-python-avro-kafka-deserialiser
>>> >>>
>>> >>> Is there any code missing that I must add to make the above sample work?
>>> >>> Also, I am not sure how the Confluent serializers would know the Avro 
>>> >>> schema info, as they are given only the Schema Registry URL.
>>> >>>
>>> >>> Appreciate your help.
>>> >>>
>>> >>> ~Muthu
>>> 
>>> ---------------------------------------------------------------------
>>> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>>> For additional commands, e-mail: user-h...@spark.apache.org
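A note on the symptom Muthu describes (decoder returns nothing when switching to the Confluent serializers): messages produced with Confluent's serializer are not plain Avro. Each value is wrapped in a 5-byte header, a magic byte (0x00) plus a 4-byte big-endian schema id, so a decoder that feeds the raw bytes straight to an Avro reader will fail. This is a minimal stdlib-only sketch of unwrapping that framing; the schema fetch from the registry and the actual Avro decode are omitted, and the frame contents here are illustrative, not from the thread:

```python
import struct

def parse_confluent_frame(raw):
    """Split a Confluent-framed Kafka value into (schema_id, avro_payload).

    Wire format: 1 magic byte (0x00) + 4-byte big-endian schema id,
    followed by the schemaless Avro-encoded body. A valueDecoder for
    createDirectStream would look up schema_id in the Schema Registry
    and decode the payload with that schema.
    """
    magic, schema_id = struct.unpack(">bI", raw[:5])
    if magic != 0:
        raise ValueError("not Confluent wire format")
    return schema_id, raw[5:]

# Build a fake frame: schema id 42 followed by an arbitrary body.
frame = struct.pack(">bI", 0, 42) + b"\x06foo"
schema_id, payload = parse_confluent_frame(frame)
```

A decoder that skips this header (and caches schemas fetched from the registry by id) should then behave like the working custom Avro decoder mentioned earlier in the thread.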
