There is a simple example at
<https://github.com/dibbhatt/kafka-spark-consumer/blob/master/examples/scala/LowLevelKafkaConsumer.scala#L45>
which you can run after changing a few lines of configuration.
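The "few lines of configuration" are typically the Kafka/ZooKeeper connection settings the example reads. A minimal sketch in Scala of the kind of properties involved; the key names and values below are illustrative placeholders, not necessarily the exact keys the example uses:

```scala
import java.util.Properties

// Illustrative consumer settings; adapt to your own cluster.
// Key names here are hypothetical, not the example's actual keys.
val props = new Properties()
props.put("zookeeper.hosts", "zk1.example.com,zk2.example.com")
props.put("zookeeper.port", "2181")
props.put("kafka.topic", "my-topic")
props.put("kafka.consumer.id", "my-consumer-group")

// These properties would then be passed to the consumer at startup.
println(props.getProperty("kafka.topic"))
```

Check the linked example source for the exact property names it expects before running.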

Thanks
Best Regards

On Fri, Jan 16, 2015 at 12:23 PM, Dibyendu Bhattacharya <
dibyendu.bhattach...@gmail.com> wrote:

> Hi Kidong,
>
> Just now I tested the Low Level Consumer with Spark 1.2, and I did not
> see any issue with the Receiver.store method. It is able to fetch
> messages from Kafka.
>
> Can you cross-check the other configuration in your setup, such as the
> Kafka broker IP, topic name, ZooKeeper host details, and consumer id?
>
> Dib
>
> On Fri, Jan 16, 2015 at 11:50 AM, Dibyendu Bhattacharya <
> dibyendu.bhattach...@gmail.com> wrote:
>
>> Hi Kidong,
>>
>> No, I have not tried with Spark 1.2 yet. I will try this out and let
>> you know how it goes.
>>
>> By the way, was there any change to the Receiver store method in
>> Spark 1.2?
>>
>>
>>
>> Regards,
>> Dibyendu
>>
>>
>>
>> On Fri, Jan 16, 2015 at 11:25 AM, mykidong <mykid...@gmail.com> wrote:
>>
>>> Hi Dibyendu,
>>>
>>> I am using Kafka 0.8.1.1 and Spark 1.2.0.
>>> After updating your pom to these versions, I rebuilt your code.
>>> But I have not received any messages from ssc.receiverStream(new
>>> KafkaReceiver(_props, i)).
>>>
>>> I have found that in your code all the messages are retrieved
>>> correctly, but _receiver.store(_dataBuffer.iterator()), a method of
>>> Spark Streaming's Receiver abstract class, does not seem to work
>>> correctly.
>>>
>>> Have you tried running your Spark Streaming Kafka consumer with Kafka
>>> 0.8.1.1 and Spark 1.2.0?
>>>
>>> - Kidong.
>>>
>>>
>>> --
>>> View this message in context:
>>> http://apache-spark-user-list.1001560.n3.nabble.com/Low-Level-Kafka-Consumer-for-Spark-tp11258p21180.html
>>> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>>
>>>
>>
>
