We're planning to use this as well (Dibyendu's
https://github.com/dibbhatt/kafka-spark-consumer ). Dibyendu, thanks for
the efforts. So far it's working nicely. I think there is merit in making
it the default Kafka Receiver for Spark Streaming.
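For anyone who wants to try it, here is a rough sketch of how we wire the
receiver into a StreamingContext. The ReceiverLauncher call and the property
names are taken from my reading of the project README, so treat this as an
outline and double-check the import path and keys against the version you
actually pull in:

  import java.util.Properties

  import org.apache.spark.SparkConf
  import org.apache.spark.storage.StorageLevel
  import org.apache.spark.streaming.{Seconds, StreamingContext}

  // ReceiverLauncher comes from the kafka-spark-consumer package; the exact
  // package path and property keys may differ between releases, so verify
  // them against the README of the version you use.
  import consumer.kafka.ReceiverLauncher

  object LowLevelKafkaReceiverSketch {
    def main(args: Array[String]): Unit = {
      val conf = new SparkConf().setAppName("LowLevelKafkaConsumer")
      val ssc = new StreamingContext(conf, Seconds(10))

      // Consumer properties: ZooKeeper connection, topic, consumer id.
      val props = new Properties()
      props.put("zookeeper.hosts", "zkhost1,zkhost2")
      props.put("zookeeper.port", "2181")
      props.put("kafka.topic", "my_topic")
      props.put("kafka.consumer.id", "my_consumer_id")

      // The launcher starts this many receivers and spreads the topic's
      // partitions across them.
      val numberOfReceivers = 3
      val stream = ReceiverLauncher.launch(
        ssc, props, numberOfReceivers, StorageLevel.MEMORY_AND_DISK_SER)

      // Something trivial with the messages so the job has an output action.
      stream.foreachRDD(rdd => println("Received " + rdd.count() + " messages"))

      ssc.start()
      ssc.awaitTermination()
    }
  }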

-neelesh

On Mon, Feb 2, 2015 at 5:25 PM, Dibyendu Bhattacharya <
[email protected]> wrote:

> Or you can use this Low Level Kafka Consumer for Spark :
> https://github.com/dibbhatt/kafka-spark-consumer
>
> This is now part of http://spark-packages.org/ and has been running
> successfully in Pearson's production environment for the past few months.
> Being a Low Level consumer, it does not have the re-balancing issue that the
> High Level consumer has.
>
> Also, I know of a few users who have shifted to this Low Level Consumer,
> which has given them a more robust, fault-tolerant Kafka Receiver for Spark.
>
> Regards,
> Dibyendu
>
> On Tue, Feb 3, 2015 at 3:57 AM, Tathagata Das <[email protected]
> > wrote:
>
>> This is an issue that is hard to resolve without rearchitecting the whole
>> Kafka Receiver. There are some workarounds worth looking into.
>>
>>
>> http://mail-archives.apache.org/mod_mbox/kafka-users/201312.mbox/%3CCAFbh0Q38qQ0aAg_cj=jzk-kbi8xwf+1m6xlj+fzf6eetj9z...@mail.gmail.com%3E
>>
>> On Mon, Feb 2, 2015 at 1:07 PM, Greg Temchenko <[email protected]>
>> wrote:
>>
>>> Hi,
>>>
>>> This does not seem to be fixed yet.
>>> I filed an issue in JIRA:
>>> https://issues.apache.org/jira/browse/SPARK-5505
>>>
>>> Greg
>>>
>>
>
