Gabor,
Thanks for the clarification.
Thanks
On Fri, Sep 6, 2019 at 12:38 AM Gabor Somogyi wrote:
Sethupathi,
Let me extract the important parts of what I've shared:
1. "This ensures that each Kafka source has its own consumer group that
does not face interference from any other consumer"
2. Consumers may consume data from each other, and the offset calculation
may give back a wrong result (that's
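Point 2 above can be illustrated with a sketch (the group and topic names below are made up): if two independent streams reuse one consumer group id, Kafka's group protocol divides the topic's partitions between them, so each stream sees only part of the data and commits offsets the other does not expect.

```scala
// Hypothetical example: two direct streams accidentally sharing a group id.
// Kafka assigns each partition of "mytopic" to exactly one member of
// "shared-group", so each stream receives only a subset of the records.
val paramsA = Map[String, Object]("group.id" -> "shared-group" /* , ... */)
val paramsB = Map[String, Object]("group.id" -> "shared-group" /* , ... */)
// Offsets committed by stream A for its partitions interleave with those
// committed by stream B -- hence the wrong offset calculations.
```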
Gabor,
Thanks for the quick response and for sharing the Spark 3.0 details. We need
to use Spark Streaming (KafkaUtils.createDirectStream) rather than Structured
Streaming, following this document:
https://spark.apache.org/docs/2.2.0/streaming-kafka-0-10-integration.html
Re-iterating the issue again for
Hi,
Let me share the relevant part of the Spark 3.0 documentation (this covers
Structured Streaming, not the DStreams you've mentioned, but it is still
relevant):

kafka.group.id (type: string, default: none, query type: streaming and batch)
The Kafka group id to use in Kafka consumer while reading from Kafka. Use
this with caution. By default, each query
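The option above can be set as follows (a sketch only; the broker address, topic, and group name are placeholders, and `spark` is assumed to be an existing SparkSession):

```scala
// Structured Streaming (Spark 3.0+): pin the consumer group id instead of
// letting Spark generate one per query. Use with caution, per the docs.
val df = spark.readStream
  .format("kafka")
  .option("kafka.bootstrap.servers", "broker1:9093") // placeholder
  .option("subscribe", "mytopic")                    // placeholder
  .option("kafka.group.id", "authorized-group")      // placeholder
  .load()
```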
Hi Team,
We have a secured Kafka cluster (which only allows consuming from a
pre-configured, authorized consumer group). There is a scenario where we
want to use Spark Streaming to consume from the secured Kafka, so we have
decided to use spark-streaming-kafka-0-10
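For the setup described above, a minimal sketch of pinning the pre-authorized group with spark-streaming-kafka-0-10 might look like this (broker, topic, and group names are placeholders, and `streamingContext` is assumed to be an existing StreamingContext):

```scala
import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.streaming.kafka010.KafkaUtils
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe

// Placeholder values -- replace with the secured cluster's settings.
val kafkaParams = Map[String, Object](
  "bootstrap.servers" -> "broker1:9093",
  "key.deserializer" -> classOf[StringDeserializer],
  "value.deserializer" -> classOf[StringDeserializer],
  // The pre-configured, authorized consumer group the cluster allows:
  "group.id" -> "authorized-group",
  "auto.offset.reset" -> "latest",
  "enable.auto.commit" -> (false: java.lang.Boolean)
)

val stream = KafkaUtils.createDirectStream[String, String](
  streamingContext,
  PreferConsistent,
  Subscribe[String, String](Array("mytopic"), kafkaParams)
)
```

This follows the pattern in the 0-10 integration guide linked earlier in the thread, with the group id fixed explicitly in kafkaParams.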