Thanks Cody,
It worked for me by keeping the number of executors, each with 1 core, equal
to the number of Kafka partitions.
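For reference, a minimal sketch of that layout, assuming the settings are
applied through SparkConf; the app name and executor count below are
placeholders, and the same values can be passed to spark-submit as
--num-executors and --executor-cores:

    import org.apache.spark.SparkConf

    // One core per executor, and as many executors as Kafka partitions,
    // so no two tasks in the same executor share a cached KafkaConsumer.
    val conf = new SparkConf()
      .setAppName("kafka-streaming-job")      // placeholder app name
      .set("spark.executor.instances", "12")  // assuming 12 Kafka partitions
      .set("spark.executor.cores", "1")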
On Mon, Sep 18, 2017 at 8:47 PM Cody Koeninger wrote:
Have you searched in jira, e.g.
https://issues.apache.org/jira/browse/SPARK-19185
On Mon, Sep 18, 2017 at 1:56 AM, HARSH TAKKAR wrote:
Hi
Changing the Spark version is my last resort; is there any other workaround for
this problem?
On Mon, Sep 18, 2017 at 11:43 AM pandees waran wrote:
All, May I know what exactly changed in 2.1.1 which solved this problem?
Sent from my iPhone
On Sep 17, 2017, at 11:08 PM, Anastasios Zouzias wrote:
Hi,
I had a similar issue using 2.1.0 but not with Kafka. Updating to 2.1.1
solved my issue. Can you try with 2.1.1 as well and report back?
Best,
Anastasios
On 17.09.2017 at 16:48, "HARSH TAKKAR" wrote:
> Hi
> I am using Spark 2.1.0 with Scala 2.11.8, and while iterating
You should paste some code. ConcurrentModificationException normally
happens when you modify a list or any non-thread-safe data structure while
you are iterating over it.
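For anyone new to this, a minimal standalone sketch (not from the thread) that
reproduces this failure mode with a plain java.util.ArrayList:

    import java.util.ArrayList

    val list = new ArrayList[Int]()
    (1 to 5).foreach(i => list.add(i))

    val it = list.iterator()
    it.next()     // fine: the list has not been modified yet
    list.add(99)  // structural modification while the iterator is live
    it.next()     // throws java.util.ConcurrentModificationException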
On Sun, Sep 17, 2017 at 10:25 PM, HARSH TAKKAR wrote:
Hi,
No, we are not creating any threads for the Kafka DStream.
However, we have a single thread for refreshing a resource cache on the driver,
but that is totally separate from this connection.
On Mon, Sep 18, 2017 at 12:29 AM kant kodali wrote:
Are you creating threads in your application?
On Sun, Sep 17, 2017 at 7:48 AM, HARSH TAKKAR wrote:
Hi
I am using Spark 2.1.0 with Scala 2.11.8, and while iterating over the
partitions of each RDD in a DStream formed using KafkaUtils, I am getting
the exception below; please suggest a fix.
I have the following config:
kafka:
  enable.auto.commit: "true",
  auto.commit.interval.ms: "1000",
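For completeness, a minimal sketch of the kind of job described above, using
the spark-streaming-kafka-0-10 API that ships with Spark 2.1.0; the broker
address, group id, topic name, and batch interval are placeholders, not taken
from the thread:

    import org.apache.kafka.common.serialization.StringDeserializer
    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.kafka010.KafkaUtils
    import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
    import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe

    val sparkConf = new SparkConf().setAppName("kafka-dstream-example") // placeholder
    val ssc = new StreamingContext(sparkConf, Seconds(5))               // assumed interval

    val kafkaParams = Map[String, Object](
      "bootstrap.servers"       -> "broker:9092",   // placeholder
      "key.deserializer"        -> classOf[StringDeserializer],
      "value.deserializer"      -> classOf[StringDeserializer],
      "group.id"                -> "example-group", // placeholder
      "enable.auto.commit"      -> "true",          // from the config above
      "auto.commit.interval.ms" -> "1000"
    )

    val stream = KafkaUtils.createDirectStream[String, String](
      ssc, PreferConsistent, Subscribe[String, String](Seq("my-topic"), kafkaParams))

    // Iterate over the partitions of each RDD in the DStream, as described above.
    stream.foreachRDD { rdd =>
      rdd.foreachPartition { records =>
        records.foreach(r => println(r.value()))
      }
    }

    ssc.start()
    ssc.awaitTermination()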