I am currently using Spark 2.3.0. I will try it with 2.3.1.

On Tue, Jul 3, 2018 at 3:12 PM, Shixiong(Ryan) Zhu <shixi...@databricks.com>
wrote:

> Which version are you using? There is a known issue regarding this that
> should be fixed in 2.3.1. See
> https://issues.apache.org/jira/browse/SPARK-23623 for details.
>
> Best Regards,
> Ryan
>
> On Mon, Jul 2, 2018 at 3:56 AM, kant kodali <kanth...@gmail.com> wrote:
>
>> Hi All,
>>
>> I get the error below quite often when I do a stream-stream inner join
>> on two data frames. After running several experiments, stream-stream
>> joins don't look stable enough for production yet. Any advice on this?
>>
>> Thanks!
>>
>> java.util.ConcurrentModificationException: KafkaConsumer is not safe for
>> multi-threaded access
>> 18/07/02 09:32:14 INFO LineBufferedStream: stdout:     at
>> org.apache.kafka.clients.consumer.KafkaConsumer.acquire(KafkaConsumer.java:1431)
>> 18/07/02 09:32:14 INFO LineBufferedStream: stdout:     at
>> org.apache.kafka.clients.consumer.KafkaConsumer.close(KafkaConsumer.java:1361)
>> 18/07/02 09:32:14 INFO LineBufferedStream: stdout:     at
>> org.apache.spark.sql.kafka010.CachedKafkaConsumer.close(CachedKafkaConsumer.scala:301)
>>
>
>
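For context, a minimal sketch of the kind of query being discussed: a stream-stream inner join between two Kafka-backed streams in Spark Structured Streaming. The topic names, join key, and watermark columns below are illustrative assumptions, not taken from the original report; on 2.3.0 such a query could hit the cached-consumer race tracked in SPARK-23623.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.expr

// Hypothetical repro sketch: two Kafka topics ("impressions", "clicks")
// joined on an assumed adId key. Requires a running Spark cluster and
// Kafka broker; names here are placeholders.
val spark = SparkSession.builder.appName("stream-stream-join-sketch").getOrCreate()

val impressions = spark.readStream
  .format("kafka")
  .option("kafka.bootstrap.servers", "localhost:9092")
  .option("subscribe", "impressions")
  .load()
  .selectExpr("CAST(value AS STRING) AS adId", "timestamp AS impressionTime")
  .withWatermark("impressionTime", "10 minutes")

val clicks = spark.readStream
  .format("kafka")
  .option("kafka.bootstrap.servers", "localhost:9092")
  .option("subscribe", "clicks")
  .load()
  .selectExpr("CAST(value AS STRING) AS adId", "timestamp AS clickTime")
  .withWatermark("clickTime", "10 minutes")

// Inner join with a time-range condition so old state can be dropped.
val joined = impressions.as("i").join(
  clicks.as("c"),
  expr("""
    i.adId = c.adId AND
    c.clickTime BETWEEN i.impressionTime AND i.impressionTime + interval 10 minutes
  """))

joined.writeStream.format("console").start().awaitTermination()
```

Both sides carry watermarks and the join condition bounds the event-time gap, following the pattern in the Structured Streaming programming guide; without these, join state grows without bound.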
