Dear Apache Spark users,

I have a long-running Spark application that encounters an ArrayIndexOutOfBoundsException roughly once every two weeks. The exception does not disrupt the application's operation, but I'm still concerned about it and would like to find a solution.

Here's some additional information about my setup:

- Spark is running in standalone mode
- Spark version: 3.3.1
- Scala version: 2.12.15
- I'm using Spark Structured Streaming

Here's the relevant error message:

java.lang.ArrayIndexOutOfBoundsException: Index 59 out of bounds for length 16

I've reviewed the code and searched online, but I'm still unable to find a solution. The full stack trace is available at this link:
https://gist.github.com/rsi2m/ae54eccac93ae602d04d383e56c1a737

I would appreciate any insights or suggestions on how to resolve this issue. Thank you in advance for your help.

 

Best regards,
rsi2m