[ https://issues.apache.org/jira/browse/SPARK-49480?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Wenchen Fan reassigned SPARK-49480:
-----------------------------------

    Assignee: Xi Chen

> NullPointerException from SparkThrowableHelper.isInternalError method
> ---------------------------------------------------------------------
>
>                 Key: SPARK-49480
>                 URL: https://issues.apache.org/jira/browse/SPARK-49480
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 4.0.0, 3.5.2
>            Reporter: Xi Chen
>            Assignee: Xi Chen
>            Priority: Major
>              Labels: pull-request-available
>
> The SparkThrowableHelper.isInternalError method does not handle null input, which can lead to a NullPointerException.
> Example stack trace from our environment with Spark 3.5.1:
> {code:java}
> Caused by: java.lang.NullPointerException: Cannot invoke "String.startsWith(String)" because "errorClass" is null
>     at org.apache.spark.SparkThrowableHelper$.isInternalError(SparkThrowableHelper.scala:64)
>     at org.apache.spark.SparkThrowableHelper.isInternalError(SparkThrowableHelper.scala)
>     at org.apache.spark.SparkThrowable.isInternalError(SparkThrowable.java:50)
>     at org.apache.spark.SparkException.isInternalError(SparkException.scala:27)
>     at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
>     at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.base/java.lang.reflect.Method.invoke(Method.java:568)
>     at com.fasterxml.jackson.databind.ser.BeanPropertyWriter.serializeAsField(BeanPropertyWriter.java:688)
>     at com.fasterxml.jackson.databind.ser.std.BeanSerializerBase.serializeFields(BeanSerializerBase.java:772)
>     ... 30 more {code}
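> For reference, a minimal sketch of a null-safe guard, assuming the check is a prefix test on the error class (as the stack trace suggests) and that internal errors are identified by an "INTERNAL_ERROR" prefix; this is an illustration, not the actual Spark patch:
> {code:scala}
> // Hypothetical null-safe variant of the prefix check; the real fix in Spark may differ.
> def isInternalError(errorClass: String): Boolean = {
>   // Treat a missing error class as "not an internal error" instead of
>   // dereferencing null and throwing a NullPointerException.
>   errorClass != null && errorClass.startsWith("INTERNAL_ERROR")
> }
> {code}
> With a guard like this, calling isInternalError on an exception that carries no error class would simply return false instead of failing, e.g. during Jackson serialization of the throwable.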