HeartSaVioR opened a new pull request, #40705:
URL: https://github.com/apache/spark/pull/40705

   ### What changes were proposed in this pull request?
   
This PR moves the error class resource file in the Kafka connector from test to 
src, so that the error classes work without test artifacts.
   
   ### Why are the changes needed?
   
   Refer to the `How was this patch tested?`.
   
   ### Does this PR introduce _any_ user-facing change?
   
Yes, but the likelihood of users encountering this is small.
   
   ### How was this patch tested?
   
Ran spark-shell with the Kafka connector artifacts (without test artifacts) and 
triggered `KafkaExceptions` to confirm that the exception is raised properly.
   
   ```
   scala> import org.apache.spark.sql.kafka010.KafkaExceptions
   import org.apache.spark.sql.kafka010.KafkaExceptions
   
   scala> import org.apache.kafka.common.TopicPartition
   import org.apache.kafka.common.TopicPartition
   
   scala> 
KafkaExceptions.mismatchedTopicPartitionsBetweenEndOffsetAndPrefetched(Set[TopicPartition](),
 Set[TopicPartition]())
   res1: org.apache.spark.SparkException =
   org.apache.spark.SparkException: Kafka data source in Trigger.AvailableNow 
should provide the same topic partitions in pre-fetched offset to end offset 
for each microbatch. The error could be transient - restart your query, and 
report if you still see the same issue.
   topic-partitions for pre-fetched offset: Set(), topic-partitions for end 
offset: Set().
   ```
   
Without the fix, triggering `KafkaExceptions` failed to load the error class 
resource file and led to an unexpected exception.
   
   ```
   scala> 
KafkaExceptions.mismatchedTopicPartitionsBetweenEndOffsetAndPrefetched(Set[TopicPartition](),
 Set[TopicPartition]())
   java.lang.IllegalArgumentException: argument "src" is null
     at 
com.fasterxml.jackson.databind.ObjectMapper._assertNotNull(ObjectMapper.java:4885)
     at 
com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:3618)
     at 
org.apache.spark.ErrorClassesJsonReader$.org$apache$spark$ErrorClassesJsonReader$$readAsMap(ErrorClassesJSONReader.scala:95)
     at 
org.apache.spark.ErrorClassesJsonReader.$anonfun$errorInfoMap$1(ErrorClassesJSONReader.scala:44)
     at scala.collection.immutable.List.map(List.scala:293)
     at 
org.apache.spark.ErrorClassesJsonReader.<init>(ErrorClassesJSONReader.scala:44)
     at 
org.apache.spark.sql.kafka010.KafkaExceptions$.<init>(KafkaExceptions.scala:27)
     at 
org.apache.spark.sql.kafka010.KafkaExceptions$.<clinit>(KafkaExceptions.scala)
     ... 47 elided
   ```
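The failure above boils down to a classpath lookup: `ErrorClassesJsonReader` resolves the error-class JSON file as a resource URL, and when the file is packaged only in the test artifact, the lookup returns `null`, which Jackson's `ObjectMapper.readValue` then rejects ("argument \"src\" is null"). A minimal standalone sketch of that lookup behavior (not Spark's code; the JSON path below is hypothetical, standing in for the misplaced resource file):

```scala
// Standalone sketch: a classpath resource lookup returns null when the
// file is not packaged in any artifact on the classpath.
object ResourceLookupDemo {
  // True if `path` resolves against `cls`'s classpath, false otherwise.
  def resourceExists(cls: Class[_], path: String): Boolean =
    cls.getResource(path) != null

  def main(args: Array[String]): Unit = {
    // String.class is always co-packaged with java.lang.String -> found.
    println(resourceExists(classOf[String], "String.class"))
    // Hypothetical path standing in for the misplaced error-class JSON -> null.
    println(resourceExists(getClass, "/error/kafka-error-classes.json"))
  }
}
```

With the resource file under src, the same lookup resolves from the main artifact and the reader initializes normally.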
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

