[ https://issues.apache.org/jira/browse/SPARK-44215?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Mridul Muralidharan updated SPARK-44215:
----------------------------------------
    Fix Version/s: 3.3.3

> Client receives zero number of chunks in merge meta response which doesn't
> trigger fallback to unmerged blocks
> --------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-44215
>                 URL: https://issues.apache.org/jira/browse/SPARK-44215
>             Project: Spark
>          Issue Type: Bug
>          Components: Shuffle
>    Affects Versions: 3.2.0
>            Reporter: Chandni Singh
>            Assignee: Chandni Singh
>            Priority: Major
>             Fix For: 3.3.3, 3.5.0, 3.4.2
>
> We still see instances of the server returning 0 {{numChunks}} in
> {{mergedMetaResponse}}, which causes the executor to fail with an
> {{ArithmeticException}}:
> {code}
> java.lang.ArithmeticException: / by zero
> 	at org.apache.spark.storage.PushBasedFetchHelper.createChunkBlockInfosFromMetaResponse(PushBasedFetchHelper.scala:128)
> 	at org.apache.spark.storage.ShuffleBlockFetcherIterator.next(ShuffleBlockFetcherIterator.scala:1047)
> 	at org.apache.spark.storage.ShuffleBlockFetcherIterator.next(ShuffleBlockFetcherIterator.scala:90)
> 	at org.apache.spark.util.CompletionIterator.next(CompletionIterator.scala:29)
> 	at scala.collection.Iterator$$anon$11.nextCur(Iterator.scala:484)
> 	at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:490)
> 	at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:458)
> 	at org.apache.spark.util.CompletionIterator.hasNext(CompletionIterator.scala:31)
> 	at org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:37)
> 	at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:458)
> {code}
> Here the executor does not fall back to fetching the un-merged blocks, and the
> error also does not surface as a {{FetchFailure}}, so the application fails
> outright instead of recovering.
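For context on the failure mode: a merge meta response advertising zero chunks reaches a per-chunk size computation that divides by {{numChunks}}, producing the {{/ by zero}} above instead of a graceful fallback. The sketch below is a hypothetical, simplified illustration of the needed client-side guard (the names {{MergedMetaGuard}} and {{chunkSizes}} are invented for illustration; the real logic lives in {{PushBasedFetchHelper.createChunkBlockInfosFromMetaResponse}} and is written in Scala):

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch: guard against a zero-chunk merge meta response so the
// caller can fall back to fetching the original un-merged blocks, rather than
// hitting ArithmeticException in the per-chunk size computation.
public class MergedMetaGuard {

    // Returns per-chunk sizes, or null to signal "merged block unusable,
    // fall back to un-merged blocks".
    static List<Long> chunkSizes(long totalBlockSize, int numChunks) {
        if (numChunks <= 0) {
            // Without this guard, the division below throws "/ by zero"
            // and the task fails with no FetchFailure and no fallback.
            return null;
        }
        long approxChunkSize = totalBlockSize / numChunks; // safe: numChunks > 0
        List<Long> sizes = new ArrayList<>();
        for (int i = 0; i < numChunks; i++) {
            sizes.add(approxChunkSize);
        }
        return sizes;
    }

    public static void main(String[] args) {
        System.out.println(chunkSizes(1024L, 0) == null); // true: fallback path taken
        System.out.println(chunkSizes(1024L, 4).get(0));  // 256: normal path
    }
}
```

The point of the guard is that a zero-chunk response is treated the same way as any other failed merged-block fetch: the merged block is abandoned and the original blocks are requested, instead of letting an unhandled {{ArithmeticException}} kill the task without a {{FetchFailure}}.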
--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org