Li Ying created SPARK-42834:
-------------------------------

             Summary: Divided by zero occurs in PushBasedFetchHelper.createChunkBlockInfosFromMetaResponse
                 Key: SPARK-42834
                 URL: https://issues.apache.org/jira/browse/SPARK-42834
             Project: Spark
          Issue Type: Bug
          Components: Shuffle
    Affects Versions: 3.2.0
            Reporter: Li Ying


Sometimes when running a SQL job with push-based shuffle, the exception below occurs. It seems that there is no element in the bitmaps which store the merged chunk meta. See org.apache.spark.storage.PushBasedFetchHelper.createChunkBlockInfosFromMetaResponse.

Is this a bug, i.e. should we avoid calling createChunkBlockInfosFromMetaResponse when bitmaps is empty, or should the bitmaps never be empty here?
 
{code:java}
Caused by: java.lang.ArithmeticException: / by zero
at org.apache.spark.storage.PushBasedFetchHelper.createChunkBlockInfosFromMetaResponse(PushBasedFetchHelper.scala:117)
at org.apache.spark.storage.ShuffleBlockFetcherIterator.next(ShuffleBlockFetcherIterator.scala:980)
at org.apache.spark.storage.ShuffleBlockFetcherIterator.next(ShuffleBlockFetcherIterator.scala:84)
{code}
Related code:
{code:java}
def createChunkBlockInfosFromMetaResponse(
    shuffleId: Int,
    shuffleMergeId: Int,
    reduceId: Int,
    blockSize: Long,
    bitmaps: Array[RoaringBitmap]): ArrayBuffer[(BlockId, Long, Int)] = {
  val approxChunkSize = blockSize / bitmaps.length
  val blocksToFetch = new ArrayBuffer[(BlockId, Long, Int)]()
  for (i <- bitmaps.indices) {
    val blockChunkId = ShuffleBlockChunkId(shuffleId, shuffleMergeId, reduceId, i)
    chunksMetaMap.put(blockChunkId, bitmaps(i))
    logDebug(s"adding block chunk $blockChunkId of size $approxChunkSize")
    blocksToFetch += ((blockChunkId, approxChunkSize, SHUFFLE_PUSH_MAP_ID))
  }
  blocksToFetch
}
{code}
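
For illustration only, below is a minimal defensive sketch (not an actual Spark patch) of how the division could be guarded when bitmaps is empty: skip chunk creation and return an empty buffer, leaving the caller to fall back to fetching the original (unmerged) blocks. The early-return fallback and the warning message are assumptions made for the sketch, not behavior confirmed from the Spark code base.
{code:java}
// Hypothetical sketch only: guard against an empty bitmaps array so that
// blockSize / bitmaps.length cannot throw ArithmeticException. Whether an
// empty result (forcing a fallback to original blocks) is the right
// behavior is exactly the question raised above.
def createChunkBlockInfosFromMetaResponse(
    shuffleId: Int,
    shuffleMergeId: Int,
    reduceId: Int,
    blockSize: Long,
    bitmaps: Array[RoaringBitmap]): ArrayBuffer[(BlockId, Long, Int)] = {
  val blocksToFetch = new ArrayBuffer[(BlockId, Long, Int)]()
  if (bitmaps.isEmpty) {
    // Assumed fallback: no chunk meta, so create no chunk blocks.
    logWarning(s"Empty chunk meta bitmaps for shuffle $shuffleId " +
      s"(shuffleMergeId=$shuffleMergeId, reduceId=$reduceId); skipping chunk creation")
    return blocksToFetch
  }
  val approxChunkSize = blockSize / bitmaps.length
  for (i <- bitmaps.indices) {
    val blockChunkId = ShuffleBlockChunkId(shuffleId, shuffleMergeId, reduceId, i)
    chunksMetaMap.put(blockChunkId, bitmaps(i))
    logDebug(s"adding block chunk $blockChunkId of size $approxChunkSize")
    blocksToFetch += ((blockChunkId, approxChunkSize, SHUFFLE_PUSH_MAP_ID))
  }
  blocksToFetch
}
{code}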


