[ 
https://issues.apache.org/jira/browse/KAFKA-12868?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yuneng Xie updated KAFKA-12868:
-------------------------------
    Description: 
Our broker spends far too long loading the offsets partition — about 20 minutes here:

```
[2021-05-30 03:18:20,505] INFO [GroupMetadataManager brokerId=2] Finished loading offsets and group metadata from __consumer_offsets-47 in 1236029 milliseconds. (kafka.coordinator.group.GroupMetadataManager)
```

 

So I checked the partition's data on disk, and it is far too big:

```
106G /var/lib/kafka/data/__consumer_offsets-47
```

 

The retention time is 7 days:

```
log.retention.hours=168
```

 

I also found an error about the log cleaner in the broker logs:

```
[2021-04-10 00:28:24,507] INFO Cleaner 0: Beginning cleaning of log __consumer_offsets-47. (kafka.log.LogCleaner)
[2021-04-10 00:28:24,507] INFO Cleaner 0: Building offset map for __consumer_offsets-47... (kafka.log.LogCleaner)
[2021-04-10 00:28:24,526] INFO Cleaner 0: Building offset map for log __consumer_offsets-47 for 17 segments in offset range [1010757586, 1020967392). (kafka.log.LogCleaner)
[2021-04-10 00:29:22,669] WARN [kafka-log-cleaner-thread-0]: Unexpected exception thrown when cleaning log Log(dir=/var/lib/kafka/data/__consumer_offsets-47, topic=__consumer_offsets, partition=47, highWatermark=1021411007, lastStableOffset=1021411007, logStartOffset=0, logEndOffset=1021411007). Marking its partition (__consumer_offsets-47) as uncleanable (kafka.log.LogCleaner)
kafka.log.LogCleaningException: -2147483648
        at kafka.log.LogCleaner$CleanerThread.cleanFilthiestLog(LogCleaner.scala:348)
        at kafka.log.LogCleaner$CleanerThread.tryCleanFilthiestLog(LogCleaner.scala:324)
        at kafka.log.LogCleaner$CleanerThread.doWork(LogCleaner.scala:313)
        at kafka.utils.ShutdownableThread.run(ShutdownableThread.scala:96)
Caused by: java.lang.ArrayIndexOutOfBoundsException: -2147483648
        at kafka.utils.CoreUtils$.readInt(CoreUtils.scala:241)
        at kafka.log.SkimpyOffsetMap.positionOf(OffsetMap.scala:183)
        at kafka.log.SkimpyOffsetMap.put(OffsetMap.scala:101)
        at kafka.log.Cleaner.$anonfun$buildOffsetMapForSegment$2(LogCleaner.scala:947)
        at kafka.log.Cleaner.$anonfun$buildOffsetMapForSegment$2$adapted(LogCleaner.scala:944)
        at scala.collection.Iterator.foreach(Iterator.scala:941)
        at scala.collection.Iterator.foreach$(Iterator.scala:941)
        at scala.collection.AbstractIterator.foreach(Iterator.scala:1429)
        at scala.collection.IterableLike.foreach(IterableLike.scala:74)
        at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
        at scala.collection.AbstractIterable.foreach(Iterable.scala:56)
        at kafka.log.Cleaner.$anonfun$buildOffsetMapForSegment$1(LogCleaner.scala:944)
        at kafka.log.Cleaner.$anonfun$buildOffsetMapForSegment$1$adapted(LogCleaner.scala:933)
        at scala.collection.Iterator.foreach(Iterator.scala:941)
        at scala.collection.Iterator.foreach$(Iterator.scala:941)
        at scala.collection.AbstractIterator.foreach(Iterator.scala:1429)
        at scala.collection.IterableLike.foreach(IterableLike.scala:74)
        at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
        at scala.collection.AbstractIterable.foreach(Iterable.scala:56)
        at kafka.log.Cleaner.buildOffsetMapForSegment(LogCleaner.scala:933)
        at kafka.log.Cleaner.$anonfun$buildOffsetMap$3(LogCleaner.scala:894)
        at kafka.log.Cleaner.$anonfun$buildOffsetMap$3$adapted(LogCleaner.scala:890)
        at scala.collection.TraversableLike$WithFilter.$anonfun$foreach$1(TraversableLike.scala:877)
        at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
        at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
        at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:876)
        at kafka.log.Cleaner.buildOffsetMap(LogCleaner.scala:890)
        at kafka.log.Cleaner.doClean(LogCleaner.scala:514)
        at kafka.log.Cleaner.clean(LogCleaner.scala:502)
        at kafka.log.LogCleaner$CleanerThread.cleanLog(LogCleaner.scala:371)
        at kafka.log.LogCleaner$CleanerThread.cleanFilthiestLog(LogCleaner.scala:344)
        ... 3 more
```

 

It seems the cleaner failed to compact __consumer_offsets-47 and marked the partition uncleanable, so this compacted topic keeps growing regardless of the retention setting.

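For what it's worth, the -2147483648 in the trace is exactly Integer.MIN_VALUE, which makes me suspect an absolute-value overflow somewhere in the offset map's hash-to-slot computation — this is my guess, not confirmed against the Kafka source. A minimal Java sketch of the pitfall:

```java
// Sketch of a suspected cause (assumption, not taken from Kafka's code):
// Integer.MIN_VALUE is the one 32-bit int with no positive counterpart,
// so Math.abs returns it unchanged -- still negative.
public class AbsOverflowDemo {
    public static void main(String[] args) {
        int slots = 1000;                    // hypothetical slot count
        int badHash = Integer.MIN_VALUE;     // the value from the stack trace

        // Math.abs overflows for MIN_VALUE and returns it unchanged:
        System.out.println(Math.abs(badHash));          // -2147483648

        // so an "abs(hash) % slots" position lookup goes negative,
        // matching the ArrayIndexOutOfBoundsException: -2147483648:
        System.out.println(Math.abs(badHash) % slots);  // -648

        // masking off the sign bit instead is always non-negative:
        System.out.println((badHash & 0x7fffffff) % slots); // 0
    }
}
```

If that is the cause, a single record key whose hash lands exactly on Integer.MIN_VALUE would be enough to poison cleaning of the whole partition.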
Any suggestions on this?

 

> log cleaner failed with java.lang.ArrayIndexOutOfBoundsException: -2147483648
> -----------------------------------------------------------------------------
>
>                 Key: KAFKA-12868
>                 URL: https://issues.apache.org/jira/browse/KAFKA-12868
>             Project: Kafka
>          Issue Type: Bug
>          Components: log cleaner
>    Affects Versions: 2.4.0
>            Reporter: Yuneng Xie
>            Priority: Major
>



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
