Hey guys,

   I have a Druid extension that provides a metric storing image pixel
values and generating a heatmap of the clicks on each pixel, with custom
aggregation logic. I am using a direct ByteBuffer for all the bit
manipulation, but I am getting the ReadOnlyBufferException below during
aggregation. I used ByteBuffer.allocate() and, wherever possible, I am
wrapping the ByteBuffer and deep copying. Any pointers on how to fix this?

Two caveats:
1. I am using real-time Kafka ingestion.
2. The events I am ingesting are randomly spread over the last couple of
days (so there is a lot of re-indexing)!
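To illustrate what I mean by deep copying (a simplified sketch, not my actual sketch code): before mutating any incoming buffer I copy its remaining bytes into a fresh writable buffer, since calling put() directly on a read-only view is exactly what throws ReadOnlyBufferException.

```java
import java.nio.ByteBuffer;

public class BufferCopy {
    // Returns a writable heap copy of buf's remaining bytes.
    // Reads through a duplicate view so buf's own position is untouched.
    static ByteBuffer writableCopy(ByteBuffer buf) {
        ByteBuffer src = buf.duplicate();             // same content, independent position
        ByteBuffer copy = ByteBuffer.allocate(src.remaining());
        copy.put(src);                                // bulk copy works even if buf is read-only
        copy.flip();                                  // rewind so the copy is usable from index 0
        return copy;
    }

    public static void main(String[] args) {
        ByteBuffer ro = ByteBuffer.wrap(new byte[]{1, 2, 3}).asReadOnlyBuffer();
        ByteBuffer rw = writableCopy(ro);
        rw.put(0, (byte) 9);                          // would throw ReadOnlyBufferException on ro
        System.out.println(rw.get(0));                // 9
        System.out.println(ro.get(0));                // original untouched: 1
    }
}
```

The trace below shows the put() happening inside combine() at merge time, so my guess is one of the buffers reaching it is a read-only slice of a memory-mapped segment file rather than one of my allocated copies.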


io.druid.segment.realtime.appenderator.AppenderatorImpl - Failed to push merged index for segment[heatmap_events_2018-09-28T00:00:00.000Z_2018-09-28T01:00:00.000Z_2018-10-09T15:53:31.871Z_83].
java.nio.ReadOnlyBufferException
    at java.nio.DirectByteBufferR.put(DirectByteBufferR.java:309) ~[?:1.8.0_181]
    at HeatmapCountMinSketch.addTwoByteValues(HeatmapCountMinSketch.java:283) ~[?:?]
    at HeatmapCountMinSketch.addSketch(HeatmapCountMinSketch.java:253) ~[?:?]
    at HeatmapAggregatorFactory.combine(HeatmapAggregatorFactory.java:98) ~[?:?]
    at io.druid.segment.IndexMerger$RowboatMergeFunction.apply(IndexMerger.java:396) ~[druid-processing-0.12.3.jar:0.12.3]
    at io.druid.segment.IndexMerger$RowboatMergeFunction.apply(IndexMerger.java:365) ~[druid-processing-0.12.3.jar:0.12.3]
    at io.druid.collections.CombiningIterator.next(CombiningIterator.java:81) ~[druid-common-0.12.3.jar:0.12.3]
    at io.druid.segment.IndexMergerV9.mergeIndexesAndWriteColumns(IndexMergerV9.java:456) ~[druid-processing-0.12.3.jar:0.12.3]
    at io.druid.segment.IndexMergerV9.makeIndexFiles(IndexMergerV9.java:209) ~[druid-processing-0.12.3.jar:0.12.3]
    at io.druid.segment.IndexMergerV9.merge(IndexMergerV9.java:837) ~[druid-processing-0.12.3.jar:0.12.3]
    at io.druid.segment.IndexMergerV9.mergeQueryableIndex(IndexMergerV9.java:710) ~[druid-processing-0.12.3.jar:0.12.3]
    at io.druid.segment.IndexMergerV9.mergeQueryableIndex(IndexMergerV9.java:688) ~[druid-processing-0.12.3.jar:0.12.3]
    at io.druid.segment.realtime.appenderator.AppenderatorImpl.mergeAndPush(AppenderatorImpl.java:659) ~[druid-server-0.12.3.jar:0.12.3]
    at io.druid.segment.realtime.appenderator.AppenderatorImpl.lambda$push$0(AppenderatorImpl.java:563) ~[druid-server-0.12.3.jar:0.12.3]
    at com.google.common.util.concurrent.Futures$1.apply(Futures.java:713) [guava-16.0.1.jar:?]
    at com.google.common.util.concurrent.Futures$ChainingListenableFuture.run(Futures.java:861) [guava-16.0.1.jar:?]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_181]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_181]
    at java.lang.Thread.run(Thread.java:748) [?:1.8.0_181]
2018-10-24T01:26:10,225 ERROR [publish-0] io.druid.indexing.kafka.KafkaIndexTask - Error while publishing segments for

-- 
-spurthi
