I wanted to make sure you saw this.
---------- Forwarded message ---------
From: zhoujk <[email protected]>
Date: Mon, Jul 15, 2019 at 8:28 AM
Subject: [apache/incubator-druid] DumpSegment tool does not resolve type id
'quantilesDoublesSketchMerge' (#8082)
To: apache/incubator-druid <[email protected]>
CC: Subscribed <[email protected]>
Hi all,
When I used the DumpSegment tool to dump a dataset, I got this exception:
2019-07-15T16:55:49,624 INFO [main] org.apache.druid.server.metrics.MetricsModule - Adding monitor[org.apache.druid.server.initialization.jetty.JettyServerModule$JettyMonitor@b2f4ece]
2019-07-15T16:55:49,700 WARN [main] org.apache.druid.segment.IndexIO - Failed to load metadata for segment [/data/tingyun/druid-0.12.3/20190714T000000.000Z_20190715T000000.000Z/2019-07-15T00_37_12.065Z/0]
com.fasterxml.jackson.databind.JsonMappingException: Could not resolve type id 'quantilesDoublesSketchMerge' into a subtype of [simple type, class org.apache.druid.query.aggregation.AggregatorFactory]: known type ids = [AggregatorFactory, cardinality, count, doubleFirst, doubleLast, doubleMax, doubleMin, doubleSum, filtered, floatFirst, floatLast, floatMax, floatMin, floatSum, histogram, hyperUnique, javascript, longFirst, longLast, longMax, longMin, longSum, stringFirst, stringFirstFold, stringLast, stringLastFold]
 at [Source: [B@ffaaaf0; line: 1, column: 185] (through reference chain: org.apache.druid.segment.Metadata["aggregators"]->Object[][2])
    at com.fasterxml.jackson.databind.JsonMappingException.from(JsonMappingException.java:148) ~[jackson-databind-2.6.7.jar:2.6.7]
    at com.fasterxml.jackson.databind.DeserializationContext.unknownTypeException(DeserializationContext.java:967) ~[jackson-databind-2.6.7.jar:2.6.7]
    at com.fasterxml.jackson.databind.jsontype.impl.TypeDeserializerBase._handleUnknownTypeId(TypeDeserializerBase.java:277) ~[jackson-databind-2.6.7.jar:2.6.7]
    at com.fasterxml.jackson.databind.jsontype.impl.TypeDeserializerBase._findDeserializer(TypeDeserializerBase.java:159) ~[jackson-databind-2.6.7.jar:2.6.7]
    at com.fasterxml.jackson.databind.jsontype.impl.AsPropertyTypeDeserializer._deserializeTypedForId(AsPropertyTypeDeserializer.java:108) ~[jackson-databind-2.6.7.jar:2.6.7]
    at com.fasterxml.jackson.databind.jsontype.impl.AsPropertyTypeDeserializer.deserializeTypedFromObject(AsPropertyTypeDeserializer.java:93) ~[jackson-databind-2.6.7.jar:2.6.7]
    at com.fasterxml.jackson.databind.deser.AbstractDeserializer.deserializeWithType(AbstractDeserializer.java:131) ~[jackson-databind-2.6.7.jar:2.6.7]
    at com.fasterxml.jackson.databind.deser.std.ObjectArrayDeserializer.deserialize(ObjectArrayDeserializer.java:158) ~[jackson-databind-2.6.7.jar:2.6.7]
    at com.fasterxml.jackson.databind.deser.std.ObjectArrayDeserializer.deserialize(ObjectArrayDeserializer.java:17) ~[jackson-databind-2.6.7.jar:2.6.7]
    at com.fasterxml.jackson.databind.deser.SettableBeanProperty.deserialize(SettableBeanProperty.java:520) ~[jackson-databind-2.6.7.jar:2.6.7]
    at com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeWithErrorWrapping(BeanDeserializer.java:463) ~[jackson-databind-2.6.7.jar:2.6.7]
    at com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:378) ~[jackson-databind-2.6.7.jar:2.6.7]
    at com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1099) ~[jackson-databind-2.6.7.jar:2.6.7]
    at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:296) ~[jackson-databind-2.6.7.jar:2.6.7]
    at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:133) ~[jackson-databind-2.6.7.jar:2.6.7]
    at com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:3736) ~[jackson-databind-2.6.7.jar:2.6.7]
    at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2819) ~[jackson-databind-2.6.7.jar:2.6.7]
    at org.apache.druid.segment.IndexIO$V9IndexLoader.load(IndexIO.java:571) [druid-processing-0.13.0-incubating.jar:0.13.0-incubating]
    at org.apache.druid.segment.IndexIO.loadIndex(IndexIO.java:187) [druid-processing-0.13.0-incubating.jar:0.13.0-incubating]
    at org.apache.druid.cli.DumpSegment.run(DumpSegment.java:180) [druid-services-0.13.0-incubating.jar:0.13.0-incubating]
    at org.apache.druid.cli.Main.main(Main.java:118) [druid-services-0.13.0-incubating.jar:0.13.0-incubating]
2019-07-15T16:55:49,759 INFO [main] org.apache.druid.segment.CompressedPools - Allocating new littleEndByteBuf[1]
Exception in thread "main" java.lang.NullPointerException
    at org.apache.druid.segment.column.SimpleColumnHolder.getColumn(SimpleColumnHolder.java:68)
    at org.apache.druid.segment.QueryableIndexColumnSelectorFactory.lambda$makeColumnValueSelector$1(QueryableIndexColumnSelectorFactory.java:125)
    at java.util.HashMap.computeIfAbsent(HashMap.java:1118)
    at org.apache.druid.segment.QueryableIndexColumnSelectorFactory.makeColumnValueSelector(QueryableIndexColumnSelectorFactory.java:122)
    at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
    at java.util.Iterator.forEachRemaining(Iterator.java:116)
    at java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1801)
    at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
    at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
    at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
    at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
    at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:499)
    at org.apache.druid.cli.DumpSegment$2$1.apply(DumpSegment.java:280)
    at org.apache.druid.cli.DumpSegment$2$1.apply(DumpSegment.java:272)
    at org.apache.druid.java.util.common.guava.MappingAccumulator.accumulate(MappingAccumulator.java:40)
    at org.apache.druid.java.util.common.guava.FilteringAccumulator.accumulate(FilteringAccumulator.java:41)
    at org.apache.druid.java.util.common.guava.MappingAccumulator.accumulate(MappingAccumulator.java:40)
    at org.apache.druid.java.util.common.guava.BaseSequence.accumulate(BaseSequence.java:45)
    at org.apache.druid.java.util.common.guava.MappedSequence.accumulate(MappedSequence.java:43)
    at org.apache.druid.java.util.common.guava.WrappingSequence$1.get(WrappingSequence.java:50)
    at org.apache.druid.java.util.common.guava.SequenceWrapper.wrap(SequenceWrapper.java:55)
    at org.apache.druid.java.util.common.guava.WrappingSequence.accumulate(WrappingSequence.java:45)
    at org.apache.druid.java.util.common.guava.FilteredSequence.accumulate(FilteredSequence.java:45)
    at org.apache.druid.java.util.common.guava.MappedSequence.accumulate(MappedSequence.java:43)
    at org.apache.druid.cli.DumpSegment.evaluateSequenceForSideEffects(DumpSegment.java:494)
    at org.apache.druid.cli.DumpSegment.access$100(DumpSegment.java:103)
    at org.apache.druid.cli.DumpSegment$2.apply(DumpSegment.java:312)
    at org.apache.druid.cli.DumpSegment$2.apply(DumpSegment.java:265)
    at org.apache.druid.cli.DumpSegment.withOutputStream(DumpSegment.java:427)
    at org.apache.druid.cli.DumpSegment.runDump(DumpSegment.java:263)
    at org.apache.druid.cli.DumpSegment.run(DumpSegment.java:183)
    at org.apache.druid.cli.Main.main(Main.java:118)
My ingestion datasource has a quantilesDoublesSketch aggregation metric:
...
{
"name": "resp_time_his",
"fieldName": "resp_time",
"type": "quantilesDoublesSketch",
"k": 256
},
...
When I then ran the dump command, the exception occurred. The dump worked fine without the quantilesDoublesSketch aggregation metric.
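For what it's worth, the "Could not resolve type id" message lists only the core aggregator types, which suggests the druid-datasketches extension (which registers quantilesDoublesSketch / quantilesDoublesSketchMerge) was not loaded by the DumpSegment JVM. A sketch of an invocation that loads it, following the general shape of the documented dump-segment command (the lib path, segment directory, and output path below are placeholders for illustration):

```
java -classpath "/path/to/druid/lib/*" \
  -Ddruid.extensions.loadList='["druid-datasketches"]' \
  org.apache.druid.cli.Main tools dump-segment \
  --directory /path/to/segment/directory \
  --out /tmp/dump.txt
```

If the segment's metadata deserializes once the extension's AggregatorFactory subtypes are registered, that would also explain why segments without the sketch metric dump cleanly.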