Re: HadoopIndexer job with input as the Druid datasource and configured segments table

2019-04-16 Thread Samarth Jain
You are right about the code path, Jihoon. Is there a way to provide the "druid.metadata.storage.tables" config in the ingestion spec itself? Clearly providing it as part of metadataUpdateSpec doesn't work. Or does it have to be in some properties file that is in the classpath of the indexer job?
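
[Editor's sketch: for context, this is roughly the shape of the metadataUpdateSpec being discussed, with field names as in the Druid Hadoop batch ingestion docs; the storage type, connection details, and table name below are hypothetical placeholders, and per this thread the segmentTable setting here is apparently not picked up:

    "metadataUpdateSpec" : {
      "type" : "mysql",
      "connectURI" : "jdbc:mysql://metadata-host:3306/druid",
      "user" : "druid",
      "password" : "...",
      "segmentTable" : "custom_segments"
    }
]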

Re: Domain-driven Observability

2019-04-16 Thread Gian Merlino
Hmm, interesting read. Some stuff like this seems to have evolved organically - I'd say TaskRealtimeMetricsMonitor is an example of this technique. It seems like a reasonable pattern when you're doing multiple different kinds of instrumentation (like, both logs and metrics), if the thing you're in…

Re: HadoopIndexer job with input as the Druid datasource and configured segments table

2019-04-16 Thread Jihoon Son
From the code here: https://github.com/apache/incubator-druid/blob/master/indexing-service/src/main/java/org/apache/druid/indexing/overlord/ForkingTaskRunner.java#L340-L353, I think you can put "druid.indexer.fork.property.druid.metadata.storage.tables.segments" in the task context. I haven't tested it…
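
[Editor's sketch: the suggestion translates to a task context entry like the following, where the table name is a hypothetical placeholder:

    "context" : {
      "druid.indexer.fork.property.druid.metadata.storage.tables.segments" : "custom_segments"
    }

If the linked ForkingTaskRunner code applies here, it strips the "druid.indexer.fork.property." prefix and should pass the remainder to the peon JVM as -Ddruid.metadata.storage.tables.segments=custom_segments.]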

Re: HadoopIndexer job with input as the Druid datasource and configured segments table

2019-04-16 Thread Samarth Jain
I am not sure if there is a way to provide a taskContext for Hadoop-based ingestion. I will take a look into fixing this issue properly.

Re: HadoopIndexer job with input as the Druid datasource and configured segments table

2019-04-16 Thread Jihoon Son
It looks like it's not documented, but it should work: https://github.com/apache/incubator-druid/blob/master/indexing-service/src/main/java/org/apache/druid/indexing/common/task/HadoopIndexTask.java#L146 . You can simply put context as below:

    "context" : {
      "druid.indexer.fork.property.druid.metadata.storage.tables.segments" : …
    }
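
[Editor's sketch: for completeness, this is roughly where that context block would sit in a full HadoopIndexTask payload; the datasource and table name are hypothetical placeholders, and the elided sections follow the usual dataSchema/ioConfig/tuningConfig layout:

    {
      "type" : "index_hadoop",
      "spec" : {
        "dataSchema" : { "dataSource" : "example_datasource", … },
        "ioConfig" : { "type" : "hadoop", … },
        "tuningConfig" : { "type" : "hadoop" }
      },
      "context" : {
        "druid.indexer.fork.property.druid.metadata.storage.tables.segments" : "custom_segments"
      }
    }
]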