[jira] [Commented] (FLINK-10460) DataDog reporter JsonMappingException
[ https://issues.apache.org/jira/browse/FLINK-10460?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16753231#comment-16753231 ] Elias Levy commented on FLINK-10460: [~lining] as you can tell from the backtrace, that is not a user metric. Rather, it appears to be a Kafka metric gathered in KafkaMetricWrapper.

> DataDog reporter JsonMappingException
> -------------------------------------
>
> Key: FLINK-10460
> URL: https://issues.apache.org/jira/browse/FLINK-10460
> Project: Flink
> Issue Type: Improvement
> Components: Metrics
> Affects Versions: 1.4.2
> Reporter: Elias Levy
> Priority: Minor
> Attachments: image-2019-01-24-16-00-56-280.png
>
> Observed the following error in the TM logs this morning:
> {code:java}
> WARN org.apache.flink.metrics.datadog.DatadogHttpReporter - Failed reporting metrics to Datadog.
> org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.JsonMappingException: (was java.util.ConcurrentModificationException) (through reference chain: org.apache.flink.metrics.datadog.DSeries["series"]->java.util.ArrayList[88]->org.apache.flink.metrics.datadog.DGauge["points"])
>     at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.JsonMappingException.wrapWithPath(JsonMappingException.java:379)
>     at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.JsonMappingException.wrapWithPath(JsonMappingException.java:339)
>     at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ser.std.StdSerializer.wrapAndThrow(StdSerializer.java:342)
>     at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ser.std.BeanSerializerBase.serializeFields(BeanSerializerBase.java:686)
>     at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ser.BeanSerializer.serialize(BeanSerializer.java:157)
>     at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ser.impl.IndexedListSerializer.serializeContents(IndexedListSerializer.java:119)
>     at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ser.impl.IndexedListSerializer.serialize(IndexedListSerializer.java:79)
>     at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ser.impl.IndexedListSerializer.serialize(IndexedListSerializer.java:18)
>     at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ser.BeanPropertyWriter.serializeAsField(BeanPropertyWriter.java:672)
>     at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ser.std.BeanSerializerBase.serializeFields(BeanSerializerBase.java:678)
>     at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ser.BeanSerializer.serialize(BeanSerializer.java:157)
>     at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ser.DefaultSerializerProvider.serializeValue(DefaultSerializerProvider.java:130)
>     at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._configAndWriteValue(ObjectMapper.java:3631)
>     at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.writeValueAsString(ObjectMapper.java:2998)
>     at org.apache.flink.metrics.datadog.DatadogHttpClient.serialize(DatadogHttpClient.java:90)
>     at org.apache.flink.metrics.datadog.DatadogHttpClient.send(DatadogHttpClient.java:79)
>     at org.apache.flink.metrics.datadog.DatadogHttpReporter.report(DatadogHttpReporter.java:143)
>     at org.apache.flink.runtime.metrics.MetricRegistryImpl$ReporterTask.run(MetricRegistryImpl.java:417)
>     at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
>     at java.util.concurrent.FutureTask.runAndReset(Unknown Source)
>     at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(Unknown Source)
>     at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(Unknown Source)
>     at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
>     at java.lang.Thread.run(Unknown Source)
> Caused by: java.util.ConcurrentModificationException
>     at java.util.LinkedHashMap$LinkedHashIterator.nextNode(Unknown Source)
>     at java.util.LinkedHashMap$LinkedKeyIterator.next(Unknown Source)
>     at java.util.AbstractCollection.addAll(Unknown Source)
>     at java.util.HashSet.<init>(Unknown Source)
>     at org.apache.kafka.common.internals.PartitionStates.partitionSet(PartitionStates.java:65)
>     at org.apache.kafka.clients.consumer.internals.SubscriptionState.assignedPartitions(SubscriptionState.java:298)
>     at org.apache.kafka.clients.consumer.internals.ConsumerCoordinator$ConsumerCoordinatorMetrics$1.measure(ConsumerCoordina
> {code}
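The `Caused by` section of the trace shows Kafka's `PartitionStates.partitionSet()` building a `HashSet` from a `LinkedHashMap` key set while the consumer thread is still mutating that map. As an illustration only (this is not Flink or Kafka code, and the topic names are made up), a minimal single-threaded sketch reproduces the same failure mode by structurally modifying a `LinkedHashMap` mid-iteration:

```java
import java.util.ConcurrentModificationException;
import java.util.LinkedHashMap;
import java.util.Map;

public class CmeSketch {
    // Structurally modifies a LinkedHashMap while iterating its key set --
    // the same failure mode as copying the map into a HashSet while another
    // thread mutates it. Returns true if the iterator detected the change.
    static boolean removeWhileIterating() {
        Map<String, Integer> assigned = new LinkedHashMap<>();
        assigned.put("topic-0", 0); // hypothetical partition names
        assigned.put("topic-1", 1);
        try {
            for (String partition : assigned.keySet()) {
                assigned.remove(partition); // mutation mid-iteration
            }
        } catch (ConcurrentModificationException e) {
            return true; // LinkedHashIterator.nextNode() saw a stale modCount
        }
        return false;
    }

    public static void main(String[] args) {
        System.out.println("CME thrown: " + removeWhileIterating());
    }
}
```

In the real report the mutation comes from another thread, so the exception is intermittent rather than deterministic as it is here.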
[jira] [Commented] (FLINK-10460) DataDog reporter JsonMappingException
[ https://issues.apache.org/jira/browse/FLINK-10460?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16750840#comment-16750840 ] lining commented on FLINK-10460: [~elevy] The code states `Will throw exception if the Gauge is not of Number type`. Can you post anything about the metric?
[jira] [Commented] (FLINK-10460) DataDog reporter JsonMappingException
[ https://issues.apache.org/jira/browse/FLINK-10460?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16750837#comment-16750837 ] lining commented on FLINK-10460: [~phoenixjiangnan] See the code in org.apache.flink.metrics.datadog.DatadogHttpReporter#report: when a metric throws an exception, the key is removed, which will cause the java.util.ConcurrentModificationException.
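Whatever the exact trigger, the reporter side can only harden itself by not serializing live metric objects. A hypothetical mitigation sketch (this is not the actual DatadogHttpReporter code; the gauge names and `Supplier`-based gauge shape are assumptions for illustration) is to read every gauge once into a stable snapshot and drop any gauge whose read fails for that cycle:

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.function.Supplier;

public class GaugeSnapshot {
    // Hypothetical sketch: read each gauge exactly once, up front, keeping
    // only the values that could be read without throwing. Serialization
    // then works on this stable snapshot instead of live metric objects.
    static Map<String, Number> snapshot(Map<String, Supplier<Number>> gauges) {
        Map<String, Number> out = new LinkedHashMap<>();
        for (Map.Entry<String, Supplier<Number>> e : gauges.entrySet()) {
            try {
                out.put(e.getKey(), e.getValue().get());
            } catch (RuntimeException ex) {
                // e.g. ConcurrentModificationException from a Kafka-backed
                // gauge: skip this data point for this report cycle
            }
        }
        return out;
    }

    public static void main(String[] args) {
        Map<String, Supplier<Number>> gauges = new LinkedHashMap<>();
        gauges.put("ok", () -> 42);
        gauges.put("flaky", () -> {
            throw new java.util.ConcurrentModificationException();
        });
        System.out.println(snapshot(gauges)); // only the readable gauge survives
    }
}
```

The cost is that a flaky gauge silently loses a data point per cycle, but the whole series is no longer dropped because one gauge misbehaved mid-serialization.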