[jira] [Created] (FLINK-33137) FLIP-312: Prometheus Sink Connector

2023-09-23 Thread Lorenzo Nicora (Jira)
Lorenzo Nicora created FLINK-33137:
--

 Summary: FLIP-312: Prometheus Sink Connector
 Key: FLINK-33137
 URL: https://issues.apache.org/jira/browse/FLINK-33137
 Project: Flink
  Issue Type: New Feature
Reporter: Lorenzo Nicora


Umbrella Jira for the implementation of the Prometheus Sink Connector:
https://cwiki.apache.org/confluence/display/FLINK/FLIP-312:+Prometheus+Sink+Connector



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Created] (FLINK-33138) Prometheus Connector Sink - DataStream API implementation

2023-09-23 Thread Lorenzo Nicora (Jira)
Lorenzo Nicora created FLINK-33138:
--

 Summary: Prometheus Connector Sink - DataStream API implementation
 Key: FLINK-33138
 URL: https://issues.apache.org/jira/browse/FLINK-33138
 Project: Flink
  Issue Type: Sub-task
Reporter: Lorenzo Nicora








[jira] [Created] (FLINK-33139) Prometheus Sink Connector - Table API support

2023-09-24 Thread Lorenzo Nicora (Jira)
Lorenzo Nicora created FLINK-33139:
--

 Summary: Prometheus Sink Connector - Table API support
 Key: FLINK-33139
 URL: https://issues.apache.org/jira/browse/FLINK-33139
 Project: Flink
  Issue Type: Sub-task
Reporter: Lorenzo Nicora








[jira] [Created] (FLINK-33140) Prometheus Sink Connector - E2E test

2023-09-24 Thread Lorenzo Nicora (Jira)
Lorenzo Nicora created FLINK-33140:
--

 Summary: Prometheus Sink Connector - E2E test
 Key: FLINK-33140
 URL: https://issues.apache.org/jira/browse/FLINK-33140
 Project: Flink
  Issue Type: Sub-task
Reporter: Lorenzo Nicora








[jira] [Created] (FLINK-33142) Prometheus Sink Connector - Update Documentation

2023-09-24 Thread Lorenzo Nicora (Jira)
Lorenzo Nicora created FLINK-33142:
--

 Summary: Prometheus Sink Connector - Update Documentation
 Key: FLINK-33142
 URL: https://issues.apache.org/jira/browse/FLINK-33142
 Project: Flink
  Issue Type: Sub-task
  Components: Documentation
Reporter: Lorenzo Nicora








[jira] [Created] (FLINK-33141) Prometheus Sink Connector - Amazon Managed Prometheus Request Signer

2023-09-24 Thread Lorenzo Nicora (Jira)
Lorenzo Nicora created FLINK-33141:
--

 Summary: Prometheus Sink Connector - Amazon Managed Prometheus 
Request Signer
 Key: FLINK-33141
 URL: https://issues.apache.org/jira/browse/FLINK-33141
 Project: Flink
  Issue Type: Sub-task
  Components: Connectors / AWS
Reporter: Lorenzo Nicora








[jira] [Created] (FLINK-33152) Prometheus Sink Connector - Integration tests

2023-09-25 Thread Lorenzo Nicora (Jira)
Lorenzo Nicora created FLINK-33152:
--

 Summary: Prometheus Sink Connector - Integration tests
 Key: FLINK-33152
 URL: https://issues.apache.org/jira/browse/FLINK-33152
 Project: Flink
  Issue Type: Sub-task
Reporter: Lorenzo Nicora


Integration tests against a containerised Prometheus.
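One way to run such tests is against a local Prometheus started with its remote-write receiver enabled, so the sink under test has an endpoint to push to. A hypothetical docker-compose fragment (the image tag and port mapping are assumptions, not part of this ticket):

{code}
services:
  prometheus:
    image: prom/prometheus:v2.47.0   # hypothetical pinned tag
    command:
      - "--config.file=/etc/prometheus/prometheus.yml"
      # Native remote_write receiver (Prometheus >= 2.33), so the sink
      # can push to http://localhost:9090/api/v1/write during the test.
      - "--web.enable-remote-write-receiver"
    ports:
      - "9090:9090"
{code}

The test would then assert on samples read back via the Prometheus query API.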





[jira] [Created] (FLINK-36008) Guidance for connector docs contributions

2024-08-08 Thread Lorenzo Nicora (Jira)
Lorenzo Nicora created FLINK-36008:
--

 Summary: Guidance for connector docs contributions
 Key: FLINK-36008
 URL: https://issues.apache.org/jira/browse/FLINK-36008
 Project: Flink
  Issue Type: Improvement
  Components: Documentation
Reporter: Lorenzo Nicora


The [Docs 
README](https://github.com/apache/flink/tree/master/docs#include-externally-hosted-documentation) 
has some guidance about generating docs for connectors. 
However, this only works for docs already in the official connector repo 
({{apache/flink-connector-}}). 

If you are working on a PR to the docs of an existing connector, or you are 
contributing a new connector, it is not at all obvious how to generate a 
preview of your WIP docs.





[jira] [Created] (FLINK-17486) ClassCastException when copying AVRO SpecificRecord containing a decimal field

2020-04-30 Thread Lorenzo Nicora (Jira)
Lorenzo Nicora created FLINK-17486:
--

 Summary: ClassCastException when copying AVRO SpecificRecord 
containing a decimal field
 Key: FLINK-17486
 URL: https://issues.apache.org/jira/browse/FLINK-17486
 Project: Flink
  Issue Type: Bug
  Components: Formats (JSON, Avro, Parquet, ORC, SequenceFile)
Affects Versions: 1.10.0
 Environment: Flink 1.10.0

AVRO 1.9.2

Java 1.8.0 (but also Java 14)

Scala binary 2.11
Reporter: Lorenzo Nicora


When consuming AVRO SpecificRecord containing a {{decimal}} (logical type) 
field from a Kafka source, copying the record fails with:

{{java.lang.ClassCastException: class java.math.BigDecimal cannot be cast to 
class java.nio.ByteBuffer}}

 

This code reproduces the problem:
{code:java}
AvroSerializer<Sample> serializer = new AvroSerializer<>(Sample.class);

Sample s1 = Sample.newBuilder()
  .setPrice(BigDecimal.valueOf(42.32))
  .setId("A12345")
  .build();

Sample s2 = serializer.copy(s1);
{code}

 

The AVRO SpecificRecord is generated using avro-maven-plugin from this IDL:
{code}
@namespace("example.avro")
protocol SampleProtocol {
  record Sample {
    string id;
    decimal(9,2) price;
    timestamp_ms eventTime;
  }
}
{code}
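For context on why the serializer ends up expecting a {{ByteBuffer}}: Avro's {{decimal}} logical type is physically encoded as the two's-complement big-endian bytes of the unscaled value, with the scale carried by the schema. A minimal, self-contained sketch of that conversion (class and method names are illustrative, not Avro's API):

```java
import java.math.BigDecimal;
import java.math.BigInteger;
import java.nio.ByteBuffer;

public class DecimalEncoding {

    // Encode like Avro's decimal logical type: the bytes of the unscaled value.
    public static ByteBuffer toBytes(BigDecimal value, int scale) {
        BigDecimal scaled = value.setScale(scale);
        return ByteBuffer.wrap(scaled.unscaledValue().toByteArray());
    }

    // Decode: rebuild the BigDecimal from the bytes plus the schema's scale.
    public static BigDecimal fromBytes(ByteBuffer buf, int scale) {
        byte[] bytes = new byte[buf.remaining()];
        buf.duplicate().get(bytes);
        return new BigDecimal(new BigInteger(bytes), scale);
    }

    public static void main(String[] args) {
        ByteBuffer encoded = toBytes(BigDecimal.valueOf(42.32), 2);
        System.out.println(fromBytes(encoded, 2)); // 42.32
    }
}
```

A conversion like this is what Avro's decimal conversion performs; the ClassCastException suggests the copy path hands the raw {{BigDecimal}} to code that expects the already-encoded {{ByteBuffer}}.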

 

The deepCopy of the record happens behind the scenes when attaching an 
AssignerWithPeriodicWatermarks to a Kafka source consuming AVRO SpecificRecord 
and using Confluent Schema Registry. The assigner extracts the event time from 
the record and enables bookmarking (not sure whether this is related).

A simplified version of the application is 
[here|https://github.com/nicusX/flink-avro-bug].

 

The problem looks similar to AVRO-1895, but that issue has been fixed since 
AVRO 1.8.2 (I'm using AVRO 1.9.2).

In fact, the following code, doing a deepCopy and relying only on AVRO, does work:

 

{code:java}
Sample s1 = Sample.newBuilder()
  .setPrice(BigDecimal.valueOf(42.32))
  .setId("A12345")
  .build();
Sample s2 = Sample.newBuilder(s1).build();
{code}

 

A simplified version of the Flink application causing the problem is 
[here|https://github.com/nicusX/flink-avro-bug/blob/master/src/main/java/example/StreamJob.java].

 

 



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Created] (FLINK-18223) AvroSerializer does not correctly instantiate GenericRecord

2020-06-09 Thread Lorenzo Nicora (Jira)
Lorenzo Nicora created FLINK-18223:
--

 Summary: AvroSerializer does not correctly instantiate 
GenericRecord
 Key: FLINK-18223
 URL: https://issues.apache.org/jira/browse/FLINK-18223
 Project: Flink
  Issue Type: Bug
  Components: Formats (JSON, Avro, Parquet, ORC, SequenceFile)
Affects Versions: 1.10.1
Reporter: Lorenzo Nicora


{{AvroSerializer.createInstance()}} simply calls 
{{InstantiationUtil.instantiate(type)}} to create a new instance, even when the 
type is {{GenericRecord}}.

This fails with an exception, because a {{GenericRecord}} must be instantiated 
through {{GenericRecordBuilder}}, but {{InstantiationUtil}} is not aware of it.
{code:java}
The class 'org.apache.avro.generic.GenericRecord' is not instantiable: The 
class is not a proper class. It is either abstract, an interface, or a 
primitive type.{code}
This can be proven with this test:
{code:java}
@Test
public void shouldInstantiateGenericRecord() {
    org.apache.avro.Schema SCHEMA = new org.apache.avro.Schema.Parser().parse(
        "{\"type\":\"record\",\"name\":\"Dummy\",\"namespace\":\"dummy\",\"fields\":[{\"name\":\"something\",\"type\":{\"type\":\"string\",\"avro.java.string\":\"String\"}}]}");
    AvroSerializer<GenericRecord> serializer = new AvroSerializer<>(GenericRecord.class, SCHEMA);

    serializer.createInstance();
}
{code}
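For comparison, a {{GenericRecord}} can be created through {{GenericRecordBuilder}}, which is schema-aware. A sketch (the simplified schema below is illustrative; requires avro on the classpath):

```java
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.generic.GenericRecordBuilder;

public class BuildGenericRecord {
    public static void main(String[] args) {
        Schema schema = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"Dummy\",\"namespace\":\"dummy\","
                + "\"fields\":[{\"name\":\"something\",\"type\":\"string\"}]}");
        // GenericRecord is an interface, so it cannot be instantiated by
        // reflection; the builder validates each field against the schema.
        GenericRecord record = new GenericRecordBuilder(schema)
            .set("something", "value")
            .build();
        System.out.println(record.get("something"));
    }
}
```

A fix in {{AvroSerializer.createInstance()}} would presumably need a similar schema-aware path instead of {{InstantiationUtil.instantiate(type)}}.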
 


