davidradl commented on code in PR #130:
URL: https://github.com/apache/flink-connector-kafka/pull/130#discussion_r1843657844
##########
flink-connector-kafka/src/main/java/org/apache/flink/connector/kafka/sink/KafkaRecordSerializationSchemaBuilder.java:
##########
@@ -369,5 +420,46 @@ public ProducerRecord<byte[], byte[]> serialize(
                     value,
                     headerProvider != null ? headerProvider.getHeaders(element) : null);
         }
+
+        @Override
+        public Optional<KafkaDatasetFacet> getKafkaDatasetFacet() {
+            if (!(topicSelector instanceof KafkaDatasetIdentifierProvider)) {
+                LOG.info("Cannot identify topics. Not an TopicsIdentifierProvider");
+                return Optional.empty();
+            }
+
+            Optional<DefaultKafkaDatasetIdentifier> topicsIdentifier =
+                    ((KafkaDatasetIdentifierProvider) (topicSelector)).getDatasetIdentifier();
+
+            if (!topicsIdentifier.isPresent()) {
+                LOG.info("No topics' identifiers provided");
+                return Optional.empty();
+            }
+
+            return Optional.of(new DefaultKafkaDatasetFacet(topicsIdentifier.get()));
+        }
+
+        @Override
+        public Optional<TypeDatasetFacet> getTypeDatasetFacet() {
+            if (this.valueSerializationSchema instanceof ResultTypeQueryable) {
+                return Optional.of(
+                        new DefaultTypeDatasetFacet(
+                                ((ResultTypeQueryable<?>) this.valueSerializationSchema)
+                                        .getProducedType()));
+            } else {
+                // gets type information from serialize method signature
+                TypeToken serializationSchemaType =
+                        TypeToken.of(valueSerializationSchema.getClass());
+                Class parameterType =
+                        serializationSchemaType
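For context on the reflection fallback under discussion, a minimal, self-contained sketch of resolving a schema's element type from the serialize signature with Guava's TypeToken (plain Guava is assumed here; the class and method names are illustrative only, not the PR's actual code):

```java
import java.lang.reflect.Type;
import java.util.Optional;

import com.google.common.reflect.TypeToken;

import org.apache.flink.api.common.serialization.SerializationSchema;
import org.apache.flink.api.common.typeinfo.TypeInformation;

public class SerializationSchemaTypeResolver {

    /**
     * Resolves the element type of a concrete SerializationSchema implementation by resolving
     * the interface's type variable against the concrete class. Returns empty when nothing
     * better than Object can be resolved (e.g. for lambdas or raw implementations).
     */
    public static Optional<TypeInformation<?>> resolveValueType(SerializationSchema<?> schema) {
        TypeToken<?> schemaToken = TypeToken.of(schema.getClass());
        // The single type variable T declared by SerializationSchema<T>.
        Type typeVariable = SerializationSchema.class.getTypeParameters()[0];
        Class<?> elementClass = schemaToken.resolveType(typeVariable).getRawType();
        if (elementClass == Object.class) {
            return Optional.empty();
        }
        return Optional.of(TypeInformation.of(elementClass));
    }
}
```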
Review Comment:
Is there a way to avoid using this reflection (the instanceof checks and raw Class handling)? Maybe a config-driven approach using Java SPI would work. Connectors and formats already bring in serialization implementations this way, which avoids the overhead of reflection.
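To make that suggestion concrete, a rough sketch of SPI-style wiring with java.util.ServiceLoader; the TypeDatasetFacetProvider interface and its lookup below are hypothetical, not an existing Flink or connector API:

```java
import java.util.Optional;
import java.util.ServiceLoader;

import org.apache.flink.api.common.serialization.SerializationSchema;
import org.apache.flink.api.common.typeinfo.TypeInformation;

/**
 * Hypothetical SPI: a serialization schema or format registers an implementation of this
 * interface via META-INF/services, so the sink can ask for the produced type directly instead
 * of probing the schema with instanceof checks and reflection.
 */
interface TypeDatasetFacetProvider {

    /** Returns the element type for the given schema, or empty if this provider does not know it. */
    Optional<TypeInformation<?>> typeInformationFor(SerializationSchema<?> schema);
}

class TypeDatasetFacetResolver {

    /** Iterates all providers registered on the classpath and returns the first answer. */
    static Optional<TypeInformation<?>> resolve(SerializationSchema<?> schema) {
        for (TypeDatasetFacetProvider provider :
                ServiceLoader.load(TypeDatasetFacetProvider.class)) {
            Optional<TypeInformation<?>> type = provider.typeInformationFor(schema);
            if (type.isPresent()) {
                return type;
            }
        }
        return Optional.empty();
    }
}
```

A format could then ship a provider entry under META-INF/services alongside its SerializationSchema, and the sink would query registered providers rather than inspecting the schema reflectively.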
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]