[jira] [Updated] (FLINK-35808) Let ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG be overridable by user in KafkaSourceBuilder

2024-07-11 Thread ASF GitHub Bot (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-35808?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

ASF GitHub Bot updated FLINK-35808:
---
Labels: pull-request-available  (was: )

> Let ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG be overridable by user in 
> KafkaSourceBuilder
> ---
>
> Key: FLINK-35808
> URL: https://issues.apache.org/jira/browse/FLINK-35808
> Project: Flink
>  Issue Type: Improvement
>  Components: Connectors / Kafka
>Affects Versions: kafka-3.2.0
>Reporter: Kevin Lam
>Assignee: Kevin Lam
>Priority: Minor
>  Labels: pull-request-available
>
> This issue is a follow-up to [this mailing list 
> discussion|https://lists.apache.org/thread/spl88o63sjm2dv4l5no0ym632d2yt2o6]. 
> I'd like to propose letting the 
> ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG be overridable by user in 
> KafkaSourceBuilder, as shown in this DRAFT PR:
>  
> [https://github.com/apache/flink-connector-kafka/pull/108]
>  
> From the PR description: 
> {quote}This allows users to easily implement the [{{claim check}} large 
> message 
> pattern|https://developer.confluent.io/patterns/event-processing/claim-check/]
>  without bringing any concerns into the Flink codebase otherwise, by 
> specifying a {{value.deserializer}} that handles it, but otherwise passes 
> through the bytes.
> Note: overriding {{value.serializer}} is already supported on the Producer side: 
> [https://github.com/apache/flink-connector-kafka/blob/15d3fbd4e65dae6c334e2386dd337d2bf423c216/flink-connector-kafka/src/main/java/org/apache/flink/connector/kafka/sink/KafkaSinkBuilder.java#L82-L83]
>  
> Other Reading:
> [https://www.kai-waehner.de/blog/2020/08/07/apache-kafka-handling-large-messages-and-files-for-image-video-audio-processing/]
> [https://www.conduktor.io/kafka/how-to-send-large-messages-in-apache-kafka/#Option-1:-using-an-external-store-(GB-size-messages)-0]
> {quote}
>  
> What do folks think? If it seems reasonable I can follow the steps in the 
> [contribution guide|https://flink.apache.org/how-to-contribute/overview/].  
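
For illustration, a minimal sketch of such a pass-through {{value.deserializer}} (not taken from the PR; the {{claim-check:}} prefix convention and the {{fetchFromObjectStore}} helper are hypothetical placeholders for whatever external store is used):

{code:java}
import java.nio.charset.StandardCharsets;
import java.util.Map;
import org.apache.kafka.common.serialization.Deserializer;

/**
 * Claim-check aware value deserializer: if the record value is a reference to
 * an externally stored payload, fetch the real bytes; otherwise pass the raw
 * bytes through unchanged, so Flink's DeserializationSchema still receives
 * plain byte[] values.
 */
public class ClaimCheckByteArrayDeserializer implements Deserializer<byte[]> {

    private static final String CLAIM_CHECK_PREFIX = "claim-check:";

    @Override
    public void configure(Map<String, ?> configs, boolean isKey) {
        // e.g. read object-store credentials from the consumer properties
    }

    @Override
    public byte[] deserialize(String topic, byte[] data) {
        if (data == null) {
            return null;
        }
        String maybeReference = new String(data, StandardCharsets.UTF_8);
        if (maybeReference.startsWith(CLAIM_CHECK_PREFIX)) {
            // Hypothetical helper: download the large payload the reference points at.
            return fetchFromObjectStore(maybeReference.substring(CLAIM_CHECK_PREFIX.length()));
        }
        // Small messages pass through untouched.
        return data;
    }

    private byte[] fetchFromObjectStore(String reference) {
        throw new UnsupportedOperationException("wire up an object-store client here");
    }
}
{code}

With the override proposed in the PR, wiring the deserializer in would be a single property on the builder (sketch only; today the builder forces {{ByteArrayDeserializer}} for this key, which is exactly what this issue proposes to relax):

{code:java}
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.kafka.clients.consumer.ConsumerConfig;

KafkaSource<String> source =
    KafkaSource.<String>builder()
        .setBootstrapServers("broker:9092")
        .setTopics("large-messages")
        .setGroupId("claim-check-demo")
        // Flink-side deserialization still operates on the (resolved) bytes.
        .setValueOnlyDeserializer(new SimpleStringSchema())
        // User-supplied Kafka-level deserializer that resolves claim checks.
        .setProperty(
            ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG,
            ClaimCheckByteArrayDeserializer.class.getName())
        .build();
{code}

This mirrors the producer side, where {{KafkaSinkBuilder#setProperty}} can already be used to swap in a claim-check aware {{value.serializer}}.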



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Updated] (FLINK-35808) Let ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG be overridable by user in KafkaSourceBuilder

2024-07-10 Thread Kevin Lam (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-35808?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kevin Lam updated FLINK-35808:
--
Description: 
This issue is a follow-up to [this mailing list 
discussion|https://lists.apache.org/thread/spl88o63sjm2dv4l5no0ym632d2yt2o6]. 

I'd like to propose letting the ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG 
be overridable by user in KafkaSourceBuilder, as shown in this DRAFT PR:
 
[https://github.com/apache/flink-connector-kafka/pull/108]
 
From the PR description: 
{quote}This allows users to easily implement the [{{claim check}} large message 
pattern|https://developer.confluent.io/patterns/event-processing/claim-check/] 
without bringing any concerns into the Flink codebase otherwise, by specifying 
a {{value.deserializer}} that handles it, but otherwise passes through the 
bytes.
Note: overriding {{value.serializer}} is already supported on the Producer side: 
[https://github.com/apache/flink-connector-kafka/blob/15d3fbd4e65dae6c334e2386dd337d2bf423c216/flink-connector-kafka/src/main/java/org/apache/flink/connector/kafka/sink/KafkaSinkBuilder.java#L82-L83]

 
Other Reading:
[https://www.kai-waehner.de/blog/2020/08/07/apache-kafka-handling-large-messages-and-files-for-image-video-audio-processing/]

[https://www.conduktor.io/kafka/how-to-send-large-messages-in-apache-kafka/#Option-1:-using-an-external-store-(GB-size-messages)-0]
{quote}
 
What do folks think? If it seems reasonable I can follow the steps in the 
[contribution guide|https://flink.apache.org/how-to-contribute/overview/].  

  was:
This issue is a follow-up to [this mailing list 
discussion|https://lists.apache.org/thread/spl88o63sjm2dv4l5no0ym632d2yt2o6]. 

I'd like to propose letting the ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG 
be overridable by user in KafkaSourceBuilder, as shown in this DRAFT PR:
 
[https://github.com/apache/flink-connector-kafka/pull/108]
 
From the PR description: 
{quote}This allows users to easily implement the [{{claim check}} large message 
pattern|https://developer.confluent.io/patterns/event-processing/claim-check/] 
without bringing any concerns into the Flink codebase otherwise, by specifying 
a {{value.deserializer}} that handles it, but otherwise passes through the 
bytes.
Note: [overriding value.serializer is already supported on the Producer side|https://github.com/apache/flink-connector-kafka/blob/15d3fbd4e65dae6c334e2386dd337d2bf423c216/flink-connector-kafka/src/main/java/org/apache/flink/connector/kafka/sink/KafkaSinkBuilder.java#L82-L83]

 
Other Reading:
[https://www.kai-waehner.de/blog/2020/08/07/apache-kafka-handling-large-messages-and-files-for-image-video-audio-processing/]
[https://www.conduktor.io/kafka/how-to-send-large-messages-in-apache-kafka/#Option-1:-using-an-external-store-(GB-size-messages)-0]
{quote}
 
What do folks think? If it seems reasonable I can follow the steps in the 
[contribution guide|https://flink.apache.org/how-to-contribute/overview/].  


> Let ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG be overridable by user in 
> KafkaSourceBuilder
> ---
>
> Key: FLINK-35808
> URL: https://issues.apache.org/jira/browse/FLINK-35808
> Project: Flink
>  Issue Type: Improvement
>  Components: Connectors / Kafka
>Affects Versions: kafka-3.2.0
>Reporter: Kevin Lam
>Priority: Minor
>
> This issue is a follow-up to [this mailing list 
> discussion|https://lists.apache.org/thread/spl88o63sjm2dv4l5no0ym632d2yt2o6]. 
> I'd like to propose letting the 
> ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG be overridable by user in 
> KafkaSourceBuilder, as shown in this DRAFT PR:
>  
> [https://github.com/apache/flink-connector-kafka/pull/108]
>  
> From the PR description: 
> {quote}This allows users to easily implement the [{{claim check}} large 
> message 
> pattern|https://developer.confluent.io/patterns/event-processing/claim-check/]
>  without bringing any concerns into the Flink codebase otherwise, by 
> specifying a {{value.deserializer}} that handles it, but otherwise passes 
> through the bytes.
> Note: overriding value.serializer is already supported on the Producer side: 
> |[https://github.com/apache/flink-connector-kafka/blob/15d3fbd4e65dae6c334e2386dd337d2bf423c216/flink-connector-kafka/src/main/java/org/apache/flink/connector/kafka/sink/KafkaSinkBuilder.java#L82-L83]|
>  
> Other Reading:
> [https://www.kai-waehner.de/blog/2020/08/07/apache-kafka-handling-large-messages-and-files-for-image-video-audio-processing/]
> [https://www.conduktor.io/kafka/how-to-send-large-messages-in-apache-kafka/#Option-1:-using-an-external-store-(GB-size-messages)-0]
> {quote}
>  
> What do folks think? If it seems reasonable I can follow the steps in the 
> [contribution guide|https://flink.apache.org/how-to-contribute/overview/].

[jira] [Updated] (FLINK-35808) Let ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG be overridable by user in KafkaSourceBuilder

2024-07-10 Thread Kevin Lam (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-35808?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kevin Lam updated FLINK-35808:
--
Description: 
This issue is a follow-up to [this mailing list 
discussion|https://lists.apache.org/thread/spl88o63sjm2dv4l5no0ym632d2yt2o6]. 

I'd like to propose letting the ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG 
be overridable by user in KafkaSourceBuilder, as shown in this DRAFT PR:
 
[https://github.com/apache/flink-connector-kafka/pull/108]
 
From the PR description: 
{quote}This allows users to easily implement the [{{claim check}} large message 
pattern|https://developer.confluent.io/patterns/event-processing/claim-check/] 
without bringing any concerns into the Flink codebase otherwise, by specifying 
a {{value.deserializer}} that handles it, but otherwise passes through the 
bytes.
Note: [overriding value.serializer is already supported on the Producer side|https://github.com/apache/flink-connector-kafka/blob/15d3fbd4e65dae6c334e2386dd337d2bf423c216/flink-connector-kafka/src/main/java/org/apache/flink/connector/kafka/sink/KafkaSinkBuilder.java#L82-L83]

 
Other Reading:
[https://www.kai-waehner.de/blog/2020/08/07/apache-kafka-handling-large-messages-and-files-for-image-video-audio-processing/]
[https://www.conduktor.io/kafka/how-to-send-large-messages-in-apache-kafka/#Option-1:-using-an-external-store-(GB-size-messages)-0]
{quote}
 
What do folks think? If it seems reasonable I can follow the steps in the 
[contribution guide|https://flink.apache.org/how-to-contribute/overview/].  

  was:
This issue is a follow-up to [this mailing list 
discussion|https://lists.apache.org/thread/spl88o63sjm2dv4l5no0ym632d2yt2o6]. 

I'd like to propose letting the ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG 
be overridable by user in KafkaSourceBuilder, as shown in this DRAFT PR:
 
[https://github.com/apache/flink-connector-kafka/pull/108]
 
From the PR description: 
{quote}This allows users to easily implement the [{{claim check}} large message 
pattern|https://developer.confluent.io/patterns/event-processing/claim-check/] 
without bringing any concerns into the Flink codebase otherwise, by specifying 
a {{value.deserializer}} that handles it, but otherwise passes through the 
bytes.
Note: [overriding value.serializer is already supported on the Producer side|https://github.com/apache/flink-connector-kafka/blob/15d3fbd4e65dae6c334e2386dd337d2bf423c216/flink-connector-kafka/src/main/java/org/apache/flink/connector/kafka/sink/KafkaSinkBuilder.java#L82-L83]


 
Other Reading:
[https://www.kai-waehner.de/blog/2020/08/07/apache-kafka-handling-large-messages-and-files-for-image-video-audio-processing/]
[https://www.conduktor.io/kafka/how-to-send-large-messages-in-apache-kafka/#Option-1:-using-an-external-store-(GB-size-messages)-0]
{quote}
 
What do folks think? If it seems reasonable I can follow the steps in the 
[contribution guide|https://flink.apache.org/how-to-contribute/overview/].  


> Let ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG be overridable by user in 
> KafkaSourceBuilder
> ---
>
> Key: FLINK-35808
> URL: https://issues.apache.org/jira/browse/FLINK-35808
> Project: Flink
>  Issue Type: Improvement
>  Components: Connectors / Kafka
>Affects Versions: kafka-3.2.0
>Reporter: Kevin Lam
>Priority: Minor
>
> This issue is a follow-up to [this mailing list 
> discussion|https://lists.apache.org/thread/spl88o63sjm2dv4l5no0ym632d2yt2o6]. 
> I'd like to propose letting the 
> ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG be overridable by user in 
> KafkaSourceBuilder, as shown in this DRAFT PR:
>  
> [https://github.com/apache/flink-connector-kafka/pull/108]
>  
> From the PR description: 
> {quote}This allows users to easily implement the [{{claim check}} large 
> message 
> pattern|https://developer.confluent.io/patterns/event-processing/claim-check/]
>  without bringing any concerns into the Flink codebase otherwise, by 
> specifying a {{value.deserializer}} that handles it, but otherwise passes 
> through the bytes.
> Note: [overriding value.serializer is already supported on the Producer side|https://github.com/apache/flink-connector-kafka/blob/15d3fbd4e65dae6c334e2386dd337d2bf423c216/flink-connector-kafka/src/main/java/org/apache/flink/connector/kafka/sink/KafkaSinkBuilder.java#L82-L83]
>  
> Other Reading:
> [https://www.kai-waehner.de/blog/2020/08/07/apache-kafka-handling-large-messages-and-files-for-image-video-audio-processing/]
> [https://www.conduktor.io/kafka/how-to-send-large-messages-in-apache-kafka/#Option-1:-using-an-external-store-(GB-size-messages)-0]
> {quote}
>  
> What do folks think? If it seems reasonable I can follow the steps in the 
> [contribution guide|https://flink.apache.org/how-to-contribute/overview/].

[jira] [Updated] (FLINK-35808) Let ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG be overridable by user in KafkaSourceBuilder

2024-07-10 Thread Kevin Lam (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-35808?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kevin Lam updated FLINK-35808:
--
Description: 
This issue is a follow-up to [this mailing list 
discussion|https://lists.apache.org/thread/spl88o63sjm2dv4l5no0ym632d2yt2o6]. 

I'd like to propose letting the ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG 
be overridable by user in KafkaSourceBuilder, as shown in this DRAFT PR:
 
[https://github.com/apache/flink-connector-kafka/pull/108]
 
From the PR description: 
{quote}This allows users to easily implement the [{{claim check}} large message 
pattern|https://developer.confluent.io/patterns/event-processing/claim-check/] 
without bringing any concerns into the Flink codebase otherwise, by specifying 
a {{value.deserializer}} that handles it, but otherwise passes through the 
bytes.
Note: [overriding value.serializer is already supported on the Producer side|https://github.com/apache/flink-connector-kafka/blob/15d3fbd4e65dae6c334e2386dd337d2bf423c216/flink-connector-kafka/src/main/java/org/apache/flink/connector/kafka/sink/KafkaSinkBuilder.java#L82-L83]


 
Other Reading:
[https://www.kai-waehner.de/blog/2020/08/07/apache-kafka-handling-large-messages-and-files-for-image-video-audio-processing/]
[https://www.conduktor.io/kafka/how-to-send-large-messages-in-apache-kafka/#Option-1:-using-an-external-store-(GB-size-messages)-0]
{quote}
 
What do folks think? If it seems reasonable I can follow the steps in the 
[contribution guide|https://flink.apache.org/how-to-contribute/overview/].  

  was:
This issue is a follow-up to [this mailing list 
discussion|https://lists.apache.org/thread/spl88o63sjm2dv4l5no0ym632d2yt2o6]. 

I'd like to propose letting the ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG 
be overridable by user in KafkaSourceBuilder, as shown in this DRAFT PR:
 
[https://github.com/apache/flink-connector-kafka/pull/108]
 
From the PR description: 
{quote}This allows users to easily implement the [{{claim check}} large message 
pattern|https://developer.confluent.io/patterns/event-processing/claim-check/] 
without bringing any concerns into the Flink codebase otherwise, by specifying 
a {{value.deserializer}} that handles it, but otherwise passes through the 
bytes.
Note: [overriding {{value.serializer}} is already supported on the Producer side|https://github.com/apache/flink-connector-kafka/blob/15d3fbd4e65dae6c334e2386dd337d2bf423c216/flink-connector-kafka/src/main/java/org/apache/flink/connector/kafka/sink/KafkaSinkBuilder.java#L82-L83]
 
Other Reading:
[https://www.kai-waehner.de/blog/2020/08/07/apache-kafka-handling-large-messages-and-files-for-image-video-audio-processing/]
[https://www.conduktor.io/kafka/how-to-send-large-messages-in-apache-kafka/#Option-1:-using-an-external-store-(GB-size-messages)-0]
{quote}
 
What do folks think? If it seems reasonable I can follow the steps in the 
[contribution guide|https://flink.apache.org/how-to-contribute/overview/].  


> Let ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG be overridable by user in 
> KafkaSourceBuilder
> ---
>
> Key: FLINK-35808
> URL: https://issues.apache.org/jira/browse/FLINK-35808
> Project: Flink
>  Issue Type: Improvement
>  Components: Connectors / Kafka
>Affects Versions: kafka-3.2.0
>Reporter: Kevin Lam
>Priority: Minor
>
> This issue is a follow-up to [this mailing list 
> discussion|https://lists.apache.org/thread/spl88o63sjm2dv4l5no0ym632d2yt2o6]. 
> I'd like to propose letting the 
> ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG be overridable by user in 
> KafkaSourceBuilder, as shown in this DRAFT PR:
>  
> [https://github.com/apache/flink-connector-kafka/pull/108]
>  
> From the PR description: 
> {quote}This allows users to easily implement the [{{claim check}} large 
> message 
> pattern|https://developer.confluent.io/patterns/event-processing/claim-check/]
>  without bringing any concerns into the Flink codebase otherwise, by 
> specifying a {{value.deserializer}} that handles it, but otherwise passes 
> through the bytes.
> Note: [overriding value.serializer is already supported on the Producer side|https://github.com/apache/flink-connector-kafka/blob/15d3fbd4e65dae6c334e2386dd337d2bf423c216/flink-connector-kafka/src/main/java/org/apache/flink/connector/kafka/sink/KafkaSinkBuilder.java#L82-L83]
>  
> Other Reading:
> [https://www.kai-waehner.de/blog/2020/08/07/apache-kafka-handling-large-messages-and-files-for-image-video-audio-processing/]
> [https://www.conduktor.io/kafka/how-to-send-large-messages-in-apache-kafka/#Option-1:-using-an-external-store-(GB-size-messages)-0]

[jira] [Updated] (FLINK-35808) Let ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG be overridable by user in KafkaSourceBuilder

2024-07-10 Thread Kevin Lam (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-35808?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kevin Lam updated FLINK-35808:
--
Description: 
This issue is a follow-up to [this mailing list 
discussion|https://lists.apache.org/thread/spl88o63sjm2dv4l5no0ym632d2yt2o6]. 

I'd like to propose letting the ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG 
be overridable by user in KafkaSourceBuilder, as shown in this DRAFT PR:
 
[https://github.com/apache/flink-connector-kafka/pull/108]
 
From the PR description: 
{quote}This allows users to easily implement the [{{claim check}} large message 
pattern|https://developer.confluent.io/patterns/event-processing/claim-check/] 
without bringing any concerns into the Flink codebase otherwise, by specifying 
a {{value.deserializer}} that handles it, but otherwise passes through the 
bytes.
Note: [overriding {{value.serializer}} is already supported on the Producer side|https://github.com/apache/flink-connector-kafka/blob/15d3fbd4e65dae6c334e2386dd337d2bf423c216/flink-connector-kafka/src/main/java/org/apache/flink/connector/kafka/sink/KafkaSinkBuilder.java#L82-L83]
 
Other Reading:
[https://www.kai-waehner.de/blog/2020/08/07/apache-kafka-handling-large-messages-and-files-for-image-video-audio-processing/]
[https://www.conduktor.io/kafka/how-to-send-large-messages-in-apache-kafka/#Option-1:-using-an-external-store-(GB-size-messages)-0]
{quote}
 
What do folks think? If it seems reasonable I can follow the steps in the 
[contribution guide|https://flink.apache.org/how-to-contribute/overview/].  

  was:
This issue is a follow-up to [this mailing list 
discussion|https://lists.apache.org/thread/spl88o63sjm2dv4l5no0ym632d2yt2o6]. 

I'd like to propose letting the ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG 
be overridable by user in KafkaSourceBuilder, as shown in this DRAFT PR:
 
https://github.com/apache/flink-connector-kafka/pull/108
 
From the PR description: 
{quote}This allows users to easily implement the [{{claim check}} large message 
pattern|https://developer.confluent.io/patterns/event-processing/claim-check/] 
without bringing any concerns into the Flink codebase otherwise, by specifying 
a {{value.deserializer}} that handles it, but otherwise passes through the 
bytes.
Note: [overriding {{value.serializer}} is already supported on the Producer side|https://github.com/apache/flink-connector-kafka/blob/15d3fbd4e65dae6c334e2386dd337d2bf423c216/flink-connector-kafka/src/main/java/org/apache/flink/connector/kafka/sink/KafkaSinkBuilder.java#L82-L83]
Other Reading:
[https://www.kai-waehner.de/blog/2020/08/07/apache-kafka-handling-large-messages-and-files-for-image-video-audio-processing/]
[https://www.conduktor.io/kafka/how-to-send-large-messages-in-apache-kafka/#Option-1:-using-an-external-store-(GB-size-messages)-0]
{quote}
 
What do folks think? If it seems reasonable I can follow the steps in the 
[contribution guide|https://flink.apache.org/how-to-contribute/overview/].  


> Let ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG be overridable by user in 
> KafkaSourceBuilder
> ---
>
> Key: FLINK-35808
> URL: https://issues.apache.org/jira/browse/FLINK-35808
> Project: Flink
>  Issue Type: Improvement
>  Components: Connectors / Kafka
>Affects Versions: kafka-3.2.0
>Reporter: Kevin Lam
>Priority: Minor
>
> This issue is a follow-up to [this mailing list 
> discussion|https://lists.apache.org/thread/spl88o63sjm2dv4l5no0ym632d2yt2o6]. 
> I'd like to propose letting the 
> ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG be overridable by user in 
> KafkaSourceBuilder, as shown in this DRAFT PR:
>  
> [https://github.com/apache/flink-connector-kafka/pull/108]
>  
> From the PR description: 
> {quote}This allows users to easily implement the [{{claim check}} large 
> message 
> pattern|https://developer.confluent.io/patterns/event-processing/claim-check/]
>  without bringing any concerns into the Flink codebase otherwise, by 
> specifying a {{value.deserializer}} that handles it, but otherwise passes 
> through the bytes.
> Note: [overriding {{value.serializer}} is already supported on the Producer side|https://github.com/apache/flink-connector-kafka/blob/15d3fbd4e65dae6c334e2386dd337d2bf423c216/flink-connector-kafka/src/main/java/org/apache/flink/connector/kafka/sink/KafkaSinkBuilder.java#L82-L83]