Re: [VOTE] KIP-991: Allow DropHeaders SMT to drop headers by wildcard/regexp

2024-03-09 Thread Roman Schmitz
Hi all,

a gentle reminder on KIP-991 - any thoughts/feedback/votes on this?
If you'd like to re-open the discussion, please let me know.

Thanks,
Roman

On Mon, 15 Jan 2024 at 13:01, Roman Schmitz <
roman.schm...@gmail.com> wrote:

> Hi all,
>
> Thank you for your feedback on the suggested KIP so far.
> As there seem to be no new updates or suggestions, I'd like to start a vote
> on the (SMT) KIP-991.
>
> Please have a look at KIP-991
> <https://cwiki.apache.org/confluence/x/oYtEE>.
>
> Thanks,
> Roman
>


[VOTE] KIP-991: Allow DropHeaders SMT to drop headers by wildcard/regexp

2024-01-15 Thread Roman Schmitz
Hi all,

Thank you for your feedback on the suggested KIP so far.
As there seem to be no new updates or suggestions, I'd like to start a vote
on the (SMT) KIP-991.

Please have a look at KIP-991: https://cwiki.apache.org/confluence/x/oYtEE

Thanks,
Roman


Re: KIP-991: Allow DropHeaders SMT to drop headers by wildcard/regexp

2024-01-12 Thread Roman Schmitz
Hi,

Thanks, 100% agree, I fixed the description.

Thanks,
Roman

On Thu, 11 Jan 2024 at 18:27, Mickael Maison <
mickael.mai...@gmail.com> wrote:

> Hi Roman,
>
> Thanks for the updates, this looks much better.
>
> Just a couple of small comments:
> - The type of the field is listed as "boolean". I think it should be
> string (or list)
> - Should the field be named "headers.patterns" instead of
> "headers.pattern" since it accepts a list of patterns?
>
> Thanks,
> Mickael
>
> On Thu, Jan 11, 2024 at 12:56 PM Roman Schmitz 
> wrote:
> >
> > Hi Mickael,
> > Hi all,
> >
> > Thanks for the feedback!
> > I have adapted the KIP description - it is now much shorter and just
> > reflects the general functionality and interface/configuration changes.
> >
> > Kindly let me know if you have any comments, questions, or suggestions
> for
> > this KIP!
> >
> > Thanks,
> > Roman
> >
> > On Fri, 5 Jan 2024 at 17:36, Mickael Maison <
> > mickael.mai...@gmail.com> wrote:
> >
> > > Hi Roman,
> > >
> > > Thanks for the KIP! This would be a useful improvement.
> > >
> > > Ideally you want to make a concrete proposal in the KIP instead of
> > > listing a series of options. Currently the KIP seems to list two
> > > alternatives.
> > >
> > > Also a KIP focuses on the API changes rather than on the pure
> > > implementation. It seems you're proposing adding a configuration to
> > > the DropHeaders SMT. It would be good to describe that new
> > > configuration. For example see KIP-911 which also added a
> > > configuration.
> > >
> > > Thanks,
> > > Mickael
> > >
> > > On Mon, Oct 16, 2023 at 9:50 AM Roman Schmitz
> > > wrote:
> > > >
> > > > Hi Andrew,
> > > >
> > > > Ok, thanks for the feedback! I added a few more details and code
> examples
> > > > to explain the proposed changes.
> > > >
> > > > Thanks,
> > > > Roman
> > > >
> > > > On Sun, 15 Oct 2023 at 22:12, Andrew Schofield <
> > > > andrew_schofield_j...@outlook.com> wrote:
> > > >
> > > > > Hi Roman,
> > > > > Thanks for the KIP. I think it’s an interesting idea, but I think
> the
> > > KIP
> > > > > document needs some
> > > > > more details added before it’s ready for review. For example,
> here’s a
> > > KIP
> > > > > in the same
> > > > > area which was delivered in an earlier version of Kafka. I think
> this
> > > is a
> > > > > good KIP to copy
> > > > > for a suitable level of detail and description (
> > > > >
> > >
> https://cwiki.apache.org/confluence/display/KAFKA/KIP-585%3A+Filter+and+Conditional+SMTs
> > > > > ).
> > > > >
> > > > > Hope this helps.
> > > > >
> > > > > Thanks,
> > > > > Andrew
> > > > >
> > > > > > On 15 Oct 2023, at 21:02, Roman Schmitz
> > > wrote:
> > > > > >
> > > > > > Hi all,
> > > > > >
> > > > > > While working with different customers I came across the case
> several
> > > > > times
> > > > > > that we'd like to not only explicitly remove headers by name but
> by
> > > > > pattern
> > > > > > / regexp. Here is a KIP for this feature!
> > > > > >
> > > > > > Please let me know if you have any comments, questions, or
> > > suggestions!
> > > > > >
> > > > > > https://cwiki.apache.org/confluence/x/oYtEE
> > > > > >
> > > > > > Thanks,
> > > > > > Roman
> > > > >
> > > > >
> > >
>
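For readers following along, a connector configuration using the proposed property could look as follows. This is only a sketch: the property name ("headers.patterns") and its list-of-regexes semantics follow the suggestions in this thread, and nothing is final until the KIP is accepted.

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class DropHeadersConfigSketch {

    // Builds a hypothetical DropHeaders SMT configuration. The
    // "headers.patterns" property name follows the naming suggested in this
    // thread; the accepted KIP may use a different name or type.
    static Map<String, String> smtConfig() {
        Map<String, String> config = new LinkedHashMap<>();
        config.put("transforms", "dropInternal");
        config.put("transforms.dropInternal.type",
                "org.apache.kafka.connect.transforms.DropHeaders");
        // A list of regular expressions instead of literal header names:
        config.put("transforms.dropInternal.headers.patterns",
                "internal\\..*,trace-.*");
        return config;
    }

    public static void main(String[] args) {
        smtConfig().forEach((k, v) -> System.out.println(k + "=" + v));
    }
}
```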


Re: KIP-991: Allow DropHeaders SMT to drop headers by wildcard/regexp

2024-01-11 Thread Roman Schmitz
Hi Mickael,
Hi all,

Thanks for the feedback!
I have adapted the KIP description - it is now much shorter and just
reflects the general functionality and interface/configuration changes.

Kindly let me know if you have any comments, questions, or suggestions for
this KIP!

Thanks,
Roman

On Fri, 5 Jan 2024 at 17:36, Mickael Maison <
mickael.mai...@gmail.com> wrote:

> Hi Roman,
>
> Thanks for the KIP! This would be a useful improvement.
>
> Ideally you want to make a concrete proposal in the KIP instead of
> listing a series of options. Currently the KIP seems to list two
> alternatives.
>
> Also a KIP focuses on the API changes rather than on the pure
> implementation. It seems you're proposing adding a configuration to
> the DropHeaders SMT. It would be good to describe that new
> configuration. For example see KIP-911 which also added a
> configuration.
>
> Thanks,
> Mickael
>
> On Mon, Oct 16, 2023 at 9:50 AM Roman Schmitz 
> wrote:
> >
> > Hi Andrew,
> >
> > Ok, thanks for the feedback! I added a few more details and code examples
> > to explain the proposed changes.
> >
> > Thanks,
> > Roman
> >
> > On Sun, 15 Oct 2023 at 22:12, Andrew Schofield <
> > andrew_schofield_j...@outlook.com> wrote:
> >
> > > Hi Roman,
> > > Thanks for the KIP. I think it’s an interesting idea, but I think the
> KIP
> > > document needs some
> > > more details added before it’s ready for review. For example, here’s a
> KIP
> > > in the same
> > > area which was delivered in an earlier version of Kafka. I think this
> is a
> > > good KIP to copy
> > > for a suitable level of detail and description (
> > >
> https://cwiki.apache.org/confluence/display/KAFKA/KIP-585%3A+Filter+and+Conditional+SMTs
> > > ).
> > >
> > > Hope this helps.
> > >
> > > Thanks,
> > > Andrew
> > >
> > > > On 15 Oct 2023, at 21:02, Roman Schmitz 
> wrote:
> > > >
> > > > Hi all,
> > > >
> > > > While working with different customers I came across the case several
> > > times
> > > > that we'd like to not only explicitly remove headers by name but by
> > > pattern
> > > > / regexp. Here is a KIP for this feature!
> > > >
> > > > Please let me know if you have any comments, questions, or
> suggestions!
> > > >
> > > > https://cwiki.apache.org/confluence/x/oYtEE
> > > >
> > > > Thanks,
> > > > Roman
> > >
> > >
>


Re: KIP-991: Allow DropHeaders SMT to drop headers by wildcard/regexp

2023-10-16 Thread Roman Schmitz
Hi Andrew,

Ok, thanks for the feedback! I added a few more details and code examples
to explain the proposed changes.

Thanks,
Roman

On Sun, 15 Oct 2023 at 22:12, Andrew Schofield <
andrew_schofield_j...@outlook.com> wrote:

> Hi Roman,
> Thanks for the KIP. I think it’s an interesting idea, but I think the KIP
> document needs some
> more details added before it’s ready for review. For example, here’s a KIP
> in the same
> area which was delivered in an earlier version of Kafka. I think this is a
> good KIP to copy
> for a suitable level of detail and description (
> https://cwiki.apache.org/confluence/display/KAFKA/KIP-585%3A+Filter+and+Conditional+SMTs
> ).
>
> Hope this helps.
>
> Thanks,
> Andrew
>
> > On 15 Oct 2023, at 21:02, Roman Schmitz  wrote:
> >
> > Hi all,
> >
> > While working with different customers I came across the case several
> times
> > that we'd like to not only explicitly remove headers by name but by
> pattern
> > / regexp. Here is a KIP for this feature!
> >
> > Please let me know if you have any comments, questions, or suggestions!
> >
> > https://cwiki.apache.org/confluence/x/oYtEE
> >
> > Thanks,
> > Roman
>
>


KIP-991: Allow DropHeaders SMT to drop headers by wildcard/regexp

2023-10-15 Thread Roman Schmitz
Hi all,

While working with different customers I came across the case several times
that we'd like to not only explicitly remove headers by name but by pattern
/ regexp. Here is a KIP for this feature!

Please let me know if you have any comments, questions, or suggestions!

https://cwiki.apache.org/confluence/x/oYtEE

Thanks,
Roman


[jira] [Created] (KAFKA-15597) Allow Connect DropHeaders SMT remove headers on a wildcard-basis

2023-10-12 Thread Roman Schmitz (Jira)
Roman Schmitz created KAFKA-15597:
-

 Summary: Allow Connect DropHeaders SMT remove headers on a 
wildcard-basis
 Key: KAFKA-15597
 URL: https://issues.apache.org/jira/browse/KAFKA-15597
 Project: Kafka
  Issue Type: Improvement
  Components: KafkaConnect
Reporter: Roman Schmitz


In many use cases you might want to drop not only a few specific Kafka headers 
but a whole set of headers whose names can change dynamically (e.g. when used 
with some end-to-end-encryption libraries). To prevent those headers from being 
forwarded/processed further downstream, I suggest adding regexp matching to the 
*apply* method instead of the set-based {*}contains{*}. Link to the relevant code:
https://github.com/apache/kafka/blob/7b5d640cc656443a078bda096d01910b3edfdb37/connect/transforms/src/main/java/org/apache/kafka/connect/transforms/DropHeaders.java#L54



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
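The change described in the issue can be sketched as follows. This is not the actual DropHeaders code, just a standalone illustration (with assumed names) of replacing a set-based contains check with regex matching in an apply-style method:

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.regex.Pattern;
import java.util.stream.Collectors;

public class RegexHeaderFilterSketch {

    // Precompiled patterns, analogous to the configured header names
    // in the existing DropHeaders transform.
    private final List<Pattern> patterns;

    RegexHeaderFilterSketch(List<String> regexes) {
        this.patterns = regexes.stream().map(Pattern::compile).collect(Collectors.toList());
    }

    // Mirrors the spirit of DropHeaders#apply: keep a header only if its
    // name matches none of the configured patterns, instead of checking
    // membership in a fixed set of names.
    Map<String, String> apply(Map<String, String> headers) {
        Map<String, String> kept = new LinkedHashMap<>();
        headers.forEach((name, value) -> {
            boolean drop = patterns.stream().anyMatch(p -> p.matcher(name).matches());
            if (!drop) {
                kept.put(name, value);
            }
        });
        return kept;
    }

    public static void main(String[] args) {
        RegexHeaderFilterSketch filter =
                new RegexHeaderFilterSketch(List.of("enc\\..*", "trace-.*"));
        Map<String, String> headers = new LinkedHashMap<>();
        headers.put("enc.key-id", "k1");     // dropped: matches enc\..*
        headers.put("trace-id", "abc");      // dropped: matches trace-.*
        headers.put("app-version", "1.0");   // kept
        System.out.println(filter.apply(headers));
    }
}
```

Headers with dynamically changing names (as in the encryption-library example) are then caught by a single pattern rather than an ever-growing list of literal names.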


Re: How to Integrate MySQL database with Kafka? Need Demo

2023-06-27 Thread Roman Schmitz
Hi Avani,

not sure what you mean by creating your own code. Of course you can write
your own connector, but offset handling, especially in source connectors, can
be challenging depending on the system.
There are hundreds of different connectors out there that can connect to
existing systems. If you find one that suits your needs, that's probably a
better idea than starting from scratch.
Typically there are lots of configuration options, and for quite simple
transformations you can use SMTs. To run connectors, simply install a
Connect instance, install the jars for your connectors, and
configure/deploy your connector(s) using the Connect API.

Kind regards,
Roman

On Tue, 27 Jun 2023 at 11:26, Avani Panchal wrote:

> Hi Roman,
>
> Thanks for replying, I want to ask one question: can I create my own code
> to integrate data using Kafka connectors?
>
> Kind regards,
> Avani Panchal
>
> On Tue, Jun 27, 2023 at 12:29 PM Roman Schmitz 
> wrote:
>
> > Hi Avani,
> >
> > totally agree - depending on how you'd like to integrate your source,
> > that would probably be Debezium for CDC or the JDBC source connector,
> > which can be used either in a "full-load" or an incremental way.
> > As a sink, use the JDBC sink connector with the corresponding driver for
> > the DB. Kindly have a look at the Connect documentation and some examples.
> > If you'd like to test with runnable Docker-based examples, you can have a
> > look at this Docker Playground Repo:
> > https://github.com/vdesabou/kafka-docker-playground/tree/master/connect
> >
> > Kind regards,
> > Roman
> >
> > On Tue, 27 Jun 2023 at 08:48, Avani Panchal wrote:
> >
> > > Hi Sagar,
> > >
> > > Thank you for your email.
> > >
> > > Thanks & Regards,
> > > Avani Panchal
> > >
> > >
> > > On Tue, Jun 27, 2023 at 12:15 PM Sagar 
> > wrote:
> > >
> > > > Hi Avani,
> > > >
> > > > I already shared the documentation link for Kafka Connect. Let me
> share
> > > it
> > > > again: https://kafka.apache.org/documentation/#connect
> > > >
> > > > Regarding the connector documentation, you should be able to find
> them
> > by
> > > > just searching for JDBC source connector, JDBC sink connector and
> > > Debezium
> > > > connector for MySQL.
> > > >
> > > > Let me know if that works.
> > > >
> > > > Thanks!
> > > > Sagar.
> > > >
> > > >
> > > >
> > > > On Mon, Jun 26, 2023 at 2:10 PM Avani Panchal
> > > >  wrote:
> > > >
> > > > > Hi Sagar,
> > > > >
> > > > > Thank you for the information, you solved our confusion.
> > > > > I also saw lots of links for documentation on Kafka, but I am
> > confused
> > > > > which document I should use.
> > > > > So can you share the proper link from where I can read the
> documents.
> > > > >
> > > > > Thanks,
> > > > > Avani Panchal
> > > > >
> > > > >
> > > > > On Mon, Jun 26, 2023 at 1:48 PM Sagar 
> > > wrote:
> > > > >
> > > > > > Hey Avani,
> > > > > >
> > > > > > Kafka Connect <https://kafka.apache.org/documentation/#connect>
> is
> > > the
> > > > > > tool
> > > > > > to use when you want to stream data to/from Kafka via external
> > > systems.
> > > > > One
> > > > > > would typically configure connectors which allow streaming data
> > > to/from
> > > > > > Kafka. There are 2 types of connectors:
> > > > > > 1) Source Connectors: Which stream data from external systems
> like
> > > > > > databases etc to Kafka and
> > > > > > 2) Sink Connectors: Which stream data from Kafka to external
> > systems.
> > > > > >
> > > > > > Since you want to stream data from MySQL to SQL Server, with
> Kafka
> > > > > Connect
> > > > > > it would be a 2 step process:
> > > > > >
> > > > > > 1) Capture changes from MySQL to Kafka using connectors like JDBC
> > > > source
> > > > > > connector or Debezium MySQL connector.
> > > > > > 2) Once the data is in Kafka, you can use JDBC sink connectors to
> > > >

Re: How to Integrate MySQL database with Kafka? Need Demo

2023-06-27 Thread Roman Schmitz
Hi Avani,

totally agree - depending on how you'd like to integrate your source, that
would probably be Debezium for CDC or the JDBC source connector, which can be
used either in a "full-load" or an incremental way.
As a sink, use the JDBC sink connector with the corresponding driver for the
DB. Kindly have a look at the Connect documentation and some examples.
If you'd like to test with runnable Docker-based examples, you can have a
look at this Docker Playground Repo:
https://github.com/vdesabou/kafka-docker-playground/tree/master/connect

Kind regards,
Roman

On Tue, 27 Jun 2023 at 08:48, Avani Panchal wrote:

> Hi Sagar,
>
> Thank you for your email.
>
> Thanks & Regards,
> Avani Panchal
>
>
> On Tue, Jun 27, 2023 at 12:15 PM Sagar  wrote:
>
> > Hi Avani,
> >
> > I already shared the documentation link for Kafka Connect. Let me share
> it
> > again: https://kafka.apache.org/documentation/#connect
> >
> > Regarding the connector documentation, you should be able to find them by
> > just searching for JDBC source connector, JDBC sink connector and
> Debezium
> > connector for MySQL.
> >
> > Let me know if that works.
> >
> > Thanks!
> > Sagar.
> >
> >
> >
> > On Mon, Jun 26, 2023 at 2:10 PM Avani Panchal
> >  wrote:
> >
> > > Hi Sagar,
> > >
> > > Thank you for the information, you solved our confusion.
> > > I also saw lots of links for documentation on Kafka, but I am confused
> > > which document I should use.
> > > So can you share the proper link from where I can read the documents.
> > >
> > > Thanks,
> > > Avani Panchal
> > >
> > >
> > > On Mon, Jun 26, 2023 at 1:48 PM Sagar 
> wrote:
> > >
> > > > Hey Avani,
> > > >
> > > > Kafka Connect  is
> the
> > > > tool
> > > > to use when you want to stream data to/from Kafka via external
> systems.
> > > One
> > > > would typically configure connectors which allow streaming data
> to/from
> > > > Kafka. There are 2 types of connectors:
> > > > 1) Source Connectors: Which stream data from external systems like
> > > > databases etc to Kafka and
> > > > 2) Sink Connectors: Which stream data from Kafka to external systems.
> > > >
> > > > Since you want to stream data from MySQL to SQL Server, with Kafka
> > > Connect
> > > > it would be a 2 step process:
> > > >
> > > > 1) Capture changes from MySQL to Kafka using connectors like JDBC
> > source
> > > > connector or Debezium MySQL connector.
> > > > 2) Once the data is in Kafka, you can use JDBC sink connectors to
> > stream
> > > > data from Kafka topics to the tables in SQL Server.
> > > >
> > > > Note that this is a very simplified view of how you can achieve your
> > goal
> > > > of streaming changes from MySQL to SQL Server and I would recommend
> > > reading
> > > > the documentation of the individual connectors and the Kafka Connect
> > > > framework to understand how to make it work for your usecase.
> > > >
> > > > Thanks for your interest on Apache Kafka!
> > > >
> > > > Thanks!
> > > > Sagar.
> > > >
> > > >
> > > > On Mon, Jun 26, 2023 at 11:42 AM Avani Panchal
> > > >  wrote:
> > > >
> > > > > Hi,
> > > > > In my application I want to sync my client's data to my SQL
> server; at the
> > > > > client's site the database is MySQL.
> > > > > client place the database is MYSQL.
> > > > >
> > > > > How can I achieve this using Kafka? I read a lot of documents but I
> > > don't
> > > > > understand which setup I need and how I can achieve it.
> > > > >
> > > > > I was also wondering about "Book a demo with Kafka" but didn't find
> > it.
> > > > >
> > > > > Please help me.
> > > > >
> > > > > Thank you,
> > > > > Avani
> > > > >
> > > >
> > >
> >
>
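The two-step pipeline described in this thread can be sketched as a pair of connector configurations submitted to the Connect REST API. The connector classes, hostnames, and table names below are illustrative assumptions (Debezium's MySQL connector and a JDBC sink are common choices, but any equivalent connectors work), and neither configuration is complete:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class MySqlToSqlServerSketch {

    // Step 1 (source): capture MySQL changes into Kafka. Consult the
    // chosen connector's own documentation for the full set of required
    // properties; these are placeholders.
    static Map<String, String> sourceConfig() {
        Map<String, String> c = new LinkedHashMap<>();
        c.put("connector.class", "io.debezium.connector.mysql.MySqlConnector");
        c.put("database.hostname", "mysql.example.com");   // hypothetical host
        c.put("database.user", "cdc_user");                // hypothetical user
        c.put("table.include.list", "inventory.orders");   // hypothetical table
        return c;
    }

    // Step 2 (sink): stream the topics produced by step 1 into SQL Server
    // via a JDBC sink connector (again illustrative, not complete).
    static Map<String, String> sinkConfig() {
        Map<String, String> c = new LinkedHashMap<>();
        c.put("connector.class", "io.confluent.connect.jdbc.JdbcSinkConnector");
        c.put("connection.url",
                "jdbc:sqlserver://sqlserver.example.com:1433;databaseName=replica");
        c.put("topics", "inventory.orders"); // topics written by the source
        return c;
    }

    public static void main(String[] args) {
        System.out.println("source: " + sourceConfig());
        System.out.println("sink:   " + sinkConfig());
    }
}
```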


Re: [ANNOUNCE] New PMC chair: Mickael Maison

2023-04-24 Thread Roman Schmitz
Congratulations Mickael!


On Mon, 24 Apr 2023 at 19:26, José Armando García Sancio wrote:

> Congratulations Mickael and thank you Jun for performing this role for
> the past 10 years!
>
> On Mon, Apr 24, 2023 at 10:15 AM Yash Mayya  wrote:
> >
> > Congratulations Mickael!
> >
> > On Fri, Apr 21, 2023 at 8:39 PM Jun Rao 
> wrote:
> >
> > > Hi, everyone,
> > >
> > > After more than 10 years, I am stepping down as the PMC chair of Apache
> > > Kafka. We now have a new chair Mickael Maison, who has been a PMC
> member
> > > since 2020. I plan to continue to contribute to Apache Kafka myself.
> > >
> > > Congratulations, Mickael!
> > >
> > > Jun
> > >
>
>
>
> --
> -José
>


Re: [ANNOUNCE] New Kafka PMC Member: Chris Egerton

2023-03-09 Thread Roman Schmitz
Congratulations Chris!

On Thu, 9 Mar 2023 at 20:33, Chia-Ping Tsai wrote:

> Congratulations Chris!
>
> > On 10 Mar 2023 at 2:21 AM, Mickael Maison wrote:
> >
> > Congratulations Chris!
> >
> >> On Thu, Mar 9, 2023 at 7:17 PM Bill Bejeck  wrote:
> >>
> >> Congratulations Chris!
> >>
> >>> On Thu, Mar 9, 2023 at 1:12 PM Jun Rao 
> wrote:
> >>>
> >>> Hi, Everyone,
> >>>
> >>> Chris Egerton has been a Kafka committer since July 2022. He has been
> very
> >>> instrumental to the community since becoming a committer. It's my
> pleasure
> >>> to announce that Chris is now a member of Kafka PMC.
> >>>
> >>> Congratulations Chris!
> >>>
> >>> Jun
> >>> on behalf of Apache Kafka PMC
> >>>
>


[VOTE] KIP-887 - Add ConfigProvider to make use of environment variables

2022-12-14 Thread Roman Schmitz
Hi all,

Thank you for the feedback so far.
The KIP is rather straightforward and I'd like to start a vote on it.
Please have a look at the KIP: https://cwiki.apache.org/confluence/x/15jGDQ

Thanks,
Roman


[DISCUSS] KIP-887 - Add ConfigProvider to make use of environment variables

2022-11-12 Thread Roman Schmitz
Hi all,

As I have had this discussion with several customers and also colleagues, I'd
like to implement a ConfigProvider that makes use of environment variables.

Please let me know if you have any comments, questions, or suggestions!

https://cwiki.apache.org/confluence/display/KAFKA/KIP-887%3A+Add+ConfigProvider+to+make+use+of+environment+variables

Thanks,
Roman


Permissions Request to contribute to Apache Kafka

2022-11-12 Thread Roman Schmitz
Hi all,

I have already raised an issue that requires a KIP (with an initial
implementation already in place), so I'd like to contribute to Kafka.

My IDs:

   - Wiki: schm1tz1
   - Jira: Schm1tz1


Kind regards,
Roman


[jira] [Created] (KAFKA-14376) Add ConfigProvider to make use of environment variables

2022-11-10 Thread Roman Schmitz (Jira)
Roman Schmitz created KAFKA-14376:
-

 Summary: Add ConfigProvider to make use of environment variables
 Key: KAFKA-14376
 URL: https://issues.apache.org/jira/browse/KAFKA-14376
 Project: Kafka
  Issue Type: Improvement
  Components: config
Reporter: Roman Schmitz


So far it is not possible to inject additional configurations stored in 
environment variables. This topic came up in several projects and would be a 
useful addition to Kafka's config features, similar to the file/directory 
providers, e.g.:

config.providers=env
config.providers.env.class=org.apache.kafka.common.config.provider.EnvVarConfigProvider
ssl.key.password=${env:<...>:KEY_PASSPHRASE}
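A minimal sketch of the idea, standalone for illustration: a real provider would implement org.apache.kafka.common.config.provider.ConfigProvider and return a ConfigData object, and would read System.getenv() rather than an injected map (injected here only to keep the sketch testable).

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Set;

public class EnvVarConfigProviderSketch {

    private final Map<String, String> env;

    // The environment is injected for testability; the real provider would
    // use System.getenv() instead.
    EnvVarConfigProviderSketch(Map<String, String> env) {
        this.env = env;
    }

    // Resolves the requested keys from environment variables, mirroring the
    // ConfigProvider get(path, keys) contract (ConfigData omitted for
    // brevity). Unknown keys are simply absent from the result.
    Map<String, String> get(Set<String> keys) {
        Map<String, String> data = new HashMap<>();
        for (String key : keys) {
            if (env.containsKey(key)) {
                data.put(key, env.get(key));
            }
        }
        return data;
    }

    public static void main(String[] args) {
        EnvVarConfigProviderSketch provider =
                new EnvVarConfigProviderSketch(Map.of("KEY_PASSPHRASE", "secret"));
        // A config placeholder referencing the env provider would resolve via:
        System.out.println(provider.get(Set.of("KEY_PASSPHRASE")));
    }
}
```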



--
This message was sent by Atlassian Jira
(v8.20.10#820010)