Re: JDBC Sink connector

2022-10-10 Thread Steve Howard
Hi Charandeep, What is in the connect logs? That generally has a full call stack. On 10/10/2022 7:11 AM, Singh, Charandeep wrote: Hi, whenever I try to create a new connector it times out: curl -s -X POST -H "Accept:application/json" -H "Content-Type:application/json" -d @connector-config.

JDBC Sink connector

2022-10-10 Thread Singh, Charandeep
Hi, whenever I try to create a new connector it times out: curl -s -X POST -H "Accept:application/json" -H "Content-Type:application/json" -d @connector-config.json http://localhost:8083/connectors {"error_code":500,"message":"Request timed out"}
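For reference, a cleanly formatted version of that request plus a minimal connector-config.json sketch (connector name, topic, and connection details are placeholders). A 500 "Request timed out" from the Connect REST API usually means the worker thread handling the request is blocked, often on connector validation hanging against the database, so the worker log is the right place to look, as the reply above says:

    curl -s -X POST \
      -H "Accept: application/json" \
      -H "Content-Type: application/json" \
      -d @connector-config.json \
      http://localhost:8083/connectors

connector-config.json:

    {
      "name": "jdbc-sink",
      "config": {
        "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
        "tasks.max": "1",
        "topics": "orders",
        "connection.url": "jdbc:mysql://localhost:3306/db",
        "connection.user": "user",
        "connection.password": "secret"
      }
    }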

Re: JDBC Sink Connector: Counterpart to Dead Letter Queue to keep track of successfully processed records

2021-04-27 Thread Florian McKee
ailure topic > > - persist valid messages in PostgreSQL > > - forward messages that have been persisted to ingestion_success topic > > The last point is key: Only forward messages to ingestion_successful if they have been persisted in PostgreSQL. I

Re: JDBC Sink Connector: Counterpart to Dead Letter Queue to keep track of successfully processed records

2021-04-27 Thread Ran Lupovich
y: Only forward messages to ingestion_successful if > they have been persisted in PostgreSQL. Is there any way I can do that with > a JDBC sink connector? > > I'm basically looking for the counterpart of the Dead Letter Queue for > records that have been processed successfu

JDBC Sink Connector: Counterpart to Dead Letter Queue to keep track of successfully processed records

2021-04-27 Thread Florian McKee
ingestion_success topic. The last point is key: Only forward messages to ingestion_successful if they have been persisted in PostgreSQL. Is there any way I can do that with a JDBC sink connector? I'm basically looking for the counterpart of the Dead Letter Queue for records that have been proc
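For contrast, the failure half that the thread refers to is built into Connect for sink connectors; a sketch of the dead letter queue settings, reusing the topic naming from the thread (there is no equivalent built-in option for successfully written records, which is what is being asked for here):

    errors.tolerance=all
    errors.deadletterqueue.topic.name=ingestion_failure
    errors.deadletterqueue.topic.replication.factor=1
    errors.deadletterqueue.context.headers.enable=true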

How to config multiple tables in kafka jdbc sink connector?

2020-05-11 Thread wangl...@geekplus.com.cn
As described here: https://docs.confluent.io/current/connect/kafka-connect-jdbc/sink-connector/sink_config_options.html

    connector.class=io.confluent.connect.jdbc.JdbcSinkConnector
    tasks.max=1
    topics=orders
    connection.url=jdbc:xxx
    connection.user=xxx
    connection.password=xxx
    insert.mode=upsert
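One way to cover multiple tables with a single connector (a sketch; topic and key names are hypothetical): list every topic in topics, and the sink writes each one to its own table, named by table.name.format (default ${topic}):

    topics=orders,customers,shipments
    table.name.format=${topic}
    pk.mode=record_value
    pk.fields=id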

Re: Re: Write to database directly by referencing schema registry, no jdbc sink connector

2020-05-11 Thread wangl...@geekplus.com.cn
Hi Robin, Seems I didn't make it clear. Actually we still use the JDBC sink connector. But we want to use the JDBC sink function in our own distributed platform instead of Kafka Connect. I want to consolidate the code here: https://github.com/confluentinc/kafka-connect-jdbc/ Receive

Re: Write to database directly by referencing schema registry, no jdbc sink connector

2020-05-11 Thread Robin Moffatt
> write to target database. I want to use self-written Java code instead of the Kafka JDBC sink connector. Out of interest, why do you want to do this? Why not use the JDBC sink connector (or a fork of it if you need to amend its functionality)? -- Robin Moffatt | Senior Developer Advoc

Re: JDBC Sink Connector

2020-05-11 Thread Robin Moffatt
Schema Registry and its serde libraries are part of Confluent Platform, licensed under Confluent Community Licence ( https://www.confluent.io/confluent-community-license-faq/) -- Robin Moffatt | Senior Developer Advocate | ro...@confluent.io | @rmoff On Fri, 8 May 2020 at 13:39, vishnu mura

Re: Re: Write to database directly by referencing schema registry, no jdbc sink connector

2020-05-11 Thread wangl...@geekplus.com.cn
%s%n", key, value); } Next I need to write it to database using the existing kafka jdbc sink connector API: Seems i need to consolidate the code here: https://github.com/confluentinc/kafka-connect-jdbc/ Just new a JDBCSinkTask, add the record to the JDBCSinkTask, then the task will aut

Re: Re: Write to database directly by referencing schema registry, no jdbc sink connector

2020-05-09 Thread Liam Clarke-Hutchinson
tomatically by referencing schema > registry > 2 change the record to a sql statement needed to be executed and > execute it > > Seems the kafka jdbc sink connector ( > https://docs.confluent.io/current/connect/kafka-connect-jdbc/sink-connector/index.html) > can achi

Re: Re: Write to database directly by referencing schema registry, no jdbc sink connector

2020-05-09 Thread wangl...@geekplus.com.cn
Kafka JDBC sink connector (https://docs.confluent.io/current/connect/kafka-connect-jdbc/sink-connector/index.html) can achieve this function. But I have no idea how to write this in Java code. Is there any code example to achieve this? Thanks, Lei wangl...@geekplus.com.cn From: Liam Clarke

Re: Write to database directly by referencing schema registry, no jdbc sink connector

2020-05-09 Thread Liam Clarke-Hutchinson
debezium to parse binlog, using avro serialization and send to kafka. > > Need to consume the avro serialized binlog data and write to target > database > I want to use self-written java code instead of kafka jdbc sink > connector. > > How can I reference the schema reg

Re: Write to database directly by referencing schema registry, no jdbc sink connector

2020-05-08 Thread Chris Toomey
Write your own implementation of the JDBC sink connector and use the Avro serializer to convert the Kafka record into a Connect record that your connector takes and writes to the DB via JDBC. On Fri, May 8, 2020 at 7:38 PM wangl...@geekplus.com.cn < wangl...@geekplus.com.cn> wrote: >
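A sketch of the conversion step described here, assuming Confluent's AvroConverter (the registry URL is a placeholder). The converter reads the schema id embedded in the message bytes, fetches the writer schema from Schema Registry, and hands back a Connect Schema plus value that plugs straight into a SinkRecord, as in the JdbcSinkTask sketch above:

    import io.confluent.connect.avro.AvroConverter;
    import org.apache.kafka.connect.data.SchemaAndValue;

    import java.util.HashMap;
    import java.util.Map;

    public class AvroToConnect {
        private final AvroConverter converter = new AvroConverter();

        public AvroToConnect(String schemaRegistryUrl) {
            Map<String, Object> config = new HashMap<>();
            config.put("schema.registry.url", schemaRegistryUrl);
            converter.configure(config, false);  // false = value converter, not key
        }

        // rawValue is the byte[] taken straight from the consumed Kafka record.
        public SchemaAndValue toConnect(String topic, byte[] rawValue) {
            return converter.toConnectData(topic, rawValue);
        }
    }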

Write to database directly by referencing schema registry, no jdbc sink connector

2020-05-08 Thread wangl...@geekplus.com.cn
Using Debezium to parse the binlog, using Avro serialization and sending to Kafka. I need to consume the Avro serialized binlog data and write to the target database. I want to use self-written Java code instead of the Kafka JDBC sink connector. How can I reference the schema registry, convert a Kafka

Re: JDBC Sink Connector

2020-05-08 Thread vishnu murali
Thank you so much Robin, it helped me a lot to define the sink connector with upsert mode and it is very helpful. For that schema-related question I am not getting a proper understanding, because I am using normal Apache Kafka and I don't know whether those Schema Registry, KSQL, Avro serializers are present

Re: JDBC Sink Connector

2020-05-07 Thread Robin Moffatt
If you don't want to send the schema each time then serialise your data using Avro (or Protobuf), and then the schema is held in the Schema Registry. See https://www.youtube.com/watch?v=b-3qN_tlYR4&t=981s If you want to update a record instead of insert, you can use the upsert mode. See https://www
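Robin's two suggestions in connector-config form, as a sketch (registry URL and key field are hypothetical):

    value.converter=io.confluent.connect.avro.AvroConverter
    value.converter.schema.registry.url=http://localhost:8081
    insert.mode=upsert
    pk.mode=record_value
    pk.fields=id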

Re: JDBC Sink Connector

2020-05-07 Thread Liam Clarke-Hutchinson
Hi Vishnu, I wrote an implementation of org.apache.kafka.connect.storage.Converter, included it in the KC worker classpath (then set it with the property value.converter) to provide the schema that the JDBC sink needs. That approach may work for 1). For 2) KC can use upsert if your DB supports i
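A sketch of that Converter approach; the payload layout ("id,name" CSV) and field names are invented for illustration. Build it onto the worker classpath and point value.converter at the class:

    import org.apache.kafka.connect.data.Schema;
    import org.apache.kafka.connect.data.SchemaAndValue;
    import org.apache.kafka.connect.data.SchemaBuilder;
    import org.apache.kafka.connect.data.Struct;
    import org.apache.kafka.connect.storage.Converter;

    import java.nio.charset.StandardCharsets;
    import java.util.Map;

    public class FixedSchemaConverter implements Converter {
        // Hard-coded schema, so producers don't have to send one per message.
        private static final Schema VALUE_SCHEMA = SchemaBuilder.struct()
                .field("id", Schema.INT64_SCHEMA)
                .field("name", Schema.STRING_SCHEMA)
                .build();

        @Override
        public void configure(Map<String, ?> configs, boolean isKey) {
            // Nothing to configure in this sketch.
        }

        @Override
        public byte[] fromConnectData(String topic, Schema schema, Object value) {
            throw new UnsupportedOperationException("sink side only");
        }

        @Override
        public SchemaAndValue toConnectData(String topic, byte[] value) {
            // Parse the raw bytes however they are encoded; here, "id,name" CSV.
            String[] parts = new String(value, StandardCharsets.UTF_8).split(",", 2);
            Struct struct = new Struct(VALUE_SCHEMA)
                    .put("id", Long.parseLong(parts[0].trim()))
                    .put("name", parts[1].trim());
            return new SchemaAndValue(VALUE_SCHEMA, struct);
        }
    }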

JDBC Sink Connector

2020-05-06 Thread vishnu murali
Hey guys, I am working on the JDBC Sink Connector to take data from a Kafka topic to MySQL. I have 2 questions. I am using normal Apache Kafka 2.5, not the Confluent version. 1) For inserting data, every time we need to add the schema along with every record. How can I overcome this situation? I wan
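For context on question 1): with plain JSON, the JDBC sink needs org.apache.kafka.connect.json.JsonConverter with value.converter.schemas.enable=true, and then every message has to carry an envelope like this sketch (field names are examples). The replies above show the two ways out: keep the schema in Schema Registry via Avro, or hard-code it in a custom Converter.

    {
      "schema": {
        "type": "struct",
        "fields": [
          { "type": "int64",  "optional": false, "field": "id" },
          { "type": "string", "optional": true,  "field": "name" }
        ],
        "optional": false
      },
      "payload": { "id": 1, "name": "demo" }
    }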

Problem with jdbc sink connector

2019-04-16 Thread Valentin
Hi there, > I am using the Confluent JDBC connector to move data from Kafka to Oracle. > My Avro object has 3 elements: > > String Id, String name, bytes data > > My database also has 3 columns: > > ID VARCHAR2(255 CHAR), > NAME VARCHAR2(255 CHAR), > DATA BLOB > Using connector c
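The Avro schema as described would look roughly like this (record name invented); Connect turns an Avro bytes field into its BYTES type, which the Oracle dialect should map to a BLOB column:

    {
      "type": "record",
      "name": "Payload",
      "fields": [
        { "name": "Id",   "type": "string" },
        { "name": "name", "type": "string" },
        { "name": "data", "type": "bytes" }
      ]
    }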

Re: Kafka connect jdbc Sink connector issues when moving from MySQL to Oracle DB

2016-09-30 Thread Srikrishna Alla
to the database, you can use the `fields.whitelist` > configuration to whitelist the desired fields. > > Best, > > Shikhar > > On Fri, Sep 30, 2016 at 8:38 AM Srikrishna Alla > > wrote: > > > Hi, > > > > I am facing issues with jdbc Sink Connector when

Re: Kafka connect jdbc Sink connector issues when moving from MySQL to Oracle DB

2016-09-30 Thread Shikhar Bhushan
the database, you can use the `fields.whitelist` configuration to whitelist the desired fields. Best, Shikhar On Fri, Sep 30, 2016 at 8:38 AM Srikrishna Alla wrote: > Hi, > > I am facing issues with jdbc Sink Connector when working with Oracle DB. > This functionality was working
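A sketch of that setting (field names hypothetical); only the listed value fields are mapped to columns, so fields with no matching column are simply dropped:

    fields.whitelist=id,name,status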

Kafka connect jdbc Sink connector issues when moving from MySQL to Oracle DB

2016-09-30 Thread Srikrishna Alla
Hi, I am facing issues with the JDBC Sink Connector when working with Oracle DB. This functionality was working fine when I was using MySQL. The first error I had was when trying to create the table using auto.create = true. It tried to create STRING fields as NVARCHAR2(4000) (which I see is by
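The usual way around the dialect's default column types, since auto.create cannot be tuned per field: create the table by hand with the types you want and turn auto-creation off. A sketch with hypothetical table and column names:

    auto.create=false
    auto.evolve=false

    CREATE TABLE orders (
      id   NUMBER(19) PRIMARY KEY,
      name VARCHAR2(255 CHAR)
    );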