Yes, I have. I do need to build and run Schema Registry as a prerequisite,
isn't that correct? The QuickStart seems to start with Avro, and without
Avro you need your own implementation of a transformer/serdes etc.

I am only asking because my deployment platform is Windows Server 2012, and
the Confluent package is meant to be run on Linux. I guess there is a lot of
manual conversion I need to do here?
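(For what it's worth, Connect can run without Schema Registry if you switch converters. A minimal worker-config sketch using the stock JSON converter instead of Avro; these are standard Connect property names, but the exact file and values depend on your deployment:)

```properties
# Worker config sketch: use JSON instead of Avro, so Schema Registry is not needed.
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
# Set to true to embed the schema in each message; false for bare JSON payloads.
key.converter.schemas.enable=false
value.converter.schemas.enable=false
```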

On 16 September 2017 at 21:43, Ted Yu <yuzhih...@gmail.com> wrote:

> Have you looked at https://github.com/confluentinc/kafka-connect-jdbc ?
>
> On Sat, Sep 16, 2017 at 1:39 PM, M. Manna <manme...@gmail.com> wrote:
>
> > Sure. But none of these are available in open-source Kafka (they require
> > manual coding), correct? Only Confluent seems to provide some
> > off-the-shelf connectors, but Confluent isn't compatible with Windows (yet),
> > also correct?
> >
> >
> >
> > On 13 September 2017 at 18:11, Sreejith S <srssreej...@gmail.com> wrote:
> >
> > > This is possible. Once you have the records in your put method, it's up
> > > to your logic how you redirect them to multiple JDBC connections for
> > > insertion.
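(To sketch that fan-out idea in plain Java: here `DbTarget` stands in for a real JDBC `Connection`/`PreparedStatement` pair, and the class names are made up for illustration. In a real connector this loop would live inside `SinkTask.put()` with actual JDBC writes and error handling.)

```java
import java.util.ArrayList;
import java.util.List;

// Stand-in for one JDBC destination; a real sink would wrap a Connection here.
interface DbTarget {
    void insert(String record);
}

class FanOutSink {
    private final List<DbTarget> targets = new ArrayList<>();

    void addTarget(DbTarget t) { targets.add(t); }

    // Analogue of SinkTask.put(Collection<SinkRecord>): every record in the
    // batch is written to every configured JDBC target.
    void put(List<String> records) {
        for (String r : records) {
            for (DbTarget t : targets) {
                t.insert(r);
            }
        }
    }
}

public class FanOutDemo {
    public static void main(String[] args) {
        FanOutSink sink = new FanOutSink();
        List<String> db1 = new ArrayList<>();
        List<String> db2 = new ArrayList<>();
        sink.addTarget(db1::add);
        sink.addTarget(db2::add);
        sink.put(List.of("row-1", "row-2"));
        System.out.println(db1.size() + " " + db2.size()); // prints "2 2"
    }
}
```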
> > >
> > > In my use case, I have implemented many-to-many sources and sinks.
> > >
> > > Regards,
> > > Srijith
> > >
> > > On 13-Sep-2017 10:14 pm, "M. Manna" <manme...@gmail.com> wrote:
> > >
> > > Hi,
> > >
> > > I need a little help/suggestion if possible. Does anyone know if it's
> > > possible in Kafka to develop a connector that can sink to multiple JDBC
> > > URLs for the same topic (i.e. table)?
> > >
> > > The examples I can see on Confluent talk about one JDBC URL (a one-to-one
> > > sink). Would it be possible to achieve a one-to-many?
> > >
> > > What I am trying to do is the following:
> > >
> > > 1) Write to a topic
> > > 2) Sink it to multiple DBs (they will all have the same table).
> > >
> > > Is this a doable/correct way to use the Connect API?
> > >
> > > Kindest Regards,
> > >
> >
>
