Hello Kafka community,

I need to replicate a few topics from an Aiven cluster into a Confluent cluster (on my local laptop). I have seen two approaches so far:

- replicating with MirrorMaker and then creating the schemas locally (creating them with the same id is the problem)
- using a consumer on the first cluster and a producer on the second (a variation of this uses the Alpakka Kafka connectors; a rough byte-for-byte sketch of this option follows below)

What would you advise?

Thanks,
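A minimal sketch of the consumer + producer option, assuming the records are copied byte for byte so that the Confluent Avro wire-format prefix (magic byte + schema id) in each message travels unchanged; the topic name, bootstrap servers and group id are placeholders, and the Aiven side would additionally need its SSL/SASL settings:

```scala
import java.time.Duration
import java.util.Properties
import scala.jdk.CollectionConverters._

import org.apache.kafka.clients.consumer.KafkaConsumer
import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}
import org.apache.kafka.common.serialization.{ByteArrayDeserializer, ByteArraySerializer}

object TopicCopy extends App {
  val topic = "my-avro-topic" // placeholder

  val consumerProps = new Properties()
  consumerProps.put("bootstrap.servers", "my-aiven-cluster:12345") // placeholder: Aiven (source)
  consumerProps.put("group.id", "local-replicator")
  consumerProps.put("auto.offset.reset", "earliest")
  // The Aiven cluster will normally also require security.protocol / ssl.* (or sasl.*) settings here.

  val producerProps = new Properties()
  producerProps.put("bootstrap.servers", "localhost:9092") // placeholder: local Confluent (destination)

  // Byte-array (de)serializers keep the records opaque, so the 5-byte
  // wire-format prefix (0x00 magic byte + schema id) is copied verbatim.
  val consumer = new KafkaConsumer[Array[Byte], Array[Byte]](
    consumerProps, new ByteArrayDeserializer, new ByteArrayDeserializer)
  val producer = new KafkaProducer[Array[Byte], Array[Byte]](
    producerProps, new ByteArraySerializer, new ByteArraySerializer)

  consumer.subscribe(List(topic).asJava)

  while (true) {
    val records = consumer.poll(Duration.ofMillis(500))
    records.asScala.foreach { rec =>
      // Same topic name on the destination; keys and values are untouched bytes.
      producer.send(new ProducerRecord(topic, rec.key(), rec.value()))
    }
    producer.flush()
  }
}
```

The Alpakka variant would be essentially the same pipeline expressed as a `Consumer.plainSource` feeding a `Producer.plainSink`, with the same byte-array serializer settings.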
More info/details:

Hi, is it possible to manually create a schema with a specific id, for example via an HTTP POST/PUT to the Schema Registry REST API? Can I control the id that way? Or by producing a message to the local _schemas topic, and if so, in which format? Alternatively, how can I match the schema ids embedded in the replicated messages with a schema that I create locally?

For context: I mirrored an Avro topic with MirrorMaker and now I need the schema in the local registry under that specific id. How can this be done? I do not have the rights to replicate the _schemas topic on the source side and I doubt I can be given that access. Thanks.

The other option I am considering is https://doc.akka.io/docs/alpakka-kafka/current/serialization.html#avro-with-schema-registry, i.e. using Reactive Streams around a source-cluster consumer and a destination-cluster producer. Do you have any hint which approach might work more "easily"?

To summarise: the first option is "how to create the schema locally after binary-replicating the topic", while the last two options are "how to do the replication from scratch, with a producer that registers the schema locally, using a consumer + producer, optionally wrapped in Reactive Streams for robustness/declarativeness".

--
Dumitru-Nicolae Marasoui
Software Engineer, kaluza.com
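One way the fixed-id registration is commonly attempted is through the registry's IMPORT mode. A minimal sketch follows, assuming a Confluent Schema Registry version that supports IMPORT mode, admin rights on the local registry, and the default TopicNameStrategy (so the value schema lives under the subject `<topic>-value`); the registry URL, subject, id and example schema are all placeholders:

```scala
import java.net.URI
import java.net.http.{HttpClient, HttpRequest, HttpResponse}

object RegisterSchemaWithFixedId extends App {
  // Placeholders: adjust the registry URL, subject, schema id and the schema itself.
  val registry = "http://localhost:8081"
  val subject  = "my-avro-topic-value"
  val schemaId = 42

  val avroSchema =
    """{"type":"record","name":"Example","fields":[{"name":"f1","type":"string"}]}"""
  // JSON-escape the schema so it can be embedded as a string field in the request body.
  val escaped = avroSchema.replace("\\", "\\\\").replace("\"", "\\\"")

  val client = HttpClient.newHttpClient()
  def call(req: HttpRequest): Unit = {
    val resp = client.send(req, HttpResponse.BodyHandlers.ofString())
    println(s"${resp.statusCode()} ${resp.body()}")
  }

  // 1) Switch the subject to IMPORT mode so explicit ids are accepted
  //    (typically only allowed while the subject has no schemas registered yet).
  call(HttpRequest.newBuilder(URI.create(s"$registry/mode/$subject"))
    .header("Content-Type", "application/vnd.schemaregistry.v1+json")
    .PUT(HttpRequest.BodyPublishers.ofString("""{"mode": "IMPORT"}"""))
    .build())

  // 2) Register the schema under the id found in the mirrored messages.
  val body = s"""{"version": 1, "id": $schemaId, "schema": "$escaped"}"""
  call(HttpRequest.newBuilder(URI.create(s"$registry/subjects/$subject/versions"))
    .header("Content-Type", "application/vnd.schemaregistry.v1+json")
    .POST(HttpRequest.BodyPublishers.ofString(body))
    .build())

  // 3) Switch the subject back to normal read/write mode.
  call(HttpRequest.newBuilder(URI.create(s"$registry/mode/$subject"))
    .header("Content-Type", "application/vnd.schemaregistry.v1+json")
    .PUT(HttpRequest.BodyPublishers.ofString("""{"mode": "READWRITE"}"""))
    .build())
}
```

Note that this only needs rights on the local registry, not on the source cluster's _schemas topic; whether explicit ids are honoured this way depends on the Schema Registry version, so it is worth checking the behaviour against your local registry first.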