Re: Backing up Kafka data and using it later?

2016-05-11 Thread Gerard Klijs
You could create a Docker image with a Kafka installation and start a MirrorMaker in it. You could set the retention time to infinite and mount the data volume. With the data you could always restart the container and mirror it to somewhere else. Not sure that's what you want, but it's an
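[Not part of the original mail; a rough sketch of the infinite-retention and mounted-volume parts of this suggestion. The config keys and scripts are standard Kafka tooling; the topic name, image name, and paths are placeholders.]

    # Broker-level: no time-based deletion of log segments (server.properties)
    log.retention.ms=-1

    # Or per topic, e.g. for a hypothetical topic "backup-topic":
    bin/kafka-topics.sh --zookeeper localhost:2181 --alter \
      --topic backup-topic --config retention.ms=-1

    # Mount the Kafka data directory (log.dirs) as a Docker volume so the
    # mirrored data survives container restarts (image name is made up):
    docker run -v /backups/kafka-data:/var/lib/kafka/data my-kafka-image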

Re: Backing up Kafka data and using it later?

2016-05-10 Thread Alex Loddengaard
You may find this interesting, although I don't believe it's exactly what you're looking for: https://github.com/pinterest/secor I'm not sure how stable and commonly used it is. Additionally, I see a lot of users use MirrorMaker for a "backup," where MirrorMaker copies all topics from one Kafka
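[Added for reference, not from the original mail; the MirrorMaker "backup" mentioned here is typically run roughly like this, with the consumer config pointing at the source cluster and the producer config at the backup cluster. Hostnames and file names below are placeholders.]

    # consumer.properties: source cluster (use zookeeper.connect for the old
    # consumer, or bootstrap.servers for the new consumer)
    # producer.properties: backup cluster (bootstrap.servers)

    # Mirror every topic into the backup cluster
    bin/kafka-mirror-maker.sh \
      --consumer.config consumer.properties \
      --producer.config producer.properties \
      --whitelist '.*'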

Re: Backing up Kafka data and using it later?

2016-05-05 Thread Rad Gruchalski
John, I’m not an expert in Kafka, but I would assume so. Best regards,
Radek Gruchalski

Re: Backing up Kafka data and using it later?

2016-05-04 Thread John Bickerstaff
Thanks - does that mean that the only way to safely back up Kafka is to have replication? (I have done this partially - I can get the entire topic on the command line, after completely recreating the server, but my code that is intended to do the same thing just hangs) On Wed, May 4, 2016 at
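[Not part of the original mail; the command-line read John describes is usually something like the following. The --timeout-ms flag is what makes the console consumer exit when the topic is exhausted; consumer code without an equivalent timeout will simply block waiting for new messages, which can look like a hang. Broker/ZooKeeper address and topic name are placeholders.]

    # Read the whole topic from the beginning and exit after 10 seconds
    # with no new messages
    bin/kafka-console-consumer.sh \
      --zookeeper localhost:2181 \
      --topic my-topic \
      --from-beginning \
      --timeout-ms 10000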

Re: Backing up Kafka data and using it later?

2016-05-04 Thread Rad Gruchalski
John, I believe you mean something along the lines of: http://markmail.org/message/f7xb5okr3ujkplk4 I don’t think something like this has been done. Best regards,
 Radek Gruchalski 

Backing up Kafka data and using it later?

2016-05-04 Thread John Bickerstaff
Hi, I have what is probably an edge use case. I'd like to back up a single Kafka instance such that I can recreate a new server, drop Kafka in, drop the data in, start Kafka -- and have all my data ready to go again for consumers. Is such a thing done? Does anyone have any experience trying
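[Not an answer from the thread; one way a single-broker backup like this is sometimes attempted, sketched under the assumption that the broker can be stopped cleanly, that its log.dirs points at /var/lib/kafka/data, and that the rebuilt server reuses the same broker.id and ZooKeeper state. Paths are placeholders. Consumer offsets kept in ZooKeeper or __consumer_offsets are not covered by this sketch.]

    # 1. Stop the broker so the log segments are consistent on disk
    bin/kafka-server-stop.sh

    # 2. Archive the data directory named by log.dirs (placeholder path)
    tar czf kafka-data-backup.tar.gz -C /var/lib/kafka data

    # 3. On the rebuilt server: install Kafka, restore the data into the
    #    same log.dirs location, keep the same broker.id, then start it
    tar xzf kafka-data-backup.tar.gz -C /var/lib/kafka
    bin/kafka-server-start.sh config/server.properties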