Hi,
I have a setup in mind where data is written to Kafka and then persisted to
HDFS (e.g., using Camus), so that I have an all-time archive of every stream
record ever received. I want to process that all-time archive first and, once
that is done, continue with the live stream, using
Hi,
On Wed, Sep 24, 2014 at 7:23 PM, Dibyendu Bhattacharya
dibyendu.bhattach...@gmail.com wrote:
So you have a single Kafka topic with a very high retention period (which
determines the storage requirements of a given Kafka topic), and you want to
process all historical data first using Camus and
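The handoff described in this thread (drain the HDFS archive first, then switch to the live Kafka stream without reprocessing records) can be sketched in plain Python. This is a minimal illustration, not Camus or Kafka consumer code: records are modeled as `(offset, payload)` tuples, and `handle` is a hypothetical per-record callback.

```python
def process_archive_then_live(archive, live, handle):
    """Process every archived record, then only those live records whose
    offset is beyond the archive's high-water mark, so records that appear
    in both sources are not handled twice at the handoff point."""
    last_offset = -1
    for offset, payload in archive:
        handle(payload)
        last_offset = max(last_offset, offset)
    for offset, payload in live:
        if offset > last_offset:  # skip records already seen in the archive
            handle(payload)
```

In a real deployment the "high-water mark" would be the last Kafka offset that Camus wrote to HDFS, and the live consumer would start from that offset rather than filtering client-side; the filter here just makes the dedup idea explicit.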