do the same thing just hangs)
>
> On Wed, May 4, 2016 at 3:18 PM, Rad Gruchalski <ra...@gruchalski.com> wrote:
>
> > John,
> >
> > I believe you mean something along the lines of:
> > http://markmail.org/message/f7xb5okr3ujkplk4
John,
I believe you mean something along the lines of:
http://markmail.org/message/f7xb5okr3ujkplk4
I don’t think something like this has been done.
Best regards,
Radek Gruchalski
ra...@gruchalski.com
de.linkedin.com/in/radgruchalski/
Hi Scott,
I remember that one. That would be an awesome feature. If there is anybody
willing to help contribute such a thing, I'll be happy to dig the details out.
Best regards,
Radek Gruchalski
ra...@gruchalski.com
Apache Samza is the way to go. I've never used Kafka Streams, so I have no
opinion on that one.
Best regards,
Radek Gruchalski
ra...@gruchalski.com
de.linkedin.com/in/radgruchalski/
Hello Tushar,
From the documentation:
Zookeeper also allows you to add a "chroot" path which will make all kafka data
for this cluster appear under a particular path. This is a way to setup
multiple Kafka clusters or other applications on the same zookeeper cluster. To
do this give a connection string in the form
hostname1:port1,hostname2:port2,hostname3:port3/chroot/path which would put all
this cluster's data under the path /chroot/path.
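To make that concrete, here is a minimal sketch (the host names, chroot paths
and broker ids are made up for illustration) of two broker configurations
sharing one ZooKeeper ensemble under different chroot paths:

import java.util.Properties;

public class ChrootConfigSketch {
    public static void main(String[] args) {
        // Broker in "cluster A": all of its ZooKeeper data lives under /kafka-cluster-a.
        Properties clusterA = new Properties();
        clusterA.setProperty("broker.id", "0");
        clusterA.setProperty("zookeeper.connect", "zk1:2181,zk2:2181,zk3:2181/kafka-cluster-a");

        // A second, independent cluster can share the same ZooKeeper ensemble
        // simply by using a different chroot path.
        Properties clusterB = new Properties();
        clusterB.setProperty("broker.id", "0");
        clusterB.setProperty("zookeeper.connect", "zk1:2181,zk2:2181,zk3:2181/kafka-cluster-b");

        System.out.println(clusterA.getProperty("zookeeper.connect"));
        System.out.println(clusterB.getProperty("zookeeper.connect"));
    }
}

The same connection string, including the chroot, goes into the
zookeeper.connect setting of every broker in that cluster, and into any client
that still talks to ZooKeeper directly.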
probably easier to run it
> > via command line. You can check README.md for more
> > information on how to
> > run tests.
> >
> > Dong
> >
> > On Thu, Nov 5, 2015 at 2:33 PM, Rad Gruchalski <ra...@gruchalski.com> wrote:
Dong,
Does it allow running tests in debug? I tried that and never managed to get any
test to run in IntelliJ, say, to set some breakpoints and debug...
Kind regards,
Radek Gruchalski
ra...@gruchalski.com
de.linkedin.com/in/radgruchalski/
Sounds like the same idea. The nice thing about having such an option is that,
with a correct application of containers and a backup-and-restore strategy, one
can create an infinite, ordered backup of the raw input stream using the native
Kafka storage format.
I understand the point of having the data in other f
ied consumer
> application on the cold segments.
>
>
> --Scott
>
>
> On Mon, Jul 13, 2015 at 6:57 AM, Rad Gruchalski <ra...@gruchalski.com> wrote:
>
> > Scott,
> >
> > This is what I was trying to target in one of my previous
Scott,
This is what I was trying to target in one of my previous responses to Daniel:
the one in which I suggest another compaction setting for Kafka.
Kind regards,
Radek Gruchalski
ra...@gruchalski.com
de.linkedin.com/in/radgruchalski/
wrote:
> Radek: I don't see how data could be stored more efficiently than in Kafka
> itself. It's optimized for cheap storage and offers high-performance bulk
> export, exactly what you want from long-term archival.
> On Fri, Jul 10, 2015 at 23:16, Rad Gruchalski <ra...@gruchalski.com> wrote:
Hello all,
This is a very interesting discussion. I’ve been thinking of a similar use case
for Kafka over the last few days.
The usual data workflow with Kafka is most likely something like this:
- ingest with Kafka
- process with Storm / Samza / whathaveyou
- put some processed data back on Kafka
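As a rough sketch of that loop using the plain Java clients (the topic names,
group id and the toy "processing" step are made up; the real processing would
live in Storm / Samza / whatever you use):

import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class ReprocessLoopSketch {
    public static void main(String[] args) {
        Properties consumerProps = new Properties();
        consumerProps.put("bootstrap.servers", "localhost:9092");
        consumerProps.put("group.id", "reprocessor");
        consumerProps.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        consumerProps.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        Properties producerProps = new Properties();
        producerProps.put("bootstrap.servers", "localhost:9092");
        producerProps.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        producerProps.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consumerProps);
             KafkaProducer<String, String> producer = new KafkaProducer<>(producerProps)) {
            // Raw, ingested data arrives on "raw-events".
            consumer.subscribe(Collections.singletonList("raw-events"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(1000);
                for (ConsumerRecord<String, String> record : records) {
                    // Stand-in for the Storm / Samza processing step.
                    String processed = record.value().toUpperCase();
                    // Put the processed data back on Kafka.
                    producer.send(new ProducerRecord<>("processed-events", record.key(), processed));
                }
            }
        }
    }
}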
Hi everyone,
The same errors can be seen when using embedded Kafka and embedded ZooKeeper in
unit tests. They're absolutely normal. As long as you see a successful
connection, it's all good!
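For reference, this is roughly what such a test fixture looks like. It is a
sketch only: it assumes Curator's TestingServer and Kafka's internal
kafka.server.KafkaServerStartable are on the test classpath, and the exact
KafkaConfig constructor and property names vary a bit between Kafka versions.

import java.util.Properties;
import kafka.server.KafkaConfig;
import kafka.server.KafkaServerStartable;
import org.apache.curator.test.TestingServer;

public class EmbeddedKafkaSketch {
    public static void main(String[] args) throws Exception {
        // Embedded ZooKeeper from Curator's test utilities.
        TestingServer zookeeper = new TestingServer();

        // Minimal broker configuration pointing at the embedded ZooKeeper.
        Properties props = new Properties();
        props.put("broker.id", "0");
        props.put("zookeeper.connect", zookeeper.getConnectString());
        props.put("log.dirs", "/tmp/embedded-kafka-logs");

        // While the broker connects and registers itself, a few ZooKeeper
        // connection-state warnings/errors in the log are expected and
        // harmless, as long as a successful connection follows.
        KafkaServerStartable broker = new KafkaServerStartable(new KafkaConfig(props));
        broker.startup();

        // ... exercise producers/consumers against localhost here ...

        broker.shutdown();
        broker.awaitShutdown();
        zookeeper.close();
    }
}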
Kind regards,
Radek Gruchalski
ra...@gruchalski.com