Hi all,
I have been trying to modify one of the Kafka wiki pages [1] to correct
a few outdated code examples but it turns out that my Confluence account
(miguno [2]) apparently does not have edit permissions.
The "Page Restrictions" for [1] are listed as:
- No view restrictions are defined f
Many thanks for the clarification, Jun!
Michael
> On 16.09.2014, at 02:11, Jun Rao wrote:
>
> Yes, that description is not precise. We do allow dots in general. However,
> a topic can't be just "." or "..".
>
> Thanks,
>
> Jun
>
: topic name te+dd is illegal, contains a
> character other than ASCII alphanumerics, '.', '_' and '-'
>
> Thanks,
>
> Jun
>
> On Sun, Sep 14, 2014 at 1:46 AM, Michael G. Noll
> wrote:
>
Wouldn't it be helpful to throw an error or a warning if the user
tries to create a topic with an invalid name? Currently neither the
API nor the CLI tools inform you that you are naming a topic in a way
you shouldn't.
And as Otis pointed out elsewhere in this thread, this ties back into
the JMX/M
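The naming rule discussed in this thread -- only ASCII alphanumerics, '.', '_' and '-', and a topic may not be just "." or ".." -- can be sketched as a small check. This is an illustrative approximation of the rule as stated above, not Kafka's actual validator (which, among other things, also enforces a maximum name length); `is_valid_topic_name` is a hypothetical helper name.

```python
import re

# Characters permitted in a topic name, per the rule quoted in this thread:
# ASCII alphanumerics, '.', '_' and '-'.
TOPIC_NAME_RE = re.compile(r"[a-zA-Z0-9._-]+")

def is_valid_topic_name(name):
    # The name must be non-empty and contain only the allowed characters ...
    if TOPIC_NAME_RE.fullmatch(name) is None:
        return False
    # ... and, although dots are allowed in general, a topic cannot be
    # just "." or "..".
    return name not in (".", "..")

print(is_valid_topic_name("test2"))   # True
print(is_valid_topic_name("te+dd"))   # False: '+' is not an allowed character
print(is_valid_topic_name(".."))      # False: reserved name
```

A check like this, run before topic creation, is the kind of early warning the paragraph above asks for.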
be --topic test2 --zookeeper localhost:2181
>
> Topic:test2 PartitionCount:3 ReplicationFactor:2 Configs:
> Topic: test2 Partition: 0 Leader: 1 Replicas: 1,0 Isr: 1,0
> Topic: test2 Partition: 1 Leader: 0 Replicas: 0,1 Isr: 0,1
> Topic: test2 Partition: 2 Leader: 1 Replicas:
rt, I start seeing this exception. In this case I only have one
> broker. I still create the topic the way I described earlier.
> I understand this is not the ideal production topology, but it's annoying to
> see it during development.
>
> Thanks
>
>
> On Wed, Jun 11, 2014
Take a look at Loggly.com's AWS setup for Kafka, e.g. as described on their
blog (very recently) as well as in their talk at AWS re:Invent 2013.
--Michael
> On 11.06.2014, at 19:43, S Ahmed wrote:
>
> For those of you hosting on ec2, could someone suggest a "minimum"
> recommended setup for ka
Prakash,
you have configured the topic with a replication factor of only 1, i.e. no
additional replica beyond "the original one". This replication setting
of 1 means that only one of the two brokers will ever host the (single)
replica -- which is implied to also be the leader in-sync replica -- of
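The consequence of replication factor 1 on a two-broker cluster can be shown with a toy placement sketch. This is illustrative only -- a simple round-robin layout, not Kafka's actual replica-assignment algorithm, and `assign_replicas` is a hypothetical helper -- but it makes the point that with one replica per partition, each partition lives on exactly one of the two brokers.

```python
# Toy sketch: place each partition's replicas round-robin across brokers.
# Illustrative only -- not Kafka's real replica-assignment algorithm.
def assign_replicas(num_partitions, brokers, replication_factor):
    assignment = {}
    for p in range(num_partitions):
        # The first entry is the partition's leader; the rest are followers.
        assignment[p] = [brokers[(p + i) % len(brokers)]
                         for i in range(replication_factor)]
    return assignment

# With replication factor 1, each partition has a single replica on a
# single broker; the other broker holds no copy of that partition at all.
print(assign_replicas(2, [0, 1], 1))  # {0: [0], 1: [1]}

# With replication factor 2, every partition is replicated on both brokers.
print(assign_replicas(2, [0, 1], 2))  # {0: [0, 1], 1: [1, 0]}
```

So if the single broker that hosts a partition's only replica is down, that partition is simply unavailable -- which is why a replication factor greater than 1 is recommended outside development setups.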
Hi everyone,
to sweeten the upcoming long weekend I have released code examples that
show how to integrate Kafka 0.8+ with Storm 0.9+, while using Apache
Avro as the data serialization format.
https://github.com/miguno/kafka-storm-starter
Since the integration of the latest Kafka and Storm v
You might be running into the following known bug in 0.8.1:
https://issues.apache.org/jira/browse/KAFKA-1310
The fix is to downgrade to 0.8.0, or to migrate to 0.8.1.1 once it gets
released (IIRC the tentative 0.8.1.1 release date is mid-April).
Best,
Michael
On 30.03.2014 18:30, Edward Capri
Thanks for this follow-up, and +1 here.
I was aware, for instance, that the newly introduced delete topic
feature in 0.8.1 is not fully ready for prime time. But IIRC I
learned about this by following the mailing list (I think it was
actually a repl
Hi everyone,
I have released a tool called Wirbelsturm
(https://github.com/miguno/wirbelsturm) that allows you to perform local
and remote deployments of Kafka. It's also a small way of saying a big
"thank you" to the Kafka community.
Wirbelsturm uses Vagrant for creating and managing machines,
Many thanks to everyone involved in the release!
Please let me share two comments:
One, there's a typo on the Downloads page [1] for the text of the
source download link: It incorrectly says "kafka-0.8.0-src.tgz"
instead of "kafka-0.8.1-src.tgz". (
Andrew,
I am actually referencing the WM puppet module (notably because it
targets Debian whereas ours is currently focused on RHEL). I really
like that your module already supports Kafka mirroring and jmxtrans. :-)
--Michael
On 02/26/2014 03:41 PM, Andrew Otto wrote:
> Oh so many puppet modul
Hi everyone,
I have released a Puppet module to deploy Kafka 0.8 in case anyone is
interested.
The module uses Puppet parameterized classes and as such decouples code
(Puppet manifests) from configuration data -- hence you can use Puppet
Hiera to configure the way Kafka is deployed without having
Btw Sridhar: If you only want to write producer and/or consumer code in
Scala 2.10 you do not need to run the brokers (and thus build Kafka
itself) with 2.10, too. That is, you can run the brokers with the stock
Scala 2.8 build and still write "your" code with 2.10. (But I am not
sure if that's wh