Hi All,
Thanks for the inputs: apparently this is an issue that everyone ends up
solving on their own.
I think it should be handled in the core Kafka CLI itself; it cries out for a
feature request/improvement.
I've created a JIRA issue for it; if you think it would be helpful for you
as well, please
Us too:
https://github.com/wikimedia/puppet/blob/production/modules/confluent/files/kafka/kafka.sh
This requires that the various kafka-* scripts are in your PATH.
And then this gets rendered into /etc/profile.d to set env variables.
We also have created simple wrapper scripts for common operations.
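For anyone curious what the /etc/profile.d part of that setup looks like, here
is a minimal sketch. The hostnames and variable names below are invented for
illustration; the real values are rendered by the puppet template linked above,
so treat this as an assumption about the shape, not the actual file:

```shell
# Hypothetical /etc/profile.d/kafka.sh fragment (illustrative names only).
# Rendered once per cluster so the wrapper in PATH can pick these up instead
# of requiring --zookeeper / --broker-list on every invocation.
export ZOOKEEPER_URL="zk1.example.org:2181/kafka/main"
export BROKER_LIST="kafka1.example.org:9092,kafka2.example.org:9092"
```

Once these are exported in every login shell, the kafka.sh wrapper only needs
the subcommand and any per-command flags.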
On Sat, Apr 21, 2018 at 2:20 AM, Peter Bukowinski wrote:
> One solution is to build wrapper scripts around the standard kafka scripts.
> You’d put your relevant cluster parameters (brokers, zookeepers) in a single
> config file (I like yaml), then your script would import that config file and
> pass the appropriate parameters to the kafka command. You could
HTH
Martin
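A minimal sketch of the wrapper idea Peter describes, using a sourced shell
config file rather than YAML to keep it self-contained (file name, variable
names, and function names here are all made up for illustration). The
functions echo the command they would run instead of executing it, so the
sketch is safe to try without a cluster:

```shell
#!/bin/sh
# Hypothetical cluster config, e.g. sourced from ~/.kafka-cluster.conf.
# In a real setup you would `. ~/.kafka-cluster.conf` here instead.
BROKER_LIST="kafka1.example.org:9092,kafka2.example.org:9092"
ZOOKEEPER="zk1.example.org:2181"

# Wrapper functions: each one injects the connection flag that its
# underlying kafka tool expects, then passes the remaining args through.
# `echo` is a dry-run stand-in for actually invoking the script.
ktopics() {
    echo kafka-topics.sh --zookeeper "$ZOOKEEPER" "$@"
}

kconsume() {
    echo kafka-console-consumer.sh --bootstrap-server "$BROKER_LIST" "$@"
}

ktopics --list
kconsume --topic test --from-beginning
```

With the `echo` removed and the config sourced from a file, `ktopics --list`
replaces typing `kafka-topics.sh --zookeeper zk1.example.org:2181 --list`
every time.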
__
From: Horváth Péter Gergely <horvath.peter.gerg...@gmail.com>
Sent: Friday, April 20, 2018 6:23 AM
To: users@kafka.apache.org
Subject: Using Kafka CLI without specifying the URLs every single time?
Hello All,
I am wondering if there is any way to avoid having to enter the host URLs for
each Kafka CLI command you execute.
This is kind of tedious, as different CLI commands require specifying
different servers (--broker-list, --bootstrap-server and --zookeeper),
which is especially painful if the