mimaison commented on code in PR #17844:
URL: https://github.com/apache/kafka/pull/17844#discussion_r1858085494
##########
docs/security.html:
##########
@@ -847,27 +847,27 @@ <h3 class="anchor-heading"><a id="security_sasl" class="anchor-link"></a><a href
        Kafka supports <a href="https://tools.ietf.org/html/rfc7677">SCRAM-SHA-256</a> and SCRAM-SHA-512 which
        can be used with TLS to perform secure authentication. Under the default implementation of <code>principal.builder.class</code>, the username is used as the authenticated
        <code>Principal</code> for configuration of ACLs etc. The default SCRAM implementation in Kafka
-       stores SCRAM credentials in Zookeeper and is suitable for use in Kafka installations where Zookeeper
-       is on a private network. Refer to <a href="#security_sasl_scram_security">Security Considerations</a>
+       stores SCRAM credentials in KRaft controllers. Refer to <a href="#security_sasl_scram_security">Security Considerations</a>
Review Comment:
   In other places in the docs, we use "in the metadata log" to describe where metadata is stored in KRaft mode. For example, see:
- https://kafka.apache.org/documentation/#brokerconfigs_early.start.listeners
- https://kafka.apache.org/documentation/#security_authz
I think we should stick with that term.
##########
docs/security.html:
##########
@@ -847,27 +847,27 @@ <h3 class="anchor-heading"><a id="security_sasl" class="anchor-link"></a><a href
        Kafka supports <a href="https://tools.ietf.org/html/rfc7677">SCRAM-SHA-256</a> and SCRAM-SHA-512 which
        can be used with TLS to perform secure authentication. Under the default implementation of <code>principal.builder.class</code>, the username is used as the authenticated
        <code>Principal</code> for configuration of ACLs etc. The default SCRAM implementation in Kafka
-       stores SCRAM credentials in Zookeeper and is suitable for use in Kafka installations where Zookeeper
-       is on a private network. Refer to <a href="#security_sasl_scram_security">Security Considerations</a>
+       stores SCRAM credentials in KRaft controllers. Refer to <a href="#security_sasl_scram_security">Security Considerations</a>
        for more details.</p>
    <ol>
    <li><h5 class="anchor-heading"><a id="security_sasl_scram_credentials" class="anchor-link"></a><a href="#security_sasl_scram_credentials">Creating SCRAM Credentials</a></h5>
-       <p>The SCRAM implementation in Kafka uses Zookeeper as credential store. Credentials can be created in
-       Zookeeper using <code>kafka-configs.sh</code>. For each SCRAM mechanism enabled, credentials must be created
+       <p>The SCRAM implementation in Kafka uses KRaft controllers as credential store. Credentials can be created in
Review Comment:
Again, I think we should use "metadata log". Or maybe even remove that
sentence as we already say the same thing just above.
##########
docs/security.html:
##########
@@ -883,10 +883,14 @@ <h3 class="anchor-heading"><a id="security_sasl" class="anchor-link"></a><a href
        <li>Pass the JAAS config file location as JVM parameter to each Kafka broker:
            <pre><code class="language-bash">-Djava.security.auth.login.config=/etc/kafka/kafka_server_jaas.conf</code></pre></li>
        <li>Configure SASL port and SASL mechanisms in server.properties as described <a href="#security_sasl_brokerconfig">here</a>. For example:
-           <pre><code class="language-text">listeners=SASL_SSL://host.name:port
-security.inter.broker.protocol=SASL_SSL
+           <pre><code class="language-text">listeners=SASL_PLAINTEXT://host.name.1:port.1,CONTROLLER://host.name.2:port.2
+inter.broker.listener.name=SASL_PLAINTEXT
+advertised.listeners=SASL_PLAINTEXT://host.name.1:port.1,CONTROLLER://host.name.2:port.2
+controller.listener.names=CONTROLLER
Review Comment:
Do we need to introduce controller configs here?
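   If the goal is just to show the SCRAM-specific settings, a broker-focused sketch that leaves the controller listener out (host name and port are placeholders, as in the current text) could stay close to the original:

   ```
   listeners=SASL_SSL://host.name:port
   security.inter.broker.protocol=SASL_SSL
   sasl.enabled.mechanisms=SCRAM-SHA-256
   ```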
##########
docs/security.html:
##########
@@ -909,21 +913,21 @@ <h3 class="anchor-heading"><a id="security_sasl" class="anchor-link"></a><a href
        as described <a href="#security_client_staticjaas">here</a>. Clients use the login section named
        <code>KafkaClient</code>. This option allows only one user for all client connections from a JVM.</p></li>
        <li>Configure the following properties in producer.properties or consumer.properties:
-           <pre><code class="language-text">security.protocol=SASL_SSL
+           <pre><code class="language-text">security.protocol=SASL_PLAINTEXT
Review Comment:
Why are we changing this to `SASL_PLAINTEXT`?
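   For comparison, keeping the TLS-protected protocol would leave the client sketch essentially unchanged:

   ```
   security.protocol=SASL_SSL
   sasl.mechanism=SCRAM-SHA-256 (or SCRAM-SHA-512)
   ```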
##########
docs/security.html:
##########
@@ -909,21 +913,21 @@ <h3 class="anchor-heading"><a id="security_sasl" class="anchor-link"></a><a href
        as described <a href="#security_client_staticjaas">here</a>. Clients use the login section named
        <code>KafkaClient</code>. This option allows only one user for all client connections from a JVM.</p></li>
        <li>Configure the following properties in producer.properties or consumer.properties:
-           <pre><code class="language-text">security.protocol=SASL_SSL
+           <pre><code class="language-text">security.protocol=SASL_PLAINTEXT
sasl.mechanism=SCRAM-SHA-256 (or SCRAM-SHA-512)</code></pre></li>
        </ol>
    </li>
    <li><h5><a id="security_sasl_scram_security" href="#security_sasl_scram_security">Security Considerations for SASL/SCRAM</a></h5>
        <ul>
-       <li>The default implementation of SASL/SCRAM in Kafka stores SCRAM credentials in Zookeeper. This
-       is suitable for production use in installations where Zookeeper is secure and on a private network.</li>
+       <li>The default implementation of SASL/SCRAM in Kafka stores SCRAM credentials in KRaft controllers. This
Review Comment:
Ditto, metadata log
##########
docs/security.html:
##########
@@ -847,27 +847,27 @@ <h3 class="anchor-heading"><a id="security_sasl" class="anchor-link"></a><a href
        Kafka supports <a href="https://tools.ietf.org/html/rfc7677">SCRAM-SHA-256</a> and SCRAM-SHA-512 which
        can be used with TLS to perform secure authentication. Under the default implementation of <code>principal.builder.class</code>, the username is used as the authenticated
        <code>Principal</code> for configuration of ACLs etc. The default SCRAM implementation in Kafka
-       stores SCRAM credentials in Zookeeper and is suitable for use in Kafka installations where Zookeeper
-       is on a private network. Refer to <a href="#security_sasl_scram_security">Security Considerations</a>
+       stores SCRAM credentials in KRaft controllers. Refer to <a href="#security_sasl_scram_security">Security Considerations</a>
        for more details.</p>
    <ol>
    <li><h5 class="anchor-heading"><a id="security_sasl_scram_credentials" class="anchor-link"></a><a href="#security_sasl_scram_credentials">Creating SCRAM Credentials</a></h5>
-       <p>The SCRAM implementation in Kafka uses Zookeeper as credential store. Credentials can be created in
-       Zookeeper using <code>kafka-configs.sh</code>. For each SCRAM mechanism enabled, credentials must be created
+       <p>The SCRAM implementation in Kafka uses KRaft controllers as credential store. Credentials can be created in
+       KRaft controllers using <code>kafka-storage.sh</code> or <code>kafka-configs.sh</code>. For each SCRAM mechanism enabled, credentials must be created
        by adding a config with the mechanism name. Credentials for inter-broker communication must be created
-       before Kafka brokers are started. Client credentials may be created and updated dynamically and updated
-       credentials will be used to authenticate new connections.</p>
-       <p>Create SCRAM credentials for user <i>alice</i> with password <i>alice-secret</i>:
-       <pre><code class="language-bash">$ bin/kafka-configs.sh --zookeeper localhost:2182 --zk-tls-config-file zk_tls_config.properties --alter --add-config 'SCRAM-SHA-256=[iterations=8192,password=alice-secret],SCRAM-SHA-512=[password=alice-secret]' --entity-type users --entity-name alice</code></pre>
+       before Kafka brokers are started. <code>kafka-storage.sh</code> can format storage with initial credentials.
+       Client credentials may be created and updated dynamically and updated credentials will be used to authenticate new connections.
+       <code>kafka-configs.sh</code> can be used to create and update credentials after Kafka brokers are started.</p>
+       <p>Create initial SCRAM credentials for user <i>admin</i> with password <i>admin-secret</i>:
+       <pre><code class="language-bash">$ bin/kafka-storage.sh format -t $(bin/kafka-storage.sh random-uuid) -c config/kraft/server.properties --add-scram 'SCRAM-SHA-256=[name="admin",password="admin-secret"]'</code></pre>
+       <p>Create SCRAM credentials for user <i>alice</i> with password <i>alice-secret</i> (refer to <a href="#security_sasl_scram_clientconfig">Configuring Kafka Clients</a> for client configuration):
+       <pre><code class="language-bash">$ bin/kafka-configs.sh --bootstrap-server localhost:9092 --alter --add-config 'SCRAM-SHA-256=[iterations=8192,password=alice-secret]' --entity-type users --entity-name alice --command-config client.properties</code></pre>
        <p>The default iteration count of 4096 is used if iterations are not specified. A random salt is created
Review Comment:
"A random salt is created", I believe this is only if it's not specified,
right?
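   To make the salt behavior concrete: per RFC 5802/7677, the stored credential is derived with PBKDF2 over a salt and iteration count. The sketch below is plain Python, not Kafka code — the password and 16-byte salt size are illustrative — and shows a random salt being generated only when none is supplied:

   ```python
   import base64
   import hashlib
   import os

   def scram_salted_password(password, iterations=4096, salt=None):
       """SCRAM SaltedPassword (RFC 5802): PBKDF2-HMAC-SHA256(password, salt, iterations)."""
       if salt is None:
           salt = os.urandom(16)  # a random salt is generated only if not specified
       salted = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, iterations)
       return salt, salted

   # Uses the default 4096 iterations and a freshly generated random salt.
   salt, salted = scram_salted_password("alice-secret")
   print(base64.b64encode(salt).decode(), base64.b64encode(salted).decode())
   ```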
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]