Github user tdas commented on a diff in the pull request:

    https://github.com/apache/spark/pull/17246#discussion_r106722120
  
    --- Diff: docs/structured-streaming-kafka-integration.md ---
    @@ -373,11 +375,204 @@ The following configurations are optional:
     </tr>
     </table>
     
    +## Producing Data to Kafka
    +
    +### Writing Streaming Queries to Kafka
    +
    +<div class="codetabs">
    +<div data-lang="scala" markdown="1">
    +{% highlight scala %}
    +
    +// Write key-value data from a DataFrame to a specific Kafka topic specified in an option
    +val s1 = df1
    +  .selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)")
    +  .writeStream
    +  .format("kafka")
    +  .option("kafka.bootstrap.servers", "host1:port1,host2:port2")
    +  .option("topic", "topic1")
    +  .start()
    +
    +// Write key-value data from a DataFrame to Kafka using a topic specified in the data
    +val s2 = df2
    +  .selectExpr("topic", "CAST(key AS STRING)", "CAST(value AS STRING)")
    +  .writeStream
    +  .format("kafka")
    +  .option("kafka.bootstrap.servers", "host1:port1,host2:port2")
    +  .start()
    +
    +{% endhighlight %}
    +</div>
    +<div data-lang="java" markdown="1">
    +{% highlight java %}
    +
    +// Write key-value data from a DataFrame to a specific Kafka topic specified in an option
    +StreamingQuery s1 = df1
    +  .selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)")
    +  .writeStream()
    +  .format("kafka")
    +  .option("kafka.bootstrap.servers", "host1:port1,host2:port2")
    +  .option("topic", "topic1")
    +  .start();
    +
    +// Write key-value data from a DataFrame to Kafka using a topic specified in the data
    +StreamingQuery s2 = df2
    +  .selectExpr("topic", "CAST(key AS STRING)", "CAST(value AS STRING)")
    +  .writeStream()
    +  .format("kafka")
    +  .option("kafka.bootstrap.servers", "host1:port1,host2:port2")
    +  .start();
    +
    +{% endhighlight %}
    +</div>
    +<div data-lang="python" markdown="1">
    +{% highlight python %}
    +
    +# Write key-value data from a DataFrame to a specific Kafka topic specified in an option
    +s1 = df1 \
    +  .selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)") \
    +  .writeStream \
    +  .format("kafka") \
    +  .option("kafka.bootstrap.servers", "host1:port1,host2:port2") \
    +  .option("topic", "topic1") \
    +  .start()
    +
    +# Write key-value data from a DataFrame to Kafka using a topic specified in the data
    +s2 = df2 \
    +  .selectExpr("topic", "CAST(key AS STRING)", "CAST(value AS STRING)") \
    +  .writeStream \
    +  .format("kafka") \
    +  .option("kafka.bootstrap.servers", "host1:port1,host2:port2") \
    +  .start()
    +
    +{% endhighlight %}
    +</div>
    +</div>
    +
    +### Writing Batch Queries to Kafka
    +
    +<div class="codetabs">
    +<div data-lang="scala" markdown="1">
    +{% highlight scala %}
    +
    +// Write key-value data from a DataFrame to a specific Kafka topic specified in an option
    +df1.selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)")
    +  .write
    +  .format("kafka")
    +  .option("kafka.bootstrap.servers", "host1:port1,host2:port2")
    +  .option("topic", "topic1")
    +  .save()
    +
    +// Write key-value data from a DataFrame to Kafka using a topic specified in the data
    +df2.selectExpr("topic", "CAST(key AS STRING)", "CAST(value AS STRING)")
    +  .write
    +  .format("kafka")
    +  .option("kafka.bootstrap.servers", "host1:port1,host2:port2")
    +  .save()
    +
    +{% endhighlight %}
    +</div>
    +<div data-lang="java" markdown="1">
    +{% highlight java %}
    +
    +// Write key-value data from a DataFrame to a specific Kafka topic specified in an option
    +df1.selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)")
    +  .write()
    +  .format("kafka")
    +  .option("kafka.bootstrap.servers", "host1:port1,host2:port2")
    +  .option("topic", "topic1")
    +  .save();
    +
    +// Write key-value data from a DataFrame to Kafka using a topic specified in the data
    +df2.selectExpr("topic", "CAST(key AS STRING)", "CAST(value AS STRING)")
    +  .write()
    +  .format("kafka")
    +  .option("kafka.bootstrap.servers", "host1:port1,host2:port2")
    +  .save();
    +
    +{% endhighlight %}
    +</div>
    +<div data-lang="python" markdown="1">
    +{% highlight python %}
    +
    +# Write key-value data from a DataFrame to a specific Kafka topic specified in an option
    +df1.selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)") \
    +  .write \
    +  .format("kafka") \
    +  .option("kafka.bootstrap.servers", "host1:port1,host2:port2") \
    +  .option("topic", "topic1") \
    +  .save()
    +
    +# Write key-value data from a DataFrame to Kafka using a topic specified in the data
    +df2.selectExpr("topic", "CAST(key AS STRING)", "CAST(value AS STRING)") \
    +  .write \
    +  .format("kafka") \
    +  .option("kafka.bootstrap.servers", "host1:port1,host2:port2") \
    +  .save()
    +
    +{% endhighlight %}
    +</div>
    +</div>
    +
    +Each row being written to Kafka has the following schema:
    +<table class="table">
    +<tr><th>Column</th><th>Type</th></tr>
    +<tr>
    +  <td>key (optional)</td>
    +  <td>string or binary</td>
    +</tr>
    +<tr>
    +  <td>value (required)</td>
    +  <td>string or binary</td>
    +</tr>
    +<tr>
    +  <td>topic (*optional)</td>
    +  <td>string</td>
    +</tr>
    +</table>
    +\* The topic column is required if the "topic" configuration option is not specified.<br>
    +
    +The value column is the only required column. If a key column is not specified then
    +a ```null``` valued key column will be automatically added (see Kafka semantics on
    +how ```null``` valued keys are handled). If a topic column exists then its value
    +is used as the topic when writing the given row to Kafka, unless the "topic" configuration
    +option is set, i.e., the "topic" configuration option overrides the topic column.
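    +
    +For example, a minimal sketch assuming a DataFrame `df3` that has only a value column; a ```null``` valued key is written for every row:
    +
    +{% highlight scala %}
    +
    +// No key column is selected, so a null valued key column is added automatically
    +df3.selectExpr("CAST(value AS STRING)")
    +  .write
    +  .format("kafka")
    +  .option("kafka.bootstrap.servers", "host1:port1,host2:port2")
    +  .option("topic", "topic1")
    +  .save()
    +
    +{% endhighlight %}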
    +
    +The following options must be set for the Kafka sink
    +for both batch and streaming queries.
    +
    +<table class="table">
    +<tr><th>Option</th><th>value</th><th>meaning</th></tr>
    +<tr>
    +  <td>kafka.bootstrap.servers</td>
    +  <td>A comma-separated list of host:port</td>
    +  <td>The Kafka "bootstrap.servers" configuration.</td>
    +</tr>
    +</table>
    +
    +The following configurations are optional:
    +
    +<table class="table">
    +<tr><th>Option</th><th>value</th><th>default</th><th>query type</th><th>meaning</th></tr>
    +<tr>
    +  <td>topic</td>
    +  <td>string</td>
    +  <td>none</td>
    +  <td>streaming and batch</td>
    +  <td>Sets the topic that all rows will be written to in Kafka. This option overrides any
    +  topic column that may exist in the data.</td>
    +  topic column that may exist in the data.</td>
    +</tr>
    +</table>
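    +
    +For example, a minimal sketch reusing `df2` from the batch examples above: even though each row of `df2` carries its own topic column, setting the "topic" option sends every row to "topic1".
    +
    +{% highlight scala %}
    +
    +// The "topic" option takes precedence over the topic column in the data
    +df2.selectExpr("topic", "CAST(key AS STRING)", "CAST(value AS STRING)")
    +  .write
    +  .format("kafka")
    +  .option("kafka.bootstrap.servers", "host1:port1,host2:port2")
    +  .option("topic", "topic1")
    +  .save()
    +
    +{% endhighlight %}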
    +
    +
    +## Kafka Specific Configurations
    +
     Kafka's own configurations can be set via `DataStreamReader.option` with `kafka.` prefix, e.g,
     `stream.option("kafka.bootstrap.servers", "host:port")`. For possible kafkaParams, see
    --- End diff --
    
    kafkaParams -> kafka parameters


