This is an automated email from the ASF dual-hosted git repository.

srowen pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git.


    from 77e9b58  [SPARK-28969][PYTHON][ML] OneVsRestParams parity between scala and python
     add 1675d51  [SPARK-23539][SS] Add support for Kafka headers in Structured Streaming

No new revisions were added by this update.

Summary of changes:
 docs/structured-streaming-kafka-integration.md     | 55 +++++++++++++
 .../org/apache/spark/sql/kafka010/KafkaBatch.scala |  5 +-
 .../sql/kafka010/KafkaBatchPartitionReader.scala   | 13 +--
 .../spark/sql/kafka010/KafkaContinuousStream.scala | 30 ++++---
 .../spark/sql/kafka010/KafkaMicroBatchStream.scala |  9 ++-
 .../spark/sql/kafka010/KafkaOffsetReader.scala     | 14 ----
 .../sql/kafka010/KafkaRecordToRowConverter.scala   | 93 ++++++++++++++++++++++
 .../kafka010/KafkaRecordToUnsafeRowConverter.scala | 54 -------------
 .../apache/spark/sql/kafka010/KafkaRelation.scala  | 24 +++---
 .../apache/spark/sql/kafka010/KafkaSource.scala    | 41 +++++-----
 .../spark/sql/kafka010/KafkaSourceProvider.scala   | 22 +++--
 .../apache/spark/sql/kafka010/KafkaWriteTask.scala | 39 ++++++++-
 .../apache/spark/sql/kafka010/KafkaWriter.scala    | 15 +++-
 .../sql/kafka010/KafkaDataConsumerSuite.scala      | 28 +++++--
 .../spark/sql/kafka010/KafkaRelationSuite.scala    | 33 +++++++-
 .../apache/spark/sql/kafka010/KafkaSinkSuite.scala | 48 +++++++++--
 .../apache/spark/sql/kafka010/KafkaTestUtils.scala | 35 ++++++--
 17 files changed, 402 insertions(+), 156 deletions(-)
 create mode 100644 external/kafka-0-10-sql/src/main/scala/org/apache/spark/sql/kafka010/KafkaRecordToRowConverter.scala
 delete mode 100644 external/kafka-0-10-sql/src/main/scala/org/apache/spark/sql/kafka010/KafkaRecordToUnsafeRowConverter.scala
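
The commit above (SPARK-23539) exposes Kafka record headers to Structured Streaming. A minimal sketch of how the feature is typically enabled, assuming an existing SparkSession `spark` and a reachable broker; the bootstrap servers and topic name are placeholders:

```scala
// Sketch: reading Kafka headers in Structured Streaming (SPARK-23539).
// Assumes a SparkSession `spark` and a running Kafka broker; the
// bootstrap servers and topic name below are placeholders.
val df = spark.readStream
  .format("kafka")
  .option("kafka.bootstrap.servers", "host1:9092") // placeholder
  .option("subscribe", "events")                   // placeholder topic
  .option("includeHeaders", "true")                // option added by this change
  .load()

// With includeHeaders enabled, each row gains a `headers` column of type
// array<struct<key: string, value: binary>> alongside key and value.
df.selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)", "headers")
```

When `includeHeaders` is left unset, the schema is unchanged, so existing queries are unaffected.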


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org
