Github user markgrover commented on the pull request:

    https://github.com/apache/spark/pull/11143#issuecomment-182154267
  
    Hi @tdas! Thanks for reviewing.
    
    I talk about the compatibility story at length in the related PR #10953 (in 
particular, [here](https://github.com/apache/spark/pull/10953#issue-129260890))
    
    TLDR from that comment:
    The status quo is supporting only 0.8, which works with 0.8 or later 
brokers. There is definitely tons of interest in 0.9's new consumer API, so we 
will have to support that imminently (#10953). So, the question that really 
remains is: from which Kafka version(s) onward do we support the old consumer API?
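    For context, the 0.9 "new" consumer API that #10953 targets looks roughly 
like this (a minimal sketch against `org.apache.kafka.clients.consumer`; the 
broker address, group id, and topic name are placeholders, and it assumes a 
running 0.9 broker):
    
    ```scala
    import java.util.{Collections, Properties}
    import org.apache.kafka.clients.consumer.KafkaConsumer
    
    val props = new Properties()
    props.put("bootstrap.servers", "localhost:9092") // placeholder broker
    props.put("group.id", "spark-example")           // placeholder group id
    props.put("key.deserializer",
      "org.apache.kafka.common.serialization.StringDeserializer")
    props.put("value.deserializer",
      "org.apache.kafka.common.serialization.StringDeserializer")
    
    val consumer = new KafkaConsumer[String, String](props)
    consumer.subscribe(Collections.singletonList("test-topic")) // placeholder topic
    
    // poll(timeoutMs) is the 0.9 signature; iterate the returned records
    val records = consumer.poll(100)
    val it = records.iterator()
    while (it.hasNext) {
      val r = it.next()
      println(s"${r.offset}: ${r.value}")
    }
    consumer.close()
    ```
    
    Unlike the old high-level/simple consumers, this single API talks to the 
brokers directly (no ZooKeeper dependency on the client side), which is a big 
part of why the Kafka community is pushing users toward it.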
    
    We have two options:
    1. Support Apache Kafka 0.9 and later ONLY.
    Pros: 
    * Easy to manage and easy to support, since there is only one version of 
Kafka that's supported.
    * No changes required from users to their Spark apps.
    * Kafka community is pushing all their users to move to 0.9.0.
    
    Cons:
    * Users will have to upgrade Kafka brokers to Kafka 0.9
    
    2. Support both Apache Kafka 0.8 and 0.9+.
    Pros:
    * Support for both 0.8 and 0.9 brokers
    
    Cons:
    * Build management - we'll likely need two Maven profiles, two assemblies, 
two builds, and actual code duplication to publish two flavors of the Kafka 
integration artifacts, plus a decision on which will be our default, etc.
    * We'll have to keep this around for at least another major release, i.e. 
until Spark 3.0.
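    To make the build-management cost concrete, Option 2 would mean something 
like the following in the root `pom.xml` (a hypothetical sketch; the profile 
ids and module paths are made up for illustration, not what we'd necessarily 
ship):
    
    ```xml
    <profiles>
      <!-- Default: build the Kafka 0.9 integration module -->
      <profile>
        <id>kafka-0-9</id>
        <activation>
          <activeByDefault>true</activeByDefault>
        </activation>
        <modules>
          <module>external/kafka-0-9</module>
        </modules>
      </profile>
      <!-- Opt-in: build the legacy Kafka 0.8 integration module -->
      <profile>
        <id>kafka-0-8</id>
        <modules>
          <module>external/kafka-0-8</module>
        </modules>
      </profile>
    </profiles>
    ```
    
    Every release would then need both profiles built, tested, and published, 
on top of the duplicated source trees.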
    
    Given that we are doing Spark 2.0, where we have the liberty of breaking 
away from old versions of Kafka, I propose that we go with Option 1 and only 
support the Kafka 0.9 release. Does that sound reasonable?

