[ https://issues.apache.org/jira/browse/SPARK-19767?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16387217#comment-16387217 ]

Cody Koeninger commented on SPARK-19767:
----------------------------------------

[~nafshartous] I think at this point people are more likely to need easy access 
to the API docs for the 0.10 artifact than for the 0.8 one, do you agree?

If so, look at commit 1f0d0213 for what I did to skip 0.10.
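The skip itself is just an sbt-unidoc project filter in project/SparkBuild.scala. 
Here's a rough sketch of what flipping it to exclude 0.8 instead of 0.10 could 
look like; the module identifiers (streamingKafka for 0.8, streamingKafka010 for 
0.10) are my guesses at the build's names, so double-check them against the 
actual file:

{code:scala}
// Excerpt-style sketch of the unidoc settings in project/SparkBuild.scala,
// flipped to drop the 0.8 module from the published API docs and keep 0.10.
// The project identifiers below are assumptions, not verified build names.
unidocProjectFilter in (ScalaUnidoc, unidoc) :=
  inAnyProject -- inProjects(
    OldDeps.project, repl, examples, tools, yarn, tags,
    streamingKafka  // exclude the 0.8 artifact; 0.10 stays in the doc build
  ),
unidocProjectFilter in (JavaUnidoc, unidoc) :=
  inAnyProject -- inProjects(
    OldDeps.project, repl, examples, tools, yarn, tags,
    streamingKafka
  )
{code}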

If you want to change that to skip 0.8 instead, and include 0.10, I'd support 
that.

Although now that I look at it, the latest API docs 
[http://spark.apache.org/docs/latest/api/scala/index.html] no longer have the 
kafka namespace at all, whereas 
[http://spark.apache.org/docs/2.2.1/api/scala/index.html] still did.  Haven't 
dug into why.
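
For concreteness, the new-API pattern the integration page shows (and that the 
linked API docs should back up) is roughly the following; a minimal sketch with 
placeholder broker, topic, and group values, none of them taken from this issue:

{code:scala}
// Minimal sketch of the kafka010 API the integration page documents.
// Broker address, topic name, and group id are placeholders.
import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010._
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe

val sparkConf = new SparkConf().setAppName("Kafka010DocCheck").setMaster("local[2]")
val ssc = new StreamingContext(sparkConf, Seconds(5))

val kafkaParams = Map[String, Object](
  "bootstrap.servers" -> "localhost:9092",
  "key.deserializer" -> classOf[StringDeserializer],
  "value.deserializer" -> classOf[StringDeserializer],
  "group.id" -> "example-group",
  "auto.offset.reset" -> "latest",
  "enable.auto.commit" -> (false: java.lang.Boolean)
)

// ConsumerStrategies and LocationStrategies live in
// org.apache.spark.streaming.kafka010, the namespace missing from the
// linked API docs.
val stream = KafkaUtils.createDirectStream[String, String](
  ssc,
  PreferConsistent,
  Subscribe[String, String](Array("topicA"), kafkaParams)
)
stream.map(record => (record.key, record.value)).print()
{code}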

> API Doc pages for Streaming with Kafka 0.10 not current
> -------------------------------------------------------
>
>                 Key: SPARK-19767
>                 URL: https://issues.apache.org/jira/browse/SPARK-19767
>             Project: Spark
>          Issue Type: Bug
>          Components: DStreams
>    Affects Versions: 2.1.0
>            Reporter: Nick Afshartous
>            Priority: Minor
>
> The API docs linked from the Spark Kafka 0.10 Integration page are not 
> current.  For instance, on the page
>    https://spark.apache.org/docs/latest/streaming-kafka-0-10-integration.html
> the code examples show the new API (i.e. class ConsumerStrategies).  However, 
> following the links
>     API Docs --> (Scala | Java)
> leads to API pages that do not have class ConsumerStrategies.  The API doc 
> package names also have {code}streaming.kafka{code} as opposed to 
> {code}streaming.kafka010{code} 
> as in the code examples on streaming-kafka-0-10-integration.html.  


