Repository: spark
Updated Branches:
  refs/heads/branch-2.0 9c1596b6c -> 801fb7994


[SPARK-16359][STREAMING][KAFKA] unidoc skip kafka 0.10

## What changes were proposed in this pull request?
During the sbt unidoc task, skip the streamingKafka010 subproject and filter the Kafka
0.10 classes from the classpath, so that at least the existing Kafka 0.8 docs can be
included in unidoc without error.
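The filtering works by dropping any classpath entry whose canonical path matches one of two Kafka 0.10 regexes (one for the kafka-clients artifact, one for the Scala-versioned broker jar). A minimal sketch of that matching logic, applied to plain path strings rather than sbt `Classpath` entries — the jar paths below are illustrative examples, not taken from a real build:

```scala
// Hedged sketch (not the actual SparkBuild code): the two regexes used by the
// patch, applied to plain path strings. KafkaFilterSketch is a hypothetical name.
object KafkaFilterSketch {
  // Same patterns as in the diff below.
  val kafkaPatterns: Seq[String] = Seq(
    """.*kafka-clients-0\.10.*""",
    """.*kafka_2\..*-0\.10.*"""
  )

  // Drop any path matching either Kafka 0.10 pattern; everything else survives.
  def filterKafka010(paths: Seq[String]): Seq[String] =
    paths.filterNot(p => kafkaPatterns.exists(pat => p.matches(pat)))

  def main(args: Array[String]): Unit = {
    val jars = Seq(
      "/ivy/kafka-clients-0.10.0.0.jar", // filtered: kafka-clients 0.10
      "/ivy/kafka_2.11-0.10.0.0.jar",    // filtered: Scala-versioned broker jar 0.10
      "/ivy/kafka-clients-0.8.2.1.jar"   // kept: Kafka 0.8 stays visible to unidoc
    )
    println(filterKafka010(jars))
  }
}
```

Only the 0.8 jar survives the filter, which is what lets the existing Kafka 0.8 docs continue to build.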

## How was this patch tested?
sbt spark/scalaunidoc:doc | grep -i error

Author: cody koeninger <c...@koeninger.org>

Closes #14041 from koeninger/SPARK-16359.

(cherry picked from commit 1f0d021308f2201366111f8390015114710d4f9b)
Signed-off-by: Tathagata Das <tathagata.das1...@gmail.com>


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/801fb799
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/801fb799
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/801fb799

Branch: refs/heads/branch-2.0
Commit: 801fb7994d890fa4112b97fc339520f5ce3ab6cb
Parents: 9c1596b
Author: cody koeninger <c...@koeninger.org>
Authored: Tue Jul 5 16:44:15 2016 -0700
Committer: Tathagata Das <tathagata.das1...@gmail.com>
Committed: Tue Jul 5 16:44:24 2016 -0700

----------------------------------------------------------------------
 project/SparkBuild.scala | 18 ++++++++++++++++--
 1 file changed, 16 insertions(+), 2 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/801fb799/project/SparkBuild.scala
----------------------------------------------------------------------
diff --git a/project/SparkBuild.scala b/project/SparkBuild.scala
index 6018b22..b1a9f39 100644
--- a/project/SparkBuild.scala
+++ b/project/SparkBuild.scala
@@ -701,15 +701,29 @@ object Unidoc {
       .map(_.filterNot(_.getCanonicalPath.contains("org/apache/spark/sql/hive/test")))
   }
 
+  private def ignoreClasspaths(classpaths: Seq[Classpath]): Seq[Classpath] = {
+    classpaths
+      .map(_.filterNot(_.data.getCanonicalPath.matches(""".*kafka-clients-0\.10.*""")))
+      .map(_.filterNot(_.data.getCanonicalPath.matches(""".*kafka_2\..*-0\.10.*""")))
+  }
+
   val unidocSourceBase = settingKey[String]("Base URL of source links in Scaladoc.")
 
   lazy val settings = scalaJavaUnidocSettings ++ Seq (
     publish := {},
 
     unidocProjectFilter in(ScalaUnidoc, unidoc) :=
-      inAnyProject -- inProjects(OldDeps.project, repl, examples, tools, streamingFlumeSink, yarn, tags),
+      inAnyProject -- inProjects(OldDeps.project, repl, examples, tools, streamingFlumeSink, yarn, tags, streamingKafka010),
     unidocProjectFilter in(JavaUnidoc, unidoc) :=
-      inAnyProject -- inProjects(OldDeps.project, repl, examples, tools, streamingFlumeSink, yarn, tags),
+      inAnyProject -- inProjects(OldDeps.project, repl, examples, tools, streamingFlumeSink, yarn, tags, streamingKafka010),
+
+    unidocAllClasspaths in (ScalaUnidoc, unidoc) := {
+      ignoreClasspaths((unidocAllClasspaths in (ScalaUnidoc, unidoc)).value)
+    },
+
+    unidocAllClasspaths in (JavaUnidoc, unidoc) := {
+      ignoreClasspaths((unidocAllClasspaths in (JavaUnidoc, unidoc)).value)
+    },
 
     // Skip actual catalyst, but include the subproject.
     // Catalyst is not public API and contains quasiquotes which break scaladoc.

