[ https://issues.apache.org/jira/browse/SPARK-21185?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16060296#comment-16060296 ]

Hyukjin Kwon commented on SPARK-21185:
--------------------------------------

Yea, I believe this is a duplicate of 
https://issues.apache.org/jira/browse/SPARK-20840

> Spurious errors in unidoc causing PRs to fail
> ---------------------------------------------
>
>                 Key: SPARK-21185
>                 URL: https://issues.apache.org/jira/browse/SPARK-21185
>             Project: Spark
>          Issue Type: Bug
>          Components: Build
>    Affects Versions: 2.2.0
>            Reporter: Tathagata Das
>            Assignee: Hyukjin Kwon
>
> Some PRs are failing because unidoc throws spurious errors. When
> GenJavaDoc generates Java files from Scala sources, the generated Java
> files can contain errors. When Javadoc then runs on these generated
> files, it reports those errors. Usually the errors are surfaced as
> warnings, so unidoc does not fail the build.
> Example -
> https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/78270/consoleFull
> {code}
> [info] Constructing Javadoc information...
> [warn] /home/jenkins/workspace/SparkPullRequestBuilder/core/target/java/org/apache/spark/scheduler/BlacklistTracker.java:117: error: ExecutorAllocationClient is not public in org.apache.spark; cannot be accessed from outside package
> [warn]   public   BlacklistTracker (org.apache.spark.scheduler.LiveListenerBus listenerBus, org.apache.spark.SparkConf conf, scala.Option<org.apache.spark.ExecutorAllocationClient> allocationClient, org.apache.spark.util.Clock clock)  { throw new RuntimeException(); }
> [warn]                                                                             ^
> [warn] /home/jenkins/workspace/SparkPullRequestBuilder/core/target/java/org/apache/spark/scheduler/BlacklistTracker.java:118: error: ExecutorAllocationClient is not public in org.apache.spark; cannot be accessed from outside package
> [warn]   public   BlacklistTracker (org.apache.spark.SparkContext sc, scala.Option<org.apache.spark.ExecutorAllocationClient> allocationClient)  { throw new RuntimeException(); }
> {code}
> However, in some PR builds the same messages are reported as errors,
> causing the build to fail in unidoc. Examples -
> https://github.com/apache/spark/pull/18355
> https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/78484/consoleFull
> {code}
> [info] Constructing Javadoc information...
> [error] /home/jenkins/workspace/SparkPullRequestBuilder@3/core/target/java/org/apache/spark/scheduler/BlacklistTracker.java:117: error: ExecutorAllocationClient is not public in org.apache.spark; cannot be accessed from outside package
> [error]   public   BlacklistTracker (org.apache.spark.scheduler.LiveListenerBus listenerBus, org.apache.spark.SparkConf conf, scala.Option<org.apache.spark.ExecutorAllocationClient> allocationClient, org.apache.spark.util.Clock clock)  { throw new RuntimeException(); }
> [error]                                                                             ^
> [error] /home/jenkins/workspace/SparkPullRequestBuilder@3/core/target/java/org/apache/spark/scheduler/BlacklistTracker.java:118: error: ExecutorAllocationClient is not public in org.apache.spark; cannot be accessed from outside package
> [error]   public   BlacklistTracker (org.apache.spark.SparkContext sc, scala.Option<org.apache.spark.ExecutorAllocationClient> allocationClient)  { throw new RuntimeException(); }
> {code}
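For context, the access rule behind the quoted errors can be shown outside of Spark. A Scala type qualified with `private[spark]` appears in the GenJavaDoc-generated Java sources without the `public` modifier, i.e. package-private to `org.apache.spark`; the generated `BlacklistTracker` stub lives in the subpackage `org.apache.spark.scheduler` (a different package as far as Java access control is concerned), so its public constructor signature references a type it cannot see, and Javadoc enforces the same rule as javac. A minimal sketch of that rule, with hypothetical class names and a single file for brevity:

```java
import java.lang.reflect.Modifier;

// Stand-in for the generated stub of a `private[spark]` type: a top-level
// class declared without `public` is package-private, so it cannot appear
// in the public API of a class in any other package (subpackages included).
class PackagePrivateClient { }

public class VisibilityDemo {
    public static void main(String[] args) {
        // Javadoc applies the same access rules as javac: a type whose
        // modifiers lack `public` is invisible outside its own package.
        int mods = PackagePrivateClient.class.getModifiers();
        System.out.println(Modifier.isPublic(mods) ? "public" : "package-private");
    }
}
```

Running this prints `package-private`, which is exactly the property the Javadoc errors above are complaining about.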



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
