Github user squito commented on the pull request:

    https://github.com/apache/spark/pull/8438#issuecomment-141514207
  
    Hi @mccheah , on the mima tests -- those check for binary compatibility 
across Spark versions.  If you click through to the link for the test build, 
you'll see this in the test output:
    
    ```
    [info] spark-core: found 12 potential binary incompatibilities (filtered 
560)
    [error]  * object org.apache.spark.CleanCheckpoint does not have a 
correspondent in new version
    [error]    filter with: 
ProblemFilters.exclude[MissingClassProblem]("org.apache.spark.CleanCheckpoint$")
    [error]  * class org.apache.spark.CleanupTaskWeakReference does not have a 
correspondent in new version
    [error]    filter with: 
ProblemFilters.exclude[MissingClassProblem]("org.apache.spark.CleanupTaskWeakReference")
    [error]  * class org.apache.spark.CleanBroadcast does not have a 
correspondent in new version
    [error]    filter with: 
ProblemFilters.exclude[MissingClassProblem]("org.apache.spark.CleanBroadcast")
    [error]  * object org.apache.spark.CleanShuffle does not have a 
correspondent in new version
    [error]    filter with: 
ProblemFilters.exclude[MissingClassProblem]("org.apache.spark.CleanShuffle$")
    [error]  * class org.apache.spark.CleanRDD does not have a correspondent in 
new version
    [error]    filter with: 
ProblemFilters.exclude[MissingClassProblem]("org.apache.spark.CleanRDD")
    [error]  * class org.apache.spark.CleanCheckpoint does not have a 
correspondent in new version
    [error]    filter with: 
ProblemFilters.exclude[MissingClassProblem]("org.apache.spark.CleanCheckpoint")
    [error]  * object org.apache.spark.CleanBroadcast does not have a 
correspondent in new version
    [error]    filter with: 
ProblemFilters.exclude[MissingClassProblem]("org.apache.spark.CleanBroadcast$")
    [error]  * object org.apache.spark.CleanAccum does not have a correspondent 
in new version
    [error]    filter with: 
ProblemFilters.exclude[MissingClassProblem]("org.apache.spark.CleanAccum$")
    [error]  * object org.apache.spark.CleanRDD does not have a correspondent 
in new version
    [error]    filter with: 
ProblemFilters.exclude[MissingClassProblem]("org.apache.spark.CleanRDD$")
    [error]  * class org.apache.spark.CleanAccum does not have a correspondent 
in new version
    [error]    filter with: 
ProblemFilters.exclude[MissingClassProblem]("org.apache.spark.CleanAccum")
    [error]  * class org.apache.spark.CleanShuffle does not have a 
correspondent in new version
    [error]    filter with: 
ProblemFilters.exclude[MissingClassProblem]("org.apache.spark.CleanShuffle")
    [error]  * interface org.apache.spark.CleanupTask does not have a 
correspondent in new version
    [error]    filter with: 
ProblemFilters.exclude[MissingClassProblem]("org.apache.spark.CleanupTask")
    ```
    
    You can see it's complaining about classes like 
`org.apache.spark.CleanAccum` that you moved around.  Mima often has false 
positives, so the tool lets you override these checks.  Each "filter with" 
line tells you exactly how.  Just stick those here:  
https://github.com/apache/spark/blob/master/project/MimaExcludes.scala#L73
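    
    Concretely, copying those "filter with" lines into the exclusion list 
would look something like this (a sketch only -- the exact surrounding 
structure in `MimaExcludes.scala` varies between Spark versions, so match 
whatever the entries around line 73 look like):
    
    ```scala
    // project/MimaExcludes.scala -- add alongside the existing exclusions
    // for the current release:
    ProblemFilters.exclude[MissingClassProblem]("org.apache.spark.CleanAccum"),
    ProblemFilters.exclude[MissingClassProblem]("org.apache.spark.CleanAccum$"),
    ProblemFilters.exclude[MissingClassProblem]("org.apache.spark.CleanRDD"),
    ProblemFilters.exclude[MissingClassProblem]("org.apache.spark.CleanRDD$"),
    // ... and so on, one entry per "filter with" line in the MiMa output above
    ```
    
    Note the `$`-suffixed names are the companion objects, so each class you 
moved needs both entries if MiMa flagged both.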
    
    I *think* these are false positives, since those classes are 
`private`.  (I'd want to take a closer look before being certain about 
that.)  In any case, you can add those filters for now to at least let the 
tests get further.

