[ https://issues.apache.org/jira/browse/SPARK-3819?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14164453#comment-14164453 ]

Patrick Wendell commented on SPARK-3819:
----------------------------------------

Sorry, I should have been clearer. We actually run the tests after every 
commit (not just nightly). I usually revert patches that cause failures on 
certain Hadoop versions and ask the author to re-submit - that is as 
explicit as we can be. We just don't run every Hadoop version before merging 
every commit because it would increase the duration of the test runs by 
several hours.

The reason we didn't notice your failed patch immediately is that [~vanzin] 
was working on fixing a separate issue that was already failing some builds, 
which masked the new failure. We are in the process of cleaning these up.

> Jenkins should compile Spark against multiple versions of Hadoop
> ----------------------------------------------------------------
>
>                 Key: SPARK-3819
>                 URL: https://issues.apache.org/jira/browse/SPARK-3819
>             Project: Spark
>          Issue Type: Bug
>          Components: Build
>    Affects Versions: 1.1.0
>            Reporter: Matt Cheah
>            Priority: Minor
>              Labels: Jenkins
>             Fix For: 1.1.1
>
>
> The build broke because of PR 
> https://github.com/apache/spark/pull/2609#issuecomment-57962393 - however, 
> the build failure was not caught by Jenkins. From what I understand, the 
> failure only occurs when Spark is built against certain versions of 
> Hadoop.
> It seems intuitive that Jenkins should catch this sort of thing: the code 
> should at least be compiled against multiple Hadoop versions, although 
> running the full test suite against every Hadoop version seems like 
> overkill.
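The compile-only check proposed above could be sketched roughly as follows. This is a hedged illustration, not the actual Jenkins job: the profile names are examples of the Hadoop build profiles Spark's pom.xml has exposed, and the exact list depends on the Spark version being built. The script just prints the Maven invocation each CI shard would run (a dry run), so no Spark checkout is required.

```shell
# Illustrative Hadoop profiles - the real set comes from Spark's pom.xml
# for the branch under test.
profiles="hadoop-1.0 hadoop-2.2 hadoop-2.3 hadoop-2.4"

# Dry run: print one compile-only Maven invocation per profile.
# -DskipTests keeps this a compilation check rather than a full test run.
for p in $profiles; do
  echo "mvn -P$p -DskipTests clean package"
done
```

In a real job each of these invocations would run in its own shard (or sequentially), so a compile failure against any one Hadoop version fails the build without paying the cost of the full test suite per version.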



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

