Re: Can not build master

2015-07-04 Thread tomo cocoa
Hi all,

I have the same error, and it seems to depend on the Maven version.

I tried building Spark on Jenkins with several Maven versions.

+ Output of
/Users/tomohiko/.jenkins/tools/hudson.tasks.Maven_MavenInstallation/mvn-3.3.3/bin/mvn
-version:

Apache Maven 3.3.3 (7994120775791599e205a5524ec3e0dfe41d4a06;
2015-04-22T20:57:37+09:00)
Maven home:
/Users/tomohiko/.jenkins/tools/hudson.tasks.Maven_MavenInstallation/mvn-3.3.3
Java version: 1.8.0, vendor: Oracle Corporation
Java home: /Library/Java/JavaVirtualMachines/jdk1.8.0.jdk/Contents/Home/jre
Default locale: en_US, platform encoding: UTF-8
OS name: mac os x, version: 10.10.3, arch: x86_64, family: mac

+ Jenkins Configuration:
Jenkins project type: Maven Project
Goals and options: -Phadoop-2.6 -DskipTests clean package

+ Maven versions and results:
3.3.3 - infinite loop
3.3.1 - infinite loop
3.2.5 - SUCCESS


So should we prefer Maven 3.2.5 for building Spark?
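
If we do settle on 3.2.5, one option would be to fail fast on the broken
versions with the Maven Enforcer Plugin. A minimal sketch (the version
range is only an assumption based on the results above):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-enforcer-plugin</artifactId>
  <version>1.4</version>
  <executions>
    <execution>
      <id>enforce-maven-version</id>
      <goals>
        <goal>enforce</goal>
      </goals>
      <configuration>
        <rules>
          <!-- 3.3.x looped forever here; require the known-good 3.2.x line -->
          <requireMavenVersion>
            <version>[3.0.4,3.3.0)</version>
          </requireMavenVersion>
        </rules>
      </configuration>
    </execution>
  </executions>
</plugin>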


On 4 July 2015 at 12:28, Andrew Or and...@databricks.com wrote:

 Thanks, I just tried it with 3.3.3 and I was able to reproduce it as well.

 2015-07-03 18:51 GMT-07:00 Tarek Auel tarek.a...@gmail.com:

 That's mine:

 Apache Maven 3.3.3 (7994120775791599e205a5524ec3e0dfe41d4a06;
 2015-04-22T04:57:37-07:00)

 Maven home: /usr/local/Cellar/maven/3.3.3/libexec

 Java version: 1.8.0_45, vendor: Oracle Corporation

 Java home:
 /Library/Java/JavaVirtualMachines/jdk1.8.0_45.jdk/Contents/Home/jre

 Default locale: en_US, platform encoding: UTF-8

 OS name: mac os x, version: 10.10.3, arch: x86_64, family: mac

 On Fri, Jul 3, 2015 at 6:32 PM Ted Yu yuzhih...@gmail.com wrote:

 Here is mine:

 Apache Maven 3.3.1 (cab6659f9874fa96462afef40fcf6bc033d58c1c;
 2015-03-13T13:10:27-07:00)
 Maven home: /home/hbase/apache-maven-3.3.1
 Java version: 1.8.0_45, vendor: Oracle Corporation
 Java home: /home/hbase/jdk1.8.0_45/jre
 Default locale: en_US, platform encoding: UTF-8
 OS name: linux, version: 2.6.32-504.el6.x86_64, arch: amd64,
 family: unix

 On Fri, Jul 3, 2015 at 6:05 PM, Andrew Or and...@databricks.com wrote:

 @Tarek and Ted, what maven versions are you using?

 2015-07-03 17:35 GMT-07:00 Krishna Sankar ksanka...@gmail.com:

 Patrick,
  I assume an RC3 will be out for folks like me to test. As usual, I
  will run the tests when you have a new distribution.
 Cheers
 k/

 On Fri, Jul 3, 2015 at 4:38 PM, Patrick Wendell pwend...@gmail.com
 wrote:

 Patch that added test-jar dependencies:
 https://github.com/apache/spark/commit/bfe74b34

 Patch that originally disabled dependency reduced poms:

 https://github.com/apache/spark/commit/984ad60147c933f2d5a2040c87ae687c14eb1724

 Patch that reverted the disabling of dependency reduced poms:

 https://github.com/apache/spark/commit/bc51bcaea734fe64a90d007559e76f5ceebfea9e

 On Fri, Jul 3, 2015 at 4:36 PM, Patrick Wendell pwend...@gmail.com
 wrote:
  Okay, I did some forensics with Sean Owen. Some things about this
  bug:
 
  1. The underlying cause is that we added some code to make the tests
  of submodules depend on the core tests (a sketch of such a dependency
  follows this list). For unknown reasons this causes Spark to hit
  MSHADE-148 for *some* combinations of build profiles.
 
  2. MSHADE-148 can be worked around by disabling the building of
  dependency-reduced poms, since that circumvents the buggy code path.
  Andrew Or did this in a patch on the 1.4 branch. However, that is not
  a tenable option for us, because our *published* pom files require
  dependency reduction to substitute the Scala version correctly into
  the poms published to Maven Central.
 
  3. As a result, Andrew Or reverted his patch recently, causing some
  package builds to start failing again (but publishing works now).
 
  4. The reason this is not detected in our test harness or release
  build is that it is sensitive to the profiles enabled. The
  combination of profiles we enable in the test harness and release
  builds does not trigger this bug.
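
For context on point 1, a test-jar dependency of the kind described looks
roughly like this in a submodule pom (the coordinates here are
illustrative, not the exact ones from the patch):

<!-- Sketch: a submodule depending on core's test classes -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>${project.version}</version>
  <type>test-jar</type>
  <scope>test</scope>
</dependency>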
 
  The best path I see forward right now is to do the following:
 
  1. Disable creation of dependency-reduced poms by default (this
  doesn't matter for people doing a package build) so typical users
  won't hit this bug.
 
  2. Add a profile that re-enables that setting (see the sketch after
  this message).
 
  3. Use the above profile when publishing release artifacts to Maven
  Central.
 
  4. Hope that we don't hit this bug for publishing.
 
  - Patrick
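
A minimal sketch of how steps 1-3 could fit together in a pom (the
profile id and property name below are made up for illustration;
createDependencyReducedPom is the shade plugin's actual switch):

<properties>
  <!-- Step 1: skip dependency-reduced poms by default (dodges MSHADE-148) -->
  <create.dependency.reduced.pom>false</create.dependency.reduced.pom>
</properties>

<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <configuration>
        <createDependencyReducedPom>${create.dependency.reduced.pom}</createDependencyReducedPom>
      </configuration>
    </plugin>
  </plugins>
</build>

<profiles>
  <!-- Step 2: a profile that re-enables the setting; step 3 activates it
       with -Prelease-pom when publishing to Maven Central -->
  <profile>
    <id>release-pom</id>
    <properties>
      <create.dependency.reduced.pom>true</create.dependency.reduced.pom>
    </properties>
  </profile>
</profiles>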
 
  On Fri, Jul 3, 2015 at 3:51 PM, Tarek Auel tarek.a...@gmail.com
 wrote:
  Doesn't change anything for me.
 
  On Fri, Jul 3, 2015 at 3:45 PM Patrick Wendell pwend...@gmail.com
 wrote:
 
   Can you try using the built-in Maven (build/mvn ...)? All of our
   builds are passing on Jenkins, so I wonder if it's a Maven version
   issue:
 
  https://amplab.cs.berkeley.edu/jenkins/view/Spark-QA-Compile/
 
  - Patrick
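
(For reference, Spark's build/mvn wrapper passes its arguments through to
mvn, so the invocation from the Jenkins configuration above should work
unchanged, e.g. ./build/mvn -Phadoop-2.6 -DskipTests clean package from
the repository root.)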
 
  On Fri, Jul 3, 2015 at 3:14 PM, Ted Yu yuzhih...@gmail.com
 wrote:
   Please take a look at SPARK-8781
   (https://github.com/apache/spark/pull/7193)
  
   Cheers
  
   On Fri, Jul 3, 2015 at 3:05 PM, Tarek Auel 
 tarek.a...@gmail.com 

Which versions of Hadoop does Spark support?

2014-10-04 Thread tomo cocoa
Hi,

I reported this issue (https://issues.apache.org/jira/browse/SPARK-3794)
about a compilation error in Spark core.
The error depends on the Hadoop version; the problematic versions are
1.1.1 through 2.2.0.

First, we should discuss which versions of Hadoop Spark supports.
If we decide to drop support for those versions, things are simple and no
modification is needed.
Otherwise, we should be careful to use only commons-io 2.1.
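
For illustration, the constraint could be expressed in the parent pom
with dependencyManagement (a sketch; whether a hard pin is the right fix
is exactly the question above):

<dependencyManagement>
  <dependencies>
    <!-- Force a single commons-io version regardless of the Hadoop profile -->
    <dependency>
      <groupId>commons-io</groupId>
      <artifactId>commons-io</artifactId>
      <version>2.1</version>
    </dependency>
  </dependencies>
</dependencyManagement>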


Regards,
cocoatomo

-- 
class Cocoatomo:
name = 'cocoatomo'
email_address = 'cocoatom...@gmail.com'
twitter_id = '@cocoatomo'