Build failed in Jenkins: Hadoop-Common-trunk #937

2013-10-30 Thread Apache Jenkins Server
See https://builds.apache.org/job/Hadoop-Common-trunk/937/changes

Changes:

[sandy] YARN-1306. Clean up hadoop-sls sample-conf according to YARN-1228 (Wei 
Yan via Sandy Ryza)

[arp] HDFS-5436. Move HsFtpFileSystem and HFtpFileSystem into 
org.apache.hdfs.web. (Contributed by Haohui Mai)

[bikas] Fix inadvertent file changes made via r1536888 (bikas)

[bikas] YARN-1068. Add admin support for HA operations (Karthik Kambatla via 
bikas)

[cmccabe] move HDFS-4657 to branch-2.2.1

[jlowe] MAPREDUCE-5598. TestUserDefinedCounters.testMapReduceJob is flakey. 
Contributed by Robert Kanter

[jlowe] MAPREDUCE-5596. Allow configuring the number of threads used to serve 
shuffle connections. Contributed by Sandy Ryza

--
[...truncated 57195 lines...]
Adding reference: maven.local.repository
[DEBUG] Initialize Maven Ant Tasks
parsing buildfile 
jar:file:/home/jenkins/.m2/repository/org/apache/maven/plugins/maven-antrun-plugin/1.7/maven-antrun-plugin-1.7.jar!/org/apache/maven/ant/tasks/antlib.xml
 with URI = 
jar:file:/home/jenkins/.m2/repository/org/apache/maven/plugins/maven-antrun-plugin/1.7/maven-antrun-plugin-1.7.jar!/org/apache/maven/ant/tasks/antlib.xml
 from a zip file
parsing buildfile 
jar:file:/home/jenkins/.m2/repository/org/apache/ant/ant/1.8.2/ant-1.8.2.jar!/org/apache/tools/ant/antlib.xml
 with URI = 
jar:file:/home/jenkins/.m2/repository/org/apache/ant/ant/1.8.2/ant-1.8.2.jar!/org/apache/tools/ant/antlib.xml
 from a zip file
Class org.apache.maven.ant.tasks.AttachArtifactTask loaded from parent loader 
(parentFirst)
 +Datatype attachartifact org.apache.maven.ant.tasks.AttachArtifactTask
Class org.apache.maven.ant.tasks.DependencyFilesetsTask loaded from parent 
loader (parentFirst)
 +Datatype dependencyfilesets org.apache.maven.ant.tasks.DependencyFilesetsTask
Setting project property: test.build.dir - 
https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/target/test-dir
Setting project property: test.exclude.pattern - _
Setting project property: hadoop.assemblies.version - 3.0.0-SNAPSHOT
Setting project property: test.exclude - _
Setting project property: distMgmtSnapshotsId - apache.snapshots.https
Setting project property: project.build.sourceEncoding - UTF-8
Setting project property: java.security.egd - file:///dev/urandom
Setting project property: distMgmtSnapshotsUrl - 
https://repository.apache.org/content/repositories/snapshots
Setting project property: distMgmtStagingUrl - 
https://repository.apache.org/service/local/staging/deploy/maven2
Setting project property: avro.version - 1.7.4
Setting project property: test.build.data - 
https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/target/test-dir
Setting project property: commons-daemon.version - 1.0.13
Setting project property: hadoop.common.build.dir - 
https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/../../hadoop-common-project/hadoop-common/target
Setting project property: testsThreadCount - 4
Setting project property: maven.test.redirectTestOutputToFile - true
Setting project property: jdiff.version - 1.0.9
Setting project property: distMgmtStagingName - Apache Release Distribution 
Repository
Setting project property: project.reporting.outputEncoding - UTF-8
Setting project property: build.platform - Linux-i386-32
Setting project property: protobuf.version - 2.5.0
Setting project property: failIfNoTests - false
Setting project property: protoc.path - ${env.HADOOP_PROTOC_PATH}
Setting project property: jersey.version - 1.9
Setting project property: distMgmtStagingId - apache.staging.https
Setting project property: distMgmtSnapshotsName - Apache Development Snapshot 
Repository
Setting project property: ant.file - 
https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/pom.xml
[DEBUG] Setting properties with prefix: 
Setting project property: project.groupId - org.apache.hadoop
Setting project property: project.artifactId - hadoop-common-project
Setting project property: project.name - Apache Hadoop Common Project
Setting project property: project.description - Apache Hadoop Common Project
Setting project property: project.version - 3.0.0-SNAPSHOT
Setting project property: project.packaging - pom
Setting project property: project.build.directory - 
https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/target
Setting project property: project.build.outputDirectory - 
https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/target/classes
Setting project property: project.build.testOutputDirectory - 
https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/target/test-classes
Setting project property: project.build.sourceDirectory - 
https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/src/main/java
Setting project property: project.build.testSourceDirectory - 

Re: Question on hadoop dependencies.

2013-10-30 Thread Steve Loughran
Why hello Petar,

Which version are you using?

The reason those dependencies are declared is that things like Jetty
use them - and the classpath for server-side Hadoop is whatever is needed
to run Hadoop.

Client-side, I think there's too much in the Maven dependency tree
(servlets, Jetty, ...).



On 29 October 2013 19:22, Petar Tahchiev paranoia...@gmail.com wrote:

 Hi guys,

 I'm using Spring Data Solr in my project. SDS declares solr-core as a
 dependency, and Solr declares hadoop-auth, hadoop-common and hadoop-hdfs
 as dependencies. Every one of those dependencies declares log4j and
 slf4j-log4j12 as runtime dependencies, and hadoop-common also declares
 servlet-api version 2.5 as a runtime dependency. So in the end I also get
 servlet-api 2.5, slf4j-log4j12 and log4j in my classpath.
 This normally shouldn't be a problem, but in my case I'm using Servlet 3.0,
 Log4j 2, and the SLF4J binding for Log4j 2. This completely messes up my
 classpath, so I have to manually exclude those dependencies, like this:
 ---
 <dependency>
   <groupId>org.apache.solr</groupId>
   <artifactId>solr-core</artifactId>
   <version>${solr.version}</version>
   <exclusions>
     <exclusion>
       <groupId>org.slf4j</groupId>
       <artifactId>slf4j-jdk14</artifactId>
     </exclusion>
     <exclusion>
       <groupId>org.apache.solr</groupId>
       <artifactId>solr-core</artifactId>
     </exclusion>
   </exclusions>
 </dependency>
 <dependency>
   <groupId>org.apache.solr</groupId>
   <artifactId>solr-core</artifactId>
   <version>${solr.version}</version>
   <exclusions>
     <exclusion>
       <groupId>org.apache.hadoop</groupId>
       <artifactId>hadoop-auth</artifactId>
     </exclusion>
     <exclusion>
       <groupId>org.apache.hadoop</groupId>
       <artifactId>hadoop-common</artifactId>
     </exclusion>
     <exclusion>
       <groupId>org.apache.hadoop</groupId>
       <artifactId>hadoop-hdfs</artifactId>
     </exclusion>
   </exclusions>
 </dependency>
 <dependency>
   <groupId>org.apache.hadoop</groupId>
   <artifactId>hadoop-auth</artifactId>
   <version>${hadoop.version}</version>
   <exclusions>
     <exclusion>
       <groupId>log4j</groupId>
       <artifactId>log4j</artifactId>
     </exclusion>
     <exclusion>
       <groupId>org.slf4j</groupId>
       <artifactId>slf4j-log4j12</artifactId>
     </exclusion>
   </exclusions>
 </dependency>
 <dependency>
   <groupId>org.apache.hadoop</groupId>
   <artifactId>hadoop-common</artifactId>
   <version>${hadoop.version}</version>
   <exclusions>
     <exclusion>
       <groupId>log4j</groupId>
       <artifactId>log4j</artifactId>
     </exclusion>
     <exclusion>
       <groupId>org.slf4j</groupId>
       <artifactId>slf4j-log4j12</artifactId>
     </exclusion>
     <exclusion>
       <groupId>javax.servlet</groupId>
       <artifactId>servlet-api</artifactId>
     </exclusion>
   </exclusions>
 </dependency>
 <dependency>
   <groupId>org.apache.hadoop</groupId>
   <artifactId>hadoop-hdfs</artifactId>
   <version>${hadoop.version}</version>
   <exclusions>
     <exclusion>
       <groupId>log4j</groupId>
       <artifactId>log4j</artifactId>
     </exclusion>
     <exclusion>
       <groupId>org.slf4j</groupId>
       <artifactId>slf4j-log4j12</artifactId>
     </exclusion>
     <exclusion>
       <groupId>javax.servlet</groupId>
       <artifactId>servlet-api</artifactId>
     </exclusion>
   </exclusions>
 </dependency>
 ---
 Just out of curiosity here - what is the reason for including log4j,
 slf4j-log4j12 and servlet-api as runtime dependencies? I think they should
 have scope provided. It's good to program your API against SLF4J, but
 then the binding should be specified by the user (Log4j 2 in my case).
 Also, are there any plans to migrate to Log4j 2? It seems pretty solid (9
 betas so far) and is supposed to be released soon.
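
 For instance, a sketch of a provided-scope declaration in the Hadoop POMs
 (illustrative only - not the actual hadoop-common POM, and the version
 property is assumed):

 <dependency>
   <groupId>org.slf4j</groupId>
   <artifactId>slf4j-log4j12</artifactId>
   <version>${slf4j.version}</version>
   <scope>provided</scope>
 </dependency>

 A provided dependency is visible at build time but is not transitive, so
 downstream users would supply their own binding.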

 Thanks for your time, and keep up the good work.

 --
 Regards, Petar!
 Karlovo, Bulgaria.
 ---
 Public PGP Key at:
 https://keyserver1.pgp.com/vkd/DownloadKey.event?keyid=0x19658550C3110611
 Key Fingerprint: A369 A7EE 61BC 93A3 CDFF  55A5 1965 8550 C311 0611



Hadoop in Fedora updated to 2.2.0

2013-10-30 Thread Robert Rati
I've updated the version of Hadoop in Fedora 20 to 2.2.0.  This means 
Hadoop 2.2.0 will be included in the official release of Fedora 20.


Hadoop on Fedora is running against numerous updated dependencies, 
including:

Java 7 (OpenJDK IcedTea)
Jetty 9
Tomcat 7
Jets3t 0.9.0

I've logged/updated jiras for all the changes we've made that could be 
useful to the Hadoop project:


https://issues.apache.org/jira/browse/HADOOP-9594
https://issues.apache.org/jira/browse/MAPREDUCE-5431
https://issues.apache.org/jira/browse/HADOOP-9611
https://issues.apache.org/jira/browse/HADOOP-9613
https://issues.apache.org/jira/browse/HADOOP-9623
https://issues.apache.org/jira/browse/HDFS-5411
https://issues.apache.org/jira/browse/HADOOP-10067
https://issues.apache.org/jira/browse/HDFS-5075
https://issues.apache.org/jira/browse/HADOOP-10068
https://issues.apache.org/jira/browse/HADOOP-10075
https://issues.apache.org/jira/browse/HADOOP-10076
https://issues.apache.org/jira/browse/HADOOP-9849

Most of the changes are minor.  There are two big updates though: Jetty 9 
(which requires Java 7) and Tomcat 7.  These are also the most difficult 
patches to rebase when Hadoop produces a new release.


It would be great to get some feedback on these proposed changes and 
discuss how/when/if these could make it into a Hadoop release.


Rob


Re: Question on hadoop dependencies.

2013-10-30 Thread Steve Loughran
On 30 October 2013 13:07, Petar Tahchiev paranoia...@gmail.com wrote:

 Oh, hi Steve,

 didn't know you were on this list :) ...


Well, I didn't know you were doing Hadoop stuff





 So spring-data-solr (1.1.SNAPSHOT) uses solr 4.5.1 (just came out a few
 days ago), which uses Hadoop 2.0.5-alpha.
 I would be glad if we can clean up the POMs a bit and leave only the
 dependencies that Hadoop really depends on.


I'd like to take this opportunity to assign a JIRA to you:

https://issues.apache.org/jira/browse/HADOOP-9991

Spring Data needs to move on to Hadoop 2.2 - by the time they are ready we
can make sure that the 2.3 POMs are better.



Re: Question on hadoop dependencies.

2013-10-30 Thread Roman Shaposhnik
On Wed, Oct 30, 2013 at 1:07 PM, Steve Loughran ste...@hortonworks.com wrote:
 On 30 October 2013 13:07, Petar Tahchiev paranoia...@gmail.com wrote:
 So spring-data-solr (1.1.SNAPSHOT) uses solr 4.5.1 (just came out a few
 days ago), which uses Hadoop 2.0.5-alpha.
  I would be glad if we can clean up the POMs a bit and leave only the
  dependencies that Hadoop really depends on.

To pile on top of what Steve has said -- do you happen to know if there's
 a JIRA to retarget Solr to depend on Hadoop 2.2.0?

Thanks,
Roman.


Re: Question on hadoop dependencies.

2013-10-30 Thread Petar Tahchiev
Hi Roman,

Looks like they have already upgraded to 2.2:

https://issues.apache.org/jira/browse/SOLR-5382

and it will ship in Solr 4.6. I just hope you guys release a cleaned-up 2.3
first :)


2013/10/30 Roman Shaposhnik r...@apache.org

 On Wed, Oct 30, 2013 at 1:07 PM, Steve Loughran ste...@hortonworks.com
 wrote:
  On 30 October 2013 13:07, Petar Tahchiev paranoia...@gmail.com wrote:
  So spring-data-solr (1.1.SNAPSHOT) uses solr 4.5.1 (just came out a few
  days ago), which uses Hadoop 2.0.5-alpha.
   I would be glad if we can clean up the POMs a bit and leave only the
   dependencies that Hadoop really depends on.

 To pile on top of what Steve has said -- do you happen to know if there's
  a JIRA to retarget Solr to depend on Hadoop 2.2.0?

 Thanks,
 Roman.




-- 
Regards, Petar!
Karlovo, Bulgaria.
---
Public PGP Key at:
https://keyserver1.pgp.com/vkd/DownloadKey.event?keyid=0x19658550C3110611
Key Fingerprint: A369 A7EE 61BC 93A3 CDFF  55A5 1965 8550 C311 0611


test-patch failing with OOM errors in javah

2013-10-30 Thread Alejandro Abdelnur
The following is happening in builds for MAPREDUCE and YARN patches.
I've seen the failures on the hadoop5 and hadoop7 machines. I've increased
Maven memory to 1 GB (export MAVEN_OPTS=-Xmx1024m in the Jenkins
jobs), but some failures still persist:
https://builds.apache.org/job/PreCommit-MAPREDUCE-Build/4159/

Does anybody have an idea of what may be going on?



thx


[INFO] --- native-maven-plugin:1.0-alpha-7:javah (default) @ hadoop-common ---
[INFO] /bin/sh -c cd
/home/jenkins/jenkins-slave/workspace/PreCommit-MAPREDUCE-Build/trunk/hadoop-common-project/hadoop-common
&& /home/jenkins/tools/java/latest/bin/javah -d
/home/jenkins/jenkins-slave/workspace/PreCommit-MAPREDUCE-Build/trunk/hadoop-common-project/hadoop-common/target/native/javah
-classpath 
/home/jenkins/jenkins-slave/workspace/PreCommit-MAPREDUCE-Build/trunk/hadoop-common-project/hadoop-common/target/classes:/home/jenkins/jenkins-slave/workspace/PreCommit-MAPREDUCE-Build/trunk/hadoop-common-project/hadoop-annotations/target/classes:/home/jenkins/tools/java/jdk1.6.0_26/jre/../lib/tools.jar:/home/jenkins/.m2/repository/com/google/guava/guava/11.0.2/guava-11.0.2.jar:/home/jenkins/.m2/repository/com/google/code/findbugs/jsr305/1.3.9/jsr305-1.3.9.jar:/home/jenkins/.m2/repository/commons-cli/commons-cli/1.2/commons-cli-1.2.jar:/home/jenkins/.m2/repository/org/apache/commons/commons-math/2.1/commons-math-2.1.jar:/home/jenkins/.m2/repository/xmlenc/xmlenc/0.52/xmlenc-0.52.jar:/home/jenkins/.m2/repository/commons-httpclient/commons-httpclient/3.1/commons-httpclient-3.1.jar:/home/jenkins/.m2/repository/commons-codec/commons-codec/1.4/commons-codec-1.4.jar:/home/jenkins/.m2/repository/commons-io/commons-io/2.1/commons-io-2.1.jar:/home/jenkins/.m2/repository/commons-net/commons-net/3.1/commons-net-3.1.jar:/home/jenkins/.m2/repository/javax/servlet/servlet-api/2.5/servlet-api-2.5.jar:/home/jenkins/.m2/repository/org/mortbay/jetty/jetty/6.1.26/jetty-6.1.26.jar:/home/jenkins/.m2/repository/org/mortbay/jetty/jetty-util/6.1.26/jetty-util-6.1.26.jar:/home/jenkins/.m2/repository/com/sun/jersey/jersey-core/1.9/jersey-core-1.9.jar:/home/jenkins/.m2/repository/com/sun/jersey/jersey-json/1.9/jersey-json-1.9.jar:/home/jenkins/.m2/repository/org/codehaus/jettison/jettison/1.1/jettison-1.1.jar:/home/jenkins/.m2/repository/stax/stax-api/1.0.1/stax-api-1.0.1.jar:/home/jenkins/.m2/repository/com/sun/xml/bind/jaxb-impl/2.2.3-1/jaxb-impl-2.2.3-1.jar:/home/jenkins/.m2/repository/javax/xml/bind/jaxb-api/2.2.2/jaxb-api-2.2.2.jar:/home/jenkins/.m2/repository/javax/activation/activation/1.1/activation-1.1.jar:/home/jenkins/.m2/repository/org/codehaus/jackson/jackson-jaxrs/1.8.8/jackson-jaxrs-1.8.8.jar:/home/jenkins/.m2/repository/org/codehaus/jackson/jackson-xc/1.8.8/jackson-xc-1.8.8.jar:/home/jenkins/.m2/repository/com/sun/jersey/jersey-server/1.9/jersey-server-1.9.jar:/home/jenkins/.m2/repository/asm/asm/3.2/asm-3.2.jar:/home/jenkins/.m2/repository/commons-logging/commons-logging/1.1.1/commons-logging-1.1.1.jar:/home/jenkins/.m2/repository/log4j/log4j/1.2.17/log4j-1.2.17.jar:/home/jenkins/.m2/repository/net/java/dev/jets3t/jets3t/0.6.1/jets3t-0.6.1.jar:/home/jenkins/.m2/repository/commons-lang/commons-lang/2.5/commons-lang-2.5.jar:/home/jenkins/.m2/repository/commons-configuration/commons-configuration/1.6/commons-configuration-1.6.jar:/home/jenkins/.m2/repository/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar:/home/jenkins/.m2/repository/commons-digester/commons-digester/1.8/commons-digester-1.8.jar:/home/jenkins/.m2/repository/commons-beanutils/commons-beanutils/1.7.0/commons-beanutils-1.7.0.jar:/home/jenkins/.m2/repository/commons-beanutils/commons-beanutils-core/1.8.0/commons-beanutils-core-1.8.0.jar:/home/jenkins/.m2/repository/org/slf4j/slf4j-api/1.7.5/slf4j-api-1.7.5.jar:/home/jenkins/.m2/repository/org/codehaus/jackson/jackson-core-asl/1.8.8/jackson-core-asl-1.8.8.jar:/home/jenkins/.m2/repository/org/codehaus/jackson/jackson-mapper-asl/1.8.8/jackson-mapper-asl-1.8.8.jar:/home/jenkins/.m2/repository/org/apache/avro/avro/1.7.4/avro-1.7.4.jar:/home/jenkins/.m2/repository/com/thoughtworks/paranamer/paranamer/2.3/paranamer-2.3.jar:/home/jenkins/.m2/repository/org/xerial/snappy/snappy-java/1.0.4.1/snappy-java-1.0.4.1.jar:/home/jenkins/.m2/repository/com/google/protobuf/protobuf-
java/2.5.0/protobuf-java-2.5.0.jar:/home/jenkins/jenkins-slave/workspace/PreCommit-MAPREDUCE-Build/trunk/hadoop-common-project/hadoop-auth/target/classes:/home/jenkins/.m2/repository/com/jcraft/jsch/0.1.42/jsch-0.1.42.jar:/home/jenkins/.m2/repository/org/apache/zookeeper/zookeeper/3.4.5/zookeeper-3.4.5.jar:/home/jenkins/.m2/repository/org/apache/commons/commons-compress/1.4.1/commons-compress-1.4.1.jar:/home/jenkins/.m2/repository/org/tukaani/xz/1.0/xz-1.0.jar
org.apache.hadoop.io.compress.zlib.ZlibCompressor
org.apache.hadoop.io.compress.zlib.ZlibDecompressor
org.apache.hadoop.io.compress.bzip2.Bzip2Compressor
org.apache.hadoop.io.compress.bzip2.Bzip2Decompressor

Re: test-patch failing with OOM errors in javah

2013-10-30 Thread Roman Shaposhnik
I can take a look sometime later today. Meantime, I can only
say that I've been running into the 1 GB limit in a few builds
of late. These days I just go with 2 GB by default.
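
For example, in the Jenkins job environment (a sketch of the setting, not
the exact job configuration):

export MAVEN_OPTS="-Xmx2048m"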

Thanks,
Roman.

On Wed, Oct 30, 2013 at 3:33 PM, Alejandro Abdelnur t...@cloudera.com wrote:
 The following is happening in builds for MAPREDUCE and YARN patches.
 I've seen the failures on the hadoop5 and hadoop7 machines. I've increased
 Maven memory to 1 GB (export MAVEN_OPTS=-Xmx1024m in the Jenkins
 jobs), but some failures still persist:
 https://builds.apache.org/job/PreCommit-MAPREDUCE-Build/4159/

 Does anybody have an idea of what may be going on?



 thx


 [INFO] --- native-maven-plugin:1.0-alpha-7:javah (default) @ hadoop-common ---
 [INFO] /bin/sh -c cd
 /home/jenkins/jenkins-slave/workspace/PreCommit-MAPREDUCE-Build/trunk/hadoop-common-project/hadoop-common
 && /home/jenkins/tools/java/latest/bin/javah -d
 /home/jenkins/jenkins-slave/workspace/PreCommit-MAPREDUCE-Build/trunk/hadoop-common-project/hadoop-common/target/native/javah
 -classpath 
 

[jira] [Created] (HADOOP-10077) o.a.h.s.Groups should refresh in the background

2013-10-30 Thread Colin Patrick McCabe (JIRA)
Colin Patrick McCabe created HADOOP-10077:
-

 Summary: o.a.h.s.Groups should refresh in the background
 Key: HADOOP-10077
 URL: https://issues.apache.org/jira/browse/HADOOP-10077
 Project: Hadoop Common
  Issue Type: Bug
Affects Versions: 2.2.1
Reporter: Colin Patrick McCabe
Assignee: Colin Patrick McCabe


{{org.apache.hadoop.security.Groups}} maintains a cache of mappings between 
user names and sets of associated group names.  Periodically, the entries in 
this cache expire and must be refetched from the operating system.

Currently, this is done in the context of whatever thread happens to try to 
access the group mapping information right after the time period expires.  
However, this is problematic, since that thread may be holding the 
{{FSNamesystem}} lock.  This means that if the {{GroupMappingServiceProvider}} 
is slow, the whole NameNode may grind to a halt until it finishes.  This can 
generate periodic load spikes or even NameNode failovers.

Instead, we should allow the refreshing of the group mappings to be done 
asynchronously in a background thread pool.
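
A minimal sketch of the idea - not the actual Groups code. It uses Guava's
LoadingCache (hadoop-common already depends on Guava), with a hypothetical
GroupLookup interface standing in for GroupMappingServiceProvider:

import com.google.common.cache.CacheBuilder;
import com.google.common.cache.CacheLoader;
import com.google.common.cache.LoadingCache;
import com.google.common.util.concurrent.ListenableFuture;
import com.google.common.util.concurrent.ListenableFutureTask;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class BackgroundGroups {

  /** Hypothetical stand-in for GroupMappingServiceProvider. */
  public interface GroupLookup {
    List<String> getGroups(String user) throws Exception;
  }

  private final ExecutorService refreshPool = Executors.newFixedThreadPool(2);
  private final LoadingCache<String, List<String>> cache;

  public BackgroundGroups(final GroupLookup lookup, long ttlSecs) {
    this.cache = CacheBuilder.newBuilder()
        // After the TTL an entry is refreshed rather than evicted, so
        // callers keep reading the old value until the reload finishes.
        .refreshAfterWrite(ttlSecs, TimeUnit.SECONDS)
        .build(new CacheLoader<String, List<String>>() {
          @Override
          public List<String> load(String user) throws Exception {
            return lookup.getGroups(user);  // only the first lookup blocks
          }

          @Override
          public ListenableFuture<List<String>> reload(final String user,
              List<String> oldGroups) {
            // Refetch on a pool thread so a caller that happens to hold
            // the FSNamesystem lock never waits on a slow provider.
            ListenableFutureTask<List<String>> task =
                ListenableFutureTask.create(new Callable<List<String>>() {
                  @Override
                  public List<String> call() throws Exception {
                    return lookup.getGroups(user);
                  }
                });
            refreshPool.execute(task);
            return task;
          }
        });
  }

  public List<String> getGroups(String user) throws Exception {
    return cache.get(user);
  }
}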



--
This message was sent by Atlassian JIRA
(v6.1#6144)


Re: test-patch failing with OOM errors in javah

2013-10-30 Thread Omkar Joshi
Yes, even I do the same. I just use this to build:

export _JAVA_OPTIONS="-Djava.awt.headless=true -Xmx2048m -Xms2048m"
mvn clean install package -Pdist -Dtar -DskipTests -Dmaven.javadoc.skip=true


Thanks,
Omkar Joshi
Hortonworks Inc. - http://www.hortonworks.com


On Wed, Oct 30, 2013 at 3:39 PM, Roman Shaposhnik r...@apache.org wrote:

 I can take a look sometime later today. Meantime, I can only
 say that I've been running into the 1 GB limit in a few builds
 of late. These days I just go with 2 GB by default.

 Thanks,
 Roman.

 On Wed, Oct 30, 2013 at 3:33 PM, Alejandro Abdelnur t...@cloudera.com
 wrote:
  The following is happening in builds for MAPREDUCE and YARN patches.
  I've seen the failures on the hadoop5 and hadoop7 machines. I've increased
  Maven memory to 1 GB (export MAVEN_OPTS=-Xmx1024m in the Jenkins
  jobs), but some failures still persist:
  https://builds.apache.org/job/PreCommit-MAPREDUCE-Build/4159/
 
  Does anybody have an idea of what may be going on?
 
 
 
  thx
 
 
  [INFO] --- native-maven-plugin:1.0-alpha-7:javah (default) @
 hadoop-common ---
  [INFO] /bin/sh -c cd
 
 /home/jenkins/jenkins-slave/workspace/PreCommit-MAPREDUCE-Build/trunk/hadoop-common-project/hadoop-common
  && /home/jenkins/tools/java/latest/bin/javah -d
 
 /home/jenkins/jenkins-slave/workspace/PreCommit-MAPREDUCE-Build/trunk/hadoop-common-project/hadoop-common/target/native/javah
  -classpath
 /home/jenkins/jenkins-slave/workspace/PreCommit-MAPREDUCE-Build/trunk/hadoop-common-project/hadoop-common/target/classes:/home/jenkins/jenkins-slave/workspace/PreCommit-MAPREDUCE-Build/trunk/hadoop-common-project/hadoop-annotations/target/classes:/home/jenkins/tools/java/jdk1.6.0_26/jre/../lib/tools.jar:/home/jenkins/.m2/repository/com/google/guava/guava/11.0.2/guava-11.0.2.jar:/home/jenkins/.m2/repository/com/google/code/findbugs/jsr305/1.3.9/jsr305-1.3.9.jar:/home/jenkins/.m2/repository/commons-cli/commons-cli/1.2/commons-cli-1.2.jar:/home/jenkins/.m2/repository/org/apache/commons/commons-math/2.1/commons-math-2.1.jar:/home/jenkins/.m2/repository/xmlenc/xmlenc/0.52/xmlenc-0.52.jar:/home/jenkins/.m2/repository/commons-httpclient/commons-httpclient/3.1/commons-httpclient-3.1.jar:/home/jenkins/.m2/repository/commons-codec/commons-codec/1.4/commons-codec-1.4.jar:/home/jenkins/.m2/repository/commons-io/commons-io/2.1/commons-io-2.1.jar:/home/jenkins/.m2/repository/commons-net/commons-net/3.1/commons-net-3.1.jar:/home/jenkins/.m2/repository/javax/servlet/servlet-api/2.5/servlet-api-2.5.jar:/home/jenkins/.m2/repository/org/mortbay/jetty/jetty/6.1.26/jetty-6.1.26.jar:/home/jenkins/.m2/repository/org/mortbay/jetty/jetty-util/6.1.26/jetty-util-6.1.26.jar:/home/jenkins/.m2/repository/com/sun/jersey/jersey-core/1.9/jersey-core-1.9.jar:/home/jenkins/.m2/repository/com/sun/jersey/jersey-json/1.9/jersey-json-1.9.jar:/home/jenkins/.m2/repository/org/codehaus/jettison/jettison/1.1/jettison-1.1.jar:/home/jenkins/.m2/repository/stax/stax-api/1.0.1/stax-api-1.0.1.jar:/home/jenkins/.m2/repository/com/sun/xml/bind/jaxb-impl/2.2.3-1/jaxb-impl-2.2.3-1.jar:/home/jenkins/.m2/repository/javax/xml/bind/jaxb-api/2.2.2/jaxb-api-2.2.2.jar:/home/jenkins/.m2/repository/javax/activation/activation/1.1/activation-1.1.jar:/home/jenkins/.m2/repository/org/codehaus/jackson/jackson-jaxrs/1.8.8/jackson-jaxrs-1.8.8.jar:/home/jenkins/.m2/repository/org/codehaus/jackson/jackson-xc/1.8.8/jackson-xc-1.8.8.jar:/home/jenkins/.m2/repository/com/sun/jersey/jersey-server/1.9/jersey-server-1.9.jar:/home/jenkins/.m2/repository/asm/asm/3.2/asm-3.2.jar:/home/jenkins/.m2/repository/commons-logging/commons-logging/1.1.1/commons-logging-1.1.1.jar:/home/jenkins/.m2/repository/log4j/log4j/1.2.17/log4j-1.2.17.jar:/home/jenkins/.m2/repository/net/java/dev/jets3t/jets3t/0.6.1/jets3t-0.6.1.jar:/home/jenkins/.m2/repository/commons-lang/commons-lang/2.5/commons-lang-2.5.jar:/home/jenkins/.m2/repository/commons-configuration/commons-configuration/1.6/commons-configuration-1.6.jar:/home/jenkins/.m2/repository/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar:/home/jenkins/.m2/repository/commons-digester/commons-digester/1.8/commons-digester-1.8.jar:/home/jenkins/.m2/repository/commons-beanutils/commons-beanutils/1.7.0/commons-beanutils-1.7.0.jar:/home/jenkins/.m2/repository/commons-beanutils/commons-beanutils-core/1.8.0/commons-beanutils-core-1.8.0.jar:/home/jenkins/.m2/repository/org/slf4j/slf4j-api/1.7.5/slf4j-api-1.7.5.jar:/home/jenkins/.m2/repository/org/codehaus/jackson/jackson-core-asl/1.8.8/jackson-core-asl-1.8.8.jar:/home/jenkins/.m2/repository/org/codehaus/jackson/jackson-mapper-asl/1.8.8/jackson-mapper-asl-1.8.8.jar:/home/jenkins/.m2/repository/org/apache/avro/avro/1.7.4/avro-1.7.4.jar:/home/jenkins/.m2/repository/com/thoughtworks/paranamer/paranamer/2.3/paranamer-2.3.jar:/home/jenkins/.m2/repository/org/xerial/snappy/snappy-java/
 

Re: Empty value in write() method : Custom Datatype in Hadoop MapReduce

2013-10-30 Thread unmesha sreeveni
Empty value in write() method : Custom Datatype in Hadoop MapReduce [Solved]
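
The problem: the constructor only allocated the backing array and never
copied the input values in, and readFields() assigned to a local array that
shadowed the field, so write() only ever saw zeros. A sketch of the
corrected methods (java.util.Arrays is already imported for toString()):

public MF(double[][] value) {
    // Allocate and copy; the original only allocated.
    this.value = new double[value.length][value[0].length];
    for (int i = 0; i < value.length; i++) {
        this.value[i] = Arrays.copyOf(value[i], value[i].length);
    }
}

@Override
public void readFields(DataInput in) throws IOException {
    int row = in.readInt();
    int col = in.readInt();
    value = new double[row][col];  // assign the field, not a shadowing local
    for (int i = 0; i < row; i++) {
        for (int j = 0; j < col; j++) {
            value[i][j] = in.readDouble();
        }
    }
}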


On Wed, Oct 30, 2013 at 11:23 AM, unmesha sreeveni unmeshab...@gmail.com wrote:

 I am emitting two 2D double arrays as key and value. I am in the middle of
 constructing my WritableComparable class.


 public class MF implements WritableComparable<MF> {

     private double[][] value;

     public MF() {
         // TODO Auto-generated constructor stub
     }

     public MF(double[][] value) {
         // TODO Auto-generated constructor stub
         this.value = new double[value.length][value[0].length];
         // System.out.println("in matrix");
     }

     public void set(double[][] value) {
         this.value = value;
     }

     public double[][] getValue() {
         return value;
     }

     @Override
     public void write(DataOutput out) throws IOException {
         System.out.println("write");
         int row = 0;
         int column = 0;
         for (int i = 0; i < value.length; i++) {
             row = value.length;
             for (int j = 0; j < value[i].length; j++) {
                 column = value[i].length;
             }
         }
         out.writeInt(row);
         out.writeInt(column);

         for (int i = 0; i < value.length; i++) {
             for (int j = 0; j < value[0].length; j++) {
                 out.writeDouble(value[i][j]);
             }
         }
         for (int i = 0; i < value.length; i++) {
             for (int j = 0; j < value[0].length; j++) {
                 System.out.print(value[i][j] + "\t");
             }
             System.out.println();
         }
     }

     @Override
     public void readFields(DataInput in) throws IOException {
         int row = in.readInt();
         int col = in.readInt();

         double[][] value = new double[row][col];
         for (int i = 0; i < row; i++) {
             for (int j = 0; j < col; j++) {
                 value[i][j] = in.readDouble();
             }
         }
     }

     @Override
     public int hashCode() {
         return 0; // stub, not implemented yet
     }

     @Override
     public boolean equals(Object o) {
         return false; // stub, not implemented yet
     }

     @Override
     public int compareTo(MF o) {
         // TODO Auto-generated method stub
         return 0;
     }

     @Override
     public String toString() {
         System.out.println(Arrays.toString(value));
         return Arrays.toString(value);
     }
 }

 And halfway through, when I tried to print my matrix inside the write()
 method, the matrix had no values:


   write

 0.0   0.0 0.0 
 0.0   0.0 0.0 
 0.0   0.0 0.0 


   write

 0.0   0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 
 0.0   0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 
 0.0   0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 


 I am doing something wrong. Can anyone help me resolve this? I am
 getting the correct number of rows and columns.

 --
 Thanks & Regards,
 Unmesha Sreeveni U.B




-- 
Thanks & Regards,
Unmesha Sreeveni U.B
Junior Developer
Amrita Center For Cyber Security
Amritapuri.
www.amrita.edu/cyber/