Build failed in Jenkins: Hadoop-Common-trunk #535

2012-09-16 Thread Apache Jenkins Server
See https://builds.apache.org/job/Hadoop-Common-trunk/535/

--
[...truncated 13300 lines...]
[DEBUG]   (f) outputDirectory = https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-auth/target/test-classes
[DEBUG]   (f) project = MavenProject: org.apache.hadoop:hadoop-auth:3.0.0-SNAPSHOT @ https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-auth/pom.xml
[DEBUG]   (f) resources = [Resource {targetPath: null, filtering: true, FileSet {directory: https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-auth/src/test/resources, PatternSet [includes: {krb5.conf}, excludes: {}]}}]
[DEBUG] -- end configuration --
[INFO] Using default encoding to copy filtered resources.
[INFO] 
[INFO] --- maven-compiler-plugin:2.5.1:testCompile (default-testCompile) @ hadoop-auth ---
[DEBUG] Configuring mojo org.apache.maven.plugins:maven-compiler-plugin:2.5.1:testCompile from plugin realm ClassRealm[plugin>org.apache.maven.plugins:maven-compiler-plugin:2.5.1, parent: sun.misc.Launcher$AppClassLoader@126b249]
[DEBUG] Configuring mojo 'org.apache.maven.plugins:maven-compiler-plugin:2.5.1:testCompile' with basic configurator -->
[DEBUG]   (f) basedir = https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-auth
[DEBUG]   (f) buildDirectory = https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-auth/target
[DEBUG]   (f) classpathElements = [https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-auth/target/test-classes,
 https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-auth/target/classes,
 https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-annotations/target/hadoop-annotations-3.0.0-SNAPSHOT.jar,
 /home/jenkins/.m2/repository/junit/junit/4.8.2/junit-4.8.2.jar,
 /home/jenkins/.m2/repository/org/mockito/mockito-all/1.8.5/mockito-all-1.8.5.jar,
 /home/jenkins/.m2/repository/org/mortbay/jetty/jetty/6.1.26/jetty-6.1.26.jar,
 /home/jenkins/.m2/repository/org/mortbay/jetty/jetty-util/6.1.26/jetty-util-6.1.26.jar,
 /home/jenkins/.m2/repository/javax/servlet/servlet-api/2.5/servlet-api-2.5.jar,
 /home/jenkins/.m2/repository/org/slf4j/slf4j-api/1.6.1/slf4j-api-1.6.1.jar,
 /home/jenkins/.m2/repository/commons-codec/commons-codec/1.4/commons-codec-1.4.jar,
 /home/jenkins/.m2/repository/log4j/log4j/1.2.17/log4j-1.2.17.jar,
 /home/jenkins/.m2/repository/org/slf4j/slf4j-log4j12/1.6.1/slf4j-log4j12-1.6.1.jar,
 /home/jenkins/tools/java/jdk1.6.0_26/jre/../lib/tools.jar]
[DEBUG]   (f) compileSourceRoots = [https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-auth/src/test/java]
[DEBUG]   (f) compilerId = javac
[DEBUG]   (f) debug = true
[DEBUG]   (f) encoding = UTF-8
[DEBUG]   (f) failOnError = true
[DEBUG]   (f) fork = false
[DEBUG]   (f) generatedTestSourcesDirectory = https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-auth/target/generated-test-sources/test-annotations
[DEBUG]   (f) optimize = false
[DEBUG]   (f) outputDirectory = https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-auth/target/test-classes
[DEBUG]   (f) outputFileName = hadoop-auth-3.0.0-SNAPSHOT
[DEBUG]   (f) session = org.apache.maven.execution.MavenSession@727896
[DEBUG]   (f) showDeprecation = false
[DEBUG]   (f) showWarnings = false
[DEBUG]   (f) source = 1.6
[DEBUG]   (f) staleMillis = 0
[DEBUG]   (f) target = 1.6
[DEBUG]   (f) verbose = false
[DEBUG] -- end configuration --
[DEBUG] Using compiler 'javac'.
[DEBUG] Source directories: [https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-auth/src/test/java]
[DEBUG] Classpath: [https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-auth/target/test-classes
 https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-auth/target/classes
 https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-annotations/target/hadoop-annotations-3.0.0-SNAPSHOT.jar
 /home/jenkins/.m2/repository/junit/junit/4.8.2/junit-4.8.2.jar
 /home/jenkins/.m2/repository/org/mockito/mockito-all/1.8.5/mockito-all-1.8.5.jar
 /home/jenkins/.m2/repository/org/mortbay/jetty/jetty/6.1.26/jetty-6.1.26.jar
 /home/jenkins/.m2/repository/org/mortbay/jetty/jetty-util/6.1.26/jetty-util-6.1.26.jar
 /home/jenkins/.m2/repository/javax/servlet/servlet-api/2.5/servlet-api-2.5.jar
 /home/jenkins/.m2/repository/org/slf4j/slf4j-api/1.6.1/slf4j-api-1.6.1.jar
 /home/jenkins/.m2/repository/commons-codec/commons-codec/1.4/commons-codec-1.4.jar
 /home/jenkins/.m2/repository/log4j/log4j/1.2.17/log4j-1.2.17.jar
 /home/jenkins/.m2/repository/org/slf4j/slf4j-log4j12/1.6.1/slf4j-log4j12-1.6.1.jar

[jira] [Created] (HADOOP-8816) HTTP Error 413 full HEAD if using kerberos authentication

2012-09-16 Thread Moritz Moeller (JIRA)
Moritz Moeller created HADOOP-8816:
--

 Summary: HTTP Error 413 full HEAD if using kerberos authentication
 Key: HADOOP-8816
 URL: https://issues.apache.org/jira/browse/HADOOP-8816
 Project: Hadoop Common
  Issue Type: Bug
  Components: net
Affects Versions: 2.0.1-alpha
 Environment: Ubuntu Linux with Active Directory Kerberos.
Reporter: Moritz Moeller


When Kerberos authentication is used, the HTTP authentication header is too large and the request is rejected by Jetty, because Jetty's default header buffer size is too small.

This can be fixed by adding ret.setHeaderBufferSize(1024*128); in org.apache.hadoop.http.HttpServer.createDefaultChannelConnector.
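For illustration only, a minimal sketch of the suggested change against the Jetty 6 API (org.mortbay jetty-6.1.26, the version on the build classpath above); the wrapper class and method below are hypothetical and not the actual HttpServer code:

    import org.mortbay.jetty.Connector;
    import org.mortbay.jetty.nio.SelectChannelConnector;

    // Hypothetical sketch: raise Jetty 6's header buffer size so large
    // Kerberos/SPNEGO authentication headers are not rejected with HTTP 413.
    public class LargeHeaderConnector {
        static Connector createConnector() {
            SelectChannelConnector ret = new SelectChannelConnector();
            // Jetty 6 defaults to a few KB; Kerberos tickets can be much larger.
            ret.setHeaderBufferSize(1024 * 128);
            return ret;
        }
    }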



--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira


[jira] [Created] (HADOOP-8817) Backport Network Topology Extension for Virtualization (HADOOP-8468) to branch-1

2012-09-16 Thread Junping Du (JIRA)
Junping Du created HADOOP-8817:
--

 Summary: Backport Network Topology Extension for Virtualization 
(HADOOP-8468) to branch-1
 Key: HADOOP-8817
 URL: https://issues.apache.org/jira/browse/HADOOP-8817
 Project: Hadoop Common
  Issue Type: Sub-task
Affects Versions: 1.0.0
Reporter: Junping Du
Assignee: Junping Du






[jira] [Created] (HADOOP-8818) Should use equals() rather than == to compare String or Text in MD5MD5CRC32FileChecksum and TFileDumper

2012-09-16 Thread Brandon Li (JIRA)
Brandon Li created HADOOP-8818:
--

 Summary: Should use equals() rather than == to compare String or 
Text in MD5MD5CRC32FileChecksum and TFileDumper
 Key: HADOOP-8818
 URL: https://issues.apache.org/jira/browse/HADOOP-8818
 Project: Hadoop Common
  Issue Type: Bug
  Components: fs, io
Reporter: Brandon Li
Assignee: Brandon Li
Priority: Minor
 Attachments: HADOOP-8818.patch

Should use equals() rather than == to compare String or Text in 
MD5MD5CRC32FileChecksum and TFileDumper.
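For context, a minimal sketch of the difference (hypothetical values, not the actual MD5MD5CRC32FileChecksum or TFileDumper code): == compares object identity, while equals() compares the contents.

    // Minimal, self-contained illustration of the bug class described above.
    public class EqualsVsIdentity {
        public static void main(String[] args) {
            String a = "MD5";
            String b = new String("MD5");

            System.out.println(a == b);      // false: two distinct objects
            System.out.println(a.equals(b)); // true: same character content
        }
    }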



[jira] [Created] (HADOOP-8819) Should use && instead of & in a few places in FTPFileSystem,FTPInputStream,S3InputStream,ViewFileSystem,ViewFs

2012-09-16 Thread Brandon Li (JIRA)
Brandon Li created HADOOP-8819:
--

 Summary: Should use && instead of & in a few places in 
FTPFileSystem,FTPInputStream,S3InputStream,ViewFileSystem,ViewFs
 Key: HADOOP-8819
 URL: https://issues.apache.org/jira/browse/HADOOP-8819
 Project: Hadoop Common
  Issue Type: Bug
  Components: fs
Reporter: Brandon Li
Assignee: Brandon Li


Should use && instead of & in a few places in 
FTPFileSystem,FTPInputStream,S3InputStream,ViewFileSystem,ViewFs.
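Assuming the operators dropped by the archive above are && and & (a common HTML-escaping casualty), here is a minimal illustration of why the distinction matters; the code is hypothetical, not the actual FTPFileSystem or ViewFs sources:

    // '&' evaluates both operands, '&&' short-circuits.
    public class ShortCircuitAnd {
        public static void main(String[] args) {
            String path = null;

            // '&&' short-circuits: the length check never runs when path is null.
            System.out.println(path != null && path.length() > 0); // false

            try {
                // '&' evaluates both operands, so path.length() runs and throws.
                System.out.println(path != null & path.length() > 0);
            } catch (NullPointerException e) {
                System.out.println("NPE: '&' evaluated path.length() on null");
            }
        }
    }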



[jira] [Created] (HADOOP-8820) Backport HADOOP-8469 and HADOOP-8470: add NodeGroup layer in new NetworkTopology (also known as NetworkTopologyWithNodeGroup)

2012-09-16 Thread Junping Du (JIRA)
Junping Du created HADOOP-8820:
--

 Summary: Backport HADOOP-8469 and HADOOP-8470: add NodeGroup 
layer in new NetworkTopology (also known as NetworkTopologyWithNodeGroup)
 Key: HADOOP-8820
 URL: https://issues.apache.org/jira/browse/HADOOP-8820
 Project: Hadoop Common
  Issue Type: New Feature
  Components: net
Affects Versions: 1.0.0
Reporter: Junping Du






[jira] [Created] (HADOOP-8821) Findbugs warning Configuration.dumpDeprecatedKeys() concatenates strings using + in a loop

2012-09-16 Thread Suresh Srinivas (JIRA)
Suresh Srinivas created HADOOP-8821:
---

 Summary: Findbugs warning Configuration.dumpDeprecatedKeys() 
concatenates strings using + in a loop
 Key: HADOOP-8821
 URL: https://issues.apache.org/jira/browse/HADOOP-8821
 Project: Hadoop Common
  Issue Type: Bug
Affects Versions: 3.0.0
Reporter: Suresh Srinivas
Assignee: Suresh Srinivas
Priority: Trivial


Configuration.dumpDeprecatedKeys() concatenates strings using + in a loop. Instead, it should use a StringBuilder.
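For illustration, a minimal sketch of the flagged pattern and the suggested fix; the key names are made up, and this is not the actual dumpDeprecatedKeys() code:

    import java.util.Arrays;
    import java.util.List;

    public class DeprecatedKeyDump {
        public static void main(String[] args) {
            List<String> keys = Arrays.asList("key.a", "key.b", "key.c");

            // Findbugs-flagged pattern: '+' in a loop recopies the whole
            // string on every iteration.
            String flagged = "";
            for (String key : keys) {
                flagged = flagged + key + "\n";
            }

            // Suggested fix: append into a single StringBuilder buffer.
            StringBuilder sb = new StringBuilder();
            for (String key : keys) {
                sb.append(key).append('\n');
            }

            System.out.println(flagged.equals(sb.toString())); // true: same result
        }
    }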
