[jira] [Created] (HADOOP-10130) RawLocalFS::LocalFSFileInputStream.pread does not track FS::Statistics

2013-11-27 Thread Binglin Chang (JIRA)
Binglin Chang created HADOOP-10130:
--

 Summary: RawLocalFS::LocalFSFileInputStream.pread does not track 
FS::Statistics
 Key: HADOOP-10130
 URL: https://issues.apache.org/jira/browse/HADOOP-10130
 Project: Hadoop Common
  Issue Type: Bug
Reporter: Binglin Chang
Assignee: Binglin Chang
Priority: Minor


RawLocalFS::LocalFSFileInputStream.pread does not track FS::Statistics
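For context, a minimal self-contained sketch of what "tracking statistics" means here. This is not the actual Hadoop patch; the AtomicLong stands in for the bytes-read counter in FileSystem.Statistics, and the point is simply that a positioned read should report its byte count the same way the sequential read path does:

```java
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;
import java.util.concurrent.atomic.AtomicLong;

public class PreadStats {
    // Stand-in for the per-FileSystem bytes-read counter
    // (FileSystem.Statistics in Hadoop).
    static final AtomicLong bytesRead = new AtomicLong();

    // Positioned read that reports how many bytes it read -- the step
    // the JIRA says LocalFSFileInputStream.pread is missing.
    static int pread(FileChannel ch, long pos, byte[] buf) throws IOException {
        int n = ch.read(ByteBuffer.wrap(buf), pos);
        if (n > 0) {
            bytesRead.addAndGet(n);  // record the read in the statistics
        }
        return n;
    }

    public static void main(String[] args) throws IOException {
        Path p = Files.createTempFile("pread", ".dat");
        Files.write(p, "hello world".getBytes());
        try (FileChannel ch = FileChannel.open(p, StandardOpenOption.READ)) {
            byte[] buf = new byte[5];
            int n = pread(ch, 6, buf);  // read "world" at offset 6
            System.out.println(n + " " + new String(buf, 0, n) + " " + bytesRead.get());
        }
        Files.delete(p);
    }
}
```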



--
This message was sent by Atlassian JIRA
(v6.1#6144)


Build failed in Jenkins: Hadoop-Common-trunk #964

2013-11-27 Thread Apache Jenkins Server
See 

Changes:

[cmccabe] HDFS-5565. CacheAdmin help should match against non-dashed commands 
(wang via cmccabe)

[jing9] HDFS-5561. FSNameSystem#getNameJournalStatus() in JMX should return 
plain text instead of HTML. Contributed by Haohui Mai.

[szetszwo] HDFS-5286. Flatten INodeDirectory hierarchy: Replace 
INodeDirectoryWithQuota with DirectoryWithQuotaFeature.

[brandonli] HDFS-5548. Use ConcurrentHashMap in portmap. Contributed by Haohui 
Mai

--
[...truncated 59851 lines...]
Adding reference: maven.local.repository
[DEBUG] Initialize Maven Ant Tasks
parsing buildfile 
jar:file:/home/jenkins/.m2/repository/org/apache/maven/plugins/maven-antrun-plugin/1.7/maven-antrun-plugin-1.7.jar!/org/apache/maven/ant/tasks/antlib.xml
 with URI = 
jar:file:/home/jenkins/.m2/repository/org/apache/maven/plugins/maven-antrun-plugin/1.7/maven-antrun-plugin-1.7.jar!/org/apache/maven/ant/tasks/antlib.xml
 from a zip file
parsing buildfile 
jar:file:/home/jenkins/.m2/repository/org/apache/ant/ant/1.8.2/ant-1.8.2.jar!/org/apache/tools/ant/antlib.xml
 with URI = 
jar:file:/home/jenkins/.m2/repository/org/apache/ant/ant/1.8.2/ant-1.8.2.jar!/org/apache/tools/ant/antlib.xml
 from a zip file
Class org.apache.maven.ant.tasks.AttachArtifactTask loaded from parent loader 
(parentFirst)
 +Datatype attachartifact org.apache.maven.ant.tasks.AttachArtifactTask
Class org.apache.maven.ant.tasks.DependencyFilesetsTask loaded from parent 
loader (parentFirst)
 +Datatype dependencyfilesets org.apache.maven.ant.tasks.DependencyFilesetsTask
Setting project property: test.build.dir -> 

Setting project property: test.exclude.pattern -> _
Setting project property: hadoop.assemblies.version -> 3.0.0-SNAPSHOT
Setting project property: test.exclude -> _
Setting project property: distMgmtSnapshotsId -> apache.snapshots.https
Setting project property: project.build.sourceEncoding -> UTF-8
Setting project property: java.security.egd -> file:///dev/urandom
Setting project property: distMgmtSnapshotsUrl -> 
https://repository.apache.org/content/repositories/snapshots
Setting project property: distMgmtStagingUrl -> 
https://repository.apache.org/service/local/staging/deploy/maven2
Setting project property: avro.version -> 1.7.4
Setting project property: test.build.data -> 

Setting project property: commons-daemon.version -> 1.0.13
Setting project property: hadoop.common.build.dir -> 

Setting project property: testsThreadCount -> 4
Setting project property: maven.test.redirectTestOutputToFile -> true
Setting project property: jdiff.version -> 1.0.9
Setting project property: distMgmtStagingName -> Apache Release Distribution 
Repository
Setting project property: project.reporting.outputEncoding -> UTF-8
Setting project property: build.platform -> Linux-i386-32
Setting project property: protobuf.version -> 2.5.0
Setting project property: failIfNoTests -> false
Setting project property: protoc.path -> ${env.HADOOP_PROTOC_PATH}
Setting project property: jersey.version -> 1.9
Setting project property: distMgmtStagingId -> apache.staging.https
Setting project property: distMgmtSnapshotsName -> Apache Development Snapshot 
Repository
Setting project property: ant.file -> 

[DEBUG] Setting properties with prefix: 
Setting project property: project.groupId -> org.apache.hadoop
Setting project property: project.artifactId -> hadoop-common-project
Setting project property: project.name -> Apache Hadoop Common Project
Setting project property: project.description -> Apache Hadoop Common Project
Setting project property: project.version -> 3.0.0-SNAPSHOT
Setting project property: project.packaging -> pom
Setting project property: project.build.directory -> 

Setting project property: project.build.outputDirectory -> 

Setting project property: project.build.testOutputDirectory -> 

Setting project property: project.build.sourceDirectory -> 

Setting project property: project.build.testSourceDirectory -> 

Setting project property: localRepository -> id: local
  url: file:

[jira] [Resolved] (HADOOP-9765) Precommit Admin job chokes on issues without an attachment

2013-11-27 Thread Todd Lipcon (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-9765?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Todd Lipcon resolved HADOOP-9765.
-

  Resolution: Fixed
Hadoop Flags: Reviewed

> Precommit Admin job chokes on issues without an attachment
> --
>
> Key: HADOOP-9765
> URL: https://issues.apache.org/jira/browse/HADOOP-9765
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: build
>Reporter: Brock Noland
>Assignee: Brock Noland
> Attachments: HADOOP-9765.patch, HADOOP-9765.patch
>
>
> Check out this file:  
> https://builds.apache.org/job/PreCommit-Admin/lastSuccessfulBuild/artifact/patch_tested.txt
> It has "corrupt" data:
> {noformat}
> HIVE-4877HDFS-5010,12593214
> HIVE-4877HBASE-8693,12593082
> HIVE-4877YARN-919,12593107
> YARN-905,12593225
> HIVE-4877HBASE-8752,12588069
> {noformat}
> which resulted in the Hive precommit job being called with the ISSUE_NUM of 
> 5010, 8693, 919, and 8752.
> Looking at the script and some output I pulled from the last run, it looks
> like it gets hosed up when there is a JIRA which is Patch Available but
> doesn't have an attachment (as ZK-1402 is currently sitting). For example:
> This is the bad data the script is encountering:
> {noformat}
> $ grep -A 2 'ZOOKEEPER-1402' patch_available2.elements 
> ZOOKEEPER-1402
> HBASE-8348
>  id="12592318"
> {noformat}
> This is where it screws up:
> {noformat}
> $ awk '{ printf "%s", $0 }' patch_available2.elements | sed -e 
> "s/\W*id=\"/,/g" | perl -pe "s/\"/\n/g"  | grep ZOOKEEPER-1402
> ZOOKEEPER-1402HBASE-8348 ,12592318
> {noformat}
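The failure mode in the quoted awk/sed/perl pipeline can be reproduced with an illustrative re-implementation (this is a sketch of the pipeline's behavior, not the Precommit Admin script itself):

```java
public class PrecommitParse {
    // Java rendering of the admin script's pipeline:
    //   awk '{ printf "%s", $0 }' | sed -e 's/\W*id="/,/g' | perl -pe 's/"/\n/g'
    static String pipeline(String elements) {
        String joined = elements.replace("\n", "");          // awk: strip all newlines
        String commas = joined.replaceAll("\\W*id=\"", ","); // sed: ...id="  ->  ,
        return commas.replace("\"", "\n");                   // perl: " -> newline
    }

    public static void main(String[] args) {
        // A Patch Available issue with no attachment (here ZOOKEEPER-1402)
        // contributes a line with no following id= element, so once the
        // newlines are stripped it fuses with the next issue's key.
        String elements = "ZOOKEEPER-1402\nHBASE-8348\n id=\"12592318\"\n";
        System.out.print(pipeline(elements));
        // -> ZOOKEEPER-1402HBASE-8348,12592318  (two issue keys fused)
    }
}
```

The fix in the actual job amounts to not letting an attachment-less Patch Available issue reach this join step.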





[jira] [Created] (HADOOP-10133) winutils detection on windows-cygwin fails

2013-11-27 Thread Franjo Markovic (JIRA)
Franjo Markovic created HADOOP-10133:


 Summary: winutils detection on windows-cygwin fails
 Key: HADOOP-10133
 URL: https://issues.apache.org/jira/browse/HADOOP-10133
 Project: Hadoop Common
  Issue Type: Bug
  Components: fs
Affects Versions: 2.2.0
 Environment: windows 7, cygwin
Reporter: Franjo Markovic


java.io.IOException: Could not locate executable null\bin\winutils.exe in the 
Hadoop binaries.
at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:278)
 at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:300)
at org.apache.hadoop.util.Shell.&lt;clinit&gt;(Shell.java:293)
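The "null\bin\winutils.exe" in the message hints at the cause. A hypothetical illustration (method name and structure are mine, not Shell's actual code): the winutils path is built relative to a Hadoop home directory, and when that lookup fails (HADOOP_HOME unset, as can happen under cygwin), Java string concatenation renders the null home as the literal text "null":

```java
public class WinutilsPath {
    // Illustrative only: builds the winutils path from a home dir that
    // may be null when HADOOP_HOME is not set.
    static String winutilsPath(String hadoopHome) {
        return hadoopHome + "\\bin\\winutils.exe";  // null prints as "null"
    }

    public static void main(String[] args) {
        System.out.println("Could not locate executable "
            + winutilsPath(null) + " in the Hadoop binaries.");
        // -> Could not locate executable null\bin\winutils.exe in the Hadoop binaries.
    }
}
```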







[jira] [Created] (HADOOP-10134) [JDK8] Fix Javadoc errors caused by incorrect or illegal tags in doc comments

2013-11-27 Thread Andrew Purtell (JIRA)
Andrew Purtell created HADOOP-10134:
---

 Summary: [JDK8] Fix Javadoc errors caused by incorrect or illegal 
tags in doc comments 
 Key: HADOOP-10134
 URL: https://issues.apache.org/jira/browse/HADOOP-10134
 Project: Hadoop Common
  Issue Type: Bug
Affects Versions: 3.0.0, 2.3.0
Reporter: Andrew Purtell
Priority: Minor
 Attachments: 10134-branch-2.patch, 10134-trunk.patch

Javadoc is stricter by default in JDK8 and will error out on malformed or
illegal tags found in doc comments. Although tagged as JDK8, all of the
required changes are generic Javadoc cleanups.
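One common case, as an illustration (this class is hypothetical, not from the patch): a raw '<' in a doc comment compiles fine under javac but makes JDK8's javadoc fail with "error: malformed HTML" unless it is escaped as &lt;:

```java
public class DoclintExample {
    /**
     * Returns whether a &lt; b.  (Writing a raw '<' here instead of
     * &amp;lt; is exactly the kind of comment JDK8's doclint rejects.)
     *
     * @param a first value
     * @param b second value
     * @return true if a is less than b
     */
    public static boolean lessThan(int a, int b) {
        return a < b;  // '<' in code is fine; only doc comments are checked
    }

    public static void main(String[] args) {
        System.out.println(lessThan(1, 2));
    }
}
```

Running `javadoc -Xdoclint:all` over the offending comments is a quick way to find them all before they break a JDK8 build.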





[jira] [Created] (HADOOP-10135) writes to swift fs over partition size leave temp files and empty output file

2013-11-27 Thread David Dobbins (JIRA)
David Dobbins created HADOOP-10135:
--

 Summary: writes to swift fs over partition size leave temp files 
and empty output file
 Key: HADOOP-10135
 URL: https://issues.apache.org/jira/browse/HADOOP-10135
 Project: Hadoop Common
  Issue Type: Bug
  Components: fs
Affects Versions: 3.0.0
Reporter: David Dobbins


The OpenStack/swift filesystem produces incorrect output when the written 
objects exceed the configured partition size. After job completion, the 
expected files in the swift container have length == 0 and a collection of 
temporary files remain with names that appear to be URLs.

This can be replicated with teragen against the minicluster using the following 
command line:
bin/hadoop jar 
./share/hadoop/mapreduce/hadoop-mapreduce-examples-3.0.0-SNAPSHOT.jar teragen 
10 swift://mycontainer.myservice/teradata

Where core-site.xml contains:
  <property>
    <name>fs.swift.impl</name>
    <value>org.apache.hadoop.fs.swift.snative.SwiftNativeFileSystem</value>
  </property>
  <property>
    <name>fs.swift.service.myservice.auth.url</name>
    <value>https://auth.api.rackspacecloud.com/v2.0/tokens</value>
  </property>
  <property>
    <name>fs.swift.service.myservice.username</name>
    <value>[[your-cloud-username]]</value>
  </property>
  <property>
    <name>fs.swift.service.myservice.region</name>
    <value>DFW</value>
  </property>
  <property>
    <name>fs.swift.service.myservice.apikey</name>
    <value>[[your-api-key]]</value>
  </property>
  <property>
    <name>fs.swift.service.myservice.public</name>
    <value>true</value>
  </property>

Container "mycontainer" should have a collection of objects with names starting 
with "teradata/part-m-0".  Instead, that file is empty and there is a 
collection of objects with names like 
"swift://mycontainer.myservice/teradata/_temporary/0/_temporary/attempt_local415043862_0001_m_00_0/part-m-0/10"


