Re: Write Access to Hadoop wiki
done - try now

On 12 January 2014 16:15, Lewis John Mcgibbney <lewis.mcgibb...@gmail.com> wrote:

> Hi Folks,
> I would please like write access.
> id: LewisJohnMcgibbney
> Lewis John McGibbney
> Thank you very much in advance
> Lewis
> --
> *Lewis*

--
CONFIDENTIALITY NOTICE
NOTICE: This message is intended for the use of the individual or entity to which it is addressed and may contain information that is confidential, privileged and exempt from disclosure under applicable law. If the reader of this message is not the intended recipient, you are hereby notified that any printing, copying, dissemination, distribution, disclosure or forwarding of this communication is strictly prohibited. If you have received this communication in error, please contact the sender immediately and delete it from your system. Thank You.
Build failed in Jenkins: Hadoop-Common-trunk #1011
See https://builds.apache.org/job/Hadoop-Common-trunk/1011/changes

Changes:

[tucu] HADOOP-10223. MiniKdc#main() should close the FileReader it creates. (Ted Yu via tucu)

------------------------------------------
[...truncated 60419 lines...]
Adding reference: maven.local.repository
[DEBUG] Initialize Maven Ant Tasks
parsing buildfile jar:file:/home/jenkins/.m2/repository/org/apache/maven/plugins/maven-antrun-plugin/1.7/maven-antrun-plugin-1.7.jar!/org/apache/maven/ant/tasks/antlib.xml with URI = jar:file:/home/jenkins/.m2/repository/org/apache/maven/plugins/maven-antrun-plugin/1.7/maven-antrun-plugin-1.7.jar!/org/apache/maven/ant/tasks/antlib.xml from a zip file
parsing buildfile jar:file:/home/jenkins/.m2/repository/org/apache/ant/ant/1.8.2/ant-1.8.2.jar!/org/apache/tools/ant/antlib.xml with URI = jar:file:/home/jenkins/.m2/repository/org/apache/ant/ant/1.8.2/ant-1.8.2.jar!/org/apache/tools/ant/antlib.xml from a zip file
Class org.apache.maven.ant.tasks.AttachArtifactTask loaded from parent loader (parentFirst)
 +Datatype attachartifact org.apache.maven.ant.tasks.AttachArtifactTask
Class org.apache.maven.ant.tasks.DependencyFilesetsTask loaded from parent loader (parentFirst)
 +Datatype dependencyfilesets org.apache.maven.ant.tasks.DependencyFilesetsTask
Setting project property: test.build.dir - https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/target/test-dir
Setting project property: test.exclude.pattern - _
Setting project property: hadoop.assemblies.version - 3.0.0-SNAPSHOT
Setting project property: test.exclude - _
Setting project property: distMgmtSnapshotsId - apache.snapshots.https
Setting project property: project.build.sourceEncoding - UTF-8
Setting project property: java.security.egd - file:///dev/urandom
Setting project property: distMgmtSnapshotsUrl - https://repository.apache.org/content/repositories/snapshots
Setting project property: distMgmtStagingUrl - https://repository.apache.org/service/local/staging/deploy/maven2
Setting project property: avro.version - 1.7.4
Setting project property: test.build.data - https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/target/test-dir
Setting project property: commons-daemon.version - 1.0.13
Setting project property: hadoop.common.build.dir - https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/../../hadoop-common-project/hadoop-common/target
Setting project property: testsThreadCount - 4
Setting project property: maven.test.redirectTestOutputToFile - true
Setting project property: jdiff.version - 1.0.9
Setting project property: distMgmtStagingName - Apache Release Distribution Repository
Setting project property: project.reporting.outputEncoding - UTF-8
Setting project property: build.platform - Linux-i386-32
Setting project property: protobuf.version - 2.5.0
Setting project property: failIfNoTests - false
Setting project property: protoc.path - ${env.HADOOP_PROTOC_PATH}
Setting project property: jersey.version - 1.9
Setting project property: distMgmtStagingId - apache.staging.https
Setting project property: distMgmtSnapshotsName - Apache Development Snapshot Repository
Setting project property: ant.file - https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/pom.xml
[DEBUG] Setting properties with prefix:
Setting project property: project.groupId - org.apache.hadoop
Setting project property: project.artifactId - hadoop-common-project
Setting project property: project.name - Apache Hadoop Common Project
Setting project property: project.description - Apache Hadoop Common Project
Setting project property: project.version - 3.0.0-SNAPSHOT
Setting project property: project.packaging - pom
Setting project property: project.build.directory - https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/target
Setting project property: project.build.outputDirectory - https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/target/classes
Setting project property: project.build.testOutputDirectory - https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/target/test-classes
Setting project property: project.build.sourceDirectory - https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/src/main/java
Setting project property: project.build.testSourceDirectory - https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/src/test/java
Setting project property: localRepository - id: local url: file:///home/jenkins/.m2/repository/ layout: none
Setting project property: settings.localRepository - /home/jenkins/.m2/repository
Setting project property: maven.project.dependencies.versions -
[INFO] Executing tasks
Build sequence for target(s) `main' is [main]
Complete build sequence is [main, ]

main:
    [mkdir] Created dir:
Help! My Hadoop doesn't work: the video
As a requested follow-up to the "how to commit" video, here is one on how to file a bug, or, more importantly, on when to paste an exception into a web browser and when to learn basic networking tools before trying to bring up a Hadoop cluster.

http://youtu.be/NaJlRk5aTRQ
http://www.slideshare.net/steve_l/2014-0110-reporting-a-bug

TL;DR: Be able to start a Minecraft server before trying to bring up a Hadoop cluster.
[jira] [Created] (HADOOP-10228) FsPermission#fromShort() should cache FsAction.values()
Haohui Mai created HADOOP-10228:
-----------------------------------

             Summary: FsPermission#fromShort() should cache FsAction.values()
                 Key: HADOOP-10228
                 URL: https://issues.apache.org/jira/browse/HADOOP-10228
             Project: Hadoop Common
          Issue Type: Bug
            Reporter: Haohui Mai
            Assignee: Haohui Mai
            Priority: Minor

FsPermission#fromShort() calls FsAction.values() every time, which incurs an unnecessary performance penalty.

--
This message was sent by Atlassian JIRA
(v6.1.5#6160)
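The issue here is that `Enum.values()` clones its backing array on every call, so a hot path like `fromShort()` pays an allocation per invocation. A minimal sketch of the fix described in the report (the names mirror the Hadoop classes, but this is a simplified stand-in, not the actual patch):

```java
// Hypothetical sketch: cache FsAction.values() in a static field so the hot
// path does not allocate a fresh array on every call.
public class FsPermissionSketch {

    // Simplified stand-in for org.apache.hadoop.fs.permission.FsAction;
    // ordinal order matches the POSIX rwx bit patterns 0..7.
    public enum FsAction {
        NONE, EXECUTE, WRITE, WRITE_EXECUTE, READ, READ_EXECUTE, READ_WRITE, ALL
    }

    // Computed once; FsAction.values() would otherwise clone the array per call.
    private static final FsAction[] FSACTION_VALUES = FsAction.values();

    // Decode a short permission word into user/group/other actions by
    // indexing each octal digit into the cached array.
    public static FsAction[] fromShort(short n) {
        return new FsAction[] {
            FSACTION_VALUES[(n >>> 6) & 7],  // user
            FSACTION_VALUES[(n >>> 3) & 7],  // group
            FSACTION_VALUES[n & 7]           // other
        };
    }

    public static void main(String[] args) {
        FsAction[] perms = fromShort((short) 0644);  // rw-r--r--
        System.out.println(perms[0] + " " + perms[1] + " " + perms[2]);
        // prints "READ_WRITE READ READ"
    }
}
```

The real `FsPermission#fromShort()` builds an `FsPermission` object rather than returning an array; the sketch only illustrates the caching technique the ticket asks for.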
[jira] [Created] (HADOOP-10229) DaemonFactory should not extend Daemon
Hiroshi Ikeda created HADOOP-10229:
-----------------------------------

             Summary: DaemonFactory should not extend Daemon
                 Key: HADOOP-10229
                 URL: https://issues.apache.org/jira/browse/HADOOP-10229
             Project: Hadoop Common
          Issue Type: Bug
    Affects Versions: 2.2.0
            Reporter: Hiroshi Ikeda
            Priority: Minor

The static nested class org.apache.hadoop.util.Daemon.DaemonFactory unnecessarily extends its enclosing class Daemon, even though a thread factory is not required to be a thread.

--
This message was sent by Atlassian JIRA
(v6.1.5#6160)
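The point of the report is that a factory only needs to create threads, not be one. A minimal sketch of the refactoring the ticket suggests (hypothetical, not the actual Hadoop code): implement `java.util.concurrent.ThreadFactory` directly instead of extending a `Thread` subclass.

```java
import java.util.concurrent.ThreadFactory;

// Hypothetical sketch: a factory for daemon threads implements ThreadFactory
// directly, rather than extending a Thread subclass it never runs as a thread.
public class DaemonFactorySketch implements ThreadFactory {
    @Override
    public Thread newThread(Runnable r) {
        Thread t = new Thread(r);
        t.setDaemon(true);  // the only behavior the factory actually needs
        return t;
    }

    public static void main(String[] args) {
        Thread t = new DaemonFactorySketch().newThread(() -> {});
        System.out.println(t.isDaemon());  // prints "true"
    }
}
```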
[jira] [Created] (HADOOP-10230) GSetByHashMap breaks contract of GSet
Hiroshi Ikeda created HADOOP-10230:
-----------------------------------

             Summary: GSetByHashMap breaks contract of GSet
                 Key: HADOOP-10230
                 URL: https://issues.apache.org/jira/browse/HADOOP-10230
             Project: Hadoop Common
          Issue Type: Bug
    Affects Versions: 2.2.0
            Reporter: Hiroshi Ikeda
            Priority: Trivial

The contract of GSet guarantees that many of its methods throw NullPointerException when given a null argument, but GSetByHashMap does not. I think adding non-null precondition checks to GSet is all that is required.

--
This message was sent by Atlassian JIRA
(v6.1.5#6160)
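A sketch of the precondition checks the report asks for (hypothetical and heavily simplified; the real GSet interface has more methods): validate arguments up front so the NullPointerException promised by the contract is actually thrown, instead of letting a HashMap silently accept null keys.

```java
import java.util.HashMap;
import java.util.Objects;

// Hypothetical sketch of a HashMap-backed GSet that enforces the non-null
// contract explicitly, rather than relying on the backing map's behavior.
public class GSetByHashMapSketch<K, E extends K> {
    private final HashMap<K, E> m = new HashMap<>();

    public boolean contains(K key) {
        Objects.requireNonNull(key, "key == null");  // enforce the GSet contract
        return m.containsKey(key);
    }

    public E put(E element) {
        Objects.requireNonNull(element, "element == null");
        return m.put(element, element);  // a GSet element is its own key
    }

    public static void main(String[] args) {
        GSetByHashMapSketch<String, String> s = new GSetByHashMapSketch<>();
        s.put("a");
        System.out.println(s.contains("a"));  // prints "true"
        s.contains(null);                     // throws NullPointerException
    }
}
```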
[jira] [Created] (HADOOP-10231) Add some components in Native Libraries document
Akira AJISAKA created HADOOP-10231:
-----------------------------------

             Summary: Add some components in Native Libraries document
                 Key: HADOOP-10231
                 URL: https://issues.apache.org/jira/browse/HADOOP-10231
             Project: Hadoop Common
          Issue Type: Improvement
          Components: documentation
    Affects Versions: 2.2.0
            Reporter: Akira AJISAKA
            Priority: Minor

The only components documented in the Native Libraries guide are zlib and gzip. The native libraries now include other components, such as additional compression codecs (lz4, snappy), libhdfs, and the fuse module. These components should be documented.

--
This message was sent by Atlassian JIRA
(v6.1.5#6160)