svn commit: r1078968 - in /hadoop/mapreduce/branches/branch-0.22: ./ src/java/org/apache/hadoop/mapred/tools/ src/java/org/apache/hadoop/mapreduce/tools/ src/tools/org/apache/hadoop/fs/ src/tools/org/
Author: tomwhite
Date: Mon Mar  7 21:48:05 2011
New Revision: 1078968

URL: http://svn.apache.org/viewvc?rev=1078968&view=rev
Log:
Merge -r 1078963:1078964 from trunk to branch-0.22. Fixes: MAPREDUCE-2336

Added:
    hadoop/mapreduce/branches/branch-0.22/src/java/org/apache/hadoop/mapred/tools/package-info.java
      - copied unchanged from r1078964, hadoop/mapreduce/trunk/src/java/org/apache/hadoop/mapred/tools/package-info.java
    hadoop/mapreduce/branches/branch-0.22/src/java/org/apache/hadoop/mapreduce/tools/package-info.java
      - copied unchanged from r1078964, hadoop/mapreduce/trunk/src/java/org/apache/hadoop/mapreduce/tools/package-info.java
    hadoop/mapreduce/branches/branch-0.22/src/tools/org/apache/hadoop/fs/package-info.java
      - copied unchanged from r1078964, hadoop/mapreduce/trunk/src/tools/org/apache/hadoop/fs/package-info.java
    hadoop/mapreduce/branches/branch-0.22/src/tools/org/apache/hadoop/tools/package-info.java
      - copied unchanged from r1078964, hadoop/mapreduce/trunk/src/tools/org/apache/hadoop/tools/package-info.java
Modified:
    hadoop/mapreduce/branches/branch-0.22/CHANGES.txt
    hadoop/mapreduce/branches/branch-0.22/build.xml

Modified: hadoop/mapreduce/branches/branch-0.22/CHANGES.txt
URL: http://svn.apache.org/viewvc/hadoop/mapreduce/branches/branch-0.22/CHANGES.txt?rev=1078968&r1=1078967&r2=1078968&view=diff
==============================================================================
--- hadoop/mapreduce/branches/branch-0.22/CHANGES.txt (original)
+++ hadoop/mapreduce/branches/branch-0.22/CHANGES.txt Mon Mar  7 21:48:05 2011
@@ -507,6 +507,9 @@ Release 0.22.0 - Unreleased

     MAPREDUCE-2284. TestLocalRunner.testMultiMaps times out (todd)

+    MAPREDUCE-2336. Tool-related packages should be in the Tool javadoc group.
+    (tomwhite)
+
 Release 0.21.1 - Unreleased

   NEW FEATURES

Modified: hadoop/mapreduce/branches/branch-0.22/build.xml
URL: http://svn.apache.org/viewvc/hadoop/mapreduce/branches/branch-0.22/build.xml?rev=1078968&r1=1078967&r2=1078968&view=diff
==============================================================================
--- hadoop/mapreduce/branches/branch-0.22/build.xml (original)
+++ hadoop/mapreduce/branches/branch-0.22/build.xml Mon Mar  7 21:48:05 2011
@@ -1046,7 +1046,7 @@
-
+
svn commit: r1464130 - /hadoop/common/trunk/hadoop-common-project/hadoop-common/src/site/apt/ClusterSetup.apt.vm
Author: tomwhite
Date: Wed Apr  3 18:01:14 2013
New Revision: 1464130

URL: http://svn.apache.org/r1464130
Log:
YARN-381. Improve fair scheduler docs. Contributed by Sandy Ryza.

Modified:
    hadoop/common/trunk/hadoop-common-project/hadoop-common/src/site/apt/ClusterSetup.apt.vm

Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/src/site/apt/ClusterSetup.apt.vm
URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/site/apt/ClusterSetup.apt.vm?rev=1464130&r1=1464129&r2=1464130&view=diff
==============================================================================
--- hadoop/common/trunk/hadoop-common-project/hadoop-common/src/site/apt/ClusterSetup.apt.vm (original)
+++ hadoop/common/trunk/hadoop-common-project/hadoop-common/src/site/apt/ClusterSetup.apt.vm Wed Apr  3 18:01:14 2013
@@ -249,7 +249,7 @@ Hadoop MapReduce Next Generation - Clust
 *-------------------------+-------------------------+------------------------+
 | yarn.resourcemanager.scheduler.class | | |
 | | ResourceManager Scheduler class. | |
-| | | CapacityScheduler (recommended) or FifoScheduler |
+| | | CapacityScheduler (recommended), FairScheduler (also recommended), or FifoScheduler |
 *-------------------------+-------------------------+------------------------+
 | yarn.scheduler.minimum-allocation-mb | | |
 | | Minimum limit of memory to allocate to each container request at the Resource Manager. | |
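As a sketch of how the property documented above is actually set, a `yarn-site.xml` fragment might look like the following. This is illustrative, not part of the commit; the class name is the stock FairScheduler that ships with Hadoop 2.x.

```xml
<!-- yarn-site.xml fragment (illustrative): select the FairScheduler
     instead of the default CapacityScheduler. -->
<property>
  <name>yarn.resourcemanager.scheduler.class</name>
  <value>org.apache.hadoop.yarn.server.resourcemanager.scheduler.fair.FairScheduler</value>
</property>
```

The ResourceManager instantiates whichever scheduler class this property names, so switching schedulers requires only a config change and an RM restart.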
svn commit: r1464131 - /hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/site/apt/ClusterSetup.apt.vm
Author: tomwhite
Date: Wed Apr  3 18:03:21 2013
New Revision: 1464131

URL: http://svn.apache.org/r1464131
Log:
Merge -r 1464129:1464130 from trunk to branch-2. Fixes: YARN-381. Improve fair scheduler docs. Contributed by Sandy Ryza.

Modified:
    hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/site/apt/ClusterSetup.apt.vm

Modified: hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/site/apt/ClusterSetup.apt.vm
URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/site/apt/ClusterSetup.apt.vm?rev=1464131&r1=1464130&r2=1464131&view=diff
==============================================================================
--- hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/site/apt/ClusterSetup.apt.vm (original)
+++ hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/site/apt/ClusterSetup.apt.vm Wed Apr  3 18:03:21 2013
@@ -249,7 +249,7 @@ Hadoop MapReduce Next Generation - Clust
 *-------------------------+-------------------------+------------------------+
 | yarn.resourcemanager.scheduler.class | | |
 | | ResourceManager Scheduler class. | |
-| | | CapacityScheduler (recommended) or FifoScheduler |
+| | | CapacityScheduler (recommended), FairScheduler (also recommended), or FifoScheduler |
 *-------------------------+-------------------------+------------------------+
 | yarn.scheduler.minimum-allocation-mb | | |
 | | Minimum limit of memory to allocate to each container request at the Resource Manager. | |
svn commit: r1461537 - in /hadoop/common/branches/branch-2/hadoop-tools/hadoop-streaming/src/test/java/org/apache/hadoop/streaming: TestStreamReduceNone.java TestStreamXmlRecordReader.java
Author: tomwhite
Date: Wed Mar 27 11:42:32 2013
New Revision: 1461537

URL: http://svn.apache.org/r1461537
Log:
MAPREDUCE-5006. Fix failing streaming tests due to MAPREDUCE-4994. Contributed by Sandy Ryza.

Modified:
    hadoop/common/branches/branch-2/hadoop-tools/hadoop-streaming/src/test/java/org/apache/hadoop/streaming/TestStreamReduceNone.java
    hadoop/common/branches/branch-2/hadoop-tools/hadoop-streaming/src/test/java/org/apache/hadoop/streaming/TestStreamXmlRecordReader.java

Modified: hadoop/common/branches/branch-2/hadoop-tools/hadoop-streaming/src/test/java/org/apache/hadoop/streaming/TestStreamReduceNone.java
URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-tools/hadoop-streaming/src/test/java/org/apache/hadoop/streaming/TestStreamReduceNone.java?rev=1461537&r1=1461536&r2=1461537&view=diff
==============================================================================
--- hadoop/common/branches/branch-2/hadoop-tools/hadoop-streaming/src/test/java/org/apache/hadoop/streaming/TestStreamReduceNone.java (original)
+++ hadoop/common/branches/branch-2/hadoop-tools/hadoop-streaming/src/test/java/org/apache/hadoop/streaming/TestStreamReduceNone.java Wed Mar 27 11:42:32 2013
@@ -68,6 +68,7 @@ public class TestStreamReduceNone
       "-reducer", "org.apache.hadoop.mapred.lib.IdentityReducer",
       "-numReduceTasks", "0",
       "-jobconf", "mapreduce.task.files.preserve.failedtasks=true",
+      "-jobconf", "mapreduce.job.maps=1",
       "-jobconf", "stream.tmpdir="+System.getProperty("test.build.data","/tmp")
     };
   }

Modified: hadoop/common/branches/branch-2/hadoop-tools/hadoop-streaming/src/test/java/org/apache/hadoop/streaming/TestStreamXmlRecordReader.java
URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-tools/hadoop-streaming/src/test/java/org/apache/hadoop/streaming/TestStreamXmlRecordReader.java?rev=1461537&r1=1461536&r2=1461537&view=diff
==============================================================================
--- hadoop/common/branches/branch-2/hadoop-tools/hadoop-streaming/src/test/java/org/apache/hadoop/streaming/TestStreamXmlRecordReader.java (original)
+++ hadoop/common/branches/branch-2/hadoop-tools/hadoop-streaming/src/test/java/org/apache/hadoop/streaming/TestStreamXmlRecordReader.java Wed Mar 27 11:42:32 2013
@@ -54,6 +54,8 @@ public class TestStreamXmlRecordReader e
   protected String[] genArgs() {
     args.add("-inputreader");
     args.add("StreamXmlRecordReader,begin=<xmltag>,end=</xmltag>");
+    args.add("-jobconf");
+    args.add("mapreduce.job.maps=1");
     return super.genArgs();
   }
 }
svn commit: r1450725 - in /hadoop/common/branches/branch-1: CHANGES.txt src/mapred/org/apache/hadoop/mapred/ReduceTask.java
Author: tomwhite
Date: Wed Feb 27 10:42:58 2013
New Revision: 1450725

URL: http://svn.apache.org/r1450725
Log:
MAPREDUCE-5008. Merger progress miscounts with respect to EOF_MARKER. Contributed by Sandy Ryza.

Modified:
    hadoop/common/branches/branch-1/CHANGES.txt
    hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/ReduceTask.java

Modified: hadoop/common/branches/branch-1/CHANGES.txt
URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-1/CHANGES.txt?rev=1450725&r1=1450724&r2=1450725&view=diff
==============================================================================
--- hadoop/common/branches/branch-1/CHANGES.txt (original)
+++ hadoop/common/branches/branch-1/CHANGES.txt Wed Feb 27 10:42:58 2013
@@ -502,6 +502,9 @@ Release 1.2.0 - unreleased

     HADOOP-8917. add LOCALE.US to toLowerCase in SecurityUtil.replacePattern.
     (Arpit Gupta via suresh)

+    MAPREDUCE-5008. Merger progress miscounts with respect to EOF_MARKER.
+    (Sandy Ryza via tomwhite)
+
 Release 1.1.2 - Unreleased

   INCOMPATIBLE CHANGES

Modified: hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/ReduceTask.java
URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/ReduceTask.java?rev=1450725&r1=1450724&r2=1450725&view=diff
==============================================================================
--- hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/ReduceTask.java (original)
+++ hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/ReduceTask.java Wed Feb 27 10:42:58 2013
@@ -2473,8 +2473,8 @@ class ReduceTask extends Task {
                            keyClass, valueClass, codec, null);
       try {
         Merger.writeFile(rIter, writer, reporter, job);
-        long decompressedBytesWritten = writer.decompressedBytesWritten;
         writer.close();
+        long decompressedBytesWritten = writer.decompressedBytesWritten;
         writer = null;
         FileStatus fileStatus = fs.getFileStatus(outputPath);
         CompressAwareFileStatus compressedFileStatus = new CompressAwareFileStatus(
@@ -2708,8 +2708,8 @@ class ReduceTask extends Task {
                                     spilledRecordsCounter, null);
           Merger.writeFile(iter, writer, reporter, conf);
-          decompressedBytesWritten = writer.decompressedBytesWritten;
           writer.close();
+          decompressedBytesWritten = writer.decompressedBytesWritten;
         } catch (Exception e) {
           localFileSys.delete(outputPath, true);
           throw new IOException (StringUtils.stringifyException(e));
@@ -2822,8 +2822,8 @@ class ReduceTask extends Task {
             combineCollector.setWriter(writer);
             combinerRunner.combine(rIter, combineCollector);
           }
-          decompressedBytesWritten = writer.decompressedBytesWritten;
           writer.close();
+          decompressedBytesWritten = writer.decompressedBytesWritten;
           LOG.info(reduceTask.getTaskID() + " Merge of the " + noInMemorySegments +
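The change above is the same one-line reordering in three places: the byte counter is now read after `writer.close()` rather than before. A minimal sketch of why the order matters, using a toy writer rather than the real `IFile.Writer` API (the marker size and class names here are illustrative assumptions): a writer that appends an end-of-stream marker during `close()` has not finished updating its counter until `close()` returns, so a snapshot taken earlier undercounts.

```java
// Toy writer that, like Hadoop's IFile.Writer, does its final
// bookkeeping (an EOF marker) inside close(). Not the real API.
class ToyWriter {
    static final int EOF_MARKER_BYTES = 2; // hypothetical marker size
    long decompressedBytesWritten = 0;

    void append(byte[] data) { decompressedBytesWritten += data.length; }

    // close() writes the trailing marker, so the counter grows here too.
    void close() { decompressedBytesWritten += EOF_MARKER_BYTES; }
}

public class MergeProgressDemo {
    // Buggy order: snapshot the counter, then close -> misses the marker.
    static long bytesBeforeClose() {
        ToyWriter w = new ToyWriter();
        w.append(new byte[8]);
        long snapshot = w.decompressedBytesWritten;
        w.close();
        return snapshot;
    }

    // Fixed order (as in r1450725): close first, then read the counter.
    static long bytesAfterClose() {
        ToyWriter w = new ToyWriter();
        w.append(new byte[8]);
        w.close();
        return w.decompressedBytesWritten;
    }

    public static void main(String[] args) {
        System.out.println(bytesBeforeClose()); // 8: undercounted
        System.out.println(bytesAfterClose());  // 10: includes marker
    }
}
```

Because merge progress is computed from these byte counts, the pre-close snapshot made progress appear to stall short of completion, which is the miscount MAPREDUCE-5008 describes.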
svn commit: r1450835 - in /hadoop/common/branches/branch-1: CHANGES.txt src/docs/src/documentation/content/xdocs/cluster_setup.xml
Author: tomwhite
Date: Wed Feb 27 16:37:54 2013
New Revision: 1450835

URL: http://svn.apache.org/r1450835
Log:
MAPREDUCE-5035. Update MR1 memory configuration docs.

Modified:
    hadoop/common/branches/branch-1/CHANGES.txt
    hadoop/common/branches/branch-1/src/docs/src/documentation/content/xdocs/cluster_setup.xml

Modified: hadoop/common/branches/branch-1/CHANGES.txt
URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-1/CHANGES.txt?rev=1450835&r1=1450834&r2=1450835&view=diff
==============================================================================
--- hadoop/common/branches/branch-1/CHANGES.txt (original)
+++ hadoop/common/branches/branch-1/CHANGES.txt Wed Feb 27 16:37:54 2013
@@ -505,6 +505,8 @@ Release 1.2.0 - unreleased

     MAPREDUCE-5008. Merger progress miscounts with respect to EOF_MARKER.
     (Sandy Ryza via tomwhite)

+    MAPREDUCE-5035. Update MR1 memory configuration docs. (tomwhite)
+
 Release 1.1.2 - Unreleased

   INCOMPATIBLE CHANGES

Modified: hadoop/common/branches/branch-1/src/docs/src/documentation/content/xdocs/cluster_setup.xml
URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-1/src/docs/src/documentation/content/xdocs/cluster_setup.xml?rev=1450835&r1=1450834&r2=1450835&view=diff
==============================================================================
--- hadoop/common/branches/branch-1/src/docs/src/documentation/content/xdocs/cluster_setup.xml (original)
+++ hadoop/common/branches/branch-1/src/docs/src/documentation/content/xdocs/cluster_setup.xml Wed Feb 27 16:37:54 2013
@@ -721,7 +721,7 @@
           A TT ensures that a task is killed if it, and its
           descendants, use VMEM over the task's per-task limit. It also
           ensures that one or more tasks are killed if the sum total of VMEM
-          usage by all tasks, and their descendents, cross the node-limit.</p>
+          usage by all tasks, and their descendants, cross the node-limit.</p>

           <p>Users can, optionally, specify the VMEM task-limit per job. If no
           such limit is provided, a default limit is used. A node-limit can be
@@ -733,20 +733,25 @@
           <table>
           <tr><th>Name</th><th>Type</th><th>Description</th></tr>
-          <tr><td>mapred.tasktracker.vmem.reserved</td><td>long</td>
-          <td>A number, in bytes, that represents an offset. The total VMEM on
-          the machine, minus this offset, is the VMEM node-limit for all
-          tasks, and their descendants, spawned by the TT.
+          <tr><td><code>mapred.cluster.map.memory.mb</code>, <code>mapred.cluster.reduce.memory.mb</code></td><td>long</td>
+          <td>The size, in terms of virtual memory, of a single map/reduce slot
+          in the Map-Reduce framework, used by the scheduler.
+          A job can ask for multiple slots for a single task via
+          mapred.job.map.memory.mb/mapred.job.reduce.memory.mb, up to the limit specified by
+          mapred.cluster.max.map.memory.mb/mapred.cluster.max.reduce.memory.mb, if the scheduler supports the feature.
+          The value of -1 indicates that this feature is turned off.
           </td></tr>
-          <tr><td>mapred.task.default.maxvmem</td><td>long</td>
+          <tr><td><code>mapred.job.map.memory.mb</code>, <code>mapred.job.reduce.memory.mb</code></td><td>long</td>
           <td>A number, in bytes, that represents the default VMEM task-limit
-          associated with a task. Unless overridden by a job's setting,
-          this number defines the VMEM task-limit.
+          associated with a map/reduce task. Unless overridden by a job's setting,
+          this number defines the VMEM task-limit. These properties replace the old deprecated property,
+          <code>mapred.task.default.maxvmem</code>.
           </td></tr>
-          <tr><td>mapred.task.limit.maxvmem</td><td>long</td>
+          <tr><td><code>mapred.cluster.max.map.memory.mb</code>, <code>mapred.cluster.max.reduce.memory.mb</code></td><td>long</td>
           <td>A number, in bytes, that represents the upper VMEM task-limit
-          associated with a task. Users, when specifying a VMEM task-limit
-          for their tasks, should not specify a limit which exceeds this amount.
+          associated with a map/reduce task. Users, when specifying a VMEM task-limit
+          for their tasks, should not specify a limit which exceeds this amount. These properties replace the old deprecated property,
+          <code>mapred.task.limit.maxvmem</code>.
           </td></tr>
           </table>

@@ -754,7 +759,7 @@
           <table>
           <tr><th>Name</th><th>Type</th><th>Description</th></tr>
-          <tr><td>mapred.tasktracker.taskmemorymanager.monitoring-interval</td>
+          <tr><td><code>mapred.tasktracker.taskmemorymanager.monitoring-interval</code></td>
           <td>long</td>
           <td>The time interval, in milliseconds, between which the TT checks
           for any memory violation. The default value is 5000 msec
@@ -768,14 +773,6 @@ above are missing or -1
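To make the relationship between the cluster-level, job-level, and maximum properties documented above concrete, a `mapred-site.xml` fragment might look like the following. The values are purely illustrative assumptions, not recommendations from the commit; only the property names come from the docs above.

```xml
<!-- mapred-site.xml fragment (illustrative values only). -->
<!-- One map slot is 1024 MB of virtual memory. -->
<property>
  <name>mapred.cluster.map.memory.mb</name>
  <value>1024</value>
</property>
<!-- A job asking for 2048 MB per map task occupies two slots. -->
<property>
  <name>mapred.job.map.memory.mb</name>
  <value>2048</value>
</property>
<!-- No job may request more than 4096 MB per map task. -->
<property>
  <name>mapred.cluster.max.map.memory.mb</name>
  <value>4096</value>
</property>
```

The same three-level pattern (slot size, per-job request, cluster maximum) applies to the corresponding `reduce` properties.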
svn commit: r1446183 - in /hadoop/common/trunk/hadoop-common-project/hadoop-common: ./ src/main/java/org/apache/hadoop/io/ src/test/java/org/apache/hadoop/io/
Author: tomwhite
Date: Thu Feb 14 14:07:33 2013
New Revision: 1446183

URL: http://svn.apache.org/r1446183
Log:
HADOOP-9154. SortedMapWritable#putAll() doesn't add key/value classes to the map. Contributed by Karthik Kambatla.

Modified:
    hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt
    hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/AbstractMapWritable.java
    hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/SortedMapWritable.java
    hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/io/TestSortedMapWritable.java

Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt
URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1446183&r1=1446182&r2=1446183&view=diff
==============================================================================
--- hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt (original)
+++ hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt Thu Feb 14 14:07:33 2013
@@ -366,6 +366,9 @@ Release 2.0.4-beta - UNRELEASED

     HADOOP-9297. remove old record IO generation and tests. (tucu)

+    HADOOP-9154. SortedMapWritable#putAll() doesn't add key/value classes to
+    the map. (Karthik Kambatla via tomwhite)
+
 Release 2.0.3-alpha - 2013-02-06

   INCOMPATIBLE CHANGES

Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/AbstractMapWritable.java
URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/AbstractMapWritable.java?rev=1446183&r1=1446182&r2=1446183&view=diff
==============================================================================
--- hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/AbstractMapWritable.java (original)
+++ hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/AbstractMapWritable.java Thu Feb 14 14:07:33 2013
@@ -29,6 +29,8 @@ import org.apache.hadoop.classification.
 import org.apache.hadoop.conf.Configurable;
 import org.apache.hadoop.conf.Configuration;

+import com.google.common.annotations.VisibleForTesting;
+
 /**
  * Abstract base class for MapWritable and SortedMapWritable
  *
@@ -45,10 +47,12 @@ public abstract class AbstractMapWritabl
   private AtomicReference<Configuration> conf;

   /* Class to id mappings */
-  private Map<Class, Byte> classToIdMap = new ConcurrentHashMap<Class, Byte>();
+  @VisibleForTesting
+  Map<Class, Byte> classToIdMap = new ConcurrentHashMap<Class, Byte>();

   /* Id to Class mappings */
-  private Map<Byte, Class> idToClassMap = new ConcurrentHashMap<Byte, Class>();
+  @VisibleForTesting
+  Map<Byte, Class> idToClassMap = new ConcurrentHashMap<Byte, Class>();

   /* The number of new classes (those not established by the constructor) */
   private volatile byte newClasses = 0;

Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/SortedMapWritable.java
URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/SortedMapWritable.java?rev=1446183&r1=1446182&r2=1446183&view=diff
==============================================================================
--- hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/SortedMapWritable.java (original)
+++ hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/SortedMapWritable.java Thu Feb 14 14:07:33 2013
@@ -141,7 +141,7 @@ public class SortedMapWritable extends A
     for (Map.Entry<? extends WritableComparable, ? extends Writable> e:
       t.entrySet()) {
-      instance.put(e.getKey(), e.getValue());
+      put(e.getKey(), e.getValue());
     }
   }

Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/io/TestSortedMapWritable.java
URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/io/TestSortedMapWritable.java?rev=1446183&r1=1446182&r2=1446183&view=diff
==============================================================================
--- hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/io/TestSortedMapWritable.java (original)
+++ hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/io/TestSortedMapWritable.java Thu Feb 14 14:07:33 2013
@@ -164,4 +164,18 @@ public class TestSortedMapWritable {
     assertTrue(failureReason, !mapA.equals(mapB));
     assertTrue(failureReason, !mapB.equals(mapA));
   }
+
+  @Test(timeout = 1000)
+  public void testPutAll() {
+    SortedMapWritable map1 = new SortedMapWritable();
+    SortedMapWritable map2 = new
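The one-line fix above routes `putAll()` through `put()` instead of writing directly into the backing `instance` map, because `put()` is where key/value classes get registered for serialization. A standalone sketch of that bug shape, with illustrative names rather than the real `SortedMapWritable` internals:

```java
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

// Sketch of the HADOOP-9154 bug shape: a map that must register each
// value's class in put(). A putAll() that writes straight into the
// backing map skips the registration; delegating to put() fixes it.
public class PutAllDemo {
    private final Map<String, Object> instance = new HashMap<>();
    final Set<Class<?>> registeredClasses = new HashSet<>();

    public void put(String key, Object value) {
        registeredClasses.add(value.getClass()); // bookkeeping put() must do
        instance.put(key, value);
    }

    // Buggy variant: bypasses put(), so classes are never registered.
    public void putAllBuggy(Map<String, Object> t) {
        for (Map.Entry<String, Object> e : t.entrySet()) {
            instance.put(e.getKey(), e.getValue());
        }
    }

    // Fixed variant, mirroring the r1446183 change: delegate to put().
    public void putAllFixed(Map<String, Object> t) {
        for (Map.Entry<String, Object> e : t.entrySet()) {
            put(e.getKey(), e.getValue());
        }
    }

    static boolean buggyRegisters() {
        PutAllDemo m = new PutAllDemo();
        Map<String, Object> src = new HashMap<>();
        src.put("key", "value");
        m.putAllBuggy(src);
        return m.registeredClasses.contains(String.class);
    }

    static boolean fixedRegisters() {
        PutAllDemo m = new PutAllDemo();
        Map<String, Object> src = new HashMap<>();
        src.put("key", "value");
        m.putAllFixed(src);
        return m.registeredClasses.contains(String.class);
    }

    public static void main(String[] args) {
        System.out.println(buggyRegisters()); // false: registration skipped
        System.out.println(fixedRegisters()); // true
    }
}
```

In the real class, the missing registration meant a map populated via `putAll()` could not be serialized correctly, which is why the `@VisibleForTesting` fields were exposed so the new test could check the class/id maps directly.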
svn commit: r1446186 - in /hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common: ./ src/main/java/org/apache/hadoop/io/ src/test/java/org/apache/hadoop/io/
Author: tomwhite
Date: Thu Feb 14 14:09:23 2013
New Revision: 1446186

URL: http://svn.apache.org/r1446186
Log:
Merge -r 1446182:1446183 from trunk to branch-2. Fixes: HADOOP-9154. SortedMapWritable#putAll() doesn't add key/value classes to the map. Contributed by Karthik Kambatla.

Modified:
    hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt
    hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/AbstractMapWritable.java
    hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/SortedMapWritable.java
    hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/io/TestSortedMapWritable.java

Modified: hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt
URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1446186&r1=1446185&r2=1446186&view=diff
==============================================================================
--- hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt (original)
+++ hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt Thu Feb 14 14:09:23 2013
@@ -34,6 +34,9 @@ Release 2.0.4-beta - UNRELEASED

     HADOOP-9297. remove old record IO generation and tests. (tucu)

+    HADOOP-9154. SortedMapWritable#putAll() doesn't add key/value classes to
+    the map. (Karthik Kambatla via tomwhite)
+
 Release 2.0.3-alpha - 2013-02-06

   INCOMPATIBLE CHANGES

Modified: hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/AbstractMapWritable.java
URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/AbstractMapWritable.java?rev=1446186&r1=1446185&r2=1446186&view=diff
==============================================================================
--- hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/AbstractMapWritable.java (original)
+++ hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/AbstractMapWritable.java Thu Feb 14 14:09:23 2013
@@ -29,6 +29,8 @@ import org.apache.hadoop.classification.
 import org.apache.hadoop.conf.Configurable;
 import org.apache.hadoop.conf.Configuration;

+import com.google.common.annotations.VisibleForTesting;
+
 /**
  * Abstract base class for MapWritable and SortedMapWritable
  *
@@ -45,10 +47,12 @@ public abstract class AbstractMapWritabl
   private AtomicReference<Configuration> conf;

   /* Class to id mappings */
-  private Map<Class, Byte> classToIdMap = new ConcurrentHashMap<Class, Byte>();
+  @VisibleForTesting
+  Map<Class, Byte> classToIdMap = new ConcurrentHashMap<Class, Byte>();

   /* Id to Class mappings */
-  private Map<Byte, Class> idToClassMap = new ConcurrentHashMap<Byte, Class>();
+  @VisibleForTesting
+  Map<Byte, Class> idToClassMap = new ConcurrentHashMap<Byte, Class>();

   /* The number of new classes (those not established by the constructor) */
   private volatile byte newClasses = 0;

Modified: hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/SortedMapWritable.java
URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/SortedMapWritable.java?rev=1446186&r1=1446185&r2=1446186&view=diff
==============================================================================
--- hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/SortedMapWritable.java (original)
+++ hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/SortedMapWritable.java Thu Feb 14 14:09:23 2013
@@ -141,7 +141,7 @@ public class SortedMapWritable extends A
     for (Map.Entry<? extends WritableComparable, ? extends Writable> e:
       t.entrySet()) {
-      instance.put(e.getKey(), e.getValue());
+      put(e.getKey(), e.getValue());
     }
   }

Modified: hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/io/TestSortedMapWritable.java
URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/io/TestSortedMapWritable.java?rev=1446186&r1=1446185&r2=1446186&view=diff
==============================================================================
--- hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/io/TestSortedMapWritable.java (original)
+++ hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/io/TestSortedMapWritable.java Thu Feb 14 14:09:23 2013
@@ -164,4
svn commit: r1446187 - in /hadoop/common/branches/branch-1: CHANGES.txt src/core/org/apache/hadoop/io/AbstractMapWritable.java src/core/org/apache/hadoop/io/SortedMapWritable.java src/test/org/apache/
Author: tomwhite
Date: Thu Feb 14 14:10:13 2013
New Revision: 1446187

URL: http://svn.apache.org/r1446187
Log:
HADOOP-9154. SortedMapWritable#putAll() doesn't add key/value classes to the map. Contributed by Karthik Kambatla.

Modified:
    hadoop/common/branches/branch-1/CHANGES.txt
    hadoop/common/branches/branch-1/src/core/org/apache/hadoop/io/AbstractMapWritable.java
    hadoop/common/branches/branch-1/src/core/org/apache/hadoop/io/SortedMapWritable.java
    hadoop/common/branches/branch-1/src/test/org/apache/hadoop/io/TestSortedMapWritable.java

Modified: hadoop/common/branches/branch-1/CHANGES.txt
URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-1/CHANGES.txt?rev=1446187&r1=1446186&r2=1446187&view=diff
==============================================================================
--- hadoop/common/branches/branch-1/CHANGES.txt (original)
+++ hadoop/common/branches/branch-1/CHANGES.txt Thu Feb 14 14:10:13 2013
@@ -490,6 +490,9 @@ Release 1.2.0 - unreleased

     HDFS-4466. Remove the deadlock from AbstractDelegationTokenSecretManager.
     (Brandon Li via suresh)

+    HADOOP-9154. SortedMapWritable#putAll() doesn't add key/value classes to
+    the map. (Karthik Kambatla via tomwhite)
+
 Release 1.1.2 - Unreleased

   INCOMPATIBLE CHANGES

Modified: hadoop/common/branches/branch-1/src/core/org/apache/hadoop/io/AbstractMapWritable.java
URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-1/src/core/org/apache/hadoop/io/AbstractMapWritable.java?rev=1446187&r1=1446186&r2=1446187&view=diff
==============================================================================
--- hadoop/common/branches/branch-1/src/core/org/apache/hadoop/io/AbstractMapWritable.java (original)
+++ hadoop/common/branches/branch-1/src/core/org/apache/hadoop/io/AbstractMapWritable.java Thu Feb 14 14:10:13 2013
@@ -43,10 +43,10 @@ public abstract class AbstractMapWritabl
   private AtomicReference<Configuration> conf;

   /* Class to id mappings */
-  private Map<Class, Byte> classToIdMap = new ConcurrentHashMap<Class, Byte>();
+  Map<Class, Byte> classToIdMap = new ConcurrentHashMap<Class, Byte>();

   /* Id to Class mappings */
-  private Map<Byte, Class> idToClassMap = new ConcurrentHashMap<Byte, Class>();
+  Map<Byte, Class> idToClassMap = new ConcurrentHashMap<Byte, Class>();

   /* The number of new classes (those not established by the constructor) */
   private volatile byte newClasses = 0;

Modified: hadoop/common/branches/branch-1/src/core/org/apache/hadoop/io/SortedMapWritable.java
URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-1/src/core/org/apache/hadoop/io/SortedMapWritable.java?rev=1446187&r1=1446186&r2=1446187&view=diff
==============================================================================
--- hadoop/common/branches/branch-1/src/core/org/apache/hadoop/io/SortedMapWritable.java (original)
+++ hadoop/common/branches/branch-1/src/core/org/apache/hadoop/io/SortedMapWritable.java Thu Feb 14 14:10:13 2013
@@ -139,7 +139,7 @@ public class SortedMapWritable extends A
     for (Map.Entry<? extends WritableComparable, ? extends Writable> e:
       t.entrySet()) {
-      instance.put(e.getKey(), e.getValue());
+      put(e.getKey(), e.getValue());
     }
   }

Modified: hadoop/common/branches/branch-1/src/test/org/apache/hadoop/io/TestSortedMapWritable.java
URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-1/src/test/org/apache/hadoop/io/TestSortedMapWritable.java?rev=1446187&r1=1446186&r2=1446187&view=diff
==============================================================================
--- hadoop/common/branches/branch-1/src/test/org/apache/hadoop/io/TestSortedMapWritable.java (original)
+++ hadoop/common/branches/branch-1/src/test/org/apache/hadoop/io/TestSortedMapWritable.java Thu Feb 14 14:10:13 2013
@@ -164,4 +164,18 @@ public class TestSortedMapWritable {
     assertTrue(failureReason, !mapA.equals(mapB));
     assertTrue(failureReason, !mapB.equals(mapA));
   }
+
+  @Test(timeout = 1000)
+  public void testPutAll() {
+    SortedMapWritable map1 = new SortedMapWritable();
+    SortedMapWritable map2 = new SortedMapWritable();
+    map1.put(new Text("key"), new Text("value"));
+    map2.putAll(map1);
+
+    assertEquals("map1 entries don't match map2 entries", map1, map2);
+    assertTrue(
+        "map2 doesn't have class information from map1",
+        map2.classToIdMap.containsKey(Text.class)
+            && map2.idToClassMap.containsValue(Text.class));
+  }
 }
svn commit: r1443395 - in /hadoop/common/branches/branch-1: CHANGES.txt src/core/org/apache/hadoop/io/SortedMapWritable.java src/test/org/apache/hadoop/io/TestSortedMapWritable.java
Author: tomwhite
Date: Thu Feb  7 10:43:21 2013
New Revision: 1443395

URL: http://svn.apache.org/viewvc?rev=1443395&view=rev
Log:
HADOOP-9124. SortedMapWritable violates contract of Map interface for equals() and hashCode(). Contributed by Surenkumar Nihalani

Modified:
    hadoop/common/branches/branch-1/CHANGES.txt
    hadoop/common/branches/branch-1/src/core/org/apache/hadoop/io/SortedMapWritable.java
    hadoop/common/branches/branch-1/src/test/org/apache/hadoop/io/TestSortedMapWritable.java

Modified: hadoop/common/branches/branch-1/CHANGES.txt
URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-1/CHANGES.txt?rev=1443395&r1=1443394&r2=1443395&view=diff
==============================================================================
--- hadoop/common/branches/branch-1/CHANGES.txt (original)
+++ hadoop/common/branches/branch-1/CHANGES.txt Thu Feb  7 10:43:21 2013
@@ -479,6 +479,9 @@ Release 1.2.0 - unreleased

     MAPREDUCE-4970. Child tasks (try to) create security audit log files.
     (sandyr via tucu)

+    HADOOP-9124. SortedMapWritable violates contract of Map interface for
+    equals() and hashCode(). (Surenkumar Nihalani via tomwhite)
+
 Release 1.1.2 - Unreleased

   INCOMPATIBLE CHANGES

Modified: hadoop/common/branches/branch-1/src/core/org/apache/hadoop/io/SortedMapWritable.java
URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-1/src/core/org/apache/hadoop/io/SortedMapWritable.java?rev=1443395&r1=1443394&r2=1443395&view=diff
==============================================================================
--- hadoop/common/branches/branch-1/src/core/org/apache/hadoop/io/SortedMapWritable.java (original)
+++ hadoop/common/branches/branch-1/src/core/org/apache/hadoop/io/SortedMapWritable.java Thu Feb  7 10:43:21 2013
@@ -203,4 +203,27 @@ public class SortedMapWritable extends A
       e.getValue().write(out);
     }
   }
+
+  @Override
+  public boolean equals(Object obj) {
+    if (this == obj) {
+      return true;
+    }
+
+    if (obj instanceof SortedMapWritable) {
+      Map map = (Map) obj;
+      if (size() != map.size()) {
+        return false;
+      }
+
+      return entrySet().equals(map.entrySet());
+    }
+
+    return false;
+  }
+
+  @Override
+  public int hashCode() {
+    return instance.hashCode();
+  }
 }

Modified: hadoop/common/branches/branch-1/src/test/org/apache/hadoop/io/TestSortedMapWritable.java
URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-1/src/test/org/apache/hadoop/io/TestSortedMapWritable.java?rev=1443395&r1=1443394&r2=1443395&view=diff
==============================================================================
--- hadoop/common/branches/branch-1/src/test/org/apache/hadoop/io/TestSortedMapWritable.java (original)
+++ hadoop/common/branches/branch-1/src/test/org/apache/hadoop/io/TestSortedMapWritable.java Thu Feb  7 10:43:21 2013
@@ -19,15 +19,21 @@
  */
 package org.apache.hadoop.io;

+import static org.junit.Assert.assertEquals;
+import static org.junit.Assert.assertFalse;
+import static org.junit.Assert.assertNotNull;
+import static org.junit.Assert.assertTrue;
+
 import java.util.Map;

-import junit.framework.TestCase;
+import org.junit.Test;

 /**
  * Tests SortedMapWritable
  */
-public class TestSortedMapWritable extends TestCase {
+public class TestSortedMapWritable {
   /** the test */
+  @Test
   @SuppressWarnings("unchecked")
   public void testSortedMapWritable() {
     Text[] keys = {
@@ -92,6 +98,7 @@ public class TestSortedMapWritable exten
   /**
    * Test that number of unknown classes is propagated across multiple copies.
    */
+  @Test
   @SuppressWarnings("deprecation")
   public void testForeignClass() {
     SortedMapWritable inMap = new SortedMapWritable();
@@ -101,4 +108,60 @@ public class TestSortedMapWritable exten
     SortedMapWritable copyOfCopy = new SortedMapWritable(outMap);
     assertEquals(1, copyOfCopy.getNewClasses());
   }
+
+  /**
+   * Tests if equal and hashCode method still hold the contract.
+   */
+  @Test
+  public void testEqualsAndHashCode() {
+    String failureReason;
+    SortedMapWritable mapA = new SortedMapWritable();
+    SortedMapWritable mapB = new SortedMapWritable();
+
+    // Sanity checks
+    failureReason = "SortedMapWritable couldn't be initialized. Got null reference";
+    assertNotNull(failureReason, mapA);
+    assertNotNull(failureReason, mapB);
+
+    // Basic null check
+    assertFalse("equals method returns true when passed null",
+        mapA.equals(null));
+
+    // When entry set is empty, they should be equal
+    assertTrue("Two empty SortedMapWritables are no longer equal",
+        mapA.equals(mapB));
+
+    // Setup
+    Text[] keys = { new Text("key1"), new Text("key2") };
+
+    BytesWritable[] values = { new BytesWritable("value1".getBytes()),
+        new BytesWritable("value2".getBytes()) };
+
+    mapA.put(keys[0], values[0]);
+    mapB.put(keys[1], values[1]);
+
+    // entrySets are different
+    failureReason = "Two
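The `equals()`/`hashCode()` pair added above follows the `java.util.Map` contract: equality is decided by comparing entry sets, and `hashCode()` delegates to the backing map so that equal maps hash equally. A self-contained sketch of that pattern, using illustrative names rather than the real `SortedMapWritable` (which wraps `Writable` keys and values):

```java
import java.util.TreeMap;

// Sketch of the HADOOP-9124 pattern: a sorted-map wrapper whose
// equals() compares entry sets and whose hashCode() delegates to the
// backing TreeMap, keeping the java.util.Map contract that
// a.equals(b) implies a.hashCode() == b.hashCode().
public class MapContractDemo {
    private final TreeMap<String, String> instance = new TreeMap<>();

    public void put(String k, String v) { instance.put(k, v); }
    public int size() { return instance.size(); }

    @Override
    public boolean equals(Object obj) {
        if (this == obj) {
            return true;
        }
        if (obj instanceof MapContractDemo) {
            MapContractDemo other = (MapContractDemo) obj;
            if (size() != other.size()) {
                return false;
            }
            return instance.entrySet().equals(other.instance.entrySet());
        }
        return false;
    }

    @Override
    public int hashCode() {
        // TreeMap.hashCode() is the sum of its entries' hash codes,
        // so equal entry sets yield equal hash codes.
        return instance.hashCode();
    }

    static boolean contractHolds() {
        MapContractDemo a = new MapContractDemo();
        MapContractDemo b = new MapContractDemo();
        a.put("key1", "value1");
        b.put("key1", "value1");
        return a.equals(b) && a.hashCode() == b.hashCode();
    }

    public static void main(String[] args) {
        System.out.println(contractHolds()); // true
    }
}
```

Before this fix, two `SortedMapWritable` instances with identical entries could compare unequal (or hash differently), which breaks any code that stores them in hash-based collections.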
svn commit: r1441475 - in /hadoop/common/trunk/hadoop-common-project/hadoop-common: CHANGES.txt src/main/java/org/apache/hadoop/io/SortedMapWritable.java src/test/java/org/apache/hadoop/io/TestSortedM
Author: tomwhite Date: Fri Feb 1 15:03:35 2013 New Revision: 1441475 URL: http://svn.apache.org/viewvc?rev=1441475view=rev Log: HADOOP-9124. SortedMapWritable violates contract of Map interface for equals() and hashCode(). Contributed by Surenkumar Nihalani Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/SortedMapWritable.java hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/io/TestSortedMapWritable.java Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1441475r1=1441474r2=1441475view=diff == --- hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt (original) +++ hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt Fri Feb 1 15:03:35 2013 @@ -592,6 +592,9 @@ Release 2.0.3-alpha - Unreleased HADOOP-9221. Convert remaining xdocs to APT. (Andy Isaacson via atm) HADOOP-8981. TestMetricsSystemImpl fails on Windows. (Xuan Gong via suresh) + +HADOOP-9124. SortedMapWritable violates contract of Map interface for +equals() and hashCode(). 
(Surenkumar Nihalani via tomwhite) Release 2.0.2-alpha - 2012-09-07 Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/SortedMapWritable.java URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/SortedMapWritable.java?rev=1441475r1=1441474r2=1441475view=diff == --- hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/SortedMapWritable.java (original) +++ hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/SortedMapWritable.java Fri Feb 1 15:03:35 2013 @@ -203,4 +203,27 @@ public class SortedMapWritable extends A e.getValue().write(out); } } + + @Override + public boolean equals(Object obj) { +if (this == obj) { + return true; +} + +if (obj instanceof SortedMapWritable) { + Map map = (Map) obj; + if (size() != map.size()) { +return false; + } + + return entrySet().equals(map.entrySet()); +} + +return false; + } + + @Override + public int hashCode() { +return instance.hashCode(); + } } Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/io/TestSortedMapWritable.java URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/io/TestSortedMapWritable.java?rev=1441475r1=1441474r2=1441475view=diff == --- hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/io/TestSortedMapWritable.java (original) +++ hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/io/TestSortedMapWritable.java Fri Feb 1 15:03:35 2013 @@ -17,15 +17,20 @@ */ package org.apache.hadoop.io; -import java.util.Map; +import static org.junit.Assert.assertEquals; +import static org.junit.Assert.assertFalse; +import static org.junit.Assert.assertNotNull; +import static org.junit.Assert.assertTrue; -import junit.framework.TestCase; +import 
java.util.Map; +import org.junit.Test; /** * Tests SortedMapWritable */ -public class TestSortedMapWritable extends TestCase { +public class TestSortedMapWritable { /** the test */ + @Test @SuppressWarnings("unchecked") public void testSortedMapWritable() { Text[] keys = { @@ -90,6 +95,7 @@ public class TestSortedMapWritable exten /** * Test that number of unknown classes is propagated across multiple copies. */ + @Test @SuppressWarnings("deprecation") public void testForeignClass() { SortedMapWritable inMap = new SortedMapWritable(); @@ -99,4 +105,63 @@ public class TestSortedMapWritable exten SortedMapWritable copyOfCopy = new SortedMapWritable(outMap); assertEquals(1, copyOfCopy.getNewClasses()); } + + /** + * Tests if equal and hashCode method still hold the contract. + */ + @Test + public void testEqualsAndHashCode() { +String failureReason; +SortedMapWritable mapA = new SortedMapWritable(); +SortedMapWritable mapB = new SortedMapWritable(); + +// Sanity checks +failureReason = "SortedMapWritable couldn't be initialized. Got null reference"; +assertNotNull(failureReason, mapA); +assertNotNull(failureReason, mapB); + +// Basic null check +assertFalse("equals method returns true when passed null", mapA.equals(null
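The HADOOP-9124 patch above makes equals() compare entry sets after a size check and makes hashCode() delegate to the backing map, so the java.util.Map contract (equal maps hash alike) holds. A minimal standalone sketch of that pattern, using a plain TreeMap-backed class as a stand-in rather than the actual Writable types:

```java
import java.util.Map;
import java.util.SortedMap;
import java.util.TreeMap;

// Stand-in for the SortedMapWritable fix: equals() compares entry sets
// after a cheap size check, hashCode() delegates to the backing map, so
// two equal maps always produce the same hash code.
public class SortedMapSketch {
  private final SortedMap<String, String> instance = new TreeMap<String, String>();

  public void put(String k, String v) { instance.put(k, v); }

  @Override
  public boolean equals(Object obj) {
    if (this == obj) {
      return true;
    }
    if (obj instanceof SortedMapSketch) {
      Map<String, String> map = ((SortedMapSketch) obj).instance;
      if (instance.size() != map.size()) {
        return false;  // fast path: different sizes can never be equal
      }
      return instance.entrySet().equals(map.entrySet());
    }
    return false;  // also covers equals(null)
  }

  @Override
  public int hashCode() {
    return instance.hashCode();
  }

  public static void main(String[] args) {
    SortedMapSketch a = new SortedMapSketch();
    SortedMapSketch b = new SortedMapSketch();
    a.put("key1", "value1");
    b.put("key1", "value1");
    System.out.println(a.equals(b) && a.hashCode() == b.hashCode()); // true
  }
}
```

Note the real patch casts the argument to Map directly (SortedMapWritable extends AbstractMap), whereas the sketch reaches the private backing map; the contract argument is the same either way.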
svn commit: r1441476 - in /hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common: CHANGES.txt src/main/java/org/apache/hadoop/io/SortedMapWritable.java src/test/java/org/apache/hadoop/io
Author: tomwhite Date: Fri Feb 1 15:05:32 2013 New Revision: 1441476 URL: http://svn.apache.org/viewvc?rev=1441476view=rev Log: Merge -r 1441474:1441475 from trunk to branch-2. Fixes: HADOOP-9124. SortedMapWritable violates contract of Map interface for equals() and hashCode(). Contributed by Surenkumar Nihalani Modified: hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/SortedMapWritable.java hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/io/TestSortedMapWritable.java Modified: hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1441476r1=1441475r2=1441476view=diff == --- hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt (original) +++ hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt Fri Feb 1 15:05:32 2013 @@ -273,6 +273,9 @@ Release 2.0.3-alpha - Unreleased HADOOP-9221. Convert remaining xdocs to APT. (Andy Isaacson via atm) HADOOP-8981. TestMetricsSystemImpl fails on Windows. (Xuan Gong via suresh) + +HADOOP-9124. SortedMapWritable violates contract of Map interface for +equals() and hashCode(). 
(Surenkumar Nihalani via tomwhite) Release 2.0.2-alpha - 2012-09-07 Modified: hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/SortedMapWritable.java URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/SortedMapWritable.java?rev=1441476r1=1441475r2=1441476view=diff == --- hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/SortedMapWritable.java (original) +++ hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/SortedMapWritable.java Fri Feb 1 15:05:32 2013 @@ -203,4 +203,27 @@ public class SortedMapWritable extends A e.getValue().write(out); } } + + @Override + public boolean equals(Object obj) { +if (this == obj) { + return true; +} + +if (obj instanceof SortedMapWritable) { + Map map = (Map) obj; + if (size() != map.size()) { +return false; + } + + return entrySet().equals(map.entrySet()); +} + +return false; + } + + @Override + public int hashCode() { +return instance.hashCode(); + } } Modified: hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/io/TestSortedMapWritable.java URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/io/TestSortedMapWritable.java?rev=1441476r1=1441475r2=1441476view=diff == --- hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/io/TestSortedMapWritable.java (original) +++ hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/io/TestSortedMapWritable.java Fri Feb 1 15:05:32 2013 @@ -17,15 +17,20 @@ */ package org.apache.hadoop.io; -import java.util.Map; +import static org.junit.Assert.assertEquals; +import static org.junit.Assert.assertFalse; +import static 
org.junit.Assert.assertNotNull; +import static org.junit.Assert.assertTrue; -import junit.framework.TestCase; +import java.util.Map; +import org.junit.Test; /** * Tests SortedMapWritable */ -public class TestSortedMapWritable extends TestCase { +public class TestSortedMapWritable { /** the test */ + @Test @SuppressWarnings(unchecked) public void testSortedMapWritable() { Text[] keys = { @@ -90,6 +95,7 @@ public class TestSortedMapWritable exten /** * Test that number of unknown classes is propagated across multiple copies. */ + @Test @SuppressWarnings(deprecation) public void testForeignClass() { SortedMapWritable inMap = new SortedMapWritable(); @@ -99,4 +105,63 @@ public class TestSortedMapWritable exten SortedMapWritable copyOfCopy = new SortedMapWritable(outMap); assertEquals(1, copyOfCopy.getNewClasses()); } + + /** + * Tests if equal and hashCode method still hold the contract. + */ + @Test + public void testEqualsAndHashCode() { +String failureReason; +SortedMapWritable mapA = new SortedMapWritable(); +SortedMapWritable mapB = new SortedMapWritable(); + +// Sanity checks +failureReason = SortedMapWritable
svn commit: r1438447 - in /hadoop/common/branches/branch-1: CHANGES.txt src/mapred/org/apache/hadoop/mapred/LocalJobRunner.java src/test/org/apache/hadoop/mapreduce/TestLocalRunner.java
Author: tomwhite Date: Fri Jan 25 10:57:52 2013 New Revision: 1438447 URL: http://svn.apache.org/viewvc?rev=1438447&view=rev Log: MAPREDUCE-2931. LocalJobRunner should support parallel mapper execution. Contributed by Sandy Ryza. Added: hadoop/common/branches/branch-1/src/test/org/apache/hadoop/mapreduce/TestLocalRunner.java (with props) Modified: hadoop/common/branches/branch-1/CHANGES.txt hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/LocalJobRunner.java Modified: hadoop/common/branches/branch-1/CHANGES.txt URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-1/CHANGES.txt?rev=1438447&r1=1438446&r2=1438447&view=diff == --- hadoop/common/branches/branch-1/CHANGES.txt (original) +++ hadoop/common/branches/branch-1/CHANGES.txt Fri Jan 25 10:57:52 2013 @@ -160,6 +160,9 @@ Release 1.2.0 - unreleased MAPREDUCE-4907. TrackerDistributedCacheManager issues too many getFileStatus calls. (sandyr via tucu) + +MAPREDUCE-2931. LocalJobRunner should support parallel mapper execution.
+(Sandy Ryza via tomwhite) OPTIMIZATIONS Modified: hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/LocalJobRunner.java URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/LocalJobRunner.java?rev=1438447r1=1438446r2=1438447view=diff == --- hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/LocalJobRunner.java (original) +++ hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/LocalJobRunner.java Fri Jan 25 10:57:52 2013 @@ -22,10 +22,15 @@ import java.io.File; import java.io.IOException; import java.io.OutputStream; import java.util.ArrayList; +import java.util.Collections; import java.util.HashMap; import java.util.List; import java.util.Map; import java.util.Random; +import java.util.concurrent.ExecutorService; +import java.util.concurrent.Executors; +import java.util.concurrent.TimeUnit; +import java.util.concurrent.atomic.AtomicInteger; import org.apache.commons.logging.Log; import org.apache.commons.logging.LogFactory; @@ -35,27 +40,26 @@ import org.apache.hadoop.filecache.Track import org.apache.hadoop.fs.FileSystem; import org.apache.hadoop.fs.Path; import org.apache.hadoop.mapreduce.security.token.delegation.DelegationTokenIdentifier; -import org.apache.hadoop.io.DataOutputBuffer; import org.apache.hadoop.io.Text; -import org.apache.hadoop.io.serializer.SerializationFactory; -import org.apache.hadoop.io.serializer.Serializer; import org.apache.hadoop.mapreduce.split.SplitMetaInfoReader; import org.apache.hadoop.mapreduce.split.JobSplit.TaskSplitMetaInfo; import org.apache.hadoop.security.UserGroupInformation; -import org.apache.hadoop.mapreduce.security.TokenCache; import org.apache.hadoop.security.Credentials; import org.apache.hadoop.security.authorize.AccessControlList; import org.apache.hadoop.security.token.Token; /** Implements MapReduce locally, in-process, for debugging. 
*/ -class LocalJobRunner implements JobSubmissionProtocol { +public class LocalJobRunner implements JobSubmissionProtocol { public static final Log LOG = LogFactory.getLog(LocalJobRunner.class); + + public static final String LOCAL_MAX_MAPS = + "mapreduce.local.map.tasks.maximum"; private FileSystem fs; private HashMap<JobID, Job> jobs = new HashMap<JobID, Job>(); private JobConf conf; - private int map_tasks = 0; + private AtomicInteger map_tasks = new AtomicInteger(0); private int reduce_tasks = 0; final Random rand = new Random(); private final TaskController taskController = new DefaultTaskController(); @@ -65,6 +69,8 @@ class LocalJobRunner implements JobSubmi private static final String jobDir = "localRunner/"; + private static final Counters EMPTY_COUNTERS = new Counters(); + public long getProtocolVersion(String protocol, long clientVersion) { return JobSubmissionProtocol.versionID; } @@ -82,9 +88,15 @@ class LocalJobRunner implements JobSubmi private JobID id; private JobConf job; + +private int numMapTasks; +private float [] partialMapProgress; +private Counters [] mapCounters; +private Counters reduceCounters; private JobStatus status; -private ArrayList<TaskAttemptID> mapIds = new ArrayList<TaskAttemptID>(); +private List<TaskAttemptID> mapIds = Collections.synchronizedList( +new ArrayList<TaskAttemptID>()); private JobProfile profile; private FileSystem localFs; @@ -92,13 +104,6 @@ class LocalJobRunner implements JobSubmi private TrackerDistributedCacheManager trackerDistributedCacheManager; private TaskDistributedCacheManager taskDistributedCacheManager; - -// Counters summed over all the map/reduce tasks which -// have successfully completed
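The MAPREDUCE-2931 diff above swaps the serial map loop for a bounded thread pool, an AtomicInteger task counter, and a synchronized list of task IDs. A rough standalone sketch of that concurrency pattern (method and class names here are illustrative, not the actual LocalJobRunner code):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

// Sketch of the parallel-mapper pattern: submit each map task to a pool
// whose size is capped (the patch caps it via mapreduce.local.map.tasks.maximum),
// count completions with an AtomicInteger, then drain the pool.
public class ParallelMapSketch {
  static final AtomicInteger completedMaps = new AtomicInteger(0);

  static void runMapTask(int taskId) {
    // The real runner would execute a MapTask here; we only record completion.
    completedMaps.incrementAndGet();
  }

  public static void main(String[] args) throws InterruptedException {
    int numMapTasks = 8;
    int maxThreads = Math.min(4, numMapTasks); // stand-in for LOCAL_MAX_MAPS
    ExecutorService pool = Executors.newFixedThreadPool(maxThreads);
    for (int i = 0; i < numMapTasks; i++) {
      final int id = i;
      pool.execute(new Runnable() {
        public void run() { runMapTask(id); }
      });
    }
    pool.shutdown();                           // no new tasks accepted
    pool.awaitTermination(1, TimeUnit.MINUTES); // wait for all mappers
    System.out.println(completedMaps.get());   // 8
  }
}
```

The AtomicInteger and the Collections.synchronizedList wrapper matter because multiple mapper threads now mutate state that a single thread previously owned.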
svn commit: r1437343 - in /hadoop/common/branches/branch-1: CHANGES.txt src/mapred/org/apache/hadoop/mapred/TaskTracker.java
Author: tomwhite Date: Wed Jan 23 11:06:39 2013 New Revision: 1437343 URL: http://svn.apache.org/viewvc?rev=1437343&view=rev Log: MAPREDUCE-4929. mapreduce.task.timeout is ignored. Contributed by Sandy Ryza. Modified: hadoop/common/branches/branch-1/CHANGES.txt hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/TaskTracker.java Modified: hadoop/common/branches/branch-1/CHANGES.txt URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-1/CHANGES.txt?rev=1437343&r1=1437342&r2=1437343&view=diff == --- hadoop/common/branches/branch-1/CHANGES.txt (original) +++ hadoop/common/branches/branch-1/CHANGES.txt Wed Jan 23 11:06:39 2013 @@ -441,6 +441,9 @@ Release 1.2.0 - unreleased HADOOP-8580. ant compile-native fails with automake version 1.11.3. (Gera Shegalov via suresh) + +MAPREDUCE-4929. mapreduce.task.timeout is ignored. +(Sandy Ryza via tomwhite) Release 1.1.2 - Unreleased Modified: hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/TaskTracker.java URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/TaskTracker.java?rev=1437343&r1=1437342&r2=1437343&view=diff == --- hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/TaskTracker.java (original) +++ hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/TaskTracker.java Wed Jan 23 11:06:39 2013 @@ -2805,7 +2805,11 @@ public class TaskTracker implements MRCo this.localJobConf = lconf; keepFailedTaskFiles = localJobConf.getKeepFailedTaskFiles(); taskTimeout = localJobConf.getLong("mapred.task.timeout", - 10 * 60 * 1000); + Integer.MIN_VALUE); + if (taskTimeout == Integer.MIN_VALUE) { +taskTimeout = localJobConf.getLong("mapreduce.task.timeout", +10 * 60 * 1000); + } if (task.isMapTask()) { debugCommand = localJobConf.getMapDebugScript(); } else {
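The MAPREDUCE-4929 fix above uses a sentinel default: read the old key (mapred.task.timeout) with Integer.MIN_VALUE as the default, and only when it is unset fall back to the new key with the real 10-minute default. A sketch of the same precedence logic using java.util.Properties in place of JobConf:

```java
import java.util.Properties;

// Sentinel-fallback pattern from the patch: an explicitly set legacy key
// wins; otherwise the new key is consulted with the true default.
public class TimeoutFallbackSketch {
  static long getTaskTimeout(Properties conf) {
    long timeout = Long.parseLong(
        conf.getProperty("mapred.task.timeout", String.valueOf(Integer.MIN_VALUE)));
    if (timeout == Integer.MIN_VALUE) {
      // Legacy key absent: honor the new key, defaulting to 10 minutes.
      timeout = Long.parseLong(
          conf.getProperty("mapreduce.task.timeout", String.valueOf(10 * 60 * 1000)));
    }
    return timeout;
  }

  public static void main(String[] args) {
    Properties conf = new Properties();
    System.out.println(getTaskTimeout(conf));   // 600000 (10-minute default)
    conf.setProperty("mapreduce.task.timeout", "30000");
    System.out.println(getTaskTimeout(conf));   // 30000 (new key now honored)
    conf.setProperty("mapred.task.timeout", "5000");
    System.out.println(getTaskTimeout(conf));   // 5000 (legacy key wins)
  }
}
```

Before the patch, the hard-coded default on the old key meant a lookup on mapred.task.timeout always succeeded, so mapreduce.task.timeout was silently ignored.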
svn commit: r1433879 - in /hadoop/common/trunk/hadoop-common-project/hadoop-common: CHANGES.txt src/main/java/org/apache/hadoop/security/Credentials.java src/main/java/org/apache/hadoop/security/UserG
Author: tomwhite Date: Wed Jan 16 10:20:11 2013 New Revision: 1433879 URL: http://svn.apache.org/viewvc?rev=1433879view=rev Log: HADOOP-9212. Potential deadlock in FileSystem.Cache/IPC/UGI. Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/Credentials.java hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/UserGroupInformation.java Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1433879r1=1433878r2=1433879view=diff == --- hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt (original) +++ hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt Wed Jan 16 10:20:11 2013 @@ -554,6 +554,8 @@ Release 2.0.3-alpha - Unreleased HADOOP-8816. HTTP Error 413 full HEAD if using kerberos authentication. (moritzmoeller via tucu) + +HADOOP-9212. Potential deadlock in FileSystem.Cache/IPC/UGI. 
(tomwhite) Release 2.0.2-alpha - 2012-09-07 Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/Credentials.java URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/Credentials.java?rev=1433879r1=1433878r2=1433879view=diff == --- hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/Credentials.java (original) +++ hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/Credentials.java Wed Jan 16 10:20:11 2013 @@ -18,10 +18,13 @@ package org.apache.hadoop.security; +import java.io.BufferedInputStream; import java.io.DataInput; import java.io.DataInputStream; import java.io.DataOutput; import java.io.DataOutputStream; +import java.io.File; +import java.io.FileInputStream; import java.io.IOException; import java.util.Arrays; import java.util.Collection; @@ -148,8 +151,32 @@ public class Credentials implements Writ in.close(); return credentials; } catch(IOException ioe) { + throw new IOException(Exception reading + filename, ioe); +} finally { IOUtils.cleanup(LOG, in); +} + } + + /** + * Convenience method for reading a token storage file, and loading the Tokens + * therein in the passed UGI + * @param filename + * @param conf + * @throws IOException + */ + public static Credentials readTokenStorageFile(File filename, Configuration conf) + throws IOException { +DataInputStream in = null; +Credentials credentials = new Credentials(); +try { + in = new DataInputStream(new BufferedInputStream( + new FileInputStream(filename))); + credentials.readTokenStorageStream(in); + return credentials; +} catch(IOException ioe) { throw new IOException(Exception reading + filename, ioe); +} finally { + IOUtils.cleanup(LOG, in); } } Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/UserGroupInformation.java URL: 
http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/UserGroupInformation.java?rev=1433879r1=1433878r2=1433879view=diff == --- hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/UserGroupInformation.java (original) +++ hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/UserGroupInformation.java Wed Jan 16 10:20:11 2013 @@ -20,6 +20,7 @@ package org.apache.hadoop.security; import static org.apache.hadoop.fs.CommonConfigurationKeys.HADOOP_KERBEROS_MIN_SECONDS_BEFORE_RELOGIN; import static org.apache.hadoop.fs.CommonConfigurationKeys.HADOOP_KERBEROS_MIN_SECONDS_BEFORE_RELOGIN_DEFAULT; +import java.io.File; import java.io.IOException; import java.lang.reflect.UndeclaredThrowableException; import java.security.AccessControlContext; @@ -656,10 +657,11 @@ public class UserGroupInformation { String fileLocation = System.getenv(HADOOP_TOKEN_FILE_LOCATION); if (fileLocation != null) { - // load the token storage file and put all of the tokens into the - // user. + // Load the token storage file and put all of the tokens into the + // user. Don't use the FileSystem API for reading since it has a lock + // cycle (HADOOP-9212). Credentials cred = Credentials.readTokenStorageFile
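The HADOOP-9212 change above adds a readTokenStorageFile overload that reads the token file through plain java.io streams, bypassing the FileSystem API whose cache interacts with IPC and UGI locks. A simplified sketch of that read path (the method name and payload here are illustrative; the real code reads a Credentials token stream and cleans up via IOUtils):

```java
import java.io.BufferedInputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;

// Sketch of the new read path: java.io only (no FileSystem.Cache, hence
// no FileSystem.Cache/IPC/UGI lock cycle), wrapping failures with the
// filename and always closing the stream in a finally block.
public class TokenFileReadSketch {
  static int readMagic(File f) throws IOException {
    DataInputStream in = null;
    try {
      in = new DataInputStream(new BufferedInputStream(new FileInputStream(f)));
      return in.readInt();
    } catch (IOException ioe) {
      throw new IOException("Exception reading " + f, ioe);
    } finally {
      if (in != null) in.close();  // the patch uses IOUtils.cleanup(LOG, in)
    }
  }

  public static void main(String[] args) throws IOException {
    File f = File.createTempFile("tokens", ".bin");
    f.deleteOnExit();
    DataOutputStream out = new DataOutputStream(new FileOutputStream(f));
    out.writeInt(42);
    out.close();
    System.out.println(readMagic(f)); // 42
  }
}
```

The companion UserGroupInformation change calls this File-based overload when loading HADOOP_TOKEN_FILE_LOCATION, precisely to stay off the FileSystem API during UGI initialization.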
svn commit: r1433882 - in /hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common: CHANGES.txt src/main/java/org/apache/hadoop/security/Credentials.java src/main/java/org/apache/hadoop/se
Author: tomwhite Date: Wed Jan 16 10:26:21 2013 New Revision: 1433882 URL: http://svn.apache.org/viewvc?rev=1433882view=rev Log: Merge -r 1433878:1433879 from trunk to branch-2. Fixes: HADOOP-9212. Potential deadlock in FileSystem.Cache/IPC/UGI. Modified: hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/Credentials.java hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/UserGroupInformation.java Modified: hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1433882r1=1433881r2=1433882view=diff == --- hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt (original) +++ hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt Wed Jan 16 10:26:21 2013 @@ -244,6 +244,8 @@ Release 2.0.3-alpha - Unreleased HADOOP-8816. HTTP Error 413 full HEAD if using kerberos authentication. (moritzmoeller via tucu) + +HADOOP-9212. Potential deadlock in FileSystem.Cache/IPC/UGI. (tomwhite) HADOOP-8589 ViewFs tests fail when tests and home dirs are nested. 
(sanjay Radia) Modified: hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/Credentials.java URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/Credentials.java?rev=1433882r1=1433881r2=1433882view=diff == --- hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/Credentials.java (original) +++ hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/Credentials.java Wed Jan 16 10:26:21 2013 @@ -18,10 +18,13 @@ package org.apache.hadoop.security; +import java.io.BufferedInputStream; import java.io.DataInput; import java.io.DataInputStream; import java.io.DataOutput; import java.io.DataOutputStream; +import java.io.File; +import java.io.FileInputStream; import java.io.IOException; import java.util.Arrays; import java.util.Collection; @@ -148,8 +151,32 @@ public class Credentials implements Writ in.close(); return credentials; } catch(IOException ioe) { + throw new IOException(Exception reading + filename, ioe); +} finally { IOUtils.cleanup(LOG, in); +} + } + + /** + * Convenience method for reading a token storage file, and loading the Tokens + * therein in the passed UGI + * @param filename + * @param conf + * @throws IOException + */ + public static Credentials readTokenStorageFile(File filename, Configuration conf) + throws IOException { +DataInputStream in = null; +Credentials credentials = new Credentials(); +try { + in = new DataInputStream(new BufferedInputStream( + new FileInputStream(filename))); + credentials.readTokenStorageStream(in); + return credentials; +} catch(IOException ioe) { throw new IOException(Exception reading + filename, ioe); +} finally { + IOUtils.cleanup(LOG, in); } } Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/UserGroupInformation.java URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/UserGroupInformation.java?rev=1433882r1=1433881r2=1433882view=diff == --- hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/UserGroupInformation.java (original) +++ hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/UserGroupInformation.java Wed Jan 16 10:26:21 2013 @@ -18,6 +18,7 @@ package org.apache.hadoop.security; +import java.io.File; import java.io.IOException; import java.lang.reflect.UndeclaredThrowableException; import java.security.AccessControlContext; @@ -645,10 +646,11 @@ public class UserGroupInformation { String fileLocation = System.getenv(HADOOP_TOKEN_FILE_LOCATION); if (fileLocation != null) { - // load the token storage file and put all of the tokens into the - // user. + // Load the token storage file and put all of the tokens into the + // user. Don't use the FileSystem API for reading since it has a lock + // cycle
svn commit: r1431251 - in /hadoop/common/trunk/hadoop-common-project/hadoop-common: CHANGES.txt src/main/java/org/apache/hadoop/ha/ActiveStandbyElector.java
Author: tomwhite Date: Thu Jan 10 10:05:53 2013 New Revision: 1431251 URL: http://svn.apache.org/viewvc?rev=1431251view=rev Log: HADOOP-9183. Potential deadlock in ActiveStandbyElector. Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ha/ActiveStandbyElector.java Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1431251r1=1431250r2=1431251view=diff == --- hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt (original) +++ hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt Thu Jan 10 10:05:53 2013 @@ -528,6 +528,8 @@ Release 2.0.3-alpha - Unreleased HADOOP-9155. FsPermission should have different default value, 777 for directory and 666 for file. (Binglin Chang via atm) +HADOOP-9183. Potential deadlock in ActiveStandbyElector. (tomwhite) + Release 2.0.2-alpha - 2012-09-07 INCOMPATIBLE CHANGES Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ha/ActiveStandbyElector.java URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ha/ActiveStandbyElector.java?rev=1431251r1=1431250r2=1431251view=diff == --- hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ha/ActiveStandbyElector.java (original) +++ hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ha/ActiveStandbyElector.java Thu Jan 10 10:05:53 2013 @@ -613,7 +613,7 @@ public class ActiveStandbyElector implem // Unfortunately, the ZooKeeper constructor connects to ZooKeeper and // may trigger the Connected event immediately. So, if we register the // watcher after constructing ZooKeeper, we may miss that event. 
Instead, -// we construct the watcher first, and have it queue any events it receives +// we construct the watcher first, and have it block any events it receives // before we can set its ZooKeeper reference. WatcherWithClientRef watcher = new WatcherWithClientRef(); ZooKeeper zk = new ZooKeeper(zkHostPort, zkSessionTimeout, watcher); @@ -1002,19 +1002,17 @@ public class ActiveStandbyElector implem private CountDownLatch hasReceivedEvent = new CountDownLatch(1); /** - * If any events arrive before the reference to ZooKeeper is set, - * they get queued up and later forwarded when the reference is - * available. + * Latch used to wait until the reference to ZooKeeper is set. */ -private final List<WatchedEvent> queuedEvents = Lists.newLinkedList(); +private CountDownLatch hasSetZooKeeper = new CountDownLatch(1); private WatcherWithClientRef() { } private WatcherWithClientRef(ZooKeeper zk) { - this.zk = zk; + setZooKeeperRef(zk); } - + /** * Waits for the next event from ZooKeeper to arrive. * @@ -1029,9 +1027,7 @@ public class ActiveStandbyElector implem if (!hasReceivedEvent.await(connectionTimeoutMs, TimeUnit.MILLISECONDS)) { LOG.error("Connection timed out: couldn't connect to ZooKeeper in " + connectionTimeoutMs + " milliseconds"); - synchronized (this) { -zk.close(); - } + zk.close(); throw KeeperException.create(Code.CONNECTIONLOSS); } } catch (InterruptedException e) { @@ -1041,29 +1037,18 @@ public class ActiveStandbyElector implem } } -private synchronized void setZooKeeperRef(ZooKeeper zk) { +private void setZooKeeperRef(ZooKeeper zk) { Preconditions.checkState(this.zk == null, "zk already set -- must be set exactly once"); this.zk = zk; - - for (WatchedEvent e : queuedEvents) { -forwardEvent(e); - } - queuedEvents.clear(); + hasSetZooKeeper.countDown(); } @Override -public synchronized void process(WatchedEvent event) { - if (zk != null) { -forwardEvent(event); - } else { -queuedEvents.add(event); - } -} - -private void forwardEvent(WatchedEvent event) { +public void process(WatchedEvent event) { hasReceivedEvent.countDown(); try { +hasSetZooKeeper.await(zkSessionTimeout, TimeUnit.MILLISECONDS); ActiveStandbyElector.this.processWatchEvent( zk, event); } catch (Throwable t) {
svn commit: r1431252 - in /hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common: CHANGES.txt src/main/java/org/apache/hadoop/ha/ActiveStandbyElector.java
Author: tomwhite Date: Thu Jan 10 10:09:06 2013 New Revision: 1431252 URL: http://svn.apache.org/viewvc?rev=1431252&view=rev Log: Merge -r 1431250:1431251 from trunk to branch-2. Fixes: HADOOP-9183. Potential deadlock in ActiveStandbyElector. Modified: hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ha/ActiveStandbyElector.java Modified: hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1431252&r1=1431251&r2=1431252&view=diff == --- hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt (original) +++ hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt Thu Jan 10 10:09:06 2013 @@ -225,6 +225,8 @@ Release 2.0.3-alpha - Unreleased HADOOP-9155. FsPermission should have different default value, 777 for directory and 666 for file. (Binglin Chang via atm) +HADOOP-9183. Potential deadlock in ActiveStandbyElector. 
(tomwhite) + Release 2.0.2-alpha - 2012-09-07 INCOMPATIBLE CHANGES Modified: hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ha/ActiveStandbyElector.java URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ha/ActiveStandbyElector.java?rev=1431252&r1=1431251&r2=1431252&view=diff == --- hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ha/ActiveStandbyElector.java (original) +++ hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ha/ActiveStandbyElector.java Thu Jan 10 10:09:06 2013 @@ -613,7 +613,7 @@ public class ActiveStandbyElector implem // Unfortunately, the ZooKeeper constructor connects to ZooKeeper and // may trigger the Connected event immediately. So, if we register the // watcher after constructing ZooKeeper, we may miss that event. Instead, -// we construct the watcher first, and have it queue any events it receives +// we construct the watcher first, and have it block any events it receives // before we can set its ZooKeeper reference. WatcherWithClientRef watcher = new WatcherWithClientRef(); ZooKeeper zk = new ZooKeeper(zkHostPort, zkSessionTimeout, watcher); @@ -1002,19 +1002,17 @@ public class ActiveStandbyElector implem private CountDownLatch hasReceivedEvent = new CountDownLatch(1); /** - * If any events arrive before the reference to ZooKeeper is set, - * they get queued up and later forwarded when the reference is - * available. + * Latch used to wait until the reference to ZooKeeper is set. */ -private final List<WatchedEvent> queuedEvents = Lists.newLinkedList(); +private CountDownLatch hasSetZooKeeper = new CountDownLatch(1); private WatcherWithClientRef() { } private WatcherWithClientRef(ZooKeeper zk) { - this.zk = zk; + setZooKeeperRef(zk); } - + /** * Waits for the next event from ZooKeeper to arrive. 
* @@ -1029,9 +1027,7 @@ public class ActiveStandbyElector implem if (!hasReceivedEvent.await(connectionTimeoutMs, TimeUnit.MILLISECONDS)) { LOG.error("Connection timed out: couldn't connect to ZooKeeper in " + connectionTimeoutMs + " milliseconds"); - synchronized (this) { -zk.close(); - } + zk.close(); throw KeeperException.create(Code.CONNECTIONLOSS); } } catch (InterruptedException e) { @@ -1041,29 +1037,18 @@ public class ActiveStandbyElector implem } } -private synchronized void setZooKeeperRef(ZooKeeper zk) { +private void setZooKeeperRef(ZooKeeper zk) { Preconditions.checkState(this.zk == null, "zk already set -- must be set exactly once"); this.zk = zk; - - for (WatchedEvent e : queuedEvents) { -forwardEvent(e); - } - queuedEvents.clear(); + hasSetZooKeeper.countDown(); } @Override -public synchronized void process(WatchedEvent event) { - if (zk != null) { -forwardEvent(event); - } else { -queuedEvents.add(event); - } -} - -private void forwardEvent(WatchedEvent event) { +public void process(WatchedEvent event) { hasReceivedEvent.countDown(); try { +hasSetZooKeeper.await(zkSessionTimeout, TimeUnit.MILLISECONDS); ActiveStandbyElector.this.processWatchEvent( zk, event); } catch (Throwable t) {
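The pattern behind this deadlock fix is small enough to isolate: instead of queueing early events under a lock, a latch gates event delivery until the client reference has been set, so no monitor is held while an event is forwarded. Below is a minimal stand-alone sketch of that idea; the class and member names are illustrative, not the ones in ActiveStandbyElector.

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicReference;

// Sketch of a watcher that blocks event delivery until its client reference
// is set, mirroring the hasSetZooKeeper latch introduced by the patch.
public class GatedWatcher {
    private final CountDownLatch clientSet = new CountDownLatch(1);
    private final AtomicReference<Object> client = new AtomicReference<>();

    // Must be called exactly once, after the client has been constructed.
    public void setClientRef(Object c) {
        if (!client.compareAndSet(null, c)) {
            throw new IllegalStateException("client already set -- must be set exactly once");
        }
        clientSet.countDown(); // release event threads waiting in process()
    }

    // Called from the event thread. Waits (bounded) for the client reference
    // instead of queueing the event; returns false if it never arrived.
    public boolean process(String event, long timeoutMs) {
        try {
            if (!clientSet.await(timeoutMs, TimeUnit.MILLISECONDS)) {
                return false; // client never set; drop the event
            }
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            return false;
        }
        // deliver 'event' to client.get() here
        return true;
    }
}
```

Because `await` is lock-free with respect to the object monitor, `process()` can no longer deadlock against a thread that holds the watcher's lock while closing the client.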
svn commit: r1430876 - in /hadoop/common/branches/branch-1: ./ src/mapred/org/apache/hadoop/mapred/ src/test/org/apache/hadoop/mapred/
Author: tomwhite Date: Wed Jan 9 15:01:10 2013 New Revision: 1430876 URL: http://svn.apache.org/viewvc?rev=1430876view=rev Log: MAPREDUCE-4850. Job recovery may fail if staging directory has been deleted. Modified: hadoop/common/branches/branch-1/CHANGES.txt hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/CleanupQueue.java hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/JobInProgress.java hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/Task.java hadoop/common/branches/branch-1/src/test/org/apache/hadoop/mapred/TestRecoveryManager.java Modified: hadoop/common/branches/branch-1/CHANGES.txt URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-1/CHANGES.txt?rev=1430876r1=1430875r2=1430876view=diff == --- hadoop/common/branches/branch-1/CHANGES.txt (original) +++ hadoop/common/branches/branch-1/CHANGES.txt Wed Jan 9 15:01:10 2013 @@ -413,6 +413,9 @@ Release 1.2.0 - unreleased HADOOP-9191. TestAccessControlList and TestJobHistoryConfig fail with JDK7. (Arpit Agarwal via suresh) +MAPREDUCE-4850. Job recovery may fail if staging directory has been +deleted. 
(tomwhite) + Release 1.1.2 - Unreleased INCOMPATIBLE CHANGES Modified: hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/CleanupQueue.java URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/CleanupQueue.java?rev=1430876r1=1430875r2=1430876view=diff == --- hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/CleanupQueue.java (original) +++ hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/CleanupQueue.java Wed Jan 9 15:01:10 2013 @@ -56,10 +56,17 @@ public class CleanupQueue { static class PathDeletionContext { final Path fullPath;// full path of file or dir final Configuration conf; +final UserGroupInformation ugi; public PathDeletionContext(Path fullPath, Configuration conf) { + this(fullPath, conf, null); +} + +public PathDeletionContext(Path fullPath, Configuration conf, +UserGroupInformation ugi) { this.fullPath = fullPath; this.conf = conf; + this.ugi = ugi; } protected Path getPathForCleanup() { @@ -72,7 +79,7 @@ public class CleanupQueue { */ protected void deletePath() throws IOException, InterruptedException { final Path p = getPathForCleanup(); - UserGroupInformation.getLoginUser().doAs( + (ugi == null ? 
UserGroupInformation.getLoginUser() : ugi).doAs( new PrivilegedExceptionAction<Object>() { public Object run() throws IOException { p.getFileSystem(conf).delete(p, true); Modified: hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/JobInProgress.java URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/JobInProgress.java?rev=1430876&r1=1430875&r2=1430876&view=diff == --- hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/JobInProgress.java (original) +++ hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/JobInProgress.java Wed Jan 9 15:01:10 2013 @@ -3250,7 +3250,17 @@ public class JobInProgress { Path tempDir = jobtracker.getSystemDirectoryForJob(getJobID()); CleanupQueue.getInstance().addToQueue( -new PathDeletionContext(tempDir, conf)); +new PathDeletionContext(tempDir, conf)); + +// delete the staging area for the job +String jobTempDir = conf.get("mapreduce.job.dir"); +if (jobTempDir != null && conf.getKeepTaskFilesPattern() == null && +!conf.getKeepFailedTaskFiles()) { + Path jobTempDirPath = new Path(jobTempDir); + CleanupQueue.getInstance().addToQueue( + new PathDeletionContext(jobTempDirPath, conf, userUGI)); +} + } catch (IOException e) { LOG.warn("Error cleaning up " + profile.getJobID() + ": " + e); } Modified: hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/Task.java URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/Task.java?rev=1430876&r1=1430875&r2=1430876&view=diff == --- hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/Task.java (original) +++ hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/Task.java Wed Jan 9 15:01:10 2013 @@ -1071,14 +1071,6 @@ abstract public class Task implements Wr + JobStatus.State.FAILED + " or " + JobStatus.State.KILLED
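The CleanupQueue change boils down to an optional-principal pattern: the deletion context carries the user it should delete as, and falls back to the login user when none was supplied, i.e. `(ugi == null ? getLoginUser() : ugi).doAs(...)`. A toy sketch of that fallback, with all names as hypothetical stand-ins (no Hadoop classes involved):

```java
// Sketch of the optional-owner fallback used by the new PathDeletionContext
// constructor. LOGIN_USER stands in for UserGroupInformation.getLoginUser().
public class DeletionContext {
    static final String LOGIN_USER = "loginUser";

    final String path;
    final String owner; // may be null, meaning "delete as the login user"

    DeletionContext(String path) {
        this(path, null); // old constructor delegates with a null owner
    }

    DeletionContext(String path, String owner) {
        this.path = path;
        this.owner = owner;
    }

    // The principal the deletion would run as (the doAs() argument).
    String effectiveOwner() {
        return owner == null ? LOGIN_USER : owner;
    }
}
```

Keeping the two-argument constructor delegating to the three-argument one preserves the old call sites unchanged while letting JobInProgress pass the job owner's UGI for the staging directory.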
svn commit: r1430371 - in /hadoop/common/branches/branch-1: CHANGES.txt src/mapred/org/apache/hadoop/mapred/LocalJobRunner.java
Author: tomwhite Date: Tue Jan 8 16:37:41 2013 New Revision: 1430371 URL: http://svn.apache.org/viewvc?rev=1430371&view=rev Log: MAPREDUCE-4278. Cannot run two local jobs in parallel from the same gateway. Contributed by Sandy Ryza. Modified: hadoop/common/branches/branch-1/CHANGES.txt hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/LocalJobRunner.java Modified: hadoop/common/branches/branch-1/CHANGES.txt URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-1/CHANGES.txt?rev=1430371&r1=1430370&r2=1430371&view=diff == --- hadoop/common/branches/branch-1/CHANGES.txt (original) +++ hadoop/common/branches/branch-1/CHANGES.txt Tue Jan 8 16:37:41 2013 @@ -407,6 +407,9 @@ Release 1.2.0 - unreleased HDFS-4351. In BlockPlacementPolicyDefault.chooseTarget(..), numOfReplicas needs to be updated when avoiding stale nodes. (Andrew Wang via szetszwo) +MAPREDUCE-4278. Cannot run two local jobs in parallel from the same +gateway. (Sandy Ryza via tomwhite) + Release 1.1.2 - Unreleased INCOMPATIBLE CHANGES Modified: hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/LocalJobRunner.java URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/LocalJobRunner.java?rev=1430371&r1=1430370&r2=1430371&view=diff == --- hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/LocalJobRunner.java (original) +++ hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/LocalJobRunner.java Tue Jan 8 16:37:41 2013 @@ -427,8 +427,12 @@ class LocalJobRunner implements JobSubmi // JobSubmissionProtocol methods private static int jobid = 0; + // used for making sure that local jobs run in different jvms don't + // collide on staging or job directories + private int randid; + public synchronized JobID getNewJobId() { -return new JobID("local", ++jobid); +return new JobID("local" + randid, ++jobid); } public JobStatus submitJob(JobID jobid, String jobSubmitDir, @@ -541,10 +545,11 @@ class 
LocalJobRunner implements JobSubmi "/tmp/hadoop/mapred/staging")); UserGroupInformation ugi = UserGroupInformation.getCurrentUser(); String user; +randid = rand.nextInt(Integer.MAX_VALUE); if (ugi != null) { - user = ugi.getShortUserName() + rand.nextInt(); + user = ugi.getShortUserName() + randid; } else { - user = "dummy" + rand.nextInt(); + user = "dummy" + randid; } return fs.makeQualified(new Path(stagingRootDir, user + "/.staging")).toString(); }
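The key to the MAPREDUCE-4278 fix is that one random value per runner instance is reused for both the job-ID prefix and the staging user name, so two local runners in different JVMs land in disjoint directories. A sketch of that scheme, with illustrative names rather than the real LocalJobRunner members:

```java
import java.util.Random;

// Sketch of the collision-avoidance scheme: a single random id per runner
// instance feeds both the job-ID prefix and the staging directory name.
public class LocalRunnerIds {
    private static final Random rand = new Random();

    private final int randid = rand.nextInt(Integer.MAX_VALUE);
    private int jobid = 0;

    // Equivalent of getNewJobId(): "local" + randid keeps IDs from two JVMs distinct.
    public synchronized String newJobId() {
        return "job_local" + randid + "_" + (++jobid);
    }

    // Equivalent of the staging-dir user: same randid, so the staging path
    // for this runner matches the job IDs it hands out.
    public String stagingDirUser(String shortUserName) {
        return (shortUserName != null ? shortUserName : "dummy") + randid;
    }
}
```

Using one stored `randid` rather than calling `rand.nextInt()` at each site is the actual point of the patch: the previous code drew a fresh random number for the staging directory, so the job directory and staging directory could disagree.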
svn commit: r1424546 - in /hadoop/common/branches/branch-1: CHANGES.txt src/mapred/org/apache/hadoop/mapred/JobTracker.java
Author: tomwhite Date: Thu Dec 20 15:54:30 2012 New Revision: 1424546 URL: http://svn.apache.org/viewvc?rev=1424546view=rev Log: MAPREDUCE-4806. Some private methods in JobTracker.RecoveryManager are not used anymore after MAPREDUCE-3837. Contributed by Karthik Kambatla. Modified: hadoop/common/branches/branch-1/CHANGES.txt hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/JobTracker.java Modified: hadoop/common/branches/branch-1/CHANGES.txt URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-1/CHANGES.txt?rev=1424546r1=1424545r2=1424546view=diff == --- hadoop/common/branches/branch-1/CHANGES.txt (original) +++ hadoop/common/branches/branch-1/CHANGES.txt Thu Dec 20 15:54:30 2012 @@ -354,6 +354,9 @@ Release 1.2.0 - unreleased MAPREDUCE-4860. DelegationTokenRenewal attempts to renew token even after a job is removed. (kkambatl via tucu) +MAPREDUCE-4806. Some private methods in JobTracker.RecoveryManager are not +used anymore after MAPREDUCE-3837. (Karthik Kambatla via tomwhite) + Release 1.1.2 - Unreleased INCOMPATIBLE CHANGES Modified: hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/JobTracker.java URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/JobTracker.java?rev=1424546r1=1424545r2=1424546view=diff == --- hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/JobTracker.java (original) +++ hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/JobTracker.java Thu Dec 20 15:54:30 2012 @@ -1309,233 +1309,6 @@ public class JobTracker implements MRCon } return ret; } - -private JobStatusChangeEvent updateJob(JobInProgress jip, -JobHistory.JobInfo job) { - // Change the job priority - String jobpriority = job.get(Keys.JOB_PRIORITY); - JobPriority priority = JobPriority.valueOf(jobpriority); - // It's important to update this via the jobtracker's api as it will - // take care of updating the event listeners too - - try { 
-setJobPriority(jip.getJobID(), priority); - } catch (IOException e) { -// This will not happen. JobTracker can set jobPriority of any job -// as mrOwner has the needed permissions. -LOG.warn(Unexpected. JobTracker could not do SetJobPriority on - + jip.getJobID() + . + e); - } - - // Save the previous job status - JobStatus oldStatus = (JobStatus)jip.getStatus().clone(); - - // Set the start/launch time only if there are recovered tasks - // Increment the job's restart count - jip.updateJobInfo(job.getLong(JobHistory.Keys.SUBMIT_TIME), -job.getLong(JobHistory.Keys.LAUNCH_TIME)); - - // Save the new job status - JobStatus newStatus = (JobStatus)jip.getStatus().clone(); - - return new JobStatusChangeEvent(jip, EventType.START_TIME_CHANGED, oldStatus, - newStatus); -} - -private void updateTip(TaskInProgress tip, JobHistory.Task task) { - long startTime = task.getLong(Keys.START_TIME); - if (startTime != 0) { -tip.setExecStartTime(startTime); - } - - long finishTime = task.getLong(Keys.FINISH_TIME); - // For failed tasks finish-time will be missing - if (finishTime != 0) { -tip.setExecFinishTime(finishTime); - } - - String cause = task.get(Keys.TASK_ATTEMPT_ID); - if (cause.length() 0) { -// This means that the this is a FAILED events -TaskAttemptID id = TaskAttemptID.forName(cause); -TaskStatus status = tip.getTaskStatus(id); -synchronized (JobTracker.this) { - // This will add the tip failed event in the new log - tip.getJob().failedTask(tip, id, status.getDiagnosticInfo(), - status.getPhase(), status.getRunState(), - status.getTaskTracker()); -} - } -} - -private void createTaskAttempt(JobInProgress job, - TaskAttemptID attemptId, - JobHistory.TaskAttempt attempt) - throws UnknownHostException { - TaskID id = attemptId.getTaskID(); - String type = attempt.get(Keys.TASK_TYPE); - TaskInProgress tip = job.getTaskInProgress(id); - - //I. 
Get the required info - TaskStatus taskStatus = null; - String trackerName = attempt.get(Keys.TRACKER_NAME); - String trackerHostName = -JobInProgress.convertTrackerNameToHostName(trackerName); - // recover the port information. - int port = 0; // default to 0 - String hport = attempt.get(Keys.HTTP_PORT); - if (hport
svn commit: r1423825 - in /hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common: ./ src/main/java/org/apache/hadoop/fs/ src/main/java/org/apache/hadoop/fs/viewfs/ src/test/java/org/apac
Author: tomwhite Date: Wed Dec 19 11:15:29 2012 New Revision: 1423825 URL: http://svn.apache.org/viewvc?rev=1423825view=rev Log: Merge -r 1423823:1423824 from trunk to branch-2. Fixes: HADOOP-9153. Support createNonRecursive in ViewFileSystem. Contributed by Sandy Ryza. Modified: hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/FilterFileSystem.java hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/RawLocalFileSystem.java hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/viewfs/ChRootedFileSystem.java hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/viewfs/ViewFileSystem.java hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/fs/viewfs/ViewFileSystemBaseTest.java Modified: hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1423825r1=1423824r2=1423825view=diff == --- hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt (original) +++ hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt Wed Dec 19 11:15:29 2012 @@ -205,6 +205,9 @@ Release 2.0.3-alpha - Unreleased HADOOP-9152. HDFS can report negative DFS Used on clusters with very small amounts of data. (Brock Noland via atm) +HADOOP-9153. Support createNonRecursive in ViewFileSystem. 
+(Sandy Ryza via tomwhite) + Release 2.0.2-alpha - 2012-09-07 INCOMPATIBLE CHANGES Modified: hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/FilterFileSystem.java URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/FilterFileSystem.java?rev=1423825r1=1423824r2=1423825view=diff == --- hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/FilterFileSystem.java (original) +++ hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/FilterFileSystem.java Wed Dec 19 11:15:29 2012 @@ -166,6 +166,18 @@ public class FilterFileSystem extends Fi return fs.create(f, permission, overwrite, bufferSize, replication, blockSize, progress); } + + + + @Override + @Deprecated + public FSDataOutputStream createNonRecursive(Path f, FsPermission permission, + EnumSetCreateFlag flags, int bufferSize, short replication, long blockSize, + Progressable progress) throws IOException { + +return fs.createNonRecursive(f, permission, flags, bufferSize, replication, blockSize, +progress); + } /** * Set replication for an existing file. 
Modified: hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/RawLocalFileSystem.java URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/RawLocalFileSystem.java?rev=1423825r1=1423824r2=1423825view=diff == --- hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/RawLocalFileSystem.java (original) +++ hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/RawLocalFileSystem.java Wed Dec 19 11:15:29 2012 @@ -30,6 +30,7 @@ import java.io.FileDescriptor; import java.net.URI; import java.nio.ByteBuffer; import java.util.Arrays; +import java.util.EnumSet; import java.util.StringTokenizer; import org.apache.hadoop.classification.InterfaceAudience; @@ -281,6 +282,18 @@ public class RawLocalFileSystem extends return new FSDataOutputStream(new BufferedOutputStream( new LocalFSFileOutputStream(f, false), bufferSize), statistics); } + + @Override + @Deprecated + public FSDataOutputStream createNonRecursive(Path f, FsPermission permission, + EnumSetCreateFlag flags, int bufferSize, short replication, long blockSize, + Progressable progress) throws IOException { +if (exists(f) !flags.contains(CreateFlag.OVERWRITE)) { + throw new IOException(File already exists: +f); +} +return new FSDataOutputStream(new BufferedOutputStream( +new LocalFSFileOutputStream(f, false), bufferSize), statistics); + } @Override public FSDataOutputStream create(Path f, FsPermission
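The HADOOP-9153 change is an instance of the filter (decorator) pattern: a wrapper class must forward every operation, including the deprecated `createNonRecursive`, to the file system it wraps, or callers silently fall through to base-class behavior. A minimal sketch, where `Sink` is a hypothetical stand-in for the FileSystem API:

```java
// Sketch of the FilterFileSystem pattern: forward every call to the wrapped
// instance. Sink, EchoSink, and FilterSink are illustrative names only.
interface Sink {
    String create(String path);
    @Deprecated
    String createNonRecursive(String path);
}

class EchoSink implements Sink {
    public String create(String path) { return "create:" + path; }
    public String createNonRecursive(String path) { return "cnr:" + path; }
}

public class FilterSink implements Sink {
    protected final Sink fs;

    public FilterSink(Sink fs) { this.fs = fs; }

    @Override
    public String create(String path) { return fs.create(path); }

    // This override is the point of the patch: without it, the wrapped
    // instance's implementation would never be reached through the filter.
    @Override
    @Deprecated
    public String createNonRecursive(String path) { return fs.createNonRecursive(path); }
}
```

ViewFileSystem's mount points are such filters, which is why the missing delegation surfaced there first.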
svn commit: r1422961 - in /hadoop/common/branches/branch-1/src: mapred/org/apache/hadoop/mapred/TaskTracker.java test/org/apache/hadoop/mapred/TestRecoveryManager.java
Author: tomwhite Date: Mon Dec 17 15:00:05 2012 New Revision: 1422961 URL: http://svn.apache.org/viewvc?rev=1422961view=rev Log: MAPREDUCE-4859. TestRecoveryManager fails on branch-1. Modified: hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/TaskTracker.java hadoop/common/branches/branch-1/src/test/org/apache/hadoop/mapred/TestRecoveryManager.java Modified: hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/TaskTracker.java URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/TaskTracker.java?rev=1422961r1=1422960r2=1422961view=diff == --- hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/TaskTracker.java (original) +++ hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/TaskTracker.java Mon Dec 17 15:00:05 2012 @@ -610,7 +610,15 @@ public class TaskTracker implements MRCo LOG.warn(Unknown job + jobId + being deleted.); } else { synchronized (rjob) { - rjob.tasks.remove(tip); + // Only remove the TIP if it is identical to the one that is finished + // Job recovery means that it is possible to have two task attempts + // with the same ID, which is used for TIP equals/hashcode. 
+ for (TaskInProgress t : rjob.tasks) { +if (tip == t) { + rjob.tasks.remove(tip); + break; +} + } } } } Modified: hadoop/common/branches/branch-1/src/test/org/apache/hadoop/mapred/TestRecoveryManager.java URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-1/src/test/org/apache/hadoop/mapred/TestRecoveryManager.java?rev=1422961r1=1422960r2=1422961view=diff == --- hadoop/common/branches/branch-1/src/test/org/apache/hadoop/mapred/TestRecoveryManager.java (original) +++ hadoop/common/branches/branch-1/src/test/org/apache/hadoop/mapred/TestRecoveryManager.java Mon Dec 17 15:00:05 2012 @@ -84,8 +84,7 @@ public class TestRecoveryManager { * - restarts the jobtracker * - checks if the jobtraker starts normally */ - @Test - @Ignore + @Test(timeout=12) public void testJobTrackerRestartsWithMissingJobFile() throws Exception { LOG.info(Testing jobtracker restart with faulty job); String signalFile = new Path(TEST_DIR, signal).toString(); @@ -111,7 +110,7 @@ public class TestRecoveryManager { new Path(TEST_DIR, input), new Path(TEST_DIR, output2), 30, 0, test-recovery-manager, signalFile, signalFile); -// submit the faulty job +// submit another job RunningJob rJob2 = (new JobClient(job2)).submitJob(job2); LOG.info(Submitted job + rJob2.getID()); @@ -129,7 +128,7 @@ public class TestRecoveryManager { Path jobFile = new Path(sysDir, rJob1.getID().toString() + / + JobTracker.JOB_INFO_FILE); LOG.info(Deleting job token file : + jobFile.toString()); -fs.delete(jobFile, false); // delete the job.xml file +Assert.assertTrue(fs.delete(jobFile, false)); // delete the job.xml file // create the job.xml file with 1 bytes FSDataOutputStream out = fs.create(jobFile); @@ -142,12 +141,22 @@ public class TestRecoveryManager { // start the jobtracker LOG.info(Starting jobtracker); mr.startJobTracker(); -ClusterStatus status = - mr.getJobTrackerRunner().getJobTracker().getClusterStatus(false); +JobTracker jobtracker = mr.getJobTrackerRunner().getJobTracker(); +ClusterStatus status 
= jobtracker.getClusterStatus(false); // check if the jobtracker came up or not Assert.assertEquals("JobTracker crashed!", JobTracker.State.RUNNING, status.getJobTrackerState()); + +// wait for job 2 to complete +JobInProgress jip = jobtracker.getJob(rJob2.getID()); +while (!jip.isComplete()) { + LOG.info("Waiting for job " + rJob2.getID() + " to be successful"); + // Signaling Map task to complete + fs.create(new Path(TEST_DIR, "signal")); + UtilsForTests.waitFor(100); +} +Assert.assertTrue("Job should be successful", rJob2.isSuccessful()); } /** @@ -156,8 +165,7 @@ public class TestRecoveryManager { * - kills the jobtracker * - checks if the jobtraker starts normally and job is recovered while */ - @Test - @Ignore + @Test(timeout=12) public void testJobResubmission() throws Exception { LOG.info("Testing Job Resubmission"); String signalFile = new Path(TEST_DIR, "signal").toString(); @@ -196,6 +204,8 @@ public class TestRecoveryManager { // assert that job is recovered by the jobtracker Assert.assertEquals("Resubmission failed ", 1, jobtracker.getAllJobs().length); + +// wait for job 1 to complete JobInProgress jip = jobtracker.getJob(rJob1
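The TaskTracker hunk above deserves a closer look: after job recovery, two task attempts can be `equals()` by ID while being different objects, so `Set.remove()` could evict the wrong one. The fix scans for the identical object (`==`) before removing. A self-contained sketch of that idiom, with `Tip` as a toy stand-in for TaskInProgress:

```java
import java.util.Set;

// Toy stand-in for TaskInProgress: equality is by ID only, so two distinct
// attempt objects can compare equal after job recovery.
class Tip {
    final int id;
    Tip(int id) { this.id = id; }
    @Override public boolean equals(Object o) { return o instanceof Tip && ((Tip) o).id == id; }
    @Override public int hashCode() { return id; }
}

public class IdentityRemove {
    // Removes 'item' only if that very object is present, not merely an equal one.
    public static <T> boolean removeIdentical(Set<T> set, T item) {
        for (T t : set) {
            if (t == item) {          // reference equality, not equals()
                set.remove(item);     // safe: the identical element was just found
                return true;
            }
        }
        return false;
    }
}
```

With equals()-based removal, finishing a recovered duplicate attempt would have evicted the live one from `rjob.tasks`; the identity check leaves it in place.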
svn commit: r1422968 - in /hadoop/common/branches/branch-1.1/src: mapred/org/apache/hadoop/mapred/TaskTracker.java test/org/apache/hadoop/mapred/TestRecoveryManager.java
Author: tomwhite Date: Mon Dec 17 15:03:55 2012 New Revision: 1422968 URL: http://svn.apache.org/viewvc?rev=1422968view=rev Log: Merge -r 1422960:1422961 from branch-1 to branch-1.1. Fixes: MAPREDUCE-4859. TestRecoveryManager fails on branch-1. Modified: hadoop/common/branches/branch-1.1/src/mapred/org/apache/hadoop/mapred/TaskTracker.java hadoop/common/branches/branch-1.1/src/test/org/apache/hadoop/mapred/TestRecoveryManager.java Modified: hadoop/common/branches/branch-1.1/src/mapred/org/apache/hadoop/mapred/TaskTracker.java URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-1.1/src/mapred/org/apache/hadoop/mapred/TaskTracker.java?rev=1422968r1=1422967r2=1422968view=diff == --- hadoop/common/branches/branch-1.1/src/mapred/org/apache/hadoop/mapred/TaskTracker.java (original) +++ hadoop/common/branches/branch-1.1/src/mapred/org/apache/hadoop/mapred/TaskTracker.java Mon Dec 17 15:03:55 2012 @@ -605,7 +605,15 @@ public class TaskTracker implements MRCo LOG.warn(Unknown job + jobId + being deleted.); } else { synchronized (rjob) { - rjob.tasks.remove(tip); + // Only remove the TIP if it is identical to the one that is finished + // Job recovery means that it is possible to have two task attempts + // with the same ID, which is used for TIP equals/hashcode. 
+ for (TaskInProgress t : rjob.tasks) { +if (tip == t) { + rjob.tasks.remove(tip); + break; +} + } } } } Modified: hadoop/common/branches/branch-1.1/src/test/org/apache/hadoop/mapred/TestRecoveryManager.java URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-1.1/src/test/org/apache/hadoop/mapred/TestRecoveryManager.java?rev=1422968r1=1422967r2=1422968view=diff == --- hadoop/common/branches/branch-1.1/src/test/org/apache/hadoop/mapred/TestRecoveryManager.java (original) +++ hadoop/common/branches/branch-1.1/src/test/org/apache/hadoop/mapred/TestRecoveryManager.java Mon Dec 17 15:03:55 2012 @@ -84,8 +84,7 @@ public class TestRecoveryManager { * - restarts the jobtracker * - checks if the jobtraker starts normally */ - @Test - @Ignore + @Test(timeout=12) public void testJobTrackerRestartsWithMissingJobFile() throws Exception { LOG.info(Testing jobtracker restart with faulty job); String signalFile = new Path(TEST_DIR, signal).toString(); @@ -111,7 +110,7 @@ public class TestRecoveryManager { new Path(TEST_DIR, input), new Path(TEST_DIR, output2), 30, 0, test-recovery-manager, signalFile, signalFile); -// submit the faulty job +// submit another job RunningJob rJob2 = (new JobClient(job2)).submitJob(job2); LOG.info(Submitted job + rJob2.getID()); @@ -129,7 +128,7 @@ public class TestRecoveryManager { Path jobFile = new Path(sysDir, rJob1.getID().toString() + / + JobTracker.JOB_INFO_FILE); LOG.info(Deleting job token file : + jobFile.toString()); -fs.delete(jobFile, false); // delete the job.xml file +Assert.assertTrue(fs.delete(jobFile, false)); // delete the job.xml file // create the job.xml file with 1 bytes FSDataOutputStream out = fs.create(jobFile); @@ -142,12 +141,22 @@ public class TestRecoveryManager { // start the jobtracker LOG.info(Starting jobtracker); mr.startJobTracker(); -ClusterStatus status = - mr.getJobTrackerRunner().getJobTracker().getClusterStatus(false); +JobTracker jobtracker = mr.getJobTrackerRunner().getJobTracker(); 
+ClusterStatus status = jobtracker.getClusterStatus(false); // check if the jobtracker came up or not Assert.assertEquals(JobTracker crashed!, JobTracker.State.RUNNING, status.getJobTrackerState()); + +// wait for job 2 to complete +JobInProgress jip = jobtracker.getJob(rJob2.getID()); +while (!jip.isComplete()) { + LOG.info(Waiting for job + rJob2.getID() + to be successful); + // Signaling Map task to complete + fs.create(new Path(TEST_DIR, signal)); + UtilsForTests.waitFor(100); +} +Assert.assertTrue(Job should be successful, rJob2.isSuccessful()); } /** @@ -156,8 +165,7 @@ public class TestRecoveryManager { * - kills the jobtracker * - checks if the jobtraker starts normally and job is recovered while */ - @Test - @Ignore + @Test(timeout=12) public void testJobResubmission() throws Exception { LOG.info(Testing Job Resubmission); String signalFile = new Path(TEST_DIR, signal).toString(); @@ -196,6 +204,8 @@ public class TestRecoveryManager { // assert that job is recovered by the jobtracker Assert.assertEquals(Resubmission failed , 1, jobtracker.getAllJobs().length
svn commit: r1414731 - in /hadoop/common/branches/branch-1: CHANGES.txt src/contrib/fairscheduler/src/java/org/apache/hadoop/mapred/FairSchedulerEventLog.java src/contrib/fairscheduler/src/test/org/ap
Author: tomwhite Date: Wed Nov 28 14:43:28 2012 New Revision: 1414731 URL: http://svn.apache.org/viewvc?rev=1414731view=rev Log: MAPREDUCE-4778. Fair scheduler event log is only written if directory exists on HDFS. Contributed by Sandy Ryza. Added: hadoop/common/branches/branch-1/src/contrib/fairscheduler/src/test/org/apache/hadoop/mapred/TestFairSchedulerEventLog.java (with props) Modified: hadoop/common/branches/branch-1/CHANGES.txt hadoop/common/branches/branch-1/src/contrib/fairscheduler/src/java/org/apache/hadoop/mapred/FairSchedulerEventLog.java Modified: hadoop/common/branches/branch-1/CHANGES.txt URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-1/CHANGES.txt?rev=1414731r1=1414730r2=1414731view=diff == --- hadoop/common/branches/branch-1/CHANGES.txt (original) +++ hadoop/common/branches/branch-1/CHANGES.txt Wed Nov 28 14:43:28 2012 @@ -322,6 +322,9 @@ Release 1.2.0 - unreleased HADOOP-9099. TestNetUtils fails if UnknownHost is resolved as a valid hostname. (Ivan Mitic via szetszwo) +MAPREDUCE-4778. Fair scheduler event log is only written if directory +exists on HDFS. 
(Sandy Ryza via tomwhite) + Release 1.1.1 - Unreleased INCOMPATIBLE CHANGES Modified: hadoop/common/branches/branch-1/src/contrib/fairscheduler/src/java/org/apache/hadoop/mapred/FairSchedulerEventLog.java URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-1/src/contrib/fairscheduler/src/java/org/apache/hadoop/mapred/FairSchedulerEventLog.java?rev=1414731&r1=1414730&r2=1414731&view=diff == --- hadoop/common/branches/branch-1/src/contrib/fairscheduler/src/java/org/apache/hadoop/mapred/FairSchedulerEventLog.java (original) +++ hadoop/common/branches/branch-1/src/contrib/fairscheduler/src/java/org/apache/hadoop/mapred/FairSchedulerEventLog.java Wed Nov 28 14:43:28 2012 @@ -76,12 +76,11 @@ class FairSchedulerEventLog { logDir = conf.get("mapred.fairscheduler.eventlog.location", new File(System.getProperty("hadoop.log.dir")).getAbsolutePath() + File.separator + "fairscheduler"); - Path logDirPath = new Path(logDir); - FileSystem fs = logDirPath.getFileSystem(conf); - if (!fs.exists(logDirPath)) { -if (!fs.mkdirs(logDirPath)) { + File logDirFile = new File(logDir); + if (!logDirFile.exists()) { +if (!logDirFile.mkdirs()) { throw new IOException( - "Mkdirs failed to create " + logDirPath.toString()); + "Mkdirs failed to create " + logDirFile.toString()); } } String username = System.getProperty("user.name"); @@ -125,6 +124,10 @@ class FairSchedulerEventLog { } } + String getLogFile() {+return logFile; + } + /** * Flush and close the log. 
*/ Added: hadoop/common/branches/branch-1/src/contrib/fairscheduler/src/test/org/apache/hadoop/mapred/TestFairSchedulerEventLog.java URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-1/src/contrib/fairscheduler/src/test/org/apache/hadoop/mapred/TestFairSchedulerEventLog.java?rev=1414731view=auto == --- hadoop/common/branches/branch-1/src/contrib/fairscheduler/src/test/org/apache/hadoop/mapred/TestFairSchedulerEventLog.java (added) +++ hadoop/common/branches/branch-1/src/contrib/fairscheduler/src/test/org/apache/hadoop/mapred/TestFairSchedulerEventLog.java Wed Nov 28 14:43:28 2012 @@ -0,0 +1,61 @@ +/** + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * License); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an AS IS BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + */ + +package org.apache.hadoop.mapred; + +import java.io.File; +import java.io.IOException; + +import junit.framework.Assert; +import junit.framework.TestCase; + +import org.apache.hadoop.conf.Configuration; +import org.apache.hadoop.mapred.TestFairScheduler.FakeTaskTrackerManager; + +public class TestFairSchedulerEventLog extends TestCase { + + private File logFile; + + /** + * Make sure the scheduler creates the event log. + */ + public void testCreateEventLog() throws IOException { +Configuration conf = new Configuration
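The FairSchedulerEventLog fix swaps `Path.getFileSystem(conf)` for `java.io.File`, so the directory check and `mkdirs` always run against the local disk instead of whatever file system the configured path resolves to (typically HDFS). The resulting idiom, sketched in isolation:

```java
import java.io.File;
import java.io.IOException;

// Sketch of the fix: resolve and create the event-log directory with
// java.io.File (always local disk) rather than through the Hadoop
// FileSystem API, which could resolve the path against HDFS.
public class LocalLogDir {
    public static File ensureLogDir(String dir) throws IOException {
        File logDirFile = new File(dir);
        // mkdirs() returns false both on failure and when the dir already
        // exists, hence the exists() guard before treating false as an error.
        if (!logDirFile.exists() && !logDirFile.mkdirs()) {
            throw new IOException("Mkdirs failed to create " + logDirFile);
        }
        return logDirFile;
    }
}
```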
svn commit: r1412077 - in /hadoop/common/trunk/hadoop-common-project/hadoop-common: CHANGES.txt src/main/java/org/apache/hadoop/fs/DelegationTokenRenewer.java src/test/java/org/apache/hadoop/fs/TestDe
Author: tomwhite Date: Wed Nov 21 12:29:37 2012 New Revision: 1412077 URL: http://svn.apache.org/viewvc?rev=1412077&view=rev Log: HADOOP-9049. DelegationTokenRenewer needs to be Singleton and FileSystems should register/deregister to/from. Contributed by Karthik Kambatla. Added: hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/fs/TestDelegationTokenRenewer.java (with props) Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/DelegationTokenRenewer.java Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1412077&r1=1412076&r2=1412077&view=diff == --- hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt (original) +++ hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt Wed Nov 21 12:29:37 2012 @@ -444,6 +444,9 @@ Release 2.0.3-alpha - Unreleased HADOOP-6607. Add different variants of non caching HTTP headers. (tucu) +HADOOP-9049. DelegationTokenRenewer needs to be Singleton and FileSystems +should register/deregister to/from. 
(Karthik Kambatla via tomwhite) + Release 2.0.2-alpha - 2012-09-07 INCOMPATIBLE CHANGES Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/DelegationTokenRenewer.java URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/DelegationTokenRenewer.java?rev=1412077&r1=1412076&r2=1412077&view=diff == --- hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/DelegationTokenRenewer.java (original) +++ hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/DelegationTokenRenewer.java Wed Nov 21 12:29:37 2012 @@ -33,7 +33,7 @@ import org.apache.hadoop.util.Time; * A daemon thread that waits for the next file system to renew. */ @InterfaceAudience.Private -public class DelegationTokenRenewer<T extends FileSystem & DelegationTokenRenewer.Renewable> +public class DelegationTokenRenewer extends Thread { /** The renewable interface used by the renewer. */ public interface Renewable { @@ -93,7 +93,7 @@ public class DelegationTokenRenewer<T ex * @param newTime the new time */ private void updateRenewalTime() { - renewalTime = RENEW_CYCLE + Time.now(); + renewalTime = renewCycle + Time.now(); } /** @@ -134,34 +134,69 @@ public class DelegationTokenRenewer<T ex } /** Wait for 95% of a day between renewals */ - private static final int RENEW_CYCLE = 24 * 60 * 60 * 950; + private static final int RENEW_CYCLE = 24 * 60 * 60 * 950; - private DelayQueue<RenewAction<T>> queue = new DelayQueue<RenewAction<T>>(); + @InterfaceAudience.Private + protected static int renewCycle = RENEW_CYCLE; - public DelegationTokenRenewer(final Class<T> clazz) { + /** Queue to maintain the RenewActions to be processed by the {@link #run()} */ + private volatile DelayQueue<RenewAction<?>> queue = new DelayQueue<RenewAction<?>>(); + + /** + * Create the singleton instance. 
However, the thread can be started lazily in + * {@link #addRenewAction(FileSystem)} + */ + private static DelegationTokenRenewer INSTANCE = null; + + private DelegationTokenRenewer(final Class<? extends FileSystem> clazz) { super(clazz.getSimpleName() + "-" + DelegationTokenRenewer.class.getSimpleName()); setDaemon(true); } + public static synchronized DelegationTokenRenewer getInstance() { +if (INSTANCE == null) { + INSTANCE = new DelegationTokenRenewer(FileSystem.class); +} +return INSTANCE; + } + /** Add a renew action to the queue. */ - public void addRenewAction(final T fs) { + public synchronized <T extends FileSystem & Renewable> void addRenewAction(final T fs) { queue.add(new RenewAction<T>(fs)); +if (!isAlive()) { + start(); +} } + /** Remove the associated renew action from the queue */ + public synchronized <T extends FileSystem & Renewable> void removeRenewAction( + final T fs) { +for (RenewAction<?> action : queue) { + if (action.weakFs.get() == fs) { +queue.remove(action); +return; + } +} + } + + @SuppressWarnings("static-access") @Override public void run() { for(;;) { - RenewAction<T> action = null; + RenewAction<?> action = null; try { -action = queue.take(); -if (action.renew()) { - action.updateRenewalTime(); - queue.add(action); +synchronized (this) { + action = queue.take(); + if (action.renew
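The core of the HADOOP-9049 change is a lazily started singleton daemon thread draining a DelayQueue. That shape can be sketched as a standalone program; the names below (RenewerSketch, Action, add) are illustrative stand-ins, not the Hadoop API:

```java
import java.util.concurrent.DelayQueue;
import java.util.concurrent.Delayed;
import java.util.concurrent.TimeUnit;

// Sketch of a lazily-started singleton daemon thread draining a DelayQueue,
// the pattern this patch applies to DelegationTokenRenewer. Names are illustrative.
public class RenewerSketch extends Thread {

    /** A queue entry that becomes "due" after a fixed delay. */
    static class Action implements Delayed {
        final long dueMillis = System.currentTimeMillis() + 60_000;

        @Override
        public long getDelay(TimeUnit unit) {
            return unit.convert(dueMillis - System.currentTimeMillis(),
                                TimeUnit.MILLISECONDS);
        }

        @Override
        public int compareTo(Delayed other) {
            return Long.compare(getDelay(TimeUnit.MILLISECONDS),
                                other.getDelay(TimeUnit.MILLISECONDS));
        }
    }

    private static RenewerSketch INSTANCE = null;

    /** Lazily create the shared instance; the class lock guards construction. */
    public static synchronized RenewerSketch getInstance() {
        if (INSTANCE == null) {
            INSTANCE = new RenewerSketch();
        }
        return INSTANCE;
    }

    private final DelayQueue<Action> queue = new DelayQueue<>();

    private RenewerSketch() {
        setDaemon(true); // a daemon thread never blocks JVM shutdown
    }

    /** Queue an action, starting the thread on first use (as addRenewAction does). */
    public synchronized void add(Action a) {
        queue.add(a);
        if (!isAlive()) {
            start();
        }
    }

    @Override
    public void run() {
        for (;;) {
            try {
                queue.take(); // blocks until the earliest action is due
                // a real renewer would renew here and re-queue the action
            } catch (InterruptedException ie) {
                return;
            }
        }
    }

    public static void main(String[] args) {
        RenewerSketch r = RenewerSketch.getInstance();
        System.out.println(r == RenewerSketch.getInstance()); // same instance both times
        r.add(new Action());
        System.out.println(r.isAlive() && r.isDaemon());
    }
}
```

Starting the thread only when the first action arrives, as `addRenewAction` does in the patch, avoids spinning up a renewer thread in processes that never use delegation tokens; making the thread a daemon means the queue never keeps the JVM alive at shutdown.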
svn commit: r1412079 - in /hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common: CHANGES.txt src/main/java/org/apache/hadoop/fs/DelegationTokenRenewer.java src/test/java/org/apache/hado
Author: tomwhite Date: Wed Nov 21 12:37:42 2012 New Revision: 1412079 URL: http://svn.apache.org/viewvc?rev=1412079&view=rev Log: Merge -r 1412076:1412077 from trunk to branch-2. Fixes: HADOOP-9049. DelegationTokenRenewer needs to be Singleton and FileSystems should register/deregister to/from. Contributed by Karthik Kambatla. Added: hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/fs/TestDelegationTokenRenewer.java - copied unchanged from r1412077, hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/fs/TestDelegationTokenRenewer.java Modified: hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/DelegationTokenRenewer.java Modified: hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1412079&r1=1412078&r2=1412079&view=diff == --- hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt (original) +++ hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt Wed Nov 21 12:37:42 2012 @@ -152,6 +152,9 @@ Release 2.0.3-alpha - Unreleased HADOOP-6607. Add different variants of non caching HTTP headers. (tucu) +HADOOP-9049. DelegationTokenRenewer needs to be Singleton and FileSystems +should register/deregister to/from. 
(Karthik Kambatla via tomwhite) + Release 2.0.2-alpha - 2012-09-07 INCOMPATIBLE CHANGES Modified: hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/DelegationTokenRenewer.java URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/DelegationTokenRenewer.java?rev=1412079&r1=1412078&r2=1412079&view=diff == --- hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/DelegationTokenRenewer.java (original) +++ hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/DelegationTokenRenewer.java Wed Nov 21 12:37:42 2012 @@ -33,7 +33,7 @@ import org.apache.hadoop.util.Time; * A daemon thread that waits for the next file system to renew. */ @InterfaceAudience.Private -public class DelegationTokenRenewer<T extends FileSystem & DelegationTokenRenewer.Renewable> +public class DelegationTokenRenewer extends Thread { /** The renewable interface used by the renewer. */ public interface Renewable { @@ -93,7 +93,7 @@ public class DelegationTokenRenewer<T ex * @param newTime the new time */ private void updateRenewalTime() { - renewalTime = RENEW_CYCLE + Time.now(); + renewalTime = renewCycle + Time.now(); } /** @@ -134,34 +134,69 @@ public class DelegationTokenRenewer<T ex } /** Wait for 95% of a day between renewals */ - private static final int RENEW_CYCLE = 24 * 60 * 60 * 950; + private static final int RENEW_CYCLE = 24 * 60 * 60 * 950; - private DelayQueue<RenewAction<T>> queue = new DelayQueue<RenewAction<T>>(); + @InterfaceAudience.Private + protected static int renewCycle = RENEW_CYCLE; - public DelegationTokenRenewer(final Class<T> clazz) { + /** Queue to maintain the RenewActions to be processed by the {@link #run()} */ + private volatile DelayQueue<RenewAction<?>> queue = new DelayQueue<RenewAction<?>>(); + + /** + * Create the singleton instance. 
However, the thread can be started lazily in + * {@link #addRenewAction(FileSystem)} + */ + private static DelegationTokenRenewer INSTANCE = null; + + private DelegationTokenRenewer(final Class<? extends FileSystem> clazz) { super(clazz.getSimpleName() + "-" + DelegationTokenRenewer.class.getSimpleName()); setDaemon(true); } + public static synchronized DelegationTokenRenewer getInstance() { +if (INSTANCE == null) { + INSTANCE = new DelegationTokenRenewer(FileSystem.class); +} +return INSTANCE; + } + /** Add a renew action to the queue. */ - public void addRenewAction(final T fs) { + public synchronized <T extends FileSystem & Renewable> void addRenewAction(final T fs) { queue.add(new RenewAction<T>(fs)); +if (!isAlive()) { + start(); +} } + /** Remove the associated renew action from the queue */ + public synchronized <T extends FileSystem & Renewable> void removeRenewAction( + final T fs) { +for (RenewAction<?> action : queue) { + if (action.weakFs.get() == fs) { +queue.remove(action); +return; + } +} + } + + @SuppressWarnings("static-access") @Override public
svn commit: r1411762 - in /hadoop/common/branches/branch-0.23/hadoop-project/src/site: apt/index.apt.vm site.xml
Author: tomwhite Date: Tue Nov 20 17:46:14 2012 New Revision: 1411762 URL: http://svn.apache.org/viewvc?rev=1411762view=rev Log: HADOOP-8860. Split MapReduce and YARN sections in documentation navigation. Modified: hadoop/common/branches/branch-0.23/hadoop-project/src/site/apt/index.apt.vm hadoop/common/branches/branch-0.23/hadoop-project/src/site/site.xml Modified: hadoop/common/branches/branch-0.23/hadoop-project/src/site/apt/index.apt.vm URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-0.23/hadoop-project/src/site/apt/index.apt.vm?rev=1411762r1=1411761r2=1411762view=diff == --- hadoop/common/branches/branch-0.23/hadoop-project/src/site/apt/index.apt.vm (original) +++ hadoop/common/branches/branch-0.23/hadoop-project/src/site/apt/index.apt.vm Tue Nov 20 17:46:14 2012 @@ -34,7 +34,7 @@ Apache Hadoop 0.23 Namenodes. More details are available in the - {{{./hadoop-yarn/hadoop-yarn-site/Federation.html}HDFS Federation}} + {{{./hadoop-project-dist/hadoop-hdfs/Federation.html}HDFS Federation}} document. * {MapReduce NextGen aka YARN aka MRv2} @@ -65,9 +65,9 @@ Getting Started The Hadoop documentation includes the information you need to get started using Hadoop. Begin with the - {{{./hadoop-yarn/hadoop-yarn-site/SingleCluster.html}Single Node Setup}} which + {{{./hadoop-project-dist/hadoop-common/SingleCluster.html}Single Node Setup}} which shows you how to set up a single-node Hadoop installation. Then move on to the - {{{./hadoop-yarn/hadoop-yarn-site/ClusterSetup.html}Cluster Setup}} to learn how + {{{./hadoop-project-dist/hadoop-common/ClusterSetup.html}Cluster Setup}} to learn how to set up a multi-node Hadoop installation. 
Modified: hadoop/common/branches/branch-0.23/hadoop-project/src/site/site.xml URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-0.23/hadoop-project/src/site/site.xml?rev=1411762&r1=1411761&r2=1411762&view=diff == --- hadoop/common/branches/branch-0.23/hadoop-project/src/site/site.xml (original) +++ hadoop/common/branches/branch-0.23/hadoop-project/src/site/site.xml Tue Nov 20 17:46:14 2012 @@ -48,31 +48,29 @@ <menu name="Common" inherit="top"> <item name="Overview" href="index.html"/> - <item name="Single Node Setup" href="hadoop-yarn/hadoop-yarn-site/SingleCluster.html"/> - <item name="Cluster Setup" href="hadoop-yarn/hadoop-yarn-site/ClusterSetup.html"/> - <item name="Hadoop Commands" href="hadoop-project-dist/hadoop-common/commands_manual.html"/> + <item name="Single Node Setup" href="hadoop-project-dist/hadoop-common/SingleCluster.html"/> + <item name="Cluster Setup" href="hadoop-project-dist/hadoop-common/ClusterSetup.html"/> </menu> <menu name="HDFS" inherit="top"> - <item name="Federation" href="hadoop-yarn/hadoop-yarn-site/Federation.html"/> - <item name="WebHDFS REST API" href="hadoop-yarn/hadoop-yarn-site/WebHDFS.html"/> + <item name="Federation" href="hadoop-project-dist/hadoop-hdfs/Federation.html"/> + <item name="WebHDFS REST API" href="hadoop-project-dist/hadoop-hdfs/WebHDFS.html"/> <item name="HttpFS Gateway" href="hadoop-hdfs-httpfs/index.html"/> - <item name="HDFS User Guide" href="hadoop-project-dist/hadoop-hdfs/hdfs_user_guide.html"/> </menu> -<menu name="Yarn/MapReduce" inherit="top"> +<menu name="MapReduce/YARN" inherit="top"> <item name="YARN Architecture" href="hadoop-yarn/hadoop-yarn-site/YARN.html"/> - <item name="Writing Yarn Applications" href="hadoop-yarn/hadoop-yarn-site/WritingYarnApplications.html"/> + <item name="Writing YARN Applications" href="hadoop-yarn/hadoop-yarn-site/WritingYarnApplications.html"/> <item name="Capacity Scheduler" href="hadoop-yarn/hadoop-yarn-site/CapacityScheduler.html"/> <item name="Web Application Proxy" href="hadoop-yarn/hadoop-yarn-site/WebApplicationProxy.html"/> - <item name="Yarn Commands" 
href="hadoop-yarn/hadoop-yarn-site/YarnCommands.html"/> + <item name="YARN Commands" href="hadoop-yarn/hadoop-yarn-site/YarnCommands.html"/> </menu> -<menu name="YARN REST API's" inherit="top"> +<menu name="YARN REST APIs" inherit="top"> <item name="Introduction" href="hadoop-yarn/hadoop-yarn-site/WebServicesIntro.html"/> <item name="Resource Manager" href="hadoop-yarn/hadoop-yarn-site/ResourceManagerRest.html"/> <item name="Node Manager" href="hadoop-yarn/hadoop-yarn-site/NodeManagerRest.html"/> - <item name="Mapreduce Application Master" href="hadoop-yarn/hadoop-yarn-site/MapredAppMasterRest.html"/> + <item name="MR Application Master" href="hadoop-yarn/hadoop-yarn-site/MapredAppMasterRest.html"/> <item name="History Server" href="hadoop-yarn/hadoop-yarn-site/HistoryServerRest.html"/> </menu> @@ -94,8 +92,8 @@ <menu name="Configuration" inherit="top"> <item name="core-default.xml" href="hadoop-project-dist/hadoop-common/core-default.xml"/> <item name
svn commit: r1411762 - in /hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common: ./ src/site/ src/site/apt/ src/site/resources/ src/site/resources/css/
Author: tomwhite Date: Tue Nov 20 17:46:14 2012 New Revision: 1411762 URL: http://svn.apache.org/viewvc?rev=1411762view=rev Log: HADOOP-8860. Split MapReduce and YARN sections in documentation navigation. Added: hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/site/apt/ClusterSetup.apt.vm - copied unchanged from r1411754, hadoop/common/branches/branch-0.23/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-site/src/site/apt/ClusterSetup.apt.vm hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/site/apt/SingleCluster.apt.vm - copied unchanged from r1411754, hadoop/common/branches/branch-0.23/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-site/src/site/apt/SingleCluster.apt.vm hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/site/resources/ hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/site/resources/css/ hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/site/resources/css/site.css (with props) hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/site/site.xml (with props) Modified: hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/CHANGES.txt Modified: hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/CHANGES.txt URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1411762r1=1411761r2=1411762view=diff == --- hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/CHANGES.txt (original) +++ hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/CHANGES.txt Tue Nov 20 17:46:14 2012 @@ -37,6 +37,9 @@ Release 0.23.5 - UNRELEASED HADOOP-8889. Upgrade to Surefire 2.12.3 (todd) +HADOOP-8860. Split MapReduce and YARN sections in documentation navigation. +(tomwhite) + OPTIMIZATIONS HADOOP-8819. 
Incorrectly & is used instead of && in some file system Added: hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/site/resources/css/site.css URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/site/resources/css/site.css?rev=1411762&view=auto == --- hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/site/resources/css/site.css (added) +++ hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/site/resources/css/site.css Tue Nov 20 17:46:14 2012 @@ -0,0 +1,30 @@ +/* +* Licensed to the Apache Software Foundation (ASF) under one or more +* contributor license agreements. See the NOTICE file distributed with +* this work for additional information regarding copyright ownership. +* The ASF licenses this file to You under the Apache License, Version 2.0 +* (the "License"); you may not use this file except in compliance with +* the License. You may obtain a copy of the License at +* +* http://www.apache.org/licenses/LICENSE-2.0 +* +* Unless required by applicable law or agreed to in writing, software +* distributed under the License is distributed on an "AS IS" BASIS, +* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +* See the License for the specific language governing permissions and +* limitations under the License. 
+*/ +#banner { + height: 93px; + background: none; +} + +#bannerLeft img { + margin-left: 30px; + margin-top: 10px; +} + +#bannerRight img { + margin: 17px; +} + Propchange: hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/site/resources/css/site.css -- svn:eol-style = native Added: hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/site/site.xml URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/site/site.xml?rev=1411762&view=auto == --- hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/site/site.xml (added) +++ hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/site/site.xml Tue Nov 20 17:46:14 2012 @@ -0,0 +1,28 @@ +<!-- + Licensed under the Apache License, Version 2.0 (the "License"); + you may not use this file except in compliance with the License. + You may obtain a copy of the License at + + http://www.apache.org/licenses/LICENSE-2.0 + + Unless required by applicable law or agreed to in writing, software + distributed under the License is distributed on an "AS IS" BASIS, + WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + See the License for the specific language governing permissions
svn commit: r1411771 - in /hadoop/common/branches/branch-0.23.5/hadoop-common-project/hadoop-common: ./ src/site/ src/site/apt/ src/site/resources/ src/site/resources/css/
Author: tomwhite Date: Tue Nov 20 18:00:40 2012 New Revision: 1411771 URL: http://svn.apache.org/viewvc?rev=1411771view=rev Log: Merge -r 1411761:1411762 from trunk to branch-0.23.5. Fixes: HADOOP-8860. Split MapReduce and YARN sections in documentation navigation. Added: hadoop/common/branches/branch-0.23.5/hadoop-common-project/hadoop-common/src/site/apt/ClusterSetup.apt.vm - copied unchanged from r1411762, hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/site/apt/ClusterSetup.apt.vm hadoop/common/branches/branch-0.23.5/hadoop-common-project/hadoop-common/src/site/apt/SingleCluster.apt.vm - copied unchanged from r1411762, hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/site/apt/SingleCluster.apt.vm hadoop/common/branches/branch-0.23.5/hadoop-common-project/hadoop-common/src/site/resources/ - copied from r1411762, hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/site/resources/ hadoop/common/branches/branch-0.23.5/hadoop-common-project/hadoop-common/src/site/resources/css/ - copied from r1411762, hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/site/resources/css/ hadoop/common/branches/branch-0.23.5/hadoop-common-project/hadoop-common/src/site/resources/css/site.css - copied unchanged from r1411762, hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/site/resources/css/site.css hadoop/common/branches/branch-0.23.5/hadoop-common-project/hadoop-common/src/site/site.xml - copied unchanged from r1411762, hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/site/site.xml Modified: hadoop/common/branches/branch-0.23.5/hadoop-common-project/hadoop-common/CHANGES.txt Modified: hadoop/common/branches/branch-0.23.5/hadoop-common-project/hadoop-common/CHANGES.txt URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-0.23.5/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1411771r1=1411770r2=1411771view=diff == --- 
hadoop/common/branches/branch-0.23.5/hadoop-common-project/hadoop-common/CHANGES.txt (original) +++ hadoop/common/branches/branch-0.23.5/hadoop-common-project/hadoop-common/CHANGES.txt Tue Nov 20 18:00:40 2012 @@ -25,6 +25,9 @@ Release 0.23.5 - UNRELEASED HADOOP-8889. Upgrade to Surefire 2.12.3 (todd) +HADOOP-8860. Split MapReduce and YARN sections in documentation navigation. +(tomwhite) + OPTIMIZATIONS HADOOP-8819. Incorrectly & is used instead of && in some file system
svn commit: r1411771 - in /hadoop/common/branches/branch-0.23.5/hadoop-project/src/site: apt/index.apt.vm site.xml
Author: tomwhite Date: Tue Nov 20 18:00:40 2012 New Revision: 1411771 URL: http://svn.apache.org/viewvc?rev=1411771view=rev Log: Merge -r 1411761:1411762 from trunk to branch-0.23.5. Fixes: HADOOP-8860. Split MapReduce and YARN sections in documentation navigation. Modified: hadoop/common/branches/branch-0.23.5/hadoop-project/src/site/apt/index.apt.vm hadoop/common/branches/branch-0.23.5/hadoop-project/src/site/site.xml Modified: hadoop/common/branches/branch-0.23.5/hadoop-project/src/site/apt/index.apt.vm URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-0.23.5/hadoop-project/src/site/apt/index.apt.vm?rev=1411771r1=1411770r2=1411771view=diff == --- hadoop/common/branches/branch-0.23.5/hadoop-project/src/site/apt/index.apt.vm (original) +++ hadoop/common/branches/branch-0.23.5/hadoop-project/src/site/apt/index.apt.vm Tue Nov 20 18:00:40 2012 @@ -34,7 +34,7 @@ Apache Hadoop 0.23 Namenodes. More details are available in the - {{{./hadoop-yarn/hadoop-yarn-site/Federation.html}HDFS Federation}} + {{{./hadoop-project-dist/hadoop-hdfs/Federation.html}HDFS Federation}} document. * {MapReduce NextGen aka YARN aka MRv2} @@ -65,9 +65,9 @@ Getting Started The Hadoop documentation includes the information you need to get started using Hadoop. Begin with the - {{{./hadoop-yarn/hadoop-yarn-site/SingleCluster.html}Single Node Setup}} which + {{{./hadoop-project-dist/hadoop-common/SingleCluster.html}Single Node Setup}} which shows you how to set up a single-node Hadoop installation. Then move on to the - {{{./hadoop-yarn/hadoop-yarn-site/ClusterSetup.html}Cluster Setup}} to learn how + {{{./hadoop-project-dist/hadoop-common/ClusterSetup.html}Cluster Setup}} to learn how to set up a multi-node Hadoop installation. 
Modified: hadoop/common/branches/branch-0.23.5/hadoop-project/src/site/site.xml URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-0.23.5/hadoop-project/src/site/site.xml?rev=1411771&r1=1411770&r2=1411771&view=diff == --- hadoop/common/branches/branch-0.23.5/hadoop-project/src/site/site.xml (original) +++ hadoop/common/branches/branch-0.23.5/hadoop-project/src/site/site.xml Tue Nov 20 18:00:40 2012 @@ -48,31 +48,29 @@ <menu name="Common" inherit="top"> <item name="Overview" href="index.html"/> - <item name="Single Node Setup" href="hadoop-yarn/hadoop-yarn-site/SingleCluster.html"/> - <item name="Cluster Setup" href="hadoop-yarn/hadoop-yarn-site/ClusterSetup.html"/> - <item name="Hadoop Commands" href="hadoop-project-dist/hadoop-common/commands_manual.html"/> + <item name="Single Node Setup" href="hadoop-project-dist/hadoop-common/SingleCluster.html"/> + <item name="Cluster Setup" href="hadoop-project-dist/hadoop-common/ClusterSetup.html"/> </menu> <menu name="HDFS" inherit="top"> - <item name="Federation" href="hadoop-yarn/hadoop-yarn-site/Federation.html"/> - <item name="WebHDFS REST API" href="hadoop-yarn/hadoop-yarn-site/WebHDFS.html"/> + <item name="Federation" href="hadoop-project-dist/hadoop-hdfs/Federation.html"/> + <item name="WebHDFS REST API" href="hadoop-project-dist/hadoop-hdfs/WebHDFS.html"/> <item name="HttpFS Gateway" href="hadoop-hdfs-httpfs/index.html"/> - <item name="HDFS User Guide" href="hadoop-project-dist/hadoop-hdfs/hdfs_user_guide.html"/> </menu> -<menu name="Yarn/MapReduce" inherit="top"> +<menu name="MapReduce/YARN" inherit="top"> <item name="YARN Architecture" href="hadoop-yarn/hadoop-yarn-site/YARN.html"/> - <item name="Writing Yarn Applications" href="hadoop-yarn/hadoop-yarn-site/WritingYarnApplications.html"/> + <item name="Writing YARN Applications" href="hadoop-yarn/hadoop-yarn-site/WritingYarnApplications.html"/> <item name="Capacity Scheduler" href="hadoop-yarn/hadoop-yarn-site/CapacityScheduler.html"/> <item name="Web Application Proxy" href="hadoop-yarn/hadoop-yarn-site/WebApplicationProxy.html"/> - <item name="Yarn Commands" 
href="hadoop-yarn/hadoop-yarn-site/YarnCommands.html"/> + <item name="YARN Commands" href="hadoop-yarn/hadoop-yarn-site/YarnCommands.html"/> </menu> -<menu name="YARN REST API's" inherit="top"> +<menu name="YARN REST APIs" inherit="top"> <item name="Introduction" href="hadoop-yarn/hadoop-yarn-site/WebServicesIntro.html"/> <item name="Resource Manager" href="hadoop-yarn/hadoop-yarn-site/ResourceManagerRest.html"/> <item name="Node Manager" href="hadoop-yarn/hadoop-yarn-site/NodeManagerRest.html"/> - <item name="Mapreduce Application Master" href="hadoop-yarn/hadoop-yarn-site/MapredAppMasterRest.html"/> + <item name="MR Application Master" href="hadoop-yarn/hadoop-yarn-site/MapredAppMasterRest.html"/> <item name="History Server" href="hadoop-yarn/hadoop-yarn-site/HistoryServerRest.html"/> </menu> @@ -94,8 +92,8 @@ <menu name="Configuration" inherit="top"> <item name="core
svn commit: r1411235 - /hadoop/common/trunk/hadoop-project/pom.xml
Author: tomwhite Date: Mon Nov 19 15:12:22 2012 New Revision: 1411235 URL: http://svn.apache.org/viewvc?rev=1411235&view=rev Log: YARN-129. Simplify classpath construction for mini YARN tests. Modified: hadoop/common/trunk/hadoop-project/pom.xml Modified: hadoop/common/trunk/hadoop-project/pom.xml URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-project/pom.xml?rev=1411235&r1=1411234&r2=1411235&view=diff == --- hadoop/common/trunk/hadoop-project/pom.xml (original) +++ hadoop/common/trunk/hadoop-project/pom.xml Mon Nov 19 15:12:22 2012 @@ -703,11 +703,6 @@ <groupId>org.apache.maven.plugins</groupId> <artifactId>maven-jar-plugin</artifactId> <version>2.3.1</version> - <configuration> -<excludes> - <exclude>mrapp-generated-classpath</exclude> -</excludes> - </configuration> </plugin> <plugin> <groupId>org.apache.maven.plugins</groupId> @@ -803,21 +798,6 @@ </executions> </plugin> <plugin> -<artifactId>maven-dependency-plugin</artifactId> -<executions> - <execution> -<id>build-classpath</id> -<phase>generate-sources</phase> -<goals> - <goal>build-classpath</goal> -</goals> -<configuration> - <outputFile>target/classes/mrapp-generated-classpath</outputFile> -</configuration> - </execution> -</executions> - </plugin> - <plugin> <groupId>org.apache.maven.plugins</groupId> <artifactId>maven-surefire-plugin</artifactId> <configuration>
svn commit: r1411239 - /hadoop/common/branches/branch-2/hadoop-project/pom.xml
Author: tomwhite Date: Mon Nov 19 15:16:57 2012 New Revision: 1411239 URL: http://svn.apache.org/viewvc?rev=1411239&view=rev Log: Merge -r 1411234:1411235 from trunk to branch-2. Fixes: YARN-129. Simplify classpath construction for mini YARN tests. Modified: hadoop/common/branches/branch-2/hadoop-project/pom.xml Modified: hadoop/common/branches/branch-2/hadoop-project/pom.xml URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-project/pom.xml?rev=1411239&r1=1411238&r2=1411239&view=diff == --- hadoop/common/branches/branch-2/hadoop-project/pom.xml (original) +++ hadoop/common/branches/branch-2/hadoop-project/pom.xml Mon Nov 19 15:16:57 2012 @@ -705,11 +705,6 @@ <groupId>org.apache.maven.plugins</groupId> <artifactId>maven-jar-plugin</artifactId> <version>2.3.1</version> - <configuration> -<excludes> - <exclude>mrapp-generated-classpath</exclude> -</excludes> - </configuration> </plugin> <plugin> <groupId>org.apache.maven.plugins</groupId> @@ -805,21 +800,6 @@ </executions> </plugin> <plugin> -<artifactId>maven-dependency-plugin</artifactId> -<executions> - <execution> -<id>build-classpath</id> -<phase>generate-sources</phase> -<goals> - <goal>build-classpath</goal> -</goals> -<configuration> - <outputFile>target/classes/mrapp-generated-classpath</outputFile> -</configuration> - </execution> -</executions> - </plugin> - <plugin> <groupId>org.apache.maven.plugins</groupId> <artifactId>maven-enforcer-plugin</artifactId> <inherited>false</inherited>
svn commit: r1408290 - in /hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/site: apt/CLIMiniCluster.apt.vm apt/ClusterSetup.apt.vm apt/SingleCluster.apt.vm resources/ resources
Author: tomwhite Date: Mon Nov 12 13:57:19 2012 New Revision: 1408290 URL: http://svn.apache.org/viewvc?rev=1408290&view=rev Log: Merge -r 1407657:1407658 from trunk to branch-2. Fixes: HADOOP-8860. Split MapReduce and YARN sections in documentation navigation. Added: hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/site/apt/CLIMiniCluster.apt.vm - copied unchanged from r1408264, hadoop/common/branches/branch-2/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-site/src/site/apt/CLIMiniCluster.apt.vm hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/site/apt/ClusterSetup.apt.vm - copied unchanged from r1408264, hadoop/common/branches/branch-2/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-site/src/site/apt/ClusterSetup.apt.vm hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/site/apt/SingleCluster.apt.vm - copied unchanged from r1408264, hadoop/common/branches/branch-2/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-site/src/site/apt/SingleCluster.apt.vm hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/site/resources/ hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/site/resources/css/ hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/site/resources/css/site.css (with props) hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/site/site.xml (with props) Added: hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/site/resources/css/site.css URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/site/resources/css/site.css?rev=1408290&view=auto == --- hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/site/resources/css/site.css (added) +++ hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/site/resources/css/site.css Mon Nov 12 13:57:19 2012 @@ -0,0 +1,30 @@ +/* +* Licensed to the Apache Software Foundation (ASF) under 
one or more +* contributor license agreements. See the NOTICE file distributed with +* this work for additional information regarding copyright ownership. +* The ASF licenses this file to You under the Apache License, Version 2.0 +* (the "License"); you may not use this file except in compliance with +* the License. You may obtain a copy of the License at +* +* http://www.apache.org/licenses/LICENSE-2.0 +* +* Unless required by applicable law or agreed to in writing, software +* distributed under the License is distributed on an "AS IS" BASIS, +* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +* See the License for the specific language governing permissions and +* limitations under the License. +*/ +#banner { + height: 93px; + background: none; +} + +#bannerLeft img { + margin-left: 30px; + margin-top: 10px; +} + +#bannerRight img { + margin: 17px; +} + Propchange: hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/site/resources/css/site.css -- svn:eol-style = native Added: hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/site/site.xml URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/site/site.xml?rev=1408290&view=auto == --- hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/site/site.xml (added) +++ hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/site/site.xml Mon Nov 12 13:57:19 2012 @@ -0,0 +1,28 @@ +<!-- + Licensed under the Apache License, Version 2.0 (the "License"); + you may not use this file except in compliance with the License. + You may obtain a copy of the License at + + http://www.apache.org/licenses/LICENSE-2.0 + + Unless required by applicable law or agreed to in writing, software + distributed under the License is distributed on an "AS IS" BASIS, + WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
+ See the License for the specific language governing permissions and + limitations under the License. See accompanying LICENSE file. +--> +<project name="Apache Hadoop ${project.version}"> + + <skin> +<groupId>org.apache.maven.skins</groupId> +<artifactId>maven-stylus-skin</artifactId> +<version>1.2</version> + </skin> + + <body> +<links> + <item name="Apache Hadoop" href="http://hadoop.apache.org/"/> +</links> + </body> + +</project> Propchange: hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/site/site.xml -- svn:eol-style = native
svn commit: r1408290 - in /hadoop/common/branches/branch-2/hadoop-project/src/site: apt/index.apt.vm site.xml
Author: tomwhite Date: Mon Nov 12 13:57:19 2012 New Revision: 1408290 URL: http://svn.apache.org/viewvc?rev=1408290view=rev Log: Merge -r 1407657:1407658 from trunk to branch-2. Fixes: HADOOP-8860. Split MapReduce and YARN sections in documentation navigation. Modified: hadoop/common/branches/branch-2/hadoop-project/src/site/apt/index.apt.vm hadoop/common/branches/branch-2/hadoop-project/src/site/site.xml Modified: hadoop/common/branches/branch-2/hadoop-project/src/site/apt/index.apt.vm URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-project/src/site/apt/index.apt.vm?rev=1408290r1=1408289r2=1408290view=diff == --- hadoop/common/branches/branch-2/hadoop-project/src/site/apt/index.apt.vm (original) +++ hadoop/common/branches/branch-2/hadoop-project/src/site/apt/index.apt.vm Mon Nov 12 13:57:19 2012 @@ -34,7 +34,7 @@ Apache Hadoop ${project.version} Namenodes. More details are available in the - {{{./hadoop-yarn/hadoop-yarn-site/Federation.html}HDFS Federation}} + {{{./hadoop-project-dist/hadoop-hdfs/Federation.html}HDFS Federation}} document. * {MapReduce NextGen aka YARN aka MRv2} @@ -65,9 +65,9 @@ Getting Started The Hadoop documentation includes the information you need to get started using Hadoop. Begin with the - {{{./hadoop-yarn/hadoop-yarn-site/SingleCluster.html}Single Node Setup}} which + {{{./hadoop-project-dist/hadoop-common/SingleCluster.html}Single Node Setup}} which shows you how to set up a single-node Hadoop installation. Then move on to the - {{{./hadoop-yarn/hadoop-yarn-site/ClusterSetup.html}Cluster Setup}} to learn how + {{{./hadoop-project-dist/hadoop-common/ClusterSetup.html}Cluster Setup}} to learn how to set up a multi-node Hadoop installation. 
Modified: hadoop/common/branches/branch-2/hadoop-project/src/site/site.xml URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-project/src/site/site.xml?rev=1408290r1=1408289r2=1408290view=diff == --- hadoop/common/branches/branch-2/hadoop-project/src/site/site.xml (original) +++ hadoop/common/branches/branch-2/hadoop-project/src/site/site.xml Mon Nov 12 13:57:19 2012 @@ -48,33 +48,35 @@ menu name=Common inherit=top item name=Overview href=index.html/ - item name=Single Node Setup href=hadoop-yarn/hadoop-yarn-site/SingleCluster.html/ - item name=Cluster Setup href=hadoop-yarn/hadoop-yarn-site/ClusterSetup.html/ - item name=Hadoop Commands href=hadoop-project-dist/hadoop-common/commands_manual.html/ -/menu + item name=Single Node Setup href=hadoop-project-dist/hadoop-common/SingleCluster.html/ + item name=Cluster Setup href=hadoop-project-dist/hadoop-common/ClusterSetup.html/ + item name=CLI Mini Cluster href=hadoop-project-dist/hadoop-common/CLIMiniCluster.html/ + /menu menu name=HDFS inherit=top - item name=High Availability href=hadoop-yarn/hadoop-yarn-site/HDFSHighAvailability.html/ - item name=Federation href=hadoop-yarn/hadoop-yarn-site/Federation.html/ - item name=WebHDFS REST API href=hadoop-yarn/hadoop-yarn-site/WebHDFS.html/ + item name=High Availability href=hadoop-project-dist/hadoop-hdfs/HDFSHighAvailability.html/ + item name=Federation href=hadoop-project-dist/hadoop-hdfs/Federation.html/ + item name=WebHDFS REST API href=hadoop-project-dist/hadoop-hdfs/WebHDFS.html/ item name=HttpFS Gateway href=hadoop-hdfs-httpfs/index.html/ - item name=HDFS User Guide href=hadoop-project-dist/hadoop-hdfs/hdfs_user_guide.html/ /menu -menu name=Yarn/MapReduce inherit=top +menu name=MapReduce inherit=top + item name=Encrypted Shuffle href=hadoop-mapreduce-client/hadoop-mapreduce-client-core/EncryptedShuffle.html/ +/menu + +menu name=YARN inherit=top item name=YARN Architecture href=hadoop-yarn/hadoop-yarn-site/YARN.html/ - item name=Writing Yarn 
Applications href=hadoop-yarn/hadoop-yarn-site/WritingYarnApplications.html/ + item name=Writing YARN Applications href=hadoop-yarn/hadoop-yarn-site/WritingYarnApplications.html/ item name=Capacity Scheduler href=hadoop-yarn/hadoop-yarn-site/CapacityScheduler.html/ item name=Web Application Proxy href=hadoop-yarn/hadoop-yarn-site/WebApplicationProxy.html/ - item name=Encrypted Shuffle href=hadoop-yarn/hadoop-yarn-site/EncryptedShuffle.html/ - item name=Yarn Commands href=hadoop-yarn/hadoop-yarn-site/YarnCommands.html/ + item name=YARN Commands href=hadoop-yarn/hadoop-yarn-site/YarnCommands.html/ /menu -menu name=YARN REST API's inherit=top +menu name=YARN REST APIs inherit=top item name=Introduction href=hadoop-yarn/hadoop-yarn-site/WebServicesIntro.html/ item name=Resource Manager href=hadoop-yarn/hadoop-yarn
svn commit: r1396047 - in /hadoop/common/trunk/hadoop-tools/hadoop-distcp/src/test/java/org/apache/hadoop/tools: TestDistCp.java TestIntegration.java
Author: tomwhite Date: Tue Oct 9 14:37:46 2012 New Revision: 1396047 URL: http://svn.apache.org/viewvc?rev=1396047view=rev Log: MAPREDUCE-4654. TestDistCp is ignored. Contributed by Sandy Ryza. Removed: hadoop/common/trunk/hadoop-tools/hadoop-distcp/src/test/java/org/apache/hadoop/tools/TestDistCp.java Modified: hadoop/common/trunk/hadoop-tools/hadoop-distcp/src/test/java/org/apache/hadoop/tools/TestIntegration.java Modified: hadoop/common/trunk/hadoop-tools/hadoop-distcp/src/test/java/org/apache/hadoop/tools/TestIntegration.java URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-tools/hadoop-distcp/src/test/java/org/apache/hadoop/tools/TestIntegration.java?rev=1396047r1=1396046r2=1396047view=diff == --- hadoop/common/trunk/hadoop-tools/hadoop-distcp/src/test/java/org/apache/hadoop/tools/TestIntegration.java (original) +++ hadoop/common/trunk/hadoop-tools/hadoop-distcp/src/test/java/org/apache/hadoop/tools/TestIntegration.java Tue Oct 9 14:37:46 2012 @@ -21,8 +21,11 @@ package org.apache.hadoop.tools; import org.apache.commons.logging.Log; import org.apache.commons.logging.LogFactory; import org.apache.hadoop.conf.Configuration; +import org.apache.hadoop.fs.FSDataInputStream; import org.apache.hadoop.fs.FileSystem; import org.apache.hadoop.fs.Path; +import org.apache.hadoop.mapreduce.Cluster; +import org.apache.hadoop.mapreduce.JobSubmissionFiles; import org.apache.hadoop.tools.util.TestDistCpUtils; import org.junit.Assert; import org.junit.BeforeClass; @@ -30,6 +33,8 @@ import org.junit.Test; import java.io.IOException; import java.io.OutputStream; +import java.util.ArrayList; +import java.util.List; public class TestIntegration { private static final Log LOG = LogFactory.getLog(TestIntegration.class); @@ -317,6 +322,58 @@ public class TestIntegration { TestDistCpUtils.delete(fs, root); } } + + @Test + public void testDeleteMissingInDestination() { + +try { + addEntries(listFile, srcdir); + createFiles(srcdir/file1, dstdir/file1, dstdir/file2); + + Path 
target = new Path(root + "/dstdir");
+      runTest(listFile, target, true, true, false);
+
+      checkResult(target, 1, "file1");
+    } catch (IOException e) {
+      LOG.error("Exception encountered while running distcp", e);
+      Assert.fail("distcp failure");
+    } finally {
+      TestDistCpUtils.delete(fs, root);
+      TestDistCpUtils.delete(fs, "target/tmp1");
+    }
+  }
+
+  @Test
+  public void testOverwrite() {
+    byte[] contents1 = "contents1".getBytes();
+    byte[] contents2 = "contents2".getBytes();
+    Assert.assertEquals(contents1.length, contents2.length);
+
+    try {
+      addEntries(listFile, "srcdir");
+      createWithContents("srcdir/file1", contents1);
+      createWithContents("dstdir/file1", contents2);
+
+      Path target = new Path(root + "/dstdir");
+      runTest(listFile, target, false, false, true);
+
+      checkResult(target, 1, "file1");
+
+      // make sure dstdir/file1 has been overwritten with the contents
+      // of srcdir/file1
+      FSDataInputStream is = fs.open(new Path(root + "/dstdir/file1"));
+      byte[] dstContents = new byte[contents1.length];
+      is.readFully(dstContents);
+      is.close();
+      Assert.assertArrayEquals(contents1, dstContents);
+    } catch (IOException e) {
+      LOG.error("Exception encountered while running distcp", e);
+      Assert.fail("distcp failure");
+    } finally {
+      TestDistCpUtils.delete(fs, root);
+      TestDistCpUtils.delete(fs, "target/tmp1");
+    }
+  }
 
   @Test
   public void testGlobTargetMissingSingleLevel() {
@@ -410,7 +467,33 @@ public class TestIntegration {
       TestDistCpUtils.delete(fs, "target/tmp1");
     }
   }
+
+  @Test
+  public void testCleanup() {
+    try {
+      Path sourcePath = new Path("noscheme:///file");
+      List<Path> sources = new ArrayList<Path>();
+      sources.add(sourcePath);
+
+      DistCpOptions options = new DistCpOptions(sources, target);
+
+      Configuration conf = getConf();
+      Path stagingDir = JobSubmissionFiles.getStagingDir(
+          new Cluster(conf), conf);
+      stagingDir.getFileSystem(conf).mkdirs(stagingDir);
+      try {
+        new DistCp(conf, options).execute();
+      } catch (Throwable t) {
+        Assert.assertEquals(stagingDir.getFileSystem(conf).
+listStatus(stagingDir).length, 0); + } +} catch (Exception e) { + LOG.error(Exception encountered , e); + Assert.fail(testCleanup failed + e.getMessage()); +} + } + private void addEntries(Path listFile, String... entries) throws IOException { OutputStream out = fs.create(listFile); try { @@ -434,16 +517,32 @@ public class TestIntegration { } } } + + private void createWithContents(String entry, byte[] contents) throws IOException { +OutputStream out = fs.create(new Path(root
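The overwrite check in testOverwrite boils down to: run the copy with overwrite enabled, then read the destination back in full and compare it byte-for-byte with the source contents. A standalone sketch of that verification pattern, using java.nio in place of Hadoop's FileSystem/FSDataInputStream (class and file names here are illustrative, not from the commit):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Arrays;

public class OverwriteCheck {
    // Pure byte-for-byte comparison, mirroring Assert.assertArrayEquals.
    static boolean matches(byte[] expected, byte[] actual) {
        return Arrays.equals(expected, actual);
    }

    // Read the destination back in full and compare, mirroring the
    // readFully + assertArrayEquals step in testOverwrite.
    static boolean verifyOverwrite(Path dst, byte[] expected) {
        try {
            return matches(expected, Files.readAllBytes(dst));
        } catch (IOException e) {
            return false; // an unreadable destination counts as a failed copy
        }
    }

    public static void main(String[] args) throws IOException {
        Path dst = Files.createTempFile("dstdir-file1", null);
        Files.write(dst, "contents2".getBytes()); // stale copy at destination
        Files.write(dst, "contents1".getBytes()); // the "overwrite" step
        System.out.println(verifyOverwrite(dst, "contents1".getBytes())); // prints true
        Files.deleteIfExists(dst);
    }
}
```

Reading the whole destination back, rather than trusting the copy's return code, is what makes the test catch a silently skipped overwrite.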
svn commit: r1396051 - in /hadoop/common/branches/branch-2/hadoop-tools/hadoop-distcp/src/test/java/org/apache/hadoop/tools: TestDistCp.java TestIntegration.java
Author: tomwhite Date: Tue Oct 9 14:49:51 2012 New Revision: 1396051 URL: http://svn.apache.org/viewvc?rev=1396051view=rev Log: Merge -r 1396046:1396047 from trunk to branch-2. Fixes: MAPREDUCE-4654. TestDistCp is ignored. Contributed by Sandy Ryza. Removed: hadoop/common/branches/branch-2/hadoop-tools/hadoop-distcp/src/test/java/org/apache/hadoop/tools/TestDistCp.java Modified: hadoop/common/branches/branch-2/hadoop-tools/hadoop-distcp/src/test/java/org/apache/hadoop/tools/TestIntegration.java Modified: hadoop/common/branches/branch-2/hadoop-tools/hadoop-distcp/src/test/java/org/apache/hadoop/tools/TestIntegration.java URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-tools/hadoop-distcp/src/test/java/org/apache/hadoop/tools/TestIntegration.java?rev=1396051r1=1396050r2=1396051view=diff == --- hadoop/common/branches/branch-2/hadoop-tools/hadoop-distcp/src/test/java/org/apache/hadoop/tools/TestIntegration.java (original) +++ hadoop/common/branches/branch-2/hadoop-tools/hadoop-distcp/src/test/java/org/apache/hadoop/tools/TestIntegration.java Tue Oct 9 14:49:51 2012 @@ -21,8 +21,11 @@ package org.apache.hadoop.tools; import org.apache.commons.logging.Log; import org.apache.commons.logging.LogFactory; import org.apache.hadoop.conf.Configuration; +import org.apache.hadoop.fs.FSDataInputStream; import org.apache.hadoop.fs.FileSystem; import org.apache.hadoop.fs.Path; +import org.apache.hadoop.mapreduce.Cluster; +import org.apache.hadoop.mapreduce.JobSubmissionFiles; import org.apache.hadoop.tools.util.TestDistCpUtils; import org.junit.Assert; import org.junit.BeforeClass; @@ -30,6 +33,8 @@ import org.junit.Test; import java.io.IOException; import java.io.OutputStream; +import java.util.ArrayList; +import java.util.List; public class TestIntegration { private static final Log LOG = LogFactory.getLog(TestIntegration.class); @@ -317,6 +322,58 @@ public class TestIntegration { TestDistCpUtils.delete(fs, root); } } + + @Test + public void 
testDeleteMissingInDestination() { + +try { + addEntries(listFile, srcdir); + createFiles(srcdir/file1, dstdir/file1, dstdir/file2); + + Path target = new Path(root + /dstdir); + runTest(listFile, target, true, true, false); + + checkResult(target, 1, file1); +} catch (IOException e) { + LOG.error(Exception encountered while running distcp, e); + Assert.fail(distcp failure); +} finally { + TestDistCpUtils.delete(fs, root); + TestDistCpUtils.delete(fs, target/tmp1); +} + } + + @Test + public void testOverwrite() { +byte[] contents1 = contents1.getBytes(); +byte[] contents2 = contents2.getBytes(); +Assert.assertEquals(contents1.length, contents2.length); + +try { + addEntries(listFile, srcdir); + createWithContents(srcdir/file1, contents1); + createWithContents(dstdir/file1, contents2); + + Path target = new Path(root + /dstdir); + runTest(listFile, target, false, false, true); + + checkResult(target, 1, file1); + + // make sure dstdir/file1 has been overwritten with the contents + // of srcdir/file1 + FSDataInputStream is = fs.open(new Path(root + /dstdir/file1)); + byte[] dstContents = new byte[contents1.length]; + is.readFully(dstContents); + is.close(); + Assert.assertArrayEquals(contents1, dstContents); +} catch (IOException e) { + LOG.error(Exception encountered while running distcp, e); + Assert.fail(distcp failure); +} finally { + TestDistCpUtils.delete(fs, root); + TestDistCpUtils.delete(fs, target/tmp1); +} + } @Test public void testGlobTargetMissingSingleLevel() { @@ -410,7 +467,33 @@ public class TestIntegration { TestDistCpUtils.delete(fs, target/tmp1); } } + + @Test + public void testCleanup() { +try { + Path sourcePath = new Path(noscheme:///file); + ListPath sources = new ArrayListPath(); + sources.add(sourcePath); + + DistCpOptions options = new DistCpOptions(sources, target); + + Configuration conf = getConf(); + Path stagingDir = JobSubmissionFiles.getStagingDir( + new Cluster(conf), conf); + stagingDir.getFileSystem(conf).mkdirs(stagingDir); + try 
{ +new DistCp(conf, options).execute(); + } catch (Throwable t) { +Assert.assertEquals(stagingDir.getFileSystem(conf). +listStatus(stagingDir).length, 0); + } +} catch (Exception e) { + LOG.error(Exception encountered , e); + Assert.fail(testCleanup failed + e.getMessage()); +} + } + private void addEntries(Path listFile, String... entries) throws IOException { OutputStream out = fs.create(listFile); try { @@ -434,16 +517,32 @@ public class TestIntegration { } } } + + private
svn commit: r1395520 - in /hadoop/common/branches/branch-1: CHANGES.txt src/contrib/fairscheduler/src/java/org/apache/hadoop/mapred/FairScheduler.java
Author: tomwhite Date: Mon Oct 8 11:44:42 2012 New Revision: 1395520 URL: http://svn.apache.org/viewvc?rev=1395520view=rev Log: MAPREDUCE-4706. FairScheduler#dump(): Computing of # running maps and reduces is commented out. Contributed by Karthik Kambatla. Modified: hadoop/common/branches/branch-1/CHANGES.txt hadoop/common/branches/branch-1/src/contrib/fairscheduler/src/java/org/apache/hadoop/mapred/FairScheduler.java Modified: hadoop/common/branches/branch-1/CHANGES.txt URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-1/CHANGES.txt?rev=1395520r1=1395519r2=1395520view=diff == --- hadoop/common/branches/branch-1/CHANGES.txt (original) +++ hadoop/common/branches/branch-1/CHANGES.txt Mon Oct 8 11:44:42 2012 @@ -233,6 +233,9 @@ Release 1.2.0 - unreleased (default behavior in Ubuntu). (Christopher Berner and Andy Isaacson via harsh) +MAPREDUCE-4706. FairScheduler#dump(): Computing of # running maps and +reduces is commented out. (Karthik Kambatla via tomwhite) + Release 1.1.0 - unreleased INCOMPATIBLE CHANGES Modified: hadoop/common/branches/branch-1/src/contrib/fairscheduler/src/java/org/apache/hadoop/mapred/FairScheduler.java URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-1/src/contrib/fairscheduler/src/java/org/apache/hadoop/mapred/FairScheduler.java?rev=1395520r1=1395519r2=1395520view=diff == --- hadoop/common/branches/branch-1/src/contrib/fairscheduler/src/java/org/apache/hadoop/mapred/FairScheduler.java (original) +++ hadoop/common/branches/branch-1/src/contrib/fairscheduler/src/java/org/apache/hadoop/mapred/FairScheduler.java Mon Oct 8 11:44:42 2012 @@ -1082,16 +1082,8 @@ public class FairScheduler extends TaskS else return p1.getName().compareTo(p2.getName()); }}); for (Pool pool: pools) { -int runningMaps = 0; -int runningReduces = 0; -for (JobInProgress job: pool.getJobs()) { - JobInfo info = infos.get(job); - if (info != null) { -// TODO: Fix -//runningMaps += info.runningMaps; -//runningReduces += info.runningReduces; - } -} 
+    int runningMaps = pool.getMapSchedulable().getRunningTasks();
+    int runningReduces = pool.getReduceSchedulable().getRunningTasks();
     String name = pool.getName();
     eventLog.log("POOL", name, poolMgr.getPoolWeight(name), pool.getJobs().size(),
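The fix swaps a per-job accumulation loop (which had been commented out with a TODO, leaving the counts stuck at zero) for a direct query of each pool's map and reduce Schedulable. A minimal sketch of the resulting dump logic, with Pool and Schedulable reduced to stand-ins rather than Hadoop's real classes:

```java
import java.util.List;

public class PoolDump {
    // Stand-in for FairScheduler's Schedulable, which tracks running tasks.
    interface Schedulable { int getRunningTasks(); }

    // Stand-in for a fair-scheduler pool with one schedulable per task type.
    static class Pool {
        final String name;
        final Schedulable maps, reduces;
        Pool(String name, Schedulable maps, Schedulable reduces) {
            this.name = name; this.maps = maps; this.reduces = reduces;
        }
        Schedulable getMapSchedulable() { return maps; }
        Schedulable getReduceSchedulable() { return reduces; }
    }

    // Ask each pool's schedulables directly, as the fixed dump() does,
    // instead of summing per-job counters.
    static String dump(List<Pool> pools) {
        StringBuilder sb = new StringBuilder();
        for (Pool pool : pools) {
            int runningMaps = pool.getMapSchedulable().getRunningTasks();
            int runningReduces = pool.getReduceSchedulable().getRunningTasks();
            sb.append(pool.name).append(": maps=").append(runningMaps)
              .append(" reduces=").append(runningReduces).append('\n');
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.print(dump(List.of(new Pool("default", () -> 3, () -> 1))));
    }
}
```

Delegating to the schedulables also keeps the event log consistent with the scheduler's own bookkeeping, since both read the same counters.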
svn commit: r1388381 - in /hadoop/common/branches/branch-1: CHANGES.txt src/mapred/mapred-default.xml
Author: tomwhite
Date: Fri Sep 21 09:01:25 2012
New Revision: 1388381

URL: http://svn.apache.org/viewvc?rev=1388381&view=rev
Log:
MAPREDUCE-2770. Improve hadoop.job.history.location doc in mapred-default.xml. Contributed by Sandy Ryza.

Modified:
    hadoop/common/branches/branch-1/CHANGES.txt
    hadoop/common/branches/branch-1/src/mapred/mapred-default.xml

Modified: hadoop/common/branches/branch-1/CHANGES.txt
URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-1/CHANGES.txt?rev=1388381&r1=1388380&r2=1388381&view=diff
==============================================================================
--- hadoop/common/branches/branch-1/CHANGES.txt (original)
+++ hadoop/common/branches/branch-1/CHANGES.txt Fri Sep 21 09:01:25 2012
@@ -104,6 +104,9 @@ Release 1.2.0 - unreleased
     HADOOP-8832. Port generic service plugin mechanism from HADOOP-5257 to
     branch-1. (backported by Brandon Li via suresh)
 
+    MAPREDUCE-2770. Improve hadoop.job.history.location doc in
+    mapred-default.xml. (Sandy Ryza via tomwhite)
+
   OPTIMIZATIONS
 
     HDFS-2533. Backport: Remove needless synchronization on some FSDataSet

Modified: hadoop/common/branches/branch-1/src/mapred/mapred-default.xml
URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-1/src/mapred/mapred-default.xml?rev=1388381&r1=1388380&r2=1388381&view=diff
==============================================================================
--- hadoop/common/branches/branch-1/src/mapred/mapred-default.xml (original)
+++ hadoop/common/branches/branch-1/src/mapred/mapred-default.xml Fri Sep 21 09:01:25 2012
@@ -10,9 +10,12 @@
 <property>
   <name>hadoop.job.history.location</name>
   <value></value>
-  <description> If job tracker is static the history files are stored
-  in this single well known place. If No value is set here, by default,
-  it is in the local file system at ${hadoop.log.dir}/history.
+  <description> The location where jobtracker history files are stored.
+  The value for this key is treated as a URI, meaning that the files
+  can be stored either on HDFS or the local file system. If no value is
+  set here, the location defaults to the local file system, at
+  file:///${hadoop.log.dir}/history. If the URI is missing a scheme,
+  fs.default.name is used for the file system.
   </description>
 </property>
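Because the improved description makes clear the value is a URI, history can be kept on HDFS rather than the jobtracker's local disk. A hypothetical mapred-site.xml override illustrating this (the namenode host and path are placeholders, not from the commit):

```xml
<property>
  <name>hadoop.job.history.location</name>
  <!-- Stored on HDFS; with no scheme, fs.default.name would supply the
       file system instead. -->
  <value>hdfs://namenode.example.com:8020/mapred/history</value>
</property>
```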
svn commit: r1384833 - in /hadoop/common/trunk/hadoop-common-project/hadoop-common: CHANGES.txt src/main/java/org/apache/hadoop/conf/Configuration.java src/site/apt/DeprecatedProperties.apt.vm
Author: tomwhite Date: Fri Sep 14 16:05:30 2012 New Revision: 1384833 URL: http://svn.apache.org/viewvc?rev=1384833view=rev Log: HADOOP-8780. Update DeprecatedProperties apt file. Contributed by Ahmed Radwan Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/conf/Configuration.java hadoop/common/trunk/hadoop-common-project/hadoop-common/src/site/apt/DeprecatedProperties.apt.vm Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1384833r1=1384832r2=1384833view=diff == --- hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt (original) +++ hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt Fri Sep 14 16:05:30 2012 @@ -236,6 +236,9 @@ Release 2.0.3-alpha - Unreleased HADOOP-8795. BASH tab completion doesn't look in PATH, assumes path to executable is specified. (Sean Mackrory via atm) +HADOOP-8780. Update DeprecatedProperties apt file. (Ahmed Radwan via +tomwhite) + Release 2.0.2-alpha - 2012-09-07 INCOMPATIBLE CHANGES Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/conf/Configuration.java URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/conf/Configuration.java?rev=1384833r1=1384832r2=1384833view=diff == --- hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/conf/Configuration.java (original) +++ hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/conf/Configuration.java Fri Sep 14 16:05:30 2012 @@ -2335,4 +2335,14 @@ public class Configuration implements It * for getClassByName. 
{@see Configuration#getClassByNameOrNull(String)}
    */
   private static abstract class NegativeCacheSentinel {}
+
+  public static void dumpDeprecatedKeys() {
+    for (Map.Entry<String, DeprecatedKeyInfo> entry : deprecatedKeyMap.entrySet()) {
+      String newKeys = "";
+      for (String newKey : entry.getValue().newKeys) {
+        newKeys += newKey + "\t";
+      }
+      System.out.println(entry.getKey() + "\t" + newKeys);
+    }
+  }
 }

Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/src/site/apt/DeprecatedProperties.apt.vm
URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/site/apt/DeprecatedProperties.apt.vm?rev=1384833&r1=1384832&r2=1384833&view=diff
==============================================================================
--- hadoop/common/trunk/hadoop-common-project/hadoop-common/src/site/apt/DeprecatedProperties.apt.vm (original)
+++ hadoop/common/trunk/hadoop-common-project/hadoop-common/src/site/apt/DeprecatedProperties.apt.vm Fri Sep 14 16:05:30 2012
@@ -24,8 +24,6 @@ Deprecated Properties
 *---+---+
 || Deprecated property name || New property name |
 *---+---+
-|StorageId | dfs.datanode.StorageId
-*---+---+
 |create.empty.dir.if.nonexist | mapreduce.jobcontrol.createdir.ifnotexist
 *---+---+
 |dfs.access.time.precision | dfs.namenode.accesstime.precision
@@ -38,14 +36,16 @@ Deprecated Properties
 *---+---+
 |dfs.block.size | dfs.blocksize
 *---+---+
-|dfs.client.buffer.dir | fs.client.buffer.dir
-*---+---+
 |dfs.data.dir | dfs.datanode.data.dir
 *---+---+
 |dfs.datanode.max.xcievers | dfs.datanode.max.transfer.threads
 *---+---+
 |dfs.df.interval | fs.df.interval
 *---+---+
+|dfs.federation.nameservice.id | dfs.nameservice.id
+*---+---+
+|dfs.federation.nameservices | dfs.nameservices
+*---+---+
 |dfs.http.address | dfs.namenode.http-address
 *---+---+
 |dfs.https.address | dfs.namenode.https-address
@@ -54,10 +54,10 @@ Deprecated Properties
 *---+---+
 |dfs.https.need.client.auth | dfs.client.https.need-auth
 *---+---+
-|dfs.max-repl-streams | dfs.namenode.replication.max-streams
-*---+---+
 |dfs.max.objects | dfs.namenode.max.objects
 *---+---+
+|dfs.max-repl-streams | dfs.namenode.replication.max-streams
+*---+---+
 |dfs.name.dir | dfs.namenode.name.dir
 *---+---+
 |dfs.name.dir.restore | dfs.namenode.name.dir.restore
@@ -86,6 +86,8 @@ Deprecated Properties
 *---+---+
 |dfs.socket.timeout | dfs.client.socket-timeout
 *---+---+
+|dfs.umaskmode | fs.permissions.umask-mode
+*---+---+
 |dfs.write.packet.size | dfs.client-write-packet-size
 *---+---+
 |fs.checkpoint.dir | dfs.namenode.checkpoint.dir
@@ -106,10 +108,10 @@ Deprecated Properties
 *---+---+
 |hadoop.pipes.command-file.keep | mapreduce.pipes.commandfile.preserve
 *---+---+
-|hadoop.pipes.executable
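The new dumpDeprecatedKeys() walks the deprecated-key map and prints one tab-separated line per old key, which is presumably how the regenerated apt table was produced. The same traversal can be sketched against a plain map (the entries below are a small sample from the table, not Configuration's full internal deprecatedKeyMap):

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class DeprecatedKeyDump {
    // One tab-separated line per deprecated key: oldKey\tnewKey1\tnewKey2...
    static String formatEntry(String oldKey, List<String> newKeys) {
        StringBuilder sb = new StringBuilder(oldKey);
        for (String newKey : newKeys) {
            sb.append('\t').append(newKey);
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        // Sample mappings taken from the commit's table; the real method
        // iterates Configuration's deprecatedKeyMap instead.
        Map<String, List<String>> deprecated = new LinkedHashMap<>();
        deprecated.put("dfs.block.size", List.of("dfs.blocksize"));
        deprecated.put("dfs.umaskmode", List.of("fs.permissions.umask-mode"));
        for (Map.Entry<String, List<String>> e : deprecated.entrySet()) {
            System.out.println(formatEntry(e.getKey(), e.getValue()));
        }
    }
}
```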
svn commit: r1384835 - in /hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common: CHANGES.txt src/main/java/org/apache/hadoop/conf/Configuration.java src/site/apt/DeprecatedProperties.ap
Author: tomwhite Date: Fri Sep 14 16:15:20 2012 New Revision: 1384835 URL: http://svn.apache.org/viewvc?rev=1384835view=rev Log: Merge -r 1384832:1384833 from trunk to branch-2. Fixes: HADOOP-8780. Update DeprecatedProperties apt file. Modified: hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/conf/Configuration.java hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/site/apt/DeprecatedProperties.apt.vm Modified: hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1384835r1=1384834r2=1384835view=diff == --- hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt (original) +++ hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt Fri Sep 14 16:15:20 2012 @@ -1,7 +1,219 @@ Hadoop Change Log -Release 2.0.3-alpha - Unreleased - +Trunk (Unreleased) + + INCOMPATIBLE CHANGES + +HADOOP-8124. Remove the deprecated FSDataOutputStream constructor, +FSDataOutputStream.sync() and Syncable.sync(). (szetszwo) + + NEW FEATURES + +HADOOP-8469. Make NetworkTopology class pluggable. (Junping Du via +szetszwo) + +HADOOP-8470. Add NetworkTopologyWithNodeGroup, a 4-layer implementation +of NetworkTopology. (Junping Du via szetszwo) + + IMPROVEMENTS + +HADOOP-8017. Configure hadoop-main pom to get rid of M2E plugin execution +not covered (Eric Charles via bobby) + +HADOOP-8015. ChRootFileSystem should extend FilterFileSystem +(Daryn Sharp via bobby) + +HADOOP-7595. Upgrade dependency to Avro 1.5.3. (Alejandro Abdelnur via atm) + +HADOOP-7664. Remove warmings when overriding final parameter configuration +if the override value is same as the final parameter value. +(Ravi Prakash via suresh) + +HADOOP-7729. 
Send back valid HTTP response if user hits IPC port with +HTTP GET. (todd) + +HADOOP-7792. Add verifyToken method to AbstractDelegationTokenSecretManager. +(jitendra) + +HADOOP-7688. Add servlet handler check in HttpServer.start(). +(Uma Maheswara Rao G via szetszwo) + +HADOOP-7886. Add toString to FileStatus. (SreeHari via jghoman) + +HADOOP-7808. Port HADOOP-7510 - Add configurable option to use original +hostname in token instead of IP to allow server IP change. +(Daryn Sharp via suresh) + +HADOOP-7987. Support setting the run-as user in unsecure mode. (jitendra) + +HADOOP-7988. Upper case in hostname part of the principals doesn't work with +kerberos. (jitendra) + +HADOOP-8078. Add capability to turn on security in unit tests. (Jaimin Jetly +via jitendra) + +HADOOP-7994. Remove getProtocolVersion and getProtocolSignature from the +client side translator and server side implementation. (jitendra) + +HADOOP-7757. Test file reference count is at least 3x actual value (Jon +Eagles via bobby) + +HADOOP-8147. test-patch should run tests with -fn to avoid masking test +failures (Robert Evans via tgraves) + +HADOOP-8290. Remove remaining references to hadoop.native.lib (harsh) + +HADOOP-8308. Support cross-project Jenkins builds. (tomwhite) + +HADOOP-8297. Writable javadocs don't carry default constructor (harsh) + +HADOOP-8360. empty-configuration.xml fails xml validation +(Radim Kolar via harsh) + +HADOOP-8367 Improve documentation of declaringClassProtocolName in rpc headers +(Sanjay Radia) + +HADOOP-8415. Add getDouble() and setDouble() in +org.apache.hadoop.conf.Configuration (Jan van der Lugt via harsh) + +HADOOP-7659. fs -getmerge isn't guaranteed to work well over non-HDFS +filesystems (harsh) + +HADOOP-8059. Add javadoc to InterfaceAudience and InterfaceStability. +(Brandon Li via suresh) + +HADOOP-8434. Add tests for Configuration setter methods. +(Madhukara Phatak via suresh) + +HADOOP-8523. 
test-patch.sh doesn't validate patches before building +(Jack Dintruff via jeagles) + +HADOOP-8624. ProtobufRpcEngine should log all RPCs if TRACE logging is +enabled (todd) + +HADOOP-8711. IPC Server supports adding exceptions for which +the message is printed and the stack trace is not printed to avoid chatter. +(Brandon Li via Suresh) + +HADOOP-8719. Workaround for kerberos-related log errors upon running any +hadoop command on OSX. (Jianbin Wei via harsh) + +HADOOP-8619. WritableComparator must implement no-arg constructor. +(Chris Douglas via Suresh) + +HADOOP-8736. Add Builder for building RPC server. (Brandon Li via Suresh) + + BUG FIXES + +HADOOP-8177. MBeans shouldn't try to register when it fails
svn commit: r1375689 - in /hadoop/common/branches/branch-1: CHANGES.txt src/mapred/org/apache/hadoop/mapred/JobInProgress.java
Author: tomwhite Date: Tue Aug 21 17:46:42 2012 New Revision: 1375689 URL: http://svn.apache.org/viewvc?rev=1375689view=rev Log: MAPREDUCE-4567. Fix failing TestJobKillAndFail in branch-1. Modified: hadoop/common/branches/branch-1/CHANGES.txt hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/JobInProgress.java Modified: hadoop/common/branches/branch-1/CHANGES.txt URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-1/CHANGES.txt?rev=1375689r1=1375688r2=1375689view=diff == --- hadoop/common/branches/branch-1/CHANGES.txt (original) +++ hadoop/common/branches/branch-1/CHANGES.txt Tue Aug 21 17:46:42 2012 @@ -181,6 +181,8 @@ Release 1.2.0 - unreleased HADOOP-8611. Allow fall-back to the shell-based implementation when JNI-based users-group mapping fails (Robert Parker via bobby) +MAPREDUCE-4567. Fix failing TestJobKillAndFail in branch-1. (tomwhite) + Release 1.1.0 - unreleased INCOMPATIBLE CHANGES Modified: hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/JobInProgress.java URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/JobInProgress.java?rev=1375689r1=1375688r2=1375689view=diff == --- hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/JobInProgress.java (original) +++ hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/JobInProgress.java Tue Aug 21 17:46:42 2012 @@ -833,7 +833,7 @@ public class JobInProgress { return; } if (this.status.getRunState() == JobStatus.PREP) { - this.status.setRunState(JobStatus.RUNNING); + changeStateTo(JobStatus.RUNNING); JobHistory.JobInfo.logStarted(profile.getJobID()); } }
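The one-line fix replaces a bare field write (status.setRunState) with changeStateTo(JobStatus.RUNNING), funneling the PREP-to-RUNNING transition through a method that can also perform the associated bookkeeping. A minimal, hypothetical sketch of why a single choke point for state changes matters (JobStateHolder and its fields are stand-ins, not branch-1's actual JobInProgress internals):

```java
import java.util.ArrayList;
import java.util.List;

public class JobStateHolder {
    enum JobState { PREP, RUNNING, SUCCEEDED, FAILED, KILLED }

    private JobState state = JobState.PREP;
    // Stand-in for whatever observers (metrics, history, listeners)
    // need to see every transition.
    final List<String> transitions = new ArrayList<>();

    // All transitions go through here, so no observer can be skipped
    // the way a direct field write would skip them.
    void changeStateTo(JobState next) {
        transitions.add(state + "->" + next);
        state = next;
    }

    JobState getState() { return state; }

    public static void main(String[] args) {
        JobStateHolder job = new JobStateHolder();
        job.changeStateTo(JobState.RUNNING);
        System.out.println(job.getState() + " " + job.transitions);
    }
}
```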
svn commit: r1373574 - in /hadoop/common/trunk/hadoop-common-project: hadoop-annotations/ hadoop-auth-examples/ hadoop-auth/ hadoop-common/ hadoop-common/src/test/java/org/apache/hadoop/fs/
Author: tomwhite
Date: Wed Aug 15 19:10:52 2012
New Revision: 1373574
URL: http://svn.apache.org/viewvc?rev=1373574&view=rev
Log: HADOOP-8278. Make sure components declare correct set of dependencies.

Modified:
    hadoop/common/trunk/hadoop-common-project/hadoop-annotations/pom.xml
    hadoop/common/trunk/hadoop-common-project/hadoop-auth-examples/pom.xml
    hadoop/common/trunk/hadoop-common-project/hadoop-auth/pom.xml
    hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt
    hadoop/common/trunk/hadoop-common-project/hadoop-common/pom.xml
    hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/fs/TestFsShellReturnCode.java

Modified: hadoop/common/trunk/hadoop-common-project/hadoop-annotations/pom.xml
URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-annotations/pom.xml?rev=1373574&r1=1373573&r2=1373574&view=diff
==============================================================================
--- hadoop/common/trunk/hadoop-common-project/hadoop-annotations/pom.xml (original)
+++ hadoop/common/trunk/hadoop-common-project/hadoop-annotations/pom.xml Wed Aug 15 19:10:52 2012
@@ -34,7 +34,7 @@
     <dependency>
       <groupId>jdiff</groupId>
       <artifactId>jdiff</artifactId>
-      <scope>compile</scope>
+      <scope>provided</scope>
     </dependency>
   </dependencies>

Modified: hadoop/common/trunk/hadoop-common-project/hadoop-auth-examples/pom.xml
URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-auth-examples/pom.xml?rev=1373574&r1=1373573&r2=1373574&view=diff
==============================================================================
--- hadoop/common/trunk/hadoop-common-project/hadoop-auth-examples/pom.xml (original)
+++ hadoop/common/trunk/hadoop-common-project/hadoop-auth-examples/pom.xml Wed Aug 15 19:10:52 2012
@@ -43,14 +43,19 @@
       <scope>compile</scope>
     </dependency>
     <dependency>
+      <groupId>org.slf4j</groupId>
+      <artifactId>slf4j-api</artifactId>
+      <scope>compile</scope>
+    </dependency>
+    <dependency>
       <groupId>log4j</groupId>
       <artifactId>log4j</artifactId>
-      <scope>compile</scope>
+      <scope>runtime</scope>
     </dependency>
     <dependency>
       <groupId>org.slf4j</groupId>
       <artifactId>slf4j-log4j12</artifactId>
-      <scope>compile</scope>
+      <scope>runtime</scope>
     </dependency>
   </dependencies>

Modified: hadoop/common/trunk/hadoop-common-project/hadoop-auth/pom.xml
URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-auth/pom.xml?rev=1373574&r1=1373573&r2=1373574&view=diff
==============================================================================
--- hadoop/common/trunk/hadoop-common-project/hadoop-auth/pom.xml (original)
+++ hadoop/common/trunk/hadoop-common-project/hadoop-auth/pom.xml Wed Aug 15 19:10:52 2012
@@ -38,6 +38,7 @@
   <dependencies>
     <dependency>
+      <!-- Used, even though 'mvn dependency:analyze' doesn't find it -->
       <groupId>org.apache.hadoop</groupId>
       <artifactId>hadoop-annotations</artifactId>
       <scope>provided</scope>
@@ -75,12 +76,12 @@
     <dependency>
       <groupId>log4j</groupId>
       <artifactId>log4j</artifactId>
-      <scope>compile</scope>
+      <scope>runtime</scope>
     </dependency>
     <dependency>
       <groupId>org.slf4j</groupId>
       <artifactId>slf4j-log4j12</artifactId>
-      <scope>compile</scope>
+      <scope>runtime</scope>
     </dependency>
   </dependencies>

Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt
URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1373574&r1=1373573&r2=1373574&view=diff
==============================================================================
--- hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt (original)
+++ hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt Wed Aug 15 19:10:52 2012
@@ -289,6 +289,9 @@ Branch-2 ( Unreleased changes )
 
     HADOOP-8687. Upgrade log4j to 1.2.17. (eli)
 
+    HADOOP-8278. Make sure components declare correct set of dependencies.
+    (tomwhite)
+
   BUG FIXES
 
     HADOOP-8372. NetUtils.normalizeHostName() incorrectly handles hostname

Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/pom.xml
URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/pom.xml?rev=1373574&r1=1373573&r2=1373574&view=diff
==============================================================================
--- hadoop/common/trunk/hadoop-common-project/hadoop-common/pom.xml (original)
+++ hadoop/common/trunk/hadoop-common-project/hadoop-common/pom.xml Wed Aug 15 19:10:52 2012
@@ -74,13 +74,13 @@
       <scope>compile</scope>
     </dependency>
     <dependency>
-      <groupId>commons-net</groupId>
-      <artifactId>commons-net</artifactId>
+      <groupId>commons-io</groupId>
+      <artifactId>commons-io</artifactId>
       <scope>compile</scope>
     </dependency>
     <dependency>
-      <groupId>commons-io</groupId>
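The scope changes above follow the standard Maven model: an API the code compiles against stays `compile` scope, while a logging binding that is only located at run time can move to `runtime` scope. A minimal, hypothetical Java sketch of that split — the `Logger` facade and `ConsoleLogger` binding are stand-ins invented for illustration, not Hadoop or SLF4J classes:

```java
// Hypothetical facade: code compiles against this interface only
// (analogous to slf4j-api being a compile-scope dependency).
interface Logger {
    void info(String msg);
}

public class ScopeDemo {
    // The concrete binding is located reflectively at run time, so the
    // artifact providing it (analogous to slf4j-log4j12/log4j) is only
    // needed on the runtime classpath, not at compile time.
    static Logger bind(String implClassName) {
        try {
            return (Logger) Class.forName(implClassName)
                    .getDeclaredConstructor().newInstance();
        } catch (ReflectiveOperationException e) {
            throw new IllegalStateException("No logging binding found", e);
        }
    }

    // A trivial binding standing in for the runtime-scope artifact.
    public static class ConsoleLogger implements Logger {
        public void info(String msg) { System.out.println("INFO " + msg); }
    }

    public static void main(String[] args) {
        Logger log = bind("ScopeDemo$ConsoleLogger");
        log.info("bound at runtime");
    }
}
```

If the binding were removed from the runtime classpath, `bind` would fail at run time but the code would still compile — which is exactly what the `runtime` scope declares.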
svn commit: r1373574 - /hadoop/common/trunk/pom.xml
Author: tomwhite
Date: Wed Aug 15 19:10:52 2012
New Revision: 1373574
URL: http://svn.apache.org/viewvc?rev=1373574&view=rev
Log: HADOOP-8278. Make sure components declare correct set of dependencies.

Modified:
    hadoop/common/trunk/pom.xml

Modified: hadoop/common/trunk/pom.xml
URL: http://svn.apache.org/viewvc/hadoop/common/trunk/pom.xml?rev=1373574&r1=1373573&r2=1373574&view=diff
==============================================================================
--- hadoop/common/trunk/pom.xml (original)
+++ hadoop/common/trunk/pom.xml Wed Aug 15 19:10:52 2012
@@ -365,6 +365,18 @@ xsi:schemaLocation="http://maven.apache.
           </reportSets>
         </plugin>
+        <plugin>
+          <groupId>org.apache.maven.plugins</groupId>
+          <artifactId>maven-dependency-plugin</artifactId>
+          <version>2.4</version>
+          <reportSets>
+            <reportSet>
+              <reports>
+                <report>analyze-report</report>
+              </reports>
+            </reportSet>
+          </reportSets>
+        </plugin>
       </plugins>
   </reporting>
svn commit: r1373594 - /hadoop/common/branches/branch-2/pom.xml
Author: tomwhite
Date: Wed Aug 15 19:28:51 2012
New Revision: 1373594
URL: http://svn.apache.org/viewvc?rev=1373594&view=rev
Log: Merge -r 1373573:1373574 from trunk to branch-2. Fixes: HADOOP-8278 - Make sure components declare correct set of dependencies

Modified:
    hadoop/common/branches/branch-2/pom.xml

Modified: hadoop/common/branches/branch-2/pom.xml
URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/pom.xml?rev=1373594&r1=1373593&r2=1373594&view=diff
==============================================================================
--- hadoop/common/branches/branch-2/pom.xml (original)
+++ hadoop/common/branches/branch-2/pom.xml Wed Aug 15 19:28:51 2012
@@ -258,6 +258,18 @@ xsi:schemaLocation="http://maven.apache.
           </reportSets>
         </plugin>
+        <plugin>
+          <groupId>org.apache.maven.plugins</groupId>
+          <artifactId>maven-dependency-plugin</artifactId>
+          <version>2.4</version>
+          <reportSets>
+            <reportSet>
+              <reports>
+                <report>analyze-report</report>
+              </reports>
+            </reportSet>
+          </reportSets>
+        </plugin>
       </plugins>
   </reporting>
svn commit: r1373594 - in /hadoop/common/branches/branch-2/hadoop-common-project: hadoop-annotations/ hadoop-auth-examples/ hadoop-auth/ hadoop-common/ hadoop-common/src/test/java/org/apache/hadoop/fs
Author: tomwhite Date: Wed Aug 15 19:28:51 2012 New Revision: 1373594 URL: http://svn.apache.org/viewvc?rev=1373594view=rev Log: Merge -r 1373573:1373574 from trunk to branch-2. Fixes: HADOOP-8278 - Make sure components declare correct set of dependencies Modified: hadoop/common/branches/branch-2/hadoop-common-project/hadoop-annotations/pom.xml hadoop/common/branches/branch-2/hadoop-common-project/hadoop-auth-examples/pom.xml hadoop/common/branches/branch-2/hadoop-common-project/hadoop-auth/pom.xml hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/pom.xml hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/fs/TestFsShellReturnCode.java Modified: hadoop/common/branches/branch-2/hadoop-common-project/hadoop-annotations/pom.xml URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-annotations/pom.xml?rev=1373594r1=1373593r2=1373594view=diff == --- hadoop/common/branches/branch-2/hadoop-common-project/hadoop-annotations/pom.xml (original) +++ hadoop/common/branches/branch-2/hadoop-common-project/hadoop-annotations/pom.xml Wed Aug 15 19:28:51 2012 @@ -34,7 +34,7 @@ dependency groupIdjdiff/groupId artifactIdjdiff/artifactId - scopecompile/scope + scopeprovided/scope /dependency /dependencies Modified: hadoop/common/branches/branch-2/hadoop-common-project/hadoop-auth-examples/pom.xml URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-auth-examples/pom.xml?rev=1373594r1=1373593r2=1373594view=diff == --- hadoop/common/branches/branch-2/hadoop-common-project/hadoop-auth-examples/pom.xml (original) +++ hadoop/common/branches/branch-2/hadoop-common-project/hadoop-auth-examples/pom.xml Wed Aug 15 19:28:51 2012 @@ -43,14 +43,19 @@ scopecompile/scope /dependency dependency + groupIdorg.slf4j/groupId + artifactIdslf4j-api/artifactId + scopecompile/scope 
+/dependency +dependency groupIdlog4j/groupId artifactIdlog4j/artifactId - scopecompile/scope + scoperuntime/scope /dependency dependency groupIdorg.slf4j/groupId artifactIdslf4j-log4j12/artifactId - scopecompile/scope + scoperuntime/scope /dependency /dependencies Modified: hadoop/common/branches/branch-2/hadoop-common-project/hadoop-auth/pom.xml URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-auth/pom.xml?rev=1373594r1=1373593r2=1373594view=diff == --- hadoop/common/branches/branch-2/hadoop-common-project/hadoop-auth/pom.xml (original) +++ hadoop/common/branches/branch-2/hadoop-common-project/hadoop-auth/pom.xml Wed Aug 15 19:28:51 2012 @@ -38,6 +38,7 @@ dependencies dependency + !-- Used, even though 'mvn dependency:analyze' doesn't find it -- groupIdorg.apache.hadoop/groupId artifactIdhadoop-annotations/artifactId scopeprovided/scope @@ -75,12 +76,12 @@ dependency groupIdlog4j/groupId artifactIdlog4j/artifactId - scopecompile/scope + scoperuntime/scope /dependency dependency groupIdorg.slf4j/groupId artifactIdslf4j-log4j12/artifactId - scopecompile/scope + scoperuntime/scope /dependency /dependencies Modified: hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1373594r1=1373593r2=1373594view=diff == --- hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt (original) +++ hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt Wed Aug 15 19:28:51 2012 @@ -86,6 +86,9 @@ Release 2.0.1-alpha - UNRELEASED HADOOP-8687. Upgrade log4j to 1.2.17. (eli) +HADOOP-8278. Make sure components declare correct set of dependencies. +(tomwhite) + BUG FIXES HADOOP-8372. 
NetUtils.normalizeHostName() incorrectly handles hostname Modified: hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/pom.xml URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/pom.xml?rev=1373594r1=1373593r2=1373594view=diff == --- hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/pom.xml (original) +++ hadoop/common/branches/branch-2/hadoop-common-project
svn commit: r1372541 - in /hadoop/common/branches/branch-1: ./ src/mapred/ src/mapred/org/apache/hadoop/mapred/ src/mapred/org/apache/hadoop/mapreduce/ src/test/org/apache/hadoop/mapreduce/
Author: tomwhite Date: Mon Aug 13 18:34:33 2012 New Revision: 1372541 URL: http://svn.apache.org/viewvc?rev=1372541view=rev Log: MAPREDUCE-4488. Port MAPREDUCE-463 (The job setup and cleanup tasks should be optional) to branch-1. Added: hadoop/common/branches/branch-1/src/test/org/apache/hadoop/mapreduce/TestNoJobSetupCleanup.java (with props) Modified: hadoop/common/branches/branch-1/CHANGES.txt hadoop/common/branches/branch-1/src/mapred/mapred-default.xml hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/JobInProgress.java hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapreduce/Job.java hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapreduce/JobContext.java Modified: hadoop/common/branches/branch-1/CHANGES.txt URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-1/CHANGES.txt?rev=1372541r1=1372540r2=1372541view=diff == --- hadoop/common/branches/branch-1/CHANGES.txt (original) +++ hadoop/common/branches/branch-1/CHANGES.txt Mon Aug 13 18:34:33 2012 @@ -27,6 +27,9 @@ Release 1.2.0 - unreleased module external to HDFS to specify how HDFS blocks should be placed. (Sumadhur Reddy Bolli via szetszwo) +MAPREDUCE-4488. Port MAPREDUCE-463 (The job setup and cleanup tasks +should be optional) to branch-1. (tomwhite) + IMPROVEMENTS HDFS-3515. Port HDFS-1457 to branch-1. (eli) Modified: hadoop/common/branches/branch-1/src/mapred/mapred-default.xml URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-1/src/mapred/mapred-default.xml?rev=1372541r1=1372540r2=1372541view=diff == --- hadoop/common/branches/branch-1/src/mapred/mapred-default.xml (original) +++ hadoop/common/branches/branch-1/src/mapred/mapred-default.xml Mon Aug 13 18:34:33 2012 @@ -35,6 +35,14 @@ /description /property +property + namemapred.committer.job.setup.cleanup.needed/name + valuetrue/value + description true, if job needs job-setup and job-cleanup. 
+false, otherwise + /description +/property + !-- i/o properties -- property Modified: hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/JobInProgress.java URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/JobInProgress.java?rev=1372541r1=1372540r2=1372541view=diff == --- hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/JobInProgress.java (original) +++ hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/JobInProgress.java Mon Aug 13 18:34:33 2012 @@ -131,6 +131,7 @@ public class JobInProgress { private volatile boolean launchedSetup = false; private volatile boolean jobKilled = false; private volatile boolean jobFailed = false; + private boolean jobSetupCleanupNeeded = true; JobPriority priority = JobPriority.NORMAL; final JobTracker jobtracker; @@ -361,6 +362,8 @@ public class JobInProgress { this.taskCompletionEvents = new ArrayListTaskCompletionEvent (numMapTasks + numReduceTasks + 10); +JobContext jobContext = new JobContext(conf, jobId); +this.jobSetupCleanupNeeded = jobContext.getJobSetupCleanupNeeded(); try { this.userUGI = UserGroupInformation.getCurrentUser(); } catch (IOException ie){ @@ -449,6 +452,9 @@ public class JobInProgress { this.taskCompletionEvents = new ArrayListTaskCompletionEvent (numMapTasks + numReduceTasks + 10); + + JobContext jobContext = new JobContext(conf, jobId); + this.jobSetupCleanupNeeded = jobContext.getJobSetupCleanupNeeded(); // Construct the jobACLs status.setJobACLs(jobtracker.getJobACLsManager().constructJobACLs(conf)); @@ -757,7 +763,35 @@ public class JobInProgress { // ... 
use the same for estimating the total output of all maps resourceEstimator.setThreshhold(completedMapsForReduceSlowstart); + +initSetupCleanupTasks(jobFile); + +synchronized(jobInitKillStatus){ + jobInitKillStatus.initDone = true; + + // set this before the throw to make sure cleanup works properly + tasksInited = true; + + if(jobInitKillStatus.killed) { +throw new KillInterruptedException(Job + jobId + killed in init); + } +} + +JobHistory.JobInfo.logInited(profile.getJobID(), this.launchTime, + numMapTasks, numReduceTasks); +// if setup is not needed, mark it complete +if (!jobSetupCleanupNeeded) { + setupComplete(); +} + } + + private void initSetupCleanupTasks(String jobFile) { +if (!jobSetupCleanupNeeded) { + // nothing to initialize + return; +} + // create cleanup two cleanup tips, one map and one reduce. cleanup
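MAPREDUCE-4488 makes job setup and cleanup tasks optional, gated on the new `mapred.committer.job.setup.cleanup.needed` property: when it is false, `initSetupCleanupTasks` returns without creating the setup/cleanup TIPs and setup is marked complete immediately. A hedged sketch of that control flow, with `java.util.Properties` standing in for `JobConf` and the return strings as illustrative labels rather than real `JobInProgress` values:

```java
import java.util.Properties;

public class SetupCleanupDemo {
    // Key from the patch's mapred-default.xml entry; defaults to true.
    static final String KEY = "mapred.committer.job.setup.cleanup.needed";

    static boolean setupCleanupNeeded(Properties conf) {
        return Boolean.parseBoolean(conf.getProperty(KEY, "true"));
    }

    // Mirrors the shape of JobInProgress.initTasks() after the patch:
    // when setup/cleanup is not needed, skip creating the setup/cleanup
    // tasks and mark setup complete right away.
    static String initTasks(Properties conf) {
        if (!setupCleanupNeeded(conf)) {
            return "setup-complete";      // setupComplete() called directly
        }
        return "setup-tasks-created";     // initSetupCleanupTasks(jobFile)
    }
}
```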
svn commit: r1357831 - in /hadoop/common/branches/branch-1: CHANGES.txt src/mapred/org/apache/hadoop/mapred/Counters.java
Author: tomwhite
Date: Thu Jul 5 19:43:26 2012
New Revision: 1357831
URL: http://svn.apache.org/viewvc?rev=1357831&view=rev
Log: MAPREDUCE-4359. Potential deadlock in Counters.

Modified:
    hadoop/common/branches/branch-1/CHANGES.txt
    hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/Counters.java

Modified: hadoop/common/branches/branch-1/CHANGES.txt
URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-1/CHANGES.txt?rev=1357831&r1=1357830&r2=1357831&view=diff
==============================================================================
--- hadoop/common/branches/branch-1/CHANGES.txt (original)
+++ hadoop/common/branches/branch-1/CHANGES.txt Thu Jul 5 19:43:26 2012
@@ -56,6 +56,8 @@ Release 1.2.0 - unreleased
     MAPREDUCE-4385. FairScheduler.maxTasksToAssign() should check for
     fairscheduler.assignmultiple.maps < TaskTracker.availableSlots (kkambatl via tucu)
 
+    MAPREDUCE-4359. Potential deadlock in Counters. (tomwhite)
+
 Release 1.1.0 - unreleased
 
   INCOMPATIBLE CHANGES

Modified: hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/Counters.java
URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/Counters.java?rev=1357831&r1=1357830&r2=1357831&view=diff
==============================================================================
--- hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/Counters.java (original)
+++ hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/Counters.java Thu Jul 5 19:43:26 2012
@@ -295,7 +295,7 @@ public class Counters implements Writabl
      * @deprecated use {@link #getCounter(String)} instead
      */
     @Deprecated
-    public synchronized Counter getCounter(int id, String name) {
+    public Counter getCounter(int id, String name) {
       return getCounterForName(name);
     }
 
@@ -304,23 +304,27 @@ public class Counters implements Writabl
      * @param name the internal counter name
      * @return the counter
      */
-    public synchronized Counter getCounterForName(String name) {
-      String shortName = getShortName(name, COUNTER_NAME_LIMIT);
-      Counter result = subcounters.get(shortName);
-      if (result == null) {
-        if (LOG.isDebugEnabled()) {
-          LOG.debug("Adding " + shortName);
-        }
-        numCounters = (numCounters == 0) ? Counters.this.size(): numCounters;
-        if (numCounters >= MAX_COUNTER_LIMIT) {
-          throw new CountersExceededException("Error: Exceeded limits on number of counters - "
-              + "Counters=" + numCounters + " Limit=" + MAX_COUNTER_LIMIT);
+    public Counter getCounterForName(String name) {
+      synchronized(Counters.this) { // lock ordering: Counters then Group
+        synchronized (this) {
+          String shortName = getShortName(name, COUNTER_NAME_LIMIT);
+          Counter result = subcounters.get(shortName);
+          if (result == null) {
+            if (LOG.isDebugEnabled()) {
+              LOG.debug("Adding " + shortName);
+            }
+            numCounters = (numCounters == 0) ? Counters.this.size(): numCounters;
+            if (numCounters >= MAX_COUNTER_LIMIT) {
+              throw new CountersExceededException("Error: Exceeded limits on number of counters - "
+                  + "Counters=" + numCounters + " Limit=" + MAX_COUNTER_LIMIT);
+            }
+            result = new Counter(shortName, localize(shortName + ".name", shortName), 0L);
+            subcounters.put(shortName, result);
+            numCounters++;
+          }
+          return result;
         }
-        result = new Counter(shortName, localize(shortName + ".name", shortName), 0L);
-        subcounters.put(shortName, result);
-        numCounters++;
       }
-      return result;
     }
 
     /**
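The fix above replaces `synchronized` methods with nested synchronized blocks acquired in one fixed order ("Counters then Group"), so every code path takes the two monitors consistently and the A-then-B vs B-then-A cycle that causes deadlock cannot form. A self-contained sketch of that lock-ordering discipline, with `Outer`/`Inner` as stand-ins for `Counters` and `Counters.Group`:

```java
public class LockOrderDemo {
    static final class Outer {
        final Inner inner = new Inner();
    }
    static final class Inner {
        int count;
    }

    // Every writer takes the Outer monitor first, then the Inner one --
    // the same discipline as "lock ordering: Counters then Group" in the
    // patch. A single global order prevents deadlock between threads.
    static void increment(Outer o) {
        synchronized (o) {
            synchronized (o.inner) {
                o.inner.count++;
            }
        }
    }

    public static void main(String[] args) throws InterruptedException {
        Outer o = new Outer();
        Runnable r = () -> { for (int i = 0; i < 10_000; i++) increment(o); };
        Thread t1 = new Thread(r), t2 = new Thread(r);
        t1.start(); t2.start();
        t1.join(); t2.join();
        System.out.println(o.inner.count); // 20000: no lost updates, no deadlock
    }
}
```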
svn commit: r1356904 - in /hadoop/common/branches/branch-1: ./ src/mapred/org/apache/hadoop/mapred/ src/test/org/apache/hadoop/mapred/
Author: tomwhite Date: Tue Jul 3 20:07:12 2012 New Revision: 1356904 URL: http://svn.apache.org/viewvc?rev=1356904view=rev Log: MAPREDUCE-3837. Job tracker is not able to recover job in case of crash and after that no user can submit job. Contributed by Mayank Bansal. Modified: hadoop/common/branches/branch-1/CHANGES.txt hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/JobTracker.java hadoop/common/branches/branch-1/src/test/org/apache/hadoop/mapred/TestJobTrackerRestartWithLostTracker.java hadoop/common/branches/branch-1/src/test/org/apache/hadoop/mapred/TestJobTrackerSafeMode.java hadoop/common/branches/branch-1/src/test/org/apache/hadoop/mapred/TestRecoveryManager.java Modified: hadoop/common/branches/branch-1/CHANGES.txt URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-1/CHANGES.txt?rev=1356904r1=1356903r2=1356904view=diff == --- hadoop/common/branches/branch-1/CHANGES.txt (original) +++ hadoop/common/branches/branch-1/CHANGES.txt Tue Jul 3 20:07:12 2012 @@ -43,6 +43,9 @@ Release 1.2.0 - unreleased HADOOP-8249. invalid hadoop-auth cookies should trigger authentication if info is avail before returning HTTP 401 (tucu) +MAPREDUCE-3837. Job tracker is not able to recover job in case of crash +and after that no user can submit job. 
(Mayank Bansal via tomwhite) + Release 1.1.0 - unreleased INCOMPATIBLE CHANGES Modified: hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/JobTracker.java URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/JobTracker.java?rev=1356904r1=1356903r2=1356904view=diff == --- hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/JobTracker.java (original) +++ hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/JobTracker.java Tue Jul 3 20:07:12 2012 @@ -205,6 +205,7 @@ public class JobTracker implements MRCon State state = State.INITIALIZING; private static final int FS_ACCESS_RETRY_PERIOD = 1; static final String JOB_INFO_FILE = job-info; + static final String JOB_TOKEN_FILE = jobToken; private DNSToSwitchMapping dnsToSwitchMapping; private NetworkTopology clusterMap = new NetworkTopology(); private int numTaskCacheLevels; // the max level to which we cache tasks @@ -1215,179 +1216,6 @@ public class JobTracker implements MRCon /** A custom listener that replays the events in the order in which the * events (task attempts) occurred. */ -class JobRecoveryListener implements Listener { - // The owner job - private JobInProgress jip; - - private JobHistory.JobInfo job; // current job's info object - - // Maintain the count of the (attempt) events recovered - private int numEventsRecovered = 0; - - // Maintains open transactions - private MapString, String hangingAttempts = -new HashMapString, String(); - - // Whether there are any updates for this job - private boolean hasUpdates = false; - - public JobRecoveryListener(JobInProgress jip) { -this.jip = jip; -this.job = new JobHistory.JobInfo(jip.getJobID().toString()); - } - - /** - * Process a task. Note that a task might commit a previously pending - * transaction. 
- */ - private void processTask(String taskId, JobHistory.Task task) { -// Any TASK info commits the previous transaction -boolean hasHanging = hangingAttempts.remove(taskId) != null; -if (hasHanging) { - numEventsRecovered += 2; -} - -TaskID id = TaskID.forName(taskId); -TaskInProgress tip = getTip(id); - -updateTip(tip, task); - } - - /** - * Adds a task-attempt in the listener - */ - private void processTaskAttempt(String taskAttemptId, - JobHistory.TaskAttempt attempt) -throws UnknownHostException { -TaskAttemptID id = TaskAttemptID.forName(taskAttemptId); - -// Check if the transaction for this attempt can be committed -String taskStatus = attempt.get(Keys.TASK_STATUS); -TaskAttemptID taskID = TaskAttemptID.forName(taskAttemptId); -JobInProgress jip = getJob(taskID.getJobID()); -JobStatus prevStatus = (JobStatus)jip.getStatus().clone(); - -if (taskStatus.length() 0) { - // This means this is an update event - if (taskStatus.equals(Values.SUCCESS.name())) { -// Mark this attempt as hanging -hangingAttempts.put(id.getTaskID().toString(), taskAttemptId); -addSuccessfulAttempt(jip, id, attempt); - } else { -addUnsuccessfulAttempt(jip, id, attempt
svn commit: r1337200 - in /hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src: main/java/org/apache/hadoop/security/token/ test/java/org/apache/hadoop/security/token/ test/resourc
Author: tomwhite Date: Fri May 11 15:04:24 2012 New Revision: 1337200 URL: http://svn.apache.org/viewvc?rev=1337200view=rev Log: Merge -r 1337198:1337199 from trunk to branch-2. Fixes: MAPREDUCE-4148. Added: hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/resources/META-INF/ - copied from r1337199, hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/resources/META-INF/ hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/resources/META-INF/services/ - copied from r1337199, hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/resources/META-INF/services/ hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/resources/META-INF/services/org.apache.hadoop.security.token.TokenIdentifier - copied unchanged from r1337199, hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/resources/META-INF/services/org.apache.hadoop.security.token.TokenIdentifier Modified: hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/token/Token.java hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/security/token/TestToken.java Modified: hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/token/Token.java URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/token/Token.java?rev=1337200r1=1337199r2=1337200view=diff == --- hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/token/Token.java (original) +++ hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/token/Token.java Fri May 11 15:04:24 2012 @@ -18,10 +18,15 @@ package org.apache.hadoop.security.token; +import com.google.common.collect.Maps; + +import 
java.io.ByteArrayInputStream; import java.io.DataInput; +import java.io.DataInputStream; import java.io.DataOutput; import java.io.IOException; import java.util.Arrays; +import java.util.Map; import java.util.ServiceLoader; import org.apache.commons.codec.binary.Base64; @@ -37,6 +42,7 @@ import org.apache.hadoop.io.Text; import org.apache.hadoop.io.Writable; import org.apache.hadoop.io.WritableComparator; import org.apache.hadoop.io.WritableUtils; +import org.apache.hadoop.util.ReflectionUtils; /** * The client-side form of the token. @@ -45,6 +51,9 @@ import org.apache.hadoop.io.WritableUtil @InterfaceStability.Evolving public class TokenT extends TokenIdentifier implements Writable { public static final Log LOG = LogFactory.getLog(Token.class); + + private static MapText, Class? extends TokenIdentifier tokenKindMap; + private byte[] identifier; private byte[] password; private Text kind; @@ -100,13 +109,49 @@ public class TokenT extends TokenIdenti } /** - * Get the token identifier - * @return the token identifier + * Get the token identifier's byte representation + * @return the token identifier's byte representation */ public byte[] getIdentifier() { return identifier; } + private static synchronized Class? extends TokenIdentifier + getClassForIdentifier(Text kind) { +if (tokenKindMap == null) { + tokenKindMap = Maps.newHashMap(); + for (TokenIdentifier id : ServiceLoader.load(TokenIdentifier.class)) { +tokenKindMap.put(id.getKind(), id.getClass()); + } +} +Class? extends TokenIdentifier cls = tokenKindMap.get(kind); +if (cls == null) { + LOG.warn(Cannot find class for token kind + kind); + return null; +} +return cls; + } + + /** + * Get the token identifier object, or null if it could not be constructed + * (because the class could not be loaded, for example). + * @return the token identifier, or null + * @throws IOException + */ + @SuppressWarnings(unchecked) + public T decodeIdentifier() throws IOException { +Class? 
extends TokenIdentifier cls = getClassForIdentifier(getKind()); +if (cls == null) { + return null; +} +TokenIdentifier tokenIdentifier = ReflectionUtils.newInstance(cls, null); +ByteArrayInputStream buf = new ByteArrayInputStream(identifier); +DataInputStream in = new DataInputStream(buf); +tokenIdentifier.readFields(in); +in.close(); +return (T) tokenIdentifier; + } + /** * Get the token password/secret * @return the token password/secret @@ -260,16 +305,31 @@ public class TokenT extends TokenIdenti buffer.append(num); } } + + private void identifierToString(StringBuilder buffer) { +T id = null; +try { + id = decodeIdentifier(); +} catch (IOException e
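The Token patch above builds a lazily initialized map from token kind to `TokenIdentifier` class so `decodeIdentifier()` can reconstruct the identifier object. A simplified sketch of that `getClassForIdentifier()` logic; the real code discovers implementations with `ServiceLoader.load(TokenIdentifier.class)`, while this version takes the discovered instances as a parameter so the sketch runs without a `META-INF/services` registration:

```java
import java.util.HashMap;
import java.util.Map;

public class TokenKindMapDemo {
    // Stand-in for org.apache.hadoop.security.token.TokenIdentifier.
    public interface TokenIdentifier {
        String getKind();
    }

    private static Map<String, Class<? extends TokenIdentifier>> tokenKindMap;

    // Mirrors Token.getClassForIdentifier(): build the kind -> class map
    // lazily, on first lookup, from the discovered implementations;
    // return null when the kind is unknown (caller then returns null
    // from decodeIdentifier instead of failing).
    static synchronized Class<? extends TokenIdentifier> classForKind(
            String kind, Iterable<TokenIdentifier> discovered) {
        if (tokenKindMap == null) {
            tokenKindMap = new HashMap<>();
            for (TokenIdentifier id : discovered) {
                tokenKindMap.put(id.getKind(), id.getClass());
            }
        }
        return tokenKindMap.get(kind);
    }
}
```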
svn commit: r1335085 - in /hadoop/common/trunk/hadoop-common-project/hadoop-common: ./ src/main/java/org/apache/hadoop/fs/ src/test/java/org/apache/hadoop/fs/
Author: tomwhite
Date: Mon May 7 16:00:56 2012
New Revision: 1335085
URL: http://svn.apache.org/viewvc?rev=1335085&view=rev
Log: HADOOP-8328. Duplicate FileSystem Statistics object for 'file' scheme.

Modified:
    hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt
    hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/FilterFileSystem.java
    hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/LocalFileSystem.java
    hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/fs/TestLocalFileSystem.java

Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt
URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1335085&r1=1335084&r2=1335085&view=diff
==============================================================================
--- hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt (original)
+++ hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt Mon May 7 16:00:56 2012
@@ -423,6 +423,9 @@ Release 2.0.0 - UNRELEASED
     HADOOP-8349. ViewFS doesn't work when the root of a file system is mounted. (atm)
 
+    HADOOP-8328. Duplicate FileSystem Statistics object for 'file' scheme.
+    (tomwhite)
+
   BREAKDOWN OF HADOOP-7454 SUBTASKS
 
     HADOOP-7455. HA: Introduce HA Service Protocol Interface. (suresh)

Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/FilterFileSystem.java
URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/FilterFileSystem.java?rev=1335085&r1=1335084&r2=1335085&view=diff
==============================================================================
--- hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/FilterFileSystem.java (original)
+++ hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/FilterFileSystem.java Mon May 7 16:00:56 2012
@@ -53,7 +53,7 @@ import org.apache.hadoop.util.Progressab
 public class FilterFileSystem extends FileSystem {
   
   protected FileSystem fs;
-  private String swapScheme;
+  protected String swapScheme;
   
   /*
    * so that extending classes can define it

Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/LocalFileSystem.java
URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/LocalFileSystem.java?rev=1335085&r1=1335084&r2=1335085&view=diff
==============================================================================
--- hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/LocalFileSystem.java (original)
+++ hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/LocalFileSystem.java Mon May 7 16:00:56 2012
@@ -39,6 +39,17 @@ public class LocalFileSystem extends Che
   public LocalFileSystem() {
     this(new RawLocalFileSystem());
   }
+  
+  @Override
+  public void initialize(URI name, Configuration conf) throws IOException {
+    if (fs.getConf() == null) {
+      fs.initialize(name, conf);
+    }
+    String scheme = name.getScheme();
+    if (!scheme.equals(fs.getUri().getScheme())) {
+      swapScheme = scheme;
+    }
+  }
 
   /**
    * Return the protocol scheme for the FileSystem.

Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/fs/TestLocalFileSystem.java
URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/fs/TestLocalFileSystem.java?rev=1335085&r1=1335084&r2=1335085&view=diff
==============================================================================
--- hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/fs/TestLocalFileSystem.java (original)
+++ hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/fs/TestLocalFileSystem.java Mon May 7 16:00:56 2012
@@ -18,11 +18,14 @@
 package org.apache.hadoop.fs;
 
 import org.apache.hadoop.conf.Configuration;
+import org.apache.hadoop.fs.FileSystem.Statistics;
+
 import static org.apache.hadoop.fs.FileSystemTestHelper.*;
 
 import java.io.*;
 
 import static org.junit.Assert.*;
+
 import org.junit.Before;
 import org.junit.Test;
 
@@ -233,4 +236,16 @@ public class TestLocalFileSystem {
     assertTrue("Did not delete file", fs.delete(file1));
     assertTrue("Did not delete non-empty dir", fs.delete(dir1));
   }
+  
+  @Test
+  public void testStatistics() throws Exception {
+    FileSystem.getLocal(new Configuration());
+    int fileSchemeCount = 0;
+    for (Statistics stats : FileSystem.getAllStatistics()) {
+      if (stats.getScheme().equals("file
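The invariant HADOOP-8328 restores is that every `FileSystem` instance for a given URI scheme shares a single `Statistics` object, rather than the 'file' scheme accumulating a duplicate entry. A toy registry showing that one-object-per-scheme invariant; the `Statistics` class here is an invented stand-in, not the Hadoop one:

```java
import java.util.HashMap;
import java.util.Map;

public class StatisticsRegistryDemo {
    // Minimal stand-in for FileSystem.Statistics.
    static final class Statistics {
        final String scheme;
        long bytesRead;
        Statistics(String scheme) { this.scheme = scheme; }
    }

    private static final Map<String, Statistics> STATS = new HashMap<>();

    // One Statistics object per scheme: repeated lookups for "file"
    // return the same instance instead of creating a duplicate.
    static synchronized Statistics forScheme(String scheme) {
        return STATS.computeIfAbsent(scheme, Statistics::new);
    }
}
```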
svn commit: r1332479 - /hadoop/common/trunk/dev-support/test-patch.sh
Author: tomwhite
Date: Tue May 1 00:21:38 2012
New Revision: 1332479

URL: http://svn.apache.org/viewvc?rev=1332479&view=rev
Log:
HADOOP-8308. Support cross-project Jenkins builds.

Modified:
    hadoop/common/trunk/dev-support/test-patch.sh

Modified: hadoop/common/trunk/dev-support/test-patch.sh
URL: http://svn.apache.org/viewvc/hadoop/common/trunk/dev-support/test-patch.sh?rev=1332479&r1=1332478&r2=1332479&view=diff
==============================================================================
--- hadoop/common/trunk/dev-support/test-patch.sh (original)
+++ hadoop/common/trunk/dev-support/test-patch.sh Tue May 1 00:21:38 2012
@@ -389,10 +389,10 @@ checkJavadocWarnings () {
   echo ""
   echo "$MVN clean test javadoc:javadoc -DskipTests -Pdocs -D${PROJECT_NAME}PatchProcess > $PATCH_DIR/patchJavadocWarnings.txt 2>&1"
   if [ -d hadoop-project ]; then
-    (cd hadoop-project; $MVN install)
+    (cd hadoop-project; $MVN install > /dev/null 2>&1)
   fi
   if [ -d hadoop-common-project/hadoop-annotations ]; then
-    (cd hadoop-common-project/hadoop-annotations; $MVN install)
+    (cd hadoop-common-project/hadoop-annotations; $MVN install > /dev/null 2>&1)
   fi
   $MVN clean test javadoc:javadoc -DskipTests -Pdocs -D${PROJECT_NAME}PatchProcess > $PATCH_DIR/patchJavadocWarnings.txt 2>&1
   javadocWarnings=`$GREP '\[WARNING\]' $PATCH_DIR/patchJavadocWarnings.txt | $AWK '/Javadoc Warnings/,EOF' | $GREP warning | $AWK 'BEGIN {total = 0} {total += 1} END {print total}'`
@@ -472,8 +472,8 @@ checkReleaseAuditWarnings () {
   echo "=============================================================================="
   echo ""
   echo ""
-  echo "$MVN apache-rat:check -D${PROJECT_NAME}PatchProcess 2>&1"
-  $MVN apache-rat:check -D${PROJECT_NAME}PatchProcess 2>&1
+  echo "$MVN apache-rat:check -D${PROJECT_NAME}PatchProcess > $PATCH_DIR/patchReleaseAuditOutput.txt 2>&1"
+  $MVN apache-rat:check -D${PROJECT_NAME}PatchProcess > $PATCH_DIR/patchReleaseAuditOutput.txt 2>&1
   find $BASEDIR -name rat.txt | xargs cat > $PATCH_DIR/patchReleaseAuditWarnings.txt

   ### Compare trunk and patch release audit warning numbers
@@ -548,10 +548,21 @@ checkFindbugsWarnings () {
   echo "=============================================================================="
   echo ""
   echo ""
-  echo "$MVN clean test findbugs:findbugs -DskipTests -D${PROJECT_NAME}PatchProcess"
-  $MVN clean test findbugs:findbugs -DskipTests -D${PROJECT_NAME}PatchProcess < /dev/null
+
+  modules=$(findModules)
+  rc=0
+  for module in $modules;
+  do
+    cd $module
+    echo "  Running findbugs in $module"
+    module_suffix=`basename ${module}`
+    echo "  $MVN clean test findbugs:findbugs -DskipTests -D${PROJECT_NAME}PatchProcess < /dev/null > $PATCH_DIR/patchFindBugsOutput${module_suffix}.txt 2>&1"
+    $MVN clean test findbugs:findbugs -DskipTests -D${PROJECT_NAME}PatchProcess < /dev/null > $PATCH_DIR/patchFindBugsOutput${module_suffix}.txt 2>&1
+    (( rc = rc + $? ))
+    cd -
+  done
-  if [ $? != 0 ] ; then
+  if [ $rc != 0 ] ; then
     JIRA_COMMENT="$JIRA_COMMENT

    -1 findbugs.  The patch appears to cause Findbugs (version ${findbugs_version}) to fail."
@@ -610,8 +621,8 @@ checkEclipseGeneration () {
   echo ""
   echo ""

-  echo "$MVN eclipse:eclipse -D${PROJECT_NAME}PatchProcess"
-  $MVN eclipse:eclipse -D${PROJECT_NAME}PatchProcess
+  echo "$MVN eclipse:eclipse -D${PROJECT_NAME}PatchProcess > $PATCH_DIR/patchEclipseOutput.txt 2>&1"
+  $MVN eclipse:eclipse -D${PROJECT_NAME}PatchProcess > $PATCH_DIR/patchEclipseOutput.txt 2>&1
   if [[ $? != 0 ]] ; then
     JIRA_COMMENT="$JIRA_COMMENT
@@ -639,16 +650,28 @@ runTests () {
   echo ""
   echo ""

-  echo "$MVN clean install -fn -Pnative -D${PROJECT_NAME}PatchProcess"
-  $MVN clean install -fn -Pnative -D${PROJECT_NAME}PatchProcess
-  failed_tests=`find . -name 'TEST*.xml' | xargs $GREP -l -E "<failure|<error" | sed -e "s|.*target/surefire-reports/TEST-|                  |g" | sed -e "s|\.xml||g"`
-  # With -fn mvn always exits with a 0 exit code. Because of this we need to
-  # find the errors instead of using the exit code. We assume that if the build
-  # failed a -1 is already given for that case
+  failed_tests=""
+  modules=$(findModules)
+  for module in $modules;
+  do
+    cd $module
+    echo "  Running tests in $module"
+    echo "  $MVN clean install -fn -Pnative -D${PROJECT_NAME}PatchProcess"
+    $MVN clean install -fn -Pnative -D${PROJECT_NAME}PatchProcess
+    module_failed_tests=`find . -name 'TEST*.xml' | xargs $GREP -l -E "<failure|<error" | sed -e "s|.*target/surefire-reports/TEST-|                  |g" | sed -e "s|\.xml||g"`
+    # With -fn mvn always exits with a 0 exit code. Because of this we need to
+    # find the errors instead of using the exit code. We assume that if the build
+    # failed a -1 is already given for that case
+    if [[ -n "$module_failed_tests" ]] ; then
+      failed_tests="${failed_tests}
+${module_failed_tests}"
+    fi
+    cd -
+  done
   if [[ -n "$failed_tests" ]] ; then
     JIRA_COMMENT="$JIRA_COMMENT

    -1 core tests
svn commit: r1332479 - /hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt
Author: tomwhite
Date: Tue May 1 00:21:38 2012
New Revision: 1332479

URL: http://svn.apache.org/viewvc?rev=1332479&view=rev
Log:
HADOOP-8308. Support cross-project Jenkins builds.

Modified:
    hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt

Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt
URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1332479&r1=1332478&r2=1332479&view=diff
==============================================================================
--- hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt (original)
+++ hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt Tue May 1 00:21:38 2012
@@ -65,6 +65,8 @@ Trunk (unreleased changes)
     HADOOP-8285 Use ProtoBuf for RpcPayLoadHeader (sanjay radia)
 
+    HADOOP-8308. Support cross-project Jenkins builds. (tomwhite)
+
   BUG FIXES
 
     HADOOP-8177. MBeans shouldn't try to register when it fails to create MBeanName.
svn commit: r1328083 - in /hadoop/common/trunk/hadoop-common-project/hadoop-common: ./ src/main/java/org/apache/hadoop/io/compress/ src/main/resources/ src/main/resources/META-INF/services/ src/test/j
Author: tomwhite
Date: Thu Apr 19 19:20:31 2012
New Revision: 1328083

URL: http://svn.apache.org/viewvc?rev=1328083&view=rev
Log:
HADOOP-7350. Use ServiceLoader to discover compression codec classes.

Added:
    hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/resources/META-INF/services/org.apache.hadoop.io.compress.CompressionCodec
Modified:
    hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt
    hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/compress/CompressionCodecFactory.java
    hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/resources/core-default.xml
    hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/io/compress/TestCodecFactory.java

Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt
URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1328083&r1=1328082&r2=1328083&view=diff
==============================================================================
--- hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt (original)
+++ hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt Thu Apr 19 19:20:31 2012
@@ -360,6 +360,9 @@ Release 2.0.0 - UNRELEASED
     HADOOP-8282. start-all.sh refers incorrectly start-dfs.sh
     existence for starting start-yarn.sh. (Devaraj K via eli)
 
+    HADOOP-7350. Use ServiceLoader to discover compression codec classes.
+    (tomwhite)
+
   BREAKDOWN OF HADOOP-7454 SUBTASKS
 
     HADOOP-7455. HA: Introduce HA Service Protocol Interface. (suresh)

Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/compress/CompressionCodecFactory.java
URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/compress/CompressionCodecFactory.java?rev=1328083&r1=1328082&r2=1328083&view=diff
==============================================================================
--- hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/compress/CompressionCodecFactory.java (original)
+++ hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/compress/CompressionCodecFactory.java Thu Apr 19 19:20:31 2012
@@ -36,6 +36,9 @@ public class CompressionCodecFactory {
   public static final Log LOG =
     LogFactory.getLog(CompressionCodecFactory.class.getName());
+
+  private static final ServiceLoader<CompressionCodec> CODEC_PROVIDERS =
+    ServiceLoader.load(CompressionCodec.class);
 
   /**
    * A map from the reversed filename suffixes to the codecs.
@@ -95,16 +98,23 @@ public class CompressionCodecFactory {
   }
 
   /**
-   * Get the list of codecs listed in the configuration
+   * Get the list of codecs discovered via a Java ServiceLoader, or
+   * listed in the configuration. Codecs specified in configuration come
+   * later in the returned list, and are considered to override those
+   * from the ServiceLoader.
    * @param conf the configuration to look in
-   * @return a list of the Configuration classes or null if the attribute
-   *         was not set
+   * @return a list of the {@link CompressionCodec} classes
    */
   public static List<Class<? extends CompressionCodec>> getCodecClasses(Configuration conf) {
+    List<Class<? extends CompressionCodec>> result
+      = new ArrayList<Class<? extends CompressionCodec>>();
+    // Add codec classes discovered via service loading
+    for (CompressionCodec codec : CODEC_PROVIDERS) {
+      result.add(codec.getClass());
+    }
+    // Add codec classes from configuration
     String codecsString = conf.get("io.compression.codecs");
     if (codecsString != null) {
-      List<Class<? extends CompressionCodec>> result
-        = new ArrayList<Class<? extends CompressionCodec>>();
       StringTokenizer codecSplit = new StringTokenizer(codecsString, ",");
       while (codecSplit.hasMoreElements()) {
         String codecSubstring = codecSplit.nextToken();
@@ -123,14 +133,14 @@ public class CompressionCodecFactory {
         }
       }
-      return result;
-    } else {
-      return null;
     }
+    return result;
   }
 
   /**
-   * Sets a list of codec classes in the configuration.
+   * Sets a list of codec classes in the configuration. In addition to any
+   * classes specified using this method, {@link CompressionCodec} classes on
+   * the classpath are discovered using a Java ServiceLoader.
    * @param conf the configuration to modify
    * @param classes the list of classes to set
    */
@@ -151,21 +161,19 @@ public class CompressionCodecFactory {
   /**
    * Find the codecs specified in the config value io.compression.codecs
-   * and register them. Defaults to gzip and zip.
+   * and register them. Defaults to gzip and deflate.
    */
   public CompressionCodecFactory(Configuration conf
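The ordering contract in the new getCodecClasses() javadoc (service-loaded codecs first, configuration-listed codecs appended later so they take precedence) can be sketched without Hadoop on the classpath. This is a hedged, self-contained illustration, not Hadoop's code: `Codec`, `CodecDiscovery`, and `GzipLike` are hypothetical stand-ins for `CompressionCodec` and the factory.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.ServiceLoader;

// Sketch of the discovery pattern above: providers found via ServiceLoader are
// added first, then classes named in configuration, so configured codecs come
// later in the list and can override service-loaded ones.
public class CodecDiscovery {

    // Stand-in for org.apache.hadoop.io.compress.CompressionCodec.
    public interface Codec {}

    public static List<Class<? extends Codec>> getCodecClasses(
            ServiceLoader<Codec> providers, String codecsConf)
            throws ClassNotFoundException {
        List<Class<? extends Codec>> result = new ArrayList<>();
        // Service-loaded codecs first (empty if nothing is registered under
        // META-INF/services, which is the case in this standalone sketch).
        for (Codec codec : providers) {
            result.add(codec.getClass());
        }
        // Configuration-listed codecs later; unlike the old API, the result
        // is never null when the configuration attribute is unset.
        if (codecsConf != null) {
            for (String name : codecsConf.split(",")) {
                result.add(Class.forName(name.trim()).asSubclass(Codec.class));
            }
        }
        return result;
    }

    // Hypothetical codec used only to exercise the configuration path.
    public static class GzipLike implements Codec {}

    public static void main(String[] args) throws Exception {
        ServiceLoader<Codec> loader = ServiceLoader.load(Codec.class);
        List<Class<? extends Codec>> codecs = getCodecClasses(
                loader, CodecDiscovery.class.getName() + "$GzipLike");
        System.out.println(codecs.size()); // prints "1": config entry only
    }
}
```

In the real change, a codec ships a `META-INF/services/org.apache.hadoop.io.compress.CompressionCodec` file naming its implementation class, and `io.compression.codecs` remains as a manual override.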
svn commit: r1328085 - in /hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common: ./ src/main/java/org/apache/hadoop/io/compress/ src/main/resources/ src/main/resources/META-INF/services
Author: tomwhite Date: Thu Apr 19 19:28:27 2012 New Revision: 1328085 URL: http://svn.apache.org/viewvc?rev=1328085view=rev Log: Merge -r 1328082:1328083 from trunk to branch-2. Fixes: HADOOP-7350 Added: hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/resources/META-INF/services/org.apache.hadoop.io.compress.CompressionCodec - copied unchanged from r1328083, hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/resources/META-INF/services/org.apache.hadoop.io.compress.CompressionCodec Modified: hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/compress/CompressionCodecFactory.java hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/resources/core-default.xml hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/io/compress/TestCodecFactory.java Modified: hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1328085r1=1328084r2=1328085view=diff == --- hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt (original) +++ hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt Thu Apr 19 19:28:27 2012 @@ -242,6 +242,9 @@ Release 2.0.0 - UNRELEASED HADOOP-8282. start-all.sh refers incorrectly start-dfs.sh existence for starting start-yarn.sh. (Devaraj K via eli) +HADOOP-7350. Use ServiceLoader to discover compression codec classes. +(tomwhite) + BREAKDOWN OF HADOOP-7454 SUBTASKS HADOOP-7455. HA: Introduce HA Service Protocol Interface. 
(suresh) Modified: hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/compress/CompressionCodecFactory.java URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/compress/CompressionCodecFactory.java?rev=1328085r1=1328084r2=1328085view=diff == --- hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/compress/CompressionCodecFactory.java (original) +++ hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/compress/CompressionCodecFactory.java Thu Apr 19 19:28:27 2012 @@ -36,6 +36,9 @@ public class CompressionCodecFactory { public static final Log LOG = LogFactory.getLog(CompressionCodecFactory.class.getName()); + + private static final ServiceLoaderCompressionCodec CODEC_PROVIDERS = +ServiceLoader.load(CompressionCodec.class); /** * A map from the reversed filename suffixes to the codecs. @@ -95,16 +98,23 @@ public class CompressionCodecFactory { } /** - * Get the list of codecs listed in the configuration + * Get the list of codecs discovered via a Java ServiceLoader, or + * listed in the configuration. Codecs specified in configuration come + * later in the returned list, and are considered to override those + * from the ServiceLoader. * @param conf the configuration to look in - * @return a list of the Configuration classes or null if the attribute - * was not set + * @return a list of the {@link CompressionCodec} classes */ public static ListClass? extends CompressionCodec getCodecClasses(Configuration conf) { +ListClass? extends CompressionCodec result + = new ArrayListClass? 
extends CompressionCodec(); +// Add codec classes discovered via service loading +for (CompressionCodec codec : CODEC_PROVIDERS) { + result.add(codec.getClass()); +} +// Add codec classes from configuration String codecsString = conf.get(io.compression.codecs); if (codecsString != null) { - ListClass? extends CompressionCodec result -= new ArrayListClass? extends CompressionCodec(); StringTokenizer codecSplit = new StringTokenizer(codecsString, ,); while (codecSplit.hasMoreElements()) { String codecSubstring = codecSplit.nextToken(); @@ -123,14 +133,14 @@ public class CompressionCodecFactory { } } } - return result; -} else { - return null; } +return result; } /** - * Sets a list of codec classes in the configuration. + * Sets a list of codec classes in the configuration. In addition to any + * classes specified using this method, {@link CompressionCodec} classes on + * the classpath are discovered using a Java ServiceLoader. * @param conf the configuration to modify * @param classes the list
svn commit: r1304597 - in /hadoop/common/trunk/hadoop-common-project/hadoop-common: ./ src/main/java/org/apache/hadoop/fs/ src/main/java/org/apache/hadoop/net/ src/main/resources/ src/test/java/org/ap
Author: tomwhite
Date: Fri Mar 23 21:03:30 2012
New Revision: 1304597

URL: http://svn.apache.org/viewvc?rev=1304597&view=rev
Log:
HADOOP-7030. Add TableMapping topology implementation to read host to rack mapping from a file. Contributed by Patrick Angeles and tomwhite.

Added:
    hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/net/TableMapping.java   (with props)
    hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/net/TestTableMapping.java   (with props)
Modified:
    hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt
    hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/CommonConfigurationKeysPublic.java
    hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/resources/core-default.xml

Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt
URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1304597&r1=1304596&r2=1304597&view=diff
==============================================================================
--- hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt (original)
+++ hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt Fri Mar 23 21:03:30 2012
@@ -135,6 +135,9 @@ Release 0.23.3 - UNRELEASED
     HADOOP-8121. Active Directory Group Mapping Service. (Jonathan Natkins
     via atm)
 
+    HADOOP-7030. Add TableMapping topology implementation to read host to rack
+    mapping from a file. (Patrick Angeles and tomwhite via tomwhite)
+
   IMPROVEMENTS
 
     HADOOP-7524. Change RPC to allow multiple protocols including multuple

Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/CommonConfigurationKeysPublic.java
URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/CommonConfigurationKeysPublic.java?rev=1304597&r1=1304596&r2=1304597&view=diff
==============================================================================
--- hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/CommonConfigurationKeysPublic.java (original)
+++ hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/CommonConfigurationKeysPublic.java Fri Mar 23 21:03:30 2012
@@ -63,6 +63,10 @@ public class CommonConfigurationKeysPubl
   /** See <a href="{@docRoot}/../core-default.html">core-default.xml</a> */
   public static final String  NET_TOPOLOGY_NODE_SWITCH_MAPPING_IMPL_KEY =
     "net.topology.node.switch.mapping.impl";
+
+  /** See <a href="{@docRoot}/../core-default.html">core-default.xml</a> */
+  public static final String  NET_TOPOLOGY_TABLE_MAPPING_FILE_KEY =
+    "net.topology.table.file.name";
 
   /** See <a href="{@docRoot}/../core-default.html">core-default.xml</a> */
   public static final String  FS_TRASH_CHECKPOINT_INTERVAL_KEY =

Added: hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/net/TableMapping.java
URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/net/TableMapping.java?rev=1304597&view=auto
==============================================================================
--- hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/net/TableMapping.java (added)
+++ hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/net/TableMapping.java Fri Mar 23 21:03:30 2012
@@ -0,0 +1,147 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.hadoop.net;
+
+import static org.apache.hadoop.fs.CommonConfigurationKeysPublic.NET_TOPOLOGY_TABLE_MAPPING_FILE_KEY;
+
+import java.io.BufferedReader;
+import java.io.FileReader;
+import java.io.IOException;
+import java.util.ArrayList;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+
+import org.apache.commons.lang.StringUtils;
+import org.apache.commons.logging.Log;
+import org.apache.commons.logging.LogFactory;
+import
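The core of a table-driven topology mapper like the one added above is a small file parser: each line maps a host to a rack, and hosts absent from the table fall back to a default rack. The sketch below is an assumption-laden standalone illustration (it reads from an in-memory `StringReader`, uses a two-whitespace-separated-column format, and a made-up `/default-rack` fallback), not the actual `TableMapping` implementation.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.Reader;
import java.io.StringReader;
import java.util.HashMap;
import java.util.Map;

// Sketch of a host-to-rack table: one mapping per line, host first, rack second.
public class TableMappingSketch {

    static final String DEFAULT_RACK = "/default-rack"; // assumed fallback

    // Parse "host rack" lines into a lookup table; blank or malformed
    // lines (not exactly two columns) are silently skipped.
    static Map<String, String> load(Reader source) throws IOException {
        Map<String, String> map = new HashMap<>();
        BufferedReader reader = new BufferedReader(source);
        String line;
        while ((line = reader.readLine()) != null) {
            String[] columns = line.trim().split("\\s+");
            if (columns.length == 2) {
                map.put(columns[0], columns[1]);
            }
        }
        return map;
    }

    public static void main(String[] args) throws IOException {
        // In the real feature the table comes from the file named by the
        // net.topology.table.file.name configuration key.
        Map<String, String> table =
                load(new StringReader("host1 /rack1\nhost2 /rack2\n"));
        System.out.println(table.getOrDefault("host1", DEFAULT_RACK));
        System.out.println(table.getOrDefault("unknown", DEFAULT_RACK));
    }
}
```

A host listed in the table resolves to its rack; an unlisted host resolves to the default rack, which keeps the cluster functional when the table is incomplete.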
svn commit: r1300798 - in /hadoop/common/branches/branch-1: CHANGES.txt src/mapred/mapred-default.xml src/mapred/org/apache/hadoop/mapred/Counters.java src/test/org/apache/hadoop/mapred/TestUserDefine
Author: tomwhite
Date: Thu Mar 15 00:32:15 2012
New Revision: 1300798

URL: http://svn.apache.org/viewvc?rev=1300798&view=rev
Log:
MAPREDUCE-2835. Make per-job counter limits configurable.

Modified:
    hadoop/common/branches/branch-1/CHANGES.txt
    hadoop/common/branches/branch-1/src/mapred/mapred-default.xml
    hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/Counters.java
    hadoop/common/branches/branch-1/src/test/org/apache/hadoop/mapred/TestUserDefinedCounters.java

Modified: hadoop/common/branches/branch-1/CHANGES.txt
URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-1/CHANGES.txt?rev=1300798&r1=1300797&r2=1300798&view=diff
==============================================================================
--- hadoop/common/branches/branch-1/CHANGES.txt (original)
+++ hadoop/common/branches/branch-1/CHANGES.txt Thu Mar 15 00:32:15 2012
@@ -157,6 +157,8 @@ Release 1.1.0 - unreleased
     HDFS-3075. Backport HADOOP-4885: Try to restore failed name-node storage
     directories at checkpoint time. (Brandon Li via szetszwo)
 
+    MAPREDUCE-2835. Make per-job counter limits configurable. (tomwhite)
+
 Release 1.0.2 - unreleased
 
   NEW FEATURES

Modified: hadoop/common/branches/branch-1/src/mapred/mapred-default.xml
URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-1/src/mapred/mapred-default.xml?rev=1300798&r1=1300797&r2=1300798&view=diff
==============================================================================
--- hadoop/common/branches/branch-1/src/mapred/mapred-default.xml (original)
+++ hadoop/common/branches/branch-1/src/mapred/mapred-default.xml Thu Mar 15 00:32:15 2012
@@ -1233,13 +1233,36 @@
   </description>
 </property>
 
+<!-- end of node health script variables -->
+
 <property>
-  <name>mapreduce.job.counters.limit</name>
+  <name>mapreduce.job.counters.max</name>
   <value>120</value>
   <description>Limit on the number of counters allowed per job.
   </description>
 </property>
 
-<!-- end of node health script variables -->
+<property>
+  <name>mapreduce.job.counters.groups.max</name>
+  <value>50</value>
+  <description>Limit on the number of counter groups allowed per job.
+  </description>
+</property>
+
+<property>
+  <name>mapreduce.job.counters.counter.name.max</name>
+  <value>64</value>
+  <description>Limit on the length of counter names in jobs. Names
+  exceeding this limit will be truncated.
+  </description>
+</property>
+
+<property>
+  <name>mapreduce.job.counters.group.name.max</name>
+  <value>128</value>
+  <description>Limit on the length of counter group names in jobs. Names
+  exceeding this limit will be truncated.
+  </description>
+</property>
 
 </configuration>

Modified: hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/Counters.java
URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/Counters.java?rev=1300798&r1=1300797&r2=1300798&view=diff
==============================================================================
--- hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/Counters.java (original)
+++ hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapred/Counters.java Thu Mar 15 00:32:15 2012
@@ -59,18 +59,22 @@ public class Counters implements Writabl
   private static char[] charsToEscape =  {GROUP_OPEN, GROUP_CLOSE,
                                           COUNTER_OPEN, COUNTER_CLOSE,
                                           UNIT_OPEN, UNIT_CLOSE};
+  private static final JobConf conf = new JobConf();
   /** limit on the size of the name of the group **/
-  private static final int GROUP_NAME_LIMIT = 128;
+  private static final int GROUP_NAME_LIMIT =
+    conf.getInt("mapreduce.job.counters.group.name.max", 128);
   /** limit on the size of the counter name **/
-  private static final int COUNTER_NAME_LIMIT = 64;
+  private static final int COUNTER_NAME_LIMIT =
+    conf.getInt("mapreduce.job.counters.counter.name.max", 64);
 
-  private static final JobConf conf = new JobConf();
   /** limit on counters **/
   public static int MAX_COUNTER_LIMIT =
-    conf.getInt("mapreduce.job.counters.limit", 120);
+    conf.getInt("mapreduce.job.counters.limit", // deprecated in 0.23
+        conf.getInt("mapreduce.job.counters.max", 120));
 
   /** the max groups allowed **/
-  static final int MAX_GROUP_LIMIT = 50;
+  public static final int MAX_GROUP_LIMIT =
+    conf.getInt("mapreduce.job.counters.groups.max", 50);
 
   /** the number of current counters**/
   private int numCounters = 0;

Modified: hadoop/common/branches/branch-1/src/test/org/apache/hadoop/mapred/TestUserDefinedCounters.java
URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-1/src/test/org/apache/hadoop/mapred/TestUserDefinedCounters.java?rev=1300798&r1=1300797&r2=1300798&view=diff
==============================================================================
--- hadoop/common/branches/branch-1/src/test/org/apache/hadoop/mapred/TestUserDefinedCounters.java (original)
+++ hadoop/common/branches/branch-1
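The Counters.java change above uses a nested-getInt idiom to honor a deprecated key: the old key's default value is whatever the new key resolves to, so an explicitly set old key still wins. This standalone sketch uses a plain `Map` as a stand-in for Hadoop's `JobConf`; the key names match the commit, but `getInt` and `maxCounters` are illustrative helpers.

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the deprecated-key fallback used for counter limits.
public class CounterLimits {

    // Stand-in for JobConf.getInt: parse the value if set, else the fallback.
    static int getInt(Map<String, String> conf, String key, int fallback) {
        String value = conf.get(key);
        return value == null ? fallback : Integer.parseInt(value);
    }

    // Old key "mapreduce.job.counters.limit" (deprecated in 0.23) takes
    // precedence; its default is whatever the new key resolves to, whose
    // own default is the hard-coded 120.
    static int maxCounters(Map<String, String> conf) {
        return getInt(conf, "mapreduce.job.counters.limit",
                getInt(conf, "mapreduce.job.counters.max", 120));
    }

    public static void main(String[] args) {
        Map<String, String> conf = new HashMap<>();
        System.out.println(maxCounters(conf));          // both unset -> 120
        conf.put("mapreduce.job.counters.max", "200");
        System.out.println(maxCounters(conf));          // new key only -> 200
        conf.put("mapreduce.job.counters.limit", "150");
        System.out.println(maxCounters(conf));          // old key wins -> 150
    }
}
```

This shape lets existing job configurations keep working while new configurations migrate to the `mapreduce.job.counters.max` family of keys.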
svn commit: r1235548 [8/8] - in /hadoop/common/branches/branch-1: ./ src/core/org/apache/hadoop/conf/ src/core/org/apache/hadoop/io/ src/mapred/org/apache/hadoop/mapreduce/ src/mapred/org/apache/hadoo
Added: hadoop/common/branches/branch-1/src/test/org/apache/hadoop/mapreduce/lib/partition/TestMRKeyFieldBasedPartitioner.java URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-1/src/test/org/apache/hadoop/mapreduce/lib/partition/TestMRKeyFieldBasedPartitioner.java?rev=1235548view=auto == --- hadoop/common/branches/branch-1/src/test/org/apache/hadoop/mapreduce/lib/partition/TestMRKeyFieldBasedPartitioner.java (added) +++ hadoop/common/branches/branch-1/src/test/org/apache/hadoop/mapreduce/lib/partition/TestMRKeyFieldBasedPartitioner.java Tue Jan 24 23:21:58 2012 @@ -0,0 +1,125 @@ +/** + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * License); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an AS IS BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + */ +package org.apache.hadoop.mapreduce.lib.partition; + +import org.apache.hadoop.conf.Configuration; +import org.apache.hadoop.io.Text; + +import junit.framework.TestCase; + +public class TestMRKeyFieldBasedPartitioner extends TestCase { + + /** + * Test is key-field-based partitioned works with empty key. 
+ */ + public void testEmptyKey() throws Exception { +int numReducers = 10; +KeyFieldBasedPartitionerText, Text kfbp = + new KeyFieldBasedPartitionerText, Text(); +Configuration conf = new Configuration(); +conf.setInt(num.key.fields.for.partition, 10); +kfbp.setConf(conf); +assertEquals(Empty key should map to 0th partition, + 0, kfbp.getPartition(new Text(), new Text(), numReducers)); + +// check if the hashcode is correct when no keyspec is specified +kfbp = new KeyFieldBasedPartitionerText, Text(); +conf = new Configuration(); +kfbp.setConf(conf); +String input = abc\tdef\txyz; +int hashCode = input.hashCode(); +int expectedPartition = kfbp.getPartition(hashCode, numReducers); +assertEquals(Partitioner doesnt work as expected, expectedPartition, + kfbp.getPartition(new Text(input), new Text(), numReducers)); + +// check if the hashcode is correct with specified keyspec +kfbp = new KeyFieldBasedPartitionerText, Text(); +conf = new Configuration(); +conf.set(KeyFieldBasedPartitioner.PARTITIONER_OPTIONS, -k2,2); +kfbp.setConf(conf); +String expectedOutput = def; +byte[] eBytes = expectedOutput.getBytes(); +hashCode = kfbp.hashCode(eBytes, 0, eBytes.length - 1, 0); +expectedPartition = kfbp.getPartition(hashCode, numReducers); +assertEquals(Partitioner doesnt work as expected, expectedPartition, + kfbp.getPartition(new Text(input), new Text(), numReducers)); + +// test with invalid end index in keyspecs +kfbp = new KeyFieldBasedPartitionerText, Text(); +conf = new Configuration(); +conf.set(KeyFieldBasedPartitioner.PARTITIONER_OPTIONS, -k2,5); +kfbp.setConf(conf); +expectedOutput = def\txyz; +eBytes = expectedOutput.getBytes(); +hashCode = kfbp.hashCode(eBytes, 0, eBytes.length - 1, 0); +expectedPartition = kfbp.getPartition(hashCode, numReducers); +assertEquals(Partitioner doesnt work as expected, expectedPartition, + kfbp.getPartition(new Text(input), new Text(), numReducers)); + +// test with 0 end index in keyspecs +kfbp = new KeyFieldBasedPartitionerText, 
Text(); +conf = new Configuration(); +conf.set(KeyFieldBasedPartitioner.PARTITIONER_OPTIONS, -k2); +kfbp.setConf(conf); +expectedOutput = def\txyz; +eBytes = expectedOutput.getBytes(); +hashCode = kfbp.hashCode(eBytes, 0, eBytes.length - 1, 0); +expectedPartition = kfbp.getPartition(hashCode, numReducers); +assertEquals(Partitioner doesnt work as expected, expectedPartition, + kfbp.getPartition(new Text(input), new Text(), numReducers)); + +// test with invalid keyspecs +kfbp = new KeyFieldBasedPartitionerText, Text(); +conf = new Configuration(); +conf.set(KeyFieldBasedPartitioner.PARTITIONER_OPTIONS, -k10); +kfbp.setConf(conf); +assertEquals(Partitioner doesnt work as expected, 0, + kfbp.getPartition(new Text(input), new Text(), numReducers)); + +// test with multiple keyspecs +kfbp = new KeyFieldBasedPartitionerText, Text(); +
svn commit: r1235551 - in /hadoop/common/branches/branch-1.0: ./ src/core/org/apache/hadoop/conf/ src/core/org/apache/hadoop/io/ src/mapred/org/apache/hadoop/mapreduce/ src/mapred/org/apache/hadoop/ma
Author: tomwhite Date: Tue Jan 24 23:30:12 2012 New Revision: 1235551 URL: http://svn.apache.org/viewvc?rev=1235551view=rev Log: Merge -r 1235547:1235548 from branch-1 to branch-1.0. Fixes: MAPREDUCE-3607 Added: hadoop/common/branches/branch-1.0/src/mapred/org/apache/hadoop/mapreduce/lib/db/ - copied from r1235548, hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapreduce/lib/db/ hadoop/common/branches/branch-1.0/src/mapred/org/apache/hadoop/mapreduce/lib/db/BigDecimalSplitter.java - copied unchanged from r1235548, hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapreduce/lib/db/BigDecimalSplitter.java hadoop/common/branches/branch-1.0/src/mapred/org/apache/hadoop/mapreduce/lib/db/BooleanSplitter.java - copied unchanged from r1235548, hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapreduce/lib/db/BooleanSplitter.java hadoop/common/branches/branch-1.0/src/mapred/org/apache/hadoop/mapreduce/lib/db/DBConfiguration.java - copied unchanged from r1235548, hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapreduce/lib/db/DBConfiguration.java hadoop/common/branches/branch-1.0/src/mapred/org/apache/hadoop/mapreduce/lib/db/DBInputFormat.java - copied unchanged from r1235548, hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapreduce/lib/db/DBInputFormat.java hadoop/common/branches/branch-1.0/src/mapred/org/apache/hadoop/mapreduce/lib/db/DBOutputFormat.java - copied unchanged from r1235548, hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapreduce/lib/db/DBOutputFormat.java hadoop/common/branches/branch-1.0/src/mapred/org/apache/hadoop/mapreduce/lib/db/DBRecordReader.java - copied unchanged from r1235548, hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapreduce/lib/db/DBRecordReader.java hadoop/common/branches/branch-1.0/src/mapred/org/apache/hadoop/mapreduce/lib/db/DBSplitter.java - copied unchanged from r1235548, 
hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapreduce/lib/db/DBSplitter.java hadoop/common/branches/branch-1.0/src/mapred/org/apache/hadoop/mapreduce/lib/db/DBWritable.java - copied unchanged from r1235548, hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapreduce/lib/db/DBWritable.java hadoop/common/branches/branch-1.0/src/mapred/org/apache/hadoop/mapreduce/lib/db/DataDrivenDBInputFormat.java - copied unchanged from r1235548, hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapreduce/lib/db/DataDrivenDBInputFormat.java hadoop/common/branches/branch-1.0/src/mapred/org/apache/hadoop/mapreduce/lib/db/DataDrivenDBRecordReader.java - copied unchanged from r1235548, hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapreduce/lib/db/DataDrivenDBRecordReader.java hadoop/common/branches/branch-1.0/src/mapred/org/apache/hadoop/mapreduce/lib/db/DateSplitter.java - copied unchanged from r1235548, hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapreduce/lib/db/DateSplitter.java hadoop/common/branches/branch-1.0/src/mapred/org/apache/hadoop/mapreduce/lib/db/FloatSplitter.java - copied unchanged from r1235548, hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapreduce/lib/db/FloatSplitter.java hadoop/common/branches/branch-1.0/src/mapred/org/apache/hadoop/mapreduce/lib/db/IntegerSplitter.java - copied unchanged from r1235548, hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapreduce/lib/db/IntegerSplitter.java hadoop/common/branches/branch-1.0/src/mapred/org/apache/hadoop/mapreduce/lib/db/MySQLDBRecordReader.java - copied unchanged from r1235548, hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapreduce/lib/db/MySQLDBRecordReader.java hadoop/common/branches/branch-1.0/src/mapred/org/apache/hadoop/mapreduce/lib/db/MySQLDataDrivenDBRecordReader.java - copied unchanged from r1235548, 
hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapreduce/lib/db/MySQLDataDrivenDBRecordReader.java hadoop/common/branches/branch-1.0/src/mapred/org/apache/hadoop/mapreduce/lib/db/OracleDBRecordReader.java - copied unchanged from r1235548, hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapreduce/lib/db/OracleDBRecordReader.java hadoop/common/branches/branch-1.0/src/mapred/org/apache/hadoop/mapreduce/lib/db/OracleDataDrivenDBInputFormat.java - copied unchanged from r1235548, hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapreduce/lib/db/OracleDataDrivenDBInputFormat.java hadoop/common/branches/branch-1.0/src/mapred/org/apache/hadoop/mapreduce/lib/db/OracleDataDrivenDBRecordReader.java - copied unchanged from r1235548, hadoop/common/branches/branch-1/src/mapred/org/apache/hadoop/mapreduce/lib/db/OracleDataDrivenDBRecordReader.java hadoop/common/branches/branch-1.0/src/mapred/org
svn commit: r1228291 - in /hadoop/common/trunk/hadoop-common-project/hadoop-common: CHANGES.txt src/main/java/org/apache/hadoop/io/SequenceFile.java
Author: tomwhite
Date: Fri Jan 6 17:24:55 2012
New Revision: 1228291

URL: http://svn.apache.org/viewvc?rev=1228291&view=rev
Log:
HADOOP-7937. Forward port SequenceFile#syncFs and friends from Hadoop 1.x.

Modified:
    hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt
    hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/SequenceFile.java

Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt
URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1228291&r1=1228290&r2=1228291&view=diff
==============================================================================
--- hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt (original)
+++ hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt Fri Jan 6 17:24:55 2012
@@ -983,6 +983,9 @@ Release 0.22.1 - Unreleased
 
   BUG FIXES
 
+    HADOOP-7937. Forward port SequenceFile#syncFs and friends from Hadoop 1.x.
+    (tomwhite)
+
 Release 0.22.0 - 2011-11-29
 
   INCOMPATIBLE CHANGES

Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/SequenceFile.java
URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/SequenceFile.java?rev=1228291&r1=1228290&r2=1228291&view=diff
==============================================================================
--- hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/SequenceFile.java (original)
+++ hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/SequenceFile.java Fri Jan 6 17:24:55 2012
@@ -1193,6 +1193,13 @@ public class SequenceFile {
       }
     }
 
+    /** flush all currently written data to the file system */
+    public void syncFs() throws IOException {
+      if (out != null) {
+        out.sync();                       // flush contents to file system
+      }
+    }
+
     /** Returns the configuration of this file. */
     Configuration getConf() { return conf; }
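The forward-ported syncFs() is a null-guarded delegation: if the underlying stream is still open, push buffered data toward the file system; otherwise do nothing. The following standalone analogue is a hedged sketch, not SequenceFile itself: it substitutes `OutputStream.flush()` for the `FSDataOutputStream.sync()` call in the real code, and `SyncFsSketch` is a made-up class.

```java
import java.io.BufferedOutputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.OutputStream;

// Minimal analogue of the syncFs() pattern committed above.
public class SyncFsSketch {

    private OutputStream out;

    SyncFsSketch(OutputStream out) {
        this.out = out;
    }

    /** Flush all currently written data toward the file system. */
    public void syncFs() throws IOException {
        if (out != null) {
            out.flush(); // the real code calls out.sync() on the HDFS stream
        }
    }

    public void close() throws IOException {
        if (out != null) {
            out.close();
            out = null; // later syncFs() calls become harmless no-ops
        }
    }

    public static void main(String[] args) throws IOException {
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        SyncFsSketch writer = new SyncFsSketch(new BufferedOutputStream(sink));
        writer.out.write("record".getBytes());
        writer.syncFs();                  // buffered bytes now reach the sink
        System.out.println(sink.toString()); // prints "record"
        writer.close();
        writer.syncFs();                  // no-op after close, no exception
    }
}
```

The point of exposing such a method on a writer is durability control: callers can force data out to readers (or to disk) at record boundaries without closing the file.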
svn commit: r1213432 - /hadoop/common/trunk/dev-support/test-patch.sh
Author: tomwhite Date: Mon Dec 12 20:45:30 2011 New Revision: 1213432 URL: http://svn.apache.org/viewvc?rev=1213432&view=rev Log: HADOOP-7912. test-patch should run eclipse:eclipse to verify that it does not break again. Contributed by Robert Joseph Evans Modified: hadoop/common/trunk/dev-support/test-patch.sh Modified: hadoop/common/trunk/dev-support/test-patch.sh URL: http://svn.apache.org/viewvc/hadoop/common/trunk/dev-support/test-patch.sh?rev=1213432&r1=1213431&r2=1213432&view=diff == --- hadoop/common/trunk/dev-support/test-patch.sh (original) +++ hadoop/common/trunk/dev-support/test-patch.sh Mon Dec 12 20:45:30 2011 @@ -586,6 +586,35 @@ $JIRA_COMMENT_FOOTER } ### +### Verify eclipse:eclipse works +checkEclipseGeneration () { + echo + echo + echo == + echo == + echo "Running mvn eclipse:eclipse." + echo == + echo == + echo + echo + + echo "$MVN eclipse:eclipse -D${PROJECT_NAME}PatchProcess" + $MVN eclipse:eclipse -D${PROJECT_NAME}PatchProcess + if [[ $? != 0 ]] ; then + JIRA_COMMENT="$JIRA_COMMENT + +-1 eclipse:eclipse. The patch failed to build with eclipse:eclipse." +return 1 + fi + JIRA_COMMENT="$JIRA_COMMENT + ++1 eclipse:eclipse. The patch built with eclipse:eclipse." + return 0 +} + + + +### ### Run the tests runTests () { echo @@ -790,6 +819,8 @@ checkJavadocWarnings (( RESULT = RESULT + $? )) checkJavacWarnings (( RESULT = RESULT + $? )) +checkEclipseGeneration +(( RESULT = RESULT + $? )) ### Checkstyle not implemented yet #checkStyle #(( RESULT = RESULT + $? ))
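The checkEclipseGeneration function above follows test-patch.sh's general check-function pattern: run a build step, append a "+1"/"-1" line to the JIRA comment, return a status, and let the caller accumulate failures in RESULT. A minimal runnable sketch of that pattern, with a hypothetical `runCheck` helper and `true`/`false` standing in for the real `$MVN eclipse:eclipse` invocation:

```shell
#!/usr/bin/env bash
# Sketch of the test-patch check-function pattern. `runCheck` and the
# stand-in commands (`true`, `false`) are invented for illustration.
JIRA_COMMENT=""
RESULT=0

runCheck () {
  local name="$1"; shift
  if "$@"; then
    JIRA_COMMENT="$JIRA_COMMENT
+1 $name."                       # check passed: record a +1 vote
    return 0
  else
    JIRA_COMMENT="$JIRA_COMMENT
-1 $name."                       # check failed: record a -1 vote
    return 1
  fi
}

runCheck "eclipse:eclipse" true   # stands in for a passing build step
(( RESULT = RESULT + $? ))
runCheck "javac" false            # stands in for a failing build step
(( RESULT = RESULT + $? ))

echo "$RESULT"                    # number of failed checks
echo "$JIRA_COMMENT"
```

The accumulated RESULT is what test-patch ultimately uses as its exit status, so any single failing check marks the whole run as failed while the comment still reports every check individually.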
svn commit: r1213432 - /hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt
Author: tomwhite Date: Mon Dec 12 20:45:30 2011 New Revision: 1213432 URL: http://svn.apache.org/viewvc?rev=1213432&view=rev Log: HADOOP-7912. test-patch should run eclipse:eclipse to verify that it does not break again. Contributed by Robert Joseph Evans Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1213432&r1=1213431&r2=1213432&view=diff == --- hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt (original) +++ hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt Mon Dec 12 20:45:30 2011 @@ -172,6 +172,9 @@ Release 0.23.1 - Unreleased HADOOP-6886. LocalFileSystem Needs createNonRecursive API. (Nicolas Spiegelberg and eli via eli) +HADOOP-7912. test-patch should run eclipse:eclipse to verify that it does +not break again. (Robert Joseph Evans via tomwhite) + OPTIMIZATIONS BUG FIXES
svn commit: r1213434 - /hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/CHANGES.txt
Author: tomwhite Date: Mon Dec 12 20:48:06 2011 New Revision: 1213434 URL: http://svn.apache.org/viewvc?rev=1213434&view=rev Log: Merge -r 1213431:1213432 from trunk to branch-0.23. Fixes: HADOOP-7912 Modified: hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/CHANGES.txt Modified: hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/CHANGES.txt URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1213434&r1=1213433&r2=1213434&view=diff == --- hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/CHANGES.txt (original) +++ hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/CHANGES.txt Mon Dec 12 20:48:06 2011 @@ -38,6 +38,9 @@ Release 0.23.1 - Unreleased HADOOP-7758. Make GlobFilter class public. (tucu) +HADOOP-7912. test-patch should run eclipse:eclipse to verify that it does +not break again. (Robert Joseph Evans via tomwhite) + OPTIMIZATIONS BUG FIXES
svn commit: r1207755 - /hadoop/common/trunk/hadoop-tools/hadoop-streaming/pom.xml
Author: tomwhite Date: Tue Nov 29 05:07:47 2011 New Revision: 1207755 URL: http://svn.apache.org/viewvc?rev=1207755&view=rev Log: MAPREDUCE-3433. Finding counters by legacy group name returns empty counters. Modified: hadoop/common/trunk/hadoop-tools/hadoop-streaming/pom.xml Modified: hadoop/common/trunk/hadoop-tools/hadoop-streaming/pom.xml URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-tools/hadoop-streaming/pom.xml?rev=1207755&r1=1207754&r2=1207755&view=diff == --- hadoop/common/trunk/hadoop-tools/hadoop-streaming/pom.xml (original) +++ hadoop/common/trunk/hadoop-tools/hadoop-streaming/pom.xml Tue Nov 29 05:07:47 2011 @@ -29,7 +29,7 @@ <properties> <hadoop.log.dir>${project.build.directory}/log</hadoop.log.dir> - <test.exclude.pattern>%regex[.*(TestStreamingBadRecords|TestStreamingCombiner|TestStreamingStatus|TestUlimit).*]</test.exclude.pattern> + <test.exclude.pattern>%regex[.*(TestStreamingBadRecords|TestStreamingStatus|TestUlimit).*]</test.exclude.pattern> </properties> <dependencies>
svn commit: r1207756 - /hadoop/common/branches/branch-0.23/hadoop-tools/hadoop-streaming/pom.xml
Author: tomwhite Date: Tue Nov 29 05:12:07 2011 New Revision: 1207756 URL: http://svn.apache.org/viewvc?rev=1207756&view=rev Log: Merge -r 1207754:1207755 from trunk to branch-0.23. Fixes: MAPREDUCE-3433 Modified: hadoop/common/branches/branch-0.23/hadoop-tools/hadoop-streaming/pom.xml Modified: hadoop/common/branches/branch-0.23/hadoop-tools/hadoop-streaming/pom.xml URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-0.23/hadoop-tools/hadoop-streaming/pom.xml?rev=1207756&r1=1207755&r2=1207756&view=diff == --- hadoop/common/branches/branch-0.23/hadoop-tools/hadoop-streaming/pom.xml (original) +++ hadoop/common/branches/branch-0.23/hadoop-tools/hadoop-streaming/pom.xml Tue Nov 29 05:12:07 2011 @@ -29,7 +29,7 @@ <properties> <hadoop.log.dir>${project.build.directory}/log</hadoop.log.dir> - <test.exclude.pattern>%regex[.*(TestStreamingBadRecords|TestStreamingCombiner|TestStreamingStatus|TestUlimit).*]</test.exclude.pattern> + <test.exclude.pattern>%regex[.*(TestStreamingBadRecords|TestStreamingStatus|TestUlimit).*]</test.exclude.pattern> </properties> <dependencies>
svn commit: r1203437 - /hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt
Author: tomwhite Date: Fri Nov 18 00:19:53 2011 New Revision: 1203437 URL: http://svn.apache.org/viewvc?rev=1203437&view=rev Log: HADOOP-7787. Make source tarball use conventional name. Contributed by Bruno Mahé Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1203437&r1=1203436&r2=1203437&view=diff == --- hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt (original) +++ hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt Fri Nov 18 00:19:53 2011 @@ -117,6 +117,9 @@ Release 0.23.1 - Unreleased HADOOP-7811. TestUserGroupInformation#testGetServerSideGroups test fails in chroot. (Jonathan Eagles via mahadev) + HADOOP-7787. Make source tarball use conventional name. + (Bruno Mahé via tomwhite) + Release 0.23.0 - 2011-11-01 INCOMPATIBLE CHANGES
svn commit: r1203437 - /hadoop/common/trunk/pom.xml
Author: tomwhite Date: Fri Nov 18 00:19:53 2011 New Revision: 1203437 URL: http://svn.apache.org/viewvc?rev=1203437&view=rev Log: HADOOP-7787. Make source tarball use conventional name. Contributed by Bruno Mahé Modified: hadoop/common/trunk/pom.xml Modified: hadoop/common/trunk/pom.xml URL: http://svn.apache.org/viewvc/hadoop/common/trunk/pom.xml?rev=1203437&r1=1203436&r2=1203437&view=diff == --- hadoop/common/trunk/pom.xml (original) +++ hadoop/common/trunk/pom.xml Fri Nov 18 00:19:53 2011 @@ -264,7 +264,7 @@ <configuration> <appendAssemblyId>false</appendAssemblyId> <attach>false</attach> - <finalName>hadoop-dist-${project.version}-src</finalName> + <finalName>hadoop-${project.version}-src</finalName> <outputDirectory>hadoop-dist/target</outputDirectory> <!-- Not using descriptorRef and hadoop-assembly dependency --> <!-- to avoid making hadoop-main to depend on a module -- @@ -288,7 +288,7 @@ <configuration> <target> <echo/> -<echo>Hadoop source tar available at: ${basedir}/hadoop-dist/target/hadoop-dist-${project.version}-src.tar.gz</echo> +<echo>Hadoop source tar available at: ${basedir}/hadoop-dist/target/hadoop-${project.version}-src.tar.gz</echo> <echo/> </target> </configuration>
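The `<finalName>` change above renames the source tarball from `hadoop-dist-VERSION-src` to the conventional `hadoop-VERSION-src`. A small sketch of the artifact paths before and after the change; the version value here is invented for illustration:

```shell
#!/usr/bin/env bash
# Sketch of the tarball names implied by the <finalName> change in
# HADOOP-7787. The version string is an assumed example value.
version="0.23.1-SNAPSHOT"

old_name="hadoop-dist-${version}-src"   # before HADOOP-7787
new_name="hadoop-${version}-src"        # after HADOOP-7787 (conventional)

echo "old: hadoop-dist/target/${old_name}.tar.gz"
echo "new: hadoop-dist/target/${new_name}.tar.gz"
```

The matching `<echo>` change in the same diff keeps the build's "source tar available at" message consistent with the new artifact name.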
svn commit: r1203438 - /hadoop/common/branches/branch-0.23/pom.xml
Author: tomwhite Date: Fri Nov 18 00:21:19 2011 New Revision: 1203438 URL: http://svn.apache.org/viewvc?rev=1203438&view=rev Log: Merge -r 1203436:1203437 from trunk to branch-0.23. Fixes: HADOOP-7787 Modified: hadoop/common/branches/branch-0.23/pom.xml Modified: hadoop/common/branches/branch-0.23/pom.xml URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-0.23/pom.xml?rev=1203438&r1=1203437&r2=1203438&view=diff == --- hadoop/common/branches/branch-0.23/pom.xml (original) +++ hadoop/common/branches/branch-0.23/pom.xml Fri Nov 18 00:21:19 2011 @@ -264,7 +264,7 @@ <configuration> <appendAssemblyId>false</appendAssemblyId> <attach>false</attach> - <finalName>hadoop-dist-${project.version}-src</finalName> + <finalName>hadoop-${project.version}-src</finalName> <outputDirectory>hadoop-dist/target</outputDirectory> <!-- Not using descriptorRef and hadoop-assembly dependency --> <!-- to avoid making hadoop-main to depend on a module -- @@ -288,7 +288,7 @@ <configuration> <target> <echo/> -<echo>Hadoop source tar available at: ${basedir}/hadoop-dist/target/hadoop-dist-${project.version}-src.tar.gz</echo> +<echo>Hadoop source tar available at: ${basedir}/hadoop-dist/target/hadoop-${project.version}-src.tar.gz</echo> <echo/> </target> </configuration>
svn commit: r1203438 - /hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/CHANGES.txt
Author: tomwhite Date: Fri Nov 18 00:21:19 2011 New Revision: 1203438 URL: http://svn.apache.org/viewvc?rev=1203438&view=rev Log: Merge -r 1203436:1203437 from trunk to branch-0.23. Fixes: HADOOP-7787 Modified: hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/CHANGES.txt Modified: hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/CHANGES.txt URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1203438&r1=1203437&r2=1203438&view=diff == --- hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/CHANGES.txt (original) +++ hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/CHANGES.txt Fri Nov 18 00:21:19 2011 @@ -17,6 +17,9 @@ Release 0.23.1 - Unreleased HADOOP-7811. TestUserGroupInformation#testGetServerSideGroups test fails in chroot. (Jonathan Eagles via mahadev) + HADOOP-7787. Make source tarball use conventional name. + (Bruno Mahé via tomwhite) + Release 0.23.0 - 2011-11-01 INCOMPATIBLE CHANGES
svn commit: r1203449 - in /hadoop/common/trunk/hadoop-common-project/hadoop-common: ./ src/main/bin/ src/main/packages/
Author: tomwhite Date: Fri Nov 18 00:48:54 2011 New Revision: 1203449 URL: http://svn.apache.org/viewvc?rev=1203449view=rev Log: HADOOP-7802. Hadoop scripts unconditionally source $bin/../libexec/hadoop-config.sh. Contributed by Bruno Mahé Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/bin/hadoop hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/bin/hadoop-daemon.sh hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/bin/hadoop-daemons.sh hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/bin/rcc hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/bin/slaves.sh hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/bin/start-all.sh hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/bin/stop-all.sh hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/packages/hadoop-create-user.sh hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/packages/hadoop-setup-applications.sh hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/packages/hadoop-setup-conf.sh hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/packages/hadoop-setup-hdfs.sh hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/packages/hadoop-setup-single-node.sh hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/packages/hadoop-validate-setup.sh Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1203449r1=1203448r2=1203449view=diff == --- hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt (original) +++ hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt Fri Nov 18 00:48:54 2011 @@ -110,6 +110,9 @@ Release 0.23.1 - Unreleased HADOOP-7801. HADOOP_PREFIX cannot be overriden. 
(Bruno Mahé via tomwhite) +HADOOP-7802. Hadoop scripts unconditionally source +$bin/../libexec/hadoop-config.sh. (Bruno Mahé via tomwhite) + OPTIMIZATIONS BUG FIXES Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/bin/hadoop URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/bin/hadoop?rev=1203449&r1=1203448&r2=1203449&view=diff == --- hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/bin/hadoop (original) +++ hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/bin/hadoop Fri Nov 18 00:48:54 2011 @@ -21,7 +21,9 @@ bin=`which $0` bin=`dirname ${bin}` bin=`cd $bin; pwd` -. $bin/../libexec/hadoop-config.sh +DEFAULT_LIBEXEC_DIR=$bin/../libexec +HADOOP_LIBEXEC_DIR=${HADOOP_LIBEXEC_DIR:-$DEFAULT_LIBEXEC_DIR} +. $HADOOP_LIBEXEC_DIR/hadoop-config.sh function print_usage(){ echo "Usage: hadoop [--config confdir] COMMAND" Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/bin/hadoop-daemon.sh URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/bin/hadoop-daemon.sh?rev=1203449&r1=1203448&r2=1203449&view=diff == --- hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/bin/hadoop-daemon.sh (original) +++ hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/bin/hadoop-daemon.sh Fri Nov 18 00:48:54 2011 @@ -39,7 +39,9 @@ fi bin=`dirname ${BASH_SOURCE-$0}` bin=`cd $bin; pwd` -. $bin/../libexec/hadoop-config.sh +DEFAULT_LIBEXEC_DIR=$bin/../libexec +HADOOP_LIBEXEC_DIR=${HADOOP_LIBEXEC_DIR:-$DEFAULT_LIBEXEC_DIR} +.
$HADOOP_LIBEXEC_DIR/hadoop-config.sh # get arguments Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/bin/hadoop-daemons.sh URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/bin/hadoop-daemons.sh?rev=1203449&r1=1203448&r2=1203449&view=diff == --- hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/bin/hadoop-daemons.sh (original) +++ hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/bin/hadoop-daemons.sh Fri Nov 18 00:48:54 2011 @@ -29,6 +29,8 @@ fi bin=`dirname ${BASH_SOURCE-$0}` bin=`cd $bin; pwd` -. $bin/../libexec/hadoop-config.sh +DEFAULT_LIBEXEC_DIR=$bin/../libexec +HADOOP_LIBEXEC_DIR=${HADOOP_LIBEXEC_DIR:-$DEFAULT_LIBEXEC_DIR} +. $HADOOP_LIBEXEC_DIR/hadoop-config.sh exec $bin/slaves.sh --config $HADOOP_CONF_DIR cd $HADOOP_PREFIX \; $bin/hadoop-daemon.sh --config $HADOOP_CONF_DIR $@ Modified: hadoop/common/trunk/hadoop-common-project/hadoop
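The HADOOP-7802 change above replaces the hard-coded `. $bin/../libexec/hadoop-config.sh` with shell default expansion, so packagers can point HADOOP_LIBEXEC_DIR elsewhere while unmodified installs keep the old behavior. A runnable sketch of that `${VAR:-default}` idiom; the directory values are invented for illustration:

```shell
#!/usr/bin/env bash
# Sketch of the override pattern the scripts adopt: honor a caller-supplied
# HADOOP_LIBEXEC_DIR, otherwise fall back to $bin/../libexec.
# Directory values are assumed examples, not real paths.
bin="/opt/hadoop/bin"
DEFAULT_LIBEXEC_DIR="$bin/../libexec"

# Case 1: variable unset, so the default is used.
unset HADOOP_LIBEXEC_DIR
HADOOP_LIBEXEC_DIR=${HADOOP_LIBEXEC_DIR:-$DEFAULT_LIBEXEC_DIR}
echo "$HADOOP_LIBEXEC_DIR"   # /opt/hadoop/bin/../libexec

# Case 2: a packaging layout pre-sets the variable; the override wins.
HADOOP_LIBEXEC_DIR="/usr/libexec/hadoop"
HADOOP_LIBEXEC_DIR=${HADOOP_LIBEXEC_DIR:-$DEFAULT_LIBEXEC_DIR}
echo "$HADOOP_LIBEXEC_DIR"   # /usr/libexec/hadoop
```

Note that `${VAR:-default}` also substitutes the default when the variable is set but empty, which is usually what installers want here.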
svn commit: r1203451 - in /hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common: ./ src/main/bin/ src/main/packages/
Author: tomwhite Date: Fri Nov 18 00:50:14 2011 New Revision: 1203451 URL: http://svn.apache.org/viewvc?rev=1203451view=rev Log: Merge -r 1203448:1203449 from trunk to branch-0.23. Fixes: HADOOP-7802 Modified: hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/CHANGES.txt hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/main/bin/hadoop hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/main/bin/hadoop-daemon.sh hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/main/bin/hadoop-daemons.sh hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/main/bin/rcc hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/main/bin/slaves.sh hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/main/bin/start-all.sh hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/main/bin/stop-all.sh hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/main/packages/hadoop-create-user.sh hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/main/packages/hadoop-setup-applications.sh hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/main/packages/hadoop-setup-conf.sh hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/main/packages/hadoop-setup-hdfs.sh hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/main/packages/hadoop-setup-single-node.sh hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/main/packages/hadoop-validate-setup.sh Modified: hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/CHANGES.txt URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1203451r1=1203450r2=1203451view=diff == --- hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/CHANGES.txt (original) +++ 
hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/CHANGES.txt Fri Nov 18 00:50:14 2011 @@ -10,6 +10,9 @@ Release 0.23.1 - Unreleased HADOOP-7801. HADOOP_PREFIX cannot be overriden. (Bruno Mahé via tomwhite) +HADOOP-7802. Hadoop scripts unconditionally source +$bin/../libexec/hadoop-config.sh. (Bruno Mahé via tomwhite) + OPTIMIZATIONS BUG FIXES Modified: hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/main/bin/hadoop URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/main/bin/hadoop?rev=1203451&r1=1203450&r2=1203451&view=diff == --- hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/main/bin/hadoop (original) +++ hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/main/bin/hadoop Fri Nov 18 00:50:14 2011 @@ -21,7 +21,9 @@ bin=`which $0` bin=`dirname ${bin}` bin=`cd $bin; pwd` -. $bin/../libexec/hadoop-config.sh +DEFAULT_LIBEXEC_DIR=$bin/../libexec +HADOOP_LIBEXEC_DIR=${HADOOP_LIBEXEC_DIR:-$DEFAULT_LIBEXEC_DIR} +. $HADOOP_LIBEXEC_DIR/hadoop-config.sh function print_usage(){ echo "Usage: hadoop [--config confdir] COMMAND" Modified: hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/main/bin/hadoop-daemon.sh URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/main/bin/hadoop-daemon.sh?rev=1203451&r1=1203450&r2=1203451&view=diff == --- hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/main/bin/hadoop-daemon.sh (original) +++ hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/main/bin/hadoop-daemon.sh Fri Nov 18 00:50:14 2011 @@ -39,7 +39,9 @@ fi bin=`dirname ${BASH_SOURCE-$0}` bin=`cd $bin; pwd` -. $bin/../libexec/hadoop-config.sh +DEFAULT_LIBEXEC_DIR=$bin/../libexec +HADOOP_LIBEXEC_DIR=${HADOOP_LIBEXEC_DIR:-$DEFAULT_LIBEXEC_DIR} +.
$HADOOP_LIBEXEC_DIR/hadoop-config.sh # get arguments Modified: hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/main/bin/hadoop-daemons.sh URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/main/bin/hadoop-daemons.sh?rev=1203451&r1=1203450&r2=1203451&view=diff == --- hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/main/bin/hadoop-daemons.sh (original) +++ hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/main/bin/hadoop-daemons.sh Fri Nov 18 00:50:14 2011 @@ -29,6 +29,8 @@ fi bin
svn commit: r1199025 - in /hadoop/common/trunk/hadoop-common-project/hadoop-common: ./ src/main/bin/ src/main/packages/ src/main/packages/deb/init.d/ src/main/packages/rpm/init.d/
Author: tomwhite Date: Tue Nov 8 00:16:27 2011 New Revision: 1199025 URL: http://svn.apache.org/viewvc?rev=1199025view=rev Log: HADOOP-7801. HADOOP_PREFIX cannot be overriden. Contributed by Bruno Mahé. Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/bin/hadoop-config.sh hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/packages/deb/init.d/hadoop-datanode hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/packages/deb/init.d/hadoop-jobtracker hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/packages/deb/init.d/hadoop-namenode hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/packages/deb/init.d/hadoop-tasktracker hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/packages/hadoop-setup-conf.sh hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/packages/rpm/init.d/hadoop-jobtracker hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/packages/rpm/init.d/hadoop-namenode hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/packages/rpm/init.d/hadoop-tasktracker Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1199025r1=1199024r2=1199025view=diff == --- hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt (original) +++ hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt Tue Nov 8 00:16:27 2011 @@ -100,6 +100,20 @@ Trunk (unreleased changes) HADOOP-7773. Add support for protocol buffer based RPC engine. (suresh) +Release 0.23.1 - Unreleased + + INCOMPATIBLE CHANGES + + NEW FEATURES + + IMPROVEMENTS + +HADOOP-7801. HADOOP_PREFIX cannot be overriden. 
(Bruno Mahé via tomwhite) + + OPTIMIZATIONS + + BUG FIXES + Release 0.23.0 - 2011-11-01 INCOMPATIBLE CHANGES Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/bin/hadoop-config.sh URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/bin/hadoop-config.sh?rev=1199025r1=1199024r2=1199025view=diff == --- hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/bin/hadoop-config.sh (original) +++ hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/bin/hadoop-config.sh Tue Nov 8 00:16:27 2011 @@ -27,7 +27,9 @@ this=$common_bin/$script # the root of the Hadoop installation # See HADOOP-6255 for directory structure layout -export HADOOP_PREFIX=`dirname $this`/.. +HADOOP_DEFAULT_PREFIX=`dirname $this`/.. +HADOOP_PREFIX=${HADOOP_PREFIX:-$HADOOP_DEFAULT_PREFIX} +export HADOOP_PREFIX #check to see if the conf dir is given as an optional argument if [ $# -gt 1 ] Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/packages/deb/init.d/hadoop-datanode URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/packages/deb/init.d/hadoop-datanode?rev=1199025r1=1199024r2=1199025view=diff == --- hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/packages/deb/init.d/hadoop-datanode (original) +++ hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/packages/deb/init.d/hadoop-datanode Tue Nov 8 00:16:27 2011 @@ -75,7 +75,7 @@ check_privsep_dir() { } export PATH=${PATH:+$PATH:}/usr/sbin:/usr/bin -export HADOOP_PREFIX=/usr +export HADOOP_PREFIX=${HADOOP_PREFIX:-/usr} case $1 in start) Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/packages/deb/init.d/hadoop-jobtracker URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/packages/deb/init.d/hadoop-jobtracker?rev=1199025r1=1199024r2=1199025view=diff == --- 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/packages/deb/init.d/hadoop-jobtracker (original) +++ hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/packages/deb/init.d/hadoop-jobtracker Tue Nov 8 00:16:27 2011 @@ -67,7 +67,7 @@ check_privsep_dir() { } export PATH=${PATH:+$PATH:}/usr/sbin:/usr/bin -export HADOOP_PREFIX=/usr +export HADOOP_PREFIX=${HADOOP_PREFIX:-/usr} case $1 in start) Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/packages/deb/init.d/hadoop-namenode URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/packages/deb/init.d/hadoop-namenode?rev
svn commit: r1199026 - in /hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common: ./ src/main/bin/ src/main/packages/ src/main/packages/deb/init.d/ src/main/packages/rpm/init.d/
Author: tomwhite Date: Tue Nov 8 00:18:02 2011 New Revision: 1199026 URL: http://svn.apache.org/viewvc?rev=1199026view=rev Log: Merge -r 1199024:1199025 from trunk to branch-0.23. Fixes: HADOOP-7801 Modified: hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/CHANGES.txt hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/main/bin/hadoop-config.sh hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/main/packages/deb/init.d/hadoop-datanode hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/main/packages/deb/init.d/hadoop-jobtracker hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/main/packages/deb/init.d/hadoop-namenode hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/main/packages/deb/init.d/hadoop-tasktracker hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/main/packages/hadoop-setup-conf.sh hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/main/packages/rpm/init.d/hadoop-jobtracker hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/main/packages/rpm/init.d/hadoop-namenode hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/main/packages/rpm/init.d/hadoop-tasktracker Modified: hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/CHANGES.txt URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1199026r1=1199025r2=1199026view=diff == --- hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/CHANGES.txt (original) +++ hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/CHANGES.txt Tue Nov 8 00:18:02 2011 @@ -1,5 +1,19 @@ Hadoop Change Log +Release 0.23.1 - Unreleased + + INCOMPATIBLE CHANGES + + NEW FEATURES + + IMPROVEMENTS + +HADOOP-7801. HADOOP_PREFIX cannot be overriden. 
(Bruno Mahé via tomwhite) + + OPTIMIZATIONS + + BUG FIXES + Release 0.23.0 - 2011-11-01 INCOMPATIBLE CHANGES Modified: hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/main/bin/hadoop-config.sh URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/main/bin/hadoop-config.sh?rev=1199026r1=1199025r2=1199026view=diff == --- hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/main/bin/hadoop-config.sh (original) +++ hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/main/bin/hadoop-config.sh Tue Nov 8 00:18:02 2011 @@ -27,7 +27,9 @@ this=$common_bin/$script # the root of the Hadoop installation # See HADOOP-6255 for directory structure layout -export HADOOP_PREFIX=`dirname $this`/.. +HADOOP_DEFAULT_PREFIX=`dirname $this`/.. +HADOOP_PREFIX=${HADOOP_PREFIX:-$HADOOP_DEFAULT_PREFIX} +export HADOOP_PREFIX #check to see if the conf dir is given as an optional argument if [ $# -gt 1 ] Modified: hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/main/packages/deb/init.d/hadoop-datanode URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/main/packages/deb/init.d/hadoop-datanode?rev=1199026r1=1199025r2=1199026view=diff == --- hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/main/packages/deb/init.d/hadoop-datanode (original) +++ hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/main/packages/deb/init.d/hadoop-datanode Tue Nov 8 00:18:02 2011 @@ -75,7 +75,7 @@ check_privsep_dir() { } export PATH=${PATH:+$PATH:}/usr/sbin:/usr/bin -export HADOOP_PREFIX=/usr +export HADOOP_PREFIX=${HADOOP_PREFIX:-/usr} case $1 in start) Modified: hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/main/packages/deb/init.d/hadoop-jobtracker URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/main/packages/deb/init.d/hadoop-jobtracker?rev=1199026&r1=1199025&r2=1199026&view=diff == --- hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/main/packages/deb/init.d/hadoop-jobtracker (original) +++ hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/main/packages/deb/init.d/hadoop-jobtracker Tue Nov 8 00:18:02 2011 @@ -67,7 +67,7 @@ check_privsep_dir() { } export PATH=${PATH:+$PATH:}/usr/sbin:/usr/bin -export HADOOP_PREFIX=/usr +export HADOOP_PREFIX
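The HADOOP-7801 diffs above change the init scripts from unconditionally assigning `HADOOP_PREFIX=/usr` to supplying `/usr` only as a default, so a prefix set by the caller's environment survives. A runnable sketch of the before/after behavior; the override value is invented for illustration:

```shell
#!/usr/bin/env bash
# Sketch of the HADOOP-7801 fix. The caller's override value is an
# assumed example.
HADOOP_PREFIX="/opt/hadoop"              # prefix set before the script runs

# New behavior: default expansion keeps the caller's value.
export HADOOP_PREFIX="${HADOOP_PREFIX:-/usr}"
echo "$HADOOP_PREFIX"    # /opt/hadoop — the override survives

# With no value set, the same line falls back to the packaged default.
unset HADOOP_PREFIX
export HADOOP_PREFIX="${HADOOP_PREFIX:-/usr}"
echo "$HADOOP_PREFIX"    # /usr
```

The old `export HADOOP_PREFIX=/usr` form would have printed `/usr` in both cases, which is exactly the "cannot be overriden" bug the commit fixes.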
svn commit: r1195817 - in /hadoop/common/trunk/hadoop-common-project: hadoop-annotations/src/main/java/org/apache/hadoop/classification/tools/ hadoop-common/ hadoop-common/src/main/java/org/apache/had
Author: tomwhite Date: Tue Nov 1 04:47:10 2011 New Revision: 1195817 URL: http://svn.apache.org/viewvc?rev=1195817view=rev Log: HADOOP-7782. Aggregate project javadocs. Added: hadoop/common/trunk/hadoop-common-project/hadoop-annotations/src/main/java/org/apache/hadoop/classification/tools/IncludePublicAnnotationsStandardDoclet.java (with props) hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/compress/snappy/package-info.java (with props) hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/nativeio/package-info.java (with props) hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/protobuf/package-info.java (with props) hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/authorize/package-info.java (with props) hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/token/delegation/package-info.java (with props) hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/token/package-info.java (with props) hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/tools/package-info.java (with props) Modified: hadoop/common/trunk/hadoop-common-project/hadoop-annotations/src/main/java/org/apache/hadoop/classification/tools/RootDocProcessor.java hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt Added: hadoop/common/trunk/hadoop-common-project/hadoop-annotations/src/main/java/org/apache/hadoop/classification/tools/IncludePublicAnnotationsStandardDoclet.java URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-annotations/src/main/java/org/apache/hadoop/classification/tools/IncludePublicAnnotationsStandardDoclet.java?rev=1195817view=auto == --- 
hadoop/common/trunk/hadoop-common-project/hadoop-annotations/src/main/java/org/apache/hadoop/classification/tools/IncludePublicAnnotationsStandardDoclet.java (added) +++ hadoop/common/trunk/hadoop-common-project/hadoop-annotations/src/main/java/org/apache/hadoop/classification/tools/IncludePublicAnnotationsStandardDoclet.java Tue Nov 1 04:47:10 2011 @@ -0,0 +1,63 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + */ +package org.apache.hadoop.classification.tools; + +import com.sun.javadoc.DocErrorReporter; +import com.sun.javadoc.LanguageVersion; +import com.sun.javadoc.RootDoc; +import com.sun.tools.doclets.standard.Standard; + +/** + * A <a href="http://java.sun.com/javase/6/docs/jdk/api/javadoc/doclet/">Doclet</a> + * that only includes class-level elements that are annotated with + * {@link org.apache.hadoop.classification.InterfaceAudience.Public}. + * Class-level elements with no annotation are excluded. + * In addition, all elements that are annotated with + * {@link org.apache.hadoop.classification.InterfaceAudience.Private} or + * {@link org.apache.hadoop.classification.InterfaceAudience.LimitedPrivate} + * are also excluded. + * It delegates to the Standard Doclet, and takes the same options.
+ */ +public class IncludePublicAnnotationsStandardDoclet { + + public static LanguageVersion languageVersion() { +return LanguageVersion.JAVA_1_5; + } + + public static boolean start(RootDoc root) { +System.out.println( +IncludePublicAnnotationsStandardDoclet.class.getSimpleName()); +RootDocProcessor.treatUnannotatedClassesAsPrivate = true; +return Standard.start(RootDocProcessor.process(root)); + } + + public static int optionLength(String option) { +Integer length = StabilityOptions.optionLength(option); +if (length != null) { + return length; +} +return Standard.optionLength(option); + } + + public static boolean validOptions(String[][] options, + DocErrorReporter reporter) { +StabilityOptions.validOptions(options, reporter); +String[][] filteredOptions = StabilityOptions.filterOptions(options); +return Standard.validOptions(filteredOptions, reporter
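The doclet's filtering rule (include only classes explicitly annotated public; treat unannotated classes as private) can be sketched in miniature without the `com.sun.javadoc` API. This is an illustration only: the `Public`/`Private` annotations and class names below are hypothetical stand-ins for Hadoop's `InterfaceAudience` annotations, and plain reflection stands in for `RootDocProcessor`.

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;

// Hypothetical stand-ins for Hadoop's InterfaceAudience annotations.
@Retention(RetentionPolicy.RUNTIME) @interface Public {}
@Retention(RetentionPolicy.RUNTIME) @interface Private {}

@Public  class PublicApi {}
@Private class PrivateImpl {}
class Unannotated {}

public class DocletFilterSketch {
  // Mirrors the doclet's rule: a class is documented only if it is
  // explicitly annotated @Public; @Private and unannotated classes
  // are treated as private and excluded from the aggregate javadoc.
  static boolean isDocumented(Class<?> c) {
    return c.isAnnotationPresent(Public.class)
        && !c.isAnnotationPresent(Private.class);
  }

  public static void main(String[] args) {
    System.out.println(isDocumented(PublicApi.class));    // true
    System.out.println(isDocumented(PrivateImpl.class));  // false
    System.out.println(isDocumented(Unannotated.class));  // false
  }
}
```

The real doclet then hands the filtered `RootDoc` to the Standard Doclet, so all of the usual javadoc options keep working.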
svn commit: r1195817 - in /hadoop/common/trunk: hadoop-project/src/site/site.xml pom.xml
Author: tomwhite
Date: Tue Nov  1 04:47:10 2011
New Revision: 1195817

URL: http://svn.apache.org/viewvc?rev=1195817&view=rev
Log:
HADOOP-7782. Aggregate project javadocs.

Modified:
    hadoop/common/trunk/hadoop-project/src/site/site.xml
    hadoop/common/trunk/pom.xml

Modified: hadoop/common/trunk/hadoop-project/src/site/site.xml
URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-project/src/site/site.xml?rev=1195817&r1=1195816&r2=1195817&view=diff
==============================================================================
--- hadoop/common/trunk/hadoop-project/src/site/site.xml (original)
+++ hadoop/common/trunk/hadoop-project/src/site/site.xml Tue Nov  1 04:47:10 2011
@@ -68,7 +68,8 @@
       <item name="Capacity Scheduler" href="hadoop-yarn/hadoop-yarn-site/CapacityScheduler.html"/>
     </menu>

-    <menu name="Configuration Reference" inherit="top">
+    <menu name="Reference" inherit="top">
+      <item name="API docs" href="api/index.html"/>
       <item name="core-default.xml" href="hadoop-project-dist/hadoop-common/core-default.xml"/>
       <item name="hdfs-default.xml" href="hadoop-project-dist/hadoop-hdfs/hdfs-default.xml"/>
       <item name="mapred-default.xml" href="hadoop-mapreduce-client/hadoop-mapreduce-client-core/mapred-default.xml"/>

Modified: hadoop/common/trunk/pom.xml
URL: http://svn.apache.org/viewvc/hadoop/common/trunk/pom.xml?rev=1195817&r1=1195816&r2=1195817&view=diff
==============================================================================
--- hadoop/common/trunk/pom.xml (original)
+++ hadoop/common/trunk/pom.xml Tue Nov  1 04:47:10 2011
@@ -109,8 +109,7 @@
           <artifactId>maven-site-plugin</artifactId>
           <version>3.0</version>
           <configuration>
-            <!-- Reports are generated at the site level -->
-            <generateReports>false</generateReports>
+            <generateReports>true</generateReports>
           </configuration>
         </plugin>
       </plugins>
@@ -180,6 +179,70 @@
     </plugins>
   </build>

+  <reporting>
+    <excludeDefaults>true</excludeDefaults>
+    <plugins>
+      <plugin>
+        <groupId>org.apache.maven.plugins</groupId>
+        <artifactId>maven-javadoc-plugin</artifactId>
+        <version>2.8</version>
+        <reportSets>
+          <reportSet>
+            <id>aggregate</id>
+            <configuration>
+              <maxmemory>1024m</maxmemory>
+              <linksource>true</linksource>
+              <quiet>true</quiet>
+              <verbose>false</verbose>
+              <source>${maven.compile.source}</source>
+              <charset>${maven.compile.encoding}</charset>
+              <reportOutputDirectory>${project.build.directory}/site</reportOutputDirectory>
+              <destDir>hadoop-project/api</destDir>
+              <!-- Non-public APIs -->
+              <excludePackageNames>org.apache.hadoop.authentication*,org.apache.hadoop.hdfs*,org.apache.hadoop.mapreduce.v2.proto,org.apache.hadoop.yarn.proto,org.apache.hadoop.yarn.server*,org.apache.hadoop.yarn.webapp*</excludePackageNames>
+              <groups>
+                <group>
+                  <title>Common</title>
+                  <packages>org.apache.hadoop*</packages>
+                </group>
+                <group>
+                  <title>MapReduce</title>
+                  <packages>org.apache.hadoop.mapred*</packages>
+                </group>
+                <group>
+                  <title>YARN</title>
+                  <packages>org.apache.hadoop.yarn*</packages>
+                </group>
+              </groups>
+              <doclet>org.apache.hadoop.classification.tools.IncludePublicAnnotationsStandardDoclet</doclet>
+              <docletArtifacts>
+                <docletArtifact>
+                  <groupId>org.apache.hadoop</groupId>
+                  <artifactId>hadoop-annotations</artifactId>
+                  <version>${project.version}</version>
+                </docletArtifact>
+              </docletArtifacts>
+              <useStandardDocletOptions>true</useStandardDocletOptions>
+
+              <!-- switch on dependency-driven aggregation -->
+              <includeDependencySources>false</includeDependencySources>
+
+              <dependencySourceIncludes>
+                <!-- include ONLY dependencies I control -->
+                <dependencySourceInclude>org.apache.hadoop:hadoop-annotations</dependencySourceInclude>
+              </dependencySourceIncludes>
+
+            </configuration>
+            <reports>
+              <report>aggregate</report>
+            </reports>
+          </reportSet>
+        </reportSets>
+      </plugin>
+
+    </plugins>
+  </reporting>
+
   <profiles>
     <profile>
       <id>src</id>
svn commit: r1195821 - in /hadoop/common/branches/branch-0.23: hadoop-project/src/site/site.xml pom.xml
Author: tomwhite
Date: Tue Nov  1 04:53:00 2011
New Revision: 1195821

URL: http://svn.apache.org/viewvc?rev=1195821&view=rev
Log:
Merge -r 1195816:1195817 from trunk to branch-0.23. Fixes: HADOOP-7782.

Modified:
    hadoop/common/branches/branch-0.23/hadoop-project/src/site/site.xml
    hadoop/common/branches/branch-0.23/pom.xml

Modified: hadoop/common/branches/branch-0.23/hadoop-project/src/site/site.xml
URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-0.23/hadoop-project/src/site/site.xml?rev=1195821&r1=1195820&r2=1195821&view=diff
==============================================================================
--- hadoop/common/branches/branch-0.23/hadoop-project/src/site/site.xml (original)
+++ hadoop/common/branches/branch-0.23/hadoop-project/src/site/site.xml Tue Nov  1 04:53:00 2011
@@ -68,7 +68,8 @@
       <item name="Capacity Scheduler" href="hadoop-yarn/hadoop-yarn-site/CapacityScheduler.html"/>
     </menu>

-    <menu name="Configuration Reference" inherit="top">
+    <menu name="Reference" inherit="top">
+      <item name="API docs" href="api/index.html"/>
       <item name="core-default.xml" href="hadoop-project-dist/hadoop-common/core-default.xml"/>
       <item name="hdfs-default.xml" href="hadoop-project-dist/hadoop-hdfs/hdfs-default.xml"/>
       <item name="mapred-default.xml" href="hadoop-mapreduce-client/hadoop-mapreduce-client-core/mapred-default.xml"/>

Modified: hadoop/common/branches/branch-0.23/pom.xml
URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-0.23/pom.xml?rev=1195821&r1=1195820&r2=1195821&view=diff
==============================================================================
--- hadoop/common/branches/branch-0.23/pom.xml (original)
+++ hadoop/common/branches/branch-0.23/pom.xml Tue Nov  1 04:53:00 2011
@@ -109,8 +109,7 @@
           <artifactId>maven-site-plugin</artifactId>
           <version>3.0</version>
           <configuration>
-            <!-- Reports are generated at the site level -->
-            <generateReports>false</generateReports>
+            <generateReports>true</generateReports>
           </configuration>
         </plugin>
       </plugins>
@@ -180,6 +179,70 @@
     </plugins>
   </build>

+  <reporting>
+    <excludeDefaults>true</excludeDefaults>
+    <plugins>
+      <plugin>
+        <groupId>org.apache.maven.plugins</groupId>
+        <artifactId>maven-javadoc-plugin</artifactId>
+        <version>2.8</version>
+        <reportSets>
+          <reportSet>
+            <id>aggregate</id>
+            <configuration>
+              <maxmemory>1024m</maxmemory>
+              <linksource>true</linksource>
+              <quiet>true</quiet>
+              <verbose>false</verbose>
+              <source>${maven.compile.source}</source>
+              <charset>${maven.compile.encoding}</charset>
+              <reportOutputDirectory>${project.build.directory}/site</reportOutputDirectory>
+              <destDir>hadoop-project/api</destDir>
+              <!-- Non-public APIs -->
+              <excludePackageNames>org.apache.hadoop.authentication*,org.apache.hadoop.hdfs*,org.apache.hadoop.mapreduce.v2.proto,org.apache.hadoop.yarn.proto,org.apache.hadoop.yarn.server*,org.apache.hadoop.yarn.webapp*</excludePackageNames>
+              <groups>
+                <group>
+                  <title>Common</title>
+                  <packages>org.apache.hadoop*</packages>
+                </group>
+                <group>
+                  <title>MapReduce</title>
+                  <packages>org.apache.hadoop.mapred*</packages>
+                </group>
+                <group>
+                  <title>YARN</title>
+                  <packages>org.apache.hadoop.yarn*</packages>
+                </group>
+              </groups>
+              <doclet>org.apache.hadoop.classification.tools.IncludePublicAnnotationsStandardDoclet</doclet>
+              <docletArtifacts>
+                <docletArtifact>
+                  <groupId>org.apache.hadoop</groupId>
+                  <artifactId>hadoop-annotations</artifactId>
+                  <version>${project.version}</version>
+                </docletArtifact>
+              </docletArtifacts>
+              <useStandardDocletOptions>true</useStandardDocletOptions>
+
+              <!-- switch on dependency-driven aggregation -->
+              <includeDependencySources>false</includeDependencySources>
+
+              <dependencySourceIncludes>
+                <!-- include ONLY dependencies I control -->
+                <dependencySourceInclude>org.apache.hadoop:hadoop-annotations</dependencySourceInclude>
+              </dependencySourceIncludes>
+
+            </configuration>
+            <reports>
+              <report>aggregate</report>
+            </reports>
+          </reportSet>
+        </reportSets>
+      </plugin>
+
+    </plugins>
+  </reporting>
+
   <profiles>
     <profile>
       <id>src</id>
svn commit: r1195821 - in /hadoop/common/branches/branch-0.23/hadoop-common-project: hadoop-annotations/src/main/java/org/apache/hadoop/classification/tools/ hadoop-common/ hadoop-common/src/main/java
Author: tomwhite
Date: Tue Nov  1 04:53:00 2011
New Revision: 1195821

URL: http://svn.apache.org/viewvc?rev=1195821&view=rev
Log:
Merge -r 1195816:1195817 from trunk to branch-0.23. Fixes: HADOOP-7782.

Added:
    hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-annotations/src/main/java/org/apache/hadoop/classification/tools/IncludePublicAnnotationsStandardDoclet.java
      - copied unchanged from r1195817, hadoop/common/trunk/hadoop-common-project/hadoop-annotations/src/main/java/org/apache/hadoop/classification/tools/IncludePublicAnnotationsStandardDoclet.java
    hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/compress/snappy/package-info.java
      - copied unchanged from r1195817, hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/compress/snappy/package-info.java
    hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/nativeio/package-info.java
      - copied unchanged from r1195817, hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/nativeio/package-info.java
    hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/authorize/package-info.java
      - copied unchanged from r1195817, hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/authorize/package-info.java
    hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/token/delegation/package-info.java
      - copied unchanged from r1195817, hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/token/delegation/package-info.java
    hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/token/package-info.java
      - copied unchanged from r1195817, hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/token/package-info.java
    hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/tools/package-info.java
      - copied unchanged from r1195817, hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/tools/package-info.java
Modified:
    hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-annotations/src/main/java/org/apache/hadoop/classification/tools/RootDocProcessor.java
    hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/CHANGES.txt

Modified: hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-annotations/src/main/java/org/apache/hadoop/classification/tools/RootDocProcessor.java
URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-annotations/src/main/java/org/apache/hadoop/classification/tools/RootDocProcessor.java?rev=1195821&r1=1195820&r2=1195821&view=diff
==============================================================================
--- hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-annotations/src/main/java/org/apache/hadoop/classification/tools/RootDocProcessor.java (original)
+++ hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-annotations/src/main/java/org/apache/hadoop/classification/tools/RootDocProcessor.java Tue Nov  1 04:53:00 2011
@@ -50,6 +50,7 @@ import org.apache.hadoop.classification.
 class RootDocProcessor {

   static String stability = StabilityOptions.UNSTABLE_OPTION;
+  static boolean treatUnannotatedClassesAsPrivate = false;

   public static RootDoc process(RootDoc root) {
     return (RootDoc) process(root, RootDoc.class);
@@ -201,6 +202,17 @@ class RootDocProcessor {
           }
         }
       }
+      for (AnnotationDesc annotation : annotations) {
+        String qualifiedTypeName =
+          annotation.annotationType().qualifiedTypeName();
+        if (qualifiedTypeName.equals(
+            InterfaceAudience.Public.class.getCanonicalName())) {
+          return false;
+        }
+      }
+    }
+    if (treatUnannotatedClassesAsPrivate) {
+      return doc.isClass() || doc.isInterface() || doc.isAnnotationType();
     }
     return false;
   }

Modified: hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/CHANGES.txt
URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1195821&r1=1195820&r2=1195821&view=diff
==============================================================================
--- hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/CHANGES.txt (original)
+++ hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/CHANGES.txt Tue Nov  1 04:53:00 2011
@@ -437,6 +437,8 @@ Release 0.23.0 - Unreleased
     destination
svn commit: r1190703 - in /hadoop/common/trunk: ./ hadoop-project/ hadoop-project/src/ hadoop-project/src/site/ hadoop-project/src/site/apt/ hadoop-project/src/site/resources/ hadoop-project/src/site/
Author: tomwhite
Date: Sat Oct 29 00:16:16 2011
New Revision: 1190703

URL: http://svn.apache.org/viewvc?rev=1190703&view=rev
Log:
HADOOP-7763. Add top-level navigation to APT docs.

Added:
    hadoop/common/trunk/hadoop-project/src/
    hadoop/common/trunk/hadoop-project/src/site/
    hadoop/common/trunk/hadoop-project/src/site/apt/
    hadoop/common/trunk/hadoop-project/src/site/apt/index.apt.vm   (with props)
    hadoop/common/trunk/hadoop-project/src/site/resources/
    hadoop/common/trunk/hadoop-project/src/site/resources/css/
    hadoop/common/trunk/hadoop-project/src/site/resources/css/site.css   (with props)
    hadoop/common/trunk/hadoop-project/src/site/site.xml   (with props)
Modified:
    hadoop/common/trunk/hadoop-project/pom.xml
    hadoop/common/trunk/pom.xml

Modified: hadoop/common/trunk/hadoop-project/pom.xml
URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-project/pom.xml?rev=1190703&r1=1190702&r2=1190703&view=diff
==============================================================================
--- hadoop/common/trunk/hadoop-project/pom.xml (original)
+++ hadoop/common/trunk/hadoop-project/pom.xml Sat Oct 29 00:16:16 2011
@@ -472,7 +472,7 @@
       <plugin>
         <groupId>org.apache.maven.plugins</groupId>
         <artifactId>maven-site-plugin</artifactId>
-        <version>2.2</version>
+        <version>3.0</version>
       </plugin>
       <plugin>
         <groupId>org.apache.maven.plugins</groupId>

Added: hadoop/common/trunk/hadoop-project/src/site/apt/index.apt.vm
URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-project/src/site/apt/index.apt.vm?rev=1190703&view=auto
==============================================================================
--- hadoop/common/trunk/hadoop-project/src/site/apt/index.apt.vm (added)
+++ hadoop/common/trunk/hadoop-project/src/site/apt/index.apt.vm Sat Oct 29 00:16:16 2011
@@ -0,0 +1,29 @@
+~~ Licensed under the Apache License, Version 2.0 (the "License");
+~~ you may not use this file except in compliance with the License.
+~~ You may obtain a copy of the License at
+~~
+~~   http://www.apache.org/licenses/LICENSE-2.0
+~~
+~~ Unless required by applicable law or agreed to in writing, software
+~~ distributed under the License is distributed on an "AS IS" BASIS,
+~~ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+~~ See the License for the specific language governing permissions and
+~~ limitations under the License. See accompanying LICENSE file.
+
+  ---
+  Hadoop ${project.version}
+  ---
+  ---
+  ${maven.build.timestamp}
+
+Getting Started
+
+  The Hadoop documentation includes the information you need to get started using
+  Hadoop. Begin with the
+  {{{./hadoop-yarn/hadoop-yarn-site/SingleCluster.html}Single Node Setup}} which
+  shows you how to set up a single-node Hadoop installation. Then move on to the
+  {{{./hadoop-yarn/hadoop-yarn-site/ClusterSetup.html}Cluster Setup}} to learn how
+  to set up a multi-node Hadoop installation.
+

Propchange: hadoop/common/trunk/hadoop-project/src/site/apt/index.apt.vm
------------------------------------------------------------------------------
    svn:eol-style = native

Added: hadoop/common/trunk/hadoop-project/src/site/resources/css/site.css
URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-project/src/site/resources/css/site.css?rev=1190703&view=auto
==============================================================================
--- hadoop/common/trunk/hadoop-project/src/site/resources/css/site.css (added)
+++ hadoop/common/trunk/hadoop-project/src/site/resources/css/site.css Sat Oct 29 00:16:16 2011
@@ -0,0 +1,30 @@
+/*
+* Licensed to the Apache Software Foundation (ASF) under one or more
+* contributor license agreements.  See the NOTICE file distributed with
+* this work for additional information regarding copyright ownership.
+* The ASF licenses this file to You under the Apache License, Version 2.0
+* (the "License"); you may not use this file except in compliance with
+* the License.  You may obtain a copy of the License at
+*
+*     http://www.apache.org/licenses/LICENSE-2.0
+*
+* Unless required by applicable law or agreed to in writing, software
+* distributed under the License is distributed on an "AS IS" BASIS,
+* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+* See the License for the specific language governing permissions and
+* limitations under the License.
+*/
+#banner {
+  height: 93px;
+  background: none;
+}
+
+#bannerLeft img {
+  margin-left: 30px;
+  margin-top: 10px;
+}
+
+#bannerRight img {
+  margin: 17px;
+}
+

Propchange: hadoop/common/trunk/hadoop-project/src/site/resources/css/site.css
------------------------------------------------------------------------------
    svn:eol-style = native

Added: hadoop/common/trunk/hadoop-project/src/site/site.xml
URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-project/src/site/site.xml?rev=1190703&view=auto
svn commit: r1190706 - in /hadoop/common/branches/branch-0.23/hadoop-common-project: hadoop-auth/src/site/ hadoop-auth/src/site/resources/ hadoop-auth/src/site/resources/css/ hadoop-common/ hadoop-com
Author: tomwhite
Date: Sat Oct 29 00:17:58 2011
New Revision: 1190706

URL: http://svn.apache.org/viewvc?rev=1190706&view=rev
Log:
Merge -r 1190702:1190703 from trunk to branch-0.23. Fixes: HADOOP-7763.

Added:
    hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-auth/src/site/resources/
      - copied from r1190703, hadoop/common/trunk/hadoop-common-project/hadoop-auth/src/site/resources/
    hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-auth/src/site/resources/css/
      - copied from r1190703, hadoop/common/trunk/hadoop-common-project/hadoop-auth/src/site/resources/css/
    hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-auth/src/site/resources/css/site.css
      - copied unchanged from r1190703, hadoop/common/trunk/hadoop-common-project/hadoop-auth/src/site/resources/css/site.css
    hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/site/
      - copied from r1190703, hadoop/common/trunk/hadoop-common-project/hadoop-common/src/site/
    hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/site/apt/
      - copied from r1190703, hadoop/common/trunk/hadoop-common-project/hadoop-common/src/site/apt/
    hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/site/apt/DeprecatedProperties.apt.vm
      - copied unchanged from r1190703, hadoop/common/trunk/hadoop-common-project/hadoop-common/src/site/apt/DeprecatedProperties.apt.vm
Modified:
    hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-auth/src/site/site.xml
    hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/CHANGES.txt
    hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/pom.xml

Modified: hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-auth/src/site/site.xml
URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-auth/src/site/site.xml?rev=1190706&r1=1190705&r2=1190706&view=diff
==============================================================================
--- hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-auth/src/site/site.xml (original)
+++ hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-auth/src/site/site.xml Sat Oct 29 00:17:58 2011
@@ -13,16 +13,10 @@
 -->
 <project name="Hadoop Auth">

-  <version position="right"/>
-
-  <bannerLeft>
-    <name>&nbsp;</name>
-  </bannerLeft>
-
   <skin>
     <groupId>org.apache.maven.skins</groupId>
     <artifactId>maven-stylus-skin</artifactId>
-    <version>1.1</version>
+    <version>1.2</version>
   </skin>

   <body>

Modified: hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/CHANGES.txt
URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1190706&r1=1190705&r2=1190706&view=diff
==============================================================================
--- hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/CHANGES.txt (original)
+++ hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/CHANGES.txt Sat Oct 29 00:17:58 2011
@@ -447,6 +447,8 @@ Release 0.23.0 - Unreleased
     HADOOP-7446. Implement CRC32C native code using SSE4.2 instructions.
     (Kihwal Lee and todd via todd)

+    HADOOP-7763. Add top-level navigation to APT docs. (tomwhite)
+
   BUG FIXES

     HADOOP-7740. Fixed security audit logger configuration.
    (Arpit Gupta via Eric Yang)

Modified: hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/pom.xml
URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/pom.xml?rev=1190706&r1=1190705&r2=1190706&view=diff
==============================================================================
--- hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/pom.xml (original)
+++ hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/pom.xml Sat Oct 29 00:17:58 2011
@@ -347,6 +347,18 @@
               </target>
             </configuration>
           </execution>
+          <execution>
+            <phase>pre-site</phase>
+            <goals>
+              <goal>run</goal>
+            </goals>
+            <configuration>
+              <tasks>
+                <copy file="src/main/resources/core-default.xml" todir="src/site/resources"/>
+                <copy file="src/main/xsl/configuration.xsl" todir="src/site/resources"/>
+              </tasks>
+            </configuration>
+          </execution>
         </executions>
       </plugin>
       <plugin>
svn commit: r1190706 - in /hadoop/common/branches/branch-0.23: ./ hadoop-project/ hadoop-project/src/ hadoop-project/src/site/ hadoop-project/src/site/apt/ hadoop-project/src/site/resources/ hadoop-pr
Author: tomwhite
Date: Sat Oct 29 00:17:58 2011
New Revision: 1190706

URL: http://svn.apache.org/viewvc?rev=1190706&view=rev
Log:
Merge -r 1190702:1190703 from trunk to branch-0.23. Fixes: HADOOP-7763.

Added:
    hadoop/common/branches/branch-0.23/hadoop-project/src/
      - copied from r1190703, hadoop/common/trunk/hadoop-project/src/
    hadoop/common/branches/branch-0.23/hadoop-project/src/site/
      - copied from r1190703, hadoop/common/trunk/hadoop-project/src/site/
    hadoop/common/branches/branch-0.23/hadoop-project/src/site/apt/
      - copied from r1190703, hadoop/common/trunk/hadoop-project/src/site/apt/
    hadoop/common/branches/branch-0.23/hadoop-project/src/site/apt/index.apt.vm
      - copied unchanged from r1190703, hadoop/common/trunk/hadoop-project/src/site/apt/index.apt.vm
    hadoop/common/branches/branch-0.23/hadoop-project/src/site/resources/
      - copied from r1190703, hadoop/common/trunk/hadoop-project/src/site/resources/
    hadoop/common/branches/branch-0.23/hadoop-project/src/site/resources/css/
      - copied from r1190703, hadoop/common/trunk/hadoop-project/src/site/resources/css/
    hadoop/common/branches/branch-0.23/hadoop-project/src/site/resources/css/site.css
      - copied unchanged from r1190703, hadoop/common/trunk/hadoop-project/src/site/resources/css/site.css
    hadoop/common/branches/branch-0.23/hadoop-project/src/site/site.xml
      - copied unchanged from r1190703, hadoop/common/trunk/hadoop-project/src/site/site.xml
Modified:
    hadoop/common/branches/branch-0.23/hadoop-project/pom.xml
    hadoop/common/branches/branch-0.23/pom.xml

Modified: hadoop/common/branches/branch-0.23/hadoop-project/pom.xml
URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-0.23/hadoop-project/pom.xml?rev=1190706&r1=1190705&r2=1190706&view=diff
==============================================================================
--- hadoop/common/branches/branch-0.23/hadoop-project/pom.xml (original)
+++ hadoop/common/branches/branch-0.23/hadoop-project/pom.xml Sat Oct 29 00:17:58 2011
@@ -469,7 +469,7 @@
       <plugin>
         <groupId>org.apache.maven.plugins</groupId>
         <artifactId>maven-site-plugin</artifactId>
-        <version>2.2</version>
+        <version>3.0</version>
       </plugin>
       <plugin>
         <groupId>org.apache.maven.plugins</groupId>

Modified: hadoop/common/branches/branch-0.23/pom.xml
URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-0.23/pom.xml?rev=1190706&r1=1190705&r2=1190706&view=diff
==============================================================================
--- hadoop/common/branches/branch-0.23/pom.xml (original)
+++ hadoop/common/branches/branch-0.23/pom.xml Sat Oct 29 00:17:58 2011
@@ -32,6 +32,10 @@
       <name>${distMgmtSnapshotsName}</name>
       <url>${distMgmtSnapshotsUrl}</url>
     </snapshotRepository>
+    <site>
+      <id>apache.website</id>
+      <url>scpexe://people.apache.org/www/hadoop.apache.org/docs/r${project.version}</url>
+    </site>
   </distributionManagement>

   <repositories>
@@ -103,7 +107,11 @@
         <plugin>
           <groupId>org.apache.maven.plugins</groupId>
           <artifactId>maven-site-plugin</artifactId>
-          <version>2.2</version>
+          <version>3.0</version>
+          <configuration>
+            <!-- Reports are generated at the site level -->
+            <generateReports>false</generateReports>
+          </configuration>
         </plugin>
       </plugins>
     </pluginManagement>
@@ -157,9 +165,21 @@
         </includes>
        </configuration>
       </plugin>
+      <plugin>
+        <artifactId>maven-site-plugin</artifactId>
+        <version>3.0</version>
+        <executions>
+          <execution>
+            <id>attach-descriptor</id>
+            <goals>
+              <goal>attach-descriptor</goal>
+            </goals>
+          </execution>
+        </executions>
+      </plugin>
     </plugins>
   </build>
-
+
   <profiles>
     <profile>
       <id>src</id>
svn commit: r1190095 - in /hadoop/common/trunk/hadoop-common-project/hadoop-common: CHANGES.txt src/main/java/org/apache/hadoop/security/token/Token.java
Author: tomwhite
Date: Thu Oct 27 23:50:11 2011
New Revision: 1190095

URL: http://svn.apache.org/viewvc?rev=1190095&view=rev
Log:
HADOOP-7778. FindBugs warning in Token.getKind().

Modified:
    hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt
    hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/token/Token.java

Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt
URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1190095&r1=1190094&r2=1190095&view=diff
==============================================================================
--- hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt (original)
+++ hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt Thu Oct 27 23:50:11 2011
@@ -762,7 +762,9 @@ Release 0.23.0 - Unreleased
     Eagles via acmurthy)

     HADOOP-7764. Allow HttpServer to set both ACL list and path spec filters.
-    (Jonathan Eagles via acmurthy)
+    (Jonathan Eagles via acmurthy)
+
+    HADOOP-7778. FindBugs warning in Token.getKind(). (tomwhite)

 Release 0.22.0 - Unreleased

Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/token/Token.java
URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/token/Token.java?rev=1190095&r1=1190094&r2=1190095&view=diff
==============================================================================
--- hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/token/Token.java (original)
+++ hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/token/Token.java Thu Oct 27 23:50:11 2011
@@ -119,7 +119,7 @@ public class Token<T extends TokenIdenti
    * Get the token kind
    * @return the kind of the token
    */
-  public Text getKind() {
+  public synchronized Text getKind() {
     return kind;
   }
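The one-word fix above addresses a FindBugs inconsistent-synchronization warning: `kind` is written under the object's monitor elsewhere in `Token`, so an unsynchronized `getKind()` read has no happens-before relationship with those writes. A minimal stand-alone sketch of the pattern (this is not Hadoop's `Token` class; the field and values are illustrative):

```java
// Sketch of the synchronization pattern behind HADOOP-7778. If setKind
// is synchronized but getKind is not, FindBugs flags the field as
// inconsistently synchronized, since a plain read may not see the
// latest write made by another thread.
public class TokenSketch {
  private String kind = "HDFS_DELEGATION_TOKEN"; // illustrative value

  public synchronized void setKind(String newKind) {
    kind = newKind;
  }

  // The fix mirrors r1190095: synchronize the read as well, so reads
  // and writes pair on the same monitor and visibility is guaranteed.
  public synchronized String getKind() {
    return kind;
  }

  public static void main(String[] args) {
    TokenSketch t = new TokenSketch();
    t.setKind("WEBHDFS_DELEGATION_TOKEN");
    System.out.println(t.getKind()); // WEBHDFS_DELEGATION_TOKEN
  }
}
```

Making the getter `volatile`-free but `synchronized` keeps it consistent with the other synchronized accesses rather than introducing a second synchronization scheme.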
svn commit: r1190097 - in /hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common: CHANGES.txt src/main/java/org/apache/hadoop/security/token/Token.java
Author: tomwhite
Date: Thu Oct 27 23:54:28 2011
New Revision: 1190097

URL: http://svn.apache.org/viewvc?rev=1190097&view=rev
Log:
Merge -r 1190094:1190095 from trunk to branch. Fixes: HADOOP-7778.

Modified:
    hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/CHANGES.txt
    hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/token/Token.java

Modified: hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/CHANGES.txt
URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1190097&r1=1190096&r2=1190097&view=diff
==============================================================================
--- hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/CHANGES.txt (original)
+++ hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/CHANGES.txt Thu Oct 27 23:54:28 2011
@@ -690,6 +690,8 @@ Release 0.23.0 - Unreleased
     HADOOP-7721. Add log before login in KerberosAuthenticationHandler.
     (jitendra)

+    HADOOP-7778. FindBugs warning in Token.getKind(). (tomwhite)
+
 Release 0.22.0 - Unreleased

   INCOMPATIBLE CHANGES

Modified: hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/token/Token.java
URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/token/Token.java?rev=1190097&r1=1190096&r2=1190097&view=diff
==============================================================================
--- hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/token/Token.java (original)
+++ hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/token/Token.java Thu Oct 27 23:54:28 2011
@@ -119,7 +119,7 @@ public class Token<T extends TokenIdenti
    * Get the token kind
    * @return the kind of the token
    */
-  public Text getKind() {
+  public synchronized Text getKind() {
     return kind;
   }
svn commit: r1189552 - /hadoop/common/trunk/dev-support/test-patch.sh
Author: tomwhite
Date: Thu Oct 27 00:13:31 2011
New Revision: 1189552

URL: http://svn.apache.org/viewvc?rev=1189552&view=rev
Log:
HADOOP-7768. PreCommit-HADOOP-Build is failing on hadoop-auth-examples

Modified:
    hadoop/common/trunk/dev-support/test-patch.sh

Modified: hadoop/common/trunk/dev-support/test-patch.sh
URL: http://svn.apache.org/viewvc/hadoop/common/trunk/dev-support/test-patch.sh?rev=1189552&r1=1189551&r2=1189552&view=diff
==============================================================================
--- hadoop/common/trunk/dev-support/test-patch.sh (original)
+++ hadoop/common/trunk/dev-support/test-patch.sh Thu Oct 27 00:13:31 2011
@@ -598,8 +598,8 @@ runTests () {
   echo ""
   echo ""

-  echo "$MVN clean install test -Pnative -D${PROJECT_NAME}PatchProcess"
-  $MVN clean install test -Pnative -D${PROJECT_NAME}PatchProcess
+  echo "$MVN clean install -Pnative -D${PROJECT_NAME}PatchProcess"
+  $MVN clean install -Pnative -D${PROJECT_NAME}PatchProcess
   if [[ $? != 0 ]] ; then
     ### Find and format names of failed tests
     failed_tests=`find . -name 'TEST*.xml' | xargs $GREP -l -E "<failure|<error" | sed -e "s|.*target/surefire-reports/TEST-| |g" | sed -e "s|\.xml||g"`
svn commit: r1188960 - in /hadoop/common/trunk/hadoop-common-project/hadoop-common: CHANGES.txt src/main/java/org/apache/hadoop/jmx/JMXJsonServlet.java src/test/java/org/apache/hadoop/jmx/TestJMXJsonS
Author: tomwhite
Date: Tue Oct 25 22:58:27 2011
New Revision: 1188960

URL: http://svn.apache.org/viewvc?rev=1188960&view=rev
Log:
HADOOP-7769. TestJMXJsonServlet is failing.

Modified:
    hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt
    hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/jmx/JMXJsonServlet.java
    hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/jmx/TestJMXJsonServlet.java

Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt
URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1188960&r1=1188959&r2=1188960&view=diff
==============================================================================
--- hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt (original)
+++ hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt Tue Oct 25 22:58:27 2011
@@ -88,6 +88,8 @@ Trunk (unreleased changes)

     MAPREDUCE-2764. Fix renewal of dfs delegation tokens. (Owen via jitendra)

+    HADOOP-7769. TestJMXJsonServlet is failing. (tomwhite)
+
 Release 0.23.0 - Unreleased

   INCOMPATIBLE CHANGES

Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/jmx/JMXJsonServlet.java
URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/jmx/JMXJsonServlet.java?rev=1188960&r1=1188959&r2=1188960&view=diff
==============================================================================
--- hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/jmx/JMXJsonServlet.java (original)
+++ hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/jmx/JMXJsonServlet.java Tue Oct 25 22:58:27 2011
@@ -168,6 +168,7 @@ public class JMXJsonServlet extends Http
         if (splitStrings.length != 2) {
           jg.writeStringField("result", "ERROR");
           jg.writeStringField("message", "query format is not as expected.");
+          jg.flush();
           response.setStatus(HttpServletResponse.SC_BAD_REQUEST);
           return;
         }

Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/jmx/TestJMXJsonServlet.java
URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/jmx/TestJMXJsonServlet.java?rev=1188960&r1=1188959&r2=1188960&view=diff
==============================================================================
--- hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/jmx/TestJMXJsonServlet.java (original)
+++ hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/jmx/TestJMXJsonServlet.java Tue Oct 25 22:58:27 2011
@@ -51,7 +51,7 @@ public class TestJMXJsonServlet extends
     assertTrue("'" + p + "' does not match " + value, m.find());
   }

-  @Test public void testQury() throws Exception {
+  @Test public void testQuery() throws Exception {
     String result = readOutput(new URL(baseUrl, "/jmx?qry=java.lang:type=Runtime"));
     LOG.info("/jmx?qry=java.lang:type=Runtime RESULT: " + result);
     assertReFind("\"name\"\\s*:\\s*\"java.lang:type=Runtime\"", result);
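The added `jg.flush()` matters because Jackson's `JsonGenerator` buffers output; returning early on the bad-request path without flushing can send the 400 status with an empty body, which is what broke the test. The same effect can be reproduced with a plain `java.io.BufferedWriter` standing in for the generator (a sketch, not the servlet code):

```java
import java.io.BufferedWriter;
import java.io.IOException;
import java.io.StringWriter;

// Illustrates why the early-return path in JMXJsonServlet needed an
// explicit flush: a buffering writer holds output in memory until it
// is flushed (or closed), so returning before the flush leaves the
// underlying stream -- here, what the HTTP client would see -- empty.
public class FlushSketch {
  public static String respond(boolean flush) throws IOException {
    StringWriter out = new StringWriter();          // "the response stream"
    BufferedWriter buffered = new BufferedWriter(out);
    buffered.write("{\"result\":\"ERROR\"}");       // small, fits in buffer
    if (flush) {
      buffered.flush(); // push buffered characters to the underlying writer
    }
    return out.toString(); // what actually reached the stream so far
  }

  public static void main(String[] args) throws IOException {
    System.out.println("without flush: '" + respond(false) + "'"); // empty
    System.out.println("with flush:    '" + respond(true) + "'");
  }
}
```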
svn commit: r1185922 - /hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt
Author: tomwhite Date: Tue Oct 18 23:48:15 2011 New Revision: 1185922 URL: http://svn.apache.org/viewvc?rev=1185922view=rev Log: HADOOP-7755. Detect MapReduce PreCommit Trunk builds silently failing when running test-patch.sh. Contributed by Jonathan Eagles Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1185922r1=1185921r2=1185922view=diff == --- hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt (original) +++ hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt Tue Oct 18 23:48:15 2011 @@ -733,6 +733,9 @@ Release 0.23.0 - Unreleased HADOOP-7724. Fixed hadoop-setup-conf.sh to put proxy user in core-site.xml. (Arpit Gupta via Eric Yang) +HADOOP-7755. Detect MapReduce PreCommit Trunk builds silently failing +when running test-patch.sh. (Jonathan Eagles via tomwhite) + Release 0.22.0 - Unreleased INCOMPATIBLE CHANGES
svn commit: r1185922 - /hadoop/common/trunk/dev-support/test-patch.sh
Author: tomwhite
Date: Tue Oct 18 23:48:15 2011
New Revision: 1185922

URL: http://svn.apache.org/viewvc?rev=1185922&view=rev
Log: HADOOP-7755. Detect MapReduce PreCommit Trunk builds silently failing when running test-patch.sh. Contributed by Jonathan Eagles

Modified:
    hadoop/common/trunk/dev-support/test-patch.sh

Modified: hadoop/common/trunk/dev-support/test-patch.sh
URL: http://svn.apache.org/viewvc/hadoop/common/trunk/dev-support/test-patch.sh?rev=1185922&r1=1185921&r2=1185922&view=diff
==============================================================================
--- hadoop/common/trunk/dev-support/test-patch.sh (original)
+++ hadoop/common/trunk/dev-support/test-patch.sh Tue Oct 18 23:48:15 2011
@@ -597,20 +597,23 @@ runTests () {
   echo "=="
   echo ""
   echo ""
-  
+
   echo "$MVN clean install test -Pnative -D${PROJECT_NAME}PatchProcess"
   $MVN clean install test -Pnative -D${PROJECT_NAME}PatchProcess
   if [[ $? != 0 ]] ; then
     ### Find and format names of failed tests
     failed_tests=`find . -name 'TEST*.xml' | xargs $GREP -l -E "<failure|<error" | sed -e "s|.*target/surefire-reports/TEST-||g" | sed -e "s|\.xml||g"`
-  fi
-
-  if [[ -n "$failed_tests" ]] ; then
-
-    JIRA_COMMENT="$JIRA_COMMENT
+
+    if [[ -n "$failed_tests" ]] ; then
+      JIRA_COMMENT="$JIRA_COMMENT
 
 -1 core tests.  The patch failed these unit tests:
 $failed_tests"
+    else
+      JIRA_COMMENT="$JIRA_COMMENT
+
+-1 core tests.  The patch failed the unit tests build"
+    fi
     return 1
   fi
   JIRA_COMMENT="$JIRA_COMMENT
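The pipeline test-patch.sh uses above turns surefire report files into failed-test class names: `find` locates `TEST-*.xml` reports, `grep -l` keeps only files containing a `<failure>` or `<error>` element, and `sed` strips the path prefix and `.xml` suffix. A self-contained sketch with fabricated report files (the directory layout and class names are invented for the demo):

```shell
# Build a fake surefire-reports tree: one passing report, one failing one.
workdir=$(mktemp -d)
mkdir -p "$workdir/target/surefire-reports"
cat > "$workdir/target/surefire-reports/TEST-org.example.TestGood.xml" <<'EOF'
<testsuite><testcase name="ok"/></testsuite>
EOF
cat > "$workdir/target/surefire-reports/TEST-org.example.TestBad.xml" <<'EOF'
<testsuite><testcase name="no"><failure>boom</failure></testcase></testsuite>
EOF

# Same pipeline as test-patch.sh: keep reports with <failure> or <error>,
# then reduce the file path to the bare test class name.
failed_tests=$(find "$workdir" -name 'TEST-*.xml' \
  | xargs grep -l -E "<failure|<error" \
  | sed -e "s|.*target/surefire-reports/TEST-||g" -e "s|\.xml||g")
echo "$failed_tests"   # -> org.example.TestBad
rm -rf "$workdir"
```

Note the `grep -l -E "<failure|<error"` pattern must be quoted — the unquoted form in the archived diff would be split by the shell at the `|`.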
svn commit: r1185923 - /hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/CHANGES.txt
Author: tomwhite Date: Tue Oct 18 23:49:33 2011 New Revision: 1185923 URL: http://svn.apache.org/viewvc?rev=1185923view=rev Log: Merge -r 1185921:1185922 from trunk to branch-0.23. Fixes:HADOOP-7755. Modified: hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/CHANGES.txt Modified: hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/CHANGES.txt URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1185923r1=1185922r2=1185923view=diff == --- hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/CHANGES.txt (original) +++ hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/CHANGES.txt Tue Oct 18 23:49:33 2011 @@ -656,6 +656,9 @@ Release 0.23.0 - Unreleased HADOOP-7708. Fixed hadoop-setup-conf.sh to handle config file consistently. (Eric Yang) +HADOOP-7755. Detect MapReduce PreCommit Trunk builds silently failing +when running test-patch.sh. (Jonathan Eagles via tomwhite) + Release 0.22.0 - Unreleased INCOMPATIBLE CHANGES
svn commit: r1185923 - /hadoop/common/branches/branch-0.23/dev-support/test-patch.sh
Author: tomwhite
Date: Tue Oct 18 23:49:33 2011
New Revision: 1185923

URL: http://svn.apache.org/viewvc?rev=1185923&view=rev
Log: Merge -r 1185921:1185922 from trunk to branch-0.23. Fixes: HADOOP-7755.

Modified:
    hadoop/common/branches/branch-0.23/dev-support/test-patch.sh

Modified: hadoop/common/branches/branch-0.23/dev-support/test-patch.sh
URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-0.23/dev-support/test-patch.sh?rev=1185923&r1=1185922&r2=1185923&view=diff
==============================================================================
--- hadoop/common/branches/branch-0.23/dev-support/test-patch.sh (original)
+++ hadoop/common/branches/branch-0.23/dev-support/test-patch.sh Tue Oct 18 23:49:33 2011
@@ -595,20 +595,23 @@ runTests () {
   echo "=="
   echo ""
   echo ""
-  
+
   echo "$MVN clean install test -Pnative -D${PROJECT_NAME}PatchProcess"
   $MVN clean install test -Pnative -D${PROJECT_NAME}PatchProcess
   if [[ $? != 0 ]] ; then
     ### Find and format names of failed tests
     failed_tests=`find . -name 'TEST*.xml' | xargs $GREP -l -E "<failure|<error" | sed -e "s|.*target/surefire-reports/TEST-||g" | sed -e "s|\.xml||g"`
-  fi
-
-  if [[ -n "$failed_tests" ]] ; then
-
-    JIRA_COMMENT="$JIRA_COMMENT
+
+    if [[ -n "$failed_tests" ]] ; then
+      JIRA_COMMENT="$JIRA_COMMENT
 
 -1 core tests.  The patch failed these unit tests:
 $failed_tests"
+    else
+      JIRA_COMMENT="$JIRA_COMMENT
+
+-1 core tests.  The patch failed the unit tests build"
+    fi
     return 1
   fi
   JIRA_COMMENT="$JIRA_COMMENT
svn commit: r1166848 - /hadoop/common/trunk/dev-support/test-patch.sh
Author: tomwhite Date: Thu Sep 8 18:39:11 2011 New Revision: 1166848 URL: http://svn.apache.org/viewvc?rev=1166848view=rev Log: HADOOP-7612. Change test-patch to run tests for all nested modules. Modified: hadoop/common/trunk/dev-support/test-patch.sh Modified: hadoop/common/trunk/dev-support/test-patch.sh URL: http://svn.apache.org/viewvc/hadoop/common/trunk/dev-support/test-patch.sh?rev=1166848r1=1166847r2=1166848view=diff == --- hadoop/common/trunk/dev-support/test-patch.sh (original) +++ hadoop/common/trunk/dev-support/test-patch.sh Thu Sep 8 18:39:11 2011 @@ -64,6 +64,7 @@ printUsage() { echo --findbugs-home=path Findbugs home directory (default FINDBUGS_HOME environment variable) echo --forrest-home=path Forrest home directory (default FORREST_HOME environment variable) echo --dirty-workspace Allow the local SVN workspace to have uncommitted changes + echo --run-testsRun all tests below the base directory echo echo Jenkins-only options: echo --jenkins Run by Jenkins (runs tests and posts results to JIRA) @@ -130,6 +131,9 @@ parseArgs() { --dirty-workspace) DIRTY_WORKSPACE=true ;; +--run-tests) + RUN_TESTS=true + ;; *) PATCH_OR_DEFECT=$i ;; @@ -249,6 +253,18 @@ setup () { echo == echo echo + if [[ ! -d hadoop-common-project ]]; then +cd $bindir/.. +echo Compiling $(pwd) +echo $MVN clean test -DskipTests $PATCH_DIR/trunkCompile.txt 21 +$MVN clean test -DskipTests $PATCH_DIR/trunkCompile.txt 21 +if [[ $? != 0 ]] ; then + echo Top-level trunk compilation is broken? + cleanupAndExit 1 +fi +cd - + fi + echo Compiling $(pwd) echo $MVN clean test -DskipTests -D${PROJECT_NAME}PatchProcess -Ptest-patch $PATCH_DIR/trunkJavacWarnings.txt 21 $MVN clean test -DskipTests -D${PROJECT_NAME}PatchProcess -Ptest-patch $PATCH_DIR/trunkJavacWarnings.txt 21 if [[ $? 
!= 0 ]] ; then

@@ -580,26 +596,12 @@ runTests () {
   echo ""
   echo ""
 
-  failed_tests=""
-  modules=$(findModules)
-  for module in $modules;
-  do
-    pushd $module
-    echo "  Running tests in $module"
-    ### Kill any rogue build processes from the last attempt
-    $PS auxwww | $GREP ${PROJECT_NAME}PatchProcess | $AWK '{print $2}' | /usr/bin/xargs -t -I {} /bin/kill -9 {} > /dev/null
-
-    echo "$MVN clean test -Pnative -D${PROJECT_NAME}PatchProcess"
-    $MVN clean test -Pnative -D${PROJECT_NAME}PatchProcess
-    if [[ $? != 0 ]] ; then
-      ### Find and format names of failed tests
-      module_failed_tests=`find . -name 'TEST*.xml' | xargs $GREP -l -E "<failure|<error" | sed -e "s|.*target/surefire-reports/TEST-||g" | sed -e "s|\.xml||g"`
-      failed_tests="${failed_tests}
-${module_failed_tests}"
-    fi
-    popd
-  done
-  echo $failed_tests
+  echo "$MVN clean test -Pnative -D${PROJECT_NAME}PatchProcess"
+  $MVN clean test -Pnative -D${PROJECT_NAME}PatchProcess
+  if [[ $? != 0 ]] ; then
+    ### Find and format names of failed tests
+    failed_tests=`find . -name 'TEST*.xml' | xargs $GREP -l -E "<failure|<error" | sed -e "s|.*target/surefire-reports/TEST-||g" | sed -e "s|\.xml||g"`
+  fi
 
   if [[ -n "$failed_tests" ]] ; then

@@ -616,36 +618,6 @@ $failed_tests
 }
 
 ###
-### Find the modules changed by the patch
-
-findModules () {
-  # Come up with a list of changed files into $TMP
-  TMP=/tmp/tmp.paths.$$
-  $GREP '^+++\|^---' $PATCH_DIR/patch | cut -c '5-' | $GREP -v /dev/null | sort | uniq > $TMP
-
-  # if all of the lines start with a/ or b/, then this is a git patch that
-  # was generated without --no-prefix
-  if ! $GREP -qv '^a/\|^b/' $TMP ; then
-    sed -i -e 's,^[ab]/,,' $TMP
-  fi
-
-  PREFIX_DIRS=$(cut -d '/' -f 1 $TMP | sort | uniq)
-
-  # if all of the lines start with hadoop-common-project/, hadoop-hdfs-project/, or hadoop-mapreduce-project/, this is
-  # relative to the hadoop root instead of the subproject root
-  if [[ "$PREFIX_DIRS" =~ ^(hadoop-common-project|hadoop-hdfs-project|hadoop-mapreduce-project)$ ]]; then
-    echo $PREFIX_DIRS
-    return 0
-  elif ! echo "$PREFIX_DIRS" | grep -vxq 'hadoop-common-project\|hadoop-hdfs-project\|hadoop-mapreduce-project' ; then
-    echo $PREFIX_DIRS
-    return 0
-  fi
-
-  # No modules found. Running from current directory.
-  echo .
-}
-
-###
 ### Run the test-contrib target
 
 runContribTests () {
   echo

@@ -820,8 +792,8 @@ checkFindbugsWarnings
 (( RESULT = RESULT + $? ))
 checkReleaseAuditWarnings
 (( RESULT = RESULT + $? ))
-### Do not call these when run by a developer
-if [[ $JENKINS == "true" ]] ; then
+### Run tests for Jenkins or if explicitly asked
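Although `findModules()` is removed in this commit, its git-prefix handling is a useful idiom on its own: a patch generated by git without `--no-prefix` prefixes every changed path with `a/` or `b/`, and that case is detectable because no line escapes the prefix grep. A standalone sketch of just that step — the function name and temp-file handling are illustrative, and a portable redirect replaces the original's `sed -i`:

```shell
# If every line of the paths file starts with a/ or b/, the patch came
# from git without --no-prefix, so strip the prefixes in place.
strip_git_prefixes() {
  paths="$1"
  # grep -qv succeeds only if some line does NOT carry the prefix;
  # negated, this means "all lines are prefixed"
  if ! grep -qv '^a/\|^b/' "$paths"; then
    sed -e 's,^[ab]/,,' "$paths" > "$paths.tmp" && mv "$paths.tmp" "$paths"
  fi
}
```

A mixed file (some paths unprefixed) is left untouched, which is the behavior the removed function relied on to distinguish git patches from plain `svn diff` output.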
svn commit: r1166852 - /hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/CHANGES.txt
Author: tomwhite Date: Thu Sep 8 18:43:55 2011 New Revision: 1166852 URL: http://svn.apache.org/viewvc?rev=1166852view=rev Log: Merge -r 1166847:1166848 from trunk to branch. Fixes: HADOOP-7612 Modified: hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/CHANGES.txt Modified: hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/CHANGES.txt URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1166852r1=1166851r2=1166852view=diff == --- hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/CHANGES.txt (original) +++ hadoop/common/branches/branch-0.23/hadoop-common-project/hadoop-common/CHANGES.txt Thu Sep 8 18:43:55 2011 @@ -364,6 +364,9 @@ Release 0.23.0 - Unreleased HADOOP-7507. Allow ganglia metrics to include the metrics system tags in the gmetric names. (Alejandro Abdelnur via todd) +HADOOP-7612. Change test-patch to run tests for all nested modules. +(tomwhite) + OPTIMIZATIONS HADOOP-7333. Performance improvement in PureJavaCrc32. (Eric Caspole
svn commit: r1166852 - /hadoop/common/branches/branch-0.23/dev-support/test-patch.sh
Author: tomwhite Date: Thu Sep 8 18:43:55 2011 New Revision: 1166852 URL: http://svn.apache.org/viewvc?rev=1166852view=rev Log: Merge -r 1166847:1166848 from trunk to branch. Fixes: HADOOP-7612 Modified: hadoop/common/branches/branch-0.23/dev-support/test-patch.sh Modified: hadoop/common/branches/branch-0.23/dev-support/test-patch.sh URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-0.23/dev-support/test-patch.sh?rev=1166852r1=1166851r2=1166852view=diff == --- hadoop/common/branches/branch-0.23/dev-support/test-patch.sh (original) +++ hadoop/common/branches/branch-0.23/dev-support/test-patch.sh Thu Sep 8 18:43:55 2011 @@ -64,6 +64,7 @@ printUsage() { echo --findbugs-home=path Findbugs home directory (default FINDBUGS_HOME environment variable) echo --forrest-home=path Forrest home directory (default FORREST_HOME environment variable) echo --dirty-workspace Allow the local SVN workspace to have uncommitted changes + echo --run-testsRun all tests below the base directory echo echo Jenkins-only options: echo --jenkins Run by Jenkins (runs tests and posts results to JIRA) @@ -130,6 +131,9 @@ parseArgs() { --dirty-workspace) DIRTY_WORKSPACE=true ;; +--run-tests) + RUN_TESTS=true + ;; *) PATCH_OR_DEFECT=$i ;; @@ -249,6 +253,18 @@ setup () { echo == echo echo + if [[ ! -d hadoop-common-project ]]; then +cd $bindir/.. +echo Compiling $(pwd) +echo $MVN clean test -DskipTests $PATCH_DIR/trunkCompile.txt 21 +$MVN clean test -DskipTests $PATCH_DIR/trunkCompile.txt 21 +if [[ $? != 0 ]] ; then + echo Top-level trunk compilation is broken? + cleanupAndExit 1 +fi +cd - + fi + echo Compiling $(pwd) echo $MVN clean test -DskipTests -D${PROJECT_NAME}PatchProcess -Ptest-patch $PATCH_DIR/trunkJavacWarnings.txt 21 $MVN clean test -DskipTests -D${PROJECT_NAME}PatchProcess -Ptest-patch $PATCH_DIR/trunkJavacWarnings.txt 21 if [[ $? 
!= 0 ]] ; then

@@ -580,26 +596,12 @@ runTests () {
   echo ""
   echo ""
 
-  failed_tests=""
-  modules=$(findModules)
-  for module in $modules;
-  do
-    pushd $module
-    echo "  Running tests in $module"
-    ### Kill any rogue build processes from the last attempt
-    $PS auxwww | $GREP ${PROJECT_NAME}PatchProcess | $AWK '{print $2}' | /usr/bin/xargs -t -I {} /bin/kill -9 {} > /dev/null
-
-    echo "$MVN clean test -Pnative -D${PROJECT_NAME}PatchProcess"
-    $MVN clean test -Pnative -D${PROJECT_NAME}PatchProcess
-    if [[ $? != 0 ]] ; then
-      ### Find and format names of failed tests
-      module_failed_tests=`find . -name 'TEST*.xml' | xargs $GREP -l -E "<failure|<error" | sed -e "s|.*target/surefire-reports/TEST-||g" | sed -e "s|\.xml||g"`
-      failed_tests="${failed_tests}
-${module_failed_tests}"
-    fi
-    popd
-  done
-  echo $failed_tests
+  echo "$MVN clean test -Pnative -D${PROJECT_NAME}PatchProcess"
+  $MVN clean test -Pnative -D${PROJECT_NAME}PatchProcess
+  if [[ $? != 0 ]] ; then
+    ### Find and format names of failed tests
+    failed_tests=`find . -name 'TEST*.xml' | xargs $GREP -l -E "<failure|<error" | sed -e "s|.*target/surefire-reports/TEST-||g" | sed -e "s|\.xml||g"`
+  fi
 
   if [[ -n "$failed_tests" ]] ; then

@@ -616,36 +618,6 @@ $failed_tests
 }
 
 ###
-### Find the modules changed by the patch
-
-findModules () {
-  # Come up with a list of changed files into $TMP
-  TMP=/tmp/tmp.paths.$$
-  $GREP '^+++\|^---' $PATCH_DIR/patch | cut -c '5-' | $GREP -v /dev/null | sort | uniq > $TMP
-
-  # if all of the lines start with a/ or b/, then this is a git patch that
-  # was generated without --no-prefix
-  if ! $GREP -qv '^a/\|^b/' $TMP ; then
-    sed -i -e 's,^[ab]/,,' $TMP
-  fi
-
-  PREFIX_DIRS=$(cut -d '/' -f 1 $TMP | sort | uniq)
-
-  # if all of the lines start with hadoop-common-project/, hadoop-hdfs-project/, or hadoop-mapreduce-project/, this is
-  # relative to the hadoop root instead of the subproject root
-  if [[ "$PREFIX_DIRS" =~ ^(hadoop-common-project|hadoop-hdfs-project|hadoop-mapreduce-project)$ ]]; then
-    echo $PREFIX_DIRS
-    return 0
-  elif ! echo "$PREFIX_DIRS" | grep -vxq 'hadoop-common-project\|hadoop-hdfs-project\|hadoop-mapreduce-project' ; then
-    echo $PREFIX_DIRS
-    return 0
-  fi
-
-  # No modules found. Running from current directory.
-  echo .
-}
-
-###
 ### Run the test-contrib target
 
 runContribTests () {
   echo

@@ -820,8 +792,8 @@ checkFindbugsWarnings
 (( RESULT = RESULT + $? ))
 checkReleaseAuditWarnings
 (( RESULT = RESULT + $? ))
-### Do not call these when run by a developer
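The `setup()` hunk in this commit checks for a `hadoop-common-project/` directory in the working directory; when it is absent, test-patch.sh assumes it was launched inside a subproject and compiles the whole trunk from the script's parent directory first. A minimal sketch of that guard — it only echoes the Maven command instead of running it, and the function name is invented here:

```shell
# Decide whether a full top-level build is needed before the per-project
# compile; mirrors the directory test in setup(), but echoes the command
# rather than invoking Maven.
maybe_build_top_level() {
  top="$1"   # the source-tree root, i.e. $bindir/.. in test-patch.sh
  if [ ! -d hadoop-common-project ]; then
    # not at the root: build trunk from the top first
    ( cd "$top" && echo "would run: mvn clean test -DskipTests" )
  else
    echo "already at the top level; skipping trunk compile"
  fi
}
```

The subshell around `cd` keeps the caller's working directory unchanged, which is what the `cd -` in the real script is there to undo.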
svn commit: r1161304 - /hadoop/common/trunk/hadoop-common/CHANGES.txt
Author: tomwhite Date: Wed Aug 24 22:28:54 2011 New Revision: 1161304 URL: http://svn.apache.org/viewvc?rev=1161304view=rev Log: HADOOP-7567. 'mvn eclipse:eclipse' fails for hadoop-alfredo (auth). Contributed by Alejandro Abdelnur. Modified: hadoop/common/trunk/hadoop-common/CHANGES.txt Modified: hadoop/common/trunk/hadoop-common/CHANGES.txt URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common/CHANGES.txt?rev=1161304r1=1161303r2=1161304view=diff == --- hadoop/common/trunk/hadoop-common/CHANGES.txt (original) +++ hadoop/common/trunk/hadoop-common/CHANGES.txt Wed Aug 24 22:28:54 2011 @@ -519,6 +519,9 @@ Trunk (unreleased changes) HADOOP-7566. MR tests are failing webapps/hdfs not found in CLASSPATH. (Alejandro Abdelnur via mahadev) +HADOOP-7567. 'mvn eclipse:eclipse' fails for hadoop-alfredo (auth). +(Alejandro Abdelnur via tomwhite) + Release 0.22.0 - Unreleased INCOMPATIBLE CHANGES
svn commit: r1161304 - /hadoop/common/trunk/hadoop-alfredo/pom.xml
Author: tomwhite
Date: Wed Aug 24 22:28:54 2011
New Revision: 1161304

URL: http://svn.apache.org/viewvc?rev=1161304&view=rev
Log: HADOOP-7567. 'mvn eclipse:eclipse' fails for hadoop-alfredo (auth). Contributed by Alejandro Abdelnur.

Modified:
    hadoop/common/trunk/hadoop-alfredo/pom.xml

Modified: hadoop/common/trunk/hadoop-alfredo/pom.xml
URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-alfredo/pom.xml?rev=1161304&r1=1161303&r2=1161304&view=diff
==============================================================================
--- hadoop/common/trunk/hadoop-alfredo/pom.xml (original)
+++ hadoop/common/trunk/hadoop-alfredo/pom.xml Wed Aug 24 22:28:54 2011
@@ -91,13 +91,6 @@
           <include>krb5.conf</include>
         </includes>
       </testResource>
-      <testResource>
-        <directory>${basedir}/src/test/resources</directory>
-        <filtering>false</filtering>
-        <excludes>
-          <exclude>krb5.conf</exclude>
-        </excludes>
-      </testResource>
     </testResources>
     <plugins>
       <plugin>
svn commit: r1160341 - /hadoop/common/trunk/hadoop-common/CHANGES.txt
Author: tomwhite Date: Mon Aug 22 17:19:46 2011 New Revision: 1160341 URL: http://svn.apache.org/viewvc?rev=1160341view=rev Log: HADOOP-7498. Remove legacy TAR layout creation. Contributed by Alejandro Abdelnur. Modified: hadoop/common/trunk/hadoop-common/CHANGES.txt Modified: hadoop/common/trunk/hadoop-common/CHANGES.txt URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common/CHANGES.txt?rev=1160341r1=1160340r2=1160341view=diff == --- hadoop/common/trunk/hadoop-common/CHANGES.txt (original) +++ hadoop/common/trunk/hadoop-common/CHANGES.txt Mon Aug 22 17:19:46 2011 @@ -334,6 +334,9 @@ Trunk (unreleased changes) HADOOP-7264. Bump avro version to at least 1.4.1. (Alejandro Abdelnur via tomwhite) +HADOOP-7498. Remove legacy TAR layout creation. (Alejandro Abdelnur via +tomwhite) + OPTIMIZATIONS HADOOP-7333. Performance improvement in PureJavaCrc32. (Eric Caspole
svn commit: r1160341 - in /hadoop/common/trunk: BUILDING.txt hadoop-assemblies/src/main/resources/assemblies/hadoop-bintar.xml hadoop-assemblies/src/main/resources/assemblies/hadoop-tar.xml hadoop-pro
Author: tomwhite Date: Mon Aug 22 17:19:46 2011 New Revision: 1160341 URL: http://svn.apache.org/viewvc?rev=1160341view=rev Log: HADOOP-7498. Remove legacy TAR layout creation. Contributed by Alejandro Abdelnur. Removed: hadoop/common/trunk/hadoop-assemblies/src/main/resources/assemblies/hadoop-bintar.xml Modified: hadoop/common/trunk/BUILDING.txt hadoop/common/trunk/hadoop-assemblies/src/main/resources/assemblies/hadoop-tar.xml hadoop/common/trunk/hadoop-project-distro/pom.xml Modified: hadoop/common/trunk/BUILDING.txt URL: http://svn.apache.org/viewvc/hadoop/common/trunk/BUILDING.txt?rev=1160341r1=1160340r2=1160341view=diff == --- hadoop/common/trunk/BUILDING.txt (original) +++ hadoop/common/trunk/BUILDING.txt Mon Aug 22 17:19:46 2011 @@ -43,7 +43,7 @@ Maven build goals: * Run clover: mvn test -Pclover [-DcloverLicenseLocation=${user.name}/.clover.license] * Run Rat : mvn apache-rat:check * Build javadocs: mvn javadoc:javadoc - * Build TAR : mvn package [-Ptar][-Pbintar][-Pdocs][-Psrc][-Pnative] + * Build TAR : mvn package [-Ptar][-Pdocs][-Psrc][-Pnative] Build options: Modified: hadoop/common/trunk/hadoop-assemblies/src/main/resources/assemblies/hadoop-tar.xml URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-assemblies/src/main/resources/assemblies/hadoop-tar.xml?rev=1160341r1=1160340r2=1160341view=diff == --- hadoop/common/trunk/hadoop-assemblies/src/main/resources/assemblies/hadoop-tar.xml (original) +++ hadoop/common/trunk/hadoop-assemblies/src/main/resources/assemblies/hadoop-tar.xml Mon Aug 22 17:19:46 2011 @@ -15,79 +15,105 @@ limitations under the License. 
-- assembly - idhadoop-tar/id + idhadoop-bintar/id formats formatdir/format /formats includeBaseDirectoryfalse/includeBaseDirectory fileSets fileSet - directory${basedir}/directory - outputDirectory//outputDirectory - includes -include*.txt/include - /includes + directory${basedir}/src/main/bin/directory + outputDirectory/bin/outputDirectory + excludes +exclude*.sh/exclude + /excludes + fileMode0755/fileMode +/fileSet +fileSet + directory${basedir}/src/main/conf/directory + outputDirectory/etc/hadoop/outputDirectory /fileSet fileSet directory${basedir}/src/main/bin/directory - outputDirectory/bin/outputDirectory + outputDirectory/libexec/outputDirectory includes -include*/include +include*-config.sh/include /includes fileMode0755/fileMode /fileSet fileSet directory${basedir}/src/main/bin/directory - outputDirectory/libexec/outputDirectory + outputDirectory/sbin/outputDirectory includes -include*-config.sh/include +include*.sh/include /includes + excludes +excludehadoop-config.sh/exclude + /excludes fileMode0755/fileMode /fileSet fileSet - directory${basedir}/src/main/conf/directory - outputDirectory/conf/outputDirectory + directory${basedir}/src/main/packages/directory + outputDirectory/sbin/outputDirectory + includes +include*.sh/include + /includes + fileMode0755/fileMode /fileSet fileSet - directory${basedir}/src/main/webapps/directory - outputDirectory/webapps/outputDirectory - excludes -excludeproto-*-web.xml/exclude - /excludes + directory${basedir}/directory + outputDirectory/share/doc/hadoop/${hadoop.component}/outputDirectory + includes +include*.txt/include + /includes /fileSet fileSet directory${project.build.directory}/webapps/directory - outputDirectory/webapps/outputDirectory - excludes -excludeproto-*-web.xml/exclude - /excludes + outputDirectory/share/hadoop/${hadoop.component}/webapps/outputDirectory /fileSet fileSet - directory${project.build.directory}/site/directory - outputDirectory/docs/outputDirectory + 
directory${basedir}/src/main/conf/directory + outputDirectory/share/hadoop/${hadoop.component}/templates/outputDirectory + includes +include*-site.xml/include + /includes /fileSet fileSet directory${project.build.directory}/directory - outputDirectory//outputDirectory + outputDirectory/share/hadoop/${hadoop.component}/outputDirectory includes include${project.artifactId}-${project.version}.jar/include include${project.artifactId}-${project.version}-tests.jar/include +include${project.artifactId}-${project.version}-sources.jar/include + include${project.artifactId}-${project.version}-test-sources.jar/include
svn commit: r1160344 - in /hadoop/common/trunk/hadoop-common: CHANGES.txt pom.xml
Author: tomwhite Date: Mon Aug 22 17:40:58 2011 New Revision: 1160344 URL: http://svn.apache.org/viewvc?rev=1160344view=rev Log: HADOOP-7496. Break Maven TAR bintar profiles into just LAYOUT TAR proper. Contributed by Alejandro Abdelnur. Modified: hadoop/common/trunk/hadoop-common/CHANGES.txt hadoop/common/trunk/hadoop-common/pom.xml Modified: hadoop/common/trunk/hadoop-common/CHANGES.txt URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common/CHANGES.txt?rev=1160344r1=1160343r2=1160344view=diff == --- hadoop/common/trunk/hadoop-common/CHANGES.txt (original) +++ hadoop/common/trunk/hadoop-common/CHANGES.txt Mon Aug 22 17:40:58 2011 @@ -337,6 +337,9 @@ Trunk (unreleased changes) HADOOP-7498. Remove legacy TAR layout creation. (Alejandro Abdelnur via tomwhite) +HADOOP-7496. Break Maven TAR bintar profiles into just LAYOUT TAR proper. +(Alejandro Abdelnur via tomwhite) + OPTIMIZATIONS HADOOP-7333. Performance improvement in PureJavaCrc32. (Eric Caspole Modified: hadoop/common/trunk/hadoop-common/pom.xml URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common/pom.xml?rev=1160344r1=1160343r2=1160344view=diff == --- hadoop/common/trunk/hadoop-common/pom.xml (original) +++ hadoop/common/trunk/hadoop-common/pom.xml Mon Aug 22 17:40:58 2011 @@ -16,9 +16,9 @@ modelVersion4.0.0/modelVersion parent groupIdorg.apache.hadoop/groupId -artifactIdhadoop-project-distro/artifactId +artifactIdhadoop-project-dist/artifactId version0.23.0-SNAPSHOT/version -relativePath../hadoop-project-distro/relativePath +relativePath../hadoop-project-dist/relativePath /parent groupIdorg.apache.hadoop/groupId artifactIdhadoop-common/artifactId
svn commit: r1160344 - in /hadoop/common/trunk: ./ hadoop-assemblies/src/main/resources/assemblies/ hadoop-project-dist/ hadoop-project-distro/
Author: tomwhite Date: Mon Aug 22 17:40:58 2011 New Revision: 1160344 URL: http://svn.apache.org/viewvc?rev=1160344view=rev Log: HADOOP-7496. Break Maven TAR bintar profiles into just LAYOUT TAR proper. Contributed by Alejandro Abdelnur. Added: hadoop/common/trunk/hadoop-assemblies/src/main/resources/assemblies/hadoop-dist.xml (with props) hadoop/common/trunk/hadoop-project-dist/ hadoop/common/trunk/hadoop-project-dist/README.txt (with props) hadoop/common/trunk/hadoop-project-dist/pom.xml (with props) Removed: hadoop/common/trunk/hadoop-assemblies/src/main/resources/assemblies/hadoop-tar.xml hadoop/common/trunk/hadoop-project-distro/ Modified: hadoop/common/trunk/BUILDING.txt hadoop/common/trunk/pom.xml Modified: hadoop/common/trunk/BUILDING.txt URL: http://svn.apache.org/viewvc/hadoop/common/trunk/BUILDING.txt?rev=1160344r1=1160343r2=1160344view=diff == --- hadoop/common/trunk/BUILDING.txt (original) +++ hadoop/common/trunk/BUILDING.txt Mon Aug 22 17:40:58 2011 @@ -15,12 +15,13 @@ Requirements: -- Maven modules: - hadoop (Main Hadoop project) - - hadoop-project (Parent POM for all Hadoop Maven modules. ) - (All plugins dependencies versions are defined here.) - - hadoop-annotations (Generates the Hadoop doclet used to generated the Javadocs) - - hadoop-common (Hadoop Common) - - hadoop-hdfs(Hadoop HDFS) + hadoop (Main Hadoop project) + - hadoop-project (Parent POM for all Hadoop Maven modules. ) + (All plugins dependencies versions are defined here.) + - hadoop-project-dist (Parent POM for modules that generate distributions.) + - hadoop-annotations (Generates the Hadoop doclet used to generated the Javadocs) + - hadoop-common (Hadoop Common) + - hadoop-hdfs (Hadoop HDFS) -- Where to run Maven from? 
@@ -43,15 +44,16 @@ Maven build goals: * Run clover: mvn test -Pclover [-DcloverLicenseLocation=${user.name}/.clover.license] * Run Rat : mvn apache-rat:check * Build javadocs: mvn javadoc:javadoc - * Build TAR : mvn package [-Ptar][-Pdocs][-Psrc][-Pnative] + * Build distribution: mvn package [-Pdist][-Pdocs][-Psrc][-Pnative][-Dtar] Build options: * Use -Pnative to compile/bundle native code * Use -Dsnappy.prefix=(/usr/local) -Dbundle.snappy=(false) to compile Snappy JNI bindings and to bundle Snappy SO files - * Use -Pdocs to generate bundle the documentation in the TAR (using -Ptar) - * Use -Psrc to bundle the source in the TAR (using -Ptar) + * Use -Pdocs to generate bundle the documentation in the distribution (using -Pdist) + * Use -Psrc to bundle the source in the distribution (using -Pdist) + * Use -Dtar to create a TAR with the distribution (using -Pdist) Tests options: Added: hadoop/common/trunk/hadoop-assemblies/src/main/resources/assemblies/hadoop-dist.xml URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-assemblies/src/main/resources/assemblies/hadoop-dist.xml?rev=1160344view=auto == --- hadoop/common/trunk/hadoop-assemblies/src/main/resources/assemblies/hadoop-dist.xml (added) +++ hadoop/common/trunk/hadoop-assemblies/src/main/resources/assemblies/hadoop-dist.xml Mon Aug 22 17:40:58 2011 @@ -0,0 +1,121 @@ +!-- + Licensed to the Apache Software Foundation (ASF) under one or more + contributor license agreements. See the NOTICE file distributed with + this work for additional information regarding copyright ownership. + The ASF licenses this file to You under the Apache License, Version 2.0 + (the License); you may not use this file except in compliance with + the License. 
You may obtain a copy of the License at + + http://www.apache.org/licenses/LICENSE-2.0 + + Unless required by applicable law or agreed to in writing, software + distributed under the License is distributed on an AS IS BASIS, + WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + See the License for the specific language governing permissions and + limitations under the License. +-- +assembly + idhadoop-distro/id + formats +formatdir/format + /formats + includeBaseDirectoryfalse/includeBaseDirectory + fileSets +fileSet + directory${basedir}/src/main/bin/directory + outputDirectory/bin/outputDirectory + excludes +exclude*.sh/exclude + /excludes + fileMode0755/fileMode +/fileSet +fileSet
svn commit: r1159699 - in /hadoop/common/trunk: hadoop-common/ hadoop-common/src/main/java/org/apache/hadoop/io/serializer/avro/ hadoop-common/src/main/java/org/apache/hadoop/ipc/ hadoop-common/src/te
Author: tomwhite
Date: Fri Aug 19 17:26:42 2011
New Revision: 1159699

URL: http://svn.apache.org/viewvc?rev=1159699&view=rev
Log:
HADOOP-7264. Bump avro version to at least 1.4.1. Contributed by Alejandro Abdelnur

Added:
    hadoop/common/trunk/hadoop-common/src/test/avro/
    hadoop/common/trunk/hadoop-common/src/test/avro/AvroSpecificTestProtocol.avpr
    hadoop/common/trunk/hadoop-common/src/test/avro/avroRecord.avsc
Removed:
    hadoop/common/trunk/hadoop-common/src/test/java/org/apache/hadoop/io/serializer/avro/avroRecord.avsc
    hadoop/common/trunk/hadoop-common/src/test/java/org/apache/hadoop/ipc/AvroSpecificTestProtocol.avpr
Modified:
    hadoop/common/trunk/hadoop-common/CHANGES.txt
    hadoop/common/trunk/hadoop-common/pom.xml
    hadoop/common/trunk/hadoop-common/src/main/java/org/apache/hadoop/io/serializer/avro/AvroSerialization.java
    hadoop/common/trunk/hadoop-common/src/main/java/org/apache/hadoop/ipc/AvroRpcEngine.java
    hadoop/common/trunk/hadoop-common/src/main/java/org/apache/hadoop/ipc/AvroSpecificRpcEngine.java
    hadoop/common/trunk/hadoop-common/src/test/java/org/apache/hadoop/io/AvroTestUtil.java
    hadoop/common/trunk/hadoop-common/src/test/java/org/apache/hadoop/ipc/AvroTestProtocol.java
    hadoop/common/trunk/hadoop-common/src/test/java/org/apache/hadoop/ipc/TestAvroRpc.java
    hadoop/common/trunk/hadoop-project/pom.xml

Modified: hadoop/common/trunk/hadoop-common/CHANGES.txt
URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common/CHANGES.txt?rev=1159699&r1=1159698&r2=1159699&view=diff
==============================================================================
--- hadoop/common/trunk/hadoop-common/CHANGES.txt (original)
+++ hadoop/common/trunk/hadoop-common/CHANGES.txt Fri Aug 19 17:26:42 2011
@@ -328,6 +328,9 @@ Trunk (unreleased changes)
     HADOOP-7555. Add a eclipse-generated files to .gitignore. (atm)
 
+    HADOOP-7264. Bump avro version to at least 1.4.1. (Alejandro Abdelnur via
+    tomwhite)
+
   OPTIMIZATIONS
 
     HADOOP-7333. Performance improvement in PureJavaCrc32.
     (Eric Caspole

Modified: hadoop/common/trunk/hadoop-common/pom.xml
URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common/pom.xml?rev=1159699&r1=1159698&r2=1159699&view=diff
==============================================================================
--- hadoop/common/trunk/hadoop-common/pom.xml (original)
+++ hadoop/common/trunk/hadoop-common/pom.xml Fri Aug 19 17:26:42 2011
@@ -219,11 +219,16 @@
       <scope>test</scope>
     </dependency>
     <dependency>
-      <groupId>org.apache.hadoop</groupId>
+      <groupId>org.apache.avro</groupId>
       <artifactId>avro</artifactId>
       <scope>compile</scope>
     </dependency>
     <dependency>
+      <groupId>org.apache.avro</groupId>
+      <artifactId>avro-ipc</artifactId>
+      <scope>compile</scope>
+    </dependency>
+    <dependency>
       <groupId>net.sf.kosmosfs</groupId>
       <artifactId>kfs</artifactId>
       <scope>compile</scope>
@@ -323,6 +328,23 @@
         </configuration>
       </plugin>
       <plugin>
+        <groupId>org.apache.avro</groupId>
+        <artifactId>avro-maven-plugin</artifactId>
+        <executions>
+          <execution>
+            <id>generate-avro-test-sources</id>
+            <phase>generate-test-sources</phase>
+            <goals>
+              <goal>schema</goal>
+              <goal>protocol</goal>
+            </goals>
+          </execution>
+        </executions>
+        <configuration>
+          <testOutputDirectory>${project.build.directory}/generated-test-sources/java</testOutputDirectory>
+        </configuration>
+      </plugin>
+      <plugin>
         <groupId>org.apache.maven.plugins</groupId>
         <artifactId>maven-antrun-plugin</artifactId>
         <executions>
@@ -359,24 +381,6 @@
                 <recordcc destdir="${project.build.directory}/generated-test-sources/java">
                   <fileset dir="${basedir}/src/test/ddl" includes="**/*.jr"/>
                 </recordcc>
-
-                <taskdef name="schema" classname="org.apache.avro.specific.SchemaTask">
-                  <classpath refid="maven.test.classpath"/>
-                </taskdef>
-                <schema destdir="${project.build.directory}/generated-test-sources/java">
-                  <fileset dir="${basedir}/src/test">
-                    <include name="**/*.avsc"/>
-                  </fileset>
-                </schema>
-
-                <taskdef name="schema" classname="org.apache.avro.specific.ProtocolTask">
-                  <classpath refid="maven.test.classpath"/>
-                </taskdef>
-                <schema destdir="${project.build.directory}/generated-test-sources/java">
-                  <fileset dir="${basedir}/src/test">
-                    <include name="**/*.avpr"/>
-                  </fileset>
-                </schema>
               </target>
             </configuration>
           </execution>

Modified: hadoop/common/trunk/hadoop-common/src/main/java/org/apache/hadoop/io/serializer/avro/AvroSerialization.java
URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common/src/main/java/org
svn commit: r1159702 [2/2] - in /hadoop/common/trunk: ./ dev-support/ hadoop-assemblies/src/main/resources/assemblies/ hadoop-common/ hadoop-hdfs/ hadoop-hdfs/.eclipse.templates/ hadoop-hdfs/.eclipse.
Added: hadoop/common/trunk/hadoop-project-distro/pom.xml
URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-project-distro/pom.xml?rev=1159702&view=auto
==============================================================================
--- hadoop/common/trunk/hadoop-project-distro/pom.xml (added)
+++ hadoop/common/trunk/hadoop-project-distro/pom.xml Fri Aug 19 17:36:23 2011
@@ -0,0 +1,574 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+   Licensed under the Apache License, Version 2.0 (the "License");
+   you may not use this file except in compliance with the License.
+   You may obtain a copy of the License at
+
+     http://www.apache.org/licenses/LICENSE-2.0
+
+   Unless required by applicable law or agreed to in writing, software
+   distributed under the License is distributed on an "AS IS" BASIS,
+   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+   See the License for the specific language governing permissions and
+   limitations under the License. See accompanying LICENSE file.
+-->
+<project>
+  <modelVersion>4.0.0</modelVersion>
+  <parent>
+    <groupId>org.apache.hadoop</groupId>
+    <artifactId>hadoop-project</artifactId>
+    <version>0.23.0-SNAPSHOT</version>
+    <relativePath>../hadoop-project</relativePath>
+  </parent>
+  <groupId>org.apache.hadoop</groupId>
+  <artifactId>hadoop-project-distro</artifactId>
+  <version>0.23.0-SNAPSHOT</version>
+  <description>Apache Hadoop Project Distro POM</description>
+  <name>Apache Hadoop Project Distro POM</name>
+  <packaging>pom</packaging>
+
+  <properties>
+    <hadoop.tmp.dir>${project.build.directory}/test</hadoop.tmp.dir>
+    <test.build.data>${project.build.directory}/test/data</test.build.data>
+    <hadoop.log.dir>${project.build.directory}/log</hadoop.log.dir>
+    <test.build.webapps>${project.build.directory}/test-classes/webapps</test.build.webapps>
+    <test.cache.data>${project.build.directory}/test-classes</test.cache.data>
+    <test.build.classes>${project.build.directory}/test-classes</test.build.classes>
+
+    <hadoop.component>UNDEF</hadoop.component>
+    <bundle.snappy>false</bundle.snappy>
+  </properties>
+
+  <dependencies>
+    <dependency>
+      <groupId>org.apache.hadoop</groupId>
+      <artifactId>hadoop-annotations</artifactId>
+      <scope>provided</scope>
+    </dependency>
+  </dependencies>
+
+  <build>
+    <plugins>
+      <plugin>
+        <groupId>org.apache.maven.plugins</groupId>
+        <artifactId>maven-jar-plugin</artifactId>
+        <executions>
+          <execution>
+            <id>prepare-jar</id>
+            <phase>prepare-package</phase>
+            <goals>
+              <goal>jar</goal>
+            </goals>
+          </execution>
+          <execution>
+            <id>prepare-test-jar</id>
+            <phase>prepare-package</phase>
+            <goals>
+              <goal>test-jar</goal>
+            </goals>
+            <configuration>
+              <includes>
+                <include>**/*.class</include>
+              </includes>
+            </configuration>
+          </execution>
+        </executions>
+      </plugin>
+      <plugin>
+        <groupId>org.apache.maven.plugins</groupId>
+        <artifactId>maven-source-plugin</artifactId>
+        <executions>
+          <execution>
+            <phase>prepare-package</phase>
+            <goals>
+              <goal>jar</goal>
+              <goal>test-jar</goal>
+            </goals>
+          </execution>
+        </executions>
+        <configuration>
+          <attach>true</attach>
+        </configuration>
+      </plugin>
+      <plugin>
+        <groupId>org.codehaus.mojo</groupId>
+        <artifactId>findbugs-maven-plugin</artifactId>
+        <configuration>
+          <excludeFilterFile>${basedir}/dev-support/findbugsExcludeFile.xml</excludeFilterFile>
+        </configuration>
+      </plugin>
+      <plugin>
+        <groupId>org.apache.maven.plugins</groupId>
+        <artifactId>maven-checkstyle-plugin</artifactId>
+        <configuration>
+          <configLocation>file://${basedir}/dev-support/checkstyle.xml</configLocation>
+          <failOnViolation>false</failOnViolation>
+          <format>xml</format>
+          <format>html</format>
+          <outputFile>${project.build.directory}/test/checkstyle-errors.xml</outputFile>
+        </configuration>
+      </plugin>
+      <plugin>
+        <groupId>org.apache.maven.plugins</groupId>
+        <artifactId>maven-javadoc-plugin</artifactId>
+        <configuration>
+          <linksource>true</linksource>
+          <quiet>true</quiet>
+          <verbose>false</verbose>
+          <source>${maven.compile.source}</source>
+          <charset>${maven.compile.encoding}</charset>
+          <reportOutputDirectory>${project.build.directory}/site</reportOutputDirectory>
+          <destDir>api</destDir>
+          <groups>
+            <group>
+              <title>${project.name} API</title>
+              <packages>org.apache.hadoop*</packages>
+            </group>
+          </groups>
+          <doclet>org.apache.hadoop.classification.tools.ExcludePrivateAnnotationsStandardDoclet</doclet>
+          <docletArtifacts>
+            <docletArtifact>
+              <groupId>org.apache.hadoop</groupId>
+              <artifactId>hadoop-annotations</artifactId>
+
svn commit: r1159738 - in /hadoop/common/trunk: hadoop-hdfs/ hadoop-mapreduce/ hadoop-mapreduce/hadoop-mr-client/ hadoop-mapreduce/hadoop-mr-client/hadoop-mapreduce-client-app/ hadoop-mapreduce/hadoop
Author: tomwhite
Date: Fri Aug 19 18:49:24 2011
New Revision: 1159738

URL: http://svn.apache.org/viewvc?rev=1159738&view=rev
Log:
Fix svn:ignore following integration of MR-279 and HDFS mavenization.

Modified:
    hadoop/common/trunk/hadoop-hdfs/   (props changed)
    hadoop/common/trunk/hadoop-mapreduce/   (props changed)
    hadoop/common/trunk/hadoop-mapreduce/hadoop-mr-client/   (props changed)
    hadoop/common/trunk/hadoop-mapreduce/hadoop-mr-client/hadoop-mapreduce-client-app/   (props changed)
    hadoop/common/trunk/hadoop-mapreduce/hadoop-mr-client/hadoop-mapreduce-client-common/   (props changed)
    hadoop/common/trunk/hadoop-mapreduce/hadoop-mr-client/hadoop-mapreduce-client-core/   (props changed)
    hadoop/common/trunk/hadoop-mapreduce/hadoop-mr-client/hadoop-mapreduce-client-hs/   (props changed)
    hadoop/common/trunk/hadoop-mapreduce/hadoop-mr-client/hadoop-mapreduce-client-jobclient/   (props changed)
    hadoop/common/trunk/hadoop-mapreduce/hadoop-mr-client/hadoop-mapreduce-client-shuffle/   (props changed)
    hadoop/common/trunk/hadoop-mapreduce/hadoop-yarn/   (props changed)
    hadoop/common/trunk/hadoop-mapreduce/hadoop-yarn/hadoop-yarn-api/   (props changed)
    hadoop/common/trunk/hadoop-mapreduce/hadoop-yarn/hadoop-yarn-common/   (props changed)
    hadoop/common/trunk/hadoop-mapreduce/hadoop-yarn/hadoop-yarn-server/   (props changed)
    hadoop/common/trunk/hadoop-mapreduce/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-common/   (props changed)
    hadoop/common/trunk/hadoop-mapreduce/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/   (props changed)
    hadoop/common/trunk/hadoop-mapreduce/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-resourcemanager/   (props changed)
    hadoop/common/trunk/hadoop-mapreduce/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-tests/   (props changed)
    hadoop/common/trunk/hadoop-project-distro/   (props changed)

Propchange: hadoop/common/trunk/hadoop-hdfs/
------------------------------------------------------------------------------
--- svn:ignore (original)
+++ svn:ignore Fri Aug 19 18:49:24 2011
@@ -7,3 +7,4 @@ logs
 .launches
 .project
 .settings
+target

Propchange: hadoop/common/trunk/hadoop-mapreduce/
------------------------------------------------------------------------------
--- svn:ignore (original)
+++ svn:ignore Fri Aug 19 18:49:24 2011
@@ -7,3 +7,4 @@ logs
 .launches
 .project
 .settings
+target

Propchange: hadoop/common/trunk/hadoop-mapreduce/hadoop-mr-client/
------------------------------------------------------------------------------
--- svn:ignore (added)
+++ svn:ignore Fri Aug 19 18:49:24 2011
@@ -0,0 +1,4 @@
+.classpath
+.project
+.settings
+target

Propchange: hadoop/common/trunk/hadoop-mapreduce/hadoop-mr-client/hadoop-mapreduce-client-app/
------------------------------------------------------------------------------
--- svn:ignore (added)
+++ svn:ignore Fri Aug 19 18:49:24 2011
@@ -0,0 +1,4 @@
+.classpath
+.project
+.settings
+target

Propchange: hadoop/common/trunk/hadoop-mapreduce/hadoop-mr-client/hadoop-mapreduce-client-common/
------------------------------------------------------------------------------
--- svn:ignore (added)
+++ svn:ignore Fri Aug 19 18:49:24 2011
@@ -0,0 +1,4 @@
+.classpath
+.project
+.settings
+target

Propchange: hadoop/common/trunk/hadoop-mapreduce/hadoop-mr-client/hadoop-mapreduce-client-core/
------------------------------------------------------------------------------
--- svn:ignore (added)
+++ svn:ignore Fri Aug 19 18:49:24 2011
@@ -0,0 +1,4 @@
+.classpath
+.project
+.settings
+target

Propchange: hadoop/common/trunk/hadoop-mapreduce/hadoop-mr-client/hadoop-mapreduce-client-hs/
------------------------------------------------------------------------------
--- svn:ignore (added)
+++ svn:ignore Fri Aug 19 18:49:24 2011
@@ -0,0 +1,4 @@
+.classpath
+.project
+.settings
+target

Propchange: hadoop/common/trunk/hadoop-mapreduce/hadoop-mr-client/hadoop-mapreduce-client-jobclient/
------------------------------------------------------------------------------
--- svn:ignore (added)
+++ svn:ignore Fri Aug 19 18:49:24 2011
@@ -0,0 +1,4 @@
+.classpath
+.project
+.settings
+target

Propchange: hadoop/common/trunk/hadoop-mapreduce/hadoop-mr-client/hadoop-mapreduce-client-shuffle/
------------------------------------------------------------------------------
--- svn:ignore (added)
+++ svn:ignore Fri Aug 19 18:49:24 2011
@@ -0,0 +1,4 @@
+.classpath
+.project
+.settings
+target

Propchange: hadoop/common/trunk/hadoop-mapreduce/hadoop-yarn/
------------------------------------------------------------------------------
--- svn:ignore (added)
+++ svn:ignore Fri Aug 19 18:49:24 2011
@@ -0,0 +1,4 @@
+.classpath
+.project
+.settings
+target

Propchange: hadoop/common/trunk/hadoop-mapreduce/hadoop-yarn/hadoop-yarn-api/
------------------------------------------------------------------------------
--- svn:ignore (added)
+++ svn:ignore Fri Aug 19 18:49
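The r1159738 propchanges above all follow one pattern: each newly mavenized module ignores the Eclipse metadata directories plus Maven's target/ build-output directory. svn:ignore entries are glob patterns matched against the names of unversioned items directly inside the directory carrying the property; a small illustrative sketch (not Subversion's actual implementation):

```python
import fnmatch

# The four entries this commit sets on the new Maven modules.
IGNORE = [".classpath", ".project", ".settings", "target"]

# svn:ignore semantics, roughly: an unversioned child is hidden from
# "svn status" if its name matches any glob pattern in the property.
def is_ignored(name, patterns=IGNORE):
    return any(fnmatch.fnmatch(name, p) for p in patterns)
```

So a module's target/ directory and Eclipse .settings/ stay out of "svn status" output, while tracked files such as pom.xml are unaffected.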