Author: suresh
Date: Tue Dec 11 20:08:00 2012
New Revision: 1420375

URL: http://svn.apache.org/viewvc?rev=1420375&view=rev
Log:
Merging trunk to branch-trunk-win

Added:
    hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/src/main/java/org/apache/hadoop/mapred/ShuffleConsumerPlugin.java
      - copied unchanged from r1420366, hadoop/common/trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/src/main/java/org/apache/hadoop/mapred/ShuffleConsumerPlugin.java
    hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/src/test/java/org/apache/hadoop/mapreduce/TestShufflePlugin.java
      - copied unchanged from r1420366, hadoop/common/trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/src/test/java/org/apache/hadoop/mapreduce/TestShufflePlugin.java
Removed:
    hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/src/main/java/org/apache/hadoop/mapreduce/security/token/DelegationTokenRenewal.java
    hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/src/test/java/org/apache/hadoop/mapreduce/security/token/TestDelegationTokenRenewal.java
Modified:
    hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/   (props changed)
    hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/CHANGES.txt   (contents, props changed)
    hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/conf/   (props changed)
    hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/dev-support/findbugs-exclude.xml
    hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/src/main/java/org/apache/hadoop/mapreduce/v2/app/webapp/dao/TaskInfo.java
    hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/src/main/java/org/apache/hadoop/mapred/ReduceTask.java
    hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/src/main/java/org/apache/hadoop/mapreduce/MRConfig.java
    hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/src/main/java/org/apache/hadoop/mapreduce/task/reduce/Shuffle.java
    hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/src/main/resources/mapred-default.xml   (contents, props changed)
    hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-hs/src/main/java/org/apache/hadoop/mapreduce/v2/hs/webapp/HsJobsBlock.java
    hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/src/test/java/org/apache/hadoop/mapred/MiniMRClientCluster.java
    hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/src/test/java/org/apache/hadoop/mapred/MiniMRClientClusterFactory.java
    hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/src/test/java/org/apache/hadoop/mapred/MiniMRYarnClusterAdapter.java
    hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/src/test/java/org/apache/hadoop/mapred/TestMiniMRClientCluster.java

Propchange: hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/
------------------------------------------------------------------------------
  Merged /hadoop/common/trunk/hadoop-mapreduce-project:r1415787-1420366

Modified: hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/CHANGES.txt
URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/CHANGES.txt?rev=1420375&r1=1420374&r2=1420375&view=diff
==============================================================================
--- hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/CHANGES.txt (original)
+++ hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/CHANGES.txt Tue Dec 11 20:08:00 2012
@@ -11,6 +11,9 @@ Trunk (Unreleased)
     MAPREDUCE-2669. Add new examples for Mean, Median, and Standard Deviation.
     (Plamen Jeliazkov via shv)
 
+    MAPREDUCE-4049. Experimental api to allow for alternate shuffle plugins.
+    (Avner BenHanoch via acmurthy) 
+
   IMPROVEMENTS
 
     MAPREDUCE-3787. [Gridmix] Optimize job monitoring and STRESS mode for
@@ -168,6 +171,9 @@ Release 2.0.3-alpha - Unreleased 
 
     MAPREDUCE-4723. Fix warnings found by findbugs 2. (Sandy Ryza via eli)
 
+    MAPREDUCE-4703. Add the ability to start the MiniMRClientCluster using 
+    the configurations used before it is being stopped. (ahmed.radwan via tucu)
+
   OPTIMIZATIONS
 
   BUG FIXES
@@ -202,6 +208,9 @@ Release 2.0.3-alpha - Unreleased 
     MAPREDUCE-4800. Cleanup o.a.h.mapred.MapTaskStatus - remove unused 
     code. (kkambatl via tucu)
 
+    MAPREDUCE-4861. Cleanup: Remove unused mapreduce.security.token.DelegationTokenRenewal.
+    (kkambatl via tucu)
+
 Release 2.0.2-alpha - 2012-09-07 
 
   INCOMPATIBLE CHANGES
@@ -604,6 +613,9 @@ Release 0.23.6 - UNRELEASED
     MAPREDUCE-4817. Hardcoded task ping timeout kills tasks localizing large 
     amounts of data (tgraves)
 
+    MAPREDUCE-4836. Elapsed time for running tasks on AM web UI tasks page is 0
+    (Ravi Prakash via jeagles)
+
 Release 0.23.5 - UNRELEASED
 
   INCOMPATIBLE CHANGES

Propchange: hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/CHANGES.txt
------------------------------------------------------------------------------
  Merged /hadoop/common/trunk/hadoop-mapreduce-project/CHANGES.txt:r1415787-1420366

Propchange: hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/conf/
------------------------------------------------------------------------------
  Merged /hadoop/common/trunk/hadoop-mapreduce-project/conf:r1415787-1420366

Modified: hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/dev-support/findbugs-exclude.xml
URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/dev-support/findbugs-exclude.xml?rev=1420375&r1=1420374&r2=1420375&view=diff
==============================================================================
--- hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/dev-support/findbugs-exclude.xml (original)
+++ hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/dev-support/findbugs-exclude.xml Tue Dec 11 20:08:00 2012
@@ -138,11 +138,6 @@
        <Method name="run" />
        <Bug pattern="DM_EXIT" />
      </Match>
-     <Match>
-       <Class name="org.apache.hadoop.mapreduce.security.token.DelegationTokenRenewal$DelegationTokenCancelThread" />
-       <Method name="run" />
-       <Bug pattern="DM_EXIT" />
-    </Match>
      <!--
        We need to cast objects between old and new api objects
      -->

Modified: hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/src/main/java/org/apache/hadoop/mapreduce/v2/app/webapp/dao/TaskInfo.java
URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/src/main/java/org/apache/hadoop/mapreduce/v2/app/webapp/dao/TaskInfo.java?rev=1420375&r1=1420374&r2=1420375&view=diff
==============================================================================
--- hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/src/main/java/org/apache/hadoop/mapreduce/v2/app/webapp/dao/TaskInfo.java (original)
+++ hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/src/main/java/org/apache/hadoop/mapreduce/v2/app/webapp/dao/TaskInfo.java Tue Dec 11 20:08:00 2012
@@ -59,11 +59,12 @@ public class TaskInfo {
     TaskReport report = task.getReport();
     this.startTime = report.getStartTime();
     this.finishTime = report.getFinishTime();
-    this.elapsedTime = Times.elapsed(this.startTime, this.finishTime, false);
+    this.state = report.getTaskState();
+    this.elapsedTime = Times.elapsed(this.startTime, this.finishTime,
+      this.state == TaskState.RUNNING);
     if (this.elapsedTime == -1) {
       this.elapsedTime = 0;
     }
-    this.state = report.getTaskState();
     this.progress = report.getProgress() * 100;
     this.id = MRApps.toString(task.getID());
     this.taskNum = task.getID().getId();
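
The hunk above is the MAPREDUCE-4836 fix: elapsed time is now computed with a running
flag derived from the task state instead of a hard-coded false. A minimal illustration
of the effect, assuming Times.elapsed() returns -1 when the finish time is unset and
the running flag is false (inferred from the clamp-to-zero code above, not checked
against the Times source):

    // Hypothetical values for a task that is still running.
    long startTime = 1000L;   // report.getStartTime()
    long finishTime = 0L;     // finish time not set yet
    // Before: Times.elapsed(startTime, finishTime, false) -> -1, clamped to 0,
    // which is why the AM web UI showed 0 for running tasks.
    // After:  Times.elapsed(startTime, finishTime, state == TaskState.RUNNING)
    //         -> roughly (now - startTime) while the task is RUNNING.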

Modified: hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/src/main/java/org/apache/hadoop/mapred/ReduceTask.java
URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/src/main/java/org/apache/hadoop/mapred/ReduceTask.java?rev=1420375&r1=1420374&r2=1420375&view=diff
==============================================================================
--- hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/src/main/java/org/apache/hadoop/mapred/ReduceTask.java (original)
+++ hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/src/main/java/org/apache/hadoop/mapred/ReduceTask.java Tue Dec 11 20:08:00 2012
@@ -340,6 +340,7 @@ public class ReduceTask extends Task {
     // Initialize the codec
     codec = initCodec();
     RawKeyValueIterator rIter = null;
+    ShuffleConsumerPlugin shuffleConsumerPlugin = null; 
     
     boolean isLocal = false; 
     // local if
@@ -358,8 +359,14 @@ public class ReduceTask extends Task {
         (null != combinerClass) ? 
             new CombineOutputCollector(reduceCombineOutputCounter, reporter, conf) : null;
 
-      Shuffle shuffle = 
-        new Shuffle(getTaskID(), job, FileSystem.getLocal(job), umbilical, 
+      Class<? extends ShuffleConsumerPlugin> clazz =
+            job.getClass(MRConfig.SHUFFLE_CONSUMER_PLUGIN, Shuffle.class, ShuffleConsumerPlugin.class);
+                                               
+      shuffleConsumerPlugin = ReflectionUtils.newInstance(clazz, job);
+      LOG.info("Using ShuffleConsumerPlugin: " + shuffleConsumerPlugin);
+
+      ShuffleConsumerPlugin.Context shuffleContext = 
+        new ShuffleConsumerPlugin.Context(getTaskID(), job, FileSystem.getLocal(job), umbilical, 
                     super.lDirAlloc, reporter, codec, 
                     combinerClass, combineCollector, 
                     spilledRecordsCounter, reduceCombineInputCounter,
@@ -368,7 +375,8 @@ public class ReduceTask extends Task {
                     mergedMapOutputsCounter,
                     taskStatus, copyPhase, sortPhase, this,
                     mapOutputFile);
-      rIter = shuffle.run();
+      shuffleConsumerPlugin.init(shuffleContext);
+      rIter = shuffleConsumerPlugin.run();
     } else {
       // local job runner doesn't have a copy phase
       copyPhase.complete();
@@ -399,6 +407,10 @@ public class ReduceTask extends Task {
       runOldReducer(job, umbilical, reporter, rIter, comparator, 
                     keyClass, valueClass);
     }
+
+    if (shuffleConsumerPlugin != null) {
+      shuffleConsumerPlugin.close();
+    }
     done(umbilical, reporter);
   }
 

Modified: hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/src/main/java/org/apache/hadoop/mapreduce/MRConfig.java
URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/src/main/java/org/apache/hadoop/mapreduce/MRConfig.java?rev=1420375&r1=1420374&r2=1420375&view=diff
==============================================================================
--- hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/src/main/java/org/apache/hadoop/mapreduce/MRConfig.java (original)
+++ hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/src/main/java/org/apache/hadoop/mapreduce/MRConfig.java Tue Dec 11 20:08:00 2012
@@ -85,6 +85,9 @@ public interface MRConfig {
 
   public static final boolean SHUFFLE_SSL_ENABLED_DEFAULT = false;
 
+  public static final String SHUFFLE_CONSUMER_PLUGIN =
+    "mapreduce.job.reduce.shuffle.consumer.plugin.class";
+
   /**
    * Configuration key to enable/disable IFile readahead.
    */

Modified: hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/src/main/java/org/apache/hadoop/mapreduce/task/reduce/Shuffle.java
URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/src/main/java/org/apache/hadoop/mapreduce/task/reduce/Shuffle.java?rev=1420375&r1=1420374&r2=1420375&view=diff
==============================================================================
--- hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/src/main/java/org/apache/hadoop/mapreduce/task/reduce/Shuffle.java (original)
+++ hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/src/main/java/org/apache/hadoop/mapreduce/task/reduce/Shuffle.java Tue Dec 11 20:08:00 2012
@@ -34,73 +34,63 @@ import org.apache.hadoop.mapred.Task;
 import org.apache.hadoop.mapred.Task.CombineOutputCollector;
 import org.apache.hadoop.mapred.TaskStatus;
 import org.apache.hadoop.mapred.TaskUmbilicalProtocol;
+import org.apache.hadoop.mapred.ShuffleConsumerPlugin;
 import org.apache.hadoop.mapreduce.MRJobConfig;
 import org.apache.hadoop.mapreduce.TaskAttemptID;
 import org.apache.hadoop.util.Progress;
 
-@InterfaceAudience.Private
+@InterfaceAudience.LimitedPrivate("mapreduce")
 @InterfaceStability.Unstable
 @SuppressWarnings({"unchecked", "rawtypes"})
-public class Shuffle<K, V> implements ExceptionReporter {
+public class Shuffle<K, V> implements ShuffleConsumerPlugin<K, V>, ExceptionReporter {
   private static final int PROGRESS_FREQUENCY = 2000;
   private static final int MAX_EVENTS_TO_FETCH = 10000;
   private static final int MIN_EVENTS_TO_FETCH = 100;
   private static final int MAX_RPC_OUTSTANDING_EVENTS = 3000000;
   
-  private final TaskAttemptID reduceId;
-  private final JobConf jobConf;
-  private final Reporter reporter;
-  private final ShuffleClientMetrics metrics;
-  private final TaskUmbilicalProtocol umbilical;
+  private ShuffleConsumerPlugin.Context context;
+
+  private TaskAttemptID reduceId;
+  private JobConf jobConf;
+  private Reporter reporter;
+  private ShuffleClientMetrics metrics;
+  private TaskUmbilicalProtocol umbilical;
   
-  private final ShuffleScheduler<K,V> scheduler;
-  private final MergeManager<K, V> merger;
+  private ShuffleScheduler<K,V> scheduler;
+  private MergeManager<K, V> merger;
   private Throwable throwable = null;
   private String throwingThreadName = null;
-  private final Progress copyPhase;
-  private final TaskStatus taskStatus;
-  private final Task reduceTask; //Used for status updates
-  
-  public Shuffle(TaskAttemptID reduceId, JobConf jobConf, FileSystem localFS,
-                 TaskUmbilicalProtocol umbilical,
-                 LocalDirAllocator localDirAllocator,  
-                 Reporter reporter,
-                 CompressionCodec codec,
-                 Class<? extends Reducer> combinerClass,
-                 CombineOutputCollector<K,V> combineCollector,
-                 Counters.Counter spilledRecordsCounter,
-                 Counters.Counter reduceCombineInputCounter,
-                 Counters.Counter shuffledMapsCounter,
-                 Counters.Counter reduceShuffleBytes,
-                 Counters.Counter failedShuffleCounter,
-                 Counters.Counter mergedMapOutputsCounter,
-                 TaskStatus status,
-                 Progress copyPhase,
-                 Progress mergePhase,
-                 Task reduceTask,
-                 MapOutputFile mapOutputFile) {
-    this.reduceId = reduceId;
-    this.jobConf = jobConf;
-    this.umbilical = umbilical;
-    this.reporter = reporter;
+  private Progress copyPhase;
+  private TaskStatus taskStatus;
+  private Task reduceTask; //Used for status updates
+
+  @Override
+  public void init(ShuffleConsumerPlugin.Context context) {
+    this.context = context;
+
+    this.reduceId = context.getReduceId();
+    this.jobConf = context.getJobConf();
+    this.umbilical = context.getUmbilical();
+    this.reporter = context.getReporter();
     this.metrics = new ShuffleClientMetrics(reduceId, jobConf);
-    this.copyPhase = copyPhase;
-    this.taskStatus = status;
-    this.reduceTask = reduceTask;
+    this.copyPhase = context.getCopyPhase();
+    this.taskStatus = context.getStatus();
+    this.reduceTask = context.getReduceTask();
     
     scheduler = 
-      new ShuffleScheduler<K,V>(jobConf, status, this, copyPhase, 
-                                shuffledMapsCounter, 
-                                reduceShuffleBytes, failedShuffleCounter);
-    merger = new MergeManager<K, V>(reduceId, jobConf, localFS, 
-                                    localDirAllocator, reporter, codec, 
-                                    combinerClass, combineCollector, 
-                                    spilledRecordsCounter, 
-                                    reduceCombineInputCounter, 
-                                    mergedMapOutputsCounter, 
-                                    this, mergePhase, mapOutputFile);
+      new ShuffleScheduler<K,V>(jobConf, taskStatus, this, copyPhase, 
+                                context.getShuffledMapsCounter(), 
+                                context.getReduceShuffleBytes(), context.getFailedShuffleCounter());
+    merger = new MergeManager<K, V>(reduceId, jobConf, context.getLocalFS(),
+                                    context.getLocalDirAllocator(), reporter, context.getCodec(),
+                                    context.getCombinerClass(), context.getCombineCollector(), 
+                                    context.getSpilledRecordsCounter(), 
+                                    context.getReduceCombineInputCounter(), 
+                                    context.getMergedMapOutputsCounter(), 
+                                    this, context.getMergePhase(), context.getMapOutputFile());
   }
 
+  @Override
   public RawKeyValueIterator run() throws IOException, InterruptedException {
     // Scale the maximum events we fetch per RPC call to mitigate OOM issues
     // on the ApplicationMaster when a thundering herd of reducers fetch events
@@ -171,6 +161,10 @@ public class Shuffle<K, V> implements Ex
     return kvIter;
   }
 
+  @Override
+  public void close(){
+  }
+
   public synchronized void reportException(Throwable t) {
     if (throwable == null) {
       throwable = t;
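
Together with the ReduceTask.java change above, the contract is: the reduce task
instantiates the configured ShuffleConsumerPlugin, then calls init(Context), run(),
and finally close(). Below is a hypothetical skeleton of an alternate plugin; the
class name and package are made up, and the signatures are inferred from this diff
(ShuffleConsumerPlugin itself is copied unchanged from trunk and not shown here):

    package com.example.shuffle;  // hypothetical, not part of this commit

    import java.io.IOException;

    import org.apache.hadoop.mapred.RawKeyValueIterator;
    import org.apache.hadoop.mapred.ShuffleConsumerPlugin;

    public class MyShuffleConsumerPlugin<K, V> implements ShuffleConsumerPlugin<K, V> {
      private ShuffleConsumerPlugin.Context context;

      @Override
      public void init(ShuffleConsumerPlugin.Context context) {
        // Keep the job conf, reporter, counters and phases around for run().
        this.context = context;
      }

      @Override
      public RawKeyValueIterator run() throws IOException, InterruptedException {
        // Fetch map outputs over an alternate transport, merge them, and return an
        // iterator over the merged key/value pairs for the reducer.
        throw new UnsupportedOperationException("sketch only");
      }

      @Override
      public void close() {
        // Release any resources acquired in init()/run().
      }
    }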

Modified: hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/src/main/resources/mapred-default.xml
URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/src/main/resources/mapred-default.xml?rev=1420375&r1=1420374&r2=1420375&view=diff
==============================================================================
--- hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/src/main/resources/mapred-default.xml (original)
+++ hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/src/main/resources/mapred-default.xml Tue Dec 11 20:08:00 2012
@@ -748,6 +748,16 @@
   </description>
 </property>
 
+<property>
+  <name>mapreduce.job.reduce.shuffle.consumer.plugin.class</name>
+  <value>org.apache.hadoop.mapreduce.task.reduce.Shuffle</value>
+  <description> 
+  Name of the class whose instance will be used 
+  to send shuffle requests by reducetasks of this job.
+  The class must be an instance of org.apache.hadoop.mapred.ShuffleConsumerPlugin.
+  </description>
+</property>
+
 <!-- MR YARN Application properties -->
 
 <property>

Propchange: hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/src/main/resources/mapred-default.xml
------------------------------------------------------------------------------
  Merged /hadoop/common/trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/src/main/resources/mapred-default.xml:r1415787-1420366
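
With the property above and the MRConfig.SHUFFLE_CONSUMER_PLUGIN constant added
earlier in this merge, a job can opt into an alternate implementation. A small
sketch, assuming a hypothetical plugin class like the skeleton shown earlier (not
part of this commit):

    // Select a (hypothetical) alternate shuffle plugin for one job.
    JobConf job = new JobConf();
    job.setClass(MRConfig.SHUFFLE_CONSUMER_PLUGIN,
        MyShuffleConsumerPlugin.class, ShuffleConsumerPlugin.class);
    // Leaving the key unset keeps the default documented above,
    // org.apache.hadoop.mapreduce.task.reduce.Shuffle.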

Modified: hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-hs/src/main/java/org/apache/hadoop/mapreduce/v2/hs/webapp/HsJobsBlock.java
URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-hs/src/main/java/org/apache/hadoop/mapreduce/v2/hs/webapp/HsJobsBlock.java?rev=1420375&r1=1420374&r2=1420375&view=diff
==============================================================================
--- hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-hs/src/main/java/org/apache/hadoop/mapreduce/v2/hs/webapp/HsJobsBlock.java (original)
+++ hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-hs/src/main/java/org/apache/hadoop/mapreduce/v2/hs/webapp/HsJobsBlock.java Tue Dec 11 20:08:00 2012
@@ -78,12 +78,12 @@ public class HsJobsBlock extends HtmlBlo
       .append(dateFormat.format(new Date(job.getFinishTime()))).append("\",\"")
       .append("<a href='").append(url("job", job.getId())).append("'>")
       .append(job.getId()).append("</a>\",\"")
-      .append(StringEscapeUtils.escapeHtml(job.getName()))
-      .append("\",\"")
-      .append(StringEscapeUtils.escapeHtml(job.getUserName()))
-      .append("\",\"")
-      .append(StringEscapeUtils.escapeHtml(job.getQueueName()))
-      .append("\",\"")
+      .append(StringEscapeUtils.escapeJavaScript(StringEscapeUtils.escapeHtml(
+        job.getName()))).append("\",\"")
+      .append(StringEscapeUtils.escapeJavaScript(StringEscapeUtils.escapeHtml(
+        job.getUserName()))).append("\",\"")
+      .append(StringEscapeUtils.escapeJavaScript(StringEscapeUtils.escapeHtml(
+        job.getQueueName()))).append("\",\"")
       .append(job.getState()).append("\",\"")
       .append(String.valueOf(job.getMapsTotal())).append("\",\"")
       .append(String.valueOf(job.getMapsCompleted())).append("\",\"")
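
The job name, user and queue are written into a JavaScript string literal that the
history server's jobs table consumes, so they are now escaped for JavaScript on top
of the existing HTML escaping. A rough illustration with commons-lang 2.x (the value
is made up; the exact output is whatever StringEscapeUtils produces):

    // A quote in the job name would otherwise terminate the JS string early.
    String name = "nightly 'prod' run";
    String cell = StringEscapeUtils.escapeJavaScript(StringEscapeUtils.escapeHtml(name));
    // escapeHtml leaves apostrophes alone; escapeJavaScript backslash-escapes them,
    // so the generated "...\",\"..." row data stays well-formed.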

Modified: hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/src/test/java/org/apache/hadoop/mapred/MiniMRClientCluster.java
URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/src/test/java/org/apache/hadoop/mapred/MiniMRClientCluster.java?rev=1420375&r1=1420374&r2=1420375&view=diff
==============================================================================
--- hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/src/test/java/org/apache/hadoop/mapred/MiniMRClientCluster.java (original)
+++ hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/src/test/java/org/apache/hadoop/mapred/MiniMRClientCluster.java Tue Dec 11 20:08:00 2012
@@ -31,6 +31,11 @@ public interface MiniMRClientCluster {
 
   public void start() throws IOException;
 
+  /**
+   * Stop and start back the cluster using the same configuration.
+   */
+  public void restart() throws IOException;
+
   public void stop() throws IOException;
 
   public Configuration getConfig() throws IOException;

Modified: hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/src/test/java/org/apache/hadoop/mapred/MiniMRClientClusterFactory.java
URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/src/test/java/org/apache/hadoop/mapred/MiniMRClientClusterFactory.java?rev=1420375&r1=1420374&r2=1420375&view=diff
==============================================================================
--- hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/src/test/java/org/apache/hadoop/mapred/MiniMRClientClusterFactory.java (original)
+++ hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/src/test/java/org/apache/hadoop/mapred/MiniMRClientClusterFactory.java Tue Dec 11 20:08:00 2012
@@ -67,6 +67,10 @@ public class MiniMRClientClusterFactory 
 
     MiniMRYarnCluster miniMRYarnCluster = new MiniMRYarnCluster(caller
         .getName(), noOfNMs);
+    job.getConfiguration().set("minimrclientcluster.caller.name",
+        caller.getName());
+    job.getConfiguration().setInt("minimrclientcluster.nodemanagers.number",
+        noOfNMs);
     miniMRYarnCluster.init(job.getConfiguration());
     miniMRYarnCluster.start();
 

Modified: hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/src/test/java/org/apache/hadoop/mapred/MiniMRYarnClusterAdapter.java
URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/src/test/java/org/apache/hadoop/mapred/MiniMRYarnClusterAdapter.java?rev=1420375&r1=1420374&r2=1420375&view=diff
==============================================================================
--- hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/src/test/java/org/apache/hadoop/mapred/MiniMRYarnClusterAdapter.java (original)
+++ hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/src/test/java/org/apache/hadoop/mapred/MiniMRYarnClusterAdapter.java Tue Dec 11 20:08:00 2012
@@ -18,8 +18,13 @@
 
 package org.apache.hadoop.mapred;
 
+import org.apache.commons.logging.Log;
+import org.apache.commons.logging.LogFactory;
 import org.apache.hadoop.conf.Configuration;
 import org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster;
+import org.apache.hadoop.mapreduce.v2.jobhistory.JHAdminConfig;
+import org.apache.hadoop.yarn.conf.YarnConfiguration;
+import org.apache.hadoop.yarn.service.Service.STATE;
 
 /**
  * An adapter for MiniMRYarnCluster providing a MiniMRClientCluster interface.
@@ -29,6 +34,8 @@ public class MiniMRYarnClusterAdapter im
 
   private MiniMRYarnCluster miniMRYarnCluster;
 
+  private static final Log LOG = LogFactory.getLog(MiniMRYarnClusterAdapter.class);
+
   public MiniMRYarnClusterAdapter(MiniMRYarnCluster miniMRYarnCluster) {
     this.miniMRYarnCluster = miniMRYarnCluster;
   }
@@ -48,4 +55,22 @@ public class MiniMRYarnClusterAdapter im
     miniMRYarnCluster.stop();
   }
 
+  @Override
+  public void restart() {
+    if (!miniMRYarnCluster.getServiceState().equals(STATE.STARTED)){
+      LOG.warn("Cannot restart the mini cluster, start it first");
+      return;
+    }
+    Configuration oldConf = new Configuration(getConfig());
+    String callerName = oldConf.get("minimrclientcluster.caller.name",
+        this.getClass().getName());
+    int noOfNMs = oldConf.getInt("minimrclientcluster.nodemanagers.number", 1);
+    oldConf.setBoolean(YarnConfiguration.YARN_MINICLUSTER_FIXED_PORTS, true);
+    oldConf.setBoolean(JHAdminConfig.MR_HISTORY_MINICLUSTER_FIXED_PORTS, true);
+    stop();
+    miniMRYarnCluster = new MiniMRYarnCluster(callerName, noOfNMs);
+    miniMRYarnCluster.init(oldConf);
+    miniMRYarnCluster.start();
+  }
+
 }
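
Given the implementation above, restart() tears down the running MiniMRYarnCluster
and starts a new one with the caller name and NodeManager count recorded by the
factory, pinning the mini-cluster ports so the addresses survive the restart. A
sketch of how a test might use it, where mrCluster is a MiniMRClientCluster created
via MiniMRClientClusterFactory (as in the test below):

    // Addresses recorded before the restart should still be valid afterwards.
    String rmBefore = mrCluster.getConfig().get(YarnConfiguration.RM_ADDRESS);
    mrCluster.restart();
    String rmAfter = mrCluster.getConfig().get(YarnConfiguration.RM_ADDRESS);
    assertEquals(rmBefore, rmAfter);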

Modified: hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/src/test/java/org/apache/hadoop/mapred/TestMiniMRClientCluster.java
URL: http://svn.apache.org/viewvc/hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/src/test/java/org/apache/hadoop/mapred/TestMiniMRClientCluster.java?rev=1420375&r1=1420374&r2=1420375&view=diff
==============================================================================
--- hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/src/test/java/org/apache/hadoop/mapred/TestMiniMRClientCluster.java (original)
+++ hadoop/common/branches/branch-trunk-win/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/src/test/java/org/apache/hadoop/mapred/TestMiniMRClientCluster.java Tue Dec 11 20:08:00 2012
@@ -32,6 +32,8 @@ import org.apache.hadoop.io.IntWritable;
 import org.apache.hadoop.io.Text;
 import org.apache.hadoop.mapreduce.Counters;
 import org.apache.hadoop.mapreduce.Job;
+import org.apache.hadoop.mapreduce.v2.jobhistory.JHAdminConfig;
+import org.apache.hadoop.yarn.conf.YarnConfiguration;
 import org.junit.AfterClass;
 import org.junit.BeforeClass;
 import org.junit.Test;
@@ -92,6 +94,65 @@ public class TestMiniMRClientCluster {
   }
 
   @Test
+  public void testRestart() throws Exception {
+
+    String rmAddress1 = mrCluster.getConfig().get(YarnConfiguration.RM_ADDRESS);
+    String rmAdminAddress1 = mrCluster.getConfig().get(
+        YarnConfiguration.RM_ADMIN_ADDRESS);
+    String rmSchedAddress1 = mrCluster.getConfig().get(
+        YarnConfiguration.RM_SCHEDULER_ADDRESS);
+    String rmRstrackerAddress1 = mrCluster.getConfig().get(
+        YarnConfiguration.RM_RESOURCE_TRACKER_ADDRESS);
+    String rmWebAppAddress1 = mrCluster.getConfig().get(
+        YarnConfiguration.RM_WEBAPP_ADDRESS);
+
+    String mrHistAddress1 = mrCluster.getConfig().get(
+        JHAdminConfig.MR_HISTORY_ADDRESS);
+    String mrHistWebAppAddress1 = mrCluster.getConfig().get(
+        JHAdminConfig.MR_HISTORY_WEBAPP_ADDRESS);
+
+    mrCluster.restart();
+
+    String rmAddress2 = mrCluster.getConfig().get(YarnConfiguration.RM_ADDRESS);
+    String rmAdminAddress2 = mrCluster.getConfig().get(
+        YarnConfiguration.RM_ADMIN_ADDRESS);
+    String rmSchedAddress2 = mrCluster.getConfig().get(
+        YarnConfiguration.RM_SCHEDULER_ADDRESS);
+    String rmRstrackerAddress2 = mrCluster.getConfig().get(
+        YarnConfiguration.RM_RESOURCE_TRACKER_ADDRESS);
+    String rmWebAppAddress2 = mrCluster.getConfig().get(
+        YarnConfiguration.RM_WEBAPP_ADDRESS);
+
+    String mrHistAddress2 = mrCluster.getConfig().get(
+        JHAdminConfig.MR_HISTORY_ADDRESS);
+    String mrHistWebAppAddress2 = mrCluster.getConfig().get(
+        JHAdminConfig.MR_HISTORY_WEBAPP_ADDRESS);
+
+    assertEquals("Address before restart: " + rmAddress1
+        + " is different from new address: " + rmAddress2, rmAddress1,
+        rmAddress2);
+    assertEquals("Address before restart: " + rmAdminAddress1
+        + " is different from new address: " + rmAdminAddress2,
+        rmAdminAddress1, rmAdminAddress2);
+    assertEquals("Address before restart: " + rmSchedAddress1
+        + " is different from new address: " + rmSchedAddress2,
+        rmSchedAddress1, rmSchedAddress2);
+    assertEquals("Address before restart: " + rmRstrackerAddress1
+        + " is different from new address: " + rmRstrackerAddress2,
+        rmRstrackerAddress1, rmRstrackerAddress2);
+    assertEquals("Address before restart: " + rmWebAppAddress1
+        + " is different from new address: " + rmWebAppAddress2,
+        rmWebAppAddress1, rmWebAppAddress2);
+    assertEquals("Address before restart: " + mrHistAddress1
+        + " is different from new address: " + mrHistAddress2, mrHistAddress1,
+        mrHistAddress2);
+    assertEquals("Address before restart: " + mrHistWebAppAddress1
+        + " is different from new address: " + mrHistWebAppAddress2,
+        mrHistWebAppAddress1, mrHistWebAppAddress2);
+
+  }
+
+  @Test
   public void testJob() throws Exception {
     final Job job = createJob();
     org.apache.hadoop.mapreduce.lib.input.FileInputFormat.setInputPaths(job,

