hadoop git commit: HDFS-7383. Moved the jira from release 2.7.0 section to 2.6.0 in CHANGES.txt

2014-11-09 Thread suresh
Repository: hadoop
Updated Branches:
  refs/heads/branch-2 e3877e805 -> d8e699be7


HDFS-7383. Moved the jira from release 2.7.0 section to 2.6.0 in CHANGES.txt


Project: http://git-wip-us.apache.org/repos/asf/hadoop/repo
Commit: http://git-wip-us.apache.org/repos/asf/hadoop/commit/d8e699be
Tree: http://git-wip-us.apache.org/repos/asf/hadoop/tree/d8e699be
Diff: http://git-wip-us.apache.org/repos/asf/hadoop/diff/d8e699be

Branch: refs/heads/branch-2
Commit: d8e699be73da38b7983f076f181d9b711c04ab6a
Parents: e3877e8
Author: Suresh Srinivas 
Authored: Sun Nov 9 18:22:11 2014 -0800
Committer: Suresh Srinivas 
Committed: Sun Nov 9 18:22:18 2014 -0800

--
 hadoop-hdfs-project/hadoop-hdfs/CHANGES.txt | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/hadoop/blob/d8e699be/hadoop-hdfs-project/hadoop-hdfs/CHANGES.txt
--
diff --git a/hadoop-hdfs-project/hadoop-hdfs/CHANGES.txt 
b/hadoop-hdfs-project/hadoop-hdfs/CHANGES.txt
index 01298f0..e7f288b 100644
--- a/hadoop-hdfs-project/hadoop-hdfs/CHANGES.txt
+++ b/hadoop-hdfs-project/hadoop-hdfs/CHANGES.txt
@@ -155,9 +155,6 @@ Release 2.7.0 - UNRELEASED
 HDFS-7366. BlockInfo should take replication as an short in the 
constructor.
 (Li Lu via wheat9)
 
-HDFS-7383. DataNode.requestShortCircuitFdsForRead may throw 
-NullPointerException. (szetszwo via suresh)
-
 Release 2.6.0 - UNRELEASED
 
   INCOMPATIBLE CHANGES
@@ -778,6 +775,9 @@ Release 2.6.0 - UNRELEASED
 HDFS-7199.  DFSOutputStream should not silently drop data if DataStreamer
 crashes with an unchecked exception (rushabhs via cmccabe)
 
+HDFS-7383. DataNode.requestShortCircuitFdsForRead may throw 
+NullPointerException. (szetszwo via suresh)
+
 BREAKDOWN OF HDFS-6581 SUBTASKS AND RELATED JIRAS
   
   HDFS-6921. Add LazyPersist flag to FileStatus. (Arpit Agarwal)



hadoop git commit: HDFS-7383. DataNode.requestShortCircuitFdsForRead may throw NullPointerException. Contributed by Tsz Wo Nicholas Sze.

2014-11-09 Thread suresh
Repository: hadoop
Updated Branches:
  refs/heads/branch-2.6 e1608c28d -> c98fcd1d3


HDFS-7383. DataNode.requestShortCircuitFdsForRead may throw 
NullPointerException. Contributed by Tsz Wo Nicholas Sze.

Conflicts:
hadoop-hdfs-project/hadoop-hdfs/CHANGES.txt


Project: http://git-wip-us.apache.org/repos/asf/hadoop/repo
Commit: http://git-wip-us.apache.org/repos/asf/hadoop/commit/c98fcd1d
Tree: http://git-wip-us.apache.org/repos/asf/hadoop/tree/c98fcd1d
Diff: http://git-wip-us.apache.org/repos/asf/hadoop/diff/c98fcd1d

Branch: refs/heads/branch-2.6
Commit: c98fcd1d3b91bf7388ebe90af9e83dd7ab84c170
Parents: e1608c2
Author: Suresh Srinivas 
Authored: Sun Nov 9 17:55:03 2014 -0800
Committer: Suresh Srinivas 
Committed: Sun Nov 9 18:18:58 2014 -0800

--
 hadoop-hdfs-project/hadoop-hdfs/CHANGES.txt |  3 +++
 .../hadoop/hdfs/server/datanode/DataNode.java   |  2 +-
 .../hdfs/server/datanode/DatanodeUtil.java  | 21 
 .../datanode/fsdataset/impl/FsDatasetCache.java |  4 ++--
 4 files changed, 27 insertions(+), 3 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/hadoop/blob/c98fcd1d/hadoop-hdfs-project/hadoop-hdfs/CHANGES.txt
--
diff --git a/hadoop-hdfs-project/hadoop-hdfs/CHANGES.txt 
b/hadoop-hdfs-project/hadoop-hdfs/CHANGES.txt
index 0677bf6..e755113 100644
--- a/hadoop-hdfs-project/hadoop-hdfs/CHANGES.txt
+++ b/hadoop-hdfs-project/hadoop-hdfs/CHANGES.txt
@@ -615,6 +615,9 @@ Release 2.6.0 - UNRELEASED
 HDFS-7218. FSNamesystem ACL operations should write to audit log on
 failure. (clamb via yliu)
 
+HDFS-7383. DataNode.requestShortCircuitFdsForRead may throw 
+NullPointerException. (szetszwo via suresh)
+
 BREAKDOWN OF HDFS-6581 SUBTASKS AND RELATED JIRAS
   
   HDFS-6921. Add LazyPersist flag to FileStatus. (Arpit Agarwal)

http://git-wip-us.apache.org/repos/asf/hadoop/blob/c98fcd1d/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/datanode/DataNode.java
--
diff --git 
a/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/datanode/DataNode.java
 
b/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/datanode/DataNode.java
index 82572e0..a7dd922 100644
--- 
a/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/datanode/DataNode.java
+++ 
b/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/datanode/DataNode.java
@@ -1522,7 +1522,7 @@ public class DataNode extends ReconfigurableBase
 
 try {
   fis[0] = (FileInputStream)data.getBlockInputStream(blk, 0);
-  fis[1] = 
(FileInputStream)data.getMetaDataInputStream(blk).getWrappedStream();
+  fis[1] = DatanodeUtil.getMetaDataInputStream(blk, data);
 } catch (ClassCastException e) {
   LOG.debug("requestShortCircuitFdsForRead failed", e);
   throw new ShortCircuitFdsUnsupportedException("This DataNode's " +

http://git-wip-us.apache.org/repos/asf/hadoop/blob/c98fcd1d/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/datanode/DatanodeUtil.java
--
diff --git 
a/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/datanode/DatanodeUtil.java
 
b/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/datanode/DatanodeUtil.java
index bd1ba2f..746c3f6 100644
--- 
a/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/datanode/DatanodeUtil.java
+++ 
b/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/datanode/DatanodeUtil.java
@@ -18,10 +18,15 @@
 package org.apache.hadoop.hdfs.server.datanode;
 
 import java.io.File;
+import java.io.FileInputStream;
+import java.io.FileNotFoundException;
 import java.io.IOException;
 
 import org.apache.hadoop.classification.InterfaceAudience;
 import org.apache.hadoop.hdfs.protocol.Block;
+import org.apache.hadoop.hdfs.protocol.ExtendedBlock;
+import org.apache.hadoop.hdfs.server.datanode.fsdataset.FsDatasetSpi;
+import org.apache.hadoop.hdfs.server.datanode.fsdataset.LengthInputStream;
 
 /** Provide utility methods for Datanode. */
 @InterfaceAudience.Private
@@ -114,4 +119,20 @@ public class DatanodeUtil {
 DataStorage.BLOCK_SUBDIR_PREFIX + d2;
 return new File(root, path);
   }
+
+  /**
+   * @return the FileInputStream for the meta data of the given block.
+   * @throws FileNotFoundException
+   *   if the file not found.
+   * @throws ClassCastException
+   *   if the underlying input stream is not a FileInputStream.
+   */
+  public static FileInputStream getMetaDataInputStream(
+  ExtendedBlock b, FsDataset

hadoop git commit: HDFS-7383. DataNode.requestShortCircuitFdsForRead may throw NullPointerException. Contributed by Tsz Wo Nicholas Sze.

2014-11-09 Thread suresh
Repository: hadoop
Updated Branches:
  refs/heads/branch-2 9e63cb449 -> d4f2e791a


HDFS-7383. DataNode.requestShortCircuitFdsForRead may throw 
NullPointerException. Contributed by Tsz Wo Nicholas Sze.


Project: http://git-wip-us.apache.org/repos/asf/hadoop/repo
Commit: http://git-wip-us.apache.org/repos/asf/hadoop/commit/d4f2e791
Tree: http://git-wip-us.apache.org/repos/asf/hadoop/tree/d4f2e791
Diff: http://git-wip-us.apache.org/repos/asf/hadoop/diff/d4f2e791

Branch: refs/heads/branch-2
Commit: d4f2e791a51d81b34209f0a0f7254e190955417f
Parents: 9e63cb4
Author: Suresh Srinivas 
Authored: Sun Nov 9 17:55:03 2014 -0800
Committer: Suresh Srinivas 
Committed: Sun Nov 9 17:55:49 2014 -0800

--
 hadoop-hdfs-project/hadoop-hdfs/CHANGES.txt |  3 +++
 .../hadoop/hdfs/server/datanode/DataNode.java   |  2 +-
 .../hdfs/server/datanode/DatanodeUtil.java  | 21 
 .../datanode/fsdataset/impl/FsDatasetCache.java |  4 ++--
 4 files changed, 27 insertions(+), 3 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/hadoop/blob/d4f2e791/hadoop-hdfs-project/hadoop-hdfs/CHANGES.txt
--
diff --git a/hadoop-hdfs-project/hadoop-hdfs/CHANGES.txt 
b/hadoop-hdfs-project/hadoop-hdfs/CHANGES.txt
index facb8cf..01298f0 100644
--- a/hadoop-hdfs-project/hadoop-hdfs/CHANGES.txt
+++ b/hadoop-hdfs-project/hadoop-hdfs/CHANGES.txt
@@ -155,6 +155,9 @@ Release 2.7.0 - UNRELEASED
 HDFS-7366. BlockInfo should take replication as an short in the 
constructor.
 (Li Lu via wheat9)
 
+HDFS-7383. DataNode.requestShortCircuitFdsForRead may throw 
+NullPointerException. (szetszwo via suresh)
+
 Release 2.6.0 - UNRELEASED
 
   INCOMPATIBLE CHANGES

http://git-wip-us.apache.org/repos/asf/hadoop/blob/d4f2e791/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/datanode/DataNode.java
--
diff --git 
a/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/datanode/DataNode.java
 
b/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/datanode/DataNode.java
index a64b81f..017529c 100644
--- 
a/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/datanode/DataNode.java
+++ 
b/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/datanode/DataNode.java
@@ -1550,7 +1550,7 @@ public class DataNode extends ReconfigurableBase
 
 try {
   fis[0] = (FileInputStream)data.getBlockInputStream(blk, 0);
-  fis[1] = 
(FileInputStream)data.getMetaDataInputStream(blk).getWrappedStream();
+  fis[1] = DatanodeUtil.getMetaDataInputStream(blk, data);
 } catch (ClassCastException e) {
   LOG.debug("requestShortCircuitFdsForRead failed", e);
   throw new ShortCircuitFdsUnsupportedException("This DataNode's " +

http://git-wip-us.apache.org/repos/asf/hadoop/blob/d4f2e791/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/datanode/DatanodeUtil.java
--
diff --git 
a/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/datanode/DatanodeUtil.java
 
b/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/datanode/DatanodeUtil.java
index bd1ba2f..746c3f6 100644
--- 
a/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/datanode/DatanodeUtil.java
+++ 
b/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/datanode/DatanodeUtil.java
@@ -18,10 +18,15 @@
 package org.apache.hadoop.hdfs.server.datanode;
 
 import java.io.File;
+import java.io.FileInputStream;
+import java.io.FileNotFoundException;
 import java.io.IOException;
 
 import org.apache.hadoop.classification.InterfaceAudience;
 import org.apache.hadoop.hdfs.protocol.Block;
+import org.apache.hadoop.hdfs.protocol.ExtendedBlock;
+import org.apache.hadoop.hdfs.server.datanode.fsdataset.FsDatasetSpi;
+import org.apache.hadoop.hdfs.server.datanode.fsdataset.LengthInputStream;
 
 /** Provide utility methods for Datanode. */
 @InterfaceAudience.Private
@@ -114,4 +119,20 @@ public class DatanodeUtil {
 DataStorage.BLOCK_SUBDIR_PREFIX + d2;
 return new File(root, path);
   }
+
+  /**
+   * @return the FileInputStream for the meta data of the given block.
+   * @throws FileNotFoundException
+   *   if the file not found.
+   * @throws ClassCastException
+   *   if the underlying input stream is not a FileInputStream.
+   */
+  public static FileInputStream getMetaDataInputStream(
+  ExtendedBlock b, FsDatasetSpi data) throws IOException {
+final LengthInputStream lin = data.getMetaDataInputStream(b);
+if (lin == null) {
+  throw new

hadoop git commit: HDFS-7383. DataNode.requestShortCircuitFdsForRead may throw NullPointerException. Contributed by Tsz Wo Nicholas Sze.

2014-11-09 Thread suresh
Repository: hadoop
Updated Branches:
  refs/heads/trunk a37a99345 -> 4ddc5cad0


HDFS-7383. DataNode.requestShortCircuitFdsForRead may throw 
NullPointerException. Contributed by Tsz Wo Nicholas Sze.


Project: http://git-wip-us.apache.org/repos/asf/hadoop/repo
Commit: http://git-wip-us.apache.org/repos/asf/hadoop/commit/4ddc5cad
Tree: http://git-wip-us.apache.org/repos/asf/hadoop/tree/4ddc5cad
Diff: http://git-wip-us.apache.org/repos/asf/hadoop/diff/4ddc5cad

Branch: refs/heads/trunk
Commit: 4ddc5cad0a4175f7f5ef9504a7365601dc7e63b4
Parents: a37a993
Author: Suresh Srinivas 
Authored: Sun Nov 9 17:55:03 2014 -0800
Committer: Suresh Srinivas 
Committed: Sun Nov 9 17:55:03 2014 -0800

--
 hadoop-hdfs-project/hadoop-hdfs/CHANGES.txt |  3 +++
 .../hadoop/hdfs/server/datanode/DataNode.java   |  2 +-
 .../hdfs/server/datanode/DatanodeUtil.java  | 21 
 .../datanode/fsdataset/impl/FsDatasetCache.java |  4 ++--
 4 files changed, 27 insertions(+), 3 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/hadoop/blob/4ddc5cad/hadoop-hdfs-project/hadoop-hdfs/CHANGES.txt
--
diff --git a/hadoop-hdfs-project/hadoop-hdfs/CHANGES.txt 
b/hadoop-hdfs-project/hadoop-hdfs/CHANGES.txt
index 6bde9bc..af18379 100644
--- a/hadoop-hdfs-project/hadoop-hdfs/CHANGES.txt
+++ b/hadoop-hdfs-project/hadoop-hdfs/CHANGES.txt
@@ -407,6 +407,9 @@ Release 2.7.0 - UNRELEASED
 HDFS-7366. BlockInfo should take replication as an short in the 
constructor.
 (Li Lu via wheat9)
 
+HDFS-7383. DataNode.requestShortCircuitFdsForRead may throw 
+NullPointerException. (szetszwo via suresh)
+
 Release 2.6.0 - UNRELEASED
 
   INCOMPATIBLE CHANGES

http://git-wip-us.apache.org/repos/asf/hadoop/blob/4ddc5cad/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/datanode/DataNode.java
--
diff --git 
a/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/datanode/DataNode.java
 
b/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/datanode/DataNode.java
index 6bd27fa..adfbaf3 100644
--- 
a/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/datanode/DataNode.java
+++ 
b/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/datanode/DataNode.java
@@ -1543,7 +1543,7 @@ public class DataNode extends ReconfigurableBase
 
 try {
   fis[0] = (FileInputStream)data.getBlockInputStream(blk, 0);
-  fis[1] = 
(FileInputStream)data.getMetaDataInputStream(blk).getWrappedStream();
+  fis[1] = DatanodeUtil.getMetaDataInputStream(blk, data);
 } catch (ClassCastException e) {
   LOG.debug("requestShortCircuitFdsForRead failed", e);
   throw new ShortCircuitFdsUnsupportedException("This DataNode's " +

http://git-wip-us.apache.org/repos/asf/hadoop/blob/4ddc5cad/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/datanode/DatanodeUtil.java
--
diff --git 
a/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/datanode/DatanodeUtil.java
 
b/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/datanode/DatanodeUtil.java
index bd1ba2f..746c3f6 100644
--- 
a/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/datanode/DatanodeUtil.java
+++ 
b/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/datanode/DatanodeUtil.java
@@ -18,10 +18,15 @@
 package org.apache.hadoop.hdfs.server.datanode;
 
 import java.io.File;
+import java.io.FileInputStream;
+import java.io.FileNotFoundException;
 import java.io.IOException;
 
 import org.apache.hadoop.classification.InterfaceAudience;
 import org.apache.hadoop.hdfs.protocol.Block;
+import org.apache.hadoop.hdfs.protocol.ExtendedBlock;
+import org.apache.hadoop.hdfs.server.datanode.fsdataset.FsDatasetSpi;
+import org.apache.hadoop.hdfs.server.datanode.fsdataset.LengthInputStream;
 
 /** Provide utility methods for Datanode. */
 @InterfaceAudience.Private
@@ -114,4 +119,20 @@ public class DatanodeUtil {
 DataStorage.BLOCK_SUBDIR_PREFIX + d2;
 return new File(root, path);
   }
+
+  /**
+   * @return the FileInputStream for the meta data of the given block.
+   * @throws FileNotFoundException
+   *   if the file not found.
+   * @throws ClassCastException
+   *   if the underlying input stream is not a FileInputStream.
+   */
+  public static FileInputStream getMetaDataInputStream(
+  ExtendedBlock b, FsDatasetSpi data) throws IOException {
+final LengthInputStream lin = data.getMetaDataInputStream(b);
+if (lin == null) {
+  throw new FileNotF
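The emails above truncate before the end of the new helper, but the diffs show its shape: instead of blindly casting `getMetaDataInputStream(blk).getWrappedStream()` (the source of the NullPointerException), the helper null-checks the returned stream first and fails with a descriptive exception. Below is a minimal, self-contained sketch of that pattern; `LengthInputStream` and `Dataset` here are simplified stand-ins for the HDFS classes, not the real API.

```java
import java.io.ByteArrayInputStream;
import java.io.FileNotFoundException;
import java.io.FilterInputStream;
import java.io.IOException;
import java.io.InputStream;

public class MetaStreamDemo {

    // Simplified stand-in for HDFS's LengthInputStream (hypothetical).
    static class LengthInputStream extends FilterInputStream {
        LengthInputStream(InputStream wrapped) { super(wrapped); }
        InputStream getWrappedStream() { return in; }
    }

    // Stand-in for the dataset lookup the patch goes through.
    interface Dataset {
        LengthInputStream getMetaDataInputStream(String block) throws IOException;
    }

    // Shape of the fix: check for null and throw FileNotFoundException
    // instead of letting a null stream surface as a NullPointerException.
    static InputStream getMetaDataInputStream(String block, Dataset data)
            throws IOException {
        final LengthInputStream lin = data.getMetaDataInputStream(block);
        if (lin == null) {
            throw new FileNotFoundException("Meta file for " + block + " not found");
        }
        return lin.getWrappedStream();
    }

    public static void main(String[] args) throws IOException {
        Dataset data = block -> "blk_1".equals(block)
                ? new LengthInputStream(new ByteArrayInputStream(new byte[] {1}))
                : null;

        // Present block: the wrapped stream comes back.
        System.out.println(getMetaDataInputStream("blk_1", data) != null);

        // Missing block: a descriptive FileNotFoundException, not an NPE.
        try {
            getMetaDataInputStream("blk_2", data);
        } catch (FileNotFoundException e) {
            System.out.println("caught: " + e.getMessage());
        }
    }
}
```

The caller in `DataNode.requestShortCircuitFdsForRead` already catches `ClassCastException` to report an unsupported dataset; the null check closes the remaining gap where no meta stream exists at all.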

git commit: HDFS-7324. haadmin command usage prints incorrect command name. Contributed by Brahma Reddy Battula.

2014-11-03 Thread suresh
Repository: hadoop
Updated Branches:
  refs/heads/branch-2 715c81ef6 -> c1ba22300


HDFS-7324. haadmin command usage prints incorrect command name. Contributed by 
Brahma Reddy Battula.


Project: http://git-wip-us.apache.org/repos/asf/hadoop/repo
Commit: http://git-wip-us.apache.org/repos/asf/hadoop/commit/c1ba2230
Tree: http://git-wip-us.apache.org/repos/asf/hadoop/tree/c1ba2230
Diff: http://git-wip-us.apache.org/repos/asf/hadoop/diff/c1ba2230

Branch: refs/heads/branch-2
Commit: c1ba223009a9c97df5a8bcf390e3327f420f3533
Parents: 715c81e
Author: Suresh Srinivas 
Authored: Mon Nov 3 13:15:14 2014 -0800
Committer: Suresh Srinivas 
Committed: Mon Nov 3 13:33:33 2014 -0800

--
 hadoop-hdfs-project/hadoop-hdfs/CHANGES.txt   | 3 +++
 .../src/main/java/org/apache/hadoop/hdfs/tools/DFSHAAdmin.java| 2 +-
 2 files changed, 4 insertions(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/hadoop/blob/c1ba2230/hadoop-hdfs-project/hadoop-hdfs/CHANGES.txt
--
diff --git a/hadoop-hdfs-project/hadoop-hdfs/CHANGES.txt 
b/hadoop-hdfs-project/hadoop-hdfs/CHANGES.txt
index 7433dbe..2c82a8c 100644
--- a/hadoop-hdfs-project/hadoop-hdfs/CHANGES.txt
+++ b/hadoop-hdfs-project/hadoop-hdfs/CHANGES.txt
@@ -132,6 +132,9 @@ Release 2.7.0 - UNRELEASED
 HDFS-7315. DFSTestUtil.readFileBuffer opens extra FSDataInputStream.
 (Plamen Jeliazkov via wheat9)
 
+HDFS-7324. haadmin command usage prints incorrect command name.
+(Brahma Reddy Battula via suresh)
+
 Release 2.6.0 - UNRELEASED
 
   INCOMPATIBLE CHANGES

http://git-wip-us.apache.org/repos/asf/hadoop/blob/c1ba2230/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/tools/DFSHAAdmin.java
--
diff --git 
a/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/tools/DFSHAAdmin.java
 
b/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/tools/DFSHAAdmin.java
index 5c4b49d..1ec6d35 100644
--- 
a/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/tools/DFSHAAdmin.java
+++ 
b/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/tools/DFSHAAdmin.java
@@ -89,7 +89,7 @@ public class DFSHAAdmin extends HAAdmin {
 
   @Override
   protected String getUsageString() {
-return "Usage: DFSHAAdmin [-ns <nameserviceId>]";
+return "Usage: haadmin";
   }
 
   @Override



git commit: HDFS-7324. haadmin command usage prints incorrect command name. Contributed by Brahma Reddy Battula.

2014-11-03 Thread suresh
Repository: hadoop
Updated Branches:
  refs/heads/trunk 58e9f24e0 -> 237890fea


HDFS-7324. haadmin command usage prints incorrect command name. Contributed by 
Brahma Reddy Battula.


Project: http://git-wip-us.apache.org/repos/asf/hadoop/repo
Commit: http://git-wip-us.apache.org/repos/asf/hadoop/commit/237890fe
Tree: http://git-wip-us.apache.org/repos/asf/hadoop/tree/237890fe
Diff: http://git-wip-us.apache.org/repos/asf/hadoop/diff/237890fe

Branch: refs/heads/trunk
Commit: 237890feabc809ade4e7542039634e04219d0bcb
Parents: 58e9f24
Author: Suresh Srinivas 
Authored: Mon Nov 3 13:15:14 2014 -0800
Committer: Suresh Srinivas 
Committed: Mon Nov 3 13:27:09 2014 -0800

--
 hadoop-hdfs-project/hadoop-hdfs/CHANGES.txt   | 3 +++
 .../src/main/java/org/apache/hadoop/hdfs/tools/DFSHAAdmin.java| 2 +-
 2 files changed, 4 insertions(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/hadoop/blob/237890fe/hadoop-hdfs-project/hadoop-hdfs/CHANGES.txt
--
diff --git a/hadoop-hdfs-project/hadoop-hdfs/CHANGES.txt 
b/hadoop-hdfs-project/hadoop-hdfs/CHANGES.txt
index 6c11c9f..16040ed 100644
--- a/hadoop-hdfs-project/hadoop-hdfs/CHANGES.txt
+++ b/hadoop-hdfs-project/hadoop-hdfs/CHANGES.txt
@@ -384,6 +384,9 @@ Release 2.7.0 - UNRELEASED
 HDFS-7315. DFSTestUtil.readFileBuffer opens extra FSDataInputStream.
 (Plamen Jeliazkov via wheat9)
 
+HDFS-7324. haadmin command usage prints incorrect command name.
+(Brahma Reddy Battula via suresh)
+
 Release 2.6.0 - UNRELEASED
 
   INCOMPATIBLE CHANGES

http://git-wip-us.apache.org/repos/asf/hadoop/blob/237890fe/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/tools/DFSHAAdmin.java
--
diff --git 
a/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/tools/DFSHAAdmin.java
 
b/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/tools/DFSHAAdmin.java
index 5c4b49d..1ec6d35 100644
--- 
a/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/tools/DFSHAAdmin.java
+++ 
b/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/tools/DFSHAAdmin.java
@@ -89,7 +89,7 @@ public class DFSHAAdmin extends HAAdmin {
 
   @Override
   protected String getUsageString() {
-return "Usage: DFSHAAdmin [-ns <nameserviceId>]";
+return "Usage: haadmin";
   }
 
   @Override



svn commit: r1608541 - in /hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common: CHANGES.txt src/main/java/org/apache/hadoop/util/DataChecksum.java

2014-07-07 Thread suresh
Author: suresh
Date: Mon Jul  7 18:14:22 2014
New Revision: 1608541

URL: http://svn.apache.org/r1608541
Log:
HADOOP-10782. Merge r1608539 from trunk.

Modified:

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/DataChecksum.java

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1608541&r1=1608540&r2=1608541&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt 
(original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt 
Mon Jul  7 18:14:22 2014
@@ -145,6 +145,8 @@ Release 2.5.0 - UNRELEASED
 
 HADOOP-10312 Shell.ExitCodeException to have more useful toString (stevel)
 
+HADOOP-10782. Fix typo in DataChecksum class. (Jingguo Yao via suresh)
+
   OPTIMIZATIONS
 
   BUG FIXES 

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/DataChecksum.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/DataChecksum.java?rev=1608541&r1=1608540&r2=1608541&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/DataChecksum.java
 (original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/DataChecksum.java
 Mon Jul  7 18:14:22 2014
@@ -30,7 +30,7 @@ import org.apache.hadoop.classification.
 import org.apache.hadoop.fs.ChecksumException;
 
 /**
- * This class provides inteface and utilities for processing checksums for
+ * This class provides interface and utilities for processing checksums for
  * DFS data transfers.
  */
 @InterfaceAudience.LimitedPrivate({"HDFS", "MapReduce"})




svn commit: r1608539 - in /hadoop/common/trunk/hadoop-common-project/hadoop-common: CHANGES.txt src/main/java/org/apache/hadoop/util/DataChecksum.java

2014-07-07 Thread suresh
Author: suresh
Date: Mon Jul  7 18:10:42 2014
New Revision: 1608539

URL: http://svn.apache.org/r1608539
Log:
HADOOP-10782. Fix typo in DataChecksum class. Contributed by Jingguo Yao.

Modified:
hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/DataChecksum.java

Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1608539&r1=1608538&r2=1608539&view=diff
==
--- hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt 
(original)
+++ hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt Mon Jul 
 7 18:10:42 2014
@@ -523,6 +523,8 @@ Release 2.5.0 - UNRELEASED
 
 HADOOP-10312 Shell.ExitCodeException to have more useful toString (stevel)
 
+HADOOP-10782. Fix typo in DataChecksum class. (Jingguo Yao via suresh)
+
   OPTIMIZATIONS
 
   BUG FIXES 

Modified: 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/DataChecksum.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/DataChecksum.java?rev=1608539&r1=1608538&r2=1608539&view=diff
==
--- 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/DataChecksum.java
 (original)
+++ 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/DataChecksum.java
 Mon Jul  7 18:10:42 2014
@@ -30,7 +30,7 @@ import org.apache.hadoop.classification.
 import org.apache.hadoop.fs.ChecksumException;
 
 /**
- * This class provides inteface and utilities for processing checksums for
+ * This class provides interface and utilities for processing checksums for
  * DFS data transfers.
  */
 @InterfaceAudience.LimitedPrivate({"HDFS", "MapReduce"})




svn commit: r1598754 - in /hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common: ./ src/main/java/org/apache/hadoop/ipc/ src/main/java/org/apache/hadoop/security/ src/test/java/org/apac

2014-05-30 Thread suresh
Author: suresh
Date: Fri May 30 21:56:58 2014
New Revision: 1598754

URL: http://svn.apache.org/r1598754
Log:
HADOOP-10342. Merging branch-2 equivalent of commit 1568525 from trunk

Modified:

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/UserGroupInformation.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/security/TestUserGroupInformation.java

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1598754&r1=1598753&r2=1598754&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt 
(original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt 
Fri May 30 21:56:58 2014
@@ -171,6 +171,9 @@ Release 2.5.0 - UNRELEASED
 HADOOP-10638. Updating hadoop-daemon.sh to work as expected when nfs is
 started as a privileged user. (Manikandan Narayanaswamy via atm)
 
+HADOOP-10342. Add a new method to UGI to use a Kerberos login subject to
+build a new UGI. (Larry McCay via omalley)
+
 Release 2.4.1 - UNRELEASED
 
   INCOMPATIBLE CHANGES

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java?rev=1598754&r1=1598753&r2=1598754&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
 (original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
 Fri May 30 21:56:58 2014
@@ -652,7 +652,7 @@ public class Client {
   // try re-login
   if (UserGroupInformation.isLoginKeytabBased()) {
 UserGroupInformation.getLoginUser().reloginFromKeytab();
-  } else {
+  } else if (UserGroupInformation.isLoginTicketBased()) {
 UserGroupInformation.getLoginUser().reloginFromTicketCache();
   }
   // have granularity of milliseconds

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/UserGroupInformation.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/UserGroupInformation.java?rev=1598754&r1=1598753&r2=1598754&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/UserGroupInformation.java
 (original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/UserGroupInformation.java
 Fri May 30 21:56:58 2014
@@ -692,6 +692,35 @@ public class UserGroupInformation {
 }
   }
 
+   /**
+   * Create a UserGroupInformation from a Subject with Kerberos principal.
+   *
+   * @param user The KerberosPrincipal to use in UGI
+   *
+   * @throws IOException if the kerberos login fails
+   */
+  public static UserGroupInformation getUGIFromSubject(Subject subject)
+  throws IOException {
+if (subject == null) {
+  throw new IOException("Subject must not be null");
+}
+
+if (subject.getPrincipals(KerberosPrincipal.class).isEmpty()) {
+  throw new IOException("Provided Subject must contain a KerberosPrincipal");
+}
+
+KerberosPrincipal principal =
+subject.getPrincipals(KerberosPrincipal.class).iterator().next();
+
+User ugiUser = new User(principal.getName(),
+AuthenticationMethod.KERBEROS, null);
+subject.getPrincipals().add(ugiUser);
+UserGroupInformation ugi = new UserGroupInformation(subject);
+ugi.setLogin(null);
+ugi.setAuthenticationMethod(AuthenticationMethod.KERBEROS);
+return ugi;
+  }
+
   /**
* Get the currently logged in user.
* @return the logged in user
@@ -1100,6 +1129,14 @@ public class UserGroupInformation {
   }
 
   /**
+   * Did the login happen via ticket cache
+   * @return true or false
+   */
+  public static boolean isLoginTicketBased()  throws IOException {
+return getLoginUser().isKrbTkt;
+  }
+
+  /**
   

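The new `getUGIFromSubject` rejects a null `Subject` and a `Subject` without a `KerberosPrincipal` before building the UGI. That validation uses only standard JAAS types, so it can be sketched without a KDC; the sketch below mirrors the checks from the diff, while the UGI construction itself is Hadoop-internal and omitted. `SubjectUgiDemo` and `principalFromSubject` are illustrative names, not Hadoop API.

```java
import java.io.IOException;

import javax.security.auth.Subject;
import javax.security.auth.kerberos.KerberosPrincipal;

public class SubjectUgiDemo {

    // Mirrors the validation getUGIFromSubject performs before it builds a
    // UserGroupInformation from the subject (that part is omitted here).
    static KerberosPrincipal principalFromSubject(Subject subject) throws IOException {
        if (subject == null) {
            throw new IOException("Subject must not be null");
        }
        if (subject.getPrincipals(KerberosPrincipal.class).isEmpty()) {
            throw new IOException("Provided Subject must contain a KerberosPrincipal");
        }
        return subject.getPrincipals(KerberosPrincipal.class).iterator().next();
    }

    public static void main(String[] args) throws IOException {
        // A principal with an explicit realm needs no krb5 configuration.
        Subject subject = new Subject();
        subject.getPrincipals().add(new KerberosPrincipal("alice@EXAMPLE.COM"));
        System.out.println(principalFromSubject(subject).getName());

        // An empty Subject is rejected, matching the patch's error message.
        try {
            principalFromSubject(new Subject());
        } catch (IOException e) {
            System.out.println("rejected: " + e.getMessage());
        }
    }
}
```

The companion `isLoginTicketBased()` check lets `Client`'s retry path distinguish keytab logins (relogin from keytab) from ticket-cache logins (relogin from ticket cache) instead of assuming one of the two.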
svn commit: r1594283 - in /hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common: ./ src/main/java/org/apache/hadoop/security/authorize/ src/test/java/org/apache/hadoop/security/authoriz

2014-05-13 Thread suresh
Author: suresh
Date: Tue May 13 17:00:01 2014
New Revision: 1594283

URL: http://svn.apache.org/r1594283
Log:
HADOOP-10566. Adding files missed in previous commit 1594280

Added:

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/authorize/ProxyServers.java
  - copied unchanged from r1594282, 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/authorize/ProxyServers.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/security/authorize/TestProxyServers.java
  - copied unchanged from r1594282, 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/security/authorize/TestProxyServers.java
Modified:

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/authorize/ProxyUsers.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/security/authorize/TestProxyUsers.java

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1594283&r1=1594282&r2=1594283&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt 
(original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt 
Tue May 13 17:00:01 2014
@@ -48,6 +48,9 @@ Release 2.5.0 - UNRELEASED
 HADOOP-10158. SPNEGO should work with multiple interfaces/SPNs.
 (daryn via kihwal)
 
+HADOOP-10566. Refactor proxyservers out of ProxyUsers.
+(Benoy Antony via suresh)
+
   OPTIMIZATIONS
 
   BUG FIXES 

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/authorize/ProxyUsers.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/authorize/ProxyUsers.java?rev=1594283&r1=1594282&r2=1594283&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/authorize/ProxyUsers.java
 (original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/authorize/ProxyUsers.java
 Tue May 13 17:00:01 2014
@@ -19,11 +19,9 @@
 package org.apache.hadoop.security.authorize;
 
 import java.net.InetAddress;
-import java.net.InetSocketAddress;
 import java.net.UnknownHostException;
 import java.util.Collection;
 import java.util.HashMap;
-import java.util.HashSet;
 import java.util.Map;
 import java.util.Map.Entry;
 
@@ -42,7 +40,6 @@ public class ProxyUsers {
   private static final String CONF_GROUPS = ".groups";
   private static final String CONF_HADOOP_PROXYUSER = "hadoop.proxyuser.";
   private static final String CONF_HADOOP_PROXYUSER_RE = 
"hadoop\\.proxyuser\\.";
-  public static final String CONF_HADOOP_PROXYSERVERS = "hadoop.proxyservers";
   
   private static boolean init = false;
   //list of users, groups and hosts per proxyuser
@@ -52,8 +49,6 @@ public class ProxyUsers {
     new HashMap<String, Collection<String>>();
   private static Map<String, Collection<String>> proxyHosts = 
     new HashMap<String, Collection<String>>();
-  private static Collection<String> proxyServers =
-    new HashSet<String>();
 
   /**
* reread the conf and get new values for 
"hadoop.proxyuser.*.groups/users/hosts"
@@ -73,7 +68,6 @@ public class ProxyUsers {
     proxyGroups.clear();
     proxyHosts.clear();
     proxyUsers.clear();
-    proxyServers.clear();
 
 // get all the new keys for users
 String regex = CONF_HADOOP_PROXYUSER_RE+"[^.]*\\"+CONF_USERS;
@@ -98,22 +92,8 @@ public class ProxyUsers {
   proxyHosts.put(entry.getKey(),
   StringUtils.getTrimmedStringCollection(entry.getValue()));
 }
-
-    // trusted proxy servers such as http proxies
-    for (String host : conf.getTrimmedStrings(CONF_HADOOP_PROXYSERVERS)) {
-      InetSocketAddress addr = new InetSocketAddress(host, 0);
-      if (!addr.isUnresolved()) {
-        proxyServers.add(addr.getAddress().getHostAddress());
-      }
-    }
     init = true;
-  }
-
-  public static synchronized boolean isProxyServer(String remoteAddr) { 
-    if(!init) {
-      refreshSuperUserGroupsConfiguration(); 
-    }
-    return proxyServers.contains(remoteAddr);
+    ProxyServers.refresh(conf);
   }
   
   /**

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/security/authorize/

svn commit: r1594285 - in /hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common: ./ src/main/java/org/apache/hadoop/security/authorize/ src/test/java/org/apache/hadoop/security/authoriz

2014-05-13 Thread suresh
Author: suresh
Date: Tue May 13 17:01:24 2014
New Revision: 1594285

URL: http://svn.apache.org/r1594285
Log:
Revert the commit r1594283

Removed:

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/authorize/ProxyServers.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/security/authorize/TestProxyServers.java
Modified:

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/authorize/ProxyUsers.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/security/authorize/TestProxyUsers.java

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1594285&r1=1594284&r2=1594285&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt 
(original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt 
Tue May 13 17:01:24 2014
@@ -48,9 +48,6 @@ Release 2.5.0 - UNRELEASED
 HADOOP-10158. SPNEGO should work with multiple interfaces/SPNs.
 (daryn via kihwal)
 
-HADOOP-10566. Refactor proxyservers out of ProxyUsers.
-(Benoy Antony via suresh)
-
   OPTIMIZATIONS
 
   BUG FIXES 

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/authorize/ProxyUsers.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/authorize/ProxyUsers.java?rev=1594285&r1=1594284&r2=1594285&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/authorize/ProxyUsers.java
 (original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/authorize/ProxyUsers.java
 Tue May 13 17:01:24 2014
@@ -19,9 +19,11 @@
 package org.apache.hadoop.security.authorize;
 
 import java.net.InetAddress;
+import java.net.InetSocketAddress;
 import java.net.UnknownHostException;
 import java.util.Collection;
 import java.util.HashMap;
+import java.util.HashSet;
 import java.util.Map;
 import java.util.Map.Entry;
 
@@ -40,6 +42,7 @@ public class ProxyUsers {
   private static final String CONF_GROUPS = ".groups";
   private static final String CONF_HADOOP_PROXYUSER = "hadoop.proxyuser.";
   private static final String CONF_HADOOP_PROXYUSER_RE = 
"hadoop\\.proxyuser\\.";
+  public static final String CONF_HADOOP_PROXYSERVERS = "hadoop.proxyservers";
   
   private static boolean init = false;
   //list of users, groups and hosts per proxyuser
@@ -49,6 +52,8 @@ public class ProxyUsers {
     new HashMap<String, Collection<String>>();
   private static Map<String, Collection<String>> proxyHosts = 
     new HashMap<String, Collection<String>>();
+  private static Collection<String> proxyServers =
+    new HashSet<String>();
 
   /**
* reread the conf and get new values for 
"hadoop.proxyuser.*.groups/users/hosts"
@@ -68,6 +73,7 @@ public class ProxyUsers {
     proxyGroups.clear();
     proxyHosts.clear();
     proxyUsers.clear();
+    proxyServers.clear();
 
 // get all the new keys for users
 String regex = CONF_HADOOP_PROXYUSER_RE+"[^.]*\\"+CONF_USERS;
@@ -92,8 +98,22 @@ public class ProxyUsers {
   proxyHosts.put(entry.getKey(),
   StringUtils.getTrimmedStringCollection(entry.getValue()));
 }
+
+    // trusted proxy servers such as http proxies
+    for (String host : conf.getTrimmedStrings(CONF_HADOOP_PROXYSERVERS)) {
+      InetSocketAddress addr = new InetSocketAddress(host, 0);
+      if (!addr.isUnresolved()) {
+        proxyServers.add(addr.getAddress().getHostAddress());
+      }
+    }
     init = true;
-    ProxyServers.refresh(conf);
+  }
+
+  public static synchronized boolean isProxyServer(String remoteAddr) { 
+    if(!init) {
+      refreshSuperUserGroupsConfiguration(); 
+    }
+    return proxyServers.contains(remoteAddr);
   }
   
   /**

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/security/authorize/TestProxyUsers.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/security/authorize/TestProxyUsers.java?rev=1594285&r1=1594284&r2=1594285&view=diff
==
--- 
hadoop/common/branches/branch-2/h

svn commit: r1594282 - in /hadoop/common/trunk/hadoop-common-project/hadoop-common/src: main/java/org/apache/hadoop/security/authorize/ProxyServers.java test/java/org/apache/hadoop/security/authorize/

2014-05-13 Thread suresh
Author: suresh
Date: Tue May 13 16:56:25 2014
New Revision: 1594282

URL: http://svn.apache.org/r1594282
Log:
HADOOP-10566. Adding files missed in previous commit 1594280

Added:

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/authorize/ProxyServers.java

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/security/authorize/TestProxyServers.java

Added: 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/authorize/ProxyServers.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/authorize/ProxyServers.java?rev=1594282&view=auto
==
--- 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/authorize/ProxyServers.java
 (added)
+++ 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/authorize/ProxyServers.java
 Tue May 13 16:56:25 2014
@@ -0,0 +1,53 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.hadoop.security.authorize;
+
+import java.net.InetSocketAddress;
+import java.util.Collection;
+import java.util.HashSet;
+
+import org.apache.hadoop.conf.Configuration;
+
+public class ProxyServers {
+  public static final String CONF_HADOOP_PROXYSERVERS = "hadoop.proxyservers";
+  private static volatile Collection<String> proxyServers;
+
+  public static void refresh() {
+    refresh(new Configuration());
+  }
+
+  public static void refresh(Configuration conf){
+    Collection<String> tempServers = new HashSet<String>();
+    // trusted proxy servers such as http proxies
+    for (String host : conf.getTrimmedStrings(CONF_HADOOP_PROXYSERVERS)) {
+      InetSocketAddress addr = new InetSocketAddress(host, 0);
+      if (!addr.isUnresolved()) {
+        tempServers.add(addr.getAddress().getHostAddress());
+      }
+    }
+    proxyServers = tempServers;
+  }
+
+  public static boolean isProxyServer(String remoteAddr) { 
+    if (proxyServers == null) {
+      refresh(); 
+    }
+    return proxyServers.contains(remoteAddr);
+  }
+}
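The ProxyServers class added above resolves each configured host name once at refresh time and then answers lookups with a plain set-membership test. A minimal self-contained sketch of that technique follows; the class name `ProxyServersSketch` and the varargs `refresh(String...)` standing in for `conf.getTrimmedStrings(CONF_HADOOP_PROXYSERVERS)` are illustrative assumptions, not Hadoop API:

```java
import java.net.InetSocketAddress;
import java.util.Collection;
import java.util.HashSet;

// Standalone sketch of the caching technique in ProxyServers.refresh():
// resolve each configured host once, publish the resolved-IP set with a
// single volatile write, and answer isProxyServer() with a set lookup.
public class ProxyServersSketch {
  private static volatile Collection<String> proxyServers;

  // stands in for conf.getTrimmedStrings("hadoop.proxyservers")
  public static void refresh(String... hosts) {
    Collection<String> tempServers = new HashSet<String>();
    for (String host : hosts) {
      InetSocketAddress addr = new InetSocketAddress(host, 0);
      if (!addr.isUnresolved()) {
        // cache the resolved address, since callers pass a remote IP
        tempServers.add(addr.getAddress().getHostAddress());
      }
    }
    proxyServers = tempServers;  // volatile write publishes the new set
  }

  public static boolean isProxyServer(String remoteAddr) {
    return proxyServers != null && proxyServers.contains(remoteAddr);
  }

  public static void main(String[] args) {
    refresh("127.0.0.1");                             // literal IP, always resolves
    System.out.println(isProxyServer("127.0.0.1"));   // true
    System.out.println(isProxyServer("203.0.113.9")); // false
  }
}
```

Building the set aside and swapping it in with one volatile assignment means readers never observe a half-populated set, which is presumably why this refactoring could drop the `synchronized` that the old `ProxyUsers.isProxyServer` needed.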

Added: 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/security/authorize/TestProxyServers.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/security/authorize/TestProxyServers.java?rev=1594282&view=auto
==
--- 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/security/authorize/TestProxyServers.java
 (added)
+++ 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/security/authorize/TestProxyServers.java
 Tue May 13 16:56:25 2014
@@ -0,0 +1,38 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.hadoop.security.authorize;
+
+import static org.junit.Assert.assertFalse;
+import static org.junit.Assert.assertTrue;
+
+import org.apache.hadoop.conf.Configuration;
+import org.junit.Test;
+
+public class TestProxyServers {
+
+  @Test
+  public void testProxyServer() {
+Configuration conf = new Configuration();
+assertF

svn commit: r1594280 - in /hadoop/common/trunk/hadoop-common-project/hadoop-common: CHANGES.txt src/main/java/org/apache/hadoop/security/authorize/ProxyUsers.java src/test/java/org/apache/hadoop/secur

2014-05-13 Thread suresh
Author: suresh
Date: Tue May 13 16:53:38 2014
New Revision: 1594280

URL: http://svn.apache.org/r1594280
Log:
HADOOP-10566. Refactor proxyservers out of ProxyUsers. Contributed by Benoy 
Antony.

Modified:
hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/authorize/ProxyUsers.java

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/security/authorize/TestProxyUsers.java

Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1594280&r1=1594279&r2=1594280&view=diff
==
--- hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt 
(original)
+++ hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt Tue May 
13 16:53:38 2014
@@ -380,6 +380,9 @@ Release 2.5.0 - UNRELEASED
 HADOOP-10158. SPNEGO should work with multiple interfaces/SPNs.
 (daryn via kihwal)
 
+HADOOP-10566. Refactor proxyservers out of ProxyUsers.
+(Benoy Antony via suresh)
+
   OPTIMIZATIONS
 
   BUG FIXES 

Modified: 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/authorize/ProxyUsers.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/authorize/ProxyUsers.java?rev=1594280&r1=1594279&r2=1594280&view=diff
==
--- 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/authorize/ProxyUsers.java
 (original)
+++ 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/authorize/ProxyUsers.java
 Tue May 13 16:53:38 2014
@@ -19,12 +19,10 @@
 package org.apache.hadoop.security.authorize;
 
 import java.net.InetAddress;
-import java.net.InetSocketAddress;
 import java.net.UnknownHostException;
 import java.util.ArrayList;
 import java.util.Collection;
 import java.util.HashMap;
-import java.util.HashSet;
 import java.util.Map;
 import java.util.Map.Entry;
 
@@ -44,7 +42,6 @@ public class ProxyUsers {
   private static final String CONF_GROUPS = ".groups";
   private static final String CONF_HADOOP_PROXYUSER = "hadoop.proxyuser.";
   private static final String CONF_HADOOP_PROXYUSER_RE = 
"hadoop\\.proxyuser\\.";
-  public static final String CONF_HADOOP_PROXYSERVERS = "hadoop.proxyservers";
   
   private static boolean init = false;
   //list of users, groups and hosts per proxyuser
@@ -54,8 +51,6 @@ public class ProxyUsers {
     new HashMap<String, Collection<String>>();
   private static Map<String, Collection<String>> proxyHosts = 
     new HashMap<String, Collection<String>>();
-  private static Collection<String> proxyServers =
-    new HashSet<String>();
 
   /**
* reread the conf and get new values for 
"hadoop.proxyuser.*.groups/users/hosts"
@@ -75,7 +70,6 @@ public class ProxyUsers {
     proxyGroups.clear();
     proxyHosts.clear();
     proxyUsers.clear();
-    proxyServers.clear();
 
 // get all the new keys for users
 String regex = CONF_HADOOP_PROXYUSER_RE+"[^.]*\\"+CONF_USERS;
@@ -103,22 +97,8 @@ public class ProxyUsers {
   proxyHosts.put(entry.getKey(),
   StringUtils.getTrimmedStringCollection(entry.getValue()));
 }
-
-    // trusted proxy servers such as http proxies
-    for (String host : conf.getTrimmedStrings(CONF_HADOOP_PROXYSERVERS)) {
-      InetSocketAddress addr = new InetSocketAddress(host, 0);
-      if (!addr.isUnresolved()) {
-        proxyServers.add(addr.getAddress().getHostAddress());
-      }
-    }
     init = true;
-  }
-
-  public static synchronized boolean isProxyServer(String remoteAddr) { 
-    if(!init) {
-      refreshSuperUserGroupsConfiguration(); 
-    }
-    return proxyServers.contains(remoteAddr);
+    ProxyServers.refresh(conf);
   }
   
   /**

Modified: 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/security/authorize/TestProxyUsers.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/security/authorize/TestProxyUsers.java?rev=1594280&r1=1594279&r2=1594280&view=diff
==
--- 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/security/authorize/TestProxyUsers.java
 (original)
+++ 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/security/authorize/TestProxyUsers.java
 Tue May 13 16:53:38 2014
@@ -327,17 +327,6 @@ public class TestProxyUsers {
 assertEquals (1,hosts.size());
   }
 
- 

svn commit: r1588944 - in /hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common: CHANGES.txt src/main/conf/hadoop-metrics2.properties

2014-04-21 Thread suresh
Author: suresh
Date: Mon Apr 21 18:00:08 2014
New Revision: 1588944

URL: http://svn.apache.org/r1588944
Log:
HADOOP-9919. Merge r1588943 from trunk.

Modified:

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/conf/hadoop-metrics2.properties

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1588944&r1=1588943&r2=1588944&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt 
(original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt 
Mon Apr 21 18:00:08 2014
@@ -70,6 +70,9 @@ Release 2.5.0 - UNRELEASED
 HADOOP-10499. Remove unused parameter from ProxyUsers.authorize().
 (Benoy Antony via cnauroth)
 
+HADOOP-9919. Update hadoop-metrics2.properties examples to Yarn.
+(Akira AJISAKA via suresh)
+
 Release 2.4.1 - UNRELEASED
 
   INCOMPATIBLE CHANGES

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/conf/hadoop-metrics2.properties
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/conf/hadoop-metrics2.properties?rev=1588944&r1=1588943&r2=1588944&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/conf/hadoop-metrics2.properties
 (original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/conf/hadoop-metrics2.properties
 Mon Apr 21 18:00:08 2014
@@ -12,19 +12,22 @@
 
 #datanode.sink.file.filename=datanode-metrics.out
 
-# the following example split metrics of different
-# context to different sinks (in this case files)
-#jobtracker.sink.file_jvm.context=jvm
-#jobtracker.sink.file_jvm.filename=jobtracker-jvm-metrics.out
-#jobtracker.sink.file_mapred.context=mapred
-#jobtracker.sink.file_mapred.filename=jobtracker-mapred-metrics.out
+#resourcemanager.sink.file.filename=resourcemanager-metrics.out
 
-#tasktracker.sink.file.filename=tasktracker-metrics.out
+#nodemanager.sink.file.filename=nodemanager-metrics.out
 
-#maptask.sink.file.filename=maptask-metrics.out
+#mrappmaster.sink.file.filename=mrappmaster-metrics.out
 
-#reducetask.sink.file.filename=reducetask-metrics.out
+#jobhistoryserver.sink.file.filename=jobhistoryserver-metrics.out
 
+# the following example split metrics of different
+# context to different sinks (in this case files)
+#nodemanager.sink.file_jvm.class=org.apache.hadoop.metrics2.sink.FileSink
+#nodemanager.sink.file_jvm.context=jvm
+#nodemanager.sink.file_jvm.filename=nodemanager-jvm-metrics.out
+#nodemanager.sink.file_mapred.class=org.apache.hadoop.metrics2.sink.FileSink
+#nodemanager.sink.file_mapred.context=mapred
+#nodemanager.sink.file_mapred.filename=nodemanager-mapred-metrics.out
 
 #
 # Below are for sending metrics to Ganglia
@@ -56,11 +59,10 @@
 
 #datanode.sink.ganglia.servers=yourgangliahost_1:8649,yourgangliahost_2:8649
 
-#jobtracker.sink.ganglia.servers=yourgangliahost_1:8649,yourgangliahost_2:8649
-
-#tasktracker.sink.ganglia.servers=yourgangliahost_1:8649,yourgangliahost_2:8649
+#resourcemanager.sink.ganglia.servers=yourgangliahost_1:8649,yourgangliahost_2:8649
 
-#maptask.sink.ganglia.servers=yourgangliahost_1:8649,yourgangliahost_2:8649
+#nodemanager.sink.ganglia.servers=yourgangliahost_1:8649,yourgangliahost_2:8649
 
-#reducetask.sink.ganglia.servers=yourgangliahost_1:8649,yourgangliahost_2:8649
+#mrappmaster.sink.ganglia.servers=yourgangliahost_1:8649,yourgangliahost_2:8649
 
+#jobhistoryserver.sink.ganglia.servers=yourgangliahost_1:8649,yourgangliahost_2:8649




svn commit: r1588943 - in /hadoop/common/trunk/hadoop-common-project/hadoop-common: CHANGES.txt src/main/conf/hadoop-metrics2.properties

2014-04-21 Thread suresh
Author: suresh
Date: Mon Apr 21 17:57:06 2014
New Revision: 1588943

URL: http://svn.apache.org/r1588943
Log:
HADOOP-9919. Update hadoop-metrics2.properties examples to Yarn. Contributed by 
Akira AJISAKA.

Modified:
hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/conf/hadoop-metrics2.properties

Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1588943&r1=1588942&r2=1588943&view=diff
==
--- hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt 
(original)
+++ hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt Mon Apr 
21 17:57:06 2014
@@ -393,6 +393,9 @@ Release 2.5.0 - UNRELEASED
 HADOOP-10499. Remove unused parameter from ProxyUsers.authorize().
 (Benoy Antony via cnauroth)
 
+HADOOP-9919. Update hadoop-metrics2.properties examples to Yarn.
+(Akira AJISAKA via suresh)
+
 Release 2.4.1 - UNRELEASED
 
   INCOMPATIBLE CHANGES

Modified: 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/conf/hadoop-metrics2.properties
URL: 
http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/conf/hadoop-metrics2.properties?rev=1588943&r1=1588942&r2=1588943&view=diff
==
--- 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/conf/hadoop-metrics2.properties
 (original)
+++ 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/conf/hadoop-metrics2.properties
 Mon Apr 21 17:57:06 2014
@@ -12,19 +12,22 @@
 
 #datanode.sink.file.filename=datanode-metrics.out
 
-# the following example split metrics of different
-# context to different sinks (in this case files)
-#jobtracker.sink.file_jvm.context=jvm
-#jobtracker.sink.file_jvm.filename=jobtracker-jvm-metrics.out
-#jobtracker.sink.file_mapred.context=mapred
-#jobtracker.sink.file_mapred.filename=jobtracker-mapred-metrics.out
+#resourcemanager.sink.file.filename=resourcemanager-metrics.out
 
-#tasktracker.sink.file.filename=tasktracker-metrics.out
+#nodemanager.sink.file.filename=nodemanager-metrics.out
 
-#maptask.sink.file.filename=maptask-metrics.out
+#mrappmaster.sink.file.filename=mrappmaster-metrics.out
 
-#reducetask.sink.file.filename=reducetask-metrics.out
+#jobhistoryserver.sink.file.filename=jobhistoryserver-metrics.out
 
+# the following example split metrics of different
+# context to different sinks (in this case files)
+#nodemanager.sink.file_jvm.class=org.apache.hadoop.metrics2.sink.FileSink
+#nodemanager.sink.file_jvm.context=jvm
+#nodemanager.sink.file_jvm.filename=nodemanager-jvm-metrics.out
+#nodemanager.sink.file_mapred.class=org.apache.hadoop.metrics2.sink.FileSink
+#nodemanager.sink.file_mapred.context=mapred
+#nodemanager.sink.file_mapred.filename=nodemanager-mapred-metrics.out
 
 #
 # Below are for sending metrics to Ganglia
@@ -56,11 +59,10 @@
 
 #datanode.sink.ganglia.servers=yourgangliahost_1:8649,yourgangliahost_2:8649
 
-#jobtracker.sink.ganglia.servers=yourgangliahost_1:8649,yourgangliahost_2:8649
-
-#tasktracker.sink.ganglia.servers=yourgangliahost_1:8649,yourgangliahost_2:8649
+#resourcemanager.sink.ganglia.servers=yourgangliahost_1:8649,yourgangliahost_2:8649
 
-#maptask.sink.ganglia.servers=yourgangliahost_1:8649,yourgangliahost_2:8649
+#nodemanager.sink.ganglia.servers=yourgangliahost_1:8649,yourgangliahost_2:8649
 
-#reducetask.sink.ganglia.servers=yourgangliahost_1:8649,yourgangliahost_2:8649
+#mrappmaster.sink.ganglia.servers=yourgangliahost_1:8649,yourgangliahost_2:8649
 
+#jobhistoryserver.sink.ganglia.servers=yourgangliahost_1:8649,yourgangliahost_2:8649
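Uncommenting the renamed YARN entries yields a working file-sink setup along these lines; note that the `*.sink.file.class` and `*.period` lines are assumed from the file's standard preamble, which this diff does not show:

```properties
# Assumed preamble: default sink implementation and a 10s collection period.
*.sink.file.class=org.apache.hadoop.metrics2.sink.FileSink
*.period=10

# Per-daemon output files, using the YARN daemon prefixes from the patch.
resourcemanager.sink.file.filename=resourcemanager-metrics.out
nodemanager.sink.file.filename=nodemanager-metrics.out
jobhistoryserver.sink.file.filename=jobhistoryserver-metrics.out
```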




svn commit: r1573779 - in /hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common: CHANGES.txt src/main/java/org/apache/hadoop/fs/shell/Count.java src/test/resources/testConf.xml

2014-03-03 Thread suresh
Author: suresh
Date: Mon Mar  3 22:12:20 2014
New Revision: 1573779

URL: http://svn.apache.org/r1573779
Log:
HADOOP-10378. Merge 1573776 from trunk.

Modified:

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/shell/Count.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/resources/testConf.xml

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1573779&r1=1573778&r2=1573779&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt 
(original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt 
Mon Mar  3 22:12:20 2014
@@ -12,6 +12,9 @@ Release 2.5.0 - UNRELEASED
 
   BUG FIXES 
 
+HADOOP-10378. Typo in help printed by hdfs dfs -help.
+(Mit Desai via suresh)
+
 Release 2.4.0 - UNRELEASED
 
   INCOMPATIBLE CHANGES

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/shell/Count.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/shell/Count.java?rev=1573779&r1=1573778&r2=1573779&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/shell/Count.java
 (original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/shell/Count.java
 Mon Mar  3 22:12:20 2014
@@ -48,7 +48,7 @@ public class Count extends FsCommand {
   "Count the number of directories, files and bytes under the paths\n" +
   "that match the specified file pattern.  The output columns are:\n" +
   "DIR_COUNT FILE_COUNT CONTENT_SIZE FILE_NAME or\n" +
-  "QUOTA REMAINING_QUATA SPACE_QUOTA REMAINING_SPACE_QUOTA \n" +
+  "QUOTA REMAINING_QUOTA SPACE_QUOTA REMAINING_SPACE_QUOTA \n" +
   "  DIR_COUNT FILE_COUNT CONTENT_SIZE FILE_NAME";
   
   private boolean showQuotas;

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/resources/testConf.xml
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/resources/testConf.xml?rev=1573779&r1=1573778&r2=1573779&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/resources/testConf.xml
 (original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/resources/testConf.xml
 Mon Mar  3 22:12:20 2014
@@ -234,7 +234,7 @@
         </comparator>
         <comparator>
           <type>RegexpComparator</type>
-          <expected-output>^( |\t)*QUOTA REMAINING_QUATA SPACE_QUOTA REMAINING_SPACE_QUOTA( )*</expected-output>
+          <expected-output>^( |\t)*QUOTA REMAINING_QUOTA SPACE_QUOTA REMAINING_SPACE_QUOTA( )*</expected-output>
         </comparator>
         <comparator>
           <type>RegexpComparator</type>




svn commit: r1573776 - in /hadoop/common/trunk/hadoop-common-project/hadoop-common: CHANGES.txt src/main/java/org/apache/hadoop/fs/shell/Count.java src/test/resources/testConf.xml

2014-03-03 Thread suresh
Author: suresh
Date: Mon Mar  3 22:08:49 2014
New Revision: 1573776

URL: http://svn.apache.org/r1573776
Log:
HADOOP-10378. Typo in help printed by hdfs dfs -help. Contributed by Mit Desai.

Modified:
hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/shell/Count.java

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/resources/testConf.xml

Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1573776&r1=1573775&r2=1573776&view=diff
==
--- hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt 
(original)
+++ hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt Mon Mar 
 3 22:08:49 2014
@@ -315,6 +315,9 @@ Release 2.5.0 - UNRELEASED
 
   BUG FIXES 
 
+HADOOP-10378. Typo in help printed by hdfs dfs -help.
+(Mit Desai via suresh)
+
 Release 2.4.0 - UNRELEASED
 
   INCOMPATIBLE CHANGES

Modified: 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/shell/Count.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/shell/Count.java?rev=1573776&r1=1573775&r2=1573776&view=diff
==
--- 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/shell/Count.java
 (original)
+++ 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/shell/Count.java
 Mon Mar  3 22:08:49 2014
@@ -48,7 +48,7 @@ public class Count extends FsCommand {
   "Count the number of directories, files and bytes under the paths\n" +
   "that match the specified file pattern.  The output columns are:\n" +
   "DIR_COUNT FILE_COUNT CONTENT_SIZE FILE_NAME or\n" +
-  "QUOTA REMAINING_QUATA SPACE_QUOTA REMAINING_SPACE_QUOTA \n" +
+  "QUOTA REMAINING_QUOTA SPACE_QUOTA REMAINING_SPACE_QUOTA \n" +
   "  DIR_COUNT FILE_COUNT CONTENT_SIZE FILE_NAME";
   
   private boolean showQuotas;

Modified: 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/resources/testConf.xml
URL: 
http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/resources/testConf.xml?rev=1573776&r1=1573775&r2=1573776&view=diff
==
--- 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/resources/testConf.xml
 (original)
+++ 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/resources/testConf.xml
 Mon Mar  3 22:08:49 2014
@@ -234,7 +234,7 @@
         </comparator>
         <comparator>
           <type>RegexpComparator</type>
-          <expected-output>^( |\t)*QUOTA REMAINING_QUATA SPACE_QUOTA REMAINING_SPACE_QUOTA( )*</expected-output>
+          <expected-output>^( |\t)*QUOTA REMAINING_QUOTA SPACE_QUOTA REMAINING_SPACE_QUOTA( )*</expected-output>
         </comparator>
         <comparator>
           <type>RegexpComparator</type>




svn commit: r1568166 - in /hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common: CHANGES.txt src/main/java/org/apache/hadoop/security/LdapGroupsMapping.java

2014-02-13 Thread suresh
Author: suresh
Date: Thu Feb 13 23:47:58 2014
New Revision: 1568166

URL: http://svn.apache.org/r1568166
Log:
HADOOP-10249. Merge 1568164 from trunk.

Modified:

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/LdapGroupsMapping.java

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1568166&r1=1568165&r2=1568166&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt 
(original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt 
Thu Feb 13 23:47:58 2014
@@ -41,6 +41,9 @@ Release 2.4.0 - UNRELEASED
 HADOOP-10338. Cannot get the FileStatus of the root inode from the new
 Globber (cmccabe)
 
+HADOOP-10249. LdapGroupsMapping should trim ldap password read from file.
+(Dilli Armugam via suresh)
+
 Release 2.3.1 - UNRELEASED
 
   INCOMPATIBLE CHANGES

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/LdapGroupsMapping.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/LdapGroupsMapping.java?rev=1568166&r1=1568165&r2=1568166&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/LdapGroupsMapping.java
 (original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/LdapGroupsMapping.java
 Thu Feb 13 23:47:58 2014
@@ -356,7 +356,7 @@ public class LdapGroupsMapping
 c = reader.read();
   }
   reader.close();
-  return password.toString();
+  return password.toString().trim();
 } catch (IOException ioe) {
   throw new RuntimeException("Could not read password file: " + pwFile, 
ioe);
 }
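
The trim in the diff above matters because text editors typically append a final newline when saving a password file; read character-by-character, that newline becomes part of the password and the LDAP bind fails. Below is a minimal, self-contained sketch of the read loop with the HADOOP-10249 fix applied — `readPassword` and the demo strings are illustrative, not Hadoop's actual API surface:

```java
import java.io.IOException;
import java.io.Reader;
import java.io.StringReader;

public class PasswordFileDemo {
    // Mirrors the character-by-character read loop in LdapGroupsMapping.
    // Without the trim(), a trailing newline from the editor is kept in
    // the password and authentication fails with invalid credentials.
    static String readPassword(Reader reader) throws IOException {
        StringBuilder password = new StringBuilder();
        int c = reader.read();
        while (c > -1) {
            password.append((char) c);
            c = reader.read();
        }
        // HADOOP-10249: strip stray leading/trailing whitespace
        return password.toString().trim();
    }

    public static void main(String[] args) throws IOException {
        // Typical file contents saved with a final newline
        String fileContents = "s3cr3t\n";
        String pw = readPassword(new StringReader(fileContents));
        System.out.println("[" + pw + "]");  // prints [s3cr3t]
    }
}
```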




svn commit: r1568164 - in /hadoop/common/trunk/hadoop-common-project/hadoop-common: CHANGES.txt src/main/java/org/apache/hadoop/security/LdapGroupsMapping.java

2014-02-13 Thread suresh
Author: suresh
Date: Thu Feb 13 23:46:16 2014
New Revision: 1568164

URL: http://svn.apache.org/r1568164
Log:
HADOOP-10249. LdapGroupsMapping should trim ldap password read from file. 
Contributed by Dilli Armugam.

Modified:
hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/LdapGroupsMapping.java

Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1568164&r1=1568163&r2=1568164&view=diff
==
--- hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt 
(original)
+++ hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt Thu Feb 
13 23:46:16 2014
@@ -339,6 +339,9 @@ Release 2.4.0 - UNRELEASED
 HADOOP-10338. Cannot get the FileStatus of the root inode from the new
 Globber (cmccabe)
 
+HADOOP-10249. LdapGroupsMapping should trim ldap password read from file.
+(Dilli Armugam via suresh)
+
 Release 2.3.1 - UNRELEASED
 
   INCOMPATIBLE CHANGES

Modified: 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/LdapGroupsMapping.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/LdapGroupsMapping.java?rev=1568164&r1=1568163&r2=1568164&view=diff
==
--- 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/LdapGroupsMapping.java
 (original)
+++ 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/LdapGroupsMapping.java
 Thu Feb 13 23:46:16 2014
@@ -356,7 +356,7 @@ public class LdapGroupsMapping
 c = reader.read();
   }
   reader.close();
-  return password.toString();
+  return password.toString().trim();
 } catch (IOException ioe) {
   throw new RuntimeException("Could not read password file: " + pwFile, 
ioe);
 }




svn commit: r1566709 - in /hadoop/common/trunk/hadoop-common-project/hadoop-common: CHANGES.txt src/main/java/overview.html

2014-02-10 Thread suresh
Author: suresh
Date: Mon Feb 10 19:34:54 2014
New Revision: 1566709

URL: http://svn.apache.org/r1566709
Log:
HADOOP-10333. Fix grammatical error in overview.html document. Contributed by 
René Nyffenegger.

Modified:
hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/overview.html

Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1566709&r1=1566708&r2=1566709&view=diff
==
--- hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt 
(original)
+++ hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt Mon Feb 
10 19:34:54 2014
@@ -312,6 +312,9 @@ Release 2.4.0 - UNRELEASED
 HADOOP-10295. Allow distcp to automatically identify the checksum type of 
 source files and use it for the target. (jing9 and Laurent Goujon)
 
+HADOOP-10333. Fix grammatical error in overview.html document.
+(René Nyffenegger via suresh)
+
   OPTIMIZATIONS
 
   BUG FIXES

Modified: 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/overview.html
URL: 
http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/overview.html?rev=1566709&r1=1566708&r2=1566709&view=diff
==
--- 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/overview.html
 (original)
+++ 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/overview.html
 Mon Feb 10 19:34:54 2014
@@ -57,7 +57,7 @@ that process vast amounts of data. Here'
 
 
   
-Hadoop was been demonstrated on GNU/Linux clusters with 2000 nodes.
+Hadoop has been demonstrated on GNU/Linux clusters with more than 4000 
nodes.
   
   
 Windows is also a supported platform.




svn commit: r1566710 - in /hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common: CHANGES.txt src/main/java/overview.html

2014-02-10 Thread suresh
Author: suresh
Date: Mon Feb 10 19:35:46 2014
New Revision: 1566710

URL: http://svn.apache.org/r1566710
Log:
HADOOP-10333. Merge 1566709 from trunk.

Modified:

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/overview.html

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1566710&r1=1566709&r2=1566710&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt 
(original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt 
Mon Feb 10 19:35:46 2014
@@ -14,6 +14,9 @@ Release 2.4.0 - UNRELEASED
 HADOOP-10295. Allow distcp to automatically identify the checksum type of 
 source files and use it for the target. (jing9 and Laurent Goujon)
 
+HADOOP-10333. Fix grammatical error in overview.html document.
+(René Nyffenegger via suresh)
+
   OPTIMIZATIONS
 
   BUG FIXES

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/overview.html
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/overview.html?rev=1566710&r1=1566709&r2=1566710&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/overview.html
 (original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/overview.html
 Mon Feb 10 19:35:46 2014
@@ -57,7 +57,7 @@ that process vast amounts of data. Here'
 
 
   
-Hadoop was been demonstrated on GNU/Linux clusters with 2000 nodes.
+Hadoop has been demonstrated on GNU/Linux clusters with more than 4000 
nodes.
   
   
 Windows is also a supported platform.




svn commit: r1561967 - in /hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common: CHANGES.txt src/main/java/org/apache/hadoop/http/HttpServer.java

2014-01-27 Thread suresh
Author: suresh
Date: Tue Jan 28 07:50:54 2014
New Revision: 1561967

URL: http://svn.apache.org/r1561967
Log:
HADOOP-10292. Restore HttpServer from branch-2.2 in branch-2. Contributed by 
Haohui Mai.

Added:

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/http/HttpServer.java
Modified:

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1561967&r1=1561966&r2=1561967&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt 
(original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt 
Tue Jan 28 07:50:54 2014
@@ -371,6 +371,9 @@ Release 2.3.0 - UNRELEASED
 HADOOP-10255. Rename HttpServer to HttpServer2 to retain older 
 HttpServer in branch-2 for compatibility. (Haohui Mai via suresh)
 
+HADOOP-10292. Restore HttpServer from branch-2.2 in branch-2.
+(Haohui Mai via suresh)
+
 Release 2.2.0 - 2013-10-13
 
   INCOMPATIBLE CHANGES

Added: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/http/HttpServer.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/http/HttpServer.java?rev=1561967&view=auto
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/http/HttpServer.java
 (added)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/http/HttpServer.java
 Tue Jan 28 07:50:54 2014
@@ -0,0 +1,1103 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.hadoop.http;
+
+import java.io.FileNotFoundException;
+import java.io.IOException;
+import java.io.PrintWriter;
+import java.net.BindException;
+import java.net.InetSocketAddress;
+import java.net.URL;
+import java.security.GeneralSecurityException;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.Enumeration;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+
+import javax.net.ssl.SSLServerSocketFactory;
+import javax.servlet.Filter;
+import javax.servlet.FilterChain;
+import javax.servlet.FilterConfig;
+import javax.servlet.ServletContext;
+import javax.servlet.ServletException;
+import javax.servlet.ServletRequest;
+import javax.servlet.ServletResponse;
+import javax.servlet.http.HttpServlet;
+import javax.servlet.http.HttpServletRequest;
+import javax.servlet.http.HttpServletRequestWrapper;
+import javax.servlet.http.HttpServletResponse;
+
+import org.apache.commons.logging.Log;
+import org.apache.commons.logging.LogFactory;
+import org.apache.hadoop.classification.InterfaceAudience;
+import org.apache.hadoop.classification.InterfaceStability;
+import org.apache.hadoop.conf.ConfServlet;
+import org.apache.hadoop.conf.Configuration;
+import org.apache.hadoop.fs.CommonConfigurationKeys;
+import org.apache.hadoop.jmx.JMXJsonServlet;
+import org.apache.hadoop.log.LogLevel;
+import org.apache.hadoop.metrics.MetricsServlet;
+import org.apache.hadoop.security.SecurityUtil;
+import org.apache.hadoop.security.UserGroupInformation;
+import org.apache.hadoop.security.authentication.server.AuthenticationFilter;
+import org.apache.hadoop.security.authorize.AccessControlList;
+import org.apache.hadoop.security.ssl.SSLFactory;
+import org.apache.hadoop.util.ReflectionUtils;
+import org.apache.hadoop.util.Shell;
+import org.mortbay.io.Buffer;
+import org.mortbay.jetty.Connector;
+import org.mortbay.jetty.Handler;
+import org.mortbay.jetty.MimeTypes;
+import org.mortbay.jetty.Server;
+import org.mortbay.jetty.handler.ContextHandler;
+import org.mortbay.jetty.handler.ContextHandlerCollection;
+import org.mortbay.jetty

svn commit: r1561966 - in /hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common: ./ src/main/java/org/apache/hadoop/conf/ src/main/java/org/apache/hadoop/http/ src/main/java/org/apache/

2014-01-27 Thread suresh
Author: suresh
Date: Tue Jan 28 07:47:37 2014
New Revision: 1561966

URL: http://svn.apache.org/r1561966
Log:
HADOOP-10255. Merge 1561959 and 1561961 from trunk.

Added:

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/http/HttpServer2.java
  - copied unchanged from r1561959, 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/http/HttpServer2.java
Removed:

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/http/HttpServer.java
Modified:

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/conf/ConfServlet.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/http/AdminAuthorizedServlet.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/jmx/JMXJsonServlet.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/log/LogLevel.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics/MetricsServlet.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/AuthenticationFilterInitializer.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/http/HttpServerFunctionalTest.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/http/TestGlobalFilter.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/http/TestHtmlQuoting.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/http/TestHttpServer.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/http/TestHttpServerLifecycle.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/http/TestHttpServerWebapps.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/http/TestPathFilter.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/http/TestSSLHttpServer.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/http/TestServletFilter.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/jmx/TestJMXJsonServlet.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/log/TestLogLevel.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/security/TestAuthenticationFilter.java

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1561966&r1=1561965&r2=1561966&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt 
(original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt 
Tue Jan 28 07:47:37 2014
@@ -368,6 +368,9 @@ Release 2.3.0 - UNRELEASED
 HADOOP-9830. Fix typo at http://hadoop.apache.org/docs/current/
 (Kousuke Saruta via Arpit Agarwal)
 
+HADOOP-10255. Rename HttpServer to HttpServer2 to retain older 
+HttpServer in branch-2 for compatibility. (Haohui Mai via suresh)
+
 Release 2.2.0 - 2013-10-13
 
   INCOMPATIBLE CHANGES

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/conf/ConfServlet.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/conf/ConfServlet.java?rev=1561966&r1=1561965&r2=1561966&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/conf/ConfServlet.java
 (original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/conf/ConfServlet.java
 Tue Jan 28 07:47:37 2014
@@ -27,7 +27,7 @@ import javax.servlet.http.HttpServletRes
 
 import org.apache.hadoop.classification.InterfaceAudience;
 import org.apache.hadoop.classification.InterfaceStability;
-import org.apache.hadoop.http.HttpServer;
+import org.apache.hadoop.http.HttpServer2;
 
 /**
  * A 

svn commit: r1561961 - /hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt

2014-01-27 Thread suresh
Author: suresh
Date: Tue Jan 28 07:34:03 2014
New Revision: 1561961

URL: http://svn.apache.org/r1561961
Log:
HADOOP-10255. Adding missed CHANGES.txt from change 1561959.

Modified:
hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt

Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1561961&r1=1561960&r2=1561961&view=diff
==
--- hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt 
(original)
+++ hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt Tue Jan 
28 07:34:03 2014
@@ -661,8 +661,8 @@ Release 2.3.0 - UNRELEASED
 HADOOP-9830. Fix typo at http://hadoop.apache.org/docs/current/
 (Kousuke Saruta via Arpit Agarwal)
 
-HADOOP-10255. Rename HttpServer into HttpServer2.
-(Haohui Mai via suresh)
+HADOOP-10255. Rename HttpServer to HttpServer2 to retain older 
+HttpServer in branch-2 for compatibility. (Haohui Mai via suresh)
 
 Release 2.2.0 - 2013-10-13
 




svn commit: r1561959 [2/2] - in /hadoop/common/trunk/hadoop-common-project/hadoop-common: ./ src/main/java/org/apache/hadoop/conf/ src/main/java/org/apache/hadoop/http/ src/main/java/org/apache/hadoop

2014-01-27 Thread suresh
Modified: 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/log/TestLogLevel.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/log/TestLogLevel.java?rev=1561959&r1=1561958&r2=1561959&view=diff
==
--- 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/log/TestLogLevel.java
 (original)
+++ 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/log/TestLogLevel.java
 Tue Jan 28 07:32:52 2014
@@ -20,7 +20,7 @@ package org.apache.hadoop.log;
 import java.io.*;
 import java.net.*;
 
-import org.apache.hadoop.http.HttpServer;
+import org.apache.hadoop.http.HttpServer2;
 import org.apache.hadoop.net.NetUtils;
 
 import junit.framework.TestCase;
@@ -44,7 +44,7 @@ public class TestLogLevel extends TestCa
   log.error("log.error1");
   assertTrue(!Level.ERROR.equals(log.getEffectiveLevel()));
 
-  HttpServer server = new HttpServer.Builder().setName("..")
+  HttpServer2 server = new HttpServer2.Builder().setName("..")
   .addEndpoint(new URI("http://localhost:0";)).setFindPort(true)
   .build();
   

Modified: 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/security/TestAuthenticationFilter.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/security/TestAuthenticationFilter.java?rev=1561959&r1=1561958&r2=1561959&view=diff
==
--- 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/security/TestAuthenticationFilter.java
 (original)
+++ 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/security/TestAuthenticationFilter.java
 Tue Jan 28 07:32:52 2014
@@ -18,7 +18,7 @@ package org.apache.hadoop.security;
 
 
 import junit.framework.TestCase;
-import org.apache.hadoop.http.HttpServer;
+import org.apache.hadoop.http.HttpServer2;
 import org.apache.hadoop.security.authentication.server.AuthenticationFilter;
 import org.apache.hadoop.conf.Configuration;
 import org.apache.hadoop.http.FilterContainer;
@@ -49,7 +49,7 @@ public class TestAuthenticationFilter ex
  AuthenticationFilterInitializer.SIGNATURE_SECRET_FILE, 
  secretFile.getAbsolutePath());
 
-conf.set(HttpServer.BIND_ADDRESS, "barhost");
+conf.set(HttpServer2.BIND_ADDRESS, "barhost");
 
 FilterContainer container = Mockito.mock(FilterContainer.class);
 Mockito.doAnswer(




svn commit: r1558500 - in /hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common: CHANGES.txt src/main/java/org/apache/hadoop/ipc/Client.java

2014-01-15 Thread suresh
Author: suresh
Date: Wed Jan 15 18:32:40 2014
New Revision: 1558500

URL: http://svn.apache.org/r1558500
Log:
HADOOP-10236. Merge change 1558498 from trunk.

Modified:

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1558500&r1=1558499&r2=1558500&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt 
(original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt 
Wed Jan 15 18:32:40 2014
@@ -224,6 +224,9 @@ Release 2.4.0 - UNRELEASED
 HADOOP-10223. MiniKdc#main() should close the FileReader it creates. 
 (Ted Yu via tucu)
 
+HADOOP-10236. Fix typo in o.a.h.ipc.Client#checkResponse. (Akira Ajisaka
+    via suresh)
+
 Release 2.3.0 - UNRELEASED
 
   INCOMPATIBLE CHANGES

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java?rev=1558500&r1=1558499&r2=1558500&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
 (original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
 Wed Jan 15 18:32:40 2014
@@ -286,7 +286,7 @@ public class Client {
   if (!Arrays.equals(id, RpcConstants.DUMMY_CLIENT_ID)) {
 if (!Arrays.equals(id, clientId)) {
   throw new IOException("Client IDs not matched: local ID="
-  + StringUtils.byteToHexString(clientId) + ", ID in reponse="
+  + StringUtils.byteToHexString(clientId) + ", ID in response="
   + 
StringUtils.byteToHexString(header.getClientId().toByteArray()));
 }
   }




svn commit: r1558498 - in /hadoop/common/trunk/hadoop-common-project/hadoop-common: CHANGES.txt src/main/java/org/apache/hadoop/ipc/Client.java

2014-01-15 Thread suresh
Author: suresh
Date: Wed Jan 15 18:27:59 2014
New Revision: 1558498

URL: http://svn.apache.org/r1558498
Log:
HADOOP-10236. Fix typo in o.a.h.ipc.Client#checkResponse. Contributed by Akira 
Ajisaka.

Modified:
hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java

Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1558498&r1=1558497&r2=1558498&view=diff
==
--- hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt 
(original)
+++ hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt Wed Jan 
15 18:27:59 2014
@@ -517,6 +517,9 @@ Release 2.4.0 - UNRELEASED
 HADOOP-10223. MiniKdc#main() should close the FileReader it creates. 
 (Ted Yu via tucu)
 
+HADOOP-10236. Fix typo in o.a.h.ipc.Client#checkResponse. (Akira Ajisaka
+    via suresh)
+
 Release 2.3.0 - UNRELEASED
 
   INCOMPATIBLE CHANGES

Modified: 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java?rev=1558498&r1=1558497&r2=1558498&view=diff
==
--- 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
 (original)
+++ 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
 Wed Jan 15 18:27:59 2014
@@ -286,7 +286,7 @@ public class Client {
   if (!Arrays.equals(id, RpcConstants.DUMMY_CLIENT_ID)) {
 if (!Arrays.equals(id, clientId)) {
   throw new IOException("Client IDs not matched: local ID="
-  + StringUtils.byteToHexString(clientId) + ", ID in reponse="
+  + StringUtils.byteToHexString(clientId) + ", ID in response="
   + 
StringUtils.byteToHexString(header.getClientId().toByteArray()));
 }
   }
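
The corrected error message above belongs to the RPC client's sanity check: a response carrying a non-dummy client ID must match the caller's own ID, otherwise the response was routed to the wrong client. A minimal sketch of that check, with a hypothetical `byteToHexString` standing in for Hadoop's `StringUtils` helper and an empty array standing in for `RpcConstants.DUMMY_CLIENT_ID`:

```java
import java.io.IOException;
import java.util.Arrays;

public class ClientIdCheckDemo {
    // Stand-in for RpcConstants.DUMMY_CLIENT_ID (assumed empty here)
    static final byte[] DUMMY_CLIENT_ID = new byte[0];

    // Stand-in for o.a.h.util.StringUtils.byteToHexString
    static String byteToHexString(byte[] bytes) {
        StringBuilder sb = new StringBuilder();
        for (byte b : bytes) sb.append(String.format("%02x", b));
        return sb.toString();
    }

    // Sketch of the check in o.a.h.ipc.Client#checkResponse: a non-dummy
    // ID in the response must equal the local client's ID.
    static void checkResponse(byte[] id, byte[] clientId) throws IOException {
        if (!Arrays.equals(id, DUMMY_CLIENT_ID) && !Arrays.equals(id, clientId)) {
            throw new IOException("Client IDs not matched: local ID="
                + byteToHexString(clientId) + ", ID in response="
                + byteToHexString(id));
        }
    }

    public static void main(String[] args) throws IOException {
        checkResponse(new byte[]{1}, new byte[]{1});  // matching IDs: no exception
        try {
            checkResponse(new byte[]{1}, new byte[]{2});
        } catch (IOException e) {
            System.out.println(e.getMessage());
        }
    }
}
```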




svn commit: r1551648 - in /hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common: CHANGES.txt src/main/java/org/apache/hadoop/util/ReflectionUtils.java

2013-12-17 Thread suresh
Author: suresh
Date: Tue Dec 17 18:24:38 2013
New Revision: 1551648

URL: http://svn.apache.org/r1551648
Log:
HADOOP-10168. Merge 1551646 from trunk.

Modified:

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/ReflectionUtils.java

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1551648&r1=1551647&r2=1551648&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt 
(original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt 
Tue Dec 17 18:24:38 2013
@@ -105,12 +105,14 @@ Release 2.4.0 - UNRELEASED
 
 HADOOP-10102. Update commons IO from 2.1 to 2.4 (Akira Ajisaka via stevel)
 
+HADOOP-10168. fix javadoc of ReflectionUtils#copy. (Thejas Nair via suresh)
+
   OPTIMIZATIONS
 
 HADOOP-9748. Reduce blocking on UGI.ensureInitialized (daryn)
 
-   HADOOP-10047. Add a direct-buffer based apis for compression. (Gopal V
-   via acmurthy)
+HADOOP-10047. Add a direct-buffer based apis for compression. (Gopal V
+via acmurthy)
 
   BUG FIXES
 

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/ReflectionUtils.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/ReflectionUtils.java?rev=1551648&r1=1551647&r2=1551648&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/ReflectionUtils.java
 (original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/ReflectionUtils.java
 Tue Dec 17 18:24:38 2013
@@ -275,8 +275,9 @@ public class ReflectionUtils {
   
   /**
* Make a copy of the writable object using serialization to a buffer
-   * @param dst the object to copy from
-   * @param src the object to copy into, which is destroyed
+   * @param src the object to copy from
+   * @param dst the object to copy into, which is destroyed
+   * @return dst param (the copy)
* @throws IOException
*/
   @SuppressWarnings("unchecked")




svn commit: r1551646 - in /hadoop/common/trunk/hadoop-common-project/hadoop-common: CHANGES.txt src/main/java/org/apache/hadoop/util/ReflectionUtils.java

2013-12-17 Thread suresh
Author: suresh
Date: Tue Dec 17 18:18:15 2013
New Revision: 1551646

URL: http://svn.apache.org/r1551646
Log:
HADOOP-10168. fix javadoc of ReflectionUtils#copy. Contributed by Thejas Nair.

Modified:
hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/ReflectionUtils.java

Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1551646&r1=1551645&r2=1551646&view=diff
==
--- hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt 
(original)
+++ hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt Tue Dec 
17 18:18:15 2013
@@ -397,12 +397,14 @@ Release 2.4.0 - UNRELEASED
 
 HADOOP-10102. Update commons IO from 2.1 to 2.4 (Akira Ajisaka via stevel)
 
+HADOOP-10168. fix javadoc of ReflectionUtils#copy. (Thejas Nair via suresh)
+
   OPTIMIZATIONS
 
 HADOOP-9748. Reduce blocking on UGI.ensureInitialized (daryn)
 
-   HADOOP-10047. Add a direct-buffer based apis for compression. (Gopal V
-   via acmurthy)
+HADOOP-10047. Add a direct-buffer based apis for compression. (Gopal V
+via acmurthy)
 
   BUG FIXES
 

Modified: 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/ReflectionUtils.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/ReflectionUtils.java?rev=1551646&r1=1551645&r2=1551646&view=diff
==
--- 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/ReflectionUtils.java
 (original)
+++ 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/ReflectionUtils.java
 Tue Dec 17 18:18:15 2013
@@ -275,8 +275,9 @@ public class ReflectionUtils {
   
   /**
* Make a copy of the writable object using serialization to a buffer
-   * @param dst the object to copy from
-   * @param src the object to copy into, which is destroyed
+   * @param src the object to copy from
+   * @param dst the object to copy into, which is destroyed
+   * @return dst param (the copy)
* @throws IOException
*/
   @SuppressWarnings("unchecked")
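
The javadoc fix above swaps the `src`/`dst` descriptions: the method serializes `src` into a buffer and deserializes that buffer into `dst`, destroying `dst`'s prior state. A minimal, self-contained sketch of copy-via-serialization semantics — note it uses plain Java serialization, not Hadoop's Writable-based implementation:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.util.Arrays;

public class CopyDemo {
    // Copy src by writing it to an in-memory buffer and reading it back,
    // illustrating the semantics ReflectionUtils#copy documents.
    @SuppressWarnings("unchecked")
    static <T extends Serializable> T copy(T src)
            throws IOException, ClassNotFoundException {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(buf)) {
            out.writeObject(src);
        }
        try (ObjectInputStream in = new ObjectInputStream(
                new ByteArrayInputStream(buf.toByteArray()))) {
            return (T) in.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        int[] src = {1, 2, 3};
        int[] dst = copy(src);
        dst[0] = 99;                    // mutating the copy...
        System.out.println(src[0]);     // ...leaves the source intact: prints 1
        System.out.println(Arrays.toString(dst));
    }
}
```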




svn commit: r1545379 - in /hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common: CHANGES.txt src/main/java/org/apache/hadoop/util/LightWeightGSet.java

2013-11-25 Thread suresh
Author: suresh
Date: Mon Nov 25 19:46:46 2013
New Revision: 1545379

URL: http://svn.apache.org/r1545379
Log:
HADOOP-10126. Merge change 1545376 from trunk.

Modified:

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/LightWeightGSet.java

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1545379&r1=1545378&r2=1545379&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt 
(original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt 
Mon Nov 25 19:46:46 2013
@@ -98,6 +98,8 @@ Release 2.3.0 - UNRELEASED
 HADOOP-10111. Allow DU to be initialized with an initial value (Kihwal Lee
 via jeagles)
 
+HADOOP-10126. LightWeightGSet log message is confusing. (Vinay via suresh)
+
   OPTIMIZATIONS
 
 HADOOP-9748. Reduce blocking on UGI.ensureInitialized (daryn)

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/LightWeightGSet.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/LightWeightGSet.java?rev=1545379&r1=1545378&r2=1545379&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/LightWeightGSet.java
 (original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/LightWeightGSet.java
 Mon Nov 25 19:46:46 2013
@@ -327,8 +327,11 @@ public class LightWeightGSet

svn commit: r1545376 - in /hadoop/common/trunk/hadoop-common-project/hadoop-common: CHANGES.txt src/main/java/org/apache/hadoop/util/LightWeightGSet.java

2013-11-25 Thread suresh
Author: suresh
Date: Mon Nov 25 19:42:19 2013
New Revision: 1545376

URL: http://svn.apache.org/r1545376
Log:
HADOOP-10126. LightWeightGSet log message is confusing. Contributed by Vinay.

Modified:
hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/LightWeightGSet.java

Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1545376&r1=1545375&r2=1545376&view=diff
==
--- hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt 
(original)
+++ hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt Mon Nov 
25 19:42:19 2013
@@ -388,6 +388,8 @@ Release 2.3.0 - UNRELEASED
 HADOOP-10111. Allow DU to be initialized with an initial value (Kihwal Lee
 via jeagles)
 
+HADOOP-10126. LightWeightGSet log message is confusing. (Vinay via suresh)
+
   OPTIMIZATIONS
 
 HADOOP-9748. Reduce blocking on UGI.ensureInitialized (daryn)

Modified: 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/LightWeightGSet.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/LightWeightGSet.java?rev=1545376&r1=1545375&r2=1545376&view=diff
==
--- 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/LightWeightGSet.java
 (original)
+++ 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/LightWeightGSet.java
 Mon Nov 25 19:42:19 2013
@@ -348,8 +348,11 @@ public class LightWeightGSet

svn commit: r1533246 - in /hadoop/common/branches/branch-1: CHANGES.txt src/hdfs/org/apache/hadoop/hdfs/server/namenode/FSImage.java

2013-10-17 Thread suresh
Author: suresh
Date: Thu Oct 17 20:26:49 2013
New Revision: 1533246

URL: http://svn.apache.org/r1533246
Log:
HDFS-5367. Restoring namenode storage locks namenode due to unnecessary fsimage 
write. Contributed by John Zhao.

Modified:
hadoop/common/branches/branch-1/CHANGES.txt

hadoop/common/branches/branch-1/src/hdfs/org/apache/hadoop/hdfs/server/namenode/FSImage.java

Modified: hadoop/common/branches/branch-1/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-1/CHANGES.txt?rev=1533246&r1=1533245&r2=1533246&view=diff
==
--- hadoop/common/branches/branch-1/CHANGES.txt (original)
+++ hadoop/common/branches/branch-1/CHANGES.txt Thu Oct 17 20:26:49 2013
@@ -43,6 +43,9 @@ Release 1.3.0 - unreleased
 HDFS-5245. shouldRetry() in WebHDFSFileSystem generates excessive warnings.
 (Haohui Mai via jing9)
 
+HDFS-5367. Restoring namenode storage locks namenode due to unnecessary
+fsimage write. (John Zhao via suresh)
+
   BUG FIXES
 
 HADOOP-9863. Backport HADOOP-8686 to support BigEndian on ppc64. 

Modified: 
hadoop/common/branches/branch-1/src/hdfs/org/apache/hadoop/hdfs/server/namenode/FSImage.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-1/src/hdfs/org/apache/hadoop/hdfs/server/namenode/FSImage.java?rev=1533246&r1=1533245&r2=1533246&view=diff
==
--- 
hadoop/common/branches/branch-1/src/hdfs/org/apache/hadoop/hdfs/server/namenode/FSImage.java
 (original)
+++ 
hadoop/common/branches/branch-1/src/hdfs/org/apache/hadoop/hdfs/server/namenode/FSImage.java
 Thu Oct 17 20:26:49 2013
@@ -1308,12 +1308,9 @@ public class FSImage extends Storage {
 
 if (sd.getStorageDirType().equals(NameNodeDirType.EDITS)) {
   restoreFile(goodEdits, sd.getCurrentDir(), 
NameNodeFile.EDITS.getName());
-} else if (sd.getStorageDirType().equals(NameNodeDirType.IMAGE)) {
-  restoreFile(goodImage, sd.getCurrentDir(), 
NameNodeFile.IMAGE.getName());
 } else if (sd.getStorageDirType().equals(
 NameNodeDirType.IMAGE_AND_EDITS)) {
-  restoreFile(goodEdits, sd.getCurrentDir(), 
NameNodeFile.EDITS.getName());
-  restoreFile(goodImage, sd.getCurrentDir(), 
NameNodeFile.IMAGE.getName());
+  restoreFile(goodEdits, sd.getCurrentDir(), 
NameNodeFile.EDITS.getName());  
 } else {
   throw new IOException("Invalid NameNodeDirType: "
   + sd.getStorageDirType());




svn commit: r1532912 - /hadoop/common/branches/branch-2.2/hadoop-common-project/hadoop-common/CHANGES.txt

2013-10-16 Thread suresh
Author: suresh
Date: Wed Oct 16 21:11:03 2013
New Revision: 1532912

URL: http://svn.apache.org/r1532912
Log:
HADOOP-10005. Merge 1532908 from branch-2

Modified:

hadoop/common/branches/branch-2.2/hadoop-common-project/hadoop-common/CHANGES.txt

Modified: 
hadoop/common/branches/branch-2.2/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2.2/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1532912&r1=1532911&r2=1532912&view=diff
==
--- 
hadoop/common/branches/branch-2.2/hadoop-common-project/hadoop-common/CHANGES.txt
 (original)
+++ 
hadoop/common/branches/branch-2.2/hadoop-common-project/hadoop-common/CHANGES.txt
 Wed Oct 16 21:11:03 2013
@@ -207,6 +207,9 @@ Release 2.1.1-beta - 2013-09-23
 HADOOP-9977. Hadoop services won't start with different keypass and
 keystorepass when https is enabled. (cnauroth)
 
+HADOOP-10005. No need to check INFO severity level is enabled or not.
+(Jackie Chang via suresh)
+
 Release 2.1.0-beta - 2013-08-22
 
   INCOMPATIBLE CHANGES




svn commit: r1532912 - /hadoop/common/branches/branch-2.2/hadoop-tools/hadoop-gridmix/src/test/java/org/apache/hadoop/mapred/gridmix/DebugJobProducer.java

2013-10-16 Thread suresh
Author: suresh
Date: Wed Oct 16 21:11:03 2013
New Revision: 1532912

URL: http://svn.apache.org/r1532912
Log:
HADOOP-10005. Merge 1532908 from branch-2

Modified:

hadoop/common/branches/branch-2.2/hadoop-tools/hadoop-gridmix/src/test/java/org/apache/hadoop/mapred/gridmix/DebugJobProducer.java

Modified: 
hadoop/common/branches/branch-2.2/hadoop-tools/hadoop-gridmix/src/test/java/org/apache/hadoop/mapred/gridmix/DebugJobProducer.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2.2/hadoop-tools/hadoop-gridmix/src/test/java/org/apache/hadoop/mapred/gridmix/DebugJobProducer.java?rev=1532912&r1=1532911&r2=1532912&view=diff
==
--- 
hadoop/common/branches/branch-2.2/hadoop-tools/hadoop-gridmix/src/test/java/org/apache/hadoop/mapred/gridmix/DebugJobProducer.java
 (original)
+++ 
hadoop/common/branches/branch-2.2/hadoop-tools/hadoop-gridmix/src/test/java/org/apache/hadoop/mapred/gridmix/DebugJobProducer.java
 Wed Oct 16 21:11:03 2013
@@ -122,11 +122,9 @@ public class DebugJobProducer implements
   // Add/remove excess
   recs[0] += totalrecs - tot_recs;
   bytes[0] += totalbytes - tot_bytes;
-  if (LOG.isInfoEnabled()) {
-LOG.info(
-  "DIST: " + Arrays.toString(recs) + " " + tot_recs + "/" + totalrecs +
-" " + Arrays.toString(bytes) + " " + tot_bytes + "/" + totalbytes);
-  }
+  LOG.info(
+"DIST: " + Arrays.toString(recs) + " " + tot_recs + "/" + totalrecs +
+  " " + Arrays.toString(bytes) + " " + tot_bytes + "/" + totalbytes);
 }
 
 private static final AtomicInteger seq = new AtomicInteger(0);




svn commit: r1532908 - /hadoop/common/branches/branch-2/hadoop-tools/hadoop-gridmix/src/test/java/org/apache/hadoop/mapred/gridmix/DebugJobProducer.java

2013-10-16 Thread suresh
Author: suresh
Date: Wed Oct 16 21:05:23 2013
New Revision: 1532908

URL: http://svn.apache.org/r1532908
Log:
HADOOP-10005. Merge 1532907 from trunk

Modified:

hadoop/common/branches/branch-2/hadoop-tools/hadoop-gridmix/src/test/java/org/apache/hadoop/mapred/gridmix/DebugJobProducer.java

Modified: 
hadoop/common/branches/branch-2/hadoop-tools/hadoop-gridmix/src/test/java/org/apache/hadoop/mapred/gridmix/DebugJobProducer.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-tools/hadoop-gridmix/src/test/java/org/apache/hadoop/mapred/gridmix/DebugJobProducer.java?rev=1532908&r1=1532907&r2=1532908&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-tools/hadoop-gridmix/src/test/java/org/apache/hadoop/mapred/gridmix/DebugJobProducer.java
 (original)
+++ 
hadoop/common/branches/branch-2/hadoop-tools/hadoop-gridmix/src/test/java/org/apache/hadoop/mapred/gridmix/DebugJobProducer.java
 Wed Oct 16 21:05:23 2013
@@ -122,11 +122,9 @@ public class DebugJobProducer implements
   // Add/remove excess
   recs[0] += totalrecs - tot_recs;
   bytes[0] += totalbytes - tot_bytes;
-  if (LOG.isInfoEnabled()) {
-LOG.info(
-  "DIST: " + Arrays.toString(recs) + " " + tot_recs + "/" + totalrecs +
-" " + Arrays.toString(bytes) + " " + tot_bytes + "/" + totalbytes);
-  }
+  LOG.info(
+"DIST: " + Arrays.toString(recs) + " " + tot_recs + "/" + totalrecs +
+  " " + Arrays.toString(bytes) + " " + tot_bytes + "/" + totalbytes);
 }
 
 private static final AtomicInteger seq = new AtomicInteger(0);




svn commit: r1532908 - /hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt

2013-10-16 Thread suresh
Author: suresh
Date: Wed Oct 16 21:05:23 2013
New Revision: 1532908

URL: http://svn.apache.org/r1532908
Log:
HADOOP-10005. Merge 1532907 from trunk

Modified:

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1532908&r1=1532907&r2=1532908&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt 
(original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt 
Wed Oct 16 21:05:23 2013
@@ -314,6 +314,9 @@ Release 2.1.1-beta - 2013-09-23
 HADOOP-9977. Hadoop services won't start with different keypass and
 keystorepass when https is enabled. (cnauroth)
 
+HADOOP-10005. No need to check INFO severity level is enabled or not.
+(Jackie Chang via suresh)
+
 Release 2.1.0-beta - 2013-08-22
 
   INCOMPATIBLE CHANGES




svn commit: r1532907 - /hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt

2013-10-16 Thread suresh
Author: suresh
Date: Wed Oct 16 21:00:07 2013
New Revision: 1532907

URL: http://svn.apache.org/r1532907
Log:
HADOOP-10005. No need to check INFO severity level is enabled or not. 
Contributed by Jackie Chang.

Modified:
hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt

Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1532907&r1=1532906&r2=1532907&view=diff
==
--- hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt 
(original)
+++ hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt Wed Oct 
16 21:00:07 2013
@@ -585,6 +585,9 @@ Release 2.1.1-beta - 2013-09-23
 HADOOP-9977. Hadoop services won't start with different keypass and
 keystorepass when https is enabled. (cnauroth)
 
+HADOOP-10005. No need to check INFO severity level is enabled or not.
+(Jackie Chang via suresh)
+
 Release 2.1.0-beta - 2013-08-22
 
   INCOMPATIBLE CHANGES




svn commit: r1532907 - /hadoop/common/trunk/hadoop-tools/hadoop-gridmix/src/test/java/org/apache/hadoop/mapred/gridmix/DebugJobProducer.java

2013-10-16 Thread suresh
Author: suresh
Date: Wed Oct 16 21:00:07 2013
New Revision: 1532907

URL: http://svn.apache.org/r1532907
Log:
HADOOP-10005. No need to check INFO severity level is enabled or not. 
Contributed by Jackie Chang.

Modified:

hadoop/common/trunk/hadoop-tools/hadoop-gridmix/src/test/java/org/apache/hadoop/mapred/gridmix/DebugJobProducer.java

Modified: 
hadoop/common/trunk/hadoop-tools/hadoop-gridmix/src/test/java/org/apache/hadoop/mapred/gridmix/DebugJobProducer.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-tools/hadoop-gridmix/src/test/java/org/apache/hadoop/mapred/gridmix/DebugJobProducer.java?rev=1532907&r1=1532906&r2=1532907&view=diff
==
--- 
hadoop/common/trunk/hadoop-tools/hadoop-gridmix/src/test/java/org/apache/hadoop/mapred/gridmix/DebugJobProducer.java
 (original)
+++ 
hadoop/common/trunk/hadoop-tools/hadoop-gridmix/src/test/java/org/apache/hadoop/mapred/gridmix/DebugJobProducer.java
 Wed Oct 16 21:00:07 2013
@@ -122,11 +122,9 @@ public class DebugJobProducer implements
   // Add/remove excess
   recs[0] += totalrecs - tot_recs;
   bytes[0] += totalbytes - tot_bytes;
-  if (LOG.isInfoEnabled()) {
-LOG.info(
-  "DIST: " + Arrays.toString(recs) + " " + tot_recs + "/" + totalrecs +
-" " + Arrays.toString(bytes) + " " + tot_bytes + "/" + totalbytes);
-  }
+  LOG.info(
+"DIST: " + Arrays.toString(recs) + " " + tot_recs + "/" + totalrecs +
+  " " + Arrays.toString(bytes) + " " + tot_bytes + "/" + totalbytes);
 }
 
 private static final AtomicInteger seq = new AtomicInteger(0);

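The HADOOP-10005 patches above drop the `LOG.isInfoEnabled()` guard around `LOG.info(...)`. The guard only pays off when building the message is expensive and the level is often disabled; when the call is rare (as in this test helper), the guard is just noise. A small sketch of the trade-off using `java.util.logging` from the JDK (the Hadoop code uses commons-logging, so this is an analogy, not the original API):

```java
import java.util.logging.Level;
import java.util.logging.Logger;

public class LogGuardDemo {
    private static final Logger LOG = Logger.getLogger("demo");

    static int buildCalls = 0;

    // Stands in for an expensive message, e.g. Arrays.toString on
    // large arrays as in DebugJobProducer.
    static String expensiveMessage() {
        buildCalls++;
        return "DIST: ...";
    }

    public static void main(String[] args) {
        LOG.setLevel(Level.WARNING);       // INFO is disabled

        // Unguarded: the argument is always evaluated even though the
        // record is then discarded -- the cost HADOOP-10005 judged
        // acceptable for a rarely-called code path.
        LOG.info(expensiveMessage());

        // Guarded: skips message construction when INFO is off.
        if (LOG.isLoggable(Level.INFO)) {
            LOG.info(expensiveMessage());
        }

        System.out.println(buildCalls);    // 1: only the unguarded call built the string
    }
}
```

Modern logging facades (e.g. SLF4J's `{}` placeholders) get the best of both: the message is only formatted when the level is enabled, with no explicit guard.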



svn commit: r1531129 - in /hadoop/common/branches/branch-2.2: ./ hadoop-project/ hadoop-project/pom.xml hadoop-project/src/site/

2013-10-10 Thread suresh
Author: suresh
Date: Thu Oct 10 22:09:37 2013
New Revision: 1531129

URL: http://svn.apache.org/r1531129
Log:
HADOOP-10029. Merging change 1531126 from branch-2.

Modified:
hadoop/common/branches/branch-2.2/   (props changed)
hadoop/common/branches/branch-2.2/hadoop-project/   (props changed)
hadoop/common/branches/branch-2.2/hadoop-project/pom.xml   (props changed)
hadoop/common/branches/branch-2.2/hadoop-project/src/site/   (props changed)

Propchange: hadoop/common/branches/branch-2.2/
--
  Merged /hadoop/common/trunk:r1531125

Propchange: hadoop/common/branches/branch-2.2/hadoop-project/
--
  Merged /hadoop/common/trunk/hadoop-project:r1531125

Propchange: hadoop/common/branches/branch-2.2/hadoop-project/pom.xml
--
  Merged /hadoop/common/trunk/hadoop-project/pom.xml:r1531125

Propchange: hadoop/common/branches/branch-2.2/hadoop-project/src/site/
--
  Merged /hadoop/common/trunk/hadoop-project/src/site:r1531125




svn commit: r1531129 - in /hadoop/common/branches/branch-2.2/hadoop-common-project: ./ hadoop-auth/ hadoop-common/ hadoop-common/src/ hadoop-common/src/main/docs/ hadoop-common/src/main/java/ hadoop-c

2013-10-10 Thread suresh
Author: suresh
Date: Thu Oct 10 22:09:37 2013
New Revision: 1531129

URL: http://svn.apache.org/r1531129
Log:
HADOOP-10029. Merging change 1531126 from branch-2.

Modified:
hadoop/common/branches/branch-2.2/hadoop-common-project/   (props changed)
hadoop/common/branches/branch-2.2/hadoop-common-project/hadoop-auth/   
(props changed)
hadoop/common/branches/branch-2.2/hadoop-common-project/hadoop-common/   
(props changed)

hadoop/common/branches/branch-2.2/hadoop-common-project/hadoop-common/CHANGES.txt
   (props changed)
hadoop/common/branches/branch-2.2/hadoop-common-project/hadoop-common/src/  
 (props changed)

hadoop/common/branches/branch-2.2/hadoop-common-project/hadoop-common/src/main/docs/
   (props changed)

hadoop/common/branches/branch-2.2/hadoop-common-project/hadoop-common/src/main/java/
   (props changed)

hadoop/common/branches/branch-2.2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/HarFileSystem.java

hadoop/common/branches/branch-2.2/hadoop-common-project/hadoop-common/src/test/core/
   (props changed)

hadoop/common/branches/branch-2.2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/fs/TestHarFileSystem.java

Propchange: hadoop/common/branches/branch-2.2/hadoop-common-project/
--
  Merged /hadoop/common/trunk/hadoop-common-project:r1531125

Propchange: hadoop/common/branches/branch-2.2/hadoop-common-project/hadoop-auth/
--
  Merged /hadoop/common/trunk/hadoop-common-project/hadoop-auth:r1531125

Propchange: 
hadoop/common/branches/branch-2.2/hadoop-common-project/hadoop-common/
--
  Merged /hadoop/common/trunk/hadoop-common-project/hadoop-common:r1531125

Propchange: 
hadoop/common/branches/branch-2.2/hadoop-common-project/hadoop-common/CHANGES.txt
--
  Merged 
/hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt:r1531125

Propchange: 
hadoop/common/branches/branch-2.2/hadoop-common-project/hadoop-common/src/
--
--- svn:mergeinfo (added)
+++ svn:mergeinfo Thu Oct 10 22:09:37 2013
@@ -0,0 +1 @@
+/hadoop/common/trunk/hadoop-common-project/hadoop-common/src:1531125

Propchange: 
hadoop/common/branches/branch-2.2/hadoop-common-project/hadoop-common/src/main/docs/
--
  Merged 
/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/docs:r1531125

Propchange: 
hadoop/common/branches/branch-2.2/hadoop-common-project/hadoop-common/src/main/java/
--
  Merged 
/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java:r1531125

Modified: 
hadoop/common/branches/branch-2.2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/HarFileSystem.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2.2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/HarFileSystem.java?rev=1531129&r1=1531128&r2=1531129&view=diff
==
--- 
hadoop/common/branches/branch-2.2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/HarFileSystem.java
 (original)
+++ 
hadoop/common/branches/branch-2.2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/HarFileSystem.java
 Thu Oct 10 22:09:37 2013
@@ -273,7 +273,17 @@ public class HarFileSystem extends FileS
   public Path getWorkingDirectory() {
 return new Path(uri.toString());
   }
-  
+
+  @Override
+  public Path getInitialWorkingDirectory() {
+return getWorkingDirectory();
+  }
+
+  @Override
+  public FsStatus getStatus(Path p) throws IOException {
+return fs.getStatus(p);
+  }
+
   /**
* Create a har specific auth 
* har-underlyingfs:port
@@ -296,9 +306,18 @@ public class HarFileSystem extends FileS
 return auth;
   }
 
+  /**
+   * Used for delegation token related functionality. Must delegate to
+   * underlying file system.
+   */
   @Override
   protected URI getCanonicalUri() {
-return fs.canonicalizeUri(getUri());
+return fs.getCanonicalUri();
+  }
+
+  @Override
+  protected URI canonicalizeUri(URI uri) {
+return fs.canonicalizeUri(uri);
   }
 
   /**
@@ -311,6 +330,16 @@ public class HarFileSystem extends FileS
 return this.uri;
   }
   
+  @Override
+  protected void checkPath(Path path) {
+fs.checkPath(path);
+  }
+
+  @Override
+  public Path resolvePath(Path p) throws IOException {
+return fs.resolvePath(p);
+  }
+
   /**
* this method returns the path 
* inside the har filesystem

svn commit: r1531126 - in /hadoop/common/branches/branch-2/hadoop-common-project: ./ hadoop-auth/ hadoop-common/ hadoop-common/src/ hadoop-common/src/main/docs/ hadoop-common/src/main/java/ hadoop-com

2013-10-10 Thread suresh
Author: suresh
Date: Thu Oct 10 21:58:41 2013
New Revision: 1531126

URL: http://svn.apache.org/r1531126
Log:
HADOOP-10029. Merging change 1531125 from trunk.

Modified:
hadoop/common/branches/branch-2/hadoop-common-project/   (props changed)
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-auth/   (props 
changed)
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/   
(props changed)

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt 
  (props changed)
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/   
(props changed)

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/docs/
   (props changed)

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/
   (props changed)

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/HarFileSystem.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/core/
   (props changed)

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/fs/TestHarFileSystem.java

Propchange: hadoop/common/branches/branch-2/hadoop-common-project/
--
  Merged /hadoop/common/trunk/hadoop-common-project:r1531125

Propchange: hadoop/common/branches/branch-2/hadoop-common-project/hadoop-auth/
--
  Merged /hadoop/common/trunk/hadoop-common-project/hadoop-auth:r1531125

Propchange: hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/
--
  Merged /hadoop/common/trunk/hadoop-common-project/hadoop-common:r1531125

Propchange: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt
--
  Merged 
/hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt:r1531125

Propchange: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/
--
  Merged /hadoop/common/trunk/hadoop-common-project/hadoop-common/src:r1531125

Propchange: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/docs/
--
  Merged 
/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/docs:r1531125

Propchange: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/
--
  Merged 
/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java:r1531125

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/HarFileSystem.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/HarFileSystem.java?rev=1531126&r1=1531125&r2=1531126&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/HarFileSystem.java
 (original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/HarFileSystem.java
 Thu Oct 10 21:58:41 2013
@@ -273,7 +273,17 @@ public class HarFileSystem extends FileS
   public Path getWorkingDirectory() {
 return new Path(uri.toString());
   }
-  
+
+  @Override
+  public Path getInitialWorkingDirectory() {
+return getWorkingDirectory();
+  }
+
+  @Override
+  public FsStatus getStatus(Path p) throws IOException {
+return fs.getStatus(p);
+  }
+
   /**
* Create a har specific auth 
* har-underlyingfs:port
@@ -296,9 +306,18 @@ public class HarFileSystem extends FileS
 return auth;
   }
 
+  /**
+   * Used for delegation token related functionality. Must delegate to
+   * underlying file system.
+   */
   @Override
   protected URI getCanonicalUri() {
-return fs.canonicalizeUri(getUri());
+return fs.getCanonicalUri();
+  }
+
+  @Override
+  protected URI canonicalizeUri(URI uri) {
+return fs.canonicalizeUri(uri);
   }
 
   /**
@@ -311,6 +330,16 @@ public class HarFileSystem extends FileS
 return this.uri;
   }
   
+  @Override
+  protected void checkPath(Path path) {
+fs.checkPath(path);
+  }
+
+  @Override
+  public Path resolvePath(Path p) throws IOException {
+return fs.resolvePath(p);
+  }
+
   /**
* this method returns the path 
* inside the har filesystem.
@@ -675,18 +704,31 @@ public class HarFileSystem extends FileS
 hstatus.getPartName()),
 hstatus.g

svn commit: r1531126 - in /hadoop/common/branches/branch-2: ./ hadoop-project/ hadoop-project/pom.xml hadoop-project/src/site/

2013-10-10 Thread suresh
Author: suresh
Date: Thu Oct 10 21:58:41 2013
New Revision: 1531126

URL: http://svn.apache.org/r1531126
Log:
HADOOP-10029. Merging change 1531125 from trunk.

Modified:
hadoop/common/branches/branch-2/   (props changed)
hadoop/common/branches/branch-2/hadoop-project/   (props changed)
hadoop/common/branches/branch-2/hadoop-project/pom.xml   (props changed)
hadoop/common/branches/branch-2/hadoop-project/src/site/   (props changed)

Propchange: hadoop/common/branches/branch-2/
--
  Merged /hadoop/common/trunk:r1531125

Propchange: hadoop/common/branches/branch-2/hadoop-project/
--
  Merged /hadoop/common/trunk/hadoop-project:r1531125

Propchange: hadoop/common/branches/branch-2/hadoop-project/pom.xml
--
  Merged /hadoop/common/trunk/hadoop-project/pom.xml:r1531125

Propchange: hadoop/common/branches/branch-2/hadoop-project/src/site/
--
  Merged /hadoop/common/trunk/hadoop-project/src/site:r1531125




svn commit: r1531125 - in /hadoop/common/trunk/hadoop-common-project/hadoop-common/src: main/java/org/apache/hadoop/fs/HarFileSystem.java test/java/org/apache/hadoop/fs/TestHarFileSystem.java

2013-10-10 Thread suresh
Author: suresh
Date: Thu Oct 10 21:55:38 2013
New Revision: 1531125

URL: http://svn.apache.org/r1531125
Log:
HADOOP-10029. Specifying har file to MR job fails in secure cluster. 
Contributed by Suresh Srinivas.

Modified:

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/HarFileSystem.java

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/fs/TestHarFileSystem.java

Modified: 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/HarFileSystem.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/HarFileSystem.java?rev=1531125&r1=1531124&r2=1531125&view=diff
==
--- 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/HarFileSystem.java
 (original)
+++ 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/HarFileSystem.java
 Thu Oct 10 21:55:38 2013
@@ -273,7 +273,17 @@ public class HarFileSystem extends FileS
   public Path getWorkingDirectory() {
 return new Path(uri.toString());
   }
-  
+
+  @Override
+  public Path getInitialWorkingDirectory() {
+return getWorkingDirectory();
+  }
+
+  @Override
+  public FsStatus getStatus(Path p) throws IOException {
+return fs.getStatus(p);
+  }
+
   /**
* Create a har specific auth 
* har-underlyingfs:port
@@ -296,9 +306,18 @@ public class HarFileSystem extends FileS
 return auth;
   }
 
+  /**
+   * Used for delegation token related functionality. Must delegate to
+   * underlying file system.
+   */
   @Override
   protected URI getCanonicalUri() {
-return fs.canonicalizeUri(getUri());
+return fs.getCanonicalUri();
+  }
+
+  @Override
+  protected URI canonicalizeUri(URI uri) {
+return fs.canonicalizeUri(uri);
   }
 
   /**
@@ -311,6 +330,16 @@ public class HarFileSystem extends FileS
 return this.uri;
   }
   
+  @Override
+  protected void checkPath(Path path) {
+fs.checkPath(path);
+  }
+
+  @Override
+  public Path resolvePath(Path p) throws IOException {
+return fs.resolvePath(p);
+  }
+
   /**
* this method returns the path 
* inside the har filesystem.
@@ -675,18 +704,31 @@ public class HarFileSystem extends FileS
 hstatus.getPartName()),
 hstatus.getStartIndex(), hstatus.getLength(), bufferSize);
   }
- 
+
+  /**
+   * Used for delegation token related functionality. Must delegate to
+   * underlying file system.
+   */
   @Override
-  public FSDataOutputStream create(Path f,
-  FsPermission permission,
-  boolean overwrite,
-  int bufferSize,
-  short replication,
-  long blockSize,
+  public FileSystem[] getChildFileSystems() {
+return new FileSystem[]{fs};
+  }
+
+  @Override
+  public FSDataOutputStream create(Path f, FsPermission permission,
+  boolean overwrite, int bufferSize, short replication, long blockSize,
   Progressable progress) throws IOException {
 throw new IOException("Har: create not allowed.");
   }
 
+  @SuppressWarnings("deprecation")
+  @Override
+  public FSDataOutputStream createNonRecursive(Path f, boolean overwrite,
+  int bufferSize, short replication, long blockSize, Progressable progress)
+  throws IOException {
+throw new IOException("Har: create not allowed.");
+  }
+
   @Override
   public FSDataOutputStream append(Path f, int bufferSize, Progressable 
progress) throws IOException {
 throw new IOException("Har: append not allowed.");
@@ -694,6 +736,7 @@ public class HarFileSystem extends FileS
 
   @Override
   public void close() throws IOException {
+super.close();
 if (fs != null) {
   try {
 fs.close();
@@ -781,11 +824,17 @@ public class HarFileSystem extends FileS
* not implemented.
*/
   @Override
-  public void copyFromLocalFile(boolean delSrc, Path src, Path dst) throws 
-IOException {
+  public void copyFromLocalFile(boolean delSrc, boolean overwrite,
+  Path src, Path dst) throws IOException {
 throw new IOException("Har: copyfromlocalfile not allowed");
   }
-  
+
+  @Override
+  public void copyFromLocalFile(boolean delSrc, boolean overwrite,
+  Path[] srcs, Path dst) throws IOException {
+throw new IOException("Har: copyfromlocalfile not allowed");
+  }
+
   /**
* copies the file in the har filesystem to a local file.
*/
@@ -822,6 +871,11 @@ public class HarFileSystem extends FileS
 throw new IOException("Har: setowner not allowed");
   }
 
+  @Override
+  public void setTimes(Path p, long mtime, long atime) throws IOException {
+throw new IOException("Har: setTimes not allowed");
+  }
+
   /**
* Not implemented.
*/
@@ -1147,4 +1201,43 @@ public class HarFileSystem extends

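The HADOOP-10029 diffs above show the pattern HarFileSystem follows: it wraps an underlying `FileSystem`, forwards the calls it cannot answer itself (delegation-token and path methods delegate to `fs`), and rejects mutations it does not support by throwing. A generic sketch of that wrapper pattern; the `Store` interface and method names here are invented for illustration and are not Hadoop's API:

```java
// A toy stand-in for a filesystem-like interface.
interface Store {
    String status(String path);
}

// A wrapper that, like HarFileSystem over its underlying fs, forwards
// reads to the wrapped instance and refuses writes outright.
class ReadOnlyStore implements Store {
    private final Store underlying;

    ReadOnlyStore(Store underlying) {
        this.underlying = underlying;
    }

    @Override
    public String status(String path) {
        // Delegate straight to the wrapped store, as HarFileSystem's
        // getStatus(p) delegates to fs.getStatus(p).
        return underlying.status(path);
    }

    String create(String path) {
        // Mirrors "Har: create not allowed."
        throw new UnsupportedOperationException("read-only: create not allowed");
    }
}

public class WrapperDemo {
    public static void main(String[] args) {
        Store base = path -> "OK:" + path;
        ReadOnlyStore wrapped = new ReadOnlyStore(base);
        System.out.println(wrapped.status("/a"));  // OK:/a
    }
}
```

The bug HADOOP-10029 fixed was a gap in this forwarding: methods the token machinery relies on (`getCanonicalUri`, `checkPath`, `resolvePath`, `getChildFileSystems`) were not delegating to the underlying filesystem, so har paths failed in a secure cluster.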
svn commit: r1529712 - in /hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common: CHANGES.txt src/main/java/org/apache/hadoop/fs/FileContext.java src/main/java/org/apache/hadoop/f

2013-10-06 Thread suresh
Author: suresh
Date: Sun Oct  6 23:17:39 2013
New Revision: 1529712

URL: http://svn.apache.org/r1529712
Log:
HADOOP-10020. Disable symlinks temporarily (branch-2.1-beta only change). 
Contributed by Sanjay Radia.

Modified:

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/FileContext.java

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/FileSystem.java

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/test/resources/core-site.xml

Modified: 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1529712&r1=1529711&r2=1529712&view=diff
==
--- 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt
 (original)
+++ 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt
 Sun Oct  6 23:17:39 2013
@@ -19,6 +19,9 @@ Release 2.1.2 - UNRELEASED
 
   INCOMPATIBLE CHANGES
 
+HADOOP-10020. Disable symlinks temporarily (branch-2.1-beta only change)
+(sanjay via suresh)
+
   NEW FEATURES
 
 HDFS-4817.  Make HDFS advisory caching configurable on a per-file basis.

Modified: 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/FileContext.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/FileContext.java?rev=1529712&r1=1529711&r2=1529712&view=diff
==
--- 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/FileContext.java
 (original)
+++ 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/FileContext.java
 Sun Oct  6 23:17:39 2013
@@ -1339,11 +1339,15 @@ public final class FileContext {
    *   target or link is not supported
    * @throws IOException If an I/O error occurred
    */
+  @SuppressWarnings("deprecation")
   public void createSymlink(final Path target, final Path link,
       final boolean createParent) throws AccessControlException,
       FileAlreadyExistsException, FileNotFoundException,
       ParentNotDirectoryException, UnsupportedFileSystemException, 
       IOException { 
+    if (!FileSystem.isSymlinksEnabled()) {
+      throw new UnsupportedOperationException("Symlinks not supported");
+    }
     final Path nonRelLink = fixRelativePart(link);
     new FSLinkResolver<Void>() {
       @Override

Modified: 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/FileSystem.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/FileSystem.java?rev=1529712&r1=1529711&r2=1529712&view=diff
==
--- 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/FileSystem.java
 (original)
+++ 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/FileSystem.java
 Sun Oct  6 23:17:39 2013
@@ -2807,4 +2807,24 @@ public abstract class FileSystem extends
           ": " + pair.getValue());
     }
   }
+  
+  // Symlinks are temporarily disabled - see Hadoop-10020
+  private static boolean symlinkEnabled = false;
+  private static Configuration conf = null;
+  
+  @Deprecated
+  @VisibleForTesting
+  public static boolean isSymlinksEnabled() {
+    if (conf == null) {
+      Configuration conf = new Configuration();
+      symlinkEnabled = conf.getBoolean("test.SymlinkEnabledForTesting", false); 
+    }
+    return symlinkEnabled;
+  }
+  
+  @Deprecated
+  @VisibleForTesting
+  public static void enableSymlinks() {
+    symlinkEnabled = true;
+  }
 }

Modified: 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/test/resources/core-site.xml
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/test/resources/core-site.xml?rev=1529712&r1=1529711&r2=1529712&view=diff
==
--- 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/test/resources/core-site.xml
 (original)
+++ 
hadoop/common/branches/b
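
[Editor's note: the HADOOP-10020 change above gates every symlink entry point behind a static flag that is off by default and can only be re-enabled explicitly (by tests, via a config key). The following is a minimal self-contained sketch of that kill-switch pattern; `SymlinkGuard` and its methods are illustrative names, not Hadoop's actual API.]

```java
// Sketch of the kill-switch pattern: a static flag, off by default,
// checked by every public entry point before any work is done.
// SymlinkGuard, createSymlink etc. are invented names for illustration.
public class SymlinkGuard {
    private static boolean symlinkEnabled = false;

    public static boolean isSymlinksEnabled() {
        return symlinkEnabled;
    }

    // Tests (and only tests) flip the flag back on.
    public static void enableSymlinks() {
        symlinkEnabled = true;
    }

    // Entry points fail fast while the feature is disabled.
    public static String createSymlink(String target, String link) {
        if (!isSymlinksEnabled()) {
            throw new UnsupportedOperationException("Symlinks not supported");
        }
        return link + " -> " + target;
    }

    public static void main(String[] args) {
        boolean rejected = false;
        try {
            createSymlink("/data/real", "/data/alias");
        } catch (UnsupportedOperationException e) {
            rejected = true;
        }
        System.out.println("rejected while disabled: " + rejected);
        enableSymlinks();
        System.out.println(createSymlink("/data/real", "/data/alias"));
    }
}
```

The advantage of a static guard over removing the code is visible in the diff itself: the feature stays compiled and testable while being unreachable for users of the release.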

svn commit: r1528303 - in /hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src: main/java/org/apache/hadoop/security/ main/java/org/apache/hadoop/security/token/ test/java/o

2013-10-01 Thread suresh
Author: suresh
Date: Wed Oct  2 04:20:29 2013
New Revision: 1528303

URL: http://svn.apache.org/r1528303
Log:
HADOOP-10012. Merge 1528302 from branch-2.

Modified:

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/UserGroupInformation.java

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/token/Token.java

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/security/TestUserGroupInformation.java

Modified: 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/UserGroupInformation.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/UserGroupInformation.java?rev=1528303&r1=1528302&r2=1528303&view=diff
==
--- 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/UserGroupInformation.java
 (original)
+++ 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/UserGroupInformation.java
 Wed Oct  2 04:20:29 2013
@@ -31,6 +31,7 @@ import java.util.Arrays;
 import java.util.Collection;
 import java.util.Collections;
 import java.util.HashMap;
+import java.util.Iterator;
 import java.util.List;
 import java.util.Map;
 import java.util.Set;
@@ -1309,7 +1310,14 @@ public class UserGroupInformation {
    * @return Credentials of tokens associated with this user
    */
   public synchronized Credentials getCredentials() {
-    return new Credentials(getCredentialsInternal());
+    Credentials creds = new Credentials(getCredentialsInternal());
+    Iterator<Token<?>> iter = creds.getAllTokens().iterator();
+    while (iter.hasNext()) {
+      if (iter.next() instanceof Token.PrivateToken) {
+        iter.remove();
+      }
+    }
+    return creds;
   }
   
   /**

Modified: 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/token/Token.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/token/Token.java?rev=1528303&r1=1528302&r2=1528303&view=diff
==
--- 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/token/Token.java
 (original)
+++ 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/token/Token.java
 Wed Oct  2 04:20:29 2013
@@ -19,31 +19,20 @@
 package org.apache.hadoop.security.token;
 
 import com.google.common.collect.Maps;
-
-import java.io.ByteArrayInputStream;
-import java.io.DataInput;
-import java.io.DataInputStream;
-import java.io.DataOutput;
-import java.io.IOException;
-import java.util.Arrays;
-import java.util.Map;
-import java.util.ServiceLoader;
-
 import org.apache.commons.codec.binary.Base64;
 import org.apache.commons.logging.Log;
 import org.apache.commons.logging.LogFactory;
-  
 import org.apache.hadoop.classification.InterfaceAudience;
 import org.apache.hadoop.classification.InterfaceStability;
 import org.apache.hadoop.conf.Configuration;
-import org.apache.hadoop.io.DataInputBuffer;
-import org.apache.hadoop.io.DataOutputBuffer;
-import org.apache.hadoop.io.Text;
-import org.apache.hadoop.io.Writable;
-import org.apache.hadoop.io.WritableComparator;
-import org.apache.hadoop.io.WritableUtils;
+import org.apache.hadoop.io.*;
 import org.apache.hadoop.util.ReflectionUtils;
 
+import java.io.*;
+import java.util.Arrays;
+import java.util.Map;
+import java.util.ServiceLoader;
+
 /**
  * The client-side form of the token.
  */
@@ -195,6 +184,19 @@ public class Token<T extends TokenIdentifier> implements Writable {
+  public static class PrivateToken<T extends TokenIdentifier> extends Token<T> {
+    public PrivateToken(Token<T> token) {
+      super(token);
+    }
+  }
+
   @Override
   public void readFields(DataInput in) throws IOException {
     int len = WritableUtils.readVInt(in);

Modified: 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/security/TestUserGroupInformation.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/security/TestUserGroupInformation.java?rev=1528303&r1=1528302&r2=1528303&view=diff
==
--- 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/security/TestUserGroupInformation.java
 (original)
+++ 
hadoop/common/branches/branch-2.1-beta/hadoo

svn commit: r1528302 - in /hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src: main/java/org/apache/hadoop/security/ main/java/org/apache/hadoop/security/token/ test/java/org/apac

2013-10-01 Thread suresh
Author: suresh
Date: Wed Oct  2 04:11:26 2013
New Revision: 1528302

URL: http://svn.apache.org/r1528302
Log:
HADOOP-10012. Merge 1528301 from trunk.

Modified:

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/UserGroupInformation.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/token/Token.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/security/TestUserGroupInformation.java

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/UserGroupInformation.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/UserGroupInformation.java?rev=1528302&r1=1528301&r2=1528302&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/UserGroupInformation.java
 (original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/UserGroupInformation.java
 Wed Oct  2 04:11:26 2013
@@ -31,6 +31,7 @@ import java.util.Arrays;
 import java.util.Collection;
 import java.util.Collections;
 import java.util.HashMap;
+import java.util.Iterator;
 import java.util.List;
 import java.util.Map;
 import java.util.Set;
@@ -1313,7 +1314,14 @@ public class UserGroupInformation {
    * @return Credentials of tokens associated with this user
    */
   public synchronized Credentials getCredentials() {
-    return new Credentials(getCredentialsInternal());
+    Credentials creds = new Credentials(getCredentialsInternal());
+    Iterator<Token<?>> iter = creds.getAllTokens().iterator();
+    while (iter.hasNext()) {
+      if (iter.next() instanceof Token.PrivateToken) {
+        iter.remove();
+      }
+    }
+    return creds;
   }
   
   /**

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/token/Token.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/token/Token.java?rev=1528302&r1=1528301&r2=1528302&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/token/Token.java
 (original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/token/Token.java
 Wed Oct  2 04:11:26 2013
@@ -19,31 +19,20 @@
 package org.apache.hadoop.security.token;
 
 import com.google.common.collect.Maps;
-
-import java.io.ByteArrayInputStream;
-import java.io.DataInput;
-import java.io.DataInputStream;
-import java.io.DataOutput;
-import java.io.IOException;
-import java.util.Arrays;
-import java.util.Map;
-import java.util.ServiceLoader;
-
 import org.apache.commons.codec.binary.Base64;
 import org.apache.commons.logging.Log;
 import org.apache.commons.logging.LogFactory;
-  
 import org.apache.hadoop.classification.InterfaceAudience;
 import org.apache.hadoop.classification.InterfaceStability;
 import org.apache.hadoop.conf.Configuration;
-import org.apache.hadoop.io.DataInputBuffer;
-import org.apache.hadoop.io.DataOutputBuffer;
-import org.apache.hadoop.io.Text;
-import org.apache.hadoop.io.Writable;
-import org.apache.hadoop.io.WritableComparator;
-import org.apache.hadoop.io.WritableUtils;
+import org.apache.hadoop.io.*;
 import org.apache.hadoop.util.ReflectionUtils;
 
+import java.io.*;
+import java.util.Arrays;
+import java.util.Map;
+import java.util.ServiceLoader;
+
 /**
  * The client-side form of the token.
  */
@@ -195,6 +184,19 @@ public class Token<T extends TokenIdentifier> implements Writable {
+  public static class PrivateToken<T extends TokenIdentifier> extends Token<T> {
+    public PrivateToken(Token<T> token) {
+      super(token);
+    }
+  }
+
   @Override
   public void readFields(DataInput in) throws IOException {
     int len = WritableUtils.readVInt(in);

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/security/TestUserGroupInformation.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/security/TestUserGroupInformation.java?rev=1528302&r1=1528301&r2=1528302&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/security/TestUserGroupInformation.java
 (original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/security/TestUserGroupInformation.java
 We

svn commit: r1528301 - in /hadoop/common/trunk/hadoop-common-project/hadoop-common/src: main/java/org/apache/hadoop/security/ main/java/org/apache/hadoop/security/token/ test/java/org/apache/hadoop/se

2013-10-01 Thread suresh
Author: suresh
Date: Wed Oct  2 04:00:06 2013
New Revision: 1528301

URL: http://svn.apache.org/r1528301
Log:
HADOOP-10012. Secure Oozie jobs fail with delegation token renewal exception in 
Namenode HA setup. Contributed by Daryn Sharp and Suresh Srinivas.

Modified:

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/UserGroupInformation.java

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/token/Token.java

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/security/TestUserGroupInformation.java

Modified: 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/UserGroupInformation.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/UserGroupInformation.java?rev=1528301&r1=1528300&r2=1528301&view=diff
==
--- 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/UserGroupInformation.java
 (original)
+++ 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/UserGroupInformation.java
 Wed Oct  2 04:00:06 2013
@@ -33,6 +33,7 @@ import java.util.Arrays;
 import java.util.Collection;
 import java.util.Collections;
 import java.util.HashMap;
+import java.util.Iterator;
 import java.util.List;
 import java.util.Map;
 import java.util.Set;
@@ -1325,7 +1326,14 @@ public class UserGroupInformation {
    * @return Credentials of tokens associated with this user
    */
   public synchronized Credentials getCredentials() {
-    return new Credentials(getCredentialsInternal());
+    Credentials creds = new Credentials(getCredentialsInternal());
+    Iterator<Token<?>> iter = creds.getAllTokens().iterator();
+    while (iter.hasNext()) {
+      if (iter.next() instanceof Token.PrivateToken) {
+        iter.remove();
+      }
+    }
+    return creds;
   }
   
   /**

Modified: 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/token/Token.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/token/Token.java?rev=1528301&r1=1528300&r2=1528301&view=diff
==
--- 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/token/Token.java
 (original)
+++ 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/token/Token.java
 Wed Oct  2 04:00:06 2013
@@ -19,31 +19,20 @@
 package org.apache.hadoop.security.token;
 
 import com.google.common.collect.Maps;
-
-import java.io.ByteArrayInputStream;
-import java.io.DataInput;
-import java.io.DataInputStream;
-import java.io.DataOutput;
-import java.io.IOException;
-import java.util.Arrays;
-import java.util.Map;
-import java.util.ServiceLoader;
-
 import org.apache.commons.codec.binary.Base64;
 import org.apache.commons.logging.Log;
 import org.apache.commons.logging.LogFactory;
-  
 import org.apache.hadoop.classification.InterfaceAudience;
 import org.apache.hadoop.classification.InterfaceStability;
 import org.apache.hadoop.conf.Configuration;
-import org.apache.hadoop.io.DataInputBuffer;
-import org.apache.hadoop.io.DataOutputBuffer;
-import org.apache.hadoop.io.Text;
-import org.apache.hadoop.io.Writable;
-import org.apache.hadoop.io.WritableComparator;
-import org.apache.hadoop.io.WritableUtils;
+import org.apache.hadoop.io.*;
 import org.apache.hadoop.util.ReflectionUtils;
 
+import java.io.*;
+import java.util.Arrays;
+import java.util.Map;
+import java.util.ServiceLoader;
+
 /**
  * The client-side form of the token.
  */
@@ -195,6 +184,19 @@ public class Token<T extends TokenIdentifier> implements Writable {
+  public static class PrivateToken<T extends TokenIdentifier> extends Token<T> {
+    public PrivateToken(Token<T> token) {
+      super(token);
+    }
+  }
+
   @Override
   public void readFields(DataInput in) throws IOException {
     int len = WritableUtils.readVInt(in);

Modified: 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/security/TestUserGroupInformation.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/security/TestUserGroupInformation.java?rev=1528301&r1=1528300&r2=1528301&view=diff
==
--- 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/security/TestUserGroupInformation.java
 (original)
+++ 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/security/TestUserGroupInformation.java
 Wed Oct  2 04:00:06 2013
@@ -16,11 +16,21 @@
  */
 package org.apache.hadoop.secu
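
[Editor's note: the HADOOP-10012 fix above makes `UserGroupInformation.getCredentials()` return a copy of the credentials with `Token.PrivateToken` instances stripped out via an explicit iterator, so HA-clone tokens never reach the caller and the internal store is left untouched. A self-contained sketch of that copy-then-filter pattern follows; `Token` and `PrivateToken` here are stand-in classes, not Hadoop's `org.apache.hadoop.security.token` types.]

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

// Copy-then-filter sketch: return a copy of the token list with private
// (HA-clone) tokens removed, never mutating the internal list itself.
public class CredentialsSketch {
    static class Token {
        final String service;
        Token(String service) { this.service = service; }
    }

    // Marker subclass for tokens that must stay invisible to callers.
    static class PrivateToken extends Token {
        PrivateToken(String service) { super(service); }
    }

    public static List<Token> visibleTokens(List<Token> internal) {
        List<Token> copy = new ArrayList<>(internal);  // internal stays intact
        Iterator<Token> iter = copy.iterator();
        while (iter.hasNext()) {
            if (iter.next() instanceof PrivateToken) {
                iter.remove();  // safe structural removal during iteration
            }
        }
        return copy;
    }

    public static void main(String[] args) {
        List<Token> internal = new ArrayList<>();
        internal.add(new Token("hdfs://active-nn"));
        internal.add(new PrivateToken("hdfs://standby-nn"));
        System.out.println(visibleTokens(internal).size() + " visible, "
            + internal.size() + " stored");
    }
}
```

Filtering through `Iterator.remove()` on the copy, rather than calling `List.remove()` inside the loop, avoids a `ConcurrentModificationException` and mirrors the shape of the committed fix.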

svn commit: r1528266 - in /hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common: ./ src/main/java/org/apache/hadoop/fs/ src/test/java/org/apache/hadoop/fs/ src/test/resources/tes

2013-10-01 Thread suresh
Author: suresh
Date: Tue Oct  1 23:11:17 2013
New Revision: 1528266

URL: http://svn.apache.org/r1528266
Log:
HADOOP-10003. Merging 1528263 from branch-2.

Added:

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/test/resources/test.har/
  - copied from r1528263, 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/resources/test.har/

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/test/resources/test.har/.part-0.crc
  - copied unchanged from r1528263, 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/resources/test.har/.part-0.crc

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/test/resources/test.har/_SUCCESS
  - copied unchanged from r1528263, 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/resources/test.har/_SUCCESS

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/test/resources/test.har/_index
  - copied unchanged from r1528263, 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/resources/test.har/_index

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/test/resources/test.har/_masterindex
  - copied unchanged from r1528263, 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/resources/test.har/_masterindex

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/test/resources/test.har/part-0
  - copied unchanged from r1528263, 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/resources/test.har/part-0
Modified:

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/pom.xml

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/HarFileSystem.java

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/fs/TestHarFileSystemBasics.java

Modified: 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1528266&r1=1528265&r2=1528266&view=diff
==
--- 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt
 (original)
+++ 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt
 Tue Oct  1 23:11:17 2013
@@ -39,6 +39,9 @@ Release 2.1.2 - UNRELEASED
 HADOOP-9761.  ViewFileSystem#rename fails when using DistributedFileSystem.
 (Andrew Wang via Colin Patrick McCabe)
 
+HADOOP-10003. HarFileSystem.listLocatedStatus() fails.
+(Jason Dere and suresh via suresh)
+
 Release 2.1.1-beta - 2013-09-23
 
   INCOMPATIBLE CHANGES

Modified: 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/pom.xml
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/pom.xml?rev=1528266&r1=1528265&r2=1528266&view=diff
==
--- 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/pom.xml
 (original)
+++ 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/pom.xml
 Tue Oct  1 23:11:17 2013
@@ -453,6 +453,10 @@
             <exclude>src/test/resources/kdc/ldif/users.ldif</exclude>
             <exclude>src/main/native/src/org/apache/hadoop/io/compress/lz4/lz4.c</exclude>
             <exclude>src/test/java/org/apache/hadoop/fs/test-untar.tgz</exclude>
+            <exclude>src/test/resources/test.har/_SUCCESS</exclude>
+            <exclude>src/test/resources/test.har/_index</exclude>
+            <exclude>src/test/resources/test.har/_masterindex</exclude>
+            <exclude>src/test/resources/test.har/part-0</exclude>
           </excludes>
         </configuration>
       </plugin>

Modified: 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/HarFileSystem.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/HarFileSystem.java?rev=1528266&r1=1528265&r2=1528266&view=diff
==
--- 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/HarFileSystem.java
 (original)
+++ 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/HarFileSystem.java
 Tue Oct  1 23:11:17 2013
@@ -17,20 +17,6 @@
  */
 package org.apache.hadoop.fs;
 
-import java.io.FileNotFoundException;
-import java.io.IOException;
-import java.io.U

svn commit: r1528263 - in /hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common: ./ src/main/java/org/apache/hadoop/fs/ src/test/java/org/apache/hadoop/fs/ src/test/resources/test.har/

2013-10-01 Thread suresh
Author: suresh
Date: Tue Oct  1 23:06:28 2013
New Revision: 1528263

URL: http://svn.apache.org/r1528263
Log:
HADOOP-10003. Merging 1528256 from trunk.

Added:

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/resources/test.har/
  - copied from r1528256, 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/resources/test.har/
Modified:

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/pom.xml

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/HarFileSystem.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/fs/TestHarFileSystemBasics.java

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1528263&r1=1528262&r2=1528263&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt 
(original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt 
Tue Oct  1 23:06:28 2013
@@ -132,6 +132,9 @@ Release 2.1.2 - UNRELEASED
 HADOOP-9761.  ViewFileSystem#rename fails when using DistributedFileSystem.
 (Andrew Wang via Colin Patrick McCabe)
 
+HADOOP-10003. HarFileSystem.listLocatedStatus() fails.
+(Jason Dere and suresh via suresh)
+
 Release 2.1.1-beta - 2013-09-23
 
   INCOMPATIBLE CHANGES

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/pom.xml
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/pom.xml?rev=1528263&r1=1528262&r2=1528263&view=diff
==
--- hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/pom.xml 
(original)
+++ hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/pom.xml 
Tue Oct  1 23:06:28 2013
@@ -458,6 +458,10 @@
             <exclude>src/main/native/src/org/apache/hadoop/io/compress/lz4/lz4hc.c</exclude>
             <exclude>src/main/native/src/org/apache/hadoop/io/compress/lz4/lz4hc_encoder.h</exclude>
             <exclude>src/test/java/org/apache/hadoop/fs/test-untar.tgz</exclude>
+            <exclude>src/test/resources/test.har/_SUCCESS</exclude>
+            <exclude>src/test/resources/test.har/_index</exclude>
+            <exclude>src/test/resources/test.har/_masterindex</exclude>
+            <exclude>src/test/resources/test.har/part-0</exclude>
           </excludes>
         </configuration>
       </plugin>

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/HarFileSystem.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/HarFileSystem.java?rev=1528263&r1=1528262&r2=1528263&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/HarFileSystem.java
 (original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/HarFileSystem.java
 Tue Oct  1 23:06:28 2013
@@ -17,20 +17,6 @@
  */
 package org.apache.hadoop.fs;
 
-import java.io.FileNotFoundException;
-import java.io.IOException;
-import java.io.UnsupportedEncodingException;
-import java.net.URI;
-import java.net.URISyntaxException;
-import java.net.URLDecoder;
-import java.util.ArrayList;
-import java.util.Collections;
-import java.util.List;
-import java.util.LinkedHashMap;
-import java.util.Map;
-import java.util.TreeMap;
-import java.util.HashMap;
-
 import org.apache.commons.logging.Log;
 import org.apache.commons.logging.LogFactory;
 import org.apache.hadoop.conf.Configuration;
@@ -40,6 +26,14 @@ import org.apache.hadoop.io.Text;
 import org.apache.hadoop.util.LineReader;
 import org.apache.hadoop.util.Progressable;
 
+import java.io.FileNotFoundException;
+import java.io.IOException;
+import java.io.UnsupportedEncodingException;
+import java.net.URI;
+import java.net.URISyntaxException;
+import java.net.URLDecoder;
+import java.util.*;
+
 /**
  * This is an implementation of the Hadoop Archive 
  * Filesystem. This archive Filesystem has index files
@@ -53,7 +47,7 @@ import org.apache.hadoop.util.Progressab
  * index for ranges of hashcodes.
  */
 
-public class HarFileSystem extends FilterFileSystem {
+public class HarFileSystem extends FileSystem {
 
   private static final Log LOG = LogFactory.getLog(HarFileSystem.class);
 
@@ -75,11 +69,13 @@ public class HarFileSystem extends Filte
   // pointer into the static metadata cache
   private HarMetaData metadata;
 
+  private FileSystem fs;
+
   /**
* public construction of 

svn commit: r1528256 - in /hadoop/common/trunk/hadoop-common-project/hadoop-common: ./ src/main/java/org/apache/hadoop/fs/ src/test/java/org/apache/hadoop/fs/ src/test/resources/test.har/

2013-10-01 Thread suresh
Author: suresh
Date: Tue Oct  1 22:57:59 2013
New Revision: 1528256

URL: http://svn.apache.org/r1528256
Log:
HADOOP-10003. HarFileSystem.listLocatedStatus() fails. Contributed by Jason 
Dere and suresh.

Added:

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/resources/test.har/

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/resources/test.har/.part-0.crc
   (with props)

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/resources/test.har/_SUCCESS
   (with props)

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/resources/test.har/_index
   (with props)

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/resources/test.har/_masterindex
   (with props)

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/resources/test.har/part-0
   (with props)
Modified:
hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt
hadoop/common/trunk/hadoop-common-project/hadoop-common/pom.xml

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/HarFileSystem.java

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/fs/TestHarFileSystemBasics.java

Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1528256&r1=1528255&r2=1528256&view=diff
==
--- hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt 
(original)
+++ hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt Tue Oct 
 1 22:57:59 2013
@@ -420,6 +420,9 @@ Release 2.1.2 - UNRELEASED
 HADOOP-9761.  ViewFileSystem#rename fails when using DistributedFileSystem.
 (Andrew Wang via Colin Patrick McCabe)
 
+HADOOP-10003. HarFileSystem.listLocatedStatus() fails.
+(Jason Dere and suresh via suresh)
+
 Release 2.1.1-beta - 2013-09-23
 
   INCOMPATIBLE CHANGES

Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/pom.xml
URL: 
http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/pom.xml?rev=1528256&r1=1528255&r2=1528256&view=diff
==
--- hadoop/common/trunk/hadoop-common-project/hadoop-common/pom.xml (original)
+++ hadoop/common/trunk/hadoop-common-project/hadoop-common/pom.xml Tue Oct  1 
22:57:59 2013
@@ -464,6 +464,10 @@
             <exclude>src/main/native/src/org/apache/hadoop/io/compress/lz4/lz4hc.c</exclude>
             <exclude>src/main/native/src/org/apache/hadoop/io/compress/lz4/lz4hc_encoder.h</exclude>
             <exclude>src/test/java/org/apache/hadoop/fs/test-untar.tgz</exclude>
+            <exclude>src/test/resources/test.har/_SUCCESS</exclude>
+            <exclude>src/test/resources/test.har/_index</exclude>
+            <exclude>src/test/resources/test.har/_masterindex</exclude>
+            <exclude>src/test/resources/test.har/part-0</exclude>
           </excludes>
         </configuration>
       </plugin>

Modified: 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/HarFileSystem.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/HarFileSystem.java?rev=1528256&r1=1528255&r2=1528256&view=diff
==
--- 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/HarFileSystem.java
 (original)
+++ 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/HarFileSystem.java
 Tue Oct  1 22:57:59 2013
@@ -17,20 +17,6 @@
  */
 package org.apache.hadoop.fs;
 
-import java.io.FileNotFoundException;
-import java.io.IOException;
-import java.io.UnsupportedEncodingException;
-import java.net.URI;
-import java.net.URISyntaxException;
-import java.net.URLDecoder;
-import java.util.ArrayList;
-import java.util.Collections;
-import java.util.List;
-import java.util.LinkedHashMap;
-import java.util.Map;
-import java.util.TreeMap;
-import java.util.HashMap;
-
 import org.apache.commons.logging.Log;
 import org.apache.commons.logging.LogFactory;
 import org.apache.hadoop.conf.Configuration;
@@ -40,6 +26,14 @@ import org.apache.hadoop.io.Text;
 import org.apache.hadoop.util.LineReader;
 import org.apache.hadoop.util.Progressable;
 
+import java.io.FileNotFoundException;
+import java.io.IOException;
+import java.io.UnsupportedEncodingException;
+import java.net.URI;
+import java.net.URISyntaxException;
+import java.net.URLDecoder;
+import java.util.*;
+
 /**
  * This is an implementation of the Hadoop Archive 
  * Filesystem. This archive Filesystem has index files
@@ -53,7 +47,7 @@ import org.apache.hadoop.util.Progressab
  * index for ranges of hashcodes.
  */
 
-public class HarFileSystem extends FilterFileSystem {
+public class HarFileSystem extends FileSystem
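
[Editor's note: the HADOOP-10003 fix above changes `HarFileSystem` from *inheriting* `FilterFileSystem` to extending `FileSystem` directly, holding the underlying filesystem in a private `fs` field. With composition, only operations the archive filesystem explicitly forwards reach the inner filesystem; nothing is delegated by accident. A toy sketch of that design choice follows; the `Fs` interface and class names are invented for illustration, not Hadoop's API.]

```java
// Composition-over-inheritance sketch: a read-only archive filesystem
// wraps an inner filesystem in a private field and forwards only the
// operations that make sense, failing loudly on the rest.
public class ArchiveFsSketch {
    interface Fs {
        String read(String path);
        void delete(String path);
    }

    static class LocalFs implements Fs {
        public String read(String path) { return "data:" + path; }
        public void delete(String path) { /* would remove path */ }
    }

    static class ReadOnlyArchiveFs implements Fs {
        private final Fs fs;  // explicit delegate, nothing inherited

        ReadOnlyArchiveFs(Fs fs) { this.fs = fs; }

        public String read(String path) { return fs.read(path); }

        public void delete(String path) {
            // An inherited delete() would silently hit the inner fs;
            // the explicit override makes the immutability visible.
            throw new UnsupportedOperationException("archives are immutable");
        }
    }

    public static void main(String[] args) {
        Fs har = new ReadOnlyArchiveFs(new LocalFs());
        System.out.println(har.read("/part-0"));
    }
}
```

The same reasoning applies to the committed change: a subclass of `FilterFileSystem` inherits every delegating method, including ones (like `listLocatedStatus`) that are wrong for an archive layout, whereas the composed version must opt in per operation.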

svn commit: r1514108 - in /hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common: CHANGES.txt src/main/java/org/apache/hadoop/fs/shell/CopyCommands.java src/site/apt/FileSystemShe

2013-08-14 Thread suresh
Author: suresh
Date: Thu Aug 15 00:40:22 2013
New Revision: 1514108

URL: http://svn.apache.org/r1514108
Log:
HADOOP-9381. Merge 1514103 from branch-2

Modified:

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/shell/CopyCommands.java

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/site/apt/FileSystemShell.apt.vm

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/test/resources/testConf.xml

Modified: 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1514108&r1=1514107&r2=1514108&view=diff
==
--- 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt
 (original)
+++ 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt
 Thu Aug 15 00:40:22 2013
@@ -56,6 +56,8 @@ Release 2.1.1-beta - UNRELEASED
 HADOOP-9527. Add symlink support to LocalFileSystem on Windows.
 (Arpit Agarwal)
 
+HADOOP-9381. Document dfs cp -f option. (Keegan Witt, suresh via suresh)
+
 Release 2.1.0-beta - 2013-08-06
 
   INCOMPATIBLE CHANGES

Modified: 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/shell/CopyCommands.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/shell/CopyCommands.java?rev=1514108&r1=1514107&r2=1514108&view=diff
==
--- 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/shell/CopyCommands.java
 (original)
+++ 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/shell/CopyCommands.java
 Thu Aug 15 00:40:22 2013
@@ -133,7 +133,8 @@ class CopyCommands {  
    "Copy files that match the file pattern <src> to a\n" +
   "destination.  When copying multiple files, the destination\n" +
   "must be a directory. Passing -p preserves access and\n" +
-  "modification times, ownership and the mode.\n";
+  "modification times, ownership and the mode. Passing -f\n" +
+  "overwrites the destination if it already exists.\n";
 
 @Override
 protected void processOptions(LinkedList<String> args) throws IOException {
@@ -186,7 +187,8 @@ class CopyCommands {  
   "into fs. Copying fails if the file already\n" +
   "exists, unless the -f flag is given. Passing\n" +
   "-p preserves access and modification times,\n" +
-  "ownership and the mode.\n";
+  "ownership and the mode. Passing -f overwrites\n" +
+  "the destination if it already exists.\n";
 
 @Override
 protected void processOptions(LinkedList<String> args) throws IOException {

Modified: 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/site/apt/FileSystemShell.apt.vm
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/site/apt/FileSystemShell.apt.vm?rev=1514108&r1=1514107&r2=1514108&view=diff
==
--- 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/site/apt/FileSystemShell.apt.vm
 (original)
+++ 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/site/apt/FileSystemShell.apt.vm
 Thu Aug 15 00:40:22 2013
@@ -86,11 +86,14 @@ chgrp
 
Usage: <<>>
 
-   Change group association of files. With -R, make the change recursively
-   through the directory structure. The user must be the owner of files, or
+   Change group association of files. The user must be the owner of files, or
else a super-user. Additional information is in the
{{{betterurl}Permissions Guide}}.
 
+   Options
+
+ * The -R option will make the change recursively through the directory 
structure.
+
 chmod
 
Usage: << URI [URI ...]>>>
@@ -100,14 +103,21 @@ chmod
else a super-user. Additional information is in the
{{{betterurl}Permissions Guide}}.
 
+   Options
+
+ * The -R option will make the change recursively through the directory 
structure.
+
 chown
 
Usage: <<>>
 
-   Change the owner of files. With -R, make the change recursively through the
-   directory structure. The user must be a super-user. Additional information
+   Change the ow

svn commit: r1514103 - in /hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common: CHANGES.txt src/main/java/org/apache/hadoop/fs/shell/CopyCommands.java src/site/apt/FileSystemShell.apt.

2013-08-14 Thread suresh
Author: suresh
Date: Thu Aug 15 00:04:40 2013
New Revision: 1514103

URL: http://svn.apache.org/r1514103
Log:
HADOOP-9381. Merge 1514089 from trunk

Modified:

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/shell/CopyCommands.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/site/apt/FileSystemShell.apt.vm

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/resources/testConf.xml

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1514103&r1=1514102&r2=1514103&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt 
(original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt 
Thu Aug 15 00:04:40 2013
@@ -116,6 +116,8 @@ Release 2.1.1-beta - UNRELEASED
 HADOOP-9857. Tests block and sometimes timeout on Windows due to invalid
 entropy source. (cnauroth)
 
+HADOOP-9381. Document dfs cp -f option. (Keegan Witt, suresh via suresh)
+
 Release 2.1.0-beta - 2013-08-06
 
   INCOMPATIBLE CHANGES

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/shell/CopyCommands.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/shell/CopyCommands.java?rev=1514103&r1=1514102&r2=1514103&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/shell/CopyCommands.java
 (original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/shell/CopyCommands.java
 Thu Aug 15 00:04:40 2013
@@ -133,7 +133,8 @@ class CopyCommands {  
    "Copy files that match the file pattern <src> to a\n" +
   "destination.  When copying multiple files, the destination\n" +
   "must be a directory. Passing -p preserves access and\n" +
-  "modification times, ownership and the mode.\n";
+  "modification times, ownership and the mode. Passing -f\n" +
+  "overwrites the destination if it already exists.\n";
 
 @Override
 protected void processOptions(LinkedList args) throws IOException {
@@ -186,7 +187,8 @@ class CopyCommands {  
   "into fs. Copying fails if the file already\n" +
   "exists, unless the -f flag is given. Passing\n" +
   "-p preserves access and modification times,\n" +
-  "ownership and the mode.\n";
+  "ownership and the mode. Passing -f overwrites\n" +
+  "the destination if it already exists.\n";
 
 @Override
 protected void processOptions(LinkedList<String> args) throws IOException {

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/site/apt/FileSystemShell.apt.vm
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/site/apt/FileSystemShell.apt.vm?rev=1514103&r1=1514102&r2=1514103&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/site/apt/FileSystemShell.apt.vm
 (original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/site/apt/FileSystemShell.apt.vm
 Thu Aug 15 00:04:40 2013
@@ -86,11 +86,14 @@ chgrp
 
Usage: <<>>
 
-   Change group association of files. With -R, make the change recursively
-   through the directory structure. The user must be the owner of files, or
+   Change group association of files. The user must be the owner of files, or
else a super-user. Additional information is in the
{{{betterurl}Permissions Guide}}.
 
+   Options
+
+ * The -R option will make the change recursively through the directory 
structure.
+
 chmod
 
Usage: << URI [URI ...]>>>
@@ -100,14 +103,21 @@ chmod
else a super-user. Additional information is in the
{{{betterurl}Permissions Guide}}.
 
+   Options
+
+ * The -R option will make the change recursively through the directory 
structure.
+
 chown
 
Usage: <<>>
 
-   Change the owner of files. With -R, make the change recursively through the
-   directory structure. The user must be a super-user. Additional information
+   Change the owner of files. The user must be a super-user. Additional 
information
is in the {

svn commit: r1514089 - in /hadoop/common/trunk/hadoop-common-project/hadoop-common: CHANGES.txt src/main/java/org/apache/hadoop/fs/shell/CopyCommands.java src/site/apt/FileSystemShell.apt.vm src/test/

2013-08-14 Thread suresh
Author: suresh
Date: Wed Aug 14 23:17:55 2013
New Revision: 1514089

URL: http://svn.apache.org/r1514089
Log:
HADOOP-9381. Document dfs cp -f option. Contributed by Keegan Witt and Suresh 
Srinivas.

Modified:
hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/shell/CopyCommands.java

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/site/apt/FileSystemShell.apt.vm

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/resources/testConf.xml

Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1514089&r1=1514088&r2=1514089&view=diff
==
--- hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt 
(original)
+++ hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt Wed Aug 
14 23:17:55 2013
@@ -389,6 +389,8 @@ Release 2.1.1-beta - UNRELEASED
 HADOOP-9857. Tests block and sometimes timeout on Windows due to invalid
 entropy source. (cnauroth)
 
+HADOOP-9381. Document dfs cp -f option. (Keegan Witt, suresh via suresh)
+
 Release 2.1.0-beta - 2013-08-06
 
   INCOMPATIBLE CHANGES

Modified: 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/shell/CopyCommands.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/shell/CopyCommands.java?rev=1514089&r1=1514088&r2=1514089&view=diff
==
--- 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/shell/CopyCommands.java
 (original)
+++ 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/shell/CopyCommands.java
 Wed Aug 14 23:17:55 2013
@@ -133,7 +133,8 @@ class CopyCommands {  
    "Copy files that match the file pattern <src> to a\n" +
   "destination.  When copying multiple files, the destination\n" +
   "must be a directory. Passing -p preserves access and\n" +
-  "modification times, ownership and the mode.\n";
+  "modification times, ownership and the mode. Passing -f\n" +
+  "overwrites the destination if it already exists.\n";
 
 @Override
 protected void processOptions(LinkedList<String> args) throws IOException {
@@ -186,7 +187,8 @@ class CopyCommands {  
   "into fs. Copying fails if the file already\n" +
   "exists, unless the -f flag is given. Passing\n" +
   "-p preserves access and modification times,\n" +
-  "ownership and the mode.\n";
+  "ownership and the mode. Passing -f overwrites\n" +
+  "the destination if it already exists.\n";
 
 @Override
 protected void processOptions(LinkedList<String> args) throws IOException {

Modified: 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/site/apt/FileSystemShell.apt.vm
URL: 
http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/site/apt/FileSystemShell.apt.vm?rev=1514089&r1=1514088&r2=1514089&view=diff
==
--- 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/site/apt/FileSystemShell.apt.vm
 (original)
+++ 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/site/apt/FileSystemShell.apt.vm
 Wed Aug 14 23:17:55 2013
@@ -86,11 +86,14 @@ chgrp
 
Usage: <<>>
 
-   Change group association of files. With -R, make the change recursively
-   through the directory structure. The user must be the owner of files, or
+   Change group association of files. The user must be the owner of files, or
else a super-user. Additional information is in the
{{{betterurl}Permissions Guide}}.
 
+   Options
+
+ * The -R option will make the change recursively through the directory 
structure.
+
 chmod
 
Usage: << URI [URI ...]>>>
@@ -100,14 +103,21 @@ chmod
else a super-user. Additional information is in the
{{{betterurl}Permissions Guide}}.
 
+   Options
+
+ * The -R option will make the change recursively through the directory 
structure.
+
 chown
 
Usage: <<>>
 
-   Change the owner of files. With -R, make the change recursively through the
-   directory structure. The user must be a super-user. Additional information
+   Change the owner of files. The user must be a super-user. Additional 
information
is in the {{{betterurl}Permissions Guide}}.
 
+   Options
+
+ * The -R option will make the change recursively through the directory 
structure.
+
 copyFromLoc

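[Editor's note] The HADOOP-9381 hunks above only add help text; the behavior they document (copy refuses an existing destination unless -f is given, and -f overwrites it) is implemented elsewhere in CopyCommands. A minimal java.nio sketch of those documented semantics — not Hadoop's actual code; the class and method names here are illustrative only:

```java
import java.io.IOException;
import java.nio.file.*;

public class CpSemantics {

    // Documented rule: without -f the copy fails if dst already exists;
    // with -f the destination is overwritten.
    static void cp(Path src, Path dst, boolean force) throws IOException {
        if (Files.exists(dst) && !force) {
            throw new FileAlreadyExistsException(dst.toString());
        }
        Files.copy(src, dst, StandardCopyOption.REPLACE_EXISTING);
    }

    /** Runs both scenarios; true iff each behaves as the help text says. */
    public static boolean demo() {
        try {
            Path src = Files.createTempFile("src", ".txt");
            Path dst = Files.createTempFile("dst", ".txt");  // dst exists
            Files.writeString(src, "data");
            boolean refusedWithoutF = false;
            try {
                cp(src, dst, false);           // no -f: must be refused
            } catch (FileAlreadyExistsException e) {
                refusedWithoutF = true;
            }
            cp(src, dst, true);                // -f: overwrite succeeds
            return refusedWithoutF && Files.readString(dst).equals("data");
        } catch (IOException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println(demo());
    }
}
```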
svn commit: r1508336 - in /hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src: main/java/org/apache/hadoop/io/retry/ main/java/org/apache/hadoop/ipc/ main/java/org/apache/h

2013-07-30 Thread suresh
Author: suresh
Date: Tue Jul 30 08:01:00 2013
New Revision: 1508336

URL: http://svn.apache.org/r1508336
Log:
HDFS-5025. Merge 1508335 from branch-2

Added:

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/ClientId.java
  - copied unchanged from r1508335, 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/ClientId.java
Modified:

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/RetryInvocationHandler.java

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RetryCache.java

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/StringUtils.java

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/ipc/TestRetryCache.java

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/util/TestProtoUtil.java

Modified: 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/RetryInvocationHandler.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/RetryInvocationHandler.java?rev=1508336&r1=1508335&r2=1508336&view=diff
==
--- 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/RetryInvocationHandler.java
 (original)
+++ 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/RetryInvocationHandler.java
 Tue Jul 30 08:01:00 2013
@@ -28,6 +28,7 @@ import java.util.concurrent.atomic.Atomi
 
 import org.apache.commons.logging.Log;
 import org.apache.commons.logging.LogFactory;
+import org.apache.hadoop.classification.InterfaceAudience;
 import org.apache.hadoop.io.retry.RetryPolicy.RetryAction;
 import org.apache.hadoop.ipc.Client;
 import org.apache.hadoop.ipc.Client.ConnectionId;
@@ -39,7 +40,12 @@ import org.apache.hadoop.util.ThreadUtil
 
 import com.google.common.annotations.VisibleForTesting;
 
-class RetryInvocationHandler implements RpcInvocationHandler {
+/**
+ * This class implements RpcInvocationHandler and supports retry on the client 
+ * side.
+ */
+@InterfaceAudience.Private
+public class RetryInvocationHandler implements RpcInvocationHandler {
   public static final Log LOG = 
LogFactory.getLog(RetryInvocationHandler.class);
   private final FailoverProxyProvider proxyProvider;
 

Modified: 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java?rev=1508336&r1=1508335&r2=1508336&view=diff
==
--- 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
 (original)
+++ 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
 Tue Jul 30 08:01:00 2013
@@ -1105,7 +1105,7 @@ public class Client {
 CommonConfigurationKeys.IPC_CLIENT_CONNECT_TIMEOUT_DEFAULT);
 this.fallbackAllowed = 
conf.getBoolean(CommonConfigurationKeys.IPC_CLIENT_FALLBACK_TO_SIMPLE_AUTH_ALLOWED_KEY,
 
CommonConfigurationKeys.IPC_CLIENT_FALLBACK_TO_SIMPLE_AUTH_ALLOWED_DEFAULT);
-this.clientId = StringUtils.getUuidBytes();
+this.clientId = ClientId.getClientId();
   }
 
   /**

Modified: 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RetryCache.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RetryCache.java?rev=1508336&r1=1508335&r2=1508336&view=diff
==
--- 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RetryCache.java
 (original)
+++ 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RetryCache.java
 Tue Jul 30 08:01:00 2013
@@ -19,6 +19,7 @@ package org.apache.hadoop.ipc;
 
 
 import java.util.Arrays;
+import java.util.UUID;
 
 import org.apache.commons.logging.Log;
 import o

svn commit: r1508335 - in /hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src: main/java/org/apache/hadoop/io/retry/ main/java/org/apache/hadoop/ipc/ main/java/org/apache/hadoop/u

2013-07-30 Thread suresh
Author: suresh
Date: Tue Jul 30 07:56:23 2013
New Revision: 1508335

URL: http://svn.apache.org/r1508335
Log:
HDFS-5025. Merge 1508332 from trunk

Added:

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/ClientId.java
  - copied unchanged from r1508332, 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/ClientId.java
Modified:

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/RetryInvocationHandler.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RetryCache.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/StringUtils.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/ipc/TestRetryCache.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/util/TestProtoUtil.java

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/RetryInvocationHandler.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/RetryInvocationHandler.java?rev=1508335&r1=1508334&r2=1508335&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/RetryInvocationHandler.java
 (original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/RetryInvocationHandler.java
 Tue Jul 30 07:56:23 2013
@@ -28,6 +28,7 @@ import java.util.concurrent.atomic.Atomi
 
 import org.apache.commons.logging.Log;
 import org.apache.commons.logging.LogFactory;
+import org.apache.hadoop.classification.InterfaceAudience;
 import org.apache.hadoop.io.retry.RetryPolicy.RetryAction;
 import org.apache.hadoop.ipc.Client;
 import org.apache.hadoop.ipc.Client.ConnectionId;
@@ -39,7 +40,12 @@ import org.apache.hadoop.util.ThreadUtil
 
 import com.google.common.annotations.VisibleForTesting;
 
-class RetryInvocationHandler implements RpcInvocationHandler {
+/**
+ * This class implements RpcInvocationHandler and supports retry on the client 
+ * side.
+ */
+@InterfaceAudience.Private
+public class RetryInvocationHandler implements RpcInvocationHandler {
   public static final Log LOG = 
LogFactory.getLog(RetryInvocationHandler.class);
   private final FailoverProxyProvider proxyProvider;
 

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java?rev=1508335&r1=1508334&r2=1508335&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
 (original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
 Tue Jul 30 07:56:23 2013
@@ -1159,7 +1159,7 @@ public class Client {
 CommonConfigurationKeys.IPC_CLIENT_CONNECT_TIMEOUT_DEFAULT);
 this.fallbackAllowed = 
conf.getBoolean(CommonConfigurationKeys.IPC_CLIENT_FALLBACK_TO_SIMPLE_AUTH_ALLOWED_KEY,
 
CommonConfigurationKeys.IPC_CLIENT_FALLBACK_TO_SIMPLE_AUTH_ALLOWED_DEFAULT);
-this.clientId = StringUtils.getUuidBytes();
+this.clientId = ClientId.getClientId();
 this.sendParamsExecutor = clientExcecutorFactory.refAndGetInstance();
   }
 

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RetryCache.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RetryCache.java?rev=1508335&r1=1508334&r2=1508335&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RetryCache.java
 (original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RetryCache.java
 Tue Jul 30 07:56:23 2013
@@ -19,6 +19,7 @@ package org.apache.hadoop.ipc;
 
 
 import java.util.Arrays;
+import java.util.UUID;
 
 import org.apache.commons.logging.Log;
 import org.apache.commons.logging.LogFactory;
@@ -27,6 +28,7 @@ import org.a

svn commit: r1508332 - in /hadoop/common/trunk/hadoop-common-project/hadoop-common/src: main/java/org/apache/hadoop/io/retry/ main/java/org/apache/hadoop/ipc/ main/java/org/apache/hadoop/util/ test/ja

2013-07-30 Thread suresh
Author: suresh
Date: Tue Jul 30 07:51:38 2013
New Revision: 1508332

URL: http://svn.apache.org/r1508332
Log:
HDFS-5025. Record ClientId and CallId in EditLog to enable rebuilding retry 
cache in case of HA failover. Contributed by Jing Zhao.

Added:

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/ClientId.java
Modified:

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/RetryInvocationHandler.java

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RetryCache.java

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/StringUtils.java

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/ipc/TestRetryCache.java

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/util/TestProtoUtil.java

Modified: 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/RetryInvocationHandler.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/RetryInvocationHandler.java?rev=1508332&r1=1508331&r2=1508332&view=diff
==
--- 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/RetryInvocationHandler.java
 (original)
+++ 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/RetryInvocationHandler.java
 Tue Jul 30 07:51:38 2013
@@ -27,6 +27,7 @@ import java.util.Map;
 
 import org.apache.commons.logging.Log;
 import org.apache.commons.logging.LogFactory;
+import org.apache.hadoop.classification.InterfaceAudience;
 import org.apache.hadoop.io.retry.RetryPolicy.RetryAction;
 import org.apache.hadoop.ipc.Client;
 import org.apache.hadoop.ipc.Client.ConnectionId;
@@ -38,7 +39,12 @@ import org.apache.hadoop.util.ThreadUtil
 
 import com.google.common.annotations.VisibleForTesting;
 
-class RetryInvocationHandler implements RpcInvocationHandler {
+/**
+ * This class implements RpcInvocationHandler and supports retry on the client 
+ * side.
+ */
+@InterfaceAudience.Private
+public class RetryInvocationHandler implements RpcInvocationHandler {
   public static final Log LOG = 
LogFactory.getLog(RetryInvocationHandler.class);
   private final FailoverProxyProvider proxyProvider;
 

Modified: 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java?rev=1508332&r1=1508331&r2=1508332&view=diff
==
--- 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
 (original)
+++ 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
 Tue Jul 30 07:51:38 2013
@@ -1161,7 +1161,7 @@ public class Client {
 CommonConfigurationKeys.IPC_CLIENT_CONNECT_TIMEOUT_DEFAULT);
 this.fallbackAllowed = 
conf.getBoolean(CommonConfigurationKeys.IPC_CLIENT_FALLBACK_TO_SIMPLE_AUTH_ALLOWED_KEY,
 
CommonConfigurationKeys.IPC_CLIENT_FALLBACK_TO_SIMPLE_AUTH_ALLOWED_DEFAULT);
-this.clientId = StringUtils.getUuidBytes();
+this.clientId = ClientId.getClientId();
 this.sendParamsExecutor = clientExcecutorFactory.refAndGetInstance();
   }
 

Added: 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/ClientId.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/ClientId.java?rev=1508332&view=auto
==
--- 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/ClientId.java
 (added)
+++ 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/ClientId.java
 Tue Jul 30 07:51:38 2013
@@ -0,0 +1,79 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in w

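[Editor's note] The HDFS-5025 change above replaces StringUtils.getUuidBytes() with ClientId.getClientId(), giving each Client a stable 16-byte identifier that can be recorded in the edit log. A self-contained sketch consistent with the commit description — packing a random UUID's two longs into a 16-byte array; this is an illustration, not the exact ClientId source:

```java
import java.nio.ByteBuffer;
import java.util.UUID;

public class ClientIdDemo {
    // A UUID is two 64-bit longs, so the packed id is 16 bytes.
    public static final int BYTE_LENGTH = 16;

    /** Generate a 16-byte client id from a random UUID. */
    public static byte[] getClientId() {
        UUID uuid = UUID.randomUUID();
        ByteBuffer buf = ByteBuffer.wrap(new byte[BYTE_LENGTH]);
        buf.putLong(uuid.getMostSignificantBits());
        buf.putLong(uuid.getLeastSignificantBits());
        return buf.array();
    }

    public static void main(String[] args) {
        System.out.println(getClientId().length);
    }
}
```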
svn commit: r1508315 - in /hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common: ./ src/main/java/org/apache/hadoop/io/retry/ src/test/java/org/apache/hadoop/io/retry/

2013-07-29 Thread suresh
Author: suresh
Date: Tue Jul 30 06:26:18 2013
New Revision: 1508315

URL: http://svn.apache.org/r1508315
Log:
HADOOP-9792. Merge 1508313 from branch-2.

Modified:

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/FailoverProxyProvider.java

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/RetryInvocationHandler.java

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/RetryPolicies.java

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/RetryPolicy.java

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/io/retry/TestFailoverProxy.java

Modified: 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1508315&r1=1508314&r2=1508315&view=diff
==
--- 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt
 (original)
+++ 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt
 Tue Jul 30 06:26:18 2013
@@ -71,6 +71,9 @@ Release 2.1.0-beta - 2013-07-02
 HADOOP-9762. RetryCache utility for implementing RPC retries. 
 (Suresh Srinivas via jing9)
 
+HADOOP-9792. Retry the methods that are tagged @AtMostOnce along 
+with @Idempotent. (suresh)
+
   IMPROVEMENTS
 
 HADOOP-9164. Print paths of loaded native libraries in

Modified: 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/FailoverProxyProvider.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/FailoverProxyProvider.java?rev=1508315&r1=1508314&r2=1508315&view=diff
==
--- 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/FailoverProxyProvider.java
 (original)
+++ 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/FailoverProxyProvider.java
 Tue Jul 30 06:26:18 2013
@@ -51,8 +51,8 @@ public interface FailoverProxyProvider
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/RetryInvocationHandler.java?rev=1508315&r1=1508314&r2=1508315&view=diff
==
--- 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/RetryInvocationHandler.java
 (original)
+++ 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/RetryInvocationHandler.java
 Tue Jul 30 06:26:18 2013
@@ -53,7 +53,7 @@ class RetryInvocationHandler implements 
   private final Map<String, RetryPolicy> methodNameToPolicyMap;
   private Object currentProxy;
 
-  RetryInvocationHandler(FailoverProxyProvider proxyProvider,
+  protected RetryInvocationHandler(FailoverProxyProvider proxyProvider,
   RetryPolicy retryPolicy) {
 this(proxyProvider, retryPolicy, Collections.<String, RetryPolicy>emptyMap());
   }
@@ -97,11 +97,16 @@ class RetryInvocationHandler implements 
 hasMadeASuccessfulCall = true;
 return ret;
   } catch (Exception e) {
-boolean isMethodIdempotent = proxyProvider.getInterface()
+boolean isIdempotentOrAtMostOnce = proxyProvider.getInterface()
 .getMethod(method.getName(), method.getParameterTypes())
 .isAnnotationPresent(Idempotent.class);
+if (!isIdempotentOrAtMostOnce) {
+  isIdempotentOrAtMostOnce = proxyProvider.getInterface()
+  .getMethod(method.getName(), method.getParameterTypes())
+  .isAnnotationPresent(AtMostOnce.class);
+}
 RetryAction action = policy.shouldRetry(e, retries++,
-invocationFailoverCount, isMethodIdempotent);
+invocationFailoverCount, isIdempotentOrAtMostOnce);
 if (action.action == RetryAction.RetryDecision.FAIL) {
   if (action.reason != null) {
 LOG.warn("Exception while invoking " + 
@@ -169,7 +174,7 @@ class RetryInvocationHandler implements 
 }
   }
   
-  private Object invokeMethod(Method method, Object[] args) throws Throwable {
+  protected Object invokeMethod(Method method, Object[] args) throws Throwable {
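
The HADOOP-9792 patch above widens the retry check: a method is eligible for retry if it carries @Idempotent, and, failing that, @AtMostOnce. A minimal, self-contained sketch of that fallback reflection check follows; the stand-in annotations, the DemoProtocol interface, and the helper name are all hypothetical, not Hadoop's actual classes:

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.Method;

public class RetryEligibility {
  // Hypothetical stand-ins for Hadoop's io.retry marker annotations.
  @Retention(RetentionPolicy.RUNTIME)
  @Target(ElementType.METHOD)
  @interface Idempotent {}

  @Retention(RetentionPolicy.RUNTIME)
  @Target(ElementType.METHOD)
  @interface AtMostOnce {}

  // Example protocol: one method per annotation, one with neither.
  interface DemoProtocol {
    @Idempotent void read();
    @AtMostOnce void create();
    void append();
  }

  // Mirrors the patched logic: check @Idempotent first, then fall
  // back to @AtMostOnce only if the first check failed.
  static boolean isIdempotentOrAtMostOnce(Class<?> iface, Method m)
      throws NoSuchMethodException {
    Method decl = iface.getMethod(m.getName(), m.getParameterTypes());
    boolean eligible = decl.isAnnotationPresent(Idempotent.class);
    if (!eligible) {
      eligible = decl.isAnnotationPresent(AtMostOnce.class);
    }
    return eligible;
  }

  public static void main(String[] args) throws Exception {
    Class<DemoProtocol> c = DemoProtocol.class;
    System.out.println(isIdempotentOrAtMostOnce(c, c.getMethod("read")));   // true
    System.out.println(isIdempotentOrAtMostOnce(c, c.getMethod("create"))); // true
    System.out.println(isIdempotentOrAtMostOnce(c, c.getMethod("append"))); // false
  }
}
```

Note that the annotations must be declared with RUNTIME retention, or isAnnotationPresent() would always return false at call time.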

svn commit: r1508313 - in /hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common: ./ src/main/java/org/apache/hadoop/io/retry/ src/test/java/org/apache/hadoop/io/retry/

2013-07-29 Thread suresh
Author: suresh
Date: Tue Jul 30 06:24:02 2013
New Revision: 1508313

URL: http://svn.apache.org/r1508313
Log:
HADOOP-9792. Merge 1508312 from trunk.

Modified:

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/FailoverProxyProvider.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/RetryInvocationHandler.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/RetryPolicies.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/RetryPolicy.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/io/retry/TestFailoverProxy.java

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1508313&r1=1508312&r2=1508313&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt 
(original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt 
Tue Jul 30 06:24:02 2013
@@ -107,6 +107,9 @@ Release 2.1.0-beta - 2013-07-02
 HADOOP-9762. RetryCache utility for implementing RPC retries. 
 (Suresh Srinivas via jing9)
 
+HADOOP-9792. Retry the methods that are tagged @AtMostOnce along 
+with @Idempotent. (suresh)
+
   IMPROVEMENTS
 
 HADOOP-9164. Print paths of loaded native libraries in

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/FailoverProxyProvider.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/FailoverProxyProvider.java?rev=1508313&r1=1508312&r2=1508313&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/FailoverProxyProvider.java
 (original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/FailoverProxyProvider.java
 Tue Jul 30 06:24:02 2013
@@ -51,8 +51,8 @@ public interface FailoverProxyProvider

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/RetryInvocationHandler.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/RetryInvocationHandler.java?rev=1508313&r1=1508312&r2=1508313&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/RetryInvocationHandler.java
 (original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/RetryInvocationHandler.java
 Tue Jul 30 06:24:02 2013
@@ -53,7 +53,7 @@ class RetryInvocationHandler implements 
   private final Map methodNameToPolicyMap;
   private Object currentProxy;
 
-  RetryInvocationHandler(FailoverProxyProvider proxyProvider,
+  protected RetryInvocationHandler(FailoverProxyProvider proxyProvider,
   RetryPolicy retryPolicy) {
 this(proxyProvider, retryPolicy, Collections.emptyMap());
   }
@@ -97,11 +97,16 @@ class RetryInvocationHandler implements 
 hasMadeASuccessfulCall = true;
 return ret;
   } catch (Exception e) {
-boolean isMethodIdempotent = proxyProvider.getInterface()
+boolean isIdempotentOrAtMostOnce = proxyProvider.getInterface()
 .getMethod(method.getName(), method.getParameterTypes())
 .isAnnotationPresent(Idempotent.class);
+if (!isIdempotentOrAtMostOnce) {
+  isIdempotentOrAtMostOnce = proxyProvider.getInterface()
+  .getMethod(method.getName(), method.getParameterTypes())
+  .isAnnotationPresent(AtMostOnce.class);
+}
 RetryAction action = policy.shouldRetry(e, retries++,
-invocationFailoverCount, isMethodIdempotent);
+invocationFailoverCount, isIdempotentOrAtMostOnce);
 if (action.action == RetryAction.RetryDecision.FAIL) {
   if (action.reason != null) {
 LOG.warn("Exception while invoking " + 
@@ -169,7 +174,7 @@ class RetryInvocationHandler implements 
 }
   }
   
-  private Object invokeMethod(Method method, Object[] args) throws Throwable {
+  protected Object invokeMethod(Method method, Object[] args) throws Throwable {
 try {
   if (!method.isAccessible()) {
 method.setAccessible(true);

Modified: 

svn commit: r1508312 - in /hadoop/common/trunk/hadoop-common-project/hadoop-common: ./ src/main/java/org/apache/hadoop/io/retry/ src/test/java/org/apache/hadoop/io/retry/

2013-07-29 Thread suresh
Author: suresh
Date: Tue Jul 30 06:19:28 2013
New Revision: 1508312

URL: http://svn.apache.org/r1508312
Log:
HADOOP-9792. Retry the methods that are tagged @AtMostOnce along with 
@Idempotent. Contributed by Suresh Srinivas.

Modified:
hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/FailoverProxyProvider.java

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/RetryInvocationHandler.java

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/RetryPolicies.java

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/RetryPolicy.java

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/io/retry/TestFailoverProxy.java

Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1508312&r1=1508311&r2=1508312&view=diff
==
--- hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt 
(original)
+++ hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt Tue Jul 
30 06:19:28 2013
@@ -377,6 +377,9 @@ Release 2.1.0-beta - 2013-07-02
 HADOOP-9762. RetryCache utility for implementing RPC retries. 
 (Suresh Srinivas via jing9)
 
+HADOOP-9792. Retry the methods that are tagged @AtMostOnce along 
+with @Idempotent. (suresh)
+
   IMPROVEMENTS
 
 HADOOP-9164. Print paths of loaded native libraries in

Modified: 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/FailoverProxyProvider.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/FailoverProxyProvider.java?rev=1508312&r1=1508311&r2=1508312&view=diff
==
--- 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/FailoverProxyProvider.java
 (original)
+++ 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/FailoverProxyProvider.java
 Tue Jul 30 06:19:28 2013
@@ -51,8 +51,8 @@ public interface FailoverProxyProvider

Modified: 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/RetryInvocationHandler.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/RetryInvocationHandler.java?rev=1508312&r1=1508311&r2=1508312&view=diff
==
--- 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/RetryInvocationHandler.java
 (original)
+++ 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/RetryInvocationHandler.java
 Tue Jul 30 06:19:28 2013
@@ -52,7 +52,7 @@ class RetryInvocationHandler implements 
   private final Map methodNameToPolicyMap;
   private Object currentProxy;
 
-  RetryInvocationHandler(FailoverProxyProvider proxyProvider,
+  protected RetryInvocationHandler(FailoverProxyProvider proxyProvider,
   RetryPolicy retryPolicy) {
 this(proxyProvider, retryPolicy, Collections.emptyMap());
   }
@@ -96,11 +96,16 @@ class RetryInvocationHandler implements 
 hasMadeASuccessfulCall = true;
 return ret;
   } catch (Exception e) {
-boolean isMethodIdempotent = proxyProvider.getInterface()
+boolean isIdempotentOrAtMostOnce = proxyProvider.getInterface()
 .getMethod(method.getName(), method.getParameterTypes())
 .isAnnotationPresent(Idempotent.class);
+if (!isIdempotentOrAtMostOnce) {
+  isIdempotentOrAtMostOnce = proxyProvider.getInterface()
+  .getMethod(method.getName(), method.getParameterTypes())
+  .isAnnotationPresent(AtMostOnce.class);
+}
 RetryAction action = policy.shouldRetry(e, retries++,
-invocationFailoverCount, isMethodIdempotent);
+invocationFailoverCount, isIdempotentOrAtMostOnce);
 if (action.action == RetryAction.RetryDecision.FAIL) {
   if (action.reason != null) {
 LOG.warn("Exception while invoking " + 
@@ -168,7 +173,7 @@ class RetryInvocationHandler implements 
 }
   }
   
-  private Object invokeMethod(Method method, Object[] args) throws Throwable {
+  protected Object invokeMethod(Method method, Object[] args) throws Throwable {
 try {
   if (!method.isAccessible()) {
 method.setAccessible(true);

Modified: 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/RetryPolicies.java
URL: 
ht

svn commit: r1507417 - in /hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common: CHANGES.txt src/main/java/org/apache/hadoop/ipc/RetryCache.java

2013-07-26 Thread suresh
Author: suresh
Date: Fri Jul 26 20:04:45 2013
New Revision: 1507417

URL: http://svn.apache.org/r1507417
Log:
HADOOP-9770. Merge 1507416 from branch-2.

Modified:

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RetryCache.java

Modified: 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1507417&r1=1507416&r2=1507417&view=diff
==
--- 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt
 (original)
+++ 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt
 Fri Jul 26 20:04:45 2013
@@ -190,6 +190,8 @@ Release 2.1.0-beta - 2013-07-02
 HADOOP-9760. Move GSet and related classes to common from HDFS.
 (suresh)
 
+HADOOP-9770. Make RetryCache#state non volatile. (suresh)
+
   OPTIMIZATIONS
 
 HADOOP-9150. Avoid unnecessary DNS resolution attempts for logical URIs

Modified: 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RetryCache.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RetryCache.java?rev=1507417&r1=1507416&r2=1507417&view=diff
==
--- 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RetryCache.java
 (original)
+++ 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RetryCache.java
 Fri Jul 26 20:04:45 2013
@@ -52,7 +52,7 @@ public class RetryCache {
 private static byte SUCCESS = 1;
 private static byte FAILED = 2;
 
-private volatile byte state = INPROGRESS;
+private byte state = INPROGRESS;
 
 // Store uuid as two long for better memory utilization
 private final long clientIdMsb; // Most signficant bytes
@@ -63,8 +63,10 @@ public class RetryCache {
 private LightWeightGSet.LinkedElement next;
 
 CacheEntry(byte[] clientId, int callId, long expirationTime) {
-  Preconditions.checkArgument(clientId.length == 16, "Invalid clientId");
-  // Conver UUID bytes to two longs
+  // ClientId must be a UUID - that is 16 octets.
+  Preconditions.checkArgument(clientId.length == 16,
+  "Invalid clientId - must be UUID of size 16 octets");
+  // Convert UUID bytes to two longs
   long tmp = 0;
   for (int i=0; i<8; i++) {
 tmp = (tmp << 8) | (clientId[i] & 0xff);
@@ -116,7 +118,7 @@ public class RetryCache {
   this.notifyAll();
 }
 
-public boolean isSuccess() {
+public synchronized boolean isSuccess() {
   return state == SUCCESS;
 }
 
@@ -241,13 +243,13 @@ public class RetryCache {
 
   private static CacheEntry newEntry(long expirationTime) {
 return new CacheEntry(Server.getClientId(), Server.getCallId(),
-expirationTime);
+System.nanoTime() + expirationTime);
   }
 
   private static CacheEntryWithPayload newEntry(Object payload,
   long expirationTime) {
 return new CacheEntryWithPayload(Server.getClientId(), Server.getCallId(),
-payload, expirationTime);
+payload, System.nanoTime() + expirationTime);
   }
 
   /** Static method that provides null check for retryCache */
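
The CacheEntry constructor in the diff above validates that the client id is a 16-octet UUID and packs it into two longs to save memory. A sketch of that round trip, combining the packing loop from CacheEntry with the UUID-to-bytes direction that StringUtils uses (ByteBuffer + putLong); the class and method names here are illustrative, not Hadoop's:

```java
import java.nio.ByteBuffer;
import java.util.UUID;

public class ClientIdPacking {
  // Packs a 16-byte client id into two big-endian longs, as
  // RetryCache.CacheEntry does for better memory utilization.
  static long[] toTwoLongs(byte[] clientId) {
    if (clientId.length != 16) {
      throw new IllegalArgumentException(
          "Invalid clientId - must be UUID of size 16 octets");
    }
    long msb = 0, lsb = 0;
    for (int i = 0; i < 8; i++) {
      msb = (msb << 8) | (clientId[i] & 0xff);
    }
    for (int i = 8; i < 16; i++) {
      lsb = (lsb << 8) | (clientId[i] & 0xff);
    }
    return new long[] { msb, lsb };
  }

  public static void main(String[] args) {
    UUID uuid = UUID.randomUUID();
    // Serialize the UUID as 16 big-endian bytes (ByteBuffer default order).
    ByteBuffer buf = ByteBuffer.allocate(16);
    buf.putLong(uuid.getMostSignificantBits());
    buf.putLong(uuid.getLeastSignificantBits());
    long[] packed = toTwoLongs(buf.array());
    // The round trip recovers both halves of the original UUID.
    System.out.println(packed[0] == uuid.getMostSignificantBits()
        && packed[1] == uuid.getLeastSignificantBits()); // true
  }
}
```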




svn commit: r1507416 - in /hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common: CHANGES.txt src/main/java/org/apache/hadoop/ipc/RetryCache.java

2013-07-26 Thread suresh
Author: suresh
Date: Fri Jul 26 20:02:35 2013
New Revision: 1507416

URL: http://svn.apache.org/r1507416
Log:
HADOOP-9770. Merge 1507414 from trunk.

Modified:

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RetryCache.java

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1507416&r1=1507415&r2=1507416&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt 
(original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt 
Fri Jul 26 20:02:35 2013
@@ -223,6 +223,8 @@ Release 2.1.0-beta - 2013-07-02
 HADOOP-9760. Move GSet and related classes to common from HDFS.
 (suresh)
 
+HADOOP-9770. Make RetryCache#state non volatile. (suresh)
+
   OPTIMIZATIONS
 
 HADOOP-9150. Avoid unnecessary DNS resolution attempts for logical URIs

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RetryCache.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RetryCache.java?rev=1507416&r1=1507415&r2=1507416&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RetryCache.java
 (original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RetryCache.java
 Fri Jul 26 20:02:35 2013
@@ -52,7 +52,7 @@ public class RetryCache {
 private static byte SUCCESS = 1;
 private static byte FAILED = 2;
 
-private volatile byte state = INPROGRESS;
+private byte state = INPROGRESS;
 
 // Store uuid as two long for better memory utilization
 private final long clientIdMsb; // Most signficant bytes
@@ -63,8 +63,10 @@ public class RetryCache {
 private LightWeightGSet.LinkedElement next;
 
 CacheEntry(byte[] clientId, int callId, long expirationTime) {
-  Preconditions.checkArgument(clientId.length == 16, "Invalid clientId");
-  // Conver UUID bytes to two longs
+  // ClientId must be a UUID - that is 16 octets.
+  Preconditions.checkArgument(clientId.length == 16,
+  "Invalid clientId - must be UUID of size 16 octets");
+  // Convert UUID bytes to two longs
   long tmp = 0;
   for (int i=0; i<8; i++) {
 tmp = (tmp << 8) | (clientId[i] & 0xff);
@@ -116,7 +118,7 @@ public class RetryCache {
   this.notifyAll();
 }
 
-public boolean isSuccess() {
+public synchronized boolean isSuccess() {
   return state == SUCCESS;
 }
 
@@ -241,13 +243,13 @@ public class RetryCache {
 
   private static CacheEntry newEntry(long expirationTime) {
 return new CacheEntry(Server.getClientId(), Server.getCallId(),
-expirationTime);
+System.nanoTime() + expirationTime);
   }
 
   private static CacheEntryWithPayload newEntry(Object payload,
   long expirationTime) {
 return new CacheEntryWithPayload(Server.getClientId(), Server.getCallId(),
-payload, expirationTime);
+payload, System.nanoTime() + expirationTime);
   }
 
   /** Static method that provides null check for retryCache */




svn commit: r1507414 - in /hadoop/common/trunk/hadoop-common-project/hadoop-common: CHANGES.txt src/main/java/org/apache/hadoop/ipc/RetryCache.java

2013-07-26 Thread suresh
Author: suresh
Date: Fri Jul 26 19:59:06 2013
New Revision: 1507414

URL: http://svn.apache.org/r1507414
Log:
HADOOP-9770. Make RetryCache#state non volatile. Contributed by Suresh Srinivas.

Modified:
hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RetryCache.java

Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1507414&r1=1507413&r2=1507414&view=diff
==
--- hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt 
(original)
+++ hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt Fri Jul 
26 19:59:06 2013
@@ -496,6 +496,8 @@ Release 2.1.0-beta - 2013-07-02
 HADOOP-9756. Remove the deprecated getServer(..) methods from RPC.
 (Junping Du via szetszwo)
 
+HADOOP-9770. Make RetryCache#state non volatile. (suresh)
+
   OPTIMIZATIONS
 
 HADOOP-9150. Avoid unnecessary DNS resolution attempts for logical URIs

Modified: 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RetryCache.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RetryCache.java?rev=1507414&r1=1507413&r2=1507414&view=diff
==
--- 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RetryCache.java
 (original)
+++ 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RetryCache.java
 Fri Jul 26 19:59:06 2013
@@ -52,7 +52,7 @@ public class RetryCache {
 private static byte SUCCESS = 1;
 private static byte FAILED = 2;
 
-private volatile byte state = INPROGRESS;
+private byte state = INPROGRESS;
 
 // Store uuid as two long for better memory utilization
 private final long clientIdMsb; // Most signficant bytes
@@ -63,8 +63,10 @@ public class RetryCache {
 private LightWeightGSet.LinkedElement next;
 
 CacheEntry(byte[] clientId, int callId, long expirationTime) {
-  Preconditions.checkArgument(clientId.length == 16, "Invalid clientId");
-  // Conver UUID bytes to two longs
+  // ClientId must be a UUID - that is 16 octets.
+  Preconditions.checkArgument(clientId.length == 16,
+  "Invalid clientId - must be UUID of size 16 octets");
+  // Convert UUID bytes to two longs
   long tmp = 0;
   for (int i=0; i<8; i++) {
 tmp = (tmp << 8) | (clientId[i] & 0xff);
@@ -116,7 +118,7 @@ public class RetryCache {
   this.notifyAll();
 }
 
-public boolean isSuccess() {
+public synchronized boolean isSuccess() {
   return state == SUCCESS;
 }
 
@@ -241,13 +243,13 @@ public class RetryCache {
 
   private static CacheEntry newEntry(long expirationTime) {
 return new CacheEntry(Server.getClientId(), Server.getCallId(),
-expirationTime);
+System.nanoTime() + expirationTime);
   }
 
   private static CacheEntryWithPayload newEntry(Object payload,
   long expirationTime) {
 return new CacheEntryWithPayload(Server.getClientId(), Server.getCallId(),
-payload, expirationTime);
+payload, System.nanoTime() + expirationTime);
   }
 
   /** Static method that provides null check for retryCache */
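
Besides dropping volatile on state (safe because the reads and writes are now guarded by synchronized), the HADOOP-9770 patch anchors the expiration time to System.nanoTime() at entry creation, so the caller passes a relative TTL. A minimal sketch of that pattern, under the assumption (mine, not stated in the diff) that staleness is later decided by comparing against System.nanoTime(); the class name is hypothetical:

```java
public class ExpiringEntry {
  private final long expirationTimeNanos;

  // The patch adds the System.nanoTime() base at creation time, so the
  // expirationTime argument becomes a relative TTL in nanoseconds.
  ExpiringEntry(long ttlNanos) {
    this.expirationTimeNanos = System.nanoTime() + ttlNanos;
  }

  boolean isExpired() {
    // Subtraction-based comparison stays correct even if nanoTime wraps.
    return System.nanoTime() - expirationTimeNanos > 0;
  }

  public static void main(String[] args) {
    ExpiringEntry fresh = new ExpiringEntry(10_000_000_000L); // 10 s TTL
    ExpiringEntry past = new ExpiringEntry(-1L);              // already expired
    System.out.println(fresh.isExpired()); // false
    System.out.println(past.isExpired());  // true
  }
}
```

nanoTime is the right clock here because it is monotonic; a wall-clock base (System.currentTimeMillis) could make entries expire early or late when the system clock is adjusted.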




svn commit: r1507196 - /hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/StringUtils.java

2013-07-25 Thread suresh
Author: suresh
Date: Fri Jul 26 05:56:47 2013
New Revision: 1507196

URL: http://svn.apache.org/r1507196
Log:
HDFS-5016. Merge 1507191 from branch-2

Modified:

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/StringUtils.java

Modified: 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/StringUtils.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/StringUtils.java?rev=1507196&r1=1507195&r2=1507196&view=diff
==
--- 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/StringUtils.java
 (original)
+++ 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/StringUtils.java
 Fri Jul 26 05:56:47 2013
@@ -907,4 +907,16 @@ public class StringUtils {
 buf.putLong(uuid.getLeastSignificantBits());
 return buf.array();
   }
+  
+  /**
+   * Get stack trace for a given thread.
+   */
+  public static String getStackTrace(Thread t) {
+final StackTraceElement[] stackTrace = t.getStackTrace();
+StringBuilder str = new StringBuilder();
+for (StackTraceElement e : stackTrace) {
+  str.append(e.toString() + "\n");
+}
+return str.toString();
+  }
 }




svn commit: r1507191 - /hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/StringUtils.java

2013-07-25 Thread suresh
Author: suresh
Date: Fri Jul 26 04:47:25 2013
New Revision: 1507191

URL: http://svn.apache.org/r1507191
Log:
HDFS-5016. Merge 1507189 from trunk

Modified:

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/StringUtils.java

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/StringUtils.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/StringUtils.java?rev=1507191&r1=1507190&r2=1507191&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/StringUtils.java
 (original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/StringUtils.java
 Fri Jul 26 04:47:25 2013
@@ -907,4 +907,16 @@ public class StringUtils {
 buf.putLong(uuid.getLeastSignificantBits());
 return buf.array();
   }
+  
+  /**
+   * Get stack trace for a given thread.
+   */
+  public static String getStackTrace(Thread t) {
+final StackTraceElement[] stackTrace = t.getStackTrace();
+StringBuilder str = new StringBuilder();
+for (StackTraceElement e : stackTrace) {
+  str.append(e.toString() + "\n");
+}
+return str.toString();
+  }
 }




svn commit: r1507189 - /hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/StringUtils.java

2013-07-25 Thread suresh
Author: suresh
Date: Fri Jul 26 04:42:41 2013
New Revision: 1507189

URL: http://svn.apache.org/r1507189
Log:
HDFS-5016. Deadlock in pipeline recovery causes Datanode to be marked dead. 
Contributed by Suresh Srinivas.

Modified:

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/StringUtils.java

Modified: 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/StringUtils.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/StringUtils.java?rev=1507189&r1=1507188&r2=1507189&view=diff
==
--- 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/StringUtils.java
 (original)
+++ 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/StringUtils.java
 Fri Jul 26 04:42:41 2013
@@ -907,4 +907,16 @@ public class StringUtils {
 buf.putLong(uuid.getLeastSignificantBits());
 return buf.array();
   }
+  
+  /**
+   * Get stack trace for a given thread.
+   */
+  public static String getStackTrace(Thread t) {
+final StackTraceElement[] stackTrace = t.getStackTrace();
+StringBuilder str = new StringBuilder();
+for (StackTraceElement e : stackTrace) {
+  str.append(e.toString() + "\n");
+}
+return str.toString();
+  }
 }
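
The getStackTrace(Thread) helper added above supports the HDFS-5016 deadlock diagnosis: it renders any thread's stack as one string suitable for logging. A self-contained re-implementation for local experimentation (the demo class name and the all-threads dump in main are illustrative, not part of the patch):

```java
public class StackTraceDemo {
  // Same shape as the StringUtils helper: one frame per line.
  public static String getStackTrace(Thread t) {
    StringBuilder str = new StringBuilder();
    for (StackTraceElement e : t.getStackTrace()) {
      str.append(e.toString()).append('\n');
    }
    return str.toString();
  }

  public static void main(String[] args) {
    // Dump every live thread - roughly what a deadlock diagnostic would log.
    for (Thread t : Thread.getAllStackTraces().keySet()) {
      System.out.println("Thread: " + t.getName());
      System.out.println(getStackTrace(t));
    }
  }
}
```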




svn commit: r1507176 - /hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RetryCache.java

2013-07-25 Thread suresh
Author: suresh
Date: Fri Jul 26 01:34:18 2013
New Revision: 1507176

URL: http://svn.apache.org/r1507176
Log:
HDFS-4979. Merge 1507173 from branch-2

Modified:

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RetryCache.java

Modified: 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RetryCache.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RetryCache.java?rev=1507176&r1=1507175&r2=1507176&view=diff
==
--- 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RetryCache.java
 (original)
+++ 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RetryCache.java
 Fri Jul 26 01:34:18 2013
@@ -290,4 +290,4 @@ public class RetryCache {
   cache.set.clear();
 }
   }
-}
\ No newline at end of file
+}




svn commit: r1507173 - /hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RetryCache.java

2013-07-25 Thread suresh
Author: suresh
Date: Fri Jul 26 01:19:25 2013
New Revision: 1507173

URL: http://svn.apache.org/r1507173
Log:
HDFS-4979. Merge 1507170 from trunk

Modified:

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RetryCache.java

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RetryCache.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RetryCache.java?rev=1507173&r1=1507172&r2=1507173&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RetryCache.java
 (original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RetryCache.java
 Fri Jul 26 01:19:25 2013
@@ -290,4 +290,4 @@ public class RetryCache {
   cache.set.clear();
 }
   }
-}
\ No newline at end of file
+}




svn commit: r1507170 - /hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RetryCache.java

2013-07-25 Thread suresh
Author: suresh
Date: Fri Jul 26 01:09:27 2013
New Revision: 1507170

URL: http://svn.apache.org/r1507170
Log:
HDFS-4979. Implement retry cache on Namenode. Contributed by Suresh Srinivas.

Modified:

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RetryCache.java

Modified: 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RetryCache.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RetryCache.java?rev=1507170&r1=1507169&r2=1507170&view=diff
==
--- 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RetryCache.java
 (original)
+++ 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RetryCache.java
 Fri Jul 26 01:09:27 2013
@@ -290,4 +290,4 @@ public class RetryCache {
   cache.set.clear();
 }
   }
-}
\ No newline at end of file
+}




svn commit: r1505914 - in /hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop: ha/ io/retry/ security/ security/authorize/ tools/

2013-07-22 Thread suresh
Author: suresh
Date: Tue Jul 23 06:48:31 2013
New Revision: 1505914

URL: http://svn.apache.org/r1505914
Log:
HDFS-4974. Merge 1505912 from branch-2

Added:

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/AtMostOnce.java
  - copied unchanged from r1505912, 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/AtMostOnce.java
Modified:

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ha/HAServiceProtocol.java

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/RefreshUserMappingsProtocol.java

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/authorize/RefreshAuthorizationPolicyProtocol.java

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/tools/GetUserMappingsProtocol.java

Modified: 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ha/HAServiceProtocol.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ha/HAServiceProtocol.java?rev=1505914&r1=1505913&r2=1505914&view=diff
==
--- 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ha/HAServiceProtocol.java
 (original)
+++ 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ha/HAServiceProtocol.java
 Tue Jul 23 06:48:31 2013
@@ -20,6 +20,7 @@ package org.apache.hadoop.ha;
 import org.apache.hadoop.classification.InterfaceAudience;
 import org.apache.hadoop.classification.InterfaceStability;
 import org.apache.hadoop.fs.CommonConfigurationKeys;
+import org.apache.hadoop.io.retry.Idempotent;
 import org.apache.hadoop.security.AccessControlException;
 import org.apache.hadoop.security.KerberosInfo;
 
@@ -106,6 +107,7 @@ public interface HAServiceProtocol {
* @throws IOException
*   if other errors happen
*/
+  @Idempotent
   public void monitorHealth() throws HealthCheckFailedException,
  AccessControlException,
  IOException;
@@ -121,6 +123,7 @@ public interface HAServiceProtocol {
* @throws IOException
*   if other errors happen
*/
+  @Idempotent
   public void transitionToActive(StateChangeRequestInfo reqInfo)
throws ServiceFailedException,
   AccessControlException,
@@ -137,6 +140,7 @@ public interface HAServiceProtocol {
* @throws IOException
*   if other errors happen
*/
+  @Idempotent
   public void transitionToStandby(StateChangeRequestInfo reqInfo)
 throws ServiceFailedException,
AccessControlException,
@@ -152,6 +156,7 @@ public interface HAServiceProtocol {
* @throws IOException
*   if other errors happen
*/
+  @Idempotent
   public HAServiceStatus getServiceStatus() throws AccessControlException,
IOException;
 }
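The diff above tags each `HAServiceProtocol` method with `@Idempotent` so the retry layer can tell which calls are safe to re-issue. A minimal, self-contained sketch of that pattern — a marker annotation with runtime retention, checked reflectively — is below. The annotation and interface names here are illustrative stand-ins, not Hadoop's real classes.

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.Method;

// Sketch: a marker annotation like @Idempotent must have RUNTIME retention
// so a retry policy can discover it via reflection on the protocol interface.
public class IdempotentSketch {
  @Retention(RetentionPolicy.RUNTIME)
  @Target(ElementType.METHOD)
  @interface Idempotent {}

  // Hypothetical protocol interface standing in for HAServiceProtocol.
  interface DemoProtocol {
    @Idempotent
    void monitorHealth();

    void nonIdempotentOp();
  }

  // Look up the method on the interface and check for the marker annotation,
  // mirroring what a retry handler does before re-issuing a failed call.
  static boolean isIdempotent(Class<?> iface, String methodName) throws Exception {
    Method m = iface.getMethod(methodName);
    return m.isAnnotationPresent(Idempotent.class);
  }

  public static void main(String[] args) throws Exception {
    System.out.println(isIdempotent(DemoProtocol.class, "monitorHealth"));   // true
    System.out.println(isIdempotent(DemoProtocol.class, "nonIdempotentOp")); // false
  }
}
```

Note the `RUNTIME` retention: with the default `CLASS` retention the annotation would be invisible to `isAnnotationPresent` and every method would look non-idempotent.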

Modified: 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/RefreshUserMappingsProtocol.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/RefreshUserMappingsProtocol.java?rev=1505914&r1=1505913&r2=1505914&view=diff
==
--- 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/RefreshUserMappingsProtocol.java
 (original)
+++ 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/RefreshUserMappingsProtocol.java
 Tue Jul 23 06:48:31 2013
@@ -22,6 +22,7 @@ import java.io.IOException;
 import org.apache.hadoop.classification.InterfaceAudience;
 import org.apache.hadoop.classification.InterfaceStability;
 import org.apache.hadoop.fs.CommonConfigurationKeys;
+import org.apache.hadoop.io.retry.Idempotent;
 import org.apache.hadoop.security.KerberosInfo;
 
 /**
@@ -43,12 +44,13 @@ public interface RefreshUserMappingsProt
* Refresh user to group mappings.
* @throws IOException
*/
+  @Idempotent
   public void refreshUserToGroupsMappings() throws IOException;
   
   /**
* Refresh superuser proxy group list
* @throws IOException
*/
-  public void refreshSupe

svn commit: r1505912 - in /hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop: ha/ io/retry/ security/ security/authorize/ tools/

2013-07-22 Thread suresh
Author: suresh
Date: Tue Jul 23 06:44:47 2013
New Revision: 1505912

URL: http://svn.apache.org/r1505912
Log:
HDFS-4974. Merge 1505911 from trunk

Added:

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/AtMostOnce.java
  - copied unchanged from r1505911, 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/AtMostOnce.java
Modified:

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ha/HAServiceProtocol.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/RefreshUserMappingsProtocol.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/authorize/RefreshAuthorizationPolicyProtocol.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/tools/GetUserMappingsProtocol.java

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ha/HAServiceProtocol.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ha/HAServiceProtocol.java?rev=1505912&r1=1505911&r2=1505912&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ha/HAServiceProtocol.java
 (original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ha/HAServiceProtocol.java
 Tue Jul 23 06:44:47 2013
@@ -20,6 +20,7 @@ package org.apache.hadoop.ha;
 import org.apache.hadoop.classification.InterfaceAudience;
 import org.apache.hadoop.classification.InterfaceStability;
 import org.apache.hadoop.fs.CommonConfigurationKeys;
+import org.apache.hadoop.io.retry.Idempotent;
 import org.apache.hadoop.security.AccessControlException;
 import org.apache.hadoop.security.KerberosInfo;
 
@@ -106,6 +107,7 @@ public interface HAServiceProtocol {
* @throws IOException
*   if other errors happen
*/
+  @Idempotent
   public void monitorHealth() throws HealthCheckFailedException,
  AccessControlException,
  IOException;
@@ -121,6 +123,7 @@ public interface HAServiceProtocol {
* @throws IOException
*   if other errors happen
*/
+  @Idempotent
   public void transitionToActive(StateChangeRequestInfo reqInfo)
throws ServiceFailedException,
   AccessControlException,
@@ -137,6 +140,7 @@ public interface HAServiceProtocol {
* @throws IOException
*   if other errors happen
*/
+  @Idempotent
   public void transitionToStandby(StateChangeRequestInfo reqInfo)
 throws ServiceFailedException,
AccessControlException,
@@ -152,6 +156,7 @@ public interface HAServiceProtocol {
* @throws IOException
*   if other errors happen
*/
+  @Idempotent
   public HAServiceStatus getServiceStatus() throws AccessControlException,
IOException;
 }

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/RefreshUserMappingsProtocol.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/RefreshUserMappingsProtocol.java?rev=1505912&r1=1505911&r2=1505912&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/RefreshUserMappingsProtocol.java
 (original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/RefreshUserMappingsProtocol.java
 Tue Jul 23 06:44:47 2013
@@ -22,6 +22,7 @@ import java.io.IOException;
 import org.apache.hadoop.classification.InterfaceAudience;
 import org.apache.hadoop.classification.InterfaceStability;
 import org.apache.hadoop.fs.CommonConfigurationKeys;
+import org.apache.hadoop.io.retry.Idempotent;
 import org.apache.hadoop.security.KerberosInfo;
 
 /**
@@ -43,12 +44,13 @@ public interface RefreshUserMappingsProt
* Refresh user to group mappings.
* @throws IOException
*/
+  @Idempotent
   public void refreshUserToGroupsMappings() throws IOException;
   
   /**
* Refresh superuser proxy group list
* @throws IOException
*/
-  public void refreshSuperUserGroupsConfiguration() 
-  throws IOException;
+  @Idempotent
+  public void refreshSuperUserGrou

svn commit: r1505911 - in /hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop: ha/ io/retry/ security/ security/authorize/ tools/

2013-07-22 Thread suresh
Author: suresh
Date: Tue Jul 23 06:40:33 2013
New Revision: 1505911

URL: http://svn.apache.org/r1505911
Log:
HDFS-4974. Add Idempotent and AtMostOnce annotations to namenode protocol 
methods. Contributed by Suresh Srinivas.

Added:

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/AtMostOnce.java
Modified:

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ha/HAServiceProtocol.java

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/RefreshUserMappingsProtocol.java

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/authorize/RefreshAuthorizationPolicyProtocol.java

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/tools/GetUserMappingsProtocol.java

Modified: 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ha/HAServiceProtocol.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ha/HAServiceProtocol.java?rev=1505911&r1=1505910&r2=1505911&view=diff
==
--- 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ha/HAServiceProtocol.java
 (original)
+++ 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ha/HAServiceProtocol.java
 Tue Jul 23 06:40:33 2013
@@ -20,6 +20,7 @@ package org.apache.hadoop.ha;
 import org.apache.hadoop.classification.InterfaceAudience;
 import org.apache.hadoop.classification.InterfaceStability;
 import org.apache.hadoop.fs.CommonConfigurationKeys;
+import org.apache.hadoop.io.retry.Idempotent;
 import org.apache.hadoop.security.AccessControlException;
 import org.apache.hadoop.security.KerberosInfo;
 
@@ -106,6 +107,7 @@ public interface HAServiceProtocol {
* @throws IOException
*   if other errors happen
*/
+  @Idempotent
   public void monitorHealth() throws HealthCheckFailedException,
  AccessControlException,
  IOException;
@@ -121,6 +123,7 @@ public interface HAServiceProtocol {
* @throws IOException
*   if other errors happen
*/
+  @Idempotent
   public void transitionToActive(StateChangeRequestInfo reqInfo)
throws ServiceFailedException,
   AccessControlException,
@@ -137,6 +140,7 @@ public interface HAServiceProtocol {
* @throws IOException
*   if other errors happen
*/
+  @Idempotent
   public void transitionToStandby(StateChangeRequestInfo reqInfo)
 throws ServiceFailedException,
AccessControlException,
@@ -153,6 +157,7 @@ public interface HAServiceProtocol {
*   if other errors happen
* @see HAServiceStatus
*/
+  @Idempotent
   public HAServiceStatus getServiceStatus() throws AccessControlException,
IOException;
 }

Added: 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/AtMostOnce.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/AtMostOnce.java?rev=1505911&view=auto
==
--- 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/AtMostOnce.java
 (added)
+++ 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/AtMostOnce.java
 Tue Jul 23 06:40:33 2013
@@ -0,0 +1,41 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.hadoop.io.retry;
+
+import java.lang.annotation.ElementType;
+import java.lang.annotation.Inherited;
+import java.lang.annotation.Retention;
+import java.lang.annotation.RetentionPolicy;
+import 

svn commit: r1505877 - in /hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common: ./ src/main/java/org/apache/hadoop/util/ src/test/java/org/apache/hadoop/util/

2013-07-22 Thread suresh
Author: suresh
Date: Tue Jul 23 01:57:18 2013
New Revision: 1505877

URL: http://svn.apache.org/r1505877
Log:
HADOOP-9760. Merge 1505876 from branch-2

Added:

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/GSet.java
  - copied unchanged from r1505876, 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/GSet.java

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/GSetByHashMap.java
  - copied unchanged from r1505876, 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/GSetByHashMap.java

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/LightWeightGSet.java
  - copied unchanged from r1505876, 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/LightWeightGSet.java

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/util/TestGSet.java
  - copied unchanged from r1505876, 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/util/TestGSet.java
Modified:

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt

Modified: 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1505877&r1=1505876&r2=1505877&view=diff
==
--- 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt
 (original)
+++ 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt
 Tue Jul 23 01:57:18 2013
@@ -181,6 +181,9 @@ Release 2.1.0-beta - 2013-07-02
 HADOOP-9754. Remove unnecessary "throws IOException/InterruptedException",
 and fix generic and other javac warnings.  (szetszwo)
 
+HADOOP-9760. Move GSet and related classes to common from HDFS.
+(suresh)
+
   OPTIMIZATIONS
 
 HADOOP-9150. Avoid unnecessary DNS resolution attempts for logical URIs




svn commit: r1505876 - in /hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common: ./ src/main/java/org/apache/hadoop/util/ src/test/java/org/apache/hadoop/util/

2013-07-22 Thread suresh
Author: suresh
Date: Tue Jul 23 01:47:09 2013
New Revision: 1505876

URL: http://svn.apache.org/r1505876
Log:
HADOOP-9760. Merge 1505875 from trunk.

Added:

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/GSet.java
  - copied unchanged from r1505875, 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/GSet.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/GSetByHashMap.java
  - copied unchanged from r1505875, 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/GSetByHashMap.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/LightWeightGSet.java
  - copied unchanged from r1505875, 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/LightWeightGSet.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/util/TestGSet.java
  - copied unchanged from r1505875, 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/util/TestGSet.java
Modified:

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1505876&r1=1505875&r2=1505876&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt 
(original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt 
Tue Jul 23 01:47:09 2013
@@ -214,6 +214,9 @@ Release 2.1.0-beta - 2013-07-02
 HADOOP-9754. Remove unnecessary "throws IOException/InterruptedException",
 and fix generic and other javac warnings.  (szetszwo)
 
+HADOOP-9760. Move GSet and related classes to common from HDFS.
+(suresh)
+
   OPTIMIZATIONS
 
 HADOOP-9150. Avoid unnecessary DNS resolution attempts for logical URIs




svn commit: r1505875 - in /hadoop/common/trunk/hadoop-common-project/hadoop-common: ./ src/main/java/org/apache/hadoop/util/ src/test/java/org/apache/hadoop/util/

2013-07-22 Thread suresh
Author: suresh
Date: Tue Jul 23 01:40:58 2013
New Revision: 1505875

URL: http://svn.apache.org/r1505875
Log:
HADOOP-9760. Move GSet and related classes to common from HDFS. Contributed by 
Suresh Srinivas.

Added:

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/GSet.java

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/GSetByHashMap.java

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/LightWeightGSet.java

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/util/TestGSet.java
Modified:
hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt

Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1505875&r1=1505874&r2=1505875&view=diff
==
--- hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt 
(original)
+++ hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt Tue Jul 
23 01:40:58 2013
@@ -484,6 +484,9 @@ Release 2.1.0-beta - 2013-07-02
 HADOOP-9754. Remove unnecessary "throws IOException/InterruptedException",
 and fix generic and other javac warnings.  (szetszwo)
 
+HADOOP-9760. Move GSet and related classes to common from HDFS.
+(suresh)
+
   OPTIMIZATIONS
 
 HADOOP-9150. Avoid unnecessary DNS resolution attempts for logical URIs

Added: 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/GSet.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/GSet.java?rev=1505875&view=auto
==
--- 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/GSet.java
 (added)
+++ 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/GSet.java
 Tue Jul 23 01:40:58 2013
@@ -0,0 +1,86 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.hadoop.util;
+
+import org.apache.hadoop.classification.InterfaceAudience;
+
+/**
+ * A {@link GSet} is a set,
+ * which supports the {@link #get(Object)} operation.
+ * The {@link #get(Object)} operation uses a key to lookup an element.
+ * 
+ * Null element is not supported.
+ * 
+ * @param <K> The type of the keys.
+ * @param <E> The type of the elements, which must be a subclass of the keys.
+ */
+@InterfaceAudience.Private
+public interface GSet<K, E extends K> extends Iterable<E> {
+  /**
+   * @return The size of this set.
+   */
+  int size();
+
+  /**
+   * Does this set contain an element corresponding to the given key?
+   * @param key The given key.
+   * @return true if the given key equals to a stored element.
+   * Otherwise, return false.
+   * @throws NullPointerException if key == null.
+   */
+  boolean contains(K key);
+
+  /**
+   * Return the stored element which is equal to the given key.
+   * This operation is similar to {@link java.util.Map#get(Object)}.
+   * @param key The given key.
+   * @return The stored element if it exists.
+   * Otherwise, return null.
+   * @throws NullPointerException if key == null.
+   */
+  E get(K key);
+
+  /**
+   * Add/replace an element.
+   * If the element does not exist, add it to the set.
+   * Otherwise, replace the existing element.
+   *
+   * Note that this operation
+   * is similar to {@link java.util.Map#put(Object, Object)}
+   * but is different from {@link java.util.Set#add(Object)}
+   * which does not replace the existing element if there is any.
+   *
+   * @param element The element being put.
+   * @return the previous stored element if there is any.
+   * Otherwise, return null.
+   * @throws NullPointerException if element == null.
+   */
+  E put(E element);
+
+  /**
+   * Remove the element corresponding to the given key. 
+   * This operation is similar to {@li
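The `GSet` interface added above is a set whose elements are retrievable by key, with `put` replacing rather than rejecting duplicates. A hedged sketch of the simplest implementation strategy — a `HashMap` keyed by the element itself, roughly what the companion `GSetByHashMap` class does — is below; the class names here are illustrative, not Hadoop's real API.

```java
import java.util.HashMap;
import java.util.Iterator;
import java.util.Map;

// Sketch of the GSet idea: a set supporting get(key), backed by a HashMap.
public class GSetSketch {
  interface GSet<K, E extends K> extends Iterable<E> {
    int size();
    boolean contains(K key);
    E get(K key);
    E put(E element);  // add or replace; returns the previous element or null
    E remove(K key);
  }

  static class HashMapGSet<K, E extends K> implements GSet<K, E> {
    private final Map<K, E> map = new HashMap<>();

    public int size() { return map.size(); }
    public boolean contains(K key) { return map.containsKey(key); }
    public E get(K key) { return map.get(key); }
    // Since E extends K, the element can serve as its own key.
    public E put(E element) { return map.put(element, element); }
    public E remove(K key) { return map.remove(key); }
    public Iterator<E> iterator() { return map.values().iterator(); }
  }

  public static void main(String[] args) {
    GSet<String, String> set = new HashMapGSet<>();
    set.put("block-1");
    System.out.println(set.contains("block-1")); // true
    System.out.println(set.get("block-2"));      // null
  }
}
```

The `E extends K` bound is what lets `put` take only the element: the element is a valid key for itself, so lookups by a lighter-weight key object still find the full element.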

svn commit: r1505592 - in /hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common: ./ src/main/java/org/apache/hadoop/ipc/ src/main/proto/ src/test/java/org/apache/hadoop/ipc/

2013-07-21 Thread suresh
Author: suresh
Date: Mon Jul 22 04:24:26 2013
New Revision: 1505592

URL: http://svn.apache.org/r1505592
Log:
HADOOP-9751. Merge 1505059 from branch-2

Modified:

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Server.java

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/proto/RpcHeader.proto

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/ipc/TestIPC.java

Modified: 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1505592&r1=1505591&r2=1505592&view=diff
==
--- 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt
 (original)
+++ 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt
 Mon Jul 22 04:24:26 2013
@@ -175,6 +175,9 @@ Release 2.1.0-beta - 2013-07-02
 
 HADOOP-9717. Add retry attempt count to the RPC requests. (jing9)
 
+HADOOP-9751. Add clientId and retryCount to RpcResponseHeaderProto.
+(szetszwo)
+
   OPTIMIZATIONS
 
 HADOOP-9150. Avoid unnecessary DNS resolution attempts for logical URIs

Modified: 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java?rev=1505592&r1=1505591&r2=1505592&view=diff
==
--- 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
 (original)
+++ 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
 Mon Jul 22 04:24:26 2013
@@ -33,6 +33,7 @@ import java.net.Socket;
 import java.net.SocketTimeoutException;
 import java.net.UnknownHostException;
 import java.security.PrivilegedExceptionAction;
+import java.util.Arrays;
 import java.util.Hashtable;
 import java.util.Iterator;
 import java.util.Map.Entry;
@@ -221,6 +222,24 @@ public class Client {
 return refCount==0;
   }
 
+  /** Check the rpc response header. */
+  void checkResponse(RpcResponseHeaderProto header) throws IOException {
+if (header == null) {
+  throw new IOException("Response is null.");
+}
+if (header.hasClientId()) {
+  // check client IDs
+  final byte[] id = header.getClientId().toByteArray();
+  if (!Arrays.equals(id, RpcConstants.DUMMY_CLIENT_ID)) {
+if (!Arrays.equals(id, clientId)) {
+  throw new IOException("Client IDs not matched: local ID="
+  + StringUtils.byteToHexString(clientId) + ", ID in response="
+  + 
StringUtils.byteToHexString(header.getClientId().toByteArray()));
+}
+  }
+}
+  }
+
   Call createCall(RPC.RpcKind rpcKind, Writable rpcRequest) {
 return new Call(rpcKind, rpcRequest);
   }
@@ -998,9 +1017,8 @@ public class Client {
 int totalLen = in.readInt();
 RpcResponseHeaderProto header = 
 RpcResponseHeaderProto.parseDelimitedFrom(in);
-if (header == null) {
-  throw new IOException("Response is null.");
-}
+checkResponse(header);
+
 int headerLen = header.getSerializedSize();
 headerLen += CodedOutputStream.computeRawVarint32Size(headerLen);
 

Modified: 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Server.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Server.java?rev=1505592&r1=1505591&r2=1505592&view=diff
==
--- 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Server.java
 (original)
+++ 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Server.java
 Mon Jul 22 04:24:26 2013
@@ -2287,7 +2287,9 @@ public abstract class Server {
 DataOutputStream out = new DataOutputStream(responseBuf);
 RpcResponseHeaderProto.Builder headerBuilder =  
 RpcResponseHeaderProto.newBuilder();
+headerBuilder.setClientId(B

svn commit: r1505590 - in /hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common: ./ src/main/java/org/apache/hadoop/io/retry/ src/main/java/org/apache/hadoop/ipc/ src/main/java/o

2013-07-21 Thread suresh
Author: suresh
Date: Mon Jul 22 04:19:28 2013
New Revision: 1505590

URL: http://svn.apache.org/r1505590
Log:
HADOOP-9717. Merge 1505053 from branch-2

Modified:

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/RetryInvocationHandler.java

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RpcConstants.java

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Server.java

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/SaslRpcClient.java

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/ProtoUtil.java

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/proto/RpcHeader.proto

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/ipc/TestIPC.java

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/util/TestProtoUtil.java

Modified: 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1505590&r1=1505589&r2=1505590&view=diff
==
--- 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt
 (original)
+++ 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt
 Mon Jul 22 04:19:28 2013
@@ -173,6 +173,8 @@ Release 2.1.0-beta - 2013-07-02
 HADOOP-9716. Rpc retries should use the same call ID as the original call.
 (szetszwo)
 
+HADOOP-9717. Add retry attempt count to the RPC requests. (jing9)
+
   OPTIMIZATIONS
 
 HADOOP-9150. Avoid unnecessary DNS resolution attempts for logical URIs

Modified: 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/RetryInvocationHandler.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/RetryInvocationHandler.java?rev=1505590&r1=1505589&r2=1505590&view=diff
==
--- 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/RetryInvocationHandler.java
 (original)
+++ 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/RetryInvocationHandler.java
 Mon Jul 22 04:19:28 2013
@@ -36,6 +36,8 @@ import org.apache.hadoop.ipc.RpcConstant
 import org.apache.hadoop.ipc.RpcInvocationHandler;
 import org.apache.hadoop.util.ThreadUtil;
 
+import com.google.common.base.Preconditions;
+
 class RetryInvocationHandler implements RpcInvocationHandler {
   public static final Log LOG = 
LogFactory.getLog(RetryInvocationHandler.class);
   private final FailoverProxyProvider proxyProvider;
@@ -87,7 +89,7 @@ class RetryInvocationHandler implements 
   }
 
   if (isRpc) {
-Client.setCallId(callId);
+Client.setCallIdAndRetryCount(callId, retries);
   }
   try {
 Object ret = invokeMethod(method, args);
@@ -97,8 +99,8 @@ class RetryInvocationHandler implements 
 boolean isMethodIdempotent = proxyProvider.getInterface()
 .getMethod(method.getName(), method.getParameterTypes())
 .isAnnotationPresent(Idempotent.class);
-RetryAction action = policy.shouldRetry(e, retries++, 
invocationFailoverCount,
-isMethodIdempotent);
+RetryAction action = policy.shouldRetry(e, retries++,
+invocationFailoverCount, isMethodIdempotent);
 if (action.action == RetryAction.RetryDecision.FAIL) {
   if (action.reason != null) {
 LOG.warn("Exception while invoking " + 
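The hunk above switches the retry handler from `setCallId(callId)` to `setCallIdAndRetryCount(callId, retries)`: every retry of one logical invocation reuses the original call ID and carries an incrementing retry count, so the server can recognize duplicates. A hedged, minimal sketch of that accounting is below; `Call`, `Rpc`, and `sendWithRetry` are illustrative names, not Hadoop's real API.

```java
// Sketch: retries of one logical call keep the same callId and bump only
// the retryCount, instead of minting a fresh call ID per attempt.
public class RetrySketch {
  static class Call {
    final int callId;  // fixed across all retries of this invocation
    int retryCount;    // 0 on the first attempt, then 1, 2, ...
    Call(int callId) { this.callId = callId; }
  }

  interface Rpc {
    String send(Call c) throws Exception;
  }

  static String sendWithRetry(Rpc rpc, Call call, int maxRetries) throws Exception {
    while (true) {
      try {
        return rpc.send(call);
      } catch (Exception e) {
        if (call.retryCount >= maxRetries) {
          throw e; // out of retries; surface the last failure
        }
        call.retryCount++; // same callId, bumped retry count
      }
    }
  }

  public static void main(String[] args) throws Exception {
    int[] attempts = {0};
    // A flaky endpoint that fails twice, then succeeds.
    Rpc flaky = c -> {
      attempts[0]++;
      if (attempts[0] < 3) throw new Exception("transient");
      return "callId=" + c.callId + " retries=" + c.retryCount;
    };
    System.out.println(sendWithRetry(flaky, new Call(42), 5)); // callId=42 retries=2
  }
}
```

Keeping the call ID stable is what makes server-side duplicate detection (the retry cache work these commits feed into) possible: the `(clientId, callId)` pair identifies the logical operation regardless of how many times it was transmitted.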

Modified: 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java?rev=1505590&r1=1505589&r2=1505590&view=diff
==
--- 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/

svn commit: r1505589 - in /hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common: ./ src/main/java/org/apache/hadoop/io/retry/ src/main/java/org/apache/hadoop/ipc/ src/test/java/o

2013-07-21 Thread suresh
Author: suresh
Date: Mon Jul 22 04:17:22 2013
New Revision: 1505589

URL: http://svn.apache.org/r1505589
Log:
HADOOP-9716. Merge 1505052 from branch-2

Modified:

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/RetryInvocationHandler.java

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/ProtobufRpcEngine.java

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Server.java

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/ipc/TestIPC.java

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/ipc/TestIPCServerResponder.java

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/ipc/TestProtoBufRpc.java

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/ipc/TestRPC.java

Modified: 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1505589&r1=1505588&r2=1505589&view=diff
==
--- 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt
 (original)
+++ 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt
 Mon Jul 22 04:17:22 2013
@@ -170,6 +170,9 @@ Release 2.1.0-beta - 2013-07-02
 HADOOP-9734. Common protobuf definitions for GetUserMappingsProtocol,
 RefreshAuthorizationPolicyProtocol and RefreshUserMappingsProtocol (jlowe)
 
+HADOOP-9716. Rpc retries should use the same call ID as the original call.
+(szetszwo)
+
   OPTIMIZATIONS
 
 HADOOP-9150. Avoid unnecessary DNS resolution attempts for logical URIs

Modified: 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/RetryInvocationHandler.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/RetryInvocationHandler.java?rev=1505589&r1=1505588&r2=1505589&view=diff
==
--- 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/RetryInvocationHandler.java
 (original)
+++ 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/RetryInvocationHandler.java
 Mon Jul 22 04:17:22 2013
@@ -18,8 +18,10 @@
 package org.apache.hadoop.io.retry;
 
 import java.io.IOException;
+import java.lang.reflect.InvocationHandler;
 import java.lang.reflect.InvocationTargetException;
 import java.lang.reflect.Method;
+import java.lang.reflect.Proxy;
 import java.util.Collections;
 import java.util.Map;
 import java.util.concurrent.atomic.AtomicLong;
@@ -27,10 +29,12 @@ import java.util.concurrent.atomic.Atomi
 import org.apache.commons.logging.Log;
 import org.apache.commons.logging.LogFactory;
 import org.apache.hadoop.io.retry.RetryPolicy.RetryAction;
-import org.apache.hadoop.util.ThreadUtil;
+import org.apache.hadoop.ipc.Client;
 import org.apache.hadoop.ipc.Client.ConnectionId;
 import org.apache.hadoop.ipc.RPC;
+import org.apache.hadoop.ipc.RpcConstants;
 import org.apache.hadoop.ipc.RpcInvocationHandler;
+import org.apache.hadoop.util.ThreadUtil;
 
 class RetryInvocationHandler implements RpcInvocationHandler {
  public static final Log LOG = LogFactory.getLog(RetryInvocationHandler.class);
@@ -45,13 +49,13 @@ class RetryInvocationHandler implements 
   private final RetryPolicy defaultPolicy;
  private final Map<String, RetryPolicy> methodNameToPolicyMap;
   private Object currentProxy;
-  
-  public RetryInvocationHandler(FailoverProxyProvider proxyProvider,
+
+  RetryInvocationHandler(FailoverProxyProvider proxyProvider,
   RetryPolicy retryPolicy) {
    this(proxyProvider, retryPolicy, Collections.<String, RetryPolicy>emptyMap());
   }
 
-  public RetryInvocationHandler(FailoverProxyProvider proxyProvider,
+  RetryInvocationHandler(FailoverProxyProvider proxyProvider,
   RetryPolicy defaultPolicy,
      Map<String, RetryPolicy> methodNameToPolicyMap) {
 this.proxyProvider = proxyProvider;
@@ -70,6 +74,8 @@ class RetryInvocationHandler implements 
 
 // The number of times this method invocation has been failed over.
 int invocationFailoverCount = 0;

svn commit: r1505588 - in /hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common: ./ src/main/java/org/apache/hadoop/ipc/ src/test/java/org/apache/hadoop/ipc/

2013-07-21 Thread suresh
Author: suresh
Date: Mon Jul 22 04:11:38 2013
New Revision: 1505588

URL: http://svn.apache.org/r1505588
Log:
HADOOP-9691. Merge 1505042 from branch-2

Modified:

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RpcConstants.java

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Server.java

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/ipc/TestIPC.java

Modified: 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1505588&r1=1505587&r2=1505588&view=diff
==
--- 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt
 (original)
+++ 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt
 Mon Jul 22 04:11:38 2013
@@ -144,6 +144,9 @@ Release 2.1.0-beta - 2013-07-02
 HADOOP-9676.  Make maximum RPC buffer size configurable (Colin Patrick
 McCabe)
 
+HADOOP-9691. RPC clients can generate call ID using AtomicInteger instead of
+synchronizing on the Client instance. (cnauroth)
+
 HADOOP-9661. Allow metrics sources to be extended. (sandyr via tucu)
 
 HADOOP-9370.  Write FSWrapper class to wrap FileSystem and FileContext for

Modified: 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java?rev=1505588&r1=1505587&r2=1505588&view=diff
==
--- 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
 (original)
+++ 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
 Mon Jul 22 04:11:38 2013
@@ -45,6 +45,7 @@ import java.util.concurrent.Future;
 import java.util.concurrent.RejectedExecutionException;
 import java.util.concurrent.TimeUnit;
 import java.util.concurrent.atomic.AtomicBoolean;
+import java.util.concurrent.atomic.AtomicInteger;
 import java.util.concurrent.atomic.AtomicLong;
 
 import javax.net.SocketFactory;
@@ -105,7 +106,7 @@ public class Client {
 new Hashtable();
 
   private Class valueClass;   // class of call values
-  private int counter;// counter for call ids
+  private final AtomicInteger counter = new AtomicInteger(); // call ID sequence
   private AtomicBoolean running = new AtomicBoolean(true); // if client runs
   final private Configuration conf;
 
@@ -218,9 +219,7 @@ public class Client {
 protected Call(RPC.RpcKind rpcKind, Writable param) {
   this.rpcKind = rpcKind;
   this.rpcRequest = param;
-  synchronized (Client.this) {
-this.id = counter++;
-  }
+  this.id = nextCallId();
 }
 
 /** Indicate when the call is complete and the
@@ -1566,4 +1565,18 @@ public class Client {
   return serverPrincipal + "@" + address;
 }
   }  
+
+  /**
+   * Returns the next valid sequential call ID by incrementing an atomic counter
+   * and masking off the sign bit.  Valid call IDs are non-negative integers in
+   * the range [ 0, 2^31 - 1 ].  Negative numbers are reserved for special
+   * purposes.  The values can overflow back to 0 and be reused.  Note that prior
+   * versions of the client did not mask off the sign bit, so a server may still
+   * see a negative call ID if it receives connections from an old client.
+   * 
+   * @return int next valid call ID
+   */
+  private int nextCallId() {
+    return counter.getAndIncrement() & 0x7FFFFFFF;
+  }
 }
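The masked atomic counter in the patch above can be exercised in isolation. Below is a minimal standalone sketch (the class name `CallIdDemo` is ours, not Hadoop's) showing why the `0x7FFFFFFF` mask keeps call IDs non-negative even after the counter wraps past `Integer.MAX_VALUE`:

```java
import java.util.concurrent.atomic.AtomicInteger;

public class CallIdDemo {
    // Same scheme as the patch above: an atomic counter whose sign bit is
    // masked off, so the returned call ID is always in [0, 2^31 - 1].
    static final AtomicInteger counter = new AtomicInteger();

    static int nextCallId() {
        return counter.getAndIncrement() & 0x7FFFFFFF;
    }

    public static void main(String[] args) {
        counter.set(Integer.MAX_VALUE);
        System.out.println(nextCallId()); // 2147483647, the last positive value
        // getAndIncrement() wrapped the counter to Integer.MIN_VALUE;
        // the mask turns 0x80000000 into 0:
        System.out.println(nextCallId()); // 0
        System.out.println(nextCallId()); // 1
    }
}
```

Without the mask, an old client hands the server a negative ID once the counter overflows, which collides with the reserved negative IDs such as CONNECTION_CONTEXT_CALL_ID (-3) and PING_CALL_ID (-1) seen elsewhere in this patch series.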

Modified: 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RpcConstants.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RpcConstants.java?rev=1505588&r1=1505587&r2=1505588&view=diff
==
--- 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RpcConstants.java
 (original)
+++ 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src

svn commit: r1505587 - in /hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common: CHANGES.txt src/main/java/org/apache/hadoop/ipc/Client.java

2013-07-21 Thread suresh
Author: suresh
Date: Mon Jul 22 04:05:54 2013
New Revision: 1505587

URL: http://svn.apache.org/r1505587
Log:
HADOOP-9720. Merge 1505040 from branch-2

Modified:

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java

Modified: 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1505587&r1=1505586&r2=1505587&view=diff
==
--- 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt
 (original)
+++ 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt
 Mon Jul 22 04:05:54 2013
@@ -161,6 +161,9 @@ Release 2.1.0-beta - 2013-07-02
 HADOOP-9416.  Add new symlink resolution methods in FileSystem and
 FileSystemLinkResolver.  (Andrew Wang via Colin Patrick McCabe)
 
+HADOOP-9720. Rename Client#uuid to Client#clientId. (Arpit Agarwal via
+suresh)
+
 HADOOP-9734. Common protobuf definitions for GetUserMappingsProtocol,
 RefreshAuthorizationPolicyProtocol and RefreshUserMappingsProtocol (jlowe)
 

Modified: 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java?rev=1505587&r1=1505586&r2=1505587&view=diff
==
--- 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
 (original)
+++ 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
 Mon Jul 22 04:05:54 2013
@@ -115,7 +115,7 @@ public class Client {
   private final int connectionTimeout;
 
   private final boolean fallbackAllowed;
-  private final byte[] uuid;
+  private final byte[] clientId;
   
   final static int CONNECTION_CONTEXT_CALL_ID = -3;
   
@@ -788,9 +788,10 @@ public class Client {
   RPC.getProtocolName(remoteId.getProtocol()),
   remoteId.getTicket(),
   authMethod);
-  RpcRequestHeaderProto connectionContextHeader =
-  ProtoUtil.makeRpcRequestHeader(RpcKind.RPC_PROTOCOL_BUFFER,
-  OperationProto.RPC_FINAL_PACKET, CONNECTION_CONTEXT_CALL_ID, uuid);
+  RpcRequestHeaderProto connectionContextHeader = ProtoUtil
+  .makeRpcRequestHeader(RpcKind.RPC_PROTOCOL_BUFFER,
+  OperationProto.RPC_FINAL_PACKET, CONNECTION_CONTEXT_CALL_ID,
+  clientId);
   RpcRequestMessageWrapper request =
   new RpcRequestMessageWrapper(connectionContextHeader, message);
   
@@ -897,7 +898,7 @@ public class Client {
   // Items '1' and '2' are prepared here. 
   final DataOutputBuffer d = new DataOutputBuffer();
   RpcRequestHeaderProto header = ProtoUtil.makeRpcRequestHeader(
- call.rpcKind, OperationProto.RPC_FINAL_PACKET, call.id, uuid);
+ call.rpcKind, OperationProto.RPC_FINAL_PACKET, call.id, clientId);
   header.writeDelimitedTo(d);
   call.rpcRequest.write(d);
 
@@ -1097,7 +1098,7 @@ public class Client {
 CommonConfigurationKeys.IPC_CLIENT_CONNECT_TIMEOUT_DEFAULT);
 this.fallbackAllowed = 
conf.getBoolean(CommonConfigurationKeys.IPC_CLIENT_FALLBACK_TO_SIMPLE_AUTH_ALLOWED_KEY,
 
CommonConfigurationKeys.IPC_CLIENT_FALLBACK_TO_SIMPLE_AUTH_ALLOWED_DEFAULT);
-this.uuid = StringUtils.getUuidBytes();
+this.clientId = StringUtils.getUuidBytes();
   }
 
   /**




svn commit: r1505543 - in /hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common: ./ src/main/java/org/apache/hadoop/ipc/ src/main/java/org/apache/hadoop/security/ src/main/java/o

2013-07-21 Thread suresh
Author: suresh
Date: Mon Jul 22 00:12:52 2013
New Revision: 1505543

URL: http://svn.apache.org/r1505543
Log:
HADOOP-9688 merge 1505040 from branch-2

Added:

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RpcConstants.java
  - copied unchanged from r1505030, 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RpcConstants.java
Modified:

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Server.java

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/SaslRpcClient.java

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/ProtoUtil.java

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/StringUtils.java

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/proto/RpcHeader.proto

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/ipc/TestIPC.java

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/ipc/TestProtoBufRpc.java

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/util/TestProtoUtil.java

Modified: 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1505543&r1=1505542&r2=1505543&view=diff
==
--- 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt
 (original)
+++ 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt
 Mon Jul 22 00:12:52 2013
@@ -39,6 +39,8 @@ Release 2.1.0-beta - 2013-07-02
 HADOOP-9421. [RPC v9] Convert SASL to use ProtoBuf and provide
 negotiation capabilities (daryn)
 
+HADOOP-9688. Add globally unique Client ID to RPC requests. (suresh)
+
 HADOOP-9683. [RPC v9] Wrap IpcConnectionContext in RPC headers (daryn)
 
   NEW FEATURES

Modified: 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java?rev=1505543&r1=1505542&r2=1505543&view=diff
==
--- 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
 (original)
+++ 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
 Mon Jul 22 00:12:52 2013
@@ -85,6 +85,7 @@ import org.apache.hadoop.security.token.
 import org.apache.hadoop.security.token.TokenSelector;
 import org.apache.hadoop.util.ProtoUtil;
 import org.apache.hadoop.util.ReflectionUtils;
+import org.apache.hadoop.util.StringUtils;
 import org.apache.hadoop.util.Time;
 
 import com.google.common.util.concurrent.ThreadFactoryBuilder;
@@ -114,9 +115,8 @@ public class Client {
   private final int connectionTimeout;
 
   private final boolean fallbackAllowed;
+  private final byte[] uuid;
   
-  final static int PING_CALL_ID = -1;
-
   final static int CONNECTION_CONTEXT_CALL_ID = -3;
   
   /**
@@ -762,8 +762,8 @@ public class Client {
 throws IOException {
   DataOutputStream out = new DataOutputStream(new 
BufferedOutputStream(outStream));
   // Write out the header, version and authentication method
-  out.write(Server.HEADER.array());
-  out.write(Server.CURRENT_VERSION);
+  out.write(RpcConstants.HEADER.array());
+  out.write(RpcConstants.CURRENT_VERSION);
   out.write(serviceClass);
   final AuthProtocol authProtocol;
   switch (authMethod) {
@@ -790,7 +790,7 @@ public class Client {
   authMethod);
   RpcRequestHeaderProto connectionContextHeader =
   ProtoUtil.makeRpcRequestHeader(RpcKind.RPC_PROTOCOL_BUFFER,
-  OperationProto.RPC_FINAL_PACKET, CONNECTION_CONTEXT_CALL_ID);
+  OperationProto.RPC_FINAL_PACKET, CONNECTION_CONTEXT_CALL_ID, uuid);
   RpcRequestMessageWrapper request =
   new RpcRequestMessageWrapper(connectionContextHeader, message);
   
@@ -842,7 +842,7 @@ public 

svn commit: r1505039 - in /hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common: CHANGES.txt src/main/java/org/apache/hadoop/ipc/Client.java

2013-07-19 Thread suresh
Author: suresh
Date: Fri Jul 19 21:52:56 2013
New Revision: 1505039

URL: http://svn.apache.org/r1505039
Log:
Revert r1505037 to fix the commit text

Modified:

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1505039&r1=1505038&r2=1505039&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt 
(original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt 
Fri Jul 19 21:52:56 2013
@@ -191,9 +191,6 @@ Release 2.1.0-beta - 2013-07-02
 HADOOP-9416.  Add new symlink resolution methods in FileSystem and
 FileSystemLinkResolver.  (Andrew Wang via Colin Patrick McCabe)
 
-HADOOP-9720. Rename Client#uuid to Client#clientId. (Arpit Agarwal via
-suresh)
-
 HADOOP-9734. Common protobuf definitions for GetUserMappingsProtocol,
 RefreshAuthorizationPolicyProtocol and RefreshUserMappingsProtocol (jlowe)
 

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java?rev=1505039&r1=1505038&r2=1505039&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
 (original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
 Fri Jul 19 21:52:56 2013
@@ -115,7 +115,7 @@ public class Client {
   private final int connectionTimeout;
 
   private final boolean fallbackAllowed;
-  private final byte[] clientId;
+  private final byte[] uuid;
   
   final static int CONNECTION_CONTEXT_CALL_ID = -3;
   
@@ -841,10 +841,9 @@ public class Client {
   RPC.getProtocolName(remoteId.getProtocol()),
   remoteId.getTicket(),
   authMethod);
-  RpcRequestHeaderProto connectionContextHeader = ProtoUtil
-  .makeRpcRequestHeader(RpcKind.RPC_PROTOCOL_BUFFER,
-  OperationProto.RPC_FINAL_PACKET, CONNECTION_CONTEXT_CALL_ID,
-  clientId);
+  RpcRequestHeaderProto connectionContextHeader =
+  ProtoUtil.makeRpcRequestHeader(RpcKind.RPC_PROTOCOL_BUFFER,
+  OperationProto.RPC_FINAL_PACKET, CONNECTION_CONTEXT_CALL_ID, uuid);
   RpcRequestMessageWrapper request =
   new RpcRequestMessageWrapper(connectionContextHeader, message);
   
@@ -952,7 +951,7 @@ public class Client {
   // Items '1' and '2' are prepared here. 
   final DataOutputBuffer d = new DataOutputBuffer();
   RpcRequestHeaderProto header = ProtoUtil.makeRpcRequestHeader(
- call.rpcKind, OperationProto.RPC_FINAL_PACKET, call.id, clientId);
+ call.rpcKind, OperationProto.RPC_FINAL_PACKET, call.id, uuid);
   header.writeDelimitedTo(d);
   call.rpcRequest.write(d);
 
@@ -1152,7 +1151,7 @@ public class Client {
 CommonConfigurationKeys.IPC_CLIENT_CONNECT_TIMEOUT_DEFAULT);
 this.fallbackAllowed = 
conf.getBoolean(CommonConfigurationKeys.IPC_CLIENT_FALLBACK_TO_SIMPLE_AUTH_ALLOWED_KEY,
 
CommonConfigurationKeys.IPC_CLIENT_FALLBACK_TO_SIMPLE_AUTH_ALLOWED_DEFAULT);
-this.clientId = StringUtils.getUuidBytes();
+this.uuid = StringUtils.getUuidBytes();
 this.sendParamsExecutor = clientExcecutorFactory.refAndGetInstance();
   }
 




svn commit: r1505037 - in /hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common: CHANGES.txt src/main/java/org/apache/hadoop/ipc/Client.java

2013-07-19 Thread suresh
Author: suresh
Date: Fri Jul 19 21:50:15 2013
New Revision: 1505037

URL: http://svn.apache.org/r1505037
Log:
HADOOP-9691. Merge r1501615 from trunk.

Modified:

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1505037&r1=1505036&r2=1505037&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt 
(original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt 
Fri Jul 19 21:50:15 2013
@@ -191,6 +191,9 @@ Release 2.1.0-beta - 2013-07-02
 HADOOP-9416.  Add new symlink resolution methods in FileSystem and
 FileSystemLinkResolver.  (Andrew Wang via Colin Patrick McCabe)
 
+HADOOP-9720. Rename Client#uuid to Client#clientId. (Arpit Agarwal via
+suresh)
+
 HADOOP-9734. Common protobuf definitions for GetUserMappingsProtocol,
 RefreshAuthorizationPolicyProtocol and RefreshUserMappingsProtocol (jlowe)
 

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java?rev=1505037&r1=1505036&r2=1505037&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
 (original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
 Fri Jul 19 21:50:15 2013
@@ -115,7 +115,7 @@ public class Client {
   private final int connectionTimeout;
 
   private final boolean fallbackAllowed;
-  private final byte[] uuid;
+  private final byte[] clientId;
   
   final static int CONNECTION_CONTEXT_CALL_ID = -3;
   
@@ -841,9 +841,10 @@ public class Client {
   RPC.getProtocolName(remoteId.getProtocol()),
   remoteId.getTicket(),
   authMethod);
-  RpcRequestHeaderProto connectionContextHeader =
-  ProtoUtil.makeRpcRequestHeader(RpcKind.RPC_PROTOCOL_BUFFER,
-  OperationProto.RPC_FINAL_PACKET, CONNECTION_CONTEXT_CALL_ID, uuid);
+  RpcRequestHeaderProto connectionContextHeader = ProtoUtil
+  .makeRpcRequestHeader(RpcKind.RPC_PROTOCOL_BUFFER,
+  OperationProto.RPC_FINAL_PACKET, CONNECTION_CONTEXT_CALL_ID,
+  clientId);
   RpcRequestMessageWrapper request =
   new RpcRequestMessageWrapper(connectionContextHeader, message);
   
@@ -951,7 +952,7 @@ public class Client {
   // Items '1' and '2' are prepared here. 
   final DataOutputBuffer d = new DataOutputBuffer();
   RpcRequestHeaderProto header = ProtoUtil.makeRpcRequestHeader(
- call.rpcKind, OperationProto.RPC_FINAL_PACKET, call.id, uuid);
+ call.rpcKind, OperationProto.RPC_FINAL_PACKET, call.id, clientId);
   header.writeDelimitedTo(d);
   call.rpcRequest.write(d);
 
@@ -1151,7 +1152,7 @@ public class Client {
 CommonConfigurationKeys.IPC_CLIENT_CONNECT_TIMEOUT_DEFAULT);
 this.fallbackAllowed = 
conf.getBoolean(CommonConfigurationKeys.IPC_CLIENT_FALLBACK_TO_SIMPLE_AUTH_ALLOWED_KEY,
 
CommonConfigurationKeys.IPC_CLIENT_FALLBACK_TO_SIMPLE_AUTH_ALLOWED_DEFAULT);
-this.uuid = StringUtils.getUuidBytes();
+this.clientId = StringUtils.getUuidBytes();
 this.sendParamsExecutor = clientExcecutorFactory.refAndGetInstance();
   }
 




svn commit: r1505059 - in /hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common: ./ src/main/java/org/apache/hadoop/ipc/ src/main/proto/ src/test/java/org/apache/hadoop/ipc/

2013-07-19 Thread suresh
Author: suresh
Date: Fri Jul 19 22:40:26 2013
New Revision: 1505059

URL: http://svn.apache.org/r1505059
Log:
HADOOP-9751. Merge r1505036 from trunk.

Modified:

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Server.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/proto/RpcHeader.proto

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/ipc/TestIPC.java

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1505059&r1=1505058&r2=1505059&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt 
(original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt 
Fri Jul 19 22:40:26 2013
@@ -205,6 +205,9 @@ Release 2.1.0-beta - 2013-07-02
 
 HADOOP-9717. Add retry attempt count to the RPC requests. (jing9)
 
+HADOOP-9751. Add clientId and retryCount to RpcResponseHeaderProto.
+(szetszwo)
+
   OPTIMIZATIONS
 
 HADOOP-9150. Avoid unnecessary DNS resolution attempts for logical URIs

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java?rev=1505059&r1=1505058&r2=1505059&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
 (original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
 Fri Jul 19 22:40:26 2013
@@ -33,6 +33,7 @@ import java.net.Socket;
 import java.net.SocketTimeoutException;
 import java.net.UnknownHostException;
 import java.security.PrivilegedExceptionAction;
+import java.util.Arrays;
 import java.util.Hashtable;
 import java.util.Iterator;
 import java.util.Map.Entry;
@@ -274,6 +275,24 @@ public class Client {
 return refCount==0;
   }
 
+  /** Check the rpc response header. */
+  void checkResponse(RpcResponseHeaderProto header) throws IOException {
+if (header == null) {
+  throw new IOException("Response is null.");
+}
+if (header.hasClientId()) {
+  // check client IDs
+  final byte[] id = header.getClientId().toByteArray();
+  if (!Arrays.equals(id, RpcConstants.DUMMY_CLIENT_ID)) {
+if (!Arrays.equals(id, clientId)) {
+  throw new IOException("Client IDs not matched: local ID="
+  + StringUtils.byteToHexString(clientId) + ", ID in reponse="
+  + StringUtils.byteToHexString(header.getClientId().toByteArray()));
+}
+  }
+}
+  }
+
   Call createCall(RPC.RpcKind rpcKind, Writable rpcRequest) {
 return new Call(rpcKind, rpcRequest);
   }
@@ -1052,9 +1071,8 @@ public class Client {
 int totalLen = in.readInt();
 RpcResponseHeaderProto header = 
 RpcResponseHeaderProto.parseDelimitedFrom(in);
-if (header == null) {
-  throw new IOException("Response is null.");
-}
+checkResponse(header);
+
 int headerLen = header.getSerializedSize();
 headerLen += CodedOutputStream.computeRawVarint32Size(headerLen);
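The checkResponse logic added above has three cases: the echoed ID matches, the server sent the dummy placeholder, or the IDs differ. A minimal standalone sketch (the names `ClientIdCheckDemo` and `checkClientId` are ours, and the empty byte array is an assumption standing in for `RpcConstants.DUMMY_CLIENT_ID`):

```java
import java.util.Arrays;

public class ClientIdCheckDemo {
    // Assumption: an empty byte[] stands in for RpcConstants.DUMMY_CLIENT_ID,
    // the placeholder a server sends when it has no client ID to echo back.
    static final byte[] DUMMY_CLIENT_ID = new byte[0];

    // Mirrors the dummy/mismatch checks in Client#checkResponse above.
    static void checkClientId(byte[] local, byte[] inResponse) {
        if (Arrays.equals(inResponse, DUMMY_CLIENT_ID)) {
            return; // dummy ID: server did not echo one, skip the check
        }
        if (!Arrays.equals(inResponse, local)) {
            throw new IllegalStateException("Client IDs not matched");
        }
    }

    public static void main(String[] args) {
        byte[] id = {1, 2, 3};
        checkClientId(id, id.clone());      // echoed ID matches: accepted
        checkClientId(id, DUMMY_CLIENT_ID); // dummy ID: check skipped
        try {
            checkClientId(id, new byte[]{9}); // wrong ID: rejected
        } catch (IllegalStateException e) {
            System.out.println("mismatch detected");
        }
    }
}
```

The dummy-ID escape hatch is what lets a new client keep talking to servers that predate HADOOP-9751 and never echo a client ID.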
 

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Server.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Server.java?rev=1505059&r1=1505058&r2=1505059&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Server.java
 (original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Server.java
 Fri Jul 19 22:40:26 2013
@@ -2287,7 +2287,9 @@ public abstract class Server {
 DataOutputStream out = new DataOutputStream(responseBuf);
 RpcResponseHeaderProto.Builder headerBuilder =  
 RpcResponseHeaderProto.newBuilder();
+headerBuilder.setClientId(ByteString.copyFrom(call.clientId));
 headerBuilder.setCallId(call.callId);
+headerBuilder.setRetryCount(call.retryCount);
   

svn commit: r1505009 - in /hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common: ./ src/main/java/org/apache/hadoop/ipc/ src/main/java/org/apache/hadoop/security/ src/main/java/org/apac

2013-07-19 Thread suresh
Author: suresh
Date: Fri Jul 19 20:49:13 2013
New Revision: 1505009

URL: http://svn.apache.org/r1505009
Log:
Revert the merge r1505005

Removed:

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RpcConstants.java
Modified:

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Server.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/SaslRpcClient.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/ProtoUtil.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/StringUtils.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/proto/RpcHeader.proto

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/ipc/TestIPC.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/ipc/TestProtoBufRpc.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/util/TestProtoUtil.java

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1505009&r1=1505008&r2=1505009&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt 
(original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt 
Fri Jul 19 20:49:13 2013
@@ -69,8 +69,6 @@ Release 2.1.0-beta - 2013-07-02
 HADOOP-9421. [RPC v9] Convert SASL to use ProtoBuf and provide
 negotiation capabilities (daryn)
 
-HADOOP-9688. Add globally unique Client ID to RPC requests. (suresh)
-
 HADOOP-9683. [RPC v9] Wrap IpcConnectionContext in RPC headers (daryn)
 
   NEW FEATURES
@@ -191,9 +189,6 @@ Release 2.1.0-beta - 2013-07-02
 HADOOP-9416.  Add new symlink resolution methods in FileSystem and
 FileSystemLinkResolver.  (Andrew Wang via Colin Patrick McCabe)
 
-HADOOP-9720. Rename Client#uuid to Client#clientId. (Arpit Agarwal via
-suresh)
-
 HADOOP-9734. Common protobuf definitions for GetUserMappingsProtocol,
 RefreshAuthorizationPolicyProtocol and RefreshUserMappingsProtocol (jlowe)
 

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java?rev=1505009&r1=1505008&r2=1505009&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
 (original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
 Fri Jul 19 20:49:13 2013
@@ -85,7 +85,6 @@ import org.apache.hadoop.security.token.
 import org.apache.hadoop.security.token.TokenSelector;
 import org.apache.hadoop.util.ProtoUtil;
 import org.apache.hadoop.util.ReflectionUtils;
-import org.apache.hadoop.util.StringUtils;
 import org.apache.hadoop.util.Time;
 
 import com.google.common.util.concurrent.ThreadFactoryBuilder;
@@ -115,7 +114,6 @@ public class Client {
   private final int connectionTimeout;
 
   private final boolean fallbackAllowed;
-  private final byte[] clientId;
   
   final static int PING_CALL_ID = -1;
 
@@ -817,8 +815,8 @@ public class Client {
 throws IOException {
   DataOutputStream out = new DataOutputStream(new 
BufferedOutputStream(outStream));
   // Write out the header, version and authentication method
-  out.write(RpcConstants.HEADER.array());
-  out.write(RpcConstants.CURRENT_VERSION);
+  out.write(Server.HEADER.array());
+  out.write(Server.CURRENT_VERSION);
   out.write(serviceClass);
   final AuthProtocol authProtocol;
   switch (authMethod) {
@@ -897,7 +895,7 @@ public class Client {
   if ( curTime - lastActivity.get() >= pingInterval) {
 lastActivity.set(curTime);
 synchronized (out) {
-  out.writeInt(RpcConstants.PING_CALL_ID);
+  out.writeInt(PING_CALL_ID);
   out.flush();
 }
   }
@@ -953,7 +951,7 @@ public class Client {
   // Items '1' and '2' are prepared here. 
   

svn commit: r1505005 - in /hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common: ./ src/main/java/org/apache/hadoop/ipc/ src/main/java/org/apache/hadoop/security/ src/main/java/org/apac

2013-07-19 Thread suresh
Author: suresh
Date: Fri Jul 19 20:41:10 2013
New Revision: 1505005

URL: http://svn.apache.org/r1505005
Log:
HADOOP-9688 merge r1500843 and r1500847, and HADOOP-9720 merge r1502301 from 
trunk

Added:

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RpcConstants.java
  - copied unchanged from r1500847, 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RpcConstants.java
Modified:

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Server.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/SaslRpcClient.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/ProtoUtil.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/StringUtils.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/proto/RpcHeader.proto

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/ipc/TestIPC.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/ipc/TestProtoBufRpc.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/util/TestProtoUtil.java

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1505005&r1=1505004&r2=1505005&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt 
(original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt 
Fri Jul 19 20:41:10 2013
@@ -69,6 +69,8 @@ Release 2.1.0-beta - 2013-07-02
 HADOOP-9421. [RPC v9] Convert SASL to use ProtoBuf and provide
 negotiation capabilities (daryn)
 
+HADOOP-9688. Add globally unique Client ID to RPC requests. (suresh)
+
 HADOOP-9683. [RPC v9] Wrap IpcConnectionContext in RPC headers (daryn)
 
   NEW FEATURES
@@ -189,6 +191,9 @@ Release 2.1.0-beta - 2013-07-02
 HADOOP-9416.  Add new symlink resolution methods in FileSystem and
 FileSystemLinkResolver.  (Andrew Wang via Colin Patrick McCabe)
 
+HADOOP-9720. Rename Client#uuid to Client#clientId. (Arpit Agarwal via
+suresh)
+
 HADOOP-9734. Common protobuf definitions for GetUserMappingsProtocol,
 RefreshAuthorizationPolicyProtocol and RefreshUserMappingsProtocol (jlowe)
 

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java?rev=1505005&r1=1505004&r2=1505005&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
 (original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
 Fri Jul 19 20:41:10 2013
@@ -85,6 +85,7 @@ import org.apache.hadoop.security.token.
 import org.apache.hadoop.security.token.TokenSelector;
 import org.apache.hadoop.util.ProtoUtil;
 import org.apache.hadoop.util.ReflectionUtils;
+import org.apache.hadoop.util.StringUtils;
 import org.apache.hadoop.util.Time;
 
 import com.google.common.util.concurrent.ThreadFactoryBuilder;
@@ -114,6 +115,7 @@ public class Client {
   private final int connectionTimeout;
 
   private final boolean fallbackAllowed;
+  private final byte[] clientId;
   
   final static int PING_CALL_ID = -1;
 
@@ -815,8 +817,8 @@ public class Client {
 throws IOException {
   DataOutputStream out = new DataOutputStream(new 
BufferedOutputStream(outStream));
   // Write out the header, version and authentication method
-  out.write(Server.HEADER.array());
-  out.write(Server.CURRENT_VERSION);
+  out.write(RpcConstants.HEADER.array());
+  out.write(RpcConstants.CURRENT_VERSION);
   out.write(serviceClass);
   final AuthProtocol authProtocol;
   switch (authMethod) {
@@ -895,7 +897,7 @@ public class Client {
   if ( curTime - lastActivity.get() >= pingInterval) {
 lastActivity.set(curTime);
 synchronized (out) {
-  out.writeInt(PING_CALL_ID);
+  out.writeIn

svn commit: r1505042 - in /hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common: ./ src/main/java/org/apache/hadoop/ipc/ src/test/java/org/apache/hadoop/ipc/

2013-07-19 Thread suresh
Author: suresh
Date: Fri Jul 19 21:59:14 2013
New Revision: 1505042

URL: http://svn.apache.org/r1505042
Log:
HADOOP-9691. Merge r1501615 from trunk.

Modified:

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RpcConstants.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Server.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/ipc/TestIPC.java

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1505042&r1=1505041&r2=1505042&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt 
(original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt 
Fri Jul 19 21:59:14 2013
@@ -174,6 +174,9 @@ Release 2.1.0-beta - 2013-07-02
 HADOOP-9676.  Make maximum RPC buffer size configurable (Colin Patrick
 McCabe)
 
+HADOOP-9691. RPC clients can generate call ID using AtomicInteger instead 
of
+synchronizing on the Client instance. (cnauroth)
+
 HADOOP-9661. Allow metrics sources to be extended. (sandyr via tucu)
 
 HADOOP-9370.  Write FSWrapper class to wrap FileSystem and FileContext for

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java?rev=1505042&r1=1505041&r2=1505042&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
 (original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
 Fri Jul 19 21:59:14 2013
@@ -45,6 +45,7 @@ import java.util.concurrent.Future;
 import java.util.concurrent.RejectedExecutionException;
 import java.util.concurrent.TimeUnit;
 import java.util.concurrent.atomic.AtomicBoolean;
+import java.util.concurrent.atomic.AtomicInteger;
 import java.util.concurrent.atomic.AtomicLong;
 
 import javax.net.SocketFactory;
@@ -105,7 +106,7 @@ public class Client {
 new Hashtable();
 
   private Class valueClass;   // class of call values
-  private int counter;// counter for call ids
+  private final AtomicInteger counter = new AtomicInteger(); // call ID 
sequence
   private AtomicBoolean running = new AtomicBoolean(true); // if client runs
   final private Configuration conf;
 
@@ -271,9 +272,7 @@ public class Client {
 protected Call(RPC.RpcKind rpcKind, Writable param) {
   this.rpcKind = rpcKind;
   this.rpcRequest = param;
-  synchronized (Client.this) {
-this.id = counter++;
-  }
+  this.id = nextCallId();
 }
 
 /** Indicate when the call is complete and the
@@ -1623,4 +1622,18 @@ public class Client {
   return serverPrincipal + "@" + address;
 }
   }  
+
+  /**
+   * Returns the next valid sequential call ID by incrementing an atomic 
counter
+   * and masking off the sign bit.  Valid call IDs are non-negative integers in
+   * the range [ 0, 2^31 - 1 ].  Negative numbers are reserved for special
+   * purposes.  The values can overflow back to 0 and be reused.  Note that 
prior
+   * versions of the client did not mask off the sign bit, so a server may 
still
+   * see a negative call ID if it receives connections from an old client.
+   * 
+   * @return int next valid call ID
+   */
+  private int nextCallId() {
+return counter.getAndIncrement() & 0x7FFFFFFF;
+  }
 }
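The `nextCallId()` hunk above replaces per-call synchronization on the Client instance with a lock-free atomic counter whose sign bit is masked off. A minimal compilable sketch of the same idea follows; the class and `mask` helper names are illustrative, not Hadoop's API:

```java
import java.util.concurrent.atomic.AtomicInteger;

/**
 * Sketch of the HADOOP-9691 call-ID scheme: a shared AtomicInteger replaces
 * "synchronized (Client.this) { id = counter++; }", and clearing the sign
 * bit keeps IDs in [0, 2^31 - 1] even after the counter overflows.
 */
public class CallIdSketch {
  private static final AtomicInteger counter = new AtomicInteger();

  /** Clear the sign bit so an overflowed counter wraps to 0, never negative. */
  static int mask(int raw) {
    return raw & 0x7FFFFFFF;
  }

  static int nextCallId() {
    // getAndIncrement() is lock-free, unlike the old synchronized block.
    return mask(counter.getAndIncrement());
  }

  public static void main(String[] args) {
    counter.set(Integer.MAX_VALUE);        // force the overflow case
    System.out.println(nextCallId());      // 2147483647
    System.out.println(nextCallId());      // 0: MIN_VALUE with sign bit cleared
  }
}
```

Negative values stay reserved for the special IDs seen elsewhere in this thread (PING_CALL_ID = -1, CONNECTION_CONTEXT_CALL_ID = -3).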

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RpcConstants.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RpcConstants.java?rev=1505042&r1=1505041&r2=1505042&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RpcConstants.java
 (original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RpcConstants.java
 Fri Jul 19 21:59:14 2013
@@ -31,6 +31,7 @@ public class RpcConstants {

svn commit: r1505040 - in /hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common: CHANGES.txt src/main/java/org/apache/hadoop/ipc/Client.java

2013-07-19 Thread suresh
Author: suresh
Date: Fri Jul 19 21:55:50 2013
New Revision: 1505040

URL: http://svn.apache.org/r1505040
Log:
HADOOP-9720 merge r1502301 from trunk.

Modified:

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1505040&r1=1505039&r2=1505040&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt 
(original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt 
Fri Jul 19 21:55:50 2013
@@ -191,6 +191,9 @@ Release 2.1.0-beta - 2013-07-02
 HADOOP-9416.  Add new symlink resolution methods in FileSystem and
 FileSystemLinkResolver.  (Andrew Wang via Colin Patrick McCabe)
 
+HADOOP-9720. Rename Client#uuid to Client#clientId. (Arpit Agarwal via
+suresh)
+
 HADOOP-9734. Common protobuf definitions for GetUserMappingsProtocol,
 RefreshAuthorizationPolicyProtocol and RefreshUserMappingsProtocol (jlowe)
 

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java?rev=1505040&r1=1505039&r2=1505040&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
 (original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
 Fri Jul 19 21:55:50 2013
@@ -115,7 +115,7 @@ public class Client {
   private final int connectionTimeout;
 
   private final boolean fallbackAllowed;
-  private final byte[] uuid;
+  private final byte[] clientId;
   
   final static int CONNECTION_CONTEXT_CALL_ID = -3;
   
@@ -841,9 +841,10 @@ public class Client {
   RPC.getProtocolName(remoteId.getProtocol()),
   remoteId.getTicket(),
   authMethod);
-  RpcRequestHeaderProto connectionContextHeader =
-  ProtoUtil.makeRpcRequestHeader(RpcKind.RPC_PROTOCOL_BUFFER,
-  OperationProto.RPC_FINAL_PACKET, CONNECTION_CONTEXT_CALL_ID, 
uuid);
+  RpcRequestHeaderProto connectionContextHeader = ProtoUtil
+  .makeRpcRequestHeader(RpcKind.RPC_PROTOCOL_BUFFER,
+  OperationProto.RPC_FINAL_PACKET, CONNECTION_CONTEXT_CALL_ID,
+  clientId);
   RpcRequestMessageWrapper request =
   new RpcRequestMessageWrapper(connectionContextHeader, message);
   
@@ -951,7 +952,7 @@ public class Client {
   // Items '1' and '2' are prepared here. 
   final DataOutputBuffer d = new DataOutputBuffer();
   RpcRequestHeaderProto header = ProtoUtil.makeRpcRequestHeader(
- call.rpcKind, OperationProto.RPC_FINAL_PACKET, call.id, uuid);
+ call.rpcKind, OperationProto.RPC_FINAL_PACKET, call.id, clientId);
   header.writeDelimitedTo(d);
   call.rpcRequest.write(d);
 
@@ -1151,7 +1152,7 @@ public class Client {
 CommonConfigurationKeys.IPC_CLIENT_CONNECT_TIMEOUT_DEFAULT);
 this.fallbackAllowed = 
conf.getBoolean(CommonConfigurationKeys.IPC_CLIENT_FALLBACK_TO_SIMPLE_AUTH_ALLOWED_KEY,
 
CommonConfigurationKeys.IPC_CLIENT_FALLBACK_TO_SIMPLE_AUTH_ALLOWED_DEFAULT);
-this.uuid = StringUtils.getUuidBytes();
+this.clientId = StringUtils.getUuidBytes();
 this.sendParamsExecutor = clientExcecutorFactory.refAndGetInstance();
   }
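The `clientId` field initialized from `StringUtils.getUuidBytes()` above is a 16-byte globally unique ID attached to every RPC request header (HADOOP-9688). A hedged sketch of how such a byte array can be derived from a random UUID, assuming the packing order (most-significant long first); the class and method names here are illustrative:

```java
import java.nio.ByteBuffer;
import java.util.UUID;

/**
 * Sketch: derive a 16-byte client ID from a random 128-bit UUID,
 * roughly what a helper like StringUtils.getUuidBytes() would do.
 */
public class ClientIdSketch {
  static byte[] uuidBytes() {
    UUID uuid = UUID.randomUUID();
    // Pack the 128-bit UUID into 16 bytes, most-significant long first.
    return ByteBuffer.allocate(16)
        .putLong(uuid.getMostSignificantBits())
        .putLong(uuid.getLeastSignificantBits())
        .array();
  }

  public static void main(String[] args) {
    System.out.println(uuidBytes().length); // 16
  }
}
```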
 




svn commit: r1505052 - in /hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common: ./ src/main/java/org/apache/hadoop/io/retry/ src/main/java/org/apache/hadoop/ipc/ src/test/java/org/apac

2013-07-19 Thread suresh
Author: suresh
Date: Fri Jul 19 22:15:19 2013
New Revision: 1505052

URL: http://svn.apache.org/r1505052
Log:
HADOOP-9716. Merge r1504362 from trunk.

Modified:

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/RetryInvocationHandler.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/ProtobufRpcEngine.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Server.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/ipc/TestIPC.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/ipc/TestIPCServerResponder.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/ipc/TestProtoBufRpc.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/ipc/TestRPC.java

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1505052&r1=1505051&r2=1505052&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt 
(original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt 
Fri Jul 19 22:15:19 2013
@@ -200,6 +200,9 @@ Release 2.1.0-beta - 2013-07-02
 HADOOP-9734. Common protobuf definitions for GetUserMappingsProtocol,
 RefreshAuthorizationPolicyProtocol and RefreshUserMappingsProtocol (jlowe)
 
+HADOOP-9716. Rpc retries should use the same call ID as the original call.
+(szetszwo)
+
   OPTIMIZATIONS
 
 HADOOP-9150. Avoid unnecessary DNS resolution attempts for logical URIs

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/RetryInvocationHandler.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/RetryInvocationHandler.java?rev=1505052&r1=1505051&r2=1505052&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/RetryInvocationHandler.java
 (original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/RetryInvocationHandler.java
 Fri Jul 19 22:15:19 2013
@@ -18,8 +18,10 @@
 package org.apache.hadoop.io.retry;
 
 import java.io.IOException;
+import java.lang.reflect.InvocationHandler;
 import java.lang.reflect.InvocationTargetException;
 import java.lang.reflect.Method;
+import java.lang.reflect.Proxy;
 import java.util.Collections;
 import java.util.Map;
 import java.util.concurrent.atomic.AtomicLong;
@@ -27,10 +29,12 @@ import java.util.concurrent.atomic.Atomi
 import org.apache.commons.logging.Log;
 import org.apache.commons.logging.LogFactory;
 import org.apache.hadoop.io.retry.RetryPolicy.RetryAction;
-import org.apache.hadoop.util.ThreadUtil;
+import org.apache.hadoop.ipc.Client;
 import org.apache.hadoop.ipc.Client.ConnectionId;
 import org.apache.hadoop.ipc.RPC;
+import org.apache.hadoop.ipc.RpcConstants;
 import org.apache.hadoop.ipc.RpcInvocationHandler;
+import org.apache.hadoop.util.ThreadUtil;
 
 class RetryInvocationHandler implements RpcInvocationHandler {
   public static final Log LOG = 
LogFactory.getLog(RetryInvocationHandler.class);
@@ -45,13 +49,13 @@ class RetryInvocationHandler implements 
   private final RetryPolicy defaultPolicy;
   private final Map methodNameToPolicyMap;
   private Object currentProxy;
-  
-  public RetryInvocationHandler(FailoverProxyProvider proxyProvider,
+
+  RetryInvocationHandler(FailoverProxyProvider proxyProvider,
   RetryPolicy retryPolicy) {
 this(proxyProvider, retryPolicy, Collections.emptyMap());
   }
 
-  public RetryInvocationHandler(FailoverProxyProvider proxyProvider,
+  RetryInvocationHandler(FailoverProxyProvider proxyProvider,
   RetryPolicy defaultPolicy,
   Map methodNameToPolicyMap) {
 this.proxyProvider = proxyProvider;
@@ -70,6 +74,8 @@ class RetryInvocationHandler implements 
 
 // The number of times this method invocation has been failed over.
 int invocationFailoverCount = 0;
+final boolean isRpc = isRpcInvocation();
+final int callId = isRpc? Client.nextCallId(): 
RpcConstants.INVALID_CALL_
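The HADOOP-9716 change above picks the call ID once, before the first attempt, so every retry carries the same ID and the server can recognize a retried request. A conceptual sketch of that invariant, using an illustrative functional interface rather than Hadoop's reflection-based invocation:

```java
import java.util.concurrent.atomic.AtomicInteger;
import java.util.function.IntUnaryOperator;

/**
 * Sketch of HADOOP-9716: the call ID is fixed for the whole invocation;
 * each retry reuses it instead of drawing a fresh one.
 */
public class RetrySketch {
  private static final AtomicInteger counter = new AtomicInteger();

  static int nextCallId() {
    return counter.getAndIncrement() & 0x7FFFFFFF;
  }

  /** Invoke op, retrying up to maxRetries times with the SAME call ID. */
  static int invokeWithRetries(IntUnaryOperator op, int maxRetries) {
    final int callId = nextCallId();   // chosen once, before the first attempt
    RuntimeException last = null;
    for (int attempt = 0; attempt <= maxRetries; attempt++) {
      try {
        return op.applyAsInt(callId);  // every attempt carries the same ID
      } catch (RuntimeException e) {
        last = e;                      // remember and retry
      }
    }
    throw last;
  }
}
```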

svn commit: r1505053 - in /hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common: ./ src/main/java/org/apache/hadoop/io/retry/ src/main/java/org/apache/hadoop/ipc/ src/main/java/org/apac

2013-07-19 Thread suresh
Author: suresh
Date: Fri Jul 19 22:20:09 2013
New Revision: 1505053

URL: http://svn.apache.org/r1505053
Log:
HADOOP-9717. Merge r1504725 from trunk.

Modified:

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/RetryInvocationHandler.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RpcConstants.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Server.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/SaslRpcClient.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/ProtoUtil.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/proto/RpcHeader.proto

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/ipc/TestIPC.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/util/TestProtoUtil.java

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1505053&r1=1505052&r2=1505053&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt 
(original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt 
Fri Jul 19 22:20:09 2013
@@ -203,6 +203,8 @@ Release 2.1.0-beta - 2013-07-02
 HADOOP-9716. Rpc retries should use the same call ID as the original call.
 (szetszwo)
 
+HADOOP-9717. Add retry attempt count to the RPC requests. (jing9)
+
   OPTIMIZATIONS
 
 HADOOP-9150. Avoid unnecessary DNS resolution attempts for logical URIs

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/RetryInvocationHandler.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/RetryInvocationHandler.java?rev=1505053&r1=1505052&r2=1505053&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/RetryInvocationHandler.java
 (original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/RetryInvocationHandler.java
 Fri Jul 19 22:20:09 2013
@@ -36,6 +36,8 @@ import org.apache.hadoop.ipc.RpcConstant
 import org.apache.hadoop.ipc.RpcInvocationHandler;
 import org.apache.hadoop.util.ThreadUtil;
 
+import com.google.common.base.Preconditions;
+
 class RetryInvocationHandler implements RpcInvocationHandler {
   public static final Log LOG = 
LogFactory.getLog(RetryInvocationHandler.class);
   private final FailoverProxyProvider proxyProvider;
@@ -87,7 +89,7 @@ class RetryInvocationHandler implements 
   }
 
   if (isRpc) {
-Client.setCallId(callId);
+Client.setCallIdAndRetryCount(callId, retries);
   }
   try {
 Object ret = invokeMethod(method, args);
@@ -97,8 +99,8 @@ class RetryInvocationHandler implements 
 boolean isMethodIdempotent = proxyProvider.getInterface()
 .getMethod(method.getName(), method.getParameterTypes())
 .isAnnotationPresent(Idempotent.class);
-RetryAction action = policy.shouldRetry(e, retries++, 
invocationFailoverCount,
-isMethodIdempotent);
+RetryAction action = policy.shouldRetry(e, retries++,
+invocationFailoverCount, isMethodIdempotent);
 if (action.action == RetryAction.RetryDecision.FAIL) {
   if (action.reason != null) {
 LOG.warn("Exception while invoking " + 

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java?rev=1505053&r1=1505052&r2=1505053&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
 (original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Cl
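The HADOOP-9717 hunk above extends the previous change: before each attempt, `Client.setCallIdAndRetryCount(callId, retries)` stashes both the reused call ID and the attempt's retry count for the RPC layer to place in the request header. A sketch of that hand-off with thread-local state; the class name and field layout are illustrative:

```java
/**
 * Sketch of HADOOP-9717: alongside the reused call ID, each attempt also
 * carries its retry count, set in thread-local state before the RPC is sent.
 */
public class RetryCountSketch {
  private static final ThreadLocal<Integer> callId = new ThreadLocal<>();
  private static final ThreadLocal<Integer> retryCount = new ThreadLocal<>();

  /** Called by the retry handler just before invoking the proxy method. */
  static void setCallIdAndRetryCount(int id, int retries) {
    callId.set(id);
    retryCount.set(retries);
  }

  /** What the RPC layer would read when building the request header. */
  static int[] headerFields() {
    return new int[] { callId.get(), retryCount.get() };
  }
}
```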

svn commit: r1505030 - in /hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common: ./ src/main/java/org/apache/hadoop/ipc/ src/main/java/org/apache/hadoop/security/ src/main/java/org/apac

2013-07-19 Thread suresh
Author: suresh
Date: Fri Jul 19 21:37:43 2013
New Revision: 1505030

URL: http://svn.apache.org/r1505030
Log:
HADOOP-9688 merge r1500843 and r1500847 from trunk.

Added:

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RpcConstants.java
  - copied unchanged from r1500847, 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RpcConstants.java
Modified:

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Server.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/SaslRpcClient.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/ProtoUtil.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/StringUtils.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/proto/RpcHeader.proto

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/ipc/TestIPC.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/ipc/TestProtoBufRpc.java

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/util/TestProtoUtil.java

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1505030&r1=1505029&r2=1505030&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt 
(original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt 
Fri Jul 19 21:37:43 2013
@@ -69,6 +69,8 @@ Release 2.1.0-beta - 2013-07-02
 HADOOP-9421. [RPC v9] Convert SASL to use ProtoBuf and provide
 negotiation capabilities (daryn)
 
+HADOOP-9688. Add globally unique Client ID to RPC requests. (suresh)
+
 HADOOP-9683. [RPC v9] Wrap IpcConnectionContext in RPC headers (daryn)
 
   NEW FEATURES

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java?rev=1505030&r1=1505029&r2=1505030&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
 (original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
 Fri Jul 19 21:37:43 2013
@@ -85,6 +85,7 @@ import org.apache.hadoop.security.token.
 import org.apache.hadoop.security.token.TokenSelector;
 import org.apache.hadoop.util.ProtoUtil;
 import org.apache.hadoop.util.ReflectionUtils;
+import org.apache.hadoop.util.StringUtils;
 import org.apache.hadoop.util.Time;
 
 import com.google.common.util.concurrent.ThreadFactoryBuilder;
@@ -114,9 +115,8 @@ public class Client {
   private final int connectionTimeout;
 
   private final boolean fallbackAllowed;
+  private final byte[] uuid;
   
-  final static int PING_CALL_ID = -1;
-
   final static int CONNECTION_CONTEXT_CALL_ID = -3;
   
   /**
@@ -815,8 +815,8 @@ public class Client {
 throws IOException {
   DataOutputStream out = new DataOutputStream(new 
BufferedOutputStream(outStream));
   // Write out the header, version and authentication method
-  out.write(Server.HEADER.array());
-  out.write(Server.CURRENT_VERSION);
+  out.write(RpcConstants.HEADER.array());
+  out.write(RpcConstants.CURRENT_VERSION);
   out.write(serviceClass);
   final AuthProtocol authProtocol;
   switch (authMethod) {
@@ -843,7 +843,7 @@ public class Client {
   authMethod);
   RpcRequestHeaderProto connectionContextHeader =
   ProtoUtil.makeRpcRequestHeader(RpcKind.RPC_PROTOCOL_BUFFER,
-  OperationProto.RPC_FINAL_PACKET, CONNECTION_CONTEXT_CALL_ID);
+  OperationProto.RPC_FINAL_PACKET, CONNECTION_CONTEXT_CALL_ID, 
uuid);
   RpcRequestMessageWrapper request =
   new RpcRequestMessageWrapper(connectionContextHeader, message);
   
@@ -895,7 +895,7 @@ public class Client {
   if ( curTime - lastActivity.get() >= pingInterval) {
 lastActivity.set(curTime);
 synchronized (out

svn commit: r1504008 - in /hadoop/common/branches/branch-1: CHANGES.txt src/core/org/apache/hadoop/fs/Trash.java

2013-07-16 Thread suresh
Author: suresh
Date: Wed Jul 17 05:59:48 2013
New Revision: 1504008

URL: http://svn.apache.org/r1504008
Log:
HDFS-4903. Print trash configuration and trash emptier state in namenode log. 
Contributed by Arpit Agarwal.

Modified:
hadoop/common/branches/branch-1/CHANGES.txt
hadoop/common/branches/branch-1/src/core/org/apache/hadoop/fs/Trash.java

Modified: hadoop/common/branches/branch-1/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-1/CHANGES.txt?rev=1504008&r1=1504007&r2=1504008&view=diff
==
--- hadoop/common/branches/branch-1/CHANGES.txt (original)
+++ hadoop/common/branches/branch-1/CHANGES.txt Wed Jul 17 05:59:48 2013
@@ -21,6 +21,9 @@ Release 1.3.0 - unreleased
 HADOOP-8873. Port HADOOP-8175 (Add mkdir -p flag) to branch-1.
 (Akira Ajisaka via suresh)
 
+HDFS-4903. Print trash configuration and trash emptier state in namenode
+log. (Arpit Agarwal via suresh)
+
   BUG FIXES
 
 MAPREDUCE-5047. keep.failed.task.files=true causes job failure on 

Modified: 
hadoop/common/branches/branch-1/src/core/org/apache/hadoop/fs/Trash.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-1/src/core/org/apache/hadoop/fs/Trash.java?rev=1504008&r1=1504007&r2=1504008&view=diff
==
--- hadoop/common/branches/branch-1/src/core/org/apache/hadoop/fs/Trash.java 
(original)
+++ hadoop/common/branches/branch-1/src/core/org/apache/hadoop/fs/Trash.java 
Wed Jul 17 05:59:48 2013
@@ -79,6 +79,8 @@ public class Trash extends Configured {
 this.trash = new Path(home, TRASH);
 this.current = new Path(trash, CURRENT);
 this.interval = conf.getLong("fs.trash.interval", 60) * MSECS_PER_MINUTE;
+LOG.info("Namenode trash configuration: Trash interval = " +
+this.interval + " minutes.");
   }
   
   private Path makeTrashRelativePath(Path basePath, Path rmFilePath) {




svn commit: r1504007 - in /hadoop/common/branches/branch-1: CHANGES.txt src/core/org/apache/hadoop/fs/FsShell.java src/docs/src/documentation/content/xdocs/file_system_shell.xml src/test/org/apache/ha

2013-07-16 Thread suresh
Author: suresh
Date: Wed Jul 17 05:53:59 2013
New Revision: 1504007

URL: http://svn.apache.org/r1504007
Log:
HADOOP-8873. Port HADOOP-8175 (Add mkdir -p flag) to branch-1. Contributed by 
Akira Ajisaka.

Modified:
hadoop/common/branches/branch-1/CHANGES.txt
hadoop/common/branches/branch-1/src/core/org/apache/hadoop/fs/FsShell.java

hadoop/common/branches/branch-1/src/docs/src/documentation/content/xdocs/file_system_shell.xml

hadoop/common/branches/branch-1/src/test/org/apache/hadoop/hdfs/TestDFSShell.java

Modified: hadoop/common/branches/branch-1/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-1/CHANGES.txt?rev=1504007&r1=1504006&r2=1504007&view=diff
==
--- hadoop/common/branches/branch-1/CHANGES.txt (original)
+++ hadoop/common/branches/branch-1/CHANGES.txt Wed Jul 17 05:53:59 2013
@@ -18,6 +18,9 @@ Release 1.3.0 - unreleased
 support an arbitrary filesystem URI. (Tom White, backported by
 Chelsey Chang via ivanmi)
 
+HADOOP-8873. Port HADOOP-8175 (Add mkdir -p flag) to branch-1.
+(Akira Ajisaka via suresh)
+
   BUG FIXES
 
 MAPREDUCE-5047. keep.failed.task.files=true causes job failure on 

Modified: 
hadoop/common/branches/branch-1/src/core/org/apache/hadoop/fs/FsShell.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-1/src/core/org/apache/hadoop/fs/FsShell.java?rev=1504007&r1=1504006&r2=1504007&view=diff
==
--- hadoop/common/branches/branch-1/src/core/org/apache/hadoop/fs/FsShell.java 
(original)
+++ hadoop/common/branches/branch-1/src/core/org/apache/hadoop/fs/FsShell.java 
Wed Jul 17 05:53:59 2013
@@ -709,26 +709,31 @@ public class FsShell extends Configured 
   }
 
   /**
-   * Create the given dir
+   * Create the given dir.
+   * @param src a path to create dir
+   * @param createParents
+   *   if set to true, mkdir will not fail when the given dir already exists
+   * @throws IOException
*/
-  void mkdir(String src) throws IOException {
+  void mkdir(String src, boolean createParents) throws IOException {
 Path f = new Path(src);
 FileSystem srcFs = f.getFileSystem(getConf());
 FileStatus fstatus = null;
 try {
   fstatus = srcFs.getFileStatus(f);
   if (fstatus.isDir()) {
-throw new IOException("cannot create directory " 
-+ src + ": File exists");
-  }
-  else {
+if (!createParents) {
+  throw new IOException("cannot create directory "
+  + src + ": File exists");
+}
+  } else {
 throw new IOException(src + " exists but " +
 "is not a directory");
   }
-} catch(FileNotFoundException e) {
-if (!srcFs.mkdirs(f)) {
-  throw new IOException("failed to create " + src);
-}
+} catch (FileNotFoundException e) {
+  if (!srcFs.mkdirs(f)) {
+throw new IOException("failed to create " + src);
+  }
 }
   }
 
@@ -1294,7 +1299,7 @@ public class FsShell extends Configured 
   GET_SHORT_USAGE + "\n\t" +
   "[-getmerge   [addnl]] [-cat ]\n\t" +
   "[" + COPYTOLOCAL_SHORT_USAGE + "] [-moveToLocal  ]\n\t" +
-  "[-mkdir ] [-report] [" + SETREP_SHORT_USAGE + "]\n\t" +
+  "[-mkdir [-p] ] [-report] [" + SETREP_SHORT_USAGE + "]\n\t" +
   "[-touchz ] [-test -[ezd] ] [-stat [format] ]\n\t" +
   "[-tail [-f] ] [-text ]\n\t" +
   "[" + FsShellPermissions.CHMOD_USAGE + "]\n\t" +
@@ -1394,7 +1399,9 @@ public class FsShell extends Configured 
 
 String moveToLocal = "-moveToLocal  :  Not implemented yet 
\n";
 
-String mkdir = "-mkdir : \tCreate a directory in specified location. 
\n";
+String mkdir = "-mkdir [-p] : "
+  + "\tCreate a directory in specified location. \n"
+  + "\t\t-p Do not fail if the directory already exists.";
 
 String setrep = SETREP_SHORT_USAGE
   + ":  Set the replication level of a file. \n"
@@ -1556,8 +1563,15 @@ public class FsShell extends Configured 
   private int doall(String cmd, String argv[], int startindex) {
 int exitCode = 0;
 int i = startindex;
+boolean mkdirCreateParents = false;
 boolean rmSkipTrash = false;
 
+// Check for -p option in mkdir
+if("-mkdir".equals(cmd) && "-p".equals(argv[i])) {
+  mkdirCreateParents = true;
+  i++;
+}
+
 // Check for -skipTrash option in rm/rmr
 if(("-rm".equals(cmd) || "-rmr".equals(cmd)) 
 && "-skipTrash".equals(argv[i])) {
@@ -1576,7 +1590,7 @@ public class FsShell exten
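The reworked `mkdir(String, boolean)` above fails on an existing directory only when `-p` was not given, and still reports an error if the path exists as a file. The same semantics can be checked with plain `java.nio.file`; this is an illustrative stand-in for the FsShell logic, not the Hadoop code itself:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class MkdirDemo {
    // Mirrors FsShell.mkdir(String, boolean): fail on an existing directory
    // unless createParents (-p) was requested; always fail on a plain file.
    static void mkdir(Path dir, boolean createParents) throws IOException {
        if (Files.isDirectory(dir)) {
            if (!createParents) {
                throw new IOException("cannot create directory " + dir
                    + ": File exists");
            }
            return;  // -p: an existing directory is not an error
        }
        if (Files.isRegularFile(dir)) {
            throw new IOException(dir + " exists but is not a directory");
        }
        Files.createDirectories(dir);  // creates missing parents too
    }

    public static void main(String[] args) throws IOException {
        Path base = Files.createTempDirectory("mkdir-demo");
        Path d = base.resolve("a").resolve("b");
        mkdir(d, true);       // creates a and a/b
        mkdir(d, true);       // no-op with -p
        try {
            mkdir(d, false);  // without -p this must fail
        } catch (IOException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

As in the patch, parent creation itself is unchanged (`mkdirs` already creates parents); `-p` only changes what happens when the target already exists.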

svn commit: r1503573 - /hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/TrashPolicyDefault.java

2013-07-15 Thread suresh
Author: suresh
Date: Tue Jul 16 06:54:27 2013
New Revision: 1503573

URL: http://svn.apache.org/r1503573
Log:
HDFS-4903. Merge 1503572 from branch-2

Modified:

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/TrashPolicyDefault.java

Modified: 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/TrashPolicyDefault.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/TrashPolicyDefault.java?rev=1503573&r1=1503572&r2=1503573&view=diff
==
--- 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/TrashPolicyDefault.java
 (original)
+++ 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/TrashPolicyDefault.java
 Tue Jul 16 06:54:27 2013
@@ -89,8 +89,11 @@ public class TrashPolicyDefault extends 
 this.emptierInterval = (long)(conf.getFloat(
 FS_TRASH_CHECKPOINT_INTERVAL_KEY, FS_TRASH_CHECKPOINT_INTERVAL_DEFAULT)
 * MSECS_PER_MINUTE);
-  }
-  
+LOG.info("Namenode trash configuration: Deletion interval = " +
+ this.deletionInterval + " minutes, Emptier interval = " +
+ this.emptierInterval + " minutes.");
+   }
+
   private Path makeTrashRelativePath(Path basePath, Path rmFilePath) {
 return Path.mergePaths(basePath, rmFilePath);
   }




svn commit: r1503572 - /hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/TrashPolicyDefault.java

2013-07-15 Thread suresh
Author: suresh
Date: Tue Jul 16 06:49:05 2013
New Revision: 1503572

URL: http://svn.apache.org/r1503572
Log:
HDFS-4903. Merge 1503570 from trunk

Modified:

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/TrashPolicyDefault.java

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/TrashPolicyDefault.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/TrashPolicyDefault.java?rev=1503572&r1=1503571&r2=1503572&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/TrashPolicyDefault.java
 (original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/TrashPolicyDefault.java
 Tue Jul 16 06:49:05 2013
@@ -89,8 +89,11 @@ public class TrashPolicyDefault extends 
 this.emptierInterval = (long)(conf.getFloat(
 FS_TRASH_CHECKPOINT_INTERVAL_KEY, FS_TRASH_CHECKPOINT_INTERVAL_DEFAULT)
 * MSECS_PER_MINUTE);
-  }
-  
+LOG.info("Namenode trash configuration: Deletion interval = " +
+ this.deletionInterval + " minutes, Emptier interval = " +
+ this.emptierInterval + " minutes.");
+   }
+
   private Path makeTrashRelativePath(Path basePath, Path rmFilePath) {
 return Path.mergePaths(basePath, rmFilePath);
   }




svn commit: r1503570 - /hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/TrashPolicyDefault.java

2013-07-15 Thread suresh
Author: suresh
Date: Tue Jul 16 06:33:40 2013
New Revision: 1503570

URL: http://svn.apache.org/r1503570
Log:
HDFS-4903. Print trash configuration and trash emptier state in namenode log. 
Contributed by Arpit Agarwal.

Modified:

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/TrashPolicyDefault.java

Modified: 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/TrashPolicyDefault.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/TrashPolicyDefault.java?rev=1503570&r1=1503569&r2=1503570&view=diff
==
--- 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/TrashPolicyDefault.java
 (original)
+++ 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/TrashPolicyDefault.java
 Tue Jul 16 06:33:40 2013
@@ -89,8 +89,11 @@ public class TrashPolicyDefault extends 
 this.emptierInterval = (long)(conf.getFloat(
 FS_TRASH_CHECKPOINT_INTERVAL_KEY, FS_TRASH_CHECKPOINT_INTERVAL_DEFAULT)
 * MSECS_PER_MINUTE);
-  }
-  
+LOG.info("Namenode trash configuration: Deletion interval = " +
+ this.deletionInterval + " minutes, Emptier interval = " +
+ this.emptierInterval + " minutes.");
+   }
+
   private Path makeTrashRelativePath(Path basePath, Path rmFilePath) {
 return Path.mergePaths(basePath, rmFilePath);
   }
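Each of the HDFS-4903 merges above adds the same log line, reporting intervals that are stored in milliseconds but configured as float-valued minutes (`fs.trash.checkpoint.interval`). A standalone sketch of that conversion, with the constant name copied from the diff and hypothetical interval values:

```java
public class TrashPolicySketch {
    static final long MSECS_PER_MINUTE = 60 * 1000L;

    // Mirrors: (long)(conf.getFloat(FS_TRASH_CHECKPOINT_INTERVAL_KEY, ...)
    //              * MSECS_PER_MINUTE)
    static long intervalMillis(float minutes) {
        return (long) (minutes * MSECS_PER_MINUTE);
    }

    public static void main(String[] args) {
        long deletion = intervalMillis(360f);  // hypothetical: 6 hours
        long emptier  = intervalMillis(60f);   // hypothetical: 1 hour
        System.out.println("Namenode trash configuration: Deletion interval = "
            + (deletion / MSECS_PER_MINUTE) + " minutes, Emptier interval = "
            + (emptier / MSECS_PER_MINUTE) + " minutes.");
    }
}
```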




svn commit: r1502301 - in /hadoop/common/trunk/hadoop-common-project/hadoop-common: CHANGES.txt src/main/java/org/apache/hadoop/ipc/Client.java

2013-07-11 Thread suresh
Author: suresh
Date: Thu Jul 11 18:02:17 2013
New Revision: 1502301

URL: http://svn.apache.org/r1502301
Log:
HADOOP-9720. Rename Client#uuid to Client#clientId. Contributed by Arpit 
Agarwal.

Modified:
hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java

Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1502301&r1=1502300&r2=1502301&view=diff
==
--- hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt 
(original)
+++ hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt Thu Jul 
11 18:02:17 2013
@@ -455,6 +455,9 @@ Release 2.1.0-beta - 2013-07-02
 HADOOP-9416.  Add new symlink resolution methods in FileSystem and
 FileSystemLinkResolver.  (Andrew Wang via Colin Patrick McCabe)
 
+HADOOP-9720. Rename Client#uuid to Client#clientId. (Arpit Agarwal via
+suresh)
+
   OPTIMIZATIONS
 
 HADOOP-9150. Avoid unnecessary DNS resolution attempts for logical URIs

Modified: 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java?rev=1502301&r1=1502300&r2=1502301&view=diff
==
--- 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
 (original)
+++ 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
 Thu Jul 11 18:02:17 2013
@@ -115,7 +115,7 @@ public class Client {
   private final int connectionTimeout;
 
   private final boolean fallbackAllowed;
-  private final byte[] uuid;
+  private final byte[] clientId;
   
   /**
* Executor on which IPC calls' parameters are sent. Deferring
@@ -891,7 +891,7 @@ public class Client {
   // Items '1' and '2' are prepared here. 
   final DataOutputBuffer d = new DataOutputBuffer();
   RpcRequestHeaderProto header = ProtoUtil.makeRpcRequestHeader(
- call.rpcKind, OperationProto.RPC_FINAL_PACKET, call.id, uuid);
+ call.rpcKind, OperationProto.RPC_FINAL_PACKET, call.id, clientId);
   header.writeDelimitedTo(d);
   call.rpcRequest.write(d);
 
@@ -1091,7 +1091,7 @@ public class Client {
 CommonConfigurationKeys.IPC_CLIENT_CONNECT_TIMEOUT_DEFAULT);
 this.fallbackAllowed = 
conf.getBoolean(CommonConfigurationKeys.IPC_CLIENT_FALLBACK_TO_SIMPLE_AUTH_ALLOWED_KEY,
 
CommonConfigurationKeys.IPC_CLIENT_FALLBACK_TO_SIMPLE_AUTH_ALLOWED_DEFAULT);
-this.uuid = StringUtils.getUuidBytes();
+this.clientId = StringUtils.getUuidBytes();
   }
 
   /**




svn commit: r1500847 - /hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RpcConstants.java

2013-07-08 Thread suresh
Author: suresh
Date: Mon Jul  8 17:16:16 2013
New Revision: 1500847

URL: http://svn.apache.org/r1500847
Log:
HADOOP-9688. Adding a file missed in the commit 1500843

Added:

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RpcConstants.java

Added: 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RpcConstants.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RpcConstants.java?rev=1500847&view=auto
==
--- 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RpcConstants.java
 (added)
+++ 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/RpcConstants.java
 Mon Jul  8 17:16:16 2013
@@ -0,0 +1,50 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.hadoop.ipc;
+
+import java.nio.ByteBuffer;
+
+import org.apache.hadoop.classification.InterfaceAudience;
+
+@InterfaceAudience.Private
+public class RpcConstants {
+  private RpcConstants() {
+// Hidden Constructor
+  }
+  
+  public static final int PING_CALL_ID = -1;
+  
+  public static final byte[] DUMMY_CLIENT_ID = new byte[0];
+  
+  
+  /**
+   * The first four bytes of Hadoop RPC connections
+   */
+  public static final ByteBuffer HEADER = ByteBuffer.wrap("hrpc".getBytes());
+  
+  // 1 : Introduce ping and server does not throw away RPCs
+  // 3 : Introduce the protocol into the RPC connection header
+  // 4 : Introduced SASL security layer
+  // 5 : Introduced use of {@link ArrayPrimitiveWritable$Internal}
+  // in ObjectWritable to efficiently transmit arrays of primitives
+  // 6 : Made RPC Request header explicit
+  // 7 : Changed Ipc Connection Header to use Protocol buffers
+  // 8 : SASL server always sends a final response
+  // 9 : Changes to protocol for HADOOP-8990
+  public static final byte CURRENT_VERSION = 9;
+}
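`HEADER` and `CURRENT_VERSION` above are the first bytes a Hadoop RPC client writes when setting up a connection (the `out.write(RpcConstants.HEADER.array())` calls appear in the HADOOP-9688 Client.java diff). A sketch of that preamble written to an in-memory stream; the constants are copied from the diff, while the zero service-class and auth bytes are assumptions for illustration:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.nio.ByteBuffer;

public class RpcPreambleSketch {
    // Copied from the RpcConstants added above.
    static final ByteBuffer HEADER = ByteBuffer.wrap("hrpc".getBytes());
    static final byte CURRENT_VERSION = 9;

    // Mirrors Client's connection setup: 4-byte magic, version byte,
    // service class, then an auth-protocol byte.
    static byte[] preamble(byte serviceClass, byte authProtocol)
            throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        out.write(HEADER.array());
        out.write(CURRENT_VERSION);
        out.write(serviceClass);
        out.write(authProtocol);
        return out.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        byte[] p = preamble((byte) 0, (byte) 0);
        System.out.println(p.length + " bytes, magic = " + new String(p, 0, 4));
    }
}
```

With both trailing bytes zero the preamble is 7 bytes: `hrpc`, version 9, service class, auth byte.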




svn commit: r1500843 - in /hadoop/common/trunk/hadoop-common-project/hadoop-common: ./ src/main/java/org/apache/hadoop/ipc/ src/main/java/org/apache/hadoop/security/ src/main/java/org/apache/hadoop/ut

2013-07-08 Thread suresh
Author: suresh
Date: Mon Jul  8 17:08:01 2013
New Revision: 1500843

URL: http://svn.apache.org/r1500843
Log:
HADOOP-9688. Add globally unique Client ID to RPC requests. Contributed by 
Suresh Srinivas.

Modified:
hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Server.java

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/SaslRpcClient.java

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/ProtoUtil.java

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/StringUtils.java

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/proto/RpcHeader.proto

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/ipc/TestIPC.java

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/ipc/TestProtoBufRpc.java

hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/util/TestProtoUtil.java

Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1500843&r1=1500842&r2=1500843&view=diff
==
--- hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt 
(original)
+++ hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt Mon Jul 
 8 17:08:01 2013
@@ -354,6 +354,8 @@ Release 2.1.0-beta - 2013-07-02
 HADOOP-9421. [RPC v9] Convert SASL to use ProtoBuf and provide
 negotiation capabilities (daryn)
 
+HADOOP-9688. Add globally unique Client ID to RPC requests. (suresh)
+
   NEW FEATURES
 
 HADOOP-9283. Add support for running the Hadoop client on AIX. (atm)

Modified: 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
URL: 
http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java?rev=1500843&r1=1500842&r2=1500843&view=diff
==
--- 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
 (original)
+++ 
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
 Mon Jul  8 17:08:01 2013
@@ -82,6 +82,7 @@ import org.apache.hadoop.security.token.
 import org.apache.hadoop.security.token.TokenSelector;
 import org.apache.hadoop.util.ProtoUtil;
 import org.apache.hadoop.util.ReflectionUtils;
+import org.apache.hadoop.util.StringUtils;
 import org.apache.hadoop.util.Time;
 
 import com.google.common.util.concurrent.ThreadFactoryBuilder;
@@ -113,8 +114,7 @@ public class Client {
   private final int connectionTimeout;
 
   private final boolean fallbackAllowed;
-  
-  final static int PING_CALL_ID = -1;
+  private final byte[] uuid;
   
   /**
* Executor on which IPC calls' parameters are sent. Deferring
@@ -759,8 +759,8 @@ public class Client {
 throws IOException {
   DataOutputStream out = new DataOutputStream(new 
BufferedOutputStream(outStream));
   // Write out the header, version and authentication method
-  out.write(Server.HEADER.array());
-  out.write(Server.CURRENT_VERSION);
+  out.write(RpcConstants.HEADER.array());
+  out.write(RpcConstants.CURRENT_VERSION);
   out.write(serviceClass);
   final AuthProtocol authProtocol;
   switch (authMethod) {
@@ -837,7 +837,7 @@ public class Client {
   if ( curTime - lastActivity.get() >= pingInterval) {
 lastActivity.set(curTime);
 synchronized (out) {
-  out.writeInt(PING_CALL_ID);
+  out.writeInt(RpcConstants.PING_CALL_ID);
   out.flush();
 }
   }
@@ -892,7 +892,7 @@ public class Client {
   // Items '1' and '2' are prepared here. 
   final DataOutputBuffer d = new DataOutputBuffer();
   RpcRequestHeaderProto header = ProtoUtil.makeRpcRequestHeader(
- call.rpcKind, OperationProto.RPC_FINAL_PACKET, call.id);
+ call.rpcKind, OperationProto.RPC_FINAL_PACKET, call.id, uuid);
   header.writeDelimitedTo(d);
   call.rpcRequest.write(d);
 
@@ -1092,6 +1092,7 @@ public class Client {
 CommonConfigurationKeys.IPC_CLIENT_CONNECT_TIMEOUT_DEFAULT);
 this.fallbackAllowed = 
conf.getBoolean(CommonConfigurationKeys.IPC_CLIENT_FALLBACK_TO_SIMPLE_AUTH_ALLOWED_KEY,
 
CommonConfigurationKeys.IPC_CLIENT_FALLBACK_TO_SIMPLE_AUTH_ALLOWED_DEFAULT);
+this.uuid =
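HADOOP-9688 tags every RPC request with a globally unique client ID (the field later renamed to `clientId` by HADOOP-9720 above). `StringUtils.getUuidBytes()` is Hadoop-internal; a JDK-only equivalent that packs a random type-4 UUID into 16 bytes looks like this (a sketch, not Hadoop's implementation):

```java
import java.nio.ByteBuffer;
import java.util.UUID;

public class ClientIdSketch {
    // Stand-in for StringUtils.getUuidBytes(): 16 bytes from a random UUID.
    static byte[] getUuidBytes() {
        UUID uuid = UUID.randomUUID();
        ByteBuffer buf = ByteBuffer.allocate(16);
        buf.putLong(uuid.getMostSignificantBits());
        buf.putLong(uuid.getLeastSignificantBits());
        return buf.array();
    }

    public static void main(String[] args) {
        byte[] clientId = getUuidBytes();
        System.out.println("client id length = " + clientId.length);
    }
}
```

The ID is generated once per `Client` instance and carried in every `RpcRequestHeaderProto`, which is what lets the server correlate requests from the same client.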

svn commit: r1495086 - /hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt

2013-06-20 Thread suresh
Author: suresh
Date: Thu Jun 20 16:31:02 2013
New Revision: 1495086

URL: http://svn.apache.org/r1495086
Log:
Cleanup CHANGES.txt and move some of the jiras to the right section

Modified:

hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt

Modified: 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1495086&r1=1495085&r2=1495086&view=diff
==
--- 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt
 (original)
+++ 
hadoop/common/branches/branch-2.1-beta/hadoop-common-project/hadoop-common/CHANGES.txt
 Thu Jun 20 16:31:02 2013
@@ -638,6 +638,11 @@ Release 2.0.3-alpha - 2013-02-06 
 HADOOP-9276. Allow BoundedByteArrayOutputStream to be resettable.
 (Arun Murthy via hitesh)
 
+HADOOP-7688. Add servlet handler check in HttpServer.start().
+(Uma Maheswara Rao G via szetszwo)
+
+HADOOP-7886. Add toString to FileStatus. (SreeHari via jghoman)
+
   OPTIMIZATIONS
 
 HADOOP-8866. SampleQuantiles#query is O(N^2) instead of O(N). (Andrew Wang
@@ -908,6 +913,9 @@ Release 2.0.2-alpha - 2012-09-07 
 HADOOP-8239. Add subclasses of MD5MD5CRC32FileChecksum to support file
 checksum with CRC32C.  (Kihwal Lee via szetszwo)
 
+HADOOP-8619. WritableComparator must implement no-arg constructor.
+(Chris Douglas via Suresh)
+
 HADOOP-8075. Lower native-hadoop library log from info to debug.
 (Hızır Sefa İrken via eli)
 
@@ -922,6 +930,10 @@ Release 2.0.2-alpha - 2012-09-07 
 HADOOP-8819. Incorrectly & is used instead of && in some file system 
 implementations. (Brandon Li via suresh)
 
+HADOOP-7808. Port HADOOP-7510 - Add configurable option to use original 
+hostname in token instead of IP to allow server IP change. 
+    (Daryn Sharp via suresh)
+
   BUG FIXES
 
 HADOOP-8372. NetUtils.normalizeHostName() incorrectly handles hostname
@@ -1281,6 +1293,20 @@ Release 2.0.0-alpha - 05-23-2012
 
 HADOOP-8366 Use ProtoBuf for RpcResponseHeader (sanjay radia)
 
+HADOOP-7729. Send back valid HTTP response if user hits IPC port with
+HTTP GET. (todd)
+
+HADOOP-7987. Support setting the run-as user in unsecure mode. (jitendra)
+
+HADOOP-7994. Remove getProtocolVersion and getProtocolSignature from the 
+client side translator and server side implementation. (jitendra)
+
+HADOOP-8367 Improve documentation of declaringClassProtocolName in 
+rpc headers. (Sanjay Radia)
+
+HADOOP-8624. ProtobufRpcEngine should log all RPCs if TRACE logging is
+enabled (todd)
+
   OPTIMIZATIONS
 
   BUG FIXES
@@ -1414,6 +1440,9 @@ Release 2.0.0-alpha - 05-23-2012
 HADOOP-8359. Fix javadoc warnings in Configuration.  (Anupam Seth via
 szetszwo)
 
+HADOOP-7988. Upper case in hostname part of the principals doesn't work 
with
+kerberos. (jitendra)
+
   BREAKDOWN OF HADOOP-7454 SUBTASKS
 
 HADOOP-7455. HA: Introduce HA Service Protocol Interface. (suresh)
@@ -1478,6 +1507,27 @@ Release 2.0.0-alpha - 05-23-2012
 HADOOP-8655. Fix TextInputFormat for large deliminators. (Gelesh via
 bobby) 
 
+HADOOP-7900. LocalDirAllocator confChanged() accesses conf.get() twice
+(Ravi Gummadi via Uma Maheswara Rao G)
+
+HADOOP-8146.  FsShell commands cannot be interrupted
+(Daryn Sharp via Uma Maheswara Rao G)
+
+HADOOP-8018.  Hudson auto test for HDFS has started throwing javadoc
+(Jon Eagles via bobby)
+
+HADOOP-8001  ChecksumFileSystem's rename doesn't correctly handle checksum
+files. (Daryn Sharp via bobby)
+
+HADOOP-8006  TestFSInputChecker is failing in trunk.
+(Daryn Sharp via bobby)
+
+HADOOP-7998. CheckFileSystem does not correctly honor setVerifyChecksum
+(Daryn Sharp via bobby)
+
+HADOOP-7606. Upgrade Jackson to version 1.7.1 to match the version required
+by Jersey (Alejandro Abdelnur via atm)
+
 Release 0.23.9 - UNRELEASED
 
   INCOMPATIBLE CHANGES
@@ -1982,6 +2032,9 @@ Release 0.23.1 - 2012-02-17 
 HADOOP-8027. Visiting /jmx on the daemon web interfaces may print
 unnecessary error in logs. (atm)
 
+HADOOP-7792. Add verifyToken method to 
AbstractDelegationTokenSecretManager.
+(jitendra)
+
   OPTIMIZATIONS
 
   BUG FIXES




svn commit: r1494832 - /hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt

2013-06-19 Thread suresh
Author: suresh
Date: Thu Jun 20 01:21:38 2013
New Revision: 1494832

URL: http://svn.apache.org/r1494832
Log:
Cleanup CHANGES.txt and move some of the jiras to the right section

Modified:

hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt

Modified: 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt
URL: 
http://svn.apache.org/viewvc/hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1494832&r1=1494831&r2=1494832&view=diff
==
--- 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt 
(original)
+++ 
hadoop/common/branches/branch-2/hadoop-common-project/hadoop-common/CHANGES.txt 
Thu Jun 20 01:21:38 2013
@@ -662,6 +662,11 @@ Release 2.0.3-alpha - 2013-02-06 
 HADOOP-9276. Allow BoundedByteArrayOutputStream to be resettable.
 (Arun Murthy via hitesh)
 
+HADOOP-7688. Add servlet handler check in HttpServer.start().
+(Uma Maheswara Rao G via szetszwo)
+
+HADOOP-7886. Add toString to FileStatus. (SreeHari via jghoman)
+
   OPTIMIZATIONS
 
 HADOOP-8866. SampleQuantiles#query is O(N^2) instead of O(N). (Andrew Wang
@@ -932,6 +937,9 @@ Release 2.0.2-alpha - 2012-09-07 
 HADOOP-8239. Add subclasses of MD5MD5CRC32FileChecksum to support file
 checksum with CRC32C.  (Kihwal Lee via szetszwo)
 
+HADOOP-8619. WritableComparator must implement no-arg constructor.
+(Chris Douglas via Suresh)
+
 HADOOP-8075. Lower native-hadoop library log from info to debug.
 (Hızır Sefa İrken via eli)
 
@@ -946,6 +954,16 @@ Release 2.0.2-alpha - 2012-09-07 
 HADOOP-8819. Incorrectly & is used instead of && in some file system 
 implementations. (Brandon Li via suresh)
 
+HADOOP-7808. Port HADOOP-7510 - Add configurable option to use original 
+hostname in token instead of IP to allow server IP change. 
+    (Daryn Sharp via suresh)
+
+HADOOP-8367 Improve documentation of declaringClassProtocolName in 
+rpc headers. (Sanjay Radia)
+
+HADOOP-8624. ProtobufRpcEngine should log all RPCs if TRACE logging is
+enabled (todd)
+
   BUG FIXES
 
 HADOOP-8372. NetUtils.normalizeHostName() incorrectly handles hostname
@@ -1305,6 +1323,14 @@ Release 2.0.0-alpha - 05-23-2012
 
 HADOOP-8366 Use ProtoBuf for RpcResponseHeader (sanjay radia)
 
+HADOOP-7729. Send back valid HTTP response if user hits IPC port with
+HTTP GET. (todd)
+
+HADOOP-7987. Support setting the run-as user in unsecure mode. (jitendra)
+
+HADOOP-7994. Remove getProtocolVersion and getProtocolSignature from the 
+client side translator and server side implementation. (jitendra)
+
   OPTIMIZATIONS
 
   BUG FIXES
@@ -1438,6 +1464,9 @@ Release 2.0.0-alpha - 05-23-2012
 HADOOP-8359. Fix javadoc warnings in Configuration.  (Anupam Seth via
 szetszwo)
 
+HADOOP-7988. Upper case in hostname part of the principals doesn't work 
with
+kerberos. (jitendra)
+
   BREAKDOWN OF HADOOP-7454 SUBTASKS
 
 HADOOP-7455. HA: Introduce HA Service Protocol Interface. (suresh)
@@ -1502,6 +1531,27 @@ Release 2.0.0-alpha - 05-23-2012
 HADOOP-8655. Fix TextInputFormat for large deliminators. (Gelesh via
 bobby) 
 
+HADOOP-7900. LocalDirAllocator confChanged() accesses conf.get() twice
+(Ravi Gummadi via Uma Maheswara Rao G)
+
+HADOOP-8146.  FsShell commands cannot be interrupted
+(Daryn Sharp via Uma Maheswara Rao G)
+
+HADOOP-8018.  Hudson auto test for HDFS has started throwing javadoc
+(Jon Eagles via bobby)
+
+HADOOP-8001  ChecksumFileSystem's rename doesn't correctly handle checksum
+files. (Daryn Sharp via bobby)
+
+HADOOP-8006  TestFSInputChecker is failing in trunk.
+(Daryn Sharp via bobby)
+
+HADOOP-7998. CheckFileSystem does not correctly honor setVerifyChecksum
+(Daryn Sharp via bobby)
+
+HADOOP-7606. Upgrade Jackson to version 1.7.1 to match the version required
+by Jersey (Alejandro Abdelnur via atm)
+
 Release 0.23.9 - UNRELEASED
 
   INCOMPATIBLE CHANGES
@@ -2011,6 +2061,9 @@ Release 0.23.1 - 2012-02-17 
 HADOOP-8027. Visiting /jmx on the daemon web interfaces may print
 unnecessary error in logs. (atm)
 
+HADOOP-7792. Add verifyToken method to 
AbstractDelegationTokenSecretManager.
+(jitendra)
+
   OPTIMIZATIONS
 
   BUG FIXES



