hbase git commit: HBASE-15803 ZooKeeperWatcher's constructor can leak a ZooKeeper instance with throwing ZooKeeperConnectionException when canCreateBaseZNode is true

2016-06-06 Thread tedyu
Repository: hbase
Updated Branches:
  refs/heads/master 15c03fd1c -> 7fd3532de


HBASE-15803 ZooKeeperWatcher's constructor can leak a ZooKeeper instance with throwing ZooKeeperConnectionException when canCreateBaseZNode is true


Project: http://git-wip-us.apache.org/repos/asf/hbase/repo
Commit: http://git-wip-us.apache.org/repos/asf/hbase/commit/7fd3532d
Tree: http://git-wip-us.apache.org/repos/asf/hbase/tree/7fd3532d
Diff: http://git-wip-us.apache.org/repos/asf/hbase/diff/7fd3532d

Branch: refs/heads/master
Commit: 7fd3532de63d7b1885d6993c11a35c2f85e26631
Parents: 15c03fd
Author: tedyu 
Authored: Mon Jun 6 18:35:15 2016 -0700
Committer: tedyu 
Committed: Mon Jun 6 18:35:15 2016 -0700

--
 .../apache/hadoop/hbase/zookeeper/ZooKeeperWatcher.java | 12 +++-
 1 file changed, 11 insertions(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/hbase/blob/7fd3532d/hbase-client/src/main/java/org/apache/hadoop/hbase/zookeeper/ZooKeeperWatcher.java
--
diff --git a/hbase-client/src/main/java/org/apache/hadoop/hbase/zookeeper/ZooKeeperWatcher.java b/hbase-client/src/main/java/org/apache/hadoop/hbase/zookeeper/ZooKeeperWatcher.java
index 93828eb..ff3d1c7 100644
--- a/hbase-client/src/main/java/org/apache/hadoop/hbase/zookeeper/ZooKeeperWatcher.java
+++ b/hbase-client/src/main/java/org/apache/hadoop/hbase/zookeeper/ZooKeeperWatcher.java
@@ -171,7 +171,17 @@ public class ZooKeeperWatcher implements Watcher, Abortable, Closeable {
 this.recoverableZooKeeper = ZKUtil.connect(conf, quorum, pendingWatcher, identifier);
 pendingWatcher.prepare(this);
 if (canCreateBaseZNode) {
-  createBaseZNodes();
+  try {
+createBaseZNodes();
+  } catch (ZooKeeperConnectionException zce) {
+try {
+  this.recoverableZooKeeper.close();
+} catch (InterruptedException ie) {
LOG.debug("Encountered InterruptedException when closing " + this.recoverableZooKeeper);
+  Thread.currentThread().interrupt();
+}
+throw zce;
+  }
 }
   }
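The hunk above closes the already-open RecoverableZooKeeper before rethrowing, so a constructor failure cannot leak the connection. Here is a minimal sketch of that close-on-failure pattern; `Resource` and `SetupException` are hypothetical stand-ins, not HBase APIs:

```java
// Sketch of the close-on-failure pattern used in the patch above.
// Resource and SetupException are hypothetical stand-ins, not HBase classes.
public class CloseOnFailure {
    static class SetupException extends Exception {
        SetupException(String m) { super(m); }
    }

    static class Resource implements AutoCloseable {
        boolean closed = false;
        @Override public void close() { closed = true; }
    }

    // If post-connect setup fails, close the already-opened resource before
    // rethrowing: the caller never receives the object, so nobody else could
    // release it. (The real patch also restores the interrupt flag if the
    // close itself is interrupted.)
    static Resource open(boolean failSetup) throws SetupException {
        Resource r = new Resource();
        try {
            if (failSetup) {
                throw new SetupException("setup failed");
            }
        } catch (SetupException e) {
            r.close();
            throw e;
        }
        return r;
    }

    public static void main(String[] args) throws Exception {
        System.out.println("leakFree=" + !open(false).closed);
        try {
            open(true);
        } catch (SetupException e) {
            System.out.println("caught=" + e.getMessage());
        }
    }
}
```

Note that a constructor cannot wrap its own partially built state in try-with-resources, which is why the explicit catch-close-rethrow is needed in the real code.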
 



hbase git commit: HBASE-15965 - Testing by executing a command will cover the exact path users will trigger, so it's better than directly calling library functions in tests. Changing the tests to use @

2016-06-06 Thread appy
Repository: hbase
Updated Branches:
  refs/heads/branch-1 4a0a9a20d -> c2b4c6f63


HBASE-15965
- Testing by executing a command will cover the exact path users will trigger,
so it's better than directly calling library functions in tests. Changing the
tests to use @shell.command(:, args) to execute them as if they were
commands coming from the shell.

Norm change:
Commands should print the output the user would like to see, but in the end,
should also return the relevant value. This way:
- Tests can use the returned value to check that the functionality works.
- Tests can capture stdout to assert the particular kind of output the user
should see.
- We do not print the return value in interactive mode and keep the output
clean. See the Shell.command() function.

Bugs found due to this change:
- Uncovered a bug in major_compact.rb with this approach. It was calling
admin.majorCompact(), which doesn't exist, but our tests didn't catch it since
they directly tested admin.major_compact().
- Enabled TestReplicationShell. If it's bad, flaky infra will take care of it.

Change-Id: I5d8af16bf477a79a2f526a5bf11c245b02b7d276
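The norm change above means every command both prints for interactive users and returns a value for callers. As a hedged illustration of how a test can then check both channels (the HBase shell tests themselves are Ruby; this Java sketch with a made-up `exists` command only demonstrates the stdout-capture idea):

```java
import java.io.ByteArrayOutputStream;
import java.io.PrintStream;

// Sketch of the "capture stdout, but also check the return value" testing
// norm described above. The exists() command is a stand-in, not the HBase shell.
public class CaptureStdout {
    // A command in the new style: prints for interactive users,
    // returns the value for programmatic callers and tests.
    static boolean exists(String table) {
        boolean found = table.equals("t1");
        System.out.println("Table " + table + " does " + (found ? "" : "not ") + "exist");
        return found;
    }

    public static void main(String[] args) {
        PrintStream original = System.out;
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        System.setOut(new PrintStream(buf));   // redirect stdout into a buffer
        boolean result = exists("t1");         // check the returned value
        System.setOut(original);               // restore stdout
        String printed = buf.toString();       // check what the user would see
        System.out.println("returned=" + result);
        System.out.println("printed=" + printed.trim());
    }
}
```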


Project: http://git-wip-us.apache.org/repos/asf/hbase/repo
Commit: http://git-wip-us.apache.org/repos/asf/hbase/commit/c2b4c6f6
Tree: http://git-wip-us.apache.org/repos/asf/hbase/tree/c2b4c6f6
Diff: http://git-wip-us.apache.org/repos/asf/hbase/diff/c2b4c6f6

Branch: refs/heads/branch-1
Commit: c2b4c6f6375346b5cf76a53efdcb36ea6239a30f
Parents: 4a0a9a2
Author: Apekshit Sharma 
Authored: Mon Jun 6 13:35:06 2016 -0700
Committer: Apekshit Sharma 
Committed: Mon Jun 6 18:00:23 2016 -0700

--
 hbase-shell/src/main/ruby/hbase/admin.rb|  40 ++--
 hbase-shell/src/main/ruby/hbase/table.rb|   1 +
 hbase-shell/src/main/ruby/shell.rb  |  12 +-
 hbase-shell/src/main/ruby/shell/commands.rb |   2 +
 .../src/main/ruby/shell/commands/create.rb  |   2 +
 .../src/main/ruby/shell/commands/exists.rb  |   4 +-
 .../src/main/ruby/shell/commands/get_auths.rb   |   1 +
 .../main/ruby/shell/commands/get_peer_config.rb |   1 +
 .../src/main/ruby/shell/commands/is_enabled.rb  |   4 +-
 .../shell/commands/list_namespace_tables.rb |   1 +
 .../ruby/shell/commands/list_peer_configs.rb|   1 +
 .../src/main/ruby/shell/commands/list_peers.rb  |   1 +
 .../main/ruby/shell/commands/locate_region.rb   |   1 +
 .../ruby/shell/commands/show_peer_tableCFs.rb   |   4 +-
 .../src/main/ruby/shell/commands/truncate.rb|   3 +-
 .../ruby/shell/commands/truncate_preserve.rb|   3 +-
 .../hbase/client/TestReplicationShell.java  |   2 +-
 hbase-shell/src/test/ruby/hbase/admin_test.rb   | 221 +--
 .../test/ruby/hbase/replication_admin_test.rb   | 120 +-
 .../ruby/hbase/visibility_labels_admin_test.rb  |  20 +-
 hbase-shell/src/test/ruby/test_helper.rb|  23 +-
 21 files changed, 251 insertions(+), 216 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/hbase/blob/c2b4c6f6/hbase-shell/src/main/ruby/hbase/admin.rb
--
diff --git a/hbase-shell/src/main/ruby/hbase/admin.rb b/hbase-shell/src/main/ruby/hbase/admin.rb
index 21e0652..db2edd1 100644
--- a/hbase-shell/src/main/ruby/hbase/admin.rb
+++ b/hbase-shell/src/main/ruby/hbase/admin.rb
@@ -423,25 +423,26 @@ module Hbase
 
 
#--
 # Truncates table (deletes all records by recreating the table)
-def truncate(table_name, conf = @conf)
-  table_description = @admin.getTableDescriptor(TableName.valueOf(table_name))
+def truncate(table_name_str)
+  puts "Truncating '#{table_name_str}' table (it may take a while):"
+  table_name = TableName.valueOf(table_name_str)
+  table_description = @admin.getTableDescriptor(table_name)
   raise ArgumentError, "Table #{table_name} is not enabled. Enable it first." unless enabled?(table_name)
-  yield 'Disabling table...' if block_given?
+  puts 'Disabling table...'
   @admin.disableTable(table_name)
-
   begin
-yield 'Truncating table...' if block_given?
-@admin.truncateTable(org.apache.hadoop.hbase.TableName.valueOf(table_name), false)
+puts 'Truncating table...'
+@admin.truncateTable(table_name, false)
   rescue => e
 # Handle the compatibility case, where the truncate method doesn't exists on the Master
 raise e unless e.respond_to?(:cause) && e.cause != nil
 rootCause = e.cause
 if rootCause.kind_of?(org.apache.hadoop.hbase.DoNotRetryIOException) then
   # Handle the compatibility case, where the truncate method doesn't exists on the Master
-  yield 'Dropping table...' if block_given?
-

hbase git commit: HBASE-15965 - Testing by executing a command will cover the exact path users will trigger, so it's better than directly calling library functions in tests. Changing the tests to use @

2016-06-06 Thread appy
Repository: hbase
Updated Branches:
  refs/heads/master 3d7840a17 -> 15c03fd1c


HBASE-15965
- Testing by executing a command will cover the exact path users will trigger,
so it's better than directly calling library functions in tests. Changing the
tests to use @shell.command(:, args) to execute them as if they were
commands coming from the shell.

Norm change:
Commands should print the output the user would like to see, but in the end,
should also return the relevant value. This way:
- Tests can use the returned value to check that the functionality works.
- Tests can capture stdout to assert the particular kind of output the user
should see.
- We do not print the return value in interactive mode and keep the output
clean. See the Shell.command() function.

Bugs found due to this change:
- Uncovered a bug in major_compact.rb with this approach. It was calling
admin.majorCompact(), which doesn't exist, but our tests didn't catch it since
they directly tested admin.major_compact().
- Enabled TestReplicationShell. If it's bad, flaky infra will take care of it.

Change-Id: I5d8af16bf477a79a2f526a5bf11c245b02b7d276


Project: http://git-wip-us.apache.org/repos/asf/hbase/repo
Commit: http://git-wip-us.apache.org/repos/asf/hbase/commit/15c03fd1
Tree: http://git-wip-us.apache.org/repos/asf/hbase/tree/15c03fd1
Diff: http://git-wip-us.apache.org/repos/asf/hbase/diff/15c03fd1

Branch: refs/heads/master
Commit: 15c03fd1c97c271aca6dc30feab35ec0c9f8edbe
Parents: 3d7840a
Author: Apekshit Sharma 
Authored: Mon Jun 6 13:35:06 2016 -0700
Committer: Apekshit Sharma 
Committed: Mon Jun 6 17:50:22 2016 -0700

--
 .../replication/ReplicationPeerConfig.java  |   6 +-
 hbase-shell/src/main/ruby/hbase/admin.rb|  23 +-
 hbase-shell/src/main/ruby/hbase/table.rb|   1 +
 hbase-shell/src/main/ruby/shell.rb  |  12 +-
 hbase-shell/src/main/ruby/shell/commands.rb |   2 +
 .../main/ruby/shell/commands/balance_rsgroup.rb |   9 +-
 .../src/main/ruby/shell/commands/create.rb  |   2 +
 .../src/main/ruby/shell/commands/exists.rb  |   4 +-
 .../src/main/ruby/shell/commands/get_auths.rb   |   1 +
 .../main/ruby/shell/commands/get_peer_config.rb |   1 +
 .../src/main/ruby/shell/commands/is_enabled.rb  |   4 +-
 .../shell/commands/list_namespace_tables.rb |   1 +
 .../ruby/shell/commands/list_peer_configs.rb|   1 +
 .../src/main/ruby/shell/commands/list_peers.rb  |   1 +
 .../main/ruby/shell/commands/locate_region.rb   |   1 +
 .../main/ruby/shell/commands/major_compact.rb   |   2 +-
 .../ruby/shell/commands/show_peer_tableCFs.rb   |   4 +-
 .../src/main/ruby/shell/commands/truncate.rb|   3 +-
 .../ruby/shell/commands/truncate_preserve.rb|   3 +-
 .../hbase/client/TestReplicationShell.java  |   2 +-
 hbase-shell/src/test/ruby/hbase/admin_test.rb   | 229 +--
 .../test/ruby/hbase/replication_admin_test.rb   | 110 -
 .../ruby/hbase/visibility_labels_admin_test.rb  |  20 +-
 hbase-shell/src/test/ruby/test_helper.rb|  23 +-
 24 files changed, 254 insertions(+), 211 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/hbase/blob/15c03fd1/hbase-client/src/main/java/org/apache/hadoop/hbase/replication/ReplicationPeerConfig.java
--
diff --git a/hbase-client/src/main/java/org/apache/hadoop/hbase/replication/ReplicationPeerConfig.java b/hbase-client/src/main/java/org/apache/hadoop/hbase/replication/ReplicationPeerConfig.java
index 7799de6..1d2066c 100644
--- a/hbase-client/src/main/java/org/apache/hadoop/hbase/replication/ReplicationPeerConfig.java
+++ b/hbase-client/src/main/java/org/apache/hadoop/hbase/replication/ReplicationPeerConfig.java
@@ -96,8 +96,10 @@ public class ReplicationPeerConfig {
   @Override
   public String toString() {
 StringBuilder builder = new StringBuilder("clusterKey=").append(clusterKey).append(",");
-builder.append("replicationEndpointImpl=").append(replicationEndpointImpl).append(",")
-.append("tableCFs=").append(tableCFsMap.toString());
+builder.append("replicationEndpointImpl=").append(replicationEndpointImpl).append(",");
+if (tableCFsMap != null) {
+  builder.append("tableCFs=").append(tableCFsMap.toString());
+}
 return builder.toString();
   }
 }
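The hunk above fixes a latent NPE: the old toString() dereferenced tableCFsMap unconditionally, while the new code only appends the field when it is non-null. A small self-contained sketch of the same null-guard, using simplified hypothetical types rather than the real ReplicationPeerConfig:

```java
import java.util.Map;
import java.util.TreeMap;

// Sketch of the NPE fix above: only append the optional field when it is
// non-null, instead of unconditionally calling toString() on it.
// describe() is a hypothetical stand-in for ReplicationPeerConfig.toString().
public class NullSafeToString {
    static String describe(String clusterKey, Map<String, String> tableCFs) {
        StringBuilder b = new StringBuilder("clusterKey=").append(clusterKey).append(",");
        if (tableCFs != null) {               // the guard added by the patch
            b.append("tableCFs=").append(tableCFs);
        }
        return b.toString();
    }

    public static void main(String[] args) {
        // Without the guard, the null case would throw NullPointerException.
        System.out.println(describe("zk:2181:/hbase", null));
        Map<String, String> m = new TreeMap<>();
        m.put("t1", "cf1");
        System.out.println(describe("zk:2181:/hbase", m));
    }
}
```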

http://git-wip-us.apache.org/repos/asf/hbase/blob/15c03fd1/hbase-shell/src/main/ruby/hbase/admin.rb
--
diff --git a/hbase-shell/src/main/ruby/hbase/admin.rb b/hbase-shell/src/main/ruby/hbase/admin.rb
index f32376d..d66c1d6 100644
--- a/hbase-shell/src/main/ruby/hbase/admin.rb
+++ b/hbase-shell/src/main/ruby/hbase/admin.rb
@@ -458,16 +458,17 @@ module Hbase
 
 
#--
 # 

hbase git commit: HBASE-15954 REST server should log requests with TRACE instead of DEBUG

2016-06-06 Thread enis
Repository: hbase
Updated Branches:
  refs/heads/0.98 804f8e7fd -> 6367f7b70


HBASE-15954 REST server should log requests with TRACE instead of DEBUG

Conflicts:
hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/RESTServer.java

hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/ScannerResource.java

Conflicts:
hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/client/Client.java

hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/filter/RestCsrfPreventionFilter.java

Conflicts:

hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/NamespacesInstanceResource.java

hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/NamespacesResource.java
hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/RESTServlet.java

hbase-server/src/main/java/org/apache/hadoop/hbase/util/ConnectionCache.java

Conflicts:
hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/RESTServlet.java
hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/TableResource.java

hbase-server/src/main/java/org/apache/hadoop/hbase/util/ConnectionCache.java


Project: http://git-wip-us.apache.org/repos/asf/hbase/repo
Commit: http://git-wip-us.apache.org/repos/asf/hbase/commit/6367f7b7
Tree: http://git-wip-us.apache.org/repos/asf/hbase/tree/6367f7b7
Diff: http://git-wip-us.apache.org/repos/asf/hbase/diff/6367f7b7

Branch: refs/heads/0.98
Commit: 6367f7b706662cafe8f147b232869d90b3f65f64
Parents: 804f8e7
Author: Enis Soztutar 
Authored: Mon Jun 6 10:58:37 2016 -0700
Committer: Enis Soztutar 
Committed: Mon Jun 6 16:25:43 2016 -0700

--
 .../hadoop/hbase/rest/MultiRowResource.java |  4 +-
 .../hbase/rest/ProtobufStreamingUtil.java   | 10 ++--
 .../apache/hadoop/hbase/rest/RESTServer.java| 18 ---
 .../apache/hadoop/hbase/rest/RESTServlet.java   |  4 ++
 .../hadoop/hbase/rest/RegionsResource.java  |  4 +-
 .../apache/hadoop/hbase/rest/RootResource.java  |  4 +-
 .../apache/hadoop/hbase/rest/RowResource.java   | 54 ++--
 .../hbase/rest/ScannerInstanceResource.java | 32 +++-
 .../hadoop/hbase/rest/ScannerResource.java  | 18 +++
 .../hadoop/hbase/rest/SchemaResource.java   | 22 
 .../rest/StorageClusterStatusResource.java  |  4 +-
 .../rest/StorageClusterVersionResource.java |  4 +-
 .../apache/hadoop/hbase/rest/TableResource.java | 24 ++---
 .../hadoop/hbase/rest/VersionResource.java  | 10 ++--
 .../apache/hadoop/hbase/rest/client/Client.java | 40 +++
 .../hadoop/hbase/rest/filter/AuthFilter.java|  4 +-
 .../consumer/ProtobufMessageBodyConsumer.java   |  6 +--
 .../hadoop/hbase/util/ConnectionCache.java  |  6 ++-
 18 files changed, 148 insertions(+), 120 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/hbase/blob/6367f7b7/hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/MultiRowResource.java
--
diff --git a/hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/MultiRowResource.java b/hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/MultiRowResource.java
index c88ac91..8ff3ef6 100644
--- a/hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/MultiRowResource.java
+++ b/hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/MultiRowResource.java
@@ -86,7 +86,9 @@ public class MultiRowResource extends ResourceBase implements Constants {
   }
   model.addRow(rowModel);
 } else {
-  LOG.trace("The row : " + rk + " not found in the table.");
+  if (LOG.isTraceEnabled()) {
+LOG.trace("The row : " + rk + " not found in the table.");
+  }
 }
   }
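This patch repeatedly wraps log calls in isTraceEnabled() guards, as in the hunk above. The point is to skip the string concatenation entirely when tracing is off, since the message argument is built before trace() can discard it. A self-contained sketch of the pattern; the Logger class here is a tiny stand-in, not commons-logging:

```java
// Sketch of the log-guard pattern applied throughout this patch.
// Logger is a hypothetical stand-in for the commons-logging Log interface.
public class GuardedTrace {
    static class Logger {
        final boolean traceEnabled;
        int emitted = 0;  // counts messages actually written
        Logger(boolean t) { traceEnabled = t; }
        boolean isTraceEnabled() { return traceEnabled; }
        void trace(String msg) {
            if (traceEnabled) {
                emitted++;
                System.out.println("TRACE " + msg);
            }
        }
    }

    static void lookup(Logger log, String rowKey) {
        // Without the guard, the String concatenation below would run on
        // every call, even when the resulting trace output is discarded.
        if (log.isTraceEnabled()) {
            log.trace("The row : " + rowKey + " not found in the table.");
        }
    }

    public static void main(String[] args) {
        lookup(new Logger(false), "row1");  // guard short-circuits, no output
        lookup(new Logger(true), "row1");   // prints the TRACE line
    }
}
```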
 

http://git-wip-us.apache.org/repos/asf/hbase/blob/6367f7b7/hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/ProtobufStreamingUtil.java
--
diff --git a/hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/ProtobufStreamingUtil.java b/hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/ProtobufStreamingUtil.java
index 93bb940..cb0f4c8 100644
--- a/hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/ProtobufStreamingUtil.java
+++ b/hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/ProtobufStreamingUtil.java
@@ -49,8 +49,10 @@ public class ProtobufStreamingUtil implements StreamingOutput {
 this.contentType = type;
 this.limit = limit;
 this.fetchSize = fetchSize;
-LOG.debug("Created ScanStreamingUtil with content type = " + this.contentType + " user limit : "
-+ this.limit + " scan fetch size : " + this.fetchSize);
+if (LOG.isTraceEnabled()) {
+  LOG.trace("Created ScanStreamingUtil with content type = " + this.contentType
++ " user limit : " + this.limit + " scan fetch size : " + this.fetchSize);
+}

[5/5] hbase git commit: HBASE-15954 REST server should log requests with TRACE instead of DEBUG

2016-06-06 Thread enis
HBASE-15954 REST server should log requests with TRACE instead of DEBUG

Conflicts:
hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/RESTServer.java

hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/ScannerResource.java

Conflicts:
hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/client/Client.java

hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/filter/RestCsrfPreventionFilter.java


Project: http://git-wip-us.apache.org/repos/asf/hbase/repo
Commit: http://git-wip-us.apache.org/repos/asf/hbase/commit/70593efa
Tree: http://git-wip-us.apache.org/repos/asf/hbase/tree/70593efa
Diff: http://git-wip-us.apache.org/repos/asf/hbase/diff/70593efa

Branch: refs/heads/branch-1.2
Commit: 70593efa2760a4c0f5df353047200e5ed14c1035
Parents: edbf275
Author: Enis Soztutar 
Authored: Mon Jun 6 10:58:37 2016 -0700
Committer: Enis Soztutar 
Committed: Mon Jun 6 14:07:16 2016 -0700

--
 .../hadoop/hbase/rest/MultiRowResource.java |  4 +-
 .../hbase/rest/NamespacesInstanceResource.java  | 24 -
 .../hadoop/hbase/rest/NamespacesResource.java   |  4 +-
 .../hbase/rest/ProtobufStreamingUtil.java   | 10 ++--
 .../apache/hadoop/hbase/rest/RESTServer.java| 18 ---
 .../apache/hadoop/hbase/rest/RESTServlet.java   |  5 +-
 .../hadoop/hbase/rest/RegionsResource.java  |  4 +-
 .../apache/hadoop/hbase/rest/RootResource.java  |  4 +-
 .../apache/hadoop/hbase/rest/RowResource.java   | 54 ++--
 .../hbase/rest/ScannerInstanceResource.java | 32 +++-
 .../hadoop/hbase/rest/ScannerResource.java  | 18 +++
 .../hadoop/hbase/rest/SchemaResource.java   | 22 
 .../rest/StorageClusterStatusResource.java  |  4 +-
 .../rest/StorageClusterVersionResource.java |  4 +-
 .../apache/hadoop/hbase/rest/TableResource.java | 26 ++
 .../hadoop/hbase/rest/VersionResource.java  | 10 ++--
 .../apache/hadoop/hbase/rest/client/Client.java | 40 +++
 .../hadoop/hbase/rest/filter/AuthFilter.java|  4 +-
 .../consumer/ProtobufMessageBodyConsumer.java   |  6 +--
 .../hadoop/hbase/util/ConnectionCache.java  |  6 ++-
 20 files changed, 162 insertions(+), 137 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/hbase/blob/70593efa/hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/MultiRowResource.java
--
diff --git a/hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/MultiRowResource.java b/hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/MultiRowResource.java
index c88ac91..8ff3ef6 100644
--- a/hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/MultiRowResource.java
+++ b/hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/MultiRowResource.java
@@ -86,7 +86,9 @@ public class MultiRowResource extends ResourceBase implements Constants {
   }
   model.addRow(rowModel);
 } else {
-  LOG.trace("The row : " + rk + " not found in the table.");
+  if (LOG.isTraceEnabled()) {
+LOG.trace("The row : " + rk + " not found in the table.");
+  }
 }
   }
 

http://git-wip-us.apache.org/repos/asf/hbase/blob/70593efa/hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/NamespacesInstanceResource.java
--
diff --git a/hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/NamespacesInstanceResource.java b/hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/NamespacesInstanceResource.java
index 8f64738..c832905 100644
--- a/hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/NamespacesInstanceResource.java
+++ b/hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/NamespacesInstanceResource.java
@@ -91,8 +91,8 @@ public class NamespacesInstanceResource extends ResourceBase {
 MIMETYPE_PROTOBUF_IETF})
   public Response get(final @Context ServletContext context,
   final @Context UriInfo uriInfo) {
-if (LOG.isDebugEnabled()) {
-  LOG.debug("GET " + uriInfo.getAbsolutePath());
+if (LOG.isTraceEnabled()) {
+  LOG.trace("GET " + uriInfo.getAbsolutePath());
 }
 servlet.getMetrics().incrementRequests(1);
 
@@ -135,8 +135,8 @@ public class NamespacesInstanceResource extends ResourceBase {
   @Consumes({MIMETYPE_XML, MIMETYPE_JSON, MIMETYPE_PROTOBUF,
 MIMETYPE_PROTOBUF_IETF})
  public Response put(final NamespacesInstanceModel model, final @Context UriInfo uriInfo) {
-if (LOG.isDebugEnabled()) {
-  LOG.debug("PUT " + uriInfo.getAbsolutePath());
+if (LOG.isTraceEnabled()) {
+  LOG.trace("PUT " + uriInfo.getAbsolutePath());
 }
 servlet.getMetrics().incrementRequests(1);
 return processUpdate(model, true, uriInfo);
@@ -151,8 +151,8 @@ public class 

[1/5] hbase git commit: HBASE-15954 REST server should log requests with TRACE instead of DEBUG

2016-06-06 Thread enis
Repository: hbase
Updated Branches:
  refs/heads/branch-1 878b1ea72 -> 4a0a9a20d
  refs/heads/branch-1.1 73a746239 -> 218259c0e
  refs/heads/branch-1.2 edbf2754a -> 70593efa2
  refs/heads/branch-1.3 aa636bef4 -> 466eb3164
  refs/heads/master b21c56e79 -> 3d7840a17


HBASE-15954 REST server should log requests with TRACE instead of DEBUG


Project: http://git-wip-us.apache.org/repos/asf/hbase/repo
Commit: http://git-wip-us.apache.org/repos/asf/hbase/commit/3d7840a1
Tree: http://git-wip-us.apache.org/repos/asf/hbase/tree/3d7840a1
Diff: http://git-wip-us.apache.org/repos/asf/hbase/diff/3d7840a1

Branch: refs/heads/master
Commit: 3d7840a173aab97fb72409fa8c0f161fd7ad0e8f
Parents: b21c56e
Author: Enis Soztutar 
Authored: Mon Jun 6 10:58:37 2016 -0700
Committer: Enis Soztutar 
Committed: Mon Jun 6 10:58:37 2016 -0700

--
 .../hadoop/hbase/rest/MultiRowResource.java |  4 +-
 .../hbase/rest/NamespacesInstanceResource.java  | 24 -
 .../hadoop/hbase/rest/NamespacesResource.java   |  4 +-
 .../hbase/rest/ProtobufStreamingUtil.java   | 10 ++--
 .../apache/hadoop/hbase/rest/RESTServer.java| 12 +++--
 .../apache/hadoop/hbase/rest/RESTServlet.java   |  5 +-
 .../hadoop/hbase/rest/RegionsResource.java  |  4 +-
 .../apache/hadoop/hbase/rest/RootResource.java  |  4 +-
 .../apache/hadoop/hbase/rest/RowResource.java   | 54 ++--
 .../hbase/rest/ScannerInstanceResource.java | 32 +++-
 .../hadoop/hbase/rest/ScannerResource.java  | 17 +++---
 .../hadoop/hbase/rest/SchemaResource.java   | 22 
 .../rest/StorageClusterStatusResource.java  |  4 +-
 .../rest/StorageClusterVersionResource.java |  4 +-
 .../apache/hadoop/hbase/rest/TableResource.java | 26 ++
 .../hadoop/hbase/rest/VersionResource.java  | 10 ++--
 .../apache/hadoop/hbase/rest/client/Client.java | 44 
 .../hadoop/hbase/rest/filter/AuthFilter.java|  4 +-
 .../rest/filter/RestCsrfPreventionFilter.java   | 15 +++---
 .../consumer/ProtobufMessageBodyConsumer.java   |  6 +--
 .../hadoop/hbase/util/ConnectionCache.java  |  6 ++-
 21 files changed, 169 insertions(+), 142 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/hbase/blob/3d7840a1/hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/MultiRowResource.java
--
diff --git a/hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/MultiRowResource.java b/hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/MultiRowResource.java
index c88ac91..8ff3ef6 100644
--- a/hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/MultiRowResource.java
+++ b/hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/MultiRowResource.java
@@ -86,7 +86,9 @@ public class MultiRowResource extends ResourceBase implements Constants {
   }
   model.addRow(rowModel);
 } else {
-  LOG.trace("The row : " + rk + " not found in the table.");
+  if (LOG.isTraceEnabled()) {
+LOG.trace("The row : " + rk + " not found in the table.");
+  }
 }
   }
 

http://git-wip-us.apache.org/repos/asf/hbase/blob/3d7840a1/hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/NamespacesInstanceResource.java
--
diff --git a/hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/NamespacesInstanceResource.java b/hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/NamespacesInstanceResource.java
index 8f64738..c832905 100644
--- a/hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/NamespacesInstanceResource.java
+++ b/hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/NamespacesInstanceResource.java
@@ -91,8 +91,8 @@ public class NamespacesInstanceResource extends ResourceBase {
 MIMETYPE_PROTOBUF_IETF})
   public Response get(final @Context ServletContext context,
   final @Context UriInfo uriInfo) {
-if (LOG.isDebugEnabled()) {
-  LOG.debug("GET " + uriInfo.getAbsolutePath());
+if (LOG.isTraceEnabled()) {
+  LOG.trace("GET " + uriInfo.getAbsolutePath());
 }
 servlet.getMetrics().incrementRequests(1);
 
@@ -135,8 +135,8 @@ public class NamespacesInstanceResource extends ResourceBase {
   @Consumes({MIMETYPE_XML, MIMETYPE_JSON, MIMETYPE_PROTOBUF,
 MIMETYPE_PROTOBUF_IETF})
  public Response put(final NamespacesInstanceModel model, final @Context UriInfo uriInfo) {
-if (LOG.isDebugEnabled()) {
-  LOG.debug("PUT " + uriInfo.getAbsolutePath());
+if (LOG.isTraceEnabled()) {
+  LOG.trace("PUT " + uriInfo.getAbsolutePath());
 }
 servlet.getMetrics().incrementRequests(1);
 return processUpdate(model, true, uriInfo);
@@ -151,8 +151,8 @@ public class NamespacesInstanceResource extends ResourceBase {
   @PUT
   public 

[4/5] hbase git commit: HBASE-15954 REST server should log requests with TRACE instead of DEBUG

2016-06-06 Thread enis
HBASE-15954 REST server should log requests with TRACE instead of DEBUG

Conflicts:
hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/RESTServer.java

hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/ScannerResource.java

Conflicts:
hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/client/Client.java

hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/filter/RestCsrfPreventionFilter.java

Conflicts:

hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/NamespacesInstanceResource.java

hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/NamespacesResource.java
hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/RESTServlet.java

hbase-server/src/main/java/org/apache/hadoop/hbase/util/ConnectionCache.java


Project: http://git-wip-us.apache.org/repos/asf/hbase/repo
Commit: http://git-wip-us.apache.org/repos/asf/hbase/commit/218259c0
Tree: http://git-wip-us.apache.org/repos/asf/hbase/tree/218259c0
Diff: http://git-wip-us.apache.org/repos/asf/hbase/diff/218259c0

Branch: refs/heads/branch-1.1
Commit: 218259c0edcb7e8d8ee5f6a586c114fa2f33bc7f
Parents: 73a7462
Author: Enis Soztutar 
Authored: Mon Jun 6 10:58:37 2016 -0700
Committer: Enis Soztutar 
Committed: Mon Jun 6 14:06:32 2016 -0700

--
 .../hadoop/hbase/rest/MultiRowResource.java |   4 +-
 .../hbase/rest/ProtobufStreamingUtil.java   |  10 +-
 .../apache/hadoop/hbase/rest/RESTServer.java|  18 +-
 .../apache/hadoop/hbase/rest/RESTServlet.java   |   5 +-
 .../hadoop/hbase/rest/RegionsResource.java  |   4 +-
 .../apache/hadoop/hbase/rest/RootResource.java  |   4 +-
 .../apache/hadoop/hbase/rest/RowResource.java   |  54 ++--
 .../hbase/rest/ScannerInstanceResource.java |  32 ++-
 .../hadoop/hbase/rest/ScannerResource.java  |  18 +-
 .../hadoop/hbase/rest/SchemaResource.java   |  22 +-
 .../rest/StorageClusterStatusResource.java  |   4 +-
 .../rest/StorageClusterVersionResource.java |   4 +-
 .../apache/hadoop/hbase/rest/TableResource.java |  26 +-
 .../hadoop/hbase/rest/VersionResource.java  |  10 +-
 .../apache/hadoop/hbase/rest/client/Client.java |  40 +--
 .../hadoop/hbase/rest/filter/AuthFilter.java|   4 +-
 .../rest/filter/RestCsrfPreventionFilter.java   | 286 +++
 .../consumer/ProtobufMessageBodyConsumer.java   |   6 +-
 .../hadoop/hbase/util/ConnectionCache.java  |   6 +-
 19 files changed, 434 insertions(+), 123 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/hbase/blob/218259c0/hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/MultiRowResource.java
--
diff --git a/hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/MultiRowResource.java b/hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/MultiRowResource.java
index c88ac91..8ff3ef6 100644
--- a/hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/MultiRowResource.java
+++ b/hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/MultiRowResource.java
@@ -86,7 +86,9 @@ public class MultiRowResource extends ResourceBase implements Constants {
   }
   model.addRow(rowModel);
 } else {
-  LOG.trace("The row : " + rk + " not found in the table.");
+  if (LOG.isTraceEnabled()) {
+LOG.trace("The row : " + rk + " not found in the table.");
+  }
 }
   }
 

http://git-wip-us.apache.org/repos/asf/hbase/blob/218259c0/hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/ProtobufStreamingUtil.java
--
diff --git a/hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/ProtobufStreamingUtil.java b/hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/ProtobufStreamingUtil.java
index 93bb940..cb0f4c8 100644
--- a/hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/ProtobufStreamingUtil.java
+++ b/hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/ProtobufStreamingUtil.java
@@ -49,8 +49,10 @@ public class ProtobufStreamingUtil implements StreamingOutput {
 this.contentType = type;
 this.limit = limit;
 this.fetchSize = fetchSize;
-LOG.debug("Created ScanStreamingUtil with content type = " + this.contentType + " user limit : "
-+ this.limit + " scan fetch size : " + this.fetchSize);
+if (LOG.isTraceEnabled()) {
+  LOG.trace("Created ScanStreamingUtil with content type = " + this.contentType
++ " user limit : " + this.limit + " scan fetch size : " + this.fetchSize);
+}
   }
 
   @Override
@@ -82,7 +84,9 @@ public class ProtobufStreamingUtil implements StreamingOutput {
 outStream.write(Bytes.toBytes((short)objectBytes.length));
 outStream.write(objectBytes);
 outStream.flush();
-LOG.trace("Wrote " + 

[2/5] hbase git commit: HBASE-15954 REST server should log requests with TRACE instead of DEBUG

2016-06-06 Thread enis
HBASE-15954 REST server should log requests with TRACE instead of DEBUG

Conflicts:
hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/RESTServer.java

hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/ScannerResource.java


Project: http://git-wip-us.apache.org/repos/asf/hbase/repo
Commit: http://git-wip-us.apache.org/repos/asf/hbase/commit/4a0a9a20
Tree: http://git-wip-us.apache.org/repos/asf/hbase/tree/4a0a9a20
Diff: http://git-wip-us.apache.org/repos/asf/hbase/diff/4a0a9a20

Branch: refs/heads/branch-1
Commit: 4a0a9a20dd5bbfdafe2ec95196b449d2e1a45a13
Parents: 878b1ea
Author: Enis Soztutar 
Authored: Mon Jun 6 10:58:37 2016 -0700
Committer: Enis Soztutar 
Committed: Mon Jun 6 11:06:52 2016 -0700

--
 .../hadoop/hbase/rest/MultiRowResource.java |  4 +-
 .../hbase/rest/NamespacesInstanceResource.java  | 24 -
 .../hadoop/hbase/rest/NamespacesResource.java   |  4 +-
 .../hbase/rest/ProtobufStreamingUtil.java   | 10 ++--
 .../apache/hadoop/hbase/rest/RESTServer.java| 18 ---
 .../apache/hadoop/hbase/rest/RESTServlet.java   |  5 +-
 .../hadoop/hbase/rest/RegionsResource.java  |  4 +-
 .../apache/hadoop/hbase/rest/RootResource.java  |  4 +-
 .../apache/hadoop/hbase/rest/RowResource.java   | 54 ++--
 .../hbase/rest/ScannerInstanceResource.java | 32 +++-
 .../hadoop/hbase/rest/ScannerResource.java  | 18 +++
 .../hadoop/hbase/rest/SchemaResource.java   | 22 
 .../rest/StorageClusterStatusResource.java  |  4 +-
 .../rest/StorageClusterVersionResource.java |  4 +-
 .../apache/hadoop/hbase/rest/TableResource.java | 26 ++
 .../hadoop/hbase/rest/VersionResource.java  | 10 ++--
 .../apache/hadoop/hbase/rest/client/Client.java | 44 
 .../hadoop/hbase/rest/filter/AuthFilter.java|  4 +-
 .../rest/filter/RestCsrfPreventionFilter.java   | 15 +++---
 .../consumer/ProtobufMessageBodyConsumer.java   |  6 +--
 .../hadoop/hbase/util/ConnectionCache.java  |  6 ++-
 21 files changed, 171 insertions(+), 147 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/hbase/blob/4a0a9a20/hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/MultiRowResource.java
--
diff --git a/hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/MultiRowResource.java b/hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/MultiRowResource.java
index c88ac91..8ff3ef6 100644
--- a/hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/MultiRowResource.java
+++ b/hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/MultiRowResource.java
@@ -86,7 +86,9 @@ public class MultiRowResource extends ResourceBase implements Constants {
   }
   model.addRow(rowModel);
 } else {
-  LOG.trace("The row : " + rk + " not found in the table.");
+  if (LOG.isTraceEnabled()) {
+LOG.trace("The row : " + rk + " not found in the table.");
+  }
 }
   }
 

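The hunk above wraps an eagerly concatenated trace message in an `isTraceEnabled()` guard. A minimal, self-contained sketch of why the guard matters — the `Log` class below is a hypothetical stand-in for `org.apache.commons.logging.Log`, and the counter is only there to make the saved work observable:

```java
import java.util.ArrayList;
import java.util.List;

public class GuardedLogDemo {
    // Hypothetical stand-in for org.apache.commons.logging.Log.
    static class Log {
        private final boolean traceEnabled;
        final List<String> lines = new ArrayList<>();
        Log(boolean traceEnabled) { this.traceEnabled = traceEnabled; }
        boolean isTraceEnabled() { return traceEnabled; }
        void trace(String msg) { lines.add(msg); }
    }

    static int buildCount = 0;

    static String expensiveMessage(String rk) {
        buildCount++;  // counts how often the message string is actually built
        return "The row : " + rk + " not found in the table.";
    }

    static void logRowMiss(Log log, String rk) {
        // The guard skips message construction entirely when TRACE is off.
        if (log.isTraceEnabled()) {
            log.trace(expensiveMessage(rk));
        }
    }

    public static void main(String[] args) {
        Log quiet = new Log(false);
        logRowMiss(quiet, "row1");
        System.out.println(buildCount);   // 0 -- message never constructed

        Log verbose = new Log(true);
        logRowMiss(verbose, "row1");
        System.out.println(buildCount);   // 1
        System.out.println(verbose.lines.get(0));
    }
}
```

Without the guard, the `"The row : " + rk + ...` concatenation runs on every miss even when TRACE output is discarded, which is exactly the cost this commit avoids on hot request paths.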
http://git-wip-us.apache.org/repos/asf/hbase/blob/4a0a9a20/hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/NamespacesInstanceResource.java
--
diff --git a/hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/NamespacesInstanceResource.java b/hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/NamespacesInstanceResource.java
index 8f64738..c832905 100644
--- a/hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/NamespacesInstanceResource.java
+++ b/hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/NamespacesInstanceResource.java
@@ -91,8 +91,8 @@ public class NamespacesInstanceResource extends ResourceBase {
 MIMETYPE_PROTOBUF_IETF})
   public Response get(final @Context ServletContext context,
   final @Context UriInfo uriInfo) {
-if (LOG.isDebugEnabled()) {
-  LOG.debug("GET " + uriInfo.getAbsolutePath());
+if (LOG.isTraceEnabled()) {
+  LOG.trace("GET " + uriInfo.getAbsolutePath());
 }
 servlet.getMetrics().incrementRequests(1);
 
@@ -135,8 +135,8 @@ public class NamespacesInstanceResource extends ResourceBase {
   @Consumes({MIMETYPE_XML, MIMETYPE_JSON, MIMETYPE_PROTOBUF,
 MIMETYPE_PROTOBUF_IETF})
   public Response put(final NamespacesInstanceModel model, final @Context 
UriInfo uriInfo) {
-if (LOG.isDebugEnabled()) {
-  LOG.debug("PUT " + uriInfo.getAbsolutePath());
+if (LOG.isTraceEnabled()) {
+  LOG.trace("PUT " + uriInfo.getAbsolutePath());
 }
 servlet.getMetrics().incrementRequests(1);
 return processUpdate(model, true, uriInfo);
@@ -151,8 +151,8 @@ public class NamespacesInstanceResource extends ResourceBase {
   @PUT
   public Response putNoBody(final byte[] message,
   final @Context UriInfo uriInfo, final 

[3/5] hbase git commit: HBASE-15954 REST server should log requests with TRACE instead of DEBUG

2016-06-06 Thread enis
HBASE-15954 REST server should log requests with TRACE instead of DEBUG

Conflicts:
hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/RESTServer.java

hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/ScannerResource.java

Conflicts:
hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/client/Client.java

hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/filter/RestCsrfPreventionFilter.java


Project: http://git-wip-us.apache.org/repos/asf/hbase/repo
Commit: http://git-wip-us.apache.org/repos/asf/hbase/commit/466eb316
Tree: http://git-wip-us.apache.org/repos/asf/hbase/tree/466eb316
Diff: http://git-wip-us.apache.org/repos/asf/hbase/diff/466eb316

Branch: refs/heads/branch-1.3
Commit: 466eb31648a4783c79d9b044fdd84d0db25c3d12
Parents: aa636be
Author: Enis Soztutar 
Authored: Mon Jun 6 10:58:37 2016 -0700
Committer: Enis Soztutar 
Committed: Mon Jun 6 11:34:33 2016 -0700

--
 .../hadoop/hbase/rest/MultiRowResource.java |   4 +-
 .../hbase/rest/NamespacesInstanceResource.java  |  24 +-
 .../hadoop/hbase/rest/NamespacesResource.java   |   4 +-
 .../hbase/rest/ProtobufStreamingUtil.java   |  10 +-
 .../apache/hadoop/hbase/rest/RESTServer.java|  18 +-
 .../apache/hadoop/hbase/rest/RESTServlet.java   |   5 +-
 .../hadoop/hbase/rest/RegionsResource.java  |   4 +-
 .../apache/hadoop/hbase/rest/RootResource.java  |   4 +-
 .../apache/hadoop/hbase/rest/RowResource.java   |  54 ++--
 .../hbase/rest/ScannerInstanceResource.java |  32 ++-
 .../hadoop/hbase/rest/ScannerResource.java  |  18 +-
 .../hadoop/hbase/rest/SchemaResource.java   |  22 +-
 .../rest/StorageClusterStatusResource.java  |   4 +-
 .../rest/StorageClusterVersionResource.java |   4 +-
 .../apache/hadoop/hbase/rest/TableResource.java |  26 +-
 .../hadoop/hbase/rest/VersionResource.java  |  10 +-
 .../apache/hadoop/hbase/rest/client/Client.java |  40 +--
 .../hadoop/hbase/rest/filter/AuthFilter.java|   4 +-
 .../rest/filter/RestCsrfPreventionFilter.java   | 286 +++
 .../consumer/ProtobufMessageBodyConsumer.java   |   6 +-
 .../hadoop/hbase/util/ConnectionCache.java  |   6 +-
 21 files changed, 448 insertions(+), 137 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/hbase/blob/466eb316/hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/MultiRowResource.java
--
diff --git a/hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/MultiRowResource.java b/hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/MultiRowResource.java
index c88ac91..8ff3ef6 100644
--- a/hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/MultiRowResource.java
+++ b/hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/MultiRowResource.java
@@ -86,7 +86,9 @@ public class MultiRowResource extends ResourceBase implements Constants {
   }
   model.addRow(rowModel);
 } else {
-  LOG.trace("The row : " + rk + " not found in the table.");
+  if (LOG.isTraceEnabled()) {
+LOG.trace("The row : " + rk + " not found in the table.");
+  }
 }
   }
 

http://git-wip-us.apache.org/repos/asf/hbase/blob/466eb316/hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/NamespacesInstanceResource.java
--
diff --git a/hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/NamespacesInstanceResource.java b/hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/NamespacesInstanceResource.java
index 8f64738..c832905 100644
--- a/hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/NamespacesInstanceResource.java
+++ b/hbase-rest/src/main/java/org/apache/hadoop/hbase/rest/NamespacesInstanceResource.java
@@ -91,8 +91,8 @@ public class NamespacesInstanceResource extends ResourceBase {
 MIMETYPE_PROTOBUF_IETF})
   public Response get(final @Context ServletContext context,
   final @Context UriInfo uriInfo) {
-if (LOG.isDebugEnabled()) {
-  LOG.debug("GET " + uriInfo.getAbsolutePath());
+if (LOG.isTraceEnabled()) {
+  LOG.trace("GET " + uriInfo.getAbsolutePath());
 }
 servlet.getMetrics().incrementRequests(1);
 
@@ -135,8 +135,8 @@ public class NamespacesInstanceResource extends ResourceBase {
   @Consumes({MIMETYPE_XML, MIMETYPE_JSON, MIMETYPE_PROTOBUF,
 MIMETYPE_PROTOBUF_IETF})
   public Response put(final NamespacesInstanceModel model, final @Context 
UriInfo uriInfo) {
-if (LOG.isDebugEnabled()) {
-  LOG.debug("PUT " + uriInfo.getAbsolutePath());
+if (LOG.isTraceEnabled()) {
+  LOG.trace("PUT " + uriInfo.getAbsolutePath());
 }
 servlet.getMetrics().incrementRequests(1);
 return processUpdate(model, true, uriInfo);
@@ -151,8 +151,8 @@ public class 

[29/52] [partial] hbase-site git commit: Published site at b21c56e7958652ca6e6daf04642eb51abaf2b3d7.

2016-06-06 Thread misty
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/devapidocs/org/apache/hadoop/hbase/class-use/KeepDeletedCells.html
--
diff --git a/devapidocs/org/apache/hadoop/hbase/class-use/KeepDeletedCells.html b/devapidocs/org/apache/hadoop/hbase/class-use/KeepDeletedCells.html
index e9a6b35..4f4468b 100644
--- a/devapidocs/org/apache/hadoop/hbase/class-use/KeepDeletedCells.html
+++ b/devapidocs/org/apache/hadoop/hbase/class-use/KeepDeletedCells.html
@@ -159,13 +159,13 @@ the order they are declared.
 
 
 private KeepDeletedCells
-ScanQueryMatcher.keepDeletedCells
-whether to return deleted rows
-
+ScanInfo.keepDeletedCells
 
 
 private KeepDeletedCells
-ScanInfo.keepDeletedCells
+ScanQueryMatcher.keepDeletedCells
+whether to return deleted rows
+
 
 
 

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/devapidocs/org/apache/hadoop/hbase/class-use/KeyValue.KeyOnlyKeyValue.html
--
diff --git a/devapidocs/org/apache/hadoop/hbase/class-use/KeyValue.KeyOnlyKeyValue.html b/devapidocs/org/apache/hadoop/hbase/class-use/KeyValue.KeyOnlyKeyValue.html
index 025a9fd..6cbd252 100644
--- a/devapidocs/org/apache/hadoop/hbase/class-use/KeyValue.KeyOnlyKeyValue.html
+++ b/devapidocs/org/apache/hadoop/hbase/class-use/KeyValue.KeyOnlyKeyValue.html
@@ -122,11 +122,11 @@
 
 
 private KeyValue.KeyOnlyKeyValue
-StoreFileReader.lastBloomKeyOnlyKV
+StoreFileWriter.lastBloomKeyOnlyKV
 
 
 private KeyValue.KeyOnlyKeyValue
-StoreFileWriter.lastBloomKeyOnlyKV
+StoreFileReader.lastBloomKeyOnlyKV
 
 
 

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/devapidocs/org/apache/hadoop/hbase/class-use/KeyValue.html
--
diff --git a/devapidocs/org/apache/hadoop/hbase/class-use/KeyValue.html b/devapidocs/org/apache/hadoop/hbase/class-use/KeyValue.html
index 4cec977..73a2ea5 100644
--- a/devapidocs/org/apache/hadoop/hbase/class-use/KeyValue.html
+++ b/devapidocs/org/apache/hadoop/hbase/class-use/KeyValue.html
@@ -201,22 +201,22 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
 
 
 static KeyValue
-KeyValueUtil.create(http://docs.oracle.com/javase/7/docs/api/java/io/DataInput.html?is-external=true;
 title="class or interface in java.io">DataInputin)
+KeyValue.create(http://docs.oracle.com/javase/7/docs/api/java/io/DataInput.html?is-external=true;
 title="class or interface in java.io">DataInputin)
 
 
 static KeyValue
-KeyValue.create(http://docs.oracle.com/javase/7/docs/api/java/io/DataInput.html?is-external=true;
 title="class or interface in java.io">DataInputin)
+KeyValueUtil.create(http://docs.oracle.com/javase/7/docs/api/java/io/DataInput.html?is-external=true;
 title="class or interface in java.io">DataInputin)
 
 
 static KeyValue
-KeyValueUtil.create(intlength,
+KeyValue.create(intlength,
 http://docs.oracle.com/javase/7/docs/api/java/io/DataInput.html?is-external=true;
 title="class or interface in java.io">DataInputin)
 Create a KeyValue reading length from 
in
 
 
 
 static KeyValue
-KeyValue.create(intlength,
+KeyValueUtil.create(intlength,
 http://docs.oracle.com/javase/7/docs/api/java/io/DataInput.html?is-external=true;
 title="class or interface in java.io">DataInputin)
 Create a KeyValue reading length from 
in
 
@@ -332,31 +332,31 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
 
 
 static KeyValue
-KeyValueUtil.createKeyValueFromKey(byte[]b)
+KeyValue.createKeyValueFromKey(byte[]b)
 
 
 static KeyValue
-KeyValue.createKeyValueFromKey(byte[]b)
+KeyValueUtil.createKeyValueFromKey(byte[]b)
 
 
 static KeyValue
-KeyValueUtil.createKeyValueFromKey(byte[]b,
+KeyValue.createKeyValueFromKey(byte[]b,
   into,
   intl)
 
 
 static KeyValue
-KeyValue.createKeyValueFromKey(byte[]b,
+KeyValueUtil.createKeyValueFromKey(byte[]b,
   into,
   intl)
 
 
 static KeyValue
-KeyValueUtil.createKeyValueFromKey(http://docs.oracle.com/javase/7/docs/api/java/nio/ByteBuffer.html?is-external=true;
 title="class or interface in 
java.nio">ByteBufferbb)
+KeyValue.createKeyValueFromKey(http://docs.oracle.com/javase/7/docs/api/java/nio/ByteBuffer.html?is-external=true;
 title="class or interface in 
java.nio">ByteBufferbb)
 
 
 static KeyValue
-KeyValue.createKeyValueFromKey(http://docs.oracle.com/javase/7/docs/api/java/nio/ByteBuffer.html?is-external=true;
 title="class or interface in 
java.nio">ByteBufferbb)
+KeyValueUtil.createKeyValueFromKey(http://docs.oracle.com/javase/7/docs/api/java/nio/ByteBuffer.html?is-external=true;
 title="class or interface in 
java.nio">ByteBufferbb)
 
 
 static KeyValue
@@ -520,17 +520,17 @@ Input/OutputFormats, a table indexing MapReduce job, and 

[19/52] [partial] hbase-site git commit: Published site at b21c56e7958652ca6e6daf04642eb51abaf2b3d7.

2016-06-06 Thread misty
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/devapidocs/org/apache/hadoop/hbase/client/class-use/Get.html
--
diff --git a/devapidocs/org/apache/hadoop/hbase/client/class-use/Get.html b/devapidocs/org/apache/hadoop/hbase/client/class-use/Get.html
index 19505e7..a4e1d9d 100644
--- a/devapidocs/org/apache/hadoop/hbase/client/class-use/Get.html
+++ b/devapidocs/org/apache/hadoop/hbase/client/class-use/Get.html
@@ -289,13 +289,13 @@ service.
 
 
 boolean
-Table.exists(Getget)
+HTable.exists(Getget)
 Test for the existence of columns in the table, as 
specified by the Get.
 
 
 
 boolean
-HTable.exists(Getget)
+Table.exists(Getget)
 Test for the existence of columns in the table, as 
specified by the Get.
 
 
@@ -305,13 +305,13 @@ service.
 
 
 Result
-Table.get(Getget)
+HTable.get(Getget)
 Extracts certain cells from a given row.
 
 
 
 Result
-HTable.get(Getget)
+Table.get(Getget)
 Extracts certain cells from a given row.
 
 
@@ -343,13 +343,13 @@ service.
 
 
 boolean[]
-Table.existsAll(http://docs.oracle.com/javase/7/docs/api/java/util/List.html?is-external=true;
 title="class or interface in java.util">ListGetgets)
+HTable.existsAll(http://docs.oracle.com/javase/7/docs/api/java/util/List.html?is-external=true;
 title="class or interface in java.util">ListGetgets)
 Test for the existence of columns in the table, as 
specified by the Gets.
 
 
 
 boolean[]
-HTable.existsAll(http://docs.oracle.com/javase/7/docs/api/java/util/List.html?is-external=true;
 title="class or interface in java.util">ListGetgets)
+Table.existsAll(http://docs.oracle.com/javase/7/docs/api/java/util/List.html?is-external=true;
 title="class or interface in java.util">ListGetgets)
 Test for the existence of columns in the table, as 
specified by the Gets.
 
 
@@ -359,13 +359,13 @@ service.
 
 
 Result[]
-Table.get(http://docs.oracle.com/javase/7/docs/api/java/util/List.html?is-external=true;
 title="class or interface in java.util">ListGetgets)
+HTable.get(http://docs.oracle.com/javase/7/docs/api/java/util/List.html?is-external=true;
 title="class or interface in java.util">ListGetgets)
 Extracts certain cells from the given rows, in batch.
 
 
 
 Result[]
-HTable.get(http://docs.oracle.com/javase/7/docs/api/java/util/List.html?is-external=true;
 title="class or interface in java.util">ListGetgets)
+Table.get(http://docs.oracle.com/javase/7/docs/api/java/util/List.html?is-external=true;
 title="class or interface in java.util">ListGetgets)
 Extracts certain cells from the given rows, in batch.
 
 
@@ -417,33 +417,39 @@ service.
 
 
 boolean
-RegionObserver.postExists(ObserverContextRegionCoprocessorEnvironmentc,
+BaseRegionObserver.postExists(ObserverContextRegionCoprocessorEnvironmente,
 Getget,
-booleanexists)
-Called after the client tests for existence using a 
Get.
-
+booleanexists)
 
 
 boolean
-BaseRegionObserver.postExists(ObserverContextRegionCoprocessorEnvironmente,
+RegionObserver.postExists(ObserverContextRegionCoprocessorEnvironmentc,
 Getget,
-booleanexists)
+booleanexists)
+Called after the client tests for existence using a 
Get.
+
 
 
 void
+BaseRegionObserver.postGetOp(ObserverContextRegionCoprocessorEnvironmente,
+  Getget,
+  http://docs.oracle.com/javase/7/docs/api/java/util/List.html?is-external=true;
 title="class or interface in java.util">ListCellresults)
+
+
+void
 RegionObserver.postGetOp(ObserverContextRegionCoprocessorEnvironmentc,
   Getget,
   http://docs.oracle.com/javase/7/docs/api/java/util/List.html?is-external=true;
 title="class or interface in java.util">ListCellresult)
 Called after the client performs a Get
 
 
-
-void
-BaseRegionObserver.postGetOp(ObserverContextRegionCoprocessorEnvironmente,
+
+boolean
+BaseRegionObserver.preExists(ObserverContextRegionCoprocessorEnvironmente,
   Getget,
-  http://docs.oracle.com/javase/7/docs/api/java/util/List.html?is-external=true;
 title="class or interface in java.util">ListCellresults)
+  booleanexists)
 
-
+
 boolean
 RegionObserver.preExists(ObserverContextRegionCoprocessorEnvironmentc,
   Getget,
@@ -451,27 +457,29 @@ service.
 Called before the client tests for existence using a 
Get.
 
 
-
-boolean
-BaseRegionObserver.preExists(ObserverContextRegionCoprocessorEnvironmente,
-  Getget,
-  booleanexists)
-
 
 void
+BaseRegionObserver.preGetOp(ObserverContextRegionCoprocessorEnvironmente,
+Getget,
+http://docs.oracle.com/javase/7/docs/api/java/util/List.html?is-external=true;
 title="class or interface in java.util">ListCellresults)
+
+
+void
 RegionObserver.preGetOp(ObserverContextRegionCoprocessorEnvironmentc,
 Getget,
 

[46/52] [partial] hbase-site git commit: Published site at b21c56e7958652ca6e6daf04642eb51abaf2b3d7.

2016-06-06 Thread misty
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/apidocs/org/apache/hadoop/hbase/util/class-use/Order.html
--
diff --git a/apidocs/org/apache/hadoop/hbase/util/class-use/Order.html b/apidocs/org/apache/hadoop/hbase/util/class-use/Order.html
index e3c1f44..b77ad3c 100644
--- a/apidocs/org/apache/hadoop/hbase/util/class-use/Order.html
+++ b/apidocs/org/apache/hadoop/hbase/util/class-use/Order.html
@@ -104,7 +104,7 @@
 
 
 protected Order
-OrderedBytesBase.order
+RawBytes.order
 
 
 protected Order
@@ -112,7 +112,7 @@
 
 
 protected Order
-RawBytes.order
+OrderedBytesBase.order
 
 
 
@@ -125,7 +125,7 @@
 
 
 Order
-Struct.getOrder()
+Union3.getOrder()
 
 
 Order
@@ -133,15 +133,15 @@
 
 
 Order
-PBType.getOrder()
+RawShort.getOrder()
 
 
 Order
-RawFloat.getOrder()
+FixedLengthWrapper.getOrder()
 
 
 Order
-RawByte.getOrder()
+RawInteger.getOrder()
 
 
 Order
@@ -149,50 +149,50 @@
 
 
 Order
-DataType.getOrder()
-Retrieve the sort Order imposed by this data type, 
or null when
- natural ordering is not preserved.
-
+RawBytes.getOrder()
 
 
 Order
-Union3.getOrder()
+TerminatedWrapper.getOrder()
 
 
 Order
-Union4.getOrder()
+Struct.getOrder()
 
 
 Order
-RawInteger.getOrder()
+RawByte.getOrder()
 
 
 Order
-RawDouble.getOrder()
+Union4.getOrder()
 
 
 Order
-TerminatedWrapper.getOrder()
+RawString.getOrder()
 
 
 Order
-OrderedBytesBase.getOrder()
+PBType.getOrder()
 
 
 Order
-RawString.getOrder()
+RawFloat.getOrder()
 
 
 Order
-FixedLengthWrapper.getOrder()
+DataType.getOrder()
+Retrieve the sort Order imposed by this data type, 
or null when
+ natural ordering is not preserved.
+
 
 
 Order
-RawBytes.getOrder()
+RawDouble.getOrder()
 
 
 Order
-RawShort.getOrder()
+OrderedBytesBase.getOrder()
 
 
 



[37/52] [partial] hbase-site git commit: Published site at b21c56e7958652ca6e6daf04642eb51abaf2b3d7.

2016-06-06 Thread misty
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/devapidocs/org/apache/hadoop/hbase/HColumnDescriptor.html
--
diff --git a/devapidocs/org/apache/hadoop/hbase/HColumnDescriptor.html b/devapidocs/org/apache/hadoop/hbase/HColumnDescriptor.html
index f38febf..72ed7c1 100644
--- a/devapidocs/org/apache/hadoop/hbase/HColumnDescriptor.html
+++ b/devapidocs/org/apache/hadoop/hbase/HColumnDescriptor.html
@@ -354,7 +354,7 @@ implements http://docs.oracle.com/javase/7/docs/api/java/lang/Comparabl
 FOREVER
 
 
-private static http://docs.oracle.com/javase/7/docs/api/java/lang/String.html?is-external=true;
 title="class or interface in java.lang">String
+static http://docs.oracle.com/javase/7/docs/api/java/lang/String.html?is-external=true;
 title="class or interface in java.lang">String
 IN_MEMORY_COMPACTION
 
 
@@ -871,7 +871,7 @@ implements http://docs.oracle.com/javase/7/docs/api/java/lang/Comparabl
 
 
 IN_MEMORY_COMPACTION
-private static finalhttp://docs.oracle.com/javase/7/docs/api/java/lang/String.html?is-external=true;
 title="class or interface in java.lang">String IN_MEMORY_COMPACTION
+public static finalhttp://docs.oracle.com/javase/7/docs/api/java/lang/String.html?is-external=true;
 title="class or interface in java.lang">String IN_MEMORY_COMPACTION
 See Also:Constant
 Field Values
 
 

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/devapidocs/org/apache/hadoop/hbase/KeepDeletedCells.html
--
diff --git a/devapidocs/org/apache/hadoop/hbase/KeepDeletedCells.html b/devapidocs/org/apache/hadoop/hbase/KeepDeletedCells.html
index 41fb732..0bfb780 100644
--- a/devapidocs/org/apache/hadoop/hbase/KeepDeletedCells.html
+++ b/devapidocs/org/apache/hadoop/hbase/KeepDeletedCells.html
@@ -249,7 +249,7 @@ the order they are declared.
 
 
 values
-public staticKeepDeletedCells[]values()
+public staticKeepDeletedCells[]values()
 Returns an array containing the constants of this enum 
type, in
 the order they are declared.  This method may be used to iterate
 over the constants as follows:
@@ -266,7 +266,7 @@ for (KeepDeletedCells c : KeepDeletedCells.values())
 
 
 valueOf
-public staticKeepDeletedCellsvalueOf(http://docs.oracle.com/javase/7/docs/api/java/lang/String.html?is-external=true;
 title="class or interface in java.lang">Stringname)
+public staticKeepDeletedCellsvalueOf(http://docs.oracle.com/javase/7/docs/api/java/lang/String.html?is-external=true;
 title="class or interface in java.lang">Stringname)
 Returns the enum constant of this type with the specified 
name.
 The string must match exactly an identifier used to declare an
 enum constant in this type.  (Extraneous whitespace characters are 

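The regenerated javadoc above restates the standard contract of an enum's `values()` and `valueOf(String)`. A runnable illustration of that contract, using a stand-in enum (`KeepDeletedCellsLike` is hypothetical; the real `KeepDeletedCells` ships in HBase and takes the values FALSE, TRUE, TTL):

```java
public class EnumContractDemo {
    // Stand-in mirroring the constants of org.apache.hadoop.hbase.KeepDeletedCells.
    enum KeepDeletedCellsLike { FALSE, TRUE, TTL }

    public static void main(String[] args) {
        // values() returns the constants in the order they are declared.
        for (KeepDeletedCellsLike c : KeepDeletedCellsLike.values()) {
            System.out.println(c);
        }

        // valueOf(name) requires an exact match on the declared identifier;
        // extraneous whitespace or case differences throw IllegalArgumentException.
        KeepDeletedCellsLike ttl = KeepDeletedCellsLike.valueOf("TTL");
        System.out.println(ttl == KeepDeletedCellsLike.TTL);  // true
    }
}
```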
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/devapidocs/org/apache/hadoop/hbase/KeyValue.Type.html
--
diff --git a/devapidocs/org/apache/hadoop/hbase/KeyValue.Type.html b/devapidocs/org/apache/hadoop/hbase/KeyValue.Type.html
index c31803c..521f7b6 100644
--- a/devapidocs/org/apache/hadoop/hbase/KeyValue.Type.html
+++ b/devapidocs/org/apache/hadoop/hbase/KeyValue.Type.html
@@ -331,7 +331,7 @@ the order they are declared.
 
 
 values
-public staticKeyValue.Type[]values()
+public staticKeyValue.Type[]values()
 Returns an array containing the constants of this enum 
type, in
 the order they are declared.  This method may be used to iterate
 over the constants as follows:
@@ -348,7 +348,7 @@ for (KeyValue.Type c : KeyValue.Type.values())
 
 
 valueOf
-public staticKeyValue.TypevalueOf(http://docs.oracle.com/javase/7/docs/api/java/lang/String.html?is-external=true;
 title="class or interface in java.lang">Stringname)
+public staticKeyValue.TypevalueOf(http://docs.oracle.com/javase/7/docs/api/java/lang/String.html?is-external=true;
 title="class or interface in java.lang">Stringname)
 Returns the enum constant of this type with the specified 
name.
 The string must match exactly an identifier used to declare an
 enum constant in this type.  (Extraneous whitespace characters are 

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/devapidocs/org/apache/hadoop/hbase/MetaTableAccessor.QueryType.html
--
diff --git a/devapidocs/org/apache/hadoop/hbase/MetaTableAccessor.QueryType.html b/devapidocs/org/apache/hadoop/hbase/MetaTableAccessor.QueryType.html
index f552c5a..8064bcf 100644
--- a/devapidocs/org/apache/hadoop/hbase/MetaTableAccessor.QueryType.html
+++ b/devapidocs/org/apache/hadoop/hbase/MetaTableAccessor.QueryType.html
@@ -275,7 +275,7 @@ the order they are declared.
 
 
 values
-public staticMetaTableAccessor.QueryType[]values()
+public staticMetaTableAccessor.QueryType[]values()
 Returns an array containing the constants of this enum 
type, in
 the order they are declared.  This method 

[23/52] [partial] hbase-site git commit: Published site at b21c56e7958652ca6e6daf04642eb51abaf2b3d7.

2016-06-06 Thread misty
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/devapidocs/org/apache/hadoop/hbase/client/AsyncProcess.AsyncRequestFutureImpl.html
--
diff --git a/devapidocs/org/apache/hadoop/hbase/client/AsyncProcess.AsyncRequestFutureImpl.html b/devapidocs/org/apache/hadoop/hbase/client/AsyncProcess.AsyncRequestFutureImpl.html
index d5423d0..60e7979 100644
--- a/devapidocs/org/apache/hadoop/hbase/client/AsyncProcess.AsyncRequestFutureImpl.html
+++ b/devapidocs/org/apache/hadoop/hbase/client/AsyncProcess.AsyncRequestFutureImpl.html
@@ -103,7 +103,7 @@
 
 
 
-protected class AsyncProcess.AsyncRequestFutureImplCResult
+protected class AsyncProcess.AsyncRequestFutureImplCResult
 extends http://docs.oracle.com/javase/7/docs/api/java/lang/Object.html?is-external=true;
 title="class or interface in java.lang">Object
 implements AsyncProcess.AsyncRequestFuture
 The context, and return value, for a single 
submit/submitAll call.
@@ -473,7 +473,7 @@ implements 
 
 callback
-private finalBatch.CallbackCResult 
callback
+private finalBatch.CallbackCResult 
callback
 
 
 
@@ -482,7 +482,7 @@ implements 
 
 errors
-private finalAsyncProcess.BatchErrors 
errors
+private finalAsyncProcess.BatchErrors 
errors
 
 
 
@@ -491,7 +491,7 @@ implements 
 
 errorsByServer
-private finalConnectionImplementation.ServerErrorTracker 
errorsByServer
+private finalConnectionImplementation.ServerErrorTracker 
errorsByServer
 
 
 
@@ -500,7 +500,7 @@ implements 
 
 pool
-private finalhttp://docs.oracle.com/javase/7/docs/api/java/util/concurrent/ExecutorService.html?is-external=true;
 title="class or interface in java.util.concurrent">ExecutorService pool
+private finalhttp://docs.oracle.com/javase/7/docs/api/java/util/concurrent/ExecutorService.html?is-external=true;
 title="class or interface in java.util.concurrent">ExecutorService pool
 
 
 
@@ -509,7 +509,7 @@ implements 
 
 callsInProgress
-private finalhttp://docs.oracle.com/javase/7/docs/api/java/util/Set.html?is-external=true;
 title="class or interface in java.util">SetPayloadCarryingServerCallable callsInProgress
+private finalhttp://docs.oracle.com/javase/7/docs/api/java/util/Set.html?is-external=true;
 title="class or interface in java.util">SetPayloadCarryingServerCallable callsInProgress
 
 
 
@@ -518,7 +518,7 @@ implements 
 
 tableName
-private finalTableName tableName
+private finalTableName tableName
 
 
 
@@ -527,7 +527,7 @@ implements 
 
 actionsInProgress
-private finalhttp://docs.oracle.com/javase/7/docs/api/java/util/concurrent/atomic/AtomicLong.html?is-external=true;
 title="class or interface in java.util.concurrent.atomic">AtomicLong actionsInProgress
+private finalhttp://docs.oracle.com/javase/7/docs/api/java/util/concurrent/atomic/AtomicLong.html?is-external=true;
 title="class or interface in java.util.concurrent.atomic">AtomicLong actionsInProgress
 
 
 
@@ -536,7 +536,7 @@ implements 
 
 replicaResultLock
-private finalhttp://docs.oracle.com/javase/7/docs/api/java/lang/Object.html?is-external=true;
 title="class or interface in java.lang">Object replicaResultLock
+private finalhttp://docs.oracle.com/javase/7/docs/api/java/lang/Object.html?is-external=true;
 title="class or interface in java.lang">Object replicaResultLock
 The lock controls access to results. It is only held when 
populating results where
  there might be several callers (eventual consistency gets). For other 
requests,
  there's one unique call going on per result index.
@@ -548,7 +548,7 @@ implements 
 
 results
-private finalhttp://docs.oracle.com/javase/7/docs/api/java/lang/Object.html?is-external=true;
 title="class or interface in java.lang">Object[] results
+private finalhttp://docs.oracle.com/javase/7/docs/api/java/lang/Object.html?is-external=true;
 title="class or interface in java.lang">Object[] results
 Result array.  Null if results are not needed. Otherwise, 
each index corresponds to
  the action index in initial actions submitted. For most request types, has 
null-s for
  requests that are not done, and result/exception for those that are done.
@@ -564,7 +564,7 @@ implements 
 
 replicaGetIndices
-private finalint[] replicaGetIndices
+private finalint[] replicaGetIndices
 Indices of replica gets in results. If null, all or no 
actions are replica-gets.
 
 
@@ -574,7 +574,7 @@ implements 
 
 hasAnyReplicaGets
-private finalboolean hasAnyReplicaGets
+private finalboolean hasAnyReplicaGets
 
 
 
@@ -583,7 +583,7 @@ implements 
 
 nonceGroup
-private finallong nonceGroup
+private finallong nonceGroup
 
 
 
@@ -592,7 +592,7 @@ implements 
 
 currentCallable
-privatePayloadCarryingServerCallable currentCallable
+privatePayloadCarryingServerCallable currentCallable
 
 
 
@@ -601,7 +601,7 @@ implements 
 
 currentCallTotalTimeout
-privateint currentCallTotalTimeout
+privateint currentCallTotalTimeout
 
 
 
@@ -618,7 +618,7 @@ implements 
 
 AsyncProcess.AsyncRequestFutureImpl

[22/52] [partial] hbase-site git commit: Published site at b21c56e7958652ca6e6daf04642eb51abaf2b3d7.

2016-06-06 Thread misty
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/devapidocs/org/apache/hadoop/hbase/client/AsyncProcess.html
--
diff --git a/devapidocs/org/apache/hadoop/hbase/client/AsyncProcess.html b/devapidocs/org/apache/hadoop/hbase/client/AsyncProcess.html
index 83430bd..7012c57 100644
--- a/devapidocs/org/apache/hadoop/hbase/client/AsyncProcess.html
+++ b/devapidocs/org/apache/hadoop/hbase/client/AsyncProcess.html
@@ -205,107 +205,123 @@ extends http://docs.oracle.com/javase/7/docs/api/java/lang/Object.html?
 DEFAULT_START_LOG_ERRORS_AFTER_COUNT
 
 
+private static int
+DEFAULT_THRESHOLD_TO_LOG_UNDONE_TASK_DETAILS
+
+
 protected AsyncProcess.BatchErrors
 globalErrors
 
-
+
 protected long
 id
 
-
+
 private static 
org.apache.commons.logging.Log
 LOG
 
-
+
 static http://docs.oracle.com/javase/7/docs/api/java/lang/String.html?is-external=true;
 title="class or interface in java.lang">String
 LOG_DETAILS_FOR_BATCH_ERROR
 Configuration to decide whether to log details for batch 
error
 
 
-
+
 private boolean
 logBatchErrorDetails
 Whether to log details for batch errors
 
 
-
+
 protected int
 maxConcurrentTasksPerRegion
 The number of tasks we run in parallel on a single 
region.
 
 
-
+
 protected int
 maxConcurrentTasksPerServer
 The number of task simultaneously executed on a single 
region server.
 
 
-
+
 protected int
 maxTotalConcurrentTasks
 The number of tasks simultaneously executed on the 
cluster.
 
 
-
+
 private static AsyncProcess.AsyncRequestFuture
 NO_REQS_RESULT
 Return value from a submit that didn't contain any 
requests.
 
 
-
+
 protected int
 numTries
 
-
+
 protected long
 pause
 
-
+
 protected http://docs.oracle.com/javase/7/docs/api/java/util/concurrent/ExecutorService.html?is-external=true;
 title="class or interface in 
java.util.concurrent">ExecutorService
 pool
 
-
+
 static http://docs.oracle.com/javase/7/docs/api/java/lang/String.html?is-external=true;
 title="class or interface in java.lang">String
 PRIMARY_CALL_TIMEOUT_KEY
 
-
+
 protected long
 primaryCallTimeoutMicroseconds
 
-
+
 protected RpcRetryingCallerFactory
 rpcCallerFactory
 
-
+
 protected RpcControllerFactory
 rpcFactory
 
-
+
 protected int
 serverTrackerTimeout
 
-
+
 static http://docs.oracle.com/javase/7/docs/api/java/lang/String.html?is-external=true;
 title="class or interface in java.lang">String
 START_LOG_ERRORS_AFTER_COUNT_KEY
 Configure the number of failures after which the client 
will start logging.
 
 
-
+
 private int
 startLogErrorsCnt
 
-
+
 protected http://docs.oracle.com/javase/7/docs/api/java/util/concurrent/ConcurrentMap.html?is-external=true;
 title="class or interface in 
java.util.concurrent">ConcurrentMapbyte[],http://docs.oracle.com/javase/7/docs/api/java/util/concurrent/atomic/AtomicInteger.html?is-external=true;
 title="class or interface in 
java.util.concurrent.atomic">AtomicInteger
 taskCounterPerRegion
 
-
+
 protected http://docs.oracle.com/javase/7/docs/api/java/util/concurrent/ConcurrentMap.html?is-external=true;
 title="class or interface in java.util.concurrent">ConcurrentMapServerName,http://docs.oracle.com/javase/7/docs/api/java/util/concurrent/atomic/AtomicInteger.html?is-external=true;
 title="class or interface in 
java.util.concurrent.atomic">AtomicInteger
 taskCounterPerServer
 
-
+
 protected http://docs.oracle.com/javase/7/docs/api/java/util/concurrent/atomic/AtomicLong.html?is-external=true;
 title="class or interface in 
java.util.concurrent.atomic">AtomicLong
 tasksInProgress
 
+
+private int
+THRESHOLD_TO_LOG_REGION_DETAILS
+
+
+private static http://docs.oracle.com/javase/7/docs/api/java/lang/String.html?is-external=true;
 title="class or interface in java.lang">String
+THRESHOLD_TO_LOG_UNDONE_TASK_DETAILS
+
+
+private int
+thresholdToLogUndoneTaskDetails
+
 
 protected int
 timeout
@@ -436,12 +452,16 @@ extends http://docs.oracle.com/javase/7/docs/api/java/lang/Object.html?
 isReplicaGet(Rowrow)
 
 
+private void
+logDetailsOfUndoneTasks(longtaskInProgress)
+
+
 private static void
setNonce(NonceGenerator ng,
Row r,
Action<Row> action)
 
-
+
<CResult> AsyncProcess.AsyncRequestFuture
submit(ExecutorService pool,
TableName tableName,
@@ -452,7 +472,7 @@ extends http://docs.oracle.com/javase/7/docs/api/java/lang/Object.html?
 Extract from the rows list what we can submit.
 
 
-
+
<CResult> AsyncProcess.AsyncRequestFuture
submit(TableName tableName,
List<? extends Row> rows,
@@ -462,7 +482,7 @@ extends http://docs.oracle.com/javase/7/docs/api/java/lang/Object.html?
 See submit(ExecutorService,
 TableName, List, boolean, Batch.Callback, boolean).
 
 
-
+
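The fields and helper added in this hunk (`THRESHOLD_TO_LOG_UNDONE_TASK_DETAILS`, `thresholdToLogUndoneTaskDetails`, `logDetailsOfUndoneTasks`) gate expensive per-region logging behind a task-count threshold: while many submitted tasks remain undone, AsyncProcess can log which regions still hold them. A minimal self-contained sketch of that pattern; the class name, region keys, and the threshold value of 10 are illustrative assumptions, not HBase's actual implementation:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicInteger;
import java.util.concurrent.atomic.AtomicLong;

/** Sketch: emit per-region detail only when many tasks are still undone. */
class UndoneTaskLogger {
    // Stand-in for the threshold config; the value here is illustrative.
    static final int THRESHOLD_TO_LOG_UNDONE_TASK_DETAILS = 10;

    final AtomicLong tasksInProgress = new AtomicLong();
    final Map<String, AtomicInteger> taskCounterPerRegion = new ConcurrentHashMap<>();

    void taskStarted(String region) {
        tasksInProgress.incrementAndGet();
        taskCounterPerRegion.computeIfAbsent(region, r -> new AtomicInteger())
                            .incrementAndGet();
    }

    void taskFinished(String region) {
        tasksInProgress.decrementAndGet();
        taskCounterPerRegion.get(region).decrementAndGet();
    }

    /** Returns the detail message that would be logged, or null below threshold. */
    String maybeLogDetailsOfUndoneTasks() {
        long undone = tasksInProgress.get();
        if (undone < THRESHOLD_TO_LOG_UNDONE_TASK_DETAILS) {
            return null; // cheap path: skip the per-region scan entirely
        }
        StringBuilder sb = new StringBuilder("undone=" + undone + " per-region:");
        taskCounterPerRegion.forEach((region, count) -> {
            if (count.get() > 0) {
                sb.append(' ').append(region).append('=').append(count.get());
            }
        });
        return sb.toString();
    }
}
```

The design point is that the detail scan over `taskCounterPerRegion` only runs when the client is actually stuck behind many outstanding tasks, so the common fast path pays nothing.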
 

[48/52] [partial] hbase-site git commit: Published site at b21c56e7958652ca6e6daf04642eb51abaf2b3d7.

2016-06-06 Thread misty
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/apidocs/org/apache/hadoop/hbase/client/Consistency.html
--
diff --git a/apidocs/org/apache/hadoop/hbase/client/Consistency.html 
b/apidocs/org/apache/hadoop/hbase/client/Consistency.html
index 51e96d7..31c7ce9 100644
--- a/apidocs/org/apache/hadoop/hbase/client/Consistency.html
+++ b/apidocs/org/apache/hadoop/hbase/client/Consistency.html
@@ -240,7 +240,7 @@ the order they are declared.
 
 
 values
-public static Consistency[] values()
+public static Consistency[] values()
 Returns an array containing the constants of this enum 
type, in
 the order they are declared.  This method may be used to iterate
 over the constants as follows:
@@ -257,7 +257,7 @@ for (Consistency c : Consistency.values())
 
 
 valueOf
-public static Consistency valueOf(String name)
+public static Consistency valueOf(String name)
 Returns the enum constant of this type with the specified 
name.
 The string must match exactly an identifier used to declare an
 enum constant in this type.  (Extraneous whitespace characters are 

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/apidocs/org/apache/hadoop/hbase/client/Durability.html
--
diff --git a/apidocs/org/apache/hadoop/hbase/client/Durability.html 
b/apidocs/org/apache/hadoop/hbase/client/Durability.html
index af1f718..0f065d5 100644
--- a/apidocs/org/apache/hadoop/hbase/client/Durability.html
+++ b/apidocs/org/apache/hadoop/hbase/client/Durability.html
@@ -280,7 +280,7 @@ the order they are declared.
 
 
 values
-public static Durability[] values()
+public static Durability[] values()
 Returns an array containing the constants of this enum 
type, in
 the order they are declared.  This method may be used to iterate
 over the constants as follows:
@@ -297,7 +297,7 @@ for (Durability c : Durability.values())
 
 
 valueOf
-public static Durability valueOf(String name)
+public static Durability valueOf(String name)
 Returns the enum constant of this type with the specified 
name.
 The string must match exactly an identifier used to declare an
 enum constant in this type.  (Extraneous whitespace characters are 

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/apidocs/org/apache/hadoop/hbase/client/IsolationLevel.html
--
diff --git a/apidocs/org/apache/hadoop/hbase/client/IsolationLevel.html 
b/apidocs/org/apache/hadoop/hbase/client/IsolationLevel.html
index 7dc6ab3..7cc0a93 100644
--- a/apidocs/org/apache/hadoop/hbase/client/IsolationLevel.html
+++ b/apidocs/org/apache/hadoop/hbase/client/IsolationLevel.html
@@ -243,7 +243,7 @@ the order they are declared.
 
 
 values
-public static IsolationLevel[] values()
+public static IsolationLevel[] values()
 Returns an array containing the constants of this enum 
type, in
 the order they are declared.  This method may be used to iterate
 over the constants as follows:
@@ -260,7 +260,7 @@ for (IsolationLevel c : IsolationLevel.values())
 
 
 valueOf
-public static IsolationLevel valueOf(String name)
+public static IsolationLevel valueOf(String name)
 Returns the enum constant of this type with the specified 
name.
 The string must match exactly an identifier used to declare an
 enum constant in this type.  (Extraneous whitespace characters are 

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/apidocs/org/apache/hadoop/hbase/client/RegionLoadStats.html
--
diff --git a/apidocs/org/apache/hadoop/hbase/client/RegionLoadStats.html 
b/apidocs/org/apache/hadoop/hbase/client/RegionLoadStats.html
index 1e8656b..45bd3b6 100644
--- a/apidocs/org/apache/hadoop/hbase/client/RegionLoadStats.html
+++ b/apidocs/org/apache/hadoop/hbase/client/RegionLoadStats.html
@@ -99,6 +99,7 @@
 @InterfaceStability.Evolving
 public class RegionLoadStats
extends Object
+POJO representing region server load
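The description added here characterizes RegionLoadStats as a plain data holder for region server load. A hedged sketch of such a POJO; the three fields (memstore load, heap occupancy, compaction pressure) are assumptions chosen for illustration and the class below is not the actual HBase implementation:

```java
// Illustrative POJO in the spirit of RegionLoadStats; field names are assumptions.
class RegionLoadStatsSketch {
    private final int memstoreLoad;       // % of the memstore flush size in use
    private final int heapOccupancy;      // % of server heap in use
    private final int compactionPressure; // % measure of compaction backlog

    RegionLoadStatsSketch(int memstoreLoad, int heapOccupancy, int compactionPressure) {
        this.memstoreLoad = memstoreLoad;
        this.heapOccupancy = heapOccupancy;
        this.compactionPressure = compactionPressure;
    }

    int getMemstoreLoad()       { return memstoreLoad; }
    int getHeapOccupancy()      { return heapOccupancy; }
    int getCompactionPressure() { return compactionPressure; }
}
```

A client that receives such stats alongside mutation results can use them to back off writes when the server reports high load, which is why the type sits in the public client API.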
 
 
 


[13/52] [partial] hbase-site git commit: Published site at b21c56e7958652ca6e6daf04642eb51abaf2b3d7.

2016-06-06 Thread misty
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/devapidocs/org/apache/hadoop/hbase/client/package-tree.html
--
diff --git a/devapidocs/org/apache/hadoop/hbase/client/package-tree.html 
b/devapidocs/org/apache/hadoop/hbase/client/package-tree.html
index a437a75..06f203f 100644
--- a/devapidocs/org/apache/hadoop/hbase/client/package-tree.html
+++ b/devapidocs/org/apache/hadoop/hbase/client/package-tree.html
@@ -417,14 +417,14 @@
 java.lang.http://docs.oracle.com/javase/7/docs/api/java/lang/Enum.html?is-external=true;
 title="class or interface in java.lang">EnumE (implements java.lang.http://docs.oracle.com/javase/7/docs/api/java/lang/Comparable.html?is-external=true;
 title="class or interface in java.lang">ComparableT, java.io.http://docs.oracle.com/javase/7/docs/api/java/io/Serializable.html?is-external=true;
 title="class or interface in java.io">Serializable)
 
 org.apache.hadoop.hbase.client.IsolationLevel
-org.apache.hadoop.hbase.client.CompactType
 org.apache.hadoop.hbase.client.TableState.State
-org.apache.hadoop.hbase.client.Consistency
 org.apache.hadoop.hbase.client.AsyncProcess.Retry
-org.apache.hadoop.hbase.client.SnapshotType
-org.apache.hadoop.hbase.client.CompactionState
+org.apache.hadoop.hbase.client.CompactType
 org.apache.hadoop.hbase.client.MasterSwitchType
+org.apache.hadoop.hbase.client.SnapshotType
 org.apache.hadoop.hbase.client.Durability
+org.apache.hadoop.hbase.client.CompactionState
+org.apache.hadoop.hbase.client.Consistency
 
 
 

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/devapidocs/org/apache/hadoop/hbase/client/package-use.html
--
diff --git a/devapidocs/org/apache/hadoop/hbase/client/package-use.html 
b/devapidocs/org/apache/hadoop/hbase/client/package-use.html
index 8149752..e5c0821 100644
--- a/devapidocs/org/apache/hadoop/hbase/client/package-use.html
+++ b/devapidocs/org/apache/hadoop/hbase/client/package-use.html
@@ -200,78 +200,84 @@ Input/OutputFormats, a table indexing MapReduce job, and 
utility methods.
 
 
 
+org.apache.hadoop.hbase.replication
+
+Multi Cluster Replication
+
+
+
 org.apache.hadoop.hbase.replication.regionserver
 
 
-
+
 org.apache.hadoop.hbase.rest
 
 HBase REST
 
 
-
+
 org.apache.hadoop.hbase.rest.client
 
 
-
+
 org.apache.hadoop.hbase.rest.model
 
 
-
+
 org.apache.hadoop.hbase.rsgroup
 
 
-
+
 org.apache.hadoop.hbase.security.access
 
 
-
+
 org.apache.hadoop.hbase.security.token
 
 
-
+
 org.apache.hadoop.hbase.security.visibility
 
 
-
+
 org.apache.hadoop.hbase.snapshot
 
 
-
+
 org.apache.hadoop.hbase.thrift
 
 Provides an HBase http://incubator.apache.org/thrift/;>Thrift
 service.
 
 
-
+
 org.apache.hadoop.hbase.thrift2
 
 Provides an HBase http://thrift.apache.org/;>Thrift
 service.
 
 
-
+
 org.apache.hadoop.hbase.tool
 
 
-
+
 org.apache.hadoop.hbase.util
 
 
-
+
 org.apache.hadoop.hbase.util.hbck
 
 
-
+
 org.apache.hadoop.hbase.wal
 
 
-
+
 org.apache.hadoop.hbase.zookeeper
 
 
-
+
 org.apache.hbase.archetypes.exemplars.client
 
 This package provides fully-functional exemplar Java code 
demonstrating
@@ -747,7 +753,9 @@ service.
 
 
 
-RegionLoadStats
+RegionLoadStats
+POJO representing region server load
+
 
 
 RegionLocator
@@ -923,7 +931,9 @@ service.
 
 
 
-RegionLoadStats
+RegionLoadStats
+POJO representing region server load
+
 
 
 
@@ -1718,6 +1728,49 @@ service.
 
 
 
+
+
+
+
+Classes in org.apache.hadoop.hbase.client
 used by org.apache.hadoop.hbase.replication
+
+Class and Description
+
+
+
+Admin
+The administrative API for HBase.
+
+
+
+Connection
+A cluster connection encapsulating lower level individual 
connections to actual servers and
+ a connection to zookeeper.
+
+
+
+Delete
+Used to perform Delete operations on a single row.
+
+
+
+Put
+Used to perform Put operations for a single row.
+
+
+
+RowMutations
+Performs multiple mutations atomically on a single 
row.
+
+
+
+Table
+Used to communicate with a single HBase table.
+
+
+
+
+
 
 
 

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/devapidocs/org/apache/hadoop/hbase/client/security/SecurityCapability.html
--
diff --git 
a/devapidocs/org/apache/hadoop/hbase/client/security/SecurityCapability.html 
b/devapidocs/org/apache/hadoop/hbase/client/security/SecurityCapability.html
index f91b175..2c7a37b 100644
--- a/devapidocs/org/apache/hadoop/hbase/client/security/SecurityCapability.html
+++ b/devapidocs/org/apache/hadoop/hbase/client/security/SecurityCapability.html
@@ -305,7 +305,7 @@ the order they are declared.
 
 
 values
-public static SecurityCapability[] values()
+public static SecurityCapability[] values()
 Returns an array containing the constants of this enum 
type, in
 the order they are declared.  This method may be used to iterate
 over the constants as follows:
@@ -322,7 +322,7 @@ for 

[45/52] [partial] hbase-site git commit: Published site at b21c56e7958652ca6e6daf04642eb51abaf2b3d7.

2016-06-06 Thread misty
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/apidocs/org/apache/hadoop/hbase/util/class-use/PositionedByteRange.html
--
diff --git 
a/apidocs/org/apache/hadoop/hbase/util/class-use/PositionedByteRange.html 
b/apidocs/org/apache/hadoop/hbase/util/class-use/PositionedByteRange.html
index 227f913..8d61583 100644
--- a/apidocs/org/apache/hadoop/hbase/util/class-use/PositionedByteRange.html
+++ b/apidocs/org/apache/hadoop/hbase/util/class-use/PositionedByteRange.html
@@ -116,28 +116,20 @@
 
 
 
-http://docs.oracle.com/javase/7/docs/api/java/lang/String.html?is-external=true;
 title="class or interface in java.lang">String
-OrderedString.decode(PositionedByteRangesrc)
-
-
-http://docs.oracle.com/javase/7/docs/api/java/lang/Object.html?is-external=true;
 title="class or interface in java.lang">Object[]
-Struct.decode(PositionedByteRangesrc)
-
-
 http://docs.oracle.com/javase/7/docs/api/java/lang/Long.html?is-external=true;
 title="class or interface in java.lang">Long
 RawLong.decode(PositionedByteRangesrc)
 
 
 http://docs.oracle.com/javase/7/docs/api/java/lang/Short.html?is-external=true;
 title="class or interface in java.lang">Short
-OrderedInt16.decode(PositionedByteRangesrc)
+RawShort.decode(PositionedByteRangesrc)
 
 
-http://docs.oracle.com/javase/7/docs/api/java/lang/Float.html?is-external=true;
 title="class or interface in java.lang">Float
-RawFloat.decode(PositionedByteRangesrc)
+T
+FixedLengthWrapper.decode(PositionedByteRangesrc)
 
 
-http://docs.oracle.com/javase/7/docs/api/java/lang/Byte.html?is-external=true;
 title="class or interface in java.lang">Byte
-RawByte.decode(PositionedByteRangesrc)
+http://docs.oracle.com/javase/7/docs/api/java/lang/Integer.html?is-external=true;
 title="class or interface in java.lang">Integer
+RawInteger.decode(PositionedByteRangesrc)
 
 
 http://docs.oracle.com/javase/7/docs/api/java/lang/Long.html?is-external=true;
 title="class or interface in java.lang">Long
@@ -145,71 +137,79 @@
 
 
 byte[]
-OrderedBlob.decode(PositionedByteRangesrc)
+OrderedBlobVar.decode(PositionedByteRangesrc)
 
 
-T
-DataType.decode(PositionedByteRangesrc)
-Read an instance of T from the buffer 
src.
-
+http://docs.oracle.com/javase/7/docs/api/java/lang/Number.html?is-external=true;
 title="class or interface in java.lang">Number
+OrderedNumeric.decode(PositionedByteRangesrc)
 
 
-http://docs.oracle.com/javase/7/docs/api/java/lang/Double.html?is-external=true;
 title="class or interface in java.lang">Double
-OrderedFloat64.decode(PositionedByteRangesrc)
+http://docs.oracle.com/javase/7/docs/api/java/lang/String.html?is-external=true;
 title="class or interface in java.lang">String
+OrderedString.decode(PositionedByteRangesrc)
 
 
-http://docs.oracle.com/javase/7/docs/api/java/lang/Byte.html?is-external=true;
 title="class or interface in java.lang">Byte
-OrderedInt8.decode(PositionedByteRangesrc)
+byte[]
+RawBytes.decode(PositionedByteRangesrc)
 
 
-http://docs.oracle.com/javase/7/docs/api/java/lang/Integer.html?is-external=true;
 title="class or interface in java.lang">Integer
-RawInteger.decode(PositionedByteRangesrc)
+T
+TerminatedWrapper.decode(PositionedByteRangesrc)
 
 
-http://docs.oracle.com/javase/7/docs/api/java/lang/Float.html?is-external=true;
 title="class or interface in java.lang">Float
-OrderedFloat32.decode(PositionedByteRangesrc)
+http://docs.oracle.com/javase/7/docs/api/java/lang/Object.html?is-external=true;
 title="class or interface in java.lang">Object[]
+Struct.decode(PositionedByteRangesrc)
 
 
-http://docs.oracle.com/javase/7/docs/api/java/lang/Integer.html?is-external=true;
 title="class or interface in java.lang">Integer
-OrderedInt32.decode(PositionedByteRangesrc)
+http://docs.oracle.com/javase/7/docs/api/java/lang/Double.html?is-external=true;
 title="class or interface in java.lang">Double
+OrderedFloat64.decode(PositionedByteRangesrc)
 
 
-http://docs.oracle.com/javase/7/docs/api/java/lang/Double.html?is-external=true;
 title="class or interface in java.lang">Double
-RawDouble.decode(PositionedByteRangesrc)
+http://docs.oracle.com/javase/7/docs/api/java/lang/Byte.html?is-external=true;
 title="class or interface in java.lang">Byte
+RawByte.decode(PositionedByteRangesrc)
 
 
-T
-TerminatedWrapper.decode(PositionedByteRangesrc)
+http://docs.oracle.com/javase/7/docs/api/java/lang/Byte.html?is-external=true;
 title="class or interface in java.lang">Byte
+OrderedInt8.decode(PositionedByteRangesrc)
 
 
-http://docs.oracle.com/javase/7/docs/api/java/lang/String.html?is-external=true;
 title="class or interface in java.lang">String
-RawString.decode(PositionedByteRangesrc)
+http://docs.oracle.com/javase/7/docs/api/java/lang/Integer.html?is-external=true;
 title="class or interface in java.lang">Integer
+OrderedInt32.decode(PositionedByteRangesrc)
 
 
-T
-FixedLengthWrapper.decode(PositionedByteRangesrc)

[18/52] [partial] hbase-site git commit: Published site at b21c56e7958652ca6e6daf04642eb51abaf2b3d7.

2016-06-06 Thread misty
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/devapidocs/org/apache/hadoop/hbase/client/class-use/Mutation.html
--
diff --git a/devapidocs/org/apache/hadoop/hbase/client/class-use/Mutation.html 
b/devapidocs/org/apache/hadoop/hbase/client/class-use/Mutation.html
index db8f452..f2eb310 100644
--- a/devapidocs/org/apache/hadoop/hbase/client/class-use/Mutation.html
+++ b/devapidocs/org/apache/hadoop/hbase/client/class-use/Mutation.html
@@ -355,40 +355,40 @@ Input/OutputFormats, a table indexing MapReduce job, and 
utility methods.
 
 
 Cell
-RegionObserver.postMutationBeforeWAL(ObserverContextRegionCoprocessorEnvironmentctx,
+BaseRegionObserver.postMutationBeforeWAL(ObserverContextRegionCoprocessorEnvironmentctx,
   RegionObserver.MutationTypeopType,
   Mutationmutation,
   CelloldCell,
-  CellnewCell)
-Called after a new cell has been created during an 
increment operation, but before
- it is committed to the WAL or memstore.
-
+  CellnewCell)
 
 
 Cell
-BaseRegionObserver.postMutationBeforeWAL(ObserverContextRegionCoprocessorEnvironmentctx,
+RegionObserver.postMutationBeforeWAL(ObserverContextRegionCoprocessorEnvironmentctx,
   RegionObserver.MutationTypeopType,
   Mutationmutation,
   CelloldCell,
-  CellnewCell)
+  CellnewCell)
+Called after a new cell has been created during an 
increment operation, but before
+ it is committed to the WAL or memstore.
+
 
 
 void
-RegionObserver.prePrepareTimeStampForDeleteVersion(ObserverContextRegionCoprocessorEnvironmentc,
-  Mutationmutation,
+BaseRegionObserver.prePrepareTimeStampForDeleteVersion(ObserverContextRegionCoprocessorEnvironmente,
+  Mutationdelete,
   Cellcell,
   
byte[]byteNow,
-  Getget)
-Called before the server updates the timestamp for version 
delete with latest timestamp.
-
+  Getget)
 
 
 void
-BaseRegionObserver.prePrepareTimeStampForDeleteVersion(ObserverContextRegionCoprocessorEnvironmente,
-  Mutationdelete,
+RegionObserver.prePrepareTimeStampForDeleteVersion(ObserverContextRegionCoprocessorEnvironmentc,
+  Mutationmutation,
   Cellcell,
   
byte[]byteNow,
-  Getget)
+  Getget)
+Called before the server updates the timestamp for version 
delete with latest timestamp.
+
 
 
 
@@ -401,17 +401,23 @@ Input/OutputFormats, a table indexing MapReduce job, and 
utility methods.
 
 
 void
+BaseRegionObserver.postBatchMutate(ObserverContextRegionCoprocessorEnvironmentc,
+  MiniBatchOperationInProgressMutationminiBatchOp)
+
+
+void
 RegionObserver.postBatchMutate(ObserverContextRegionCoprocessorEnvironmentc,
   MiniBatchOperationInProgressMutationminiBatchOp)
 This will be called after applying a batch of Mutations on 
a region.
 
 
-
+
 void
-BaseRegionObserver.postBatchMutate(ObserverContextRegionCoprocessorEnvironmentc,
-  MiniBatchOperationInProgressMutationminiBatchOp)
+BaseRegionObserver.postBatchMutateIndispensably(ObserverContextRegionCoprocessorEnvironmentctx,
+MiniBatchOperationInProgressMutationminiBatchOp,
+
booleansuccess)
 
-
+
 void
 RegionObserver.postBatchMutateIndispensably(ObserverContextRegionCoprocessorEnvironmentctx,
 MiniBatchOperationInProgressMutationminiBatchOp,
@@ -420,24 +426,18 @@ Input/OutputFormats, a table indexing MapReduce job, and 
utility methods.
  fails.
 
 
-
+
 void
-BaseRegionObserver.postBatchMutateIndispensably(ObserverContextRegionCoprocessorEnvironmentctx,
-MiniBatchOperationInProgressMutationminiBatchOp,
-   

[12/52] [partial] hbase-site git commit: Published site at b21c56e7958652ca6e6daf04642eb51abaf2b3d7.

2016-06-06 Thread misty
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/devapidocs/org/apache/hadoop/hbase/coprocessor/class-use/MasterCoprocessorEnvironment.html
--
diff --git 
a/devapidocs/org/apache/hadoop/hbase/coprocessor/class-use/MasterCoprocessorEnvironment.html
 
b/devapidocs/org/apache/hadoop/hbase/coprocessor/class-use/MasterCoprocessorEnvironment.html
index b8db80e..17601bb 100644
--- 
a/devapidocs/org/apache/hadoop/hbase/coprocessor/class-use/MasterCoprocessorEnvironment.html
+++ 
b/devapidocs/org/apache/hadoop/hbase/coprocessor/class-use/MasterCoprocessorEnvironment.html
@@ -118,14 +118,14 @@
 
 
 void
-BaseMasterObserver.postAbortProcedure(ObserverContextMasterCoprocessorEnvironmentctx)
-
-
-void
 MasterObserver.postAbortProcedure(ObserverContextMasterCoprocessorEnvironmentctx)
Called after an abortProcedure request has been
processed.
 
 
+
+void
+BaseMasterObserver.postAbortProcedure(ObserverContextMasterCoprocessorEnvironmentctx)
+
 
 void
 BaseMasterAndRegionObserver.postAddColumn(ObserverContextMasterCoprocessorEnvironmentctx,
@@ -136,25 +136,25 @@
 
 
 void
-BaseMasterObserver.postAddColumn(ObserverContextMasterCoprocessorEnvironmentctx,
+MasterObserver.postAddColumn(ObserverContextMasterCoprocessorEnvironmentctx,
   TableNametableName,
   HColumnDescriptorcolumnFamily)
 Deprecated.
 As of release 2.0.0, this will be removed in HBase 3.0.0
  (https://issues.apache.org/jira/browse/HBASE-13645;>HBASE-13645).
- Use BaseMasterObserver.postAddColumnFamily(ObserverContext,
 TableName, HColumnDescriptor).
+ Use MasterObserver.postAddColumnFamily(ObserverContext,
 TableName, HColumnDescriptor).
 
 
 
 
 void
-MasterObserver.postAddColumn(ObserverContextMasterCoprocessorEnvironmentctx,
+BaseMasterObserver.postAddColumn(ObserverContextMasterCoprocessorEnvironmentctx,
   TableNametableName,
   HColumnDescriptorcolumnFamily)
 Deprecated.
 As of release 2.0.0, this will be removed in HBase 3.0.0
  (https://issues.apache.org/jira/browse/HBASE-13645;>HBASE-13645).
- Use MasterObserver.postAddColumnFamily(ObserverContext,
 TableName, HColumnDescriptor).
+ Use BaseMasterObserver.postAddColumnFamily(ObserverContext,
 TableName, HColumnDescriptor).
 
 
 
@@ -166,18 +166,18 @@
 
 
 void
-BaseMasterObserver.postAddColumnFamily(ObserverContextMasterCoprocessorEnvironmentctx,
-  TableNametableName,
-  HColumnDescriptorcolumnFamily)
-
-
-void
 MasterObserver.postAddColumnFamily(ObserverContextMasterCoprocessorEnvironmentctx,
   TableNametableName,
   HColumnDescriptorcolumnFamily)
 Called after the new column family has been created.
 
 
+
+void
+BaseMasterObserver.postAddColumnFamily(ObserverContextMasterCoprocessorEnvironmentctx,
+  TableNametableName,
+  HColumnDescriptorcolumnFamily)
+
 
 void
 BaseMasterAndRegionObserver.postAddColumnHandler(ObserverContextMasterCoprocessorEnvironmentctx,
@@ -188,25 +188,25 @@
 
 
 void
-BaseMasterObserver.postAddColumnHandler(ObserverContextMasterCoprocessorEnvironmentctx,
+MasterObserver.postAddColumnHandler(ObserverContextMasterCoprocessorEnvironmentctx,
 TableNametableName,
 HColumnDescriptorcolumnFamily)
 Deprecated.
 As of release 2.0.0, this will be removed in HBase 3.0.0
  (https://issues.apache.org/jira/browse/HBASE-13645;>HBASE-13645). Use
- BaseMasterObserver.postCompletedAddColumnFamilyAction(ObserverContext,
 TableName, HColumnDescriptor).
+ MasterObserver.postCompletedAddColumnFamilyAction(ObserverContext,
 TableName, HColumnDescriptor).
 
 
 
 
 void
-MasterObserver.postAddColumnHandler(ObserverContextMasterCoprocessorEnvironmentctx,
+BaseMasterObserver.postAddColumnHandler(ObserverContextMasterCoprocessorEnvironmentctx,
 TableNametableName,
 HColumnDescriptorcolumnFamily)
 Deprecated.
 As of release 2.0.0, this will be removed in HBase 3.0.0
  (https://issues.apache.org/jira/browse/HBASE-13645;>HBASE-13645). Use
- MasterObserver.postCompletedAddColumnFamilyAction(ObserverContext,
 TableName, HColumnDescriptor).
+ BaseMasterObserver.postCompletedAddColumnFamilyAction(ObserverContext,
 TableName, HColumnDescriptor).
 
 
 
@@ -217,16 +217,16 @@
 
 
 void
-BaseMasterObserver.postAddRSGroup(ObserverContextMasterCoprocessorEnvironmentctx,
-http://docs.oracle.com/javase/7/docs/api/java/lang/String.html?is-external=true;
 title="class or interface in java.lang">Stringname)
-
-
-void
 

[20/52] [partial] hbase-site git commit: Published site at b21c56e7958652ca6e6daf04642eb51abaf2b3d7.

2016-06-06 Thread misty
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/devapidocs/org/apache/hadoop/hbase/client/class-use/Delete.html
--
diff --git a/devapidocs/org/apache/hadoop/hbase/client/class-use/Delete.html 
b/devapidocs/org/apache/hadoop/hbase/client/class-use/Delete.html
index d11a3af..9123492 100644
--- a/devapidocs/org/apache/hadoop/hbase/client/class-use/Delete.html
+++ b/devapidocs/org/apache/hadoop/hbase/client/class-use/Delete.html
@@ -110,14 +110,20 @@ Input/OutputFormats, a table indexing MapReduce job, and 
utility methods.
 
 
 
+org.apache.hadoop.hbase.replication
+
+Multi Cluster Replication
+
+
+
 org.apache.hadoop.hbase.rest.client
 
 
-
+
 org.apache.hadoop.hbase.security.access
 
 
-
+
 org.apache.hadoop.hbase.thrift2
 
 Provides an HBase http://thrift.apache.org/;>Thrift
@@ -321,7 +327,7 @@ service.
 
 
 boolean
-Table.checkAndDelete(byte[]row,
+HTable.checkAndDelete(byte[]row,
 byte[]family,
 byte[]qualifier,
 byte[]value,
@@ -332,7 +338,7 @@ service.
 
 
 boolean
-HTable.checkAndDelete(byte[]row,
+Table.checkAndDelete(byte[]row,
 byte[]family,
 byte[]qualifier,
 byte[]value,
@@ -351,7 +357,7 @@ service.
 
 
 boolean
-Table.checkAndDelete(byte[]row,
+HTable.checkAndDelete(byte[]row,
 byte[]family,
 byte[]qualifier,
 CompareFilter.CompareOpcompareOp,
@@ -363,7 +369,7 @@ service.
 
 
 boolean
-HTable.checkAndDelete(byte[]row,
+Table.checkAndDelete(byte[]row,
 byte[]family,
 byte[]qualifier,
 CompareFilter.CompareOpcompareOp,
@@ -384,13 +390,13 @@ service.
 
 
 void
-Table.delete(Deletedelete)
+HTable.delete(Deletedelete)
 Deletes the specified cells/row.
 
 
 
 void
-HTable.delete(Deletedelete)
+Table.delete(Deletedelete)
 Deletes the specified cells/row.
 
 
@@ -409,13 +415,13 @@ service.
 
 
 void
-Table.delete(http://docs.oracle.com/javase/7/docs/api/java/util/List.html?is-external=true;
 title="class or interface in java.util">ListDeletedeletes)
+HTable.delete(http://docs.oracle.com/javase/7/docs/api/java/util/List.html?is-external=true;
 title="class or interface in java.util">ListDeletedeletes)
 Deletes the specified cells/rows in bulk.
 
 
 
 void
-HTable.delete(http://docs.oracle.com/javase/7/docs/api/java/util/List.html?is-external=true;
 title="class or interface in java.util">ListDeletedeletes)
+Table.delete(http://docs.oracle.com/javase/7/docs/api/java/util/List.html?is-external=true;
 title="class or interface in java.util">ListDeletedeletes)
 Deletes the specified cells/rows in bulk.
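The `checkAndDelete` overloads reordered between Table and HTable above perform an atomic compare-then-delete: the row is removed only if its current value matches the expected one, with the check and the delete executed as one step under the server's row lock. Those semantics can be sketched independently of HBase (simplified assumption: one String value per row, equality comparison only; this is not the HBase code path):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Sketch of checkAndDelete semantics on a simplified one-value-per-row store.
class CheckAndDeleteSketch {
    private final Map<String, String> rows = new ConcurrentHashMap<>();

    void put(String row, String value) {
        rows.put(row, value);
    }

    /** Deletes row only if its current value equals expected; check+delete is atomic. */
    boolean checkAndDelete(String row, String expected) {
        // ConcurrentMap.remove(key, value) is a single atomic compare-then-remove,
        // standing in here for HBase's server-side row lock during checkAndDelete.
        return rows.remove(row, expected);
    }

    boolean contains(String row) {
        return rows.containsKey(row);
    }
}
```

The atomicity is the whole point: two clients racing on the same row cannot both observe the expected value and both succeed, which a separate get-then-delete would allow.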
 
 
@@ -450,107 +456,107 @@ service.
 
 
 boolean
-RegionObserver.postCheckAndDelete(ObserverContextRegionCoprocessorEnvironmentc,
+BaseRegionObserver.postCheckAndDelete(ObserverContextRegionCoprocessorEnvironmente,
 byte[]row,
 byte[]family,
 byte[]qualifier,
 CompareFilter.CompareOpcompareOp,
 ByteArrayComparablecomparator,
 Deletedelete,
-booleanresult)
-Called after checkAndDelete
-
+booleanresult)
 
 
 boolean
-BaseRegionObserver.postCheckAndDelete(ObserverContextRegionCoprocessorEnvironmente,
+RegionObserver.postCheckAndDelete(ObserverContextRegionCoprocessorEnvironmentc,
 byte[]row,
 byte[]family,
 byte[]qualifier,
 CompareFilter.CompareOpcompareOp,
 ByteArrayComparablecomparator,
 Deletedelete,
-booleanresult)
+booleanresult)
+Called after checkAndDelete
+
 
 
 void
-RegionObserver.postDelete(ObserverContextRegionCoprocessorEnvironmentc,
+BaseRegionObserver.postDelete(ObserverContextRegionCoprocessorEnvironmente,
 Deletedelete,
 WALEditedit,
-Durabilitydurability)
-Called after the client deletes a value.
-
+Durabilitydurability)
 
 
 void
-BaseRegionObserver.postDelete(ObserverContextRegionCoprocessorEnvironmente,
+RegionObserver.postDelete(ObserverContextRegionCoprocessorEnvironmentc,
 Deletedelete,
 WALEditedit,
-Durabilitydurability)
+Durabilitydurability)
+Called after the client deletes a value.
+
 
 
 boolean

[49/52] [partial] hbase-site git commit: Published site at b21c56e7958652ca6e6daf04642eb51abaf2b3d7.

2016-06-06 Thread misty
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/apidocs/org/apache/hadoop/hbase/class-use/Cell.html
--
diff --git a/apidocs/org/apache/hadoop/hbase/class-use/Cell.html 
b/apidocs/org/apache/hadoop/hbase/class-use/Cell.html
index a5333e3..cb97d71 100644
--- a/apidocs/org/apache/hadoop/hbase/class-use/Cell.html
+++ b/apidocs/org/apache/hadoop/hbase/class-use/Cell.html
@@ -919,17 +919,17 @@ Input/OutputFormats, a table indexing MapReduce job, and 
utility methods.
 
 
 
-Put
-Put.add(Cellkv)
-Add the specified KeyValue to this Put operation.
-
-
-
 Append
 Append.add(Cellcell)
 Add column and value to this Append operation.
 
 
+
+Put
+Put.add(Cellkv)
+Add the specified KeyValue to this Put operation.
+
+
 
 Delete
 Delete.addDeleteMarker(Cellkv)
@@ -1015,19 +1015,19 @@ Input/OutputFormats, a table indexing MapReduce job, 
and utility methods.
 Delete.setFamilyCellMap(http://docs.oracle.com/javase/7/docs/api/java/util/NavigableMap.html?is-external=true;
 title="class or interface in java.util">NavigableMapbyte[],http://docs.oracle.com/javase/7/docs/api/java/util/List.html?is-external=true;
 title="class or interface in java.util">ListCellmap)
 
 
+Append
+Append.setFamilyCellMap(http://docs.oracle.com/javase/7/docs/api/java/util/NavigableMap.html?is-external=true;
 title="class or interface in java.util">NavigableMapbyte[],http://docs.oracle.com/javase/7/docs/api/java/util/List.html?is-external=true;
 title="class or interface in java.util">ListCellmap)
+
+
 Put
 Put.setFamilyCellMap(http://docs.oracle.com/javase/7/docs/api/java/util/NavigableMap.html?is-external=true;
 title="class or interface in java.util">NavigableMapbyte[],http://docs.oracle.com/javase/7/docs/api/java/util/List.html?is-external=true;
 title="class or interface in java.util">ListCellmap)
 
-
+
 Mutation
 Mutation.setFamilyCellMap(http://docs.oracle.com/javase/7/docs/api/java/util/NavigableMap.html?is-external=true;
 title="class or interface in java.util">NavigableMapbyte[],http://docs.oracle.com/javase/7/docs/api/java/util/List.html?is-external=true;
 title="class or interface in java.util">ListCellmap)
 Method for setting the put's familyMap
 
 
-
-Append
-Append.setFamilyCellMap(http://docs.oracle.com/javase/7/docs/api/java/util/NavigableMap.html?is-external=true;
 title="class or interface in java.util">NavigableMapbyte[],http://docs.oracle.com/javase/7/docs/api/java/util/List.html?is-external=true;
 title="class or interface in java.util">ListCellmap)
-
 
 
 
@@ -1044,31 +1044,23 @@ Input/OutputFormats, a table indexing MapReduce job, 
and utility methods.
 
 
 Cell
-FuzzyRowFilter.getNextCellHint(CellcurrentCell)
+MultipleColumnPrefixFilter.getNextCellHint(Cellcell)
 
 
 Cell
-ColumnPaginationFilter.getNextCellHint(Cellcell)
+MultiRowRangeFilter.getNextCellHint(CellcurrentKV)
 
 
 Cell
-ColumnPrefixFilter.getNextCellHint(Cellcell)
+FilterList.getNextCellHint(CellcurrentCell)
 
 
-Cell
-ColumnRangeFilter.getNextCellHint(Cellcell)
-
-
 abstract Cell
 Filter.getNextCellHint(CellcurrentCell)
 If the filter returns the match code SEEK_NEXT_USING_HINT, 
then it should also tell which is
  the next key it must seek to.
 
 
-
-Cell
-FilterList.getNextCellHint(CellcurrentCell)
-
 
 Cell
 TimestampsFilter.getNextCellHint(CellcurrentCell)
@@ -1077,34 +1069,42 @@ Input/OutputFormats, a table indexing MapReduce job, 
and utility methods.
 
 
 Cell
-MultiRowRangeFilter.getNextCellHint(CellcurrentKV)
+ColumnPrefixFilter.getNextCellHint(Cellcell)
 
 
 Cell
-MultipleColumnPrefixFilter.getNextCellHint(Cellcell)
+ColumnPaginationFilter.getNextCellHint(Cellcell)
 
 
 Cell
-SkipFilter.transformCell(Cellv)
+ColumnRangeFilter.getNextCellHint(Cellcell)
 
 
 Cell
-KeyOnlyFilter.transformCell(Cellcell)
+FuzzyRowFilter.getNextCellHint(CellcurrentCell)
 
 
-abstract Cell
-Filter.transformCell(Cellv)
-Give the filter a chance to transform the passed 
KeyValue.
-
+Cell
+WhileMatchFilter.transformCell(Cellv)
 
 
 Cell
-WhileMatchFilter.transformCell(Cellv)
+SkipFilter.transformCell(Cellv)
 
 
 Cell
 FilterList.transformCell(Cellc)
 
+
+abstract Cell
+Filter.transformCell(Cellv)
+Give the filter a chance to transform the passed 
KeyValue.
+
+
+
+Cell
+KeyOnlyFilter.transformCell(Cellcell)
+
 
 
 
@@ -1140,225 +1140,217 @@ Input/OutputFormats, a table indexing MapReduce job, 
and utility methods.
 
 
 Filter.ReturnCode
-ColumnPrefixFilter.filterColumn(Cellcell)
+MultipleColumnPrefixFilter.filterColumn(Cellcell)
 
 
 Filter.ReturnCode
-MultipleColumnPrefixFilter.filterColumn(Cellcell)
+ColumnPrefixFilter.filterColumn(Cellcell)
 
 
 Filter.ReturnCode
-PrefixFilter.filterKeyValue(Cellv)
+MultipleColumnPrefixFilter.filterKeyValue(Cellkv)
 
 
 Filter.ReturnCode
-FirstKeyOnlyFilter.filterKeyValue(Cellv)
+MultiRowRangeFilter.filterKeyValue(Cellignored)
 
 
 Filter.ReturnCode
-SingleColumnValueFilter.filterKeyValue(Cellc)
+WhileMatchFilter.filterKeyValue(Cellv)
 
 
 Filter.ReturnCode

[17/52] [partial] hbase-site git commit: Published site at b21c56e7958652ca6e6daf04642eb51abaf2b3d7.

2016-06-06 Thread misty
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/devapidocs/org/apache/hadoop/hbase/client/class-use/Put.html
--
diff --git a/devapidocs/org/apache/hadoop/hbase/client/class-use/Put.html 
b/devapidocs/org/apache/hadoop/hbase/client/class-use/Put.html
index 6b672ab..2108cdf 100644
--- a/devapidocs/org/apache/hadoop/hbase/client/class-use/Put.html
+++ b/devapidocs/org/apache/hadoop/hbase/client/class-use/Put.html
@@ -123,21 +123,27 @@ Input/OutputFormats, a table indexing MapReduce job, and 
utility methods.
 
 
 
+org.apache.hadoop.hbase.replication
+
+Multi Cluster Replication
+
+
+
 org.apache.hadoop.hbase.rest.client
 
 
-
+
 org.apache.hadoop.hbase.security.access
 
 
-
+
 org.apache.hadoop.hbase.thrift2
 
Provides an HBase Thrift (http://thrift.apache.org/) service.
 
 
-
+
 org.apache.hadoop.hbase.util
 
 
@@ -461,7 +467,7 @@ service.
 
 
 boolean
-Table.checkAndPut(byte[]row,
+HTable.checkAndPut(byte[]row,
   byte[]family,
   byte[]qualifier,
   byte[]value,
@@ -472,7 +478,7 @@ service.
 
 
 boolean
-HTable.checkAndPut(byte[]row,
+Table.checkAndPut(byte[]row,
   byte[]family,
   byte[]qualifier,
   byte[]value,
@@ -491,7 +497,7 @@ service.
 
 
 boolean
-Table.checkAndPut(byte[]row,
+HTable.checkAndPut(byte[]row,
   byte[]family,
   byte[]qualifier,
   CompareFilter.CompareOpcompareOp,
@@ -503,7 +509,7 @@ service.
 
 
 boolean
-HTable.checkAndPut(byte[]row,
+Table.checkAndPut(byte[]row,
   byte[]family,
   byte[]qualifier,
   CompareFilter.CompareOpcompareOp,
@@ -543,13 +549,13 @@ service.
 
 
 void
-Table.put(Putput)
+HTable.put(Putput)
 Puts some data in the table.
 
 
 
 void
-HTable.put(Putput)
+Table.put(Putput)
 Puts some data in the table.
 
 
@@ -574,11 +580,11 @@ service.
 
 
 void
-BufferedMutatorImpl.validatePut(Putput)
+HTable.validatePut(Putput)
 
 
 void
-HTable.validatePut(Putput)
+BufferedMutatorImpl.validatePut(Putput)
 
 
 static void
@@ -605,13 +611,13 @@ service.
 
 
 void
-Table.put(List<Put> puts)
+HTable.put(List<Put> puts)
 Puts some data in the table, in batch.
 
 
 
 void
-HTable.put(List<Put> puts)
+Table.put(List<Put> puts)
 Puts some data in the table, in batch.
 
 
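The Table/HTable rows above include the checkAndPut overloads, whose contract is compare-then-put: apply the Put only if the current value of row/family:qualifier matches an expected value. A minimal single-JVM sketch of that semantics — the class and method shapes below are stand-ins, not the HBase client API:

```java
import java.util.Arrays;
import java.util.HashMap;
import java.util.Map;

// Toy compare-and-set store illustrating the checkAndPut contract.
public class CheckAndPutSketch {
    private final Map<String, byte[]> store = new HashMap<>();

    private static String key(byte[] row, byte[] family, byte[] qualifier) {
        return Arrays.toString(row) + "/" + Arrays.toString(family) + ":" + Arrays.toString(qualifier);
    }

    public synchronized void put(byte[] row, byte[] family, byte[] qualifier, byte[] value) {
        store.put(key(row, family, qualifier), value);
    }

    // Applies newValue only if the stored value equals expected;
    // expected == null means "the cell must be absent". Returns whether it matched.
    public synchronized boolean checkAndPut(byte[] row, byte[] family, byte[] qualifier,
                                            byte[] expected, byte[] newValue) {
        byte[] current = store.get(key(row, family, qualifier));
        boolean match = (expected == null) ? current == null : Arrays.equals(expected, current);
        if (match) {
            store.put(key(row, family, qualifier), newValue);
        }
        return match;
    }

    public synchronized byte[] get(byte[] row, byte[] family, byte[] qualifier) {
        return store.get(key(row, family, qualifier));
    }

    public static void main(String[] args) {
        CheckAndPutSketch t = new CheckAndPutSketch();
        byte[] r = "r1".getBytes(), f = "f".getBytes(), q = "q".getBytes();
        System.out.println(t.checkAndPut(r, f, q, null, "v1".getBytes()));           // prints true (cell absent)
        System.out.println(t.checkAndPut(r, f, q, "bad".getBytes(), "v2".getBytes())); // prints false (mismatch)
        System.out.println(t.checkAndPut(r, f, q, "v1".getBytes(), "v2".getBytes()));  // prints true (matched)
    }
}
```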
@@ -687,107 +693,107 @@ service.
 
 
 boolean
-RegionObserver.postCheckAndPut(ObserverContextRegionCoprocessorEnvironmentc,
+BaseRegionObserver.postCheckAndPut(ObserverContextRegionCoprocessorEnvironmente,
   byte[]row,
   byte[]family,
   byte[]qualifier,
   CompareFilter.CompareOpcompareOp,
   ByteArrayComparablecomparator,
   Putput,
-  booleanresult)
-Called after checkAndPut
-
+  booleanresult)
 
 
 boolean
-BaseRegionObserver.postCheckAndPut(ObserverContextRegionCoprocessorEnvironmente,
+RegionObserver.postCheckAndPut(ObserverContextRegionCoprocessorEnvironmentc,
   byte[]row,
   byte[]family,
   byte[]qualifier,
   CompareFilter.CompareOpcompareOp,
   ByteArrayComparablecomparator,
   Putput,
-  booleanresult)
+  booleanresult)
+Called after checkAndPut
+
 
 
 void
-RegionObserver.postPut(ObserverContextRegionCoprocessorEnvironmentc,
+BaseRegionObserver.postPut(ObserverContextRegionCoprocessorEnvironmente,
   Putput,
   WALEditedit,
-  Durabilitydurability)
-Called after the client stores a value.
-
+  Durabilitydurability)
 
 
 void
-BaseRegionObserver.postPut(ObserverContextRegionCoprocessorEnvironmente,
+RegionObserver.postPut(ObserverContextRegionCoprocessorEnvironmentc,
   Putput,
   WALEditedit,
-  Durabilitydurability)
+  Durabilitydurability)
+Called after the client stores a value.
+
 
 
 boolean
-RegionObserver.preCheckAndPut(ObserverContextRegionCoprocessorEnvironmentc,
+BaseRegionObserver.preCheckAndPut(ObserverContextRegionCoprocessorEnvironmente,
 

[21/52] [partial] hbase-site git commit: Published site at b21c56e7958652ca6e6daf04642eb51abaf2b3d7.

2016-06-06 Thread misty
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/devapidocs/org/apache/hadoop/hbase/client/Durability.html
--
diff --git a/devapidocs/org/apache/hadoop/hbase/client/Durability.html 
b/devapidocs/org/apache/hadoop/hbase/client/Durability.html
index 88c6248..82b99aa 100644
--- a/devapidocs/org/apache/hadoop/hbase/client/Durability.html
+++ b/devapidocs/org/apache/hadoop/hbase/client/Durability.html
@@ -280,7 +280,7 @@ the order they are declared.
 
 
 values
-public static Durability[] values()
+public static Durability[] values()
 Returns an array containing the constants of this enum 
type, in
 the order they are declared.  This method may be used to iterate
 over the constants as follows:
@@ -297,7 +297,7 @@ for (Durability c : Durability.values())
 
 
 valueOf
-public static Durability valueOf(String name)
+public static Durability valueOf(String name)
 Returns the enum constant of this type with the specified 
name.
 The string must match exactly an identifier used to declare an
 enum constant in this type.  (Extraneous whitespace characters are 

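The Durability javadoc above describes the implicit values()/valueOf(String) members that every Java enum gets: values() returns the constants in declaration order, and valueOf(name) requires an exact identifier match. The iteration pattern quoted in the doc works against any enum; DurabilityLike below is a local stand-in (the real org.apache.hadoop.hbase.client.Durability lives in hbase-client):

```java
// Self-contained demonstration of the generated enum members described above.
public class EnumMembersSketch {
    public enum DurabilityLike { USE_DEFAULT, SKIP_WAL, ASYNC_WAL, SYNC_WAL, FSYNC_WAL }

    public static void main(String[] args) {
        // values() returns an array of the constants, in declaration order.
        for (DurabilityLike c : DurabilityLike.values()) {
            System.out.println(c);
        }
        // valueOf(name) must match the declared identifier exactly;
        // extraneous whitespace or a wrong name throws IllegalArgumentException.
        DurabilityLike d = DurabilityLike.valueOf("SYNC_WAL");
        System.out.println(d == DurabilityLike.SYNC_WAL); // prints true
    }
}
```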
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/devapidocs/org/apache/hadoop/hbase/client/MasterSwitchType.html
--
diff --git a/devapidocs/org/apache/hadoop/hbase/client/MasterSwitchType.html 
b/devapidocs/org/apache/hadoop/hbase/client/MasterSwitchType.html
index d202499..c4ef3e8 100644
--- a/devapidocs/org/apache/hadoop/hbase/client/MasterSwitchType.html
+++ b/devapidocs/org/apache/hadoop/hbase/client/MasterSwitchType.html
@@ -221,7 +221,7 @@ the order they are declared.
 
 
 values
-public static MasterSwitchType[] values()
+public static MasterSwitchType[] values()
 Returns an array containing the constants of this enum 
type, in
 the order they are declared.  This method may be used to iterate
 over the constants as follows:
@@ -238,7 +238,7 @@ for (MasterSwitchType c : MasterSwitchType.values())
 
 
 valueOf
-public static MasterSwitchType valueOf(String name)
+public static MasterSwitchType valueOf(String name)
 Returns the enum constant of this type with the specified 
name.
 The string must match exactly an identifier used to declare an
 enum constant in this type.  (Extraneous whitespace characters are

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/devapidocs/org/apache/hadoop/hbase/client/RegionLoadStats.html
--
diff --git a/devapidocs/org/apache/hadoop/hbase/client/RegionLoadStats.html 
b/devapidocs/org/apache/hadoop/hbase/client/RegionLoadStats.html
index bec7ba8..8573c3a 100644
--- a/devapidocs/org/apache/hadoop/hbase/client/RegionLoadStats.html
+++ b/devapidocs/org/apache/hadoop/hbase/client/RegionLoadStats.html
@@ -99,6 +99,7 @@
 @InterfaceStability.Evolving
 public class RegionLoadStats
extends java.lang.Object
+POJO representing region server load
 
 
 

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/devapidocs/org/apache/hadoop/hbase/client/TableState.State.html
--
diff --git a/devapidocs/org/apache/hadoop/hbase/client/TableState.State.html 
b/devapidocs/org/apache/hadoop/hbase/client/TableState.State.html
index 7cf401d..9564a07 100644
--- a/devapidocs/org/apache/hadoop/hbase/client/TableState.State.html
+++ b/devapidocs/org/apache/hadoop/hbase/client/TableState.State.html
@@ -260,7 +260,7 @@ the order they are declared.
 
 
 values
-public static TableState.State[] values()
+public static TableState.State[] values()
 Returns an array containing the constants of this enum 
type, in
 the order they are declared.  This method may be used to iterate
 over the constants as follows:
@@ -277,7 +277,7 @@ for (TableState.State c : TableState.State.values())
 
 
 valueOf
-public static TableState.State valueOf(String name)
+public static TableState.State valueOf(String name)
 Returns the enum constant of this type with the specified 
name.
 The string must match exactly an identifier used to declare an
 enum constant 

[25/52] [partial] hbase-site git commit: Published site at b21c56e7958652ca6e6daf04642eb51abaf2b3d7.

2016-06-06 Thread misty
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/devapidocs/org/apache/hadoop/hbase/class-use/TableName.html
--
diff --git a/devapidocs/org/apache/hadoop/hbase/class-use/TableName.html 
b/devapidocs/org/apache/hadoop/hbase/class-use/TableName.html
index 0c43cc9..9cd9fbd 100644
--- a/devapidocs/org/apache/hadoop/hbase/class-use/TableName.html
+++ b/devapidocs/org/apache/hadoop/hbase/class-use/TableName.html
@@ -313,11 +313,11 @@ service.
 
 
 private TableName
-HRegionInfo.tableName
+MetaTableAccessor.TableVisitorBase.tableName
 
 
 private TableName
-MetaTableAccessor.TableVisitorBase.tableName
+HRegionInfo.tableName
 
 
 
@@ -759,39 +759,31 @@ service.
 
 
 private TableName
-HBaseAdmin.TableFuture.tableName
-
-
-private TableName
 ScannerCallableWithReplicas.tableName
 
-
-private TableName
-BufferedMutatorParams.tableName
-
 
 private TableName
-ClientScanner.tableName
+HTable.tableName
 
 
 protected TableName
-RegionAdminServiceCallable.tableName
+AbstractRegionServerCallable.tableName
 
 
 private TableName
-TableState.tableName
+HBaseAdmin.TableFuture.tableName
 
 
 private TableName
-AsyncProcess.AsyncRequestFutureImpl.tableName
+BufferedMutatorParams.tableName
 
 
 private TableName
-HRegionLocator.tableName
+AsyncProcess.AsyncRequestFutureImpl.tableName
 
 
 protected TableName
-AbstractRegionServerCallable.tableName
+RpcRetryingCallerWithReadReplicas.tableName
 
 
 private TableName
@@ -799,11 +791,19 @@ service.
 
 
 private TableName
-HTable.tableName
+HRegionLocator.tableName
 
 
 protected TableName
-RpcRetryingCallerWithReadReplicas.tableName
+RegionAdminServiceCallable.tableName
+
+
+private TableName
+ClientScanner.tableName
+
+
+private TableName
+TableState.tableName
 
 
 
@@ -837,33 +837,33 @@ service.
 
 
 TableName
-BufferedMutator.getName()
-Gets the fully qualified table name instance of the table 
that this BufferedMutator writes to.
-
+HTable.getName()
 
 
 TableName
-RegionLocator.getName()
-Gets the fully qualified table name instance of this 
table.
+BufferedMutator.getName()
+Gets the fully qualified table name instance of the table 
that this BufferedMutator writes to.
 
 
 
 TableName
-HRegionLocator.getName()
-
-
-TableName
 Table.getName()
 Gets the fully qualified table name instance of this 
table.
 
 
-
+
 TableName
 BufferedMutatorImpl.getName()
 
+
+TableName
+HRegionLocator.getName()
+
 
 TableName
-HTable.getName()
+RegionLocator.getName()
+Gets the fully qualified table name instance of this 
table.
+
 
 
 TableName
@@ -874,23 +874,23 @@ service.
 ClientScanner.getTable()
 
 
+TableName
+AbstractRegionServerCallable.getTableName()
+
+
 protected TableName
 HBaseAdmin.TableFuture.getTableName()
 
-
+
 TableName
 BufferedMutatorParams.getTableName()
 
-
+
 TableName
 TableState.getTableName()
 Table name for state
 
 
-
-TableName
-AbstractRegionServerCallable.getTableName()
-
 
 private TableName
HBaseAdmin.getTableNameBeforeRestoreSnapshot(String snapshotName)
@@ -1004,34 +1004,34 @@ service.
 
 
 void
-ClusterConnection.cacheLocation(TableNametableName,
-  RegionLocationslocation)
+ConnectionImplementation.cacheLocation(TableNametableName,
+  RegionLocationslocation)
+Put a newly discovered HRegionLocation into the cache.
+
 
 
 void
-MetaCache.cacheLocation(TableNametableName,
-  RegionLocationslocations)
-Put a newly discovered HRegionLocation into the cache.
-
+ClusterConnection.cacheLocation(TableNametableName,
+  RegionLocationslocation)
 
 
 void
-ConnectionImplementation.cacheLocation(TableNametableName,
-  RegionLocationslocation)
+MetaCache.cacheLocation(TableNametableName,
+  RegionLocationslocations)
 Put a newly discovered HRegionLocation into the cache.
 
 
 
-void
-MetaCache.cacheLocation(TableNametableName,
+private void
+ConnectionImplementation.cacheLocation(TableNametableName,
   ServerNamesource,
   HRegionLocationlocation)
 Put a newly discovered HRegionLocation into the cache.
 
 
 
-private void
-ConnectionImplementation.cacheLocation(TableNametableName,
+void
+MetaCache.cacheLocation(TableNametableName,
   ServerNamesource,
   HRegionLocationlocation)
 Put a newly discovered HRegionLocation into the cache.
@@ -1066,15 +1066,15 @@ service.
 
 
 void
+ConnectionImplementation.clearRegionCache(TableNametableName)
+
+
+void
 ClusterConnection.clearRegionCache(TableNametableName)
 Allows flushing the region cache of all locations that 
pertain to
  tableName
 
 
-
-void
-ConnectionImplementation.clearRegionCache(TableNametableName)
-
 
 void
 ConnectionImplementation.clearRegionCache(TableNametableName,
@@ 

[40/52] [partial] hbase-site git commit: Published site at b21c56e7958652ca6e6daf04642eb51abaf2b3d7.

2016-06-06 Thread misty
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/coc.html
--
diff --git a/coc.html b/coc.html
index 799d7b2..8060795 100644
--- a/coc.html
+++ b/coc.html
@@ -7,7 +7,7 @@
   
 
 
-
+
 
 Apache HBase  
   Code of Conduct Policy
@@ -331,7 +331,7 @@ For flagrant violations requiring a firm response the PMC 
may opt to skip early
 http://www.apache.org/;>The Apache Software 
Foundation.
 All rights reserved.  
 
-  Last Published: 
2016-06-01
+  Last Published: 
2016-06-06
 
 
 

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/cygwin.html
--
diff --git a/cygwin.html b/cygwin.html
index df7a7d6..2b3d8f7 100644
--- a/cygwin.html
+++ b/cygwin.html
@@ -7,7 +7,7 @@
   
 
 
-
+
 
 Apache HBase  Installing Apache HBase (TM) on Windows using 
Cygwin
 
@@ -673,7 +673,7 @@ Now your HBase server is running, start 
coding and build that next
 http://www.apache.org/;>The Apache Software 
Foundation.
 All rights reserved.  
 
-  Last Published: 
2016-06-01
+  Last Published: 
2016-06-06
 
 
 

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/dependencies.html
--
diff --git a/dependencies.html b/dependencies.html
index 2db795e..9767f0a 100644
--- a/dependencies.html
+++ b/dependencies.html
@@ -7,7 +7,7 @@
   
 
 
-
+
 
 Apache HBase  Project Dependencies
 
@@ -518,7 +518,7 @@
 http://www.apache.org/;>The Apache Software 
Foundation.
 All rights reserved.  
 
-  Last Published: 
2016-06-01
+  Last Published: 
2016-06-06
 
 
 

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/dependency-convergence.html
--
diff --git a/dependency-convergence.html b/dependency-convergence.html
index 71d5d71..dda57e4 100644
--- a/dependency-convergence.html
+++ b/dependency-convergence.html
@@ -7,7 +7,7 @@
   
 
 
-
+
 
 Apache HBase  Reactor Dependency Convergence
 
@@ -1703,7 +1703,7 @@
 http://www.apache.org/;>The Apache Software 
Foundation.
 All rights reserved.  
 
-  Last Published: 
2016-06-01
+  Last Published: 
2016-06-06
 
 
 

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/dependency-info.html
--
diff --git a/dependency-info.html b/dependency-info.html
index a831f85..ac0d97d 100644
--- a/dependency-info.html
+++ b/dependency-info.html
@@ -7,7 +7,7 @@
   
 
 
-
+
 
 Apache HBase  Dependency Information
 
@@ -312,7 +312,7 @@
 http://www.apache.org/;>The Apache Software 
Foundation.
 All rights reserved.  
 
-  Last Published: 
2016-06-01
+  Last Published: 
2016-06-06
 
 
 

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/dependency-management.html
--
diff --git a/dependency-management.html b/dependency-management.html
index 0a5632c..2fcf86c 100644
--- a/dependency-management.html
+++ b/dependency-management.html
@@ -7,7 +7,7 @@
   
 
 
-
+
 
 Apache HBase  Project Dependency Management
 
@@ -798,7 +798,7 @@
 http://www.apache.org/;>The Apache Software 
Foundation.
 All rights reserved.  
 
-  Last Published: 
2016-06-01
+  Last Published: 
2016-06-06
 
 
 

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/devapidocs/allclasses-frame.html
--
diff --git a/devapidocs/allclasses-frame.html b/devapidocs/allclasses-frame.html
index 4cb8057..c1b7a54 100644
--- a/devapidocs/allclasses-frame.html
+++ b/devapidocs/allclasses-frame.html
@@ -259,6 +259,9 @@
 Canary.RegionTask.TaskType
 Canary.Sink
 Canary.StdOutSink
+Canary.ZookeeperMonitor
+Canary.ZookeeperStdOutSink
+Canary.ZookeeperTask
 CancelableProgressable
 Cancellable
 CatalogJanitor
@@ -1724,8 +1727,10 @@
 ReplicationPeerZKImpl
 ReplicationQueueInfo
 ReplicationQueues
+ReplicationQueuesArguments
 ReplicationQueuesClient
 

[27/52] [partial] hbase-site git commit: Published site at b21c56e7958652ca6e6daf04642eb51abaf2b3d7.

2016-06-06 Thread misty
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/devapidocs/org/apache/hadoop/hbase/class-use/ServerName.html
--
diff --git a/devapidocs/org/apache/hadoop/hbase/class-use/ServerName.html 
b/devapidocs/org/apache/hadoop/hbase/class-use/ServerName.html
index 092c69c..bb0f5f4 100644
--- a/devapidocs/org/apache/hadoop/hbase/class-use/ServerName.html
+++ b/devapidocs/org/apache/hadoop/hbase/class-use/ServerName.html
@@ -591,11 +591,11 @@
 
 
 private ServerName
-FastFailInterceptorContext.server
+AsyncProcess.AsyncRequestFutureImpl.SingleServerRequestRunnable.server
 
 
 private ServerName
-AsyncProcess.AsyncRequestFutureImpl.SingleServerRequestRunnable.server
+FastFailInterceptorContext.server
 
 
 private ServerName
@@ -698,16 +698,16 @@
 
 
 
-void
-MetaCache.cacheLocation(TableNametableName,
+private void
+ConnectionImplementation.cacheLocation(TableNametableName,
   ServerNamesource,
   HRegionLocationlocation)
 Put a newly discovered HRegionLocation into the cache.
 
 
 
-private void
-ConnectionImplementation.cacheLocation(TableNametableName,
+void
+MetaCache.cacheLocation(TableNametableName,
   ServerNamesource,
   HRegionLocationlocation)
 Put a newly discovered HRegionLocation into the cache.
@@ -736,13 +736,13 @@
 
 
 void
-ClusterConnection.clearCaches(ServerNamesn)
-Clear any caches that pertain to server name 
sn.
-
+ConnectionImplementation.clearCaches(ServerNameserverName)
 
 
 void
-ConnectionImplementation.clearCaches(ServerNameserverName)
+ClusterConnection.clearCaches(ServerNamesn)
+Clear any caches that pertain to server name 
sn.
+
 
 
 void
@@ -830,13 +830,13 @@
 
 
 org.apache.hadoop.hbase.protobuf.generated.AdminProtos.AdminService.BlockingInterface
-ClusterConnection.getAdmin(ServerNameserverName)
-Establishes a connection to the region server at the 
specified address.
-
+ConnectionImplementation.getAdmin(ServerNameserverName)
 
 
 org.apache.hadoop.hbase.protobuf.generated.AdminProtos.AdminService.BlockingInterface
-ConnectionImplementation.getAdmin(ServerNameserverName)
+ClusterConnection.getAdmin(ServerNameserverName)
+Establishes a connection to the region server at the 
specified address.
+
 
 
private Long
@@ -845,15 +845,15 @@
 
 
 org.apache.hadoop.hbase.protobuf.generated.ClientProtos.ClientService.BlockingInterface
+ConnectionImplementation.getClient(ServerNamesn)
+
+
+org.apache.hadoop.hbase.protobuf.generated.ClientProtos.ClientService.BlockingInterface
 ClusterConnection.getClient(ServerNameserverName)
 Establishes a connection to the region server at the 
specified address, and returns
  a region client protocol.
 
 
-
-org.apache.hadoop.hbase.protobuf.generated.ClientProtos.ClientService.BlockingInterface
-ConnectionImplementation.getClient(ServerNamesn)
-
 
 org.apache.hadoop.hbase.protobuf.generated.ClientProtos.ClientService.BlockingInterface
 CoprocessorHConnection.getClient(ServerNameserverName)
@@ -911,22 +911,22 @@
 
 
 boolean
+ConnectionImplementation.isDeadServer(ServerNamesn)
+
+
+boolean
 ClusterConnection.isDeadServer(ServerNameserverName)
 Deprecated.
 internal method, do not use thru 
ClusterConnection
 
 
 
-
+
 boolean
 ClusterStatusListener.isDeadServer(ServerNamesn)
 Check if we know if a server is dead.
 
 
-
-boolean
-ConnectionImplementation.isDeadServer(ServerNamesn)
-
 
 protected boolean
 PreemptiveFastFailInterceptor.isServerInFailureMap(ServerNameserverName)
@@ -1054,23 +1054,23 @@
 
 
 void
-ClusterConnection.updateCachedLocations(TableNametableName,
+ConnectionImplementation.updateCachedLocations(TableNametableName,
   byte[]regionName,
   byte[]rowkey,
  Object exception,
   ServerNamesource)
-Update the location cache.
+Update the location with the new value (if the exception is 
a RegionMovedException)
+ or delete it from the cache.
 
 
 
 void
-ConnectionImplementation.updateCachedLocations(TableNametableName,
+ClusterConnection.updateCachedLocations(TableNametableName,
   byte[]regionName,
   byte[]rowkey,
  Object exception,
   ServerNamesource)
-Update the location with the new value (if the exception is 
a RegionMovedException)
- or delete it from the cache.
+Update the location cache.
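The updateCachedLocations description above boils down to: on a failure against a cached region location, either repoint the cache entry (when the error carries the new address, as with RegionMovedException) or evict it so the next call re-looks it up. A small sketch of that update-or-evict policy — the names here are stand-ins, not the ConnectionImplementation/MetaCache API:

```java
import java.util.HashMap;
import java.util.Map;

// Toy region-location cache with the update-or-evict behavior described above.
public class LocationCacheSketch {
    private final Map<String, String> regionToServer = new HashMap<>();

    public void cacheLocation(String regionName, String serverName) {
        regionToServer.put(regionName, serverName);
    }

    // newServer != null models a "region moved" error that names the new server;
    // null models any other failure, where the stale entry is simply dropped.
    public void updateCachedLocation(String regionName, String newServer) {
        if (newServer != null) {
            regionToServer.put(regionName, newServer); // region moved: repoint the entry
        } else {
            regionToServer.remove(regionName);         // unknown failure: evict, force re-lookup
        }
    }

    public String locate(String regionName) {
        return regionToServer.get(regionName);
    }

    public static void main(String[] args) {
        LocationCacheSketch cache = new LocationCacheSketch();
        cache.cacheLocation("region-1", "server-a");
        cache.updateCachedLocation("region-1", "server-b"); // moved: repoint
        System.out.println(cache.locate("region-1"));       // prints server-b
        cache.updateCachedLocation("region-1", null);       // other failure: evict
        System.out.println(cache.locate("region-1"));       // prints null
    }
}
```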

[36/52] [partial] hbase-site git commit: Published site at b21c56e7958652ca6e6daf04642eb51abaf2b3d7.

2016-06-06 Thread misty
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/devapidocs/org/apache/hadoop/hbase/class-use/Cell.html
--
diff --git a/devapidocs/org/apache/hadoop/hbase/class-use/Cell.html 
b/devapidocs/org/apache/hadoop/hbase/class-use/Cell.html
index 0c08898..4b0da57 100644
--- a/devapidocs/org/apache/hadoop/hbase/class-use/Cell.html
+++ b/devapidocs/org/apache/hadoop/hbase/class-use/Cell.html
@@ -1638,9 +1638,9 @@ service.
 
 
 
-Append
-Append.add(Cellcell)
-Add column and value to this Append operation.
+Put
+Put.add(Cellkv)
+Add the specified KeyValue to this Put operation.
 
 
 
@@ -1650,9 +1650,9 @@ service.
 
 
 
-Put
-Put.add(Cellkv)
-Add the specified KeyValue to this Put operation.
+Append
+Append.add(Cellcell)
+Add column and value to this Append operation.
 
 
 
@@ -1743,17 +1743,17 @@ service.
 booleanpartial)
 
 
-Append
-Append.setFamilyCellMap(NavigableMap<byte[], List<Cell>> map)
+Put
+Put.setFamilyCellMap(NavigableMap<byte[], List<Cell>> map)
 
 
-Delete
-Delete.setFamilyCellMap(NavigableMap<byte[], List<Cell>> map)
-
-
 Increment
Increment.setFamilyCellMap(NavigableMap<byte[], List<Cell>> map)
 
+
+Append
+Append.setFamilyCellMap(NavigableMap<byte[], List<Cell>> map)
+
 
 Mutation
Mutation.setFamilyCellMap(NavigableMap<byte[], List<Cell>> map)
@@ -1761,8 +1761,8 @@ service.
 
 
 
-Put
-Put.setFamilyCellMap(NavigableMap<byte[], List<Cell>> map)
+Delete
+Delete.setFamilyCellMap(NavigableMap<byte[], List<Cell>> map)
 
 
 
@@ -1795,23 +1795,23 @@ service.
 
 
 
-Long
-LongColumnInterpreter.getValue(byte[]colFamily,
-byte[]colQualifier,
-Cellkv)
-
-
Double
 DoubleColumnInterpreter.getValue(byte[]colFamily,
 byte[]colQualifier,
 Cellc)
 
-
+
BigDecimal
 BigDecimalColumnInterpreter.getValue(byte[]colFamily,
 byte[]colQualifier,
 Cellkv)
 
+
+Long
+LongColumnInterpreter.getValue(byte[]colFamily,
+byte[]colQualifier,
+Cellkv)
+
 
 
 
@@ -1827,13 +1827,13 @@ service.
 
 
 
-private Cell
-BaseDecoder.current
-
-
 protected Cell
 KeyValueCodec.ByteBufferedKeyValueDecoder.current
 
+
+private Cell
+BaseDecoder.current
+
 
 
 
@@ -1857,33 +1857,33 @@ service.
 
 
 Cell
-BaseDecoder.current()
+KeyValueCodec.ByteBufferedKeyValueDecoder.current()
 
 
 Cell
-KeyValueCodec.ByteBufferedKeyValueDecoder.current()
+BaseDecoder.current()
 
 
 protected Cell
-CellCodec.CellDecoder.parseCell()

[05/52] [partial] hbase-site git commit: Published site at b21c56e7958652ca6e6daf04642eb51abaf2b3d7.

2016-06-06 Thread misty
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/devapidocs/org/apache/hadoop/hbase/io/class-use/ImmutableBytesWritable.html
--
diff --git 
a/devapidocs/org/apache/hadoop/hbase/io/class-use/ImmutableBytesWritable.html 
b/devapidocs/org/apache/hadoop/hbase/io/class-use/ImmutableBytesWritable.html
index 64d6d2b..6795665 100644
--- 
a/devapidocs/org/apache/hadoop/hbase/io/class-use/ImmutableBytesWritable.html
+++ 
b/devapidocs/org/apache/hadoop/hbase/io/class-use/ImmutableBytesWritable.html
@@ -158,11 +158,11 @@ Input/OutputFormats, a table indexing MapReduce job, and 
utility methods.
 
 
 ImmutableBytesWritable
-TableSnapshotInputFormat.TableSnapshotRecordReader.createKey()
+TableRecordReaderImpl.createKey()
 
 
 ImmutableBytesWritable
-TableRecordReaderImpl.createKey()
+TableSnapshotInputFormat.TableSnapshotRecordReader.createKey()
 
 
 ImmutableBytesWritable
@@ -179,11 +179,9 @@ Input/OutputFormats, a table indexing MapReduce job, and 
utility methods.
 
 
 org.apache.hadoop.mapred.RecordReaderImmutableBytesWritable,Result
-TableInputFormatBase.getRecordReader(org.apache.hadoop.mapred.InputSplitsplit,
+MultiTableSnapshotInputFormat.getRecordReader(org.apache.hadoop.mapred.InputSplitsplit,
   org.apache.hadoop.mapred.JobConfjob,
-  
org.apache.hadoop.mapred.Reporterreporter)
-Builds a TableRecordReader.
-
+  
org.apache.hadoop.mapred.Reporterreporter)
 
 
 org.apache.hadoop.mapred.RecordReaderImmutableBytesWritable,Result
@@ -193,9 +191,11 @@ Input/OutputFormats, a table indexing MapReduce job, and 
utility methods.
 
 
 org.apache.hadoop.mapred.RecordReaderImmutableBytesWritable,Result
-MultiTableSnapshotInputFormat.getRecordReader(org.apache.hadoop.mapred.InputSplitsplit,
+TableInputFormatBase.getRecordReader(org.apache.hadoop.mapred.InputSplitsplit,
   org.apache.hadoop.mapred.JobConfjob,
-  
org.apache.hadoop.mapred.Reporterreporter)
+  
org.apache.hadoop.mapred.Reporterreporter)
+Builds a TableRecordReader.
+
 
 
 
@@ -214,20 +214,20 @@ Input/OutputFormats, a table indexing MapReduce job, and 
utility methods.
 
 
 void
-IdentityTableMap.map(ImmutableBytesWritablekey,
+GroupingTableMap.map(ImmutableBytesWritablekey,
   Resultvalue,
   org.apache.hadoop.mapred.OutputCollectorImmutableBytesWritable,Resultoutput,
   org.apache.hadoop.mapred.Reporterreporter)
-Pass the key, value to reduce
+Extract the grouping columns from value to construct a new 
key.
 
 
 
 void
-GroupingTableMap.map(ImmutableBytesWritablekey,
+IdentityTableMap.map(ImmutableBytesWritablekey,
   Resultvalue,
   org.apache.hadoop.mapred.OutputCollectorImmutableBytesWritable,Resultoutput,
   org.apache.hadoop.mapred.Reporterreporter)
-Extract the grouping columns from value to construct a new 
key.
+Pass the key, value to reduce
 
 
 
@@ -239,12 +239,12 @@ Input/OutputFormats, a table indexing MapReduce job, and 
utility methods.
 
 
 boolean
-TableSnapshotInputFormat.TableSnapshotRecordReader.next(ImmutableBytesWritablekey,
+TableRecordReaderImpl.next(ImmutableBytesWritablekey,
 Resultvalue)
 
 
 boolean
-TableRecordReaderImpl.next(ImmutableBytesWritablekey,
+TableSnapshotInputFormat.TableSnapshotRecordReader.next(ImmutableBytesWritablekey,
 Resultvalue)
 
 
@@ -277,20 +277,20 @@ Input/OutputFormats, a table indexing MapReduce job, and 
utility methods.
 
 
 void
-IdentityTableMap.map(ImmutableBytesWritablekey,
+GroupingTableMap.map(ImmutableBytesWritablekey,
   Resultvalue,
   org.apache.hadoop.mapred.OutputCollectorImmutableBytesWritable,Resultoutput,
   org.apache.hadoop.mapred.Reporterreporter)
-Pass the key, value to reduce
+Extract the grouping columns from value to construct a new 
key.
 
 
 
 void
-GroupingTableMap.map(ImmutableBytesWritablekey,
+IdentityTableMap.map(ImmutableBytesWritablekey,
   Resultvalue,
   org.apache.hadoop.mapred.OutputCollectorImmutableBytesWritable,Resultoutput,
   org.apache.hadoop.mapred.Reporterreporter)
-Extract the grouping columns from value to construct a new 
key.
+Pass the key, value to reduce
 
 
 
@@ -345,7 +345,7 @@ Input/OutputFormats, a table indexing MapReduce job, and 
utility methods.
 
 
 private ImmutableBytesWritable
-HashTable.TableHash.Reader.key
+TableRecordReaderImpl.key
 
 
 private ImmutableBytesWritable
@@ -353,7 +353,7 @@ Input/OutputFormats, a table indexing MapReduce job, and 
utility methods.
 
 
 private ImmutableBytesWritable
-TableRecordReaderImpl.key
+HashTable.TableHash.Reader.key
 
 
 (package private) ImmutableBytesWritable
@@ -423,32 +423,32 @@ Input/OutputFormats, a table indexing MapReduce job, and 
utility methods.
 
 
 ImmutableBytesWritable
-HashTable.TableHash.Reader.getCurrentKey()
-Get the current key
-

[51/52] [partial] hbase-site git commit: Published site at b21c56e7958652ca6e6daf04642eb51abaf2b3d7.

2016-06-06 Thread misty
Published site at b21c56e7958652ca6e6daf04642eb51abaf2b3d7.


Project: http://git-wip-us.apache.org/repos/asf/hbase-site/repo
Commit: http://git-wip-us.apache.org/repos/asf/hbase-site/commit/d434d867
Tree: http://git-wip-us.apache.org/repos/asf/hbase-site/tree/d434d867
Diff: http://git-wip-us.apache.org/repos/asf/hbase-site/diff/d434d867

Branch: refs/heads/asf-site
Commit: d434d8676182411d651f11cdd5d8d86c01a83de9
Parents: e263ab8
Author: jenkins 
Authored: Mon Jun 6 15:07:15 2016 +
Committer: Misty Stanley-Jones 
Committed: Mon Jun 6 11:25:33 2016 -0700

--
 acid-semantics.html |4 +-
 apache_hbase_reference_guide.pdf|4 +-
 apache_hbase_reference_guide.pdfmarks   |4 +-
 apidocs/constant-values.html|   21 +-
 apidocs/index-all.html  |6 +-
 .../apache/hadoop/hbase/HColumnDescriptor.html  |   34 +-
 .../apache/hadoop/hbase/KeepDeletedCells.html   |4 +-
 apidocs/org/apache/hadoop/hbase/RegionLoad.html |   47 +-
 apidocs/org/apache/hadoop/hbase/ServerLoad.html |   84 +-
 .../org/apache/hadoop/hbase/class-use/Cell.html |  226 +-
 .../hadoop/hbase/class-use/ServerName.html  |4 +-
 .../hadoop/hbase/class-use/TableName.html   |   10 +-
 .../hadoop/hbase/client/CompactionState.html|4 +-
 .../apache/hadoop/hbase/client/Consistency.html |4 +-
 .../apache/hadoop/hbase/client/Durability.html  |4 +-
 .../hadoop/hbase/client/IsolationLevel.html |4 +-
 .../hadoop/hbase/client/RegionLoadStats.html|1 +
 .../hadoop/hbase/client/SnapshotType.html   |4 +-
 .../hbase/client/class-use/Consistency.html |8 +-
 .../hbase/client/class-use/Durability.html  |   10 +-
 .../hbase/client/class-use/IsolationLevel.html  |8 +-
 .../hadoop/hbase/client/class-use/Mutation.html |8 +-
 .../hadoop/hbase/client/class-use/Result.html   |   28 +-
 .../hadoop/hbase/client/class-use/Row.html  |4 +-
 .../hadoop/hbase/client/class-use/Scan.html |   12 +-
 .../hadoop/hbase/client/package-summary.html|4 +-
 .../hadoop/hbase/client/package-tree.html   |8 +-
 .../apache/hadoop/hbase/client/package-use.html |4 +-
 .../hbase/filter/CompareFilter.CompareOp.html   |4 +-
 .../filter/class-use/ByteArrayComparable.html   |8 +-
 .../class-use/CompareFilter.CompareOp.html  |8 +-
 .../filter/class-use/Filter.ReturnCode.html |   66 +-
 .../hadoop/hbase/filter/class-use/Filter.html   |   60 +-
 .../hadoop/hbase/filter/package-tree.html   |6 +-
 .../io/class-use/ImmutableBytesWritable.html|   48 +-
 .../hbase/io/crypto/class-use/Cipher.html   |8 +-
 .../hbase/io/encoding/DataBlockEncoding.html|4 +-
 .../apache/hadoop/hbase/quotas/QuotaType.html   |4 +-
 .../hbase/quotas/ThrottlingException.Type.html  |4 +-
 .../hadoop/hbase/quotas/package-tree.html   |2 +-
 .../hadoop/hbase/regionserver/BloomType.html|4 +-
 .../hadoop/hbase/util/class-use/ByteRange.html  |   98 +-
 .../hadoop/hbase/util/class-use/Order.html  |   40 +-
 .../util/class-use/PositionedByteRange.html |  358 +-
 apidocs/overview-tree.html  |   26 +-
 .../apache/hadoop/hbase/HColumnDescriptor.html  |2 +-
 .../org/apache/hadoop/hbase/RegionLoad.html |  417 +-
 .../org/apache/hadoop/hbase/ServerLoad.html |  550 +-
 .../hadoop/hbase/client/RegionLoadStats.html|   10 +-
 book.html   |2 +-
 bulk-loads.html |4 +-
 checkstyle-aggregate.html   | 6172 +-
 checkstyle.rss  |  780 +--
 coc.html|4 +-
 cygwin.html |4 +-
 dependencies.html   |4 +-
 dependency-convergence.html |4 +-
 dependency-info.html|4 +-
 dependency-management.html  |4 +-
 devapidocs/allclasses-frame.html|5 +
 devapidocs/allclasses-noframe.html  |5 +
 devapidocs/constant-values.html |   60 +-
 devapidocs/deprecated-list.html |  228 +-
 devapidocs/index-all.html   |  218 +-
 .../apache/hadoop/hbase/HColumnDescriptor.html  |4 +-
 .../apache/hadoop/hbase/KeepDeletedCells.html   |4 +-
 .../org/apache/hadoop/hbase/KeyValue.Type.html  |4 +-
 .../hbase/MetaTableAccessor.QueryType.html  |4 +-
 .../org/apache/hadoop/hbase/RegionLoad.html |   47 +-
 .../org/apache/hadoop/hbase/ServerLoad.html |   74 +-
 .../hadoop/hbase/class-use/Abortable.html   |   62 +-
 .../org/apache/hadoop/hbase/class-use/Cell.html |  976 +--
 .../hadoop/hbase/class-use/CellComparator.html  |  

[38/52] [partial] hbase-site git commit: Published site at b21c56e7958652ca6e6daf04642eb51abaf2b3d7.

2016-06-06 Thread misty
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/devapidocs/index-all.html
--
diff --git a/devapidocs/index-all.html b/devapidocs/index-all.html
index bb5d035..bdba444 100644
--- a/devapidocs/index-all.html
+++ b/devapidocs/index-all.html
@@ -174,6 +174,8 @@
 
 abort(String,
 Throwable) - Method in class org.apache.hadoop.hbase.replication.ReplicationPeerZKImpl
 
+abort
 - Variable in class org.apache.hadoop.hbase.replication.ReplicationQueuesArguments
+
 abort(String,
 Throwable) - Method in class org.apache.hadoop.hbase.zookeeper.ZooKeeperWatcher
 
 ABORT_ON_ERROR_KEY
 - Static variable in class org.apache.hadoop.hbase.coprocessor.CoprocessorHost
@@ -196,6 +198,8 @@
 
 abortable
 - Variable in class org.apache.hadoop.hbase.replication.ReplicationPeersZKImpl
 
+abortable
 - Variable in class org.apache.hadoop.hbase.replication.ReplicationQueuesHBaseImpl
+
 abortable
 - Variable in class org.apache.hadoop.hbase.replication.ReplicationStateZKBase
 
 abortable
 - Variable in class org.apache.hadoop.hbase.zookeeper.DrainingServerTracker
@@ -1492,6 +1496,8 @@
 
 Add new hfile references to the queue.
 
+addHFileRefs(String,
 ListString) - Method in class 
org.apache.hadoop.hbase.replication.ReplicationQueuesHBaseImpl
+
 addHFileRefs(String,
 ListString) - Method in class 
org.apache.hadoop.hbase.replication.ReplicationQueuesZKImpl
 
 addHFileRefsToQueue(ReplicationSourceManager,
 TableName, byte[], WALProtos.StoreDescriptor) - Static method in 
class org.apache.hadoop.hbase.replication.regionserver.Replication
@@ -1629,6 +1635,8 @@
 
 Add a new WAL file to the given queue.
 
+addLog(String,
 String) - Method in class org.apache.hadoop.hbase.replication.ReplicationQueuesHBaseImpl
+
 addLog(String,
 String) - Method in class org.apache.hadoop.hbase.replication.ReplicationQueuesZKImpl
 
 addLogFile(String,
 String) - Method in class org.apache.hadoop.hbase.snapshot.SnapshotInfo.SnapshotStats
@@ -1727,6 +1735,8 @@
 
 Add a peer to hfile reference queue if peer does not 
exist.
 
+addPeerToHFileRefs(String)
 - Method in class org.apache.hadoop.hbase.replication.ReplicationQueuesHBaseImpl
+
 addPeerToHFileRefs(String)
 - Method in class org.apache.hadoop.hbase.replication.ReplicationQueuesZKImpl
 
 addPlan(String,
 RegionPlan) - Method in class org.apache.hadoop.hbase.master.AssignmentManager
@@ -2157,6 +2167,8 @@
 
 The Admin.
 
+admin
 - Variable in class org.apache.hadoop.hbase.replication.ReplicationQueuesHBaseImpl
+
 admin
 - Variable in class org.apache.hadoop.hbase.tool.Canary.Monitor
 
 admin
 - Variable in class org.apache.hadoop.hbase.util.ConnectionCache.ConnectionInfo
@@ -3192,8 +3204,6 @@
 
 assignMetaReplica(int)
 - Method in class org.apache.hadoop.hbase.util.HBaseFsck
 
-assignRegion(HRegionInfo)
 - Method in class org.apache.hadoop.hbase.master.HMaster
-
 assignRegion(RpcController,
 MasterProtos.AssignRegionRequest) - Method in class 
org.apache.hadoop.hbase.master.MasterRpcServices
 
 assignRegions(MasterProcedureEnv,
 TableName, ListHRegionInfo) - Static method in class 
org.apache.hadoop.hbase.master.procedure.CreateTableProcedure
@@ -5597,6 +5607,11 @@
 
 buildServerLoad(long,
 long) - Method in class org.apache.hadoop.hbase.regionserver.HRegionServer
 
+buildServerQueueName(String)
 - Method in class org.apache.hadoop.hbase.replication.ReplicationQueuesHBaseImpl
+
+Builds the unique identifier for a queue in the Replication 
table by appending the queueId to
+ the servername
+
 buildServiceCall(byte[],
 Descriptors.MethodDescriptor, Message) - Static method in class 
org.apache.hadoop.hbase.ipc.CoprocessorRpcUtils
 
 Returns a service call instance for the given coprocessor 
request.
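The hunk above adds `buildServerQueueName(String)` to `ReplicationQueuesHBaseImpl`, described as building "the unique identifier for a queue in the Replication table by appending the queueId to the servername". A minimal sketch of that composition follows; the class name, method shape, and separator are illustrative assumptions, not taken from the HBase source:

```java
// Hypothetical sketch of the composition described in the index entry
// above: a replication-queue row key formed by appending the queue id
// to the region server name. Not the actual HBase implementation.
public class QueueNameSketch {

    // The delimiter is an assumption for illustration; HBase's actual
    // row-key layout may differ.
    private static final String SEPARATOR = "-";

    static String buildServerQueueName(String serverName, String queueId) {
        return serverName + SEPARATOR + queueId;
    }

    public static void main(String[] args) {
        System.out.println(
            buildServerQueueName("rs1.example.com,16020,1465264515", "peer1"));
    }
}
```

Keying the table this way keeps all queues for one server adjacent in sort order, which is presumably why the identifier leads with the server name.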
@@ -6729,6 +6744,8 @@
 
 call()
 - Method in class org.apache.hadoop.hbase.tool.Canary.RegionTask
 
+call()
 - Method in class org.apache.hadoop.hbase.tool.Canary.ZookeeperTask
+
 call()
 - Method in class org.apache.hadoop.hbase.util.HBaseFsck.CheckRegionConsistencyWorkItem
 
 call()
 - Method in class org.apache.hadoop.hbase.util.HBaseFsck.FileLockCallable
@@ -7007,6 +7024,18 @@
 
 Canary.StdOutSink()
 - Constructor for class org.apache.hadoop.hbase.tool.Canary.StdOutSink
 
+Canary.ZookeeperMonitor - Class in org.apache.hadoop.hbase.tool
+
+Canary.ZookeeperMonitor(Connection,
 String[], boolean, Canary.ExtendedSink, ExecutorService, boolean) - 
Constructor for class org.apache.hadoop.hbase.tool.Canary.ZookeeperMonitor
+
+Canary.ZookeeperStdOutSink - Class in org.apache.hadoop.hbase.tool
+
+Canary.ZookeeperStdOutSink()
 - Constructor for class org.apache.hadoop.hbase.tool.Canary.ZookeeperStdOutSink
+
+Canary.ZookeeperTask - Class in org.apache.hadoop.hbase.tool
+
+Canary.ZookeeperTask(Connection,
 String, String, int, Canary.ZookeeperStdOutSink) - Constructor for 
class org.apache.hadoop.hbase.tool.Canary.ZookeeperTask
+
 CANARY_TABLE_FAMILY_NAME
 - Static variable in class org.apache.hadoop.hbase.tool.Canary
 
 cancel()
 - Method 

[04/52] [partial] hbase-site git commit: Published site at b21c56e7958652ca6e6daf04642eb51abaf2b3d7.

2016-06-06 Thread misty
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Encryptor.html
--
diff --git 
a/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Encryptor.html 
b/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Encryptor.html
index 1866697..2e717c9 100644
--- a/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Encryptor.html
+++ b/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Encryptor.html
@@ -203,15 +203,15 @@
 
 
 private Encryptor
-SecureAsyncProtobufLogWriter.encryptor
+SecureWALCellCodec.encryptor
 
 
 private Encryptor
-SecureWALCellCodec.encryptor
+SecureWALCellCodec.EncryptedKvEncoder.encryptor
 
 
 private Encryptor
-SecureWALCellCodec.EncryptedKvEncoder.encryptor
+SecureAsyncProtobufLogWriter.encryptor
 
 
 private Encryptor
@@ -238,15 +238,15 @@
 
 
 protected void
-AbstractProtobufLogWriter.setEncryptor(Encryptorencryptor)
+SecureAsyncProtobufLogWriter.setEncryptor(Encryptorencryptor)
 
 
 protected void
-SecureAsyncProtobufLogWriter.setEncryptor(Encryptorencryptor)
+SecureProtobufLogWriter.setEncryptor(Encryptorencryptor)
 
 
 protected void
-SecureProtobufLogWriter.setEncryptor(Encryptorencryptor)
+AbstractProtobufLogWriter.setEncryptor(Encryptorencryptor)
 
 
 

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/devapidocs/org/apache/hadoop/hbase/io/encoding/DataBlockEncoding.html
--
diff --git 
a/devapidocs/org/apache/hadoop/hbase/io/encoding/DataBlockEncoding.html 
b/devapidocs/org/apache/hadoop/hbase/io/encoding/DataBlockEncoding.html
index 6db867c..4e14d10 100644
--- a/devapidocs/org/apache/hadoop/hbase/io/encoding/DataBlockEncoding.html
+++ b/devapidocs/org/apache/hadoop/hbase/io/encoding/DataBlockEncoding.html
@@ -422,7 +422,7 @@ the order they are declared.
 
 
 values
-public staticDataBlockEncoding[]values()
+public staticDataBlockEncoding[]values()
 Returns an array containing the constants of this enum 
type, in
 the order they are declared.  This method may be used to iterate
 over the constants as follows:
@@ -439,7 +439,7 @@ for (DataBlockEncoding c : DataBlockEncoding.values())
 
 
 valueOf
-public staticDataBlockEncodingvalueOf(http://docs.oracle.com/javase/7/docs/api/java/lang/String.html?is-external=true;
 title="class or interface in java.lang">Stringname)
+public staticDataBlockEncodingvalueOf(http://docs.oracle.com/javase/7/docs/api/java/lang/String.html?is-external=true;
 title="class or interface in java.lang">Stringname)
 Returns the enum constant of this type with the specified 
name.
 The string must match exactly an identifier used to declare an
 enum constant in this type.  (Extraneous whitespace characters are 

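The diff above only re-renders the standard generated documentation for `DataBlockEncoding.values()` and `valueOf(String)`. These are the compiler-generated methods every Java enum gets, so their contract can be shown with a small stand-in enum (used here because `DataBlockEncoding` itself is not reproduced in this mail):

```java
// Stand-in enum illustrating the generated values()/valueOf(String)
// methods whose javadoc the hunk above touches.
enum Encoding { NONE, PREFIX, DIFF, FAST_DIFF }

public class EnumDemo {
    public static void main(String[] args) {
        // values() returns the constants in declaration order.
        for (Encoding e : Encoding.values()) {
            System.out.println(e);
        }
        // valueOf(String) requires an exact identifier match;
        // any other string throws IllegalArgumentException.
        Encoding d = Encoding.valueOf("DIFF");
        System.out.println(d.ordinal()); // prints 2
    }
}
```

The "extraneous whitespace characters are not permitted" caveat in the truncated sentence above is exactly the strict-match rule: `valueOf(" DIFF ")` would throw rather than trim.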
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/devapidocs/org/apache/hadoop/hbase/io/encoding/class-use/BufferedDataBlockEncoder.SeekerState.html
--
diff --git 
a/devapidocs/org/apache/hadoop/hbase/io/encoding/class-use/BufferedDataBlockEncoder.SeekerState.html
 
b/devapidocs/org/apache/hadoop/hbase/io/encoding/class-use/BufferedDataBlockEncoder.SeekerState.html
index 9bbd12f..406afe2 100644
--- 
a/devapidocs/org/apache/hadoop/hbase/io/encoding/class-use/BufferedDataBlockEncoder.SeekerState.html
+++ 
b/devapidocs/org/apache/hadoop/hbase/io/encoding/class-use/BufferedDataBlockEncoder.SeekerState.html
@@ -147,15 +147,15 @@
 
 
 protected void
-FastDiffDeltaEncoder.FastDiffSeekerState.copyFromNext(BufferedDataBlockEncoder.SeekerStatethat)
-
-
-protected void
 BufferedDataBlockEncoder.SeekerState.copyFromNext(BufferedDataBlockEncoder.SeekerStatenextState)
 Copy the state from the next one into this instance (the 
previous state
  placeholder).
 
 
+
+protected void
+FastDiffDeltaEncoder.FastDiffSeekerState.copyFromNext(BufferedDataBlockEncoder.SeekerStatethat)
+
 
 
 

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/devapidocs/org/apache/hadoop/hbase/io/encoding/class-use/CompressionState.html
--
diff --git 
a/devapidocs/org/apache/hadoop/hbase/io/encoding/class-use/CompressionState.html
 
b/devapidocs/org/apache/hadoop/hbase/io/encoding/class-use/CompressionState.html
index 6871fde..fa89103 100644
--- 
a/devapidocs/org/apache/hadoop/hbase/io/encoding/class-use/CompressionState.html
+++ 
b/devapidocs/org/apache/hadoop/hbase/io/encoding/class-use/CompressionState.html
@@ -117,11 +117,11 @@
 
 
 (package private) void
-FastDiffDeltaEncoder.FastDiffCompressionState.copyFrom(CompressionStatestate)
+CompressionState.copyFrom(CompressionStatestate)
 
 
 (package private) void
-CompressionState.copyFrom(CompressionStatestate)
+FastDiffDeltaEncoder.FastDiffCompressionState.copyFrom(CompressionStatestate)
 
 
 (package private) void


[01/52] [partial] hbase-site git commit: Published site at b21c56e7958652ca6e6daf04642eb51abaf2b3d7.

2016-06-06 Thread misty
Repository: hbase-site
Updated Branches:
  refs/heads/asf-site e263ab835 -> 120c5b170


http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/devapidocs/org/apache/hadoop/hbase/io/hfile/class-use/HFileBlockIndex.BlockIndexReader.html
--
diff --git 
a/devapidocs/org/apache/hadoop/hbase/io/hfile/class-use/HFileBlockIndex.BlockIndexReader.html
 
b/devapidocs/org/apache/hadoop/hbase/io/hfile/class-use/HFileBlockIndex.BlockIndexReader.html
index 7fe5bdb..19962b5 100644
--- 
a/devapidocs/org/apache/hadoop/hbase/io/hfile/class-use/HFileBlockIndex.BlockIndexReader.html
+++ 
b/devapidocs/org/apache/hadoop/hbase/io/hfile/class-use/HFileBlockIndex.BlockIndexReader.html
@@ -136,11 +136,11 @@
 
 
 HFileBlockIndex.BlockIndexReader
-HFileReaderImpl.getDataBlockIndexReader()
+HFile.Reader.getDataBlockIndexReader()
 
 
 HFileBlockIndex.BlockIndexReader
-HFile.Reader.getDataBlockIndexReader()
+HFileReaderImpl.getDataBlockIndexReader()
 
 
 



[06/52] [partial] hbase-site git commit: Published site at b21c56e7958652ca6e6daf04642eb51abaf2b3d7.

2016-06-06 Thread misty
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/devapidocs/org/apache/hadoop/hbase/filter/class-use/Filter.html
--
diff --git a/devapidocs/org/apache/hadoop/hbase/filter/class-use/Filter.html 
b/devapidocs/org/apache/hadoop/hbase/filter/class-use/Filter.html
index 5556625..0269155 100644
--- a/devapidocs/org/apache/hadoop/hbase/filter/class-use/Filter.html
+++ b/devapidocs/org/apache/hadoop/hbase/filter/class-use/Filter.html
@@ -158,11 +158,11 @@ Input/OutputFormats, a table indexing MapReduce job, and 
utility methods.
 
 
 Filter
-Scan.getFilter()
+Query.getFilter()
 
 
 Filter
-Query.getFilter()
+Scan.getFilter()
 
 
 
@@ -174,15 +174,15 @@ Input/OutputFormats, a table indexing MapReduce job, and 
utility methods.
 
 
 
-Scan
-Scan.setFilter(Filterfilter)
-
-
 Query
 Query.setFilter(Filterfilter)
 Apply the specified server-side filter when performing the 
Query.
 
 
+
+Scan
+Scan.setFilter(Filterfilter)
+
 
 Get
 Get.setFilter(Filterfilter)
@@ -413,16 +413,16 @@ Input/OutputFormats, a table indexing MapReduce job, and 
utility methods.
 
 
 
-(package private) Filter
-FilterWrapper.filter
+private Filter
+SkipFilter.filter
 
 
 private Filter
 WhileMatchFilter.filter
 
 
-private Filter
-SkipFilter.filter
+(package private) Filter
+FilterWrapper.filter
 
 
 private Filter
@@ -452,11 +452,11 @@ Input/OutputFormats, a table indexing MapReduce job, and 
utility methods.
 
 
 static Filter
-SingleColumnValueFilter.createFilterFromArguments(http://docs.oracle.com/javase/7/docs/api/java/util/ArrayList.html?is-external=true;
 title="class or interface in 
java.util">ArrayListbyte[]filterArguments)
+MultipleColumnPrefixFilter.createFilterFromArguments(http://docs.oracle.com/javase/7/docs/api/java/util/ArrayList.html?is-external=true;
 title="class or interface in 
java.util">ArrayListbyte[]filterArguments)
 
 
 static Filter
-QualifierFilter.createFilterFromArguments(http://docs.oracle.com/javase/7/docs/api/java/util/ArrayList.html?is-external=true;
 title="class or interface in 
java.util">ArrayListbyte[]filterArguments)
+KeyOnlyFilter.createFilterFromArguments(http://docs.oracle.com/javase/7/docs/api/java/util/ArrayList.html?is-external=true;
 title="class or interface in 
java.util">ArrayListbyte[]filterArguments)
 
 
 static Filter
@@ -464,27 +464,27 @@ Input/OutputFormats, a table indexing MapReduce job, and 
utility methods.
 
 
 static Filter
-ColumnRangeFilter.createFilterFromArguments(http://docs.oracle.com/javase/7/docs/api/java/util/ArrayList.html?is-external=true;
 title="class or interface in 
java.util">ArrayListbyte[]filterArguments)
+ColumnPaginationFilter.createFilterFromArguments(http://docs.oracle.com/javase/7/docs/api/java/util/ArrayList.html?is-external=true;
 title="class or interface in 
java.util">ArrayListbyte[]filterArguments)
 
 
 static Filter
-ColumnCountGetFilter.createFilterFromArguments(http://docs.oracle.com/javase/7/docs/api/java/util/ArrayList.html?is-external=true;
 title="class or interface in 
java.util">ArrayListbyte[]filterArguments)
+SingleColumnValueFilter.createFilterFromArguments(http://docs.oracle.com/javase/7/docs/api/java/util/ArrayList.html?is-external=true;
 title="class or interface in 
java.util">ArrayListbyte[]filterArguments)
 
 
 static Filter
-DependentColumnFilter.createFilterFromArguments(http://docs.oracle.com/javase/7/docs/api/java/util/ArrayList.html?is-external=true;
 title="class or interface in 
java.util">ArrayListbyte[]filterArguments)
+QualifierFilter.createFilterFromArguments(http://docs.oracle.com/javase/7/docs/api/java/util/ArrayList.html?is-external=true;
 title="class or interface in 
java.util">ArrayListbyte[]filterArguments)
 
 
 static Filter
-ColumnPrefixFilter.createFilterFromArguments(http://docs.oracle.com/javase/7/docs/api/java/util/ArrayList.html?is-external=true;
 title="class or interface in 
java.util">ArrayListbyte[]filterArguments)
+ColumnCountGetFilter.createFilterFromArguments(http://docs.oracle.com/javase/7/docs/api/java/util/ArrayList.html?is-external=true;
 title="class or interface in 
java.util">ArrayListbyte[]filterArguments)
 
 
 static Filter
-ValueFilter.createFilterFromArguments(http://docs.oracle.com/javase/7/docs/api/java/util/ArrayList.html?is-external=true;
 title="class or interface in 
java.util">ArrayListbyte[]filterArguments)
+ColumnRangeFilter.createFilterFromArguments(http://docs.oracle.com/javase/7/docs/api/java/util/ArrayList.html?is-external=true;
 title="class or interface in 
java.util">ArrayListbyte[]filterArguments)
 
 
 static Filter
-RowFilter.createFilterFromArguments(http://docs.oracle.com/javase/7/docs/api/java/util/ArrayList.html?is-external=true;
 title="class or interface in 
java.util">ArrayListbyte[]filterArguments)
+ValueFilter.createFilterFromArguments(http://docs.oracle.com/javase/7/docs/api/java/util/ArrayList.html?is-external=true;
 title="class or interface in 

[52/52] hbase-site git commit: Empty commit

2016-06-06 Thread misty
Empty commit


Project: http://git-wip-us.apache.org/repos/asf/hbase-site/repo
Commit: http://git-wip-us.apache.org/repos/asf/hbase-site/commit/120c5b17
Tree: http://git-wip-us.apache.org/repos/asf/hbase-site/tree/120c5b17
Diff: http://git-wip-us.apache.org/repos/asf/hbase-site/diff/120c5b17

Branch: refs/heads/asf-site
Commit: 120c5b17074176022c27930a4656e2a4cc108dcb
Parents: d434d86
Author: Misty Stanley-Jones 
Authored: Mon Jun 6 11:26:02 2016 -0700
Committer: Misty Stanley-Jones 
Committed: Mon Jun 6 11:26:02 2016 -0700

--

--




[31/52] [partial] hbase-site git commit: Published site at b21c56e7958652ca6e6daf04642eb51abaf2b3d7.

2016-06-06 Thread misty
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/devapidocs/org/apache/hadoop/hbase/class-use/HRegionLocation.html
--
diff --git a/devapidocs/org/apache/hadoop/hbase/class-use/HRegionLocation.html 
b/devapidocs/org/apache/hadoop/hbase/class-use/HRegionLocation.html
index e4d3b5b..7153b72 100644
--- a/devapidocs/org/apache/hadoop/hbase/class-use/HRegionLocation.html
+++ b/devapidocs/org/apache/hadoop/hbase/class-use/HRegionLocation.html
@@ -266,11 +266,11 @@ service.
 
 
 protected HRegionLocation
-RegionAdminServiceCallable.location
+AbstractRegionServerCallable.location
 
 
 protected HRegionLocation
-AbstractRegionServerCallable.location
+RegionAdminServiceCallable.location
 
 
 
@@ -310,43 +310,43 @@ service.
 
 
 HRegionLocation
-RegionLocator.getRegionLocation(byte[]row)
+HRegionLocator.getRegionLocation(byte[]row)
 Finds the region on which the given row is being 
served.
 
 
 
 HRegionLocation
-HRegionLocator.getRegionLocation(byte[]row)
+RegionLocator.getRegionLocation(byte[]row)
 Finds the region on which the given row is being 
served.
 
 
 
 HRegionLocation
-RegionLocator.getRegionLocation(byte[]row,
+HRegionLocator.getRegionLocation(byte[]row,
   booleanreload)
 Finds the region on which the given row is being 
served.
 
 
 
 HRegionLocation
-HRegionLocator.getRegionLocation(byte[]row,
+RegionLocator.getRegionLocation(byte[]row,
   booleanreload)
 Finds the region on which the given row is being 
served.
 
 
 
 HRegionLocation
-ClusterConnection.getRegionLocation(TableNametableName,
+ConnectionImplementation.getRegionLocation(TableNametableName,
   byte[]row,
-  booleanreload)
-Find region location hosting passed row
-
+  booleanreload)
 
 
 HRegionLocation
-ConnectionImplementation.getRegionLocation(TableNametableName,
+ClusterConnection.getRegionLocation(TableNametableName,
   byte[]row,
-  booleanreload)
+  booleanreload)
+Find region location hosting passed row
+
 
 
 private HRegionLocation
@@ -354,15 +354,20 @@ service.
 
 
 HRegionLocation
+ConnectionImplementation.locateRegion(byte[]regionName)
+
+
+HRegionLocation
 ClusterConnection.locateRegion(byte[]regionName)
 Gets the location of the region of regionName.
 
 
-
+
 HRegionLocation
-ConnectionImplementation.locateRegion(byte[]regionName)
+ConnectionImplementation.locateRegion(TableNametableName,
+byte[]row)
 
-
+
 HRegionLocation
 ClusterConnection.locateRegion(TableNametableName,
 byte[]row)
@@ -370,12 +375,12 @@ service.
  lives in.
 
 
-
+
 HRegionLocation
-ConnectionImplementation.locateRegion(TableNametableName,
-byte[]row)
+ConnectionImplementation.relocateRegion(TableNametableName,
+byte[]row)
 
-
+
 HRegionLocation
 ClusterConnection.relocateRegion(TableNametableName,
 byte[]row)
@@ -383,11 +388,6 @@ service.
  lives in, ignoring any value that might be in the cache.
 
 
-
-HRegionLocation
-ConnectionImplementation.relocateRegion(TableNametableName,
-byte[]row)
-
 
 
 
@@ -399,13 +399,13 @@ service.
 
 
 http://docs.oracle.com/javase/7/docs/api/java/util/List.html?is-external=true;
 title="class or interface in java.util">ListHRegionLocation
-RegionLocator.getAllRegionLocations()
-Retrieves all of the regions associated with this 
table.
-
+HRegionLocator.getAllRegionLocations()
 
 
 http://docs.oracle.com/javase/7/docs/api/java/util/List.html?is-external=true;
 title="class or interface in java.util">ListHRegionLocation
-HRegionLocator.getAllRegionLocations()
+RegionLocator.getAllRegionLocations()
+Retrieves all of the regions associated with this 
table.
+
 
 
 private Pairhttp://docs.oracle.com/javase/7/docs/api/java/util/List.html?is-external=true;
 title="class or interface in java.util">Listbyte[],http://docs.oracle.com/javase/7/docs/api/java/util/List.html?is-external=true;
 title="class or interface in java.util">ListHRegionLocation
@@ -428,15 +428,21 @@ service.
 
 
 http://docs.oracle.com/javase/7/docs/api/java/util/List.html?is-external=true;
 title="class or interface in java.util">ListHRegionLocation
+ConnectionImplementation.locateRegions(TableNametableName)
+
+
+http://docs.oracle.com/javase/7/docs/api/java/util/List.html?is-external=true;
 title="class or interface in java.util">ListHRegionLocation
 ClusterConnection.locateRegions(TableNametableName)
 Gets the locations of all regions in the specified table, 
tableName.
 
 
-
+
 http://docs.oracle.com/javase/7/docs/api/java/util/List.html?is-external=true;
 title="class or interface in java.util">ListHRegionLocation

[47/52] [partial] hbase-site git commit: Published site at b21c56e7958652ca6e6daf04642eb51abaf2b3d7.

2016-06-06 Thread misty
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/apidocs/org/apache/hadoop/hbase/filter/class-use/Filter.html
--
diff --git a/apidocs/org/apache/hadoop/hbase/filter/class-use/Filter.html 
b/apidocs/org/apache/hadoop/hbase/filter/class-use/Filter.html
index e2db83a..0e632c9 100644
--- a/apidocs/org/apache/hadoop/hbase/filter/class-use/Filter.html
+++ b/apidocs/org/apache/hadoop/hbase/filter/class-use/Filter.html
@@ -132,11 +132,11 @@ Input/OutputFormats, a table indexing MapReduce job, and 
utility methods.
 
 
 Filter
-Query.getFilter()
+Scan.getFilter()
 
 
 Filter
-Scan.getFilter()
+Query.getFilter()
 
 
 
@@ -148,15 +148,15 @@ Input/OutputFormats, a table indexing MapReduce job, and 
utility methods.
 
 
 
+Scan
+Scan.setFilter(Filterfilter)
+
+
 Query
 Query.setFilter(Filterfilter)
 Apply the specified server-side filter when performing the 
Query.
 
 
-
-Scan
-Scan.setFilter(Filterfilter)
-
 
 Get
 Get.setFilter(Filterfilter)
@@ -382,83 +382,83 @@ Input/OutputFormats, a table indexing MapReduce job, and 
utility methods.
 
 
 static Filter
-PrefixFilter.createFilterFromArguments(http://docs.oracle.com/javase/7/docs/api/java/util/ArrayList.html?is-external=true;
 title="class or interface in 
java.util">ArrayListbyte[]filterArguments)
+SingleColumnValueExcludeFilter.createFilterFromArguments(http://docs.oracle.com/javase/7/docs/api/java/util/ArrayList.html?is-external=true;
 title="class or interface in 
java.util">ArrayListbyte[]filterArguments)
 
 
 static Filter
-FirstKeyOnlyFilter.createFilterFromArguments(http://docs.oracle.com/javase/7/docs/api/java/util/ArrayList.html?is-external=true;
 title="class or interface in 
java.util">ArrayListbyte[]filterArguments)
+MultipleColumnPrefixFilter.createFilterFromArguments(http://docs.oracle.com/javase/7/docs/api/java/util/ArrayList.html?is-external=true;
 title="class or interface in 
java.util">ArrayListbyte[]filterArguments)
 
 
 static Filter
-SingleColumnValueFilter.createFilterFromArguments(http://docs.oracle.com/javase/7/docs/api/java/util/ArrayList.html?is-external=true;
 title="class or interface in 
java.util">ArrayListbyte[]filterArguments)
+FirstKeyOnlyFilter.createFilterFromArguments(http://docs.oracle.com/javase/7/docs/api/java/util/ArrayList.html?is-external=true;
 title="class or interface in 
java.util">ArrayListbyte[]filterArguments)
 
 
 static Filter
-SingleColumnValueExcludeFilter.createFilterFromArguments(http://docs.oracle.com/javase/7/docs/api/java/util/ArrayList.html?is-external=true;
 title="class or interface in 
java.util">ArrayListbyte[]filterArguments)
+TimestampsFilter.createFilterFromArguments(http://docs.oracle.com/javase/7/docs/api/java/util/ArrayList.html?is-external=true;
 title="class or interface in 
java.util">ArrayListbyte[]filterArguments)
 
 
 static Filter
-PageFilter.createFilterFromArguments(http://docs.oracle.com/javase/7/docs/api/java/util/ArrayList.html?is-external=true;
 title="class or interface in 
java.util">ArrayListbyte[]filterArguments)
+InclusiveStopFilter.createFilterFromArguments(http://docs.oracle.com/javase/7/docs/api/java/util/ArrayList.html?is-external=true;
 title="class or interface in 
java.util">ArrayListbyte[]filterArguments)
 
 
 static Filter
-ColumnPaginationFilter.createFilterFromArguments(http://docs.oracle.com/javase/7/docs/api/java/util/ArrayList.html?is-external=true;
 title="class or interface in 
java.util">ArrayListbyte[]filterArguments)
+ColumnCountGetFilter.createFilterFromArguments(http://docs.oracle.com/javase/7/docs/api/java/util/ArrayList.html?is-external=true;
 title="class or interface in 
java.util">ArrayListbyte[]filterArguments)
 
 
 static Filter
-KeyOnlyFilter.createFilterFromArguments(http://docs.oracle.com/javase/7/docs/api/java/util/ArrayList.html?is-external=true;
 title="class or interface in 
java.util">ArrayListbyte[]filterArguments)
+ColumnPrefixFilter.createFilterFromArguments(http://docs.oracle.com/javase/7/docs/api/java/util/ArrayList.html?is-external=true;
 title="class or interface in 
java.util">ArrayListbyte[]filterArguments)
 
 
 static Filter
-FamilyFilter.createFilterFromArguments(http://docs.oracle.com/javase/7/docs/api/java/util/ArrayList.html?is-external=true;
 title="class or interface in 
java.util">ArrayListbyte[]filterArguments)
+ValueFilter.createFilterFromArguments(http://docs.oracle.com/javase/7/docs/api/java/util/ArrayList.html?is-external=true;
 title="class or interface in 
java.util">ArrayListbyte[]filterArguments)
 
 
 static Filter
-ColumnCountGetFilter.createFilterFromArguments(http://docs.oracle.com/javase/7/docs/api/java/util/ArrayList.html?is-external=true;
 title="class or interface in 
java.util">ArrayListbyte[]filterArguments)
+PageFilter.createFilterFromArguments(http://docs.oracle.com/javase/7/docs/api/java/util/ArrayList.html?is-external=true;
 title="class or interface in 
java.util">ArrayListbyte[]filterArguments)
 
 
 static Filter

[34/52] [partial] hbase-site git commit: Published site at b21c56e7958652ca6e6daf04642eb51abaf2b3d7.

2016-06-06 Thread misty
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/devapidocs/org/apache/hadoop/hbase/class-use/HColumnDescriptor.html
--
diff --git 
a/devapidocs/org/apache/hadoop/hbase/class-use/HColumnDescriptor.html 
b/devapidocs/org/apache/hadoop/hbase/class-use/HColumnDescriptor.html
index 2d91178..e1b5330 100644
--- a/devapidocs/org/apache/hadoop/hbase/class-use/HColumnDescriptor.html
+++ b/devapidocs/org/apache/hadoop/hbase/class-use/HColumnDescriptor.html
@@ -129,33 +129,39 @@ Input/OutputFormats, a table indexing MapReduce job, and 
utility methods.
 
 
 
+org.apache.hadoop.hbase.replication
+
+Multi Cluster Replication
+
+
+
 org.apache.hadoop.hbase.rsgroup
 
 
-
+
 org.apache.hadoop.hbase.security
 
 
-
+
 org.apache.hadoop.hbase.security.access
 
 
-
+
 org.apache.hadoop.hbase.security.visibility
 
 
-
+
 org.apache.hadoop.hbase.snapshot
 
 
-
+
 org.apache.hadoop.hbase.thrift
 
 Provides an HBase http://incubator.apache.org/thrift/;>Thrift
 service.
 
 
-
+
 org.apache.hadoop.hbase.tool
 
 
@@ -563,25 +569,25 @@ service.
 
 
 void
-BaseMasterObserver.postAddColumn(ObserverContextMasterCoprocessorEnvironmentctx,
+MasterObserver.postAddColumn(ObserverContextMasterCoprocessorEnvironmentctx,
   TableNametableName,
   HColumnDescriptorcolumnFamily)
 Deprecated.
 As of release 2.0.0, this will be removed in HBase 3.0.0
  (https://issues.apache.org/jira/browse/HBASE-13645;>HBASE-13645).
- Use BaseMasterObserver.postAddColumnFamily(ObserverContext,
 TableName, HColumnDescriptor).
+ Use MasterObserver.postAddColumnFamily(ObserverContext,
 TableName, HColumnDescriptor).
 
 
 
 
 void
-MasterObserver.postAddColumn(ObserverContextMasterCoprocessorEnvironmentctx,
+BaseMasterObserver.postAddColumn(ObserverContextMasterCoprocessorEnvironmentctx,
   TableNametableName,
   HColumnDescriptorcolumnFamily)
 Deprecated.
 As of release 2.0.0, this will be removed in HBase 3.0.0
  (https://issues.apache.org/jira/browse/HBASE-13645;>HBASE-13645).
- Use MasterObserver.postAddColumnFamily(ObserverContext,
 TableName, HColumnDescriptor).
+ Use BaseMasterObserver.postAddColumnFamily(ObserverContext,
 TableName, HColumnDescriptor).
 
 
 
@@ -593,18 +599,18 @@ service.
 
 
 void
-BaseMasterObserver.postAddColumnFamily(ObserverContextMasterCoprocessorEnvironmentctx,
-  TableNametableName,
-  HColumnDescriptorcolumnFamily)
-
-
-void
 MasterObserver.postAddColumnFamily(ObserverContextMasterCoprocessorEnvironmentctx,
   TableNametableName,
   HColumnDescriptorcolumnFamily)
 Called after the new column family has been created.
 
 
+
+void
+BaseMasterObserver.postAddColumnFamily(ObserverContextMasterCoprocessorEnvironmentctx,
+  TableNametableName,
+  HColumnDescriptorcolumnFamily)
+
 
 void
 BaseMasterAndRegionObserver.postAddColumnHandler(ObserverContextMasterCoprocessorEnvironmentctx,
@@ -615,25 +621,25 @@ service.
 
 
 void
-BaseMasterObserver.postAddColumnHandler(ObserverContextMasterCoprocessorEnvironmentctx,
+MasterObserver.postAddColumnHandler(ObserverContextMasterCoprocessorEnvironmentctx,
 TableNametableName,
 HColumnDescriptorcolumnFamily)
 Deprecated.
 As of release 2.0.0, this will be removed in HBase 3.0.0
  (https://issues.apache.org/jira/browse/HBASE-13645;>HBASE-13645). Use
- BaseMasterObserver.postCompletedAddColumnFamilyAction(ObserverContext,
 TableName, HColumnDescriptor).
+ MasterObserver.postCompletedAddColumnFamilyAction(ObserverContext,
 TableName, HColumnDescriptor).
 
 
 
 
 void
-MasterObserver.postAddColumnHandler(ObserverContextMasterCoprocessorEnvironmentctx,
+BaseMasterObserver.postAddColumnHandler(ObserverContextMasterCoprocessorEnvironmentctx,
 TableNametableName,
 HColumnDescriptorcolumnFamily)
 Deprecated.
 As of release 2.0.0, this will be removed in HBase 3.0.0
  (https://issues.apache.org/jira/browse/HBASE-13645;>HBASE-13645). Use
- MasterObserver.postCompletedAddColumnFamilyAction(ObserverContext,
 TableName, HColumnDescriptor).
+ BaseMasterObserver.postCompletedAddColumnFamilyAction(ObserverContext,
 TableName, HColumnDescriptor).
 
 
 
@@ -645,18 +651,18 @@ service.
 
 
 void
-BaseMasterObserver.postCompletedAddColumnFamilyAction(ObserverContextMasterCoprocessorEnvironmentctx,
-TableNametableName,
-HColumnDescriptorcolumnFamily)
-

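The deprecation notes above replace `BaseMasterObserver.postAddColumnFamily(...)` with `MasterObserver.postAddColumnFamily(...)`: once an observer interface carries default no-op hooks, the adapter base class is redundant. A minimal stdlib-only sketch of that pattern — `TableEvents`, `LoggingObserver`, and the method names here are illustrative stand-ins, not the real HBase coprocessor API:

```java
import java.util.ArrayList;
import java.util.List;

// Default no-op hooks on the interface mean implementors override only
// what they need, with no BaseXxxObserver adapter class required.
interface TableEvents {
    default void postAddColumnFamily(String table, String family) {}
    default void postDeleteColumnFamily(String table, String family) {}
}

public class ObserverSketch {
    static final List<String> LOG = new ArrayList<>();

    public static void main(String[] args) {
        TableEvents observer = new TableEvents() {
            @Override
            public void postAddColumnFamily(String table, String family) {
                LOG.add("added " + family + " to " + table);
            }
            // postDeleteColumnFamily inherits the default no-op
        };
        observer.postAddColumnFamily("t1", "cf1");
        observer.postDeleteColumnFamily("t1", "cf1"); // silently ignored
        System.out.println(LOG);
    }
}
```

The same shape is why the `BaseMasterObserver` overloads in the table above are deprecated: implementing the interface directly is enough.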
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/checkstyle-aggregate.html
--
diff --git a/checkstyle-aggregate.html b/checkstyle-aggregate.html
index a91cd7a..8358551 100644
--- a/checkstyle-aggregate.html
+++ b/checkstyle-aggregate.html
@@ -7,7 +7,7 @@
   
 
 
-
+
 
 Apache HBase  Checkstyle Results
 
@@ -280,10 +280,10 @@
 Warnings
 Errors
 
-1771
+1773
 0
 0
-11536
+11543
 
 Files
 
@@ -298,5595 +298,5605 @@
 0
 1
 
+maven-archiver/pom.properties
+0
+0
+1
+
 org/apache/hadoop/hbase/AuthUtil.java
 0
 0
 2
-
+
 org/apache/hadoop/hbase/BaseConfigurable.java
 0
 0
 1
-
+
 org/apache/hadoop/hbase/ByteBufferedKeyOnlyKeyValue.java
 0
 0
 3
-
+
 org/apache/hadoop/hbase/Cell.java
 0
 0
 1
-
+
 org/apache/hadoop/hbase/CellComparator.java
 0
 0
 30
-
+
 org/apache/hadoop/hbase/CellScanner.java
 0
 0
 1
-
+
 org/apache/hadoop/hbase/CellUtil.java
 0
 0
 94
-
+
 org/apache/hadoop/hbase/ChoreService.java
 0
 0
 5
-
+
 org/apache/hadoop/hbase/ClusterId.java
 0
 0
 2
-
+
 org/apache/hadoop/hbase/ClusterStatus.java
 0
 0
 13
-
+
 org/apache/hadoop/hbase/CompatibilityFactory.java
 0
 0
 1
-
+
 org/apache/hadoop/hbase/CompatibilitySingletonFactory.java
 0
 0
 1
-
+
 org/apache/hadoop/hbase/CompoundConfiguration.java
 0
 0
 3
-
+
 org/apache/hadoop/hbase/CoordinatedStateManagerFactory.java
 0
 0
 1
-
+
 org/apache/hadoop/hbase/CoprocessorEnvironment.java
 0
 0
 2
-
+
 org/apache/hadoop/hbase/DoNotRetryIOException.java
 0
 0
 3
-
+
 org/apache/hadoop/hbase/DroppedSnapshotException.java
 0
 0
 1
-
+
 org/apache/hadoop/hbase/HBaseConfiguration.java
 0
 0
 6
-
+
 org/apache/hadoop/hbase/HBaseIOException.java
 0
 0
 1
-
+
 org/apache/hadoop/hbase/HColumnDescriptor.java
 0
 0
 27
-
+
 org/apache/hadoop/hbase/HRegionInfo.java
 0
 0
 58
-
+
 org/apache/hadoop/hbase/HRegionLocation.java
 0
 0
 1
-
+
 org/apache/hadoop/hbase/HTableDescriptor.java
 0
 0
 47
-
+
 org/apache/hadoop/hbase/HealthChecker.java
 0
 0
 17
-
+
 org/apache/hadoop/hbase/JMXListener.java
 0
 0
 3
-
+
 org/apache/hadoop/hbase/JitterScheduledThreadPoolExecutorImpl.java
 0
 0
 1
-
+
 org/apache/hadoop/hbase/KeyValue.java
 0
 0
 135
-
+
 org/apache/hadoop/hbase/KeyValueTestUtil.java
 0
 0
 9
-
+
 org/apache/hadoop/hbase/KeyValueUtil.java
 0
 0
 30
-
+
 org/apache/hadoop/hbase/LocalHBaseCluster.java
 0
 0
 23
-
+
 org/apache/hadoop/hbase/MetaMutationAnnotation.java
 0
 0
 1
-
+
 org/apache/hadoop/hbase/MetaTableAccessor.java
 0
 0
 112
-
+
 org/apache/hadoop/hbase/NamespaceDescriptor.java
 0
 0
 3
-
+
 org/apache/hadoop/hbase/NotAllMetaRegionsOnlineException.java
 0
 0
 1
-
+
 org/apache/hadoop/hbase/ProcedureUtil.java
 0
 0
 4
-
+
 org/apache/hadoop/hbase/RegionLoad.java
 0
 0
 1
-
+
 org/apache/hadoop/hbase/RegionLocations.java
 0
 0
 10
-
+
 org/apache/hadoop/hbase/RegionStateListener.java
 0
 0
 2
-
+
 org/apache/hadoop/hbase/ScheduledChore.java
 0
 0
 6
-
+
 org/apache/hadoop/hbase/ServerLoad.java
 0
 0
 1
-
+
 org/apache/hadoop/hbase/ServerName.java
 0
 0
 34
-
+
 org/apache/hadoop/hbase/SettableSequenceId.java
 0
 0
 1
-
+
 org/apache/hadoop/hbase/SettableTimestamp.java
 0
 0
 1
-
+
 org/apache/hadoop/hbase/SplitLogCounters.java
 0
 0
 1
-
+
 org/apache/hadoop/hbase/SplitLogTask.java
 0
 0
 2
-
+
 org/apache/hadoop/hbase/Streamable.java
 0
 0
 2
-
+
 org/apache/hadoop/hbase/TableDescriptor.java
 0
 0
 4
-
+
 org/apache/hadoop/hbase/TableDescriptors.java
 0
 0
 11
-
+
 org/apache/hadoop/hbase/TableInfoMissingException.java
 0
 0
 6
-
+
 org/apache/hadoop/hbase/TableName.java
 0
 0
 20
-
+
 org/apache/hadoop/hbase/TagType.java
 0
 0
 1
-
+
 org/apache/hadoop/hbase/ZKNamespaceManager.java
 0
 0
 2
-
+
 org/apache/hadoop/hbase/ZNodeClearer.java
 0
 0
 3
-
+
 org/apache/hadoop/hbase/backup/HFileArchiver.java
 0
 0
 17
-
+
 org/apache/hadoop/hbase/backup/example/HFileArchiveManager.java
 0
 0
 2
-
+
 org/apache/hadoop/hbase/backup/example/LongTermArchivingHFileCleaner.java
 0
 0
 5
-
+
 org/apache/hadoop/hbase/backup/example/TableHFileArchiveTracker.java
 0
 0
 6
-
+
 org/apache/hadoop/hbase/backup/example/ZKTableArchiveClient.java
 0
 0
 2
-
+
 org/apache/hadoop/hbase/classification/tools/ExcludePrivateAnnotationsStandardDoclet.java
 0
 0
 1
-
+
 org/apache/hadoop/hbase/classification/tools/IncludePublicAnnotationsStandardDoclet.java
 0
 0
 1
-
+
 org/apache/hadoop/hbase/classification/tools/StabilityOptions.java
 0
 0
 3
-
+
 org/apache/hadoop/hbase/client/AbstractClientScanner.java
 0
 0
 2
-
+
 org/apache/hadoop/hbase/client/Action.java
 0
 0
 2
-
+
 org/apache/hadoop/hbase/client/Admin.java
 0
 0
 57
-
+
 org/apache/hadoop/hbase/client/Append.java
 0
 0
 4
-
+
 org/apache/hadoop/hbase/client/AsyncProcess.java
 0
 0
 18
-
+
 org/apache/hadoop/hbase/client/BufferedMutator.java
 0
 0
 1
-
+
 org/apache/hadoop/hbase/client/BufferedMutatorImpl.java
 0
 0
 2
-
+
 org/apache/hadoop/hbase/client/ClientAsyncPrefetchScanner.java
 0
 0
 2
-
+
 

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/devapidocs/org/apache/hadoop/hbase/class-use/HDFSBlocksDistribution.html
--
diff --git 
a/devapidocs/org/apache/hadoop/hbase/class-use/HDFSBlocksDistribution.html 
b/devapidocs/org/apache/hadoop/hbase/class-use/HDFSBlocksDistribution.html
index 3df7b03..3eb3be6 100644
--- a/devapidocs/org/apache/hadoop/hbase/class-use/HDFSBlocksDistribution.html
+++ b/devapidocs/org/apache/hadoop/hbase/class-use/HDFSBlocksDistribution.html
@@ -270,11 +270,11 @@ Input/OutputFormats, a table indexing MapReduce job, and 
utility methods.
 
 
 HDFSBlocksDistribution
-Region.getHDFSBlocksDistribution()
+HRegion.getHDFSBlocksDistribution()
 
 
 HDFSBlocksDistribution
-HRegion.getHDFSBlocksDistribution()
+Region.getHDFSBlocksDistribution()
 
 
 



http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/devapidocs/org/apache/hadoop/hbase/class-use/TableNotDisabledException.html
--
diff --git 
a/devapidocs/org/apache/hadoop/hbase/class-use/TableNotDisabledException.html 
b/devapidocs/org/apache/hadoop/hbase/class-use/TableNotDisabledException.html
index 65d17de..74fe4e6 100644
--- 
a/devapidocs/org/apache/hadoop/hbase/class-use/TableNotDisabledException.html
+++ 
b/devapidocs/org/apache/hadoop/hbase/class-use/TableNotDisabledException.html
@@ -100,14 +100,14 @@
 
 
 void
-HMaster.checkTableModifiable(TableNametableName)
-
-
-void
 MasterServices.checkTableModifiable(TableNametableName)
 Check table is modifiable; i.e.
 
 
+
+void
+HMaster.checkTableModifiable(TableNametableName)
+
 
 
 

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/devapidocs/org/apache/hadoop/hbase/class-use/TableNotFoundException.html
--
diff --git 
a/devapidocs/org/apache/hadoop/hbase/class-use/TableNotFoundException.html 
b/devapidocs/org/apache/hadoop/hbase/class-use/TableNotFoundException.html
index 0523bfe..74ccea7 100644
--- a/devapidocs/org/apache/hadoop/hbase/class-use/TableNotFoundException.html
+++ b/devapidocs/org/apache/hadoop/hbase/class-use/TableNotFoundException.html
@@ -163,14 +163,14 @@ Input/OutputFormats, a table indexing MapReduce job, and 
utility methods.
 
 
 void
-HMaster.checkTableModifiable(TableNametableName)
-
-
-void
 MasterServices.checkTableModifiable(TableNametableName)
 Check table is modifiable; i.e.
 
 
+
+void
+HMaster.checkTableModifiable(TableNametableName)
+
 
 
 

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/devapidocs/org/apache/hadoop/hbase/class-use/ZooKeeperConnectionException.html
--
diff --git 
a/devapidocs/org/apache/hadoop/hbase/class-use/ZooKeeperConnectionException.html
 
b/devapidocs/org/apache/hadoop/hbase/class-use/ZooKeeperConnectionException.html
index 3a34085..f485a70 100644
--- 
a/devapidocs/org/apache/hadoop/hbase/class-use/ZooKeeperConnectionException.html
+++ 
b/devapidocs/org/apache/hadoop/hbase/class-use/ZooKeeperConnectionException.html
@@ -160,7 +160,7 @@
 
 
 boolean
-ClusterConnection.isMasterRunning()
+ConnectionImplementation.isMasterRunning()
 Deprecated.
 this has been deprecated without a replacement
 
@@ -168,7 +168,7 @@
 
 
 boolean
-ConnectionImplementation.isMasterRunning()
+ClusterConnection.isMasterRunning()
 Deprecated.
 this has been deprecated without a replacement
 

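The ZooKeeperConnectionException class-use page above is the one touched by HBASE-15803: when a constructor opens a connection and a later setup step throws, the already-opened handle must be closed or it leaks. A sketch of that close-on-failure pattern with stdlib types only — `Connection`, `Watcher`, and `createBaseZNodes` are illustrative names, not the real `ZooKeeperWatcher` code:

```java
public class CloseOnFailureSketch {
    static class Connection implements AutoCloseable {
        static int openCount = 0;          // tracks unclosed handles
        private boolean closed = false;
        Connection() { openCount++; }
        @Override public void close() {
            if (!closed) { closed = true; openCount--; }
        }
    }

    static class Watcher {
        final Connection conn;

        Watcher(boolean failSetup) {
            conn = new Connection();        // resource acquired first
            try {
                createBaseZNodes(failSetup); // may throw after acquisition
            } catch (IllegalStateException e) {
                try {
                    conn.close();           // don't leak the open handle
                } catch (RuntimeException closeFailure) {
                    // log and propagate the original failure, not this one
                }
                throw e;                    // rethrow the setup failure
            }
        }

        private void createBaseZNodes(boolean fail) {
            if (fail) throw new IllegalStateException("znode setup failed");
        }
    }

    public static void main(String[] args) {
        try {
            new Watcher(true);
        } catch (IllegalStateException expected) {
            // constructor failed, but the connection was closed on the way out
        }
        System.out.println("leaks after failed ctor: " + Connection.openCount);
    }
}
```

Without the inner try/catch around the cleanup, a failing `createBaseZNodes` would leave `Connection.openCount` at 1 — which is exactly the leak the commit in this thread fixes.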
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/devapidocs/org/apache/hadoop/hbase/classification/class-use/InterfaceAudience.Private.html
--
diff --git 
a/devapidocs/org/apache/hadoop/hbase/classification/class-use/InterfaceAudience.Private.html
 
b/devapidocs/org/apache/hadoop/hbase/classification/class-use/InterfaceAudience.Private.html
index 8534b8f..fb25235 100644
--- 
a/devapidocs/org/apache/hadoop/hbase/classification/class-use/InterfaceAudience.Private.html
+++ 
b/devapidocs/org/apache/hadoop/hbase/classification/class-use/InterfaceAudience.Private.html
@@ -1008,33 +1008,37 @@ service.
 
 
 
+org.apache.hadoop.hbase.protobuf.generated.ClusterStatusProtos.ServerLoad
+ServerLoad.obtainServerLoadPB()
+
+
 void
 ChoreService.onChoreMissedStartTime(ScheduledChorechore)
 
-
+
 void
 ProcedureInfo.setClientAckTime(longtimestamp)
 
-
+
 static void
 CellUtil.setSequenceId(Cellcell,
   longseqId)
 Sets the given seqId to the cell.
 
 
-
+
 static http://docs.oracle.com/javase/7/docs/api/java/lang/String.html?is-external=true;
 title="class or interface in java.lang">String
 AuthUtil.toGroupEntry(http://docs.oracle.com/javase/7/docs/api/java/lang/String.html?is-external=true;
 title="class or interface in java.lang">Stringname)
 Returns the group entry with the group prefix for a group 
principal.
 
 
-
+
 http://docs.oracle.com/javase/7/docs/api/java/lang/String.html?is-external=true;
 title="class or interface in java.lang">String
 ScheduledChore.toString()
 A summation of this chore in human readable format.
 
 
-
+
 boolean
 ChoreService.triggerNow(ScheduledChorechore)
 
@@ -1069,10 +1073,16 @@ service.
   byte[]result)
 
 
+RegionLoad(org.apache.hadoop.hbase.protobuf.generated.ClusterStatusProtos.RegionLoadregionLoadPB)
+
+
 ScheduledChore()
 This constructor is for test only.
 
 
+
+ServerLoad(org.apache.hadoop.hbase.protobuf.generated.ClusterStatusProtos.ServerLoadserverLoad)
+
 
 
 
@@ -6426,19 +6436,28 @@ service.
 
 
 
+class
+ReplicationQueuesArguments
+
+
 interface
 ReplicationQueuesClient
 This provides an interface for clients of replication to 
view replication queues.
 
 
-
+
 class
 ReplicationQueuesClientZKImpl
 
+
+class

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/devapidocs/org/apache/hadoop/hbase/class-use/CellComparator.html
--
diff --git a/devapidocs/org/apache/hadoop/hbase/class-use/CellComparator.html 
b/devapidocs/org/apache/hadoop/hbase/class-use/CellComparator.html
index 640a146..1ef35ef 100644
--- a/devapidocs/org/apache/hadoop/hbase/class-use/CellComparator.html
+++ b/devapidocs/org/apache/hadoop/hbase/class-use/CellComparator.html
@@ -234,29 +234,29 @@
 
 
 int
+BufferedDataBlockEncoder.BufferedEncodedSeeker.compareKey(CellComparatorcomparator,
+Cellkey)
+
+
+int
 DataBlockEncoder.EncodedSeeker.compareKey(CellComparatorcomparator,
 Cellkey)
 Compare the given key against the current key
 
 
-
-int
-BufferedDataBlockEncoder.BufferedEncodedSeeker.compareKey(CellComparatorcomparator,
-Cellkey)
-
 
 DataBlockEncoder.EncodedSeeker
-CopyKeyDataBlockEncoder.createSeeker(CellComparatorcomparator,
+DiffKeyDeltaEncoder.createSeeker(CellComparatorcomparator,
 HFileBlockDecodingContextdecodingCtx)
 
 
 DataBlockEncoder.EncodedSeeker
-PrefixKeyDeltaEncoder.createSeeker(CellComparatorcomparator,
+CopyKeyDataBlockEncoder.createSeeker(CellComparatorcomparator,
 HFileBlockDecodingContextdecodingCtx)
 
 
 DataBlockEncoder.EncodedSeeker
-DiffKeyDeltaEncoder.createSeeker(CellComparatorcomparator,
+PrefixKeyDeltaEncoder.createSeeker(CellComparatorcomparator,
 HFileBlockDecodingContextdecodingCtx)
 
 
@@ -298,33 +298,33 @@
 
 
 
+protected CellComparator
+HFile.WriterFactory.comparator
+
+
 private CellComparator
 HFileReaderImpl.comparator
 Key comparator
 
 
-
+
 protected CellComparator
 HFileWriterImpl.comparator
 Key comparator.
 
 
-
+
 private CellComparator
 HFileBlockIndex.CellBasedKeyBlockIndexReader.comparator
 Needed doing lookup on blocks.
 
 
-
+
 protected CellComparator
 CompoundBloomFilterBase.comparator
 Comparator used to compare Bloom filter keys
 
 
-
-protected CellComparator
-HFile.WriterFactory.comparator
-
 
 
 
@@ -344,11 +344,11 @@
 
 
 CellComparator
-HFileReaderImpl.getComparator()
+HFile.Reader.getComparator()
 
 
 CellComparator
-HFile.Reader.getComparator()
+HFileReaderImpl.getComparator()
 
 
 
@@ -500,44 +500,44 @@
 
 
 private CellComparator
-HStore.comparator
+ScanInfo.comparator
 
 
-private CellComparator
-Segment.comparator
+protected CellComparator
+StripeStoreFlusher.StripeFlushRequest.comparator
 
 
 protected CellComparator
-StripeStoreFlusher.StripeFlushRequest.comparator
+HRegion.RegionScannerImpl.comparator
 
 
 private CellComparator
-ScanInfo.comparator
+StoreFileWriter.Builder.comparator
 
 
 private CellComparator
-AbstractMemStore.comparator
+Segment.comparator
 
 
-protected CellComparator
-StripeMultiFileWriter.comparator
+private CellComparator
+HStore.comparator
 
 
 protected CellComparator
-HRegion.RegionScannerImpl.comparator
+StripeMultiFileWriter.comparator
 
 
 private CellComparator
-StoreFileWriter.Builder.comparator
+AbstractMemStore.comparator
 
 
-protected CellComparator
-KeyValueHeap.KVScannerComparator.kvComparator
-
-
 private CellComparator
 DefaultStoreFileManager.kvComparator
 
+
+protected CellComparator
+KeyValueHeap.KVScannerComparator.kvComparator
+
 
 private CellComparator
 ScanQueryMatcher.rowComparator
@@ -555,21 +555,21 @@
 
 
 CellComparator
-Region.getCellCompartor()
-The comparator to be used with the region
-
+HRegion.getCellCompartor()
 
 
 CellComparator
-HRegion.getCellCompartor()
+Region.getCellCompartor()
+The comparator to be used with the region
+
 
 
 CellComparator
-HStore.getComparator()
+ScanInfo.getComparator()
 
 
-CellComparator
-Store.getComparator()
+(package private) CellComparator
+StoreFileScanner.getComparator()
 
 
 protected CellComparator
@@ -578,20 +578,20 @@
 
 
 
-(package private) CellComparator
-StoreFileScanner.getComparator()
+CellComparator
+StoreFileReader.getComparator()
 
 
 CellComparator
-StoreFileReader.getComparator()
+HStore.getComparator()
 
 
 CellComparator
-KeyValueHeap.KVScannerComparator.getComparator()
+Store.getComparator()
 
 
 CellComparator
-ScanInfo.getComparator()
+KeyValueHeap.KVScannerComparator.getComparator()
 
 
 protected CellComparator
@@ -629,10 +629,12 @@
 
 
 
-protected void
-DateTieredStoreEngine.createComponents(org.apache.hadoop.conf.Configurationconf,
+protected abstract void
+StoreEngine.createComponents(org.apache.hadoop.conf.Configurationconf,
 Storestore,
-CellComparatorkvComparator)
+CellComparatorkvComparator)
+Create the StoreEngine's components.
+
 
 
 protected void
@@ -641,16 +643,14 @@
 CellComparatorcomparator)
 
 
-protected abstract void
-StoreEngine.createComponents(org.apache.hadoop.conf.Configurationconf,
+protected void

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/devapidocs/org/apache/hadoop/hbase/client/class-use/ResultScanner.html
--
diff --git 
a/devapidocs/org/apache/hadoop/hbase/client/class-use/ResultScanner.html 
b/devapidocs/org/apache/hadoop/hbase/client/class-use/ResultScanner.html
index c5131cd..9122e4f 100644
--- a/devapidocs/org/apache/hadoop/hbase/client/class-use/ResultScanner.html
+++ b/devapidocs/org/apache/hadoop/hbase/client/class-use/ResultScanner.html
@@ -212,14 +212,14 @@ service.
 
 
 ResultScanner
-Table.getScanner(byte[]family)
-Gets a scanner on the current table for the given 
family.
+HTable.getScanner(byte[]family)
+The underlying HTable must 
not be closed.
 
 
 
 ResultScanner
-HTable.getScanner(byte[]family)
-The underlying HTable must 
not be closed.
+Table.getScanner(byte[]family)
+Gets a scanner on the current table for the given 
family.
 
 
 
@@ -228,16 +228,16 @@ service.
 
 
 ResultScanner
-Table.getScanner(byte[]family,
+HTable.getScanner(byte[]family,
 byte[]qualifier)
-Gets a scanner on the current table for the given family 
and qualifier.
+The underlying HTable must 
not be closed.
 
 
 
 ResultScanner
-HTable.getScanner(byte[]family,
+Table.getScanner(byte[]family,
 byte[]qualifier)
-The underlying HTable must 
not be closed.
+Gets a scanner on the current table for the given family 
and qualifier.
 
 
 
@@ -247,15 +247,15 @@ service.
 
 
 ResultScanner
-Table.getScanner(Scanscan)
-Returns a scanner on the current table as specified by the 
Scan
- object.
+HTable.getScanner(Scanscan)
+The underlying HTable must 
not be closed.
 
 
 
 ResultScanner
-HTable.getScanner(Scanscan)
-The underlying HTable must 
not be closed.
+Table.getScanner(Scanscan)
+Returns a scanner on the current table as specified by the 
Scan
+ object.
 
 
 

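All of the `Table.getScanner(...)` / `HTable.getScanner(...)` overloads listed above return a `ResultScanner` that the caller must close. A try-with-resources sketch of that contract using stdlib-only stand-ins — `FakeTable` and `FakeScanner` are illustrative, not the real HBase client API:

```java
import java.util.Iterator;
import java.util.List;

public class ScannerSketch {
    static class FakeScanner implements AutoCloseable, Iterable<String> {
        static boolean closed = false;
        private final List<String> rows = List.of("row1", "row2");
        @Override public Iterator<String> iterator() { return rows.iterator(); }
        @Override public void close() { closed = true; }
    }

    static class FakeTable {
        // mirrors the shape of Table.getScanner(byte[] family)
        FakeScanner getScanner(String family) { return new FakeScanner(); }
    }

    public static void main(String[] args) {
        FakeTable table = new FakeTable();
        int count = 0;
        // try-with-resources guarantees the scanner is closed even if
        // iteration throws mid-scan
        try (FakeScanner scanner = table.getScanner("cf")) {
            for (String row : scanner) {
                count++;
            }
        }
        System.out.println("rows=" + count + " closed=" + FakeScanner.closed);
    }
}
```

The HTable variants add the caveat documented above ("the underlying HTable must not be closed") because the scanner borrows the table's connection for its lifetime.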
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/devapidocs/org/apache/hadoop/hbase/client/class-use/RetriesExhaustedWithDetailsException.html
--
diff --git 
a/devapidocs/org/apache/hadoop/hbase/client/class-use/RetriesExhaustedWithDetailsException.html
 
b/devapidocs/org/apache/hadoop/hbase/client/class-use/RetriesExhaustedWithDetailsException.html
index 67a31a6..fe4bc5a 100644
--- 
a/devapidocs/org/apache/hadoop/hbase/client/class-use/RetriesExhaustedWithDetailsException.html
+++ 
b/devapidocs/org/apache/hadoop/hbase/client/class-use/RetriesExhaustedWithDetailsException.html
@@ -110,7 +110,8 @@
 
 
 RetriesExhaustedWithDetailsException
-AsyncProcess.waitForAllPreviousOpsAndReset(http://docs.oracle.com/javase/7/docs/api/java/util/List.html?is-external=true;
 title="class or interface in java.util">ListRowfailedRows)
+AsyncProcess.waitForAllPreviousOpsAndReset(http://docs.oracle.com/javase/7/docs/api/java/util/List.html?is-external=true;
 title="class or interface in java.util">ListRowfailedRows,
+  http://docs.oracle.com/javase/7/docs/api/java/lang/String.html?is-external=true;
 title="class or interface in java.lang">StringtableName)
 Only used w/useGlobalErrors ctor argument, for HTable 
backward compat.
 
 

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/devapidocs/org/apache/hadoop/hbase/client/class-use/RetryingCallerInterceptorContext.html
--
diff --git 
a/devapidocs/org/apache/hadoop/hbase/client/class-use/RetryingCallerInterceptorContext.html
 
b/devapidocs/org/apache/hadoop/hbase/client/class-use/RetryingCallerInterceptorContext.html
index c68c269..e3c2748 100644
--- 
a/devapidocs/org/apache/hadoop/hbase/client/class-use/RetryingCallerInterceptorContext.html
+++ 
b/devapidocs/org/apache/hadoop/hbase/client/class-use/RetryingCallerInterceptorContext.html
@@ -131,15 +131,15 @@
 
 
 
-RetryingCallerInterceptorContext
-NoOpRetryableCallerInterceptor.createEmptyContext()
-
-
 abstract RetryingCallerInterceptorContext
 RetryingCallerInterceptor.createEmptyContext()
 This returns the context object for the current call.
 
 
+
+RetryingCallerInterceptorContext
+NoOpRetryableCallerInterceptor.createEmptyContext()
+
 
 RetryingCallerInterceptorContext
 PreemptiveFastFailInterceptor.createEmptyContext()
@@ -179,46 +179,46 @@
 
 
 
-void
-NoOpRetryableCallerInterceptor.handleFailure(RetryingCallerInterceptorContextcontext,
-  http://docs.oracle.com/javase/7/docs/api/java/lang/Throwable.html?is-external=true;
 title="class or interface in java.lang">Throwablet)
-
-
 abstract void
 RetryingCallerInterceptor.handleFailure(RetryingCallerInterceptorContextcontext,
   http://docs.oracle.com/javase/7/docs/api/java/lang/Throwable.html?is-external=true;
 title="class or interface in java.lang">Throwablet)
 Call this function in case we caught a failure during 
retries.
 
 

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/devapidocs/org/apache/hadoop/hbase/io/encoding/class-use/HFileBlockEncodingContext.html
--
diff --git 
a/devapidocs/org/apache/hadoop/hbase/io/encoding/class-use/HFileBlockEncodingContext.html
 
b/devapidocs/org/apache/hadoop/hbase/io/encoding/class-use/HFileBlockEncodingContext.html
index de71447..9626f4a 100644
--- 
a/devapidocs/org/apache/hadoop/hbase/io/encoding/class-use/HFileBlockEncodingContext.html
+++ 
b/devapidocs/org/apache/hadoop/hbase/io/encoding/class-use/HFileBlockEncodingContext.html
@@ -181,18 +181,18 @@
 
 
 HFileBlockEncodingContext
+BufferedDataBlockEncoder.newDataBlockEncodingContext(DataBlockEncodingencoding,
+  byte[]header,
+  HFileContextmeta)
+
+
+HFileBlockEncodingContext
 DataBlockEncoder.newDataBlockEncodingContext(DataBlockEncodingencoding,
   byte[]headerBytes,
   HFileContextmeta)
 Creates a encoder specific encoding context
 
 
-
-HFileBlockEncodingContext
-BufferedDataBlockEncoder.newDataBlockEncodingContext(DataBlockEncodingencoding,
-  byte[]header,
-  HFileContextmeta)
-
 
 
 
@@ -204,44 +204,44 @@
 
 
 int
-DataBlockEncoder.encode(Cellcell,
+BufferedDataBlockEncoder.encode(Cellcell,
 HFileBlockEncodingContextencodingCtx,
-http://docs.oracle.com/javase/7/docs/api/java/io/DataOutputStream.html?is-external=true;
 title="class or interface in java.io">DataOutputStreamout)
-Encodes a KeyValue.
-
+http://docs.oracle.com/javase/7/docs/api/java/io/DataOutputStream.html?is-external=true;
 title="class or interface in 
java.io">DataOutputStreamout)
 
 
 int
-BufferedDataBlockEncoder.encode(Cellcell,
+DataBlockEncoder.encode(Cellcell,
 HFileBlockEncodingContextencodingCtx,
-http://docs.oracle.com/javase/7/docs/api/java/io/DataOutputStream.html?is-external=true;
 title="class or interface in 
java.io">DataOutputStreamout)
+http://docs.oracle.com/javase/7/docs/api/java/io/DataOutputStream.html?is-external=true;
 title="class or interface in java.io">DataOutputStreamout)
+Encodes a KeyValue.
+
 
 
 void
+BufferedDataBlockEncoder.endBlockEncoding(HFileBlockEncodingContextencodingCtx,
+http://docs.oracle.com/javase/7/docs/api/java/io/DataOutputStream.html?is-external=true;
 title="class or interface in java.io">DataOutputStreamout,
+
byte[]uncompressedBytesWithHeader)
+
+
+void
 DataBlockEncoder.endBlockEncoding(HFileBlockEncodingContextencodingCtx,
 http://docs.oracle.com/javase/7/docs/api/java/io/DataOutputStream.html?is-external=true;
 title="class or interface in java.io">DataOutputStreamout,
 byte[]uncompressedBytesWithHeader)
 Ends encoding for a block of KeyValues.
 
 
-
+
 void
-BufferedDataBlockEncoder.endBlockEncoding(HFileBlockEncodingContextencodingCtx,
-http://docs.oracle.com/javase/7/docs/api/java/io/DataOutputStream.html?is-external=true;
 title="class or interface in java.io">DataOutputStreamout,
-
byte[]uncompressedBytesWithHeader)
+BufferedDataBlockEncoder.startBlockEncoding(HFileBlockEncodingContextblkEncodingCtx,
+http://docs.oracle.com/javase/7/docs/api/java/io/DataOutputStream.html?is-external=true;
 title="class or interface in 
java.io">DataOutputStreamout)
 
-
+
 void
 DataBlockEncoder.startBlockEncoding(HFileBlockEncodingContextencodingCtx,
 http://docs.oracle.com/javase/7/docs/api/java/io/DataOutputStream.html?is-external=true;
 title="class or interface in java.io">DataOutputStreamout)
 Starts encoding for a block of KeyValues.
 
 
-
-void
-BufferedDataBlockEncoder.startBlockEncoding(HFileBlockEncodingContextblkEncodingCtx,
-http://docs.oracle.com/javase/7/docs/api/java/io/DataOutputStream.html?is-external=true;
 title="class or interface in 
java.io">DataOutputStreamout)
-
 
 
 
@@ -271,21 +271,21 @@
 
 
 HFileBlockEncodingContext
-HFileDataBlockEncoderImpl.newDataBlockEncodingContext(byte[]dummyHeader,
-  HFileContextfileContext)
-
-
-HFileBlockEncodingContext
 NoOpDataBlockEncoder.newDataBlockEncodingContext(byte[]dummyHeader,
   HFileContextmeta)
 
-
+
 HFileBlockEncodingContext
 HFileDataBlockEncoder.newDataBlockEncodingContext(byte[]headerBytes,
   HFileContextfileContext)
 Create an encoder 

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/devapidocs/deprecated-list.html
--
diff --git a/devapidocs/deprecated-list.html b/devapidocs/deprecated-list.html
index 30efc7f..eca1560 100644
--- a/devapidocs/deprecated-list.html
+++ b/devapidocs/deprecated-list.html
@@ -523,10 +523,10 @@
 
 
 
-org.apache.hadoop.hbase.http.InfoServer.getPort()
+org.apache.hadoop.hbase.http.HttpServer.getPort()
 
 
-org.apache.hadoop.hbase.http.HttpServer.getPort()
+org.apache.hadoop.hbase.http.InfoServer.getPort()
 
 
 org.apache.hadoop.hbase.CellUtil.getQualifierBufferShallowCopy(Cell)
@@ -617,13 +617,13 @@
 
 
 
-org.apache.hadoop.hbase.client.HTableInterface.getWriteBufferSize()
-as of 1.0.0. Replaced by BufferedMutator.getWriteBufferSize()
+org.apache.hadoop.hbase.client.Table.getWriteBufferSize()
+as of 1.0.1 (should not have been in 1.0.0). Replaced by 
BufferedMutator.getWriteBufferSize()
 
 
 
-org.apache.hadoop.hbase.client.Table.getWriteBufferSize()
-as of 1.0.1 (should not have been in 1.0.0). Replaced by 
BufferedMutator.getWriteBufferSize()
+org.apache.hadoop.hbase.client.HTableInterface.getWriteBufferSize()
+as of 1.0.0. Replaced by BufferedMutator.getWriteBufferSize()
 
 
 
@@ -661,12 +661,12 @@
 org.apache.hadoop.hbase.master.cleaner.BaseLogCleanerDelegate.isLogDeletable(FileStatus)
 
 
-org.apache.hadoop.hbase.client.ClusterConnection.isMasterRunning()
+org.apache.hadoop.hbase.client.ConnectionImplementation.isMasterRunning()
 this has been deprecated without a replacement
 
 
 
-org.apache.hadoop.hbase.client.ConnectionImplementation.isMasterRunning()
+org.apache.hadoop.hbase.client.ClusterConnection.isMasterRunning()
 this has been deprecated without a replacement
 
 
@@ -710,34 +710,34 @@
 org.apache.hadoop.hbase.coprocessor.BaseMasterAndRegionObserver.postAddColumn(ObserverContext,
 TableName, HColumnDescriptor)
 
 
-org.apache.hadoop.hbase.coprocessor.BaseMasterObserver.postAddColumn(ObserverContext,
 TableName, HColumnDescriptor)
+org.apache.hadoop.hbase.coprocessor.MasterObserver.postAddColumn(ObserverContext,
 TableName, HColumnDescriptor)
 As of release 2.0.0, this will be removed in HBase 3.0.0
  (https://issues.apache.org/jira/browse/HBASE-13645;>HBASE-13645).
- Use BaseMasterObserver.postAddColumnFamily(ObserverContext,
 TableName, HColumnDescriptor).
+ Use MasterObserver.postAddColumnFamily(ObserverContext,
 TableName, HColumnDescriptor).
 
 
 
-org.apache.hadoop.hbase.coprocessor.MasterObserver.postAddColumn(ObserverContext,
 TableName, HColumnDescriptor)
+org.apache.hadoop.hbase.coprocessor.BaseMasterObserver.postAddColumn(ObserverContext,
 TableName, HColumnDescriptor)
 As of release 2.0.0, this will be removed in HBase 3.0.0
  (https://issues.apache.org/jira/browse/HBASE-13645;>HBASE-13645).
- Use MasterObserver.postAddColumnFamily(ObserverContext,
 TableName, HColumnDescriptor).
+ Use BaseMasterObserver.postAddColumnFamily(ObserverContext,
 TableName, HColumnDescriptor).
 
 
 
 org.apache.hadoop.hbase.coprocessor.BaseMasterAndRegionObserver.postAddColumnHandler(ObserverContext,
 TableName, HColumnDescriptor)
 
 
-org.apache.hadoop.hbase.coprocessor.BaseMasterObserver.postAddColumnHandler(ObserverContext,
 TableName, HColumnDescriptor)
+org.apache.hadoop.hbase.coprocessor.MasterObserver.postAddColumnHandler(ObserverContext,
 TableName, HColumnDescriptor)
 As of release 2.0.0, this will be removed in HBase 3.0.0
  (https://issues.apache.org/jira/browse/HBASE-13645;>HBASE-13645). Use
- BaseMasterObserver.postCompletedAddColumnFamilyAction(ObserverContext,
 TableName, HColumnDescriptor).
+ MasterObserver.postCompletedAddColumnFamilyAction(ObserverContext,
 TableName, HColumnDescriptor).
 
 
 
-org.apache.hadoop.hbase.coprocessor.MasterObserver.postAddColumnHandler(ObserverContext,
 TableName, HColumnDescriptor)
+org.apache.hadoop.hbase.coprocessor.BaseMasterObserver.postAddColumnHandler(ObserverContext,
 TableName, HColumnDescriptor)
 As of release 2.0.0, this will be removed in HBase 3.0.0
  (https://issues.apache.org/jira/browse/HBASE-13645;>HBASE-13645). Use
- MasterObserver.postCompletedAddColumnFamilyAction(ObserverContext,
 TableName, HColumnDescriptor).
+ BaseMasterObserver.postCompletedAddColumnFamilyAction(ObserverContext,
 TableName, HColumnDescriptor).
 
 
 
@@ -759,17 +759,17 @@
 org.apache.hadoop.hbase.coprocessor.BaseMasterAndRegionObserver.postCreateTableHandler(ObserverContext,
 HTableDescriptor, HRegionInfo[])
 
 
-org.apache.hadoop.hbase.coprocessor.BaseMasterObserver.postCreateTableHandler(ObserverContext,
 HTableDescriptor, HRegionInfo[])
+org.apache.hadoop.hbase.coprocessor.MasterObserver.postCreateTableHandler(ObserverContext,
 HTableDescriptor, HRegionInfo[])
 As of release 2.0.0, this will be removed in HBase 3.0.0
(HBASE-15575: https://issues.apache.org/jira/browse/HBASE-15575).
-   
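The deprecation notes in the diff above all follow the same migration pattern: the old coprocessor hooks (e.g. `postAddColumnHandler`) are kept but deprecated, delegating to their renamed replacements (e.g. `postCompletedAddColumnFamilyAction`) until removal in HBase 3.0.0. A minimal, self-contained sketch of that deprecate-and-delegate shim pattern — the class and method bodies here are illustrative stand-ins, not the real HBase coprocessor API:

```java
// Sketch of the deprecate-and-delegate pattern used by the hooks above.
// Names mirror the HBase methods but this is a hypothetical stand-in,
// not the actual MasterObserver interface.
public class ObserverShimSketch {

    /**
     * Old hook, kept for source compatibility until the next major release.
     * @deprecated use {@link #postCompletedAddColumnFamilyAction(String)}
     */
    @Deprecated
    public String postAddColumnHandler(String family) {
        // Delegate so callers of the old entry point reach the new hook.
        return postCompletedAddColumnFamilyAction(family);
    }

    /** Replacement hook with the new name. */
    public String postCompletedAddColumnFamilyAction(String family) {
        return "post-add:" + family;
    }

    public static void main(String[] args) {
        ObserverShimSketch obs = new ObserverShimSketch();
        // The deprecated entry point and the new hook give the same result.
        System.out.println(obs.postAddColumnHandler("cf1")); // prints "post-add:cf1"
    }
}
```

Subclasses compiled against the old method keep working through the delegate, while new code overrides only the renamed hook.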

[28/52] [partial] hbase-site git commit: Published site at b21c56e7958652ca6e6daf04642eb51abaf2b3d7.

2016-06-06 Thread misty
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/devapidocs/org/apache/hadoop/hbase/class-use/ScheduledChore.html
--
diff --git a/devapidocs/org/apache/hadoop/hbase/class-use/ScheduledChore.html 
b/devapidocs/org/apache/hadoop/hbase/class-use/ScheduledChore.html
index 1526f62..34907e4 100644
--- a/devapidocs/org/apache/hadoop/hbase/class-use/ScheduledChore.html
+++ b/devapidocs/org/apache/hadoop/hbase/class-use/ScheduledChore.html
@@ -172,43 +172,43 @@
 
 
 void
-ChoreService.cancelChore(ScheduledChorechore)
-
-
-void
 ScheduledChore.ChoreServicer.cancelChore(ScheduledChorechore)
 Cancel any ongoing schedules that this chore has with the 
implementer of this interface.
 
 
+
+void
+ChoreService.cancelChore(ScheduledChorechore)
+
 
 void
-ChoreService.cancelChore(ScheduledChorechore,
+ScheduledChore.ChoreServicer.cancelChore(ScheduledChorechore,
   booleanmayInterruptIfRunning)
 
 
 void
-ScheduledChore.ChoreServicer.cancelChore(ScheduledChorechore,
+ChoreService.cancelChore(ScheduledChorechore,
   booleanmayInterruptIfRunning)
 
 
 boolean
-ChoreService.isChoreScheduled(ScheduledChorechore)
+ScheduledChore.ChoreServicer.isChoreScheduled(ScheduledChorechore)
 
 
 boolean
-ScheduledChore.ChoreServicer.isChoreScheduled(ScheduledChorechore)
+ChoreService.isChoreScheduled(ScheduledChorechore)
 
 
 void
-ChoreService.onChoreMissedStartTime(ScheduledChorechore)
-
-
-void
 ScheduledChore.ChoreServicer.onChoreMissedStartTime(ScheduledChorechore)
 A callback that tells the implementer of this interface 
that one of the scheduled chores is
  missing its start time.
 
 
+
+void
+ChoreService.onChoreMissedStartTime(ScheduledChorechore)
+
 
 private void
 ChoreService.printChoreDetails(http://docs.oracle.com/javase/7/docs/api/java/lang/String.html?is-external=true;
 title="class or interface in java.lang">Stringheader,
@@ -226,14 +226,14 @@
 
 
 boolean
-ChoreService.triggerNow(ScheduledChorechore)
-
-
-boolean
 ScheduledChore.ChoreServicer.triggerNow(ScheduledChorechore)
 This method tries to execute the chore immediately.
 
 
+
+boolean
+ChoreService.triggerNow(ScheduledChorechore)
+
 
 
 

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/devapidocs/org/apache/hadoop/hbase/class-use/Server.html
--
diff --git a/devapidocs/org/apache/hadoop/hbase/class-use/Server.html 
b/devapidocs/org/apache/hadoop/hbase/class-use/Server.html
index 829b707..d493d57 100644
--- a/devapidocs/org/apache/hadoop/hbase/class-use/Server.html
+++ b/devapidocs/org/apache/hadoop/hbase/class-use/Server.html
@@ -184,11 +184,11 @@
 
 
 Server
-BaseCoordinatedStateManager.getServer()
+ZkCoordinatedStateManager.getServer()
 
 
 Server
-ZkCoordinatedStateManager.getServer()
+BaseCoordinatedStateManager.getServer()
 
 
 
@@ -201,11 +201,11 @@
 
 
 void
-BaseCoordinatedStateManager.initialize(Serverserver)
+ZkCoordinatedStateManager.initialize(Serverserver)
 
 
 void
-ZkCoordinatedStateManager.initialize(Serverserver)
+BaseCoordinatedStateManager.initialize(Serverserver)
 
 
 
@@ -336,16 +336,16 @@
 RegionStateStore.server
 
 
-protected Server
-BulkAssigner.server
+private Server
+SplitLogManager.server
 
 
-private Server
-CatalogJanitor.server
+protected Server
+BulkAssigner.server
 
 
 private Server
-SplitLogManager.server
+CatalogJanitor.server
 
 
 
@@ -500,7 +500,7 @@
 
 
 private Server
-LogRoller.server
+RegionMergeTransactionImpl.server
 
 
 private Server
@@ -512,11 +512,11 @@
 
 
 private Server
-HeapMemoryManager.server
+LogRoller.server
 
 
 private Server
-RegionMergeTransactionImpl.server
+HeapMemoryManager.server
 
 
 
@@ -529,23 +529,23 @@
 
 
 Server
-SplitTransactionImpl.getServer()
+RegionMergeTransactionImpl.getServer()
 
 
 Server
-SplitTransaction.getServer()
-Get the Server running the transaction or rollback
-
+SplitTransactionImpl.getServer()
 
 
 Server
-RegionMergeTransaction.getServer()
+SplitTransaction.getServer()
 Get the Server running the transaction or rollback
 
 
 
 Server
-RegionMergeTransactionImpl.getServer()
+RegionMergeTransaction.getServer()
+Get the Server running the transaction or rollback
+
 
 
 
@@ -580,11 +580,16 @@
 
 
 
+Region
+RegionMergeTransactionImpl.execute(Serverserver,
+  RegionServerServicesservices)
+
+
 PairOfSameTypeRegion
 SplitTransactionImpl.execute(Serverserver,
   RegionServerServicesservices)
 
-
+
 PairOfSameTypeRegion
 SplitTransaction.execute(Serverserver,
   RegionServerServicesservices)
@@ -593,7 +598,7 @@
 
 
 
-
+
 Region
 RegionMergeTransaction.execute(Serverserver,
   RegionServerServicesservices)
@@ -602,18 +607,19 @@
 
 
 
-
+
 Region
-RegionMergeTransactionImpl.execute(Serverserver,
-  RegionServerServicesservices)
+RegionMergeTransactionImpl.execute(Serverserver,
+  RegionServerServicesservices,
+  

[41/52] [partial] hbase-site git commit: Published site at b21c56e7958652ca6e6daf04642eb51abaf2b3d7.

2016-06-06 Thread misty
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/checkstyle.rss
--
diff --git a/checkstyle.rss b/checkstyle.rss
index 9d52840..a2bd80e 100644
--- a/checkstyle.rss
+++ b/checkstyle.rss
@@ -25,8 +25,8 @@ under the License.
 en-us
 2007 - 2016 The Apache Software Foundation
 
-  File: 1771,
- Errors: 11536,
+  File: 1773,
+ Errors: 11543,
  Warnings: 0,
  Infos: 0
   
@@ -130,7 +130,7 @@ under the License.
   
   
 
-  http://hbase.apache.org/checkstyle.html#org.apache.hadoop.hbase.regionserver.RowTooBigException.java
+  http://hbase.apache.org/checkstyle.html#org.apache.hadoop.hbase.regionserver.LogRoller.java
 
 
   0
@@ -139,12 +139,12 @@ under the License.
   0
 
 
-  0
+  5
 
   
   
 
-  http://hbase.apache.org/checkstyle.html#org.apache.hadoop.hbase.regionserver.LogRoller.java
+  http://hbase.apache.org/checkstyle.html#org.apache.hadoop.hbase.regionserver.RowTooBigException.java
 
 
   0
@@ -153,7 +153,7 @@ under the License.
   0
 
 
-  5
+  0
 
   
   
@@ -354,7 +354,7 @@ under the License.
   
   
 
-  http://hbase.apache.org/checkstyle.html#org.apache.hadoop.hbase.client.ResultStatsUtil.java
+  http://hbase.apache.org/checkstyle.html#org.apache.hadoop.hbase.client.MultiAction.java
 
 
   0
@@ -363,12 +363,12 @@ under the License.
   0
 
 
-  0
+  3
 
   
   
 
-  http://hbase.apache.org/checkstyle.html#org.apache.hadoop.hbase.client.MultiAction.java
+  http://hbase.apache.org/checkstyle.html#org.apache.hadoop.hbase.client.ResultStatsUtil.java
 
 
   0
@@ -377,7 +377,7 @@ under the License.
   0
 
 
-  3
+  0
 
   
   
@@ -676,7 +676,7 @@ under the License.
   
   
 
-  http://hbase.apache.org/checkstyle.html#org.apache.hadoop.hbase.replication.regionserver.ReplicationSyncUp.java
+  http://hbase.apache.org/checkstyle.html#org.apache.hadoop.hbase.zookeeper.ZKConfig.java
 
 
   0
@@ -685,12 +685,12 @@ under the License.
   0
 
 
-  3
+  4
 
   
   
 
-  http://hbase.apache.org/checkstyle.html#org.apache.hadoop.hbase.zookeeper.ZKConfig.java
+  http://hbase.apache.org/checkstyle.html#org.apache.hadoop.hbase.replication.regionserver.ReplicationSyncUp.java
 
 
   0
@@ -699,7 +699,7 @@ under the License.
   0
 
 
-  4
+  3
 
   
   
@@ -858,7 +858,7 @@ under the License.
   
   
 
-  http://hbase.apache.org/checkstyle.html#org.apache.hadoop.hbase.regionserver.wal.Compressor.java
+  http://hbase.apache.org/checkstyle.html#org.apache.hadoop.hbase.io.hfile.CorruptHFileException.java
 
 
   0
@@ 

[43/52] [partial] hbase-site git commit: Published site at b21c56e7958652ca6e6daf04642eb51abaf2b3d7.

2016-06-06 Thread misty
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/apidocs/src-html/org/apache/hadoop/hbase/ServerLoad.html
--
diff --git a/apidocs/src-html/org/apache/hadoop/hbase/ServerLoad.html 
b/apidocs/src-html/org/apache/hadoop/hbase/ServerLoad.html
index 87f3def..8459fb8 100644
--- a/apidocs/src-html/org/apache/hadoop/hbase/ServerLoad.html
+++ b/apidocs/src-html/org/apache/hadoop/hbase/ServerLoad.html
@@ -65,281 +65,283 @@
 057  private long totalCompactingKVs = 0;
 058  private long currentCompactedKVs = 0;
 059
-060  public 
ServerLoad(ClusterStatusProtos.ServerLoad serverLoad) {
-061this.serverLoad = serverLoad;
-062for (ClusterStatusProtos.RegionLoad 
rl: serverLoad.getRegionLoadsList()) {
-063  stores += rl.getStores();
-064  storefiles += rl.getStorefiles();
-065  storeUncompressedSizeMB += 
rl.getStoreUncompressedSizeMB();
-066  storefileSizeMB += 
rl.getStorefileSizeMB();
-067  memstoreSizeMB += 
rl.getMemstoreSizeMB();
-068  storefileIndexSizeMB += 
rl.getStorefileIndexSizeMB();
-069  readRequestsCount += 
rl.getReadRequestsCount();
-070  filteredReadRequestsCount += 
rl.getFilteredReadRequestsCount();
-071  writeRequestsCount += 
rl.getWriteRequestsCount();
-072  rootIndexSizeKB += 
rl.getRootIndexSizeKB();
-073  totalStaticIndexSizeKB += 
rl.getTotalStaticIndexSizeKB();
-074  totalStaticBloomSizeKB += 
rl.getTotalStaticBloomSizeKB();
-075  totalCompactingKVs += 
rl.getTotalCompactingKVs();
-076  currentCompactedKVs += 
rl.getCurrentCompactedKVs();
-077}
-078
-079  }
-080
-081  // NOTE: Function name cannot start 
with "get" because then an OpenDataException is thrown because
-082  // HBaseProtos.ServerLoad cannot be 
converted to an open data type(see HBASE-5967).
-083  /* @return the underlying ServerLoad 
protobuf object */
-084  public ClusterStatusProtos.ServerLoad 
obtainServerLoadPB() {
-085return serverLoad;
-086  }
-087
-088  protected 
ClusterStatusProtos.ServerLoad serverLoad;
+060  @InterfaceAudience.Private
+061  public 
ServerLoad(ClusterStatusProtos.ServerLoad serverLoad) {
+062this.serverLoad = serverLoad;
+063for (ClusterStatusProtos.RegionLoad 
rl: serverLoad.getRegionLoadsList()) {
+064  stores += rl.getStores();
+065  storefiles += rl.getStorefiles();
+066  storeUncompressedSizeMB += 
rl.getStoreUncompressedSizeMB();
+067  storefileSizeMB += 
rl.getStorefileSizeMB();
+068  memstoreSizeMB += 
rl.getMemstoreSizeMB();
+069  storefileIndexSizeMB += 
rl.getStorefileIndexSizeMB();
+070  readRequestsCount += 
rl.getReadRequestsCount();
+071  filteredReadRequestsCount += 
rl.getFilteredReadRequestsCount();
+072  writeRequestsCount += 
rl.getWriteRequestsCount();
+073  rootIndexSizeKB += 
rl.getRootIndexSizeKB();
+074  totalStaticIndexSizeKB += 
rl.getTotalStaticIndexSizeKB();
+075  totalStaticBloomSizeKB += 
rl.getTotalStaticBloomSizeKB();
+076  totalCompactingKVs += 
rl.getTotalCompactingKVs();
+077  currentCompactedKVs += 
rl.getCurrentCompactedKVs();
+078}
+079
+080  }
+081
+082  // NOTE: Function name cannot start 
with "get" because then an OpenDataException is thrown because
+083  // HBaseProtos.ServerLoad cannot be 
converted to an open data type(see HBASE-5967).
+084  /* @return the underlying ServerLoad 
protobuf object */
+085  @InterfaceAudience.Private
+086  public ClusterStatusProtos.ServerLoad 
obtainServerLoadPB() {
+087return serverLoad;
+088  }
 089
-090  /* @return number of requests  since 
last report. */
-091  public long getNumberOfRequests() {
-092return 
serverLoad.getNumberOfRequests();
-093  }
-094  public boolean hasNumberOfRequests() 
{
-095return 
serverLoad.hasNumberOfRequests();
-096  }
-097
-098  /* @return total Number of requests 
from the start of the region server. */
-099  public long getTotalNumberOfRequests() 
{
-100return 
serverLoad.getTotalNumberOfRequests();
-101  }
-102  public boolean 
hasTotalNumberOfRequests() {
-103return 
serverLoad.hasTotalNumberOfRequests();
-104  }
-105
-106  /* @return the amount of used heap, in 
MB. */
-107  public int getUsedHeapMB() {
-108return serverLoad.getUsedHeapMB();
-109  }
-110  public boolean hasUsedHeapMB() {
-111return serverLoad.hasUsedHeapMB();
-112  }
-113
-114  /* @return the maximum allowable size 
of the heap, in MB. */
-115  public int getMaxHeapMB() {
-116return serverLoad.getMaxHeapMB();
-117  }
-118  public boolean hasMaxHeapMB() {
-119return serverLoad.hasMaxHeapMB();
-120  }
-121
-122  public int getStores() {
-123return stores;
-124  }
-125
-126  public int getStorefiles() {
-127return storefiles;
-128  }
-129
-130  public int getStoreUncompressedSizeMB() 
{
-131return storeUncompressedSizeMB;
-132  }
-133
-134  public int getStorefileSizeInMB() {
-135return storefileSizeMB;
-136  }
-137
-138  public int getMemstoreSizeInMB() {
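The `ServerLoad` constructor in the diff above aggregates per-region metrics (stores, storefiles, read/write request counts, and so on) into server-wide totals by iterating the protobuf's `RegionLoad` list. A minimal, self-contained sketch of that aggregation — `RegionLoadSketch` is a hypothetical stand-in for `ClusterStatusProtos.RegionLoad`, not the real protobuf class:

```java
import java.util.List;

// Sketch of the summing done in ServerLoad's constructor: each per-region
// metric is accumulated into a server-wide total. RegionLoadSketch is a
// hypothetical stand-in for ClusterStatusProtos.RegionLoad.
public class ServerLoadSketch {
    static final class RegionLoadSketch {
        final int stores;
        final int storefiles;
        final long readRequestsCount;
        final long writeRequestsCount;
        RegionLoadSketch(int stores, int storefiles, long reads, long writes) {
            this.stores = stores;
            this.storefiles = storefiles;
            this.readRequestsCount = reads;
            this.writeRequestsCount = writes;
        }
    }

    int stores = 0;
    int storefiles = 0;
    long readRequestsCount = 0;
    long writeRequestsCount = 0;

    ServerLoadSketch(List<RegionLoadSketch> regionLoads) {
        // Walk every region's load and fold it into the server totals,
        // mirroring the loop over serverLoad.getRegionLoadsList() above.
        for (RegionLoadSketch rl : regionLoads) {
            stores += rl.stores;
            storefiles += rl.storefiles;
            readRequestsCount += rl.readRequestsCount;
            writeRequestsCount += rl.writeRequestsCount;
        }
    }

    public static void main(String[] args) {
        ServerLoadSketch load = new ServerLoadSketch(List.of(
            new RegionLoadSketch(2, 5, 100L, 40L),
            new RegionLoadSketch(1, 3, 50L, 10L)));
        System.out.println(load.stores + " " + load.readRequestsCount);
        // prints "3 150"
    }
}
```

The real class also records the raw protobuf so individual fields (heap usage, request rates) can still be read on demand; only the per-region counters are pre-summed.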

[16/52] [partial] hbase-site git commit: Published site at b21c56e7958652ca6e6daf04642eb51abaf2b3d7.

2016-06-06 Thread misty
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/devapidocs/org/apache/hadoop/hbase/client/class-use/Result.html
--
diff --git a/devapidocs/org/apache/hadoop/hbase/client/class-use/Result.html 
b/devapidocs/org/apache/hadoop/hbase/client/class-use/Result.html
index 45d1e7a..4c98444 100644
--- a/devapidocs/org/apache/hadoop/hbase/client/class-use/Result.html
+++ b/devapidocs/org/apache/hadoop/hbase/client/class-use/Result.html
@@ -375,13 +375,13 @@ service.
 
 
 Result
-Table.append(Appendappend)
+HTable.append(Appendappend)
 Appends values to one or more columns within a single 
row.
 
 
 
 Result
-HTable.append(Appendappend)
+Table.append(Appendappend)
 Appends values to one or more columns within a single 
row.
 
 
@@ -399,11 +399,11 @@ service.
 
 
 Result[]
-ScannerCallableWithReplicas.call(inttimeout)
+ClientSmallScanner.SmallScannerCallable.call(inttimeout)
 
 
 Result[]
-ClientSmallScanner.SmallScannerCallable.call(inttimeout)
+ScannerCallableWithReplicas.call(inttimeout)
 
 
 Result[]
@@ -474,13 +474,13 @@ service.
 
 
 Result
-Table.get(Getget)
+HTable.get(Getget)
 Extracts certain cells from a given row.
 
 
 
 Result
-HTable.get(Getget)
+Table.get(Getget)
 Extracts certain cells from a given row.
 
 
@@ -495,13 +495,13 @@ service.
 
 
 Result[]
-Table.get(http://docs.oracle.com/javase/7/docs/api/java/util/List.html?is-external=true;
 title="class or interface in java.util">ListGetgets)
+HTable.get(http://docs.oracle.com/javase/7/docs/api/java/util/List.html?is-external=true;
 title="class or interface in java.util">ListGetgets)
 Extracts certain cells from the given rows, in batch.
 
 
 
 Result[]
-HTable.get(http://docs.oracle.com/javase/7/docs/api/java/util/List.html?is-external=true;
 title="class or interface in java.util">ListGetgets)
+Table.get(http://docs.oracle.com/javase/7/docs/api/java/util/List.html?is-external=true;
 title="class or interface in java.util">ListGetgets)
 Extracts certain cells from the given rows, in batch.
 
 
@@ -511,13 +511,13 @@ service.
 
 
 Result
-Table.increment(Incrementincrement)
+HTable.increment(Incrementincrement)
 Increments one or more columns within a single row.
 
 
 
 Result
-HTable.increment(Incrementincrement)
+Table.increment(Incrementincrement)
 Increments one or more columns within a single row.
 
 
@@ -527,21 +527,21 @@ service.
 
 
 Result
-ResultScanner.next()
-Grab the next row's worth of values.
-
+ClientSmallScanner.next()
 
 
 Result
-ClientSmallScanner.next()
+ClientSimpleScanner.next()
 
 
 Result
-ClientSimpleScanner.next()
+ClientAsyncPrefetchScanner.next()
 
 
 Result
-ClientAsyncPrefetchScanner.next()
+ResultScanner.next()
+Grab the next row's worth of values.
+
 
 
 Result
@@ -557,14 +557,14 @@ service.
 
 
 Result[]
-ResultScanner.next(intnbRows)
-
-
-Result[]
 AbstractClientScanner.next(intnbRows)
 Get nbRows rows.
 
 
+
+Result[]
+ResultScanner.next(intnbRows)
+
 
 protected Result
 ClientScanner.nextWithSyncCache()
@@ -715,19 +715,25 @@ service.
 
 
 Result
+BaseRegionObserver.postAppend(ObserverContextRegionCoprocessorEnvironmente,
+Appendappend,
+Resultresult)
+
+
+Result
 RegionObserver.postAppend(ObserverContextRegionCoprocessorEnvironmentc,
 Appendappend,
 Resultresult)
 Called after Append
 
 
-
+
 Result
-BaseRegionObserver.postAppend(ObserverContextRegionCoprocessorEnvironmente,
-Appendappend,
-Resultresult)
+BaseRegionObserver.postIncrement(ObserverContextRegionCoprocessorEnvironmente,
+  Incrementincrement,
+  Resultresult)
 
-
+
 Result
 RegionObserver.postIncrement(ObserverContextRegionCoprocessorEnvironmentc,
   Incrementincrement,
@@ -735,60 +741,54 @@ service.
 Called after increment
 
 
-
+
 Result
-BaseRegionObserver.postIncrement(ObserverContextRegionCoprocessorEnvironmente,
-  Incrementincrement,
-  Resultresult)
+BaseRegionObserver.preAppend(ObserverContextRegionCoprocessorEnvironmente,
+  Appendappend)
 
-
+
 Result
 RegionObserver.preAppend(ObserverContextRegionCoprocessorEnvironmentc,
   Appendappend)
 Called before Append.
 
 
-
+
 Result
-BaseRegionObserver.preAppend(ObserverContextRegionCoprocessorEnvironmente,
-  Appendappend)
+BaseRegionObserver.preAppendAfterRowLock(ObserverContextRegionCoprocessorEnvironmente,
+  Appendappend)
 
-
+
 Result
 RegionObserver.preAppendAfterRowLock(ObserverContextRegionCoprocessorEnvironmentc,
   Appendappend)
 Called before Append but after acquiring rowlock.
 
 
-
+
 Result
-BaseRegionObserver.preAppendAfterRowLock(ObserverContextRegionCoprocessorEnvironmente,
-  Appendappend)

[14/52] [partial] hbase-site git commit: Published site at b21c56e7958652ca6e6daf04642eb51abaf2b3d7.

2016-06-06 Thread misty
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/devapidocs/org/apache/hadoop/hbase/client/class-use/Scan.html
--
diff --git a/devapidocs/org/apache/hadoop/hbase/client/class-use/Scan.html 
b/devapidocs/org/apache/hadoop/hbase/client/class-use/Scan.html
index 452f732..5bb78da 100644
--- a/devapidocs/org/apache/hadoop/hbase/client/class-use/Scan.html
+++ b/devapidocs/org/apache/hadoop/hbase/client/class-use/Scan.html
@@ -217,13 +217,13 @@ service.
 ScannerCallableWithReplicas.scan
 
 
-protected Scan
-ClientScanner.scan
-
-
 private Scan
 ScannerCallable.scan
 
+
+protected Scan
+ClientScanner.scan
+
 
 private Scan
 TableSnapshotScanner.scan
@@ -252,11 +252,11 @@ service.
 
 
 protected Scan
-ClientScanner.getScan()
+ScannerCallable.getScan()
 
 
 protected Scan
-ScannerCallable.getScan()
+ClientScanner.getScan()
 
 
 Scan
@@ -458,15 +458,15 @@ service.
 
 
 ResultScanner
-Table.getScanner(Scanscan)
-Returns a scanner on the current table as specified by the 
Scan
- object.
+HTable.getScanner(Scanscan)
+The underlying HTable must 
not be closed.
 
 
 
 ResultScanner
-HTable.getScanner(Scanscan)
-The underlying HTable must 
not be closed.
+Table.getScanner(Scanscan)
+Returns a scanner on the current table as specified by the 
Scan
+ object.
 
 
 
@@ -881,19 +881,25 @@ service.
 
 
 RegionScanner
+BaseRegionObserver.postScannerOpen(ObserverContextRegionCoprocessorEnvironmente,
+  Scanscan,
+  RegionScanners)
+
+
+RegionScanner
 RegionObserver.postScannerOpen(ObserverContextRegionCoprocessorEnvironmentc,
   Scanscan,
   RegionScanners)
 Called after the client opens a new scanner.
 
 
-
+
 RegionScanner
-BaseRegionObserver.postScannerOpen(ObserverContextRegionCoprocessorEnvironmente,
-  Scanscan,
-  RegionScanners)
+BaseRegionObserver.preScannerOpen(ObserverContextRegionCoprocessorEnvironmente,
+Scanscan,
+RegionScanners)
 
-
+
 RegionScanner
 RegionObserver.preScannerOpen(ObserverContextRegionCoprocessorEnvironmentc,
 Scanscan,
@@ -901,14 +907,16 @@ service.
 Called before the client opens a new scanner.
 
 
-
-RegionScanner
-BaseRegionObserver.preScannerOpen(ObserverContextRegionCoprocessorEnvironmente,
-Scanscan,
-RegionScanners)
-
 
 KeyValueScanner
+BaseRegionObserver.preStoreScannerOpen(ObserverContextRegionCoprocessorEnvironmentc,
+  Storestore,
+  Scanscan,
+  http://docs.oracle.com/javase/7/docs/api/java/util/NavigableSet.html?is-external=true;
 title="class or interface in 
java.util">NavigableSetbyte[]targetCols,
+  KeyValueScanners)
+
+
+KeyValueScanner
 RegionObserver.preStoreScannerOpen(ObserverContextRegionCoprocessorEnvironmentc,
   Storestore,
   Scanscan,
@@ -920,15 +928,16 @@ service.
 
 
 
-
+
 KeyValueScanner
-BaseRegionObserver.preStoreScannerOpen(ObserverContextRegionCoprocessorEnvironmentc,
+BaseRegionObserver.preStoreScannerOpen(ObserverContextRegionCoprocessorEnvironmentc,
   Storestore,
   Scanscan,
   http://docs.oracle.com/javase/7/docs/api/java/util/NavigableSet.html?is-external=true;
 title="class or interface in 
java.util">NavigableSetbyte[]targetCols,
-  KeyValueScanners)
+  KeyValueScanners,
+  longreadPt)
 
-
+
 KeyValueScanner
 RegionObserver.preStoreScannerOpen(ObserverContextRegionCoprocessorEnvironmentc,
   Storestore,
@@ -939,15 +948,6 @@ service.
 Called before a store opens a new scanner.
 
 
-
-KeyValueScanner
-BaseRegionObserver.preStoreScannerOpen(ObserverContextRegionCoprocessorEnvironmentc,
-  Storestore,
-  Scanscan,
-  http://docs.oracle.com/javase/7/docs/api/java/util/NavigableSet.html?is-external=true;
 title="class or interface in 
java.util">NavigableSetbyte[]targetCols,
-  KeyValueScanners,
-  longreadPt)
-
 
 
 
@@ -1059,17 +1059,17 @@ service.
 
 
 private Scan
-TableInputFormatBase.scan
-Holds the details for the internal scanner.
-
+TableSnapshotInputFormatImpl.RecordReader.scan
 
 
 private Scan
-TableSnapshotInputFormatImpl.RecordReader.scan
+TableRecordReaderImpl.scan
 
 
 private Scan
-TableRecordReaderImpl.scan

[09/52] [partial] hbase-site git commit: Published site at b21c56e7958652ca6e6daf04642eb51abaf2b3d7.

2016-06-06 Thread misty
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/devapidocs/org/apache/hadoop/hbase/coprocessor/class-use/RegionObserver.MutationType.html
--
diff --git 
a/devapidocs/org/apache/hadoop/hbase/coprocessor/class-use/RegionObserver.MutationType.html
 
b/devapidocs/org/apache/hadoop/hbase/coprocessor/class-use/RegionObserver.MutationType.html
index 1c4a4db..29fee71 100644
--- 
a/devapidocs/org/apache/hadoop/hbase/coprocessor/class-use/RegionObserver.MutationType.html
+++ 
b/devapidocs/org/apache/hadoop/hbase/coprocessor/class-use/RegionObserver.MutationType.html
@@ -132,22 +132,22 @@ the order they are declared.
 
 
 Cell
-RegionObserver.postMutationBeforeWAL(ObserverContextRegionCoprocessorEnvironmentctx,
+BaseRegionObserver.postMutationBeforeWAL(ObserverContextRegionCoprocessorEnvironmentctx,
   RegionObserver.MutationTypeopType,
   Mutationmutation,
   CelloldCell,
-  CellnewCell)
-Called after a new cell has been created during an 
increment operation, but before
- it is committed to the WAL or memstore.
-
+  CellnewCell)
 
 
 Cell
-BaseRegionObserver.postMutationBeforeWAL(ObserverContextRegionCoprocessorEnvironmentctx,
+RegionObserver.postMutationBeforeWAL(ObserverContextRegionCoprocessorEnvironmentctx,
   RegionObserver.MutationTypeopType,
   Mutationmutation,
   CelloldCell,
-  CellnewCell)
+  CellnewCell)
+Called after a new cell has been created during an 
increment operation, but before
+ it is committed to the WAL or memstore.
+
 
 
 

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/devapidocs/org/apache/hadoop/hbase/errorhandling/class-use/ForeignException.html
--
diff --git 
a/devapidocs/org/apache/hadoop/hbase/errorhandling/class-use/ForeignException.html
 
b/devapidocs/org/apache/hadoop/hbase/errorhandling/class-use/ForeignException.html
index f49f0b7..e3b9085 100644
--- 
a/devapidocs/org/apache/hadoop/hbase/errorhandling/class-use/ForeignException.html
+++ 
b/devapidocs/org/apache/hadoop/hbase/errorhandling/class-use/ForeignException.html
@@ -297,6 +297,14 @@
 
 
 void
+ProcedureCoordinatorRpcs.sendAbortToMembers(ProcedureprocName,
+ForeignExceptioncause)
+Notify the members that the coordinator has aborted the 
procedure and that it should release
+ barrier resources.
+
+
+
+void
 ZKProcedureCoordinatorRpcs.sendAbortToMembers(Procedureproc,
 ForeignExceptionee)
 This is the abort message being sent by the coordinator to 
member
@@ -305,29 +313,21 @@
  coordinator.
 
 
-
+
 void
-ProcedureCoordinatorRpcs.sendAbortToMembers(ProcedureprocName,
-ForeignExceptioncause)
-Notify the members that the coordinator has aborted the 
procedure and that it should release
- barrier resources.
+ZKProcedureMemberRpcs.sendMemberAborted(Subproceduresub,
+  ForeignExceptionee)
+This should be called by the member and should write a 
serialized root cause exception as
+ to the abort znode.
 
 
-
+
 void
 ProcedureMemberRpcs.sendMemberAborted(Subproceduresub,
   ForeignExceptioncause)
 Notify the coordinator that we aborted the specified Subprocedure
 
 
-
-void
-ZKProcedureMemberRpcs.sendMemberAborted(Subproceduresub,
-  ForeignExceptionee)
-This should be called by the member and should write a 
serialized root cause exception as
- to the abort znode.
-
-
 
 
 

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/devapidocs/org/apache/hadoop/hbase/exceptions/class-use/DeserializationException.html
--
diff --git 
a/devapidocs/org/apache/hadoop/hbase/exceptions/class-use/DeserializationException.html
 
b/devapidocs/org/apache/hadoop/hbase/exceptions/class-use/DeserializationException.html
index 0de9040..13e94e5 100644
--- 
a/devapidocs/org/apache/hadoop/hbase/exceptions/class-use/DeserializationException.html
+++ 
b/devapidocs/org/apache/hadoop/hbase/exceptions/class-use/DeserializationException.html
@@ -152,13 +152,13 @@
 HTableDescriptor.parseFrom(byte[]bytes)
 
 
-static ClusterId
-ClusterId.parseFrom(byte[]bytes)
-
-
 static HColumnDescriptor
 HColumnDescriptor.parseFrom(byte[]bytes)
 
+
+static ClusterId
+ClusterId.parseFrom(byte[]bytes)
+
 
 static TableDescriptor
 TableDescriptor.parseFrom(byte[]bytes)
@@ -257,145 +257,145 @@
 ByteArrayComparable.parseFrom(byte[]pbBytes)
 
 

[08/52] [partial] hbase-site git commit: Published site at b21c56e7958652ca6e6daf04642eb51abaf2b3d7.

2016-06-06 Thread misty
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/devapidocs/org/apache/hadoop/hbase/filter/class-use/ByteArrayComparable.html
--
diff --git 
a/devapidocs/org/apache/hadoop/hbase/filter/class-use/ByteArrayComparable.html 
b/devapidocs/org/apache/hadoop/hbase/filter/class-use/ByteArrayComparable.html
index 481141f..82805d2 100644
--- 
a/devapidocs/org/apache/hadoop/hbase/filter/class-use/ByteArrayComparable.html
+++ 
b/devapidocs/org/apache/hadoop/hbase/filter/class-use/ByteArrayComparable.html
@@ -163,147 +163,147 @@
 
 
 boolean
-RegionObserver.postCheckAndDelete(ObserverContextRegionCoprocessorEnvironmentc,
+BaseRegionObserver.postCheckAndDelete(ObserverContextRegionCoprocessorEnvironmente,
 byte[]row,
 byte[]family,
 byte[]qualifier,
 CompareFilter.CompareOpcompareOp,
 ByteArrayComparablecomparator,
 Deletedelete,
-booleanresult)
-Called after checkAndDelete
-
+booleanresult)
 
 
 boolean
-BaseRegionObserver.postCheckAndDelete(ObserverContextRegionCoprocessorEnvironmente,
+RegionObserver.postCheckAndDelete(ObserverContextRegionCoprocessorEnvironmentc,
 byte[]row,
 byte[]family,
 byte[]qualifier,
 CompareFilter.CompareOpcompareOp,
 ByteArrayComparablecomparator,
 Deletedelete,
-booleanresult)
+booleanresult)
+Called after checkAndDelete
+
 
 
 boolean
-RegionObserver.postCheckAndPut(ObserverContextRegionCoprocessorEnvironmentc,
+BaseRegionObserver.postCheckAndPut(ObserverContextRegionCoprocessorEnvironmente,
   byte[]row,
   byte[]family,
   byte[]qualifier,
   CompareFilter.CompareOpcompareOp,
   ByteArrayComparablecomparator,
   Putput,
-  booleanresult)
-Called after checkAndPut
-
+  booleanresult)
 
 
 boolean
-BaseRegionObserver.postCheckAndPut(ObserverContextRegionCoprocessorEnvironmente,
+RegionObserver.postCheckAndPut(ObserverContextRegionCoprocessorEnvironmentc,
   byte[]row,
   byte[]family,
   byte[]qualifier,
   CompareFilter.CompareOpcompareOp,
   ByteArrayComparablecomparator,
   Putput,
-  booleanresult)
+  booleanresult)
+Called after checkAndPut
+
 
 
 boolean
-RegionObserver.preCheckAndDelete(ObserverContextRegionCoprocessorEnvironmentc,
+BaseRegionObserver.preCheckAndDelete(ObserverContextRegionCoprocessorEnvironmente,
   byte[]row,
   byte[]family,
   byte[]qualifier,
   CompareFilter.CompareOpcompareOp,
   ByteArrayComparablecomparator,
   Deletedelete,
-  booleanresult)
-Called before checkAndDelete.
-
+  booleanresult)
 
 
 boolean
-BaseRegionObserver.preCheckAndDelete(ObserverContextRegionCoprocessorEnvironmente,
+RegionObserver.preCheckAndDelete(ObserverContextRegionCoprocessorEnvironmentc,
   byte[]row,
   byte[]family,
   byte[]qualifier,
   CompareFilter.CompareOpcompareOp,
   ByteArrayComparablecomparator,
   Deletedelete,
-  booleanresult)
+  booleanresult)
+Called before checkAndDelete.
+
 
 
 boolean
-RegionObserver.preCheckAndDeleteAfterRowLock(ObserverContextRegionCoprocessorEnvironmentc,
+BaseRegionObserver.preCheckAndDeleteAfterRowLock(ObserverContextRegionCoprocessorEnvironmente,
   byte[]row,
   byte[]family,
   
byte[]qualifier,
   CompareFilter.CompareOpcompareOp,
   ByteArrayComparablecomparator,
  

[30/52] [partial] hbase-site git commit: Published site at b21c56e7958652ca6e6daf04642eb51abaf2b3d7.

2016-06-06 Thread misty
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/devapidocs/org/apache/hadoop/hbase/class-use/HTableDescriptor.html
--
diff --git a/devapidocs/org/apache/hadoop/hbase/class-use/HTableDescriptor.html 
b/devapidocs/org/apache/hadoop/hbase/class-use/HTableDescriptor.html
index 3272c87..570467e 100644
--- a/devapidocs/org/apache/hadoop/hbase/class-use/HTableDescriptor.html
+++ b/devapidocs/org/apache/hadoop/hbase/class-use/HTableDescriptor.html
@@ -586,22 +586,22 @@ Input/OutputFormats, a table indexing MapReduce job, and 
utility methods.
 
 
 
-protected HTableDescriptor
-HBaseAdmin.CreateTableFuture.getTableDescriptor()
+HTableDescriptor
+HTable.getTableDescriptor()
+Gets the table descriptor for 
this table.
+
 
 
 protected HTableDescriptor
-HBaseAdmin.TableFuture.getTableDescriptor()
+HBaseAdmin.CreateTableFuture.getTableDescriptor()
 
 
-HTableDescriptor
-Table.getTableDescriptor()
-Gets the table descriptor for 
this table.
-
+protected HTableDescriptor
+HBaseAdmin.TableFuture.getTableDescriptor()
 
 
 HTableDescriptor
-HTable.getTableDescriptor()
+Table.getTableDescriptor()
 Gets the table descriptor for 
this table.
 
 
@@ -1010,18 +1010,18 @@ Input/OutputFormats, a table indexing MapReduce job, 
and utility methods.
 
 
 void
-BaseMasterObserver.postCloneSnapshot(ObserverContextMasterCoprocessorEnvironmentctx,
-  
org.apache.hadoop.hbase.protobuf.generated.HBaseProtos.SnapshotDescriptionsnapshot,
-  HTableDescriptorhTableDescriptor)
-
-
-void
 MasterObserver.postCloneSnapshot(ObserverContextMasterCoprocessorEnvironmentctx,
   
org.apache.hadoop.hbase.protobuf.generated.HBaseProtos.SnapshotDescriptionsnapshot,
   HTableDescriptorhTableDescriptor)
 Called after a snapshot clone operation has been 
requested.
 
 
+
+void
+BaseMasterObserver.postCloneSnapshot(ObserverContextMasterCoprocessorEnvironmentctx,
+  
org.apache.hadoop.hbase.protobuf.generated.HBaseProtos.SnapshotDescriptionsnapshot,
+  HTableDescriptorhTableDescriptor)
+
 
 void
 BaseMasterAndRegionObserver.postCompletedCreateTableAction(ObserverContextMasterCoprocessorEnvironmentctx,
@@ -1030,18 +1030,18 @@ Input/OutputFormats, a table indexing MapReduce job, 
and utility methods.
 
 
 void
-BaseMasterObserver.postCompletedCreateTableAction(ObserverContext<MasterCoprocessorEnvironment> ctx,
-  HTableDescriptor desc,
-  HRegionInfo[] regions)
-
-
-void
 MasterObserver.postCompletedCreateTableAction(ObserverContext<MasterCoprocessorEnvironment> ctx,
   HTableDescriptor desc,
   HRegionInfo[] regions)
 Called after the createTable operation has been requested.


+
+void
+BaseMasterObserver.postCompletedCreateTableAction(ObserverContext<MasterCoprocessorEnvironment> ctx,
+  HTableDescriptor desc,
+  HRegionInfo[] regions)
+
 
 void
 BaseMasterAndRegionObserver.postCompletedModifyTableAction(ObserverContextMasterCoprocessorEnvironmentctx,
@@ -1050,18 +1050,18 @@ Input/OutputFormats, a table indexing MapReduce job, 
and utility methods.
 
 
 void
-BaseMasterObserver.postCompletedModifyTableAction(ObserverContext<MasterCoprocessorEnvironment> ctx,
-  TableName tableName,
-  HTableDescriptor htd)
-
-
-void
 MasterObserver.postCompletedModifyTableAction(ObserverContext<MasterCoprocessorEnvironment> ctx,
   TableName tableName,
   HTableDescriptor htd)
 Called after modifying a table's properties.


+
+void
+BaseMasterObserver.postCompletedModifyTableAction(ObserverContext<MasterCoprocessorEnvironment> ctx,
+  TableName tableName,
+  HTableDescriptor htd)
+
 
 void
 BaseMasterAndRegionObserver.postCreateTable(ObserverContextMasterCoprocessorEnvironmentctx,
@@ -1070,18 +1070,18 @@ Input/OutputFormats, a table indexing MapReduce job, 
and utility methods.
 
 
 void
-BaseMasterObserver.postCreateTable(ObserverContext<MasterCoprocessorEnvironment> ctx,
-  HTableDescriptor desc,
-  HRegionInfo[] regions)
-
-
-void
 MasterObserver.postCreateTable(ObserverContext<MasterCoprocessorEnvironment> ctx,
   HTableDescriptor desc,
   

[44/52] [partial] hbase-site git commit: Published site at b21c56e7958652ca6e6daf04642eb51abaf2b3d7.

2016-06-06 Thread misty
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/apidocs/overview-tree.html
--
diff --git a/apidocs/overview-tree.html b/apidocs/overview-tree.html
index 6730bc4..8f3059e 100644
--- a/apidocs/overview-tree.html
+++ b/apidocs/overview-tree.html
@@ -839,25 +839,25 @@
 org.apache.hadoop.hbase.util.Order
 org.apache.hadoop.hbase.KeepDeletedCells
 org.apache.hadoop.hbase.ProcedureState
-org.apache.hadoop.hbase.io.encoding.DataBlockEncoding
-org.apache.hadoop.hbase.filter.RegexStringComparator.EngineType
-org.apache.hadoop.hbase.filter.BitComparator.BitwiseOp
 org.apache.hadoop.hbase.filter.CompareFilter.CompareOp
-org.apache.hadoop.hbase.filter.Filter.ReturnCode
 org.apache.hadoop.hbase.filter.FilterList.Operator
-org.apache.hadoop.hbase.regionserver.BloomType
-org.apache.hadoop.hbase.quotas.ThrottlingException.Type
-org.apache.hadoop.hbase.quotas.QuotaScope
-org.apache.hadoop.hbase.quotas.QuotaType
-org.apache.hadoop.hbase.quotas.ThrottleType
-org.apache.hadoop.hbase.client.Consistency
-org.apache.hadoop.hbase.client.IsolationLevel
-org.apache.hadoop.hbase.client.MasterSwitchType
-org.apache.hadoop.hbase.client.CompactionState
+org.apache.hadoop.hbase.filter.Filter.ReturnCode
+org.apache.hadoop.hbase.filter.BitComparator.BitwiseOp
+org.apache.hadoop.hbase.filter.RegexStringComparator.EngineType
+org.apache.hadoop.hbase.io.encoding.DataBlockEncoding
 org.apache.hadoop.hbase.client.CompactType
 org.apache.hadoop.hbase.client.Durability
+org.apache.hadoop.hbase.client.IsolationLevel
+org.apache.hadoop.hbase.client.CompactionState
+org.apache.hadoop.hbase.client.MasterSwitchType
+org.apache.hadoop.hbase.client.Consistency
 org.apache.hadoop.hbase.client.SnapshotType
 org.apache.hadoop.hbase.client.security.SecurityCapability
+org.apache.hadoop.hbase.regionserver.BloomType
+org.apache.hadoop.hbase.quotas.ThrottlingException.Type
+org.apache.hadoop.hbase.quotas.QuotaScope
+org.apache.hadoop.hbase.quotas.ThrottleType
+org.apache.hadoop.hbase.quotas.QuotaType
 
 
 

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/apidocs/src-html/org/apache/hadoop/hbase/HColumnDescriptor.html
--
diff --git a/apidocs/src-html/org/apache/hadoop/hbase/HColumnDescriptor.html 
b/apidocs/src-html/org/apache/hadoop/hbase/HColumnDescriptor.html
index 87168fe..874a0ff 100644
--- a/apidocs/src-html/org/apache/hadoop/hbase/HColumnDescriptor.html
+++ b/apidocs/src-html/org/apache/hadoop/hbase/HColumnDescriptor.html
@@ -72,7 +72,7 @@
 064  // Version 11 -- add column family level configuration.
 065  private static final byte COLUMN_DESCRIPTOR_VERSION = (byte) 11;
 066
-067  private static final String IN_MEMORY_COMPACTION = "IN_MEMORY_COMPACTION";
+067  public static final String IN_MEMORY_COMPACTION = "IN_MEMORY_COMPACTION";
 068
 069  // These constants are used as FileInfo keys
 070  public static final String COMPRESSION = "COMPRESSION";
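The hunk above widens the `IN_MEMORY_COMPACTION` key on HColumnDescriptor from private to public, so callers can reference the constant instead of re-typing the string literal. A minimal, self-contained sketch of that pattern (the class and methods below are illustrative stand-ins, not the real HBase API, which stores such keys in the descriptor's values map):

```java
import java.util.HashMap;
import java.util.Map;

class ColumnFamilySketch {
    // Public constant: callers use the shared key rather than
    // duplicating the literal (and risking a typo).
    public static final String IN_MEMORY_COMPACTION = "IN_MEMORY_COMPACTION";

    // Simplified stand-in for the descriptor's key/value settings map.
    private final Map<String, String> values = new HashMap<>();

    // Fluent setter, mirroring the style of HColumnDescriptor setters.
    ColumnFamilySketch setValue(String key, String value) {
        values.put(key, value);
        return this;
    }

    String getValue(String key) {
        return values.get(key);
    }
}
```

Usage would look like `new ColumnFamilySketch().setValue(ColumnFamilySketch.IN_MEMORY_COMPACTION, "BASIC")`; with the constant private, external code had no safe way to name the key at all.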

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/apidocs/src-html/org/apache/hadoop/hbase/RegionLoad.html
--
diff --git a/apidocs/src-html/org/apache/hadoop/hbase/RegionLoad.html 
b/apidocs/src-html/org/apache/hadoop/hbase/RegionLoad.html
index 9dea220..8176664 100644
--- a/apidocs/src-html/org/apache/hadoop/hbase/RegionLoad.html
+++ b/apidocs/src-html/org/apache/hadoop/hbase/RegionLoad.html
@@ -46,214 +46,215 @@
 038
 039  protected 
ClusterStatusProtos.RegionLoad regionLoadPB;
 040
-041  public RegionLoad(ClusterStatusProtos.RegionLoad regionLoadPB) {
-042    this.regionLoadPB = regionLoadPB;
-043  }
-044
-045  /**
-046   * @return the region name
-047   */
-048  public byte[] getName() {
-049    return regionLoadPB.getRegionSpecifier().getValue().toByteArray();
-050  }
-051
-052  /**
-053   * @return the region name as a string
-054   */
-055  public String getNameAsString() {
-056    return Bytes.toString(getName());
-057  }
-058
-059  /**
-060   * @return the number of stores
-061   */
-062  public int getStores() {
-063    return regionLoadPB.getStores();
-064  }
-065
-066  /**
-067   * @return the number of storefiles
-068   */
-069  public int getStorefiles() {
-070    return regionLoadPB.getStorefiles();
-071  }
-072
-073  /**
-074   * @return the total size of the storefiles, in MB
-075   */
-076  public int getStorefileSizeMB() {
-077    return regionLoadPB.getStorefileSizeMB();
-078  }
-079
-080  /**
-081   * @return the memstore size, in MB
-082   */
-083  public int getMemStoreSizeMB() {
-084    return regionLoadPB.getMemstoreSizeMB();
-085  }
-086
-087  /**
-088   * @return the approximate size of storefile indexes on the heap, in MB
-089   */
-090  public int getStorefileIndexSizeMB() {
-091    return regionLoadPB.getStorefileIndexSizeMB();
-092  }
-093
-094  /**
-095   * @return the number of requests made to region
-096   */

[50/52] [partial] hbase-site git commit: Published site at b21c56e7958652ca6e6daf04642eb51abaf2b3d7.

2016-06-06 Thread misty
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/apidocs/org/apache/hadoop/hbase/RegionLoad.html
--
diff --git a/apidocs/org/apache/hadoop/hbase/RegionLoad.html 
b/apidocs/org/apache/hadoop/hbase/RegionLoad.html
index 1f75cfc..0cdccf1 100644
--- a/apidocs/org/apache/hadoop/hbase/RegionLoad.html
+++ b/apidocs/org/apache/hadoop/hbase/RegionLoad.html
@@ -289,7 +289,8 @@ extends http://docs.oracle.com/javase/7/docs/api/java/lang/Object.html?
 
 
 RegionLoad
-public RegionLoad(org.apache.hadoop.hbase.protobuf.generated.ClusterStatusProtos.RegionLoad regionLoadPB)
+@InterfaceAudience.Private
+public RegionLoad(org.apache.hadoop.hbase.protobuf.generated.ClusterStatusProtos.RegionLoad regionLoadPB)
 
 
 
@@ -306,7 +307,7 @@ extends http://docs.oracle.com/javase/7/docs/api/java/lang/Object.html?
 
 
 getName
-public byte[] getName()
+public byte[] getName()
 Returns:the region name
 
 
@@ -316,7 +317,7 @@ extends http://docs.oracle.com/javase/7/docs/api/java/lang/Object.html?
 
 
 getNameAsString
-public String getNameAsString()
+public String getNameAsString()
 Returns:the region name as a 
string
 
 
@@ -326,7 +327,7 @@ extends http://docs.oracle.com/javase/7/docs/api/java/lang/Object.html?
 
 
 getStores
-publicintgetStores()
+publicintgetStores()
 Returns:the number of 
stores
 
 
@@ -336,7 +337,7 @@ extends http://docs.oracle.com/javase/7/docs/api/java/lang/Object.html?
 
 
 getStorefiles
-publicintgetStorefiles()
+publicintgetStorefiles()
 Returns:the number of 
storefiles
 
 
@@ -346,7 +347,7 @@ extends http://docs.oracle.com/javase/7/docs/api/java/lang/Object.html?
 
 
 getStorefileSizeMB
-publicintgetStorefileSizeMB()
+publicintgetStorefileSizeMB()
 Returns:the total size of the 
storefiles, in MB
 
 
@@ -356,7 +357,7 @@ extends http://docs.oracle.com/javase/7/docs/api/java/lang/Object.html?
 
 
 getMemStoreSizeMB
-publicintgetMemStoreSizeMB()
+publicintgetMemStoreSizeMB()
 Returns:the memstore size, in 
MB
 
 
@@ -366,7 +367,7 @@ extends http://docs.oracle.com/javase/7/docs/api/java/lang/Object.html?
 
 
 getStorefileIndexSizeMB
-publicintgetStorefileIndexSizeMB()
+publicintgetStorefileIndexSizeMB()
 Returns:the approximate size of 
storefile indexes on the heap, in MB
 
 
@@ -376,7 +377,7 @@ extends http://docs.oracle.com/javase/7/docs/api/java/lang/Object.html?
 
 
 getRequestsCount
-publiclonggetRequestsCount()
+publiclonggetRequestsCount()
 Returns:the number of requests 
made to region
 
 
@@ -386,7 +387,7 @@ extends http://docs.oracle.com/javase/7/docs/api/java/lang/Object.html?
 
 
 getReadRequestsCount
-publiclonggetReadRequestsCount()
+publiclonggetReadRequestsCount()
 Returns:the number of read 
requests made to region
 
 
@@ -396,7 +397,7 @@ extends http://docs.oracle.com/javase/7/docs/api/java/lang/Object.html?
 
 
 getFilteredReadRequestsCount
-publiclonggetFilteredReadRequestsCount()
+publiclonggetFilteredReadRequestsCount()
 Returns:the number of filtered 
read requests made to region
 
 
@@ -406,7 +407,7 @@ extends http://docs.oracle.com/javase/7/docs/api/java/lang/Object.html?
 
 
 getWriteRequestsCount
-publiclonggetWriteRequestsCount()
+publiclonggetWriteRequestsCount()
 Returns:the number of write 
requests made to region
 
 
@@ -416,7 +417,7 @@ extends http://docs.oracle.com/javase/7/docs/api/java/lang/Object.html?
 
 
 getRootIndexSizeKB
-publicintgetRootIndexSizeKB()
+publicintgetRootIndexSizeKB()
 Returns:The current total size of 
root-level indexes for the region, in KB.
 
 
@@ -426,7 +427,7 @@ extends http://docs.oracle.com/javase/7/docs/api/java/lang/Object.html?
 
 
 getTotalStaticIndexSizeKB
-publicintgetTotalStaticIndexSizeKB()
+publicintgetTotalStaticIndexSizeKB()
 Returns:The total size of all 
index blocks, not just the root level, in KB.
 
 
@@ -436,7 +437,7 @@ extends http://docs.oracle.com/javase/7/docs/api/java/lang/Object.html?
 
 
 getTotalStaticBloomSizeKB
-publicintgetTotalStaticBloomSizeKB()
+publicintgetTotalStaticBloomSizeKB()
 Returns:The total size of all 
Bloom filter blocks, not just loaded into the
  block cache, in KB.
 
@@ -447,7 +448,7 @@ extends http://docs.oracle.com/javase/7/docs/api/java/lang/Object.html?
 
 
 getTotalCompactingKVs
-publiclonggetTotalCompactingKVs()
+publiclonggetTotalCompactingKVs()
 Returns:the total number of kvs 
in current compaction
 
 
@@ -457,7 +458,7 @@ extends http://docs.oracle.com/javase/7/docs/api/java/lang/Object.html?
 
 
 getCurrentCompactedKVs
-publiclonggetCurrentCompactedKVs()
+publiclonggetCurrentCompactedKVs()
 Returns:the number of already 
compacted kvs in current compaction
 
 
@@ -467,7 +468,7 @@ extends http://docs.oracle.com/javase/7/docs/api/java/lang/Object.html?
 
 
 getCompleteSequenceId
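The RegionLoad hunks above show two things: the constructor taking the raw protobuf message gains `@InterfaceAudience.Private`, and every public getter simply delegates to the wrapped `regionLoadPB`. A self-contained sketch of that delegation pattern (PbRegionLoad below is a hand-rolled stand-in for the generated `ClusterStatusProtos.RegionLoad`, not the real type):

```java
// Stand-in for the generated protobuf message that RegionLoad wraps.
class PbRegionLoad {
    private final int stores;
    private final int storefiles;
    private final int storefileSizeMB;

    PbRegionLoad(int stores, int storefiles, int storefileSizeMB) {
        this.stores = stores;
        this.storefiles = storefiles;
        this.storefileSizeMB = storefileSizeMB;
    }

    int getStores() { return stores; }
    int getStorefiles() { return storefiles; }
    int getStorefileSizeMB() { return storefileSizeMB; }
}

class RegionLoadSketch {
    private final PbRegionLoad regionLoadPB;

    // In the real class this constructor is annotated
    // @InterfaceAudience.Private (as the diff above shows), signalling
    // that only HBase internals should build instances from the raw
    // protobuf; clients receive them via ClusterStatus.
    RegionLoadSketch(PbRegionLoad regionLoadPB) {
        this.regionLoadPB = regionLoadPB;
    }

    // Thin getters that forward to the wrapped message.
    public int getStores() { return regionLoadPB.getStores(); }
    public int getStorefiles() { return regionLoadPB.getStorefiles(); }
    public int getStorefileSizeMB() { return regionLoadPB.getStorefileSizeMB(); }
}
```

The wrapper keeps the public API stable even if the protobuf schema evolves, which is why the audience annotation on the constructor matters: it reserves the raw-message path for HBase itself.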

[32/52] [partial] hbase-site git commit: Published site at b21c56e7958652ca6e6daf04642eb51abaf2b3d7.

2016-06-06 Thread misty
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/devapidocs/org/apache/hadoop/hbase/class-use/HRegionInfo.html
--
diff --git a/devapidocs/org/apache/hadoop/hbase/class-use/HRegionInfo.html 
b/devapidocs/org/apache/hadoop/hbase/class-use/HRegionInfo.html
index 6bad0b5..f95f6c1 100644
--- a/devapidocs/org/apache/hadoop/hbase/class-use/HRegionInfo.html
+++ b/devapidocs/org/apache/hadoop/hbase/class-use/HRegionInfo.html
@@ -867,15 +867,15 @@ Input/OutputFormats, a table indexing MapReduce job, and 
utility methods.
 
 
 HRegionInfo
-ScannerCallable.getHRegionInfo()
+MultiServerCallable.getHRegionInfo()
 
 
 HRegionInfo
-MultiServerCallable.getHRegionInfo()
+AbstractRegionServerCallable.getHRegionInfo()
 
 
 HRegionInfo
-AbstractRegionServerCallable.getHRegionInfo()
+ScannerCallable.getHRegionInfo()
 
 
 private HRegionInfo
@@ -1071,17 +1071,17 @@ Input/OutputFormats, a table indexing MapReduce job, 
and utility methods.
 
 
 void
-SplitLogManagerCoordination.markRegionsRecovering(ServerName serverName,
+ZKSplitLogManagerCoordination.markRegionsRecovering(ServerName serverName,
   Set<HRegionInfo> userRegions)
-Mark regions in recovering state for distributed log replay
+Create znodes /hbase/recovering-regions/[region_ids...]/[failed region server names ...] for
+ all regions of the passed in region servers


 void
-ZKSplitLogManagerCoordination.markRegionsRecovering(ServerName serverName,
+SplitLogManagerCoordination.markRegionsRecovering(ServerName serverName,
   Set<HRegionInfo> userRegions)
-Create znodes /hbase/recovering-regions/[region_ids...]/[failed region server names ...] for
- all regions of the passed in region servers
+Mark regions in recovering state for distributed log replay
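The Javadoc quoted in this hunk describes the znode layout `markRegionsRecovering` creates: `/hbase/recovering-regions/[region id]/[failed region server]`. A small sketch of that path structure (the helper class and methods are illustrative; real HBase builds these paths through ZKUtil and its configured base znode):

```java
// Hypothetical helper mirroring the recovering-regions znode layout
// documented above. Plain string concatenation stands in for ZKUtil's
// joinZNode-style helpers.
class RecoveringRegionsZNodes {
    static final String BASE = "/hbase/recovering-regions";

    // One child per region being recovered, keyed by encoded region name.
    static String regionZNode(String encodedRegionName) {
        return BASE + "/" + encodedRegionName;
    }

    // Under each region, one child per failed region server whose WAL
    // edits for that region still need to be replayed.
    static String failedServerZNode(String encodedRegionName, String serverName) {
        return regionZNode(encodedRegionName) + "/" + serverName;
    }
}
```

Listing the children of a region's znode thus tells the recovery code which failed servers' edits are still outstanding for that region.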
 
 
 
@@ -1118,16 +1118,16 @@ Input/OutputFormats, a table indexing MapReduce job, 
and utility methods.
 
 
 void
-BaseMasterObserver.postAssign(ObserverContext<MasterCoprocessorEnvironment> ctx,
-  HRegionInfo regionInfo)
-
-
-void
 MasterObserver.postAssign(ObserverContext<MasterCoprocessorEnvironment> ctx,
   HRegionInfo regionInfo)
 Called after the region assignment has been requested.


+
+void
+BaseMasterObserver.postAssign(ObserverContext<MasterCoprocessorEnvironment> ctx,
+  HRegionInfo regionInfo)
+
 
 void
 BaseMasterAndRegionObserver.postCompletedCreateTableAction(ObserverContextMasterCoprocessorEnvironmentctx,
@@ -1136,18 +1136,18 @@ Input/OutputFormats, a table indexing MapReduce job, 
and utility methods.
 
 
 void
-BaseMasterObserver.postCompletedCreateTableAction(ObserverContext<MasterCoprocessorEnvironment> ctx,
-  HTableDescriptor desc,
-  HRegionInfo[] regions)
-
-
-void
 MasterObserver.postCompletedCreateTableAction(ObserverContext<MasterCoprocessorEnvironment> ctx,
   HTableDescriptor desc,
   HRegionInfo[] regions)
 Called after the createTable operation has been requested.


+
+void
+BaseMasterObserver.postCompletedCreateTableAction(ObserverContext<MasterCoprocessorEnvironment> ctx,
+  HTableDescriptor desc,
+  HRegionInfo[] regions)
+
 
 void
 BaseMasterAndRegionObserver.postCreateTable(ObserverContextMasterCoprocessorEnvironmentctx,
@@ -1156,18 +1156,18 @@ Input/OutputFormats, a table indexing MapReduce job, 
and utility methods.
 
 
 void
-BaseMasterObserver.postCreateTable(ObserverContext<MasterCoprocessorEnvironment> ctx,
-  HTableDescriptor desc,
-  HRegionInfo[] regions)
-
-
-void
 MasterObserver.postCreateTable(ObserverContext<MasterCoprocessorEnvironment> ctx,
   HTableDescriptor desc,
   HRegionInfo[] regions)
 Called after the createTable operation has been requested.


+
+void
+BaseMasterObserver.postCreateTable(ObserverContext<MasterCoprocessorEnvironment> ctx,
+  HTableDescriptor desc,
+  HRegionInfo[] regions)
+
 
 void
 BaseMasterAndRegionObserver.postCreateTableHandler(ObserverContextMasterCoprocessorEnvironmentctx,
@@ -1178,25 +1178,25 @@ Input/OutputFormats, a table indexing MapReduce job, 
and utility methods.
 
 
 void
-BaseMasterObserver.postCreateTableHandler(ObserverContextMasterCoprocessorEnvironmentctx,

[26/52] [partial] hbase-site git commit: Published site at b21c56e7958652ca6e6daf04642eb51abaf2b3d7.

2016-06-06 Thread misty
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/devapidocs/org/apache/hadoop/hbase/class-use/Stoppable.html
--
diff --git a/devapidocs/org/apache/hadoop/hbase/class-use/Stoppable.html 
b/devapidocs/org/apache/hadoop/hbase/class-use/Stoppable.html
index 33cbf05..fe6e1f2 100644
--- a/devapidocs/org/apache/hadoop/hbase/class-use/Stoppable.html
+++ b/devapidocs/org/apache/hadoop/hbase/class-use/Stoppable.html
@@ -901,7 +901,7 @@
 
 
 void
-ReplicationSource.init(org.apache.hadoop.conf.Configuration conf,
+ReplicationSourceInterface.init(org.apache.hadoop.conf.Configuration conf,
 org.apache.hadoop.fs.FileSystem fs,
 ReplicationSourceManager manager,
 ReplicationQueues replicationQueues,
@@ -911,12 +911,12 @@
 UUID clusterId,
 ReplicationEndpoint replicationEndpoint,
 MetricsSource metrics)
-Instantiation method used by region servers
+Initializer for the source


 void
-ReplicationSourceInterface.init(org.apache.hadoop.conf.Configuration conf,
+ReplicationSource.init(org.apache.hadoop.conf.Configuration conf,
 org.apache.hadoop.fs.FileSystem fs,
 ReplicationSourceManager manager,
 ReplicationQueues replicationQueues,
@@ -926,7 +926,7 @@
 UUID clusterId,
 ReplicationEndpoint replicationEndpoint,
 MetricsSource metrics)
-Initializer for the source
+Instantiation method used by region servers
 
 
 

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/devapidocs/org/apache/hadoop/hbase/class-use/TableDescriptors.html
--
diff --git a/devapidocs/org/apache/hadoop/hbase/class-use/TableDescriptors.html 
b/devapidocs/org/apache/hadoop/hbase/class-use/TableDescriptors.html
index 294d686..615907d 100644
--- a/devapidocs/org/apache/hadoop/hbase/class-use/TableDescriptors.html
+++ b/devapidocs/org/apache/hadoop/hbase/class-use/TableDescriptors.html
@@ -118,11 +118,11 @@
 
 
 TableDescriptors
-HMaster.getTableDescriptors()
+MasterServices.getTableDescriptors()
 
 
 TableDescriptors
-MasterServices.getTableDescriptors()
+HMaster.getTableDescriptors()
 
 
 



[10/52] [partial] hbase-site git commit: Published site at b21c56e7958652ca6e6daf04642eb51abaf2b3d7.

2016-06-06 Thread misty
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/devapidocs/org/apache/hadoop/hbase/coprocessor/class-use/RegionCoprocessorEnvironment.html
--
diff --git 
a/devapidocs/org/apache/hadoop/hbase/coprocessor/class-use/RegionCoprocessorEnvironment.html
 
b/devapidocs/org/apache/hadoop/hbase/coprocessor/class-use/RegionCoprocessorEnvironment.html
index 319632b..40ae382 100644
--- 
a/devapidocs/org/apache/hadoop/hbase/coprocessor/class-use/RegionCoprocessorEnvironment.html
+++ 
b/devapidocs/org/apache/hadoop/hbase/coprocessor/class-use/RegionCoprocessorEnvironment.html
@@ -152,15 +152,15 @@
 
 
 private RegionCoprocessorEnvironment
-AggregateImplementation.env
+MultiRowMutationEndpoint.env
 
 
 private RegionCoprocessorEnvironment
-MultiRowMutationEndpoint.env
+BaseRowProcessorEndpoint.env
 
 
 private RegionCoprocessorEnvironment
-BaseRowProcessorEndpoint.env
+AggregateImplementation.env
 
 
 
@@ -173,31 +173,37 @@
 
 
 Result
-RegionObserver.postAppend(ObserverContext<RegionCoprocessorEnvironment> c,
+BaseRegionObserver.postAppend(ObserverContext<RegionCoprocessorEnvironment> e,
 Append append,
-Result result)
-Called after Append
-
+Result result)


 Result
-BaseRegionObserver.postAppend(ObserverContext<RegionCoprocessorEnvironment> e,
+RegionObserver.postAppend(ObserverContext<RegionCoprocessorEnvironment> c,
 Append append,
-Result result)
+Result result)
+Called after Append
+
 
 
 void
+BaseRegionObserver.postBatchMutate(ObserverContext<RegionCoprocessorEnvironment> c,
+  MiniBatchOperationInProgress<Mutation> miniBatchOp)
+
+
+void
 RegionObserver.postBatchMutate(ObserverContext<RegionCoprocessorEnvironment> c,
   MiniBatchOperationInProgress<Mutation> miniBatchOp)
 This will be called after applying a batch of Mutations on a region.


-
+
 void
-BaseRegionObserver.postBatchMutate(ObserverContext<RegionCoprocessorEnvironment> c,
-  MiniBatchOperationInProgress<Mutation> miniBatchOp)
+BaseRegionObserver.postBatchMutateIndispensably(ObserverContext<RegionCoprocessorEnvironment> ctx,
+  MiniBatchOperationInProgress<Mutation> miniBatchOp,
+  boolean success)

-
+
 void
 RegionObserver.postBatchMutateIndispensably(ObserverContext<RegionCoprocessorEnvironment> ctx,
 MiniBatchOperationInProgress<Mutation> miniBatchOp,
@@ -206,99 +206,99 @@
  fails.


-
-void
-BaseRegionObserver.postBatchMutateIndispensably(ObserverContext<RegionCoprocessorEnvironment> ctx,
-  MiniBatchOperationInProgress<Mutation> miniBatchOp,
-  boolean success)
-
 
 boolean
-RegionObserver.postBulkLoadHFile(ObserverContext<RegionCoprocessorEnvironment> ctx,
+BaseRegionObserver.postBulkLoadHFile(ObserverContext<RegionCoprocessorEnvironment> ctx,
   List<Pair<byte[], String>> familyPaths,
-  boolean hasLoaded)
-Called after bulkLoadHFile.
-
+  boolean hasLoaded)


 boolean
-BaseRegionObserver.postBulkLoadHFile(ObserverContext<RegionCoprocessorEnvironment> ctx,
+RegionObserver.postBulkLoadHFile(ObserverContext<RegionCoprocessorEnvironment> ctx,
   List<Pair<byte[], String>> familyPaths,
-  boolean hasLoaded)
+  boolean hasLoaded)
+Called after bulkLoadHFile.
+
 
 
 boolean
-RegionObserver.postCheckAndDelete(ObserverContext<RegionCoprocessorEnvironment> c,
+BaseRegionObserver.postCheckAndDelete(ObserverContext<RegionCoprocessorEnvironment> e,
 byte[] row,
 byte[] family,
 byte[] qualifier,
 CompareFilter.CompareOp compareOp,
 ByteArrayComparable comparator,
 Delete delete,
-boolean result)
-Called after checkAndDelete
-
+boolean result)
 
 
 boolean
-BaseRegionObserver.postCheckAndDelete(ObserverContextRegionCoprocessorEnvironmente,
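Every pair in these observer tables follows the same interface-plus-no-op-base pattern: `RegionObserver` declares the hooks (and carries the Javadoc), while `BaseRegionObserver` supplies do-nothing defaults so a user coprocessor overrides only the hooks it needs. A self-contained sketch of that pattern (the types and hook signatures below are simplified stand-ins, not the real HBase coprocessor API):

```java
// Interface declaring the hooks, analogous to RegionObserver.
interface ObserverSketch {
    void postAppend(String row);          // called after an Append
    void postBatchMutate(int batchSize);  // called after a batch of mutations
}

// No-op base class, analogous to BaseRegionObserver: every hook has an
// empty default body, so subclasses stay source-compatible even when new
// hooks are added to the base.
class BaseObserverSketch implements ObserverSketch {
    @Override public void postAppend(String row) { /* no-op default */ }
    @Override public void postBatchMutate(int batchSize) { /* no-op default */ }
}

// A user coprocessor overrides only the single hook it cares about.
class CountingObserver extends BaseObserverSketch {
    int mutations;

    @Override public void postBatchMutate(int batchSize) {
        mutations += batchSize;
    }
}
```

This is also why the generated class-use tables list each hook twice, once per type: the interface entry carries the description, the base-class entry is the empty default.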

[02/52] [partial] hbase-site git commit: Published site at b21c56e7958652ca6e6daf04642eb51abaf2b3d7.

2016-06-06 Thread misty
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/d434d867/devapidocs/org/apache/hadoop/hbase/io/hfile/class-use/CacheConfig.html
--
diff --git 
a/devapidocs/org/apache/hadoop/hbase/io/hfile/class-use/CacheConfig.html 
b/devapidocs/org/apache/hadoop/hbase/io/hfile/class-use/CacheConfig.html
index 58f8c31..a82dad5 100644
--- a/devapidocs/org/apache/hadoop/hbase/io/hfile/class-use/CacheConfig.html
+++ b/devapidocs/org/apache/hadoop/hbase/io/hfile/class-use/CacheConfig.html
@@ -137,51 +137,51 @@
 
 
 StoreFileReader
-RegionObserver.postStoreFileReaderOpen(ObserverContext<RegionCoprocessorEnvironment> ctx,
+BaseRegionObserver.postStoreFileReaderOpen(ObserverContext<RegionCoprocessorEnvironment> ctx,
   org.apache.hadoop.fs.FileSystem fs,
   org.apache.hadoop.fs.Path p,
   FSDataInputStreamWrapper in,
   long size,
   CacheConfig cacheConf,
   Reference r,
-  StoreFileReader reader)
-Called after the creation of Reader for a store file.
-
+  StoreFileReader reader)


 StoreFileReader
-BaseRegionObserver.postStoreFileReaderOpen(ObserverContext<RegionCoprocessorEnvironment> ctx,
+RegionObserver.postStoreFileReaderOpen(ObserverContext<RegionCoprocessorEnvironment> ctx,
   org.apache.hadoop.fs.FileSystem fs,
   org.apache.hadoop.fs.Path p,
   FSDataInputStreamWrapper in,
   long size,
   CacheConfig cacheConf,
   Reference r,
-  StoreFileReader reader)
+  StoreFileReader reader)
+Called after the creation of Reader for a store file.
+


 StoreFileReader
-RegionObserver.preStoreFileReaderOpen(ObserverContext<RegionCoprocessorEnvironment> ctx,
+BaseRegionObserver.preStoreFileReaderOpen(ObserverContext<RegionCoprocessorEnvironment> ctx,
 org.apache.hadoop.fs.FileSystem fs,
 org.apache.hadoop.fs.Path p,
 FSDataInputStreamWrapper in,
 long size,
 CacheConfig cacheConf,
 Reference r,
-StoreFileReader reader)
-Called before creation of Reader for a store file.
-
+StoreFileReader reader)


 StoreFileReader
-BaseRegionObserver.preStoreFileReaderOpen(ObserverContext<RegionCoprocessorEnvironment> ctx,
+RegionObserver.preStoreFileReaderOpen(ObserverContext<RegionCoprocessorEnvironment> ctx,
 org.apache.hadoop.fs.FileSystem fs,
 org.apache.hadoop.fs.Path p,
 FSDataInputStreamWrapper in,
 long size,
 CacheConfig cacheConf,
 Reference r,
-StoreFileReader reader)
+StoreFileReader reader)
+Called before creation of Reader for a store file.
+
 
 
 
@@ -231,27 +231,27 @@
 
 
 
+protected CacheConfig
+HFile.WriterFactory.cacheConf
+
+
 private CacheConfig
 HFileReaderImpl.cacheConf
 Block cache configuration.
 
 
-
+
 protected CacheConfig
 HFileWriterImpl.cacheConf
 Cache configuration for caching data on write.
 
 
-
+
 private CacheConfig
 HFileBlockIndex.BlockIndexWriter.cacheConf
 CacheConfig, or null if cache-on-write is disabled
 
 
-
-protected CacheConfig
-HFile.WriterFactory.cacheConf
-
 
 
 
@@ -534,15 +534,15 @@
 
 
 private CacheConfig
-SweepJob.cacheConfig
+SweepReducer.cacheConfig
 
 
 private CacheConfig
-SweepReducer.cacheConfig
+MemStoreWrapper.cacheConfig
 
 
 private CacheConfig
-MemStoreWrapper.cacheConfig
+SweepJob.cacheConfig
 
 
 
@@ -575,17 +575,17 @@
 
 
 
-protected CacheConfig
-HStore.cacheConf
-
-
 private CacheConfig
 StoreFile.cacheConf
 
-
+
 private CacheConfig
 StoreFileWriter.Builder.cacheConf
 
+
+protected CacheConfig
+HStore.cacheConf
+
 
 protected CacheConfig
 HRegionServer.cacheConfig
@@ -609,13 +609,13 @@
 
 
 CacheConfig
-Store.getCacheConfig()
-Used for tests.
-
+HRegionServer.getCacheConfig()
 
 
 CacheConfig
-HRegionServer.getCacheConfig()
+Store.getCacheConfig()
+Used for tests.
+