hbase git commit: HBASE-19174 Updated link to presentations to link to book

2017-11-06 Thread janh
Repository: hbase
Updated Branches:
  refs/heads/master 9d63bda8f -> 29fd1dead


HBASE-19174 Updated link to presentations to link to book


Project: http://git-wip-us.apache.org/repos/asf/hbase/repo
Commit: http://git-wip-us.apache.org/repos/asf/hbase/commit/29fd1dea
Tree: http://git-wip-us.apache.org/repos/asf/hbase/tree/29fd1dea
Diff: http://git-wip-us.apache.org/repos/asf/hbase/diff/29fd1dea

Branch: refs/heads/master
Commit: 29fd1dead227a6e72d29e5b5fc990a08a7c4bb05
Parents: 9d63bda
Author: Jan Hentschel 
Authored: Sat Nov 4 00:04:00 2017 +0100
Committer: Jan Hentschel 
Committed: Tue Nov 7 08:29:38 2017 +0100

--
 src/site/asciidoc/old_news.adoc | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/hbase/blob/29fd1dea/src/site/asciidoc/old_news.adoc
--
diff --git a/src/site/asciidoc/old_news.adoc b/src/site/asciidoc/old_news.adoc
index c5cf993..4ae3d7a 100644
--- a/src/site/asciidoc/old_news.adoc
+++ b/src/site/asciidoc/old_news.adoc
@@ -113,7 +113,7 @@ October 2nd, 2009:: HBase at Hadoop World in NYC. A few of us will be talking on
 
 August 7th-9th, 2009:: HUG7 and HBase Hackathon at StumbleUpon in SF: Sign up for the:: link:http://www.meetup.com/hbaseusergroup/calendar/10950511/[HBase User Group Meeting, HUG7] or for the link:http://www.meetup.com/hackathon/calendar/10951718/[Hackathon] or for both (all are welcome!).
 
-June, 2009::  HBase at HadoopSummit2009 and at NOSQL: See the link:https://wiki.apache.org/hadoop/HBase/HBasePresentations[presentations]
+June, 2009::  HBase at HadoopSummit2009 and at NOSQL: See the link:https://hbase.apache.org/book.html#other.info.pres[presentations]
 
 March 3rd, 2009 :: HUG6 -- link:http://www.meetup.com/hbaseusergroup/calendar/9764004/[HBase User Group 6]
 



hbase git commit: HBASE-19175 Added linklint files to gitignore

2017-11-06 Thread janh
Repository: hbase
Updated Branches:
  refs/heads/master d4e3f902e -> 9d63bda8f


HBASE-19175 Added linklint files to gitignore


Project: http://git-wip-us.apache.org/repos/asf/hbase/repo
Commit: http://git-wip-us.apache.org/repos/asf/hbase/commit/9d63bda8
Tree: http://git-wip-us.apache.org/repos/asf/hbase/tree/9d63bda8
Diff: http://git-wip-us.apache.org/repos/asf/hbase/diff/9d63bda8

Branch: refs/heads/master
Commit: 9d63bda8ff44764963ee3ed11eca3881037ff789
Parents: d4e3f90
Author: Jan Hentschel 
Authored: Sat Nov 4 01:33:30 2017 +0100
Committer: Jan Hentschel 
Committed: Tue Nov 7 08:25:48 2017 +0100

--
 .gitignore | 3 +++
 1 file changed, 3 insertions(+)
--


http://git-wip-us.apache.org/repos/asf/hbase/blob/9d63bda8/.gitignore
--
diff --git a/.gitignore b/.gitignore
index b9c6fb2..405edc0 100644
--- a/.gitignore
+++ b/.gitignore
@@ -15,3 +15,6 @@ hbase-*/test
 *.ipr
 patchprocess/
 dependency-reduced-pom.xml
+link_report/
+linklint-*.zip
+linklint/



hbase git commit: HBASE-19183 Removed redundant groupId from hbase-checkstyle and hbase-error-prone

2017-11-06 Thread janh
Repository: hbase
Updated Branches:
  refs/heads/master 0356674cd -> d4e3f902e


HBASE-19183 Removed redundant groupId from hbase-checkstyle and hbase-error-prone


Project: http://git-wip-us.apache.org/repos/asf/hbase/repo
Commit: http://git-wip-us.apache.org/repos/asf/hbase/commit/d4e3f902
Tree: http://git-wip-us.apache.org/repos/asf/hbase/tree/d4e3f902
Diff: http://git-wip-us.apache.org/repos/asf/hbase/diff/d4e3f902

Branch: refs/heads/master
Commit: d4e3f902e6ba5b747295ca6053f34badd4018175
Parents: 0356674
Author: Jan Hentschel 
Authored: Sat Nov 4 23:01:40 2017 +0100
Committer: Jan Hentschel 
Committed: Tue Nov 7 08:20:51 2017 +0100

--
 hbase-build-support/hbase-error-prone/pom.xml | 1 -
 hbase-checkstyle/pom.xml  | 1 -
 2 files changed, 2 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/hbase/blob/d4e3f902/hbase-build-support/hbase-error-prone/pom.xml
--
diff --git a/hbase-build-support/hbase-error-prone/pom.xml b/hbase-build-support/hbase-error-prone/pom.xml
index 907d82d..067e154 100644
--- a/hbase-build-support/hbase-error-prone/pom.xml
+++ b/hbase-build-support/hbase-error-prone/pom.xml
@@ -26,7 +26,6 @@
     <version>3.0.0-SNAPSHOT</version>
     <relativePath>..</relativePath>
   </parent>
-  <groupId>org.apache.hbase</groupId>
   <artifactId>hbase-error-prone</artifactId>
   <version>3.0.0-SNAPSHOT</version>
   <name>Apache HBase - Error Prone Rules</name>

http://git-wip-us.apache.org/repos/asf/hbase/blob/d4e3f902/hbase-checkstyle/pom.xml
--
diff --git a/hbase-checkstyle/pom.xml b/hbase-checkstyle/pom.xml
index ed84b20..2b30c12 100644
--- a/hbase-checkstyle/pom.xml
+++ b/hbase-checkstyle/pom.xml
@@ -22,7 +22,6 @@
 */
 -->
 <modelVersion>4.0.0</modelVersion>
-<groupId>org.apache.hbase</groupId>
 <artifactId>hbase-checkstyle</artifactId>
 <version>3.0.0-SNAPSHOT</version>
 <name>Apache HBase - Checkstyle</name>
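
Both removals work because Maven resolves a module's groupId (like its version) from the `<parent>` block when the child does not declare one, so repeating it in the child is pure noise. A minimal illustrative module POM sketching the inheritance — element values here are examples, not the exact HBase files:

```xml
<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <parent>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase</artifactId>
    <version>3.0.0-SNAPSHOT</version>
  </parent>
  <!-- groupId omitted: inherited as org.apache.hbase from the parent -->
  <artifactId>hbase-checkstyle</artifactId>
</project>
```

`mvn help:evaluate -Dexpression=project.groupId` on such a module still prints the inherited value, which is why the deletion is behavior-neutral.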



hbase git commit: HBASE-19103 Add BigDecimalComparator for filter

2017-11-06 Thread janh
Repository: hbase
Updated Branches:
  refs/heads/master d1b6d8c90 -> 0356674cd


HBASE-19103 Add BigDecimalComparator for filter

Signed-off-by: Jan Hentschel 


Project: http://git-wip-us.apache.org/repos/asf/hbase/repo
Commit: http://git-wip-us.apache.org/repos/asf/hbase/commit/0356674c
Tree: http://git-wip-us.apache.org/repos/asf/hbase/tree/0356674c
Diff: http://git-wip-us.apache.org/repos/asf/hbase/diff/0356674c

Branch: refs/heads/master
Commit: 0356674cd1f59b10ab515058efa948e556fbc79e
Parents: d1b6d8c
Author: QilinCao 
Authored: Mon Oct 30 20:55:11 2017 +0800
Committer: Jan Hentschel 
Committed: Tue Nov 7 08:07:58 2017 +0100

--
 .../hbase/filter/BigDecimalComparator.java  | 116 ++
 .../src/main/protobuf/Comparator.proto  |   4 +
 .../src/main/protobuf/Comparator.proto  |   4 +
 .../hbase/filter/TestBigDecimalComparator.java  | 118 +++
 .../filter/TestComparatorSerialization.java |   9 ++
 .../hadoop/hbase/regionserver/TestHRegion.java  |  43 +++
 6 files changed, 294 insertions(+)
--


http://git-wip-us.apache.org/repos/asf/hbase/blob/0356674c/hbase-client/src/main/java/org/apache/hadoop/hbase/filter/BigDecimalComparator.java
--
diff --git a/hbase-client/src/main/java/org/apache/hadoop/hbase/filter/BigDecimalComparator.java b/hbase-client/src/main/java/org/apache/hadoop/hbase/filter/BigDecimalComparator.java
new file mode 100644
index 0000000..5da366f
--- /dev/null
+++ b/hbase-client/src/main/java/org/apache/hadoop/hbase/filter/BigDecimalComparator.java
@@ -0,0 +1,116 @@
+/*
+ *
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.hadoop.hbase.filter;
+
+import java.math.BigDecimal;
+import java.nio.ByteBuffer;
+import java.util.Objects;
+
+import org.apache.hadoop.hbase.exceptions.DeserializationException;
+import org.apache.hadoop.hbase.shaded.com.google.protobuf.InvalidProtocolBufferException;
+import org.apache.hadoop.hbase.shaded.protobuf.ProtobufUtil;
+import org.apache.hadoop.hbase.shaded.protobuf.generated.ComparatorProtos;
+import org.apache.hadoop.hbase.util.ByteBufferUtils;
+import org.apache.hadoop.hbase.util.Bytes;
+
+import org.apache.yetus.audience.InterfaceAudience;
+
+/**
+ * A BigDecimal comparator which numerically compares against the specified byte array
+ */
+@InterfaceAudience.Public
+public class BigDecimalComparator extends ByteArrayComparable {
+  private BigDecimal bigDecimal;
+
+  public BigDecimalComparator(BigDecimal value) {
+    super(Bytes.toBytes(value));
+    this.bigDecimal = value;
+  }
+
+  @Override
+  public boolean equals(Object obj) {
+    if (obj == null || !(obj instanceof BigDecimalComparator)) {
+      return false;
+    }
+    if (this == obj) {
+      return true;
+    }
+    BigDecimalComparator bdc = (BigDecimalComparator) obj;
+    return this.bigDecimal.equals(bdc.bigDecimal);
+  }
+
+  @Override
+  public int hashCode() {
+    return Objects.hash(this.bigDecimal);
+  }
+
+  @Override
+  public int compareTo(byte[] value, int offset, int length) {
+    BigDecimal that = Bytes.toBigDecimal(value, offset, length);
+    return this.bigDecimal.compareTo(that);
+  }
+
+  @Override
+  public int compareTo(ByteBuffer value, int offset, int length) {
+    BigDecimal that = ByteBufferUtils.toBigDecimal(value, offset, length);
+    return this.bigDecimal.compareTo(that);
+  }
+
+  /**
+   * @return The comparator serialized using pb
+   */
+  @Override
+  public byte[] toByteArray() {
+    ComparatorProtos.BigDecimalComparator.Builder builder =
+        ComparatorProtos.BigDecimalComparator.newBuilder();
+    builder.setComparable(ProtobufUtil.toByteArrayComparable(this.value));
+    return builder.build().toByteArray();
+  }
+
+  /**
+   * @param pbBytes A pb serialized {@link BigDecimalComparator} instance
+   * @return An instance of {@link BigDecimalComparator} made from bytes
+   * @throws DeserializationException A

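A detail worth noting about the comparator above: its `compareTo()` deserializes the stored bytes and compares numerically, while its `equals()`/`hashCode()` delegate to `BigDecimal`, whose equality is scale-sensitive. A self-contained sketch of that distinction using only the JDK (no HBase classes; the class name is illustrative):

```java
import java.math.BigDecimal;

public class BigDecimalSemantics {
    public static void main(String[] args) {
        BigDecimal a = new BigDecimal("1.0");
        BigDecimal b = new BigDecimal("1.00");

        // BigDecimal.equals() is scale-sensitive: 1.0 and 1.00 are NOT equal.
        System.out.println(a.equals(b));     // prints: false
        // BigDecimal.compareTo() is purely numeric: 1.0 and 1.00 compare equal.
        System.out.println(a.compareTo(b));  // prints: 0
    }
}
```

So two cells holding `1.0` and `1.00` match the same filter comparisons, even though the comparator objects built from those values would not be `equals()` to each other.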
hbase git commit: HBASE-19198 TestIPv6NIOServerSocketChannel fails; unable to bind

2017-11-06 Thread stack
Repository: hbase
Updated Branches:
  refs/heads/branch-1.2 498bcba09 -> 273588336


HBASE-19198 TestIPv6NIOServerSocketChannel fails; unable to bind


Project: http://git-wip-us.apache.org/repos/asf/hbase/repo
Commit: http://git-wip-us.apache.org/repos/asf/hbase/commit/27358833
Tree: http://git-wip-us.apache.org/repos/asf/hbase/tree/27358833
Diff: http://git-wip-us.apache.org/repos/asf/hbase/diff/27358833

Branch: refs/heads/branch-1.2
Commit: 2735883361e488a57f70ac5d65073528a5b474bb
Parents: 498bcba
Author: Michael Stack 
Authored: Mon Nov 6 21:19:51 2017 -0800
Committer: Michael Stack 
Committed: Mon Nov 6 21:23:50 2017 -0800

--
 .../apache/hadoop/hbase/TestIPv6NIOServerSocketChannel.java| 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/hbase/blob/27358833/hbase-server/src/test/java/org/apache/hadoop/hbase/TestIPv6NIOServerSocketChannel.java
--
diff --git a/hbase-server/src/test/java/org/apache/hadoop/hbase/TestIPv6NIOServerSocketChannel.java b/hbase-server/src/test/java/org/apache/hadoop/hbase/TestIPv6NIOServerSocketChannel.java
index c9ab40d..7b721cf 100644
--- a/hbase-server/src/test/java/org/apache/hadoop/hbase/TestIPv6NIOServerSocketChannel.java
+++ b/hbase-server/src/test/java/org/apache/hadoop/hbase/TestIPv6NIOServerSocketChannel.java
@@ -47,7 +47,6 @@ import org.junit.rules.TestRule;
  */
 @Category(SmallTests.class)
 public class TestIPv6NIOServerSocketChannel {
-
   private static final Log LOG = LogFactory.getLog(TestIPv6NIOServerSocketChannel.class);
 
   @Rule
@@ -67,6 +66,7 @@ public class TestIPv6NIOServerSocketChannel {
         break;
       } catch (BindException ex) {
         //continue
+        LOG.info("Failed on " + addr + ", inetAddr=" + inetAddr, ex);
       } finally {
         if (serverSocket != null) {
           serverSocket.close();
@@ -149,9 +149,9 @@
    */
   @Test
   public void testServerSocketFromLocalhostResolution() throws IOException {
-    InetAddress[] addrs = InetAddress.getAllByName("localhost");
+    InetAddress[] addrs = {InetAddress.getLocalHost()};
     for (InetAddress addr : addrs) {
-      LOG.info("resolved localhost as:" + addr);
+      LOG.info("Resolved localhost as: " + addr);
       bindServerSocket(addr);
       bindNIOServerSocket(addr);
     }
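
The fix replaces `InetAddress.getAllByName("localhost")`, which may return several addresses (including an IPv6 `::1` the host cannot bind), with the single platform-chosen address from `InetAddress.getLocalHost()`. A self-contained JDK-only sketch of the difference (class name is illustrative):

```java
import java.net.InetAddress;
import java.net.UnknownHostException;

public class LocalhostResolution {
    public static void main(String[] args) throws UnknownHostException {
        // Old behavior: "localhost" can resolve to several addresses,
        // e.g. 127.0.0.1 and ::1; a later bind on the IPv6 one may fail
        // on machines without a usable IPv6 stack.
        for (InetAddress addr : InetAddress.getAllByName("localhost")) {
            System.out.println("getAllByName: " + addr);
        }

        // New behavior: exactly one address, chosen by the platform.
        try {
            System.out.println("getLocalHost: " + InetAddress.getLocalHost());
        } catch (UnknownHostException e) {
            // Possible when the machine's hostname has no mapping; the old
            // code never hit this because it resolved the literal "localhost".
            System.out.println("getLocalHost failed: " + e.getMessage());
        }
    }
}
```

Iterating over a one-element array keeps the test's loop structure intact while removing the environment-dependent multi-address case.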



hbase git commit: HBASE-19198 TestIPv6NIOServerSocketChannel fails; unable to bind

2017-11-06 Thread stack
Repository: hbase
Updated Branches:
  refs/heads/branch-1.3 123f765eb -> 342328bdd


HBASE-19198 TestIPv6NIOServerSocketChannel fails; unable to bind


Project: http://git-wip-us.apache.org/repos/asf/hbase/repo
Commit: http://git-wip-us.apache.org/repos/asf/hbase/commit/342328bd
Tree: http://git-wip-us.apache.org/repos/asf/hbase/tree/342328bd
Diff: http://git-wip-us.apache.org/repos/asf/hbase/diff/342328bd

Branch: refs/heads/branch-1.3
Commit: 342328bdd841596c426feac569f20024e95a6a4b
Parents: 123f765
Author: Michael Stack 
Authored: Mon Nov 6 21:19:51 2017 -0800
Committer: Michael Stack 
Committed: Mon Nov 6 21:23:24 2017 -0800

--
 .../apache/hadoop/hbase/TestIPv6NIOServerSocketChannel.java| 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/hbase/blob/342328bd/hbase-server/src/test/java/org/apache/hadoop/hbase/TestIPv6NIOServerSocketChannel.java
--
diff --git a/hbase-server/src/test/java/org/apache/hadoop/hbase/TestIPv6NIOServerSocketChannel.java b/hbase-server/src/test/java/org/apache/hadoop/hbase/TestIPv6NIOServerSocketChannel.java
index c9ab40d..7b721cf 100644
--- a/hbase-server/src/test/java/org/apache/hadoop/hbase/TestIPv6NIOServerSocketChannel.java
+++ b/hbase-server/src/test/java/org/apache/hadoop/hbase/TestIPv6NIOServerSocketChannel.java
@@ -47,7 +47,6 @@ import org.junit.rules.TestRule;
  */
 @Category(SmallTests.class)
 public class TestIPv6NIOServerSocketChannel {
-
   private static final Log LOG = LogFactory.getLog(TestIPv6NIOServerSocketChannel.class);
 
   @Rule
@@ -67,6 +66,7 @@ public class TestIPv6NIOServerSocketChannel {
         break;
       } catch (BindException ex) {
         //continue
+        LOG.info("Failed on " + addr + ", inetAddr=" + inetAddr, ex);
       } finally {
         if (serverSocket != null) {
           serverSocket.close();
@@ -149,9 +149,9 @@
    */
   @Test
   public void testServerSocketFromLocalhostResolution() throws IOException {
-    InetAddress[] addrs = InetAddress.getAllByName("localhost");
+    InetAddress[] addrs = {InetAddress.getLocalHost()};
     for (InetAddress addr : addrs) {
-      LOG.info("resolved localhost as:" + addr);
+      LOG.info("Resolved localhost as: " + addr);
       bindServerSocket(addr);
       bindNIOServerSocket(addr);
     }



hbase git commit: HBASE-19198 TestIPv6NIOServerSocketChannel fails; unable to bind

2017-11-06 Thread stack
Repository: hbase
Updated Branches:
  refs/heads/branch-1.4 afa1b9150 -> d4e973d2d


HBASE-19198 TestIPv6NIOServerSocketChannel fails; unable to bind


Project: http://git-wip-us.apache.org/repos/asf/hbase/repo
Commit: http://git-wip-us.apache.org/repos/asf/hbase/commit/d4e973d2
Tree: http://git-wip-us.apache.org/repos/asf/hbase/tree/d4e973d2
Diff: http://git-wip-us.apache.org/repos/asf/hbase/diff/d4e973d2

Branch: refs/heads/branch-1.4
Commit: d4e973d2dea272e8e156adec52a537ac5de3ff4e
Parents: afa1b91
Author: Michael Stack 
Authored: Mon Nov 6 21:19:51 2017 -0800
Committer: Michael Stack 
Committed: Mon Nov 6 21:22:08 2017 -0800

--
 .../apache/hadoop/hbase/TestIPv6NIOServerSocketChannel.java| 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/hbase/blob/d4e973d2/hbase-server/src/test/java/org/apache/hadoop/hbase/TestIPv6NIOServerSocketChannel.java
--
diff --git a/hbase-server/src/test/java/org/apache/hadoop/hbase/TestIPv6NIOServerSocketChannel.java b/hbase-server/src/test/java/org/apache/hadoop/hbase/TestIPv6NIOServerSocketChannel.java
index 6b0b538..3dc2871 100644
--- a/hbase-server/src/test/java/org/apache/hadoop/hbase/TestIPv6NIOServerSocketChannel.java
+++ b/hbase-server/src/test/java/org/apache/hadoop/hbase/TestIPv6NIOServerSocketChannel.java
@@ -48,7 +48,6 @@ import org.junit.rules.TestRule;
  */
 @Category(SmallTests.class)
 public class TestIPv6NIOServerSocketChannel {
-
   private static final Log LOG = LogFactory.getLog(TestIPv6NIOServerSocketChannel.class);
 
   @Rule
@@ -68,6 +67,7 @@ public class TestIPv6NIOServerSocketChannel {
         break;
       } catch (BindException ex) {
         //continue
+        LOG.info("Failed on " + addr + ", inetAddr=" + inetAddr, ex);
       } finally {
         if (serverSocket != null) {
           serverSocket.close();
@@ -150,9 +150,9 @@
    */
   @Test
   public void testServerSocketFromLocalhostResolution() throws IOException {
-    InetAddress[] addrs = InetAddress.getAllByName("localhost");
+    InetAddress[] addrs = {InetAddress.getLocalHost()};
     for (InetAddress addr : addrs) {
-      LOG.info("resolved localhost as:" + addr);
+      LOG.info("Resolved localhost as: " + addr);
       bindServerSocket(addr);
       bindNIOServerSocket(addr);
     }



hbase git commit: HBASE-19198 TestIPv6NIOServerSocketChannel fails; unable to bind

2017-11-06 Thread stack
Repository: hbase
Updated Branches:
  refs/heads/branch-1 a2ae58f6e -> ca68d7786


HBASE-19198 TestIPv6NIOServerSocketChannel fails; unable to bind


Project: http://git-wip-us.apache.org/repos/asf/hbase/repo
Commit: http://git-wip-us.apache.org/repos/asf/hbase/commit/ca68d778
Tree: http://git-wip-us.apache.org/repos/asf/hbase/tree/ca68d778
Diff: http://git-wip-us.apache.org/repos/asf/hbase/diff/ca68d778

Branch: refs/heads/branch-1
Commit: ca68d77862be05500e9fd3d933f5888a5a7512c9
Parents: a2ae58f
Author: Michael Stack 
Authored: Mon Nov 6 21:19:51 2017 -0800
Committer: Michael Stack 
Committed: Mon Nov 6 21:21:34 2017 -0800

--
 .../apache/hadoop/hbase/TestIPv6NIOServerSocketChannel.java| 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/hbase/blob/ca68d778/hbase-server/src/test/java/org/apache/hadoop/hbase/TestIPv6NIOServerSocketChannel.java
--
diff --git a/hbase-server/src/test/java/org/apache/hadoop/hbase/TestIPv6NIOServerSocketChannel.java b/hbase-server/src/test/java/org/apache/hadoop/hbase/TestIPv6NIOServerSocketChannel.java
index 6b0b538..3dc2871 100644
--- a/hbase-server/src/test/java/org/apache/hadoop/hbase/TestIPv6NIOServerSocketChannel.java
+++ b/hbase-server/src/test/java/org/apache/hadoop/hbase/TestIPv6NIOServerSocketChannel.java
@@ -48,7 +48,6 @@ import org.junit.rules.TestRule;
  */
 @Category(SmallTests.class)
 public class TestIPv6NIOServerSocketChannel {
-
   private static final Log LOG = LogFactory.getLog(TestIPv6NIOServerSocketChannel.class);
 
   @Rule
@@ -68,6 +67,7 @@ public class TestIPv6NIOServerSocketChannel {
         break;
       } catch (BindException ex) {
         //continue
+        LOG.info("Failed on " + addr + ", inetAddr=" + inetAddr, ex);
       } finally {
         if (serverSocket != null) {
           serverSocket.close();
@@ -150,9 +150,9 @@
    */
   @Test
   public void testServerSocketFromLocalhostResolution() throws IOException {
-    InetAddress[] addrs = InetAddress.getAllByName("localhost");
+    InetAddress[] addrs = {InetAddress.getLocalHost()};
     for (InetAddress addr : addrs) {
-      LOG.info("resolved localhost as:" + addr);
+      LOG.info("Resolved localhost as: " + addr);
       bindServerSocket(addr);
       bindNIOServerSocket(addr);
     }



hbase git commit: HBASE-19198 TestIPv6NIOServerSocketChannel fails; unable to bind

2017-11-06 Thread stack
Repository: hbase
Updated Branches:
  refs/heads/branch-2 f13cf56f1 -> 57c0fb256


HBASE-19198 TestIPv6NIOServerSocketChannel fails; unable to bind


Project: http://git-wip-us.apache.org/repos/asf/hbase/repo
Commit: http://git-wip-us.apache.org/repos/asf/hbase/commit/57c0fb25
Tree: http://git-wip-us.apache.org/repos/asf/hbase/tree/57c0fb25
Diff: http://git-wip-us.apache.org/repos/asf/hbase/diff/57c0fb25

Branch: refs/heads/branch-2
Commit: 57c0fb2561b63384d38072ba0bf16f7182a3a5cc
Parents: f13cf56
Author: Michael Stack 
Authored: Mon Nov 6 21:19:51 2017 -0800
Committer: Michael Stack 
Committed: Mon Nov 6 21:21:06 2017 -0800

--
 .../apache/hadoop/hbase/TestIPv6NIOServerSocketChannel.java| 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/hbase/blob/57c0fb25/hbase-server/src/test/java/org/apache/hadoop/hbase/TestIPv6NIOServerSocketChannel.java
--
diff --git a/hbase-server/src/test/java/org/apache/hadoop/hbase/TestIPv6NIOServerSocketChannel.java b/hbase-server/src/test/java/org/apache/hadoop/hbase/TestIPv6NIOServerSocketChannel.java
index d4f4ada..e63eaf2 100644
--- a/hbase-server/src/test/java/org/apache/hadoop/hbase/TestIPv6NIOServerSocketChannel.java
+++ b/hbase-server/src/test/java/org/apache/hadoop/hbase/TestIPv6NIOServerSocketChannel.java
@@ -49,7 +49,6 @@ import org.junit.rules.TestRule;
  */
 @Category({MiscTests.class, SmallTests.class})
 public class TestIPv6NIOServerSocketChannel {
-
   private static final Log LOG = LogFactory.getLog(TestIPv6NIOServerSocketChannel.class);
 
   @Rule
@@ -69,6 +68,7 @@ public class TestIPv6NIOServerSocketChannel {
         break;
       } catch (BindException ex) {
         //continue
+        LOG.info("Failed on " + addr + ", inetAddr=" + inetAddr, ex);
       } finally {
         if (serverSocket != null) {
           serverSocket.close();
@@ -151,9 +151,9 @@
    */
   @Test
   public void testServerSocketFromLocalhostResolution() throws IOException {
-    InetAddress[] addrs = InetAddress.getAllByName("localhost");
+    InetAddress[] addrs = {InetAddress.getLocalHost()};
     for (InetAddress addr : addrs) {
-      LOG.info("resolved localhost as:" + addr);
+      LOG.info("Resolved localhost as: " + addr);
       bindServerSocket(addr);
       bindNIOServerSocket(addr);
     }



hbase git commit: HBASE-19198 TestIPv6NIOServerSocketChannel fails; unable to bind

2017-11-06 Thread stack
Repository: hbase
Updated Branches:
  refs/heads/master b6011a16f -> d1b6d8c90


HBASE-19198 TestIPv6NIOServerSocketChannel fails; unable to bind


Project: http://git-wip-us.apache.org/repos/asf/hbase/repo
Commit: http://git-wip-us.apache.org/repos/asf/hbase/commit/d1b6d8c9
Tree: http://git-wip-us.apache.org/repos/asf/hbase/tree/d1b6d8c9
Diff: http://git-wip-us.apache.org/repos/asf/hbase/diff/d1b6d8c9

Branch: refs/heads/master
Commit: d1b6d8c90692d2ccf9a9e5c9c6186d62a0b2b553
Parents: b6011a1
Author: Michael Stack 
Authored: Mon Nov 6 21:19:51 2017 -0800
Committer: Michael Stack 
Committed: Mon Nov 6 21:20:04 2017 -0800

--
 .../apache/hadoop/hbase/TestIPv6NIOServerSocketChannel.java| 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/hbase/blob/d1b6d8c9/hbase-server/src/test/java/org/apache/hadoop/hbase/TestIPv6NIOServerSocketChannel.java
--
diff --git a/hbase-server/src/test/java/org/apache/hadoop/hbase/TestIPv6NIOServerSocketChannel.java b/hbase-server/src/test/java/org/apache/hadoop/hbase/TestIPv6NIOServerSocketChannel.java
index d4f4ada..e63eaf2 100644
--- a/hbase-server/src/test/java/org/apache/hadoop/hbase/TestIPv6NIOServerSocketChannel.java
+++ b/hbase-server/src/test/java/org/apache/hadoop/hbase/TestIPv6NIOServerSocketChannel.java
@@ -49,7 +49,6 @@ import org.junit.rules.TestRule;
  */
 @Category({MiscTests.class, SmallTests.class})
 public class TestIPv6NIOServerSocketChannel {
-
   private static final Log LOG = LogFactory.getLog(TestIPv6NIOServerSocketChannel.class);
 
   @Rule
@@ -69,6 +68,7 @@ public class TestIPv6NIOServerSocketChannel {
         break;
       } catch (BindException ex) {
         //continue
+        LOG.info("Failed on " + addr + ", inetAddr=" + inetAddr, ex);
       } finally {
         if (serverSocket != null) {
           serverSocket.close();
@@ -151,9 +151,9 @@
    */
   @Test
   public void testServerSocketFromLocalhostResolution() throws IOException {
-    InetAddress[] addrs = InetAddress.getAllByName("localhost");
+    InetAddress[] addrs = {InetAddress.getLocalHost()};
     for (InetAddress addr : addrs) {
-      LOG.info("resolved localhost as:" + addr);
+      LOG.info("Resolved localhost as: " + addr);
       bindServerSocket(addr);
       bindNIOServerSocket(addr);
    }



hbase git commit: HBASE-19197 Move version on branch-2 from 2.0.0-alpha4 to 2.0.0-beta-1.SNAPSHOT

2017-11-06 Thread stack
Repository: hbase
Updated Branches:
  refs/heads/branch-2 40dac699b -> f13cf56f1


HBASE-19197 Move version on branch-2 from 2.0.0-alpha4 to 2.0.0-beta-1.SNAPSHOT


Project: http://git-wip-us.apache.org/repos/asf/hbase/repo
Commit: http://git-wip-us.apache.org/repos/asf/hbase/commit/f13cf56f
Tree: http://git-wip-us.apache.org/repos/asf/hbase/tree/f13cf56f
Diff: http://git-wip-us.apache.org/repos/asf/hbase/diff/f13cf56f

Branch: refs/heads/branch-2
Commit: f13cf56f1c45cd172163ebe919a81536a843364c
Parents: 40dac69
Author: Michael Stack 
Authored: Mon Nov 6 20:46:38 2017 -0800
Committer: Michael Stack 
Committed: Mon Nov 6 20:46:38 2017 -0800

--
 hbase-annotations/pom.xml| 2 +-
 hbase-archetypes/hbase-archetype-builder/pom.xml | 2 +-
 hbase-archetypes/hbase-client-project/pom.xml| 2 +-
 hbase-archetypes/hbase-shaded-client-project/pom.xml | 2 +-
 hbase-archetypes/pom.xml | 2 +-
 hbase-assembly/pom.xml   | 2 +-
 hbase-backup/pom.xml | 2 +-
 hbase-build-configuration/pom.xml| 2 +-
 hbase-checkstyle/pom.xml | 4 ++--
 hbase-client/pom.xml | 2 +-
 hbase-common/pom.xml | 2 +-
 hbase-endpoint/pom.xml   | 2 +-
 hbase-examples/pom.xml   | 2 +-
 hbase-external-blockcache/pom.xml| 2 +-
 hbase-hadoop-compat/pom.xml  | 2 +-
 hbase-hadoop2-compat/pom.xml | 2 +-
 hbase-http/pom.xml   | 2 +-
 hbase-it/pom.xml | 2 +-
 hbase-mapreduce/pom.xml  | 2 +-
 hbase-metrics-api/pom.xml| 2 +-
 hbase-metrics/pom.xml| 2 +-
 hbase-procedure/pom.xml  | 2 +-
 hbase-protocol-shaded/pom.xml| 2 +-
 hbase-protocol/pom.xml   | 2 +-
 hbase-replication/pom.xml| 2 +-
 hbase-resource-bundle/pom.xml| 2 +-
 hbase-rest/pom.xml   | 2 +-
 hbase-rsgroup/pom.xml| 2 +-
 hbase-server/pom.xml | 2 +-
 hbase-shaded/hbase-shaded-check-invariants/pom.xml   | 2 +-
 hbase-shaded/hbase-shaded-client/pom.xml | 2 +-
 hbase-shaded/hbase-shaded-mapreduce/pom.xml  | 2 +-
 hbase-shaded/pom.xml | 2 +-
 hbase-shell/pom.xml  | 2 +-
 hbase-spark-it/pom.xml   | 2 +-
 hbase-spark/pom.xml  | 2 +-
 hbase-testing-util/pom.xml   | 2 +-
 hbase-thrift/pom.xml | 2 +-
 pom.xml  | 2 +-
 39 files changed, 40 insertions(+), 40 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/hbase/blob/f13cf56f/hbase-annotations/pom.xml
--
diff --git a/hbase-annotations/pom.xml b/hbase-annotations/pom.xml
index 9e48aa0..7f92a81 100644
--- a/hbase-annotations/pom.xml
+++ b/hbase-annotations/pom.xml
@@ -23,7 +23,7 @@
   <parent>
     <artifactId>hbase</artifactId>
    <groupId>org.apache.hbase</groupId>
-    <version>2.0.0-alpha4-SNAPSHOT</version>
+    <version>2.0.0-beta-1.SNAPSHOT</version>
     <relativePath>..</relativePath>
   </parent>
 

http://git-wip-us.apache.org/repos/asf/hbase/blob/f13cf56f/hbase-archetypes/hbase-archetype-builder/pom.xml
--
diff --git a/hbase-archetypes/hbase-archetype-builder/pom.xml b/hbase-archetypes/hbase-archetype-builder/pom.xml
index 913efda..a394e53 100644
--- a/hbase-archetypes/hbase-archetype-builder/pom.xml
+++ b/hbase-archetypes/hbase-archetype-builder/pom.xml
@@ -25,7 +25,7 @@
   <parent>
     <artifactId>hbase-archetypes</artifactId>
     <groupId>org.apache.hbase</groupId>
-    <version>2.0.0-alpha4-SNAPSHOT</version>
+    <version>2.0.0-beta-1.SNAPSHOT</version>
     <relativePath>..</relativePath>
   </parent>
 

http://git-wip-us.apache.org/repos/asf/hbase/blob/f13cf56f/hbase-archetypes/hbase-client-project/pom.xml
--
diff --git a/hbase-archetypes/hbase-client-project/pom.xml b/hbase-archetypes/hbase-client-project/pom.xml
index 8f57d42..a6135d1 100644
--- a/hbase-archetypes/hbase-client-project/pom.xml
+++ b/hbase-archetypes/hbase-client-project/pom.xml
@@ -26,7 +26,7 @@
   <parent>
     <artifactId>hbase-archetypes</artifactId>
     <groupId>org.apache.hbase</groupId>
-    <version>2.0.0-alpha4-SNAPSHOT</version>
+    <version>2.0.0-beta-1.SNAPSHOT</version>
     <relativePath>..</relativePath>
   </parent>
   <artifactId>hbase-client-project</artifactId>

http://git-wip-us.apache.org/repos/asf/hbase/blob/f13cf56f/hbase-archetypes/hbase-shaded-client-project/pom.xml

hbase git commit: HBASE-19102 TestZooKeeperMainServer fails with KeeperException Reapplication after fixing compile issue.

2017-11-06 Thread stack
Repository: hbase
Updated Branches:
  refs/heads/branch-1 f4b34675c -> a2ae58f6e


HBASE-19102 TestZooKeeperMainServer fails with KeeperException
Reapplication after fixing compile issue.


Project: http://git-wip-us.apache.org/repos/asf/hbase/repo
Commit: http://git-wip-us.apache.org/repos/asf/hbase/commit/a2ae58f6
Tree: http://git-wip-us.apache.org/repos/asf/hbase/tree/a2ae58f6
Diff: http://git-wip-us.apache.org/repos/asf/hbase/diff/a2ae58f6

Branch: refs/heads/branch-1
Commit: a2ae58f6eebe41a47bfb082d340323c4385f0620
Parents: f4b3467
Author: Michael Stack 
Authored: Mon Nov 6 20:30:14 2017 -0800
Committer: Michael Stack 
Committed: Mon Nov 6 20:40:14 2017 -0800

--
 .../hadoop/hbase/zookeeper/ZooKeeperMainServer.java | 12 
 1 file changed, 12 insertions(+)
--


http://git-wip-us.apache.org/repos/asf/hbase/blob/a2ae58f6/hbase-server/src/main/java/org/apache/hadoop/hbase/zookeeper/ZooKeeperMainServer.java
--
diff --git a/hbase-server/src/main/java/org/apache/hadoop/hbase/zookeeper/ZooKeeperMainServer.java b/hbase-server/src/main/java/org/apache/hadoop/hbase/zookeeper/ZooKeeperMainServer.java
index e81da59..92460fe 100644
--- a/hbase-server/src/main/java/org/apache/hadoop/hbase/zookeeper/ZooKeeperMainServer.java
+++ b/hbase-server/src/main/java/org/apache/hadoop/hbase/zookeeper/ZooKeeperMainServer.java
@@ -20,6 +20,7 @@
 package org.apache.hadoop.hbase.zookeeper;
 
 import java.io.IOException;
+import java.util.concurrent.TimeUnit;
 
 import org.apache.hadoop.conf.Configuration;
 import org.apache.hadoop.hbase.HBaseConfiguration;
@@ -48,6 +49,15 @@ public class ZooKeeperMainServer {
     public HACK_UNTIL_ZOOKEEPER_1897_ZooKeeperMain(String[] args)
     throws IOException, InterruptedException {
       super(args);
+      // Make sure we are connected before we proceed. Can take a while on some systems. If we
+      // run the command without being connected, we get ConnectionLoss KeeperErrorConnection...
+      long startTime = System.currentTimeMillis();
+      while (!this.zk.getState().isConnected()) {
+        Thread.sleep(1);
+        if ((System.currentTimeMillis() - startTime) > 10000) {
+          throw new InterruptedException("Failed connect " + this.zk);
+        }
+      }
     }
 
     /**
@@ -100,6 +110,8 @@ public class ZooKeeperMainServer {
       }
     }
     // If command-line arguments, run our hack so they are executed.
+    // ZOOKEEPER-1897 was committed to zookeeper-3.4.6 but elsewhere in this class we say
+    // 3.4.6 breaks command-processing; TODO.
     if (hasCommandLineArguments(args)) {
       HACK_UNTIL_ZOOKEEPER_1897_ZooKeeperMain zkm =
         new HACK_UNTIL_ZOOKEEPER_1897_ZooKeeperMain(newArgs);
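
The constructor hack above is an instance of a generic poll-until-ready-with-deadline pattern: spin on a condition, back off briefly between polls, and give up once a deadline passes. A self-contained sketch of that pattern — not HBase's exact code, and the class and method names are illustrative:

```java
import java.util.function.BooleanSupplier;

public class BoundedWait {
    /**
     * Poll until {@code ready} reports true or {@code timeoutMs} elapses.
     * Returns true if the condition was met within the deadline.
     */
    static boolean waitUntil(BooleanSupplier ready, long timeoutMs) {
        long start = System.currentTimeMillis();
        while (!ready.getAsBoolean()) {
            if (System.currentTimeMillis() - start > timeoutMs) {
                return false;  // deadline exceeded; the caller decides what to throw
            }
            try {
                Thread.sleep(1);  // brief back-off between polls, as in the patch
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
                return false;
            }
        }
        return true;
    }

    public static void main(String[] args) {
        long readyAt = System.currentTimeMillis() + 50;
        // Condition becomes true after ~50 ms, well inside the 1000 ms deadline.
        System.out.println(waitUntil(() -> System.currentTimeMillis() >= readyAt, 1000));
    }
}
```

In the patch the "ready" condition is `zk.getState().isConnected()` and the give-up path throws, because running a command on an unconnected client fails with ConnectionLoss.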



hbase git commit: HBASE-19102 TestZooKeeperMainServer fails with KeeperException Reapplication after fixing compile issue.

2017-11-06 Thread stack
Repository: hbase
Updated Branches:
  refs/heads/branch-1.3 df419a168 -> 123f765eb


HBASE-19102 TestZooKeeperMainServer fails with KeeperException
Reapplication after fixing compile issue.


Project: http://git-wip-us.apache.org/repos/asf/hbase/repo
Commit: http://git-wip-us.apache.org/repos/asf/hbase/commit/123f765e
Tree: http://git-wip-us.apache.org/repos/asf/hbase/tree/123f765e
Diff: http://git-wip-us.apache.org/repos/asf/hbase/diff/123f765e

Branch: refs/heads/branch-1.3
Commit: 123f765eb408d5ccf3ed3e4a2cd247eb2c1bc305
Parents: df419a1
Author: Michael Stack 
Authored: Mon Nov 6 20:30:14 2017 -0800
Committer: Michael Stack 
Committed: Mon Nov 6 20:39:12 2017 -0800

--
 .../hadoop/hbase/zookeeper/ZooKeeperMainServer.java | 12 
 1 file changed, 12 insertions(+)
--


http://git-wip-us.apache.org/repos/asf/hbase/blob/123f765e/hbase-server/src/main/java/org/apache/hadoop/hbase/zookeeper/ZooKeeperMainServer.java
--
diff --git 
a/hbase-server/src/main/java/org/apache/hadoop/hbase/zookeeper/ZooKeeperMainServer.java
 
b/hbase-server/src/main/java/org/apache/hadoop/hbase/zookeeper/ZooKeeperMainServer.java
index e81da59..92460fe 100644
--- 
a/hbase-server/src/main/java/org/apache/hadoop/hbase/zookeeper/ZooKeeperMainServer.java
+++ 
b/hbase-server/src/main/java/org/apache/hadoop/hbase/zookeeper/ZooKeeperMainServer.java
@@ -20,6 +20,7 @@
 package org.apache.hadoop.hbase.zookeeper;
 
 import java.io.IOException;
+import java.util.concurrent.TimeUnit;
 
 import org.apache.hadoop.conf.Configuration;
 import org.apache.hadoop.hbase.HBaseConfiguration;
@@ -48,6 +49,15 @@ public class ZooKeeperMainServer {
 public HACK_UNTIL_ZOOKEEPER_1897_ZooKeeperMain(String[] args)
 throws IOException, InterruptedException {
   super(args);
+  // Make sure we are connected before we proceed. Can take a while on some systems. If we
+  // run the command without being connected, we get ConnectionLoss KeeperErrorConnection...
+  long startTime = System.currentTimeMillis();
+  while (!this.zk.getState().isConnected()) {
+Thread.sleep(1);
+if ((System.currentTimeMillis() - startTime) > 1) {
+  throw new InterruptedException("Failed connect " + this.zk);
+}
+  }
 }
 
 /**
@@ -100,6 +110,8 @@ public class ZooKeeperMainServer {
   }
 }
 // If command-line arguments, run our hack so they are executed.
+// ZOOKEEPER-1897 was committed to zookeeper-3.4.6 but elsewhere in this class we say
+// 3.4.6 breaks command-processing; TODO.
 if (hasCommandLineArguments(args)) {
   HACK_UNTIL_ZOOKEEPER_1897_ZooKeeperMain zkm =
 new HACK_UNTIL_ZOOKEEPER_1897_ZooKeeperMain(newArgs);
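The hunk above polls the ZooKeeper client's state until it reports connected, bailing out after a deadline, because running a command before the session is established yields ConnectionLoss errors. Below is a self-contained sketch of that poll-with-deadline pattern; the `BooleanSupplier` stands in for `this.zk.getState().isConnected()`, and the class and method names here are illustrative, not HBase API.

```java
import java.util.function.BooleanSupplier;

// Poll-with-deadline sketch of the connect wait added in HBASE-19102.
// The BooleanSupplier stands in for this.zk.getState().isConnected().
public final class ConnectWait {
  /** Returns true once cond holds; false if timeoutMs elapses (or we are interrupted) first. */
  public static boolean waitUntil(BooleanSupplier cond, long timeoutMs) {
    long start = System.currentTimeMillis();
    while (!cond.getAsBoolean()) {
      if (System.currentTimeMillis() - start > timeoutMs) {
        return false; // deadline exceeded; the caller decides how to fail
      }
      try {
        Thread.sleep(1); // short sleep between polls, as in the patch
      } catch (InterruptedException e) {
        Thread.currentThread().interrupt();
        return false;
      }
    }
    return true;
  }

  public static void main(String[] args) {
    long readyAt = System.currentTimeMillis() + 20;
    boolean ok = waitUntil(() -> System.currentTimeMillis() >= readyAt, 1000);
    System.out.println(ok ? "connected" : "timed out");
  }
}
```

Note one design difference: the patch signals timeout by throwing `InterruptedException` from the constructor, which keeps the existing `throws` clause unchanged; returning a boolean, as sketched here, is the more conventional shape for a standalone helper.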



hbase git commit: HBASE-19102 TestZooKeeperMainServer fails with KeeperException Reapplication after fixing compile issue.

2017-11-06 Thread stack
Repository: hbase
Updated Branches:
  refs/heads/branch-1.4 b01b7168d -> afa1b9150


HBASE-19102 TestZooKeeperMainServer fails with KeeperException
Reapplication after fixing compile issue.


Project: http://git-wip-us.apache.org/repos/asf/hbase/repo
Commit: http://git-wip-us.apache.org/repos/asf/hbase/commit/afa1b915
Tree: http://git-wip-us.apache.org/repos/asf/hbase/tree/afa1b915
Diff: http://git-wip-us.apache.org/repos/asf/hbase/diff/afa1b915

Branch: refs/heads/branch-1.4
Commit: afa1b915098eff226577b17dc3f22d31fe9cd215
Parents: b01b716
Author: Michael Stack 
Authored: Mon Nov 6 20:30:14 2017 -0800
Committer: Michael Stack 
Committed: Mon Nov 6 20:39:45 2017 -0800

--
 .../hadoop/hbase/zookeeper/ZooKeeperMainServer.java | 12 
 1 file changed, 12 insertions(+)
--


http://git-wip-us.apache.org/repos/asf/hbase/blob/afa1b915/hbase-server/src/main/java/org/apache/hadoop/hbase/zookeeper/ZooKeeperMainServer.java
--
diff --git a/hbase-server/src/main/java/org/apache/hadoop/hbase/zookeeper/ZooKeeperMainServer.java b/hbase-server/src/main/java/org/apache/hadoop/hbase/zookeeper/ZooKeeperMainServer.java
index e81da59..92460fe 100644
--- a/hbase-server/src/main/java/org/apache/hadoop/hbase/zookeeper/ZooKeeperMainServer.java
+++ b/hbase-server/src/main/java/org/apache/hadoop/hbase/zookeeper/ZooKeeperMainServer.java
@@ -20,6 +20,7 @@
 package org.apache.hadoop.hbase.zookeeper;
 
 import java.io.IOException;
+import java.util.concurrent.TimeUnit;
 
 import org.apache.hadoop.conf.Configuration;
 import org.apache.hadoop.hbase.HBaseConfiguration;
@@ -48,6 +49,15 @@ public class ZooKeeperMainServer {
 public HACK_UNTIL_ZOOKEEPER_1897_ZooKeeperMain(String[] args)
 throws IOException, InterruptedException {
   super(args);
+  // Make sure we are connected before we proceed. Can take a while on some systems. If we
+  // run the command without being connected, we get ConnectionLoss KeeperErrorConnection...
+  long startTime = System.currentTimeMillis();
+  while (!this.zk.getState().isConnected()) {
+Thread.sleep(1);
+if ((System.currentTimeMillis() - startTime) > 1) {
+  throw new InterruptedException("Failed connect " + this.zk);
+}
+  }
 }
 
 /**
@@ -100,6 +110,8 @@ public class ZooKeeperMainServer {
   }
 }
 // If command-line arguments, run our hack so they are executed.
+// ZOOKEEPER-1897 was committed to zookeeper-3.4.6 but elsewhere in this class we say
+// 3.4.6 breaks command-processing; TODO.
 if (hasCommandLineArguments(args)) {
   HACK_UNTIL_ZOOKEEPER_1897_ZooKeeperMain zkm =
 new HACK_UNTIL_ZOOKEEPER_1897_ZooKeeperMain(newArgs);



hbase git commit: HBASE-19102 TestZooKeeperMainServer fails with KeeperException Reapplication after fixing compile issue.

2017-11-06 Thread stack
Repository: hbase
Updated Branches:
  refs/heads/branch-1.2 4e9136e03 -> 498bcba09


HBASE-19102 TestZooKeeperMainServer fails with KeeperException
Reapplication after fixing compile issue.


Project: http://git-wip-us.apache.org/repos/asf/hbase/repo
Commit: http://git-wip-us.apache.org/repos/asf/hbase/commit/498bcba0
Tree: http://git-wip-us.apache.org/repos/asf/hbase/tree/498bcba0
Diff: http://git-wip-us.apache.org/repos/asf/hbase/diff/498bcba0

Branch: refs/heads/branch-1.2
Commit: 498bcba095b7f108498d025e862a3bb741416b1d
Parents: 4e9136e
Author: Michael Stack 
Authored: Mon Nov 6 20:30:14 2017 -0800
Committer: Michael Stack 
Committed: Mon Nov 6 20:38:45 2017 -0800

--
 .../hadoop/hbase/zookeeper/ZooKeeperMainServer.java | 12 
 1 file changed, 12 insertions(+)
--


http://git-wip-us.apache.org/repos/asf/hbase/blob/498bcba0/hbase-server/src/main/java/org/apache/hadoop/hbase/zookeeper/ZooKeeperMainServer.java
--
diff --git a/hbase-server/src/main/java/org/apache/hadoop/hbase/zookeeper/ZooKeeperMainServer.java b/hbase-server/src/main/java/org/apache/hadoop/hbase/zookeeper/ZooKeeperMainServer.java
index e81da59..92460fe 100644
--- a/hbase-server/src/main/java/org/apache/hadoop/hbase/zookeeper/ZooKeeperMainServer.java
+++ b/hbase-server/src/main/java/org/apache/hadoop/hbase/zookeeper/ZooKeeperMainServer.java
@@ -20,6 +20,7 @@
 package org.apache.hadoop.hbase.zookeeper;
 
 import java.io.IOException;
+import java.util.concurrent.TimeUnit;
 
 import org.apache.hadoop.conf.Configuration;
 import org.apache.hadoop.hbase.HBaseConfiguration;
@@ -48,6 +49,15 @@ public class ZooKeeperMainServer {
 public HACK_UNTIL_ZOOKEEPER_1897_ZooKeeperMain(String[] args)
 throws IOException, InterruptedException {
   super(args);
+  // Make sure we are connected before we proceed. Can take a while on some systems. If we
+  // run the command without being connected, we get ConnectionLoss KeeperErrorConnection...
+  long startTime = System.currentTimeMillis();
+  while (!this.zk.getState().isConnected()) {
+Thread.sleep(1);
+if ((System.currentTimeMillis() - startTime) > 1) {
+  throw new InterruptedException("Failed connect " + this.zk);
+}
+  }
 }
 
 /**
@@ -100,6 +110,8 @@ public class ZooKeeperMainServer {
   }
 }
 // If command-line arguments, run our hack so they are executed.
+// ZOOKEEPER-1897 was committed to zookeeper-3.4.6 but elsewhere in this class we say
+// 3.4.6 breaks command-processing; TODO.
 if (hasCommandLineArguments(args)) {
   HACK_UNTIL_ZOOKEEPER_1897_ZooKeeperMain zkm =
 new HACK_UNTIL_ZOOKEEPER_1897_ZooKeeperMain(newArgs);



hbase git commit: HBASE-19189 Ad-hoc test job for running a subset of tests lots of times [Forced Update!]

2017-11-06 Thread busbey
Repository: hbase
Updated Branches:
  refs/heads/HBASE-19189 4e3a4755c -> 1b1ba46fb (forced update)


HBASE-19189 Ad-hoc test job for running a subset of tests lots of times


Project: http://git-wip-us.apache.org/repos/asf/hbase/repo
Commit: http://git-wip-us.apache.org/repos/asf/hbase/commit/1b1ba46f
Tree: http://git-wip-us.apache.org/repos/asf/hbase/tree/1b1ba46f
Diff: http://git-wip-us.apache.org/repos/asf/hbase/diff/1b1ba46f

Branch: refs/heads/HBASE-19189
Commit: 1b1ba46fb91496e75de12c6c41c490d9de071a42
Parents: 28cdf4a
Author: Sean Busbey 
Authored: Mon Nov 6 13:48:05 2017 -0600
Committer: Sean Busbey 
Committed: Mon Nov 6 20:43:49 2017 -0600

--
 dev-support/adhoc_run_tests/Jenkinsfile| 78 +++
 dev-support/adhoc_run_tests/adhoc_run_tests.sh | 83 +
 dev-support/gather_machine_environment.sh  | 50 +
 3 files changed, 211 insertions(+)
--


http://git-wip-us.apache.org/repos/asf/hbase/blob/1b1ba46f/dev-support/adhoc_run_tests/Jenkinsfile
--
diff --git a/dev-support/adhoc_run_tests/Jenkinsfile b/dev-support/adhoc_run_tests/Jenkinsfile
new file mode 100644
index 000..6646bdf
--- /dev/null
+++ b/dev-support/adhoc_run_tests/Jenkinsfile
@@ -0,0 +1,78 @@
+// Licensed to the Apache Software Foundation (ASF) under one
+// or more contributor license agreements.  See the NOTICE file
+// distributed with this work for additional information
+// regarding copyright ownership.  The ASF licenses this file
+// to you under the Apache License, Version 2.0 (the
+// "License"); you may not use this file except in compliance
+// with the License.  You may obtain a copy of the License at
+//
+//   http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing,
+// software distributed under the License is distributed on an
+// "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+// KIND, either express or implied.  See the License for the
+// specific language governing permissions and limitations
+// under the License.
+pipeline {
+  parameters {
+string(name: 'tests', description: 'space separated list of tests to run.')
+string(name: 'node', defaultValue: 'Hadoop', description: 'the node label that should be used to run the test.')
+  }
+  agent {
+node {
+  label "${params.node}"
+}
+  }
+  options {
+timeout (time: 6, unit: 'HOURS')
+timestamps()
+  }
+  environment {
+// where we check out to across stages
+BASEDIR = "${env.WORKSPACE}/component"
+OUTPUT_RELATIVE = 'output'
+OUTPUTDIR = "${env.WORKSPACE}/output"
+BRANCH_SPECIFIC_DOCKERFILE = "${env.BASEDIR}/dev-support/docker/Dockerfile"
+  }
+  stages {
+stage ('run tests') {
+  tools {
+maven 'Maven (latest)'
+// this needs to be set to the jdk that ought to be used to build releases on the branch the Jenkinsfile is stored in.
+jdk "JDK 1.8 (latest)"
+  }
+  steps {
+sh """#!/bin/bash -e
+  echo "Setting up directories"
+  rm -rf "${env.OUTPUTDIR}" && mkdir "${env.OUTPUTDIR}"
+  rm -rf ".m2-repo" && mkdir ".m2-repo"
+  mkdir "${env.OUTPUTDIR}/machine"
+"""
+sh """#!/bin/bash -e
+  "${env.BASEDIR}/dev-support/gather_machine_environment.sh" \
+  "${OUTPUT_RELATIVE}/machine"
+"""
+dir ("component") {
+  sh '''#!/bin/bash -e
+./dev-support/adhoc_run_tests/adhoc_run_tests.sh \
+--force-timeout 1800 \
+--maven-local-repo ".m2-repo" \
+--log-output "${OUTPUTDIR}" \
+--repeat 100 \
+"${tests}"
+'''
+}
+  }
+  post {
+always {
+  archive 'output/*'
+  archive 'output/**/*'
+}
+failure {
+  archive 'component/**/target/surefire-reports/*'
+}
+  }
+}
+  }
+}
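The Jenkinsfile above invokes adhoc_run_tests.sh with `--repeat 100` to run the chosen tests many times and flush out flaky failures (the script body itself is cut off in this archive). As an illustration of the repeat-runner idea only, here is a hypothetical Java sketch; the `Runnable` stands in for one Maven test invocation, and whether the real script stops at the first failure is not visible in the truncated listing, so early stopping is simply one reasonable choice made here.

```java
// Repeat-until-failure sketch of the --repeat idea in adhoc_run_tests.sh
// (hypothetical; the Runnable stands in for one Maven test invocation).
public final class RepeatRunner {
  /** Runs task up to repeats times; returns the number of successful runs. */
  public static int repeat(Runnable task, int repeats) {
    int passed = 0;
    for (int i = 0; i < repeats; i++) {
      try {
        task.run();
        passed++;
      } catch (RuntimeException e) {
        break; // stop at the first failure so the logs point at the failing run
      }
    }
    return passed;
  }
}
```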

http://git-wip-us.apache.org/repos/asf/hbase/blob/1b1ba46f/dev-support/adhoc_run_tests/adhoc_run_tests.sh
--
diff --git a/dev-support/adhoc_run_tests/adhoc_run_tests.sh b/dev-support/adhoc_run_tests/adhoc_run_tests.sh
new file mode 100755
index 000..3e61017
--- /dev/null
+++ b/dev-support/adhoc_run_tests/adhoc_run_tests.sh
@@ -0,0 +1,83 @@
+#!/usr/bin/env bash
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy 

hbase git commit: HBASE-19186 Unify to use bytes to show size in master/rs ui

2017-11-06 Thread zghao
Repository: hbase
Updated Branches:
  refs/heads/branch-2 f4a4144f3 -> 40dac699b


HBASE-19186 Unify to use bytes to show size in master/rs ui


Project: http://git-wip-us.apache.org/repos/asf/hbase/repo
Commit: http://git-wip-us.apache.org/repos/asf/hbase/commit/40dac699
Tree: http://git-wip-us.apache.org/repos/asf/hbase/tree/40dac699
Diff: http://git-wip-us.apache.org/repos/asf/hbase/diff/40dac699

Branch: refs/heads/branch-2
Commit: 40dac699bd7ca37fd2401d60f10bebb8a856ad88
Parents: f4a4144
Author: Guanghao Zhang 
Authored: Sun Nov 5 12:41:02 2017 +0800
Committer: Guanghao Zhang 
Committed: Tue Nov 7 10:07:52 2017 +0800

--
 .../tmpl/regionserver/BlockCacheTmpl.jamon  |   4 +-
 .../tmpl/regionserver/ServerMetricsTmpl.jamon   |  10 +-
 .../hbase-webapps/master/procedures.jsp |   9 +-
 .../hbase-webapps/master/processMaster.jsp  |   9 +-
 .../hbase-webapps/master/processRS.jsp  | 228 ---
 .../resources/hbase-webapps/master/table.jsp|   2 +-
 .../hbase-webapps/regionserver/processRS.jsp|   9 +-
 7 files changed, 23 insertions(+), 248 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/hbase/blob/40dac699/hbase-server/src/main/jamon/org/apache/hadoop/hbase/tmpl/regionserver/BlockCacheTmpl.jamon
--
diff --git a/hbase-server/src/main/jamon/org/apache/hadoop/hbase/tmpl/regionserver/BlockCacheTmpl.jamon b/hbase-server/src/main/jamon/org/apache/hadoop/hbase/tmpl/regionserver/BlockCacheTmpl.jamon
index b4e44d8..5ea5bcc 100644
--- a/hbase-server/src/main/jamon/org/apache/hadoop/hbase/tmpl/regionserver/BlockCacheTmpl.jamon
+++ b/hbase-server/src/main/jamon/org/apache/hadoop/hbase/tmpl/regionserver/BlockCacheTmpl.jamon
@@ -244,13 +244,13 @@ org.apache.hadoop.util.StringUtils.TraditionalBinaryPrefix;
 Size
 <% TraditionalBinaryPrefix.long2String(cacheConfig.getBlockCache().getCurrentSize(), "B", 1) %>
-Current size of block cache in use (bytes)
+Current size of block cache in use
 
 
 Free
 <% TraditionalBinaryPrefix.long2String(cacheConfig.getBlockCache().getFreeSize(), "B", 1) %>
-The total free memory currently available to store more cache entries (bytes)
+The total free memory currently available to store more cache entries
 
 
 Count

http://git-wip-us.apache.org/repos/asf/hbase/blob/40dac699/hbase-server/src/main/jamon/org/apache/hadoop/hbase/tmpl/regionserver/ServerMetricsTmpl.jamon
--
diff --git a/hbase-server/src/main/jamon/org/apache/hadoop/hbase/tmpl/regionserver/ServerMetricsTmpl.jamon b/hbase-server/src/main/jamon/org/apache/hadoop/hbase/tmpl/regionserver/ServerMetricsTmpl.jamon
index 2e99d5b..adcfff1 100644
--- a/hbase-server/src/main/jamon/org/apache/hadoop/hbase/tmpl/regionserver/ServerMetricsTmpl.jamon
+++ b/hbase-server/src/main/jamon/org/apache/hadoop/hbase/tmpl/regionserver/ServerMetricsTmpl.jamon
@@ -146,7 +146,7 @@ MetricsRegionServerWrapper mWrap;
 
 
 Num. WAL Files
-Size. WAL Files (bytes)
+Size. WAL Files
 
 
 
@@ -165,9 +165,9 @@ MetricsRegionServerWrapper mWrap;
 
 Num. Stores
 Num. Storefiles
-Root Index Size (bytes)
-Index Size (bytes)
-Bloom Size (bytes)
+Root Index Size
+Index Size
+Bloom Size
 
 
 <% mWrap.getNumStores() %>
@@ -212,7 +212,7 @@ MetricsHBaseServerWrapper mServerWrap;
 Priority Call Queue Length
 General Call Queue Length
 Replication Call Queue Length
-Total Call Queue Size (bytes)
+Total Call Queue Size
 
 
 

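The templates above stop printing "(bytes)" labels because the values are rendered through `TraditionalBinaryPrefix.long2String`, which already attaches a binary-prefixed unit. As a rough stand-in for what such a formatter does (this is not the Hadoop implementation, and its exact output format may differ from `long2String`'s):

```java
import java.util.Locale;

// Simplified binary-prefix size formatter, in the spirit of Hadoop's
// StringUtils.TraditionalBinaryPrefix.long2String (not the actual code).
public final class BinarySize {
  private static final String[] PREFIXES = {"", "K", "M", "G", "T", "P"};

  /** Formats a byte count with a binary prefix, e.g. 1536 -> "1.5 KB". */
  public static String humanReadable(long bytes, int decimals) {
    double value = bytes;
    int i = 0;
    while (value >= 1024 && i < PREFIXES.length - 1) {
      value /= 1024;  // step up to the next binary prefix
      i++;
    }
    return String.format(Locale.ROOT, "%." + decimals + "f %sB", value, PREFIXES[i]);
  }
}
```

Rendering the unit in the value itself is what lets the column headers ("Size. WAL Files", "Index Size", and so on) drop the redundant "(bytes)" suffix.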
http://git-wip-us.apache.org/repos/asf/hbase/blob/40dac699/hbase-server/src/main/resources/hbase-webapps/master/procedures.jsp
--
diff --git a/hbase-server/src/main/resources/hbase-webapps/master/procedures.jsp b/hbase-server/src/main/resources/hbase-webapps/master/procedures.jsp
index 63a41cc..c3df296 100644
--- a/hbase-server/src/main/resources/hbase-webapps/master/procedures.jsp
+++ b/hbase-server/src/main/resources/hbase-webapps/master/procedures.jsp
@@ -39,6 +39,7 @@
   import="org.apache.hadoop.hbase.procedure2.util.StringUtils"
   import="org.apache.hadoop.hbase.shaded.protobuf.generated.ProcedureProtos"
   import="org.apache.hadoop.hbase.shaded.protobuf.ProtobufUtil"
+  import="org.apache.hadoop.util.StringUtils.TraditionalBinaryPrefix"
 %>
 <%
   HMaster master = (HMaster)getServletContext().getAttribute(HMaster.MASTER);
@@ -173,7 +174,7 @@
 <%ProcedureWALFile pwf = procedureWALFiles.get(i); %>
 
<%= pwf.getLogId() %>
-   <%= StringUtils.humanSize(pwf.getSize()) 

hbase git commit: HBASE-19186 Unify to use bytes to show size in master/rs ui

2017-11-06 Thread zghao
Repository: hbase
Updated Branches:
  refs/heads/master 2a99b87af -> b6011a16f


HBASE-19186 Unify to use bytes to show size in master/rs ui


Project: http://git-wip-us.apache.org/repos/asf/hbase/repo
Commit: http://git-wip-us.apache.org/repos/asf/hbase/commit/b6011a16
Tree: http://git-wip-us.apache.org/repos/asf/hbase/tree/b6011a16
Diff: http://git-wip-us.apache.org/repos/asf/hbase/diff/b6011a16

Branch: refs/heads/master
Commit: b6011a16fffebae21e56c41206b29d96c0613024
Parents: 2a99b87
Author: Guanghao Zhang 
Authored: Sun Nov 5 12:41:02 2017 +0800
Committer: Guanghao Zhang 
Committed: Tue Nov 7 10:07:03 2017 +0800

--
 .../tmpl/regionserver/BlockCacheTmpl.jamon  |   4 +-
 .../tmpl/regionserver/ServerMetricsTmpl.jamon   |  10 +-
 .../hbase-webapps/master/procedures.jsp |   9 +-
 .../hbase-webapps/master/processMaster.jsp  |   9 +-
 .../hbase-webapps/master/processRS.jsp  | 228 ---
 .../resources/hbase-webapps/master/table.jsp|   2 +-
 .../hbase-webapps/regionserver/processRS.jsp|   9 +-
 7 files changed, 23 insertions(+), 248 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/hbase/blob/b6011a16/hbase-server/src/main/jamon/org/apache/hadoop/hbase/tmpl/regionserver/BlockCacheTmpl.jamon
--
diff --git a/hbase-server/src/main/jamon/org/apache/hadoop/hbase/tmpl/regionserver/BlockCacheTmpl.jamon b/hbase-server/src/main/jamon/org/apache/hadoop/hbase/tmpl/regionserver/BlockCacheTmpl.jamon
index b4e44d8..5ea5bcc 100644
--- a/hbase-server/src/main/jamon/org/apache/hadoop/hbase/tmpl/regionserver/BlockCacheTmpl.jamon
+++ b/hbase-server/src/main/jamon/org/apache/hadoop/hbase/tmpl/regionserver/BlockCacheTmpl.jamon
@@ -244,13 +244,13 @@ org.apache.hadoop.util.StringUtils.TraditionalBinaryPrefix;
 Size
 <% TraditionalBinaryPrefix.long2String(cacheConfig.getBlockCache().getCurrentSize(), "B", 1) %>
-Current size of block cache in use (bytes)
+Current size of block cache in use
 
 
 Free
 <% TraditionalBinaryPrefix.long2String(cacheConfig.getBlockCache().getFreeSize(), "B", 1) %>
-The total free memory currently available to store more cache entries (bytes)
+The total free memory currently available to store more cache entries
 
 
 Count

http://git-wip-us.apache.org/repos/asf/hbase/blob/b6011a16/hbase-server/src/main/jamon/org/apache/hadoop/hbase/tmpl/regionserver/ServerMetricsTmpl.jamon
--
diff --git a/hbase-server/src/main/jamon/org/apache/hadoop/hbase/tmpl/regionserver/ServerMetricsTmpl.jamon b/hbase-server/src/main/jamon/org/apache/hadoop/hbase/tmpl/regionserver/ServerMetricsTmpl.jamon
index 2e99d5b..adcfff1 100644
--- a/hbase-server/src/main/jamon/org/apache/hadoop/hbase/tmpl/regionserver/ServerMetricsTmpl.jamon
+++ b/hbase-server/src/main/jamon/org/apache/hadoop/hbase/tmpl/regionserver/ServerMetricsTmpl.jamon
@@ -146,7 +146,7 @@ MetricsRegionServerWrapper mWrap;
 
 
 Num. WAL Files
-Size. WAL Files (bytes)
+Size. WAL Files
 
 
 
@@ -165,9 +165,9 @@ MetricsRegionServerWrapper mWrap;
 
 Num. Stores
 Num. Storefiles
-Root Index Size (bytes)
-Index Size (bytes)
-Bloom Size (bytes)
+Root Index Size
+Index Size
+Bloom Size
 
 
 <% mWrap.getNumStores() %>
@@ -212,7 +212,7 @@ MetricsHBaseServerWrapper mServerWrap;
 Priority Call Queue Length
 General Call Queue Length
 Replication Call Queue Length
-Total Call Queue Size (bytes)
+Total Call Queue Size
 
 
 

http://git-wip-us.apache.org/repos/asf/hbase/blob/b6011a16/hbase-server/src/main/resources/hbase-webapps/master/procedures.jsp
--
diff --git a/hbase-server/src/main/resources/hbase-webapps/master/procedures.jsp b/hbase-server/src/main/resources/hbase-webapps/master/procedures.jsp
index 63a41cc..c3df296 100644
--- a/hbase-server/src/main/resources/hbase-webapps/master/procedures.jsp
+++ b/hbase-server/src/main/resources/hbase-webapps/master/procedures.jsp
@@ -39,6 +39,7 @@
   import="org.apache.hadoop.hbase.procedure2.util.StringUtils"
   import="org.apache.hadoop.hbase.shaded.protobuf.generated.ProcedureProtos"
   import="org.apache.hadoop.hbase.shaded.protobuf.ProtobufUtil"
+  import="org.apache.hadoop.util.StringUtils.TraditionalBinaryPrefix"
 %>
 <%
   HMaster master = (HMaster)getServletContext().getAttribute(HMaster.MASTER);
@@ -173,7 +174,7 @@
 <%ProcedureWALFile pwf = procedureWALFiles.get(i); %>
 
<%= pwf.getLogId() %>
-   <%= StringUtils.humanSize(pwf.getSize()) %> 

svn commit: r22940 - /dev/hbase/hbase-2.0.0-alpha4RC0/ /release/hbase/2.0.0-alpha-3/ /release/hbase/2.0.0-alpha4/

2017-11-06 Thread stack
Author: stack
Date: Tue Nov  7 00:35:59 2017
New Revision: 22940

Log:
Publish 2.0.0-alpha4RC0 as 2.0.0-alpha4

Added:
release/hbase/2.0.0-alpha4/
  - copied from r22939, dev/hbase/hbase-2.0.0-alpha4RC0/
Removed:
dev/hbase/hbase-2.0.0-alpha4RC0/
release/hbase/2.0.0-alpha-3/



hbase git commit: HBASE-19189 Ad-hoc test job for running a subset of tests lots of times [Forced Update!]

2017-11-06 Thread busbey
Repository: hbase
Updated Branches:
  refs/heads/HBASE-19189 bfe926ad1 -> 4e3a4755c (forced update)


HBASE-19189 Ad-hoc test job for running a subset of tests lots of times


Project: http://git-wip-us.apache.org/repos/asf/hbase/repo
Commit: http://git-wip-us.apache.org/repos/asf/hbase/commit/4e3a4755
Tree: http://git-wip-us.apache.org/repos/asf/hbase/tree/4e3a4755
Diff: http://git-wip-us.apache.org/repos/asf/hbase/diff/4e3a4755

Branch: refs/heads/HBASE-19189
Commit: 4e3a4755c9357ceef876ee8c7321088f6859cac0
Parents: 28cdf4a
Author: Sean Busbey 
Authored: Mon Nov 6 13:48:05 2017 -0600
Committer: Sean Busbey 
Committed: Mon Nov 6 15:23:32 2017 -0600

--
 dev-support/adhoc_run_tests/Jenkinsfile| 74 ++
 dev-support/adhoc_run_tests/adhoc_run_tests.sh | 83 +
 dev-support/gather_machine_environment.sh  | 50 +
 3 files changed, 207 insertions(+)
--


http://git-wip-us.apache.org/repos/asf/hbase/blob/4e3a4755/dev-support/adhoc_run_tests/Jenkinsfile
--
diff --git a/dev-support/adhoc_run_tests/Jenkinsfile b/dev-support/adhoc_run_tests/Jenkinsfile
new file mode 100644
index 000..0d960cd
--- /dev/null
+++ b/dev-support/adhoc_run_tests/Jenkinsfile
@@ -0,0 +1,74 @@
+// Licensed to the Apache Software Foundation (ASF) under one
+// or more contributor license agreements.  See the NOTICE file
+// distributed with this work for additional information
+// regarding copyright ownership.  The ASF licenses this file
+// to you under the Apache License, Version 2.0 (the
+// "License"); you may not use this file except in compliance
+// with the License.  You may obtain a copy of the License at
+//
+//   http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing,
+// software distributed under the License is distributed on an
+// "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+// KIND, either express or implied.  See the License for the
+// specific language governing permissions and limitations
+// under the License.
+pipeline {
+  agent {
+node {
+  label 'ubuntu'
+}
+  }
+  options {
+timeout (time: 6, unit: 'HOURS')
+timestamps()
+  }
+  environment {
+// where we check out to across stages
+BASEDIR = "${env.WORKSPACE}/component"
+OUTPUT_RELATIVE = 'output'
+OUTPUTDIR = "${env.WORKSPACE}/output"
+BRANCH_SPECIFIC_DOCKERFILE = "${env.BASEDIR}/dev-support/docker/Dockerfile"
+  }
+  stages {
+stage ('run tests') {
+  tools {
+maven 'Maven (latest)'
+// this needs to be set to the jdk that ought to be used to build releases on the branch the Jenkinsfile is stored in.
+jdk "JDK 1.8 (latest)"
+  }
+  steps {
+sh """#!/bin/bash -e
+  echo "Setting up directories"
+  rm -rf "${env.OUTPUTDIR}" && mkdir "${env.OUTPUTDIR}"
+  rm -rf ".m2-repo" && mkdir ".m2-repo"
+  mkdir "${env.OUTPUTDIR}/machine"
+"""
+sh """#!/bin/bash -e
+  "${env.BASEDIR}/dev-support/gather_machine_environment.sh" \
+  "${OUTPUT_RELATIVE}/machine"
+"""
+dir ("component") {
+  sh '''#!/bin/bash -e
+./dev-support/adhoc_run_tests/adhoc_run_tests.sh \
+--force-timeout 1800 \
+--maven-local-repo ".m2-repo" \
+--log-output "${OUTPUTDIR}" \
+--repeat 100 \
+"${tests}"
+'''
+}
+  }
+  post {
+always {
+  archive 'output/*'
+  archive 'output/**/*'
+}
+failure {
+  archive 'component/**/target/surefire-reports/*'
+}
+  }
+}
+  }
+}

http://git-wip-us.apache.org/repos/asf/hbase/blob/4e3a4755/dev-support/adhoc_run_tests/adhoc_run_tests.sh
--
diff --git a/dev-support/adhoc_run_tests/adhoc_run_tests.sh b/dev-support/adhoc_run_tests/adhoc_run_tests.sh
new file mode 100755
index 000..3e61017
--- /dev/null
+++ b/dev-support/adhoc_run_tests/adhoc_run_tests.sh
@@ -0,0 +1,83 @@
+#!/usr/bin/env bash
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT 

hbase git commit: HBASE-19189 Ad-hoc test job for running a subset of tests lots of times [Forced Update!]

2017-11-06 Thread busbey
Repository: hbase
Updated Branches:
  refs/heads/HBASE-19189 ae2e1972e -> bfe926ad1 (forced update)


HBASE-19189 Ad-hoc test job for running a subset of tests lots of times


Project: http://git-wip-us.apache.org/repos/asf/hbase/repo
Commit: http://git-wip-us.apache.org/repos/asf/hbase/commit/bfe926ad
Tree: http://git-wip-us.apache.org/repos/asf/hbase/tree/bfe926ad
Diff: http://git-wip-us.apache.org/repos/asf/hbase/diff/bfe926ad

Branch: refs/heads/HBASE-19189
Commit: bfe926ad1f234a97cb03d4ad5d967b1c1c6b51da
Parents: 28cdf4a
Author: Sean Busbey 
Authored: Mon Nov 6 13:48:05 2017 -0600
Committer: Sean Busbey 
Committed: Mon Nov 6 14:53:53 2017 -0600

--
 dev-support/adhoc_run_tests/Jenkinsfile| 70 +
 dev-support/adhoc_run_tests/adhoc_run_tests.sh | 83 +
 dev-support/gather_machine_environment.sh  | 50 +
 3 files changed, 203 insertions(+)
--


http://git-wip-us.apache.org/repos/asf/hbase/blob/bfe926ad/dev-support/adhoc_run_tests/Jenkinsfile
--
diff --git a/dev-support/adhoc_run_tests/Jenkinsfile b/dev-support/adhoc_run_tests/Jenkinsfile
new file mode 100644
index 000..b7b96e1
--- /dev/null
+++ b/dev-support/adhoc_run_tests/Jenkinsfile
@@ -0,0 +1,70 @@
+// Licensed to the Apache Software Foundation (ASF) under one
+// or more contributor license agreements.  See the NOTICE file
+// distributed with this work for additional information
+// regarding copyright ownership.  The ASF licenses this file
+// to you under the Apache License, Version 2.0 (the
+// "License"); you may not use this file except in compliance
+// with the License.  You may obtain a copy of the License at
+//
+//   http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing,
+// software distributed under the License is distributed on an
+// "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+// KIND, either express or implied.  See the License for the
+// specific language governing permissions and limitations
+// under the License.
+pipeline {
+  agent {
+node {
+  label 'ubuntu'
+}
+  }
+  options {
+timeout (time: 6, unit: 'HOURS')
+timestamps()
+  }
+  environment {
+// where we check out to across stages
+BASEDIR = "${env.WORKSPACE}/component"
+OUTPUT_RELATIVE = 'output'
+OUTPUTDIR = "${env.WORKSPACE}/output"
+BRANCH_SPECIFIC_DOCKERFILE = "${env.BASEDIR}/dev-support/docker/Dockerfile"
+  }
+  stages {
+stage ('run tests') {
+  tools {
+maven 'Maven (latest)'
+// this needs to be set to the jdk that ought to be used to build releases on the branch the Jenkinsfile is stored in.
+jdk "JDK 1.8 (latest)"
+  }
+  steps {
+sh """#!/bin/bash -e
+  echo "Setting up directories"
+  rm -rf "${env.OUTPUTDIR}" && mkdir "${env.OUTPUTDIR}"
+  rm -rf ".m2-repo" && mkdir ".m2-repo"
+  mkdir "${env.OUTPUTDIR}/machine"
+"""
+dir ("component") {
+  sh '''#!/bin/bash -e
+./dev-support/gather_machine_environment.sh \
+"${OUTPUT_RELATIVE}/machine"
+./dev-support/adhoc_run_tests/adhoc_run_tests.sh \
+--force-timeout 1800 \
+--maven-local-repo ".m2-repo" \
+--log-output "${OUTPUTDIR}" \
+--repeat 100 \
+"${tests}"
+'''
+}
+  }
+  post {
+failure {
+  archive 'output/*'
+  archive 'output/**/*'
+  archive 'component/**/target/surefire-reports/*'
+}
+  }
+}
+  }
+}

http://git-wip-us.apache.org/repos/asf/hbase/blob/bfe926ad/dev-support/adhoc_run_tests/adhoc_run_tests.sh
--
diff --git a/dev-support/adhoc_run_tests/adhoc_run_tests.sh b/dev-support/adhoc_run_tests/adhoc_run_tests.sh
new file mode 100755
index 000..3e61017
--- /dev/null
+++ b/dev-support/adhoc_run_tests/adhoc_run_tests.sh
@@ -0,0 +1,83 @@
+#!/usr/bin/env bash
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License 

[1/2] hbase git commit: HBASE-19111 Add CellUtil#isPut and deprecate methods returning/expecting non public-api data

2017-11-06 Thread elserj
Repository: hbase
Updated Branches:
  refs/heads/branch-2 43b4aab64 -> f4a4144f3
  refs/heads/master 33ede5516 -> 2a99b87af


HBASE-19111 Add CellUtil#isPut and deprecate methods returning/expecting non public-api data

KeyValue.Type, and its corresponding byte value, are not public API. We
shouldn't have methods that are expecting them. Added a basic sanity
test for isPut and isDelete.

Signed-off-by: Ramkrishna 


Project: http://git-wip-us.apache.org/repos/asf/hbase/repo
Commit: http://git-wip-us.apache.org/repos/asf/hbase/commit/2a99b87a
Tree: http://git-wip-us.apache.org/repos/asf/hbase/tree/2a99b87a
Diff: http://git-wip-us.apache.org/repos/asf/hbase/diff/2a99b87a

Branch: refs/heads/master
Commit: 2a99b87af2ebe289e2fec94c9cdca0942397977d
Parents: 33ede55
Author: Josh Elser 
Authored: Fri Oct 27 19:27:59 2017 -0400
Committer: Josh Elser 
Committed: Mon Nov 6 15:37:12 2017 -0500

--
 .../main/java/org/apache/hadoop/hbase/Cell.java |  3 +
 .../java/org/apache/hadoop/hbase/CellUtil.java  |  9 +++
 .../hadoop/hbase/client/TestFromClientSide.java | 73 +++-
 3 files changed, 66 insertions(+), 19 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/hbase/blob/2a99b87a/hbase-common/src/main/java/org/apache/hadoop/hbase/Cell.java
--
diff --git a/hbase-common/src/main/java/org/apache/hadoop/hbase/Cell.java b/hbase-common/src/main/java/org/apache/hadoop/hbase/Cell.java
index b2f6304..f5833c8 100644
--- a/hbase-common/src/main/java/org/apache/hadoop/hbase/Cell.java
+++ b/hbase-common/src/main/java/org/apache/hadoop/hbase/Cell.java
@@ -133,7 +133,10 @@ public interface Cell {
 
   /**
   * @return The byte representation of the KeyValue.TYPE of this cell: one of Put, Delete, etc
+   * @deprecated since 2.0.0, use appropriate {@link CellUtil#isDelete} or
+   *{@link CellUtil#isPut(Cell)} methods instead. This will be removed in 3.0.0.
*/
+  @Deprecated
   byte getTypeByte();
 
 

http://git-wip-us.apache.org/repos/asf/hbase/blob/2a99b87a/hbase-common/src/main/java/org/apache/hadoop/hbase/CellUtil.java
--
diff --git a/hbase-common/src/main/java/org/apache/hadoop/hbase/CellUtil.java b/hbase-common/src/main/java/org/apache/hadoop/hbase/CellUtil.java
index 78f12b5..52eb8fa 100644
--- a/hbase-common/src/main/java/org/apache/hadoop/hbase/CellUtil.java
+++ b/hbase-common/src/main/java/org/apache/hadoop/hbase/CellUtil.java
@@ -893,6 +893,7 @@ public final class CellUtil {
* {KeyValue.Type#DeleteFamily} or a
* {@link KeyValue.Type#DeleteColumn} KeyValue type.
*/
+  @SuppressWarnings("deprecation")
   public static boolean isDelete(final Cell cell) {
 return PrivateCellUtil.isDelete(cell.getTypeByte());
   }
@@ -962,6 +963,14 @@ public final class CellUtil {
   }
 
   /**
+   * @return True if this cell is a Put.
+   */
+  @SuppressWarnings("deprecation")
+  public static boolean isPut(Cell cell) {
+return cell.getTypeByte() == Type.Put.getCode();
+  }
+
+  /**
   * Estimate based on keyvalue's serialization format in the RPC layer. Note that there is an extra
   * SIZEOF_INT added to the size here that indicates the actual length of the cell for cases where
   * cell's are serialized in a contiguous format (For eg in RPCs).
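The new `CellUtil.isPut` compares a cell's type byte against the Put code so that callers never touch the non-public `KeyValue.Type` directly. A toy model of that code-byte dispatch is below; the enum and its byte values are illustrative stand-ins, not HBase's actual definitions.

```java
// Toy model of cell-type dispatch by code byte, in the style of CellUtil#isPut.
// CellType and its byte codes are illustrative, not HBase's KeyValue.Type.
public final class CellTypes {
  enum CellType {
    PUT((byte) 4), DELETE((byte) 8);  // hypothetical code values
    final byte code;
    CellType(byte code) { this.code = code; }
  }

  /** True when the type byte carries the Put code. */
  static boolean isPut(byte typeByte) {
    return typeByte == CellType.PUT.code;
  }

  /** True when the type byte carries the Delete code. */
  static boolean isDelete(byte typeByte) {
    return typeByte == CellType.DELETE.code;
  }
}
```

Wrapping the comparison in named predicates is exactly what lets `Cell#getTypeByte` be deprecated: the raw byte stays an implementation detail behind a stable public check.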

http://git-wip-us.apache.org/repos/asf/hbase/blob/2a99b87a/hbase-server/src/test/java/org/apache/hadoop/hbase/client/TestFromClientSide.java
--
diff --git a/hbase-server/src/test/java/org/apache/hadoop/hbase/client/TestFromClientSide.java b/hbase-server/src/test/java/org/apache/hadoop/hbase/client/TestFromClientSide.java
index 804f821..02d3797 100644
--- a/hbase-server/src/test/java/org/apache/hadoop/hbase/client/TestFromClientSide.java
+++ b/hbase-server/src/test/java/org/apache/hadoop/hbase/client/TestFromClientSide.java
@@ -50,6 +50,7 @@ import org.apache.commons.logging.Log;
 import org.apache.commons.logging.LogFactory;
 import org.apache.hadoop.conf.Configuration;
 import org.apache.hadoop.hbase.Cell;
+import org.apache.hadoop.hbase.CellScanner;
 import org.apache.hadoop.hbase.CellUtil;
 import org.apache.hadoop.hbase.ClusterStatus.Option;
 import org.apache.hadoop.hbase.CompareOperator;
@@ -132,9 +133,6 @@ public class TestFromClientSide {
   @Rule
   public TestName name = new TestName();
 
-  /**
-   * @throws java.lang.Exception
-   */
   @BeforeClass
   public static void setUpBeforeClass() throws Exception {
 // Uncomment the following lines if more verbosity is needed for
@@ -151,9 +149,6 @@ public class TestFromClientSide {
 TEST_UTIL.startMiniCluster(SLAVES);
   }
 
-  /**
-   * @throws 

[2/2] hbase git commit: HBASE-19111 Add CellUtil#isPut and deprecate methods returning/expecting non public-api data

2017-11-06 Thread elserj
HBASE-19111 Add CellUtil#isPut and deprecate methods returning/expecting non public-api data

KeyValue.Type, and its corresponding byte value, are not public API. We
shouldn't have methods that are expecting them. Added a basic sanity
test for isPut and isDelete.
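The migration this commit implies can be sketched without an HBase dependency. In the stub below, `CellTypeSketch`, its nested `Type` enum, and the `Cell` interface are simplified stand-ins (not the real HBase classes); the byte codes mirror `KeyValue.Type` in the HBase source, and `isPut`/`isDelete` mirror the new `CellUtil` methods shown in the diff.

```java
// Sketch only: Type and Cell below are stand-ins, not the real HBase classes.
public class CellTypeSketch {

    // Stand-in for KeyValue.Type; the byte codes mirror the HBase source.
    enum Type {
        PUT((byte) 4), DELETE((byte) 8), DELETE_COLUMN((byte) 12), DELETE_FAMILY((byte) 14);
        final byte code;
        Type(byte code) { this.code = code; }
    }

    // Stand-in for the Cell interface; getTypeByte() is what HBASE-19111 deprecates.
    interface Cell {
        byte getTypeByte();
    }

    // Mirrors the new CellUtil.isPut(Cell): cell.getTypeByte() == Type.Put.getCode()
    static boolean isPut(Cell cell) {
        return cell.getTypeByte() == Type.PUT.code;
    }

    // Simplified version of CellUtil.isDelete(Cell): true for any delete-type code.
    static boolean isDelete(Cell cell) {
        byte b = cell.getTypeByte();
        return b == Type.DELETE.code || b == Type.DELETE_COLUMN.code || b == Type.DELETE_FAMILY.code;
    }

    public static void main(String[] args) {
        Cell put = () -> (byte) 4;
        Cell delete = () -> (byte) 8;
        // Callers migrate from comparing raw type bytes themselves
        // to asking the utility methods.
        System.out.println(isPut(put));       // prints "true"
        System.out.println(isDelete(delete)); // prints "true"
    }
}
```

The point of the deprecation is that callers never need the raw byte once such predicates exist, so the non-public `KeyValue.Type` codes stay an implementation detail.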

Signed-off-by: Ramkrishna 


Project: http://git-wip-us.apache.org/repos/asf/hbase/repo
Commit: http://git-wip-us.apache.org/repos/asf/hbase/commit/f4a4144f
Tree: http://git-wip-us.apache.org/repos/asf/hbase/tree/f4a4144f
Diff: http://git-wip-us.apache.org/repos/asf/hbase/diff/f4a4144f

Branch: refs/heads/branch-2
Commit: f4a4144f35f962c594c18d710eceb4c23ec63ebf
Parents: 43b4aab
Author: Josh Elser 
Authored: Fri Oct 27 19:27:59 2017 -0400
Committer: Josh Elser 
Committed: Mon Nov 6 15:37:16 2017 -0500

--
 .../main/java/org/apache/hadoop/hbase/Cell.java |  3 +
 .../java/org/apache/hadoop/hbase/CellUtil.java  |  9 +++
 .../hadoop/hbase/client/TestFromClientSide.java | 73 +++-
 3 files changed, 66 insertions(+), 19 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/hbase/blob/f4a4144f/hbase-common/src/main/java/org/apache/hadoop/hbase/Cell.java
--
diff --git a/hbase-common/src/main/java/org/apache/hadoop/hbase/Cell.java b/hbase-common/src/main/java/org/apache/hadoop/hbase/Cell.java
index b2f6304..f5833c8 100644
--- a/hbase-common/src/main/java/org/apache/hadoop/hbase/Cell.java
+++ b/hbase-common/src/main/java/org/apache/hadoop/hbase/Cell.java
@@ -133,7 +133,10 @@ public interface Cell {
 
   /**
   * @return The byte representation of the KeyValue.TYPE of this cell: one of Put, Delete, etc
+   * @deprecated since 2.0.0, use appropriate {@link CellUtil#isDelete} or
+   *             {@link CellUtil#isPut(Cell)} methods instead. This will be removed in 3.0.0.
   */
+  @Deprecated
   byte getTypeByte();
 
 

http://git-wip-us.apache.org/repos/asf/hbase/blob/f4a4144f/hbase-common/src/main/java/org/apache/hadoop/hbase/CellUtil.java
--
diff --git a/hbase-common/src/main/java/org/apache/hadoop/hbase/CellUtil.java b/hbase-common/src/main/java/org/apache/hadoop/hbase/CellUtil.java
index 206a897..30283f1 100644
--- a/hbase-common/src/main/java/org/apache/hadoop/hbase/CellUtil.java
+++ b/hbase-common/src/main/java/org/apache/hadoop/hbase/CellUtil.java
@@ -927,6 +927,7 @@ public final class CellUtil {
   * @return True if a delete type, a {@link KeyValue.Type#Delete} or a {KeyValue.Type#DeleteFamily}
   * or a {@link KeyValue.Type#DeleteColumn} KeyValue type.
   */
+  @SuppressWarnings("deprecation")
   public static boolean isDelete(final Cell cell) {
 return PrivateCellUtil.isDelete(cell.getTypeByte());
   }
@@ -993,6 +994,14 @@ public final class CellUtil {
   }
 
   /**
+   * @return True if this cell is a Put.
+   */
+  @SuppressWarnings("deprecation")
+  public static boolean isPut(Cell cell) {
+return cell.getTypeByte() == Type.Put.getCode();
+  }
+
+  /**
   * Estimate based on keyvalue's serialization format in the RPC layer. Note that there is an extra
   * SIZEOF_INT added to the size here that indicates the actual length of the cell for cases where
   * cell's are serialized in a contiguous format (For eg in RPCs).

http://git-wip-us.apache.org/repos/asf/hbase/blob/f4a4144f/hbase-server/src/test/java/org/apache/hadoop/hbase/client/TestFromClientSide.java
--
diff --git a/hbase-server/src/test/java/org/apache/hadoop/hbase/client/TestFromClientSide.java b/hbase-server/src/test/java/org/apache/hadoop/hbase/client/TestFromClientSide.java
index 804f821..02d3797 100644
--- a/hbase-server/src/test/java/org/apache/hadoop/hbase/client/TestFromClientSide.java
+++ b/hbase-server/src/test/java/org/apache/hadoop/hbase/client/TestFromClientSide.java
@@ -50,6 +50,7 @@ import org.apache.commons.logging.Log;
 import org.apache.commons.logging.LogFactory;
 import org.apache.hadoop.conf.Configuration;
 import org.apache.hadoop.hbase.Cell;
+import org.apache.hadoop.hbase.CellScanner;
 import org.apache.hadoop.hbase.CellUtil;
 import org.apache.hadoop.hbase.ClusterStatus.Option;
 import org.apache.hadoop.hbase.CompareOperator;
@@ -132,9 +133,6 @@ public class TestFromClientSide {
   @Rule
   public TestName name = new TestName();
 
-  /**
-   * @throws java.lang.Exception
-   */
   @BeforeClass
   public static void setUpBeforeClass() throws Exception {
 // Uncomment the following lines if more verbosity is needed for
@@ -151,9 +149,6 @@ public class TestFromClientSide {
 TEST_UTIL.startMiniCluster(SLAVES);
   }
 
-  /**
-   * @throws java.lang.Exception
-   */
   @AfterClass
   public static void 

hbase git commit: HBASE-19189 Ad-hoc test job for running a subset of tests lots of times [Forced Update!]

2017-11-06 Thread busbey
Repository: hbase
Updated Branches:
  refs/heads/HBASE-19189 ccafc2ad1 -> ae2e1972e (forced update)


HBASE-19189 Ad-hoc test job for running a subset of tests lots of times


Project: http://git-wip-us.apache.org/repos/asf/hbase/repo
Commit: http://git-wip-us.apache.org/repos/asf/hbase/commit/ae2e1972
Tree: http://git-wip-us.apache.org/repos/asf/hbase/tree/ae2e1972
Diff: http://git-wip-us.apache.org/repos/asf/hbase/diff/ae2e1972

Branch: refs/heads/HBASE-19189
Commit: ae2e1972e113b7cc2a84c8d117653f053520d0c9
Parents: 28cdf4a
Author: Sean Busbey 
Authored: Mon Nov 6 13:48:05 2017 -0600
Committer: Sean Busbey 
Committed: Mon Nov 6 14:29:27 2017 -0600

--
 dev-support/adhoc_run_tests/Jenkinsfile| 70 +
 dev-support/adhoc_run_tests/adhoc_run_tests.sh | 83 +
 dev-support/gather_machine_environment.sh  | 50 +
 3 files changed, 203 insertions(+)
--


http://git-wip-us.apache.org/repos/asf/hbase/blob/ae2e1972/dev-support/adhoc_run_tests/Jenkinsfile
--
diff --git a/dev-support/adhoc_run_tests/Jenkinsfile b/dev-support/adhoc_run_tests/Jenkinsfile
new file mode 100644
index 000..610f1dd
--- /dev/null
+++ b/dev-support/adhoc_run_tests/Jenkinsfile
@@ -0,0 +1,70 @@
+// Licensed to the Apache Software Foundation (ASF) under one
+// or more contributor license agreements.  See the NOTICE file
+// distributed with this work for additional information
+// regarding copyright ownership.  The ASF licenses this file
+// to you under the Apache License, Version 2.0 (the
+// "License"); you may not use this file except in compliance
+// with the License.  You may obtain a copy of the License at
+//
+//   http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing,
+// software distributed under the License is distributed on an
+// "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+// KIND, either express or implied.  See the License for the
+// specific language governing permissions and limitations
+// under the License.
+pipeline {
+  agent {
+node {
+  label 'ubuntu'
+}
+  }
+  options {
+timeout (time: 6, unit: 'HOURS')
+timestamps()
+  }
+  environment {
+// where we check out to across stages
+BASEDIR = "${env.WORKSPACE}/component"
+OUTPUT_RELATIVE = 'output'
+OUTPUTDIR = "${env.WORKSPACE}/output"
+BRANCH_SPECIFIC_DOCKERFILE = "${env.BASEDIR}/dev-support/docker/Dockerfile"
+  }
+  stages {
+stage ('run tests') {
+  tools {
+maven 'Maven (latest)'
+// this needs to be set to the jdk that ought to be used to build releases on the branch the Jenkinsfile is stored in.
+jdk "JDK 1.8 (latest)"
+  }
+  steps {
+sh """#!/bin/bash -e
+  echo "Setting up directories"
+  rm -rf "${env.OUTPUTDIR}" && mkdir "${env.OUTPUTDIR}"
+  rm -rf ".m2-repo" && mkdir ".m2-repo"
+  mkdir "${env.OUTPUTDIR}/machine"
+"""
+sh """#!/bin/bash -e
+  ${env.BASEDIR}/dev-support/gather_machine_environment.sh \
+  "${env.OUTPUT_RELATIVE}/machine"
+"""
+sh """#!/bin/bash -e
+  ${env.BASEDIR}/dev-support/adhoc_run_tests/adhoc_run_tests.sh \
+  --force-timeout 1800 \
+  --maven-local-repo ".m2-repo" \
+  --log-output "${OUTPUTDIR}" \
+  --repeat 100 \
+  "${tests}"
+"""
+  }
+  post {
+failure {
+  archive 'output/*'
+  archive 'output/**/*'
+  archive 'component/**/target/surefire-reports/*'
+}
+  }
+}
+  }
+}

http://git-wip-us.apache.org/repos/asf/hbase/blob/ae2e1972/dev-support/adhoc_run_tests/adhoc_run_tests.sh
--
diff --git a/dev-support/adhoc_run_tests/adhoc_run_tests.sh b/dev-support/adhoc_run_tests/adhoc_run_tests.sh
new file mode 100755
index 000..3e61017
--- /dev/null
+++ b/dev-support/adhoc_run_tests/adhoc_run_tests.sh
@@ -0,0 +1,83 @@
+#!/usr/bin/env bash
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the 

hbase git commit: HBASE-19189 Ad-hoc test job for running a subset of tests lots of times [Forced Update!]

2017-11-06 Thread busbey
Repository: hbase
Updated Branches:
  refs/heads/HBASE-19189 04fc9a8f0 -> ccafc2ad1 (forced update)


HBASE-19189 Ad-hoc test job for running a subset of tests lots of times


Project: http://git-wip-us.apache.org/repos/asf/hbase/repo
Commit: http://git-wip-us.apache.org/repos/asf/hbase/commit/ccafc2ad
Tree: http://git-wip-us.apache.org/repos/asf/hbase/tree/ccafc2ad
Diff: http://git-wip-us.apache.org/repos/asf/hbase/diff/ccafc2ad

Branch: refs/heads/HBASE-19189
Commit: ccafc2ad1307d2efd587bfec2f463168e0fa1772
Parents: 28cdf4a
Author: Sean Busbey 
Authored: Mon Nov 6 13:48:05 2017 -0600
Committer: Sean Busbey 
Committed: Mon Nov 6 14:25:47 2017 -0600

--
 dev-support/adhoc_run_tests/Jenkinsfile| 74 ++
 dev-support/adhoc_run_tests/adhoc_run_tests.sh | 83 +
 dev-support/gather_machine_environment.sh  | 50 +
 3 files changed, 207 insertions(+)
--


http://git-wip-us.apache.org/repos/asf/hbase/blob/ccafc2ad/dev-support/adhoc_run_tests/Jenkinsfile
--
diff --git a/dev-support/adhoc_run_tests/Jenkinsfile b/dev-support/adhoc_run_tests/Jenkinsfile
new file mode 100644
index 000..95dc75f
--- /dev/null
+++ b/dev-support/adhoc_run_tests/Jenkinsfile
@@ -0,0 +1,74 @@
+// Licensed to the Apache Software Foundation (ASF) under one
+// or more contributor license agreements.  See the NOTICE file
+// distributed with this work for additional information
+// regarding copyright ownership.  The ASF licenses this file
+// to you under the Apache License, Version 2.0 (the
+// "License"); you may not use this file except in compliance
+// with the License.  You may obtain a copy of the License at
+//
+//   http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing,
+// software distributed under the License is distributed on an
+// "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+// KIND, either express or implied.  See the License for the
+// specific language governing permissions and limitations
+// under the License.
+pipeline {
+  agent {
+node {
+  label 'ubuntu'
+}
+  }
+  options {
+timeout (time: 6, unit: 'HOURS')
+timestamps()
+skipDefaultCheckout()
+  }
+  environment {
+// where we check out to across stages
+BASEDIR = "${env.WORKSPACE}/component"
+OUTPUT_RELATIVE = 'output'
+OUTPUTDIR = "${env.WORKSPACE}/output"
+BRANCH_SPECIFIC_DOCKERFILE = "${env.BASEDIR}/dev-support/docker/Dockerfile"
+  }
+  stages {
+stage ('run tests') {
+  tools {
+maven 'Maven (latest)'
+// this needs to be set to the jdk that ought to be used to build releases on the branch the Jenkinsfile is stored in.
+jdk "JDK 1.8 (latest)"
+  }
+  steps {
+dir('component') {
+  checkout scm
+}
+sh """#!/bin/bash -e
+  echo "Setting up directories"
+  rm -rf "${env.OUTPUTDIR}" && mkdir "${env.OUTPUTDIR}"
+  rm -rf ".m2-repo" && mkdir ".m2-repo"
+  mkdir "${env.OUTPUTDIR}/machine"
+"""
+sh """#!/bin/bash -e
+  ${env.BASEDIR}/dev-support/gather_machine_environment.sh \
+  "${env.OUTPUT_RELATIVE}/machine"
+"""
+sh """#!/bin/bash -e
+  ${env.BASEDIR}/dev-support/adhoc_run_tests/adhoc_run_tests.sh \
+  --force-timeout 1800 \
+  --maven-local-repo ".m2-repo" \
+  --log-output "${OUTPUTDIR}" \
+  --repeat 100 \
+  "${tests}"
+"""
+  }
+  post {
+failure {
+  archive 'output/*'
+  archive 'output/**/*'
+  archive 'component/**/target/surefire-reports/*'
+}
+  }
+}
+  }
+}

http://git-wip-us.apache.org/repos/asf/hbase/blob/ccafc2ad/dev-support/adhoc_run_tests/adhoc_run_tests.sh
--
diff --git a/dev-support/adhoc_run_tests/adhoc_run_tests.sh b/dev-support/adhoc_run_tests/adhoc_run_tests.sh
new file mode 100755
index 000..3e61017
--- /dev/null
+++ b/dev-support/adhoc_run_tests/adhoc_run_tests.sh
@@ -0,0 +1,83 @@
+#!/usr/bin/env bash
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" 

hbase git commit: HBASE-19189 Ad-hoc test job for running a subset of tests lots of times [Forced Update!]

2017-11-06 Thread busbey
Repository: hbase
Updated Branches:
  refs/heads/HBASE-19189 e42744e64 -> 04fc9a8f0 (forced update)


HBASE-19189 Ad-hoc test job for running a subset of tests lots of times


Project: http://git-wip-us.apache.org/repos/asf/hbase/repo
Commit: http://git-wip-us.apache.org/repos/asf/hbase/commit/04fc9a8f
Tree: http://git-wip-us.apache.org/repos/asf/hbase/tree/04fc9a8f
Diff: http://git-wip-us.apache.org/repos/asf/hbase/diff/04fc9a8f

Branch: refs/heads/HBASE-19189
Commit: 04fc9a8f0a718a912e84a5d5d83298e16abe4e04
Parents: 28cdf4a
Author: Sean Busbey 
Authored: Mon Nov 6 13:48:05 2017 -0600
Committer: Sean Busbey 
Committed: Mon Nov 6 14:23:14 2017 -0600

--
 dev-support/adhoc_run_tests/Jenkinsfile| 79 
 dev-support/adhoc_run_tests/adhoc_run_tests.sh | 83 +
 dev-support/gather_machine_environment.sh  | 50 +
 3 files changed, 212 insertions(+)
--


http://git-wip-us.apache.org/repos/asf/hbase/blob/04fc9a8f/dev-support/adhoc_run_tests/Jenkinsfile
--
diff --git a/dev-support/adhoc_run_tests/Jenkinsfile b/dev-support/adhoc_run_tests/Jenkinsfile
new file mode 100644
index 000..928735d
--- /dev/null
+++ b/dev-support/adhoc_run_tests/Jenkinsfile
@@ -0,0 +1,79 @@
+// Licensed to the Apache Software Foundation (ASF) under one
+// or more contributor license agreements.  See the NOTICE file
+// distributed with this work for additional information
+// regarding copyright ownership.  The ASF licenses this file
+// to you under the Apache License, Version 2.0 (the
+// "License"); you may not use this file except in compliance
+// with the License.  You may obtain a copy of the License at
+//
+//   http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing,
+// software distributed under the License is distributed on an
+// "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+// KIND, either express or implied.  See the License for the
+// specific language governing permissions and limitations
+// under the License.
+pipeline {
+  agent {
+node {
+  label 'ubuntu'
+}
+  }
+  options {
+timeout (time: 6, unit: 'HOURS')
+timestamps()
+skipDefaultCheckout()
+  }
+  environment {
+// where we check out to across stages
+BASEDIR = "${env.WORKSPACE}/component"
+OUTPUT_RELATIVE = 'output'
+OUTPUTDIR = "${env.WORKSPACE}/output"
+BRANCH_SPECIFIC_DOCKERFILE = "${env.BASEDIR}/dev-support/docker/Dockerfile"
+  }
+  stages {
+//stage ('scm checkout') {
+//  steps {
+//dir('component') {
+//  checkout scm
+//}
+//  }
+//}
+stage ('run tests') {
+  tools {
+maven 'Maven (latest)'
+// this needs to be set to the jdk that ought to be used to build releases on the branch the Jenkinsfile is stored in.
+jdk "JDK 1.8 (latest)"
+  }
+  steps {
+sh """#!/bin/bash -e
+  echo "Setting up directories"
+  rm -rf "${env.OUTPUTDIR}" && mkdir "${env.OUTPUTDIR}"
+  rm -rf ".m2-repo" && mkdir ".m2-repo"
+  mkdir "${env.OUTPUTDIR}/machine"
+  ls -lahR
+"""
+sh """#!/bin/bash -e
+  ${env.BASEDIR}/dev-support/gather_machine_environment.sh \
+  "${env.OUTPUT_RELATIVE}/machine"
+"""
+sh """#!/bin/bash -e
+  ${env.BASEDIR}/dev-support/adhoc_run_tests/adhoc_run_tests.sh \
+  --force-timeout 1800 \
+  --maven-local-repo ".m2-repo" \
+  --log-output "${OUTPUTDIR}" \
+  --repeat 100 \
+  "${tests}"
+"""
+  }
+  post {
+failure {
+  archive 'output/*'
+  archive 'output/**/*'
+  archive 'component/**/target/surefire-reports/*'
+}
+  }
+}
+  }
+}

http://git-wip-us.apache.org/repos/asf/hbase/blob/04fc9a8f/dev-support/adhoc_run_tests/adhoc_run_tests.sh
--
diff --git a/dev-support/adhoc_run_tests/adhoc_run_tests.sh b/dev-support/adhoc_run_tests/adhoc_run_tests.sh
new file mode 100755
index 000..3e61017
--- /dev/null
+++ b/dev-support/adhoc_run_tests/adhoc_run_tests.sh
@@ -0,0 +1,83 @@
+#!/usr/bin/env bash
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or 

hbase git commit: HBASE-19131 (Addendum) Use the emptyList() to replace EMPTY_LIST

2017-11-06 Thread chia7712
Repository: hbase
Updated Branches:
  refs/heads/master 9ee8e2714 -> 33ede5516


HBASE-19131 (Addendum) Use the emptyList() to replace EMPTY_LIST


Project: http://git-wip-us.apache.org/repos/asf/hbase/repo
Commit: http://git-wip-us.apache.org/repos/asf/hbase/commit/33ede551
Tree: http://git-wip-us.apache.org/repos/asf/hbase/tree/33ede551
Diff: http://git-wip-us.apache.org/repos/asf/hbase/diff/33ede551

Branch: refs/heads/master
Commit: 33ede55164421b40c0bfe1c9d47c1db6701265c2
Parents: 9ee8e27
Author: Chia-Ping Tsai 
Authored: Tue Nov 7 04:06:00 2017 +0800
Committer: Chia-Ping Tsai 
Committed: Tue Nov 7 04:06:00 2017 +0800

--
 .../src/main/java/org/apache/hadoop/hbase/ClusterStatus.java   | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/hbase/blob/33ede551/hbase-client/src/main/java/org/apache/hadoop/hbase/ClusterStatus.java
--
diff --git a/hbase-client/src/main/java/org/apache/hadoop/hbase/ClusterStatus.java b/hbase-client/src/main/java/org/apache/hadoop/hbase/ClusterStatus.java
index 13c5bac..351b0c8 100644
--- a/hbase-client/src/main/java/org/apache/hadoop/hbase/ClusterStatus.java
+++ b/hbase-client/src/main/java/org/apache/hadoop/hbase/ClusterStatus.java
@@ -138,7 +138,7 @@ public class ClusterStatus {
*/
   public List getDeadServerNames() {
 if (deadServers == null) {
-  return Collections.EMPTY_LIST;
+  return Collections.emptyList();
 }
 return Collections.unmodifiableList(deadServers);
   }
@@ -256,7 +256,7 @@ public class ClusterStatus {
 
   public Collection getServers() {
 if (liveServers == null) {
-  return Collections.EMPTY_LIST;
+  return Collections.emptyList();
 }
 return Collections.unmodifiableCollection(this.liveServers.keySet());
   }
@@ -281,7 +281,7 @@ public class ClusterStatus {
*/
   public List getBackupMasters() {
 if (backupMasters == null) {
-  return Collections.EMPTY_LIST;
+  return Collections.emptyList();
 }
 return Collections.unmodifiableList(this.backupMasters);
   }
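The addendum above replaces the raw `Collections.EMPTY_LIST` constant with the generic `Collections.emptyList()` factory. A minimal standalone sketch of the difference (the class and method names below are illustrative, not from HBase):

```java
import java.util.Collections;
import java.util.List;

// Illustrates the HBASE-19131 addendum: Collections.EMPTY_LIST is a raw List,
// so returning it from a generic signature requires an unchecked conversion;
// Collections.emptyList() infers the element type and compiles cleanly.
public class EmptyListDemo {

    @SuppressWarnings("unchecked")
    static List<String> beforeStyle() {
        return Collections.EMPTY_LIST; // raw type: unchecked warning without the suppression
    }

    static List<String> afterStyle() {
        return Collections.emptyList(); // type-safe: element type inferred as String
    }

    public static void main(String[] args) {
        // Both hand back an immutable empty list; only the type safety differs.
        System.out.println(beforeStyle().isEmpty()); // prints "true"
        System.out.println(afterStyle().isEmpty());  // prints "true"
    }
}
```

This is why the change needs no behavioral tests: the returned list is the same, but the `emptyList()` call sites no longer trigger unchecked-conversion warnings.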



hbase git commit: HBASE-19131 (Addendum) Use the emptyList() to replace EMPTY_LIST

2017-11-06 Thread chia7712
Repository: hbase
Updated Branches:
  refs/heads/branch-2 50f30668b -> 43b4aab64


HBASE-19131 (Addendum) Use the emptyList() to replace EMPTY_LIST


Project: http://git-wip-us.apache.org/repos/asf/hbase/repo
Commit: http://git-wip-us.apache.org/repos/asf/hbase/commit/43b4aab6
Tree: http://git-wip-us.apache.org/repos/asf/hbase/tree/43b4aab6
Diff: http://git-wip-us.apache.org/repos/asf/hbase/diff/43b4aab6

Branch: refs/heads/branch-2
Commit: 43b4aab6485a9cb2c4c2db62d5e5fb533f827f52
Parents: 50f3066
Author: Chia-Ping Tsai 
Authored: Tue Nov 7 04:06:00 2017 +0800
Committer: Chia-Ping Tsai 
Committed: Tue Nov 7 04:06:29 2017 +0800

--
 .../src/main/java/org/apache/hadoop/hbase/ClusterStatus.java   | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/hbase/blob/43b4aab6/hbase-client/src/main/java/org/apache/hadoop/hbase/ClusterStatus.java
--
diff --git a/hbase-client/src/main/java/org/apache/hadoop/hbase/ClusterStatus.java b/hbase-client/src/main/java/org/apache/hadoop/hbase/ClusterStatus.java
index 13c5bac..351b0c8 100644
--- a/hbase-client/src/main/java/org/apache/hadoop/hbase/ClusterStatus.java
+++ b/hbase-client/src/main/java/org/apache/hadoop/hbase/ClusterStatus.java
@@ -138,7 +138,7 @@ public class ClusterStatus {
*/
   public List getDeadServerNames() {
 if (deadServers == null) {
-  return Collections.EMPTY_LIST;
+  return Collections.emptyList();
 }
 return Collections.unmodifiableList(deadServers);
   }
@@ -256,7 +256,7 @@ public class ClusterStatus {
 
   public Collection getServers() {
 if (liveServers == null) {
-  return Collections.EMPTY_LIST;
+  return Collections.emptyList();
 }
 return Collections.unmodifiableCollection(this.liveServers.keySet());
   }
@@ -281,7 +281,7 @@ public class ClusterStatus {
*/
   public List getBackupMasters() {
 if (backupMasters == null) {
-  return Collections.EMPTY_LIST;
+  return Collections.emptyList();
 }
 return Collections.unmodifiableList(this.backupMasters);
   }



hbase git commit: HBASE-19189 Ad-hoc test job for running a subset of tests lots of times [Forced Update!]

2017-11-06 Thread busbey
Repository: hbase
Updated Branches:
  refs/heads/HBASE-19189 0b66df4f5 -> e42744e64 (forced update)


HBASE-19189 Ad-hoc test job for running a subset of tests lots of times


Project: http://git-wip-us.apache.org/repos/asf/hbase/repo
Commit: http://git-wip-us.apache.org/repos/asf/hbase/commit/e42744e6
Tree: http://git-wip-us.apache.org/repos/asf/hbase/tree/e42744e6
Diff: http://git-wip-us.apache.org/repos/asf/hbase/diff/e42744e6

Branch: refs/heads/HBASE-19189
Commit: e42744e6456248986627a64567881bc8fca61f66
Parents: 28cdf4a
Author: Sean Busbey 
Authored: Mon Nov 6 13:48:05 2017 -0600
Committer: Sean Busbey 
Committed: Mon Nov 6 14:18:07 2017 -0600

--
 dev-support/adhoc_run_tests/Jenkinsfile| 78 +++
 dev-support/adhoc_run_tests/adhoc_run_tests.sh | 83 +
 dev-support/gather_machine_environment.sh  | 50 +
 3 files changed, 211 insertions(+)
--


http://git-wip-us.apache.org/repos/asf/hbase/blob/e42744e6/dev-support/adhoc_run_tests/Jenkinsfile
--
diff --git a/dev-support/adhoc_run_tests/Jenkinsfile b/dev-support/adhoc_run_tests/Jenkinsfile
new file mode 100644
index 000..429b03a
--- /dev/null
+++ b/dev-support/adhoc_run_tests/Jenkinsfile
@@ -0,0 +1,78 @@
+// Licensed to the Apache Software Foundation (ASF) under one
+// or more contributor license agreements.  See the NOTICE file
+// distributed with this work for additional information
+// regarding copyright ownership.  The ASF licenses this file
+// to you under the Apache License, Version 2.0 (the
+// "License"); you may not use this file except in compliance
+// with the License.  You may obtain a copy of the License at
+//
+//   http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing,
+// software distributed under the License is distributed on an
+// "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+// KIND, either express or implied.  See the License for the
+// specific language governing permissions and limitations
+// under the License.
+pipeline {
+  agent {
+node {
+  label 'ubuntu'
+}
+  }
+  options {
+timeout (time: 6, unit: 'HOURS')
+timestamps()
+skipDefaultCheckout()
+  }
+  environment {
+// where we check out to across stages
+BASEDIR = "${env.WORKSPACE}/component"
+OUTPUT_RELATIVE = 'output'
+OUTPUTDIR = "${env.WORKSPACE}/output"
+BRANCH_SPECIFIC_DOCKERFILE = "${env.BASEDIR}/dev-support/docker/Dockerfile"
+  }
+  stages {
+//stage ('scm checkout') {
+//  steps {
+//dir('component') {
+//  checkout scm
+//}
+//  }
+//}
+stage ('run tests') {
+  tools {
+maven 'Maven (latest)'
+// this needs to be set to the jdk that ought to be used to build releases on the branch the Jenkinsfile is stored in.
+jdk "JDK 1.8 (latest)"
+  }
+  steps {
+sh """#!/bin/bash -e
+  echo "Setting up directories"
+  rm -rf "${env.OUTPUTDIR}" && mkdir "${env.OUTPUTDIR}"
+  rm -rf ".m2-repo" && mkdir ".m2-repo"
+  mkdir "${env.OUTPUTDIR}/machine"
+"""
+sh """#!/bin/bash -e
+  ${env.BASEDIR}/dev-support/gather_machine_environment.sh \
+  "${env.OUTPUT_RELATIVE}/machine"
+"""
+sh """#!/bin/bash -e
+  ${env.BASEDIR}/dev-support/adhoc_run_tests/adhoc_run_tests.sh \
+  --force-timeout 1800 \
+  --maven-local-repo ".m2-repo" \
+  --log-output "${OUTPUTDIR}" \
+  --repeat 100 \
+  "${tests}"
+"""
+  }
+  post {
+failure {
+  archive 'output/*'
+  archive 'output/**/*'
+  archive 'component/**/target/surefire-reports/*'
+}
+  }
+}
+  }
+}

http://git-wip-us.apache.org/repos/asf/hbase/blob/e42744e6/dev-support/adhoc_run_tests/adhoc_run_tests.sh
--
diff --git a/dev-support/adhoc_run_tests/adhoc_run_tests.sh b/dev-support/adhoc_run_tests/adhoc_run_tests.sh
new file mode 100755
index 000..3e61017
--- /dev/null
+++ b/dev-support/adhoc_run_tests/adhoc_run_tests.sh
@@ -0,0 +1,83 @@
+#!/usr/bin/env bash
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,

hbase git commit: HBASE-19189 Ad-hoc test job for running a subset of tests lots of times [Forced Update!]

2017-11-06 Thread busbey
Repository: hbase
Updated Branches:
  refs/heads/HBASE-19189 c0231a664 -> 0b66df4f5 (forced update)


HBASE-19189 Ad-hoc test job for running a subset of tests lots of times


Project: http://git-wip-us.apache.org/repos/asf/hbase/repo
Commit: http://git-wip-us.apache.org/repos/asf/hbase/commit/0b66df4f
Tree: http://git-wip-us.apache.org/repos/asf/hbase/tree/0b66df4f
Diff: http://git-wip-us.apache.org/repos/asf/hbase/diff/0b66df4f

Branch: refs/heads/HBASE-19189
Commit: 0b66df4f53e4b37447a5a9e7fdf0c0221f736691
Parents: 28cdf4a
Author: Sean Busbey 
Authored: Mon Nov 6 13:48:05 2017 -0600
Committer: Sean Busbey 
Committed: Mon Nov 6 14:15:10 2017 -0600

--
 dev-support/adhoc_run_tests/Jenkinsfile| 78 +++
 dev-support/adhoc_run_tests/adhoc_run_tests.sh | 83 +
 dev-support/gather_machine_environment.sh  | 45 +++
 3 files changed, 206 insertions(+)
--


http://git-wip-us.apache.org/repos/asf/hbase/blob/0b66df4f/dev-support/adhoc_run_tests/Jenkinsfile
--
diff --git a/dev-support/adhoc_run_tests/Jenkinsfile b/dev-support/adhoc_run_tests/Jenkinsfile
new file mode 100644
index 000..429b03a
--- /dev/null
+++ b/dev-support/adhoc_run_tests/Jenkinsfile
@@ -0,0 +1,78 @@
+// Licensed to the Apache Software Foundation (ASF) under one
+// or more contributor license agreements.  See the NOTICE file
+// distributed with this work for additional information
+// regarding copyright ownership.  The ASF licenses this file
+// to you under the Apache License, Version 2.0 (the
+// "License"); you may not use this file except in compliance
+// with the License.  You may obtain a copy of the License at
+//
+//   http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing,
+// software distributed under the License is distributed on an
+// "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+// KIND, either express or implied.  See the License for the
+// specific language governing permissions and limitations
+// under the License.
+pipeline {
+  agent {
+node {
+  label 'ubuntu'
+}
+  }
+  options {
+timeout (time: 6, unit: 'HOURS')
+timestamps()
+skipDefaultCheckout()
+  }
+  environment {
+// where we check out to across stages
+BASEDIR = "${env.WORKSPACE}/component"
+OUTPUT_RELATIVE = 'output'
+OUTPUTDIR = "${env.WORKSPACE}/output"
+BRANCH_SPECIFIC_DOCKERFILE = "${env.BASEDIR}/dev-support/docker/Dockerfile"
+  }
+  stages {
+//stage ('scm checkout') {
+//  steps {
+//dir('component') {
+//  checkout scm
+//}
+//  }
+//}
+stage ('run tests') {
+  tools {
+maven 'Maven (latest)'
+// this needs to be set to the jdk that ought to be used to build releases on the branch the Jenkinsfile is stored in.
+jdk "JDK 1.8 (latest)"
+  }
+  steps {
+sh """#!/bin/bash -e
+  echo "Setting up directories"
+  rm -rf "${env.OUTPUTDIR}" && mkdir "${env.OUTPUTDIR}"
+  rm -rf ".m2-repo" && mkdir ".m2-repo"
+  mkdir "${env.OUTPUTDIR}/machine"
+"""
+sh """#!/bin/bash -e
+  ${env.BASEDIR}/dev-support/gather_machine_environment.sh \
+  "${env.OUTPUT_RELATIVE}/machine"
+"""
+sh """#!/bin/bash -e
+  ${env.BASEDIR}/dev-support/adhoc_run_tests/adhoc_run_tests.sh \
+  --force-timeout 1800 \
+  --maven-local-repo ".m2-repo" \
+  --log-output "${OUTPUTDIR}" \
+  --repeat 100 \
+  "${tests}"
+"""
+  }
+  post {
+failure {
+  archive 'output/*'
+  archive 'output/**/*'
+  archive 'component/**/target/surefire-reports/*'
+}
+  }
+}
+  }
+}

http://git-wip-us.apache.org/repos/asf/hbase/blob/0b66df4f/dev-support/adhoc_run_tests/adhoc_run_tests.sh
--
diff --git a/dev-support/adhoc_run_tests/adhoc_run_tests.sh b/dev-support/adhoc_run_tests/adhoc_run_tests.sh
new file mode 100755
index 000..3e61017
--- /dev/null
+++ b/dev-support/adhoc_run_tests/adhoc_run_tests.sh
@@ -0,0 +1,83 @@
+#!/usr/bin/env bash
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,

hbase git commit: HBASE-19189 Ad-hoc test job for running a subset of tests lots of times

2017-11-06 Thread busbey
Repository: hbase
Updated Branches:
  refs/heads/HBASE-19189 [created] c0231a664


HBASE-19189 Ad-hoc test job for running a subset of tests lots of times


Project: http://git-wip-us.apache.org/repos/asf/hbase/repo
Commit: http://git-wip-us.apache.org/repos/asf/hbase/commit/c0231a66
Tree: http://git-wip-us.apache.org/repos/asf/hbase/tree/c0231a66
Diff: http://git-wip-us.apache.org/repos/asf/hbase/diff/c0231a66

Branch: refs/heads/HBASE-19189
Commit: c0231a66411a8510973db8473cc5f498725d01f7
Parents: 28cdf4a
Author: Sean Busbey 
Authored: Mon Nov 6 13:48:05 2017 -0600
Committer: Sean Busbey 
Committed: Mon Nov 6 13:49:54 2017 -0600

--
 dev-support/adhoc_run_tests/Jenkinsfile| 78 +++
 dev-support/adhoc_run_tests/adhoc_run_tests.sh | 83 +
 dev-support/gather_machine_environment.sh  | 45 +++
 3 files changed, 206 insertions(+)
--


http://git-wip-us.apache.org/repos/asf/hbase/blob/c0231a66/dev-support/adhoc_run_tests/Jenkinsfile
--
diff --git a/dev-support/adhoc_run_tests/Jenkinsfile 
b/dev-support/adhoc_run_tests/Jenkinsfile
new file mode 100644
index 000..429b03a
--- /dev/null
+++ b/dev-support/adhoc_run_tests/Jenkinsfile
@@ -0,0 +1,78 @@
+// Licensed to the Apache Software Foundation (ASF) under one
+// or more contributor license agreements.  See the NOTICE file
+// distributed with this work for additional information
+// regarding copyright ownership.  The ASF licenses this file
+// to you under the Apache License, Version 2.0 (the
+// "License"); you may not use this file except in compliance
+// with the License.  You may obtain a copy of the License at
+//
+//   http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing,
+// software distributed under the License is distributed on an
+// "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+// KIND, either express or implied.  See the License for the
+// specific language governing permissions and limitations
+// under the License.
+pipeline {
+  agent {
+node {
+  label 'ubuntu'
+}
+  }
+  options {
+timeout (time: 6, unit: 'HOURS')
+timestamps()
+skipDefaultCheckout()
+  }
+  environment {
+// where we check out to across stages
+BASEDIR = "${env.WORKSPACE}/component"
+OUTPUT_RELATIVE = 'output'
+OUTPUTDIR = "${env.WORKSPACE}/output"
+BRANCH_SPECIFIC_DOCKERFILE = "${env.BASEDIR}/dev-support/docker/Dockerfile"
+  }
+  stages {
+//stage ('scm checkout') {
+//  steps {
+//dir('component') {
+//  checkout scm
+//}
+//  }
+//}
+stage ('run tests') {
+  tools {
+maven 'Maven (latest)'
+// this needs to be set to the jdk that ought to be used to build releases on the branch the Jenkinsfile is stored in.
+jdk "JDK 1.8 (latest)"
+  }
+  steps {
+sh """#!/bin/bash -e
+  echo "Setting up directories"
+  rm -rf "${env.OUTPUTDIR}" && mkdir "${env.OUTPUTDIR}"
+  rm -rf ".m2-repo" && mkdir ".m2-repo"
+  mkdir "${env.OUTPUTDIR}/machine"
+"""
+sh """#!/bin/bash -e
+  ${env.BASEDIR}/dev-support/gather_machine_environment.sh \
+  "${env.OUTPUT_RELATIVE}/machine"
+"""
+sh """#!/bin/bash -e
+  ${env.BASEDIR}/dev-support/adhoc_run_tests/adhoc_run_tests.sh \
+  --force-timeout 1800 \
+  --maven-local-repo ".m2-repo" \
+  --log-output "${OUTPUTDIR}" \
+  --repeat 100 \
+  "${tests}"
+"""
+  }
+  post {
+failure {
+  archive 'output/*'
+  archive 'output/**/*'
+  archive 'component/**/target/surefire-reports/*'
+}
+  }
+}
+  }
+}

http://git-wip-us.apache.org/repos/asf/hbase/blob/c0231a66/dev-support/adhoc_run_tests/adhoc_run_tests.sh
--
diff --git a/dev-support/adhoc_run_tests/adhoc_run_tests.sh 
b/dev-support/adhoc_run_tests/adhoc_run_tests.sh
new file mode 100755
index 000..3e61017
--- /dev/null
+++ b/dev-support/adhoc_run_tests/adhoc_run_tests.sh
@@ -0,0 +1,83 @@
+#!/usr/bin/env bash
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software 
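The Jenkinsfile in this commit drives `adhoc_run_tests.sh` with `--repeat 100` and `--force-timeout 1800`: run the chosen tests many times and kill any attempt that runs too long. A minimal self-contained Java sketch of that repeat-with-timeout control flow (the `Callable` task here is a placeholder, not the actual Maven invocation):

```java
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;

public class RepeatWithTimeoutDemo {
  /**
   * Runs task `repeat` times; each attempt is cut off after timeoutMillis.
   * Returns the number of attempts that completed successfully in time.
   */
  static int repeatWithTimeout(Callable<Boolean> task, int repeat, long timeoutMillis)
      throws InterruptedException {
    ExecutorService pool = Executors.newSingleThreadExecutor();
    int completed = 0;
    try {
      for (int i = 0; i < repeat; i++) {
        Future<Boolean> f = pool.submit(task);
        try {
          if (f.get(timeoutMillis, TimeUnit.MILLISECONDS)) {
            completed++;
          }
        } catch (TimeoutException e) {
          f.cancel(true); // force-timeout: kill the attempt, like the script does
        } catch (ExecutionException e) {
          // a failed attempt: keep going, later repetitions may still pass
        }
      }
    } finally {
      pool.shutdownNow();
    }
    return completed;
  }

  public static void main(String[] args) throws Exception {
    int ok = repeatWithTimeout(() -> true, 3, 1000);
    System.out.println(ok); // 3
  }
}
```

This is only the loop shape; the real script also collects surefire logs per iteration into the `--log-output` directory.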

hbase git commit: HBASE-19160 expose CellComparator as IA.Public - addendum fixes compilation for branch-2

2017-11-06 Thread tedyu
Repository: hbase
Updated Branches:
  refs/heads/branch-2 cfddfcf23 -> 50f30668b


HBASE-19160 expose CellComparator as IA.Public - addendum fixes compilation for 
branch-2


Project: http://git-wip-us.apache.org/repos/asf/hbase/repo
Commit: http://git-wip-us.apache.org/repos/asf/hbase/commit/50f30668
Tree: http://git-wip-us.apache.org/repos/asf/hbase/tree/50f30668
Diff: http://git-wip-us.apache.org/repos/asf/hbase/diff/50f30668

Branch: refs/heads/branch-2
Commit: 50f30668b825cccd7c795e8b23217e2a68fdb02d
Parents: cfddfcf
Author: tedyu 
Authored: Mon Nov 6 11:32:26 2017 -0800
Committer: tedyu 
Committed: Mon Nov 6 11:32:26 2017 -0800

--
 .../src/main/java/org/apache/hadoop/hbase/mapreduce/Import.java| 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/hbase/blob/50f30668/hbase-mapreduce/src/main/java/org/apache/hadoop/hbase/mapreduce/Import.java
--
diff --git 
a/hbase-mapreduce/src/main/java/org/apache/hadoop/hbase/mapreduce/Import.java 
b/hbase-mapreduce/src/main/java/org/apache/hadoop/hbase/mapreduce/Import.java
index fb3a1ef..eaa3343 100644
--- 
a/hbase-mapreduce/src/main/java/org/apache/hadoop/hbase/mapreduce/Import.java
+++ 
b/hbase-mapreduce/src/main/java/org/apache/hadoop/hbase/mapreduce/Import.java
@@ -165,7 +165,7 @@ public class Import extends Configured implements Tool {
 @edu.umd.cs.findbugs.annotations.SuppressWarnings(value = "EQ_COMPARETO_USE_OBJECT_EQUALS",
 justification = "This is wrong, yes, but we should be purging Writables, not fixing them")
 public int compareTo(KeyValueWritableComparable o) {
-  return CellComparatorImpl.COMPARATOR.compare(this.kv, ((KeyValueWritableComparable) o).kv);
+  return CellComparator.getInstance().compare(this.kv, o.kv);
 }
 
 public static class KeyValueWritableComparator extends WritableComparator {
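The addendum above swaps a direct reference to `CellComparatorImpl.COMPARATOR` for the `CellComparator.getInstance()` accessor. The pattern — a `Comparable` wrapper whose `compareTo` delegates to a comparator reached through a static singleton accessor, so callers never name the implementation class — can be sketched self-contained (all class names below are illustrative stand-ins, not HBase's):

```java
import java.util.Comparator;

public class ComparatorSingletonDemo {
  // Stand-in for CellComparator.getInstance(): one shared comparator
  // instance behind a static accessor.
  static final class RowComparator implements Comparator<byte[]> {
    private static final RowComparator INSTANCE = new RowComparator();
    static Comparator<byte[]> getInstance() { return INSTANCE; }
    @Override
    public int compare(byte[] a, byte[] b) {
      // lexicographic, unsigned-byte order (how HBase orders row keys)
      int len = Math.min(a.length, b.length);
      for (int i = 0; i < len; i++) {
        int d = (a[i] & 0xff) - (b[i] & 0xff);
        if (d != 0) return d;
      }
      return a.length - b.length;
    }
  }

  // Stand-in for KeyValueWritableComparable: compareTo delegates to the
  // shared comparator instead of hard-coding an implementation class.
  static final class RowWritable implements Comparable<RowWritable> {
    final byte[] row;
    RowWritable(byte[] row) { this.row = row; }
    @Override
    public int compareTo(RowWritable o) {
      return RowComparator.getInstance().compare(this.row, o.row);
    }
  }

  static int cmp(String a, String b) {
    return new RowWritable(a.getBytes()).compareTo(new RowWritable(b.getBytes()));
  }

  public static void main(String[] args) {
    System.out.println(cmp("row1", "row2")); // -1: "row1" sorts first
  }
}
```

Keeping the implementation class out of call sites is what lets `CellComparator` become `IA.Public` while `CellComparatorImpl` stays private.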



[2/2] hbase git commit: HBASE-19160 expose CellComparator as IA.Public

2017-11-06 Thread mdrob
HBASE-19160 expose CellComparator as IA.Public


Project: http://git-wip-us.apache.org/repos/asf/hbase/repo
Commit: http://git-wip-us.apache.org/repos/asf/hbase/commit/cfddfcf2
Tree: http://git-wip-us.apache.org/repos/asf/hbase/tree/cfddfcf2
Diff: http://git-wip-us.apache.org/repos/asf/hbase/diff/cfddfcf2

Branch: refs/heads/branch-2
Commit: cfddfcf23cc38161da70bfe965b8e7b6e0581711
Parents: 47c614c
Author: Mike Drob 
Authored: Thu Nov 2 16:16:43 2017 -0500
Committer: Mike Drob 
Committed: Mon Nov 6 10:15:55 2017 -0600

--
 .../hadoop/hbase/client/ConnectionUtils.java|  4 ++--
 .../org/apache/hadoop/hbase/client/Result.java  |  5 ++--
 .../hadoop/hbase/filter/FilterListBase.java |  4 ++--
 .../hadoop/hbase/filter/FuzzyRowFilter.java |  6 ++---
 .../hbase/filter/InclusiveStopFilter.java   |  4 ++--
 .../org/apache/hadoop/hbase/CellComparator.java | 12 +-
 .../java/org/apache/hadoop/hbase/CellUtil.java  |  2 +-
 .../java/org/apache/hadoop/hbase/KeyValue.java  |  2 +-
 .../io/encoding/BufferedDataBlockEncoder.java   |  3 +--
 .../apache/hadoop/hbase/TestCellComparator.java | 13 ++-
 .../hadoop/hbase/util/RedundantKVGenerator.java |  6 ++---
 .../mapreduce/IntegrationTestImportTsv.java | 10 
 .../hadoop/hbase/mapreduce/CellSortReducer.java |  4 ++--
 .../hbase/mapreduce/HFileOutputFormat2.java |  6 ++---
 .../apache/hadoop/hbase/mapreduce/Import.java   |  6 ++---
 .../hadoop/hbase/mapreduce/PutSortReducer.java  |  4 ++--
 .../hadoop/hbase/mapreduce/SyncTable.java   |  8 +++
 .../hadoop/hbase/mapreduce/TextSortReducer.java |  4 ++--
 .../hadoop/hbase/io/hfile/FixedFileTrailer.java |  2 +-
 .../org/apache/hadoop/hbase/io/hfile/HFile.java |  4 +---
 .../hbase/io/hfile/HFilePrettyPrinter.java  | 10 
 .../hadoop/hbase/io/hfile/HFileReaderImpl.java  |  3 +--
 .../hadoop/hbase/io/hfile/HFileWriterImpl.java  |  3 +--
 .../org/apache/hadoop/hbase/mob/MobUtils.java   |  5 ++--
 .../compactions/PartitionedMobCompactor.java|  3 ++-
 .../hbase/regionserver/DefaultMemStore.java |  3 +--
 .../hadoop/hbase/regionserver/HStore.java   |  3 +--
 .../hbase/regionserver/StoreFileReader.java |  5 ++--
 .../hbase/regionserver/StoreFileWriter.java |  6 ++---
 .../hbase/regionserver/wal/FSWALEntry.java  |  6 ++---
 .../hbase/util/CollectionBackedScanner.java |  5 ++--
 .../hadoop/hbase/util/CompressionTest.java  |  3 ++-
 .../hadoop/hbase/HBaseTestingUtility.java   |  2 +-
 .../hbase/HFilePerformanceEvaluation.java   |  2 +-
 .../apache/hadoop/hbase/client/TestResult.java  | 18 +++
 .../apache/hadoop/hbase/filter/TestFilter.java  | 13 ---
 .../hadoop/hbase/filter/TestFilterList.java | 24 
 .../hbase/regionserver/KeyValueScanFixture.java |  6 ++---
 .../hbase/regionserver/TestCellFlatSet.java | 10 
 .../regionserver/TestCompactingMemStore.java| 12 --
 .../regionserver/TestKeyValueScanFixture.java   |  4 ++--
 .../hbase/regionserver/TestStoreScanner.java| 21 -
 .../AbstractTestScanQueryMatcher.java   |  6 ++---
 .../hadoop/hbase/spark/HBaseContext.scala   |  2 +-
 44 files changed, 133 insertions(+), 151 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/hbase/blob/cfddfcf2/hbase-client/src/main/java/org/apache/hadoop/hbase/client/ConnectionUtils.java
--
diff --git 
a/hbase-client/src/main/java/org/apache/hadoop/hbase/client/ConnectionUtils.java
 
b/hbase-client/src/main/java/org/apache/hadoop/hbase/client/ConnectionUtils.java
index 5e0e3b7..bc0ade2 100644
--- 
a/hbase-client/src/main/java/org/apache/hadoop/hbase/client/ConnectionUtils.java
+++ 
b/hbase-client/src/main/java/org/apache/hadoop/hbase/client/ConnectionUtils.java
@@ -39,7 +39,7 @@ import org.apache.commons.logging.Log;
 import org.apache.commons.logging.LogFactory;
 import org.apache.hadoop.conf.Configuration;
 import org.apache.hadoop.hbase.Cell;
-import org.apache.hadoop.hbase.CellComparatorImpl;
+import org.apache.hadoop.hbase.CellComparator;
 import org.apache.hadoop.hbase.HConstants;
 import org.apache.hadoop.hbase.HRegionInfo;
 import org.apache.hadoop.hbase.PrivateCellUtil;
@@ -336,7 +336,7 @@ public final class ConnectionUtils {
 }
 Cell[] rawCells = result.rawCells();
 int index =
-Arrays.binarySearch(rawCells, keepCellsAfter, CellComparatorImpl.COMPARATOR::compareWithoutRow);
+Arrays.binarySearch(rawCells, keepCellsAfter, CellComparator.getInstance()::compareWithoutRow);
 if (index < 0) {
   index = -index - 1;
 } else {

http://git-wip-us.apache.org/repos/asf/hbase/blob/cfddfcf2/hbase-client/src/main/java/org/apache/hadoop/hbase/client/Result.java
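The `ConnectionUtils` hunk above keeps the standard `Arrays.binarySearch` contract: a miss returns `-(insertionPoint) - 1`, so `-index - 1` recovers where the probe key would be inserted. A self-contained sketch of that idiom (plain integers here rather than HBase `Cell`s):

```java
import java.util.Arrays;

public class BinarySearchIdiomDemo {
  /**
   * Returns the index of the first element >= key, using the
   * Arrays.binarySearch contract: a miss returns -(insertionPoint) - 1.
   */
  static int firstAtOrAbove(int[] sorted, int key) {
    int index = Arrays.binarySearch(sorted, key);
    if (index < 0) {
      index = -index - 1; // key absent: negate back to the insertion point
    }
    // (ConnectionUtils does a little more on an exact hit; an exact
    // match here simply returns the matching position.)
    return index;
  }

  public static void main(String[] args) {
    int[] cells = {10, 20, 30, 40};
    System.out.println(firstAtOrAbove(cells, 25)); // 2: would insert before 30
    System.out.println(firstAtOrAbove(cells, 20)); // 1: exact match
  }
}
```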

[1/2] hbase git commit: HBASE-19160 expose CellComparator as IA.Public

2017-11-06 Thread mdrob
Repository: hbase
Updated Branches:
  refs/heads/branch-2 47c614c70 -> cfddfcf23
  refs/heads/master 888f2335c -> 9ee8e2714


HBASE-19160 expose CellComparator as IA.Public


Project: http://git-wip-us.apache.org/repos/asf/hbase/repo
Commit: http://git-wip-us.apache.org/repos/asf/hbase/commit/9ee8e271
Tree: http://git-wip-us.apache.org/repos/asf/hbase/tree/9ee8e271
Diff: http://git-wip-us.apache.org/repos/asf/hbase/diff/9ee8e271

Branch: refs/heads/master
Commit: 9ee8e2714df54345743ddf18bf23899872930b2c
Parents: 888f233
Author: Mike Drob 
Authored: Thu Nov 2 16:16:43 2017 -0500
Committer: Mike Drob 
Committed: Mon Nov 6 10:08:14 2017 -0600

--
 .../hadoop/hbase/client/ConnectionUtils.java|  4 ++--
 .../org/apache/hadoop/hbase/client/Result.java  |  5 ++--
 .../hadoop/hbase/filter/FilterListBase.java |  4 ++--
 .../hadoop/hbase/filter/FuzzyRowFilter.java |  6 ++---
 .../hbase/filter/InclusiveStopFilter.java   |  4 ++--
 .../org/apache/hadoop/hbase/CellComparator.java | 12 +-
 .../java/org/apache/hadoop/hbase/CellUtil.java  |  2 +-
 .../java/org/apache/hadoop/hbase/KeyValue.java  |  2 +-
 .../io/encoding/BufferedDataBlockEncoder.java   |  3 +--
 .../apache/hadoop/hbase/TestCellComparator.java | 13 ++-
 .../hadoop/hbase/util/RedundantKVGenerator.java |  6 ++---
 .../mapreduce/IntegrationTestImportTsv.java | 10 
 .../hadoop/hbase/mapreduce/CellSortReducer.java |  4 ++--
 .../hbase/mapreduce/HFileOutputFormat2.java |  6 ++---
 .../apache/hadoop/hbase/mapreduce/Import.java   |  6 ++---
 .../hadoop/hbase/mapreduce/PutSortReducer.java  |  4 ++--
 .../hadoop/hbase/mapreduce/SyncTable.java   |  8 +++
 .../hadoop/hbase/mapreduce/TextSortReducer.java |  4 ++--
 .../hadoop/hbase/io/hfile/FixedFileTrailer.java |  2 +-
 .../org/apache/hadoop/hbase/io/hfile/HFile.java |  4 +---
 .../hbase/io/hfile/HFilePrettyPrinter.java  | 10 
 .../hadoop/hbase/io/hfile/HFileReaderImpl.java  |  3 +--
 .../hadoop/hbase/io/hfile/HFileWriterImpl.java  |  3 +--
 .../org/apache/hadoop/hbase/mob/MobUtils.java   |  5 ++--
 .../compactions/PartitionedMobCompactor.java|  3 ++-
 .../hbase/regionserver/DefaultMemStore.java |  3 +--
 .../hadoop/hbase/regionserver/HStore.java   |  3 +--
 .../hbase/regionserver/StoreFileReader.java |  5 ++--
 .../hbase/regionserver/StoreFileWriter.java |  6 ++---
 .../hbase/regionserver/wal/FSWALEntry.java  |  6 ++---
 .../hbase/util/CollectionBackedScanner.java |  5 ++--
 .../hadoop/hbase/util/CompressionTest.java  |  3 ++-
 .../hadoop/hbase/HBaseTestingUtility.java   |  2 +-
 .../hbase/HFilePerformanceEvaluation.java   |  2 +-
 .../apache/hadoop/hbase/client/TestResult.java  | 18 +++
 .../apache/hadoop/hbase/filter/TestFilter.java  | 13 ---
 .../hadoop/hbase/filter/TestFilterList.java | 24 
 .../hbase/regionserver/KeyValueScanFixture.java |  6 ++---
 .../hbase/regionserver/TestCellFlatSet.java | 10 
 .../regionserver/TestCompactingMemStore.java| 12 --
 .../regionserver/TestKeyValueScanFixture.java   |  4 ++--
 .../hbase/regionserver/TestStoreScanner.java| 21 -
 .../AbstractTestScanQueryMatcher.java   |  6 ++---
 .../hadoop/hbase/spark/HBaseContext.scala   |  2 +-
 44 files changed, 133 insertions(+), 151 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/hbase/blob/9ee8e271/hbase-client/src/main/java/org/apache/hadoop/hbase/client/ConnectionUtils.java
--
diff --git 
a/hbase-client/src/main/java/org/apache/hadoop/hbase/client/ConnectionUtils.java
 
b/hbase-client/src/main/java/org/apache/hadoop/hbase/client/ConnectionUtils.java
index 5e0e3b7..bc0ade2 100644
--- 
a/hbase-client/src/main/java/org/apache/hadoop/hbase/client/ConnectionUtils.java
+++ 
b/hbase-client/src/main/java/org/apache/hadoop/hbase/client/ConnectionUtils.java
@@ -39,7 +39,7 @@ import org.apache.commons.logging.Log;
 import org.apache.commons.logging.LogFactory;
 import org.apache.hadoop.conf.Configuration;
 import org.apache.hadoop.hbase.Cell;
-import org.apache.hadoop.hbase.CellComparatorImpl;
+import org.apache.hadoop.hbase.CellComparator;
 import org.apache.hadoop.hbase.HConstants;
 import org.apache.hadoop.hbase.HRegionInfo;
 import org.apache.hadoop.hbase.PrivateCellUtil;
@@ -336,7 +336,7 @@ public final class ConnectionUtils {
 }
 Cell[] rawCells = result.rawCells();
 int index =
-Arrays.binarySearch(rawCells, keepCellsAfter, CellComparatorImpl.COMPARATOR::compareWithoutRow);
+Arrays.binarySearch(rawCells, keepCellsAfter, CellComparator.getInstance()::compareWithoutRow);
 if (index < 0) {
   index = -index - 1;
 } else {


hbase-site git commit: INFRA-10751 Empty commit

2017-11-06 Thread git-site-role
Repository: hbase-site
Updated Branches:
  refs/heads/asf-site 32453e2dd -> 2ef9b5f9c


INFRA-10751 Empty commit


Project: http://git-wip-us.apache.org/repos/asf/hbase-site/repo
Commit: http://git-wip-us.apache.org/repos/asf/hbase-site/commit/2ef9b5f9
Tree: http://git-wip-us.apache.org/repos/asf/hbase-site/tree/2ef9b5f9
Diff: http://git-wip-us.apache.org/repos/asf/hbase-site/diff/2ef9b5f9

Branch: refs/heads/asf-site
Commit: 2ef9b5f9c6a5248e62dda9fac22c2529470d3ceb
Parents: 32453e2
Author: jenkins 
Authored: Mon Nov 6 15:17:40 2017 +
Committer: jenkins 
Committed: Mon Nov 6 15:17:40 2017 +

--

--




[46/51] [partial] hbase-site git commit: Published site at .

2017-11-06 Thread git-site-role
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/apidocs/src-html/org/apache/hadoop/hbase/mapreduce/TableMapReduceUtil.html
--
diff --git 
a/apidocs/src-html/org/apache/hadoop/hbase/mapreduce/TableMapReduceUtil.html 
b/apidocs/src-html/org/apache/hadoop/hbase/mapreduce/TableMapReduceUtil.html
index 739356b..9be4689 100644
--- a/apidocs/src-html/org/apache/hadoop/hbase/mapreduce/TableMapReduceUtil.html
+++ b/apidocs/src-html/org/apache/hadoop/hbase/mapreduce/TableMapReduceUtil.html
@@ -828,235 +828,238 @@
 820  org.apache.hadoop.hbase.shaded.com.google.common.collect.Lists.class,
 821  org.apache.htrace.Trace.class,
 822  com.codahale.metrics.MetricRegistry.class,
-823  org.apache.commons.lang3.ArrayUtils.class);
-824  }
-825
-826  /**
-827   * Returns a classpath string built from the content of the "tmpjars" value in {@code conf}.
-828   * Also exposed to shell scripts via `bin/hbase mapredcp`.
-829   */
-830  public static String buildDependencyClasspath(Configuration conf) {
-831if (conf == null) {
-832  throw new IllegalArgumentException("Must provide a configuration object.");
-833}
-834Set<String> paths = new HashSet<>(conf.getStringCollection("tmpjars"));
-835if (paths.isEmpty()) {
-836  throw new IllegalArgumentException("Configuration contains no tmpjars.");
-837}
-838StringBuilder sb = new StringBuilder();
-839for (String s : paths) {
-840  // entries can take the form 'file:/path/to/file.jar'.
-841  int idx = s.indexOf(":");
-842  if (idx != -1) s = s.substring(idx + 1);
-843  if (sb.length() > 0) sb.append(File.pathSeparator);
-844  sb.append(s);
-845}
-846return sb.toString();
-847  }
-848
-849  /**
-850   * Add the HBase dependency jars as 
well as jars for any of the configured
-851   * job classes to the job 
configuration, so that JobClient will ship them
-852   * to the cluster and add them to the 
DistributedCache.
-853   */
-854  public static void 
addDependencyJars(Job job) throws IOException {
-855
addHBaseDependencyJars(job.getConfiguration());
-856try {
-857  
addDependencyJarsForClasses(job.getConfiguration(),
-858  // when making changes here, 
consider also mapred.TableMapReduceUtil
-859  // pull job classes
-860  job.getMapOutputKeyClass(),
-861  job.getMapOutputValueClass(),
-862  job.getInputFormatClass(),
-863  job.getOutputKeyClass(),
-864  job.getOutputValueClass(),
-865  job.getOutputFormatClass(),
-866  job.getPartitionerClass(),
-867  job.getCombinerClass());
-868} catch (ClassNotFoundException e) 
{
-869  throw new IOException(e);
-870}
-871  }
-872
-873  /**
-874   * Add the jars containing the given 
classes to the job's configuration
-875   * such that JobClient will ship them 
to the cluster and add them to
-876   * the DistributedCache.
-877   * @deprecated rely on {@link 
#addDependencyJars(Job)} instead.
-878   */
-879  @Deprecated
-880  public static void 
addDependencyJars(Configuration conf,
-881  Class?... classes) throws 
IOException {
-882LOG.warn("The 
addDependencyJars(Configuration, Class?...) method has been deprecated 
since it"
-883 + " is easy to use 
incorrectly. Most users should rely on addDependencyJars(Job) " +
-884 "instead. See HBASE-8386 for 
more details.");
-885addDependencyJarsForClasses(conf, 
classes);
-886  }
-887
-888  /**
-889   * Add the jars containing the given 
classes to the job's configuration
-890   * such that JobClient will ship them 
to the cluster and add them to
-891   * the DistributedCache.
-892   *
-893   * N.B. that this method at most adds 
one jar per class given. If there is more than one
-894   * jar available containing a class 
with the same name as a given class, we don't define
-895   * which of those jars might be 
chosen.
-896   *
-897   * @param conf The Hadoop Configuration 
to modify
-898   * @param classes will add just those 
dependencies needed to find the given classes
-899   * @throws IOException if an underlying 
library call fails.
-900   */
-901  @InterfaceAudience.Private
-902  public static void 
addDependencyJarsForClasses(Configuration conf,
-903  Class?... classes) throws 
IOException {
-904
-905FileSystem localFs = 
FileSystem.getLocal(conf);
-906SetString jars = new 
HashSet();
-907// Add jars that are already in the 
tmpjars variable
-908
jars.addAll(conf.getStringCollection("tmpjars"));
-909
-910// add jars as we find them to a map 
of contents jar name so that we can avoid
-911// creating new jars for classes that 
have already been packaged.
-912MapString, String 
packagedClasses = new HashMap();
-913
-914// Add jars containing the specified 
classes
-915for (Class? clazz : classes) 
{
-916  if (clazz == null) continue;
-917
-918  

[39/51] [partial] hbase-site git commit: Published site at .

2017-11-06 Thread git-site-role
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/devapidocs/org/apache/hadoop/hbase/client/AsyncHBaseAdmin.html
--
diff --git a/devapidocs/org/apache/hadoop/hbase/client/AsyncHBaseAdmin.html 
b/devapidocs/org/apache/hadoop/hbase/client/AsyncHBaseAdmin.html
index 902efb0..db3dbe5 100644
--- a/devapidocs/org/apache/hadoop/hbase/client/AsyncHBaseAdmin.html
+++ b/devapidocs/org/apache/hadoop/hbase/client/AsyncHBaseAdmin.html
@@ -18,7 +18,7 @@
 catch(err) {
 }
 //-->
-var methods = 
{"i0":10,"i1":10,"i2":10,"i3":10,"i4":10,"i5":10,"i6":10,"i7":10,"i8":10,"i9":10,"i10":10,"i11":10,"i12":10,"i13":10,"i14":10,"i15":10,"i16":10,"i17":10,"i18":10,"i19":10,"i20":10,"i21":10,"i22":10,"i23":10,"i24":10,"i25":10,"i26":10,"i27":10,"i28":10,"i29":10,"i30":10,"i31":10,"i32":10,"i33":10,"i34":10,"i35":10,"i36":10,"i37":10,"i38":10,"i39":10,"i40":10,"i41":10,"i42":10,"i43":10,"i44":10,"i45":10,"i46":10,"i47":10,"i48":10,"i49":10,"i50":10,"i51":10,"i52":10,"i53":10,"i54":10,"i55":10,"i56":10,"i57":10,"i58":10,"i59":10,"i60":10,"i61":10,"i62":10,"i63":10,"i64":10,"i65":10,"i66":10,"i67":10,"i68":10,"i69":10,"i70":10,"i71":10,"i72":10,"i73":10,"i74":10,"i75":10,"i76":10,"i77":10,"i78":10,"i79":10,"i80":10,"i81":10,"i82":10,"i83":10,"i84":10,"i85":10,"i86":10,"i87":10,"i88":10,"i89":10,"i90":10,"i91":10,"i92":10,"i93":10,"i94":10,"i95":10,"i96":10,"i97":10,"i98":10,"i99":10,"i100":10,"i101":10,"i102":10,"i103":10,"i104":10,"i105":10};
+var methods = 
{"i0":10,"i1":10,"i2":10,"i3":10,"i4":10,"i5":10,"i6":10,"i7":10,"i8":10,"i9":10,"i10":10,"i11":10,"i12":10,"i13":10,"i14":10,"i15":10,"i16":10,"i17":10,"i18":10,"i19":10,"i20":10,"i21":10,"i22":10,"i23":10,"i24":10,"i25":10,"i26":10,"i27":10,"i28":10,"i29":10,"i30":10,"i31":10,"i32":10,"i33":10,"i34":10,"i35":10,"i36":10,"i37":10,"i38":10,"i39":10,"i40":10,"i41":10,"i42":10,"i43":10,"i44":10,"i45":10,"i46":10,"i47":10,"i48":10,"i49":10,"i50":10,"i51":10,"i52":10,"i53":10,"i54":10,"i55":10,"i56":10,"i57":10,"i58":10,"i59":10,"i60":10,"i61":10,"i62":10,"i63":10,"i64":10,"i65":10,"i66":10,"i67":10,"i68":10,"i69":10,"i70":10,"i71":10,"i72":10,"i73":10,"i74":10,"i75":10,"i76":10,"i77":10,"i78":10,"i79":10,"i80":10,"i81":10,"i82":10,"i83":10,"i84":10,"i85":10,"i86":10,"i87":10,"i88":10,"i89":10,"i90":10,"i91":10,"i92":10,"i93":10,"i94":10,"i95":10,"i96":10,"i97":10,"i98":10,"i99":10,"i100":10,"i101":10,"i102":10,"i103":10,"i104":10,"i105":10,"i106":10,"i107":10,"i108":10,"i
 
109":10,"i110":10,"i111":10,"i112":10,"i113":10,"i114":10,"i115":10,"i116":10,"i117":10,"i118":10,"i119":10,"i120":10,"i121":10,"i122":10};
 var tabs = {65535:["t0","All Methods"],2:["t2","Instance 
Methods"],8:["t4","Concrete Methods"]};
 var altColor = "altColor";
 var rowColor = "rowColor";
@@ -114,7 +114,7 @@ var activeTableTab = "activeTableTab";
 
 
 @InterfaceAudience.Private
-public class AsyncHBaseAdmin
+public class AsyncHBaseAdmin
 extends http://docs.oracle.com/javase/8/docs/api/java/lang/Object.html?is-external=true;
 title="class or interface in java.lang">Object
 implements AsyncAdmin
 The implementation of AsyncAdmin.
@@ -245,32 +245,44 @@ implements 
 http://docs.oracle.com/javase/8/docs/api/java/util/concurrent/CompletableFuture.html?is-external=true;
 title="class or interface in java.util.concurrent">CompletableFuturehttp://docs.oracle.com/javase/8/docs/api/java/lang/Void.html?is-external=true;
 title="class or interface in java.lang">Void
-compact(TableNametableName,
-   http://docs.oracle.com/javase/8/docs/api/java/util/Optional.html?is-external=true;
 title="class or interface in 
java.util">Optionalbyte[]columnFamily)
-Compact a column family within a table.
+compact(TableNametableName)
+Compact a table.
 
 
 
 http://docs.oracle.com/javase/8/docs/api/java/util/concurrent/CompletableFuture.html?is-external=true;
 title="class or interface in java.util.concurrent">CompletableFuturehttp://docs.oracle.com/javase/8/docs/api/java/lang/Void.html?is-external=true;
 title="class or interface in java.lang">Void
-compactRegion(byte[]regionName,
- http://docs.oracle.com/javase/8/docs/api/java/util/Optional.html?is-external=true;
 title="class or interface in 
java.util">Optionalbyte[]columnFamily)
-Compact a column family within a region.
+compact(TableNametableName,
+   byte[]columnFamily)
+Compact a column family within a table.
 
 
 
 http://docs.oracle.com/javase/8/docs/api/java/util/concurrent/CompletableFuture.html?is-external=true;
 title="class or interface in java.util.concurrent">CompletableFuturehttp://docs.oracle.com/javase/8/docs/api/java/lang/Void.html?is-external=true;
 title="class or interface in java.lang">Void
+compactRegion(byte[]regionName)
+Compact an individual region.
+
+
+
+http://docs.oracle.com/javase/8/docs/api/java/util/concurrent/CompletableFuture.html?is-external=true;
 title="class or interface in 

[14/51] [partial] hbase-site git commit: Published site at .

2017-11-06 Thread git-site-role
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.TableOperator.html
--
diff --git 
a/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.TableOperator.html
 
b/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.TableOperator.html
index 531081e..a22e5ce 100644
--- 
a/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.TableOperator.html
+++ 
b/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.TableOperator.html
@@ -34,2832 +34,3011 @@
 026import java.util.Collections;
 027import java.util.EnumSet;
 028import java.util.HashMap;
-029import java.util.LinkedList;
-030import java.util.List;
-031import java.util.Map;
-032import java.util.Optional;
-033import java.util.Set;
-034import 
java.util.concurrent.CompletableFuture;
-035import java.util.concurrent.TimeUnit;
-036import 
java.util.concurrent.atomic.AtomicReference;
-037import java.util.function.BiConsumer;
-038import java.util.function.Function;
-039import java.util.regex.Pattern;
-040import java.util.stream.Collectors;
-041import java.util.stream.Stream;
-042
-043import org.apache.commons.io.IOUtils;
-044import org.apache.commons.logging.Log;
-045import 
org.apache.commons.logging.LogFactory;
-046import 
org.apache.hadoop.hbase.AsyncMetaTableAccessor;
-047import 
org.apache.hadoop.hbase.ClusterStatus;
-048import 
org.apache.hadoop.hbase.ClusterStatus.Option;
-049import 
org.apache.hadoop.hbase.HConstants;
-050import 
org.apache.hadoop.hbase.HRegionLocation;
-051import 
org.apache.hadoop.hbase.MetaTableAccessor;
-052import 
org.apache.hadoop.hbase.MetaTableAccessor.QueryType;
-053import 
org.apache.hadoop.hbase.NamespaceDescriptor;
-054import 
org.apache.hadoop.hbase.RegionLoad;
-055import 
org.apache.hadoop.hbase.RegionLocations;
-056import 
org.apache.hadoop.hbase.ServerName;
-057import 
org.apache.hadoop.hbase.TableExistsException;
-058import 
org.apache.hadoop.hbase.TableName;
-059import 
org.apache.hadoop.hbase.TableNotDisabledException;
-060import 
org.apache.hadoop.hbase.TableNotEnabledException;
-061import 
org.apache.hadoop.hbase.TableNotFoundException;
-062import 
org.apache.hadoop.hbase.UnknownRegionException;
-063import 
org.apache.hadoop.hbase.client.AsyncRpcRetryingCallerFactory.AdminRequestCallerBuilder;
-064import 
org.apache.hadoop.hbase.client.AsyncRpcRetryingCallerFactory.MasterRequestCallerBuilder;
-065import 
org.apache.hadoop.hbase.client.AsyncRpcRetryingCallerFactory.ServerRequestCallerBuilder;
-066import 
org.apache.hadoop.hbase.client.RawAsyncTable.CoprocessorCallable;
-067import 
org.apache.hadoop.hbase.client.Scan.ReadType;
-068import 
org.apache.hadoop.hbase.client.replication.ReplicationSerDeHelper;
-069import 
org.apache.hadoop.hbase.client.replication.TableCFs;
-070import 
org.apache.hadoop.hbase.client.security.SecurityCapability;
-071import 
org.apache.hadoop.hbase.exceptions.DeserializationException;
-072import 
org.apache.hadoop.hbase.ipc.HBaseRpcController;
-073import 
org.apache.hadoop.hbase.quotas.QuotaFilter;
-074import 
org.apache.hadoop.hbase.quotas.QuotaSettings;
-075import 
org.apache.hadoop.hbase.quotas.QuotaTableUtil;
-076import 
org.apache.hadoop.hbase.replication.ReplicationException;
-077import 
org.apache.hadoop.hbase.replication.ReplicationPeerConfig;
-078import 
org.apache.hadoop.hbase.replication.ReplicationPeerDescription;
-079import 
org.apache.hadoop.hbase.snapshot.ClientSnapshotDescriptionUtils;
-080import 
org.apache.hadoop.hbase.snapshot.RestoreSnapshotException;
-081import 
org.apache.hadoop.hbase.snapshot.SnapshotCreationException;
-082import 
org.apache.hadoop.hbase.util.Bytes;
-083import 
org.apache.hadoop.hbase.util.EnvironmentEdgeManager;
-084import 
org.apache.hadoop.hbase.util.ForeignExceptionUtil;
-085import 
org.apache.hadoop.hbase.util.Pair;
-086import 
org.apache.yetus.audience.InterfaceAudience;
-087
-088import 
org.apache.hadoop.hbase.shaded.com.google.protobuf.RpcCallback;
-089import 
org.apache.hadoop.hbase.shaded.io.netty.util.Timeout;
-090import 
org.apache.hadoop.hbase.shaded.io.netty.util.TimerTask;
-091import 
org.apache.hadoop.hbase.shaded.protobuf.ProtobufUtil;
-092import 
org.apache.hadoop.hbase.shaded.protobuf.RequestConverter;
-093import 
org.apache.hadoop.hbase.shaded.protobuf.generated.AdminProtos.AdminService;
-094import 
org.apache.hadoop.hbase.shaded.protobuf.generated.AdminProtos.ClearCompactionQueuesRequest;
-095import 
org.apache.hadoop.hbase.shaded.protobuf.generated.AdminProtos.ClearCompactionQueuesResponse;
-096import 
org.apache.hadoop.hbase.shaded.protobuf.generated.AdminProtos.CompactRegionRequest;
-097import 
org.apache.hadoop.hbase.shaded.protobuf.generated.AdminProtos.CompactRegionResponse;
-098import 
org.apache.hadoop.hbase.shaded.protobuf.generated.AdminProtos.FlushRegionRequest;
-099import 

[31/51] [partial] hbase-site git commit: Published site at .

2017-11-06 Thread git-site-role
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.AddColumnFamilyProcedureBiConsumer.html
--
diff --git 
a/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.AddColumnFamilyProcedureBiConsumer.html
 
b/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.AddColumnFamilyProcedureBiConsumer.html
index 531081e..a22e5ce 100644
--- 
a/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.AddColumnFamilyProcedureBiConsumer.html
+++ 
b/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.AddColumnFamilyProcedureBiConsumer.html
@@ -34,2832 +34,3011 @@
 026import java.util.Collections;
 027import java.util.EnumSet;
 028import java.util.HashMap;
-029import java.util.LinkedList;
-030import java.util.List;
-031import java.util.Map;
-032import java.util.Optional;
-033import java.util.Set;
-034import java.util.concurrent.CompletableFuture;
-035import java.util.concurrent.TimeUnit;
-036import java.util.concurrent.atomic.AtomicReference;
-037import java.util.function.BiConsumer;
-038import java.util.function.Function;
-039import java.util.regex.Pattern;
-040import java.util.stream.Collectors;
-041import java.util.stream.Stream;
-042
-043import org.apache.commons.io.IOUtils;
-044import org.apache.commons.logging.Log;
-045import org.apache.commons.logging.LogFactory;
-046import org.apache.hadoop.hbase.AsyncMetaTableAccessor;
-047import org.apache.hadoop.hbase.ClusterStatus;
-048import org.apache.hadoop.hbase.ClusterStatus.Option;
-049import org.apache.hadoop.hbase.HConstants;
-050import org.apache.hadoop.hbase.HRegionLocation;
-051import org.apache.hadoop.hbase.MetaTableAccessor;
-052import org.apache.hadoop.hbase.MetaTableAccessor.QueryType;
-053import org.apache.hadoop.hbase.NamespaceDescriptor;
-054import org.apache.hadoop.hbase.RegionLoad;
-055import org.apache.hadoop.hbase.RegionLocations;
-056import org.apache.hadoop.hbase.ServerName;
-057import org.apache.hadoop.hbase.TableExistsException;
-058import org.apache.hadoop.hbase.TableName;
-059import org.apache.hadoop.hbase.TableNotDisabledException;
-060import org.apache.hadoop.hbase.TableNotEnabledException;
-061import org.apache.hadoop.hbase.TableNotFoundException;
-062import org.apache.hadoop.hbase.UnknownRegionException;
-063import org.apache.hadoop.hbase.client.AsyncRpcRetryingCallerFactory.AdminRequestCallerBuilder;
-064import org.apache.hadoop.hbase.client.AsyncRpcRetryingCallerFactory.MasterRequestCallerBuilder;
-065import org.apache.hadoop.hbase.client.AsyncRpcRetryingCallerFactory.ServerRequestCallerBuilder;
-066import org.apache.hadoop.hbase.client.RawAsyncTable.CoprocessorCallable;
-067import org.apache.hadoop.hbase.client.Scan.ReadType;
-068import org.apache.hadoop.hbase.client.replication.ReplicationSerDeHelper;
-069import org.apache.hadoop.hbase.client.replication.TableCFs;
-070import org.apache.hadoop.hbase.client.security.SecurityCapability;
-071import org.apache.hadoop.hbase.exceptions.DeserializationException;
-072import org.apache.hadoop.hbase.ipc.HBaseRpcController;
-073import org.apache.hadoop.hbase.quotas.QuotaFilter;
-074import org.apache.hadoop.hbase.quotas.QuotaSettings;
-075import org.apache.hadoop.hbase.quotas.QuotaTableUtil;
-076import org.apache.hadoop.hbase.replication.ReplicationException;
-077import org.apache.hadoop.hbase.replication.ReplicationPeerConfig;
-078import org.apache.hadoop.hbase.replication.ReplicationPeerDescription;
-079import org.apache.hadoop.hbase.snapshot.ClientSnapshotDescriptionUtils;
-080import org.apache.hadoop.hbase.snapshot.RestoreSnapshotException;
-081import org.apache.hadoop.hbase.snapshot.SnapshotCreationException;
-082import org.apache.hadoop.hbase.util.Bytes;
-083import org.apache.hadoop.hbase.util.EnvironmentEdgeManager;
-084import org.apache.hadoop.hbase.util.ForeignExceptionUtil;
-085import org.apache.hadoop.hbase.util.Pair;
-086import org.apache.yetus.audience.InterfaceAudience;
-087
-088import org.apache.hadoop.hbase.shaded.com.google.protobuf.RpcCallback;
-089import org.apache.hadoop.hbase.shaded.io.netty.util.Timeout;
-090import org.apache.hadoop.hbase.shaded.io.netty.util.TimerTask;
-091import org.apache.hadoop.hbase.shaded.protobuf.ProtobufUtil;
-092import org.apache.hadoop.hbase.shaded.protobuf.RequestConverter;
-093import org.apache.hadoop.hbase.shaded.protobuf.generated.AdminProtos.AdminService;
-094import org.apache.hadoop.hbase.shaded.protobuf.generated.AdminProtos.ClearCompactionQueuesRequest;
-095import org.apache.hadoop.hbase.shaded.protobuf.generated.AdminProtos.ClearCompactionQueuesResponse;
-096import org.apache.hadoop.hbase.shaded.protobuf.generated.AdminProtos.CompactRegionRequest;
-097import org.apache.hadoop.hbase.shaded.protobuf.generated.AdminProtos.CompactRegionResponse;
-098import 

[42/51] [partial] hbase-site git commit: Published site at .

2017-11-06 Thread git-site-role
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/devapidocs/org/apache/hadoop/hbase/class-use/HRegionLocation.html
--
diff --git a/devapidocs/org/apache/hadoop/hbase/class-use/HRegionLocation.html 
b/devapidocs/org/apache/hadoop/hbase/class-use/HRegionLocation.html
index 90760f3..a1008b0 100644
--- a/devapidocs/org/apache/hadoop/hbase/class-use/HRegionLocation.html
+++ b/devapidocs/org/apache/hadoop/hbase/class-use/HRegionLocation.html
@@ -911,10 +911,15 @@ service.
 AsyncNonMetaRegionLocator.TableCache.clearCompletedRequests(Optional<HRegionLocation> location)
 
+private boolean
+RawAsyncHBaseAdmin.compareRegionsWithSplitKeys(List<HRegionLocation> locations,
+   byte[][] splitKeys)
+
 private void
 HTableMultiplexer.HTableMultiplexerStatus.initialize(Map<HRegionLocation,HTableMultiplexer.FlushWorker> serverToFlushWorkerMap)
 
 private <S,R> void
 RawAsyncTableImpl.onLocateComplete(Function<com.google.protobuf.RpcChannel,S> stubMaker,
 RawAsyncTable.CoprocessorCallable<S,R> callable,
@@ -927,19 +932,19 @@ service.
 HRegionLocation loc,
 Throwable error)
 
 private boolean
 AsyncNonMetaRegionLocator.TableCache.tryComplete(AsyncNonMetaRegionLocator.LocateRequest req,
 CompletableFuture<HRegionLocation> future,
 Optional<HRegionLocation> location)
 
 private boolean
 AsyncNonMetaRegionLocator.TableCache.tryComplete(AsyncNonMetaRegionLocator.LocateRequest req,
 CompletableFuture<HRegionLocation> future,
 Optional<HRegionLocation> location)
 
 (package private) static void
 AsyncRegionLocator.updateCachedLocation(HRegionLocation loc,
 Throwable exception,
@@ -947,7 +952,7 @@ service.
 Consumer<HRegionLocation> addToCache,
 Consumer<HRegionLocation> removeFromCache)
 
 (package private) static void
 AsyncRegionLocator.updateCachedLocation(HRegionLocation loc,
 Throwable exception,
@@ -955,7 +960,7 @@ service.
 Consumer<HRegionLocation> addToCache,
 Consumer<HRegionLocation> removeFromCache)
 
 (package private) static void
 AsyncRegionLocator.updateCachedLocation(HRegionLocation loc,
 Throwable exception,
@@ -963,7 +968,7 @@ service.
 Consumer<HRegionLocation> addToCache,
 Consumer<HRegionLocation> removeFromCache)
 
 (package private) static void
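The updateCachedLocation rows above all take Consumer<HRegionLocation> callbacks, so the caller decides what "add to cache" and "remove from cache" mean. A minimal, HBase-free sketch of that callback shape (the Location record and the map-backed cache are stand-ins invented here, not the real HBase types):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Consumer;

public class LocationCache {
  // Stand-in for HRegionLocation: a region name mapped to a server.
  record Location(String region, String server) {}

  // Mirrors the updateCachedLocation(loc, error, addToCache, removeFromCache)
  // shape: on success the location is cached, on error it is evicted.
  static void updateCachedLocation(Location loc, Throwable error,
      Consumer<Location> addToCache, Consumer<Location> removeFromCache) {
    if (error == null) {
      addToCache.accept(loc);      // fresh location: cache it
    } else {
      removeFromCache.accept(loc); // stale location: evict it
    }
  }

  public static void main(String[] args) {
    Map<String, Location> cache = new ConcurrentHashMap<>();
    Location loc = new Location("r1", "server-a");
    updateCachedLocation(loc, null,
        l -> cache.put(l.region(), l),
        l -> cache.remove(l.region()));
    System.out.println(cache.containsKey("r1")); // true after a successful update
    updateCachedLocation(loc, new RuntimeException("stale"),
        l -> cache.put(l.region(), l),
        l -> cache.remove(l.region()));
    System.out.println(cache.containsKey("r1")); // false after eviction
  }
}
```

Passing the cache operations as consumers keeps the update logic independent of any particular cache implementation, which is presumably why the generated signatures repeat the same two Consumer parameters.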
 

[16/51] [partial] hbase-site git commit: Published site at .

2017-11-06 Thread git-site-role
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.ProcedureBiConsumer.html
--
diff --git 
a/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.ProcedureBiConsumer.html
 
b/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.ProcedureBiConsumer.html
index 531081e..a22e5ce 100644
--- 
a/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.ProcedureBiConsumer.html
+++ 
b/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.ProcedureBiConsumer.html

[41/51] [partial] hbase-site git commit: Published site at .

2017-11-06 Thread git-site-role
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/devapidocs/org/apache/hadoop/hbase/class-use/TableName.html
--
diff --git a/devapidocs/org/apache/hadoop/hbase/class-use/TableName.html 
b/devapidocs/org/apache/hadoop/hbase/class-use/TableName.html
index 38264a7..926af2b 100644
--- a/devapidocs/org/apache/hadoop/hbase/class-use/TableName.html
+++ b/devapidocs/org/apache/hadoop/hbase/class-use/TableName.html
@@ -2394,26 +2394,44 @@ service.
 byte[] encodeRegionNameB)
 
+private CompletableFuture<List<TableName>>
+RawAsyncHBaseAdmin.getTableNames(org.apache.hadoop.hbase.shaded.protobuf.generated.MasterProtos.GetTableNamesRequest request)
+
 default CompletableFuture<List<TableName>>
 AsyncAdmin.listTableNames()
 List all of the names of userspace tables.
 
+CompletableFuture<List<TableName>>
+AsyncHBaseAdmin.listTableNames(boolean includeSysTables)
+
+CompletableFuture<List<TableName>>
+AsyncAdmin.listTableNames(boolean includeSysTables)
+List all of the names of tables.
+
+CompletableFuture<List<TableName>>
+RawAsyncHBaseAdmin.listTableNames(boolean includeSysTables)
+
 CompletableFuture<List<TableName>>
-AsyncHBaseAdmin.listTableNames(Optional<Pattern> pattern,
+AsyncHBaseAdmin.listTableNames(Pattern pattern,
   boolean includeSysTables)
 
 CompletableFuture<List<TableName>>
-AsyncAdmin.listTableNames(Optional<Pattern> pattern,
+AsyncAdmin.listTableNames(Pattern pattern,
   boolean includeSysTables)
 List all of the names of userspace tables.
 
 CompletableFuture<List<TableName>>
-RawAsyncHBaseAdmin.listTableNames(Optional<Pattern> pattern,
+RawAsyncHBaseAdmin.listTableNames(Pattern pattern,
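The hunk above replaces the Optional<Pattern>-taking listTableNames overloads with plain Pattern and boolean variants. The filtering those signatures imply can be sketched without a cluster like this (the table names, the "hbase:" system-namespace convention applied here, and the local helper are for illustration only, not the real client code):

```java
import java.util.List;
import java.util.regex.Pattern;
import java.util.stream.Collectors;

public class ListTableNamesSketch {
  // Mimics listTableNames(Pattern pattern, boolean includeSysTables):
  // names in the "hbase:" namespace are treated as system tables and
  // skipped unless includeSysTables is set.
  static List<String> listTableNames(List<String> all, Pattern pattern,
      boolean includeSysTables) {
    return all.stream()
        .filter(n -> includeSysTables || !n.startsWith("hbase:"))
        .filter(n -> pattern.matcher(n).matches())
        .collect(Collectors.toList());
  }

  public static void main(String[] args) {
    List<String> all = List.of("hbase:meta", "hbase:namespace",
        "test_users", "test_orders", "prod_events");
    System.out.println(listTableNames(all, Pattern.compile("test_.*"), false));
    // [test_users, test_orders]
    System.out.println(listTableNames(all, Pattern.compile(".*"), true).size());
    // 5
  }
}
```

Dropping Optional from the parameter list means callers who want "all tables" pass an explicit match-everything pattern (or use the no-argument or boolean-only overloads) instead of Optional.empty().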

[30/51] [partial] hbase-site git commit: Published site at .

2017-11-06 Thread git-site-role
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.AdminRpcCall.html
--
diff --git 
a/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.AdminRpcCall.html
 
b/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.AdminRpcCall.html
index 531081e..a22e5ce 100644
--- 
a/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.AdminRpcCall.html
+++ 
b/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.AdminRpcCall.html

[12/51] [partial] hbase-site git commit: Published site at .

2017-11-06 Thread git-site-role
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.TruncateTableProcedureBiConsumer.html
--
diff --git 
a/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.TruncateTableProcedureBiConsumer.html
 
b/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.TruncateTableProcedureBiConsumer.html
index 531081e..a22e5ce 100644
--- 
a/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.TruncateTableProcedureBiConsumer.html
+++ 
b/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.TruncateTableProcedureBiConsumer.html

[44/51] [partial] hbase-site git commit: Published site at .

2017-11-06 Thread git-site-role
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/checkstyle.rss
--
diff --git a/checkstyle.rss b/checkstyle.rss
index 4bb7159..b97262f 100644
--- a/checkstyle.rss
+++ b/checkstyle.rss
@@ -26,7 +26,7 @@ under the License.
 2007 - 2017 The Apache Software Foundation
 
   File: 3425,
- Errors: 21699,
+ Errors: 21610,
  Warnings: 0,
  Infos: 0
   
@@ -29917,7 +29917,7 @@ under the License.
   0
 
 
-  131
+  119
 
   
   
@@ -29987,7 +29987,7 @@ under the License.
   0
 
 
-  5
+  1
 
   
   
@@ -31163,7 +31163,7 @@ under the License.
   0
 
 
-  3
+  2
 
   
   
@@ -47571,7 +47571,7 @@ under the License.
   0
 
 
-  189
+  117
 
   
   

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/coc.html
--
diff --git a/coc.html b/coc.html
index c449795..88f48f2 100644
--- a/coc.html
+++ b/coc.html
@@ -7,7 +7,7 @@
   
 
 
-
+
 
 Apache HBase  
   Code of Conduct Policy
@@ -380,7 +380,7 @@ email to mailto:priv...@hbase.apache.org;>the priv
 https://www.apache.org/;>The Apache Software 
Foundation.
 All rights reserved.  
 
-  Last Published: 2017-11-05
+  Last Published: 2017-11-06
 
 
 

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/cygwin.html
--
diff --git a/cygwin.html b/cygwin.html
index 31be56d..13d36b8 100644
--- a/cygwin.html
+++ b/cygwin.html
@@ -7,7 +7,7 @@
   
 
 
-
+
 
 Apache HBase  Installing Apache HBase (TM) on Windows using 
Cygwin
 
@@ -679,7 +679,7 @@ Now your HBase server is running, start 
coding and build that next
 https://www.apache.org/;>The Apache Software 
Foundation.
 All rights reserved.  
 
-  Last Published: 2017-11-05
+  Last Published: 2017-11-06
 
 
 

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/dependencies.html
--
diff --git a/dependencies.html b/dependencies.html
index 977e558..6c760ef 100644
--- a/dependencies.html
+++ b/dependencies.html
@@ -7,7 +7,7 @@
   
 
 
-
+
 
 Apache HBase  Project Dependencies
 
@@ -445,7 +445,7 @@
 https://www.apache.org/;>The Apache Software 
Foundation.
 All rights reserved.  
 
-  Last Published: 2017-11-05
+  Last Published: 2017-11-06
 
 
 

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/dependency-convergence.html
--
diff --git a/dependency-convergence.html b/dependency-convergence.html
index f778807..6a9b1d5 100644
--- a/dependency-convergence.html
+++ b/dependency-convergence.html
@@ -7,7 +7,7 @@
   
 
 
-
+
 
 Apache HBase  Reactor Dependency Convergence
 
@@ -912,7 +912,7 @@
 https://www.apache.org/;>The Apache Software 
Foundation.
 All rights reserved.  
 
-  Last Published: 2017-11-05
+  Last Published: 2017-11-06
 
 
 

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/dependency-info.html
--
diff --git a/dependency-info.html b/dependency-info.html
index b7fd2c7..edfc02b 100644
--- a/dependency-info.html
+++ b/dependency-info.html
@@ -7,7 +7,7 @@
   
 
 
-
+
 
 Apache HBase  Dependency Information
 
@@ -318,7 +318,7 @@
 https://www.apache.org/;>The Apache Software 
Foundation.
 All rights reserved.  
 
-  Last Published: 2017-11-05
+  Last Published: 2017-11-06
 
 
 

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/dependency-management.html
--

[28/51] [partial] hbase-site git commit: Published site at .

2017-11-06 Thread git-site-role
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.CreateNamespaceProcedureBiConsumer.html
--
diff --git a/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.CreateNamespaceProcedureBiConsumer.html b/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.CreateNamespaceProcedureBiConsumer.html
index 531081e..a22e5ce 100644
--- a/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.CreateNamespaceProcedureBiConsumer.html
+++ b/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.CreateNamespaceProcedureBiConsumer.html
@@ -34,2832 +34,3011 @@
 026import java.util.Collections;
 027import java.util.EnumSet;
 028import java.util.HashMap;
-029import java.util.LinkedList;
-030import java.util.List;
-031import java.util.Map;
-032import java.util.Optional;
-033import java.util.Set;
-034import java.util.concurrent.CompletableFuture;
-035import java.util.concurrent.TimeUnit;
-036import java.util.concurrent.atomic.AtomicReference;
-037import java.util.function.BiConsumer;
-038import java.util.function.Function;
-039import java.util.regex.Pattern;
-040import java.util.stream.Collectors;
-041import java.util.stream.Stream;
-042
-043import org.apache.commons.io.IOUtils;
-044import org.apache.commons.logging.Log;
-045import org.apache.commons.logging.LogFactory;
-046import org.apache.hadoop.hbase.AsyncMetaTableAccessor;
-047import org.apache.hadoop.hbase.ClusterStatus;
-048import org.apache.hadoop.hbase.ClusterStatus.Option;
-049import org.apache.hadoop.hbase.HConstants;
-050import org.apache.hadoop.hbase.HRegionLocation;
-051import org.apache.hadoop.hbase.MetaTableAccessor;
-052import org.apache.hadoop.hbase.MetaTableAccessor.QueryType;
-053import org.apache.hadoop.hbase.NamespaceDescriptor;
-054import org.apache.hadoop.hbase.RegionLoad;
-055import org.apache.hadoop.hbase.RegionLocations;
-056import org.apache.hadoop.hbase.ServerName;
-057import org.apache.hadoop.hbase.TableExistsException;
-058import org.apache.hadoop.hbase.TableName;
-059import org.apache.hadoop.hbase.TableNotDisabledException;
-060import org.apache.hadoop.hbase.TableNotEnabledException;
-061import org.apache.hadoop.hbase.TableNotFoundException;
-062import org.apache.hadoop.hbase.UnknownRegionException;
-063import org.apache.hadoop.hbase.client.AsyncRpcRetryingCallerFactory.AdminRequestCallerBuilder;
-064import org.apache.hadoop.hbase.client.AsyncRpcRetryingCallerFactory.MasterRequestCallerBuilder;
-065import org.apache.hadoop.hbase.client.AsyncRpcRetryingCallerFactory.ServerRequestCallerBuilder;
-066import org.apache.hadoop.hbase.client.RawAsyncTable.CoprocessorCallable;
-067import org.apache.hadoop.hbase.client.Scan.ReadType;
-068import org.apache.hadoop.hbase.client.replication.ReplicationSerDeHelper;
-069import org.apache.hadoop.hbase.client.replication.TableCFs;
-070import org.apache.hadoop.hbase.client.security.SecurityCapability;
-071import org.apache.hadoop.hbase.exceptions.DeserializationException;
-072import org.apache.hadoop.hbase.ipc.HBaseRpcController;
-073import org.apache.hadoop.hbase.quotas.QuotaFilter;
-074import org.apache.hadoop.hbase.quotas.QuotaSettings;
-075import org.apache.hadoop.hbase.quotas.QuotaTableUtil;
-076import org.apache.hadoop.hbase.replication.ReplicationException;
-077import org.apache.hadoop.hbase.replication.ReplicationPeerConfig;
-078import org.apache.hadoop.hbase.replication.ReplicationPeerDescription;
-079import org.apache.hadoop.hbase.snapshot.ClientSnapshotDescriptionUtils;
-080import org.apache.hadoop.hbase.snapshot.RestoreSnapshotException;
-081import org.apache.hadoop.hbase.snapshot.SnapshotCreationException;
-082import org.apache.hadoop.hbase.util.Bytes;
-083import org.apache.hadoop.hbase.util.EnvironmentEdgeManager;
-084import org.apache.hadoop.hbase.util.ForeignExceptionUtil;
-085import org.apache.hadoop.hbase.util.Pair;
-086import org.apache.yetus.audience.InterfaceAudience;
-087
-088import org.apache.hadoop.hbase.shaded.com.google.protobuf.RpcCallback;
-089import org.apache.hadoop.hbase.shaded.io.netty.util.Timeout;
-090import org.apache.hadoop.hbase.shaded.io.netty.util.TimerTask;
-091import org.apache.hadoop.hbase.shaded.protobuf.ProtobufUtil;
-092import org.apache.hadoop.hbase.shaded.protobuf.RequestConverter;
-093import org.apache.hadoop.hbase.shaded.protobuf.generated.AdminProtos.AdminService;
-094import org.apache.hadoop.hbase.shaded.protobuf.generated.AdminProtos.ClearCompactionQueuesRequest;
-095import org.apache.hadoop.hbase.shaded.protobuf.generated.AdminProtos.ClearCompactionQueuesResponse;
-096import org.apache.hadoop.hbase.shaded.protobuf.generated.AdminProtos.CompactRegionRequest;
-097import org.apache.hadoop.hbase.shaded.protobuf.generated.AdminProtos.CompactRegionResponse;
-098import 

[08/51] [partial] hbase-site git commit: Published site at .

2017-11-06 Thread git-site-role
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/hbase-build-configuration/hbase-spark/dependency-management.html
--
diff --git a/hbase-build-configuration/hbase-spark/dependency-management.html b/hbase-build-configuration/hbase-spark/dependency-management.html
index 9a3efca..276bc4d 100644
--- a/hbase-build-configuration/hbase-spark/dependency-management.html
+++ b/hbase-build-configuration/hbase-spark/dependency-management.html
@@ -7,7 +7,7 @@
   
 
 
-
+
 
 Apache HBase - Spark  Project Dependency Management
 
@@ -768,7 +768,7 @@
 https://www.apache.org/;>The Apache Software Foundation.
 All rights reserved.  
 
-  Last Published: 2017-11-05
+  Last Published: 2017-11-06
 
 
 

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/hbase-build-configuration/hbase-spark/index.html
--
diff --git a/hbase-build-configuration/hbase-spark/index.html b/hbase-build-configuration/hbase-spark/index.html
index d2509aa..62b4d47 100644
--- a/hbase-build-configuration/hbase-spark/index.html
+++ b/hbase-build-configuration/hbase-spark/index.html
@@ -7,7 +7,7 @@
   
 
 
-
+
 
 Apache HBase - Spark  About
 
@@ -119,7 +119,7 @@
 https://www.apache.org/;>The Apache Software Foundation.
 All rights reserved.  
 
-  Last Published: 2017-11-05
+  Last Published: 2017-11-06
 
 
 

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/hbase-build-configuration/hbase-spark/integration.html
--
diff --git a/hbase-build-configuration/hbase-spark/integration.html b/hbase-build-configuration/hbase-spark/integration.html
index fdf9508..4b03c2a 100644
--- a/hbase-build-configuration/hbase-spark/integration.html
+++ b/hbase-build-configuration/hbase-spark/integration.html
@@ -7,7 +7,7 @@
   
 
 
-
+
 
 Apache HBase - Spark  CI Management
 
@@ -126,7 +126,7 @@
 https://www.apache.org/;>The Apache Software Foundation.
 All rights reserved.  
 
-  Last Published: 2017-11-05
+  Last Published: 2017-11-06
 
 
 

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/hbase-build-configuration/hbase-spark/issue-tracking.html
--
diff --git a/hbase-build-configuration/hbase-spark/issue-tracking.html b/hbase-build-configuration/hbase-spark/issue-tracking.html
index d8f0940..226bae4 100644
--- a/hbase-build-configuration/hbase-spark/issue-tracking.html
+++ b/hbase-build-configuration/hbase-spark/issue-tracking.html
@@ -7,7 +7,7 @@
   
 
 
-
+
 
 Apache HBase - Spark  Issue Management
 
@@ -123,7 +123,7 @@
 https://www.apache.org/;>The Apache Software Foundation.
 All rights reserved.  
 
-  Last Published: 2017-11-05
+  Last Published: 2017-11-06
 
 
 

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/hbase-build-configuration/hbase-spark/license.html
--
diff --git a/hbase-build-configuration/hbase-spark/license.html b/hbase-build-configuration/hbase-spark/license.html
index 2a37c79..d1a3ede 100644
--- a/hbase-build-configuration/hbase-spark/license.html
+++ b/hbase-build-configuration/hbase-spark/license.html
@@ -7,7 +7,7 @@
   
 
 
-
+
 
 Apache HBase - Spark  Project Licenses
 
@@ -326,7 +326,7 @@
 https://www.apache.org/;>The Apache Software Foundation.
 All rights reserved.  
 
-  Last Published: 2017-11-05
+  Last Published: 2017-11-06
 
 
 

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/hbase-build-configuration/hbase-spark/mail-lists.html
--
diff --git a/hbase-build-configuration/hbase-spark/mail-lists.html b/hbase-build-configuration/hbase-spark/mail-lists.html
index f1b2138..cee3583 100644
--- a/hbase-build-configuration/hbase-spark/mail-lists.html
+++ b/hbase-build-configuration/hbase-spark/mail-lists.html
@@ -7,7 +7,7 @@
   
 
 
-
+
 
 Apache HBase - Spark  Project Mailing Lists
 
@@ -176,7 +176,7 @@
 https://www.apache.org/;>The Apache Software Foundation.

[21/51] [partial] hbase-site git commit: Published site at .

2017-11-06 Thread git-site-role
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.MasterRpcCall.html
--
diff --git a/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.MasterRpcCall.html b/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.MasterRpcCall.html
index 531081e..a22e5ce 100644
--- a/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.MasterRpcCall.html
+++ b/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.MasterRpcCall.html
@@ -34,2832 +34,3011 @@
 026import java.util.Collections;
 027import java.util.EnumSet;
 028import java.util.HashMap;
-029import java.util.LinkedList;
-030import java.util.List;
-031import java.util.Map;
-032import java.util.Optional;
-033import java.util.Set;
-034import java.util.concurrent.CompletableFuture;
-035import java.util.concurrent.TimeUnit;
-036import java.util.concurrent.atomic.AtomicReference;
-037import java.util.function.BiConsumer;
-038import java.util.function.Function;
-039import java.util.regex.Pattern;
-040import java.util.stream.Collectors;
-041import java.util.stream.Stream;
-042
-043import org.apache.commons.io.IOUtils;
-044import org.apache.commons.logging.Log;
-045import org.apache.commons.logging.LogFactory;
-046import org.apache.hadoop.hbase.AsyncMetaTableAccessor;
-047import org.apache.hadoop.hbase.ClusterStatus;
-048import org.apache.hadoop.hbase.ClusterStatus.Option;
-049import org.apache.hadoop.hbase.HConstants;
-050import org.apache.hadoop.hbase.HRegionLocation;
-051import org.apache.hadoop.hbase.MetaTableAccessor;
-052import org.apache.hadoop.hbase.MetaTableAccessor.QueryType;
-053import org.apache.hadoop.hbase.NamespaceDescriptor;
-054import org.apache.hadoop.hbase.RegionLoad;
-055import org.apache.hadoop.hbase.RegionLocations;
-056import org.apache.hadoop.hbase.ServerName;
-057import org.apache.hadoop.hbase.TableExistsException;
-058import org.apache.hadoop.hbase.TableName;
-059import org.apache.hadoop.hbase.TableNotDisabledException;
-060import org.apache.hadoop.hbase.TableNotEnabledException;
-061import org.apache.hadoop.hbase.TableNotFoundException;
-062import org.apache.hadoop.hbase.UnknownRegionException;
-063import org.apache.hadoop.hbase.client.AsyncRpcRetryingCallerFactory.AdminRequestCallerBuilder;
-064import org.apache.hadoop.hbase.client.AsyncRpcRetryingCallerFactory.MasterRequestCallerBuilder;
-065import org.apache.hadoop.hbase.client.AsyncRpcRetryingCallerFactory.ServerRequestCallerBuilder;
-066import org.apache.hadoop.hbase.client.RawAsyncTable.CoprocessorCallable;
-067import org.apache.hadoop.hbase.client.Scan.ReadType;
-068import org.apache.hadoop.hbase.client.replication.ReplicationSerDeHelper;
-069import org.apache.hadoop.hbase.client.replication.TableCFs;
-070import org.apache.hadoop.hbase.client.security.SecurityCapability;
-071import org.apache.hadoop.hbase.exceptions.DeserializationException;
-072import org.apache.hadoop.hbase.ipc.HBaseRpcController;
-073import org.apache.hadoop.hbase.quotas.QuotaFilter;
-074import org.apache.hadoop.hbase.quotas.QuotaSettings;
-075import org.apache.hadoop.hbase.quotas.QuotaTableUtil;
-076import org.apache.hadoop.hbase.replication.ReplicationException;
-077import org.apache.hadoop.hbase.replication.ReplicationPeerConfig;
-078import org.apache.hadoop.hbase.replication.ReplicationPeerDescription;
-079import org.apache.hadoop.hbase.snapshot.ClientSnapshotDescriptionUtils;
-080import org.apache.hadoop.hbase.snapshot.RestoreSnapshotException;
-081import org.apache.hadoop.hbase.snapshot.SnapshotCreationException;
-082import org.apache.hadoop.hbase.util.Bytes;
-083import org.apache.hadoop.hbase.util.EnvironmentEdgeManager;
-084import org.apache.hadoop.hbase.util.ForeignExceptionUtil;
-085import org.apache.hadoop.hbase.util.Pair;
-086import org.apache.yetus.audience.InterfaceAudience;
-087
-088import org.apache.hadoop.hbase.shaded.com.google.protobuf.RpcCallback;
-089import org.apache.hadoop.hbase.shaded.io.netty.util.Timeout;
-090import org.apache.hadoop.hbase.shaded.io.netty.util.TimerTask;
-091import org.apache.hadoop.hbase.shaded.protobuf.ProtobufUtil;
-092import org.apache.hadoop.hbase.shaded.protobuf.RequestConverter;
-093import org.apache.hadoop.hbase.shaded.protobuf.generated.AdminProtos.AdminService;
-094import org.apache.hadoop.hbase.shaded.protobuf.generated.AdminProtos.ClearCompactionQueuesRequest;
-095import org.apache.hadoop.hbase.shaded.protobuf.generated.AdminProtos.ClearCompactionQueuesResponse;
-096import org.apache.hadoop.hbase.shaded.protobuf.generated.AdminProtos.CompactRegionRequest;
-097import org.apache.hadoop.hbase.shaded.protobuf.generated.AdminProtos.CompactRegionResponse;
-098import org.apache.hadoop.hbase.shaded.protobuf.generated.AdminProtos.FlushRegionRequest;
-099import 

[40/51] [partial] hbase-site git commit: Published site at .

2017-11-06 Thread git-site-role
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/devapidocs/org/apache/hadoop/hbase/client/AsyncAdmin.html
--
diff --git a/devapidocs/org/apache/hadoop/hbase/client/AsyncAdmin.html b/devapidocs/org/apache/hadoop/hbase/client/AsyncAdmin.html
index eec8c88..9799603 100644
--- a/devapidocs/org/apache/hadoop/hbase/client/AsyncAdmin.html
+++ b/devapidocs/org/apache/hadoop/hbase/client/AsyncAdmin.html
@@ -18,7 +18,7 @@
 catch(err) {
 }
 //-->
-var methods = {"i0":6,"i1":6,"i2":6,"i3":6,"i4":6,"i5":18,"i6":6,"i7":6,"i8":6,"i9":6,"i10":18,"i11":6,"i12":18,"i13":6,"i14":6,"i15":6,"i16":6,"i17":6,"i18":18,"i19":6,"i20":6,"i21":6,"i22":6,"i23":6,"i24":6,"i25":18,"i26":6,"i27":6,"i28":6,"i29":6,"i30":6,"i31":6,"i32":6,"i33":6,"i34":6,"i35":6,"i36":18,"i37":6,"i38":6,"i39":6,"i40":6,"i41":6,"i42":6,"i43":6,"i44":18,"i45":18,"i46":6,"i47":6,"i48":6,"i49":6,"i50":18,"i51":6,"i52":18,"i53":6,"i54":6,"i55":6,"i56":6,"i57":6,"i58":6,"i59":6,"i60":6,"i61":6,"i62":6,"i63":6,"i64":6,"i65":6,"i66":18,"i67":6,"i68":6,"i69":6,"i70":18,"i71":6,"i72":6,"i73":6,"i74":18,"i75":6,"i76":18,"i77":6,"i78":18,"i79":6,"i80":18,"i81":6,"i82":6,"i83":18,"i84":6,"i85":18,"i86":6,"i87":6,"i88":6,"i89":6,"i90":6,"i91":6,"i92":6,"i93":6,"i94":6,"i95":6,"i96":6,"i97":6,"i98":6,"i99":6,"i100":6,"i101":6,"i102":6,"i103":6,"i104":6,"i105":6,"i106":6,"i107":6,"i108":6,"i109":6,"i110":6,"i111":18,"i112":18,"i113":6,"i114":6,"i115":18,"i116":6,"i117":6,"i118":6,"i119":6,"i120":6,"i121":6,"i122":6,"i123":6,"i124":6};
+var methods = {"i0":6,"i1":6,"i2":6,"i3":6,"i4":6,"i5":18,"i6":6,"i7":6,"i8":6,"i9":6,"i10":6,"i11":6,"i12":6,"i13":6,"i14":6,"i15":6,"i16":6,"i17":6,"i18":6,"i19":6,"i20":6,"i21":6,"i22":6,"i23":6,"i24":6,"i25":6,"i26":6,"i27":6,"i28":6,"i29":6,"i30":6,"i31":6,"i32":6,"i33":6,"i34":6,"i35":6,"i36":6,"i37":6,"i38":18,"i39":6,"i40":6,"i41":6,"i42":6,"i43":6,"i44":6,"i45":6,"i46":18,"i47":18,"i48":6,"i49":6,"i50":6,"i51":6,"i52":6,"i53":6,"i54":18,"i55":6,"i56":6,"i57":6,"i58":6,"i59":6,"i60":6,"i61":6,"i62":6,"i63":6,"i64":6,"i65":6,"i66":6,"i67":6,"i68":6,"i69":6,"i70":6,"i71":6,"i72":18,"i73":6,"i74":6,"i75":6,"i76":6,"i77":6,"i78":6,"i79":6,"i80":18,"i81":6,"i82":6,"i83":18,"i84":6,"i85":6,"i86":6,"i87":6,"i88":6,"i89":6,"i90":6,"i91":6,"i92":6,"i93":6,"i94":6,"i95":6,"i96":6,"i97":6,"i98":6,"i99":6,"i100":6,"i101":6,"i102":6,"i103":6,"i104":6,"i105":6,"i106":6,"i107":6,"i108":6,"i109":6,"i110":6,"i111":6,"i112":6,"i113":6,"i114":6,"i115":6,"i116":6,"i117":18,"i118":18,"i119":6,"i120":6,"i121":6,"i122":6,"i123":6,"i124":6,"i125":6,"i126":6,"i127":6,"i128":6,"i129":6,"i130":6};
 var tabs = {65535:["t0","All Methods"],2:["t2","Instance 
Methods"],4:["t3","Abstract Methods"],16:["t5","Default Methods"]};
 var altColor = "altColor";
 var rowColor = "rowColor";
@@ -198,28 +198,28 @@ public interface 
-default CompletableFuture<Void>
+CompletableFuture<Void>
 compact(TableName tableName)
 Compact a table.
 
 
 
 CompletableFuture<Void>
-compact(TableName tableName,
-   Optional<byte[]> columnFamily)
+compact(TableName tableName,
+   byte[] columnFamily)
 Compact a column family within a table.
 
 
 
-default CompletableFuture<Void>
+CompletableFuture<Void>
 compactRegion(byte[] regionName)
 Compact an individual region.
 
 
 
 http://docs.oracle.com/javase/8/docs/api/java/util/concurrent/CompletableFuture.html?is-external=true;
 title="class or interface in 

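One API-shape change visible in this AsyncAdmin hunk is the removal of the `Optional<byte[]> columnFamily` parameter in favor of a plain `byte[]` plus a no-argument overload. A minimal sketch of that overload pattern follows; the interface and method names here are illustrative stand-ins, not the real HBase `AsyncAdmin` API.

```java
import java.util.concurrent.CompletableFuture;

// Hypothetical mini-API mirroring the shape change in the diff above:
// instead of compact(table, Optional.empty()) for "all families", callers
// get a dedicated overload, and the byte[] variant takes the family directly.
interface CompactApi {
    // Single abstract method, so a lambda can implement it.
    CompletableFuture<String> compact(String table, byte[] columnFamily);

    // Convenience overload: compact every column family.
    default CompletableFuture<String> compact(String table) {
        return compact(table, null); // null means "all families" in this sketch
    }
}

public class CompactApiDemo {
    static String describe(String table, byte[] cf) {
        return table + ":" + (cf == null ? "ALL" : new String(cf));
    }

    public static void main(String[] args) throws Exception {
        CompactApi api = (table, cf) ->
            CompletableFuture.completedFuture(describe(table, cf));

        System.out.println(api.compact("t1").get());                 // t1:ALL
        System.out.println(api.compact("t1", "f".getBytes()).get()); // t1:f
    }
}
```

The overload style keeps call sites short (`compact("t1")`) and avoids forcing every caller to wrap arguments in `Optional`, which is generally discouraged in method parameters.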
[03/51] [partial] hbase-site git commit: Published site at .

2017-11-06 Thread git-site-role
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/testdevapidocs/src-html/org/apache/hadoop/hbase/coprocessor/TestMasterObserver.CPMasterObserver.html
--
diff --git a/testdevapidocs/src-html/org/apache/hadoop/hbase/coprocessor/TestMasterObserver.CPMasterObserver.html b/testdevapidocs/src-html/org/apache/hadoop/hbase/coprocessor/TestMasterObserver.CPMasterObserver.html
index 714fe7f..f96c848 100644
--- a/testdevapidocs/src-html/org/apache/hadoop/hbase/coprocessor/TestMasterObserver.CPMasterObserver.html
+++ b/testdevapidocs/src-html/org/apache/hadoop/hbase/coprocessor/TestMasterObserver.CPMasterObserver.html
@@ -1551,7 +1551,7 @@
 1543  assertTrue("Found server", found);
 1544  LOG.info("Found " + destName);
 1545  master.getMasterRpcServices().moveRegion(null, RequestConverter.buildMoveRegionRequest(
-1546  firstGoodPair.getRegionInfo().getEncodedNameAsBytes(),Bytes.toBytes(destName)));
+1546  firstGoodPair.getRegionInfo().getEncodedNameAsBytes(), ServerName.valueOf(destName)));
 1547  assertTrue("Coprocessor should have been called on region move",
 1548cp.wasMoveCalled());
 1549
@@ -1573,130 +1573,131 @@
 1565  UTIL.waitUntilNoRegionsInTransition();
 1566  List<RegionInfo> openRegions = ProtobufUtil.getOnlineRegions(rs.getRSRpcServices());
 1567  int moveCnt = openRegions.size()/2;
-1568  for (int i=0; i<moveCnt; i++) {
+1568  for (int i = 0; i < moveCnt; i++) {
 1569RegionInfo info = openRegions.get(i);
 1570if (!info.isMetaRegion()) {
-1571  master.getMasterRpcServices().moveRegion(null, RequestConverter.buildMoveRegionRequest(
-1572  openRegions.get(i).getEncodedNameAsBytes(), destRS));
-1573}
-1574  }
-1575  //Make sure no regions are in transition now
-1576  UTIL.waitUntilNoRegionsInTransition();
-1577  // now trigger a balance
-1578  master.balanceSwitch(true);
-1579  boolean balanceRun = master.balance();
-1580  assertTrue("Coprocessor should be called on region rebalancing",
-1581  cp.wasBalanceCalled());
-1582} finally {
-1583  Admin admin = UTIL.getAdmin();
-1584  admin.disableTable(tableName);
-1585  deleteTable(admin, tableName);
-1586}
-1587  }
-1588
-1589  @Test (timeout=18)
-1590  public void testTableDescriptorsEnumeration() throws Exception {
-1591MiniHBaseCluster cluster = UTIL.getHBaseCluster();
-1592
-1593HMaster master = cluster.getMaster();
-1594MasterCoprocessorHost host = master.getMasterCoprocessorHost();
-1595CPMasterObserver cp = host.findCoprocessor(CPMasterObserver.class);
-1596cp.resetStates();
-1597
-1598GetTableDescriptorsRequest req =
-1599RequestConverter.buildGetTableDescriptorsRequest((List<TableName>)null);
-1600master.getMasterRpcServices().getTableDescriptors(null, req);
-1601
-1602assertTrue("Coprocessor should be called on table descriptors request",
-1603  cp.wasGetTableDescriptorsCalled());
-1604  }
-1605
-1606  @Test (timeout=18)
-1607  public void testTableNamesEnumeration() throws Exception {
-1608MiniHBaseCluster cluster = UTIL.getHBaseCluster();
-1609
-1610HMaster master = cluster.getMaster();
-1611MasterCoprocessorHost host = master.getMasterCoprocessorHost();
-1612CPMasterObserver cp = host.findCoprocessor(CPMasterObserver.class);
-1613cp.resetStates();
-1614
-1615master.getMasterRpcServices().getTableNames(null,
-1616GetTableNamesRequest.newBuilder().build());
-1617assertTrue("Coprocessor should be called on table names request",
-1618  cp.wasGetTableNamesCalled());
-1619  }
-1620
-1621  @Test (timeout=18)
-1622  public void testAbortProcedureOperation() throws Exception {
-1623MiniHBaseCluster cluster = UTIL.getHBaseCluster();
-1624
-1625HMaster master = cluster.getMaster();
-1626MasterCoprocessorHost host = master.getMasterCoprocessorHost();
-1627CPMasterObserver cp = host.findCoprocessor(CPMasterObserver.class);
-1628cp.resetStates();
-1629
-1630master.abortProcedure(1, true);
-1631assertTrue(
-1632  "Coprocessor should be called on abort procedure request",
-1633  cp.wasAbortProcedureCalled());
-1634  }
-1635
-1636  @Test (timeout=18)
-1637  public void testGetProceduresOperation() throws Exception {
-1638MiniHBaseCluster cluster = UTIL.getHBaseCluster();
-1639
-1640HMaster master = cluster.getMaster();
-1641MasterCoprocessorHost host = master.getMasterCoprocessorHost();
-1642CPMasterObserver cp = host.findCoprocessor(CPMasterObserver.class);
-1643cp.resetStates();
-1644
-1645master.getProcedures();
-1646assertTrue(
-1647  "Coprocessor should be called on get procedures request",
-1648  cp.wasGetProceduresCalled());
-1649  }
-1650
-1651  @Test (timeout=18)
-1652  public void 

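The hunk above replaces a raw `Bytes.toBytes(destName)` destination in the move-region request with a typed `ServerName.valueOf(destName)`. A minimal stand-in for such a typed wrapper is sketched below, assuming the conventional `host,port,startcode` string form; the class and its parsing are illustrative, not the real `org.apache.hadoop.hbase.ServerName`.

```java
// Illustrative only: a typed server-name value object in the spirit of the
// change above. Unlike an opaque byte[], it validates its input up front.
final class MiniServerName {
    final String host;
    final int port;
    final long startCode;

    private MiniServerName(String host, int port, long startCode) {
        this.host = host;
        this.port = port;
        this.startCode = startCode;
    }

    // Fails fast on malformed input instead of passing bad bytes downstream.
    static MiniServerName valueOf(String name) {
        String[] parts = name.split(",");
        if (parts.length != 3) {
            throw new IllegalArgumentException("expected host,port,startcode: " + name);
        }
        return new MiniServerName(parts[0], Integer.parseInt(parts[1]), Long.parseLong(parts[2]));
    }

    @Override
    public String toString() {
        return host + "," + port + "," + startCode;
    }
}

public class MiniServerNameDemo {
    public static void main(String[] args) {
        MiniServerName sn = MiniServerName.valueOf("rs1.example.com,16020,1509955200000");
        System.out.println(sn.host + " " + sn.port); // rs1.example.com 16020
    }
}
```

Moving validation into a `valueOf`-style factory is the design point: a typo in the destination name surfaces as an immediate exception at the call site rather than as a mysterious RPC failure later.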
[18/51] [partial] hbase-site git commit: Published site at .

2017-11-06 Thread git-site-role
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.ModifyNamespaceProcedureBiConsumer.html
--
diff --git a/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.ModifyNamespaceProcedureBiConsumer.html b/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.ModifyNamespaceProcedureBiConsumer.html
index 531081e..a22e5ce 100644
--- a/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.ModifyNamespaceProcedureBiConsumer.html
+++ b/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.ModifyNamespaceProcedureBiConsumer.html
@@ -34,2832 +34,3011 @@
 026import java.util.Collections;
 027import java.util.EnumSet;
 028import java.util.HashMap;
-029import java.util.LinkedList;
-030import java.util.List;
-031import java.util.Map;
-032import java.util.Optional;
-033import java.util.Set;
-034import java.util.concurrent.CompletableFuture;
-035import java.util.concurrent.TimeUnit;
-036import java.util.concurrent.atomic.AtomicReference;
-037import java.util.function.BiConsumer;
-038import java.util.function.Function;
-039import java.util.regex.Pattern;
-040import java.util.stream.Collectors;
-041import java.util.stream.Stream;
-042
-043import org.apache.commons.io.IOUtils;
-044import org.apache.commons.logging.Log;
-045import org.apache.commons.logging.LogFactory;
-046import org.apache.hadoop.hbase.AsyncMetaTableAccessor;
-047import org.apache.hadoop.hbase.ClusterStatus;
-048import org.apache.hadoop.hbase.ClusterStatus.Option;
-049import org.apache.hadoop.hbase.HConstants;
-050import org.apache.hadoop.hbase.HRegionLocation;
-051import org.apache.hadoop.hbase.MetaTableAccessor;
-052import org.apache.hadoop.hbase.MetaTableAccessor.QueryType;
-053import org.apache.hadoop.hbase.NamespaceDescriptor;
-054import org.apache.hadoop.hbase.RegionLoad;
-055import org.apache.hadoop.hbase.RegionLocations;
-056import org.apache.hadoop.hbase.ServerName;
-057import org.apache.hadoop.hbase.TableExistsException;
-058import org.apache.hadoop.hbase.TableName;
-059import org.apache.hadoop.hbase.TableNotDisabledException;
-060import org.apache.hadoop.hbase.TableNotEnabledException;
-061import org.apache.hadoop.hbase.TableNotFoundException;
-062import org.apache.hadoop.hbase.UnknownRegionException;
-063import org.apache.hadoop.hbase.client.AsyncRpcRetryingCallerFactory.AdminRequestCallerBuilder;
-064import org.apache.hadoop.hbase.client.AsyncRpcRetryingCallerFactory.MasterRequestCallerBuilder;
-065import org.apache.hadoop.hbase.client.AsyncRpcRetryingCallerFactory.ServerRequestCallerBuilder;
-066import org.apache.hadoop.hbase.client.RawAsyncTable.CoprocessorCallable;
-067import org.apache.hadoop.hbase.client.Scan.ReadType;
-068import org.apache.hadoop.hbase.client.replication.ReplicationSerDeHelper;
-069import org.apache.hadoop.hbase.client.replication.TableCFs;
-070import org.apache.hadoop.hbase.client.security.SecurityCapability;
-071import org.apache.hadoop.hbase.exceptions.DeserializationException;
-072import org.apache.hadoop.hbase.ipc.HBaseRpcController;
-073import org.apache.hadoop.hbase.quotas.QuotaFilter;
-074import org.apache.hadoop.hbase.quotas.QuotaSettings;
-075import org.apache.hadoop.hbase.quotas.QuotaTableUtil;
-076import org.apache.hadoop.hbase.replication.ReplicationException;
-077import org.apache.hadoop.hbase.replication.ReplicationPeerConfig;
-078import org.apache.hadoop.hbase.replication.ReplicationPeerDescription;
-079import org.apache.hadoop.hbase.snapshot.ClientSnapshotDescriptionUtils;
-080import org.apache.hadoop.hbase.snapshot.RestoreSnapshotException;
-081import org.apache.hadoop.hbase.snapshot.SnapshotCreationException;
-082import org.apache.hadoop.hbase.util.Bytes;
-083import org.apache.hadoop.hbase.util.EnvironmentEdgeManager;
-084import org.apache.hadoop.hbase.util.ForeignExceptionUtil;
-085import org.apache.hadoop.hbase.util.Pair;
-086import org.apache.yetus.audience.InterfaceAudience;
-087
-088import org.apache.hadoop.hbase.shaded.com.google.protobuf.RpcCallback;
-089import org.apache.hadoop.hbase.shaded.io.netty.util.Timeout;
-090import org.apache.hadoop.hbase.shaded.io.netty.util.TimerTask;
-091import org.apache.hadoop.hbase.shaded.protobuf.ProtobufUtil;
-092import org.apache.hadoop.hbase.shaded.protobuf.RequestConverter;
-093import org.apache.hadoop.hbase.shaded.protobuf.generated.AdminProtos.AdminService;
-094import org.apache.hadoop.hbase.shaded.protobuf.generated.AdminProtos.ClearCompactionQueuesRequest;
-095import org.apache.hadoop.hbase.shaded.protobuf.generated.AdminProtos.ClearCompactionQueuesResponse;
-096import org.apache.hadoop.hbase.shaded.protobuf.generated.AdminProtos.CompactRegionRequest;
-097import org.apache.hadoop.hbase.shaded.protobuf.generated.AdminProtos.CompactRegionResponse;
-098import 

[27/51] [partial] hbase-site git commit: Published site at .

2017-11-06 Thread git-site-role
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.CreateTableProcedureBiConsumer.html
--
diff --git a/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.CreateTableProcedureBiConsumer.html b/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.CreateTableProcedureBiConsumer.html
index 531081e..a22e5ce 100644
--- a/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.CreateTableProcedureBiConsumer.html
+++ b/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.CreateTableProcedureBiConsumer.html
@@ -34,2832 +34,3011 @@
 026import java.util.Collections;
 027import java.util.EnumSet;
 028import java.util.HashMap;
-029import java.util.LinkedList;
-030import java.util.List;
-031import java.util.Map;
-032import java.util.Optional;
-033import java.util.Set;
-034import java.util.concurrent.CompletableFuture;
-035import java.util.concurrent.TimeUnit;
-036import java.util.concurrent.atomic.AtomicReference;
-037import java.util.function.BiConsumer;
-038import java.util.function.Function;
-039import java.util.regex.Pattern;
-040import java.util.stream.Collectors;
-041import java.util.stream.Stream;
-042
-043import org.apache.commons.io.IOUtils;
-044import org.apache.commons.logging.Log;
-045import org.apache.commons.logging.LogFactory;
-046import org.apache.hadoop.hbase.AsyncMetaTableAccessor;
-047import org.apache.hadoop.hbase.ClusterStatus;
-048import org.apache.hadoop.hbase.ClusterStatus.Option;
-049import org.apache.hadoop.hbase.HConstants;
-050import org.apache.hadoop.hbase.HRegionLocation;
-051import org.apache.hadoop.hbase.MetaTableAccessor;
-052import org.apache.hadoop.hbase.MetaTableAccessor.QueryType;
-053import org.apache.hadoop.hbase.NamespaceDescriptor;
-054import org.apache.hadoop.hbase.RegionLoad;
-055import org.apache.hadoop.hbase.RegionLocations;
-056import org.apache.hadoop.hbase.ServerName;
-057import org.apache.hadoop.hbase.TableExistsException;
-058import org.apache.hadoop.hbase.TableName;
-059import org.apache.hadoop.hbase.TableNotDisabledException;
-060import org.apache.hadoop.hbase.TableNotEnabledException;
-061import org.apache.hadoop.hbase.TableNotFoundException;
-062import org.apache.hadoop.hbase.UnknownRegionException;
-063import org.apache.hadoop.hbase.client.AsyncRpcRetryingCallerFactory.AdminRequestCallerBuilder;
-064import org.apache.hadoop.hbase.client.AsyncRpcRetryingCallerFactory.MasterRequestCallerBuilder;
-065import org.apache.hadoop.hbase.client.AsyncRpcRetryingCallerFactory.ServerRequestCallerBuilder;
-066import org.apache.hadoop.hbase.client.RawAsyncTable.CoprocessorCallable;
-067import org.apache.hadoop.hbase.client.Scan.ReadType;
-068import org.apache.hadoop.hbase.client.replication.ReplicationSerDeHelper;
-069import org.apache.hadoop.hbase.client.replication.TableCFs;
-070import org.apache.hadoop.hbase.client.security.SecurityCapability;
-071import org.apache.hadoop.hbase.exceptions.DeserializationException;
-072import org.apache.hadoop.hbase.ipc.HBaseRpcController;
-073import org.apache.hadoop.hbase.quotas.QuotaFilter;
-074import org.apache.hadoop.hbase.quotas.QuotaSettings;
-075import org.apache.hadoop.hbase.quotas.QuotaTableUtil;
-076import org.apache.hadoop.hbase.replication.ReplicationException;
-077import org.apache.hadoop.hbase.replication.ReplicationPeerConfig;
-078import org.apache.hadoop.hbase.replication.ReplicationPeerDescription;
-079import org.apache.hadoop.hbase.snapshot.ClientSnapshotDescriptionUtils;
-080import org.apache.hadoop.hbase.snapshot.RestoreSnapshotException;
-081import org.apache.hadoop.hbase.snapshot.SnapshotCreationException;
-082import org.apache.hadoop.hbase.util.Bytes;
-083import org.apache.hadoop.hbase.util.EnvironmentEdgeManager;
-084import org.apache.hadoop.hbase.util.ForeignExceptionUtil;
-085import org.apache.hadoop.hbase.util.Pair;
-086import org.apache.yetus.audience.InterfaceAudience;
-087
-088import org.apache.hadoop.hbase.shaded.com.google.protobuf.RpcCallback;
-089import org.apache.hadoop.hbase.shaded.io.netty.util.Timeout;
-090import org.apache.hadoop.hbase.shaded.io.netty.util.TimerTask;
-091import org.apache.hadoop.hbase.shaded.protobuf.ProtobufUtil;
-092import org.apache.hadoop.hbase.shaded.protobuf.RequestConverter;
-093import org.apache.hadoop.hbase.shaded.protobuf.generated.AdminProtos.AdminService;
-094import org.apache.hadoop.hbase.shaded.protobuf.generated.AdminProtos.ClearCompactionQueuesRequest;
-095import org.apache.hadoop.hbase.shaded.protobuf.generated.AdminProtos.ClearCompactionQueuesResponse;
-096import org.apache.hadoop.hbase.shaded.protobuf.generated.AdminProtos.CompactRegionRequest;
-097import org.apache.hadoop.hbase.shaded.protobuf.generated.AdminProtos.CompactRegionResponse;
-098import 

[06/51] [partial] hbase-site git commit: Published site at .

2017-11-06 Thread git-site-role
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/testdevapidocs/org/apache/hadoop/hbase/coprocessor/TestMasterObserver.html
--
diff --git 
a/testdevapidocs/org/apache/hadoop/hbase/coprocessor/TestMasterObserver.html 
b/testdevapidocs/org/apache/hadoop/hbase/coprocessor/TestMasterObserver.html
index 5cc6515..ccf45f5 100644
--- a/testdevapidocs/org/apache/hadoop/hbase/coprocessor/TestMasterObserver.html
+++ b/testdevapidocs/org/apache/hadoop/hbase/coprocessor/TestMasterObserver.html
@@ -529,7 +529,7 @@ extends http://docs.oracle.com/javase/8/docs/api/java/lang/Object.html?
 
 
 testTableDescriptorsEnumeration
-publicvoidtestTableDescriptorsEnumeration()
+publicvoidtestTableDescriptorsEnumeration()
  throws http://docs.oracle.com/javase/8/docs/api/java/lang/Exception.html?is-external=true;
 title="class or interface in java.lang">Exception
 
 Throws:
@@ -543,7 +543,7 @@ extends http://docs.oracle.com/javase/8/docs/api/java/lang/Object.html?
 
 
 testTableNamesEnumeration
-publicvoidtestTableNamesEnumeration()
+publicvoidtestTableNamesEnumeration()
throws http://docs.oracle.com/javase/8/docs/api/java/lang/Exception.html?is-external=true;
 title="class or interface in java.lang">Exception
 
 Throws:
@@ -557,7 +557,7 @@ extends http://docs.oracle.com/javase/8/docs/api/java/lang/Object.html?
 
 
 testAbortProcedureOperation
-publicvoidtestAbortProcedureOperation()
+publicvoidtestAbortProcedureOperation()
  throws http://docs.oracle.com/javase/8/docs/api/java/lang/Exception.html?is-external=true;
 title="class or interface in java.lang">Exception
 
 Throws:
@@ -571,7 +571,7 @@ extends http://docs.oracle.com/javase/8/docs/api/java/lang/Object.html?
 
 
 testGetProceduresOperation
-publicvoidtestGetProceduresOperation()
+publicvoidtestGetProceduresOperation()
 throws http://docs.oracle.com/javase/8/docs/api/java/lang/Exception.html?is-external=true;
 title="class or interface in java.lang">Exception
 
 Throws:
@@ -585,7 +585,7 @@ extends http://docs.oracle.com/javase/8/docs/api/java/lang/Object.html?
 
 
 testGetLocksOperation
-publicvoidtestGetLocksOperation()
+publicvoidtestGetLocksOperation()
throws http://docs.oracle.com/javase/8/docs/api/java/lang/Exception.html?is-external=true;
 title="class or interface in java.lang">Exception
 
 Throws:
@@ -599,7 +599,7 @@ extends http://docs.oracle.com/javase/8/docs/api/java/lang/Object.html?
 
 
 deleteTable
-privatevoiddeleteTable(org.apache.hadoop.hbase.client.Adminadmin,
+privatevoiddeleteTable(org.apache.hadoop.hbase.client.Adminadmin,
  org.apache.hadoop.hbase.TableNametableName)
   throws http://docs.oracle.com/javase/8/docs/api/java/lang/Exception.html?is-external=true;
 title="class or interface in java.lang">Exception
 
@@ -614,7 +614,7 @@ extends http://docs.oracle.com/javase/8/docs/api/java/lang/Object.html?
 
 
 testQueueLockAndLockHeartbeatOperations
-publicvoidtestQueueLockAndLockHeartbeatOperations()
+publicvoidtestQueueLockAndLockHeartbeatOperations()
  throws http://docs.oracle.com/javase/8/docs/api/java/lang/Exception.html?is-external=true;
 title="class or interface in java.lang">Exception
 
 Throws:

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/testdevapidocs/org/apache/hadoop/hbase/regionserver/TestHRegionServerBulkLoadWithOldClient.AtomicHFileLoader.html
--
diff --git 
a/testdevapidocs/org/apache/hadoop/hbase/regionserver/TestHRegionServerBulkLoadWithOldClient.AtomicHFileLoader.html
 
b/testdevapidocs/org/apache/hadoop/hbase/regionserver/TestHRegionServerBulkLoadWithOldClient.AtomicHFileLoader.html
index 6e4d8fd..45b8fbc 100644
--- 
a/testdevapidocs/org/apache/hadoop/hbase/regionserver/TestHRegionServerBulkLoadWithOldClient.AtomicHFileLoader.html
+++ 
b/testdevapidocs/org/apache/hadoop/hbase/regionserver/TestHRegionServerBulkLoadWithOldClient.AtomicHFileLoader.html
@@ -132,7 +132,7 @@ var activeTableTab = "activeTableTab";
 
 
 
-public static class TestHRegionServerBulkLoadWithOldClient.AtomicHFileLoader
+public static class TestHRegionServerBulkLoadWithOldClient.AtomicHFileLoader
 extends MultithreadedTestUtil.RepeatingTestThread
 
 
@@ -280,7 +280,7 @@ extends 
 
 numBulkLoads
-finalhttp://docs.oracle.com/javase/8/docs/api/java/util/concurrent/atomic/AtomicLong.html?is-external=true;
 title="class or interface in java.util.concurrent.atomic">AtomicLong numBulkLoads
+finalhttp://docs.oracle.com/javase/8/docs/api/java/util/concurrent/atomic/AtomicLong.html?is-external=true;
 title="class or interface in java.util.concurrent.atomic">AtomicLong numBulkLoads
 
 
 
@@ -289,7 +289,7 @@ extends 
 
 numCompactions

[04/51] [partial] hbase-site git commit: Published site at .

2017-11-06 Thread git-site-role
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/testdevapidocs/src-html/org/apache/hadoop/hbase/client/TestAsyncRegionAdminApi.html
--
diff --git 
a/testdevapidocs/src-html/org/apache/hadoop/hbase/client/TestAsyncRegionAdminApi.html
 
b/testdevapidocs/src-html/org/apache/hadoop/hbase/client/TestAsyncRegionAdminApi.html
index 46bd7da..9659179 100644
--- 
a/testdevapidocs/src-html/org/apache/hadoop/hbase/client/TestAsyncRegionAdminApi.html
+++ 
b/testdevapidocs/src-html/org/apache/hadoop/hbase/client/TestAsyncRegionAdminApi.html
@@ -208,7 +208,7 @@
 200}
 201
 202assertTrue(destServerName != null && !destServerName.equals(serverName));
-203admin.move(hri.getRegionName(), 
Optional.of(destServerName)).get();
+203admin.move(hri.getRegionName(), 
destServerName).get();
 204
 205long timeoutTime = 
System.currentTimeMillis() + 3;
 206while (true) {
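The change in the hunk above replaces `admin.move(regionName, Optional.of(destServerName))` with a plain `ServerName` argument: optional parameters became overloads. A toy, self-contained sketch of that API style (the class and method names here are illustrative stand-ins, not the real HBase `AsyncAdmin`):

```java
public class MoveApiSketch {
    // Overload without a destination: the case that Optional.empty()
    // used to express (let the balancer pick a server).
    static String move(String regionName) {
        return "move " + regionName + " to a random server";
    }

    // Overload with an explicit destination server.
    static String move(String regionName, String destServerName) {
        return "move " + regionName + " to " + destServerName;
    }

    public static void main(String[] args) {
        System.out.println(move("region-1"));
        System.out.println(move("region-1", "rs-42"));
    }
}
```

Callers no longer wrap arguments in `Optional`, which keeps call sites like the test above shorter and avoids `Optional`-typed parameters in a public API.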
@@ -370,7 +370,7 @@
 362  @Test
 363  public void testMergeRegions() throws 
Exception {
 364byte[][] splitRows = new byte[][] { 
Bytes.toBytes("3"), Bytes.toBytes("6") };
-365createTableWithDefaultConf(tableName, 
Optional.of(splitRows));
+365createTableWithDefaultConf(tableName, 
splitRows);
 366
 367RawAsyncTable metaTable = 
ASYNC_CONN.getRawTable(META_TABLE_NAME);
 368List<HRegionLocation> regionLocations =
@@ -427,187 +427,190 @@
 419table.putAll(puts).join();
 420
 421if (isSplitRegion) {
-422  
admin.splitRegion(regionLocations.get(0).getRegionInfo().getRegionName(),
-423
Optional.ofNullable(splitPoint)).get();
-424} else {
-425  if (splitPoint == null) {
-426admin.split(tableName).get();
-427  } else {
-428admin.split(tableName, 
splitPoint).get();
-429  }
-430}
-431
-432int count = 0;
-433for (int i = 0; i < 45; i++) {
-434  try {
-435regionLocations =
-436
AsyncMetaTableAccessor.getTableHRegionLocations(metaTable, 
Optional.of(tableName))
-437.get();
-438count = regionLocations.size();
-439if (count >= 2) {
-440  break;
-441}
-442Thread.sleep(1000L);
-443  } catch (Exception e) {
-444LOG.error(e);
-445  }
-446}
-447assertEquals(count, 2);
-448  }
-449
-450  @Test
-451  public void testCompactRegionServer() 
throws Exception {
-452byte[][] families = { 
Bytes.toBytes("f1"), Bytes.toBytes("f2"), Bytes.toBytes("f3") };
-453createTableWithDefaultConf(tableName, 
Optional.empty(), families);
-454loadData(tableName, families, 3000, 
8);
-455
-456List<HRegionServer> rsList =
-457TEST_UTIL.getHBaseCluster().getLiveRegionServerThreads().stream()
-458.map(rsThread -> rsThread.getRegionServer()).collect(Collectors.toList());
-459List<Region> regions = new ArrayList<>();
-460rsList.forEach(rs -> regions.addAll(rs.getRegions(tableName)));
-461Assert.assertEquals(regions.size(), 
1);
-462int countBefore = 
countStoreFilesInFamilies(regions, families);
-463Assert.assertTrue(countBefore > 0);
-464
-465// Minor compaction for all region 
servers.
-466for (HRegionServer rs : rsList)
-467  
admin.compactRegionServer(rs.getServerName()).get();
-468Thread.sleep(5000);
-469int countAfterMinorCompaction = 
countStoreFilesInFamilies(regions, families);
-470Assert.assertTrue(countAfterMinorCompaction < countBefore);
-471
-472// Major compaction for all region 
servers.
-473for (HRegionServer rs : rsList)
-474  
admin.majorCompactRegionServer(rs.getServerName()).get();
-475Thread.sleep(5000);
-476int countAfterMajorCompaction = 
countStoreFilesInFamilies(regions, families);
-477
Assert.assertEquals(countAfterMajorCompaction, 3);
-478  }
-479
-480  @Test
-481  public void testCompact() throws 
Exception {
-482
compactionTest(TableName.valueOf("testCompact1"), 8, CompactionState.MAJOR, 
false);
-483
compactionTest(TableName.valueOf("testCompact2"), 15, CompactionState.MINOR, 
false);
-484
compactionTest(TableName.valueOf("testCompact3"), 8, CompactionState.MAJOR, 
true);
-485
compactionTest(TableName.valueOf("testCompact4"), 15, CompactionState.MINOR, 
true);
-486  }
-487
-488  private void compactionTest(final 
TableName tableName, final int flushes,
-489  final CompactionState 
expectedState, boolean singleFamily) throws Exception {
-490// Create a table with regions
-491byte[] family = 
Bytes.toBytes("family");
-492byte[][] families =
-493{ family, Bytes.add(family, 
Bytes.toBytes("2")), Bytes.add(family, Bytes.toBytes("3")) };
-494createTableWithDefaultConf(tableName, 
Optional.empty(), families);
-495loadData(tableName, families, 3000, 
flushes);
-496
-497List<Region> regions = new ArrayList<>();
-498TEST_UTIL
-499.getHBaseCluster()
-500.getLiveRegionServerThreads()
-501.forEach(rsThread - 

[35/51] [partial] hbase-site git commit: Published site at .

2017-11-06 Thread git-site-role
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/devapidocs/org/apache/hadoop/hbase/io/hfile/package-tree.html
--
diff --git a/devapidocs/org/apache/hadoop/hbase/io/hfile/package-tree.html 
b/devapidocs/org/apache/hadoop/hbase/io/hfile/package-tree.html
index 2b7a8cd..5a615cc 100644
--- a/devapidocs/org/apache/hadoop/hbase/io/hfile/package-tree.html
+++ b/devapidocs/org/apache/hadoop/hbase/io/hfile/package-tree.html
@@ -274,11 +274,11 @@
 java.lang.http://docs.oracle.com/javase/8/docs/api/java/lang/Enum.html?is-external=true;
 title="class or interface in java.lang">EnumE (implements java.lang.http://docs.oracle.com/javase/8/docs/api/java/lang/Comparable.html?is-external=true;
 title="class or interface in java.lang">ComparableT, java.io.http://docs.oracle.com/javase/8/docs/api/java/io/Serializable.html?is-external=true;
 title="class or interface in java.io">Serializable)
 
 org.apache.hadoop.hbase.io.hfile.BlockType.BlockCategory
-org.apache.hadoop.hbase.io.hfile.Cacheable.MemoryType
-org.apache.hadoop.hbase.io.hfile.BlockPriority
 org.apache.hadoop.hbase.io.hfile.CacheConfig.ExternalBlockCaches
-org.apache.hadoop.hbase.io.hfile.HFileBlock.Writer.State
+org.apache.hadoop.hbase.io.hfile.BlockPriority
 org.apache.hadoop.hbase.io.hfile.BlockType
+org.apache.hadoop.hbase.io.hfile.Cacheable.MemoryType
+org.apache.hadoop.hbase.io.hfile.HFileBlock.Writer.State
 
 
 

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/devapidocs/org/apache/hadoop/hbase/ipc/package-tree.html
--
diff --git a/devapidocs/org/apache/hadoop/hbase/ipc/package-tree.html 
b/devapidocs/org/apache/hadoop/hbase/ipc/package-tree.html
index bbafcf8..57976d2 100644
--- a/devapidocs/org/apache/hadoop/hbase/ipc/package-tree.html
+++ b/devapidocs/org/apache/hadoop/hbase/ipc/package-tree.html
@@ -343,9 +343,9 @@
 
 java.lang.http://docs.oracle.com/javase/8/docs/api/java/lang/Enum.html?is-external=true;
 title="class or interface in java.lang">EnumE (implements java.lang.http://docs.oracle.com/javase/8/docs/api/java/lang/Comparable.html?is-external=true;
 title="class or interface in java.lang">ComparableT, java.io.http://docs.oracle.com/javase/8/docs/api/java/io/Serializable.html?is-external=true;
 title="class or interface in java.io">Serializable)
 
-org.apache.hadoop.hbase.ipc.BufferCallBeforeInitHandler.BufferCallAction
-org.apache.hadoop.hbase.ipc.MetricsHBaseServerSourceFactoryImpl.SourceStorage
 org.apache.hadoop.hbase.ipc.CallEvent.Type
+org.apache.hadoop.hbase.ipc.MetricsHBaseServerSourceFactoryImpl.SourceStorage
+org.apache.hadoop.hbase.ipc.BufferCallBeforeInitHandler.BufferCallAction
 
 
 

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/devapidocs/org/apache/hadoop/hbase/mapreduce/TableMapReduceUtil.html
--
diff --git 
a/devapidocs/org/apache/hadoop/hbase/mapreduce/TableMapReduceUtil.html 
b/devapidocs/org/apache/hadoop/hbase/mapreduce/TableMapReduceUtil.html
index 7006d2b..e5fc660 100644
--- a/devapidocs/org/apache/hadoop/hbase/mapreduce/TableMapReduceUtil.html
+++ b/devapidocs/org/apache/hadoop/hbase/mapreduce/TableMapReduceUtil.html
@@ -1354,7 +1354,7 @@ public staticvoid
 
 buildDependencyClasspath
-public statichttp://docs.oracle.com/javase/8/docs/api/java/lang/String.html?is-external=true;
 title="class or interface in java.lang">StringbuildDependencyClasspath(org.apache.hadoop.conf.Configurationconf)
+public statichttp://docs.oracle.com/javase/8/docs/api/java/lang/String.html?is-external=true;
 title="class or interface in java.lang">StringbuildDependencyClasspath(org.apache.hadoop.conf.Configurationconf)
 Returns a classpath string built from the content of the 
"tmpjars" value in conf.
  Also exposed to shell scripts via `bin/hbase mapredcp`.
 
@@ -1365,7 +1365,7 @@ public staticvoid
 
 addDependencyJars
-public staticvoidaddDependencyJars(org.apache.hadoop.mapreduce.Jobjob)
+public staticvoidaddDependencyJars(org.apache.hadoop.mapreduce.Jobjob)
   throws http://docs.oracle.com/javase/8/docs/api/java/io/IOException.html?is-external=true;
 title="class or interface in java.io">IOException
 Add the HBase dependency jars as well as jars for any of 
the configured
  job classes to the job configuration, so that JobClient will ship them
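Per the javadoc above, `buildDependencyClasspath` turns the comma-separated `"tmpjars"` value in the configuration into a platform classpath string. A simplified, self-contained re-creation of that idea (this is a sketch of the described behavior, not the real `TableMapReduceUtil` code):

```java
import java.io.File;
import java.util.Map;

public class TmpJarsClasspath {
    // Join the comma-separated "tmpjars" entries with the platform path
    // separator; file: URI prefixes become plain filesystem paths.
    static String buildDependencyClasspath(Map<String, String> conf) {
        String tmpjars = conf.getOrDefault("tmpjars", "");
        return String.join(File.pathSeparator,
                tmpjars.replace("file:", "").split(","));
    }

    public static void main(String[] args) {
        System.out.println(buildDependencyClasspath(
                Map.of("tmpjars", "file:/tmp/a.jar,file:/tmp/b.jar")));
    }
}
```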
@@ -1383,7 +1383,7 @@ public staticvoid
 addDependencyJars
 http://docs.oracle.com/javase/8/docs/api/java/lang/Deprecated.html?is-external=true;
 title="class or interface in java.lang">@Deprecated
-public staticvoidaddDependencyJars(org.apache.hadoop.conf.Configurationconf,
+public staticvoidaddDependencyJars(org.apache.hadoop.conf.Configurationconf,
  http://docs.oracle.com/javase/8/docs/api/java/lang/Class.html?is-external=true;
 title="class or interface in 

[49/51] [partial] hbase-site git commit: Published site at .

2017-11-06 Thread git-site-role
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/apidocs/org/apache/hadoop/hbase/client/AsyncAdmin.html
--
diff --git a/apidocs/org/apache/hadoop/hbase/client/AsyncAdmin.html 
b/apidocs/org/apache/hadoop/hbase/client/AsyncAdmin.html
index 2f03b4f..ff5c5fd 100644
--- a/apidocs/org/apache/hadoop/hbase/client/AsyncAdmin.html
+++ b/apidocs/org/apache/hadoop/hbase/client/AsyncAdmin.html
@@ -18,7 +18,7 @@
 catch(err) {
 }
 //-->
-var methods = 
{"i0":6,"i1":6,"i2":6,"i3":6,"i4":6,"i5":18,"i6":6,"i7":6,"i8":6,"i9":6,"i10":18,"i11":6,"i12":18,"i13":6,"i14":6,"i15":6,"i16":6,"i17":6,"i18":18,"i19":6,"i20":6,"i21":6,"i22":6,"i23":6,"i24":6,"i25":18,"i26":6,"i27":6,"i28":6,"i29":6,"i30":6,"i31":6,"i32":6,"i33":6,"i34":6,"i35":6,"i36":18,"i37":6,"i38":6,"i39":6,"i40":6,"i41":6,"i42":6,"i43":6,"i44":18,"i45":18,"i46":6,"i47":6,"i48":6,"i49":6,"i50":18,"i51":6,"i52":18,"i53":6,"i54":6,"i55":6,"i56":6,"i57":6,"i58":6,"i59":6,"i60":6,"i61":6,"i62":6,"i63":6,"i64":6,"i65":6,"i66":18,"i67":6,"i68":6,"i69":6,"i70":18,"i71":6,"i72":6,"i73":6,"i74":18,"i75":6,"i76":18,"i77":6,"i78":18,"i79":6,"i80":18,"i81":6,"i82":6,"i83":18,"i84":6,"i85":18,"i86":6,"i87":6,"i88":6,"i89":6,"i90":6,"i91":6,"i92":6,"i93":6,"i94":6,"i95":6,"i96":6,"i97":6,"i98":6,"i99":6,"i100":6,"i101":6,"i102":6,"i103":6,"i104":6,"i105":6,"i106":6,"i107":6,"i108":6,"i109":6,"i110":6,"i111":18,"i112":18,"i113":6,"i114":6,"i115":18,"i116":6,"i117":6,"i118":6,
 "i119":6,"i120":6,"i121":6,"i122":6,"i123":6,"i124":6};
+var methods = 
{"i0":6,"i1":6,"i2":6,"i3":6,"i4":6,"i5":18,"i6":6,"i7":6,"i8":6,"i9":6,"i10":6,"i11":6,"i12":6,"i13":6,"i14":6,"i15":6,"i16":6,"i17":6,"i18":6,"i19":6,"i20":6,"i21":6,"i22":6,"i23":6,"i24":6,"i25":6,"i26":6,"i27":6,"i28":6,"i29":6,"i30":6,"i31":6,"i32":6,"i33":6,"i34":6,"i35":6,"i36":6,"i37":6,"i38":18,"i39":6,"i40":6,"i41":6,"i42":6,"i43":6,"i44":6,"i45":6,"i46":18,"i47":18,"i48":6,"i49":6,"i50":6,"i51":6,"i52":6,"i53":6,"i54":18,"i55":6,"i56":6,"i57":6,"i58":6,"i59":6,"i60":6,"i61":6,"i62":6,"i63":6,"i64":6,"i65":6,"i66":6,"i67":6,"i68":6,"i69":6,"i70":6,"i71":6,"i72":18,"i73":6,"i74":6,"i75":6,"i76":6,"i77":6,"i78":6,"i79":6,"i80":18,"i81":6,"i82":6,"i83":18,"i84":6,"i85":6,"i86":6,"i87":6,"i88":6,"i89":6,"i90":6,"i91":6,"i92":6,"i93":6,"i94":6,"i95":6,"i96":6,"i97":6,"i98":6,"i99":6,"i100":6,"i101":6,"i102":6,"i103":6,"i104":6,"i105":6,"i106":6,"i107":6,"i108":6,"i109":6,"i110":6,"i111":6,"i112":6,"i113":6,"i114":6,"i115":6,"i116":6,"i117":18,"i118":18,"i119":6,"i
 
120":6,"i121":6,"i122":6,"i123":6,"i124":6,"i125":6,"i126":6,"i127":6,"i128":6,"i129":6,"i130":6};
 var tabs = {65535:["t0","All Methods"],2:["t2","Instance 
Methods"],4:["t3","Abstract Methods"],16:["t5","Default Methods"]};
 var altColor = "altColor";
 var rowColor = "rowColor";
@@ -194,28 +194,28 @@ public interface 
-default http://docs.oracle.com/javase/8/docs/api/java/util/concurrent/CompletableFuture.html?is-external=true;
 title="class or interface in java.util.concurrent">CompletableFuturehttp://docs.oracle.com/javase/8/docs/api/java/lang/Void.html?is-external=true;
 title="class or interface in java.lang">Void
+http://docs.oracle.com/javase/8/docs/api/java/util/concurrent/CompletableFuture.html?is-external=true;
 title="class or interface in java.util.concurrent">CompletableFuturehttp://docs.oracle.com/javase/8/docs/api/java/lang/Void.html?is-external=true;
 title="class or interface in java.lang">Void
 compact(TableNametableName)
 Compact a table.
 
 
 
 http://docs.oracle.com/javase/8/docs/api/java/util/concurrent/CompletableFuture.html?is-external=true;
 title="class or interface in java.util.concurrent">CompletableFuturehttp://docs.oracle.com/javase/8/docs/api/java/lang/Void.html?is-external=true;
 title="class or interface in java.lang">Void
-compact(TableNametableName,
-   http://docs.oracle.com/javase/8/docs/api/java/util/Optional.html?is-external=true;
 title="class or interface in 
java.util">Optionalbyte[]columnFamily)
+compact(TableNametableName,
+   byte[]columnFamily)
 Compact a column family within a table.
 
 
 
-default http://docs.oracle.com/javase/8/docs/api/java/util/concurrent/CompletableFuture.html?is-external=true;
 title="class or interface in java.util.concurrent">CompletableFuturehttp://docs.oracle.com/javase/8/docs/api/java/lang/Void.html?is-external=true;
 title="class or interface in java.lang">Void
+http://docs.oracle.com/javase/8/docs/api/java/util/concurrent/CompletableFuture.html?is-external=true;
 title="class or interface in java.util.concurrent">CompletableFuturehttp://docs.oracle.com/javase/8/docs/api/java/lang/Void.html?is-external=true;
 title="class or interface in java.lang">Void
 compactRegion(byte[]regionName)
 Compact an individual region.
 
 
 
 http://docs.oracle.com/javase/8/docs/api/java/util/concurrent/CompletableFuture.html?is-external=true;
 title="class or interface in 
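The `AsyncAdmin` method summary above is built around `CompletableFuture` return types, so callers compose operations instead of blocking on each call. A toy stand-in for that usage style (illustrative names, not the real HBase API):

```java
import java.util.concurrent.CompletableFuture;

public class AsyncCompose {
    // Each "admin call" returns a CompletableFuture rather than blocking,
    // so follow-up steps chain with thenApply/thenRun.
    static CompletableFuture<String> compact(String table) {
        return CompletableFuture.supplyAsync(() -> "compacted " + table);
    }

    public static void main(String[] args) {
        String result = compact("t1")
                .thenApply(s -> s + ", then flushed") // chain a follow-up step
                .join();                              // block only at the edge
        System.out.println(result);
    }
}
```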

[26/51] [partial] hbase-site git commit: Published site at .

2017-11-06 Thread git-site-role
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.DeleteColumnFamilyProcedureBiConsumer.html
--
diff --git 
a/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.DeleteColumnFamilyProcedureBiConsumer.html
 
b/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.DeleteColumnFamilyProcedureBiConsumer.html
index 531081e..a22e5ce 100644
--- 
a/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.DeleteColumnFamilyProcedureBiConsumer.html
+++ 
b/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.DeleteColumnFamilyProcedureBiConsumer.html
@@ -34,2832 +34,3011 @@
 026import java.util.Collections;
 027import java.util.EnumSet;
 028import java.util.HashMap;
-029import java.util.LinkedList;
-030import java.util.List;
-031import java.util.Map;
-032import java.util.Optional;
-033import java.util.Set;
-034import 
java.util.concurrent.CompletableFuture;
-035import java.util.concurrent.TimeUnit;
-036import 
java.util.concurrent.atomic.AtomicReference;
-037import java.util.function.BiConsumer;
-038import java.util.function.Function;
-039import java.util.regex.Pattern;
-040import java.util.stream.Collectors;
-041import java.util.stream.Stream;
-042
-043import org.apache.commons.io.IOUtils;
-044import org.apache.commons.logging.Log;
-045import 
org.apache.commons.logging.LogFactory;
-046import 
org.apache.hadoop.hbase.AsyncMetaTableAccessor;
-047import 
org.apache.hadoop.hbase.ClusterStatus;
-048import 
org.apache.hadoop.hbase.ClusterStatus.Option;
-049import 
org.apache.hadoop.hbase.HConstants;
-050import 
org.apache.hadoop.hbase.HRegionLocation;
-051import 
org.apache.hadoop.hbase.MetaTableAccessor;
-052import 
org.apache.hadoop.hbase.MetaTableAccessor.QueryType;
-053import 
org.apache.hadoop.hbase.NamespaceDescriptor;
-054import 
org.apache.hadoop.hbase.RegionLoad;
-055import 
org.apache.hadoop.hbase.RegionLocations;
-056import 
org.apache.hadoop.hbase.ServerName;
-057import 
org.apache.hadoop.hbase.TableExistsException;
-058import 
org.apache.hadoop.hbase.TableName;
-059import 
org.apache.hadoop.hbase.TableNotDisabledException;
-060import 
org.apache.hadoop.hbase.TableNotEnabledException;
-061import 
org.apache.hadoop.hbase.TableNotFoundException;
-062import 
org.apache.hadoop.hbase.UnknownRegionException;
-063import 
org.apache.hadoop.hbase.client.AsyncRpcRetryingCallerFactory.AdminRequestCallerBuilder;
-064import 
org.apache.hadoop.hbase.client.AsyncRpcRetryingCallerFactory.MasterRequestCallerBuilder;
-065import 
org.apache.hadoop.hbase.client.AsyncRpcRetryingCallerFactory.ServerRequestCallerBuilder;
-066import 
org.apache.hadoop.hbase.client.RawAsyncTable.CoprocessorCallable;
-067import 
org.apache.hadoop.hbase.client.Scan.ReadType;
-068import 
org.apache.hadoop.hbase.client.replication.ReplicationSerDeHelper;
-069import 
org.apache.hadoop.hbase.client.replication.TableCFs;
-070import 
org.apache.hadoop.hbase.client.security.SecurityCapability;
-071import 
org.apache.hadoop.hbase.exceptions.DeserializationException;
-072import 
org.apache.hadoop.hbase.ipc.HBaseRpcController;
-073import 
org.apache.hadoop.hbase.quotas.QuotaFilter;
-074import 
org.apache.hadoop.hbase.quotas.QuotaSettings;
-075import 
org.apache.hadoop.hbase.quotas.QuotaTableUtil;
-076import 
org.apache.hadoop.hbase.replication.ReplicationException;
-077import 
org.apache.hadoop.hbase.replication.ReplicationPeerConfig;
-078import 
org.apache.hadoop.hbase.replication.ReplicationPeerDescription;
-079import 
org.apache.hadoop.hbase.snapshot.ClientSnapshotDescriptionUtils;
-080import 
org.apache.hadoop.hbase.snapshot.RestoreSnapshotException;
-081import 
org.apache.hadoop.hbase.snapshot.SnapshotCreationException;
-082import 
org.apache.hadoop.hbase.util.Bytes;
-083import 
org.apache.hadoop.hbase.util.EnvironmentEdgeManager;
-084import 
org.apache.hadoop.hbase.util.ForeignExceptionUtil;
-085import 
org.apache.hadoop.hbase.util.Pair;
-086import 
org.apache.yetus.audience.InterfaceAudience;
-087
-088import 
org.apache.hadoop.hbase.shaded.com.google.protobuf.RpcCallback;
-089import 
org.apache.hadoop.hbase.shaded.io.netty.util.Timeout;
-090import 
org.apache.hadoop.hbase.shaded.io.netty.util.TimerTask;
-091import 
org.apache.hadoop.hbase.shaded.protobuf.ProtobufUtil;
-092import 
org.apache.hadoop.hbase.shaded.protobuf.RequestConverter;
-093import 
org.apache.hadoop.hbase.shaded.protobuf.generated.AdminProtos.AdminService;
-094import 
org.apache.hadoop.hbase.shaded.protobuf.generated.AdminProtos.ClearCompactionQueuesRequest;
-095import 
org.apache.hadoop.hbase.shaded.protobuf.generated.AdminProtos.ClearCompactionQueuesResponse;
-096import 
org.apache.hadoop.hbase.shaded.protobuf.generated.AdminProtos.CompactRegionRequest;
-097import 
org.apache.hadoop.hbase.shaded.protobuf.generated.AdminProtos.CompactRegionResponse;
-098import 

[45/51] [partial] hbase-site git commit: Published site at .

2017-11-06 Thread git-site-role
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/checkstyle-aggregate.html
--
diff --git a/checkstyle-aggregate.html b/checkstyle-aggregate.html
index e3003fa..93555a3 100644
--- a/checkstyle-aggregate.html
+++ b/checkstyle-aggregate.html
@@ -7,7 +7,7 @@
   
 
 
-
+
 
 Apache HBase  Checkstyle Results
 
@@ -289,7 +289,7 @@
 3425
 0
 0
-21699
+21610
 
 Files
 
@@ -1617,7 +1617,7 @@
 org/apache/hadoop/hbase/client/AsyncHBaseAdmin.java
 0
 0
-3
+2
 
 org/apache/hadoop/hbase/client/AsyncMasterRequestRpcRetryingCaller.java
 0
@@ -1972,7 +1972,7 @@
 org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.java
 0
 0
-131
+119
 
 org/apache/hadoop/hbase/client/RawAsyncTable.java
 0
@@ -2232,7 +2232,7 @@
 org/apache/hadoop/hbase/client/TestAsyncAdminBase.java
 0
 0
-5
+1
 
 org/apache/hadoop/hbase/client/TestAsyncAdminBuilder.java
 0
@@ -11257,7 +11257,7 @@
 org/apache/hadoop/hbase/shaded/protobuf/RequestConverter.java
 0
 0
-189
+117
 
 org/apache/hadoop/hbase/shaded/protobuf/ResponseConverter.java
 0
@@ -13293,7 +13293,7 @@
 http://checkstyle.sourceforge.net/config_imports.html#UnusedImports;>UnusedImports
 
 processJavadoc: true
-273
+266
 Error
 
 indentation
@@ -13304,7 +13304,7 @@
 caseIndent: 2
 basicOffset: 2
 lineWrappingIndentation: 2
-6429
+6362
 Error
 
 javadoc
@@ -13316,7 +13316,7 @@
 
 
 http://checkstyle.sourceforge.net/config_javadoc.html#NonEmptyAtclauseDescription;>NonEmptyAtclauseDescription
-4425
+4415
 Error
 
 misc
@@ -13334,7 +13334,7 @@
 
 max: 100
 ignorePattern: ^package.*|^import.*|a 
href|href|http://|https://|ftp://|org.apache.thrift.|com.google.protobuf.|hbase.protobuf.generated
-1953
+1948
 Error
 
 
@@ -30217,79 +30217,79 @@
 javadoc
 NonEmptyAtclauseDescription
 At-clause should have a non-empty description.
-159
+168
 
 Error
 javadoc
 NonEmptyAtclauseDescription
 At-clause should have a non-empty description.
-362
+361
 
 Error
 javadoc
 NonEmptyAtclauseDescription
 At-clause should have a non-empty description.
-376
+375
 
 Error
 sizes
 LineLength
 Line is longer than 100 characters (found 106).
-806
+821
 
 Error
 javadoc
 NonEmptyAtclauseDescription
 At-clause should have a non-empty description.
-827
+852
 
 Error
 javadoc
 NonEmptyAtclauseDescription
 At-clause should have a non-empty description.
-846
+881
 
 Error
 javadoc
 NonEmptyAtclauseDescription
 At-clause should have a non-empty description.
-875
+888
 
 Error
 javadoc
 NonEmptyAtclauseDescription
 At-clause should have a non-empty description.
-882
+895
 
 Error
 javadoc
 NonEmptyAtclauseDescription
 At-clause should have a non-empty description.
-883
+896
 
 Error
 javadoc
 NonEmptyAtclauseDescription
 At-clause should have a non-empty description.
-940
+952
 
 Error
 sizes
 LineLength
 Line is longer than 100 characters (found 101).
-967
+979
 
 Error
 javadoc
 NonEmptyAtclauseDescription
 At-clause should have a non-empty description.
-995
+1007
 
 Error
 javadoc
 NonEmptyAtclauseDescription
 At-clause should have a non-empty description.
-1016
+1028
 
 org/apache/hadoop/hbase/client/AsyncAdminBuilder.java
 
@@ -30521,37 +30521,31 @@
 
 Error
 imports
-UnusedImports
-Unused import - org.apache.hadoop.hbase.util.Pair.
-46
-
-Error
-imports
 ImportOrder
 Wrong order for 'com.google.protobuf.RpcChannel' import.
-49
-
+48
+
 Error
 sizes
 LineLength
 Line is longer than 100 characters (found 101).
-384
+436
 
 org/apache/hadoop/hbase/client/AsyncMasterRequestRpcRetryingCaller.java
 
-
+
 Severity
 Category
 Rule
 Message
 Line
-
+
 Error
 imports
 ImportOrder
 Wrong order for 'java.util.concurrent.CompletableFuture' import.
 22
-
+
 Error
 imports
 ImportOrder
@@ -30560,13 +30554,13 @@
 
 org/apache/hadoop/hbase/client/AsyncMetaRegionLocator.java
 
-
+
 Severity
 Category
 Rule
 Message
 Line
-
+
 Error
 imports
 AvoidStarImport
@@ -30575,49 +30569,49 @@
 
 org/apache/hadoop/hbase/client/AsyncNonMetaRegionLocator.java
 
-
+
 Severity
 Category
 Rule
 Message
 Line
-
+
 Error
 imports
 ImportOrder
 Wrong order for 'org.apache.hadoop.hbase.util.Bytes' import.
 55
-
+
 Error
 design
 VisibilityModifier
 Variable 'locateType' must be private and have accessor methods.
 80
-
+
 Error
 design
 VisibilityModifier
 Variable 'cache' must be private and have accessor methods.
 104
-
+
 Error
 design
 VisibilityModifier
 Variable 'pendingRequests' must be private and have accessor methods.
 107
-
+
 Error
 design
 VisibilityModifier
 Variable 'allRequests' must be private and have accessor methods.
 109
-
+
 Error
 sizes
 LineLength
 Line is longer than 100 characters (found 101).
 149
-
+
 Error
 sizes
 LineLength
@@ -30626,85 +30620,85 @@
 
 org/apache/hadoop/hbase/client/AsyncProcess.java
 
-
+
 Severity
 Category
 Rule
 Message
 Line
-
+
 Error
 imports
 ImportOrder
 Wrong order for 'java.io.IOException' import.
 25
-
+
 Error
 imports
 ImportOrder
 Wrong order for 

[38/51] [partial] hbase-site git commit: Published site at .

2017-11-06 Thread git-site-role
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/devapidocs/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.AddColumnFamilyProcedureBiConsumer.html
--
diff --git 
a/devapidocs/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.AddColumnFamilyProcedureBiConsumer.html
 
b/devapidocs/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.AddColumnFamilyProcedureBiConsumer.html
index a2e8d05..b198057 100644
--- 
a/devapidocs/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.AddColumnFamilyProcedureBiConsumer.html
+++ 
b/devapidocs/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.AddColumnFamilyProcedureBiConsumer.html
@@ -127,7 +127,7 @@ var activeTableTab = "activeTableTab";
 
 
 
-private class RawAsyncHBaseAdmin.AddColumnFamilyProcedureBiConsumer
+private class RawAsyncHBaseAdmin.AddColumnFamilyProcedureBiConsumer
 extends RawAsyncHBaseAdmin.TableProcedureBiConsumer
 
 
@@ -240,7 +240,7 @@ extends 
 
 AddColumnFamilyProcedureBiConsumer
-AddColumnFamilyProcedureBiConsumer(AsyncAdminadmin,
+AddColumnFamilyProcedureBiConsumer(AsyncAdminadmin,
TableNametableName)
 
 
@@ -258,7 +258,7 @@ extends 
 
 getOperationType
-http://docs.oracle.com/javase/8/docs/api/java/lang/String.html?is-external=true;
 title="class or interface in java.lang">StringgetOperationType()
+http://docs.oracle.com/javase/8/docs/api/java/lang/String.html?is-external=true;
 title="class or interface in java.lang">StringgetOperationType()
 
 Specified by:
 getOperationTypein
 classRawAsyncHBaseAdmin.TableProcedureBiConsumer

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/devapidocs/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.AdminRpcCall.html
--
diff --git 
a/devapidocs/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.AdminRpcCall.html
 
b/devapidocs/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.AdminRpcCall.html
index 4c2bc6e..d33dae4 100644
--- 
a/devapidocs/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.AdminRpcCall.html
+++ 
b/devapidocs/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.AdminRpcCall.html
@@ -110,7 +110,7 @@ var activeTableTab = "activeTableTab";
 
 
 http://docs.oracle.com/javase/8/docs/api/java/lang/FunctionalInterface.html?is-external=true;
 title="class or interface in java.lang">@FunctionalInterface
-private static interface RawAsyncHBaseAdmin.AdminRpcCall<RESP,REQ>
+private static interface RawAsyncHBaseAdmin.AdminRpcCall<RESP,REQ>
 
 
 
@@ -159,7 +159,7 @@ private static interface 
 
 call
-void call(org.apache.hadoop.hbase.shaded.protobuf.generated.AdminProtos.AdminService.Interface stub,
+void call(org.apache.hadoop.hbase.shaded.protobuf.generated.AdminProtos.AdminService.Interface stub,
   HBaseRpcController controller,
   REQ req,
   org.apache.hadoop.hbase.shaded.com.google.protobuf.RpcCallback<RESP> done)

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/devapidocs/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.Converter.html
--
diff --git 
a/devapidocs/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.Converter.html 
b/devapidocs/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.Converter.html
index c62228e..83935cb 100644
--- 
a/devapidocs/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.Converter.html
+++ 
b/devapidocs/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.Converter.html
@@ -110,7 +110,7 @@ var activeTableTab = "activeTableTab";
 
 
 http://docs.oracle.com/javase/8/docs/api/java/lang/FunctionalInterface.html?is-external=true;
 title="class or interface in java.lang">@FunctionalInterface
-private static interface RawAsyncHBaseAdmin.Converter<D,S>
+private static interface RawAsyncHBaseAdmin.Converter<D,S>
 
 
 
@@ -156,7 +156,7 @@ private static interface 
 
 convert
-D convert(S src)
+D convert(S src)
throws http://docs.oracle.com/javase/8/docs/api/java/io/IOException.html?is-external=true;
 title="class or interface in java.io">IOException
 
 Throws:

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/devapidocs/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.CreateNamespaceProcedureBiConsumer.html
--
diff --git 
a/devapidocs/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.CreateNamespaceProcedureBiConsumer.html
 
b/devapidocs/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.CreateNamespaceProcedureBiConsumer.html
index 239514f..8b0509d 100644
--- 
a/devapidocs/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.CreateNamespaceProcedureBiConsumer.html
+++ 
b/devapidocs/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.CreateNamespaceProcedureBiConsumer.html
@@ -127,7 +127,7 @@ var activeTableTab = "activeTableTab";
 
 
 
-private class RawAsyncHBaseAdmin.CreateNamespaceProcedureBiConsumer
+private class 

[47/51] [partial] hbase-site git commit: Published site at .

2017-11-06 Thread git-site-role
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/apidocs/src-html/org/apache/hadoop/hbase/client/AsyncAdmin.html
--
diff --git a/apidocs/src-html/org/apache/hadoop/hbase/client/AsyncAdmin.html 
b/apidocs/src-html/org/apache/hadoop/hbase/client/AsyncAdmin.html
index 4aaf926..e249fd7 100644
--- a/apidocs/src-html/org/apache/hadoop/hbase/client/AsyncAdmin.html
+++ b/apidocs/src-html/org/apache/hadoop/hbase/client/AsyncAdmin.html
@@ -72,489 +72,489 @@
 064  /**
 065   * List all the userspace tables.
 066   * @return - returns a list of 
TableDescriptors wrapped by a {@link CompletableFuture}.
-067   * @see #listTables(Optional, 
boolean)
-068   */
-069  default CompletableFuture<List<TableDescriptor>> listTables() {
-070    return listTables(Optional.empty(), false);
-071  }
-072
-073  /**
-074   * List all the tables matching the 
given pattern.
-075   * @param pattern The compiled regular 
expression to match against
-076   * @param includeSysTables False to 
match only against userspace tables
-077   * @return - returns a list of 
TableDescriptors wrapped by a {@link CompletableFuture}.
-078   */
-079  CompletableFuture<List<TableDescriptor>> listTables(Optional<Pattern> pattern,
-080      boolean includeSysTables);
-081
-082  /**
-083   * List all of the names of userspace 
tables.
-084   * @return a list of table names 
wrapped by a {@link CompletableFuture}.
-085   * @see #listTableNames(Optional, 
boolean)
-086   */
-087  default CompletableFuture<List<TableName>> listTableNames() {
-088    return listTableNames(Optional.empty(), false);
-089  }
-090
-091  /**
-092   * List all of the names of userspace 
tables.
-093   * @param pattern The regular 
expression to match against
-094   * @param includeSysTables False to 
match only against userspace tables
-095   * @return a list of table names 
wrapped by a {@link CompletableFuture}.
-096   */
-097  CompletableFuture<List<TableName>> listTableNames(Optional<Pattern> pattern,
-098      boolean includeSysTables);
-099
-100  /**
-101   * Method for getting the 
tableDescriptor
-102   * @param tableName as a {@link 
TableName}
-103   * @return the read-only 
tableDescriptor wrapped by a {@link CompletableFuture}.
-104   */
-105  CompletableFuture<TableDescriptor> getTableDescriptor(TableName tableName);
-106
-107  /**
-108   * Creates a new table.
-109   * @param desc table descriptor for 
table
-110   */
-111  default CompletableFuture<Void> createTable(TableDescriptor desc) {
-112    return createTable(desc, Optional.empty());
-113  }
-114
-115  /**
-116   * Creates a new table with the 
specified number of regions. The start key specified will become
-117   * the end key of the first region of 
the table, and the end key specified will become the start
-118   * key of the last region of the table 
(the first region has a null start key and the last region
-119   * has a null end key). BigInteger math 
will be used to divide the key range specified into enough
-120   * segments to make the required number 
of total regions.
-121   * @param desc table descriptor for 
table
-122   * @param startKey beginning of key 
range
-123   * @param endKey end of key range
-124   * @param numRegions the total number 
of regions to create
-125   */
-126  CompletableFuture<Void> createTable(TableDescriptor desc, byte[] startKey, byte[] endKey,
-127      int numRegions);
-128
-129  /**
-130   * Creates a new table with an initial 
set of empty regions defined by the specified split keys.
-131   * The total number of regions created 
will be the number of split keys plus one.
-132   * Note : Avoid passing empty split 
key.
-133   * @param desc table descriptor for 
table
-134   * @param splitKeys array of split keys 
for the initial regions of the table
-135   */
-136  CompletableFuture<Void> createTable(TableDescriptor desc, Optional<byte[][]> splitKeys);
+067   */
+068  default CompletableFuture<List<TableDescriptor>> listTables() {
+069    return listTables(false);
+070  }
+071
+072  /**
+073   * List all the tables.
+074   * @param includeSysTables False to 
match only against userspace tables
+075   * @return - returns a list of 
TableDescriptors wrapped by a {@link CompletableFuture}.
+076   */
+077  CompletableFuture<List<TableDescriptor>> listTables(boolean includeSysTables);
+078
+079  /**
+080   * List all the tables matching the 
given pattern.
+081   * @param pattern The compiled regular 
expression to match against
+082   * @param includeSysTables False to 
match only against userspace tables
+083   * @return - returns a list of 
TableDescriptors wrapped by a {@link CompletableFuture}.
+084   */
+085  CompletableFuture<List<TableDescriptor>> listTables(Pattern pattern, boolean includeSysTables);
+086
+087  /**
+088   * List all of the names of userspace 
tables.
+089   * @return a list of table names 
wrapped by a {@link CompletableFuture}.
+090   * @see #listTableNames(Pattern, 
boolean)
+091   */

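The removed javadoc above for `createTable(desc, startKey, endKey, numRegions)` says BigInteger math is used to divide the specified key range into enough segments for the requested number of regions. The following is a minimal, self-contained sketch of that idea only — it is not HBase's actual `Bytes.split` implementation, and the `SplitKeys` class and its helper are made-up names. It assumes both boundary keys are padded to the same length and treats them as unsigned big-endian integers.

```java
import java.math.BigInteger;
import java.util.Arrays;

public class SplitKeys {
    // Divide the range (startKey, endKey) into numRegions segments and
    // return the numRegions - 1 interior split points.
    public static byte[][] split(byte[] startKey, byte[] endKey, int numRegions) {
        if (numRegions < 2) {
            throw new IllegalArgumentException("need at least two regions");
        }
        BigInteger start = new BigInteger(1, startKey); // unsigned interpretation
        BigInteger end = new BigInteger(1, endKey);
        BigInteger step = end.subtract(start).divide(BigInteger.valueOf(numRegions));
        byte[][] splits = new byte[numRegions - 1][];
        for (int i = 1; i < numRegions; i++) {
            BigInteger key = start.add(step.multiply(BigInteger.valueOf(i)));
            splits[i - 1] = toBytes(key, startKey.length);
        }
        return splits;
    }

    // Left-pad the magnitude to a fixed width so the keys sort correctly.
    private static byte[] toBytes(BigInteger v, int width) {
        byte[] raw = v.toByteArray();
        byte[] out = new byte[width];
        int copy = Math.min(raw.length, width);
        System.arraycopy(raw, raw.length - copy, out, width - copy, copy);
        return out;
    }

    public static void main(String[] args) {
        // 4 regions over (0x00, 0x10) -> 3 split points at 4, 8, 12.
        for (byte[] s : split(new byte[] {0x00}, new byte[] {0x10}, 4)) {
            System.out.println(Arrays.toString(s)); // prints [4], [8], [12]
        }
    }
}
```

The real method additionally gives the first region a null start key and the last region a null end key, which this sketch does not model.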
[43/51] [partial] hbase-site git commit: Published site at .

2017-11-06 Thread git-site-role
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/devapidocs/index-all.html
--
diff --git a/devapidocs/index-all.html b/devapidocs/index-all.html
index 3119218..522ec10 100644
--- a/devapidocs/index-all.html
+++ b/devapidocs/index-all.html
@@ -13424,11 +13424,13 @@
 
 Compact a table.
 
-compact(TableName, Optional<byte[]>) - Method in interface org.apache.hadoop.hbase.client.AsyncAdmin
+compact(TableName,
 byte[]) - Method in interface org.apache.hadoop.hbase.client.AsyncAdmin
 
 Compact a column family within a table.
 
-compact(TableName, Optional<byte[]>) - Method in class org.apache.hadoop.hbase.client.AsyncHBaseAdmin
+compact(TableName)
 - Method in class org.apache.hadoop.hbase.client.AsyncHBaseAdmin
+
+compact(TableName,
 byte[]) - Method in class org.apache.hadoop.hbase.client.AsyncHBaseAdmin
 
 compact(TableName)
 - Method in class org.apache.hadoop.hbase.client.HBaseAdmin
 
@@ -13452,13 +13454,15 @@
 
 Compact a table.
 
-compact(TableName, Optional<byte[]>) - Method in class org.apache.hadoop.hbase.client.RawAsyncHBaseAdmin
+compact(TableName)
 - Method in class org.apache.hadoop.hbase.client.RawAsyncHBaseAdmin
+
+compact(TableName,
 byte[]) - Method in class org.apache.hadoop.hbase.client.RawAsyncHBaseAdmin
 
-compact(TableName, Optional<byte[]>, boolean, CompactType) - Method in class org.apache.hadoop.hbase.client.RawAsyncHBaseAdmin
+compact(TableName,
 byte[], boolean, CompactType) - Method in class 
org.apache.hadoop.hbase.client.RawAsyncHBaseAdmin
 
 Compact column family of a table, Asynchronous operation 
even if CompletableFuture.get()
 
-compact(ServerName, RegionInfo, boolean, Optional<byte[]>) - Method in class org.apache.hadoop.hbase.client.RawAsyncHBaseAdmin
+compact(ServerName,
 RegionInfo, boolean, byte[]) - Method in class 
org.apache.hadoop.hbase.client.RawAsyncHBaseAdmin
 
 Compact the region at specific region server.
 
@@ -13936,11 +13940,13 @@
 
 Compact an individual region.
 
-compactRegion(byte[], Optional<byte[]>) - Method in interface org.apache.hadoop.hbase.client.AsyncAdmin
+compactRegion(byte[],
 byte[]) - Method in interface org.apache.hadoop.hbase.client.AsyncAdmin
 
 Compact a column family within a region.
 
-compactRegion(byte[], Optional<byte[]>) - Method in class org.apache.hadoop.hbase.client.AsyncHBaseAdmin
+compactRegion(byte[])
 - Method in class org.apache.hadoop.hbase.client.AsyncHBaseAdmin
+
+compactRegion(byte[],
 byte[]) - Method in class org.apache.hadoop.hbase.client.AsyncHBaseAdmin
 
 compactRegion(byte[])
 - Method in class org.apache.hadoop.hbase.client.HBaseAdmin
 
@@ -13952,9 +13958,11 @@
 
 Compact an individual region.
 
-compactRegion(byte[], Optional<byte[]>) - Method in class org.apache.hadoop.hbase.client.RawAsyncHBaseAdmin
+compactRegion(byte[])
 - Method in class org.apache.hadoop.hbase.client.RawAsyncHBaseAdmin
+
+compactRegion(byte[],
 byte[]) - Method in class org.apache.hadoop.hbase.client.RawAsyncHBaseAdmin
 
-compactRegion(byte[], Optional<byte[]>, boolean) - Method in class org.apache.hadoop.hbase.client.RawAsyncHBaseAdmin
+compactRegion(byte[],
 byte[], boolean) - Method in class 
org.apache.hadoop.hbase.client.RawAsyncHBaseAdmin
 
 compactRegion(RpcController,
 AdminProtos.CompactRegionRequest) - Method in class 
org.apache.hadoop.hbase.master.MasterRpcServices
 
@@ -14453,6 +14461,8 @@
 
 compareRegionInfosWithoutReplicaId(RegionInfo,
 RegionInfo) - Static method in class 
org.apache.hadoop.hbase.client.RegionReplicaUtil
 
+compareRegionsWithSplitKeys(List<HRegionLocation>, byte[][]) - Method in class org.apache.hadoop.hbase.client.RawAsyncHBaseAdmin
+
 compareResults(Result,
 Result) - Static method in class org.apache.hadoop.hbase.client.Result
 
 Does a deep comparison of two Results, down to the byte 
arrays.
@@ -19915,13 +19925,15 @@
 
 Creates a new table with the specified number of 
regions.
 
-createTable(TableDescriptor, Optional<byte[][]>) - Method in interface org.apache.hadoop.hbase.client.AsyncAdmin
+createTable(TableDescriptor,
 byte[][]) - Method in interface org.apache.hadoop.hbase.client.AsyncAdmin
 
 Creates a new table with an initial set of empty regions 
defined by the specified split keys.
 
+createTable(TableDescriptor)
 - Method in class org.apache.hadoop.hbase.client.AsyncHBaseAdmin
+
 createTable(TableDescriptor,
 byte[], byte[], int) - Method in class 
org.apache.hadoop.hbase.client.AsyncHBaseAdmin
 
-createTable(TableDescriptor, Optional<byte[][]>) - Method in class org.apache.hadoop.hbase.client.AsyncHBaseAdmin
+createTable(TableDescriptor,
 byte[][]) - Method in class org.apache.hadoop.hbase.client.AsyncHBaseAdmin
 
 createTable(TableDescriptor)
 - Method in class org.apache.hadoop.hbase.client.HBaseAdmin
 
@@ -19929,9 +19941,13 @@
 
 createTable(TableDescriptor,
 byte[][]) - Method in class org.apache.hadoop.hbase.client.HBaseAdmin
 
+createTable(TableDescriptor)
 - Method in class 

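The index entries above record a recurring API change in this publish: AsyncAdmin methods that took an `Optional<byte[]>` (or `Optional<byte[][]>`) parameter, such as `compact`, `compactRegion`, and `createTable`, were replaced by plain overloads. The sketch below illustrates that refactoring pattern only; `CompactApi` and its methods are made-up stand-ins, not the real HBase classes.

```java
import java.util.Optional;

public class CompactApi {
    // Before: a single method with an Optional parameter forces every
    // caller without a column family to pass Optional.empty().
    public static String compactOld(String table, Optional<byte[]> family) {
        return family.map(f -> table + ":" + new String(f)).orElse(table);
    }

    // After: plain overloads, as in the AsyncAdmin diff above. Callers
    // with no column family use the one-argument form directly.
    public static String compact(String table) {
        return table;
    }

    public static String compact(String table, byte[] family) {
        if (family == null) {
            throw new IllegalArgumentException("family must not be null");
        }
        return table + ":" + new String(family);
    }

    public static void main(String[] args) {
        System.out.println(compactOld("t1", Optional.empty())); // prints t1
        System.out.println(compact("t1"));                      // prints t1
        System.out.println(compact("t1", "cf".getBytes()));     // prints t1:cf
    }
}
```

The overload style keeps `Optional` out of method signatures, where a null-checked plain parameter is the more common Java convention.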
[02/51] [partial] hbase-site git commit: Published site at .

2017-11-06 Thread git-site-role
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/testdevapidocs/src-html/org/apache/hadoop/hbase/regionserver/TestHRegionServerBulkLoadWithOldClient.AtomicHFileLoader.html
--
diff --git 
a/testdevapidocs/src-html/org/apache/hadoop/hbase/regionserver/TestHRegionServerBulkLoadWithOldClient.AtomicHFileLoader.html
 
b/testdevapidocs/src-html/org/apache/hadoop/hbase/regionserver/TestHRegionServerBulkLoadWithOldClient.AtomicHFileLoader.html
index d4f7627..4496495 100644
--- 
a/testdevapidocs/src-html/org/apache/hadoop/hbase/regionserver/TestHRegionServerBulkLoadWithOldClient.AtomicHFileLoader.html
+++ 
b/testdevapidocs/src-html/org/apache/hadoop/hbase/regionserver/TestHRegionServerBulkLoadWithOldClient.AtomicHFileLoader.html
@@ -28,152 +28,151 @@
 020import java.io.IOException;
 021import java.util.ArrayList;
 022import java.util.List;
-023import java.util.Optional;
-024import 
java.util.concurrent.atomic.AtomicLong;
-025
-026import org.apache.commons.logging.Log;
-027import 
org.apache.commons.logging.LogFactory;
-028import org.apache.hadoop.fs.FileSystem;
-029import org.apache.hadoop.fs.Path;
-030import 
org.apache.hadoop.hbase.HConstants;
-031import 
org.apache.hadoop.hbase.MultithreadedTestUtil.RepeatingTestThread;
-032import 
org.apache.hadoop.hbase.MultithreadedTestUtil.TestContext;
-033import 
org.apache.hadoop.hbase.TableName;
-034import 
org.apache.hadoop.hbase.client.ClientServiceCallable;
-035import 
org.apache.hadoop.hbase.client.ClusterConnection;
-036import 
org.apache.hadoop.hbase.client.RpcRetryingCaller;
-037import 
org.apache.hadoop.hbase.client.RpcRetryingCallerFactory;
-038import 
org.apache.hadoop.hbase.ipc.RpcControllerFactory;
-039import 
org.apache.hadoop.hbase.shaded.protobuf.RequestConverter;
-040import 
org.apache.hadoop.hbase.shaded.protobuf.generated.AdminProtos;
-041import 
org.apache.hadoop.hbase.shaded.protobuf.generated.AdminProtos.CompactRegionRequest;
-042import 
org.apache.hadoop.hbase.shaded.protobuf.generated.ClientProtos.BulkLoadHFileRequest;
-043import 
org.apache.hadoop.hbase.testclassification.LargeTests;
-044import 
org.apache.hadoop.hbase.testclassification.RegionServerTests;
-045import 
org.apache.hadoop.hbase.util.Bytes;
-046import 
org.apache.hadoop.hbase.util.Pair;
-047import 
org.junit.experimental.categories.Category;
-048import org.junit.runner.RunWith;
-049import org.junit.runners.Parameterized;
-050
-051import 
org.apache.hadoop.hbase.shaded.com.google.common.collect.Lists;
-052
-053/**
-054 * Tests bulk loading of HFiles with old 
non-secure client for backward compatibility. Will be
-055 * removed when old non-secure client for 
backward compatibility is not supported.
-056 */
-057@RunWith(Parameterized.class)
-058@Category({RegionServerTests.class, 
LargeTests.class})
-059public class 
TestHRegionServerBulkLoadWithOldClient extends TestHRegionServerBulkLoad {
-060  public 
TestHRegionServerBulkLoadWithOldClient(int duration) {
-061super(duration);
-062  }
-063
-064  private static final Log LOG = 
LogFactory.getLog(TestHRegionServerBulkLoadWithOldClient.class);
-065
-066  public static class AtomicHFileLoader 
extends RepeatingTestThread {
-067final AtomicLong numBulkLoads = new 
AtomicLong();
-068final AtomicLong numCompactions = new 
AtomicLong();
-069private TableName tableName;
-070
-071public AtomicHFileLoader(TableName 
tableName, TestContext ctx,
-072byte targetFamilies[][]) throws 
IOException {
-073  super(ctx);
-074  this.tableName = tableName;
-075}
-076
-077public void doAnAction() throws 
Exception {
-078  long iteration = 
numBulkLoads.getAndIncrement();
-079  Path dir =  
UTIL.getDataTestDirOnTestFS(String.format("bulkLoad_%08d",
-080  iteration));
-081
-082  // create HFiles for different 
column families
-083  FileSystem fs = 
UTIL.getTestFileSystem();
-084  byte[] val = 
Bytes.toBytes(String.format("%010d", iteration));
-085  final ListPairbyte[], 
String famPaths = new ArrayList(NUM_CFS);
-086  for (int i = 0; i  NUM_CFS; 
i++) {
-087Path hfile = new Path(dir, 
family(i));
-088byte[] fam = 
Bytes.toBytes(family(i));
-089createHFile(fs, hfile, fam, QUAL, 
val, 1000);
-090famPaths.add(new 
Pair(fam, hfile.toString()));
-091  }
-092
-093  // bulk load HFiles
-094  final ClusterConnection conn = 
(ClusterConnection) UTIL.getAdmin().getConnection();
-095  RpcControllerFactory 
rpcControllerFactory = new RpcControllerFactory(UTIL.getConfiguration());
-096  ClientServiceCallableVoid 
callable =
-097  new 
ClientServiceCallableVoid(conn, tableName,
-098  Bytes.toBytes("aaa"), 
rpcControllerFactory.newController(), HConstants.PRIORITY_UNSET) {
-099@Override
-100protected Void rpcCall() throws 
Exception {
-101  LOG.info("Non-secure old 
client");
-102  byte[] regionName = 

[32/51] [partial] hbase-site git commit: Published site at .

2017-11-06 Thread git-site-role
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/devapidocs/src-html/org/apache/hadoop/hbase/client/AsyncRpcRetryingCallerFactory.AdminRequestCallerBuilder.html
--
diff --git 
a/devapidocs/src-html/org/apache/hadoop/hbase/client/AsyncRpcRetryingCallerFactory.AdminRequestCallerBuilder.html
 
b/devapidocs/src-html/org/apache/hadoop/hbase/client/AsyncRpcRetryingCallerFactory.AdminRequestCallerBuilder.html
index 02137e9..a760d1c 100644
--- 
a/devapidocs/src-html/org/apache/hadoop/hbase/client/AsyncRpcRetryingCallerFactory.AdminRequestCallerBuilder.html
+++ 
b/devapidocs/src-html/org/apache/hadoop/hbase/client/AsyncRpcRetryingCallerFactory.AdminRequestCallerBuilder.html
@@ -435,8 +435,8 @@
 427
 428public 
AsyncAdminRequestRetryingCallerT build() {
 429  return new 
AsyncAdminRequestRetryingCallerT(retryTimer, conn, pauseNs, 
maxAttempts,
-430  operationTimeoutNs, 
rpcTimeoutNs, startLogErrorsCnt, serverName, checkNotNull(callable,
-431"action is null"));
+430  operationTimeoutNs, 
rpcTimeoutNs, startLogErrorsCnt, checkNotNull(serverName,
+431"serverName is null"), 
checkNotNull(callable, "action is null"));
 432}
 433
 434public CompletableFutureT 
call() {
@@ -496,8 +496,8 @@
 488
 489public 
AsyncServerRequestRpcRetryingCallerT build() {
 490  return new 
AsyncServerRequestRpcRetryingCallerT(retryTimer, conn, pauseNs, 
maxAttempts,
-491  operationTimeoutNs, 
rpcTimeoutNs, startLogErrorsCnt, serverName, checkNotNull(callable,
-492"action is null"));
+491  operationTimeoutNs, 
rpcTimeoutNs, startLogErrorsCnt, checkNotNull(serverName,
+492"serverName is null"), 
checkNotNull(callable, "action is null"));
 493}
 494
 495public CompletableFutureT 
call() {

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/devapidocs/src-html/org/apache/hadoop/hbase/client/AsyncRpcRetryingCallerFactory.BatchCallerBuilder.html
--
diff --git 
a/devapidocs/src-html/org/apache/hadoop/hbase/client/AsyncRpcRetryingCallerFactory.BatchCallerBuilder.html
 
b/devapidocs/src-html/org/apache/hadoop/hbase/client/AsyncRpcRetryingCallerFactory.BatchCallerBuilder.html
index 02137e9..a760d1c 100644
--- 
a/devapidocs/src-html/org/apache/hadoop/hbase/client/AsyncRpcRetryingCallerFactory.BatchCallerBuilder.html
+++ 
b/devapidocs/src-html/org/apache/hadoop/hbase/client/AsyncRpcRetryingCallerFactory.BatchCallerBuilder.html
@@ -435,8 +435,8 @@
 427
 428public 
AsyncAdminRequestRetryingCallerT build() {
 429  return new 
AsyncAdminRequestRetryingCallerT(retryTimer, conn, pauseNs, 
maxAttempts,
-430  operationTimeoutNs, 
rpcTimeoutNs, startLogErrorsCnt, serverName, checkNotNull(callable,
-431"action is null"));
+430  operationTimeoutNs, 
rpcTimeoutNs, startLogErrorsCnt, checkNotNull(serverName,
+431"serverName is null"), 
checkNotNull(callable, "action is null"));
 432}
 433
 434public CompletableFutureT 
call() {
@@ -496,8 +496,8 @@
 488
 489public 
AsyncServerRequestRpcRetryingCallerT build() {
 490  return new 
AsyncServerRequestRpcRetryingCallerT(retryTimer, conn, pauseNs, 
maxAttempts,
-491  operationTimeoutNs, 
rpcTimeoutNs, startLogErrorsCnt, serverName, checkNotNull(callable,
-492"action is null"));
+491  operationTimeoutNs, 
rpcTimeoutNs, startLogErrorsCnt, checkNotNull(serverName,
+492"serverName is null"), 
checkNotNull(callable, "action is null"));
 493}
 494
 495public CompletableFutureT 
call() {

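The hunks above show the same one-line fix in each caller builder: `build()` previously null-checked only `callable` and passed `serverName` through unchecked, and the change wraps `serverName` in `checkNotNull(..., "serverName is null")` as well, so a missing field fails fast at build time with a descriptive message. A minimal, self-contained sketch of that pattern — `CallerBuilder` is a made-up stand-in, and `checkNotNull` here is a hand-rolled equivalent of the Guava-style precondition used in the diff:

```java
public class CallerBuilder {
    private String serverName;
    private Runnable callable;

    // Fail fast with a descriptive message instead of a bare NPE
    // surfacing later, deep inside the constructed caller.
    public static <T> T checkNotNull(T ref, String message) {
        if (ref == null) {
            throw new NullPointerException(message);
        }
        return ref;
    }

    public CallerBuilder serverName(String serverName) {
        this.serverName = serverName;
        return this;
    }

    public CallerBuilder action(Runnable callable) {
        this.callable = callable;
        return this;
    }

    // Mirrors the fix above: validate *both* required fields at build()
    // time, not just the callable.
    public String build() {
        return "caller for " + checkNotNull(serverName, "serverName is null")
            + " running " + checkNotNull(callable, "action is null");
    }

    public static void main(String[] args) {
        System.out.println(new CallerBuilder()
            .serverName("rs1,16020,1").action(() -> { }).build());
        try {
            new CallerBuilder().action(() -> { }).build();
        } catch (NullPointerException expected) {
            System.out.println(expected.getMessage()); // prints serverName is null
        }
    }
}
```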
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/devapidocs/src-html/org/apache/hadoop/hbase/client/AsyncRpcRetryingCallerFactory.BuilderBase.html
--
diff --git 
a/devapidocs/src-html/org/apache/hadoop/hbase/client/AsyncRpcRetryingCallerFactory.BuilderBase.html
 
b/devapidocs/src-html/org/apache/hadoop/hbase/client/AsyncRpcRetryingCallerFactory.BuilderBase.html
index 02137e9..a760d1c 100644
--- 
a/devapidocs/src-html/org/apache/hadoop/hbase/client/AsyncRpcRetryingCallerFactory.BuilderBase.html
+++ 
b/devapidocs/src-html/org/apache/hadoop/hbase/client/AsyncRpcRetryingCallerFactory.BuilderBase.html
@@ -435,8 +435,8 @@
 427
 428public 
AsyncAdminRequestRetryingCallerT build() {
 429  return new 
AsyncAdminRequestRetryingCallerT(retryTimer, conn, pauseNs, 
maxAttempts,
-430  operationTimeoutNs, 
rpcTimeoutNs, startLogErrorsCnt, serverName, checkNotNull(callable,
-431"action is null"));
+430  operationTimeoutNs, 
rpcTimeoutNs, startLogErrorsCnt, checkNotNull(serverName,
+431"serverName is null"), 
checkNotNull(callable, "action is null"));
 432}
 433
 434public CompletableFutureT 
call() {
@@ -496,8 +496,8 @@
 488
 489public 

[48/51] [partial] hbase-site git commit: Published site at .

2017-11-06 Thread git-site-role
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/apidocs/org/apache/hadoop/hbase/client/class-use/SnapshotDescription.html
--
diff --git 
a/apidocs/org/apache/hadoop/hbase/client/class-use/SnapshotDescription.html 
b/apidocs/org/apache/hadoop/hbase/client/class-use/SnapshotDescription.html
index 982fe25..0c14381 100644
--- a/apidocs/org/apache/hadoop/hbase/client/class-use/SnapshotDescription.html
+++ b/apidocs/org/apache/hadoop/hbase/client/class-use/SnapshotDescription.html
@@ -109,7 +109,7 @@
 
 
 
-default http://docs.oracle.com/javase/8/docs/api/java/util/concurrent/CompletableFuture.html?is-external=true;
 title="class or interface in java.util.concurrent">CompletableFuturehttp://docs.oracle.com/javase/8/docs/api/java/util/List.html?is-external=true;
 title="class or interface in java.util">ListSnapshotDescription
+http://docs.oracle.com/javase/8/docs/api/java/util/concurrent/CompletableFuture.html?is-external=true;
 title="class or interface in java.util.concurrent">CompletableFuturehttp://docs.oracle.com/javase/8/docs/api/java/util/List.html?is-external=true;
 title="class or interface in java.util">ListSnapshotDescription
 AsyncAdmin.listSnapshots()
 List completed snapshots.
 
@@ -122,7 +122,7 @@
 
 
 http://docs.oracle.com/javase/8/docs/api/java/util/concurrent/CompletableFuture.html?is-external=true;
 title="class or interface in java.util.concurrent">CompletableFuturehttp://docs.oracle.com/javase/8/docs/api/java/util/List.html?is-external=true;
 title="class or interface in java.util">ListSnapshotDescription
-AsyncAdmin.listSnapshots(http://docs.oracle.com/javase/8/docs/api/java/util/Optional.html?is-external=true;
 title="class or interface in java.util">Optionalhttp://docs.oracle.com/javase/8/docs/api/java/util/regex/Pattern.html?is-external=true;
 title="class or interface in 
java.util.regex">Patternpattern)
+AsyncAdmin.listSnapshots(http://docs.oracle.com/javase/8/docs/api/java/util/regex/Pattern.html?is-external=true;
 title="class or interface in java.util.regex">Patternpattern)
 List all the completed snapshots matching the given 
pattern.
 
 
@@ -143,13 +143,19 @@
 
 
 http://docs.oracle.com/javase/8/docs/api/java/util/concurrent/CompletableFuture.html?is-external=true;
 title="class or interface in java.util.concurrent">CompletableFuturehttp://docs.oracle.com/javase/8/docs/api/java/util/List.html?is-external=true;
 title="class or interface in java.util">ListSnapshotDescription
+AsyncAdmin.listTableSnapshots(http://docs.oracle.com/javase/8/docs/api/java/util/regex/Pattern.html?is-external=true;
 title="class or interface in 
java.util.regex">PatterntableNamePattern)
+List all the completed snapshots matching the given table 
name pattern.
+
+
+
+http://docs.oracle.com/javase/8/docs/api/java/util/concurrent/CompletableFuture.html?is-external=true;
 title="class or interface in java.util.concurrent">CompletableFuturehttp://docs.oracle.com/javase/8/docs/api/java/util/List.html?is-external=true;
 title="class or interface in java.util">ListSnapshotDescription
 AsyncAdmin.listTableSnapshots(http://docs.oracle.com/javase/8/docs/api/java/util/regex/Pattern.html?is-external=true;
 title="class or interface in 
java.util.regex">PatterntableNamePattern,
   http://docs.oracle.com/javase/8/docs/api/java/util/regex/Pattern.html?is-external=true;
 title="class or interface in 
java.util.regex">PatternsnapshotNamePattern)
 List all the completed snapshots matching the given table 
name regular expression and snapshot
  name regular expression.
 
 
-
+
 http://docs.oracle.com/javase/8/docs/api/java/util/List.html?is-external=true;
 title="class or interface in java.util">ListSnapshotDescription
 Admin.listTableSnapshots(http://docs.oracle.com/javase/8/docs/api/java/util/regex/Pattern.html?is-external=true;
 title="class or interface in 
java.util.regex">PatterntableNamePattern,
   http://docs.oracle.com/javase/8/docs/api/java/util/regex/Pattern.html?is-external=true;
 title="class or interface in 
java.util.regex">PatternsnapshotNamePattern)
@@ -157,7 +163,7 @@
  name regular expression.
 
 
-
+
 http://docs.oracle.com/javase/8/docs/api/java/util/List.html?is-external=true;
 title="class or interface in java.util">ListSnapshotDescription
 Admin.listTableSnapshots(http://docs.oracle.com/javase/8/docs/api/java/lang/String.html?is-external=true;
 title="class or interface in java.lang">StringtableNameRegex,
   http://docs.oracle.com/javase/8/docs/api/java/lang/String.html?is-external=true;
 title="class or interface in 
java.lang">StringsnapshotNameRegex)

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/apidocs/org/apache/hadoop/hbase/client/class-use/TableDescriptor.html
--
diff --git 
a/apidocs/org/apache/hadoop/hbase/client/class-use/TableDescriptor.html 

[19/51] [partial] hbase-site git commit: Published site at .

2017-11-06 Thread git-site-role
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.ModifyColumnFamilyProcedureBiConsumer.html
--
diff --git 
a/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.ModifyColumnFamilyProcedureBiConsumer.html
 
b/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.ModifyColumnFamilyProcedureBiConsumer.html
index 531081e..a22e5ce 100644
--- 
a/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.ModifyColumnFamilyProcedureBiConsumer.html
+++ 
b/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.ModifyColumnFamilyProcedureBiConsumer.html
@@ -34,2832 +34,3011 @@
 026import java.util.Collections;
 027import java.util.EnumSet;
 028import java.util.HashMap;
-029import java.util.LinkedList;
-030import java.util.List;
-031import java.util.Map;
-032import java.util.Optional;
-033import java.util.Set;
-034import 
java.util.concurrent.CompletableFuture;
-035import java.util.concurrent.TimeUnit;
-036import 
java.util.concurrent.atomic.AtomicReference;
-037import java.util.function.BiConsumer;
-038import java.util.function.Function;
-039import java.util.regex.Pattern;
-040import java.util.stream.Collectors;
-041import java.util.stream.Stream;
-042
-043import org.apache.commons.io.IOUtils;
-044import org.apache.commons.logging.Log;
-045import 
org.apache.commons.logging.LogFactory;
-046import 
org.apache.hadoop.hbase.AsyncMetaTableAccessor;
-047import 
org.apache.hadoop.hbase.ClusterStatus;
-048import 
org.apache.hadoop.hbase.ClusterStatus.Option;
-049import 
org.apache.hadoop.hbase.HConstants;
-050import 
org.apache.hadoop.hbase.HRegionLocation;
-051import 
org.apache.hadoop.hbase.MetaTableAccessor;
-052import 
org.apache.hadoop.hbase.MetaTableAccessor.QueryType;
-053import 
org.apache.hadoop.hbase.NamespaceDescriptor;
-054import 
org.apache.hadoop.hbase.RegionLoad;
-055import 
org.apache.hadoop.hbase.RegionLocations;
-056import 
org.apache.hadoop.hbase.ServerName;
-057import 
org.apache.hadoop.hbase.TableExistsException;
-058import 
org.apache.hadoop.hbase.TableName;
-059import 
org.apache.hadoop.hbase.TableNotDisabledException;
-060import 
org.apache.hadoop.hbase.TableNotEnabledException;
-061import 
org.apache.hadoop.hbase.TableNotFoundException;
-062import 
org.apache.hadoop.hbase.UnknownRegionException;
-063import 
org.apache.hadoop.hbase.client.AsyncRpcRetryingCallerFactory.AdminRequestCallerBuilder;
-064import 
[09/51] [partial] hbase-site git commit: Published site at .

2017-11-06 Thread git-site-role
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/hbase-build-configuration/hbase-archetypes/hbase-archetype-builder/integration.html
--
diff --git 
a/hbase-build-configuration/hbase-archetypes/hbase-archetype-builder/integration.html
 
b/hbase-build-configuration/hbase-archetypes/hbase-archetype-builder/integration.html
index 09d2b0e..6bb0861 100644
--- 
a/hbase-build-configuration/hbase-archetypes/hbase-archetype-builder/integration.html
+++ 
b/hbase-build-configuration/hbase-archetypes/hbase-archetype-builder/integration.html
@@ -7,7 +7,7 @@
   
 
 
-
+
 
 Apache HBase - Archetype builder  CI Management
 
@@ -126,7 +126,7 @@
 https://www.apache.org/;>The Apache Software 
Foundation.
 All rights reserved.  
 
-  Last Published: 
2017-11-05
+  Last Published: 
2017-11-06
 
 
 

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/hbase-build-configuration/hbase-archetypes/hbase-archetype-builder/issue-tracking.html
--
diff --git 
a/hbase-build-configuration/hbase-archetypes/hbase-archetype-builder/issue-tracking.html
 
b/hbase-build-configuration/hbase-archetypes/hbase-archetype-builder/issue-tracking.html
index 3cc7d1b..aa432cc 100644
--- 
a/hbase-build-configuration/hbase-archetypes/hbase-archetype-builder/issue-tracking.html
+++ 
b/hbase-build-configuration/hbase-archetypes/hbase-archetype-builder/issue-tracking.html
@@ -7,7 +7,7 @@
   
 
 
-
+
 
 Apache HBase - Archetype builder  Issue Management
 
@@ -123,7 +123,7 @@
 https://www.apache.org/;>The Apache Software 
Foundation.
 All rights reserved.  
 
-  Last Published: 
2017-11-05
+  Last Published: 
2017-11-06
 
 
 

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/hbase-build-configuration/hbase-archetypes/hbase-archetype-builder/license.html
--
diff --git 
a/hbase-build-configuration/hbase-archetypes/hbase-archetype-builder/license.html
 
b/hbase-build-configuration/hbase-archetypes/hbase-archetype-builder/license.html
index 794f7de..ec066c2 100644
--- 
a/hbase-build-configuration/hbase-archetypes/hbase-archetype-builder/license.html
+++ 
b/hbase-build-configuration/hbase-archetypes/hbase-archetype-builder/license.html
@@ -7,7 +7,7 @@
   
 
 
-
+
 
 Apache HBase - Archetype builder  Project Licenses
 
@@ -326,7 +326,7 @@
 https://www.apache.org/;>The Apache Software 
Foundation.
 All rights reserved.  
 
-  Last Published: 
2017-11-05
+  Last Published: 
2017-11-06
 
 
 

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/hbase-build-configuration/hbase-archetypes/hbase-archetype-builder/mail-lists.html
--
diff --git 
a/hbase-build-configuration/hbase-archetypes/hbase-archetype-builder/mail-lists.html
 
b/hbase-build-configuration/hbase-archetypes/hbase-archetype-builder/mail-lists.html
index 4307e85..b177169 100644
--- 
a/hbase-build-configuration/hbase-archetypes/hbase-archetype-builder/mail-lists.html
+++ 
b/hbase-build-configuration/hbase-archetypes/hbase-archetype-builder/mail-lists.html
@@ -7,7 +7,7 @@
   
 
 
-
+
 
 Apache HBase - Archetype builder  Project Mailing 
Lists
 
@@ -176,7 +176,7 @@
 https://www.apache.org/;>The Apache Software 
Foundation.
 All rights reserved.  
 
-  Last Published: 
2017-11-05
+  Last Published: 
2017-11-06
 
 
 

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/hbase-build-configuration/hbase-archetypes/hbase-archetype-builder/plugin-management.html
--
diff --git 
a/hbase-build-configuration/hbase-archetypes/hbase-archetype-builder/plugin-management.html
 
b/hbase-build-configuration/hbase-archetypes/hbase-archetype-builder/plugin-management.html
index 82530d0..0b66f74 100644
--- 
a/hbase-build-configuration/hbase-archetypes/hbase-archetype-builder/plugin-management.html
+++ 
b/hbase-build-configuration/hbase-archetypes/hbase-archetype-builder/plugin-management.html
@@ -7,7 +7,7 @@
   
 
 
-
+
 
 Apache HBase - Archetype builder  Project Plugin 
Management
 
@@ -271,7 +271,7 @@
 https://www.apache.org/;>The Apache Software 
Foundation.
 All rights reserved.  
  

[51/51] [partial] hbase-site git commit: Published site at .

2017-11-06 Thread git-site-role
Published site at .


Project: http://git-wip-us.apache.org/repos/asf/hbase-site/repo
Commit: http://git-wip-us.apache.org/repos/asf/hbase-site/commit/32453e2d
Tree: http://git-wip-us.apache.org/repos/asf/hbase-site/tree/32453e2d
Diff: http://git-wip-us.apache.org/repos/asf/hbase-site/diff/32453e2d

Branch: refs/heads/asf-site
Commit: 32453e2dd62411b13fb460535716bfaa8abde622
Parents: 69b2418
Author: jenkins 
Authored: Mon Nov 6 15:16:54 2017 +
Committer: jenkins 
Committed: Mon Nov 6 15:16:54 2017 +

--
 acid-semantics.html | 4 +-
 apache_hbase_reference_guide.pdf| 4 +-
 apidocs/index-all.html  |48 +-
 .../hadoop/hbase/class-use/RegionLoad.html  | 6 +-
 .../hadoop/hbase/class-use/ServerName.html  |32 +-
 .../hadoop/hbase/class-use/TableName.html   |   154 +-
 .../apache/hadoop/hbase/client/AsyncAdmin.html  |   703 +-
 .../client/class-use/SnapshotDescription.html   |14 +-
 .../hbase/client/class-use/TableDescriptor.html |28 +-
 .../hbase/mapreduce/TableMapReduceUtil.html | 6 +-
 .../class-use/ReplicationPeerDescription.html   | 4 +-
 .../apache/hadoop/hbase/client/AsyncAdmin.html  |  1784 +-
 .../hbase/mapreduce/TableMapReduceUtil.html |   459 +-
 book.html   | 2 +-
 bulk-loads.html | 4 +-
 checkstyle-aggregate.html   | 44256 -
 checkstyle.rss  |10 +-
 coc.html| 4 +-
 cygwin.html | 4 +-
 dependencies.html   | 4 +-
 dependency-convergence.html | 4 +-
 dependency-info.html| 4 +-
 dependency-management.html  | 4 +-
 devapidocs/constant-values.html | 6 +-
 devapidocs/index-all.html   |   192 +-
 .../hadoop/hbase/backup/package-tree.html   | 2 +-
 .../hadoop/hbase/class-use/HRegionLocation.html |21 +-
 .../hadoop/hbase/class-use/RegionLoad.html  |35 +-
 .../hadoop/hbase/class-use/ServerName.html  |   125 +-
 .../hadoop/hbase/class-use/TableName.html   |   386 +-
 .../apache/hadoop/hbase/client/AsyncAdmin.html  |   703 +-
 .../hadoop/hbase/client/AsyncHBaseAdmin.html|   957 +-
 ...dmin.AddColumnFamilyProcedureBiConsumer.html | 6 +-
 .../client/RawAsyncHBaseAdmin.AdminRpcCall.html | 4 +-
 .../client/RawAsyncHBaseAdmin.Converter.html| 4 +-
 ...dmin.CreateNamespaceProcedureBiConsumer.html | 6 +-
 ...aseAdmin.CreateTableProcedureBiConsumer.html | 6 +-
 ...n.DeleteColumnFamilyProcedureBiConsumer.html | 6 +-
 ...dmin.DeleteNamespaceProcedureBiConsumer.html | 6 +-
 ...aseAdmin.DeleteTableProcedureBiConsumer.html | 8 +-
 ...seAdmin.DisableTableProcedureBiConsumer.html | 6 +-
 ...aseAdmin.EnableTableProcedureBiConsumer.html | 6 +-
 .../RawAsyncHBaseAdmin.MasterRpcCall.html   | 4 +-
 ...min.MergeTableRegionProcedureBiConsumer.html | 6 +-
 ...n.ModifyColumnFamilyProcedureBiConsumer.html | 6 +-
 ...dmin.ModifyNamespaceProcedureBiConsumer.html | 6 +-
 ...HBaseAdmin.NamespaceProcedureBiConsumer.html |14 +-
 .../RawAsyncHBaseAdmin.ProcedureBiConsumer.html |12 +-
 ...min.SplitTableRegionProcedureBiConsumer.html | 6 +-
 .../RawAsyncHBaseAdmin.TableOperator.html   | 4 +-
 ...syncHBaseAdmin.TableProcedureBiConsumer.html |14 +-
 ...eAdmin.TruncateTableProcedureBiConsumer.html | 6 +-
 .../hadoop/hbase/client/RawAsyncHBaseAdmin.html |  1232 +-
 .../hbase/client/class-use/CompactType.html |24 +-
 .../hbase/client/class-use/RegionInfo.html  | 8 +-
 .../client/class-use/SnapshotDescription.html   |65 +-
 .../hbase/client/class-use/TableDescriptor.html |   106 +-
 .../hadoop/hbase/client/package-tree.html   |26 +-
 .../hadoop/hbase/executor/package-tree.html | 2 +-
 .../hadoop/hbase/filter/package-tree.html   | 6 +-
 .../hadoop/hbase/io/hfile/package-tree.html | 6 +-
 .../apache/hadoop/hbase/ipc/package-tree.html   | 4 +-
 .../hbase/mapreduce/TableMapReduceUtil.html |16 +-
 .../hadoop/hbase/mapreduce/package-tree.html| 4 +-
 .../hbase/master/balancer/package-tree.html | 2 +-
 .../hadoop/hbase/master/package-tree.html   | 6 +-
 .../org/apache/hadoop/hbase/package-tree.html   |14 +-
 .../hadoop/hbase/procedure2/package-tree.html   | 4 +-
 .../hadoop/hbase/quotas/package-tree.html   | 6 +-
 .../hadoop/hbase/regionserver/package-tree.html |14 +-
 .../regionserver/querymatcher/package-tree.html | 2 +-
 .../class-use/ReplicationPeerDescription.html   |36 +-
 

[13/51] [partial] hbase-site git commit: Published site at .

2017-11-06 Thread git-site-role
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.TableProcedureBiConsumer.html
--
diff --git 
a/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.TableProcedureBiConsumer.html
 
b/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.TableProcedureBiConsumer.html
index 531081e..a22e5ce 100644
--- 
a/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.TableProcedureBiConsumer.html
+++ 
b/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.TableProcedureBiConsumer.html
@@ -34,2832 +34,3011 @@
 026import java.util.Collections;
 027import java.util.EnumSet;
 028import java.util.HashMap;
-029import java.util.LinkedList;
-030import java.util.List;
-031import java.util.Map;
-032import java.util.Optional;
-033import java.util.Set;
-034import 
java.util.concurrent.CompletableFuture;
-035import java.util.concurrent.TimeUnit;
-036import 
java.util.concurrent.atomic.AtomicReference;
-037import java.util.function.BiConsumer;
-038import java.util.function.Function;
-039import java.util.regex.Pattern;
-040import java.util.stream.Collectors;
-041import java.util.stream.Stream;
-042
-043import org.apache.commons.io.IOUtils;
-044import org.apache.commons.logging.Log;
-045import 
org.apache.commons.logging.LogFactory;
-046import 
org.apache.hadoop.hbase.AsyncMetaTableAccessor;
-047import 
org.apache.hadoop.hbase.ClusterStatus;
-048import 
org.apache.hadoop.hbase.ClusterStatus.Option;
-049import 
org.apache.hadoop.hbase.HConstants;
-050import 
org.apache.hadoop.hbase.HRegionLocation;
-051import 
org.apache.hadoop.hbase.MetaTableAccessor;
-052import 
org.apache.hadoop.hbase.MetaTableAccessor.QueryType;
-053import 
org.apache.hadoop.hbase.NamespaceDescriptor;
-054import 
org.apache.hadoop.hbase.RegionLoad;
-055import 
org.apache.hadoop.hbase.RegionLocations;
-056import 
org.apache.hadoop.hbase.ServerName;
-057import 
org.apache.hadoop.hbase.TableExistsException;
-058import 
org.apache.hadoop.hbase.TableName;
-059import 
org.apache.hadoop.hbase.TableNotDisabledException;
-060import 
org.apache.hadoop.hbase.TableNotEnabledException;
-061import 
org.apache.hadoop.hbase.TableNotFoundException;
-062import 
org.apache.hadoop.hbase.UnknownRegionException;
-063import 
org.apache.hadoop.hbase.client.AsyncRpcRetryingCallerFactory.AdminRequestCallerBuilder;
-064import 
org.apache.hadoop.hbase.client.AsyncRpcRetryingCallerFactory.MasterRequestCallerBuilder;
-065import 
org.apache.hadoop.hbase.client.AsyncRpcRetryingCallerFactory.ServerRequestCallerBuilder;
-066import 
org.apache.hadoop.hbase.client.RawAsyncTable.CoprocessorCallable;
-067import 
org.apache.hadoop.hbase.client.Scan.ReadType;
-068import 
org.apache.hadoop.hbase.client.replication.ReplicationSerDeHelper;
-069import 
org.apache.hadoop.hbase.client.replication.TableCFs;
-070import 
org.apache.hadoop.hbase.client.security.SecurityCapability;
-071import 
org.apache.hadoop.hbase.exceptions.DeserializationException;
-072import 
org.apache.hadoop.hbase.ipc.HBaseRpcController;
-073import 
org.apache.hadoop.hbase.quotas.QuotaFilter;
-074import 
org.apache.hadoop.hbase.quotas.QuotaSettings;
-075import 
org.apache.hadoop.hbase.quotas.QuotaTableUtil;
-076import 
org.apache.hadoop.hbase.replication.ReplicationException;
-077import 
org.apache.hadoop.hbase.replication.ReplicationPeerConfig;
-078import 
org.apache.hadoop.hbase.replication.ReplicationPeerDescription;
-079import 
org.apache.hadoop.hbase.snapshot.ClientSnapshotDescriptionUtils;
-080import 
org.apache.hadoop.hbase.snapshot.RestoreSnapshotException;
-081import 
org.apache.hadoop.hbase.snapshot.SnapshotCreationException;
-082import 
org.apache.hadoop.hbase.util.Bytes;
-083import 
org.apache.hadoop.hbase.util.EnvironmentEdgeManager;
-084import 
org.apache.hadoop.hbase.util.ForeignExceptionUtil;
-085import 
org.apache.hadoop.hbase.util.Pair;
-086import 
org.apache.yetus.audience.InterfaceAudience;
-087
-088import 
org.apache.hadoop.hbase.shaded.com.google.protobuf.RpcCallback;
-089import 
org.apache.hadoop.hbase.shaded.io.netty.util.Timeout;
-090import 
org.apache.hadoop.hbase.shaded.io.netty.util.TimerTask;
-091import 
org.apache.hadoop.hbase.shaded.protobuf.ProtobufUtil;
-092import 
org.apache.hadoop.hbase.shaded.protobuf.RequestConverter;
-093import 
org.apache.hadoop.hbase.shaded.protobuf.generated.AdminProtos.AdminService;
-094import 
org.apache.hadoop.hbase.shaded.protobuf.generated.AdminProtos.ClearCompactionQueuesRequest;
-095import 
org.apache.hadoop.hbase.shaded.protobuf.generated.AdminProtos.ClearCompactionQueuesResponse;
-096import 
org.apache.hadoop.hbase.shaded.protobuf.generated.AdminProtos.CompactRegionRequest;
-097import 
org.apache.hadoop.hbase.shaded.protobuf.generated.AdminProtos.CompactRegionResponse;
-098import 

[20/51] [partial] hbase-site git commit: Published site at .

2017-11-06 Thread git-site-role
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.MergeTableRegionProcedureBiConsumer.html
--
diff --git 
a/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.MergeTableRegionProcedureBiConsumer.html
 
b/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.MergeTableRegionProcedureBiConsumer.html
index 531081e..a22e5ce 100644
--- 
a/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.MergeTableRegionProcedureBiConsumer.html
+++ 
b/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.MergeTableRegionProcedureBiConsumer.html
@@ -34,2832 +34,3011 @@

[34/51] [partial] hbase-site git commit: Published site at .

2017-11-06 Thread git-site-role
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/devapidocs/src-html/org/apache/hadoop/hbase/client/AsyncAdmin.html
--
diff --git a/devapidocs/src-html/org/apache/hadoop/hbase/client/AsyncAdmin.html 
b/devapidocs/src-html/org/apache/hadoop/hbase/client/AsyncAdmin.html
index 4aaf926..e249fd7 100644
--- a/devapidocs/src-html/org/apache/hadoop/hbase/client/AsyncAdmin.html
+++ b/devapidocs/src-html/org/apache/hadoop/hbase/client/AsyncAdmin.html
@@ -72,489 +72,489 @@
 064  /**
 065   * List all the userspace tables.
 066   * @return - returns a list of 
TableDescriptors wrapped by a {@link CompletableFuture}.
-067   * @see #listTables(Optional, 
boolean)
-068   */
-069  default 
CompletableFutureListTableDescriptor listTables() {
-070return listTables(Optional.empty(), 
false);
-071  }
-072
-073  /**
-074   * List all the tables matching the 
given pattern.
-075   * @param pattern The compiled regular 
expression to match against
-076   * @param includeSysTables False to 
match only against userspace tables
-077   * @return - returns a list of 
TableDescriptors wrapped by a {@link CompletableFuture}.
-078   */
-079  
CompletableFutureListTableDescriptor 
listTables(OptionalPattern pattern,
-080  boolean includeSysTables);
-081
-082  /**
-083   * List all of the names of userspace 
tables.
-084   * @return a list of table names 
wrapped by a {@link CompletableFuture}.
-085   * @see #listTableNames(Optional, 
boolean)
-086   */
-087  default 
CompletableFutureListTableName listTableNames() {
-088return 
listTableNames(Optional.empty(), false);
-089  }
-090
-091  /**
-092   * List all of the names of userspace 
tables.
-093   * @param pattern The regular 
expression to match against
-094   * @param includeSysTables False to 
match only against userspace tables
-095   * @return a list of table names 
wrapped by a {@link CompletableFuture}.
-096   */
-097  
CompletableFutureListTableName 
listTableNames(OptionalPattern pattern,
-098  boolean includeSysTables);
-099
-100  /**
-101   * Method for getting the 
tableDescriptor
-102   * @param tableName as a {@link 
TableName}
-103   * @return the read-only 
tableDescriptor wrapped by a {@link CompletableFuture}.
-104   */
-105  
CompletableFutureTableDescriptor getTableDescriptor(TableName 
tableName);
-106
-107  /**
-108   * Creates a new table.
-109   * @param desc table descriptor for 
table
-110   */
-111  default CompletableFutureVoid 
createTable(TableDescriptor desc) {
-112return createTable(desc, 
Optional.empty());
-113  }
-114
-115  /**
-116   * Creates a new table with the 
specified number of regions. The start key specified will become
-117   * the end key of the first region of 
the table, and the end key specified will become the start
-118   * key of the last region of the table 
(the first region has a null start key and the last region
-119   * has a null end key). BigInteger math 
will be used to divide the key range specified into enough
-120   * segments to make the required number 
of total regions.
-121   * @param desc table descriptor for 
table
-122   * @param startKey beginning of key 
range
-123   * @param endKey end of key range
-124   * @param numRegions the total number 
of regions to create
-125   */
-126  CompletableFutureVoid 
createTable(TableDescriptor desc, byte[] startKey, byte[] endKey,
-127  int numRegions);
-128
-129  /**
-130   * Creates a new table with an initial 
set of empty regions defined by the specified split keys.
-131   * The total number of regions created 
will be the number of split keys plus one.
-132   * Note : Avoid passing empty split 
key.
-133   * @param desc table descriptor for 
table
-134   * @param splitKeys array of split keys 
for the initial regions of the table
-135   */
-136  CompletableFutureVoid 
createTable(TableDescriptor desc, Optionalbyte[][] splitKeys);
+067   */
+068  default CompletableFuture<List<TableDescriptor>> listTables() {
+069    return listTables(false);
+070  }
+071
+072  /**
+073   * List all the tables.
+074   * @param includeSysTables False to match only against userspace tables
+075   * @return - returns a list of TableDescriptors wrapped by a {@link CompletableFuture}.
+076   */
+077  CompletableFuture<List<TableDescriptor>> listTables(boolean includeSysTables);
+078
+079  /**
+080   * List all the tables matching the given pattern.
+081   * @param pattern The compiled regular expression to match against
+082   * @param includeSysTables False to match only against userspace tables
+083   * @return - returns a list of TableDescriptors wrapped by a {@link CompletableFuture}.
+084   */
+085  CompletableFuture<List<TableDescriptor>> listTables(Pattern pattern, boolean includeSysTables);
+086
+087  /**
+088   * List all of the names of userspace tables.
+089   * @return a list of table names wrapped by a {@link CompletableFuture}.
+090   * @see #listTableNames(Pattern, 

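The createTable javadoc in the hunk above says the [startKey, endKey] range is divided with BigInteger math into enough segments for the requested number of regions. A minimal, self-contained sketch of that computation (illustrative only; class and method names are not HBase API, and HBase's own Bytes.split additionally pads keys to a common length and handles edge cases this sketch omits):

```java
import java.math.BigInteger;
import java.util.ArrayList;
import java.util.List;

public class SplitKeys {
    // Compute the (numRegions - 1) split points between startKey and endKey,
    // treating both keys as unsigned big-endian integers, in the spirit of
    // the createTable(TableDescriptor, byte[], byte[], int) javadoc.
    static List<byte[]> split(byte[] startKey, byte[] endKey, int numRegions) {
        if (numRegions < 2) {
            throw new IllegalArgumentException("need at least two regions");
        }
        BigInteger start = new BigInteger(1, startKey); // signum 1 = non-negative
        BigInteger end = new BigInteger(1, endKey);
        BigInteger step = end.subtract(start).divide(BigInteger.valueOf(numRegions));
        List<byte[]> splits = new ArrayList<>();
        for (int i = 1; i < numRegions; i++) {
            // i-th split point: start + i * step
            splits.add(start.add(step.multiply(BigInteger.valueOf(i))).toByteArray());
        }
        return splits;
    }

    public static void main(String[] args) {
        // Key range 0x00..0x10 cut into 4 regions -> split points 0x04, 0x08, 0x0c.
        List<byte[]> splits = split(new byte[] { 0x00 }, new byte[] { 0x10 }, 4);
        System.out.println(splits.size()); // 3
    }
}
```

The first region keeps a null start key and the last a null end key, so four regions need only the three computed boundaries.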
[50/51] [partial] hbase-site git commit: Published site at .

2017-11-06 Thread git-site-role
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/apidocs/org/apache/hadoop/hbase/class-use/TableName.html
--
diff --git a/apidocs/org/apache/hadoop/hbase/class-use/TableName.html 
b/apidocs/org/apache/hadoop/hbase/class-use/TableName.html
index 78a18ae..a7aa3e3 100644
--- a/apidocs/org/apache/hadoop/hbase/class-use/TableName.html
+++ b/apidocs/org/apache/hadoop/hbase/class-use/TableName.html
@@ -538,7 +538,13 @@ Input/OutputFormats, a table indexing MapReduce job, and 
utility methods.
 
 
 CompletableFuture<List<TableName>>
-AsyncAdmin.listTableNames(Optional<Pattern> pattern,
+AsyncAdmin.listTableNames(boolean includeSysTables)
+List all of the names of tables.
+
+CompletableFuture<List<TableName>>
+AsyncAdmin.listTableNames(Pattern pattern,
   boolean includeSysTables)
 List all of the names of userspace tables.
 
@@ -622,7 +628,7 @@ Input/OutputFormats, a table indexing MapReduce job, and 
utility methods.
 
 
 
-default CompletableFuture<Void>
+CompletableFuture<Void>
 AsyncAdmin.compact(TableName tableName)
 Compact a table.
 
@@ -634,13 +640,20 @@ Input/OutputFormats, a table indexing MapReduce job, and 
utility methods.
 
 
 
+CompletableFuture<Void>
+AsyncAdmin.compact(TableName tableName,
+   byte[] columnFamily)
+Compact a column family within a table.
+
+
+
 void
 Admin.compact(TableNametableName,
byte[]columnFamily)
 Compact a column family within a table.
 
 
-
+
 void
 Admin.compact(TableNametableName,
byte[]columnFamily,
@@ -648,20 +661,13 @@ Input/OutputFormats, a table indexing MapReduce job, and 
utility methods.
 Compact a column family within a table.
 
 
-
+
 void
 Admin.compact(TableNametableName,
CompactTypecompactType)
 Compact a table.
 
 
-
-http://docs.oracle.com/javase/8/docs/api/java/util/concurrent/CompletableFuture.html?is-external=true;
 title="class or interface in java.util.concurrent">CompletableFuturehttp://docs.oracle.com/javase/8/docs/api/java/lang/Void.html?is-external=true;
 title="class or interface in java.lang">Void
-AsyncAdmin.compact(TableNametableName,
-   http://docs.oracle.com/javase/8/docs/api/java/util/Optional.html?is-external=true;
 title="class or interface in 
java.util">Optionalbyte[]columnFamily)
-Compact a column family within a table.
-
-
 
 static TableDescriptor
 TableDescriptorBuilder.copy(TableNamename,
@@ -926,64 +932,71 @@ Input/OutputFormats, a table indexing MapReduce job, and 
utility methods.
 
 
 
+CompletableFuture<List<RegionLoad>>
+AsyncAdmin.getRegionLoads(ServerName serverName,
+  TableName tableName)
+Get a list of RegionLoad of all regions hosted on a
region server for a table.
+
+
+
 RegionLocator
 Connection.getRegionLocator(TableNametableName)
 Retrieve a RegionLocator implementation to inspect region 
information on a table.
 
 
-
+
 AsyncTableRegionLocator
 

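The class-use diff above shows the same migration pattern throughout: Optional-valued parameters (Optional<Pattern>, Optional<byte[]>) are dropped in favour of explicit overloads, with default methods delegating to the most specific variant. A toy sketch of that overload style (TableLister and OverloadDemo are illustrative names, not HBase API):

```java
import java.util.List;
import java.util.regex.Pattern;

// One abstract method carries the full parameter list; convenience
// overloads are default methods that delegate to it.
interface TableLister {
    List<String> listTables(Pattern pattern, boolean includeSysTables);

    // No-arg convenience: user-space tables only.
    default List<String> listTables() {
        return listTables(false);
    }

    // Match-everything pattern instead of an Optional<Pattern> parameter.
    default List<String> listTables(boolean includeSysTables) {
        return listTables(Pattern.compile(".*"), includeSysTables);
    }
}

public class OverloadDemo {
    public static void main(String[] args) {
        // A stub backend that ignores the pattern for brevity.
        TableLister lister = (pattern, includeSys) ->
            includeSys ? List.of("hbase:meta", "t1") : List.of("t1");
        System.out.println(lister.listTables());     // [t1]
        System.out.println(lister.listTables(true)); // [hbase:meta, t1]
    }
}
```

Callers that previously passed Optional.empty() now pick the shorter overload, which keeps call sites free of Optional boilerplate.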
[23/51] [partial] hbase-site git commit: Published site at .

2017-11-06 Thread git-site-role
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.DisableTableProcedureBiConsumer.html
--
diff --git 
a/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.DisableTableProcedureBiConsumer.html
 
b/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.DisableTableProcedureBiConsumer.html
index 531081e..a22e5ce 100644
--- 
a/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.DisableTableProcedureBiConsumer.html
+++ 
b/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.DisableTableProcedureBiConsumer.html
@@ -34,2832 +34,3011 @@

[15/51] [partial] hbase-site git commit: Published site at .

2017-11-06 Thread git-site-role
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.SplitTableRegionProcedureBiConsumer.html
--
diff --git 
a/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.SplitTableRegionProcedureBiConsumer.html
 
b/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.SplitTableRegionProcedureBiConsumer.html
index 531081e..a22e5ce 100644
--- 
a/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.SplitTableRegionProcedureBiConsumer.html
+++ 
b/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.SplitTableRegionProcedureBiConsumer.html
@@ -34,2832 +34,3011 @@
 026import java.util.Collections;
 027import java.util.EnumSet;
 028import java.util.HashMap;
-029import java.util.LinkedList;
-030import java.util.List;
-031import java.util.Map;
-032import java.util.Optional;
-033import java.util.Set;
-034import java.util.concurrent.CompletableFuture;
-035import java.util.concurrent.TimeUnit;
-036import java.util.concurrent.atomic.AtomicReference;
-037import java.util.function.BiConsumer;
-038import java.util.function.Function;
-039import java.util.regex.Pattern;
-040import java.util.stream.Collectors;
-041import java.util.stream.Stream;
-042
-043import org.apache.commons.io.IOUtils;
-044import org.apache.commons.logging.Log;
-045import org.apache.commons.logging.LogFactory;
-046import org.apache.hadoop.hbase.AsyncMetaTableAccessor;
-047import org.apache.hadoop.hbase.ClusterStatus;
-048import org.apache.hadoop.hbase.ClusterStatus.Option;
-049import org.apache.hadoop.hbase.HConstants;
-050import org.apache.hadoop.hbase.HRegionLocation;
-051import org.apache.hadoop.hbase.MetaTableAccessor;
-052import org.apache.hadoop.hbase.MetaTableAccessor.QueryType;
-053import org.apache.hadoop.hbase.NamespaceDescriptor;
-054import org.apache.hadoop.hbase.RegionLoad;
-055import org.apache.hadoop.hbase.RegionLocations;
-056import org.apache.hadoop.hbase.ServerName;
-057import org.apache.hadoop.hbase.TableExistsException;
-058import org.apache.hadoop.hbase.TableName;
-059import org.apache.hadoop.hbase.TableNotDisabledException;
-060import org.apache.hadoop.hbase.TableNotEnabledException;
-061import org.apache.hadoop.hbase.TableNotFoundException;
-062import org.apache.hadoop.hbase.UnknownRegionException;
-063import org.apache.hadoop.hbase.client.AsyncRpcRetryingCallerFactory.AdminRequestCallerBuilder;
-064import org.apache.hadoop.hbase.client.AsyncRpcRetryingCallerFactory.MasterRequestCallerBuilder;
-065import org.apache.hadoop.hbase.client.AsyncRpcRetryingCallerFactory.ServerRequestCallerBuilder;
-066import org.apache.hadoop.hbase.client.RawAsyncTable.CoprocessorCallable;
-067import org.apache.hadoop.hbase.client.Scan.ReadType;
-068import org.apache.hadoop.hbase.client.replication.ReplicationSerDeHelper;
-069import org.apache.hadoop.hbase.client.replication.TableCFs;
-070import org.apache.hadoop.hbase.client.security.SecurityCapability;
-071import org.apache.hadoop.hbase.exceptions.DeserializationException;
-072import org.apache.hadoop.hbase.ipc.HBaseRpcController;
-073import org.apache.hadoop.hbase.quotas.QuotaFilter;
-074import org.apache.hadoop.hbase.quotas.QuotaSettings;
-075import org.apache.hadoop.hbase.quotas.QuotaTableUtil;
-076import org.apache.hadoop.hbase.replication.ReplicationException;
-077import org.apache.hadoop.hbase.replication.ReplicationPeerConfig;
-078import org.apache.hadoop.hbase.replication.ReplicationPeerDescription;
-079import org.apache.hadoop.hbase.snapshot.ClientSnapshotDescriptionUtils;
-080import org.apache.hadoop.hbase.snapshot.RestoreSnapshotException;
-081import org.apache.hadoop.hbase.snapshot.SnapshotCreationException;
-082import org.apache.hadoop.hbase.util.Bytes;
-083import org.apache.hadoop.hbase.util.EnvironmentEdgeManager;
-084import org.apache.hadoop.hbase.util.ForeignExceptionUtil;
-085import org.apache.hadoop.hbase.util.Pair;
-086import org.apache.yetus.audience.InterfaceAudience;
-087
-088import org.apache.hadoop.hbase.shaded.com.google.protobuf.RpcCallback;
-089import org.apache.hadoop.hbase.shaded.io.netty.util.Timeout;
-090import org.apache.hadoop.hbase.shaded.io.netty.util.TimerTask;
-091import org.apache.hadoop.hbase.shaded.protobuf.ProtobufUtil;
-092import org.apache.hadoop.hbase.shaded.protobuf.RequestConverter;
-093import org.apache.hadoop.hbase.shaded.protobuf.generated.AdminProtos.AdminService;
-094import org.apache.hadoop.hbase.shaded.protobuf.generated.AdminProtos.ClearCompactionQueuesRequest;
-095import org.apache.hadoop.hbase.shaded.protobuf.generated.AdminProtos.ClearCompactionQueuesResponse;
-096import org.apache.hadoop.hbase.shaded.protobuf.generated.AdminProtos.CompactRegionRequest;
-097import org.apache.hadoop.hbase.shaded.protobuf.generated.AdminProtos.CompactRegionResponse;
-098import 

[29/51] [partial] hbase-site git commit: Published site at .

2017-11-06 Thread git-site-role
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.Converter.html
--
diff --git 
a/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.Converter.html
 
b/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.Converter.html
index 531081e..a22e5ce 100644
--- 
a/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.Converter.html
+++ 
b/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.Converter.html

[36/51] [partial] hbase-site git commit: Published site at .

2017-11-06 Thread git-site-role
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/devapidocs/org/apache/hadoop/hbase/client/class-use/CompactType.html
--
diff --git 
a/devapidocs/org/apache/hadoop/hbase/client/class-use/CompactType.html 
b/devapidocs/org/apache/hadoop/hbase/client/class-use/CompactType.html
index a14d568..cc41410 100644
--- a/devapidocs/org/apache/hadoop/hbase/client/class-use/CompactType.html
+++ b/devapidocs/org/apache/hadoop/hbase/client/class-use/CompactType.html
@@ -136,16 +136,17 @@ the order they are declared.
 
 
 
-void
-Admin.compact(TableName tableName,
+private CompletableFuture<Void>
+RawAsyncHBaseAdmin.compact(TableName tableName,
    byte[] columnFamily,
+   boolean major,
    CompactType compactType)
-Compact a column family within a table.
+Compact column family of a table, Asynchronous operation even if CompletableFuture.get()
 
 
 
 void
-HBaseAdmin.compact(TableName tableName,
+Admin.compact(TableName tableName,
    byte[] columnFamily,
    CompactType compactType)
 Compact a column family within a table.
@@ -153,25 +154,24 @@ the order they are declared.
 
 
 void
-Admin.compact(TableName tableName,
+HBaseAdmin.compact(TableName tableName,
+   byte[] columnFamily,
    CompactType compactType)
-Compact a table.
+Compact a column family within a table.
 
 
 
 void
-HBaseAdmin.compact(TableName tableName,
+Admin.compact(TableName tableName,
    CompactType compactType)
 Compact a table.
 
 
 
-private CompletableFuture<Void>
-RawAsyncHBaseAdmin.compact(TableName tableName,
-   Optional<byte[]> columnFamily,
-   boolean major,
+void
+HBaseAdmin.compact(TableName tableName,
    CompactType compactType)
-Compact column family of a table, Asynchronous operation even if CompletableFuture.get()
+Compact a table.
 
 
 

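The hunk above shows `RawAsyncHBaseAdmin.compact` moving from an `Optional<byte[]> columnFamily` parameter to a plain `byte[]` (plus a `boolean major` flag). A minimal sketch of that design choice follows; the class and method names (`CompactSketch`, `compactOld`) are hypothetical stand-ins, not HBase code, and the bodies are placeholders rather than real RPCs.

```java
import java.util.Optional;
import java.util.concurrent.CompletableFuture;

// Hypothetical sketch of the signature change visible in the diff:
// an Optional<byte[]> parameter replaced by a nullable byte[] plus overloads.
public class CompactSketch {

    // Old shape: every caller wraps/unwraps an Optional at the call site.
    static CompletableFuture<Void> compactOld(String table, Optional<byte[]> family) {
        byte[] f = family.orElse(null); // null means "all column families"
        return CompletableFuture.completedFuture(null); // stand-in for the RPC
    }

    // New shape: public overloads supply null internally, so callers
    // never construct an Optional themselves.
    static CompletableFuture<Void> compact(String table) {
        return compact(table, null);
    }

    static CompletableFuture<Void> compact(String table, byte[] family) {
        // family == null -> compact every column family (stand-in for the RPC)
        return CompletableFuture.completedFuture(null);
    }
}
```

The overload style keeps `Optional` out of public method signatures, which matches the general direction of the generated Javadoc changes in this commit.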
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/devapidocs/org/apache/hadoop/hbase/client/class-use/RegionInfo.html
--
diff --git 
a/devapidocs/org/apache/hadoop/hbase/client/class-use/RegionInfo.html 
b/devapidocs/org/apache/hadoop/hbase/client/class-use/RegionInfo.html
index 64e4c8a..bf3dc50 100644
--- a/devapidocs/org/apache/hadoop/hbase/client/class-use/RegionInfo.html
+++ b/devapidocs/org/apache/hadoop/hbase/client/class-use/RegionInfo.html
@@ -1058,10 +1058,10 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
 
 
 private CompletableFuture<Void>
-RawAsyncHBaseAdmin.compact(ServerName sn,
+RawAsyncHBaseAdmin.compact(ServerName sn,
    RegionInfo hri,
    boolean major,
-   Optional<byte[]> columnFamily)
+   byte[] columnFamily)
 Compact the region at specific region server.
 
 
@@ -1157,8 +1157,8 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
 
 
 private CompletableFuture<Void>
-RawAsyncHBaseAdmin.split(RegionInfo hri,
-   Optional<byte[]> splitPoint)
+RawAsyncHBaseAdmin.split(RegionInfo hri,
+   byte[] splitPoint)
 
 
 (package private) Future<Void>

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/devapidocs/org/apache/hadoop/hbase/client/class-use/SnapshotDescription.html

[05/51] [partial] hbase-site git commit: Published site at .

2017-11-06 Thread git-site-role
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/testdevapidocs/src-html/org/apache/hadoop/hbase/client/TestAsyncClusterAdminApi.html
--
diff --git 
a/testdevapidocs/src-html/org/apache/hadoop/hbase/client/TestAsyncClusterAdminApi.html
 
b/testdevapidocs/src-html/org/apache/hadoop/hbase/client/TestAsyncClusterAdminApi.html
index 59a55cf..a7a41a9 100644
--- 
a/testdevapidocs/src-html/org/apache/hadoop/hbase/client/TestAsyncClusterAdminApi.html
+++ 
b/testdevapidocs/src-html/org/apache/hadoop/hbase/client/TestAsyncClusterAdminApi.html
@@ -39,290 +39,289 @@
 031import java.util.EnumSet;
 032import java.util.List;
 033import java.util.Map;
-034import java.util.Optional;
-035
-036import 
org.apache.hadoop.conf.Configuration;
-037import 
org.apache.hadoop.hbase.ClusterStatus;
-038import 
org.apache.hadoop.hbase.ClusterStatus.Option;
-039import 
org.apache.hadoop.hbase.HBaseTestingUtility;
-040import 
org.apache.hadoop.hbase.HConstants;
-041import 
org.apache.hadoop.hbase.RegionLoad;
-042import 
org.apache.hadoop.hbase.ServerLoad;
-043import 
org.apache.hadoop.hbase.ServerName;
-044import 
org.apache.hadoop.hbase.TableName;
-045import 
org.apache.hadoop.hbase.regionserver.HRegion;
-046import 
org.apache.hadoop.hbase.regionserver.HRegionServer;
-047import 
org.apache.hadoop.hbase.testclassification.ClientTests;
-048import 
org.apache.hadoop.hbase.testclassification.MediumTests;
-049import 
org.apache.hadoop.hbase.util.Bytes;
-050import 
org.apache.hadoop.hbase.wal.AbstractFSWALProvider;
-051import org.junit.BeforeClass;
-052import org.junit.Test;
-053import 
org.junit.experimental.categories.Category;
-054import org.junit.runner.RunWith;
-055import org.junit.runners.Parameterized;
-056
-057import com.google.common.collect.Lists;
-058import com.google.common.collect.Maps;
-059
-060@RunWith(Parameterized.class)
-061@Category({ ClientTests.class, 
MediumTests.class })
-062public class TestAsyncClusterAdminApi 
extends TestAsyncAdminBase {
-063
-064  private final Path cnfPath = 
FileSystems.getDefault().getPath("target/test-classes/hbase-site.xml");
-065  private final Path cnf2Path = 
FileSystems.getDefault().getPath("target/test-classes/hbase-site2.xml");
-066  private final Path cnf3Path = 
FileSystems.getDefault().getPath("target/test-classes/hbase-site3.xml");
-067
-068  @BeforeClass
-069  public static void setUpBeforeClass() 
throws Exception {
-070
TEST_UTIL.getConfiguration().setInt(HConstants.MASTER_INFO_PORT, 0);
-071
TEST_UTIL.getConfiguration().setInt(HConstants.HBASE_RPC_TIMEOUT_KEY, 6);
-072
TEST_UTIL.getConfiguration().setInt(HConstants.HBASE_CLIENT_OPERATION_TIMEOUT, 
12);
-073
TEST_UTIL.getConfiguration().setInt(HConstants.HBASE_CLIENT_RETRIES_NUMBER, 
2);
-074
TEST_UTIL.getConfiguration().setInt(START_LOG_ERRORS_AFTER_COUNT_KEY, 0);
-075TEST_UTIL.startMiniCluster(2);
-076ASYNC_CONN = 
ConnectionFactory.createAsyncConnection(TEST_UTIL.getConfiguration()).get();
-077  }
-078
-079  @Test
-080  public void testGetMasterInfoPort() 
throws Exception {
-081
assertEquals(TEST_UTIL.getHBaseCluster().getMaster().getInfoServer().getPort(), 
(int) admin
-082.getMasterInfoPort().get());
-083  }
-084
-085  @Test
-086  public void 
testRegionServerOnlineConfigChange() throws Exception {
-087replaceHBaseSiteXML();
-088
admin.getRegionServers().get().forEach(server -> admin.updateConfiguration(server).join());
-089
-090// Check the configuration of the 
RegionServers
-091
TEST_UTIL.getMiniHBaseCluster().getRegionServerThreads().forEach(thread -> {
-092  Configuration conf = 
thread.getRegionServer().getConfiguration();
-093  assertEquals(1000, 
conf.getInt("hbase.custom.config", 0));
-094});
-095
-096restoreHBaseSiteXML();
-097  }
-098
-099  @Test
-100  public void 
testMasterOnlineConfigChange() throws Exception {
-101replaceHBaseSiteXML();
-102ServerName master = 
admin.getMaster().get();
-103
admin.updateConfiguration(master).join();
-104admin.getBackupMasters().get()
-105.forEach(backupMaster -> admin.updateConfiguration(backupMaster).join());
-106
-107// Check the configuration of the 
Masters
-108
TEST_UTIL.getMiniHBaseCluster().getMasterThreads().forEach(thread -> {
-109  Configuration conf = 
thread.getMaster().getConfiguration();
-110  assertEquals(1000, 
conf.getInt("hbase.custom.config", 0));
-111});
-112
-113restoreHBaseSiteXML();
-114  }
-115
-116  @Test
-117  public void 
testAllClusterOnlineConfigChange() throws IOException {
-118replaceHBaseSiteXML();
-119admin.updateConfiguration().join();
-120
-121// Check the configuration of the 
Masters
-122
TEST_UTIL.getMiniHBaseCluster().getMasterThreads().forEach(thread -> {
-123  Configuration conf = 
thread.getMaster().getConfiguration();
-124  assertEquals(1000, 
conf.getInt("hbase.custom.config", 0));
-125});
-126
-127   

[17/51] [partial] hbase-site git commit: Published site at .

2017-11-06 Thread git-site-role
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.NamespaceProcedureBiConsumer.html
--
diff --git 
a/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.NamespaceProcedureBiConsumer.html
 
b/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.NamespaceProcedureBiConsumer.html
index 531081e..a22e5ce 100644
--- 
a/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.NamespaceProcedureBiConsumer.html
+++ 
b/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.NamespaceProcedureBiConsumer.html

[22/51] [partial] hbase-site git commit: Published site at .

2017-11-06 Thread git-site-role
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.EnableTableProcedureBiConsumer.html
--
diff --git 
a/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.EnableTableProcedureBiConsumer.html
 
b/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.EnableTableProcedureBiConsumer.html
index 531081e..a22e5ce 100644
--- 
a/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.EnableTableProcedureBiConsumer.html
+++ 
b/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.EnableTableProcedureBiConsumer.html

[24/51] [partial] hbase-site git commit: Published site at .

2017-11-06 Thread git-site-role
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.DeleteTableProcedureBiConsumer.html
--
diff --git 
a/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.DeleteTableProcedureBiConsumer.html
 
b/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.DeleteTableProcedureBiConsumer.html
index 531081e..a22e5ce 100644
--- 
a/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.DeleteTableProcedureBiConsumer.html
+++ 
b/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.DeleteTableProcedureBiConsumer.html
@@ -34,2832 +34,3011 @@
 026import java.util.Collections;
 027import java.util.EnumSet;
 028import java.util.HashMap;
-029import java.util.LinkedList;
-030import java.util.List;
-031import java.util.Map;
-032import java.util.Optional;
-033import java.util.Set;
-034import java.util.concurrent.CompletableFuture;
-035import java.util.concurrent.TimeUnit;
-036import java.util.concurrent.atomic.AtomicReference;
-037import java.util.function.BiConsumer;
-038import java.util.function.Function;
-039import java.util.regex.Pattern;
-040import java.util.stream.Collectors;
-041import java.util.stream.Stream;
-042
-043import org.apache.commons.io.IOUtils;
-044import org.apache.commons.logging.Log;
-045import org.apache.commons.logging.LogFactory;
-046import org.apache.hadoop.hbase.AsyncMetaTableAccessor;
-047import org.apache.hadoop.hbase.ClusterStatus;
-048import org.apache.hadoop.hbase.ClusterStatus.Option;
-049import org.apache.hadoop.hbase.HConstants;
-050import org.apache.hadoop.hbase.HRegionLocation;
-051import org.apache.hadoop.hbase.MetaTableAccessor;
-052import org.apache.hadoop.hbase.MetaTableAccessor.QueryType;
-053import org.apache.hadoop.hbase.NamespaceDescriptor;
-054import org.apache.hadoop.hbase.RegionLoad;
-055import org.apache.hadoop.hbase.RegionLocations;
-056import org.apache.hadoop.hbase.ServerName;
-057import org.apache.hadoop.hbase.TableExistsException;
-058import org.apache.hadoop.hbase.TableName;
-059import org.apache.hadoop.hbase.TableNotDisabledException;
-060import org.apache.hadoop.hbase.TableNotEnabledException;
-061import org.apache.hadoop.hbase.TableNotFoundException;
-062import org.apache.hadoop.hbase.UnknownRegionException;
-063import org.apache.hadoop.hbase.client.AsyncRpcRetryingCallerFactory.AdminRequestCallerBuilder;
-064import org.apache.hadoop.hbase.client.AsyncRpcRetryingCallerFactory.MasterRequestCallerBuilder;
-065import org.apache.hadoop.hbase.client.AsyncRpcRetryingCallerFactory.ServerRequestCallerBuilder;
-066import org.apache.hadoop.hbase.client.RawAsyncTable.CoprocessorCallable;
-067import org.apache.hadoop.hbase.client.Scan.ReadType;
-068import org.apache.hadoop.hbase.client.replication.ReplicationSerDeHelper;
-069import org.apache.hadoop.hbase.client.replication.TableCFs;
-070import org.apache.hadoop.hbase.client.security.SecurityCapability;
-071import org.apache.hadoop.hbase.exceptions.DeserializationException;
-072import org.apache.hadoop.hbase.ipc.HBaseRpcController;
-073import org.apache.hadoop.hbase.quotas.QuotaFilter;
-074import org.apache.hadoop.hbase.quotas.QuotaSettings;
-075import org.apache.hadoop.hbase.quotas.QuotaTableUtil;
-076import org.apache.hadoop.hbase.replication.ReplicationException;
-077import org.apache.hadoop.hbase.replication.ReplicationPeerConfig;
-078import org.apache.hadoop.hbase.replication.ReplicationPeerDescription;
-079import org.apache.hadoop.hbase.snapshot.ClientSnapshotDescriptionUtils;
-080import org.apache.hadoop.hbase.snapshot.RestoreSnapshotException;
-081import org.apache.hadoop.hbase.snapshot.SnapshotCreationException;
-082import org.apache.hadoop.hbase.util.Bytes;
-083import org.apache.hadoop.hbase.util.EnvironmentEdgeManager;
-084import org.apache.hadoop.hbase.util.ForeignExceptionUtil;
-085import org.apache.hadoop.hbase.util.Pair;
-086import org.apache.yetus.audience.InterfaceAudience;
-087
-088import org.apache.hadoop.hbase.shaded.com.google.protobuf.RpcCallback;
-089import org.apache.hadoop.hbase.shaded.io.netty.util.Timeout;
-090import org.apache.hadoop.hbase.shaded.io.netty.util.TimerTask;
-091import org.apache.hadoop.hbase.shaded.protobuf.ProtobufUtil;
-092import org.apache.hadoop.hbase.shaded.protobuf.RequestConverter;
-093import org.apache.hadoop.hbase.shaded.protobuf.generated.AdminProtos.AdminService;
-094import org.apache.hadoop.hbase.shaded.protobuf.generated.AdminProtos.ClearCompactionQueuesRequest;
-095import org.apache.hadoop.hbase.shaded.protobuf.generated.AdminProtos.ClearCompactionQueuesResponse;
-096import org.apache.hadoop.hbase.shaded.protobuf.generated.AdminProtos.CompactRegionRequest;
-097import org.apache.hadoop.hbase.shaded.protobuf.generated.AdminProtos.CompactRegionResponse;
-098import 

[07/51] [partial] hbase-site git commit: Published site at .

2017-11-06 Thread git-site-role
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/testdevapidocs/org/apache/hadoop/hbase/client/TestAsyncAdminBase.html
--
diff --git a/testdevapidocs/org/apache/hadoop/hbase/client/TestAsyncAdminBase.html b/testdevapidocs/org/apache/hadoop/hbase/client/TestAsyncAdminBase.html
index 9128a40..8a8aaab 100644
--- a/testdevapidocs/org/apache/hadoop/hbase/client/TestAsyncAdminBase.html
+++ b/testdevapidocs/org/apache/hadoop/hbase/client/TestAsyncAdminBase.html
@@ -113,7 +113,7 @@ var activeTableTab = "activeTableTab";

-public abstract class TestAsyncAdminBase
+public abstract class TestAsyncAdminBase
 extends Object
 Class to test AsyncAdmin.

@@ -212,13 +212,13 @@ extends Object

 protected void
-createTableWithDefaultConf(org.apache.hadoop.hbase.TableName tableName,
-    Optional<byte[][]> splitKeys)
+createTableWithDefaultConf(org.apache.hadoop.hbase.TableName tableName,
+    byte[][] splitKeys)

 protected void
-createTableWithDefaultConf(org.apache.hadoop.hbase.TableName tableName,
-    Optional<byte[][]> splitKeys,
+createTableWithDefaultConf(org.apache.hadoop.hbase.TableName tableName,
+    byte[][] splitKeys,
     byte[]... families)

@@ -277,7 +277,7 @@ extends Object

 LOG
-protected static final org.apache.commons.logging.Log LOG
+protected static final org.apache.commons.logging.Log LOG

@@ -286,7 +286,7 @@ extends Object

 TEST_UTIL
-protected static final HBaseTestingUtility TEST_UTIL
+protected static final HBaseTestingUtility TEST_UTIL

@@ -295,7 +295,7 @@ extends Object

 FAMILY
-protected static final byte[] FAMILY
+protected static final byte[] FAMILY

@@ -304,7 +304,7 @@ extends Object

 FAMILY_0
-protected static final byte[] FAMILY_0
+protected static final byte[] FAMILY_0

@@ -313,7 +313,7 @@ extends Object

 FAMILY_1
-protected static final byte[] FAMILY_1
+protected static final byte[] FAMILY_1

@@ -322,7 +322,7 @@ extends Object

 ASYNC_CONN
-protected static org.apache.hadoop.hbase.client.AsyncConnection ASYNC_CONN
+protected static org.apache.hadoop.hbase.client.AsyncConnection ASYNC_CONN

@@ -331,7 +331,7 @@ extends Object

 admin
-protected org.apache.hadoop.hbase.client.AsyncAdmin admin
+protected org.apache.hadoop.hbase.client.AsyncAdmin admin

@@ -340,7 +340,7 @@ extends Object

 getAdmin
-public Supplier<org.apache.hadoop.hbase.client.AsyncAdmin> getAdmin
+public Supplier<org.apache.hadoop.hbase.client.AsyncAdmin> getAdmin

@@ -349,7 +349,7 @@ extends Object

 testName
-public org.junit.rules.TestName testName
+public org.junit.rules.TestName testName

@@ -358,7 +358,7 @@ extends Object

 tableName
-protected org.apache.hadoop.hbase.TableName tableName
+protected org.apache.hadoop.hbase.TableName tableName

@@ -375,7 +375,7 @@ extends Object

 TestAsyncAdminBase
-public TestAsyncAdminBase()
+public TestAsyncAdminBase()

@@ -392,7 +392,7 @@ extends Object

 getRawAsyncAdmin
-private static org.apache.hadoop.hbase.client.AsyncAdmin getRawAsyncAdmin()
+private static org.apache.hadoop.hbase.client.AsyncAdmin getRawAsyncAdmin()

@@ -401,7 +401,7 @@ extends Object

 getAsyncAdmin
-private static org.apache.hadoop.hbase.client.AsyncAdmin getAsyncAdmin()
+private static org.apache.hadoop.hbase.client.AsyncAdmin getAsyncAdmin()

@@ -410,7 +410,7 @@ extends 
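The getAdmin member listed above is a Supplier<AsyncAdmin>: the parameterized test base runs every test once against the raw admin and once against the pooled wrapper by injecting a different supplier per run. The idea in isolation (all names here are hypothetical, not HBase API):

```java
import java.util.Arrays;
import java.util.List;
import java.util.function.Supplier;

public class SupplierParamDemo {
    // A "test" that is parameterized by a Supplier rather than a concrete
    // instance, mirroring TestAsyncAdminBase.getAdmin.
    public static String run(Supplier<String> adminSupplier) {
        return "ran with " + adminSupplier.get();
    }

    // One supplier per admin flavor; the runner iterates over these.
    public static final List<Supplier<String>> PARAMS =
        Arrays.asList(() -> "rawAsyncAdmin", () -> "asyncAdmin");
}
```

Deferring construction behind a supplier lets each parameterized run build a fresh instance instead of sharing one across runs.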

[33/51] [partial] hbase-site git commit: Published site at .

2017-11-06 Thread git-site-role
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/devapidocs/src-html/org/apache/hadoop/hbase/client/AsyncHBaseAdmin.html
--
diff --git a/devapidocs/src-html/org/apache/hadoop/hbase/client/AsyncHBaseAdmin.html b/devapidocs/src-html/org/apache/hadoop/hbase/client/AsyncHBaseAdmin.html
index 1119a61..cc9d706 100644
--- a/devapidocs/src-html/org/apache/hadoop/hbase/client/AsyncHBaseAdmin.html
+++ b/devapidocs/src-html/org/apache/hadoop/hbase/client/AsyncHBaseAdmin.html
@@ -51,589 +51,671 @@
 043import org.apache.hadoop.hbase.quotas.QuotaSettings;
 044import org.apache.hadoop.hbase.replication.ReplicationPeerConfig;
 045import org.apache.hadoop.hbase.replication.ReplicationPeerDescription;
-046import org.apache.hadoop.hbase.util.Pair;
-047import org.apache.yetus.audience.InterfaceAudience;
-048
-049import com.google.protobuf.RpcChannel;
-050
-051/**
-052 * The implementation of AsyncAdmin.
-053 * @since 2.0.0
-054 */
-055@InterfaceAudience.Private
-056public class AsyncHBaseAdmin implements AsyncAdmin {
-057
-058  private static final Log LOG = LogFactory.getLog(AsyncHBaseAdmin.class);
-059
-060  private final RawAsyncHBaseAdmin rawAdmin;
-061
-062  private final ExecutorService pool;
-063
-064  AsyncHBaseAdmin(RawAsyncHBaseAdmin rawAdmin, ExecutorService pool) {
-065    this.rawAdmin = rawAdmin;
-066    this.pool = pool;
-067  }
-068
-069  private <T> CompletableFuture<T> wrap(CompletableFuture<T> future) {
-070    CompletableFuture<T> asyncFuture = new CompletableFuture<>();
-071    future.whenCompleteAsync((r, e) -> {
-072      if (e != null) {
-073        asyncFuture.completeExceptionally(e);
-074      } else {
-075        asyncFuture.complete(r);
-076      }
-077    }, pool);
-078    return asyncFuture;
-079  }
-080
-081  @Override
-082  public CompletableFuture<Boolean> tableExists(TableName tableName) {
-083    return wrap(rawAdmin.tableExists(tableName));
-084  }
-085
-086  @Override
-087  public CompletableFuture<List<TableDescriptor>> listTables(Optional<Pattern> pattern,
-088      boolean includeSysTables) {
-089    return wrap(rawAdmin.listTables(pattern, includeSysTables));
-090  }
-091
-092  @Override
-093  public CompletableFuture<List<TableName>> listTableNames(Optional<Pattern> pattern,
-094      boolean includeSysTables) {
-095    return wrap(rawAdmin.listTableNames(pattern, includeSysTables));
-096  }
-097
-098  @Override
-099  public CompletableFuture<TableDescriptor> getTableDescriptor(TableName tableName) {
-100    return wrap(rawAdmin.getTableDescriptor(tableName));
-101  }
-102
-103  @Override
-104  public CompletableFuture<Void> createTable(TableDescriptor desc, byte[] startKey, byte[] endKey,
-105      int numRegions) {
-106    return wrap(rawAdmin.createTable(desc, startKey, endKey, numRegions));
-107  }
-108
-109  @Override
-110  public CompletableFuture<Void> createTable(TableDescriptor desc, Optional<byte[][]> splitKeys) {
-111    return wrap(rawAdmin.createTable(desc, splitKeys));
-112  }
-113
-114  @Override
-115  public CompletableFuture<Void> deleteTable(TableName tableName) {
-116    return wrap(rawAdmin.deleteTable(tableName));
-117  }
-118
-119  @Override
-120  public CompletableFuture<Void> truncateTable(TableName tableName, boolean preserveSplits) {
-121    return wrap(rawAdmin.truncateTable(tableName, preserveSplits));
-122  }
-123
-124  @Override
-125  public CompletableFuture<Void> enableTable(TableName tableName) {
-126    return wrap(rawAdmin.enableTable(tableName));
-127  }
-128
-129  @Override
-130  public CompletableFuture<Void> disableTable(TableName tableName) {
-131    return wrap(rawAdmin.disableTable(tableName));
-132  }
-133
-134  @Override
-135  public CompletableFuture<Boolean> isTableEnabled(TableName tableName) {
-136    return wrap(rawAdmin.isTableEnabled(tableName));
-137  }
-138
-139  @Override
-140  public CompletableFuture<Boolean> isTableDisabled(TableName tableName) {
-141    return wrap(rawAdmin.isTableDisabled(tableName));
-142  }
-143
-144  @Override
-145  public CompletableFuture<Boolean> isTableAvailable(TableName tableName, byte[][] splitKeys) {
-146    return wrap(rawAdmin.isTableAvailable(tableName, splitKeys));
-147  }
-148
-149  @Override
-150  public CompletableFuture<Void> addColumnFamily(TableName tableName,
-151      ColumnFamilyDescriptor columnFamily) {
-152    return wrap(rawAdmin.addColumnFamily(tableName, columnFamily));
-153  }
-154
-155  @Override
-156  public CompletableFuture<Void> deleteColumnFamily(TableName tableName, byte[] columnFamily) {
-157    return wrap(rawAdmin.deleteColumnFamily(tableName, columnFamily));
-158  }
-159
-160  @Override
-161  public CompletableFuture<Void> modifyColumnFamily(TableName tableName,
-162      ColumnFamilyDescriptor columnFamily) {
-163    return wrap(rawAdmin.modifyColumnFamily(tableName, columnFamily));
-164  }
-165
-166  @Override
-167  public CompletableFuture<Void> createNamespace(NamespaceDescriptor 
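The AsyncHBaseAdmin source above is a thin delegating implementation whose core is wrap(): every call forwards to RawAsyncHBaseAdmin, and the returned future is re-completed on a caller-owned thread pool so user callbacks never run on the client's internal RPC threads. A minimal standalone sketch of that pattern (the class name here is illustrative, not HBase API):

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class WrapDemo {
    // Re-complete a future on the given pool, mirroring AsyncHBaseAdmin.wrap:
    // the result (or failure) is propagated, but downstream callbacks run on
    // the caller-supplied executor instead of the producer's thread.
    public static <T> CompletableFuture<T> wrap(CompletableFuture<T> future, ExecutorService pool) {
        CompletableFuture<T> asyncFuture = new CompletableFuture<>();
        future.whenCompleteAsync((r, e) -> {
            if (e != null) {
                asyncFuture.completeExceptionally(e);
            } else {
                asyncFuture.complete(r);
            }
        }, pool);
        return asyncFuture;
    }

    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newSingleThreadExecutor();
        CompletableFuture<String> raw = CompletableFuture.completedFuture("ok");
        System.out.println(wrap(raw, pool).get()); // prints "ok"
        pool.shutdown();
    }
}
```

The design choice is isolation: if user code blocks inside a callback, it stalls the supplied pool, not the client's I/O threads.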

[11/51] [partial] hbase-site git commit: Published site at .

2017-11-06 Thread git-site-role
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.html
--
diff --git a/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.html b/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.html
index 531081e..a22e5ce 100644
--- a/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.html
+++ b/devapidocs/src-html/org/apache/hadoop/hbase/client/RawAsyncHBaseAdmin.html

[01/51] [partial] hbase-site git commit: Published site at .

2017-11-06 Thread git-site-role
Repository: hbase-site
Updated Branches:
  refs/heads/asf-site 69b241841 -> 32453e2dd


http://git-wip-us.apache.org/repos/asf/hbase-site/blob/32453e2d/testdevapidocs/src-html/org/apache/hadoop/hbase/regionserver/TestHRegionServerBulkLoadWithOldSecureEndpoint.AtomicHFileLoader.html
--
diff --git a/testdevapidocs/src-html/org/apache/hadoop/hbase/regionserver/TestHRegionServerBulkLoadWithOldSecureEndpoint.AtomicHFileLoader.html b/testdevapidocs/src-html/org/apache/hadoop/hbase/regionserver/TestHRegionServerBulkLoadWithOldSecureEndpoint.AtomicHFileLoader.html
index 0746904..07d22e1 100644
--- a/testdevapidocs/src-html/org/apache/hadoop/hbase/regionserver/TestHRegionServerBulkLoadWithOldSecureEndpoint.AtomicHFileLoader.html
+++ b/testdevapidocs/src-html/org/apache/hadoop/hbase/regionserver/TestHRegionServerBulkLoadWithOldSecureEndpoint.AtomicHFileLoader.html
@@ -28,166 +28,165 @@
 020import java.io.IOException;
 021import java.util.ArrayList;
 022import java.util.List;
-023import java.util.Optional;
-024import java.util.concurrent.atomic.AtomicLong;
-025
-026import org.apache.commons.logging.Log;
-027import org.apache.commons.logging.LogFactory;
-028import org.apache.hadoop.fs.FileSystem;
-029import org.apache.hadoop.fs.Path;
-030import org.apache.hadoop.hbase.HConstants;
-031import org.apache.hadoop.hbase.MultithreadedTestUtil.RepeatingTestThread;
-032import org.apache.hadoop.hbase.MultithreadedTestUtil.TestContext;
-033import org.apache.hadoop.hbase.TableName;
-034import org.apache.hadoop.hbase.client.ClientServiceCallable;
-035import org.apache.hadoop.hbase.client.ClusterConnection;
-036import org.apache.hadoop.hbase.client.RpcRetryingCaller;
-037import org.apache.hadoop.hbase.client.RpcRetryingCallerFactory;
-038import org.apache.hadoop.hbase.client.Table;
-039import org.apache.hadoop.hbase.coprocessor.CoprocessorHost;
-040import org.apache.hadoop.hbase.ipc.RpcControllerFactory;
-041import org.apache.hadoop.hbase.shaded.protobuf.RequestConverter;
-042import org.apache.hadoop.hbase.shaded.protobuf.generated.AdminProtos;
-043import org.apache.hadoop.hbase.shaded.protobuf.generated.AdminProtos.CompactRegionRequest;
-044import org.apache.hadoop.hbase.testclassification.LargeTests;
-045import org.apache.hadoop.hbase.testclassification.RegionServerTests;
-046import org.apache.hadoop.hbase.util.Bytes;
-047import org.apache.hadoop.hbase.util.Pair;
-048import org.junit.BeforeClass;
-049import org.junit.Ignore;
-050import org.junit.experimental.categories.Category;
-051import org.junit.runner.RunWith;
-052import org.junit.runners.Parameterized;
-053
-054import org.apache.hadoop.hbase.shaded.com.google.common.collect.Lists;
-055
-056/**
-057 * Tests bulk loading of HFiles with old secure Endpoint client for backward compatibility. Will be
-058 * removed when old non-secure client for backward compatibility is not supported.
-059 */
-060@RunWith(Parameterized.class)
-061@Category({RegionServerTests.class, LargeTests.class})
-062@Ignore // BROKEN. FIX OR REMOVE.
-063public class TestHRegionServerBulkLoadWithOldSecureEndpoint extends TestHRegionServerBulkLoad {
-064  public TestHRegionServerBulkLoadWithOldSecureEndpoint(int duration) {
-065    super(duration);
-066  }
-067
-068  private static final Log LOG =
-069      LogFactory.getLog(TestHRegionServerBulkLoadWithOldSecureEndpoint.class);
-070
-071  @BeforeClass
-072  public static void setUpBeforeClass() throws IOException {
-073    conf.setInt("hbase.rpc.timeout", 10 * 1000);
-074    conf.set(CoprocessorHost.REGION_COPROCESSOR_CONF_KEY,
-075      "org.apache.hadoop.hbase.security.access.SecureBulkLoadEndpoint");
-076  }
-077
-078  public static class AtomicHFileLoader extends RepeatingTestThread {
-079    final AtomicLong numBulkLoads = new AtomicLong();
-080    final AtomicLong numCompactions = new AtomicLong();
-081    private TableName tableName;
-082
-083    public AtomicHFileLoader(TableName tableName, TestContext ctx,
-084        byte targetFamilies[][]) throws IOException {
-085      super(ctx);
-086      this.tableName = tableName;
-087    }
-088
-089    public void doAnAction() throws Exception {
-090      long iteration = numBulkLoads.getAndIncrement();
-091      Path dir = UTIL.getDataTestDirOnTestFS(String.format("bulkLoad_%08d",
-092          iteration));
-093
-094      // create HFiles for different column families
-095      FileSystem fs = UTIL.getTestFileSystem();
-096      byte[] val = Bytes.toBytes(String.format("%010d", iteration));
-097      final List<Pair<byte[], String>> famPaths = new ArrayList<>(NUM_CFS);
-098      for (int i = 0; i < NUM_CFS; i++) {
-099        Path hfile = new Path(dir, family(i));
-100        byte[] fam = Bytes.toBytes(family(i));
-101        createHFile(fs, hfile, fam, QUAL, val, 1000);
-102        famPaths.add(new Pair<>(fam, hfile.toString()));
-103      }
-104
-105      // bulk load HFiles
-106  

[1/2] hbase git commit: HBASE-18950 Remove Optional parameters in AsyncAdmin interface

2017-11-06 Thread zghao
Repository: hbase
Updated Branches:
  refs/heads/branch-2 061a73db6 -> 47c614c70


http://git-wip-us.apache.org/repos/asf/hbase/blob/47c614c7/hbase-server/src/test/java/org/apache/hadoop/hbase/client/TestAsyncAdminBase.java
--
diff --git a/hbase-server/src/test/java/org/apache/hadoop/hbase/client/TestAsyncAdminBase.java b/hbase-server/src/test/java/org/apache/hadoop/hbase/client/TestAsyncAdminBase.java
index c3c4045..83ba244 100644
--- a/hbase-server/src/test/java/org/apache/hadoop/hbase/client/TestAsyncAdminBase.java
+++ b/hbase-server/src/test/java/org/apache/hadoop/hbase/client/TestAsyncAdminBase.java
@@ -21,10 +21,8 @@ import static org.apache.hadoop.hbase.client.AsyncProcess.START_LOG_ERRORS_AFTER
 
 import java.util.Arrays;
 import java.util.List;
-import java.util.Optional;
-import java.util.concurrent.ExecutionException;
+import java.util.concurrent.CompletableFuture;
 import java.util.concurrent.ForkJoinPool;
-import java.util.concurrent.TimeUnit;
 import java.util.function.Supplier;
 import java.util.regex.Pattern;
 
@@ -41,8 +39,6 @@ import org.junit.Before;
 import org.junit.BeforeClass;
 import org.junit.Rule;
 import org.junit.rules.TestName;
-import org.junit.runner.RunWith;
-import org.junit.runners.Parameterized;
 import org.junit.runners.Parameterized.Parameter;
 import org.junit.runners.Parameterized.Parameters;
 
@@ -106,7 +102,7 @@ public abstract class TestAsyncAdminBase {
 
   @After
   public void tearDown() throws Exception {
-admin.listTableNames(Optional.of(Pattern.compile(tableName.getNameAsString() + ".*")), false)
+admin.listTableNames(Pattern.compile(tableName.getNameAsString() + ".*"), false)
 .whenCompleteAsync((tables, err) -> {
   if (tables != null) {
 tables.forEach(table -> {
@@ -122,19 +118,21 @@ public abstract class TestAsyncAdminBase {
   }
 
   protected void createTableWithDefaultConf(TableName tableName) {
-createTableWithDefaultConf(tableName, Optional.empty());
+createTableWithDefaultConf(tableName, null);
   }
 
-  protected void createTableWithDefaultConf(TableName tableName, Optional<byte[][]> splitKeys) {
+  protected void createTableWithDefaultConf(TableName tableName, byte[][] splitKeys) {
 createTableWithDefaultConf(tableName, splitKeys, FAMILY);
   }
 
-  protected void createTableWithDefaultConf(TableName tableName, Optional<byte[][]> splitKeys,
+  protected void createTableWithDefaultConf(TableName tableName, byte[][] splitKeys,
   byte[]... families) {
 TableDescriptorBuilder builder = TableDescriptorBuilder.newBuilder(tableName);
 for (byte[] family : families) {
   builder.addColumnFamily(ColumnFamilyDescriptorBuilder.of(family));
 }
-admin.createTable(builder.build(), splitKeys).join();
+    CompletableFuture<Void> future = splitKeys == null ? admin.createTable(builder.build())
+        : admin.createTable(builder.build(), splitKeys);
+future.join();
   }
 }
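The patched createTableWithDefaultConf above shows the caller-side cost of HBASE-18950: with Optional gone from AsyncAdmin, an "absent" argument becomes a null check that dispatches between two overloads. A self-contained sketch of that dispatch (all names are hypothetical stand-ins, not the HBase API):

```java
import java.util.concurrent.CompletableFuture;

public class OverloadDemo {
    // Stand-ins for AsyncAdmin.createTable(desc) / createTable(desc, splitKeys).
    static CompletableFuture<String> createTable(String desc) {
        return CompletableFuture.completedFuture(desc + ":1 region");
    }

    static CompletableFuture<String> createTable(String desc, byte[][] splitKeys) {
        // splitKeys.length split points produce splitKeys.length + 1 regions.
        return CompletableFuture.completedFuture(desc + ":" + (splitKeys.length + 1) + " regions");
    }

    // Caller-side dispatch mirroring the patched createTableWithDefaultConf:
    // null means "no split keys", selecting the single-argument overload.
    public static String createWithDefaultConf(String desc, byte[][] splitKeys) {
        CompletableFuture<String> f = splitKeys == null
            ? createTable(desc)
            : createTable(desc, splitKeys);
        return f.join();
    }
}
```

The trade-off is that overloads keep Optional out of public signatures (the convention Java's designers recommend) at the price of a null convention inside helpers like this one.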

http://git-wip-us.apache.org/repos/asf/hbase/blob/47c614c7/hbase-server/src/test/java/org/apache/hadoop/hbase/client/TestAsyncClusterAdminApi.java
--
diff --git a/hbase-server/src/test/java/org/apache/hadoop/hbase/client/TestAsyncClusterAdminApi.java b/hbase-server/src/test/java/org/apache/hadoop/hbase/client/TestAsyncClusterAdminApi.java
index 53de2b5..e7c439b 100644
--- a/hbase-server/src/test/java/org/apache/hadoop/hbase/client/TestAsyncClusterAdminApi.java
+++ b/hbase-server/src/test/java/org/apache/hadoop/hbase/client/TestAsyncClusterAdminApi.java
@@ -31,7 +31,6 @@ import java.util.Collection;
 import java.util.EnumSet;
 import java.util.List;
 import java.util.Map;
-import java.util.Optional;
 
 import org.apache.hadoop.conf.Configuration;
 import org.apache.hadoop.hbase.ClusterStatus;
@@ -254,7 +253,7 @@ public class TestAsyncClusterAdminApi extends TestAsyncAdminBase {
   List tableRegions = admin.getTableRegions(table).get();
   List regionLoads = Lists.newArrayList();
   for (ServerName serverName : servers) {
-    regionLoads.addAll(admin.getRegionLoads(serverName, Optional.of(table)).get());
+    regionLoads.addAll(admin.getRegionLoads(serverName, table).get());
   }
   checkRegionsAndRegionLoads(tableRegions, regionLoads);
 }

http://git-wip-us.apache.org/repos/asf/hbase/blob/47c614c7/hbase-server/src/test/java/org/apache/hadoop/hbase/client/TestAsyncRegionAdminApi.java
--
diff --git a/hbase-server/src/test/java/org/apache/hadoop/hbase/client/TestAsyncRegionAdminApi.java b/hbase-server/src/test/java/org/apache/hadoop/hbase/client/TestAsyncRegionAdminApi.java
index 262cac6..1ee1b94 100644
--- a/hbase-server/src/test/java/org/apache/hadoop/hbase/client/TestAsyncRegionAdminApi.java
+++ 

[2/2] hbase git commit: HBASE-18950 Remove Optional parameters in AsyncAdmin interface

2017-11-06 Thread zghao
HBASE-18950 Remove Optional parameters in AsyncAdmin interface


Project: http://git-wip-us.apache.org/repos/asf/hbase/repo
Commit: http://git-wip-us.apache.org/repos/asf/hbase/commit/47c614c7
Tree: http://git-wip-us.apache.org/repos/asf/hbase/tree/47c614c7
Diff: http://git-wip-us.apache.org/repos/asf/hbase/diff/47c614c7

Branch: refs/heads/branch-2
Commit: 47c614c70607520f2f5824b0026be09a124809af
Parents: 061a73d
Author: Guanghao Zhang 
Authored: Mon Oct 23 11:22:00 2017 +0800
Committer: Guanghao Zhang 
Committed: Mon Nov 6 20:41:20 2017 +0800

--
 .../apache/hadoop/hbase/client/AsyncAdmin.java  | 132 ++---
 .../hadoop/hbase/client/AsyncHBaseAdmin.java| 114 -
 .../client/AsyncRpcRetryingCallerFactory.java   |   8 +-
 .../apache/hadoop/hbase/client/HBaseAdmin.java  |   6 +-
 .../hadoop/hbase/client/RawAsyncHBaseAdmin.java | 477 +--
 .../hbase/shaded/protobuf/ProtobufUtil.java |  11 +-
 .../hbase/shaded/protobuf/RequestConverter.java | 403 ++--
 ...gionServerBulkLoadWithOldSecureEndpoint.java |   3 +-
 .../hadoop/hbase/client/TestAsyncAdminBase.java |  18 +-
 .../hbase/client/TestAsyncClusterAdminApi.java  |   3 +-
 .../hbase/client/TestAsyncRegionAdminApi.java   |  23 +-
 .../hbase/client/TestAsyncSnapshotAdminApi.java |  12 +-
 .../hbase/client/TestAsyncTableAdminApi.java|  22 +-
 .../hbase/coprocessor/TestMasterObserver.java   |   9 +-
 .../regionserver/TestHRegionServerBulkLoad.java |   2 +-
 .../TestHRegionServerBulkLoadWithOldClient.java |   3 +-
 16 files changed, 713 insertions(+), 533 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/hbase/blob/47c614c7/hbase-client/src/main/java/org/apache/hadoop/hbase/client/AsyncAdmin.java
--
diff --git a/hbase-client/src/main/java/org/apache/hadoop/hbase/client/AsyncAdmin.java b/hbase-client/src/main/java/org/apache/hadoop/hbase/client/AsyncAdmin.java
index 8fe02b9..baae6cf 100644
--- a/hbase-client/src/main/java/org/apache/hadoop/hbase/client/AsyncAdmin.java
+++ b/hbase-client/src/main/java/org/apache/hadoop/hbase/client/AsyncAdmin.java
@@ -64,38 +64,49 @@ public interface AsyncAdmin {
   /**
* List all the userspace tables.
* @return - returns a list of TableDescriptors wrapped by a {@link 
CompletableFuture}.
-   * @see #listTables(Optional, boolean)
*/
   default CompletableFuture listTables() {
-return listTables(Optional.empty(), false);
+return listTables(false);
   }
 
   /**
+   * List all the tables.
+   * @param includeSysTables False to match only against userspace tables
+   * @return - returns a list of TableDescriptors wrapped by a {@link 
CompletableFuture}.
+   */
+  CompletableFuture listTables(boolean 
includeSysTables);
+
+  /**
* List all the tables matching the given pattern.
* @param pattern The compiled regular expression to match against
* @param includeSysTables False to match only against userspace tables
* @return - returns a list of TableDescriptors wrapped by a {@link 
CompletableFuture}.
*/
-  CompletableFuture listTables(Optional 
pattern,
-  boolean includeSysTables);
+  CompletableFuture listTables(Pattern pattern, boolean 
includeSysTables);
 
   /**
* List all of the names of userspace tables.
* @return a list of table names wrapped by a {@link CompletableFuture}.
-   * @see #listTableNames(Optional, boolean)
+   * @see #listTableNames(Pattern, boolean)
*/
   default CompletableFuture listTableNames() {
-return listTableNames(Optional.empty(), false);
+return listTableNames(false);
   }
 
   /**
+   * List all of the names of tables.
+   * @param includeSysTables False to match only against userspace tables
+   * @return a list of table names wrapped by a {@link CompletableFuture}.
+   */
+  CompletableFuture<List<TableName>> listTableNames(boolean includeSysTables);
+
+  /**
* List all of the names of userspace tables.
* @param pattern The regular expression to match against
* @param includeSysTables False to match only against userspace tables
* @return a list of table names wrapped by a {@link CompletableFuture}.
*/
-  CompletableFuture<List<TableName>> listTableNames(Optional<Pattern> pattern,
-      boolean includeSysTables);
+  CompletableFuture<List<TableName>> listTableNames(Pattern pattern, boolean includeSysTables);
 
   /**
* Method for getting the tableDescriptor
@@ -108,9 +119,7 @@ public interface AsyncAdmin {
* Creates a new table.
* @param desc table descriptor for table
*/
-  default CompletableFuture<Void> createTable(TableDescriptor desc) {
-    return createTable(desc, Optional.empty());
-  }
+  CompletableFuture<Void> createTable(TableDescriptor desc);
 
   /**
* Creates a new table with the specified number of regions. The start key 

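The interface change above swaps `Optional` parameters for plain overloads, with a no-argument default method delegating to the boolean variant. A minimal, self-contained sketch of that pattern follows; `TinyAdmin`, `OverloadDemo`, and the table names are invented for illustration and are not the HBase API:

```java
import java.util.List;
import java.util.concurrent.CompletableFuture;
import java.util.regex.Pattern;

// Trimmed-down analogue of the AsyncAdmin refactor: the Optional<Pattern>
// parameter becomes a separate overload, and the convenience form is a
// default method that delegates instead of passing Optional.empty().
interface TinyAdmin {
  // Convenience overload: user-space tables only.
  default CompletableFuture<List<String>> listTables() {
    return listTables(false);
  }

  CompletableFuture<List<String>> listTables(boolean includeSysTables);

  CompletableFuture<List<String>> listTables(Pattern pattern, boolean includeSysTables);
}

public class OverloadDemo implements TinyAdmin {
  private static final List<String> ALL = List.of("hbase:meta", "t1", "t2");

  @Override
  public CompletableFuture<List<String>> listTables(boolean includeSysTables) {
    // The boolean-only overload is itself a delegation to the pattern overload.
    return listTables(Pattern.compile(".*"), includeSysTables);
  }

  @Override
  public CompletableFuture<List<String>> listTables(Pattern pattern, boolean includeSysTables) {
    return CompletableFuture.supplyAsync(() -> ALL.stream()
        .filter(t -> includeSysTables || !t.startsWith("hbase:"))
        .filter(t -> pattern.matcher(t).matches())
        .toList());
  }

  public static void main(String[] args) {
    TinyAdmin admin = new OverloadDemo();
    System.out.println(admin.listTables().join());     // [t1, t2]
    System.out.println(admin.listTables(true).join()); // [hbase:meta, t1, t2]
  }
}
```

Compared with the old `listTables(Optional.empty(), false)` call sites, each overload now states its intent in the signature, which is the stated motivation for HBASE-18950.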
[1/2] hbase git commit: HBASE-18950 Remove Optional parameters in AsyncAdmin interface

2017-11-06 Thread zghao
Repository: hbase
Updated Branches:
  refs/heads/master bc3f3ee3b -> 888f2335c


http://git-wip-us.apache.org/repos/asf/hbase/blob/888f2335/hbase-server/src/test/java/org/apache/hadoop/hbase/client/TestAsyncAdminBase.java
--
diff --git a/hbase-server/src/test/java/org/apache/hadoop/hbase/client/TestAsyncAdminBase.java b/hbase-server/src/test/java/org/apache/hadoop/hbase/client/TestAsyncAdminBase.java
index c3c4045..83ba244 100644
--- a/hbase-server/src/test/java/org/apache/hadoop/hbase/client/TestAsyncAdminBase.java
+++ b/hbase-server/src/test/java/org/apache/hadoop/hbase/client/TestAsyncAdminBase.java
@@ -21,10 +21,8 @@ import static org.apache.hadoop.hbase.client.AsyncProcess.START_LOG_ERRORS_AFTER
 
 import java.util.Arrays;
 import java.util.List;
-import java.util.Optional;
-import java.util.concurrent.ExecutionException;
+import java.util.concurrent.CompletableFuture;
 import java.util.concurrent.ForkJoinPool;
-import java.util.concurrent.TimeUnit;
 import java.util.function.Supplier;
 import java.util.regex.Pattern;
 
@@ -41,8 +39,6 @@ import org.junit.Before;
 import org.junit.BeforeClass;
 import org.junit.Rule;
 import org.junit.rules.TestName;
-import org.junit.runner.RunWith;
-import org.junit.runners.Parameterized;
 import org.junit.runners.Parameterized.Parameter;
 import org.junit.runners.Parameterized.Parameters;
 
@@ -106,7 +102,7 @@ public abstract class TestAsyncAdminBase {
 
   @After
   public void tearDown() throws Exception {
-    admin.listTableNames(Optional.of(Pattern.compile(tableName.getNameAsString() + ".*")), false)
+    admin.listTableNames(Pattern.compile(tableName.getNameAsString() + ".*"), false)
 .whenCompleteAsync((tables, err) -> {
   if (tables != null) {
 tables.forEach(table -> {
@@ -122,19 +118,21 @@ public abstract class TestAsyncAdminBase {
   }
 
   protected void createTableWithDefaultConf(TableName tableName) {
-createTableWithDefaultConf(tableName, Optional.empty());
+createTableWithDefaultConf(tableName, null);
   }
 
-  protected void createTableWithDefaultConf(TableName tableName, Optional<byte[][]> splitKeys) {
+  protected void createTableWithDefaultConf(TableName tableName, byte[][] splitKeys) {
 createTableWithDefaultConf(tableName, splitKeys, FAMILY);
   }
 
-  protected void createTableWithDefaultConf(TableName tableName, Optional<byte[][]> splitKeys,
+  protected void createTableWithDefaultConf(TableName tableName, byte[][] splitKeys,
   byte[]... families) {
TableDescriptorBuilder builder = TableDescriptorBuilder.newBuilder(tableName);
 for (byte[] family : families) {
   builder.addColumnFamily(ColumnFamilyDescriptorBuilder.of(family));
 }
-admin.createTable(builder.build(), splitKeys).join();
+    CompletableFuture<Void> future = splitKeys == null ? admin.createTable(builder.build())
+        : admin.createTable(builder.build(), splitKeys);
+    future.join();
   }
 }
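With the `Optional<byte[][]>` parameter gone, the test helper above dispatches on `null` to pick between the two `createTable` overloads. A standalone sketch of that dispatch follows; `NullDispatchDemo` and its string results are invented for illustration and are not the HBase API:

```java
import java.util.concurrent.CompletableFuture;

// Sketch of the createTableWithDefaultConf change: a null splitKeys array
// now selects the no-split overload instead of passing Optional.empty().
public class NullDispatchDemo {
  static CompletableFuture<String> createTable(String name) {
    return CompletableFuture.completedFuture(name + " (no pre-splits)");
  }

  static CompletableFuture<String> createTable(String name, byte[][] splitKeys) {
    return CompletableFuture.completedFuture(name + " (" + splitKeys.length + " split keys)");
  }

  // Mirrors the test helper: choose the overload based on whether splitKeys is present.
  static String createWithDefaultConf(String name, byte[][] splitKeys) {
    CompletableFuture<String> future = splitKeys == null
        ? createTable(name)
        : createTable(name, splitKeys);
    return future.join(); // block until the async create completes
  }

  public static void main(String[] args) {
    System.out.println(createWithDefaultConf("t1", null));                       // t1 (no pre-splits)
    System.out.println(createWithDefaultConf("t2", new byte[][] { {1}, {2} })); // t2 (2 split keys)
  }
}
```

The trade-off in this design is that `null` is once again a meaningful argument value, which `Optional` had made explicit; the commit accepts that in exchange for simpler public signatures.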

http://git-wip-us.apache.org/repos/asf/hbase/blob/888f2335/hbase-server/src/test/java/org/apache/hadoop/hbase/client/TestAsyncClusterAdminApi.java
--
diff --git a/hbase-server/src/test/java/org/apache/hadoop/hbase/client/TestAsyncClusterAdminApi.java b/hbase-server/src/test/java/org/apache/hadoop/hbase/client/TestAsyncClusterAdminApi.java
index 53de2b5..e7c439b 100644
--- a/hbase-server/src/test/java/org/apache/hadoop/hbase/client/TestAsyncClusterAdminApi.java
+++ b/hbase-server/src/test/java/org/apache/hadoop/hbase/client/TestAsyncClusterAdminApi.java
@@ -31,7 +31,6 @@ import java.util.Collection;
 import java.util.EnumSet;
 import java.util.List;
 import java.util.Map;
-import java.util.Optional;
 
 import org.apache.hadoop.conf.Configuration;
 import org.apache.hadoop.hbase.ClusterStatus;
@@ -254,7 +253,7 @@ public class TestAsyncClusterAdminApi extends TestAsyncAdminBase {
   List<HRegionInfo> tableRegions = admin.getTableRegions(table).get();
   List<RegionLoad> regionLoads = Lists.newArrayList();
   for (ServerName serverName : servers) {
-    regionLoads.addAll(admin.getRegionLoads(serverName, Optional.of(table)).get());
+    regionLoads.addAll(admin.getRegionLoads(serverName, table).get());
   }
   checkRegionsAndRegionLoads(tableRegions, regionLoads);
 }
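The loop above joins one asynchronous `getRegionLoads` call per server and appends each result to a single list. A self-contained sketch of that aggregation follows; `AggregateLoadsDemo`, the server names, and the load strings are invented for illustration and are not the HBase API:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.CompletableFuture;

// Sketch of the per-server aggregation in the test: issue one async call
// per server, join each future, and collect all results into one list.
public class AggregateLoadsDemo {
  // Stand-in for admin.getRegionLoads(serverName, table): two fake loads per server.
  static CompletableFuture<List<String>> getRegionLoads(String server, String table) {
    return CompletableFuture.supplyAsync(
        () -> List.of(server + "/" + table + ",region-1", server + "/" + table + ",region-2"));
  }

  static List<String> collect(List<String> servers, String table) {
    List<String> regionLoads = new ArrayList<>();
    for (String server : servers) {
      // join() blocks on each future in turn, as the test's get() does.
      regionLoads.addAll(getRegionLoads(server, table).join());
    }
    return regionLoads;
  }

  public static void main(String[] args) {
    System.out.println(collect(List.of("rs1", "rs2"), "t1").size()); // 4
  }
}
```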

http://git-wip-us.apache.org/repos/asf/hbase/blob/888f2335/hbase-server/src/test/java/org/apache/hadoop/hbase/client/TestAsyncRegionAdminApi.java
--
diff --git a/hbase-server/src/test/java/org/apache/hadoop/hbase/client/TestAsyncRegionAdminApi.java b/hbase-server/src/test/java/org/apache/hadoop/hbase/client/TestAsyncRegionAdminApi.java
index 262cac6..1ee1b94 100644
--- a/hbase-server/src/test/java/org/apache/hadoop/hbase/client/TestAsyncRegionAdminApi.java
+++ 

[2/2] hbase git commit: HBASE-18950 Remove Optional parameters in AsyncAdmin interface

2017-11-06 Thread zghao
HBASE-18950 Remove Optional parameters in AsyncAdmin interface


Project: http://git-wip-us.apache.org/repos/asf/hbase/repo
Commit: http://git-wip-us.apache.org/repos/asf/hbase/commit/888f2335
Tree: http://git-wip-us.apache.org/repos/asf/hbase/tree/888f2335
Diff: http://git-wip-us.apache.org/repos/asf/hbase/diff/888f2335

Branch: refs/heads/master
Commit: 888f2335c952040646ce820f6191f6433ec9411d
Parents: bc3f3ee
Author: Guanghao Zhang 
Authored: Mon Oct 23 11:22:00 2017 +0800
Committer: Guanghao Zhang 
Committed: Mon Nov 6 20:30:59 2017 +0800

--
 .../apache/hadoop/hbase/client/AsyncAdmin.java  | 132 ++---
 .../hadoop/hbase/client/AsyncHBaseAdmin.java| 114 -
 .../client/AsyncRpcRetryingCallerFactory.java   |   8 +-
 .../apache/hadoop/hbase/client/HBaseAdmin.java  |   6 +-
 .../hadoop/hbase/client/RawAsyncHBaseAdmin.java | 477 +--
 .../hbase/shaded/protobuf/ProtobufUtil.java |  11 +-
 .../hbase/shaded/protobuf/RequestConverter.java | 403 ++--
 ...gionServerBulkLoadWithOldSecureEndpoint.java |   3 +-
 .../hadoop/hbase/client/TestAsyncAdminBase.java |  18 +-
 .../hbase/client/TestAsyncClusterAdminApi.java  |   3 +-
 .../hbase/client/TestAsyncRegionAdminApi.java   |  23 +-
 .../hbase/client/TestAsyncSnapshotAdminApi.java |  12 +-
 .../hbase/client/TestAsyncTableAdminApi.java|  22 +-
 .../hbase/coprocessor/TestMasterObserver.java   |   9 +-
 .../regionserver/TestHRegionServerBulkLoad.java |   2 +-
 .../TestHRegionServerBulkLoadWithOldClient.java |   3 +-
 16 files changed, 713 insertions(+), 533 deletions(-)
--

