Apache-Phoenix | 4.x-HBase-1.0 | Build Successful

2016-07-11 Thread Apache Jenkins Server
4.x-HBase-1.0 branch build status Successful

Source repository https://git-wip-us.apache.org/repos/asf?p=phoenix.git;a=shortlog;h=refs/heads/4.x-HBase-1.0

Compiled Artifacts https://builds.apache.org/job/Phoenix-4.x-HBase-1.0/lastSuccessfulBuild/artifact/

Test Report https://builds.apache.org/job/Phoenix-4.x-HBase-1.0/lastCompletedBuild/testReport/

Changes
No changes


Build times for last couple of runs. Latest build time is the rightmost | Legend: blue: normal, red: test failure, gray: timeout


phoenix git commit: PHOENIX-3060 pherf tool is not working (Sergey Soldatov)

2016-07-11 Thread ankit
Repository: phoenix
Updated Branches:
  refs/heads/master 4790b3710 -> af8bbfb52


PHOENIX-3060 pherf tool is not working (Sergey Soldatov)


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/af8bbfb5
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/af8bbfb5
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/af8bbfb5

Branch: refs/heads/master
Commit: af8bbfb52a3ebc2bffb034ee62e9f27972de8e78
Parents: 4790b37
Author: Ankit Singhal 
Authored: Mon Jul 11 13:42:55 2016 +0530
Committer: Ankit Singhal 
Committed: Mon Jul 11 13:42:55 2016 +0530

--
 phoenix-pherf/pom.xml | 304 ++---
 1 file changed, 286 insertions(+), 18 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/af8bbfb5/phoenix-pherf/pom.xml
--
diff --git a/phoenix-pherf/pom.xml b/phoenix-pherf/pom.xml
index d25eba9..894eb55 100644
--- a/phoenix-pherf/pom.xml
+++ b/phoenix-pherf/pom.xml
@@ -24,6 +24,7 @@
 
   <properties>
     <top.dir>${project.basedir}/..</top.dir>
+    <shaded.package>org.apache.phoenix.shaded</shaded.package>
   </properties>
 
@@ -166,24 +167,291 @@
       <plugin>
         <groupId>org.apache.maven.plugins</groupId>
         <artifactId>maven-resources-plugin</artifactId>
       </plugin>
-      <plugin>
-        <artifactId>maven-assembly-plugin</artifactId>
-        <executions>
-          <execution>
-            <id>make-assembly</id>
-            <phase>package</phase>
-            <goals>
-              <goal>single</goal>
-            </goals>
-          </execution>
-        </executions>
-        <configuration>
-          <descriptors>
-            <descriptor>src/main/assembly/minimal.xml</descriptor>
-          </descriptors>
-        </configuration>
-      </plugin>
+      <plugin>
+        <groupId>org.apache.maven.plugins</groupId>
+        <artifactId>maven-install-plugin</artifactId>
+        <executions>
+          <execution>
+            <goals>
+              <goal>install-file</goal>
+            </goals>
+            <id>default-install</id>
+            <configuration>
+              <skip>true</skip>
+            </configuration>
+            <phase>install</phase>
+          </execution>
+        </executions>
+        <configuration>
+          <file>${basedir}/target/phoenix-pherf-${project.version}-minimal.jar</file>
+        </configuration>
+      </plugin>
+      <plugin>
+        <groupId>org.apache.maven.plugins</groupId>
+        <artifactId>maven-shade-plugin</artifactId>
+        <executions>
+          <execution>
+            <phase>package</phase>
+            <goals>
+              <goal>shade</goal>
+            </goals>
+            <configuration>
+              <finalName>phoenix-pherf-${project.version}-minimal</finalName>
+              <shadedArtifactAttached>false</shadedArtifactAttached>
+              <promoteTransitiveDependencies>true</promoteTransitiveDependencies>
+              <createDependencyReducedPom>false</createDependencyReducedPom>
+              <transformers>
+                <transformer implementation="org.apache.maven.plugins.shade.resource.IncludeResourceTransformer">
+                  <resource>README.md</resource>
+                  <file>${project.basedir}/../README.md</file>
+                </transformer>
+                <transformer implementation="org.apache.maven.plugins.shade.resource.IncludeResourceTransformer">
+                  <resource>LICENSE.txt</resource>
+                  <file>${project.basedir}/../LICENSE</file>
+                </transformer>
+                <transformer implementation="org.apache.maven.plugins.shade.resource.IncludeResourceTransformer">
+                  <resource>NOTICE</resource>
+                  <file>${project.basedir}/../NOTICE</file>
+                </transformer>
+              </transformers>
+              <relocations>
+                <relocation>
+                  <pattern>org.apa

phoenix git commit: PHOENIX-3060 pherf tool is not working (Sergey Soldatov)

2016-07-11 Thread ankit
Repository: phoenix
Updated Branches:
  refs/heads/4.x-HBase-1.1 275f716c2 -> 913ee6a42


PHOENIX-3060 pherf tool is not working (Sergey Soldatov)


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/913ee6a4
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/913ee6a4
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/913ee6a4

Branch: refs/heads/4.x-HBase-1.1
Commit: 913ee6a427ab55ee2b786579877613f3f8e75881
Parents: 275f716
Author: Ankit Singhal 
Authored: Mon Jul 11 13:46:19 2016 +0530
Committer: Ankit Singhal 
Committed: Mon Jul 11 13:46:19 2016 +0530

--
 phoenix-pherf/pom.xml | 304 ++---
 1 file changed, 286 insertions(+), 18 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/913ee6a4/phoenix-pherf/pom.xml
--
diff --git a/phoenix-pherf/pom.xml b/phoenix-pherf/pom.xml
index 9850ade..8f359e6 100644
--- a/phoenix-pherf/pom.xml
+++ b/phoenix-pherf/pom.xml
@@ -24,6 +24,7 @@
 
   <properties>
     <top.dir>${project.basedir}/..</top.dir>
+    <shaded.package>org.apache.phoenix.shaded</shaded.package>
   </properties>
 
@@ -166,24 +167,291 @@
       <plugin>
         <groupId>org.apache.maven.plugins</groupId>
         <artifactId>maven-resources-plugin</artifactId>
       </plugin>
-      <plugin>
-        <artifactId>maven-assembly-plugin</artifactId>
-        <executions>
-          <execution>
-            <id>make-assembly</id>
-            <phase>package</phase>
-            <goals>
-              <goal>single</goal>
-            </goals>
-          </execution>
-        </executions>
-        <configuration>
-          <descriptors>
-            <descriptor>src/main/assembly/minimal.xml</descriptor>
-          </descriptors>
-        </configuration>
-      </plugin>
+      <plugin>
+        <groupId>org.apache.maven.plugins</groupId>
+        <artifactId>maven-install-plugin</artifactId>
+        <executions>
+          <execution>
+            <goals>
+              <goal>install-file</goal>
+            </goals>
+            <id>default-install</id>
+            <configuration>
+              <skip>true</skip>
+            </configuration>
+            <phase>install</phase>
+          </execution>
+        </executions>
+        <configuration>
+          <file>${basedir}/target/phoenix-pherf-${project.version}-minimal.jar</file>
+        </configuration>
+      </plugin>
+      <plugin>
+        <groupId>org.apache.maven.plugins</groupId>
+        <artifactId>maven-shade-plugin</artifactId>
+        <executions>
+          <execution>
+            <phase>package</phase>
+            <goals>
+              <goal>shade</goal>
+            </goals>
+            <configuration>
+              <finalName>phoenix-pherf-${project.version}-minimal</finalName>
+              <shadedArtifactAttached>false</shadedArtifactAttached>
+              <promoteTransitiveDependencies>true</promoteTransitiveDependencies>
+              <createDependencyReducedPom>false</createDependencyReducedPom>
+              <transformers>
+                <transformer implementation="org.apache.maven.plugins.shade.resource.IncludeResourceTransformer">
+                  <resource>README.md</resource>
+                  <file>${project.basedir}/../README.md</file>
+                </transformer>
+                <transformer implementation="org.apache.maven.plugins.shade.resource.IncludeResourceTransformer">
+                  <resource>LICENSE.txt</resource>
+                  <file>${project.basedir}/../LICENSE</file>
+                </transformer>
+                <transformer implementation="org.apache.maven.plugins.shade.resource.IncludeResourceTransformer">
+                  <resource>NOTICE</resource>
+                  <file>${project.basedir}/../NOTICE</file>
+                </transformer>
+              </transformers>
+            </configuration>
+          </execution>
+        </executions>
+      </plugin>

phoenix git commit: PHOENIX-3060 pherf tool is not working (Sergey Soldatov)

2016-07-11 Thread ankit
Repository: phoenix
Updated Branches:
  refs/heads/4.x-HBase-1.0 02ba6507f -> 4dfe5cb93


PHOENIX-3060 pherf tool is not working (Sergey Soldatov)


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/4dfe5cb9
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/4dfe5cb9
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/4dfe5cb9

Branch: refs/heads/4.x-HBase-1.0
Commit: 4dfe5cb934561c5b2259db24921b98e1b0512774
Parents: 02ba650
Author: Ankit Singhal 
Authored: Mon Jul 11 13:46:42 2016 +0530
Committer: Ankit Singhal 
Committed: Mon Jul 11 13:46:42 2016 +0530

--
 phoenix-pherf/pom.xml | 304 ++---
 1 file changed, 286 insertions(+), 18 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/4dfe5cb9/phoenix-pherf/pom.xml
--
diff --git a/phoenix-pherf/pom.xml b/phoenix-pherf/pom.xml
index 8144497..d06429a 100644
--- a/phoenix-pherf/pom.xml
+++ b/phoenix-pherf/pom.xml
@@ -24,6 +24,7 @@
 
   <properties>
     <top.dir>${project.basedir}/..</top.dir>
+    <shaded.package>org.apache.phoenix.shaded</shaded.package>
   </properties>
 
@@ -166,24 +167,291 @@
       <plugin>
         <groupId>org.apache.maven.plugins</groupId>
         <artifactId>maven-resources-plugin</artifactId>
       </plugin>
-      <plugin>
-        <artifactId>maven-assembly-plugin</artifactId>
-        <executions>
-          <execution>
-            <id>make-assembly</id>
-            <phase>package</phase>
-            <goals>
-              <goal>single</goal>
-            </goals>
-          </execution>
-        </executions>
-        <configuration>
-          <descriptors>
-            <descriptor>src/main/assembly/minimal.xml</descriptor>
-          </descriptors>
-        </configuration>
-      </plugin>
+      <plugin>
+        <groupId>org.apache.maven.plugins</groupId>
+        <artifactId>maven-install-plugin</artifactId>
+        <executions>
+          <execution>
+            <goals>
+              <goal>install-file</goal>
+            </goals>
+            <id>default-install</id>
+            <configuration>
+              <skip>true</skip>
+            </configuration>
+            <phase>install</phase>
+          </execution>
+        </executions>
+        <configuration>
+          <file>${basedir}/target/phoenix-pherf-${project.version}-minimal.jar</file>
+        </configuration>
+      </plugin>
+      <plugin>
+        <groupId>org.apache.maven.plugins</groupId>
+        <artifactId>maven-shade-plugin</artifactId>
+        <executions>
+          <execution>
+            <phase>package</phase>
+            <goals>
+              <goal>shade</goal>
+            </goals>
+            <configuration>
+              <finalName>phoenix-pherf-${project.version}-minimal</finalName>
+              <shadedArtifactAttached>false</shadedArtifactAttached>
+              <promoteTransitiveDependencies>true</promoteTransitiveDependencies>
+              <createDependencyReducedPom>false</createDependencyReducedPom>
+              <transformers>
+                <transformer implementation="org.apache.maven.plugins.shade.resource.IncludeResourceTransformer">
+                  <resource>README.md</resource>
+                  <file>${project.basedir}/../README.md</file>
+                </transformer>
+                <transformer implementation="org.apache.maven.plugins.shade.resource.IncludeResourceTransformer">
+                  <resource>LICENSE.txt</resource>
+                  <file>${project.basedir}/../LICENSE</file>
+                </transformer>
+                <transformer implementation="org.apache.maven.plugins.shade.resource.IncludeResourceTransformer">
+                  <resource>NOTICE</resource>
+                  <file>${project.basedir}/../NOTICE</file>
+                </transformer>
+              </transformers>
+            </configuration>
+          </execution>
+        </executions>
+      </plugin>

phoenix git commit: PHOENIX-3060 pherf tool is not working (Sergey Soldatov)

2016-07-11 Thread ankit
Repository: phoenix
Updated Branches:
  refs/heads/4.x-HBase-0.98 9bc0cebe1 -> e3f12f215


PHOENIX-3060 pherf tool is not working (Sergey Soldatov)


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/e3f12f21
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/e3f12f21
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/e3f12f21

Branch: refs/heads/4.x-HBase-0.98
Commit: e3f12f2157d4c57eba939b670e450c82c0a3a10c
Parents: 9bc0ceb
Author: Ankit Singhal 
Authored: Mon Jul 11 13:49:09 2016 +0530
Committer: Ankit Singhal 
Committed: Mon Jul 11 13:49:09 2016 +0530

--
 phoenix-pherf/pom.xml | 304 ++---
 1 file changed, 286 insertions(+), 18 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/e3f12f21/phoenix-pherf/pom.xml
--
diff --git a/phoenix-pherf/pom.xml b/phoenix-pherf/pom.xml
index c776f7a..55afda0 100644
--- a/phoenix-pherf/pom.xml
+++ b/phoenix-pherf/pom.xml
@@ -24,6 +24,7 @@
 
   <properties>
     <top.dir>${project.basedir}/..</top.dir>
+    <shaded.package>org.apache.phoenix.shaded</shaded.package>
   </properties>
 
@@ -166,24 +167,291 @@
       <plugin>
         <groupId>org.apache.maven.plugins</groupId>
         <artifactId>maven-resources-plugin</artifactId>
       </plugin>
-      <plugin>
-        <artifactId>maven-assembly-plugin</artifactId>
-        <executions>
-          <execution>
-            <id>make-assembly</id>
-            <phase>package</phase>
-            <goals>
-              <goal>single</goal>
-            </goals>
-          </execution>
-        </executions>
-        <configuration>
-          <descriptors>
-            <descriptor>src/main/assembly/minimal.xml</descriptor>
-          </descriptors>
-        </configuration>
-      </plugin>
+      <plugin>
+        <groupId>org.apache.maven.plugins</groupId>
+        <artifactId>maven-install-plugin</artifactId>
+        <executions>
+          <execution>
+            <goals>
+              <goal>install-file</goal>
+            </goals>
+            <id>default-install</id>
+            <configuration>
+              <skip>true</skip>
+            </configuration>
+            <phase>install</phase>
+          </execution>
+        </executions>
+        <configuration>
+          <file>${basedir}/target/phoenix-pherf-${project.version}-minimal.jar</file>
+        </configuration>
+      </plugin>
+      <plugin>
+        <groupId>org.apache.maven.plugins</groupId>
+        <artifactId>maven-shade-plugin</artifactId>
+        <executions>
+          <execution>
+            <phase>package</phase>
+            <goals>
+              <goal>shade</goal>
+            </goals>
+            <configuration>
+              <finalName>phoenix-pherf-${project.version}-minimal</finalName>
+              <shadedArtifactAttached>false</shadedArtifactAttached>
+              <promoteTransitiveDependencies>true</promoteTransitiveDependencies>
+              <createDependencyReducedPom>false</createDependencyReducedPom>
+              <transformers>
+                <transformer implementation="org.apache.maven.plugins.shade.resource.IncludeResourceTransformer">
+                  <resource>README.md</resource>
+                  <file>${project.basedir}/../README.md</file>
+                </transformer>
+                <transformer implementation="org.apache.maven.plugins.shade.resource.IncludeResourceTransformer">
+                  <resource>LICENSE.txt</resource>
+                  <file>${project.basedir}/../LICENSE</file>
+                </transformer>
+                <transformer implementation="org.apache.maven.plugins.shade.resource.IncludeResourceTransformer">
+                  <resource>NOTICE</resource>
+                  <file>${project.basedir}/../NOTICE</file>
+                </transformer>
+              </transformers>
+            </configuration>
+          </execution>
+        </executions>
+      </plugin>

Jenkins build is back to normal : Phoenix | Master #1317

2016-07-11 Thread Apache Jenkins Server
See 



Build failed in Jenkins: Phoenix-4.x-HBase-1.1 #114

2016-07-11 Thread Apache Jenkins Server
See 

--
[...truncated 2092 lines...]
Tests run: 1, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 0.033 sec <<< 
FAILURE! - in org.apache.phoenix.end2end.index.AsyncImmutableIndexIT
org.apache.phoenix.end2end.index.AsyncImmutableIndexIT  Time elapsed: 0.033 sec 
 <<< ERROR!
java.lang.RuntimeException: java.io.IOException: Shutting down
at 
org.apache.phoenix.end2end.index.AsyncImmutableIndexIT.doSetup(AsyncImmutableIndexIT.java:52)
Caused by: java.io.IOException: Shutting down
at 
org.apache.phoenix.end2end.index.AsyncImmutableIndexIT.doSetup(AsyncImmutableIndexIT.java:52)
Caused by: java.lang.RuntimeException: Master not initialized after 20ms 
seconds
at 
org.apache.phoenix.end2end.index.AsyncImmutableIndexIT.doSetup(AsyncImmutableIndexIT.java:52)

Running org.apache.phoenix.end2end.index.ImmutableIndexWithStatsIT
Tests run: 1, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 0.005 sec <<< 
FAILURE! - in org.apache.phoenix.end2end.index.ImmutableIndexWithStatsIT
org.apache.phoenix.end2end.index.ImmutableIndexWithStatsIT  Time elapsed: 0.004 
sec  <<< ERROR!
java.lang.RuntimeException: java.io.IOException: Shutting down
at 
org.apache.phoenix.end2end.index.ImmutableIndexWithStatsIT.doSetup(ImmutableIndexWithStatsIT.java:52)
Caused by: java.io.IOException: Shutting down
at 
org.apache.phoenix.end2end.index.ImmutableIndexWithStatsIT.doSetup(ImmutableIndexWithStatsIT.java:52)
Caused by: java.lang.RuntimeException: Master not initialized after 20ms 
seconds
at 
org.apache.phoenix.end2end.index.ImmutableIndexWithStatsIT.doSetup(ImmutableIndexWithStatsIT.java:52)

Running org.apache.phoenix.end2end.index.MutableIndexFailureIT
Tests run: 1, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 0.004 sec <<< 
FAILURE! - in org.apache.phoenix.end2end.index.MutableIndexFailureIT
org.apache.phoenix.end2end.index.MutableIndexFailureIT  Time elapsed: 0.003 sec 
 <<< ERROR!
java.lang.RuntimeException: java.io.IOException: Shutting down
at 
org.apache.phoenix.end2end.index.MutableIndexFailureIT.doSetup(MutableIndexFailureIT.java:115)
Caused by: java.io.IOException: Shutting down
at 
org.apache.phoenix.end2end.index.MutableIndexFailureIT.doSetup(MutableIndexFailureIT.java:115)
Caused by: java.lang.RuntimeException: Master not initialized after 20ms 
seconds
at 
org.apache.phoenix.end2end.index.MutableIndexFailureIT.doSetup(MutableIndexFailureIT.java:115)

Running org.apache.phoenix.hbase.index.FailForUnsupportedHBaseVersionsIT
Running org.apache.phoenix.end2end.index.MutableIndexReplicationIT
Tests run: 1, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 0.006 sec <<< 
FAILURE! - in org.apache.phoenix.end2end.index.MutableIndexReplicationIT
org.apache.phoenix.end2end.index.MutableIndexReplicationIT  Time elapsed: 0.005 
sec  <<< ERROR!
java.io.IOException: Shutting down
at 
org.apache.phoenix.end2end.index.MutableIndexReplicationIT.setupConfigsAndStartCluster(MutableIndexReplicationIT.java:170)
at 
org.apache.phoenix.end2end.index.MutableIndexReplicationIT.setUpBeforeClass(MutableIndexReplicationIT.java:108)
Caused by: java.lang.RuntimeException: Master not initialized after 20ms 
seconds
at 
org.apache.phoenix.end2end.index.MutableIndexReplicationIT.setupConfigsAndStartCluster(MutableIndexReplicationIT.java:170)
at 
org.apache.phoenix.end2end.index.MutableIndexReplicationIT.setUpBeforeClass(MutableIndexReplicationIT.java:108)

Running org.apache.phoenix.end2end.index.ReadOnlyIndexFailureIT
Tests run: 1, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 0.003 sec <<< 
FAILURE! - in org.apache.phoenix.end2end.index.ReadOnlyIndexFailureIT
org.apache.phoenix.end2end.index.ReadOnlyIndexFailureIT  Time elapsed: 0.003 
sec  <<< ERROR!
java.lang.RuntimeException: java.io.IOException: Shutting down
at 
org.apache.phoenix.end2end.index.ReadOnlyIndexFailureIT.doSetup(ReadOnlyIndexFailureIT.java:119)
Caused by: java.io.IOException: Shutting down
at 
org.apache.phoenix.end2end.index.ReadOnlyIndexFailureIT.doSetup(ReadOnlyIndexFailureIT.java:119)
Caused by: java.lang.RuntimeException: Master not initialized after 20ms 
seconds
at 
org.apache.phoenix.end2end.index.ReadOnlyIndexFailureIT.doSetup(ReadOnlyIndexFailureIT.java:119)

Running org.apache.phoenix.end2end.index.txn.TxWriteFailureIT
Tests run: 1, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 0.006 sec <<< 
FAILURE! - in org.apache.phoenix.end2end.index.txn.TxWriteFailureIT
org.apache.phoenix.end2end.index.txn.TxWriteFailureIT  Time elapsed: 0.005 sec  
<<< ERROR!
java.lang.RuntimeException: java.io.IOException: Shutting down
at 
org.apache.phoenix.end2end.index.txn.TxWriteFailureIT.doSetup(TxWriteFailureIT.java:86)
Caused by: java.io.IOException: Shutting down
at 
org.apache.phoenix.end2end.index.txn.TxWriteFailureI

Build failed in Jenkins: Phoenix | Master #1318

2016-07-11 Thread Apache Jenkins Server
See 

Changes:

[ankitsinghal59] PHOENIX-3060 pherf tool is not working (Sergey Soldatov)

--
[...truncated 1847 lines...]
[INFO] 
[INFO] --- maven-shade-plugin:2.4.3:shade (default) @ phoenix-pherf ---
[INFO] Excluding org.apache.phoenix:phoenix-core:jar:4.8.0-HBase-1.2-SNAPSHOT 
from the shaded jar.
[INFO] Excluding org.apache.tephra:tephra-api:jar:0.8.0-incubating from the 
shaded jar.
[INFO] Excluding org.apache.tephra:tephra-core:jar:0.8.0-incubating from the 
shaded jar.
[INFO] Excluding org.apache.tephra:tephra-hbase-compat-1.1:jar:0.8.0-incubating 
from the shaded jar.
[INFO] Excluding org.antlr:antlr-runtime:jar:3.5.2 from the shaded jar.
[INFO] Excluding jline:jline:jar:2.11 from the shaded jar.
[INFO] Excluding sqlline:sqlline:jar:1.1.9 from the shaded jar.
[INFO] Excluding com.google.guava:guava:jar:13.0.1 from the shaded jar.
[INFO] Excluding joda-time:joda-time:jar:1.6 from the shaded jar.
[INFO] Excluding net.sourceforge.findbugs:annotations:jar:1.3.2 from the shaded 
jar.
[INFO] Excluding org.codehaus.jackson:jackson-core-asl:jar:1.9.2 from the 
shaded jar.
[INFO] Excluding org.codehaus.jackson:jackson-mapper-asl:jar:1.9.2 from the 
shaded jar.
[INFO] Excluding com.google.protobuf:protobuf-java:jar:2.5.0 from the shaded 
jar.
[INFO] Excluding org.apache.httpcomponents:httpclient:jar:4.0.1 from the shaded 
jar.
[INFO] Excluding org.apache.httpcomponents:httpcore:jar:4.0.1 from the shaded 
jar.
[INFO] Excluding log4j:log4j:jar:1.2.17 from the shaded jar.
[INFO] Excluding org.slf4j:slf4j-api:jar:1.6.4 from the shaded jar.
[INFO] Excluding org.iq80.snappy:snappy:jar:0.3 from the shaded jar.
[INFO] Excluding org.apache.htrace:htrace-core:jar:3.1.0-incubating from the 
shaded jar.
[INFO] Excluding io.netty:netty-all:jar:4.0.23.Final from the shaded jar.
[INFO] Excluding commons-codec:commons-codec:jar:1.7 from the shaded jar.
[INFO] Excluding commons-collections:commons-collections:jar:3.2.2 from the 
shaded jar.
[INFO] Excluding org.apache.commons:commons-csv:jar:1.0 from the shaded jar.
[INFO] Excluding com.google.code.findbugs:jsr305:jar:2.0.1 from the shaded jar.
[INFO] Excluding org.apache.hbase:hbase-annotations:jar:1.2.0 from the shaded 
jar.
[INFO] Excluding org.apache.hbase:hbase-common:jar:1.2.0 from the shaded jar.
[INFO] Excluding org.mortbay.jetty:jetty-util:jar:6.1.26 from the shaded jar.
[INFO] Excluding org.apache.hbase:hbase-protocol:jar:1.2.0 from the shaded jar.
[INFO] Excluding org.apache.hbase:hbase-client:jar:1.2.0 from the shaded jar.
[INFO] Excluding org.apache.zookeeper:zookeeper:jar:3.4.6 from the shaded jar.
[INFO] Excluding org.jruby.jcodings:jcodings:jar:1.0.8 from the shaded jar.
[INFO] Excluding org.apache.hbase:hbase-server:jar:1.2.0 from the shaded jar.
[INFO] Excluding org.apache.hbase:hbase-procedure:jar:1.2.0 from the shaded jar.
[INFO] Excluding org.apache.hbase:hbase-prefix-tree:jar:1.2.0 from the shaded 
jar.
[INFO] Excluding commons-httpclient:commons-httpclient:jar:3.1 from the shaded 
jar.
[INFO] Excluding com.sun.jersey:jersey-core:jar:1.9 from the shaded jar.
[INFO] Excluding com.sun.jersey:jersey-server:jar:1.9 from the shaded jar.
[INFO] Excluding asm:asm:jar:3.1 from the shaded jar.
[INFO] Excluding org.mortbay.jetty:jetty:jar:6.1.26 from the shaded jar.
[INFO] Excluding org.mortbay.jetty:jetty-sslengine:jar:6.1.26 from the shaded 
jar.
[INFO] Excluding org.mortbay.jetty:jsp-2.1:jar:6.1.14 from the shaded jar.
[INFO] Excluding org.mortbay.jetty:jsp-api-2.1:jar:6.1.14 from the shaded jar.
[INFO] Excluding org.mortbay.jetty:servlet-api-2.5:jar:6.1.14 from the shaded 
jar.
[INFO] Excluding tomcat:jasper-compiler:jar:5.5.23 from the shaded jar.
[INFO] Excluding tomcat:jasper-runtime:jar:5.5.23 from the shaded jar.
[INFO] Excluding commons-el:commons-el:jar:1.0 from the shaded jar.
[INFO] Excluding org.jamon:jamon-runtime:jar:2.4.1 from the shaded jar.
[INFO] Excluding com.lmax:disruptor:jar:3.3.0 from the shaded jar.
[INFO] Excluding org.owasp.esapi:esapi:jar:2.1.0 from the shaded jar.
[INFO] Excluding commons-beanutils:commons-beanutils-core:jar:1.7.0 from the 
shaded jar.
[INFO] Excluding commons-fileupload:commons-fileupload:jar:1.2 from the shaded 
jar.
[INFO] Excluding xom:xom:jar:1.2.5 from the shaded jar.
[INFO] Excluding xalan:xalan:jar:2.7.0 from the shaded jar.
[INFO] Excluding org.beanshell:bsh-core:jar:2.0b4 from the shaded jar.
[INFO] Excluding org.owasp.antisamy:antisamy:jar:1.4.3 from the shaded jar.
[INFO] Excluding org.apache.xmlgraphics:batik-css:jar:1.7 from the shaded jar.
[INFO] Excluding org.apache.xmlgraphics:batik-ext:jar:1.7 from the shaded jar.
[INFO] Excluding org.apache.xmlgraphics:batik-util:jar:1.7 from the shaded jar.
[INFO] Excluding xml-apis:xml-apis-ext:jar:1.3.04 from the shaded jar.
[INFO] Excluding net.sourceforge.nekohtml:nekohtml:jar:1.9.12 from the shaded 
jar.
[INFO] Excluding org.apache.hbase:hbase-hadoop-compat:jar

Apache-Phoenix | 4.x-HBase-1.0 | Build Successful

2016-07-11 Thread Apache Jenkins Server
4.x-HBase-1.0 branch build status Successful

Source repository https://git-wip-us.apache.org/repos/asf?p=phoenix.git;a=shortlog;h=refs/heads/4.x-HBase-1.0

Compiled Artifacts https://builds.apache.org/job/Phoenix-4.x-HBase-1.0/lastSuccessfulBuild/artifact/

Test Report https://builds.apache.org/job/Phoenix-4.x-HBase-1.0/lastCompletedBuild/testReport/

Changes
[ankitsinghal59] PHOENIX-3060 pherf tool is not working (Sergey Soldatov)



Build times for last couple of runs. Latest build time is the rightmost | Legend: blue: normal, red: test failure, gray: timeout


Build failed in Jenkins: Phoenix-4.x-HBase-1.1 #115

2016-07-11 Thread Apache Jenkins Server
See 

Changes:

[ankitsinghal59] PHOENIX-3060 pherf tool is not working (Sergey Soldatov)

--
[...truncated 15006 lines...]
at 
org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:133)
at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:108)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.NoSuchMethodError: 
org.apache.hadoop.hbase.procedure2.ProcedureExecutor.getResult(J)Lorg/apache/hadoop/hbase/procedure2/ProcedureResult;
at 
org.apache.hadoop.hbase.master.procedure.ProcedureSyncWait.waitForProcedureToComplete(ProcedureSyncWait.java:81)
at org.apache.hadoop.hbase.master.HMaster.modifyTable(HMaster.java:1955)
at 
org.apache.hadoop.hbase.master.MasterRpcServices.modifyTable(MasterRpcServices.java:1098)
at 
org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:52193)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2117)
... 4 more

at 
org.apache.phoenix.iterate.ScannerLeaseRenewalIT.setUp(ScannerLeaseRenewalIT.java:90)
Caused by: org.apache.hadoop.hbase.DoNotRetryIOException: 
org.apache.hadoop.hbase.DoNotRetryIOException: java.lang.NoSuchMethodError: 
org.apache.hadoop.hbase.procedure2.ProcedureExecutor.getResult(J)Lorg/apache/hadoop/hbase/procedure2/ProcedureResult;
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2156)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:104)
at 
org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:133)
at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:108)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.NoSuchMethodError: 
org.apache.hadoop.hbase.procedure2.ProcedureExecutor.getResult(J)Lorg/apache/hadoop/hbase/procedure2/ProcedureResult;
at 
org.apache.hadoop.hbase.master.procedure.ProcedureSyncWait.waitForProcedureToComplete(ProcedureSyncWait.java:81)
at org.apache.hadoop.hbase.master.HMaster.modifyTable(HMaster.java:1955)
at 
org.apache.hadoop.hbase.master.MasterRpcServices.modifyTable(MasterRpcServices.java:1098)
at 
org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:52193)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2117)
... 4 more

at 
org.apache.phoenix.iterate.ScannerLeaseRenewalIT.setUp(ScannerLeaseRenewalIT.java:90)
Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException: 
org.apache.hadoop.hbase.DoNotRetryIOException: java.lang.NoSuchMethodError: 
org.apache.hadoop.hbase.procedure2.ProcedureExecutor.getResult(J)Lorg/apache/hadoop/hbase/procedure2/ProcedureResult;
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2156)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:104)
at 
org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:133)
at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:108)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.NoSuchMethodError: 
org.apache.hadoop.hbase.procedure2.ProcedureExecutor.getResult(J)Lorg/apache/hadoop/hbase/procedure2/ProcedureResult;
at 
org.apache.hadoop.hbase.master.procedure.ProcedureSyncWait.waitForProcedureToComplete(ProcedureSyncWait.java:81)
at org.apache.hadoop.hbase.master.HMaster.modifyTable(HMaster.java:1955)
at 
org.apache.hadoop.hbase.master.MasterRpcServices.modifyTable(MasterRpcServices.java:1098)
at 
org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:52193)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2117)
... 4 more

at 
org.apache.phoenix.iterate.ScannerLeaseRenewalIT.setUp(ScannerLeaseRenewalIT.java:90)

Running org.apache.phoenix.rpc.PhoenixClientRpcIT
Tests run: 1, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 0.019 sec <<< 
FAILURE! - in org.apache.phoenix.rpc.PhoenixClientRpcIT
org.apache.phoenix.rpc.PhoenixClientRpcIT  Time elapsed: 0.018 sec  <<< ERROR!
org.apache.phoenix.exception.PhoenixIOException: 
org.apache.hadoop.hbase.DoNotRetryIOException: java.lang.NoSuchMethodError: 
org.apache.hadoop.hbase.procedure2.ProcedureExecutor.getResult(J)Lorg/apache/hadoop/hbase/procedure2/ProcedureResult;
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2156)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:104)
at 
org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:133)
at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:108)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.NoSuchMethodError: 
org.apache.hadoop.hbase.proce
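
The repeated NoSuchMethodError in this job is a binary-compatibility break: the test JVM loaded an HBase build whose ProcedureExecutor no longer exposes the getResult(long) signature the calling code was compiled against (the descriptor (J)Lorg/apache/hadoop/hbase/procedure2/ProcedureResult; in the trace). As a hedged, self-contained illustration (java.lang.String stands in for the HBase class, and the probe helper is invented for this sketch), such drift can be detected eagerly with reflection instead of surfacing as a NoSuchMethodError deep inside an RPC call:

```java
public class ApiDriftProbe {
    // True if clazz declares a public method with this name and parameter types.
    static boolean hasMethod(Class<?> clazz, String name, Class<?>... params) {
        try {
            clazz.getMethod(name, params);
            return true;
        } catch (NoSuchMethodException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // A signature every JDK provides:
        System.out.println(hasMethod(String.class, "charAt", int.class));     // true
        // A signature that does not exist on this class; code compiled against
        // a different version and calling it anyway would throw
        // NoSuchMethodError at call time, as ProcedureExecutor.getResult(J)
        // does in the trace above.
        System.out.println(hasMethod(String.class, "getResult", long.class)); // false
    }
}
```

Running such a probe at cluster-startup time turns a mid-test crash into an immediate, explicit version-mismatch report.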

Apache-Phoenix | 4.x-HBase-1.0 | Build Successful

2016-07-11 Thread Apache Jenkins Server
4.x-HBase-1.0 branch build status Successful

Source repository https://git-wip-us.apache.org/repos/asf?p=phoenix.git;a=shortlog;h=refs/heads/4.x-HBase-1.0

Compiled Artifacts https://builds.apache.org/job/Phoenix-4.x-HBase-1.0/lastSuccessfulBuild/artifact/

Test Report https://builds.apache.org/job/Phoenix-4.x-HBase-1.0/lastCompletedBuild/testReport/

Changes
No changes


Build times for last couple of runs. Latest build time is the rightmost | Legend: blue: normal, red: test failure, gray: timeout


Jenkins build is back to normal : Phoenix | Master #1319

2016-07-11 Thread Apache Jenkins Server
See 



Apache-Phoenix | Master | Build Successful

2016-07-11 Thread Apache Jenkins Server
Master branch build status Successful
Source repository https://git-wip-us.apache.org/repos/asf?p=phoenix.git;a=shortlog;h=refs/heads/master

Last Successful Compiled Artifacts https://builds.apache.org/job/Phoenix-master/lastSuccessfulBuild/artifact/

Last Complete Test Report https://builds.apache.org/job/Phoenix-master/lastCompletedBuild/testReport/

Changes
No changes


Build times for last couple of runs. Latest build time is the rightmost | Legend: blue: normal, red: test failure, gray: timeout


Build failed in Jenkins: Phoenix-4.x-HBase-1.1 #116

2016-07-11 Thread Apache Jenkins Server
See 

--
[...truncated 15003 lines...]
at 
org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:133)
at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:108)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.NoSuchMethodError: 
org.apache.hadoop.hbase.procedure2.ProcedureExecutor.getResult(J)Lorg/apache/hadoop/hbase/procedure2/ProcedureResult;
at 
org.apache.hadoop.hbase.master.procedure.ProcedureSyncWait.waitForProcedureToComplete(ProcedureSyncWait.java:81)
at org.apache.hadoop.hbase.master.HMaster.modifyTable(HMaster.java:1955)
at 
org.apache.hadoop.hbase.master.MasterRpcServices.modifyTable(MasterRpcServices.java:1098)
at 
org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:52193)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2117)
... 4 more

at 
org.apache.phoenix.monitoring.PhoenixMetricsIT.doSetup(PhoenixMetricsIT.java:78)
Caused by: org.apache.hadoop.hbase.DoNotRetryIOException: 
org.apache.hadoop.hbase.DoNotRetryIOException: java.lang.NoSuchMethodError: 
org.apache.hadoop.hbase.procedure2.ProcedureExecutor.getResult(J)Lorg/apache/hadoop/hbase/procedure2/ProcedureResult;
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2156)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:104)
at 
org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:133)
at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:108)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.NoSuchMethodError: 
org.apache.hadoop.hbase.procedure2.ProcedureExecutor.getResult(J)Lorg/apache/hadoop/hbase/procedure2/ProcedureResult;
at 
org.apache.hadoop.hbase.master.procedure.ProcedureSyncWait.waitForProcedureToComplete(ProcedureSyncWait.java:81)
at org.apache.hadoop.hbase.master.HMaster.modifyTable(HMaster.java:1955)
at 
org.apache.hadoop.hbase.master.MasterRpcServices.modifyTable(MasterRpcServices.java:1098)
at 
org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:52193)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2117)
... 4 more

at 
org.apache.phoenix.monitoring.PhoenixMetricsIT.doSetup(PhoenixMetricsIT.java:78)
Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException: 
org.apache.hadoop.hbase.DoNotRetryIOException: java.lang.NoSuchMethodError: 
org.apache.hadoop.hbase.procedure2.ProcedureExecutor.getResult(J)Lorg/apache/hadoop/hbase/procedure2/ProcedureResult;
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2156)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:104)
at 
org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:133)
at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:108)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.NoSuchMethodError: 
org.apache.hadoop.hbase.procedure2.ProcedureExecutor.getResult(J)Lorg/apache/hadoop/hbase/procedure2/ProcedureResult;
at 
org.apache.hadoop.hbase.master.procedure.ProcedureSyncWait.waitForProcedureToComplete(ProcedureSyncWait.java:81)
at org.apache.hadoop.hbase.master.HMaster.modifyTable(HMaster.java:1955)
at 
org.apache.hadoop.hbase.master.MasterRpcServices.modifyTable(MasterRpcServices.java:1098)
at 
org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:52193)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2117)
... 4 more

at 
org.apache.phoenix.monitoring.PhoenixMetricsIT.doSetup(PhoenixMetricsIT.java:78)

Running org.apache.phoenix.rpc.PhoenixClientRpcIT
Running org.apache.phoenix.rpc.PhoenixServerRpcIT
Tests run: 1, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 0.013 sec <<< 
FAILURE! - in org.apache.phoenix.rpc.PhoenixClientRpcIT
org.apache.phoenix.rpc.PhoenixClientRpcIT  Time elapsed: 0.012 sec  <<< ERROR!
org.apache.phoenix.exception.PhoenixIOException: 
org.apache.hadoop.hbase.DoNotRetryIOException: java.lang.NoSuchMethodError: 
org.apache.hadoop.hbase.procedure2.ProcedureExecutor.getResult(J)Lorg/apache/hadoop/hbase/procedure2/ProcedureResult;
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2156)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:104)
at 
org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:133)
at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:108)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.NoSuchMethodError: 
org.apache.hadoop.hbase.procedure2.ProcedureExecutor.getResult(J)Lorg/apache/hadoop/hb

phoenix git commit: PHOENIX-3061 IndexTool marks index as ACTIVE and exit 0 even if bulkload has error (Simon Wang)

2016-07-11 Thread mujtaba
Repository: phoenix
Updated Branches:
  refs/heads/4.x-HBase-0.98 e3f12f215 -> cf4e33b33


PHOENIX-3061 IndexTool marks index as ACTIVE and exit 0 even if bulkload has 
error (Simon Wang)


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/cf4e33b3
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/cf4e33b3
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/cf4e33b3

Branch: refs/heads/4.x-HBase-0.98
Commit: cf4e33b332476ddc88dcd4d9f9ac785f87efc599
Parents: e3f12f2
Author: Mujtaba 
Authored: Mon Jul 11 14:02:21 2016 -0700
Committer: Mujtaba 
Committed: Mon Jul 11 14:02:21 2016 -0700

--
 .../phoenix/mapreduce/index/IndexTool.java   | 19 ---
 1 file changed, 8 insertions(+), 11 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/cf4e33b3/phoenix-core/src/main/java/org/apache/phoenix/mapreduce/index/IndexTool.java
--
diff --git 
a/phoenix-core/src/main/java/org/apache/phoenix/mapreduce/index/IndexTool.java 
b/phoenix-core/src/main/java/org/apache/phoenix/mapreduce/index/IndexTool.java
index 576dbd3..1b1f0fb 100644
--- 
a/phoenix-core/src/main/java/org/apache/phoenix/mapreduce/index/IndexTool.java
+++ 
b/phoenix-core/src/main/java/org/apache/phoenix/mapreduce/index/IndexTool.java
@@ -256,8 +256,8 @@ public class IndexTool extends Configured implements Tool {
             }
             return 0;
         } catch (Exception ex) {
-            LOG.error(" An exception occured while performing the indexing job : "
-                    + ExceptionUtils.getStackTrace(ex));
+            LOG.error("An exception occurred while performing the indexing job: "
+                    + ExceptionUtils.getMessage(ex) + " at:\n" + ExceptionUtils.getStackTrace(ex));
             return -1;
         } finally {
             try {
@@ -278,7 +278,7 @@ public class IndexTool extends Configured implements Tool {
      * @return
      * @throws Exception
      */
-    private int configureRunnableJobUsingBulkLoad(Job job, Path outputPath) throws Exception {
+    private void configureRunnableJobUsingBulkLoad(Job job, Path outputPath) throws Exception {
         job.setMapOutputKeyClass(ImmutableBytesWritable.class);
         job.setMapOutputValueClass(KeyValue.class);
         final Configuration configuration = job.getConfiguration();
@@ -288,9 +288,9 @@ public class IndexTool extends Configured implements Tool {
         HFileOutputFormat.configureIncrementalLoad(job, htable);
         boolean status = job.waitForCompletion(true);
         if (!status) {
-            LOG.error("Failed to run the IndexTool job. ");
+            LOG.error("Failed to run the IndexTool job.");
             htable.close();
-            return -1;
+            throw new Exception("IndexTool job failed: " + job.toString());
         }
 
         LOG.info("Loading HFiles from {}", outputPath);
@@ -299,8 +299,6 @@ public class IndexTool extends Configured implements Tool {
         htable.close();
 
         FileSystem.get(configuration).delete(outputPath, true);
-
-        return 0;
     }
 
     /**
@@ -314,7 +312,7 @@ public class IndexTool extends Configured implements Tool {
      * @return
      * @throws Exception
      */
-    private int configureSubmittableJobUsingDirectApi(Job job, Path outputPath, boolean runForeground)
+    private void configureSubmittableJobUsingDirectApi(Job job, Path outputPath, boolean runForeground)
             throws Exception {
         Configuration conf = job.getConfiguration();
         HBaseConfiguration.merge(conf, HBaseConfiguration.create(conf));
@@ -333,16 +331,15 @@ public class IndexTool extends Configured implements Tool {
         if (!runForeground) {
             LOG.info("Running Index Build in Background - Submit async and exit");
             job.submit();
-            return 0;
+            return;
         }
         LOG.info("Running Index Build in Foreground. Waits for the build to complete. This may take a long time!.");
         boolean result = job.waitForCompletion(true);
         if (!result) {
             LOG.error("Job execution failed!");
-            return -1;
+            throw new Exception("IndexTool job failed: " + job.toString());
         }
         FileSystem.get(conf).delete(outputPath, true);
-        return 0;
     }
 
 /**
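The patch above switches the two `configure...` helpers from returning an `int` status (which the caller could silently drop, letting the tool exit 0 and mark the index ACTIVE after a failed bulkload) to `void` methods that throw, so any failure propagates to the single `catch` in `run()` and maps to exit code -1. The following is a minimal plain-Java sketch of that pattern, with no Hadoop dependencies; `ExitCodeDemo`, `helperReturningCode`, and `helperThrowing` are illustrative names, not part of Phoenix.

```java
// Sketch of the PHOENIX-3061 fix pattern: report failure by throwing, not by a
// return code the caller may forget to check. All names here are hypothetical.
public class ExitCodeDemo {

    // Before: failure is a return value — trivially ignorable by the caller.
    static int helperReturningCode(boolean jobSucceeded) {
        if (!jobSucceeded) {
            return -1; // nothing forces the caller to look at this
        }
        return 0;
    }

    // After: failure is an exception — it cannot be silently discarded.
    static void helperThrowing(boolean jobSucceeded) throws Exception {
        if (!jobSucceeded) {
            throw new Exception("job failed");
        }
    }

    // Mirrors the shape of IndexTool.run(): one catch block is the single
    // place where any failure is translated into a nonzero exit code.
    public static int run(boolean jobSucceeded) {
        try {
            helperThrowing(jobSucceeded);
            return 0;
        } catch (Exception ex) {
            return -1;
        }
    }

    public static void main(String[] args) {
        // The buggy shape: the status is computed but never checked,
        // so a failed job still looks like success to the shell.
        helperReturningCode(false);
        // The fixed shape: failure reaches the exit code.
        if (run(false) != -1) throw new AssertionError("failure must map to -1");
        if (run(true) != 0) throw new AssertionError("success must map to 0");
        System.out.println("ok");
    }
}
```

The design point is that an unchecked `int` status is an invisible contract, while a checked exception makes the compiler enforce that every caller either handles or propagates the failure.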



phoenix git commit: PHOENIX-3061 IndexTool marks index as ACTIVE and exit 0 even if bulkload has error (Simon Wang)

2016-07-11 Thread mujtaba
Repository: phoenix
Updated Branches:
  refs/heads/master af8bbfb52 -> b49d55831


PHOENIX-3061 IndexTool marks index as ACTIVE and exit 0 even if bulkload has error (Simon Wang)


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/b49d5583
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/b49d5583
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/b49d5583

Branch: refs/heads/master
Commit: b49d558315501151f6e84990d06682de0be87ef7
Parents: af8bbfb
Author: Mujtaba 
Authored: Mon Jul 11 14:03:39 2016 -0700
Committer: Mujtaba 
Committed: Mon Jul 11 14:03:39 2016 -0700

--
 .../phoenix/mapreduce/index/IndexTool.java   | 19 ---
 1 file changed, 8 insertions(+), 11 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/b49d5583/phoenix-core/src/main/java/org/apache/phoenix/mapreduce/index/IndexTool.java
--
diff --git a/phoenix-core/src/main/java/org/apache/phoenix/mapreduce/index/IndexTool.java b/phoenix-core/src/main/java/org/apache/phoenix/mapreduce/index/IndexTool.java
index 576dbd3..1b1f0fb 100644
--- a/phoenix-core/src/main/java/org/apache/phoenix/mapreduce/index/IndexTool.java
+++ b/phoenix-core/src/main/java/org/apache/phoenix/mapreduce/index/IndexTool.java
@@ -256,8 +256,8 @@ public class IndexTool extends Configured implements Tool {
             }
             return 0;
         } catch (Exception ex) {
-            LOG.error(" An exception occured while performing the indexing job : "
-                    + ExceptionUtils.getStackTrace(ex));
+            LOG.error("An exception occurred while performing the indexing job: "
+                    + ExceptionUtils.getMessage(ex) + " at:\n" + ExceptionUtils.getStackTrace(ex));
             return -1;
         } finally {
             try {
@@ -278,7 +278,7 @@ public class IndexTool extends Configured implements Tool {
      * @return
      * @throws Exception
      */
-    private int configureRunnableJobUsingBulkLoad(Job job, Path outputPath) throws Exception {
+    private void configureRunnableJobUsingBulkLoad(Job job, Path outputPath) throws Exception {
         job.setMapOutputKeyClass(ImmutableBytesWritable.class);
         job.setMapOutputValueClass(KeyValue.class);
         final Configuration configuration = job.getConfiguration();
@@ -288,9 +288,9 @@ public class IndexTool extends Configured implements Tool {
         HFileOutputFormat.configureIncrementalLoad(job, htable);
         boolean status = job.waitForCompletion(true);
         if (!status) {
-            LOG.error("Failed to run the IndexTool job. ");
+            LOG.error("Failed to run the IndexTool job.");
             htable.close();
-            return -1;
+            throw new Exception("IndexTool job failed: " + job.toString());
         }

         LOG.info("Loading HFiles from {}", outputPath);
@@ -299,8 +299,6 @@ public class IndexTool extends Configured implements Tool {
         htable.close();

         FileSystem.get(configuration).delete(outputPath, true);
-
-        return 0;
     }

     /**
@@ -314,7 +312,7 @@ public class IndexTool extends Configured implements Tool {
      * @return
      * @throws Exception
      */
-    private int configureSubmittableJobUsingDirectApi(Job job, Path outputPath, boolean runForeground)
+    private void configureSubmittableJobUsingDirectApi(Job job, Path outputPath, boolean runForeground)
             throws Exception {
         Configuration conf = job.getConfiguration();
         HBaseConfiguration.merge(conf, HBaseConfiguration.create(conf));
@@ -333,16 +331,15 @@ public class IndexTool extends Configured implements Tool {
         if (!runForeground) {
             LOG.info("Running Index Build in Background - Submit async and exit");
             job.submit();
-            return 0;
+            return;
         }
         LOG.info("Running Index Build in Foreground. Waits for the build to complete. This may take a long time!.");
         boolean result = job.waitForCompletion(true);
         if (!result) {
             LOG.error("Job execution failed!");
-            return -1;
+            throw new Exception("IndexTool job failed: " + job.toString());
         }
         FileSystem.get(conf).delete(outputPath, true);
-        return 0;
     }

     /**



phoenix git commit: PHOENIX-3061 IndexTool marks index as ACTIVE and exit 0 even if bulkload has error (Simon Wang)

2016-07-11 Thread mujtaba
Repository: phoenix
Updated Branches:
  refs/heads/4.x-HBase-1.0 4dfe5cb93 -> fd1ae10fb


PHOENIX-3061 IndexTool marks index as ACTIVE and exit 0 even if bulkload has error (Simon Wang)


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/fd1ae10f
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/fd1ae10f
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/fd1ae10f

Branch: refs/heads/4.x-HBase-1.0
Commit: fd1ae10fb05818e1e2cd211d72d414a5a211a463
Parents: 4dfe5cb
Author: Mujtaba 
Authored: Mon Jul 11 14:04:22 2016 -0700
Committer: Mujtaba 
Committed: Mon Jul 11 14:04:22 2016 -0700

--
 .../phoenix/mapreduce/index/IndexTool.java   | 19 ---
 1 file changed, 8 insertions(+), 11 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/fd1ae10f/phoenix-core/src/main/java/org/apache/phoenix/mapreduce/index/IndexTool.java
--
diff --git a/phoenix-core/src/main/java/org/apache/phoenix/mapreduce/index/IndexTool.java b/phoenix-core/src/main/java/org/apache/phoenix/mapreduce/index/IndexTool.java
index 576dbd3..1b1f0fb 100644
--- a/phoenix-core/src/main/java/org/apache/phoenix/mapreduce/index/IndexTool.java
+++ b/phoenix-core/src/main/java/org/apache/phoenix/mapreduce/index/IndexTool.java
@@ -256,8 +256,8 @@ public class IndexTool extends Configured implements Tool {
             }
             return 0;
         } catch (Exception ex) {
-            LOG.error(" An exception occured while performing the indexing job : "
-                    + ExceptionUtils.getStackTrace(ex));
+            LOG.error("An exception occurred while performing the indexing job: "
+                    + ExceptionUtils.getMessage(ex) + " at:\n" + ExceptionUtils.getStackTrace(ex));
             return -1;
         } finally {
             try {
@@ -278,7 +278,7 @@ public class IndexTool extends Configured implements Tool {
      * @return
      * @throws Exception
      */
-    private int configureRunnableJobUsingBulkLoad(Job job, Path outputPath) throws Exception {
+    private void configureRunnableJobUsingBulkLoad(Job job, Path outputPath) throws Exception {
         job.setMapOutputKeyClass(ImmutableBytesWritable.class);
         job.setMapOutputValueClass(KeyValue.class);
         final Configuration configuration = job.getConfiguration();
@@ -288,9 +288,9 @@ public class IndexTool extends Configured implements Tool {
         HFileOutputFormat.configureIncrementalLoad(job, htable);
         boolean status = job.waitForCompletion(true);
         if (!status) {
-            LOG.error("Failed to run the IndexTool job. ");
+            LOG.error("Failed to run the IndexTool job.");
             htable.close();
-            return -1;
+            throw new Exception("IndexTool job failed: " + job.toString());
         }

         LOG.info("Loading HFiles from {}", outputPath);
@@ -299,8 +299,6 @@ public class IndexTool extends Configured implements Tool {
         htable.close();

         FileSystem.get(configuration).delete(outputPath, true);
-
-        return 0;
     }

     /**
@@ -314,7 +312,7 @@ public class IndexTool extends Configured implements Tool {
      * @return
      * @throws Exception
      */
-    private int configureSubmittableJobUsingDirectApi(Job job, Path outputPath, boolean runForeground)
+    private void configureSubmittableJobUsingDirectApi(Job job, Path outputPath, boolean runForeground)
             throws Exception {
         Configuration conf = job.getConfiguration();
         HBaseConfiguration.merge(conf, HBaseConfiguration.create(conf));
@@ -333,16 +331,15 @@ public class IndexTool extends Configured implements Tool {
         if (!runForeground) {
             LOG.info("Running Index Build in Background - Submit async and exit");
             job.submit();
-            return 0;
+            return;
         }
         LOG.info("Running Index Build in Foreground. Waits for the build to complete. This may take a long time!.");
         boolean result = job.waitForCompletion(true);
         if (!result) {
             LOG.error("Job execution failed!");
-            return -1;
+            throw new Exception("IndexTool job failed: " + job.toString());
         }
         FileSystem.get(conf).delete(outputPath, true);
-        return 0;
     }

     /**



phoenix git commit: PHOENIX-3061 IndexTool marks index as ACTIVE and exit 0 even if bulkload has error (Simon Wang)

2016-07-11 Thread mujtaba
Repository: phoenix
Updated Branches:
  refs/heads/4.x-HBase-1.1 913ee6a42 -> 5721169a5


PHOENIX-3061 IndexTool marks index as ACTIVE and exit 0 even if bulkload has error (Simon Wang)


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/5721169a
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/5721169a
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/5721169a

Branch: refs/heads/4.x-HBase-1.1
Commit: 5721169a54a8486ebc81a57981a72859ca28a2c4
Parents: 913ee6a
Author: Mujtaba 
Authored: Mon Jul 11 14:06:01 2016 -0700
Committer: Mujtaba 
Committed: Mon Jul 11 14:06:01 2016 -0700

--
 .../phoenix/mapreduce/index/IndexTool.java   | 19 ---
 1 file changed, 8 insertions(+), 11 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/5721169a/phoenix-core/src/main/java/org/apache/phoenix/mapreduce/index/IndexTool.java
--
diff --git a/phoenix-core/src/main/java/org/apache/phoenix/mapreduce/index/IndexTool.java b/phoenix-core/src/main/java/org/apache/phoenix/mapreduce/index/IndexTool.java
index 576dbd3..1b1f0fb 100644
--- a/phoenix-core/src/main/java/org/apache/phoenix/mapreduce/index/IndexTool.java
+++ b/phoenix-core/src/main/java/org/apache/phoenix/mapreduce/index/IndexTool.java
@@ -256,8 +256,8 @@ public class IndexTool extends Configured implements Tool {
             }
             return 0;
         } catch (Exception ex) {
-            LOG.error(" An exception occured while performing the indexing job : "
-                    + ExceptionUtils.getStackTrace(ex));
+            LOG.error("An exception occurred while performing the indexing job: "
+                    + ExceptionUtils.getMessage(ex) + " at:\n" + ExceptionUtils.getStackTrace(ex));
             return -1;
         } finally {
             try {
@@ -278,7 +278,7 @@ public class IndexTool extends Configured implements Tool {
      * @return
      * @throws Exception
      */
-    private int configureRunnableJobUsingBulkLoad(Job job, Path outputPath) throws Exception {
+    private void configureRunnableJobUsingBulkLoad(Job job, Path outputPath) throws Exception {
         job.setMapOutputKeyClass(ImmutableBytesWritable.class);
         job.setMapOutputValueClass(KeyValue.class);
         final Configuration configuration = job.getConfiguration();
@@ -288,9 +288,9 @@ public class IndexTool extends Configured implements Tool {
         HFileOutputFormat.configureIncrementalLoad(job, htable);
         boolean status = job.waitForCompletion(true);
         if (!status) {
-            LOG.error("Failed to run the IndexTool job. ");
+            LOG.error("Failed to run the IndexTool job.");
             htable.close();
-            return -1;
+            throw new Exception("IndexTool job failed: " + job.toString());
         }

         LOG.info("Loading HFiles from {}", outputPath);
@@ -299,8 +299,6 @@ public class IndexTool extends Configured implements Tool {
         htable.close();

         FileSystem.get(configuration).delete(outputPath, true);
-
-        return 0;
     }

     /**
@@ -314,7 +312,7 @@ public class IndexTool extends Configured implements Tool {
      * @return
      * @throws Exception
      */
-    private int configureSubmittableJobUsingDirectApi(Job job, Path outputPath, boolean runForeground)
+    private void configureSubmittableJobUsingDirectApi(Job job, Path outputPath, boolean runForeground)
             throws Exception {
         Configuration conf = job.getConfiguration();
         HBaseConfiguration.merge(conf, HBaseConfiguration.create(conf));
@@ -333,16 +331,15 @@ public class IndexTool extends Configured implements Tool {
         if (!runForeground) {
             LOG.info("Running Index Build in Background - Submit async and exit");
             job.submit();
-            return 0;
+            return;
         }
         LOG.info("Running Index Build in Foreground. Waits for the build to complete. This may take a long time!.");
         boolean result = job.waitForCompletion(true);
         if (!result) {
             LOG.error("Job execution failed!");
-            return -1;
+            throw new Exception("IndexTool job failed: " + job.toString());
         }
         FileSystem.get(conf).delete(outputPath, true);
-        return 0;
     }

     /**



phoenix git commit: Added ability to run async indexes when hbase cluster is in non-distributed mode or when mr is in local mode

2016-07-11 Thread tdsilva
Repository: phoenix
Updated Branches:
  refs/heads/4.x-HBase-0.98 cf4e33b33 -> 04df7bca0


Added ability to run async indexes when hbase cluster is in non-distributed mode or when mr is in local mode


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/04df7bca
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/04df7bca
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/04df7bca

Branch: refs/heads/4.x-HBase-0.98
Commit: 04df7bca07607f9db7f7bba5f726bff6013b81e0
Parents: cf4e33b
Author: tejamobref 
Authored: Tue Jul 12 01:37:12 2016 +0530
Committer: Thomas D'Silva 
Committed: Mon Jul 11 15:15:09 2016 -0700

--
 .../phoenix/end2end/index/AsyncIndexIT.java | 180 +++
 .../coprocessor/MetaDataRegionObserver.java | 113 ++--
 .../phoenix/jdbc/PhoenixEmbeddedDriver.java |   4 +-
 .../index/automation/PhoenixMRJobSubmitter.java |  13 +-
 .../apache/phoenix/query/QueryConstants.java|  18 +-
 .../org/apache/phoenix/query/QueryServices.java |   4 +-
 6 files changed, 308 insertions(+), 24 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/04df7bca/phoenix-core/src/it/java/org/apache/phoenix/end2end/index/AsyncIndexIT.java
--
diff --git a/phoenix-core/src/it/java/org/apache/phoenix/end2end/index/AsyncIndexIT.java b/phoenix-core/src/it/java/org/apache/phoenix/end2end/index/AsyncIndexIT.java
new file mode 100644
index 000..43d1bd9
--- /dev/null
+++ b/phoenix-core/src/it/java/org/apache/phoenix/end2end/index/AsyncIndexIT.java
@@ -0,0 +1,180 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.phoenix.end2end.index;
+
+import static org.apache.phoenix.jdbc.PhoenixDatabaseMetaData.ASYNC_CREATED_DATE;
+import static org.apache.phoenix.jdbc.PhoenixDatabaseMetaData.COLUMN_FAMILY;
+import static org.apache.phoenix.jdbc.PhoenixDatabaseMetaData.COLUMN_NAME;
+import static org.apache.phoenix.jdbc.PhoenixDatabaseMetaData.DATA_TABLE_NAME;
+import static org.apache.phoenix.jdbc.PhoenixDatabaseMetaData.SYSTEM_CATALOG_SCHEMA;
+import static org.apache.phoenix.jdbc.PhoenixDatabaseMetaData.SYSTEM_CATALOG_TABLE;
+import static org.apache.phoenix.jdbc.PhoenixDatabaseMetaData.TABLE_NAME;
+import static org.apache.phoenix.jdbc.PhoenixDatabaseMetaData.TABLE_SCHEM;
+import static org.apache.phoenix.jdbc.PhoenixDatabaseMetaData.TABLE_TYPE;
+import static org.junit.Assert.assertFalse;
+import static org.junit.Assert.assertTrue;
+
+import java.io.IOException;
+import java.sql.Connection;
+import java.sql.DriverManager;
+import java.sql.ResultSet;
+import java.sql.SQLException;
+import java.sql.Statement;
+import java.util.HashMap;
+import java.util.Map;
+
+import org.apache.phoenix.end2end.NeedsOwnMiniClusterTest;
+import org.apache.phoenix.jdbc.PhoenixDatabaseMetaData;
+import org.apache.phoenix.query.BaseTest;
+import org.apache.phoenix.query.QueryServices;
+import org.apache.phoenix.schema.PIndexState;
+import org.apache.phoenix.schema.PTableType;
+import org.apache.phoenix.schema.types.PDate;
+import org.apache.phoenix.util.ReadOnlyProps;
+import org.junit.After;
+import org.junit.Test;
+import org.junit.experimental.categories.Category;
+
+@Category(NeedsOwnMiniClusterTest.class)
+public class AsyncIndexIT extends BaseTest {
+
+    private static final String PERSON_TABLE_NAME = "PERSON";
+    private static final String PERSON_TABLE_NAME_WITH_SCHEMA = "TEST.PERSON";
+    private static final String TEST_SCHEMA = "TEST";
+
+    private static final String PERSON_TABLE_ASYNC_INDEX_INFO_QUERY = "SELECT "
+            + DATA_TABLE_NAME + ", " + TABLE_SCHEM + ", "
+            + TABLE_NAME + " FROM " + SYSTEM_CATALOG_SCHEMA + ".\""
+            + SYSTEM_CATALOG_TABLE + "\""
+            + " (" + ASYNC_CREATED_DATE + " "
+            + PDate.INSTANCE.getSqlTypeName() + ") " + " WHERE "
+            + COLUMN_NAME + " IS NULL and " + COLUMN_FAMILY + " IS NULL  and "
+            + ASYNC_CREATED_DATE + " IS NOT NULL and "
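The test above builds its discovery SQL from `PhoenixDatabaseMetaData` constants: it selects rows of `SYSTEM."CATALOG"` whose dynamic `ASYNC_CREATED_DATE` column is non-null, i.e. indexes created with `CREATE INDEX ... ASYNC`. The following standalone sketch shows how such constants compose into the final query string; the constant values are hard-coded assumptions mirroring Phoenix's metadata class (not imported from it), and `AsyncIndexQuerySketch`/`buildQuery` are hypothetical names, so the snippet runs without any Phoenix dependency.

```java
// Illustrative sketch: assembling the async-index catalog query from metadata
// constants. Constant values below are assumed copies of those in Phoenix's
// PhoenixDatabaseMetaData; verify against the real class before relying on them.
public class AsyncIndexQuerySketch {
    static final String DATA_TABLE_NAME = "DATA_TABLE_NAME";
    static final String TABLE_SCHEM = "TABLE_SCHEM";
    static final String TABLE_NAME = "TABLE_NAME";
    static final String SYSTEM_CATALOG_SCHEMA = "SYSTEM";
    static final String SYSTEM_CATALOG_TABLE = "CATALOG";
    static final String COLUMN_NAME = "COLUMN_NAME";
    static final String COLUMN_FAMILY = "COLUMN_FAMILY";
    static final String ASYNC_CREATED_DATE = "ASYNC_CREATED_DATE";

    public static String buildQuery() {
        // "(ASYNC_CREATED_DATE DATE)" declares a dynamic column on the catalog
        // table; rows where it is non-null correspond to pending ASYNC index builds.
        return "SELECT " + DATA_TABLE_NAME + ", " + TABLE_SCHEM + ", " + TABLE_NAME
                + " FROM " + SYSTEM_CATALOG_SCHEMA + ".\"" + SYSTEM_CATALOG_TABLE + "\""
                + " (" + ASYNC_CREATED_DATE + " DATE) WHERE "
                + COLUMN_NAME + " IS NULL and " + COLUMN_FAMILY + " IS NULL and "
                + ASYNC_CREATED_DATE + " IS NOT NULL";
    }

    public static void main(String[] args) {
        // Print the assembled SQL so the shape of the query is visible.
        System.out.println(buildQuery());
    }
}
```

Building the SQL from named constants rather than a literal string keeps the test in sync with the catalog's column names if they are ever renamed in one place.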

Build failed in Jenkins: Phoenix | 4.x-HBase-0.98 #1226

2016-07-11 Thread Apache Jenkins Server
See 

Changes:

[mujtaba] PHOENIX-3061 IndexTool marks index as ACTIVE and exit 0 even if

--
[...truncated 895 lines...]
    at org.apache.hadoop.hbase.regionserver.HRegion.getScanner(HRegion.java:2024)
    at org.apache.hadoop.hbase.regionserver.HRegion.getScanner(HRegion.java:2004)
    at org.apache.hadoop.hbase.regionserver.HRegionServer.scan(HRegionServer.java:3247)
    at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:31190)
    at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2149)
    at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:104)
    at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:133)
    at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:108)
    at java.lang.Thread.run(Thread.java:745)


testNewColumnFamilyInheritsTTLOfEmptyCF(org.apache.phoenix.end2end.AlterTableIT)  Time elapsed: 10.231 sec  <<< ERROR!
org.apache.phoenix.exception.PhoenixIOException: org.apache.phoenix.exception.PhoenixIOException: org.apache.hadoop.hbase.regionserver.NoSuchColumnFamilyException: Column family CF does not exist in region NEWCFTTLTEST,\x06\x00\x00,1468274203479.e49f4ec4d6cd92c733b46894fe3336d5. in table 'NEWCFTTLTEST', {TABLE_ATTRIBUTES => {coprocessor$1 => '|org.apache.phoenix.coprocessor.ScanRegionObserver|805306366|', coprocessor$2 => '|org.apache.phoenix.coprocessor.UngroupedAggregateRegionObserver|805306366|', coprocessor$3 => '|org.apache.phoenix.coprocessor.GroupedAggregateRegionObserver|805306366|', coprocessor$4 => '|org.apache.phoenix.coprocessor.ServerCachingEndpointImpl|805306366|', coprocessor$5 => '|org.apache.phoenix.hbase.index.Indexer|805306366|index.builder=org.apache.phoenix.index.PhoenixIndexBuilder,org.apache.hadoop.hbase.index.codec.class=org.apache.phoenix.index.PhoenixIndexCodec'}, {NAME => '0', DATA_BLOCK_ENCODING => 'FAST_DIFF', BLOOMFILTER => 'ROW', REPLICATION_SCOPE => '0', COMPRESSION => 'NONE', VERSIONS => '1', TTL => '1000 SECONDS (16 MINUTES 40 SECONDS)', MIN_VERSIONS => '0', KEEP_DELETED_CELLS => 'FALSE', BLOCKSIZE => '65536', IN_MEMORY => 'false', BLOCKCACHE => 'true'}
    at org.apache.hadoop.hbase.regionserver.HRegion.checkFamily(HRegion.java:5901)
    at org.apache.hadoop.hbase.regionserver.HRegion.getScanner(HRegion.java:2024)
    at org.apache.hadoop.hbase.regionserver.HRegion.getScanner(HRegion.java:2004)
    at org.apache.hadoop.hbase.regionserver.HRegionServer.scan(HRegionServer.java:3247)
    at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:31190)
    at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2149)
    at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:104)
    at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:133)
    at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:108)
    at java.lang.Thread.run(Thread.java:745)

Caused by: java.util.concurrent.ExecutionException: org.apache.phoenix.exception.PhoenixIOException: org.apache.hadoop.hbase.regionserver.NoSuchColumnFamilyException: Column family CF does not exist in region NEWCFTTLTEST,\x06\x00\x00,1468274203479.e49f4ec4d6cd92c733b46894fe3336d5. in table 'NEWCFTTLTEST', {TABLE_ATTRIBUTES => {coprocessor$1 => '|org.apache.phoenix.coprocessor.ScanRegionObserver|805306366|', coprocessor$2 => '|org.apache.phoenix.coprocessor.UngroupedAggregateRegionObserver|805306366|', coprocessor$3 => '|org.apache.phoenix.coprocessor.GroupedAggregateRegionObserver|805306366|', coprocessor$4 => '|org.apache.phoenix.coprocessor.ServerCachingEndpointImpl|805306366|', coprocessor$5 => '|org.apache.phoenix.hbase.index.Indexer|805306366|index.builder=org.apache.phoenix.index.PhoenixIndexBuilder,org.apache.hadoop.hbase.index.codec.class=org.apache.phoenix.index.PhoenixIndexCodec'}, {NAME => '0', DATA_BLOCK_ENCODING => 'FAST_DIFF', BLOOMFILTER => 'ROW', REPLICATION_SCOPE => '0', COMPRESSION => 'NONE', VERSIONS => '1', TTL => '1000 SECONDS (16 MINUTES 40 SECONDS)', MIN_VERSIONS => '0', KEEP_DELETED_CELLS => 'FALSE', BLOCKSIZE => '65536', IN_MEMORY => 'false', BLOCKCACHE => 'true'}
    at org.apache.hadoop.hbase.regionserver.HRegion.checkFamily(HRegion.java:5901)
    at org.apache.hadoop.hbase.regionserver.HRegion.getScanner(HRegion.java:2024)
    at org.apache.hadoop.hbase.regionserver.HRegion.getScanner(HRegion.java:2004)
    at org.apache.hadoop.hbase.regionserver.HRegionServer.scan(HRegionServer.java:3247)
    at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:31190)
    at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2149)
    at org.apache.hadoop.h

Build failed in Jenkins: Phoenix-4.x-HBase-1.0 #564

2016-07-11 Thread Apache Jenkins Server
See 

Changes:

[mujtaba] PHOENIX-3061 IndexTool marks index as ACTIVE and exit 0 even if

--
[...truncated 749 lines...]
Running org.apache.phoenix.end2end.OctetLengthFunctionEnd2EndIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.619 sec - in org.apache.phoenix.end2end.MinMaxAggregateFunctionIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.33 sec - in org.apache.phoenix.end2end.OctetLengthFunctionEnd2EndIT
Running org.apache.phoenix.end2end.PrimitiveTypeIT
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.207 sec - in org.apache.phoenix.end2end.PrimitiveTypeIT
Running org.apache.phoenix.end2end.QueryMoreIT
Running org.apache.phoenix.end2end.PowerFunctionEnd2EndIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.408 sec - in org.apache.phoenix.end2end.PowerFunctionEnd2EndIT
Tests run: 26, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 36.087 sec - in org.apache.phoenix.end2end.ArithmeticQueryIT
Running org.apache.phoenix.end2end.RTrimFunctionIT
Running org.apache.phoenix.end2end.ReadOnlyIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.039 sec - in org.apache.phoenix.end2end.RTrimFunctionIT
Running org.apache.phoenix.end2end.RegexpSplitFunctionIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.435 sec - in org.apache.phoenix.end2end.ReadOnlyIT
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.69 sec - in org.apache.phoenix.end2end.RegexpSplitFunctionIT
Running org.apache.phoenix.end2end.RoundFloorCeilFunctionsEnd2EndIT
Running org.apache.phoenix.end2end.ReverseFunctionIT
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.34 sec - in org.apache.phoenix.end2end.DistinctPrefixFilterIT
Running org.apache.phoenix.end2end.SerialIteratorsIT
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 18.431 sec - in org.apache.phoenix.end2end.NthValueFunctionIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.691 sec - in org.apache.phoenix.end2end.SerialIteratorsIT
Running org.apache.phoenix.end2end.ServerExceptionIT
Running org.apache.phoenix.end2end.SignFunctionEnd2EndIT
Tests run: 33, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.7 sec - in org.apache.phoenix.end2end.RoundFloorCeilFunctionsEnd2EndIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.342 sec - in org.apache.phoenix.end2end.SignFunctionEnd2EndIT
Running org.apache.phoenix.end2end.SortOrderIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.708 sec - in org.apache.phoenix.end2end.ServerExceptionIT
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.206 sec - in org.apache.phoenix.end2end.ReverseFunctionIT
Running org.apache.phoenix.end2end.StatementHintsIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.55 sec - in org.apache.phoenix.end2end.StatementHintsIT
Running org.apache.phoenix.end2end.TimezoneOffsetFunctionIT
Running org.apache.phoenix.end2end.StringToArrayFunctionIT
Running org.apache.phoenix.end2end.StoreNullsIT
Tests run: 22, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.127 sec - in org.apache.phoenix.end2end.StringToArrayFunctionIT
Running org.apache.phoenix.end2end.ToCharFunctionIT
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.986 sec - in org.apache.phoenix.end2end.TimezoneOffsetFunctionIT
Running org.apache.phoenix.end2end.UpsertSelectAutoCommitIT
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.899 sec - in org.apache.phoenix.end2end.StoreNullsIT
Running org.apache.phoenix.end2end.salted.SaltedTableUpsertSelectIT
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 41.465 sec - in org.apache.phoenix.end2end.QueryMoreIT
Running org.apache.phoenix.end2end.salted.SaltedTableVarLengthRowKeyIT
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.48 sec - in org.apache.phoenix.end2end.UpsertSelectAutoCommitIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.941 sec - in org.apache.phoenix.end2end.salted.SaltedTableVarLengthRowKeyIT
Running org.apache.phoenix.iterate.PhoenixQueryTimeoutIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.125 sec - in org.apache.phoenix.iterate.PhoenixQueryTimeoutIT
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.575 sec - in org.apache.phoenix.end2end.ToCharFunctionIT
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 31.856 sec - in org.apache.phoenix.end2end.salted.SaltedTableUpsertSelectIT
Tests run: 45, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 90.442 sec - in org.apache.phoenix.end2end.SortOrderIT

Results :

Tests run: 342, Failures: 0, Errors: 0, Skipped: 0

[INFO] 
[INFO] --- maven-failsafe-plugin:2.19.1:integration-test (NeedTheirOwnClusterTests) @ phoenix-core ---

---

phoenix git commit: Added ability to run async indexes when hbase cluster is in non-distributed mode or when mr is in local mode

2016-07-11 Thread tdsilva
Repository: phoenix
Updated Branches:
  refs/heads/master b49d55831 -> 919a89e3c


Added ability to run async indexes when hbase cluster is in non-distributed mode or when mr is in local mode


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/919a89e3
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/919a89e3
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/919a89e3

Branch: refs/heads/master
Commit: 919a89e3c2d3a6adb7ea3e166e3f602b8d028f3e
Parents: b49d558
Author: tejamobref 
Authored: Tue Jul 12 01:37:12 2016 +0530
Committer: Thomas D'Silva 
Committed: Mon Jul 11 15:28:00 2016 -0700

--
 .../phoenix/end2end/index/AsyncIndexIT.java | 180 +++
 .../coprocessor/MetaDataRegionObserver.java | 113 ++--
 .../phoenix/jdbc/PhoenixEmbeddedDriver.java |   4 +-
 .../index/automation/PhoenixMRJobSubmitter.java |  13 +-
 .../apache/phoenix/query/QueryConstants.java|  18 +-
 .../org/apache/phoenix/query/QueryServices.java |   4 +-
 6 files changed, 308 insertions(+), 24 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/919a89e3/phoenix-core/src/it/java/org/apache/phoenix/end2end/index/AsyncIndexIT.java
--
diff --git 
a/phoenix-core/src/it/java/org/apache/phoenix/end2end/index/AsyncIndexIT.java 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/index/AsyncIndexIT.java
new file mode 100644
index 000..43d1bd9
--- /dev/null
+++ 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/index/AsyncIndexIT.java
@@ -0,0 +1,180 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.phoenix.end2end.index;
+
+import static 
org.apache.phoenix.jdbc.PhoenixDatabaseMetaData.ASYNC_CREATED_DATE;
+import static org.apache.phoenix.jdbc.PhoenixDatabaseMetaData.COLUMN_FAMILY;
+import static org.apache.phoenix.jdbc.PhoenixDatabaseMetaData.COLUMN_NAME;
+import static org.apache.phoenix.jdbc.PhoenixDatabaseMetaData.DATA_TABLE_NAME;
+import static 
org.apache.phoenix.jdbc.PhoenixDatabaseMetaData.SYSTEM_CATALOG_SCHEMA;
+import static 
org.apache.phoenix.jdbc.PhoenixDatabaseMetaData.SYSTEM_CATALOG_TABLE;
+import static org.apache.phoenix.jdbc.PhoenixDatabaseMetaData.TABLE_NAME;
+import static org.apache.phoenix.jdbc.PhoenixDatabaseMetaData.TABLE_SCHEM;
+import static org.apache.phoenix.jdbc.PhoenixDatabaseMetaData.TABLE_TYPE;
+import static org.junit.Assert.assertFalse;
+import static org.junit.Assert.assertTrue;
+
+import java.io.IOException;
+import java.sql.Connection;
+import java.sql.DriverManager;
+import java.sql.ResultSet;
+import java.sql.SQLException;
+import java.sql.Statement;
+import java.util.HashMap;
+import java.util.Map;
+
+import org.apache.phoenix.end2end.NeedsOwnMiniClusterTest;
+import org.apache.phoenix.jdbc.PhoenixDatabaseMetaData;
+import org.apache.phoenix.query.BaseTest;
+import org.apache.phoenix.query.QueryServices;
+import org.apache.phoenix.schema.PIndexState;
+import org.apache.phoenix.schema.PTableType;
+import org.apache.phoenix.schema.types.PDate;
+import org.apache.phoenix.util.ReadOnlyProps;
+import org.junit.After;
+import org.junit.Test;
+import org.junit.experimental.categories.Category;
+
+@Category(NeedsOwnMiniClusterTest.class)
+public class AsyncIndexIT extends BaseTest {
+
+    private static final String PERSON_TABLE_NAME = "PERSON";
+    private static final String PERSON_TABLE_NAME_WITH_SCHEMA = "TEST.PERSON";
+    private static final String TEST_SCHEMA = "TEST";
+
+    private static final String PERSON_TABLE_ASYNC_INDEX_INFO_QUERY = "SELECT "
+            + DATA_TABLE_NAME + ", " + TABLE_SCHEM + ", "
+            + TABLE_NAME + " FROM " + SYSTEM_CATALOG_SCHEMA + ".\""
+            + SYSTEM_CATALOG_TABLE + "\""
+            + " (" + ASYNC_CREATED_DATE + " "
+            + PDate.INSTANCE.getSqlTypeName() + ") " + " WHERE "
+            + COLUMN_NAME + " IS NULL and " + COLUMN_FAMILY + " IS NULL and "
+            + ASYNC_CREATED_DATE + " IS NOT NULL and "
++ T

Build failed in Jenkins: Phoenix | Master #1321

2016-07-11 Thread Apache Jenkins Server
See 

Changes:

[mujtaba] PHOENIX-3061 IndexTool marks index as ACTIVE and exit 0 even if bulkload

--
[...truncated 745 lines...]
Running org.apache.phoenix.end2end.ArrayToStringFunctionIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.919 sec - in 
org.apache.phoenix.end2end.AbsFunctionEnd2EndIT
Running org.apache.phoenix.end2end.ArrayFillFunctionIT
Running org.apache.phoenix.end2end.ArraysWithNullsIT
Running org.apache.phoenix.end2end.AlterSessionIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.128 sec - in 
org.apache.phoenix.end2end.AlterSessionIT
Running org.apache.phoenix.end2end.AutoCommitIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.47 sec - in 
org.apache.phoenix.end2end.AutoCommitIT
Running org.apache.phoenix.end2end.CbrtFunctionEnd2EndIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.998 sec - in 
org.apache.phoenix.end2end.CbrtFunctionEnd2EndIT
Running org.apache.phoenix.end2end.ConvertTimezoneFunctionIT
Tests run: 26, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 18.422 sec - 
in org.apache.phoenix.end2end.ArrayFillFunctionIT
Running org.apache.phoenix.end2end.DecodeFunctionIT
Tests run: 36, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 20.842 sec - 
in org.apache.phoenix.end2end.ArrayToStringFunctionIT
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.27 sec - in 
org.apache.phoenix.end2end.ConvertTimezoneFunctionIT
Running org.apache.phoenix.end2end.DynamicFamilyIT
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.858 sec - in 
org.apache.phoenix.end2end.DecodeFunctionIT
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.808 sec - in 
org.apache.phoenix.end2end.DynamicFamilyIT
Running org.apache.phoenix.end2end.FirstValueFunctionIT
Tests run: 16, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 31.692 sec - 
in org.apache.phoenix.end2end.ArraysWithNullsIT
Running org.apache.phoenix.end2end.DynamicUpsertIT
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.105 sec - in 
org.apache.phoenix.end2end.DynamicUpsertIT
Running org.apache.phoenix.end2end.GetSetByteBitFunctionEnd2EndIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.152 sec - in 
org.apache.phoenix.end2end.GetSetByteBitFunctionEnd2EndIT
Running org.apache.phoenix.end2end.MD5FunctionIT
Running org.apache.phoenix.end2end.LikeExpressionIT
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.804 sec - in 
org.apache.phoenix.end2end.LikeExpressionIT
Running org.apache.phoenix.end2end.MinMaxAggregateFunctionIT
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.079 sec - in 
org.apache.phoenix.end2end.MD5FunctionIT
Running org.apache.phoenix.end2end.NthValueFunctionIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.549 sec - in 
org.apache.phoenix.end2end.MinMaxAggregateFunctionIT
Running org.apache.phoenix.end2end.DistinctPrefixFilterIT
Running org.apache.phoenix.end2end.OctetLengthFunctionEnd2EndIT
Tests run: 26, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 49.3 sec - in 
org.apache.phoenix.end2end.ArithmeticQueryIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.347 sec - in 
org.apache.phoenix.end2end.OctetLengthFunctionEnd2EndIT
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.521 sec - in 
org.apache.phoenix.end2end.FirstValueFunctionIT
Running org.apache.phoenix.end2end.QueryMoreIT
Running org.apache.phoenix.end2end.PrimitiveTypeIT
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.231 sec - in 
org.apache.phoenix.end2end.PrimitiveTypeIT
Running org.apache.phoenix.end2end.RTrimFunctionIT
Running org.apache.phoenix.end2end.PowerFunctionEnd2EndIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.61 sec - in 
org.apache.phoenix.end2end.PowerFunctionEnd2EndIT
Running org.apache.phoenix.end2end.ReadOnlyIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.784 sec - in 
org.apache.phoenix.end2end.RTrimFunctionIT
Running org.apache.phoenix.end2end.RegexpSplitFunctionIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.593 sec - in 
org.apache.phoenix.end2end.ReadOnlyIT
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.824 sec - in 
org.apache.phoenix.end2end.RegexpSplitFunctionIT
Running org.apache.phoenix.end2end.ReverseFunctionIT
Running org.apache.phoenix.end2end.RoundFloorCeilFunctionsEnd2EndIT
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 22.65 sec - in 
org.apache.phoenix.end2end.NthValueFunctionIT
Running org.apache.phoenix.end2end.SerialIteratorsIT
Tests run: 33, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.341 sec - in 
org.apache.phoenix.end2end.RoundFloorCeilFunctionsEnd2EndIT
Running org.apache.phoenix.end2end.ServerExceptionIT
Tests run: 1, Failures: 0, Errors: 0, Skipped:

phoenix git commit: Added ability to run async indexes when hbase cluster is in non-distributed mode or when mr is in local mode

2016-07-11 Thread tdsilva
Repository: phoenix
Updated Branches:
  refs/heads/4.x-HBase-1.0 fd1ae10fb -> 4785a4a3b


Added ability to run async indexes when hbase cluster is in non-distributed 
mode or when mr is in local mode
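For context on the feature being patched: a Phoenix index is flagged for asynchronous building by appending ASYNC to the CREATE INDEX statement, after which a separate job (normally the IndexTool MapReduce job) performs the actual build. A minimal sketch of assembling that DDL; the table, index, and column names below are hypothetical, not taken from this commit:

```java
// Sketch only: shows the shape of the ASYNC index DDL Phoenix accepts.
// Index/table/column names are invented for illustration.
public class AsyncIndexDdlSketch {

    static String asyncIndexDdl(String indexName, String tableName, String column) {
        // The trailing ASYNC keyword defers the index build to an external job
        // instead of building it synchronously inside the CREATE INDEX call.
        return "CREATE INDEX " + indexName + " ON " + tableName
                + " (" + column + ") ASYNC";
    }

    public static void main(String[] args) {
        System.out.println(asyncIndexDdl("PERSON_NAME_IDX", "TEST.PERSON", "FIRST_NAME"));
    }
}
```

The statement would normally be issued through a Phoenix JDBC connection; only the string construction is shown here so the sketch stands alone.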


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/4785a4a3
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/4785a4a3
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/4785a4a3

Branch: refs/heads/4.x-HBase-1.0
Commit: 4785a4a3b0ad356357abacafa35cda5bb045d895
Parents: fd1ae10
Author: tejamobref 
Authored: Tue Jul 12 01:37:12 2016 +0530
Committer: Thomas D'Silva 
Committed: Mon Jul 11 15:31:27 2016 -0700

--
 .../phoenix/end2end/index/AsyncIndexIT.java | 180 +++
 .../coprocessor/MetaDataRegionObserver.java | 113 ++--
 .../phoenix/jdbc/PhoenixEmbeddedDriver.java |   4 +-
 .../index/automation/PhoenixMRJobSubmitter.java |  13 +-
 .../apache/phoenix/query/QueryConstants.java|  18 +-
 .../org/apache/phoenix/query/QueryServices.java |   4 +-
 6 files changed, 308 insertions(+), 24 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/4785a4a3/phoenix-core/src/it/java/org/apache/phoenix/end2end/index/AsyncIndexIT.java
--
diff --git a/phoenix-core/src/it/java/org/apache/phoenix/end2end/index/AsyncIndexIT.java b/phoenix-core/src/it/java/org/apache/phoenix/end2end/index/AsyncIndexIT.java
new file mode 100644
index 000..43d1bd9
--- /dev/null
+++ b/phoenix-core/src/it/java/org/apache/phoenix/end2end/index/AsyncIndexIT.java
@@ -0,0 +1,180 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.phoenix.end2end.index;
+
+import static org.apache.phoenix.jdbc.PhoenixDatabaseMetaData.ASYNC_CREATED_DATE;
+import static org.apache.phoenix.jdbc.PhoenixDatabaseMetaData.COLUMN_FAMILY;
+import static org.apache.phoenix.jdbc.PhoenixDatabaseMetaData.COLUMN_NAME;
+import static org.apache.phoenix.jdbc.PhoenixDatabaseMetaData.DATA_TABLE_NAME;
+import static org.apache.phoenix.jdbc.PhoenixDatabaseMetaData.SYSTEM_CATALOG_SCHEMA;
+import static org.apache.phoenix.jdbc.PhoenixDatabaseMetaData.SYSTEM_CATALOG_TABLE;
+import static org.apache.phoenix.jdbc.PhoenixDatabaseMetaData.TABLE_NAME;
+import static org.apache.phoenix.jdbc.PhoenixDatabaseMetaData.TABLE_SCHEM;
+import static org.apache.phoenix.jdbc.PhoenixDatabaseMetaData.TABLE_TYPE;
+import static org.junit.Assert.assertFalse;
+import static org.junit.Assert.assertTrue;
+
+import java.io.IOException;
+import java.sql.Connection;
+import java.sql.DriverManager;
+import java.sql.ResultSet;
+import java.sql.SQLException;
+import java.sql.Statement;
+import java.util.HashMap;
+import java.util.Map;
+
+import org.apache.phoenix.end2end.NeedsOwnMiniClusterTest;
+import org.apache.phoenix.jdbc.PhoenixDatabaseMetaData;
+import org.apache.phoenix.query.BaseTest;
+import org.apache.phoenix.query.QueryServices;
+import org.apache.phoenix.schema.PIndexState;
+import org.apache.phoenix.schema.PTableType;
+import org.apache.phoenix.schema.types.PDate;
+import org.apache.phoenix.util.ReadOnlyProps;
+import org.junit.After;
+import org.junit.Test;
+import org.junit.experimental.categories.Category;
+
+@Category(NeedsOwnMiniClusterTest.class)
+public class AsyncIndexIT extends BaseTest {
+
+    private static final String PERSON_TABLE_NAME = "PERSON";
+    private static final String PERSON_TABLE_NAME_WITH_SCHEMA = "TEST.PERSON";
+    private static final String TEST_SCHEMA = "TEST";
+
+    private static final String PERSON_TABLE_ASYNC_INDEX_INFO_QUERY = "SELECT "
+            + DATA_TABLE_NAME + ", " + TABLE_SCHEM + ", "
+            + TABLE_NAME + " FROM " + SYSTEM_CATALOG_SCHEMA + ".\""
+            + SYSTEM_CATALOG_TABLE + "\""
+            + " (" + ASYNC_CREATED_DATE + " "
+            + PDate.INSTANCE.getSqlTypeName() + ") " + " WHERE "
+            + COLUMN_NAME + " IS NULL and " + COLUMN_FAMILY + " IS NULL and "
+            + ASYNC_CREATED_DATE + " IS NOT NULL and "
+ 
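The query constant in the quoted test is truncated by the archive, but its visible portion illustrates a Phoenix technique worth noting: ASYNC_CREATED_DATE is not a fixed SYSTEM.CATALOG column, so the test declares it inline as a dynamic column in the FROM clause. A standalone sketch of the assembled string, with the metadata constant values inlined; the literal catalog names below are assumptions based on Phoenix's usual naming, and the truncated tail of the WHERE clause is omitted:

```java
// Sketch: approximate expansion of the test's catalog query, with assumed
// values for the PhoenixDatabaseMetaData constants. Not authoritative.
public class AsyncIndexQuerySketch {
    static final String DATA_TABLE_NAME = "DATA_TABLE_NAME";
    static final String TABLE_SCHEM = "TABLE_SCHEM";
    static final String TABLE_NAME = "TABLE_NAME";
    static final String SYSTEM_CATALOG_SCHEMA = "SYSTEM";      // assumed value
    static final String SYSTEM_CATALOG_TABLE = "CATALOG";      // assumed value
    static final String ASYNC_CREATED_DATE = "ASYNC_CREATED_DATE";
    static final String COLUMN_NAME = "COLUMN_NAME";
    static final String COLUMN_FAMILY = "COLUMN_FAMILY";

    static String asyncIndexInfoQuery() {
        // The "(ASYNC_CREATED_DATE DATE)" clause is Phoenix's dynamic-column
        // syntax: the column is declared at query time rather than in the
        // table's fixed schema.
        return "SELECT " + DATA_TABLE_NAME + ", " + TABLE_SCHEM + ", "
                + TABLE_NAME + " FROM " + SYSTEM_CATALOG_SCHEMA + ".\""
                + SYSTEM_CATALOG_TABLE + "\""
                + " (" + ASYNC_CREATED_DATE + " DATE) WHERE "
                + COLUMN_NAME + " IS NULL and " + COLUMN_FAMILY + " IS NULL and "
                + ASYNC_CREATED_DATE + " IS NOT NULL";
    }

    public static void main(String[] args) {
        System.out.println(asyncIndexInfoQuery());
    }
}
```

Rows matching this shape (header rows with a non-null ASYNC_CREATED_DATE) are what an external submitter such as PhoenixMRJobSubmitter scans for to find indexes awaiting an asynchronous build.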

phoenix git commit: Added ability to run async indexes when hbase cluster is in non-distributed mode or when mr is in local mode

2016-07-11 Thread tdsilva
Repository: phoenix
Updated Branches:
  refs/heads/4.x-HBase-1.1 5721169a5 -> 424bd798b


Added ability to run async indexes when hbase cluster is in non-distributed 
mode or when mr is in local mode


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/424bd798
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/424bd798
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/424bd798

Branch: refs/heads/4.x-HBase-1.1
Commit: 424bd798bc0034f126ba8074ac93bcd409c74969
Parents: 5721169
Author: tejamobref 
Authored: Tue Jul 12 01:37:12 2016 +0530
Committer: Thomas D'Silva 
Committed: Mon Jul 11 15:33:24 2016 -0700

--
 .../phoenix/end2end/index/AsyncIndexIT.java | 180 +++
 .../coprocessor/MetaDataRegionObserver.java | 113 ++--
 .../phoenix/jdbc/PhoenixEmbeddedDriver.java |   4 +-
 .../index/automation/PhoenixMRJobSubmitter.java |  13 +-
 .../apache/phoenix/query/QueryConstants.java|  18 +-
 .../org/apache/phoenix/query/QueryServices.java |   4 +-
 6 files changed, 308 insertions(+), 24 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/424bd798/phoenix-core/src/it/java/org/apache/phoenix/end2end/index/AsyncIndexIT.java
--
diff --git a/phoenix-core/src/it/java/org/apache/phoenix/end2end/index/AsyncIndexIT.java b/phoenix-core/src/it/java/org/apache/phoenix/end2end/index/AsyncIndexIT.java
new file mode 100644
index 000..43d1bd9
--- /dev/null
+++ b/phoenix-core/src/it/java/org/apache/phoenix/end2end/index/AsyncIndexIT.java
@@ -0,0 +1,180 @@

Build failed in Jenkins: Phoenix-4.x-HBase-1.0 #565

2016-07-11 Thread Apache Jenkins Server
See 

Changes:

[tdsilva] Added ability to run async indexes when hbase cluster is in

--
[...truncated 719 lines...]
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.105 sec - in 
org.apache.phoenix.end2end.AlterSessionIT
Running org.apache.phoenix.end2end.ArraysWithNullsIT
Running org.apache.phoenix.end2end.AutoCommitIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.944 sec - in 
org.apache.phoenix.end2end.AutoCommitIT
Running org.apache.phoenix.end2end.ArithmeticQueryIT
Running org.apache.phoenix.end2end.ArrayToStringFunctionIT
Running org.apache.phoenix.end2end.ArrayFillFunctionIT
Running org.apache.phoenix.end2end.CbrtFunctionEnd2EndIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.527 sec - in 
org.apache.phoenix.end2end.CbrtFunctionEnd2EndIT
Running org.apache.phoenix.end2end.ConvertTimezoneFunctionIT
Tests run: 26, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.587 sec - in 
org.apache.phoenix.end2end.ArrayFillFunctionIT
Running org.apache.phoenix.end2end.DecodeFunctionIT
Tests run: 36, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.134 sec - 
in org.apache.phoenix.end2end.ArrayToStringFunctionIT
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.696 sec - in 
org.apache.phoenix.end2end.ConvertTimezoneFunctionIT
Running org.apache.phoenix.end2end.DynamicFamilyIT
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.007 sec - in 
org.apache.phoenix.end2end.DecodeFunctionIT
Tests run: 16, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.656 sec - 
in org.apache.phoenix.end2end.ArraysWithNullsIT
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.359 sec - in 
org.apache.phoenix.end2end.DynamicFamilyIT
Running org.apache.phoenix.end2end.FirstValueFunctionIT
Running org.apache.phoenix.end2end.DynamicUpsertIT
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.131 sec - in 
org.apache.phoenix.end2end.DynamicUpsertIT
Running org.apache.phoenix.end2end.GetSetByteBitFunctionEnd2EndIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.153 sec - in 
org.apache.phoenix.end2end.GetSetByteBitFunctionEnd2EndIT
Running org.apache.phoenix.end2end.MD5FunctionIT
Running org.apache.phoenix.end2end.LikeExpressionIT
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.904 sec - in 
org.apache.phoenix.end2end.LikeExpressionIT
Running org.apache.phoenix.end2end.MinMaxAggregateFunctionIT
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.598 sec - in 
org.apache.phoenix.end2end.MD5FunctionIT
Running org.apache.phoenix.end2end.NthValueFunctionIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.693 sec - in 
org.apache.phoenix.end2end.MinMaxAggregateFunctionIT
Running org.apache.phoenix.end2end.OctetLengthFunctionEnd2EndIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.877 sec - in 
org.apache.phoenix.end2end.OctetLengthFunctionEnd2EndIT
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.382 sec - in 
org.apache.phoenix.end2end.FirstValueFunctionIT
Running org.apache.phoenix.end2end.PowerFunctionEnd2EndIT
Running org.apache.phoenix.end2end.PrimitiveTypeIT
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.182 sec - in 
org.apache.phoenix.end2end.PrimitiveTypeIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.589 sec - in 
org.apache.phoenix.end2end.PowerFunctionEnd2EndIT
Running org.apache.phoenix.end2end.QueryMoreIT
Running org.apache.phoenix.end2end.RTrimFunctionIT
Running org.apache.phoenix.end2end.DistinctPrefixFilterIT
Tests run: 26, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 28.518 sec - 
in org.apache.phoenix.end2end.ArithmeticQueryIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.556 sec - in 
org.apache.phoenix.end2end.RTrimFunctionIT
Running org.apache.phoenix.end2end.ReadOnlyIT
Running org.apache.phoenix.end2end.RegexpSplitFunctionIT
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.004 sec - in 
org.apache.phoenix.end2end.RegexpSplitFunctionIT
Running org.apache.phoenix.end2end.ReverseFunctionIT
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.212 sec - 
in org.apache.phoenix.end2end.NthValueFunctionIT
Running org.apache.phoenix.end2end.RoundFloorCeilFunctionsEnd2EndIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.114 sec - in 
org.apache.phoenix.end2end.ReadOnlyIT
Running org.apache.phoenix.end2end.SerialIteratorsIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.672 sec - in 
org.apache.phoenix.end2end.SerialIteratorsIT
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.574 sec - in 
org.apache.phoenix.end2end.ReverseFunctionIT
Running org.apache.phoenix.end2end.ServerExceptionIT
Tests run: 33, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.8

Build failed in Jenkins: Phoenix | Master #1322

2016-07-11 Thread Apache Jenkins Server
See 

Changes:

[tdsilva] Added ability to run async indexes when hbase cluster is in

--
[...truncated 740 lines...]
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.231 sec - in 
org.apache.phoenix.end2end.AlterSessionIT
Running org.apache.phoenix.end2end.ArraysWithNullsIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.822 sec - in 
org.apache.phoenix.end2end.AbsFunctionEnd2EndIT
Running org.apache.phoenix.end2end.AutoCommitIT
Running org.apache.phoenix.end2end.ArrayToStringFunctionIT
Running org.apache.phoenix.end2end.ArrayFillFunctionIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.436 sec - in 
org.apache.phoenix.end2end.AutoCommitIT
Running org.apache.phoenix.end2end.CbrtFunctionEnd2EndIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.42 sec - in 
org.apache.phoenix.end2end.CbrtFunctionEnd2EndIT
Running org.apache.phoenix.end2end.ConvertTimezoneFunctionIT
Tests run: 26, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 22.937 sec - 
in org.apache.phoenix.end2end.ArrayFillFunctionIT
Running org.apache.phoenix.end2end.DecodeFunctionIT
Tests run: 36, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 30.799 sec - 
in org.apache.phoenix.end2end.ArrayToStringFunctionIT
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 28.724 sec - in 
org.apache.phoenix.end2end.ConvertTimezoneFunctionIT
Running org.apache.phoenix.end2end.DynamicFamilyIT
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.093 sec - in 
org.apache.phoenix.end2end.DecodeFunctionIT
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.854 sec - in 
org.apache.phoenix.end2end.DynamicFamilyIT
Running org.apache.phoenix.end2end.FirstValueFunctionIT
Running org.apache.phoenix.end2end.DynamicUpsertIT
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.138 sec - in 
org.apache.phoenix.end2end.DynamicUpsertIT
Running org.apache.phoenix.end2end.GetSetByteBitFunctionEnd2EndIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.152 sec - in 
org.apache.phoenix.end2end.GetSetByteBitFunctionEnd2EndIT
Running org.apache.phoenix.end2end.DistinctPrefixFilterIT
Running org.apache.phoenix.end2end.LikeExpressionIT
Tests run: 16, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 60.205 sec - 
in org.apache.phoenix.end2end.ArraysWithNullsIT
Running org.apache.phoenix.end2end.MD5FunctionIT
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.891 sec - in 
org.apache.phoenix.end2end.LikeExpressionIT
Running org.apache.phoenix.end2end.MinMaxAggregateFunctionIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.503 sec - in 
org.apache.phoenix.end2end.MinMaxAggregateFunctionIT
Running org.apache.phoenix.end2end.NthValueFunctionIT
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 20.661 sec - in 
org.apache.phoenix.end2end.FirstValueFunctionIT
Running org.apache.phoenix.end2end.OctetLengthFunctionEnd2EndIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.543 sec - in 
org.apache.phoenix.end2end.OctetLengthFunctionEnd2EndIT
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.331 sec - in 
org.apache.phoenix.end2end.DistinctPrefixFilterIT
Running org.apache.phoenix.end2end.PowerFunctionEnd2EndIT
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.111 sec - in 
org.apache.phoenix.end2end.MD5FunctionIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.457 sec - in 
org.apache.phoenix.end2end.PowerFunctionEnd2EndIT
Running org.apache.phoenix.end2end.QueryMoreIT
Running org.apache.phoenix.end2end.RTrimFunctionIT
Running org.apache.phoenix.end2end.PrimitiveTypeIT
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.102 sec - in 
org.apache.phoenix.end2end.PrimitiveTypeIT
Running org.apache.phoenix.end2end.ReadOnlyIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.063 sec - in 
org.apache.phoenix.end2end.RTrimFunctionIT
Tests run: 26, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 86.369 sec - 
in org.apache.phoenix.end2end.ArithmeticQueryIT
Running org.apache.phoenix.end2end.ReverseFunctionIT
Running org.apache.phoenix.end2end.RegexpSplitFunctionIT
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.427 sec - in 
org.apache.phoenix.end2end.RegexpSplitFunctionIT
Running org.apache.phoenix.end2end.RoundFloorCeilFunctionsEnd2EndIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.994 sec - in 
org.apache.phoenix.end2end.ReadOnlyIT
Running org.apache.phoenix.end2end.SerialIteratorsIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.492 sec - in 
org.apache.phoenix.end2end.SerialIteratorsIT
Running org.apache.phoenix.end2end.ServerExceptionIT
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.853 sec - in 
org.apache.phoenix.end2end.ReverseF

Build failed in Jenkins: Phoenix-4.x-HBase-1.1 #118

2016-07-11 Thread Apache Jenkins Server
See 

Changes:

[tdsilva] Added ability to run async indexes when hbase cluster is in

--
[...truncated 2129 lines...]

Running org.apache.phoenix.end2end.index.MutableIndexReplicationIT
Running org.apache.phoenix.end2end.index.ReadOnlyIndexFailureIT
Tests run: 1, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 0.004 sec <<< 
FAILURE! - in org.apache.phoenix.end2end.index.MutableIndexReplicationIT
org.apache.phoenix.end2end.index.MutableIndexReplicationIT  Time elapsed: 0.003 
sec  <<< ERROR!
java.io.IOException: Shutting down
at 
org.apache.phoenix.end2end.index.MutableIndexReplicationIT.setupConfigsAndStartCluster(MutableIndexReplicationIT.java:170)
at 
org.apache.phoenix.end2end.index.MutableIndexReplicationIT.setUpBeforeClass(MutableIndexReplicationIT.java:108)
Caused by: java.lang.RuntimeException: Master not initialized after 20ms 
seconds
at 
org.apache.phoenix.end2end.index.MutableIndexReplicationIT.setupConfigsAndStartCluster(MutableIndexReplicationIT.java:170)
at 
org.apache.phoenix.end2end.index.MutableIndexReplicationIT.setUpBeforeClass(MutableIndexReplicationIT.java:108)

Running org.apache.phoenix.hbase.index.FailForUnsupportedHBaseVersionsIT
Tests run: 1, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 0.007 sec <<< 
FAILURE! - in org.apache.phoenix.end2end.index.ReadOnlyIndexFailureIT
org.apache.phoenix.end2end.index.ReadOnlyIndexFailureIT  Time elapsed: 0.006 
sec  <<< ERROR!
java.lang.RuntimeException: java.io.IOException: Shutting down
at 
org.apache.phoenix.end2end.index.ReadOnlyIndexFailureIT.doSetup(ReadOnlyIndexFailureIT.java:119)
Caused by: java.io.IOException: Shutting down
at 
org.apache.phoenix.end2end.index.ReadOnlyIndexFailureIT.doSetup(ReadOnlyIndexFailureIT.java:119)
Caused by: java.lang.RuntimeException: Master not initialized after 20ms 
seconds
at 
org.apache.phoenix.end2end.index.ReadOnlyIndexFailureIT.doSetup(ReadOnlyIndexFailureIT.java:119)

Running org.apache.phoenix.end2end.index.txn.TxWriteFailureIT
Tests run: 1, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 0.006 sec <<< 
FAILURE! - in org.apache.phoenix.end2end.index.txn.TxWriteFailureIT
org.apache.phoenix.end2end.index.txn.TxWriteFailureIT  Time elapsed: 0.005 sec  
<<< ERROR!
java.lang.RuntimeException: java.io.IOException: Shutting down
at 
org.apache.phoenix.end2end.index.txn.TxWriteFailureIT.doSetup(TxWriteFailureIT.java:86)
Caused by: java.io.IOException: Shutting down
at 
org.apache.phoenix.end2end.index.txn.TxWriteFailureIT.doSetup(TxWriteFailureIT.java:86)
Caused by: java.lang.RuntimeException: Master not initialized after 20ms 
seconds
at 
org.apache.phoenix.end2end.index.txn.TxWriteFailureIT.doSetup(TxWriteFailureIT.java:86)

Running org.apache.phoenix.execute.PartialCommitIT
Tests run: 1, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 0.006 sec <<< 
FAILURE! - in org.apache.phoenix.execute.PartialCommitIT
org.apache.phoenix.execute.PartialCommitIT  Time elapsed: 0.005 sec  <<< ERROR!
java.lang.RuntimeException: java.io.IOException: Shutting down
at 
org.apache.phoenix.execute.PartialCommitIT.doSetup(PartialCommitIT.java:94)
Caused by: java.io.IOException: Shutting down
at 
org.apache.phoenix.execute.PartialCommitIT.doSetup(PartialCommitIT.java:94)
Caused by: java.lang.RuntimeException: Master not initialized after 20ms 
seconds
at 
org.apache.phoenix.execute.PartialCommitIT.doSetup(PartialCommitIT.java:94)

Tests run: 3, Failures: 0, Errors: 3, Skipped: 0, Time elapsed: 611.514 sec <<< 
FAILURE! - in org.apache.phoenix.end2end.index.AsyncIndexIT
testAsyncIndexBuilderNonDistributed(org.apache.phoenix.end2end.index.AsyncIndexIT)
  Time elapsed: 206.707 sec  <<< ERROR!
java.lang.RuntimeException: java.io.IOException: Shutting down
at 
org.apache.phoenix.end2end.index.AsyncIndexIT.testAsyncIndexBuilderNonDistributed(AsyncIndexIT.java:116)
Caused by: java.io.IOException: Shutting down
at 
org.apache.phoenix.end2end.index.AsyncIndexIT.testAsyncIndexBuilderNonDistributed(AsyncIndexIT.java:116)
Caused by: java.lang.RuntimeException: Master not initialized after 20ms 
seconds
at 
org.apache.phoenix.end2end.index.AsyncIndexIT.testAsyncIndexBuilderNonDistributed(AsyncIndexIT.java:116)

testAsyncIndexBuilderDistributed(org.apache.phoenix.end2end.index.AsyncIndexIT) 
 Time elapsed: 202.498 sec  <<< ERROR!
java.lang.RuntimeException: java.io.IOException: Shutting down
at 
org.apache.phoenix.end2end.index.AsyncIndexIT.testAsyncIndexBuilderDistributed(AsyncIndexIT.java:162)
Caused by: java.io.IOException: Shutting down
at 
org.apache.phoenix.end2end.index.AsyncIndexIT.testAsyncIndexBuilderDistributed(AsyncIndexIT.java:162)
Caused by: java.lang.RuntimeException: Master not initialized after 20ms 
seconds
at 
org.apache.phoenix.end2en