Jenkins build is back to normal : Phoenix-4.x-HBase-1.3 #33

2018-02-09 Thread Apache Jenkins Server




Apache-Phoenix | 4.x-HBase-1.3 | Build Successful

2018-02-09 Thread Apache Jenkins Server
4.x-HBase-1.3 branch build status Successful

Source repository https://git-wip-us.apache.org/repos/asf?p=phoenix.git;a=shortlog;h=refs/heads/4.x-HBase-1.3

Compiled Artifacts https://builds.apache.org/job/Phoenix-4.x-HBase-1.3/lastSuccessfulBuild/artifact/

Test Report https://builds.apache.org/job/Phoenix-4.x-HBase-1.3/lastCompletedBuild/testReport/

Changes
[jmahonin] PHOENIX-4056 Spark 2.2 empty path fix (Stepan Migunov)



Build times for the last couple of runs. The latest build time is the rightmost. | Legend: blue = normal, red = test failure, gray = timeout


Jenkins build is back to normal : Phoenix-4.x-HBase-1.2 #261

2018-02-09 Thread Apache Jenkins Server




Build failed in Jenkins: Phoenix | 4.x-HBase-0.98 #1813

2018-02-09 Thread Apache Jenkins Server


Changes:

[jmahonin] PHOENIX-4056 Spark 2.2 empty path fix (Stepan Migunov)

--
[...truncated 110.25 KB...]
org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: org.apache.hadoop.hbase.DoNotRetryIOException: Failed 1 action: NotServingRegionException: 1 time, servers with issues: asf931.gq1.ygridcore.net,35502,1518211039016, 
    at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:80)
    at org.apache.phoenix.util.ServerUtil.throwIOException(ServerUtil.java:62)
    at org.apache.phoenix.index.PhoenixTransactionalIndexer.postBatchMutateIndispensably(PhoenixTransactionalIndexer.java:240)
    at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost$37.call(RegionCoprocessorHost.java:1040)
    at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost$RegionOperation.call(RegionCoprocessorHost.java:1656)
    at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost.execOperation(RegionCoprocessorHost.java:1733)
    at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost.execOperation(RegionCoprocessorHost.java:1688)
    at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost.postBatchMutateIndispensably(RegionCoprocessorHost.java:1036)
    at org.apache.hadoop.hbase.regionserver.HRegion.doMiniBatchMutation(HRegion.java:2767)
    at org.apache.hadoop.hbase.regionserver.HRegion.batchMutate(HRegion.java:2359)
    at org.apache.hadoop.hbase.regionserver.HRegion.batchMutate(HRegion.java:2314)
    at org.apache.hadoop.hbase.regionserver.HRegion.batchMutate(HRegion.java:2318)
    at org.apache.hadoop.hbase.regionserver.HRegionServer.doBatchOp(HRegionServer.java:4678)
    at org.apache.hadoop.hbase.regionserver.HRegionServer.doNonAtomicRegionMutation(HRegionServer.java:3835)
    at org.apache.hadoop.hbase.regionserver.HRegionServer.multi(HRegionServer.java:3680)
    at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:32500)
    at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2195)
    at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:104)
    at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:133)
    at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:108)
    at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: NotServingRegionException: 1 time, servers with issues: asf931.gq1.ygridcore.net,35502,1518211039016, 
    at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:211)
    at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:195)
    at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:1082)
    at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.processBatchCallback(HConnectionManager.java:2479)
    at org.apache.hadoop.hbase.client.HTable.batchCallback(HTable.java:898)
    at org.apache.hadoop.hbase.client.HTable.batchCallback(HTable.java:913)
    at org.apache.hadoop.hbase.client.HTable.batch(HTable.java:888)
    at org.apache.phoenix.hbase.index.write.ParallelWriterIndexCommitter$1.call(ParallelWriterIndexCommitter.java:170)
    at org.apache.phoenix.hbase.index.write.ParallelWriterIndexCommitter$1.call(ParallelWriterIndexCommitter.java:133)
    at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    ... 1 more
: 1 time, servers with issues: asf931.gq1.ygridcore.net,35502,1518211039016, 
    at org.apache.phoenix.tx.ParameterizedTransactionIT.testNonTxToTxTable(ParameterizedTransactionIT.java:288)
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: org.apache.hadoop.hbase.DoNotRetryIOException: Failed 1 action: NotServingRegionException: 1 time, servers with issues: asf931.gq1.ygridcore.net,35502,1518211039016, 
    at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:80)
    at org.apache.phoenix.util.ServerUtil.throwIOException(ServerUtil.java:62)
    at org.apache.phoenix.index.PhoenixTransactionalIndexer.postBatchMutateIndispensably(PhoenixTransactionalIndexer.java:240)
    at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost$37.call(RegionCoprocessorHost.java:1040)
    at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost$RegionOperation.call(RegionCoprocessorHost.java:1656)
    at 

Build failed in Jenkins: Phoenix | Master #1932

2018-02-09 Thread Apache Jenkins Server


Changes:

[jmahonin] PHOENIX-4056 Spark 2.2 empty path fix (Stepan Migunov)

--
[...truncated 110.31 KB...]
[INFO] Running org.apache.phoenix.end2end.join.SubqueryIT
[INFO] Tests run: 33, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 185.104 s - in org.apache.phoenix.end2end.join.HashJoinNoIndexIT
[INFO] Running org.apache.phoenix.end2end.join.SubqueryUsingSortMergeJoinIT
[INFO] Tests run: 33, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 325.267 s - in org.apache.phoenix.end2end.join.HashJoinGlobalIndexIT
[INFO] Tests run: 64, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 489.442 s - in org.apache.phoenix.end2end.index.MutableIndexIT
[INFO] Tests run: 34, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 194.262 s - in org.apache.phoenix.end2end.join.SortMergeJoinNoIndexIT
[INFO] Running org.apache.phoenix.end2end.salted.SaltedTableUpsertSelectIT
[INFO] Running org.apache.phoenix.end2end.salted.SaltedTableIT
[INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.739 s - in org.apache.phoenix.end2end.salted.SaltedTableIT
[INFO] Running org.apache.phoenix.iterate.PhoenixQueryTimeoutIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.745 s - in org.apache.phoenix.iterate.PhoenixQueryTimeoutIT
[INFO] Running org.apache.phoenix.iterate.RoundRobinResultIteratorIT
[INFO] Running org.apache.phoenix.end2end.salted.SaltedTableVarLengthRowKeyIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.295 s - in org.apache.phoenix.end2end.salted.SaltedTableVarLengthRowKeyIT
[INFO] Running org.apache.phoenix.replication.SystemCatalogWALEntryFilterIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.064 s - in org.apache.phoenix.replication.SystemCatalogWALEntryFilterIT
[INFO] Running org.apache.phoenix.rpc.UpdateCacheIT
[INFO] Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 26.004 s - in org.apache.phoenix.end2end.salted.SaltedTableUpsertSelectIT
[INFO] Running org.apache.phoenix.trace.PhoenixTableMetricsWriterIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.316 s - in org.apache.phoenix.trace.PhoenixTableMetricsWriterIT
[INFO] Running org.apache.phoenix.trace.PhoenixTracingEndToEndIT
[INFO] Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 27.595 s - in org.apache.phoenix.rpc.UpdateCacheIT
[INFO] Running org.apache.phoenix.tx.FlappingTransactionIT
[INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.368 s - in org.apache.phoenix.tx.FlappingTransactionIT
[INFO] Running org.apache.phoenix.tx.ParameterizedTransactionIT
[INFO] Tests run: 24, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 233.442 s - in org.apache.phoenix.end2end.join.SubqueryIT
[INFO] Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 52.044 s - in org.apache.phoenix.iterate.RoundRobinResultIteratorIT
[INFO] Running org.apache.phoenix.tx.TxCheckpointIT
[INFO] Tests run: 15, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 167.113 s - in org.apache.phoenix.end2end.join.SubqueryUsingSortMergeJoinIT
[INFO] Running org.apache.phoenix.tx.TransactionIT
[INFO] Tests run: 34, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 333.959 s - in org.apache.phoenix.end2end.join.SortMergeJoinGlobalIndexIT
[INFO] Running org.apache.phoenix.util.IndexScrutinyIT
[INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 89.717 s - in org.apache.phoenix.trace.PhoenixTracingEndToEndIT
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.755 s - in org.apache.phoenix.util.IndexScrutinyIT
[INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 57.312 s - in org.apache.phoenix.tx.TransactionIT
[INFO] Tests run: 34, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 590.416 s - in org.apache.phoenix.end2end.join.HashJoinLocalIndexIT
[WARNING] Tests run: 52, Failures: 0, Errors: 0, Skipped: 4, Time elapsed: 224.509 s - in org.apache.phoenix.tx.ParameterizedTransactionIT
[INFO] Tests run: 34, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 591.1 s - in org.apache.phoenix.end2end.join.SortMergeJoinLocalIndexIT
[INFO] Tests run: 40, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 293.152 s - in org.apache.phoenix.tx.TxCheckpointIT
[INFO] 
[INFO] Results:
[INFO] 
[ERROR] Failures: 
[ERROR]   ConcurrentMutationsIT.testConcurrentDeletesAndUpsertValues:214 Expected to find PK in data table: (0,0)
[ERROR]   DefaultColumnValueIT.testDefaultIndexed:978
[ERROR]   RowValueConstructorIT.testRVCLastPkIsTable1stPkIndex:1584
[ERROR]   IndexMetadataIT.testMutableTableOnlyHasPrimaryKeyIndex:623->helpTestTableOnlyHasPrimaryKeyIndex:662
[ERROR] Errors: 
[ERROR]   OrderByIT.testOrderByReverseOptimizationWithNUllsLastBug3491:969->doTestOrderByReverseOptimizationWithNUllsLastBug3491:1017->assertResultSet:1185 » 

[1/2] phoenix git commit: PHOENIX-4342 - Surface QueryPlan in MutationPlan

2018-02-09 Thread jamestaylor
Repository: phoenix
Updated Branches:
  refs/heads/5.x-HBase-2.0 f1ea37105 -> 8c1746c21


PHOENIX-4342 - Surface QueryPlan in MutationPlan


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/05959b16
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/05959b16
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/05959b16

Branch: refs/heads/5.x-HBase-2.0
Commit: 05959b164efb2ec4973ef127675b61d24dce643c
Parents: f1ea371
Author: Geoffrey Jacoby 
Authored: Thu Nov 2 13:41:02 2017 -0700
Committer: James Taylor 
Committed: Fri Feb 9 12:30:47 2018 -0800

--
 .../phoenix/compile/BaseMutationPlan.java   |   5 +
 .../phoenix/compile/DelegateMutationPlan.java   |   5 +
 .../apache/phoenix/compile/DeleteCompiler.java  | 545 ---
 .../apache/phoenix/compile/MutationPlan.java|   5 +-
 .../apache/phoenix/compile/UpsertCompiler.java  | 675 +++
 .../apache/phoenix/jdbc/PhoenixStatement.java   |   9 +-
 6 files changed, 733 insertions(+), 511 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/05959b16/phoenix-core/src/main/java/org/apache/phoenix/compile/BaseMutationPlan.java
--
diff --git a/phoenix-core/src/main/java/org/apache/phoenix/compile/BaseMutationPlan.java b/phoenix-core/src/main/java/org/apache/phoenix/compile/BaseMutationPlan.java
index 0e45682..60eb59a 100644
--- a/phoenix-core/src/main/java/org/apache/phoenix/compile/BaseMutationPlan.java
+++ b/phoenix-core/src/main/java/org/apache/phoenix/compile/BaseMutationPlan.java
@@ -79,4 +79,9 @@ public abstract class BaseMutationPlan implements MutationPlan {
 return 0l;
 }
 
+@Override
+public QueryPlan getQueryPlan() {
+return null;
+}
+
 }
\ No newline at end of file

http://git-wip-us.apache.org/repos/asf/phoenix/blob/05959b16/phoenix-core/src/main/java/org/apache/phoenix/compile/DelegateMutationPlan.java
--
diff --git a/phoenix-core/src/main/java/org/apache/phoenix/compile/DelegateMutationPlan.java b/phoenix-core/src/main/java/org/apache/phoenix/compile/DelegateMutationPlan.java
index 343ec32..90eef61 100644
--- a/phoenix-core/src/main/java/org/apache/phoenix/compile/DelegateMutationPlan.java
+++ b/phoenix-core/src/main/java/org/apache/phoenix/compile/DelegateMutationPlan.java
@@ -42,6 +42,11 @@ public class DelegateMutationPlan implements MutationPlan {
 }
 
 @Override
+public QueryPlan getQueryPlan() {
+return plan.getQueryPlan();
+}
+
+@Override
 public ParameterMetaData getParameterMetaData() {
 return plan.getParameterMetaData();
 }
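
The two hunks above follow a common delegate pattern: the abstract base class supplies a null default for the new getQueryPlan() accessor, while the delegate forwards the call to its wrapped plan. A minimal sketch of that pattern, with QueryPlan, MutationPlan, BaseMutationPlan, and DelegateMutationPlan reduced to simplified stand-ins rather than the real Phoenix types:

```java
// Simplified stand-ins for the Phoenix types, for illustration only.
interface QueryPlan { }

interface MutationPlan {
    QueryPlan getQueryPlan();
}

// Base class: provides a safe default so existing subclasses keep compiling.
abstract class BaseMutationPlan implements MutationPlan {
    @Override
    public QueryPlan getQueryPlan() {
        return null; // no underlying query plan by default
    }
}

// Delegate: forwards the new accessor to the wrapped plan.
class DelegateMutationPlan implements MutationPlan {
    private final MutationPlan plan;

    DelegateMutationPlan(MutationPlan plan) {
        this.plan = plan;
    }

    @Override
    public QueryPlan getQueryPlan() {
        return plan.getQueryPlan();
    }
}
```

Adding the default in the base class and the forwarder in the delegate means only plans that actually carry a QueryPlan need to override the accessor.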

http://git-wip-us.apache.org/repos/asf/phoenix/blob/05959b16/phoenix-core/src/main/java/org/apache/phoenix/compile/DeleteCompiler.java
--
diff --git a/phoenix-core/src/main/java/org/apache/phoenix/compile/DeleteCompiler.java b/phoenix-core/src/main/java/org/apache/phoenix/compile/DeleteCompiler.java
index ff3d501..6383ed0 100644
--- a/phoenix-core/src/main/java/org/apache/phoenix/compile/DeleteCompiler.java
+++ b/phoenix-core/src/main/java/org/apache/phoenix/compile/DeleteCompiler.java
@@ -304,14 +304,16 @@ public class DeleteCompiler {
 return Collections.emptyList();
 }
 
-private class MultiDeleteMutationPlan implements MutationPlan {
+private class MultiRowDeleteMutationPlan implements MutationPlan {
 private final List plans;
 private final MutationPlan firstPlan;
-
-public MultiDeleteMutationPlan(@NotNull List plans) {
+private final QueryPlan dataPlan;
+
+public MultiRowDeleteMutationPlan(QueryPlan dataPlan, @NotNull List plans) {
 Preconditions.checkArgument(!plans.isEmpty());
 this.plans = plans;
 this.firstPlan = plans.get(0);
+this.dataPlan = dataPlan;
 }
 
 @Override
@@ -349,8 +351,8 @@ public class DeleteCompiler {
 return firstPlan.getSourceRefs();
 }
 
-   @Override
-   public Operation getOperation() {
+   @Override
+   public Operation getOperation() {
return operation;
}
 
@@ -402,6 +404,11 @@ public class DeleteCompiler {
 }
 return estInfoTimestamp;
 }
+
+@Override
+public QueryPlan getQueryPlan() {
+return dataPlan;
+}
 }
 
 public MutationPlan compile(DeleteStatement delete) throws SQLException {
@@ -551,69 +558,9 @@ public class DeleteCompiler {
 List mutationPlans 

[2/2] phoenix git commit: PHOENIX-3941 Filter regions to scan for local indexes based on data table leading pk filter conditions

2018-02-09 Thread jamestaylor
PHOENIX-3941 Filter regions to scan for local indexes based on data table leading pk filter conditions


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/8c1746c2
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/8c1746c2
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/8c1746c2

Branch: refs/heads/5.x-HBase-2.0
Commit: 8c1746c211edc5df62f3d3c62b797a9ba4e97e24
Parents: 05959b1
Author: James Taylor 
Authored: Wed Feb 7 23:02:44 2018 -0800
Committer: James Taylor 
Committed: Fri Feb 9 12:30:56 2018 -0800

--
 .../apache/phoenix/compile/DeleteCompiler.java  |   2 +-
 .../org/apache/phoenix/compile/ExplainPlan.java |  10 +
 .../apache/phoenix/compile/JoinCompiler.java|  10 +-
 .../apache/phoenix/compile/PostDDLCompiler.java |   2 +-
 .../apache/phoenix/compile/QueryCompiler.java   |  20 +-
 .../org/apache/phoenix/compile/ScanRanges.java  |  12 +-
 .../apache/phoenix/compile/UpsertCompiler.java  |   4 +-
 .../apache/phoenix/execute/AggregatePlan.java   |  12 +-
 .../apache/phoenix/execute/BaseQueryPlan.java   |   4 +-
 .../execute/LiteralResultIterationPlan.java |   2 +-
 .../org/apache/phoenix/execute/ScanPlan.java|  18 +-
 .../phoenix/iterate/BaseResultIterators.java| 226 ++-
 .../apache/phoenix/iterate/ExplainTable.java|   3 +-
 .../phoenix/iterate/ParallelIterators.java  |   8 +-
 .../apache/phoenix/iterate/SerialIterators.java |   4 +-
 .../apache/phoenix/jdbc/PhoenixStatement.java   |   2 +-
 .../apache/phoenix/optimize/QueryOptimizer.java |   4 +-
 .../query/ConnectionlessQueryServicesImpl.java  |   8 +-
 .../phoenix/compile/QueryCompilerTest.java  | 226 +++
 .../apache/phoenix/query/KeyRangeClipTest.java  | 155 +
 .../query/ParallelIteratorsSplitTest.java   |   2 +-
 21 files changed, 674 insertions(+), 60 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/8c1746c2/phoenix-core/src/main/java/org/apache/phoenix/compile/DeleteCompiler.java
--
diff --git a/phoenix-core/src/main/java/org/apache/phoenix/compile/DeleteCompiler.java b/phoenix-core/src/main/java/org/apache/phoenix/compile/DeleteCompiler.java
index 6383ed0..b77fcbe 100644
--- a/phoenix-core/src/main/java/org/apache/phoenix/compile/DeleteCompiler.java
+++ b/phoenix-core/src/main/java/org/apache/phoenix/compile/DeleteCompiler.java
@@ -578,7 +578,7 @@ public class DeleteCompiler {
 }
 final RowProjector projector = projectorToBe;
final QueryPlan aggPlan = new AggregatePlan(context, select, dataPlan.getTableRef(), projector, null, null,
-OrderBy.EMPTY_ORDER_BY, null, GroupBy.EMPTY_GROUP_BY, null);
+OrderBy.EMPTY_ORDER_BY, null, GroupBy.EMPTY_GROUP_BY, null, dataPlan);
return new ServerSelectDeleteMutationPlan(dataPlan, connection, aggPlan, projector, maxSize, maxSizeBytes);
} else {
final DeletingParallelIteratorFactory parallelIteratorFactory = parallelIteratorFactoryToBe;

http://git-wip-us.apache.org/repos/asf/phoenix/blob/8c1746c2/phoenix-core/src/main/java/org/apache/phoenix/compile/ExplainPlan.java
--
diff --git a/phoenix-core/src/main/java/org/apache/phoenix/compile/ExplainPlan.java b/phoenix-core/src/main/java/org/apache/phoenix/compile/ExplainPlan.java
index 2bc7809..ef34daa 100644
--- a/phoenix-core/src/main/java/org/apache/phoenix/compile/ExplainPlan.java
+++ b/phoenix-core/src/main/java/org/apache/phoenix/compile/ExplainPlan.java
@@ -34,4 +34,14 @@ public class ExplainPlan {
 public List getPlanSteps() {
 return planSteps;
 }
+
+@Override
+public String toString() {
+StringBuilder buf = new StringBuilder();
+for (String step : planSteps) {
+buf.append(step);
+buf.append('\n');
+}
+return buf.toString();
+}
 }

http://git-wip-us.apache.org/repos/asf/phoenix/blob/8c1746c2/phoenix-core/src/main/java/org/apache/phoenix/compile/JoinCompiler.java
--
diff --git a/phoenix-core/src/main/java/org/apache/phoenix/compile/JoinCompiler.java b/phoenix-core/src/main/java/org/apache/phoenix/compile/JoinCompiler.java
index 887e2d2..f9d8711 100644
--- a/phoenix-core/src/main/java/org/apache/phoenix/compile/JoinCompiler.java
+++ b/phoenix-core/src/main/java/org/apache/phoenix/compile/JoinCompiler.java
@@ -17,8 +17,8 @@
  */
 package org.apache.phoenix.compile;
 
-import static org.apache.phoenix.schema.PTable.QualifierEncodingScheme.NON_ENCODED_QUALIFIERS;
 import static 

phoenix git commit: PHOENIX-4056 Spark 2.2 empty path fix (Stepan Migunov)

2018-02-09 Thread jmahonin
Repository: phoenix
Updated Branches:
  refs/heads/4.x-cdh5.11.2 ce7fae412 -> 00940b343


PHOENIX-4056 Spark 2.2 empty path fix (Stepan Migunov)


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/00940b34
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/00940b34
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/00940b34

Branch: refs/heads/4.x-cdh5.11.2
Commit: 00940b3430777c18e9a0984e0fa0c2768f6b6c55
Parents: ce7fae4
Author: Josh Mahonin 
Authored: Fri Feb 9 15:03:25 2018 -0500
Committer: Josh Mahonin 
Committed: Fri Feb 9 15:16:14 2018 -0500

--
 .../scala/org/apache/phoenix/spark/DataFrameFunctions.scala| 6 +-
 .../scala/org/apache/phoenix/spark/ProductRDDFunctions.scala   | 6 +-
 2 files changed, 10 insertions(+), 2 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/00940b34/phoenix-spark/src/main/scala/org/apache/phoenix/spark/DataFrameFunctions.scala
--
diff --git a/phoenix-spark/src/main/scala/org/apache/phoenix/spark/DataFrameFunctions.scala b/phoenix-spark/src/main/scala/org/apache/phoenix/spark/DataFrameFunctions.scala
index 92f4c58..be4a32b 100644
--- a/phoenix-spark/src/main/scala/org/apache/phoenix/spark/DataFrameFunctions.scala
+++ b/phoenix-spark/src/main/scala/org/apache/phoenix/spark/DataFrameFunctions.scala
@@ -57,7 +57,11 @@ class DataFrameFunctions(data: DataFrame) extends Serializable {
 
 // Save it
 phxRDD.saveAsNewAPIHadoopFile(
-  "",
+  Option(
+conf.get("mapreduce.output.fileoutputformat.outputdir")
+  ).getOrElse(
+Option(conf.get("mapred.output.dir")).getOrElse("")
+  ),
   classOf[NullWritable],
   classOf[PhoenixRecordWritable],
   classOf[PhoenixOutputFormat[PhoenixRecordWritable]],

http://git-wip-us.apache.org/repos/asf/phoenix/blob/00940b34/phoenix-spark/src/main/scala/org/apache/phoenix/spark/ProductRDDFunctions.scala
--
diff --git a/phoenix-spark/src/main/scala/org/apache/phoenix/spark/ProductRDDFunctions.scala b/phoenix-spark/src/main/scala/org/apache/phoenix/spark/ProductRDDFunctions.scala
index 9b368b6..1b33e6e 100644
--- a/phoenix-spark/src/main/scala/org/apache/phoenix/spark/ProductRDDFunctions.scala
+++ b/phoenix-spark/src/main/scala/org/apache/phoenix/spark/ProductRDDFunctions.scala
@@ -49,7 +49,11 @@ class ProductRDDFunctions[A <: Product](data: RDD[A]) extends Serializable {
 
 // Save it
 phxRDD.saveAsNewAPIHadoopFile(
-  "",
+  Option(
+conf.get("mapreduce.output.fileoutputformat.outputdir")
+  ).getOrElse(
+Option(conf.get("mapred.output.dir")).getOrElse("")
+  ),
   classOf[NullWritable],
   classOf[PhoenixRecordWritable],
   classOf[PhoenixOutputFormat[PhoenixRecordWritable]],



phoenix git commit: PHOENIX-4056 Spark 2.2 empty path fix (Stepan Migunov)

2018-02-09 Thread jmahonin
Repository: phoenix
Updated Branches:
  refs/heads/5.x-HBase-2.0 2cc3241d2 -> f1ea37105


PHOENIX-4056 Spark 2.2 empty path fix (Stepan Migunov)


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/f1ea3710
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/f1ea3710
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/f1ea3710

Branch: refs/heads/5.x-HBase-2.0
Commit: f1ea371053e641759784895310e82f012a47ee8e
Parents: 2cc3241
Author: Josh Mahonin 
Authored: Fri Feb 9 15:03:25 2018 -0500
Committer: Josh Mahonin 
Committed: Fri Feb 9 15:15:51 2018 -0500

--
 .../scala/org/apache/phoenix/spark/DataFrameFunctions.scala| 6 +-
 .../scala/org/apache/phoenix/spark/ProductRDDFunctions.scala   | 6 +-
 2 files changed, 10 insertions(+), 2 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/f1ea3710/phoenix-spark/src/main/scala/org/apache/phoenix/spark/DataFrameFunctions.scala
--
diff --git a/phoenix-spark/src/main/scala/org/apache/phoenix/spark/DataFrameFunctions.scala b/phoenix-spark/src/main/scala/org/apache/phoenix/spark/DataFrameFunctions.scala
index 92f4c58..be4a32b 100644
--- a/phoenix-spark/src/main/scala/org/apache/phoenix/spark/DataFrameFunctions.scala
+++ b/phoenix-spark/src/main/scala/org/apache/phoenix/spark/DataFrameFunctions.scala
@@ -57,7 +57,11 @@ class DataFrameFunctions(data: DataFrame) extends Serializable {
 
 // Save it
 phxRDD.saveAsNewAPIHadoopFile(
-  "",
+  Option(
+conf.get("mapreduce.output.fileoutputformat.outputdir")
+  ).getOrElse(
+Option(conf.get("mapred.output.dir")).getOrElse("")
+  ),
   classOf[NullWritable],
   classOf[PhoenixRecordWritable],
   classOf[PhoenixOutputFormat[PhoenixRecordWritable]],

http://git-wip-us.apache.org/repos/asf/phoenix/blob/f1ea3710/phoenix-spark/src/main/scala/org/apache/phoenix/spark/ProductRDDFunctions.scala
--
diff --git a/phoenix-spark/src/main/scala/org/apache/phoenix/spark/ProductRDDFunctions.scala b/phoenix-spark/src/main/scala/org/apache/phoenix/spark/ProductRDDFunctions.scala
index 9b368b6..1b33e6e 100644
--- a/phoenix-spark/src/main/scala/org/apache/phoenix/spark/ProductRDDFunctions.scala
+++ b/phoenix-spark/src/main/scala/org/apache/phoenix/spark/ProductRDDFunctions.scala
@@ -49,7 +49,11 @@ class ProductRDDFunctions[A <: Product](data: RDD[A]) extends Serializable {
 
 // Save it
 phxRDD.saveAsNewAPIHadoopFile(
-  "",
+  Option(
+conf.get("mapreduce.output.fileoutputformat.outputdir")
+  ).getOrElse(
+Option(conf.get("mapred.output.dir")).getOrElse("")
+  ),
   classOf[NullWritable],
   classOf[PhoenixRecordWritable],
   classOf[PhoenixOutputFormat[PhoenixRecordWritable]],



phoenix git commit: PHOENIX-4056 Spark 2.2 empty path fix (Stepan Migunov)

2018-02-09 Thread jmahonin
Repository: phoenix
Updated Branches:
  refs/heads/4.x-HBase-1.3 60e5ff315 -> d15efd1bf


PHOENIX-4056 Spark 2.2 empty path fix (Stepan Migunov)


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/d15efd1b
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/d15efd1b
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/d15efd1b

Branch: refs/heads/4.x-HBase-1.3
Commit: d15efd1bfcc7174c61c8d40b6d543fba11459a75
Parents: 60e5ff3
Author: Josh Mahonin 
Authored: Fri Feb 9 15:03:25 2018 -0500
Committer: Josh Mahonin 
Committed: Fri Feb 9 15:14:58 2018 -0500

--
 .../scala/org/apache/phoenix/spark/DataFrameFunctions.scala| 6 +-
 .../scala/org/apache/phoenix/spark/ProductRDDFunctions.scala   | 6 +-
 2 files changed, 10 insertions(+), 2 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/d15efd1b/phoenix-spark/src/main/scala/org/apache/phoenix/spark/DataFrameFunctions.scala
--
diff --git a/phoenix-spark/src/main/scala/org/apache/phoenix/spark/DataFrameFunctions.scala b/phoenix-spark/src/main/scala/org/apache/phoenix/spark/DataFrameFunctions.scala
index 92f4c58..be4a32b 100644
--- a/phoenix-spark/src/main/scala/org/apache/phoenix/spark/DataFrameFunctions.scala
+++ b/phoenix-spark/src/main/scala/org/apache/phoenix/spark/DataFrameFunctions.scala
@@ -57,7 +57,11 @@ class DataFrameFunctions(data: DataFrame) extends Serializable {
 
 // Save it
 phxRDD.saveAsNewAPIHadoopFile(
-  "",
+  Option(
+conf.get("mapreduce.output.fileoutputformat.outputdir")
+  ).getOrElse(
+Option(conf.get("mapred.output.dir")).getOrElse("")
+  ),
   classOf[NullWritable],
   classOf[PhoenixRecordWritable],
   classOf[PhoenixOutputFormat[PhoenixRecordWritable]],

http://git-wip-us.apache.org/repos/asf/phoenix/blob/d15efd1b/phoenix-spark/src/main/scala/org/apache/phoenix/spark/ProductRDDFunctions.scala
--
diff --git a/phoenix-spark/src/main/scala/org/apache/phoenix/spark/ProductRDDFunctions.scala b/phoenix-spark/src/main/scala/org/apache/phoenix/spark/ProductRDDFunctions.scala
index 9b368b6..1b33e6e 100644
--- a/phoenix-spark/src/main/scala/org/apache/phoenix/spark/ProductRDDFunctions.scala
+++ b/phoenix-spark/src/main/scala/org/apache/phoenix/spark/ProductRDDFunctions.scala
@@ -49,7 +49,11 @@ class ProductRDDFunctions[A <: Product](data: RDD[A]) extends Serializable {
 
 // Save it
 phxRDD.saveAsNewAPIHadoopFile(
-  "",
+  Option(
+conf.get("mapreduce.output.fileoutputformat.outputdir")
+  ).getOrElse(
+Option(conf.get("mapred.output.dir")).getOrElse("")
+  ),
   classOf[NullWritable],
   classOf[PhoenixRecordWritable],
   classOf[PhoenixOutputFormat[PhoenixRecordWritable]],



phoenix git commit: PHOENIX-4056 Spark 2.2 empty path fix (Stepan Migunov)

2018-02-09 Thread jmahonin
Repository: phoenix
Updated Branches:
  refs/heads/4.x-HBase-1.2 96c13c113 -> 618dfb026


PHOENIX-4056 Spark 2.2 empty path fix (Stepan Migunov)


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/618dfb02
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/618dfb02
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/618dfb02

Branch: refs/heads/4.x-HBase-1.2
Commit: 618dfb0267d2d7d3008c955dbd0d13bccad83b0f
Parents: 96c13c1
Author: Josh Mahonin 
Authored: Fri Feb 9 15:03:25 2018 -0500
Committer: Josh Mahonin 
Committed: Fri Feb 9 15:14:13 2018 -0500

--
 .../scala/org/apache/phoenix/spark/DataFrameFunctions.scala| 6 +-
 .../scala/org/apache/phoenix/spark/ProductRDDFunctions.scala   | 6 +-
 2 files changed, 10 insertions(+), 2 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/618dfb02/phoenix-spark/src/main/scala/org/apache/phoenix/spark/DataFrameFunctions.scala
--
diff --git a/phoenix-spark/src/main/scala/org/apache/phoenix/spark/DataFrameFunctions.scala b/phoenix-spark/src/main/scala/org/apache/phoenix/spark/DataFrameFunctions.scala
index 92f4c58..be4a32b 100644
--- a/phoenix-spark/src/main/scala/org/apache/phoenix/spark/DataFrameFunctions.scala
+++ b/phoenix-spark/src/main/scala/org/apache/phoenix/spark/DataFrameFunctions.scala
@@ -57,7 +57,11 @@ class DataFrameFunctions(data: DataFrame) extends Serializable {
 
 // Save it
 phxRDD.saveAsNewAPIHadoopFile(
-  "",
+  Option(
+conf.get("mapreduce.output.fileoutputformat.outputdir")
+  ).getOrElse(
+Option(conf.get("mapred.output.dir")).getOrElse("")
+  ),
   classOf[NullWritable],
   classOf[PhoenixRecordWritable],
   classOf[PhoenixOutputFormat[PhoenixRecordWritable]],

http://git-wip-us.apache.org/repos/asf/phoenix/blob/618dfb02/phoenix-spark/src/main/scala/org/apache/phoenix/spark/ProductRDDFunctions.scala
--
diff --git a/phoenix-spark/src/main/scala/org/apache/phoenix/spark/ProductRDDFunctions.scala b/phoenix-spark/src/main/scala/org/apache/phoenix/spark/ProductRDDFunctions.scala
index 9b368b6..1b33e6e 100644
--- a/phoenix-spark/src/main/scala/org/apache/phoenix/spark/ProductRDDFunctions.scala
+++ b/phoenix-spark/src/main/scala/org/apache/phoenix/spark/ProductRDDFunctions.scala
@@ -49,7 +49,11 @@ class ProductRDDFunctions[A <: Product](data: RDD[A]) extends Serializable {
 
 // Save it
 phxRDD.saveAsNewAPIHadoopFile(
-  "",
+  Option(
+conf.get("mapreduce.output.fileoutputformat.outputdir")
+  ).getOrElse(
+Option(conf.get("mapred.output.dir")).getOrElse("")
+  ),
   classOf[NullWritable],
   classOf[PhoenixRecordWritable],
   classOf[PhoenixOutputFormat[PhoenixRecordWritable]],



phoenix git commit: PHOENIX-4056 Spark 2.2 empty path fix (Stepan Migunov)

2018-02-09 Thread jmahonin
Repository: phoenix
Updated Branches:
  refs/heads/4.x-HBase-0.98 d2801ef98 -> 1d34ad38b


PHOENIX-4056 Spark 2.2 empty path fix (Stepan Migunov)


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/1d34ad38
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/1d34ad38
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/1d34ad38

Branch: refs/heads/4.x-HBase-0.98
Commit: 1d34ad38bb90d2c87b518c49832d33bc4c391999
Parents: d2801ef
Author: Josh Mahonin 
Authored: Fri Feb 9 15:03:25 2018 -0500
Committer: Josh Mahonin 
Committed: Fri Feb 9 15:13:34 2018 -0500

--
 .../scala/org/apache/phoenix/spark/DataFrameFunctions.scala| 6 +-
 .../scala/org/apache/phoenix/spark/ProductRDDFunctions.scala   | 6 +-
 2 files changed, 10 insertions(+), 2 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/1d34ad38/phoenix-spark/src/main/scala/org/apache/phoenix/spark/DataFrameFunctions.scala
--
diff --git a/phoenix-spark/src/main/scala/org/apache/phoenix/spark/DataFrameFunctions.scala b/phoenix-spark/src/main/scala/org/apache/phoenix/spark/DataFrameFunctions.scala
index 92f4c58..be4a32b 100644
--- a/phoenix-spark/src/main/scala/org/apache/phoenix/spark/DataFrameFunctions.scala
+++ b/phoenix-spark/src/main/scala/org/apache/phoenix/spark/DataFrameFunctions.scala
@@ -57,7 +57,11 @@ class DataFrameFunctions(data: DataFrame) extends Serializable {
 
 // Save it
 phxRDD.saveAsNewAPIHadoopFile(
-  "",
+  Option(
+conf.get("mapreduce.output.fileoutputformat.outputdir")
+  ).getOrElse(
+Option(conf.get("mapred.output.dir")).getOrElse("")
+  ),
   classOf[NullWritable],
   classOf[PhoenixRecordWritable],
   classOf[PhoenixOutputFormat[PhoenixRecordWritable]],

http://git-wip-us.apache.org/repos/asf/phoenix/blob/1d34ad38/phoenix-spark/src/main/scala/org/apache/phoenix/spark/ProductRDDFunctions.scala
--
diff --git a/phoenix-spark/src/main/scala/org/apache/phoenix/spark/ProductRDDFunctions.scala b/phoenix-spark/src/main/scala/org/apache/phoenix/spark/ProductRDDFunctions.scala
index 9b368b6..1b33e6e 100644
--- a/phoenix-spark/src/main/scala/org/apache/phoenix/spark/ProductRDDFunctions.scala
+++ b/phoenix-spark/src/main/scala/org/apache/phoenix/spark/ProductRDDFunctions.scala
@@ -49,7 +49,11 @@ class ProductRDDFunctions[A <: Product](data: RDD[A]) extends Serializable {
 
 // Save it
 phxRDD.saveAsNewAPIHadoopFile(
-  "",
+  Option(
+conf.get("mapreduce.output.fileoutputformat.outputdir")
+  ).getOrElse(
+Option(conf.get("mapred.output.dir")).getOrElse("")
+  ),
   classOf[NullWritable],
   classOf[PhoenixRecordWritable],
   classOf[PhoenixOutputFormat[PhoenixRecordWritable]],



phoenix git commit: PHOENIX-4056 Spark 2.2 empty path fix (Stepan Migunov)

2018-02-09 Thread jmahonin
Repository: phoenix
Updated Branches:
  refs/heads/master 8f01b5fad -> f5512105c


PHOENIX-4056 Spark 2.2 empty path fix (Stepan Migunov)


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/f5512105
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/f5512105
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/f5512105

Branch: refs/heads/master
Commit: f5512105c6507d948b7bca789ece29dee5917711
Parents: 8f01b5f
Author: Josh Mahonin 
Authored: Fri Feb 9 15:03:25 2018 -0500
Committer: Josh Mahonin 
Committed: Fri Feb 9 15:03:25 2018 -0500

--
 .../scala/org/apache/phoenix/spark/DataFrameFunctions.scala| 6 +-
 .../scala/org/apache/phoenix/spark/ProductRDDFunctions.scala   | 6 +-
 2 files changed, 10 insertions(+), 2 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/f5512105/phoenix-spark/src/main/scala/org/apache/phoenix/spark/DataFrameFunctions.scala
--
diff --git a/phoenix-spark/src/main/scala/org/apache/phoenix/spark/DataFrameFunctions.scala b/phoenix-spark/src/main/scala/org/apache/phoenix/spark/DataFrameFunctions.scala
index 92f4c58..be4a32b 100644
--- a/phoenix-spark/src/main/scala/org/apache/phoenix/spark/DataFrameFunctions.scala
+++ b/phoenix-spark/src/main/scala/org/apache/phoenix/spark/DataFrameFunctions.scala
@@ -57,7 +57,11 @@ class DataFrameFunctions(data: DataFrame) extends Serializable {
 
 // Save it
 phxRDD.saveAsNewAPIHadoopFile(
-  "",
+  Option(
+conf.get("mapreduce.output.fileoutputformat.outputdir")
+  ).getOrElse(
+Option(conf.get("mapred.output.dir")).getOrElse("")
+  ),
   classOf[NullWritable],
   classOf[PhoenixRecordWritable],
   classOf[PhoenixOutputFormat[PhoenixRecordWritable]],

http://git-wip-us.apache.org/repos/asf/phoenix/blob/f5512105/phoenix-spark/src/main/scala/org/apache/phoenix/spark/ProductRDDFunctions.scala
--
diff --git a/phoenix-spark/src/main/scala/org/apache/phoenix/spark/ProductRDDFunctions.scala b/phoenix-spark/src/main/scala/org/apache/phoenix/spark/ProductRDDFunctions.scala
index 9b368b6..1b33e6e 100644
--- a/phoenix-spark/src/main/scala/org/apache/phoenix/spark/ProductRDDFunctions.scala
+++ b/phoenix-spark/src/main/scala/org/apache/phoenix/spark/ProductRDDFunctions.scala
@@ -49,7 +49,11 @@ class ProductRDDFunctions[A <: Product](data: RDD[A]) extends Serializable {
 
 // Save it
 phxRDD.saveAsNewAPIHadoopFile(
-  "",
+  Option(
+conf.get("mapreduce.output.fileoutputformat.outputdir")
+  ).getOrElse(
+Option(conf.get("mapred.output.dir")).getOrElse("")
+  ),
   classOf[NullWritable],
   classOf[PhoenixRecordWritable],
   classOf[PhoenixOutputFormat[PhoenixRecordWritable]],



Build failed in Jenkins: Phoenix Compile Compatibility with HBase #543

2018-02-09 Thread Apache Jenkins Server
See 


--
[...truncated 39.70 KB...]
[ERROR] 
:[364,5]
 method does not override or implement a method from a supertype
[ERROR] 
:[370,5]
 method does not override or implement a method from a supertype
[ERROR] 
:[376,5]
 method does not override or implement a method from a supertype
[ERROR] 
:[382,5]
 method does not override or implement a method from a supertype
[ERROR] Failed to execute goal 
org.apache.maven.plugins:maven-compiler-plugin:3.0:compile (default-compile) on 
project phoenix-core: Compilation failure: Compilation failure: 
[ERROR] 
:[34,39]
 cannot find symbol
[ERROR]   symbol:   class MetricRegistry
[ERROR]   location: package org.apache.hadoop.hbase.metrics
[ERROR] 
:[144,16]
 cannot find symbol
[ERROR]   symbol:   class MetricRegistry
[ERROR]   location: class 
org.apache.phoenix.coprocessor.PhoenixMetaDataCoprocessorHost.PhoenixMetaDataControllerEnvironment
[ERROR] 
:[24,35]
 cannot find symbol
[ERROR]   symbol:   class DelegatingHBaseRpcController
[ERROR]   location: package org.apache.hadoop.hbase.ipc
[ERROR] 
:[25,35]
 cannot find symbol
[ERROR]   symbol:   class HBaseRpcController
[ERROR]   location: package org.apache.hadoop.hbase.ipc
[ERROR] 
:[37,37]
 cannot find symbol
[ERROR]   symbol: class DelegatingHBaseRpcController
[ERROR] 
:[56,38]
 cannot find symbol
[ERROR]   symbol:   class HBaseRpcController
[ERROR]   location: class 
org.apache.hadoop.hbase.ipc.controller.MetadataRpcController
[ERROR] 
:[26,35]
 cannot find symbol
[ERROR]   symbol:   class HBaseRpcController
[ERROR]   location: package org.apache.hadoop.hbase.ipc
[ERROR] 
:[40,12]
 cannot find symbol
[ERROR]   symbol:   class HBaseRpcController
[ERROR]   location: class 
org.apache.hadoop.hbase.ipc.controller.InterRegionServerMetadataRpcControllerFactory
[ERROR] 
:[46,12]
 cannot find symbol
[ERROR]   symbol:   class HBaseRpcController
[ERROR]   location: class 
org.apache.hadoop.hbase.ipc.controller.InterRegionServerMetadataRpcControllerFactory
[ERROR] 
:[52,12]
 cannot find symbol
[ERROR]   symbol:   class HBaseRpcController
[ERROR]   location: class 
org.apache.hadoop.hbase.ipc.controller.InterRegionServerMetadataRpcControllerFactory
[ERROR] 
:[57,46]
 cannot 

Build failed in Jenkins: Phoenix-4.x-HBase-1.2 #260

2018-02-09 Thread Apache Jenkins Server
See 


Changes:

[ankitsinghal59] PHOENIX-4588 Clone expression also if it's children have

--
[...truncated 128.39 KB...]
[ERROR] 
testInClauseWithIndexOnColumnOfUsignedIntType[GlobalImmutableTxIndexIT_localIndex=false,mutable=false,transactional=true,columnEncoded=true](org.apache.phoenix.end2end.index.GlobalImmutableTxIndexIT)
  Time elapsed: 46.114 s  <<< ERROR!
java.lang.RuntimeException: org.apache.thrift.TException: Unable to discover 
transaction service.
Caused by: org.apache.thrift.TException: Unable to discover transaction service.

[ERROR] 
testDeleteFromNonPKColumnIndex[GlobalImmutableTxIndexIT_localIndex=false,mutable=false,transactional=true,columnEncoded=true](org.apache.phoenix.end2end.index.GlobalImmutableTxIndexIT)
  Time elapsed: 46.113 s  <<< ERROR!
java.lang.RuntimeException: org.apache.thrift.TException: Unable to discover 
transaction service.
Caused by: org.apache.thrift.TException: Unable to discover transaction service.

[ERROR] 
testGroupByCount[GlobalImmutableTxIndexIT_localIndex=false,mutable=false,transactional=true,columnEncoded=true](org.apache.phoenix.end2end.index.GlobalImmutableTxIndexIT)
  Time elapsed: 46.113 s  <<< ERROR!
java.lang.RuntimeException: org.apache.thrift.TException: Unable to discover 
transaction service.
Caused by: org.apache.thrift.TException: Unable to discover transaction service.

[ERROR] 
testInFilterOnIndexedTable[GlobalImmutableTxIndexIT_localIndex=false,mutable=false,transactional=true,columnEncoded=true](org.apache.phoenix.end2end.index.GlobalImmutableTxIndexIT)
  Time elapsed: 46.112 s  <<< ERROR!
java.lang.RuntimeException: org.apache.thrift.TException: Unable to discover 
transaction service.
Caused by: org.apache.thrift.TException: Unable to discover transaction service.

[ERROR] 
testTableDescriptorPriority[GlobalImmutableTxIndexIT_localIndex=false,mutable=false,transactional=true,columnEncoded=true](org.apache.phoenix.end2end.index.GlobalImmutableTxIndexIT)
  Time elapsed: 46.148 s  <<< ERROR!
java.lang.RuntimeException: org.apache.thrift.TException: Unable to discover 
transaction service.
Caused by: org.apache.thrift.TException: Unable to discover transaction service.

[ERROR] 
testSelectCF[GlobalImmutableTxIndexIT_localIndex=false,mutable=false,transactional=true,columnEncoded=true](org.apache.phoenix.end2end.index.GlobalImmutableTxIndexIT)
  Time elapsed: 46.119 s  <<< ERROR!
java.lang.RuntimeException: org.apache.thrift.TException: Unable to discover 
transaction service.
Caused by: org.apache.thrift.TException: Unable to discover transaction service.

[ERROR] 
testUpsertAfterIndexDrop[GlobalImmutableTxIndexIT_localIndex=false,mutable=false,transactional=true,columnEncoded=true](org.apache.phoenix.end2end.index.GlobalImmutableTxIndexIT)
  Time elapsed: 46.119 s  <<< ERROR!
java.lang.RuntimeException: org.apache.thrift.TException: Unable to discover 
transaction service.
Caused by: org.apache.thrift.TException: Unable to discover transaction service.

[ERROR] 
testReturnedTimestamp[GlobalImmutableTxIndexIT_localIndex=false,mutable=false,transactional=true,columnEncoded=true](org.apache.phoenix.end2end.index.GlobalImmutableTxIndexIT)
  Time elapsed: 46.111 s  <<< ERROR!
java.lang.RuntimeException: org.apache.thrift.TException: Unable to discover 
transaction service.
Caused by: org.apache.thrift.TException: Unable to discover transaction service.

[WARNING] Tests run: 52, Failures: 0, Errors: 0, Skipped: 4, Time elapsed: 
354.64 s - in org.apache.phoenix.tx.ParameterizedTransactionIT
[INFO] Tests run: 40, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 417.727 
s - in org.apache.phoenix.tx.TxCheckpointIT
[INFO] 
[INFO] Results:
[INFO] 
[ERROR] Errors: 
[ERROR]   
GlobalImmutableTxIndexIT>BaseIndexIT.createIndexOnTableWithSpecifiedDefaultCF:499
 » Runtime
[ERROR]   
GlobalImmutableTxIndexIT>BaseIndexIT.createIndexOnTableWithSpecifiedDefaultCF:499
 » Runtime
[ERROR]   
GlobalImmutableTxIndexIT>BaseIndexIT.testCreateIndexAfterUpsertStartedTxnl:264->BaseIndexIT.testCreateIndexAfterUpsertStarted:276
 » Runtime
[ERROR]   
GlobalImmutableTxIndexIT>BaseIndexIT.testCreateIndexAfterUpsertStartedTxnl:264->BaseIndexIT.testCreateIndexAfterUpsertStarted:276
 » Runtime
[ERROR]   
GlobalImmutableTxIndexIT>BaseIndexIT.testCreateIndexAfterUpsertStarted:256->BaseIndexIT.testCreateIndexAfterUpsertStarted:276
 » Runtime
[ERROR]   
GlobalImmutableTxIndexIT>BaseIndexIT.testCreateIndexAfterUpsertStarted:256->BaseIndexIT.testCreateIndexAfterUpsertStarted:276
 » Runtime
[ERROR]   
GlobalImmutableTxIndexIT>BaseIndexIT.testDeleteFromAllPKColumnIndex:192 » 
Runtime
[ERROR]   
GlobalImmutableTxIndexIT>BaseIndexIT.testDeleteFromAllPKColumnIndex:192 » 
Runtime
[ERROR]   
GlobalImmutableTxIndexIT>BaseIndexIT.testDeleteFromNonPKColumnIndex:370 » 
Runtime
[ERROR]   
GlobalImmutableTxIndexIT>BaseIndexIT.testDeleteFromNonPKColumnIndex:370 » 

Build failed in Jenkins: Phoenix | Master #1931

2018-02-09 Thread Apache Jenkins Server
See 


Changes:

[ankitsinghal59] PHOENIX-4588 Clone expression also if it's children have

--
[...truncated 110.27 KB...]
[INFO] Running org.apache.phoenix.end2end.join.SubqueryIT
[INFO] Tests run: 33, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 364.008 
s - in org.apache.phoenix.end2end.join.HashJoinNoIndexIT
[INFO] Running org.apache.phoenix.end2end.join.SubqueryUsingSortMergeJoinIT
[INFO] Tests run: 64, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 740.163 
s - in org.apache.phoenix.end2end.index.MutableIndexIT
[INFO] Running org.apache.phoenix.end2end.salted.SaltedTableIT
[INFO] Tests run: 34, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 370.07 
s - in org.apache.phoenix.end2end.join.SortMergeJoinNoIndexIT
[INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 26.392 s 
- in org.apache.phoenix.end2end.salted.SaltedTableIT
[INFO] Running org.apache.phoenix.end2end.salted.SaltedTableVarLengthRowKeyIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.404 s 
- in org.apache.phoenix.end2end.salted.SaltedTableVarLengthRowKeyIT
[INFO] Running org.apache.phoenix.iterate.PhoenixQueryTimeoutIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.632 s 
- in org.apache.phoenix.iterate.PhoenixQueryTimeoutIT
[INFO] Running org.apache.phoenix.iterate.RoundRobinResultIteratorIT
[INFO] Tests run: 24, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 391.621 
s - in org.apache.phoenix.end2end.join.SubqueryIT
[INFO] Running org.apache.phoenix.end2end.salted.SaltedTableUpsertSelectIT
[INFO] Running org.apache.phoenix.replication.SystemCatalogWALEntryFilterIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.254 s 
- in org.apache.phoenix.replication.SystemCatalogWALEntryFilterIT
[INFO] Running org.apache.phoenix.rpc.UpdateCacheIT
[INFO] Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 53.903 s 
- in org.apache.phoenix.end2end.salted.SaltedTableUpsertSelectIT
[INFO] Running org.apache.phoenix.trace.PhoenixTableMetricsWriterIT
[INFO] Tests run: 33, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 609.606 
s - in org.apache.phoenix.end2end.join.HashJoinGlobalIndexIT
[INFO] Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 75.434 s 
- in org.apache.phoenix.iterate.RoundRobinResultIteratorIT
[INFO] Running org.apache.phoenix.tx.FlappingTransactionIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.359 s 
- in org.apache.phoenix.trace.PhoenixTableMetricsWriterIT
[INFO] Running org.apache.phoenix.tx.ParameterizedTransactionIT
[INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 19.861 s 
- in org.apache.phoenix.tx.FlappingTransactionIT
[INFO] Running org.apache.phoenix.tx.TransactionIT
[INFO] Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 42.619 s 
- in org.apache.phoenix.rpc.UpdateCacheIT
[INFO] Running org.apache.phoenix.tx.TxCheckpointIT
[INFO] Running org.apache.phoenix.trace.PhoenixTracingEndToEndIT
[INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 95.143 s 
- in org.apache.phoenix.tx.TransactionIT
[INFO] Tests run: 15, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 295.432 
s - in org.apache.phoenix.end2end.join.SubqueryUsingSortMergeJoinIT
[INFO] Tests run: 34, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 649.255 
s - in org.apache.phoenix.end2end.join.SortMergeJoinGlobalIndexIT
[INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 104.037 
s - in org.apache.phoenix.trace.PhoenixTracingEndToEndIT
[INFO] Tests run: 34, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 820.103 
s - in org.apache.phoenix.end2end.join.HashJoinLocalIndexIT
[INFO] Running org.apache.phoenix.util.IndexScrutinyIT
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 26.583 s 
- in org.apache.phoenix.util.IndexScrutinyIT
[INFO] Tests run: 34, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 814.116 
s - in org.apache.phoenix.end2end.join.SortMergeJoinLocalIndexIT
[WARNING] Tests run: 52, Failures: 0, Errors: 0, Skipped: 4, Time elapsed: 
371.888 s - in org.apache.phoenix.tx.ParameterizedTransactionIT
[INFO] Tests run: 40, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 441.413 
s - in org.apache.phoenix.tx.TxCheckpointIT
[INFO] 
[INFO] Results:
[INFO] 
[ERROR] Failures: 
[ERROR]   ConcurrentMutationsIT.testConcurrentDeletesAndUpsertValues:214 
Expected to find PK in data table: (0,0)
[ERROR]   DefaultColumnValueIT.testDefaultIndexed:978
[ERROR]   RowValueConstructorIT.testRVCLastPkIsTable1stPkIndex:1584
[ERROR]   
IndexMetadataIT.testMutableTableOnlyHasPrimaryKeyIndex:623->helpTestTableOnlyHasPrimaryKeyIndex:662
[ERROR] Errors: 
[ERROR]   

Jenkins build is back to normal : Phoenix | 4.x-HBase-0.98 #1812

2018-02-09 Thread Apache Jenkins Server
See