See <https://builds.apache.org/job/Phoenix-master/2276/display/redirect?page=changes>

Changes:

[tdsilva] modify index state based on client version to support old clients

------------------------------------------
[...truncated 843.88 KB...]
[INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 25.451 s - in org.apache.phoenix.end2end.DropTableWithViewsIT
[INFO] Running org.apache.phoenix.end2end.TenantSpecificViewIndexSaltedIT
[INFO] Running org.apache.phoenix.end2end.ViewIT
[INFO] Running org.apache.phoenix.end2end.index.ViewIndexIT
[INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 86.641 s - in org.apache.phoenix.end2end.TenantSpecificViewIndexSaltedIT
[INFO] Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 171.915 s - in org.apache.phoenix.end2end.AlterMultiTenantTableWithViewsIT
[ERROR] Tests run: 10, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 160.6 s <<< FAILURE! - in org.apache.phoenix.end2end.TenantSpecificViewIndexIT
[ERROR] testMultiCFViewIndex(org.apache.phoenix.end2end.TenantSpecificViewIndexIT)  Time elapsed: 13.3 s  <<< ERROR!
org.apache.phoenix.exception.PhoenixIOException: org.apache.hadoop.hbase.DoNotRetryIOException: SCHEMA2.N000030: java.lang.OutOfMemoryError: unable to create new native thread
        at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:114)
        at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:656)
        at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:17038)
        at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:8016)
        at org.apache.hadoop.hbase.regionserver.RSRpcServices.execServiceOnRegion(RSRpcServices.java:2409)
        at org.apache.hadoop.hbase.regionserver.RSRpcServices.execService(RSRpcServices.java:2391)
        at org.apache.hadoop.hbase.shaded.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:42010)
        at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:409)
        at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:130)
        at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:324)
        at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:304)
Caused by: java.lang.RuntimeException: java.lang.OutOfMemoryError: unable to create new native thread
        at org.apache.hadoop.hbase.client.RpcRetryingCallerImpl.callWithoutRetries(RpcRetryingCallerImpl.java:200)
        at org.apache.hadoop.hbase.client.ClientScanner.call(ClientScanner.java:269)
        at org.apache.hadoop.hbase.client.ClientScanner.loadCache(ClientScanner.java:437)
        at org.apache.hadoop.hbase.client.ClientScanner.nextWithSyncCache(ClientScanner.java:312)
        at org.apache.hadoop.hbase.client.ClientScanner.next(ClientScanner.java:597)
        at org.apache.phoenix.coprocessor.ViewFinder.findRelatedViews(ViewFinder.java:94)
        at org.apache.phoenix.coprocessor.ViewFinder.findAllRelatives(ViewFinder.java:65)
        at org.apache.phoenix.coprocessor.ViewFinder.findAllRelatives(ViewFinder.java:59)
        at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.findAncestorViews(MetaDataEndpointImpl.java:2565)
        at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.findAncestorViewsOfIndex(MetaDataEndpointImpl.java:2553)
        at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.addDerivedColumnsFromAncestors(MetaDataEndpointImpl.java:744)
        at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.combineColumns(MetaDataEndpointImpl.java:680)
        at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTableFromCache(MetaDataEndpointImpl.java:1928)
        at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:3714)
        at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.combineColumns(MetaDataEndpointImpl.java:695)
        at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTableFromCache(MetaDataEndpointImpl.java:1928)
        at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:3714)
        at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:611)
        ... 9 more
Caused by: java.lang.OutOfMemoryError: unable to create new native thread
        at java.lang.Thread.start0(Native Method)
        at java.lang.Thread.start(Thread.java:717)
        at java.util.concurrent.ThreadPoolExecutor.addWorker(ThreadPoolExecutor.java:957)
        at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1367)
        at org.apache.hadoop.hbase.client.ResultBoundedCompletionService.submit(ResultBoundedCompletionService.java:171)
        at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.addCallsForCurrentReplica(ScannerCallableWithReplicas.java:320)
        at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:182)
        at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:58)
        at org.apache.hadoop.hbase.client.RpcRetryingCallerImpl.callWithoutRetries(RpcRetryingCallerImpl.java:192)
        ... 26 more

        at org.apache.phoenix.end2end.TenantSpecificViewIndexIT.createViewAndIndexesWithTenantId(TenantSpecificViewIndexIT.java:195)
        at org.apache.phoenix.end2end.TenantSpecificViewIndexIT.testMultiCFViewIndex(TenantSpecificViewIndexIT.java:127)
        at org.apache.phoenix.end2end.TenantSpecificViewIndexIT.testMultiCFViewIndex(TenantSpecificViewIndexIT.java:80)
Caused by: org.apache.hadoop.hbase.DoNotRetryIOException: org.apache.hadoop.hbase.DoNotRetryIOException: SCHEMA2.N000030: java.lang.OutOfMemoryError: unable to create new native thread
        at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:114)
        at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:656)
        at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:17038)
        at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:8016)
        at org.apache.hadoop.hbase.regionserver.RSRpcServices.execServiceOnRegion(RSRpcServices.java:2409)
        at org.apache.hadoop.hbase.regionserver.RSRpcServices.execService(RSRpcServices.java:2391)
        at org.apache.hadoop.hbase.shaded.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:42010)
        at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:409)
        at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:130)
        at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:324)
        at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:304)
Caused by: java.lang.RuntimeException: java.lang.OutOfMemoryError: unable to create new native thread
        at org.apache.hadoop.hbase.client.RpcRetryingCallerImpl.callWithoutRetries(RpcRetryingCallerImpl.java:200)
        at org.apache.hadoop.hbase.client.ClientScanner.call(ClientScanner.java:269)
        at org.apache.hadoop.hbase.client.ClientScanner.loadCache(ClientScanner.java:437)
        at org.apache.hadoop.hbase.client.ClientScanner.nextWithSyncCache(ClientScanner.java:312)
        at org.apache.hadoop.hbase.client.ClientScanner.next(ClientScanner.java:597)
        at org.apache.phoenix.coprocessor.ViewFinder.findRelatedViews(ViewFinder.java:94)
        at org.apache.phoenix.coprocessor.ViewFinder.findAllRelatives(ViewFinder.java:65)
        at org.apache.phoenix.coprocessor.ViewFinder.findAllRelatives(ViewFinder.java:59)
        at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.findAncestorViews(MetaDataEndpointImpl.java:2565)
        at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.findAncestorViewsOfIndex(MetaDataEndpointImpl.java:2553)
        at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.addDerivedColumnsFromAncestors(MetaDataEndpointImpl.java:744)
        at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.combineColumns(MetaDataEndpointImpl.java:680)
        at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTableFromCache(MetaDataEndpointImpl.java:1928)
        at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:3714)
        at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.combineColumns(MetaDataEndpointImpl.java:695)
        at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTableFromCache(MetaDataEndpointImpl.java:1928)
        at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:3714)
        at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:611)
        ... 9 more
Caused by: java.lang.OutOfMemoryError: unable to create new native thread
        at java.lang.Thread.start0(Native Method)
        at java.lang.Thread.start(Thread.java:717)
        at java.util.concurrent.ThreadPoolExecutor.addWorker(ThreadPoolExecutor.java:957)
        at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1367)
        at org.apache.hadoop.hbase.client.ResultBoundedCompletionService.submit(ResultBoundedCompletionService.java:171)
        at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.addCallsForCurrentReplica(ScannerCallableWithReplicas.java:320)
        at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:182)
        at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:58)
        at org.apache.hadoop.hbase.client.RpcRetryingCallerImpl.callWithoutRetries(RpcRetryingCallerImpl.java:192)
        ... 26 more

Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException: org.apache.hadoop.hbase.DoNotRetryIOException: SCHEMA2.N000030: java.lang.OutOfMemoryError: unable to create new native thread
        at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:114)
        at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:656)
        at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:17038)
        at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:8016)
        at org.apache.hadoop.hbase.regionserver.RSRpcServices.execServiceOnRegion(RSRpcServices.java:2409)
        at org.apache.hadoop.hbase.regionserver.RSRpcServices.execService(RSRpcServices.java:2391)
        at org.apache.hadoop.hbase.shaded.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:42010)
        at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:409)
        at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:130)
        at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:324)
        at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:304)
Caused by: java.lang.RuntimeException: java.lang.OutOfMemoryError: unable to create new native thread
        at org.apache.hadoop.hbase.client.RpcRetryingCallerImpl.callWithoutRetries(RpcRetryingCallerImpl.java:200)
        at org.apache.hadoop.hbase.client.ClientScanner.call(ClientScanner.java:269)
        at org.apache.hadoop.hbase.client.ClientScanner.loadCache(ClientScanner.java:437)
        at org.apache.hadoop.hbase.client.ClientScanner.nextWithSyncCache(ClientScanner.java:312)
        at org.apache.hadoop.hbase.client.ClientScanner.next(ClientScanner.java:597)
        at org.apache.phoenix.coprocessor.ViewFinder.findRelatedViews(ViewFinder.java:94)
        at org.apache.phoenix.coprocessor.ViewFinder.findAllRelatives(ViewFinder.java:65)
        at org.apache.phoenix.coprocessor.ViewFinder.findAllRelatives(ViewFinder.java:59)
        at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.findAncestorViews(MetaDataEndpointImpl.java:2565)
        at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.findAncestorViewsOfIndex(MetaDataEndpointImpl.java:2553)
        at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.addDerivedColumnsFromAncestors(MetaDataEndpointImpl.java:744)
        at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.combineColumns(MetaDataEndpointImpl.java:680)
        at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTableFromCache(MetaDataEndpointImpl.java:1928)
        at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:3714)
        at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.combineColumns(MetaDataEndpointImpl.java:695)
        at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTableFromCache(MetaDataEndpointImpl.java:1928)
        at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:3714)
        at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:611)
        ... 9 more
Caused by: java.lang.OutOfMemoryError: unable to create new native thread
        at java.lang.Thread.start0(Native Method)
        at java.lang.Thread.start(Thread.java:717)
        at java.util.concurrent.ThreadPoolExecutor.addWorker(ThreadPoolExecutor.java:957)
        at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1367)
        at org.apache.hadoop.hbase.client.ResultBoundedCompletionService.submit(ResultBoundedCompletionService.java:171)
        at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.addCallsForCurrentReplica(ScannerCallableWithReplicas.java:320)
        at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:182)
        at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:58)
        at org.apache.hadoop.hbase.client.RpcRetryingCallerImpl.callWithoutRetries(RpcRetryingCallerImpl.java:192)
        ... 26 more


[WARNING] Tests run: 14, Failures: 0, Errors: 0, Skipped: 2, Time elapsed: 149.784 s - in org.apache.phoenix.end2end.index.ViewIndexIT
[ERROR] Tests run: 160, Failures: 0, Errors: 6, Skipped: 0, Time elapsed: 1,038.522 s <<< FAILURE! - in org.apache.phoenix.end2end.ViewIT
[ERROR] testReadOnlyOnUpdatableView[ViewIT_transactionProvider=OMID, columnEncoded=false](org.apache.phoenix.end2end.ViewIT)  Time elapsed: 3.593 s  <<< ERROR!
org.apache.phoenix.execute.CommitException: java.lang.NullPointerException
        at org.apache.phoenix.end2end.ViewIT.testUpdatableView(ViewIT.java:1210)
        at org.apache.phoenix.end2end.ViewIT.testReadOnlyOnUpdatableView(ViewIT.java:221)
Caused by: java.lang.NullPointerException
        at org.apache.phoenix.end2end.ViewIT.testUpdatableView(ViewIT.java:1210)
        at org.apache.phoenix.end2end.ViewIT.testReadOnlyOnUpdatableView(ViewIT.java:221)

[ERROR] testViewWithCurrentDate[ViewIT_transactionProvider=OMID, columnEncoded=false](org.apache.phoenix.end2end.ViewIT)  Time elapsed: 2.561 s  <<< ERROR!
org.apache.phoenix.execute.CommitException: java.lang.NullPointerException
        at org.apache.phoenix.end2end.ViewIT.testViewWithCurrentDate(ViewIT.java:375)
Caused by: java.lang.NullPointerException
        at org.apache.phoenix.end2end.ViewIT.testViewWithCurrentDate(ViewIT.java:375)

[ERROR] testReadOnlyViewWithCaseSensitiveTableNames[ViewIT_transactionProvider=OMID, columnEncoded=false](org.apache.phoenix.end2end.ViewIT)  Time elapsed: 2.601 s  <<< ERROR!
org.apache.phoenix.execute.CommitException: java.lang.NullPointerException
        at org.apache.phoenix.end2end.ViewIT.testReadOnlyViewWithCaseSensitiveTableNames(ViewIT.java:306)
Caused by: java.lang.NullPointerException
        at org.apache.phoenix.end2end.ViewIT.testReadOnlyViewWithCaseSensitiveTableNames(ViewIT.java:306)

[ERROR] testViewUsesTableGlobalIndex[ViewIT_transactionProvider=OMID, columnEncoded=false](org.apache.phoenix.end2end.ViewIT)  Time elapsed: 9.286 s  <<< ERROR!
org.apache.phoenix.execute.CommitException: java.lang.NullPointerException
        at org.apache.phoenix.end2end.ViewIT.testViewUsesTableIndex(ViewIT.java:703)
        at org.apache.phoenix.end2end.ViewIT.testViewUsesTableGlobalIndex(ViewIT.java:669)
Caused by: java.lang.NullPointerException
        at org.apache.phoenix.end2end.ViewIT.testViewUsesTableIndex(ViewIT.java:703)
        at org.apache.phoenix.end2end.ViewIT.testViewUsesTableGlobalIndex(ViewIT.java:669)

[ERROR] testReadOnlyViewWithCaseSensitiveColumnNames[ViewIT_transactionProvider=OMID, columnEncoded=false](org.apache.phoenix.end2end.ViewIT)  Time elapsed: 2.481 s  <<< ERROR!
org.apache.phoenix.execute.CommitException: java.lang.NullPointerException
        at org.apache.phoenix.end2end.ViewIT.testReadOnlyViewWithCaseSensitiveColumnNames(ViewIT.java:344)
Caused by: java.lang.NullPointerException
        at org.apache.phoenix.end2end.ViewIT.testReadOnlyViewWithCaseSensitiveColumnNames(ViewIT.java:344)

[ERROR] testCreateViewDefinesPKColumn[ViewIT_transactionProvider=OMID, columnEncoded=false](org.apache.phoenix.end2end.ViewIT)  Time elapsed: 2.46 s  <<< ERROR!
org.apache.phoenix.execute.CommitException: java.lang.NullPointerException
        at org.apache.phoenix.end2end.ViewIT.testCreateViewDefinesPKColumn(ViewIT.java:753)
Caused by: java.lang.NullPointerException
        at org.apache.phoenix.end2end.ViewIT.testCreateViewDefinesPKColumn(ViewIT.java:753)

[INFO] Tests run: 112, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1,374.718 s - in org.apache.phoenix.end2end.AlterTableWithViewsIT
[INFO] 
[INFO] Results:
[INFO] 
[ERROR] Errors: 
[ERROR]   TenantSpecificViewIndexIT.testMultiCFViewIndex:80->testMultiCFViewIndex:127->createViewAndIndexesWithTenantId:195 » PhoenixIO
[ERROR]   ViewIT.testCreateViewDefinesPKColumn:753 » Commit java.lang.NullPointerExcepti...
[ERROR]   ViewIT.testReadOnlyOnUpdatableView:221->testUpdatableView:1210 » Commit java.l...
[ERROR]   ViewIT.testReadOnlyViewWithCaseSensitiveColumnNames:344 » Commit java.lang.Nul...
[ERROR]   ViewIT.testReadOnlyViewWithCaseSensitiveTableNames:306 » Commit java.lang.Null...
[ERROR]   ViewIT.testViewUsesTableGlobalIndex:669->testViewUsesTableIndex:703 » Commit j...
[ERROR]   ViewIT.testViewWithCurrentDate:375 » Commit java.lang.NullPointerException
[INFO] 
[ERROR] Tests run: 313, Failures: 0, Errors: 7, Skipped: 2
[INFO] 
[INFO] 
[INFO] --- maven-failsafe-plugin:2.20:verify (ParallelStatsEnabledTest) @ phoenix-core ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary for Apache Phoenix 5.1.0-HBase-2.0-SNAPSHOT:
[INFO] 
[INFO] Apache Phoenix ..................................... SUCCESS [  3.111 s]
[INFO] Phoenix Core ....................................... FAILURE [  03:16 h]
[INFO] Phoenix - Flume .................................... SKIPPED
[INFO] Phoenix - Kafka .................................... SKIPPED
[INFO] Phoenix - Pig ...................................... SKIPPED
[INFO] Phoenix Query Server Client ........................ SKIPPED
[INFO] Phoenix Query Server ............................... SKIPPED
[INFO] Phoenix - Pherf .................................... SKIPPED
[INFO] Phoenix - Spark .................................... SKIPPED
[INFO] Phoenix - Hive ..................................... SKIPPED
[INFO] Phoenix Client ..................................... SKIPPED
[INFO] Phoenix Server ..................................... SKIPPED
[INFO] Phoenix Load Balancer .............................. SKIPPED
[INFO] Phoenix Assembly ................................... SKIPPED
[INFO] Phoenix - Tracing Web Application .................. SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  03:16 h
[INFO] Finished at: 2019-01-10T09:36:33Z
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-failsafe-plugin:2.20:verify (ParallelStatsEnabledTest) on project phoenix-core: There are test failures.
[ERROR] 
[ERROR] Please refer to <https://builds.apache.org/job/Phoenix-master/ws/phoenix-core/target/failsafe-reports> for the individual test results.
[ERROR] Please refer to dump files (if any exist) [date]-jvmRun[N].dump, [date].dumpstream and [date]-jvmRun[N].dumpstream.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :phoenix-core
Build step 'Invoke top-level Maven targets' marked build as failure
Archiving artifacts
Recording test results
Not sending mail to unregistered user rchintaguntla@HW15978.local
Not sending mail to unregistered user ankitsingha...@gmail.com