See <https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/564/display/redirect?page=changes>

Changes:

[jtaylor] PHOENIX-4162 Disallow transition from DISABLE to INACTIVE when

------------------------------------------
[...truncated 307.90 KB...]
[INFO] --- maven-surefire-plugin:2.20:test (default-test) @ phoenix-spark ---
[INFO] 
[INFO] --- scalatest-maven-plugin:1.0:test (test) @ phoenix-spark ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- maven-source-plugin:2.2.1:jar-no-fork (attach-sources) @ phoenix-spark ---
[INFO] Building jar: <https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/564/artifact/phoenix-spark/target/phoenix-spark-4.12.0-HBase-1.1-SNAPSHOT-sources.jar>
[INFO] 
[INFO] --- maven-jar-plugin:2.4:test-jar (default) @ phoenix-spark ---
[INFO] Building jar: <https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/564/artifact/phoenix-spark/target/phoenix-spark-4.12.0-HBase-1.1-SNAPSHOT-tests.jar>
[INFO] 
[INFO] --- maven-jar-plugin:2.4:jar (default-jar) @ phoenix-spark ---
[INFO] Building jar: <https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/564/artifact/phoenix-spark/target/phoenix-spark-4.12.0-HBase-1.1-SNAPSHOT.jar>
[INFO] 
[INFO] --- maven-jar-plugin:2.4:jar (empty-javadoc-jar) @ phoenix-spark ---
[WARNING] JAR will be empty - no content was marked for inclusion!
[INFO] Building jar: <https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/564/artifact/phoenix-spark/target/phoenix-spark-4.12.0-HBase-1.1-SNAPSHOT-javadoc.jar>
[INFO] 
[INFO] --- maven-site-plugin:3.2:attach-descriptor (attach-descriptor) @ phoenix-spark ---
[INFO] 
[INFO] --- scalatest-maven-plugin:1.0:test (integration-test) @ phoenix-spark ---
WARNING: -c has been deprecated and will be reused for a different (but still very cool) purpose in ScalaTest 2.0. Please change all uses of -c to -P.
Discovery starting.
Discovery completed in 725 milliseconds.
Run starting. Expected test count is: 35
PhoenixSparkIT:
AbstractPhoenixSparkIT:
PhoenixSparkITTenantSpecific:
Formatting using clusterid: testClusterID
Formatting using clusterid: testClusterID
2    [ScalaTest-3] ERROR org.apache.hadoop.hdfs.MiniDFSCluster  - IOE creating namenodes. Permissions dump:
path '<https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/ws/phoenix-spark/target/test-data/69683e42-6703-44be-9d85-ab9e4b673b5f/dfscluster_89a91e0b-5d41-4fb0-9ad0-0a6ed1de255b/dfs/data'>:
        absolute:<https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/ws/phoenix-spark/target/test-data/69683e42-6703-44be-9d85-ab9e4b673b5f/dfscluster_89a91e0b-5d41-4fb0-9ad0-0a6ed1de255b/dfs/data>
        permissions: ----
path '<https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/ws/phoenix-spark/target/test-data/69683e42-6703-44be-9d85-ab9e4b673b5f/dfscluster_89a91e0b-5d41-4fb0-9ad0-0a6ed1de255b/dfs'>:
        absolute:<https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/ws/phoenix-spark/target/test-data/69683e42-6703-44be-9d85-ab9e4b673b5f/dfscluster_89a91e0b-5d41-4fb0-9ad0-0a6ed1de255b/dfs>
        permissions: drwx
path '<https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/ws/phoenix-spark/target/test-data/69683e42-6703-44be-9d85-ab9e4b673b5f/dfscluster_89a91e0b-5d41-4fb0-9ad0-0a6ed1de255b'>:
        absolute:<https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/ws/phoenix-spark/target/test-data/69683e42-6703-44be-9d85-ab9e4b673b5f/dfscluster_89a91e0b-5d41-4fb0-9ad0-0a6ed1de255b>
        permissions: drwx
path '<https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/ws/phoenix-spark/target/test-data/69683e42-6703-44be-9d85-ab9e4b673b5f'>:
        absolute:<https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/ws/phoenix-spark/target/test-data/69683e42-6703-44be-9d85-ab9e4b673b5f>
        permissions: drwx
path '<https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/ws/phoenix-spark/target/test-data'>:
        absolute:<https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/ws/phoenix-spark/target/test-data>
        permissions: drwx
path '<https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/ws/phoenix-spark/target'>:
        absolute:<https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/ws/phoenix-spark/target>
        permissions: drwx
path '<https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/ws/phoenix-spark'>:
        absolute:<https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/ws/phoenix-spark>
        permissions: drwx
path '<https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/ws/'>: 
        absolute:<https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/ws/>
        permissions: drwx
path '/home/jenkins/jenkins-slave/workspace': 
        absolute:/home/jenkins/jenkins-slave/workspace
        permissions: drwx
path '/home/jenkins/jenkins-slave': 
        absolute:/home/jenkins/jenkins-slave
        permissions: drwx
path '/home/jenkins': 
        absolute:/home/jenkins
        permissions: drwx
path '/home': 
        absolute:/home
        permissions: dr-x
path '/': 
        absolute:/
        permissions: dr-x

java.io.IOException: Cannot create directory <https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/ws/phoenix-spark/target/test-data/69683e42-6703-44be-9d85-ab9e4b673b5f/dfscluster_89a91e0b-5d41-4fb0-9ad0-0a6ed1de255b/dfs/name1/current>
        at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.clearDirectory(Storage.java:337)
        at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:548)
        at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:569)
        at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:161)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:991)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:342)
        at org.apache.hadoop.hdfs.DFSTestUtil.formatNameNode(DFSTestUtil.java:176)
        at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNodesAndSetConf(MiniDFSCluster.java:973)
        at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:811)
        at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:742)
        at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniDFSCluster(HBaseTestingUtility.java:574)
        at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniCluster(HBaseTestingUtility.java:968)
        at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniCluster(HBaseTestingUtility.java:849)
        at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniCluster(HBaseTestingUtility.java:831)
        at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniCluster(HBaseTestingUtility.java:818)
        at org.apache.phoenix.query.BaseTest.initMiniCluster(BaseTest.java:558)
        at org.apache.phoenix.query.BaseTest.setUpTestCluster(BaseTest.java:458)
        at org.apache.phoenix.query.BaseTest.checkClusterInitialized(BaseTest.java:440)
        at org.apache.phoenix.query.BaseTest.setUpTestDriver(BaseTest.java:532)
        at org.apache.phoenix.query.BaseTest.setUpTestDriver(BaseTest.java:526)
        at org.apache.phoenix.end2end.BaseHBaseManagedTimeIT.doSetup(BaseHBaseManagedTimeIT.java:57)
        at org.apache.phoenix.spark.PhoenixSparkITHelper$.doSetup(AbstractPhoenixSparkIT.scala:33)
        at org.apache.phoenix.spark.AbstractPhoenixSparkIT.beforeAll(AbstractPhoenixSparkIT.scala:88)
        at org.scalatest.BeforeAndAfterAll$class.beforeAll(BeforeAndAfterAll.scala:187)
        at org.apache.phoenix.spark.AbstractPhoenixSparkIT.beforeAll(AbstractPhoenixSparkIT.scala:44)
        at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:253)
        at org.apache.phoenix.spark.AbstractPhoenixSparkIT.run(AbstractPhoenixSparkIT.scala:44)
        at org.scalatest.tools.SuiteRunner.run(SuiteRunner.scala:55)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
Exception encountered when invoking run on a nested suite - java.io.IOException: Cannot create directory <https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/ws/phoenix-spark/target/test-data/69683e42-6703-44be-9d85-ab9e4b673b5f/dfscluster_89a91e0b-5d41-4fb0-9ad0-0a6ed1de255b/dfs/name1/current> *** ABORTED ***
  java.lang.RuntimeException: java.io.IOException: Cannot create directory <https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/ws/phoenix-spark/target/test-data/69683e42-6703-44be-9d85-ab9e4b673b5f/dfscluster_89a91e0b-5d41-4fb0-9ad0-0a6ed1de255b/dfs/name1/current>
  at org.apache.phoenix.query.BaseTest.initMiniCluster(BaseTest.java:561)
  at org.apache.phoenix.query.BaseTest.setUpTestCluster(BaseTest.java:458)
  at org.apache.phoenix.query.BaseTest.checkClusterInitialized(BaseTest.java:440)
  at org.apache.phoenix.query.BaseTest.setUpTestDriver(BaseTest.java:532)
  at org.apache.phoenix.query.BaseTest.setUpTestDriver(BaseTest.java:526)
  at org.apache.phoenix.end2end.BaseHBaseManagedTimeIT.doSetup(BaseHBaseManagedTimeIT.java:57)
  at org.apache.phoenix.spark.PhoenixSparkITHelper$.doSetup(AbstractPhoenixSparkIT.scala:33)
  at org.apache.phoenix.spark.AbstractPhoenixSparkIT.beforeAll(AbstractPhoenixSparkIT.scala:88)
  at org.scalatest.BeforeAndAfterAll$class.beforeAll(BeforeAndAfterAll.scala:187)
  at org.apache.phoenix.spark.AbstractPhoenixSparkIT.beforeAll(AbstractPhoenixSparkIT.scala:44)
  ...
  Cause: java.io.IOException: Cannot create directory <https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/ws/phoenix-spark/target/test-data/69683e42-6703-44be-9d85-ab9e4b673b5f/dfscluster_89a91e0b-5d41-4fb0-9ad0-0a6ed1de255b/dfs/name1/current>
  at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.clearDirectory(Storage.java:337)
  at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:548)
  at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:569)
  at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:161)
  at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:991)
  at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:342)
  at org.apache.hadoop.hdfs.DFSTestUtil.formatNameNode(DFSTestUtil.java:176)
  at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNodesAndSetConf(MiniDFSCluster.java:973)
  at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:811)
  at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:742)
  ...
10983 [RpcServer.reader=1,bindAddress=qnode3.quenda.co,port=41577] INFO  SecurityLogger.org.apache.hadoop.hbase.Server  - Connection from 127.0.1.1 port: 42687 with version info: version: "1.1.9" url: "git://diocles.local/Volumes/hbase-1.1.9/hbase" revision: "0d1feabed5295495ed2257d31fab9e6553e8a9d7" user: "ndimiduk" date: "Mon Feb 20 22:35:28 PST 2017" src_checksum: "b68339108ddccd1dfc44a76646588a58"
11960 [RpcServer.reader=1,bindAddress=qnode3.quenda.co,port=45045] INFO  SecurityLogger.org.apache.hadoop.hbase.Server  - Connection from 127.0.0.1 port: 37372 with version info: version: "1.1.9" url: "git://diocles.local/Volumes/hbase-1.1.9/hbase" revision: "0d1feabed5295495ed2257d31fab9e6553e8a9d7" user: "ndimiduk" date: "Mon Feb 20 22:35:28 PST 2017" src_checksum: "b68339108ddccd1dfc44a76646588a58"
12601 [RpcServer.reader=2,bindAddress=qnode3.quenda.co,port=45045] INFO  SecurityLogger.org.apache.hadoop.hbase.Server  - Connection from 127.0.0.1 port: 37376 with version info: version: "1.1.9" url: "git://diocles.local/Volumes/hbase-1.1.9/hbase" revision: "0d1feabed5295495ed2257d31fab9e6553e8a9d7" user: "ndimiduk" date: "Mon Feb 20 22:35:28 PST 2017" src_checksum: "b68339108ddccd1dfc44a76646588a58"
14098 [RpcServer.reader=3,bindAddress=qnode3.quenda.co,port=45045] INFO  SecurityLogger.org.apache.hadoop.hbase.Server  - Connection from 127.0.0.1 port: 37388 with version info: version: "1.1.9" url: "git://diocles.local/Volumes/hbase-1.1.9/hbase" revision: "0d1feabed5295495ed2257d31fab9e6553e8a9d7" user: "ndimiduk" date: "Mon Feb 20 22:35:28 PST 2017" src_checksum: "b68339108ddccd1dfc44a76646588a58"
15001 [RpcServer.reader=4,bindAddress=qnode3.quenda.co,port=45045] INFO  SecurityLogger.org.apache.hadoop.hbase.Server  - Connection from 127.0.0.1 port: 37400 with version info: version: "1.1.9" url: "git://diocles.local/Volumes/hbase-1.1.9/hbase" revision: "0d1feabed5295495ed2257d31fab9e6553e8a9d7" user: "ndimiduk" date: "Mon Feb 20 22:35:28 PST 2017" src_checksum: "b68339108ddccd1dfc44a76646588a58"
15045 [RpcServer.reader=2,bindAddress=qnode3.quenda.co,port=41577] INFO  SecurityLogger.org.apache.hadoop.hbase.Server  - Connection from 127.0.0.1 port: 35474 with version info: version: "1.1.9" url: "git://diocles.local/Volumes/hbase-1.1.9/hbase" revision: "0d1feabed5295495ed2257d31fab9e6553e8a9d7" user: "ndimiduk" date: "Mon Feb 20 22:35:28 PST 2017" src_checksum: "b68339108ddccd1dfc44a76646588a58"
67190 [ScalaTest-4] INFO  org.spark_project.jetty.util.log  - Logging initialized @73926ms
67417 [ScalaTest-4] INFO  org.spark_project.jetty.server.Server  - jetty-9.2.z-SNAPSHOT
67460 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Started o.s.j.s.ServletContextHandler@65440139{/jobs,null,AVAILABLE}
67461 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Started o.s.j.s.ServletContextHandler@53ac791f{/jobs/json,null,AVAILABLE}
67462 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Started o.s.j.s.ServletContextHandler@60440d23{/jobs/job,null,AVAILABLE}
67463 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Started o.s.j.s.ServletContextHandler@13baa635{/jobs/job/json,null,AVAILABLE}
67464 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Started o.s.j.s.ServletContextHandler@42f5ebb1{/stages,null,AVAILABLE}
67465 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Started o.s.j.s.ServletContextHandler@1f44d3cf{/stages/json,null,AVAILABLE}
67466 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Started o.s.j.s.ServletContextHandler@5fadcb12{/stages/stage,null,AVAILABLE}
67467 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Started o.s.j.s.ServletContextHandler@aef3bd1{/stages/stage/json,null,AVAILABLE}
67468 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Started o.s.j.s.ServletContextHandler@60401bf2{/stages/pool,null,AVAILABLE}
67469 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Started o.s.j.s.ServletContextHandler@10eade3c{/stages/pool/json,null,AVAILABLE}
67469 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Started o.s.j.s.ServletContextHandler@2ba43dd1{/storage,null,AVAILABLE}
67470 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Started o.s.j.s.ServletContextHandler@2bd67ab9{/storage/json,null,AVAILABLE}
67471 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Started o.s.j.s.ServletContextHandler@e131e2b{/storage/rdd,null,AVAILABLE}
67472 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Started o.s.j.s.ServletContextHandler@921a045{/storage/rdd/json,null,AVAILABLE}
67472 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Started o.s.j.s.ServletContextHandler@7ea116b1{/environment,null,AVAILABLE}
67473 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Started o.s.j.s.ServletContextHandler@1d8ff6a{/environment/json,null,AVAILABLE}
67474 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Started o.s.j.s.ServletContextHandler@4d688918{/executors,null,AVAILABLE}
67474 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Started o.s.j.s.ServletContextHandler@a00ae5c{/executors/json,null,AVAILABLE}
67475 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Started o.s.j.s.ServletContextHandler@32b71f25{/executors/threadDump,null,AVAILABLE}
67476 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Started o.s.j.s.ServletContextHandler@145dd026{/executors/threadDump/json,null,AVAILABLE}
67498 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Started o.s.j.s.ServletContextHandler@1b0c793c{/static,null,AVAILABLE}
67505 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Started o.s.j.s.ServletContextHandler@4feb6003{/,null,AVAILABLE}
67508 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Started o.s.j.s.ServletContextHandler@65ec8df2{/api,null,AVAILABLE}
67510 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Started o.s.j.s.ServletContextHandler@d5b3f25{/stages/stage/kill,null,AVAILABLE}
67523 [ScalaTest-4] INFO  org.spark_project.jetty.server.ServerConnector  - Started ServerConnector@18fa2b54{HTTP/1.1}{0.0.0.0:4040}
67536 [ScalaTest-4] INFO  org.spark_project.jetty.server.Server  - Started @74274ms
68348 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Started o.s.j.s.ServletContextHandler@440e5527{/metrics/json,null,AVAILABLE}
70199 [RpcServer.reader=5,bindAddress=qnode3.quenda.co,port=45045] INFO  SecurityLogger.org.apache.hadoop.hbase.Server  - Connection from 127.0.0.1 port: 37596 with version info: version: "1.1.9" url: "git://diocles.local/Volumes/hbase-1.1.9/hbase" revision: "0d1feabed5295495ed2257d31fab9e6553e8a9d7" user: "ndimiduk" date: "Mon Feb 20 22:35:28 PST 2017" src_checksum: "b68339108ddccd1dfc44a76646588a58"
70257 [RpcServer.reader=3,bindAddress=qnode3.quenda.co,port=41577] INFO  SecurityLogger.org.apache.hadoop.hbase.Server  - Connection from 127.0.0.1 port: 35670 with version info: version: "1.1.9" url: "git://diocles.local/Volumes/hbase-1.1.9/hbase" revision: "0d1feabed5295495ed2257d31fab9e6553e8a9d7" user: "ndimiduk" date: "Mon Feb 20 22:35:28 PST 2017" src_checksum: "b68339108ddccd1dfc44a76646588a58"
75813 [ScalaTest-4-running-PhoenixSparkITTenantSpecific] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Started o.s.j.s.ServletContextHandler@797a25a0{/SQL,null,AVAILABLE}
75815 [ScalaTest-4-running-PhoenixSparkITTenantSpecific] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Started o.s.j.s.ServletContextHandler@c3e407b{/SQL/json,null,AVAILABLE}
75817 [ScalaTest-4-running-PhoenixSparkITTenantSpecific] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Started o.s.j.s.ServletContextHandler@41c57f19{/SQL/execution,null,AVAILABLE}
75818 [ScalaTest-4-running-PhoenixSparkITTenantSpecific] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Started o.s.j.s.ServletContextHandler@14514e6{/SQL/execution/json,null,AVAILABLE}
75822 [ScalaTest-4-running-PhoenixSparkITTenantSpecific] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Started o.s.j.s.ServletContextHandler@306e8128{/static/sql,null,AVAILABLE}
79140 [RpcServer.reader=6,bindAddress=qnode3.quenda.co,port=45045] INFO  SecurityLogger.org.apache.hadoop.hbase.Server  - Connection from 127.0.0.1 port: 37608 with version info: version: "1.1.9" url: "git://diocles.local/Volumes/hbase-1.1.9/hbase" revision: "0d1feabed5295495ed2257d31fab9e6553e8a9d7" user: "ndimiduk" date: "Mon Feb 20 22:35:28 PST 2017" src_checksum: "b68339108ddccd1dfc44a76646588a58"
79182 [RpcServer.reader=4,bindAddress=qnode3.quenda.co,port=41577] INFO  SecurityLogger.org.apache.hadoop.hbase.Server  - Connection from 127.0.0.1 port: 35682 with version info: version: "1.1.9" url: "git://diocles.local/Volumes/hbase-1.1.9/hbase" revision: "0d1feabed5295495ed2257d31fab9e6553e8a9d7" user: "ndimiduk" date: "Mon Feb 20 22:35:28 PST 2017" src_checksum: "b68339108ddccd1dfc44a76646588a58"
- Can read from tenant-specific table as DataFrame
80869 [RpcServer.reader=7,bindAddress=qnode3.quenda.co,port=45045] INFO  SecurityLogger.org.apache.hadoop.hbase.Server  - Connection from 127.0.0.1 port: 37616 with version info: version: "1.1.9" url: "git://diocles.local/Volumes/hbase-1.1.9/hbase" revision: "0d1feabed5295495ed2257d31fab9e6553e8a9d7" user: "ndimiduk" date: "Mon Feb 20 22:35:28 PST 2017" src_checksum: "b68339108ddccd1dfc44a76646588a58"
80897 [RpcServer.reader=5,bindAddress=qnode3.quenda.co,port=41577] INFO  SecurityLogger.org.apache.hadoop.hbase.Server  - Connection from 127.0.0.1 port: 35690 with version info: version: "1.1.9" url: "git://diocles.local/Volumes/hbase-1.1.9/hbase" revision: "0d1feabed5295495ed2257d31fab9e6553e8a9d7" user: "ndimiduk" date: "Mon Feb 20 22:35:28 PST 2017" src_checksum: "b68339108ddccd1dfc44a76646588a58"
- Can read from tenant-specific table as RDD
- Can write a DataFrame using 'DataFrame.saveToPhoenix' to tenant-specific view
- Can write a DataFrame using 'DataFrame.write' to tenant-specific view
- Can write an RDD to Phoenix tenant-specific view
85249 [ScalaTest-4] INFO  org.spark_project.jetty.server.ServerConnector  - Stopped ServerConnector@18fa2b54{HTTP/1.1}{0.0.0.0:4040}
85253 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Stopped o.s.j.s.ServletContextHandler@d5b3f25{/stages/stage/kill,null,UNAVAILABLE}
85254 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Stopped o.s.j.s.ServletContextHandler@65ec8df2{/api,null,UNAVAILABLE}
85255 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Stopped o.s.j.s.ServletContextHandler@4feb6003{/,null,UNAVAILABLE}
85255 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Stopped o.s.j.s.ServletContextHandler@1b0c793c{/static,null,UNAVAILABLE}
85256 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Stopped o.s.j.s.ServletContextHandler@145dd026{/executors/threadDump/json,null,UNAVAILABLE}
85256 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Stopped o.s.j.s.ServletContextHandler@32b71f25{/executors/threadDump,null,UNAVAILABLE}
85257 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Stopped o.s.j.s.ServletContextHandler@a00ae5c{/executors/json,null,UNAVAILABLE}
85257 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Stopped o.s.j.s.ServletContextHandler@4d688918{/executors,null,UNAVAILABLE}
85258 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Stopped o.s.j.s.ServletContextHandler@1d8ff6a{/environment/json,null,UNAVAILABLE}
85259 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Stopped o.s.j.s.ServletContextHandler@7ea116b1{/environment,null,UNAVAILABLE}
85259 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Stopped o.s.j.s.ServletContextHandler@921a045{/storage/rdd/json,null,UNAVAILABLE}
85260 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Stopped o.s.j.s.ServletContextHandler@e131e2b{/storage/rdd,null,UNAVAILABLE}
85260 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Stopped o.s.j.s.ServletContextHandler@2bd67ab9{/storage/json,null,UNAVAILABLE}
85261 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Stopped o.s.j.s.ServletContextHandler@2ba43dd1{/storage,null,UNAVAILABLE}
85261 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Stopped o.s.j.s.ServletContextHandler@10eade3c{/stages/pool/json,null,UNAVAILABLE}
85261 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Stopped o.s.j.s.ServletContextHandler@60401bf2{/stages/pool,null,UNAVAILABLE}
85262 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Stopped o.s.j.s.ServletContextHandler@aef3bd1{/stages/stage/json,null,UNAVAILABLE}
85262 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Stopped o.s.j.s.ServletContextHandler@5fadcb12{/stages/stage,null,UNAVAILABLE}
85262 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Stopped o.s.j.s.ServletContextHandler@1f44d3cf{/stages/json,null,UNAVAILABLE}
85263 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Stopped o.s.j.s.ServletContextHandler@42f5ebb1{/stages,null,UNAVAILABLE}
85263 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Stopped o.s.j.s.ServletContextHandler@13baa635{/jobs/job/json,null,UNAVAILABLE}
85263 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Stopped o.s.j.s.ServletContextHandler@60440d23{/jobs/job,null,UNAVAILABLE}
85264 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Stopped o.s.j.s.ServletContextHandler@53ac791f{/jobs/json,null,UNAVAILABLE}
85264 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Stopped o.s.j.s.ServletContextHandler@65440139{/jobs,null,UNAVAILABLE}
Run completed in 2 minutes, 50 seconds.
Total number of tests run: 5
Suites: completed 3, aborted 1
Tests: succeeded 5, failed 0, canceled 0, ignored 0, pending 0
*** 1 SUITE ABORTED ***
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Phoenix ..................................... SUCCESS [  4.307 s]
[INFO] Phoenix Core ....................................... SUCCESS [  01:45 h]
[INFO] Phoenix - Flume .................................... SUCCESS [01:54 min]
[INFO] Phoenix - Kafka .................................... SUCCESS [02:27 min]
[INFO] Phoenix - Pig ...................................... SUCCESS [05:49 min]
[INFO] Phoenix Query Server Client ........................ SUCCESS [ 29.307 s]
[INFO] Phoenix Query Server ............................... SUCCESS [02:35 min]
[INFO] Phoenix - Pherf .................................... SUCCESS [03:04 min]
[INFO] Phoenix - Spark .................................... FAILURE [04:06 min]
[INFO] Phoenix - Hive ..................................... SKIPPED
[INFO] Phoenix Client ..................................... SKIPPED
[INFO] Phoenix Server ..................................... SKIPPED
[INFO] Phoenix Assembly ................................... SKIPPED
[INFO] Phoenix - Tracing Web Application .................. SKIPPED
[INFO] Phoenix Load Balancer .............................. SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:05 h
[INFO] Finished at: 2017-09-06T09:57:47Z
[INFO] Final Memory: 121M/1274M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.scalatest:scalatest-maven-plugin:1.0:test (integration-test) on project phoenix-spark: There are test failures -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :phoenix-spark
Build step 'Invoke top-level Maven targets' marked build as failure
Archiving artifacts
Recording test results
