See 
<https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/668/display/redirect?page=changes>

Changes:

[tdsilva] PHOENIX-4625 memory leak in PhoenixConnection if scanner renew lease

------------------------------------------
[...truncated 367.24 KB...]
  Cause: java.io.IOException: Failed to create local dir in 
/tmp/blockmgr-8219fff8-aa0d-4d96-96dd-7dcf985833c2/00.
  at 
org.apache.spark.storage.DiskBlockManager.getFile(DiskBlockManager.scala:70)
  at org.apache.spark.storage.DiskStore.remove(DiskStore.scala:111)
  at 
org.apache.spark.storage.BlockManager.removeBlockInternal(BlockManager.scala:1340)
  at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:888)
  at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:926)
  at org.apache.spark.storage.BlockManager.putIterator(BlockManager.scala:702)
  at org.apache.spark.storage.BlockManager.putSingle(BlockManager.scala:1234)
  at 
org.apache.spark.broadcast.TorrentBroadcast.writeBlocks(TorrentBroadcast.scala:103)
  at 
org.apache.spark.broadcast.TorrentBroadcast.<init>(TorrentBroadcast.scala:86)
  at 
org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:34)
  ...
- Can save binary types back to phoenix *** FAILED ***
  org.apache.spark.SparkException: Job aborted due to stage failure: Task 
serialization failed: java.io.IOException: Failed to create local dir in 
/tmp/blockmgr-8219fff8-aa0d-4d96-96dd-7dcf985833c2/01.
java.io.IOException: Failed to create local dir in 
/tmp/blockmgr-8219fff8-aa0d-4d96-96dd-7dcf985833c2/01.
        at 
org.apache.spark.storage.DiskBlockManager.getFile(DiskBlockManager.scala:70)
        at org.apache.spark.storage.DiskStore.remove(DiskStore.scala:111)
        at 
org.apache.spark.storage.BlockManager.removeBlockInternal(BlockManager.scala:1340)
        at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:888)
        at 
org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:926)
        at 
org.apache.spark.storage.BlockManager.putIterator(BlockManager.scala:702)
        at 
org.apache.spark.storage.BlockManager.putSingle(BlockManager.scala:1234)
        at 
org.apache.spark.broadcast.TorrentBroadcast.writeBlocks(TorrentBroadcast.scala:103)
        at 
org.apache.spark.broadcast.TorrentBroadcast.<init>(TorrentBroadcast.scala:86)
        at 
org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:34)
        at 
org.apache.spark.broadcast.BroadcastManager.newBroadcast(BroadcastManager.scala:56)
        at org.apache.spark.SparkContext.broadcast(SparkContext.scala:1387)
        at 
org.apache.spark.scheduler.DAGScheduler.submitMissingTasks(DAGScheduler.scala:1012)
        at 
org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$submitStage(DAGScheduler.scala:933)
        at 
org.apache.spark.scheduler.DAGScheduler.handleJobSubmitted(DAGScheduler.scala:873)
        at 
org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1630)
        at 
org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1622)
        at 
org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1611)
        at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
  at 
org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1454)
  at 
org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1442)
  at 
org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1441)
  at 
scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
  at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
  at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1441)
  at 
org.apache.spark.scheduler.DAGScheduler.submitMissingTasks(DAGScheduler.scala:1022)
  at 
org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$submitStage(DAGScheduler.scala:933)
  at 
org.apache.spark.scheduler.DAGScheduler.handleJobSubmitted(DAGScheduler.scala:873)
  at 
org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1630)
  ...
  Cause: java.io.IOException: Failed to create local dir in 
/tmp/blockmgr-8219fff8-aa0d-4d96-96dd-7dcf985833c2/01.
  at 
org.apache.spark.storage.DiskBlockManager.getFile(DiskBlockManager.scala:70)
  at org.apache.spark.storage.DiskStore.remove(DiskStore.scala:111)
  at 
org.apache.spark.storage.BlockManager.removeBlockInternal(BlockManager.scala:1340)
  at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:888)
  at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:926)
  at org.apache.spark.storage.BlockManager.putIterator(BlockManager.scala:702)
  at org.apache.spark.storage.BlockManager.putSingle(BlockManager.scala:1234)
  at 
org.apache.spark.broadcast.TorrentBroadcast.writeBlocks(TorrentBroadcast.scala:103)
  at 
org.apache.spark.broadcast.TorrentBroadcast.<init>(TorrentBroadcast.scala:86)
  at 
org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:34)
  ...
- Can load Phoenix DATE columns through DataFrame API *** FAILED ***
  java.io.IOException: Failed to create local dir in 
/tmp/blockmgr-8219fff8-aa0d-4d96-96dd-7dcf985833c2/02.
  at 
org.apache.spark.storage.DiskBlockManager.getFile(DiskBlockManager.scala:70)
  at org.apache.spark.storage.DiskStore.remove(DiskStore.scala:111)
  at 
org.apache.spark.storage.BlockManager.removeBlockInternal(BlockManager.scala:1340)
  at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:888)
  at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:926)
  at org.apache.spark.storage.BlockManager.putIterator(BlockManager.scala:702)
  at org.apache.spark.storage.BlockManager.putSingle(BlockManager.scala:1234)
  at 
org.apache.spark.broadcast.TorrentBroadcast.writeBlocks(TorrentBroadcast.scala:103)
  at 
org.apache.spark.broadcast.TorrentBroadcast.<init>(TorrentBroadcast.scala:86)
  at 
org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:34)
  ...
- Can coerce Phoenix DATE columns to TIMESTAMP through DataFrame API *** FAILED 
***
  java.io.IOException: Failed to create local dir in 
/tmp/blockmgr-8219fff8-aa0d-4d96-96dd-7dcf985833c2/03.
  at 
org.apache.spark.storage.DiskBlockManager.getFile(DiskBlockManager.scala:70)
  at org.apache.spark.storage.DiskStore.remove(DiskStore.scala:111)
  at 
org.apache.spark.storage.BlockManager.removeBlockInternal(BlockManager.scala:1340)
  at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:888)
  at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:926)
  at org.apache.spark.storage.BlockManager.putIterator(BlockManager.scala:702)
  at org.apache.spark.storage.BlockManager.putSingle(BlockManager.scala:1234)
  at 
org.apache.spark.broadcast.TorrentBroadcast.writeBlocks(TorrentBroadcast.scala:103)
  at 
org.apache.spark.broadcast.TorrentBroadcast.<init>(TorrentBroadcast.scala:86)
  at 
org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:34)
  ...
- Can load Phoenix Time columns through DataFrame API *** FAILED ***
  java.io.IOException: Failed to create local dir in 
/tmp/blockmgr-8219fff8-aa0d-4d96-96dd-7dcf985833c2/04.
  at 
org.apache.spark.storage.DiskBlockManager.getFile(DiskBlockManager.scala:70)
  at org.apache.spark.storage.DiskStore.remove(DiskStore.scala:111)
  at 
org.apache.spark.storage.BlockManager.removeBlockInternal(BlockManager.scala:1340)
  at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:888)
  at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:926)
  at org.apache.spark.storage.BlockManager.putIterator(BlockManager.scala:702)
  at org.apache.spark.storage.BlockManager.putSingle(BlockManager.scala:1234)
  at 
org.apache.spark.broadcast.TorrentBroadcast.writeBlocks(TorrentBroadcast.scala:103)
  at 
org.apache.spark.broadcast.TorrentBroadcast.<init>(TorrentBroadcast.scala:86)
  at 
org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:34)
  ...
- can read all Phoenix data types *** FAILED ***
  java.io.IOException: Failed to create local dir in 
/tmp/blockmgr-8219fff8-aa0d-4d96-96dd-7dcf985833c2/05.
  at 
org.apache.spark.storage.DiskBlockManager.getFile(DiskBlockManager.scala:70)
  at org.apache.spark.storage.DiskStore.remove(DiskStore.scala:111)
  at 
org.apache.spark.storage.BlockManager.removeBlockInternal(BlockManager.scala:1340)
  at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:888)
  at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:926)
  at org.apache.spark.storage.BlockManager.putIterator(BlockManager.scala:702)
  at org.apache.spark.storage.BlockManager.putSingle(BlockManager.scala:1234)
  at 
org.apache.spark.broadcast.TorrentBroadcast.writeBlocks(TorrentBroadcast.scala:103)
  at 
org.apache.spark.broadcast.TorrentBroadcast.<init>(TorrentBroadcast.scala:86)
  at 
org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:34)
  ...
94475 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.ServerConnector  - Stopped 
ServerConnector@560bd1fa{HTTP/1.1}{0.0.0.0:4040}
94478 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Stopped 
o.s.j.s.ServletContextHandler@7a1c2c8e{/stages/stage/kill,null,UNAVAILABLE}
94478 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Stopped 
o.s.j.s.ServletContextHandler@71f0d4b2{/api,null,UNAVAILABLE}
94478 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Stopped 
o.s.j.s.ServletContextHandler@37c14656{/,null,UNAVAILABLE}
94478 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Stopped 
o.s.j.s.ServletContextHandler@3dd43860{/static,null,UNAVAILABLE}
94478 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Stopped 
o.s.j.s.ServletContextHandler@bf203c0{/executors/threadDump/json,null,UNAVAILABLE}
94478 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Stopped 
o.s.j.s.ServletContextHandler@4bd115db{/executors/threadDump,null,UNAVAILABLE}
94479 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Stopped 
o.s.j.s.ServletContextHandler@59d956ca{/executors/json,null,UNAVAILABLE}
94479 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Stopped 
o.s.j.s.ServletContextHandler@16f1c18a{/executors,null,UNAVAILABLE}
94479 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Stopped 
o.s.j.s.ServletContextHandler@cbe2317{/environment/json,null,UNAVAILABLE}
94479 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Stopped 
o.s.j.s.ServletContextHandler@683571bc{/environment,null,UNAVAILABLE}
94479 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Stopped 
o.s.j.s.ServletContextHandler@74a0ca1d{/storage/rdd/json,null,UNAVAILABLE}
94479 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Stopped 
o.s.j.s.ServletContextHandler@7ad7e784{/storage/rdd,null,UNAVAILABLE}
94479 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Stopped 
o.s.j.s.ServletContextHandler@36740491{/storage/json,null,UNAVAILABLE}
94479 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Stopped 
o.s.j.s.ServletContextHandler@40328bbc{/storage,null,UNAVAILABLE}
94480 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Stopped 
o.s.j.s.ServletContextHandler@4529e005{/stages/pool/json,null,UNAVAILABLE}
94480 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Stopped 
o.s.j.s.ServletContextHandler@a6271fe{/stages/pool,null,UNAVAILABLE}
94480 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Stopped 
o.s.j.s.ServletContextHandler@7f73c849{/stages/stage/json,null,UNAVAILABLE}
94480 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Stopped 
o.s.j.s.ServletContextHandler@202b5f26{/stages/stage,null,UNAVAILABLE}
94480 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Stopped 
o.s.j.s.ServletContextHandler@1ef1f961{/stages/json,null,UNAVAILABLE}
94480 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Stopped 
o.s.j.s.ServletContextHandler@2eb314e{/stages,null,UNAVAILABLE}
94480 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Stopped 
o.s.j.s.ServletContextHandler@576662f6{/jobs/job/json,null,UNAVAILABLE}
94480 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Stopped 
o.s.j.s.ServletContextHandler@7421ce2a{/jobs/job,null,UNAVAILABLE}
94481 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Stopped 
o.s.j.s.ServletContextHandler@29569831{/jobs/json,null,UNAVAILABLE}
94481 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Stopped 
o.s.j.s.ServletContextHandler@27085a7{/jobs,null,UNAVAILABLE}
187773 [RpcServer.reader=6,bindAddress=asf927.gq1.ygridcore.net,port=43675] 
INFO  SecurityLogger.org.apache.hadoop.hbase.Server  - Connection from 
67.195.81.163 port: 56888 with version info: version: "1.1.9" url: 
"git://diocles.local/Volumes/hbase-1.1.9/hbase" revision: 
"0d1feabed5295495ed2257d31fab9e6553e8a9d7" user: "ndimiduk" date: "Mon Feb 20 
22:35:28 PST 2017" src_checksum: "b68339108ddccd1dfc44a76646588a58"
PhoenixSparkITTenantSpecific:
187864 [RpcServer.reader=4,bindAddress=asf927.gq1.ygridcore.net,port=34358] 
INFO  SecurityLogger.org.apache.hadoop.hbase.Server  - Connection from 
67.195.81.163 port: 33710 with version info: version: "1.1.9" url: 
"git://diocles.local/Volumes/hbase-1.1.9/hbase" revision: 
"0d1feabed5295495ed2257d31fab9e6553e8a9d7" user: "ndimiduk" date: "Mon Feb 20 
22:35:28 PST 2017" src_checksum: "b68339108ddccd1dfc44a76646588a58"
242194 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.Server  - jetty-9.2.z-SNAPSHOT
242198 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Started 
o.s.j.s.ServletContextHandler@58fddf3b{/jobs,null,AVAILABLE}
242199 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Started 
o.s.j.s.ServletContextHandler@34ac822{/jobs/json,null,AVAILABLE}
242199 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Started 
o.s.j.s.ServletContextHandler@229931de{/jobs/job,null,AVAILABLE}
242199 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Started 
o.s.j.s.ServletContextHandler@7798f790{/jobs/job/json,null,AVAILABLE}
242199 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Started 
o.s.j.s.ServletContextHandler@5ba54c47{/stages,null,AVAILABLE}
242199 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Started 
o.s.j.s.ServletContextHandler@4132f852{/stages/json,null,AVAILABLE}
242200 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Started 
o.s.j.s.ServletContextHandler@794cf0ee{/stages/stage,null,AVAILABLE}
242200 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Started 
o.s.j.s.ServletContextHandler@2a5dd779{/stages/stage/json,null,AVAILABLE}
242200 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Started 
o.s.j.s.ServletContextHandler@76ed5ea9{/stages/pool,null,AVAILABLE}
242200 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Started 
o.s.j.s.ServletContextHandler@59e1e63e{/stages/pool/json,null,AVAILABLE}
242200 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Started 
o.s.j.s.ServletContextHandler@7dd31e8b{/storage,null,AVAILABLE}
242200 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Started 
o.s.j.s.ServletContextHandler@2f7a7836{/storage/json,null,AVAILABLE}
242201 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Started 
o.s.j.s.ServletContextHandler@116a3994{/storage/rdd,null,AVAILABLE}
242201 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Started 
o.s.j.s.ServletContextHandler@54ea2c7a{/storage/rdd/json,null,AVAILABLE}
242201 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Started 
o.s.j.s.ServletContextHandler@5a0e2923{/environment,null,AVAILABLE}
242201 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Started 
o.s.j.s.ServletContextHandler@57aee705{/environment/json,null,AVAILABLE}
242201 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Started 
o.s.j.s.ServletContextHandler@1bc1263c{/executors,null,AVAILABLE}
242201 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Started 
o.s.j.s.ServletContextHandler@25b53b60{/executors/json,null,AVAILABLE}
242202 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Started 
o.s.j.s.ServletContextHandler@1d4d2ef7{/executors/threadDump,null,AVAILABLE}
242202 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Started 
o.s.j.s.ServletContextHandler@36466928{/executors/threadDump/json,null,AVAILABLE}
242202 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Started 
o.s.j.s.ServletContextHandler@48a1dcee{/static,null,AVAILABLE}
242202 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Started 
o.s.j.s.ServletContextHandler@7aafb282{/,null,AVAILABLE}
242203 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Started 
o.s.j.s.ServletContextHandler@24f0b3bb{/api,null,AVAILABLE}
242203 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Started 
o.s.j.s.ServletContextHandler@36a7ccef{/stages/stage/kill,null,AVAILABLE}
242203 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.ServerConnector  - Started 
ServerConnector@4ce63bb{HTTP/1.1}{0.0.0.0:4040}
242204 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.Server  - Started @253626ms
242248 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Started 
o.s.j.s.ServletContextHandler@f801411{/metrics/json,null,AVAILABLE}
242349 [RpcServer.reader=7,bindAddress=asf927.gq1.ygridcore.net,port=43675] 
INFO  SecurityLogger.org.apache.hadoop.hbase.Server  - Connection from 
67.195.81.163 port: 57548 with version info: version: "1.1.9" url: 
"git://diocles.local/Volumes/hbase-1.1.9/hbase" revision: 
"0d1feabed5295495ed2257d31fab9e6553e8a9d7" user: "ndimiduk" date: "Mon Feb 20 
22:35:28 PST 2017" src_checksum: "b68339108ddccd1dfc44a76646588a58"
242388 [ScalaTest-main-running-PhoenixSparkITTenantSpecific] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Started 
o.s.j.s.ServletContextHandler@2105137a{/SQL,null,AVAILABLE}
242389 [ScalaTest-main-running-PhoenixSparkITTenantSpecific] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Started 
o.s.j.s.ServletContextHandler@2884dffa{/SQL/json,null,AVAILABLE}
242390 [ScalaTest-main-running-PhoenixSparkITTenantSpecific] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Started 
o.s.j.s.ServletContextHandler@6c05c6e5{/SQL/execution,null,AVAILABLE}
242391 [ScalaTest-main-running-PhoenixSparkITTenantSpecific] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Started 
o.s.j.s.ServletContextHandler@2223a3d8{/SQL/execution/json,null,AVAILABLE}
242393 [ScalaTest-main-running-PhoenixSparkITTenantSpecific] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Started 
o.s.j.s.ServletContextHandler@1ce77152{/static/sql,null,AVAILABLE}
242780 [RpcServer.reader=8,bindAddress=asf927.gq1.ygridcore.net,port=43675] 
INFO  SecurityLogger.org.apache.hadoop.hbase.Server  - Connection from 
67.195.81.163 port: 57556 with version info: version: "1.1.9" url: 
"git://diocles.local/Volumes/hbase-1.1.9/hbase" revision: 
"0d1feabed5295495ed2257d31fab9e6553e8a9d7" user: "ndimiduk" date: "Mon Feb 20 
22:35:28 PST 2017" src_checksum: "b68339108ddccd1dfc44a76646588a58"
242795 [RpcServer.reader=5,bindAddress=asf927.gq1.ygridcore.net,port=34358] 
INFO  SecurityLogger.org.apache.hadoop.hbase.Server  - Connection from 
67.195.81.163 port: 34378 with version info: version: "1.1.9" url: 
"git://diocles.local/Volumes/hbase-1.1.9/hbase" revision: 
"0d1feabed5295495ed2257d31fab9e6553e8a9d7" user: "ndimiduk" date: "Mon Feb 20 
22:35:28 PST 2017" src_checksum: "b68339108ddccd1dfc44a76646588a58"
- Can read from tenant-specific table as DataFrame
243446 [RpcServer.reader=9,bindAddress=asf927.gq1.ygridcore.net,port=43675] 
INFO  SecurityLogger.org.apache.hadoop.hbase.Server  - Connection from 
67.195.81.163 port: 57568 with version info: version: "1.1.9" url: 
"git://diocles.local/Volumes/hbase-1.1.9/hbase" revision: 
"0d1feabed5295495ed2257d31fab9e6553e8a9d7" user: "ndimiduk" date: "Mon Feb 20 
22:35:28 PST 2017" src_checksum: "b68339108ddccd1dfc44a76646588a58"
243461 [RpcServer.reader=6,bindAddress=asf927.gq1.ygridcore.net,port=34358] 
INFO  SecurityLogger.org.apache.hadoop.hbase.Server  - Connection from 
67.195.81.163 port: 34390 with version info: version: "1.1.9" url: 
"git://diocles.local/Volumes/hbase-1.1.9/hbase" revision: 
"0d1feabed5295495ed2257d31fab9e6553e8a9d7" user: "ndimiduk" date: "Mon Feb 20 
22:35:28 PST 2017" src_checksum: "b68339108ddccd1dfc44a76646588a58"
- Can read from tenant-specific table as RDD
- Can write a DataFrame using 'DataFrame.saveToPhoenix' to tenant-specific view
- Can write a DataFrame using 'DataFrame.write' to tenant-specific view
- Can write an RDD to Phoenix tenant-specific view
244980 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.ServerConnector  - Stopped 
ServerConnector@4ce63bb{HTTP/1.1}{0.0.0.0:4040}
244981 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Stopped 
o.s.j.s.ServletContextHandler@36a7ccef{/stages/stage/kill,null,UNAVAILABLE}
244981 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Stopped 
o.s.j.s.ServletContextHandler@24f0b3bb{/api,null,UNAVAILABLE}
244981 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Stopped 
o.s.j.s.ServletContextHandler@7aafb282{/,null,UNAVAILABLE}
244981 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Stopped 
o.s.j.s.ServletContextHandler@48a1dcee{/static,null,UNAVAILABLE}
244981 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Stopped 
o.s.j.s.ServletContextHandler@36466928{/executors/threadDump/json,null,UNAVAILABLE}
244981 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Stopped 
o.s.j.s.ServletContextHandler@1d4d2ef7{/executors/threadDump,null,UNAVAILABLE}
244981 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Stopped 
o.s.j.s.ServletContextHandler@25b53b60{/executors/json,null,UNAVAILABLE}
244981 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Stopped 
o.s.j.s.ServletContextHandler@1bc1263c{/executors,null,UNAVAILABLE}
244982 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Stopped 
o.s.j.s.ServletContextHandler@57aee705{/environment/json,null,UNAVAILABLE}
244982 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Stopped 
o.s.j.s.ServletContextHandler@5a0e2923{/environment,null,UNAVAILABLE}
244982 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Stopped 
o.s.j.s.ServletContextHandler@54ea2c7a{/storage/rdd/json,null,UNAVAILABLE}
244982 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Stopped 
o.s.j.s.ServletContextHandler@116a3994{/storage/rdd,null,UNAVAILABLE}
244982 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Stopped 
o.s.j.s.ServletContextHandler@2f7a7836{/storage/json,null,UNAVAILABLE}
244982 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Stopped 
o.s.j.s.ServletContextHandler@7dd31e8b{/storage,null,UNAVAILABLE}
244982 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Stopped 
o.s.j.s.ServletContextHandler@59e1e63e{/stages/pool/json,null,UNAVAILABLE}
244982 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Stopped 
o.s.j.s.ServletContextHandler@76ed5ea9{/stages/pool,null,UNAVAILABLE}
244983 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Stopped 
o.s.j.s.ServletContextHandler@2a5dd779{/stages/stage/json,null,UNAVAILABLE}
244983 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Stopped 
o.s.j.s.ServletContextHandler@794cf0ee{/stages/stage,null,UNAVAILABLE}
244983 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Stopped 
o.s.j.s.ServletContextHandler@4132f852{/stages/json,null,UNAVAILABLE}
244983 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Stopped 
o.s.j.s.ServletContextHandler@5ba54c47{/stages,null,UNAVAILABLE}
244983 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Stopped 
o.s.j.s.ServletContextHandler@7798f790{/jobs/job/json,null,UNAVAILABLE}
244983 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Stopped 
o.s.j.s.ServletContextHandler@229931de{/jobs/job,null,UNAVAILABLE}
244983 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Stopped 
o.s.j.s.ServletContextHandler@34ac822{/jobs/json,null,UNAVAILABLE}
244983 [ScalaTest-main-running-DiscoverySuite] INFO  
org.spark_project.jetty.server.handler.ContextHandler  - Stopped 
o.s.j.s.ServletContextHandler@58fddf3b{/jobs,null,UNAVAILABLE}
Run completed in 5 minutes, 44 seconds.
Total number of tests run: 35
Suites: completed 4, aborted 0
Tests: succeeded 6, failed 29, canceled 0, ignored 0, pending 0
*** 29 TESTS FAILED ***
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Phoenix ..................................... SUCCESS [  1.867 s]
[INFO] Phoenix Core ....................................... SUCCESS [  02:17 h]
[INFO] Phoenix - Flume .................................... SUCCESS [01:50 min]
[INFO] Phoenix - Kafka .................................... SUCCESS [02:25 min]
[INFO] Phoenix - Pig ...................................... SUCCESS [04:19 min]
[INFO] Phoenix Query Server Client ........................ SUCCESS [ 13.040 s]
[INFO] Phoenix Query Server ............................... SUCCESS [02:24 min]
[INFO] Phoenix - Pherf .................................... SUCCESS [02:36 min]
[INFO] Phoenix - Spark .................................... FAILURE [06:22 min]
[INFO] Phoenix - Hive ..................................... SKIPPED
[INFO] Phoenix Client ..................................... SKIPPED
[INFO] Phoenix Server ..................................... SKIPPED
[INFO] Phoenix Assembly ................................... SKIPPED
[INFO] Phoenix - Tracing Web Application .................. SKIPPED
[INFO] Phoenix Load Balancer .............................. SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:37 h
[INFO] Finished at: 2018-02-22T21:20:54Z
[INFO] Final Memory: 122M/1566M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.scalatest:scalatest-maven-plugin:1.0:test 
(integration-test) on project phoenix-spark: There are test failures -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e 
switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please 
read the following articles:
[ERROR] [Help 1] 
http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :phoenix-spark
Build step 'Invoke top-level Maven targets' marked build as failure
Archiving artifacts
[Fast Archiver] Compressed -589239261 B of artifacts by 328.1% relative to #666
Recording test results