See 
<https://builds.apache.org/job/beam_PerformanceTests_Spark/1689/display/redirect?page=changes>

Changes:

[Pablo] Make experiments as set attr of RuntimeValueProvider

------------------------------------------
[...truncated 23.30 KB...]
2018-05-09 06:02:54 INFO  SparkEnv:54 - Registering OutputCommitCoordinator
2018-05-09 06:02:54 INFO  log:192 - Logging initialized @5695ms
2018-05-09 06:02:54 INFO  Server:345 - jetty-9.3.z-SNAPSHOT
2018-05-09 06:02:54 INFO  Server:403 - Started @5921ms
2018-05-09 06:02:54 INFO  AbstractConnector:270 - Started 
ServerConnector@13f7bcb{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2018-05-09 06:02:54 INFO  Utils:54 - Successfully started service 'SparkUI' on 
port 4040.
2018-05-09 06:02:54 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@5b56b654{/jobs,null,AVAILABLE,@Spark}
2018-05-09 06:02:54 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@607b2792{/jobs/json,null,AVAILABLE,@Spark}
2018-05-09 06:02:54 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@138a7441{/jobs/job,null,AVAILABLE,@Spark}
2018-05-09 06:02:54 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@3e598df9{/jobs/job/json,null,AVAILABLE,@Spark}
2018-05-09 06:02:54 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@99a65d3{/stages,null,AVAILABLE,@Spark}
2018-05-09 06:02:54 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@42cc13a0{/stages/json,null,AVAILABLE,@Spark}
2018-05-09 06:02:54 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@6813a331{/stages/stage,null,AVAILABLE,@Spark}
2018-05-09 06:02:54 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@111610e6{/stages/stage/json,null,AVAILABLE,@Spark}
2018-05-09 06:02:54 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@29d37757{/stages/pool,null,AVAILABLE,@Spark}
2018-05-09 06:02:54 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@25cc7470{/stages/pool/json,null,AVAILABLE,@Spark}
2018-05-09 06:02:54 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@79b663b3{/storage,null,AVAILABLE,@Spark}
2018-05-09 06:02:54 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@5d28bcd5{/storage/json,null,AVAILABLE,@Spark}
2018-05-09 06:02:54 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@32639b12{/storage/rdd,null,AVAILABLE,@Spark}
2018-05-09 06:02:54 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@3887cf88{/storage/rdd/json,null,AVAILABLE,@Spark}
2018-05-09 06:02:54 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@78dc4696{/environment,null,AVAILABLE,@Spark}
2018-05-09 06:02:54 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@5652f555{/environment/json,null,AVAILABLE,@Spark}
2018-05-09 06:02:54 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@55120f99{/executors,null,AVAILABLE,@Spark}
2018-05-09 06:02:54 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@38f2e97e{/executors/json,null,AVAILABLE,@Spark}
2018-05-09 06:02:54 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@323659f8{/executors/threadDump,null,AVAILABLE,@Spark}
2018-05-09 06:02:54 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@3e521715{/executors/threadDump/json,null,AVAILABLE,@Spark}
2018-05-09 06:02:54 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@265c5d69{/static,null,AVAILABLE,@Spark}
2018-05-09 06:02:54 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@1d2644e3{/,null,AVAILABLE,@Spark}
2018-05-09 06:02:54 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@602c4656{/api,null,AVAILABLE,@Spark}
2018-05-09 06:02:54 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@63998bf4{/jobs/job/kill,null,AVAILABLE,@Spark}
2018-05-09 06:02:54 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@61942c1{/stages/stage/kill,null,AVAILABLE,@Spark}
2018-05-09 06:02:54 INFO  SparkUI:54 - Bound SparkUI to 0.0.0.0, and started at 
http://10.128.0.4:4040
2018-05-09 06:02:54 INFO  SparkContext:54 - Added JAR 
file:/usr/lib/hadoop/hadoop-common.jar at 
spark://10.128.0.4:59608/jars/hadoop-common.jar with timestamp 1525845774758
2018-05-09 06:02:55 INFO  Utils:54 - Using initial executors = 10000, max of 
spark.dynamicAllocation.initialExecutors, spark.dynamicAllocation.minExecutors 
and spark.executor.instances
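The line above describes how Spark picks its starting executor count under dynamic allocation: the maximum of three configuration values. A rough sketch of that rule (the dict-based lookup and the zero defaults are assumptions for illustration, not Spark's actual internals):

```python
def initial_executors(conf: dict) -> int:
    """Return the initial executor count as Spark's log line describes:
    the max of three settings, treating unset values as 0 (assumed default)."""
    return max(
        int(conf.get("spark.dynamicAllocation.initialExecutors", 0)),
        int(conf.get("spark.dynamicAllocation.minExecutors", 0)),
        int(conf.get("spark.executor.instances", 0)),
    )
```

Here the value 10000 reported in the log would come from `spark.dynamicAllocation.initialExecutors` dominating the other two settings.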
2018-05-09 06:02:55 INFO  GoogleHadoopFileSystemBase:648 - GHFS version: 
1.6.5-hadoop2
2018-05-09 06:02:56 INFO  RMProxy:123 - Connecting to ResourceManager at 
pkb-92e6e15c-m/10.128.0.4:8032
2018-05-09 06:02:57 INFO  Client:54 - Requesting a new application from cluster 
with 2 NodeManagers
2018-05-09 06:02:57 INFO  Client:54 - Verifying our application has not 
requested more than the maximum memory capability of the cluster (3072 MB per 
container)
2018-05-09 06:02:57 INFO  Client:54 - Will allocate AM container, with 1024 MB 
memory including 384 MB overhead
2018-05-09 06:02:57 INFO  Client:54 - Setting up container launch context for 
our AM
2018-05-09 06:02:57 INFO  Client:54 - Setting up the launch environment for our 
AM container
2018-05-09 06:02:57 INFO  Client:54 - Preparing resources for our AM container
2018-05-09 06:02:59 INFO  Client:54 - Uploading resource 
file:/usr/lib/spark/examples/jars/spark-examples.jar -> 
hdfs://pkb-92e6e15c-m/user/root/.sparkStaging/application_1525845706911_0001/spark-examples.jar
2018-05-09 06:03:01 INFO  Client:54 - Uploading resource 
file:/hadoop/spark/tmp/spark-b5c51010-e06c-44b7-9e26-dc07950a3a05/__spark_conf__6857507561911862427.zip
 -> 
hdfs://pkb-92e6e15c-m/user/root/.sparkStaging/application_1525845706911_0001/__spark_conf__.zip
2018-05-09 06:03:01 INFO  SecurityManager:54 - Changing view acls to: root
2018-05-09 06:03:01 INFO  SecurityManager:54 - Changing modify acls to: root
2018-05-09 06:03:01 INFO  SecurityManager:54 - Changing view acls groups to: 
2018-05-09 06:03:01 INFO  SecurityManager:54 - Changing modify acls groups to: 
2018-05-09 06:03:01 INFO  SecurityManager:54 - SecurityManager: authentication 
disabled; ui acls disabled; users  with view permissions: Set(root); groups 
with view permissions: Set(); users  with modify permissions: Set(root); groups 
with modify permissions: Set()
2018-05-09 06:03:01 INFO  Client:54 - Submitting application 
application_1525845706911_0001 to ResourceManager
2018-05-09 06:03:02 INFO  YarnClientImpl:278 - Submitted application 
application_1525845706911_0001
2018-05-09 06:03:02 INFO  SchedulerExtensionServices:54 - Starting Yarn 
extension services with app application_1525845706911_0001 and attemptId None
2018-05-09 06:03:03 INFO  Client:54 - Application report for 
application_1525845706911_0001 (state: ACCEPTED)
2018-05-09 06:03:03 INFO  Client:54 - 
         client token: N/A
         diagnostics: AM container is launched, waiting for AM container to 
Register with RM
         ApplicationMaster host: N/A
         ApplicationMaster RPC port: -1
         queue: default
         start time: 1525845781878
         final status: UNDEFINED
         tracking URL: 
http://pkb-92e6e15c-m:8088/proxy/application_1525845706911_0001/
         user: root
2018-05-09 06:03:04 INFO  Client:54 - Application report for 
application_1525845706911_0001 (state: ACCEPTED)
2018-05-09 06:03:05 INFO  Client:54 - Application report for 
application_1525845706911_0001 (state: ACCEPTED)
2018-05-09 06:03:06 INFO  Client:54 - Application report for 
application_1525845706911_0001 (state: ACCEPTED)
2018-05-09 06:03:07 INFO  Client:54 - Application report for 
application_1525845706911_0001 (state: ACCEPTED)
2018-05-09 06:03:08 INFO  Client:54 - Application report for 
application_1525845706911_0001 (state: ACCEPTED)
2018-05-09 06:03:09 INFO  Client:54 - Application report for 
application_1525845706911_0001 (state: ACCEPTED)
2018-05-09 06:03:10 INFO  Client:54 - Application report for 
application_1525845706911_0001 (state: ACCEPTED)
2018-05-09 06:03:11 INFO  Client:54 - Application report for 
application_1525845706911_0001 (state: ACCEPTED)
2018-05-09 06:03:12 INFO  Client:54 - Application report for 
application_1525845706911_0001 (state: ACCEPTED)
2018-05-09 06:03:13 INFO  Client:54 - Application report for 
application_1525845706911_0001 (state: ACCEPTED)
2018-05-09 06:03:14 INFO  Client:54 - Application report for 
application_1525845706911_0001 (state: ACCEPTED)
2018-05-09 06:03:15 INFO  YarnSchedulerBackend$YarnSchedulerEndpoint:54 - 
ApplicationMaster registered as NettyRpcEndpointRef(spark-client://YarnAM)
2018-05-09 06:03:15 INFO  Client:54 - Application report for 
application_1525845706911_0001 (state: ACCEPTED)
2018-05-09 06:03:15 INFO  YarnClientSchedulerBackend:54 - Add WebUI Filter. 
org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter, Map(PROXY_HOSTS -> 
pkb-92e6e15c-m, PROXY_URI_BASES -> 
http://pkb-92e6e15c-m:8088/proxy/application_1525845706911_0001), 
/proxy/application_1525845706911_0001
2018-05-09 06:03:15 INFO  JettyUtils:54 - Adding filter: 
org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter
2018-05-09 06:03:16 INFO  Client:54 - Application report for 
application_1525845706911_0001 (state: RUNNING)
2018-05-09 06:03:16 INFO  Client:54 - 
         client token: N/A
         diagnostics: N/A
         ApplicationMaster host: 10.128.0.5
         ApplicationMaster RPC port: 0
         queue: default
         start time: 1525845781878
         final status: UNDEFINED
         tracking URL: 
http://pkb-92e6e15c-m:8088/proxy/application_1525845706911_0001/
         user: root
2018-05-09 06:03:16 INFO  YarnClientSchedulerBackend:54 - Application 
application_1525845706911_0001 has started running.
2018-05-09 06:03:16 INFO  Utils:54 - Successfully started service 
'org.apache.spark.network.netty.NettyBlockTransferService' on port 48572.
2018-05-09 06:03:16 INFO  NettyBlockTransferService:54 - Server created on 
10.128.0.4:48572
2018-05-09 06:03:16 INFO  BlockManager:54 - Using 
org.apache.spark.storage.RandomBlockReplicationPolicy for block replication 
policy
2018-05-09 06:03:16 INFO  BlockManagerMaster:54 - Registering BlockManager 
BlockManagerId(driver, 10.128.0.4, 48572, None)
2018-05-09 06:03:16 INFO  BlockManagerMasterEndpoint:54 - Registering block 
manager 10.128.0.4:48572 with 376.8 MB RAM, BlockManagerId(driver, 10.128.0.4, 
48572, None)
2018-05-09 06:03:16 INFO  BlockManagerMaster:54 - Registered BlockManager 
BlockManagerId(driver, 10.128.0.4, 48572, None)
2018-05-09 06:03:16 INFO  BlockManager:54 - external shuffle service port = 7337
2018-05-09 06:03:16 INFO  BlockManager:54 - Initialized BlockManager: 
BlockManagerId(driver, 10.128.0.4, 48572, None)
2018-05-09 06:03:16 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@40a1b6d4{/metrics/json,null,AVAILABLE,@Spark}
2018-05-09 06:03:17 INFO  EventLoggingListener:54 - Logging events to 
hdfs://pkb-92e6e15c-m/user/spark/eventlog/application_1525845706911_0001
2018-05-09 06:03:17 INFO  Utils:54 - Using initial executors = 10000, max of 
spark.dynamicAllocation.initialExecutors, spark.dynamicAllocation.minExecutors 
and spark.executor.instances
2018-05-09 06:03:17 INFO  YarnClientSchedulerBackend:54 - SchedulerBackend is 
ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.0
2018-05-09 06:03:17 INFO  SharedState:54 - loading hive config file: 
file:/etc/hive/conf.dist/hive-site.xml
2018-05-09 06:03:17 INFO  SharedState:54 - Setting hive.metastore.warehouse.dir 
('null') to the value of spark.sql.warehouse.dir 
('file:/tmp/0e2ea8e7-6c54-4be1-9b78-adeca2da88d7/spark-warehouse').
2018-05-09 06:03:17 INFO  SharedState:54 - Warehouse path is 
'file:/tmp/0e2ea8e7-6c54-4be1-9b78-adeca2da88d7/spark-warehouse'.
2018-05-09 06:03:17 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@4264beb8{/SQL,null,AVAILABLE,@Spark}
2018-05-09 06:03:17 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@7cd3e0da{/SQL/json,null,AVAILABLE,@Spark}
2018-05-09 06:03:17 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@1182413a{/SQL/execution,null,AVAILABLE,@Spark}
2018-05-09 06:03:17 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@5b14f482{/SQL/execution/json,null,AVAILABLE,@Spark}
2018-05-09 06:03:17 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@75ad30c1{/static/sql,null,AVAILABLE,@Spark}
2018-05-09 06:03:18 INFO  StateStoreCoordinatorRef:54 - Registered 
StateStoreCoordinator endpoint
2018-05-09 06:03:21 WARN  GoogleHadoopFileSystemBase:75 - 
GHFS.configureBuckets: Warning. No GCS bucket provided. Falling back on 
deprecated fs.gs.system.bucket.
Exception in thread "main" org.apache.spark.sql.AnalysisException: Path does 
not exist: gs://dataproc-6c5fbcbb-a2de-406e-9cf7-8c1ce0b6a604-us/etc/hosts;
        at 
org.apache.spark.sql.execution.datasources.DataSource$.org$apache$spark$sql$execution$datasources$DataSource$$checkAndGlobPathIfNecessary(DataSource.scala:626)
        at 
org.apache.spark.sql.execution.datasources.DataSource$$anonfun$14.apply(DataSource.scala:350)
        at 
org.apache.spark.sql.execution.datasources.DataSource$$anonfun$14.apply(DataSource.scala:350)
        at 
scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
        at 
scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
        at scala.collection.immutable.List.foreach(List.scala:381)
        at 
scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:241)
        at scala.collection.immutable.List.flatMap(List.scala:344)
        at 
org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:349)
        at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:178)
        at org.apache.spark.sql.DataFrameReader.text(DataFrameReader.scala:623)
        at 
org.apache.spark.sql.DataFrameReader.textFile(DataFrameReader.scala:657)
        at 
org.apache.spark.sql.DataFrameReader.textFile(DataFrameReader.scala:632)
        at org.apache.spark.examples.JavaWordCount.main(JavaWordCount.java:45)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at 
org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:775)
        at 
org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
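The failure above follows from the WARN just before it: no GCS bucket was provided, so the input path was built with an empty bucket component, producing `gs:///etc/hosts` (also visible as `input_location` in the benchmark metadata later in this log). A minimal sketch of a check that would catch this class of path before submission (`validate_gcs_uri` is a hypothetical helper, not part of Beam or PerfKitBenchmarker):

```python
from urllib.parse import urlparse

def validate_gcs_uri(uri: str) -> str:
    """Return the URI unchanged if it has a gs:// scheme and a
    non-empty bucket; raise ValueError otherwise."""
    parsed = urlparse(uri)
    if parsed.scheme != "gs":
        raise ValueError(f"not a gs:// URI: {uri!r}")
    if not parsed.netloc:
        # An empty bucket is exactly what yields "gs:///etc/hosts"
        # and the "Path does not exist" failure above.
        raise ValueError(f"empty bucket in GCS URI: {uri!r}")
    return uri

# validate_gcs_uri("gs://my-bucket/etc/hosts")  -> returns the URI
# validate_gcs_uri("gs:///etc/hosts")           -> raises ValueError
```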
2018-05-09 06:03:21 INFO  SparkContext:54 - Invoking stop() from shutdown hook
2018-05-09 06:03:21 INFO  AbstractConnector:310 - Stopped 
Spark@13f7bcb{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2018-05-09 06:03:21 INFO  SparkUI:54 - Stopped Spark web UI at 
http://10.128.0.4:4040
2018-05-09 06:03:21 INFO  YarnClientSchedulerBackend:54 - Interrupting monitor 
thread
2018-05-09 06:03:21 INFO  YarnClientSchedulerBackend:54 - Shutting down all 
executors
2018-05-09 06:03:21 INFO  YarnSchedulerBackend$YarnDriverEndpoint:54 - Asking 
each executor to shut down
2018-05-09 06:03:21 INFO  SchedulerExtensionServices:54 - Stopping 
SchedulerExtensionServices
(serviceOption=None,
 services=List(),
 started=false)
2018-05-09 06:03:21 INFO  YarnClientSchedulerBackend:54 - Stopped
2018-05-09 06:03:21 INFO  MapOutputTrackerMasterEndpoint:54 - 
MapOutputTrackerMasterEndpoint stopped!
2018-05-09 06:03:21 INFO  MemoryStore:54 - MemoryStore cleared
2018-05-09 06:03:22 INFO  BlockManager:54 - BlockManager stopped
2018-05-09 06:03:22 INFO  BlockManagerMaster:54 - BlockManagerMaster stopped
2018-05-09 06:03:22 INFO  
OutputCommitCoordinator$OutputCommitCoordinatorEndpoint:54 - 
OutputCommitCoordinator stopped!
2018-05-09 06:03:22 INFO  SparkContext:54 - Successfully stopped SparkContext
2018-05-09 06:03:22 INFO  ShutdownHookManager:54 - Shutdown hook called
2018-05-09 06:03:22 INFO  ShutdownHookManager:54 - Deleting directory 
/hadoop/spark/tmp/spark-b5c51010-e06c-44b7-9e26-dc07950a3a05
ERROR: (gcloud.dataproc.jobs.submit.spark) Job 
[0e2ea8e7-6c54-4be1-9b78-adeca2da88d7] entered state [ERROR] while waiting for 
[DONE].

2018-05-09 06:03:25,414 92e6e15c MainThread dpb_wordcount_benchmark(1/1) INFO   
  Cleaning up benchmark dpb_wordcount_benchmark
2018-05-09 06:03:25,414 92e6e15c MainThread dpb_wordcount_benchmark(1/1) INFO   
  Tearing down resources for benchmark dpb_wordcount_benchmark
2018-05-09 06:03:25,414 92e6e15c MainThread dpb_wordcount_benchmark(1/1) INFO   
  Running: gcloud dataproc clusters delete pkb-92e6e15c --format json --quiet
2018-05-09 06:05:07,288 92e6e15c MainThread dpb_wordcount_benchmark(1/1) INFO   
  Running: gcloud dataproc clusters describe pkb-92e6e15c --format json --quiet
2018-05-09 06:05:07,976 92e6e15c MainThread dpb_wordcount_benchmark(1/1) INFO   
  Ran: {gcloud dataproc clusters describe pkb-92e6e15c --format json --quiet}  
ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: 
Cluster projects/apache-beam-testing/regions/global/clusters/pkb-92e6e15c

2018-05-09 06:05:08,023 92e6e15c MainThread INFO     Publishing 2 samples to 
<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/92e6e15c/perfkit-bq-pubOs_F6y.json>
2018-05-09 06:05:08,024 92e6e15c MainThread INFO     Publishing 2 samples to 
beam_performance.spark_pkp_results
2018-05-09 06:05:08,024 92e6e15c MainThread INFO     Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.spark_pkp_results 
<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/92e6e15c/perfkit-bq-pubOs_F6y.json>
2018-05-09 06:05:13,266 92e6e15c MainThread INFO     
-------------------------PerfKitBenchmarker Complete 
Results-------------------------
{'metadata': {'dpb_cluster_id': 'pkb-92e6e15c',
              'dpb_cluster_shape': 'n1-standard-1',
              'dpb_cluster_size': 2,
              'dpb_service': 'dataproc',
              'input_location': 'gs:///etc/hosts',
              'perfkitbenchmarker_version': 'v1.12.0-580-g36cfc82',
              'run_number': 0},
 'metric': 'run_time',
 'official': True,
 'owner': 'jenkins',
 'product_name': 'PerfKitBenchmarker',
 'run_uri': '92e6e15c-cd304fd5-05ce-4eeb-89a2-421e9c013522',
 'sample_uri': '29afd66a-99a8-4a7f-b41c-6433c469be7a',
 'test': 'dpb_wordcount_benchmark',
 'timestamp': 1525845805.413429,
 'unit': 'seconds',
 'value': 39.060006}
{'metadata': {'perfkitbenchmarker_version': 'v1.12.0-580-g36cfc82'},
 'metric': 'End to End Runtime',
 'official': True,
 'owner': 'jenkins',
 'product_name': 'PerfKitBenchmarker',
 'run_uri': '92e6e15c-cd304fd5-05ce-4eeb-89a2-421e9c013522',
 'sample_uri': '4f418254-d162-45da-ace7-3fab7e76e66e',
 'test': 'dpb_wordcount_benchmark',
 'timestamp': 1525845907.976928,
 'unit': 'seconds',
 'value': 260.98073387145996}


-------------------------PerfKitBenchmarker Results 
Summary-------------------------
DPB_WORDCOUNT_BENCHMARK:
  dpb_cluster_id="pkb-92e6e15c" dpb_cluster_shape="n1-standard-1" 
dpb_cluster_size="2" dpb_service="dataproc" input_location="gs:///etc/hosts" 
run_number="0"
  run_time                             39.060006 seconds                       
  End to End Runtime                  260.980734 seconds                       

-------------------------
For all tests: perfkitbenchmarker_version="v1.12.0-580-g36cfc82"
2018-05-09 06:05:13,267 92e6e15c MainThread INFO     Publishing 2 samples to 
<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/92e6e15c/perfkitbenchmarker_results.json>
2018-05-09 06:05:13,267 92e6e15c MainThread INFO     Benchmark run statuses:
------------------------------------------------------------------------------
Name                     UID                       Status     Failed Substatus
------------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  SUCCEEDED                  
------------------------------------------------------------------------------
Success rate: 100.00% (1/1)
2018-05-09 06:05:13,267 92e6e15c MainThread INFO     Complete logs can be found 
at: 
<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/92e6e15c/pkb.log>
2018-05-09 06:05:13,268 92e6e15c MainThread INFO     Completion statuses can be 
found at: 
<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/92e6e15c/completion_statuses.json>
[Set GitHub commit status (universal)] SUCCESS on repos 
[GHRepository@50169027[description=Apache 
Beam,homepage=,name=beam,fork=false,size=58754,milestones={},language=Java,commits={},source=<null>,parent=<null>,responseHeaderFields={null=[HTTP/1.1
 200 OK], Access-Control-Allow-Origin=[*], Access-Control-Expose-Headers=[ETag, 
Link, Retry-After, X-GitHub-OTP, X-RateLimit-Limit, X-RateLimit-Remaining, 
X-RateLimit-Reset, X-OAuth-Scopes, X-Accepted-OAuth-Scopes, X-Poll-Interval], 
Cache-Control=[private, max-age=60, s-maxage=60], Content-Encoding=[gzip], 
Content-Security-Policy=[default-src 'none'], Content-Type=[application/json; 
charset=utf-8], Date=[Wed, 09 May 2018 06:05:13 GMT], 
ETag=[W/"b325fcee1bcdcb6673ffb3a642ac3060"], Last-Modified=[Wed, 09 May 2018 
00:26:13 GMT], OkHttp-Received-Millis=[1525845914031], 
OkHttp-Response-Source=[NETWORK 200], OkHttp-Selected-Protocol=[http/1.1], 
OkHttp-Sent-Millis=[1525845913892], Referrer-Policy=[origin-when-cross-origin, 
strict-origin-when-cross-origin], Server=[GitHub.com], Status=[200 OK], 
Strict-Transport-Security=[max-age=31536000; includeSubdomains; preload], 
Transfer-Encoding=[chunked], Vary=[Accept, Authorization, Cookie, 
X-GitHub-OTP], X-Accepted-OAuth-Scopes=[repo], 
X-Content-Type-Options=[nosniff], X-Frame-Options=[deny], 
X-GitHub-Media-Type=[github.v3; format=json], 
X-GitHub-Request-Id=[DE20:4F40:989654:14F5555:5AF28F94], 
X-OAuth-Scopes=[admin:repo_hook, repo, repo:status], X-RateLimit-Limit=[5000], 
X-RateLimit-Remaining=[4969], X-RateLimit-Reset=[1525848688], 
X-Runtime-rack=[0.054108], X-XSS-Protection=[1; 
mode=block]},url=https://api.github.com/repos/apache/beam,id=50904245]] 
(sha:60f90c8) with context:beam_PerformanceTests_Spark
Setting commit status on GitHub for 
https://github.com/apache/beam/commit/60f90c8dcb229c35a82c7be15e64a89678bae058
ERROR: Build step failed with exception
java.io.FileNotFoundException: 
https://api.github.com/repos/apache/beam/statuses/60f90c8dcb229c35a82c7be15e64a89678bae058
        at 
com.squareup.okhttp.internal.huc.HttpURLConnectionImpl.getInputStream(HttpURLConnectionImpl.java:243)
        at 
com.squareup.okhttp.internal.huc.DelegatingHttpsURLConnection.getInputStream(DelegatingHttpsURLConnection.java:210)
        at 
com.squareup.okhttp.internal.huc.HttpsURLConnectionImpl.getInputStream(HttpsURLConnectionImpl.java:25)
        at org.kohsuke.github.Requester.parse(Requester.java:612)
        at org.kohsuke.github.Requester.parse(Requester.java:594)
        at org.kohsuke.github.Requester._to(Requester.java:272)
Caused: org.kohsuke.github.GHFileNotFoundException: {"message":"Not 
Found","documentation_url":"https://developer.github.com/v3/repos/statuses/#create-a-status"}
        at org.kohsuke.github.Requester.handleApiError(Requester.java:686)
        at org.kohsuke.github.Requester._to(Requester.java:293)
        at org.kohsuke.github.Requester.to(Requester.java:234)
        at 
org.kohsuke.github.GHRepository.createCommitStatus(GHRepository.java:1075)
        at 
org.jenkinsci.plugins.github.status.GitHubCommitStatusSetter.perform(GitHubCommitStatusSetter.java:160)
Caused: 
org.jenkinsci.plugins.github.common.CombineErrorHandler$ErrorHandlingException
        at 
org.jenkinsci.plugins.github.common.CombineErrorHandler.handle(CombineErrorHandler.java:74)
        at 
org.jenkinsci.plugins.github.status.GitHubCommitStatusSetter.perform(GitHubCommitStatusSetter.java:164)
        at 
com.cloudbees.jenkins.GitHubCommitNotifier.perform(GitHubCommitNotifier.java:151)
        at 
hudson.tasks.BuildStepCompatibilityLayer.perform(BuildStepCompatibilityLayer.java:81)
        at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:20)
        at 
hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:744)
        at 
hudson.model.AbstractBuild$AbstractBuildExecution.performAllBuildSteps(AbstractBuild.java:690)
        at hudson.model.Build$BuildExecution.post2(Build.java:186)
        at 
hudson.model.AbstractBuild$AbstractBuildExecution.post(AbstractBuild.java:635)
        at hudson.model.Run.execute(Run.java:1749)
        at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
        at hudson.model.ResourceController.execute(ResourceController.java:97)
        at hudson.model.Executor.run(Executor.java:429)
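The trace above is the Jenkins GitHub plugin POSTing to GitHub's v3 "create a commit status" endpoint (the `documentation_url` in the error body names it) and getting 404; the GitHub API commonly reports authorization problems, such as a token missing the `repo:status` scope, as Not Found rather than Forbidden. A sketch of how that request is shaped, as a pure helper so it can be checked against the URL in the trace (the function name is hypothetical; only the endpoint path comes from the error above):

```python
def commit_status_request(owner: str, repo: str, sha: str,
                          state: str, context: str) -> tuple:
    """Build the URL and JSON body for GitHub's v3
    POST /repos/{owner}/{repo}/statuses/{sha} endpoint."""
    url = f"https://api.github.com/repos/{owner}/{repo}/statuses/{sha}"
    body = {"state": state, "context": context}
    return url, body
```

For the commit in this build, `commit_status_request("apache", "beam", "60f90c8dcb229c35a82c7be15e64a89678bae058", ...)` yields exactly the URL the `FileNotFoundException` reports.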
Build step 'Set build status on GitHub commit [deprecated]' marked build as 
failure