Jenkins build is back to normal : beam_PostCommit_Python_ValidatesRunner_Dataflow #1304

2018-04-09 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PostCommit_Java_GradleBuild #18

2018-04-09 Thread Apache Jenkins Server
See 


Changes:

[ccy] Add Python trigger snippets and tests

--
[...truncated 15.56 MB...]
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample 
as view/GBKaSVForKeys as step s23
Apr 10, 2018 6:34:20 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample 
as view/ParDo(ToIsmMetadataRecordForKey) as step s24
Apr 10, 2018 6:34:20 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample 
as view/Flatten.PCollections as step s25
Apr 10, 2018 6:34:20 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample 
as view/CreateDataflowView as step s26
Apr 10, 2018 6:34:20 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Partition 
input as step s27
Apr 10, 2018 6:34:20 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Group by 
partition as step s28
Apr 10, 2018 6:34:20 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Batch 
mutations together as step s29
Apr 10, 2018 6:34:20 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Write 
mutations to Spanner as step s30
Apr 10, 2018 6:34:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging pipeline description to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testwrite-jenkins-0410063417-7ee928f7/output/results/staging/
Apr 10, 2018 6:34:21 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading <80749 bytes, hash kdM_N0KIhA2W69M8ZxFngA> to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testwrite-jenkins-0410063417-7ee928f7/output/results/staging/pipeline-kdM_N0KIhA2W69M8ZxFngA.pb

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testWrite STANDARD_OUT
Dataflow SDK version: 2.5.0-SNAPSHOT

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testWrite STANDARD_ERROR
Apr 10, 2018 6:34:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-04-09_23_34_21-9335606100498712109?project=apache-beam-testing

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testWrite STANDARD_OUT
Submitted job: 2018-04-09_23_34_21-9335606100498712109

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testWrite STANDARD_ERROR
Apr 10, 2018 6:34:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel 
--region=us-central1 2018-04-09_23_34_21-9335606100498712109
Apr 10, 2018 6:34:22 AM org.apache.beam.runners.dataflow.TestDataflowRunner 
run
INFO: Running Dataflow job 2018-04-09_23_34_21-9335606100498712109 with 0 
expected assertions.

org.apache.beam.sdk.io.gcp.spanner.SpannerReadIT > testRead STANDARD_ERROR
Apr 10, 2018 6:34:26 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-10T06:34:24.855Z: Autoscaling: Resized worker pool from 8 to 
0.
Apr 10, 2018 6:34:26 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-10T06:34:24.886Z: Autoscaling: Would further reduce the 
number of workers but reached the minimum number allowed for the job.

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testWrite STANDARD_ERROR
Apr 10, 2018 6:34:32 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-10T06:34:21.540Z: Autoscaling is enabled for job 
2018-04-09_23_34_21-9335606100498712109. The number of workers will be between 
1 and 1000.
Apr 10, 2018 6:34:32 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-10T06:34:21.556Z: Autoscaling was automatically enabled for 
job 2018-04-09_23_34_21-9335606100498712109.
Apr 10, 2018 6:34:32 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-10T06:34:24.089Z: Checking required Cloud APIs are enabled.
Apr 10, 2018 6:34:32 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-10T06:34:24.189Z: Checking 

Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Spark_Gradle #42

2018-04-09 Thread Apache Jenkins Server
See 


--
[...truncated 1.59 MB...]
at scala.collection.immutable.List.foreach(List.scala:392)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:457)
at 
org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
at 
org.apache.beam.runners.spark.translation.SparkContextFactory.createSparkContext(SparkContextFactory.java:103)
at 
org.apache.beam.runners.spark.translation.SparkContextFactory.getSparkContext(SparkContextFactory.java:68)
at 
org.apache.beam.runners.spark.translation.streaming.SparkRunnerStreamingContextFactory.call(SparkRunnerStreamingContextFactory.java:79)
at 
org.apache.beam.runners.spark.translation.streaming.SparkRunnerStreamingContextFactory.call(SparkRunnerStreamingContextFactory.java:47)
at 
org.apache.spark.streaming.api.java.JavaStreamingContext$$anonfun$7.apply(JavaStreamingContext.scala:627)
at 
org.apache.spark.streaming.api.java.JavaStreamingContext$$anonfun$7.apply(JavaStreamingContext.scala:626)
at scala.Option.getOrElse(Option.scala:121)
at 
org.apache.spark.streaming.StreamingContext$.getOrCreate(StreamingContext.scala:828)
at 
org.apache.spark.streaming.api.java.JavaStreamingContext$.getOrCreate(JavaStreamingContext.scala:626)
at 
org.apache.spark.streaming.api.java.JavaStreamingContext.getOrCreate(JavaStreamingContext.scala)
at org.apache.beam.runners.spark.SparkRunner.run(SparkRunner.java:169)
at 
org.apache.beam.runners.spark.TestSparkRunner.run(TestSparkRunner.java:123)
at 
org.apache.beam.runners.spark.TestSparkRunner.run(TestSparkRunner.java:83)
at org.apache.beam.sdk.Pipeline.run(Pipeline.java:311)
at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:346)
at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:328)
at 
org.apache.beam.runners.spark.translation.streaming.CreateStreamTest.testDiscardingMode(CreateStreamTest.java:203)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
at 
org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
at 
org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
at 
org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
at org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:48)
at 
org.apache.beam.sdk.testing.TestPipeline$1.evaluate(TestPipeline.java:317)
at 
org.junit.rules.ExpectedException$ExpectedExceptionStatement.evaluate(ExpectedException.java:239)
at org.junit.rules.RunRules.evaluate(RunRules.java:20)
at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
at 
org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
at 
org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
at 
org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecuter.runTestClass(JUnitTestClassExecuter.java:114)
at 
org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecuter.execute(JUnitTestClassExecuter.java:57)
at 
org.gradle.api.internal.tasks.testing.junit.JUnitTestClassProcessor.processTestClass(JUnitTestClassProcessor.java:66)
at 
org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:35)
at 
org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
at 
org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispat

[jira] [Work logged] (BEAM-3119) direct-metrics-counter-committer threads are leaking

2018-04-09 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3119?focusedWorklogId=89279&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-89279
 ]

ASF GitHub Bot logged work on BEAM-3119:


Author: ASF GitHub Bot
Created on: 10/Apr/18 06:35
Start Date: 10/Apr/18 06:35
Worklog Time Spent: 10m 
  Work Description: rmannibucau commented on a change in pull request 
#4965: BEAM-3119 ensure the metrics thread pool is related to an execution
URL: https://github.com/apache/beam/pull/4965#discussion_r180309992
 
 

 ##
 File path: 
runners/direct-java/src/main/java/org/apache/beam/runners/direct/ExecutorServiceParallelExecutor.java
 ##
 @@ -298,6 +303,11 @@ private void shutdownIfNecessary(State newState) {
 } catch (final RuntimeException re) {
   errors.add(re);
 }
+try {
+metricsExecutor.shutdown();
 
 Review comment:
   https://issues.apache.org/jira/browse/BEAM-4039


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 89279)
Time Spent: 2h 40m  (was: 2.5h)

> direct-metrics-counter-committer threads are leaking
> 
>
> Key: BEAM-3119
> URL: https://issues.apache.org/jira/browse/BEAM-3119
> Project: Beam
>  Issue Type: Bug
>  Components: runner-direct
>Reporter: Etienne Chauchot
>Assignee: Thomas Groh
>Priority: Major
>  Time Spent: 2h 40m
>  Remaining Estimate: 0h
>
> When I run ElasticsearchIOTests using ESv5, a thread leak control 
> mechanism ({{com.carrotsearch.randomizedtesting.ThreadLeakControl}}) is 
> active. It waits 5s for non-terminated threads at the end of a test, and 
> it detects a leaked {{direct-metrics-counter-committer}} thread.
> {code}
> com.carrotsearch.randomizedtesting.ThreadLeakError: There are still zombie 
> threads that couldn't be terminated:
>1) Thread[id=296, name=direct-metrics-counter-committer, 
> state=TIMED_WAITING, group=TGRP-ElasticsearchIOTest]
> at sun.misc.Unsafe.park(Native Method)
> at 
> java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
> at 
> java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:460)
> at 
> java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:362)
> at 
> java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:941)
> at 
> java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1066)
> at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
> at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> at java.lang.Thread.run(Thread.java:745)
>   at __randomizedtesting.SeedInfo.seed([59E504CA1B0DD6A8]:0){code}
> I tried to increase the timeout to 30s (by patching 
> randomizedtesting-runner-2.5.0.jar) but still get a zombie thread.
> To reproduce, just comment out 
> {code}
> @ThreadLeakScope(ThreadLeakScope.Scope.NONE)
> {code}
>  in 
> {code}
> beam/sdks/java/io/elasticsearch-tests/elasticsearch-tests-5/src/test/java/org/apache/beam/sdk/io/elasticsearch/ElasticsearchIOTest.java
> {code}
> and run 
> {code}
> testRead()
> {code}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (BEAM-4039) ExecutorServiceParallelExecutor should have a unified shutdown collection

2018-04-09 Thread Romain Manni-Bucau (JIRA)
Romain Manni-Bucau created BEAM-4039:


 Summary: ExecutorServiceParallelExecutor should have a unified 
shutdown collection
 Key: BEAM-4039
 URL: https://issues.apache.org/jira/browse/BEAM-4039
 Project: Beam
  Issue Type: Improvement
  Components: runner-direct
Reporter: Romain Manni-Bucau
Assignee: Thomas Groh


The goal of this ticket is to ensure the executor doesn't have to know about 
each individual instance of what it destroys at shutdown time.

The proposal is to keep a list of Closeable (or AutoCloseable) instances plus 
a registerCloseable method, which would decouple the two.

Side note: it is still key to ensure that 1. the list is ordered and 2. an 
error in one element doesn't prevent closing the next one.
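[Editor's illustration] The proposal above can be sketched as a small registry (names such as ShutdownRegistry and closeAll are hypothetical, not Beam APIs): closeables are closed in registration order, and each failure is collected rather than thrown, so one error cannot prevent closing the next element.

```java
import java.util.ArrayList;
import java.util.List;

public class ShutdownRegistry {
  private final List<AutoCloseable> closeables = new ArrayList<>();

  public void registerCloseable(AutoCloseable c) {
    closeables.add(c);
  }

  /** Closes everything in registration order; collects errors instead of throwing early. */
  public List<Exception> closeAll() {
    List<Exception> errors = new ArrayList<>();
    for (AutoCloseable c : closeables) {
      try {
        c.close();
      } catch (Exception e) {
        errors.add(e); // keep going: later closeables must still run
      }
    }
    return errors;
  }

  public static void main(String[] args) {
    ShutdownRegistry registry = new ShutdownRegistry();
    StringBuilder order = new StringBuilder();
    registry.registerCloseable(() -> order.append("a"));
    registry.registerCloseable(() -> { throw new IllegalStateException("boom"); });
    registry.registerCloseable(() -> order.append("b"));
    List<Exception> errors = registry.closeAll();
    System.out.println(order + " errors=" + errors.size());
  }
}
```

The failing middle closeable does not stop the third one from being closed, which is exactly the "any error doesn't prevent the next element" property the ticket asks for.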



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Work logged] (BEAM-3119) direct-metrics-counter-committer threads are leaking

2018-04-09 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3119?focusedWorklogId=89277&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-89277
 ]

ASF GitHub Bot logged work on BEAM-3119:


Author: ASF GitHub Bot
Created on: 10/Apr/18 06:32
Start Date: 10/Apr/18 06:32
Worklog Time Spent: 10m 
  Work Description: rmannibucau commented on a change in pull request 
#4965: BEAM-3119 ensure the metrics thread pool is related to an execution
URL: https://github.com/apache/beam/pull/4965#discussion_r180309378
 
 

 ##
 File path: 
runners/direct-java/src/main/java/org/apache/beam/runners/direct/DirectMetrics.java
 ##
 @@ -223,13 +215,11 @@ public GaugeResult extract(GaugeData data) {
   };
 
   /** The current values of counters in memory. */
-  private MetricsMap> counters =
-  new MetricsMap<>(unusedKey -> new DirectMetric<>(COUNTER));
+  private MetricsMap> counters;
 
 Review comment:
   good catch, will fix




Issue Time Tracking
---

Worklog Id: (was: 89277)
Time Spent: 2.5h  (was: 2h 20m)

> direct-metrics-counter-committer threads are leaking
> 
>
> Key: BEAM-3119
> URL: https://issues.apache.org/jira/browse/BEAM-3119
> Project: Beam
>  Issue Type: Bug
>  Components: runner-direct
>Reporter: Etienne Chauchot
>Assignee: Thomas Groh
>Priority: Major
>  Time Spent: 2.5h
>  Remaining Estimate: 0h
>



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Work logged] (BEAM-3119) direct-metrics-counter-committer threads are leaking

2018-04-09 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3119?focusedWorklogId=89276&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-89276
 ]

ASF GitHub Bot logged work on BEAM-3119:


Author: ASF GitHub Bot
Created on: 10/Apr/18 06:31
Start Date: 10/Apr/18 06:31
Worklog Time Spent: 10m 
  Work Description: rmannibucau commented on a change in pull request 
#4965: BEAM-3119 ensure the metrics thread pool is related to an execution
URL: https://github.com/apache/beam/pull/4965#discussion_r180309280
 
 

 ##
 File path: 
runners/direct-java/src/main/java/org/apache/beam/runners/direct/DirectMetrics.java
 ##
 @@ -76,6 +64,8 @@
   private static class DirectMetric {
 private final MetricAggregation aggregation;
 
+private final ExecutorService executorService;
 
 Review comment:
   Fair enough, will move to Executor
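[Editor's illustration] The point of moving from ExecutorService to Executor is dependency narrowing: a class that only submits work should not also hold the power to shut the pool down. A small sketch of the idea (MetricCommitter is a hypothetical name, not the Beam class):

```java
import java.util.concurrent.Executor;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class NarrowDependency {
  static final class MetricCommitter {
    private final Executor executor; // Executor, not ExecutorService
    MetricCommitter(Executor executor) { this.executor = executor; }
    void commit(Runnable update) { executor.execute(update); }
  }

  public static void main(String[] args) throws Exception {
    ExecutorService service = Executors.newSingleThreadExecutor();
    AtomicInteger commits = new AtomicInteger();
    MetricCommitter committer = new MetricCommitter(service);
    committer.commit(commits::incrementAndGet);
    service.shutdown(); // only the owner controls the lifecycle
    service.awaitTermination(5, TimeUnit.SECONDS);
    System.out.println("commits=" + commits.get());
  }
}
```

Lifecycle control stays with whoever created the ExecutorService, which is the concern BEAM-3119 is about.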




Issue Time Tracking
---

Worklog Id: (was: 89276)
Time Spent: 2h 20m  (was: 2h 10m)

> direct-metrics-counter-committer threads are leaking
> 
>
> Key: BEAM-3119
> URL: https://issues.apache.org/jira/browse/BEAM-3119
> Project: Beam
>  Issue Type: Bug
>  Components: runner-direct
>Reporter: Etienne Chauchot
>Assignee: Thomas Groh
>Priority: Major
>  Time Spent: 2h 20m
>  Remaining Estimate: 0h
>



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Jenkins build is back to normal : beam_PostCommit_Python_Verify #4639

2018-04-09 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PerformanceTests_Spark #1572

2018-04-09 Thread Apache Jenkins Server
See 


Changes:

[ccy] Fix missing clock bug in nested TriggerContext

[ccy] Real-time timers shouldn't stall completed step in DirectRunner

[ccy] Set RuntimeValueProvider runtime options in FnApiRunner

[ccy] Add streaming wordcount snippets and test

[ccy] Add Python trigger snippets and tests

--
[...truncated 91.57 KB...]
'apache-beam-testing:bqjob_r2baf5845ef7eecb8_0162ae3902fb_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: 
/usr/lib/google-cloud-sdk/platform/bq/third_party/oauth2client/contrib/gce.py:73:
 UserWarning: You have requested explicit scopes to be used with a GCE service 
account.
Using this argument will have no effect on the actual scopes for tokens
requested. These scopes are set at VM instance creation time and
can't be overridden in the request.

  warnings.warn(_SCOPES_WARNING)
Upload complete.Waiting on bqjob_r2baf5845ef7eecb8_0162ae3902fb_1 ... (0s) 
Current status: RUNNING 
 Waiting on 
bqjob_r2baf5845ef7eecb8_0162ae3902fb_1 ... (0s) Current status: DONE   
2018-04-10 06:23:16,670 4f9052db MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.

2018-04-10 06:23:44,027 4f9052db MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-04-10 06:23:46,332 4f9052db MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 

  ReturnCode:1
STDOUT: 

BigQuery error in load operation: Error processing job
'apache-beam-testing:bqjob_rf09d9eec806b0a8_0162ae3976e1_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: 
/usr/lib/google-cloud-sdk/platform/bq/third_party/oauth2client/contrib/gce.py:73:
 UserWarning: You have requested explicit scopes to be used with a GCE service 
account.
Using this argument will have no effect on the actual scopes for tokens
requested. These scopes are set at VM instance creation time and
can't be overridden in the request.

  warnings.warn(_SCOPES_WARNING)
Upload complete.Waiting on bqjob_rf09d9eec806b0a8_0162ae3976e1_1 ... (0s) 
Current status: RUNNING 
Waiting on 
bqjob_rf09d9eec806b0a8_0162ae3976e1_1 ... (0s) Current status: DONE   
2018-04-10 06:23:46,333 4f9052db MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.

2018-04-10 06:24:13,867 4f9052db MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-04-10 06:24:15,989 4f9052db MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 

  ReturnCode:1
STDOUT: 

BigQuery error in load operation: Error processing job
'apache-beam-testing:bqjob_r64d04313ab9b4da8_0162ae39eb5f_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: 
/usr/lib/google-cloud-sdk/platform/bq/third_party/oauth2client/contrib/gce.py:73:
 UserWarning: You have requested explicit scopes to be used with a GCE service 
account.
Using this argument will have no effect on the actual scopes for tokens
requested. These scopes are set at VM instance creation time and
can't be overridden in the request.

  warnings.warn(_SCOPES_WARNING)
Upload complete.Waiting on bqjob_r64d04313ab9b4da8_0162ae39eb5f_1 ... (0s) 
Current status: RUNNING 
 Waiting on 
bqjob_r64d04313ab9b4da8_0162ae39eb5f_1 ... (0s) Current status: DONE   
2018-04-10 06:24:15,990 4f9052db MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.

2018-04-10 06:24:33,204 4f9052db MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-04-10 06:24:35,346 4f9052db MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 


Jenkins build is back to normal : beam_PostCommit_Java_ValidatesRunner_Flink_Gradle #55

2018-04-09 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Spark_Gradle #41

2018-04-09 Thread Apache Jenkins Server
See 


--
[...truncated 138.85 KB...]
Starting process 'Gradle Test Executor 17'. Working directory: 

 Command: /usr/local/asfpackages/java/jdk1.8.0_152/bin/java 
-Dbeam.spark.test.reuseSparkContext=true 
-DbeamTestPipelineOptions=["--runner=TestSparkRunner","--streaming=false","--enableSparkMetricSinks=false"]
 
-Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager
 -Dorg.gradle.native=false -Dspark.ui.enabled=false 
-Dspark.ui.showConsoleProgress=false 
-javaagent:build/tmp/expandedArchives/org.jacoco.agent-0.7.9.jar_17d4cd69b5d9b44b59ac09d2e4756e43/jacocoagent.jar=destfile=build/jacoco/validatesRunnerBatch.exec,append=true,inclnolocationclasses=false,dumponexit=true,output=file,jmx=false
 -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea 
-cp /home/jenkins/.gradle/caches/4.5.1/workerMain/gradle-worker.jar 
worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test 
Executor 17'
Successfully started process 'Gradle Test Executor 17'
Gradle Test Executor 16 finished executing tests.
Gradle Test Executor 17 started executing tests.

org.apache.beam.runners.spark.translation.streaming.ResumeFromCheckpointStreamingTest
 STANDARD_ERROR
2018-04-10 06:19:44,347 [Test worker] ERROR 
org.apache.beam.runners.spark.translation.streaming.ResumeFromCheckpointStreamingTest
  - !!! setup
2018-04-10 06:19:44,386 [Test worker] ERROR 
org.apache.beam.runners.spark.translation.streaming.ResumeFromCheckpointStreamingTest
  - !!! /home/jenkins/.gradle/caches/4.5.1/workerMain/gradle-worker.jar
2018-04-10 06:19:44,390 [Test worker] ERROR 
org.apache.beam.runners.spark.translation.streaming.ResumeFromCheckpointStreamingTest
  - !!! 

2018-04-10 06:19:44,391 [Test worker] ERROR 
org.apache.beam.runners.spark.translation.streaming.ResumeFromCheckpointStreamingTest
  - !!! 

2018-04-10 06:19:44,392 [Test worker] ERROR 
org.apache.beam.runners.spark.translation.streaming.ResumeFromCheckpointStreamingTest
  - !!! 

2018-04-10 06:19:44,394 [Test worker] ERROR 
org.apache.beam.runners.spark.translation.streaming.ResumeFromCheckpointStreamingTest
  - !!! 

2018-04-10 06:19:44,397 [Test worker] ERROR 
org.apache.beam.runners.spark.translation.streaming.ResumeFromCheckpointStreamingTest
  - !!! 

2018-04-10 06:19:44,397 [Test worker] ERROR 
org.apache.beam.runners.spark.translation.streaming.ResumeFromCheckpointStreamingTest
  - !!! 

2018-04-10 06:19:44,398 [Test worker] ERROR 
org.apache.beam.runners.spark.translation.streaming.ResumeFromCheckpointStreamingTest
  - !!! 

2018-04-10 06:19:44,398 [Test worker] ERROR 
org.apache.beam.runners.spark.translation.streaming.ResumeFromCheckpointStreamingTest
  - !!! 

2018-04-10 06:19:44,399 [Test worker] ERROR 
org.apache.beam.runners.spark.translation.streaming.ResumeFromCheckpointStreamingTest
  - !!! 

2018-04-10 06:19:44,399 [Test worker] ERROR 
org.apache.beam.runners.spark.translation.streaming.ResumeFromCheckpointStreamingTest
  - !!! 

2018-04-10 06:19:44,405 [Test worker] ERROR 
org.apache.beam.runners.spark.translation.streaming.ResumeFromCheckpointStreamingTest
  - !!! 


Build failed in Jenkins: beam_PerformanceTests_MongoDBIO_IT #32

2018-04-09 Thread Apache Jenkins Server
See 


Changes:

[ccy] Fix missing clock bug in nested TriggerContext

[ccy] Real-time timers shouldn't stall completed step in DirectRunner

[ccy] Set RuntimeValueProvider runtime options in FnApiRunner

[ccy] Add streaming wordcount snippets and test

[ccy] Add Python trigger snippets and tests

--
[...truncated 49.48 KB...]
at 
com.mongodb.MongoCollectionImpl.insertMany(MongoCollectionImpl.java:323)
at 
com.mongodb.MongoCollectionImpl.insertMany(MongoCollectionImpl.java:311)
at 
org.apache.beam.sdk.io.mongodb.MongoDbIO$Write$WriteFn.flush(MongoDbIO.java:667)
at 
org.apache.beam.sdk.io.mongodb.MongoDbIO$Write$WriteFn.processElement(MongoDbIO.java:652)
com.mongodb.MongoTimeoutException: Timed out after 3 ms while waiting for a 
server that matches WritableServerSelector. Client view of cluster state is 
{type=UNKNOWN, servers=[{address=35.225.121.250:27017, type=UNKNOWN, 
state=CONNECTING, exception={com.mongodb.MongoSocketOpenException: Exception 
opening socket}, caused by {java.net.SocketTimeoutException: connect timed 
out}}]
at 
com.mongodb.connection.BaseCluster.createTimeoutException(BaseCluster.java:369)
at com.mongodb.connection.BaseCluster.selectServer(BaseCluster.java:101)
at 
com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:75)
at 
com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:71)
at 
com.mongodb.binding.ClusterBinding.getWriteConnectionSource(ClusterBinding.java:68)
at 
com.mongodb.operation.OperationHelper.withConnection(OperationHelper.java:219)
at 
com.mongodb.operation.MixedBulkWriteOperation.execute(MixedBulkWriteOperation.java:168)
at 
com.mongodb.operation.MixedBulkWriteOperation.execute(MixedBulkWriteOperation.java:74)
at com.mongodb.Mongo.execute(Mongo.java:781)
at com.mongodb.Mongo$2.execute(Mongo.java:764)
at 
com.mongodb.MongoCollectionImpl.insertMany(MongoCollectionImpl.java:323)
at 
com.mongodb.MongoCollectionImpl.insertMany(MongoCollectionImpl.java:311)
at 
org.apache.beam.sdk.io.mongodb.MongoDbIO$Write$WriteFn.flush(MongoDbIO.java:667)
at 
org.apache.beam.sdk.io.mongodb.MongoDbIO$Write$WriteFn.processElement(MongoDbIO.java:652)
com.mongodb.MongoTimeoutException: Timed out after 3 ms while waiting for a 
server that matches WritableServerSelector. Client view of cluster state is 
{type=UNKNOWN, servers=[{address=35.225.121.250:27017, type=UNKNOWN, 
state=CONNECTING, exception={com.mongodb.MongoSocketOpenException: Exception 
opening socket}, caused by {java.net.SocketTimeoutException: connect timed 
out}}]
at 
com.mongodb.connection.BaseCluster.createTimeoutException(BaseCluster.java:369)
at com.mongodb.connection.BaseCluster.selectServer(BaseCluster.java:101)
at 
com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:75)
at 
com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:71)
at 
com.mongodb.binding.ClusterBinding.getWriteConnectionSource(ClusterBinding.java:68)
at 
com.mongodb.operation.OperationHelper.withConnection(OperationHelper.java:219)
at 
com.mongodb.operation.MixedBulkWriteOperation.execute(MixedBulkWriteOperation.java:168)
at 
com.mongodb.operation.MixedBulkWriteOperation.execute(MixedBulkWriteOperation.java:74)
at com.mongodb.Mongo.execute(Mongo.java:781)
at com.mongodb.Mongo$2.execute(Mongo.java:764)
at 
com.mongodb.MongoCollectionImpl.insertMany(MongoCollectionImpl.java:323)
at 
com.mongodb.MongoCollectionImpl.insertMany(MongoCollectionImpl.java:311)
at 
org.apache.beam.sdk.io.mongodb.MongoDbIO$Write$WriteFn.flush(MongoDbIO.java:667)
at 
org.apache.beam.sdk.io.mongodb.MongoDbIO$Write$WriteFn.processElement(MongoDbIO.java:652)
com.mongodb.MongoTimeoutException: Timed out after 3 ms while waiting for a server that matches WritableServerSelector. Client view of cluster state is {type=UNKNOWN, servers=[{address=35.225.121.250:27017, type=UNKNOWN, state=CONNECTING, exception={com.mongodb.MongoSocketOpenException: Exception opening socket}, caused by {java.net.SocketTimeoutException: connect timed out}}]
at com.mongodb.connection.BaseCluster.createTimeoutException(BaseCluster.java:369)
at com.mongodb.connection.BaseCluster.selectServer(BaseCluster.java:101)
at com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:75)
at com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:71)
at com.mongodb.binding.ClusterBinding.getWriteConnectionSource(ClusterBinding.java:68)
at com.mongodb.operation.OperationHelpe
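The repeated MongoTimeoutException above means driver-side server selection gave up: before the timeout elapsed, the client's view of the cluster never contained a server matching WritableServerSelector, because the only known server was stuck in CONNECTING after a failed socket open. A toy model of that selection check, illustrative only and not the MongoDB driver's actual code:

```java
import java.util.Arrays;
import java.util.List;
import java.util.NoSuchElementException;

public class ServerSelection {
    enum State { CONNECTED_PRIMARY, CONNECTING, UNKNOWN }

    // A writable-server selector succeeds only when the cluster view
    // contains a reachable primary.
    static String selectWritable(List<State> clusterView) {
        for (int i = 0; i < clusterView.size(); i++) {
            if (clusterView.get(i) == State.CONNECTED_PRIMARY) {
                return "server-" + i;
            }
        }
        // The real driver keeps retrying until serverSelectionTimeout
        // elapses, then throws MongoTimeoutException with the view attached.
        throw new NoSuchElementException("no writable server in view " + clusterView);
    }

    public static void main(String[] args) {
        // The failing view from the log: one server, stuck CONNECTING.
        try {
            selectWritable(Arrays.asList(State.CONNECTING));
        } catch (NoSuchElementException e) {
            System.out.println("selection failed: " + e.getMessage());
        }
        // A view with a reachable primary selects immediately.
        System.out.println(selectWritable(Arrays.asList(State.UNKNOWN, State.CONNECTED_PRIMARY)));
        // prints "server-1"
    }
}
```

In the real driver, the deadline in question is MongoClientOptions' server selection timeout; the fix for logs like the one above is usually a reachable server address, not a longer timeout.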

[jira] [Work logged] (BEAM-2990) support data type MAP

2018-04-09 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-2990?focusedWorklogId=89274&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-89274
 ]

ASF GitHub Bot logged work on BEAM-2990:


Author: ASF GitHub Bot
Created on: 10/Apr/18 06:13
Start Date: 10/Apr/18 06:13
Worklog Time Spent: 10m 
  Work Description: XuMingmin commented on issue #5079: [BEAM-2990] support 
MAP in SQL schema
URL: https://github.com/apache/beam/pull/5079#issuecomment-379985683
 
 
   retest this please


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 89274)
Time Spent: 0.5h  (was: 20m)

> support data type MAP
> -
>
> Key: BEAM-2990
> URL: https://issues.apache.org/jira/browse/BEAM-2990
> Project: Beam
>  Issue Type: New Feature
>  Components: dsl-sql
>Reporter: Xu Mingmin
>Assignee: Xu Mingmin
>Priority: Major
>  Time Spent: 0.5h
>  Remaining Estimate: 0h
>
> support Non-scalar types:
> MAP   Collection of keys mapped to values
> ARRAY Ordered, contiguous collection that may contain duplicates
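As a rough illustration of the two requested non-scalar types (plain Java collections here, not Beam SQL's actual Schema API), a MAP column models keys mapped to values and an ARRAY column models an ordered collection that may contain duplicates:

```java
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class NonScalarTypes {
    public static void main(String[] args) {
        // MAP: collection of keys mapped to values.
        Map<String, Integer> priceByItem = new LinkedHashMap<>();
        priceByItem.put("apple", 3);
        priceByItem.put("pear", 5);

        // ARRAY: ordered, contiguous collection that may contain duplicates.
        List<String> tags = Arrays.asList("sql", "beam", "sql");

        System.out.println(priceByItem.get("pear")); // 5
        System.out.println(tags.get(0));             // sql
        System.out.println(tags.size());             // 3: duplicates are kept
    }
}
```

The names (`priceByItem`, `tags`) are hypothetical; the point is only the lookup-by-key versus lookup-by-position distinction the JIRA description draws.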



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Spark_Gradle #40

2018-04-09 Thread Apache Jenkins Server
See 


Changes:

[ccy] Set RuntimeValueProvider runtime options in FnApiRunner

[ccy] Add Python trigger snippets and tests

--
[...truncated 1.59 MB...]
at scala.collection.immutable.List.foreach(List.scala:392)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:457)
at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
at org.apache.beam.runners.spark.translation.SparkContextFactory.createSparkContext(SparkContextFactory.java:103)
at org.apache.beam.runners.spark.translation.SparkContextFactory.getSparkContext(SparkContextFactory.java:68)
at org.apache.beam.runners.spark.translation.streaming.SparkRunnerStreamingContextFactory.call(SparkRunnerStreamingContextFactory.java:79)
at org.apache.beam.runners.spark.translation.streaming.SparkRunnerStreamingContextFactory.call(SparkRunnerStreamingContextFactory.java:47)
at org.apache.spark.streaming.api.java.JavaStreamingContext$$anonfun$7.apply(JavaStreamingContext.scala:627)
at org.apache.spark.streaming.api.java.JavaStreamingContext$$anonfun$7.apply(JavaStreamingContext.scala:626)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.streaming.StreamingContext$.getOrCreate(StreamingContext.scala:828)
at org.apache.spark.streaming.api.java.JavaStreamingContext$.getOrCreate(JavaStreamingContext.scala:626)
at org.apache.spark.streaming.api.java.JavaStreamingContext.getOrCreate(JavaStreamingContext.scala)
at org.apache.beam.runners.spark.SparkRunner.run(SparkRunner.java:169)
at org.apache.beam.runners.spark.TestSparkRunner.run(TestSparkRunner.java:123)
at org.apache.beam.runners.spark.TestSparkRunner.run(TestSparkRunner.java:83)
at org.apache.beam.sdk.Pipeline.run(Pipeline.java:311)
at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:346)
at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:328)
at org.apache.beam.runners.spark.translation.streaming.CreateStreamTest.testDiscardingMode(CreateStreamTest.java:203)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
at org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:48)
at org.apache.beam.sdk.testing.TestPipeline$1.evaluate(TestPipeline.java:317)
at org.junit.rules.ExpectedException$ExpectedExceptionStatement.evaluate(ExpectedException.java:239)
at org.junit.rules.RunRules.evaluate(RunRules.java:20)
at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecuter.runTestClass(JUnitTestClassExecuter.java:114)
at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecuter.execute(JUnitTestClassExecuter.java:57)
at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassProcessor.processTestClass(JUnitTestClassProcessor.java:66)
at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:35)
at org.gradle.internal.dispatch.ReflectionDispatch.dispatc

Build failed in Jenkins: beam_PerformanceTests_Compressed_TextIOIT_HDFS #31

2018-04-09 Thread Apache Jenkins Server
See 


Changes:

[ccy] Fix missing clock bug in nested TriggerContext

[ccy] Real-time timers shouldn't stall completed step in DirectRunner

[ccy] Set RuntimeValueProvider runtime options in FnApiRunner

[ccy] Add streaming wordcount snippets and test

[ccy] Add Python trigger snippets and tests

--
[...truncated 1.34 KB...]
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Compressed_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins840799922644316423.sh
+ gcloud container clusters get-credentials io-datastores --zone=us-central1-a 
--verbosity=debug
DEBUG: Running gcloud.container.clusters.get-credentials with 
Namespace(__calliope_internal_deepest_parser=ArgumentParser(prog='gcloud.container.clusters.get-credentials',
 usage=None, description='Updates a kubeconfig file with appropriate 
credentials to point\nkubectl at a Container Engine Cluster. By default, 
credentials\nare written to HOME/.kube/config. You can provide an 
alternate\npath by setting the KUBECONFIG environment variable.\n\nSee 
[](https://cloud.google.com/container-engine/docs/kubectl) for\nkubectl 
documentation.', version=None, formatter_class=, conflict_handler='error', add_help=False), 
account=None, api_version=None, authority_selector=None, 
authorization_token_file=None, 
calliope_command=, command_path=['gcloud', 'container', 'clusters', 
'get-credentials'], configuration=None, credential_file_override=None, 
document=None, flatten=None, format=None, h=None, help=None, http_timeout=None, 
log_http=None, name='io-datastores', project=None, quiet=None, 
trace_email=None, trace_log=None, trace_token=None, user_output_enabled=None, 
verbosity='debug', version=None, zone='us-central1-a').
Fetching cluster endpoint and auth data.
DEBUG: Saved kubeconfig to /home/jenkins/.kube/config
kubeconfig entry generated for io-datastores.
INFO: Display format "default".
DEBUG: SDK update checks are disabled.
[beam_PerformanceTests_Compressed_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins5294963284957320695.sh
+ cp /home/jenkins/.kube/config 

[beam_PerformanceTests_Compressed_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins9190218342294334064.sh
+ kubectl 
--kubeconfig=
 create namespace filebasedioithdfs-1523336468108
namespace "filebasedioithdfs-1523336468108" created
[beam_PerformanceTests_Compressed_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins8946222702441894051.sh
++ kubectl config current-context
+ kubectl 
--kubeconfig=
 config set-context gke_apache-beam-testing_us-central1-a_io-datastores 
--namespace=filebasedioithdfs-1523336468108
Context "gke_apache-beam-testing_us-central1-a_io-datastores" modified.
[beam_PerformanceTests_Compressed_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins3859061120594309510.sh
+ rm -rf PerfKitBenchmarker
[beam_PerformanceTests_Compressed_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins4565720781992386969.sh
+ rm -rf .env
[beam_PerformanceTests_Compressed_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins2052677539046158298.sh
+ virtualenv .env --system-site-packages
New python executable in .env/bin/python
Installing setuptools, pip...done.
[beam_PerformanceTests_Compressed_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins7431578202121947154.sh
+ .env/bin/pip install --upgrade setuptools pip
Downloading/unpacking setuptools from 
https://pypi.python.org/packages/20/d7/04a0b689d3035143e2ff288f4b9ee4bf6ed80585cc121c90bfd85a1a8c2e/setuptools-39.0.1-py2.py3-none-any.whl#md5=ca299c7acd13a72e1171a3697f2b99bc
Downloading/unpacking pip from 
https://pypi.python.org/packages/ac/95/a05b56bb975efa78d3557efa36acaf9cf5d2fd0ee0062060493687432e03/pip-9.0.3-py2.py3-none-any.whl#md5=d512ceb964f38ba31addb8142bc657cb
Installing collected packages: setuptools, pip
  Found existing installation: setuptools 2.2
Uninstalling setuptools:
  Successfully uninstalled setuptools
  Found existing installation: pip 1.5.4
Uninstalling pip:
  Successfully uninstalled pip
Successfully installed setuptools pip
Cleaning up...
[beam_PerformanceTests_Compressed_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins296535305220627.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git
Cloning into 'PerfKitBenchmarker'...
[beam_PerformanceTests_Compressed_TextIOIT_HDFS] $ /bin/b

Build failed in Jenkins: beam_PerformanceTests_JDBC #434

2018-04-09 Thread Apache Jenkins Server
See 


Changes:

[ccy] Fix missing clock bug in nested TriggerContext

[ccy] Real-time timers shouldn't stall completed step in DirectRunner

[ccy] Set RuntimeValueProvider runtime options in FnApiRunner

[ccy] Add streaming wordcount snippets and test

[ccy] Add Python trigger snippets and tests

--
[...truncated 51.05 KB...]
[INFO] Excluding 
com.google.api.grpc:proto-google-cloud-spanner-admin-instance-v1:jar:0.1.11 
from the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-cloud-spanner-v1:jar:0.1.11b 
from the shaded jar.
[INFO] Excluding 
com.google.api.grpc:grpc-google-cloud-spanner-admin-database-v1:jar:0.1.11 from 
the shaded jar.
[INFO] Excluding 
com.google.api.grpc:grpc-google-cloud-spanner-admin-instance-v1:jar:0.1.11 from 
the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-longrunning-v1:jar:0.1.11 from 
the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-longrunning-v1:jar:0.1.11 
from the shaded jar.
[INFO] Excluding com.google.cloud.bigtable:bigtable-protos:jar:1.0.0-pre3 from 
the shaded jar.
[INFO] Excluding com.google.cloud.bigtable:bigtable-client-core:jar:1.0.0 from 
the shaded jar.
[INFO] Excluding com.google.auth:google-auth-library-appengine:jar:0.7.0 from 
the shaded jar.
[INFO] Excluding io.opencensus:opencensus-contrib-grpc-util:jar:0.7.0 from the 
shaded jar.
[INFO] Excluding io.opencensus:opencensus-api:jar:0.7.0 from the shaded jar.
[INFO] Excluding io.dropwizard.metrics:metrics-core:jar:3.1.2 from the shaded 
jar.
[INFO] Excluding com.google.protobuf:protobuf-java:jar:3.2.0 from the shaded 
jar.
[INFO] Excluding io.netty:netty-tcnative-boringssl-static:jar:1.1.33.Fork26 
from the shaded jar.
[INFO] Excluding 
com.google.api.grpc:proto-google-cloud-spanner-admin-database-v1:jar:0.1.9 from 
the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-common-protos:jar:0.1.9 from 
the shaded jar.
[INFO] Excluding com.google.api-client:google-api-client:jar:1.22.0 from the 
shaded jar.
[INFO] Excluding com.google.oauth-client:google-oauth-client:jar:1.22.0 from 
the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client:jar:1.22.0 from the 
shaded jar.
[INFO] Excluding org.apache.httpcomponents:httpclient:jar:4.0.1 from the shaded 
jar.
[INFO] Excluding org.apache.httpcomponents:httpcore:jar:4.0.1 from the shaded 
jar.
[INFO] Excluding commons-codec:commons-codec:jar:1.3 from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client-jackson2:jar:1.22.0 
from the shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-dataflow:jar:v1b3-rev221-1.22.0 from the 
shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-clouddebugger:jar:v2-rev8-1.22.0 from the 
shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-storage:jar:v1-rev71-1.22.0 from the shaded 
jar.
[INFO] Excluding com.google.auth:google-auth-library-credentials:jar:0.7.1 from 
the shaded jar.
[INFO] Excluding com.google.auth:google-auth-library-oauth2-http:jar:0.7.1 from 
the shaded jar.
[INFO] Excluding com.google.cloud.bigdataoss:util:jar:1.4.5 from the shaded jar.
[INFO] Excluding com.google.api-client:google-api-client-java6:jar:1.22.0 from 
the shaded jar.
[INFO] Excluding com.google.api-client:google-api-client-jackson2:jar:1.22.0 
from the shaded jar.
[INFO] Excluding com.google.oauth-client:google-oauth-client-java6:jar:1.22.0 
from the shaded jar.
[INFO] Replacing original artifact with shaded artifact.
[INFO] Replacing 

 with 

[INFO] Replacing original test artifact with shaded test artifact.
[INFO] Replacing 

 with 

[INFO] Dependency-reduced POM written at: 

[INFO] 
[INFO] --- maven-failsafe-plugin:2.21.0:integration-test (default) @ 
beam-sdks-java-io-jdbc ---
[INFO] Failsafe report directory: 

[INFO] parallel='all', perCoreThreadCount=true, threadCount=4, 
useUnlimitedThreads=false, threadCountSuites=0, threadCountClasses=0, 
threadCountMethods=0, parallelOptimized=true
[INFO] 
[INFO] --

Build failed in Jenkins: beam_PerformanceTests_Python #1128

2018-04-09 Thread Apache Jenkins Server
See 


Changes:

[ccy] Fix missing clock bug in nested TriggerContext

[ccy] Real-time timers shouldn't stall completed step in DirectRunner

[ccy] Set RuntimeValueProvider runtime options in FnApiRunner

[ccy] Add streaming wordcount snippets and test

[ccy] Add Python trigger snippets and tests

--
[...truncated 61.75 KB...]
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 6 resources
[INFO] 
[INFO] --- maven-assembly-plugin:3.1.0:single (export-go-pkg-sources) @ 
beam-sdks-go ---
[INFO] Reading assembly descriptor: descriptor.xml
[INFO] Building zip: 

[INFO] 
[INFO] --- maven-remote-resources-plugin:1.5:process (process-resource-bundles) 
@ beam-sdks-go ---
[INFO] 
[INFO] --- mvn-golang-wrapper:2.1.6:get (go-get-imports) @ beam-sdks-go ---
[INFO] Prepared command line : bin/go get google.golang.org/grpc 
golang.org/x/oauth2/google google.golang.org/api/storage/v1 
github.com/spf13/cobra cloud.google.com/go/bigquery 
google.golang.org/api/googleapi google.golang.org/api/dataflow/v1b3
[INFO] 
[INFO] --- mvn-golang-wrapper:2.1.6:build (go-build) @ beam-sdks-go ---
[INFO] Prepared command line : bin/go build -buildmode=default -o 

 github.com/apache/beam/sdks/go/cmd/beamctl
[INFO] The Result file has been successfully created : 

[INFO] 
[INFO] --- mvn-golang-wrapper:2.1.6:build (go-build-linux-amd64) @ beam-sdks-go 
---
[INFO] Prepared command line : bin/go build -buildmode=default -o 

 github.com/apache/beam/sdks/go/cmd/beamctl
[INFO] The Result file has been successfully created : 

[INFO] 
[INFO] --- maven-checkstyle-plugin:3.0.0:check (default) @ beam-sdks-go ---
[INFO] 
[INFO] --- mvn-golang-wrapper:2.1.6:test (go-test) @ beam-sdks-go ---
[INFO] Prepared command line : bin/go test ./...
[INFO] 
[INFO] -Exec.Out-
[INFO] ?github.com/apache/beam/sdks/go/cmd/beamctl  [no test files]
[INFO] ?github.com/apache/beam/sdks/go/cmd/beamctl/cmd  [no test files]
[INFO] ?github.com/apache/beam/sdks/go/cmd/specialize   [no test files]
[INFO] ?github.com/apache/beam/sdks/go/cmd/symtab   [no test files]
[INFO] ok   github.com/apache/beam/sdks/go/pkg/beam 0.021s
[INFO] ok   github.com/apache/beam/sdks/go/pkg/beam/artifact0.079s
[INFO] 
[ERROR] 
[ERROR] -Exec.Err-
[ERROR] # github.com/apache/beam/sdks/go/pkg/beam/util/gcsx
[ERROR] github.com/apache/beam/sdks/go/pkg/beam/util/gcsx/gcs.go:46:37: 
undefined: option.WithoutAuthentication
[ERROR] 
[INFO] 
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Beam :: Parent .. SUCCESS [  6.128 s]
[INFO] Apache Beam :: SDKs :: Java :: Build Tools . SUCCESS [  4.800 s]
[INFO] Apache Beam :: Model ... SUCCESS [  0.102 s]
[INFO] Apache Beam :: Model :: Pipeline ... SUCCESS [ 16.786 s]
[INFO] Apache Beam :: Model :: Job Management . SUCCESS [  5.187 s]
[INFO] Apache Beam :: Model :: Fn Execution ... SUCCESS [  4.589 s]
[INFO] Apache Beam :: SDKs  SUCCESS [  0.384 s]
[INFO] Apache Beam :: SDKs :: Go .. FAILURE [ 30.145 s]
[INFO] Apache Beam :: SDKs :: Go :: Container . SKIPPED
[INFO] Apache Beam :: SDKs :: Java  SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: Core  SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: Fn Execution  SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: Extensions .. SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: Extensions :: Google Cloud Platform Core 
SKIPPED
[INFO] Apache Beam :: Runners . SKIPPED
[INFO] Apache Beam :: Runners :: Core Construction Java ... SKIPPED
[INFO] Apache Beam :: Runners :: Core Java  SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: Harness . SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: Container ... SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: IO .. SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: IO :: Amazon Web Services SKIPPED
[INFO] Apache Beam :: Runners :: Local Java Core .. SKIPPED
[INFO] Apache Beam :: Runners :: Direct Java .. SKIPPED

Build failed in Jenkins: beam_PostCommit_Java_GradleBuild #17

2018-04-09 Thread Apache Jenkins Server
See 


Changes:

[ccy] Set RuntimeValueProvider runtime options in FnApiRunner

--
[...truncated 16.15 MB...]
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample 
as view/GBKaSVForKeys as step s23
Apr 10, 2018 5:51:22 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample 
as view/ParDo(ToIsmMetadataRecordForKey) as step s24
Apr 10, 2018 5:51:22 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample 
as view/Flatten.PCollections as step s25
Apr 10, 2018 5:51:22 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample 
as view/CreateDataflowView as step s26
Apr 10, 2018 5:51:22 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Partition 
input as step s27
Apr 10, 2018 5:51:22 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Group by 
partition as step s28
Apr 10, 2018 5:51:22 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Batch 
mutations together as step s29
Apr 10, 2018 5:51:22 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Write 
mutations to Spanner as step s30
Apr 10, 2018 5:51:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging pipeline description to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testwrite-jenkins-0410055118-154bc5c9/output/results/staging/
Apr 10, 2018 5:51:22 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading <80747 bytes, hash exbXj114a-IApKBCTdejDQ> to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testwrite-jenkins-0410055118-154bc5c9/output/results/staging/pipeline-exbXj114a-IApKBCTdejDQ.pb

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testWrite STANDARD_OUT
Dataflow SDK version: 2.5.0-SNAPSHOT

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testWrite STANDARD_ERROR
Apr 10, 2018 5:51:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-04-09_22_51_23-687610027897140620?project=apache-beam-testing

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testWrite STANDARD_OUT
Submitted job: 2018-04-09_22_51_23-687610027897140620

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testWrite STANDARD_ERROR
Apr 10, 2018 5:51:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2018-04-09_22_51_23-687610027897140620
Apr 10, 2018 5:51:23 AM org.apache.beam.runners.dataflow.TestDataflowRunner 
run
INFO: Running Dataflow job 2018-04-09_22_51_23-687610027897140620 with 0 
expected assertions.
Apr 10, 2018 5:51:29 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-10T05:51:23.084Z: Autoscaling is enabled for job 
2018-04-09_22_51_23-687610027897140620. The number of workers will be between 1 
and 1000.
Apr 10, 2018 5:51:29 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-10T05:51:23.098Z: Autoscaling was automatically enabled for 
job 2018-04-09_22_51_23-687610027897140620.
Apr 10, 2018 5:51:29 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-10T05:51:25.330Z: Checking required Cloud APIs are enabled.
Apr 10, 2018 5:51:29 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-10T05:51:25.383Z: Checking permissions granted to controller 
Service Account.
Apr 10, 2018 5:51:29 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-10T05:51:28.748Z: Expanding CoGroupByKey operations into 
optimizable parts.
Apr 10, 2018 5:51:29 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-10T05:51:28.867Z: Expanding GroupByKey operations into 
optimizable parts.
Apr 10, 2018 5:51:29 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-10T05:5

Build failed in Jenkins: beam_PostCommit_Java_GradleBuild #16

2018-04-09 Thread Apache Jenkins Server
See 


Changes:

[ccy] Add streaming wordcount snippets and test

--
[...truncated 16.10 MB...]
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample 
as view/ParDo(ToIsmMetadataRecordForSize) as step s22
Apr 10, 2018 5:49:22 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample 
as view/GBKaSVForKeys as step s23
Apr 10, 2018 5:49:22 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample 
as view/ParDo(ToIsmMetadataRecordForKey) as step s24
Apr 10, 2018 5:49:22 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample 
as view/Flatten.PCollections as step s25
Apr 10, 2018 5:49:22 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample 
as view/CreateDataflowView as step s26
Apr 10, 2018 5:49:22 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Partition 
input as step s27
Apr 10, 2018 5:49:22 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Group by 
partition as step s28
Apr 10, 2018 5:49:22 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Batch 
mutations together as step s29
Apr 10, 2018 5:49:22 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Write 
mutations to Spanner as step s30
Apr 10, 2018 5:49:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging pipeline description to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testwrite-jenkins-0410054913-134bf741/output/results/staging/
Apr 10, 2018 5:49:22 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading <80747 bytes, hash YsMZX5EnOVC4OLLTYWYOfg> to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testwrite-jenkins-0410054913-134bf741/output/results/staging/pipeline-YsMZX5EnOVC4OLLTYWYOfg.pb

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testWrite STANDARD_OUT
Dataflow SDK version: 2.5.0-SNAPSHOT

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testWrite STANDARD_ERROR
Apr 10, 2018 5:49:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-04-09_22_49_22-10182743361825982995?project=apache-beam-testing

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testWrite STANDARD_OUT
Submitted job: 2018-04-09_22_49_22-10182743361825982995

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testWrite STANDARD_ERROR
Apr 10, 2018 5:49:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2018-04-09_22_49_22-10182743361825982995
Apr 10, 2018 5:49:23 AM org.apache.beam.runners.dataflow.TestDataflowRunner 
run
INFO: Running Dataflow job 2018-04-09_22_49_22-10182743361825982995 with 0 
expected assertions.

org.apache.beam.sdk.io.gcp.spanner.SpannerReadIT > testRead STANDARD_ERROR
Apr 10, 2018 5:49:26 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-10T05:49:25.663Z: Autoscaling: Resized worker pool from 7 to 0.
Apr 10, 2018 5:49:26 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-10T05:49:25.690Z: Autoscaling: Would further reduce the 
number of workers but reached the minimum number allowed for the job.
Apr 10, 2018 5:49:26 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-10T05:49:25.723Z: Worker pool stopped.

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testWrite STANDARD_ERROR
Apr 10, 2018 5:49:32 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-10T05:49:22.913Z: Autoscaling is enabled for job 
2018-04-09_22_49_22-10182743361825982995. The number of workers will be between 
1 and 1000.
Apr 10, 2018 5:49:32 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-10T05:49:22.941Z: Autoscali

[beam] 01/01: Merge pull request #5077 from charlesccychen/trigger-snippets

2018-04-09 Thread altay
This is an automated email from the ASF dual-hosted git repository.

altay pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git

commit cd546e28d9083a705d029c9437a2297534372f93
Merge: 2913cbd 2fcf84e
Author: Ahmet Altay 
AuthorDate: Mon Apr 9 22:48:17 2018 -0700

Merge pull request #5077 from charlesccychen/trigger-snippets

Add Python trigger snippets and tests

 .../apache_beam/examples/snippets/snippets.py  |   1 -
 .../apache_beam/examples/snippets/snippets_test.py | 126 +
 2 files changed, 126 insertions(+), 1 deletion(-)

-- 
To stop receiving notification emails like this one, please contact
al...@apache.org.


[beam] branch master updated (2913cbd -> cd546e2)

2018-04-09 Thread altay
This is an automated email from the ASF dual-hosted git repository.

altay pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git.


from 2913cbd  Merge pull request #5078 from 
charlesccychen/fnapi-runner-valueprovider
 add 2fcf84e  Add Python trigger snippets and tests
 new cd546e2  Merge pull request #5077 from charlesccychen/trigger-snippets

The 1 revision listed above as "new" is entirely new to this
repository and will be described in a separate email. The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 .../apache_beam/examples/snippets/snippets.py  |   1 -
 .../apache_beam/examples/snippets/snippets_test.py | 126 +
 2 files changed, 126 insertions(+), 1 deletion(-)

-- 
To stop receiving notification emails like this one, please contact
al...@apache.org.


Build failed in Jenkins: beam_PostCommit_Python_Verify #4638

2018-04-09 Thread Apache Jenkins Server
See 


Changes:

[robertwb] Revert "Revert #4781 which broke Python postsubmits"

[robertwb] Guard side input mutation for Fn API mode only.

[ehudm] Allow longs as input to Timestamp.of().

[ccy] Fix missing clock bug in nested TriggerContext

[ccy] Real-time timers shouldn't stall completed step in DirectRunner

[ccy] Set RuntimeValueProvider runtime options in FnApiRunner

[ccy] Add streaming wordcount snippets and test

--
[...truncated 1.04 MB...]
copying apache_beam/runners/portability/fn_api_runner_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/universal_local_runner.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/universal_local_runner_main.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/universal_local_runner_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/test/__init__.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/test
copying apache_beam/runners/worker/__init__.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/bundle_processor.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/data_plane.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/data_plane_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/log_handler.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/log_handler_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/logger.pxd -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/logger.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/logger_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/opcounters.pxd -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/opcounters.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/opcounters_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/operation_specs.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/operations.pxd -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/operations.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/sdk_worker.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/sdk_worker_main.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/sdk_worker_main_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/sdk_worker_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/sideinputs.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/sideinputs_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/statesampler.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/statesampler_fast.pyx -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/statesampler_slow.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/statesampler_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/worker_id_interceptor.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/worker_id_interceptor_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/testing/__init__.py -> 
apache-beam-2.5.0.dev0/apache_beam/testing
copying apache_beam/testing/pipeline_verifiers.py -> 
apache-beam-2.5.0.dev0/apache_beam/testing
copying apache_beam/testing/pipeline_verifiers_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/testing
copying apache_beam/testing/test_pipeline.py -> 
apache-beam-2.5.0.dev0/apache_beam/testing
copying apache_beam/testing/test_pipeline_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/testing
copying apache_beam/testing/test_stream.py -> 
apache-beam-2.5.0.dev0/apache_beam/testing
copying apache_beam/testing/test_stream_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/testing
copying apache_beam/testing/test_utils.py -> 
apache-beam-2.5.0.dev0/apache_beam/testing
copying apache_beam/testing/test_utils_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/testing
copying apache_beam/testing/util.py -> 
apache-beam-2.5.0.dev0/apache_beam/testing

[jira] [Work logged] (BEAM-3981) Futurize and fix python 2 compatibility for coders package

2018-04-09 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3981?focusedWorklogId=89268&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-89268
 ]

ASF GitHub Bot logged work on BEAM-3981:


Author: ASF GitHub Bot
Created on: 10/Apr/18 05:39
Start Date: 10/Apr/18 05:39
Worklog Time Spent: 10m 
  Work Description: aaltay commented on issue #5053: [BEAM-3981] Futurize 
coders subpackage
URL: https://github.com/apache/beam/pull/5053#issuecomment-379980133
 
 
   R: @charlesccychen  cc: @tvalentyn 
   
   Some high level comments:
   - six could be handy in some cases, for example `six.string_types` could be 
used instead of
   ```
   try:
     unicode  # pylint: disable=unicode-builtin
   except NameError:
     unicode = str
   ```
   What do you think about that?
   - Is there a reason to change cython version?
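For context on the trade-off being discussed, here is a minimal, self-contained sketch (the `text_type` and `is_text` names are ours, not from the PR): it uses the hand-rolled shim quoted above and notes the `six.string_types` alternative in comments.

```python
# The hand-rolled compatibility shim quoted in the review comment:
# on Python 3 the unicode builtin is gone, so fall back to str.
try:
    text_type = unicode  # pylint: disable=undefined-variable
except NameError:
    text_type = str

# six.string_types, the alternative suggested above, is simply a tuple
# of the text types for the running interpreter, so the check would
# read isinstance(value, six.string_types) instead.
def is_text(value):
    return isinstance(value, text_type)

print(is_text("hello"), is_text(b"bytes"))
```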


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 89268)
Time Spent: 6.5h  (was: 6h 20m)

> Futurize and fix python 2 compatibility for coders package
> --
>
> Key: BEAM-3981
> URL: https://issues.apache.org/jira/browse/BEAM-3981
> Project: Beam
>  Issue Type: Sub-task
>  Components: sdk-py-core
>Reporter: Robbe
>Assignee: Ahmet Altay
>Priority: Major
>  Time Spent: 6.5h
>  Remaining Estimate: 0h
>
> Run automatic conversion with futurize tool on coders subpackage and fix 
> python 2 compatibility. This prepares the subpackage for python 3 support.
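As an illustration of the kind of mechanical rewrite the futurize tool performs (example ours, not taken from the PR): a Python 2-only dict-iteration idiom and its 2/3-compatible replacement.

```python
# Python 2 only:  for k, v in d.iteritems(): ...
# futurize rewrites .iteritems() to .items(), which runs on both
# interpreters (at the cost of building a list on Python 2).
d = {"a": 1, "b": 2}
total = 0
for _, v in d.items():
    total += v
print(total)  # 3
```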



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Build failed in Jenkins: beam_PostCommit_Python_ValidatesRunner_Dataflow #1303

2018-04-09 Thread Apache Jenkins Server
See 


--
[...truncated 1.55 KB...]
 > git checkout -f 2913cbd10d39d890b36a15dca0fc7000701e600e
Commit message: "Merge pull request #5078 from 
charlesccychen/fnapi-runner-valueprovider"
 > git rev-list --no-walk 2913cbd10d39d890b36a15dca0fc7000701e600e # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PostCommit_Python_ValidatesRunner_Dataflow] $ /bin/bash -xe 
/tmp/jenkins4847091541943960495.sh
+ cd src
+ bash sdks/python/run_validatesrunner.sh

# pip install --user installation location.
LOCAL_PATH=$HOME/.local/bin/

# INFRA does not install virtualenv
pip install virtualenv --user
Requirement already satisfied: virtualenv in 
/home/jenkins/.local/lib/python2.7/site-packages
/usr/local/lib/python2.7/dist-packages/pip/_vendor/urllib3/util/ssl_.py:339: 
SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name 
Indication) extension to TLS is not available on this platform. This may cause 
the server to present an incorrect TLS certificate, which can cause validation 
failures. You can upgrade to a newer version of Python to solve this. For more 
information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  SNIMissingWarning
/usr/local/lib/python2.7/dist-packages/pip/_vendor/urllib3/util/ssl_.py:137: 
InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  InsecurePlatformWarning

# Virtualenv for the rest of the script to run setup & e2e tests
${LOCAL_PATH}/virtualenv sdks/python
New python executable in 

Installing setuptools, pip, wheel...done.
. sdks/python/bin/activate
# This file must be used with "source bin/activate" *from bash*
# you cannot run it directly

deactivate () {
unset -f pydoc >/dev/null 2>&1

# reset old environment variables
# ! [ -z ${VAR+_} ] returns true if VAR is declared at all
if ! [ -z "${_OLD_VIRTUAL_PATH+_}" ] ; then
PATH="$_OLD_VIRTUAL_PATH"
export PATH
unset _OLD_VIRTUAL_PATH
fi
if ! [ -z "${_OLD_VIRTUAL_PYTHONHOME+_}" ] ; then
PYTHONHOME="$_OLD_VIRTUAL_PYTHONHOME"
export PYTHONHOME
unset _OLD_VIRTUAL_PYTHONHOME
fi

# This should detect bash and zsh, which have a hash command that must
# be called to get it to forget past commands.  Without forgetting
# past commands the $PATH changes we made may not be respected
if [ -n "${BASH-}" ] || [ -n "${ZSH_VERSION-}" ] ; then
hash -r 2>/dev/null
fi

if ! [ -z "${_OLD_VIRTUAL_PS1+_}" ] ; then
PS1="$_OLD_VIRTUAL_PS1"
export PS1
unset _OLD_VIRTUAL_PS1
fi

unset VIRTUAL_ENV
if [ ! "${1-}" = "nondestructive" ] ; then
# Self destruct!
unset -f deactivate
fi
}

# unset irrelevant variables
deactivate nondestructive

VIRTUAL_ENV="
export VIRTUAL_ENV

_OLD_VIRTUAL_PATH="$PATH"
PATH="$VIRTUAL_ENV/bin:$PATH"
export PATH

# unset PYTHONHOME if set
if ! [ -z "${PYTHONHOME+_}" ] ; then
_OLD_VIRTUAL_PYTHONHOME="$PYTHONHOME"
unset PYTHONHOME
fi

if [ -z "${VIRTUAL_ENV_DISABLE_PROMPT-}" ] ; then
_OLD_VIRTUAL_PS1="$PS1"
if [ "x" != x ] ; then
PS1="$PS1"
else
PS1="(`basename \"$VIRTUAL_ENV\"`) $PS1"
fi
export PS1
fi
basename "$VIRTUAL_ENV"

# Make sure to unalias pydoc if it's already there
alias pydoc 2>/dev/null >/dev/null && unalias pydoc

pydoc () {
python -m pydoc "$@"
}

# This should detect bash and zsh, which have a hash command that must
# be called to get it to forget past commands.  Without forgetting
# past commands the $PATH changes we made may not be respected
if [ -n "${BASH-}" ] || [ -n "${ZSH_VERSION-}" ] ; then
hash -r 2>/dev/null
fi
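The save/restore dance in the activate script above (stash `_OLD_VIRTUAL_PATH`, put it back in `deactivate`, unset variables that were never set) is a general pattern. A rough Python equivalent, under our own names (`EnvSnapshot` is not part of virtualenv), might look like:

```python
import os

class EnvSnapshot:
    """Save selected environment variables on entry and restore them on
    exit, mirroring what virtualenv's activate/deactivate pair does for
    PATH, PYTHONHOME, and PS1."""

    def __init__(self, *names):
        self.names = names
        self.saved = {}

    def __enter__(self):
        for name in self.names:
            self.saved[name] = os.environ.get(name)  # None if unset
        return self

    def __exit__(self, *exc):
        for name, value in self.saved.items():
            if value is None:
                os.environ.pop(name, None)  # was unset: unset it again
            else:
                os.environ[name] = value    # restore the old value
        return False

# Prepend a (hypothetical) venv bin dir, then restore PATH on exit.
with EnvSnapshot("PATH"):
    os.environ["PATH"] = "/tmp/venv/bin:" + os.environ.get("PATH", "")
```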
cd sdks/python
pip install -e .[gcp,test]
Obtaining 
file://
Collecting avro<2.0.0,>=1.8.1 (from apache-beam==2.5.0.dev0)


Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Flink_Gradle #53

2018-04-09 Thread Apache Jenkins Server
See 


Changes:

[ccy] Set RuntimeValueProvider runtime options in FnApiRunner

--
[...truncated 87.47 MB...]
Apr 10, 2018 5:27:49 AM org.apache.flink.runtime.taskmanager.Task run
INFO: Ensuring all FileSystem streams are closed for task 
View.AsSingleton/Combine.GloballyAsSingletonView/View.CreatePCollectionView/Combine.globally(Concatenate)/Combine.perKey(Concatenate)
 -> 
View.AsSingleton/Combine.GloballyAsSingletonView/View.CreatePCollectionView/Combine.globally(Concatenate)/Values/Values/Map/ParMultiDo(Anonymous)
 -> (Map, Map) (1/1) (2b2769c83c4f59bd14c7aa9d1bb3b76f) [FINISHED]
Apr 10, 2018 5:27:49 AM org.apache.flink.runtime.taskmanager.Task 
transitionState
INFO: 
Combine.globally(TestCombineFnWithContext)/Combine.perKey(TestCombineFnWithContext)/Combine.GroupedValues/ParDo(Anonymous)/ParMultiDo(Anonymous)
 -> 
Combine.globally(TestCombineFnWithContext)/Values/Values/Map/ParMultiDo(Anonymous)
 -> PAssert$136/GroupGlobally/Window.Into()/Window.Assign.out -> 
PAssert$136/GroupGlobally/GatherAllOutputs/Reify.Window/ParDo(Anonymous)/ParMultiDo(Anonymous)
 -> 
PAssert$136/GroupGlobally/GatherAllOutputs/WithKeys/AddKeys/Map/ParMultiDo(Anonymous)
 -> PAssert$136/GroupGlobally/GatherAllOutputs/Window.Into()/Window.Assign.out 
-> ToKeyedWorkItem (1/1) (83a2f4b6d872793b38884a34c24d2596) switched from 
RUNNING to FINISHED.
Apr 10, 2018 5:27:49 AM org.apache.flink.runtime.taskmanager.Task run
INFO: Freeing task resources for 
Combine.globally(TestCombineFnWithContext)/Combine.perKey(TestCombineFnWithContext)/Combine.GroupedValues/ParDo(Anonymous)/ParMultiDo(Anonymous)
 -> 
Combine.globally(TestCombineFnWithContext)/Values/Values/Map/ParMultiDo(Anonymous)
 -> PAssert$136/GroupGlobally/Window.Into()/Window.Assign.out -> 
PAssert$136/GroupGlobally/GatherAllOutputs/Reify.Window/ParDo(Anonymous)/ParMultiDo(Anonymous)
 -> 
PAssert$136/GroupGlobally/GatherAllOutputs/WithKeys/AddKeys/Map/ParMultiDo(Anonymous)
 -> PAssert$136/GroupGlobally/GatherAllOutputs/Window.Into()/Window.Assign.out 
-> ToKeyedWorkItem (1/1) (83a2f4b6d872793b38884a34c24d2596).
Apr 10, 2018 5:27:49 AM org.apache.flink.runtime.taskmanager.Task run
INFO: Ensuring all FileSystem streams are closed for task 
Combine.globally(TestCombineFnWithContext)/Combine.perKey(TestCombineFnWithContext)/Combine.GroupedValues/ParDo(Anonymous)/ParMultiDo(Anonymous)
 -> 
Combine.globally(TestCombineFnWithContext)/Values/Values/Map/ParMultiDo(Anonymous)
 -> PAssert$136/GroupGlobally/Window.Into()/Window.Assign.out -> 
PAssert$136/GroupGlobally/GatherAllOutputs/Reify.Window/ParDo(Anonymous)/ParMultiDo(Anonymous)
 -> 
PAssert$136/GroupGlobally/GatherAllOutputs/WithKeys/AddKeys/Map/ParMultiDo(Anonymous)
 -> PAssert$136/GroupGlobally/GatherAllOutputs/Window.Into()/Window.Assign.out 
-> ToKeyedWorkItem (1/1) (83a2f4b6d872793b38884a34c24d2596) [FINISHED]
Apr 10, 2018 5:27:49 AM org.apache.flink.runtime.taskmanager.Task 
transitionState
INFO: PAssert$136/GroupGlobally/GatherAllOutputs/GroupByKey -> 
PAssert$136/GroupGlobally/GatherAllOutputs/Values/Values/Map/ParMultiDo(Anonymous)
 -> PAssert$136/GroupGlobally/RewindowActuals/Window.Assign.out -> 
PAssert$136/GroupGlobally/KeyForDummy/AddKeys/Map/ParMultiDo(Anonymous) (1/1) 
(91071fa051ccc00e05ec666297c196b3) switched from RUNNING to FINISHED.
Apr 10, 2018 5:27:49 AM org.apache.flink.runtime.taskmanager.Task run
INFO: Freeing task resources for 
PAssert$136/GroupGlobally/GatherAllOutputs/GroupByKey -> 
PAssert$136/GroupGlobally/GatherAllOutputs/Values/Values/Map/ParMultiDo(Anonymous)
 -> PAssert$136/GroupGlobally/RewindowActuals/Window.Assign.out -> 
PAssert$136/GroupGlobally/KeyForDummy/AddKeys/Map/ParMultiDo(Anonymous) (1/1) 
(91071fa051ccc00e05ec666297c196b3).
Apr 10, 2018 5:27:49 AM org.apache.flink.runtime.taskmanager.Task run
INFO: Ensuring all FileSystem streams are closed for task 
PAssert$136/GroupGlobally/GatherAllOutputs/GroupByKey -> 
PAssert$136/GroupGlobally/GatherAllOutputs/Values/Values/Map/ParMultiDo(Anonymous)
 -> PAssert$136/GroupGlobally/RewindowActuals/Window.Assign.out -> 
PAssert$136/GroupGlobally/KeyForDummy/AddKeys/Map/ParMultiDo(Anonymous) (1/1) 
(91071fa051ccc00e05ec666297c196b3) [FINISHED]
Apr 10, 2018 5:27:49 AM grizzled.slf4j.Logger info
INFO: Un-registering task and sending final execution state FINISHED to 
JobManager for task PAssert$134/GroupGlobally/GroupDummyAndContents -> 
PAssert$134/GroupGlobally/Values/Values/Map/ParMultiDo(Anonymous) -> 
PAssert$134/GroupGlobally/ParDo(Concat)/ParMultiDo(Concat) -> 
PAssert$134/GetPane/Map/ParMultiDo(Anonymous) -> 
PAssert$134/RunChecks/ParMultiDo(GroupedValuesChecker) -> 
PAssert$134/VerifyAssertions/ParDo(DefaultConclude)/ParMultiDo(DefaultConclude) 
(6854e19b92cf6ea110ccc1116edb1e7f)
Apr 10, 2018 5:27:49 AM org.apache.flink.runtime

[jira] [Work logged] (BEAM-2990) support data type MAP

2018-04-09 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-2990?focusedWorklogId=89259&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-89259
 ]

ASF GitHub Bot logged work on BEAM-2990:


Author: ASF GitHub Bot
Created on: 10/Apr/18 05:14
Start Date: 10/Apr/18 05:14
Worklog Time Spent: 10m 
  Work Description: XuMingmin commented on issue #5079: [BEAM-2990] support 
MAP in SQL schema
URL: https://github.com/apache/beam/pull/5079#issuecomment-379976290
 
 
   R: + @akedin 
   
   Can you guys take a look and let's try to finish it before 2.5 cutoff? 
Thanks!


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 89259)
Time Spent: 20m  (was: 10m)

> support data type MAP
> -
>
> Key: BEAM-2990
> URL: https://issues.apache.org/jira/browse/BEAM-2990
> Project: Beam
>  Issue Type: New Feature
>  Components: dsl-sql
>Reporter: Xu Mingmin
>Assignee: Xu Mingmin
>Priority: Major
>  Time Spent: 20m
>  Remaining Estimate: 0h
>
> support Non-scalar types:
> MAP   Collection of keys mapped to values
> ARRAY Ordered, contiguous collection that may contain duplicates



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Flink_Gradle #52

2018-04-09 Thread Apache Jenkins Server
See 


Changes:

[ccy] Add streaming wordcount snippets and test

--
[...truncated 82.41 MB...]
Apr 10, 2018 5:13:20 AM org.apache.flink.runtime.taskmanager.Task run
INFO: Ensuring all FileSystem streams are closed for task 
Combine.perKey(TestCombineFnWithContext)/Combine.GroupedValues/ParDo(Anonymous)/ParMultiDo(Anonymous)
 -> PAssert$135/GroupGlobally/Window.Into()/Window.Assign.out -> 
PAssert$135/GroupGlobally/GatherAllOutputs/Reify.Window/ParDo(Anonymous)/ParMultiDo(Anonymous)
 -> 
PAssert$135/GroupGlobally/GatherAllOutputs/WithKeys/AddKeys/Map/ParMultiDo(Anonymous)
 -> PAssert$135/GroupGlobally/GatherAllOutputs/Window.Into()/Window.Assign.out 
-> ToKeyedWorkItem (1/1) (7114b61bc520409f44880da44c5f90fe) [FINISHED]
Apr 10, 2018 5:13:20 AM org.apache.flink.runtime.taskmanager.Task run
INFO: Freeing task resources for 
PAssert$135/GroupGlobally/GatherAllOutputs/GroupByKey -> 
PAssert$135/GroupGlobally/GatherAllOutputs/Values/Values/Map/ParMultiDo(Anonymous)
 -> PAssert$135/GroupGlobally/RewindowActuals/Window.Assign.out -> 
PAssert$135/GroupGlobally/KeyForDummy/AddKeys/Map/ParMultiDo(Anonymous) (1/1) 
(42964f81117ac3732af86b2577a10198).
Apr 10, 2018 5:13:20 AM org.apache.flink.runtime.taskmanager.Task run
INFO: Ensuring all FileSystem streams are closed for task 
PAssert$135/GroupGlobally/GatherAllOutputs/GroupByKey -> 
PAssert$135/GroupGlobally/GatherAllOutputs/Values/Values/Map/ParMultiDo(Anonymous)
 -> PAssert$135/GroupGlobally/RewindowActuals/Window.Assign.out -> 
PAssert$135/GroupGlobally/KeyForDummy/AddKeys/Map/ParMultiDo(Anonymous) (1/1) 
(42964f81117ac3732af86b2577a10198) [FINISHED]
Apr 10, 2018 5:13:20 AM org.apache.flink.runtime.taskmanager.Task 
transitionState
INFO: PAssert$134/GroupGlobally/GroupDummyAndContents -> 
PAssert$134/GroupGlobally/Values/Values/Map/ParMultiDo(Anonymous) -> 
PAssert$134/GroupGlobally/ParDo(Concat)/ParMultiDo(Concat) -> 
PAssert$134/GetPane/Map/ParMultiDo(Anonymous) -> 
PAssert$134/RunChecks/ParMultiDo(GroupedValuesChecker) -> 
PAssert$134/VerifyAssertions/ParDo(DefaultConclude)/ParMultiDo(DefaultConclude) 
(1/1) (9ed1e90e2d84bb600fc2ea120cd8b99c) switched from RUNNING to FINISHED.
Apr 10, 2018 5:13:20 AM org.apache.flink.runtime.taskmanager.Task run
INFO: Freeing task resources for 
PAssert$134/GroupGlobally/GroupDummyAndContents -> 
PAssert$134/GroupGlobally/Values/Values/Map/ParMultiDo(Anonymous) -> 
PAssert$134/GroupGlobally/ParDo(Concat)/ParMultiDo(Concat) -> 
PAssert$134/GetPane/Map/ParMultiDo(Anonymous) -> 
PAssert$134/RunChecks/ParMultiDo(GroupedValuesChecker) -> 
PAssert$134/VerifyAssertions/ParDo(DefaultConclude)/ParMultiDo(DefaultConclude) 
(1/1) (9ed1e90e2d84bb600fc2ea120cd8b99c).
Apr 10, 2018 5:13:20 AM org.apache.flink.runtime.taskmanager.Task run
INFO: Ensuring all FileSystem streams are closed for task 
PAssert$134/GroupGlobally/GroupDummyAndContents -> 
PAssert$134/GroupGlobally/Values/Values/Map/ParMultiDo(Anonymous) -> 
PAssert$134/GroupGlobally/ParDo(Concat)/ParMultiDo(Concat) -> 
PAssert$134/GetPane/Map/ParMultiDo(Anonymous) -> 
PAssert$134/RunChecks/ParMultiDo(GroupedValuesChecker) -> 
PAssert$134/VerifyAssertions/ParDo(DefaultConclude)/ParMultiDo(DefaultConclude) 
(1/1) (9ed1e90e2d84bb600fc2ea120cd8b99c) [FINISHED]
Apr 10, 2018 5:13:20 AM org.apache.flink.runtime.taskmanager.Task 
transitionState
INFO: PAssert$135/GroupGlobally/GroupDummyAndContents -> 
PAssert$135/GroupGlobally/Values/Values/Map/ParMultiDo(Anonymous) -> 
PAssert$135/GroupGlobally/ParDo(Concat)/ParMultiDo(Concat) -> 
PAssert$135/GetPane/Map/ParMultiDo(Anonymous) -> 
PAssert$135/RunChecks/ParMultiDo(GroupedValuesChecker) -> 
PAssert$135/VerifyAssertions/ParDo(DefaultConclude)/ParMultiDo(DefaultConclude) 
(1/1) (16bcebec01633c9b466acf68054370c6) switched from RUNNING to FINISHED.
Apr 10, 2018 5:13:20 AM org.apache.flink.runtime.taskmanager.Task run
INFO: Freeing task resources for 
PAssert$135/GroupGlobally/GroupDummyAndContents -> 
PAssert$135/GroupGlobally/Values/Values/Map/ParMultiDo(Anonymous) -> 
PAssert$135/GroupGlobally/ParDo(Concat)/ParMultiDo(Concat) -> 
PAssert$135/GetPane/Map/ParMultiDo(Anonymous) -> 
PAssert$135/RunChecks/ParMultiDo(GroupedValuesChecker) -> 
PAssert$135/VerifyAssertions/ParDo(DefaultConclude)/ParMultiDo(DefaultConclude) 
(1/1) (16bcebec01633c9b466acf68054370c6).
Apr 10, 2018 5:13:20 AM org.apache.flink.runtime.taskmanager.Task run
INFO: Ensuring all FileSystem streams are closed for task 
PAssert$135/GroupGlobally/GroupDummyAndContents -> 
PAssert$135/GroupGlobally/Values/Values/Map/ParMultiDo(Anonymous) -> 
PAssert$135/GroupGlobally/ParDo(Concat)/ParMultiDo(Concat) -> 
PAssert$135/GetPane/Map/ParMultiDo(Anonymous) -> 
PAssert$135/RunChecks/ParMultiDo(GroupedValuesChecker) -> 
PAssert$135/VerifyAssertions/ParDo(DefaultConclude)

[jira] [Work logged] (BEAM-2990) support data type MAP

2018-04-09 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-2990?focusedWorklogId=89258&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-89258
 ]

ASF GitHub Bot logged work on BEAM-2990:


Author: ASF GitHub Bot
Created on: 10/Apr/18 05:12
Start Date: 10/Apr/18 05:12
Worklog Time Spent: 10m 
  Work Description: XuMingmin opened a new pull request #5079: [BEAM-2990] 
support MAP in SQL schema
URL: https://github.com/apache/beam/pull/5079
 
 
   Add type MAP.
   
   
   
   Follow this checklist to help us incorporate your contribution quickly and 
easily:
   
- [ ] Make sure there is a [JIRA 
issue](https://issues.apache.org/jira/projects/BEAM/issues/) filed for the 
change (usually before you start working on it).  Trivial changes like typos do 
not require a JIRA issue.  Your pull request should address just this issue, 
without pulling in other changes.
- [ ] Format the pull request title like `[BEAM-XXX] Fixes bug in 
ApproximateQuantiles`, where you replace `BEAM-XXX` with the appropriate JIRA 
issue.
- [ ] Write a pull request description that is detailed enough to 
understand:
  - [ ] What the pull request does
  - [ ] Why it does it
  - [ ] How it does it
  - [ ] Why this approach
- [ ] Each commit in the pull request should have a meaningful subject line 
and body.
- [ ] Run `mvn clean verify` to make sure basic checks pass. A more 
thorough check will be performed on your pull request automatically.
- [ ] If this contribution is large, please file an Apache [Individual 
Contributor License Agreement](https://www.apache.org/licenses/icla.pdf).
   
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 89258)
Time Spent: 10m
Remaining Estimate: 0h

> support data type MAP
> -
>
> Key: BEAM-2990
> URL: https://issues.apache.org/jira/browse/BEAM-2990
> Project: Beam
>  Issue Type: New Feature
>  Components: dsl-sql
>Reporter: Xu Mingmin
>Assignee: Xu Mingmin
>Priority: Major
>  Time Spent: 10m
>  Remaining Estimate: 0h
>
> support Non-scalar types:
> MAP   Collection of keys mapped to values
> ARRAY Ordered, contiguous collection that may contain duplicates



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Build failed in Jenkins: beam_PostCommit_Java_GradleBuild #15

2018-04-09 Thread Apache Jenkins Server
See 


Changes:

[ccy] Real-time timers shouldn't stall completed step in DirectRunner

--
[...truncated 15.88 MB...]
Dataflow SDK version: 2.5.0-SNAPSHOT

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testWrite STANDARD_ERROR
Apr 10, 2018 5:06:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-04-09_22_06_20-15652693022495578636?project=apache-beam-testing

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testWrite STANDARD_OUT
Submitted job: 2018-04-09_22_06_20-15652693022495578636

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testWrite STANDARD_ERROR
Apr 10, 2018 5:06:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel 
--region=us-central1 2018-04-09_22_06_20-15652693022495578636
Apr 10, 2018 5:06:21 AM org.apache.beam.runners.dataflow.TestDataflowRunner 
run
INFO: Running Dataflow job 2018-04-09_22_06_20-15652693022495578636 with 0 
expected assertions.

org.apache.beam.sdk.io.gcp.datastore.V1WriteIT > testE2EV1Write STANDARD_ERROR
Apr 10, 2018 5:06:22 AM 
org.apache.beam.runners.dataflow.DataflowPipelineJob waitUntilFinish
INFO: Job 2018-04-09_22_03_17-1758034830248790562 finished with status DONE.
Apr 10, 2018 5:06:22 AM org.apache.beam.runners.dataflow.TestDataflowRunner 
checkForPAssertSuccess
INFO: Success result for Dataflow job 
2018-04-09_22_03_17-1758034830248790562. Found 0 success, 0 failures out of 0 
expected assertions.

org.apache.beam.sdk.io.gcp.spanner.SpannerReadIT > testRead STANDARD_ERROR
Apr 10, 2018 5:06:22 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-10T05:06:21.419Z: Autoscaling: Resized worker pool from 8 to 
0.
Apr 10, 2018 5:06:22 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-10T05:06:21.451Z: Autoscaling: Would further reduce the 
number of workers but reached the minimum number allowed for the job.
Apr 10, 2018 5:06:22 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-10T05:06:21.488Z: Worker pool stopped.

org.apache.beam.sdk.io.gcp.datastore.V1WriteIT > testE2EV1Write STANDARD_ERROR
Apr 10, 2018 5:06:23 AM 
org.apache.beam.sdk.io.gcp.datastore.V1TestUtil$V1TestWriter flushBatch
INFO: Writing batch of 500 entities
Apr 10, 2018 5:06:24 AM 
org.apache.beam.sdk.io.gcp.datastore.V1TestUtil$V1TestWriter flushBatch
INFO: Successfully wrote 500 entities
Apr 10, 2018 5:06:24 AM 
org.apache.beam.sdk.io.gcp.datastore.V1TestUtil$V1TestWriter flushBatch
INFO: Writing batch of 500 entities
Apr 10, 2018 5:06:24 AM 
org.apache.beam.sdk.io.gcp.datastore.V1TestUtil$V1TestWriter flushBatch
INFO: Successfully wrote 500 entities
Apr 10, 2018 5:06:24 AM org.apache.beam.sdk.io.gcp.datastore.V1TestUtil 
deleteAllEntities
INFO: Successfully deleted 1000 entities
Gradle Test Executor 126 finished executing tests.

org.apache.beam.sdk.io.gcp.spanner.SpannerReadIT > testRead STANDARD_ERROR
Apr 10, 2018 5:06:32 AM 
org.apache.beam.runners.dataflow.DataflowPipelineJob waitUntilFinish
INFO: Job 2018-04-09_22_02_48-2677528595931176656 finished with status DONE.
Apr 10, 2018 5:06:32 AM org.apache.beam.runners.dataflow.TestDataflowRunner 
checkForPAssertSuccess
INFO: Success result for Dataflow job 
2018-04-09_22_02_48-2677528595931176656. Found 1 success, 0 failures out of 1 
expected assertions.
Gradle Test Executor 127 finished executing tests.

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testWrite STANDARD_ERROR
Apr 10, 2018 5:06:33 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-10T05:06:20.717Z: Autoscaling is enabled for job 
2018-04-09_22_06_20-15652693022495578636. The number of workers will be between 
1 and 1000.
Apr 10, 2018 5:06:33 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-10T05:06:20.733Z: Autoscaling was automatically enabled for 
job 2018-04-09_22_06_20-15652693022495578636.
Apr 10, 2018 5:06:33 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-10T05:06:23.171Z: Checking required Cloud APIs are enabled.
Apr 10, 2018 5:06:33 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-10T05:06:23.461Z: Checking permissions granted to controller 
Service Account.
Apr 10, 2018 5:06:33 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-10

[beam] branch master updated (1422ff7 -> 2913cbd)

2018-04-09 Thread altay
This is an automated email from the ASF dual-hosted git repository.

altay pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git.


from 1422ff7  Merge pull request #5071 from 
charlesccychen/wordstream-snippet
 add 608d7dc  Set RuntimeValueProvider runtime options in FnApiRunner
 new 2913cbd  Merge pull request #5078 from 
charlesccychen/fnapi-runner-valueprovider

The 1 revision listed above as "new" is entirely new to this
repository and will be described in a separate email.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 sdks/python/apache_beam/runners/portability/fn_api_runner.py | 2 ++
 1 file changed, 2 insertions(+)

-- 
To stop receiving notification emails like this one, please contact
al...@apache.org.


[beam] 01/01: Merge pull request #5078 from charlesccychen/fnapi-runner-valueprovider

2018-04-09 Thread altay
This is an automated email from the ASF dual-hosted git repository.

altay pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git

commit 2913cbd10d39d890b36a15dca0fc7000701e600e
Merge: 1422ff7 608d7dc
Author: Ahmet Altay 
AuthorDate: Mon Apr 9 22:03:06 2018 -0700

Merge pull request #5078 from charlesccychen/fnapi-runner-valueprovider

Set RuntimeValueProvider runtime options in FnApiRunner

 sdks/python/apache_beam/runners/portability/fn_api_runner.py | 2 ++
 1 file changed, 2 insertions(+)

-- 
To stop receiving notification emails like this one, please contact
al...@apache.org.


[jira] [Work logged] (BEAM-4037) Add Python streaming wordcount snippets and test

2018-04-09 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-4037?focusedWorklogId=89253&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-89253
 ]

ASF GitHub Bot logged work on BEAM-4037:


Author: ASF GitHub Bot
Created on: 10/Apr/18 04:52
Start Date: 10/Apr/18 04:52
Worklog Time Spent: 10m 
  Work Description: aaltay closed pull request #5071: [BEAM-4037] Add 
streaming wordcount snippets and test
URL: https://github.com/apache/beam/pull/5071
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/sdks/python/apache_beam/examples/snippets/snippets.py 
b/sdks/python/apache_beam/examples/snippets/snippets.py
index 9b150c8d3f0..0f9543a64aa 100644
--- a/sdks/python/apache_beam/examples/snippets/snippets.py
+++ b/sdks/python/apache_beam/examples/snippets/snippets.py
@@ -30,6 +30,8 @@
 string. The tags can contain only letters, digits and _.
 """
 
+import argparse
+
 import six
 
 import apache_beam as beam
@@ -628,6 +630,64 @@ def format_result(word_count):
 p.visit(SnippetUtils.RenameFiles(renames))
 
 
+def examples_wordcount_streaming(argv):
+  import apache_beam as beam
+  from apache_beam import window
+  from apache_beam.io import ReadFromPubSub
+  from apache_beam.io import WriteStringsToPubSub
+  from apache_beam.options.pipeline_options import PipelineOptions
+  from apache_beam.options.pipeline_options import SetupOptions
+  from apache_beam.options.pipeline_options import StandardOptions
+
+  # Parse out arguments.
+  parser = argparse.ArgumentParser()
+  parser.add_argument(
+      '--output_topic', required=True,
+      help=('Output PubSub topic of the form '
+            '"projects/<PROJECT>/topic/<TOPIC>".'))
+  group = parser.add_mutually_exclusive_group(required=True)
+  group.add_argument(
+      '--input_topic',
+      help=('Input PubSub topic of the form '
+            '"projects/<PROJECT>/topics/<TOPIC>".'))
+  group.add_argument(
+      '--input_subscription',
+      help=('Input PubSub subscription of the form '
+            '"projects/<PROJECT>/subscriptions/<SUBSCRIPTION>."'))
+  known_args, pipeline_args = parser.parse_known_args(argv)
+
+  pipeline_options = PipelineOptions(pipeline_args)
+  pipeline_options.view_as(StandardOptions).streaming = True
+
+  with TestPipeline(options=pipeline_options) as p:
+    # [START example_wordcount_streaming_read]
+    # Read from Pub/Sub into a PCollection.
+    if known_args.input_subscription:
+      lines = p | beam.io.ReadFromPubSub(
+          subscription=known_args.input_subscription)
+    else:
+      lines = p | beam.io.ReadFromPubSub(topic=known_args.input_topic)
+    # [END example_wordcount_streaming_read]
+
+    output = (
+        lines
+        | 'DecodeUnicode' >> beam.FlatMap(
+            lambda encoded: encoded.decode('utf-8'))
+        | 'ExtractWords' >> beam.FlatMap(
+            lambda x: __import__('re').findall(r'[A-Za-z\']+', x))
+        | 'PairWithOnes' >> beam.Map(lambda x: (x, 1))
+        | beam.WindowInto(window.FixedWindows(15, 0))
+        | 'Group' >> beam.GroupByKey()
+        | 'Sum' >> beam.Map(
+            lambda word_ones: (word_ones[0], sum(word_ones[1])))
+        | 'Format' >> beam.Map(
+            lambda word_and_count: '%s: %d' % word_and_count))
+
+    # [START example_wordcount_streaming_write]
+    # Write to Pub/Sub
+    output | beam.io.WriteStringsToPubSub(known_args.output_topic)
+    # [END example_wordcount_streaming_write]
+
+
 def examples_ptransforms_templated(renames):
   # [START examples_ptransforms_templated]
   import apache_beam as beam
diff --git a/sdks/python/apache_beam/examples/snippets/snippets_test.py b/sdks/python/apache_beam/examples/snippets/snippets_test.py
index 349d52542da..4380ce47271 100644
--- a/sdks/python/apache_beam/examples/snippets/snippets_test.py
+++ b/sdks/python/apache_beam/examples/snippets/snippets_test.py
@@ -1,3 +1,4 @@
+# coding=utf-8
 #
 # Licensed to the Apache Software Foundation (ASF) under one or more
 # contributor license agreements.  See the NOTICE file distributed with
@@ -25,6 +26,8 @@
 import unittest
 import uuid
 
+import mock
+
 import apache_beam as beam
 from apache_beam import coders
 from apache_beam import pvalue
@@ -36,8 +39,10 @@
 from apache_beam.options.pipeline_options import GoogleCloudOptions
 from apache_beam.options.pipeline_options import PipelineOptions
 from apache_beam.testing.test_pipeline import TestPipeline
+from apache_beam.testing.test_stream import TestStream
 from apache_beam.testing.util import assert_that
 from apache_beam.testing.util import equal_to
+from apache_beam.transforms.window import TimestampedValue
 from apache_beam.utils.windowed_value import WindowedValue
 
 # Protect against environments where apitools library is not available.
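Aside: the transform chain in the diff above (ExtractWords, PairWithOnes, FixedWindows(15), Group, Sum) can be mimicked in plain Python without a Beam dependency. This is a simplified batch sketch for illustration only, not part of the PR; `windowed_word_counts` is a hypothetical helper name.

```python
import re
from collections import defaultdict

def windowed_word_counts(timestamped_lines, window_size=15):
    """Count words per fixed window, mirroring the Beam pipeline above.

    timestamped_lines: iterable of (timestamp_seconds, text) pairs.
    Returns {window_start: {word: count}}.
    """
    counts = defaultdict(lambda: defaultdict(int))
    for ts, line in timestamped_lines:
        # FixedWindows(15): assign each element to a 15-second window.
        window_start = int(ts // window_size) * window_size
        # ExtractWords + PairWithOnes + Group + Sum, collapsed into a dict.
        for word in re.findall(r"[A-Za-z']+", line):
            counts[window_start][word] += 1
    return {w: dict(c) for w, c in counts.items()}
```

For example, lines at t=0s and t=5s land in window 0, while a line at t=20s lands in window 15.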

Jenkins build is back to normal : beam_PostRelease_NightlySnapshot #182

2018-04-09 Thread Apache Jenkins Server
See 




[beam] 01/01: Merge pull request #5071 from charlesccychen/wordstream-snippet

2018-04-09 Thread altay
This is an automated email from the ASF dual-hosted git repository.

altay pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git

commit 1422ff780a4a58d4b34cc08e79a96866280f4bf6
Merge: 20a07d2 399ef70
Author: Ahmet Altay 
AuthorDate: Mon Apr 9 21:52:21 2018 -0700

Merge pull request #5071 from charlesccychen/wordstream-snippet

[BEAM-4037] Add streaming wordcount snippets and test

 .../apache_beam/examples/snippets/snippets.py  | 60 
 .../apache_beam/examples/snippets/snippets_test.py | 66 ++
 2 files changed, 126 insertions(+)



[beam] branch master updated (20a07d2 -> 1422ff7)

2018-04-09 Thread altay
This is an automated email from the ASF dual-hosted git repository.

altay pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git.


from 20a07d2  Merge pull request #5076 from charlesccychen/real-time-stall
 add 399ef70  Add streaming wordcount snippets and test
 new 1422ff7  Merge pull request #5071 from charlesccychen/wordstream-snippet

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 .../apache_beam/examples/snippets/snippets.py  | 60 
 .../apache_beam/examples/snippets/snippets_test.py | 66 ++
 2 files changed, 126 insertions(+)



[jira] [Work logged] (BEAM-4037) Add Python streaming wordcount snippets and test

2018-04-09 Thread ASF GitHub Bot (JIRA)

 [ https://issues.apache.org/jira/browse/BEAM-4037?focusedWorklogId=89248&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-89248 ]

ASF GitHub Bot logged work on BEAM-4037:


Author: ASF GitHub Bot
Created on: 10/Apr/18 04:22
Start Date: 10/Apr/18 04:22
Worklog Time Spent: 10m 
  Work Description: charlesccychen commented on issue #5071: [BEAM-4037] 
Add streaming wordcount snippets and test
URL: https://github.com/apache/beam/pull/5071#issuecomment-379969473
 
 
   Thanks, PTAL.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 89248)
Time Spent: 1h 20m  (was: 1h 10m)

> Add Python streaming wordcount snippets and test
> 
>
> Key: BEAM-4037
> URL: https://issues.apache.org/jira/browse/BEAM-4037
> Project: Beam
>  Issue Type: Improvement
>  Components: sdk-py-core
>Reporter: Charles Chen
>Assignee: Charles Chen
>Priority: Major
>  Time Spent: 1h 20m
>  Remaining Estimate: 0h
>
> We should add Python streaming wordcount snippets and tests.  The 
> documentation will refer to these snippets.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Spark_Gradle #38

2018-04-09 Thread Apache Jenkins Server
See 


Changes:

[ccy] Real-time timers shouldn't stall completed step in DirectRunner

--
[...truncated 1.60 MB...]
at scala.collection.immutable.List.foreach(List.scala:392)
at org.apache.spark.SparkContext.(SparkContext.scala:457)
at 
org.apache.spark.api.java.JavaSparkContext.(JavaSparkContext.scala:58)
at 
org.apache.beam.runners.spark.translation.SparkContextFactory.createSparkContext(SparkContextFactory.java:103)
at 
org.apache.beam.runners.spark.translation.SparkContextFactory.getSparkContext(SparkContextFactory.java:68)
at 
org.apache.beam.runners.spark.translation.streaming.SparkRunnerStreamingContextFactory.call(SparkRunnerStreamingContextFactory.java:79)
at 
org.apache.beam.runners.spark.translation.streaming.SparkRunnerStreamingContextFactory.call(SparkRunnerStreamingContextFactory.java:47)
at 
org.apache.spark.streaming.api.java.JavaStreamingContext$$anonfun$7.apply(JavaStreamingContext.scala:627)
at 
org.apache.spark.streaming.api.java.JavaStreamingContext$$anonfun$7.apply(JavaStreamingContext.scala:626)
at scala.Option.getOrElse(Option.scala:121)
at 
org.apache.spark.streaming.StreamingContext$.getOrCreate(StreamingContext.scala:828)
at 
org.apache.spark.streaming.api.java.JavaStreamingContext$.getOrCreate(JavaStreamingContext.scala:626)
at 
org.apache.spark.streaming.api.java.JavaStreamingContext.getOrCreate(JavaStreamingContext.scala)
at org.apache.beam.runners.spark.SparkRunner.run(SparkRunner.java:169)
at 
org.apache.beam.runners.spark.TestSparkRunner.run(TestSparkRunner.java:123)
at 
org.apache.beam.runners.spark.TestSparkRunner.run(TestSparkRunner.java:83)
at org.apache.beam.sdk.Pipeline.run(Pipeline.java:311)
at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:346)
at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:328)
at 
org.apache.beam.runners.spark.translation.streaming.CreateStreamTest.testDiscardingMode(CreateStreamTest.java:203)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
at 
org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
at 
org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
at 
org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
at org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:48)
at 
org.apache.beam.sdk.testing.TestPipeline$1.evaluate(TestPipeline.java:317)
at 
org.junit.rules.ExpectedException$ExpectedExceptionStatement.evaluate(ExpectedException.java:239)
at org.junit.rules.RunRules.evaluate(RunRules.java:20)
at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
at 
org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
at 
org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
at 
org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecuter.runTestClass(JUnitTestClassExecuter.java:114)
at 
org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecuter.execute(JUnitTestClassExecuter.java:57)
at 
org.gradle.api.internal.tasks.testing.junit.JUnitTestClassProcessor.processTestClass(JUnitTestClassProcessor.java:66)
at 
org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:35)
at 
org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
   

[jira] [Work logged] (BEAM-3339) Create post-release testing of the nightly snapshots

2018-04-09 Thread ASF GitHub Bot (JIRA)

 [ https://issues.apache.org/jira/browse/BEAM-3339?focusedWorklogId=89246&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-89246 ]

ASF GitHub Bot logged work on BEAM-3339:


Author: ASF GitHub Bot
Created on: 10/Apr/18 04:17
Start Date: 10/Apr/18 04:17
Worklog Time Spent: 10m 
  Work Description: yifanzou commented on issue #4788: [BEAM-3339] Mobile 
gaming automation for Java nightly snapshot on core runners
URL: https://github.com/apache/beam/pull/4788#issuecomment-379968837
 
 
   Run Dataflow PostRelease




Issue Time Tracking
---

Worklog Id: (was: 89246)
Time Spent: 82.5h  (was: 82h 20m)

> Create post-release testing of the nightly snapshots
> 
>
> Key: BEAM-3339
> URL: https://issues.apache.org/jira/browse/BEAM-3339
> Project: Beam
>  Issue Type: Improvement
>  Components: testing
>Reporter: Alan Myrvold
>Assignee: Jason Kuster
>Priority: Major
>  Time Spent: 82.5h
>  Remaining Estimate: 0h
>
> The nightly java snapshots in 
> https://repository.apache.org/content/groups/snapshots/org/apache/beam should 
> be verified by following the 
> https://beam.apache.org/get-started/quickstart-java/ instructions, to verify 
> that the release is usable.





Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Flink_Gradle #51

2018-04-09 Thread Apache Jenkins Server
See 


Changes:

[ccy] Real-time timers shouldn't stall completed step in DirectRunner

--
[...truncated 109.54 MB...]
org.apache.beam.sdk.transforms.CombineTest > testSimpleCombineWithContextEmpty 
STANDARD_OUT
04/10/2018 04:14:17 
View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)
 -> 
View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map/ParMultiDo(Anonymous)
 -> 
View.AsSingleton/Combine.GloballyAsSingletonView/View.VoidKeyToMultimapMaterialization/ParDo(VoidKeyToMultimapMaterialization)/ParMultiDo(VoidKeyToMultimapMaterialization)
 -> 
View.AsSingleton/Combine.GloballyAsSingletonView/View.CreatePCollectionView/Combine.globally(Concatenate)/WithKeys/AddKeys/Map/ParMultiDo(Anonymous)
 -> ToKeyedWorkItem(1/1) switched to FINISHED 

org.apache.beam.sdk.transforms.CombineTest > testSimpleCombineWithContextEmpty 
STANDARD_ERROR
Apr 10, 2018 4:14:17 AM org.apache.flink.runtime.taskmanager.Task 
transitionState
INFO: PAssert$134/GroupGlobally/GroupDummyAndContents -> 
PAssert$134/GroupGlobally/Values/Values/Map/ParMultiDo(Anonymous) -> 
PAssert$134/GroupGlobally/ParDo(Concat)/ParMultiDo(Concat) -> 
PAssert$134/GetPane/Map/ParMultiDo(Anonymous) -> 
PAssert$134/RunChecks/ParMultiDo(GroupedValuesChecker) -> 
PAssert$134/VerifyAssertions/ParDo(DefaultConclude)/ParMultiDo(DefaultConclude) 
(1/1) (e483e71b7996a9ee0ba04955ec4610ae) switched from RUNNING to FINISHED.
Apr 10, 2018 4:14:17 AM org.apache.flink.runtime.taskmanager.Task run
INFO: Freeing task resources for 
PAssert$134/GroupGlobally/GroupDummyAndContents -> 
PAssert$134/GroupGlobally/Values/Values/Map/ParMultiDo(Anonymous) -> 
PAssert$134/GroupGlobally/ParDo(Concat)/ParMultiDo(Concat) -> 
PAssert$134/GetPane/Map/ParMultiDo(Anonymous) -> 
PAssert$134/RunChecks/ParMultiDo(GroupedValuesChecker) -> 
PAssert$134/VerifyAssertions/ParDo(DefaultConclude)/ParMultiDo(DefaultConclude) 
(1/1) (e483e71b7996a9ee0ba04955ec4610ae).
Apr 10, 2018 4:14:17 AM org.apache.flink.runtime.taskmanager.Task run
INFO: Ensuring all FileSystem streams are closed for task 
PAssert$134/GroupGlobally/GroupDummyAndContents -> 
PAssert$134/GroupGlobally/Values/Values/Map/ParMultiDo(Anonymous) -> 
PAssert$134/GroupGlobally/ParDo(Concat)/ParMultiDo(Concat) -> 
PAssert$134/GetPane/Map/ParMultiDo(Anonymous) -> 
PAssert$134/RunChecks/ParMultiDo(GroupedValuesChecker) -> 
PAssert$134/VerifyAssertions/ParDo(DefaultConclude)/ParMultiDo(DefaultConclude) 
(1/1) (e483e71b7996a9ee0ba04955ec4610ae) [FINISHED]
Apr 10, 2018 4:14:17 AM grizzled.slf4j.Logger info
INFO: Un-registering task and sending final execution state FINISHED to 
JobManager for task PAssert$134/GroupGlobally/GroupDummyAndContents -> 
PAssert$134/GroupGlobally/Values/Values/Map/ParMultiDo(Anonymous) -> 
PAssert$134/GroupGlobally/ParDo(Concat)/ParMultiDo(Concat) -> 
PAssert$134/GetPane/Map/ParMultiDo(Anonymous) -> 
PAssert$134/RunChecks/ParMultiDo(GroupedValuesChecker) -> 
PAssert$134/VerifyAssertions/ParDo(DefaultConclude)/ParMultiDo(DefaultConclude) 
(e483e71b7996a9ee0ba04955ec4610ae)
Apr 10, 2018 4:14:17 AM org.apache.flink.runtime.executiongraph.Execution 
transitionState
INFO: PAssert$134/GroupGlobally/GroupDummyAndContents -> 
PAssert$134/GroupGlobally/Values/Values/Map/ParMultiDo(Anonymous) -> 
PAssert$134/GroupGlobally/ParDo(Concat)/ParMultiDo(Concat) -> 
PAssert$134/GetPane/Map/ParMultiDo(Anonymous) -> 
PAssert$134/RunChecks/ParMultiDo(GroupedValuesChecker) -> 
PAssert$134/VerifyAssertions/ParDo(DefaultConclude)/ParMultiDo(DefaultConclude) 
(1/1) (e483e71b7996a9ee0ba04955ec4610ae) switched from RUNNING to FINISHED.
Apr 10, 2018 4:14:17 AM org.apache.flink.runtime.client.JobClientActor 
logAndPrintMessage
INFO: 04/10/2018 04:14:17   PAssert$134/GroupGlobally/GroupDummyAndContents 
-> PAssert$134/GroupGlobally/Values/Values/Map/ParMultiDo(Anonymous) -> 
PAssert$134/GroupGlobally/ParDo(Concat)/ParMultiDo(Concat) -> 
PAssert$134/GetPane/Map/ParMultiDo(Anonymous) -> 
PAssert$134/RunChecks/ParMultiDo(GroupedValuesChecker) -> 
PAssert$134/VerifyAssertions/ParDo(DefaultConclude)/ParMultiDo(DefaultConclude)(1/1)
 switched to FINISHED 

org.apache.beam.sdk.transforms.CombineTest > testSimpleCombineWithContextEmpty 
STANDARD_OUT
04/10/2018 04:14:17 PAssert$134/GroupGlobally/GroupDummyAndContents -> 
PAssert$134/GroupGlobally/Values/Values/Map/ParMultiDo(Anonymous) -> 
PAssert$134/GroupGlobally/ParDo(Concat)/ParMultiDo(Concat) -> 
PAssert$134/GetPane/Map/ParMultiDo(Anonymous) -> 
PAssert$134/RunChecks/ParMultiDo(GroupedValuesChecker) -> 
PAssert$134/VerifyAssertions/ParDo(DefaultConclude)/ParMultiDo(DefaultConclude)(1/1)
 switched to FINISHED 

org.apache.beam.sdk.transforms.CombineTest > testSimpleCombineWithContextEmpty 
STANDARD_ERROR
A

Build failed in Jenkins: beam_PostRelease_NightlySnapshot #181

2018-04-09 Thread Apache Jenkins Server
See 


Changes:

[ccy] Real-time timers shouldn't stall completed step in DirectRunner

[yifanzou] return stdout text from command execution and stop using 
var.last_text

--
[...truncated 975.02 KB...]
INFO: 19 sending EndOfStream
Apr 10, 2018 3:55:04 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 20 sending EndOfStream
Apr 10, 2018 3:55:04 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 21 sending EndOfStream
Apr 10, 2018 3:55:05 AM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 10, 2018 3:55:05 AM com.datatorrent.stram.engine.StreamingContainer 
processHeartbeatResponse
INFO: Undeploy request: [16]
Apr 10, 2018 3:55:05 AM com.datatorrent.stram.engine.StreamingContainer undeploy
INFO: Undeploy complete.
Apr 10, 2018 3:55:05 AM com.datatorrent.bufferserver.server.Server$3 run
INFO: Removing ln 
LogicalNode@68843994identifier=tcp://localhost:36284/16.output.16, 
upstream=16.output.16, group=stream17/17.input, partitions=[], 
iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@26118550{da=com.datatorrent.bufferserver.internal.DataList$Block@84e69b2{identifier=16.output.16,
 data=1048576, readingOffset=0, writingOffset=47298, 
starting_window=5acc35930001, ending_window=5acc35930009, refCount=2, 
uniqueIdentifier=0, next=null, future=null}}} from dl 
com.datatorrent.bufferserver.internal.DataList@303b5e58 {16.output.16}
Apr 10, 2018 3:55:05 AM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 10, 2018 3:55:05 AM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 10, 2018 3:55:05 AM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 10, 2018 3:55:05 AM com.datatorrent.stram.engine.StreamingContainer 
processHeartbeatResponse
INFO: Undeploy request: [19, 20, 21]
Apr 10, 2018 3:55:05 AM com.datatorrent.stram.engine.StreamingContainer undeploy
INFO: Undeploy complete.
Apr 10, 2018 3:55:06 AM com.datatorrent.stram.StreamingContainerManager 
processHeartbeat
INFO: requesting idle shutdown for container container-5
Apr 10, 2018 3:55:06 AM com.datatorrent.stram.engine.StreamingContainer 
processHeartbeatResponse
INFO: Received shutdown request type ABORT
Apr 10, 2018 3:55:06 AM 
com.datatorrent.stram.StramLocalCluster$UmbilicalProtocolLocalImpl log
INFO: container-5 msg: [container-5] Exiting heartbeat loop..
Apr 10, 2018 3:55:06 AM 
com.datatorrent.stram.StramLocalCluster$LocalStreamingContainerLauncher run
INFO: Container container-5 terminating.
Apr 10, 2018 3:55:06 AM com.datatorrent.stram.StreamingContainerManager 
processHeartbeat
INFO: requesting idle shutdown for container container-9
Apr 10, 2018 3:55:06 AM com.datatorrent.stram.engine.StreamingContainer 
processHeartbeatResponse
INFO: Received shutdown request type ABORT
Apr 10, 2018 3:55:06 AM com.datatorrent.stram.StreamingContainerManager 
processHeartbeat
INFO: requesting idle shutdown for container container-7
Apr 10, 2018 3:55:06 AM com.datatorrent.stram.StreamingContainerManager 
processHeartbeat
INFO: requesting idle shutdown for container container-1
Apr 10, 2018 3:55:06 AM com.datatorrent.stram.engine.StreamingContainer 
processHeartbeatResponse
INFO: Received shutdown request type ABORT
Apr 10, 2018 3:55:06 AM com.datatorrent.stram.engine.StreamingContainer 
processHeartbeatResponse
INFO: Received shutdown request type ABORT
Apr 10, 2018 3:55:06 AM 
com.datatorrent.stram.StramLocalCluster$UmbilicalProtocolLocalImpl log
INFO: container-1 msg: [container-1] Exiting heartbeat loop..
Apr 10, 2018 3:55:06 AM 
com.datatorrent.stram.StramLocalCluster$LocalStreamingContainerLauncher run
INFO: Container container-1 terminating.
Apr 10, 2018 3:55:06 AM 
com.datatorrent.stram.StramLocalCluster$UmbilicalProtocolLocalImpl log
INFO: container-9 msg: [container-9] Exiting heartbeat loop..
Apr 10, 2018 3:55:06 AM 
com.datatorrent.stram.StramLocalCluster$LocalStreamingContainerLauncher run
INFO: Container container-9 terminating.
Apr 10, 2018 3:55:06 AM 
com.datatorrent.stram.StramLocalCluster$UmbilicalProtocolLocalImpl log
INFO: container-7 msg: [container-7] Exiting heartbeat loop..
Apr 10, 2018 3:55:06 AM 
com.datatorrent.stram.StramLocalCluster$LocalStreamingContainerLauncher run
INFO: Container container-7 terminating.
Apr 10, 2018 3:55:06 AM com.datatorrent.stram.StreamingContainerManager 
processHeartbeat
INFO: requesting idle shutdown for container container-2
Apr 10, 2018 3:55:06 AM com.datatorrent.stram.engine.StreamingContainer 
processHeartbeatResponse
INFO: Received shutdown request type ABORT
Apr 10, 2018 3:55:06 AM 
com.datatorrent.stram.StramLocalCluster$UmbilicalProtocolLocalImpl log
INFO: container-2 msg: [container-2] Exiting heartbeat loop..

[jira] [Work logged] (BEAM-3339) Create post-release testing of the nightly snapshots

2018-04-09 Thread ASF GitHub Bot (JIRA)

 [ https://issues.apache.org/jira/browse/BEAM-3339?focusedWorklogId=89244&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-89244 ]

ASF GitHub Bot logged work on BEAM-3339:


Author: ASF GitHub Bot
Created on: 10/Apr/18 03:52
Start Date: 10/Apr/18 03:52
Worklog Time Spent: 10m 
  Work Description: yifanzou commented on issue #4788: [BEAM-3339] Mobile 
gaming automation for Java nightly snapshot on core runners
URL: https://github.com/apache/beam/pull/4788#issuecomment-379965687
 
 
   Run Dataflow PostRelease




Issue Time Tracking
---

Worklog Id: (was: 89244)
Time Spent: 82h 20m  (was: 82h 10m)

> Create post-release testing of the nightly snapshots
> 
>
> Key: BEAM-3339
> URL: https://issues.apache.org/jira/browse/BEAM-3339
> Project: Beam
>  Issue Type: Improvement
>  Components: testing
>Reporter: Alan Myrvold
>Assignee: Jason Kuster
>Priority: Major
>  Time Spent: 82h 20m
>  Remaining Estimate: 0h
>
> The nightly java snapshots in 
> https://repository.apache.org/content/groups/snapshots/org/apache/beam should 
> be verified by following the 
> https://beam.apache.org/get-started/quickstart-java/ instructions, to verify 
> that the release is usable.





[beam] 01/01: Merge pull request #5076 from charlesccychen/real-time-stall

2018-04-09 Thread altay
This is an automated email from the ASF dual-hosted git repository.

altay pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git

commit 20a07d26cdc5b121b4fbb7d9f380a306b64c38f0
Merge: fb7c39d 837c83a
Author: Ahmet Altay 
AuthorDate: Mon Apr 9 20:51:50 2018 -0700

Merge pull request #5076 from charlesccychen/real-time-stall

Real-time timers shouldn't stall completed step in DirectRunner

 sdks/python/apache_beam/runners/direct/watermark_manager.py | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)



[beam] branch master updated (fb7c39d -> 20a07d2)

2018-04-09 Thread altay
This is an automated email from the ASF dual-hosted git repository.

altay pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git.


from fb7c39d  Merge pull request #5073 from charlesccychen/fix-trigger-clock-bug
 add 837c83a  Real-time timers shouldn't stall completed step in DirectRunner
 new 20a07d2  Merge pull request #5076 from charlesccychen/real-time-stall

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 sdks/python/apache_beam/runners/direct/watermark_manager.py | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)



[jira] [Work logged] (BEAM-4037) Add Python streaming wordcount snippets and test

2018-04-09 Thread ASF GitHub Bot (JIRA)

 [ https://issues.apache.org/jira/browse/BEAM-4037?focusedWorklogId=89241&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-89241 ]

ASF GitHub Bot logged work on BEAM-4037:


Author: ASF GitHub Bot
Created on: 10/Apr/18 03:50
Start Date: 10/Apr/18 03:50
Worklog Time Spent: 10m 
  Work Description: aaltay commented on issue #5071: [BEAM-4037] Add 
streaming wordcount snippets and test
URL: https://github.com/apache/beam/pull/5071#issuecomment-379965425
 
 
   Could you fix the lint errors:
   
   ```
   Running pylint for module apache_beam:
   * Module apache_beam.examples.snippets.snippets
   C:681, 0: Line too long (81/80) (line-too-long)
   * Module apache_beam.examples.snippets.snippets_test
   C: 25, 0: standard import "import os" should be placed before "import mock" 
(wrong-import-order)
   C: 26, 0: standard import "import tempfile" should be placed before "import 
mock" (wrong-import-order)
   C: 27, 0: standard import "import unittest" should be placed before "import 
mock" (wrong-import-order)
   C: 28, 0: standard import "import uuid" should be placed before "import 
mock" (wrong-import-order)
   ```
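All four wrong-import-order warnings above have the same cause: standard-library imports placed after the third-party `import mock`. The rule can be sketched as a toy check (an illustration, not pylint's actual implementation; the hard-coded stdlib set is an assumption for this sketch):

```python
# Toy version of pylint's wrong-import-order rule: given module names in the
# order they are imported in a file, flag standard-library modules that appear
# after any third-party module. STDLIB is a hard-coded assumption here;
# pylint resolves this properly via the installed distribution metadata.
STDLIB = frozenset({'os', 'tempfile', 'unittest', 'uuid', 'argparse', 're'})

def wrong_import_order(modules, stdlib=STDLIB):
    seen_third_party = False
    offenders = []
    for name in modules:
        if name in stdlib:
            if seen_third_party:  # stdlib import after a third-party one
                offenders.append(name)
        else:
            seen_third_party = True
    return offenders
```

Moving `import mock` below the stdlib block (as pylint suggests) empties the offender list.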




Issue Time Tracking
---

Worklog Id: (was: 89241)
Time Spent: 1h 10m  (was: 1h)

> Add Python streaming wordcount snippets and test
> 
>
> Key: BEAM-4037
> URL: https://issues.apache.org/jira/browse/BEAM-4037
> Project: Beam
>  Issue Type: Improvement
>  Components: sdk-py-core
>Reporter: Charles Chen
>Assignee: Charles Chen
>Priority: Major
>  Time Spent: 1h 10m
>  Remaining Estimate: 0h
>
> We should add Python streaming wordcount snippets and tests.  The 
> documentation will refer to these snippets.





[jira] [Work logged] (BEAM-3339) Create post-release testing of the nightly snapshots

2018-04-09 Thread ASF GitHub Bot (JIRA)

 [ https://issues.apache.org/jira/browse/BEAM-3339?focusedWorklogId=89239&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-89239 ]

ASF GitHub Bot logged work on BEAM-3339:


Author: ASF GitHub Bot
Created on: 10/Apr/18 03:47
Start Date: 10/Apr/18 03:47
Worklog Time Spent: 10m 
  Work Description: yifanzou commented on issue #4788: [BEAM-3339] Mobile 
gaming automation for Java nightly snapshot on core runners
URL: https://github.com/apache/beam/pull/4788#issuecomment-379965106
 
 
   Run Seed Job




Issue Time Tracking
---

Worklog Id: (was: 89239)
Time Spent: 82h 10m  (was: 82h)

> Create post-release testing of the nightly snapshots
> 
>
> Key: BEAM-3339
> URL: https://issues.apache.org/jira/browse/BEAM-3339
> Project: Beam
>  Issue Type: Improvement
>  Components: testing
>Reporter: Alan Myrvold
>Assignee: Jason Kuster
>Priority: Major
>  Time Spent: 82h 10m
>  Remaining Estimate: 0h
>
> The nightly java snapshots in 
> https://repository.apache.org/content/groups/snapshots/org/apache/beam should 
> be verified by following the 
> https://beam.apache.org/get-started/quickstart-java/ instructions, to verify 
> that the release is usable.





[jira] [Work logged] (BEAM-3339) Create post-release testing of the nightly snapshots

2018-04-09 Thread ASF GitHub Bot (JIRA)

 [ https://issues.apache.org/jira/browse/BEAM-3339?focusedWorklogId=89237&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-89237 ]

ASF GitHub Bot logged work on BEAM-3339:


Author: ASF GitHub Bot
Created on: 10/Apr/18 03:38
Start Date: 10/Apr/18 03:38
Worklog Time Spent: 10m 
  Work Description: yifanzou commented on issue #4788: [BEAM-3339] Mobile 
gaming automation for Java nightly snapshot on core runners
URL: https://github.com/apache/beam/pull/4788#issuecomment-379963938
 
 
   Run Dataflow PostRelease




Issue Time Tracking
---

Worklog Id: (was: 89237)
Time Spent: 82h  (was: 81h 50m)

> Create post-release testing of the nightly snapshots
> 
>
> Key: BEAM-3339
> URL: https://issues.apache.org/jira/browse/BEAM-3339
> Project: Beam
>  Issue Type: Improvement
>  Components: testing
>Reporter: Alan Myrvold
>Assignee: Jason Kuster
>Priority: Major
>  Time Spent: 82h
>  Remaining Estimate: 0h
>
> The nightly java snapshots in 
> https://repository.apache.org/content/groups/snapshots/org/apache/beam should 
> be verified by following the 
> https://beam.apache.org/get-started/quickstart-java/ instructions, to verify 
> that the release is usable.





Build failed in Jenkins: beam_PostRelease_NightlySnapshot #180

2018-04-09 Thread Apache Jenkins Server
See 


Changes:

[yifanzou] return stdout text from command execution and stop using 
var.last_text

--
GitHub pull request #4788 of commit e1fbc20be9c904a168b94eb7f9582bd8ff25f9dc, 
no merge conflicts.
Setting status of e1fbc20be9c904a168b94eb7f9582bd8ff25f9dc to PENDING with url 
https://builds.apache.org/job/beam_PostRelease_NightlySnapshot/180/ and 
message: 'Build started sha1 is merged.'
Using context: Jenkins: ./gradlew :release:runJavaExamplesValidationTask
[EnvInject] - Loading node environment variables.
Building remotely on beam4 (beam) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/4788/*:refs/remotes/origin/pr/4788/*
 > git rev-parse refs/remotes/origin/pr/4788/merge^{commit} # timeout=10
 > git rev-parse refs/remotes/origin/origin/pr/4788/merge^{commit} # timeout=10
Checking out Revision fb6a37b7bc4fba01387c2b4b8031d31fe82726ea 
(refs/remotes/origin/pr/4788/merge)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f fb6a37b7bc4fba01387c2b4b8031d31fe82726ea
Commit message: "Merge e1fbc20be9c904a168b94eb7f9582bd8ff25f9dc into 
fb7c39d087e64c413731abb90ee7aaedcd624f74"
 > git rev-list --no-walk 3155df26eaac73e1a7c2e56049b3d1744e5783fc # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[Gradle] - Launching build.
[src] $ 
 
-Pver= -Prepourl= :release:runJavaExamplesValidationTask
Parallel execution with configuration on demand is an incubating feature.
Applying build_rules.gradle to beam
Adding 47 .gitignore exclusions to Apache Rat
Applying build_rules.gradle to beam-runners-google-cloud-dataflow-java
applyJavaNature with [enableFindbugs:false] for project 
beam-runners-google-cloud-dataflow-java
Applying build_rules.gradle to beam-sdks-java-io-google-cloud-platform
applyJavaNature with [enableFindbugs:false] for project 
beam-sdks-java-io-google-cloud-platform
Applying build_rules.gradle to 
beam-sdks-java-extensions-google-cloud-platform-core
applyJavaNature with default configuration for project 
beam-sdks-java-extensions-google-cloud-platform-core
Applying build_rules.gradle to beam-model-fn-execution
applyJavaNature with [enableFindbugs:false] for project beam-model-fn-execution
applyGrpcNature with default configuration for project beam-model-fn-execution
Applying build_rules.gradle to beam-sdks-java-core
applyJavaNature with default configuration for project beam-sdks-java-core
applyAvroNature with default configuration for project beam-sdks-java-core
Applying build_rules.gradle to beam-examples-java
applyJavaNature with default configuration for project beam-examples-java

FAILURE: Build failed with an exception.

* Where:
Build file 
'
 line: 190

* What went wrong:
A problem occurred evaluating project 
':beam-runners-google-cloud-dataflow-java'.
> Could not find method createJavaQuickstartValidationTask() for arguments 
> [{type=Quickstart, runner=Dataflow, gcpProject=apache-beam-testing, 
> gcsBucket=temp-storage-for-release-validation-tests/nightly-snapshot-validation}]
>  on project ':beam-runners-google-cloud-dataflow-java' of type 
> org.gradle.api.Project.

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Not sending mail to unregistered user sweg...@google.com
Not sending mail to unregistered user eh...@google.com
Not sending mail to unregistered user daniel.o.program...@gmail.com
Not sending mail to unregistered user ankurgoe...@gmail.com
Not sending mail to unregistered user sid...@google.com
Not sending mail to unregistered user 
yifan...@yifanzou-linuxworkstation.sea.corp.google.com
Not sending mail to unregistered user 
re...@relax-macbookpro2.roam.corp.google.com
Not sendi

[jira] [Work logged] (BEAM-4036) Pickler enters infinite recursion with self-referential classes

2018-04-09 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-4036?focusedWorklogId=89236&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-89236
 ]

ASF GitHub Bot logged work on BEAM-4036:


Author: ASF GitHub Bot
Created on: 10/Apr/18 03:35
Start Date: 10/Apr/18 03:35
Worklog Time Spent: 10m 
  Work Description: aaltay commented on issue #5072: [BEAM-4036] Fix 
pickling for "recursive" classes.
URL: https://github.com/apache/beam/pull/5072#issuecomment-379963524
 
 
   There is a lint error:
   
   ```
   Running pycodestyle for module apache_beam:
   apache_beam/internal/module_test.py:74:1: E305 expected 2 blank lines after 
class or function definition, found 1
   Command exited with non-zero status 1
   ```
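The E305 complaint above is purely a layout rule: pycodestyle wants two blank lines between a top-level class or function definition and the code that follows it. A minimal sketch of the E305-clean form (`ModuleTestExample` is an illustrative name, not the actual code from `module_test.py`):

```python
# E305: "expected 2 blank lines after class or function definition".
# The fix is mechanical: insert a second blank line before the first
# top-level statement that follows the definition.
class ModuleTestExample(object):
    pass


instance = ModuleTestExample()  # preceded by two blank lines: no E305
```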





Issue Time Tracking
---

Worklog Id: (was: 89236)
Time Spent: 50m  (was: 40m)

> Pickler enters infinite recursion with self-referential classes
> ---
>
> Key: BEAM-4036
> URL: https://issues.apache.org/jira/browse/BEAM-4036
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-py-core
>Reporter: Chuan Yu Foo
>Assignee: Ahmet Altay
>Priority: Major
>  Time Spent: 50m
>  Remaining Estimate: 0h
>
> The pickler recurses infinitely and dies with maximum recursion limit 
> exceeded when a module contains a self-referential class (or any class which 
> is part of a cycle).
> Here's a minimal example: 
> {code}
> class RecursiveClass(object):
>   SELF_TYPE = None
>   def __init__(self, datum):
>     self.datum = 'RecursiveClass:%s' % datum
> RecursiveClass.SELF_TYPE = RecursiveClass
> {code}
> If this is in a module, then the pickler will enter the infinite recursion 
> when trying to pickle any nested class in that module.
>   
> An actual example is with typing.Type, which is part of a cycle typing.Type 
> -> type -> object -> typing.Type. If a module contains an attribute that 
> refers to typing.Type, such as a type alias, it will trigger this bug.
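The failure mode the issue describes can be reproduced with a plain attribute walker: without a visited set, following `RecursiveClass.SELF_TYPE` loops forever. This is a minimal sketch in pure Python, not Beam's dill-based pickler; the `walk` helper and its memo are illustrative:

```python
class RecursiveClass(object):
    SELF_TYPE = None

    def __init__(self, datum):
        self.datum = 'RecursiveClass:%s' % datum


RecursiveClass.SELF_TYPE = RecursiveClass  # creates the reference cycle


def walk(obj, seen=None):
    """Count reachable classes, guarding against cycles with a visited set.

    Without the `seen` guard, the SELF_TYPE attribute would send this
    into infinite recursion, mirroring the pickler bug above.
    """
    if seen is None:
        seen = set()
    if id(obj) in seen:
        return 0
    seen.add(id(obj))
    count = 1
    for _name, value in vars(obj).items():
        if isinstance(value, type):
            count += walk(value, seen)
    return count
```

`walk(RecursiveClass)` terminates because the second visit to the class is cut off by the memo; the same visited-object bookkeeping is what a serializer needs to survive self-referential classes.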





[jira] [Work logged] (BEAM-3339) Create post-release testing of the nightly snapshots

2018-04-09 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3339?focusedWorklogId=89235&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-89235
 ]

ASF GitHub Bot logged work on BEAM-3339:


Author: ASF GitHub Bot
Created on: 10/Apr/18 03:33
Start Date: 10/Apr/18 03:33
Worklog Time Spent: 10m 
  Work Description: yifanzou commented on issue #4788: [BEAM-3339] Mobile 
gaming automation for Java nightly snapshot on core runners
URL: https://github.com/apache/beam/pull/4788#issuecomment-379963283
 
 
   Run Seed Job




Issue Time Tracking
---

Worklog Id: (was: 89235)
Time Spent: 81h 50m  (was: 81h 40m)

> Create post-release testing of the nightly snapshots
> 
>
> Key: BEAM-3339
> URL: https://issues.apache.org/jira/browse/BEAM-3339
> Project: Beam
>  Issue Type: Improvement
>  Components: testing
>Reporter: Alan Myrvold
>Assignee: Jason Kuster
>Priority: Major
>  Time Spent: 81h 50m
>  Remaining Estimate: 0h
>
> The nightly java snapshots in 
> https://repository.apache.org/content/groups/snapshots/org/apache/beam should 
> be verified by following the 
> https://beam.apache.org/get-started/quickstart-java/ instructions, to verify 
> that the release is usable.





Build failed in Jenkins: beam_PostCommit_Python_ValidatesContainer_Dataflow #96

2018-04-09 Thread Apache Jenkins Server
See 


Changes:

[daniel.o.programmer] [BEAM-3706] Removing SideInputs and Parameters from 
CombinePayload.

[daniel.o.programmer] [BEAM-3706] Removing side inputs from Combine translation 
logic.

[daniel.o.programmer] [BEAM-3706] Attempting to fix a findbug issue.

[daniel.o.programmer] [BEAM-3706] Cleaning up side input code in Flink runner.

[daniel.o.programmer] [BEAM-3706] Fixing issue in direct runner with side input 
combines.

[robertwb] Correctly account for keyword arguments in function calls.

[robertwb] Fix type inference for slicing.

[lcwik] Attempt to produce release artifacts.

[lcwik] Sign the published artifacts.

[lcwik] Rename a few of the projects so that the artifact names are correctly

[lcwik] Rewrite even more names.

[lcwik] fixup! fix a few names

[swegner] Add Apache header into generated POM

[lcwik] Finish renaming the projects so that they produce valid maven artifact

[lcwik] Gate signing/publishing configuration on 'release', 'publishing' project

[swegner] Use correct project description in generated pom

[swegner] Add license to generated poms

[swegner] Add scm information to generated pom

[swegner] Add JIRA information to generated pom

[swegner] Add mailing list information to generated pom

[swegner] Add developer info to generated pom files

[swegner] Add top-level metadata to generate poms

[alan.myrvold] only sign when releasing, to avoid gpg errors

[lcwik] Add exclusion rules that are defined on the configuration and the

[lcwik] Only apply maven-publish / signing plugins if actually needed.

[lcwik] Fix relocated paths after renaming modules.

[swegner] Fix Apache license header in generated POM.

[swegner] fixup: tabs -> spaces

[coheigea] Removing some null guards that are not needed

[lcwik] Publish shaded jars when signing

[lcwik] Migrate back to using compile as the dependency type since our archetypes

[lcwik] Also publish the beam-sdks-java-harness package.

[lcwik] Move signing configuration under same block that controls whether

[swegner] Add proper descriptions to generated pom files.

[lcwik] Minor description clean-ups.

[swegner] fixup: Update project name to new convention

[swegner2] Address some comments on PR/5048 (#19)

[swegner] Move generated pom configuration into build_rules.gradle.

[lcwik] [BEAM-4014] Replace the Java maven based PostCommit with a Gradle based

[amyrvold] [BEAM-3255] Move from maven to gradle with nightly builds

[github] Update job_beam_Release_Gradle_NightlySnapshot.groovy

[lcwik] [BEAM-4014] Remove previous names because this renames the existing job

[lcwik] Get Spark validates runner streaming integration tests to use the

[lcwik] Speed up Spark post commit test run speed by running 4 tests

[lcwik] [BEAM-4014] Fix project evaluation order.

[herohde] Fix bad Gradle Go examples directory

[herohde] [BEAM-4034] Fix hooks panic with Go on Flink

[lcwik] [BEAM-4014] Fix class path for examplesJavaIntegrationTest, also fix

[robertwb] Revert "Revert #4781 which broke Python postsubmits"

[robertwb] Guard side input mutation for Fn API mode only.

[ehudm] Add grpcio-tools to gradle virtualenv.

[ehudm] Allow longs as input to Timestamp.of().

[ccy] Fix missing clock bug in nested TriggerContext

--
[...truncated 30.23 KB...]
  o Added experimental support for import of ORC files into BigQuery.
  o Added Cloud KMS key rotating for customer-managed encryption key
protected BigQuery tables.
  o Added --location flag to specify the geographic location in which
BigQuery jobs will run. No changes are required for commands in the
existing US and EU regions.

  Cloud Datalab
  o Updated the datalab component to the 20180206 release. Released
changes are documented in its tracking issue at
https://github.com/googledatalab/datalab/issues/1945
(https://github.com/googledatalab/datalab/issues/1945).

  Cloud Datastore Emulator
  o Released Cloud Datastore Emulator version 1.4.1.
* Recommended: Use the health check endpoint to confirm emulator
  startup instead of relying on the "Dev App Server is now running."
  message.
* Fixed issue where the emulator server would close HTTP connections
  before shutting down, which was a problem for /shutdown. This issue
  can be tracked at
  
https://github.com/GoogleCloudPlatform/google-cloud-datastore/issues/188
  
(https://github.com/GoogleCloudPlatform/google-cloud-datastore/issues/188).

  Compute Engine
  o Modified the preview field to be optional when importing using gcloud
compute security-policies import or gcloud compute security-policies
create.

  Container Engine
  o Promoted --cluster-secondary-range-name, --create-subnetwork,
--enable-autorepair, --enable-ip-alias,

[jira] [Work logged] (BEAM-4031) Add missing dataflow customization options for Go SDK

2018-04-09 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-4031?focusedWorklogId=89234&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-89234
 ]

ASF GitHub Bot logged work on BEAM-4031:


Author: ASF GitHub Bot
Created on: 10/Apr/18 03:31
Start Date: 10/Apr/18 03:31
Worklog Time Spent: 10m 
  Work Description: jasonkuster commented on issue #5070: [BEAM-4031] Add 
more Go SDK Dataflow options
URL: https://github.com/apache/beam/pull/5070#issuecomment-379963008
 
 
   @tgroh senpai pls notice me




Issue Time Tracking
---

Worklog Id: (was: 89234)
Time Spent: 40m  (was: 0.5h)

> Add missing dataflow customization options for Go SDK
> -
>
> Key: BEAM-4031
> URL: https://issues.apache.org/jira/browse/BEAM-4031
> Project: Beam
>  Issue Type: Improvement
>  Components: sdk-go
>Reporter: Henning Rohde
>Assignee: Henning Rohde
>Priority: Minor
>  Time Spent: 40m
>  Remaining Estimate: 0h
>
> We're missing at least:
> zone
> temp_location
> worker_machine_type
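For context, `zone`, `temp_location`, and `worker_machine_type` are per-job Dataflow settings normally supplied as pipeline flags. The sketch below mimics that flag surface with plain `argparse`; it is illustrative only and is not the Go (or any) Beam SDK's actual options API:

```python
import argparse

# Hypothetical flag parser mirroring the three options the issue says
# the Go SDK was missing. Names match the issue; everything else is
# an assumption for illustration.
parser = argparse.ArgumentParser()
parser.add_argument('--zone', default=None)
parser.add_argument('--temp_location', default=None)
parser.add_argument('--worker_machine_type', default=None)

opts = parser.parse_args([
    '--zone', 'us-central1-f',
    '--temp_location', 'gs://my-bucket/tmp',
    '--worker_machine_type', 'n1-standard-4',
])
```

The bucket name and machine type above are placeholders; the point is only that each option becomes a typed, documented flag rather than a hardcoded default.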





[jira] [Work logged] (BEAM-4031) Add missing dataflow customization options for Go SDK

2018-04-09 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-4031?focusedWorklogId=89233&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-89233
 ]

ASF GitHub Bot logged work on BEAM-4031:


Author: ASF GitHub Bot
Created on: 10/Apr/18 03:30
Start Date: 10/Apr/18 03:30
Worklog Time Spent: 10m 
  Work Description: jasonkuster commented on issue #5070: [BEAM-4031] Add 
more Go SDK Dataflow options
URL: https://github.com/apache/beam/pull/5070#issuecomment-379962838
 
 
   LGTM




Issue Time Tracking
---

Worklog Id: (was: 89233)
Time Spent: 0.5h  (was: 20m)

> Add missing dataflow customization options for Go SDK
> -
>
> Key: BEAM-4031
> URL: https://issues.apache.org/jira/browse/BEAM-4031
> Project: Beam
>  Issue Type: Improvement
>  Components: sdk-go
>Reporter: Henning Rohde
>Assignee: Henning Rohde
>Priority: Minor
>  Time Spent: 0.5h
>  Remaining Estimate: 0h
>
> We're missing at least:
> zone
> temp_location
> worker_machine_type





[jira] [Commented] (BEAM-4031) Add missing dataflow customization options for Go SDK

2018-04-09 Thread Jason Kuster (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-4031?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16431658#comment-16431658
 ] 

Jason Kuster commented on BEAM-4031:


My bad; thanks!

> Add missing dataflow customization options for Go SDK
> -
>
> Key: BEAM-4031
> URL: https://issues.apache.org/jira/browse/BEAM-4031
> Project: Beam
>  Issue Type: Improvement
>  Components: sdk-go
>Reporter: Henning Rohde
>Assignee: Henning Rohde
>Priority: Minor
>  Time Spent: 20m
>  Remaining Estimate: 0h
>
> We're missing at least:
> zone
> temp_location
> worker_machine_type





Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Spark_Gradle #37

2018-04-09 Thread Apache Jenkins Server
See 


--
[...truncated 108.34 KB...]
Gradle Test Executor 16 started executing tests.
Gradle Test Executor 15 finished executing tests.
Gradle Test Executor 16 finished executing tests.
Gradle Test Executor 17 started executing tests.

org.apache.beam.runners.spark.translation.streaming.ResumeFromCheckpointStreamingTest
 STANDARD_ERROR
2018-04-10 03:24:42,650 [Test worker] ERROR 
org.apache.beam.runners.spark.translation.streaming.ResumeFromCheckpointStreamingTest
  - !!! setup
2018-04-10 03:24:42,679 [Test worker] ERROR 
org.apache.beam.runners.spark.translation.streaming.ResumeFromCheckpointStreamingTest
  - !!! /home/jenkins/.gradle/caches/4.5.1/workerMain/gradle-worker.jar
2018-04-10 03:24:42,691 [Test worker] ERROR 
org.apache.beam.runners.spark.translation.streaming.ResumeFromCheckpointStreamingTest
  - !!! 

2018-04-10 03:24:42,694 [Test worker] ERROR 
org.apache.beam.runners.spark.translation.streaming.ResumeFromCheckpointStreamingTest
  - !!! 

2018-04-10 03:24:42,694 [Test worker] ERROR 
org.apache.beam.runners.spark.translation.streaming.ResumeFromCheckpointStreamingTest
  - !!! 

2018-04-10 03:24:42,699 [Test worker] ERROR 
org.apache.beam.runners.spark.translation.streaming.ResumeFromCheckpointStreamingTest
  - !!! 

2018-04-10 03:24:42,701 [Test worker] ERROR 
org.apache.beam.runners.spark.translation.streaming.ResumeFromCheckpointStreamingTest
  - !!! 

2018-04-10 03:24:42,702 [Test worker] ERROR 
org.apache.beam.runners.spark.translation.streaming.ResumeFromCheckpointStreamingTest
  - !!! 

2018-04-10 03:24:42,710 [Test worker] ERROR 
org.apache.beam.runners.spark.translation.streaming.ResumeFromCheckpointStreamingTest
  - !!! 

2018-04-10 03:24:42,712 [Test worker] ERROR 
org.apache.beam.runners.spark.translation.streaming.ResumeFromCheckpointStreamingTest
  - !!! 

2018-04-10 03:24:42,714 [Test worker] ERROR 
org.apache.beam.runners.spark.translation.streaming.ResumeFromCheckpointStreamingTest
  - !!! 

2018-04-10 03:24:42,714 [Test worker] ERROR 
org.apache.beam.runners.spark.translation.streaming.ResumeFromCheckpointStreamingTest
  - !!! 

2018-04-10 03:24:42,715 [Test worker] ERROR 
org.apache.beam.runners.spark.translation.streaming.ResumeFromCheckpointStreamingTest
  - !!! 

2018-04-10 03:24:42,716 [Test worker] ERROR 
org.apache.beam.runners.spark.translation.streaming.ResumeFromCheckpointStreamingTest
  - !!! 
/home/jenkins/.gradle/caches/modules-2/files-2.1/org.apache.spark/spark-streaming_2.11/2.3.0/57da1135f7192a2be85987f1708abf94887f7323/spark-streaming_2.11-2.3.0.jar
2018-04-10 03:24:42,717 [Test worker] ERROR 
org.apache.beam.runners.spark.translation.streaming.ResumeFromCheckpointStreamingTest
  - !!! 
/home/jenkins/.gradle/caches/modules-2/files-2.1/org.apache.spark/spark-core_2.11/2.3.0/9e2bc021bd38b06da2e0a56fdd9d13935503d94/spark-core_2.11-2.3.0.jar
2018-04-10 03:24:42,718 [Test worker] ERROR 
org.apache.beam.runners.spark.translation.streaming.ResumeFromCheckpointStreamingTest
  - !!! 
/home/jenkins/.m2/repository/com/fasterxml/jackson/module/jackson-module-scala_2.11/2.8.9/jac

[jira] [Work logged] (BEAM-3339) Create post-release testing of the nightly snapshots

2018-04-09 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3339?focusedWorklogId=89232&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-89232
 ]

ASF GitHub Bot logged work on BEAM-3339:


Author: ASF GitHub Bot
Created on: 10/Apr/18 03:21
Start Date: 10/Apr/18 03:21
Worklog Time Spent: 10m 
  Work Description: yifanzou commented on issue #4788: [BEAM-3339] Mobile 
gaming automation for Java nightly snapshot on core runners
URL: https://github.com/apache/beam/pull/4788#issuecomment-379961457
 
 
   Run Dataflow PostRelease




Issue Time Tracking
---

Worklog Id: (was: 89232)
Time Spent: 81h 40m  (was: 81.5h)

> Create post-release testing of the nightly snapshots
> 
>
> Key: BEAM-3339
> URL: https://issues.apache.org/jira/browse/BEAM-3339
> Project: Beam
>  Issue Type: Improvement
>  Components: testing
>Reporter: Alan Myrvold
>Assignee: Jason Kuster
>Priority: Major
>  Time Spent: 81h 40m
>  Remaining Estimate: 0h
>
> The nightly java snapshots in 
> https://repository.apache.org/content/groups/snapshots/org/apache/beam should 
> be verified by following the 
> https://beam.apache.org/get-started/quickstart-java/ instructions, to verify 
> that the release is usable.





Build failed in Jenkins: beam_PostRelease_NightlySnapshot #179

2018-04-09 Thread Apache Jenkins Server
See 


Changes:

[altay] Update streaming wordcount example and align with the batch example.

[github] Fix linter error in typehints.

[daniel.o.programmer] [BEAM-3706] Removing SideInputs and Parameters from 
CombinePayload.

[daniel.o.programmer] [BEAM-3706] Removing side inputs from Combine translation 
logic.

[daniel.o.programmer] [BEAM-3706] Attempting to fix a findbug issue.

[daniel.o.programmer] [BEAM-3706] Cleaning up side input code in Flink runner.

[daniel.o.programmer] [BEAM-3706] Fixing issue in direct runner with side input 
combines.

[wcn] Remove include directives for proto well-known-types.

[wcn] Fix golint issues

[tgroh] Reintroduce MetricName#name[space]

[XuMingmin] [BEAM-591] Support custom timestamps & CreateTime support (#4935)

[lcwik] [BEAM-3249] Allow for re-use of dependencies within other projects by

[robertwb] Optimize reshuffle.

[robertwb] Add PipelineRunner.run_async for non-blocking execution.

[boyuanz] Add distribution_counter_microbenchmark to apache_beam.tools.utils

[ehudm] Upgrade virtualenv and tox in Jenkins machines.

[chamikara] [BEAM-3213] MongodbIO performance test (#4859)

[tgroh] Add Components to ExecutableStagePayload

[tgroh] Populate Environment in ExecutableStage

[tgroh] Add getComponents to ExecutableStage

[aaltay] [BEAM-3818] Add support for streaming side inputs in the DirectRunner

[ankurgoenka] Support multiple SDKHarness and SdkWorkerId in RunnerHarness

[ccy] Use utcnow() in determining credential refresh time

[amyrvold] [BEAM-3989] Delete Maven pipeline jobs consistently failing

[chamikara] Revert "[BEAM-2264] Credentials were not being reused between GCS 
calls"

[rmannibucau] [BEAM-3993] read gitignore and add it in rat exclusions

[amyrvold] [BEAM-3987] Use gradle and not maven in

[chamikara] [BEAM-3060] Jenkins configuration allowing to run FilebasedIO tests 
on

[amyrvold] [BEAM-3989] Delete unused pipeline jobs

[sidhom] [BEAM-3249] Add missing gradle artifact ids

[lcwik] [BEAM-3993] Remove duplicate definitions between .gitignore and

[herohde] [BEAM-3982] Register transform types and functions

[wcn] Add godoc for exported methods.

[robertwb] Correctly account for keyword arguments in function calls.

[ankurgoenka] Fixing lint errors

[robertwb] Fix type inference for slicing.

[iemejia] [BEAM-3875] Update Spark runner to use Spark version 2.3.0

[rmannibucau] [BEAM-3409] waitUntilFinish should wait teardown even for the 
direct

[lcwik] BEAM-3256 Add archetype testing/generation to existing GradleBuild

[lcwik] [BEAM-3250] Creating a gradle Jenkins config for Flink PostCommit.

[lcwik] Replace Maven based Flink ValidatesRunner postcommit with Gradle based

[lcwik] [BEAM-3249] Drop Java Maven PreCommit.

[aaltay] Fix Python streaming wordcount IT to unblock PostCommit (#5015)

[wcn] Add float support for the SDK.

[ehudm] Update Python Gradle tasks to run in a venv.

[ehudm] Add Gradle based Python precommit.

[markliu] [BEAM-3946] Fix pubsub_matcher_test which depends on

[amyrvold] Rename flink job to fix seed

[lcwik] Attempt to produce release artifacts.

[ehudm] Add HadoopFileSystemOptions support for Dataflow.

[ehudm] Fix HadoopFileSystem.match bugs.

[ehudm] Test HDFS reads in integration test.

[ehudm] Fix linter errors and add missing license.

[github] [BEAM-3774] Adds support for reading from/writing to more BQ

[lcwik] Sign the published artifacts.

[lcwik] Rename a few of the projects so that the artifact names are correctly

[lcwik] Rewrite even more names.

[lcwik] fixup! fix a few names

[swegner] Add Apache header into generated POM

[lcwik] Finish renaming the projects so that they produce valid maven artifact

[aaltay] [BEAM-3250] Migrate Apex and Gearpump ValidatesRunner tests to Gradle

[robertwb] Secure GRPC channel for SDK worker (#4984)

[herohde] [BEAM-3250] Migrate Dataflow ValidatesRunner test to Gradle

[lcwik] Gate signing/publishing configuration on 'release', 'publishing' project

[herohde] CR: use 4 forks in Dataflow test

[herohde] [BEAM-3250] Migrate Spark ValidatesRunner tests to Gradle

[coheigea] Simplify the Beam and/or SQL Expressions

[amyrvold] [BEAM-2823] Delete failing Beam Windows MavenInstall tests

[swegner] Use correct project description in generated pom

[swegner] Add license to generated poms

[swegner] Add scm information to generated pom

[swegner] Add JIRA information to generated pom

[swegner] Add mailing list information to generated pom

[swegner] Add developer info to generated pom files

[swegner] Add top-level metadata to generate poms

[alan.myrvold] only sign when releasing, to avoid gpg errors

[lcwik] Add exclusion rules that are defined on the configuration and the

[lcwik] Only apply maven-publish / signing plugins if actually needed.

[lcwik] Fix relocated paths after renaming modules.

[swegner] Fix Apache license header in generated POM.

[swegner] fixup: tabs

Build failed in Jenkins: beam_PostCommit_Java_GradleBuild #14

2018-04-09 Thread Apache Jenkins Server
See 


Changes:

[ccy] Fix missing clock bug in nested TriggerContext

--
[...truncated 15.91 MB...]
Apr 10, 2018 2:49:01 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging pipeline description to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testwrite-jenkins-0410024850-e59b967d/output/results/staging/
Apr 10, 2018 2:49:01 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading <80747 bytes, hash YtTgfzcIKoEMwr-nZMKrEQ> to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testwrite-jenkins-0410024850-e59b967d/output/results/staging/pipeline-YtTgfzcIKoEMwr-nZMKrEQ.pb

org.apache.beam.sdk.io.gcp.datastore.V1WriteIT > testE2EV1Write STANDARD_ERROR
Apr 10, 2018 2:49:01 AM 
org.apache.beam.runners.dataflow.DataflowPipelineJob waitUntilFinish
INFO: Job 2018-04-09_19_45_55-2459194202072211087 finished with status DONE.
Apr 10, 2018 2:49:01 AM org.apache.beam.runners.dataflow.TestDataflowRunner 
checkForPAssertSuccess
INFO: Success result for Dataflow job 
2018-04-09_19_45_55-2459194202072211087. Found 0 success, 0 failures out of 0 
expected assertions.

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testWrite STANDARD_OUT
Dataflow SDK version: 2.5.0-SNAPSHOT

org.apache.beam.sdk.io.gcp.datastore.V1WriteIT > testE2EV1Write STANDARD_ERROR
Apr 10, 2018 2:49:02 AM 
org.apache.beam.sdk.io.gcp.datastore.V1TestUtil$V1TestWriter flushBatch
INFO: Writing batch of 500 entities

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testWrite STANDARD_ERROR
Apr 10, 2018 2:49:03 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-04-09_19_49_02-9763438910366993201?project=apache-beam-testing

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testWrite STANDARD_OUT
Submitted job: 2018-04-09_19_49_02-9763438910366993201

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testWrite STANDARD_ERROR
Apr 10, 2018 2:49:03 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel 
--region=us-central1 2018-04-09_19_49_02-9763438910366993201
Apr 10, 2018 2:49:03 AM org.apache.beam.runners.dataflow.TestDataflowRunner 
run
INFO: Running Dataflow job 2018-04-09_19_49_02-9763438910366993201 with 0 
expected assertions.

org.apache.beam.sdk.io.gcp.datastore.V1WriteIT > testE2EV1Write STANDARD_ERROR
Apr 10, 2018 2:49:03 AM 
org.apache.beam.sdk.io.gcp.datastore.V1TestUtil$V1TestWriter flushBatch
INFO: Successfully wrote 500 entities
Apr 10, 2018 2:49:03 AM 
org.apache.beam.sdk.io.gcp.datastore.V1TestUtil$V1TestWriter flushBatch
INFO: Writing batch of 500 entities
Apr 10, 2018 2:49:04 AM 
org.apache.beam.sdk.io.gcp.datastore.V1TestUtil$V1TestWriter flushBatch
INFO: Successfully wrote 500 entities
Apr 10, 2018 2:49:04 AM org.apache.beam.sdk.io.gcp.datastore.V1TestUtil 
deleteAllEntities
Gradle Test Executor 126 finished executing tests.
INFO: Successfully deleted 1000 entities

org.apache.beam.sdk.io.gcp.spanner.SpannerReadIT > testRead STANDARD_ERROR
Apr 10, 2018 2:49:06 AM 
org.apache.beam.runners.dataflow.DataflowPipelineJob waitUntilFinish
INFO: Job 2018-04-09_19_45_17-241409313645717337 finished with status DONE.
Apr 10, 2018 2:49:06 AM org.apache.beam.runners.dataflow.TestDataflowRunner 
checkForPAssertSuccess
INFO: Success result for Dataflow job 
2018-04-09_19_45_17-241409313645717337. Found 1 success, 0 failures out of 1 
expected assertions.
Gradle Test Executor 127 finished executing tests.

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testWrite STANDARD_ERROR
Apr 10, 2018 2:49:19 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-10T02:49:02.307Z: Autoscaling is enabled for job 
2018-04-09_19_49_02-9763438910366993201. The number of workers will be between 
1 and 1000.
Apr 10, 2018 2:49:19 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-10T02:49:02.327Z: Autoscaling was automatically enabled for 
job 2018-04-09_19_49_02-9763438910366993201.
Apr 10, 2018 2:49:19 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-10T02:49:07.798Z: Checking required Cloud APIs are enabled.
Apr 10, 2018 2:49:19 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-10T02:49:07.974Z: Checking permissions granted to controller 
Service Account.
Apr 10, 2018 2:49:19 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018

Build failed in Jenkins: beam_PostCommit_Java_GradleBuild #13

2018-04-09 Thread Apache Jenkins Server
See 


Changes:

[robertwb] Revert "Revert #4781 which broke Python postsubmits"

[robertwb] Guard side input mutation for Fn API mode only.

[ehudm] Allow longs as input to Timestamp.of().

--
[...truncated 16.04 MB...]
Submitted job: 2018-04-09_19_20_45-17155428344831942559

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testWrite STANDARD_ERROR
Apr 10, 2018 2:20:46 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel 
--region=us-central1 2018-04-09_19_20_45-17155428344831942559
Apr 10, 2018 2:20:46 AM org.apache.beam.runners.dataflow.TestDataflowRunner 
run
INFO: Running Dataflow job 2018-04-09_19_20_45-17155428344831942559 with 0 
expected assertions.
Apr 10, 2018 2:20:56 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-10T02:20:45.235Z: Autoscaling is enabled for job 
2018-04-09_19_20_45-17155428344831942559. The number of workers will be between 
1 and 1000.
Apr 10, 2018 2:20:56 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-10T02:20:45.261Z: Autoscaling was automatically enabled for 
job 2018-04-09_19_20_45-17155428344831942559.
Apr 10, 2018 2:20:56 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-10T02:20:47.962Z: Checking required Cloud APIs are enabled.
Apr 10, 2018 2:20:56 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-10T02:20:48.070Z: Checking permissions granted to controller 
Service Account.
Apr 10, 2018 2:20:56 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-10T02:20:51.614Z: Expanding CoGroupByKey operations into 
optimizable parts.
Apr 10, 2018 2:20:56 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-10T02:20:51.819Z: Expanding GroupByKey operations into 
optimizable parts.
Apr 10, 2018 2:20:56 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-10T02:20:51.843Z: Lifting ValueCombiningMappingFns into 
MergeBucketsMappingFns
Apr 10, 2018 2:20:56 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-10T02:20:52.036Z: Fusing adjacent ParDo, Read, Write, and 
Flatten operations
Apr 10, 2018 2:20:56 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-10T02:20:52.068Z: Fusing consumer SpannerIO.Write/Write 
mutations to Cloud Spanner/Read information schema into SpannerIO.Write/Write 
mutations to Cloud Spanner/Create.Values/Read(CreateSource)
Apr 10, 2018 2:20:56 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-10T02:20:52.092Z: Fusing consumer SpannerIO.Write/Write 
mutations to Cloud Spanner/Schema 
View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey+SpannerIO.Write/Write
 mutations to Cloud Spanner/Schema 
View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Partial
 into SpannerIO.Write/Write mutations to Cloud Spanner/Schema 
View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map
Apr 10, 2018 2:20:56 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-10T02:20:52.127Z: Fusing consumer SpannerIO.Write/Write 
mutations to Cloud Spanner/Schema 
View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Extract
 into SpannerIO.Write/Write mutations to Cloud Spanner/Schema 
View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues
Apr 10, 2018 2:20:56 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-10T02:20:52.156Z: Fusing consumer SpannerIO.Write/Write 
mutations to Cloud Spanner/Schema 
View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues
 into SpannerIO.Write/Write mutations to Cloud Spanner/Schema 
View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Read
Apr 10, 2018 2:20:56 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-10T02:20:52.187Z: Fusing consumer SpannerIO.Write/Write 
mutations to Cloud Spanner/Schema 
View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map
 into SpannerIO.Write/Write mutations to Cloud Spanner/Read information schema
Apr 10, 201

Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Spark_Gradle #36

2018-04-09 Thread Apache Jenkins Server
See 


--
[...truncated 107.04 KB...]
Starting process 'Gradle Test Executor 17'. Working directory: 

 Command: /usr/local/asfpackages/java/jdk1.8.0_152/bin/java 
-Dbeam.spark.test.reuseSparkContext=true 
-DbeamTestPipelineOptions=["--runner=TestSparkRunner","--streaming=false","--enableSparkMetricSinks=false"]
 
-Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager
 -Dorg.gradle.native=false -Dspark.ui.enabled=false 
-Dspark.ui.showConsoleProgress=false 
-javaagent:build/tmp/expandedArchives/org.jacoco.agent-0.7.9.jar_17d4cd69b5d9b44b59ac09d2e4756e43/jacocoagent.jar=destfile=build/jacoco/validatesRunnerBatch.exec,append=true,inclnolocationclasses=false,dumponexit=true,output=file,jmx=false
 -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea 
-cp /home/jenkins/.gradle/caches/4.5.1/workerMain/gradle-worker.jar 
worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test 
Executor 17'
Successfully started process 'Gradle Test Executor 17'
Gradle Test Executor 16 finished executing tests.
Gradle Test Executor 17 started executing tests.

org.apache.beam.runners.spark.translation.streaming.ResumeFromCheckpointStreamingTest
 STANDARD_ERROR
2018-04-10 02:13:44,473 [Test worker] ERROR 
org.apache.beam.runners.spark.translation.streaming.ResumeFromCheckpointStreamingTest
  - !!! setup
2018-04-10 02:13:44,480 [Test worker] ERROR 
org.apache.beam.runners.spark.translation.streaming.ResumeFromCheckpointStreamingTest
  - !!! /home/jenkins/.gradle/caches/4.5.1/workerMain/gradle-worker.jar
2018-04-10 02:13:44,481 [Test worker] ERROR 
org.apache.beam.runners.spark.translation.streaming.ResumeFromCheckpointStreamingTest
  - !!! 

2018-04-10 02:13:44,482 [Test worker] ERROR 
org.apache.beam.runners.spark.translation.streaming.ResumeFromCheckpointStreamingTest
  - !!! 

2018-04-10 02:13:44,482 [Test worker] ERROR 
org.apache.beam.runners.spark.translation.streaming.ResumeFromCheckpointStreamingTest
  - !!! 

2018-04-10 02:13:44,483 [Test worker] ERROR 
org.apache.beam.runners.spark.translation.streaming.ResumeFromCheckpointStreamingTest
  - !!! 

2018-04-10 02:13:44,485 [Test worker] ERROR 
org.apache.beam.runners.spark.translation.streaming.ResumeFromCheckpointStreamingTest
  - !!! 

2018-04-10 02:13:44,485 [Test worker] ERROR 
org.apache.beam.runners.spark.translation.streaming.ResumeFromCheckpointStreamingTest
  - !!! 

2018-04-10 02:13:44,489 [Test worker] ERROR 
org.apache.beam.runners.spark.translation.streaming.ResumeFromCheckpointStreamingTest
  - !!! 

2018-04-10 02:13:44,489 [Test worker] ERROR 
org.apache.beam.runners.spark.translation.streaming.ResumeFromCheckpointStreamingTest
  - !!! 

2018-04-10 02:13:44,490 [Test worker] ERROR 
org.apache.beam.runners.spark.translation.streaming.ResumeFromCheckpointStreamingTest
  - !!! 

2018-04-10 02:13:44,490 [Test worker] ERROR 
org.apache.beam.runners.spark.translation.streaming.ResumeFromCheckpointStreamingTest
  - !!! 

2018-04-10 02:13:44,490 [Test worker] ERROR 
org.apache.beam.runners.spark.translation.streaming.ResumeFromCheckpointStreamingTest
  - !!! 


[jira] [Work logged] (BEAM-4034) Hooks panic on flink

2018-04-09 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-4034?focusedWorklogId=89218&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-89218
 ]

ASF GitHub Bot logged work on BEAM-4034:


Author: ASF GitHub Bot
Created on: 10/Apr/18 02:08
Start Date: 10/Apr/18 02:08
Worklog Time Spent: 10m 
  Work Description: herohde commented on issue #5065: [BEAM-4034] Fix hooks 
panic with Go on Flink
URL: https://github.com/apache/beam/pull/5065#issuecomment-379950635
 
 
   In this particular case, I think we do want a warning (or even an error). It's 
an expectation that every runner using the harness serializes hooks, and we 
want to notice if that was not done.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 89218)
Time Spent: 1h 10m  (was: 1h)

> Hooks panic on flink
> 
>
> Key: BEAM-4034
> URL: https://issues.apache.org/jira/browse/BEAM-4034
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-go
>Reporter: Henning Rohde
>Assignee: Henning Rohde
>Priority: Major
> Fix For: 2.5.0
>
>  Time Spent: 1h 10m
>  Remaining Estimate: 0h
>
> It seems the serialized hooks are set only for Dataflow. It should probably 
> not panic if nothing is set. I see the below on Flink.
> $ docker logs 0f5
> 2018/04/09 21:03:48 Initializing Go harness: /opt/apache/beam/boot --id=-123 
> --logging_endpoint=docker.for.mac.host.internal:61723 
> --artifact_endpoint=docker.for.mac.host.internal:61724 
> --provision_endpoint=docker.for.mac.host.internal:61725 
> --control_endpoint=docker.for.mac.host.internal:61721 
> --semi_persist_dir=/tmp/semi_persistent_dir2198305397089608348
> Worker panic: DeserializeHooks failed on input "": unexpected end of JSON input
> goroutine 1 [running]:
> runtime/debug.Stack(0x4f, 0x0, 0x0)
>   /usr/local/go/src/runtime/debug/stack.go:24 +0xa7
> runtime/debug.PrintStack()
>   /usr/local/go/src/runtime/debug/stack.go:16 +0x22
> github.com/apache/beam/sdks/go/pkg/beam/core/runtime/harness/init.hook.func1()
>   
> /Users/herohde/go/src/github.com/apache/beam/sdks/go/pkg/beam/core/runtime/harness/init/init.go:76
>  +0xb6
> panic(0xc65060, 0xc4200402a0)
>   /usr/local/go/src/runtime/panic.go:491 +0x283
> github.com/apache/beam/sdks/go/pkg/beam/core/util/hooks.DeserializeHooksFromOptions()
>   
> /Users/herohde/go/src/github.com/apache/beam/sdks/go/pkg/beam/core/util/hooks/hooks.go:135
>  +0x402
> github.com/apache/beam/sdks/go/pkg/beam/core/runtime/harness.Main(0x136dd00, 
> 0xc420060010, 0x7ffdcfb0dda8, 0x22, 0x7ffdcfb0ddde, 0x22, 0x0, 0x0)
>   
> /Users/herohde/go/src/github.com/apache/beam/sdks/go/pkg/beam/core/runtime/harness/harness.go:41
>  +0x4f
> github.com/apache/beam/sdks/go/pkg/beam/core/runtime/harness/init.hook()
>   
> /Users/herohde/go/src/github.com/apache/beam/sdks/go/pkg/beam/core/runtime/harness/init/init.go:84
>  +0xc1
> github.com/apache/beam/sdks/go/pkg/beam/core/runtime.Init()
>   
> /Users/herohde/go/src/github.com/apache/beam/sdks/go/pkg/beam/core/runtime/init.go:42
>  +0x4c
> github.com/apache/beam/sdks/go/pkg/beam.Init()
>   
> /Users/herohde/go/src/github.com/apache/beam/sdks/go/pkg/beam/forward.go:63 
> +0x20
> main.main()
>   
> /Users/herohde/go/src/github.com/apache/beam/sdks/go/examples/wordcount/wordcount.go:156
>  +0x39
> 2018/04/09 21:03:49 User program exited: exit status 2



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Work logged] (BEAM-4037) Add Python streaming wordcount snippets and test

2018-04-09 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-4037?focusedWorklogId=89219&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-89219
 ]

ASF GitHub Bot logged work on BEAM-4037:


Author: ASF GitHub Bot
Created on: 10/Apr/18 02:08
Start Date: 10/Apr/18 02:08
Worklog Time Spent: 10m 
  Work Description: charlesccychen commented on issue #5071: [BEAM-4037] 
Add streaming wordcount snippets and test
URL: https://github.com/apache/beam/pull/5071#issuecomment-379950759
 
 
   retest this please




Issue Time Tracking
---

Worklog Id: (was: 89219)
Time Spent: 1h  (was: 50m)

> Add Python streaming wordcount snippets and test
> 
>
> Key: BEAM-4037
> URL: https://issues.apache.org/jira/browse/BEAM-4037
> Project: Beam
>  Issue Type: Improvement
>  Components: sdk-py-core
>Reporter: Charles Chen
>Assignee: Charles Chen
>Priority: Major
>  Time Spent: 1h
>  Remaining Estimate: 0h
>
> We should add Python streaming wordcount snippets and tests.  The 
> documentation will refer to these snippets.





Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Spark_Gradle #35

2018-04-09 Thread Apache Jenkins Server
See 


Changes:

[robertwb] Revert "Revert #4781 which broke Python postsubmits"

[robertwb] Guard side input mutation for Fn API mode only.

[ehudm] Allow longs as input to Timestamp.of().

[ccy] Fix missing clock bug in nested TriggerContext

--
[...truncated 1.59 MB...]
at scala.collection.immutable.List.foreach(List.scala:392)
at org.apache.spark.SparkContext.(SparkContext.scala:457)
at 
org.apache.spark.api.java.JavaSparkContext.(JavaSparkContext.scala:58)
at 
org.apache.beam.runners.spark.translation.SparkContextFactory.createSparkContext(SparkContextFactory.java:103)
at 
org.apache.beam.runners.spark.translation.SparkContextFactory.getSparkContext(SparkContextFactory.java:68)
at 
org.apache.beam.runners.spark.translation.streaming.SparkRunnerStreamingContextFactory.call(SparkRunnerStreamingContextFactory.java:79)
at 
org.apache.beam.runners.spark.translation.streaming.SparkRunnerStreamingContextFactory.call(SparkRunnerStreamingContextFactory.java:47)
at 
org.apache.spark.streaming.api.java.JavaStreamingContext$$anonfun$7.apply(JavaStreamingContext.scala:627)
at 
org.apache.spark.streaming.api.java.JavaStreamingContext$$anonfun$7.apply(JavaStreamingContext.scala:626)
at scala.Option.getOrElse(Option.scala:121)
at 
org.apache.spark.streaming.StreamingContext$.getOrCreate(StreamingContext.scala:828)
at 
org.apache.spark.streaming.api.java.JavaStreamingContext$.getOrCreate(JavaStreamingContext.scala:626)
at 
org.apache.spark.streaming.api.java.JavaStreamingContext.getOrCreate(JavaStreamingContext.scala)
at org.apache.beam.runners.spark.SparkRunner.run(SparkRunner.java:169)
at 
org.apache.beam.runners.spark.TestSparkRunner.run(TestSparkRunner.java:123)
at 
org.apache.beam.runners.spark.TestSparkRunner.run(TestSparkRunner.java:83)
at org.apache.beam.sdk.Pipeline.run(Pipeline.java:311)
at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:346)
at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:328)
at 
org.apache.beam.runners.spark.translation.streaming.CreateStreamTest.testDiscardingMode(CreateStreamTest.java:203)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
at 
org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
at 
org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
at 
org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
at org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:48)
at 
org.apache.beam.sdk.testing.TestPipeline$1.evaluate(TestPipeline.java:317)
at 
org.junit.rules.ExpectedException$ExpectedExceptionStatement.evaluate(ExpectedException.java:239)
at org.junit.rules.RunRules.evaluate(RunRules.java:20)
at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
at 
org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
at 
org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
at 
org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecuter.runTestClass(JUnitTestClassExecuter.java:114)
at 
org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecuter.execute(JUnitTestClassExecuter.java:57)
at 
org.gradle.api.internal.tasks.testing.junit.JUnitTestClassProcessor.processTestClass(JUnitTestClassProcessor.java:66)
at 
org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
org.gradle.internal.dispatch.Ref

[jira] [Work logged] (BEAM-4034) Hooks panic on flink

2018-04-09 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-4034?focusedWorklogId=89215&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-89215
 ]

ASF GitHub Bot logged work on BEAM-4034:


Author: ASF GitHub Bot
Created on: 10/Apr/18 01:46
Start Date: 10/Apr/18 01:46
Worklog Time Spent: 10m 
  Work Description: wcn3 commented on issue #5065: [BEAM-4034] Fix hooks 
panic with Go on Flink
URL: https://github.com/apache/beam/pull/5065#issuecomment-379947303
 
 
   Bit late, but could we have GetOptionWithDefault as a vehicle? That way, the 
default value could be "{}" which would unmarshal as empty options, and then we 
wouldn't need the warning paths. WDYT?
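The GetOptionWithDefault idea above can be sketched in Python for illustration (the actual harness code is Go; `get_option_with_default` and the `"hooks"` option key are assumed names, not the SDK's API):

```python
import json

def get_option_with_default(options, key, default):
    # Fall back to the default when the option is unset or empty,
    # mirroring the proposed GetOptionWithDefault helper.
    value = options.get(key)
    return value if value else default

def deserialize_hooks(options):
    # With "{}" as the default, an absent hooks option unmarshals as an
    # empty hook set instead of panicking on empty JSON input.
    return json.loads(get_option_with_default(options, "hooks", "{}"))
```

With this shape, a runner that never serialized hooks simply gets an empty hook set, and no separate warning path is needed.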




Issue Time Tracking
---

Worklog Id: (was: 89215)
Time Spent: 1h  (was: 50m)

> Hooks panic on flink
> 
>
> Key: BEAM-4034
> URL: https://issues.apache.org/jira/browse/BEAM-4034
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-go
>Reporter: Henning Rohde
>Assignee: Henning Rohde
>Priority: Major
> Fix For: 2.5.0
>
>  Time Spent: 1h
>  Remaining Estimate: 0h
>
> It seems the serialized hooks are set only for Dataflow. It should probably 
> not panic if nothing is set. I see the below on Flink.
> $ docker logs 0f5
> 2018/04/09 21:03:48 Initializing Go harness: /opt/apache/beam/boot --id=-123 
> --logging_endpoint=docker.for.mac.host.internal:61723 
> --artifact_endpoint=docker.for.mac.host.internal:61724 
> --provision_endpoint=docker.for.mac.host.internal:61725 
> --control_endpoint=docker.for.mac.host.internal:61721 
> --semi_persist_dir=/tmp/semi_persistent_dir2198305397089608348
> Worker panic: DeserializeHooks failed on input "": unexpected end of JSON input
> goroutine 1 [running]:
> runtime/debug.Stack(0x4f, 0x0, 0x0)
>   /usr/local/go/src/runtime/debug/stack.go:24 +0xa7
> runtime/debug.PrintStack()
>   /usr/local/go/src/runtime/debug/stack.go:16 +0x22
> github.com/apache/beam/sdks/go/pkg/beam/core/runtime/harness/init.hook.func1()
>   
> /Users/herohde/go/src/github.com/apache/beam/sdks/go/pkg/beam/core/runtime/harness/init/init.go:76
>  +0xb6
> panic(0xc65060, 0xc4200402a0)
>   /usr/local/go/src/runtime/panic.go:491 +0x283
> github.com/apache/beam/sdks/go/pkg/beam/core/util/hooks.DeserializeHooksFromOptions()
>   
> /Users/herohde/go/src/github.com/apache/beam/sdks/go/pkg/beam/core/util/hooks/hooks.go:135
>  +0x402
> github.com/apache/beam/sdks/go/pkg/beam/core/runtime/harness.Main(0x136dd00, 
> 0xc420060010, 0x7ffdcfb0dda8, 0x22, 0x7ffdcfb0ddde, 0x22, 0x0, 0x0)
>   
> /Users/herohde/go/src/github.com/apache/beam/sdks/go/pkg/beam/core/runtime/harness/harness.go:41
>  +0x4f
> github.com/apache/beam/sdks/go/pkg/beam/core/runtime/harness/init.hook()
>   
> /Users/herohde/go/src/github.com/apache/beam/sdks/go/pkg/beam/core/runtime/harness/init/init.go:84
>  +0xc1
> github.com/apache/beam/sdks/go/pkg/beam/core/runtime.Init()
>   
> /Users/herohde/go/src/github.com/apache/beam/sdks/go/pkg/beam/core/runtime/init.go:42
>  +0x4c
> github.com/apache/beam/sdks/go/pkg/beam.Init()
>   
> /Users/herohde/go/src/github.com/apache/beam/sdks/go/pkg/beam/forward.go:63 
> +0x20
> main.main()
>   
> /Users/herohde/go/src/github.com/apache/beam/sdks/go/examples/wordcount/wordcount.go:156
>  +0x39
> 2018/04/09 21:03:49 User program exited: exit status 2





[jira] [Created] (BEAM-4038) Support Kafka Headers in KafkaIO

2018-04-09 Thread Geet Kumar (JIRA)
Geet Kumar created BEAM-4038:


 Summary: Support Kafka Headers in KafkaIO
 Key: BEAM-4038
 URL: https://issues.apache.org/jira/browse/BEAM-4038
 Project: Beam
  Issue Type: New Feature
  Components: io-java-kafka
Reporter: Geet Kumar
Assignee: Raghu Angadi


Headers have been added to Kafka Consumer/Producer records (KAFKA-4208). The 
purpose of this JIRA is to support this feature in KafkaIO.  

 





Build failed in Jenkins: beam_PostCommit_Java_GradleBuild #12

2018-04-09 Thread Apache Jenkins Server
See 


Changes:

[herohde] [BEAM-4034] Fix hooks panic with Go on Flink

--
[...truncated 16.16 MB...]
Apr 10, 2018 1:24:57 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample 
as view/Flatten.PCollections as step s25
Apr 10, 2018 1:24:57 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample 
as view/CreateDataflowView as step s26
Apr 10, 2018 1:24:57 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Partition 
input as step s27
Apr 10, 2018 1:24:57 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Group by 
partition as step s28
Apr 10, 2018 1:24:57 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Batch 
mutations together as step s29
Apr 10, 2018 1:24:57 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Write 
mutations to Spanner as step s30
Apr 10, 2018 1:24:57 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging pipeline description to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testwrite-jenkins-0410012453-3a21c811/output/results/staging/
Apr 10, 2018 1:24:57 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading <80747 bytes, hash h_UtfCnQ4kHmHU9ZFLWNMQ> to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testwrite-jenkins-0410012453-3a21c811/output/results/staging/pipeline-h_UtfCnQ4kHmHU9ZFLWNMQ.pb

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testWrite STANDARD_OUT
Dataflow SDK version: 2.5.0-SNAPSHOT

org.apache.beam.sdk.io.gcp.datastore.V1WriteIT > testE2EV1Write STANDARD_ERROR
Apr 10, 2018 1:24:59 AM 
org.apache.beam.runners.dataflow.DataflowPipelineJob waitUntilFinish
INFO: Job 2018-04-09_18_22_00-16544945643193247653 finished with status 
DONE.
Apr 10, 2018 1:24:59 AM org.apache.beam.runners.dataflow.TestDataflowRunner 
checkForPAssertSuccess
INFO: Success result for Dataflow job 
2018-04-09_18_22_00-16544945643193247653. Found 0 success, 0 failures out of 0 
expected assertions.

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testWrite STANDARD_ERROR
Apr 10, 2018 1:24:59 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-04-09_18_24_58-17834478248174554934?project=apache-beam-testing

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testWrite STANDARD_OUT
Submitted job: 2018-04-09_18_24_58-17834478248174554934

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testWrite STANDARD_ERROR
Apr 10, 2018 1:24:59 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel 
--region=us-central1 2018-04-09_18_24_58-17834478248174554934
Apr 10, 2018 1:24:59 AM org.apache.beam.runners.dataflow.TestDataflowRunner 
run
INFO: Running Dataflow job 2018-04-09_18_24_58-17834478248174554934 with 0 
expected assertions.

org.apache.beam.sdk.io.gcp.datastore.V1WriteIT > testE2EV1Write STANDARD_ERROR
Apr 10, 2018 1:25:00 AM 
org.apache.beam.sdk.io.gcp.datastore.V1TestUtil$V1TestWriter flushBatch
INFO: Writing batch of 500 entities
Apr 10, 2018 1:25:00 AM 
org.apache.beam.sdk.io.gcp.datastore.V1TestUtil$V1TestWriter flushBatch
INFO: Successfully wrote 500 entities
Apr 10, 2018 1:25:01 AM 
org.apache.beam.sdk.io.gcp.datastore.V1TestUtil$V1TestWriter flushBatch
INFO: Writing batch of 500 entities
Apr 10, 2018 1:25:01 AM 
org.apache.beam.sdk.io.gcp.datastore.V1TestUtil$V1TestWriter flushBatch
INFO: Successfully wrote 500 entities
Apr 10, 2018 1:25:01 AM org.apache.beam.sdk.io.gcp.datastore.V1TestUtil 
deleteAllEntities
INFO: Successfully deleted 1000 entities
Gradle Test Executor 126 finished executing tests.

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testWrite STANDARD_ERROR
Apr 10, 2018 1:25:07 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-10T01:24:58.425Z: Autoscaling is enabled for job 
2018-04-09_18_24_58-17834478248174554934. The number of workers will be between 
1 and 1000.
Apr 10, 2018 1:25:07 AM 
org.apache.beam.runners.dataflow.uti

Build failed in Jenkins: beam_PerformanceTests_Spark #1571

2018-04-09 Thread Apache Jenkins Server
See 


Changes:

[daniel.o.programmer] [BEAM-3706] Removing SideInputs and Parameters from 
CombinePayload.

[daniel.o.programmer] [BEAM-3706] Removing side inputs from Combine translation 
logic.

[daniel.o.programmer] [BEAM-3706] Attempting to fix a findbug issue.

[daniel.o.programmer] [BEAM-3706] Cleaning up side input code in Flink runner.

[daniel.o.programmer] [BEAM-3706] Fixing issue in direct runner with side input 
combines.

[robertwb] Correctly account for keyword arguments in function calls.

[robertwb] Fix type inference for slicing.

[coheigea] Removing some null guards that are not needed

[lcwik] Get Spark validates runner streaming integration tests to use the

[lcwik] Speed up Spark post commit test run speed by running 4 tests

[lcwik] [BEAM-4014] Fix project evaluation order.

[herohde] Fix bad Gradle Go examples directory

[herohde] [BEAM-4034] Fix hooks panic with Go on Flink

[lcwik] [BEAM-4014] Fix class path for examplesJavaIntegrationTest, also fix

[robertwb] Revert "Revert #4781 which broke Python postsubmits"

[robertwb] Guard side input mutation for Fn API mode only.

[ehudm] Add grpcio-tools to gradle virtualenv.

[ehudm] Allow longs as input to Timestamp.of().

--
[...truncated 89.24 KB...]
'apache-beam-testing:bqjob_r57966c435cfd51de_0162ad17f212_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: 
/usr/lib/google-cloud-sdk/platform/bq/third_party/oauth2client/contrib/gce.py:73:
 UserWarning: You have requested explicit scopes to be used with a GCE service 
account.
Using this argument will have no effect on the actual scopes for tokens
requested. These scopes are set at VM instance creation time and
can't be overridden in the request.

  warnings.warn(_SCOPES_WARNING)
Upload complete.Waiting on bqjob_r57966c435cfd51de_0162ad17f212_1 ... (0s) 
Current status: RUNNING 
 Waiting on 
bqjob_r57966c435cfd51de_0162ad17f212_1 ... (0s) Current status: DONE   
2018-04-10 01:07:32,466 8a30c321 MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.

2018-04-10 01:07:49,649 8a30c321 MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-04-10 01:07:52,647 8a30c321 MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 

  ReturnCode:1
STDOUT: 

BigQuery error in load operation: Error processing job
'apache-beam-testing:bqjob_r6a1c42b91c8f2163_0162ad184053_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: 
/usr/lib/google-cloud-sdk/platform/bq/third_party/oauth2client/contrib/gce.py:73:
 UserWarning: You have requested explicit scopes to be used with a GCE service 
account.
Using this argument will have no effect on the actual scopes for tokens
requested. These scopes are set at VM instance creation time and
can't be overridden in the request.

  warnings.warn(_SCOPES_WARNING)
Upload complete.Waiting on bqjob_r6a1c42b91c8f2163_0162ad184053_1 ... (0s) 
Current status: RUNNING 
 Waiting on 
bqjob_r6a1c42b91c8f2163_0162ad184053_1 ... (0s) Current status: DONE   
2018-04-10 01:07:52,647 8a30c321 MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.

2018-04-10 01:08:14,377 8a30c321 MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-04-10 01:08:17,762 8a30c321 MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 

  ReturnCode:1
STDOUT: 

BigQuery error in load operation: Error processing job
'apache-beam-testing:bqjob_r7e6fbc91d873fc66_0162ad18a1cd_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: 
/usr/lib/google-cloud-sdk/platform/bq/third_party/oauth2client/contrib/gce.py:73:
 UserWarning: You have requested explicit scopes to be used with a GCE service 
account.
Using this argument will have no effect on the actual scopes for tokens
requested. These scopes are set at VM instance creation time and
can't be overridden in the request.

  warnings.warn(_SCOPES_WAR
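The repeated `bq load` failures above suggest the `pkb_results` JSON carries the `timestamp` field as an epoch-seconds float, which schema autodetect infers as FLOAT while the existing table column is TIMESTAMP. A minimal sketch of normalizing records before loading (the field name and record shape are assumptions):

```python
import datetime
import json

def normalize_timestamp(record):
    # Render epoch seconds as RFC 3339 so BigQuery autodetect infers
    # TIMESTAMP rather than FLOAT for newline-delimited JSON loads.
    ts = datetime.datetime.fromtimestamp(
        record["timestamp"], tz=datetime.timezone.utc)
    record["timestamp"] = ts.strftime("%Y-%m-%dT%H:%M:%SZ")
    return record

# Each output line is one NDJSON record ready for `bq load`.
line = json.dumps(normalize_timestamp({"timestamp": 0.0, "metric": "latency"}))
```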

[jira] [Work logged] (BEAM-4036) Pickler enters infinite recursion with self-referential classes

2018-04-09 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-4036?focusedWorklogId=89210&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-89210
 ]

ASF GitHub Bot logged work on BEAM-4036:


Author: ASF GitHub Bot
Created on: 10/Apr/18 01:09
Start Date: 10/Apr/18 01:09
Worklog Time Spent: 10m 
  Work Description: chuanyu commented on issue #5072: [BEAM-4036] Fix 
pickling for "recursive" classes.
URL: https://github.com/apache/beam/pull/5072#issuecomment-379941565
 
 
   Thanks for the review, @aaltay! Addressed your comments. PTAL.




Issue Time Tracking
---

Worklog Id: (was: 89210)
Time Spent: 40m  (was: 0.5h)

> Pickler enters infinite recursion with self-referential classes
> ---
>
> Key: BEAM-4036
> URL: https://issues.apache.org/jira/browse/BEAM-4036
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-py-core
>Reporter: Chuan Yu Foo
>Assignee: Ahmet Altay
>Priority: Major
>  Time Spent: 40m
>  Remaining Estimate: 0h
>
> The pickler recurses infinitely and dies with maximum recursion limit 
> exceeded when a module contains a self-referential class (or any class which 
> is part of a cycle).
> Here's a minimal example: 
> {code}
> class RecursiveClass(object):
>   SELF_TYPE = None
>   def __init__(self, datum):
>     self.datum = 'RecursiveClass:%s' % datum
>
> RecursiveClass.SELF_TYPE = RecursiveClass
> {code}
> If this is in a module, then the pickler will enter the infinite recursion 
> when trying to pickle any nested class in that module.
>   
> An actual example is with typing.Type, which is part of a cycle typing.Type 
> -> type -> object -> typing.Type. If a module contains an attribute that 
> refers to typing.Type, such as a type alias, it will trigger this bug.
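The cycle-breaking guard described above can be sketched with a memo set (illustrative only; the real fix lives in Beam's dill-based pickler, and `collect_referenced_classes` is a hypothetical name):

```python
def collect_referenced_classes(cls, seen=None):
    # Walk class attributes, collecting every class reachable from cls.
    # The `seen` set is the memo that stops infinite recursion when a
    # class is (transitively) part of a reference cycle.
    if seen is None:
        seen = set()
    if cls in seen:
        return seen  # already visited: break the cycle here
    seen.add(cls)
    for value in vars(cls).values():
        if isinstance(value, type):
            collect_referenced_classes(value, seen)
    return seen

class RecursiveClass(object):
    SELF_TYPE = None

RecursiveClass.SELF_TYPE = RecursiveClass  # the self-referential cycle

# Without the memo, walking RecursiveClass would recurse forever.
```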





[beam] branch master updated (9df0307 -> fb7c39d)

2018-04-09 Thread altay
This is an automated email from the ASF dual-hosted git repository.

altay pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git.


from 9df0307  Merge pull request #4983 [BEAM-2927] Re-enable side inputs 
for Fn API on Dataflow
 add d3aa421  Fix missing clock bug in nested TriggerContext
 new fb7c39d  Merge pull request #5073 from 
charlesccychen/fix-trigger-clock-bug

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 sdks/python/apache_beam/transforms/trigger.py | 7 +--
 1 file changed, 5 insertions(+), 2 deletions(-)

-- 
To stop receiving notification emails like this one, please contact
al...@apache.org.


[jira] [Work logged] (BEAM-4037) Add Python streaming wordcount snippets and test

2018-04-09 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-4037?focusedWorklogId=89209&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-89209
 ]

ASF GitHub Bot logged work on BEAM-4037:


Author: ASF GitHub Bot
Created on: 10/Apr/18 01:04
Start Date: 10/Apr/18 01:04
Worklog Time Spent: 10m 
  Work Description: aaltay commented on issue #5071: [BEAM-4037] Add 
streaming wordcount snippets and test
URL: https://github.com/apache/beam/pull/5071#issuecomment-379940913
 
 
   Thank you. I can merge once tests pass.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 89209)
Time Spent: 50m  (was: 40m)

> Add Python streaming wordcount snippets and test
> 
>
> Key: BEAM-4037
> URL: https://issues.apache.org/jira/browse/BEAM-4037
> Project: Beam
>  Issue Type: Improvement
>  Components: sdk-py-core
>Reporter: Charles Chen
>Assignee: Charles Chen
>Priority: Major
>  Time Spent: 50m
>  Remaining Estimate: 0h
>
> We should add Python streaming wordcount snippets and tests.  The 
> documentation will refer to these snippets.
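
The Beam snippets themselves live in the docs, but the computation a fixed-window streaming word count performs can be sketched in plain Python (the names and window size here are illustrative, not the Beam API): timestamped lines are bucketed into fixed windows and word counts are emitted per window.

```python
from collections import Counter, defaultdict

def windowed_wordcount(timestamped_lines, window_size=10):
    """Bucket (timestamp, line) pairs into fixed windows and count words per window."""
    windows = defaultdict(Counter)
    for ts, line in timestamped_lines:
        window_start = ts - (ts % window_size)  # e.g. ts=12, size=10 -> window [10, 20)
        windows[window_start].update(line.lower().split())
    return dict(windows)

stream = [(1, 'hello world'), (4, 'hello beam'), (12, 'world')]
result = windowed_wordcount(stream)
# window [0, 10): hello=2, world=1, beam=1; window [10, 20): world=1
```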



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[beam] 01/01: Merge pull request #5073 from charlesccychen/fix-trigger-clock-bug

2018-04-09 Thread altay
This is an automated email from the ASF dual-hosted git repository.

altay pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git

commit fb7c39d087e64c413731abb90ee7aaedcd624f74
Merge: 9df0307 d3aa421
Author: Ahmet Altay 
AuthorDate: Mon Apr 9 18:04:17 2018 -0700

Merge pull request #5073 from charlesccychen/fix-trigger-clock-bug

Fix missing clock bug in nested TriggerContext

 sdks/python/apache_beam/transforms/trigger.py | 7 +--
 1 file changed, 5 insertions(+), 2 deletions(-)

-- 
To stop receiving notification emails like this one, please contact
al...@apache.org.


Build failed in Jenkins: beam_PerformanceTests_MongoDBIO_IT #31

2018-04-09 Thread Apache Jenkins Server
See 


Changes:

[daniel.o.programmer] [BEAM-3706] Removing SideInputs and Parameters from 
CombinePayload.

[daniel.o.programmer] [BEAM-3706] Removing side inputs from Combine translation 
logic.

[daniel.o.programmer] [BEAM-3706] Attempting to fix a findbug issue.

[daniel.o.programmer] [BEAM-3706] Cleaning up side input code in Flink runner.

[daniel.o.programmer] [BEAM-3706] Fixing issue in direct runner with side input 
combines.

[robertwb] Correctly account for keyword arguments in function calls.

[robertwb] Fix type inference for slicing.

[coheigea] Removing some null guards that are not needed

[lcwik] Get Spark validates runner streaming integration tests to use the

[lcwik] Speed up Spark post commit test run speed by running 4 tests

[lcwik] [BEAM-4014] Fix project evaluation order.

[herohde] Fix bad Gradle Go examples directory

[herohde] [BEAM-4034] Fix hooks panic with Go on Flink

[lcwik] [BEAM-4014] Fix class path for examplesJavaIntegrationTest, also fix

[robertwb] Revert "Revert #4781 which broke Python postsubmits"

[robertwb] Guard side input mutation for Fn API mode only.

[ehudm] Add grpcio-tools to gradle virtualenv.

[ehudm] Allow longs as input to Timestamp.of().

--
[...truncated 760.40 KB...]
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
com.mongodb.MongoTimeoutException: Timed out after 3 ms while waiting for a 
server that matches WritableServerSelector. Client view of cluster state is 
{type=UNKNOWN, servers=[{address=35.224.217.92:27017, type=UNKNOWN, 
state=CONNECTING, exception={com.mongodb.MongoSocketOpenException: Exception 
opening socket}, caused by {java.net.SocketTimeoutException: connect timed 
out}}]
at 
com.mongodb.connection.BaseCluster.createTimeoutException(BaseCluster.java:369)
at com.mongodb.connection.BaseCluster.selectServer(BaseCluster.java:101)
at 
com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.(ClusterBinding.java:75)
at 
com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.(ClusterBinding.java:71)
at 
com.mongodb.binding.ClusterBinding.getWriteConnectionSource(ClusterBinding.java:68)
at 
com.mongodb.operation.OperationHelper.withConnection(OperationHelper.java:219)
at 
com.mongodb.operation.MixedBulkWriteOperation.execute(MixedBulkWriteOperation.java:168)
at 
com.mongodb.operation.MixedBulkWriteOperation.execute(MixedBulkWriteOperation.java:74)
at com.mongodb.Mongo.execute(Mongo.java:781)
at com.mongodb.Mongo$2.execute(Mongo.java:764)
at 
com.mongodb.MongoCollectionImpl.insertMany(MongoCollectionImpl.java:323)
at 
com.mongodb.MongoCollectionImpl.insertMany(MongoCollectionImpl.java:311)
at 
org.apache.beam.sdk.io.mongodb.MongoDbIO$Write$WriteFn.flush(MongoDbIO.java:667)
at 
org.apache.beam.sdk.io.mongodb.MongoDbIO$Write$WriteFn.processElement(MongoDbIO.java:652)
com.mongodb.MongoTimeoutException: Timed out after 3 ms while waiting for a 
server that matches WritableServerSelector. Client view of cluster state is 
{type=UNKNOWN, servers=[{address=35.224.217.92:27017, type=UNKNOWN, 
state=CONNECTING, exception={com.mongodb.MongoSocketOpenException: Exception 
opening socket}, caused by {java.net.SocketTimeoutException: connect timed 
out}}]
at 
com.mongodb.connection.BaseCluster.createTimeoutException(BaseCluster.java:369)
at com.mongodb.connection.BaseCluster.selectServer(BaseCluster.java:101)
at 
com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.(ClusterBinding.java:75)
at 
com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.(ClusterBinding.java:71)
at 
com.mongodb.binding.ClusterBinding.getWriteConnectionSource(ClusterBinding.java:68)
at 
com.mongodb.operation.OperationHelper.withConnection(OperationHelper.java:219)
at 
com.mongodb.operation.MixedBulkWriteOperation.execute(MixedBulkWriteOperation.java:168)
at 
com.mongodb.operation.MixedBulkWriteOperation.execute(MixedBulkWriteOperation.java:74)
at com.mongodb.Mongo.execute(Mongo.java:781)
at com.mongodb.Mongo$2.execute(Mongo.java:764)
at 
com.mongodb.MongoCollectionImpl.insertMany(MongoCollectionImpl.java:323)
at 
com.mongodb.MongoCollectionImpl.insertMany(MongoCollectionImpl.java:311)
at 
org.apache.beam.sdk.io.mongodb.MongoDbIO$Write$WriteFn.flush(MongoDbIO.java:667)
at 
org.apache.beam.sdk.io.mongodb.MongoDbIO$Write$WriteFn.processElement(MongoDbIO.java:652)
com.mongodb.MongoTimeoutException: Timed out after 3 ms whi

Jenkins build is back to normal : beam_PerformanceTests_XmlIOIT_HDFS #29

2018-04-09 Thread Apache Jenkins Server
See 




Jenkins build is back to normal : beam_PostCommit_Java_ValidatesRunner_Spark_Gradle #34

2018-04-09 Thread Apache Jenkins Server
See 




[jira] [Work logged] (BEAM-4036) Pickler enters infinite recursion with self-referential classes

2018-04-09 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-4036?focusedWorklogId=89208&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-89208
 ]

ASF GitHub Bot logged work on BEAM-4036:


Author: ASF GitHub Bot
Created on: 10/Apr/18 00:54
Start Date: 10/Apr/18 00:54
Worklog Time Spent: 10m 
  Work Description: aaltay commented on a change in pull request #5072: 
[BEAM-4036] Fix pickling for "recursive" classes.
URL: https://github.com/apache/beam/pull/5072#discussion_r180270969
 
 

 ##
 File path: sdks/python/apache_beam/internal/pickler.py
 ##
 @@ -48,7 +48,12 @@ def _is_nested_class(cls):
 def _find_containing_class(nested_class):
   """Finds containing class of a nestec class passed as argument."""
 
+  seen = {}
 
 Review comment:
   You could use a set instead. Such as:
   
   ```
   seen = set()
   ...
   if outer in seen:
 return
   seen.add(outer)
   ```
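
Membership is all the guard needs, which is why a set suffices here. A standalone sketch of the pattern under discussion (illustrative names, not the actual `_find_containing_class` code): follow containing-class links, and bail out if a link chain revisits a node.

```python
def outermost(nested, lookup_outer):
    """Follow 'containing' links from nested; a seen set stops cyclic chains."""
    seen = set()
    current = nested
    while True:
        if current in seen:
            return None           # cycle detected: give up instead of looping
        seen.add(current)
        outer = lookup_outer(current)
        if outer is None:
            return current        # reached the top of the chain
        current = outer

# A chain A -> B -> A would loop forever without the guard:
links = {'A': 'B', 'B': 'A', 'C': None}
print(outermost('A', links.get))  # → None (cycle)
print(outermost('C', links.get))  # → C (no cycle)
```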


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 89208)
Time Spent: 0.5h  (was: 20m)

> Pickler enters infinite recursion with self-referential classes
> ---
>
> Key: BEAM-4036
> URL: https://issues.apache.org/jira/browse/BEAM-4036
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-py-core
>Reporter: Chuan Yu Foo
>Assignee: Ahmet Altay
>Priority: Major
>  Time Spent: 0.5h
>  Remaining Estimate: 0h
>
> The pickler recurses infinitely and dies with maximum recursion limit 
> exceeded when a module contains a self-referential class (or any class which 
> is part of a cycle).
> Here's a minimal example: 
> {code}
> class RecursiveClass(object):
>   SELF_TYPE = None
>   def __init__(self, datum):
>     self.datum = 'RecursiveClass:%s' % datum
> RecursiveClass.SELF_TYPE = RecursiveClass
> {code}
> If this is in a module, then the pickler will enter the infinite recursion 
> when trying to pickle any nested class in that module.
>   
> An actual example is with typing.Type, which is part of a cycle typing.Type 
> -> type -> object -> typing.Type. If a module contains an attribute that 
> refers to typing.Type, such as a type alias, it will trigger this bug.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Work logged] (BEAM-4036) Pickler enters infinite recursion with self-referential classes

2018-04-09 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-4036?focusedWorklogId=89207&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-89207
 ]

ASF GitHub Bot logged work on BEAM-4036:


Author: ASF GitHub Bot
Created on: 10/Apr/18 00:54
Start Date: 10/Apr/18 00:54
Worklog Time Spent: 10m 
  Work Description: aaltay commented on a change in pull request #5072: 
[BEAM-4036] Fix pickling for "recursive" classes.
URL: https://github.com/apache/beam/pull/5072#discussion_r180271017
 
 

 ##
 File path: sdks/python/apache_beam/internal/pickler.py
 ##
 @@ -48,7 +48,12 @@ def _is_nested_class(cls):
 def _find_containing_class(nested_class):
   """Finds containing class of a nestec class passed as argument."""
 
 Review comment:
   Could you also fix the typo here: `nestec class` -> `nested class`


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 89207)
Time Spent: 20m  (was: 10m)

> Pickler enters infinite recursion with self-referential classes
> ---
>
> Key: BEAM-4036
> URL: https://issues.apache.org/jira/browse/BEAM-4036
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-py-core
>Reporter: Chuan Yu Foo
>Assignee: Ahmet Altay
>Priority: Major
>  Time Spent: 20m
>  Remaining Estimate: 0h
>
> The pickler recurses infinitely and dies with maximum recursion limit 
> exceeded when a module contains a self-referential class (or any class which 
> is part of a cycle).
> Here's a minimal example: 
> {code}
> class RecursiveClass(object):
>   SELF_TYPE = None
>   def __init__(self, datum):
>     self.datum = 'RecursiveClass:%s' % datum
> RecursiveClass.SELF_TYPE = RecursiveClass
> {code}
> If this is in a module, then the pickler will enter the infinite recursion 
> when trying to pickle any nested class in that module.
>   
> An actual example is with typing.Type, which is part of a cycle typing.Type 
> -> type -> object -> typing.Type. If a module contains an attribute that 
> refers to typing.Type, such as a type alias, it will trigger this bug.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Build failed in Jenkins: beam_PerformanceTests_Python #1127

2018-04-09 Thread Apache Jenkins Server
See 


Changes:

[daniel.o.programmer] [BEAM-3706] Removing SideInputs and Parameters from 
CombinePayload.

[daniel.o.programmer] [BEAM-3706] Removing side inputs from Combine translation 
logic.

[daniel.o.programmer] [BEAM-3706] Attempting to fix a findbug issue.

[daniel.o.programmer] [BEAM-3706] Cleaning up side input code in Flink runner.

[daniel.o.programmer] [BEAM-3706] Fixing issue in direct runner with side input 
combines.

[robertwb] Correctly account for keyword arguments in function calls.

[robertwb] Fix type inference for slicing.

[coheigea] Removing some null guards that are not needed

[lcwik] Get Spark validates runner streaming integration tests to use the

[lcwik] Speed up Spark post commit test run speed by running 4 tests

[lcwik] [BEAM-4014] Fix project evaluation order.

[herohde] Fix bad Gradle Go examples directory

[herohde] [BEAM-4034] Fix hooks panic with Go on Flink

[lcwik] [BEAM-4014] Fix class path for examplesJavaIntegrationTest, also fix

[robertwb] Revert "Revert #4781 which broke Python postsubmits"

[robertwb] Guard side input mutation for Fn API mode only.

[ehudm] Add grpcio-tools to gradle virtualenv.

[ehudm] Allow longs as input to Timestamp.of().

--
[...truncated 60.42 KB...]
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 6 resources
[INFO] 
[INFO] --- maven-assembly-plugin:3.1.0:single (export-go-pkg-sources) @ 
beam-sdks-go ---
[INFO] Reading assembly descriptor: descriptor.xml
[INFO] Building zip: 

[INFO] 
[INFO] --- maven-remote-resources-plugin:1.5:process (process-resource-bundles) 
@ beam-sdks-go ---
[INFO] 
[INFO] --- mvn-golang-wrapper:2.1.6:get (go-get-imports) @ beam-sdks-go ---
[INFO] Prepared command line : bin/go get google.golang.org/grpc 
golang.org/x/oauth2/google google.golang.org/api/storage/v1 
github.com/spf13/cobra cloud.google.com/go/bigquery 
google.golang.org/api/googleapi google.golang.org/api/dataflow/v1b3
[INFO] 
[INFO] --- mvn-golang-wrapper:2.1.6:build (go-build) @ beam-sdks-go ---
[INFO] Prepared command line : bin/go build -buildmode=default -o 

 github.com/apache/beam/sdks/go/cmd/beamctl
[INFO] The Result file has been successfuly created : 

[INFO] 
[INFO] --- mvn-golang-wrapper:2.1.6:build (go-build-linux-amd64) @ beam-sdks-go 
---
[INFO] Prepared command line : bin/go build -buildmode=default -o 

 github.com/apache/beam/sdks/go/cmd/beamctl
[INFO] The Result file has been successfuly created : 

[INFO] 
[INFO] --- maven-checkstyle-plugin:3.0.0:check (default) @ beam-sdks-go ---
[INFO] 
[INFO] --- mvn-golang-wrapper:2.1.6:test (go-test) @ beam-sdks-go ---
[INFO] Prepared command line : bin/go test ./...
[INFO] 
[INFO] -Exec.Out-
[INFO] ?   github.com/apache/beam/sdks/go/cmd/beamctl  [no test files]
[INFO] ?   github.com/apache/beam/sdks/go/cmd/beamctl/cmd  [no test files]
[INFO] ?   github.com/apache/beam/sdks/go/cmd/specialize   [no test files]
[INFO] ?   github.com/apache/beam/sdks/go/cmd/symtab   [no test files]
[INFO] ok   github.com/apache/beam/sdks/go/pkg/beam 0.043s
[INFO] ok   github.com/apache/beam/sdks/go/pkg/beam/artifact0.118s
[INFO] 
[ERROR] 
[ERROR] -Exec.Err-
[ERROR] # github.com/apache/beam/sdks/go/pkg/beam/util/gcsx
[ERROR] github.com/apache/beam/sdks/go/pkg/beam/util/gcsx/gcs.go:46:37: 
undefined: option.WithoutAuthentication
[ERROR] 
[INFO] 
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Beam :: Parent .. SUCCESS [  3.807 s]
[INFO] Apache Beam :: SDKs :: Java :: Build Tools . SUCCESS [  3.319 s]
[INFO] Apache Beam :: Model ... SUCCESS [  0.086 s]
[INFO] Apache Beam :: Model :: Pipeline ... SUCCESS [ 10.720 s]
[INFO] Apache Beam :: Model :: Job Management . SUCCESS [  3.722 s]
[INFO] Apache Beam :: Model :: Fn Execution ... SUCCESS [  4.704 s]
[INFO] Apache Beam :: SDKs  SUCCESS [  0.155 s]
[INFO] Apache Beam :: SDKs :: Go .. FAILURE [ 33.878 s]
[INFO] Apache Beam :: SDKs :: Go :: Container . SKIPPED
[INFO] Apache Beam :: SDKs :: Java  S

Build failed in Jenkins: beam_PostCommit_Python_Verify #4637

2018-04-09 Thread Apache Jenkins Server
See 


Changes:

[robertwb] Correctly account for keyword arguments in function calls.

[robertwb] Fix type inference for slicing.

[herohde] [BEAM-4034] Fix hooks panic with Go on Flink

[ehudm] Add grpcio-tools to gradle virtualenv.

--
[...truncated 1.04 MB...]
copying apache_beam/runners/portability/fn_api_runner_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/universal_local_runner.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/universal_local_runner_main.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/universal_local_runner_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/test/__init__.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/test
copying apache_beam/runners/worker/__init__.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/bundle_processor.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/data_plane.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/data_plane_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/log_handler.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/log_handler_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/logger.pxd -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/logger.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/logger_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/opcounters.pxd -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/opcounters.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/opcounters_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/operation_specs.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/operations.pxd -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/operations.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/sdk_worker.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/sdk_worker_main.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/sdk_worker_main_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/sdk_worker_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/sideinputs.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/sideinputs_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/statesampler.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/statesampler_fast.pyx -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/statesampler_slow.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/statesampler_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/worker_id_interceptor.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/worker_id_interceptor_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/testing/__init__.py -> 
apache-beam-2.5.0.dev0/apache_beam/testing
copying apache_beam/testing/pipeline_verifiers.py -> 
apache-beam-2.5.0.dev0/apache_beam/testing
copying apache_beam/testing/pipeline_verifiers_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/testing
copying apache_beam/testing/test_pipeline.py -> 
apache-beam-2.5.0.dev0/apache_beam/testing
copying apache_beam/testing/test_pipeline_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/testing
copying apache_beam/testing/test_stream.py -> 
apache-beam-2.5.0.dev0/apache_beam/testing
copying apache_beam/testing/test_stream_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/testing
copying apache_beam/testing/test_utils.py -> 
apache-beam-2.5.0.dev0/apache_beam/testing
copying apache_beam/testing/test_utils_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/testing
copying apache_beam/testing/util.py -> 
apache-beam-2.5.0.dev0/apache_beam/testing
copying apache_beam/testing/util_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/testing
copying apache_beam/testing/data/standard_coders.yaml -> 
apache-beam-2.5.0.dev0/apache_beam/testing/d

[jira] [Work logged] (BEAM-2927) Python SDK support for portable side input

2018-04-09 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-2927?focusedWorklogId=89205&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-89205
 ]

ASF GitHub Bot logged work on BEAM-2927:


Author: ASF GitHub Bot
Created on: 10/Apr/18 00:32
Start Date: 10/Apr/18 00:32
Worklog Time Spent: 10m 
  Work Description: robertwb closed pull request #4983: [BEAM-2927] 
Re-enable side inputs for Fn API on Dataflow
URL: https://github.com/apache/beam/pull/4983
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py 
b/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py
index cb74fc06108..e2210aab770 100644
--- a/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py
+++ b/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py
@@ -211,36 +211,8 @@ def visit_transform(self, transform_node):
 from apache_beam.transforms.core import GroupByKey, _GroupByKeyOnly
 if isinstance(transform_node.transform, (GroupByKey, _GroupByKeyOnly)):
   pcoll = transform_node.inputs[0]
-  input_type = pcoll.element_type
-  # If input_type is not specified, then treat it as `Any`.
-  if not input_type:
-input_type = typehints.Any
-
-  def coerce_to_kv_type(element_type):
-if isinstance(element_type, typehints.TupleHint.TupleConstraint):
-  if len(element_type.tuple_types) == 2:
-return element_type
-  else:
-raise ValueError(
-"Tuple input to GroupByKey must be have two components. "
-"Found %s for %s" % (element_type, pcoll))
-elif isinstance(input_type, typehints.AnyTypeConstraint):
-  # `Any` type needs to be replaced with a KV[Any, Any] to
-  # force a KV coder as the main output coder for the pcollection
-  # preceding a GroupByKey.
-  return typehints.KV[typehints.Any, typehints.Any]
-elif isinstance(element_type, typehints.UnionConstraint):
-  union_types = [
-  coerce_to_kv_type(t) for t in element_type.union_types]
-  return typehints.KV[
-  typehints.Union[tuple(t.tuple_types[0] for t in 
union_types)],
-  typehints.Union[tuple(t.tuple_types[1] for t in 
union_types)]]
-else:
-  # TODO: Possibly handle other valid types.
-  raise ValueError(
-  "Input to GroupByKey must be of Tuple or Any type. "
-  "Found %s for %s" % (element_type, pcoll))
-  pcoll.element_type = coerce_to_kv_type(input_type)
+  pcoll.element_type = typehints.coerce_to_kv_type(
+  pcoll.element_type, transform_node.full_label)
   key_type, value_type = pcoll.element_type.tuple_types
   if transform_node.outputs:
 transform_node.outputs[None].element_type = typehints.KV[
@@ -248,6 +220,59 @@ def coerce_to_kv_type(element_type):
 
 return GroupByKeyInputVisitor()
 
+  @staticmethod
+  def side_input_visitor():
+# Imported here to avoid circular dependencies.
+# pylint: disable=wrong-import-order, wrong-import-position
+from apache_beam.pipeline import PipelineVisitor
+from apache_beam.transforms.core import ParDo
+
+class SideInputVisitor(PipelineVisitor):
+  """Ensures input `PCollection` used as a side inputs has a `KV` type.
+
+  TODO(BEAM-115): Once Python SDK is compatible with the new Runner API,
+  we could directly replace the coder instead of mutating the element type.
+  """
+  def visit_transform(self, transform_node):
+if isinstance(transform_node.transform, ParDo):
+  new_side_inputs = []
+  for ix, side_input in enumerate(transform_node.side_inputs):
+access_pattern = side_input._side_input_data().access_pattern
+if access_pattern == common_urns.ITERABLE_SIDE_INPUT:
+  # Add a map to ('', value) as Dataflow currently only handles
+  # keyed side inputs.
+  pipeline = side_input.pvalue.pipeline
+  new_side_input = _DataflowIterableSideInput(side_input)
+  new_side_input.pvalue = beam.pvalue.PCollection(
+  pipeline,
+  element_type=typehints.KV[
+  str, side_input.pvalue.element_type])
+  parent = transform_node.parent or pipeline._root_transform()
+  map_to_void_key = beam.pipeline.AppliedPTransform(
+  pipeline,
+  beam.Map(lambda x: ('', x)),
+ 
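
The refactor in the diff above moves `coerce_to_kv_type` into `typehints`; its core rule can be sketched in plain Python (a simplified stand-in using tuples and strings, not Beam's actual type constraints): `Any` becomes a key/value pair of `Any`, a two-element tuple type is already KV, and anything else is rejected.

```python
def coerce_to_kv(element_type):
    """Coerce a simplified type description to a (key, value) pair type."""
    if element_type == 'any':
        return ('any', 'any')      # Any becomes KV[Any, Any]
    if isinstance(element_type, tuple):
        if len(element_type) == 2:
            return element_type    # already a KV-shaped type
        raise ValueError('Tuple input to GroupByKey must have two components')
    raise ValueError('Input to GroupByKey must be of Tuple or Any type')
```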

[beam] 01/01: Merge pull request #4983 [BEAM-2927] Re-enable side inputs for Fn API on Dataflow

2018-04-09 Thread robertwb
This is an automated email from the ASF dual-hosted git repository.

robertwb pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git

commit 9df0307853fe74de41916f6cb7c65151ca5e416d
Merge: 8f3c2b9 39c17f1
Author: Robert Bradshaw 
AuthorDate: Mon Apr 9 17:32:41 2018 -0700

Merge pull request #4983 [BEAM-2927] Re-enable side inputs for Fn API on 
Dataflow

 .../runners/dataflow/dataflow_runner.py| 159 -
 .../runners/dataflow/dataflow_runner_test.py   |  23 ++-
 .../runners/portability/fn_api_runner.py   |  13 +-
 .../apache_beam/runners/worker/bundle_processor.py |   9 +-
 .../apache_beam/runners/worker/sdk_worker.py   |  32 +++--
 sdks/python/apache_beam/typehints/typehints.py |  32 +
 6 files changed, 214 insertions(+), 54 deletions(-)

-- 
To stop receiving notification emails like this one, please contact
rober...@apache.org.


[beam] branch master updated (8f3c2b9 -> 9df0307)

2018-04-09 Thread robertwb
This is an automated email from the ASF dual-hosted git repository.

robertwb pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git.


from 8f3c2b9  Merge pull request #5049: Allow longs as input to 
Timestamp.of()
 add 00f3e22  Revert "Revert #4781 which broke Python postsubmits"
 add 39c17f1  Guard side input mutation for Fn API mode only.
 new 9df0307  Merge pull request #4983 [BEAM-2927] Re-enable side inputs 
for Fn API on Dataflow

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 .../runners/dataflow/dataflow_runner.py| 159 -
 .../runners/dataflow/dataflow_runner_test.py   |  23 ++-
 .../runners/portability/fn_api_runner.py   |  13 +-
 .../apache_beam/runners/worker/bundle_processor.py |   9 +-
 .../apache_beam/runners/worker/sdk_worker.py   |  32 +++--
 sdks/python/apache_beam/typehints/typehints.py |  32 +
 6 files changed, 214 insertions(+), 54 deletions(-)

-- 
To stop receiving notification emails like this one, please contact
rober...@apache.org.


[jira] [Work logged] (BEAM-4037) Add Python streaming wordcount snippets and test

2018-04-09 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-4037?focusedWorklogId=89204&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-89204
 ]

ASF GitHub Bot logged work on BEAM-4037:


Author: ASF GitHub Bot
Created on: 10/Apr/18 00:31
Start Date: 10/Apr/18 00:31
Worklog Time Spent: 10m 
  Work Description: charlesccychen commented on issue #5071: [BEAM-4037] 
Add streaming wordcount snippets and test
URL: https://github.com/apache/beam/pull/5071#issuecomment-379935862
 
 
   Sent out commit with `# coding=utf-8` fix.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 89204)
Time Spent: 40m  (was: 0.5h)

> Add Python streaming wordcount snippets and test
> 
>
> Key: BEAM-4037
> URL: https://issues.apache.org/jira/browse/BEAM-4037
> Project: Beam
>  Issue Type: Improvement
>  Components: sdk-py-core
>Reporter: Charles Chen
>Assignee: Charles Chen
>Priority: Major
>  Time Spent: 40m
>  Remaining Estimate: 0h
>
> We should add Python streaming wordcount snippets and tests.  The 
> documentation will refer to these snippets.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Jenkins build is back to normal : beam_PostCommit_Java_ValidatesRunner_Flink_Gradle #49

2018-04-09 Thread Apache Jenkins Server
See 




[jira] [Work logged] (BEAM-4037) Add Python streaming wordcount snippets and test

2018-04-09 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-4037?focusedWorklogId=89203&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-89203
 ]

ASF GitHub Bot logged work on BEAM-4037:


Author: ASF GitHub Bot
Created on: 10/Apr/18 00:26
Start Date: 10/Apr/18 00:26
Worklog Time Spent: 10m 
  Work Description: aaltay commented on issue #5071: [BEAM-4037] Add 
streaming wordcount snippets and test
URL: https://github.com/apache/beam/pull/5071#issuecomment-379935233
 
 
   LGTM. snippets_test is failing, please fix that.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 89203)
Time Spent: 0.5h  (was: 20m)

> Add Python streaming wordcount snippets and test
> 
>
> Key: BEAM-4037
> URL: https://issues.apache.org/jira/browse/BEAM-4037
> Project: Beam
>  Issue Type: Improvement
>  Components: sdk-py-core
>Reporter: Charles Chen
>Assignee: Charles Chen
>Priority: Major
>  Time Spent: 0.5h
>  Remaining Estimate: 0h
>
> We should add Python streaming wordcount snippets and tests.  The 
> documentation will refer to these snippets.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Work logged] (BEAM-3218) Change PubsubBoundedWriter to check total byte size

2018-04-09 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3218?focusedWorklogId=89202&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-89202
 ]

ASF GitHub Bot logged work on BEAM-3218:


Author: ASF GitHub Bot
Created on: 10/Apr/18 00:22
Start Date: 10/Apr/18 00:22
Worklog Time Spent: 10m 
  Work Description: chamikaramj commented on issue #4275: [BEAM-3218] Added 
Quota checks for PubsubMessage in PubsubBoundedWriter
URL: https://github.com/apache/beam/pull/4275#issuecomment-379934638
 
 
   Sorry about the delay. Reuven, could you review or pass this to someone 
who's familiar with PubSubIO?




Issue Time Tracking
---

Worklog Id: (was: 89202)
Time Spent: 10m
Remaining Estimate: 0h

> Change PubsubBoundedWriter to check total byte size
> ---
>
> Key: BEAM-3218
> URL: https://issues.apache.org/jira/browse/BEAM-3218
> Project: Beam
>  Issue Type: Bug
>  Components: io-java-gcp
>Reporter: Thang Nguyen
>Assignee: Reuven Lax
>Priority: Minor
>  Time Spent: 10m
>  Remaining Estimate: 0h
>
> The PubsubBoundedWriter [does not 
> check|https://github.com/apache/beam/blob/master/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/pubsub/PubsubIO.java#L895-L897]
>  the total byte size of outgoing messages when publishing. 
> AC:
> * Add a check to ensure the total byte size of the outgoing messages is less than 
> pubsub's allowed max
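The acceptance criterion amounts to flushing a batch when either the message count or the accumulated payload size would exceed a limit. A hedged sketch of that check (illustrative Python, not the actual PubsubBoundedWriter code; the limits are placeholders, not Pub/Sub's real quota values):

```python
def batch_messages(messages, max_count=1000, max_bytes=10_000_000):
    # Group outgoing payloads into batches bounded by BOTH message count
    # and total payload bytes. An oversized single message still gets its
    # own batch (it is not split).
    batches, current, current_bytes = [], [], 0
    for payload in messages:
        size = len(payload)
        if current and (len(current) >= max_count
                        or current_bytes + size > max_bytes):
            batches.append(current)
            current, current_bytes = [], 0
        current.append(payload)
        current_bytes += size
    if current:
        batches.append(current)
    return batches
```

The key point is that the byte check happens before appending, so a batch never commits to more bytes than the limit allows.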





[jira] [Work logged] (BEAM-3042) Add tracking of bytes read / time spent when reading side inputs

2018-04-09 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3042?focusedWorklogId=89201&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-89201
 ]

ASF GitHub Bot logged work on BEAM-3042:


Author: ASF GitHub Bot
Created on: 10/Apr/18 00:18
Start Date: 10/Apr/18 00:18
Worklog Time Spent: 10m 
  Work Description: pabloem opened a new pull request #5075: [BEAM-3042] 
Refactor of TransformIOCounters (performance, inheritance). Time counter for 
sideinputs.
URL: https://github.com/apache/beam/pull/5075
 
 
   Improves the parent class behavior for `TransformIOCounter`, and avoids 
recreating counter objects when the step has not changed.
   
   Also adds time tracking for side inputs, gated behind an experiment 
flag. This experiment flag is activated in internal Dataflow tests, and bytes 
tracking has not affected performance in benchmarks, so we'd like to test 
adding time tracking.




Issue Time Tracking
---

Worklog Id: (was: 89201)
Time Spent: 50m  (was: 40m)

> Add tracking of bytes read / time spent when reading side inputs
> 
>
> Key: BEAM-3042
> URL: https://issues.apache.org/jira/browse/BEAM-3042
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-py-core
>Reporter: Pablo Estrada
>Assignee: Pablo Estrada
>Priority: Major
>  Time Spent: 50m
>  Remaining Estimate: 0h
>
> It is difficult for Dataflow users to understand how modifying a pipeline or 
> data set can affect how much inter-transform IO is used in their job. The 
> intent of this feature request is to help users understand how side inputs 
> behave when they are consumed.
> This will allow users to understand how much time and how much data their 
> pipeline uses to read/write to inter-transform IO. Users will also be able to 
> modify their pipelines and understand how their changes affect these IO 
> metrics.
> For further information, please review the internal Google doc 
> go/insights-transform-io-design-doc.
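The PR description above mentions avoiding the recreation of counter objects when the step has not changed, with time tracking gated behind an experiment flag. A hedged Python sketch of that caching pattern (all names are illustrative, not the actual Beam classes):

```python
class SideInputReadCounter:
    # Tracks bytes read (and, when the experiment flag is on, time spent)
    # while reading one side input for one step.
    def __init__(self, step_name, track_time=False):
        self.step_name = step_name
        self.track_time = track_time  # analogue of the experiment flag
        self.bytes_read = 0
        self.seconds_spent = 0.0

    def add_bytes(self, n):
        self.bytes_read += n

class CounterCache:
    # Reuse the live counter while the current step is unchanged, instead
    # of recreating one per element -- the optimization the PR describes.
    def __init__(self):
        self._current_step = None
        self._counter = None

    def counter_for(self, step_name, track_time=False):
        if step_name != self._current_step:
            self._current_step = step_name
            self._counter = SideInputReadCounter(step_name, track_time)
        return self._counter
```

Since side-input reads are hot-path code, reusing the counter until the step changes avoids per-element allocation.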





[beam] branch master updated (0a5e592 -> 8f3c2b9)

2018-04-09 Thread chamikara
This is an automated email from the ASF dual-hosted git repository.

chamikara pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git.


from 0a5e592  Merge pull request #5065: [BEAM-4034] Fix hooks panic with Go 
on Flink
 add d6e5b5b  Allow longs as input to Timestamp.of().
 new 8f3c2b9  Merge pull request #5049: Allow longs as input to 
Timestamp.of()

The 1 revision listed above as "new" is entirely new to this
repository and will be described in a separate email.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 sdks/python/apache_beam/utils/timestamp.py | 9 +
 1 file changed, 5 insertions(+), 4 deletions(-)
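The change to apache_beam/utils/timestamp.py widens the input types Timestamp.of() accepts. A hedged sketch of the idea (an illustrative class, not the actual timestamp.py code):

```python
class Timestamp:
    # Microsecond-precision timestamp that accepts int or float seconds.
    def __init__(self, seconds):
        if not isinstance(seconds, (int, float)):
            raise TypeError('Cannot interpret %r as a Timestamp.' % (seconds,))
        self.micros = int(seconds * 1000000)

    @staticmethod
    def of(seconds):
        # Return the argument unchanged if it is already a Timestamp.
        if isinstance(seconds, Timestamp):
            return seconds
        return Timestamp(seconds)

# Both integer (a Python 2 long) and float inputs are accepted.
t_int, t_float = Timestamp.of(5), Timestamp.of(1.5)
```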

-- 
To stop receiving notification emails like this one, please contact
chamik...@apache.org.


[beam] 01/01: Merge pull request #5049: Allow longs as input to Timestamp.of()

2018-04-09 Thread chamikara
This is an automated email from the ASF dual-hosted git repository.

chamikara pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git

commit 8f3c2b9f7436eff8b0a4af9465980ded729fb87a
Merge: 0a5e592 d6e5b5b
Author: Chamikara Jayalath 
AuthorDate: Mon Apr 9 17:17:58 2018 -0700

Merge pull request #5049: Allow longs as input to Timestamp.of()

 sdks/python/apache_beam/utils/timestamp.py | 9 +
 1 file changed, 5 insertions(+), 4 deletions(-)

-- 
To stop receiving notification emails like this one, please contact
chamik...@apache.org.


[jira] [Resolved] (BEAM-2403) Provide BigQueryIO with method to indicate Priority (INTERACTIVE/BATCH)

2018-04-09 Thread Chamikara Jayalath (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-2403?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Chamikara Jayalath resolved BEAM-2403.
--
   Resolution: Duplicate
Fix Version/s: 2.5.0

Duplicate of https://issues.apache.org/jira/browse/BEAM-2817.

> Provide BigQueryIO with method to indicate Priority (INTERACTIVE/BATCH)
> ---
>
> Key: BEAM-2403
> URL: https://issues.apache.org/jira/browse/BEAM-2403
> Project: Beam
>  Issue Type: New Feature
>  Components: io-java-gcp
>Affects Versions: 2.0.0
>Reporter: Andre
>Assignee: Chamikara Jayalath
>Priority: Minor
> Fix For: 2.5.0
>
>
> Currently every BigQuery job issued from Beam runs in BATCH mode, which can 
> unnecessarily delay the execution of a scheduled job. Allowing the priority 
> INTERACTIVE to be provided with BigQueryIO would resolve this.
> See 
> https://stackoverflow.com/questions/44198891/change-google-cloud-dataflow-bigquery-priority/44223123
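The request boils down to letting callers choose the job priority instead of hard-coding BATCH. A hedged sketch of how a priority option might be threaded into a query job configuration (hypothetical helper and dict shape, not the real BigQueryIO surface):

```python
BATCH = "BATCH"
INTERACTIVE = "INTERACTIVE"

def build_query_job_config(query, priority=BATCH):
    # Build a BigQuery query-job configuration dict. INTERACTIVE jobs run
    # as soon as possible; BATCH jobs may be queued, which is the delay
    # the issue describes. Hypothetical helper, not the BigQueryIO API.
    if priority not in (BATCH, INTERACTIVE):
        raise ValueError("unknown priority: %s" % priority)
    return {"query": {"query": query, "priority": priority}}
```

Keeping BATCH as the default preserves today's behavior while letting latency-sensitive jobs opt in to INTERACTIVE.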





[jira] [Commented] (BEAM-2403) Provide BigQueryIO with method to indicate Priority (INTERACTIVE/BATCH)

2018-04-09 Thread Chamikara Jayalath (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-2403?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16431531#comment-16431531
 ] 

Chamikara Jayalath commented on BEAM-2403:
--

Yeah, marking this issue as a duplicate.

> Provide BigQueryIO with method to indicate Priority (INTERACTIVE/BATCH)
> ---
>
> Key: BEAM-2403
> URL: https://issues.apache.org/jira/browse/BEAM-2403
> Project: Beam
>  Issue Type: New Feature
>  Components: io-java-gcp
>Affects Versions: 2.0.0
>Reporter: Andre
>Assignee: Chamikara Jayalath
>Priority: Minor
>
> Currently every BigQuery job issued from Beam runs in BATCH mode, which can 
> unnecessarily delay the execution of a scheduled job. Allowing the priority 
> INTERACTIVE to be provided with BigQueryIO would resolve this.
> See 
> https://stackoverflow.com/questions/44198891/change-google-cloud-dataflow-bigquery-priority/44223123





Build failed in Jenkins: beam_PostCommit_Java_GradleBuild #11

2018-04-09 Thread Apache Jenkins Server
See 


Changes:

[ehudm] Add grpcio-tools to gradle virtualenv.

--
[...truncated 16.11 MB...]

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testWrite STANDARD_OUT
Submitted job: 2018-04-09_17_02_15-3076076775435380248

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testWrite STANDARD_ERROR
Apr 10, 2018 12:02:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel 
--region=us-central1 2018-04-09_17_02_15-3076076775435380248
Apr 10, 2018 12:02:16 AM 
org.apache.beam.runners.dataflow.TestDataflowRunner run
INFO: Running Dataflow job 2018-04-09_17_02_15-3076076775435380248 with 0 
expected assertions.

org.apache.beam.sdk.io.gcp.datastore.V1WriteIT > testE2EV1Write STANDARD_ERROR
Apr 10, 2018 12:02:22 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-10T00:02:22.387Z: Autoscaling: Resized worker pool from 1 to 
0.
Apr 10, 2018 12:02:22 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-10T00:02:22.427Z: Autoscaling: Would further reduce the 
number of workers but reached the minimum number allowed for the job.
Apr 10, 2018 12:02:22 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-10T00:02:22.481Z: Worker pool stopped.

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testWrite STANDARD_ERROR
Apr 10, 2018 12:02:27 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-10T00:02:15.621Z: Autoscaling is enabled for job 
2018-04-09_17_02_15-3076076775435380248. The number of workers will be between 
1 and 1000.
Apr 10, 2018 12:02:27 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-10T00:02:15.645Z: Autoscaling was automatically enabled for 
job 2018-04-09_17_02_15-3076076775435380248.
Apr 10, 2018 12:02:27 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-10T00:02:18.333Z: Checking required Cloud APIs are enabled.
Apr 10, 2018 12:02:27 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-10T00:02:18.438Z: Checking permissions granted to controller 
Service Account.
Apr 10, 2018 12:02:27 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-10T00:02:22.619Z: Expanding CoGroupByKey operations into 
optimizable parts.
Apr 10, 2018 12:02:27 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-10T00:02:22.807Z: Expanding GroupByKey operations into 
optimizable parts.
Apr 10, 2018 12:02:27 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-10T00:02:22.833Z: Lifting ValueCombiningMappingFns into 
MergeBucketsMappingFns
Apr 10, 2018 12:02:27 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-10T00:02:23.028Z: Fusing adjacent ParDo, Read, Write, and 
Flatten operations
Apr 10, 2018 12:02:27 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-10T00:02:23.060Z: Fusing consumer SpannerIO.Write/Write 
mutations to Cloud Spanner/Read information schema into SpannerIO.Write/Write 
mutations to Cloud Spanner/Create.Values/Read(CreateSource)
Apr 10, 2018 12:02:27 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-10T00:02:23.088Z: Fusing consumer SpannerIO.Write/Write 
mutations to Cloud Spanner/Schema 
View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey+SpannerIO.Write/Write
 mutations to Cloud Spanner/Schema 
View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Partial
 into SpannerIO.Write/Write mutations to Cloud Spanner/Schema 
View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map
Apr 10, 2018 12:02:27 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-10T00:02:23.121Z: Fusing consumer SpannerIO.Write/Write 
mutations to Cloud Spanner/Schema 
View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Extract
 into SpannerIO.Write/Write mutations to Cloud Spanner/Schema 
View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues
Apr 10, 2018 12:02:27 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-10T00:02:23.142Z: Fusing consumer SpannerIO.Write/

[jira] [Work logged] (BEAM-3339) Create post-release testing of the nightly snapshots

2018-04-09 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3339?focusedWorklogId=89199&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-89199
 ]

ASF GitHub Bot logged work on BEAM-3339:


Author: ASF GitHub Bot
Created on: 10/Apr/18 00:00
Start Date: 10/Apr/18 00:00
Worklog Time Spent: 10m 
  Work Description: yifanzou commented on issue #4788: [BEAM-3339] Mobile 
gaming automation for Java nightly snapshot on core runners
URL: https://github.com/apache/beam/pull/4788#issuecomment-379929681
 
 
   Run Seed Job




Issue Time Tracking
---

Worklog Id: (was: 89199)
Time Spent: 81.5h  (was: 81h 20m)

> Create post-release testing of the nightly snapshots
> 
>
> Key: BEAM-3339
> URL: https://issues.apache.org/jira/browse/BEAM-3339
> Project: Beam
>  Issue Type: Improvement
>  Components: testing
>Reporter: Alan Myrvold
>Assignee: Jason Kuster
>Priority: Major
>  Time Spent: 81.5h
>  Remaining Estimate: 0h
>
> The nightly java snapshots in 
> https://repository.apache.org/content/groups/snapshots/org/apache/beam should 
> be verified by following the 
> https://beam.apache.org/get-started/quickstart-java/ instructions, to verify 
> that the release is usable.





[jira] [Work logged] (BEAM-3119) direct-metrics-counter-committer threads are leaking

2018-04-09 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3119?focusedWorklogId=89195&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-89195
 ]

ASF GitHub Bot logged work on BEAM-3119:


Author: ASF GitHub Bot
Created on: 09/Apr/18 23:45
Start Date: 09/Apr/18 23:45
Worklog Time Spent: 10m 
  Work Description: tgroh commented on a change in pull request #4965: 
BEAM-3119 ensure the metrics thread pool is related to an execution
URL: https://github.com/apache/beam/pull/4965#discussion_r180260599
 
 

 ##
 File path: 
runners/direct-java/src/main/java/org/apache/beam/runners/direct/DirectMetrics.java
 ##
 @@ -76,6 +64,8 @@
   private static class DirectMetric {
 private final MetricAggregation aggregation;
 
+private final ExecutorService executorService;
 
 Review comment:
   Worth noting that the lifecycle of the ExecutorService is managed outside of 
this class; alternatively, you could pass the `submit` method as a 
`Consumer`, or use that type for the field, to make it clear to future 
changes that the pool can't be closed from this class.
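The suggestion above, rendered as a hedged Python analogue: inject only the pool's submit method so the holder cannot shut the pool down (illustrative sketch of the review comment, not Beam code):

```python
from concurrent.futures import ThreadPoolExecutor

class DirectMetric:
    # Holds only a submit callable, not the executor itself, so this
    # class cannot shut the pool down -- the narrowing of surface area
    # the review comment suggests.
    def __init__(self, submit):
        self._submit = submit
        self.committed = 0

    def commit(self, delta):
        def work():
            self.committed += delta
        return self._submit(work)

pool = ThreadPoolExecutor(max_workers=1)
metric = DirectMetric(pool.submit)  # inject the method, not the pool
metric.commit(3).result()           # wait for the committer task
pool.shutdown()                     # lifecycle stays with the owner
```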




Issue Time Tracking
---

Worklog Id: (was: 89195)
Time Spent: 1h 40m  (was: 1.5h)

> direct-metrics-counter-committer threads are leaking
> 
>
> Key: BEAM-3119
> URL: https://issues.apache.org/jira/browse/BEAM-3119
> Project: Beam
>  Issue Type: Bug
>  Components: runner-direct
>Reporter: Etienne Chauchot
>Assignee: Thomas Groh
>Priority: Major
>  Time Spent: 1h 40m
>  Remaining Estimate: 0h
>
> When I run ElasticsearchIOTests using ESv5, there is a thread leak control 
> mechanism ({{com.carrotsearch.randomizedtesting.ThreadLeakControl}}). It 
> waits for 5s for non-terminated threads at the end of a test. It detects a 
> leaked {{direct-metrics-counter-committer}} thread.
> {code}
> com.carrotsearch.randomizedtesting.ThreadLeakError: There are still zombie 
> threads that couldn't be terminated:
>1) Thread[id=296, name=direct-metrics-counter-committer, 
> state=TIMED_WAITING, group=TGRP-ElasticsearchIOTest]
> at sun.misc.Unsafe.park(Native Method)
> at 
> java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
> at 
> java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:460)
> at 
> java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:362)
> at 
> java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:941)
> at 
> java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1066)
> at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
> at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> at java.lang.Thread.run(Thread.java:745)
>   at __randomizedtesting.SeedInfo.seed([59E504CA1B0DD6A8]:0){code}
> I tried to increase the timeout to 30s (by patching 
> randomizedtesting-runner-2.5.0.jar) but still got a zombie thread.
> To reproduce, just comment 
> {code}
> @ThreadLeakScope(ThreadLeakScope.Scope.NONE)
> {code}
>  in 
> {code}
> beam/sdks/java/io/elasticsearch-tests/elasticsearch-tests-5/src/test/java/org/apache/beam/sdk/io/elasticsearch/ElasticsearchIOTest.java
> {code}
> and run 
> {code}
> testRead()
> {code}
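The fix under review ties the metrics pool to a single pipeline execution and shuts it down when that execution finishes, which is what stops the zombie committer threads described above. A hedged Python analogue (illustrative, not the DirectRunner code):

```python
from concurrent.futures import ThreadPoolExecutor

def run_pipeline_with_metrics():
    # Create the metrics pool per execution and shut it down when the
    # execution finishes, instead of leaving a shared pool running
    # forever (the leak reported here).
    results = []
    pool = ThreadPoolExecutor(
        max_workers=2,
        thread_name_prefix="direct-metrics-counter-committer")
    try:
        futures = [pool.submit(results.append, i) for i in range(3)]
        for f in futures:
            f.result()
    finally:
        pool.shutdown(wait=True)  # no committer threads outlive the run
    return sorted(results)
```

With the shutdown in a finally block, a thread-leak checker that runs after the test sees no surviving pool threads.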





[jira] [Commented] (BEAM-2403) Provide BigQueryIO with method to indicate Priority (INTERACTIVE/BATCH)

2018-04-09 Thread Florian Scharinger (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-2403?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16431505#comment-16431505
 ] 

Florian Scharinger commented on BEAM-2403:
--

This feature exists now, I think since v2.2, but definitely in v2.4. However, 
there is a bug that causes the query to always execute in BATCH mode, which has 
been fixed in master: 
https://github.com/apache/beam/commit/a463357becfbfaf687979dfed68ba43fefc40e52#diff-996fa1990332e50e8dd0b2078fbcede9L87

> Provide BigQueryIO with method to indicate Priority (INTERACTIVE/BATCH)
> ---
>
> Key: BEAM-2403
> URL: https://issues.apache.org/jira/browse/BEAM-2403
> Project: Beam
>  Issue Type: New Feature
>  Components: io-java-gcp
>Affects Versions: 2.0.0
>Reporter: Andre
>Assignee: Chamikara Jayalath
>Priority: Minor
>
> Currently every BigQuery job issued from Beam runs in BATCH mode, which can 
> unnecessarily delay the execution of a scheduled job. Allowing the priority 
> INTERACTIVE to be provided with BigQueryIO would resolve this.
> See 
> https://stackoverflow.com/questions/44198891/change-google-cloud-dataflow-bigquery-priority/44223123





[jira] [Work logged] (BEAM-3119) direct-metrics-counter-committer threads are leaking

2018-04-09 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3119?focusedWorklogId=89196&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-89196
 ]

ASF GitHub Bot logged work on BEAM-3119:


Author: ASF GitHub Bot
Created on: 09/Apr/18 23:45
Start Date: 09/Apr/18 23:45
Worklog Time Spent: 10m 
  Work Description: tgroh commented on a change in pull request #4965: 
BEAM-3119 ensure the metrics thread pool is related to an execution
URL: https://github.com/apache/beam/pull/4965#discussion_r180260616
 
 

 ##
 File path: 
runners/direct-java/src/main/java/org/apache/beam/runners/direct/DirectMetrics.java
 ##
 @@ -223,13 +215,11 @@ public GaugeResult extract(GaugeData data) {
   };
 
   /** The current values of counters in memory. */
-  private MetricsMap> counters =
-  new MetricsMap<>(unusedKey -> new DirectMetric<>(COUNTER));
+  private MetricsMap> counters;
 
 Review comment:
   Can these be final?




Issue Time Tracking
---

Worklog Id: (was: 89196)
Time Spent: 1h 50m  (was: 1h 40m)

> direct-metrics-counter-committer threads are leaking
> 
>
> Key: BEAM-3119
> URL: https://issues.apache.org/jira/browse/BEAM-3119
> Project: Beam
>  Issue Type: Bug
>  Components: runner-direct
>Reporter: Etienne Chauchot
>Assignee: Thomas Groh
>Priority: Major
>  Time Spent: 1h 50m
>  Remaining Estimate: 0h
>
> When I run ElasticsearchIOTests using ESv5, there is a thread leak control 
> mechanism ({{com.carrotsearch.randomizedtesting.ThreadLeakControl}}). It 
> waits for 5s for non-terminated threads at the end of a test. It detects a 
> leaked {{direct-metrics-counter-committer}} thread.
> {code}
> com.carrotsearch.randomizedtesting.ThreadLeakError: There are still zombie 
> threads that couldn't be terminated:
>1) Thread[id=296, name=direct-metrics-counter-committer, 
> state=TIMED_WAITING, group=TGRP-ElasticsearchIOTest]
> at sun.misc.Unsafe.park(Native Method)
> at 
> java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
> at 
> java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:460)
> at 
> java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:362)
> at 
> java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:941)
> at 
> java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1066)
> at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
> at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> at java.lang.Thread.run(Thread.java:745)
>   at __randomizedtesting.SeedInfo.seed([59E504CA1B0DD6A8]:0){code}
> I tried to increase the timeout to 30s (by patching 
> randomizedtesting-runner-2.5.0.jar) but still got a zombie thread.
> To reproduce, just comment 
> {code}
> @ThreadLeakScope(ThreadLeakScope.Scope.NONE)
> {code}
>  in 
> {code}
> beam/sdks/java/io/elasticsearch-tests/elasticsearch-tests-5/src/test/java/org/apache/beam/sdk/io/elasticsearch/ElasticsearchIOTest.java
> {code}
> and run 
> {code}
> testRead()
> {code}





[jira] [Work logged] (BEAM-3119) direct-metrics-counter-committer threads are leaking

2018-04-09 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3119?focusedWorklogId=89194&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-89194
 ]

ASF GitHub Bot logged work on BEAM-3119:


Author: ASF GitHub Bot
Created on: 09/Apr/18 23:45
Start Date: 09/Apr/18 23:45
Worklog Time Spent: 10m 
  Work Description: tgroh commented on a change in pull request #4965: 
BEAM-3119 ensure the metrics thread pool is related to an execution
URL: https://github.com/apache/beam/pull/4965#discussion_r180262119
 
 

 ##
 File path: 
runners/direct-java/src/main/java/org/apache/beam/runners/direct/ExecutorServiceParallelExecutor.java
 ##
 @@ -298,6 +303,11 @@ private void shutdownIfNecessary(State newState) {
 } catch (final RuntimeException re) {
   errors.add(re);
 }
+try {
+metricsExecutor.shutdown();
 
 Review comment:
   It's worth filing a follow-up jira to try to minimize the surface area of 
things the `ExecutorServiceParallelExecutor` needs to know about to shut down.




Issue Time Tracking
---

Worklog Id: (was: 89194)
Time Spent: 1.5h  (was: 1h 20m)

> direct-metrics-counter-committer threads are leaking
> 
>
> Key: BEAM-3119
> URL: https://issues.apache.org/jira/browse/BEAM-3119
> Project: Beam
>  Issue Type: Bug
>  Components: runner-direct
>Reporter: Etienne Chauchot
>Assignee: Thomas Groh
>Priority: Major
>  Time Spent: 1.5h
>  Remaining Estimate: 0h
>
> When I run ElasticsearchIOTests using ESv5, there is a thread leak control 
> mechanism ({{com.carrotsearch.randomizedtesting.ThreadLeakControl}}). It 
> waits for 5s for non-terminated threads at the end of a test. It detects a 
> leaked {{direct-metrics-counter-committer}} thread.
> {code}
> com.carrotsearch.randomizedtesting.ThreadLeakError: There are still zombie 
> threads that couldn't be terminated:
>1) Thread[id=296, name=direct-metrics-counter-committer, 
> state=TIMED_WAITING, group=TGRP-ElasticsearchIOTest]
> at sun.misc.Unsafe.park(Native Method)
> at 
> java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
> at 
> java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:460)
> at 
> java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:362)
> at 
> java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:941)
> at 
> java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1066)
> at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
> at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> at java.lang.Thread.run(Thread.java:745)
>   at __randomizedtesting.SeedInfo.seed([59E504CA1B0DD6A8]:0){code}
> I tried to increase the timeout to 30s (by patching 
> randomizedtesting-runner-2.5.0.jar) but still got a zombie thread.
> To reproduce, just comment 
> {code}
> @ThreadLeakScope(ThreadLeakScope.Scope.NONE)
> {code}
>  in 
> {code}
> beam/sdks/java/io/elasticsearch-tests/elasticsearch-tests-5/src/test/java/org/apache/beam/sdk/io/elasticsearch/ElasticsearchIOTest.java
> {code}
> and run 
> {code}
> testRead()
> {code}





[jira] [Work logged] (BEAM-3119) direct-metrics-counter-committer threads are leaking

2018-04-09 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3119?focusedWorklogId=89198&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-89198
 ]

ASF GitHub Bot logged work on BEAM-3119:


Author: ASF GitHub Bot
Created on: 09/Apr/18 23:45
Start Date: 09/Apr/18 23:45
Worklog Time Spent: 10m 
  Work Description: tgroh commented on a change in pull request #4965: 
BEAM-3119 ensure the metrics thread pool is related to an execution
URL: https://github.com/apache/beam/pull/4965#discussion_r180260921
 
 

 ##
 File path: 
runners/direct-java/src/main/java/org/apache/beam/runners/direct/DirectRunner.java
 ##
 @@ -172,48 +176,53 @@ public DirectPipelineResult run(Pipeline 
originalPipeline) {
 }
 pipeline.replaceAll(defaultTransformOverrides());
 MetricsEnvironment.setMetricsSupported(true);
 
 Review comment:
   This probably should restore the previous state, whatever it happened to be, 
rather than blindly setting it to false; however, I'm not overly worried about 
it, because this is already global state, so multiple pipelines can interfere 
with each other.




Issue Time Tracking
---

Worklog Id: (was: 89198)
Time Spent: 2h 10m  (was: 2h)

> direct-metrics-counter-committer threads are leaking
> 
>
> Key: BEAM-3119
> URL: https://issues.apache.org/jira/browse/BEAM-3119
> Project: Beam
>  Issue Type: Bug
>  Components: runner-direct
>Reporter: Etienne Chauchot
>Assignee: Thomas Groh
>Priority: Major
>  Time Spent: 2h 10m
>  Remaining Estimate: 0h
>
> When I run ElasticsearchIOTests using ESv5, there is a thread leak control 
> mechanism ({{com.carrotsearch.randomizedtesting.ThreadLeakControl}}). It 
> waits for 5s for non-terminated threads at the end of a test. It detects a 
> leaked {{direct-metrics-counter-committer}} thread.
> {code}
> com.carrotsearch.randomizedtesting.ThreadLeakError: There are still zombie 
> threads that couldn't be terminated:
>1) Thread[id=296, name=direct-metrics-counter-committer, 
> state=TIMED_WAITING, group=TGRP-ElasticsearchIOTest]
> at sun.misc.Unsafe.park(Native Method)
> at 
> java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
> at 
> java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:460)
> at 
> java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:362)
> at 
> java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:941)
> at 
> java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1066)
> at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
> at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> at java.lang.Thread.run(Thread.java:745)
>   at __randomizedtesting.SeedInfo.seed([59E504CA1B0DD6A8]:0){code}
> I tried to increase the timeout to 30s (by patching 
> randomizedtesting-runner-2.5.0.jar) but still got a zombie thread.
> To reproduce, just comment 
> {code}
> @ThreadLeakScope(ThreadLeakScope.Scope.NONE)
> {code}
>  in 
> {code}
> beam/sdks/java/io/elasticsearch-tests/elasticsearch-tests-5/src/test/java/org/apache/beam/sdk/io/elasticsearch/ElasticsearchIOTest.java
> {code}
> and run 
> {code}
> testRead()
> {code}





[jira] [Work logged] (BEAM-3119) direct-metrics-counter-committer threads are leaking

2018-04-09 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3119?focusedWorklogId=89197&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-89197
 ]

ASF GitHub Bot logged work on BEAM-3119:


Author: ASF GitHub Bot
Created on: 09/Apr/18 23:45
Start Date: 09/Apr/18 23:45
Worklog Time Spent: 10m 
  Work Description: tgroh commented on a change in pull request #4965: 
BEAM-3119 ensure the metrics thread pool is related to an execution
URL: https://github.com/apache/beam/pull/4965#discussion_r180261005
 
 

 ##
 File path: runners/direct-java/src/main/java/org/apache/beam/runners/direct/DirectRunner.java
 ##
 @@ -172,48 +176,53 @@ public DirectPipelineResult run(Pipeline originalPipeline) {
     }
     pipeline.replaceAll(defaultTransformOverrides());
     MetricsEnvironment.setMetricsSupported(true);
-    DirectGraphVisitor graphVisitor = new DirectGraphVisitor();
-    pipeline.traverseTopologically(graphVisitor);
+    try {
+      DirectGraphVisitor graphVisitor = new DirectGraphVisitor();
+      pipeline.traverseTopologically(graphVisitor);

-    @SuppressWarnings("rawtypes")
-    KeyedPValueTrackingVisitor keyedPValueVisitor = KeyedPValueTrackingVisitor.create();
-    pipeline.traverseTopologically(keyedPValueVisitor);
+      @SuppressWarnings("rawtypes")
+      KeyedPValueTrackingVisitor keyedPValueVisitor = KeyedPValueTrackingVisitor.create();
+      pipeline.traverseTopologically(keyedPValueVisitor);

-    DisplayDataValidator.validatePipeline(pipeline);
-    DisplayDataValidator.validateOptions(getPipelineOptions());
+      DisplayDataValidator.validatePipeline(pipeline);
+      DisplayDataValidator.validateOptions(getPipelineOptions());

-    DirectGraph graph = graphVisitor.getGraph();
-    EvaluationContext context =
-        EvaluationContext.create(
-            getPipelineOptions(),
-            clockSupplier.get(),
-            Enforcement.bundleFactoryFor(enabledEnforcements, graph),
-            graph,
-            keyedPValueVisitor.getKeyedPValues());
+      DirectGraph graph = graphVisitor.getGraph();
+      ExecutorService metricsPool = Executors.newCachedThreadPool(
+          new ThreadFactoryBuilder()
+              .setThreadFactory(MoreExecutors.platformThreadFactory())
+              .setDaemon(false) // otherwise you say you want to leak, please don't!
+              .setNameFormat("direct-metrics-counter-committer")
+              .build());
+      EvaluationContext context = EvaluationContext.create(
+          getPipelineOptions(), clockSupplier.get(),
 
 Review comment:
   This formatting is off.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 89197)
Time Spent: 2h  (was: 1h 50m)

> direct-metrics-counter-committer threads are leaking
> 
>
> Key: BEAM-3119
> URL: https://issues.apache.org/jira/browse/BEAM-3119
> Project: Beam
>  Issue Type: Bug
>  Components: runner-direct
>Reporter: Etienne Chauchot
>Assignee: Thomas Groh
>Priority: Major
>  Time Spent: 2h
>  Remaining Estimate: 0h
>
> When I run ElasticsearchIOTests against ESv5, the test harness's thread leak 
> control mechanism ({{com.carrotsearch.randomizedtesting.ThreadLeakControl}}) 
> waits 5s for non-terminated threads at the end of each test, and it detects a 
> leaked {{direct-metrics-counter-committer}} thread.
> {code}
> com.carrotsearch.randomizedtesting.ThreadLeakError: There are still zombie threads that couldn't be terminated:
>1) Thread[id=296, name=direct-metrics-counter-committer, state=TIMED_WAITING, group=TGRP-ElasticsearchIOTest]
> at sun.misc.Unsafe.park(Native Method)
> at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
> at java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:460)
> at java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:362)
> at java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:941)
> at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1066)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> at java.lang.Thread.run(Thread.java:745)
>   at __randomizedtesting.SeedInfo.seed([59E504CA1B0DD6A8]:0){code}
> I tried to increase the timeout to 30s (by patching
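The fix discussed in the worklog above ties the metrics pool's lifetime to a single execution and shuts it down when the run completes, so its non-daemon workers cannot outlive the pipeline. The same idea can be sketched in Python; the names below are illustrative, not Beam's actual API:

```python
import concurrent.futures
import threading


def run_with_scoped_metrics_pool(commit_fn, updates):
    """Commit metric updates on a pool scoped to one execution.

    Hypothetical sketch of the leak fix: the pool is created per run and
    shut down in a finally block, so no worker thread survives the run.
    """
    pool = concurrent.futures.ThreadPoolExecutor(
        max_workers=4,
        thread_name_prefix="direct-metrics-counter-committer")
    try:
        futures = [pool.submit(commit_fn, u) for u in updates]
        return [f.result() for f in futures]
    finally:
        # Without this shutdown, idle workers linger forever -- exactly the
        # zombie threads ThreadLeakControl reported in the stack trace above.
        pool.shutdown(wait=True)


results = run_with_scoped_metrics_pool(lambda u: u * 2, [1, 2, 3])
print(results)  # [2, 4, 6]
```

After `shutdown(wait=True)` returns, no thread named `direct-metrics-counter-committer-*` remains alive, which is what the leak check in the test harness requires.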

Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Spark_Gradle #33

2018-04-09 Thread Apache Jenkins Server
See 


Changes:

[herohde] [BEAM-4034] Fix hooks panic with Go on Flink

[ehudm] Add grpcio-tools to gradle virtualenv.

--
[...truncated 1.60 MB...]
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.streaming.StreamingContext$.getOrCreate(StreamingContext.scala:828)
at org.apache.spark.streaming.api.java.JavaStreamingContext$.getOrCreate(JavaStreamingContext.scala:626)
at org.apache.spark.streaming.api.java.JavaStreamingContext.getOrCreate(JavaStreamingContext.scala)
at org.apache.beam.runners.spark.SparkRunner.run(SparkRunner.java:169)
at org.apache.beam.runners.spark.TestSparkRunner.run(TestSparkRunner.java:123)
at org.apache.beam.runners.spark.TestSparkRunner.run(TestSparkRunner.java:83)
at org.apache.beam.sdk.Pipeline.run(Pipeline.java:311)
at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:346)
at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:328)
at org.apache.beam.runners.spark.translation.streaming.CreateStreamTest.testDiscardingMode(CreateStreamTest.java:203)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
at org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:48)
at org.apache.beam.sdk.testing.TestPipeline$1.evaluate(TestPipeline.java:317)
at org.junit.rules.ExpectedException$ExpectedExceptionStatement.evaluate(ExpectedException.java:239)
at org.junit.rules.RunRules.evaluate(RunRules.java:20)
at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecuter.runTestClass(JUnitTestClassExecuter.java:114)
at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecuter.execute(JUnitTestClassExecuter.java:57)
at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassProcessor.processTestClass(JUnitTestClassProcessor.java:66)
at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:35)
at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:32)
at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:93)
at com.sun.proxy.$Proxy3.processTestClass(Unknown Source)
at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:108)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:35)
at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
at org.gradle.internal.remote

[jira] [Work logged] (BEAM-4036) Pickler enters infinite recursion with self-referential classes

2018-04-09 Thread ASF GitHub Bot (JIRA)

 [ https://issues.apache.org/jira/browse/BEAM-4036?focusedWorklogId=89193&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-89193 ]

ASF GitHub Bot logged work on BEAM-4036:


Author: ASF GitHub Bot
Created on: 09/Apr/18 23:42
Start Date: 09/Apr/18 23:42
Worklog Time Spent: 10m 
  Work Description: chuanyu opened a new pull request #5072: [BEAM-4036] Fix pickling for "recursive" classes.
URL: https://github.com/apache/beam/pull/5072
 
 
   Fix pickling for recursive classes by maintaining a dict of already seen 
classes and not revisiting classes that have already been seen.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 89193)
Time Spent: 10m
Remaining Estimate: 0h

> Pickler enters infinite recursion with self-referential classes
> ---
>
> Key: BEAM-4036
> URL: https://issues.apache.org/jira/browse/BEAM-4036
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-py-core
>Reporter: Chuan Yu Foo
>Assignee: Ahmet Altay
>Priority: Major
>  Time Spent: 10m
>  Remaining Estimate: 0h
>
> The pickler recurses infinitely and dies with a "maximum recursion depth 
> exceeded" error when a module contains a self-referential class (or any class 
> that is part of a cycle).
> Here's a minimal example: 
> {code}
> class RecursiveClass(object):
>  SELF_TYPE = None
>  def __init__(self, datum):
>    self.datum = 'RecursiveClass:%s' % datum
> RecursiveClass.SELF_TYPE = RecursiveClass
> {code}
> If this is in a module, then the pickler will enter the infinite recursion 
> when trying to pickle any nested class in that module.
>   
> An actual example is with typing.Type, which is part of a cycle typing.Type 
> -> type -> object -> typing.Type. If a module contains an attribute that 
> refers to typing.Type, such as a type alias, it will trigger this bug.
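The PR's fix keeps a record of classes already visited, so a self-reference (or the typing.Type -> type -> object cycle) terminates instead of recursing forever. Below is a minimal sketch of that cycle-breaking idea using a simplified attribute traversal, not Beam's actual pickler code:

```python
def collect_class_refs(cls, seen=None):
    """Walk class-valued attributes, skipping classes already seen.

    `seen` plays the role of the "already seen classes" dict from the PR:
    revisiting a class is a no-op, so cycles cannot recurse infinitely.
    """
    if seen is None:
        seen = set()
    if cls in seen:  # cycle detected: stop instead of recursing
        return seen
    seen.add(cls)
    for value in vars(cls).values():
        if isinstance(value, type):
            collect_class_refs(value, seen)
    return seen


class RecursiveClass(object):
    SELF_TYPE = None


RecursiveClass.SELF_TYPE = RecursiveClass  # the self-reference from the report

refs = collect_class_refs(RecursiveClass)
print(RecursiveClass in refs)  # True
```

Without the `seen` check, the `SELF_TYPE` attribute would send the walk straight back into `RecursiveClass`, reproducing the unbounded recursion described above.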



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (BEAM-4023) Log warning for missing worker id in FnApiControlClientPoolService

2018-04-09 Thread Henning Rohde (JIRA)

 [ https://issues.apache.org/jira/browse/BEAM-4023?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Henning Rohde updated BEAM-4023:

Component/s: (was: sdk-go)
 sdk-java-core

> Log warning for missing worker id in FnApiControlClientPoolService
> --
>
> Key: BEAM-4023
> URL: https://issues.apache.org/jira/browse/BEAM-4023
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-core
>Reporter: Ankur Goenka
>Assignee: Ankur Goenka
>Priority: Major
>
> We should log a warning for a missing worker id when connecting the gRPC channel.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

