[beam-site] branch asf-site updated (b9082bf -> 3edacf5)

2018-03-28 Thread mergebot-role
This is an automated email from the ASF dual-hosted git repository.

mergebot-role pushed a change to branch asf-site
in repository https://gitbox.apache.org/repos/asf/beam-site.git.


from b9082bf  Prepare repository for deployment.
 add f8a41c8  Update nexmark support matrix for timer and state APIs
 add cb89a1c  This closes #409
 new 3edacf5  Prepare repository for deployment.

The 1 revision listed above as "new" is entirely new to this
repository and will be described in a separate email.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 content/documentation/sdks/java/nexmark/index.html | 4 ++--
 src/documentation/sdks/nexmark.md  | 4 ++--
 2 files changed, 4 insertions(+), 4 deletions(-)

-- 
To stop receiving notification emails like this one, please contact
mergebot-r...@apache.org.


[beam-site] 01/01: Prepare repository for deployment.

2018-03-28 Thread mergebot-role
This is an automated email from the ASF dual-hosted git repository.

mergebot-role pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/beam-site.git

commit 3edacf5e9d0e763fe19ecb655392df75b1f746e8
Author: Mergebot 
AuthorDate: Wed Mar 28 22:25:19 2018 -0700

Prepare repository for deployment.
---
 content/documentation/sdks/java/nexmark/index.html | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/content/documentation/sdks/java/nexmark/index.html 
b/content/documentation/sdks/java/nexmark/index.html
index a274439..1935626 100644
--- a/content/documentation/sdks/java/nexmark/index.html
+++ b/content/documentation/sdks/java/nexmark/index.html
@@ -442,7 +442,7 @@ or may be published to Pub/Sub.
 
   3
   ok
-  <a href="https://issues.apache.org/jira/browse/BEAM-1115">BEAM-1115</a>
+  ok
   ok
   <a href="https://issues.apache.org/jira/browse/BEAM-1114">BEAM-1114</a>
 
@@ -545,7 +545,7 @@ or may be published to Pub/Sub.
 
   3
   ok
-  <a href="https://issues.apache.org/jira/browse/BEAM-1035">BEAM-1035</a>, <a href="https://issues.apache.org/jira/browse/BEAM-1115">BEAM-1115</a>
+  <a href="https://issues.apache.org/jira/browse/BEAM-2176">BEAM-2176</a>, <a href="https://issues.apache.org/jira/browse/BEAM-3961">BEAM-3961</a>
   ok
   <a href="https://issues.apache.org/jira/browse/BEAM-1114">BEAM-1114</a>
 



[jira] [Work logged] (BEAM-3339) Create post-release testing of the nightly snapshots

2018-03-28 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3339?focusedWorklogId=85502&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-85502
 ]

ASF GitHub Bot logged work on BEAM-3339:


Author: ASF GitHub Bot
Created on: 29/Mar/18 05:23
Start Date: 29/Mar/18 05:23
Worklog Time Spent: 10m 
  Work Description: yifanzou commented on issue #4788: [BEAM-3339] Mobile 
gaming automation for Java nightly snapshot on core runners
URL: https://github.com/apache/beam/pull/4788#issuecomment-377125329
 
 
   PTAL @tgroh 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 85502)
Time Spent: 80.5h  (was: 80h 20m)

> Create post-release testing of the nightly snapshots
> 
>
> Key: BEAM-3339
> URL: https://issues.apache.org/jira/browse/BEAM-3339
> Project: Beam
>  Issue Type: Improvement
>  Components: testing
>Reporter: Alan Myrvold
>Assignee: Jason Kuster
>Priority: Major
>  Time Spent: 80.5h
>  Remaining Estimate: 0h
>
> The nightly java snapshots in 
> https://repository.apache.org/content/groups/snapshots/org/apache/beam should 
> be verified by following the 
> https://beam.apache.org/get-started/quickstart-java/ instructions, to verify 
> that the release is usable.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[beam-site] branch mergebot updated (de124e3 -> cb89a1c)

2018-03-28 Thread mergebot-role
This is an automated email from the ASF dual-hosted git repository.

mergebot-role pushed a change to branch mergebot
in repository https://gitbox.apache.org/repos/asf/beam-site.git.


 discard de124e3  This closes #409
 discard 219fa0c  Update nexmark support matrix for timer and state APIs
 new f8a41c8  Update nexmark support matrix for timer and state APIs
 new cb89a1c  This closes #409

This update added new revisions after undoing existing revisions.
That is to say, some revisions that were in the old version of the
branch are not in the new version.  This situation occurs
when a user --force pushes a change and generates a repository
containing something like this:

 * -- * -- B -- O -- O -- O   (de124e3)
\
 N -- N -- N   refs/heads/mergebot (cb89a1c)

You should already have received notification emails for all of the O
revisions, and so the following emails describe only the N revisions
from the common base, B.

Any revisions marked "omit" are not gone; other references still
refer to them.  Any revisions marked "discard" are gone forever.

The 2 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.
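The discard/new pattern described above is what a history rewrite looks like from the receiving side. As a minimal sketch (throwaway repository, illustrative commit messages; this is not the actual beam-site history), amending a branch tip replaces an O revision with an N revision that shares the common base B:

```shell
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q repo && cd repo
git config user.email demo@example.invalid
git config user.name demo
# B: the common base both versions of the branch share.
echo base > f.txt && git add f.txt && git commit -qm "B: common base"
# O: the original tip, later discarded by the rewrite.
echo old > f.txt && git commit -qam "O: original revision"
# Rewrite the tip; force-pushing this result is what produces the
# "discard ... / new ..." notification shown above.
echo new > f.txt && git add f.txt
git commit -q --amend -m "N: rewritten revision"
git log --format=%s   # N is reachable, O no longer is
```

After the amend, the discarded O commit is unreachable from the branch, matching the note that revisions marked "discard" are gone unless another ref still points at them.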


Summary of changes:



[beam-site] 02/02: This closes #409

2018-03-28 Thread mergebot-role
This is an automated email from the ASF dual-hosted git repository.

mergebot-role pushed a commit to branch mergebot
in repository https://gitbox.apache.org/repos/asf/beam-site.git

commit cb89a1ca094bcb4ef0752bbce8d6979edf6a9f23
Merge: b9082bf f8a41c8
Author: Mergebot 
AuthorDate: Wed Mar 28 22:17:37 2018 -0700

This closes #409

 src/documentation/sdks/nexmark.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)



[beam-site] 01/02: Update nexmark support matrix for timer and state APIs

2018-03-28 Thread mergebot-role
This is an automated email from the ASF dual-hosted git repository.

mergebot-role pushed a commit to branch mergebot
in repository https://gitbox.apache.org/repos/asf/beam-site.git

commit f8a41c83393c7179aa7c86759b6ef34d6b63306f
Author: Etienne Chauchot 
AuthorDate: Wed Mar 28 16:44:22 2018 +0200

Update nexmark support matrix for timer and state APIs
---
 src/documentation/sdks/nexmark.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/src/documentation/sdks/nexmark.md 
b/src/documentation/sdks/nexmark.md
index f82e53d..c7655f2 100644
--- a/src/documentation/sdks/nexmark.md
+++ b/src/documentation/sdks/nexmark.md
@@ -234,7 +234,7 @@ These tables contain statuses of the queries runs in the 
different runners. Goog
 
   3
   ok
-  <a href="https://issues.apache.org/jira/browse/BEAM-1115">BEAM-1115</a>
+  ok
   ok
   <a href="https://issues.apache.org/jira/browse/BEAM-1114">BEAM-1114</a>
 
@@ -337,7 +337,7 @@ These tables contain statuses of the queries runs in the 
different runners. Goog
 
   3
   ok
-  <a href="https://issues.apache.org/jira/browse/BEAM-1035">BEAM-1035</a>, <a href="https://issues.apache.org/jira/browse/BEAM-1115">BEAM-1115</a>
+  <a href="https://issues.apache.org/jira/browse/BEAM-2176">BEAM-2176</a>, <a href="https://issues.apache.org/jira/browse/BEAM-3961">BEAM-3961</a>
   ok
   <a href="https://issues.apache.org/jira/browse/BEAM-1114">BEAM-1114</a>
 



Build failed in Jenkins: beam_PostRelease_NightlySnapshot #166

2018-03-28 Thread Apache Jenkins Server
See 


Changes:

[yifanzou] return stdout text from command execution and stop using 
var.last_text

--
[...truncated 34.04 MB...]
Waiting on bqjob_r5614f1babc3e5e32_01626ff8fa89_1 ... (34s) Current status: 
PENDING
[...identical PENDING polling lines, (35s) through (57s), truncated...]

[jira] [Commented] (BEAM-3858) Data from JdbcIO.read() cannot pass to next transform on ApexRunner

2018-03-28 Thread huangjianhuang (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-3858?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16418409#comment-16418409
 ] 

huangjianhuang commented on BEAM-3858:
--

[~jbonofre] any suggestion?

> Data from JdbcIO.read() cannot pass to next transform on ApexRunner
> ---
>
> Key: BEAM-3858
> URL: https://issues.apache.org/jira/browse/BEAM-3858
> Project: Beam
>  Issue Type: Bug
>  Components: io-java-jdbc, runner-apex
>Affects Versions: 2.3.0
> Environment: ubuntu16.04
>Reporter: huangjianhuang
>Assignee: Jean-Baptiste Onofré
>Priority: Major
>
> {code:java}
> public static void testJDBCRead(Pipeline pipeline) {
> System.out.println("in testJDBCRead()");
> pipeline.apply(JdbcIO.<String>read()
> 
> .withDataSourceConfiguration(JdbcIO.DataSourceConfiguration.create(
> "com.mysql.jdbc.Driver", 
> "jdbc:mysql://localhost:3307/libra")
> .withUsername("root")
> .withPassword("123456"))
> .withQuery("SELECT * FROM o_flow_account_login limit 3")
> .withCoder(StringUtf8Coder.of())
> .withRowMapper(new JdbcIO.RowMapper<String>() {
> public String mapRow(ResultSet resultSet) throws Exception {
> System.out.println("maprow");
> return "tmp";
> }
> })
> )
> .apply(ParDo.of(new DoFn<String, String>() {
> @ProcessElement
> public void process(ProcessContext context) {
> System.out.println("??");
> context.output(" ");
> }
> }));
> }
> {code}
> On DirectRunner or FlinkRunner, screen shows:
> {code:java}
> maprow
> maprow
> maprow
> ??
> ??
> ??
> {code}
> however on ApexRunner, screen only shows:
> {code:java}
> maprow
> maprow
> maprow
> {code}
>  





[jira] [Work logged] (BEAM-3339) Create post-release testing of the nightly snapshots

2018-03-28 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3339?focusedWorklogId=85491&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-85491
 ]

ASF GitHub Bot logged work on BEAM-3339:


Author: ASF GitHub Bot
Created on: 29/Mar/18 03:53
Start Date: 29/Mar/18 03:53
Worklog Time Spent: 10m 
  Work Description: yifanzou commented on issue #4788: [BEAM-3339] Mobile 
gaming automation for Java nightly snapshot on core runners
URL: https://github.com/apache/beam/pull/4788#issuecomment-377113184
 
 
   Run Dataflow PostRelease




Issue Time Tracking
---

Worklog Id: (was: 85491)
Time Spent: 80h 20m  (was: 80h 10m)



[jira] [Work logged] (BEAM-3339) Create post-release testing of the nightly snapshots

2018-03-28 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3339?focusedWorklogId=85490&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-85490
 ]

ASF GitHub Bot logged work on BEAM-3339:


Author: ASF GitHub Bot
Created on: 29/Mar/18 03:48
Start Date: 29/Mar/18 03:48
Worklog Time Spent: 10m 
  Work Description: yifanzou commented on issue #4788: [BEAM-3339] Mobile 
gaming automation for Java nightly snapshot on core runners
URL: https://github.com/apache/beam/pull/4788#issuecomment-377112562
 
 
   Run Seed Job




Issue Time Tracking
---

Worklog Id: (was: 85490)
Time Spent: 80h 10m  (was: 80h)



Jenkins build is still unstable: beam_PostCommit_Java_MavenInstall #6322

2018-03-28 Thread Apache Jenkins Server
See 




[jira] [Work logged] (BEAM-3339) Create post-release testing of the nightly snapshots

2018-03-28 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3339?focusedWorklogId=85486&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-85486
 ]

ASF GitHub Bot logged work on BEAM-3339:


Author: ASF GitHub Bot
Created on: 29/Mar/18 03:40
Start Date: 29/Mar/18 03:40
Worklog Time Spent: 10m 
  Work Description: yifanzou commented on issue #4788: [BEAM-3339] Mobile 
gaming automation for Java nightly snapshot on core runners
URL: https://github.com/apache/beam/pull/4788#issuecomment-377111269
 
 
   Run Dataflow PostRelease




Issue Time Tracking
---

Worklog Id: (was: 85486)
Time Spent: 80h  (was: 79h 50m)



Build failed in Jenkins: beam_PostRelease_NightlySnapshot #165

2018-03-28 Thread Apache Jenkins Server
See 


Changes:

[yifanzou] return stdout text from command execution and stop using 
var.last_text

--
GitHub pull request #4788 of commit d21ee96d1416d9b7852f60ec14da4f40a3773061, 
no merge conflicts.
Setting status of d21ee96d1416d9b7852f60ec14da4f40a3773061 to PENDING with url 
https://builds.apache.org/job/beam_PostRelease_NightlySnapshot/165/ and 
message: 'Build started sha1 is merged.'
Using context: Jenkins: ./gradlew :release:runJavaExamplesValidationTask
[EnvInject] - Loading node environment variables.
Building remotely on beam4 (beam) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/4788/*:refs/remotes/origin/pr/4788/*
 > git rev-parse refs/remotes/origin/pr/4788/merge^{commit} # timeout=10
 > git rev-parse refs/remotes/origin/origin/pr/4788/merge^{commit} # timeout=10
Checking out Revision e4cc451903634f783f04c6c8ddfa88d74f2d7ce8 
(refs/remotes/origin/pr/4788/merge)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f e4cc451903634f783f04c6c8ddfa88d74f2d7ce8
Commit message: "Merge d21ee96d1416d9b7852f60ec14da4f40a3773061 into 
397688a62b1f9f0f9840a43ed4ad1a59ba77b981"
 > git rev-list --no-walk 9725347b0dc7da1b26d89e7525bb2f6803261910 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
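The `+refs/pull/4788/*:refs/remotes/origin/pr/4788/*` refspec in the fetch above is how the Jenkins job sees GitHub's pull-request merge refs. The same ref mapping can be sketched offline against a local stand-in "server" repository (all paths, ref names, and the PR number here are illustrative, not taken from the build):

```shell
set -e
tmp=$(mktemp -d)
cd "$tmp"
# Stand-in for the server-side repo that advertises refs/pull/<n>/merge.
git init -q server && cd server
git config user.email demo@example.invalid
git config user.name demo
echo x > f.txt && git add f.txt && git commit -qm "pr merge commit"
git update-ref refs/pull/4788/merge HEAD
cd ..
# Client side: fetch the non-branch ref into refs/remotes/..., as the job does.
git init -q client && cd client
git fetch -q "$tmp/server" "+refs/pull/4788/*:refs/remotes/origin/pr/4788/*"
git rev-parse refs/remotes/origin/pr/4788/merge   # commit the job checks out
```

Checking out the fetched merge ref gives the pre-merged PR state, which is why the log resolves `refs/remotes/origin/pr/4788/merge^{commit}` before `git checkout -f`.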
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[Gradle] - Launching build.
[src] $ 
 
-Pver= -Prepourl= :release:runJavaExamplesValidationTask
Parallel execution with configuration on demand is an incubating feature.
Applying build_rules.gradle to src
Applying build_rules.gradle to flink
applyJavaNature with [artifactId:beam-runners-flink_2.11] for project flink
Applying build_rules.gradle to fn-execution
applyJavaNature with [artifactId:beam-model-fn-execution, enableFindbugs:false] 
for project fn-execution
applyGrpcNature with default configuration for project fn-execution
Applying build_rules.gradle to core-java
applyJavaNature with [artifactId:runners-core-java] for project core-java
Applying build_rules.gradle to core
applyJavaNature with [artifactId:beam-sdks-java-core] for project core
applyAvroNature with default configuration for project core
Generating :runQuickstartJavaFlinkLocal
Applying build_rules.gradle to direct-java
applyJavaNature with [artifactId:beam-runners-direct-java] for project 
direct-java
Applying build_rules.gradle to core-construction-java
applyJavaNature with [artifactId:beam-runners-core-construction-java] for 
project core-construction-java
Generating :runQuickstartJavaDirect
Generating :runMobileGamingJavaDirect
Applying build_rules.gradle to google-cloud-dataflow-java
applyJavaNature with [enableFindbugs:false, 
artifactId:beam-runners-google-cloud-dataflow-java] for project 
google-cloud-dataflow-java
Applying build_rules.gradle to google-cloud-platform
applyJavaNature with [artifactId:beam-sdks-java-io-google-cloud-platform, 
enableFindbugs:false] for project google-cloud-platform
Applying build_rules.gradle to google-cloud-platform-core
applyJavaNature with 
[artifactId:beam-sdks-java-extensions-google-cloud-platform-core] for project 
google-cloud-platform-core
Generating :runQuickstartJavaDataflow
Generating :runMobileGamingJavaDataflow
Applying build_rules.gradle to apex
applyJavaNature with [artifactId:beam-runners-apex] for project apex
Generating :runQuickstartJavaApex
Applying build_rules.gradle to spark
applyJavaNature with [artifactId:beam-runners-spark] for project spark
Generating :runQuickstartJavaSpark
:release:compileJava NO-SOURCE
:release:compileGroovy
startup failed:
:
 23: Apparent variable 'output_text' was found in a static scope but doesn't 
refer to a local variable, static field or class. Possible causes:
You attempted to reference a variable in the binding or an instance variable 
from a static context.
You misspelled a classname or statically imported field. Please check the 
spelling.
You attempted to use a method 'output_text' but left out brackets in a place 
not allowed by the grammar.

[jira] [Work logged] (BEAM-3339) Create post-release testing of the nightly snapshots

2018-03-28 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3339?focusedWorklogId=85485&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-85485
 ]

ASF GitHub Bot logged work on BEAM-3339:


Author: ASF GitHub Bot
Created on: 29/Mar/18 03:35
Start Date: 29/Mar/18 03:35
Worklog Time Spent: 10m 
  Work Description: yifanzou commented on issue #4788: [BEAM-3339] Mobile 
gaming automation for Java nightly snapshot on core runners
URL: https://github.com/apache/beam/pull/4788#issuecomment-377110712
 
 
   Run Seed Job




Issue Time Tracking
---

Worklog Id: (was: 85485)
Time Spent: 79h 50m  (was: 79h 40m)



Jenkins build is still unstable: beam_PostCommit_Java_MavenInstall #6321

2018-03-28 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PostRelease_NightlySnapshot #164

2018-03-28 Thread Apache Jenkins Server
See 


--
[...truncated 237.26 KB...]
[INFO] Downloading from test.release: 
https://repository.apache.org/content/repositories/snapshots/org/apache/beam/beam-sdks-java-maven-archetypes-examples/2.5.0-SNAPSHOT/beam-sdks-java-maven-archetypes-examples-2.5.0-20180328.102636-19.jar
[INFO] Downloaded from test.release: 
https://repository.apache.org/content/repositories/snapshots/org/apache/beam/beam-sdks-java-maven-archetypes-examples/2.5.0-SNAPSHOT/beam-sdks-java-maven-archetypes-examples-2.5.0-20180328.102636-19.jar
 (2.8 MB at 4.8 MB/s)
[INFO] 

[INFO] Using following parameters for creating project from Archetype: 
beam-sdks-java-maven-archetypes-examples:2.5.0-SNAPSHOT
[INFO] 

[INFO] Parameter: groupId, Value: org.example
[INFO] Parameter: artifactId, Value: word-count-beam
[INFO] Parameter: version, Value: 0.1
[INFO] Parameter: package, Value: org.apache.beam.examples
[INFO] Parameter: packageInPathFormat, Value: org/apache/beam/examples
[INFO] Parameter: package, Value: org.apache.beam.examples
[INFO] Parameter: version, Value: 0.1
[INFO] Parameter: groupId, Value: org.example
[INFO] Parameter: targetPlatform, Value: 1.8
[INFO] Parameter: artifactId, Value: word-count-beam
[INFO] Downloaded from test.release: 
https://repository.apache.org/content/repositories/snapshots/org/apache/beam/beam-sdks-java-maven-archetypes-examples/2.5.0-SNAPSHOT/beam-sdks-java-maven-archetypes-examples-2.5.0-20180328.102636-19.jar
 (2.8 MB at 4.7 MB/s)
[INFO] Project created from Archetype in dir: 
/tmp/groovy-generated-83687350564370-tmpdir/word-count-beam
[INFO] 
[INFO] BUILD SUCCESS
[INFO] 
[INFO] Total time: 19.916 s
[INFO] Finished at: 2018-03-29T03:31:52Z
[INFO] 

[INFO] Using following parameters for creating project from Archetype: 
beam-sdks-java-maven-archetypes-examples:2.5.0-SNAPSHOT
[INFO] 

[INFO] Parameter: groupId, Value: org.example
[INFO] Parameter: artifactId, Value: word-count-beam
[INFO] Parameter: version, Value: 0.1
[INFO] Parameter: package, Value: org.apache.beam.examples
[INFO] Parameter: packageInPathFormat, Value: org/apache/beam/examples
[INFO] Parameter: package, Value: org.apache.beam.examples
[INFO] Parameter: version, Value: 0.1
[INFO] Parameter: groupId, Value: org.example
[INFO] Parameter: targetPlatform, Value: 1.8
[INFO] Parameter: artifactId, Value: word-count-beam
[INFO] Project created from Archetype in dir: 
/tmp/groovy-generated-2936964842916562708-tmpdir/word-count-beam
[INFO] 
[INFO] BUILD SUCCESS
[INFO] 
[INFO] Total time: 19.885 s
[INFO] Finished at: 2018-03-29T03:31:53Z
[INFO] Final Memory: 21M/300M
[INFO] 
Exception in thread "main" groovy.lang.MissingMethodException: No signature of 
method: TestScripts.see() is applicable for argument types: (java.lang.String) 
values: [[INFO] BUILD SUCCESS]
Possible solutions: see(java.lang.String, java.lang.String), ver(), 
run(java.lang.String), sleep(long), use([Ljava.lang.Object;), done()
at 
org.codehaus.groovy.runtime.ScriptBytecodeAdapter.unwrap(ScriptBytecodeAdapter.java:58)
at 
org.codehaus.groovy.runtime.callsite.PogoMetaClassSite.call(PogoMetaClassSite.java:54)
at 
org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:48)
at 
org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:113)
at 
org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:125)
at QuickstartArchetype.generate(QuickstartArchetype.groovy:34)
at QuickstartArchetype$generate.call(Unknown Source)
at 
org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:48)
at 
org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:113)
at 
org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:125)
at quickstart-java-flinklocal.run(quickstart-java-flinklocal.groovy:29)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at 

[jira] [Work logged] (BEAM-3339) Create post-release testing of the nightly snapshots

2018-03-28 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3339?focusedWorklogId=85483&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-85483
 ]

ASF GitHub Bot logged work on BEAM-3339:


Author: ASF GitHub Bot
Created on: 29/Mar/18 03:30
Start Date: 29/Mar/18 03:30
Worklog Time Spent: 10m 
  Work Description: yifanzou commented on issue #4788: [BEAM-3339] Mobile 
gaming automation for Java nightly snapshot on core runners
URL: https://github.com/apache/beam/pull/4788#issuecomment-377109950
 
 
   Run Dataflow PostRelease




Issue Time Tracking
---

Worklog Id: (was: 85483)
Time Spent: 79h 40m  (was: 79.5h)



[jira] [Work logged] (BEAM-3339) Create post-release testing of the nightly snapshots

2018-03-28 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3339?focusedWorklogId=85482&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-85482
 ]

ASF GitHub Bot logged work on BEAM-3339:


Author: ASF GitHub Bot
Created on: 29/Mar/18 03:26
Start Date: 29/Mar/18 03:26
Worklog Time Spent: 10m 
  Work Description: yifanzou commented on issue #4788: [BEAM-3339] Mobile 
gaming automation for Java nightly snapshot on core runners
URL: https://github.com/apache/beam/pull/4788#issuecomment-377109314
 
 
   Run Seed Job




Issue Time Tracking
---

Worklog Id: (was: 85482)
Time Spent: 79.5h  (was: 79h 20m)



Build failed in Jenkins: beam_PostCommit_Python_Verify #4528

2018-03-28 Thread Apache Jenkins Server
See 


--
[...truncated 260.87 KB...]
Creating file target/docs/source/apache_beam.io.gcp.pubsub.rst.
Creating file target/docs/source/apache_beam.io.gcp.rst.
Creating file target/docs/source/apache_beam.io.gcp.datastore.rst.
Creating file 
target/docs/source/apache_beam.io.gcp.datastore.v1.adaptive_throttler.rst.
Creating file 
target/docs/source/apache_beam.io.gcp.datastore.v1.datastoreio.rst.
Creating file 
target/docs/source/apache_beam.io.gcp.datastore.v1.fake_datastore.rst.
Creating file target/docs/source/apache_beam.io.gcp.datastore.v1.helper.rst.
Creating file 
target/docs/source/apache_beam.io.gcp.datastore.v1.query_splitter.rst.
Creating file target/docs/source/apache_beam.io.gcp.datastore.v1.util.rst.
Creating file target/docs/source/apache_beam.io.gcp.datastore.v1.rst.
Creating file target/docs/source/apache_beam.metrics.cells.rst.
Creating file target/docs/source/apache_beam.metrics.metric.rst.
Creating file target/docs/source/apache_beam.metrics.metricbase.rst.
Creating file target/docs/source/apache_beam.metrics.rst.
Creating file target/docs/source/apache_beam.options.pipeline_options.rst.
Creating file 
target/docs/source/apache_beam.options.pipeline_options_validator.rst.
Creating file target/docs/source/apache_beam.options.value_provider.rst.
Creating file target/docs/source/apache_beam.options.rst.
Creating file target/docs/source/apache_beam.portability.common_urns.rst.
Creating file target/docs/source/apache_beam.portability.python_urns.rst.
Creating file target/docs/source/apache_beam.portability.rst.
Creating file 
target/docs/source/apache_beam.portability.api.beam_artifact_api_pb2_grpc.rst.
Creating file 
target/docs/source/apache_beam.portability.api.beam_fn_api_pb2_grpc.rst.
Creating file 
target/docs/source/apache_beam.portability.api.beam_job_api_pb2_grpc.rst.
Creating file 
target/docs/source/apache_beam.portability.api.beam_provision_api_pb2_grpc.rst.
Creating file 
target/docs/source/apache_beam.portability.api.beam_runner_api_pb2_grpc.rst.
Creating file 
target/docs/source/apache_beam.portability.api.endpoints_pb2_grpc.rst.
Creating file 
target/docs/source/apache_beam.portability.api.standard_window_fns_pb2_grpc.rst.
Creating file target/docs/source/apache_beam.portability.api.rst.
Creating file target/docs/source/apache_beam.runners.pipeline_context.rst.
Creating file target/docs/source/apache_beam.runners.runner.rst.
Creating file target/docs/source/apache_beam.runners.sdf_common.rst.
Creating file target/docs/source/apache_beam.runners.rst.
Creating file 
target/docs/source/apache_beam.runners.dataflow.dataflow_metrics.rst.
Creating file 
target/docs/source/apache_beam.runners.dataflow.dataflow_runner.rst.
Creating file 
target/docs/source/apache_beam.runners.dataflow.ptransform_overrides.rst.
Creating file 
target/docs/source/apache_beam.runners.dataflow.test_dataflow_runner.rst.
Creating file target/docs/source/apache_beam.runners.dataflow.rst.
Creating file 
target/docs/source/apache_beam.runners.dataflow.native_io.iobase.rst.
Creating file 
target/docs/source/apache_beam.runners.dataflow.native_io.streaming_create.rst.
Creating file target/docs/source/apache_beam.runners.dataflow.native_io.rst.
Creating file target/docs/source/apache_beam.runners.direct.bundle_factory.rst.
Creating file target/docs/source/apache_beam.runners.direct.clock.rst.
Creating file 
target/docs/source/apache_beam.runners.direct.consumer_tracking_pipeline_visitor.rst.
Creating file target/docs/source/apache_beam.runners.direct.direct_metrics.rst.
Creating file target/docs/source/apache_beam.runners.direct.direct_runner.rst.
Creating file 
target/docs/source/apache_beam.runners.direct.evaluation_context.rst.
Creating file target/docs/source/apache_beam.runners.direct.executor.rst.
Creating file 
target/docs/source/apache_beam.runners.direct.helper_transforms.rst.
Creating file 
target/docs/source/apache_beam.runners.direct.sdf_direct_runner.rst.
Creating file 
target/docs/source/apache_beam.runners.direct.transform_evaluator.rst.
Creating file target/docs/source/apache_beam.runners.direct.util.rst.
Creating file 
target/docs/source/apache_beam.runners.direct.watermark_manager.rst.
Creating file target/docs/source/apache_beam.runners.direct.rst.
Creating file target/docs/source/apache_beam.runners.experimental.rst.
Creating file 
target/docs/source/apache_beam.runners.experimental.python_rpc_direct.python_rpc_direct_runner.rst.
Creating file 
target/docs/source/apache_beam.runners.experimental.python_rpc_direct.server.rst.
Creating file 
target/docs/source/apache_beam.runners.experimental.python_rpc_direct.rst.
Creating file target/docs/source/apache_beam.runners.job.manager.rst.
Creating file target/docs/source/apache_beam.runners.job.utils.rst.
Creating file target/docs/source/apache_beam.runners.job.rst.
Creating file 

Build failed in Jenkins: beam_PostCommit_Python_Verify #4527

2018-03-28 Thread Apache Jenkins Server
See 


Changes:

[lcwik] [BEAM-3326] Address additional comments from PR/4963.

[lcwik] [BEAM-3104] Set up state interfaces, wire into SDK harness client.

--
[...truncated 260.92 KB...]
Creating file target/docs/source/apache_beam.io.gcp.pubsub.rst.
Creating file target/docs/source/apache_beam.io.gcp.rst.
Creating file target/docs/source/apache_beam.io.gcp.datastore.rst.
Creating file 
target/docs/source/apache_beam.io.gcp.datastore.v1.adaptive_throttler.rst.
Creating file 
target/docs/source/apache_beam.io.gcp.datastore.v1.datastoreio.rst.
Creating file 
target/docs/source/apache_beam.io.gcp.datastore.v1.fake_datastore.rst.
Creating file target/docs/source/apache_beam.io.gcp.datastore.v1.helper.rst.
Creating file 
target/docs/source/apache_beam.io.gcp.datastore.v1.query_splitter.rst.
Creating file target/docs/source/apache_beam.io.gcp.datastore.v1.util.rst.
Creating file target/docs/source/apache_beam.io.gcp.datastore.v1.rst.
Creating file target/docs/source/apache_beam.metrics.cells.rst.
Creating file target/docs/source/apache_beam.metrics.metric.rst.
Creating file target/docs/source/apache_beam.metrics.metricbase.rst.
Creating file target/docs/source/apache_beam.metrics.rst.
Creating file target/docs/source/apache_beam.options.pipeline_options.rst.
Creating file 
target/docs/source/apache_beam.options.pipeline_options_validator.rst.
Creating file target/docs/source/apache_beam.options.value_provider.rst.
Creating file target/docs/source/apache_beam.options.rst.
Creating file target/docs/source/apache_beam.portability.common_urns.rst.
Creating file target/docs/source/apache_beam.portability.python_urns.rst.
Creating file target/docs/source/apache_beam.portability.rst.
Creating file 
target/docs/source/apache_beam.portability.api.beam_artifact_api_pb2_grpc.rst.
Creating file 
target/docs/source/apache_beam.portability.api.beam_fn_api_pb2_grpc.rst.
Creating file 
target/docs/source/apache_beam.portability.api.beam_job_api_pb2_grpc.rst.
Creating file 
target/docs/source/apache_beam.portability.api.beam_provision_api_pb2_grpc.rst.
Creating file 
target/docs/source/apache_beam.portability.api.beam_runner_api_pb2_grpc.rst.
Creating file 
target/docs/source/apache_beam.portability.api.endpoints_pb2_grpc.rst.
Creating file 
target/docs/source/apache_beam.portability.api.standard_window_fns_pb2_grpc.rst.
Creating file target/docs/source/apache_beam.portability.api.rst.
Creating file target/docs/source/apache_beam.runners.pipeline_context.rst.
Creating file target/docs/source/apache_beam.runners.runner.rst.
Creating file target/docs/source/apache_beam.runners.sdf_common.rst.
Creating file target/docs/source/apache_beam.runners.rst.
Creating file 
target/docs/source/apache_beam.runners.dataflow.dataflow_metrics.rst.
Creating file 
target/docs/source/apache_beam.runners.dataflow.dataflow_runner.rst.
Creating file 
target/docs/source/apache_beam.runners.dataflow.ptransform_overrides.rst.
Creating file 
target/docs/source/apache_beam.runners.dataflow.test_dataflow_runner.rst.
Creating file target/docs/source/apache_beam.runners.dataflow.rst.
Creating file 
target/docs/source/apache_beam.runners.dataflow.native_io.iobase.rst.
Creating file 
target/docs/source/apache_beam.runners.dataflow.native_io.streaming_create.rst.
Creating file target/docs/source/apache_beam.runners.dataflow.native_io.rst.
Creating file target/docs/source/apache_beam.runners.direct.bundle_factory.rst.
Creating file target/docs/source/apache_beam.runners.direct.clock.rst.
Creating file 
target/docs/source/apache_beam.runners.direct.consumer_tracking_pipeline_visitor.rst.
Creating file target/docs/source/apache_beam.runners.direct.direct_metrics.rst.
Creating file target/docs/source/apache_beam.runners.direct.direct_runner.rst.
Creating file 
target/docs/source/apache_beam.runners.direct.evaluation_context.rst.
Creating file target/docs/source/apache_beam.runners.direct.executor.rst.
Creating file 
target/docs/source/apache_beam.runners.direct.helper_transforms.rst.
Creating file 
target/docs/source/apache_beam.runners.direct.sdf_direct_runner.rst.
Creating file 
target/docs/source/apache_beam.runners.direct.transform_evaluator.rst.
Creating file target/docs/source/apache_beam.runners.direct.util.rst.
Creating file 
target/docs/source/apache_beam.runners.direct.watermark_manager.rst.
Creating file target/docs/source/apache_beam.runners.direct.rst.
Creating file target/docs/source/apache_beam.runners.experimental.rst.
Creating file 
target/docs/source/apache_beam.runners.experimental.python_rpc_direct.python_rpc_direct_runner.rst.
Creating file 
target/docs/source/apache_beam.runners.experimental.python_rpc_direct.server.rst.
Creating file 
target/docs/source/apache_beam.runners.experimental.python_rpc_direct.rst.
Creating file target/docs/source/apache_beam.runners.job.manager.rst.
Creating file 

Build failed in Jenkins: beam_PostCommit_Java_MavenInstall #6323

2018-03-28 Thread Apache Jenkins Server
See 


Changes:

[lcwik] [BEAM-3104] Set up state interfaces, wire into SDK harness client.

--
[...truncated 498.72 KB...]
2018-03-29T01:00:52.544 [INFO] This project has been banned from the build due 
to previous failures.
2018-03-29T01:00:52.544 [INFO] 

2018-03-29T01:00:52.544 [INFO] 
2018-03-29T01:00:52.544 [INFO] 

2018-03-29T01:00:52.544 [INFO] Skipping Apache Beam :: Examples :: Java
2018-03-29T01:00:52.544 [INFO] This project has been banned from the build due 
to previous failures.
2018-03-29T01:00:52.544 [INFO] 

2018-03-29T01:00:52.544 [INFO] 
2018-03-29T01:00:52.544 [INFO] 

2018-03-29T01:00:52.544 [INFO] Skipping Apache Beam :: SDKs :: Java :: Maven 
Archetypes :: Examples
2018-03-29T01:00:52.544 [INFO] This project has been banned from the build due 
to previous failures.
2018-03-29T01:00:52.544 [INFO] 

2018-03-29T01:00:52.544 [INFO] 
2018-03-29T01:00:52.544 [INFO] 

2018-03-29T01:00:52.544 [INFO] Skipping Apache Beam :: SDKs :: Java :: 
Extensions :: Jackson
2018-03-29T01:00:52.544 [INFO] This project has been banned from the build due 
to previous failures.
2018-03-29T01:00:52.544 [INFO] 

2018-03-29T01:00:52.544 [INFO] 
2018-03-29T01:00:52.544 [INFO] 

2018-03-29T01:00:52.544 [INFO] Skipping Apache Beam :: SDKs :: Java :: 
Extensions :: Join library
2018-03-29T01:00:52.544 [INFO] This project has been banned from the build due 
to previous failures.
2018-03-29T01:00:52.544 [INFO] 

2018-03-29T01:00:52.544 [INFO] 
2018-03-29T01:00:52.544 [INFO] 

2018-03-29T01:00:52.544 [INFO] Skipping Apache Beam :: SDKs :: Java :: 
Extensions :: Sketching
2018-03-29T01:00:52.544 [INFO] This project has been banned from the build due 
to previous failures.
2018-03-29T01:00:52.544 [INFO] 

2018-03-29T01:00:52.544 [INFO] 
2018-03-29T01:00:52.544 [INFO] 

2018-03-29T01:00:52.544 [INFO] Skipping Apache Beam :: SDKs :: Java :: 
Extensions :: Sorter
2018-03-29T01:00:52.544 [INFO] This project has been banned from the build due 
to previous failures.
2018-03-29T01:00:52.544 [INFO] 

2018-03-29T01:00:52.545 [INFO] 
2018-03-29T01:00:52.545 [INFO] 

2018-03-29T01:00:52.545 [INFO] Skipping Apache Beam :: SDKs :: Java :: 
Extensions :: SQL
2018-03-29T01:00:52.545 [INFO] This project has been banned from the build due 
to previous failures.
2018-03-29T01:00:52.545 [INFO] 

2018-03-29T01:00:52.545 [INFO] 
2018-03-29T01:00:52.545 [INFO] 

2018-03-29T01:00:52.545 [INFO] Skipping Apache Beam :: SDKs :: Java :: Nexmark
2018-03-29T01:00:52.545 [INFO] This project has been banned from the build due 
to previous failures.
2018-03-29T01:00:52.545 [INFO] 

2018-03-29T01:00:52.545 [INFO] 
2018-03-29T01:00:52.545 [INFO] 

2018-03-29T01:00:52.545 [INFO] Skipping Apache Beam :: Runners :: Java Fn 
Execution
2018-03-29T01:00:52.545 [INFO] This project has been banned from the build due 
to previous failures.
2018-03-29T01:00:52.545 [INFO] 

2018-03-29T01:00:52.545 [INFO] 
2018-03-29T01:00:52.545 [INFO] 

2018-03-29T01:00:52.546 [INFO] Skipping Apache Beam :: Runners :: Java Local 
Artifact Service
2018-03-29T01:00:52.546 [INFO] This project has been banned from the build due 
to previous failures.
2018-03-29T01:00:52.546 [INFO] 

2018-03-29T01:00:52.546 [INFO] 
2018-03-29T01:00:52.547 [INFO] 

2018-03-29T01:00:52.547 [INFO] Skipping Apache 

Build failed in Jenkins: beam_PerformanceTests_Spark #1523

2018-03-28 Thread Apache Jenkins Server
See 


Changes:

[lcwik] [BEAM-3326] Address additional comments from PR/4963.

--
[...truncated 96.64 KB...]
'apache-beam-testing:bqjob_r316dce24734be691_01626f3b9d25_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: 
/usr/lib/google-cloud-sdk/platform/bq/third_party/oauth2client/contrib/gce.py:73:
 UserWarning: You have requested explicit scopes to be used with a GCE service 
account.
Using this argument will have no effect on the actual scopes for tokens
requested. These scopes are set at VM instance creation time and
can't be overridden in the request.

  warnings.warn(_SCOPES_WARNING)
Upload complete.
Waiting on bqjob_r316dce24734be691_01626f3b9d25_1 ... (0s) Current status: RUNNING
Waiting on bqjob_r316dce24734be691_01626f3b9d25_1 ... (0s) Current status: DONE
2018-03-29 00:50:02,723 904a4b03 MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.
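The load failures repeating above come from `bq load --autodetect` inferring FLOAT for a timestamp field serialized as bare epoch seconds, which then conflicts with the table's existing TIMESTAMP column. A minimal sketch of one workaround, with hypothetical field names (not taken from the PKB output): write the field as a canonical BigQuery timestamp string so autodetection keeps it a TIMESTAMP:

```python
import json
from datetime import datetime, timezone

def to_bq_timestamp(epoch_seconds):
    """Render epoch seconds in BigQuery's canonical timestamp format,
    which --autodetect recognizes as TIMESTAMP rather than FLOAT."""
    dt = datetime.fromtimestamp(epoch_seconds, tz=timezone.utc)
    return dt.strftime("%Y-%m-%d %H:%M:%S.%f UTC")

# Hypothetical newline-delimited-JSON result row before upload.
row = {"metric": "run_time", "value": 123.4, "timestamp": 1522284602.0}
row["timestamp"] = to_bq_timestamp(row["timestamp"])
print(json.dumps(row))  # timestamp becomes "2018-03-29 00:50:02.000000 UTC"
```

Alternatively, supplying an explicit schema to `bq load` instead of relying on `--autodetect` avoids the type inference entirely.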

2018-03-29 00:50:21,093 904a4b03 MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-03-29 00:50:23,444 904a4b03 MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 

  ReturnCode:1
STDOUT: 

BigQuery error in load operation: Error processing job
'apache-beam-testing:bqjob_r4cb71a5cba475d7_01626f3bee72_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: 
/usr/lib/google-cloud-sdk/platform/bq/third_party/oauth2client/contrib/gce.py:73:
 UserWarning: You have requested explicit scopes to be used with a GCE service 
account.
Using this argument will have no effect on the actual scopes for tokens
requested. These scopes are set at VM instance creation time and
can't be overridden in the request.

  warnings.warn(_SCOPES_WARNING)
Upload complete.
Waiting on bqjob_r4cb71a5cba475d7_01626f3bee72_1 ... (0s) Current status: RUNNING
Waiting on bqjob_r4cb71a5cba475d7_01626f3bee72_1 ... (0s) Current status: DONE
2018-03-29 00:50:23,445 904a4b03 MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.

2018-03-29 00:50:39,330 904a4b03 MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-03-29 00:50:41,639 904a4b03 MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 

  ReturnCode:1
STDOUT: 

BigQuery error in load operation: Error processing job
'apache-beam-testing:bqjob_r519624190352bbc3_01626f3c35c0_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: 
/usr/lib/google-cloud-sdk/platform/bq/third_party/oauth2client/contrib/gce.py:73:
 UserWarning: You have requested explicit scopes to be used with a GCE service 
account.
Using this argument will have no effect on the actual scopes for tokens
requested. These scopes are set at VM instance creation time and
can't be overridden in the request.

  warnings.warn(_SCOPES_WARNING)
Upload complete.
Waiting on bqjob_r519624190352bbc3_01626f3c35c0_1 ... (0s) Current status: RUNNING
Waiting on bqjob_r519624190352bbc3_01626f3c35c0_1 ... (0s) Current status: DONE
2018-03-29 00:50:41,639 904a4b03 MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.

2018-03-29 00:51:04,820 904a4b03 MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-03-29 00:51:07,022 904a4b03 MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 

  ReturnCode:1
STDOUT: 

BigQuery error in load operation: Error processing job
'apache-beam-testing:bqjob_r40864b80fcfe4f3e_01626f3c9949_1': Invalid schema
update. Field timestamp has changed 

Build failed in Jenkins: beam_PerformanceTests_XmlIOIT #78

2018-03-28 Thread Apache Jenkins Server
See 


Changes:

[lcwik] [BEAM-3326] Address additional comments from PR/4963.

--
[...truncated 25.81 KB...]
[INFO] Excluding commons-codec:commons-codec:jar:1.3 from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client-jackson2:jar:1.22.0 
from the shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-dataflow:jar:v1b3-rev221-1.22.0 from the 
shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-clouddebugger:jar:v2-rev8-1.22.0 from the 
shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-storage:jar:v1-rev71-1.22.0 from the shaded 
jar.
[INFO] Excluding com.google.auth:google-auth-library-credentials:jar:0.7.1 from 
the shaded jar.
[INFO] Excluding com.google.auth:google-auth-library-oauth2-http:jar:0.7.1 from 
the shaded jar.
[INFO] Excluding com.google.cloud.bigdataoss:util:jar:1.4.5 from the shaded jar.
[INFO] Excluding com.google.api-client:google-api-client-java6:jar:1.22.0 from 
the shaded jar.
[INFO] Excluding com.google.api-client:google-api-client-jackson2:jar:1.22.0 
from the shaded jar.
[INFO] Excluding com.google.oauth-client:google-oauth-client-java6:jar:1.22.0 
from the shaded jar.
[INFO] Excluding 
org.apache.beam:beam-sdks-java-io-google-cloud-platform:jar:2.5.0-SNAPSHOT from 
the shaded jar.
[INFO] Excluding 
org.apache.beam:beam-sdks-java-extensions-protobuf:jar:2.5.0-SNAPSHOT from the 
shaded jar.
[INFO] Excluding io.grpc:grpc-core:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.errorprone:error_prone_annotations:jar:2.0.15 from 
the shaded jar.
[INFO] Excluding io.grpc:grpc-context:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.instrumentation:instrumentation-api:jar:0.3.0 from 
the shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-bigquery:jar:v2-rev374-1.22.0 from the 
shaded jar.
[INFO] Excluding com.google.api:gax-grpc:jar:0.20.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-protobuf:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.api:api-common:jar:1.0.0-rc2 from the shaded jar.
[INFO] Excluding com.google.auto.value:auto-value:jar:1.5.3 from the shaded jar.
[INFO] Excluding com.google.api:gax:jar:1.3.1 from the shaded jar.
[INFO] Excluding org.threeten:threetenbp:jar:1.3.3 from the shaded jar.
[INFO] Excluding com.google.cloud:google-cloud-core-grpc:jar:1.2.0 from the 
shaded jar.
[INFO] Excluding com.google.protobuf:protobuf-java-util:jar:3.2.0 from the 
shaded jar.
[INFO] Excluding com.google.code.gson:gson:jar:2.7 from the shaded jar.
[INFO] Excluding com.google.apis:google-api-services-pubsub:jar:v1-rev10-1.22.0 
from the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-cloud-pubsub-v1:jar:0.1.18 
from the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-cloud-pubsub-v1:jar:0.1.18 
from the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-iam-v1:jar:0.1.18 from the 
shaded jar.
[INFO] Excluding com.google.cloud.datastore:datastore-v1-proto-client:jar:1.4.0 
from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client-protobuf:jar:1.22.0 
from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client-jackson:jar:1.22.0 
from the shaded jar.
[INFO] Excluding com.google.cloud.datastore:datastore-v1-protos:jar:1.3.0 from 
the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-common-protos:jar:0.1.9 from 
the shaded jar.
[INFO] Excluding io.grpc:grpc-auth:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-netty:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.netty:netty-codec-http2:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-codec-http:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-handler-proxy:jar:4.1.8.Final from the shaded 
jar.
[INFO] Excluding io.netty:netty-codec-socks:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-handler:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-buffer:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-common:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-transport:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-resolver:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-codec:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.grpc:grpc-stub:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-all:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-okhttp:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.squareup.okhttp:okhttp:jar:2.5.0 from the shaded jar.
[INFO] Excluding com.squareup.okio:okio:jar:1.6.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-protobuf-lite:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-protobuf-nano:jar:1.2.0 from the shaded jar.
[INFO] Excluding 

Build failed in Jenkins: beam_PerformanceTests_TextIOIT #324

2018-03-28 Thread Apache Jenkins Server
See 


Changes:

[lcwik] [BEAM-3326] Address additional comments from PR/4963.

--
[...truncated 27.46 KB...]
[INFO] Excluding commons-codec:commons-codec:jar:1.3 from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client-jackson2:jar:1.22.0 
from the shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-dataflow:jar:v1b3-rev221-1.22.0 from the 
shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-clouddebugger:jar:v2-rev8-1.22.0 from the 
shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-storage:jar:v1-rev71-1.22.0 from the shaded 
jar.
[INFO] Excluding com.google.auth:google-auth-library-credentials:jar:0.7.1 from 
the shaded jar.
[INFO] Excluding com.google.auth:google-auth-library-oauth2-http:jar:0.7.1 from 
the shaded jar.
[INFO] Excluding com.google.cloud.bigdataoss:util:jar:1.4.5 from the shaded jar.
[INFO] Excluding com.google.api-client:google-api-client-java6:jar:1.22.0 from 
the shaded jar.
[INFO] Excluding com.google.api-client:google-api-client-jackson2:jar:1.22.0 
from the shaded jar.
[INFO] Excluding com.google.oauth-client:google-oauth-client-java6:jar:1.22.0 
from the shaded jar.
[INFO] Excluding 
org.apache.beam:beam-sdks-java-io-google-cloud-platform:jar:2.5.0-SNAPSHOT from 
the shaded jar.
[INFO] Excluding 
org.apache.beam:beam-sdks-java-extensions-protobuf:jar:2.5.0-SNAPSHOT from the 
shaded jar.
[INFO] Excluding io.grpc:grpc-core:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.errorprone:error_prone_annotations:jar:2.0.15 from 
the shaded jar.
[INFO] Excluding io.grpc:grpc-context:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.instrumentation:instrumentation-api:jar:0.3.0 from 
the shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-bigquery:jar:v2-rev374-1.22.0 from the 
shaded jar.
[INFO] Excluding com.google.api:gax-grpc:jar:0.20.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-protobuf:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.api:api-common:jar:1.0.0-rc2 from the shaded jar.
[INFO] Excluding com.google.auto.value:auto-value:jar:1.5.3 from the shaded jar.
[INFO] Excluding com.google.api:gax:jar:1.3.1 from the shaded jar.
[INFO] Excluding org.threeten:threetenbp:jar:1.3.3 from the shaded jar.
[INFO] Excluding com.google.cloud:google-cloud-core-grpc:jar:1.2.0 from the 
shaded jar.
[INFO] Excluding com.google.protobuf:protobuf-java-util:jar:3.2.0 from the 
shaded jar.
[INFO] Excluding com.google.code.gson:gson:jar:2.7 from the shaded jar.
[INFO] Excluding com.google.apis:google-api-services-pubsub:jar:v1-rev10-1.22.0 
from the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-cloud-pubsub-v1:jar:0.1.18 
from the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-cloud-pubsub-v1:jar:0.1.18 
from the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-iam-v1:jar:0.1.18 from the 
shaded jar.
[INFO] Excluding com.google.cloud.datastore:datastore-v1-proto-client:jar:1.4.0 
from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client-protobuf:jar:1.22.0 
from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client-jackson:jar:1.22.0 
from the shaded jar.
[INFO] Excluding com.google.cloud.datastore:datastore-v1-protos:jar:1.3.0 from 
the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-common-protos:jar:0.1.9 from 
the shaded jar.
[INFO] Excluding io.grpc:grpc-auth:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-netty:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.netty:netty-codec-http2:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-codec-http:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-handler-proxy:jar:4.1.8.Final from the shaded 
jar.
[INFO] Excluding io.netty:netty-codec-socks:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-handler:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-buffer:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-common:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-transport:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-resolver:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-codec:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.grpc:grpc-stub:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-all:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-okhttp:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.squareup.okhttp:okhttp:jar:2.5.0 from the shaded jar.
[INFO] Excluding com.squareup.okio:okio:jar:1.6.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-protobuf-lite:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-protobuf-nano:jar:1.2.0 from the shaded jar.
[INFO] Excluding 

[jira] [Work logged] (BEAM-3104) Implement a BeamFnState Service

2018-03-28 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3104?focusedWorklogId=85454&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-85454
 ]

ASF GitHub Bot logged work on BEAM-3104:


Author: ASF GitHub Bot
Created on: 29/Mar/18 00:49
Start Date: 29/Mar/18 00:49
Worklog Time Spent: 10m 
  Work Description: lukecwik closed pull request #4973: [BEAM-3104] Set up 
state interfaces, wire into SDK harness client.
URL: https://github.com/apache/beam/pull/4973
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

diff --git 
a/runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/control/SdkHarnessClient.java
 
b/runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/control/SdkHarnessClient.java
index ebcb8b4d33f..0447cf4574c 100644
--- 
a/runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/control/SdkHarnessClient.java
+++ 
b/runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/control/SdkHarnessClient.java
@@ -17,22 +17,25 @@
  */
 package org.apache.beam.runners.fnexecution.control;
 
+import static com.google.common.base.Preconditions.checkArgument;
+import static com.google.common.base.Preconditions.checkState;
+
 import com.google.auto.value.AutoValue;
-import com.google.common.cache.Cache;
-import com.google.common.cache.CacheBuilder;
-import java.util.Collections;
 import java.util.HashMap;
 import java.util.Map;
 import java.util.concurrent.CompletionStage;
-import java.util.concurrent.ExecutionException;
+import java.util.concurrent.ConcurrentHashMap;
 import java.util.concurrent.atomic.AtomicLong;
 import org.apache.beam.model.fnexecution.v1.BeamFnApi;
 import org.apache.beam.model.fnexecution.v1.BeamFnApi.InstructionResponse;
 import org.apache.beam.model.fnexecution.v1.BeamFnApi.ProcessBundleDescriptor;
 import org.apache.beam.model.fnexecution.v1.BeamFnApi.ProcessBundleRequest;
 import org.apache.beam.model.fnexecution.v1.BeamFnApi.RegisterResponse;
+import org.apache.beam.model.pipeline.v1.Endpoints;
 import org.apache.beam.runners.fnexecution.data.FnDataService;
 import org.apache.beam.runners.fnexecution.data.RemoteInputDestination;
+import org.apache.beam.runners.fnexecution.state.StateDelegator;
+import org.apache.beam.runners.fnexecution.state.StateRequestHandler;
 import org.apache.beam.sdk.coders.Coder;
 import org.apache.beam.sdk.fn.data.CloseableFnDataReceiver;
 import org.apache.beam.sdk.fn.data.FnDataReceiver;
@@ -74,18 +77,20 @@ public String getId() {
* A processor capable of creating bundles for some registered {@link 
ProcessBundleDescriptor}.
*/
   public class BundleProcessor {
-private final String processBundleDescriptorId;
+private final ProcessBundleDescriptor processBundleDescriptor;
 private final CompletionStage registrationFuture;
-
 private final RemoteInputDestination remoteInput;
+private final StateDelegator stateDelegator;
 
 private BundleProcessor(
-String processBundleDescriptorId,
+ProcessBundleDescriptor processBundleDescriptor,
 CompletionStage registrationFuture,
-RemoteInputDestination remoteInput) {
-  this.processBundleDescriptorId = processBundleDescriptorId;
+RemoteInputDestination remoteInput,
+StateDelegator stateDelegator) {
+  this.processBundleDescriptor = processBundleDescriptor;
   this.registrationFuture = registrationFuture;
   this.remoteInput = remoteInput;
+  this.stateDelegator = stateDelegator;
 }
 
 public CompletionStage getRegistrationFuture() {
@@ -101,12 +106,37 @@ private BundleProcessor(
  * NOTE: It is important to {@link #close()} each bundle after all 
elements are emitted.
  * {@code
  * try (ActiveBundle bundle = SdkHarnessClient.newBundle(...)) {
- *   // send all elements
+ *   FnDataReceiver inputReceiver = bundle.getInputReceiver();
+ *   // send all elements ...
  * }
  * }
  */
 public ActiveBundle newBundle(
 Map outputReceivers) {
+  return newBundle(outputReceivers, request -> {
+throw new UnsupportedOperationException(String.format(
+"The %s does not have a registered state handler.",
+ActiveBundle.class.getSimpleName()));
+  });
+}
+
+/**
+ * Start a new bundle for the given {@link 
BeamFnApi.ProcessBundleDescriptor} identifier.
+ *
+ * The input channels for the returned {@link ActiveBundle} are derived 
from the instructions
+ * in the {@link BeamFnApi.ProcessBundleDescriptor}.
+ *
+ * NOTE: It 

[beam] 01/01: [BEAM-3104] Set up state interfaces, wire into SDK harness client.

2018-03-28 Thread lcwik
This is an automated email from the ASF dual-hosted git repository.

lcwik pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git

commit 397688a62b1f9f0f9840a43ed4ad1a59ba77b981
Merge: 5973c4d cec7c03
Author: Lukasz Cwik 
AuthorDate: Wed Mar 28 17:48:57 2018 -0700

[BEAM-3104] Set up state interfaces, wire into SDK harness client.

 .../fnexecution/control/SdkHarnessClient.java  | 208 ++
 .../fnexecution/state/GrpcStateService.java|  25 ++-
 .../runners/fnexecution/state/StateDelegator.java  |  19 +-
 .../fnexecution/control/SdkHarnessClientTest.java  | 235 ++---
 4 files changed, 405 insertions(+), 82 deletions(-)

-- 
To stop receiving notification emails like this one, please contact
lc...@apache.org.


[beam] branch master updated (5973c4d -> 397688a)

2018-03-28 Thread lcwik
This is an automated email from the ASF dual-hosted git repository.

lcwik pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git.


from 5973c4d  [BEAM-3326] Address additional comments from PR/4963.
 add cec7c03  [BEAM-3104] Set up state interfaces, wire into SDK harness 
client.
 new 397688a  [BEAM-3104] Set up state interfaces, wire into SDK harness 
client.

The 1 revision listed above as "new" is entirely new to this
repository and will be described in a separate email.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 .../fnexecution/control/SdkHarnessClient.java  | 208 ++
 .../fnexecution/state/GrpcStateService.java|  25 ++-
 .../runners/fnexecution/state/StateDelegator.java  |  19 +-
 .../fnexecution/control/SdkHarnessClientTest.java  | 235 ++---
 4 files changed, 405 insertions(+), 82 deletions(-)



Build failed in Jenkins: beam_PerformanceTests_HadoopInputFormat #75

2018-03-28 Thread Apache Jenkins Server
See 


Changes:

[lcwik] [BEAM-3326] Address additional comments from PR/4963.

--
[...truncated 743.99 KB...]

Try downloading the file manually from the project website.

Then, install it using the command: 
mvn install:install-file -DgroupId=cascading -DartifactId=cascading-hadoop 
-Dversion=2.6.3 -Dpackaging=jar -Dfile=/path/to/file

Alternatively, if you host your own repository you can deploy the file there: 
mvn deploy:deploy-file -DgroupId=cascading -DartifactId=cascading-hadoop 
-Dversion=2.6.3 -Dpackaging=jar -Dfile=/path/to/file -Durl=[url] 
-DrepositoryId=[id]

Path to dependency: 
1) 
org.apache.beam:beam-sdks-java-io-hadoop-input-format:jar:2.5.0-SNAPSHOT
2) org.elasticsearch:elasticsearch-hadoop:jar:5.0.0
3) cascading:cascading-hadoop:jar:2.6.3


  cascading:cascading-hadoop:jar:2.6.3

from the specified remote repositories:
  Nexus (http://repository.apache.org/snapshots, releases=false, 
snapshots=true),
  central (https://repo.maven.apache.org/maven2, releases=true, snapshots=false)

Downloading from central: 
https://repo.maven.apache.org/maven2/cascading/cascading-local/2.6.3/cascading-local-2.6.3.jar
[WARNING] Could not find artifact cascading:cascading-local:jar:2.6.3 in 
central (https://repo.maven.apache.org/maven2)

Try downloading the file manually from the project website.

Then, install it using the command: 
mvn install:install-file -DgroupId=cascading -DartifactId=cascading-local 
-Dversion=2.6.3 -Dpackaging=jar -Dfile=/path/to/file

Alternatively, if you host your own repository you can deploy the file there: 
mvn deploy:deploy-file -DgroupId=cascading -DartifactId=cascading-local 
-Dversion=2.6.3 -Dpackaging=jar -Dfile=/path/to/file -Durl=[url] 
-DrepositoryId=[id]

Path to dependency: 
1) 
org.apache.beam:beam-sdks-java-io-hadoop-input-format:jar:2.5.0-SNAPSHOT
2) org.elasticsearch:elasticsearch-hadoop:jar:5.0.0
3) cascading:cascading-local:jar:2.6.3


  cascading:cascading-local:jar:2.6.3

from the specified remote repositories:
  Nexus (http://repository.apache.org/snapshots, releases=false, 
snapshots=true),
  central (https://repo.maven.apache.org/maven2, releases=true, snapshots=false)

[INFO] Adding ignore: module-info
[INFO] 
[INFO] --- maven-enforcer-plugin:3.0.0-M1:enforce (enforce-banned-dependencies) 
@ beam-sdks-java-io-hadoop-input-format ---
[INFO] 
[INFO] --- maven-remote-resources-plugin:1.5:process (process-resource-bundles) 
@ beam-sdks-java-io-hadoop-input-format ---
[INFO] 
[INFO] --- maven-resources-plugin:3.0.2:resources (default-resources) @ 
beam-sdks-java-io-hadoop-input-format ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory 

[INFO] Copying 3 resources
[INFO] 
[INFO] --- maven-compiler-plugin:3.7.0:compile (default-compile) @ 
beam-sdks-java-io-hadoop-input-format ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 2 source files to 

[INFO] 
:
 

 uses unchecked or unsafe operations.
[INFO] 
:
 Recompile with -Xlint:unchecked for details.
[INFO] 
[INFO] --- maven-resources-plugin:3.0.2:testResources (default-testResources) @ 
beam-sdks-java-io-hadoop-input-format ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO] Copying 3 resources
[INFO] 
[INFO] --- maven-compiler-plugin:3.7.0:testCompile (default-testCompile) @ 
beam-sdks-java-io-hadoop-input-format ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 13 source files to 

[INFO] 
:
 

[jira] [Work logged] (BEAM-3104) Implement a BeamFnState Service

2018-03-28 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3104?focusedWorklogId=85451=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-85451
 ]

ASF GitHub Bot logged work on BEAM-3104:


Author: ASF GitHub Bot
Created on: 29/Mar/18 00:41
Start Date: 29/Mar/18 00:41
Worklog Time Spent: 10m 
  Work Description: lukecwik commented on a change in pull request #4973: 
[BEAM-3104] Set up state interfaces, wire into SDK harness client.
URL: https://github.com/apache/beam/pull/4973#discussion_r177928337
 
 

 ##
 File path: 
runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/control/SdkHarnessClient.java
 ##
 @@ -141,7 +171,11 @@ private BundleProcessor(
   fnApiDataService.send(
   LogicalEndpoint.of(bundleId, remoteInput.getTarget()), 
remoteInput.getCoder());
 
-  return new ActiveBundle(bundleId, specificResponse, dataReceiver, 
outputClients);
+  return new ActiveBundle(bundleId,
 
 Review comment:
   Not that I'm aware of. Fixed up the few nits that you recommended and any 
others that I could see.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 85451)
Time Spent: 1h  (was: 50m)

> Implement a BeamFnState Service
> ---
>
> Key: BEAM-3104
> URL: https://issues.apache.org/jira/browse/BEAM-3104
> Project: Beam
>  Issue Type: Sub-task
>  Components: runner-core
>Reporter: Thomas Groh
>Priority: Major
>  Labels: portability
>  Time Spent: 1h
>  Remaining Estimate: 0h
>
> This needs to use java methods to handle State Requests, and convert the java 
> response into an appropriate State API response.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Work logged] (BEAM-3104) Implement a BeamFnState Service

2018-03-28 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3104?focusedWorklogId=85452=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-85452
 ]

ASF GitHub Bot logged work on BEAM-3104:


Author: ASF GitHub Bot
Created on: 29/Mar/18 00:41
Start Date: 29/Mar/18 00:41
Worklog Time Spent: 10m 
  Work Description: lukecwik commented on a change in pull request #4973: 
[BEAM-3104] Set up state interfaces, wire into SDK harness client.
URL: https://github.com/apache/beam/pull/4973#discussion_r177928358
 
 

 ##
 File path: 
runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/control/SdkHarnessClient.java
 ##
 @@ -273,58 +324,120 @@ public SdkHarnessClient withIdGenerator(IdGenerator 
idGenerator) {
 return new SdkHarnessClient(fnApiControlClient, fnApiDataService, 
idGenerator);
   }
 
+  /**
+   * Provides {@link BundleProcessor} that is capable of processing bundles not
+   * containing any state accesses such as:
+   * 
+   *   Side inputs
+   *   User state
+   *   Remote references
+   * 
+   *
+   * Note that bundle processors are cached based upon the
+   * {@link ProcessBundleDescriptor#getId() process bundle descriptor id}.
+   * A previously created instance may be returned.
+   */
   public  BundleProcessor getProcessor(
   BeamFnApi.ProcessBundleDescriptor descriptor,
   RemoteInputDestination remoteInputDesination) {
-try {
-  return clientProcessors.get(
-  descriptor.getId(),
-  () ->
-  (BundleProcessor)
-  register(
-  Collections.singletonMap(
-  descriptor, (RemoteInputDestination) 
remoteInputDesination))
-  .get(descriptor.getId()));
-} catch (ExecutionException e) {
-  throw new RuntimeException(e);
+checkState(!descriptor.hasStateApiServiceDescriptor(),
+"The %s cannot support a %s containing a state %s.",
+BundleProcessor.class.getSimpleName(),
+BeamFnApi.ProcessBundleDescriptor.class.getSimpleName(),
+Endpoints.ApiServiceDescriptor.class.getSimpleName());
+return getProcessor(descriptor, remoteInputDesination, 
NoOpStateDelegator.INSTANCE);
+  }
+
+  /**
+   * A {@link StateDelegator} that issues zero state requests to any provided
+   * {@link StateRequestHandler state handlers}.
+   */
+  private static class NoOpStateDelegator implements StateDelegator {
+private static final NoOpStateDelegator INSTANCE = new 
NoOpStateDelegator();
+@Override
+public Registration registerForProcessBundleInstructionId(String 
processBundleInstructionId,
+StateRequestHandler handler) {
+  return Registration.INSTANCE;
+}
+
+/**
+ * The corresponding registration for a {@link NoOpStateDelegator} that 
does nothing.
+ */
+private static class Registration implements StateDelegator.Registration {
+  private static final Registration INSTANCE = new Registration();
+
+  @Override
+  public void deregister() {
+  }
+
+  @Override
+  public void abort() {
+  }
 }
   }
 
   /**
-   * Registers a {@link BeamFnApi.ProcessBundleDescriptor} for future 
processing.
+   * Provides {@link BundleProcessor} that is capable of processing bundles 
containing
+   * state accesses such as:
+   * 
+   *   Side inputs
+   *   User state
+   *   Remote references
+   * 
*
-   * A client may block on the result future, but may also proceed without 
blocking.
+   * Note that bundle processors are cached based upon the
+   * {@link ProcessBundleDescriptor#getId() process bundle descriptor id}.
+   * A previously created instance may be returned.
*/
-  public Map register(
-  Map>
-  processBundleDescriptors) {
+  public  BundleProcessor getProcessor(
+  BeamFnApi.ProcessBundleDescriptor descriptor,
+  RemoteInputDestination remoteInputDesination,
+  StateDelegator stateDelegator) {
+BundleProcessor bundleProcessor = clientProcessors.computeIfAbsent(
 
 Review comment:
   Done and one other place.




Issue Time Tracking
---

Worklog Id: (was: 85452)
Time Spent: 1h 10m  (was: 1h)

> Implement a BeamFnState Service
> ---
>
> Key: BEAM-3104
> URL: https://issues.apache.org/jira/browse/BEAM-3104
> Project: Beam
>  Issue Type: Sub-task
>  
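The `NoOpStateDelegator` in the hunk above is a classic null object: registration always succeeds, but no state request is ever routed to the handler, so stateless bundle processing needs no special-casing at the call sites. A minimal self-contained sketch of that shape, using illustrative names rather than the real Beam types:

```java
import java.util.function.Function;

// Null-object sketch of a state delegator. All names are illustrative
// stand-ins, not the actual Beam runner classes.
public class NoOpDelegatorDemo {

    /** A delegator hands state requests from a running bundle to a registered handler. */
    interface StateDelegator {
        Registration registerForProcessBundleInstructionId(
            String processBundleInstructionId, Function<String, String> handler);

        interface Registration {
            void deregister();
            void abort();
        }
    }

    /** Accepts registrations but never issues a single state request to them. */
    static final class NoOpStateDelegator implements StateDelegator {
        static final NoOpStateDelegator INSTANCE = new NoOpStateDelegator();
        private NoOpStateDelegator() {}

        @Override
        public Registration registerForProcessBundleInstructionId(
                String processBundleInstructionId, Function<String, String> handler) {
            // The handler is ignored: bundles without state access never invoke it.
            return NoOpRegistration.INSTANCE;
        }

        /** The matching registration: both lifecycle calls do nothing. */
        static final class NoOpRegistration implements StateDelegator.Registration {
            static final NoOpRegistration INSTANCE = new NoOpRegistration();
            @Override public void deregister() {}  // nothing was registered
            @Override public void abort() {}       // nothing to abort
        }
    }
}
```

Sharing one stateless singleton keeps the stateful and stateless code paths identical from the caller's point of view, which is the point of the null-object choice in the patch.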

Jenkins build is back to stable : beam_PostCommit_Java_ValidatesRunner_Spark #4523

2018-03-28 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PerformanceTests_Compressed_TextIOIT #308

2018-03-28 Thread Apache Jenkins Server
See 


Changes:

[lcwik] [BEAM-3326] Address additional comments from PR/4963.

--
[...truncated 707.98 KB...]
[INFO] Excluding commons-codec:commons-codec:jar:1.3 from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client-jackson2:jar:1.22.0 
from the shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-dataflow:jar:v1b3-rev221-1.22.0 from the 
shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-clouddebugger:jar:v2-rev8-1.22.0 from the 
shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-storage:jar:v1-rev71-1.22.0 from the shaded 
jar.
[INFO] Excluding com.google.auth:google-auth-library-credentials:jar:0.7.1 from 
the shaded jar.
[INFO] Excluding com.google.auth:google-auth-library-oauth2-http:jar:0.7.1 from 
the shaded jar.
[INFO] Excluding com.google.cloud.bigdataoss:util:jar:1.4.5 from the shaded jar.
[INFO] Excluding com.google.api-client:google-api-client-java6:jar:1.22.0 from 
the shaded jar.
[INFO] Excluding com.google.api-client:google-api-client-jackson2:jar:1.22.0 
from the shaded jar.
[INFO] Excluding com.google.oauth-client:google-oauth-client-java6:jar:1.22.0 
from the shaded jar.
[INFO] Excluding 
org.apache.beam:beam-sdks-java-io-google-cloud-platform:jar:2.5.0-SNAPSHOT from 
the shaded jar.
[INFO] Excluding 
org.apache.beam:beam-sdks-java-extensions-protobuf:jar:2.5.0-SNAPSHOT from the 
shaded jar.
[INFO] Excluding io.grpc:grpc-core:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.errorprone:error_prone_annotations:jar:2.0.15 from 
the shaded jar.
[INFO] Excluding io.grpc:grpc-context:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.instrumentation:instrumentation-api:jar:0.3.0 from 
the shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-bigquery:jar:v2-rev374-1.22.0 from the 
shaded jar.
[INFO] Excluding com.google.api:gax-grpc:jar:0.20.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-protobuf:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.api:api-common:jar:1.0.0-rc2 from the shaded jar.
[INFO] Excluding com.google.auto.value:auto-value:jar:1.5.3 from the shaded jar.
[INFO] Excluding com.google.api:gax:jar:1.3.1 from the shaded jar.
[INFO] Excluding org.threeten:threetenbp:jar:1.3.3 from the shaded jar.
[INFO] Excluding com.google.cloud:google-cloud-core-grpc:jar:1.2.0 from the 
shaded jar.
[INFO] Excluding com.google.protobuf:protobuf-java-util:jar:3.2.0 from the 
shaded jar.
[INFO] Excluding com.google.code.gson:gson:jar:2.7 from the shaded jar.
[INFO] Excluding com.google.apis:google-api-services-pubsub:jar:v1-rev10-1.22.0 
from the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-cloud-pubsub-v1:jar:0.1.18 
from the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-cloud-pubsub-v1:jar:0.1.18 
from the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-iam-v1:jar:0.1.18 from the 
shaded jar.
[INFO] Excluding com.google.cloud.datastore:datastore-v1-proto-client:jar:1.4.0 
from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client-protobuf:jar:1.22.0 
from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client-jackson:jar:1.22.0 
from the shaded jar.
[INFO] Excluding com.google.cloud.datastore:datastore-v1-protos:jar:1.3.0 from 
the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-common-protos:jar:0.1.9 from 
the shaded jar.
[INFO] Excluding io.grpc:grpc-auth:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-netty:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.netty:netty-codec-http2:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-codec-http:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-handler-proxy:jar:4.1.8.Final from the shaded 
jar.
[INFO] Excluding io.netty:netty-codec-socks:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-handler:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-buffer:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-common:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-transport:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-resolver:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-codec:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.grpc:grpc-stub:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-all:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-okhttp:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.squareup.okhttp:okhttp:jar:2.5.0 from the shaded jar.
[INFO] Excluding com.squareup.okio:okio:jar:1.6.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-protobuf-lite:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-protobuf-nano:jar:1.2.0 from the shaded jar.
[INFO] Excluding 

Build failed in Jenkins: beam_PerformanceTests_AvroIOIT #310

2018-03-28 Thread Apache Jenkins Server
See 


Changes:

[lcwik] [BEAM-3326] Address additional comments from PR/4963.

--
[...truncated 33.77 KB...]
[INFO] Excluding commons-codec:commons-codec:jar:1.3 from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client-jackson2:jar:1.22.0 
from the shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-dataflow:jar:v1b3-rev221-1.22.0 from the 
shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-clouddebugger:jar:v2-rev8-1.22.0 from the 
shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-storage:jar:v1-rev71-1.22.0 from the shaded 
jar.
[INFO] Excluding com.google.auth:google-auth-library-credentials:jar:0.7.1 from 
the shaded jar.
[INFO] Excluding com.google.auth:google-auth-library-oauth2-http:jar:0.7.1 from 
the shaded jar.
[INFO] Excluding com.google.cloud.bigdataoss:util:jar:1.4.5 from the shaded jar.
[INFO] Excluding com.google.api-client:google-api-client-java6:jar:1.22.0 from 
the shaded jar.
[INFO] Excluding com.google.api-client:google-api-client-jackson2:jar:1.22.0 
from the shaded jar.
[INFO] Excluding com.google.oauth-client:google-oauth-client-java6:jar:1.22.0 
from the shaded jar.
[INFO] Excluding 
org.apache.beam:beam-sdks-java-io-google-cloud-platform:jar:2.5.0-SNAPSHOT from 
the shaded jar.
[INFO] Excluding 
org.apache.beam:beam-sdks-java-extensions-protobuf:jar:2.5.0-SNAPSHOT from the 
shaded jar.
[INFO] Excluding io.grpc:grpc-core:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.errorprone:error_prone_annotations:jar:2.0.15 from 
the shaded jar.
[INFO] Excluding io.grpc:grpc-context:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.instrumentation:instrumentation-api:jar:0.3.0 from 
the shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-bigquery:jar:v2-rev374-1.22.0 from the 
shaded jar.
[INFO] Excluding com.google.api:gax-grpc:jar:0.20.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-protobuf:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.api:api-common:jar:1.0.0-rc2 from the shaded jar.
[INFO] Excluding com.google.auto.value:auto-value:jar:1.5.3 from the shaded jar.
[INFO] Excluding com.google.api:gax:jar:1.3.1 from the shaded jar.
[INFO] Excluding org.threeten:threetenbp:jar:1.3.3 from the shaded jar.
[INFO] Excluding com.google.cloud:google-cloud-core-grpc:jar:1.2.0 from the 
shaded jar.
[INFO] Excluding com.google.protobuf:protobuf-java-util:jar:3.2.0 from the 
shaded jar.
[INFO] Excluding com.google.code.gson:gson:jar:2.7 from the shaded jar.
[INFO] Excluding com.google.apis:google-api-services-pubsub:jar:v1-rev10-1.22.0 
from the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-cloud-pubsub-v1:jar:0.1.18 
from the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-cloud-pubsub-v1:jar:0.1.18 
from the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-iam-v1:jar:0.1.18 from the 
shaded jar.
[INFO] Excluding com.google.cloud.datastore:datastore-v1-proto-client:jar:1.4.0 
from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client-protobuf:jar:1.22.0 
from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client-jackson:jar:1.22.0 
from the shaded jar.
[INFO] Excluding com.google.cloud.datastore:datastore-v1-protos:jar:1.3.0 from 
the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-common-protos:jar:0.1.9 from 
the shaded jar.
[INFO] Excluding io.grpc:grpc-auth:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-netty:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.netty:netty-codec-http2:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-codec-http:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-handler-proxy:jar:4.1.8.Final from the shaded 
jar.
[INFO] Excluding io.netty:netty-codec-socks:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-handler:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-buffer:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-common:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-transport:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-resolver:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-codec:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.grpc:grpc-stub:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-all:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-okhttp:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.squareup.okhttp:okhttp:jar:2.5.0 from the shaded jar.
[INFO] Excluding com.squareup.okio:okio:jar:1.6.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-protobuf-lite:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-protobuf-nano:jar:1.2.0 from the shaded jar.
[INFO] Excluding 

Build failed in Jenkins: beam_PerformanceTests_Python #1079

2018-03-28 Thread Apache Jenkins Server
See 


Changes:

[lcwik] [BEAM-3326] Address additional comments from PR/4963.

--
[...truncated 60.10 KB...]
[INFO] 
[INFO] --- maven-resources-plugin:3.0.2:copy-resources (copy-go-cmd-source) @ 
beam-sdks-go ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 6 resources
[INFO] 
[INFO] --- maven-assembly-plugin:3.1.0:single (export-go-pkg-sources) @ 
beam-sdks-go ---
[INFO] Reading assembly descriptor: descriptor.xml
[INFO] Building zip: 

[INFO] 
[INFO] --- maven-remote-resources-plugin:1.5:process (process-resource-bundles) 
@ beam-sdks-go ---
[INFO] 
[INFO] --- mvn-golang-wrapper:2.1.6:get (go-get-imports) @ beam-sdks-go ---
[INFO] Prepared command line : bin/go get google.golang.org/grpc 
golang.org/x/oauth2/google google.golang.org/api/storage/v1 
github.com/spf13/cobra cloud.google.com/go/bigquery 
google.golang.org/api/googleapi google.golang.org/api/dataflow/v1b3
[INFO] 
[INFO] --- mvn-golang-wrapper:2.1.6:build (go-build) @ beam-sdks-go ---
[INFO] Prepared command line : bin/go build -buildmode=default -o 

 github.com/apache/beam/sdks/go/cmd/beamctl
[INFO] The Result file has been successfuly created : 

[INFO] 
[INFO] --- mvn-golang-wrapper:2.1.6:build (go-build-linux-amd64) @ beam-sdks-go 
---
[INFO] Prepared command line : bin/go build -buildmode=default -o 

 github.com/apache/beam/sdks/go/cmd/beamctl
[INFO] The Result file has been successfuly created : 

[INFO] 
[INFO] --- maven-checkstyle-plugin:3.0.0:check (default) @ beam-sdks-go ---
[INFO] 
[INFO] --- mvn-golang-wrapper:2.1.6:test (go-test) @ beam-sdks-go ---
[INFO] Prepared command line : bin/go test ./...
[INFO] 
[INFO] -Exec.Out-
[INFO] ?github.com/apache/beam/sdks/go/cmd/beamctl  [no test files]
[INFO] ?github.com/apache/beam/sdks/go/cmd/beamctl/cmd  [no test files]
[INFO] ?github.com/apache/beam/sdks/go/cmd/specialize   [no test files]
[INFO] ?github.com/apache/beam/sdks/go/cmd/symtab   [no test files]
[INFO] ok   github.com/apache/beam/sdks/go/pkg/beam 0.032s
[INFO] ok   github.com/apache/beam/sdks/go/pkg/beam/artifact0.052s
[INFO] 
[ERROR] 
[ERROR] -Exec.Err-
[ERROR] # github.com/apache/beam/sdks/go/pkg/beam/util/gcsx
[ERROR] github.com/apache/beam/sdks/go/pkg/beam/util/gcsx/gcs.go:46:37: 
undefined: option.WithoutAuthentication
[ERROR] 
[INFO] 
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Beam :: Parent .. SUCCESS [  3.094 s]
[INFO] Apache Beam :: SDKs :: Java :: Build Tools . SUCCESS [  2.865 s]
[INFO] Apache Beam :: Model ... SUCCESS [  0.046 s]
[INFO] Apache Beam :: Model :: Pipeline ... SUCCESS [  9.040 s]
[INFO] Apache Beam :: Model :: Job Management . SUCCESS [  3.191 s]
[INFO] Apache Beam :: Model :: Fn Execution ... SUCCESS [  3.561 s]
[INFO] Apache Beam :: SDKs  SUCCESS [  0.151 s]
[INFO] Apache Beam :: SDKs :: Go .. FAILURE [ 28.966 s]
[INFO] Apache Beam :: SDKs :: Go :: Container . SKIPPED
[INFO] Apache Beam :: SDKs :: Java  SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: Core  SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: Fn Execution  SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: Extensions .. SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: Extensions :: Google Cloud Platform Core 
SKIPPED
[INFO] Apache Beam :: Runners . SKIPPED
[INFO] Apache Beam :: Runners :: Core Construction Java ... SKIPPED
[INFO] Apache Beam :: Runners :: Core Java  SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: Harness . SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: Container ... SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: IO .. SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: IO :: Amazon Web Services SKIPPED
[INFO] Apache Beam :: Runners :: Local Java Core .. SKIPPED
[INFO] Apache Beam :: Runners :: Direct Java .. SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: IO :: AMQP .. SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: IO :: 

Build failed in Jenkins: beam_PerformanceTests_JDBC #386

2018-03-28 Thread Apache Jenkins Server
See 


Changes:

[lcwik] [BEAM-3326] Address additional comments from PR/4963.

--
[...truncated 47.72 KB...]
[INFO] Excluding io.opencensus:opencensus-api:jar:0.7.0 from the shaded jar.
[INFO] Excluding io.dropwizard.metrics:metrics-core:jar:3.1.2 from the shaded 
jar.
[INFO] Excluding com.google.protobuf:protobuf-java:jar:3.2.0 from the shaded 
jar.
[INFO] Excluding io.netty:netty-tcnative-boringssl-static:jar:1.1.33.Fork26 
from the shaded jar.
[INFO] Excluding 
com.google.api.grpc:proto-google-cloud-spanner-admin-database-v1:jar:0.1.9 from 
the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-common-protos:jar:0.1.9 from 
the shaded jar.
[INFO] Excluding com.google.api-client:google-api-client:jar:1.22.0 from the 
shaded jar.
[INFO] Excluding com.google.oauth-client:google-oauth-client:jar:1.22.0 from 
the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client:jar:1.22.0 from the 
shaded jar.
[INFO] Excluding org.apache.httpcomponents:httpclient:jar:4.0.1 from the shaded 
jar.
[INFO] Excluding org.apache.httpcomponents:httpcore:jar:4.0.1 from the shaded 
jar.
[INFO] Excluding commons-codec:commons-codec:jar:1.3 from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client-jackson2:jar:1.22.0 
from the shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-dataflow:jar:v1b3-rev221-1.22.0 from the 
shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-clouddebugger:jar:v2-rev8-1.22.0 from the 
shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-storage:jar:v1-rev71-1.22.0 from the shaded 
jar.
[INFO] Excluding com.google.auth:google-auth-library-credentials:jar:0.7.1 from 
the shaded jar.
[INFO] Excluding com.google.auth:google-auth-library-oauth2-http:jar:0.7.1 from 
the shaded jar.
[INFO] Excluding com.google.cloud.bigdataoss:util:jar:1.4.5 from the shaded jar.
[INFO] Excluding com.google.api-client:google-api-client-java6:jar:1.22.0 from 
the shaded jar.
[INFO] Excluding com.google.api-client:google-api-client-jackson2:jar:1.22.0 
from the shaded jar.
[INFO] Excluding com.google.oauth-client:google-oauth-client-java6:jar:1.22.0 
from the shaded jar.
[INFO] Replacing original artifact with shaded artifact.
[INFO] Replacing 

 with 

[INFO] Replacing original test artifact with shaded test artifact.
[INFO] Replacing 

 with 

[INFO] Dependency-reduced POM written at: 

[INFO] 
[INFO] --- maven-failsafe-plugin:2.21.0:integration-test (default) @ 
beam-sdks-java-io-jdbc ---
[INFO] Failsafe report directory: 

[INFO] parallel='all', perCoreThreadCount=true, threadCount=4, 
useUnlimitedThreads=false, threadCountSuites=0, threadCountClasses=0, 
threadCountMethods=0, parallelOptimized=true
[INFO] 
[INFO] ---
[INFO]  T E S T S
[INFO] ---
[INFO] Running org.apache.beam.sdk.io.jdbc.JdbcIOIT
[ERROR] Tests run: 2, Failures: 0, Errors: 2, Skipped: 0, Time elapsed: 0 s <<< 
FAILURE! - in org.apache.beam.sdk.io.jdbc.JdbcIOIT
[ERROR] org.apache.beam.sdk.io.jdbc.JdbcIOIT  Time elapsed: 0 s  <<< ERROR!
org.postgresql.util.PSQLException: The connection attempt failed.
at 
org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:272)
at 
org.postgresql.core.ConnectionFactory.openConnection(ConnectionFactory.java:51)
at org.postgresql.jdbc.PgConnection.(PgConnection.java:215)
at org.postgresql.Driver.makeConnection(Driver.java:404)
at org.postgresql.Driver.connect(Driver.java:272)
at java.sql.DriverManager.getConnection(DriverManager.java:664)
at java.sql.DriverManager.getConnection(DriverManager.java:247)
at 
org.postgresql.ds.common.BaseDataSource.getConnection(BaseDataSource.java:86)
at 
org.postgresql.ds.common.BaseDataSource.getConnection(BaseDataSource.java:71)
at 

[jira] [Work logged] (BEAM-3966) Move core utilities into a new top-level module

2018-03-28 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3966?focusedWorklogId=85448=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-85448
 ]

ASF GitHub Bot logged work on BEAM-3966:


Author: ASF GitHub Bot
Created on: 29/Mar/18 00:07
Start Date: 29/Mar/18 00:07
Worklog Time Spent: 10m 
  Work Description: kennknowles commented on issue #4974: [BEAM-3966] Move 
`sdks/java/fn-execution` to `util/java/fn-execution`
URL: https://github.com/apache/beam/pull/4974#issuecomment-377078542
 
 
   LGTM.
   
   True that I generally dislike `util` as a name because it is extremely 
unspecific / meaningless and usually ends up containing ~everything. The Java 
libraries are a good example. If you went with `everything` it would make it 
clear that the prefix is not useful. I often go with `misc` to indicate that 
things that are _not_ `misc` should gradually move out once they have a real 
theme.
   
   In this case, having a bucket for "everything not SDK or runner" seems 
useful, so I'm fine with it.




Issue Time Tracking
---

Worklog Id: (was: 85448)
Time Spent: 1h  (was: 50m)

> Move core utilities into a new top-level module
> ---
>
> Key: BEAM-3966
> URL: https://issues.apache.org/jira/browse/BEAM-3966
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-core
>Reporter: Ben Sidhom
>Assignee: Kenneth Knowles
>Priority: Minor
>  Time Spent: 1h
>  Remaining Estimate: 0h
>
> As part of a longer-term dependency cleanup, fn-execution and similar 
> utilities should be moved into a new top-level module (util?) that can be 
> shared among runner and/or SDK code while clearly delineating the boundary 
> between runner and SDK.





[jira] [Work logged] (BEAM-3437) Support schema in PCollections

2018-03-28 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3437?focusedWorklogId=85445=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-85445
 ]

ASF GitHub Bot logged work on BEAM-3437:


Author: ASF GitHub Bot
Created on: 28/Mar/18 23:47
Start Date: 28/Mar/18 23:47
Worklog Time Spent: 10m 
  Work Description: reuvenlax commented on issue #4964: [BEAM-3437] 
Introduce Schema class, and use it in BeamSQL
URL: https://github.com/apache/beam/pull/4964#issuecomment-377074839
 
 
   FYI the test breakage is a known issue 
(https://issues.apache.org/jira/browse/BEAM-3964) and is unrelated to this PR.




Issue Time Tracking
---

Worklog Id: (was: 85445)
Time Spent: 40m  (was: 0.5h)

> Support schema in PCollections
> --
>
> Key: BEAM-3437
> URL: https://issues.apache.org/jira/browse/BEAM-3437
> Project: Beam
>  Issue Type: Wish
>  Components: beam-model
>Reporter: Jean-Baptiste Onofré
>Assignee: Jean-Baptiste Onofré
>Priority: Major
>  Time Spent: 40m
>  Remaining Estimate: 0h
>
> As discussed with some people on the team, it would be great to add schema 
> support to {{PCollections}}. It would allow us:
> 1. To expect some data type in {{PTransforms}}
> 2. To improve some runners with additional features (I'm thinking of the Spark 
> runner with data frames, for instance).
> A technical draft document has been created: 
> https://docs.google.com/document/d/1tnG2DPHZYbsomvihIpXruUmQ12pHGK0QIvXS1FOTgRc/edit?disco=BhykQIs=5a203b46=comment_email_document
> I also started a PoC on a branch; I will update this Jira with a "discussion" 
> PR.
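As a rough illustration of the wish (an entirely hypothetical sketch, not any proposed Beam API; the class and field names below are invented), "schema support" means records carry a shared, named field layout that transforms can inspect up front instead of failing at element-decode time:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical sketch only: models a schema as an ordered map of field
// names to types, so a transform can check expectations before running.
// None of these names are Beam API.
public class SchemaSketch {
    static final Map<String, Class<?>> USER_SCHEMA = new LinkedHashMap<>();
    static {
        USER_SCHEMA.put("id", Long.class);
        USER_SCHEMA.put("name", String.class);
    }

    // A transform could assert the field types it expects up front.
    static boolean expects(Map<String, Class<?>> schema, String field, Class<?> type) {
        return type.equals(schema.get(field));
    }

    public static void main(String[] args) {
        System.out.println(expects(USER_SCHEMA, "id", Long.class));   // true
        System.out.println(expects(USER_SCHEMA, "name", Long.class)); // false
    }
}
```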



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Work logged] (BEAM-3104) Implement a BeamFnState Service

2018-03-28 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3104?focusedWorklogId=85442=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-85442
 ]

ASF GitHub Bot logged work on BEAM-3104:


Author: ASF GitHub Bot
Created on: 28/Mar/18 23:24
Start Date: 28/Mar/18 23:24
Worklog Time Spent: 10m 
  Work Description: axelmagn commented on a change in pull request #4973: 
[BEAM-3104] Set up state interfaces, wire into SDK harness client.
URL: https://github.com/apache/beam/pull/4973#discussion_r177918185
 
 

 ##
 File path: 
runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/control/SdkHarnessClient.java
 ##
 @@ -141,7 +171,11 @@ private BundleProcessor(
   fnApiDataService.send(
   LogicalEndpoint.of(bundleId, remoteInput.getTarget()), 
remoteInput.getCoder());
 
-  return new ActiveBundle(bundleId, specificResponse, dataReceiver, 
outputClients);
+  return new ActiveBundle(bundleId,
 
 Review comment:
   there is some inconsistency here as to whether or not to place a line break 
before the first parameter of a long invocation (see lines 343, 396, 434).  I 
know I have my own preference (in favor of the line break), but I don't know if 
we have an explicit style convention around it.  Are you aware of any?
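For concreteness, the two wrapping styles in question look like this (hypothetical method and argument names; the resulting call is identical either way, only the break placement differs):

```java
// Illustration of the two call-wrapping styles under discussion.
// The method and its arguments are invented for the example.
public class LineBreakStyles {
    static String describe(String bundleId, String response, String receiver) {
        return bundleId + "/" + response + "/" + receiver;
    }

    public static void main(String[] args) {
        // Style 1: line break before the first argument.
        String a = describe(
            "bundle-1", "resp", "recv");

        // Style 2: first argument on the same line as the invocation.
        String b = describe("bundle-1",
            "resp",
            "recv");

        System.out.println(a.equals(b)); // true: purely a formatting choice
    }
}
```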


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 85442)
Time Spent: 50m  (was: 40m)

> Implement a BeamFnState Service
> ---
>
> Key: BEAM-3104
> URL: https://issues.apache.org/jira/browse/BEAM-3104
> Project: Beam
>  Issue Type: Sub-task
>  Components: runner-core
>Reporter: Thomas Groh
>Priority: Major
>  Labels: portability
>  Time Spent: 50m
>  Remaining Estimate: 0h
>
> This needs to use Java methods to handle State Requests and convert the Java 
> response into an appropriate State API response.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Work logged] (BEAM-3104) Implement a BeamFnState Service

2018-03-28 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3104?focusedWorklogId=85440=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-85440
 ]

ASF GitHub Bot logged work on BEAM-3104:


Author: ASF GitHub Bot
Created on: 28/Mar/18 23:19
Start Date: 28/Mar/18 23:19
Worklog Time Spent: 10m 
  Work Description: axelmagn commented on a change in pull request #4973: 
[BEAM-3104] Set up state interfaces, wire into SDK harness client.
URL: https://github.com/apache/beam/pull/4973#discussion_r177917404
 
 

 ##
 File path: 
runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/control/SdkHarnessClient.java
 ##
 @@ -273,58 +324,120 @@ public SdkHarnessClient withIdGenerator(IdGenerator 
idGenerator) {
 return new SdkHarnessClient(fnApiControlClient, fnApiDataService, 
idGenerator);
   }
 
+  /**
+   * Provides {@link BundleProcessor} that is capable of processing bundles not
+   * containing any state accesses such as:
+   * 
+   *   Side inputs
+   *   User state
+   *   Remote references
+   * 
+   *
+   * Note that bundle processors are cached based upon the
+   * {@link ProcessBundleDescriptor#getId() process bundle descriptor id}.
+   * A previously created instance may be returned.
+   */
   public  BundleProcessor getProcessor(
   BeamFnApi.ProcessBundleDescriptor descriptor,
   RemoteInputDestination remoteInputDesination) {
-try {
-  return clientProcessors.get(
-  descriptor.getId(),
-  () ->
-  (BundleProcessor)
-  register(
-  Collections.singletonMap(
-  descriptor, (RemoteInputDestination) 
remoteInputDesination))
-  .get(descriptor.getId()));
-} catch (ExecutionException e) {
-  throw new RuntimeException(e);
+checkState(!descriptor.hasStateApiServiceDescriptor(),
+"The %s cannot support a %s containing a state %s.",
+BundleProcessor.class.getSimpleName(),
+BeamFnApi.ProcessBundleDescriptor.class.getSimpleName(),
+Endpoints.ApiServiceDescriptor.class.getSimpleName());
+return getProcessor(descriptor, remoteInputDesination, 
NoOpStateDelegator.INSTANCE);
+  }
+
+  /**
+   * A {@link StateDelegator} that issues zero state requests to any provided
+   * {@link StateRequestHandler state handlers}.
+   */
+  private static class NoOpStateDelegator implements StateDelegator {
+private static final NoOpStateDelegator INSTANCE = new 
NoOpStateDelegator();
+@Override
+public Registration registerForProcessBundleInstructionId(String 
processBundleInstructionId,
+StateRequestHandler handler) {
+  return Registration.INSTANCE;
+}
+
+/**
+ * The corresponding registration for a {@link NoOpStateDelegator} that 
does nothing.
+ */
+private static class Registration implements StateDelegator.Registration {
+  private static final Registration INSTANCE = new Registration();
+
+  @Override
+  public void deregister() {
+  }
+
+  @Override
+  public void abort() {
+  }
 }
   }
 
   /**
-   * Registers a {@link BeamFnApi.ProcessBundleDescriptor} for future 
processing.
+   * Provides {@link BundleProcessor} that is capable of processing bundles 
containing
+   * state accesses such as:
+   * 
+   *   Side inputs
+   *   User state
+   *   Remote references
+   * 
*
-   * A client may block on the result future, but may also proceed without 
blocking.
+   * Note that bundle processors are cached based upon the
+   * {@link ProcessBundleDescriptor#getId() process bundle descriptor id}.
+   * A previously created instance may be returned.
*/
-  public Map register(
-  Map>
-  processBundleDescriptors) {
+  public  BundleProcessor getProcessor(
+  BeamFnApi.ProcessBundleDescriptor descriptor,
+  RemoteInputDestination remoteInputDesination,
+  StateDelegator stateDelegator) {
+BundleProcessor bundleProcessor = clientProcessors.computeIfAbsent(
 
 Review comment:
Prefer breaking the line at the higher syntactic level; in this case, after 
the '='.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 85440)
Time Spent: 0.5h  (was: 20m)

> Implement a BeamFnState Service
> ---
>
> Key: BEAM-3104
> URL: https://issues.apache.org/jira/browse/BEAM-3104
> 

[jira] [Work logged] (BEAM-3104) Implement a BeamFnState Service

2018-03-28 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3104?focusedWorklogId=85441=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-85441
 ]

ASF GitHub Bot logged work on BEAM-3104:


Author: ASF GitHub Bot
Created on: 28/Mar/18 23:19
Start Date: 28/Mar/18 23:19
Worklog Time Spent: 10m 
  Work Description: axelmagn commented on a change in pull request #4973: 
[BEAM-3104] Set up state interfaces, wire into SDK harness client.
URL: https://github.com/apache/beam/pull/4973#discussion_r177917404
 
 

 ##
 File path: 
runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/control/SdkHarnessClient.java
 ##
 @@ -273,58 +324,120 @@ public SdkHarnessClient withIdGenerator(IdGenerator 
idGenerator) {
 return new SdkHarnessClient(fnApiControlClient, fnApiDataService, 
idGenerator);
   }
 
+  /**
+   * Provides {@link BundleProcessor} that is capable of processing bundles not
+   * containing any state accesses such as:
+   * 
+   *   Side inputs
+   *   User state
+   *   Remote references
+   * 
+   *
+   * Note that bundle processors are cached based upon the
+   * {@link ProcessBundleDescriptor#getId() process bundle descriptor id}.
+   * A previously created instance may be returned.
+   */
   public  BundleProcessor getProcessor(
   BeamFnApi.ProcessBundleDescriptor descriptor,
   RemoteInputDestination remoteInputDesination) {
-try {
-  return clientProcessors.get(
-  descriptor.getId(),
-  () ->
-  (BundleProcessor)
-  register(
-  Collections.singletonMap(
-  descriptor, (RemoteInputDestination) 
remoteInputDesination))
-  .get(descriptor.getId()));
-} catch (ExecutionException e) {
-  throw new RuntimeException(e);
+checkState(!descriptor.hasStateApiServiceDescriptor(),
+"The %s cannot support a %s containing a state %s.",
+BundleProcessor.class.getSimpleName(),
+BeamFnApi.ProcessBundleDescriptor.class.getSimpleName(),
+Endpoints.ApiServiceDescriptor.class.getSimpleName());
+return getProcessor(descriptor, remoteInputDesination, 
NoOpStateDelegator.INSTANCE);
+  }
+
+  /**
+   * A {@link StateDelegator} that issues zero state requests to any provided
+   * {@link StateRequestHandler state handlers}.
+   */
+  private static class NoOpStateDelegator implements StateDelegator {
+private static final NoOpStateDelegator INSTANCE = new 
NoOpStateDelegator();
+@Override
+public Registration registerForProcessBundleInstructionId(String 
processBundleInstructionId,
+StateRequestHandler handler) {
+  return Registration.INSTANCE;
+}
+
+/**
+ * The corresponding registration for a {@link NoOpStateDelegator} that 
does nothing.
+ */
+private static class Registration implements StateDelegator.Registration {
+  private static final Registration INSTANCE = new Registration();
+
+  @Override
+  public void deregister() {
+  }
+
+  @Override
+  public void abort() {
+  }
 }
   }
 
   /**
-   * Registers a {@link BeamFnApi.ProcessBundleDescriptor} for future 
processing.
+   * Provides {@link BundleProcessor} that is capable of processing bundles 
containing
+   * state accesses such as:
+   * 
+   *   Side inputs
+   *   User state
+   *   Remote references
+   * 
*
-   * A client may block on the result future, but may also proceed without 
blocking.
+   * Note that bundle processors are cached based upon the
+   * {@link ProcessBundleDescriptor#getId() process bundle descriptor id}.
+   * A previously created instance may be returned.
*/
-  public Map register(
-  Map>
-  processBundleDescriptors) {
+  public  BundleProcessor getProcessor(
+  BeamFnApi.ProcessBundleDescriptor descriptor,
+  RemoteInputDestination remoteInputDesination,
+  StateDelegator stateDelegator) {
+BundleProcessor bundleProcessor = clientProcessors.computeIfAbsent(
 
 Review comment:
Prefer breaking the line at the higher syntactic level; in this case, after 
the '='.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 85441)
Time Spent: 40m  (was: 0.5h)

> Implement a BeamFnState Service
> ---
>
> Key: BEAM-3104
> URL: https://issues.apache.org/jira/browse/BEAM-3104
> 

[jira] [Work logged] (BEAM-3326) Execute a Stage via the portability framework in the ReferenceRunner

2018-03-28 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3326?focusedWorklogId=85428=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-85428
 ]

ASF GitHub Bot logged work on BEAM-3326:


Author: ASF GitHub Bot
Created on: 28/Mar/18 22:53
Start Date: 28/Mar/18 22:53
Worklog Time Spent: 10m 
  Work Description: lukecwik closed pull request #4970: [BEAM-3326] Address 
additional comments from PR/4963.
URL: https://github.com/apache/beam/pull/4970
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git 
a/runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/control/SdkHarnessClient.java
 
b/runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/control/SdkHarnessClient.java
index de1aa57fe32..ebcb8b4d33f 100644
--- 
a/runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/control/SdkHarnessClient.java
+++ 
b/runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/control/SdkHarnessClient.java
@@ -180,7 +180,7 @@ public String getBundleId() {
 
 /**
  * Returns a {@link FnDataReceiver receiver} which consumes input elements 
forwarding them
- * to the SDK. When
+ * to the SDK.
  */
 public FnDataReceiver getInputReceiver() {
   return inputReceiver;
@@ -211,8 +211,9 @@ public void close() throws Exception {
 if (exception == null) {
   MoreFutures.get(response);
 } else {
-  // TODO: Handle aborting the bundle being processed.
-  throw new IllegalStateException("Processing bundle failed, TODO: 
abort bundle.");
+  // TODO: [BEAM-3962] Handle aborting the bundle being processed.
+  throw new IllegalStateException("Processing bundle failed, "
+  + "TODO: [BEAM-3962] abort bundle.");
 }
   } catch (Exception e) {
 if (exception == null) {
diff --git 
a/runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/data/FnDataService.java
 
b/runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/data/FnDataService.java
index 6c7839fc193..d86324eb366 100644
--- 
a/runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/data/FnDataService.java
+++ 
b/runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/data/FnDataService.java
@@ -38,9 +38,9 @@
* The provided coder is used to decode inbound elements. The decoded 
elements are passed to
* the provided receiver.
*
-   * Any failure during decoding or processing of the element will complete 
the returned future
-   * exceptionally. On successful termination of the stream, the returned 
future is completed
-   * successfully.
+   * Any failure during decoding or processing of the element will put the
+   * {@link InboundDataClient} into an error state such that
+   * {@link InboundDataClient#awaitCompletion()} will throw an exception.
*
* The provided receiver is not required to be thread safe.
*/


 


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 85428)
Time Spent: 5.5h  (was: 5h 20m)

> Execute a Stage via the portability framework in the ReferenceRunner
> 
>
> Key: BEAM-3326
> URL: https://issues.apache.org/jira/browse/BEAM-3326
> Project: Beam
>  Issue Type: New Feature
>  Components: runner-core
>Reporter: Thomas Groh
>Assignee: Thomas Groh
>Priority: Major
>  Labels: portability
>  Time Spent: 5.5h
>  Remaining Estimate: 0h
>
> This is the supertask for remote execution in the Universal Local Runner 
> (BEAM-2899).
> This executes a stage remotely via portability framework APIs.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[beam] 01/01: [BEAM-3326] Address additional comments from PR/4963.

2018-03-28 Thread lcwik
This is an automated email from the ASF dual-hosted git repository.

lcwik pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git

commit 5973c4dd10495e010e1fea687cb43b5d06736ed7
Merge: 8587839 202030b
Author: Lukasz Cwik 
AuthorDate: Wed Mar 28 15:53:40 2018 -0700

[BEAM-3326] Address additional comments from PR/4963.

 .../apache/beam/runners/fnexecution/control/SdkHarnessClient.java  | 7 ---
 .../org/apache/beam/runners/fnexecution/data/FnDataService.java| 6 +++---
 2 files changed, 7 insertions(+), 6 deletions(-)

-- 
To stop receiving notification emails like this one, please contact
lc...@apache.org.


[beam] branch master updated (8587839 -> 5973c4d)

2018-03-28 Thread lcwik
This is an automated email from the ASF dual-hosted git repository.

lcwik pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git.


from 8587839  Merge pull request #4851: [BEAM-3819] Add 
withRequestRecordsLimit() option to KinesisIO
 add 202030b  [BEAM-3326] Address additional comments from PR/4963.
 new 5973c4d  [BEAM-3326] Address additional comments from PR/4963.

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 .../apache/beam/runners/fnexecution/control/SdkHarnessClient.java  | 7 ---
 .../org/apache/beam/runners/fnexecution/data/FnDataService.java| 6 +++---
 2 files changed, 7 insertions(+), 6 deletions(-)

-- 
To stop receiving notification emails like this one, please contact
lc...@apache.org.


Jenkins build is still unstable: beam_PostCommit_Java_MavenInstall #6320

2018-03-28 Thread Apache Jenkins Server
See 




[jira] [Work logged] (BEAM-3848) SolrIO: Improve retrying mechanism in client writes

2018-03-28 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3848?focusedWorklogId=85422=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-85422
 ]

ASF GitHub Bot logged work on BEAM-3848:


Author: ASF GitHub Bot
Created on: 28/Mar/18 22:37
Start Date: 28/Mar/18 22:37
Worklog Time Spent: 10m 
  Work Description: timrobertson100 commented on a change in pull request 
#4905: [BEAM-3848] Enables ability to retry Solr writes on error (SolrIO)
URL: https://github.com/apache/beam/pull/4905#discussion_r177909607
 
 

 ##
 File path: 
sdks/java/io/solr/src/test/java/org/apache/beam/sdk/io/solr/SolrIOTest.java
 ##
 @@ -263,4 +276,109 @@ public void testSplit() throws Exception {
 // therefore, can not exist an empty shard.
 assertEquals("Wrong number of empty splits", expectedNumSplits, 
nonEmptySplits);
   }
+
+  /**
+   * Ensure that the retrying is ignored under success conditions.
+   */
+  @Test
+  public void testWriteDefaultRetrySuccess() throws Exception {
+SolrIO.Write write = mock(SolrIO.Write.class);
+when(write.getRetryConfiguration())
+.thenReturn(SolrIO.RetryConfiguration.create(10, 
Duration.standardSeconds(10)));
+SolrIO.Write.WriteFn writeFn = new SolrIO.Write.WriteFn(write);
+AuthorizedSolrClient solrClient = mock(AuthorizedSolrClient.class);
+
+// simulate success
+when(solrClient.process(any(String.class), any(SolrRequest.class)))
+.thenReturn(mock(SolrResponse.class));
+
+List batch = SolrIOTestUtils.createDocuments(1);
+writeFn.flushBatch(solrClient, batch);
+verify(solrClient, times(1)).process(any(String.class), 
any(SolrRequest.class));
+  }
+
+  /**
+   * Ensure that the default retrying behavior surfaces errors immediately 
under failure conditions.
+   */
+  @Test
+  public void testWriteRetryFail() throws Exception {
+SolrIO.Write write = mock(SolrIO.Write.class);
+
when(write.getRetryConfiguration()).thenReturn(SolrIO.DEFAULT_RETRY_CONFIGURATION);
+SolrIO.Write.WriteFn writeFn = new SolrIO.Write.WriteFn(write);
+AuthorizedSolrClient solrClient = mock(AuthorizedSolrClient.class);
+
+// simulate failure
+when(solrClient.process(any(String.class), any(SolrRequest.class)))
+.thenThrow(new SolrServerException("Fail"));
+
+List batch = SolrIOTestUtils.createDocuments(1);
+try {
+  writeFn.flushBatch(solrClient, batch);
+  fail("Error should have been surfaced when flushing batch");
+} catch (IOException e) {
+  verify(solrClient, times(1)).process(any(String.class), 
any(SolrRequest.class));
+}
+  }
+
+  /**
+   * Ensure that a time bounded retrying is observed.
+   */
+  @Test
+  public void testWriteRetryTimeBound() throws Exception {
+SolrIO.Write write = mock(SolrIO.Write.class);
+when(write.getRetryConfiguration())
+.thenReturn(
+SolrIO.RetryConfiguration.create(Integer.MAX_VALUE, 
Duration.standardSeconds(3)));
+SolrIO.Write.WriteFn writeFn = new SolrIO.Write.WriteFn(write);
+AuthorizedSolrClient solrClient = mock(AuthorizedSolrClient.class);
+
+// simulate failure
+when(solrClient.process(any(String.class), any(SolrRequest.class)))
+.thenThrow(
+new HttpSolrClient.RemoteSolrException(
+"localhost", 1, "ignore", new IOException("Network")));
+
+List batch = SolrIOTestUtils.createDocuments(1);
+Stopwatch stopwatch = Stopwatch.createStarted();
+
+try {
+  writeFn.flushBatch(solrClient, batch);
+  fail("Error should have been surfaced when flushing batch");
+} catch (IOException e) {
+  // at least two attempts must be made
+  verify(solrClient, Mockito.atLeast(2)).process(any(String.class), 
any(SolrRequest.class));
+  long seconds = stopwatch.elapsed(TimeUnit.SECONDS);
+  assertTrue(
+  "Retrying should have executed for at least 3 seconds but was " + 
seconds,
+  seconds >= 3);
+}
+  }
+
+  /**
+   * Ensure that retries are initiated up to a limited number.
+   */
+  @Test
+  public void testWriteRetryAttemptBound() throws Exception {
+SolrIO.Write write = mock(SolrIO.Write.class);
 
 Review comment:
   How about the following (copied from the `JdbcIO` design and test approach)
   
1. I introduce a configurable `RetryStrategy` that can be added to the 
`Write` and will come into effect when a `RetryConfiguration` is also provided
2. A `DefaultRetryStrategy` will retry on 
`HttpSolrClient.RemoteSolrException`, `SolrServerException`, `IOException`, and 
also on `SolrException` (where the code is `5xx` only), which I see is missing 
in the original PR.
3. I introduce `SLF4J` logging at `warn` when retries are invoked
4. I run the mini-IT style tests using the embedded Solr server with 
1. a custom `RetryStrategy` to return `true` (i.e. retry) on any 
`SolrException`
2. write to a non 
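A minimal sketch of point 2 above, with names modeled on the `JdbcIO` analogue (the interface, constant, and helper below are illustrative assumptions, not the final SolrIO API; real code would inspect `SolrException.code()` rather than parse a message):

```java
import java.io.IOException;

// Sketch only: a retry predicate in the spirit of JdbcIO's RetryStrategy,
// applied to write failures. Retries on IOException and on any exception
// whose message encodes a server-side (5xx) status; fails fast otherwise.
public class RetrySketch {
    interface RetryStrategy {
        boolean shouldRetry(Exception e);
    }

    static final RetryStrategy DEFAULT = e -> {
        if (e instanceof IOException) {
            return true;
        }
        int code = statusCodeOf(e);
        return code >= 500 && code < 600;
    };

    // Hypothetical helper: in this sketch the status code is simply the
    // exception message; a real strategy would read it from the exception.
    static int statusCodeOf(Exception e) {
        try {
            return Integer.parseInt(e.getMessage());
        } catch (NumberFormatException ignored) {
            return -1;
        }
    }

    public static void main(String[] args) {
        System.out.println(DEFAULT.shouldRetry(new IOException("Network")));  // true
        System.out.println(DEFAULT.shouldRetry(new RuntimeException("503"))); // true
        System.out.println(DEFAULT.shouldRetry(new RuntimeException("404"))); // false
    }
}
```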

[jira] [Work logged] (BEAM-3848) SolrIO: Improve retrying mechanism in client writes

2018-03-28 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3848?focusedWorklogId=85421=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-85421
 ]

ASF GitHub Bot logged work on BEAM-3848:


Author: ASF GitHub Bot
Created on: 28/Mar/18 22:37
Start Date: 28/Mar/18 22:37
Worklog Time Spent: 10m 
  Work Description: timrobertson100 commented on a change in pull request 
#4905: [BEAM-3848] Enables ability to retry Solr writes on error (SolrIO)
URL: https://github.com/apache/beam/pull/4905#discussion_r177909607
 
 

 ##
 File path: 
sdks/java/io/solr/src/test/java/org/apache/beam/sdk/io/solr/SolrIOTest.java
 ##
 @@ -263,4 +276,109 @@ public void testSplit() throws Exception {
 // therefore, can not exist an empty shard.
 assertEquals("Wrong number of empty splits", expectedNumSplits, 
nonEmptySplits);
   }
+
+  /**
+   * Ensure that the retrying is ignored under success conditions.
+   */
+  @Test
+  public void testWriteDefaultRetrySuccess() throws Exception {
+SolrIO.Write write = mock(SolrIO.Write.class);
+when(write.getRetryConfiguration())
+.thenReturn(SolrIO.RetryConfiguration.create(10, 
Duration.standardSeconds(10)));
+SolrIO.Write.WriteFn writeFn = new SolrIO.Write.WriteFn(write);
+AuthorizedSolrClient solrClient = mock(AuthorizedSolrClient.class);
+
+// simulate success
+when(solrClient.process(any(String.class), any(SolrRequest.class)))
+.thenReturn(mock(SolrResponse.class));
+
+List batch = SolrIOTestUtils.createDocuments(1);
+writeFn.flushBatch(solrClient, batch);
+verify(solrClient, times(1)).process(any(String.class), 
any(SolrRequest.class));
+  }
+
+  /**
+   * Ensure that the default retrying behavior surfaces errors immediately 
under failure conditions.
+   */
+  @Test
+  public void testWriteRetryFail() throws Exception {
+SolrIO.Write write = mock(SolrIO.Write.class);
+
when(write.getRetryConfiguration()).thenReturn(SolrIO.DEFAULT_RETRY_CONFIGURATION);
+SolrIO.Write.WriteFn writeFn = new SolrIO.Write.WriteFn(write);
+AuthorizedSolrClient solrClient = mock(AuthorizedSolrClient.class);
+
+// simulate failure
+when(solrClient.process(any(String.class), any(SolrRequest.class)))
+.thenThrow(new SolrServerException("Fail"));
+
+List batch = SolrIOTestUtils.createDocuments(1);
+try {
+  writeFn.flushBatch(solrClient, batch);
+  fail("Error should have been surfaced when flushing batch");
+} catch (IOException e) {
+  verify(solrClient, times(1)).process(any(String.class), 
any(SolrRequest.class));
+}
+  }
+
+  /**
+   * Ensure that a time bounded retrying is observed.
+   */
+  @Test
+  public void testWriteRetryTimeBound() throws Exception {
+SolrIO.Write write = mock(SolrIO.Write.class);
+when(write.getRetryConfiguration())
+.thenReturn(
+SolrIO.RetryConfiguration.create(Integer.MAX_VALUE, 
Duration.standardSeconds(3)));
+SolrIO.Write.WriteFn writeFn = new SolrIO.Write.WriteFn(write);
+AuthorizedSolrClient solrClient = mock(AuthorizedSolrClient.class);
+
+// simulate failure
+when(solrClient.process(any(String.class), any(SolrRequest.class)))
+.thenThrow(
+new HttpSolrClient.RemoteSolrException(
+"localhost", 1, "ignore", new IOException("Network")));
+
+List batch = SolrIOTestUtils.createDocuments(1);
+Stopwatch stopwatch = Stopwatch.createStarted();
+
+try {
+  writeFn.flushBatch(solrClient, batch);
+  fail("Error should have been surfaced when flushing batch");
+} catch (IOException e) {
+  // at least two attempts must be made
+  verify(solrClient, Mockito.atLeast(2)).process(any(String.class), 
any(SolrRequest.class));
+  long seconds = stopwatch.elapsed(TimeUnit.SECONDS);
+  assertTrue(
+  "Retrying should have executed for at least 3 seconds but was " + 
seconds,
+  seconds >= 3);
+}
+  }
+
+  /**
+   * Ensure that retries are initiated up to a limited number.
+   */
+  @Test
+  public void testWriteRetryAttemptBound() throws Exception {
+SolrIO.Write write = mock(SolrIO.Write.class);
 
 Review comment:
   How about the following (copied from the `JdbcIO` design and test approach)
   
1. I introduce a configurable `RetryStrategy` that can be added to the 
`Write` and will come into effect when a `RetryConfiguration` is also provided
2. A `DefaultRetryStrategy` will retry on 
`HttpSolrClient.RemoteSolrException`, `SolrServerException`, `IOException`, and 
also on `SolrException` (where the code is `5xx` only), which I see is missing 
in the original PR.
3. I introduce `SLF4J` logging at `warn` when retries are invoked
4. I run the mini-IT style tests using the embedded Solr server with 
1. a custom `RetryStrategy` to return `true` (i.e. retry) on any 
`SolrException`
2. write to a non 

[jira] [Work logged] (BEAM-3848) SolrIO: Improve retrying mechanism in client writes

2018-03-28 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3848?focusedWorklogId=85419=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-85419
 ]

ASF GitHub Bot logged work on BEAM-3848:


Author: ASF GitHub Bot
Created on: 28/Mar/18 22:36
Start Date: 28/Mar/18 22:36
Worklog Time Spent: 10m 
  Work Description: timrobertson100 commented on a change in pull request 
#4905: [BEAM-3848] Enables ability to retry Solr writes on error (SolrIO)
URL: https://github.com/apache/beam/pull/4905#discussion_r177909607
 
 

 ##
 File path: 
sdks/java/io/solr/src/test/java/org/apache/beam/sdk/io/solr/SolrIOTest.java
 ##
 @@ -263,4 +276,109 @@ public void testSplit() throws Exception {
 // therefore, can not exist an empty shard.
 assertEquals("Wrong number of empty splits", expectedNumSplits, 
nonEmptySplits);
   }
+
+  /**
+   * Ensure that the retrying is ignored under success conditions.
+   */
+  @Test
+  public void testWriteDefaultRetrySuccess() throws Exception {
+SolrIO.Write write = mock(SolrIO.Write.class);
+when(write.getRetryConfiguration())
+.thenReturn(SolrIO.RetryConfiguration.create(10, 
Duration.standardSeconds(10)));
+SolrIO.Write.WriteFn writeFn = new SolrIO.Write.WriteFn(write);
+AuthorizedSolrClient solrClient = mock(AuthorizedSolrClient.class);
+
+// simulate success
+when(solrClient.process(any(String.class), any(SolrRequest.class)))
+.thenReturn(mock(SolrResponse.class));
+
+List batch = SolrIOTestUtils.createDocuments(1);
+writeFn.flushBatch(solrClient, batch);
+verify(solrClient, times(1)).process(any(String.class), 
any(SolrRequest.class));
+  }
+
+  /**
+   * Ensure that the default retrying behavior surfaces errors immediately 
under failure conditions.
+   */
+  @Test
+  public void testWriteRetryFail() throws Exception {
+SolrIO.Write write = mock(SolrIO.Write.class);
+
when(write.getRetryConfiguration()).thenReturn(SolrIO.DEFAULT_RETRY_CONFIGURATION);
+SolrIO.Write.WriteFn writeFn = new SolrIO.Write.WriteFn(write);
+AuthorizedSolrClient solrClient = mock(AuthorizedSolrClient.class);
+
+// simulate failure
+when(solrClient.process(any(String.class), any(SolrRequest.class)))
+.thenThrow(new SolrServerException("Fail"));
+
+List batch = SolrIOTestUtils.createDocuments(1);
+try {
+  writeFn.flushBatch(solrClient, batch);
+  fail("Error should have been surfaced when flushing batch");
+} catch (IOException e) {
+  verify(solrClient, times(1)).process(any(String.class), 
any(SolrRequest.class));
+}
+  }
+
+  /**
+   * Ensure that a time bounded retrying is observed.
+   */
+  @Test
+  public void testWriteRetryTimeBound() throws Exception {
+SolrIO.Write write = mock(SolrIO.Write.class);
+when(write.getRetryConfiguration())
+.thenReturn(
+SolrIO.RetryConfiguration.create(Integer.MAX_VALUE, 
Duration.standardSeconds(3)));
+SolrIO.Write.WriteFn writeFn = new SolrIO.Write.WriteFn(write);
+AuthorizedSolrClient solrClient = mock(AuthorizedSolrClient.class);
+
+// simulate failure
+when(solrClient.process(any(String.class), any(SolrRequest.class)))
+.thenThrow(
+new HttpSolrClient.RemoteSolrException(
+"localhost", 1, "ignore", new IOException("Network")));
+
+List batch = SolrIOTestUtils.createDocuments(1);
+Stopwatch stopwatch = Stopwatch.createStarted();
+
+try {
+  writeFn.flushBatch(solrClient, batch);
+  fail("Error should have been surfaced when flushing batch");
+} catch (IOException e) {
+  // at least two attempts must be made
+  verify(solrClient, Mockito.atLeast(2)).process(any(String.class), 
any(SolrRequest.class));
+  long seconds = stopwatch.elapsed(TimeUnit.SECONDS);
+  assertTrue(
+  "Retrying should have executed for at least 3 seconds but was " + 
seconds,
+  seconds >= 3);
+}
+  }
+
+  /**
+   * Ensure that retries are initiated up to a limited number.
+   */
+  @Test
+  public void testWriteRetryAttemptBound() throws Exception {
+SolrIO.Write write = mock(SolrIO.Write.class);
 
 Review comment:
   How about the following (copied from the `JdbcIO` design and test approach)
   
1. I introduce a configurable `RetryStrategy` that can be added to the 
`Write` and will come into effect when a `RetryConfiguration` is also provided
2. A `DefaultRetryStrategy` will retry on 
`HttpSolrClient.RemoteSolrException`, `SolrServerException`, `IOException`, and 
also on `SolrException` (where the code is `5xx` only), which I see is missing 
in the original PR.
3. I introduce `SLF4J` logging at `warn` when retries are invoked
4. I run the mini-IT style tests using the embedded Solr server with 
1. a custom `RetryStrategy` to return `true` (i.e. retry) on any 
`SolrException`
2. write to a non 

[jira] [Work logged] (BEAM-3848) SolrIO: Improve retrying mechanism in client writes

2018-03-28 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3848?focusedWorklogId=85418=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-85418
 ]

ASF GitHub Bot logged work on BEAM-3848:


Author: ASF GitHub Bot
Created on: 28/Mar/18 22:35
Start Date: 28/Mar/18 22:35
Worklog Time Spent: 10m 
  Work Description: timrobertson100 commented on a change in pull request 
#4905: [BEAM-3848] Enables ability to retry Solr writes on error (SolrIO)
URL: https://github.com/apache/beam/pull/4905#discussion_r177909607
 
 

 ##
 File path: 
sdks/java/io/solr/src/test/java/org/apache/beam/sdk/io/solr/SolrIOTest.java
 ##
 @@ -263,4 +276,109 @@ public void testSplit() throws Exception {
 // therefore, can not exist an empty shard.
 assertEquals("Wrong number of empty splits", expectedNumSplits, 
nonEmptySplits);
   }
+
+  /**
+   * Ensure that the retrying is ignored under success conditions.
+   */
+  @Test
+  public void testWriteDefaultRetrySuccess() throws Exception {
+SolrIO.Write write = mock(SolrIO.Write.class);
+when(write.getRetryConfiguration())
+.thenReturn(SolrIO.RetryConfiguration.create(10, 
Duration.standardSeconds(10)));
+SolrIO.Write.WriteFn writeFn = new SolrIO.Write.WriteFn(write);
+AuthorizedSolrClient solrClient = mock(AuthorizedSolrClient.class);
+
+// simulate success
+when(solrClient.process(any(String.class), any(SolrRequest.class)))
+.thenReturn(mock(SolrResponse.class));
+
+List<SolrInputDocument> batch = SolrIOTestUtils.createDocuments(1);
+writeFn.flushBatch(solrClient, batch);
+verify(solrClient, times(1)).process(any(String.class), 
any(SolrRequest.class));
+  }
+
+  /**
+   * Ensure that the default retrying behavior surfaces errors immediately 
under failure conditions.
+   */
+  @Test
+  public void testWriteRetryFail() throws Exception {
+SolrIO.Write write = mock(SolrIO.Write.class);
+
when(write.getRetryConfiguration()).thenReturn(SolrIO.DEFAULT_RETRY_CONFIGURATION);
+SolrIO.Write.WriteFn writeFn = new SolrIO.Write.WriteFn(write);
+AuthorizedSolrClient solrClient = mock(AuthorizedSolrClient.class);
+
+// simulate failure
+when(solrClient.process(any(String.class), any(SolrRequest.class)))
+.thenThrow(new SolrServerException("Fail"));
+
+List<SolrInputDocument> batch = SolrIOTestUtils.createDocuments(1);
+try {
+  writeFn.flushBatch(solrClient, batch);
+  fail("Error should have been surfaced when flushing batch");
+} catch (IOException e) {
+  verify(solrClient, times(1)).process(any(String.class), 
any(SolrRequest.class));
+}
+  }
+
+  /**
+   * Ensure that a time bounded retrying is observed.
+   */
+  @Test
+  public void testWriteRetryTimeBound() throws Exception {
+SolrIO.Write write = mock(SolrIO.Write.class);
+when(write.getRetryConfiguration())
+.thenReturn(
+SolrIO.RetryConfiguration.create(Integer.MAX_VALUE, 
Duration.standardSeconds(3)));
+SolrIO.Write.WriteFn writeFn = new SolrIO.Write.WriteFn(write);
+AuthorizedSolrClient solrClient = mock(AuthorizedSolrClient.class);
+
+// simulate failure
+when(solrClient.process(any(String.class), any(SolrRequest.class)))
+.thenThrow(
+new HttpSolrClient.RemoteSolrException(
+"localhost", 1, "ignore", new IOException("Network")));
+
+List<SolrInputDocument> batch = SolrIOTestUtils.createDocuments(1);
+Stopwatch stopwatch = Stopwatch.createStarted();
+
+try {
+  writeFn.flushBatch(solrClient, batch);
+  fail("Error should have been surfaced when flushing batch");
+} catch (IOException e) {
+  // at least two attempts must be made
+  verify(solrClient, Mockito.atLeast(2)).process(any(String.class), 
any(SolrRequest.class));
+  long seconds = stopwatch.elapsed(TimeUnit.SECONDS);
+  assertTrue(
+  "Retrying should have executed for at least 3 seconds but was " + 
seconds,
+  seconds >= 3);
+}
+  }
+
+  /**
+   * Ensure that retries are initiated up to a limited number.
+   */
+  @Test
+  public void testWriteRetryAttemptBound() throws Exception {
+SolrIO.Write write = mock(SolrIO.Write.class);
 
 Review comment:
   How about the following (copied from the `JdbcIO` design and test approach)
   
1. I introduce a configurable `RetryStrategy` that can be added to the 
`Write` and will come into effect when a `RetryConfiguration` is also provided
2. A `DefaultRetryStrategy` will retry on 
`HttpSolrClient.RemoteSolrException`, `SolrServerException`, `IOException` and 
also on `SolrException` (where the code is `5xx` only), which I see is missing 
in the original PR.
3. I introduce `SLF4J` logging at `warn` when retries are invoked
4. I run the mini-IT style tests using the embedded Solr server with 
1. a custom `RetryStrategy` to return `true` (i.e. retry) on any 
`SolrException`
2. write to a non 

[jira] [Work logged] (BEAM-3848) SolrIO: Improve retrying mechanism in client writes

2018-03-28 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3848?focusedWorklogId=85417&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-85417
 ]

ASF GitHub Bot logged work on BEAM-3848:


Author: ASF GitHub Bot
Created on: 28/Mar/18 22:33
Start Date: 28/Mar/18 22:33
Worklog Time Spent: 10m 
  Work Description: timrobertson100 commented on a change in pull request 
#4905: [BEAM-3848] Enables ability to retry Solr writes on error (SolrIO)
URL: https://github.com/apache/beam/pull/4905#discussion_r177909607
 
 

 ##
 File path: 
sdks/java/io/solr/src/test/java/org/apache/beam/sdk/io/solr/SolrIOTest.java
 ##
 @@ -263,4 +276,109 @@ public void testSplit() throws Exception {
 // therefore, can not exist an empty shard.
 assertEquals("Wrong number of empty splits", expectedNumSplits, 
nonEmptySplits);
   }
+
+  /**
+   * Ensure that the retrying is ignored under success conditions.
+   */
+  @Test
+  public void testWriteDefaultRetrySuccess() throws Exception {
+SolrIO.Write write = mock(SolrIO.Write.class);
+when(write.getRetryConfiguration())
+.thenReturn(SolrIO.RetryConfiguration.create(10, 
Duration.standardSeconds(10)));
+SolrIO.Write.WriteFn writeFn = new SolrIO.Write.WriteFn(write);
+AuthorizedSolrClient solrClient = mock(AuthorizedSolrClient.class);
+
+// simulate success
+when(solrClient.process(any(String.class), any(SolrRequest.class)))
+.thenReturn(mock(SolrResponse.class));
+
+List<SolrInputDocument> batch = SolrIOTestUtils.createDocuments(1);
+writeFn.flushBatch(solrClient, batch);
+verify(solrClient, times(1)).process(any(String.class), 
any(SolrRequest.class));
+  }
+
+  /**
+   * Ensure that the default retrying behavior surfaces errors immediately 
under failure conditions.
+   */
+  @Test
+  public void testWriteRetryFail() throws Exception {
+SolrIO.Write write = mock(SolrIO.Write.class);
+
when(write.getRetryConfiguration()).thenReturn(SolrIO.DEFAULT_RETRY_CONFIGURATION);
+SolrIO.Write.WriteFn writeFn = new SolrIO.Write.WriteFn(write);
+AuthorizedSolrClient solrClient = mock(AuthorizedSolrClient.class);
+
+// simulate failure
+when(solrClient.process(any(String.class), any(SolrRequest.class)))
+.thenThrow(new SolrServerException("Fail"));
+
+List<SolrInputDocument> batch = SolrIOTestUtils.createDocuments(1);
+try {
+  writeFn.flushBatch(solrClient, batch);
+  fail("Error should have been surfaced when flushing batch");
+} catch (IOException e) {
+  verify(solrClient, times(1)).process(any(String.class), 
any(SolrRequest.class));
+}
+  }
+
+  /**
+   * Ensure that a time bounded retrying is observed.
+   */
+  @Test
+  public void testWriteRetryTimeBound() throws Exception {
+SolrIO.Write write = mock(SolrIO.Write.class);
+when(write.getRetryConfiguration())
+.thenReturn(
+SolrIO.RetryConfiguration.create(Integer.MAX_VALUE, 
Duration.standardSeconds(3)));
+SolrIO.Write.WriteFn writeFn = new SolrIO.Write.WriteFn(write);
+AuthorizedSolrClient solrClient = mock(AuthorizedSolrClient.class);
+
+// simulate failure
+when(solrClient.process(any(String.class), any(SolrRequest.class)))
+.thenThrow(
+new HttpSolrClient.RemoteSolrException(
+"localhost", 1, "ignore", new IOException("Network")));
+
+List<SolrInputDocument> batch = SolrIOTestUtils.createDocuments(1);
+Stopwatch stopwatch = Stopwatch.createStarted();
+
+try {
+  writeFn.flushBatch(solrClient, batch);
+  fail("Error should have been surfaced when flushing batch");
+} catch (IOException e) {
+  // at least two attempts must be made
+  verify(solrClient, Mockito.atLeast(2)).process(any(String.class), 
any(SolrRequest.class));
+  long seconds = stopwatch.elapsed(TimeUnit.SECONDS);
+  assertTrue(
+  "Retrying should have executed for at least 3 seconds but was " + 
seconds,
+  seconds >= 3);
+}
+  }
+
+  /**
+   * Ensure that retries are initiated up to a limited number.
+   */
+  @Test
+  public void testWriteRetryAttemptBound() throws Exception {
+SolrIO.Write write = mock(SolrIO.Write.class);
 
 Review comment:
   How about the following (copied from the `JdbcIO` design and test approach)
   
1. I introduce a configurable `RetryStrategy` that can be added to the 
`Write` and will come into effect when a `RetryConfiguration` is also provided
2. A `DefaultRetryStrategy` will retry on 
`HttpSolrClient.RemoteSolrException`, `SolrServerException`, `IOException` and 
also on `SolrException` (where the code is `5xx` only), which I see is missing 
in the original PR.
3. I introduce `SLF4J` logging at `warn` when retries are invoked
4. I run the mini-IT style tests using the embedded Solr server with 
1. a custom `RetryStrategy` to return `true` (i.e. retry) on any 
`SolrException`
2. write to a non 

[jira] [Work logged] (BEAM-3848) SolrIO: Improve retrying mechanism in client writes

2018-03-28 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3848?focusedWorklogId=85416&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-85416
 ]

ASF GitHub Bot logged work on BEAM-3848:


Author: ASF GitHub Bot
Created on: 28/Mar/18 22:32
Start Date: 28/Mar/18 22:32
Worklog Time Spent: 10m 
  Work Description: timrobertson100 commented on a change in pull request 
#4905: [BEAM-3848] Enables ability to retry Solr writes on error (SolrIO)
URL: https://github.com/apache/beam/pull/4905#discussion_r177909607
 
 

 ##
 File path: 
sdks/java/io/solr/src/test/java/org/apache/beam/sdk/io/solr/SolrIOTest.java
 ##
 @@ -263,4 +276,109 @@ public void testSplit() throws Exception {
 // therefore, can not exist an empty shard.
 assertEquals("Wrong number of empty splits", expectedNumSplits, 
nonEmptySplits);
   }
+
+  /**
+   * Ensure that the retrying is ignored under success conditions.
+   */
+  @Test
+  public void testWriteDefaultRetrySuccess() throws Exception {
+SolrIO.Write write = mock(SolrIO.Write.class);
+when(write.getRetryConfiguration())
+.thenReturn(SolrIO.RetryConfiguration.create(10, 
Duration.standardSeconds(10)));
+SolrIO.Write.WriteFn writeFn = new SolrIO.Write.WriteFn(write);
+AuthorizedSolrClient solrClient = mock(AuthorizedSolrClient.class);
+
+// simulate success
+when(solrClient.process(any(String.class), any(SolrRequest.class)))
+.thenReturn(mock(SolrResponse.class));
+
+List<SolrInputDocument> batch = SolrIOTestUtils.createDocuments(1);
+writeFn.flushBatch(solrClient, batch);
+verify(solrClient, times(1)).process(any(String.class), 
any(SolrRequest.class));
+  }
+
+  /**
+   * Ensure that the default retrying behavior surfaces errors immediately 
under failure conditions.
+   */
+  @Test
+  public void testWriteRetryFail() throws Exception {
+SolrIO.Write write = mock(SolrIO.Write.class);
+
when(write.getRetryConfiguration()).thenReturn(SolrIO.DEFAULT_RETRY_CONFIGURATION);
+SolrIO.Write.WriteFn writeFn = new SolrIO.Write.WriteFn(write);
+AuthorizedSolrClient solrClient = mock(AuthorizedSolrClient.class);
+
+// simulate failure
+when(solrClient.process(any(String.class), any(SolrRequest.class)))
+.thenThrow(new SolrServerException("Fail"));
+
+List<SolrInputDocument> batch = SolrIOTestUtils.createDocuments(1);
+try {
+  writeFn.flushBatch(solrClient, batch);
+  fail("Error should have been surfaced when flushing batch");
+} catch (IOException e) {
+  verify(solrClient, times(1)).process(any(String.class), 
any(SolrRequest.class));
+}
+  }
+
+  /**
+   * Ensure that a time bounded retrying is observed.
+   */
+  @Test
+  public void testWriteRetryTimeBound() throws Exception {
+SolrIO.Write write = mock(SolrIO.Write.class);
+when(write.getRetryConfiguration())
+.thenReturn(
+SolrIO.RetryConfiguration.create(Integer.MAX_VALUE, 
Duration.standardSeconds(3)));
+SolrIO.Write.WriteFn writeFn = new SolrIO.Write.WriteFn(write);
+AuthorizedSolrClient solrClient = mock(AuthorizedSolrClient.class);
+
+// simulate failure
+when(solrClient.process(any(String.class), any(SolrRequest.class)))
+.thenThrow(
+new HttpSolrClient.RemoteSolrException(
+"localhost", 1, "ignore", new IOException("Network")));
+
+List<SolrInputDocument> batch = SolrIOTestUtils.createDocuments(1);
+Stopwatch stopwatch = Stopwatch.createStarted();
+
+try {
+  writeFn.flushBatch(solrClient, batch);
+  fail("Error should have been surfaced when flushing batch");
+} catch (IOException e) {
+  // at least two attempts must be made
+  verify(solrClient, Mockito.atLeast(2)).process(any(String.class), 
any(SolrRequest.class));
+  long seconds = stopwatch.elapsed(TimeUnit.SECONDS);
+  assertTrue(
+  "Retrying should have executed for at least 3 seconds but was " + 
seconds,
+  seconds >= 3);
+}
+  }
+
+  /**
+   * Ensure that retries are initiated up to a limited number.
+   */
+  @Test
+  public void testWriteRetryAttemptBound() throws Exception {
+SolrIO.Write write = mock(SolrIO.Write.class);
 
 Review comment:
   How about the following (copied from the `JdbcIO` design and test approach)
   
1. I introduce a configurable `RetryStrategy` that can be added to the 
`Write` and will come into effect when a `RetryConfiguration` is also provided
2. A `DefaultRetryStrategy` will retry on 
`HttpSolrClient.RemoteSolrException`, `SolrServerException`, `IOException` and 
also on `SolrException` (where the code is `5xx` only), which I see is missing 
in the original PR.
3. I introduce `SLF4J` logging at `warn` when retries are invoked
4. I run the mini-IT style tests using the embedded Solr server with 
1. a custom `RetryStrategy` to return `true` (i.e. retry) on any 
`SolrException`
2. 
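The behavior these tests exercise (keep retrying a batch flush until either the attempt bound or the time bound of the `RetryConfiguration` is exceeded, then surface the last error as an `IOException`) can be sketched as follows. This is a hypothetical illustration of the mechanism, not the actual `SolrIO.Write.WriteFn` code; names and signatures are assumptions:

```java
import java.io.IOException;
import java.util.concurrent.Callable;

public class RetryLoopSketch {

  /**
   * Invokes {@code call} until it succeeds, the attempt bound is reached, or
   * the time bound expires; a real implementation would additionally consult a
   * RetryStrategy before retrying and log each retry at WARN.
   */
  public static <T> T callWithRetries(
      Callable<T> call, int maxAttempts, long maxDurationMillis) throws IOException {
    long deadline = System.currentTimeMillis() + maxDurationMillis;
    Exception last = null;
    for (int attempt = 1; attempt <= maxAttempts; attempt++) {
      try {
        return call.call();
      } catch (Exception e) {
        last = e;
        if (System.currentTimeMillis() >= deadline) {
          break; // time bound exceeded, stop retrying
        }
      }
    }
    // Retries exhausted: surface the failure, as the tests expect.
    throw new IOException("Error writing to Solr (retries exhausted)", last);
  }

  public static void main(String[] args) {
    int[] calls = {0};
    try {
      callWithRetries(
          () -> {
            calls[0]++;
            throw new RuntimeException("fail");
          },
          3,
          10_000);
    } catch (IOException expected) {
      System.out.println(calls[0]); // 3: the first attempt plus two retries
    }
  }
}
```

The attempt-bound and time-bound tests correspond to the two exit conditions of this loop: a small `maxAttempts` with a generous deadline, and an effectively unbounded `maxAttempts` with a short deadline.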

[jira] [Work logged] (BEAM-3848) SolrIO: Improve retrying mechanism in client writes

2018-03-28 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3848?focusedWorklogId=85415&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-85415
 ]

ASF GitHub Bot logged work on BEAM-3848:


Author: ASF GitHub Bot
Created on: 28/Mar/18 22:32
Start Date: 28/Mar/18 22:32
Worklog Time Spent: 10m 
  Work Description: timrobertson100 commented on a change in pull request 
#4905: [BEAM-3848] Enables ability to retry Solr writes on error (SolrIO)
URL: https://github.com/apache/beam/pull/4905#discussion_r177909607
 
 

 ##
 File path: 
sdks/java/io/solr/src/test/java/org/apache/beam/sdk/io/solr/SolrIOTest.java
 ##
 @@ -263,4 +276,109 @@ public void testSplit() throws Exception {
 // therefore, can not exist an empty shard.
 assertEquals("Wrong number of empty splits", expectedNumSplits, 
nonEmptySplits);
   }
+
+  /**
+   * Ensure that the retrying is ignored under success conditions.
+   */
+  @Test
+  public void testWriteDefaultRetrySuccess() throws Exception {
+SolrIO.Write write = mock(SolrIO.Write.class);
+when(write.getRetryConfiguration())
+.thenReturn(SolrIO.RetryConfiguration.create(10, 
Duration.standardSeconds(10)));
+SolrIO.Write.WriteFn writeFn = new SolrIO.Write.WriteFn(write);
+AuthorizedSolrClient solrClient = mock(AuthorizedSolrClient.class);
+
+// simulate success
+when(solrClient.process(any(String.class), any(SolrRequest.class)))
+.thenReturn(mock(SolrResponse.class));
+
+List<SolrInputDocument> batch = SolrIOTestUtils.createDocuments(1);
+writeFn.flushBatch(solrClient, batch);
+verify(solrClient, times(1)).process(any(String.class), 
any(SolrRequest.class));
+  }
+
+  /**
+   * Ensure that the default retrying behavior surfaces errors immediately 
under failure conditions.
+   */
+  @Test
+  public void testWriteRetryFail() throws Exception {
+SolrIO.Write write = mock(SolrIO.Write.class);
+
when(write.getRetryConfiguration()).thenReturn(SolrIO.DEFAULT_RETRY_CONFIGURATION);
+SolrIO.Write.WriteFn writeFn = new SolrIO.Write.WriteFn(write);
+AuthorizedSolrClient solrClient = mock(AuthorizedSolrClient.class);
+
+// simulate failure
+when(solrClient.process(any(String.class), any(SolrRequest.class)))
+.thenThrow(new SolrServerException("Fail"));
+
+List<SolrInputDocument> batch = SolrIOTestUtils.createDocuments(1);
+try {
+  writeFn.flushBatch(solrClient, batch);
+  fail("Error should have been surfaced when flushing batch");
+} catch (IOException e) {
+  verify(solrClient, times(1)).process(any(String.class), 
any(SolrRequest.class));
+}
+  }
+
+  /**
+   * Ensure that a time bounded retrying is observed.
+   */
+  @Test
+  public void testWriteRetryTimeBound() throws Exception {
+SolrIO.Write write = mock(SolrIO.Write.class);
+when(write.getRetryConfiguration())
+.thenReturn(
+SolrIO.RetryConfiguration.create(Integer.MAX_VALUE, 
Duration.standardSeconds(3)));
+SolrIO.Write.WriteFn writeFn = new SolrIO.Write.WriteFn(write);
+AuthorizedSolrClient solrClient = mock(AuthorizedSolrClient.class);
+
+// simulate failure
+when(solrClient.process(any(String.class), any(SolrRequest.class)))
+.thenThrow(
+new HttpSolrClient.RemoteSolrException(
+"localhost", 1, "ignore", new IOException("Network")));
+
+List<SolrInputDocument> batch = SolrIOTestUtils.createDocuments(1);
+Stopwatch stopwatch = Stopwatch.createStarted();
+
+try {
+  writeFn.flushBatch(solrClient, batch);
+  fail("Error should have been surfaced when flushing batch");
+} catch (IOException e) {
+  // at least two attempts must be made
+  verify(solrClient, Mockito.atLeast(2)).process(any(String.class), 
any(SolrRequest.class));
+  long seconds = stopwatch.elapsed(TimeUnit.SECONDS);
+  assertTrue(
+  "Retrying should have executed for at least 3 seconds but was " + 
seconds,
+  seconds >= 3);
+}
+  }
+
+  /**
+   * Ensure that retries are initiated up to a limited number.
+   */
+  @Test
+  public void testWriteRetryAttemptBound() throws Exception {
+SolrIO.Write write = mock(SolrIO.Write.class);
 
 Review comment:
   How about the following (copied from the `JdbcIO` design and test approach)
   
1. I introduce a configurable `RetryStrategy` that can be added to the 
`Write` and will come into effect when a `RetryConfiguration` is also provided
2. A `DefaultRetryStrategy` will retry on 
`HttpSolrClient.RemoteSolrException`, `SolrServerException`, `IOException` and 
also on `SolrException` (where the code is `5xx` only), which I see is missing 
in the original PR.
3. I introduce `SLF4J` logging at `warn` when retries are invoked
4. I run the mini-IT style tests using the embedded Solr server with 
1. a custom `RetryStrategy` to return `true` (i.e. retry) on any 
`SolrException`
   

[jira] [Work logged] (BEAM-3966) Move core utilities into a new top-level module

2018-03-28 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3966?focusedWorklogId=85413&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-85413
 ]

ASF GitHub Bot logged work on BEAM-3966:


Author: ASF GitHub Bot
Created on: 28/Mar/18 22:28
Start Date: 28/Mar/18 22:28
Worklog Time Spent: 10m 
  Work Description: lukecwik commented on issue #4974: [BEAM-3966] Move 
`sdks/java/fn-execution` to `util/java/fn-execution`
URL: https://github.com/apache/beam/pull/4974#issuecomment-377058787
 
 
   Can we defer doing moves/renames like this till we are further along?
   
   This will cause a bunch of pain for people integrating back changes on 
hacking branches.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 85413)
Time Spent: 50m  (was: 40m)

> Move core utilities into a new top-level module
> ---
>
> Key: BEAM-3966
> URL: https://issues.apache.org/jira/browse/BEAM-3966
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-core
>Reporter: Ben Sidhom
>Assignee: Kenneth Knowles
>Priority: Minor
>  Time Spent: 50m
>  Remaining Estimate: 0h
>
> As part of a longer-term dependency cleanup, fn-execution and similar 
> utilities should be moved into a new top-level module (util?) that can be 
> shared among runner and/or SDK code while clearly delineating the boundary 
> between runner and SDK.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Work logged] (BEAM-3966) Move core utilities into a new top-level module

2018-03-28 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3966?focusedWorklogId=85412&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-85412
 ]

ASF GitHub Bot logged work on BEAM-3966:


Author: ASF GitHub Bot
Created on: 28/Mar/18 22:22
Start Date: 28/Mar/18 22:22
Worklog Time Spent: 10m 
  Work Description: tgroh commented on issue #4974: [BEAM-3966] Move 
`sdks/java/fn-execution` to `util/java/fn-execution`
URL: https://github.com/apache/beam/pull/4974#issuecomment-377057406
 
 
   LGTM; R: @kennknowles as I know you also dislike 'util' as a name, but it 
seems reasonable here (but I would be very happy if you have better ideas)


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 85412)
Time Spent: 40m  (was: 0.5h)

> Move core utilities into a new top-level module
> ---
>
> Key: BEAM-3966
> URL: https://issues.apache.org/jira/browse/BEAM-3966
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-core
>Reporter: Ben Sidhom
>Assignee: Kenneth Knowles
>Priority: Minor
>  Time Spent: 40m
>  Remaining Estimate: 0h
>
> As part of a longer-term dependency cleanup, fn-execution and similar 
> utilities should be moved into a new top-level module (util?) that can be 
> shared among runner and/or SDK code while clearly delineating the boundary 
> between runner and SDK.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Work logged] (BEAM-3966) Move core utilities into a new top-level module

2018-03-28 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3966?focusedWorklogId=85411&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-85411
 ]

ASF GitHub Bot logged work on BEAM-3966:


Author: ASF GitHub Bot
Created on: 28/Mar/18 22:22
Start Date: 28/Mar/18 22:22
Worklog Time Spent: 10m 
  Work Description: tgroh commented on issue #4974: [BEAM-3966] Move 
`sdks/java/fn-execution` to `util/java/fn-execution`
URL: https://github.com/apache/beam/pull/4974#issuecomment-377057406
 
 
   LGTM; R: @kennknowles as I know you also dislike 'util' as a name, but it 
seems reasonable here


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 85411)
Time Spent: 0.5h  (was: 20m)

> Move core utilities into a new top-level module
> ---
>
> Key: BEAM-3966
> URL: https://issues.apache.org/jira/browse/BEAM-3966
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-core
>Reporter: Ben Sidhom
>Assignee: Kenneth Knowles
>Priority: Minor
>  Time Spent: 0.5h
>  Remaining Estimate: 0h
>
> As part of a longer-term dependency cleanup, fn-execution and similar 
> utilities should be moved into a new top-level module (util?) that can be 
> shared among runner and/or SDK code while clearly delineating the boundary 
> between runner and SDK.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Work logged] (BEAM-3966) Move core utilities into a new top-level module

2018-03-28 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3966?focusedWorklogId=85408&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-85408
 ]

ASF GitHub Bot logged work on BEAM-3966:


Author: ASF GitHub Bot
Created on: 28/Mar/18 22:18
Start Date: 28/Mar/18 22:18
Worklog Time Spent: 10m 
  Work Description: bsidhom opened a new pull request #4974: [BEAM-3966] 
Move `sdks/java/fn-execution` to `util/java/fn-execution`
URL: https://github.com/apache/beam/pull/4974
 
 
   This also renames the Java package itself to match the new module name.
   
   
   
   Follow this checklist to help us incorporate your contribution quickly and 
easily:
   
- [ ] Make sure there is a [JIRA 
issue](https://issues.apache.org/jira/projects/BEAM/issues/) filed for the 
change (usually before you start working on it).  Trivial changes like typos do 
not require a JIRA issue.  Your pull request should address just this issue, 
without pulling in other changes.
- [ ] Format the pull request title like `[BEAM-XXX] Fixes bug in 
ApproximateQuantiles`, where you replace `BEAM-XXX` with the appropriate JIRA 
issue.
- [ ] Write a pull request description that is detailed enough to 
understand:
  - [ ] What the pull request does
  - [ ] Why it does it
  - [ ] How it does it
  - [ ] Why this approach
- [ ] Each commit in the pull request should have a meaningful subject line 
and body.
- [ ] Run `mvn clean verify` to make sure basic checks pass. A more 
thorough check will be performed on your pull request automatically.
- [ ] If this contribution is large, please file an Apache [Individual 
Contributor License Agreement](https://www.apache.org/licenses/icla.pdf).
   
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 85408)
Time Spent: 10m
Remaining Estimate: 0h

> Move core utilities into a new top-level module
> ---
>
> Key: BEAM-3966
> URL: https://issues.apache.org/jira/browse/BEAM-3966
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-core
>Reporter: Ben Sidhom
>Assignee: Kenneth Knowles
>Priority: Minor
>  Time Spent: 10m
>  Remaining Estimate: 0h
>
> As part of a longer-term dependency cleanup, fn-execution and similar 
> utilities should be moved into a new top-level module (util?) that can be 
> shared among runner and/or SDK code while clearly delineating the boundary 
> between runner and SDK.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Work logged] (BEAM-3966) Move core utilities into a new top-level module

2018-03-28 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3966?focusedWorklogId=85409&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-85409
 ]

ASF GitHub Bot logged work on BEAM-3966:


Author: ASF GitHub Bot
Created on: 28/Mar/18 22:18
Start Date: 28/Mar/18 22:18
Worklog Time Spent: 10m 
  Work Description: bsidhom commented on issue #4974: [BEAM-3966] Move 
`sdks/java/fn-execution` to `util/java/fn-execution`
URL: https://github.com/apache/beam/pull/4974#issuecomment-377056670
 
 
   R: @tgroh 


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 85409)
Time Spent: 20m  (was: 10m)

> Move core utilities into a new top-level module
> ---
>
> Key: BEAM-3966
> URL: https://issues.apache.org/jira/browse/BEAM-3966
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-core
>Reporter: Ben Sidhom
>Assignee: Kenneth Knowles
>Priority: Minor
>  Time Spent: 20m
>  Remaining Estimate: 0h
>
> As part of a longer-term dependency cleanup, fn-execution and similar 
> utilities should be moved into a new top-level module (util?) that can be 
> shared among runner and/or SDK code while clearly delineating the boundary 
> between runner and SDK.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Build failed in Jenkins: beam_PostCommit_Python_Verify #4526

2018-03-28 Thread Apache Jenkins Server
See 


--
[...truncated 260.89 KB...]
Creating file target/docs/source/apache_beam.io.gcp.pubsub.rst.
Creating file target/docs/source/apache_beam.io.gcp.rst.
Creating file target/docs/source/apache_beam.io.gcp.datastore.rst.
Creating file 
target/docs/source/apache_beam.io.gcp.datastore.v1.adaptive_throttler.rst.
Creating file 
target/docs/source/apache_beam.io.gcp.datastore.v1.datastoreio.rst.
Creating file 
target/docs/source/apache_beam.io.gcp.datastore.v1.fake_datastore.rst.
Creating file target/docs/source/apache_beam.io.gcp.datastore.v1.helper.rst.
Creating file 
target/docs/source/apache_beam.io.gcp.datastore.v1.query_splitter.rst.
Creating file target/docs/source/apache_beam.io.gcp.datastore.v1.util.rst.
Creating file target/docs/source/apache_beam.io.gcp.datastore.v1.rst.
Creating file target/docs/source/apache_beam.metrics.cells.rst.
Creating file target/docs/source/apache_beam.metrics.metric.rst.
Creating file target/docs/source/apache_beam.metrics.metricbase.rst.
Creating file target/docs/source/apache_beam.metrics.rst.
Creating file target/docs/source/apache_beam.options.pipeline_options.rst.
Creating file 
target/docs/source/apache_beam.options.pipeline_options_validator.rst.
Creating file target/docs/source/apache_beam.options.value_provider.rst.
Creating file target/docs/source/apache_beam.options.rst.
Creating file target/docs/source/apache_beam.portability.common_urns.rst.
Creating file target/docs/source/apache_beam.portability.python_urns.rst.
Creating file target/docs/source/apache_beam.portability.rst.
Creating file 
target/docs/source/apache_beam.portability.api.beam_artifact_api_pb2_grpc.rst.
Creating file 
target/docs/source/apache_beam.portability.api.beam_fn_api_pb2_grpc.rst.
Creating file 
target/docs/source/apache_beam.portability.api.beam_job_api_pb2_grpc.rst.
Creating file 
target/docs/source/apache_beam.portability.api.beam_provision_api_pb2_grpc.rst.
Creating file 
target/docs/source/apache_beam.portability.api.beam_runner_api_pb2_grpc.rst.
Creating file target/docs/source/apache_beam.portability.api.endpoints_pb2_grpc.rst.
Creating file target/docs/source/apache_beam.portability.api.standard_window_fns_pb2_grpc.rst.
Creating file target/docs/source/apache_beam.portability.api.rst.
Creating file target/docs/source/apache_beam.runners.pipeline_context.rst.
Creating file target/docs/source/apache_beam.runners.runner.rst.
Creating file target/docs/source/apache_beam.runners.sdf_common.rst.
Creating file target/docs/source/apache_beam.runners.rst.
Creating file target/docs/source/apache_beam.runners.dataflow.dataflow_metrics.rst.
Creating file target/docs/source/apache_beam.runners.dataflow.dataflow_runner.rst.
Creating file target/docs/source/apache_beam.runners.dataflow.ptransform_overrides.rst.
Creating file target/docs/source/apache_beam.runners.dataflow.test_dataflow_runner.rst.
Creating file target/docs/source/apache_beam.runners.dataflow.rst.
Creating file target/docs/source/apache_beam.runners.dataflow.native_io.iobase.rst.
Creating file target/docs/source/apache_beam.runners.dataflow.native_io.streaming_create.rst.
Creating file target/docs/source/apache_beam.runners.dataflow.native_io.rst.
Creating file target/docs/source/apache_beam.runners.direct.bundle_factory.rst.
Creating file target/docs/source/apache_beam.runners.direct.clock.rst.
Creating file target/docs/source/apache_beam.runners.direct.consumer_tracking_pipeline_visitor.rst.
Creating file target/docs/source/apache_beam.runners.direct.direct_metrics.rst.
Creating file target/docs/source/apache_beam.runners.direct.direct_runner.rst.
Creating file target/docs/source/apache_beam.runners.direct.evaluation_context.rst.
Creating file target/docs/source/apache_beam.runners.direct.executor.rst.
Creating file target/docs/source/apache_beam.runners.direct.helper_transforms.rst.
Creating file target/docs/source/apache_beam.runners.direct.sdf_direct_runner.rst.
Creating file target/docs/source/apache_beam.runners.direct.transform_evaluator.rst.
Creating file target/docs/source/apache_beam.runners.direct.util.rst.
Creating file target/docs/source/apache_beam.runners.direct.watermark_manager.rst.
Creating file target/docs/source/apache_beam.runners.direct.rst.
Creating file target/docs/source/apache_beam.runners.experimental.rst.
Creating file target/docs/source/apache_beam.runners.experimental.python_rpc_direct.python_rpc_direct_runner.rst.
Creating file target/docs/source/apache_beam.runners.experimental.python_rpc_direct.server.rst.
Creating file target/docs/source/apache_beam.runners.experimental.python_rpc_direct.rst.
Creating file target/docs/source/apache_beam.runners.job.manager.rst.
Creating file target/docs/source/apache_beam.runners.job.utils.rst.
Creating file target/docs/source/apache_beam.runners.job.rst.
Creating file 

Jenkins build is back to normal : beam_PostCommit_Python_ValidatesRunner_Dataflow #1206

2018-03-28 Thread Apache Jenkins Server
See 




Jenkins build is unstable: beam_PostCommit_Java_MavenInstall #6319

2018-03-28 Thread Apache Jenkins Server
See 




[jira] [Work logged] (BEAM-3104) Implement a BeamFnState Service

2018-03-28 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3104?focusedWorklogId=85399=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-85399
 ]

ASF GitHub Bot logged work on BEAM-3104:


Author: ASF GitHub Bot
Created on: 28/Mar/18 21:33
Start Date: 28/Mar/18 21:33
Worklog Time Spent: 10m 
  Work Description: lukecwik commented on issue #4973: [BEAM-3104] Set up 
state interfaces, wire into SDK harness client.
URL: https://github.com/apache/beam/pull/4973#issuecomment-377044527
 
 
   R: @axelmagn 
   CC: @bsidhom 


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 85399)
Time Spent: 20m  (was: 10m)

> Implement a BeamFnState Service
> ---
>
> Key: BEAM-3104
> URL: https://issues.apache.org/jira/browse/BEAM-3104
> Project: Beam
>  Issue Type: Sub-task
>  Components: runner-core
>Reporter: Thomas Groh
>Priority: Major
>  Labels: portability
>  Time Spent: 20m
>  Remaining Estimate: 0h
>
> This needs to use Java methods to handle State Requests, and convert the Java 
> response into an appropriate State API response.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Work logged] (BEAM-3104) Implement a BeamFnState Service

2018-03-28 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3104?focusedWorklogId=85398=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-85398
 ]

ASF GitHub Bot logged work on BEAM-3104:


Author: ASF GitHub Bot
Created on: 28/Mar/18 21:32
Start Date: 28/Mar/18 21:32
Worklog Time Spent: 10m 
  Work Description: lukecwik opened a new pull request #4973: [BEAM-3104] 
Set up state interfaces, wire into SDK harness client.
URL: https://github.com/apache/beam/pull/4973
 
 
   Allow for creating a BundleProcessor with a state delegator and an 
ActiveBundle with a state handler.
   Add tests that cover aborting state handling if something upstream fails.
   
   
   
   Follow this checklist to help us incorporate your contribution quickly and 
easily:
   
- [ ] Make sure there is a [JIRA 
issue](https://issues.apache.org/jira/projects/BEAM/issues/) filed for the 
change (usually before you start working on it).  Trivial changes like typos do 
not require a JIRA issue.  Your pull request should address just this issue, 
without pulling in other changes.
- [ ] Format the pull request title like `[BEAM-XXX] Fixes bug in 
ApproximateQuantiles`, where you replace `BEAM-XXX` with the appropriate JIRA 
issue.
- [ ] Write a pull request description that is detailed enough to 
understand:
  - [ ] What the pull request does
  - [ ] Why it does it
  - [ ] How it does it
  - [ ] Why this approach
- [ ] Each commit in the pull request should have a meaningful subject line 
and body.
- [ ] Run `mvn clean verify` to make sure basic checks pass. A more 
thorough check will be performed on your pull request automatically.
- [ ] If this contribution is large, please file an Apache [Individual 
Contributor License Agreement](https://www.apache.org/licenses/icla.pdf).
   
   




Issue Time Tracking
---

Worklog Id: (was: 85398)
Time Spent: 10m
Remaining Estimate: 0h

> Implement a BeamFnState Service
> ---
>
> Key: BEAM-3104
> URL: https://issues.apache.org/jira/browse/BEAM-3104
> Project: Beam
>  Issue Type: Sub-task
>  Components: runner-core
>Reporter: Thomas Groh
>Priority: Major
>  Labels: portability
>  Time Spent: 10m
>  Remaining Estimate: 0h
>
> This needs to use Java methods to handle State Requests, and convert the Java 
> response into an appropriate State API response.





[jira] [Work logged] (BEAM-3956) Stacktraces from exceptions in user code should be preserved in the Python SDK

2018-03-28 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3956?focusedWorklogId=85384=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-85384
 ]

ASF GitHub Bot logged work on BEAM-3956:


Author: ASF GitHub Bot
Created on: 28/Mar/18 20:57
Start Date: 28/Mar/18 20:57
Worklog Time Spent: 10m 
  Work Description: shoyer commented on issue #4959: [BEAM-3956] Preserve 
stacktraces for Python exceptions
URL: https://github.com/apache/beam/pull/4959#issuecomment-377033947
 
 
   Jenkins appears to be failing due to a lint error in a file I didn't edit 
(https://builds.apache.org/job/beam_PreCommit_Python_MavenInstall/4127/org.apache.beam$beam-sdks-python/console):
   ```
   Running pylint for module apache_beam:
   * Module apache_beam.typehints.typehints
   C:1101, 0: Trailing newlines (trailing-newlines)
   ```
   




Issue Time Tracking
---

Worklog Id: (was: 85384)
Time Spent: 1h 10m  (was: 1h)

> Stacktraces from exceptions in user code should be preserved in the Python SDK
> --
>
> Key: BEAM-3956
> URL: https://issues.apache.org/jira/browse/BEAM-3956
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-py-core
>Reporter: Stephan Hoyer
>Priority: Major
>  Time Spent: 1h 10m
>  Remaining Estimate: 0h
>
> Currently, Beam's Python SDK loses stacktraces for exceptions. It does 
> helpfully add a tag like "[while running StageA]" to exception error 
> messages, but that doesn't include the stacktrace of Python functions being 
> called.
> Including the full stacktraces would make a big difference for the ease of 
> debugging Beam pipelines when things go wrong.





[jira] [Commented] (BEAM-3948) Quota exceeded error causes Performance tests failures.

2018-03-28 Thread Alan Myrvold (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-3948?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16418076#comment-16418076
 ] 

Alan Myrvold commented on BEAM-3948:


I see additional failures due to issue BEAM-3964

> Quota exceeded error causes Performance tests failures.
> ---
>
> Key: BEAM-3948
> URL: https://issues.apache.org/jira/browse/BEAM-3948
> Project: Beam
>  Issue Type: Bug
>  Components: testing
>Reporter: Łukasz Gajowy
>Assignee: Chamikara Jayalath
>Priority: Major
> Fix For: Not applicable
>
>
> This happens for various jenkins jobs for some time now.
> Example logs: 
> https://builds.apache.org/view/A-D/view/Beam/job/beam_PerformanceTests_JDBC/379/console
>   or for TextIOIT: 
> [https://builds.apache.org/view/A-D/view/Beam/job/beam_PerformanceTests_TextIOIT/318/console]
> other IOITs are also affected.  





[jira] [Work logged] (BEAM-3964) Java Github HEAD doesn't run on Dataflow

2018-03-28 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3964?focusedWorklogId=85380=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-85380
 ]

ASF GitHub Bot logged work on BEAM-3964:


Author: ASF GitHub Bot
Created on: 28/Mar/18 20:30
Start Date: 28/Mar/18 20:30
Worklog Time Spent: 10m 
  Work Description: tgroh commented on issue #4971: [BEAM-3964] Temporarily 
Disable Dataflow Tests
URL: https://github.com/apache/beam/pull/4971#issuecomment-377025634
 
 
   Run Dataflow ValidatesRunner
   
   (If everything works as expected, this will be a very quick execution)




Issue Time Tracking
---

Worklog Id: (was: 85380)
Time Spent: 0.5h  (was: 20m)

> Java Github HEAD doesn't run on Dataflow
> 
>
> Key: BEAM-3964
> URL: https://issues.apache.org/jira/browse/BEAM-3964
> Project: Beam
>  Issue Type: Bug
>  Components: runner-dataflow
>Reporter: Thomas Groh
>Assignee: Thomas Groh
>Priority: Major
>  Time Spent: 0.5h
>  Remaining Estimate: 0h
>
> Rather: it runs, but cannot make progress, seemingly due to an Error with 
> metric names:
>  
> exception: "java.lang.NoSuchMethodError: 
> org.apache.beam.sdk.metrics.MetricName.name()Ljava/lang/String; at 
> com.google.cloud.dataflow.worker.MetricsToCounterUpdateConverter.structuredNameAndMetadata(MetricsToCounterUpdateConverter.java:99)
>  at 
> com.google.cloud.dataflow.worker.MetricsToCounterUpdateConverter.fromCounter(MetricsToCounterUpdateConverter.java:68)
>  at 
> com.google.cloud.dataflow.worker.BatchModeExecutionContext.lambda$null$1(BatchModeExecutionContext.java:463)
>  at 
> com.google.cloud.dataflow.worker.repackaged.com.google.common.collect.Iterators$7.transform(Iterators.java:750)
>  at 
> com.google.cloud.dataflow.worker.repackaged.com.google.common.collect.TransformedIterator.next(TransformedIterator.java:47)
>  at 
> com.google.cloud.dataflow.worker.repackaged.com.google.common.collect.MultitransformedIterator.next(MultitransformedIterator.java:66)
>  at 
> com.google.cloud.dataflow.worker.repackaged.com.google.common.collect.ImmutableCollection$Builder.addAll(ImmutableCollection.java:388)
>  at 
> com.google.cloud.dataflow.worker.repackaged.com.google.common.collect.ImmutableCollection$ArrayBasedBuilder.addAll(ImmutableCollection.java:472)
>  at 
> com.google.cloud.dataflow.worker.repackaged.com.google.common.collect.ImmutableList$Builder.addAll(ImmutableList.java:669)
>  at 
> com.google.cloud.dataflow.worker.WorkItemStatusClient.populateCounterUpdates(WorkItemStatusClient.java:256)
>  at 
> com.google.cloud.dataflow.worker.WorkItemStatusClient.createStatusUpdate(WorkItemStatusClient.java:240)
>  at 
> com.google.cloud.dataflow.worker.WorkItemStatusClient.reportError(WorkItemStatusClient.java:94)
>  at 
> com.google.cloud.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:358)
>  at 
> com.google.cloud.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:284)
>  at 
> com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:134)
>  at 
> com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:114)
>  at 
> com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:101)
>  at java.util.concurrent.FutureTask.run(FutureTask.java:266) at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>  at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>  at java.lang.Thread.run(Thread.java:745)
>  
> Probably requires an updated container version.






[jira] [Created] (BEAM-3966) Move core utilities into a new top-level module

2018-03-28 Thread Ben Sidhom (JIRA)
Ben Sidhom created BEAM-3966:


 Summary: Move core utilities into a new top-level module
 Key: BEAM-3966
 URL: https://issues.apache.org/jira/browse/BEAM-3966
 Project: Beam
  Issue Type: Bug
  Components: sdk-java-core
Reporter: Ben Sidhom
Assignee: Kenneth Knowles


As part of a longer-term dependency cleanup, fn-execution and similar utilities 
should be moved into a new top-level module (util?) that can be shared among 
runner and/or SDK code while clearly delineating the boundary between runner 
and SDK.





[jira] [Work logged] (BEAM-3706) Update CombinePayload to improved model for Portability

2018-03-28 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3706?focusedWorklogId=85372=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-85372
 ]

ASF GitHub Bot logged work on BEAM-3706:


Author: ASF GitHub Bot
Created on: 28/Mar/18 20:16
Start Date: 28/Mar/18 20:16
Worklog Time Spent: 10m 
  Work Description: bsidhom commented on issue #4924: [BEAM-3706] Removing 
side inputs from CombinePayload proto.
URL: https://github.com/apache/beam/pull/4924#issuecomment-377021479
 
 
   FYI, it looks like you have merge conflicts to deal with now.




Issue Time Tracking
---

Worklog Id: (was: 85372)
Time Spent: 3h 10m  (was: 3h)

> Update CombinePayload to improved model for Portability
> ---
>
> Key: BEAM-3706
> URL: https://issues.apache.org/jira/browse/BEAM-3706
> Project: Beam
>  Issue Type: Sub-task
>  Components: beam-model
>Reporter: Daniel Oliveira
>Assignee: Daniel Oliveira
>Priority: Minor
>  Labels: portability
>  Time Spent: 3h 10m
>  Remaining Estimate: 0h
>
> This will mean changing the proto definition in beam_runner_api, most likely 
> trimming out fields that are no longer necessary and adding any new ones that 
> could be useful. The majority of work will probably be in investigating if 
> some existing fields can actually be removed (SideInputs and Parameters for 
> example).





[jira] [Resolved] (BEAM-3695) beam_PostCommit_Python_ValidatesContainer_Dataflow red for a few days

2018-03-28 Thread Alan Myrvold (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3695?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Alan Myrvold resolved BEAM-3695.

   Resolution: Fixed
Fix Version/s: 2.4.0

Passing now, 
https://builds.apache.org/view/A-D/view/Beam/job/beam_PostCommit_Python_ValidatesContainer_Dataflow/

> beam_PostCommit_Python_ValidatesContainer_Dataflow red for a few days
> -
>
> Key: BEAM-3695
> URL: https://issues.apache.org/jira/browse/BEAM-3695
> Project: Beam
>  Issue Type: Bug
>  Components: build-system, runner-dataflow, testing
>Reporter: Kenneth Knowles
>Assignee: Alan Myrvold
>Priority: Critical
> Fix For: 2.4.0
>
>  Time Spent: 2h 20m
>  Remaining Estimate: 0h
>
> I haven't looked into the logs, but this test last passed 5 days ago.
> Can the job or its notifications be disabled while it is under development, 
> perhaps?





[jira] [Updated] (BEAM-3965) HDFS read broken

2018-03-28 Thread Udi Meiri (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3965?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Udi Meiri updated BEAM-3965:

Description: 
When running a command like:

{noformat}
python setup.py sdist > /dev/null && python -m apache_beam.examples.wordcount 
--output gs://.../py-wordcount-output --hdfs_host ... --hdfs_port 50070 
--hdfs_user ehudm --runner DataflowRunner --project ... --temp_location 
gs://.../temp-hdfs-int --staging_location gs://.../staging-hdfs-int 
--sdk_location dist/apache-beam-2.5.0.dev0.tar.gz --input hdfs://kinglear.txt
{noformat}

I get:

{noformat}
Traceback (most recent call last):
  File "/usr/lib/python2.7/runpy.py", line 174, in _run_module_as_main
"__main__", fname, loader, pkg_name)
  File "/usr/lib/python2.7/runpy.py", line 72, in _run_code
exec code in run_globals
  File 
"/usr/local/google/home/ehudm/src/beam/sdks/python/apache_beam/examples/wordcount.py",
 line 136, in 
run()
  File 
"/usr/local/google/home/ehudm/src/beam/sdks/python/apache_beam/examples/wordcount.py",
 line 90, in run
lines = p | 'read' >> ReadFromText(known_args.input)
  File "apache_beam/io/textio.py", line 522, in __init__
skip_header_lines=skip_header_lines)
  File "apache_beam/io/textio.py", line 117, in __init__
validate=validate)
  File "apache_beam/io/filebasedsource.py", line 119, in __init__
self._validate()
  File "apache_beam/options/value_provider.py", line 124, in _f
return fnc(self, *args, **kwargs)
  File "apache_beam/io/filebasedsource.py", line 176, in _validate
match_result = FileSystems.match([pattern], limits=[1])[0]
  File "apache_beam/io/filesystems.py", line 159, in match
return filesystem.match(patterns, limits)
  File "apache_beam/io/hadoopfilesystem.py", line 221, in match
raise BeamIOError('Match operation failed', exceptions)
apache_beam.io.filesystem.BeamIOError: Match operation failed with exceptions 
{'hdfs://kinglear.txt': KeyError('name',)}
{noformat}

  was:
When running a command like:

python setup.py sdist > /dev/null && python -m apache_beam.examples.wordcount 
--output gs://.../py-wordcount-output --hdfs_host ... --hdfs_port 50070 
--hdfs_user ehudm --runner DataflowRunner --project ... --temp_location 
gs://.../temp-hdfs-int --staging_location gs://.../staging-hdfs-int 
--sdk_location dist/apache-beam-2.5.0.dev0.tar.gz --input hdfs://kinglear.txt

I get:

{{Traceback (most recent call last):}}
{{ File "/usr/lib/python2.7/runpy.py", line 174, in _run_module_as_main}}
{{ "__main__", fname, loader, pkg_name)}}
{{ File "/usr/lib/python2.7/runpy.py", line 72, in _run_code}}
{{ exec code in run_globals}}
{{ File ".../beam/sdks/python/apache_beam/examples/wordcount.py", line 136, in 
}}
{{ run()}}
{{ File ".../beam/sdks/python/apache_beam/examples/wordcount.py", line 90, in 
run}}
{{ lines = p | 'read' >> ReadFromText(known_args.input)}}
{{ File "apache_beam/io/textio.py", line 522, in __init__}}
{{ skip_header_lines=skip_header_lines)}}
{{ File "apache_beam/io/textio.py", line 117, in __init__}}
{{ validate=validate)}}
{{ File "apache_beam/io/filebasedsource.py", line 119, in __init__}}
{{ self._validate()}}
{{ File "apache_beam/options/value_provider.py", line 124, in _f}}
{{ return fnc(self, *args, **kwargs)}}
{{ File "apache_beam/io/filebasedsource.py", line 176, in _validate}}
{{ match_result = FileSystems.match([pattern], limits=[1])[0]}}
{{ File "apache_beam/io/filesystems.py", line 159, in match}}
{{ return filesystem.match(patterns, limits)}}
{{ File "apache_beam/io/hadoopfilesystem.py", line 221, in match}}
{{ raise BeamIOError('Match operation failed', exceptions)}}
{{apache_beam.io.filesystem.BeamIOError: Match operation failed with exceptions 
\{'hdfs://kinglear.txt': KeyError('name',)}}}


> HDFS read broken
> 
>
> Key: BEAM-3965
> URL: https://issues.apache.org/jira/browse/BEAM-3965
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-py-core
>Reporter: Udi Meiri
>Assignee: Udi Meiri
>Priority: Major
>
> When running a command like:
> {noformat}
> python setup.py sdist > /dev/null && python -m apache_beam.examples.wordcount 
> --output gs://.../py-wordcount-output --hdfs_host ... --hdfs_port 50070 
> --hdfs_user ehudm --runner DataflowRunner --project ... --temp_location 
> gs://.../temp-hdfs-int --staging_location gs://.../staging-hdfs-int 
> --sdk_location dist/apache-beam-2.5.0.dev0.tar.gz --input hdfs://kinglear.txt
> {noformat}
> I get:
> {noformat}
> Traceback (most recent call last):
>   File "/usr/lib/python2.7/runpy.py", line 174, in _run_module_as_main
> "__main__", fname, loader, pkg_name)
>   File "/usr/lib/python2.7/runpy.py", line 72, in _run_code
> exec code in run_globals
>   File 
> "/usr/local/google/home/ehudm/src/beam/sdks/python/apache_beam/examples/wordcount.py",
>  line 136, in 
> run()
>   File 
> 

[jira] [Updated] (BEAM-3965) HDFS read broken

2018-03-28 Thread Udi Meiri (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3965?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Udi Meiri updated BEAM-3965:

Description: 
When running a command like:
{noformat}
python setup.py sdist > /dev/null && python -m apache_beam.examples.wordcount 
--output gs://.../py-wordcount-output \
  --hdfs_host ... --hdfs_port 50070 --hdfs_user ehudm --runner DataflowRunner 
--project ... \
  --temp_location gs://.../temp-hdfs-int --staging_location 
gs://.../staging-hdfs-int \
  --sdk_location dist/apache-beam-2.5.0.dev0.tar.gz --input hdfs://kinglear.txt
{noformat}
I get:
{noformat}
Traceback (most recent call last):
  File "/usr/lib/python2.7/runpy.py", line 174, in _run_module_as_main
"__main__", fname, loader, pkg_name)
  File "/usr/lib/python2.7/runpy.py", line 72, in _run_code
exec code in run_globals
  File 
"/usr/local/google/home/ehudm/src/beam/sdks/python/apache_beam/examples/wordcount.py",
 line 136, in 
run()
  File 
"/usr/local/google/home/ehudm/src/beam/sdks/python/apache_beam/examples/wordcount.py",
 line 90, in run
lines = p | 'read' >> ReadFromText(known_args.input)
  File "apache_beam/io/textio.py", line 522, in __init__
skip_header_lines=skip_header_lines)
  File "apache_beam/io/textio.py", line 117, in __init__
validate=validate)
  File "apache_beam/io/filebasedsource.py", line 119, in __init__
self._validate()
  File "apache_beam/options/value_provider.py", line 124, in _f
return fnc(self, *args, **kwargs)
  File "apache_beam/io/filebasedsource.py", line 176, in _validate
match_result = FileSystems.match([pattern], limits=[1])[0]
  File "apache_beam/io/filesystems.py", line 159, in match
return filesystem.match(patterns, limits)
  File "apache_beam/io/hadoopfilesystem.py", line 221, in match
raise BeamIOError('Match operation failed', exceptions)
apache_beam.io.filesystem.BeamIOError: Match operation failed with exceptions 
{'hdfs://kinglear.txt': KeyError('name',)}
{noformat}

  was:
When running a command like:

{noformat}
python setup.py sdist > /dev/null && python -m apache_beam.examples.wordcount 
--output gs://.../py-wordcount-output --hdfs_host ... --hdfs_port 50070 
--hdfs_user ehudm --runner DataflowRunner --project ... --temp_location 
gs://.../temp-hdfs-int --staging_location gs://.../staging-hdfs-int 
--sdk_location dist/apache-beam-2.5.0.dev0.tar.gz --input hdfs://kinglear.txt
{noformat}

I get:

{noformat}
Traceback (most recent call last):
  File "/usr/lib/python2.7/runpy.py", line 174, in _run_module_as_main
"__main__", fname, loader, pkg_name)
  File "/usr/lib/python2.7/runpy.py", line 72, in _run_code
exec code in run_globals
  File 
"/usr/local/google/home/ehudm/src/beam/sdks/python/apache_beam/examples/wordcount.py",
 line 136, in 
run()
  File 
"/usr/local/google/home/ehudm/src/beam/sdks/python/apache_beam/examples/wordcount.py",
 line 90, in run
lines = p | 'read' >> ReadFromText(known_args.input)
  File "apache_beam/io/textio.py", line 522, in __init__
skip_header_lines=skip_header_lines)
  File "apache_beam/io/textio.py", line 117, in __init__
validate=validate)
  File "apache_beam/io/filebasedsource.py", line 119, in __init__
self._validate()
  File "apache_beam/options/value_provider.py", line 124, in _f
return fnc(self, *args, **kwargs)
  File "apache_beam/io/filebasedsource.py", line 176, in _validate
match_result = FileSystems.match([pattern], limits=[1])[0]
  File "apache_beam/io/filesystems.py", line 159, in match
return filesystem.match(patterns, limits)
  File "apache_beam/io/hadoopfilesystem.py", line 221, in match
raise BeamIOError('Match operation failed', exceptions)
apache_beam.io.filesystem.BeamIOError: Match operation failed with exceptions 
{'hdfs://kinglear.txt': KeyError('name',)}
{noformat}


> HDFS read broken
> 
>
> Key: BEAM-3965
> URL: https://issues.apache.org/jira/browse/BEAM-3965
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-py-core
>Reporter: Udi Meiri
>Assignee: Udi Meiri
>Priority: Major
>
> When running a command like:
> {noformat}
> python setup.py sdist > /dev/null && python -m apache_beam.examples.wordcount 
> --output gs://.../py-wordcount-output \
>   --hdfs_host ... --hdfs_port 50070 --hdfs_user ehudm --runner DataflowRunner 
> --project ... \
>   --temp_location gs://.../temp-hdfs-int --staging_location 
> gs://.../staging-hdfs-int \
>   --sdk_location dist/apache-beam-2.5.0.dev0.tar.gz --input 
> hdfs://kinglear.txt
> {noformat}
> I get:
> {noformat}
> Traceback (most recent call last):
>   File "/usr/lib/python2.7/runpy.py", line 174, in _run_module_as_main
> "__main__", fname, loader, pkg_name)
>   File "/usr/lib/python2.7/runpy.py", line 72, in _run_code
> exec code in run_globals
>   File 
> 

[jira] [Updated] (BEAM-3965) HDFS read broken

2018-03-28 Thread Udi Meiri (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3965?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Udi Meiri updated BEAM-3965:

Description: 
When running a command like:

python setup.py sdist > /dev/null && python -m apache_beam.examples.wordcount 
--output gs://.../py-wordcount-output --hdfs_host ... --hdfs_port 50070 
--hdfs_user ehudm --runner DataflowRunner --project ... --temp_location 
gs://.../temp-hdfs-int --staging_location gs://.../staging-hdfs-int 
--sdk_location dist/apache-beam-2.5.0.dev0.tar.gz --input hdfs://kinglear.txt

I get:

{{Traceback (most recent call last):}}
{{ File "/usr/lib/python2.7/runpy.py", line 174, in _run_module_as_main}}
{{ "__main__", fname, loader, pkg_name)}}
{{ File "/usr/lib/python2.7/runpy.py", line 72, in _run_code}}
{{ exec code in run_globals}}
{{ File ".../beam/sdks/python/apache_beam/examples/wordcount.py", line 136, in 
}}
{{ run()}}
{{ File ".../beam/sdks/python/apache_beam/examples/wordcount.py", line 90, in 
run}}
{{ lines = p | 'read' >> ReadFromText(known_args.input)}}
{{ File "apache_beam/io/textio.py", line 522, in __init__}}
{{ skip_header_lines=skip_header_lines)}}
{{ File "apache_beam/io/textio.py", line 117, in __init__}}
{{ validate=validate)}}
{{ File "apache_beam/io/filebasedsource.py", line 119, in __init__}}
{{ self._validate()}}
{{ File "apache_beam/options/value_provider.py", line 124, in _f}}
{{ return fnc(self, *args, **kwargs)}}
{{ File "apache_beam/io/filebasedsource.py", line 176, in _validate}}
{{ match_result = FileSystems.match([pattern], limits=[1])[0]}}
{{ File "apache_beam/io/filesystems.py", line 159, in match}}
{{ return filesystem.match(patterns, limits)}}
{{ File "apache_beam/io/hadoopfilesystem.py", line 221, in match}}
{{ raise BeamIOError('Match operation failed', exceptions)}}
{{apache_beam.io.filesystem.BeamIOError: Match operation failed with exceptions 
\{'hdfs://kinglear.txt': KeyError('name',)}}}

  was:
When running a command like:

{{python setup.py sdist > /dev/null && python -m apache_beam.examples.wordcount 
--output gs://.../py-wordcount-output --hdfs_host ... --hdfs_port 50070 
--hdfs_user ehudm --runner DataflowRunner --project ... --temp_location 
gs://.../temp-hdfs-int --staging_location gs://.../staging-hdfs-int 
--sdk_location dist/apache-beam-2.5.0.dev0.tar.gz --input hdfs://kinglear.txt}}

I get:

{{Traceback (most recent call last):}}
{{ File "/usr/lib/python2.7/runpy.py", line 174, in _run_module_as_main}}
{{ "__main__", fname, loader, pkg_name)}}
{{ File "/usr/lib/python2.7/runpy.py", line 72, in _run_code}}
{{ exec code in run_globals}}
{{ File ".../beam/sdks/python/apache_beam/examples/wordcount.py", line 136, in 
}}
{{ run()}}
{{ File ".../beam/sdks/python/apache_beam/examples/wordcount.py", line 90, in 
run}}
{{ lines = p | 'read' >> ReadFromText(known_args.input)}}
{{ File "apache_beam/io/textio.py", line 522, in __init__}}
{{ skip_header_lines=skip_header_lines)}}
{{ File "apache_beam/io/textio.py", line 117, in __init__}}
{{ validate=validate)}}
{{ File "apache_beam/io/filebasedsource.py", line 119, in __init__}}
{{ self._validate()}}
{{ File "apache_beam/options/value_provider.py", line 124, in _f}}
{{ return fnc(self, *args, **kwargs)}}
{{ File "apache_beam/io/filebasedsource.py", line 176, in _validate}}
{{ match_result = FileSystems.match([pattern], limits=[1])[0]}}
{{ File "apache_beam/io/filesystems.py", line 159, in match}}
{{ return filesystem.match(patterns, limits)}}
{{ File "apache_beam/io/hadoopfilesystem.py", line 221, in match}}
{{ raise BeamIOError('Match operation failed', exceptions)}}
{{apache_beam.io.filesystem.BeamIOError: Match operation failed with exceptions 
\{'hdfs://kinglear.txt': KeyError('name',)}}}


> HDFS read broken
> 
>
> Key: BEAM-3965
> URL: https://issues.apache.org/jira/browse/BEAM-3965
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-py-core
>Reporter: Udi Meiri
>Assignee: Udi Meiri
>Priority: Major
>
> When running a command like:
> python setup.py sdist > /dev/null && python -m apache_beam.examples.wordcount 
> --output gs://.../py-wordcount-output --hdfs_host ... --hdfs_port 50070 
> --hdfs_user ehudm --runner DataflowRunner --project ... --temp_location 
> gs://.../temp-hdfs-int --staging_location gs://.../staging-hdfs-int 
> --sdk_location dist/apache-beam-2.5.0.dev0.tar.gz --input hdfs://kinglear.txt
> I get:
> {{Traceback (most recent call last):}}
> {{ File "/usr/lib/python2.7/runpy.py", line 174, in _run_module_as_main}}
> {{ "__main__", fname, loader, pkg_name)}}
> {{ File "/usr/lib/python2.7/runpy.py", line 72, in _run_code}}
> {{ exec code in run_globals}}
> {{ File ".../beam/sdks/python/apache_beam/examples/wordcount.py", line 136, 
> in }}
> {{ run()}}
> {{ File ".../beam/sdks/python/apache_beam/examples/wordcount.py", line 90, in 
> run}}
> {{ lines = p | 'read' >> 

[jira] [Created] (BEAM-3965) HDFS read broken

2018-03-28 Thread Udi Meiri (JIRA)
Udi Meiri created BEAM-3965:
---

 Summary: HDFS read broken
 Key: BEAM-3965
 URL: https://issues.apache.org/jira/browse/BEAM-3965
 Project: Beam
  Issue Type: Bug
  Components: sdk-py-core
Reporter: Udi Meiri
Assignee: Udi Meiri


When running a command like:

{{python setup.py sdist > /dev/null && python -m apache_beam.examples.wordcount 
--output gs://.../py-wordcount-output --hdfs_host ... --hdfs_port 50070 
--hdfs_user ehudm --runner DataflowRunner --project ... --temp_location 
gs://.../temp-hdfs-int --staging_location gs://.../staging-hdfs-int 
--sdk_location dist/apache-beam-2.5.0.dev0.tar.gz --input hdfs://kinglear.txt}}

I get:

Traceback (most recent call last):
  File "/usr/lib/python2.7/runpy.py", line 174, in _run_module_as_main
    "__main__", fname, loader, pkg_name)
  File "/usr/lib/python2.7/runpy.py", line 72, in _run_code
    exec code in run_globals
  File ".../beam/sdks/python/apache_beam/examples/wordcount.py", line 136, in <module>
    run()
  File ".../beam/sdks/python/apache_beam/examples/wordcount.py", line 90, in run
    lines = p | 'read' >> ReadFromText(known_args.input)
  File "apache_beam/io/textio.py", line 522, in __init__
    skip_header_lines=skip_header_lines)
  File "apache_beam/io/textio.py", line 117, in __init__
    validate=validate)
  File "apache_beam/io/filebasedsource.py", line 119, in __init__
    self._validate()
  File "apache_beam/options/value_provider.py", line 124, in _f
    return fnc(self, *args, **kwargs)
  File "apache_beam/io/filebasedsource.py", line 176, in _validate
    match_result = FileSystems.match([pattern], limits=[1])[0]
  File "apache_beam/io/filesystems.py", line 159, in match
    return filesystem.match(patterns, limits)
  File "apache_beam/io/hadoopfilesystem.py", line 221, in match
    raise BeamIOError('Match operation failed', exceptions)
apache_beam.io.filesystem.BeamIOError: Match operation failed with exceptions {'hdfs://kinglear.txt': KeyError('name',)}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
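The BeamIOError in the report above aggregates per-pattern failures into a single exception whose message carries a pattern-to-exception dict. A minimal self-contained sketch of that aggregation pattern (the `MatchError` class and `match` helper here are hypothetical stand-ins for Beam's `BeamIOError` and `hadoopfilesystem.match`, not the actual SDK code):

```python
class MatchError(Exception):
    """Stand-in for apache_beam.io.filesystem.BeamIOError: wraps per-pattern failures."""
    def __init__(self, msg, exceptions):
        super(MatchError, self).__init__('%s with exceptions %s' % (msg, exceptions))
        self.exceptions = exceptions  # dict mapping pattern -> exception

def match(patterns, statuses):
    """Resolve each pattern; collect failures instead of stopping at the first one."""
    results, failures = {}, {}
    for pattern in patterns:
        try:
            # A WebHDFS-style status dict missing the 'name' key raises KeyError,
            # mirroring the KeyError('name',) in the traceback above.
            results[pattern] = statuses[pattern]['name']
        except KeyError as e:
            failures[pattern] = e
    if failures:
        raise MatchError('Match operation failed', failures)
    return results
```

The payoff of this shape is that one bad pattern out of many still surfaces all successes and all failures to the caller in one pass.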


Jenkins build became unstable: beam_PostCommit_Java_ValidatesRunner_Spark #4522

2018-03-28 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PerformanceTests_TFRecordIOIT #305

2018-03-28 Thread Apache Jenkins Server
See 


Changes:

[mairbek] [BEAM-3932] Adds handling of array null values to 
MutationSizeEstimator

[lcwik] [BEAM-3326] Abstract away closing the inbound receiver, waiting for the

[aromanenko.dev] [BEAM-3819] Add withRequestRecordsLimit() option to KinesisIO

[altay] [BEAM-3738] Add more flake8 tests to run_pylint.sh

--
[...truncated 16.15 KB...]
Requirement already satisfied: google-auth-httplib2 in 
/home/jenkins/.local/lib/python2.7/site-packages (from 
google-cloud-core<0.26dev,>=0.25.0->google-cloud-pubsub==0.26.0->apache-beam==2.5.0.dev0)
Requirement already satisfied: google-gax<0.16dev,>=0.15.7 in 
/home/jenkins/.local/lib/python2.7/site-packages (from 
gapic-google-cloud-pubsub-v1<0.16dev,>=0.15.0->google-cloud-pubsub==0.26.0->apache-beam==2.5.0.dev0)
Requirement already satisfied: grpc-google-iam-v1<0.12dev,>=0.11.1 in 
/home/jenkins/.local/lib/python2.7/site-packages (from 
gapic-google-cloud-pubsub-v1<0.16dev,>=0.15.0->google-cloud-pubsub==0.26.0->apache-beam==2.5.0.dev0)
Requirement already satisfied: cachetools>=2.0.0 in 
/home/jenkins/.local/lib/python2.7/site-packages (from 
google-auth<2.0.0dev,>=0.4.0->google-cloud-core<0.26dev,>=0.25.0->google-cloud-pubsub==0.26.0->apache-beam==2.5.0.dev0)
Requirement already satisfied: future<0.17dev,>=0.16.0 in 
/home/jenkins/.local/lib/python2.7/site-packages (from 
google-gax<0.16dev,>=0.15.7->gapic-google-cloud-pubsub-v1<0.16dev,>=0.15.0->google-cloud-pubsub==0.26.0->apache-beam==2.5.0.dev0)
Requirement already satisfied: ply==3.8 in 
/home/jenkins/.local/lib/python2.7/site-packages (from 
google-gax<0.16dev,>=0.15.7->gapic-google-cloud-pubsub-v1<0.16dev,>=0.15.0->google-cloud-pubsub==0.26.0->apache-beam==2.5.0.dev0)
Installing collected packages: hdfs, pytz, nose, apache-beam
  Found existing installation: apache-beam 2.5.0.dev0
Not uninstalling apache-beam at 
/home/jenkins/jenkins-slave/workspace/beam_PerformanceTests_Python/src/sdks/python,
 outside environment 

  Running setup.py develop for apache-beam
Successfully installed apache-beam hdfs-2.1.0 nose-1.3.7 pytz-2018.3
[beam_PerformanceTests_TFRecordIOIT] $ /bin/bash -xe 
/tmp/jenkins2743706120351958333.sh
+ .env/bin/python PerfKitBenchmarker/pkb.py --project=apache-beam-testing 
--dpb_log_level=INFO --maven_binary=/home/jenkins/tools/maven/latest/bin/mvn 
--bigquery_table=beam_performance.tfrecordioit_pkb_results 
--temp_dir=
 --official=true --benchmarks=beam_integration_benchmark --beam_it_timeout=1200 
--beam_it_profile=io-it --beam_prebuilt=true --beam_sdk=java 
--beam_it_module=sdks/java/io/file-based-io-tests 
--beam_it_class=org.apache.beam.sdk.io.tfrecord.TFRecordIOIT 
'--beam_it_options=[--project=apache-beam-testing,--tempRoot=gs://temp-storage-for-perf-tests,--filenamePrefix=gs://temp-storage-for-perf-tests/beam_PerformanceTests_TFRecordIOIT/305/,--numberOfRecords=100]'
 '--beam_extra_mvn_properties=[filesystem=gcs]'
2018-03-28 19:09:29,023 e83bfa7a MainThread INFO Verbose logging to: 

2018-03-28 19:09:29,023 e83bfa7a MainThread INFO PerfKitBenchmarker 
version: v1.12.0-446-gfa9cdd1
2018-03-28 19:09:29,024 e83bfa7a MainThread INFO Flag values:
--beam_extra_mvn_properties=[filesystem=gcs]
--beam_it_class=org.apache.beam.sdk.io.tfrecord.TFRecordIOIT
--beam_it_timeout=1200
--beam_sdk=java
--temp_dir=
--maven_binary=/home/jenkins/tools/maven/latest/bin/mvn
--beam_it_options=[--project=apache-beam-testing,--tempRoot=gs://temp-storage-for-perf-tests,--filenamePrefix=gs://temp-storage-for-perf-tests/beam_PerformanceTests_TFRecordIOIT/305/,--numberOfRecords=100]
--beam_prebuilt
--project=apache-beam-testing
--bigquery_table=beam_performance.tfrecordioit_pkb_results
--official
--beam_it_module=sdks/java/io/file-based-io-tests
--dpb_log_level=INFO
--beam_it_profile=io-it
--benchmarks=beam_integration_benchmark
2018-03-28 19:09:29,487 e83bfa7a MainThread WARNING  The key "flags" was not in 
the default config, but was in user overrides. This may indicate a typo.
2018-03-28 19:09:29,487 e83bfa7a MainThread INFO Initializing the edw 
service decoder
2018-03-28 19:09:29,610 e83bfa7a MainThread beam_integration_benchmark(1/1) 
INFO Provisioning resources for benchmark beam_integration_benchmark
2018-03-28 19:09:29,612 e83bfa7a MainThread beam_integration_benchmark(1/1) 
INFO Preparing benchmark beam_integration_benchmark
2018-03-28 19:09:29,612 e83bfa7a MainThread beam_integration_benchmark(1/1) 
INFO Running: git clone https://github.com/apache/beam.git
2018-03-28 19:09:38,291 e83bfa7a MainThread 

[jira] [Work logged] (BEAM-3867) Add support for decompressing and reading tar files

2018-03-28 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3867?focusedWorklogId=85356&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-85356
 ]

ASF GitHub Bot logged work on BEAM-3867:


Author: ASF GitHub Bot
Created on: 28/Mar/18 19:27
Start Date: 28/Mar/18 19:27
Worklog Time Spent: 10m 
  Work Description: eachsaj closed pull request #4957: BEAM-3867 Add 
support for decompressing and reading tar files
URL: https://github.com/apache/beam/pull/4957
 
 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 85356)
Time Spent: 40m  (was: 0.5h)

> Add support for decompressing and reading tar files
> ---
>
> Key: BEAM-3867
> URL: https://issues.apache.org/jira/browse/BEAM-3867
> Project: Beam
>  Issue Type: New Feature
>  Components: sdk-java-core
>Reporter: Chamikara Jayalath
>Assignee: Sajeevan Achuthan
>Priority: Major
>  Time Spent: 40m
>  Remaining Estimate: 0h
>
> See below for a thread regarding this.
> https://lists.apache.org/thread.html/eb006101b9d3cb6bf88ec1f8f29e2aaec96ab5760fb7041336fa541a@%3Cdev.beam.apache.org%3E



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
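Whatever API shape BEAM-3867 eventually settles on, the underlying decompress-and-read loop is what Python's standard tarfile module already provides. A plain-stdlib sketch (not the proposed Beam API):

```python
import io
import tarfile

def read_tar_members(fileobj):
    """Yield (member_name, bytes) for each regular file in a tar stream.

    Mode 'r:*' lets tarfile transparently detect gzip/bz2 compression,
    so the same loop handles .tar, .tar.gz, and .tar.bz2 inputs.
    """
    with tarfile.open(fileobj=fileobj, mode='r:*') as tar:
        for member in tar:
            if member.isfile():
                yield member.name, tar.extractfile(member).read()
```

A Beam source for tar files would wrap a loop like this per matched file, emitting one element per member.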


Build failed in Jenkins: beam_PerformanceTests_JDBC #385

2018-03-28 Thread Apache Jenkins Server
See 


Changes:

[mairbek] [BEAM-3932] Adds handling of array null values to 
MutationSizeEstimator

[lcwik] [BEAM-3326] Abstract away closing the inbound receiver, waiting for the

[aromanenko.dev] [BEAM-3819] Add withRequestRecordsLimit() option to KinesisIO

[altay] [BEAM-3738] Add more flake8 tests to run_pylint.sh

--
[...truncated 21.74 KB...]
--dpb_log_level=INFO
--beam_options_config_file=
--beam_it_profile=io-it
--benchmarks=beam_integration_benchmark
2018-03-28 18:54:51,985 8d2ea224 MainThread WARNING  The key "flags" was not in 
the default config, but was in user overrides. This may indicate a typo.
2018-03-28 18:54:51,986 8d2ea224 MainThread INFO Initializing the edw 
service decoder
2018-03-28 18:54:52,112 8d2ea224 MainThread beam_integration_benchmark(1/1) 
INFO Provisioning resources for benchmark beam_integration_benchmark
2018-03-28 18:54:52,114 8d2ea224 MainThread beam_integration_benchmark(1/1) 
INFO Preparing benchmark beam_integration_benchmark
2018-03-28 18:54:52,114 8d2ea224 MainThread beam_integration_benchmark(1/1) 
INFO Running: git clone https://github.com/apache/beam.git
2018-03-28 18:54:59,516 8d2ea224 MainThread beam_integration_benchmark(1/1) 
INFO Running: kubectl 
--kubeconfig=
 create -f 

2018-03-28 18:55:00,097 8d2ea224 MainThread beam_integration_benchmark(1/1) 
INFO Running benchmark beam_integration_benchmark
2018-03-28 18:55:00,121 8d2ea224 MainThread beam_integration_benchmark(1/1) 
INFO Running: kubectl 
--kubeconfig=
 get svc postgres-for-dev -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-03-28 18:55:10,287 8d2ea224 MainThread beam_integration_benchmark(1/1) 
INFO Running: kubectl 
--kubeconfig=
 get svc postgres-for-dev -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-03-28 18:55:20,456 8d2ea224 MainThread beam_integration_benchmark(1/1) 
INFO Running: kubectl 
--kubeconfig=
 get svc postgres-for-dev -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-03-28 18:55:30,619 8d2ea224 MainThread beam_integration_benchmark(1/1) 
INFO Running: kubectl 
--kubeconfig=
 get svc postgres-for-dev -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-03-28 18:55:40,747 8d2ea224 MainThread beam_integration_benchmark(1/1) 
INFO Running: kubectl 
--kubeconfig=
 get svc postgres-for-dev -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-03-28 18:55:50,902 8d2ea224 MainThread beam_integration_benchmark(1/1) 
INFO Running: kubectl 
--kubeconfig=
 get svc postgres-for-dev -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-03-28 18:55:51,060 8d2ea224 MainThread beam_integration_benchmark(1/1) 
INFO Using LoadBalancer IP Address: 35.184.230.194
2018-03-28 18:55:51,067 8d2ea224 MainThread beam_integration_benchmark(1/1) 
INFO Running: /home/jenkins/tools/maven/latest/bin/mvn -e verify 
-Dit.test=org.apache.beam.sdk.io.jdbc.JdbcIOIT -DskipITs=false -pl 
sdks/java/io/jdbc -Pio-it -Pdataflow-runner 
-DintegrationTestPipelineOptions=["--tempRoot=gs://temp-storage-for-perf-tests","--project=apache-beam-testing","--postgresPort=5432","--numberOfRecords=500","--postgresServerName=35.184.230.194","--postgresUsername=postgres","--postgresPassword=uuinkks","--postgresDatabaseName=postgres","--postgresSsl=False","--runner=TestDataflowRunner"]
2018-03-28 19:25:51,073 8d2ea224 Thread-9 ERRORIssueCommand timed out after 
1800 seconds. Killing command "/home/jenkins/tools/maven/latest/bin/mvn -e 
verify -Dit.test=org.apache.beam.sdk.io.jdbc.JdbcIOIT -DskipITs=false -pl 
sdks/java/io/jdbc -Pio-it -Pdataflow-runner 
-DintegrationTestPipelineOptions=["--tempRoot=gs://temp-storage-for-perf-tests","--project=apache-beam-testing","--postgresPort=5432","--numberOfRecords=500","--postgresServerName=35.184.230.194","--postgresUsername=postgres","--postgresPassword=uuinkks","--postgresDatabaseName=postgres","--postgresSsl=False","--runner=TestDataflowRunner"]".
2018-03-28 19:25:51,125 8d2ea224 MainThread 

Build failed in Jenkins: beam_PerformanceTests_XmlIOIT #77

2018-03-28 Thread Apache Jenkins Server
See 


Changes:

[mairbek] [BEAM-3932] Adds handling of array null values to 
MutationSizeEstimator

[lcwik] [BEAM-3326] Abstract away closing the inbound receiver, waiting for the

[aromanenko.dev] [BEAM-3819] Add withRequestRecordsLimit() option to KinesisIO

[altay] [BEAM-3738] Add more flake8 tests to run_pylint.sh

--
[...truncated 27.71 KB...]
[INFO] Excluding commons-codec:commons-codec:jar:1.3 from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client-jackson2:jar:1.22.0 
from the shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-dataflow:jar:v1b3-rev221-1.22.0 from the 
shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-clouddebugger:jar:v2-rev8-1.22.0 from the 
shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-storage:jar:v1-rev71-1.22.0 from the shaded 
jar.
[INFO] Excluding com.google.auth:google-auth-library-credentials:jar:0.7.1 from 
the shaded jar.
[INFO] Excluding com.google.auth:google-auth-library-oauth2-http:jar:0.7.1 from 
the shaded jar.
[INFO] Excluding com.google.cloud.bigdataoss:util:jar:1.4.5 from the shaded jar.
[INFO] Excluding com.google.api-client:google-api-client-java6:jar:1.22.0 from 
the shaded jar.
[INFO] Excluding com.google.api-client:google-api-client-jackson2:jar:1.22.0 
from the shaded jar.
[INFO] Excluding com.google.oauth-client:google-oauth-client-java6:jar:1.22.0 
from the shaded jar.
[INFO] Excluding 
org.apache.beam:beam-sdks-java-io-google-cloud-platform:jar:2.5.0-SNAPSHOT from 
the shaded jar.
[INFO] Excluding 
org.apache.beam:beam-sdks-java-extensions-protobuf:jar:2.5.0-SNAPSHOT from the 
shaded jar.
[INFO] Excluding io.grpc:grpc-core:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.errorprone:error_prone_annotations:jar:2.0.15 from 
the shaded jar.
[INFO] Excluding io.grpc:grpc-context:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.instrumentation:instrumentation-api:jar:0.3.0 from 
the shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-bigquery:jar:v2-rev374-1.22.0 from the 
shaded jar.
[INFO] Excluding com.google.api:gax-grpc:jar:0.20.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-protobuf:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.api:api-common:jar:1.0.0-rc2 from the shaded jar.
[INFO] Excluding com.google.auto.value:auto-value:jar:1.5.3 from the shaded jar.
[INFO] Excluding com.google.api:gax:jar:1.3.1 from the shaded jar.
[INFO] Excluding org.threeten:threetenbp:jar:1.3.3 from the shaded jar.
[INFO] Excluding com.google.cloud:google-cloud-core-grpc:jar:1.2.0 from the 
shaded jar.
[INFO] Excluding com.google.protobuf:protobuf-java-util:jar:3.2.0 from the 
shaded jar.
[INFO] Excluding com.google.code.gson:gson:jar:2.7 from the shaded jar.
[INFO] Excluding com.google.apis:google-api-services-pubsub:jar:v1-rev10-1.22.0 
from the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-cloud-pubsub-v1:jar:0.1.18 
from the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-cloud-pubsub-v1:jar:0.1.18 
from the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-iam-v1:jar:0.1.18 from the 
shaded jar.
[INFO] Excluding com.google.cloud.datastore:datastore-v1-proto-client:jar:1.4.0 
from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client-protobuf:jar:1.22.0 
from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client-jackson:jar:1.22.0 
from the shaded jar.
[INFO] Excluding com.google.cloud.datastore:datastore-v1-protos:jar:1.3.0 from 
the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-common-protos:jar:0.1.9 from 
the shaded jar.
[INFO] Excluding io.grpc:grpc-auth:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-netty:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.netty:netty-codec-http2:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-codec-http:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-handler-proxy:jar:4.1.8.Final from the shaded 
jar.
[INFO] Excluding io.netty:netty-codec-socks:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-handler:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-buffer:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-common:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-transport:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-resolver:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-codec:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.grpc:grpc-stub:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-all:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-okhttp:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.squareup.okhttp:okhttp:jar:2.5.0 from the shaded jar.
[INFO] Excluding 

[jira] [Work logged] (BEAM-3867) Add support for decompressing and reading tar files

2018-03-28 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3867?focusedWorklogId=85355&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-85355
 ]

ASF GitHub Bot logged work on BEAM-3867:


Author: ASF GitHub Bot
Created on: 28/Mar/18 19:23
Start Date: 28/Mar/18 19:23
Worklog Time Spent: 10m 
  Work Description: eachsaj commented on issue #4957: BEAM-3867 Add support 
for decompressing and reading tar files
URL: https://github.com/apache/beam/pull/4957#issuecomment-377005855
 
 
   Hi Eugene
   
 Thanks, I noticed it. I will make another pull request.
   
   thanks
   Saj
   
   On 28 March 2018 at 02:50, Eugene Kirpichov 
   wrote:
   
   > Something's weird with the history here - all commits on this PR are
   > unrelated to it. Please make sure that your branch consists of the master
   > branch + your commit(s), with nothing else.
   >
   > —
   > You are receiving this because you authored the thread.
   > Reply to this email directly, view it on GitHub
   > , or mute
   > the thread
   > 

   > .
   >
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 85355)
Time Spent: 0.5h  (was: 20m)

> Add support for decompressing and reading tar files
> ---
>
> Key: BEAM-3867
> URL: https://issues.apache.org/jira/browse/BEAM-3867
> Project: Beam
>  Issue Type: New Feature
>  Components: sdk-java-core
>Reporter: Chamikara Jayalath
>Assignee: Sajeevan Achuthan
>Priority: Major
>  Time Spent: 0.5h
>  Remaining Estimate: 0h
>
> See below for a thread regarding this.
> https://lists.apache.org/thread.html/eb006101b9d3cb6bf88ec1f8f29e2aaec96ab5760fb7041336fa541a@%3Cdev.beam.apache.org%3E



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Build failed in Jenkins: beam_PerformanceTests_Spark #1522

2018-03-28 Thread Apache Jenkins Server
See 


Changes:

[mairbek] [BEAM-3932] Adds handling of array null values to 
MutationSizeEstimator

[lcwik] [BEAM-3326] Abstract away closing the inbound receiver, waiting for the

[aromanenko.dev] [BEAM-3819] Add withRequestRecordsLimit() option to KinesisIO

[altay] [BEAM-3738] Add more flake8 tests to run_pylint.sh

--
[...truncated 94.31 KB...]
'apache-beam-testing:bqjob_r65f2fd99cd18cf11_01626e0bc382_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: 
/usr/lib/google-cloud-sdk/platform/bq/third_party/oauth2client/contrib/gce.py:73:
 UserWarning: You have requested explicit scopes to be used with a GCE service 
account.
Using this argument will have no effect on the actual scopes for tokens
requested. These scopes are set at VM instance creation time and
can't be overridden in the request.

  warnings.warn(_SCOPES_WARNING)

2018-03-28 19:18:09,660 6cbecb05 MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.

2018-03-28 19:18:25,182 6cbecb05 MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-03-28 19:18:27,434 6cbecb05 MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 

  ReturnCode:1
STDOUT: Upload complete.
Waiting on bqjob_r264016cc95880533_01626e0c09a0_1 ... (0s) Current status: 
RUNNING 
 Waiting on bqjob_r264016cc95880533_01626e0c09a0_1 ... (0s) 
Current status: DONE   
BigQuery error in load operation: Error processing job
'apache-beam-testing:bqjob_r264016cc95880533_01626e0c09a0_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: 
/usr/lib/google-cloud-sdk/platform/bq/third_party/oauth2client/contrib/gce.py:73:
 UserWarning: You have requested explicit scopes to be used with a GCE service 
account.
Using this argument will have no effect on the actual scopes for tokens
requested. These scopes are set at VM instance creation time and
can't be overridden in the request.

  warnings.warn(_SCOPES_WARNING)

2018-03-28 19:18:27,435 6cbecb05 MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.

2018-03-28 19:18:50,992 6cbecb05 MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-03-28 19:18:53,330 6cbecb05 MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 

  ReturnCode:1
STDOUT: Upload complete.
Waiting on bqjob_rca7e34e24efc7fc_01626e0c6e88_1 ... (0s) Current status: 
RUNNING 
Waiting on bqjob_rca7e34e24efc7fc_01626e0c6e88_1 ... (0s) 
Current status: DONE   
BigQuery error in load operation: Error processing job
'apache-beam-testing:bqjob_rca7e34e24efc7fc_01626e0c6e88_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: 
/usr/lib/google-cloud-sdk/platform/bq/third_party/oauth2client/contrib/gce.py:73:
 UserWarning: You have requested explicit scopes to be used with a GCE service 
account.
Using this argument will have no effect on the actual scopes for tokens
requested. These scopes are set at VM instance creation time and
can't be overridden in the request.

  warnings.warn(_SCOPES_WARNING)

2018-03-28 19:18:53,330 6cbecb05 MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.

2018-03-28 19:19:19,461 6cbecb05 MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-03-28 19:19:25,807 6cbecb05 MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 

  ReturnCode:1
STDOUT: Upload complete.
Waiting on bqjob_r79000dfac2efcb6f_01626e0cddbc_1 ... (0s) Current status: 
RUNNING 
 Waiting on 
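The repeated load failure above ("Field timestamp has changed type from TIMESTAMP to FLOAT") is schema autodetection inferring FLOAT for an epoch-seconds field while the existing table column is TIMESTAMP. One common workaround (a sketch, not the pkb fix) is to serialize the field as a string BigQuery parses as a TIMESTAMP before writing the NDJSON:

```python
import datetime
import json

def to_bq_timestamp(epoch_seconds):
    """Render an epoch float as a UTC string that BigQuery parses as TIMESTAMP."""
    dt = datetime.datetime.utcfromtimestamp(epoch_seconds)
    return dt.strftime('%Y-%m-%d %H:%M:%S.%f UTC')

def prepare_row(row):
    """Copy a result row, converting the float 'timestamp' field for NDJSON export."""
    fixed = dict(row)
    fixed['timestamp'] = to_bq_timestamp(row['timestamp'])
    return json.dumps(fixed, sort_keys=True)
```

With the field emitted as a timestamp string, autodetect agrees with the existing column type instead of widening it to FLOAT.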

[jira] [Work logged] (BEAM-3706) Update CombinePayload to improved model for Portability

2018-03-28 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3706?focusedWorklogId=85353&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-85353
 ]

ASF GitHub Bot logged work on BEAM-3706:


Author: ASF GitHub Bot
Created on: 28/Mar/18 19:16
Start Date: 28/Mar/18 19:16
Worklog Time Spent: 10m 
  Work Description: youngoli commented on issue #4924: [BEAM-3706] Removing 
side inputs from CombinePayload proto.
URL: https://github.com/apache/beam/pull/4924#issuecomment-377003788
 
 
   So the previous changes were actually breaking the DirectRunner, because it 
would translate a combine and then immediately call getCombineFn on it. This 
worked fine when all combines were translated with CombineFns, but combines 
with side inputs are no longer given CombineFns (or even Payloads), so it broke. 
I fixed that issue by making those getters return Optional<> where appropriate, 
which should prevent further instances of assuming a CombineFn (or 
CombinePayload) is present when it isn't.
   
   @lukecwik
   This change was a bit involved so please take another look at this PR if you 
can.
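The Optional-returning getter described in the comment above can be sketched in miniature (hypothetical names; a Python stand-in for the Java Optional<> getters, not the actual Beam code):

```python
from typing import Any, Optional

class CombineTranslation:
    """Toy stand-in: a translated combine may or may not carry a CombineFn."""
    def __init__(self, combine_fn: Optional[Any] = None):
        self._combine_fn = combine_fn

    def get_combine_fn(self) -> Optional[Any]:
        # Returning an optional (None when absent) forces callers to handle the
        # combine-with-side-inputs case instead of assuming a CombineFn exists.
        return self._combine_fn
```

The design point is the same in either language: making absence part of the getter's type moves the "is a payload present?" check to compile/review time rather than a runtime failure inside the runner.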


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 85353)
Time Spent: 3h  (was: 2h 50m)

> Update CombinePayload to improved model for Portability
> ---
>
> Key: BEAM-3706
> URL: https://issues.apache.org/jira/browse/BEAM-3706
> Project: Beam
>  Issue Type: Sub-task
>  Components: beam-model
>Reporter: Daniel Oliveira
>Assignee: Daniel Oliveira
>Priority: Minor
>  Labels: portability
>  Time Spent: 3h
>  Remaining Estimate: 0h
>
> This will mean changing the proto definition in beam_runner_api, most likely 
> trimming out fields that are no longer necessary and adding any new ones that 
> could be useful. The majority of work will probably be in investigating if 
> some existing fields can actually be removed (SideInputs and Parameters for 
> example).



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Build failed in Jenkins: beam_PerformanceTests_TextIOIT #323

2018-03-28 Thread Apache Jenkins Server
See 


Changes:

[mairbek] [BEAM-3932] Adds handling of array null values to 
MutationSizeEstimator

[lcwik] [BEAM-3326] Abstract away closing the inbound receiver, waiting for the

[aromanenko.dev] [BEAM-3819] Add withRequestRecordsLimit() option to KinesisIO

[altay] [BEAM-3738] Add more flake8 tests to run_pylint.sh

--
[...truncated 25.81 KB...]
[INFO] Excluding commons-codec:commons-codec:jar:1.3 from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client-jackson2:jar:1.22.0 
from the shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-dataflow:jar:v1b3-rev221-1.22.0 from the 
shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-clouddebugger:jar:v2-rev8-1.22.0 from the 
shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-storage:jar:v1-rev71-1.22.0 from the shaded 
jar.
[INFO] Excluding com.google.auth:google-auth-library-credentials:jar:0.7.1 from 
the shaded jar.
[INFO] Excluding com.google.auth:google-auth-library-oauth2-http:jar:0.7.1 from 
the shaded jar.
[INFO] Excluding com.google.cloud.bigdataoss:util:jar:1.4.5 from the shaded jar.
[INFO] Excluding com.google.api-client:google-api-client-java6:jar:1.22.0 from 
the shaded jar.
[INFO] Excluding com.google.api-client:google-api-client-jackson2:jar:1.22.0 
from the shaded jar.
[INFO] Excluding com.google.oauth-client:google-oauth-client-java6:jar:1.22.0 
from the shaded jar.
[INFO] Excluding 
org.apache.beam:beam-sdks-java-io-google-cloud-platform:jar:2.5.0-SNAPSHOT from 
the shaded jar.
[INFO] Excluding 
org.apache.beam:beam-sdks-java-extensions-protobuf:jar:2.5.0-SNAPSHOT from the 
shaded jar.
[INFO] Excluding io.grpc:grpc-core:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.errorprone:error_prone_annotations:jar:2.0.15 from 
the shaded jar.
[INFO] Excluding io.grpc:grpc-context:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.instrumentation:instrumentation-api:jar:0.3.0 from 
the shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-bigquery:jar:v2-rev374-1.22.0 from the 
shaded jar.
[INFO] Excluding com.google.api:gax-grpc:jar:0.20.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-protobuf:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.api:api-common:jar:1.0.0-rc2 from the shaded jar.
[INFO] Excluding com.google.auto.value:auto-value:jar:1.5.3 from the shaded jar.
[INFO] Excluding com.google.api:gax:jar:1.3.1 from the shaded jar.
[INFO] Excluding org.threeten:threetenbp:jar:1.3.3 from the shaded jar.
[INFO] Excluding com.google.cloud:google-cloud-core-grpc:jar:1.2.0 from the 
shaded jar.
[INFO] Excluding com.google.protobuf:protobuf-java-util:jar:3.2.0 from the 
shaded jar.
[INFO] Excluding com.google.code.gson:gson:jar:2.7 from the shaded jar.
[INFO] Excluding com.google.apis:google-api-services-pubsub:jar:v1-rev10-1.22.0 
from the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-cloud-pubsub-v1:jar:0.1.18 
from the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-cloud-pubsub-v1:jar:0.1.18 
from the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-iam-v1:jar:0.1.18 from the 
shaded jar.
[INFO] Excluding com.google.cloud.datastore:datastore-v1-proto-client:jar:1.4.0 
from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client-protobuf:jar:1.22.0 
from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client-jackson:jar:1.22.0 
from the shaded jar.
[INFO] Excluding com.google.cloud.datastore:datastore-v1-protos:jar:1.3.0 from 
the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-common-protos:jar:0.1.9 from 
the shaded jar.
[INFO] Excluding io.grpc:grpc-auth:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-netty:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.netty:netty-codec-http2:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-codec-http:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-handler-proxy:jar:4.1.8.Final from the shaded 
jar.
[INFO] Excluding io.netty:netty-codec-socks:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-handler:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-buffer:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-common:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-transport:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-resolver:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-codec:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.grpc:grpc-stub:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-all:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-okhttp:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.squareup.okhttp:okhttp:jar:2.5.0 from the shaded jar.
[INFO] Excluding 

Build failed in Jenkins: beam_PerformanceTests_HadoopInputFormat #74

2018-03-28 Thread Apache Jenkins Server
See 


Changes:

[mairbek] [BEAM-3932] Adds handling of array null values to 
MutationSizeEstimator

[lcwik] [BEAM-3326] Abstract away closing the inbound receiver, waiting for the

[aromanenko.dev] [BEAM-3819] Add withRequestRecordsLimit() option to KinesisIO

[altay] [BEAM-3738] Add more flake8 tests to run_pylint.sh

--
[...truncated 81.97 KB...]
from the specified remote repositories:
  Nexus (http://repository.apache.org/snapshots, releases=false, 
snapshots=true),
  central (https://repo.maven.apache.org/maven2, releases=true, snapshots=false)

Downloading from central: 
https://repo.maven.apache.org/maven2/cascading/cascading-hadoop/2.6.3/cascading-hadoop-2.6.3.jar
[WARNING] Could not find artifact cascading:cascading-hadoop:jar:2.6.3 in 
central (https://repo.maven.apache.org/maven2)

Try downloading the file manually from the project website.

Then, install it using the command: 
mvn install:install-file -DgroupId=cascading -DartifactId=cascading-hadoop 
-Dversion=2.6.3 -Dpackaging=jar -Dfile=/path/to/file

Alternatively, if you host your own repository you can deploy the file there: 
mvn deploy:deploy-file -DgroupId=cascading -DartifactId=cascading-hadoop 
-Dversion=2.6.3 -Dpackaging=jar -Dfile=/path/to/file -Durl=[url] 
-DrepositoryId=[id]

Path to dependency: 
1) 
org.apache.beam:beam-sdks-java-io-hadoop-input-format:jar:2.5.0-SNAPSHOT
2) org.elasticsearch:elasticsearch-hadoop:jar:5.0.0
3) cascading:cascading-hadoop:jar:2.6.3


  cascading:cascading-hadoop:jar:2.6.3

from the specified remote repositories:
  Nexus (http://repository.apache.org/snapshots, releases=false, 
snapshots=true),
  central (https://repo.maven.apache.org/maven2, releases=true, snapshots=false)

Downloading from central: 
https://repo.maven.apache.org/maven2/cascading/cascading-local/2.6.3/cascading-local-2.6.3.jar
[WARNING] Could not find artifact cascading:cascading-local:jar:2.6.3 in 
central (https://repo.maven.apache.org/maven2)

Try downloading the file manually from the project website.

Then, install it using the command: 
mvn install:install-file -DgroupId=cascading -DartifactId=cascading-local 
-Dversion=2.6.3 -Dpackaging=jar -Dfile=/path/to/file

Alternatively, if you host your own repository you can deploy the file there: 
mvn deploy:deploy-file -DgroupId=cascading -DartifactId=cascading-local 
-Dversion=2.6.3 -Dpackaging=jar -Dfile=/path/to/file -Durl=[url] 
-DrepositoryId=[id]

Path to dependency: 
1) 
org.apache.beam:beam-sdks-java-io-hadoop-input-format:jar:2.5.0-SNAPSHOT
2) org.elasticsearch:elasticsearch-hadoop:jar:5.0.0
3) cascading:cascading-local:jar:2.6.3


  cascading:cascading-local:jar:2.6.3

from the specified remote repositories:
  Nexus (http://repository.apache.org/snapshots, releases=false, 
snapshots=true),
  central (https://repo.maven.apache.org/maven2, releases=true, snapshots=false)

[INFO] Adding ignore: module-info
[INFO] 
[INFO] --- maven-enforcer-plugin:3.0.0-M1:enforce (enforce-banned-dependencies) 
@ beam-sdks-java-io-hadoop-input-format ---
[INFO] 
[INFO] --- maven-remote-resources-plugin:1.5:process (process-resource-bundles) 
@ beam-sdks-java-io-hadoop-input-format ---
[INFO] 
[INFO] --- maven-resources-plugin:3.0.2:resources (default-resources) @ 
beam-sdks-java-io-hadoop-input-format ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory 

[INFO] Copying 3 resources
[INFO] 
[INFO] --- maven-compiler-plugin:3.7.0:compile (default-compile) @ 
beam-sdks-java-io-hadoop-input-format ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 2 source files to 

[INFO] 
:
 

 uses unchecked or unsafe operations.
[INFO] 
:
 Recompile with -Xlint:unchecked for details.
[INFO] 
[INFO] --- maven-resources-plugin:3.0.2:testResources (default-testResources) @ 
beam-sdks-java-io-hadoop-input-format ---
[INFO] Using 'UTF-8' encoding to 

Build failed in Jenkins: beam_PerformanceTests_Python #1078

2018-03-28 Thread Apache Jenkins Server
See 


Changes:

[mairbek] [BEAM-3932] Adds handling of array null values to 
MutationSizeEstimator

[lcwik] [BEAM-3326] Abstract away closing the inbound receiver, waiting for the

[aromanenko.dev] [BEAM-3819] Add withRequestRecordsLimit() option to KinesisIO

[altay] [BEAM-3738] Add more flake8 tests to run_pylint.sh

--
[...truncated 62.72 KB...]
[INFO] 
[INFO] --- maven-resources-plugin:3.0.2:copy-resources (copy-go-cmd-source) @ 
beam-sdks-go ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 6 resources
[INFO] 
[INFO] --- maven-assembly-plugin:3.1.0:single (export-go-pkg-sources) @ 
beam-sdks-go ---
[INFO] Reading assembly descriptor: descriptor.xml
[INFO] Building zip: 

[INFO] 
[INFO] --- maven-remote-resources-plugin:1.5:process (process-resource-bundles) 
@ beam-sdks-go ---
[INFO] 
[INFO] --- mvn-golang-wrapper:2.1.6:get (go-get-imports) @ beam-sdks-go ---
[INFO] Prepared command line : bin/go get google.golang.org/grpc 
golang.org/x/oauth2/google google.golang.org/api/storage/v1 
github.com/spf13/cobra cloud.google.com/go/bigquery 
google.golang.org/api/googleapi google.golang.org/api/dataflow/v1b3
[INFO] 
[INFO] --- mvn-golang-wrapper:2.1.6:build (go-build) @ beam-sdks-go ---
[INFO] Prepared command line : bin/go build -buildmode=default -o 

 github.com/apache/beam/sdks/go/cmd/beamctl
[INFO] The Result file has been successfully created : 

[INFO] 
[INFO] --- mvn-golang-wrapper:2.1.6:build (go-build-linux-amd64) @ beam-sdks-go 
---
[INFO] Prepared command line : bin/go build -buildmode=default -o 

 github.com/apache/beam/sdks/go/cmd/beamctl
[INFO] The Result file has been successfully created : 

[INFO] 
[INFO] --- maven-checkstyle-plugin:3.0.0:check (default) @ beam-sdks-go ---
[INFO] 
[INFO] --- mvn-golang-wrapper:2.1.6:test (go-test) @ beam-sdks-go ---
[INFO] Prepared command line : bin/go test ./...
[INFO] 
[INFO] -Exec.Out-
[INFO] ?github.com/apache/beam/sdks/go/cmd/beamctl  [no test files]
[INFO] ?github.com/apache/beam/sdks/go/cmd/beamctl/cmd  [no test files]
[INFO] ?github.com/apache/beam/sdks/go/cmd/specialize   [no test files]
[INFO] ?github.com/apache/beam/sdks/go/cmd/symtab   [no test files]
[INFO] ok   github.com/apache/beam/sdks/go/pkg/beam 0.029s
[INFO] ok   github.com/apache/beam/sdks/go/pkg/beam/artifact0.095s
[INFO] 
[ERROR] 
[ERROR] -Exec.Err-
[ERROR] # github.com/apache/beam/sdks/go/pkg/beam/util/gcsx
[ERROR] github.com/apache/beam/sdks/go/pkg/beam/util/gcsx/gcs.go:46:37: 
undefined: option.WithoutAuthentication
[ERROR] 
[INFO] 
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Beam :: Parent .. SUCCESS [  4.577 s]
[INFO] Apache Beam :: SDKs :: Java :: Build Tools . SUCCESS [  3.181 s]
[INFO] Apache Beam :: Model ... SUCCESS [  0.081 s]
[INFO] Apache Beam :: Model :: Pipeline ... SUCCESS [ 13.655 s]
[INFO] Apache Beam :: Model :: Job Management . SUCCESS [  3.795 s]
[INFO] Apache Beam :: Model :: Fn Execution ... SUCCESS [  8.007 s]
[INFO] Apache Beam :: SDKs  SUCCESS [  0.372 s]
[INFO] Apache Beam :: SDKs :: Go .. FAILURE [ 32.477 s]
[INFO] Apache Beam :: SDKs :: Go :: Container . SKIPPED
[INFO] Apache Beam :: SDKs :: Java  SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: Core  SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: Fn Execution  SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: Extensions .. SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: Extensions :: Google Cloud Platform Core 
SKIPPED
[INFO] Apache Beam :: Runners . SKIPPED
[INFO] Apache Beam :: Runners :: Core Construction Java ... SKIPPED
[INFO] Apache Beam :: Runners :: Core Java  SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: Harness . SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: Container ... SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: IO .. SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: IO :: Amazon Web Services SKIPPED
[INFO] 

Build failed in Jenkins: beam_PerformanceTests_Compressed_TextIOIT #307

2018-03-28 Thread Apache Jenkins Server
See 


Changes:

[mairbek] [BEAM-3932] Adds handling of array null values to 
MutationSizeEstimator

[lcwik] [BEAM-3326] Abstract away closing the inbound receiver, waiting for the

[aromanenko.dev] [BEAM-3819] Add withRequestRecordsLimit() option to KinesisIO

[altay] [BEAM-3738] Add more flake8 tests to run_pylint.sh

--
[...truncated 28.85 KB...]
[INFO] Excluding commons-codec:commons-codec:jar:1.3 from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client-jackson2:jar:1.22.0 
from the shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-dataflow:jar:v1b3-rev221-1.22.0 from the 
shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-clouddebugger:jar:v2-rev8-1.22.0 from the 
shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-storage:jar:v1-rev71-1.22.0 from the shaded 
jar.
[INFO] Excluding com.google.auth:google-auth-library-credentials:jar:0.7.1 from 
the shaded jar.
[INFO] Excluding com.google.auth:google-auth-library-oauth2-http:jar:0.7.1 from 
the shaded jar.
[INFO] Excluding com.google.cloud.bigdataoss:util:jar:1.4.5 from the shaded jar.
[INFO] Excluding com.google.api-client:google-api-client-java6:jar:1.22.0 from 
the shaded jar.
[INFO] Excluding com.google.api-client:google-api-client-jackson2:jar:1.22.0 
from the shaded jar.
[INFO] Excluding com.google.oauth-client:google-oauth-client-java6:jar:1.22.0 
from the shaded jar.
[INFO] Excluding 
org.apache.beam:beam-sdks-java-io-google-cloud-platform:jar:2.5.0-SNAPSHOT from 
the shaded jar.
[INFO] Excluding 
org.apache.beam:beam-sdks-java-extensions-protobuf:jar:2.5.0-SNAPSHOT from the 
shaded jar.
[INFO] Excluding io.grpc:grpc-core:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.errorprone:error_prone_annotations:jar:2.0.15 from 
the shaded jar.
[INFO] Excluding io.grpc:grpc-context:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.instrumentation:instrumentation-api:jar:0.3.0 from 
the shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-bigquery:jar:v2-rev374-1.22.0 from the 
shaded jar.
[INFO] Excluding com.google.api:gax-grpc:jar:0.20.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-protobuf:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.api:api-common:jar:1.0.0-rc2 from the shaded jar.
[INFO] Excluding com.google.auto.value:auto-value:jar:1.5.3 from the shaded jar.
[INFO] Excluding com.google.api:gax:jar:1.3.1 from the shaded jar.
[INFO] Excluding org.threeten:threetenbp:jar:1.3.3 from the shaded jar.
[INFO] Excluding com.google.cloud:google-cloud-core-grpc:jar:1.2.0 from the 
shaded jar.
[INFO] Excluding com.google.protobuf:protobuf-java-util:jar:3.2.0 from the 
shaded jar.
[INFO] Excluding com.google.code.gson:gson:jar:2.7 from the shaded jar.
[INFO] Excluding com.google.apis:google-api-services-pubsub:jar:v1-rev10-1.22.0 
from the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-cloud-pubsub-v1:jar:0.1.18 
from the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-cloud-pubsub-v1:jar:0.1.18 
from the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-iam-v1:jar:0.1.18 from the 
shaded jar.
[INFO] Excluding com.google.cloud.datastore:datastore-v1-proto-client:jar:1.4.0 
from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client-protobuf:jar:1.22.0 
from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client-jackson:jar:1.22.0 
from the shaded jar.
[INFO] Excluding com.google.cloud.datastore:datastore-v1-protos:jar:1.3.0 from 
the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-common-protos:jar:0.1.9 from 
the shaded jar.
[INFO] Excluding io.grpc:grpc-auth:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-netty:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.netty:netty-codec-http2:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-codec-http:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-handler-proxy:jar:4.1.8.Final from the shaded 
jar.
[INFO] Excluding io.netty:netty-codec-socks:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-handler:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-buffer:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-common:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-transport:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-resolver:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-codec:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.grpc:grpc-stub:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-all:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-okhttp:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.squareup.okhttp:okhttp:jar:2.5.0 from the shaded jar.
[INFO] 

Build failed in Jenkins: beam_PerformanceTests_AvroIOIT #309

2018-03-28 Thread Apache Jenkins Server
See 


Changes:

[mairbek] [BEAM-3932] Adds handling of array null values to 
MutationSizeEstimator

[lcwik] [BEAM-3326] Abstract away closing the inbound receiver, waiting for the

[aromanenko.dev] [BEAM-3819] Add withRequestRecordsLimit() option to KinesisIO

[altay] [BEAM-3738] Add more flake8 tests to run_pylint.sh

--
[...truncated 709.49 KB...]
[INFO] Excluding commons-codec:commons-codec:jar:1.3 from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client-jackson2:jar:1.22.0 
from the shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-dataflow:jar:v1b3-rev221-1.22.0 from the 
shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-clouddebugger:jar:v2-rev8-1.22.0 from the 
shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-storage:jar:v1-rev71-1.22.0 from the shaded 
jar.
[INFO] Excluding com.google.auth:google-auth-library-credentials:jar:0.7.1 from 
the shaded jar.
[INFO] Excluding com.google.auth:google-auth-library-oauth2-http:jar:0.7.1 from 
the shaded jar.
[INFO] Excluding com.google.cloud.bigdataoss:util:jar:1.4.5 from the shaded jar.
[INFO] Excluding com.google.api-client:google-api-client-java6:jar:1.22.0 from 
the shaded jar.
[INFO] Excluding com.google.api-client:google-api-client-jackson2:jar:1.22.0 
from the shaded jar.
[INFO] Excluding com.google.oauth-client:google-oauth-client-java6:jar:1.22.0 
from the shaded jar.
[INFO] Excluding 
org.apache.beam:beam-sdks-java-io-google-cloud-platform:jar:2.5.0-SNAPSHOT from 
the shaded jar.
[INFO] Excluding 
org.apache.beam:beam-sdks-java-extensions-protobuf:jar:2.5.0-SNAPSHOT from the 
shaded jar.
[INFO] Excluding io.grpc:grpc-core:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.errorprone:error_prone_annotations:jar:2.0.15 from 
the shaded jar.
[INFO] Excluding io.grpc:grpc-context:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.instrumentation:instrumentation-api:jar:0.3.0 from 
the shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-bigquery:jar:v2-rev374-1.22.0 from the 
shaded jar.
[INFO] Excluding com.google.api:gax-grpc:jar:0.20.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-protobuf:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.api:api-common:jar:1.0.0-rc2 from the shaded jar.
[INFO] Excluding com.google.auto.value:auto-value:jar:1.5.3 from the shaded jar.
[INFO] Excluding com.google.api:gax:jar:1.3.1 from the shaded jar.
[INFO] Excluding org.threeten:threetenbp:jar:1.3.3 from the shaded jar.
[INFO] Excluding com.google.cloud:google-cloud-core-grpc:jar:1.2.0 from the 
shaded jar.
[INFO] Excluding com.google.protobuf:protobuf-java-util:jar:3.2.0 from the 
shaded jar.
[INFO] Excluding com.google.code.gson:gson:jar:2.7 from the shaded jar.
[INFO] Excluding com.google.apis:google-api-services-pubsub:jar:v1-rev10-1.22.0 
from the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-cloud-pubsub-v1:jar:0.1.18 
from the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-cloud-pubsub-v1:jar:0.1.18 
from the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-iam-v1:jar:0.1.18 from the 
shaded jar.
[INFO] Excluding com.google.cloud.datastore:datastore-v1-proto-client:jar:1.4.0 
from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client-protobuf:jar:1.22.0 
from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client-jackson:jar:1.22.0 
from the shaded jar.
[INFO] Excluding com.google.cloud.datastore:datastore-v1-protos:jar:1.3.0 from 
the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-common-protos:jar:0.1.9 from 
the shaded jar.
[INFO] Excluding io.grpc:grpc-auth:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-netty:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.netty:netty-codec-http2:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-codec-http:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-handler-proxy:jar:4.1.8.Final from the shaded 
jar.
[INFO] Excluding io.netty:netty-codec-socks:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-handler:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-buffer:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-common:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-transport:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-resolver:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-codec:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.grpc:grpc-stub:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-all:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-okhttp:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.squareup.okhttp:okhttp:jar:2.5.0 from the shaded jar.
[INFO] Excluding 

[jira] [Work logged] (BEAM-3326) Execute a Stage via the portability framework in the ReferenceRunner

2018-03-28 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3326?focusedWorklogId=85348&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-85348
 ]

ASF GitHub Bot logged work on BEAM-3326:


Author: ASF GitHub Bot
Created on: 28/Mar/18 18:40
Start Date: 28/Mar/18 18:40
Worklog Time Spent: 10m 
  Work Description: lukecwik opened a new pull request #4970: [BEAM-3326] 
Address additional comments from PR/4963.
URL: https://github.com/apache/beam/pull/4970
 
 
   
   
   
   Follow this checklist to help us incorporate your contribution quickly and 
easily:
   
- [ ] Make sure there is a [JIRA 
issue](https://issues.apache.org/jira/projects/BEAM/issues/) filed for the 
change (usually before you start working on it).  Trivial changes like typos do 
not require a JIRA issue.  Your pull request should address just this issue, 
without pulling in other changes.
- [ ] Format the pull request title like `[BEAM-XXX] Fixes bug in 
ApproximateQuantiles`, where you replace `BEAM-XXX` with the appropriate JIRA 
issue.
- [ ] Write a pull request description that is detailed enough to 
understand:
  - [ ] What the pull request does
  - [ ] Why it does it
  - [ ] How it does it
  - [ ] Why this approach
- [ ] Each commit in the pull request should have a meaningful subject line 
and body.
- [ ] Run `mvn clean verify` to make sure basic checks pass. A more 
thorough check will be performed on your pull request automatically.
- [ ] If this contribution is large, please file an Apache [Individual 
Contributor License Agreement](https://www.apache.org/licenses/icla.pdf).
   
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 85348)
Time Spent: 5h 10m  (was: 5h)

> Execute a Stage via the portability framework in the ReferenceRunner
> 
>
> Key: BEAM-3326
> URL: https://issues.apache.org/jira/browse/BEAM-3326
> Project: Beam
>  Issue Type: New Feature
>  Components: runner-core
>Reporter: Thomas Groh
>Assignee: Thomas Groh
>Priority: Major
>  Labels: portability
>  Time Spent: 5h 10m
>  Remaining Estimate: 0h
>
> This is the supertask for remote execution in the Universal Local Runner 
> (BEAM-2899).
> This executes a stage remotely via portability framework APIs



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Work logged] (BEAM-3326) Execute a Stage via the portability framework in the ReferenceRunner

2018-03-28 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3326?focusedWorklogId=85349&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-85349
 ]

ASF GitHub Bot logged work on BEAM-3326:


Author: ASF GitHub Bot
Created on: 28/Mar/18 18:40
Start Date: 28/Mar/18 18:40
Worklog Time Spent: 10m 
  Work Description: lukecwik commented on issue #4970: [BEAM-3326] Address 
additional comments from PR/4963.
URL: https://github.com/apache/beam/pull/4970#issuecomment-376992745
 
 
   CC: @bsidhom 




Issue Time Tracking
---

Worklog Id: (was: 85349)
Time Spent: 5h 20m  (was: 5h 10m)

> Execute a Stage via the portability framework in the ReferenceRunner
> 
>
> Key: BEAM-3326
> URL: https://issues.apache.org/jira/browse/BEAM-3326
> Project: Beam
>  Issue Type: New Feature
>  Components: runner-core
>Reporter: Thomas Groh
>Assignee: Thomas Groh
>Priority: Major
>  Labels: portability
>  Time Spent: 5h 20m
>  Remaining Estimate: 0h
>
> This is the supertask for remote execution in the Universal Local Runner 
> (BEAM-2899).
> This executes a stage remotely via portability framework APIs





[jira] [Work logged] (BEAM-3326) Execute a Stage via the portability framework in the ReferenceRunner

2018-03-28 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3326?focusedWorklogId=85347&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-85347
 ]

ASF GitHub Bot logged work on BEAM-3326:


Author: ASF GitHub Bot
Created on: 28/Mar/18 18:35
Start Date: 28/Mar/18 18:35
Worklog Time Spent: 10m 
  Work Description: lukecwik commented on a change in pull request #4963: 
[BEAM-3326] Abstract away closing the inbound receiver, waiting for the bundle 
to finish, waiting for outbound to complete within the ActiveBundle.
URL: https://github.com/apache/beam/pull/4963#discussion_r177849543
 
 

 ##
 File path: 
runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/control/SdkHarnessClient.java
 ##
 @@ -146,22 +154,92 @@ private BundleProcessor(
   }
 
   /** An active bundle for a particular {@link 
BeamFnApi.ProcessBundleDescriptor}. */
-  @AutoValue
-  public abstract static class ActiveBundle {
-public abstract String getBundleId();
-
-public abstract CompletionStage 
getBundleResponse();
+  public static class ActiveBundle implements AutoCloseable {
+private final String bundleId;
+private final CompletionStage response;
+private final CloseableFnDataReceiver inputReceiver;
+private final Map outputClients;
 
-public abstract CloseableFnDataReceiver 
getInputReceiver();
-public abstract Map 
getOutputClients();
-
-public static  ActiveBundle create(
+private ActiveBundle(
 String bundleId,
 CompletionStage response,
-CloseableFnDataReceiver dataReceiver,
+CloseableFnDataReceiver inputReceiver,
 Map outputClients) {
-  return new AutoValue_SdkHarnessClient_ActiveBundle<>(
-  bundleId, response, dataReceiver, outputClients);
+  this.bundleId = bundleId;
+  this.response = response;
+  this.inputReceiver = inputReceiver;
+  this.outputClients = outputClients;
+}
+
+/**
+ * Returns an id used to represent this bundle.
+ */
+public String getBundleId() {
+  return bundleId;
+}
+
+/**
+ * Returns a {@link FnDataReceiver receiver} which consumes input elements 
forwarding them
+ * to the SDK. When
+ */
+public FnDataReceiver getInputReceiver() {
+  return inputReceiver;
+}
+
+/**
+ * Blocks till bundle processing is finished. This is comprised of:
+ * 
+ *   closing the {@link #getInputReceiver() input receiver}.
+ *   waiting for the SDK to say that processing the bundle is 
finished.
+ *   waiting for all inbound data clients to complete
+ * 
+ *
+ * This method will throw an exception if bundle processing has failed.
+ * {@link Throwable#getSuppressed()} will return all the reasons as to why 
processing has
+ * failed.
+ */
+@Override
+public void close() throws Exception {
+  Exception exception = null;
+  try {
+inputReceiver.close();
+  } catch (Exception e) {
+exception = e;
+  }
+  try {
+// We don't have to worry about the completion stage.
+if (exception == null) {
+  MoreFutures.get(response);
+} else {
+  // TODO: Handle aborting the bundle being processed.
+  throw new IllegalStateException("Processing bundle failed, TODO: 
abort bundle.");
+}
+  } catch (Exception e) {
+if (exception == null) {
+  exception = e;
+} else {
+  exception.addSuppressed(e);
+}
+  }
+  for (InboundDataClient outputClient : outputClients.values()) {
+try {
+  // If we failed processing this bundle, we should cancel all inbound 
data.
+  if (exception == null) {
+outputClient.awaitCompletion();
 
 Review comment:
   We await completion on all outbound clients.
   
   No, the bundle result represents to the best knowledge that the SDK did 
everything it needed to. There can still be wire transfer/network/decoding 
failures that the SDK wouldn't be aware of after sending bundle process 
completion.




Issue Time Tracking
---

Worklog Id: (was: 85347)
Time Spent: 5h  (was: 4h 50m)

> Execute a Stage via the portability framework in the ReferenceRunner
> 
>
> Key: BEAM-3326
> URL: 

Build failed in Jenkins: beam_PostCommit_Python_Verify #4525

2018-03-28 Thread Apache Jenkins Server
See 


Changes:

[mairbek] [BEAM-3932] Adds handling of array null values to 
MutationSizeEstimator

[aromanenko.dev] [BEAM-3819] Add withRequestRecordsLimit() option to KinesisIO

[altay] [BEAM-3738] Add more flake8 tests to run_pylint.sh

--
[...truncated 260.97 KB...]
Creating file target/docs/source/apache_beam.io.gcp.pubsub.rst.
Creating file target/docs/source/apache_beam.io.gcp.rst.
Creating file target/docs/source/apache_beam.io.gcp.datastore.rst.
Creating file 
target/docs/source/apache_beam.io.gcp.datastore.v1.adaptive_throttler.rst.
Creating file 
target/docs/source/apache_beam.io.gcp.datastore.v1.datastoreio.rst.
Creating file 
target/docs/source/apache_beam.io.gcp.datastore.v1.fake_datastore.rst.
Creating file target/docs/source/apache_beam.io.gcp.datastore.v1.helper.rst.
Creating file 
target/docs/source/apache_beam.io.gcp.datastore.v1.query_splitter.rst.
Creating file target/docs/source/apache_beam.io.gcp.datastore.v1.util.rst.
Creating file target/docs/source/apache_beam.io.gcp.datastore.v1.rst.
Creating file target/docs/source/apache_beam.metrics.cells.rst.
Creating file target/docs/source/apache_beam.metrics.metric.rst.
Creating file target/docs/source/apache_beam.metrics.metricbase.rst.
Creating file target/docs/source/apache_beam.metrics.rst.
Creating file target/docs/source/apache_beam.options.pipeline_options.rst.
Creating file 
target/docs/source/apache_beam.options.pipeline_options_validator.rst.
Creating file target/docs/source/apache_beam.options.value_provider.rst.
Creating file target/docs/source/apache_beam.options.rst.
Creating file target/docs/source/apache_beam.portability.common_urns.rst.
Creating file target/docs/source/apache_beam.portability.python_urns.rst.
Creating file target/docs/source/apache_beam.portability.rst.
Creating file 
target/docs/source/apache_beam.portability.api.beam_artifact_api_pb2_grpc.rst.
Creating file 
target/docs/source/apache_beam.portability.api.beam_fn_api_pb2_grpc.rst.
Creating file 
target/docs/source/apache_beam.portability.api.beam_job_api_pb2_grpc.rst.
Creating file 
target/docs/source/apache_beam.portability.api.beam_provision_api_pb2_grpc.rst.
Creating file 
target/docs/source/apache_beam.portability.api.beam_runner_api_pb2_grpc.rst.
Creating file 
target/docs/source/apache_beam.portability.api.endpoints_pb2_grpc.rst.
Creating file 
target/docs/source/apache_beam.portability.api.standard_window_fns_pb2_grpc.rst.
Creating file target/docs/source/apache_beam.portability.api.rst.
Creating file target/docs/source/apache_beam.runners.pipeline_context.rst.
Creating file target/docs/source/apache_beam.runners.runner.rst.
Creating file target/docs/source/apache_beam.runners.sdf_common.rst.
Creating file target/docs/source/apache_beam.runners.rst.
Creating file 
target/docs/source/apache_beam.runners.dataflow.dataflow_metrics.rst.
Creating file 
target/docs/source/apache_beam.runners.dataflow.dataflow_runner.rst.
Creating file 
target/docs/source/apache_beam.runners.dataflow.ptransform_overrides.rst.
Creating file 
target/docs/source/apache_beam.runners.dataflow.test_dataflow_runner.rst.
Creating file target/docs/source/apache_beam.runners.dataflow.rst.
Creating file 
target/docs/source/apache_beam.runners.dataflow.native_io.iobase.rst.
Creating file 
target/docs/source/apache_beam.runners.dataflow.native_io.streaming_create.rst.
Creating file target/docs/source/apache_beam.runners.dataflow.native_io.rst.
Creating file target/docs/source/apache_beam.runners.direct.bundle_factory.rst.
Creating file target/docs/source/apache_beam.runners.direct.clock.rst.
Creating file 
target/docs/source/apache_beam.runners.direct.consumer_tracking_pipeline_visitor.rst.
Creating file target/docs/source/apache_beam.runners.direct.direct_metrics.rst.
Creating file target/docs/source/apache_beam.runners.direct.direct_runner.rst.
Creating file 
target/docs/source/apache_beam.runners.direct.evaluation_context.rst.
Creating file target/docs/source/apache_beam.runners.direct.executor.rst.
Creating file 
target/docs/source/apache_beam.runners.direct.helper_transforms.rst.
Creating file 
target/docs/source/apache_beam.runners.direct.sdf_direct_runner.rst.
Creating file 
target/docs/source/apache_beam.runners.direct.transform_evaluator.rst.
Creating file target/docs/source/apache_beam.runners.direct.util.rst.
Creating file 
target/docs/source/apache_beam.runners.direct.watermark_manager.rst.
Creating file target/docs/source/apache_beam.runners.direct.rst.
Creating file target/docs/source/apache_beam.runners.experimental.rst.
Creating file 
target/docs/source/apache_beam.runners.experimental.python_rpc_direct.python_rpc_direct_runner.rst.
Creating file 
target/docs/source/apache_beam.runners.experimental.python_rpc_direct.server.rst.
Creating file 
target/docs/source/apache_beam.runners.experimental.python_rpc_direct.rst.
Creating file 

[jira] [Work logged] (BEAM-3932) Spanner MutationSizeEstimator doesn't handle null array values

2018-03-28 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3932?focusedWorklogId=85345&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-85345
 ]

ASF GitHub Bot logged work on BEAM-3932:


Author: ASF GitHub Bot
Created on: 28/Mar/18 18:19
Start Date: 28/Mar/18 18:19
Worklog Time Spent: 10m 
  Work Description: mairbek commented on issue #4951: [BEAM-3932] Adds 
handling of array null values to MutationSizeEstimator
URL: https://github.com/apache/beam/pull/4951#issuecomment-376985764
 
 
   Awesome! Thank you, Ismaël




Issue Time Tracking
---

Worklog Id: (was: 85345)
Time Spent: 1h 10m  (was: 1h)

> Spanner MutationSizeEstimator doesn't handle null array values
> --
>
> Key: BEAM-3932
> URL: https://issues.apache.org/jira/browse/BEAM-3932
> Project: Beam
>  Issue Type: Bug
>  Components: io-java-gcp
>Affects Versions: 2.3.0
>Reporter: Mairbek Khadikov
>Assignee: Mairbek Khadikov
>Priority: Major
> Fix For: 2.5.0
>
>  Time Spent: 1h 10m
>  Remaining Estimate: 0h
>
> More details here 
> https://stackoverflow.com/questions/49484770/dataflow-spanner-api-throws-illegalstateexception-when-value-of-array-type-colum





[jira] [Commented] (BEAM-3964) Java Github HEAD doesn't run on Dataflow

2018-03-28 Thread Thomas Groh (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-3964?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16417874#comment-16417874
 ] 

Thomas Groh commented on BEAM-3964:
---

[https://github.com/apache/beam/commit/a76734237124777130d093613b6c4c2b54756e87#diff-75b946144e5e5ab0060a36a5de3c8c77]
 is the cause

> Java Github HEAD doesn't run on Dataflow
> 
>
> Key: BEAM-3964
> URL: https://issues.apache.org/jira/browse/BEAM-3964
> Project: Beam
>  Issue Type: Bug
>  Components: runner-dataflow
>Reporter: Thomas Groh
>Assignee: Thomas Groh
>Priority: Major
>
> Rather: it runs, but cannot make progress, seemingly due to an Error with 
> metric names:
>  
> exception: "java.lang.NoSuchMethodError: 
> org.apache.beam.sdk.metrics.MetricName.name()Ljava/lang/String; at 
> com.google.cloud.dataflow.worker.MetricsToCounterUpdateConverter.structuredNameAndMetadata(MetricsToCounterUpdateConverter.java:99)
>  at 
> com.google.cloud.dataflow.worker.MetricsToCounterUpdateConverter.fromCounter(MetricsToCounterUpdateConverter.java:68)
>  at 
> com.google.cloud.dataflow.worker.BatchModeExecutionContext.lambda$null$1(BatchModeExecutionContext.java:463)
>  at 
> com.google.cloud.dataflow.worker.repackaged.com.google.common.collect.Iterators$7.transform(Iterators.java:750)
>  at 
> com.google.cloud.dataflow.worker.repackaged.com.google.common.collect.TransformedIterator.next(TransformedIterator.java:47)
>  at 
> com.google.cloud.dataflow.worker.repackaged.com.google.common.collect.MultitransformedIterator.next(MultitransformedIterator.java:66)
>  at 
> com.google.cloud.dataflow.worker.repackaged.com.google.common.collect.ImmutableCollection$Builder.addAll(ImmutableCollection.java:388)
>  at 
> com.google.cloud.dataflow.worker.repackaged.com.google.common.collect.ImmutableCollection$ArrayBasedBuilder.addAll(ImmutableCollection.java:472)
>  at 
> com.google.cloud.dataflow.worker.repackaged.com.google.common.collect.ImmutableList$Builder.addAll(ImmutableList.java:669)
>  at 
> com.google.cloud.dataflow.worker.WorkItemStatusClient.populateCounterUpdates(WorkItemStatusClient.java:256)
>  at 
> com.google.cloud.dataflow.worker.WorkItemStatusClient.createStatusUpdate(WorkItemStatusClient.java:240)
>  at 
> com.google.cloud.dataflow.worker.WorkItemStatusClient.reportError(WorkItemStatusClient.java:94)
>  at 
> com.google.cloud.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:358)
>  at 
> com.google.cloud.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:284)
>  at 
> com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:134)
>  at 
> com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:114)
>  at 
> com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:101)
>  at java.util.concurrent.FutureTask.run(FutureTask.java:266) at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>  at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>  at java.lang.Thread.run(Thread.java:745)
>  
> Probably requires an updated container version.



--


[jira] [Work logged] (BEAM-3213) Add a performance test for MongoDBIO

2018-03-28 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3213?focusedWorklogId=85344=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-85344
 ]

ASF GitHub Bot logged work on BEAM-3213:


Author: ASF GitHub Bot
Created on: 28/Mar/18 18:13
Start Date: 28/Mar/18 18:13
Worklog Time Spent: 10m 
  Work Description: chamikaramj commented on issue #4859: [BEAM-3213] 
MongodbIO performance test
URL: https://github.com/apache/beam/pull/4859#issuecomment-376984215
 
 
   Run Java PreCommit


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 85344)
Time Spent: 1h  (was: 50m)

> Add a performance test for MongoDBIO
> 
>
> Key: BEAM-3213
> URL: https://issues.apache.org/jira/browse/BEAM-3213
> Project: Beam
>  Issue Type: Test
>  Components: io-java-mongodb
>Reporter: Chamikara Jayalath
>Assignee: Łukasz Gajowy
>Priority: Major
>  Time Spent: 1h
>  Remaining Estimate: 0h
>
> We should add a large scale performance test for MongoDBIO. We could use 
> PerfKitBenchmarker based performance testing framework [1] to manage a 
> Kubernetes based multi-node MongoDB cluster and to publish benchmark results.
> Example docker image to use: https://hub.docker.com/_/mongo/
> [1] https://beam.apache.org/documentation/io/testing/
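
The core of such a benchmark is timing a bulk operation against the store and publishing a throughput figure. A minimal, hypothetical sketch in plain Python (not the actual PerfKitBenchmarker harness; `bulk_insert` is a stand-in for a real MongoDB client call such as an `insert_many` against the Kubernetes-hosted cluster):

```python
import time

def measure_throughput(op, n_records):
    """Time op(n_records) and return (elapsed_seconds, records_per_second)."""
    start = time.monotonic()
    op(n_records)
    elapsed = time.monotonic() - start
    return elapsed, (n_records / elapsed if elapsed else float("inf"))

def bulk_insert(n_records):
    # Stand-in for a real driver call; here it just builds the
    # documents in memory so the sketch is self-contained.
    docs = [{"_id": i, "payload": "x" * 100} for i in range(n_records)]
    return docs

elapsed, rps = measure_throughput(bulk_insert, 100000)
print("inserted 100000 records in %.3fs (%.0f records/s)" % (elapsed, rps))
```

A real run would swap `bulk_insert` for the IO under test and publish `rps` as the benchmark result.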



--


[jira] [Work logged] (BEAM-3213) Add a performance test for MongoDBIO

2018-03-28 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3213?focusedWorklogId=85343=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-85343
 ]

ASF GitHub Bot logged work on BEAM-3213:


Author: ASF GitHub Bot
Created on: 28/Mar/18 18:13
Start Date: 28/Mar/18 18:13
Worklog Time Spent: 10m 
  Work Description: chamikaramj commented on issue #4859: [BEAM-3213] 
MongodbIO performance test
URL: https://github.com/apache/beam/pull/4859#issuecomment-376984131
 
 
   Thanks JB. I'll merge after tests pass.




Issue Time Tracking
---

Worklog Id: (was: 85343)
Time Spent: 50m  (was: 40m)

> Add a performance test for MongoDBIO
> 
>
> Key: BEAM-3213
> URL: https://issues.apache.org/jira/browse/BEAM-3213
> Project: Beam
>  Issue Type: Test
>  Components: io-java-mongodb
>Reporter: Chamikara Jayalath
>Assignee: Łukasz Gajowy
>Priority: Major
>  Time Spent: 50m
>  Remaining Estimate: 0h
>
> We should add a large scale performance test for MongoDBIO. We could use 
> PerfKitBenchmarker based performance testing framework [1] to manage a 
> Kubernetes based multi-node MongoDB cluster and to publish benchmark results.
> Example docker image to use: https://hub.docker.com/_/mongo/
> [1] https://beam.apache.org/documentation/io/testing/



--


Build failed in Jenkins: beam_PostCommit_Python_ValidatesRunner_Dataflow #1205

2018-03-28 Thread Apache Jenkins Server
See 


Changes:

[mairbek] [BEAM-3932] Adds handling of array null values to 
MutationSizeEstimator

[aromanenko.dev] [BEAM-3819] Add withRequestRecordsLimit() option to KinesisIO

[altay] [BEAM-3738] Add more flake8 tests to run_pylint.sh

--
[...truncated 508.05 KB...]
}
  ], 
  "is_wrapper": true
}, 
"output_name": "out", 
"user_name": "assert_that/Group/GroupByKey.out"
  }
], 
"parallel_input": {
  "@type": "OutputReference", 
  "output_name": "out", 
  "step_name": "s10"
}, 
"serialized_fn": 
"%0AD%22B%0A%1Dref_Coder_GlobalWindowCoder_1%12%21%0A%1F%0A%1D%0A%1Bbeam%3Acoder%3Aglobal_window%3Av1jT%0A%25%0A%23%0A%21beam%3Awindowfn%3Aglobal_windows%3Av0.1%10%01%1A%1Dref_Coder_GlobalWindowCoder_1%22%02%3A%00%28%010%018%01H%01",
 
"user_name": "assert_that/Group/GroupByKey"
  }
}, 
{
  "kind": "ParallelDo", 
  "name": "s12", 
  "properties": {
"display_data": [
  {
"key": "fn", 
"label": "Transform Function", 
"namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
"type": "STRING", 
"value": "_merge_tagged_vals_under_key"
  }, 
  {
"key": "fn", 
"label": "Transform Function", 
"namespace": "apache_beam.transforms.core.ParDo", 
"shortValue": "CallableWrapperDoFn", 
"type": "STRING", 
"value": "apache_beam.transforms.core.CallableWrapperDoFn"
  }
], 
"non_parallel_inputs": {}, 
"output_info": [
  {
"encoding": {
  "@type": "kind:windowed_value", 
  "component_encodings": [
{
  "@type": 
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
 
  "component_encodings": [
{
  "@type": 
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
 
  "component_encodings": []
}, 
{
  "@type": 
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
 
  "component_encodings": []
}
  ], 
  "is_pair_like": true
}, 
{
  "@type": "kind:global_window"
}
  ], 
  "is_wrapper": true
}, 
"output_name": "out", 
"user_name": 
"assert_that/Group/Map(_merge_tagged_vals_under_key).out"
  }
], 
"parallel_input": {
  "@type": "OutputReference", 
  "output_name": "out", 
  "step_name": "s11"
}, 
"serialized_fn": "", 
"user_name": "assert_that/Group/Map(_merge_tagged_vals_under_key)"
  }
}, 
{
  "kind": "ParallelDo", 
  "name": "s13", 
  "properties": {
"display_data": [
  {
"key": "fn", 
"label": "Transform Function", 
"namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
"type": "STRING", 
"value": ""
  }, 
  {
"key": "fn", 
"label": "Transform Function", 
"namespace": "apache_beam.transforms.core.ParDo", 
"shortValue": "CallableWrapperDoFn", 
"type": "STRING", 
"value": "apache_beam.transforms.core.CallableWrapperDoFn"
  }
], 
"non_parallel_inputs": {}, 
"output_info": [
  {
"encoding": {
  "@type": "kind:windowed_value", 
  "component_encodings": [
{
  "@type": 
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
 
  "component_encodings": [
{
  "@type": 
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
 
  "component_encodings": []
}, 
{
  "@type": 
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
 
  

[jira] [Updated] (BEAM-3964) Java Github HEAD doesn't run on Dataflow

2018-03-28 Thread Thomas Groh (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3964?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Thomas Groh updated BEAM-3964:
--
Description: 
Rather: it runs, but cannot make progress, seemingly due to an error with 
metric names:

 

exception: "java.lang.NoSuchMethodError: 
org.apache.beam.sdk.metrics.MetricName.name()Ljava/lang/String; at 
com.google.cloud.dataflow.worker.MetricsToCounterUpdateConverter.structuredNameAndMetadata(MetricsToCounterUpdateConverter.java:99)
 at 
com.google.cloud.dataflow.worker.MetricsToCounterUpdateConverter.fromCounter(MetricsToCounterUpdateConverter.java:68)
 at 
com.google.cloud.dataflow.worker.BatchModeExecutionContext.lambda$null$1(BatchModeExecutionContext.java:463)
 at 
com.google.cloud.dataflow.worker.repackaged.com.google.common.collect.Iterators$7.transform(Iterators.java:750)
 at 
com.google.cloud.dataflow.worker.repackaged.com.google.common.collect.TransformedIterator.next(TransformedIterator.java:47)
 at 
com.google.cloud.dataflow.worker.repackaged.com.google.common.collect.MultitransformedIterator.next(MultitransformedIterator.java:66)
 at 
com.google.cloud.dataflow.worker.repackaged.com.google.common.collect.ImmutableCollection$Builder.addAll(ImmutableCollection.java:388)
 at 
com.google.cloud.dataflow.worker.repackaged.com.google.common.collect.ImmutableCollection$ArrayBasedBuilder.addAll(ImmutableCollection.java:472)
 at 
com.google.cloud.dataflow.worker.repackaged.com.google.common.collect.ImmutableList$Builder.addAll(ImmutableList.java:669)
 at 
com.google.cloud.dataflow.worker.WorkItemStatusClient.populateCounterUpdates(WorkItemStatusClient.java:256)
 at 
com.google.cloud.dataflow.worker.WorkItemStatusClient.createStatusUpdate(WorkItemStatusClient.java:240)
 at 
com.google.cloud.dataflow.worker.WorkItemStatusClient.reportError(WorkItemStatusClient.java:94)
 at 
com.google.cloud.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:358)
 at 
com.google.cloud.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:284)
 at 
com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:134)
 at 
com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:114)
 at 
com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:101)
 at java.util.concurrent.FutureTask.run(FutureTask.java:266) at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) 
at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) 
at java.lang.Thread.run(Thread.java:745)

 

Probably requires an updated container version.

> Java Github HEAD doesn't run on Dataflow
> 
>
> Key: BEAM-3964
> URL: https://issues.apache.org/jira/browse/BEAM-3964
> Project: Beam
>  Issue Type: Bug
>  Components: runner-dataflow
>Reporter: Thomas Groh
>Assignee: Thomas Groh
>Priority: Major
>
> Rather: it runs, but cannot make progress, seemingly due to an error with 
> metric names:
>  
> exception: "java.lang.NoSuchMethodError: 
> org.apache.beam.sdk.metrics.MetricName.name()Ljava/lang/String; at 
> com.google.cloud.dataflow.worker.MetricsToCounterUpdateConverter.structuredNameAndMetadata(MetricsToCounterUpdateConverter.java:99)
>  at 
> com.google.cloud.dataflow.worker.MetricsToCounterUpdateConverter.fromCounter(MetricsToCounterUpdateConverter.java:68)
>  at 
> com.google.cloud.dataflow.worker.BatchModeExecutionContext.lambda$null$1(BatchModeExecutionContext.java:463)
>  at 
> com.google.cloud.dataflow.worker.repackaged.com.google.common.collect.Iterators$7.transform(Iterators.java:750)
>  at 
> com.google.cloud.dataflow.worker.repackaged.com.google.common.collect.TransformedIterator.next(TransformedIterator.java:47)
>  at 
> com.google.cloud.dataflow.worker.repackaged.com.google.common.collect.MultitransformedIterator.next(MultitransformedIterator.java:66)
>  at 
> com.google.cloud.dataflow.worker.repackaged.com.google.common.collect.ImmutableCollection$Builder.addAll(ImmutableCollection.java:388)
>  at 
> com.google.cloud.dataflow.worker.repackaged.com.google.common.collect.ImmutableCollection$ArrayBasedBuilder.addAll(ImmutableCollection.java:472)
>  at 
> com.google.cloud.dataflow.worker.repackaged.com.google.common.collect.ImmutableList$Builder.addAll(ImmutableList.java:669)
>  at 
> com.google.cloud.dataflow.worker.WorkItemStatusClient.populateCounterUpdates(WorkItemStatusClient.java:256)
>  at 
> com.google.cloud.dataflow.worker.WorkItemStatusClient.createStatusUpdate(WorkItemStatusClient.java:240)
>  at 
> com.google.cloud.dataflow.worker.WorkItemStatusClient.reportError(WorkItemStatusClient.java:94)
>  at 
> 

[jira] [Created] (BEAM-3964) Java Github HEAD doesn't run on Dataflow

2018-03-28 Thread Thomas Groh (JIRA)
Thomas Groh created BEAM-3964:
-

 Summary: Java Github HEAD doesn't run on Dataflow
 Key: BEAM-3964
 URL: https://issues.apache.org/jira/browse/BEAM-3964
 Project: Beam
  Issue Type: Bug
  Components: runner-dataflow
Reporter: Thomas Groh
Assignee: Thomas Groh






--


[jira] [Work logged] (BEAM-3956) Stacktraces from exceptions in user code should be preserved in the Python SDK

2018-03-28 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3956?focusedWorklogId=85340=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-85340
 ]

ASF GitHub Bot logged work on BEAM-3956:


Author: ASF GitHub Bot
Created on: 28/Mar/18 17:49
Start Date: 28/Mar/18 17:49
Worklog Time Spent: 10m 
  Work Description: shoyer commented on issue #4959: [BEAM-3956] Preserve 
stacktraces for Python exceptions
URL: https://github.com/apache/beam/pull/4959#issuecomment-376976017
 
 
   Jenkins failed with a lint error but that should be resolved now.




Issue Time Tracking
---

Worklog Id: (was: 85340)
Time Spent: 1h  (was: 50m)

> Stacktraces from exceptions in user code should be preserved in the Python SDK
> --
>
> Key: BEAM-3956
> URL: https://issues.apache.org/jira/browse/BEAM-3956
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-py-core
>Reporter: Stephan Hoyer
>Priority: Major
>  Time Spent: 1h
>  Remaining Estimate: 0h
>
> Currently, Beam's Python SDK loses stacktraces for exceptions. It does 
> helpfully add a tag like "[while running StageA]" to exception error 
> messages, but that doesn't include the stacktrace of Python functions being 
> called.
> Including the full stacktraces would make a big difference for the ease of 
> debugging Beam pipelines when things go wrong.
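
The general technique behind the fix is to annotate the message while chaining the original exception, so the user function's frames survive. A minimal sketch using `raise ... from` (not Beam's actual implementation; `run_stage` and the stage name are hypothetical):

```python
def run_stage(fn, stage_name):
    """Run user code; on failure, re-raise with a '[while running ...]'
    tag while chaining the original so its traceback is preserved."""
    try:
        return fn()
    except Exception as exc:
        # Note: type(exc)(msg) assumes the exception type accepts a single
        # message argument, which is not true for every exception class.
        annotated = type(exc)("%s [while running %s]" % (exc, stage_name))
        raise annotated from exc

def user_fn():
    return 1 // 0  # user bug

try:
    run_stage(user_fn, "StageA")
except ZeroDivisionError as exc:
    print(exc)            # annotated message with the stage tag
    print(exc.__cause__)  # original exception, traceback still attached
```

Printing such an exception with `traceback.print_exception` shows both the annotated re-raise and, via the chain, the frames inside `user_fn` -- which is the stacktrace the issue says is currently lost.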



--


[jira] [Comment Edited] (BEAM-3963) Dataflow PostCommits have not built since Mar 26, 2018

2018-03-28 Thread Thomas Groh (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-3963?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16417821#comment-16417821
 ] 

Thomas Groh edited comment on BEAM-3963 at 3/28/18 5:47 PM:


Most of the 'failed to build' are actually timeouts in running the 
{{ValidatesRunner}} tests, and many of the prior failures are quota-related


was (Author: tgroh):
Most of the 'failed to build' are actually timeouts in running the 
{{ValidatesRunner}} tests

> Dataflow PostCommits have not built since Mar 26, 2018 
> ---
>
> Key: BEAM-3963
> URL: https://issues.apache.org/jira/browse/BEAM-3963
> Project: Beam
>  Issue Type: Bug
>  Components: runner-dataflow
>Reporter: Thomas Groh
>Assignee: Thomas Groh
>Priority: Blocker
>
> They have not succeeded since [Mar 15, 2018 12:00 
> PM|https://builds.apache.org/view/A-D/view/Beam/job/beam_PostCommit_Java_ValidatesRunner_Dataflow/5156/]



--


[jira] [Commented] (BEAM-3963) Dataflow PostCommits have not built since Mar 26, 2018

2018-03-28 Thread Thomas Groh (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-3963?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16417821#comment-16417821
 ] 

Thomas Groh commented on BEAM-3963:
---

Most of the 'failed to build' are actually timeouts in running the 
{{ValidatesRunner}} tests

> Dataflow PostCommits have not built since Mar 26, 2018 
> ---
>
> Key: BEAM-3963
> URL: https://issues.apache.org/jira/browse/BEAM-3963
> Project: Beam
>  Issue Type: Bug
>  Components: runner-dataflow
>Reporter: Thomas Groh
>Assignee: Thomas Groh
>Priority: Blocker
>
> They have not succeeded since [Mar 15, 2018 12:00 
> PM|https://builds.apache.org/view/A-D/view/Beam/job/beam_PostCommit_Java_ValidatesRunner_Dataflow/5156/]



--


[jira] [Created] (BEAM-3963) Dataflow PostCommits have not built since Mar 26, 2018 and

2018-03-28 Thread Thomas Groh (JIRA)
Thomas Groh created BEAM-3963:
-

 Summary: Dataflow PostCommits have not built since Mar 26, 2018 
and 
 Key: BEAM-3963
 URL: https://issues.apache.org/jira/browse/BEAM-3963
 Project: Beam
  Issue Type: Bug
  Components: runner-dataflow
Reporter: Thomas Groh
Assignee: Thomas Groh


They have not succeeded since [Mar 15, 2018 12:00 
PM|https://builds.apache.org/view/A-D/view/Beam/job/beam_PostCommit_Java_ValidatesRunner_Dataflow/5156/]



--


[jira] [Updated] (BEAM-3963) Dataflow PostCommits have not built since Mar 26, 2018

2018-03-28 Thread Thomas Groh (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3963?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Thomas Groh updated BEAM-3963:
--
Summary: Dataflow PostCommits have not built since Mar 26, 2018   (was: 
Dataflow PostCommits have not built since Mar 26, 2018 and )

> Dataflow PostCommits have not built since Mar 26, 2018 
> ---
>
> Key: BEAM-3963
> URL: https://issues.apache.org/jira/browse/BEAM-3963
> Project: Beam
>  Issue Type: Bug
>  Components: runner-dataflow
>Reporter: Thomas Groh
>Assignee: Thomas Groh
>Priority: Blocker
>
> They have not succeeded since [Mar 15, 2018 12:00 
> PM|https://builds.apache.org/view/A-D/view/Beam/job/beam_PostCommit_Java_ValidatesRunner_Dataflow/5156/]



--

