[jira] [Closed] (BEAM-69) TextIO/AvroIO numShards may not work in different runners

2016-07-11 Thread Daniel Halperin (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-69?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Daniel Halperin closed BEAM-69.
---
   Resolution: Fixed
 Assignee: Daniel Halperin
Fix Version/s: Not applicable

> TextIO/AvroIO numShards may not work in different runners
> -
>
> Key: BEAM-69
> URL: https://issues.apache.org/jira/browse/BEAM-69
> Project: Beam
>  Issue Type: New Feature
>  Components: sdk-java-core
>Reporter: Daniel Halperin
>Assignee: Daniel Halperin
> Fix For: Not applicable
>
>
> See also: https://issues.apache.org/jira/browse/BEAM-68
> Because the Beam Model does not support limiting the parallelism of a 
> specific step, individual runners need to add runner-specific overrides of 
> TextIO.Write and AvroIO.Write to enforce the `numShards` parameter. 
> DataflowPipelineRunner and DirectPipelineRunner do that now; other runners 
> will need to do so as well, or ignore/reject the writes, until BEAM-68 is 
> resolved.
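
A minimal, illustrative sketch of the shard-assignment step such an override 
has to perform (plain Java, no Beam types; the class and method names are made 
up for illustration and are not part of any runner):

import java.util.ArrayList;
import java.util.List;

// Enforcing numShards boils down to assigning each element a shard index in
// [0, numShards) so that a downstream group-by-shard-key stage produces exactly
// numShards write bundles, regardless of the runner's parallelism.
public class FixedShardingSketch {

  static List<List<String>> assignShards(List<String> elements, int numShards) {
    List<List<String>> shards = new ArrayList<>();
    for (int i = 0; i < numShards; i++) {
      shards.add(new ArrayList<String>());
    }
    int next = 0;
    for (String element : elements) {
      shards.get(next).add(element);      // round-robin shard assignment
      next = (next + 1) % numShards;
    }
    return shards;
  }
}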



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Updated] (BEAM-375) HadoopIO and runners-spark conflict with hadoop.version

2016-07-11 Thread Pei He (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-375?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Pei He updated BEAM-375:

Issue Type: Bug  (was: New Feature)

> HadoopIO and runners-spark conflict with hadoop.version
> ---
>
> Key: BEAM-375
> URL: https://issues.apache.org/jira/browse/BEAM-375
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-extensions
>Reporter: Pei He
>Assignee: Pei He
>
> HadoopIO currently uses 2.7.0 for hadoop-client and hadoop-common, while 
> runners-spark uses 2.2.0.
> From [~amitsela]:
> "Spark can be built against different Hadoop versions, but the release in 
> Maven Central is a 2.2.0 build (latest)."
> For HadoopIO, I don't know why 2.7.0 was picked in the beginning. I can check 
> whether it will work with 2.2.0.
> I am creating this issue because I think there is a general question here:
> in principle, HadoopIO and other SDK sources should work with any runner. 
> But when one set of runners requires version A and another set requires 
> version B, we will need a general solution.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Updated] (BEAM-443) PipelineResult needs waitToFinish() and cancel()

2016-07-11 Thread Pei He (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-443?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Pei He updated BEAM-443:

Issue Type: New Feature  (was: Bug)

> PipelineResult needs waitToFinish() and cancel()
> 
>
> Key: BEAM-443
> URL: https://issues.apache.org/jira/browse/BEAM-443
> Project: Beam
>  Issue Type: New Feature
>Reporter: Pei He
>Assignee: Pei He
>
> waitToFinish() and cancel() are the two most common operations for users to 
> interact with a started pipeline.
> Right now, they are only available in DataflowPipelineJob. It would be better 
> to move them to the common interface, so other runners can start implementing 
> them and runner-agnostic code can interact with PipelineResult more easily.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Updated] (BEAM-375) HadoopIO and runners-spark conflict with hadoop.version

2016-07-11 Thread Pei He (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-375?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Pei He updated BEAM-375:

Issue Type: New Feature  (was: Bug)

> HadoopIO and runners-spark conflict with hadoop.version
> ---
>
> Key: BEAM-375
> URL: https://issues.apache.org/jira/browse/BEAM-375
> Project: Beam
>  Issue Type: New Feature
>  Components: sdk-java-extensions
>Reporter: Pei He
>Assignee: Pei He
>
> HadoopIO currently uses 2.7.0 for hadoop-client and hadoop-common, while 
> runners-spark uses 2.2.0.
> From [~amitsela]:
> "Spark can be built against different Hadoop versions, but the release in 
> Maven Central is a 2.2.0 build (latest)."
> For HadoopIO, I don't know why 2.7.0 was picked in the beginning. I can check 
> whether it will work with 2.2.0.
> I am creating this issue because I think there is a general question here:
> in principle, HadoopIO and other SDK sources should work with any runner. 
> But when one set of runners requires version A and another set requires 
> version B, we will need a general solution.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Created] (BEAM-443) PipelineResult needs waitToFinish() and cancel()

2016-07-11 Thread Pei He (JIRA)
Pei He created BEAM-443:
---

 Summary: PipelineResult needs waitToFinish() and cancel()
 Key: BEAM-443
 URL: https://issues.apache.org/jira/browse/BEAM-443
 Project: Beam
  Issue Type: Bug
Reporter: Pei He
Assignee: Pei He


waitToFinish() and cancel() are the two most common operations for users to 
interact with a started pipeline.

Right now, they are only available in DataflowPipelineJob. It would be better to 
move them to the common interface, so other runners can start implementing them 
and runner-agnostic code can interact with PipelineResult more easily.
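
A minimal sketch of what the shared interface could look like, assuming the 
method names proposed above (illustrative only, not the actual Beam interface):

// Hypothetical shape of a runner-agnostic result handle.
public interface PipelineResultSketch {

  enum State { UNKNOWN, RUNNING, DONE, FAILED, CANCELLED }

  // Current state of the pipeline, if the runner can report it.
  State getState();

  // Block until the pipeline reaches a terminal state, then return that state.
  State waitToFinish() throws Exception;

  // Request cancellation of a running pipeline and return the resulting state.
  State cancel() throws Exception;
}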



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[GitHub] incubator-beam pull request #637: Remove getDataflowClient() from DataflowPi...

2016-07-11 Thread peihe
GitHub user peihe opened a pull request:

https://github.com/apache/incubator-beam/pull/637

Remove getDataflowClient() from DataflowPipelineJob




You can merge this pull request into a Git repository by running:

$ git pull https://github.com/peihe/incubator-beam PipelineResult-fixup

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/incubator-beam/pull/637.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #637


commit 23637cbd1d0fa008851829e46b392de84352c10a
Author: Pei He 
Date:   2016-07-12T04:29:26Z

Remove getDataflowClient() from DataflowPipelineJob




---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


Build failed in Jenkins: beam_PostCommit_PythonVerify #68

2016-07-11 Thread Apache Jenkins Server
See 

--
[...truncated 1282 lines...]

creating apache-beam-sdk-0.3.0
creating apache-beam-sdk-0.3.0/apache_beam
creating apache-beam-sdk-0.3.0/apache_beam/coders
creating apache-beam-sdk-0.3.0/apache_beam/examples
creating apache-beam-sdk-0.3.0/apache_beam/internal
creating apache-beam-sdk-0.3.0/apache_beam/internal/clients
creating apache-beam-sdk-0.3.0/apache_beam/internal/clients/bigquery
creating apache-beam-sdk-0.3.0/apache_beam/internal/clients/dataflow
creating apache-beam-sdk-0.3.0/apache_beam/internal/clients/storage
creating apache-beam-sdk-0.3.0/apache_beam/io
creating apache-beam-sdk-0.3.0/apache_beam/runners
creating apache-beam-sdk-0.3.0/apache_beam/transforms
creating apache-beam-sdk-0.3.0/apache_beam/typehints
creating apache-beam-sdk-0.3.0/apache_beam/utils
creating apache-beam-sdk-0.3.0/apache_beam_sdk.egg-info
making hard links in apache-beam-sdk-0.3.0...
hard linking setup.cfg -> apache-beam-sdk-0.3.0
hard linking setup.py -> apache-beam-sdk-0.3.0
hard linking apache_beam/__init__.py -> apache-beam-sdk-0.3.0/apache_beam
hard linking apache_beam/dataflow_test.py -> apache-beam-sdk-0.3.0/apache_beam
hard linking apache_beam/error.py -> apache-beam-sdk-0.3.0/apache_beam
hard linking apache_beam/pipeline.py -> apache-beam-sdk-0.3.0/apache_beam
hard linking apache_beam/pipeline_test.py -> apache-beam-sdk-0.3.0/apache_beam
hard linking apache_beam/pvalue.py -> apache-beam-sdk-0.3.0/apache_beam
hard linking apache_beam/pvalue_test.py -> apache-beam-sdk-0.3.0/apache_beam
hard linking apache_beam/version.py -> apache-beam-sdk-0.3.0/apache_beam
hard linking apache_beam/coders/__init__.py -> 
apache-beam-sdk-0.3.0/apache_beam/coders
hard linking apache_beam/coders/coder_impl.pxd -> 
apache-beam-sdk-0.3.0/apache_beam/coders
hard linking apache_beam/coders/coder_impl.py -> 
apache-beam-sdk-0.3.0/apache_beam/coders
hard linking apache_beam/coders/coders.py -> 
apache-beam-sdk-0.3.0/apache_beam/coders
hard linking apache_beam/coders/coders_test.py -> 
apache-beam-sdk-0.3.0/apache_beam/coders
hard linking apache_beam/coders/coders_test_common.py -> 
apache-beam-sdk-0.3.0/apache_beam/coders
hard linking apache_beam/coders/fast_coders_test.py -> 
apache-beam-sdk-0.3.0/apache_beam/coders
hard linking apache_beam/coders/observable.py -> 
apache-beam-sdk-0.3.0/apache_beam/coders
hard linking apache_beam/coders/observable_test.py -> 
apache-beam-sdk-0.3.0/apache_beam/coders
hard linking apache_beam/coders/slow_coders_test.py -> 
apache-beam-sdk-0.3.0/apache_beam/coders
hard linking apache_beam/coders/slow_stream.py -> 
apache-beam-sdk-0.3.0/apache_beam/coders
hard linking apache_beam/coders/stream.pxd -> 
apache-beam-sdk-0.3.0/apache_beam/coders
hard linking apache_beam/coders/stream.pyx -> 
apache-beam-sdk-0.3.0/apache_beam/coders
hard linking apache_beam/coders/stream_test.py -> 
apache-beam-sdk-0.3.0/apache_beam/coders
hard linking apache_beam/coders/typecoders.py -> 
apache-beam-sdk-0.3.0/apache_beam/coders
hard linking apache_beam/coders/typecoders_test.py -> 
apache-beam-sdk-0.3.0/apache_beam/coders
hard linking apache_beam/examples/__init__.py -> 
apache-beam-sdk-0.3.0/apache_beam/examples
hard linking apache_beam/examples/streaming_wordcap.py -> 
apache-beam-sdk-0.3.0/apache_beam/examples
hard linking apache_beam/examples/streaming_wordcount.py -> 
apache-beam-sdk-0.3.0/apache_beam/examples
hard linking apache_beam/examples/wordcount.py -> 
apache-beam-sdk-0.3.0/apache_beam/examples
hard linking apache_beam/examples/wordcount_debugging.py -> 
apache-beam-sdk-0.3.0/apache_beam/examples
hard linking apache_beam/examples/wordcount_debugging_test.py -> 
apache-beam-sdk-0.3.0/apache_beam/examples
hard linking apache_beam/examples/wordcount_minimal.py -> 
apache-beam-sdk-0.3.0/apache_beam/examples
hard linking apache_beam/examples/wordcount_minimal_test.py -> 
apache-beam-sdk-0.3.0/apache_beam/examples
hard linking apache_beam/examples/wordcount_test.py -> 
apache-beam-sdk-0.3.0/apache_beam/examples
hard linking apache_beam/internal/__init__.py -> 
apache-beam-sdk-0.3.0/apache_beam/internal
hard linking apache_beam/internal/apiclient.py -> 
apache-beam-sdk-0.3.0/apache_beam/internal
hard linking apache_beam/internal/apiclient_test.py -> 
apache-beam-sdk-0.3.0/apache_beam/internal
hard linking apache_beam/internal/auth.py -> 
apache-beam-sdk-0.3.0/apache_beam/internal
hard linking apache_beam/internal/json_value.py -> 
apache-beam-sdk-0.3.0/apache_beam/internal
hard linking apache_beam/internal/json_value_test.py -> 
apache-beam-sdk-0.3.0/apache_beam/internal
hard linking apache_beam/internal/module_test.py -> 
apache-beam-sdk-0.3.0/apache_beam/internal
hard linking apache_beam/internal/pickler.py -> 
apache-beam-sdk-0.3.0/apache_beam/internal
hard linking apache_beam/internal/pickler_test.py -> 
apache-beam-sdk-0.3.0/apache_beam/internal
hard linking apache_beam/internal/util.py -> 

Jenkins build is back to stable : beam_PostCommit_RunnableOnService_GoogleCloudDataflow #736

2016-07-11 Thread Apache Jenkins Server
See 




[jira] [Updated] (BEAM-442) Examples demonstrate code, but not project configuration

2016-07-11 Thread Kenneth Knowles (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-442?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kenneth Knowles updated BEAM-442:
-
Assignee: (was: Frances Perry)

> Examples demonstrate code, but not project configuration
> 
>
> Key: BEAM-442
> URL: https://issues.apache.org/jira/browse/BEAM-442
> Project: Beam
>  Issue Type: Improvement
>  Components: examples-java
>Reporter: Kenneth Knowles
>
> Deriving a possible issue from 
> http://stackoverflow.com/questions/38317428/how-do-i-resolve-java-lang-nosuchmethoderror-com-google-api-services-dataflow-m
> The above StackOverflow question illuminates the fact that the 
> Dataflow-examples repository is not a demonstration of proper project 
> configuration, but only code. Today there is not a separate examples 
> repository for Beam, but just an examples module, so the issue is not 
> highlighted.
> It is probably quite useful to have a starter project that actually has the 
> dependencies set up correctly, not referencing the Beam parent pom, so users 
> can clone it and get started. It could be in a separate repository or not - 
> that aspect is actually orthogonal IMO.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Updated] (BEAM-442) Examples demonstrate code, but not project configuration

2016-07-11 Thread Kenneth Knowles (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-442?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kenneth Knowles updated BEAM-442:
-
Description: 
Deriving a possible issue from 
http://stackoverflow.com/questions/38317428/how-do-i-resolve-java-lang-nosuchmethoderror-com-google-api-services-dataflow-m

The above StackOverflow question illuminates the fact that the 
Dataflow-examples repository is not a demonstration of proper project 
configuration, but only code. Today there is not a separate examples repository 
for Beam, but just an examples module, so the issue is not highlighted.

It is probably quite useful to have a starter project that actually has the 
dependencies set up correctly, not referencing the Beam parent pom, so users 
can clone it and get started. It could be in a separate repository or not - 
that aspect is actually orthogonal IMO.

  was:
Deriving a possible issue from 
http://stackoverflow.com/questions/38317428/how-do-i-resolve-java-lang-nosuchmethoderror-com-google-api-services-dataflow-m

The above StackOverflow question illuminates the fact that the 
Dataflow-examples repository is not a demonstration of proper project 
configuration, but only code. Today there is not a separate examples repository 
for Beam, but just an examples module, so the issue is not highlighted.

It is probably quite useful to have a starter project that actually has the 
dependencies set up correctly, not references the Beam parent pom, so users can 
clone it and get started. It could be in a separate repository or not - that 
aspect is actually orthogonal IMO.


> Examples demonstrate code, but not project configuration
> 
>
> Key: BEAM-442
> URL: https://issues.apache.org/jira/browse/BEAM-442
> Project: Beam
>  Issue Type: Improvement
>  Components: examples-java
>Reporter: Kenneth Knowles
>Assignee: Frances Perry
>
> Deriving a possible issue from 
> http://stackoverflow.com/questions/38317428/how-do-i-resolve-java-lang-nosuchmethoderror-com-google-api-services-dataflow-m
> The above StackOverflow question illuminates the fact that the 
> Dataflow-examples repository is not a demonstration of proper project 
> configuration, but only code. Today there is not a separate examples 
> repository for Beam, but just an examples module, so the issue is not 
> highlighted.
> It is probably quite useful to have a starter project that actually has the 
> dependencies set up correctly, not referencing the Beam parent pom, so users 
> can clone it and get started. It could be in a separate repository or not - 
> that aspect is actually orthogonal IMO.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Created] (BEAM-442) Examples demonstrate code, but not project configuration

2016-07-11 Thread Kenneth Knowles (JIRA)
Kenneth Knowles created BEAM-442:


 Summary: Examples demonstrate code, but not project configuration
 Key: BEAM-442
 URL: https://issues.apache.org/jira/browse/BEAM-442
 Project: Beam
  Issue Type: Improvement
  Components: examples-java
Reporter: Kenneth Knowles
Assignee: Frances Perry


Deriving a possible issue from 
http://stackoverflow.com/questions/38317428/how-do-i-resolve-java-lang-nosuchmethoderror-com-google-api-services-dataflow-m

The above StackOverflow question illuminates the fact that the 
Dataflow-examples repository is not a demonstration of proper project 
configuration, but only code. Today there is not a separate examples repository 
for Beam, but just an examples module, so the issue is not highlighted.

It is probably quite useful to have a starter project that actually has the 
dependencies set up correctly, not references the Beam parent pom, so users can 
clone it and get started. It could be in a separate repository or not - that 
aspect is actually orthogonal IMO.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


Jenkins build is still unstable: beam_PostCommit_RunnableOnService_GoogleCloudDataflow #735

2016-07-11 Thread Apache Jenkins Server
See 




[jira] [Commented] (BEAM-404) PubsubIO should have a mode that supports maintaining message attributes.

2016-07-11 Thread Daniel Halperin (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-404?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15372044#comment-15372044
 ] 

Daniel Halperin commented on BEAM-404:
--

Of course instead of {{KV}} this could be something else like a 
{{PubsubMessage}}, so that the attributes are encoded in a different, better 
way.
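
A hypothetical sketch of such a message type (field and method names are 
illustrative only, not the eventual PubsubIO API):

import java.util.Collections;
import java.util.HashMap;
import java.util.Map;

// Carries the raw payload together with the Pub/Sub attributes so that a
// source mode can hand both to the user.
public class PubsubMessageSketch {

  private final byte[] payload;
  private final Map<String, String> attributes;

  public PubsubMessageSketch(byte[] payload, Map<String, String> attributes) {
    this.payload = payload;
    this.attributes = Collections.unmodifiableMap(new HashMap<>(attributes));
  }

  // Raw message payload, to be decoded with the user-provided coder.
  public byte[] getPayload() { return payload; }

  // Message attributes (metadata), preserved alongside the payload.
  public Map<String, String> getAttributes() { return attributes; }
}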

> PubsubIO should have a mode that supports maintaining message attributes.
> -
>
> Key: BEAM-404
> URL: https://issues.apache.org/jira/browse/BEAM-404
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-gcp
>Reporter: Daniel Halperin
>
> Right now, PubsubIO only lets users access the message payload, decoded with 
> the user-provided coder.
> We should add a mode in which the source can return a message with the 
> metadata (attributes) as well.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[GitHub] incubator-beam pull request #631: [BEAM-433] Change the ExampleUtils constru...

2016-07-11 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/incubator-beam/pull/631


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[1/2] incubator-beam git commit: [BEAM-433] Change the ExampleUtils constructor takes PipelineOptions

2016-07-11 Thread dhalperi
Repository: incubator-beam
Updated Branches:
  refs/heads/master a59ddab21 -> f5a5eb34e


[BEAM-433] Change the ExampleUtils constructor takes PipelineOptions


Project: http://git-wip-us.apache.org/repos/asf/incubator-beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-beam/commit/46140307
Tree: http://git-wip-us.apache.org/repos/asf/incubator-beam/tree/46140307
Diff: http://git-wip-us.apache.org/repos/asf/incubator-beam/diff/46140307

Branch: refs/heads/master
Commit: 4614030729bbaf2458f6c98dc41f9cde5451624c
Parents: a59ddab
Author: Pei He 
Authored: Mon Jul 11 14:22:15 2016 -0700
Committer: Dan Halperin 
Committed: Mon Jul 11 18:34:41 2016 -0700

--
 .../beam/examples/common/ExampleUtils.java  | 20 
 1 file changed, 12 insertions(+), 8 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/46140307/examples/java/src/main/java/org/apache/beam/examples/common/ExampleUtils.java
--
diff --git 
a/examples/java/src/main/java/org/apache/beam/examples/common/ExampleUtils.java 
b/examples/java/src/main/java/org/apache/beam/examples/common/ExampleUtils.java
index 6b71b0f..ad00a14 100644
--- 
a/examples/java/src/main/java/org/apache/beam/examples/common/ExampleUtils.java
+++ 
b/examples/java/src/main/java/org/apache/beam/examples/common/ExampleUtils.java
@@ -24,6 +24,9 @@ import 
org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
 import org.apache.beam.runners.dataflow.util.MonitoringUtil;
 import org.apache.beam.sdk.PipelineResult;
 import org.apache.beam.sdk.options.BigQueryOptions;
+import org.apache.beam.sdk.options.PipelineOptions;
+import org.apache.beam.sdk.options.PubsubOptions;
+import org.apache.beam.sdk.options.StreamingOptions;
 import org.apache.beam.sdk.runners.PipelineRunner;
 import org.apache.beam.sdk.util.AttemptBoundedExponentialBackOff;
 import org.apache.beam.sdk.util.Transport;
@@ -65,7 +68,7 @@ public class ExampleUtils {
 
   private static final int SC_NOT_FOUND = 404;
 
-  private final DataflowPipelineOptions options;
+  private final PipelineOptions options;
   private Bigquery bigQueryClient = null;
   private Pubsub pubsubClient = null;
   private Dataflow dataflowClient = null;
@@ -75,7 +78,7 @@ public class ExampleUtils {
   /**
* Do resources and runner options setup.
*/
-  public ExampleUtils(DataflowPipelineOptions options) {
+  public ExampleUtils(PipelineOptions options) {
 this.options = options;
 setupRunner();
   }
@@ -230,7 +233,7 @@ public class ExampleUtils {
 
   private void setupPubsubTopic(String topic) throws IOException {
 if (pubsubClient == null) {
-  pubsubClient = Transport.newPubsubClient(options).build();
+  pubsubClient = 
Transport.newPubsubClient(options.as(PubsubOptions.class)).build();
 }
 if (executeNullIfNotFound(pubsubClient.projects().topics().get(topic)) == 
null) {
   pubsubClient.projects().topics().create(topic, new 
Topic().setName(topic)).execute();
@@ -239,7 +242,7 @@ public class ExampleUtils {
 
   private void setupPubsubSubscription(String topic, String subscription) 
throws IOException {
 if (pubsubClient == null) {
-  pubsubClient = Transport.newPubsubClient(options).build();
+  pubsubClient = 
Transport.newPubsubClient(options.as(PubsubOptions.class)).build();
 }
 if 
(executeNullIfNotFound(pubsubClient.projects().subscriptions().get(subscription))
 == null) {
   Subscription subInfo = new Subscription()
@@ -256,7 +259,7 @@ public class ExampleUtils {
*/
   private void deletePubsubTopic(String topic) throws IOException {
 if (pubsubClient == null) {
-  pubsubClient = Transport.newPubsubClient(options).build();
+  pubsubClient = 
Transport.newPubsubClient(options.as(PubsubOptions.class)).build();
 }
 if (executeNullIfNotFound(pubsubClient.projects().topics().get(topic)) != 
null) {
   pubsubClient.projects().topics().delete(topic).execute();
@@ -270,7 +273,7 @@ public class ExampleUtils {
*/
   private void deletePubsubSubscription(String subscription) throws 
IOException {
 if (pubsubClient == null) {
-  pubsubClient = Transport.newPubsubClient(options).build();
+  pubsubClient = 
Transport.newPubsubClient(options.as(PubsubOptions.class)).build();
 }
 if 
(executeNullIfNotFound(pubsubClient.projects().subscriptions().get(subscription))
 != null) {
   pubsubClient.projects().subscriptions().delete(subscription).execute();
@@ -283,7 +286,8 @@ public class ExampleUtils {
*/
   private void setupRunner() {
 Class> runner = options.getRunner();
-if (options.isStreaming() && runner.equals(BlockingDataflowRunner.class)) {
+if (options.as(StreamingOptions.class).isStreaming()
+&& 

[jira] [Commented] (BEAM-433) Make Beam examples runners agnostic

2016-07-11 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-433?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15372036#comment-15372036
 ] 

ASF GitHub Bot commented on BEAM-433:
-

Github user asfgit closed the pull request at:

https://github.com/apache/incubator-beam/pull/631


> Make Beam examples runners agnostic
> ---
>
> Key: BEAM-433
> URL: https://issues.apache.org/jira/browse/BEAM-433
> Project: Beam
>  Issue Type: Improvement
>Reporter: Pei He
>Assignee: Pei He
>
> Beam examples are ported from Dataflow, and they heavily reference Dataflow 
> classes.
> The following cleanup tasks remain:
> 1. Remove Dataflow streaming and batch injector setup (Done).
> 2. Remove references to DataflowPipelineOptions.
> 3. Move cancel() from DataflowPipelineJob to PipelineResult.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[2/2] incubator-beam git commit: Closes #631

2016-07-11 Thread dhalperi
Closes #631


Project: http://git-wip-us.apache.org/repos/asf/incubator-beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-beam/commit/f5a5eb34
Tree: http://git-wip-us.apache.org/repos/asf/incubator-beam/tree/f5a5eb34
Diff: http://git-wip-us.apache.org/repos/asf/incubator-beam/diff/f5a5eb34

Branch: refs/heads/master
Commit: f5a5eb34e984ba361aceb3b84e5d3a229a3c1bc3
Parents: a59ddab 4614030
Author: Dan Halperin 
Authored: Mon Jul 11 18:34:42 2016 -0700
Committer: Dan Halperin 
Committed: Mon Jul 11 18:34:42 2016 -0700

--
 .../beam/examples/common/ExampleUtils.java  | 20 
 1 file changed, 12 insertions(+), 8 deletions(-)
--




[jira] [Commented] (BEAM-389) DelegateCoder needs equals and hashCode

2016-07-11 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-389?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15372007#comment-15372007
 ] 

ASF GitHub Bot commented on BEAM-389:
-

Github user asfgit closed the pull request at:

https://github.com/apache/incubator-beam/pull/626


> DelegateCoder needs equals and hashCode
> ---
>
> Key: BEAM-389
> URL: https://issues.apache.org/jira/browse/BEAM-389
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-core
>Reporter: Pei He
>Assignee: Pei He
>Priority: Minor
> Fix For: 0.2.0-incubating
>
>
> Currently, DelegateCoder inherits equals() and hashCode() from StandardCoder, 
> which makes DelegateCoder.of(VarIntCoder) equal to 
> DelegateCoder.of(BigEndianIntegerCoder).



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[2/2] incubator-beam git commit: Closes #626

2016-07-11 Thread dhalperi
Closes #626


Project: http://git-wip-us.apache.org/repos/asf/incubator-beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-beam/commit/a59ddab2
Tree: http://git-wip-us.apache.org/repos/asf/incubator-beam/tree/a59ddab2
Diff: http://git-wip-us.apache.org/repos/asf/incubator-beam/diff/a59ddab2

Branch: refs/heads/master
Commit: a59ddab2164b7c5c25ddde1f88e62ea91a710581
Parents: d93f564 5316f20
Author: Dan Halperin 
Authored: Mon Jul 11 18:02:30 2016 -0700
Committer: Dan Halperin 
Committed: Mon Jul 11 18:02:30 2016 -0700

--
 .../org/apache/beam/sdk/transforms/Combine.java | 167 +--
 .../org/apache/beam/sdk/transforms/SumTest.java |  33 
 2 files changed, 152 insertions(+), 48 deletions(-)
--




[GitHub] incubator-beam pull request #626: [BEAM-389] Fix the uses of DelegateCoder i...

2016-07-11 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/incubator-beam/pull/626


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[1/2] incubator-beam git commit: [BEAM-389] Fix the uses of DelegateCoder in Combine

2016-07-11 Thread dhalperi
Repository: incubator-beam
Updated Branches:
  refs/heads/master d93f564eb -> a59ddab21


[BEAM-389] Fix the uses of DelegateCoder in Combine


Project: http://git-wip-us.apache.org/repos/asf/incubator-beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-beam/commit/5316f20a
Tree: http://git-wip-us.apache.org/repos/asf/incubator-beam/tree/5316f20a
Diff: http://git-wip-us.apache.org/repos/asf/incubator-beam/diff/5316f20a

Branch: refs/heads/master
Commit: 5316f20a8bdf30f5eabba3763544e39608d53d12
Parents: d93f564
Author: Pei He 
Authored: Mon Jul 11 13:01:24 2016 -0700
Committer: Dan Halperin 
Committed: Mon Jul 11 18:02:29 2016 -0700

--
 .../org/apache/beam/sdk/transforms/Combine.java | 167 +--
 .../org/apache/beam/sdk/transforms/SumTest.java |  33 
 2 files changed, 152 insertions(+), 48 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/5316f20a/sdks/java/core/src/main/java/org/apache/beam/sdk/transforms/Combine.java
--
diff --git 
a/sdks/java/core/src/main/java/org/apache/beam/sdk/transforms/Combine.java 
b/sdks/java/core/src/main/java/org/apache/beam/sdk/transforms/Combine.java
index 9a87b36..96c03eb 100644
--- a/sdks/java/core/src/main/java/org/apache/beam/sdk/transforms/Combine.java
+++ b/sdks/java/core/src/main/java/org/apache/beam/sdk/transforms/Combine.java
@@ -723,19 +723,7 @@ public class Combine {
 @Override
 public Coder getAccumulatorCoder(CoderRegistry registry, 
Coder inputCoder) {
   return DelegateCoder.of(
-  inputCoder,
-  new DelegateCoder.CodingFunction() {
-@Override
-public Integer apply(int[] accumulator) {
-  return accumulator[0];
-}
-  },
-  new DelegateCoder.CodingFunction() {
-@Override
-public int[] apply(Integer value) {
-  return wrap(value);
-}
-  });
+  inputCoder, new ToIntegerCodingFunction(), new 
FromIntegerCodingFunction());
 }
 
 @Override
@@ -744,12 +732,48 @@ public class Combine {
   return inputCoder;
 }
 
-private int[] wrap(int value) {
+private static int[] wrap(int value) {
   return new int[] { value };
 }
 
-public Counter getCounter(String name) {
-  throw new UnsupportedOperationException("BinaryCombineDoubleFn does not 
support getCounter");
+public Counter getCounter(@SuppressWarnings("unused") String 
name) {
+  throw new UnsupportedOperationException("BinaryCombineIntegerFn does not 
support getCounter");
+}
+
+private static final class ToIntegerCodingFunction
+implements DelegateCoder.CodingFunction {
+  @Override
+  public Integer apply(int[] accumulator) {
+return accumulator[0];
+  }
+
+  @Override
+  public boolean equals(Object o) {
+return o instanceof ToIntegerCodingFunction;
+  }
+
+  @Override
+  public int hashCode() {
+return this.getClass().hashCode();
+  }
+}
+
+private static final class FromIntegerCodingFunction
+implements DelegateCoder.CodingFunction {
+  @Override
+  public int[] apply(Integer value) {
+return wrap(value);
+  }
+
+  @Override
+  public boolean equals(Object o) {
+return o instanceof FromIntegerCodingFunction;
+  }
+
+  @Override
+  public int hashCode() {
+return this.getClass().hashCode();
+  }
 }
   }
 
@@ -803,20 +827,7 @@ public class Combine {
 
 @Override
 public Coder getAccumulatorCoder(CoderRegistry registry, 
Coder inputCoder) {
-  return DelegateCoder.of(
-  inputCoder,
-  new DelegateCoder.CodingFunction() {
-@Override
-public Long apply(long[] accumulator) {
-  return accumulator[0];
-}
-  },
-  new DelegateCoder.CodingFunction() {
-@Override
-public long[] apply(Long value) {
-  return wrap(value);
-}
-  });
+  return DelegateCoder.of(inputCoder, new ToLongCodingFunction(), new 
FromLongCodingFunction());
 }
 
 @Override
@@ -824,12 +835,48 @@ public class Combine {
   return inputCoder;
 }
 
-private long[] wrap(long value) {
+private static long[] wrap(long value) {
   return new long[] { value };
 }
 
-public Counter getCounter(String name) {
-  throw new UnsupportedOperationException("BinaryCombineDoubleFn does not 
support getCounter");
+public Counter getCounter(@SuppressWarnings("unused") String name) {
+  throw new 

Re: Build failed in Jenkins: beam_PostCommit_PythonVerify #67

2016-07-11 Thread Dan Halperin
Filed JIRA: https://issues.apache.org/jira/browse/BEAM-441

On Mon, Jul 11, 2016 at 5:54 PM, Apache Jenkins Server <
jenk...@builds.apache.org> wrote:

> See  >
>
> Changes:
>
> [dhalperi] Fix warnings that came with newly released Pylint (1.6.1)
> version.
>
> --
> [...truncated 1247 lines...]
> echo ">>> RUNNING DIRECT RUNNER py-wordcount"
> >>> RUNNING DIRECT RUNNER py-wordcount
> python -m apache_beam.examples.wordcount --output /tmp/py-wordcount-direct
> INFO:root:Missing pipeline option (runner). Executing pipeline using the
> default runner: DirectPipelineRunner.
> INFO:oauth2client.client:Attempting refresh to obtain initial access_token
> INFO:root:Starting finalize_write threads with num_shards: 1, num_threads:
> 1
> INFO:root:Renamed 1 shards in 0.02 seconds.
> INFO:root:Final: Debug counters: {'element_counts':
> Counter({('group/reify_windows', None): 28001, ('pair_with_one', None):
> 28001, ('split', None): 28001, 'read': 5525, ('group/group_by_window',
> None): 4784, ('format', None): 4784, ('count', None): 4784,
> 'group/group_by_key': 4784, ('write/WriteImpl/finalize_write', None): 1,
> ('write/WriteImpl/write_bundles', None): 1,
> ('write/WriteImpl/initialize_write', None): 1, 'write/WriteImpl/DoOnce':
> 1})}
> INFO:root:number of empty lines: 1663
> INFO:root:average word lengths: [4.204242705617657]
> # TODO: check that output file is generated for Direct Runner.
>
> # Run wordcount on the service.
>
> # Where to store wordcount output.
> GCS_LOCATION=gs://temp-storage-for-end-to-end-tests
>
> # Job name needs to be unique
> JOBNAME=py-wordcount-`date +%s`
> date +%s
>
> PROJECT=apache-beam-testing
>
> # Create a tarball
> python setup.py sdist
> running sdist
> running egg_info
> writing requirements to apache_beam_sdk.egg-info/requires.txt
> writing apache_beam_sdk.egg-info/PKG-INFO
> writing top-level names to apache_beam_sdk.egg-info/top_level.txt
> writing dependency_links to apache_beam_sdk.egg-info/dependency_links.txt
> reading manifest file 'apache_beam_sdk.egg-info/SOURCES.txt'
> writing manifest file 'apache_beam_sdk.egg-info/SOURCES.txt'
> warning: sdist: standard file not found: should have one of README,
> README.rst, README.txt
>
> running check
> warning: check: missing meta-data: if 'author' supplied, 'author_email'
> must be supplied too
>
> creating apache-beam-sdk-0.3.0
> creating apache-beam-sdk-0.3.0/apache_beam
> creating apache-beam-sdk-0.3.0/apache_beam/coders
> creating apache-beam-sdk-0.3.0/apache_beam/examples
> creating apache-beam-sdk-0.3.0/apache_beam/internal
> creating apache-beam-sdk-0.3.0/apache_beam/internal/clients
> creating apache-beam-sdk-0.3.0/apache_beam/internal/clients/bigquery
> creating apache-beam-sdk-0.3.0/apache_beam/internal/clients/dataflow
> creating apache-beam-sdk-0.3.0/apache_beam/internal/clients/storage
> creating apache-beam-sdk-0.3.0/apache_beam/io
> creating apache-beam-sdk-0.3.0/apache_beam/runners
> creating apache-beam-sdk-0.3.0/apache_beam/transforms
> creating apache-beam-sdk-0.3.0/apache_beam/typehints
> creating apache-beam-sdk-0.3.0/apache_beam/utils
> creating apache-beam-sdk-0.3.0/apache_beam_sdk.egg-info
> making hard links in apache-beam-sdk-0.3.0...
> hard linking setup.cfg -> apache-beam-sdk-0.3.0
> hard linking setup.py -> apache-beam-sdk-0.3.0
> hard linking apache_beam/__init__.py -> apache-beam-sdk-0.3.0/apache_beam
> hard linking apache_beam/dataflow_test.py ->
> apache-beam-sdk-0.3.0/apache_beam
> hard linking apache_beam/error.py -> apache-beam-sdk-0.3.0/apache_beam
> hard linking apache_beam/pipeline.py -> apache-beam-sdk-0.3.0/apache_beam
> hard linking apache_beam/pipeline_test.py ->
> apache-beam-sdk-0.3.0/apache_beam
> hard linking apache_beam/pvalue.py -> apache-beam-sdk-0.3.0/apache_beam
> hard linking apache_beam/pvalue_test.py ->
> apache-beam-sdk-0.3.0/apache_beam
> hard linking apache_beam/version.py -> apache-beam-sdk-0.3.0/apache_beam
> hard linking apache_beam/coders/__init__.py ->
> apache-beam-sdk-0.3.0/apache_beam/coders
> hard linking apache_beam/coders/coder_impl.pxd ->
> apache-beam-sdk-0.3.0/apache_beam/coders
> hard linking apache_beam/coders/coder_impl.py ->
> apache-beam-sdk-0.3.0/apache_beam/coders
> hard linking apache_beam/coders/coders.py ->
> apache-beam-sdk-0.3.0/apache_beam/coders
> hard linking apache_beam/coders/coders_test.py ->
> apache-beam-sdk-0.3.0/apache_beam/coders
> hard linking apache_beam/coders/coders_test_common.py ->
> apache-beam-sdk-0.3.0/apache_beam/coders
> hard linking apache_beam/coders/fast_coders_test.py ->
> apache-beam-sdk-0.3.0/apache_beam/coders
> hard linking apache_beam/coders/observable.py ->
> apache-beam-sdk-0.3.0/apache_beam/coders
> hard linking apache_beam/coders/observable_test.py ->
> apache-beam-sdk-0.3.0/apache_beam/coders
> hard linking apache_beam/coders/slow_coders_test.py ->
> apache-beam-sdk-0.3.0/apache_beam/coders
> hard 

Jenkins build is still unstable: beam_PostCommit_RunnableOnService_GoogleCloudDataflow #734

2016-07-11 Thread Apache Jenkins Server
See 




[jira] [Closed] (BEAM-394) AvroCoder is Serializable with non-serializable fields

2016-07-11 Thread Daniel Halperin (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-394?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Daniel Halperin closed BEAM-394.

   Resolution: Fixed
Fix Version/s: 0.2.0-incubating

> AvroCoder is Serializable with non-serializable fields
> --
>
> Key: BEAM-394
> URL: https://issues.apache.org/jira/browse/BEAM-394
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-core
>Reporter: Scott Wegner
>Priority: Minor
>  Labels: findbugs, newbie, starter
> Fix For: 0.2.0-incubating
>
>
> [FindBugs 
> SE_BAD_FIELD|https://github.com/apache/incubator-beam/blob/58a029a06aea1030279e5da8f9fa3114f456c1db/sdks/java/build-tools/src/main/resources/beam/findbugs-filter.xml#L30]:
>  Non-transient non-serializable instance field in serializable class.
> Applies to AvroCoder fields:
> * decoder
> * encoder
> * reader
> * writer
> * schema
> These should likely be marked transient, and we should ensure they are 
> properly initialized after deserialization.
> This is a good starter bug. When fixing, please remove the corresponding 
> entries from 
> [findbugs-filter.xml|https://github.com/apache/incubator-beam/blob/master/sdks/java/build-tools/src/main/resources/beam/findbugs-filter.xml]
>  and verify the build passes.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


Build failed in Jenkins: beam_PostCommit_PythonVerify #67

2016-07-11 Thread Apache Jenkins Server
See 

Changes:

[dhalperi] Fix warnings that came with newly released Pylint (1.6.1) version.

--
[...truncated 1247 lines...]
echo ">>> RUNNING DIRECT RUNNER py-wordcount"
>>> RUNNING DIRECT RUNNER py-wordcount
python -m apache_beam.examples.wordcount --output /tmp/py-wordcount-direct
INFO:root:Missing pipeline option (runner). Executing pipeline using the 
default runner: DirectPipelineRunner.
INFO:oauth2client.client:Attempting refresh to obtain initial access_token
INFO:root:Starting finalize_write threads with num_shards: 1, num_threads: 1
INFO:root:Renamed 1 shards in 0.02 seconds.
INFO:root:Final: Debug counters: {'element_counts': 
Counter({('group/reify_windows', None): 28001, ('pair_with_one', None): 28001, 
('split', None): 28001, 'read': 5525, ('group/group_by_window', None): 4784, 
('format', None): 4784, ('count', None): 4784, 'group/group_by_key': 4784, 
('write/WriteImpl/finalize_write', None): 1, ('write/WriteImpl/write_bundles', 
None): 1, ('write/WriteImpl/initialize_write', None): 1, 
'write/WriteImpl/DoOnce': 1})}
INFO:root:number of empty lines: 1663
INFO:root:average word lengths: [4.204242705617657]
# TODO: check that output file is generated for Direct Runner.

# Run wordcount on the service.

# Where to store wordcount output.
GCS_LOCATION=gs://temp-storage-for-end-to-end-tests

# Job name needs to be unique
JOBNAME=py-wordcount-`date +%s`
date +%s

PROJECT=apache-beam-testing

# Create a tarball
python setup.py sdist
running sdist
running egg_info
writing requirements to apache_beam_sdk.egg-info/requires.txt
writing apache_beam_sdk.egg-info/PKG-INFO
writing top-level names to apache_beam_sdk.egg-info/top_level.txt
writing dependency_links to apache_beam_sdk.egg-info/dependency_links.txt
reading manifest file 'apache_beam_sdk.egg-info/SOURCES.txt'
writing manifest file 'apache_beam_sdk.egg-info/SOURCES.txt'
warning: sdist: standard file not found: should have one of README, README.rst, 
README.txt

running check
warning: check: missing meta-data: if 'author' supplied, 'author_email' must be 
supplied too

creating apache-beam-sdk-0.3.0
creating apache-beam-sdk-0.3.0/apache_beam
creating apache-beam-sdk-0.3.0/apache_beam/coders
creating apache-beam-sdk-0.3.0/apache_beam/examples
creating apache-beam-sdk-0.3.0/apache_beam/internal
creating apache-beam-sdk-0.3.0/apache_beam/internal/clients
creating apache-beam-sdk-0.3.0/apache_beam/internal/clients/bigquery
creating apache-beam-sdk-0.3.0/apache_beam/internal/clients/dataflow
creating apache-beam-sdk-0.3.0/apache_beam/internal/clients/storage
creating apache-beam-sdk-0.3.0/apache_beam/io
creating apache-beam-sdk-0.3.0/apache_beam/runners
creating apache-beam-sdk-0.3.0/apache_beam/transforms
creating apache-beam-sdk-0.3.0/apache_beam/typehints
creating apache-beam-sdk-0.3.0/apache_beam/utils
creating apache-beam-sdk-0.3.0/apache_beam_sdk.egg-info
making hard links in apache-beam-sdk-0.3.0...
hard linking setup.cfg -> apache-beam-sdk-0.3.0
hard linking setup.py -> apache-beam-sdk-0.3.0
hard linking apache_beam/__init__.py -> apache-beam-sdk-0.3.0/apache_beam
hard linking apache_beam/dataflow_test.py -> apache-beam-sdk-0.3.0/apache_beam
hard linking apache_beam/error.py -> apache-beam-sdk-0.3.0/apache_beam
hard linking apache_beam/pipeline.py -> apache-beam-sdk-0.3.0/apache_beam
hard linking apache_beam/pipeline_test.py -> apache-beam-sdk-0.3.0/apache_beam
hard linking apache_beam/pvalue.py -> apache-beam-sdk-0.3.0/apache_beam
hard linking apache_beam/pvalue_test.py -> apache-beam-sdk-0.3.0/apache_beam
hard linking apache_beam/version.py -> apache-beam-sdk-0.3.0/apache_beam
hard linking apache_beam/coders/__init__.py -> 
apache-beam-sdk-0.3.0/apache_beam/coders
hard linking apache_beam/coders/coder_impl.pxd -> 
apache-beam-sdk-0.3.0/apache_beam/coders
hard linking apache_beam/coders/coder_impl.py -> 
apache-beam-sdk-0.3.0/apache_beam/coders
hard linking apache_beam/coders/coders.py -> 
apache-beam-sdk-0.3.0/apache_beam/coders
hard linking apache_beam/coders/coders_test.py -> 
apache-beam-sdk-0.3.0/apache_beam/coders
hard linking apache_beam/coders/coders_test_common.py -> 
apache-beam-sdk-0.3.0/apache_beam/coders
hard linking apache_beam/coders/fast_coders_test.py -> 
apache-beam-sdk-0.3.0/apache_beam/coders
hard linking apache_beam/coders/observable.py -> 
apache-beam-sdk-0.3.0/apache_beam/coders
hard linking apache_beam/coders/observable_test.py -> 
apache-beam-sdk-0.3.0/apache_beam/coders
hard linking apache_beam/coders/slow_coders_test.py -> 
apache-beam-sdk-0.3.0/apache_beam/coders
hard linking apache_beam/coders/slow_stream.py -> 
apache-beam-sdk-0.3.0/apache_beam/coders
hard linking apache_beam/coders/stream.pxd -> 
apache-beam-sdk-0.3.0/apache_beam/coders
hard linking apache_beam/coders/stream.pyx -> 
apache-beam-sdk-0.3.0/apache_beam/coders
hard linking apache_beam/coders/stream_test.py -> 

[GitHub] incubator-beam pull request #628: Fix warnings that came with newly released...

2016-07-11 Thread aaltay
Github user aaltay closed the pull request at:

https://github.com/apache/incubator-beam/pull/628


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[jira] [Commented] (BEAM-394) AvroCoder is Serializable with non-serializable fields

2016-07-11 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-394?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15371999#comment-15371999
 ] 

ASF GitHub Bot commented on BEAM-394:
-

Github user asfgit closed the pull request at:

https://github.com/apache/incubator-beam/pull/624


> AvroCoder is Serializable with non-serializable fields
> --
>
> Key: BEAM-394
> URL: https://issues.apache.org/jira/browse/BEAM-394
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-core
>Reporter: Scott Wegner
>Priority: Minor
>  Labels: findbugs, newbie, starter
>
> [FindBugs 
> SE_BAD_FIELD|https://github.com/apache/incubator-beam/blob/58a029a06aea1030279e5da8f9fa3114f456c1db/sdks/java/build-tools/src/main/resources/beam/findbugs-filter.xml#L30]:
>  Non-transient non-serializable instance field in serializable class.
> Applies to AvroCoder fields:
> * decoder
> * encoder
> * reader
> * writer
> * schema
> These should likely be marked transient, and we should ensure they are 
> properly initialized after deserialization.
> This is a good starter bug. When fixing, please remove the corresponding 
> entries from 
> [findbugs-filter.xml|https://github.com/apache/incubator-beam/blob/master/sdks/java/build-tools/src/main/resources/beam/findbugs-filter.xml]
>  and verify the build passes.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[2/2] incubator-beam git commit: AvroCoder: fix findbugs errors

2016-07-11 Thread dhalperi
AvroCoder: fix findbugs errors

* Updated the AvroCoder class to set unserializable fields as transient.
* Updated the findbugs_filter to remove ignored errors.
* Added tests.


Project: http://git-wip-us.apache.org/repos/asf/incubator-beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-beam/commit/d2ad9511
Tree: http://git-wip-us.apache.org/repos/asf/incubator-beam/tree/d2ad9511
Diff: http://git-wip-us.apache.org/repos/asf/incubator-beam/diff/d2ad9511

Branch: refs/heads/master
Commit: d2ad95116ab895a429d33f2f3edff897c10db2c6
Parents: 3653c5c
Author: Ilya Ganelin 
Authored: Mon Jul 11 14:36:45 2016 -0400
Committer: Dan Halperin 
Committed: Mon Jul 11 17:52:30 2016 -0700

--
 .../src/main/resources/beam/findbugs-filter.xml | 36 
 .../org/apache/beam/sdk/coders/AvroCoder.java   | 10 +++---
 .../apache/beam/sdk/coders/AvroCoderTest.java   | 26 ++
 3 files changed, 31 insertions(+), 41 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/d2ad9511/sdks/java/build-tools/src/main/resources/beam/findbugs-filter.xml
--
diff --git a/sdks/java/build-tools/src/main/resources/beam/findbugs-filter.xml 
b/sdks/java/build-tools/src/main/resources/beam/findbugs-filter.xml
index e15cf7b..ee55ef0 100644
--- a/sdks/java/build-tools/src/main/resources/beam/findbugs-filter.xml
+++ b/sdks/java/build-tools/src/main/resources/beam/findbugs-filter.xml
@@ -30,42 +30,6 @@
   the issue.
 -->
   
-
-
-
-
-  
-  
-
-
-
-
-  
-  
-
-
-
-
-  
-  
-
-
-
-
-  
-  
-
-
-
-
-  
-  
-
-
-
-
-  
-  
 
 
 

http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/d2ad9511/sdks/java/core/src/main/java/org/apache/beam/sdk/coders/AvroCoder.java
--
diff --git 
a/sdks/java/core/src/main/java/org/apache/beam/sdk/coders/AvroCoder.java 
b/sdks/java/core/src/main/java/org/apache/beam/sdk/coders/AvroCoder.java
index 00c1cbc..873a591 100644
--- a/sdks/java/core/src/main/java/org/apache/beam/sdk/coders/AvroCoder.java
+++ b/sdks/java/core/src/main/java/org/apache/beam/sdk/coders/AvroCoder.java
@@ -168,7 +168,7 @@ public class AvroCoder extends StandardCoder {
   };
 
   private final Class type;
-  private final Schema schema;
+  private final transient Schema schema;
 
   private final List nonDeterministicReasons;
 
@@ -178,10 +178,10 @@ public class AvroCoder extends StandardCoder {
   // Cache the old encoder/decoder and let the factories reuse them when 
possible. To be threadsafe,
   // these are ThreadLocal. This code does not need to be re-entrant as 
AvroCoder does not use
   // an inner coder.
-  private final ThreadLocal decoder;
-  private final ThreadLocal encoder;
-  private final ThreadLocal writer;
-  private final ThreadLocal reader;
+  private final transient ThreadLocal decoder;
+  private final transient ThreadLocal encoder;
+  private final transient ThreadLocal writer;
+  private final transient ThreadLocal reader;
 
   protected AvroCoder(Class type, Schema schema) {
 this.type = type;

http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/d2ad9511/sdks/java/core/src/test/java/org/apache/beam/sdk/coders/AvroCoderTest.java
--
diff --git 
a/sdks/java/core/src/test/java/org/apache/beam/sdk/coders/AvroCoderTest.java 
b/sdks/java/core/src/test/java/org/apache/beam/sdk/coders/AvroCoderTest.java
index 207bfdd..54f7ec1 100644
--- a/sdks/java/core/src/test/java/org/apache/beam/sdk/coders/AvroCoderTest.java
+++ b/sdks/java/core/src/test/java/org/apache/beam/sdk/coders/AvroCoderTest.java
@@ -62,6 +62,8 @@ import org.junit.runners.JUnit4;
 
 import java.io.ByteArrayInputStream;
 import java.io.ByteArrayOutputStream;
+import java.io.ObjectInputStream;
+import java.io.ObjectOutputStream;
 import java.util.ArrayList;
 import java.util.Arrays;
 import java.util.Collection;
@@ -148,6 +150,30 @@ public class AvroCoderTest {
 Matchers.containsInAnyOrder("@type", "type", "schema", "encoding_id"));
   }
 
+  /**
+   * Confirm that we can serialize and deserialize an AvroCoder object and 
still decode after.
+   * (BEAM-349).
+   *
+   * @throws Exception
+   */
+  @Test
+  public void testTransientFieldInitialization() throws Exception {
+Pojo value = new Pojo("Hello", 42);
+AvroCoder coder = AvroCoder.of(Pojo.class);
+
+//Serialization of object
+ByteArrayOutputStream bos = new ByteArrayOutputStream();
+ObjectOutputStream out = new ObjectOutputStream(bos);
+out.writeObject(coder);
+
+   

[1/2] incubator-beam git commit: Closes #624

2016-07-11 Thread dhalperi
Repository: incubator-beam
Updated Branches:
  refs/heads/master 3653c5c0b -> d93f564eb


Closes #624


Project: http://git-wip-us.apache.org/repos/asf/incubator-beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-beam/commit/d93f564e
Tree: http://git-wip-us.apache.org/repos/asf/incubator-beam/tree/d93f564e
Diff: http://git-wip-us.apache.org/repos/asf/incubator-beam/diff/d93f564e

Branch: refs/heads/master
Commit: d93f564eb66ca9f68e6e3b1713db6d05da604440
Parents: 3653c5c d2ad951
Author: Dan Halperin 
Authored: Mon Jul 11 17:52:30 2016 -0700
Committer: Dan Halperin 
Committed: Mon Jul 11 17:52:30 2016 -0700

--
 .../src/main/resources/beam/findbugs-filter.xml | 36 
 .../org/apache/beam/sdk/coders/AvroCoder.java   | 10 +++---
 .../apache/beam/sdk/coders/AvroCoderTest.java   | 26 ++
 3 files changed, 31 insertions(+), 41 deletions(-)
--




[GitHub] incubator-beam pull request #624: [BEAM-394] Updated the AvroCoder class to ...

2016-07-11 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/incubator-beam/pull/624


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[jira] [Commented] (BEAM-399) SerializableCoder.equals does not handle null

2016-07-11 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-399?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15371997#comment-15371997
 ] 

ASF GitHub Bot commented on BEAM-399:
-

Github user asfgit closed the pull request at:

https://github.com/apache/incubator-beam/pull/627


> SerializableCoder.equals does not handle null
> -
>
> Key: BEAM-399
> URL: https://issues.apache.org/jira/browse/BEAM-399
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-core
>Reporter: Scott Wegner
>Priority: Minor
>  Labels: findbugs, newbie, starter
>
> [FindBugs 
> NP_EQUALS_SHOULD_HANDLE_NULL_ARGUMENT|https://github.com/apache/incubator-beam/blob/58a029a06aea1030279e5da8f9fa3114f456c1db/sdks/java/build-tools/src/main/resources/beam/findbugs-filter.xml#L100]:
>  equals() method does not check for null argument
> Applies to: 
> [SerializableCoder|https://github.com/apache/incubator-beam/blob/58a029a06aea1030279e5da8f9fa3114f456c1db/sdks/java/core/src/main/java/org/apache/beam/sdk/coders/SerializableCoder.java].
> This is a good starter bug. When fixing, please remove the corresponding 
> entries from 
> [findbugs-filter.xml|https://github.com/apache/incubator-beam/blob/master/sdks/java/build-tools/src/main/resources/beam/findbugs-filter.xml]
>  and verify the build passes.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[GitHub] incubator-beam pull request #627: [BEAM-399] Updated the SerializableCoder c...

2016-07-11 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/incubator-beam/pull/627


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[1/2] incubator-beam git commit: Fix warnings that came with newly released Pylint (1.6.1) version.

2016-07-11 Thread dhalperi
Repository: incubator-beam
Updated Branches:
  refs/heads/python-sdk b5daba37f -> b6bb97f57


Fix warnings that came with newly released Pylint (1.6.1) version.


Project: http://git-wip-us.apache.org/repos/asf/incubator-beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-beam/commit/d57c23d5
Tree: http://git-wip-us.apache.org/repos/asf/incubator-beam/tree/d57c23d5
Diff: http://git-wip-us.apache.org/repos/asf/incubator-beam/diff/d57c23d5

Branch: refs/heads/python-sdk
Commit: d57c23d50ac0ec4baa4a226319bb1ccdd8ea21e0
Parents: b5daba3
Author: Ahmet Altay 
Authored: Mon Jul 11 13:49:50 2016 -0700
Committer: Dan Halperin 
Committed: Mon Jul 11 17:48:33 2016 -0700

--
 sdks/python/apache_beam/coders/coder_impl.py| 4 ++--
 sdks/python/apache_beam/examples/cookbook/combiners_test.py | 3 ---
 sdks/python/apache_beam/typehints/trivial_inference.py  | 1 -
 sdks/python/apache_beam/utils/retry.py  | 4 
 4 files changed, 2 insertions(+), 10 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/d57c23d5/sdks/python/apache_beam/coders/coder_impl.py
--
diff --git a/sdks/python/apache_beam/coders/coder_impl.py 
b/sdks/python/apache_beam/coders/coder_impl.py
index 8ff73bf..ca64d9c 100644
--- a/sdks/python/apache_beam/coders/coder_impl.py
+++ b/sdks/python/apache_beam/coders/coder_impl.py
@@ -28,7 +28,7 @@ coder_impl.pxd file for type hints.
 import collections
 
 
-# pylint: disable=wrong-import-order, wrong-import-position
+# pylint: disable=wrong-import-order, wrong-import-position, ungrouped-imports
 try:
   # Don't depend on the full dataflow sdk to test coders.
   from apache_beam.transforms.window import WindowedValue
@@ -42,7 +42,7 @@ try:
 except ImportError:
   from slow_stream import InputStream as create_InputStream
   from slow_stream import OutputStream as create_OutputStream
-# pylint: enable=wrong-import-order, wrong-import-position
+# pylint: enable=wrong-import-order, wrong-import-position, ungrouped-imports
 
 
 class CoderImpl(object):

http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/d57c23d5/sdks/python/apache_beam/examples/cookbook/combiners_test.py
--
diff --git a/sdks/python/apache_beam/examples/cookbook/combiners_test.py 
b/sdks/python/apache_beam/examples/cookbook/combiners_test.py
index 1b032e4..56f5e78 100644
--- a/sdks/python/apache_beam/examples/cookbook/combiners_test.py
+++ b/sdks/python/apache_beam/examples/cookbook/combiners_test.py
@@ -71,6 +71,3 @@ class CombinersTest(unittest.TestCase):
 if __name__ == '__main__':
   logging.getLogger().setLevel(logging.INFO)
   unittest.main()
-
-
-

http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/d57c23d5/sdks/python/apache_beam/typehints/trivial_inference.py
--
diff --git a/sdks/python/apache_beam/typehints/trivial_inference.py 
b/sdks/python/apache_beam/typehints/trivial_inference.py
index c8b2d72..e1fbc42 100644
--- a/sdks/python/apache_beam/typehints/trivial_inference.py
+++ b/sdks/python/apache_beam/typehints/trivial_inference.py
@@ -415,4 +415,3 @@ def infer_return_type_func(f, input_types, debug=False, 
depth=0):
   if debug:
 print f, id(f), input_types, '->', result
   return result
-

http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/d57c23d5/sdks/python/apache_beam/utils/retry.py
--
diff --git a/sdks/python/apache_beam/utils/retry.py 
b/sdks/python/apache_beam/utils/retry.py
index b4992d4..67cbeb2 100644
--- a/sdks/python/apache_beam/utils/retry.py
+++ b/sdks/python/apache_beam/utils/retry.py
@@ -191,7 +191,3 @@ def with_exponential_backoff(
 return wrapper
 
   return real_decorator
-
-
-
-



[2/2] incubator-beam git commit: Fixed equals method to handle null and added respective test.

2016-07-11 Thread dhalperi
Fixed equals method to handle null and added respective test.


Project: http://git-wip-us.apache.org/repos/asf/incubator-beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-beam/commit/732e8bf7
Tree: http://git-wip-us.apache.org/repos/asf/incubator-beam/tree/732e8bf7
Diff: http://git-wip-us.apache.org/repos/asf/incubator-beam/diff/732e8bf7

Branch: refs/heads/master
Commit: 732e8bf7eb205429d4313c0620bb3fb82ade5b3b
Parents: c5744cc
Author: Ilya Ganelin 
Authored: Mon Jul 11 16:05:51 2016 -0400
Committer: Dan Halperin 
Committed: Mon Jul 11 17:49:44 2016 -0700

--
 .../build-tools/src/main/resources/beam/findbugs-filter.xml| 6 --
 .../java/org/apache/beam/sdk/coders/SerializableCoder.java | 6 ++
 .../java/org/apache/beam/sdk/coders/SerializableCoderTest.java | 6 ++
 3 files changed, 8 insertions(+), 10 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/732e8bf7/sdks/java/build-tools/src/main/resources/beam/findbugs-filter.xml
--
diff --git a/sdks/java/build-tools/src/main/resources/beam/findbugs-filter.xml 
b/sdks/java/build-tools/src/main/resources/beam/findbugs-filter.xml
index 1226cd1..e15cf7b 100644
--- a/sdks/java/build-tools/src/main/resources/beam/findbugs-filter.xml
+++ b/sdks/java/build-tools/src/main/resources/beam/findbugs-filter.xml
@@ -83,12 +83,6 @@
 
   
   
-
-
-
-
-  
-  
 
 
 

http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/732e8bf7/sdks/java/core/src/main/java/org/apache/beam/sdk/coders/SerializableCoder.java
--
diff --git 
a/sdks/java/core/src/main/java/org/apache/beam/sdk/coders/SerializableCoder.java
 
b/sdks/java/core/src/main/java/org/apache/beam/sdk/coders/SerializableCoder.java
index 310ecb7..0995bdc 100644
--- 
a/sdks/java/core/src/main/java/org/apache/beam/sdk/coders/SerializableCoder.java
+++ 
b/sdks/java/core/src/main/java/org/apache/beam/sdk/coders/SerializableCoder.java
@@ -166,10 +166,8 @@ public class SerializableCoder 
extends AtomicCoder {
 
   @Override
   public boolean equals(Object other) {
-if (getClass() != other.getClass()) {
-  return false;
-}
-return type == ((SerializableCoder) other).type;
+return !(other == null || getClass() != other.getClass())
+&& type == ((SerializableCoder) other).type;
   }
 
   @Override

http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/732e8bf7/sdks/java/core/src/test/java/org/apache/beam/sdk/coders/SerializableCoderTest.java
--
diff --git 
a/sdks/java/core/src/test/java/org/apache/beam/sdk/coders/SerializableCoderTest.java
 
b/sdks/java/core/src/test/java/org/apache/beam/sdk/coders/SerializableCoderTest.java
index f79f243..d6423e5 100644
--- 
a/sdks/java/core/src/test/java/org/apache/beam/sdk/coders/SerializableCoderTest.java
+++ 
b/sdks/java/core/src/test/java/org/apache/beam/sdk/coders/SerializableCoderTest.java
@@ -130,6 +130,12 @@ public class SerializableCoderTest implements Serializable 
{
   }
 
   @Test
+  public void testNullEquals() {
+SerializableCoder coder = SerializableCoder.of(MyRecord.class);
+Assert.assertFalse(coder.equals(null));
+  }
+
+  @Test
   @Category(NeedsRunner.class)
   public void testDefaultCoder() throws Exception {
 Pipeline p = TestPipeline.create();



[2/2] incubator-beam git commit: Closes #628

2016-07-11 Thread dhalperi
Closes #628


Project: http://git-wip-us.apache.org/repos/asf/incubator-beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-beam/commit/b6bb97f5
Tree: http://git-wip-us.apache.org/repos/asf/incubator-beam/tree/b6bb97f5
Diff: http://git-wip-us.apache.org/repos/asf/incubator-beam/diff/b6bb97f5

Branch: refs/heads/python-sdk
Commit: b6bb97f57b3fe898f629f6c9a71f6100e5af2c6d
Parents: b5daba3 d57c23d
Author: Dan Halperin 
Authored: Mon Jul 11 17:48:34 2016 -0700
Committer: Dan Halperin 
Committed: Mon Jul 11 17:48:34 2016 -0700

--
 sdks/python/apache_beam/coders/coder_impl.py| 4 ++--
 sdks/python/apache_beam/examples/cookbook/combiners_test.py | 3 ---
 sdks/python/apache_beam/typehints/trivial_inference.py  | 1 -
 sdks/python/apache_beam/utils/retry.py  | 4 
 4 files changed, 2 insertions(+), 10 deletions(-)
--




[GitHub] incubator-beam pull request #636: Remove more tests from nose tests exclusio...

2016-07-11 Thread aaltay
GitHub user aaltay opened a pull request:

https://github.com/apache/incubator-beam/pull/636

Remove more tests from nose tests exclusion list

Be sure to do all of the following to help us incorporate your contribution
quickly and easily:

 - [ ] Make sure the PR title is formatted like:
   `[BEAM-] Description of pull request`
 - [ ] Make sure tests pass via `mvn clean verify`. (Even better, enable
   Travis-CI on your fork and ensure the whole test matrix passes).
 - [ ] Replace `` in the title with the actual Jira issue
   number, if there is one.
 - [ ] If this contribution is large, please file an Apache
   [Individual Contributor License 
Agreement](https://www.apache.org/licenses/icla.txt).

---


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/aaltay/incubator-beam assert

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/incubator-beam/pull/636.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #636


commit d34a9665ea4ce25390d9ac1153c57075ed0217a3
Author: Ahmet Altay 
Date:   2016-07-12T00:08:40Z

Remove examples from the nose tests exclusion list

commit 341a200de6fcc38241b53224c3cfdd39a817d3c6
Author: Ahmet Altay 
Date:   2016-07-12T00:44:08Z

Remove non-existent tests from the exclusion list




---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[jira] [Commented] (BEAM-354) Modify DatastoreIO to use Datastore v1beta3 API

2016-07-11 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-354?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15371964#comment-15371964
 ] 

ASF GitHub Bot commented on BEAM-354:
-

GitHub user vikkyrk opened a pull request:

https://github.com/apache/incubator-beam/pull/635

BEAM-354: End-to-end tests for V1Beta3 Datastore

Be sure to do all of the following to help us incorporate your contribution
quickly and easily:

 - [ ] Make sure the PR title is formatted like:
   `[BEAM-] Description of pull request`
 - [ ] Make sure tests pass via `mvn clean verify`. (Even better, enable
   Travis-CI on your fork and ensure the whole test matrix passes).
 - [ ] Replace `` in the title with the actual Jira issue
   number, if there is one.
 - [ ] If this contribution is large, please file an Apache
   [Individual Contributor License 
Agreement](https://www.apache.org/licenses/icla.txt).

---

- Add end-to-end tests for v1beta3 DatastoreIO
- Common helper functions are under V1Beta3TestUtils, which contains a direct 
reader and writer for Datastore. 

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/vikkyrk/incubator-beam 
vikasrk/datastoreIOE2ETest

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/incubator-beam/pull/635.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #635


commit 3f24cf227b6550a5af7c4497108293981a0c529a
Author: Vikas Kedigehalli 
Date:   2016-07-11T18:31:48Z

datastore e2e tests

commit 4e9d7540349337e1e4b576f5e907d7fab1b1e5d4
Author: Vikas Kedigehalli 
Date:   2016-07-11T23:50:46Z

Clean up

commit 468124b8887727112b643bd004a82589515b9fae
Author: Vikas Kedigehalli 
Date:   2016-07-12T00:21:31Z

Checkstyle and some comments




> Modify DatastoreIO to use Datastore v1beta3 API
> ---
>
> Key: BEAM-354
> URL: https://issues.apache.org/jira/browse/BEAM-354
> Project: Beam
>  Issue Type: Improvement
>  Components: sdk-java-gcp
>Affects Versions: 0.2.0-incubating
>Reporter: Vikas Kedigehalli
>Assignee: Vikas Kedigehalli
>
> Datastore v1beta2 API is getting deprecated in favor of v1beta3. Hence the 
> DatastoreIO needs to be migrated to use the new version. Also in the process 
> of doing so, this is a good time to add a level of indirection via a 
> PTransform such that future changes in the Datastore API would not result in 
> changing user/pipeline code. 



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[GitHub] incubator-beam pull request #635: BEAM-354: End-to-end tests for V1Beta3 Dat...

2016-07-11 Thread vikkyrk
GitHub user vikkyrk opened a pull request:

https://github.com/apache/incubator-beam/pull/635

BEAM-354: End-to-end tests for V1Beta3 Datastore

Be sure to do all of the following to help us incorporate your contribution
quickly and easily:

 - [ ] Make sure the PR title is formatted like:
   `[BEAM-] Description of pull request`
 - [ ] Make sure tests pass via `mvn clean verify`. (Even better, enable
   Travis-CI on your fork and ensure the whole test matrix passes).
 - [ ] Replace `` in the title with the actual Jira issue
   number, if there is one.
 - [ ] If this contribution is large, please file an Apache
   [Individual Contributor License 
Agreement](https://www.apache.org/licenses/icla.txt).

---

- Add end-to-end tests for v1beta3 DatastoreIO
- Common helper functions are under V1Beta3TestUtils, which contains a direct 
reader and writer for Datastore. 

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/vikkyrk/incubator-beam 
vikasrk/datastoreIOE2ETest

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/incubator-beam/pull/635.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #635


commit 3f24cf227b6550a5af7c4497108293981a0c529a
Author: Vikas Kedigehalli 
Date:   2016-07-11T18:31:48Z

datastore e2e tests

commit 4e9d7540349337e1e4b576f5e907d7fab1b1e5d4
Author: Vikas Kedigehalli 
Date:   2016-07-11T23:50:46Z

Clean up

commit 468124b8887727112b643bd004a82589515b9fae
Author: Vikas Kedigehalli 
Date:   2016-07-12T00:21:31Z

Checkstyle and some comments




---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[jira] [Resolved] (BEAM-379) DisplayDataEvaluator does not support source transforms of the form PTransform<PBegin, ?>

2016-07-11 Thread Vikas Kedigehalli (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-379?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vikas Kedigehalli resolved BEAM-379.

   Resolution: Fixed
Fix Version/s: 0.2.0-incubating

> DisplayDataEvaluator does not support source transforms of the form 
> PTransform
> -
>
> Key: BEAM-379
> URL: https://issues.apache.org/jira/browse/BEAM-379
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-core
>Reporter: Vikas Kedigehalli
>Assignee: Vikas Kedigehalli
> Fix For: 0.2.0-incubating
>
>
> DisplayDataEvaluator 
> (https://github.com/apache/incubator-beam/blob/c0efe568e5291298c1394016a12e7979b37afc44/sdks/java/core/src/test/java/org/apache/beam/sdk/transforms/display/DisplayDataEvaluator.java#L81)
>  takes PTransform<? super PCollection<InputT>, ? extends POutput>, but this 
> doesn't work for source transforms of the form PTransform<PBegin, PCollection<?>>. 



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[GitHub] incubator-beam pull request #633: Internal cleanup.

2016-07-11 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/incubator-beam/pull/633


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[2/2] incubator-beam git commit: Closes #633

2016-07-11 Thread robertwb
Closes #633


Project: http://git-wip-us.apache.org/repos/asf/incubator-beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-beam/commit/b5daba37
Tree: http://git-wip-us.apache.org/repos/asf/incubator-beam/tree/b5daba37
Diff: http://git-wip-us.apache.org/repos/asf/incubator-beam/diff/b5daba37

Branch: refs/heads/python-sdk
Commit: b5daba37fa285d428a1d8ffe9106b7fe265b2914
Parents: 73168b2 c7b8c68
Author: Robert Bradshaw 
Authored: Mon Jul 11 16:47:24 2016 -0700
Committer: Robert Bradshaw 
Committed: Mon Jul 11 16:47:24 2016 -0700

--
 sdks/python/apache_beam/coders/coder_impl.pxd   |  3 ++-
 sdks/python/apache_beam/pvalue.py   |  3 +++
 .../apache_beam/runners/dataflow_runner.py  | 25 +++-
 sdks/python/apache_beam/transforms/core.py  | 18 ++
 4 files changed, 33 insertions(+), 16 deletions(-)
--




[1/2] incubator-beam git commit: Internal cleanup.

2016-07-11 Thread robertwb
Repository: incubator-beam
Updated Branches:
  refs/heads/python-sdk 73168b220 -> b5daba37f


Internal cleanup.


Project: http://git-wip-us.apache.org/repos/asf/incubator-beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-beam/commit/c7b8c688
Tree: http://git-wip-us.apache.org/repos/asf/incubator-beam/tree/c7b8c688
Diff: http://git-wip-us.apache.org/repos/asf/incubator-beam/diff/c7b8c688

Branch: refs/heads/python-sdk
Commit: c7b8c688e1bf3d44f6c9072d58a173a2618a5e13
Parents: 73168b2
Author: Robert Bradshaw 
Authored: Mon Jul 11 15:20:25 2016 -0700
Committer: Robert Bradshaw 
Committed: Mon Jul 11 15:20:25 2016 -0700

--
 sdks/python/apache_beam/coders/coder_impl.pxd   |  3 ++-
 sdks/python/apache_beam/pvalue.py   |  3 +++
 .../apache_beam/runners/dataflow_runner.py  | 25 +++-
 sdks/python/apache_beam/transforms/core.py  | 18 ++
 4 files changed, 33 insertions(+), 16 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/c7b8c688/sdks/python/apache_beam/coders/coder_impl.pxd
--
diff --git a/sdks/python/apache_beam/coders/coder_impl.pxd 
b/sdks/python/apache_beam/coders/coder_impl.pxd
index a86d08b..0c92c5b 100644
--- a/sdks/python/apache_beam/coders/coder_impl.pxd
+++ b/sdks/python/apache_beam/coders/coder_impl.pxd
@@ -29,7 +29,8 @@ from .stream cimport InputStream, OutputStream
 
 
 cdef object loads, dumps, create_InputStream, create_OutputStream
-cdef type WindowedValue
+# Temporarily untyped to allow monkeypatching on failed import.
+#cdef type WindowedValue
 
 
 cdef class CoderImpl(object):

http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/c7b8c688/sdks/python/apache_beam/pvalue.py
--
diff --git a/sdks/python/apache_beam/pvalue.py 
b/sdks/python/apache_beam/pvalue.py
index 78ff209..8552a45 100644
--- a/sdks/python/apache_beam/pvalue.py
+++ b/sdks/python/apache_beam/pvalue.py
@@ -197,6 +197,9 @@ class DoOutputsTuple(object):
 # Transfer the producer from the DoOutputsTuple to the resulting
 # PCollection.
 pcoll.producer = self.producer
+# Add this as an output to both the inner ParDo and the outer _MultiParDo
+# PTransforms.
+self.producer.parts[0].add_output(pcoll, tag)
 self.producer.add_output(pcoll, tag)
 self._pcolls[tag] = pcoll
 return pcoll

http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/c7b8c688/sdks/python/apache_beam/runners/dataflow_runner.py
--
diff --git a/sdks/python/apache_beam/runners/dataflow_runner.py 
b/sdks/python/apache_beam/runners/dataflow_runner.py
index 43a50ba..24edb05 100644
--- a/sdks/python/apache_beam/runners/dataflow_runner.py
+++ b/sdks/python/apache_beam/runners/dataflow_runner.py
@@ -190,7 +190,8 @@ class DataflowPipelineRunner(PipelineRunner):
 return self._get_cloud_encoding(self._get_coder(typehint,
 window_coder=window_coder))
 
-  def _get_coder(self, typehint, window_coder):
+  @staticmethod
+  def _get_coder(typehint, window_coder):
 """Returns a coder based on a typehint object."""
 if window_coder:
   return coders.WindowedValueCoder(
@@ -360,27 +361,20 @@ class DataflowPipelineRunner(PipelineRunner):
 
 # Attach side inputs.
 si_dict = {}
-si_tags_and_types = []
+lookup_label = lambda side_pval: 
self._cache.get_pvalue(side_pval).step_name
 for side_pval in transform_node.side_inputs:
   assert isinstance(side_pval, PCollectionView)
-  side_input_step = self._cache.get_pvalue(side_pval)
-  si_label = side_input_step.step_name
+  si_label = lookup_label(side_pval)
   si_dict[si_label] = {
   '@type': 'OutputReference',
   PropertyNames.STEP_NAME: si_label,
   PropertyNames.OUTPUT_NAME: PropertyNames.OUT}
-  # The label for the side input step will appear as a 'tag' property for
-  # the side input source specification. Its type (singleton or iterator)
-  # will also be used to read the entire source or just first element.
-  si_tags_and_types.append((si_label, side_pval.__class__,
-side_pval._view_options()))  # pylint: 
disable=protected-access
 
 # Now create the step for the ParDo transform being handled.
 step = self._add_step(
 TransformNames.DO, transform_node.full_label, transform_node,
 transform_node.transform.side_output_tags)
-fn_data = (transform.fn, transform.args, transform.kwargs,
-   si_tags_and_types, transform_node.inputs[0].windowing)
+fn_data = self._pardo_fn_data(transform_node, lookup_label)
 

[jira] [Commented] (BEAM-433) Make Beam examples runners agnostic

2016-07-11 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-433?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15371880#comment-15371880
 ] 

ASF GitHub Bot commented on BEAM-433:
-

GitHub user peihe opened a pull request:

https://github.com/apache/incubator-beam/pull/634

[BEAM-433] Remove references to JobName in examples pipeline options.



You can merge this pull request into a Git repository by running:

$ git pull https://github.com/peihe/incubator-beam cleanup-examples-utils

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/incubator-beam/pull/634.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #634


commit 21e9918b14a6cff069c2e5857e31c69e0ab48447
Author: Pei He 
Date:   2016-07-11T21:22:15Z

[BEAM-433] Change the ExampleUtils constructor takes PipelineOptions

commit 1ebb06781bd3e08c949e0516a81eb842d81e5eda
Author: Pei He 
Date:   2016-07-11T21:05:46Z

[BEAM-433] Remove reference to JobName in examples pipeline options.




> Make Beam examples runners agnostic
> ---
>
> Key: BEAM-433
> URL: https://issues.apache.org/jira/browse/BEAM-433
> Project: Beam
>  Issue Type: Improvement
>Reporter: Pei He
>Assignee: Pei He
>
> Beam examples are ported from Dataflow, and they heavily reference 
> Dataflow classes.
> There are the following cleanup tasks:
> 1. Remove Dataflow streaming and batch injector setup (Done).
> 2. Remove references to DataflowPipelineOptions.
> 3. Move cancel() from DataflowPipelineJob to PipelineResult.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[GitHub] incubator-beam pull request #634: [BEAM-433] Remove references to JobName in...

2016-07-11 Thread peihe
GitHub user peihe opened a pull request:

https://github.com/apache/incubator-beam/pull/634

[BEAM-433] Remove references to JobName in examples pipeline options.



You can merge this pull request into a Git repository by running:

$ git pull https://github.com/peihe/incubator-beam cleanup-examples-utils

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/incubator-beam/pull/634.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #634


commit 21e9918b14a6cff069c2e5857e31c69e0ab48447
Author: Pei He 
Date:   2016-07-11T21:22:15Z

[BEAM-433] Change the ExampleUtils constructor takes PipelineOptions

commit 1ebb06781bd3e08c949e0516a81eb842d81e5eda
Author: Pei He 
Date:   2016-07-11T21:05:46Z

[BEAM-433] Remove reference to JobName in examples pipeline options.




---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[GitHub] incubator-beam pull request #629: Pickle only used symbols from __main__ nam...

2016-07-11 Thread robertwb
Github user robertwb closed the pull request at:

https://github.com/apache/incubator-beam/pull/629


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[1/2] incubator-beam git commit: Pickle only used symbols from __main__ namespace.

2016-07-11 Thread robertwb
Repository: incubator-beam
Updated Branches:
  refs/heads/apache_beam [created] cd3bcd51f


Pickle only used symbols from __main__ namespace.


Project: http://git-wip-us.apache.org/repos/asf/incubator-beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-beam/commit/2809edc5
Tree: http://git-wip-us.apache.org/repos/asf/incubator-beam/tree/2809edc5
Diff: http://git-wip-us.apache.org/repos/asf/incubator-beam/diff/2809edc5

Branch: refs/heads/apache_beam
Commit: 2809edc5c05701c2d97d93cd355337bc86a368e3
Parents: 73168b2
Author: Robert Bradshaw 
Authored: Mon Jul 11 14:06:39 2016 -0700
Committer: Robert Bradshaw 
Committed: Mon Jul 11 15:59:07 2016 -0700

--
 sdks/python/apache_beam/internal/pickler.py  | 18 ++
 sdks/python/apache_beam/internal/pickler_test.py |  3 +++
 2 files changed, 21 insertions(+)
--


http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/2809edc5/sdks/python/apache_beam/internal/pickler.py
--
diff --git a/sdks/python/apache_beam/internal/pickler.py 
b/sdks/python/apache_beam/internal/pickler.py
index 898e04b..f427aa5 100644
--- a/sdks/python/apache_beam/internal/pickler.py
+++ b/sdks/python/apache_beam/internal/pickler.py
@@ -159,6 +159,24 @@ if 'save_module' in dir(dill.dill):
   return old_save_module_dict(pickler, obj)
   dill.dill.save_module_dict = new_save_module_dict
 
+
+  old_save_function = dill.dill.save_function
+
+  @dill.dill.register(types.FunctionType)
+  def new_save_function(pickler, obj):
+globs = obj.__globals__ if dill.dill.PY3 else obj.func_globals
+if (dill.dill.is_dill(pickler) and globs == pickler._main.__dict__
+and not pickler._recurse):
+  try:
+pickler._recurse = True
+return old_save_function(pickler, obj)
+  finally:
+pickler._recurse = False
+else:
+  return old_save_function(pickler, obj)
+  dill.dill.save_function = new_save_function
+
+
   def _nest_dill_logging():
 """Prefix all dill logging with its depth in the callstack.
 

http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/2809edc5/sdks/python/apache_beam/internal/pickler_test.py
--
diff --git a/sdks/python/apache_beam/internal/pickler_test.py 
b/sdks/python/apache_beam/internal/pickler_test.py
index 20de923..0952a26 100644
--- a/sdks/python/apache_beam/internal/pickler_test.py
+++ b/sdks/python/apache_beam/internal/pickler_test.py
@@ -40,6 +40,9 @@ class PicklerTest(unittest.TestCase):
 ['abc', 'def'],
 loads(dumps(module_test.get_lambda_with_globals()))('abc def'))
 
+  def test_lambda_with_main_globals(self):
+self.assertEquals(unittest, loads(dumps(lambda: unittest))())
+
   def test_lambda_with_closure(self):
 """Tests that the closure of a function is preserved."""
 self.assertEquals(



[2/2] incubator-beam git commit: Closes #629

2016-07-11 Thread robertwb
Closes #629


Project: http://git-wip-us.apache.org/repos/asf/incubator-beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-beam/commit/cd3bcd51
Tree: http://git-wip-us.apache.org/repos/asf/incubator-beam/tree/cd3bcd51
Diff: http://git-wip-us.apache.org/repos/asf/incubator-beam/diff/cd3bcd51

Branch: refs/heads/apache_beam
Commit: cd3bcd51fa83e19b7d284d6394da98d1462c5732
Parents: 73168b2 2809edc
Author: Robert Bradshaw 
Authored: Mon Jul 11 15:59:15 2016 -0700
Committer: Robert Bradshaw 
Committed: Mon Jul 11 15:59:15 2016 -0700

--
 sdks/python/apache_beam/internal/pickler.py  | 18 ++
 sdks/python/apache_beam/internal/pickler_test.py |  3 +++
 2 files changed, 21 insertions(+)
--




[jira] [Updated] (BEAM-440) Create.values() returns a type-unsafe Coder

2016-07-11 Thread Daniel Halperin (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-440?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Daniel Halperin updated BEAM-440:
-
Labels: newbie starter  (was: )

> Create.values() returns a type-unsafe Coder
> ---
>
> Key: BEAM-440
> URL: https://issues.apache.org/jira/browse/BEAM-440
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-core
>Reporter: Daniel Halperin
>  Labels: newbie, starter
>
> Create.values() with no arguments will default to a VoidCoder, unless one is 
> set later with #setCoder(Coder).
> Although it will encode its input correctly, this seems like a bad choice in 
> many cases. E.g., with Flatten:
> PCollection> initial = p.apply("First", 
> Create.>of());
> PCollection> second =
> p.apply("Second", Create.of("a", "b")).apply(ParDo.of(new 
> MyAvroDoFn()));
> PCollectionList
> .of(initial).and(second)
> .apply(Flatten.>pCollections());
> This crashes trying to cast a KV from "Second" to a Void.class.
> 1. Suggest throwing a warning in #getDefaultOutputCoder when defaulting to 
> VoidCoder for an empty elements list. Should this be an error?
> 2. Suggest adding something like Create.empty(TypeDescriptor) to handle this 
> case properly.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (BEAM-440) Create.values() returns a type-unsafe Coder

2016-07-11 Thread Daniel Halperin (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-440?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15371738#comment-15371738
 ] 

Daniel Halperin commented on BEAM-440:
--

This is a good starter issue.

> Create.values() returns a type-unsafe Coder
> ---
>
> Key: BEAM-440
> URL: https://issues.apache.org/jira/browse/BEAM-440
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-core
>Reporter: Daniel Halperin
>  Labels: newbie, starter
>
> Create.values() with no arguments will default to a VoidCoder, unless one is 
> set later with #setCoder(Coder).
> Although it will encode its input correctly, this seems like a bad choice in 
> many cases. E.g., with Flatten:
> PCollection> initial = p.apply("First", 
> Create.>of());
> PCollection> second =
> p.apply("Second", Create.of("a", "b")).apply(ParDo.of(new 
> MyAvroDoFn()));
> PCollectionList
> .of(initial).and(second)
> .apply(Flatten.>pCollections());
> This crashes trying to cast a KV from "Second" to a Void.class.
> 1. Suggest throwing a warning in #getDefaultOutputCoder when defaulting to 
> VoidCoder for an empty elements list. Should this be an error?
> 2. Suggest adding something like Create.empty(TypeDescriptor) to handle this 
> case properly.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Created] (BEAM-440) Create.values() returns a type-unsafe Coder

2016-07-11 Thread Daniel Halperin (JIRA)
Daniel Halperin created BEAM-440:


 Summary: Create.values() returns a type-unsafe Coder
 Key: BEAM-440
 URL: https://issues.apache.org/jira/browse/BEAM-440
 Project: Beam
  Issue Type: Bug
  Components: sdk-java-core
Reporter: Daniel Halperin


Create.values() with no arguments will default to a VoidCoder, unless one is 
set later with #setCoder(Coder).

Although it will encode its input correctly, this seems like a bad choice in 
many cases. E.g., with Flatten:

PCollection> initial = p.apply("First", 
Create.>of());
PCollection> second =
p.apply("Second", Create.of("a", "b")).apply(ParDo.of(new 
MyAvroDoFn()));
PCollectionList
.of(initial).and(second)
.apply(Flatten.>pCollections());

This crashes trying to cast a KV from "Second" to a Void.class.

1. Suggest throwing a warning in #getDefaultOutputCoder when defaulting to 
VoidCoder for an empty elements list. Should this be an error?

2. Suggest adding something like Create.empty(TypeDescriptor) to handle this 
case properly.
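
A minimal sketch of the workaround implied above, i.e. setting the element coder 
explicitly so the empty Create does not fall back to VoidCoder. The KV<String, Long> 
element type and the coders below are placeholders (the original generic arguments 
are not preserved in this archive), not part of the report itself:

{code}
// Sketch only: KV<String, Long> stands in for the real element type. Pinning the
// coder on the resulting PCollection avoids the VoidCoder default that an empty
// Create.<T>of() otherwise produces.
Pipeline p = Pipeline.create(PipelineOptionsFactory.create());

PCollection<KV<String, Long>> initial = p
    .apply("First", Create.<KV<String, Long>>of())
    .setCoder(KvCoder.of(StringUtf8Coder.of(), VarLongCoder.of()));
{code}

With the coder pinned this way, the Flatten in the example should no longer try to 
encode the second collection's elements with a VoidCoder.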



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


Jenkins build is still unstable: beam_PostCommit_RunnableOnService_GoogleCloudDataflow #733

2016-07-11 Thread Apache Jenkins Server
See 




[jira] [Commented] (BEAM-434) When examples write output to file it creates many output files instead of one

2016-07-11 Thread Kenneth Knowles (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-434?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15371700#comment-15371700
 ] 

Kenneth Knowles commented on BEAM-434:
--

I like all of these, but 2 and 3 are actually a bit better than 1 for the reason 
you say - it lets users know that output is sharded when they just look at the 
output files.

For the same reason, I prefer 2 over 3 as it lets users know from the "other 
end" that sharding has to be controlled explicitly.

> When examples write output to file it creates many output files instead of one
> --
>
> Key: BEAM-434
> URL: https://issues.apache.org/jira/browse/BEAM-434
> Project: Beam
>  Issue Type: Bug
>  Components: examples-java
>Reporter: Amit Sela
>Assignee: Amit Sela
>Priority: Minor
>
> When using `TextIO.Write.to("/path/to/output")` without any restrictions on 
> the number of shards, it might generate many output files (depending on your 
> input); for WordCount, for example, you'll get as many output files as unique 
> words in your input.
> Since I think examples are expected to execute in a friendly manner, to "see" 
> what they do rather than optimize for performance, I suggest using 
> `withoutSharding()` when writing the example output to an output file.
> Examples I could find that behave this way:
> org.apache.beam.examples.WordCount
> org.apache.beam.examples.complete.TfIdf
> org.apache.beam.examples.cookbook.DeDupExample



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[GitHub] incubator-beam pull request #632: [BEAM-439] Remove optional args from start...

2016-07-11 Thread silviulica
GitHub user silviulica opened a pull request:

https://github.com/apache/incubator-beam/pull/632

[BEAM-439] Remove optional args from start/finish bundle methods

@robertwb can you please take a look?
Technically we do not want to allow deferred side inputs, but it is much 
simpler conceptually to remove the extra arguments altogether. 
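
For context, a hedged sketch of what a DoFn looks like once start_bundle and 
finish_bundle take no extra arguments; the method signatures follow the python-sdk 
conventions of this period and the class itself is illustrative, not from the PR:

{code}
import logging

import apache_beam as beam


class CountingDoFn(beam.DoFn):
  """Illustrative DoFn: per-bundle state without side inputs in bundle methods."""

  def start_bundle(self, context):
    # Only the context is accepted here; side inputs are no longer passed.
    self._count = 0

  def process(self, context, multiplier):
    # Side inputs remain available on process() as extra arguments.
    self._count += 1
    yield context.element * multiplier

  def finish_bundle(self, context):
    logging.info('Processed %d elements in this bundle', self._count)
{code}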


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/silviulica/incubator-beam start_finish_no_args

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/incubator-beam/pull/632.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #632


commit d5a0ec6eb67b5d87e1557be83dbcbc07bbedf36d
Author: Silviu Calinoiu 
Date:   2016-07-11T21:21:44Z

Remove optional args from start/finish bundle methods




---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[jira] [Created] (BEAM-439) The start_bundle/finish_bundle should not allow side inputs

2016-07-11 Thread Silviu Calinoiu (JIRA)
Silviu Calinoiu created BEAM-439:


 Summary: The start_bundle/finish_bundle should not allow side 
inputs
 Key: BEAM-439
 URL: https://issues.apache.org/jira/browse/BEAM-439
 Project: Beam
  Issue Type: Bug
  Components: sdk-py
Reporter: Silviu Calinoiu
Assignee: Frances Perry
Priority: Minor


Allowing them will create headaches when we support streaming.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (BEAM-434) When examples write output to file it creates many output files instead of one

2016-07-11 Thread Daniel Halperin (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-434?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15371666#comment-15371666
 ] 

Daniel Halperin commented on BEAM-434:
--

[~bchambers] confirmed that the Direct Runner is producing a bundle per key, 
and proposed #3 below

> When examples write output to file it creates many output files instead of one
> --
>
> Key: BEAM-434
> URL: https://issues.apache.org/jira/browse/BEAM-434
> Project: Beam
>  Issue Type: Bug
>  Components: examples-java
>Reporter: Amit Sela
>Assignee: Amit Sela
>Priority: Minor
>
> When using `TextIO.Write.to("/path/to/output")` without any restrictions on 
> the number of shards, it might generate many output files (depending on your 
> input); for WordCount, for example, you'll get as many output files as unique 
> words in your input.
> Since I think examples are expected to execute in a friendly manner, to "see" 
> what they do rather than optimize for performance, I suggest using 
> `withoutSharding()` when writing the example output to an output file.
> Examples I could find that behave this way:
> org.apache.beam.examples.WordCount
> org.apache.beam.examples.complete.TfIdf
> org.apache.beam.examples.cookbook.DeDupExample



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (BEAM-434) When examples write output to file it creates many output files instead of one

2016-07-11 Thread Daniel Halperin (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-434?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15371665#comment-15371665
 ] 

Daniel Halperin commented on BEAM-434:
--

cc [~bchambers]

> When examples write output to file it creates many output files instead of one
> --
>
> Key: BEAM-434
> URL: https://issues.apache.org/jira/browse/BEAM-434
> Project: Beam
>  Issue Type: Bug
>  Components: examples-java
>Reporter: Amit Sela
>Assignee: Amit Sela
>Priority: Minor
>
> When using `TextIO.Write.to("/path/to/output")` without any restrictions on 
> the number of shards, it might generate many output files (depending on your 
> input); for WordCount, for example, you'll get as many output files as unique 
> words in your input.
> Since I think examples are expected to execute in a friendly manner, to "see" 
> what they do rather than optimize for performance, I suggest using 
> `withoutSharding()` when writing the example output to an output file.
> Examples I could find that behave this way:
> org.apache.beam.examples.WordCount
> org.apache.beam.examples.complete.TfIdf
> org.apache.beam.examples.cookbook.DeDupExample



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (BEAM-434) When examples write output to file it creates many output files instead of one

2016-07-11 Thread Daniel Halperin (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-434?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15371662#comment-15371662
 ] 

Daniel Halperin commented on BEAM-434:
--

Proposals:

1. Modify examples to use {{TextIO.Write#withoutSharding}}. (This is 
[~amitsela]'s PR.)
2. Modify examples to use, e.g., {{TextIO.Write#withNumShards(3)}}. This will 
provide a better user experience while still conveying that 
multiple-shard-per-file is part of the Beam model.
3. Modify the Direct Runner to set, e.g., {{#withNumShards(3)}} when it sees a 
{{TextIO.Write}} with default sharding. This way the examples stay "pure" but 
the direct runner is nicer for users. Users can still override the number of 
shards for TextIO.

Of these, I think I prefer 3 the most. Amit?
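
For reference, a minimal sketch of what options 1 and 2 look like in user code, 
using the TextIO.Write methods named above; the pipeline contents and output paths 
are placeholders:

{code}
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.TextIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.values.PCollection;

Pipeline p = Pipeline.create(PipelineOptionsFactory.create());
PCollection<String> lines = p.apply(Create.of("one fish", "two fish"));

// Option 1: force a single output file, as in the withoutSharding() proposal.
lines.apply("SingleShard", TextIO.Write.to("/tmp/output-single").withoutSharding());

// Option 2: cap the shard count at a small, fixed number.
lines.apply("FewShards", TextIO.Write.to("/tmp/output-sharded").withNumShards(3));

p.run();
{code}

Option 3 needs no change in user code; the Direct Runner would pick a default shard 
count itself when none is set, and users could still override it.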

> When examples write output to file it creates many output files instead of one
> --
>
> Key: BEAM-434
> URL: https://issues.apache.org/jira/browse/BEAM-434
> Project: Beam
>  Issue Type: Bug
>  Components: examples-java
>Reporter: Amit Sela
>Assignee: Amit Sela
>Priority: Minor
>
> When using `TextIO.Write.to("/path/to/output")` without any restrictions on 
> the number of shards, it might generate many output files (depending on your 
> input); for WordCount, for example, you'll get as many output files as unique 
> words in your input.
> Since I think examples are expected to execute in a friendly manner, to "see" 
> what they do rather than optimize for performance, I suggest using 
> `withoutSharding()` when writing the example output to an output file.
> Examples I could find that behave this way:
> org.apache.beam.examples.WordCount
> org.apache.beam.examples.complete.TfIdf
> org.apache.beam.examples.cookbook.DeDupExample



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[GitHub] incubator-beam pull request #631: [BEAM-433] Change the ExampleUtils constru...

2016-07-11 Thread peihe
GitHub user peihe opened a pull request:

https://github.com/apache/incubator-beam/pull/631

[BEAM-433] Change the ExampleUtils constructor takes PipelineOptions




You can merge this pull request into a Git repository by running:

$ git pull https://github.com/peihe/incubator-beam cleanup-examples-utils-0

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/incubator-beam/pull/631.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #631


commit 21e9918b14a6cff069c2e5857e31c69e0ab48447
Author: Pei He 
Date:   2016-07-11T21:22:15Z

[BEAM-433] Change the ExampleUtils constructor takes PipelineOptions




---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[jira] [Commented] (BEAM-433) Make Beam examples runners agnostic

2016-07-11 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-433?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15371656#comment-15371656
 ] 

ASF GitHub Bot commented on BEAM-433:
-

GitHub user peihe opened a pull request:

https://github.com/apache/incubator-beam/pull/631

[BEAM-433] Change the ExampleUtils constructor takes PipelineOptions




You can merge this pull request into a Git repository by running:

$ git pull https://github.com/peihe/incubator-beam cleanup-examples-utils-0

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/incubator-beam/pull/631.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #631


commit 21e9918b14a6cff069c2e5857e31c69e0ab48447
Author: Pei He 
Date:   2016-07-11T21:22:15Z

[BEAM-433] Change the ExampleUtils constructor takes PipelineOptions




> Make Beam examples runners agnostic
> ---
>
> Key: BEAM-433
> URL: https://issues.apache.org/jira/browse/BEAM-433
> Project: Beam
>  Issue Type: Improvement
>Reporter: Pei He
>Assignee: Pei He
>
> Beam examples are ported from Dataflow, and they heavily reference 
> Dataflow classes.
> There are the following cleanup tasks:
> 1. Remove Dataflow streaming and batch injector setup (Done).
> 2. Remove references to DataflowPipelineOptions.
> 3. Move cancel() from DataflowPipelineJob to PipelineResult.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Comment Edited] (BEAM-404) PubsubIO should have a mode that supports maintaining message attributes.

2016-07-11 Thread Ilya Ganelin (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-404?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15371631#comment-15371631
 ] 

Ilya Ganelin edited comment on BEAM-404 at 7/11/16 9:15 PM:


[~dhalp...@google.com] What would this look like? At the moment there seem to 
be two method types for "Read":

{code}   
public static Bound topic(String topic) {
  return new Bound<>(DEFAULT_PUBSUB_CODER).topic(topic);
}
{code}

and 

{code}
public Bound topic(String topic) {
return new Bound<>(name, subscription, PubsubTopic.fromPath(topic), 
timestampLabel, coder, idLabel, maxNumRecords, maxReadTime);
}
{code}

Is the ask in this PR to make something like 
{code}
public Bound topic(String topic, Metadata metadata) {
return new 
Bound<>(DEFAULT_PUBSUB_CODER).topic(topic).metadata(metadata);
}
{code}

Is there a particular kind of Metadata you had in mind?


was (Author: ilganeli):
[~dhalp...@google.com] What would this look like? At the moment there seem to 
be two method types for "Read":

{code}   
public static Bound topic(String topic) {
  return new Bound<>(DEFAULT_PUBSUB_CODER).topic(topic);
}
{code}

> PubsubIO should have a mode that supports maintaining message attributes.
> -
>
> Key: BEAM-404
> URL: https://issues.apache.org/jira/browse/BEAM-404
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-gcp
>Reporter: Daniel Halperin
>
> Right now, PubsubIO only lets users access the message payload, decoded with 
> the user-provided coder.
> We should add a mode in which the source can return a message with the 
> metadata (attributes) as well.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (BEAM-404) PubsubIO should have a mode that supports maintaining message attributes.

2016-07-11 Thread Ilya Ganelin (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-404?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15371631#comment-15371631
 ] 

Ilya Ganelin commented on BEAM-404:
---

[~dhalp...@google.com] What would this look like? At the moment there seem to 
be two method types for "Read":

{code}   
public static Bound topic(String topic) {
  return new Bound<>(DEFAULT_PUBSUB_CODER).topic(topic);
}
{code}

> PubsubIO should have a mode that supports maintaining message attributes.
> -
>
> Key: BEAM-404
> URL: https://issues.apache.org/jira/browse/BEAM-404
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-gcp
>Reporter: Daniel Halperin
>
> Right now, PubsubIO only lets users access the message payload, decoded with 
> the user-provided coder.
> We should add a mode in which the source can return a message with the 
> metadata (attributes) as well.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[GitHub] incubator-beam pull request #630: Allow custom timestamp/watermark function ...

2016-07-11 Thread Lusitanian
GitHub user Lusitanian opened a pull request:

https://github.com/apache/incubator-beam/pull/630

Allow custom timestamp/watermark function for UnboundedFlinkSource

So, using an `UnboundedFlinkSource` seems to force the timestamp applied to 
each element of the incoming stream to the ingestion time, rather than allowing 
for proper event timestamping. This PR adds the ability to create an 
`UnboundedFlinkSource` with a custom TimestampAssigner, which should alleviate 
that issue. Note that this is particularly useful, as currently the only means 
of consuming from Kafka 0.8 using Beam/Flink Runner is to wrap Flink's Kafka 
0.8 consumer.

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/Lusitanian/incubator-beam flink_event_time

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/incubator-beam/pull/630.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #630


commit 2c5a4cd080eeaaf3505c64b773589c2da8217381
Author: David Desberg 
Date:   2016-07-11T19:24:18Z

Allow for custom timestamp/watermark function in FlinkPipelineRunner

commit 5d274e4ee5b5fc498bcabcab75bd7be30210bb3c
Author: David Desberg 
Date:   2016-07-11T19:56:21Z

Added new "of" signature and constructor for UnboundedFlinkSource to
allow event timestamping




---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[GitHub] incubator-beam pull request #629: Pickle only used symbols from __main__ nam...

2016-07-11 Thread robertwb
GitHub user robertwb opened a pull request:

https://github.com/apache/incubator-beam/pull/629

Pickle only used symbols from __main__ namespace.

Be sure to do all of the following to help us incorporate your contribution
quickly and easily:

 - [ ] Make sure the PR title is formatted like:
   `[BEAM-] Description of pull request`
 - [ ] Make sure tests pass via `mvn clean verify`. (Even better, enable
   Travis-CI on your fork and ensure the whole test matrix passes).
 - [ ] Replace `` in the title with the actual Jira issue
   number, if there is one.
 - [ ] If this contribution is large, please file an Apache
   [Individual Contributor License 
Agreement](https://www.apache.org/licenses/icla.txt).

---



You can merge this pull request into a Git repository by running:

$ git pull https://github.com/robertwb/incubator-beam pickling

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/incubator-beam/pull/629.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #629


commit e91a6bbb6e5c040f635affd0b5d913ecb79fbaaf
Author: Robert Bradshaw 
Date:   2016-07-11T21:06:39Z

Pickle only used symbols from __main__ namespace.




---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[GitHub] incubator-beam pull request #619: Remove ptransform tests from the excluded ...

2016-07-11 Thread aaltay
Github user aaltay closed the pull request at:

https://github.com/apache/incubator-beam/pull/619


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[1/2] incubator-beam git commit: Remove ptransform tests from the excluded tests list

2016-07-11 Thread robertwb
Repository: incubator-beam
Updated Branches:
  refs/heads/python-sdk 39dda320a -> 73168b220


Remove ptransform tests from the excluded tests list

This will enable ptransform_test as part of the pre-commit test.


Project: http://git-wip-us.apache.org/repos/asf/incubator-beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-beam/commit/2fc45b4c
Tree: http://git-wip-us.apache.org/repos/asf/incubator-beam/tree/2fc45b4c
Diff: http://git-wip-us.apache.org/repos/asf/incubator-beam/diff/2fc45b4c

Branch: refs/heads/python-sdk
Commit: 2fc45b4c6f08f5437312d0cd1acfc1a5187a3d7a
Parents: 39dda32
Author: Ahmet Altay 
Authored: Sat Jul 9 15:08:45 2016 -0700
Committer: Robert Bradshaw 
Committed: Mon Jul 11 13:51:50 2016 -0700

--
 sdks/python/apache_beam/transforms/ptransform_test.py | 9 -
 sdks/python/setup.cfg | 2 +-
 2 files changed, 5 insertions(+), 6 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/2fc45b4c/sdks/python/apache_beam/transforms/ptransform_test.py
--
diff --git a/sdks/python/apache_beam/transforms/ptransform_test.py 
b/sdks/python/apache_beam/transforms/ptransform_test.py
index aedb384..feb081c 100644
--- a/sdks/python/apache_beam/transforms/ptransform_test.py
+++ b/sdks/python/apache_beam/transforms/ptransform_test.py
@@ -461,9 +461,8 @@ class PTransformTest(unittest.TestCase):
 
 self.assertStartswith(
 e.exception.message,
-'Runtime type violation detected within '
-'ParDo(D/reify_windows): Input to GroupByKey must be '
-'a PCollection with elements compatible with KV[A, B]')
+'Input type hint violation at D: expected '
+'Tuple[TypeVariable[K], TypeVariable[V]]')
 
   def test_group_by_key_only_input_must_be_kv_pairs(self):
 pipeline = Pipeline('DirectPipelineRunner')
@@ -472,8 +471,8 @@ class PTransformTest(unittest.TestCase):
   pcolls | beam.GroupByKeyOnly('D')
   pipeline.run()
 
-expected_error_prefix = ('Input to GroupByKeyOnly must be a PCollection of 
'
- 'windowed key-value pairs.')
+expected_error_prefix = ('Input type hint violation at D: expected '
+ 'Tuple[TypeVariable[K], TypeVariable[V]]')
 self.assertStartswith(cm.exception.message, expected_error_prefix)
 
   def test_keys_and_values(self):

http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/2fc45b4c/sdks/python/setup.cfg
--
diff --git a/sdks/python/setup.cfg b/sdks/python/setup.cfg
index 5b58e8d..317121f 100644
--- a/sdks/python/setup.cfg
+++ b/sdks/python/setup.cfg
@@ -34,5 +34,5 @@ verbosity=2
 #
 # The batchworker_test is excluded because it imports batchworker.py which
 # in-turn tries to import module 'resource' which does not work for Windows.
-exclude=examples|bigquery_test|ptransform_test|fast_coders_test|typecoders_test|workitem_test|executor_test|streamingworker_test|batchworker_test
+exclude=examples|bigquery_test|fast_coders_test|typecoders_test|workitem_test|executor_test|streamingworker_test|batchworker_test
 



[jira] [Commented] (BEAM-404) PubsubIO should have a mode that supports maintaining message attributes.

2016-07-11 Thread Lucas Amorim (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-404?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15371586#comment-15371586
 ] 

Lucas Amorim commented on BEAM-404:
---

I can work on that over the weekend. 

> PubsubIO should have a mode that supports maintaining message attributes.
> -
>
> Key: BEAM-404
> URL: https://issues.apache.org/jira/browse/BEAM-404
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-gcp
>Reporter: Daniel Halperin
>
> Right now, PubsubIO only lets users access the message payload, decoded with 
> the user-provided coder.
> We should add a mode in which the source can return a message with the 
> metadata (attributes) as well.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[GitHub] incubator-beam pull request #628: Fix warnings that came with newly released...

2016-07-11 Thread aaltay
GitHub user aaltay opened a pull request:

https://github.com/apache/incubator-beam/pull/628

Fix warnings that came with newly released Pylint (1.6.1) version

Be sure to do all of the following to help us incorporate your contribution
quickly and easily:

 - [ ] Make sure the PR title is formatted like:
   `[BEAM-] Description of pull request`
 - [ ] Make sure tests pass via `mvn clean verify`. (Even better, enable
   Travis-CI on your fork and ensure the whole test matrix passes).
 - [ ] Replace `` in the title with the actual Jira issue
   number, if there is one.
 - [ ] If this contribution is large, please file an Apache
   [Individual Contributor License 
Agreement](https://www.apache.org/licenses/icla.txt).

---



You can merge this pull request into a Git repository by running:

$ git pull https://github.com/aaltay/incubator-beam lint

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/incubator-beam/pull/628.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #628


commit 2042560e3facfab7b80afa560032debf8ade0dde
Author: Ahmet Altay 
Date:   2016-07-11T20:49:50Z

Fix warnings that came with newly released Pylint (1.6.1) version.




---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


Jenkins build is still unstable: beam_PostCommit_RunnableOnService_GoogleCloudDataflow #732

2016-07-11 Thread Apache Jenkins Server
See 




[GitHub] incubator-beam pull request #622: Update Python aggregator example to match ...

2016-07-11 Thread charlesccychen
Github user charlesccychen closed the pull request at:

https://github.com/apache/incubator-beam/pull/622


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[GitHub] incubator-beam pull request #621: Allow ".tar" files in extra_packages

2016-07-11 Thread charlesccychen
Github user charlesccychen closed the pull request at:

https://github.com/apache/incubator-beam/pull/621


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[1/2] incubator-beam git commit: Allow ".tar" files in extra_packages

2016-07-11 Thread dhalperi
Repository: incubator-beam
Updated Branches:
  refs/heads/python-sdk 1b1c8d538 -> 39dda320a


Allow ".tar" files in extra_packages


Project: http://git-wip-us.apache.org/repos/asf/incubator-beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-beam/commit/516f15dd
Tree: http://git-wip-us.apache.org/repos/asf/incubator-beam/tree/516f15dd
Diff: http://git-wip-us.apache.org/repos/asf/incubator-beam/diff/516f15dd

Branch: refs/heads/python-sdk
Commit: 516f15dd656fd9a1df36c91a25e555ca1f7e1e42
Parents: 1b1c8d5
Author: Charles Chen 
Authored: Sun Jul 10 20:59:08 2016 -0700
Committer: Dan Halperin 
Committed: Mon Jul 11 13:14:12 2016 -0700

--
 sdks/python/apache_beam/utils/dependency.py  |  8 
 sdks/python/apache_beam/utils/dependency_test.py | 12 
 2 files changed, 12 insertions(+), 8 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/516f15dd/sdks/python/apache_beam/utils/dependency.py
--
diff --git a/sdks/python/apache_beam/utils/dependency.py 
b/sdks/python/apache_beam/utils/dependency.py
index ddb640a..b809cf2 100644
--- a/sdks/python/apache_beam/utils/dependency.py
+++ b/sdks/python/apache_beam/utils/dependency.py
@@ -29,7 +29,7 @@ and a --setup_file option.
 If --setup_file is present then it is assumed that the folder containing the
 file specified by the option has the typical layout required by setuptools and
 it will run 'python setup.py sdist' to produce a source distribution. The
-resulting tarball (a file ending in .tar.gz) will be staged at the GCS staging
+resulting tarball (a .tar or .tar.gz file) will be staged at the GCS staging
 location specified as job option. When a worker starts it will check for the
 presence of this file and will run 'easy_install tarball' to install the
 package in the worker.
@@ -137,10 +137,11 @@ def _stage_extra_packages(extra_packages, 
staging_location, temp_dir,
   staging_temp_dir = None
   local_packages = []
   for package in extra_packages:
-if not os.path.basename(package).endswith('.tar.gz'):
+if not (os.path.basename(package).endswith('.tar') or
+os.path.basename(package).endswith('.tar.gz')):
   raise RuntimeError(
   'The --extra_packages option expects a full path ending with '
-  '\'.tar.gz\' instead of %s' % package)
+  '\'.tar\' or \'.tar.gz\' instead of %s' % package)
 
 if not os.path.isfile(package):
   if package.startswith('gs://'):
@@ -460,4 +461,3 @@ def _download_pypi_sdk_package(temp_dir):
   'Failed to download a source distribution for the running SDK. Expected '
   'either %s or %s to be found in the download folder.' % (
   zip_expected, tgz_expected))
-

http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/516f15dd/sdks/python/apache_beam/utils/dependency_test.py
--
diff --git a/sdks/python/apache_beam/utils/dependency_test.py 
b/sdks/python/apache_beam/utils/dependency_test.py
index ab6446d..b7a9296 100644
--- a/sdks/python/apache_beam/utils/dependency_test.py
+++ b/sdks/python/apache_beam/utils/dependency_test.py
@@ -333,6 +333,8 @@ class SetupTest(unittest.TestCase):
 self.create_temp_file(
 os.path.join(source_dir, 'xyz.tar.gz'), 'nothing')
 self.create_temp_file(
+os.path.join(source_dir, 'xyz2.tar'), 'nothing')
+self.create_temp_file(
 os.path.join(source_dir, dependency.EXTRA_PACKAGES_FILE), 'nothing')
 
 options = PipelineOptions()
@@ -341,6 +343,7 @@ class SetupTest(unittest.TestCase):
 options.view_as(SetupOptions).extra_packages = [
 os.path.join(source_dir, 'abc.tar.gz'),
 os.path.join(source_dir, 'xyz.tar.gz'),
+os.path.join(source_dir, 'xyz2.tar'),
 'gs://my-gcs-bucket/gcs.tar.gz']
 
 gcs_copied_files = []
@@ -359,13 +362,13 @@ class SetupTest(unittest.TestCase):
 dependency._dependency_file_copy = file_copy
 
 self.assertEqual(
-['abc.tar.gz', 'xyz.tar.gz', 'gcs.tar.gz',
+['abc.tar.gz', 'xyz.tar.gz', 'xyz2.tar', 'gcs.tar.gz',
  dependency.EXTRA_PACKAGES_FILE,
  names.PICKLED_MAIN_SESSION_FILE],
 dependency.stage_job_resources(options))
 with open(os.path.join(staging_dir, dependency.EXTRA_PACKAGES_FILE)) as f:
-  self.assertEqual(['abc.tar.gz\n', 'xyz.tar.gz\n', 'gcs.tar.gz\n'],
-   f.readlines())
+  self.assertEqual(['abc.tar.gz\n', 'xyz.tar.gz\n', 'xyz2.tar\n',
+'gcs.tar.gz\n'], f.readlines())
 self.assertEqual(['gs://my-gcs-bucket/gcs.tar.gz'], gcs_copied_files)
 
   def test_with_extra_packages_missing_files(self):
@@ -398,7 +401,8 @@ class SetupTest(unittest.TestCase):
 

[2/2] incubator-beam git commit: Closes #621

2016-07-11 Thread dhalperi
Closes #621


Project: http://git-wip-us.apache.org/repos/asf/incubator-beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-beam/commit/39dda320
Tree: http://git-wip-us.apache.org/repos/asf/incubator-beam/tree/39dda320
Diff: http://git-wip-us.apache.org/repos/asf/incubator-beam/diff/39dda320

Branch: refs/heads/python-sdk
Commit: 39dda320a02fb706e66b918228a12b23c5124cbd
Parents: 1b1c8d5 516f15d
Author: Dan Halperin 
Authored: Mon Jul 11 13:14:13 2016 -0700
Committer: Dan Halperin 
Committed: Mon Jul 11 13:14:13 2016 -0700

--
 sdks/python/apache_beam/utils/dependency.py  |  8 
 sdks/python/apache_beam/utils/dependency_test.py | 12 
 2 files changed, 12 insertions(+), 8 deletions(-)
--




[GitHub] incubator-beam pull request #627: [BEAM-399] Updated the SerializableCoder c...

2016-07-11 Thread ilganeli
GitHub user ilganeli opened a pull request:

https://github.com/apache/incubator-beam/pull/627

[BEAM-399] Updated the SerializableCoder class to fix equals method to 
handle nulls.

* Equals method now checks for null
* Added test to check for null equals

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/ilganeli/incubator-beam BEAM-399

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/incubator-beam/pull/627.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #627


commit 79345d809409fd9bf9f8a7ca0a4719487006b026
Author: Ilya Ganelin 
Date:   2016-07-11T20:05:51Z

Fixed equals method to handle null and added respective test.




---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[jira] [Commented] (BEAM-399) SerializableCoder.equals does not handle null

2016-07-11 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-399?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15371507#comment-15371507
 ] 

ASF GitHub Bot commented on BEAM-399:
-

GitHub user ilganeli opened a pull request:

https://github.com/apache/incubator-beam/pull/627

[BEAM-399] Updated the SerializableCoder class to fix equals method to 
handle nulls.

* Equals method now checks for null
* Added test to check for null equals

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/ilganeli/incubator-beam BEAM-399

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/incubator-beam/pull/627.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #627


commit 79345d809409fd9bf9f8a7ca0a4719487006b026
Author: Ilya Ganelin 
Date:   2016-07-11T20:05:51Z

Fixed equals method to handle null and added respective test.




> SerializableCoder.equals does not handle null
> -
>
> Key: BEAM-399
> URL: https://issues.apache.org/jira/browse/BEAM-399
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-core
>Reporter: Scott Wegner
>Priority: Minor
>  Labels: findbugs, newbie, starter
>
> [FindBugs 
> NP_EQUALS_SHOULD_HANDLE_NULL_ARGUMENT|https://github.com/apache/incubator-beam/blob/58a029a06aea1030279e5da8f9fa3114f456c1db/sdks/java/build-tools/src/main/resources/beam/findbugs-filter.xml#L100]:
>  equals() method does not check for null argument
> Applies to: 
> [SerializableCoder|https://github.com/apache/incubator-beam/blob/58a029a06aea1030279e5da8f9fa3114f456c1db/sdks/java/core/src/main/java/org/apache/beam/sdk/coders/SerializableCoder.java].
> This is a good starter bug. When fixing, please remove the corresponding 
> entries from 
> [findbugs-filter.xml|https://github.com/apache/incubator-beam/blob/master/sdks/java/build-tools/src/main/resources/beam/findbugs-filter.xml]
>  and verify the build passes.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
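
For reference, a minimal sketch of the null-safe equals()/hashCode() pattern the change above describes. The TypedCoder class and its single "type" field are assumptions for illustration, not the actual SerializableCoder source.

    import java.util.Objects;

    final class TypedCoder<T> {
      private final Class<T> type;

      TypedCoder(Class<T> type) {
        this.type = type;
      }

      @Override
      public boolean equals(Object other) {
        // Returning false for a null argument (and for a different class) is what
        // FindBugs' NP_EQUALS_SHOULD_HANDLE_NULL_ARGUMENT expects of equals().
        if (other == null || getClass() != other.getClass()) {
          return false;
        }
        return type.equals(((TypedCoder<?>) other).type);
      }

      @Override
      public int hashCode() {
        return Objects.hash(type);
      }
    }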


[jira] [Commented] (BEAM-389) DelegateCoder needs equals and hashCode

2016-07-11 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-389?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15371502#comment-15371502
 ] 

ASF GitHub Bot commented on BEAM-389:
-

GitHub user peihe opened a pull request:

https://github.com/apache/incubator-beam/pull/626

[BEAM-389] Fix the uses of DelegateCoder in Combine




You can merge this pull request into a Git repository by running:

$ git pull https://github.com/peihe/incubator-beam fix-binary-combine

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/incubator-beam/pull/626.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #626


commit b13eaffbf2a2c087d4f60ae5bbe59922672d4dd2
Author: Pei He 
Date:   2016-07-11T20:01:24Z

Fix the uses of DelegateCoder in Combine




> DelegateCoder needs equals and hashCode
> ---
>
> Key: BEAM-389
> URL: https://issues.apache.org/jira/browse/BEAM-389
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-core
>Reporter: Pei He
>Assignee: Pei He
>Priority: Minor
> Fix For: 0.2.0-incubating
>
>
> Currently, DelegateCoder inherit equals() and hashCode() from StandardCoder.
> And, it makes DelegateCoder.of(VarIntCoder) equal to 
> DelegateCoder.of(BigEndianIntegerCoder).



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
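
For reference, a hypothetical sketch of the equality problem described above: if equals()/hashCode() ignore the wrapped coder, two delegating coders over different underlying coders compare equal. Including the delegate in both methods distinguishes them. The Delegate class below is an illustration, not the Beam DelegateCoder.

    import java.util.Objects;

    final class Delegate {
      private final Object underlyingCoder;

      private Delegate(Object underlyingCoder) {
        this.underlyingCoder = underlyingCoder;
      }

      static Delegate of(Object underlyingCoder) {
        return new Delegate(underlyingCoder);
      }

      @Override
      public boolean equals(Object other) {
        if (this == other) {
          return true;
        }
        if (other == null || getClass() != other.getClass()) {
          return false;
        }
        // Delegates are equal only when they wrap equal underlying coders, so
        // Delegate.of("VarIntCoder") no longer equals Delegate.of("BigEndianIntegerCoder").
        return Objects.equals(underlyingCoder, ((Delegate) other).underlyingCoder);
      }

      @Override
      public int hashCode() {
        return Objects.hash(underlyingCoder);
      }
    }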


[GitHub] incubator-beam pull request #616: Migrated IO display data tests must be run...

2016-07-11 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/incubator-beam/pull/616


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[1/2] incubator-beam git commit: Migrated IO display data tests must be runner-filesystem agnostic

2016-07-11 Thread dhalperi
Repository: incubator-beam
Updated Branches:
  refs/heads/master 1c5858b86 -> c5744ccf2


Migrated IO display data tests must be runner-filesystem agnostic


Project: http://git-wip-us.apache.org/repos/asf/incubator-beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-beam/commit/1117a033
Tree: http://git-wip-us.apache.org/repos/asf/incubator-beam/tree/1117a033
Diff: http://git-wip-us.apache.org/repos/asf/incubator-beam/diff/1117a033

Branch: refs/heads/master
Commit: 1117a0331485d654ceb0d6ce424f681d48fab5ea
Parents: 1c5858b
Author: Scott Wegner 
Authored: Fri Jul 8 15:53:43 2016 -0700
Committer: Dan Halperin 
Committed: Mon Jul 11 12:55:18 2016 -0700

--
 .../java/org/apache/beam/sdk/io/AvroIOTest.java | 20 +---
 .../org/apache/beam/sdk/io/BigQueryIOTest.java  | 14 ++
 .../java/org/apache/beam/sdk/io/TextIOTest.java | 20 +---
 3 files changed, 44 insertions(+), 10 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/1117a033/sdks/java/core/src/test/java/org/apache/beam/sdk/io/AvroIOTest.java
--
diff --git 
a/sdks/java/core/src/test/java/org/apache/beam/sdk/io/AvroIOTest.java 
b/sdks/java/core/src/test/java/org/apache/beam/sdk/io/AvroIOTest.java
index 047e7d0..026724a 100644
--- a/sdks/java/core/src/test/java/org/apache/beam/sdk/io/AvroIOTest.java
+++ b/sdks/java/core/src/test/java/org/apache/beam/sdk/io/AvroIOTest.java
@@ -29,10 +29,12 @@ import static org.junit.Assert.assertTrue;
 import org.apache.beam.sdk.coders.AvroCoder;
 import org.apache.beam.sdk.coders.DefaultCoder;
 import org.apache.beam.sdk.io.AvroIO.Write.Bound;
+import org.apache.beam.sdk.options.PipelineOptions;
 import org.apache.beam.sdk.testing.NeedsRunner;
 import org.apache.beam.sdk.testing.PAssert;
 import org.apache.beam.sdk.testing.RunnableOnService;
 import org.apache.beam.sdk.testing.TestPipeline;
+import org.apache.beam.sdk.testing.TestPipelineOptions;
 import org.apache.beam.sdk.transforms.Create;
 import org.apache.beam.sdk.transforms.display.DisplayData;
 import org.apache.beam.sdk.transforms.display.DisplayDataEvaluator;
@@ -47,6 +49,8 @@ import org.apache.avro.Schema;
 import org.apache.avro.file.DataFileReader;
 import org.apache.avro.generic.GenericRecord;
 import org.apache.avro.reflect.Nullable;
+import org.junit.BeforeClass;
+import org.junit.Ignore;
 import org.junit.Rule;
 import org.junit.Test;
 import org.junit.experimental.categories.Category;
@@ -69,6 +73,11 @@ public class AvroIOTest {
   @Rule
   public TemporaryFolder tmpFolder = new TemporaryFolder();
 
+  @BeforeClass
+  public static void setupClass() {
+    IOChannelUtils.registerStandardIOFactories(TestPipeline.testingPipelineOptions());
+  }
+
   @Test
   public void testReadWithoutValidationFlag() throws Exception {
 AvroIO.Read.Bound read = 
AvroIO.Read.from("gs://bucket/foo*/baz");
@@ -313,11 +322,16 @@ public class AvroIOTest {
 
   @Test
   @Category(RunnableOnService.class)
-  public void testPrimitiveWriteDisplayData() {
-DisplayDataEvaluator evaluator = DisplayDataEvaluator.create();
+  @Ignore("[BEAM-436] DirectRunner RunnableOnService tempLocation configuration insufficient")
+  public void testPrimitiveWriteDisplayData() throws IOException {
+PipelineOptions options = DisplayDataEvaluator.getDefaultOptions();
+String tempRoot = options.as(TestPipelineOptions.class).getTempRoot();
+String outputPath = IOChannelUtils.getFactory(tempRoot).resolve(tempRoot, "foo");
+
+DisplayDataEvaluator evaluator = DisplayDataEvaluator.create(options);
 
 AvroIO.Write.Bound write = AvroIO.Write
-.to("foo")
+.to(outputPath)
 .withSchema(Schema.create(Schema.Type.STRING))
 .withoutValidation();
 

http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/1117a033/sdks/java/core/src/test/java/org/apache/beam/sdk/io/BigQueryIOTest.java
--
diff --git 
a/sdks/java/core/src/test/java/org/apache/beam/sdk/io/BigQueryIOTest.java 
b/sdks/java/core/src/test/java/org/apache/beam/sdk/io/BigQueryIOTest.java
index 78d950e..0d1a9f8 100644
--- a/sdks/java/core/src/test/java/org/apache/beam/sdk/io/BigQueryIOTest.java
+++ b/sdks/java/core/src/test/java/org/apache/beam/sdk/io/BigQueryIOTest.java
@@ -94,6 +94,7 @@ import org.hamcrest.CoreMatchers;
 import org.hamcrest.Matchers;
 import org.junit.Assert;
 import org.junit.Before;
+import org.junit.Ignore;
 import org.junit.Rule;
 import org.junit.Test;
 import org.junit.experimental.categories.Category;
@@ -643,8 +644,9 @@ public class BigQueryIOTest implements Serializable {
 
   @Test
   @Category(RunnableOnService.class)
+  @Ignore("[BEAM-436] DirectRunner RunnableOnService 
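
For reference, a condensed sketch of the pattern the test change above applies: resolve the test's output path under the runner-provided tempRoot instead of a hard-coded local path, so the test works on any filesystem. The package location of IOChannelUtils is assumed here; the other imports appear in the diff.

    import java.io.IOException;

    import org.apache.beam.sdk.options.PipelineOptions;
    import org.apache.beam.sdk.testing.TestPipelineOptions;
    import org.apache.beam.sdk.transforms.display.DisplayDataEvaluator;
    import org.apache.beam.sdk.util.IOChannelUtils;

    class DisplayDataPathSketch {
      static String resolveOutputPath() throws IOException {
        PipelineOptions options = DisplayDataEvaluator.getDefaultOptions();
        String tempRoot = options.as(TestPipelineOptions.class).getTempRoot();
        // Works for local paths and gs:// paths alike, which is what makes the
        // display data tests runner- and filesystem-agnostic.
        return IOChannelUtils.getFactory(tempRoot).resolve(tempRoot, "foo");
      }
    }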

[2/2] incubator-beam git commit: Closes #616

2016-07-11 Thread dhalperi
Closes #616


Project: http://git-wip-us.apache.org/repos/asf/incubator-beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-beam/commit/c5744ccf
Tree: http://git-wip-us.apache.org/repos/asf/incubator-beam/tree/c5744ccf
Diff: http://git-wip-us.apache.org/repos/asf/incubator-beam/diff/c5744ccf

Branch: refs/heads/master
Commit: c5744ccf2d287db7edcee4c6aa4add2ac84621e3
Parents: 1c5858b 1117a03
Author: Dan Halperin 
Authored: Mon Jul 11 12:55:19 2016 -0700
Committer: Dan Halperin 
Committed: Mon Jul 11 12:55:19 2016 -0700

--
 .../java/org/apache/beam/sdk/io/AvroIOTest.java | 20 +---
 .../org/apache/beam/sdk/io/BigQueryIOTest.java  | 14 ++
 .../java/org/apache/beam/sdk/io/TextIOTest.java | 20 +---
 3 files changed, 44 insertions(+), 10 deletions(-)
--




[2/2] incubator-beam git commit: Update Python aggregator example to match Java usage

2016-07-11 Thread dhalperi
Update Python aggregator example to match Java usage


Project: http://git-wip-us.apache.org/repos/asf/incubator-beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-beam/commit/c2730c87
Tree: http://git-wip-us.apache.org/repos/asf/incubator-beam/tree/c2730c87
Diff: http://git-wip-us.apache.org/repos/asf/incubator-beam/diff/c2730c87

Branch: refs/heads/python-sdk
Commit: c2730c872eecd8f1207cbada93fa2c7962afbec1
Parents: 5523682
Author: Charles Chen 
Authored: Sun Jul 10 21:15:56 2016 -0700
Committer: Dan Halperin 
Committed: Mon Jul 11 12:53:28 2016 -0700

--
 .../python/apache_beam/transforms/aggregator.py | 38 +---
 1 file changed, 25 insertions(+), 13 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/c2730c87/sdks/python/apache_beam/transforms/aggregator.py
--
diff --git a/sdks/python/apache_beam/transforms/aggregator.py 
b/sdks/python/apache_beam/transforms/aggregator.py
index a5e83cb..05ef635 100644
--- a/sdks/python/apache_beam/transforms/aggregator.py
+++ b/sdks/python/apache_beam/transforms/aggregator.py
@@ -17,26 +17,29 @@
 
 """Support for user-defined Aggregators.
 
-Aggregators allow a pipeline to have the workers do custom aggregation
-of statistics about the data processed.  To update an aggregator's value,
-call aggregate_to() on the context passed to a DoFn.
+Aggregators allow steps in a pipeline to perform custom aggregation of
+statistics about the data processed across all workers.  To update an
+aggregator's value, call aggregate_to() on the context passed to a DoFn.
 
 Example:
 import apache_beam as beam
 
-simple_counter = beam.Aggregator('example-counter')
-
 class ExampleDoFn(beam.DoFn):
+  def __init__(self):
+super(ExampleDoFn, self).__init__()
+self.simple_counter = beam.Aggregator('example-counter')
+
   def process(self, context):
-context.aggregate_to(simple_counter, 1)
+context.aggregate_to(self.simple_counter, 1)
 ...
 
-The aggregators defined here show up in the UI as "Custom counters."
+These aggregators may be used by runners to collect and present statistics of
+a pipeline.  For example, in the Google Cloud Dataflow console, aggregators and
+their values show up in the UI under "Custom counters."
 
 You can also query the combined value(s) of an aggregator by calling
-aggregated_value() or aggregated_values() on the result of running a
-pipeline.
-
+aggregated_value() or aggregated_values() on the result object returned after
+running a pipeline.
 """
 
 from __future__ import absolute_import
@@ -45,7 +48,7 @@ from apache_beam.transforms import core
 
 
 class Aggregator(object):
-  """A user-specified aggregator of statistics about pipeline data.
+  """A user-specified aggregator of statistics about a pipeline step.
 
   Args:
 combine_fn: how to combine values input to the aggregation.
@@ -63,8 +66,17 @@ class Aggregator(object):
   Example uses::
 
 import apache_beam as beam
-simple_counter = beam.Aggregator('example-counter')
-complex_counter = beam.Aggregator('other-counter', beam.Mean(), float)
+
+class ExampleDoFn(beam.DoFn):
+  def __init__(self):
+super(ExampleDoFn, self).__init__()
+self.simple_counter = beam.Aggregator('example-counter')
+self.complex_counter = beam.Aggregator('other-counter', beam.Mean(),
+   float)
+
+  def process(self, context):
+context.aggregate_to(self.simple_counter, 1)
+context.aggregate_to(self.complex_counter, float(len(context.element))
   """
 
   def __init__(self, name, combine_fn=sum, input_type=int):



[1/2] incubator-beam git commit: Closes #622

2016-07-11 Thread dhalperi
Repository: incubator-beam
Updated Branches:
  refs/heads/python-sdk 55236825d -> 1b1c8d538


Closes #622


Project: http://git-wip-us.apache.org/repos/asf/incubator-beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-beam/commit/1b1c8d53
Tree: http://git-wip-us.apache.org/repos/asf/incubator-beam/tree/1b1c8d53
Diff: http://git-wip-us.apache.org/repos/asf/incubator-beam/diff/1b1c8d53

Branch: refs/heads/python-sdk
Commit: 1b1c8d538493b398424f5b094c9c169e14d92d9c
Parents: 5523682 c2730c8
Author: Dan Halperin 
Authored: Mon Jul 11 12:53:28 2016 -0700
Committer: Dan Halperin 
Committed: Mon Jul 11 12:53:28 2016 -0700

--
 .../python/apache_beam/transforms/aggregator.py | 38 +---
 1 file changed, 25 insertions(+), 13 deletions(-)
--




[2/2] incubator-beam git commit: Closes #623

2016-07-11 Thread dhalperi
Closes #623


Project: http://git-wip-us.apache.org/repos/asf/incubator-beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-beam/commit/1c5858b8
Tree: http://git-wip-us.apache.org/repos/asf/incubator-beam/tree/1c5858b8
Diff: http://git-wip-us.apache.org/repos/asf/incubator-beam/diff/1c5858b8

Branch: refs/heads/master
Commit: 1c5858b8697268bea462bf13aecde8400e14728f
Parents: 77d9282 59f7e3d
Author: Dan Halperin 
Authored: Mon Jul 11 12:41:31 2016 -0700
Committer: Dan Halperin 
Committed: Mon Jul 11 12:41:31 2016 -0700

--
 .../java/org/apache/beam/sdk/util/state/StateTable.java | 12 ++--
 1 file changed, 2 insertions(+), 10 deletions(-)
--




[GitHub] incubator-beam pull request #623: StateTable: simplify with HashBasedTable.c...

2016-07-11 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/incubator-beam/pull/623


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[1/2] incubator-beam git commit: StateTable: simplify with HashBasedTable.create()

2016-07-11 Thread dhalperi
Repository: incubator-beam
Updated Branches:
  refs/heads/master 77d928210 -> 1c5858b86


StateTable: simplify with HashBasedTable.create()


Project: http://git-wip-us.apache.org/repos/asf/incubator-beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-beam/commit/59f7e3dd
Tree: http://git-wip-us.apache.org/repos/asf/incubator-beam/tree/59f7e3dd
Diff: http://git-wip-us.apache.org/repos/asf/incubator-beam/diff/59f7e3dd

Branch: refs/heads/master
Commit: 59f7e3ddee1887e046ab3dbc4f3cf469c4ba36d9
Parents: 77d9282
Author: Dan Halperin 
Authored: Mon Jul 11 10:03:29 2016 -0700
Committer: Dan Halperin 
Committed: Mon Jul 11 10:03:29 2016 -0700

--
 .../java/org/apache/beam/sdk/util/state/StateTable.java | 12 ++--
 1 file changed, 2 insertions(+), 10 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/59f7e3dd/sdks/java/core/src/main/java/org/apache/beam/sdk/util/state/StateTable.java
--
diff --git 
a/sdks/java/core/src/main/java/org/apache/beam/sdk/util/state/StateTable.java 
b/sdks/java/core/src/main/java/org/apache/beam/sdk/util/state/StateTable.java
index 650ebb0..2ae6516 100644
--- 
a/sdks/java/core/src/main/java/org/apache/beam/sdk/util/state/StateTable.java
+++ 
b/sdks/java/core/src/main/java/org/apache/beam/sdk/util/state/StateTable.java
@@ -19,11 +19,9 @@ package org.apache.beam.sdk.util.state;
 
 import org.apache.beam.sdk.util.state.StateTag.StateBinder;
 
-import com.google.common.base.Supplier;
+import com.google.common.collect.HashBasedTable;
 import com.google.common.collect.Table;
-import com.google.common.collect.Tables;
 
-import java.util.HashMap;
 import java.util.Map;
 import java.util.Set;
 
@@ -33,13 +31,7 @@ import java.util.Set;
 public abstract class StateTable {
 
   private final Table 
stateTable =
-  Tables.newCustomTable(new HashMap>(),
-  new Supplier>() {
-@Override
-public Map get() {
-  return new HashMap<>();
-}
-  });
+  HashBasedTable.create();
 
   /**
* Gets the {@link State} in the specified {@link StateNamespace} with the 
specified {@link
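
As a self-contained illustration of the simplification in the commit above, with placeholder String/Integer row, column, and value types standing in for StateTable's actual key and state types: Guava's HashBasedTable.create() already backs a Table with HashMaps, so the explicit Tables.newCustomTable(...) plus Supplier is unnecessary.

    import com.google.common.base.Supplier;
    import com.google.common.collect.HashBasedTable;
    import com.google.common.collect.Table;
    import com.google.common.collect.Tables;

    import java.util.HashMap;
    import java.util.Map;

    class TableSketch {
      // Before: verbose custom table construction with an explicit map supplier.
      static Table<String, String, Integer> verbose() {
        return Tables.newCustomTable(
            new HashMap<String, Map<String, Integer>>(),
            new Supplier<Map<String, Integer>>() {
              @Override
              public Map<String, Integer> get() {
                return new HashMap<>();
              }
            });
      }

      // After: an equivalent hash-based table in a single call.
      static Table<String, String, Integer> simple() {
        return HashBasedTable.create();
      }
    }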



Jenkins build is still unstable: beam_PostCommit_RunnableOnService_GoogleCloudDataflow #731

2016-07-11 Thread Apache Jenkins Server
See 




[GitHub] incubator-beam pull request #625: Made checksum_output optional in bigshuffl...

2016-07-11 Thread mdvorsky
GitHub user mdvorsky opened a pull request:

https://github.com/apache/incubator-beam/pull/625

Made checksum_output optional in bigshuffle.py.

Hi @chamikaramj, can you please take a look?

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/mdvorsky/incubator-beam python-bigshuffle

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/incubator-beam/pull/625.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #625


commit 735185017380603a3c7c5a199c3fbeace3a3dc73
Author: Marian Dvorsky 
Date:   2016-07-11T18:47:01Z

Made checksum_output optional in bigshuffle.py.




---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[GitHub] incubator-beam pull request #624: [BEAM-394] Updated the AvroCoder class to ...

2016-07-11 Thread ilganeli
GitHub user ilganeli opened a pull request:

https://github.com/apache/incubator-beam/pull/624

[BEAM-394] Updated the AvroCoder class to set unserializable fields as 
transient

* Updated AvroCoder to mark non-serializable fields as transient.
* Updated the findbugs_filter to remove ignored errors.

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/ilganeli/incubator-beam BEAM-394

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/incubator-beam/pull/624.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #624


commit 131b425914a25b5b6a39e47b948f22571b11ac07
Author: Ilya Ganelin 
Date:   2016-07-11T18:36:45Z

Updated the AvroCoder class to set unserializable fields as transient. 
Updated the findbugs_filter to remove ignored errors.




---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[jira] [Commented] (BEAM-394) AvroCoder is Serializable with non-serializable fields

2016-07-11 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-394?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15371377#comment-15371377
 ] 

ASF GitHub Bot commented on BEAM-394:
-

GitHub user ilganeli opened a pull request:

https://github.com/apache/incubator-beam/pull/624

[BEAM-394] Updated the AvroCoder class to set unserializable fields as 
transient

* Updated AvroCoder to mark non-serializable fields as transient.
* Updated the findbugs_filter to remove ignored errors.

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/ilganeli/incubator-beam BEAM-394

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/incubator-beam/pull/624.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #624


commit 131b425914a25b5b6a39e47b948f22571b11ac07
Author: Ilya Ganelin 
Date:   2016-07-11T18:36:45Z

Updated the AvroCoder class to set unserializable fields as transient. 
Updated the findbugs_filter to remove ignored errors.




> AvroCoder is Serializable with non-serializable fields
> --
>
> Key: BEAM-394
> URL: https://issues.apache.org/jira/browse/BEAM-394
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-core
>Reporter: Scott Wegner
>Priority: Minor
>  Labels: findbugs, newbie, starter
>
> [FindBugs 
> SE_BAD_FIELD|https://github.com/apache/incubator-beam/blob/58a029a06aea1030279e5da8f9fa3114f456c1db/sdks/java/build-tools/src/main/resources/beam/findbugs-filter.xml#L30]:
>  Non-transient non-serializable instance field in serializable class.
> Applies to AvroCoder fields:
> * decoder
> * encoder
> * reader
> * writer
> * schema
> These should likely be marked transient and ensure they are properly 
> initialized after deserialization.
> This is a good starter bug. When fixing, please remove the corresponding 
> entries from 
> [findbugs-filter.xml|https://github.com/apache/incubator-beam/blob/master/sdks/java/build-tools/src/main/resources/beam/findbugs-filter.xml]
>  and verify the build passes.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
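
For reference, a hypothetical sketch of the pattern BEAM-394 recommends: mark non-serializable helpers transient and rebuild them lazily after deserialization. The ExpensiveWriter type and its construction are placeholders, not the actual AvroCoder fields.

    import java.io.Serializable;

    class TransientFieldSketch implements Serializable {
      private final String schemaString;          // serializable state
      private transient ExpensiveWriter writer;   // not serializable; rebuilt on demand

      TransientFieldSketch(String schemaString) {
        this.schemaString = schemaString;
      }

      ExpensiveWriter writer() {
        // After deserialization the transient field is null, so re-create it here.
        if (writer == null) {
          writer = new ExpensiveWriter(schemaString);
        }
        return writer;
      }

      static final class ExpensiveWriter {
        ExpensiveWriter(String schema) {
          // placeholder for a non-serializable writer built from the schema
        }
      }
    }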


Jenkins build is still unstable: beam_PostCommit_RunnableOnService_GoogleCloudDataflow #730

2016-07-11 Thread Apache Jenkins Server
See 




[jira] [Updated] (BEAM-394) AvroCoder is Serializable with non-serializable fields

2016-07-11 Thread Ilya Ganelin (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-394?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ilya Ganelin updated BEAM-394:
--
Description: 
[FindBugs 
SE_BAD_FIELD|https://github.com/apache/incubator-beam/blob/58a029a06aea1030279e5da8f9fa3114f456c1db/sdks/java/build-tools/src/main/resources/beam/findbugs-filter.xml#L30]:
 Non-transient non-serializable instance field in serializable class.

Applies to AvroCoder fields:
* decoder
* encoder
* reader
* writer
* schema

These should likely be marked transient and ensure they are properly 
initialized after deserialization.

This is a good starter bug. When fixing, please remove the corresponding 
entries from 
[findbugs-filter.xml|https://github.com/apache/incubator-beam/blob/master/sdks/java/build-tools/src/main/resources/beam/findbugs-filter.xml]
 and verify the build passes.

  was:
[FindBugs 
SE_BAD_FIELD|https://github.com/apache/incubator-beam/blob/58a029a06aea1030279e5da8f9fa3114f456c1db/sdks/java/build-tools/src/main/resources/beam/findbugs-filter.xml#L30]:
 Non-transient non-serializable instance field in serializable class.

Applies to AvroCoder fields:
* decoder
* encoder
* reader
* writer

These should likely be marked transient and ensure they are properly 
initialized after deserialization.

This is a good starter bug. When fixing, please remove the corresponding 
entries from 
[findbugs-filter.xml|https://github.com/apache/incubator-beam/blob/master/sdks/java/build-tools/src/main/resources/beam/findbugs-filter.xml]
 and verify the build passes.


> AvroCoder is Serializable with non-serializable fields
> --
>
> Key: BEAM-394
> URL: https://issues.apache.org/jira/browse/BEAM-394
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-core
>Reporter: Scott Wegner
>Priority: Minor
>  Labels: findbugs, newbie, starter
>
> [FindBugs 
> SE_BAD_FIELD|https://github.com/apache/incubator-beam/blob/58a029a06aea1030279e5da8f9fa3114f456c1db/sdks/java/build-tools/src/main/resources/beam/findbugs-filter.xml#L30]:
>  Non-transient non-serializable instance field in serializable class.
> Applies to AvroCoder fields:
> * decoder
> * encoder
> * reader
> * writer
> * schema
> These should likely be marked transient and ensure they are properly 
> initialized after deserialization.
> This is a good starter bug. When fixing, please remove the corresponding 
> entries from 
> [findbugs-filter.xml|https://github.com/apache/incubator-beam/blob/master/sdks/java/build-tools/src/main/resources/beam/findbugs-filter.xml]
>  and verify the build passes.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Updated] (BEAM-394) AvroCoder is Serializable with non-serializable fields

2016-07-11 Thread Ilya Ganelin (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-394?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ilya Ganelin updated BEAM-394:
--
Description: 
[FindBugs 
SE_BAD_FIELD|https://github.com/apache/incubator-beam/blob/58a029a06aea1030279e5da8f9fa3114f456c1db/sdks/java/build-tools/src/main/resources/beam/findbugs-filter.xml#L30]:
 Non-transient non-serializable instance field in serializable class.

Applies to AvroCoder fields:
* decoder
* encoder
* reader
* writer

These should likely be marked transient and ensure they are properly 
initialized after deserialization.

This is a good starter bug. When fixing, please remove the corresponding 
entries from 
[findbugs-filter.xml|https://github.com/apache/incubator-beam/blob/master/sdks/java/build-tools/src/main/resources/beam/findbugs-filter.xml]
 and verify the build passes.

  was:
[FindBugs 
SE_BAD_FIELD|https://github.com/apache/incubator-beam/blob/58a029a06aea1030279e5da8f9fa3114f456c1db/sdks/java/build-tools/src/main/resources/beam/findbugs-filter.xml#L30]:
 Non-transient non-serializable instance field in serializable class.

Applies to AvroCoder fields:
* decoder
* encoder
* reader

These should likely be marked transient and ensure they are properly 
initialized after deserialization.

This is a good starter bug. When fixing, please remove the corresponding 
entries from 
[findbugs-filter.xml|https://github.com/apache/incubator-beam/blob/master/sdks/java/build-tools/src/main/resources/beam/findbugs-filter.xml]
 and verify the build passes.


> AvroCoder is Serializable with non-serializable fields
> --
>
> Key: BEAM-394
> URL: https://issues.apache.org/jira/browse/BEAM-394
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-core
>Reporter: Scott Wegner
>Priority: Minor
>  Labels: findbugs, newbie, starter
>
> [FindBugs 
> SE_BAD_FIELD|https://github.com/apache/incubator-beam/blob/58a029a06aea1030279e5da8f9fa3114f456c1db/sdks/java/build-tools/src/main/resources/beam/findbugs-filter.xml#L30]:
>  Non-transient non-serializable instance field in serializable class.
> Applies to AvroCoder fields:
> * decoder
> * encoder
> * reader
> * writer
> These should likely be marked transient and ensure they are properly 
> initialized after deserialization.
> This is a good starter bug. When fixing, please remove the corresponding 
> entries from 
> [findbugs-filter.xml|https://github.com/apache/incubator-beam/blob/master/sdks/java/build-tools/src/main/resources/beam/findbugs-filter.xml]
>  and verify the build passes.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


Jenkins build is still unstable: beam_PostCommit_RunnableOnService_GoogleCloudDataflow #729

2016-07-11 Thread Apache Jenkins Server
See 




[GitHub] incubator-beam pull request #623: StateTable: simplify with HashBasedTable.c...

2016-07-11 Thread dhalperi
GitHub user dhalperi opened a pull request:

https://github.com/apache/incubator-beam/pull/623

StateTable: simplify with HashBasedTable.create()

R: @lukecwik @kennknowles @peihe  @swegner 

Minor code cleanup -- Any LGTM will do :)

This is a forward-port of 
https://github.com/GoogleCloudPlatform/DataflowJavaSDK/pull/336

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/dhalperi/incubator-beam guava-fix

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/incubator-beam/pull/623.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #623


commit 59f7e3ddee1887e046ab3dbc4f3cf469c4ba36d9
Author: Dan Halperin 
Date:   2016-07-11T17:03:29Z

StateTable: simplify with HashBasedTable.create()




---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[GitHub] incubator-beam pull request #615: Making the dataflow temp_location argument...

2016-07-11 Thread zoyahav
Github user zoyahav closed the pull request at:

https://github.com/apache/incubator-beam/pull/615


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[1/2] incubator-beam git commit: Making the dataflow temp_location argument optional

2016-07-11 Thread dhalperi
Repository: incubator-beam
Updated Branches:
  refs/heads/python-sdk e167d2b5a -> 55236825d


Making the dataflow temp_location argument optional

* Migrated the changes to make temp_location optional.
* Modifying temp_location flag documentation
* Adding a log message when defaulting to staging_location
* Add validation that staging is given if test isn't and adding test cases


Project: http://git-wip-us.apache.org/repos/asf/incubator-beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-beam/commit/8ce07b0f
Tree: http://git-wip-us.apache.org/repos/asf/incubator-beam/tree/8ce07b0f
Diff: http://git-wip-us.apache.org/repos/asf/incubator-beam/diff/8ce07b0f

Branch: refs/heads/python-sdk
Commit: 8ce07b0f67e5ed21b43d9070552aa73918e30b9a
Parents: e167d2b
Author: Zohar Yahav 
Authored: Fri Jul 8 14:03:19 2016 -0700
Committer: Dan Halperin 
Committed: Mon Jul 11 09:44:08 2016 -0700

--
 sdks/python/apache_beam/internal/apiclient.py   | 12 +++--
 sdks/python/apache_beam/utils/options.py|  5 +-
 .../utils/pipeline_options_validator_test.py| 52 +++-
 3 files changed, 52 insertions(+), 17 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/8ce07b0f/sdks/python/apache_beam/internal/apiclient.py
--
diff --git a/sdks/python/apache_beam/internal/apiclient.py 
b/sdks/python/apache_beam/internal/apiclient.py
index 0bb30ac..99c7090 100644
--- a/sdks/python/apache_beam/internal/apiclient.py
+++ b/sdks/python/apache_beam/internal/apiclient.py
@@ -339,16 +339,20 @@ class Job(object):
   def __init__(self, options):
 self.options = options
 self.google_cloud_options = options.view_as(GoogleCloudOptions)
-required_google_cloud_options = ['project',
- 'job_name',
- 'staging_location',
- 'temp_location']
+required_google_cloud_options = ['project', 'job_name', 'staging_location']
 missing = [
 option for option in required_google_cloud_options
 if not getattr(self.google_cloud_options, option)]
 if missing:
   raise ValueError(
   'Missing required configuration parameters: %s' % missing)
+
+if not self.google_cloud_options.temp_location:
+  logging.info('Defaulting to the staging_location as temp_location: %s',
+   self.google_cloud_options.staging_location)
+  (self.google_cloud_options
+   .temp_location) = self.google_cloud_options.staging_location
+
 # Make the staging and temp locations job name and time specific. This is
 # needed to avoid clashes between job submissions using the same staging
 # area or team members using same job names. This method is not entirely

http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/8ce07b0f/sdks/python/apache_beam/utils/options.py
--
diff --git a/sdks/python/apache_beam/utils/options.py 
b/sdks/python/apache_beam/utils/options.py
index e7a6f52..335beea 100644
--- a/sdks/python/apache_beam/utils/options.py
+++ b/sdks/python/apache_beam/utils/options.py
@@ -236,6 +236,7 @@ class GoogleCloudOptions(PipelineOptions):
 help='GCS path for staging code packages needed by '
 'workers.')
 # Remote execution must check that this option is not None.
+# If temp_location is not set, it defaults to staging_location.
 parser.add_argument('--temp_location',
 default=None,
 help='GCS path for saving temporary workflow jobs.')
@@ -254,7 +255,9 @@ class GoogleCloudOptions(PipelineOptions):
 if validator.is_service_runner():
   errors.extend(validator.validate_cloud_options(self))
   errors.extend(validator.validate_gcs_path(self, 'staging_location'))
-  errors.extend(validator.validate_gcs_path(self, 'temp_location'))
+  if getattr(self, 'temp_location',
+ None) or getattr(self, 'staging_location', None) is None:
+errors.extend(validator.validate_gcs_path(self, 'temp_location'))
 return errors
 
 

http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/8ce07b0f/sdks/python/apache_beam/utils/pipeline_options_validator_test.py
--
diff --git a/sdks/python/apache_beam/utils/pipeline_options_validator_test.py 
b/sdks/python/apache_beam/utils/pipeline_options_validator_test.py
index 1f59261..bca9fa5 100644
--- a/sdks/python/apache_beam/utils/pipeline_options_validator_test.py
+++ b/sdks/python/apache_beam/utils/pipeline_options_validator_test.py
@@ -74,31 +74,59 @@ class SetupTest(unittest.TestCase):
 

[2/2] incubator-beam git commit: Closes #615

2016-07-11 Thread dhalperi
Closes #615


Project: http://git-wip-us.apache.org/repos/asf/incubator-beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-beam/commit/55236825
Tree: http://git-wip-us.apache.org/repos/asf/incubator-beam/tree/55236825
Diff: http://git-wip-us.apache.org/repos/asf/incubator-beam/diff/55236825

Branch: refs/heads/python-sdk
Commit: 55236825d716cc1d21e898dba14bb8cf26632546
Parents: e167d2b 8ce07b0
Author: Dan Halperin 
Authored: Mon Jul 11 09:44:34 2016 -0700
Committer: Dan Halperin 
Committed: Mon Jul 11 09:44:34 2016 -0700

--
 sdks/python/apache_beam/internal/apiclient.py   | 12 +++--
 sdks/python/apache_beam/utils/options.py|  5 +-
 .../utils/pipeline_options_validator_test.py| 52 +++-
 3 files changed, 52 insertions(+), 17 deletions(-)
--




[jira] [Commented] (BEAM-431) Examples dependencies on runners are a bit much and not enough

2016-07-11 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-431?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15371114#comment-15371114
 ] 

ASF GitHub Bot commented on BEAM-431:
-

Github user asfgit closed the pull request at:

https://github.com/apache/incubator-beam/pull/613


> Examples dependencies on runners are a bit much and not enough
> --
>
> Key: BEAM-431
> URL: https://issues.apache.org/jira/browse/BEAM-431
> Project: Beam
>  Issue Type: Bug
>  Components: examples-java
>Reporter: Kenneth Knowles
>
> The Java 7 examples directly depend on the Dataflow runner as a compile 
> dependency. This should just be fixed and removed.
> The Java 8 examples have optional runtime dependencies on the Dataflow and 
> Flink runners. But even optional runtime dependencies must be resolved in a 
> test scope, so it is not possible to exclude these from a hermetic testing 
> environment - quite annoying. And the Spark runner should be included as well.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[1/2] incubator-beam git commit: Closes #613

2016-07-11 Thread dhalperi
Repository: incubator-beam
Updated Branches:
  refs/heads/master 8db73fe0d -> 77d928210


Closes #613


Project: http://git-wip-us.apache.org/repos/asf/incubator-beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-beam/commit/77d92821
Tree: http://git-wip-us.apache.org/repos/asf/incubator-beam/tree/77d92821
Diff: http://git-wip-us.apache.org/repos/asf/incubator-beam/diff/77d92821

Branch: refs/heads/master
Commit: 77d92821006992603f7253888284d1395722796d
Parents: 8db73fe d395a7c
Author: Dan Halperin 
Authored: Mon Jul 11 09:37:08 2016 -0700
Committer: Dan Halperin 
Committed: Mon Jul 11 09:37:08 2016 -0700

--
 examples/java8/pom.xml | 13 +
 1 file changed, 13 insertions(+)
--




[GitHub] incubator-beam pull request #613: [BEAM-431] Add test dependency on direct r...

2016-07-11 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/incubator-beam/pull/613


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[2/2] incubator-beam git commit: Add test dependency on direct runner

2016-07-11 Thread dhalperi
Add test dependency on direct runner

The testing of the examples themselves is done
via the direct runner, so this change adds
a straightforward test scoped dependency.


Project: http://git-wip-us.apache.org/repos/asf/incubator-beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-beam/commit/d395a7c2
Tree: http://git-wip-us.apache.org/repos/asf/incubator-beam/tree/d395a7c2
Diff: http://git-wip-us.apache.org/repos/asf/incubator-beam/diff/d395a7c2

Branch: refs/heads/master
Commit: d395a7c2cfe1ef0accfdf2cc9b54054f75313159
Parents: 8db73fe
Author: Kenneth Knowles 
Authored: Fri Jul 8 12:28:33 2016 -0700
Committer: Dan Halperin 
Committed: Mon Jul 11 09:37:08 2016 -0700

--
 examples/java8/pom.xml | 13 +
 1 file changed, 13 insertions(+)
--


http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/d395a7c2/examples/java8/pom.xml
--
diff --git a/examples/java8/pom.xml b/examples/java8/pom.xml
index 3cd1787..cf6b545 100644
--- a/examples/java8/pom.xml
+++ b/examples/java8/pom.xml
@@ -211,5 +211,18 @@
   com.google.api-client
   google-api-client
 
+
+
+
+
+
+  org.apache.beam
+  beam-runners-direct-java
+  ${project.version}
+  test
+
   
 



[GitHub] incubator-beam pull request #618: Changed JUnitMatchers to Matchers

2016-07-11 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/incubator-beam/pull/618


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

