[jira] [Updated] (BEAM-1684) Add unit tests for iobase.py

2017-03-30 Thread Tibor Kiss (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-1684?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Tibor Kiss updated BEAM-1684:
-
Description: 
Python-SDK's {{apache_beam/runners/dataflow/native_io/iobase.py}} does not have 
associated unit tests and has low (indirect) test coverage.
 
Create the respective tests to ensure code quality and increase test coverage.

  was:
Python-SDK's {{apache_beam/runners/dataflow/native_io/iobase.py}} does not have 
associated unit tests and have low (indirect) test coverage.
 
Create the respective tests to ensure code quality and increase test coverage.


> Add unit tests for iobase.py
> 
>
> Key: BEAM-1684
> URL: https://issues.apache.org/jira/browse/BEAM-1684
> Project: Beam
>  Issue Type: Sub-task
>  Components: sdk-py
>Reporter: Tibor Kiss
>Assignee: Rahul Sabbineni
>Priority: Minor
>
> Python-SDK's {{apache_beam/runners/dataflow/native_io/iobase.py}} does not 
> have associated unit tests and has low (indirect) test coverage.
>  
> Create the respective tests to ensure code quality and increase test coverage.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Assigned] (BEAM-1684) Add unit tests for iobase.py

2017-03-30 Thread Tibor Kiss (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-1684?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Tibor Kiss reassigned BEAM-1684:


Assignee: Rahul Sabbineni  (was: Tibor Kiss)

> Add unit tests for iobase.py
> 
>
> Key: BEAM-1684
> URL: https://issues.apache.org/jira/browse/BEAM-1684
> Project: Beam
>  Issue Type: Sub-task
>  Components: sdk-py
>Reporter: Tibor Kiss
>Assignee: Rahul Sabbineni
>Priority: Minor
>
> Python-SDK's {{apache_beam/runners/dataflow/native_io/iobase.py}} does not 
> have associated unit tests and has low (indirect) test coverage.
>  
> Create the respective tests to ensure code quality and increase test coverage.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


Build failed in Jenkins: beam_PostCommit_Python_Verify #1685

2017-03-30 Thread Apache Jenkins Server
See 


Changes:

[dhalperi] Fix PubSubIO write attribute issue

[hepei.hp] [BEAM-1838] GlobalWindow: add equals() and hashCode().

--
[...truncated 657.12 KB...]
test_empty_side_outputs (apache_beam.transforms.ptransform_test.PTransformTest) 
... ok
:132:
 UserWarning: Using fallback coder for typehint: List[int].
  warnings.warn('Using fallback coder for typehint: %r.' % typehint)
:132:
 UserWarning: Using fallback coder for typehint: Union[].
  warnings.warn('Using fallback coder for typehint: %r.' % typehint)
DEPRECATION: pip install --download has been deprecated and will be removed in 
the future. Pip now has a download command that should be used instead.
Collecting pyhamcrest (from -r postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/PyHamcrest-1.9.0.tar.gz
Collecting mock (from -r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/mock-2.0.0.tar.gz
Collecting setuptools (from pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/setuptools-34.3.3.zip
Collecting six (from pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/six-1.10.0.tar.gz
Collecting funcsigs>=1 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded 
/tmp/dataflow-requirements-cache/funcsigs-1.0.2.tar.gz
Collecting pbr>=0.11 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/pbr-2.0.0.tar.gz
Collecting packaging>=16.8 (from setuptools->pyhamcrest->-r 
postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/packaging-16.8.tar.gz
Collecting appdirs>=1.4.0 (from setuptools->pyhamcrest->-r 
postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/appdirs-1.4.3.tar.gz
Collecting pyparsing (from packaging>=16.8->setuptools->pyhamcrest->-r 
postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/pyparsing-2.2.0.tar.gz
Successfully downloaded pyhamcrest mock setuptools six funcsigs pbr packaging 
appdirs pyparsing
test_par_do_with_multiple_outputs_and_using_yield 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
:132:
 UserWarning: Using fallback coder for typehint: List[int].
  warnings.warn('Using fallback coder for typehint: %r.' % typehint)
:132:
 UserWarning: Using fallback coder for typehint: Union[].
  warnings.warn('Using fallback coder for typehint: %r.' % typehint)
DEPRECATION: pip install --download has been deprecated and will be removed in 
the future. Pip now has a download command that should be used instead.
Collecting pyhamcrest (from -r postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/PyHamcrest-1.9.0.tar.gz
Collecting mock (from -r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/mock-2.0.0.tar.gz
Collecting setuptools (from pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/setuptools-34.3.3.zip
Collecting six (from pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/six-1.10.0.tar.gz
Collecting funcsigs>=1 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded 
/tmp/dataflow-requirements-cache/funcsigs-1.0.2.tar.gz
Collecting pbr>=0.11 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/pbr-2.0.0.tar.gz
Collecting packaging>=16.8 (from setuptools->pyhamcrest->-r 
postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/packaging-16.8.tar.gz
Collecting appdirs>=1.4.0 (from setuptools->pyhamcrest->-r 
postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/appdirs-1.4.3.tar.gz
Collecting pyparsing (from packaging>=16.8->setuptools->pyhamcrest->-r 
postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/pyparsing-2.2.0.tar.gz
Successfully downloaded pyhamcrest mock setuptools six funcsigs pbr packaging 
appdirs pyparsing

Jenkins build is back to normal : beam_PostCommit_Java_MavenInstall #3104

2017-03-30 Thread Apache Jenkins Server
See 




[jira] [Closed] (BEAM-1838) GlobalWindow equals() and hashCode() doesn't work with other serialization frameworks

2017-03-30 Thread Daniel Halperin (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-1838?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Daniel Halperin closed BEAM-1838.
-
   Resolution: Fixed
Fix Version/s: First stable release

> GlobalWindow equals() and hashCode() doesn't work with other serialization 
> frameworks
> -
>
> Key: BEAM-1838
> URL: https://issues.apache.org/jira/browse/BEAM-1838
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-core
>Reporter: Pei He
>Assignee: Pei He
> Fix For: First stable release
>
>
> When using Kryo serialization, two copies of GlobalWindow are not equal.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


Build failed in Jenkins: beam_PostCommit_Python_Verify #1684

2017-03-30 Thread Apache Jenkins Server
See 


--
[...truncated 643.90 KB...]
test_undeclared_side_outputs 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
:132:
 UserWarning: Using fallback coder for typehint: List[int].
  warnings.warn('Using fallback coder for typehint: %r.' % typehint)
:132:
 UserWarning: Using fallback coder for typehint: Union[].
  warnings.warn('Using fallback coder for typehint: %r.' % typehint)
test_empty_side_outputs (apache_beam.transforms.ptransform_test.PTransformTest) 
... ok
:132:
 UserWarning: Using fallback coder for typehint: List[int].
  warnings.warn('Using fallback coder for typehint: %r.' % typehint)
:132:
 UserWarning: Using fallback coder for typehint: Union[].
  warnings.warn('Using fallback coder for typehint: %r.' % typehint)
DEPRECATION: pip install --download has been deprecated and will be removed in 
the future. Pip now has a download command that should be used instead.
Collecting pyhamcrest (from -r postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/PyHamcrest-1.9.0.tar.gz
DEPRECATION: pip install --download has been deprecated and will be removed in 
the future. Pip now has a download command that should be used instead.
Collecting pyhamcrest (from -r postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/PyHamcrest-1.9.0.tar.gz
Collecting mock (from -r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/mock-2.0.0.tar.gz
Collecting setuptools (from pyhamcrest->-r postcommit_requirements.txt (line 1))
Collecting mock (from -r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/mock-2.0.0.tar.gz
  File was already downloaded 
/tmp/dataflow-requirements-cache/setuptools-34.3.3.zip
Collecting setuptools (from pyhamcrest->-r postcommit_requirements.txt (line 1))
Collecting six (from pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/six-1.10.0.tar.gz
  File was already downloaded 
/tmp/dataflow-requirements-cache/setuptools-34.3.3.zip
Collecting funcsigs>=1 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded 
/tmp/dataflow-requirements-cache/funcsigs-1.0.2.tar.gz
Collecting six (from pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/six-1.10.0.tar.gz
Collecting pbr>=0.11 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/pbr-2.0.0.tar.gz
Collecting funcsigs>=1 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded 
/tmp/dataflow-requirements-cache/funcsigs-1.0.2.tar.gz
Collecting packaging>=16.8 (from setuptools->pyhamcrest->-r 
postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/packaging-16.8.tar.gz
Collecting pbr>=0.11 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/pbr-2.0.0.tar.gz
Collecting appdirs>=1.4.0 (from setuptools->pyhamcrest->-r 
postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/appdirs-1.4.3.tar.gz
Collecting packaging>=16.8 (from setuptools->pyhamcrest->-r 
postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/packaging-16.8.tar.gz
Collecting pyparsing (from packaging>=16.8->setuptools->pyhamcrest->-r 
postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/pyparsing-2.2.0.tar.gz
Collecting appdirs>=1.4.0 (from setuptools->pyhamcrest->-r 
postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/appdirs-1.4.3.tar.gz
Collecting pyparsing (from packaging>=16.8->setuptools->pyhamcrest->-r 
postcommit_requirements.txt (line 1))
Successfully downloaded pyhamcrest mock setuptools six funcsigs pbr packaging 
appdirs pyparsing
  File was already downloaded 
/tmp/dataflow-requirements-cache/pyparsing-2.2.0.tar.gz
Successfully downloaded pyhamcrest mock setuptools six funcsigs pbr packaging 
appdirs pyparsing
test_as_dict_with_unique_labels 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
DEPRECATION: pip install --download has been deprecated and will be removed in 

Jenkins build is back to normal : beam_PostCommit_Java_ValidatesRunner_Spark #1450

2017-03-30 Thread Apache Jenkins Server
See 




[GitHub] beam pull request #2371: [BEAM-1838] GlobalWindow: add equals() and hashCode...

2017-03-30 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/beam/pull/2371


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[jira] [Commented] (BEAM-1838) GlobalWindow equals() and hashCode() doesn't work with other serialization frameworks

2017-03-30 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-1838?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15950272#comment-15950272
 ] 

ASF GitHub Bot commented on BEAM-1838:
--

Github user asfgit closed the pull request at:

https://github.com/apache/beam/pull/2371


> GlobalWindow equals() and hashCode() doesn't work with other serialization 
> frameworks
> -
>
> Key: BEAM-1838
> URL: https://issues.apache.org/jira/browse/BEAM-1838
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-core
>Reporter: Pei He
>Assignee: Pei He
>
> When using Kryo serialization, two copies of GlobalWindow are not equal.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[2/2] beam git commit: This closes #2371

2017-03-30 Thread pei
This closes #2371


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/935ecd4e
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/935ecd4e
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/935ecd4e

Branch: refs/heads/master
Commit: 935ecd4e032e18e428ee33cbf5484c5fce726b4f
Parents: 1e2ad65 ea9f5b1
Author: Pei He 
Authored: Fri Mar 31 11:32:36 2017 +0800
Committer: Pei He 
Committed: Fri Mar 31 11:32:36 2017 +0800

--
 .../beam/sdk/transforms/windowing/GlobalWindow.java   | 10 ++
 1 file changed, 10 insertions(+)
--




[1/2] beam git commit: [BEAM-1838] GlobalWindow: add equals() and hashCode().

2017-03-30 Thread pei
Repository: beam
Updated Branches:
  refs/heads/master 1e2ad65f8 -> 935ecd4e0


[BEAM-1838] GlobalWindow: add equals() and hashCode().


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/ea9f5b1b
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/ea9f5b1b
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/ea9f5b1b

Branch: refs/heads/master
Commit: ea9f5b1b77402b6336bcc0093de65c1550c9150c
Parents: 1e2ad65
Author: Pei He 
Authored: Thu Mar 30 21:35:00 2017 +0800
Committer: Pei He 
Committed: Fri Mar 31 11:32:30 2017 +0800

--
 .../beam/sdk/transforms/windowing/GlobalWindow.java   | 10 ++
 1 file changed, 10 insertions(+)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/ea9f5b1b/sdks/java/core/src/main/java/org/apache/beam/sdk/transforms/windowing/GlobalWindow.java
--
diff --git 
a/sdks/java/core/src/main/java/org/apache/beam/sdk/transforms/windowing/GlobalWindow.java
 
b/sdks/java/core/src/main/java/org/apache/beam/sdk/transforms/windowing/GlobalWindow.java
index c27749d..337886d 100644
--- 
a/sdks/java/core/src/main/java/org/apache/beam/sdk/transforms/windowing/GlobalWindow.java
+++ 
b/sdks/java/core/src/main/java/org/apache/beam/sdk/transforms/windowing/GlobalWindow.java
@@ -47,6 +47,16 @@ public class GlobalWindow extends BoundedWindow {
     return END_OF_GLOBAL_WINDOW;
   }
 
+  @Override
+  public boolean equals(Object other) {
+    return other instanceof GlobalWindow;
+  }
+
+  @Override
+  public int hashCode() {
+    return GlobalWindow.class.hashCode();
+  }
+
   private GlobalWindow() {}
 
   /**
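The ten added lines make every GlobalWindow instance compare equal and hash identically, so a copy produced by a serializer such as Kryo is interchangeable with the original. The same idea, sketched as a standalone Python analogue (illustrative only, not Beam code):

```python
import copy


class GlobalWindow(object):
    """Illustrative stand-in: every instance is interchangeable."""

    def __eq__(self, other):
        # Mirrors the Java patch: equality depends only on the type.
        return isinstance(other, GlobalWindow)

    def __ne__(self, other):
        return not self == other

    def __hash__(self):
        # Mirrors GlobalWindow.class.hashCode(): one hash for all instances.
        return hash(GlobalWindow)


original = GlobalWindow()
clone = copy.deepcopy(original)  # stands in for a serializer round-trip

assert clone is not original        # a distinct object...
assert clone == original            # ...that still compares equal
assert hash(clone) == hash(original)
assert len({original, clone}) == 1  # both collapse to a single set element
```

Without the override, the default identity-based equality would make the deserialized copy unequal to the singleton, which is exactly the bug BEAM-1838 describes.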



[jira] [Commented] (BEAM-1673) PubSubIO can't write attributes

2017-03-30 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-1673?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15950257#comment-15950257
 ] 

ASF GitHub Bot commented on BEAM-1673:
--

Github user asfgit closed the pull request at:

https://github.com/apache/beam/pull/2209


> PubSubIO can't write attributes
> ---
>
> Key: BEAM-1673
> URL: https://issues.apache.org/jira/browse/BEAM-1673
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-gcp
>Affects Versions: 0.5.0
>Reporter: Bin Chen
>Assignee: Jean-Baptiste Onofré
>
> PubSubIO ignores the attributes of a message when writing



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[GitHub] beam pull request #2209: [BEAM-1673] Fix PubSubIO write attribute issue

2017-03-30 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/beam/pull/2209


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[1/2] beam git commit: Fix PubSubIO write attribute issue

2017-03-30 Thread dhalperi
Repository: beam
Updated Branches:
  refs/heads/master b1c287bd5 -> 1e2ad65f8


Fix PubSubIO write attribute issue


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/ad9df5b5
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/ad9df5b5
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/ad9df5b5

Branch: refs/heads/master
Commit: ad9df5b5591ce9d153039ac91e8862af6ea42b45
Parents: b1c287b
Author: Chen Bin 
Authored: Thu Mar 9 11:09:04 2017 +0800
Committer: Dan Halperin 
Committed: Thu Mar 30 20:15:58 2017 -0700

--
 .../org/apache/beam/sdk/util/PubsubJsonClient.java |  2 +-
 .../org/apache/beam/sdk/util/PubsubJsonClientTest.java | 13 ++---
 2 files changed, 11 insertions(+), 4 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/ad9df5b5/sdks/java/core/src/main/java/org/apache/beam/sdk/util/PubsubJsonClient.java
--
diff --git 
a/sdks/java/core/src/main/java/org/apache/beam/sdk/util/PubsubJsonClient.java 
b/sdks/java/core/src/main/java/org/apache/beam/sdk/util/PubsubJsonClient.java
index 6bc104f..ef8abfd 100644
--- 
a/sdks/java/core/src/main/java/org/apache/beam/sdk/util/PubsubJsonClient.java
+++ 
b/sdks/java/core/src/main/java/org/apache/beam/sdk/util/PubsubJsonClient.java
@@ -135,7 +135,7 @@ public class PubsubJsonClient extends PubsubClient {
     for (OutgoingMessage outgoingMessage : outgoingMessages) {
       PubsubMessage pubsubMessage = new PubsubMessage().encodeData(outgoingMessage.elementBytes);
 
-      Map<String, String> attributes = pubsubMessage.getAttributes();
+      Map<String, String> attributes = outgoingMessage.attributes;
       if ((timestampLabel != null || idLabel != null) && attributes == null) {
         attributes = new TreeMap<>();
       }

http://git-wip-us.apache.org/repos/asf/beam/blob/ad9df5b5/sdks/java/core/src/test/java/org/apache/beam/sdk/util/PubsubJsonClientTest.java
--
diff --git 
a/sdks/java/core/src/test/java/org/apache/beam/sdk/util/PubsubJsonClientTest.java
 
b/sdks/java/core/src/test/java/org/apache/beam/sdk/util/PubsubJsonClientTest.java
index 17e1870..019190b 100644
--- 
a/sdks/java/core/src/test/java/org/apache/beam/sdk/util/PubsubJsonClientTest.java
+++ 
b/sdks/java/core/src/test/java/org/apache/beam/sdk/util/PubsubJsonClientTest.java
@@ -30,7 +30,10 @@ import com.google.api.services.pubsub.model.ReceivedMessage;
 import com.google.common.collect.ImmutableList;
 import com.google.common.collect.ImmutableMap;
 import java.io.IOException;
+import java.util.HashMap;
 import java.util.List;
+import java.util.Map;
+
 import org.apache.beam.sdk.util.PubsubClient.IncomingMessage;
 import org.apache.beam.sdk.util.PubsubClient.OutgoingMessage;
 import org.apache.beam.sdk.util.PubsubClient.SubscriptionPath;
@@ -114,8 +117,10 @@ public class PubsubJsonClientTest {
     PubsubMessage expectedPubsubMessage = new PubsubMessage()
         .encodeData(DATA.getBytes())
         .setAttributes(
-            ImmutableMap.of(TIMESTAMP_LABEL, String.valueOf(MESSAGE_TIME),
-                ID_LABEL, RECORD_ID));
+            ImmutableMap.<String, String>builder()
+                .put(TIMESTAMP_LABEL, String.valueOf(MESSAGE_TIME))
+                .put(ID_LABEL, RECORD_ID)
+                .put("k", "v").build());
     PublishRequest expectedRequest = new PublishRequest()
         .setMessages(ImmutableList.of(expectedPubsubMessage));
     PublishResponse expectedResponse = new PublishResponse()
@@ -125,8 +130,10 @@
         .publish(expectedTopic, expectedRequest)
         .execute()))
         .thenReturn(expectedResponse);
+    Map<String, String> attrs = new HashMap<>();
+    attrs.put("k", "v");
     OutgoingMessage actualMessage = new OutgoingMessage(
-        DATA.getBytes(), null, MESSAGE_TIME, RECORD_ID);
+        DATA.getBytes(), attrs, MESSAGE_TIME, RECORD_ID);
     int n = client.publish(TOPIC, ImmutableList.of(actualMessage));
     assertEquals(1, n);
   }
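The one-line production fix reads the attribute map from the outgoing message instead of from the freshly constructed (and therefore empty) PubsubMessage, which is why user attributes were being silently dropped. A minimal illustration of the same bug shape, sketched in Python with hypothetical names (not the Beam API):

```python
def build_publish_message(outgoing_attributes, timestamp_label=None, id_label=None):
    """Builds a Pub/Sub-style message dict, keeping the caller's attributes."""
    message = {'data': b'payload', 'attributes': None}  # freshly built: no attributes yet

    # The buggy version read message['attributes'] here, which is always None
    # for a fresh message, silently dropping the caller's attributes; the fix
    # takes them from the outgoing message instead.
    attributes = outgoing_attributes

    if (timestamp_label is not None or id_label is not None) and attributes is None:
        attributes = {}
    if timestamp_label is not None:
        attributes[timestamp_label] = '1490900000000'  # placeholder timestamp value
    if id_label is not None:
        attributes[id_label] = 'record-1'              # placeholder record id

    message['attributes'] = attributes
    return message


msg = build_publish_message({'k': 'v'}, timestamp_label='ts')
assert msg['attributes'] == {'k': 'v', 'ts': '1490900000000'}  # 'k' survives
```

The test change in the patch follows the same logic: it publishes a message carrying a `"k" -> "v"` attribute and asserts that the attribute reaches the expected request.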



[2/2] beam git commit: This closes #2209

2017-03-30 Thread dhalperi
This closes #2209


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/1e2ad65f
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/1e2ad65f
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/1e2ad65f

Branch: refs/heads/master
Commit: 1e2ad65f80df4d21e25df51c53414054799d803e
Parents: b1c287b ad9df5b
Author: Dan Halperin 
Authored: Thu Mar 30 20:16:01 2017 -0700
Committer: Dan Halperin 
Committed: Thu Mar 30 20:16:01 2017 -0700

--
 .../org/apache/beam/sdk/util/PubsubJsonClient.java |  2 +-
 .../org/apache/beam/sdk/util/PubsubJsonClientTest.java | 13 ++---
 2 files changed, 11 insertions(+), 4 deletions(-)
--




Build failed in Jenkins: beam_PostCommit_Python_Verify #1683

2017-03-30 Thread Apache Jenkins Server
See 


Changes:

[dhalperi] Fixed: small documentation issue in HDFS IO

--
[...truncated 669.47 KB...]
test_par_do_with_multiple_outputs_and_using_yield 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
:132:
 UserWarning: Using fallback coder for typehint: List[int].
  warnings.warn('Using fallback coder for typehint: %r.' % typehint)
:132:
 UserWarning: Using fallback coder for typehint: Union[].
  warnings.warn('Using fallback coder for typehint: %r.' % typehint)
DEPRECATION: pip install --download has been deprecated and will be removed in 
the future. Pip now has a download command that should be used instead.
Collecting pyhamcrest (from -r postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/PyHamcrest-1.9.0.tar.gz
Collecting mock (from -r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/mock-2.0.0.tar.gz
Collecting setuptools (from pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/setuptools-34.3.3.zip
Collecting six (from pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/six-1.10.0.tar.gz
Collecting funcsigs>=1 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded 
/tmp/dataflow-requirements-cache/funcsigs-1.0.2.tar.gz
Collecting pbr>=0.11 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/pbr-2.0.0.tar.gz
Collecting packaging>=16.8 (from setuptools->pyhamcrest->-r 
postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/packaging-16.8.tar.gz
Collecting appdirs>=1.4.0 (from setuptools->pyhamcrest->-r 
postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/appdirs-1.4.3.tar.gz
Collecting pyparsing (from packaging>=16.8->setuptools->pyhamcrest->-r 
postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/pyparsing-2.2.0.tar.gz
Successfully downloaded pyhamcrest mock setuptools six funcsigs pbr packaging 
appdirs pyparsing
test_empty_side_outputs (apache_beam.transforms.ptransform_test.PTransformTest) 
... ok
:132:
 UserWarning: Using fallback coder for typehint: List[int].
  warnings.warn('Using fallback coder for typehint: %r.' % typehint)
:132:
 UserWarning: Using fallback coder for typehint: Union[].
  warnings.warn('Using fallback coder for typehint: %r.' % typehint)
DEPRECATION: pip install --download has been deprecated and will be removed in 
the future. Pip now has a download command that should be used instead.
Collecting pyhamcrest (from -r postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/PyHamcrest-1.9.0.tar.gz
Collecting mock (from -r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/mock-2.0.0.tar.gz
Collecting setuptools (from pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/setuptools-34.3.3.zip
Collecting six (from pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/six-1.10.0.tar.gz
Collecting funcsigs>=1 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded 
/tmp/dataflow-requirements-cache/funcsigs-1.0.2.tar.gz
Collecting pbr>=0.11 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/pbr-2.0.0.tar.gz
Collecting packaging>=16.8 (from setuptools->pyhamcrest->-r 
postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/packaging-16.8.tar.gz
Collecting appdirs>=1.4.0 (from setuptools->pyhamcrest->-r 
postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/appdirs-1.4.3.tar.gz
Collecting pyparsing (from packaging>=16.8->setuptools->pyhamcrest->-r 
postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/pyparsing-2.2.0.tar.gz
Successfully downloaded pyhamcrest mock setuptools six funcsigs pbr packaging 
appdirs pyparsing
test_as_list_with_unique_labels 

Jenkins build became unstable: beam_PostCommit_Java_MavenInstall #3102

2017-03-30 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PostCommit_Python_Verify #1682

2017-03-30 Thread Apache Jenkins Server
See 


Changes:

[altay] Increase the memory threshold for the direct runner test

[altay] [BEAM-1441] Fix size check for windows and improve error message

[dhalperi] Reopen BigQuery utils after #2271

--
[...truncated 409.87 KB...]
test_par_do_with_multiple_outputs_and_using_return 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
:132:
 UserWarning: Using fallback coder for typehint: List[int].
  warnings.warn('Using fallback coder for typehint: %r.' % typehint)
:132:
 UserWarning: Using fallback coder for typehint: Union[].
  warnings.warn('Using fallback coder for typehint: %r.' % typehint)
DEPRECATION: pip install --download has been deprecated and will be removed in 
the future. Pip now has a download command that should be used instead.
Collecting pyhamcrest (from -r postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/PyHamcrest-1.9.0.tar.gz
Collecting mock (from -r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/mock-2.0.0.tar.gz
Collecting setuptools (from pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/setuptools-34.3.3.zip
Collecting six (from pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/six-1.10.0.tar.gz
Collecting funcsigs>=1 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded 
/tmp/dataflow-requirements-cache/funcsigs-1.0.2.tar.gz
Collecting pbr>=0.11 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/pbr-2.0.0.tar.gz
Collecting packaging>=16.8 (from setuptools->pyhamcrest->-r 
postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/packaging-16.8.tar.gz
Collecting appdirs>=1.4.0 (from setuptools->pyhamcrest->-r 
postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/appdirs-1.4.3.tar.gz
Collecting pyparsing (from packaging>=16.8->setuptools->pyhamcrest->-r 
postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/pyparsing-2.2.0.tar.gz
Successfully downloaded pyhamcrest mock setuptools six funcsigs pbr packaging 
appdirs pyparsing
test_undeclared_side_outputs 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
:132:
 UserWarning: Using fallback coder for typehint: List[int].
  warnings.warn('Using fallback coder for typehint: %r.' % typehint)
:132:
 UserWarning: Using fallback coder for typehint: Union[].
  warnings.warn('Using fallback coder for typehint: %r.' % typehint)
DEPRECATION: pip install --download has been deprecated and will be removed in 
the future. Pip now has a download command that should be used instead.
Collecting pyhamcrest (from -r postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/PyHamcrest-1.9.0.tar.gz
Collecting mock (from -r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/mock-2.0.0.tar.gz
Collecting setuptools (from pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/setuptools-34.3.3.zip
Collecting six (from pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/six-1.10.0.tar.gz
Collecting funcsigs>=1 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded 
/tmp/dataflow-requirements-cache/funcsigs-1.0.2.tar.gz
Collecting pbr>=0.11 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/pbr-2.0.0.tar.gz
Collecting packaging>=16.8 (from setuptools->pyhamcrest->-r 
postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/packaging-16.8.tar.gz
Collecting appdirs>=1.4.0 (from setuptools->pyhamcrest->-r 
postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/appdirs-1.4.3.tar.gz
Collecting pyparsing (from packaging>=16.8->setuptools->pyhamcrest->-r 
postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/pyparsing-2.2.0.tar.gz
Successfully downloaded pyhamcrest mock 

Jenkins build is back to normal : beam_PostCommit_Java_MavenInstall #3101

2017-03-30 Thread Apache Jenkins Server
See 




[GitHub] beam pull request #2389: Add TestStream to Python SDK

2017-03-30 Thread charlesccychen
GitHub user charlesccychen opened a pull request:

https://github.com/apache/beam/pull/2389

Add TestStream to Python SDK

The TestStream will be used for verifying streaming runner semantics.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/charlesccychen/beam test-stream

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/2389.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2389


commit 4238de506ae9a9290bfdb9360f96a90e97f54958
Author: Charles Chen 
Date:   2017-03-31T01:20:04Z

Add TestStream to Python SDK

The TestStream will be used for verifying streaming runner semantics.




---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[GitHub] beam pull request #2388: Change side inputs to be references rather than ful...

2017-03-30 Thread robertwb
GitHub user robertwb opened a pull request:

https://github.com/apache/beam/pull/2388

Change side inputs to be references rather than full PValues.

This is more consistent with the Runner API's structure.

Be sure to do all of the following to help us incorporate your contribution
quickly and easily:

 - [ ] Make sure the PR title is formatted like:
   `[BEAM-] Description of pull request`
 - [ ] Make sure tests pass via `mvn clean verify`. (Even better, enable
   Travis-CI on your fork and ensure the whole test matrix passes).
 - [ ] Replace `` in the title with the actual Jira issue
   number, if there is one.
 - [ ] If this contribution is large, please file an Apache
   [Individual Contributor License Agreement](https://www.apache.org/licenses/icla.txt).

---


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/robertwb/incubator-beam side-inputs

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/2388.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2388


commit 85c60920325b8c06f0120fa8be2a2020d4cbdff4
Author: Robert Bradshaw 
Date:   2017-03-30T15:20:21Z

Change side inputs to be references rather than full PValues.

This is more consistent with the Runner API's structure.






Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Spark #1449

2017-03-30 Thread Apache Jenkins Server
See 


--
[...truncated 220.72 KB...]
 x [deleted] (none) -> origin/pr/942/head
 x [deleted] (none) -> origin/pr/942/merge
 x [deleted] (none) -> origin/pr/943/head
 x [deleted] (none) -> origin/pr/943/merge
 x [deleted] (none) -> origin/pr/944/head
 x [deleted] (none) -> origin/pr/945/head
 x [deleted] (none) -> origin/pr/945/merge
 x [deleted] (none) -> origin/pr/946/head
 x [deleted] (none) -> origin/pr/946/merge
 x [deleted] (none) -> origin/pr/947/head
 x [deleted] (none) -> origin/pr/947/merge
 x [deleted] (none) -> origin/pr/948/head
 x [deleted] (none) -> origin/pr/948/merge
 x [deleted] (none) -> origin/pr/949/head
 x [deleted] (none) -> origin/pr/949/merge
 x [deleted] (none) -> origin/pr/95/head
 x [deleted] (none) -> origin/pr/95/merge
 x [deleted] (none) -> origin/pr/950/head
 x [deleted] (none) -> origin/pr/951/head
 x [deleted] (none) -> origin/pr/951/merge
 x [deleted] (none) -> origin/pr/952/head
 x [deleted] (none) -> origin/pr/952/merge
 x [deleted] (none) -> origin/pr/953/head
 x [deleted] (none) -> origin/pr/954/head
 x [deleted] (none) -> origin/pr/954/merge
 x [deleted] (none) -> origin/pr/955/head
 x [deleted] (none) -> origin/pr/955/merge
 x [deleted] (none) -> origin/pr/956/head
 x [deleted] (none) -> origin/pr/957/head
 x [deleted] (none) -> origin/pr/958/head
 x [deleted] (none) -> origin/pr/959/head
 x [deleted] (none) -> origin/pr/959/merge
 x [deleted] (none) -> origin/pr/96/head
 x [deleted] (none) -> origin/pr/96/merge
 x [deleted] (none) -> origin/pr/960/head
 x [deleted] (none) -> origin/pr/960/merge
 x [deleted] (none) -> origin/pr/961/head
 x [deleted] (none) -> origin/pr/962/head
 x [deleted] (none) -> origin/pr/962/merge
 x [deleted] (none) -> origin/pr/963/head
 x [deleted] (none) -> origin/pr/963/merge
 x [deleted] (none) -> origin/pr/964/head
 x [deleted] (none) -> origin/pr/965/head
 x [deleted] (none) -> origin/pr/965/merge
 x [deleted] (none) -> origin/pr/966/head
 x [deleted] (none) -> origin/pr/967/head
 x [deleted] (none) -> origin/pr/967/merge
 x [deleted] (none) -> origin/pr/968/head
 x [deleted] (none) -> origin/pr/968/merge
 x [deleted] (none) -> origin/pr/969/head
 x [deleted] (none) -> origin/pr/969/merge
 x [deleted] (none) -> origin/pr/97/head
 x [deleted] (none) -> origin/pr/97/merge
 x [deleted] (none) -> origin/pr/970/head
 x [deleted] (none) -> origin/pr/970/merge
 x [deleted] (none) -> origin/pr/971/head
 x [deleted] (none) -> origin/pr/971/merge
 x [deleted] (none) -> origin/pr/972/head
 x [deleted] (none) -> origin/pr/973/head
 x [deleted] (none) -> origin/pr/974/head
 x [deleted] (none) -> origin/pr/974/merge
 x [deleted] (none) -> origin/pr/975/head
 x [deleted] (none) -> origin/pr/975/merge
 x [deleted] (none) -> origin/pr/976/head
 x [deleted] (none) -> origin/pr/976/merge
 x [deleted] (none) -> origin/pr/977/head
 x [deleted] (none) -> origin/pr/977/merge
 x [deleted] (none) -> origin/pr/978/head
 x [deleted] (none) -> origin/pr/978/merge
 x [deleted] (none) -> origin/pr/979/head
 x [deleted] (none) -> origin/pr/979/merge
 x [deleted] (none) -> origin/pr/98/head
 x [deleted] (none) -> origin/pr/980/head
 x [deleted] (none) -> origin/pr/980/merge
 x [deleted] (none) -> origin/pr/981/head
 x [deleted] (none) -> origin/pr/982/head
 x [deleted] (none) -> origin/pr/982/merge
 x [deleted] (none) -> origin/pr/983/head
 x [deleted] (none) -> origin/pr/983/merge
 x [deleted] (none) -> origin/pr/984/head
 x [deleted] (none) -> origin/pr/984/merge
 x [deleted] (none) -> origin/pr/985/head
 x [deleted] (none) -> origin/pr/985/merge
 x [deleted] (none) -> origin/pr/986/head
 x [deleted] (none) -> origin/pr/986/merge
 x [deleted] (none) -> origin/pr/987/head
 x [deleted] (none) -> origin/pr/988/head
 x [deleted] (none) -> origin/pr/988/merge
 x [deleted] (none) -> 

Build failed in Jenkins: beam_PostCommit_Java_MavenInstall #3103

2017-03-30 Thread Apache Jenkins Server
See 


--
[...truncated 216.04 KB...]
 x [deleted] (none) -> origin/pr/942/head
 x [deleted] (none) -> origin/pr/942/merge
 x [deleted] (none) -> origin/pr/943/head
 x [deleted] (none) -> origin/pr/943/merge
 x [deleted] (none) -> origin/pr/944/head
 x [deleted] (none) -> origin/pr/945/head
 x [deleted] (none) -> origin/pr/945/merge
 x [deleted] (none) -> origin/pr/946/head
 x [deleted] (none) -> origin/pr/946/merge
 x [deleted] (none) -> origin/pr/947/head
 x [deleted] (none) -> origin/pr/947/merge
 x [deleted] (none) -> origin/pr/948/head
 x [deleted] (none) -> origin/pr/948/merge
 x [deleted] (none) -> origin/pr/949/head
 x [deleted] (none) -> origin/pr/949/merge
 x [deleted] (none) -> origin/pr/95/head
 x [deleted] (none) -> origin/pr/95/merge
 x [deleted] (none) -> origin/pr/950/head
 x [deleted] (none) -> origin/pr/951/head
 x [deleted] (none) -> origin/pr/951/merge
 x [deleted] (none) -> origin/pr/952/head
 x [deleted] (none) -> origin/pr/952/merge
 x [deleted] (none) -> origin/pr/953/head
 x [deleted] (none) -> origin/pr/954/head
 x [deleted] (none) -> origin/pr/954/merge
 x [deleted] (none) -> origin/pr/955/head
 x [deleted] (none) -> origin/pr/955/merge
 x [deleted] (none) -> origin/pr/956/head
 x [deleted] (none) -> origin/pr/957/head
 x [deleted] (none) -> origin/pr/958/head
 x [deleted] (none) -> origin/pr/959/head
 x [deleted] (none) -> origin/pr/959/merge
 x [deleted] (none) -> origin/pr/96/head
 x [deleted] (none) -> origin/pr/96/merge
 x [deleted] (none) -> origin/pr/960/head
 x [deleted] (none) -> origin/pr/960/merge
 x [deleted] (none) -> origin/pr/961/head
 x [deleted] (none) -> origin/pr/962/head
 x [deleted] (none) -> origin/pr/962/merge
 x [deleted] (none) -> origin/pr/963/head
 x [deleted] (none) -> origin/pr/963/merge
 x [deleted] (none) -> origin/pr/964/head
 x [deleted] (none) -> origin/pr/965/head
 x [deleted] (none) -> origin/pr/965/merge
 x [deleted] (none) -> origin/pr/966/head
 x [deleted] (none) -> origin/pr/967/head
 x [deleted] (none) -> origin/pr/967/merge
 x [deleted] (none) -> origin/pr/968/head
 x [deleted] (none) -> origin/pr/968/merge
 x [deleted] (none) -> origin/pr/969/head
 x [deleted] (none) -> origin/pr/969/merge
 x [deleted] (none) -> origin/pr/97/head
 x [deleted] (none) -> origin/pr/97/merge
 x [deleted] (none) -> origin/pr/970/head
 x [deleted] (none) -> origin/pr/970/merge
 x [deleted] (none) -> origin/pr/971/head
 x [deleted] (none) -> origin/pr/971/merge
 x [deleted] (none) -> origin/pr/972/head
 x [deleted] (none) -> origin/pr/973/head
 x [deleted] (none) -> origin/pr/974/head
 x [deleted] (none) -> origin/pr/974/merge
 x [deleted] (none) -> origin/pr/975/head
 x [deleted] (none) -> origin/pr/975/merge
 x [deleted] (none) -> origin/pr/976/head
 x [deleted] (none) -> origin/pr/976/merge
 x [deleted] (none) -> origin/pr/977/head
 x [deleted] (none) -> origin/pr/977/merge
 x [deleted] (none) -> origin/pr/978/head
 x [deleted] (none) -> origin/pr/978/merge
 x [deleted] (none) -> origin/pr/979/head
 x [deleted] (none) -> origin/pr/979/merge
 x [deleted] (none) -> origin/pr/98/head
 x [deleted] (none) -> origin/pr/980/head
 x [deleted] (none) -> origin/pr/980/merge
 x [deleted] (none) -> origin/pr/981/head
 x [deleted] (none) -> origin/pr/982/head
 x [deleted] (none) -> origin/pr/982/merge
 x [deleted] (none) -> origin/pr/983/head
 x [deleted] (none) -> origin/pr/983/merge
 x [deleted] (none) -> origin/pr/984/head
 x [deleted] (none) -> origin/pr/984/merge
 x [deleted] (none) -> origin/pr/985/head
 x [deleted] (none) -> origin/pr/985/merge
 x [deleted] (none) -> origin/pr/986/head
 x [deleted] (none) -> origin/pr/986/merge
 x [deleted] (none) -> origin/pr/987/head
 x [deleted] (none) -> origin/pr/988/head
 x [deleted] (none) -> origin/pr/988/merge
 x [deleted] (none) -> 

Jenkins build is back to stable : beam_PostCommit_Java_ValidatesRunner_Spark #1448

2017-03-30 Thread Apache Jenkins Server
See 




[jira] [Commented] (BEAM-1579) Runners should verify that PT overrides converged

2017-03-30 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-1579?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15950129#comment-15950129
 ] 

ASF GitHub Bot commented on BEAM-1579:
--

GitHub user tgroh opened a pull request:

https://github.com/apache/beam/pull/2387

[BEAM-1579] Add a Dataflow-specific primitive for creating a view

Be sure to do all of the following to help us incorporate your contribution
quickly and easily:

 - [ ] Make sure the PR title is formatted like:
   `[BEAM-] Description of pull request`
 - [ ] Make sure tests pass via `mvn clean verify`. (Even better, enable
   Travis-CI on your fork and ensure the whole test matrix passes).
 - [ ] Replace `` in the title with the actual Jira issue
   number, if there is one.
 - [ ] If this contribution is large, please file an Apache
   [Individual Contributor License Agreement](https://www.apache.org/licenses/icla.txt).

---
Allows overrides of CreatePCollectionView to work with the batch
override API.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/tgroh/beam create_dataflow_view

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/2387.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2387


commit 1fc6db171eafad63b4ef26a49d2c7db4c42b5bae
Author: Thomas Groh 
Date:   2017-03-31T01:09:26Z

Add a Dataflow-specific primitive for creating a view

Allows overrides of CreatePCollectionView to work with the batch
override API.




> Runners should verify that PT overrides converged
> -
>
> Key: BEAM-1579
> URL: https://issues.apache.org/jira/browse/BEAM-1579
> Project: Beam
>  Issue Type: Bug
>  Components: runner-core, runner-dataflow, runner-direct
>Reporter: Eugene Kirpichov
>Assignee: Thomas Groh
>
> PT overrides are applied in order, see 
> https://issues.apache.org/jira/browse/BEAM-1578 .
> To make sure that the order is correct and avoid confusing errors, after 
> applying the overrides in order, we should verify that the pipeline has 
> converged, i.e. the overrides no longer match.
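The convergence check described above amounts to a fixed-point test: apply the overrides in order, then assert that no override still matches anything in the resulting pipeline. A minimal Python sketch of that idea (all names are hypothetical, and pipelines are modeled as plain lists of transform names rather than the actual Beam APIs):

```python
def apply_overrides(pipeline, overrides):
    """Apply each (matcher, replacement) override in order, then verify
    that the pipeline has converged: no matcher may still match.

    `pipeline` is modeled as a list of transform names; the real Beam
    structures are richer, so this is only an illustration.
    """
    for matcher, replacement in overrides:
        pipeline = [replacement(t) if matcher(t) else t for t in pipeline]
    # Convergence check: after applying all overrides in order,
    # no override should match any remaining transform.
    for matcher, _ in overrides:
        for t in pipeline:
            if matcher(t):
                raise RuntimeError(
                    'Pipeline did not converge: %r still matches' % t)
    return pipeline
```

With the overrides in the wrong order, a replacement can produce a transform that an earlier override would still match; the check then raises instead of silently leaving a mis-ordered pipeline.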



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[GitHub] beam pull request #2346: Fixed: small documentation issue in HDFS IO

2017-03-30 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/beam/pull/2346




[2/2] beam git commit: This closes #2346

2017-03-30 Thread dhalperi
This closes #2346


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/b1c287bd
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/b1c287bd
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/b1c287bd

Branch: refs/heads/master
Commit: b1c287bd5946de70e4848775bbf400e787392d27
Parents: 7057d1e aa42067
Author: Dan Halperin 
Authored: Thu Mar 30 18:10:29 2017 -0700
Committer: Dan Halperin 
Committed: Thu Mar 30 18:10:29 2017 -0700

--
 sdks/java/io/hdfs/README.md| 6 +++---
 .../main/java/org/apache/beam/sdk/io/hdfs/HDFSFileSource.java  | 2 +-
 2 files changed, 4 insertions(+), 4 deletions(-)
--




[GitHub] beam pull request #2387: [BEAM-1579] Add a Dataflow-specific primitive for c...

2017-03-30 Thread tgroh
GitHub user tgroh opened a pull request:

https://github.com/apache/beam/pull/2387

[BEAM-1579] Add a Dataflow-specific primitive for creating a view

Be sure to do all of the following to help us incorporate your contribution
quickly and easily:

 - [ ] Make sure the PR title is formatted like:
   `[BEAM-] Description of pull request`
 - [ ] Make sure tests pass via `mvn clean verify`. (Even better, enable
   Travis-CI on your fork and ensure the whole test matrix passes).
 - [ ] Replace `` in the title with the actual Jira issue
   number, if there is one.
 - [ ] If this contribution is large, please file an Apache
   [Individual Contributor License Agreement](https://www.apache.org/licenses/icla.txt).

---
Allows overrides of CreatePCollectionView to work with the batch
override API.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/tgroh/beam create_dataflow_view

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/2387.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2387


commit 1fc6db171eafad63b4ef26a49d2c7db4c42b5bae
Author: Thomas Groh 
Date:   2017-03-31T01:09:26Z

Add a Dataflow-specific primitive for creating a view

Allows overrides of CreatePCollectionView to work with the batch
override API.






[1/2] beam git commit: Fixed: small documentation issue in HDFS IO

2017-03-30 Thread dhalperi
Repository: beam
Updated Branches:
  refs/heads/master 7057d1e22 -> b1c287bd5


Fixed: small documentation issue in HDFS IO


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/aa420678
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/aa420678
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/aa420678

Branch: refs/heads/master
Commit: aa4206787c79272ec9af5f96f6170c61dd585fea
Parents: 7057d1e
Author: peay 
Authored: Sun Mar 26 11:20:40 2017 -0400
Committer: Dan Halperin 
Committed: Thu Mar 30 18:10:27 2017 -0700

--
 sdks/java/io/hdfs/README.md| 6 +++---
 .../main/java/org/apache/beam/sdk/io/hdfs/HDFSFileSource.java  | 2 +-
 2 files changed, 4 insertions(+), 4 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/aa420678/sdks/java/io/hdfs/README.md
--
diff --git a/sdks/java/io/hdfs/README.md b/sdks/java/io/hdfs/README.md
index 1c6134f..3a734f2 100644
--- a/sdks/java/io/hdfs/README.md
+++ b/sdks/java/io/hdfs/README.md
@@ -31,13 +31,13 @@ A `HDFSFileSource` can be read from using the
 ```java
 HDFSFileSource source = HDFSFileSource.from(path, MyInputFormat.class,
   MyKey.class, MyValue.class);
-PCollection<KV<MyKey, MyValue>> records = Read.from(mySource);
+PCollection<KV<MyKey, MyValue>> records = pipeline.apply(Read.from(mySource));
 ```
 
Alternatively, the `readFrom` method is a convenience method that returns a read transform. For example:
 
 ```java
-PCollection<KV<MyKey, MyValue>> records = HDFSFileSource.readFrom(path,
-  MyInputFormat.class, MyKey.class, MyValue.class);
+PCollection<KV<MyKey, MyValue>> records = pipeline.apply(HDFSFileSource.readFrom(path,
+  MyInputFormat.class, MyKey.class, MyValue.class));
 ```

http://git-wip-us.apache.org/repos/asf/beam/blob/aa420678/sdks/java/io/hdfs/src/main/java/org/apache/beam/sdk/io/hdfs/HDFSFileSource.java
--
diff --git a/sdks/java/io/hdfs/src/main/java/org/apache/beam/sdk/io/hdfs/HDFSFileSource.java b/sdks/java/io/hdfs/src/main/java/org/apache/beam/sdk/io/hdfs/HDFSFileSource.java
index 357a527..e317c6e 100644
--- a/sdks/java/io/hdfs/src/main/java/org/apache/beam/sdk/io/hdfs/HDFSFileSource.java
+++ b/sdks/java/io/hdfs/src/main/java/org/apache/beam/sdk/io/hdfs/HDFSFileSource.java
@@ -90,7 +90,7 @@ import org.slf4j.LoggerFactory;
  * {@code
  * HDFSFileSource source = HDFSFileSource.from(path, MyInputFormat.class,
  *   MyKey.class, MyValue.class);
- * PCollection<KV<MyKey, MyValue>> records = Read.from(mySource);
+ * PCollection<KV<MyKey, MyValue>> records = pipeline.apply(Read.from(mySource));
  * }
  * 
  *



[jira] [Commented] (BEAM-1836) Reopen BigQuery utils after #2271

2017-03-30 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-1836?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15950120#comment-15950120
 ] 

ASF GitHub Bot commented on BEAM-1836:
--

Github user asfgit closed the pull request at:

https://github.com/apache/beam/pull/2366


> Reopen BigQuery utils after #2271
> -
>
> Key: BEAM-1836
> URL: https://issues.apache.org/jira/browse/BEAM-1836
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-core
>Reporter: Rafal Wojdyla
>Assignee: Rafal Wojdyla
>
> https://github.com/apache/beam/pull/2271 splits BigQueryIO, should not change 
> any functionality but closes some of the useful BigQuery utils like:
>  * {{toTableSpec}}
>  * {{parseTableSpec}}
> we in scio use those and would prefer not to reimplement them.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[GitHub] beam pull request #2366: [BEAM-1836] Reopen BigQuery utils after #2271

2017-03-30 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/beam/pull/2366




[2/2] beam git commit: This closes #2366

2017-03-30 Thread dhalperi
This closes #2366


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/7057d1e2
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/7057d1e2
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/7057d1e2

Branch: refs/heads/master
Commit: 7057d1e2249c0bc78bd3ca8cec513b31dd6c9423
Parents: 3b4e0fb 077ee5e
Author: Dan Halperin 
Authored: Thu Mar 30 17:58:34 2017 -0700
Committer: Dan Halperin 
Committed: Thu Mar 30 17:58:34 2017 -0700

--
 .../org/apache/beam/sdk/io/gcp/bigquery/BigQueryHelpers.java   | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)
--




[1/2] beam git commit: Reopen BigQuery utils after #2271

2017-03-30 Thread dhalperi
Repository: beam
Updated Branches:
  refs/heads/master 3b4e0fb16 -> 7057d1e22


Reopen BigQuery utils after #2271


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/077ee5ef
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/077ee5ef
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/077ee5ef

Branch: refs/heads/master
Commit: 077ee5efe7caf16354a3b9afbcc81f0a92abe248
Parents: 3b4e0fb
Author: Rafal Wojdyla 
Authored: Wed Mar 29 20:14:12 2017 -0400
Committer: Dan Halperin 
Committed: Thu Mar 30 17:58:32 2017 -0700

--
 .../org/apache/beam/sdk/io/gcp/bigquery/BigQueryHelpers.java   | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/077ee5ef/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigquery/BigQueryHelpers.java
--
diff --git a/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigquery/BigQueryHelpers.java b/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigquery/BigQueryHelpers.java
index c5156e9..846103d 100644
--- a/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigquery/BigQueryHelpers.java
+++ b/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigquery/BigQueryHelpers.java
@@ -43,7 +43,7 @@ import org.apache.beam.sdk.transforms.SerializableFunction;
 /**
  * A set of helper functions and classes used by {@link BigQueryIO}.
  */
-class BigQueryHelpers {
+public class BigQueryHelpers {
   private static final String RESOURCE_NOT_FOUND_ERROR =
   "BigQuery %1$s not found for table \"%2$s\" . Please create the %1$s 
before pipeline"
   + " execution. If the %1$s is created by an earlier stage of the 
pipeline, this"
@@ -78,7 +78,7 @@ class BigQueryHelpers {
   /**
* Returns a canonical string representation of the {@link TableReference}.
*/
-  static String toTableSpec(TableReference ref) {
+  public static String toTableSpec(TableReference ref) {
 StringBuilder sb = new StringBuilder();
 if (ref.getProjectId() != null) {
   sb.append(ref.getProjectId());
@@ -104,7 +104,7 @@ class BigQueryHelpers {
*
* If the project id is omitted, the default project id is used.
*/
-  static TableReference parseTableSpec(String tableSpec) {
+  public static TableReference parseTableSpec(String tableSpec) {
 Matcher match = BigQueryIO.TABLE_SPEC.matcher(tableSpec);
 if (!match.matches()) {
   throw new IllegalArgumentException(
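The two helpers made public above convert between a table reference and its canonical `project:dataset.table` string form. A rough Python illustration of that round trip (the regex and function names here are assumptions for illustration only; the real pattern lives in `BigQueryIO.TABLE_SPEC`):

```python
import re

# Matches "project:dataset.table" or "dataset.table"; the project part is
# optional. This regex is a simplified assumption, not Beam's actual pattern.
_TABLE_SPEC = re.compile(
    r'^(?:(?P<project>[^:.]+):)?(?P<dataset>[^:.]+)\.(?P<table>[^:.]+)$')


def to_table_spec(project, dataset, table):
    """Canonical string form of a table reference (cf. toTableSpec)."""
    if project:
        return '%s:%s.%s' % (project, dataset, table)
    return '%s.%s' % (dataset, table)


def parse_table_spec(spec):
    """Inverse of to_table_spec (cf. parseTableSpec)."""
    match = _TABLE_SPEC.match(spec)
    if not match:
        raise ValueError('Not a valid table spec: %r' % spec)
    return match.group('project'), match.group('dataset'), match.group('table')
```

For example, `parse_table_spec('myproject:mydataset.mytable')` yields the three components, and `to_table_spec` reassembles them; when the project is omitted, the default project would be used, as the Javadoc above notes.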



[GitHub] beam pull request #2386: Improve PTransformMatcher ToStrings

2017-03-30 Thread tgroh
GitHub user tgroh opened a pull request:

https://github.com/apache/beam/pull/2386

Improve PTransformMatcher ToStrings

Be sure to do all of the following to help us incorporate your contribution
quickly and easily:

 - [ ] Make sure the PR title is formatted like:
   `[BEAM-] Description of pull request`
 - [ ] Make sure tests pass via `mvn clean verify`. (Even better, enable
   Travis-CI on your fork and ensure the whole test matrix passes).
 - [ ] Replace `` in the title with the actual Jira issue
   number, if there is one.
 - [ ] If this contribution is large, please file an Apache
   [Individual Contributor License Agreement](https://www.apache.org/licenses/icla.txt).

---


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/tgroh/beam matcher_tostrings

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/2386.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2386


commit df364d6da821d44c87acaceac13acc3461d7f6fe
Author: Thomas Groh 
Date:   2017-03-31T00:54:29Z

Improve PTransformMatcher ToStrings






[jira] [Created] (BEAM-1845) ERROR: Error fetching remote repo 'origin'

2017-03-30 Thread Ahmet Altay (JIRA)
Ahmet Altay created BEAM-1845:
-

 Summary: ERROR: Error fetching remote repo 'origin'
 Key: BEAM-1845
 URL: https://issues.apache.org/jira/browse/BEAM-1845
 Project: Beam
  Issue Type: Bug
  Components: build-system, sdk-py
Reporter: Ahmet Altay
Assignee: Jason Kuster


Happened twice in a row, do we know what is causing this?

https://builds.apache.org/view/Beam/job/beam_PostCommit_Python_Verify/1680/consoleFull
https://builds.apache.org/view/Beam/job/beam_PostCommit_Python_Verify/1681/console



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


Build failed in Jenkins: beam_PostCommit_Python_Verify #1681

2017-03-30 Thread Apache Jenkins Server
See 


--
Started by GitHub push by asfgit
[EnvInject] - Loading node environment variables.
Building remotely on beam3 (beam) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Pruning obsolete local branches
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/*:refs/remotes/origin/pr/* --prune
ERROR: Error fetching remote repo 'origin'
hudson.plugins.git.GitException: Failed to fetch from https://github.com/apache/beam.git
at hudson.plugins.git.GitSCM.fetchFrom(GitSCM.java:806)
at hudson.plugins.git.GitSCM.retrieveChanges(GitSCM.java:1070)
at hudson.plugins.git.GitSCM.checkout(GitSCM.java:1101)
at hudson.scm.SCM.checkout(SCM.java:495)
at hudson.model.AbstractProject.checkout(AbstractProject.java:1278)
at hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:604)
at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:529)
at hudson.model.Run.execute(Run.java:1728)
at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
at hudson.model.ResourceController.execute(ResourceController.java:98)
at hudson.model.Executor.run(Executor.java:404)
Caused by: hudson.plugins.git.GitException: Command "git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/*:refs/remotes/origin/pr/* --prune" returned status code 128:
stdout: 
stderr: error: RPC failed; result=18, HTTP code = 200
fatal: The remote end hung up unexpectedly

at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.launchCommandIn(CliGitAPIImpl.java:1793)
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.launchCommandWithCredentials(CliGitAPIImpl.java:1519)
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.access$300(CliGitAPIImpl.java:64)
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl$1.execute(CliGitAPIImpl.java:315)
at org.jenkinsci.plugins.gitclient.RemoteGitImpl$CommandInvocationHandler$1.call(RemoteGitImpl.java:153)
at org.jenkinsci.plugins.gitclient.RemoteGitImpl$CommandInvocationHandler$1.call(RemoteGitImpl.java:146)
at hudson.remoting.UserRequest.perform(UserRequest.java:153)
at hudson.remoting.UserRequest.perform(UserRequest.java:50)
at hudson.remoting.Request$2.run(Request.java:336)
at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:68)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
at ..remote call to beam3(Native Method)
at hudson.remoting.Channel.attachCallSiteStackTrace(Channel.java:1537)
at hudson.remoting.UserResponse.retrieve(UserRequest.java:253)
at hudson.remoting.Channel.call(Channel.java:822)
at org.jenkinsci.plugins.gitclient.RemoteGitImpl$CommandInvocationHandler.execute(RemoteGitImpl.java:146)
at sun.reflect.GeneratedMethodAccessor699.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.jenkinsci.plugins.gitclient.RemoteGitImpl$CommandInvocationHandler.invoke(RemoteGitImpl.java:132)
at com.sun.proxy.$Proxy97.execute(Unknown Source)
at hudson.plugins.git.GitSCM.fetchFrom(GitSCM.java:804)
... 11 more
ERROR: null
Retrying after 10 seconds
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Pruning obsolete local branches
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/*:refs/remotes/origin/pr/* --prune
ERROR: Error fetching remote repo 'origin'
hudson.plugins.git.GitException: Failed to fetch from https://github.com/apache/beam.git
at hudson.plugins.git.GitSCM.fetchFrom(GitSCM.java:806)
at hudson.plugins.git.GitSCM.retrieveChanges(GitSCM.java:1070)
at 

[jira] [Commented] (BEAM-1441) Add an IOChannelFactory interface to Python SDK

2017-03-30 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-1441?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15950114#comment-15950114
 ] 

ASF GitHub Bot commented on BEAM-1441:
--

Github user asfgit closed the pull request at:

https://github.com/apache/beam/pull/2382


> Add an IOChannelFactory interface to Python SDK
> ---
>
> Key: BEAM-1441
> URL: https://issues.apache.org/jira/browse/BEAM-1441
> Project: Beam
>  Issue Type: New Feature
>  Components: sdk-py
>Reporter: Chamikara Jayalath
>Assignee: Sourabh Bajaj
> Fix For: First stable release
>
>
> Based on proposal [1], an IOChannelFactory interface was added to Java SDK  
> [2].
> We should add a similar interface to Python SDK and provide proper 
> implementations for native files, GCS, and other useful formats.
> Python SDK currently has a basic ChannelFactory interface [3] which is used 
> by FileBasedSource [4].
> [1] 
> https://docs.google.com/document/d/11TdPyZ9_zmjokhNWM3Id-XJsVG3qel2lhdKTknmZ_7M/edit#heading=h.kpqagzh8i11w
> [2] 
> https://github.com/apache/beam/blob/master/sdks/java/core/src/main/java/org/apache/beam/sdk/util/IOChannelFactory.java
> [3] 
> https://github.com/apache/beam/blob/master/sdks/python/apache_beam/io/fileio.py#L107
> [4] 
> https://github.com/apache/beam/blob/master/sdks/python/apache_beam/io/filebasedsource.py
>  



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[GitHub] beam pull request #2382: [BEAM-1441] Fix size check for windows and improve ...

2017-03-30 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/beam/pull/2382




Build failed in Jenkins: beam_PostCommit_Python_Verify #1680

2017-03-30 Thread Apache Jenkins Server
See 


--
[...truncated 219.17 KB...]
 x [deleted] (none) -> origin/pr/942/head
 x [deleted] (none) -> origin/pr/942/merge
 x [deleted] (none) -> origin/pr/943/head
 x [deleted] (none) -> origin/pr/943/merge
 x [deleted] (none) -> origin/pr/944/head
 x [deleted] (none) -> origin/pr/945/head
 x [deleted] (none) -> origin/pr/945/merge
 x [deleted] (none) -> origin/pr/946/head
 x [deleted] (none) -> origin/pr/946/merge
 x [deleted] (none) -> origin/pr/947/head
 x [deleted] (none) -> origin/pr/947/merge
 x [deleted] (none) -> origin/pr/948/head
 x [deleted] (none) -> origin/pr/948/merge
 x [deleted] (none) -> origin/pr/949/head
 x [deleted] (none) -> origin/pr/949/merge
 x [deleted] (none) -> origin/pr/95/head
 x [deleted] (none) -> origin/pr/95/merge
 x [deleted] (none) -> origin/pr/950/head
 x [deleted] (none) -> origin/pr/951/head
 x [deleted] (none) -> origin/pr/951/merge
 x [deleted] (none) -> origin/pr/952/head
 x [deleted] (none) -> origin/pr/952/merge
 x [deleted] (none) -> origin/pr/953/head
 x [deleted] (none) -> origin/pr/954/head
 x [deleted] (none) -> origin/pr/954/merge
 x [deleted] (none) -> origin/pr/955/head
 x [deleted] (none) -> origin/pr/955/merge
 x [deleted] (none) -> origin/pr/956/head
 x [deleted] (none) -> origin/pr/957/head
 x [deleted] (none) -> origin/pr/958/head
 x [deleted] (none) -> origin/pr/959/head
 x [deleted] (none) -> origin/pr/959/merge
 x [deleted] (none) -> origin/pr/96/head
 x [deleted] (none) -> origin/pr/96/merge
 x [deleted] (none) -> origin/pr/960/head
 x [deleted] (none) -> origin/pr/960/merge
 x [deleted] (none) -> origin/pr/961/head
 x [deleted] (none) -> origin/pr/962/head
 x [deleted] (none) -> origin/pr/962/merge
 x [deleted] (none) -> origin/pr/963/head
 x [deleted] (none) -> origin/pr/963/merge
 x [deleted] (none) -> origin/pr/964/head
 x [deleted] (none) -> origin/pr/965/head
 x [deleted] (none) -> origin/pr/965/merge
 x [deleted] (none) -> origin/pr/966/head
 x [deleted] (none) -> origin/pr/967/head
 x [deleted] (none) -> origin/pr/967/merge
 x [deleted] (none) -> origin/pr/968/head
 x [deleted] (none) -> origin/pr/968/merge
 x [deleted] (none) -> origin/pr/969/head
 x [deleted] (none) -> origin/pr/969/merge
 x [deleted] (none) -> origin/pr/97/head
 x [deleted] (none) -> origin/pr/97/merge
 x [deleted] (none) -> origin/pr/970/head
 x [deleted] (none) -> origin/pr/970/merge
 x [deleted] (none) -> origin/pr/971/head
 x [deleted] (none) -> origin/pr/971/merge
 x [deleted] (none) -> origin/pr/972/head
 x [deleted] (none) -> origin/pr/973/head
 x [deleted] (none) -> origin/pr/974/head
 x [deleted] (none) -> origin/pr/974/merge
 x [deleted] (none) -> origin/pr/975/head
 x [deleted] (none) -> origin/pr/975/merge
 x [deleted] (none) -> origin/pr/976/head
 x [deleted] (none) -> origin/pr/976/merge
 x [deleted] (none) -> origin/pr/977/head
 x [deleted] (none) -> origin/pr/977/merge
 x [deleted] (none) -> origin/pr/978/head
 x [deleted] (none) -> origin/pr/978/merge
 x [deleted] (none) -> origin/pr/979/head
 x [deleted] (none) -> origin/pr/979/merge
 x [deleted] (none) -> origin/pr/98/head
 x [deleted] (none) -> origin/pr/980/head
 x [deleted] (none) -> origin/pr/980/merge
 x [deleted] (none) -> origin/pr/981/head
 x [deleted] (none) -> origin/pr/982/head
 x [deleted] (none) -> origin/pr/982/merge
 x [deleted] (none) -> origin/pr/983/head
 x [deleted] (none) -> origin/pr/983/merge
 x [deleted] (none) -> origin/pr/984/head
 x [deleted] (none) -> origin/pr/984/merge
 x [deleted] (none) -> origin/pr/985/head
 x [deleted] (none) -> origin/pr/985/merge
 x [deleted] (none) -> origin/pr/986/head
 x [deleted] (none) -> origin/pr/986/merge
 x [deleted] (none) -> origin/pr/987/head
 x [deleted] (none) -> origin/pr/988/head
 x [deleted] (none) -> origin/pr/988/merge
 x [deleted] (none) -> origin/pr/989/head
 

[2/2] beam git commit: This closes #2382

2017-03-30 Thread altay
This closes #2382


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/3b4e0fb1
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/3b4e0fb1
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/3b4e0fb1

Branch: refs/heads/master
Commit: 3b4e0fb1686108e2d74b6dd3cea1a23398462270
Parents: 07d9bd5 6ceae06
Author: Ahmet Altay 
Authored: Thu Mar 30 17:48:02 2017 -0700
Committer: Ahmet Altay 
Committed: Thu Mar 30 17:48:02 2017 -0700

--
 sdks/python/apache_beam/io/filesystem.py |  8 +---
 sdks/python/apache_beam/io/gcp/gcsfilesystem_test.py | 12 
 sdks/python/apache_beam/io/localfilesystem_test.py   | 12 
 3 files changed, 21 insertions(+), 11 deletions(-)
--




Build failed in Jenkins: beam_PostCommit_Java_MavenInstall #3100

2017-03-30 Thread Apache Jenkins Server
See 


--
[...truncated 219.29 KB...]
 x [deleted] (none) -> origin/pr/942/head
 x [deleted] (none) -> origin/pr/942/merge
 x [deleted] (none) -> origin/pr/943/head
 x [deleted] (none) -> origin/pr/943/merge
 x [deleted] (none) -> origin/pr/944/head
 x [deleted] (none) -> origin/pr/945/head
 x [deleted] (none) -> origin/pr/945/merge
 x [deleted] (none) -> origin/pr/946/head
 x [deleted] (none) -> origin/pr/946/merge
 x [deleted] (none) -> origin/pr/947/head
 x [deleted] (none) -> origin/pr/947/merge
 x [deleted] (none) -> origin/pr/948/head
 x [deleted] (none) -> origin/pr/948/merge
 x [deleted] (none) -> origin/pr/949/head
 x [deleted] (none) -> origin/pr/949/merge
 x [deleted] (none) -> origin/pr/95/head
 x [deleted] (none) -> origin/pr/95/merge
 x [deleted] (none) -> origin/pr/950/head
 x [deleted] (none) -> origin/pr/951/head
 x [deleted] (none) -> origin/pr/951/merge
 x [deleted] (none) -> origin/pr/952/head
 x [deleted] (none) -> origin/pr/952/merge
 x [deleted] (none) -> origin/pr/953/head
 x [deleted] (none) -> origin/pr/954/head
 x [deleted] (none) -> origin/pr/954/merge
 x [deleted] (none) -> origin/pr/955/head
 x [deleted] (none) -> origin/pr/955/merge
 x [deleted] (none) -> origin/pr/956/head
 x [deleted] (none) -> origin/pr/957/head
 x [deleted] (none) -> origin/pr/958/head
 x [deleted] (none) -> origin/pr/959/head
 x [deleted] (none) -> origin/pr/959/merge
 x [deleted] (none) -> origin/pr/96/head
 x [deleted] (none) -> origin/pr/96/merge
 x [deleted] (none) -> origin/pr/960/head
 x [deleted] (none) -> origin/pr/960/merge
 x [deleted] (none) -> origin/pr/961/head
 x [deleted] (none) -> origin/pr/962/head
 x [deleted] (none) -> origin/pr/962/merge
 x [deleted] (none) -> origin/pr/963/head
 x [deleted] (none) -> origin/pr/963/merge
 x [deleted] (none) -> origin/pr/964/head
 x [deleted] (none) -> origin/pr/965/head
 x [deleted] (none) -> origin/pr/965/merge
 x [deleted] (none) -> origin/pr/966/head
 x [deleted] (none) -> origin/pr/967/head
 x [deleted] (none) -> origin/pr/967/merge
 x [deleted] (none) -> origin/pr/968/head
 x [deleted] (none) -> origin/pr/968/merge
 x [deleted] (none) -> origin/pr/969/head
 x [deleted] (none) -> origin/pr/969/merge
 x [deleted] (none) -> origin/pr/97/head
 x [deleted] (none) -> origin/pr/97/merge
 x [deleted] (none) -> origin/pr/970/head
 x [deleted] (none) -> origin/pr/970/merge
 x [deleted] (none) -> origin/pr/971/head
 x [deleted] (none) -> origin/pr/971/merge
 x [deleted] (none) -> origin/pr/972/head
 x [deleted] (none) -> origin/pr/973/head
 x [deleted] (none) -> origin/pr/974/head
 x [deleted] (none) -> origin/pr/974/merge
 x [deleted] (none) -> origin/pr/975/head
 x [deleted] (none) -> origin/pr/975/merge
 x [deleted] (none) -> origin/pr/976/head
 x [deleted] (none) -> origin/pr/976/merge
 x [deleted] (none) -> origin/pr/977/head
 x [deleted] (none) -> origin/pr/977/merge
 x [deleted] (none) -> origin/pr/978/head
 x [deleted] (none) -> origin/pr/978/merge
 x [deleted] (none) -> origin/pr/979/head
 x [deleted] (none) -> origin/pr/979/merge
 x [deleted] (none) -> origin/pr/98/head
 x [deleted] (none) -> origin/pr/980/head
 x [deleted] (none) -> origin/pr/980/merge
 x [deleted] (none) -> origin/pr/981/head
 x [deleted] (none) -> origin/pr/982/head
 x [deleted] (none) -> origin/pr/982/merge
 x [deleted] (none) -> origin/pr/983/head
 x [deleted] (none) -> origin/pr/983/merge
 x [deleted] (none) -> origin/pr/984/head
 x [deleted] (none) -> origin/pr/984/merge
 x [deleted] (none) -> origin/pr/985/head
 x [deleted] (none) -> origin/pr/985/merge
 x [deleted] (none) -> origin/pr/986/head
 x [deleted] (none) -> origin/pr/986/merge
 x [deleted] (none) -> origin/pr/987/head
 x [deleted] (none) -> origin/pr/988/head
 x [deleted] (none) -> origin/pr/988/merge
 x [deleted] (none) -> 

[1/2] beam git commit: [BEAM-1441] Fix size check for windows and improve error message

2017-03-30 Thread altay
Repository: beam
Updated Branches:
  refs/heads/master 07d9bd5f8 -> 3b4e0fb16


[BEAM-1441] Fix size check for windows and improve error message


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/6ceae061
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/6ceae061
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/6ceae061

Branch: refs/heads/master
Commit: 6ceae061e1e3c8dadd575f2c9f4e12dea391ae66
Parents: 07d9bd5
Author: Sourabh Bajaj 
Authored: Thu Mar 30 17:17:01 2017 -0700
Committer: Ahmet Altay 
Committed: Thu Mar 30 17:47:58 2017 -0700

--
 sdks/python/apache_beam/io/filesystem.py |  8 +---
 sdks/python/apache_beam/io/gcp/gcsfilesystem_test.py | 12 
 sdks/python/apache_beam/io/localfilesystem_test.py   | 12 
 3 files changed, 21 insertions(+), 11 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/6ceae061/sdks/python/apache_beam/io/filesystem.py
--
diff --git a/sdks/python/apache_beam/io/filesystem.py 
b/sdks/python/apache_beam/io/filesystem.py
index 14493c0..b0d2f48 100644
--- a/sdks/python/apache_beam/io/filesystem.py
+++ b/sdks/python/apache_beam/io/filesystem.py
@@ -269,8 +269,9 @@ class FileMetadata(object):
   """
   def __init__(self, path, size_in_bytes):
 assert isinstance(path, basestring) and path, "Path should be a string"
-assert isinstance(size_in_bytes, int) and size_in_bytes >= 0, \
-"Size of bytes should be greater than equal to zero"
+assert isinstance(size_in_bytes, (int, long)) and size_in_bytes >= 0, \
+"Invalid value for size_in_bytes should %s (of type %s)" % (
+size_in_bytes, type(size_in_bytes))
 self.path = path
 self.size_in_bytes = size_in_bytes
 
@@ -312,7 +313,8 @@ class BeamIOError(IOError):
 may have failed anywhere so the user should use match to determine
 the current state of the system.
 """
-super(BeamIOError, self).__init__(msg)
+message = "%s with exceptions %s" % (msg, exception_details)
+super(BeamIOError, self).__init__(message)
 self.exception_details = exception_details
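
A Python 3 rendering of the tightened check in the hunk above, as a standalone sketch (the original targets Python 2, hence `basestring` and `long`; the error-message wording here is paraphrased for clarity and is not the committed text):

```python
class FileMetadata(object):
    def __init__(self, path, size_in_bytes):
        assert isinstance(path, str) and path, "Path should be a string"
        # Reject non-integer or negative sizes, and report the offending
        # value and its type in the assertion message.
        assert isinstance(size_in_bytes, int) and size_in_bytes >= 0, \
            "Invalid value for size_in_bytes: %s (of type %s)" % (
                size_in_bytes, type(size_in_bytes))
        self.path = path
        self.size_in_bytes = size_in_bytes
```

On Python 2 the `(int, long)` tuple in the committed change additionally accepts sizes larger than `sys.maxint`; Python 3's unified `int` makes that distinction unnecessary.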
 
 

http://git-wip-us.apache.org/repos/asf/beam/blob/6ceae061/sdks/python/apache_beam/io/gcp/gcsfilesystem_test.py
--
diff --git a/sdks/python/apache_beam/io/gcp/gcsfilesystem_test.py 
b/sdks/python/apache_beam/io/gcp/gcsfilesystem_test.py
index 3fe5cce..73a3893 100644
--- a/sdks/python/apache_beam/io/gcp/gcsfilesystem_test.py
+++ b/sdks/python/apache_beam/io/gcp/gcsfilesystem_test.py
@@ -68,7 +68,8 @@ class GCSFileSystemTest(unittest.TestCase):
 file_system = gcsfilesystem.GCSFileSystem()
 with self.assertRaises(BeamIOError) as error:
   file_system.match(['gs://bucket/'])
-self.assertEqual(error.exception.message, 'Match operation failed')
+self.assertTrue(
+error.exception.message.startswith('Match operation failed'))
 self.assertEqual(error.exception.exception_details, expected_results)
 gcsio_mock.size_of_files_in_glob.assert_called_once_with('gs://bucket/*')
 
@@ -148,7 +149,8 @@ class GCSFileSystemTest(unittest.TestCase):
 file_system = gcsfilesystem.GCSFileSystem()
 with self.assertRaises(BeamIOError) as error:
   file_system.copy(sources, destinations)
-self.assertEqual(error.exception.message, 'Copy operation failed')
+self.assertTrue(
+error.exception.message.startswith('Copy operation failed'))
 self.assertEqual(error.exception.exception_details, expected_results)
 
 gcsio_mock.copy.assert_called_once_with(
@@ -240,7 +242,8 @@ class GCSFileSystemTest(unittest.TestCase):
 file_system = gcsfilesystem.GCSFileSystem()
 with self.assertRaises(BeamIOError) as error:
   file_system.rename(sources, destinations)
-self.assertEqual(error.exception.message, 'Rename operation failed')
+self.assertTrue(
+error.exception.message.startswith('Rename operation failed'))
 self.assertEqual(error.exception.exception_details, expected_results)
 
 gcsio_mock.copy_batch.assert_called_once_with([
@@ -288,6 +291,7 @@ class GCSFileSystemTest(unittest.TestCase):
 file_system = gcsfilesystem.GCSFileSystem()
 with self.assertRaises(BeamIOError) as error:
   file_system.delete(files)
-self.assertEqual(error.exception.message, 'Delete operation failed')
+self.assertTrue(
+error.exception.message.startswith('Delete operation failed'))
 self.assertEqual(error.exception.exception_details, expected_results)
 gcsio_mock.delete_batch.assert_called()
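
The switch from `assertEqual` to a `startswith` check in the hunks above follows from the `BeamIOError` change in the same commit: the message now embeds `exception_details`, so exact-equality checks on the message text break. A minimal self-contained sketch:

```python
class BeamIOError(IOError):
    def __init__(self, msg, exception_details):
        # The base message now carries the per-path failure details, so
        # callers can no longer rely on the exact message text -- only on
        # its prefix.
        message = "%s with exceptions %s" % (msg, exception_details)
        super(BeamIOError, self).__init__(message)
        self.exception_details = exception_details

err = BeamIOError('Match operation failed', {'gs://bucket/*': 'not found'})
assert str(err).startswith('Match operation failed')
```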


[jira] [Commented] (BEAM-1844) test_memory_usage fails in post commit

2017-03-30 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-1844?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15950110#comment-15950110
 ] 

ASF GitHub Bot commented on BEAM-1844:
--

Github user asfgit closed the pull request at:

https://github.com/apache/beam/pull/2384


> test_memory_usage fails in post commit
> --
>
> Key: BEAM-1844
> URL: https://issues.apache.org/jira/browse/BEAM-1844
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-py
>Reporter: Ahmet Altay
>Assignee: Ahmet Altay
>
> ...
>   File 
> "/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python_Verify/sdks/python/apache_beam/typehints/typecheck.py",
>  line 78, in wrapper
> result = method(*args, **kwargs)
>   File 
> "/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python_Verify/sdks/python/apache_beam/transforms/core.py",
>  line 719, in 
> wrapper = lambda x, *args, **kwargs: [fn(x, *args, **kwargs)]
>   File 
> "/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python_Verify/sdks/python/apache_beam/pipeline_test.py",
>  line 245, in check_memory
> 'High memory usage: %d > %d' % (memory_usage, memory_threshold))
> RuntimeError: High memory usage: 125566976 > 123487104 [while running 
> 'oom:check']
> https://builds.apache.org/view/Beam/job/beam_PostCommit_Python_Verify/1677/consoleFull
> https://builds.apache.org/view/Beam/job/beam_PostCommit_Python_Verify/1675/consoleFull





[GitHub] beam pull request #2384: [BEAM-1844] Increase the memory threshold for the d...

2017-03-30 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/beam/pull/2384




[2/2] beam git commit: This closes #2384

2017-03-30 Thread altay
This closes #2384


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/07d9bd5f
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/07d9bd5f
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/07d9bd5f

Branch: refs/heads/master
Commit: 07d9bd5f81a3029fc8332a9bb70508adfdd4067d
Parents: d4de137 065a545
Author: Ahmet Altay 
Authored: Thu Mar 30 17:45:39 2017 -0700
Committer: Ahmet Altay 
Committed: Thu Mar 30 17:45:39 2017 -0700

--
 sdks/python/apache_beam/pipeline_test.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
--




[1/2] beam git commit: Increase the memory threshold for the direct runner test

2017-03-30 Thread altay
Repository: beam
Updated Branches:
  refs/heads/master d4de13718 -> 07d9bd5f8


Increase the memory threshold for the direct runner test


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/065a545a
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/065a545a
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/065a545a

Branch: refs/heads/master
Commit: 065a545ae098c8df81769e4c88c5031deef4494a
Parents: d4de137
Author: Ahmet Altay 
Authored: Thu Mar 30 17:08:00 2017 -0700
Committer: Ahmet Altay 
Committed: Thu Mar 30 17:08:00 2017 -0700

--
 sdks/python/apache_beam/pipeline_test.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/065a545a/sdks/python/apache_beam/pipeline_test.py
--
diff --git a/sdks/python/apache_beam/pipeline_test.py 
b/sdks/python/apache_beam/pipeline_test.py
index d464fdb..ba219bf 100644
--- a/sdks/python/apache_beam/pipeline_test.py
+++ b/sdks/python/apache_beam/pipeline_test.py
@@ -253,7 +253,7 @@ class PipelineTest(unittest.TestCase):
 
 # Consumed memory should not be proportional to the number of maps.
 memory_threshold = (
-get_memory_usage_in_bytes() + (3 * len_elements * num_elements))
+get_memory_usage_in_bytes() + (5 * len_elements * num_elements))
 
 # Plus small additional slack for memory fluctuations during the test.
 memory_threshold += 10 * (2 ** 20)
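
The arithmetic of the change above, extracted into a hypothetical helper for illustration (the real test computes the threshold inline): the allowed usage is the current footprint plus a per-element allowance, raised here from 3x to 5x, plus a fixed 10 MiB of slack for fluctuations during the test.

```python
def memory_threshold(current_usage_bytes, len_elements, num_elements,
                     factor=5):
    # Consumed memory should not be proportional to the number of maps:
    # allow only factor * element size * element count on top of the
    # current usage...
    threshold = current_usage_bytes + factor * len_elements * num_elements
    # ...plus 10 MiB of slack for memory fluctuations during the test.
    threshold += 10 * (2 ** 20)
    return threshold
```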



[jira] [Resolved] (BEAM-1640) data file missing when submit a job on Flink

2017-03-30 Thread Xu Mingmin (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-1640?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Xu Mingmin resolved BEAM-1640.
--
   Resolution: Not A Problem
Fix Version/s: Not applicable

Not related to Beam;
the issue is caused by a {{System.properties}} conflict after migrating to Flink 1.2.0.


> data file missing when submit a job on Flink
> 
>
> Key: BEAM-1640
> URL: https://issues.apache.org/jira/browse/BEAM-1640
> Project: Beam
>  Issue Type: Bug
>  Components: runner-flink
>Affects Versions: 0.6.0
>Reporter: Xu Mingmin
>Assignee: Aljoscha Krettek
> Fix For: Not applicable
>
>
> I have one file with path 'META-INF/jaas/kafka_jaas.conf' in my jar package. It 
> works with Beam 0.5.0, but when I re-package it with 0.6.0-SNAPSHOT, it fails to 
> submit with the bin/flink command. Both run on YARN.
> The error is shown below; I guess this file may be lost in the Flink runner. 
> {code}
> Caused by: org.apache.kafka.common.KafkaException: Failed to construct kafka 
> consumer
>   at 
> org.apache.kafka.clients.consumer.KafkaConsumer.(KafkaConsumer.java:702)
>   at 
> org.apache.kafka.clients.consumer.KafkaConsumer.(KafkaConsumer.java:557)
>   at 
> org.apache.kafka.clients.consumer.KafkaConsumer.(KafkaConsumer.java:540)
>   at org.apache.beam.sdk.io.kafka.KafkaIO$Read$2.apply(KafkaIO.java:503)
>   at org.apache.beam.sdk.io.kafka.KafkaIO$Read$2.apply(KafkaIO.java:501)
>   at 
> org.apache.beam.sdk.io.kafka.KafkaIO$UnboundedKafkaSource.generateInitialSplits(KafkaIO.java:620)
>   at 
> org.apache.beam.runners.flink.translation.wrappers.streaming.io.UnboundedSourceWrapper.(UnboundedSourceWrapper.java:159)
>   at 
> org.apache.beam.runners.flink.FlinkStreamingTransformTranslators$UnboundedReadSourceTranslator.translateNode(FlinkStreamingTransformTranslators.java:267)
>   ... 33 more
> Caused by: org.apache.kafka.common.KafkaException: 
> java.lang.IllegalArgumentException: Could not find a 'KafkaClient' entry in 
> jaas config.
>   at 
> org.apache.kafka.common.network.SaslChannelBuilder.configure(SaslChannelBuilder.java:86)
>   at 
> org.apache.kafka.common.network.ChannelBuilders.create(ChannelBuilders.java:70)
>   at 
> org.apache.kafka.clients.ClientUtils.createChannelBuilder(ClientUtils.java:83)
>   at 
> org.apache.kafka.clients.consumer.KafkaConsumer.(KafkaConsumer.java:623)
>   ... 40 more
> Caused by: java.lang.IllegalArgumentException: Could not find a 'KafkaClient' 
> entry in jaas config.
>   at io.ebay.rheos.kafka.security.iaf.IAFLogin.login(IAFLogin.java:54)
>   at 
> org.apache.kafka.common.security.authenticator.LoginManager.(LoginManager.java:53)
>   at 
> org.apache.kafka.common.security.authenticator.LoginManager.acquireLoginManager(LoginManager.java:75)
>   at 
> org.apache.kafka.common.network.SaslChannelBuilder.configure(SaslChannelBuilder.java:78)
>   ... 43 more
> {code}





[jira] [Commented] (BEAM-1640) data file missing when submit a job on Flink

2017-03-30 Thread Xu Mingmin (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-1640?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15950102#comment-15950102
 ] 

Xu Mingmin commented on BEAM-1640:
--

Figured out that it's a {{System.properties}} conflict: in Flink 1.2.0 a 
new property, {{java.security.auth.login.config}}, is added when running with 
{{./flink-1.2.0/bin/flink run}}.
Will close this task.

> data file missing when submit a job on Flink
> 
>
> Key: BEAM-1640
> URL: https://issues.apache.org/jira/browse/BEAM-1640
> Project: Beam
>  Issue Type: Bug
>  Components: runner-flink
>Affects Versions: 0.6.0
>Reporter: Xu Mingmin
>Assignee: Aljoscha Krettek
>
> I have one file with path 'META-INF/jaas/kafka_jaas.conf' in my jar package. It 
> works with Beam 0.5.0, but when I re-package it with 0.6.0-SNAPSHOT, it fails to 
> submit with the bin/flink command. Both run on YARN.
> The error is shown below; I guess this file may be lost in the Flink runner. 
> {code}
> Caused by: org.apache.kafka.common.KafkaException: Failed to construct kafka 
> consumer
>   at 
> org.apache.kafka.clients.consumer.KafkaConsumer.(KafkaConsumer.java:702)
>   at 
> org.apache.kafka.clients.consumer.KafkaConsumer.(KafkaConsumer.java:557)
>   at 
> org.apache.kafka.clients.consumer.KafkaConsumer.(KafkaConsumer.java:540)
>   at org.apache.beam.sdk.io.kafka.KafkaIO$Read$2.apply(KafkaIO.java:503)
>   at org.apache.beam.sdk.io.kafka.KafkaIO$Read$2.apply(KafkaIO.java:501)
>   at 
> org.apache.beam.sdk.io.kafka.KafkaIO$UnboundedKafkaSource.generateInitialSplits(KafkaIO.java:620)
>   at 
> org.apache.beam.runners.flink.translation.wrappers.streaming.io.UnboundedSourceWrapper.(UnboundedSourceWrapper.java:159)
>   at 
> org.apache.beam.runners.flink.FlinkStreamingTransformTranslators$UnboundedReadSourceTranslator.translateNode(FlinkStreamingTransformTranslators.java:267)
>   ... 33 more
> Caused by: org.apache.kafka.common.KafkaException: 
> java.lang.IllegalArgumentException: Could not find a 'KafkaClient' entry in 
> jaas config.
>   at 
> org.apache.kafka.common.network.SaslChannelBuilder.configure(SaslChannelBuilder.java:86)
>   at 
> org.apache.kafka.common.network.ChannelBuilders.create(ChannelBuilders.java:70)
>   at 
> org.apache.kafka.clients.ClientUtils.createChannelBuilder(ClientUtils.java:83)
>   at 
> org.apache.kafka.clients.consumer.KafkaConsumer.(KafkaConsumer.java:623)
>   ... 40 more
> Caused by: java.lang.IllegalArgumentException: Could not find a 'KafkaClient' 
> entry in jaas config.
>   at io.ebay.rheos.kafka.security.iaf.IAFLogin.login(IAFLogin.java:54)
>   at 
> org.apache.kafka.common.security.authenticator.LoginManager.(LoginManager.java:53)
>   at 
> org.apache.kafka.common.security.authenticator.LoginManager.acquireLoginManager(LoginManager.java:75)
>   at 
> org.apache.kafka.common.network.SaslChannelBuilder.configure(SaslChannelBuilder.java:78)
>   ... 43 more
> {code}





Build failed in Jenkins: beam_PostCommit_Java_MavenInstall #3099

2017-03-30 Thread Apache Jenkins Server
See 


--
[...truncated 1.98 MB...]
  File 
"
 line 60, in process
return self.wrapper(self.dofn.process, args, kwargs)
  File 
"
 line 78, in wrapper
result = method(*args, **kwargs)
  File 
"
 line 719, in 
wrapper = lambda x, *args, **kwargs: [fn(x, *args, **kwargs)]
  File 
"
 line 245, in check_memory
'High memory usage: %d > %d' % (memory_usage, memory_threshold))
RuntimeError: High memory usage: 191967232 > 190788480 [while running 
'oom:check']
Traceback (most recent call last):
  File 
"
 line 297, in __call__
evaluator.process_element(value)
  File 
"
 line 355, in process_element
self.runner.process(element)
  File 
"
 line 267, in process
self.reraise_augmented(exn)
  File 
"
 line 265, in process
self._dofn_invoker(element)
  File 
"
 line 232, in _dofn_invoker
self._dofn_per_window_invoker(element)
  File 
"
 line 218, in _dofn_per_window_invoker
self._process_outputs(element, self.dofn_process(*args))
  File 
"
 line 60, in process
return self.wrapper(self.dofn.process, args, kwargs)
  File 
"
 line 78, in wrapper
result = method(*args, **kwargs)
  File 
"
 line 719, in 
wrapper = lambda x, *args, **kwargs: [fn(x, *args, **kwargs)]
  File 
"
 line 245, in check_memory
'High memory usage: %d > %d' % (memory_usage, memory_threshold))
RuntimeError: High memory usage: 191967232 > 190788480 [while running 
'oom:check']
root: WARNING: A task failed with exception.
 High memory usage: 191967232 > 190788480 [while running 'oom:check']
- >> end captured logging << -

--
Ran 1049 tests in 69.435s

FAILED (errors=1, skipped=13)
Test failed: 
error: Test failed: 
ERROR: InvocationError: 
'
 setup.py test'
___ summary 
  docs: commands succeeded
  lint: commands succeeded
  py27: commands succeeded
  py27cython: commands succeeded
ERROR:   py27gcp: commands failed
2017-03-31T00:33:48.873 [ERROR] Command execution failed.
org.apache.commons.exec.ExecuteException: Process exited with an error: 1 (Exit 
value: 1)
at 
org.apache.commons.exec.DefaultExecutor.executeInternal(DefaultExecutor.java:404)
at 
org.apache.commons.exec.DefaultExecutor.execute(DefaultExecutor.java:166)
at org.codehaus.mojo.exec.ExecMojo.executeCommandLine(ExecMojo.java:764)
at org.codehaus.mojo.exec.ExecMojo.executeCommandLine(ExecMojo.java:711)
at org.codehaus.mojo.exec.ExecMojo.execute(ExecMojo.java:289)
at 
org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:134)
at 
org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:208)
at 
org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
at 
org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
at 
org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:116)
at 

[GitHub] beam pull request #2385: Add request number of splits to display data in Dat...

2017-03-30 Thread vikkyrk
GitHub user vikkyrk opened a pull request:

https://github.com/apache/beam/pull/2385

Add request number of splits to display data in DatastoreIO

Be sure to do all of the following to help us incorporate your contribution
quickly and easily:

 - [ ] Make sure the PR title is formatted like:
   `[BEAM-] Description of pull request`
 - [ ] Make sure tests pass via `mvn clean verify`. (Even better, enable
   Travis-CI on your fork and ensure the whole test matrix passes).
 - [ ] Replace `` in the title with the actual Jira issue
   number, if there is one.
 - [ ] If this contribution is large, please file an Apache
   [Individual Contributor License 
Agreement](https://www.apache.org/licenses/icla.txt).

---


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/vikkyrk/incubator-beam ds_data

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/2385.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2385


commit c14bf8dbf083de0926adfe76db632756f88afdf1
Author: Vikas Kedigehalli 
Date:   2017-03-31T00:31:18Z

Add request number of splits to display data in DatastoreIO






Jenkins build is still unstable: beam_PostCommit_Java_ValidatesRunner_Spark #1447

2017-03-30 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PerformanceTests_JDBC #56

2017-03-30 Thread Apache Jenkins Server
See 


Changes:

[kirpichov] Fix NPE in Kafka value writer.

[kirpichov] Improves logging in FileBasedSource splitting

[altay] [BEAM-1795] Upgrade google-cloud-bigquery to 0.23.0

--
[...truncated 294.59 KB...]
at 
org.apache.maven.surefire.junitcore.pc.Scheduler$1.run(Scheduler.java:393)
at 
java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)


Results :

Failed tests: 
  JdbcIOIT.testRead:142 (58cabce072ab8a4d): java.lang.RuntimeException: 
org.apache.beam.sdk.util.UserCodeException: org.postgresql.util.PSQLException: 
The connection attempt failed.
at 
com.google.cloud.dataflow.worker.runners.worker.MapTaskExecutorFactory$3.typedApply(MapTaskExecutorFactory.java:289)
at 
com.google.cloud.dataflow.worker.runners.worker.MapTaskExecutorFactory$3.typedApply(MapTaskExecutorFactory.java:261)
at 
com.google.cloud.dataflow.worker.graph.Networks$TypeSafeNodeFunction.apply(Networks.java:55)
at 
com.google.cloud.dataflow.worker.graph.Networks$TypeSafeNodeFunction.apply(Networks.java:43)
at 
com.google.cloud.dataflow.worker.graph.Networks.replaceDirectedNetworkNodes(Networks.java:78)
at 
com.google.cloud.dataflow.worker.runners.worker.MapTaskExecutorFactory.create(MapTaskExecutorFactory.java:152)
at 
com.google.cloud.dataflow.worker.runners.worker.DataflowWorker.doWork(DataflowWorker.java:271)
at 
com.google.cloud.dataflow.worker.runners.worker.DataflowWorker.getAndPerformWork(DataflowWorker.java:243)
at 
com.google.cloud.dataflow.worker.runners.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:124)
at 
com.google.cloud.dataflow.worker.runners.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:104)
at 
com.google.cloud.dataflow.worker.runners.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:91)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.beam.sdk.util.UserCodeException: 
org.postgresql.util.PSQLException: The connection attempt failed.
at 
org.apache.beam.sdk.util.UserCodeException.wrap(UserCodeException.java:36)
at 
org.apache.beam.sdk.io.jdbc.JdbcIO$Read$ReadFn$auxiliary$INbuNj1W.invokeSetup(Unknown
 Source)
at 
com.google.cloud.dataflow.worker.runners.worker.DoFnInstanceManagers$ConcurrentQueueInstanceManager.deserializeCopy(DoFnInstanceManagers.java:66)
at 
com.google.cloud.dataflow.worker.runners.worker.DoFnInstanceManagers$ConcurrentQueueInstanceManager.peek(DoFnInstanceManagers.java:48)
at 
com.google.cloud.dataflow.worker.runners.worker.UserParDoFnFactory.create(UserParDoFnFactory.java:99)
at 
com.google.cloud.dataflow.worker.runners.worker.DefaultParDoFnFactory.create(DefaultParDoFnFactory.java:70)
at 
com.google.cloud.dataflow.worker.runners.worker.MapTaskExecutorFactory.createParDoOperation(MapTaskExecutorFactory.java:363)
at 
com.google.cloud.dataflow.worker.runners.worker.MapTaskExecutorFactory$3.typedApply(MapTaskExecutorFactory.java:278)
... 14 more
Caused by: org.postgresql.util.PSQLException: The connection attempt failed.
at 
org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:272)
at 
org.postgresql.core.ConnectionFactory.openConnection(ConnectionFactory.java:51)
at org.postgresql.jdbc.PgConnection.(PgConnection.java:215)
at org.postgresql.Driver.makeConnection(Driver.java:404)
at org.postgresql.Driver.connect(Driver.java:272)
at java.sql.DriverManager.getConnection(DriverManager.java:664)
at java.sql.DriverManager.getConnection(DriverManager.java:247)
at 
org.postgresql.ds.common.BaseDataSource.getConnection(BaseDataSource.java:86)
at 
org.postgresql.ds.common.BaseDataSource.getConnection(BaseDataSource.java:71)
at 
org.apache.beam.sdk.io.jdbc.JdbcIO$DataSourceConfiguration.getConnection(JdbcIO.java:234)
at org.apache.beam.sdk.io.jdbc.JdbcIO$Read$ReadFn.setup(JdbcIO.java:358)
Caused by: java.net.SocketTimeoutException: connect timed out
at java.net.PlainSocketImpl.socketConnect(Native Method)
at 

[jira] [Commented] (BEAM-1844) test_memory_usage fails in post commit

2017-03-30 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-1844?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15950093#comment-15950093
 ] 

ASF GitHub Bot commented on BEAM-1844:
--

GitHub user aaltay opened a pull request:

https://github.com/apache/beam/pull/2384

[BEAM-1844] Increase the memory threshold for the direct runner test

It seems that over time the memory usage has increased; this change prevents 
test failures.

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/aaltay/beam oom

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/2384.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2384


commit 065a545ae098c8df81769e4c88c5031deef4494a
Author: Ahmet Altay 
Date:   2017-03-31T00:08:00Z

Increase the memory threshold for the direct runner test




> test_memory_usage fails in post commit
> --
>
> Key: BEAM-1844
> URL: https://issues.apache.org/jira/browse/BEAM-1844
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-py
>Reporter: Ahmet Altay
>Assignee: Ahmet Altay
>
> ...
>   File 
> "/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python_Verify/sdks/python/apache_beam/typehints/typecheck.py",
>  line 78, in wrapper
> result = method(*args, **kwargs)
>   File 
> "/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python_Verify/sdks/python/apache_beam/transforms/core.py",
>  line 719, in <lambda>
> wrapper = lambda x, *args, **kwargs: [fn(x, *args, **kwargs)]
>   File 
> "/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python_Verify/sdks/python/apache_beam/pipeline_test.py",
>  line 245, in check_memory
> 'High memory usage: %d > %d' % (memory_usage, memory_threshold))
> RuntimeError: High memory usage: 125566976 > 123487104 [while running 
> 'oom:check']
> https://builds.apache.org/view/Beam/job/beam_PostCommit_Python_Verify/1677/consoleFull
> https://builds.apache.org/view/Beam/job/beam_PostCommit_Python_Verify/1675/consoleFull
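
The quoted failure comes from a memory-threshold assertion in the pipeline
test. As a rough illustration only (a hypothetical reconstruction from the
traceback, not Beam's actual pipeline_test.py; the `resource`-based
measurement is an assumption), such a check might look like:

```python
# Hypothetical sketch of a memory-threshold check like the one in the
# traceback above; not Beam's actual pipeline_test.py code.
import resource


def check_memory(element, memory_threshold):
    # ru_maxrss is reported in kilobytes on Linux; convert to bytes so it
    # is comparable to a byte threshold like 123487104.
    memory_usage = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss * 1024
    if memory_usage > memory_threshold:
        raise RuntimeError(
            'High memory usage: %d > %d' % (memory_usage, memory_threshold))
    return element
```

Raising the threshold, as this PR does, simply widens the allowed
`memory_threshold` in such a check.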



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[GitHub] beam pull request #2384: [BEAM-1844] Increase the memory threshold for the d...

2017-03-30 Thread aaltay
GitHub user aaltay opened a pull request:

https://github.com/apache/beam/pull/2384

[BEAM-1844] Increase the memory threshold for the direct runner test

It seems that over time the memory usage has increased; this change prevents 
test failures.

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/aaltay/beam oom

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/2384.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2384


commit 065a545ae098c8df81769e4c88c5031deef4494a
Author: Ahmet Altay 
Date:   2017-03-31T00:08:00Z

Increase the memory threshold for the direct runner test




---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


Build failed in Jenkins: beam_PerformanceTests_Dataflow #250

2017-03-30 Thread Apache Jenkins Server
See 


Changes:

[kirpichov] Fix NPE in Kafka value writer.

[kirpichov] Improves logging in FileBasedSource splitting

[altay] [BEAM-1795] Upgrade google-cloud-bigquery to 0.23.0

--
[...truncated 272.87 KB...]
 * [new ref] refs/pull/928/head -> origin/pr/928/head
 * [new ref] refs/pull/929/head -> origin/pr/929/head
 * [new ref] refs/pull/93/head -> origin/pr/93/head
 * [new ref] refs/pull/930/head -> origin/pr/930/head
 * [new ref] refs/pull/930/merge -> origin/pr/930/merge
 * [new ref] refs/pull/931/head -> origin/pr/931/head
 * [new ref] refs/pull/931/merge -> origin/pr/931/merge
 * [new ref] refs/pull/932/head -> origin/pr/932/head
 * [new ref] refs/pull/932/merge -> origin/pr/932/merge
 * [new ref] refs/pull/933/head -> origin/pr/933/head
 * [new ref] refs/pull/933/merge -> origin/pr/933/merge
 * [new ref] refs/pull/934/head -> origin/pr/934/head
 * [new ref] refs/pull/934/merge -> origin/pr/934/merge
 * [new ref] refs/pull/935/head -> origin/pr/935/head
 * [new ref] refs/pull/936/head -> origin/pr/936/head
 * [new ref] refs/pull/936/merge -> origin/pr/936/merge
 * [new ref] refs/pull/937/head -> origin/pr/937/head
 * [new ref] refs/pull/937/merge -> origin/pr/937/merge
 * [new ref] refs/pull/938/head -> origin/pr/938/head
 * [new ref] refs/pull/939/head -> origin/pr/939/head
 * [new ref] refs/pull/94/head -> origin/pr/94/head
 * [new ref] refs/pull/940/head -> origin/pr/940/head
 * [new ref] refs/pull/940/merge -> origin/pr/940/merge
 * [new ref] refs/pull/941/head -> origin/pr/941/head
 * [new ref] refs/pull/941/merge -> origin/pr/941/merge
 * [new ref] refs/pull/942/head -> origin/pr/942/head
 * [new ref] refs/pull/942/merge -> origin/pr/942/merge
 * [new ref] refs/pull/943/head -> origin/pr/943/head
 * [new ref] refs/pull/943/merge -> origin/pr/943/merge
 * [new ref] refs/pull/944/head -> origin/pr/944/head
 * [new ref] refs/pull/945/head -> origin/pr/945/head
 * [new ref] refs/pull/945/merge -> origin/pr/945/merge
 * [new ref] refs/pull/946/head -> origin/pr/946/head
 * [new ref] refs/pull/946/merge -> origin/pr/946/merge
 * [new ref] refs/pull/947/head -> origin/pr/947/head
 * [new ref] refs/pull/947/merge -> origin/pr/947/merge
 * [new ref] refs/pull/948/head -> origin/pr/948/head
 * [new ref] refs/pull/948/merge -> origin/pr/948/merge
 * [new ref] refs/pull/949/head -> origin/pr/949/head
 * [new ref] refs/pull/949/merge -> origin/pr/949/merge
 * [new ref] refs/pull/95/head -> origin/pr/95/head
 * [new ref] refs/pull/95/merge -> origin/pr/95/merge
 * [new ref] refs/pull/950/head -> origin/pr/950/head
 * [new ref] refs/pull/951/head -> origin/pr/951/head
 * [new ref] refs/pull/951/merge -> origin/pr/951/merge
 * [new ref] refs/pull/952/head -> origin/pr/952/head
 * [new ref] refs/pull/952/merge -> origin/pr/952/merge
 * [new ref] refs/pull/953/head -> origin/pr/953/head
 * [new ref] refs/pull/954/head -> origin/pr/954/head
 * [new ref] refs/pull/954/merge -> origin/pr/954/merge
 * [new ref] refs/pull/955/head -> origin/pr/955/head
 * [new ref] refs/pull/955/merge -> origin/pr/955/merge
 * [new ref] refs/pull/956/head -> origin/pr/956/head
 * [new ref] refs/pull/957/head -> origin/pr/957/head
 * [new ref] refs/pull/958/head -> origin/pr/958/head
 * [new ref] refs/pull/959/head -> origin/pr/959/head
 * [new ref] refs/pull/959/merge -> origin/pr/959/merge
 * [new ref] refs/pull/96/head -> origin/pr/96/head
 * [new ref] refs/pull/96/merge -> origin/pr/96/merge
 * [new ref] refs/pull/960/head -> origin/pr/960/head
 * [new ref] refs/pull/960/merge -> origin/pr/960/merge
 * [new ref] refs/pull/961/head -> origin/pr/961/head
 * [new ref] refs/pull/962/head -> origin/pr/962/head
 * [new ref] refs/pull/962/merge -> origin/pr/962/merge
 * [new ref] refs/pull/963/head -> origin/pr/963/head
 * [new ref] refs/pull/963/merge -> origin/pr/963/merge
 * [new ref] refs/pull/964/head -> origin/pr/964/head
 * [new ref] refs/pull/965/head -> origin/pr/965/head
 * [new ref] refs/pull/965/merge -> origin/pr/965/merge
 * [new ref] refs/pull/966/head -> origin/pr/966/head
 * [new ref] refs/pull/967/head -> origin/pr/967/head
 * [new ref] refs/pull/967/merge -> origin/pr/967/merge
 * [new ref] refs/pull/968/head -> origin/pr/968/head
 * [new ref] refs/pull/968/merge -> origin/pr/968/merge
 * [new ref] 

[jira] [Created] (BEAM-1843) 'PDone' object has no attribute 'to_runner_api'

2017-03-30 Thread Ahmet Altay (JIRA)
Ahmet Altay created BEAM-1843:
-

 Summary: 'PDone' object has no attribute 'to_runner_api'
 Key: BEAM-1843
 URL: https://issues.apache.org/jira/browse/BEAM-1843
 Project: Beam
  Issue Type: Bug
  Components: sdk-py
Reporter: Ahmet Altay
Assignee: Robert Bradshaw


Post commit failure with 
(https://builds.apache.org/view/Beam/job/beam_PostCommit_Python_Verify/1676/consoleFull):

...
  File 
"/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python_Verify/sdks/python/apache_beam/pipeline.py",
 line 512, in <dictcomp>
for tag, out in self.outputs.items()},
  File 
"/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python_Verify/sdks/python/apache_beam/runners/pipeline_context.py",
 line 52, in get_id
self._id_to_proto[id] = obj.to_runner_api(self._pipeline_context)
AttributeError: 'PDone' object has no attribute 'to_runner_api'
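
For context, the failing `get_id` caches a serialized proto per object and
calls `to_runner_api` on anything it has not seen. A minimal sketch of that
caching pattern, with a guard that names the offending type, is below; the
class and attribute names mirror the traceback, not Beam's actual
pipeline_context.py.

```python
# Hypothetical sketch of the get_id caching pattern from the traceback;
# not Beam's actual runners/pipeline_context.py.
class PipelineContext(object):
    def __init__(self):
        self._id_to_proto = {}

    def get_id(self, obj):
        obj_id = id(obj)
        if obj_id not in self._id_to_proto:
            if not hasattr(obj, 'to_runner_api'):
                # A PDone reaching this point triggers exactly the
                # AttributeError seen above; failing early names the type.
                raise TypeError(
                    '%s has no to_runner_api; cannot serialize it'
                    % type(obj).__name__)
            self._id_to_proto[obj_id] = obj.to_runner_api(self)
        return obj_id
```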





[jira] [Commented] (BEAM-1843) 'PDone' object has no attribute 'to_runner_api'

2017-03-30 Thread Ahmet Altay (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-1843?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15950085#comment-15950085
 ] 

Ahmet Altay commented on BEAM-1843:
---

Two more instances:
https://builds.apache.org/view/Beam/job/beam_PostCommit_Python_Verify/1678/consoleFull
https://builds.apache.org/view/Beam/job/beam_PostCommit_Python_Verify/1679/consoleFull

(It seems to affect all post commits; some of them fail earlier due to 
BEAM-1844.) 

> 'PDone' object has no attribute 'to_runner_api'
> ---
>
> Key: BEAM-1843
> URL: https://issues.apache.org/jira/browse/BEAM-1843
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-py
>Reporter: Ahmet Altay
>Assignee: Robert Bradshaw
>
> Post commit failure with 
> (https://builds.apache.org/view/Beam/job/beam_PostCommit_Python_Verify/1676/consoleFull):
> ...
>   File 
> "/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python_Verify/sdks/python/apache_beam/pipeline.py",
>  line 512, in <dictcomp>
> for tag, out in self.outputs.items()},
>   File 
> "/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python_Verify/sdks/python/apache_beam/runners/pipeline_context.py",
>  line 52, in get_id
> self._id_to_proto[id] = obj.to_runner_api(self._pipeline_context)
> AttributeError: 'PDone' object has no attribute 'to_runner_api'





Jenkins build is back to normal : beam_PostCommit_Java_MavenInstall #3098

2017-03-30 Thread Apache Jenkins Server
See 




Jenkins build is back to normal : beam_PostCommit_Java_ValidatesRunner_Flink #2126

2017-03-30 Thread Apache Jenkins Server
See 




[jira] [Comment Edited] (BEAM-1823) TimedOutException in postcommit

2017-03-30 Thread Ahmet Altay (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-1823?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15950076#comment-15950076
 ] 

Ahmet Altay edited comment on BEAM-1823 at 3/30/17 11:50 PM:
-

Thank you Mark.

I think a 10-minute timeout makes sense. BEAM-1728 will help a lot (so that we 
can manually verify that the job actually passed). We should also include in 
the error message how long the test waited before timing out.

There were other instances today 
(https://builds.apache.org/view/Beam/job/beam_PostCommit_Python_Verify/1673/consoleFull,
https://builds.apache.org/view/Beam/job/beam_PostCommit_Python_Verify/1674/consoleFull)
and they are hard to debug without the above information.


was (Author: altay):
Thank you Mark.

I think 10 minutes timeout makes sense. BEAM-1728 will help a lot (so that we 
can verify manually that job actually passed) and also include in the error 
message that for how long the test waited for timeout.

There was another instance today 
(https://builds.apache.org/view/Beam/job/beam_PostCommit_Python_Verify/1673/consoleFull)
 and it is hard to debug without the above information.

> TimedOutException in postcommit
> ---
>
> Key: BEAM-1823
> URL: https://issues.apache.org/jira/browse/BEAM-1823
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-py
>Reporter: Ahmet Altay
>Assignee: Mark Liu
>
> Mark, do you know what this error means? Where is the timeout configured?
> https://builds.apache.org/view/Beam/job/beam_PostCommit_Python_Verify/1657/console
> I _think_ this is one of the underlying Dataflow executions and it completed 
> (although much slower than usual):
> https://pantheon.corp.google.com/dataflow/job/2017-03-28_14_25_21-13472017589125356257?project=apache-beam-testing=433637338589
> It makes sense to time out the test but I want to know how it is configured. 
> Also, is it possible to print out output logs for failed/timed out tests so 
> that we can clearly associate tests with job executions.





[jira] [Created] (BEAM-1844) test_memory_usage fails in post commit

2017-03-30 Thread Ahmet Altay (JIRA)
Ahmet Altay created BEAM-1844:
-

 Summary: test_memory_usage fails in post commit
 Key: BEAM-1844
 URL: https://issues.apache.org/jira/browse/BEAM-1844
 Project: Beam
  Issue Type: Bug
  Components: sdk-py
Reporter: Ahmet Altay
Assignee: Ahmet Altay


...
  File 
"/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python_Verify/sdks/python/apache_beam/typehints/typecheck.py",
 line 78, in wrapper
result = method(*args, **kwargs)
  File 
"/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python_Verify/sdks/python/apache_beam/transforms/core.py",
 line 719, in <lambda>
wrapper = lambda x, *args, **kwargs: [fn(x, *args, **kwargs)]
  File 
"/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python_Verify/sdks/python/apache_beam/pipeline_test.py",
 line 245, in check_memory
'High memory usage: %d > %d' % (memory_usage, memory_threshold))
RuntimeError: High memory usage: 125566976 > 123487104 [while running 
'oom:check']

https://builds.apache.org/view/Beam/job/beam_PostCommit_Python_Verify/1677/consoleFull
https://builds.apache.org/view/Beam/job/beam_PostCommit_Python_Verify/1675/consoleFull





[GitHub] beam-site pull request #197: Transfer content from Create Your Pipeline to P...

2017-03-30 Thread hadarhg
GitHub user hadarhg opened a pull request:

https://github.com/apache/beam-site/pull/197

Transfer content from Create Your Pipeline to Programming Guide.

Transfer content from Create Your Pipeline to Programming Guide. 
Delete Create Your Pipeline and remove from top menu.

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/hadarhg/incubator-beam-site 
create-your-pipeline

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam-site/pull/197.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #197


commit 0a395c20d1e81338f5c91ef18ebe85781acfcdde
Author: Hadar Hod 
Date:   2017-03-30T23:48:26Z

Transfer content from Create Your Pipeline to Programming Guide. Delete 
Create Your Pipeline and remove from top menu.






[jira] [Commented] (BEAM-1823) TimedOutException in postcommit

2017-03-30 Thread Mark Liu (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-1823?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15950083#comment-15950083
 ] 

Mark Liu commented on BEAM-1823:


If the pipeline failed, I think the full console log will be printed. But a 
timeout is a different case, which the nose framework does not seem to handle. 
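
A test-level timeout that also reports how long it waited (the detail
requested elsewhere in this thread) can be sketched with a SIGALRM-based
decorator. `with_timeout` is a hypothetical name, not nose's or Beam's actual
mechanism, and this approach only works in the main thread on Unix.

```python
# Hypothetical sketch of a signal-based test timeout that reports the
# waited duration; illustrative only, not the nose plugin's implementation.
import signal


def with_timeout(seconds):
    def decorator(fn):
        def wrapper(*args, **kwargs):
            def handler(signum, frame):
                raise RuntimeError(
                    'test timed out after %d seconds' % seconds)
            old_handler = signal.signal(signal.SIGALRM, handler)
            signal.alarm(seconds)
            try:
                return fn(*args, **kwargs)
            finally:
                signal.alarm(0)  # cancel any pending alarm
                signal.signal(signal.SIGALRM, old_handler)
        return wrapper
    return decorator
```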

> TimedOutException in postcommit
> ---
>
> Key: BEAM-1823
> URL: https://issues.apache.org/jira/browse/BEAM-1823
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-py
>Reporter: Ahmet Altay
>Assignee: Mark Liu
>
> Mark, do you know what this error means? Where is the timeout configured?
> https://builds.apache.org/view/Beam/job/beam_PostCommit_Python_Verify/1657/console
> I _think_ this is one of the underlying Dataflow executions and it completed 
> (although much slower than usual):
> https://pantheon.corp.google.com/dataflow/job/2017-03-28_14_25_21-13472017589125356257?project=apache-beam-testing=433637338589
> It makes sense to time out the test but I want to know how it is configured. 
> Also, is it possible to print out output logs for failed/timed out tests so 
> that we can clearly associate tests with job executions.





Build failed in Jenkins: beam_PostCommit_Python_Verify #1679

2017-03-30 Thread Apache Jenkins Server
See 


Changes:

[altay] [BEAM-1795] Upgrade google-cloud-bigquery to 0.23.0

--
[...truncated 657.63 KB...]
test_undeclared_side_outputs 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
:132:
 UserWarning: Using fallback coder for typehint: List[int].
  warnings.warn('Using fallback coder for typehint: %r.' % typehint)
:132:
 UserWarning: Using fallback coder for typehint: Union[].
  warnings.warn('Using fallback coder for typehint: %r.' % typehint)
DEPRECATION: pip install --download has been deprecated and will be removed in 
the future. Pip now has a download command that should be used instead.
Collecting pyhamcrest (from -r postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/PyHamcrest-1.9.0.tar.gz
Collecting mock (from -r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/mock-2.0.0.tar.gz
Collecting setuptools (from pyhamcrest->-r postcommit_requirements.txt (line 1))
test_empty_side_outputs (apache_beam.transforms.ptransform_test.PTransformTest) 
... ok
:132:
 UserWarning: Using fallback coder for typehint: List[int].
  warnings.warn('Using fallback coder for typehint: %r.' % typehint)
:132:
 UserWarning: Using fallback coder for typehint: Union[].
  warnings.warn('Using fallback coder for typehint: %r.' % typehint)
  File was already downloaded 
/tmp/dataflow-requirements-cache/setuptools-34.3.3.zip
Collecting six (from pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/six-1.10.0.tar.gz
DEPRECATION: pip install --download has been deprecated and will be removed in 
the future. Pip now has a download command that should be used instead.
Collecting pyhamcrest (from -r postcommit_requirements.txt (line 1))
Collecting funcsigs>=1 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded 
/tmp/dataflow-requirements-cache/funcsigs-1.0.2.tar.gz
  File was already downloaded 
/tmp/dataflow-requirements-cache/PyHamcrest-1.9.0.tar.gz
Collecting pbr>=0.11 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/pbr-2.0.0.tar.gz
Collecting mock (from -r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/mock-2.0.0.tar.gz
Collecting packaging>=16.8 (from setuptools->pyhamcrest->-r 
postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/packaging-16.8.tar.gz
Collecting setuptools (from pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/setuptools-34.3.3.zip
Collecting appdirs>=1.4.0 (from setuptools->pyhamcrest->-r 
postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/appdirs-1.4.3.tar.gz
Collecting pyparsing (from packaging>=16.8->setuptools->pyhamcrest->-r 
postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/pyparsing-2.2.0.tar.gz
Collecting six (from pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/six-1.10.0.tar.gz
Collecting funcsigs>=1 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded 
/tmp/dataflow-requirements-cache/funcsigs-1.0.2.tar.gz
Successfully downloaded pyhamcrest mock setuptools six funcsigs pbr packaging 
appdirs pyparsing
Collecting pbr>=0.11 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/pbr-2.0.0.tar.gz
Collecting packaging>=16.8 (from setuptools->pyhamcrest->-r 
postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/packaging-16.8.tar.gz
Collecting appdirs>=1.4.0 (from setuptools->pyhamcrest->-r 
postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/appdirs-1.4.3.tar.gz
Collecting pyparsing (from packaging>=16.8->setuptools->pyhamcrest->-r 
postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/pyparsing-2.2.0.tar.gz
Successfully downloaded pyhamcrest mock setuptools six funcsigs pbr packaging 
appdirs pyparsing
test_as_dict_with_unique_labels 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... 

[jira] [Commented] (BEAM-1823) TimedOutException in postcommit

2017-03-30 Thread Ahmet Altay (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-1823?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15950078#comment-15950078
 ] 

Ahmet Altay commented on BEAM-1823:
---

Before BEAM-1728 is there a way to find out the job ids from the logs?


> TimedOutException in postcommit
> ---
>
> Key: BEAM-1823
> URL: https://issues.apache.org/jira/browse/BEAM-1823
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-py
>Reporter: Ahmet Altay
>Assignee: Mark Liu
>
> Mark, do you know what this error means? Where is the timeout configured?
> https://builds.apache.org/view/Beam/job/beam_PostCommit_Python_Verify/1657/console
> I _think_ this is one of the underlying Dataflow executions and it completed 
> (although much slower than usual):
> https://pantheon.corp.google.com/dataflow/job/2017-03-28_14_25_21-13472017589125356257?project=apache-beam-testing=433637338589
> It makes sense to time out the test but I want to know how it is configured. 
> Also, is it possible to print out output logs for failed/timed out tests so 
> that we can clearly associate tests with job executions.





[jira] [Commented] (BEAM-1823) TimedOutException in postcommit

2017-03-30 Thread Ahmet Altay (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-1823?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15950076#comment-15950076
 ] 

Ahmet Altay commented on BEAM-1823:
---

Thank you Mark.

I think a 10-minute timeout makes sense. BEAM-1728 will help a lot (so that we 
can manually verify that the job actually passed). We should also include in 
the error message how long the test waited before timing out.

There was another instance today 
(https://builds.apache.org/view/Beam/job/beam_PostCommit_Python_Verify/1673/consoleFull)
 and it is hard to debug without the above information.

> TimedOutException in postcommit
> ---
>
> Key: BEAM-1823
> URL: https://issues.apache.org/jira/browse/BEAM-1823
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-py
>Reporter: Ahmet Altay
>Assignee: Mark Liu
>
> Mark, do you know what this error means? Where is the timeout configured?
> https://builds.apache.org/view/Beam/job/beam_PostCommit_Python_Verify/1657/console
> I _think_ this is one of the underlying Dataflow executions and it completed 
> (although much slower than usual):
> https://pantheon.corp.google.com/dataflow/job/2017-03-28_14_25_21-13472017589125356257?project=apache-beam-testing=433637338589
> It makes sense to time out the test but I want to know how it is configured. 
> Also, is it possible to print out output logs for failed/timed out tests so 
> that we can clearly associate tests with job executions.





[jira] [Closed] (BEAM-1795) Upgrade google-cloud-bigquery to 0.23.0

2017-03-30 Thread Ahmet Altay (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-1795?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ahmet Altay closed BEAM-1795.
-
   Resolution: Fixed
Fix Version/s: First stable release

> Upgrade google-cloud-bigquery to 0.23.0
> ---
>
> Key: BEAM-1795
> URL: https://issues.apache.org/jira/browse/BEAM-1795
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-py
>Reporter: Ahmet Altay
>Assignee: Mark Liu
> Fix For: First stable release
>
>
> Should we upgrade this?
> https://pypi.python.org/pypi/google-cloud-bigquery/0.23.0





Jenkins build is back to normal : beam_PostCommit_Java_ValidatesRunner_Dataflow #2694

2017-03-30 Thread Apache Jenkins Server
See 




Jenkins build became unstable: beam_PostCommit_Java_ValidatesRunner_Spark #1446

2017-03-30 Thread Apache Jenkins Server
See 




[GitHub] beam pull request #2383: Add ValueProvider class for FileBasedSource I/O Tra...

2017-03-30 Thread mariapython
GitHub user mariapython opened a pull request:

https://github.com/apache/beam/pull/2383

Add ValueProvider class for FileBasedSource I/O Transforms

- [x] Add ValueProvider class.
- [x] Derive StaticValueProvider and RuntimeValueProvider from 
ValueProvider.
- [x] Derive ValueProviderArgumentParser from argparse.ArgumentParser as 
API for the template user.
- [x] Modify FileBasedSource I/O transforms to accept objects of type 
ValueProvider.
- [x] Modify display_data.
- [x] Handle serialization / deserialization.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/mariapython/incubator-beam ppp_final

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/2383.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2383


commit c08cbbb5bb8b08db948972e24fdebc6fdea298ec
Author: Maria Garcia Herrero 
Date:   2017-03-30T21:21:15Z

Add ValueProvider class for FileBasedSource I/O Transforms

Incorporate a BeamArgumentParser (argparse.ArgumentParser + ValueProviders).
Add StaticValueProvider and RuntimeValueProvider derived from ValueProvider.
Add serialization for ValueProvider objects.
Add testing for ValueProvider objects.
Modify FileBasedSource and FileSink to accept ValueProvider objects.
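
The hierarchy in this commit can be approximated with a minimal sketch (an
illustration of the pattern described, not the code in the PR): a static
provider resolves immediately, while a runtime provider defers until its
value is supplied at execution time.

```python
# Hypothetical minimal sketch of the ValueProvider pattern described in
# this PR; not the actual Beam implementation.
class ValueProvider(object):
    def is_accessible(self):
        raise NotImplementedError

    def get(self):
        raise NotImplementedError


class StaticValueProvider(ValueProvider):
    """Wraps a value known at pipeline construction time."""

    def __init__(self, value_type, value):
        self.value = value_type(value)

    def is_accessible(self):
        return True

    def get(self):
        return self.value


class RuntimeValueProvider(ValueProvider):
    """Defers resolution until the runner supplies the option value."""

    def __init__(self, option_name):
        self.option_name = option_name
        self._value = None
        self._is_set = False

    def set(self, value):
        self._value = value
        self._is_set = True

    def is_accessible(self):
        return self._is_set

    def get(self):
        if not self._is_set:
            raise RuntimeError(
                'Value for %s is not available at construction time'
                % self.option_name)
        return self._value
```

A `StaticValueProvider(int, '5')` returns its value immediately, while a
`RuntimeValueProvider` raises until `set()` is called, which is what lets
FileBasedSource accept template parameters resolved only at run time.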






[jira] [Commented] (BEAM-1441) Add an IOChannelFactory interface to Python SDK

2017-03-30 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-1441?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15950044#comment-15950044
 ] 

ASF GitHub Bot commented on BEAM-1441:
--

GitHub user sb2nov opened a pull request:

https://github.com/apache/beam/pull/2382

[BEAM-1441] Fix size check for windows and improve error message

Be sure to do all of the following to help us incorporate your contribution
quickly and easily:

 - [ ] Make sure the PR title is formatted like:
   `[BEAM-<Jira issue #>] Description of pull request`
 - [ ] Make sure tests pass via `mvn clean verify`. (Even better, enable
   Travis-CI on your fork and ensure the whole test matrix passes).
 - [ ] Replace `<Jira issue #>` in the title with the actual Jira issue
   number, if there is one.
 - [ ] If this contribution is large, please file an Apache
   [Individual Contributor License 
Agreement](https://www.apache.org/licenses/icla.txt).

---

R: @aaltay PTAL
cc: @charlesccychen 

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/sb2nov/beam BEAM-1441-fix-windows-size-check

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/2382.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2382


commit 2e41e7ddeeef502eb1b2ee2f19be4838d79edd03
Author: Sourabh Bajaj 
Date:   2017-03-30T23:07:13Z

[BEAM-1441] Fix size check for windows and improve error message




> Add an IOChannelFactory interface to Python SDK
> ---
>
> Key: BEAM-1441
> URL: https://issues.apache.org/jira/browse/BEAM-1441
> Project: Beam
>  Issue Type: New Feature
>  Components: sdk-py
>Reporter: Chamikara Jayalath
>Assignee: Sourabh Bajaj
> Fix For: First stable release
>
>
> Based on proposal [1], an IOChannelFactory interface was added to Java SDK  
> [2].
> We should add a similar interface to Python SDK and provide proper 
> implementations for native files, GCS, and other useful formats.
> Python SDK currently has a basic ChannelFactory interface [3] which is used 
> by FileBasedSource [4].
> [1] 
> https://docs.google.com/document/d/11TdPyZ9_zmjokhNWM3Id-XJsVG3qel2lhdKTknmZ_7M/edit#heading=h.kpqagzh8i11w
> [2] 
> https://github.com/apache/beam/blob/master/sdks/java/core/src/main/java/org/apache/beam/sdk/util/IOChannelFactory.java
> [3] 
> https://github.com/apache/beam/blob/master/sdks/python/apache_beam/io/fileio.py#L107
> [4] 
> https://github.com/apache/beam/blob/master/sdks/python/apache_beam/io/filebasedsource.py
>  





Build failed in Jenkins: beam_PostCommit_Java_MavenInstall #3096

2017-03-30 Thread Apache Jenkins Server
See 


Changes:

[kirpichov] Improves logging in FileBasedSource splitting

--
[...truncated 1.96 MB...]
test_transform_no_super_init (apache_beam.pipeline_test.PipelineTest) ... ok
test_visit_entire_graph (apache_beam.pipeline_test.PipelineTest) ... ok
test_simple (apache_beam.pipeline_test.RunnerApiTest)
Tests serializing, deserializing, and running a simple pipeline. ... ok
test_pcollectionview_not_recreated (apache_beam.pvalue_test.PValueTest) ... ok
test_pvalue_expected_arguments (apache_beam.pvalue_test.PValueTest) ... ok
test_append_extra_options (apache_beam.test_pipeline_test.TestPipelineTest) ... 
ok
test_append_verifier_in_extra_opt 
(apache_beam.test_pipeline_test.TestPipelineTest) ... ok
test_create_test_pipeline_options 
(apache_beam.test_pipeline_test.TestPipelineTest) ... ok
test_empty_option_args_parsing 
(apache_beam.test_pipeline_test.TestPipelineTest) ... ok
test_get_option (apache_beam.test_pipeline_test.TestPipelineTest) ... ok
test_option_args_parsing (apache_beam.test_pipeline_test.TestPipelineTest) ... 
ok
test_skip_IT (apache_beam.test_pipeline_test.TestPipelineTest) ... SKIP: IT is 
skipped because --test-pipeline-options is not specified
test_file_checksum_matchcer_invalid_sleep_time 
(apache_beam.tests.pipeline_verifiers_test.PipelineVerifiersTest) ... 
:129:
 DeprecationWarning: BaseException.message has been deprecated as of Python 2.6
  self.assertEqual(cm.exception.message,
ok
test_file_checksum_matcher_read_failed 
(apache_beam.tests.pipeline_verifiers_test.PipelineVerifiersTest) ... ok
test_file_checksum_matcher_service_error 
(apache_beam.tests.pipeline_verifiers_test.PipelineVerifiersTest) ... ok
test_file_checksum_matcher_sleep_before_verify 
(apache_beam.tests.pipeline_verifiers_test.PipelineVerifiersTest) ... ok
test_file_checksum_matcher_success 
(apache_beam.tests.pipeline_verifiers_test.PipelineVerifiersTest) ... ok
test_pipeline_state_matcher_fails 
(apache_beam.tests.pipeline_verifiers_test.PipelineVerifiersTest)
Test PipelineStateMatcher fails when using default expected state ... ok
test_pipeline_state_matcher_given_state 
(apache_beam.tests.pipeline_verifiers_test.PipelineVerifiersTest)
Test PipelineStateMatcher successes when matches given state ... ok
test_pipeline_state_matcher_success 
(apache_beam.tests.pipeline_verifiers_test.PipelineVerifiersTest)
Test PipelineStateMatcher successes when using default expected state ... ok

==
FAIL: test_using_slow_impl (apache_beam.coders.slow_coders_test.SlowCoders)
--
Traceback (most recent call last):
  File 
"
 line 41, in test_using_slow_impl
import apache_beam.coders.stream
AssertionError: ImportError not raised
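[Editor's note] A minimal sketch of the assertion pattern this test uses: it expects an ImportError because the Cython-compiled stream module should be absent in the pure-Python ("slow") build, so "AssertionError: ImportError not raised" means the compiled module leaked into the environment. The module name below is hypothetical:

```python
import unittest


class SlowCodersSketch(unittest.TestCase):
    """Assert that a compiled extension module is NOT importable."""

    def test_compiled_module_absent(self):
        # Passes only when the import fails, i.e. the "slow" pure-Python
        # implementation is the only one installed.
        with self.assertRaises(ImportError):
            import nonexistent_compiled_stream_module  # hypothetical name


suite = unittest.TestLoader().loadTestsFromTestCase(SlowCodersSketch)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())
```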

--
Ran 1085 tests in 64.310s

FAILED (failures=1, skipped=13)
Test failed: 
error: Test failed: 
ERROR: InvocationError: 
'
 setup.py test'
___ summary 
  docs: commands succeeded
  lint: commands succeeded
  py27: commands succeeded
ERROR:   py27cython: commands failed
ERROR:   py27gcp: commands failed
2017-03-30T22:51:08.622 [ERROR] Command execution failed.
org.apache.commons.exec.ExecuteException: Process exited with an error: 1 (Exit 
value: 1)
at 
org.apache.commons.exec.DefaultExecutor.executeInternal(DefaultExecutor.java:404)
at 
org.apache.commons.exec.DefaultExecutor.execute(DefaultExecutor.java:166)
at org.codehaus.mojo.exec.ExecMojo.executeCommandLine(ExecMojo.java:764)
at org.codehaus.mojo.exec.ExecMojo.executeCommandLine(ExecMojo.java:711)
at org.codehaus.mojo.exec.ExecMojo.execute(ExecMojo.java:289)
at 
org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:134)
at 
org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:208)
at 
org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
at 
org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
at 
org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:116)
at 
org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:80)
at 

Build failed in Jenkins: beam_PostCommit_Java_MavenInstall #3097

2017-03-30 Thread Apache Jenkins Server
See 


--
[...truncated 2.02 MB...]
test_transform_no_super_init (apache_beam.pipeline_test.PipelineTest) ... ok
test_visit_entire_graph (apache_beam.pipeline_test.PipelineTest) ... ok
test_simple (apache_beam.pipeline_test.RunnerApiTest)
Tests serializing, deserializing, and running a simple pipeline. ... ok
test_pcollectionview_not_recreated (apache_beam.pvalue_test.PValueTest) ... ok
test_pvalue_expected_arguments (apache_beam.pvalue_test.PValueTest) ... ok
test_append_extra_options (apache_beam.test_pipeline_test.TestPipelineTest) ... 
ok
test_append_verifier_in_extra_opt 
(apache_beam.test_pipeline_test.TestPipelineTest) ... ok
test_create_test_pipeline_options 
(apache_beam.test_pipeline_test.TestPipelineTest) ... ok
test_empty_option_args_parsing 
(apache_beam.test_pipeline_test.TestPipelineTest) ... ok
test_get_option (apache_beam.test_pipeline_test.TestPipelineTest) ... ok
test_option_args_parsing (apache_beam.test_pipeline_test.TestPipelineTest) ... 
ok
test_skip_IT (apache_beam.test_pipeline_test.TestPipelineTest) ... SKIP: IT is 
skipped because --test-pipeline-options is not specified
test_file_checksum_matchcer_invalid_sleep_time 
(apache_beam.tests.pipeline_verifiers_test.PipelineVerifiersTest) ... 
:129:
 DeprecationWarning: BaseException.message has been deprecated as of Python 2.6
  self.assertEqual(cm.exception.message,
ok
test_file_checksum_matcher_read_failed 
(apache_beam.tests.pipeline_verifiers_test.PipelineVerifiersTest) ... ok
test_file_checksum_matcher_service_error 
(apache_beam.tests.pipeline_verifiers_test.PipelineVerifiersTest) ... ok
test_file_checksum_matcher_sleep_before_verify 
(apache_beam.tests.pipeline_verifiers_test.PipelineVerifiersTest) ... ok
test_file_checksum_matcher_success 
(apache_beam.tests.pipeline_verifiers_test.PipelineVerifiersTest) ... ok
test_pipeline_state_matcher_fails 
(apache_beam.tests.pipeline_verifiers_test.PipelineVerifiersTest)
Test PipelineStateMatcher fails when using default expected state ... ok
test_pipeline_state_matcher_given_state 
(apache_beam.tests.pipeline_verifiers_test.PipelineVerifiersTest)
Test PipelineStateMatcher successes when matches given state ... ok
test_pipeline_state_matcher_success 
(apache_beam.tests.pipeline_verifiers_test.PipelineVerifiersTest)
Test PipelineStateMatcher successes when using default expected state ... ok

==
FAIL: test_using_slow_impl (apache_beam.coders.slow_coders_test.SlowCoders)
--
Traceback (most recent call last):
  File 
"
 line 41, in test_using_slow_impl
import apache_beam.coders.stream
AssertionError: ImportError not raised

--
Ran 1085 tests in 58.343s

FAILED (failures=1, skipped=13)
Test failed: 
error: Test failed: 
ERROR: InvocationError: 
'
 setup.py test'
___ summary 
  docs: commands succeeded
  lint: commands succeeded
  py27: commands succeeded
ERROR:   py27cython: commands failed
ERROR:   py27gcp: commands failed
2017-03-30T22:50:01.811 [ERROR] Command execution failed.
org.apache.commons.exec.ExecuteException: Process exited with an error: 1 (Exit 
value: 1)
at 
org.apache.commons.exec.DefaultExecutor.executeInternal(DefaultExecutor.java:404)
at 
org.apache.commons.exec.DefaultExecutor.execute(DefaultExecutor.java:166)
at org.codehaus.mojo.exec.ExecMojo.executeCommandLine(ExecMojo.java:764)
at org.codehaus.mojo.exec.ExecMojo.executeCommandLine(ExecMojo.java:711)
at org.codehaus.mojo.exec.ExecMojo.execute(ExecMojo.java:289)
at 
org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:134)
at 
org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:208)
at 
org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
at 
org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
at 
org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:116)
at 
org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:80)
at 
org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build(SingleThreadedBuilder.java:51)
at 

[jira] [Commented] (BEAM-1795) Upgrade google-cloud-bigquery to 0.23.0

2017-03-30 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-1795?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15950021#comment-15950021
 ] 

ASF GitHub Bot commented on BEAM-1795:
--

Github user asfgit closed the pull request at:

https://github.com/apache/beam/pull/2378


> Upgrade google-cloud-bigquery to 0.23.0
> ---
>
> Key: BEAM-1795
> URL: https://issues.apache.org/jira/browse/BEAM-1795
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-py
>Reporter: Ahmet Altay
>Assignee: Mark Liu
>
> Should we upgrade this?
> https://pypi.python.org/pypi/google-cloud-bigquery/0.23.0





[GitHub] beam pull request #2378: [BEAM-1795] Upgrade google-cloud-bigquery to 0.23.0

2017-03-30 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/beam/pull/2378


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[2/2] beam git commit: This closes #2378

2017-03-30 Thread altay
This closes #2378


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/d4de1371
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/d4de1371
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/d4de1371

Branch: refs/heads/master
Commit: d4de13718ff97d1ab04796ad84cd28b857a2a70f
Parents: cfd1643 82aba4b
Author: Ahmet Altay 
Authored: Thu Mar 30 15:56:37 2017 -0700
Committer: Ahmet Altay 
Committed: Thu Mar 30 15:56:37 2017 -0700

--
 sdks/python/setup.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
--




[jira] [Commented] (BEAM-1579) Runners should verify that PT overrides converged

2017-03-30 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-1579?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15950020#comment-15950020
 ] 

ASF GitHub Bot commented on BEAM-1579:
--

GitHub user tgroh opened a pull request:

https://github.com/apache/beam/pull/2381

[BEAM-1579] Use Batch Replacement in the Apex Runner

Be sure to do all of the following to help us incorporate your contribution
quickly and easily:

 - [ ] Make sure the PR title is formatted like:
   `[BEAM-] Description of pull request`
 - [ ] Make sure tests pass via `mvn clean verify`. (Even better, enable
   Travis-CI on your fork and ensure the whole test matrix passes).
 - [ ] Replace `` in the title with the actual Jira issue
   number, if there is one.
 - [ ] If this contribution is large, please file an Apache
   [Individual Contributor License 
Agreement](https://www.apache.org/licenses/icla.txt).

---


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/tgroh/beam apex_batch_replacement

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/2381.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2381


commit d8ff2206e83ae9b3ed2db222d1dbb4cde804c0d1
Author: Thomas Groh 
Date:   2017-03-30T22:55:28Z

Use Batch Replacement in the Apex Runner




> Runners should verify that PT overrides converged
> -
>
> Key: BEAM-1579
> URL: https://issues.apache.org/jira/browse/BEAM-1579
> Project: Beam
>  Issue Type: Bug
>  Components: runner-core, runner-dataflow, runner-direct
>Reporter: Eugene Kirpichov
>Assignee: Thomas Groh
>
> PT overrides are applied in order, see 
> https://issues.apache.org/jira/browse/BEAM-1578 .
> To make sure that the order is correct and avoid confusing errors, after 
> applying the overrides in order, we should verify that the pipeline has 
> converged, i.e. the overrides no longer match.
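[Editor's note] The convergence check described above can be sketched in a few lines. This is illustrative only — the names and the list-of-strings "pipeline" stand in for the actual runners-core transform hierarchy; overrides are (matcher, replacement) pairs applied in order, after which no matcher may still fire:

```python
def apply_and_verify(nodes, overrides):
    """Apply overrides in order, then verify the pipeline converged."""
    for matches, replace in overrides:
        nodes = [replace(n) if matches(n) else n for n in nodes]
    # Convergence check: a matcher that still fires means the override
    # order was wrong, or a replacement reintroduced a matched construct.
    leftover = [n for matches, _ in overrides for n in nodes if matches(n)]
    if leftover:
        raise RuntimeError('overrides did not converge: %r' % leftover)
    return nodes


pipeline = ['Read', 'StreamingWrite', 'Flatten']
overrides = [(lambda n: n == 'StreamingWrite', lambda n: 'BatchWrite')]
print(apply_and_verify(pipeline, overrides))
```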





[1/2] beam git commit: [BEAM-1795] Upgrade google-cloud-bigquery to 0.23.0

2017-03-30 Thread altay
Repository: beam
Updated Branches:
  refs/heads/master cfd164398 -> d4de13718


[BEAM-1795] Upgrade google-cloud-bigquery to 0.23.0


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/82aba4ba
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/82aba4ba
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/82aba4ba

Branch: refs/heads/master
Commit: 82aba4baec1f1e2f3d2887ad2c44497dcbd24540
Parents: cfd1643
Author: Mark Liu 
Authored: Fri Mar 24 09:30:53 2017 -0700
Committer: Ahmet Altay 
Committed: Thu Mar 30 15:56:31 2017 -0700

--
 sdks/python/setup.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/82aba4ba/sdks/python/setup.py
--
diff --git a/sdks/python/setup.py b/sdks/python/setup.py
index 3b44b82..ee3b5e4 100644
--- a/sdks/python/setup.py
+++ b/sdks/python/setup.py
@@ -106,7 +106,7 @@ GCP_REQUIREMENTS = [
   'proto-google-cloud-datastore-v1==0.90.0',
   'googledatastore==7.0.0',
   # GCP packages required by tests
-  'google-cloud-bigquery>=0.22.1,<0.23',
+  'google-cloud-bigquery>=0.23.0,<0.24.0',
 ]
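[Editor's note] The one-line diff above moves the pin to the half-open range '>=0.23.0,<0.24.0'. A stdlib-only sketch of what that range accepts (actual resolution is done by pip/setuptools; this only illustrates the comparison):

```python
def parse(version):
    """Split a dotted version string into a comparable tuple of ints."""
    return tuple(int(part) for part in version.split('.'))


def satisfies(version, lo='0.23.0', hi='0.24.0'):
    """True iff lo <= version < hi, mirroring '>=0.23.0,<0.24.0'."""
    return parse(lo) <= parse(version) < parse(hi)


print(satisfies('0.23.0'), satisfies('0.23.5'),
      satisfies('0.22.1'), satisfies('0.24.0'))
# True True False False
```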
 
 



[GitHub] beam pull request #2381: [BEAM-1579] Use Batch Replacement in the Apex Runne...

2017-03-30 Thread tgroh
GitHub user tgroh opened a pull request:

https://github.com/apache/beam/pull/2381

[BEAM-1579] Use Batch Replacement in the Apex Runner

Be sure to do all of the following to help us incorporate your contribution
quickly and easily:

 - [ ] Make sure the PR title is formatted like:
   `[BEAM-] Description of pull request`
 - [ ] Make sure tests pass via `mvn clean verify`. (Even better, enable
   Travis-CI on your fork and ensure the whole test matrix passes).
 - [ ] Replace `` in the title with the actual Jira issue
   number, if there is one.
 - [ ] If this contribution is large, please file an Apache
   [Individual Contributor License 
Agreement](https://www.apache.org/licenses/icla.txt).

---


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/tgroh/beam apex_batch_replacement

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/2381.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2381


commit d8ff2206e83ae9b3ed2db222d1dbb4cde804c0d1
Author: Thomas Groh 
Date:   2017-03-30T22:55:28Z

Use Batch Replacement in the Apex Runner






[jira] [Commented] (BEAM-1579) Runners should verify that PT overrides converged

2017-03-30 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-1579?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15950019#comment-15950019
 ] 

ASF GitHub Bot commented on BEAM-1579:
--

GitHub user tgroh opened a pull request:

https://github.com/apache/beam/pull/2380

[BEAM-1579] Use Batch Replacement in the Flink Runner

Be sure to do all of the following to help us incorporate your contribution
quickly and easily:

 - [ ] Make sure the PR title is formatted like:
   `[BEAM-] Description of pull request`
 - [ ] Make sure tests pass via `mvn clean verify`. (Even better, enable
   Travis-CI on your fork and ensure the whole test matrix passes).
 - [ ] Replace `` in the title with the actual Jira issue
   number, if there is one.
 - [ ] If this contribution is large, please file an Apache
   [Individual Contributor License 
Agreement](https://www.apache.org/licenses/icla.txt).

---


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/tgroh/beam flink_batch_replacement

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/2380.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2380


commit 5883f0057cfba6371a1298c99754b9bf956f59be
Author: Thomas Groh 
Date:   2017-03-30T22:53:16Z

Use Batch Replacement in the Flink Runner




> Runners should verify that PT overrides converged
> -
>
> Key: BEAM-1579
> URL: https://issues.apache.org/jira/browse/BEAM-1579
> Project: Beam
>  Issue Type: Bug
>  Components: runner-core, runner-dataflow, runner-direct
>Reporter: Eugene Kirpichov
>Assignee: Thomas Groh
>
> PT overrides are applied in order, see 
> https://issues.apache.org/jira/browse/BEAM-1578 .
> To make sure that the order is correct and avoid confusing errors, after 
> applying the overrides in order, we should verify that the pipeline has 
> converged, i.e. the overrides no longer match.





[GitHub] beam pull request #2380: [BEAM-1579] Use Batch Replacement in the Flink Runn...

2017-03-30 Thread tgroh
GitHub user tgroh opened a pull request:

https://github.com/apache/beam/pull/2380

[BEAM-1579] Use Batch Replacement in the Flink Runner

Be sure to do all of the following to help us incorporate your contribution
quickly and easily:

 - [ ] Make sure the PR title is formatted like:
   `[BEAM-] Description of pull request`
 - [ ] Make sure tests pass via `mvn clean verify`. (Even better, enable
   Travis-CI on your fork and ensure the whole test matrix passes).
 - [ ] Replace `` in the title with the actual Jira issue
   number, if there is one.
 - [ ] If this contribution is large, please file an Apache
   [Individual Contributor License 
Agreement](https://www.apache.org/licenses/icla.txt).

---


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/tgroh/beam flink_batch_replacement

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/2380.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2380


commit 5883f0057cfba6371a1298c99754b9bf956f59be
Author: Thomas Groh 
Date:   2017-03-30T22:53:16Z

Use Batch Replacement in the Flink Runner






Build failed in Jenkins: beam_PostCommit_Python_Verify #1678

2017-03-30 Thread Apache Jenkins Server
See 


Changes:

[kirpichov] Improves logging in FileBasedSource splitting

--
[...truncated 647.37 KB...]
test_par_do_with_multiple_outputs_and_using_yield 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
:132:
 UserWarning: Using fallback coder for typehint: List[int].
  warnings.warn('Using fallback coder for typehint: %r.' % typehint)
:132:
 UserWarning: Using fallback coder for typehint: Union[].
  warnings.warn('Using fallback coder for typehint: %r.' % typehint)
DEPRECATION: pip install --download has been deprecated and will be removed in 
the future. Pip now has a download command that should be used instead.
Collecting pyhamcrest (from -r postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/PyHamcrest-1.9.0.tar.gz
Collecting mock (from -r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/mock-2.0.0.tar.gz
Collecting setuptools (from pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/setuptools-34.3.3.zip
Collecting six (from pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/six-1.10.0.tar.gz
Collecting funcsigs>=1 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded 
/tmp/dataflow-requirements-cache/funcsigs-1.0.2.tar.gz
Collecting pbr>=0.11 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/pbr-2.0.0.tar.gz
Collecting packaging>=16.8 (from setuptools->pyhamcrest->-r 
postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/packaging-16.8.tar.gz
Collecting appdirs>=1.4.0 (from setuptools->pyhamcrest->-r 
postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/appdirs-1.4.3.tar.gz
Collecting pyparsing (from packaging>=16.8->setuptools->pyhamcrest->-r 
postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/pyparsing-2.2.0.tar.gz
Successfully downloaded pyhamcrest mock setuptools six funcsigs pbr packaging 
appdirs pyparsing
test_empty_side_outputs (apache_beam.transforms.ptransform_test.PTransformTest) 
... ok
:132:
 UserWarning: Using fallback coder for typehint: List[int].
  warnings.warn('Using fallback coder for typehint: %r.' % typehint)
:132:
 UserWarning: Using fallback coder for typehint: Union[].
  warnings.warn('Using fallback coder for typehint: %r.' % typehint)
DEPRECATION: pip install --download has been deprecated and will be removed in 
the future. Pip now has a download command that should be used instead.
Collecting pyhamcrest (from -r postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/PyHamcrest-1.9.0.tar.gz
Collecting mock (from -r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/mock-2.0.0.tar.gz
Collecting setuptools (from pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/setuptools-34.3.3.zip
Collecting six (from pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/six-1.10.0.tar.gz
Collecting funcsigs>=1 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded 
/tmp/dataflow-requirements-cache/funcsigs-1.0.2.tar.gz
Collecting pbr>=0.11 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/pbr-2.0.0.tar.gz
Collecting packaging>=16.8 (from setuptools->pyhamcrest->-r 
postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/packaging-16.8.tar.gz
Collecting appdirs>=1.4.0 (from setuptools->pyhamcrest->-r 
postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/appdirs-1.4.3.tar.gz
Collecting pyparsing (from packaging>=16.8->setuptools->pyhamcrest->-r 
postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/pyparsing-2.2.0.tar.gz
Successfully downloaded pyhamcrest mock setuptools six funcsigs pbr packaging 
appdirs pyparsing
test_as_list_and_as_dict_side_inputs 

[jira] [Commented] (BEAM-1579) Runners should verify that PT overrides converged

2017-03-30 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-1579?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15950008#comment-15950008
 ] 

ASF GitHub Bot commented on BEAM-1579:
--

GitHub user tgroh opened a pull request:

https://github.com/apache/beam/pull/2379

[BEAM-1579] Use Batch Replacement API in the Dataflow Runner

Be sure to do all of the following to help us incorporate your contribution
quickly and easily:

 - [ ] Make sure the PR title is formatted like:
   `[BEAM-] Description of pull request`
 - [ ] Make sure tests pass via `mvn clean verify`. (Even better, enable
   Travis-CI on your fork and ensure the whole test matrix passes).
 - [ ] Replace `` in the title with the actual Jira issue
   number, if there is one.
 - [ ] If this contribution is large, please file an Apache
   [Individual Contributor License 
Agreement](https://www.apache.org/licenses/icla.txt).

---


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/tgroh/beam dataflow_batch_replacement

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/2379.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2379


commit 29ffe5ee2a30df801730278971e3c74f180a34b2
Author: Thomas Groh 
Date:   2017-03-23T16:38:53Z

Use Batch Replacement API in the Dataflow Runner




> Runners should verify that PT overrides converged
> -
>
> Key: BEAM-1579
> URL: https://issues.apache.org/jira/browse/BEAM-1579
> Project: Beam
>  Issue Type: Bug
>  Components: runner-core, runner-dataflow, runner-direct
>Reporter: Eugene Kirpichov
>Assignee: Thomas Groh
>
> PT overrides are applied in order, see 
> https://issues.apache.org/jira/browse/BEAM-1578 .
> To make sure that the order is correct and avoid confusing errors, after 
> applying the overrides in order, we should verify that the pipeline has 
> converged, i.e. the overrides no longer match.





[GitHub] beam pull request #2379: [BEAM-1579] Use Batch Replacement API in the Datafl...

2017-03-30 Thread tgroh
GitHub user tgroh opened a pull request:

https://github.com/apache/beam/pull/2379

[BEAM-1579] Use Batch Replacement API in the Dataflow Runner

Be sure to do all of the following to help us incorporate your contribution
quickly and easily:

 - [ ] Make sure the PR title is formatted like:
   `[BEAM-] Description of pull request`
 - [ ] Make sure tests pass via `mvn clean verify`. (Even better, enable
   Travis-CI on your fork and ensure the whole test matrix passes).
 - [ ] Replace `` in the title with the actual Jira issue
   number, if there is one.
 - [ ] If this contribution is large, please file an Apache
   [Individual Contributor License 
Agreement](https://www.apache.org/licenses/icla.txt).

---


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/tgroh/beam dataflow_batch_replacement

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/2379.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2379


commit 29ffe5ee2a30df801730278971e3c74f180a34b2
Author: Thomas Groh 
Date:   2017-03-23T16:38:53Z

Use Batch Replacement API in the Dataflow Runner






[jira] [Commented] (BEAM-1795) Upgrade google-cloud-bigquery to 0.23.0

2017-03-30 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-1795?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15949977#comment-15949977
 ] 

ASF GitHub Bot commented on BEAM-1795:
--

GitHub user markflyhigh opened a pull request:

https://github.com/apache/beam/pull/2378

[BEAM-1795] Upgrade google-cloud-bigquery to 0.23.0

Be sure to do all of the following to help us incorporate your contribution
quickly and easily:

 - [ ] Make sure the PR title is formatted like:
   `[BEAM-] Description of pull request`
 - [ ] Make sure tests pass via `mvn clean verify`. (Even better, enable
   Travis-CI on your fork and ensure the whole test matrix passes).
 - [ ] Replace `` in the title with the actual Jira issue
   number, if there is one.
 - [ ] If this contribution is large, please file an Apache
   [Individual Contributor License 
Agreement](https://www.apache.org/licenses/icla.txt).

---

Update google-cloud-bigquery to the next stable version. Inspection of the 
transitive dependencies shows that no additional dependency is introduced.

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/markflyhigh/incubator-beam 
update-bigquery-version

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/2378.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2378


commit 051b16301a06146fba93ae8e90eadd4e1c2989d7
Author: Mark Liu 
Date:   2017-03-24T16:30:53Z

[BEAM-1795] Upgrade google-cloud-bigquery to 0.23.0




> Upgrade google-cloud-bigquery to 0.23.0
> ---
>
> Key: BEAM-1795
> URL: https://issues.apache.org/jira/browse/BEAM-1795
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-py
>Reporter: Ahmet Altay
>Assignee: Mark Liu
>
> Should we upgrade this?
> https://pypi.python.org/pypi/google-cloud-bigquery/0.23.0





[GitHub] beam pull request #2378: [BEAM-1795] Upgrade google-cloud-bigquery to 0.23.0

2017-03-30 Thread markflyhigh
GitHub user markflyhigh opened a pull request:

https://github.com/apache/beam/pull/2378

[BEAM-1795] Upgrade google-cloud-bigquery to 0.23.0

Be sure to do all of the following to help us incorporate your contribution
quickly and easily:

 - [ ] Make sure the PR title is formatted like:
   `[BEAM-] Description of pull request`
 - [ ] Make sure tests pass via `mvn clean verify`. (Even better, enable
   Travis-CI on your fork and ensure the whole test matrix passes).
 - [ ] Replace `` in the title with the actual Jira issue
   number, if there is one.
 - [ ] If this contribution is large, please file an Apache
   [Individual Contributor License 
Agreement](https://www.apache.org/licenses/icla.txt).

---

Update google-cloud-bigquery to the next stable version. Inspection of the 
transitive dependencies shows that no additional dependency is introduced.

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/markflyhigh/incubator-beam 
update-bigquery-version

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/2378.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2378


commit 051b16301a06146fba93ae8e90eadd4e1c2989d7
Author: Mark Liu 
Date:   2017-03-24T16:30:53Z

[BEAM-1795] Upgrade google-cloud-bigquery to 0.23.0






[GitHub] beam pull request #2375: Improves logging in FileBasedSource splitting

2017-03-30 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/beam/pull/2375




[2/2] beam git commit: This closes #2375

2017-03-30 Thread jkff
This closes #2375


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/cfd16439
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/cfd16439
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/cfd16439

Branch: refs/heads/master
Commit: cfd1643984a5cc4d26e9b1df3c7a217127e12764
Parents: 66f2499 9ead0a2
Author: Eugene Kirpichov 
Authored: Thu Mar 30 15:11:22 2017 -0700
Committer: Eugene Kirpichov 
Committed: Thu Mar 30 15:11:22 2017 -0700

--
 .../main/java/org/apache/beam/sdk/io/FileBasedSource.java | 10 +++---
 1 file changed, 7 insertions(+), 3 deletions(-)
--




Build failed in Jenkins: beam_PostCommit_Python_Verify #1677

2017-03-30 Thread Apache Jenkins Server
See 


Changes:

[kirpichov] Fix NPE in Kafka value writer.

--
[...truncated 589.05 KB...]
testGetAttr (apache_beam.typehints.trivial_inference_test.TrivialInferenceTest) 
... ok
testGlobals (apache_beam.typehints.trivial_inference_test.TrivialInferenceTest) 
... ok
testIdentity 
(apache_beam.typehints.trivial_inference_test.TrivialInferenceTest) ... ok
testListComprehension 
(apache_beam.typehints.trivial_inference_test.TrivialInferenceTest) ... ok
testMethod (apache_beam.typehints.trivial_inference_test.TrivialInferenceTest) 
... ok
testTupleListComprehension 
(apache_beam.typehints.trivial_inference_test.TrivialInferenceTest) ... ok
testTuples (apache_beam.typehints.trivial_inference_test.TrivialInferenceTest) 
... ok
testUnpack (apache_beam.typehints.trivial_inference_test.TrivialInferenceTest) 
... ok
test_custom_transform 
(apache_beam.typehints.typed_pipeline_test.CustomTransformTest) ... ok
test_flat_type_hint 
(apache_beam.typehints.typed_pipeline_test.CustomTransformTest) ... ok
test_keyword_type_hints 
(apache_beam.typehints.typed_pipeline_test.CustomTransformTest) ... ok
test_bad_main_input (apache_beam.typehints.typed_pipeline_test.MainInputTest) 
... ok
test_loose_bounds (apache_beam.typehints.typed_pipeline_test.MainInputTest) ... 
ok
test_non_function (apache_beam.typehints.typed_pipeline_test.MainInputTest) ... 
ok
test_typed_dofn_class (apache_beam.typehints.typed_pipeline_test.MainInputTest) 
... ok
test_typed_dofn_instance 
(apache_beam.typehints.typed_pipeline_test.MainInputTest) ... ok
test_basic_side_input_hint 
(apache_beam.typehints.typed_pipeline_test.SideInputTest) ... ok
test_default_typed_hint 
(apache_beam.typehints.typed_pipeline_test.SideInputTest) ... ok
test_default_untyped_hint 
(apache_beam.typehints.typed_pipeline_test.SideInputTest) ... ok
test_deferred_side_input_iterable 
(apache_beam.typehints.typed_pipeline_test.SideInputTest) ... ok
test_deferred_side_inputs 
(apache_beam.typehints.typed_pipeline_test.SideInputTest) ... ok
test_keyword_side_input_hint 
(apache_beam.typehints.typed_pipeline_test.SideInputTest) ... ok
test_any_compatibility 
(apache_beam.typehints.typehints_test.AnyTypeConstraintTestCase) ... ok
test_repr (apache_beam.typehints.typehints_test.AnyTypeConstraintTestCase) ... 
ok
test_type_check 
(apache_beam.typehints.typehints_test.AnyTypeConstraintTestCase) ... ok
test_composite_takes_and_returns_hints 
(apache_beam.typehints.typehints_test.CombinedReturnsAndTakesTestCase) ... ok
test_enable_and_disable_type_checking_returns 
(apache_beam.typehints.typehints_test.CombinedReturnsAndTakesTestCase) ... ok
test_enable_and_disable_type_checking_takes 
(apache_beam.typehints.typehints_test.CombinedReturnsAndTakesTestCase) ... ok
test_simple_takes_and_returns_hints 
(apache_beam.typehints.typehints_test.CombinedReturnsAndTakesTestCase) ... ok
test_valid_mix_pos_and_keyword_with_both_orders 
(apache_beam.typehints.typehints_test.CombinedReturnsAndTakesTestCase) ... ok
test_getcallargs_forhints 
(apache_beam.typehints.typehints_test.DecoratorHelpers) ... ok
test_hint_helper (apache_beam.typehints.typehints_test.DecoratorHelpers) ... ok
test_positional_arg_hints 
(apache_beam.typehints.typehints_test.DecoratorHelpers) ... ok
test_compatibility (apache_beam.typehints.typehints_test.DictHintTestCase) ... 
ok
test_getitem_param_must_be_tuple 
(apache_beam.typehints.typehints_test.DictHintTestCase) ... 
:497:
 DeprecationWarning: BaseException.message has been deprecated as of Python 2.6
  e.exception.message)
ok
test_getitem_param_must_have_length_2 
(apache_beam.typehints.typehints_test.DictHintTestCase) ... ok
test_key_type_must_be_valid_composite_param 
(apache_beam.typehints.typehints_test.DictHintTestCase) ... ok
test_match_type_variables 
(apache_beam.typehints.typehints_test.DictHintTestCase) ... ok
test_repr (apache_beam.typehints.typehints_test.DictHintTestCase) ... ok
test_type_check_invalid_key_type 
(apache_beam.typehints.typehints_test.DictHintTestCase) ... ok
test_type_check_invalid_value_type 
(apache_beam.typehints.typehints_test.DictHintTestCase) ... ok
test_type_check_valid_composite_type 
(apache_beam.typehints.typehints_test.DictHintTestCase) ... ok
test_type_check_valid_simple_type 
(apache_beam.typehints.typehints_test.DictHintTestCase) ... ok
test_type_checks_not_dict 
(apache_beam.typehints.typehints_test.DictHintTestCase) ... ok
test_value_type_must_be_valid_composite_param 
(apache_beam.typehints.typehints_test.DictHintTestCase) ... ok
test_compatibility (apache_beam.typehints.typehints_test.GeneratorHintTestCase) 
... ok
test_generator_argument_hint_invalid_yield_type 
(apache_beam.typehints.typehints_test.GeneratorHintTestCase) ... ok
test_generator_return_hint_invalid_yield_type 

[GitHub] beam pull request #2376: Annotate output type for python CoGroupByKey

2017-03-30 Thread vikkyrk
Github user vikkyrk closed the pull request at:

https://github.com/apache/beam/pull/2376




Build failed in Jenkins: beam_PostCommit_Python_Verify #1676

2017-03-30 Thread Apache Jenkins Server
See 


--
[...truncated 652.36 KB...]
test_par_do_with_multiple_outputs_and_using_yield 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
:132:
 UserWarning: Using fallback coder for typehint: List[int].
  warnings.warn('Using fallback coder for typehint: %r.' % typehint)
:132:
 UserWarning: Using fallback coder for typehint: Union[].
  warnings.warn('Using fallback coder for typehint: %r.' % typehint)
DEPRECATION: pip install --download has been deprecated and will be removed in 
the future. Pip now has a download command that should be used instead.
Collecting pyhamcrest (from -r postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/PyHamcrest-1.9.0.tar.gz
Collecting mock (from -r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/mock-2.0.0.tar.gz
Collecting setuptools (from pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/setuptools-34.3.3.zip
Collecting six (from pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/six-1.10.0.tar.gz
Collecting funcsigs>=1 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded 
/tmp/dataflow-requirements-cache/funcsigs-1.0.2.tar.gz
Collecting pbr>=0.11 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/pbr-2.0.0.tar.gz
Collecting packaging>=16.8 (from setuptools->pyhamcrest->-r 
postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/packaging-16.8.tar.gz
Collecting appdirs>=1.4.0 (from setuptools->pyhamcrest->-r 
postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/appdirs-1.4.3.tar.gz
Collecting pyparsing (from packaging>=16.8->setuptools->pyhamcrest->-r 
postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/pyparsing-2.2.0.tar.gz
Successfully downloaded pyhamcrest mock setuptools six funcsigs pbr packaging 
appdirs pyparsing
test_par_do_with_multiple_outputs_and_using_return 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
:132:
 UserWarning: Using fallback coder for typehint: List[int].
  warnings.warn('Using fallback coder for typehint: %r.' % typehint)
:132:
 UserWarning: Using fallback coder for typehint: Union[].
  warnings.warn('Using fallback coder for typehint: %r.' % typehint)
DEPRECATION: pip install --download has been deprecated and will be removed in 
the future. Pip now has a download command that should be used instead.
Collecting pyhamcrest (from -r postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/PyHamcrest-1.9.0.tar.gz
Collecting mock (from -r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/mock-2.0.0.tar.gz
Collecting setuptools (from pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/setuptools-34.3.3.zip
Collecting six (from pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/six-1.10.0.tar.gz
Collecting funcsigs>=1 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded 
/tmp/dataflow-requirements-cache/funcsigs-1.0.2.tar.gz
Collecting pbr>=0.11 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/pbr-2.0.0.tar.gz
Collecting packaging>=16.8 (from setuptools->pyhamcrest->-r 
postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/packaging-16.8.tar.gz
Collecting appdirs>=1.4.0 (from setuptools->pyhamcrest->-r 
postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/appdirs-1.4.3.tar.gz
Collecting pyparsing (from packaging>=16.8->setuptools->pyhamcrest->-r 
postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/pyparsing-2.2.0.tar.gz
Successfully downloaded pyhamcrest mock setuptools six funcsigs pbr packaging 
appdirs pyparsing
test_as_dict_with_unique_labels 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
DEPRECATION: pip install 

Build failed in Jenkins: beam_PostCommit_Java_MavenInstall #3095

2017-03-30 Thread Apache Jenkins Server
See 


Changes:

[kirpichov] Fix NPE in Kafka value writer.

--
[...truncated 2.01 MB...]
  File 
"
 line 60, in process
return self.wrapper(self.dofn.process, args, kwargs)
  File 
"
 line 78, in wrapper
result = method(*args, **kwargs)
  File 
"
 line 719, in 
wrapper = lambda x, *args, **kwargs: [fn(x, *args, **kwargs)]
  File 
"
 line 245, in check_memory
'High memory usage: %d > %d' % (memory_usage, memory_threshold))
RuntimeError: High memory usage: 192524288 > 192353152 [while running 
'oom:check']
Traceback (most recent call last):
  File 
"
 line 297, in __call__
evaluator.process_element(value)
  File 
"
 line 355, in process_element
self.runner.process(element)
  File 
"
 line 267, in process
self.reraise_augmented(exn)
  File 
"
 line 265, in process
self._dofn_invoker(element)
  File 
"
 line 232, in _dofn_invoker
self._dofn_per_window_invoker(element)
  File 
"
 line 218, in _dofn_per_window_invoker
self._process_outputs(element, self.dofn_process(*args))
  File 
"
 line 60, in process
return self.wrapper(self.dofn.process, args, kwargs)
  File 
"
 line 78, in wrapper
result = method(*args, **kwargs)
  File 
"
 line 719, in 
wrapper = lambda x, *args, **kwargs: [fn(x, *args, **kwargs)]
  File 
"
 line 245, in check_memory
'High memory usage: %d > %d' % (memory_usage, memory_threshold))
RuntimeError: High memory usage: 192524288 > 192353152 [while running 
'oom:check']
root: WARNING: A task failed with exception.
 High memory usage: 192524288 > 192353152 [while running 'oom:check']
- >> end captured logging << -

--
Ran 1049 tests in 60.060s

FAILED (errors=1, skipped=13)
Test failed: 
error: Test failed: 
ERROR: InvocationError: 
'
 setup.py test'
___ summary 
  docs: commands succeeded
  lint: commands succeeded
  py27: commands succeeded
  py27cython: commands succeeded
ERROR:   py27gcp: commands failed
2017-03-30T21:33:19.230 [ERROR] Command execution failed.
org.apache.commons.exec.ExecuteException: Process exited with an error: 1 (Exit 
value: 1)
at 
org.apache.commons.exec.DefaultExecutor.executeInternal(DefaultExecutor.java:404)
at 
org.apache.commons.exec.DefaultExecutor.execute(DefaultExecutor.java:166)
at org.codehaus.mojo.exec.ExecMojo.executeCommandLine(ExecMojo.java:764)
at org.codehaus.mojo.exec.ExecMojo.executeCommandLine(ExecMojo.java:711)
at org.codehaus.mojo.exec.ExecMojo.execute(ExecMojo.java:289)
at 
org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:134)
at 
org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:208)
at 
org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
at 
org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
at 

[jira] [Commented] (BEAM-1269) BigtableIO should make more efficient use of connections

2017-03-30 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-1269?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15949879#comment-15949879
 ] 

ASF GitHub Bot commented on BEAM-1269:
--

GitHub user mdshalda opened a pull request:

https://github.com/apache/beam/pull/2377

BEAM-1269: Update bigtable library dependency and add cached data pools.

Hi @dhalperi, 

Can you review these changes to add the new bigtable library and cached 
data pools?


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/mdshalda/beam mshalda/beam-1269

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/2377.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2377


commit a9503d0cb973743437f74badaeea2127d176f84a
Author: mdshalda 
Date:   2017-03-30T21:06:03Z

BEAM-1269: Update bigtable library dependency. Enable cached data pools for 
efficiency and remove pegging data channel count to 1.




> BigtableIO should make more efficient use of connections
> 
>
> Key: BEAM-1269
> URL: https://issues.apache.org/jira/browse/BEAM-1269
> Project: Beam
>  Issue Type: Improvement
>  Components: sdk-java-gcp
>Reporter: Daniel Halperin
>  Labels: newbie, starter
>
> Right now, {{BigtableIO}} opens up a new Bigtable session for every DoFn, in 
> the {{@Setup}} function. However, sessions can support multiple connections, 
> so perhaps this code should be modified to open up a smaller session pool and 
> then allocate connections in {{@StartBundle}}.
> This would likely make more efficient use of resources, especially for highly 
> multithreaded workers.
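The proposal above can be sketched in plain Java (hypothetical names throughout; {{Session}} and {{Connection}} are stand-ins for illustration, not the real Bigtable client API): open one shared session per worker JVM via a lazily initialized holder, and hand out lightweight connections per bundle instead of opening a full session in every DoFn's {{@Setup}}.

```java
import java.util.concurrent.atomic.AtomicInteger;

// Hypothetical sketch of the proposal: one shared "session" per worker
// (instead of one per DoFn instance), with connections handed out per bundle.
public class SessionPoolSketch {
    // Stand-in for a Bigtable session; counts how many times one was opened.
    static final AtomicInteger SESSIONS_OPENED = new AtomicInteger();

    static class Session {
        Session() { SESSIONS_OPENED.incrementAndGet(); }
        Connection connect() { return new Connection(); }
    }

    static class Connection {}

    // Lazily created, shared across all DoFn instances in the worker JVM.
    private static volatile Session sharedSession;

    static Session sharedSession() {
        if (sharedSession == null) {
            synchronized (SessionPoolSketch.class) {
                if (sharedSession == null) {
                    sharedSession = new Session();
                }
            }
        }
        return sharedSession;
    }

    // What a DoFn's @StartBundle would do under the proposal: grab a
    // connection from the shared session rather than opening a new session.
    static Connection startBundle() {
        return sharedSession().connect();
    }

    public static void main(String[] args) {
        // Three "bundles" starting up still share a single session.
        Connection a = startBundle();
        Connection b = startBundle();
        Connection c = startBundle();
        System.out.println("sessions opened: " + SESSIONS_OPENED.get());
    }
}
```

The double-checked locking keeps session creation thread-safe, which matters for the highly multithreaded workers the description mentions.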



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[GitHub] beam pull request #2377: BEAM-1269: Update bigtable library dependency and a...

2017-03-30 Thread mdshalda
GitHub user mdshalda opened a pull request:

https://github.com/apache/beam/pull/2377

BEAM-1269: Update bigtable library dependency and add cached data pools.

Hi @dhalperi, 

Can you review these changes to add the new bigtable library and cached 
data pools?


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/mdshalda/beam mshalda/beam-1269

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/2377.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2377


commit a9503d0cb973743437f74badaeea2127d176f84a
Author: mdshalda 
Date:   2017-03-30T21:06:03Z

BEAM-1269: Update bigtable library dependency. Enable cached data pools for 
efficiency and remove pegging data channel count to 1.






[jira] [Commented] (BEAM-1795) Upgrade google-cloud-bigquery to 0.23.0

2017-03-30 Thread Mark Liu (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-1795?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15949848#comment-15949848
 ] 

Mark Liu commented on BEAM-1795:


+1 to have a guide for dependency updates. I found the Java dependency tree on the 
website (https://beam.apache.org/contribute/maturity-model/#dependency-tree), 
but it has been outdated for a while. 

> Upgrade google-cloud-bigquery to 0.23.0
> ---
>
> Key: BEAM-1795
> URL: https://issues.apache.org/jira/browse/BEAM-1795
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-py
>Reporter: Ahmet Altay
>Assignee: Mark Liu
>
> Should we upgrade this?
> https://pypi.python.org/pypi/google-cloud-bigquery/0.23.0



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[GitHub] beam pull request #2369: [BEAM-1837] Fix NPE in KafkaIO writer.

2017-03-30 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/beam/pull/2369




Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Dataflow #2693

2017-03-30 Thread Apache Jenkins Server
See 


--
[...truncated 219.08 KB...]
 x [deleted] (none) -> origin/pr/942/head
 x [deleted] (none) -> origin/pr/942/merge
 x [deleted] (none) -> origin/pr/943/head
 x [deleted] (none) -> origin/pr/943/merge
 x [deleted] (none) -> origin/pr/944/head
 x [deleted] (none) -> origin/pr/945/head
 x [deleted] (none) -> origin/pr/945/merge
 x [deleted] (none) -> origin/pr/946/head
 x [deleted] (none) -> origin/pr/946/merge
 x [deleted] (none) -> origin/pr/947/head
 x [deleted] (none) -> origin/pr/947/merge
 x [deleted] (none) -> origin/pr/948/head
 x [deleted] (none) -> origin/pr/948/merge
 x [deleted] (none) -> origin/pr/949/head
 x [deleted] (none) -> origin/pr/949/merge
 x [deleted] (none) -> origin/pr/95/head
 x [deleted] (none) -> origin/pr/95/merge
 x [deleted] (none) -> origin/pr/950/head
 x [deleted] (none) -> origin/pr/951/head
 x [deleted] (none) -> origin/pr/951/merge
 x [deleted] (none) -> origin/pr/952/head
 x [deleted] (none) -> origin/pr/952/merge
 x [deleted] (none) -> origin/pr/953/head
 x [deleted] (none) -> origin/pr/954/head
 x [deleted] (none) -> origin/pr/954/merge
 x [deleted] (none) -> origin/pr/955/head
 x [deleted] (none) -> origin/pr/955/merge
 x [deleted] (none) -> origin/pr/956/head
 x [deleted] (none) -> origin/pr/957/head
 x [deleted] (none) -> origin/pr/958/head
 x [deleted] (none) -> origin/pr/959/head
 x [deleted] (none) -> origin/pr/959/merge
 x [deleted] (none) -> origin/pr/96/head
 x [deleted] (none) -> origin/pr/96/merge
 x [deleted] (none) -> origin/pr/960/head
 x [deleted] (none) -> origin/pr/960/merge
 x [deleted] (none) -> origin/pr/961/head
 x [deleted] (none) -> origin/pr/962/head
 x [deleted] (none) -> origin/pr/962/merge
 x [deleted] (none) -> origin/pr/963/head
 x [deleted] (none) -> origin/pr/963/merge
 x [deleted] (none) -> origin/pr/964/head
 x [deleted] (none) -> origin/pr/965/head
 x [deleted] (none) -> origin/pr/965/merge
 x [deleted] (none) -> origin/pr/966/head
 x [deleted] (none) -> origin/pr/967/head
 x [deleted] (none) -> origin/pr/967/merge
 x [deleted] (none) -> origin/pr/968/head
 x [deleted] (none) -> origin/pr/968/merge
 x [deleted] (none) -> origin/pr/969/head
 x [deleted] (none) -> origin/pr/969/merge
 x [deleted] (none) -> origin/pr/97/head
 x [deleted] (none) -> origin/pr/97/merge
 x [deleted] (none) -> origin/pr/970/head
 x [deleted] (none) -> origin/pr/970/merge
 x [deleted] (none) -> origin/pr/971/head
 x [deleted] (none) -> origin/pr/971/merge
 x [deleted] (none) -> origin/pr/972/head
 x [deleted] (none) -> origin/pr/973/head
 x [deleted] (none) -> origin/pr/974/head
 x [deleted] (none) -> origin/pr/974/merge
 x [deleted] (none) -> origin/pr/975/head
 x [deleted] (none) -> origin/pr/975/merge
 x [deleted] (none) -> origin/pr/976/head
 x [deleted] (none) -> origin/pr/976/merge
 x [deleted] (none) -> origin/pr/977/head
 x [deleted] (none) -> origin/pr/977/merge
 x [deleted] (none) -> origin/pr/978/head
 x [deleted] (none) -> origin/pr/978/merge
 x [deleted] (none) -> origin/pr/979/head
 x [deleted] (none) -> origin/pr/979/merge
 x [deleted] (none) -> origin/pr/98/head
 x [deleted] (none) -> origin/pr/980/head
 x [deleted] (none) -> origin/pr/980/merge
 x [deleted] (none) -> origin/pr/981/head
 x [deleted] (none) -> origin/pr/982/head
 x [deleted] (none) -> origin/pr/982/merge
 x [deleted] (none) -> origin/pr/983/head
 x [deleted] (none) -> origin/pr/983/merge
 x [deleted] (none) -> origin/pr/984/head
 x [deleted] (none) -> origin/pr/984/merge
 x [deleted] (none) -> origin/pr/985/head
 x [deleted] (none) -> origin/pr/985/merge
 x [deleted] (none) -> origin/pr/986/head
 x [deleted] (none) -> origin/pr/986/merge
 x [deleted] (none) -> origin/pr/987/head
 x [deleted] (none) -> origin/pr/988/head
 x [deleted] (none) -> origin/pr/988/merge
 x [deleted] (none) -> 

[jira] [Commented] (BEAM-1837) NPE in KafkaIO writer

2017-03-30 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-1837?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15949820#comment-15949820
 ] 

ASF GitHub Bot commented on BEAM-1837:
--

Github user asfgit closed the pull request at:

https://github.com/apache/beam/pull/2369


> NPE in KafkaIO writer
> -
>
> Key: BEAM-1837
> URL: https://issues.apache.org/jira/browse/BEAM-1837
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-extensions
>Reporter: Raghu Angadi
>Assignee: Raghu Angadi
>
> {{KafkaIO.writer()...values()}} does not require the user to set a key coder since 
> the key is always null. Validation passes, but it results in an NPE at runtime 
> when the writer tries to instantiate the producer.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[1/2] beam git commit: Fix NPE in Kafka value writer.

2017-03-30 Thread jkff
Repository: beam
Updated Branches:
  refs/heads/master ffd87553f -> 66f249933


Fix NPE in Kafka value writer.

KafkaIO.writer()...values() does not require the user to set a key coder since the 
key is always null.
Validation passes, but it results in an NPE at runtime when the writer
tries to instantiate the producer. Set key coder to 'NullOnlyCoder'.
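The fix registers a coder whose only valid value is null, so the key serializer has something to delegate to. Conceptually it behaves like this minimal sketch ({{NullCoder}} here is illustrative, not Beam's actual {{NullOnlyCoder}} class):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.InputStream;
import java.io.OutputStream;

// Minimal sketch of a "null only" coder: safe to register as the key coder
// for values()-style writes, where every key is null, so nothing is encoded.
public class NullOnlyCoderSketch {
    static class NullCoder<T> {
        void encode(T value, OutputStream out) {
            if (value != null) {
                throw new IllegalArgumentException("expected null, got: " + value);
            }
            // Encodes to zero bytes: there is nothing to record.
        }

        T decode(InputStream in) {
            return null; // The only possible value.
        }
    }

    public static void main(String[] args) {
        NullCoder<String> coder = new NullCoder<>();
        coder.encode(null, new ByteArrayOutputStream()); // fine: writes nothing
        String decoded = coder.decode(new ByteArrayInputStream(new byte[0]));
        System.out.println("decoded: " + decoded);
    }
}
```

With such a coder in place, the producer's key serializer always has a coder to delegate to instead of dereferencing null at construction time.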


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/d0462f59
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/d0462f59
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/d0462f59

Branch: refs/heads/master
Commit: d0462f59548ebed0dd7ae744b138ff956b742cad
Parents: ffd8755
Author: Raghu Angadi 
Authored: Wed Mar 29 23:21:54 2017 -0700
Committer: Eugene Kirpichov 
Committed: Thu Mar 30 13:57:26 2017 -0700

--
 .../org/apache/beam/sdk/io/kafka/KafkaIO.java   | 22 +---
 .../apache/beam/sdk/io/kafka/KafkaIOTest.java   | 16 ++
 2 files changed, 26 insertions(+), 12 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/d0462f59/sdks/java/io/kafka/src/main/java/org/apache/beam/sdk/io/kafka/KafkaIO.java
--
diff --git 
a/sdks/java/io/kafka/src/main/java/org/apache/beam/sdk/io/kafka/KafkaIO.java 
b/sdks/java/io/kafka/src/main/java/org/apache/beam/sdk/io/kafka/KafkaIO.java
index 7880cbc..bb7d971 100644
--- a/sdks/java/io/kafka/src/main/java/org/apache/beam/sdk/io/kafka/KafkaIO.java
+++ b/sdks/java/io/kafka/src/main/java/org/apache/beam/sdk/io/kafka/KafkaIO.java
@@ -254,7 +254,6 @@ public class KafkaIO {
   public static  Write write() {
 return new AutoValue_KafkaIO_Write.Builder()
 .setProducerConfig(Write.DEFAULT_PRODUCER_PROPERTIES)
-.setValueOnly(false)
 .build();
   }
 
@@ -1159,7 +1158,6 @@ public class KafkaIO {
 @Nullable abstract String getTopic();
 @Nullable abstract Coder getKeyCoder();
 @Nullable abstract Coder getValueCoder();
-abstract boolean getValueOnly();
 abstract Map getProducerConfig();
 @Nullable
 abstract SerializableFunction, Producer> 
getProducerFactoryFn();
@@ -1171,7 +1169,6 @@ public class KafkaIO {
   abstract Builder setTopic(String topic);
   abstract Builder setKeyCoder(Coder keyCoder);
   abstract Builder setValueCoder(Coder valueCoder);
-  abstract Builder setValueOnly(boolean valueOnly);
   abstract Builder setProducerConfig(Map 
producerConfig);
   abstract Builder setProducerFactoryFn(
   SerializableFunction, Producer> fn);
@@ -1231,7 +1228,7 @@ public class KafkaIO {
  * collections of values rather thank {@link KV}s.
  */
 public PTransform values() {
-  return new KafkaValueWrite<>(toBuilder().setValueOnly(true).build());
+  return new KafkaValueWrite<>(withKeyCoder(new 
NullOnlyCoder()).toBuilder().build());
 }
 
 @Override
@@ -1245,9 +1242,7 @@ public class KafkaIO {
   
checkNotNull(getProducerConfig().get(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG),
   "Kafka bootstrap servers should be set");
   checkNotNull(getTopic(), "Kafka topic should be set");
-  if (!getValueOnly()) {
-checkNotNull(getKeyCoder(), "Key coder should be set");
-  }
+  checkNotNull(getKeyCoder(), "Key coder should be set");
   checkNotNull(getValueCoder(), "Value coder should be set");
 }
 
@@ -1255,11 +1250,12 @@ public class KafkaIO {
 private static final Map DEFAULT_PRODUCER_PROPERTIES =
 ImmutableMap.of(
 ProducerConfig.RETRIES_CONFIG, 3,
+// See comment about custom serializers in KafkaWriter constructor.
 ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, 
CoderBasedKafkaSerializer.class,
 ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, 
CoderBasedKafkaSerializer.class);
 
 /**
- * A set of properties that are not required or don't make sense for our 
consumer.
+ * A set of properties that are not required or don't make sense for our 
producer.
  */
 private static final Map IGNORED_PRODUCER_PROPERTIES = 
ImmutableMap.of(
 ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, "Set keyCoder instead",
@@ -1373,11 +1369,13 @@ public class KafkaIO {
 KafkaWriter(Write spec) {
   this.spec = spec;
 
-  // Set custom kafka serializers. We can not serialize user objects then 
pass the bytes to
-  // producer. The key and value objects are used in kafka Partitioner 
interface.
+  // Set custom kafka serializers. We do not want to serialize user 
objects 

[2/2] beam git commit: This closes #2369

2017-03-30 Thread jkff
This closes #2369


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/66f24993
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/66f24993
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/66f24993

Branch: refs/heads/master
Commit: 66f249933054d063f2d546c27674b2289a6d2002
Parents: ffd8755 d0462f5
Author: Eugene Kirpichov 
Authored: Thu Mar 30 13:58:07 2017 -0700
Committer: Eugene Kirpichov 
Committed: Thu Mar 30 13:58:07 2017 -0700

--
 .../org/apache/beam/sdk/io/kafka/KafkaIO.java   | 22 +---
 .../apache/beam/sdk/io/kafka/KafkaIOTest.java   | 16 ++
 2 files changed, 26 insertions(+), 12 deletions(-)
--



