[beam] branch master updated (0a1ec0c -> c02af60)

2019-05-10 Thread ccy
This is an automated email from the ASF dual-hosted git repository.

ccy pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git.


from 0a1ec0c  Merge pull request #8544 from angoenka/revert_utf8
 add c02af60  [BEAM-4858] Increase tolerance on linear regression tests (#8556)

No new revisions were added by this update.

Summary of changes:
 sdks/python/apache_beam/transforms/util_test.py | 9 +++++----
 1 file changed, 5 insertions(+), 4 deletions(-)
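
The diff itself is not included in this digest, but the change is a test-tolerance
adjustment. As a rough, hypothetical sketch of that kind of fix (not the actual
util_test.py code; names and numbers here are purely illustrative), loosening the
acceptance band on a fitted slope looks like this:

    # Hypothetical illustration only -- not the actual util_test.py change.
    # It shows the general technique of widening an assertion tolerance in a
    # regression-based test so that small amounts of noise do not fail the suite.
    import unittest


    def fit_slope(xs, ys):
      """Ordinary least-squares slope of ys against xs."""
      n = float(len(xs))
      mean_x = sum(xs) / n
      mean_y = sum(ys) / n
      cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
      var = sum((x - mean_x) ** 2 for x in xs)
      return cov / var


    class ToleranceSketchTest(unittest.TestCase):

      def test_slope_close_to_one(self):
        xs = [1, 2, 3, 4, 5]
        ys = [1.1, 2.0, 3.2, 3.9, 5.1]  # noisy, roughly linear data
        slope = fit_slope(xs, ys)
        # Raising delta (e.g. from 0.2 to 0.5) is the "increase tolerance" idea:
        # the assertion still checks linearity but tolerates more noise.
        self.assertAlmostEqual(slope, 1.0, delta=0.5)


    if __name__ == '__main__':
      unittest.main()

Widening the delta trades precision for stability, which is the usual remedy for
timing-sensitive regression assertions like the one this commit subject describes.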



[beam] branch master updated (117a2d6 -> 0a1ec0c)

2019-05-10 Thread goenka
This is an automated email from the ASF dual-hosted git repository.

goenka pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git.


from 117a2d6  Merge pull request #8533 from aaltay/errtinf
 new cc8bc6c  Revert "Merge pull request #8228: [BEAM-7008] adding UTF8 String coder to Java SDK ModelCoders"
 new 466038d  Revert "Merge pull request #8545 from ihji/BEAM-7260"
 new 0a1ec0c  Merge pull request #8544 from angoenka/revert_utf8

The 3 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 .../beam/model/fnexecution/v1/standard_coders.yaml | 24 ---
 .../core/construction/ModelCoderRegistrar.java |  3 -
 .../runners/core/construction/ModelCoders.java |  2 -
 .../core/construction/CoderTranslationTest.java|  1 -
 .../runners/core/construction/CommonCoderTest.java |  9 +--
 .../streaming/ExecutableStageDoFnOperator.java |  3 +-
 .../streaming/ExecutableStageDoFnOperatorTest.java |  2 +-
 .../worker/graph/LengthPrefixUnknownCoders.java|  3 +-
 .../worker/fn/control/TimerReceiverTest.java   |  4 +-
 .../graph/LengthPrefixUnknownCodersTest.java   | 13 ++--
 .../fnexecution/control/RemoteExecutionTest.java   | 80 ++
 11 files changed, 65 insertions(+), 79 deletions(-)



[beam] branch master updated: Downgrade missing coder error logs to info logs.

2019-05-10 Thread goenka
This is an automated email from the ASF dual-hosted git repository.

goenka pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git


The following commit(s) were added to refs/heads/master by this push:
 new 2650c52  Downgrade missing coder error logs to info logs.
 new 117a2d6  Merge pull request #8533 from aaltay/errtinf
2650c52 is described below

commit 2650c5282833ce146df23ee488c7099a96676106
Author: Ahmet Altay 
AuthorDate: Wed May 8 15:05:22 2019 -0700

Downgrade missing coder error logs to info logs.
---
 sdks/python/apache_beam/runners/worker/bundle_processor.py | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/sdks/python/apache_beam/runners/worker/bundle_processor.py b/sdks/python/apache_beam/runners/worker/bundle_processor.py
index 24ac478..b04dabf 100644
--- a/sdks/python/apache_beam/runners/worker/bundle_processor.py
+++ b/sdks/python/apache_beam/runners/worker/bundle_processor.py
@@ -870,7 +870,7 @@ def create(factory, transform_id, transform_proto, grpc_port, consumers):
   if grpc_port.coder_id:
     output_coder = factory.get_coder(grpc_port.coder_id)
   else:
-    logging.error(
+    logging.info(
         'Missing required coder_id on grpc_port for %s; '
         'using deprecated fallback.',
         transform_id)
@@ -895,7 +895,7 @@ def create(factory, transform_id, transform_proto, grpc_port, consumers):
   if grpc_port.coder_id:
     output_coder = factory.get_coder(grpc_port.coder_id)
   else:
-    logging.error(
+    logging.info(
         'Missing required coder_id on grpc_port for %s; '
         'using deprecated fallback.',
         transform_id)



[beam] branch release-2.13.0 updated (1e8f052 -> b95e8d9)

2019-05-10 Thread iemejia
This is an automated email from the ASF dual-hosted git repository.

iemejia pushed a change to branch release-2.13.0
in repository https://gitbox.apache.org/repos/asf/beam.git.


from 1e8f052  Create release branch for version 2.13.0.
 add e3ca830  [BEAM-7145] Make FlinkRunner compatible with Flink 1.8
 new b95e8d9  Merge pull request #8549: [BEAM-7145] Make FlinkRunner compatible with Flink 1.8

The 1 revision listed above as "new" is entirely new to this
repository and will be described in a separate email.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 runners/flink/{ => 1.5}/build.gradle   | 14 ++--
 .../{1.7 => 1.5}/job-server-container/build.gradle |  0
 runners/flink/{1.7 => 1.5}/job-server/build.gradle |  0
 .../translation/types/CoderTypeSerializer.java |  2 +-
 .../translation/types/EncodedValueSerializer.java  |  0
 .../FlinkBroadcastStateInternalsTest.java  |  0
 .../flink/streaming/FlinkStateInternalsTest.java   |  0
 .../translation/types/CoderTypeSerializerTest.java |  0
 runners/flink/1.6/build.gradle |  8 +--
 runners/flink/1.7/build.gradle |  8 +--
 runners/flink/{1.6 => 1.8}/build.gradle|  8 +--
 .../{1.7 => 1.8}/job-server-container/build.gradle |  0
 runners/flink/{1.7 => 1.8}/job-server/build.gradle |  0
 .../translation/types/CoderTypeSerializer.java | 79 +-
 .../translation/types/EncodedValueSerializer.java  | 31 -
 .../FlinkBroadcastStateInternalsTest.java  |  4 +-
 .../flink/streaming/FlinkStateInternalsTest.java   |  9 ++-
 .../translation/types/CoderTypeSerializerTest.java | 25 ---
 runners/flink/flink_runner.gradle  |  4 ++
 runners/flink/job-server-container/build.gradle| 24 ---
 runners/flink/job-server/build.gradle  | 30 
 .../runners/flink/FlinkExecutionEnvironments.java  | 10 +--
 .../beam/runners/flink/FlinkSavepointTest.java | 24 +--
 .../src/main/resources/beam/suppressions.xml   |  6 ++
 settings.gradle| 19 --
 25 files changed, 135 insertions(+), 170 deletions(-)
 rename runners/flink/{ => 1.5}/build.gradle (73%)
 copy runners/flink/{1.7 => 1.5}/job-server-container/build.gradle (100%)
 copy runners/flink/{1.7 => 1.5}/job-server/build.gradle (100%)
 copy runners/flink/{ => 1.5}/src/main/java/org/apache/beam/runners/flink/translation/types/CoderTypeSerializer.java (99%)
 copy runners/flink/{ => 1.5}/src/main/java/org/apache/beam/runners/flink/translation/types/EncodedValueSerializer.java (100%)
 copy runners/flink/{ => 1.5}/src/test/java/org/apache/beam/runners/flink/streaming/FlinkBroadcastStateInternalsTest.java (100%)
 copy runners/flink/{ => 1.5}/src/test/java/org/apache/beam/runners/flink/streaming/FlinkStateInternalsTest.java (100%)
 copy runners/flink/{ => 1.5}/src/test/java/org/apache/beam/runners/flink/translation/types/CoderTypeSerializerTest.java (100%)
 copy runners/flink/{1.6 => 1.8}/build.gradle (84%)
 copy runners/flink/{1.7 => 1.8}/job-server-container/build.gradle (100%)
 copy runners/flink/{1.7 => 1.8}/job-server/build.gradle (100%)
 rename runners/flink/{ => 1.8}/src/main/java/org/apache/beam/runners/flink/translation/types/CoderTypeSerializer.java (65%)
 rename runners/flink/{ => 1.8}/src/main/java/org/apache/beam/runners/flink/translation/types/EncodedValueSerializer.java (71%)
 rename runners/flink/{ => 1.8}/src/test/java/org/apache/beam/runners/flink/streaming/FlinkBroadcastStateInternalsTest.java (94%)
 rename runners/flink/{ => 1.8}/src/test/java/org/apache/beam/runners/flink/streaming/FlinkStateInternalsTest.java (95%)
 rename runners/flink/{ => 1.8}/src/test/java/org/apache/beam/runners/flink/translation/types/CoderTypeSerializerTest.java (73%)
 delete mode 100644 runners/flink/job-server-container/build.gradle
 delete mode 100644 runners/flink/job-server/build.gradle



[beam] branch asf-site updated: Publishing website 2019/05/10 19:52:48 at commit f310234

2019-05-10 Thread git-site-role
This is an automated email from the ASF dual-hosted git repository.

git-site-role pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/beam.git


The following commit(s) were added to refs/heads/asf-site by this push:
 new 270649f  Publishing website 2019/05/10 19:52:48 at commit f310234
270649f is described below

commit 270649f7733963a39e180fa12d574d384d3ebfd7
Author: jenkins 
AuthorDate: Fri May 10 19:52:48 2019 +

Publishing website 2019/05/10 19:52:48 at commit f310234
---
 website/generated-content/contribute/release-guide/index.html | 4 ++++
 1 file changed, 4 insertions(+)

diff --git a/website/generated-content/contribute/release-guide/index.html b/website/generated-content/contribute/release-guide/index.html
index 2ac1b71..26895f1 100644
--- a/website/generated-content/contribute/release-guide/index.html
+++ b/website/generated-content/contribute/release-guide/index.html
@@ -817,6 +817,10 @@ so builds will be broken until a new snapshot is available.
  sudo pip install cython
  sudo apt-get install gcc
  sudo apt-get install python-dev
+ sudo apt-get install python3-dev
+ sudo apt-get install python3.5-dev
+ sudo apt-get install python3.6-dev
+ sudo apt-get install python3.7-dev
 
 
   



[beam] 01/01: Merge pull request #8549: [BEAM-7145] Make FlinkRunner compatible with Flink 1.8

2019-05-10 Thread iemejia
This is an automated email from the ASF dual-hosted git repository.

iemejia pushed a commit to branch release-2.13.0
in repository https://gitbox.apache.org/repos/asf/beam.git

commit b95e8d9591a06eba8759ad2b3fbd90e1a178d5ad
Merge: 1e8f052 e3ca830
Author: Ismaël Mejía 
AuthorDate: Fri May 10 21:55:16 2019 +0200

Merge pull request #8549: [BEAM-7145] Make FlinkRunner compatible with Flink 1.8

 runners/flink/{ => 1.5}/build.gradle   | 14 ++--
 .../{ => 1.5}/job-server-container/build.gradle|  6 +-
 runners/flink/{ => 1.5}/job-server/build.gradle|  2 +-
 .../translation/types/CoderTypeSerializer.java |  2 +-
 .../translation/types/EncodedValueSerializer.java  |  0
 .../FlinkBroadcastStateInternalsTest.java  |  0
 .../flink/streaming/FlinkStateInternalsTest.java   |  0
 .../translation/types/CoderTypeSerializerTest.java |  0
 runners/flink/1.6/build.gradle |  8 +--
 runners/flink/1.7/build.gradle |  8 +--
 runners/flink/{1.6 => 1.8}/build.gradle|  8 +--
 .../{ => 1.8}/job-server-container/build.gradle|  6 +-
 runners/flink/{ => 1.8}/job-server/build.gradle|  2 +-
 .../translation/types/CoderTypeSerializer.java | 79 +-
 .../translation/types/EncodedValueSerializer.java  | 31 -
 .../FlinkBroadcastStateInternalsTest.java  |  4 +-
 .../flink/streaming/FlinkStateInternalsTest.java   |  9 ++-
 .../translation/types/CoderTypeSerializerTest.java | 25 ---
 runners/flink/flink_runner.gradle  |  4 ++
 .../runners/flink/FlinkExecutionEnvironments.java  | 10 +--
 .../beam/runners/flink/FlinkSavepointTest.java | 24 +--
 .../src/main/resources/beam/suppressions.xml   |  6 ++
 settings.gradle| 19 --
 23 files changed, 145 insertions(+), 122 deletions(-)



[beam] branch master updated: [BEAM-7267] Add python3*-dev to verify script

2019-05-10 Thread goenka
This is an automated email from the ASF dual-hosted git repository.

goenka pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git


The following commit(s) were added to refs/heads/master by this push:
 new 8d59768  [BEAM-7267] Add python3*-dev to verify script
 new f310234  Merge pull request #8551 from angoenka/fix_verify_scipt
8d59768 is described below

commit 8d597685a7b20f1bd2188aea85f4feb72a0a42c5
Author: Ankur Goenka 
AuthorDate: Fri May 10 10:52:17 2019 -0700

[BEAM-7267] Add python3*-dev to verify script
---
 release/src/main/scripts/verify_release_build.sh | 4 ++++
 website/src/contribute/release-guide.md          | 4 ++++
 2 files changed, 8 insertions(+)

diff --git a/release/src/main/scripts/verify_release_build.sh b/release/src/main/scripts/verify_release_build.sh
index d8164cf..42a47d1 100755
--- a/release/src/main/scripts/verify_release_build.sh
+++ b/release/src/main/scripts/verify_release_build.sh
@@ -80,6 +80,10 @@ if [[ -z `which cython` ]]; then
 sudo `which pip` install cython
 sudo apt-get install gcc
 sudo apt-get install python-dev
+sudo apt-get install python3-dev
+sudo apt-get install python3.5-dev
+sudo apt-get install python3.6-dev
+sudo apt-get install python3.7-dev
   fi
 else
   cython --version
diff --git a/website/src/contribute/release-guide.md b/website/src/contribute/release-guide.md
index 5de5eaf..0a7401c 100644
--- a/website/src/contribute/release-guide.md
+++ b/website/src/contribute/release-guide.md
@@ -360,6 +360,10 @@ There are 2 ways to perform this verification, either running automation script(
   sudo pip install cython
   sudo apt-get install gcc
   sudo apt-get install python-dev
+  sudo apt-get install python3-dev
+  sudo apt-get install python3.5-dev
+  sudo apt-get install python3.6-dev
+  sudo apt-get install python3.7-dev
   ```
   1. Make sure your ```time``` alias to ```/usr/bin/time```, if not:
 



[beam] branch master updated: [BEAM-7260] UTF8 coder is breaking dataflow tests

2019-05-10 Thread goenka
This is an automated email from the ASF dual-hosted git repository.

goenka pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git


The following commit(s) were added to refs/heads/master by this push:
 new c4dbe12  [BEAM-7260] UTF8 coder is breaking dataflow tests
 new da4bb49  Merge pull request #8545 from ihji/BEAM-7260
c4dbe12 is described below

commit c4dbe12a09e93d0e91d2c7c3bcd3e9bb9117310c
Author: Heejong Lee 
AuthorDate: Thu May 9 17:22:38 2019 -0700

[BEAM-7260] UTF8 coder is breaking dataflow tests

StringUtf8Coder is now ModelCoder in JavaSDK. We need to add it to
WELL_KNOWN_CODER_TYPES for dataflow worker harness as well.
---
 .../dataflow/worker/graph/LengthPrefixUnknownCoders.java|  3 ++-
 .../worker/graph/LengthPrefixUnknownCodersTest.java | 13 +
 2 files changed, 7 insertions(+), 9 deletions(-)

diff --git a/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/graph/LengthPrefixUnknownCoders.java b/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/graph/LengthPrefixUnknownCoders.java
index d4b9e6c..1537288 100644
--- a/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/graph/LengthPrefixUnknownCoders.java
+++ b/runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/graph/LengthPrefixUnknownCoders.java
@@ -72,7 +72,8 @@ public class LengthPrefixUnknownCoders {
   "kind:fixed_big_endian_int64",
   "kind:var_int32",
   "kind:void",
-  "org.apache.beam.sdk.coders.DoubleCoder");
+  "org.apache.beam.sdk.coders.DoubleCoder",
+  "org.apache.beam.sdk.coders.StringUtf8Coder");
 
   private static final String LENGTH_PREFIX_CODER_TYPE = "kind:length_prefix";
 
diff --git a/runners/google-cloud-dataflow-java/worker/src/test/java/org/apache/beam/runners/dataflow/worker/graph/LengthPrefixUnknownCodersTest.java b/runners/google-cloud-dataflow-java/worker/src/test/java/org/apache/beam/runners/dataflow/worker/graph/LengthPrefixUnknownCodersTest.java
index f73538a..679a292 100644
--- a/runners/google-cloud-dataflow-java/worker/src/test/java/org/apache/beam/runners/dataflow/worker/graph/LengthPrefixUnknownCodersTest.java
+++ b/runners/google-cloud-dataflow-java/worker/src/test/java/org/apache/beam/runners/dataflow/worker/graph/LengthPrefixUnknownCodersTest.java
@@ -77,14 +77,13 @@ public class LengthPrefixUnknownCodersTest {
 
   private static final Coder>> prefixedWindowedValueCoder =
       WindowedValue.getFullCoder(
-          KvCoder.of(
-              LengthPrefixCoder.of(StringUtf8Coder.of()), LengthPrefixCoder.of(VarIntCoder.of())),
+          KvCoder.of(StringUtf8Coder.of(), LengthPrefixCoder.of(VarIntCoder.of())),
           GlobalWindow.Coder.INSTANCE);
 
-  private static final Coder>>
+  private static final Coder>>
       prefixedAndReplacedWindowedValueCoder =
           WindowedValue.getFullCoder(
-              KvCoder.of(LENGTH_PREFIXED_BYTE_ARRAY_CODER, LENGTH_PREFIXED_BYTE_ARRAY_CODER),
+              KvCoder.of(StringUtf8Coder.of(), LENGTH_PREFIXED_BYTE_ARRAY_CODER),
               GlobalWindow.Coder.INSTANCE);
 
   private static final String MERGE_BUCKETS_DO_FN = "MergeBucketsDoFn";
@@ -124,8 +123,7 @@ public class LengthPrefixUnknownCodersTest {
 
     Coder>> expectedCoder =
         WindowedValue.getFullCoder(
-            KvCoder.of(
-                LengthPrefixCoder.of(StringUtf8Coder.of()), LengthPrefixCoder.of(VarIntCoder.of())),
+            KvCoder.of(StringUtf8Coder.of(), LengthPrefixCoder.of(VarIntCoder.of())),
             GlobalWindow.Coder.INSTANCE);
 
     assertEquals(
@@ -138,8 +136,7 @@ public class LengthPrefixUnknownCodersTest {
   public void testLengthPrefixAndReplaceUnknownCoder() throws Exception {
     Coder>> windowedValueCoder =
         WindowedValue.getFullCoder(
-            KvCoder.of(LengthPrefixCoder.of(StringUtf8Coder.of()), VarIntCoder.of()),
-            GlobalWindow.Coder.INSTANCE);
+            KvCoder.of(StringUtf8Coder.of(), VarIntCoder.of()), GlobalWindow.Coder.INSTANCE);
 
     Map lengthPrefixedCoderCloudObject =
         forCodec(CloudObjects.asCloudObject(windowedValueCoder, /*sdkComponents=*/ null), true);



[beam] branch master updated (6679b00 -> 01e9878)

2019-05-10 Thread iemejia
This is an automated email from the ASF dual-hosted git repository.

iemejia pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git.


from 6679b00  Merge pull request #8548: [BEAM-7265] Update Spark runner to use spark version 2.4.3
 add 40da292  [BEAM-7145] Make FlinkRunner compatible with Flink 1.8
 new 01e9878  Merge pull request #8540: [BEAM-7145] Make FlinkRunner compatible with Flink 1.8

The 1 revision listed above as "new" is entirely new to this
repository and will be described in a separate email.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 runners/flink/{ => 1.5}/build.gradle   | 14 ++--
 .../{1.7 => 1.5}/job-server-container/build.gradle |  0
 runners/flink/{1.7 => 1.5}/job-server/build.gradle |  0
 .../translation/types/CoderTypeSerializer.java |  2 +-
 .../translation/types/EncodedValueSerializer.java  |  0
 .../FlinkBroadcastStateInternalsTest.java  |  0
 .../flink/streaming/FlinkStateInternalsTest.java   |  0
 .../translation/types/CoderTypeSerializerTest.java |  0
 runners/flink/1.6/build.gradle |  8 +--
 runners/flink/1.7/build.gradle |  8 +--
 runners/flink/{1.6 => 1.8}/build.gradle|  8 +--
 .../{1.7 => 1.8}/job-server-container/build.gradle |  0
 runners/flink/{1.7 => 1.8}/job-server/build.gradle |  0
 .../translation/types/CoderTypeSerializer.java | 79 +-
 .../translation/types/EncodedValueSerializer.java  | 31 -
 .../FlinkBroadcastStateInternalsTest.java  |  4 +-
 .../flink/streaming/FlinkStateInternalsTest.java   |  9 ++-
 .../translation/types/CoderTypeSerializerTest.java | 25 ---
 runners/flink/flink_runner.gradle  |  4 ++
 runners/flink/job-server-container/build.gradle| 24 ---
 runners/flink/job-server/build.gradle  | 30 
 .../runners/flink/FlinkExecutionEnvironments.java  | 10 +--
 .../beam/runners/flink/FlinkSavepointTest.java | 24 +--
 .../src/main/resources/beam/suppressions.xml   |  6 ++
 settings.gradle| 19 --
 25 files changed, 135 insertions(+), 170 deletions(-)
 rename runners/flink/{ => 1.5}/build.gradle (73%)
 copy runners/flink/{1.7 => 1.5}/job-server-container/build.gradle (100%)
 copy runners/flink/{1.7 => 1.5}/job-server/build.gradle (100%)
 copy runners/flink/{ => 1.5}/src/main/java/org/apache/beam/runners/flink/translation/types/CoderTypeSerializer.java (99%)
 copy runners/flink/{ => 1.5}/src/main/java/org/apache/beam/runners/flink/translation/types/EncodedValueSerializer.java (100%)
 copy runners/flink/{ => 1.5}/src/test/java/org/apache/beam/runners/flink/streaming/FlinkBroadcastStateInternalsTest.java (100%)
 copy runners/flink/{ => 1.5}/src/test/java/org/apache/beam/runners/flink/streaming/FlinkStateInternalsTest.java (100%)
 copy runners/flink/{ => 1.5}/src/test/java/org/apache/beam/runners/flink/translation/types/CoderTypeSerializerTest.java (100%)
 copy runners/flink/{1.6 => 1.8}/build.gradle (84%)
 copy runners/flink/{1.7 => 1.8}/job-server-container/build.gradle (100%)
 copy runners/flink/{1.7 => 1.8}/job-server/build.gradle (100%)
 rename runners/flink/{ => 1.8}/src/main/java/org/apache/beam/runners/flink/translation/types/CoderTypeSerializer.java (65%)
 rename runners/flink/{ => 1.8}/src/main/java/org/apache/beam/runners/flink/translation/types/EncodedValueSerializer.java (71%)
 rename runners/flink/{ => 1.8}/src/test/java/org/apache/beam/runners/flink/streaming/FlinkBroadcastStateInternalsTest.java (94%)
 rename runners/flink/{ => 1.8}/src/test/java/org/apache/beam/runners/flink/streaming/FlinkStateInternalsTest.java (95%)
 rename runners/flink/{ => 1.8}/src/test/java/org/apache/beam/runners/flink/translation/types/CoderTypeSerializerTest.java (73%)
 delete mode 100644 runners/flink/job-server-container/build.gradle
 delete mode 100644 runners/flink/job-server/build.gradle



[beam] 01/01: Merge pull request #8540: [BEAM-7145] Make FlinkRunner compatible with Flink 1.8

2019-05-10 Thread iemejia
This is an automated email from the ASF dual-hosted git repository.

iemejia pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git

commit 01e9878986da31bbba5c9b8dd6749ffdc08718a1
Merge: 6679b00 40da292
Author: Ismaël Mejía 
AuthorDate: Fri May 10 16:16:29 2019 +0200

Merge pull request #8540: [BEAM-7145] Make FlinkRunner compatible with Flink 1.8

 runners/flink/{ => 1.5}/build.gradle   | 14 ++--
 .../{ => 1.5}/job-server-container/build.gradle|  6 +-
 runners/flink/{ => 1.5}/job-server/build.gradle|  2 +-
 .../translation/types/CoderTypeSerializer.java |  2 +-
 .../translation/types/EncodedValueSerializer.java  |  0
 .../FlinkBroadcastStateInternalsTest.java  |  0
 .../flink/streaming/FlinkStateInternalsTest.java   |  0
 .../translation/types/CoderTypeSerializerTest.java |  0
 runners/flink/1.6/build.gradle |  8 +--
 runners/flink/1.7/build.gradle |  8 +--
 runners/flink/{1.6 => 1.8}/build.gradle|  8 +--
 .../{ => 1.8}/job-server-container/build.gradle|  6 +-
 runners/flink/{ => 1.8}/job-server/build.gradle|  2 +-
 .../translation/types/CoderTypeSerializer.java | 79 +-
 .../translation/types/EncodedValueSerializer.java  | 31 -
 .../FlinkBroadcastStateInternalsTest.java  |  4 +-
 .../flink/streaming/FlinkStateInternalsTest.java   |  9 ++-
 .../translation/types/CoderTypeSerializerTest.java | 25 ---
 runners/flink/flink_runner.gradle  |  4 ++
 .../runners/flink/FlinkExecutionEnvironments.java  | 10 +--
 .../beam/runners/flink/FlinkSavepointTest.java | 24 +--
 .../src/main/resources/beam/suppressions.xml   |  6 ++
 settings.gradle| 19 --
 23 files changed, 145 insertions(+), 122 deletions(-)




[beam] branch spark-runner_structured-streaming updated: Consider null object case on RowHelpers, fixes empty side inputs tests.

2019-05-10 Thread iemejia
This is an automated email from the ASF dual-hosted git repository.

iemejia pushed a commit to branch spark-runner_structured-streaming
in repository https://gitbox.apache.org/repos/asf/beam.git


The following commit(s) were added to refs/heads/spark-runner_structured-streaming by this push:
 new 728aa1f  Consider null object case on RowHelpers, fixes empty side inputs tests.
728aa1f is described below

commit 728aa1f7d5988acfa87daad29da4fafd19eb0455
Author: Ismaël Mejía 
AuthorDate: Fri May 10 12:18:08 2019 +0200

Consider null object case on RowHelpers, fixes empty side inputs tests.
---
 .../spark/structuredstreaming/translation/helpers/RowHelpers.java  | 3 +++
 1 file changed, 3 insertions(+)

diff --git a/runners/spark/src/main/java/org/apache/beam/runners/spark/structuredstreaming/translation/helpers/RowHelpers.java b/runners/spark/src/main/java/org/apache/beam/runners/spark/structuredstreaming/translation/helpers/RowHelpers.java
index ca88abe..da5cc96 100644
--- a/runners/spark/src/main/java/org/apache/beam/runners/spark/structuredstreaming/translation/helpers/RowHelpers.java
+++ b/runners/spark/src/main/java/org/apache/beam/runners/spark/structuredstreaming/translation/helpers/RowHelpers.java
@@ -93,6 +93,9 @@ public final class RowHelpers {
   public static  T extractObjectFromRow(Row value) {
     // there is only one value put in each Row by the InputPartitionReader
     byte[] bytes = (byte[]) value.get(0);
+    if (bytes == null) {
+      return null;
+    }
     ByteArrayInputStream inputStream = new ByteArrayInputStream(bytes);
     Kryo kryo = new Kryo();
     Input input = new Input(inputStream);



[beam] branch spark-runner_structured-streaming updated (5cf3e1b -> 1d9155d)

2019-05-10 Thread iemejia
This is an automated email from the ASF dual-hosted git repository.

iemejia pushed a change to branch spark-runner_structured-streaming
in repository https://gitbox.apache.org/repos/asf/beam.git.


from 5cf3e1b  fixup Enable UsesFailureMessage category of tests
 new ad22b68  Pass transform based doFnSchemaInformation in ParDo translation
 new be79a86  Enable UsesSchema tests on ValidatesRunner
 new 1d9155d  fixup hadoop-format is not mandataory to run ValidatesRunner tests

The 3 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 runners/spark/build.gradle  | 6 +++---
 .../structuredstreaming/translation/batch/ParDoTranslatorBatch.java | 5 -
 2 files changed, 7 insertions(+), 4 deletions(-)



[beam] 03/03: fixup hadoop-format is not mandataory to run ValidatesRunner tests

2019-05-10 Thread iemejia
This is an automated email from the ASF dual-hosted git repository.

iemejia pushed a commit to branch spark-runner_structured-streaming
in repository https://gitbox.apache.org/repos/asf/beam.git

commit 1d9155d6b66a506ef79a057df94188c095911836
Author: Ismaël Mejía 
AuthorDate: Fri May 10 11:36:23 2019 +0200

fixup hadoop-format is not mandataory to run ValidatesRunner tests
---
 runners/spark/build.gradle | 4 +++-
 1 file changed, 3 insertions(+), 1 deletion(-)

diff --git a/runners/spark/build.gradle b/runners/spark/build.gradle
index 46e5c25..9619c5e 100644
--- a/runners/spark/build.gradle
+++ b/runners/spark/build.gradle
@@ -178,7 +178,9 @@ task validatesStructuredStreamingRunnerBatch(type: Test) {
   systemProperty "spark.ui.showConsoleProgress", "false"
 
   classpath = configurations.validatesRunner
-  testClassesDirs = files(project(":beam-sdks-java-core").sourceSets.test.output.classesDirs) + files(project(":beam-sdks-java-io-hadoop-format").sourceSets.test.output.classesDirs) + files(project.sourceSets.test.output.classesDirs)
+  testClassesDirs += files(project(":beam-sdks-java-core").sourceSets.test.output.classesDirs)
+  testClassesDirs += files(project.sourceSets.test.output.classesDirs)
+
   // Only one SparkContext may be running in a JVM (SPARK-2243)
   forkEvery 1
   maxParallelForks 4



[beam] 01/03: Pass transform based doFnSchemaInformation in ParDo translation

2019-05-10 Thread iemejia
This is an automated email from the ASF dual-hosted git repository.

iemejia pushed a commit to branch spark-runner_structured-streaming
in repository https://gitbox.apache.org/repos/asf/beam.git

commit ad22b6828cf7b78945e10cd60362c27a0447e66a
Author: Ismaël Mejía 
AuthorDate: Fri May 10 11:44:53 2019 +0200

Pass transform based doFnSchemaInformation in ParDo translation
---
 .../structuredstreaming/translation/batch/ParDoTranslatorBatch.java  | 5 -
 1 file changed, 4 insertions(+), 1 deletion(-)

diff --git a/runners/spark/src/main/java/org/apache/beam/runners/spark/structuredstreaming/translation/batch/ParDoTranslatorBatch.java b/runners/spark/src/main/java/org/apache/beam/runners/spark/structuredstreaming/translation/batch/ParDoTranslatorBatch.java
index b16d7e9..400b025 100644
--- a/runners/spark/src/main/java/org/apache/beam/runners/spark/structuredstreaming/translation/batch/ParDoTranslatorBatch.java
+++ b/runners/spark/src/main/java/org/apache/beam/runners/spark/structuredstreaming/translation/batch/ParDoTranslatorBatch.java
@@ -77,6 +77,9 @@ class ParDoTranslatorBatch
         signature.stateDeclarations().size() > 0 || signature.timerDeclarations().size() > 0;
     checkState(!stateful, "States and timers are not supported for the moment.");
 
+    DoFnSchemaInformation doFnSchemaInformation =
+        ParDoTranslation.getSchemaInformation(context.getCurrentTransform());
+
     // Init main variables
     Dataset> inputDataSet = context.getDataset(context.getInput());
     Map, PValue> outputs = context.getOutputs();
@@ -110,7 +113,7 @@ class ParDoTranslatorBatch
             inputCoder,
             outputCoderMap,
             broadcastStateData,
-            DoFnSchemaInformation.create());
+            doFnSchemaInformation);
 
     Dataset, WindowedValue>> allOutputs =
         inputDataSet.mapPartitions(doFnWrapper, EncoderHelpers.tuple2Encoder());



[beam] 02/03: Enable UsesSchema tests on ValidatesRunner

2019-05-10 Thread iemejia
This is an automated email from the ASF dual-hosted git repository.

iemejia pushed a commit to branch spark-runner_structured-streaming
in repository https://gitbox.apache.org/repos/asf/beam.git

commit be79a86181f91d3e9b201bc3eafd412833a8cc72
Author: Ismaël Mejía 
AuthorDate: Fri May 10 11:45:59 2019 +0200

Enable UsesSchema tests on ValidatesRunner
---
 runners/spark/build.gradle | 2 --
 1 file changed, 2 deletions(-)

diff --git a/runners/spark/build.gradle b/runners/spark/build.gradle
index 02dfac7..46e5c25 100644
--- a/runners/spark/build.gradle
+++ b/runners/spark/build.gradle
@@ -207,8 +207,6 @@ task validatesStructuredStreamingRunnerBatch(type: Test) {
 // Portability
 excludeCategories 'org.apache.beam.sdk.testing.UsesImpulse'
 excludeCategories 'org.apache.beam.sdk.testing.UsesCrossLanguageTransforms'
-// Schema
-excludeCategories 'org.apache.beam.sdk.testing.UsesSchema'
   }
 }
 



[beam] branch master updated (765fe3b -> 6679b00)

2019-05-10 Thread aromanenko
This is an automated email from the ASF dual-hosted git repository.

aromanenko pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git.


from 765fe3b  Merge pull request #8547 from iemejia/BEAM-7263-jdbc-deprecations
 add ac48aab  [BEAM-7265] Update Spark runner to use spark version 2.4.3
 new 6679b00  Merge pull request #8548: [BEAM-7265] Update Spark runner to use spark version 2.4.3

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 buildSrc/src/main/groovy/org/apache/beam/gradle/BeamModulePlugin.groovy | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)



[beam] 01/01: Merge pull request #8548: [BEAM-7265] Update Spark runner to use spark version 2.4.3

2019-05-10 Thread aromanenko
This is an automated email from the ASF dual-hosted git repository.

aromanenko pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git

commit 6679b00138a5b82a6a55e7bc94c453957cea501c
Merge: 765fe3b ac48aab
Author: Alexey Romanenko <33895511+aromanenko-...@users.noreply.github.com>
AuthorDate: Fri May 10 11:40:27 2019 +0200

Merge pull request #8548: [BEAM-7265] Update Spark runner to use spark version 2.4.3

 buildSrc/src/main/groovy/org/apache/beam/gradle/BeamModulePlugin.groovy | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)



[beam] branch master updated: [BEAM-7263] Deprecate set/getClientConfiguration in JdbcIO

2019-05-10 Thread jbonofre
This is an automated email from the ASF dual-hosted git repository.

jbonofre pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git


The following commit(s) were added to refs/heads/master by this push:
 new 7d8447e  [BEAM-7263] Deprecate set/getClientConfiguration in JdbcIO
 new 765fe3b  Merge pull request #8547 from iemejia/BEAM-7263-jdbc-deprecations
7d8447e is described below

commit 7d8447eaa73ace1215332642f5f88fc1c846475e
Author: Ismaël Mejía 
AuthorDate: Fri May 10 09:50:38 2019 +0200

[BEAM-7263] Deprecate set/getClientConfiguration in JdbcIO
---
 .../src/main/java/org/apache/beam/sdk/io/jdbc/JdbcIO.java| 12 
 1 file changed, 12 insertions(+)

diff --git a/sdks/java/io/jdbc/src/main/java/org/apache/beam/sdk/io/jdbc/JdbcIO.java b/sdks/java/io/jdbc/src/main/java/org/apache/beam/sdk/io/jdbc/JdbcIO.java
index a14a742..7a5cdc3 100644
--- a/sdks/java/io/jdbc/src/main/java/org/apache/beam/sdk/io/jdbc/JdbcIO.java
+++ b/sdks/java/io/jdbc/src/main/java/org/apache/beam/sdk/io/jdbc/JdbcIO.java
@@ -429,6 +429,8 @@ public class JdbcIO {
   /** Implementation of {@link #read}. */
   @AutoValue
   public abstract static class Read extends PTransform> {
+    /** @deprecated It is not needed anymore. It will be removed in a future version of Beam. */
+    @Deprecated
     @Nullable
     abstract DataSourceConfiguration getDataSourceConfiguration();
 
@@ -455,6 +457,8 @@ public class JdbcIO {
 
     @AutoValue.Builder
     abstract static class Builder {
+      /** @deprecated It is not needed anymore. It will be removed in a future version of Beam. */
+      @Deprecated
       abstract Builder setDataSourceConfiguration(DataSourceConfiguration config);
 
       abstract Builder setDataSourceProviderFn(
@@ -572,6 +576,8 @@ public class JdbcIO {
   @AutoValue
   public abstract static class ReadAll
       extends PTransform, PCollection> {
+    /** @deprecated It is not needed anymore. It will be removed in a future version of Beam. */
+    @Deprecated
     @Nullable
     abstract DataSourceConfiguration getDataSourceConfiguration();
 
@@ -598,6 +604,8 @@ public class JdbcIO {
 
     @AutoValue.Builder
     abstract static class Builder {
+      /** @deprecated It is not needed anymore. It will be removed in a future version of Beam. */
+      @Deprecated
       abstract Builder setDataSourceConfiguration(
           DataSourceConfiguration config);
 
@@ -870,6 +878,8 @@ public class JdbcIO {
   /** A {@link PTransform} to write to a JDBC datasource. */
   @AutoValue
   public abstract static class WriteVoid extends PTransform, PCollection> {
+    /** @deprecated It is not needed anymore. It will be removed in a future version of Beam. */
+    @Deprecated
     @Nullable
     abstract DataSourceConfiguration getDataSourceConfiguration();
 
@@ -891,6 +901,8 @@ public class JdbcIO {
 
     @AutoValue.Builder
     abstract static class Builder {
+      /** @deprecated It is not needed anymore. It will be removed in a future version of Beam. */
+      @Deprecated
       abstract Builder setDataSourceConfiguration(DataSourceConfiguration config);
 
       abstract Builder setDataSourceProviderFn(