[jira] [Assigned] (BEAM-3735) Beam 2.3.0 release archetypes missing mobile gaming examples

2018-02-28 Thread Luke Cwik (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3735?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Luke Cwik reassigned BEAM-3735:
---

Assignee: yifan zou  (was: Jean-Baptiste Onofré)

> Beam 2.3.0 release archetypes missing mobile gaming examples
> 
>
> Key: BEAM-3735
> URL: https://issues.apache.org/jira/browse/BEAM-3735
> Project: Beam
>  Issue Type: Bug
>  Components: examples-java
>Affects Versions: 2.3.0
>Reporter: yifan zou
>Assignee: yifan zou
>Priority: Major
> Fix For: 2.4.0
>
>  Time Spent: 1h 10m
>  Remaining Estimate: 0h
>
> We stopped copying the mobile gaming examples after merging the Java 8 
> examples into the "main" Java examples. 
> [Here|https://github.com/apache/beam/pull/4479/files#diff-3e5600cc4b04a4a7f27d7ce10ac2994aL51]
>  is the PR. As a result, we're not able to run those pipelines via mvn 
> archetype:generate.
> We need to bring those examples back.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[beam] 01/01: [BEAM-3735] Copy mobile-gaming sources in to archetypes

2018-02-28 Thread lcwik
This is an automated email from the ASF dual-hosted git repository.

lcwik pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git

commit c44b3df85ce256bf564e3c049c4ec6e135010711
Merge: d917f99 768a098
Author: Lukasz Cwik 
AuthorDate: Wed Feb 28 23:57:49 2018 -0800

[BEAM-3735] Copy mobile-gaming sources in to archetypes

 .../maven-archetypes/examples/generate-sources.sh  | 26 +-
 1 file changed, 20 insertions(+), 6 deletions(-)

-- 
To stop receiving notification emails like this one, please contact
lc...@apache.org.


[beam] branch master updated (d917f99 -> c44b3df)

2018-02-28 Thread lcwik
This is an automated email from the ASF dual-hosted git repository.

lcwik pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git.


from d917f99  [BEAM-3762] Update Dataflow worker container image to support 
unlimited JCE policy.
 add 768a098  [BEAM-3735] copy mobile-gaming sources in to archetypes
 new c44b3df  [BEAM-3735] Copy mobile-gaming sources in to archetypes

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 .../maven-archetypes/examples/generate-sources.sh  | 26 +-
 1 file changed, 20 insertions(+), 6 deletions(-)

-- 
To stop receiving notification emails like this one, please contact
lc...@apache.org.


[jira] [Resolved] (BEAM-3762) Update Dataflow worker image to support unlimited JCE policy

2018-02-28 Thread Luke Cwik (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3762?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Luke Cwik resolved BEAM-3762.
-
Resolution: Fixed

> Update Dataflow worker image to support unlimited JCE policy
> 
>
> Key: BEAM-3762
> URL: https://issues.apache.org/jira/browse/BEAM-3762
> Project: Beam
>  Issue Type: Improvement
>  Components: runner-dataflow
>Reporter: Luke Cwik
>Assignee: Luke Cwik
>Priority: Minor
> Fix For: 2.4.0
>
>  Time Spent: 1h 20m
>  Remaining Estimate: 0h
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[beam] 01/01: [BEAM-3762] Update Dataflow worker container image to support unlimited JCE policy.

2018-02-28 Thread lcwik
This is an automated email from the ASF dual-hosted git repository.

lcwik pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git

commit d917f995948e73bdb5f26a1123cfe03da589148e
Merge: 9133b59 ff10c84
Author: Lukasz Cwik 
AuthorDate: Wed Feb 28 23:55:53 2018 -0800

[BEAM-3762] Update Dataflow worker container image to support unlimited JCE 
policy.

 runners/google-cloud-dataflow-java/build.gradle | 2 +-
 runners/google-cloud-dataflow-java/pom.xml  | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

-- 
To stop receiving notification emails like this one, please contact
lc...@apache.org.


[beam] branch master updated (9133b59 -> d917f99)

2018-02-28 Thread lcwik
This is an automated email from the ASF dual-hosted git repository.

lcwik pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git.


from 9133b59  Merge pull request #4752: [BEAM-3756] Update SpannerIO to use 
Batch API
 add ff10c84  [BEAM-3762] Update Dataflow worker container image to support 
unlimited JCE policy.
 new d917f99  [BEAM-3762] Update Dataflow worker container image to support 
unlimited JCE policy.

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 runners/google-cloud-dataflow-java/build.gradle | 2 +-
 runners/google-cloud-dataflow-java/pom.xml  | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

-- 
To stop receiving notification emails like this one, please contact
lc...@apache.org.


Jenkins build became unstable: beam_PostCommit_Java_ValidatesRunner_Spark #4306

2018-02-28 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PerformanceTests_Spark #1414

2018-02-28 Thread Apache Jenkins Server
See 


Changes:

[mairbek] Update SpannerIO to use Batch API.

--
[...truncated 94.69 KB...]
'apache-beam-testing:bqjob_r42eea8e7d47e50de_0161e037d893_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: 
/usr/lib/google-cloud-sdk/platform/bq/third_party/oauth2client/contrib/gce.py:73:
 UserWarning: You have requested explicit scopes to be used with a GCE service 
account.
Using this argument will have no effect on the actual scopes for tokens
requested. These scopes are set at VM instance creation time and
can't be overridden in the request.

  warnings.warn(_SCOPES_WARNING)

2018-03-01 06:20:13,720 ec240a45 MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.

2018-03-01 06:20:38,927 ec240a45 MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-03-01 06:20:41,170 ec240a45 MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 

  ReturnCode:1,  WallTime:0:02.23s,  CPU:0.26s,  MaxMemory:25364kb 
STDOUT: Upload complete.
Waiting on bqjob_r2e97785feb0e138c_0161e0384391_1 ... (0s) Current status: 
RUNNING 
 Waiting on bqjob_r2e97785feb0e138c_0161e0384391_1 ... (0s) 
Current status: DONE   
BigQuery error in load operation: Error processing job
'apache-beam-testing:bqjob_r2e97785feb0e138c_0161e0384391_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: 
/usr/lib/google-cloud-sdk/platform/bq/third_party/oauth2client/contrib/gce.py:73:
 UserWarning: You have requested explicit scopes to be used with a GCE service 
account.
Using this argument will have no effect on the actual scopes for tokens
requested. These scopes are set at VM instance creation time and
can't be overridden in the request.

  warnings.warn(_SCOPES_WARNING)

2018-03-01 06:20:41,170 ec240a45 MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.

2018-03-01 06:20:56,684 ec240a45 MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-03-01 06:20:59,120 ec240a45 MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 

  ReturnCode:1,  WallTime:0:02.42s,  CPU:0.26s,  MaxMemory:25392kb 
STDOUT: Upload complete.
Waiting on bqjob_r3d473efcfab90ec8_0161e03888f4_1 ... (0s) Current status: 
RUNNING 
 Waiting on bqjob_r3d473efcfab90ec8_0161e03888f4_1 ... (0s) 
Current status: DONE   
BigQuery error in load operation: Error processing job
'apache-beam-testing:bqjob_r3d473efcfab90ec8_0161e03888f4_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: 
/usr/lib/google-cloud-sdk/platform/bq/third_party/oauth2client/contrib/gce.py:73:
 UserWarning: You have requested explicit scopes to be used with a GCE service 
account.
Using this argument will have no effect on the actual scopes for tokens
requested. These scopes are set at VM instance creation time and
can't be overridden in the request.

  warnings.warn(_SCOPES_WARNING)

2018-03-01 06:20:59,120 ec240a45 MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.

2018-03-01 06:21:16,976 ec240a45 MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-03-01 06:21:19,581 ec240a45 MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 

  ReturnCode:1,  WallTime:0:02.59s,  CPU:0.24s,  MaxMemory:25528kb 
STDOUT: Upload complete.
Waiting on bqjob_r277db45981dc1c01_0161e038d832_1 ... (0s) Current status: 
RUNNING 
 Waiting on bqjob_r277db45981dc1c01_0161e038d832_1 ... (0s) 
Current status: DONE   
BigQuery error in load 

Build failed in Jenkins: beam_PerformanceTests_TextIOIT #214

2018-02-28 Thread Apache Jenkins Server
See 


Changes:

[mairbek] Update SpannerIO to use Batch API.

--
[...truncated 16.47 KB...]
Requirement already satisfied: google-gax<0.16dev,>=0.15.7 in 
/home/jenkins/.local/lib/python2.7/site-packages (from 
gapic-google-cloud-pubsub-v1<0.16dev,>=0.15.0->google-cloud-pubsub==0.26.0->apache-beam==2.4.0.dev0)
Requirement already satisfied: 
proto-google-cloud-pubsub-v1[grpc]<0.16dev,>=0.15.4 in 
/home/jenkins/.local/lib/python2.7/site-packages (from 
gapic-google-cloud-pubsub-v1<0.16dev,>=0.15.0->google-cloud-pubsub==0.26.0->apache-beam==2.4.0.dev0)
Requirement already satisfied: grpc-google-iam-v1<0.12dev,>=0.11.1 in 
/home/jenkins/.local/lib/python2.7/site-packages (from 
gapic-google-cloud-pubsub-v1<0.16dev,>=0.15.0->google-cloud-pubsub==0.26.0->apache-beam==2.4.0.dev0)
Requirement already satisfied: cachetools>=2.0.0 in 
/home/jenkins/.local/lib/python2.7/site-packages (from 
google-auth<2.0.0dev,>=0.4.0->google-cloud-core<0.26dev,>=0.25.0->google-cloud-pubsub==0.26.0->apache-beam==2.4.0.dev0)
Requirement already satisfied: future<0.17dev,>=0.16.0 in 
/home/jenkins/.local/lib/python2.7/site-packages (from 
google-gax<0.16dev,>=0.15.7->gapic-google-cloud-pubsub-v1<0.16dev,>=0.15.0->google-cloud-pubsub==0.26.0->apache-beam==2.4.0.dev0)
Requirement already satisfied: ply==3.8 in 
/home/jenkins/.local/lib/python2.7/site-packages (from 
google-gax<0.16dev,>=0.15.7->gapic-google-cloud-pubsub-v1<0.16dev,>=0.15.0->google-cloud-pubsub==0.26.0->apache-beam==2.4.0.dev0)
Installing collected packages: hdfs, apache-beam
  Running setup.py develop for apache-beam
Successfully installed apache-beam hdfs-2.1.0
[beam_PerformanceTests_TextIOIT] $ /bin/bash -xe 
/tmp/jenkins7176208685852065319.sh
+ .env/bin/python PerfKitBenchmarker/pkb.py --project=apache-beam-testing 
--dpb_log_level=INFO --maven_binary=/home/jenkins/tools/maven/latest/bin/mvn 
--bigquery_table=beam_performance.textioit_pkb_results 
--temp_dir= 
--official=true --benchmarks=beam_integration_benchmark --beam_it_timeout=1200 
--beam_it_profile=io-it --beam_prebuilt=true --beam_sdk=java 
--beam_it_module=sdks/java/io/file-based-io-tests 
--beam_it_class=org.apache.beam.sdk.io.text.TextIOIT 
'--beam_it_options=[--project=apache-beam-testing,--tempRoot=gs://temp-storage-for-perf-tests,--numberOfRecords=100,--filenamePrefix=gs://temp-storage-for-perf-tests/beam_PerformanceTests_TextIOIT/214/]'
 '--beam_extra_mvn_properties=[filesystem=gcs]'
2018-03-01 06:00:38,830 0b288bcb MainThread INFO Verbose logging to: 

2018-03-01 06:00:38,830 0b288bcb MainThread INFO PerfKitBenchmarker 
version: v1.12.0-375-g7965638
2018-03-01 06:00:38,830 0b288bcb MainThread INFO Flag values:
--beam_extra_mvn_properties=[filesystem=gcs]
--beam_it_class=org.apache.beam.sdk.io.text.TextIOIT
--beam_it_timeout=1200
--beam_it_module=sdks/java/io/file-based-io-tests
--beam_sdk=java
--temp_dir=
--maven_binary=/home/jenkins/tools/maven/latest/bin/mvn
--beam_it_options=[--project=apache-beam-testing,--tempRoot=gs://temp-storage-for-perf-tests,--numberOfRecords=100,--filenamePrefix=gs://temp-storage-for-perf-tests/beam_PerformanceTests_TextIOIT/214/]
--beam_prebuilt
--project=apache-beam-testing
--bigquery_table=beam_performance.textioit_pkb_results
--official
--dpb_log_level=INFO
--beam_it_profile=io-it
--benchmarks=beam_integration_benchmark
2018-03-01 06:00:39,093 0b288bcb MainThread WARNING  The key "flags" was not in 
the default config, but was in user overrides. This may indicate a typo.
2018-03-01 06:00:39,094 0b288bcb MainThread INFO Initializing the edw 
service decoder
2018-03-01 06:00:39,189 0b288bcb MainThread beam_integration_benchmark(1/1) 
INFO Provisioning resources for benchmark beam_integration_benchmark
2018-03-01 06:00:39,191 0b288bcb MainThread beam_integration_benchmark(1/1) 
INFO Preparing benchmark beam_integration_benchmark
2018-03-01 06:00:39,191 0b288bcb MainThread beam_integration_benchmark(1/1) 
INFO Running: git clone https://github.com/apache/beam.git
2018-03-01 06:00:46,236 0b288bcb MainThread beam_integration_benchmark(1/1) 
INFO Running benchmark beam_integration_benchmark
2018-03-01 06:00:46,242 0b288bcb MainThread beam_integration_benchmark(1/1) 
INFO Running: /home/jenkins/tools/maven/latest/bin/mvn -e verify 
-Dit.test=org.apache.beam.sdk.io.text.TextIOIT -DskipITs=false -pl 
sdks/java/io/file-based-io-tests -Pio-it -Pdataflow-runner -Dfilesystem=gcs 

Build failed in Jenkins: beam_PerformanceTests_JDBC #274

2018-02-28 Thread Apache Jenkins Server
See 


Changes:

[mairbek] Update SpannerIO to use Batch API.

--
[...truncated 48.91 KB...]
[INFO] Replacing original artifact with shaded artifact.
[INFO] Replacing 

 with 

[INFO] Replacing original test artifact with shaded test artifact.
[INFO] Replacing 

 with 

[INFO] Dependency-reduced POM written at: 

[INFO] 
[INFO] --- maven-failsafe-plugin:2.20.1:integration-test (default) @ 
beam-sdks-java-io-jdbc ---
[INFO] Failsafe report directory: 

[INFO] parallel='all', perCoreThreadCount=true, threadCount=4, 
useUnlimitedThreads=false, threadCountSuites=0, threadCountClasses=0, 
threadCountMethods=0, parallelOptimized=true
[INFO] 
[INFO] ---
[INFO]  T E S T S
[INFO] ---
[INFO] Running org.apache.beam.sdk.io.jdbc.JdbcIOIT
[ERROR] Tests run: 1, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 
1,041.716 s <<< FAILURE! - in org.apache.beam.sdk.io.jdbc.JdbcIOIT
[ERROR] testWriteThenRead(org.apache.beam.sdk.io.jdbc.JdbcIOIT)  Time elapsed: 
1,041.716 s  <<< ERROR!
java.lang.RuntimeException: 
(ed98f0c45f5f7cc4): java.lang.RuntimeException: 
org.apache.beam.sdk.util.UserCodeException: org.postgresql.util.PSQLException: 
The connection attempt failed.
at 
com.google.cloud.dataflow.worker.MapTaskExecutorFactory$3.typedApply(MapTaskExecutorFactory.java:404)
at 
com.google.cloud.dataflow.worker.MapTaskExecutorFactory$3.typedApply(MapTaskExecutorFactory.java:374)
at 
com.google.cloud.dataflow.worker.graph.Networks$TypeSafeNodeFunction.apply(Networks.java:63)
at 
com.google.cloud.dataflow.worker.graph.Networks$TypeSafeNodeFunction.apply(Networks.java:50)
at 
com.google.cloud.dataflow.worker.graph.Networks.replaceDirectedNetworkNodes(Networks.java:87)
at 
com.google.cloud.dataflow.worker.MapTaskExecutorFactory.create(MapTaskExecutorFactory.java:158)
at 
com.google.cloud.dataflow.worker.DataflowWorker.doWork(DataflowWorker.java:308)
at 
com.google.cloud.dataflow.worker.DataflowWorker.getAndPerformWork(DataflowWorker.java:264)
at 
com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:133)
at 
com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:113)
at 
com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:100)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.beam.sdk.util.UserCodeException: 
org.postgresql.util.PSQLException: The connection attempt failed.
at 
org.apache.beam.sdk.util.UserCodeException.wrap(UserCodeException.java:36)
at 
org.apache.beam.sdk.io.jdbc.JdbcIO$ReadFn$DoFnInvoker.invokeSetup(Unknown 
Source)
at 
com.google.cloud.dataflow.worker.DoFnInstanceManagers$ConcurrentQueueInstanceManager.deserializeCopy(DoFnInstanceManagers.java:63)
at 
com.google.cloud.dataflow.worker.DoFnInstanceManagers$ConcurrentQueueInstanceManager.peek(DoFnInstanceManagers.java:45)
at 
com.google.cloud.dataflow.worker.UserParDoFnFactory.create(UserParDoFnFactory.java:94)
at 
com.google.cloud.dataflow.worker.DefaultParDoFnFactory.create(DefaultParDoFnFactory.java:74)
at 
com.google.cloud.dataflow.worker.MapTaskExecutorFactory.createParDoOperation(MapTaskExecutorFactory.java:481)
at 
com.google.cloud.dataflow.worker.MapTaskExecutorFactory$3.typedApply(MapTaskExecutorFactory.java:392)
... 14 more
Caused by: org.postgresql.util.PSQLException: The connection attempt failed.
at 

Build failed in Jenkins: beam_PerformanceTests_Python #970

2018-02-28 Thread Apache Jenkins Server
See 


Changes:

[mairbek] Update SpannerIO to use Batch API.

--
[...truncated 726 B...]
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 9133b593453d37066212c32922f9f88cb236b6bc (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 9133b593453d37066212c32922f9f88cb236b6bc
Commit message: "Merge pull request #4752: [BEAM-3756] Update SpannerIO to use 
Batch API"
 > git rev-list --no-walk 3788a48770da9f48beed96fbbec56cf1beaeb6aa # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins6189404951678294040.sh
+ rm -rf PerfKitBenchmarker
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins1256035297818677425.sh
+ rm -rf .env
[beam_PerformanceTests_Python] $ /bin/bash -xe /tmp/jenkins4017476556675741.sh
+ virtualenv .env --system-site-packages
New python executable in 

Installing setuptools, pip, wheel...done.
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins1356853821669255447.sh
+ .env/bin/pip install --upgrade setuptools pip
Requirement already up-to-date: setuptools in ./.env/lib/python2.7/site-packages
Requirement already up-to-date: pip in ./.env/lib/python2.7/site-packages
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins1998509383867717750.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git
Cloning into 'PerfKitBenchmarker'...
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins3188420239957846467.sh
+ .env/bin/pip install -r PerfKitBenchmarker/requirements.txt
Requirement already satisfied: absl-py in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 14))
Requirement already satisfied: jinja2>=2.7 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 15))
Requirement already satisfied: setuptools in ./.env/lib/python2.7/site-packages 
(from -r PerfKitBenchmarker/requirements.txt (line 16))
Requirement already satisfied: colorlog[windows]==2.6.0 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 17))
Requirement already satisfied: blinker>=1.3 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 18))
Requirement already satisfied: futures>=3.0.3 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 19))
Requirement already satisfied: PyYAML==3.12 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 20))
Requirement already satisfied: pint>=0.7 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 21))
Collecting numpy==1.13.3 (from -r PerfKitBenchmarker/requirements.txt (line 22))
:318:
 SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name 
Indication) extension to TLS is not available on this platform. This may cause 
the server to present an incorrect TLS certificate, which can cause validation 
failures. You can upgrade to a newer version of Python to solve this. For more 
information, see 
https://urllib3.readthedocs.io/en/latest/security.html#snimissingwarning.
  SNIMissingWarning
:122:
 InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/security.html#insecureplatformwarning.
  InsecurePlatformWarning
  Using cached numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Requirement already satisfied: functools32 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 23))
Requirement already satisfied: contextlib2>=0.5.1 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 24))
Requirement already satisfied: pywinrm in 
/home/jenkins/.local/lib/python2.7/site-packages 

[jira] [Updated] (BEAM-3700) PipelineOptionsFactory leaks memory

2018-02-28 Thread Robert Bradshaw (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3700?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Robert Bradshaw updated BEAM-3700:
--
Fix Version/s: (was: 2.4.0)
   2.5.0

> PipelineOptionsFactory leaks memory
> ---
>
> Key: BEAM-3700
> URL: https://issues.apache.org/jira/browse/BEAM-3700
> Project: Beam
>  Issue Type: Task
>  Components: sdk-java-core
>Reporter: Romain Manni-Bucau
>Assignee: Romain Manni-Bucau
>Priority: Major
> Fix For: 2.5.0
>
>
> PipelineOptionsFactory has a lot of caches but no way to reset them. This task 
> is about adding a public method to be able to control them from integrations 
> (likely runners).
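
As a rough illustration, the hook this ticket asks for could look like the
sketch below; resetCache() is a hypothetical method name taken from the intent
of the description, not part of the released SDK.

import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class EmbeddedRunnerHost {
  public void runOneJob(String[] args) {
    PipelineOptions options = PipelineOptionsFactory.fromArgs(args).create();
    // ... build and run a pipeline with these options ...

    // Hypothetical hook: drop the proxy-class and property caches that
    // PipelineOptionsFactory accumulates, so a long-lived host process
    // (e.g. a runner embedded in an app server) does not retain class
    // metadata across jobs.
    PipelineOptionsFactory.resetCache();
  }
}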



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Closed] (BEAM-3689) Direct runner leaks a reader for every 10 input records

2018-02-28 Thread Robert Bradshaw (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3689?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Robert Bradshaw closed BEAM-3689.
-
Resolution: Fixed

Based on my reading of the PR, this is fixed. 

> Direct runner leaks a reader for every 10 input records
> --
>
> Key: BEAM-3689
> URL: https://issues.apache.org/jira/browse/BEAM-3689
> Project: Beam
>  Issue Type: Improvement
>  Components: runner-direct
>Reporter: Raghu Angadi
>Assignee: Raghu Angadi
>Priority: Major
> Fix For: 2.4.0
>
>  Time Spent: 2h 40m
>  Remaining Estimate: 0h
>
> The direct runner reads 10 records at a time from a reader. I think the 
> intention is to reuse the reader, but it is only reused if the reader is 
> initially idle, not when the source still has messages available.
> When I was testing KafkaIO with the direct runner, it kept opening a new 
> reader for every 10 records and soon ran out of file descriptors.
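
As a simplified, self-contained illustration of that pattern (not the direct
runner's actual code; Reader and Source here are stand-in interfaces):

import java.util.HashMap;
import java.util.Map;

class ReaderReuseSketch {
  interface Reader { boolean advance(); Object getCurrent(); void close(); }
  interface Source { Reader createReader(); }

  private final Map<Source, Reader> readerCache = new HashMap<>();

  // Leaky: a fresh reader (and its sockets/descriptors) per 10-record batch.
  void readBatchLeaky(Source source) {
    Reader reader = source.createReader();
    for (int i = 0; i < 10 && reader.advance(); i++) {
      process(reader.getCurrent());
    }
    // Even closing it here causes descriptor churn at KafkaIO's rate;
    // the reader should survive across batches instead.
    reader.close();
  }

  // Reuse: keep one reader per source and pull successive batches from it.
  void readBatchReused(Source source) {
    Reader reader = readerCache.computeIfAbsent(source, Source::createReader);
    for (int i = 0; i < 10 && reader.advance(); i++) {
      process(reader.getCurrent());
    }
  }

  void process(Object element) { /* hand off downstream */ }
}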



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (BEAM-3702) Support system properties source for pipeline options

2018-02-28 Thread Robert Bradshaw (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3702?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Robert Bradshaw updated BEAM-3702:
--
Fix Version/s: (was: 2.4.0)
   2.5.0

> Support system properties source for pipeline options
> -
>
> Key: BEAM-3702
> URL: https://issues.apache.org/jira/browse/BEAM-3702
> Project: Beam
>  Issue Type: Task
>  Components: sdk-java-core
>Reporter: Romain Manni-Bucau
>Assignee: Romain Manni-Bucau
>Priority: Major
> Fix For: 2.5.0
>
>  Time Spent: 8h 10m
>  Remaining Estimate: 0h
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (BEAM-3749) support customized trigger/accumulationMode in BeamSql

2018-02-28 Thread Robert Bradshaw (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3749?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Robert Bradshaw updated BEAM-3749:
--
Fix Version/s: (was: 2.4.0)
   2.5.0

> support customized trigger/accumulationMode in BeamSql
> --
>
> Key: BEAM-3749
> URL: https://issues.apache.org/jira/browse/BEAM-3749
> Project: Beam
>  Issue Type: Improvement
>  Components: dsl-sql
>Reporter: Xu Mingmin
>Assignee: Xu Mingmin
>Priority: Major
> Fix For: 2.5.0
>
>  Time Spent: 20m
>  Remaining Estimate: 0h
>
> Currently BeamSql uses {{DefaultTrigger}} for aggregation operations. 
> By adding two options {{withTrigger(Trigger)}} and 
> {{withAccumulationMode(AccumulationMode)}}, developers could specify their own 
> aggregation strategies with BeamSql.
> [~xumingming] [~kedin] [~kenn], any comments?
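
A hypothetical usage sketch of the proposal (withTrigger and
withAccumulationMode are the method names from the description above; they do
not exist in the current BeamSql API):

import org.apache.beam.sdk.extensions.sql.BeamSql;
import org.apache.beam.sdk.transforms.windowing.AfterProcessingTime;
import org.apache.beam.sdk.transforms.windowing.Repeatedly;
import org.apache.beam.sdk.values.WindowingStrategy.AccumulationMode;
import org.joda.time.Duration;

// Today this aggregation always runs with DefaultTrigger; under the
// proposal the query transform would carry its own strategy instead.
input.apply(
    BeamSql.query("SELECT f_user, COUNT(*) AS cnt FROM PCOLLECTION GROUP BY f_user")
        .withTrigger(Repeatedly.forever(                      // proposed option
            AfterProcessingTime.pastFirstElementInPane()
                .plusDelayOf(Duration.standardMinutes(1))))
        .withAccumulationMode(
            AccumulationMode.ACCUMULATING_FIRED_PANES));      // proposed option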



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (BEAM-3749) support customized trigger/accumulationMode in BeamSql

2018-02-28 Thread Robert Bradshaw (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-3749?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16381530#comment-16381530
 ] 

Robert Bradshaw commented on BEAM-3749:
---

I wonder if sink-based triggers would be a better fit here than adding 
triggering to an SQL statement. 

> support customized trigger/accumulationMode in BeamSql
> --
>
> Key: BEAM-3749
> URL: https://issues.apache.org/jira/browse/BEAM-3749
> Project: Beam
>  Issue Type: Improvement
>  Components: dsl-sql
>Reporter: Xu Mingmin
>Assignee: Xu Mingmin
>Priority: Major
> Fix For: 2.4.0
>
>  Time Spent: 20m
>  Remaining Estimate: 0h
>
> Currently BeamSql uses {{DefaultTrigger}} for aggregation operations. 
> By adding two options {{withTrigger(Trigger)}} and 
> {{withAccumulationMode(AccumulationMode)}}, developers could specify their own 
> aggregation strategies with BeamSql.
> [~xumingming] [~kedin] [~kenn], any comments?



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (BEAM-3756) Update SpannerIO to use Batch API

2018-02-28 Thread Chamikara Jayalath (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3756?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Chamikara Jayalath updated BEAM-3756:
-
Fix Version/s: (was: 2.4.0)
   Not applicable

> Update SpannerIO to use Batch API
> -
>
> Key: BEAM-3756
> URL: https://issues.apache.org/jira/browse/BEAM-3756
> Project: Beam
>  Issue Type: Improvement
>  Components: sdk-java-core
>Affects Versions: Not applicable
>Reporter: Chamikara Jayalath
>Assignee: Mairbek Khadikov
>Priority: Critical
> Fix For: Not applicable
>
>  Time Spent: 1h 40m
>  Remaining Estimate: 0h
>
> Active pull requests:
> [https://github.com/apache/beam/pull/4752]
> [https://github.com/apache/beam/pull/4727]
> [https://github.com/apache/beam/pull/4707]
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[beam] 01/01: Merge pull request #4752: [BEAM-3756] Update SpannerIO to use Batch API

2018-02-28 Thread chamikara
This is an automated email from the ASF dual-hosted git repository.

chamikara pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git

commit 9133b593453d37066212c32922f9f88cb236b6bc
Merge: 3788a48 b6fea3b
Author: Chamikara Jayalath 
AuthorDate: Wed Feb 28 19:30:39 2018 -0800

Merge pull request #4752: [BEAM-3756] Update SpannerIO to use Batch API

 build.gradle   |   2 +-
 pom.xml|   2 +-
 .../beam/sdk/io/gcp/spanner/BatchSpannerRead.java  | 161 ++
 .../sdk/io/gcp/spanner/CreateTransactionFn.java|  29 +-
 .../sdk/io/gcp/spanner/NaiveSpannerReadFn.java |  85 --
 .../beam/sdk/io/gcp/spanner/ReadOperation.java |  13 +-
 .../beam/sdk/io/gcp/spanner/SpannerAccessor.java   |   9 +-
 .../beam/sdk/io/gcp/spanner/SpannerConfig.java |   5 +-
 .../apache/beam/sdk/io/gcp/spanner/SpannerIO.java  |  51 ++--
 .../beam/sdk/io/gcp/spanner/Transaction.java   |  10 +-
 .../cloud/spanner/FakeBatchTransactionId.java  |  54 
 .../google/cloud/spanner/FakePartitionFactory.java |  43 +++
 .../apache/beam/sdk/io/gcp/GcpApiSurfaceTest.java  |   2 +-
 .../sdk/io/gcp/spanner/FakeServiceFactory.java |  13 +
 .../beam/sdk/io/gcp/spanner/SpannerIOReadTest.java | 329 +++--
 .../beam/sdk/io/gcp/spanner/SpannerReadIT.java |  42 +++
 16 files changed, 551 insertions(+), 299 deletions(-)

-- 
To stop receiving notification emails like this one, please contact
chamik...@apache.org.


[beam] branch master updated (3788a48 -> 9133b59)

2018-02-28 Thread chamikara
This is an automated email from the ASF dual-hosted git repository.

chamikara pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git.


from 3788a48  Bump container image tag to fix incompatibility between the 
SDK and the container.
 add b6fea3b  Update SpannerIO to use Batch API.
 new 9133b59  Merge pull request #4752: [BEAM-3756] Update SpannerIO to use 
Batch API

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 build.gradle   |   2 +-
 pom.xml|   2 +-
 .../beam/sdk/io/gcp/spanner/BatchSpannerRead.java  | 161 ++
 .../sdk/io/gcp/spanner/CreateTransactionFn.java|  29 +-
 .../sdk/io/gcp/spanner/NaiveSpannerReadFn.java |  85 --
 .../beam/sdk/io/gcp/spanner/ReadOperation.java |  13 +-
 .../beam/sdk/io/gcp/spanner/SpannerAccessor.java   |   9 +-
 .../beam/sdk/io/gcp/spanner/SpannerConfig.java |   5 +-
 .../apache/beam/sdk/io/gcp/spanner/SpannerIO.java  |  51 ++--
 .../beam/sdk/io/gcp/spanner/Transaction.java   |  10 +-
 .../cloud/spanner/FakeBatchTransactionId.java  |  41 ++-
 .../google/cloud/spanner/FakePartitionFactory.java |  39 ++-
 .../apache/beam/sdk/io/gcp/GcpApiSurfaceTest.java  |   2 +-
 .../sdk/io/gcp/spanner/FakeServiceFactory.java |  13 +
 .../beam/sdk/io/gcp/spanner/SpannerIOReadTest.java | 329 +++--
 .../beam/sdk/io/gcp/spanner/SpannerReadIT.java |  42 +++
 16 files changed, 491 insertions(+), 342 deletions(-)
 create mode 100644 
sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/spanner/BatchSpannerRead.java
 delete mode 100644 
sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/spanner/NaiveSpannerReadFn.java
 copy 
runners/spark/src/main/java/org/apache/beam/runners/spark/util/ByteArray.java 
=> 
sdks/java/io/google-cloud-platform/src/test/java/com/google/cloud/spanner/FakeBatchTransactionId.java
 (56%)
 copy 
runners/google-cloud-dataflow-java/src/main/java/org/apache/beam/runners/dataflow/util/CloudObjectTranslator.java
 => 
sdks/java/io/google-cloud-platform/src/test/java/com/google/cloud/spanner/FakePartitionFactory.java
 (51%)

-- 
To stop receiving notification emails like this one, please contact
chamik...@apache.org.


[jira] [Updated] (BEAM-3327) Add abstractions to manage Environment Instance lifecycles.

2018-02-28 Thread Thomas Groh (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3327?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Thomas Groh updated BEAM-3327:
--
Summary: Add abstractions to manage Environment Instance lifecycles.  (was: 
Create and Manage Containers in the Universal Local Runner)

> Add abstractions to manage Environment Instance lifecycles.
> ---
>
> Key: BEAM-3327
> URL: https://issues.apache.org/jira/browse/BEAM-3327
> Project: Beam
>  Issue Type: New Feature
>  Components: runner-core
>Reporter: Thomas Groh
>Assignee: Ben Sidhom
>Priority: Major
>  Labels: portability
>  Time Spent: 4h
>  Remaining Estimate: 0h
>
> This permits remote stage execution for arbitrary environments



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (BEAM-3327) Create and Manage Containers in the Universal Local Runner

2018-02-28 Thread Thomas Groh (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-3327?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16381414#comment-16381414
 ] 

Thomas Groh commented on BEAM-3327:
---

This is really "Add interfaces and an implementation to manage environments", 
and I believe you are currently working on this, Ben.

 

> Create and Manage Containers in the Universal Local Runner
> --
>
> Key: BEAM-3327
> URL: https://issues.apache.org/jira/browse/BEAM-3327
> Project: Beam
>  Issue Type: New Feature
>  Components: runner-core
>Reporter: Thomas Groh
>Assignee: Ben Sidhom
>Priority: Major
>  Labels: portability
>  Time Spent: 4h
>  Remaining Estimate: 0h
>
> This permits remote stage execution for arbitrary environments



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Assigned] (BEAM-3327) Create and Manage Containers in the Universal Local Runner

2018-02-28 Thread Thomas Groh (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3327?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Thomas Groh reassigned BEAM-3327:
-

Assignee: Ben Sidhom  (was: Thomas Groh)

> Create and Manage Containers in the Universal Local Runner
> --
>
> Key: BEAM-3327
> URL: https://issues.apache.org/jira/browse/BEAM-3327
> Project: Beam
>  Issue Type: New Feature
>  Components: runner-core
>Reporter: Thomas Groh
>Assignee: Ben Sidhom
>Priority: Major
>  Labels: portability
>  Time Spent: 4h
>  Remaining Estimate: 0h
>
> This permits remote stage execution for arbitrary environments



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (BEAM-3714) JdbcIO.read() should create a forward-only, read-only result set

2018-02-28 Thread Innocent (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-3714?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16381404#comment-16381404
 ] 

Innocent commented on BEAM-3714:


Hi Eugene, thanks for your support. I am not sure about the fetch size part: 
according to this documentation from Oracle 
[https://docs.oracle.com/cd/A81042_01/DOC/java.816/a81354/resltse5.htm], the 
fetch size is set to 10 by default. I do not have much experience with this; 
did you have a specific value or range of values in mind when suggesting that 
it should be set to a large value?

> JdbcIO.read() should create a forward-only, read-only result set
> 
>
> Key: BEAM-3714
> URL: https://issues.apache.org/jira/browse/BEAM-3714
> Project: Beam
>  Issue Type: Bug
>  Components: io-java-jdbc
>Reporter: Eugene Kirpichov
>Assignee: Innocent
>Priority: Major
>
> [https://stackoverflow.com/questions/48784889/streaming-data-from-cloudsql-into-dataflow/48819934#48819934]
>  - a user is trying to load a large table from MySQL, and the MySQL JDBC 
> driver requires special measures when loading large result sets.
> JdbcIO currently just calls "connection.prepareStatement(query)" 
> https://github.com/apache/beam/blob/bb8c12c4956cbe3c6f2e57113e7c0ce2a5c05009/sdks/java/io/jdbc/src/main/java/org/apache/beam/sdk/io/jdbc/JdbcIO.java#L508
>  - it should specify type TYPE_FORWARD_ONLY and concurrency CONCUR_READ_ONLY, 
> and these values should always be used.
> It seems that different databases have different requirements for streaming 
> result sets.
> E.g. MySQL requires setting the fetch size; PostgreSQL says "The Connection 
> must not be in autocommit mode." 
> https://jdbc.postgresql.org/documentation/head/query.html#query-with-cursor . 
> Oracle, I think, doesn't have any special requirements, but I'm not sure; the 
> fetch size should probably still be set to a reasonably large value.
> The common denominator of these requirements seems to be: set the fetch size 
> to a reasonably large but not maximum value, and disable autocommit (there's 
> nothing to commit in read() anyway).
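
For reference, a minimal plain-JDBC sketch of the behavior described above
(not Beam's actual JdbcIO code; jdbcUrl, user, password, and query are
placeholders, and 10,000 is only an illustrative fetch size):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

try (Connection conn = DriverManager.getConnection(jdbcUrl, user, password)) {
  // PostgreSQL only streams a cursor-backed result set outside autocommit.
  conn.setAutoCommit(false);
  try (PreparedStatement stmt = conn.prepareStatement(
      query, ResultSet.TYPE_FORWARD_ONLY, ResultSet.CONCUR_READ_ONLY)) {
    stmt.setFetchSize(10_000); // reasonably large, but not Integer.MAX_VALUE
    try (ResultSet rs = stmt.executeQuery()) {
      while (rs.next()) {
        // map each row to an output element here
      }
    }
  }
}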



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (BEAM-3714) JdbcIO.read() should create a forward-only, read-only result set

2018-02-28 Thread Eugene Kirpichov (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-3714?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16381382#comment-16381382
 ] 

Eugene Kirpichov commented on BEAM-3714:


Hey Innocent, thanks for taking this! I'll be happy to help if you have any 
questions, and review your PR when it's ready.

> JdbcIO.read() should create a forward-only, read-only result set
> 
>
> Key: BEAM-3714
> URL: https://issues.apache.org/jira/browse/BEAM-3714
> Project: Beam
>  Issue Type: Bug
>  Components: io-java-jdbc
>Reporter: Eugene Kirpichov
>Assignee: Innocent
>Priority: Major
>
> [https://stackoverflow.com/questions/48784889/streaming-data-from-cloudsql-into-dataflow/48819934#48819934]
>  - a user is trying to load a large table from MySQL, and the MySQL JDBC 
> driver requires special measures when loading large result sets.
> JdbcIO currently just calls "connection.prepareStatement(query)" 
> https://github.com/apache/beam/blob/bb8c12c4956cbe3c6f2e57113e7c0ce2a5c05009/sdks/java/io/jdbc/src/main/java/org/apache/beam/sdk/io/jdbc/JdbcIO.java#L508
>  - it should specify type TYPE_FORWARD_ONLY and concurrency CONCUR_READ_ONLY, 
> and these values should always be used.
> It seems that different databases have different requirements for streaming 
> result sets.
> E.g. MySQL requires setting the fetch size; PostgreSQL says "The Connection 
> must not be in autocommit mode." 
> https://jdbc.postgresql.org/documentation/head/query.html#query-with-cursor . 
> Oracle, I think, doesn't have any special requirements, but I'm not sure; the 
> fetch size should probably still be set to a reasonably large value.
> The common denominator of these requirements seems to be: set the fetch size 
> to a reasonably large but not maximum value, and disable autocommit (there's 
> nothing to commit in read() anyway).



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Jenkins build is back to normal : beam_PostCommit_Java_MavenInstall #6088

2018-02-28 Thread Apache Jenkins Server
See 




[jira] [Created] (BEAM-3764) Add retry logic in ExampleUtils that handles network flakes.

2018-02-28 Thread David Yan (JIRA)
David Yan created BEAM-3764:
---

 Summary: Add retry logic in ExampleUtils that handles network 
flakes.
 Key: BEAM-3764
 URL: https://issues.apache.org/jira/browse/BEAM-3764
 Project: Beam
  Issue Type: Improvement
  Components: examples-java
Reporter: David Yan
Assignee: Reuven Lax


The following exception is thrown when there is a network flake while running 
the traffic examples. Should we retry instead of throwing the exception and 
failing the job?

Exception in thread "main" java.io.IOException: Error getting access token for service account: 
        at com.google.auth.oauth2.ServiceAccountCredentials.refreshAccessToken(ServiceAccountCredentials.java:319)
        at com.google.auth.oauth2.OAuth2Credentials.refresh(OAuth2Credentials.java:149)
        at com.google.auth.oauth2.OAuth2Credentials.getRequestMetadata(OAuth2Credentials.java:135)
        at com.google.auth.http.HttpCredentialsAdapter.initialize(HttpCredentialsAdapter.java:96)
        at com.google.cloud.hadoop.util.ChainingHttpRequestInitializer.initialize(ChainingHttpRequestInitializer.java:52)
        at com.google.api.client.http.HttpRequestFactory.buildRequest(HttpRequestFactory.java:93)
        at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.buildHttpRequest(AbstractGoogleClientRequest.java:300)
        at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:419)
        at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:352)
        at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:469)
        at org.apache.beam.examples.common.ExampleUtils.executeNullIfNotFound(ExampleUtils.java:397)
        at org.apache.beam.examples.common.ExampleUtils.setupPubsubTopic(ExampleUtils.java:283)
        at org.apache.beam.examples.common.ExampleUtils.setupPubsub(ExampleUtils.java:131)
        at org.apache.beam.examples.common.ExampleUtils.setup(ExampleUtils.java:105)
        at org.apache.beam.examples.complete.TrafficMaxLaneFlow.main(TrafficMaxLaneFlow.java:334)
Caused by: javax.net.ssl.SSLHandshakeException: Remote host closed connection during handshake
        at sun.security.ssl.SSLSocketImpl.readRecord(SSLSocketImpl.java:994)
        at sun.security.ssl.SSLSocketImpl.performInitialHandshake(SSLSocketImpl.java:1379)
        at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:1407)
        at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:1391)
        at sun.net.www.protocol.https.HttpsClient.afterConnect(HttpsClient.java:559)
        at sun.net.www.protocol.https.AbstractDelegateHttpsURLConnection.connect(AbstractDelegateHttpsURLConnection.java:185)
        at sun.net.www.protocol.http.HttpURLConnection.getOutputStream0(HttpURLConnection.java:1316)
        at sun.net.www.protocol.http.HttpURLConnection.getOutputStream(HttpURLConnection.java:1291)
        at sun.net.www.protocol.https.HttpsURLConnectionImpl.getOutputStream(HttpsURLConnectionImpl.java:250)
        at com.google.api.client.http.javanet.NetHttpRequest.execute(NetHttpRequest.java:77)
        at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:981)
        at com.google.auth.oauth2.ServiceAccountCredentials.refreshAccessToken(ServiceAccountCredentials.java:317)
        ... 14 more
Caused by: java.io.EOFException: SSL peer shut down incorrectly
        at sun.security.ssl.InputRecord.read(InputRecord.java:505)
        at sun.security.ssl.SSLSocketImpl.readRecord(SSLSocketImpl.java:975)
        ... 25 more

Add retry code somewhere here?

https://github.com/apache/beam/blob/29859eb54d05b96a9db477e7bb04537510273bd2/examples/java/src/main/java/org/apache/beam/examples/common/ExampleUtils.java#L398
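
A hedged sketch of what such a retry could look like (helper name, attempt
count, and backoff values are illustrative, not ExampleUtils' actual code):

import java.io.IOException;
import java.util.concurrent.Callable;

// Retry a flaky call a few times with exponential backoff before letting
// the exception propagate and fail the job.
static <T> T callWithRetries(Callable<T> call) throws Exception {
  IOException lastFailure = null;
  long backoffMillis = 1000;
  for (int attempt = 0; attempt < 5; attempt++) {
    try {
      return call.call();
    } catch (IOException e) { // e.g. the SSLHandshakeException above
      lastFailure = e;
      Thread.sleep(backoffMillis);
      backoffMillis *= 2;
    }
  }
  throw lastFailure;
}

executeNullIfNotFound could then route its request execution through such a
helper instead of letting the first handshake failure fail the whole setup.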



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Build failed in Jenkins: beam_PostCommit_Java_MavenInstall #6087

2018-02-28 Thread Apache Jenkins Server
See 


--
[...truncated 1.08 MB...]
2018-03-01T01:04:21.648 [INFO] Excluding com.google.inject:guice:jar:3.0 from 
the shaded jar.
2018-03-01T01:04:21.648 [INFO] Excluding javax.inject:javax.inject:jar:1 from 
the shaded jar.
2018-03-01T01:04:21.648 [INFO] Excluding aopalliance:aopalliance:jar:1.0 from 
the shaded jar.
2018-03-01T01:04:21.648 [INFO] Excluding com.sun.jersey:jersey-server:jar:1.9 
from the shaded jar.
2018-03-01T01:04:21.648 [INFO] Excluding asm:asm:jar:3.1 from the shaded jar.
2018-03-01T01:04:21.648 [INFO] Excluding 
com.sun.jersey.contribs:jersey-guice:jar:1.9 from the shaded jar.
2018-03-01T01:04:21.648 [INFO] Excluding jline:jline:jar:2.11 from the shaded 
jar.
2018-03-01T01:04:21.648 [INFO] Excluding org.apache.ant:ant:jar:1.9.2 from the 
shaded jar.
2018-03-01T01:04:21.648 [INFO] Excluding org.apache.ant:ant-launcher:jar:1.9.2 
from the shaded jar.
2018-03-01T01:04:21.648 [INFO] Excluding net.engio:mbassador:jar:1.1.9 from the 
shaded jar.
2018-03-01T01:04:21.648 [INFO] Excluding net.lingala.zip4j:zip4j:jar:1.3.2 from 
the shaded jar.
2018-03-01T01:04:21.648 [INFO] Excluding commons-codec:commons-codec:jar:1.10 
from the shaded jar.
2018-03-01T01:04:21.648 [INFO] Excluding 
org.apache.xbean:xbean-asm5-shaded:jar:4.3 from the shaded jar.
2018-03-01T01:04:21.648 [INFO] Excluding org.jctools:jctools-core:jar:1.1 from 
the shaded jar.
2018-03-01T01:04:21.648 [INFO] Excluding 
org.apache.beam:beam-model-pipeline:jar:2.4.0-SNAPSHOT from the shaded jar.
2018-03-01T01:04:21.648 [INFO] Excluding 
com.google.protobuf:protobuf-java:jar:3.2.0 from the shaded jar.
2018-03-01T01:04:21.648 [INFO] Excluding 
org.apache.beam:beam-sdks-java-core:jar:2.4.0-SNAPSHOT from the shaded jar.
2018-03-01T01:04:21.648 [INFO] Including com.google.guava:guava:jar:20.0 in the 
shaded jar.
2018-03-01T01:04:21.648 [INFO] Excluding 
com.github.stephenc.findbugs:findbugs-annotations:jar:1.3.9-1 from the shaded 
jar.
2018-03-01T01:04:21.648 [INFO] Excluding 
com.fasterxml.jackson.core:jackson-core:jar:2.8.9 from the shaded jar.
2018-03-01T01:04:21.648 [INFO] Excluding 
com.fasterxml.jackson.core:jackson-annotations:jar:2.8.9 from the shaded jar.
2018-03-01T01:04:21.648 [INFO] Excluding 
com.fasterxml.jackson.core:jackson-databind:jar:2.8.9 from the shaded jar.
2018-03-01T01:04:21.648 [INFO] Excluding net.bytebuddy:byte-buddy:jar:1.7.10 
from the shaded jar.
2018-03-01T01:04:21.648 [INFO] Excluding org.apache.avro:avro:jar:1.8.2 from 
the shaded jar.
2018-03-01T01:04:21.648 [INFO] Excluding 
com.thoughtworks.paranamer:paranamer:jar:2.7 from the shaded jar.
2018-03-01T01:04:21.648 [INFO] Excluding org.tukaani:xz:jar:1.5 from the shaded 
jar.
2018-03-01T01:04:21.648 [INFO] Excluding 
org.xerial.snappy:snappy-java:jar:1.1.4 from the shaded jar.
2018-03-01T01:04:21.648 [INFO] Excluding 
org.apache.commons:commons-compress:jar:1.14 from the shaded jar.
2018-03-01T01:04:21.648 [INFO] Excluding 
org.apache.beam:beam-runners-core-construction-java:jar:2.4.0-SNAPSHOT from the 
shaded jar.
2018-03-01T01:04:21.648 [INFO] Excluding 
org.apache.beam:beam-model-job-management:jar:2.4.0-SNAPSHOT from the shaded 
jar.
2018-03-01T01:04:21.648 [INFO] Excluding 
com.google.protobuf:protobuf-java-util:jar:3.2.0 from the shaded jar.
2018-03-01T01:04:21.648 [INFO] Excluding com.google.code.gson:gson:jar:2.7 from 
the shaded jar.
2018-03-01T01:04:21.648 [INFO] Excluding io.grpc:grpc-core:jar:1.2.0 from the 
shaded jar.
2018-03-01T01:04:21.648 [INFO] Excluding 
com.google.errorprone:error_prone_annotations:jar:2.0.15 from the shaded jar.
2018-03-01T01:04:21.648 [INFO] Excluding io.grpc:grpc-context:jar:1.2.0 from 
the shaded jar.
2018-03-01T01:04:21.648 [INFO] Excluding 
com.google.instrumentation:instrumentation-api:jar:0.3.0 from the shaded jar.
2018-03-01T01:04:21.648 [INFO] Excluding io.grpc:grpc-stub:jar:1.2.0 from the 
shaded jar.
2018-03-01T01:04:21.648 [INFO] Excluding 
org.apache.beam:beam-runners-core-java:jar:2.4.0-SNAPSHOT from the shaded jar.
2018-03-01T01:04:21.648 [INFO] Excluding 
org.apache.beam:beam-model-fn-execution:jar:2.4.0-SNAPSHOT from the shaded jar.
2018-03-01T01:04:21.648 [INFO] Excluding 
org.apache.commons:commons-lang3:jar:3.6 from the shaded jar.
2018-03-01T01:04:21.648 [INFO] Excluding 
com.google.code.findbugs:jsr305:jar:3.0.1 from the shaded jar.
2018-03-01T01:04:21.648 [INFO] Excluding org.objenesis:objenesis:jar:1.0 from 
the shaded jar.
2018-03-01T01:04:21.648 [INFO] Excluding 
com.google.auto.service:auto-service:jar:1.0-rc2 from the shaded jar.
2018-03-01T01:04:21.648 [INFO] Excluding com.google.auto:auto-common:jar:0.3 
from the shaded jar.
2018-03-01T01:04:21.648 [INFO] Excluding io.grpc:grpc-protobuf:jar:1.2.0 from 
the shaded jar.
2018-03-01T01:04:21.648 [INFO] Excluding io.grpc:grpc-protobuf-lite:jar:1.2.0 
from the shaded jar.
2018-03-01T01:04:23.616 [INFO] Replacing 

Build failed in Jenkins: beam_PerformanceTests_Spark #1413

2018-02-28 Thread Apache Jenkins Server
See 


Changes:

[coheigea] Avoid unnecessary autoboxing by replacing Integer/Long.valueOf with

[github] Bump container image tag to fix incompatibility between the SDK and the

[github] Update dependency.py

--
[...truncated 65.77 KB...]
2018-03-01 00:49:18,992 c2fe9935 MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.

2018-03-01 00:49:45,520 c2fe9935 MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-03-01 00:49:54,413 c2fe9935 MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 

  ReturnCode:1,  WallTime:0:08.88s,  CPU:0.53s,  MaxMemory:31060kb 
STDOUT: 

BigQuery error in load operation: Error processing job
'apache-beam-testing:bqjob_r2b3674ed64fb17cc_0161df09545e_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: Upload complete.Waiting on bqjob_r2b3674ed64fb17cc_0161df09545e_1 
... (0s) Current status: RUNNING
  Waiting on 
bqjob_r2b3674ed64fb17cc_0161df09545e_1 ... (0s) Current status: DONE   
2018-03-01 00:49:54,414 c2fe9935 MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.

2018-03-01 00:50:18,762 c2fe9935 MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-03-01 00:50:22,318 c2fe9935 MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 

  ReturnCode:1,  WallTime:0:03.54s,  CPU:0.49s,  MaxMemory:31060kb 
STDOUT: 

BigQuery error in load operation: Error processing job
'apache-beam-testing:bqjob_r212bb7b06e18a1db_0161df09d5f5_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: Upload complete.Waiting on bqjob_r212bb7b06e18a1db_0161df09d5f5_1 
... (0s) Current status: RUNNING
  Waiting on 
bqjob_r212bb7b06e18a1db_0161df09d5f5_1 ... (0s) Current status: DONE   
2018-03-01 00:50:22,318 c2fe9935 MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.

2018-03-01 00:50:39,252 c2fe9935 MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-03-01 00:50:42,808 c2fe9935 MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 

  ReturnCode:1,  WallTime:0:03.54s,  CPU:0.52s,  MaxMemory:31060kb 
STDOUT: 

BigQuery error in load operation: Error processing job
'apache-beam-testing:bqjob_r7cb333d189ccb9bf_0161df0a263d_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: Upload complete.Waiting on bqjob_r7cb333d189ccb9bf_0161df0a263d_1 
... (0s) Current status: RUNNING
  Waiting on 
bqjob_r7cb333d189ccb9bf_0161df0a263d_1 ... (0s) Current status: DONE   
2018-03-01 00:50:42,808 c2fe9935 MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.

2018-03-01 00:51:05,155 c2fe9935 MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-03-01 00:51:09,691 c2fe9935 MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 

  ReturnCode:1,  WallTime:0:04.52s,  CPU:0.49s,  MaxMemory:30824kb 
STDOUT: 

BigQuery error in load operation: Error processing job
'apache-beam-testing:bqjob_ra703ddaa0183488_0161df0a8b20_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: Upload 

[jira] [Assigned] (BEAM-3714) JdbcIO.read() should create a forward-only, read-only result set

2018-02-28 Thread Innocent (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3714?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Innocent reassigned BEAM-3714:
--

Assignee: Innocent  (was: Jean-Baptiste Onofré)

> JdbcIO.read() should create a forward-only, read-only result set
> 
>
> Key: BEAM-3714
> URL: https://issues.apache.org/jira/browse/BEAM-3714
> Project: Beam
>  Issue Type: Bug
>  Components: io-java-jdbc
>Reporter: Eugene Kirpichov
>Assignee: Innocent
>Priority: Major
>
> [https://stackoverflow.com/questions/48784889/streaming-data-from-cloudsql-into-dataflow/48819934#48819934]
>  - a user is trying to load a large table from MySQL, and the MySQL JDBC 
> driver requires special measures when loading large result sets.
> JdbcIO currently just calls "connection.prepareStatement(query)" 
> https://github.com/apache/beam/blob/bb8c12c4956cbe3c6f2e57113e7c0ce2a5c05009/sdks/java/io/jdbc/src/main/java/org/apache/beam/sdk/io/jdbc/JdbcIO.java#L508
>  - it should specify type TYPE_FORWARD_ONLY and concurrency CONCUR_READ_ONLY, 
> and these values should always be used.
> It seems that different databases have different requirements for streaming 
> result sets.
> E.g. MySQL requires setting the fetch size; PostgreSQL says "The Connection 
> must not be in autocommit mode." 
> https://jdbc.postgresql.org/documentation/head/query.html#query-with-cursor . 
> Oracle, I think, doesn't have any special requirements, but I'm not sure; the 
> fetch size should probably still be set to a reasonably large value.
> The common denominator of these requirements seems to be: set the fetch size 
> to a reasonably large but not maximum value, and disable autocommit (there's 
> nothing to commit in read() anyway).



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Build failed in Jenkins: beam_PerformanceTests_Python #969

2018-02-28 Thread Apache Jenkins Server
See 


Changes:

[coheigea] Avoid unnecessary autoboxing by replacing Integer/Long.valueOf with

[github] Bump container image tag to fix incompatibility between the SDK and the

[github] Update dependency.py

--
[...truncated 630 B...]
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 3788a48770da9f48beed96fbbec56cf1beaeb6aa (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 3788a48770da9f48beed96fbbec56cf1beaeb6aa
Commit message: "Bump container image tag to fix incompatibility between the 
SDK and the container."
 > git rev-list --no-walk 7fa6292a21564744011fe94a7e50f7e074564b71 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins2010681853653977920.sh
+ rm -rf PerfKitBenchmarker
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins2369217602760097121.sh
+ rm -rf .env
[beam_PerformanceTests_Python] $ /bin/bash -xe /tmp/jenkins796617262901681607.sh
+ virtualenv .env --system-site-packages
New python executable in 

Installing setuptools, pip, wheel...done.
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins2253171331843572864.sh
+ .env/bin/pip install --upgrade setuptools pip
Requirement already up-to-date: setuptools in ./.env/lib/python2.7/site-packages
Requirement already up-to-date: pip in ./.env/lib/python2.7/site-packages
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins2915317656148962468.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git
Cloning into 'PerfKitBenchmarker'...
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins6555351829290318576.sh
+ .env/bin/pip install -r PerfKitBenchmarker/requirements.txt
Requirement already satisfied: absl-py in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 14))
Requirement already satisfied: jinja2>=2.7 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 15))
Requirement already satisfied: setuptools in ./.env/lib/python2.7/site-packages 
(from -r PerfKitBenchmarker/requirements.txt (line 16))
Requirement already satisfied: colorlog[windows]==2.6.0 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 17))
Requirement already satisfied: blinker>=1.3 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 18))
Requirement already satisfied: futures>=3.0.3 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 19))
Requirement already satisfied: PyYAML==3.12 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 20))
Requirement already satisfied: pint>=0.7 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 21))
Collecting numpy==1.13.3 (from -r PerfKitBenchmarker/requirements.txt (line 22))
:318:
 SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name 
Indication) extension to TLS is not available on this platform. This may cause 
the server to present an incorrect TLS certificate, which can cause validation 
failures. You can upgrade to a newer version of Python to solve this. For more 
information, see 
https://urllib3.readthedocs.io/en/latest/security.html#snimissingwarning.
  SNIMissingWarning
:122:
 InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/security.html#insecureplatformwarning.
  InsecurePlatformWarning
  Using cached numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Requirement already satisfied: 

Jenkins build is unstable: beam_PostCommit_Java_MavenInstall #6086

2018-02-28 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PerformanceTests_JDBC #273

2018-02-28 Thread Apache Jenkins Server
See 


Changes:

[coheigea] Avoid unnecessary autoboxing by replacing Integer/Long.valueOf with

[github] Bump container image tag to fix incompatibility between the SDK and the

[github] Update dependency.py

--
[...truncated 724.63 KB...]
[INFO] Replacing original artifact with shaded artifact.
[INFO] Replacing 

 with 

[INFO] Replacing original test artifact with shaded test artifact.
[INFO] Replacing 

 with 

[INFO] Dependency-reduced POM written at: 

[INFO] 
[INFO] --- maven-failsafe-plugin:2.20.1:integration-test (default) @ 
beam-sdks-java-io-jdbc ---
[INFO] Failsafe report directory: 

[INFO] parallel='all', perCoreThreadCount=true, threadCount=4, 
useUnlimitedThreads=false, threadCountSuites=0, threadCountClasses=0, 
threadCountMethods=0, parallelOptimized=true
[INFO] 
[INFO] ---
[INFO]  T E S T S
[INFO] ---
[INFO] Running org.apache.beam.sdk.io.jdbc.JdbcIOIT
[ERROR] Tests run: 1, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 345.297 
s <<< FAILURE! - in org.apache.beam.sdk.io.jdbc.JdbcIOIT
[ERROR] testWriteThenRead(org.apache.beam.sdk.io.jdbc.JdbcIOIT)  Time elapsed: 
345.297 s  <<< ERROR!
java.lang.RuntimeException: 
(eea6d4dcdc184948): java.lang.RuntimeException: 
org.apache.beam.sdk.util.UserCodeException: org.postgresql.util.PSQLException: 
The connection attempt failed.
at 
com.google.cloud.dataflow.worker.MapTaskExecutorFactory$3.typedApply(MapTaskExecutorFactory.java:404)
at 
com.google.cloud.dataflow.worker.MapTaskExecutorFactory$3.typedApply(MapTaskExecutorFactory.java:374)
at 
com.google.cloud.dataflow.worker.graph.Networks$TypeSafeNodeFunction.apply(Networks.java:63)
at 
com.google.cloud.dataflow.worker.graph.Networks$TypeSafeNodeFunction.apply(Networks.java:50)
at 
com.google.cloud.dataflow.worker.graph.Networks.replaceDirectedNetworkNodes(Networks.java:87)
at 
com.google.cloud.dataflow.worker.MapTaskExecutorFactory.create(MapTaskExecutorFactory.java:158)
at 
com.google.cloud.dataflow.worker.DataflowWorker.doWork(DataflowWorker.java:308)
at 
com.google.cloud.dataflow.worker.DataflowWorker.getAndPerformWork(DataflowWorker.java:264)
at 
com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:133)
at 
com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:113)
at 
com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:100)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.beam.sdk.util.UserCodeException: 
org.postgresql.util.PSQLException: The connection attempt failed.
at 
org.apache.beam.sdk.util.UserCodeException.wrap(UserCodeException.java:36)
at 
org.apache.beam.sdk.io.jdbc.JdbcIO$Write$WriteFn$DoFnInvoker.invokeSetup(Unknown
 Source)
at 
com.google.cloud.dataflow.worker.DoFnInstanceManagers$ConcurrentQueueInstanceManager.deserializeCopy(DoFnInstanceManagers.java:63)
at 
com.google.cloud.dataflow.worker.DoFnInstanceManagers$ConcurrentQueueInstanceManager.peek(DoFnInstanceManagers.java:45)
at 
com.google.cloud.dataflow.worker.UserParDoFnFactory.create(UserParDoFnFactory.java:94)
at 
com.google.cloud.dataflow.worker.DefaultParDoFnFactory.create(DefaultParDoFnFactory.java:74)
at 
com.google.cloud.dataflow.worker.MapTaskExecutorFactory.createParDoOperation(MapTaskExecutorFactory.java:481)
at 
com.google.cloud.dataflow.worker.MapTaskExecutorFactory$3.typedApply(MapTaskExecutorFactory.java:392)
... 14 more
Caused by: 

Jenkins build is back to normal : beam_PostCommit_Python_Verify #4333

2018-02-28 Thread Apache Jenkins Server
See 




Jenkins build is back to normal : beam_PostCommit_Python_ValidatesRunner_Dataflow #1016

2018-02-28 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PostCommit_Java_MavenInstall #6085

2018-02-28 Thread Apache Jenkins Server
See 


--
[...truncated 1.06 MB...]
2018-02-28T23:59:02.264 [INFO] Excluding 
com.google.inject.extensions:guice-servlet:jar:3.0 from the shaded jar.
2018-02-28T23:59:02.264 [INFO] Excluding com.google.inject:guice:jar:3.0 from 
the shaded jar.
2018-02-28T23:59:02.264 [INFO] Excluding javax.inject:javax.inject:jar:1 from 
the shaded jar.
2018-02-28T23:59:02.264 [INFO] Excluding aopalliance:aopalliance:jar:1.0 from 
the shaded jar.
2018-02-28T23:59:02.264 [INFO] Excluding com.sun.jersey:jersey-server:jar:1.9 
from the shaded jar.
2018-02-28T23:59:02.265 [INFO] Excluding asm:asm:jar:3.1 from the shaded jar.
2018-02-28T23:59:02.265 [INFO] Excluding 
com.sun.jersey.contribs:jersey-guice:jar:1.9 from the shaded jar.
2018-02-28T23:59:02.265 [INFO] Excluding jline:jline:jar:2.11 from the shaded 
jar.
2018-02-28T23:59:02.265 [INFO] Excluding org.apache.ant:ant:jar:1.9.2 from the 
shaded jar.
2018-02-28T23:59:02.265 [INFO] Excluding org.apache.ant:ant-launcher:jar:1.9.2 
from the shaded jar.
2018-02-28T23:59:02.265 [INFO] Excluding net.engio:mbassador:jar:1.1.9 from the 
shaded jar.
2018-02-28T23:59:02.265 [INFO] Excluding net.lingala.zip4j:zip4j:jar:1.3.2 from 
the shaded jar.
2018-02-28T23:59:02.265 [INFO] Excluding commons-codec:commons-codec:jar:1.10 
from the shaded jar.
2018-02-28T23:59:02.265 [INFO] Excluding 
org.apache.xbean:xbean-asm5-shaded:jar:4.3 from the shaded jar.
2018-02-28T23:59:02.265 [INFO] Excluding org.jctools:jctools-core:jar:1.1 from 
the shaded jar.
2018-02-28T23:59:02.266 [INFO] Excluding 
org.apache.beam:beam-model-pipeline:jar:2.4.0-SNAPSHOT from the shaded jar.
2018-02-28T23:59:02.266 [INFO] Excluding 
com.google.protobuf:protobuf-java:jar:3.2.0 from the shaded jar.
2018-02-28T23:59:02.266 [INFO] Excluding 
org.apache.beam:beam-sdks-java-core:jar:2.4.0-SNAPSHOT from the shaded jar.
2018-02-28T23:59:02.266 [INFO] Including com.google.guava:guava:jar:20.0 in the 
shaded jar.
2018-02-28T23:59:02.266 [INFO] Excluding 
com.github.stephenc.findbugs:findbugs-annotations:jar:1.3.9-1 from the shaded 
jar.
2018-02-28T23:59:02.266 [INFO] Excluding 
com.fasterxml.jackson.core:jackson-core:jar:2.8.9 from the shaded jar.
2018-02-28T23:59:02.266 [INFO] Excluding 
com.fasterxml.jackson.core:jackson-annotations:jar:2.8.9 from the shaded jar.
2018-02-28T23:59:02.266 [INFO] Excluding 
com.fasterxml.jackson.core:jackson-databind:jar:2.8.9 from the shaded jar.
2018-02-28T23:59:02.267 [INFO] Excluding net.bytebuddy:byte-buddy:jar:1.7.10 
from the shaded jar.
2018-02-28T23:59:02.267 [INFO] Excluding org.apache.avro:avro:jar:1.8.2 from 
the shaded jar.
2018-02-28T23:59:02.267 [INFO] Excluding 
com.thoughtworks.paranamer:paranamer:jar:2.7 from the shaded jar.
2018-02-28T23:59:02.267 [INFO] Excluding org.tukaani:xz:jar:1.5 from the shaded 
jar.
2018-02-28T23:59:02.267 [INFO] Excluding 
org.xerial.snappy:snappy-java:jar:1.1.4 from the shaded jar.
2018-02-28T23:59:02.267 [INFO] Excluding 
org.apache.commons:commons-compress:jar:1.14 from the shaded jar.
2018-02-28T23:59:02.267 [INFO] Excluding 
org.apache.beam:beam-runners-core-construction-java:jar:2.4.0-SNAPSHOT from the 
shaded jar.
2018-02-28T23:59:02.267 [INFO] Excluding 
org.apache.beam:beam-model-job-management:jar:2.4.0-SNAPSHOT from the shaded 
jar.
2018-02-28T23:59:02.268 [INFO] Excluding 
com.google.protobuf:protobuf-java-util:jar:3.2.0 from the shaded jar.
2018-02-28T23:59:02.268 [INFO] Excluding com.google.code.gson:gson:jar:2.7 from 
the shaded jar.
2018-02-28T23:59:02.268 [INFO] Excluding io.grpc:grpc-core:jar:1.2.0 from the 
shaded jar.
2018-02-28T23:59:02.268 [INFO] Excluding 
com.google.errorprone:error_prone_annotations:jar:2.0.15 from the shaded jar.
2018-02-28T23:59:02.268 [INFO] Excluding io.grpc:grpc-context:jar:1.2.0 from 
the shaded jar.
2018-02-28T23:59:02.268 [INFO] Excluding 
com.google.instrumentation:instrumentation-api:jar:0.3.0 from the shaded jar.
2018-02-28T23:59:02.268 [INFO] Excluding io.grpc:grpc-stub:jar:1.2.0 from the 
shaded jar.
2018-02-28T23:59:02.268 [INFO] Excluding 
org.apache.beam:beam-runners-core-java:jar:2.4.0-SNAPSHOT from the shaded jar.
2018-02-28T23:59:02.268 [INFO] Excluding 
org.apache.beam:beam-model-fn-execution:jar:2.4.0-SNAPSHOT from the shaded jar.
2018-02-28T23:59:02.268 [INFO] Excluding 
org.apache.commons:commons-lang3:jar:3.6 from the shaded jar.
2018-02-28T23:59:02.269 [INFO] Excluding 
com.google.code.findbugs:jsr305:jar:3.0.1 from the shaded jar.
2018-02-28T23:59:02.269 [INFO] Excluding org.objenesis:objenesis:jar:1.0 from 
the shaded jar.
2018-02-28T23:59:02.269 [INFO] Excluding 
com.google.auto.service:auto-service:jar:1.0-rc2 from the shaded jar.
2018-02-28T23:59:02.269 [INFO] Excluding com.google.auto:auto-common:jar:0.3 
from the shaded jar.
2018-02-28T23:59:02.269 [INFO] Excluding io.grpc:grpc-protobuf:jar:1.2.0 from 
the shaded jar.
2018-02-28T23:59:02.269 [INFO] 

[jira] [Commented] (BEAM-3289) Add ReadFromBigQuery and several other cleanups of bigquery.py

2018-02-28 Thread Chamikara Jayalath (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-3289?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16381235#comment-16381235
 ] 

Chamikara Jayalath commented on BEAM-3289:
--

"Make WriteToBigQuery to use Write(BigQuery) for batch pipelines so that both 
both batch and streaming users can use that": this was already done in 
https://github.com/apache/beam/pull/3306.

> Add ReadFromBigQuery and several other cleanups of bigquery.py
> --
>
> Key: BEAM-3289
> URL: https://issues.apache.org/jira/browse/BEAM-3289
> Project: Beam
>  Issue Type: Improvement
>  Components: sdk-py-core
>Reporter: Chamikara Jayalath
>Assignee: Chamikara Jayalath
>Priority: Major
>
> We need to do the following cleanups for the Python BigQuery module.
> * Add ReadFromBigQuery that wraps Read(BigQuerySource).
> * Make WriteToBigQuery use Write(BigQuery) for batch pipelines so that both
> batch and streaming users can use it.
> * Update the documentation of WriteToBigQuery



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[beam] 01/01: Bump container image tag to fix incompatibility between the SDK and the container.

2018-02-28 Thread lcwik
This is an automated email from the ASF dual-hosted git repository.

lcwik pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git

commit 3788a48770da9f48beed96fbbec56cf1beaeb6aa
Merge: 4d8cdbd 7577015
Author: Lukasz Cwik 
AuthorDate: Wed Feb 28 15:23:25 2018 -0800

Bump container image tag to fix incompatibility between the SDK and the 
container.

 sdks/python/apache_beam/runners/dataflow/internal/dependency.py | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

-- 
To stop receiving notification emails like this one, please contact
lc...@apache.org.


[beam] branch master updated (4d8cdbd -> 3788a48)

2018-02-28 Thread lcwik
This is an automated email from the ASF dual-hosted git repository.

lcwik pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git.


from 4d8cdbd  Avoid unnecessary autoboxing by replacing 
Integer/Long.valueOf with Integer.parseInt/Long.parseLong
 add 604e99e  Bump container image tag to fix incompatibility between the 
SDK and the container.
 add 7577015  Update dependency.py
 new 3788a48  Bump container image tag to fix incompatibility between the 
SDK and the container.

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 sdks/python/apache_beam/runners/dataflow/internal/dependency.py | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

-- 
To stop receiving notification emails like this one, please contact
lc...@apache.org.


[jira] [Resolved] (BEAM-3569) SpannerIO.write throws on delete mutations

2018-02-28 Thread Oscar Korz (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3569?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Oscar Korz resolved BEAM-3569.
--
Resolution: Fixed

> SpannerIO.write throws on delete mutations
> --
>
> Key: BEAM-3569
> URL: https://issues.apache.org/jira/browse/BEAM-3569
> Project: Beam
>  Issue Type: Bug
>  Components: io-java-gcp
>Affects Versions: 2.2.0
>Reporter: Oscar Korz
>Assignee: Kenneth Knowles
>Priority: Major
> Fix For: 2.3.0
>
> Attachments: beam-spanner-io-delete.tar.gz
>
>
> It is currently impossible to delete a Spanner row in Beam with SpannerIO.
> The exception is generated by trying to estimate the size of a delete
> mutation, which cannot contain any values (deletes are specified by key alone).
> The root exception stack trace:
> {code:java}
>  Caused by: java.lang.IllegalStateException: values() cannot be called for a 
> DELETE mutation
>   at com.google.common.base.Preconditions.checkState(Preconditions.java:456)
>   at com.google.cloud.spanner.Mutation.getValues(Mutation.java:233)
>   at 
> org.apache.beam.sdk.io.gcp.spanner.MutationSizeEstimator.sizeOf(MutationSizeEstimator.java:33)
>   at 
> org.apache.beam.sdk.io.gcp.spanner.MutationSizeEstimator.sizeOf(MutationSizeEstimator.java:51)
> {code}
> I believe this can be fixed by special-casing MutationSizeEstimator.sizeOf to
> return either 0 or 1 for Mutations with getOperation() = Op.DELETE.
> The workaround is to avoid SpannerIO and use the Spanner client API directly
> in a custom DoFn, but this forces users to either reimplement all the
> intelligent batching that SpannerIO does or suffer poor performance.
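
For illustration only, the suggested special case might look roughly like the
sketch below (the per-value sizeOf helper is assumed from the existing
estimator, and the actual fix may differ):

{code:java}
import com.google.cloud.spanner.Mutation;
import com.google.cloud.spanner.Value;

class MutationSizeEstimatorSketch {
  static long sizeOf(Mutation mutation) {
    // values() throws for deletes, which carry only a key, so count a
    // DELETE as a fixed minimal size instead of inspecting its values.
    if (mutation.getOperation() == Mutation.Op.DELETE) {
      return 1L;
    }
    long total = 0;
    for (Value value : mutation.getValues()) {
      total += sizeOf(value);
    }
    return total;
  }

  static long sizeOf(Value value) {
    return 8L; // placeholder; the real estimator is type-dependent
  }
}
{code}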



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (BEAM-3569) SpannerIO.write throws on delete mutations

2018-02-28 Thread Oscar Korz (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3569?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Oscar Korz updated BEAM-3569:
-
Fix Version/s: 2.3.0

Looks like this was fixed in 2.3.0, thanks.

https://github.com/apache/beam/blob/release-2.3.0/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/spanner/MutationSizeEstimator.java

> SpannerIO.write throws on delete mutations
> --
>
> Key: BEAM-3569
> URL: https://issues.apache.org/jira/browse/BEAM-3569
> Project: Beam
>  Issue Type: Bug
>  Components: io-java-gcp
>Affects Versions: 2.2.0
>Reporter: Oscar Korz
>Assignee: Kenneth Knowles
>Priority: Major
> Fix For: 2.3.0
>
> Attachments: beam-spanner-io-delete.tar.gz
>
>
> It is currently impossible to delete a Spanner row in Beam with SpannerIO.
> The exception is generated by trying to estimate the size of a delete
> mutation, which cannot contain any values (deletes are specified by key alone).
> The root exception stack trace:
> {code:java}
>  Caused by: java.lang.IllegalStateException: values() cannot be called for a 
> DELETE mutation
>   at com.google.common.base.Preconditions.checkState(Preconditions.java:456)
>   at com.google.cloud.spanner.Mutation.getValues(Mutation.java:233)
>   at 
> org.apache.beam.sdk.io.gcp.spanner.MutationSizeEstimator.sizeOf(MutationSizeEstimator.java:33)
>   at 
> org.apache.beam.sdk.io.gcp.spanner.MutationSizeEstimator.sizeOf(MutationSizeEstimator.java:51)
> {code}
> I believe this can be fixed by special-casing MutationSizeEstimator.sizeOf to
> return either 0 or 1 for Mutations with getOperation() = Op.DELETE.
> The workaround is to avoid SpannerIO and use the Spanner client API directly
> in a custom DoFn, but this forces users to either reimplement all the
> intelligent batching that SpannerIO does or suffer poor performance.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Build failed in Jenkins: beam_PostCommit_Python_ValidatesRunner_Dataflow #1015

2018-02-28 Thread Apache Jenkins Server
See 


--
[...truncated 123.24 KB...]
  File "/usr/lib/python2.7/pickle.py", line 649, in save_dict
self._batch_setitems(obj.iteritems())
  File "/usr/lib/python2.7/pickle.py", line 681, in _batch_setitems
save(v)
  File "/usr/lib/python2.7/pickle.py", line 286, in save
f(self, obj) # Call unbound method with explicit self
  File 
"
 line 1311, in save_function
obj.__dict__), obj=obj)
  File "/usr/lib/python2.7/pickle.py", line 401, in save_reduce
save(args)
  File "/usr/lib/python2.7/pickle.py", line 286, in save
f(self, obj) # Call unbound method with explicit self
  File "/usr/lib/python2.7/pickle.py", line 562, in save_tuple
save(element)
  File "/usr/lib/python2.7/pickle.py", line 286, in save
f(self, obj) # Call unbound method with explicit self
  File 
"
 line 807, in save_code
pickler.save_reduce(CodeType, args, obj=obj)
  File "/usr/lib/python2.7/pickle.py", line 401, in save_reduce
save(args)
  File "/usr/lib/python2.7/pickle.py", line 286, in save
f(self, obj) # Call unbound method with explicit self
  File "/usr/lib/python2.7/pickle.py", line 581, in save_tuple
self.memoize(obj)
  File "/usr/lib/python2.7/pickle.py", line 246, in memoize
self.write(self.put(memo_len))
  File "/usr/lib/python2.7/pickle.py", line 253, in put
return BINPUT + chr(i)
  File 
"
 line 276, in signalhandler
raise TimedOutException()
TimedOutException: 'test_iterable_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest)'

==
ERROR: test_multi_valued_singleton_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest)
--
Traceback (most recent call last):
  File 
"
 line 812, in run
test(orig)
  File 
"
 line 45, in __call__
return self.run(*arg, **kwarg)
  File 
"
 line 133, in run
self.runTest(result)
  File 
"
 line 151, in runTest
test(result)
  File "/usr/lib/python2.7/unittest/case.py", line 395, in __call__
return self.run(*args, **kwds)
  File "/usr/lib/python2.7/unittest/case.py", line 331, in run
testMethod()
  File 
"
 line 157, in test_multi_valued_singleton_side_input
pipeline.run()
  File 
"
 line 102, in run
result = super(TestPipeline, self).run()
  File 
"
 line 369, in run
self.to_runner_api(), self.runner, self._options).run(False)
  File 
"
 line 382, in run
return self.runner.run_pipeline(self)
  File 
"
 line 308, in run_pipeline
super(DataflowRunner, self).run_pipeline(pipeline)
  File 
"
 line 157, in run_pipeline
pipeline.visit(RunVisitor(self))
  File 
"
 line 410, in visit
self._root_transform().visit(visitor, self, visited)
  File 
"
 line 764, in 

[jira] [Commented] (BEAM-3611) Split KafkaIO.java into smaller files

2018-02-28 Thread Chamikara Jayalath (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-3611?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16381160#comment-16381160
 ] 

Chamikara Jayalath commented on BEAM-3611:
--

Was this fixed by [https://github.com/apache/beam/pull/4586]?

> Split KafkaIO.java into smaller files
> -
>
> Key: BEAM-3611
> URL: https://issues.apache.org/jira/browse/BEAM-3611
> Project: Beam
>  Issue Type: Improvement
>  Components: io-java-kafka
>Reporter: Raghu Angadi
>Assignee: Raghu Angadi
>Priority: Minor
> Fix For: 2.4.0
>
>  Time Spent: 10m
>  Remaining Estimate: 0h
>
> KafkaIO.java has grown too big and includes both source and sink 
> implementation. Better to move these to own files. 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (BEAM-3689) Direct runner leaks a reader for every 10 input records

2018-02-28 Thread Chamikara Jayalath (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-3689?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16381157#comment-16381157
 ] 

Chamikara Jayalath commented on BEAM-3689:
--

Looks like the linked PR was merged. Can this be closed?

> Direct runner leaks a reader for every 10 input records
> --
>
> Key: BEAM-3689
> URL: https://issues.apache.org/jira/browse/BEAM-3689
> Project: Beam
>  Issue Type: Improvement
>  Components: runner-direct
>Reporter: Raghu Angadi
>Assignee: Raghu Angadi
>Priority: Major
> Fix For: 2.4.0
>
>  Time Spent: 2h 40m
>  Remaining Estimate: 0h
>
> The direct runner reads 10 records at a time from a reader. I think the
> intention is to reuse the reader, but it is reused only if the reader is idle
> initially, not when the source has messages available.
> When I was testing KafkaIO with the direct runner, it kept opening a new
> reader for every 10 records and soon ran out of file descriptors.
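
As a purely hypothetical sketch of the reuse the description asks for (the
class and method names are invented; this is not the direct runner's actual
code), an evaluator could hold readers in a cache instead of recreating one
per batch:

{code:java}
import java.io.IOException;
import java.util.HashMap;
import java.util.Map;
import org.apache.beam.sdk.io.UnboundedSource;
import org.apache.beam.sdk.options.PipelineOptions;

class ReaderCacheSketch<T> {
  private final Map<UnboundedSource<T, ?>, UnboundedSource.UnboundedReader<T>> cache =
      new HashMap<>();

  UnboundedSource.UnboundedReader<T> acquire(
      UnboundedSource<T, ?> source, PipelineOptions options) throws IOException {
    UnboundedSource.UnboundedReader<T> cached = cache.remove(source);
    // Reuse the previously opened reader when one exists; only open a new
    // reader (and a new connection / file descriptor) when none is cached.
    return cached != null ? cached : source.createReader(options, null);
  }

  void release(UnboundedSource<T, ?> source, UnboundedSource.UnboundedReader<T> reader) {
    cache.put(source, reader); // keep it open for the next batch of records
  }
}
{code}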



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Assigned] (BEAM-3763) Add per-transform documentation to the website

2018-02-28 Thread Melissa Pashniak (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3763?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Melissa Pashniak reassigned BEAM-3763:
--

Assignee: Rafael Fernandez  (was: Melissa Pashniak)

> Add per-transform documentation to the website
> --
>
> Key: BEAM-3763
> URL: https://issues.apache.org/jira/browse/BEAM-3763
> Project: Beam
>  Issue Type: Task
>  Components: website
>Reporter: Rafael Fernandez
>Assignee: Rafael Fernandez
>Priority: Minor
>  Labels: easyfix
>
> Add structure to the website to incrementally document per-transform 
> definitions and examples. The idea is to incrementally populate this section 
> and clean up stale javadoc entries which have unworkable / outdated examples.
>  
> This task tracks creating the right structure for the website. Each transform
> cleanup/documentation will come with its own JIRA, so that other members of
> the community can pick up outstanding work.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Assigned] (BEAM-3763) Add per-transform documentation to the website

2018-02-28 Thread Melissa Pashniak (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3763?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Melissa Pashniak reassigned BEAM-3763:
--

Assignee: Melissa Pashniak  (was: Reuven Lax)

> Add per-transform documentation to the website
> --
>
> Key: BEAM-3763
> URL: https://issues.apache.org/jira/browse/BEAM-3763
> Project: Beam
>  Issue Type: Task
>  Components: website
>Reporter: Rafael Fernandez
>Assignee: Melissa Pashniak
>Priority: Minor
>  Labels: easyfix
>
> Add structure to the website to incrementally document per-transform 
> definitions and examples. The idea is to incrementally populate this section 
> and clean up stale javadoc entries which have unworkable / outdated examples.
>  
> This task tracks creating the right structure for the website. Each transform
> cleanup/documentation will come with its own JIRA, so that other members of
> the community can pick up outstanding work.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Assigned] (BEAM-3763) Add per-transform documentation to the website

2018-02-28 Thread Melissa Pashniak (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3763?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Melissa Pashniak reassigned BEAM-3763:
--

Assignee: Reuven Lax  (was: Melissa Pashniak)

> Add per-transform documentation to the website
> --
>
> Key: BEAM-3763
> URL: https://issues.apache.org/jira/browse/BEAM-3763
> Project: Beam
>  Issue Type: Task
>  Components: website
>Reporter: Rafael Fernandez
>Assignee: Reuven Lax
>Priority: Minor
>  Labels: easyfix
>
> Add structure to the website to incrementally document per-transform 
> definitions and examples. The idea is to incrementally populate this section 
> and clean up stale javadoc entries which have unworkable / outdated examples.
>  
> This task tracks creating the right structure for the website. Each transform
> cleanup/documentation will come with its own JIRA, so that other members of
> the community can pick up outstanding work.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Assigned] (BEAM-3763) Add per-transform documentation to the website

2018-02-28 Thread Melissa Pashniak (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3763?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Melissa Pashniak reassigned BEAM-3763:
--

Assignee: Melissa Pashniak  (was: Reuven Lax)

> Add per-transform documentation to the website
> --
>
> Key: BEAM-3763
> URL: https://issues.apache.org/jira/browse/BEAM-3763
> Project: Beam
>  Issue Type: Task
>  Components: website
>Reporter: Rafael Fernandez
>Assignee: Melissa Pashniak
>Priority: Minor
>  Labels: easyfix
>
> Add structure to the website to incrementally document per-transform 
> definitions and examples. The idea is to incrementally populate this section 
> and clean up stale javadoc entries which have unworkable / outdated examples.
>  
> This task tracks creating the right structure for the website. Each transform
> cleanup/documentation will come with its own JIRA, so that other members of
> the community can pick up outstanding work.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Build failed in Jenkins: beam_PostCommit_Python_Verify #4332

2018-02-28 Thread Apache Jenkins Server
See 


--
[...truncated 1.02 MB...]
copying 
apache_beam/runners/dataflow/internal/clients/dataflow/message_matchers.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/internal/clients/dataflow
copying 
apache_beam/runners/dataflow/internal/clients/dataflow/message_matchers_test.py 
-> apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/internal/clients/dataflow
copying apache_beam/runners/dataflow/native_io/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/native_io
copying apache_beam/runners/dataflow/native_io/iobase.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/native_io
copying apache_beam/runners/dataflow/native_io/iobase_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/native_io
copying apache_beam/runners/dataflow/native_io/streaming_create.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/native_io
copying apache_beam/runners/direct/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/bundle_factory.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/clock.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/consumer_tracking_pipeline_visitor.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/consumer_tracking_pipeline_visitor_test.py 
-> apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/direct_metrics.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/direct_metrics_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/direct_runner.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/direct_runner_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/evaluation_context.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/executor.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/helper_transforms.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/sdf_direct_runner.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/sdf_direct_runner_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/transform_evaluator.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/util.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/watermark_manager.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/experimental/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/experimental
copying apache_beam/runners/experimental/python_rpc_direct/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/experimental/python_rpc_direct
copying 
apache_beam/runners/experimental/python_rpc_direct/python_rpc_direct_runner.py 
-> apache-beam-2.4.0.dev0/apache_beam/runners/experimental/python_rpc_direct
copying apache_beam/runners/experimental/python_rpc_direct/server.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/experimental/python_rpc_direct
copying apache_beam/runners/job/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/job
copying apache_beam/runners/job/manager.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/job
copying apache_beam/runners/job/utils.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/job
copying apache_beam/runners/portability/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/fn_api_runner.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/fn_api_runner_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/maptask_executor_runner.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/maptask_executor_runner_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/universal_local_runner.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/universal_local_runner_main.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/universal_local_runner_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/test/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/test
copying apache_beam/runners/worker/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/bundle_processor.py -> 

[jira] [Created] (BEAM-3763) Add per-transform documentation to the website

2018-02-28 Thread Rafael Fernandez (JIRA)
Rafael Fernandez created BEAM-3763:
--

 Summary: Add per-transform documentation to the website
 Key: BEAM-3763
 URL: https://issues.apache.org/jira/browse/BEAM-3763
 Project: Beam
  Issue Type: Task
  Components: website
Reporter: Rafael Fernandez
Assignee: Reuven Lax


Add structure to the website to incrementally document per-transform 
definitions and examples. The idea is to incrementally populate this section 
and clean up stale javadoc entries which have unworkable / outdated examples.

 

This task tracks creating the right structure for the website. Each transform
cleanup/documentation will come with its own JIRA, so that other members of the
community can pick up outstanding work.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Assigned] (BEAM-3760) core-construction-java NeedsRunner Tests are not executed

2018-02-28 Thread Luke Cwik (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3760?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Luke Cwik reassigned BEAM-3760:
---

Assignee: Ben Sidhom  (was: Thomas Groh)

> core-construction-java NeedsRunner Tests are not executed
> -
>
> Key: BEAM-3760
> URL: https://issues.apache.org/jira/browse/BEAM-3760
> Project: Beam
>  Issue Type: Bug
>  Components: runner-core, runner-direct
>Reporter: Thomas Groh
>Assignee: Ben Sidhom
>Priority: Blocker
> Fix For: Not applicable
>
>  Time Spent: 1h 20m
>  Remaining Estimate: 0h
>
> The core-construction-java dependency isn't scanned.
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Jenkins build is back to normal : beam_PostCommit_Python_ValidatesRunner_Dataflow #1013

2018-02-28 Thread Apache Jenkins Server
See 




[jira] [Commented] (BEAM-3749) support customized trigger/accumulationMode in BeamSql

2018-02-28 Thread Xu Mingmin (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-3749?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16380953#comment-16380953
 ] 

Xu Mingmin commented on BEAM-3749:
--

{{Window.triggering(trigger).discardingFiredPanes().withAllowedLateness()}} is
a good idea, but based on my applications it's not enough; for example,
multiple queries may share the same source table.

I would submit a PR, and we should make it compatible with
{{Window.triggering(trigger)...}} when that works in the future.
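
Usage might look like the fragment below (hypothetical: {{withTrigger}} and
{{withAccumulationMode}} are the proposed options, not a released API, and the
query and field names are invented):

{code:java}
// Hypothetical fragment; assumes the usual BeamSql, trigger, and windowing
// imports, and that input is a PCollection<BeamRecord> with timestamps.
PCollection<BeamRecord> result =
    input.apply(
        BeamSql.query("SELECT f_key, COUNT(*) FROM PCOLLECTION GROUP BY f_key")
            .withTrigger(
                Repeatedly.forever(AfterProcessingTime.pastFirstElementInPane()))
            .withAccumulationMode(AccumulationMode.DISCARDING_FIRED_PANES));
{code}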

 

> support customized trigger/accumulationMode in BeamSql
> --
>
> Key: BEAM-3749
> URL: https://issues.apache.org/jira/browse/BEAM-3749
> Project: Beam
>  Issue Type: Improvement
>  Components: dsl-sql
>Reporter: Xu Mingmin
>Assignee: Xu Mingmin
>Priority: Major
> Fix For: 2.4.0
>
>
> Currently BeamSql uses {{DefaultTrigger}} for aggregation operations. 
> By adding two options {{withTrigger(Trigger)}} and 
> {{withAccumulationMode(AccumulationMode)}}, developers can specify their own 
> aggregation strategies with BeamSql.
> [~xumingming] [~kedin] [~kenn] for any comments.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Build failed in Jenkins: beam_PostCommit_Python_ValidatesRunner_Dataflow #1012

2018-02-28 Thread Apache Jenkins Server
See 


Changes:

[coheigea] Avoid unnecessary autoboxing by replacing Integer/Long.valueOf with

--
[...truncated 123.66 KB...]
  File 
"
 line 45, in __call__
return self.run(*arg, **kwarg)
  File 
"
 line 133, in run
self.runTest(result)
  File 
"
 line 151, in runTest
test(result)
  File "/usr/lib/python2.7/unittest/case.py", line 395, in __call__
return self.run(*args, **kwds)
  File "/usr/lib/python2.7/unittest/case.py", line 331, in run
testMethod()
  File 
"
 line 178, in test_iterable_side_input
pipeline.run()
  File 
"
 line 102, in run
result = super(TestPipeline, self).run()
  File 
"
 line 367, in run
if test_runner_api and self._verify_runner_api_compatible():
  File 
"
 line 570, in _verify_runner_api_compatible
self.visit(Visitor())
  File 
"
 line 410, in visit
self._root_transform().visit(visitor, self, visited)
  File 
"
 line 764, in visit
part.visit(visitor, pipeline, visited)
  File 
"
 line 764, in visit
part.visit(visitor, pipeline, visited)
  File 
"
 line 764, in visit
part.visit(visitor, pipeline, visited)
  File 
"
 line 767, in visit
visitor.visit_transform(self)
  File 
"
 line 561, in visit_transform
enable_trace=False),
  File 
"
 line 193, in dumps
s = dill.dumps(o)
  File 
"
 line 259, in dumps
dump(obj, file, protocol, byref, fmode, recurse)#, strictio)
  File 
"
 line 252, in dump
pik.dump(obj)
  File "/usr/lib/python2.7/pickle.py", line 224, in dump
self.save(obj)
  File "/usr/lib/python2.7/pickle.py", line 331, in save
self.save_reduce(obj=obj, *rv)
  File "/usr/lib/python2.7/pickle.py", line 419, in save_reduce
save(state)
  File "/usr/lib/python2.7/pickle.py", line 286, in save
f(self, obj) # Call unbound method with explicit self
  File 
"
 line 165, in new_save_module_dict
return old_save_module_dict(pickler, obj)
  File 
"
 line 841, in save_module_dict
StockPickler.save_dict(pickler, obj)
  File "/usr/lib/python2.7/pickle.py", line 649, in save_dict
self._batch_setitems(obj.iteritems())
  File "/usr/lib/python2.7/pickle.py", line 681, in _batch_setitems
save(v)
  File "/usr/lib/python2.7/pickle.py", line 331, in save
self.save_reduce(obj=obj, *rv)
  File "/usr/lib/python2.7/pickle.py", line 419, in save_reduce
save(state)
  File "/usr/lib/python2.7/pickle.py", line 286, in save
f(self, obj) # Call unbound method with explicit self
  File 

Build failed in Jenkins: beam_PostCommit_Python_Verify #4331

2018-02-28 Thread Apache Jenkins Server
See 


Changes:

[coheigea] Avoid unnecessary autoboxing by replacing Integer/Long.valueOf with

--
[...truncated 1.02 MB...]
copying 
apache_beam/runners/dataflow/internal/clients/dataflow/message_matchers.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/internal/clients/dataflow
copying 
apache_beam/runners/dataflow/internal/clients/dataflow/message_matchers_test.py 
-> apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/internal/clients/dataflow
copying apache_beam/runners/dataflow/native_io/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/native_io
copying apache_beam/runners/dataflow/native_io/iobase.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/native_io
copying apache_beam/runners/dataflow/native_io/iobase_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/native_io
copying apache_beam/runners/dataflow/native_io/streaming_create.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/native_io
copying apache_beam/runners/direct/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/bundle_factory.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/clock.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/consumer_tracking_pipeline_visitor.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/consumer_tracking_pipeline_visitor_test.py 
-> apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/direct_metrics.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/direct_metrics_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/direct_runner.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/direct_runner_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/evaluation_context.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/executor.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/helper_transforms.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/sdf_direct_runner.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/sdf_direct_runner_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/transform_evaluator.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/util.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/watermark_manager.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/experimental/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/experimental
copying apache_beam/runners/experimental/python_rpc_direct/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/experimental/python_rpc_direct
copying 
apache_beam/runners/experimental/python_rpc_direct/python_rpc_direct_runner.py 
-> apache-beam-2.4.0.dev0/apache_beam/runners/experimental/python_rpc_direct
copying apache_beam/runners/experimental/python_rpc_direct/server.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/experimental/python_rpc_direct
copying apache_beam/runners/job/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/job
copying apache_beam/runners/job/manager.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/job
copying apache_beam/runners/job/utils.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/job
copying apache_beam/runners/portability/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/fn_api_runner.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/fn_api_runner_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/maptask_executor_runner.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/maptask_executor_runner_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/universal_local_runner.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/universal_local_runner_main.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/universal_local_runner_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/test/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/test
copying apache_beam/runners/worker/__init__.py -> 

[jira] [Created] (BEAM-3762) Update Dataflow worker image to support unlimited JCE policy

2018-02-28 Thread Luke Cwik (JIRA)
Luke Cwik created BEAM-3762:
---

 Summary: Update Dataflow worker image to support unlimited JCE 
policy
 Key: BEAM-3762
 URL: https://issues.apache.org/jira/browse/BEAM-3762
 Project: Beam
  Issue Type: Improvement
  Components: runner-dataflow
Reporter: Luke Cwik
Assignee: Luke Cwik
 Fix For: 2.4.0






--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (BEAM-3749) support customized trigger/accumulationMode in BeamSql

2018-02-28 Thread Kenneth Knowles (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-3749?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16380811#comment-16380811
 ] 

Kenneth Knowles commented on BEAM-3749:
---

I realize a problem, though: you do need to add the trigger after the read, and
the way IO is done in the pure SQL CLI means the triggering has to be added
after the IO, yet it is part of the query. So we might actually need a change
like the one you suggest, but one that sets up triggering on the table scan
relnodes.

> support customized trigger/accumulationMode in BeamSql
> --
>
> Key: BEAM-3749
> URL: https://issues.apache.org/jira/browse/BEAM-3749
> Project: Beam
>  Issue Type: Improvement
>  Components: dsl-sql
>Reporter: Xu Mingmin
>Assignee: Xu Mingmin
>Priority: Major
> Fix For: 2.4.0
>
>
> Currently BeamSql uses {{DefaultTrigger}} for aggregation operations. 
> By adding two options {{withTrigger(Trigger)}} and 
> {{withAccumulationMode(AccumulationMode)}}, developers can specify their own 
> aggregation strategies with BeamSql.
> [~xumingming] [~kedin] [~kenn] for any comments.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (BEAM-3749) support customized trigger/accumulationMode in BeamSql

2018-02-28 Thread Kenneth Knowles (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-3749?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16380809#comment-16380809
 ] 

Kenneth Knowles commented on BEAM-3749:
---

Hmm, the first bit seems like a bug. You should be able to set just the
triggering.

For the second thing: if you just say `Window.into(FixedWindows.of(...))` and
do not specify triggering, it will use whatever the upstream trigger was (i.e.,
leave it alone).
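
A minimal fragment of that second point (assuming an input PCollection of
KV<String, Integer> and the usual Beam windowing and Joda-Time imports):

{code:java}
// Replaces only the window fn; the upstream trigger, accumulation mode, and
// allowed lateness are left exactly as they were.
PCollection<KV<String, Integer>> windowed =
    input.apply(
        Window.<KV<String, Integer>>into(
            FixedWindows.of(Duration.standardMinutes(5))));
{code}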

> support customized trigger/accumulationMode in BeamSql
> --
>
> Key: BEAM-3749
> URL: https://issues.apache.org/jira/browse/BEAM-3749
> Project: Beam
>  Issue Type: Improvement
>  Components: dsl-sql
>Reporter: Xu Mingmin
>Assignee: Xu Mingmin
>Priority: Major
> Fix For: 2.4.0
>
>
> Currently BeamSql uses {{DefaultTrigger}} for aggregation operations. 
> By adding two options {{withTrigger(Trigger)}} and 
> {{withAccumulationMode(AccumulationMode)}}, developers can specify their own 
> aggregation strategies with BeamSql.
> [~xumingming] [~kedin] [~kenn] for any comments.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[beam-site] branch mergebot updated (6764c8c -> 98b9ea3)

2018-02-28 Thread mergebot-role
This is an automated email from the ASF dual-hosted git repository.

mergebot-role pushed a change to branch mergebot
in repository https://gitbox.apache.org/repos/asf/beam-site.git.


 discard 6764c8c  This closes #388
 discard 515cd4e  Explicitly define section id due to kramdown id generation 
changes
 discard 1a192bd  Update Gemfile.lock
 new a6756dc  Update Gemfile.lock
 new b47e381  Explicitly define section id due to kramdown id generation 
changes
 new 98b9ea3  This closes #388

This update added new revisions after undoing existing revisions.
That is to say, some revisions that were in the old version of the
branch are not in the new version.  This situation occurs
when a user --force pushes a change and generates a repository
containing something like this:

 * -- * -- B -- O -- O -- O   (6764c8c)
\
 N -- N -- N   refs/heads/mergebot (98b9ea3)

You should already have received notification emails for all of the O
revisions, and so the following emails describe only the N revisions
from the common base, B.

Any revisions marked "omit" are not gone; other references still
refer to them.  Any revisions marked "discard" are gone forever.

The 3 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:

-- 
To stop receiving notification emails like this one, please contact
mergebot-r...@apache.org.


[beam-site] 01/03: Update Gemfile.lock

2018-02-28 Thread mergebot-role
This is an automated email from the ASF dual-hosted git repository.

mergebot-role pushed a commit to branch mergebot
in repository https://gitbox.apache.org/repos/asf/beam-site.git

commit a6756dcddc932a2e5b50e919b8c6ece47ca89ed6
Author: Kenneth Knowles 
AuthorDate: Wed Feb 14 11:36:59 2018 -0800

Update Gemfile.lock
---
 Gemfile.lock | 78 +---
 1 file changed, 43 insertions(+), 35 deletions(-)

diff --git a/Gemfile.lock b/Gemfile.lock
index 1ab575d..f3ca8ed 100644
--- a/Gemfile.lock
+++ b/Gemfile.lock
@@ -1,29 +1,31 @@
 GEM
   remote: https://rubygems.org/
   specs:
-activesupport (4.2.7.1)
+activesupport (4.2.10)
   i18n (~> 0.7)
-  json (~> 1.7, >= 1.7.7)
   minitest (~> 5.1)
   thread_safe (~> 0.3, >= 0.3.4)
   tzinfo (~> 1.1)
-addressable (2.4.0)
+addressable (2.5.2)
+  public_suffix (>= 2.0.2, < 4.0)
 colorator (1.1.0)
-colored (1.2)
-ethon (0.9.1)
+colorize (0.8.1)
+concurrent-ruby (1.0.5)
+ethon (0.11.0)
   ffi (>= 1.3.0)
-ffi (1.9.14)
+ffi (1.9.21)
 forwardable-extended (2.6.0)
-html-proofer (3.3.1)
+html-proofer (3.8.0)
   activesupport (>= 4.2, < 6.0)
   addressable (~> 2.3)
-  colored (~> 1.2)
+  colorize (~> 0.8)
   mercenary (~> 0.3.2)
-  nokogiri (~> 1.5)
+  nokogiri (~> 1.8.1)
   parallel (~> 1.3)
-  typhoeus (~> 0.7)
+  typhoeus (~> 1.3)
   yell (~> 2.0)
-i18n (0.7.0)
+i18n (0.9.5)
+  concurrent-ruby (~> 1.0)
 jekyll (3.2.0)
   colorator (~> 1.0)
   jekyll-sass-converter (~> 1.0)
@@ -36,40 +38,46 @@ GEM
   safe_yaml (~> 1.0)
 jekyll-redirect-from (0.11.0)
   jekyll (>= 2.0)
-jekyll-sass-converter (1.4.0)
+jekyll-sass-converter (1.5.2)
   sass (~> 3.4)
-jekyll-watch (1.5.0)
-  listen (~> 3.0, < 3.1)
-jekyll_github_sample (0.3.0)
+jekyll-watch (1.5.1)
+  listen (~> 3.0)
+jekyll_github_sample (0.3.1)
   activesupport (~> 4.0)
   jekyll (~> 3.0)
-json (1.8.3)
-kramdown (1.12.0)
+kramdown (1.16.2)
 liquid (3.0.6)
-listen (3.0.8)
+listen (3.1.5)
   rb-fsevent (~> 0.9, >= 0.9.4)
   rb-inotify (~> 0.9, >= 0.9.7)
+  ruby_dep (~> 1.2)
 mercenary (0.3.6)
-mini_portile2 (2.1.0)
-minitest (5.9.1)
-nokogiri (1.6.8.1)
-  mini_portile2 (~> 2.1.0)
-parallel (1.9.0)
-pathutil (0.14.0)
+mini_portile2 (2.3.0)
+minitest (5.11.3)
+nokogiri (1.8.2)
+  mini_portile2 (~> 2.3.0)
+parallel (1.12.1)
+pathutil (0.16.1)
   forwardable-extended (~> 2.6)
-rake (11.3.0)
-rb-fsevent (0.9.7)
-rb-inotify (0.9.7)
-  ffi (>= 0.5.0)
+public_suffix (3.0.2)
+rake (12.3.0)
+rb-fsevent (0.10.2)
+rb-inotify (0.9.10)
+  ffi (>= 0.5.0, < 2)
 rouge (1.11.1)
+ruby_dep (1.5.0)
 safe_yaml (1.0.4)
-sass (3.4.22)
-thread_safe (0.3.5)
-typhoeus (0.8.0)
-  ethon (>= 0.8.0)
-tzinfo (1.2.2)
+sass (3.5.5)
+  sass-listen (~> 4.0.0)
+sass-listen (4.0.0)
+  rb-fsevent (~> 0.9, >= 0.9.4)
+  rb-inotify (~> 0.9, >= 0.9.7)
+thread_safe (0.3.6)
+typhoeus (1.3.0)
+  ethon (>= 0.9.0)
+tzinfo (1.2.5)
   thread_safe (~> 0.1)
-yell (2.0.6)
+yell (2.0.7)
 
 PLATFORMS
   ruby
@@ -84,4 +92,4 @@ DEPENDENCIES
   rake
 
 BUNDLED WITH
-   1.13.7
+   1.16.0

-- 
To stop receiving notification emails like this one, please contact
mergebot-r...@apache.org.


[beam-site] 03/03: This closes #388

2018-02-28 Thread mergebot-role
This is an automated email from the ASF dual-hosted git repository.

mergebot-role pushed a commit to branch mergebot
in repository https://gitbox.apache.org/repos/asf/beam-site.git

commit 98b9ea3c5c51d83d35db13b452adef87f73f0066
Merge: 1333c13 b47e381
Author: Mergebot 
AuthorDate: Wed Feb 28 10:38:21 2018 -0800

This closes #388

 Gemfile.lock   |  78 +++--
 src/documentation/programming-guide.md | 195 -
 2 files changed, 138 insertions(+), 135 deletions(-)

-- 
To stop receiving notification emails like this one, please contact
mergebot-r...@apache.org.


[beam-site] 02/03: Explicitly define section id due to kramdown id generation changes

2018-02-28 Thread mergebot-role
This is an automated email from the ASF dual-hosted git repository.

mergebot-role pushed a commit to branch mergebot
in repository https://gitbox.apache.org/repos/asf/beam-site.git

commit b47e381f27ea393f0034ca6fdecab0247ea7b100
Author: melissa 
AuthorDate: Tue Feb 20 14:18:14 2018 -0800

Explicitly define section id due to kramdown id generation changes
---
 src/documentation/programming-guide.md | 195 -
 1 file changed, 95 insertions(+), 100 deletions(-)

diff --git a/src/documentation/programming-guide.md 
b/src/documentation/programming-guide.md
index 7f6aea5..6b86743 100644
--- a/src/documentation/programming-guide.md
+++ b/src/documentation/programming-guide.md
@@ -26,12 +26,7 @@ how to implement Beam concepts in your pipelines.
   
 
 
-**Table of Contents:**
-* TOC
-{:toc}
-
-
-## 1. Overview
+## 1. Overview {#overview}
 
 To use Beam, you need to first create a driver program using the classes in one
 of the Beam SDKs. Your driver program *defines* your pipeline, including all of
@@ -94,7 +89,7 @@ objects you've created and transforms that you've applied. 
That graph is then
 executed using the appropriate distributed processing back-end, becoming an
 asynchronous "job" (or equivalent) on that back-end.
 
-## 2. Creating a pipeline
+## 2. Creating a pipeline {#creating-a-pipeline}
 
 The `Pipeline` abstraction encapsulates all the data and steps in your data
 processing task. Your Beam driver program typically starts by constructing a
@@ -122,7 +117,7 @@ Pipeline p = Pipeline.create(options);
 %}
 ```
 
-### 2.1. Configuring pipeline options
+### 2.1. Configuring pipeline options {#configuring-pipeline-options}
 
 Use the pipeline options to configure different aspects of your pipeline, such
 as the pipeline runner that will execute your pipeline and any runner-specific
@@ -134,7 +129,7 @@ When you run the pipeline on a runner of your choice, a 
copy of the
 PipelineOptions will be available to your code. For example, you can read
 PipelineOptions from a DoFn's Context.
 
- 2.1.1. Setting PipelineOptions from command-line arguments
+ 2.1.1. Setting PipelineOptions from command-line arguments 
{#pipeline-options-cli}
 
 While you can configure your pipeline by creating a `PipelineOptions` object 
and
 setting the fields directly, the Beam SDKs include a command-line parser that
@@ -167,7 +162,7 @@ a command-line argument.
 > demonstrates how to set pipeline options at runtime by using command-line
 > options.
 
- 2.1.2. Creating custom options
+ 2.1.2. Creating custom options {#creating-custom-options}
 
 You can add your own custom options in addition to the standard
 `PipelineOptions`. To add your own options, define an interface with getter and
@@ -223,7 +218,7 @@ MyOptions options = PipelineOptionsFactory.fromArgs(args)
 
 Now your pipeline can accept `--myCustomOption=value` as a command-line 
argument.
 
-## 3. PCollections
+## 3. PCollections {#pcollections}
 
 The [PCollection]({{ site.baseurl 
}}/documentation/sdks/javadoc/{{ site.release_latest 
}}/index.html?org/apache/beam/sdk/values/PCollection.html)
 `PCollection` abstraction represents a
@@ -236,7 +231,7 @@ After you've created your `Pipeline`, you'll need to begin 
by creating at least
 one `PCollection` in some form. The `PCollection` you create serves as the 
input
 for the first operation in your pipeline.
 
-### 3.1. Creating a PCollection
+### 3.1. Creating a PCollection {#creating-a-pcollection}
 
 You create a `PCollection` by either reading data from an external source using
 Beam's [Source API](#pipeline-io), or you can create a `PCollection` of data
@@ -246,7 +241,7 @@ contain adapters to help you read from external sources 
like large cloud-based
 files, databases, or subscription services. The latter is primarily useful for
 testing and debugging purposes.
 
- 3.1.1. Reading from an external source
+ 3.1.1. Reading from an external source {#reading-external-source}
 
 To read from an external source, you use one of the [Beam-provided I/O
 adapters](#pipeline-io). The adapters vary in their exact usage, but all of 
them
@@ -283,7 +278,7 @@ public static void main(String[] args) {
 See the [section on I/O](#pipeline-io) to learn more about how to read from the
 various data sources supported by the Beam SDK.
 
- 3.1.2. Creating a PCollection from in-memory data
+ 3.1.2. Creating a PCollection from in-memory data 
{#creating-pcollection-in-memory}
 
 {:.language-java}
 To create a `PCollection` from an in-memory Java `Collection`, you use the
@@ -326,14 +321,14 @@ public static void main(String[] args) {
 %}
 ```
 
-### 3.2. PCollection characteristics
+### 3.2. PCollection characteristics {#pcollection-characteristics}
 
 A `PCollection` is owned by the specific `Pipeline` object for which it is
 created; multiple pipelines cannot share a `PCollection`. In some respects, a
 `PCollection` functions like a collection class. However, 

Build failed in Jenkins: beam_PerformanceTests_Spark #1412

2018-02-28 Thread Apache Jenkins Server
See 


Changes:

[echauchot] [BEAM-3681] Make S3FileSystem copy atomic for smaller than 5GB 
objects

[echauchot] [BEAM-3681] Add a comment for the extra check of objectSize in

--
[...truncated 93.44 KB...]
'apache-beam-testing:bqjob_r5b4dc83be3dc75f1_0161dda49007_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: 
/usr/lib/google-cloud-sdk/platform/bq/third_party/oauth2client/contrib/gce.py:73:
 UserWarning: You have requested explicit scopes to be used with a GCE service 
account.
Using this argument will have no effect on the actual scopes for tokens
requested. These scopes are set at VM instance creation time and
can't be overridden in the request.

  warnings.warn(_SCOPES_WARNING)

2018-02-28 18:20:07,260 890b6841 MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.

2018-02-28 18:20:27,814 890b6841 MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-02-28 18:20:29,924 890b6841 MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 

  ReturnCode:1,  WallTime:0:02.10s,  CPU:0.27s,  MaxMemory:25256kb 
STDOUT: Upload complete.
Waiting on bqjob_r21a6f85a9b361a2d_0161dda4ea33_1 ... (0s) Current status: 
RUNNING 
 Waiting on bqjob_r21a6f85a9b361a2d_0161dda4ea33_1 ... (0s) 
Current status: DONE   
BigQuery error in load operation: Error processing job
'apache-beam-testing:bqjob_r21a6f85a9b361a2d_0161dda4ea33_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: 
/usr/lib/google-cloud-sdk/platform/bq/third_party/oauth2client/contrib/gce.py:73:
 UserWarning: You have requested explicit scopes to be used with a GCE service 
account.
Using this argument will have no effect on the actual scopes for tokens
requested. These scopes are set at VM instance creation time and
can't be overridden in the request.

  warnings.warn(_SCOPES_WARNING)

2018-02-28 18:20:29,924 890b6841 MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.

2018-02-28 18:20:52,406 890b6841 MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-02-28 18:20:54,533 890b6841 MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 

  ReturnCode:1,  WallTime:0:02.12s,  CPU:0.24s,  MaxMemory:25336kb 
STDOUT: Upload complete.
Waiting on bqjob_r69aaf5b1dc2ac00b_0161dda54a3f_1 ... (0s) Current status: 
RUNNING 
 Waiting on bqjob_r69aaf5b1dc2ac00b_0161dda54a3f_1 ... (0s) 
Current status: DONE   
BigQuery error in load operation: Error processing job
'apache-beam-testing:bqjob_r69aaf5b1dc2ac00b_0161dda54a3f_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: 
/usr/lib/google-cloud-sdk/platform/bq/third_party/oauth2client/contrib/gce.py:73:
 UserWarning: You have requested explicit scopes to be used with a GCE service 
account.
Using this argument will have no effect on the actual scopes for tokens
requested. These scopes are set at VM instance creation time and
can't be overridden in the request.

  warnings.warn(_SCOPES_WARNING)

2018-02-28 18:20:54,534 890b6841 MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.

2018-02-28 18:21:11,694 890b6841 MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-02-28 18:21:13,928 890b6841 MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 

  ReturnCode:1,  WallTime:0:02.22s,  CPU:0.24s,  MaxMemory:25508kb 
STDOUT: Upload complete.
Waiting on bqjob_r4554ba69fafe47fa_0161dda595a1_1 ... (0s) Current status: 
RUNNING 
 

[beam-site] branch mergebot updated (d23e1ba -> 6764c8c)

2018-02-28 Thread mergebot-role
This is an automated email from the ASF dual-hosted git repository.

mergebot-role pushed a change to branch mergebot
in repository https://gitbox.apache.org/repos/asf/beam-site.git.


 discard d23e1ba  This closes #388
 discard 4eae6c2  Explicitly define section id due to kramdown id generation 
changes
 discard e59956d  Update Gemfile.lock
 new 1a192bd  Update Gemfile.lock
 new 515cd4e  Explicitly define section id due to kramdown id generation 
changes
 new 6764c8c  This closes #388

This update added new revisions after undoing existing revisions.
That is to say, some revisions that were in the old version of the
branch are not in the new version.  This situation occurs
when a user --force pushes a change and generates a repository
containing something like this:

 * -- * -- B -- O -- O -- O   (d23e1ba)
\
 N -- N -- N   refs/heads/mergebot (6764c8c)

You should already have received notification emails for all of the O
revisions, and so the following emails describe only the N revisions
from the common base, B.

Any revisions marked "omit" are not gone; other references still
refer to them.  Any revisions marked "discard" are gone forever.

The 3 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:

-- 
To stop receiving notification emails like this one, please contact
mergebot-r...@apache.org.


[beam-site] 02/03: Explicitly define section id due to kramdown id generation changes

2018-02-28 Thread mergebot-role
This is an automated email from the ASF dual-hosted git repository.

mergebot-role pushed a commit to branch mergebot
in repository https://gitbox.apache.org/repos/asf/beam-site.git

commit 515cd4ea04eb6f8f4179c036db6914c894fa1a78
Author: melissa 
AuthorDate: Tue Feb 20 14:18:14 2018 -0800

Explicitly define section id due to kramdown id generation changes
---
 src/documentation/programming-guide.md | 195 -
 1 file changed, 95 insertions(+), 100 deletions(-)

diff --git a/src/documentation/programming-guide.md 
b/src/documentation/programming-guide.md
index 7f6aea5..6b86743 100644
--- a/src/documentation/programming-guide.md
+++ b/src/documentation/programming-guide.md
@@ -26,12 +26,7 @@ how to implement Beam concepts in your pipelines.
   
 
 
-**Table of Contents:**
-* TOC
-{:toc}
-
-
-## 1. Overview
+## 1. Overview {#overview}
 
 To use Beam, you need to first create a driver program using the classes in one
 of the Beam SDKs. Your driver program *defines* your pipeline, including all of
@@ -94,7 +89,7 @@ objects you've created and transforms that you've applied. 
That graph is then
 executed using the appropriate distributed processing back-end, becoming an
 asynchronous "job" (or equivalent) on that back-end.
 
-## 2. Creating a pipeline
+## 2. Creating a pipeline {#creating-a-pipeline}
 
 The `Pipeline` abstraction encapsulates all the data and steps in your data
 processing task. Your Beam driver program typically starts by constructing a
@@ -122,7 +117,7 @@ Pipeline p = Pipeline.create(options);
 %}
 ```
 
-### 2.1. Configuring pipeline options
+### 2.1. Configuring pipeline options {#configuring-pipeline-options}
 
 Use the pipeline options to configure different aspects of your pipeline, such
 as the pipeline runner that will execute your pipeline and any runner-specific
@@ -134,7 +129,7 @@ When you run the pipeline on a runner of your choice, a 
copy of the
 PipelineOptions will be available to your code. For example, you can read
 PipelineOptions from a DoFn's Context.
 
-#### 2.1.1. Setting PipelineOptions from command-line arguments
+#### 2.1.1. Setting PipelineOptions from command-line arguments {#pipeline-options-cli}
 
 While you can configure your pipeline by creating a `PipelineOptions` object 
and
 setting the fields directly, the Beam SDKs include a command-line parser that
@@ -167,7 +162,7 @@ a command-line argument.
 > demonstrates how to set pipeline options at runtime by using command-line
 > options.
 
-#### 2.1.2. Creating custom options
+#### 2.1.2. Creating custom options {#creating-custom-options}
 
 You can add your own custom options in addition to the standard
 `PipelineOptions`. To add your own options, define an interface with getter and
@@ -223,7 +218,7 @@ MyOptions options = PipelineOptionsFactory.fromArgs(args)
 
 Now your pipeline can accept `--myCustomOption=value` as a command-line 
argument.
 
-## 3. PCollections
+## 3. PCollections {#pcollections}
 
 The [PCollection]({{ site.baseurl 
}}/documentation/sdks/javadoc/{{ site.release_latest 
}}/index.html?org/apache/beam/sdk/values/PCollection.html)
 `PCollection` abstraction represents a
@@ -236,7 +231,7 @@ After you've created your `Pipeline`, you'll need to begin 
by creating at least
 one `PCollection` in some form. The `PCollection` you create serves as the 
input
 for the first operation in your pipeline.
 
-### 3.1. Creating a PCollection
+### 3.1. Creating a PCollection {#creating-a-pcollection}
 
 You create a `PCollection` by either reading data from an external source using
 Beam's [Source API](#pipeline-io), or you can create a `PCollection` of data
@@ -246,7 +241,7 @@ contain adapters to help you read from external sources 
like large cloud-based
 files, databases, or subscription services. The latter is primarily useful for
 testing and debugging purposes.
 
-#### 3.1.1. Reading from an external source
+#### 3.1.1. Reading from an external source {#reading-external-source}
 
 To read from an external source, you use one of the [Beam-provided I/O
 adapters](#pipeline-io). The adapters vary in their exact usage, but all of 
them
@@ -283,7 +278,7 @@ public static void main(String[] args) {
 See the [section on I/O](#pipeline-io) to learn more about how to read from the
 various data sources supported by the Beam SDK.
 
-#### 3.1.2. Creating a PCollection from in-memory data
+#### 3.1.2. Creating a PCollection from in-memory data {#creating-pcollection-in-memory}
 
 {:.language-java}
 To create a `PCollection` from an in-memory Java `Collection`, you use the
@@ -326,14 +321,14 @@ public static void main(String[] args) {
 %}
 ```
 
-### 3.2. PCollection characteristics
+### 3.2. PCollection characteristics {#pcollection-characteristics}
 
 A `PCollection` is owned by the specific `Pipeline` object for which it is
 created; multiple pipelines cannot share a `PCollection`. In some respects, a
 `PCollection` functions like a collection class. However, 
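The hunks above also re-anchor section 2.1.2, which describes adding custom
options by defining an interface with getter and setter methods. A minimal
sketch of that pattern, assuming the standard `PipelineOptionsFactory` API
(the interface and option names are illustrative):

```java
import org.apache.beam.sdk.options.Default;
import org.apache.beam.sdk.options.Description;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class CustomOptionsExample {

  // Beam synthesizes the implementation of this interface at runtime
  // from the getter/setter pair.
  public interface MyOptions extends PipelineOptions {
    @Description("A custom option passed as --myCustomOption=value")
    @Default.String("defaultValue")
    String getMyCustomOption();

    void setMyCustomOption(String value);
  }

  public static void main(String[] args) {
    // Registering the interface makes the option appear in --help output.
    PipelineOptionsFactory.register(MyOptions.class);

    MyOptions options =
        PipelineOptionsFactory.fromArgs(args).withValidation().as(MyOptions.class);

    System.out.println("myCustomOption = " + options.getMyCustomOption());
  }
}
```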

[beam-site] 03/03: This closes #388

2018-02-28 Thread mergebot-role
This is an automated email from the ASF dual-hosted git repository.

mergebot-role pushed a commit to branch mergebot
in repository https://gitbox.apache.org/repos/asf/beam-site.git

commit 6764c8c06571a4e86c96e0a63199b3d9072761a0
Merge: 1333c13 515cd4e
Author: Mergebot 
AuthorDate: Wed Feb 28 10:22:01 2018 -0800

This closes #388

 Gemfile.lock   |  78 +++--
 src/documentation/programming-guide.md | 195 -
 2 files changed, 138 insertions(+), 135 deletions(-)

-- 
To stop receiving notification emails like this one, please contact
mergebot-r...@apache.org.


Jenkins build is back to normal : beam_PerformanceTests_JDBC #272

2018-02-28 Thread Apache Jenkins Server
See 




[beam] branch master updated (7fa6292 -> 4d8cdbd)

2018-02-28 Thread lcwik
This is an automated email from the ASF dual-hosted git repository.

lcwik pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git.


from 7fa6292  Merge pull request #4739: [BEAM-3681] Make S3FileSystem copy 
atomic for smaller than 5GB objects
 add 5cb3653  Avoid unnecessary autoboxing by replacing 
Integer/Long.valueOf with Integer.parseInt/Long.parseLong
 add 4d8cdbd  Avoid unnecessary autoboxing by replacing 
Integer/Long.valueOf with Integer.parseInt/Long.parseLong

No new revisions were added by this update.

Summary of changes:
 .../apache/beam/runners/dataflow/util/TimeUtil.java   | 19 ++-
 1 file changed, 10 insertions(+), 9 deletions(-)

-- 
To stop receiving notification emails like this one, please contact
lc...@apache.org.
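The change above is a small but common optimization: `Integer.valueOf` and
`Long.valueOf` return boxed wrappers, so assigning their result to a
primitive adds a needless box/unbox round trip, while `parseInt` and
`parseLong` return primitives directly. An illustrative sketch of the
difference (not the actual `TimeUtil` code):

```java
public class ParseVsValueOf {
  public static void main(String[] args) {
    // valueOf parses, boxes the result into an Integer, then the
    // assignment unboxes it back to an int.
    int boxed = Integer.valueOf("42");

    // parseInt returns a primitive int directly, with no boxing at all.
    int primitive = Integer.parseInt("42");

    long big = Long.parseLong("123456789012");
    System.out.println(boxed + " " + primitive + " " + big);
  }
}
```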


Build failed in Jenkins: beam_PerformanceTests_Python #968

2018-02-28 Thread Apache Jenkins Server
See 


Changes:

[echauchot] [BEAM-3681] Make S3FileSystem copy atomic for smaller than 5GB 
objects

[echauchot] [BEAM-3681] Add a comment for the extra check of objectSize in

--
[...truncated 1.05 KB...]
Commit message: "Merge pull request #4739: [BEAM-3681] Make S3FileSystem copy 
atomic for smaller than 5GB objects"
 > git rev-list --no-walk 948988c921747ab6298059a94daf1180e5b76cd4 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins877855133173275.sh
+ rm -rf PerfKitBenchmarker
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins6181234315286343649.sh
+ rm -rf .env
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins7289117006199323529.sh
+ virtualenv .env --system-site-packages
New python executable in .env/bin/python
Installing setuptools, pip...done.
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins8763492092282232682.sh
+ .env/bin/pip install --upgrade setuptools pip
Downloading/unpacking setuptools from 
https://pypi.python.org/packages/43/41/033a273f9a25cb63050a390ee8397acbc7eae2159195d85f06f17e7be45a/setuptools-38.5.1-py2.py3-none-any.whl#md5=908b8b5e50bf429e520b2b5fa1b350e5
Downloading/unpacking pip from 
https://pypi.python.org/packages/b6/ac/7015eb97dc749283ffdec1c3a88ddb8ae03b8fad0f0e611408f196358da3/pip-9.0.1-py2.py3-none-any.whl#md5=297dbd16ef53bcef0447d245815f5144
Installing collected packages: setuptools, pip
  Found existing installation: setuptools 2.2
Uninstalling setuptools:
  Successfully uninstalled setuptools
  Found existing installation: pip 1.5.4
Uninstalling pip:
  Successfully uninstalled pip
Successfully installed setuptools pip
Cleaning up...
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins6187110093687609731.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git
Cloning into 'PerfKitBenchmarker'...
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins3401499517458785318.sh
+ .env/bin/pip install -r PerfKitBenchmarker/requirements.txt
Requirement already satisfied: absl-py in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 14))
Requirement already satisfied: jinja2>=2.7 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 15))
Requirement already satisfied: setuptools in ./.env/lib/python2.7/site-packages 
(from -r PerfKitBenchmarker/requirements.txt (line 16))
Requirement already satisfied: colorlog[windows]==2.6.0 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 17))
Requirement already satisfied: blinker>=1.3 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 18))
Requirement already satisfied: futures>=3.0.3 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 19))
Requirement already satisfied: PyYAML==3.12 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 20))
Requirement already satisfied: pint>=0.7 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 21))
Requirement already satisfied: numpy==1.13.3 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 22))
Requirement already satisfied: functools32 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 23))
Requirement already satisfied: contextlib2>=0.5.1 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 24))
Requirement already satisfied: pywinrm in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 25))
Requirement already satisfied: six in /usr/local/lib/python2.7/dist-packages 
(from absl-py->-r PerfKitBenchmarker/requirements.txt (line 14))
Requirement already satisfied: MarkupSafe>=0.23 in 
/usr/local/lib/python2.7/dist-packages (from jinja2>=2.7->-r 
PerfKitBenchmarker/requirements.txt (line 15))
Requirement already satisfied: colorama; extra == "windows" in 
/usr/lib/python2.7/dist-packages (from colorlog[windows]==2.6.0->-r 
PerfKitBenchmarker/requirements.txt (line 17))
Requirement already satisfied: xmltodict in 
/home/jenkins/.local/lib/python2.7/site-packages (from pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25))
Requirement 

Build failed in Jenkins: beam_PostCommit_Python_ValidatesRunner_Dataflow #1011

2018-02-28 Thread Apache Jenkins Server
See 


Changes:

[echauchot] [BEAM-3681] Make S3FileSystem copy atomic for smaller than 5GB 
objects

[echauchot] [BEAM-3681] Add a comment for the extra check of objectSize in

--
[...truncated 118.39 KB...]
  File 
"
 line 597, in from_runner_api
context.transforms.get_by_id(root_transform_id)]
  File 
"
 line 69, in get_by_id
self._id_to_proto[id], self._pipeline_context)
  File 
"
 line 842, in from_runner_api
part = context.transforms.get_by_id(transform_id)
  File 
"
 line 69, in get_by_id
self._id_to_proto[id], self._pipeline_context)
  File 
"
 line 842, in from_runner_api
part = context.transforms.get_by_id(transform_id)
  File 
"
 line 69, in get_by_id
self._id_to_proto[id], self._pipeline_context)
  File 
"
 line 842, in from_runner_api
part = context.transforms.get_by_id(transform_id)
  File 
"
 line 69, in get_by_id
self._id_to_proto[id], self._pipeline_context)
  File 
"
 line 833, in from_runner_api
transform=ptransform.PTransform.from_runner_api(proto.spec, context),
  File 
"
 line 555, in from_runner_api
context)
  File 
"
 line 886, in from_runner_api_parameter
result = ParDo(fn, *args, **kwargs)
  File 
"
 line 781, in __init__
super(ParDo, self).__init__(fn, *args, **kwargs)
  File 
"
 line 627, in __init__
self.fn = pickler.loads(pickler.dumps(self.fn))
  File 
"
 line 221, in loads
return dill.loads(s)
  File 
"
 line 277, in loads
return load(file)
  File 
"
 line 266, in load
obj = pik.load()
  File "/usr/lib/python2.7/pickle.py", line 858, in load
dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1090, in load_global
klass = self.find_class(module, name)
  File 
"
 line 423, in find_class
return StockUnpickler.find_class(self, module, name)
  File "/usr/lib/python2.7/pickle.py", line 1124, in find_class
__import__(module)
  File 
"
 line 276, in signalhandler
raise TimedOutException()
TimedOutException: 'test_flattened_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest)'

==
ERROR: test_iterable_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest)
--
Traceback (most recent call last):
  File 

Build failed in Jenkins: beam_PostCommit_Python_Verify #4330

2018-02-28 Thread Apache Jenkins Server
See 


Changes:

[echauchot] [BEAM-3681] Make S3FileSystem copy atomic for smaller than 5GB 
objects

[echauchot] [BEAM-3681] Add a comment for the extra check of objectSize in

--
[...truncated 1.02 MB...]
copying 
apache_beam/runners/dataflow/internal/clients/dataflow/message_matchers.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/internal/clients/dataflow
copying 
apache_beam/runners/dataflow/internal/clients/dataflow/message_matchers_test.py 
-> apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/internal/clients/dataflow
copying apache_beam/runners/dataflow/native_io/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/native_io
copying apache_beam/runners/dataflow/native_io/iobase.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/native_io
copying apache_beam/runners/dataflow/native_io/iobase_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/native_io
copying apache_beam/runners/dataflow/native_io/streaming_create.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/native_io
copying apache_beam/runners/direct/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/bundle_factory.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/clock.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/consumer_tracking_pipeline_visitor.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/consumer_tracking_pipeline_visitor_test.py 
-> apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/direct_metrics.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/direct_metrics_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/direct_runner.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/direct_runner_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/evaluation_context.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/executor.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/helper_transforms.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/sdf_direct_runner.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/sdf_direct_runner_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/transform_evaluator.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/util.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/watermark_manager.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/experimental/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/experimental
copying apache_beam/runners/experimental/python_rpc_direct/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/experimental/python_rpc_direct
copying 
apache_beam/runners/experimental/python_rpc_direct/python_rpc_direct_runner.py 
-> apache-beam-2.4.0.dev0/apache_beam/runners/experimental/python_rpc_direct
copying apache_beam/runners/experimental/python_rpc_direct/server.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/experimental/python_rpc_direct
copying apache_beam/runners/job/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/job
copying apache_beam/runners/job/manager.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/job
copying apache_beam/runners/job/utils.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/job
copying apache_beam/runners/portability/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/fn_api_runner.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/fn_api_runner_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/maptask_executor_runner.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/maptask_executor_runner_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/universal_local_runner.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/universal_local_runner_main.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/universal_local_runner_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/test/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/test

[jira] [Resolved] (BEAM-3681) S3Filesystem fails when copying empty files

2018-02-28 Thread Ismaël Mejía (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3681?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ismaël Mejía resolved BEAM-3681.

Resolution: Fixed

> S3Filesystem fails when copying empty files
> ---
>
> Key: BEAM-3681
> URL: https://issues.apache.org/jira/browse/BEAM-3681
> Project: Beam
>  Issue Type: Bug
>  Components: io-java-aws
>Affects Versions: 2.3.0
>Reporter: Ismaël Mejía
>Assignee: Ismaël Mejía
>Priority: Major
> Fix For: 2.4.0
>
>  Time Spent: 4h 10m
>  Remaining Estimate: 0h
>
> When executing a simple write on S3 with the direct runner, it sometimes
> breaks when it ends up trying to write 'empty' shards to S3.
> {code:java}
> Pipeline pipeline = Pipeline.create(options);
> pipeline
>  .apply("CreateSomeData", Create.of("1", "2", "3"))
>  .apply("WriteToFS", TextIO.write().to(options.getOutput()));
> pipeline.run();{code}
> The related exception is:
> {code:java}
> Exception in thread "main" 
> org.apache.beam.sdk.Pipeline$PipelineExecutionException: java.io.IOException: 
> com.amazonaws.services.s3.model.AmazonS3Exception: The XML you provided was 
> not well-formed or did not validate against our published schema (Service: 
> Amazon S3; Status Code: 400; Error Code: MalformedXML; Request ID: 
> 402E99C2F602AD09; S3 Extended Request ID: 
> SDdU8AqW2mfZuG1xcKUSNeHiR0IUKcRCpZ1Wjx7sAor1CdYf8f+0dDIcQpvr3GXgqwsyk5PGWVE=),
>  S3 Extended Request ID: 
> SDdU8AqW2mfZuG1xcKUSNeHiR0IUKcRCpZ1Wjx7sAor1CdYf8f+0dDIcQpvr3GXgqwsyk5PGWVE=
>     at 
> org.apache.beam.runners.direct.DirectRunner$DirectPipelineResult.waitUntilFinish(DirectRunner.java:342)
>     at 
> org.apache.beam.runners.direct.DirectRunner$DirectPipelineResult.waitUntilFinish(DirectRunner.java:312)
>     at org.apache.beam.runners.direct.DirectRunner.run(DirectRunner.java:206)
>     at org.apache.beam.runners.direct.DirectRunner.run(DirectRunner.java:62)
>     at org.apache.beam.sdk.Pipeline.run(Pipeline.java:311)
>     at org.apache.beam.sdk.Pipeline.run(Pipeline.java:297)
>     at 
> org.apache.beam.samples.ingest.amazon.IngestToS3.main(IngestToS3.java:82)
> Caused by: java.io.IOException: 
> com.amazonaws.services.s3.model.AmazonS3Exception: The XML you provided was 
> not well-formed or did not validate against our published schema (Service: 
> Amazon S3; Status Code: 400; Error Code: MalformedXML; Request ID: 
> 402E99C2F602AD09; S3 Extended Request ID: 
> SDdU8AqW2mfZuG1xcKUSNeHiR0IUKcRCpZ1Wjx7sAor1CdYf8f+0dDIcQpvr3GXgqwsyk5PGWVE=),
>  S3 Extended Request ID: 
> SDdU8AqW2mfZuG1xcKUSNeHiR0IUKcRCpZ1Wjx7sAor1CdYf8f+0dDIcQpvr3GXgqwsyk5PGWVE=
>     at org.apache.beam.sdk.io.aws.s3.S3FileSystem.copy(S3FileSystem.java:563)
>     at 
> org.apache.beam.sdk.io.aws.s3.S3FileSystem.lambda$copy$4(S3FileSystem.java:495)
>     at 
> org.apache.beam.sdk.io.aws.s3.S3FileSystem.lambda$callTasks$8(S3FileSystem.java:642)
>     at 
> org.apache.beam.sdk.util.MoreFutures.lambda$supplyAsync$0(MoreFutures.java:100)
>     at 
> java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1626)
> Caused by: com.amazonaws.services.s3.model.AmazonS3Exception: The XML you 
> provided was not well-formed or did not validate against our published schema 
> (Service: Amazon S3; Status Code: 400; Error Code: MalformedXML; Request ID: 
> 402E99C2F602AD09; S3 Extended Request ID: 
> SDdU8AqW2mfZuG1xcKUSNeHiR0IUKcRCpZ1Wjx7sAor1CdYf8f+0dDIcQpvr3GXgqwsyk5PGWVE=),
>  S3 Extended Request ID: 
> SDdU8AqW2mfZuG1xcKUSNeHiR0IUKcRCpZ1Wjx7sAor1CdYf8f+0dDIcQpvr3GXgqwsyk5PGWVE=
>     at 
> com.amazonaws.http.AmazonHttpClient$RequestExecutor.handleErrorResponse(AmazonHttpClient.java:1639)
>     at 
> com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeOneRequest(AmazonHttpClient.java:1304)
>     at 
> com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeHelper(AmazonHttpClient.java:1056)
>     at 
> com.amazonaws.http.AmazonHttpClient$RequestExecutor.doExecute(AmazonHttpClient.java:743)
>     at 
> com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeWithTimer(AmazonHttpClient.java:717)
>     at 
> com.amazonaws.http.AmazonHttpClient$RequestExecutor.execute(AmazonHttpClient.java:699)
>     at 
> com.amazonaws.http.AmazonHttpClient$RequestExecutor.access$500(AmazonHttpClient.java:667)
>     at 
> com.amazonaws.http.AmazonHttpClient$RequestExecutionBuilderImpl.execute(AmazonHttpClient.java:649)
>     at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:513)
>     at 
> com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4325)
>     at 
> com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4272)
>     at 
> com.amazonaws.services.s3.AmazonS3Client.completeMultipartUpload(AmazonS3Client.java:3065)
>     at org.apache.beam.sdk.io.aws.s3.S3FileSystem.copy(S3FileSystem.java:561)
>     
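The fix for this issue, per the commit messages below, copies objects below
S3's 5 GB single-request limit atomically with a plain CopyObject call and
reserves multipart copy for larger objects; completing a multipart upload
with zero parts is what produces the MalformedXML error in the trace above.
A hedged sketch of that size-based dispatch, using the AWS SDK for Java;
the class and method names are illustrative, not Beam's actual code:

```java
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.model.CopyObjectRequest;
import com.amazonaws.services.s3.model.ObjectMetadata;

public class S3CopySketch {
  // S3 rejects single-request copies at or above 5 GB.
  private static final long MAX_SINGLE_COPY_BYTES = 5L * 1024 * 1024 * 1024;

  static void copy(AmazonS3 s3, String bucket, String fromKey, String toKey) {
    ObjectMetadata meta = s3.getObjectMetadata(bucket, fromKey);
    if (meta.getContentLength() < MAX_SINGLE_COPY_BYTES) {
      // Single atomic server-side copy; safe for empty objects too.
      s3.copyObject(new CopyObjectRequest(bucket, fromKey, bucket, toKey));
    } else {
      multipartCopy(s3, bucket, fromKey, toKey, meta);
    }
  }

  static void multipartCopy(AmazonS3 s3, String bucket, String fromKey,
      String toKey, ObjectMetadata meta) {
    // The >= 5 GB path (InitiateMultipartUpload / CopyPart / Complete)
    // is omitted from this sketch.
    throw new UnsupportedOperationException("multipart path omitted");
  }
}
```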

[beam] 01/01: Merge pull request #4739: [BEAM-3681] Make S3FileSystem copy atomic for smaller than 5GB objects

2018-02-28 Thread iemejia
This is an automated email from the ASF dual-hosted git repository.

iemejia pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git

commit 7fa6292a21564744011fe94a7e50f7e074564b71
Merge: 948988c ffebd65
Author: Ismaël Mejía 
AuthorDate: Wed Feb 28 17:56:24 2018 +0100

Merge pull request #4739: [BEAM-3681] Make S3FileSystem copy atomic for 
smaller than 5GB objects

 .../java/org/apache/beam/sdk/io/FileBasedSink.java |  26 ++-
 .../apache/beam/sdk/io/aws/s3/S3FileSystem.java| 132 +++-
 .../beam/sdk/io/aws/s3/S3FileSystemTest.java   | 228 -
 3 files changed, 229 insertions(+), 157 deletions(-)

-- 
To stop receiving notification emails like this one, please contact
ieme...@apache.org.


[beam] branch master updated (948988c -> 7fa6292)

2018-02-28 Thread iemejia
This is an automated email from the ASF dual-hosted git repository.

iemejia pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git.


from 948988c  Merge pull request #4764: [BEAM-3760] Scan Core Construction 
NeedsRunner Tests
 add 9ef8f63  [BEAM-3681] Make S3FileSystem copy atomic for smaller than 
5GB objects and fix wrong indentation on FileBasedSink
 add ffebd65  [BEAM-3681] Add a comment for the extra check of objectSize 
in S3FileSystem.multipartCopy
 new 7fa6292  Merge pull request #4739: [BEAM-3681] Make S3FileSystem copy 
atomic for smaller than 5GB objects

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 .../java/org/apache/beam/sdk/io/FileBasedSink.java |  26 ++-
 .../apache/beam/sdk/io/aws/s3/S3FileSystem.java| 132 +++-
 .../beam/sdk/io/aws/s3/S3FileSystemTest.java   | 228 -
 3 files changed, 229 insertions(+), 157 deletions(-)

-- 
To stop receiving notification emails like this one, please contact
ieme...@apache.org.


[jira] [Assigned] (BEAM-3664) Port SolrIOTest off DoFnTester

2018-02-28 Thread Kenneth Knowles (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3664?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kenneth Knowles reassigned BEAM-3664:
-

Assignee: Willy Lulciuc

> Port SolrIOTest off DoFnTester
> --
>
> Key: BEAM-3664
> URL: https://issues.apache.org/jira/browse/BEAM-3664
> Project: Beam
>  Issue Type: Sub-task
>  Components: io-java-solr
>Reporter: Kenneth Knowles
>Assignee: Willy Lulciuc
>Priority: Major
>  Labels: beginner, newbie, starter
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Build failed in Jenkins: beam_PostCommit_Python_Verify #4329

2018-02-28 Thread Apache Jenkins Server
See 


--
[...truncated 1.01 MB...]
copying 
apache_beam/runners/dataflow/internal/clients/dataflow/message_matchers.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/internal/clients/dataflow
copying 
apache_beam/runners/dataflow/internal/clients/dataflow/message_matchers_test.py 
-> apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/internal/clients/dataflow
copying apache_beam/runners/dataflow/native_io/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/native_io
copying apache_beam/runners/dataflow/native_io/iobase.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/native_io
copying apache_beam/runners/dataflow/native_io/iobase_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/native_io
copying apache_beam/runners/dataflow/native_io/streaming_create.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/native_io
copying apache_beam/runners/direct/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/bundle_factory.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/clock.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/consumer_tracking_pipeline_visitor.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/consumer_tracking_pipeline_visitor_test.py 
-> apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/direct_metrics.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/direct_metrics_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/direct_runner.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/direct_runner_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/evaluation_context.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/executor.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/helper_transforms.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/sdf_direct_runner.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/sdf_direct_runner_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/transform_evaluator.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/util.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/watermark_manager.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/experimental/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/experimental
copying apache_beam/runners/experimental/python_rpc_direct/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/experimental/python_rpc_direct
copying 
apache_beam/runners/experimental/python_rpc_direct/python_rpc_direct_runner.py 
-> apache-beam-2.4.0.dev0/apache_beam/runners/experimental/python_rpc_direct
copying apache_beam/runners/experimental/python_rpc_direct/server.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/experimental/python_rpc_direct
copying apache_beam/runners/job/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/job
copying apache_beam/runners/job/manager.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/job
copying apache_beam/runners/job/utils.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/job
copying apache_beam/runners/portability/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/fn_api_runner.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/fn_api_runner_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/maptask_executor_runner.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/maptask_executor_runner_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/universal_local_runner.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/universal_local_runner_main.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/universal_local_runner_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/test/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/test
copying apache_beam/runners/worker/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/bundle_processor.py -> 

Build failed in Jenkins: beam_PostCommit_Python_ValidatesRunner_Dataflow #1010

2018-02-28 Thread Apache Jenkins Server
See 


--
[...truncated 124.74 KB...]
  File 
"
 line 810, in to_runner_api
for part in self.parts],
  File 
"
 line 60, in get_id
self._id_to_proto[id] = obj.to_runner_api(self._pipeline_context)
  File 
"
 line 810, in to_runner_api
for part in self.parts],
  File 
"
 line 60, in get_id
self._id_to_proto[id] = obj.to_runner_api(self._pipeline_context)
  File 
"
 line 808, in to_runner_api
spec=transform_to_runner_api(self.transform, context),
  File 
"
 line 805, in transform_to_runner_api
return transform.to_runner_api(context)
  File 
"
 line 542, in to_runner_api
urn, typed_param = self.to_runner_api_parameter(context)
  File 
"
 line 839, in to_runner_api_parameter
source=self.source.to_runner_api(context),
  File 
"
 line 94, in to_runner_api
urn, typed_param = self.to_runner_api_parameter(context)
  File 
"
 line 82, in <lambda>
pickle_urn, wrappers_pb2.BytesValue(value=pickler.dumps(self
  File 
"
 line 193, in dumps
s = dill.dumps(o)
  File 
"
 line 259, in dumps
dump(obj, file, protocol, byref, fmode, recurse)#, strictio)
  File 
"
 line 252, in dump
pik.dump(obj)
  File "/usr/lib/python2.7/pickle.py", line 224, in dump
self.save(obj)
  File "/usr/lib/python2.7/pickle.py", line 331, in save
self.save_reduce(obj=obj, *rv)
  File "/usr/lib/python2.7/pickle.py", line 396, in save_reduce
save(cls)
  File "/usr/lib/python2.7/pickle.py", line 286, in save
f(self, obj) # Call unbound method with explicit self
  File 
"
 line 94, in wrapper
obj=obj)
  File "/usr/lib/python2.7/pickle.py", line 401, in save_reduce
save(args)
  File "/usr/lib/python2.7/pickle.py", line 286, in save
f(self, obj) # Call unbound method with explicit self
  File "/usr/lib/python2.7/pickle.py", line 562, in save_tuple
save(element)
  File "/usr/lib/python2.7/pickle.py", line 286, in save
f(self, obj) # Call unbound method with explicit self
  File 
"
 line 165, in new_save_module_dict
return old_save_module_dict(pickler, obj)
  File 
"
 line 841, in save_module_dict
StockPickler.save_dict(pickler, obj)
  File "/usr/lib/python2.7/pickle.py", line 649, in save_dict
self._batch_setitems(obj.iteritems())
  File "/usr/lib/python2.7/pickle.py", line 681, in _batch_setitems
save(v)
  File "/usr/lib/python2.7/pickle.py", line 286, in save
f(self, obj) # Call unbound method with explicit self
  File 
"
 line 1311, in save_function
obj.__dict__), obj=obj)
  File "/usr/lib/python2.7/pickle.py", line 401, in save_reduce
save(args)
  File 

[jira] [Commented] (BEAM-3649) HadoopSeekableByteChannel breaks when backing InputStream doesn't support ByteBuffers

2018-02-28 Thread Guillaume Balaine (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-3649?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16380436#comment-16380436
 ] 

Guillaume Balaine commented on BEAM-3649:
-

I was using HDFS mostly for its compatibility with other APIs such as
Parquet (there is another ongoing PR for this with Beam), but certainly a
custom S3 client is better for simply appending.
The thing is, Hadoop has a huge ecosystem, and hadoop-fs is often the
target of improved S3 access layers such as SparkTC/stocator. So the
S3 implementation in Beam needs to be pretty solid if it wants to get used.

> HadoopSeekableByteChannel breaks when backing InputStream doesn't support
> ByteBuffers
> --
>
> Key: BEAM-3649
> URL: https://issues.apache.org/jira/browse/BEAM-3649
> Project: Beam
>  Issue Type: Bug
>  Components: io-java-hadoop
>Affects Versions: 2.0.0, 2.1.0, 2.2.0
>Reporter: Guillaume Balaine
>Priority: Minor
> Fix For: Not applicable
>
>
> This happened last summer, when I wanted to use S3A as the backing HDFS 
> access implementation. 
> This is because, while this method is called:
> [https://github.com/apache/hadoop/blob/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/FSDataInputStream.java#L145]
> this class does not implement ByteBufferReadable:
> https://github.com/apache/hadoop/blob/trunk/hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/S3AFileSystem.java
> I fixed it by manually incrementing the read position and copying the backing 
> array instead of buffering.
> [https://github.com/Igosuki/beam/commit/3838f0db43b6422833a045d1f097f6d7643219f1]
> I know the direct S3 implementation is the preferred path, but this failure
> is possible, and likely happens to a lot of developers.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
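A hedged sketch of the fallback described in this comment, assuming
Hadoop's `FSDataInputStream` and `ByteBufferReadable` interfaces; it
illustrates the workaround rather than reproducing the linked patch:

```java
import java.io.IOException;
import java.nio.ByteBuffer;

import org.apache.hadoop.fs.ByteBufferReadable;
import org.apache.hadoop.fs.FSDataInputStream;

public final class ByteBufferReadFallback {
  static int read(FSDataInputStream in, ByteBuffer dst) throws IOException {
    if (in.getWrappedStream() instanceof ByteBufferReadable) {
      // Fast path: the wrapped stream (e.g. HDFS) fills the buffer itself.
      return in.read(dst);
    }
    // Fallback for streams like S3A that lack ByteBufferReadable: read
    // into a plain byte[] and advance the buffer position by copying.
    byte[] tmp = new byte[dst.remaining()];
    int n = in.read(tmp, 0, tmp.length);
    if (n > 0) {
      dst.put(tmp, 0, n);
    }
    return n;
  }
}
```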


[beam-site] 02/03: Explicitly define section id due to kramdown id generation changes

2018-02-28 Thread mergebot-role
This is an automated email from the ASF dual-hosted git repository.

mergebot-role pushed a commit to branch mergebot
in repository https://gitbox.apache.org/repos/asf/beam-site.git

commit 4eae6c26fadbc19e7760cde967060a995b6a0efe
Author: melissa 
AuthorDate: Tue Feb 20 14:18:14 2018 -0800

Explicitly define section id due to kramdown id generation changes
---
 src/documentation/programming-guide.md | 195 -
 1 file changed, 95 insertions(+), 100 deletions(-)

diff --git a/src/documentation/programming-guide.md 
b/src/documentation/programming-guide.md
index 7f6aea5..6b86743 100644
--- a/src/documentation/programming-guide.md
+++ b/src/documentation/programming-guide.md
@@ -26,12 +26,7 @@ how to implement Beam concepts in your pipelines.
   
 
 
-**Table of Contents:**
-* TOC
-{:toc}
-
-
-## 1. Overview
+## 1. Overview {#overview}
 
 To use Beam, you need to first create a driver program using the classes in one
 of the Beam SDKs. Your driver program *defines* your pipeline, including all of
@@ -94,7 +89,7 @@ objects you've created and transforms that you've applied. 
That graph is then
 executed using the appropriate distributed processing back-end, becoming an
 asynchronous "job" (or equivalent) on that back-end.
 
-## 2. Creating a pipeline
+## 2. Creating a pipeline {#creating-a-pipeline}
 
 The `Pipeline` abstraction encapsulates all the data and steps in your data
 processing task. Your Beam driver program typically starts by constructing a
@@ -122,7 +117,7 @@ Pipeline p = Pipeline.create(options);
 %}
 ```
 
-### 2.1. Configuring pipeline options
+### 2.1. Configuring pipeline options {#configuring-pipeline-options}
 
 Use the pipeline options to configure different aspects of your pipeline, such
 as the pipeline runner that will execute your pipeline and any runner-specific
@@ -134,7 +129,7 @@ When you run the pipeline on a runner of your choice, a 
copy of the
 PipelineOptions will be available to your code. For example, you can read
 PipelineOptions from a DoFn's Context.
 
-#### 2.1.1. Setting PipelineOptions from command-line arguments
+#### 2.1.1. Setting PipelineOptions from command-line arguments {#pipeline-options-cli}
 
 While you can configure your pipeline by creating a `PipelineOptions` object 
and
 setting the fields directly, the Beam SDKs include a command-line parser that
@@ -167,7 +162,7 @@ a command-line argument.
 > demonstrates how to set pipeline options at runtime by using command-line
 > options.
 
-#### 2.1.2. Creating custom options
+#### 2.1.2. Creating custom options {#creating-custom-options}
 
 You can add your own custom options in addition to the standard
 `PipelineOptions`. To add your own options, define an interface with getter and
@@ -223,7 +218,7 @@ MyOptions options = PipelineOptionsFactory.fromArgs(args)
 
 Now your pipeline can accept `--myCustomOption=value` as a command-line 
argument.
 
-## 3. PCollections
+## 3. PCollections {#pcollections}
 
 The [PCollection]({{ site.baseurl 
}}/documentation/sdks/javadoc/{{ site.release_latest 
}}/index.html?org/apache/beam/sdk/values/PCollection.html)
 `PCollection` abstraction represents a
@@ -236,7 +231,7 @@ After you've created your `Pipeline`, you'll need to begin 
by creating at least
 one `PCollection` in some form. The `PCollection` you create serves as the 
input
 for the first operation in your pipeline.
 
-### 3.1. Creating a PCollection
+### 3.1. Creating a PCollection {#creating-a-pcollection}
 
 You create a `PCollection` by either reading data from an external source using
 Beam's [Source API](#pipeline-io), or you can create a `PCollection` of data
@@ -246,7 +241,7 @@ contain adapters to help you read from external sources 
like large cloud-based
 files, databases, or subscription services. The latter is primarily useful for
 testing and debugging purposes.
 
-#### 3.1.1. Reading from an external source
+#### 3.1.1. Reading from an external source {#reading-external-source}
 
 To read from an external source, you use one of the [Beam-provided I/O
 adapters](#pipeline-io). The adapters vary in their exact usage, but all of 
them
@@ -283,7 +278,7 @@ public static void main(String[] args) {
 See the [section on I/O](#pipeline-io) to learn more about how to read from the
 various data sources supported by the Beam SDK.
 
-#### 3.1.2. Creating a PCollection from in-memory data
+#### 3.1.2. Creating a PCollection from in-memory data {#creating-pcollection-in-memory}
 
 {:.language-java}
 To create a `PCollection` from an in-memory Java `Collection`, you use the
@@ -326,14 +321,14 @@ public static void main(String[] args) {
 %}
 ```
 
-### 3.2. PCollection characteristics
+### 3.2. PCollection characteristics {#pcollection-characteristics}
 
 A `PCollection` is owned by the specific `Pipeline` object for which it is
 created; multiple pipelines cannot share a `PCollection`. In some respects, a
 `PCollection` functions like a collection class. However, 
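The hunks above re-anchor section 3.1.1 on reading from an external source.
A minimal sketch of that pattern using the Beam-provided `TextIO` adapter
(the input path is illustrative):

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.TextIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.values.PCollection;

public class ReadExternalSourceExample {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

    // Read is applied to the Pipeline itself and yields the pipeline's
    // first PCollection.
    PCollection<String> lines =
        p.apply("ReadLines", TextIO.read().from("gs://some-bucket/input-*.txt"));

    p.run().waitUntilFinish();
  }
}
```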

[beam-site] branch mergebot updated (d77143a -> d23e1ba)

2018-02-28 Thread mergebot-role
This is an automated email from the ASF dual-hosted git repository.

mergebot-role pushed a change to branch mergebot
in repository https://gitbox.apache.org/repos/asf/beam-site.git.


from d77143a  This closes #394
 add 1333c13  Prepare repository for deployment.
 new e59956d  Update Gemfile.lock
 new 4eae6c2  Explicitly define section id due to kramdown id generation 
changes
 new d23e1ba  This closes #388

The 3 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 Gemfile.lock   |  78 +
 content/documentation/io/built-in/index.html   |  25 +--
 .../get-started/mobile-gaming-example/index.html   |   4 +-
 src/documentation/programming-guide.md | 195 ++---
 4 files changed, 149 insertions(+), 153 deletions(-)

-- 
To stop receiving notification emails like this one, please contact
mergebot-r...@apache.org.


[beam-site] 03/03: This closes #388

2018-02-28 Thread mergebot-role
This is an automated email from the ASF dual-hosted git repository.

mergebot-role pushed a commit to branch mergebot
in repository https://gitbox.apache.org/repos/asf/beam-site.git

commit d23e1ba554d4caa5fca7dce0314f68a1cfbfe875
Merge: 1333c13 4eae6c2
Author: Mergebot 
AuthorDate: Wed Feb 28 06:33:55 2018 -0800

This closes #388

 Gemfile.lock   |  78 +++--
 src/documentation/programming-guide.md | 195 -
 2 files changed, 138 insertions(+), 135 deletions(-)

-- 
To stop receiving notification emails like this one, please contact
mergebot-r...@apache.org.


[beam-site] 01/03: Update Gemfile.lock

2018-02-28 Thread mergebot-role
This is an automated email from the ASF dual-hosted git repository.

mergebot-role pushed a commit to branch mergebot
in repository https://gitbox.apache.org/repos/asf/beam-site.git

commit e59956d62241b27542259fe044a36b0816bdbc19
Author: Kenneth Knowles 
AuthorDate: Wed Feb 14 11:36:59 2018 -0800

Update Gemfile.lock
---
 Gemfile.lock | 78 +---
 1 file changed, 43 insertions(+), 35 deletions(-)

diff --git a/Gemfile.lock b/Gemfile.lock
index 1ab575d..f3ca8ed 100644
--- a/Gemfile.lock
+++ b/Gemfile.lock
@@ -1,29 +1,31 @@
 GEM
   remote: https://rubygems.org/
   specs:
-activesupport (4.2.7.1)
+activesupport (4.2.10)
   i18n (~> 0.7)
-  json (~> 1.7, >= 1.7.7)
   minitest (~> 5.1)
   thread_safe (~> 0.3, >= 0.3.4)
   tzinfo (~> 1.1)
-addressable (2.4.0)
+addressable (2.5.2)
+  public_suffix (>= 2.0.2, < 4.0)
 colorator (1.1.0)
-colored (1.2)
-ethon (0.9.1)
+colorize (0.8.1)
+concurrent-ruby (1.0.5)
+ethon (0.11.0)
   ffi (>= 1.3.0)
-ffi (1.9.14)
+ffi (1.9.21)
 forwardable-extended (2.6.0)
-html-proofer (3.3.1)
+html-proofer (3.8.0)
   activesupport (>= 4.2, < 6.0)
   addressable (~> 2.3)
-  colored (~> 1.2)
+  colorize (~> 0.8)
   mercenary (~> 0.3.2)
-  nokogiri (~> 1.5)
+  nokogiri (~> 1.8.1)
   parallel (~> 1.3)
-  typhoeus (~> 0.7)
+  typhoeus (~> 1.3)
   yell (~> 2.0)
-i18n (0.7.0)
+i18n (0.9.5)
+  concurrent-ruby (~> 1.0)
 jekyll (3.2.0)
   colorator (~> 1.0)
   jekyll-sass-converter (~> 1.0)
@@ -36,40 +38,46 @@ GEM
   safe_yaml (~> 1.0)
 jekyll-redirect-from (0.11.0)
   jekyll (>= 2.0)
-jekyll-sass-converter (1.4.0)
+jekyll-sass-converter (1.5.2)
   sass (~> 3.4)
-jekyll-watch (1.5.0)
-  listen (~> 3.0, < 3.1)
-jekyll_github_sample (0.3.0)
+jekyll-watch (1.5.1)
+  listen (~> 3.0)
+jekyll_github_sample (0.3.1)
   activesupport (~> 4.0)
   jekyll (~> 3.0)
-json (1.8.3)
-kramdown (1.12.0)
+kramdown (1.16.2)
 liquid (3.0.6)
-listen (3.0.8)
+listen (3.1.5)
   rb-fsevent (~> 0.9, >= 0.9.4)
   rb-inotify (~> 0.9, >= 0.9.7)
+  ruby_dep (~> 1.2)
 mercenary (0.3.6)
-mini_portile2 (2.1.0)
-minitest (5.9.1)
-nokogiri (1.6.8.1)
-  mini_portile2 (~> 2.1.0)
-parallel (1.9.0)
-pathutil (0.14.0)
+mini_portile2 (2.3.0)
+minitest (5.11.3)
+nokogiri (1.8.2)
+  mini_portile2 (~> 2.3.0)
+parallel (1.12.1)
+pathutil (0.16.1)
   forwardable-extended (~> 2.6)
-rake (11.3.0)
-rb-fsevent (0.9.7)
-rb-inotify (0.9.7)
-  ffi (>= 0.5.0)
+public_suffix (3.0.2)
+rake (12.3.0)
+rb-fsevent (0.10.2)
+rb-inotify (0.9.10)
+  ffi (>= 0.5.0, < 2)
 rouge (1.11.1)
+ruby_dep (1.5.0)
 safe_yaml (1.0.4)
-sass (3.4.22)
-thread_safe (0.3.5)
-typhoeus (0.8.0)
-  ethon (>= 0.8.0)
-tzinfo (1.2.2)
+sass (3.5.5)
+  sass-listen (~> 4.0.0)
+sass-listen (4.0.0)
+  rb-fsevent (~> 0.9, >= 0.9.4)
+  rb-inotify (~> 0.9, >= 0.9.7)
+thread_safe (0.3.6)
+typhoeus (1.3.0)
+  ethon (>= 0.9.0)
+tzinfo (1.2.5)
   thread_safe (~> 0.1)
-yell (2.0.6)
+yell (2.0.7)
 
 PLATFORMS
   ruby
@@ -84,4 +92,4 @@ DEPENDENCIES
   rake
 
 BUNDLED WITH
-   1.13.7
+   1.16.0

-- 
To stop receiving notification emails like this one, please contact
mergebot-r...@apache.org.


[jira] [Assigned] (BEAM-3753) Integration ITCase tests are not executed

2018-02-28 Thread Grzegorz Kołakowski (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3753?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Grzegorz Kołakowski reassigned BEAM-3753:
-

Assignee: Grzegorz Kołakowski  (was: Aljoscha Krettek)

> Integration ITCase tests are not executed
> -
>
> Key: BEAM-3753
> URL: https://issues.apache.org/jira/browse/BEAM-3753
> Project: Beam
>  Issue Type: Bug
>  Components: runner-flink
>Reporter: Grzegorz Kołakowski
>Assignee: Grzegorz Kołakowski
>Priority: Major
>
> The flink-runner {{*ITCase.java}} tests are not executed at all, either by
> the surefire or the failsafe plugin.
>  * org.apache.beam.runners.flink.ReadSourceStreamingITCase
>  * org.apache.beam.runners.flink.ReadSourceITCase
>  * org.apache.beam.runners.flink.streaming.TopWikipediaSessionsITCase
> In addition, two of them fail if run manually from the IDE.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Build failed in Jenkins: beam_PerformanceTests_Spark #1411

2018-02-28 Thread Apache Jenkins Server
See 


--
[...truncated 91.08 KB...]
'apache-beam-testing:bqjob_r4e39cc2afb517109_0161dc5a40fb_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: 
/usr/lib/google-cloud-sdk/platform/bq/third_party/oauth2client/contrib/gce.py:73:
 UserWarning: You have requested explicit scopes to be used with a GCE service 
account.
Using this argument will have no effect on the actual scopes for tokens
requested. These scopes are set at VM instance creation time and
can't be overridden in the request.

  warnings.warn(_SCOPES_WARNING)

2018-02-28 12:19:20,006 c44f84ae MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.

2018-02-28 12:19:42,282 c44f84ae MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-02-28 12:19:44,603 c44f84ae MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 

  ReturnCode:1,  WallTime:0:02.31s,  CPU:0.34s,  MaxMemory:25340kb 
STDOUT: Upload complete.
Waiting on bqjob_r391f04875ad8ecfb_0161dc5aa1b3_1 ... (0s) Current status: 
RUNNING 
 Waiting on bqjob_r391f04875ad8ecfb_0161dc5aa1b3_1 ... (0s) 
Current status: DONE   
BigQuery error in load operation: Error processing job
'apache-beam-testing:bqjob_r391f04875ad8ecfb_0161dc5aa1b3_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: 
/usr/lib/google-cloud-sdk/platform/bq/third_party/oauth2client/contrib/gce.py:73:
 UserWarning: You have requested explicit scopes to be used with a GCE service 
account.
Using this argument will have no effect on the actual scopes for tokens
requested. These scopes are set at VM instance creation time and
can't be overridden in the request.

  warnings.warn(_SCOPES_WARNING)

2018-02-28 12:19:44,604 c44f84ae MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.

2018-02-28 12:20:02,078 c44f84ae MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-02-28 12:20:04,283 c44f84ae MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 

  ReturnCode:1,  WallTime:0:02.19s,  CPU:0.34s,  MaxMemory:25512kb 
STDOUT: Upload complete.
Waiting on bqjob_r668c363181b68f64_0161dc5aeeff_1 ... (0s) Current status: 
RUNNING 
 Waiting on bqjob_r668c363181b68f64_0161dc5aeeff_1 ... (0s) 
Current status: DONE   
BigQuery error in load operation: Error processing job
'apache-beam-testing:bqjob_r668c363181b68f64_0161dc5aeeff_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: 
/usr/lib/google-cloud-sdk/platform/bq/third_party/oauth2client/contrib/gce.py:73:
 UserWarning: You have requested explicit scopes to be used with a GCE service 
account.
Using this argument will have no effect on the actual scopes for tokens
requested. These scopes are set at VM instance creation time and
can't be overridden in the request.

  warnings.warn(_SCOPES_WARNING)

2018-02-28 12:20:04,284 c44f84ae MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.

2018-02-28 12:20:23,962 c44f84ae MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-02-28 12:20:26,134 c44f84ae MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 

  ReturnCode:1,  WallTime:0:02.16s,  CPU:0.30s,  MaxMemory:25336kb 
STDOUT: Upload complete.
Waiting on bqjob_r45351b88d69fe11a_0161dc5b4457_1 ... (0s) Current status: 
RUNNING 
 Waiting on bqjob_r45351b88d69fe11a_0161dc5b4457_1 ... (0s) 
Current status: DONE   
BigQuery error in load operation: Error processing job

Build failed in Jenkins: beam_PerformanceTests_JDBC #271

2018-02-28 Thread Apache Jenkins Server
See 


--
[...truncated 721.06 KB...]
at 
com.google.cloud.dataflow.worker.DefaultParDoFnFactory.create(DefaultParDoFnFactory.java:74)
at 
com.google.cloud.dataflow.worker.MapTaskExecutorFactory.createParDoOperation(MapTaskExecutorFactory.java:481)
at 
com.google.cloud.dataflow.worker.MapTaskExecutorFactory$3.typedApply(MapTaskExecutorFactory.java:392)
at 
com.google.cloud.dataflow.worker.MapTaskExecutorFactory$3.typedApply(MapTaskExecutorFactory.java:374)
at 
com.google.cloud.dataflow.worker.graph.Networks$TypeSafeNodeFunction.apply(Networks.java:63)
at 
com.google.cloud.dataflow.worker.graph.Networks$TypeSafeNodeFunction.apply(Networks.java:50)
at 
com.google.cloud.dataflow.worker.graph.Networks.replaceDirectedNetworkNodes(Networks.java:87)
at 
com.google.cloud.dataflow.worker.MapTaskExecutorFactory.create(MapTaskExecutorFactory.java:158)
at 
com.google.cloud.dataflow.worker.DataflowWorker.doWork(DataflowWorker.java:308)
at 
com.google.cloud.dataflow.worker.DataflowWorker.getAndPerformWork(DataflowWorker.java:264)
at 
com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:133)
at 
com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:113)
at 
com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:100)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
(b31e705f47e562a3): java.lang.RuntimeException: 
org.apache.beam.sdk.util.UserCodeException: org.postgresql.util.PSQLException: 
The connection attempt failed.
at 
com.google.cloud.dataflow.worker.MapTaskExecutorFactory$3.typedApply(MapTaskExecutorFactory.java:404)
at 
com.google.cloud.dataflow.worker.MapTaskExecutorFactory$3.typedApply(MapTaskExecutorFactory.java:374)
at 
com.google.cloud.dataflow.worker.graph.Networks$TypeSafeNodeFunction.apply(Networks.java:63)
at 
com.google.cloud.dataflow.worker.graph.Networks$TypeSafeNodeFunction.apply(Networks.java:50)
at 
com.google.cloud.dataflow.worker.graph.Networks.replaceDirectedNetworkNodes(Networks.java:87)
at 
com.google.cloud.dataflow.worker.MapTaskExecutorFactory.create(MapTaskExecutorFactory.java:158)
at 
com.google.cloud.dataflow.worker.DataflowWorker.doWork(DataflowWorker.java:308)
at 
com.google.cloud.dataflow.worker.DataflowWorker.getAndPerformWork(DataflowWorker.java:264)
at 
com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:133)
at 
com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:113)
at 
com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:100)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.beam.sdk.util.UserCodeException: 
org.postgresql.util.PSQLException: The connection attempt failed.
at 
org.apache.beam.sdk.util.UserCodeException.wrap(UserCodeException.java:36)
at 
org.apache.beam.sdk.io.jdbc.JdbcIO$ReadFn$DoFnInvoker.invokeSetup(Unknown 
Source)
at 
com.google.cloud.dataflow.worker.DoFnInstanceManagers$ConcurrentQueueInstanceManager.deserializeCopy(DoFnInstanceManagers.java:63)
at 
com.google.cloud.dataflow.worker.DoFnInstanceManagers$ConcurrentQueueInstanceManager.peek(DoFnInstanceManagers.java:45)
at 
com.google.cloud.dataflow.worker.UserParDoFnFactory.create(UserParDoFnFactory.java:94)
at 
com.google.cloud.dataflow.worker.DefaultParDoFnFactory.create(DefaultParDoFnFactory.java:74)
at 
com.google.cloud.dataflow.worker.MapTaskExecutorFactory.createParDoOperation(MapTaskExecutorFactory.java:481)
at 
com.google.cloud.dataflow.worker.MapTaskExecutorFactory$3.typedApply(MapTaskExecutorFactory.java:392)
... 14 more
Caused by: org.postgresql.util.PSQLException: The connection attempt failed.
at 
org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:272)
at 
org.postgresql.core.ConnectionFactory.openConnection(ConnectionFactory.java:51)
at org.postgresql.jdbc.PgConnection.<init>(PgConnection.java:215)
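Both exceptions above bottom out in PSQLException: The connection attempt
failed, raised from JdbcIO.ReadFn's setup, i.e. the Dataflow workers never
reach the Postgres endpoint at all, before credentials or SQL come into play.
A pre-flight reachability probe run from the test harness can separate
networking problems from JDBC configuration; a sketch, with placeholder host
and port:

# Sketch: TCP-level reachability probe for the Postgres endpoint the
# JdbcIO pipeline will use. Host and port below are placeholders.
import socket

def can_connect(host, port, timeout=5.0):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        sock = socket.create_connection((host, port), timeout=timeout)
        sock.close()
        return True
    except (socket.error, socket.timeout):
        return False

if not can_connect("postgres-test-host.example.com", 5432):
    raise SystemExit("Postgres unreachable: check firewall / VPC rules")

This only verifies connectivity, not authentication, but it distinguishes the
failure mode above from a bad JDBC URL or wrong credentials.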

Build failed in Jenkins: beam_PerformanceTests_Python #967

2018-02-28 Thread Apache Jenkins Server
See 


--
[...truncated 726 B...]
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 948988c921747ab6298059a94daf1180e5b76cd4 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 948988c921747ab6298059a94daf1180e5b76cd4
Commit message: "Merge pull request #4764: [BEAM-3760] Scan Core Construction 
NeedsRunner Tests"
 > git rev-list --no-walk 948988c921747ab6298059a94daf1180e5b76cd4 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins1671596526708896429.sh
+ rm -rf PerfKitBenchmarker
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins4494764489942789651.sh
+ rm -rf .env
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins5998774611413045254.sh
+ virtualenv .env --system-site-packages
New python executable in 

Installing setuptools, pip, wheel...done.
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins3280614105053070620.sh
+ .env/bin/pip install --upgrade setuptools pip
Requirement already up-to-date: setuptools in ./.env/lib/python2.7/site-packages
Requirement already up-to-date: pip in ./.env/lib/python2.7/site-packages
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins3949566164283538836.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git
Cloning into 'PerfKitBenchmarker'...
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins3403035176663608705.sh
+ .env/bin/pip install -r PerfKitBenchmarker/requirements.txt
Requirement already satisfied: absl-py in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 14))
Requirement already satisfied: jinja2>=2.7 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 15))
Requirement already satisfied: setuptools in ./.env/lib/python2.7/site-packages 
(from -r PerfKitBenchmarker/requirements.txt (line 16))
Requirement already satisfied: colorlog[windows]==2.6.0 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 17))
Requirement already satisfied: blinker>=1.3 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 18))
Requirement already satisfied: futures>=3.0.3 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 19))
Requirement already satisfied: PyYAML==3.12 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 20))
Requirement already satisfied: pint>=0.7 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 21))
Collecting numpy==1.13.3 (from -r PerfKitBenchmarker/requirements.txt (line 22))
:318:
 SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name 
Indication) extension to TLS is not available on this platform. This may cause 
the server to present an incorrect TLS certificate, which can cause validation 
failures. You can upgrade to a newer version of Python to solve this. For more 
information, see 
https://urllib3.readthedocs.io/en/latest/security.html#snimissingwarning.
  SNIMissingWarning
:122:
 InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/security.html#insecureplatformwarning.
  InsecurePlatformWarning
  Using cached numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Requirement already satisfied: functools32 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 23))
Requirement already satisfied: contextlib2>=0.5.1 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 24))
Requirement already satisfied: pywinrm in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 25))
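The SNIMissingWarning and InsecurePlatformWarning emitted during the numpy
download come from urllib3 running on a pre-2.7.9 Python, whose ssl module
lacks SNI support and a real SSLContext. Besides upgrading Python as the
warnings suggest, a known workaround is to back urllib3 with pyOpenSSL; a
sketch, assuming pyOpenSSL, ndg-httpsclient and pyasn1 are installed (e.g.
via pip install 'requests[security]'):

# Sketch: route urllib3's TLS through pyOpenSSL so SNI and a real
# SSLContext are available on old Python 2.7 interpreters.
# Assumes pyOpenSSL, ndg-httpsclient and pyasn1 are installed.
import urllib3.contrib.pyopenssl

urllib3.contrib.pyopenssl.inject_into_urllib3()
# Subsequent urllib3/requests HTTPS calls now use pyOpenSSL under the hood.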

Build failed in Jenkins: beam_PostCommit_Python_Verify #4328

2018-02-28 Thread Apache Jenkins Server
See 


--
[...truncated 1.02 MB...]
copying 
apache_beam/runners/dataflow/internal/clients/dataflow/message_matchers.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/internal/clients/dataflow
copying 
apache_beam/runners/dataflow/internal/clients/dataflow/message_matchers_test.py 
-> apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/internal/clients/dataflow
copying apache_beam/runners/dataflow/native_io/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/native_io
copying apache_beam/runners/dataflow/native_io/iobase.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/native_io
copying apache_beam/runners/dataflow/native_io/iobase_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/native_io
copying apache_beam/runners/dataflow/native_io/streaming_create.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/native_io
copying apache_beam/runners/direct/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/bundle_factory.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/clock.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/consumer_tracking_pipeline_visitor.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/consumer_tracking_pipeline_visitor_test.py 
-> apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/direct_metrics.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/direct_metrics_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/direct_runner.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/direct_runner_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/evaluation_context.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/executor.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/helper_transforms.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/sdf_direct_runner.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/sdf_direct_runner_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/transform_evaluator.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/util.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/watermark_manager.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/experimental/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/experimental
copying apache_beam/runners/experimental/python_rpc_direct/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/experimental/python_rpc_direct
copying 
apache_beam/runners/experimental/python_rpc_direct/python_rpc_direct_runner.py 
-> apache-beam-2.4.0.dev0/apache_beam/runners/experimental/python_rpc_direct
copying apache_beam/runners/experimental/python_rpc_direct/server.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/experimental/python_rpc_direct
copying apache_beam/runners/job/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/job
copying apache_beam/runners/job/manager.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/job
copying apache_beam/runners/job/utils.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/job
copying apache_beam/runners/portability/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/fn_api_runner.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/fn_api_runner_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/maptask_executor_runner.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/maptask_executor_runner_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/universal_local_runner.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/universal_local_runner_main.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/universal_local_runner_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/test/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/test
copying apache_beam/runners/worker/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/bundle_processor.py -> 

Build failed in Jenkins: beam_PostCommit_Python_ValidatesRunner_Dataflow #1009

2018-02-28 Thread Apache Jenkins Server
See 


--
[...truncated 131.75 KB...]
  File 
"
 line 382, in run
return self.runner.run_pipeline(self)
  File 
"
 line 285, in run_pipeline
return_context=True)
  File 
"
 line 580, in to_runner_api
root_transform_id = context.transforms.get_id(self._root_transform())
  File 
"
 line 60, in get_id
self._id_to_proto[id] = obj.to_runner_api(self._pipeline_context)
  File 
"
 line 810, in to_runner_api
for part in self.parts],
  File 
"
 line 60, in get_id
self._id_to_proto[id] = obj.to_runner_api(self._pipeline_context)
  File 
"
 line 810, in to_runner_api
for part in self.parts],
  File 
"
 line 60, in get_id
self._id_to_proto[id] = obj.to_runner_api(self._pipeline_context)
  File 
"
 line 810, in to_runner_api
for part in self.parts],
  File 
"
 line 60, in get_id
self._id_to_proto[id] = obj.to_runner_api(self._pipeline_context)
  File 
"
 line 808, in to_runner_api
spec=transform_to_runner_api(self.transform, context),
  File 
"
 line 805, in transform_to_runner_api
return transform.to_runner_api(context)
  File 
"
 line 542, in to_runner_api
urn, typed_param = self.to_runner_api_parameter(context)
  File 
"
 line 839, in to_runner_api_parameter
source=self.source.to_runner_api(context),
  File 
"
 line 94, in to_runner_api
urn, typed_param = self.to_runner_api_parameter(context)
  File 
"
 line 82, in <lambda>
pickle_urn, wrappers_pb2.BytesValue(value=pickler.dumps(self
  File 
"
 line 193, in dumps
s = dill.dumps(o)
  File 
"
 line 259, in dumps
dump(obj, file, protocol, byref, fmode, recurse)#, strictio)
  File 
"
 line 252, in dump
pik.dump(obj)
  File "/usr/lib/python2.7/pickle.py", line 224, in dump
self.save(obj)
  File "/usr/lib/python2.7/pickle.py", line 331, in save
self.save_reduce(obj=obj, *rv)
  File "/usr/lib/python2.7/pickle.py", line 396, in save_reduce
save(cls)
  File "/usr/lib/python2.7/pickle.py", line 286, in save
f(self, obj) # Call unbound method with explicit self
  File 
"
 line 94, in wrapper
obj=obj)
  File "/usr/lib/python2.7/pickle.py", line 401, in save_reduce
save(args)
  File "/usr/lib/python2.7/pickle.py", line 286, in save
f(self, obj) # Call unbound method with explicit self
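The stack above shows the submission dying inside pickler.dumps while the
pipeline graph is converted to its Runner API proto, meaning some transform
(or something it closes over) is not serializable by dill. A failure like
this can usually be reproduced locally, without Dataflow, by round-tripping
the suspect transform through Beam's own pickler; a sketch with a stand-in
DoFn:

# Sketch: reproduce runner-API pickling failures locally by round-tripping
# a transform's user code through Beam's pickler (dill underneath).
# MyDoFn is a stand-in for whatever transform the traceback points at.
import apache_beam as beam
from apache_beam.internal import pickler

class MyDoFn(beam.DoFn):
    def process(self, element):
        yield element

# Raises the same pickling error seen above if the DoFn, or anything it
# closes over, cannot be serialized.
roundtripped = pickler.loads(pickler.dumps(MyDoFn()))
print(type(roundtripped))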

[jira] [Assigned] (BEAM-3359) Unable to change "flinkMaster" from "[auto]" in TestFlinkRunner

2018-02-28 Thread Dawid Wysakowicz (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3359?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dawid Wysakowicz reassigned BEAM-3359:
--

Assignee: Dawid Wysakowicz

> Unable to change "flinkMaster" from "[auto]" in TestFlinkRunner
> ---
>
> Key: BEAM-3359
> URL: https://issues.apache.org/jira/browse/BEAM-3359
> Project: Beam
>  Issue Type: Bug
>  Components: runner-flink
>Reporter: Łukasz Gajowy
>Assignee: Dawid Wysakowicz
>Priority: Minor
>
> In TestFlinkRunner's constructor there is a line like this:
> {{options.setFlinkMaster("\[auto\]");}}
> which effectively ignores any "flinkMaster" provided earlier (e.g. via the
> command line), leading to errors that are hard to diagnose (for example
> wondering: "I provided a good URL in the pipeline options... why is it not
> connecting to my cluster?").
> Setting a {{@Default.String("\[auto\]")}} in FlinkPipelineOptions could be
> one solution.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)