[GitHub] beam pull request #2780: [BEAM-2214] KafkaIO: fix display data for read/writ...

2017-04-29 Thread peay
GitHub user peay opened a pull request:

https://github.com/apache/beam/pull/2780

[BEAM-2214] KafkaIO: fix display data for read/write with coders, error 
messages

Be sure to do all of the following to help us incorporate your contribution
quickly and easily:

 - [x] Make sure the PR title is formatted like:
   `[BEAM-<Jira issue #>] Description of pull request`
 - [ ] Make sure tests pass via `mvn clean verify`. (Even better, enable
   Travis-CI on your fork and ensure the whole test matrix passes).
 - [x] Replace `<Jira issue #>` in the title with the actual Jira issue
   number, if there is one.
 - [ ] If this contribution is large, please file an Apache
   [Individual Contributor License 
Agreement](https://www.apache.org/licenses/icla.pdf).

---

cc @jkff @dhalperi 

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/peay/beam beam-2114

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/2780.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2780


commit 3e2e889d4095091334c82c2ec73759fa6107b583
Author: peay 
Date:   2017-04-29T09:08:21Z

[BEAM-2114] Fixed display data for Kafka read/write with coders

commit 3ae50571f5dec5de91e6800abcc10f5db7b83a6c
Author: peay 
Date:   2017-04-29T09:31:15Z

[BEAM-2114] Throw instead of warning when KafkaIO cannot infer coder
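
For context, the two commits change how KafkaIO pairs coders with deserializers. A minimal sketch of a KafkaIO read that supplies coders explicitly, so it never depends on inference (hypothetical broker and topic names, Beam 2.x-era API; this is an illustration, not code from the PR):

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.coders.StringUtf8Coder;
import org.apache.beam.sdk.coders.VarLongCoder;
import org.apache.beam.sdk.io.kafka.KafkaIO;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.PCollection;
import org.apache.kafka.common.serialization.LongDeserializer;
import org.apache.kafka.common.serialization.StringDeserializer;

class KafkaReadExample {
  static PCollection<KV<Long, String>> readEvents(Pipeline pipeline) {
    return pipeline.apply(
        KafkaIO.<Long, String>read()
            .withBootstrapServers("broker-1:9092")  // hypothetical broker address
            .withTopic("events")                    // hypothetical topic name
            // Coders are normally inferred from the deserializer classes; per the
            // second commit, a failed inference throws instead of logging a warning,
            // so pairing each deserializer with an explicit coder sidesteps inference.
            .withKeyDeserializerAndCoder(LongDeserializer.class, VarLongCoder.of())
            .withValueDeserializerAndCoder(StringDeserializer.class, StringUtf8Coder.of())
            .withoutMetadata());
  }
}
```

Supplying both the deserializer and the coder keeps the wire format (Kafka deserialization) independent of how Beam serializes elements between pipeline stages.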




---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[jira] [Updated] (BEAM-1148) Port PAssert away from Aggregators

2017-04-29 Thread Aviem Zur (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-1148?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Aviem Zur updated BEAM-1148:

Fix Version/s: First stable release

> Port PAssert away from Aggregators
> --
>
> Key: BEAM-1148
> URL: https://issues.apache.org/jira/browse/BEAM-1148
> Project: Beam
>  Issue Type: New Feature
>  Components: sdk-java-core
>Reporter: Kenneth Knowles
>Assignee: Pablo Estrada
> Fix For: First stable release
>
>
> One step in the removal of Aggregators (in favor of Metrics) is to remove our 
> reliance on them for PAssert checking.
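
By way of illustration, the Metrics-based replacement for an Aggregator-backed count looks like the following sketch (a generic word-count DoFn assumed for illustration, not the actual PAssert internals):

```java
import org.apache.beam.sdk.metrics.Counter;
import org.apache.beam.sdk.metrics.Metrics;
import org.apache.beam.sdk.transforms.DoFn;

/** A DoFn that reports a counter via the Metrics API rather than an Aggregator. */
class ExtractWordsFn extends DoFn<String, String> {
  // Namespaced by the owning class; queryable after the run via MetricResults.
  private final Counter emptyLines = Metrics.counter(ExtractWordsFn.class, "emptyLines");

  @ProcessElement
  public void processElement(ProcessContext c) {
    if (c.element().trim().isEmpty()) {
      emptyLines.inc();
    }
    for (String word : c.element().split("[^a-zA-Z']+")) {
      if (!word.isEmpty()) {
        c.output(word);
      }
    }
  }
}
```

Unlike Aggregators, Metrics are declared anywhere in user code and read back through the runner's metrics support, which is what lets PAssert drop its Aggregator dependency.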



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Assigned] (BEAM-1148) Port PAssert away from Aggregators

2017-04-29 Thread Aviem Zur (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-1148?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Aviem Zur reassigned BEAM-1148:
---

Assignee: Pablo Estrada

> Port PAssert away from Aggregators
> --
>
> Key: BEAM-1148
> URL: https://issues.apache.org/jira/browse/BEAM-1148
> Project: Beam
>  Issue Type: New Feature
>  Components: sdk-java-core
>Reporter: Kenneth Knowles
>Assignee: Pablo Estrada
> Fix For: First stable release
>
>
> One step in the removal of Aggregators (in favor of Metrics) is to remove our 
> reliance on them for PAssert checking.





[jira] [Resolved] (BEAM-1148) Port PAssert away from Aggregators

2017-04-29 Thread Aviem Zur (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-1148?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Aviem Zur resolved BEAM-1148.
-
Resolution: Done

> Port PAssert away from Aggregators
> --
>
> Key: BEAM-1148
> URL: https://issues.apache.org/jira/browse/BEAM-1148
> Project: Beam
>  Issue Type: New Feature
>  Components: sdk-java-core
>Reporter: Kenneth Knowles
>Assignee: Pablo Estrada
> Fix For: First stable release
>
>
> One step in the removal of Aggregators (in favor of Metrics) is to remove our 
> reliance on them for PAssert checking.





[jira] [Updated] (BEAM-1148) Port PAssert away from Aggregators

2017-04-29 Thread Aviem Zur (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-1148?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Aviem Zur updated BEAM-1148:

Issue Type: Improvement  (was: New Feature)

> Port PAssert away from Aggregators
> --
>
> Key: BEAM-1148
> URL: https://issues.apache.org/jira/browse/BEAM-1148
> Project: Beam
>  Issue Type: Improvement
>  Components: sdk-java-core
>Reporter: Kenneth Knowles
>Assignee: Pablo Estrada
> Fix For: First stable release
>
>
> One step in the removal of Aggregators (in favor of Metrics) is to remove our 
> reliance on them for PAssert checking.





[jira] [Resolved] (BEAM-1814) Support for new State API in Spark runner

2017-04-29 Thread Aviem Zur (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-1814?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Aviem Zur resolved BEAM-1814.
-
   Resolution: Duplicate
Fix Version/s: Not applicable

> Support for new State API in Spark runner
> -
>
> Key: BEAM-1814
> URL: https://issues.apache.org/jira/browse/BEAM-1814
> Project: Beam
>  Issue Type: New Feature
>  Components: runner-spark
>Reporter: Etienne Chauchot
>Assignee: Amit Sela
> Fix For: Not applicable
>
>






[jira] [Assigned] (BEAM-2111) java.lang.ClassCastException in spark in streaming mode

2017-04-29 Thread Aviem Zur (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-2111?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Aviem Zur reassigned BEAM-2111:
---

Assignee: Aviem Zur  (was: Amit Sela)

> java.lang.ClassCastException in spark in streaming mode
> ---
>
> Key: BEAM-2111
> URL: https://issues.apache.org/jira/browse/BEAM-2111
> Project: Beam
>  Issue Type: Bug
>  Components: runner-spark
>Reporter: Etienne Chauchot
>Assignee: Aviem Zur
>
> Bug can be reproduced with :
> run Nexmark query5 (https://github.com/iemejia/beam/tree/BEAM-160-nexmark) in 
> streaming mode using Spark.
> Run main in 
> {code}org.apache.beam.integration.nexmark.drivers.NexmarkSparkDriver{code}
> with VMOptions:
> {code} -Dspark.ui.enabled=false -DSPARK_LOCAL_IP=localhost 
> -Dsun.io.serialization.extendedDebugInfo=true {code}
> with Program arguments:
> {code}--query=5  --streaming=true --numEventGenerators=4 
> --manageResources=false --monitorJobs=true --enforceEncodability=false 
> --enforceImmutability=false{code}
> StackTrace is 
> {code}
>  com.esotericsoftware.kryo.KryoException: java.lang.RuntimeException: 
> java.lang.ClassCastException: [J cannot be cast to [Ljava.lang.Object;
> Serialization trace:
> key (org.apache.beam.sdk.values.KV)
> value (org.apache.beam.sdk.values.KV)
> value (org.apache.beam.sdk.util.WindowedValue$TimestampedValueInSingleWindow)
>   at 
> com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:125)
>   at 
> com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:528)
>   at com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:679)
>   at 
> com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:106)
>   at 
> com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:528)
>   at com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:679)
>   at 
> com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:106)
>   at 
> com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:528)
>   at com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:761)
>   at com.twitter.chill.Tuple2Serializer.read(TupleSerializers.scala:42)
>   at com.twitter.chill.Tuple2Serializer.read(TupleSerializers.scala:33)
>   at com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:761)
>   at 
> org.apache.spark.serializer.KryoDeserializationStream.readObject(KryoSerializer.scala:228)
>   at 
> org.apache.spark.serializer.DeserializationStream$$anon$1.getNext(Serializer.scala:181)
>   at org.apache.spark.util.NextIterator.hasNext(NextIterator.scala:73)
>   at 
> org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:39)
>   at scala.collection.Iterator$$anon$14.hasNext(Iterator.scala:388)
>   at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:327)
>   at 
> scala.collection.convert.Wrappers$IteratorWrapper.hasNext(Wrappers.scala:29)
>   at 
> org.apache.beam.runners.spark.translation.SparkProcessContext.processPartition(SparkProcessContext.java:64)
>   at 
> org.apache.beam.runners.spark.translation.MultiDoFnFunction.call(MultiDoFnFunction.java:114)
>   at 
> org.apache.beam.runners.spark.translation.MultiDoFnFunction.call(MultiDoFnFunction.java:52)
>   at 
> org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$7$1.apply(JavaRDDLike.scala:192)
>   at 
> org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$7$1.apply(JavaRDDLike.scala:192)
>   at 
> org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$20.apply(RDD.scala:710)
>   at 
> org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$20.apply(RDD.scala:710)
>   at 
> org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
>   at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
>   at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
>   at 
> org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
>   at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
>   at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
>   at 
> org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
>   at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
>   at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
>   at org.apache.spark.rdd.UnionRDD.compute(UnionRDD.scala:87)
>   at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
>   at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
>   at 
> org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
>   at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
>   at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
> 

[GitHub] beam pull request #2781: Fix broken build

2017-04-29 Thread aviemzur
GitHub user aviemzur opened a pull request:

https://github.com/apache/beam/pull/2781

Fix broken build

Be sure to do all of the following to help us incorporate your contribution
quickly and easily:

 - [ ] Make sure the PR title is formatted like:
   `[BEAM-<Jira issue #>] Description of pull request`
 - [ ] Make sure tests pass via `mvn clean verify`. (Even better, enable
   Travis-CI on your fork and ensure the whole test matrix passes).
 - [ ] Replace `<Jira issue #>` in the title with the actual Jira issue
   number, if there is one.
 - [ ] If this contribution is large, please file an Apache
   [Individual Contributor License 
Agreement](https://www.apache.org/licenses/icla.pdf).

---


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/aviemzur/beam fix-broken-build

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/2781.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2781


commit 2384e46be0f49ffa9e5c94ec929b344a46a72af6
Author: Aviem Zur 
Date:   2017-04-29T11:08:06Z

Fix broken build






Build failed in Jenkins: beam_PerformanceTests_Dataflow #351

2017-04-29 Thread Apache Jenkins Server
See 


--
[...truncated 275.47 KB...]
 * [new ref] refs/pull/2709/head -> origin/pr/2709/head
 * [new ref] refs/pull/2709/merge -> origin/pr/2709/merge
 * [new ref] refs/pull/2710/head -> origin/pr/2710/head
 * [new ref] refs/pull/2710/merge -> origin/pr/2710/merge
 * [new ref] refs/pull/2711/head -> origin/pr/2711/head
 * [new ref] refs/pull/2711/merge -> origin/pr/2711/merge
 * [new ref] refs/pull/2712/head -> origin/pr/2712/head
 * [new ref] refs/pull/2712/merge -> origin/pr/2712/merge
 * [new ref] refs/pull/2713/head -> origin/pr/2713/head
 * [new ref] refs/pull/2713/merge -> origin/pr/2713/merge
 * [new ref] refs/pull/2714/head -> origin/pr/2714/head
 * [new ref] refs/pull/2714/merge -> origin/pr/2714/merge
 * [new ref] refs/pull/2715/head -> origin/pr/2715/head
 * [new ref] refs/pull/2715/merge -> origin/pr/2715/merge
 * [new ref] refs/pull/2716/head -> origin/pr/2716/head
 * [new ref] refs/pull/2716/merge -> origin/pr/2716/merge
 * [new ref] refs/pull/2717/head -> origin/pr/2717/head
 * [new ref] refs/pull/2717/merge -> origin/pr/2717/merge
 * [new ref] refs/pull/2718/head -> origin/pr/2718/head
 * [new ref] refs/pull/2718/merge -> origin/pr/2718/merge
 * [new ref] refs/pull/2719/head -> origin/pr/2719/head
 * [new ref] refs/pull/2719/merge -> origin/pr/2719/merge
 * [new ref] refs/pull/2720/head -> origin/pr/2720/head
 * [new ref] refs/pull/2721/head -> origin/pr/2721/head
 * [new ref] refs/pull/2721/merge -> origin/pr/2721/merge
 * [new ref] refs/pull/2722/head -> origin/pr/2722/head
 * [new ref] refs/pull/2722/merge -> origin/pr/2722/merge
 * [new ref] refs/pull/2723/head -> origin/pr/2723/head
 * [new ref] refs/pull/2723/merge -> origin/pr/2723/merge
 * [new ref] refs/pull/2724/head -> origin/pr/2724/head
 * [new ref] refs/pull/2724/merge -> origin/pr/2724/merge
 * [new ref] refs/pull/2725/head -> origin/pr/2725/head
 * [new ref] refs/pull/2726/head -> origin/pr/2726/head
 * [new ref] refs/pull/2727/head -> origin/pr/2727/head
 * [new ref] refs/pull/2727/merge -> origin/pr/2727/merge
 * [new ref] refs/pull/2728/head -> origin/pr/2728/head
 * [new ref] refs/pull/2729/head -> origin/pr/2729/head
 * [new ref] refs/pull/2729/merge -> origin/pr/2729/merge
 * [new ref] refs/pull/2730/head -> origin/pr/2730/head
 * [new ref] refs/pull/2730/merge -> origin/pr/2730/merge
 * [new ref] refs/pull/2731/head -> origin/pr/2731/head
 * [new ref] refs/pull/2732/head -> origin/pr/2732/head
 * [new ref] refs/pull/2732/merge -> origin/pr/2732/merge
 * [new ref] refs/pull/2733/head -> origin/pr/2733/head
 * [new ref] refs/pull/2733/merge -> origin/pr/2733/merge
 * [new ref] refs/pull/2734/head -> origin/pr/2734/head
 * [new ref] refs/pull/2735/head -> origin/pr/2735/head
 * [new ref] refs/pull/2735/merge -> origin/pr/2735/merge
 * [new ref] refs/pull/2736/head -> origin/pr/2736/head
 * [new ref] refs/pull/2736/merge -> origin/pr/2736/merge
 * [new ref] refs/pull/2737/head -> origin/pr/2737/head
 * [new ref] refs/pull/2737/merge -> origin/pr/2737/merge
 * [new ref] refs/pull/2738/head -> origin/pr/2738/head
 * [new ref] refs/pull/2738/merge -> origin/pr/2738/merge
 * [new ref] refs/pull/2739/head -> origin/pr/2739/head
 * [new ref] refs/pull/2739/merge -> origin/pr/2739/merge
 * [new ref] refs/pull/2740/head -> origin/pr/2740/head
 * [new ref] refs/pull/2740/merge -> origin/pr/2740/merge
 * [new ref] refs/pull/2741/head -> origin/pr/2741/head
 * [new ref] refs/pull/2742/head -> origin/pr/2742/head
 * [new ref] refs/pull/2743/head -> origin/pr/2743/head
 * [new ref] refs/pull/2743/merge -> origin/pr/2743/merge
 * [new ref] refs/pull/2744/head -> origin/pr/2744/head
 * [new ref] refs/pull/2744/merge -> origin/pr/2744/merge
 * [new ref] refs/pull/2745/head -> origin/pr/2745/head
 * [new ref] refs/pull/2745/merge -> origin/pr/2745/merge
 * [new ref] refs/pull/2746/head -> origin/pr/2746/head
 * [new ref] refs/pull/2746/merge -> origin/pr/2746/merge
 * [new ref] refs/pull/2747/head -> origin/pr/2747/head
 * [new ref] refs/pull/2747/merge -> origin/pr/2747/merge
 * [new ref] refs/pull/2748/head -> origin/pr/2748/head
 * [new ref] refs/pull/2748/merge -> origin/pr/2748/merge
 * [new ref] refs/pull/2749/head -> origin/pr/2749/head
 * [new ref] refs/pull/2749/merge -> origin/pr/2749/merge
 * [new ref] refs/pull/2750/

Jenkins build is back to normal : beam_PostCommit_Java_ValidatesRunner_Flink #2549

2017-04-29 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PerformanceTests_JDBC #157

2017-04-29 Thread Apache Jenkins Server
See 


--
[...truncated 731.97 KB...]
at 
com.google.cloud.dataflow.worker.runners.worker.MapTaskExecutorFactory.create(MapTaskExecutorFactory.java:152)
at 
com.google.cloud.dataflow.worker.runners.worker.DataflowWorker.doWork(DataflowWorker.java:272)
at 
com.google.cloud.dataflow.worker.runners.worker.DataflowWorker.getAndPerformWork(DataflowWorker.java:244)
at 
com.google.cloud.dataflow.worker.runners.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:127)
at 
com.google.cloud.dataflow.worker.runners.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:107)
at 
com.google.cloud.dataflow.worker.runners.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:94)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
(74042ad042dd7e2): java.lang.RuntimeException: 
org.apache.beam.sdk.util.UserCodeException: org.postgresql.util.PSQLException: 
The connection attempt failed.
at 
com.google.cloud.dataflow.worker.runners.worker.MapTaskExecutorFactory$3.typedApply(MapTaskExecutorFactory.java:289)
at 
com.google.cloud.dataflow.worker.runners.worker.MapTaskExecutorFactory$3.typedApply(MapTaskExecutorFactory.java:261)
at 
com.google.cloud.dataflow.worker.graph.Networks$TypeSafeNodeFunction.apply(Networks.java:55)
at 
com.google.cloud.dataflow.worker.graph.Networks$TypeSafeNodeFunction.apply(Networks.java:43)
at 
com.google.cloud.dataflow.worker.graph.Networks.replaceDirectedNetworkNodes(Networks.java:78)
at 
com.google.cloud.dataflow.worker.runners.worker.MapTaskExecutorFactory.create(MapTaskExecutorFactory.java:152)
at 
com.google.cloud.dataflow.worker.runners.worker.DataflowWorker.doWork(DataflowWorker.java:272)
at 
com.google.cloud.dataflow.worker.runners.worker.DataflowWorker.getAndPerformWork(DataflowWorker.java:244)
at 
com.google.cloud.dataflow.worker.runners.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:127)
at 
com.google.cloud.dataflow.worker.runners.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:107)
at 
com.google.cloud.dataflow.worker.runners.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:94)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.beam.sdk.util.UserCodeException: 
org.postgresql.util.PSQLException: The connection attempt failed.
at 
org.apache.beam.sdk.util.UserCodeException.wrap(UserCodeException.java:36)
at 
org.apache.beam.sdk.io.jdbc.JdbcIO$Read$ReadFn$auxiliary$udFQ1SnH.invokeSetup(Unknown
 Source)
at 
com.google.cloud.dataflow.worker.runners.worker.DoFnInstanceManagers$ConcurrentQueueInstanceManager.deserializeCopy(DoFnInstanceManagers.java:66)
at 
com.google.cloud.dataflow.worker.runners.worker.DoFnInstanceManagers$ConcurrentQueueInstanceManager.peek(DoFnInstanceManagers.java:48)
at 
com.google.cloud.dataflow.worker.runners.worker.UserParDoFnFactory.create(UserParDoFnFactory.java:99)
at 
com.google.cloud.dataflow.worker.runners.worker.DefaultParDoFnFactory.create(DefaultParDoFnFactory.java:70)
at 
com.google.cloud.dataflow.worker.runners.worker.MapTaskExecutorFactory.createParDoOperation(MapTaskExecutorFactory.java:363)
at 
com.google.cloud.dataflow.worker.runners.worker.MapTaskExecutorFactory$3.typedApply(MapTaskExecutorFactory.java:278)
... 14 more
Caused by: org.postgresql.util.PSQLException: The connection attempt failed.
at 
org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:272)
at 
org.postgresql.core.ConnectionFactory.openConnection(ConnectionFactory.java:51)
at org.postgresql.jdbc.PgConnection.<init>(PgConnection.java:215)
at org.postgresql.Driver.makeConnection(Driver.java:404)
at org.postgresql.Driver.connect(Driver.java:272)
at java.sql.DriverManager.getConnection(DriverManager.java:664)
at java.sql.DriverManager.getConnection(DriverManager.java:247)
at 
org.postgresql.ds.common.BaseDataSource.getConnection(BaseDataSource.java:86)
at 
org.postgresql.ds.common.BaseDataSource.getConnection(BaseDataSource.java:71)
at org.apache.be
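
The failing performance test exercises JdbcIO's connection setup: the PSQLException is raised while the DoFn's setup opens the database connection. For reference, a minimal sketch of the kind of JdbcIO read involved (hypothetical connection details and query, Beam 2.x-era API; not the actual test code):

```java
import java.sql.ResultSet;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.coders.KvCoder;
import org.apache.beam.sdk.coders.StringUtf8Coder;
import org.apache.beam.sdk.coders.VarIntCoder;
import org.apache.beam.sdk.io.jdbc.JdbcIO;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.PCollection;

class JdbcReadExample {
  static PCollection<KV<Integer, String>> readRows(Pipeline pipeline) {
    return pipeline.apply(
        JdbcIO.<KV<Integer, String>>read()
            // The connection is opened during setup; an unreachable host or bad
            // credentials here surfaces as the PSQLException seen in the log.
            .withDataSourceConfiguration(
                JdbcIO.DataSourceConfiguration.create(
                        "org.postgresql.Driver",
                        "jdbc:postgresql://db-host:5432/beamdb")  // hypothetical URL
                    .withUsername("user")
                    .withPassword("password"))
            .withQuery("select id, name from test_table")  // hypothetical query
            .withCoder(KvCoder.of(VarIntCoder.of(), StringUtf8Coder.of()))
            .withRowMapper(
                (JdbcIO.RowMapper<KV<Integer, String>>)
                    (ResultSet rs) -> KV.of(rs.getInt(1), rs.getString(2))));
  }
}
```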

Build failed in Jenkins: beam_PostCommit_Java_MavenInstall #3526

2017-04-29 Thread Apache Jenkins Server
See 


--
[...truncated 1.82 MB...]
2017-04-29T12:25:15.672 [INFO] Excluding 
org.apache.beam:beam-sdks-java-extensions-gcp-core:jar:0.7.0-SNAPSHOT from the 
shaded jar.
2017-04-29T12:25:15.672 [INFO] Excluding 
com.google.http-client:google-http-client-jackson2:jar:1.22.0 from the shaded 
jar.
2017-04-29T12:25:15.672 [INFO] Excluding 
com.google.auth:google-auth-library-oauth2-http:jar:0.6.1 from the shaded jar.
2017-04-29T12:25:15.672 [INFO] Excluding 
com.google.api-client:google-api-client:jar:1.22.0 from the shaded jar.
2017-04-29T12:25:15.672 [INFO] Excluding 
com.google.oauth-client:google-oauth-client:jar:1.22.0 from the shaded jar.
2017-04-29T12:25:15.672 [INFO] Excluding 
com.google.cloud.bigdataoss:gcsio:jar:1.4.5 from the shaded jar.
2017-04-29T12:25:15.672 [INFO] Excluding 
com.google.api-client:google-api-client-java6:jar:1.22.0 from the shaded jar.
2017-04-29T12:25:15.672 [INFO] Excluding 
com.google.api-client:google-api-client-jackson2:jar:1.22.0 from the shaded jar.
2017-04-29T12:25:15.672 [INFO] Excluding 
com.google.oauth-client:google-oauth-client-java6:jar:1.22.0 from the shaded 
jar.
2017-04-29T12:25:15.672 [INFO] Excluding 
com.google.cloud.bigdataoss:util:jar:1.4.5 from the shaded jar.
2017-04-29T12:25:15.672 [INFO] Excluding 
com.google.apis:google-api-services-cloudresourcemanager:jar:v1-rev6-1.22.0 
from the shaded jar.
2017-04-29T12:25:15.672 [INFO] Excluding 
com.google.apis:google-api-services-storage:jar:v1-rev71-1.22.0 from the shaded 
jar.
2017-04-29T12:25:15.672 [INFO] Excluding 
com.google.auth:google-auth-library-credentials:jar:0.6.1 from the shaded jar.
2017-04-29T12:25:15.672 [INFO] Excluding 
org.apache.beam:beam-runners-core-java:jar:0.7.0-SNAPSHOT from the shaded jar.
2017-04-29T12:25:15.672 [INFO] Excluding 
org.apache.beam:beam-sdks-common-runner-api:jar:0.7.0-SNAPSHOT from the shaded 
jar.
2017-04-29T12:25:15.672 [INFO] Excluding 
org.apache.beam:beam-runners-core-construction-java:jar:0.7.0-SNAPSHOT from the 
shaded jar.
2017-04-29T12:25:15.672 [INFO] Excluding 
org.apache.beam:beam-runners-google-cloud-dataflow-java:jar:0.7.0-SNAPSHOT from 
the shaded jar.
2017-04-29T12:25:15.672 [INFO] Excluding 
org.apache.beam:beam-sdks-java-io-google-cloud-platform:jar:0.7.0-SNAPSHOT from 
the shaded jar.
2017-04-29T12:25:15.672 [INFO] Excluding 
org.apache.beam:beam-sdks-java-extensions-protobuf:jar:0.7.0-SNAPSHOT from the 
shaded jar.
2017-04-29T12:25:15.672 [INFO] Excluding 
com.google.apis:google-api-services-bigquery:jar:v2-rev295-1.22.0 from the 
shaded jar.
2017-04-29T12:25:15.672 [INFO] Excluding 
com.google.apis:google-api-services-pubsub:jar:v1-rev10-1.22.0 from the shaded 
jar.
2017-04-29T12:25:15.672 [INFO] Excluding 
com.google.api.grpc:grpc-google-pubsub-v1:jar:0.1.0 from the shaded jar.
2017-04-29T12:25:15.672 [INFO] Excluding 
com.google.api.grpc:grpc-google-iam-v1:jar:0.1.0 from the shaded jar.
2017-04-29T12:25:15.672 [INFO] Excluding 
com.google.cloud.datastore:datastore-v1-proto-client:jar:1.4.0 from the shaded 
jar.
2017-04-29T12:25:15.672 [INFO] Excluding 
com.google.http-client:google-http-client-jackson:jar:1.22.0 from the shaded 
jar.
2017-04-29T12:25:15.672 [INFO] Excluding 
com.google.cloud.datastore:datastore-v1-protos:jar:1.3.0 from the shaded jar.
2017-04-29T12:25:15.672 [INFO] Excluding io.grpc:grpc-auth:jar:1.2.0 from the 
shaded jar.
2017-04-29T12:25:15.672 [INFO] Excluding io.netty:netty-handler:jar:4.1.8.Final 
from the shaded jar.
2017-04-29T12:25:15.672 [INFO] Excluding io.netty:netty-codec:jar:4.1.8.Final 
from the shaded jar.
2017-04-29T12:25:15.672 [INFO] Excluding io.grpc:grpc-all:jar:1.2.0 from the 
shaded jar.
2017-04-29T12:25:15.672 [INFO] Excluding io.grpc:grpc-okhttp:jar:1.2.0 from the 
shaded jar.
2017-04-29T12:25:15.672 [INFO] Excluding com.squareup.okhttp:okhttp:jar:2.5.0 
from the shaded jar.
2017-04-29T12:25:15.672 [INFO] Excluding com.squareup.okio:okio:jar:1.6.0 from 
the shaded jar.
2017-04-29T12:25:15.672 [INFO] Excluding io.grpc:grpc-protobuf-nano:jar:1.2.0 
from the shaded jar.
2017-04-29T12:25:15.672 [INFO] Excluding 
com.google.protobuf.nano:protobuf-javanano:jar:3.0.0-alpha-5 from the shaded 
jar.
2017-04-29T12:25:15.672 [INFO] Excluding 
com.google.cloud.bigtable:bigtable-protos:jar:0.9.6 from the shaded jar.
2017-04-29T12:25:15.672 [INFO] Excluding 
com.google.cloud.bigtable:bigtable-client-core:jar:0.9.6 from the shaded jar.
2017-04-29T12:25:15.672 [INFO] Excluding 
com.google.auth:google-auth-library-appengine:jar:0.6.0 from the shaded jar.
2017-04-29T12:25:15.672 [INFO] Excluding 
com.google.appengine:appengine-api-1.0-sdk:jar:1.9.34 from the shaded jar.
2017-04-29T12:25:15.672 [INFO] Excluding 
io.dropwizard.metrics:metrics-core:jar:3.1.2 from the shaded jar.
2017-04-29T12:25:15.672 [INFO] Excluding 
com.google.api.grpc:grpc-google-common-protos:jar:0.1.0 from the shaded jar.
2017-04

Jenkins build is back to normal : beam_PostCommit_Java_ValidatesRunner_Apex #1268

2017-04-29 Thread Apache Jenkins Server
See 




[1/2] beam git commit: Fix broken build

2017-04-29 Thread jbonofre
Repository: beam
Updated Branches:
  refs/heads/master 2b6cb8ca1 -> bac06331e


Fix broken build


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/2384e46b
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/2384e46b
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/2384e46b

Branch: refs/heads/master
Commit: 2384e46be0f49ffa9e5c94ec929b344a46a72af6
Parents: 2b6cb8c
Author: Aviem Zur 
Authored: Sat Apr 29 14:08:06 2017 +0300
Committer: Aviem Zur 
Committed: Sat Apr 29 14:08:06 2017 +0300

--
 sdks/java/harness/pom.xml | 10 ++
 1 file changed, 10 insertions(+)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/2384e46b/sdks/java/harness/pom.xml
--
diff --git a/sdks/java/harness/pom.xml b/sdks/java/harness/pom.xml
index 76ceb3d..5cff5cc 100644
--- a/sdks/java/harness/pom.xml
+++ b/sdks/java/harness/pom.xml
@@ -65,6 +65,16 @@
 
     <dependency>
       <groupId>org.apache.beam</groupId>
+      <artifactId>beam-runners-core-construction-java</artifactId>
+    </dependency>
+
+    <dependency>
+      <groupId>org.apache.beam</groupId>
+      <artifactId>beam-sdks-common-runner-api</artifactId>
+    </dependency>
+
+    <dependency>
+      <groupId>org.apache.beam</groupId>
       <artifactId>beam-sdks-java-core</artifactId>
       <version>${project.version}</version>
       <classifier>tests</classifier>



[GitHub] beam pull request #2781: Fix broken build

2017-04-29 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/beam/pull/2781




[2/2] beam git commit: This closes #2781

2017-04-29 Thread jbonofre
This closes #2781


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/bac06331
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/bac06331
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/bac06331

Branch: refs/heads/master
Commit: bac06331e771b786afb1704647e57e53b0c3bdb0
Parents: 2b6cb8c 2384e46
Author: Jean-Baptiste Onofré 
Authored: Sat Apr 29 16:37:16 2017 +0200
Committer: Jean-Baptiste Onofré 
Committed: Sat Apr 29 16:37:16 2017 +0200

--
 sdks/java/harness/pom.xml | 10 ++
 1 file changed, 10 insertions(+)
--




[1/2] beam git commit: [BEAM-2057] Add a test for metrics reporting in Spark runner.

2017-04-29 Thread aviemzur
Repository: beam
Updated Branches:
  refs/heads/master bac06331e -> 81474aeaf


[BEAM-2057] Add a test for metrics reporting in Spark runner.


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/59117737
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/59117737
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/59117737

Branch: refs/heads/master
Commit: 59117737619ba90345761ae0aefcf361eabf3772
Parents: bac0633
Author: Holden Karau 
Authored: Wed Apr 26 22:22:49 2017 -0700
Committer: Aviem Zur 
Committed: Sat Apr 29 17:57:42 2017 +0300

--
 .../metrics/sink/SparkMetricsSinkTest.java  | 86 
 1 file changed, 86 insertions(+)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/59117737/runners/spark/src/test/java/org/apache/beam/runners/spark/aggregators/metrics/sink/SparkMetricsSinkTest.java
--
diff --git 
a/runners/spark/src/test/java/org/apache/beam/runners/spark/aggregators/metrics/sink/SparkMetricsSinkTest.java
 
b/runners/spark/src/test/java/org/apache/beam/runners/spark/aggregators/metrics/sink/SparkMetricsSinkTest.java
new file mode 100644
index 000..b0ad972
--- /dev/null
+++ 
b/runners/spark/src/test/java/org/apache/beam/runners/spark/aggregators/metrics/sink/SparkMetricsSinkTest.java
@@ -0,0 +1,86 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.beam.runners.spark.aggregators.metrics.sink;
+
+import static org.hamcrest.Matchers.is;
+import static org.hamcrest.Matchers.nullValue;
+import static org.junit.Assert.assertThat;
+
+import com.google.common.collect.ImmutableSet;
+import java.util.Arrays;
+import java.util.List;
+import java.util.Set;
+import org.apache.beam.runners.spark.PipelineRule;
+import org.apache.beam.runners.spark.examples.WordCount;
+import org.apache.beam.sdk.Pipeline;
+import org.apache.beam.sdk.coders.StringUtf8Coder;
+import org.apache.beam.sdk.testing.PAssert;
+import org.apache.beam.sdk.transforms.Create;
+import org.apache.beam.sdk.transforms.MapElements;
+import org.apache.beam.sdk.values.PCollection;
+import org.junit.Rule;
+import org.junit.Test;
+import org.junit.rules.ExternalResource;
+
+
+/**
+ * A test that verifies Beam metrics are reported to Spark's metrics sink.
+ */
+public class SparkMetricsSinkTest {
+
+  @Rule
+  public ExternalResource inMemoryMetricsSink = new InMemoryMetricsSinkRule();
+
+  @Rule
+  public final PipelineRule pipelineRule = PipelineRule.batch();
+
+  private Pipeline createSparkPipeline() {
+    pipelineRule.getOptions().setEnableSparkMetricSinks(true);
+    return pipelineRule.createPipeline();
+  }
+
+  private void runPipeline() {
+    final List<String> words =
+        Arrays.asList("hi there", "hi", "hi sue bob", "hi sue", "", "bob hi");
+
+    final Set<String> expectedCounts =
+        ImmutableSet.of("hi: 5", "there: 1", "sue: 2", "bob: 2");
+
+    final Pipeline pipeline = createSparkPipeline();
+
+    final PCollection<String> output =
+        pipeline
+            .apply(Create.of(words).withCoder(StringUtf8Coder.of()))
+            .apply(new WordCount.CountWords())
+            .apply(MapElements.via(new WordCount.FormatAsTextFn()));
+
+    PAssert.that(output).containsInAnyOrder(expectedCounts);
+
+    pipeline.run();
+  }
+
+  @Test
+  public void testNamedMetric() throws Exception {
+    assertThat(InMemoryMetrics.valueOf("emptyLines"), is(nullValue()));
+
+    runPipeline();
+
+    assertThat(InMemoryMetrics.valueOf("emptyLines"), is(1d));
+  }
+}



[2/2] beam git commit: This closes #2730

2017-04-29 Thread aviemzur
This closes #2730


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/81474aea
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/81474aea
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/81474aea

Branch: refs/heads/master
Commit: 81474aeafeb4dbc16a48b62114dc6f348eb5f426
Parents: bac0633 5911773
Author: Aviem Zur 
Authored: Sat Apr 29 17:58:10 2017 +0300
Committer: Aviem Zur 
Committed: Sat Apr 29 17:58:10 2017 +0300

--
 .../metrics/sink/SparkMetricsSinkTest.java  | 86 
 1 file changed, 86 insertions(+)
--




[jira] [Commented] (BEAM-2057) Test metrics are reported to Spark Metrics sink.

2017-04-29 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-2057?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15989903#comment-15989903
 ] 

ASF GitHub Bot commented on BEAM-2057:
--

Github user asfgit closed the pull request at:

https://github.com/apache/beam/pull/2730


> Test metrics are reported to Spark Metrics sink.
> 
>
> Key: BEAM-2057
> URL: https://issues.apache.org/jira/browse/BEAM-2057
> Project: Beam
>  Issue Type: Test
>  Components: runner-spark
>Reporter: Aviem Zur
>Assignee: Holden Karau
>  Labels: newbie, starter
>
> Test that metrics are reported to Spark's metric sink.
> Use {{InMemoryMetrics}} and {{InMemoryMetricsSinkRule}} similarly to the 
> {{NamedAggregatorsTest}} which tests that aggregators are reported to Spark's 
> metrics sink (Aggregators are being removed so this test should be in a 
> separate class).
> For an example on how to create a pipeline with metrics take a look at 
> {{MetricsTest}}.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
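The test pattern the BEAM-2057 description asks for boils down to: query a named metric before the pipeline runs (expect null), run the pipeline, then query it again through the in-memory sink. A minimal standalone stand-in for that sink, assuming a hypothetical static registry (this is not the real InMemoryMetrics API, just the shape of the pattern):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class InMemoryMetricsSketch {
    // Static registry standing in for the sink's backing store.
    private static final Map<String, Double> METRICS = new ConcurrentHashMap<>();

    // What the metrics sink would do on each report cycle.
    public static void report(String name, double value) {
        METRICS.put(name, value);
    }

    // What the test queries; null means the metric was never reported.
    public static Double valueOf(String name) {
        return METRICS.get(name);
    }

    public static void main(String[] args) {
        if (valueOf("emptyLines") != null) {
            throw new AssertionError("expected null before the pipeline runs");
        }
        report("emptyLines", 1d); // the "pipeline run" reporting its counter
        if (valueOf("emptyLines") != 1d) {
            throw new AssertionError("expected 1.0 after the pipeline runs");
        }
        System.out.println("ok");
    }
}
```

The real test differs only in that the reporting side is Spark's metrics sink rather than a direct call.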


[GitHub] beam pull request #2730: [BEAM-2057] Add a test for metrics reporting in Spa...

2017-04-29 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/beam/pull/2730


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[1/2] beam git commit: [BEAM-1398] KafkaIO metrics.

2017-04-29 Thread aviemzur
Repository: beam
Updated Branches:
  refs/heads/master 81474aeaf -> 47821ad69


[BEAM-1398] KafkaIO metrics.


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/930c27f5
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/930c27f5
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/930c27f5

Branch: refs/heads/master
Commit: 930c27f55fc980702089fe58fdb0edded96a2ac6
Parents: 81474ae
Author: Aviem Zur 
Authored: Tue Mar 28 07:29:53 2017 +0300
Committer: Aviem Zur 
Committed: Sat Apr 29 18:08:19 2017 +0300

--
 .../org/apache/beam/sdk/io/kafka/KafkaIO.java   |  65 +-
 .../apache/beam/sdk/io/kafka/KafkaIOTest.java   | 130 +++
 2 files changed, 194 insertions(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/930c27f5/sdks/java/io/kafka/src/main/java/org/apache/beam/sdk/io/kafka/KafkaIO.java
--
diff --git 
a/sdks/java/io/kafka/src/main/java/org/apache/beam/sdk/io/kafka/KafkaIO.java 
b/sdks/java/io/kafka/src/main/java/org/apache/beam/sdk/io/kafka/KafkaIO.java
index 47d8281..211f1a4 100644
--- a/sdks/java/io/kafka/src/main/java/org/apache/beam/sdk/io/kafka/KafkaIO.java
+++ b/sdks/java/io/kafka/src/main/java/org/apache/beam/sdk/io/kafka/KafkaIO.java
@@ -69,6 +69,10 @@ import 
org.apache.beam.sdk.io.UnboundedSource.UnboundedReader;
 import org.apache.beam.sdk.io.kafka.KafkaCheckpointMark.PartitionMark;
 import org.apache.beam.sdk.io.kafka.serialization.CoderBasedKafkaDeserializer;
 import org.apache.beam.sdk.io.kafka.serialization.CoderBasedKafkaSerializer;
+import org.apache.beam.sdk.metrics.Counter;
+import org.apache.beam.sdk.metrics.Gauge;
+import org.apache.beam.sdk.metrics.SinkMetrics;
+import org.apache.beam.sdk.metrics.SourceMetrics;
 import org.apache.beam.sdk.options.PipelineOptions;
 import org.apache.beam.sdk.options.ValueProvider;
 import org.apache.beam.sdk.transforms.DoFn;
@@ -950,6 +954,13 @@ public class KafkaIO {
    private Deserializer<K> keyDeserializerInstance = null;
    private Deserializer<V> valueDeserializerInstance = null;
 
+private final Counter elementsRead = SourceMetrics.elementsRead();
+private final Counter bytesRead = SourceMetrics.bytesRead();
+private final Counter elementsReadBySplit;
+private final Counter bytesReadBySplit;
+private final Gauge backlogBytesOfSplit;
+private final Gauge backlogElementsOfSplit;
+
 private static final Duration KAFKA_POLL_TIMEOUT = Duration.millis(1000);
 private static final Duration NEW_RECORDS_POLL_TIMEOUT = 
Duration.millis(10);
 
@@ -1023,10 +1034,18 @@ public class KafkaIO {
 
   synchronized long approxBacklogInBytes() {
    // Note that this is an estimate of uncompressed backlog.
+    long backlogMessageCount = backlogMessageCount();
+    if (backlogMessageCount == UnboundedReader.BACKLOG_UNKNOWN) {
+      return UnboundedReader.BACKLOG_UNKNOWN;
+    }
+    return (long) (backlogMessageCount * avgRecordSize);
+  }
+
+  synchronized long backlogMessageCount() {
    if (latestOffset < 0 || nextOffset < 0) {
      return UnboundedReader.BACKLOG_UNKNOWN;
    }
-    return Math.max(0, (long) ((latestOffset - nextOffset) * avgRecordSize));
+    return Math.max(0, (latestOffset - nextOffset));
  }
}
 
@@ -1065,6 +1084,13 @@ public class KafkaIO {
   partitionStates.get(i).nextOffset = ckptMark.getNextOffset();
 }
   }
+
+  String splitId = String.valueOf(source.id);
+
+  elementsReadBySplit = SourceMetrics.elementsReadBySplit(splitId);
+  bytesReadBySplit = SourceMetrics.bytesReadBySplit(splitId);
+  backlogBytesOfSplit = SourceMetrics.backlogBytesOfSplit(splitId);
+  backlogElementsOfSplit = SourceMetrics.backlogElementsOfSplit(splitId);
 }
 
 private void consumerPollLoop() {
@@ -1194,6 +1220,9 @@ public class KafkaIO {
 if (curBatch.hasNext()) {
   PartitionState pState = curBatch.next();
 
+  elementsRead.inc();
+  elementsReadBySplit.inc();
+
   if (!pState.recordIter.hasNext()) { // -- (c)
 pState.recordIter = Collections.emptyIterator(); // drop ref
 curBatch.remove();
@@ -1241,6 +1270,8 @@ public class KafkaIO {
   int recordSize = (rawRecord.key() == null ? 0 : 
rawRecord.key().length)
   + (rawRecord.value() == null ? 0 : rawRecord.value().length);
   pState.recordConsumed(offset, recordSize);
+  bytesRead.inc(recordSize);
+  bytesReadBySplit.inc(recordSize);
   return true;
 
 } else { // -- (b)
@@ -1278,6 +1309,19 @@ public class KafkaIO {
   LOG.debug("{}:  backlog {}", this, getSplitBacklogBytes());
 }
 
+private void reportBacklog() {
+ 
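The hunk above splits KafkaIO's backlog estimate into two steps so the message count can feed a metric of its own before being scaled into bytes. A standalone sketch of that arithmetic (class and method names are illustrative, not the actual KafkaIO code), including the propagation of the "unknown" sentinel so it is never multiplied by the average record size:

```java
public class BacklogEstimate {
    // Sentinel mirroring UnboundedReader.BACKLOG_UNKNOWN.
    public static final long BACKLOG_UNKNOWN = -1L;

    // Step 1: backlog in messages; unknown until both offsets are known.
    public static long backlogMessageCount(long latestOffset, long nextOffset) {
        if (latestOffset < 0 || nextOffset < 0) {
            return BACKLOG_UNKNOWN;
        }
        return Math.max(0, latestOffset - nextOffset);
    }

    // Step 2: backlog in bytes, scaling the count by the average record size.
    public static long approxBacklogInBytes(long latestOffset, long nextOffset,
                                            double avgRecordSize) {
        long count = backlogMessageCount(latestOffset, nextOffset);
        if (count == BACKLOG_UNKNOWN) {
            return BACKLOG_UNKNOWN; // never multiply the sentinel
        }
        return (long) (count * avgRecordSize);
    }

    public static void main(String[] args) {
        System.out.println(approxBacklogInBytes(100, 40, 10.0)); // 600
        System.out.println(backlogMessageCount(-1, 40));         // -1 (unknown)
    }
}
```

The early return is the point of the refactor: the pre-patch code multiplied the offset difference directly, so an unknown backlog could leak out as a bogus byte count.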

[2/2] beam git commit: This closes #2344

2017-04-29 Thread aviemzur
This closes #2344


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/47821ad6
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/47821ad6
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/47821ad6

Branch: refs/heads/master
Commit: 47821ad695f67977c775f62b6f8791ca109a7d0b
Parents: 81474ae 930c27f
Author: Aviem Zur 
Authored: Sat Apr 29 18:16:17 2017 +0300
Committer: Aviem Zur 
Committed: Sat Apr 29 18:16:17 2017 +0300

--
 .../org/apache/beam/sdk/io/kafka/KafkaIO.java   |  65 +-
 .../apache/beam/sdk/io/kafka/KafkaIOTest.java   | 130 +++
 2 files changed, 194 insertions(+), 1 deletion(-)
--




[jira] [Commented] (BEAM-1398) KafkaIO metrics

2017-04-29 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-1398?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15989911#comment-15989911
 ] 

ASF GitHub Bot commented on BEAM-1398:
--

Github user asfgit closed the pull request at:

https://github.com/apache/beam/pull/2344


> KafkaIO metrics
> ---
>
> Key: BEAM-1398
> URL: https://issues.apache.org/jira/browse/BEAM-1398
> Project: Beam
>  Issue Type: New Feature
>  Components: sdk-java-extensions
>Reporter: Aviem Zur
>Assignee: Aviem Zur
>
> Add metrics to {{KafkaIO}} using the metrics API.
> Metrics (Feel free to add more ideas here) per split (Where applicable):
> * Backlog in bytes.
> * Backlog in number of messages.
> * Messages consumed.
> * Bytes consumed.
> * Messages produced.
> Add {{NeedsRunner}} test which creates a pipeline and asserts metrics values.



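The BEAM-1398 description lists metrics both globally and per split, which is exactly the double-increment pattern in the patch (elementsRead plus elementsReadBySplit, and so on). A self-contained sketch of that pattern with an in-memory counter map (names and classes are hypothetical, not Beam's Metrics API):

```java
import java.util.HashMap;
import java.util.Map;

public class PerSplitCounters {
    private final Map<String, Long> counters = new HashMap<>();

    public void inc(String name, long delta) {
        counters.merge(name, delta, Long::sum);
    }

    // Mirrors incrementing SourceMetrics.elementsRead() and
    // SourceMetrics.elementsReadBySplit(splitId) together per record.
    public void recordElementRead(String splitId) {
        inc("elementsRead", 1);
        inc("elementsRead.split." + splitId, 1);
    }

    public long get(String name) {
        return counters.getOrDefault(name, 0L);
    }

    public static void main(String[] args) {
        PerSplitCounters c = new PerSplitCounters();
        c.recordElementRead("0");
        c.recordElementRead("0");
        c.recordElementRead("1");
        System.out.println(c.get("elementsRead"));         // 3
        System.out.println(c.get("elementsRead.split.0")); // 2
    }
}
```

Keeping the split id in the metric name is what lets a runner report backlog and throughput per split while the global counter still aggregates across all of them.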


[GitHub] beam pull request #2344: [BEAM-1398] KafkaIO metrics.

2017-04-29 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/beam/pull/2344




[jira] [Updated] (BEAM-1398) KafkaIO metrics

2017-04-29 Thread Aviem Zur (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-1398?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Aviem Zur updated BEAM-1398:

Fix Version/s: First stable release

> KafkaIO metrics
> ---
>
> Key: BEAM-1398
> URL: https://issues.apache.org/jira/browse/BEAM-1398
> Project: Beam
>  Issue Type: New Feature
>  Components: sdk-java-extensions
>Reporter: Aviem Zur
>Assignee: Aviem Zur
> Fix For: First stable release
>
>
> Add metrics to {{KafkaIO}} using the metrics API.
> Metrics (Feel free to add more ideas here) per split (Where applicable):
> * Backlog in bytes.
> * Backlog in number of messages.
> * Messages consumed.
> * Bytes consumed.
> * Messages produced.
> Add {{NeedsRunner}} test which creates a pipeline and asserts metrics values.





[jira] [Resolved] (BEAM-2057) Test metrics are reported to Spark Metrics sink.

2017-04-29 Thread Aviem Zur (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-2057?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Aviem Zur resolved BEAM-2057.
-
Resolution: Done

> Test metrics are reported to Spark Metrics sink.
> 
>
> Key: BEAM-2057
> URL: https://issues.apache.org/jira/browse/BEAM-2057
> Project: Beam
>  Issue Type: Test
>  Components: runner-spark
>Reporter: Aviem Zur
>Assignee: Holden Karau
>  Labels: newbie, starter
> Fix For: First stable release
>
>
> Test that metrics are reported to Spark's metric sink.
> Use {{InMemoryMetrics}} and {{InMemoryMetricsSinkRule}} similarly to the 
> {{NamedAggregatorsTest}} which tests that aggregators are reported to Spark's 
> metrics sink (Aggregators are being removed so this test should be in a 
> separate class).
> For an example on how to create a pipeline with metrics take a look at 
> {{MetricsTest}}.





[jira] [Updated] (BEAM-2057) Test metrics are reported to Spark Metrics sink.

2017-04-29 Thread Aviem Zur (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-2057?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Aviem Zur updated BEAM-2057:

Fix Version/s: First stable release

> Test metrics are reported to Spark Metrics sink.
> 
>
> Key: BEAM-2057
> URL: https://issues.apache.org/jira/browse/BEAM-2057
> Project: Beam
>  Issue Type: Test
>  Components: runner-spark
>Reporter: Aviem Zur
>Assignee: Holden Karau
>  Labels: newbie, starter
> Fix For: First stable release
>
>
> Test that metrics are reported to Spark's metric sink.
> Use {{InMemoryMetrics}} and {{InMemoryMetricsSinkRule}} similarly to the 
> {{NamedAggregatorsTest}} which tests that aggregators are reported to Spark's 
> metrics sink (Aggregators are being removed so this test should be in a 
> separate class).
> For an example on how to create a pipeline with metrics take a look at 
> {{MetricsTest}}.





[jira] [Resolved] (BEAM-1398) KafkaIO metrics

2017-04-29 Thread Aviem Zur (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-1398?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Aviem Zur resolved BEAM-1398.
-
Resolution: Implemented

> KafkaIO metrics
> ---
>
> Key: BEAM-1398
> URL: https://issues.apache.org/jira/browse/BEAM-1398
> Project: Beam
>  Issue Type: New Feature
>  Components: sdk-java-extensions
>Reporter: Aviem Zur
>Assignee: Aviem Zur
> Fix For: First stable release
>
>
> Add metrics to {{KafkaIO}} using the metrics API.
> Metrics (Feel free to add more ideas here) per split (Where applicable):
> * Backlog in bytes.
> * Backlog in number of messages.
> * Messages consumed.
> * Bytes consumed.
> * Messages produced.
> Add {{NeedsRunner}} test which creates a pipeline and asserts metrics values.





Jenkins build is back to normal : beam_PostCommit_Java_MavenInstall #3527

2017-04-29 Thread Apache Jenkins Server
See 




[GitHub] beam pull request #2782: [BEAM-2022] fix triggering for processing time time...

2017-04-29 Thread tweise
GitHub user tweise opened a pull request:

https://github.com/apache/beam/pull/2782

[BEAM-2022] fix triggering for processing time timers

Be sure to do all of the following to help us incorporate your contribution
quickly and easily:

 - [ ] Make sure the PR title is formatted like:
   `[BEAM-<Jira issue #>] Description of pull request`
 - [ ] Make sure tests pass via `mvn clean verify`. (Even better, enable
   Travis-CI on your fork and ensure the whole test matrix passes).
 - [ ] Replace `<Jira issue #>` in the title with the actual Jira issue
   number, if there is one.
 - [ ] If this contribution is large, please file an Apache
   [Individual Contributor License 
Agreement](https://www.apache.org/licenses/icla.pdf).

---
R: @kennknowles 

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/tweise/beam BEAM-2022-processingTimeTimers

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/2782.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2782


commit eb860388a8626837655e82171c8480421384e419
Author: Thomas Weise 
Date:   2017-04-29T08:17:22Z

BEAM-2022 fix triggering for processing time timers






[jira] [Commented] (BEAM-2022) ApexTimerInternals seems to treat processing time timers as event time timers

2017-04-29 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-2022?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15989919#comment-15989919
 ] 

ASF GitHub Bot commented on BEAM-2022:
--

GitHub user tweise opened a pull request:

https://github.com/apache/beam/pull/2782

[BEAM-2022] fix triggering for processing time timers

Be sure to do all of the following to help us incorporate your contribution
quickly and easily:

 - [ ] Make sure the PR title is formatted like:
   `[BEAM-<Jira issue #>] Description of pull request`
 - [ ] Make sure tests pass via `mvn clean verify`. (Even better, enable
   Travis-CI on your fork and ensure the whole test matrix passes).
 - [ ] Replace `<Jira issue #>` in the title with the actual Jira issue
   number, if there is one.
 - [ ] If this contribution is large, please file an Apache
   [Individual Contributor License 
Agreement](https://www.apache.org/licenses/icla.pdf).

---
R: @kennknowles 

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/tweise/beam BEAM-2022-processingTimeTimers

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/2782.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2782


commit eb860388a8626837655e82171c8480421384e419
Author: Thomas Weise 
Date:   2017-04-29T08:17:22Z

BEAM-2022 fix triggering for processing time timers




> ApexTimerInternals seems to treat processing time timers as event time timers
> -
>
> Key: BEAM-2022
> URL: https://issues.apache.org/jira/browse/BEAM-2022
> Project: Beam
>  Issue Type: Bug
>  Components: runner-apex
>Reporter: Kenneth Knowles
>Assignee: Thomas Weise
> Fix For: First stable release
>
>
> I first noticed that {{currentProcessingTime()}} was using {{Instant.now()}}, 
> which has some bad issues in a distributed setting. But it seemed on 
> inspection that processing time timers are simply treated as event time. 
> Perhaps I am reading the code wrong?





Jenkins build became unstable: beam_PostCommit_Java_MavenInstall #3528

2017-04-29 Thread Apache Jenkins Server
See 




[2/2] beam git commit: [BEAM-1871] Hide internal implementation details of how we create a DefaultBucket for GCP Temp Location

2017-04-29 Thread lcwik
[BEAM-1871] Hide internal implementation details of how we create a 
DefaultBucket for GCP Temp Location

This closes #2747


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/f29444bf
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/f29444bf
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/f29444bf

Branch: refs/heads/master
Commit: f29444bf80281807736d01fea7f4238660d1c9a9
Parents: 47821ad c1b35f4
Author: Lukasz Cwik 
Authored: Sat Apr 29 09:11:58 2017 -0700
Committer: Lukasz Cwik 
Committed: Sat Apr 29 09:11:58 2017 -0700

--
 .../options/CloudResourceManagerOptions.java|  14 -
 .../sdk/extensions/gcp/options/GcpOptions.java  | 124 ++-
 .../org/apache/beam/sdk/util/DefaultBucket.java | 105 --
 .../apache/beam/sdk/util/GcpProjectUtil.java| 106 --
 .../extensions/gcp/options/GcpOptionsTest.java  | 325 ---
 .../apache/beam/sdk/util/DefaultBucketTest.java | 112 ---
 .../beam/sdk/util/GcpProjectUtilTest.java   |  77 -
 7 files changed, 335 insertions(+), 528 deletions(-)
--




[GitHub] beam pull request #2747: [BEAM-1871] Hide internal implementation details of...

2017-04-29 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/beam/pull/2747




[1/2] beam git commit: [BEAM-1871] Hide internal implementation details of how we create a DefaultBucket for GCP Temp Location

2017-04-29 Thread lcwik
Repository: beam
Updated Branches:
  refs/heads/master 47821ad69 -> f29444bf8


[BEAM-1871] Hide internal implementation details of how we create a 
DefaultBucket for GCP Temp Location

Moved relevant contents of GcpProjectUtil and DefaultProject into 
GcpOptions.GcpTempLocation


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/c1b35f46
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/c1b35f46
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/c1b35f46

Branch: refs/heads/master
Commit: c1b35f46ddd321b29132606d3633d45ff134ff6c
Parents: 47821ad
Author: Luke Cwik 
Authored: Thu Apr 27 13:50:00 2017 -0700
Committer: Lukasz Cwik 
Committed: Sat Apr 29 09:06:12 2017 -0700

--
 .../options/CloudResourceManagerOptions.java|  14 -
 .../sdk/extensions/gcp/options/GcpOptions.java  | 124 ++-
 .../org/apache/beam/sdk/util/DefaultBucket.java | 105 --
 .../apache/beam/sdk/util/GcpProjectUtil.java| 106 --
 .../extensions/gcp/options/GcpOptionsTest.java  | 325 ---
 .../apache/beam/sdk/util/DefaultBucketTest.java | 112 ---
 .../beam/sdk/util/GcpProjectUtilTest.java   |  77 -
 7 files changed, 335 insertions(+), 528 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/c1b35f46/sdks/java/extensions/gcp-core/src/main/java/org/apache/beam/sdk/extensions/gcp/options/CloudResourceManagerOptions.java
--
diff --git 
a/sdks/java/extensions/gcp-core/src/main/java/org/apache/beam/sdk/extensions/gcp/options/CloudResourceManagerOptions.java
 
b/sdks/java/extensions/gcp-core/src/main/java/org/apache/beam/sdk/extensions/gcp/options/CloudResourceManagerOptions.java
index 68432cf..87557e5 100644
--- 
a/sdks/java/extensions/gcp-core/src/main/java/org/apache/beam/sdk/extensions/gcp/options/CloudResourceManagerOptions.java
+++ 
b/sdks/java/extensions/gcp-core/src/main/java/org/apache/beam/sdk/extensions/gcp/options/CloudResourceManagerOptions.java
@@ -17,14 +17,10 @@
  */
 package org.apache.beam.sdk.extensions.gcp.options;
 
-import com.fasterxml.jackson.annotation.JsonIgnore;
 import org.apache.beam.sdk.options.ApplicationNameOptions;
-import org.apache.beam.sdk.options.Default;
 import org.apache.beam.sdk.options.Description;
-import org.apache.beam.sdk.options.Hidden;
 import org.apache.beam.sdk.options.PipelineOptions;
 import org.apache.beam.sdk.options.StreamingOptions;
-import org.apache.beam.sdk.util.GcpProjectUtil;
 
 /**
  * Properties needed when using Google CloudResourceManager with the Apache 
Beam SDK.
@@ -33,14 +29,4 @@ import org.apache.beam.sdk.util.GcpProjectUtil;
 + "https://cloud.google.com/resource-manager/ for details on 
CloudResourceManager.")
 public interface CloudResourceManagerOptions extends ApplicationNameOptions, 
GcpOptions,
 PipelineOptions, StreamingOptions {
-  /**
-   * The GcpProjectUtil instance that should be used to communicate with 
Google Cloud Storage.
-   */
-  @JsonIgnore
-  @Description("The GcpProjectUtil instance that should be used to communicate"
-   + " with Google Cloud Resource Manager.")
-  @Default.InstanceFactory(GcpProjectUtil.GcpProjectUtilFactory.class)
-  @Hidden
-  GcpProjectUtil getGcpProjectUtil();
-  void setGcpProjectUtil(GcpProjectUtil value);
 }

http://git-wip-us.apache.org/repos/asf/beam/blob/c1b35f46/sdks/java/extensions/gcp-core/src/main/java/org/apache/beam/sdk/extensions/gcp/options/GcpOptions.java
--
diff --git 
a/sdks/java/extensions/gcp-core/src/main/java/org/apache/beam/sdk/extensions/gcp/options/GcpOptions.java
 
b/sdks/java/extensions/gcp-core/src/main/java/org/apache/beam/sdk/extensions/gcp/options/GcpOptions.java
index 09904b6..b2a83e9 100644
--- 
a/sdks/java/extensions/gcp-core/src/main/java/org/apache/beam/sdk/extensions/gcp/options/GcpOptions.java
+++ 
b/sdks/java/extensions/gcp-core/src/main/java/org/apache/beam/sdk/extensions/gcp/options/GcpOptions.java
@@ -17,15 +17,24 @@
  */
 package org.apache.beam.sdk.extensions.gcp.options;
 
+import static com.google.common.base.Preconditions.checkArgument;
 import static com.google.common.base.Strings.isNullOrEmpty;
 
 import com.fasterxml.jackson.annotation.JsonIgnore;
+import com.google.api.client.util.BackOff;
+import com.google.api.client.util.Sleeper;
+import com.google.api.services.cloudresourcemanager.CloudResourceManager;
+import com.google.api.services.cloudresourcemanager.model.Project;
+import com.google.api.services.storage.model.Bucket;
 import com.google.auth.Credentials;
+import com.google.cloud.hadoop.util.ResilientOperation;
+import com.google.cloud.hadoop.util.RetryDeterminer;
 import com.google.common.annotations.VisibleForTesting;
 import com.google.common.io.Files;
 imp

[jira] [Commented] (BEAM-1871) Thin Java SDK Core

2017-04-29 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-1871?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15989926#comment-15989926
 ] 

ASF GitHub Bot commented on BEAM-1871:
--

Github user asfgit closed the pull request at:

https://github.com/apache/beam/pull/2747


> Thin Java SDK Core
> --
>
> Key: BEAM-1871
> URL: https://issues.apache.org/jira/browse/BEAM-1871
> Project: Beam
>  Issue Type: Improvement
>  Components: sdk-java-core
>Reporter: Daniel Halperin
>Assignee: Luke Cwik
> Fix For: First stable release
>
>
> Before first stable release we need to thin out {{sdk-java-core}} module. 
> Some candidates for removal, but not a non-exhaustive list:
> {{sdk/io}}
> * anything BigQuery related
> * anything PubSub related
> * everything Protobuf related
> * TFRecordIO
> * XMLSink
> {{sdk/util}}
> * Everything GCS related
> * Everything Backoff related
> * Everything Google API related: ResponseInterceptors, RetryHttpBackoff, etc.
> * Everything CloudObject-related
> * Pubsub stuff
> {{sdk/coders}}
> * JAXBCoder
> * TableRowJsoNCoder



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


Jenkins build is back to stable : beam_PostCommit_Java_MavenInstall #3529

2017-04-29 Thread Apache Jenkins Server
See 




[1/2] beam git commit: [BEAM-2098] Walkthrough URL in example code Javadoc is 404 not found

2017-04-29 Thread lcwik
Repository: beam
Updated Branches:
  refs/heads/master f29444bf8 -> be883666f


[BEAM-2098] Walkthrough URL in example code Javadoc is 404 not found


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/15802de1
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/15802de1
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/15802de1

Branch: refs/heads/master
Commit: 15802de1708df27d30f30c8fb6f1606fadb1ac26
Parents: f29444b
Author: Peter Gergo Barna 
Authored: Thu Apr 27 18:30:56 2017 +0200
Committer: Lukasz Cwik 
Committed: Sat Apr 29 09:47:47 2017 -0700

--
 .../java/src/main/java/org/apache/beam/examples/WordCount.java   | 4 ++--
 .../java/org/apache/beam/examples/cookbook/TriggerExample.java   | 4 ++--
 2 files changed, 4 insertions(+), 4 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/15802de1/examples/java/src/main/java/org/apache/beam/examples/WordCount.java
--
diff --git 
a/examples/java/src/main/java/org/apache/beam/examples/WordCount.java 
b/examples/java/src/main/java/org/apache/beam/examples/WordCount.java
index 58720b8..b64d2c1 100644
--- a/examples/java/src/main/java/org/apache/beam/examples/WordCount.java
+++ b/examples/java/src/main/java/org/apache/beam/examples/WordCount.java
@@ -45,8 +45,8 @@ import org.apache.beam.sdk.values.PCollection;
  * pipeline, for introduction of additional concepts.
  *
  * For a detailed walkthrough of this example, see
- *   <a href="http://beam.apache.org/use/walkthroughs/">
- *   http://beam.apache.org/use/walkthroughs/
+ *   <a href="https://beam.apache.org/get-started/wordcount-example/">
+ *   https://beam.apache.org/get-started/wordcount-example/
  *   </a>
  *
  * Basic concepts, also in the MinimalWordCount example:

http://git-wip-us.apache.org/repos/asf/beam/blob/15802de1/examples/java/src/main/java/org/apache/beam/examples/cookbook/TriggerExample.java
--
diff --git 
a/examples/java/src/main/java/org/apache/beam/examples/cookbook/TriggerExample.java
 
b/examples/java/src/main/java/org/apache/beam/examples/cookbook/TriggerExample.java
index 0b5d9ad..49d5eda 100644
--- 
a/examples/java/src/main/java/org/apache/beam/examples/cookbook/TriggerExample.java
+++ 
b/examples/java/src/main/java/org/apache/beam/examples/cookbook/TriggerExample.java
@@ -75,8 +75,8 @@ import org.joda.time.Instant;
  *
  * Before running this example, it will be useful to familiarize yourself 
with Beam triggers
  * and understand the concept of 'late data',
- * See: <a href="http://beam.apache.org/use/walkthroughs/">
- * http://beam.apache.org/use/walkthroughs/
+ * See: <a href="https://beam.apache.org/documentation/programming-guide/#triggers">
+ * https://beam.apache.org/documentation/programming-guide/#triggers
  *
  * The example is configured to use the default BigQuery table from the 
example common package
  * (there are no defaults for a general Beam pipeline).



[2/2] beam git commit: [BEAM-2098] Walkthrough URL in example code Javadoc is 404 not found

2017-04-29 Thread lcwik
[BEAM-2098] Walkthrough URL in example code Javadoc is 404 not found

This closes #2743


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/be883666
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/be883666
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/be883666

Branch: refs/heads/master
Commit: be883666f377b29926cc1e7408d7cd268ca6ceff
Parents: f29444b 15802de
Author: Lukasz Cwik 
Authored: Sat Apr 29 09:48:13 2017 -0700
Committer: Lukasz Cwik 
Committed: Sat Apr 29 09:48:13 2017 -0700

--
 .../java/src/main/java/org/apache/beam/examples/WordCount.java   | 4 ++--
 .../java/org/apache/beam/examples/cookbook/TriggerExample.java   | 4 ++--
 2 files changed, 4 insertions(+), 4 deletions(-)
--




[GitHub] beam pull request #2743: [BEAM-2098] Walkthrough URL in example code Javadoc...

2017-04-29 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/beam/pull/2743




[jira] [Commented] (BEAM-2098) Walkthrough URL in example code Javadoc is 404 not found

2017-04-29 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-2098?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15989939#comment-15989939
 ] 

ASF GitHub Bot commented on BEAM-2098:
--

Github user asfgit closed the pull request at:

https://github.com/apache/beam/pull/2743


> Walkthrough URL in example code Javadoc is 404 not found
> 
>
> Key: BEAM-2098
> URL: https://issues.apache.org/jira/browse/BEAM-2098
> Project: Beam
>  Issue Type: Bug
>  Components: examples-java
>Reporter: Péter Gergő Barna
>Assignee: Péter Gergő Barna
>Priority: Trivial
> Fix For: First stable release
>
>
> Walkthrough URL in example code Javadoc is 404 not found:
> http://beam.apache.org/use/walkthroughs/
> It should be replaced with:
> https://beam.apache.org/get-started/wordcount-example/





[jira] [Commented] (BEAM-1970) Cannot run UserScore on Flink runner due to AvroCoder classload issues

2017-04-29 Thread Kenneth Knowles (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-1970?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15989945#comment-15989945
 ] 

Kenneth Knowles commented on BEAM-1970:
---

From [~StephanEwen] on u...@beam.apache.org here is some help:
https://ci.apache.org/projects/flink/flink-docs-release-1.2/monitoring/debugging_classloading.html

Perhaps we can stymie the caching somewhere in how {{AvroCoder}} uses Avro, 
based on that write up and discussion?

> Cannot run UserScore on Flink runner due to AvroCoder classload issues
> --
>
> Key: BEAM-1970
> URL: https://issues.apache.org/jira/browse/BEAM-1970
> Project: Beam
>  Issue Type: Bug
>  Components: runner-flink
>Reporter: Ahmet Altay
>Assignee: Aljoscha Krettek
> Fix For: First stable release
>
>
> Fails with error:
> ClassCastException: 
> org.apache.beam.examples.complete.game.UserScore$GameActionInfo cannot be 
> cast to org.apache.beam.examples.complete.game.UserScore$GameActionInfo
> full stack:
> 
>  The program finished with the following exception:
> org.apache.flink.client.program.ProgramInvocationException: The main method 
> caused an error.
> at 
> org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:545)
> at 
> org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:419)
> at 
> org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:339)
> at 
> org.apache.flink.client.CliFrontend.executeProgram(CliFrontend.java:831)
> at org.apache.flink.client.CliFrontend.run(CliFrontend.java:256)
> at 
> org.apache.flink.client.CliFrontend.parseParameters(CliFrontend.java:1073)
> at org.apache.flink.client.CliFrontend$2.call(CliFrontend.java:1120)
> at org.apache.flink.client.CliFrontend$2.call(CliFrontend.java:1117)
> at 
> org.apache.flink.runtime.security.HadoopSecurityContext$1.run(HadoopSecurityContext.java:43)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:422)
> at 
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
> at 
> org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:40)
> at org.apache.flink.client.CliFrontend.main(CliFrontend.java:1116)
> Caused by: java.lang.RuntimeException: Pipeline execution failed
> at org.apache.beam.runners.flink.FlinkRunner.run(FlinkRunner.java:119)
> at org.apache.beam.sdk.Pipeline.run(Pipeline.java:265)
> at 
> org.apache.beam.examples.complete.game.UserScore.main(UserScore.java:238)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at 
> org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:528)
> ... 13 more
> Caused by: org.apache.flink.client.program.ProgramInvocationException: The 
> program execution failed: Job execution failed.
> at 
> org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:427)
> at 
> org.apache.flink.yarn.YarnClusterClient.submitJob(YarnClusterClient.java:210)
> at 
> org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:400)
> at 
> org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:387)
> at 
> org.apache.flink.client.program.ContextEnvironment.execute(ContextEnvironment.java:62)
> at 
> org.apache.beam.runners.flink.FlinkPipelineExecutionEnvironment.executePipeline(FlinkPipelineExecutionEnvironment.java:111)
> at org.apache.beam.runners.flink.FlinkRunner.run(FlinkRunner.java:116)
> ... 20 more
> Caused by: org.apache.flink.runtime.client.JobExecutionException: Job 
> execution failed.
> at 
> org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$6.apply$mcV$sp(JobManager.scala:900)
> at 
> org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$6.apply(JobManager.scala:843)
> at 
> org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$6.apply(JobManager.scala:843)
> at 
> scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
> at 
> scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.sca

Jenkins build became unstable: beam_PostCommit_Java_MavenInstall #3530

2017-04-29 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PerformanceTests_Dataflow #352

2017-04-29 Thread Apache Jenkins Server
See 


Changes:

[aviemzur] Fix broken build

[aviemzur] [BEAM-2057] Add a test for metrics reporting in Spark runner.

[aviemzur] [BEAM-1398] KafkaIO metrics.

[lcwik] [BEAM-1871] Hide internal implementation details of how we create a

[lcwik] [BEAM-2098] Walkthrough URL in example code Javadoc is 404 not found

--
[...truncated 272.56 KB...]
 * [new ref] refs/pull/2709/merge -> origin/pr/2709/merge
 * [new ref] refs/pull/2710/head -> origin/pr/2710/head
 * [new ref] refs/pull/2710/merge -> origin/pr/2710/merge
 * [new ref] refs/pull/2711/head -> origin/pr/2711/head
 * [new ref] refs/pull/2711/merge -> origin/pr/2711/merge
 * [new ref] refs/pull/2712/head -> origin/pr/2712/head
 * [new ref] refs/pull/2712/merge -> origin/pr/2712/merge
 * [new ref] refs/pull/2713/head -> origin/pr/2713/head
 * [new ref] refs/pull/2713/merge -> origin/pr/2713/merge
 * [new ref] refs/pull/2714/head -> origin/pr/2714/head
 * [new ref] refs/pull/2714/merge -> origin/pr/2714/merge
 * [new ref] refs/pull/2715/head -> origin/pr/2715/head
 * [new ref] refs/pull/2715/merge -> origin/pr/2715/merge
 * [new ref] refs/pull/2716/head -> origin/pr/2716/head
 * [new ref] refs/pull/2716/merge -> origin/pr/2716/merge
 * [new ref] refs/pull/2717/head -> origin/pr/2717/head
 * [new ref] refs/pull/2717/merge -> origin/pr/2717/merge
 * [new ref] refs/pull/2718/head -> origin/pr/2718/head
 * [new ref] refs/pull/2718/merge -> origin/pr/2718/merge
 * [new ref] refs/pull/2719/head -> origin/pr/2719/head
 * [new ref] refs/pull/2719/merge -> origin/pr/2719/merge
 * [new ref] refs/pull/2720/head -> origin/pr/2720/head
 * [new ref] refs/pull/2721/head -> origin/pr/2721/head
 * [new ref] refs/pull/2721/merge -> origin/pr/2721/merge
 * [new ref] refs/pull/2722/head -> origin/pr/2722/head
 * [new ref] refs/pull/2722/merge -> origin/pr/2722/merge
 * [new ref] refs/pull/2723/head -> origin/pr/2723/head
 * [new ref] refs/pull/2723/merge -> origin/pr/2723/merge
 * [new ref] refs/pull/2724/head -> origin/pr/2724/head
 * [new ref] refs/pull/2724/merge -> origin/pr/2724/merge
 * [new ref] refs/pull/2725/head -> origin/pr/2725/head
 * [new ref] refs/pull/2726/head -> origin/pr/2726/head
 * [new ref] refs/pull/2727/head -> origin/pr/2727/head
 * [new ref] refs/pull/2727/merge -> origin/pr/2727/merge
 * [new ref] refs/pull/2728/head -> origin/pr/2728/head
 * [new ref] refs/pull/2729/head -> origin/pr/2729/head
 * [new ref] refs/pull/2729/merge -> origin/pr/2729/merge
 * [new ref] refs/pull/2730/head -> origin/pr/2730/head
 * [new ref] refs/pull/2730/merge -> origin/pr/2730/merge
 * [new ref] refs/pull/2731/head -> origin/pr/2731/head
 * [new ref] refs/pull/2732/head -> origin/pr/2732/head
 * [new ref] refs/pull/2732/merge -> origin/pr/2732/merge
 * [new ref] refs/pull/2733/head -> origin/pr/2733/head
 * [new ref] refs/pull/2733/merge -> origin/pr/2733/merge
 * [new ref] refs/pull/2734/head -> origin/pr/2734/head
 * [new ref] refs/pull/2735/head -> origin/pr/2735/head
 * [new ref] refs/pull/2735/merge -> origin/pr/2735/merge
 * [new ref] refs/pull/2736/head -> origin/pr/2736/head
 * [new ref] refs/pull/2736/merge -> origin/pr/2736/merge
 * [new ref] refs/pull/2737/head -> origin/pr/2737/head
 * [new ref] refs/pull/2737/merge -> origin/pr/2737/merge
 * [new ref] refs/pull/2738/head -> origin/pr/2738/head
 * [new ref] refs/pull/2738/merge -> origin/pr/2738/merge
 * [new ref] refs/pull/2739/head -> origin/pr/2739/head
 * [new ref] refs/pull/2739/merge -> origin/pr/2739/merge
 * [new ref] refs/pull/2740/head -> origin/pr/2740/head
 * [new ref] refs/pull/2740/merge -> origin/pr/2740/merge
 * [new ref] refs/pull/2741/head -> origin/pr/2741/head
 * [new ref] refs/pull/2742/head -> origin/pr/2742/head
 * [new ref] refs/pull/2743/head -> origin/pr/2743/head
 * [new ref] refs/pull/2743/merge -> origin/pr/2743/merge
 * [new ref] refs/pull/2744/head -> origin/pr/2744/head
 * [new ref] refs/pull/2744/merge -> origin/pr/2744/merge
 * [new ref] refs/pull/2745/head -> origin/pr/2745/head
 * [new ref] refs/pull/2745/merge -> origin/pr/2745/merge
 * [new ref] refs/pull/2746/head -> origin/pr/2746/head
 * [new ref] refs/pull/2746/merge -> origin/pr/2746/merge
 * [new ref] refs/pull/2747/head -> origin/pr/2747/head
 * [new ref] refs/pull/2747/merge -> origin/pr/2747/merge
 * [new ref] refs/pull/2748/he

Build failed in Jenkins: beam_PerformanceTests_JDBC #158

2017-04-29 Thread Apache Jenkins Server
See 


Changes:

[aviemzur] Fix broken build

[aviemzur] [BEAM-2057] Add a test for metrics reporting in Spark runner.

[aviemzur] [BEAM-1398] KafkaIO metrics.

[lcwik] [BEAM-1871] Hide internal implementation details of how we create a

[lcwik] [BEAM-2098] Walkthrough URL in example code Javadoc is 404 not found

--
[...truncated 848.26 KB...]
at 
com.google.cloud.dataflow.worker.runners.worker.MapTaskExecutorFactory.create(MapTaskExecutorFactory.java:152)
at 
com.google.cloud.dataflow.worker.runners.worker.DataflowWorker.doWork(DataflowWorker.java:272)
at 
com.google.cloud.dataflow.worker.runners.worker.DataflowWorker.getAndPerformWork(DataflowWorker.java:244)
at 
com.google.cloud.dataflow.worker.runners.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:127)
at 
com.google.cloud.dataflow.worker.runners.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:107)
at 
com.google.cloud.dataflow.worker.runners.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:94)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
(d732055f59cba3da): java.lang.RuntimeException: 
org.apache.beam.sdk.util.UserCodeException: org.postgresql.util.PSQLException: 
The connection attempt failed.
at 
com.google.cloud.dataflow.worker.runners.worker.MapTaskExecutorFactory$3.typedApply(MapTaskExecutorFactory.java:289)
at 
com.google.cloud.dataflow.worker.runners.worker.MapTaskExecutorFactory$3.typedApply(MapTaskExecutorFactory.java:261)
at 
com.google.cloud.dataflow.worker.graph.Networks$TypeSafeNodeFunction.apply(Networks.java:55)
at 
com.google.cloud.dataflow.worker.graph.Networks$TypeSafeNodeFunction.apply(Networks.java:43)
at 
com.google.cloud.dataflow.worker.graph.Networks.replaceDirectedNetworkNodes(Networks.java:78)
at 
com.google.cloud.dataflow.worker.runners.worker.MapTaskExecutorFactory.create(MapTaskExecutorFactory.java:152)
at 
com.google.cloud.dataflow.worker.runners.worker.DataflowWorker.doWork(DataflowWorker.java:272)
at 
com.google.cloud.dataflow.worker.runners.worker.DataflowWorker.getAndPerformWork(DataflowWorker.java:244)
at 
com.google.cloud.dataflow.worker.runners.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:127)
at 
com.google.cloud.dataflow.worker.runners.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:107)
at 
com.google.cloud.dataflow.worker.runners.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:94)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.beam.sdk.util.UserCodeException: 
org.postgresql.util.PSQLException: The connection attempt failed.
at 
org.apache.beam.sdk.util.UserCodeException.wrap(UserCodeException.java:36)
at 
org.apache.beam.sdk.io.jdbc.JdbcIO$Read$ReadFn$auxiliary$ruH9D6bk.invokeSetup(Unknown
 Source)
at 
com.google.cloud.dataflow.worker.runners.worker.DoFnInstanceManagers$ConcurrentQueueInstanceManager.deserializeCopy(DoFnInstanceManagers.java:66)
at 
com.google.cloud.dataflow.worker.runners.worker.DoFnInstanceManagers$ConcurrentQueueInstanceManager.peek(DoFnInstanceManagers.java:48)
at 
com.google.cloud.dataflow.worker.runners.worker.UserParDoFnFactory.create(UserParDoFnFactory.java:99)
at 
com.google.cloud.dataflow.worker.runners.worker.DefaultParDoFnFactory.create(DefaultParDoFnFactory.java:70)
at 
com.google.cloud.dataflow.worker.runners.worker.MapTaskExecutorFactory.createParDoOperation(MapTaskExecutorFactory.java:363)
at 
com.google.cloud.dataflow.worker.runners.worker.MapTaskExecutorFactory$3.typedApply(MapTaskExecutorFactory.java:278)
... 14 more
Caused by: org.postgresql.util.PSQLException: The connection attempt failed.
at 
org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:272)
at 
org.postgresql.core.ConnectionFactory.openConnection(ConnectionFactory.java:51)
at org.postgresql.jdbc.PgConnection.<init>(PgConnection.java:215)
at org.postgresql.Driver.makeConnection(Driver.java:404)
at org.postgresql.Driver.connect(Driver.java:272)
at java.sql.Drive

Jenkins build is back to normal : beam_PostCommit_Java_ValidatesRunner_Dataflow #2974

2017-04-29 Thread Apache Jenkins Server
See 




Jenkins build became unstable: beam_PostCommit_Java_ValidatesRunner_Spark #1838

2017-04-29 Thread Apache Jenkins Server
See 




Jenkins build is back to stable : beam_PostCommit_Java_MavenInstall #3532

2017-04-29 Thread Apache Jenkins Server
See 




[5/5] beam git commit: This closes #2666: Remove some construction-time uses of PipelineOptions

2017-04-29 Thread kenn
This closes #2666: Remove some construction-time uses of PipelineOptions

  Supply PipelineOptions at Pipeline.run()
  Move PTransform.validate to post-construction, modulo BigQueryIO
  Remove PipelineOptions from createWriteOperation()
  Move stable name validation to Pipeline.run()
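The common thread in the four changes above is that a pipeline graph is built first and PipelineOptions only matter at execution time. A minimal standalone sketch of that idea (class and method names here are invented for illustration, not the actual Beam API):

```java
import java.util.ArrayList;
import java.util.List;

// Sketch: the graph is assembled without consulting any options; the
// options (here reduced to a runner name) arrive only when run() is called,
// so one constructed graph can be executed under different configurations.
public class LatePipeline {
    private final List<String> steps = new ArrayList<>();

    public LatePipeline step(String name) {
        steps.add(name);  // construction never touches options
        return this;
    }

    public String run(String runnerName) {
        // Execution is the first point where configuration is available.
        return runnerName + " executed " + steps.size() + " steps";
    }
}
```

The same graph could then be run twice, e.g. once with a direct runner and once with a remote one, without rebuilding it.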


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/f5e3f523
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/f5e3f523
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/f5e3f523

Branch: refs/heads/master
Commit: f5e3f5230af35da5a03ba9740f087b0f22df6dca
Parents: be88366 77a2545
Author: Kenneth Knowles 
Authored: Sat Apr 29 12:06:07 2017 -0700
Committer: Kenneth Knowles 
Committed: Sat Apr 29 12:06:07 2017 -0700

--
 .../core/construction/ForwardingPTransform.java |   5 +-
 .../construction/ForwardingPTransformTest.java  |   7 +-
 .../construction/PTransformMatchersTest.java|   4 +-
 .../direct/WriteWithShardingFactoryTest.java|   2 +-
 .../beam/runners/flink/FlinkTestPipeline.java   |   9 +-
 .../beam/runners/dataflow/AssignWindows.java|   5 +-
 .../runners/dataflow/DataflowRunnerTest.java|  12 +-
 .../main/java/org/apache/beam/sdk/Pipeline.java | 133 +--
 .../apache/beam/sdk/coders/CoderRegistry.java   |   9 +-
 .../java/org/apache/beam/sdk/io/AvroIO.java |   2 +-
 .../org/apache/beam/sdk/io/FileBasedSink.java   |   2 +-
 .../java/org/apache/beam/sdk/io/TFRecordIO.java |   2 +-
 .../java/org/apache/beam/sdk/io/TextIO.java |   3 +-
 .../java/org/apache/beam/sdk/io/WriteFiles.java |   9 +-
 .../apache/beam/sdk/testing/TestPipeline.java   |   9 +-
 .../apache/beam/sdk/transforms/GroupByKey.java  |  27 ++--
 .../apache/beam/sdk/transforms/PTransform.java  |   8 +-
 .../org/apache/beam/sdk/transforms/View.java|  25 +---
 .../beam/sdk/transforms/windowing/Window.java   |   5 +-
 .../java/org/apache/beam/sdk/PipelineTest.java  |  39 +++---
 .../apache/beam/sdk/io/FileBasedSinkTest.java   |   2 +-
 .../java/org/apache/beam/sdk/io/SimpleSink.java |   2 +-
 .../sdk/io/elasticsearch/ElasticsearchIO.java   |   4 +-
 .../beam/sdk/io/gcp/bigquery/BatchLoads.java|  27 
 .../beam/sdk/io/gcp/bigquery/BigQueryIO.java|  41 ++
 .../beam/sdk/io/gcp/bigtable/BigtableIO.java|  12 +-
 .../beam/sdk/io/gcp/datastore/DatastoreV1.java  |   4 +-
 .../sdk/io/gcp/bigtable/BigtableIOTest.java |   6 +-
 .../hadoop/inputformat/HadoopInputFormatIO.java |  10 +-
 .../inputformat/HadoopInputFormatIOTest.java|  27 ++--
 .../org/apache/beam/sdk/io/hbase/HBaseIO.java   |   4 +-
 .../apache/beam/sdk/io/hbase/HBaseIOTest.java   |   1 +
 .../apache/beam/sdk/io/hdfs/HDFSFileSink.java   |   2 +-
 .../java/org/apache/beam/sdk/io/hdfs/Sink.java  |   2 +-
 .../java/org/apache/beam/sdk/io/hdfs/Write.java |   7 +-
 .../beam/sdk/io/hdfs/HDFSFileSinkTest.java  |   2 +-
 .../org/apache/beam/sdk/io/jdbc/JdbcIO.java |   5 +-
 .../java/org/apache/beam/sdk/io/jms/JmsIO.java  |   4 +-
 .../org/apache/beam/sdk/io/kafka/KafkaIO.java   |  37 ++
 .../apache/beam/sdk/io/mongodb/MongoDbIO.java   |   4 +-
 .../org/apache/beam/sdk/io/mqtt/MqttIO.java |   4 +-
 .../java/org/apache/beam/sdk/io/xml/XmlIO.java  |   5 +-
 .../org/apache/beam/sdk/io/xml/XmlSink.java |   2 +-
 .../org/apache/beam/sdk/io/xml/XmlSinkTest.java |   8 +-
 44 files changed, 296 insertions(+), 243 deletions(-)
--




[2/5] beam git commit: Remove PipelineOptions from createWriteOperation()

2017-04-29 Thread kenn
Remove PipelineOptions from createWriteOperation()


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/4291fa6d
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/4291fa6d
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/4291fa6d

Branch: refs/heads/master
Commit: 4291fa6ddc75a7142cacb39025b613eca54c48c3
Parents: f6ebb04
Author: Kenneth Knowles 
Authored: Mon Apr 24 13:57:23 2017 -0700
Committer: Kenneth Knowles 
Committed: Sat Apr 29 10:41:33 2017 -0700

--
 .../runners/core/construction/PTransformMatchersTest.java| 4 +---
 .../beam/runners/direct/WriteWithShardingFactoryTest.java| 2 +-
 .../core/src/main/java/org/apache/beam/sdk/io/AvroIO.java| 2 +-
 .../src/main/java/org/apache/beam/sdk/io/FileBasedSink.java  | 2 +-
 .../src/main/java/org/apache/beam/sdk/io/TFRecordIO.java | 2 +-
 .../core/src/main/java/org/apache/beam/sdk/io/TextIO.java| 3 +--
 .../src/main/java/org/apache/beam/sdk/io/WriteFiles.java | 2 +-
 .../test/java/org/apache/beam/sdk/io/FileBasedSinkTest.java  | 2 +-
 .../src/test/java/org/apache/beam/sdk/io/SimpleSink.java | 2 +-
 .../main/java/org/apache/beam/sdk/io/hdfs/HDFSFileSink.java  | 2 +-
 .../hdfs/src/main/java/org/apache/beam/sdk/io/hdfs/Sink.java | 2 +-
 .../src/main/java/org/apache/beam/sdk/io/hdfs/Write.java | 2 +-
 .../java/org/apache/beam/sdk/io/hdfs/HDFSFileSinkTest.java   | 2 +-
 .../src/main/java/org/apache/beam/sdk/io/xml/XmlSink.java| 2 +-
 .../test/java/org/apache/beam/sdk/io/xml/XmlSinkTest.java| 8 
 15 files changed, 18 insertions(+), 21 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/4291fa6d/runners/core-construction-java/src/test/java/org/apache/beam/runners/core/construction/PTransformMatchersTest.java
--
diff --git 
a/runners/core-construction-java/src/test/java/org/apache/beam/runners/core/construction/PTransformMatchersTest.java
 
b/runners/core-construction-java/src/test/java/org/apache/beam/runners/core/construction/PTransformMatchersTest.java
index d9bc1e7..9754bb3 100644
--- 
a/runners/core-construction-java/src/test/java/org/apache/beam/runners/core/construction/PTransformMatchersTest.java
+++ 
b/runners/core-construction-java/src/test/java/org/apache/beam/runners/core/construction/PTransformMatchersTest.java
@@ -31,7 +31,6 @@ import org.apache.beam.sdk.coders.VarIntCoder;
 import org.apache.beam.sdk.coders.VoidCoder;
 import org.apache.beam.sdk.io.FileBasedSink;
 import org.apache.beam.sdk.io.WriteFiles;
-import org.apache.beam.sdk.options.PipelineOptions;
 import org.apache.beam.sdk.runners.PTransformMatcher;
 import org.apache.beam.sdk.testing.TestPipeline;
 import org.apache.beam.sdk.transforms.AppliedPTransform;
@@ -503,8 +502,7 @@ public class PTransformMatchersTest implements Serializable 
{
 WriteFiles.to(
 new FileBasedSink("foo", "bar") {
   @Override
-  public FileBasedWriteOperation createWriteOperation(
-  PipelineOptions options) {
+  public FileBasedWriteOperation createWriteOperation() {
 return null;
   }
 });

http://git-wip-us.apache.org/repos/asf/beam/blob/4291fa6d/runners/direct-java/src/test/java/org/apache/beam/runners/direct/WriteWithShardingFactoryTest.java
--
diff --git 
a/runners/direct-java/src/test/java/org/apache/beam/runners/direct/WriteWithShardingFactoryTest.java
 
b/runners/direct-java/src/test/java/org/apache/beam/runners/direct/WriteWithShardingFactoryTest.java
index b0c9f6d..960640c 100644
--- 
a/runners/direct-java/src/test/java/org/apache/beam/runners/direct/WriteWithShardingFactoryTest.java
+++ 
b/runners/direct-java/src/test/java/org/apache/beam/runners/direct/WriteWithShardingFactoryTest.java
@@ -216,7 +216,7 @@ public class WriteWithShardingFactoryTest {
 public void validate(PipelineOptions options) {}
 
 @Override
-public FileBasedWriteOperation createWriteOperation(PipelineOptions options) {
+public FileBasedWriteOperation createWriteOperation() {
   throw new IllegalArgumentException("Should not be used");
 }
   }

http://git-wip-us.apache.org/repos/asf/beam/blob/4291fa6d/sdks/java/core/src/main/java/org/apache/beam/sdk/io/AvroIO.java
--
diff --git a/sdks/java/core/src/main/java/org/apache/beam/sdk/io/AvroIO.java 
b/sdks/java/core/src/main/java/org/apache/beam/sdk/io/AvroIO.java
index 24e158f..a48976f 100644
--- a/sdks/java/core/src/main/java/org/apache/beam/sdk/io/AvroIO.java
+++ b/sdks/java/core/src/main/java/org/apache/beam/sdk/io/AvroIO.java
@@ -989,7 +989,7 @@ public class AvroIO {
 }
 
 @O

[4/5] beam git commit: Supply PipelineOptions at Pipeline.run()


2017-04-29 Thread kenn
Supply PipelineOptions at Pipeline.run()


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/77a25452
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/77a25452
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/77a25452

Branch: refs/heads/master
Commit: 77a25452f87dd380eed981603252c238089d4439
Parents: 19aa8ba
Author: Kenneth Knowles 
Authored: Mon Apr 24 12:50:57 2017 -0700
Committer: Kenneth Knowles 
Committed: Sat Apr 29 12:05:34 2017 -0700

--
 .../beam/runners/flink/FlinkTestPipeline.java   |  9 +--
 .../runners/dataflow/DataflowRunnerTest.java| 12 +++-
 .../main/java/org/apache/beam/sdk/Pipeline.java | 68 +---
 .../apache/beam/sdk/testing/TestPipeline.java   |  9 +--
 .../java/org/apache/beam/sdk/PipelineTest.java  | 28 +---
 5 files changed, 78 insertions(+), 48 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/77a25452/runners/flink/src/test/java/org/apache/beam/runners/flink/FlinkTestPipeline.java
--
diff --git 
a/runners/flink/src/test/java/org/apache/beam/runners/flink/FlinkTestPipeline.java
 
b/runners/flink/src/test/java/org/apache/beam/runners/flink/FlinkTestPipeline.java
index d6240c4..f3498be 100644
--- 
a/runners/flink/src/test/java/org/apache/beam/runners/flink/FlinkTestPipeline.java
+++ 
b/runners/flink/src/test/java/org/apache/beam/runners/flink/FlinkTestPipeline.java
@@ -18,9 +18,7 @@
 package org.apache.beam.runners.flink;
 
 import org.apache.beam.sdk.Pipeline;
-import org.apache.beam.sdk.PipelineResult;
 import org.apache.beam.sdk.options.PipelineOptions;
-import org.apache.beam.sdk.runners.PipelineRunner;
 
 /**
  * {@link org.apache.beam.sdk.Pipeline} for testing Dataflow programs on the
@@ -61,12 +59,11 @@ public class FlinkTestPipeline extends Pipeline {
*/
   private static FlinkTestPipeline create(boolean streaming) {
 TestFlinkRunner flinkRunner = TestFlinkRunner.create(streaming);
-return new FlinkTestPipeline(flinkRunner, flinkRunner.getPipelineOptions());
+return new FlinkTestPipeline(flinkRunner.getPipelineOptions());
   }
 
-  private FlinkTestPipeline(PipelineRunner runner,
-  PipelineOptions options) {
-super(runner, options);
+  private FlinkTestPipeline(PipelineOptions options) {
+super(options);
   }
 }
 

http://git-wip-us.apache.org/repos/asf/beam/blob/77a25452/runners/google-cloud-dataflow-java/src/test/java/org/apache/beam/runners/dataflow/DataflowRunnerTest.java
--
diff --git 
a/runners/google-cloud-dataflow-java/src/test/java/org/apache/beam/runners/dataflow/DataflowRunnerTest.java
 
b/runners/google-cloud-dataflow-java/src/test/java/org/apache/beam/runners/dataflow/DataflowRunnerTest.java
index 433fb77..c1d3fe6 100644
--- 
a/runners/google-cloud-dataflow-java/src/test/java/org/apache/beam/runners/dataflow/DataflowRunnerTest.java
+++ 
b/runners/google-cloud-dataflow-java/src/test/java/org/apache/beam/runners/dataflow/DataflowRunnerTest.java
@@ -252,7 +252,7 @@ public class DataflowRunnerTest {
 };
 
 try {
-  TestPipeline.fromOptions(PipelineOptionsFactory.fromArgs(args).create());
+  Pipeline.create(PipelineOptionsFactory.fromArgs(args).create()).run();
   fail();
 } catch (RuntimeException e) {
   assertThat(
@@ -271,7 +271,7 @@ public class DataflowRunnerTest {
 };
 
 try {
-  TestPipeline.fromOptions(PipelineOptionsFactory.fromArgs(args).create());
+  Pipeline.create(PipelineOptionsFactory.fromArgs(args).create()).run();
   fail();
 } catch (RuntimeException e) {
   assertThat(
@@ -917,7 +917,13 @@ public class DataflowRunnerTest {
 DataflowPipelineOptions streamingOptions = buildPipelineOptions();
 streamingOptions.setStreaming(true);
 streamingOptions.setRunner(DataflowRunner.class);
-Pipeline.create(streamingOptions);
+Pipeline p = Pipeline.create(streamingOptions);
+
+// Instantiation of a runner prior to run() currently has a side effect of mutating the options.
+// This could be tested by DataflowRunner.fromOptions(streamingOptions) but would not ensure
+// that the pipeline itself had the expected options set.
+p.run();
+
 assertEquals(
 DataflowRunner.GCS_UPLOAD_BUFFER_SIZE_BYTES_DEFAULT,
 streamingOptions.getGcsUploadBufferSizeBytes().intValue());

http://git-wip-us.apache.org/repos/asf/beam/blob/77a25452/sdks/java/core/src/main/java/org/apache/beam/sdk/Pipeline.java
--
diff --git a/sdks/java/core/src/main/java/org/apache/beam/sdk/Pipeline.java 
b/sdks/java/core/src/main/java/org/apache/beam/sdk/Pipeline.java
index 203bd14..d578a7a 100644
--- a/sdks

[1/5] beam git commit: Move stable name validation to Pipeline.run()

2017-04-29 Thread kenn
Repository: beam
Updated Branches:
  refs/heads/master be883666f -> f5e3f5230


Move stable name validation to Pipeline.run()

The action taken when a pipeline does not have unique stable
names depends on the PipelineOptions, which will not available
during construction. Moving this later removes one blocker
from the refactor to PipelineOptions availability.
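The pattern in the commit message above can be sketched as: record the problem during construction, and only decide what to do about it once the options are known at run time. A hypothetical simplified version (MiniPipeline and StableNames are invented names, not Beam's classes):

```java
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

// Sketch of deferred validation: name collisions are collected while the
// graph is built, and the policy (OFF / WARNING / ERROR) is applied at run().
public class MiniPipeline {
    enum StableNames { OFF, WARNING, ERROR }

    private final Set<String> usedNames = new HashSet<>();
    private final List<String> unstableNames = new ArrayList<>();

    // Construction time: only record the duplicate; the policy that decides
    // whether this is fatal is part of the options supplied later.
    public void apply(String name) {
        if (!usedNames.add(name)) {
            unstableNames.add(name);
        }
    }

    // Run time: the policy is now available, so act on everything recorded.
    public void run(StableNames policy) {
        if (unstableNames.isEmpty()) {
            return;
        }
        switch (policy) {
            case OFF:
                break;
            case WARNING:
                System.err.println(
                    "Transforms without stable unique names: " + unstableNames);
                break;
            case ERROR:
                throw new IllegalStateException(
                    "Pipeline update will not be possible; unstable names: "
                        + unstableNames);
        }
    }

    public List<String> unstableNames() {
        return unstableNames;
    }
}
```

This mirrors the diff below, where the switch over `getStableUniqueNames()` moves out of `applyTransform` and into a `validate(PipelineOptions)` method invoked from `run()`.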


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/f6ebb045
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/f6ebb045
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/f6ebb045

Branch: refs/heads/master
Commit: f6ebb04513497c5a835e6b03cad3cefe20c682ed
Parents: f29444b
Author: Kenneth Knowles 
Authored: Mon Apr 24 12:49:39 2017 -0700
Committer: Kenneth Knowles 
Committed: Sat Apr 29 10:41:32 2017 -0700

--
 .../main/java/org/apache/beam/sdk/Pipeline.java | 49 
 .../java/org/apache/beam/sdk/PipelineTest.java  | 11 +++--
 2 files changed, 36 insertions(+), 24 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/f6ebb045/sdks/java/core/src/main/java/org/apache/beam/sdk/Pipeline.java
--
diff --git a/sdks/java/core/src/main/java/org/apache/beam/sdk/Pipeline.java 
b/sdks/java/core/src/main/java/org/apache/beam/sdk/Pipeline.java
index 88ecc0b..716b328 100644
--- a/sdks/java/core/src/main/java/org/apache/beam/sdk/Pipeline.java
+++ b/sdks/java/core/src/main/java/org/apache/beam/sdk/Pipeline.java
@@ -19,8 +19,11 @@ package org.apache.beam.sdk;
 
 import static com.google.common.base.Preconditions.checkState;
 
+import com.google.common.annotations.VisibleForTesting;
+import com.google.common.base.Joiner;
 import com.google.common.collect.HashMultimap;
 import com.google.common.collect.SetMultimap;
+import java.util.ArrayList;
 import java.util.Collection;
 import java.util.HashSet;
 import java.util.List;
@@ -274,6 +277,7 @@ public class Pipeline {
 // pipeline.
 LOG.debug("Running {} via {}", this, runner);
 try {
+  validate(options);
   return runner.run(this);
 } catch (UserCodeException e) {
   // This serves to replace the stack with one that ends here and
@@ -418,6 +422,7 @@ public class Pipeline {
   private final TransformHierarchy transforms = new TransformHierarchy(this);
  private Set<String> usedFullNames = new HashSet<>();
   private CoderRegistry coderRegistry;
+  private final List<String> unstableNames = new ArrayList<>();
 
   protected Pipeline(PipelineRunner runner, PipelineOptions options) {
 this.runner = runner;
@@ -442,25 +447,7 @@ public class Pipeline {
 boolean nameIsUnique = uniqueName.equals(buildName(namePrefix, name));
 
 if (!nameIsUnique) {
-  switch (getOptions().getStableUniqueNames()) {
-case OFF:
-  break;
-case WARNING:
-  LOG.warn(
-  "Transform {} does not have a stable unique name. "
-  + "This will prevent updating of pipelines.",
-  uniqueName);
-  break;
-case ERROR:
-  throw new IllegalStateException(
-  "Transform "
-  + uniqueName
-  + " does not have a stable unique name. "
-  + "This will prevent updating of pipelines.");
-default:
-  throw new IllegalArgumentException(
-  "Unrecognized value for stable unique names: " + getOptions().getStableUniqueNames());
-  }
+  unstableNames.add(uniqueName);
 }
 
 LOG.debug("Adding {} to {}", transform, this);
@@ -504,6 +491,30 @@ public class Pipeline {
 }
   }
 
+  @VisibleForTesting
+  void validate(PipelineOptions options) {
+if (!unstableNames.isEmpty()) {
+  switch (options.getStableUniqueNames()) {
+case OFF:
+  break;
+case WARNING:
+  LOG.warn(
+  "The following transforms do not have stable unique names: {}",
+  Joiner.on(", ").join(unstableNames));
+  break;
+case ERROR:
+  throw new IllegalStateException(
+  String.format(
+  "Pipeline update will not be possible"
+  + " because the following transforms do not have stable unique names: %s.",
+  Joiner.on(", ").join(unstableNames)));
+default:
+  throw new IllegalArgumentException(
+  "Unrecognized value for stable unique names: " + options.getStableUniqueNames());
+  }
+}
+  }
+
   /**
* Returns the {@link PipelineOptions} provided at the time this {@link 
Pipeline} was created.
*

http://git-wip-us.apache.org/repos/asf/beam/blob/f6ebb045/sdks/java/core/src/test/java/org/apache/beam/sdk/PipelineTest.java
--
diff 

[3/5] beam git commit: Move PTransform.validate to post-construction, modulo BigQueryIO

2017-04-29 Thread kenn
Move PTransform.validate to post-construction, modulo BigQueryIO

PipelineOptions, as used for most validation, should not be available
at construction time. Instead, validation will be called just before
running a pipeline.

BigQueryIO is particularly problematic and will get further treatment.
For now, the workaround is to establish the proper validation methods
but then to call them (erroneously, basically) at construction time
in expand().
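The shape of this change can be illustrated with a minimal standalone sketch (Transform, Options, and ValidatingPipeline are invented names for this example, not the Beam API): transforms expose a `validate(options)` hook, applying a transform performs no validation, and the pipeline calls every transform's hook just before execution.

```java
import java.util.ArrayList;
import java.util.List;

// A transform that can be validated once options are finally available.
interface Transform {
    void validate(Options options);
}

// Stand-in for PipelineOptions, reduced to a single flag for the sketch.
class Options {
    final boolean allowUnboundedReads;

    Options(boolean allowUnboundedReads) {
        this.allowUnboundedReads = allowUnboundedReads;
    }
}

public class ValidatingPipeline {
    private final List<Transform> transforms = new ArrayList<>();

    public void apply(Transform t) {
        transforms.add(t);  // no validation at construction time
    }

    public void run(Options options) {
        for (Transform t : transforms) {
            t.validate(options);  // post-construction validation
        }
        // ... execution would follow here ...
    }
}
```

Under this split, an invalid combination of transform and options surfaces at `run()` rather than at `apply()`, which is exactly why BigQueryIO (which today relies on construction-time checks) needs the interim workaround of calling its validation from `expand()`.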


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/19aa8ba5
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/19aa8ba5
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/19aa8ba5

Branch: refs/heads/master
Commit: 19aa8ba576c2d43166dc6d67a4c5c103b3522870
Parents: 4291fa6
Author: Kenneth Knowles 
Authored: Mon Apr 24 14:37:30 2017 -0700
Committer: Kenneth Knowles 
Committed: Sat Apr 29 11:39:00 2017 -0700

--
 .../core/construction/ForwardingPTransform.java |  5 ++-
 .../construction/ForwardingPTransformTest.java  |  7 ++--
 .../beam/runners/dataflow/AssignWindows.java|  5 ++-
 .../main/java/org/apache/beam/sdk/Pipeline.java | 22 ++-
 .../apache/beam/sdk/coders/CoderRegistry.java   |  9 -
 .../java/org/apache/beam/sdk/io/WriteFiles.java |  7 +++-
 .../apache/beam/sdk/transforms/GroupByKey.java  | 27 ++---
 .../apache/beam/sdk/transforms/PTransform.java  |  8 ++--
 .../org/apache/beam/sdk/transforms/View.java| 25 +++-
 .../beam/sdk/transforms/windowing/Window.java   |  5 ++-
 .../sdk/io/elasticsearch/ElasticsearchIO.java   |  4 +-
 .../beam/sdk/io/gcp/bigquery/BatchLoads.java| 27 +
 .../beam/sdk/io/gcp/bigquery/BigQueryIO.java| 41 ++--
 .../beam/sdk/io/gcp/bigtable/BigtableIO.java| 12 +++---
 .../beam/sdk/io/gcp/datastore/DatastoreV1.java  |  4 +-
 .../sdk/io/gcp/bigtable/BigtableIOTest.java |  6 +--
 .../hadoop/inputformat/HadoopInputFormatIO.java | 10 ++---
 .../inputformat/HadoopInputFormatIOTest.java| 27 +++--
 .../org/apache/beam/sdk/io/hbase/HBaseIO.java   |  4 +-
 .../apache/beam/sdk/io/hbase/HBaseIOTest.java   |  1 +
 .../java/org/apache/beam/sdk/io/hdfs/Write.java |  7 +++-
 .../org/apache/beam/sdk/io/jdbc/JdbcIO.java |  5 ++-
 .../java/org/apache/beam/sdk/io/jms/JmsIO.java  |  4 +-
 .../org/apache/beam/sdk/io/kafka/KafkaIO.java   | 37 ++
 .../apache/beam/sdk/io/mongodb/MongoDbIO.java   |  4 +-
 .../org/apache/beam/sdk/io/mqtt/MqttIO.java |  4 +-
 .../java/org/apache/beam/sdk/io/xml/XmlIO.java  |  5 ++-
 27 files changed, 168 insertions(+), 154 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/19aa8ba5/runners/core-construction-java/src/main/java/org/apache/beam/runners/core/construction/ForwardingPTransform.java
--
diff --git 
a/runners/core-construction-java/src/main/java/org/apache/beam/runners/core/construction/ForwardingPTransform.java
 
b/runners/core-construction-java/src/main/java/org/apache/beam/runners/core/construction/ForwardingPTransform.java
index 3bee281..2f427ad 100644
--- 
a/runners/core-construction-java/src/main/java/org/apache/beam/runners/core/construction/ForwardingPTransform.java
+++ 
b/runners/core-construction-java/src/main/java/org/apache/beam/runners/core/construction/ForwardingPTransform.java
@@ -19,6 +19,7 @@ package org.apache.beam.runners.core.construction;
 
 import org.apache.beam.sdk.coders.CannotProvideCoderException;
 import org.apache.beam.sdk.coders.Coder;
+import org.apache.beam.sdk.options.PipelineOptions;
 import org.apache.beam.sdk.transforms.PTransform;
 import org.apache.beam.sdk.transforms.display.DisplayData;
 import org.apache.beam.sdk.values.PInput;
@@ -40,8 +41,8 @@ public abstract class ForwardingPTransform

http://git-wip-us.apache.org/repos/asf/beam/blob/19aa8ba5/runners/core-construction-java/src/test/java/org/apache/beam/runners/core/construction/ForwardingPTransformTest.java
--
diff --git 
a/runners/core-construction-java/src/test/java/org/apache/beam/runners/core/construction/ForwardingPTransformTest.java
 
b/runners/core-construction-java/src/test/java/org/apache/beam/runners/core/construction/ForwardingPTransformTest.java
index 7d3bfd8..74c056c 100644
--- 
a/runners/core-construction-java/src/test/java/org/apache/beam/runners/core/construction/ForwardingPTransformTest.java
+++ 
b/runners/core-construction-java/src/test/java/org/apache/beam/runners/core/construction/ForwardingPTransformTest.java
@@ -22,6 +22,7 @@ import static org.junit.Assert.assertThat;
 import static org.mockito.Matchers.any;
 
 import org.apache.beam.sdk.coders.Coder;
+import org.apache.beam.sdk.options.PipelineOptions;
 import org.apache.beam.sdk.transforms.PTra

[GitHub] beam pull request #2666: [BEAM-818,BEAM-827,BEAM-828] Remove some constructi...

2017-04-29 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/beam/pull/2666


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[jira] [Commented] (BEAM-818) Remove Pipeline.getPipelineOptions

2017-04-29 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-818?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15990013#comment-15990013
 ] 

ASF GitHub Bot commented on BEAM-818:
-

Github user asfgit closed the pull request at:

https://github.com/apache/beam/pull/2666


> Remove Pipeline.getPipelineOptions
> --
>
> Key: BEAM-818
> URL: https://issues.apache.org/jira/browse/BEAM-818
> Project: Beam
>  Issue Type: Improvement
>  Components: sdk-java-core
>Reporter: Thomas Groh
>Assignee: Kenneth Knowles
>  Labels: backward-incompatible
> Fix For: First stable release
>
>
> This stops transforms from changing their operation based on 
> construction-time options, and instead requires that configuration to be 
> explicit, or to obtain the configuration at runtime.
> https://docs.google.com/document/d/1Wr05cYdqnCfrLLqSk--XmGMGgDwwNwWZaFbxLKvPqEQ/edit#



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


Jenkins build is back to stable : beam_PostCommit_Java_ValidatesRunner_Spark #1839

2017-04-29 Thread Apache Jenkins Server
See 




Jenkins build became unstable: beam_PostCommit_Java_MavenInstall #3533

2017-04-29 Thread Apache Jenkins Server
See 




[jira] [Commented] (BEAM-1114) Support for new Timer API in Apex runner

2017-04-29 Thread Kenneth Knowles (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-1114?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15990030#comment-15990030
 ] 

Kenneth Knowles commented on BEAM-1114:
---

With the improvements to ApexTimerInternals, I believe next steps are:
1. In ApexRunner overrides or somewhere, ensure that a ParDo is key partitioned 
appropriately if it is stateful
2. In ApexParDoOperator, ensure that timers are called. Probably very easy, 
something like https://github.com/apache/beam/pull/2032/files
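The two steps above can be illustrated with a self-contained sketch. Every name here is hypothetical — none of this is Apex or Beam API. Step (1) is stable key partitioning so a stateful ParDo always sees a given key on the same shard; step (2) is firing every pending timer whose target time is at or before the advancing watermark.

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;
import java.util.Map;
import java.util.NavigableMap;
import java.util.TreeMap;

public class TimerSketch {
    // (1) stable key partitioning: the same key always lands on the same shard
    static int shardFor(String key, int numShards) {
        return Math.floorMod(key.hashCode(), numShards);
    }

    // (2) pending timers ordered by fire time, fired as the watermark passes them
    static final NavigableMap<Long, List<String>> pendingTimers = new TreeMap<>();

    static void setTimer(String key, long fireAt) {
        pendingTimers.computeIfAbsent(fireAt, t -> new ArrayList<>()).add(key);
    }

    static List<String> advanceWatermarkTo(long watermark) {
        List<String> fired = new ArrayList<>();
        // headMap(watermark, true) holds every timer that is now eligible
        Iterator<Map.Entry<Long, List<String>>> it =
                pendingTimers.headMap(watermark, true).entrySet().iterator();
        while (it.hasNext()) {
            fired.addAll(it.next().getValue());
            it.remove();
        }
        return fired;
    }

    public static void main(String[] args) {
        setTimer("user-a", 10);
        setTimer("user-b", 20);
        System.out.println(advanceWatermarkTo(15)); // [user-a]
        System.out.println(shardFor("user-a", 4) == shardFor("user-a", 4)); // true
    }
}
```

In a real runner the firing loop would hand each eligible timer back to the user's `@OnTimer` method for that key, which is roughly what the linked PR does for another runner.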


> Support for new Timer API in Apex runner
> 
>
> Key: BEAM-1114
> URL: https://issues.apache.org/jira/browse/BEAM-1114
> Project: Beam
>  Issue Type: New Feature
>  Components: runner-apex
>Reporter: Kenneth Knowles
>




--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Commented] (BEAM-1970) Cannot run UserScore on Flink runner due to AvroCoder classload issues

2017-04-29 Thread Aljoscha Krettek (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-1970?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15990038#comment-15990038
 ] 

Aljoscha Krettek commented on BEAM-1970:


Thanks for that hint, [~StephanEwen]! I think AVRO-1283 could be what's biting 
us. If the caches were in a ThreadLocal we wouldn't have a problem since new 
Jobs run in new threads. Having them static is a problem because they survive 
inside TaskManager JVMs.
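The distinction raised above — a static cache outliving jobs versus a ThreadLocal cache rebuilt per thread — can be shown with plain JDK classes. This is only an analogy for the Avro cache, not Avro or Flink code:

```java
import java.util.HashMap;
import java.util.Map;

public class CacheScopeDemo {
    // A static cache lives as long as the JVM (e.g. a TaskManager) and is
    // visible to every later "job", even one running user classes from a
    // new ClassLoader.
    static final Map<String, String> STATIC_CACHE = new HashMap<>();

    // A ThreadLocal cache starts empty in each new thread, so a job running
    // in a fresh thread never sees entries left behind by a previous job.
    static final ThreadLocal<Map<String, String>> THREAD_CACHE =
            ThreadLocal.withInitial(HashMap::new);

    public static void main(String[] args) throws InterruptedException {
        STATIC_CACHE.put("k", "stale");
        THREAD_CACHE.get().put("k", "stale");

        Thread job2 = new Thread(() -> {
            // The static entry is still visible to the new "job" thread...
            System.out.println(STATIC_CACHE.containsKey("k"));        // true
            // ...but the ThreadLocal map starts empty in this thread.
            System.out.println(THREAD_CACHE.get().containsKey("k"));  // false
        });
        job2.start();
        job2.join();
    }
}
```

When the cached value is a `Class` object keyed by class name, the surviving static entry is exactly what produces a `ClassCastException` between two identically-named classes from different ClassLoaders.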

> Cannot run UserScore on Flink runner due to AvroCoder classload issues
> --
>
> Key: BEAM-1970
> URL: https://issues.apache.org/jira/browse/BEAM-1970
> Project: Beam
>  Issue Type: Bug
>  Components: runner-flink
>Reporter: Ahmet Altay
>Assignee: Aljoscha Krettek
> Fix For: First stable release
>
>
> Fails with error:
> ClassCastException: 
> org.apache.beam.examples.complete.game.UserScore$GameActionInfo cannot be 
> cast to org.apache.beam.examples.complete.game.UserScore$GameActionInfo
> full stack:
> 
>  The program finished with the following exception:
> org.apache.flink.client.program.ProgramInvocationException: The main method 
> caused an error.
> at 
> org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:545)
> at 
> org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:419)
> at 
> org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:339)
> at 
> org.apache.flink.client.CliFrontend.executeProgram(CliFrontend.java:831)
> at org.apache.flink.client.CliFrontend.run(CliFrontend.java:256)
> at 
> org.apache.flink.client.CliFrontend.parseParameters(CliFrontend.java:1073)
> at org.apache.flink.client.CliFrontend$2.call(CliFrontend.java:1120)
> at org.apache.flink.client.CliFrontend$2.call(CliFrontend.java:1117)
> at 
> org.apache.flink.runtime.security.HadoopSecurityContext$1.run(HadoopSecurityContext.java:43)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:422)
> at 
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
> at 
> org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:40)
> at org.apache.flink.client.CliFrontend.main(CliFrontend.java:1116)
> Caused by: java.lang.RuntimeException: Pipeline execution failed
> at org.apache.beam.runners.flink.FlinkRunner.run(FlinkRunner.java:119)
> at org.apache.beam.sdk.Pipeline.run(Pipeline.java:265)
> at 
> org.apache.beam.examples.complete.game.UserScore.main(UserScore.java:238)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at 
> org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:528)
> ... 13 more
> Caused by: org.apache.flink.client.program.ProgramInvocationException: The 
> program execution failed: Job execution failed.
> at 
> org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:427)
> at 
> org.apache.flink.yarn.YarnClusterClient.submitJob(YarnClusterClient.java:210)
> at 
> org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:400)
> at 
> org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:387)
> at 
> org.apache.flink.client.program.ContextEnvironment.execute(ContextEnvironment.java:62)
> at 
> org.apache.beam.runners.flink.FlinkPipelineExecutionEnvironment.executePipeline(FlinkPipelineExecutionEnvironment.java:111)
> at org.apache.beam.runners.flink.FlinkRunner.run(FlinkRunner.java:116)
> ... 20 more
> Caused by: org.apache.flink.runtime.client.JobExecutionException: Job 
> execution failed.
> at 
> org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$6.apply$mcV$sp(JobManager.scala:900)
> at 
> org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$6.apply(JobManager.scala:843)
> at 
> org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$6.apply(JobManager.scala:843)
> at 
> scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
> at 
> scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
> a

[jira] [Assigned] (BEAM-1970) Cannot run UserScore on Flink runner due to AvroCoder classload issues

2017-04-29 Thread Kenneth Knowles (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-1970?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kenneth Knowles reassigned BEAM-1970:
-

Assignee: Kenneth Knowles  (was: Aljoscha Krettek)

> Cannot run UserScore on Flink runner due to AvroCoder classload issues
> --
>
> Key: BEAM-1970
> URL: https://issues.apache.org/jira/browse/BEAM-1970
> Project: Beam
>  Issue Type: Bug
>  Components: runner-flink
>Reporter: Ahmet Altay
>Assignee: Kenneth Knowles
> Fix For: First stable release
>
>
> Fails with error:
> ClassCastException: 
> org.apache.beam.examples.complete.game.UserScore$GameActionInfo cannot be 
> cast to org.apache.beam.examples.complete.game.UserScore$GameActionInfo
> full stack:
> 
>  The program finished with the following exception:
> org.apache.flink.client.program.ProgramInvocationException: The main method 
> caused an error.
> at 
> org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:545)
> at 
> org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:419)
> at 
> org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:339)
> at 
> org.apache.flink.client.CliFrontend.executeProgram(CliFrontend.java:831)
> at org.apache.flink.client.CliFrontend.run(CliFrontend.java:256)
> at 
> org.apache.flink.client.CliFrontend.parseParameters(CliFrontend.java:1073)
> at org.apache.flink.client.CliFrontend$2.call(CliFrontend.java:1120)
> at org.apache.flink.client.CliFrontend$2.call(CliFrontend.java:1117)
> at 
> org.apache.flink.runtime.security.HadoopSecurityContext$1.run(HadoopSecurityContext.java:43)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:422)
> at 
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
> at 
> org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:40)
> at org.apache.flink.client.CliFrontend.main(CliFrontend.java:1116)
> Caused by: java.lang.RuntimeException: Pipeline execution failed
> at org.apache.beam.runners.flink.FlinkRunner.run(FlinkRunner.java:119)
> at org.apache.beam.sdk.Pipeline.run(Pipeline.java:265)
> at 
> org.apache.beam.examples.complete.game.UserScore.main(UserScore.java:238)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at 
> org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:528)
> ... 13 more
> Caused by: org.apache.flink.client.program.ProgramInvocationException: The 
> program execution failed: Job execution failed.
> at 
> org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:427)
> at 
> org.apache.flink.yarn.YarnClusterClient.submitJob(YarnClusterClient.java:210)
> at 
> org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:400)
> at 
> org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:387)
> at 
> org.apache.flink.client.program.ContextEnvironment.execute(ContextEnvironment.java:62)
> at 
> org.apache.beam.runners.flink.FlinkPipelineExecutionEnvironment.executePipeline(FlinkPipelineExecutionEnvironment.java:111)
> at org.apache.beam.runners.flink.FlinkRunner.run(FlinkRunner.java:116)
> ... 20 more
> Caused by: org.apache.flink.runtime.client.JobExecutionException: Job 
> execution failed.
> at 
> org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$6.apply$mcV$sp(JobManager.scala:900)
> at 
> org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$6.apply(JobManager.scala:843)
> at 
> org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$6.apply(JobManager.scala:843)
> at 
> scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
> at 
> scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
> at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:40)
> at 
> akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:397)
> at 
> scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
>

[GitHub] beam pull request #2783: [BEAM-1970] Use a new ReflectData for each AvroCode...

2017-04-29 Thread kennknowles
GitHub user kennknowles opened a pull request:

https://github.com/apache/beam/pull/2783

[BEAM-1970] Use a new ReflectData for each AvroCoder instance

Be sure to do all of the following to help us incorporate your contribution
quickly and easily:

 - [ ] Make sure the PR title is formatted like:
   `[BEAM-] Description of pull request`
 - [ ] Make sure tests pass via `mvn clean verify`. (Even better, enable
   Travis-CI on your fork and ensure the whole test matrix passes).
 - [ ] Replace `` in the title with the actual Jira issue
   number, if there is one.
 - [ ] If this contribution is large, please file an Apache
   [Individual Contributor License Agreement](https://www.apache.org/licenses/icla.pdf).

---

This addresses an issue where Avro might have cached a class
from a different ClassLoader.

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/kennknowles/beam AvroCoder

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/2783.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2783


commit 72c189def87262c506c5c5c65552afcf31be8f04
Author: Kenneth Knowles 
Date:   2017-04-29T21:06:37Z

Use a new ReflectData for each AvroCoder instance

This addresses an issue where Avro might have cached a class
from a different ClassLoader.
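The fix's shape — replacing a shared singleton model with one created per coder instance — can be sketched without depending on Avro. `Model` and `Coder` are stand-in names here; in the actual PR the model is presumably an Avro `ReflectData` constructed fresh for each `AvroCoder` rather than obtained from the process-wide singleton.

```java
import java.util.HashMap;
import java.util.Map;

class Model {
    // Each Model keeps its own class cache, so nothing leaks across instances.
    final Map<String, Class<?>> cache = new HashMap<>();
    static final Model SINGLETON = new Model();  // shared across all coders
}

class Coder {
    final Model model;

    Coder(boolean perInstance) {
        // Per-instance model: a fresh cache for every coder, mirroring
        // "a new ReflectData for each AvroCoder instance".
        this.model = perInstance ? new Model() : Model.SINGLETON;
    }
}

public class PerInstanceModelDemo {
    public static void main(String[] args) {
        Coder a = new Coder(false), b = new Coder(false);
        Coder c = new Coder(true),  d = new Coder(true);
        System.out.println(a.model == b.model); // true: singleton is shared
        System.out.println(c.model == d.model); // false: fresh per instance
    }
}
```

With a per-instance model, a coder deserialized into a new job can never observe classes that an earlier job's ClassLoader left in a shared cache.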






[jira] [Commented] (BEAM-1970) Cannot run UserScore on Flink runner due to AvroCoder classload issues

2017-04-29 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-1970?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15990041#comment-15990041
 ] 

ASF GitHub Bot commented on BEAM-1970:
--

GitHub user kennknowles opened a pull request:

https://github.com/apache/beam/pull/2783

[BEAM-1970] Use a new ReflectData for each AvroCoder instance

Be sure to do all of the following to help us incorporate your contribution
quickly and easily:

 - [ ] Make sure the PR title is formatted like:
   `[BEAM-] Description of pull request`
 - [ ] Make sure tests pass via `mvn clean verify`. (Even better, enable
   Travis-CI on your fork and ensure the whole test matrix passes).
 - [ ] Replace `` in the title with the actual Jira issue
   number, if there is one.
 - [ ] If this contribution is large, please file an Apache
   [Individual Contributor License Agreement](https://www.apache.org/licenses/icla.pdf).

---

This addresses an issue where Avro might have cached a class
from a different ClassLoader.

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/kennknowles/beam AvroCoder

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/2783.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2783


commit 72c189def87262c506c5c5c65552afcf31be8f04
Author: Kenneth Knowles 
Date:   2017-04-29T21:06:37Z

Use a new ReflectData for each AvroCoder instance

This addresses an issue where Avro might have cached a class
from a different ClassLoader.




> Cannot run UserScore on Flink runner due to AvroCoder classload issues
> --
>
> Key: BEAM-1970
> URL: https://issues.apache.org/jira/browse/BEAM-1970
> Project: Beam
>  Issue Type: Bug
>  Components: runner-flink
>Reporter: Ahmet Altay
>Assignee: Kenneth Knowles
> Fix For: First stable release
>
>
> Fails with error:
> ClassCastException: 
> org.apache.beam.examples.complete.game.UserScore$GameActionInfo cannot be 
> cast to org.apache.beam.examples.complete.game.UserScore$GameActionInfo
> full stack:
> 
>  The program finished with the following exception:
> org.apache.flink.client.program.ProgramInvocationException: The main method 
> caused an error.
> at 
> org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:545)
> at 
> org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:419)
> at 
> org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:339)
> at 
> org.apache.flink.client.CliFrontend.executeProgram(CliFrontend.java:831)
> at org.apache.flink.client.CliFrontend.run(CliFrontend.java:256)
> at 
> org.apache.flink.client.CliFrontend.parseParameters(CliFrontend.java:1073)
> at org.apache.flink.client.CliFrontend$2.call(CliFrontend.java:1120)
> at org.apache.flink.client.CliFrontend$2.call(CliFrontend.java:1117)
> at 
> org.apache.flink.runtime.security.HadoopSecurityContext$1.run(HadoopSecurityContext.java:43)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:422)
> at 
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
> at 
> org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:40)
> at org.apache.flink.client.CliFrontend.main(CliFrontend.java:1116)
> Caused by: java.lang.RuntimeException: Pipeline execution failed
> at org.apache.beam.runners.flink.FlinkRunner.run(FlinkRunner.java:119)
> at org.apache.beam.sdk.Pipeline.run(Pipeline.java:265)
> at 
> org.apache.beam.examples.complete.game.UserScore.main(UserScore.java:238)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at 
> org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:528)
> ... 13 more
> Caused by: org.apache.flink.client.program.ProgramInvocationException: The 
> program execution failed: Job execution failed.
> at 
> org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:427)
> at 
> org.apache.flink.y

[9/9] beam git commit: This closes #2750

2017-04-29 Thread jkff
This closes #2750


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/14d60b26
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/14d60b26
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/14d60b26

Branch: refs/heads/master
Commit: 14d60b26e7e9b8a578038599341e66ccd99d012b
Parents: f5e3f52 8853d53
Author: Eugene Kirpichov 
Authored: Sat Apr 29 15:17:48 2017 -0700
Committer: Eugene Kirpichov 
Committed: Sat Apr 29 15:17:48 2017 -0700

--
 .../beam/examples/complete/game/GameStats.java  |   6 +-
 .../examples/complete/game/LeaderBoard.java |   6 +-
 .../beam/runners/dataflow/DataflowRunner.java   |  18 +-
 .../org/apache/beam/sdk/util/PropertyNames.java |   4 +-
 .../beam/sdk/io/gcp/pubsub/PubsubClient.java|  42 +-
 .../sdk/io/gcp/pubsub/PubsubGrpcClient.java |  36 +-
 .../apache/beam/sdk/io/gcp/pubsub/PubsubIO.java | 508 +--
 .../sdk/io/gcp/pubsub/PubsubJsonClient.java |  36 +-
 .../sdk/io/gcp/pubsub/PubsubTestClient.java |   6 +-
 .../sdk/io/gcp/pubsub/PubsubUnboundedSink.java  |  58 ++-
 .../io/gcp/pubsub/PubsubUnboundedSource.java|  65 ++-
 .../sdk/io/gcp/pubsub/PubsubClientTest.java |  50 +-
 .../sdk/io/gcp/pubsub/PubsubGrpcClientTest.java |  16 +-
 .../beam/sdk/io/gcp/pubsub/PubsubIOTest.java|  78 +--
 .../sdk/io/gcp/pubsub/PubsubJsonClientTest.java |  14 +-
 .../io/gcp/pubsub/PubsubUnboundedSinkTest.java  |  10 +-
 .../gcp/pubsub/PubsubUnboundedSourceTest.java   |   6 +-
 17 files changed, 459 insertions(+), 500 deletions(-)
--




[1/9] beam git commit: Renames {id, timestamp}Label to {id, timestamp}Attribute throughout SDK

2017-04-29 Thread jkff
Repository: beam
Updated Branches:
  refs/heads/master f5e3f5230 -> 14d60b26e


Renames {id,timestamp}Label to {id,timestamp}Attribute throughout SDK


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/8853d53d
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/8853d53d
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/8853d53d

Branch: refs/heads/master
Commit: 8853d53d9ffdf6e68c80880f6dd5f2d11a6e451e
Parents: f065114
Author: Eugene Kirpichov 
Authored: Thu Apr 27 17:19:14 2017 -0700
Committer: Eugene Kirpichov 
Committed: Sat Apr 29 13:15:48 2017 -0700

--
 .../beam/examples/complete/game/GameStats.java  |  2 +-
 .../examples/complete/game/LeaderBoard.java |  2 +-
 .../beam/runners/dataflow/DataflowRunner.java   | 18 ++---
 .../org/apache/beam/sdk/util/PropertyNames.java |  4 +-
 .../beam/sdk/io/gcp/pubsub/PubsubClient.java| 42 ++-
 .../sdk/io/gcp/pubsub/PubsubGrpcClient.java | 36 +-
 .../apache/beam/sdk/io/gcp/pubsub/PubsubIO.java | 74 ++--
 .../sdk/io/gcp/pubsub/PubsubJsonClient.java | 36 +-
 .../sdk/io/gcp/pubsub/PubsubTestClient.java |  6 +-
 .../sdk/io/gcp/pubsub/PubsubUnboundedSink.java  | 58 ---
 .../io/gcp/pubsub/PubsubUnboundedSource.java| 61 +---
 .../sdk/io/gcp/pubsub/PubsubClientTest.java | 50 ++---
 .../sdk/io/gcp/pubsub/PubsubGrpcClientTest.java | 16 +++--
 .../beam/sdk/io/gcp/pubsub/PubsubIOTest.java| 24 +++
 .../sdk/io/gcp/pubsub/PubsubJsonClientTest.java | 14 ++--
 .../io/gcp/pubsub/PubsubUnboundedSinkTest.java  | 10 +--
 .../gcp/pubsub/PubsubUnboundedSourceTest.java   |  6 +-
 17 files changed, 238 insertions(+), 221 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/8853d53d/examples/java8/src/main/java/org/apache/beam/examples/complete/game/GameStats.java
--
diff --git 
a/examples/java8/src/main/java/org/apache/beam/examples/complete/game/GameStats.java
 
b/examples/java8/src/main/java/org/apache/beam/examples/complete/game/GameStats.java
index d628497..a46d3c5 100644
--- 
a/examples/java8/src/main/java/org/apache/beam/examples/complete/game/GameStats.java
+++ 
b/examples/java8/src/main/java/org/apache/beam/examples/complete/game/GameStats.java
@@ -252,7 +252,7 @@ public class GameStats extends LeaderBoard {
 // Read Events from Pub/Sub using custom timestamps
 PCollection<String> rawEvents = pipeline
 .apply(PubsubIO.readStrings()
-    .withTimestampLabel(TIMESTAMP_ATTRIBUTE).fromTopic(options.getTopic()))
+    .withTimestampAttribute(TIMESTAMP_ATTRIBUTE).fromTopic(options.getTopic()))
 .apply("ParseGameEvent", ParDo.of(new ParseEventFn()));
 
 // Extract username/score pairs from the event stream

http://git-wip-us.apache.org/repos/asf/beam/blob/8853d53d/examples/java8/src/main/java/org/apache/beam/examples/complete/game/LeaderBoard.java
--
diff --git 
a/examples/java8/src/main/java/org/apache/beam/examples/complete/game/LeaderBoard.java
 
b/examples/java8/src/main/java/org/apache/beam/examples/complete/game/LeaderBoard.java
index fbffac6..9af34c5 100644
--- 
a/examples/java8/src/main/java/org/apache/beam/examples/complete/game/LeaderBoard.java
+++ 
b/examples/java8/src/main/java/org/apache/beam/examples/complete/game/LeaderBoard.java
@@ -191,7 +191,7 @@ public class LeaderBoard extends HourlyTeamScore {
 // data elements, and parse the data.
 PCollection<String> gameEvents = pipeline
 .apply(PubsubIO.readStrings()
-    .withTimestampLabel(TIMESTAMP_ATTRIBUTE).fromTopic(options.getTopic()))
+    .withTimestampAttribute(TIMESTAMP_ATTRIBUTE).fromTopic(options.getTopic()))
 .apply("ParseGameEvent", ParDo.of(new ParseEventFn()));
 
 gameEvents.apply("CalculateTeamScores",

http://git-wip-us.apache.org/repos/asf/beam/blob/8853d53d/runners/google-cloud-dataflow-java/src/main/java/org/apache/beam/runners/dataflow/DataflowRunner.java
--
diff --git 
a/runners/google-cloud-dataflow-java/src/main/java/org/apache/beam/runners/dataflow/DataflowRunner.java
 
b/runners/google-cloud-dataflow-java/src/main/java/org/apache/beam/runners/dataflow/DataflowRunner.java
index 63c2191..a61fe49 100644
--- 
a/runners/google-cloud-dataflow-java/src/main/java/org/apache/beam/runners/dataflow/DataflowRunner.java
+++ 
b/runners/google-cloud-dataflow-java/src/main/java/org/apache/beam/runners/dataflow/DataflowRunner.java
@@ -922,12 +922,13 @@ public class DataflowRunner extends PipelineRunner {
  ((NestedValueProvider) overriddenTransform.getSubscriptionProvider()).propertyName());
}
  

[7/9] beam git commit: Renames PubsubIO.Read builder methods to be style guide compliant

2017-04-29 Thread jkff
Renames PubsubIO.Read builder methods to be style guide compliant


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/5d8fbc4c
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/5d8fbc4c
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/5d8fbc4c

Branch: refs/heads/master
Commit: 5d8fbc4c4d87f75ea84a40c2ee36531eb0eda26f
Parents: f4d0460
Author: Eugene Kirpichov 
Authored: Thu Apr 20 17:19:37 2017 -0700
Committer: Eugene Kirpichov 
Committed: Sat Apr 29 13:15:48 2017 -0700

--
 .../beam/examples/complete/game/GameStats.java  |  2 +-
 .../examples/complete/game/LeaderBoard.java |  2 +-
 .../apache/beam/sdk/io/gcp/pubsub/PubsubIO.java | 24 ++---
 .../beam/sdk/io/gcp/pubsub/PubsubIOTest.java| 38 ++--
 4 files changed, 33 insertions(+), 33 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/5d8fbc4c/examples/java8/src/main/java/org/apache/beam/examples/complete/game/GameStats.java
--
diff --git 
a/examples/java8/src/main/java/org/apache/beam/examples/complete/game/GameStats.java
 
b/examples/java8/src/main/java/org/apache/beam/examples/complete/game/GameStats.java
index e0048b7..d95eb06 100644
--- 
a/examples/java8/src/main/java/org/apache/beam/examples/complete/game/GameStats.java
+++ 
b/examples/java8/src/main/java/org/apache/beam/examples/complete/game/GameStats.java
@@ -253,7 +253,7 @@ public class GameStats extends LeaderBoard {
 // Read Events from Pub/Sub using custom timestamps
 PCollection<String> rawEvents = pipeline
 .apply(PubsubIO.read()
-.timestampLabel(TIMESTAMP_ATTRIBUTE).topic(options.getTopic())
+    .withTimestampLabel(TIMESTAMP_ATTRIBUTE).fromTopic(options.getTopic())
 .withCoder(StringUtf8Coder.of()))
 .apply("ParseGameEvent", ParDo.of(new ParseEventFn()));
 

http://git-wip-us.apache.org/repos/asf/beam/blob/5d8fbc4c/examples/java8/src/main/java/org/apache/beam/examples/complete/game/LeaderBoard.java
--
diff --git 
a/examples/java8/src/main/java/org/apache/beam/examples/complete/game/LeaderBoard.java
 
b/examples/java8/src/main/java/org/apache/beam/examples/complete/game/LeaderBoard.java
index 96f4291..a87468a 100644
--- 
a/examples/java8/src/main/java/org/apache/beam/examples/complete/game/LeaderBoard.java
+++ 
b/examples/java8/src/main/java/org/apache/beam/examples/complete/game/LeaderBoard.java
@@ -192,7 +192,7 @@ public class LeaderBoard extends HourlyTeamScore {
 // data elements, and parse the data.
 PCollection<String> gameEvents = pipeline
 .apply(PubsubIO.read()
-.timestampLabel(TIMESTAMP_ATTRIBUTE).topic(options.getTopic())
+    .withTimestampLabel(TIMESTAMP_ATTRIBUTE).fromTopic(options.getTopic())
 .withCoder(StringUtf8Coder.of()))
 .apply("ParseGameEvent", ParDo.of(new ParseEventFn()));
 

http://git-wip-us.apache.org/repos/asf/beam/blob/5d8fbc4c/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/pubsub/PubsubIO.java
--
diff --git 
a/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/pubsub/PubsubIO.java
 
b/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/pubsub/PubsubIO.java
index 3c76942..20aed6d 100644
--- 
a/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/pubsub/PubsubIO.java
+++ 
b/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/pubsub/PubsubIO.java
@@ -519,14 +519,14 @@ public class PubsubIO {
  * some arbitrary portion of the data.  Most likely, separate readers should
  * use their own subscriptions.
  */
-public Read subscription(String subscription) {
-  return subscription(StaticValueProvider.of(subscription));
+public Read fromSubscription(String subscription) {
+  return fromSubscription(StaticValueProvider.of(subscription));
 }
 
 /**
  * Like {@code subscription()} but with a {@link ValueProvider}.
  */
-public Read subscription(ValueProvider subscription) {
+public Read fromSubscription(ValueProvider subscription) {
   if (subscription.isAccessible()) {
 // Validate.
 PubsubSubscription.fromPath(subscription.get());
@@ -541,7 +541,7 @@ public class PubsubIO {
 
 /**
  * Creates and returns a transform for reading from a Cloud Pub/Sub topic. Mutually exclusive
- * with {@link #subscription(String)}.
+ * with {@link #fromSubscription(String)}.
  *
  * See {@link PubsubIO.PubsubTopic#fromPath(String)} for more details 
on the format
  * of the {@code topic} string.
@@ -550,14 +550,14 

[3/9] beam git commit: Remove override of topic by subscription and vice versa

2017-04-29 Thread jkff
Remove override of topic by subscription and vice versa


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/9e815485
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/9e815485
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/9e815485

Branch: refs/heads/master
Commit: 9e815485b979b99b190c4acf1098ab054492ae9e
Parents: 5d8fbc4
Author: Eugene Kirpichov 
Authored: Thu Apr 27 17:04:58 2017 -0700
Committer: Eugene Kirpichov 
Committed: Sat Apr 29 13:15:48 2017 -0700

--
 .../org/apache/beam/sdk/io/gcp/pubsub/PubsubIO.java|  4 
 .../apache/beam/sdk/io/gcp/pubsub/PubsubIOTest.java| 13 -
 2 files changed, 8 insertions(+), 9 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/9e815485/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/pubsub/PubsubIO.java
--
diff --git a/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/pubsub/PubsubIO.java b/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/pubsub/PubsubIO.java
index 20aed6d..69a5bd6 100644
--- a/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/pubsub/PubsubIO.java
+++ b/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/pubsub/PubsubIO.java
@@ -534,8 +534,6 @@ public class PubsubIO {
   return toBuilder()
   .setSubscriptionProvider(
   NestedValueProvider.of(subscription, new SubscriptionTranslator()))
-  /* reset topic to null */
-  .setTopicProvider(null)
   .build();
 }
 
@@ -564,8 +562,6 @@ public class PubsubIO {
   }
   return toBuilder()
   .setTopicProvider(NestedValueProvider.of(topic, new TopicTranslator()))
-  /* reset subscription to null */
-  .setSubscriptionProvider(null)
   .build();
 }
 

http://git-wip-us.apache.org/repos/asf/beam/blob/9e815485/sdks/java/io/google-cloud-platform/src/test/java/org/apache/beam/sdk/io/gcp/pubsub/PubsubIOTest.java
--
diff --git a/sdks/java/io/google-cloud-platform/src/test/java/org/apache/beam/sdk/io/gcp/pubsub/PubsubIOTest.java b/sdks/java/io/google-cloud-platform/src/test/java/org/apache/beam/sdk/io/gcp/pubsub/PubsubIOTest.java
index f44fffc..69d989f 100644
--- a/sdks/java/io/google-cloud-platform/src/test/java/org/apache/beam/sdk/io/gcp/pubsub/PubsubIOTest.java
+++ b/sdks/java/io/google-cloud-platform/src/test/java/org/apache/beam/sdk/io/gcp/pubsub/PubsubIOTest.java
@@ -146,16 +146,19 @@ public class PubsubIOTest {
   public void testPrimitiveReadDisplayData() {
 DisplayDataEvaluator evaluator = DisplayDataEvaluator.create();
 Set displayData;
-PubsubIO.Read<String> read = PubsubIO.<String>read().withCoder(StringUtf8Coder.of());
+PubsubIO.Read<String> baseRead = PubsubIO.<String>read().withCoder(StringUtf8Coder.of());
 
 // Reading from a subscription.
-read = read.fromSubscription("projects/project/subscriptions/subscription");
+PubsubIO.Read<String> read =
+    baseRead.fromSubscription("projects/project/subscriptions/subscription");
 displayData = evaluator.displayDataForPrimitiveSourceTransforms(read);
-assertThat("PubsubIO.Read should include the subscription in its primitive display data",
-displayData, hasItem(hasDisplayItem("subscription")));
+assertThat(
+    "PubsubIO.Read should include the subscription in its primitive display data",
+    displayData,
+    hasItem(hasDisplayItem("subscription")));
 
 // Reading from a topic.
-read = read.fromTopic("projects/project/topics/topic");
+read = baseRead.fromTopic("projects/project/topics/topic");
 displayData = evaluator.displayDataForPrimitiveSourceTransforms(read);
 assertThat("PubsubIO.Read should include the topic in its primitive display data",
 displayData, hasItem(hasDisplayItem("topic")));



[6/9] beam git commit: Adds PubsubIO.readStrings(), readProtos(), readAvros()

2017-04-29 Thread jkff
Adds PubsubIO.readStrings(), readProtos(), readAvros()


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/079353d5
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/079353d5
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/079353d5

Branch: refs/heads/master
Commit: 079353d58c65141683e4640e425ee610001e7718
Parents: 42c975e
Author: Eugene Kirpichov 
Authored: Thu Apr 20 17:50:43 2017 -0700
Committer: Eugene Kirpichov 
Committed: Sat Apr 29 13:15:48 2017 -0700

--
 .../beam/examples/complete/game/GameStats.java  |  6 ++---
 .../examples/complete/game/LeaderBoard.java |  6 ++---
 .../apache/beam/sdk/io/gcp/pubsub/PubsubIO.java | 28 
 .../io/gcp/pubsub/PubsubUnboundedSource.java|  4 ++-
 .../beam/sdk/io/gcp/pubsub/PubsubIOTest.java|  3 +--
 5 files changed, 36 insertions(+), 11 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/079353d5/examples/java8/src/main/java/org/apache/beam/examples/complete/game/GameStats.java
--
diff --git a/examples/java8/src/main/java/org/apache/beam/examples/complete/game/GameStats.java b/examples/java8/src/main/java/org/apache/beam/examples/complete/game/GameStats.java
index d95eb06..d628497 100644
--- a/examples/java8/src/main/java/org/apache/beam/examples/complete/game/GameStats.java
+++ b/examples/java8/src/main/java/org/apache/beam/examples/complete/game/GameStats.java
@@ -24,7 +24,6 @@ import org.apache.beam.examples.common.ExampleUtils;
 import org.apache.beam.examples.complete.game.utils.WriteWindowedToBigQuery;
 import org.apache.beam.sdk.Pipeline;
 import org.apache.beam.sdk.PipelineResult;
-import org.apache.beam.sdk.coders.StringUtf8Coder;
 import org.apache.beam.sdk.io.gcp.pubsub.PubsubIO;
 import org.apache.beam.sdk.metrics.Counter;
 import org.apache.beam.sdk.metrics.Metrics;
@@ -252,9 +251,8 @@ public class GameStats extends LeaderBoard {
 
 // Read Events from Pub/Sub using custom timestamps
 PCollection<String> rawEvents = pipeline
-.apply(PubsubIO.<String>read()
-.withTimestampLabel(TIMESTAMP_ATTRIBUTE).fromTopic(options.getTopic())
-.withCoder(StringUtf8Coder.of()))
+.apply(PubsubIO.readStrings()
+.withTimestampLabel(TIMESTAMP_ATTRIBUTE).fromTopic(options.getTopic()))
 .apply("ParseGameEvent", ParDo.of(new ParseEventFn()));
 
 // Extract username/score pairs from the event stream

http://git-wip-us.apache.org/repos/asf/beam/blob/079353d5/examples/java8/src/main/java/org/apache/beam/examples/complete/game/LeaderBoard.java
--
diff --git a/examples/java8/src/main/java/org/apache/beam/examples/complete/game/LeaderBoard.java b/examples/java8/src/main/java/org/apache/beam/examples/complete/game/LeaderBoard.java
index a87468a..fbffac6 100644
--- a/examples/java8/src/main/java/org/apache/beam/examples/complete/game/LeaderBoard.java
+++ b/examples/java8/src/main/java/org/apache/beam/examples/complete/game/LeaderBoard.java
@@ -27,7 +27,6 @@ import 
org.apache.beam.examples.complete.game.utils.WriteToBigQuery;
 import org.apache.beam.examples.complete.game.utils.WriteWindowedToBigQuery;
 import org.apache.beam.sdk.Pipeline;
 import org.apache.beam.sdk.PipelineResult;
-import org.apache.beam.sdk.coders.StringUtf8Coder;
 import org.apache.beam.sdk.io.gcp.pubsub.PubsubIO;
 import org.apache.beam.sdk.options.Default;
 import org.apache.beam.sdk.options.Description;
@@ -191,9 +190,8 @@ public class LeaderBoard extends HourlyTeamScore {
 // Read game events from Pub/Sub using custom timestamps, which are 
extracted from the pubsub
 // data elements, and parse the data.
 PCollection<String> gameEvents = pipeline
-.apply(PubsubIO.<String>read()
-.withTimestampLabel(TIMESTAMP_ATTRIBUTE).fromTopic(options.getTopic())
-.withCoder(StringUtf8Coder.of()))
+.apply(PubsubIO.readStrings()
+.withTimestampLabel(TIMESTAMP_ATTRIBUTE).fromTopic(options.getTopic()))
 .apply("ParseGameEvent", ParDo.of(new ParseEventFn()));
 
 gameEvents.apply("CalculateTeamScores",

http://git-wip-us.apache.org/repos/asf/beam/blob/079353d5/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/pubsub/PubsubIO.java
--
diff --git 
a/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/pubsub/PubsubIO.java
 
b/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/pubsub/PubsubIO.java
index 99df3c6..9604864 100644
--- 
a/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/pubsub/PubsubIO.java
+++ 
b/sdks/java/io/google-cloud-platform/src/main/

[5/9] beam git commit: Converts PubsubIO.Write to AutoValue

2017-04-29 Thread jkff
Converts PubsubIO.Write to AutoValue


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/df6ef969
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/df6ef969
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/df6ef969

Branch: refs/heads/master
Commit: df6ef969d6df5c42d091cc00997b0ed7680315fb
Parents: 9e81548
Author: Eugene Kirpichov 
Authored: Thu Apr 20 17:34:11 2017 -0700
Committer: Eugene Kirpichov 
Committed: Sat Apr 29 13:15:48 2017 -0700

--
 .../apache/beam/sdk/io/gcp/pubsub/PubsubIO.java | 166 +++
 1 file changed, 61 insertions(+), 105 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/df6ef969/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/pubsub/PubsubIO.java
--
diff --git a/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/pubsub/PubsubIO.java b/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/pubsub/PubsubIO.java
index 69a5bd6..5702af1 100644
--- a/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/pubsub/PubsubIO.java
+++ b/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/pubsub/PubsubIO.java
@@ -461,8 +461,9 @@ public class PubsubIO {
 return new AutoValue_PubsubIO_Read.Builder().build();
   }
 
+  /** Returns A {@link PTransform} that writes to a Google Cloud Pub/Sub stream. */
   public static <T> Write<T> write() {
-return new Write<>();
+return new AutoValue_PubsubIO_Write.Builder().build();
   }
 
   /** Implementation of {@link #read}. */
@@ -696,43 +697,47 @@ public class PubsubIO {
   private PubsubIO() {}
 
 
-  /**
-   * A {@link PTransform} that writes an unbounded {@link PCollection} of 
{@link String Strings}
-   * to a Cloud Pub/Sub stream.
-   */
-  public static class Write<T> extends PTransform<PCollection<T>, PDone> {
-
-/** The Cloud Pub/Sub topic to publish to. */
+  /** Implementation of {@link #write}. */
+  @AutoValue
+  public abstract static class Write<T> extends PTransform<PCollection<T>, PDone> {
 @Nullable
-private final ValueProvider<PubsubTopic> topic;
+abstract ValueProvider<PubsubTopic> getTopicProvider();
+
 /** The name of the message attribute to publish message timestamps in. */
 @Nullable
-private final String timestampLabel;
+abstract String getTimestampLabel();
+
 /** The name of the message attribute to publish unique message IDs in. */
 @Nullable
-private final String idLabel;
+abstract String getIdLabel();
+
 /** The input type Coder. */
-private final Coder<T> coder;
+@Nullable
+abstract Coder<T> getCoder();
+
 /** The format function for input PubsubMessage objects. */
-SimpleFunction<T, PubsubMessage> formatFn;
+@Nullable
+abstract SimpleFunction<T, PubsubMessage> getFormatFn();
 
-private Write() {
-  this(null, null, null, null, null, null);
-}
+abstract Builder toBuilder();
 
-private Write(
-String name, ValueProvider<PubsubTopic> topic, String timestampLabel,
-String idLabel, Coder<T> coder, SimpleFunction<T, PubsubMessage> formatFn) {
-  super(name);
-  this.topic = topic;
-  this.timestampLabel = timestampLabel;
-  this.idLabel = idLabel;
-  this.coder = coder;
-  this.formatFn = formatFn;
+@AutoValue.Builder
+abstract static class Builder<T> {
+  abstract Builder<T> setTopicProvider(ValueProvider<PubsubTopic> topicProvider);
+
+  abstract Builder<T> setTimestampLabel(String timestampLabel);
+
+  abstract Builder<T> setIdLabel(String idLabel);
+
+  abstract Builder<T> setCoder(Coder<T> coder);
+
+  abstract Builder<T> setFormatFn(SimpleFunction<T, PubsubMessage> formatFn);
+
+  abstract Write<T> build();
 }
 
 /**
- * Creates a transform that publishes to the specified topic.
+ * Publishes to the specified topic.
  *
  * See {@link PubsubIO.PubsubTopic#fromPath(String)} for more details 
on the format of the
  * {@code topic} string.
@@ -745,14 +750,15 @@ public class PubsubIO {
  * Like {@code topic()} but with a {@link ValueProvider}.
  */
 public Write<T> topic(ValueProvider<String> topic) {
-  return new Write<>(name, NestedValueProvider.of(topic, new TopicTranslator()),
-  timestampLabel, idLabel, coder, formatFn);
+  return toBuilder()
+  .setTopicProvider(NestedValueProvider.of(topic, new TopicTranslator()))
+  .build();
 }
 
 /**
- * Creates a transform that writes to Pub/Sub, adds each record's 
timestamp to the published
- * messages in an attribute with the specified name. The value of the 
attribute will be a number
- * representing the number of milliseconds since the Unix epoch. For 
example, if using the Joda
+ * Writes to Pub/Sub and adds each record's timestamp to the published 
messages in an attribute
+ 

[8/9] beam git commit: Renames PubsubIO.Write builder methods to be style guide compliant

2017-04-29 Thread jkff
Renames PubsubIO.Write builder methods to be style guide compliant


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/42c975ee
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/42c975ee
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/42c975ee

Branch: refs/heads/master
Commit: 42c975ee533a63be750da2e8de1925b42efd2cad
Parents: df6ef96
Author: Eugene Kirpichov 
Authored: Thu Apr 20 17:41:48 2017 -0700
Committer: Eugene Kirpichov 
Committed: Sat Apr 29 13:15:48 2017 -0700

--
 .../org/apache/beam/sdk/io/gcp/pubsub/PubsubIO.java | 12 ++--
 .../org/apache/beam/sdk/io/gcp/pubsub/PubsubIOTest.java | 10 +-
 2 files changed, 11 insertions(+), 11 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/42c975ee/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/pubsub/PubsubIO.java
--
diff --git a/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/pubsub/PubsubIO.java b/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/pubsub/PubsubIO.java
index 5702af1..99df3c6 100644
--- a/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/pubsub/PubsubIO.java
+++ b/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/pubsub/PubsubIO.java
@@ -742,14 +742,14 @@ public class PubsubIO {
  * See {@link PubsubIO.PubsubTopic#fromPath(String)} for more details 
on the format of the
  * {@code topic} string.
  */
-public Write<T> topic(String topic) {
-  return topic(StaticValueProvider.of(topic));
+public Write<T> to(String topic) {
+  return to(StaticValueProvider.of(topic));
 }
 
 /**
  * Like {@code topic()} but with a {@link ValueProvider}.
  */
-public Write<T> topic(ValueProvider<String> topic) {
+public Write<T> to(ValueProvider<String> topic) {
   return toBuilder()
   .setTopicProvider(NestedValueProvider.of(topic, new TopicTranslator()))
   .build();
@@ -765,7 +765,7 @@ public class PubsubIO {
  * {@link PubsubIO.Read#withTimestampLabel(String)} can be used to ensure 
the other source reads
  * these timestamps from the appropriate attribute.
  */
-public Write<T> timestampLabel(String timestampLabel) {
+public Write<T> withTimestampLabel(String timestampLabel) {
   return toBuilder().setTimestampLabel(timestampLabel).build();
 }
 
@@ -777,7 +777,7 @@ public class PubsubIO {
  * {@link PubsubIO.Read#withIdLabel(String)} can be used to ensure that* 
the other source reads
  * these unique identifiers from the appropriate attribute.
  */
-public Write<T> idLabel(String idLabel) {
+public Write<T> withIdLabel(String idLabel) {
   return toBuilder().setIdLabel(idLabel).build();
 }
 
@@ -794,7 +794,7 @@ public class PubsubIO {
  * function translates the input type T to a PubsubMessage object, which 
is used by the sink
  * to separately set the PubSub message's payload and attributes.
  */
-public Write<T> withAttributes(SimpleFunction<T, PubsubMessage> formatFn) {
+public Write<T> withFormatFn(SimpleFunction<T, PubsubMessage> formatFn) {
   return toBuilder().setFormatFn(formatFn).build();
 }
 

http://git-wip-us.apache.org/repos/asf/beam/blob/42c975ee/sdks/java/io/google-cloud-platform/src/test/java/org/apache/beam/sdk/io/gcp/pubsub/PubsubIOTest.java
--
diff --git a/sdks/java/io/google-cloud-platform/src/test/java/org/apache/beam/sdk/io/gcp/pubsub/PubsubIOTest.java b/sdks/java/io/google-cloud-platform/src/test/java/org/apache/beam/sdk/io/gcp/pubsub/PubsubIOTest.java
index 69d989f..f896bfc 100644
--- a/sdks/java/io/google-cloud-platform/src/test/java/org/apache/beam/sdk/io/gcp/pubsub/PubsubIOTest.java
+++ b/sdks/java/io/google-cloud-platform/src/test/java/org/apache/beam/sdk/io/gcp/pubsub/PubsubIOTest.java
@@ -52,7 +52,7 @@ public class PubsubIOTest {
 assertEquals("PubsubIO.Read",
 PubsubIO.read().fromTopic("projects/myproject/topics/mytopic").getName());
 assertEquals("PubsubIO.Write",
-PubsubIO.write().topic("projects/myproject/topics/mytopic").getName());
+PubsubIO.write().to("projects/myproject/topics/mytopic").getName());
   }
 
   @Test
@@ -168,9 +168,9 @@ public class PubsubIOTest {
   public void testWriteDisplayData() {
 String topic = "projects/project/topics/topic";
 PubsubIO.Write write = PubsubIO.write()
-.topic(topic)
-.timestampLabel("myTimestamp")
-.idLabel("myId");
+.to(topic)
+.withTimestampLabel("myTimestamp")
+.withIdLabel("myId");
 
 DisplayData displayData = DisplayData.from(write);
 
@@ -183,7 +183,7 @@ public class PubsubIOTest {
   

[2/9] beam git commit: Adds PubsubIO.writeStrings(), writeProtos(), writeAvros()

2017-04-29 Thread jkff
Adds PubsubIO.writeStrings(), writeProtos(), writeAvros()


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/f0651145
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/f0651145
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/f0651145

Branch: refs/heads/master
Commit: f0651145fea31854ab83fc064a3c7866251cc0a4
Parents: 079353d
Author: Eugene Kirpichov 
Authored: Thu Apr 20 17:54:03 2017 -0700
Committer: Eugene Kirpichov 
Committed: Sat Apr 29 13:15:48 2017 -0700

--
 .../apache/beam/sdk/io/gcp/pubsub/PubsubIO.java | 28 ++--
 1 file changed, 26 insertions(+), 2 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/f0651145/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/pubsub/PubsubIO.java
--
diff --git a/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/pubsub/PubsubIO.java b/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/pubsub/PubsubIO.java
index 9604864..3a7522e 100644
--- a/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/pubsub/PubsubIO.java
+++ b/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/pubsub/PubsubIO.java
@@ -474,8 +474,8 @@ public class PubsubIO {
   }
 
   /**
-   * Returns A {@link PTransform} that continuously reads binary encoded 
protos of the given type
-   * from a Google Cloud Pub/Sub stream.
+   * Returns A {@link PTransform} that continuously reads binary encoded 
protobuf messages of the
+   * given type from a Google Cloud Pub/Sub stream.
*/
   public static <T extends Message> Read<T> readProtos(Class<T> messageClass) {
 return PubsubIO.read().withCoder(ProtoCoder.of(messageClass));
@@ -494,6 +494,30 @@ public class PubsubIO {
 return new AutoValue_PubsubIO_Write.Builder().build();
   }
 
+  /**
+   * Returns A {@link PTransform} that writes UTF-8 encoded strings to a 
Google Cloud Pub/Sub
+   * stream.
+   */
+  public static Write<String> writeStrings() {
+return PubsubIO.write().withCoder(StringUtf8Coder.of());
+  }
+
+  /**
+   * Returns A {@link PTransform} that writes binary encoded protobuf messages 
of a given type
+   * to a Google Cloud Pub/Sub stream.
+   */
+  public static <T extends Message> Write<T> writeProtos(Class<T> messageClass) {
+return PubsubIO.write().withCoder(ProtoCoder.of(messageClass));
+  }
+
+  /**
+   * Returns A {@link PTransform} that writes binary encoded Avro messages of 
a given type
+   * to a Google Cloud Pub/Sub stream.
+   */
+  public static <T> Write<T> writeAvros(Class<T> clazz) {
+return PubsubIO.write().withCoder(AvroCoder.of(clazz));
+  }
+
   /** Implementation of {@link #read}. */
   @AutoValue
   public abstract static class Read<T> extends PTransform<PBegin, PCollection<T>> {
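The readStrings()/writeStrings() factories added in these commits follow a common API pattern: one generic entry point that takes an explicit decoder, plus typed shortcuts that pre-select it so call sites stay short. A self-contained sketch of that pattern in plain Java (illustrative names only, not the Beam API):

```java
import java.nio.charset.StandardCharsets;
import java.util.function.Function;

// Sketch of the "generic factory + typed convenience factory" shape that
// PubsubIO.readStrings()/readProtos()/readAvros() follow. SketchIO is a
// made-up name; only the structure mirrors the commits above.
final class SketchIO<T> {
  private final Function<byte[], T> decoder;

  private SketchIO(Function<byte[], T> decoder) {
    this.decoder = decoder;
  }

  // Generic entry point: the caller supplies the decoder explicitly,
  // analogous to PubsubIO.read().withCoder(...).
  static <T> SketchIO<T> read(Function<byte[], T> decoder) {
    return new SketchIO<>(decoder);
  }

  // Convenience factory, analogous to PubsubIO.readStrings(): fixes the
  // element type and the decoder in one call.
  static SketchIO<String> readStrings() {
    return read(bytes -> new String(bytes, StandardCharsets.UTF_8));
  }

  T decode(byte[] payload) {
    return decoder.apply(payload);
  }
}
```

The convenience factory removes the most common source of boilerplate (and of type-inference noise) without closing off the generic entry point for other element types.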



[4/9] beam git commit: Converts PubsubIO.Read to AutoValue

2017-04-29 Thread jkff
Converts PubsubIO.Read to AutoValue


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/f4d04606
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/f4d04606
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/f4d04606

Branch: refs/heads/master
Commit: f4d04606c105ca45a7754516781cb72b4c818baf
Parents: f5e3f52
Author: Eugene Kirpichov 
Authored: Thu Apr 20 17:14:08 2017 -0700
Committer: Eugene Kirpichov 
Committed: Sat Apr 29 13:15:48 2017 -0700

--
 .../apache/beam/sdk/io/gcp/pubsub/PubsubIO.java | 244 +++
 .../beam/sdk/io/gcp/pubsub/PubsubIOTest.java|   8 +-
 2 files changed, 95 insertions(+), 157 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/f4d04606/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/pubsub/PubsubIO.java
--
diff --git a/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/pubsub/PubsubIO.java b/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/pubsub/PubsubIO.java
index f0926d4..3c76942 100644
--- a/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/pubsub/PubsubIO.java
+++ b/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/pubsub/PubsubIO.java
@@ -20,6 +20,7 @@ package org.apache.beam.sdk.io.gcp.pubsub;
 import static com.google.common.base.Preconditions.checkNotNull;
 import static com.google.common.base.Preconditions.checkState;
 
+import com.google.auto.value.AutoValue;
 import java.io.IOException;
 import java.io.Serializable;
 import java.util.ArrayList;
@@ -455,64 +456,61 @@ public class PubsubIO {
 }
   }
 
+   /** Returns A {@link PTransform} that continuously reads from a Google Cloud Pub/Sub stream. */
   public static <T> Read<T> read() {
-return new Read<>();
+return new AutoValue_PubsubIO_Read.Builder().build();
   }
 
   public static <T> Write<T> write() {
 return new Write<>();
   }
 
-  /**
-   * A {@link PTransform} that continuously reads from a Google Cloud Pub/Sub 
stream and
-   * returns a {@link PCollection} of {@link String Strings} containing the 
items from
-   * the stream.
-   */
-  public static class Read<T> extends PTransform<PBegin, PCollection<T>> {
-
-/** The Cloud Pub/Sub topic to read from. */
+  /** Implementation of {@link #read}. */
+  @AutoValue
+  public abstract static class Read<T> extends PTransform<PBegin, PCollection<T>> {
 @Nullable
-private final ValueProvider<PubsubTopic> topic;
+abstract ValueProvider<PubsubTopic> getTopicProvider();
 
-/** The Cloud Pub/Sub subscription to read from. */
 @Nullable
-private final ValueProvider<PubsubSubscription> subscription;
+abstract ValueProvider<PubsubSubscription> getSubscriptionProvider();
 
 /** The name of the message attribute to read timestamps from. */
 @Nullable
-private final String timestampLabel;
+abstract String getTimestampLabel();
 
 /** The name of the message attribute to read unique message IDs from. */
 @Nullable
-private final String idLabel;
+abstract String getIdLabel();
 
 /** The coder used to decode each record. */
 @Nullable
-private final Coder<T> coder;
+abstract Coder<T> getCoder();
 
 /** User function for parsing PubsubMessage object. */
-SimpleFunction<PubsubMessage, T> parseFn;
+@Nullable
+abstract SimpleFunction<PubsubMessage, T> getParseFn();
 
-private Read() {
-  this(null, null, null, null, null, null, null);
-}
+abstract Builder toBuilder();
 
-private Read(String name, ValueProvider<PubsubSubscription> subscription,
-ValueProvider<PubsubTopic> topic, String timestampLabel, Coder<T> coder,
-String idLabel,
-SimpleFunction<PubsubMessage, T> parseFn) {
-  super(name);
-  this.subscription = subscription;
-  this.topic = topic;
-  this.timestampLabel = timestampLabel;
-  this.coder = coder;
-  this.idLabel = idLabel;
-  this.parseFn = parseFn;
+@AutoValue.Builder
+abstract static class Builder<T> {
+  abstract Builder<T> setTopicProvider(ValueProvider<PubsubTopic> topic);
+
+  abstract Builder<T> setSubscriptionProvider(ValueProvider<PubsubSubscription> subscription);
+
+  abstract Builder<T> setTimestampLabel(String timestampLabel);
+
+  abstract Builder<T> setIdLabel(String idLabel);
+
+  abstract Builder<T> setCoder(Coder<T> coder);
+
+  abstract Builder<T> setParseFn(SimpleFunction<PubsubMessage, T> parseFn);
+
+  abstract Read<T> build();
 }
 
 /**
- * Returns a transform that's like this one but reading from the
- * given subscription.
+ * Reads from the given subscription.
  *
  * See {@link PubsubIO.PubsubSubscription#fromPath(String)} for more 
details on the format
  * of the {@code subscription} string.
@@ -520,8 +518,6 @@ public class PubsubIO {
  * Multiple readers reading from the same subscription will each receive
  * some arbitrary portion of the

[GitHub] beam pull request #2750: [BEAM-1415] PubsubIO style guide fixes, part 1

2017-04-29 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/beam/pull/2750


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[jira] [Commented] (BEAM-1415) PubsubIO should comply with PTransform style guide

2017-04-29 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-1415?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15990066#comment-15990066
 ] 

ASF GitHub Bot commented on BEAM-1415:
--

Github user asfgit closed the pull request at:

https://github.com/apache/beam/pull/2750


> PubsubIO should comply with PTransform style guide
> --
>
> Key: BEAM-1415
> URL: https://issues.apache.org/jira/browse/BEAM-1415
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-gcp
>Reporter: Eugene Kirpichov
>Assignee: Eugene Kirpichov
>  Labels: backward-incompatible, easy, starter
> Fix For: First stable release
>
>
> Suggested changes:
> - Rename builder methods such as .subscription(), .topic() etc. to
> .withSubscription(), .withTopic()
> - Replace use of Coder from the API (.withCoder()) with a SerializableFunction
> - Rename .withAttributes() to something else, because it sounds like this is 
> a function that sets attributes.
> - (optional) use AutoValue
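The AutoValue suggestion in the last bullet can be illustrated with a hand-written equivalent of the shape AutoValue generates: an immutable value class whose style-guide-compliant `withX()` methods return modified copies. This is a sketch with made-up names, not the real PubsubIO.Read:

```java
// Hand-written sketch of the immutable value + copy-on-write builder shape
// that AutoValue generates for a transform like PubsubIO.Read. Field and
// method names follow the style-guide renames (withTopic, ...) but are
// illustrative only.
final class ReadConfig {
  private final String topic;
  private final String timestampAttribute;

  private ReadConfig(String topic, String timestampAttribute) {
    this.topic = topic;
    this.timestampAttribute = timestampAttribute;
  }

  static ReadConfig create() {
    return new ReadConfig(null, null);
  }

  // Each "withX" method returns a modified copy, so configured transforms
  // can be shared safely and chained fluently.
  ReadConfig withTopic(String topic) {
    return new ReadConfig(topic, this.timestampAttribute);
  }

  ReadConfig withTimestampAttribute(String attr) {
    return new ReadConfig(this.topic, attr);
  }

  String topic() { return topic; }
  String timestampAttribute() { return timestampAttribute; }
}
```

AutoValue removes the boilerplate above (constructor, accessors, and, with @AutoValue.Builder, the copy logic) while keeping the class immutable.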



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


Jenkins build became unstable: beam_PostCommit_Java_ValidatesRunner_Spark #1840

2017-04-29 Thread Apache Jenkins Server
See 




Jenkins build is still unstable: beam_PostCommit_Java_MavenInstall #3534

2017-04-29 Thread Apache Jenkins Server
See 




[jira] [Comment Edited] (BEAM-1773) Consider allowing Source#validate() to throw exception

2017-04-29 Thread Ted Yu (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-1773?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15957130#comment-15957130
 ] 

Ted Yu edited comment on BEAM-1773 at 4/29/17 11:22 PM:


[~kenn] [~j...@nanthrax.net] :
Please kindly provide your feedback.


was (Author: yuzhih...@gmail.com):
[~kenn] [~j...@nanthrax.net]:
Please kindly provide your feedback.

> Consider allowing Source#validate() to throw exception
> --
>
> Key: BEAM-1773
> URL: https://issues.apache.org/jira/browse/BEAM-1773
> Project: Beam
>  Issue Type: Improvement
>  Components: sdk-java-core
>Reporter: Ted Yu
>Assignee: Ted Yu
> Attachments: beam-1773.v1.patch, beam-1773.v2.patch
>
>
> In HDFSFileSource.java :
> {code}
>   @Override
>   public void validate() {
> ...
>   } catch (IOException | InterruptedException e) {
> throw new RuntimeException(e);
>   }
> {code}
> Source#validate() should be allowed to throw an exception so that we don't
> resort to using RuntimeException.
> Here was related thread on mailing list:
> http://search-hadoop.com/m/Beam/gfKHFOwE0uETxae?subj=Re+why+Source+validate+is+not+declared+to+throw+any+exception
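The proposed change can be sketched as follows (a hypothetical Source interface, not the actual Beam API): declaring the checked exception on validate() lets implementations propagate IOException directly instead of wrapping it in RuntimeException.

```java
import java.io.IOException;

// Hypothetical interface illustrating BEAM-1773's proposal: validate()
// declares the checked exception, so callers see the real failure type.
interface SketchSource {
  void validate() throws IOException;
}

class FileSketchSource implements SketchSource {
  private final String path;

  FileSketchSource(String path) {
    this.path = path;
  }

  @Override
  public void validate() throws IOException {
    // The IOException propagates as-is; no RuntimeException wrapping needed.
    if (path == null || path.isEmpty()) {
      throw new IOException("Missing file path");
    }
  }
}
```

Callers that cannot recover can still wrap at the call site, but the signature no longer forces every implementation to hide the checked exception.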





[jira] [Created] (BEAM-2123) Passing potential null pointer to encode() in StructuredCoder#structuralValue

2017-04-29 Thread Ted Yu (JIRA)
Ted Yu created BEAM-2123:


 Summary: Passing potential null pointer to encode() in 
StructuredCoder#structuralValue
 Key: BEAM-2123
 URL: https://issues.apache.org/jira/browse/BEAM-2123
 Project: Beam
  Issue Type: Bug
  Components: sdk-java-core
Reporter: Ted Yu
Assignee: Davor Bonaci
Priority: Minor


{code}
  public Object structuralValue(T value) {
if (value != null && consistentWithEquals()) {
  return value;
} else {
  try {
ByteArrayOutputStream os = new ByteArrayOutputStream();
encode(value, os, Context.OUTER);
{code}
If value is null, encode() would throw CoderException (I checked ByteArrayCoder 
and KvCoder) which would be caught and converted to IllegalArgumentException.
Looks like structuralValue() can check the null value directly and throw
CoderException. This would result in a clearer exception.
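A minimal sketch of the proposed fix, using a hypothetical stand-in for StructuredCoder (Beam would throw CoderException; this dependency-free sketch uses IllegalArgumentException): reject null up front with a clear message instead of letting encode() fail deep inside and wrapping that failure.

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

// Hypothetical stand-in for Beam's StructuredCoder; names are illustrative,
// not the real Beam API.
abstract class SketchCoder<T> {
  abstract void encode(T value, ByteArrayOutputStream os) throws IOException;

  boolean consistentWithEquals() {
    return false;
  }

  Object structuralValue(T value) {
    if (value == null) {
      // Fail fast with a clear message, as the JIRA suggests, instead of
      // letting encode() throw and converting that to IllegalArgumentException.
      throw new IllegalArgumentException("cannot compute structural value of null");
    }
    if (consistentWithEquals()) {
      return value;
    }
    try {
      ByteArrayOutputStream os = new ByteArrayOutputStream();
      encode(value, os);
      // Return the encoded bytes as a Base64 string so equals() compares content.
      return Base64.getEncoder().encodeToString(os.toByteArray());
    } catch (IOException e) {
      throw new IllegalArgumentException("Unable to encode element " + value, e);
    }
  }
}

// Minimal concrete coder used to exercise the sketch.
class StringSketchCoder extends SketchCoder<String> {
  @Override
  void encode(String value, ByteArrayOutputStream os) throws IOException {
    os.write(value.getBytes(StandardCharsets.UTF_8));
  }
}
```

With the early null check, the caller sees exactly which argument was invalid rather than a wrapped low-level encoding failure.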





Build failed in Jenkins: beam_PerformanceTests_Dataflow #353

2017-04-29 Thread Apache Jenkins Server
See 


Changes:

[klk] Move stable name validation to Pipeline.run()

[klk] Remove PipelineOptions from createWriteOperation()

[klk] Move PTransform.validate to post-construction, modulo BigQueryIO

[klk] Supply PipelineOptions at Pipeline.run()

[kirpichov] Converts PubsubIO.Read to AutoValue

[kirpichov] Renames PubsubIO.Read builder methods to be style guide compliant

[kirpichov] Remove override of topic by subscription and vice versa

[kirpichov] Converts PubsubIO.Write to AutoValue

[kirpichov] Renames PubsubIO.Write builder methods to be style guide compliant

[kirpichov] Adds PubsubIO.readStrings(), readProtos(), readAvros()

[kirpichov] Adds PubsubIO.writeStrings(), writeProtos(), writeAvros()

[kirpichov] Renames {id,timestamp}Label to {id,timestamp}Attribute throughout 
SDK

--
[...truncated 277.08 KB...]
 * [new ref] refs/pull/2710/head -> origin/pr/2710/head
 * [new ref] refs/pull/2710/merge -> origin/pr/2710/merge
 * [new ref] refs/pull/2711/head -> origin/pr/2711/head
 * [new ref] refs/pull/2711/merge -> origin/pr/2711/merge
 * [new ref] refs/pull/2712/head -> origin/pr/2712/head
 * [new ref] refs/pull/2712/merge -> origin/pr/2712/merge
 * [new ref] refs/pull/2713/head -> origin/pr/2713/head
 * [new ref] refs/pull/2713/merge -> origin/pr/2713/merge
 * [new ref] refs/pull/2714/head -> origin/pr/2714/head
 * [new ref] refs/pull/2714/merge -> origin/pr/2714/merge
 * [new ref] refs/pull/2715/head -> origin/pr/2715/head
 * [new ref] refs/pull/2715/merge -> origin/pr/2715/merge
 * [new ref] refs/pull/2716/head -> origin/pr/2716/head
 * [new ref] refs/pull/2716/merge -> origin/pr/2716/merge
 * [new ref] refs/pull/2717/head -> origin/pr/2717/head
 * [new ref] refs/pull/2717/merge -> origin/pr/2717/merge
 * [new ref] refs/pull/2718/head -> origin/pr/2718/head
 * [new ref] refs/pull/2718/merge -> origin/pr/2718/merge
 * [new ref] refs/pull/2719/head -> origin/pr/2719/head
 * [new ref] refs/pull/2719/merge -> origin/pr/2719/merge
 * [new ref] refs/pull/2720/head -> origin/pr/2720/head
 * [new ref] refs/pull/2721/head -> origin/pr/2721/head
 * [new ref] refs/pull/2721/merge -> origin/pr/2721/merge
 * [new ref] refs/pull/2722/head -> origin/pr/2722/head
 * [new ref] refs/pull/2722/merge -> origin/pr/2722/merge
 * [new ref] refs/pull/2723/head -> origin/pr/2723/head
 * [new ref] refs/pull/2723/merge -> origin/pr/2723/merge
 * [new ref] refs/pull/2724/head -> origin/pr/2724/head
 * [new ref] refs/pull/2724/merge -> origin/pr/2724/merge
 * [new ref] refs/pull/2725/head -> origin/pr/2725/head
 * [new ref] refs/pull/2726/head -> origin/pr/2726/head
 * [new ref] refs/pull/2727/head -> origin/pr/2727/head
 * [new ref] refs/pull/2727/merge -> origin/pr/2727/merge
 * [new ref] refs/pull/2728/head -> origin/pr/2728/head
 * [new ref] refs/pull/2729/head -> origin/pr/2729/head
 * [new ref] refs/pull/2729/merge -> origin/pr/2729/merge
 * [new ref] refs/pull/2730/head -> origin/pr/2730/head
 * [new ref] refs/pull/2730/merge -> origin/pr/2730/merge
 * [new ref] refs/pull/2731/head -> origin/pr/2731/head
 * [new ref] refs/pull/2732/head -> origin/pr/2732/head
 * [new ref] refs/pull/2732/merge -> origin/pr/2732/merge
 * [new ref] refs/pull/2733/head -> origin/pr/2733/head
 * [new ref] refs/pull/2733/merge -> origin/pr/2733/merge
 * [new ref] refs/pull/2734/head -> origin/pr/2734/head
 * [new ref] refs/pull/2735/head -> origin/pr/2735/head
 * [new ref] refs/pull/2735/merge -> origin/pr/2735/merge
 * [new ref] refs/pull/2736/head -> origin/pr/2736/head
 * [new ref] refs/pull/2736/merge -> origin/pr/2736/merge
 * [new ref] refs/pull/2737/head -> origin/pr/2737/head
 * [new ref] refs/pull/2737/merge -> origin/pr/2737/merge
 * [new ref] refs/pull/2738/head -> origin/pr/2738/head
 * [new ref] refs/pull/2738/merge -> origin/pr/2738/merge
 * [new ref] refs/pull/2739/head -> origin/pr/2739/head
 * [new ref] refs/pull/2739/merge -> origin/pr/2739/merge
 * [new ref] refs/pull/2740/head -> origin/pr/2740/head
 * [new ref] refs/pull/2740/merge -> origin/pr/2740/merge
 * [new ref] refs/pull/2741/head -> origin/pr/2741/head
 * [new ref] refs/pull/2742/head -> origin/pr/2742/head
 * [new ref] refs/pull/2743/head -> origin/pr/2743/head
 * [new ref] refs/pull/2743/merge -> origin/pr/2743/merge
 * [new ref] refs/pull/2744/head -> origin/pr/2744/head
 * [new ref] refs/pull/2744/merge -> origin/pr/2744/merge
 * [new ref]

Build failed in Jenkins: beam_PerformanceTests_JDBC #159

2017-04-29 Thread Apache Jenkins Server
See 


Changes:

[klk] Move stable name validation to Pipeline.run()

[klk] Remove PipelineOptions from createWriteOperation()

[klk] Move PTransform.validate to post-construction, modulo BigQueryIO

[klk] Supply PipelineOptions at Pipeline.run()

[kirpichov] Converts PubsubIO.Read to AutoValue

[kirpichov] Renames PubsubIO.Read builder methods to be style guide compliant

[kirpichov] Remove override of topic by subscription and vice versa

[kirpichov] Converts PubsubIO.Write to AutoValue

[kirpichov] Renames PubsubIO.Write builder methods to be style guide compliant

[kirpichov] Adds PubsubIO.readStrings(), readProtos(), readAvros()

[kirpichov] Adds PubsubIO.writeStrings(), writeProtos(), writeAvros()

[kirpichov] Renames {id,timestamp}Label to {id,timestamp}Attribute throughout 
SDK

--
[...truncated 848.62 KB...]
at 
com.google.cloud.dataflow.worker.runners.worker.MapTaskExecutorFactory.create(MapTaskExecutorFactory.java:152)
at 
com.google.cloud.dataflow.worker.runners.worker.DataflowWorker.doWork(DataflowWorker.java:272)
at 
com.google.cloud.dataflow.worker.runners.worker.DataflowWorker.getAndPerformWork(DataflowWorker.java:244)
at 
com.google.cloud.dataflow.worker.runners.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:127)
at 
com.google.cloud.dataflow.worker.runners.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:107)
at 
com.google.cloud.dataflow.worker.runners.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:94)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
(c1644776d6ea757a): java.lang.RuntimeException: 
org.apache.beam.sdk.util.UserCodeException: org.postgresql.util.PSQLException: 
The connection attempt failed.
at 
com.google.cloud.dataflow.worker.runners.worker.MapTaskExecutorFactory$3.typedApply(MapTaskExecutorFactory.java:289)
at 
com.google.cloud.dataflow.worker.runners.worker.MapTaskExecutorFactory$3.typedApply(MapTaskExecutorFactory.java:261)
at 
com.google.cloud.dataflow.worker.graph.Networks$TypeSafeNodeFunction.apply(Networks.java:55)
at 
com.google.cloud.dataflow.worker.graph.Networks$TypeSafeNodeFunction.apply(Networks.java:43)
at 
com.google.cloud.dataflow.worker.graph.Networks.replaceDirectedNetworkNodes(Networks.java:78)
at 
com.google.cloud.dataflow.worker.runners.worker.MapTaskExecutorFactory.create(MapTaskExecutorFactory.java:152)
at 
com.google.cloud.dataflow.worker.runners.worker.DataflowWorker.doWork(DataflowWorker.java:272)
at 
com.google.cloud.dataflow.worker.runners.worker.DataflowWorker.getAndPerformWork(DataflowWorker.java:244)
at 
com.google.cloud.dataflow.worker.runners.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:127)
at 
com.google.cloud.dataflow.worker.runners.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:107)
at 
com.google.cloud.dataflow.worker.runners.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:94)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.beam.sdk.util.UserCodeException: 
org.postgresql.util.PSQLException: The connection attempt failed.
at 
org.apache.beam.sdk.util.UserCodeException.wrap(UserCodeException.java:36)
at 
org.apache.beam.sdk.io.jdbc.JdbcIO$Read$ReadFn$auxiliary$IbMlo7Wd.invokeSetup(Unknown
 Source)
at 
com.google.cloud.dataflow.worker.runners.worker.DoFnInstanceManagers$ConcurrentQueueInstanceManager.deserializeCopy(DoFnInstanceManagers.java:66)
at 
com.google.cloud.dataflow.worker.runners.worker.DoFnInstanceManagers$ConcurrentQueueInstanceManager.peek(DoFnInstanceManagers.java:48)
at 
com.google.cloud.dataflow.worker.runners.worker.UserParDoFnFactory.create(UserParDoFnFactory.java:99)
at 
com.google.cloud.dataflow.worker.runners.worker.DefaultParDoFnFactory.create(DefaultParDoFnFactory.java:70)
at 
com.google.cloud.dataflow.worker.runners.worker.MapTaskExecutorFactory.createParDoOperation(MapTaskExecutorFactory.java:363)
at 
com.google.cloud.dataflow.worker.runners.worker.MapTaskExecutorFactory$3.typedApply(MapTaskExecutorFactory.java:278)
... 14 more
Caused by: 

Jenkins build is back to stable : beam_PostCommit_Java_ValidatesRunner_Spark #1841

2017-04-29 Thread Apache Jenkins Server
See 




[jira] [Created] (BEAM-2124) Deprecate .options usage

2017-04-29 Thread JIRA
María GH created BEAM-2124:
--

 Summary: Deprecate .options usage
 Key: BEAM-2124
 URL: https://issues.apache.org/jira/browse/BEAM-2124
 Project: Beam
  Issue Type: Task
  Components: sdk-py
Reporter: María GH
Assignee: María GH
 Fix For: First stable release






--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


Jenkins build is still unstable: beam_PostCommit_Java_MavenInstall #3535

2017-04-29 Thread Apache Jenkins Server
See 




[jira] [Resolved] (BEAM-2019) Count.globally() requires default values for non-GlobalWindows

2017-04-29 Thread Xu Mingmin (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-2019?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Xu Mingmin resolved BEAM-2019.
--
   Resolution: Fixed
Fix Version/s: First stable release

> Count.globally() requires default values for non-GlobalWindows
> --
>
> Key: BEAM-2019
> URL: https://issues.apache.org/jira/browse/BEAM-2019
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-core
>Reporter: Xu Mingmin
>Assignee: Xu Mingmin
>Priority: Minor
> Fix For: First stable release
>
>
> Here's my code:
> {code}
> .apply(Window.into(FixedWindows.of(Duration.standardHours(1)))  
> .triggering(Repeatedly.forever(AfterProcessingTime.pastFirstElementInPane().plusDelayOf(Duration.standardMinutes(1
>   .withAllowedLateness(Duration.standardMinutes(10))
>   .accumulatingFiredPanes()
>   )
> .apply(Count.globally());
> {code}
> And the error message as below:
> {code}
> Exception in thread "main" java.lang.IllegalStateException: Default values 
> are not supported in Combine.globally() if the output PCollection is not 
> windowed by GlobalWindows. Instead, use Combine.globally().withoutDefaults() 
> to output an empty PCollection if the input PCollection is empty, or 
> Combine.globally().asSingletonView() to get the default output of the 
> CombineFn if the input PCollection is empty.
>   at 
> org.apache.beam.sdk.transforms.Combine$Globally.expand(Combine.java:1463)
>   at 
> org.apache.beam.sdk.transforms.Combine$Globally.expand(Combine.java:1336)
>   at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:420)
>   at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:334)
>   at org.apache.beam.sdk.values.PCollection.apply(PCollection.java:154)
> {code}
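The error message quoted above names the fix itself: in a non-global window, use a combine that emits nothing for an empty input rather than a default value. As a plain-Java analogy of the two behaviors (the helper names here are illustrative only, not Beam API):

```java
import java.util.List;
import java.util.Optional;
import java.util.function.BinaryOperator;

public class CombineAnalogy {
  // Analogy for withoutDefaults(): an empty input yields no output at all.
  static <T> Optional<T> combineWithoutDefaults(List<T> in, BinaryOperator<T> op) {
    return in.stream().reduce(op);
  }

  // Analogy for a default-emitting combine: an empty input yields the default.
  // Beam only allows this in the GlobalWindow, where "the input was empty"
  // is well defined.
  static <T> T combineWithDefault(List<T> in, BinaryOperator<T> op, T dflt) {
    return in.stream().reduce(op).orElse(dflt);
  }

  public static void main(String[] args) {
    System.out.println(combineWithoutDefaults(List.<Long>of(), Long::sum).isPresent()); // prints false
    System.out.println(combineWithDefault(List.<Long>of(), Long::sum, 0L)); // prints 0
  }
}
```

Per the error message, the windowed pipeline would use {{Combine.globally(...).withoutDefaults()}} so that an empty window simply produces no output.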



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Created] (BEAM-2125) update JavaDoc of BoundedWindow

2017-04-29 Thread Xu Mingmin (JIRA)
Xu Mingmin created BEAM-2125:


 Summary: update JavaDoc of BoundedWindow
 Key: BEAM-2125
 URL: https://issues.apache.org/jira/browse/BEAM-2125
 Project: Beam
  Issue Type: Task
  Components: sdk-java-core
Reporter: Xu Mingmin
Assignee: Xu Mingmin
Priority: Minor


{{BoundedWindow}} represents the window information of an element; it does not 
represent a finite grouping of elements. 



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[GitHub] beam pull request #2784: [BEAM-2125] update JavaDoc of BoundedWindow

2017-04-29 Thread XuMingmin
GitHub user XuMingmin opened a pull request:

https://github.com/apache/beam/pull/2784

[BEAM-2125] update JavaDoc of BoundedWindow

Be sure to do all of the following to help us incorporate your contribution
quickly and easily:

 - [ ] Make sure the PR title is formatted like:
   `[BEAM-] Description of pull request`
 - [ ] Make sure tests pass via `mvn clean verify`. (Even better, enable
   Travis-CI on your fork and ensure the whole test matrix passes).
 - [ ] Replace `` in the title with the actual Jira issue
   number, if there is one.
 - [ ] If this contribution is large, please file an Apache
   [Individual Contributor License 
Agreement](https://www.apache.org/licenses/icla.pdf).

---


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/XuMingmin/beam BEAM-2125

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/2784.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2784


commit cd1e159e4b34c2b3f219dec35388c7f56dccbd41
Author: mingmxu 
Date:   2017-04-30T01:51:21Z

update JavaDoc for BoundedWindow




---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[jira] [Commented] (BEAM-2125) update JavaDoc of BoundedWindow

2017-04-29 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-2125?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15990094#comment-15990094
 ] 

ASF GitHub Bot commented on BEAM-2125:
--

GitHub user XuMingmin opened a pull request:

https://github.com/apache/beam/pull/2784

[BEAM-2125] update JavaDoc of BoundedWindow

Be sure to do all of the following to help us incorporate your contribution
quickly and easily:

 - [ ] Make sure the PR title is formatted like:
   `[BEAM-] Description of pull request`
 - [ ] Make sure tests pass via `mvn clean verify`. (Even better, enable
   Travis-CI on your fork and ensure the whole test matrix passes).
 - [ ] Replace `` in the title with the actual Jira issue
   number, if there is one.
 - [ ] If this contribution is large, please file an Apache
   [Individual Contributor License 
Agreement](https://www.apache.org/licenses/icla.pdf).

---


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/XuMingmin/beam BEAM-2125

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/2784.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2784


commit cd1e159e4b34c2b3f219dec35388c7f56dccbd41
Author: mingmxu 
Date:   2017-04-30T01:51:21Z

update JavaDoc for BoundedWindow




> update JavaDoc of BoundedWindow
> ---
>
> Key: BEAM-2125
> URL: https://issues.apache.org/jira/browse/BEAM-2125
> Project: Beam
>  Issue Type: Task
>  Components: sdk-java-core
>Reporter: Xu Mingmin
>Assignee: Xu Mingmin
>Priority: Minor
>
> {{BoundedWindow}} represents the window information of an element; it does 
> not represent a finite grouping of elements. 



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Created] (BEAM-2126) Add JStorm runner to "Contribute > Technical References > Ongoing Projects"

2017-04-29 Thread Kenneth Knowles (JIRA)
Kenneth Knowles created BEAM-2126:
-

 Summary: Add JStorm runner to "Contribute > Technical References > 
Ongoing Projects"
 Key: BEAM-2126
 URL: https://issues.apache.org/jira/browse/BEAM-2126
 Project: Beam
  Issue Type: Improvement
  Components: runner-jstorm, website
Reporter: Kenneth Knowles
Assignee: Pei He


We should have this effort listed here: 
https://beam.apache.org/contribute/work-in-progress/



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Comment Edited] (BEAM-1773) Consider allowing Source#validate() to throw exception

2017-04-29 Thread Kenneth Knowles (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-1773?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15990097#comment-15990097
 ] 

Kenneth Knowles edited comment on BEAM-1773 at 4/30/17 2:11 AM:


Checked exceptions should be understood as an alternative return value, 
basically. I am OK with adding a checked exception if it is a specific 
{{ValidationException}}. The source should add its own error message so the 
user knows what is going on. So you still have to wrap it. We should not add 
checked exceptions that automatically trickle up.

Incidentally, I don't think the way that {{InterruptedException}} is handled 
there is correct. It should _definitely_ not be thrown by {{validate}}. It 
should instead catch those two exceptions separately and handle interruption 
via {{Thread.currentThread().interrupt(); return;}}. See 
http://stackoverflow.com/questions/3976344/handling-interruptedexception-in-java


was (Author: kenn):
Checked exceptions should be understood as an alternative return value, 
basically. I am OK with adding a checked exception if it is a specific 
{{ValidationException}}. The source should add its own error message so the 
user knows what is going on. So you still have to wrap it. We should not add 
checked exceptions that automatically trickle up.

Incidentally, I don't think the way that {{InterruptedException}} is handled 
there is correct. It should _definitely_ not be thrown by {{validate}}. It 
should instead catch those two exceptions separately and handle interruption 
via {{Thread.currentThread().interrupt(); return;}}

> Consider allowing Source#validate() to throw exception
> --
>
> Key: BEAM-1773
> URL: https://issues.apache.org/jira/browse/BEAM-1773
> Project: Beam
>  Issue Type: Improvement
>  Components: sdk-java-core
>Reporter: Ted Yu
>Assignee: Ted Yu
> Attachments: beam-1773.v1.patch, beam-1773.v2.patch
>
>
> In HDFSFileSource.java :
> {code}
>   @Override
>   public void validate() {
> ...
>   } catch (IOException | InterruptedException e) {
> throw new RuntimeException(e);
>   }
> {code}
> Source#validate() should be allowed to throw exception so that we don't 
> resort to using RuntimeException.
> Here was related thread on mailing list:
> http://search-hadoop.com/m/Beam/gfKHFOwE0uETxae?subj=Re+why+Source+validate+is+not+declared+to+throw+any+exception



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Commented] (BEAM-1773) Consider allowing Source#validate() to throw exception

2017-04-29 Thread Kenneth Knowles (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-1773?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15990097#comment-15990097
 ] 

Kenneth Knowles commented on BEAM-1773:
---

Checked exceptions should be understood as an alternative return value, 
basically. I am OK with adding a checked exception if it is a specific 
{{ValidationException}}. The source should add its own error message so the 
user knows what is going on. So you still have to wrap it. We should not add 
checked exceptions that automatically trickle up.

Incidentally, I don't think the way that {{InterruptedException}} is handled 
there is correct. It should _definitely_ not be thrown by {{validate}}. It 
should instead catch those two exceptions separately and handle interruption 
via {{Thread.currentThread().interrupt(); return;}}
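A self-contained sketch of the handling described above (the {{validate}} helper here is illustrative, not the actual Beam {{Source#validate}} signature): wrap {{IOException}} with context, and on {{InterruptedException}} restore the interrupt flag and return instead of rethrowing.

```java
import java.io.IOException;
import java.util.concurrent.Callable;

public class ValidateSketch {
  // Illustrative only: catch the two exceptions separately, as suggested in
  // the comment. IOException is wrapped with an error message; interruption
  // restores the thread's interrupt status rather than propagating.
  static void validate(Callable<Void> check) {
    try {
      check.call();
    } catch (IOException e) {
      throw new RuntimeException("validation failed", e);
    } catch (InterruptedException e) {
      Thread.currentThread().interrupt();  // preserve interrupt status
    } catch (Exception e) {
      throw new RuntimeException(e);
    }
  }

  public static void main(String[] args) {
    validate(() -> { throw new InterruptedException(); });
    // The interrupt flag is set again after validate() returns.
    System.out.println(Thread.currentThread().isInterrupted()); // prints true
  }
}
```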

> Consider allowing Source#validate() to throw exception
> --
>
> Key: BEAM-1773
> URL: https://issues.apache.org/jira/browse/BEAM-1773
> Project: Beam
>  Issue Type: Improvement
>  Components: sdk-java-core
>Reporter: Ted Yu
>Assignee: Ted Yu
> Attachments: beam-1773.v1.patch, beam-1773.v2.patch
>
>
> In HDFSFileSource.java :
> {code}
>   @Override
>   public void validate() {
> ...
>   } catch (IOException | InterruptedException e) {
> throw new RuntimeException(e);
>   }
> {code}
> Source#validate() should be allowed to throw exception so that we don't 
> resort to using RuntimeException.
> Here was related thread on mailing list:
> http://search-hadoop.com/m/Beam/gfKHFOwE0uETxae?subj=Re+why+Source+validate+is+not+declared+to+throw+any+exception



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[GitHub] beam-site pull request #226: [BEAM-2078] add BeamSQL feature branch in site

2017-04-29 Thread XuMingmin
GitHub user XuMingmin opened a pull request:

https://github.com/apache/beam-site/pull/226

[BEAM-2078] add BeamSQL feature branch in site

Created a new PR, as the previous #224 edited the wrong file.

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/XuMingmin/beam-site BEAM-2078-2

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam-site/pull/226.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #226


commit ee1f7d63919941bdaa3043d46d824d9c21489f03
Author: mingmxu 
Date:   2017-04-30T02:11:45Z

add SQL_DSL to page 'work-in-progress'






[jira] [Commented] (BEAM-2078) add BeamSQL feature branch in site

2017-04-29 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-2078?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15990100#comment-15990100
 ] 

ASF GitHub Bot commented on BEAM-2078:
--

GitHub user XuMingmin opened a pull request:

https://github.com/apache/beam-site/pull/226

[BEAM-2078] add BeamSQL feature branch in site

Created a new PR, as the previous #224 edited the wrong file.

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/XuMingmin/beam-site BEAM-2078-2

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam-site/pull/226.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #226


commit ee1f7d63919941bdaa3043d46d824d9c21489f03
Author: mingmxu 
Date:   2017-04-30T02:11:45Z

add SQL_DSL to page 'work-in-progress'




> add BeamSQL feature branch in site
> --
>
> Key: BEAM-2078
> URL: https://issues.apache.org/jira/browse/BEAM-2078
> Project: Beam
>  Issue Type: Task
>  Components: dsl-sql, website
>Reporter: Xu Mingmin
>Assignee: Xu Mingmin
>
> Add {{dsl_sql}} feature branch to page 
> 'https://beam.apache.org/contribute/work-in-progress/', to track the status.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Comment Edited] (BEAM-1773) Consider allowing Source#validate() to throw exception

2017-04-29 Thread Ted Yu (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-1773?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15990101#comment-15990101
 ] 

Ted Yu edited comment on BEAM-1773 at 4/30/17 2:23 AM:
---

w.r.t. ValidationException, I assume you mean creating ValidationException 
class (subclass of IOException) which is declared to be thrown by the relevant 
methods.

w.r.t. InterruptedException, allow me to refer to 
hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
 which has the following:
{code}
  } catch (InterruptedException e) {
throw (IOException)new InterruptedIOException("Interrupted: action="
+ action + ", retry policy=" + connectionRetryPolicy).initCause(e);
  }
{code}


was (Author: yuzhih...@gmail.com):
w.r.t. ValidationException, I assume you mean creating ValidationException 
class (subclass of IOException) which is declared by the relevant methods.

w.r.t. InterruptedException, allow me to refer to 
hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
 which has the following:
{code}
  } catch (InterruptedException e) {
throw (IOException)new InterruptedIOException("Interrupted: action="
+ action + ", retry policy=" + connectionRetryPolicy).initCause(e);
  }
{code}

> Consider allowing Source#validate() to throw exception
> --
>
> Key: BEAM-1773
> URL: https://issues.apache.org/jira/browse/BEAM-1773
> Project: Beam
>  Issue Type: Improvement
>  Components: sdk-java-core
>Reporter: Ted Yu
>Assignee: Ted Yu
> Attachments: beam-1773.v1.patch, beam-1773.v2.patch
>
>
> In HDFSFileSource.java :
> {code}
>   @Override
>   public void validate() {
> ...
>   } catch (IOException | InterruptedException e) {
> throw new RuntimeException(e);
>   }
> {code}
> Source#validate() should be allowed to throw exception so that we don't 
> resort to using RuntimeException.
> Here was related thread on mailing list:
> http://search-hadoop.com/m/Beam/gfKHFOwE0uETxae?subj=Re+why+Source+validate+is+not+declared+to+throw+any+exception



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Commented] (BEAM-1773) Consider allowing Source#validate() to throw exception

2017-04-29 Thread Ted Yu (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-1773?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15990101#comment-15990101
 ] 

Ted Yu commented on BEAM-1773:
--

w.r.t. ValidationException, I assume you mean creating ValidationException 
class (subclass of IOException) which is declared by the relevant methods.

w.r.t. InterruptedException, allow me to refer to 
hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
 which has the following:
{code}
  } catch (InterruptedException e) {
throw (IOException)new InterruptedIOException("Interrupted: action="
+ action + ", retry policy=" + connectionRetryPolicy).initCause(e);
  }
{code}

> Consider allowing Source#validate() to throw exception
> --
>
> Key: BEAM-1773
> URL: https://issues.apache.org/jira/browse/BEAM-1773
> Project: Beam
>  Issue Type: Improvement
>  Components: sdk-java-core
>Reporter: Ted Yu
>Assignee: Ted Yu
> Attachments: beam-1773.v1.patch, beam-1773.v2.patch
>
>
> In HDFSFileSource.java :
> {code}
>   @Override
>   public void validate() {
> ...
>   } catch (IOException | InterruptedException e) {
> throw new RuntimeException(e);
>   }
> {code}
> Source#validate() should be allowed to throw exception so that we don't 
> resort to using RuntimeException.
> Here was related thread on mailing list:
> http://search-hadoop.com/m/Beam/gfKHFOwE0uETxae?subj=Re+why+Source+validate+is+not+declared+to+throw+any+exception



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Commented] (BEAM-1773) Consider allowing Source#validate() to throw exception

2017-04-29 Thread Ted Yu (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-1773?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15990102#comment-15990102
 ] 

Ted Yu commented on BEAM-1773:
--

Should the ValidationException class be created in the org.apache.beam.sdk package?
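One possible shape for such a class, per the discussion above (a hypothetical sketch, not an agreed API or package location):

```java
import java.io.IOException;

public class ValidationExceptionSketch {
  // Hypothetical: a checked subclass of IOException, as proposed, so that a
  // validate() method could declare `throws ValidationException` and callers
  // would see a specific, wrappable failure type instead of RuntimeException.
  static class ValidationException extends IOException {
    ValidationException(String message, Throwable cause) {
      super(message, cause);
    }
  }

  // Stand-in for a source's validate() method.
  static void validate(boolean ok) throws ValidationException {
    if (!ok) {
      throw new ValidationException("source failed validation", null);
    }
  }

  public static void main(String[] args) {
    try {
      validate(false);
    } catch (ValidationException e) {
      System.out.println(e.getMessage()); // prints: source failed validation
    }
  }
}
```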

> Consider allowing Source#validate() to throw exception
> --
>
> Key: BEAM-1773
> URL: https://issues.apache.org/jira/browse/BEAM-1773
> Project: Beam
>  Issue Type: Improvement
>  Components: sdk-java-core
>Reporter: Ted Yu
>Assignee: Ted Yu
> Attachments: beam-1773.v1.patch, beam-1773.v2.patch
>
>
> In HDFSFileSource.java :
> {code}
>   @Override
>   public void validate() {
> ...
>   } catch (IOException | InterruptedException e) {
> throw new RuntimeException(e);
>   }
> {code}
> Source#validate() should be allowed to throw exception so that we don't 
> resort to using RuntimeException.
> Here was related thread on mailing list:
> http://search-hadoop.com/m/Beam/gfKHFOwE0uETxae?subj=Re+why+Source+validate+is+not+declared+to+throw+any+exception



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Updated] (BEAM-1773) Consider allowing Source#validate() to throw exception

2017-04-29 Thread Ted Yu (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-1773?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ted Yu updated BEAM-1773:
-
Attachment: 1773.v4.patch

See if patch v4 is on the right track.

> Consider allowing Source#validate() to throw exception
> --
>
> Key: BEAM-1773
> URL: https://issues.apache.org/jira/browse/BEAM-1773
> Project: Beam
>  Issue Type: Improvement
>  Components: sdk-java-core
>Reporter: Ted Yu
>Assignee: Ted Yu
> Attachments: 1773.v4.patch, beam-1773.v1.patch, beam-1773.v2.patch
>
>
> In HDFSFileSource.java :
> {code}
>   @Override
>   public void validate() {
> ...
>   } catch (IOException | InterruptedException e) {
> throw new RuntimeException(e);
>   }
> {code}
> Source#validate() should be allowed to throw exception so that we don't 
> resort to using RuntimeException.
> Here was related thread on mailing list:
> http://search-hadoop.com/m/Beam/gfKHFOwE0uETxae?subj=Re+why+Source+validate+is+not+declared+to+throw+any+exception



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Updated] (BEAM-1773) Consider allowing Source#validate() to throw exception

2017-04-29 Thread Ted Yu (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-1773?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ted Yu updated BEAM-1773:
-
Attachment: 1773.v4.patch

> Consider allowing Source#validate() to throw exception
> --
>
> Key: BEAM-1773
> URL: https://issues.apache.org/jira/browse/BEAM-1773
> Project: Beam
>  Issue Type: Improvement
>  Components: sdk-java-core
>Reporter: Ted Yu
>Assignee: Ted Yu
> Attachments: 1773.v4.patch, beam-1773.v1.patch, beam-1773.v2.patch
>
>
> In HDFSFileSource.java :
> {code}
>   @Override
>   public void validate() {
> ...
>   } catch (IOException | InterruptedException e) {
> throw new RuntimeException(e);
>   }
> {code}
> Source#validate() should be allowed to throw exception so that we don't 
> resort to using RuntimeException.
> Here was related thread on mailing list:
> http://search-hadoop.com/m/Beam/gfKHFOwE0uETxae?subj=Re+why+Source+validate+is+not+declared+to+throw+any+exception



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Updated] (BEAM-1773) Consider allowing Source#validate() to throw exception

2017-04-29 Thread Ted Yu (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-1773?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ted Yu updated BEAM-1773:
-
Attachment: (was: 1773.v4.patch)

> Consider allowing Source#validate() to throw exception
> --
>
> Key: BEAM-1773
> URL: https://issues.apache.org/jira/browse/BEAM-1773
> Project: Beam
>  Issue Type: Improvement
>  Components: sdk-java-core
>Reporter: Ted Yu
>Assignee: Ted Yu
> Attachments: 1773.v4.patch, beam-1773.v1.patch, beam-1773.v2.patch
>
>
> In HDFSFileSource.java :
> {code}
>   @Override
>   public void validate() {
> ...
>   } catch (IOException | InterruptedException e) {
> throw new RuntimeException(e);
>   }
> {code}
> Source#validate() should be allowed to throw exception so that we don't 
> resort to using RuntimeException.
> Here was related thread on mailing list:
> http://search-hadoop.com/m/Beam/gfKHFOwE0uETxae?subj=Re+why+Source+validate+is+not+declared+to+throw+any+exception



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[GitHub] beam pull request #2785: Do not prune branches in Jenkins

2017-04-29 Thread kennknowles
GitHub user kennknowles opened a pull request:

https://github.com/apache/beam/pull/2785

Do not prune branches in Jenkins

It seems that pruning in Jenkins has bugs which result in
branches being pruned when they should not, resulting in
log spam and build delays.

Be sure to do all of the following to help us incorporate your contribution
quickly and easily:

 - [ ] Make sure the PR title is formatted like:
   `[BEAM-] Description of pull request`
 - [ ] Make sure tests pass via `mvn clean verify`. (Even better, enable
   Travis-CI on your fork and ensure the whole test matrix passes).
 - [ ] Replace `` in the title with the actual Jira issue
   number, if there is one.
 - [ ] If this contribution is large, please file an Apache
   [Individual Contributor License 
Agreement](https://www.apache.org/licenses/icla.pdf).

---

R: @jasonkuster 

Reverting my prior change. The prior change supports changing the refspec, 
but I suspect bugs in this feature are behind unreasonable behavior such as the 
git fetch here: 
https://builds.apache.org/job/beam_PreCommit_Java_MavenInstall/10329/consoleFull

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/kennknowles/beam no-prune-jenkins

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/2785.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2785


commit cc66e2defff592efbbc3755b16fe74ff2ae0d011
Author: Kenneth Knowles 
Date:   2017-04-30T04:25:03Z

Do not prune branches in Jenkins

It seems that pruning in Jenkins has bugs which result in
branches being pruned when they should not, resulting in
log spam and build delays.






[GitHub] beam pull request #2786: Remove redundant private on enum constructors

2017-04-29 Thread wtanaka
GitHub user wtanaka opened a pull request:

https://github.com/apache/beam/pull/2786

Remove redundant private on enum constructors

https://docs.oracle.com/javase/specs/jls/se8/html/jls-8.html#jls-8.9.2
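The cited JLS section (8.9.2) specifies that an enum constructor is implicitly private, so writing the modifier out is redundant. A minimal sketch (not taken from the PR diff; `Color`, `hex`, and `EnumConstructorDemo` are illustrative names) of the kind of cleanup this PR makes:

```java
// Enum constructors are implicitly private (JLS 8.9.2), so the `private`
// modifier below is redundant and can be removed. Declaring an enum
// constructor `public` or `protected` is a compile-time error.
enum Color {
    RED("#ff0000"),
    GREEN("#00ff00");

    private final String hex;

    // Idiomatic form after the cleanup: no access modifier at all.
    Color(String hex) {
        this.hex = hex;
    }

    String hex() {
        return hex;
    }
}

public class EnumConstructorDemo {
    public static void main(String[] args) {
        // The constructor still runs for each constant at class initialization.
        System.out.println(Color.RED.hex()); // prints #ff0000
    }
}
```

Removing the modifier changes nothing at runtime; it only drops a token the compiler already implies.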


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/wtanaka/beam enumconstruct

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/2786.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2786


commit a11ff41c4df4fcb70c17bae2b8af5ae36b99cd94
Author: wtanaka.com 
Date:   2017-04-30T04:27:43Z

Remove redundant private on enum constructors

https://docs.oracle.com/javase/specs/jls/se8/html/jls-8.html#jls-8.9.2






[GitHub] beam pull request #2787: Remove useless continue statements

2017-04-29 Thread wtanaka
GitHub user wtanaka opened a pull request:

https://github.com/apache/beam/pull/2787

Remove useless continue statements
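A `continue` that appears as the last statement of a loop body is a no-op: control would re-enter the loop anyway. A hypothetical before/after sketch (illustrative names only; not the actual PR diff):

```java
import java.util.ArrayList;
import java.util.List;

public class ContinueDemo {
    static List<Integer> evens(int[] xs) {
        List<Integer> out = new ArrayList<>();
        for (int x : xs) {
            if (x % 2 != 0) {
                continue; // useful: skips the rest of the body for odd values
            }
            out.add(x);
            // A trailing `continue;` here would be useless -- the loop
            // proceeds to the next iteration regardless. That is the kind
            // of statement this PR removes.
        }
        return out;
    }

    public static void main(String[] args) {
        System.out.println(evens(new int[] {1, 2, 3, 4})); // prints [2, 4]
    }
}
```

Deleting such a trailing `continue` is behavior-preserving, which is why this kind of change needs no functional tests beyond the existing suite.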



You can merge this pull request into a Git repository by running:

$ git pull https://github.com/wtanaka/beam unnecessarycontinue

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/2787.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2787


commit 5b3a7d4a17afa687eedfbc205c682a2a1279624d
Author: wtanaka.com 
Date:   2017-04-30T04:44:06Z

Remove useless continue statements





