[jira] [Created] (BEAM-2455) Backlog size retrieval for Kinesis source

2017-06-16 Thread JIRA
Paweł Kaczmarczyk created BEAM-2455:
---

 Summary: Backlog size retrieval for Kinesis source
 Key: BEAM-2455
 URL: https://issues.apache.org/jira/browse/BEAM-2455
 Project: Beam
  Issue Type: Improvement
  Components: sdk-java-extensions
Reporter: Paweł Kaczmarczyk
Assignee: Davor Bonaci


Implement backlog size retrieval for the Kinesis source using Amazon CloudWatch. 
This will allow runners to scale the amount of resources allocated to the 
pipeline.
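
A minimal sketch of what such a lookup could look like with the AWS SDK's 
CloudWatch client (the metric choice, the one-minute window and the "Sum" 
statistic below are assumptions for illustration, not the final design):

{code}
import com.amazonaws.services.cloudwatch.AmazonCloudWatch;
import com.amazonaws.services.cloudwatch.AmazonCloudWatchClientBuilder;
import com.amazonaws.services.cloudwatch.model.Datapoint;
import com.amazonaws.services.cloudwatch.model.Dimension;
import com.amazonaws.services.cloudwatch.model.GetMetricStatisticsRequest;
import java.util.Date;

public class KinesisBacklogEstimator {

  /** Sums IncomingBytes for the stream over the last minute as a rough backlog estimate. */
  public static long estimateBacklogBytes(String streamName) {
    AmazonCloudWatch cloudWatch = AmazonCloudWatchClientBuilder.defaultClient();
    GetMetricStatisticsRequest request = new GetMetricStatisticsRequest()
        .withNamespace("AWS/Kinesis")
        .withMetricName("IncomingBytes")
        .withDimensions(new Dimension().withName("StreamName").withValue(streamName))
        .withStartTime(new Date(System.currentTimeMillis() - 60_000L))
        .withEndTime(new Date())
        .withPeriod(60)
        .withStatistics("Sum");
    long totalBytes = 0;
    for (Datapoint datapoint : cloudWatch.getMetricStatistics(request).getDatapoints()) {
      totalBytes += datapoint.getSum().longValue();
    }
    return totalBytes;
  }
}
{code}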



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[jira] [Created] (BEAM-2456) Introduce generic metric poll thread interceptor

2017-06-16 Thread JIRA
Jean-Baptiste Onofré created BEAM-2456:
--

 Summary: Introduce generic metric poll thread interceptor
 Key: BEAM-2456
 URL: https://issues.apache.org/jira/browse/BEAM-2456
 Project: Beam
  Issue Type: New Feature
  Components: sdk-ideas
Reporter: Jean-Baptiste Onofré
Assignee: Jean-Baptiste Onofré


The Spark runner provides a convenient feature: the metric sink.

It allows us to configure a metric sink via a {{metrics.properties}} 
configuration file containing:

{code}
driver.sink.graphite.class=org.apache.beam.runners.spark.metrics.sink.GraphiteSink
driver.sink.graphite.host=localhost
driver.sink.graphite.port=2003
driver.sink.graphite.prefix=spark
driver.sink.graphite.period=1
driver.sink.graphite.unit=SECONDS 
{code}

This approach is very convenient for sending metrics to any sink. I think we can 
apply the same logic in a generic way that works with any runner.

The idea is to run a metric poll thread in the pipeline (which we could enable via 
{{PipelineOptions}}) and send the polled metrics to a sink.

I have started a PoC on this.
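
A minimal sketch of the idea, assuming a simple {{MetricsSink}} interface (the 
interface name, the polling period and the use of attempted counter values are 
illustrative assumptions, not the actual PoC):

{code}
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import org.apache.beam.sdk.PipelineResult;
import org.apache.beam.sdk.metrics.MetricQueryResults;
import org.apache.beam.sdk.metrics.MetricResult;
import org.apache.beam.sdk.metrics.MetricsFilter;

/** Hypothetical sink abstraction; a Graphite implementation would push to the graphite host/port. */
interface MetricsSink {
  void writeCounter(String name, long value);
}

/** Polls the pipeline's metrics periodically and forwards counters to the configured sink. */
class MetricsPoller {
  private final ScheduledExecutorService executor = Executors.newSingleThreadScheduledExecutor();

  void start(final PipelineResult result, final MetricsSink sink, long periodSeconds) {
    executor.scheduleAtFixedRate(new Runnable() {
      @Override
      public void run() {
        MetricQueryResults metrics =
            result.metrics().queryMetrics(MetricsFilter.builder().build());
        for (MetricResult<Long> counter : metrics.counters()) {
          sink.writeCounter(counter.name().toString(), counter.attempted());
        }
      }
    }, 0, periodSeconds, TimeUnit.SECONDS);
  }
}
{code}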



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


Jenkins build is back to stable : beam_PostCommit_Java_MavenInstall #4119

2017-06-16 Thread Apache Jenkins Server
See 




[jira] [Resolved] (BEAM-423) Refactor HDFS IO to use the "new" IO style

2017-06-16 Thread JIRA

 [ 
https://issues.apache.org/jira/browse/BEAM-423?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jean-Baptiste Onofré resolved BEAM-423.
---
   Resolution: Won't Fix
Fix Version/s: Not applicable

It doesn't make sense anymore, thanks to the Beam filesystems.

> Refactor HDFS IO to use the "new" IO style
> ---
>
> Key: BEAM-423
> URL: https://issues.apache.org/jira/browse/BEAM-423
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-extensions
>Reporter: Jean-Baptiste Onofré
>Assignee: Jean-Baptiste Onofré
> Fix For: Not applicable
>
>
> Currently, HDFS IO provides HDFSSource and HDFSSink. It's now better to use 
> the new "style" with a unique HdfsIO that "internally" creates the source and sink.
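
For context, the "new" style referred to here keeps the source/sink as 
implementation details behind a single IO class with read/write entry points. A 
rough sketch of the shape (the names and the String element type are illustrative 
only, since this refactoring was not done):

{code}
import org.apache.beam.sdk.io.BoundedSource;
import org.apache.beam.sdk.transforms.PTransform;
import org.apache.beam.sdk.values.PBegin;
import org.apache.beam.sdk.values.PCollection;

/** Illustrative shape only: one public IO class, the source stays internal. */
public class HdfsIO {

  public static Read read(String filePattern) {
    return new Read(filePattern);
  }

  /** The public transform; the BoundedSource is created internally in expand(). */
  public static class Read extends PTransform<PBegin, PCollection<String>> {
    private final String filePattern;

    private Read(String filePattern) {
      this.filePattern = filePattern;
    }

    @Override
    public PCollection<String> expand(PBegin input) {
      BoundedSource<String> source = createSource(filePattern);
      return input.apply(org.apache.beam.sdk.io.Read.from(source));
    }
  }

  private static BoundedSource<String> createSource(String filePattern) {
    throw new UnsupportedOperationException("sketch only");
  }
}
{code}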



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[jira] [Updated] (BEAM-975) Issue with MongoDBIO

2017-06-16 Thread JIRA

 [ 
https://issues.apache.org/jira/browse/BEAM-975?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jean-Baptiste Onofré updated BEAM-975:
--
Description: 
It appears that there is an issue with MongoDbIO. I am using Apache Beam in a 
REST service that reads data from Mongo. After a number of requests, MongoDbIO 
throws the following exception:

{code}
com.mongodb.MongoSocketReadException: Prematurely reached end of stream
at com.mongodb.connection.SocketStream.read(SocketStream.java:88)
at 
com.mongodb.connection.InternalStreamConnection.receiveResponseBuffers(InternalStreamConnection.java:491)
at 
com.mongodb.connection.InternalStreamConnection.receiveMessage(InternalStreamConnection.java:221)
at 
com.mongodb.connection.CommandHelper.receiveReply(CommandHelper.java:134)
at 
com.mongodb.connection.CommandHelper.receiveCommandResult(CommandHelper.java:121)
at 
com.mongodb.connection.CommandHelper.executeCommand(CommandHelper.java:32)
at 
com.mongodb.connection.InternalStreamConnectionInitializer.initializeConnectionDescription(InternalStreamConnectionInitializer.java:83)
at 
com.mongodb.connection.InternalStreamConnectionInitializer.initialize(InternalStreamConnectionInitializer.java:43)
at 
com.mongodb.connection.InternalStreamConnection.open(InternalStreamConnection.java:115)
at 
com.mongodb.connection.UsageTrackingInternalConnection.open(UsageTrackingInternalConnection.java:46)
at 
com.mongodb.connection.DefaultConnectionPool$PooledConnection.open(DefaultConnectionPool.java:381)
at 
com.mongodb.connection.DefaultConnectionPool.get(DefaultConnectionPool.java:96)
at 
com.mongodb.connection.DefaultConnectionPool.get(DefaultConnectionPool.java:82)
at 
com.mongodb.connection.DefaultServer.getConnection(DefaultServer.java:72)
at 
com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.getConnection(ClusterBinding.java:86)
at 
com.mongodb.operation.OperationHelper.withConnectionSource(OperationHelper.java:237)
at 
com.mongodb.operation.OperationHelper.withConnection(OperationHelper.java:212)
at com.mongodb.operation.FindOperation.execute(FindOperation.java:482)
at com.mongodb.operation.FindOperation.execute(FindOperation.java:79)
at com.mongodb.Mongo.execute(Mongo.java:772)
at com.mongodb.Mongo$2.execute(Mongo.java:759)
at com.mongodb.OperationIterable.iterator(OperationIterable.java:47)
at com.mongodb.FindIterableImpl.iterator(FindIterableImpl.java:143)
at 
org.apache.beam.sdk.io.mongodb.MongoDbIO$BoundedMongoDbReader.start(MongoDbIO.java:359)
at 
org.apache.beam.runners.direct.BoundedReadEvaluatorFactory$BoundedReadEvaluator.processElement(BoundedReadEvaluatorFactory.java:99)
at 
org.apache.beam.runners.direct.TransformExecutor.processElements(TransformExecutor.java:154)
at 
org.apache.beam.runners.direct.TransformExecutor.run(TransformExecutor.java:121)
at 
java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
{code}

I suppose there must be a problem with the Mongo connection that causes this issue.


Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Dataflow #3373

2017-06-16 Thread Apache Jenkins Server
See 


--
[...truncated 7.38 MB...]
at 
com.google.cloud.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:188)
at 
com.google.cloud.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:149)
at 
com.google.cloud.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:72)
at 
com.google.cloud.dataflow.worker.runners.worker.DataflowWorker.executeWork(DataflowWorker.java:341)
at 
com.google.cloud.dataflow.worker.runners.worker.DataflowWorker.doWork(DataflowWorker.java:297)
at 
com.google.cloud.dataflow.worker.runners.worker.DataflowWorker.getAndPerformWork(DataflowWorker.java:244)
at 
com.google.cloud.dataflow.worker.runners.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:125)
at 
com.google.cloud.dataflow.worker.runners.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:105)
at 
com.google.cloud.dataflow.worker.runners.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:92)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
(96edac8154a35c1f): java.lang.AssertionError: Accumulators should all have the 
same Side Input Value
Expected: "2"
 but: was "21"
at org.hamcrest.MatcherAssert.assertThat(MatcherAssert.java:20)
at org.junit.Assert.assertThat(Assert.java:956)
at 
org.apache.beam.sdk.transforms.CombineTest$TestCombineFnWithContext.mergeAccumulators(CombineTest.java:)
at 
org.apache.beam.sdk.transforms.CombineTest$TestCombineFnWithContext.mergeAccumulators(CombineTest.java:1071)
at 
com.google.cloud.dataflow.worker.runners.worker.GroupAlsoByWindowParDoFnFactory$MergingKeyedCombineFnWithContext.extractOutput(GroupAlsoByWindowParDoFnFactory.java:344)
at 
com.google.cloud.dataflow.worker.runners.worker.GroupAlsoByWindowParDoFnFactory$MergingKeyedCombineFnWithContext.extractOutput(GroupAlsoByWindowParDoFnFactory.java:305)
at 
com.google.cloud.dataflow.worker.util.BatchGroupAlsoByWindowAndCombineFn$KeyedCombineFnWithContextRunner.extractOutput(BatchGroupAlsoByWindowAndCombineFn.java:270)
at 
com.google.cloud.dataflow.worker.util.BatchGroupAlsoByWindowAndCombineFn.closeWindow(BatchGroupAlsoByWindowAndCombineFn.java:192)
at 
com.google.cloud.dataflow.worker.util.BatchGroupAlsoByWindowAndCombineFn.processElement(BatchGroupAlsoByWindowAndCombineFn.java:170)
at 
com.google.cloud.dataflow.worker.util.BatchGroupAlsoByWindowAndCombineFn.processElement(BatchGroupAlsoByWindowAndCombineFn.java:57)
at 
com.google.cloud.dataflow.worker.runners.worker.GroupAlsoByWindowFnRunner.invokeProcessElement(GroupAlsoByWindowFnRunner.java:117)
at 
com.google.cloud.dataflow.worker.runners.worker.GroupAlsoByWindowFnRunner.processElement(GroupAlsoByWindowFnRunner.java:74)
at 
com.google.cloud.dataflow.worker.runners.worker.GroupAlsoByWindowsParDoFn.processElement(GroupAlsoByWindowsParDoFn.java:101)
at 
com.google.cloud.dataflow.worker.runners.worker.ForwardingParDoFn.processElement(ForwardingParDoFn.java:42)
at 
com.google.cloud.dataflow.worker.runners.worker.DataflowWorkerLoggingParDoFn.processElement(DataflowWorkerLoggingParDoFn.java:47)
at 
com.google.cloud.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:48)
at 
com.google.cloud.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:52)
at 
com.google.cloud.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:188)
at 
com.google.cloud.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:149)
at 
com.google.cloud.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:72)
at 
com.google.cloud.dataflow.worker.runners.worker.DataflowWorker.executeWork(DataflowWorker.java:341)
at 
com.google.cloud.dataflow.worker.runners.worker.DataflowWorker.doWork(DataflowWorker.java:297)
at 
com.google.cloud.dataflow.worker.runners.worker.DataflowWorker.getAndPerformWork(DataflowWorker.java:244)
at 
com.google.cloud.dataflow.worker.runners.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:125)
at 
com.google.cloud.dataflow.worker.runners.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:105)
at 
com.google.cloud.dataflow.worker.runners.worker.DataflowBatchWorkerHarness$Worke

[jira] [Commented] (BEAM-593) Support unblocking run() in FlinkRunner and cancel() and waitUntilFinish() in FlinkRunnerResult

2017-06-16 Thread Etienne Chauchot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-593?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16051640#comment-16051640
 ] 

Etienne Chauchot commented on BEAM-593:
---

Flink termination of streaming pipelines is now OK as far as Nexmark is 
concerned.


> Support unblocking run() in FlinkRunner and cancel() and waitUntilFinish() in 
> FlinkRunnerResult
> ---
>
> Key: BEAM-593
> URL: https://issues.apache.org/jira/browse/BEAM-593
> Project: Beam
>  Issue Type: New Feature
>  Components: runner-flink
>Reporter: Pei He
>Assignee: Aljoscha Krettek
>
> We introduced both functions to PipelineResult.
> Currently, both of them throw UnsupportedOperationException in the Flink runner.
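
For reference, this is the caller-side contract being requested (a minimal sketch 
against the {{PipelineResult}} API; the 10-minute timeout is just an example value):

{code}
import java.io.IOException;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.PipelineResult;
import org.joda.time.Duration;

class PipelineController {

  /** run() should return without blocking; waitUntilFinish()/cancel() should then work on the result. */
  static void runWithTimeout(Pipeline pipeline) throws IOException {
    PipelineResult result = pipeline.run();  // should not block in the Flink runner
    PipelineResult.State state = result.waitUntilFinish(Duration.standardMinutes(10));
    if (state == null || !state.isTerminal()) {
      result.cancel();  // currently throws UnsupportedOperationException on Flink
    }
  }
}
{code}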



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[jira] [Commented] (BEAM-2423) Abstract StateInternalsTest for the different state internals/Runners

2017-06-16 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-2423?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16051715#comment-16051715
 ] 

ASF GitHub Bot commented on BEAM-2423:
--

Github user asfgit closed the pull request at:

https://github.com/apache/beam/pull/3348


> Abstract StateInternalsTest for the different state internals/Runners
> -
>
> Key: BEAM-2423
> URL: https://issues.apache.org/jira/browse/BEAM-2423
> Project: Beam
>  Issue Type: Improvement
>  Components: runner-core, runner-flink
>Reporter: Jingsong Lee
>Assignee: Jingsong Lee
> Fix For: 2.1.0
>
>
> For the tests of InMemoryStateInternals, ApexStateInternals, 
> FlinkStateInternals, SparkStateInternals, etc.
> Have a common base class for the state internals tests that has an abstract 
> method createStateInternals() and all the test methods. An actual 
> implementation would then just derive from that and only implement the method 
> for creating the state internals.
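
A rough sketch of the intended shape (class and method names abbreviated; the 
real base class in runners-core covers many more state kinds than the single 
value-state test shown here):

{code}
import static org.junit.Assert.assertEquals;

import org.apache.beam.runners.core.InMemoryStateInternals;
import org.apache.beam.runners.core.StateInternals;
import org.apache.beam.runners.core.StateNamespaceForTest;
import org.apache.beam.runners.core.StateTags;
import org.apache.beam.sdk.coders.StringUtf8Coder;
import org.apache.beam.sdk.state.ValueState;
import org.junit.Test;

/** All test methods live in the base class; only the factory method is abstract. */
public abstract class AbstractStateInternalsTest {

  protected abstract StateInternals createStateInternals();

  @Test
  public void testValueState() {
    StateInternals internals = createStateInternals();
    ValueState<String> value = internals.state(
        new StateNamespaceForTest("ns"),
        StateTags.value("tag", StringUtf8Coder.of()));
    value.write("hello");
    assertEquals("hello", value.read());
  }
}

/** A runner-specific test then only supplies its own StateInternals. */
class InMemoryVariantTest extends AbstractStateInternalsTest {
  @Override
  protected StateInternals createStateInternals() {
    return InMemoryStateInternals.forKey("key");
  }
}
{code}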



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[2/2] beam git commit: This closes #3348

2017-06-16 Thread aljoscha
This closes #3348


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/d5261d74
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/d5261d74
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/d5261d74

Branch: refs/heads/master
Commit: d5261d74b2422f2876768860587f099f43ab3b0e
Parents: c528fb2 581ee15
Author: Aljoscha Krettek 
Authored: Fri Jun 16 11:49:02 2017 +0200
Committer: Aljoscha Krettek 
Committed: Fri Jun 16 11:49:02 2017 +0200

--
 runners/apex/pom.xml|   7 +
 .../utils/ApexStateInternalsTest.java   | 411 ---
 .../core/InMemoryStateInternalsTest.java|  46 ++-
 .../beam/runners/core/StateInternalsTest.java   |  14 +-
 .../FlinkBroadcastStateInternalsTest.java   | 242 +++
 .../FlinkKeyGroupStateInternalsTest.java| 359 
 .../streaming/FlinkSplitStateInternalsTest.java | 132 +++---
 runners/spark/pom.xml   |   7 +
 .../spark/stateful/SparkStateInternalsTest.java |  66 +++
 9 files changed, 521 insertions(+), 763 deletions(-)
--




[GitHub] beam pull request #3348: [BEAM-2423] Port state internals tests to the new b...

2017-06-16 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/beam/pull/3348


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[1/2] beam git commit: [BEAM-2423] Port state internals tests to the new base class StateInternalsTest

2017-06-16 Thread aljoscha
Repository: beam
Updated Branches:
  refs/heads/master c528fb2f7 -> d5261d74b


[BEAM-2423] Port state internals tests to the new base class StateInternalsTest


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/581ee152
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/581ee152
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/581ee152

Branch: refs/heads/master
Commit: 581ee1520e497fca95e8c4aa75f90050952523d0
Parents: c528fb2
Author: JingsongLi 
Authored: Tue Jun 13 11:26:38 2017 +0800
Committer: Aljoscha Krettek 
Committed: Fri Jun 16 11:48:48 2017 +0200

--
 runners/apex/pom.xml|   7 +
 .../utils/ApexStateInternalsTest.java   | 411 ---
 .../core/InMemoryStateInternalsTest.java|  46 ++-
 .../beam/runners/core/StateInternalsTest.java   |  14 +-
 .../FlinkBroadcastStateInternalsTest.java   | 242 +++
 .../FlinkKeyGroupStateInternalsTest.java| 359 
 .../streaming/FlinkSplitStateInternalsTest.java | 132 +++---
 runners/spark/pom.xml   |   7 +
 .../spark/stateful/SparkStateInternalsTest.java |  66 +++
 9 files changed, 521 insertions(+), 763 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/581ee152/runners/apex/pom.xml
--
diff --git a/runners/apex/pom.xml b/runners/apex/pom.xml
index 4a36bec..d3d4318 100644
--- a/runners/apex/pom.xml
+++ b/runners/apex/pom.xml
@@ -184,6 +184,13 @@
   test-jar
   test
 
+
+
+  org.apache.beam
+  beam-runners-core-java
+  test-jar
+  test
+
   
 
   

http://git-wip-us.apache.org/repos/asf/beam/blob/581ee152/runners/apex/src/test/java/org/apache/beam/runners/apex/translation/utils/ApexStateInternalsTest.java
--
diff --git 
a/runners/apex/src/test/java/org/apache/beam/runners/apex/translation/utils/ApexStateInternalsTest.java
 
b/runners/apex/src/test/java/org/apache/beam/runners/apex/translation/utils/ApexStateInternalsTest.java
index a7e64af..87aa8c2 100644
--- 
a/runners/apex/src/test/java/org/apache/beam/runners/apex/translation/utils/ApexStateInternalsTest.java
+++ 
b/runners/apex/src/test/java/org/apache/beam/runners/apex/translation/utils/ApexStateInternalsTest.java
@@ -18,350 +18,109 @@
 package org.apache.beam.runners.apex.translation.utils;
 
 import static org.junit.Assert.assertEquals;
-import static org.junit.Assert.assertFalse;
 import static org.junit.Assert.assertNotNull;
 import static org.junit.Assert.assertThat;
 
 import com.datatorrent.lib.util.KryoCloneUtils;
-import java.util.Arrays;
-import 
org.apache.beam.runners.apex.translation.utils.ApexStateInternals.ApexStateBackend;
-import 
org.apache.beam.runners.apex.translation.utils.ApexStateInternals.ApexStateInternalsFactory;
-import org.apache.beam.runners.core.StateMerging;
+import org.apache.beam.runners.core.StateInternals;
+import org.apache.beam.runners.core.StateInternalsTest;
 import org.apache.beam.runners.core.StateNamespace;
 import org.apache.beam.runners.core.StateNamespaceForTest;
 import org.apache.beam.runners.core.StateTag;
 import org.apache.beam.runners.core.StateTags;
 import org.apache.beam.sdk.coders.StringUtf8Coder;
-import org.apache.beam.sdk.coders.VarIntCoder;
-import org.apache.beam.sdk.state.BagState;
-import org.apache.beam.sdk.state.CombiningState;
-import org.apache.beam.sdk.state.GroupingState;
-import org.apache.beam.sdk.state.ReadableState;
 import org.apache.beam.sdk.state.ValueState;
-import org.apache.beam.sdk.state.WatermarkHoldState;
-import org.apache.beam.sdk.transforms.Sum;
-import org.apache.beam.sdk.transforms.windowing.BoundedWindow;
-import org.apache.beam.sdk.transforms.windowing.IntervalWindow;
-import org.apache.beam.sdk.transforms.windowing.TimestampCombiner;
 import org.hamcrest.Matchers;
-import org.joda.time.Instant;
-import org.junit.Before;
+import org.junit.Ignore;
 import org.junit.Test;
+import org.junit.runner.RunWith;
+import org.junit.runners.JUnit4;
+import org.junit.runners.Suite;
 
 /**
  * Tests for {@link ApexStateInternals}. This is based on the tests for
- * {@code InMemoryStateInternals}.
+ * {@code StateInternalsTest}.
  */
+@RunWith(Suite.class)
+@Suite.SuiteClasses({
+ApexStateInternalsTest.StandardStateInternalsTests.class,
+ApexStateInternalsTest.OtherTests.class
+})
 public class ApexStateInternalsTest {
-  private static final BoundedWindow WINDOW_1 = new IntervalWindow(new 
Instant(0), new Instant(10));
-  private static final StateNamespace NAMESPACE_1 = new 
StateNamespaceForTest("ns1");
-  private static final StateNamespace NAMESPACE_2 = new 
StateNamespaceForTest("ns2");
-  private static final StateNamesp

Jenkins build became unstable: beam_PostCommit_Java_ValidatesRunner_Spark #2380

2017-06-16 Thread Apache Jenkins Server
See 




Jenkins build is unstable: beam_PostCommit_Java_ValidatesRunner_Dataflow #3374

2017-06-16 Thread Apache Jenkins Server
See 




Jenkins build is back to stable : beam_PostCommit_Java_ValidatesRunner_Spark #2381

2017-06-16 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PostCommit_Java_JDK_Versions_Test » JDK 1.7 (latest),beam #92

2017-06-16 Thread Apache Jenkins Server
See 


Changes:

[aljoscha.krettek] [BEAM-2423] Port state internals tests to the new base class

--
[...truncated 1.56 MB...]
  [javadoc] OutputT extends Object declared in class BeamFnDataReadRunner
  [javadoc] 
:62:
 error: cannot find symbol
  [javadoc]   private CompletableFuture readFuture;
  [javadoc]   ^
  [javadoc]   symbol:   class CompletableFuture
  [javadoc]   location: class BeamFnDataReadRunner
  [javadoc]   where OutputT is a type-variable:
  [javadoc] OutputT extends Object declared in class BeamFnDataReadRunner
  [javadoc] 
:66:
 error: cannot find symbol
  [javadoc]   Supplier processBundleInstructionIdSupplier,
  [javadoc]   ^
  [javadoc]   symbol:   class Supplier
  [javadoc]   location: class BeamFnDataReadRunner
  [javadoc]   where OutputT is a type-variable:
  [javadoc] OutputT extends Object declared in class BeamFnDataReadRunner
  [javadoc] 
:51:
 warning - Tag @link: reference not found: DoFn.FinishBundle @FinishBundle
  [javadoc] 
:45:
 warning - Tag @link: reference not found: DoFn.OnTimer @OnTimer
  [javadoc] 
:39:
 warning - Tag @link: reference not found: DoFn.ProcessElement @ProcessElement
  [javadoc] 
:39:
 warning - Tag @link: reference not found: DoFn.ProcessContext
  [javadoc] 
:33:
 warning - Tag @link: reference not found: DoFn.StartBundle @StartBundle
  [javadoc] 
:33:
 warning - Tag @link: reference not found: DoFn.StartBundle @StartBundle
  [javadoc] 
:39:
 warning - Tag @link: reference not found: DoFn.ProcessElement @ProcessElement
  [javadoc] 
:39:
 warning - Tag @link: reference not found: DoFn.ProcessContext
  [javadoc] 
:45:
 warning - Tag @link: reference not found: DoFn.OnTimer @OnTimer
  [javadoc] 
:51:
 warning - Tag @link: reference not found: DoFn.FinishBundle @FinishBundle
  [javadoc] 
:34:
 warning - Tag @link: reference not found: Source.Reader
  [javadoc] 
:292:
 warning - Tag @link: reference not found: UnboundedSource.CheckpointMark
  [javadoc] 


Jenkins build is back to normal : beam_PostCommit_Java_JDK_Versions_Test » OpenJDK 7 (on Ubuntu only),beam #92

2017-06-16 Thread Apache Jenkins Server
See 




[jira] [Commented] (BEAM-2328) Introduce Apache Tika Input component

2017-06-16 Thread Sergey Beryozkin (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-2328?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16051834#comment-16051834
 ] 

Sergey Beryozkin commented on BEAM-2328:


Hi all,
The initial cleanup of the 'tikaio' branch is now complete (with thanks to JB) and 
the commits have been squashed, so I'm now proceeding to create the first PR. I'd 
like to ask JB to review it; feedback from the rest of the team is also welcome.
[~talli...@mitre.org] Hi Tim, I hope that if the team accepts this PR we can get 
TikaReader improved further :-). I'm not sure yet whether more work is needed for 
better reporting of the embedded attachments inside a given PDF etc., or whether 
further ParserContext customizations will be required; the input metadata and 
TikaConfig are covered, though. Concatenating multiple SAX content bits into 
minimum-length fragments can optionally be supported later on if needed.

thanks 

> Introduce Apache Tika Input component
> -
>
> Key: BEAM-2328
> URL: https://issues.apache.org/jira/browse/BEAM-2328
> Project: Beam
>  Issue Type: New Feature
>  Components: sdk-ideas, sdk-java-extensions
>Reporter: Sergey Beryozkin
>Assignee: Sergey Beryozkin
> Fix For: 2.1.0
>
>
> Apache Tika is a popular project that offers extensive support for parsing a 
> variety of file formats. It is used in many projects, including Lucene and 
> Elasticsearch. 
> Supporting a Tika Input (Read) at the Beam level would be of major interest 
> to many users.
> PR is to follow



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[GitHub] beam pull request #3378: [BEAM-2328] Add TikaIO component

2017-06-16 Thread sberyozkin
GitHub user sberyozkin opened a pull request:

https://github.com/apache/beam/pull/3378

[BEAM-2328] Add TikaIO component

R: @jbonofre

Adding TikaSource and TikaReader tests
Updating TikaReader to use TikaInputStream as suggested by Tim Allison
Supporting the customization of TikaConfig
Cleanup:
Moving a 'tika' above 'xml' in io/pom.xml to keep the correct order
Renaming TikaInput to TikaIO, adding Read.withOptions, throwing 
NoSuchElementException if the current is null
Removing redundant test annotations
Fixing TikaIO JavaDoc typo


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/sberyozkin/beam tikaio

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/3378.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #3378


commit 8c63d91c0a088e2d90d5572051f736f24ea338b5
Author: Sergey Beryozkin 
Date:   2017-05-25T15:47:59Z

Adding TikaIO component
Enforcing that start is called before advance
Adding TikaSource and TikaReader tests
Updating TikaReader to use TikaInputStream as suggested by Tim Allison
Supporting the customization of TikaConfig
Moving a 'tika' above 'xml' in io/pom.xml to keep the correct order
Renaming TikaInput to TikaIO, adding Read.withOptions, throwing 
NoSuchElementException if the current is null
Removing redundant test annotations
Fixing TikaIO JavaDoc typo




---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[jira] [Commented] (BEAM-2328) Introduce Apache Tika Input component

2017-06-16 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-2328?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16051839#comment-16051839
 ] 

ASF GitHub Bot commented on BEAM-2328:
--

GitHub user sberyozkin opened a pull request:

https://github.com/apache/beam/pull/3378

[BEAM-2328] Add TikaIO component

R: @jbonofre

Adding TikaSource and TikaReader tests
Updating TikaReader to use TikaInputStream as suggested by Tim Allison
Supporting the customization of TikaConfig
Cleanup:
Moving a 'tika' above 'xml' in io/pom.xml to keep the correct order
Renaming TikaInput to TikaIO, adding Read.withOptions, throwing 
NoSuchElementException if the current is null
Removing redundant test annotations
Fixing TikaIO JavaDoc typo


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/sberyozkin/beam tikaio

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/3378.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #3378


commit 8c63d91c0a088e2d90d5572051f736f24ea338b5
Author: Sergey Beryozkin 
Date:   2017-05-25T15:47:59Z

Adding TikaIO component
Enforcing that start is called before advance
Adding TikaSource and TikaReader tests
Updating TikaReader to use TikaInputStream as suggested by Tim Allison
Supporting the customization of TikaConfig
Moving a 'tika' above 'xml' in io/pom.xml to keep the correct order
Renaming TikaInput to TikaIO, adding Read.withOptions, throwing 
NoSuchElementException if the current is null
Removing redundant test annotations
Fixing TikaIO JavaDoc typo




> Introduce Apache Tika Input component
> -
>
> Key: BEAM-2328
> URL: https://issues.apache.org/jira/browse/BEAM-2328
> Project: Beam
>  Issue Type: New Feature
>  Components: sdk-ideas, sdk-java-extensions
>Reporter: Sergey Beryozkin
>Assignee: Sergey Beryozkin
> Fix For: 2.1.0
>
>
> Apache Tika is a popular project that offers extensive support for parsing a 
> variety of file formats. It is used in many projects, including Lucene and 
> Elasticsearch. 
> Supporting a Tika Input (Read) at the Beam level would be of major interest 
> to many users.
> PR is to follow



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


Build failed in Jenkins: beam_PostCommit_Java_MavenInstall_Windows #140

2017-06-16 Thread Apache Jenkins Server
See 


Changes:

[aljoscha.krettek] [BEAM-2423] Port state internals tests to the new base class

--
[...truncated 2.64 MB...]
2017-06-16T12:38:34.361 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/sonatype/gossip/gossip/1.8/gossip-1.8.pom
 (12 KB at 1125.6 KB/sec)
2017-06-16T12:38:34.365 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/sonatype/forge/forge-parent/9/forge-parent-9.pom
2017-06-16T12:38:34.378 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/sonatype/forge/forge-parent/9/forge-parent-9.pom
 (13 KB at 987.2 KB/sec)
2017-06-16T12:38:34.381 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/sonatype/gossip/gossip-core/1.8/gossip-core-1.8.pom
2017-06-16T12:38:34.390 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/sonatype/gossip/gossip-core/1.8/gossip-core-1.8.pom
 (3 KB at 241.4 KB/sec)
2017-06-16T12:38:34.394 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/sonatype/gossip/gossip-bootstrap/1.8/gossip-bootstrap-1.8.pom
2017-06-16T12:38:34.401 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/sonatype/gossip/gossip-bootstrap/1.8/gossip-bootstrap-1.8.pom
 (2 KB at 224.7 KB/sec)
2017-06-16T12:38:34.405 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/com/google/guava/guava/14.0.1/guava-14.0.1.pom
2017-06-16T12:38:34.413 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/com/google/guava/guava/14.0.1/guava-14.0.1.pom
 (6 KB at 656.2 KB/sec)
2017-06-16T12:38:34.417 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/com/google/guava/guava-parent/14.0.1/guava-parent-14.0.1.pom
2017-06-16T12:38:34.425 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/com/google/guava/guava-parent/14.0.1/guava-parent-14.0.1.pom
 (3 KB at 311.9 KB/sec)
2017-06-16T12:38:34.428 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/codehaus/plexus/plexus-classworlds/2.4.2/plexus-classworlds-2.4.2.pom
2017-06-16T12:38:34.436 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/codehaus/plexus/plexus-classworlds/2.4.2/plexus-classworlds-2.4.2.pom
 (4 KB at 428.3 KB/sec)
2017-06-16T12:38:34.441 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/codehaus/plexus/plexus-interpolation/1.16/plexus-interpolation-1.16.pom
2017-06-16T12:38:34.447 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/codehaus/plexus/plexus-interpolation/1.16/plexus-interpolation-1.16.pom
 (2 KB at 167.2 KB/sec)
2017-06-16T12:38:34.452 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/eclipse/aether/aether-api/0.9.0.M2/aether-api-0.9.0.M2.pom
2017-06-16T12:38:34.459 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/eclipse/aether/aether-api/0.9.0.M2/aether-api-0.9.0.M2.pom
 (2 KB at 242.3 KB/sec)
2017-06-16T12:38:34.464 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/codehaus/gmaven/gmaven-adapter-api/2.0/gmaven-adapter-api-2.0.pom
2017-06-16T12:38:34.479 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/codehaus/gmaven/gmaven-adapter-api/2.0/gmaven-adapter-api-2.0.pom
 (2 KB at 117.6 KB/sec)
2017-06-16T12:38:34.484 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/codehaus/gmaven/gmaven-adapter-impl/2.0/gmaven-adapter-impl-2.0.pom
2017-06-16T12:38:34.494 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/codehaus/gmaven/gmaven-adapter-impl/2.0/gmaven-adapter-impl-2.0.pom
 (3 KB at 265.4 KB/sec)
2017-06-16T12:38:34.499 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/codehaus/groovy/groovy-all/2.1.5/groovy-all-2.1.5.pom
2017-06-16T12:38:34.511 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/codehaus/groovy/groovy-all/2.1.5/groovy-all-2.1.5.pom
 (18 KB at 1468.0 KB/sec)
2017-06-16T12:38:34.515 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/ant/ant/1.8.4/ant-1.8.4.pom
2017-06-16T12:38:34.528 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/ant/ant/1.8.4/ant-1.8.4.pom (10 
KB at 725.2 KB/sec)
2017-06-16T12:38:34.532 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/ant/ant-parent/1.8.4/ant-parent-1.8.4.pom
2017-06-16T12:38:34.540 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/ant/ant-parent/1.8.4/ant-parent-1.8.4.pom
 (5 KB at 556.8 KB/sec)
2017-06-16T12:38:34.544 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/ant/ant-launcher/1.8.4/ant-launcher-1.8.4.pom
2017-06-16T12:38:34.555 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/ant/ant-launcher/1.8.4/ant-launcher-1.8.4.pom
 (3 KB at 209.6 KB/sec)
2017-06-16T12:38:34.559 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/fusesource/jansi/jansi/1.11/jansi-1.11.pom
2017-06-16T12:38:34.569 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/fusesource/jansi/jansi/1.11/jansi-1.11.pom
 (4

Jenkins build is still unstable: beam_PostCommit_Java_ValidatesRunner_Dataflow #3375

2017-06-16 Thread Apache Jenkins Server
See 




[GitHub] beam pull request #3379: [BEAM] Fixed handling of --use_public_ips.

2017-06-16 Thread mdvorsky
GitHub user mdvorsky opened a pull request:

https://github.com/apache/beam/pull/3379

[BEAM] Fixed handling of --use_public_ips.

R: @aaltay 

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/mdvorsky/incubator-beam public_ips

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/3379.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #3379


commit 1350609d65c6e4dadcd9af022e11a9baa4237988
Author: Marian Dvorsky 
Date:   2017-06-16T13:57:13Z

Fixed handling of use_public_ips. Added test.

commit 88022ee5d7b7f86766f19cc4bcec3477c6db2635
Author: Marian Dvorsky 
Date:   2017-06-16T14:03:21Z

Fixed lint error.




---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[1/2] beam git commit: This closes #3369

2017-06-16 Thread tgroh
Repository: beam
Updated Branches:
  refs/heads/master d5261d74b -> 901e96813


This closes #3369


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/901e9681
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/901e9681
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/901e9681

Branch: refs/heads/master
Commit: 901e96813d1a3bcc075e7c540840888f4c37bf3f
Parents: d5261d7 027d89c
Author: Thomas Groh 
Authored: Fri Jun 16 09:33:13 2017 -0700
Committer: Thomas Groh 
Committed: Fri Jun 16 09:33:13 2017 -0700

--
 .../java/org/apache/beam/sdk/transforms/CombineTest.java | 8 
 1 file changed, 4 insertions(+), 4 deletions(-)
--




[GitHub] beam pull request #3369: Use the appropriate context in CombineTest Coder

2017-06-16 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/beam/pull/3369


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[2/2] beam git commit: Use the appropriate context in CombineTest Coder

2017-06-16 Thread tgroh
Use the appropriate context in CombineTest Coder

The Accumulator was improperly decoding the seed value in the outer
context, when it should be handled in the nested context.
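
For context on the distinction (a small standalone illustration, not part of the 
change itself): in the outer context {{StringUtf8Coder}} writes the raw bytes and a 
decoder consumes the rest of the stream, so only the last of several concatenated 
values may use it; earlier values need the length-prefixed nested context.

{code}
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import org.apache.beam.sdk.coders.Coder;
import org.apache.beam.sdk.coders.StringUtf8Coder;

public class NestedContextDemo {
  public static void main(String[] args) throws IOException {
    StringUtf8Coder coder = StringUtf8Coder.of();
    ByteArrayOutputStream out = new ByteArrayOutputStream();

    // The first value is length-prefixed (nested); the trailing value may own the rest of the stream.
    coder.encode("seed", out, Coder.Context.NESTED);
    coder.encode("value", out, Coder.Context.OUTER);

    ByteArrayInputStream in = new ByteArrayInputStream(out.toByteArray());
    System.out.println(coder.decode(in, Coder.Context.NESTED));  // prints "seed"
    System.out.println(coder.decode(in, Coder.Context.OUTER));   // prints "value"
  }
}
{code}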


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/027d89c9
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/027d89c9
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/027d89c9

Branch: refs/heads/master
Commit: 027d89c91f8851195dedbab2879699738376ff77
Parents: d5261d7
Author: Thomas Groh 
Authored: Thu Jun 15 13:54:47 2017 -0700
Committer: Thomas Groh 
Committed: Fri Jun 16 09:33:13 2017 -0700

--
 .../java/org/apache/beam/sdk/transforms/CombineTest.java | 8 
 1 file changed, 4 insertions(+), 4 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/027d89c9/sdks/java/core/src/test/java/org/apache/beam/sdk/transforms/CombineTest.java
--
diff --git 
a/sdks/java/core/src/test/java/org/apache/beam/sdk/transforms/CombineTest.java 
b/sdks/java/core/src/test/java/org/apache/beam/sdk/transforms/CombineTest.java
index 6a4348d..c4ba62d 100644
--- 
a/sdks/java/core/src/test/java/org/apache/beam/sdk/transforms/CombineTest.java
+++ 
b/sdks/java/core/src/test/java/org/apache/beam/sdk/transforms/CombineTest.java
@@ -1000,7 +1000,7 @@ public class CombineTest implements Serializable {
   @Override
   public void encode(Accumulator accumulator, OutputStream outStream, 
Coder.Context context)
   throws CoderException, IOException {
-StringUtf8Coder.of().encode(accumulator.seed, outStream, context);
+StringUtf8Coder.of().encode(accumulator.seed, outStream, 
context.nested());
 StringUtf8Coder.of().encode(accumulator.value, outStream, context);
   }
 
@@ -1012,9 +1012,9 @@ public class CombineTest implements Serializable {
   @Override
   public Accumulator decode(InputStream inStream, Coder.Context 
context)
   throws CoderException, IOException {
-return new Accumulator(
-StringUtf8Coder.of().decode(inStream, context),
-StringUtf8Coder.of().decode(inStream, context));
+String seed = StringUtf8Coder.of().decode(inStream, 
context.nested());
+String value = StringUtf8Coder.of().decode(inStream, context);
+return new Accumulator(seed, value);
   }
 };
   }



[2/2] beam git commit: This closes #3379

2017-06-16 Thread altay
This closes #3379


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/f2a32b2c
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/f2a32b2c
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/f2a32b2c

Branch: refs/heads/master
Commit: f2a32b2ca61430d3cc2085166b758e7637d69a28
Parents: 901e968 c597a02
Author: Ahmet Altay 
Authored: Fri Jun 16 09:43:42 2017 -0700
Committer: Ahmet Altay 
Committed: Fri Jun 16 09:43:42 2017 -0700

--
 .../apache_beam/options/pipeline_options.py |  9 +++-
 .../runners/dataflow/internal/apiclient_test.py | 24 
 2 files changed, 32 insertions(+), 1 deletion(-)
--




[1/2] beam git commit: Fixed handling of use_public_ips. Added test.

2017-06-16 Thread altay
Repository: beam
Updated Branches:
  refs/heads/master 901e96813 -> f2a32b2ca


Fixed handling of use_public_ips. Added test.


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/c597a020
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/c597a020
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/c597a020

Branch: refs/heads/master
Commit: c597a020c87f272a5920bf29195d9e314af2f828
Parents: 901e968
Author: Marian Dvorsky 
Authored: Fri Jun 16 15:57:13 2017 +0200
Committer: Ahmet Altay 
Committed: Fri Jun 16 09:43:35 2017 -0700

--
 .../apache_beam/options/pipeline_options.py |  9 +++-
 .../runners/dataflow/internal/apiclient_test.py | 24 
 2 files changed, 32 insertions(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/c597a020/sdks/python/apache_beam/options/pipeline_options.py
--
diff --git a/sdks/python/apache_beam/options/pipeline_options.py 
b/sdks/python/apache_beam/options/pipeline_options.py
index 283b340..8644e51 100644
--- a/sdks/python/apache_beam/options/pipeline_options.py
+++ b/sdks/python/apache_beam/options/pipeline_options.py
@@ -465,7 +465,14 @@ class WorkerOptions(PipelineOptions):
 parser.add_argument(
 '--use_public_ips',
 default=None,
-help='Whether to assign public IP addresses to the worker machines.')
+action='store_true',
+help='Whether to assign public IP addresses to the worker VMs.')
+parser.add_argument(
+'--no_use_public_ips',
+dest='use_public_ips',
+default=None,
+action='store_false',
+help='Whether to assign only private IP addresses to the worker VMs.')
 
   def validate(self, validator):
 errors = []

http://git-wip-us.apache.org/repos/asf/beam/blob/c597a020/sdks/python/apache_beam/runners/dataflow/internal/apiclient_test.py
--
diff --git 
a/sdks/python/apache_beam/runners/dataflow/internal/apiclient_test.py 
b/sdks/python/apache_beam/runners/dataflow/internal/apiclient_test.py
index 67cf77f..55211f7 100644
--- a/sdks/python/apache_beam/runners/dataflow/internal/apiclient_test.py
+++ b/sdks/python/apache_beam/runners/dataflow/internal/apiclient_test.py
@@ -122,6 +122,30 @@ class UtilTest(unittest.TestCase):
 self.assertEqual(
 metric_update.floatingPointMean.count.lowBits, accumulator.count)
 
+  def test_default_ip_configuration(self):
+pipeline_options = PipelineOptions(
+['--temp_location', 'gs://any-location/temp'])
+env = apiclient.Environment([], pipeline_options, '2.0.0')
+self.assertEqual(env.proto.workerPools[0].ipConfiguration, None)
+
+  def test_public_ip_configuration(self):
+pipeline_options = PipelineOptions(
+['--temp_location', 'gs://any-location/temp',
+ '--use_public_ips'])
+env = apiclient.Environment([], pipeline_options, '2.0.0')
+self.assertEqual(
+env.proto.workerPools[0].ipConfiguration,
+dataflow.WorkerPool.IpConfigurationValueValuesEnum.WORKER_IP_PUBLIC)
+
+  def test_private_ip_configuration(self):
+pipeline_options = PipelineOptions(
+['--temp_location', 'gs://any-location/temp',
+ '--no_use_public_ips'])
+env = apiclient.Environment([], pipeline_options, '2.0.0')
+self.assertEqual(
+env.proto.workerPools[0].ipConfiguration,
+dataflow.WorkerPool.IpConfigurationValueValuesEnum.WORKER_IP_PRIVATE)
+
 
 if __name__ == '__main__':
   unittest.main()



[GitHub] beam pull request #3379: [BEAM] Fixed handling of --use_public_ips.

2017-06-16 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/beam/pull/3379


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[2/2] beam git commit: This closes #3370

2017-06-16 Thread altay
This closes #3370


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/54f30789
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/54f30789
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/54f30789

Branch: refs/heads/master
Commit: 54f30789165a49d92479b54cf3dac9d4a063e366
Parents: f2a32b2 5aee624
Author: Ahmet Altay 
Authored: Fri Jun 16 09:57:16 2017 -0700
Committer: Ahmet Altay 
Committed: Fri Jun 16 09:57:16 2017 -0700

--
 .../runners/direct/bundle_factory.py|  2 +-
 .../apache_beam/runners/direct/executor.py  | 64 
 .../runners/direct/transform_evaluator.py   | 39 +++-
 3 files changed, 77 insertions(+), 28 deletions(-)
--




[GitHub] beam pull request #3370: [BEAM-1265] Introduce pending bundles and RootBundl...

2017-06-16 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/beam/pull/3370


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[1/2] beam git commit: Introduce pending bundles and RootBundleProvider in DirectRunner

2017-06-16 Thread altay
Repository: beam
Updated Branches:
  refs/heads/master f2a32b2ca -> 54f307891


Introduce pending bundles and RootBundleProvider in DirectRunner


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/5aee624c
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/5aee624c
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/5aee624c

Branch: refs/heads/master
Commit: 5aee624cbc2815efaf04c7e4854138370a45a1f6
Parents: f2a32b2
Author: Charles Chen 
Authored: Thu Jun 15 14:27:47 2017 -0700
Committer: Ahmet Altay 
Committed: Fri Jun 16 09:56:51 2017 -0700

--
 .../runners/direct/bundle_factory.py|  2 +-
 .../apache_beam/runners/direct/executor.py  | 64 
 .../runners/direct/transform_evaluator.py   | 39 +++-
 3 files changed, 77 insertions(+), 28 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/5aee624c/sdks/python/apache_beam/runners/direct/bundle_factory.py
--
diff --git a/sdks/python/apache_beam/runners/direct/bundle_factory.py 
b/sdks/python/apache_beam/runners/direct/bundle_factory.py
index ed00b03..0182b4c 100644
--- a/sdks/python/apache_beam/runners/direct/bundle_factory.py
+++ b/sdks/python/apache_beam/runners/direct/bundle_factory.py
@@ -108,7 +108,7 @@ class _Bundle(object):
 self._initial_windowed_value.windows)
 
   def __init__(self, pcollection, stacked=True):
-assert isinstance(pcollection, pvalue.PCollection)
+assert isinstance(pcollection, (pvalue.PBegin, pvalue.PCollection))
 self._pcollection = pcollection
 self._elements = []
 self._stacked = stacked

http://git-wip-us.apache.org/repos/asf/beam/blob/5aee624c/sdks/python/apache_beam/runners/direct/executor.py
--
diff --git a/sdks/python/apache_beam/runners/direct/executor.py 
b/sdks/python/apache_beam/runners/direct/executor.py
index 86db291..eff2d3c 100644
--- a/sdks/python/apache_beam/runners/direct/executor.py
+++ b/sdks/python/apache_beam/runners/direct/executor.py
@@ -20,6 +20,7 @@
 from __future__ import absolute_import
 
 import collections
+import itertools
 import logging
 import Queue
 import sys
@@ -250,12 +251,12 @@ class TransformExecutor(_ExecutorService.CallableTask):
   """
 
   def __init__(self, transform_evaluator_registry, evaluation_context,
-   input_bundle, applied_transform, completion_callback,
+   input_bundle, applied_ptransform, completion_callback,
transform_evaluation_state):
 self._transform_evaluator_registry = transform_evaluator_registry
 self._evaluation_context = evaluation_context
 self._input_bundle = input_bundle
-self._applied_transform = applied_transform
+self._applied_ptransform = applied_ptransform
 self._completion_callback = completion_callback
 self._transform_evaluation_state = transform_evaluation_state
 self._side_input_values = {}
@@ -264,11 +265,11 @@ class TransformExecutor(_ExecutorService.CallableTask):
 
   def call(self):
 self._call_count += 1
-assert self._call_count <= (1 + len(self._applied_transform.side_inputs))
-metrics_container = MetricsContainer(self._applied_transform.full_label)
+assert self._call_count <= (1 + len(self._applied_ptransform.side_inputs))
+metrics_container = MetricsContainer(self._applied_ptransform.full_label)
 scoped_metrics_container = ScopedMetricsContainer(metrics_container)
 
-for side_input in self._applied_transform.side_inputs:
+for side_input in self._applied_ptransform.side_inputs:
   if side_input not in self._side_input_values:
 has_result, value = (
 self._evaluation_context.get_value_or_schedule_after_output(
@@ -280,11 +281,11 @@ class TransformExecutor(_ExecutorService.CallableTask):
 self._side_input_values[side_input] = value
 
 side_input_values = [self._side_input_values[side_input]
- for side_input in self._applied_transform.side_inputs]
+ for side_input in 
self._applied_ptransform.side_inputs]
 
 try:
-  evaluator = self._transform_evaluator_registry.for_application(
-  self._applied_transform, self._input_bundle,
+  evaluator = self._transform_evaluator_registry.get_evaluator(
+  self._applied_ptransform, self._input_bundle,
   side_input_values, scoped_metrics_container)
 
   if self._input_bundle:
@@ -298,13 +299,13 @@ class TransformExecutor(_ExecutorService.CallableTask):
   if self._evaluation_context.has_cache:
 for uncommitted_bundle in result.uncommitted_output_bundles:
   self._evaluation_context.append_to_cache(
-  self._applie

[jira] [Commented] (BEAM-1265) Add streaming support to Python DirectRunner

2017-06-16 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-1265?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16052133#comment-16052133
 ] 

ASF GitHub Bot commented on BEAM-1265:
--

Github user asfgit closed the pull request at:

https://github.com/apache/beam/pull/3370


> Add streaming support to Python DirectRunner
> 
>
> Key: BEAM-1265
> URL: https://issues.apache.org/jira/browse/BEAM-1265
> Project: Beam
>  Issue Type: New Feature
>  Components: sdk-py
>Reporter: Ahmet Altay
>Assignee: Charles Chen
>
> Continue the work started in https://issues.apache.org/jira/browse/BEAM-428



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[GitHub] beam pull request #3326: Beam-2383 Round function

2017-06-16 Thread app-tarush
Github user app-tarush closed the pull request at:

https://github.com/apache/beam/pull/3326


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


Jenkins build is back to stable : beam_PostCommit_Java_ValidatesRunner_Dataflow #3376

2017-06-16 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PostCommit_Java_MavenInstall_Windows #141

2017-06-16 Thread Apache Jenkins Server
See 


Changes:

[tgroh] Use the appropriate context in CombineTest Coder

[altay] Fixed handling of use_public_ips. Added test.

[altay] Introduce pending bundles and RootBundleProvider in DirectRunner

--
[...truncated 1.51 MB...]
2017-06-16T18:17:31.278 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/sonatype/oss/oss-parent/4/oss-parent-4.pom
2017-06-16T18:17:31.305 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/sonatype/oss/oss-parent/4/oss-parent-4.pom
 (4 KB at 144.7 KB/sec)
2017-06-16T18:17:31.312 [INFO] Downloading: 
https://repository.apache.org/content/repositories/snapshots/io/netty/netty-codec-http2/maven-metadata.xml
2017-06-16T18:17:31.312 [INFO] Downloading: 
https://oss.sonatype.org/content/repositories/snapshots/io/netty/netty-codec-http2/maven-metadata.xml
2017-06-16T18:17:37.215 [INFO] Downloaded: 
https://oss.sonatype.org/content/repositories/snapshots/io/netty/netty-codec-http2/maven-metadata.xml
 (547 B at 0.1 KB/sec)
[JENKINS] Archiving disabled
2017-06-16T18:17:41.794 [INFO]  
   
2017-06-16T18:17:41.794 [INFO] 

2017-06-16T18:17:41.794 [INFO] Skipping Apache Beam :: Parent
2017-06-16T18:17:41.794 [INFO] This project has been banned from the build due 
to previous failures.
2017-06-16T18:17:41.794 [INFO] 

[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
2017-06-16T18:18:37.380 [INFO] 

2017-06-16T18:18:37.380 [INFO] Reactor Summary:
2017-06-16T18:18:37.380 [INFO] 
2017-06-16T18:18:37.381 [INFO] Apache Beam :: Parent 
.. SUCCESS [ 27.199 s]
2017-06-16T18:18:37.381 [INFO] Apache Beam :: SDKs :: Java :: Build Tools 
. SUCCESS [ 16.688 s]
2017-06-16T18:18:37.381 [INFO] Apache Beam :: SDKs 
 SUCCESS [  8.643 s]
2017-06-16T18:18:37.381 [INFO] Apache Beam :: SDKs :: Common 
.. SUCCESS [  5.339 s]
2017-06-16T18:18:37.381 [INFO] Apache Beam :: SDKs :: Common :: Runner API 
 SUCCESS [ 23.245 s]
2017-06-16T18:18:37.381 [INFO] Apache Beam :: SDKs :: Common :: Fn API 
 SUCCESS [ 24.543 s]
2017-06-16T18:18:37.381 [INFO] Apache Beam :: SDKs :: Java 
 SUCCESS [  5.915 s]
2017-06-16T18:18:37.381 [INFO] Apache Beam :: SDKs :: Java :: Core 
 SUCCESS [04:13 min]
2017-06-16T18:18:37.381 [INFO] Apache Beam :: Runners 
. SUCCESS [  7.169 s]
2017-06-16T18:18:37.381 [INFO] Apache Beam :: Runners :: Core Construction Java 
... SUCCESS [ 47.885 s]
2017-06-16T18:18:37.381 [INFO] Apache Beam :: Runners :: Core Java 
 SUCCESS [01:11 min]
2017-06-16T18:18:37.381 [INFO] Apache Beam :: Runners :: Direct Java 
.. SUCCESS [03:29 min]
2017-06-16T18:18:37.381 [INFO] Apache Beam :: SDKs :: Java :: IO 
.. SUCCESS [  6.163 s]
2017-06-16T18:18:37.382 [INFO] Apache Beam :: SDKs :: Java :: IO :: Common 
 SUCCESS [ 10.861 s]
2017-06-16T18:18:37.382 [INFO] Apache Beam :: SDKs :: Java :: IO :: Cassandra 
. SUCCESS [ 42.754 s]
2017-06-16T18:18:37.382 [INFO] Apache Beam :: SDKs :: Java :: IO :: 
Elasticsearch . SUCCESS [ 56.752 s]
2017-06-16T18:18:37.382 [INFO] Apache Beam :: S

Build failed in Jenkins: beam_PostCommit_Java_JDK_Versions_Test » JDK 1.7 (latest),beam #93

2017-06-16 Thread Apache Jenkins Server
See 


Changes:

[tgroh] Use the appropriate context in CombineTest Coder

[altay] Fixed handling of use_public_ips. Added test.

[altay] Introduce pending bundles and RootBundleProvider in DirectRunner

--
[...truncated 1.58 MB...]
  [javadoc] :309: warning - Tag @link: reference not found: BoundedReader#advance
  [javadoc] :309: warning - Tag @see: reference not found: BoundedReader#advance
  [javadoc] :314: warning - Tag @link: can't find withFilenamePolicy(FilenamePolicy) in org.apache.beam.sdk.io.TextIO.Write
  [javadoc] :295: warning - Tag @link: can't find withFilenamePolicy(FilenamePolicy) in org.apache.beam.sdk.io.TextIO.Write
  [javadoc] :346: warning - Tag @link: can't find withFilenamePolicy(FilenamePolicy) in org.apache.beam.sdk.io.TextIO.Write
  [javadoc] :357: warning - Tag @link: can't find withFilenamePolicy(FilenamePolicy) in org.apache.beam.sdk.io.TextIO.Write
  [javadoc] :376: warning - Tag @link: reference not found: Mapper
  [javadoc] :376: warning - Tag @link: reference not found: MappingManager
  [javadoc] :376: warning - Tag @link: reference not found: Mapper
  [javadoc] :376: warning - Tag @link: reference not found: MappingManager
  [javadoc] :376: warning - Tag @link: reference not found: Mapper#save(Object)
  [javadoc] :136: warning - Tag @link: reference not found: DestinationT
  [javadoc] :136: warning - Tag @link: reference not found: DestinationT
  [javadoc] :136: warning - Tag @link: reference not found: DestinationT
  [javadoc] :472: warning - Tag @link: reference not found: Query.Builder#setLimit
  [javadoc] :110: warning - Tag @link: reference not found: UnboundedSource.UnboundedReader
  [ja

[3/3] beam git commit: [BEAM-1347] Break apart ProcessBundleHandler to use service locator pattern based upon URNs.

2017-06-16 Thread lcwik
[BEAM-1347] Break apart ProcessBundleHandler to use service locator pattern 
based upon URNs.

This closes #3375


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/aa555f59
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/aa555f59
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/aa555f59

Branch: refs/heads/master
Commit: aa555f593a3265ca015214a6059372aa6b0ce0af
Parents: 54f3078 c9c1a05
Author: Luke Cwik 
Authored: Fri Jun 16 12:03:35 2017 -0700
Committer: Luke Cwik 
Committed: Fri Jun 16 12:03:35 2017 -0700

--
 sdks/java/harness/pom.xml   |   6 +
 .../harness/control/ProcessBundleHandler.java   | 293 +++
 .../beam/runners/core/BeamFnDataReadRunner.java |  70 ++-
 .../runners/core/BeamFnDataWriteRunner.java |  67 ++-
 .../beam/runners/core/BoundedSourceRunner.java  |  74 ++-
 .../beam/runners/core/DoFnRunnerFactory.java| 182 +++
 .../runners/core/PTransformRunnerFactory.java   |  81 +++
 .../control/ProcessBundleHandlerTest.java   | 521 +++
 .../runners/core/BeamFnDataReadRunnerTest.java  | 112 +++-
 .../runners/core/BeamFnDataWriteRunnerTest.java | 120 -
 .../runners/core/BoundedSourceRunnerTest.java   | 124 -
 .../runners/core/DoFnRunnerFactoryTest.java | 209 
 12 files changed, 1134 insertions(+), 725 deletions(-)
--




[1/3] beam git commit: [BEAM-1347] Break apart ProcessBundleHandler to use service locator pattern based upon URNs.

2017-06-16 Thread lcwik
Repository: beam
Updated Branches:
  refs/heads/master 54f307891 -> aa555f593


http://git-wip-us.apache.org/repos/asf/beam/blob/c9c1a05d/sdks/java/harness/src/test/java/org/apache/beam/runners/core/BeamFnDataWriteRunnerTest.java
--
diff --git 
a/sdks/java/harness/src/test/java/org/apache/beam/runners/core/BeamFnDataWriteRunnerTest.java
 
b/sdks/java/harness/src/test/java/org/apache/beam/runners/core/BeamFnDataWriteRunnerTest.java
index a3c874e..64d9ea6 100644
--- 
a/sdks/java/harness/src/test/java/org/apache/beam/runners/core/BeamFnDataWriteRunnerTest.java
+++ 
b/sdks/java/harness/src/test/java/org/apache/beam/runners/core/BeamFnDataWriteRunnerTest.java
@@ -20,31 +20,48 @@ package org.apache.beam.runners.core;
 
 import static org.apache.beam.sdk.util.WindowedValue.valueInGlobalWindow;
 import static org.hamcrest.Matchers.contains;
+import static org.hamcrest.Matchers.containsInAnyOrder;
+import static org.junit.Assert.assertFalse;
 import static org.junit.Assert.assertThat;
 import static org.junit.Assert.assertTrue;
+import static org.junit.Assert.fail;
 import static org.mockito.Matchers.any;
 import static org.mockito.Matchers.eq;
 import static org.mockito.Mockito.verify;
 import static org.mockito.Mockito.verifyNoMoreInteractions;
+import static org.mockito.Mockito.verifyZeroInteractions;
 import static org.mockito.Mockito.when;
 
 import com.fasterxml.jackson.databind.ObjectMapper;
+import com.google.common.base.Suppliers;
+import com.google.common.collect.HashMultimap;
+import com.google.common.collect.ImmutableMap;
+import com.google.common.collect.Iterables;
+import com.google.common.collect.Multimap;
 import com.google.protobuf.Any;
 import com.google.protobuf.ByteString;
 import com.google.protobuf.BytesValue;
 import java.io.IOException;
 import java.util.ArrayList;
+import java.util.List;
+import java.util.ServiceLoader;
+import java.util.concurrent.atomic.AtomicBoolean;
 import java.util.concurrent.atomic.AtomicReference;
 import org.apache.beam.fn.harness.data.BeamFnDataClient;
 import org.apache.beam.fn.harness.fn.CloseableThrowingConsumer;
+import org.apache.beam.fn.harness.fn.ThrowingConsumer;
+import org.apache.beam.fn.harness.fn.ThrowingRunnable;
 import org.apache.beam.fn.v1.BeamFnApi;
+import org.apache.beam.runners.core.PTransformRunnerFactory.Registrar;
 import org.apache.beam.runners.dataflow.util.CloudObjects;
 import org.apache.beam.sdk.coders.Coder;
 import org.apache.beam.sdk.coders.StringUtf8Coder;
 import org.apache.beam.sdk.common.runner.v1.RunnerApi;
+import org.apache.beam.sdk.options.PipelineOptionsFactory;
 import org.apache.beam.sdk.transforms.windowing.GlobalWindow;
 import org.apache.beam.sdk.util.WindowedValue;
 import org.apache.beam.sdk.values.KV;
+import org.hamcrest.collection.IsMapContaining;
 import org.junit.Before;
 import org.junit.Test;
 import org.junit.runner.RunWith;
@@ -56,15 +73,18 @@ import org.mockito.MockitoAnnotations;
 /** Tests for {@link BeamFnDataWriteRunner}. */
 @RunWith(JUnit4.class)
 public class BeamFnDataWriteRunnerTest {
-  private static final ObjectMapper OBJECT_MAPPER = new ObjectMapper();
 
+  private static final ObjectMapper OBJECT_MAPPER = new ObjectMapper();
   private static final BeamFnApi.RemoteGrpcPort PORT_SPEC = 
BeamFnApi.RemoteGrpcPort.newBuilder()
   
.setApiServiceDescriptor(BeamFnApi.ApiServiceDescriptor.getDefaultInstance()).build();
   private static final RunnerApi.FunctionSpec FUNCTION_SPEC = 
RunnerApi.FunctionSpec.newBuilder()
   .setParameter(Any.pack(PORT_SPEC)).build();
+  private static final String CODER_ID = "string-coder-id";
  private static final Coder<WindowedValue<String>> CODER =
   WindowedValue.getFullCoder(StringUtf8Coder.of(), 
GlobalWindow.Coder.INSTANCE);
   private static final RunnerApi.Coder CODER_SPEC;
+  private static final String URN = "urn:org.apache.beam:sink:runner:0.1";
+
   static {
 try {
   CODER_SPEC = RunnerApi.Coder.newBuilder().setSpec(
@@ -85,18 +105,93 @@ public class BeamFnDataWriteRunnerTest {
   .setName("out")
   .build();
 
-  @Mock private BeamFnDataClient mockBeamFnDataClientFactory;
+  @Mock private BeamFnDataClient mockBeamFnDataClient;
 
   @Before
   public void setUp() {
 MockitoAnnotations.initMocks(this);
   }
 
+
+  @Test
+  public void testCreatingAndProcessingBeamFnDataWriteRunner() throws 
Exception {
+String bundleId = "57L";
+String inputId = "100L";
+
+Multimap>> consumers = 
HashMultimap.create();
+List<ThrowingRunnable> startFunctions = new ArrayList<>();
+List<ThrowingRunnable> finishFunctions = new ArrayList<>();
+
+RunnerApi.FunctionSpec functionSpec = RunnerApi.FunctionSpec.newBuilder()
+.setUrn("urn:org.apache.beam:sink:runner:0.1")
+.setParameter(Any.pack(PORT_SPEC))
+.build();
+
+RunnerApi.PTransform pTransform = RunnerApi.PTransform.newBuilder()
+.setSpec(functionSpec)
+.putInputs(inputId, "inputPC")
+.build();
+
+new Beam

[2/3] beam git commit: [BEAM-1347] Break apart ProcessBundleHandler to use service locator pattern based upon URNs.

2017-06-16 Thread lcwik
[BEAM-1347] Break apart ProcessBundleHandler to use service locator pattern 
based upon URNs.

This cleans up ProcessBundleHandler and allows for separate improvements of the 
various PTransform handler factories.
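
For readers skimming the archive: a minimal sketch of the URN-keyed service-locator idea, with simplified interfaces (the real PTransformRunnerFactory and its Registrar in the diff below carry more context, and discovery relies on @AutoService-generated META-INF/services entries):

{code}
import java.util.HashMap;
import java.util.Map;
import java.util.ServiceLoader;

// Simplified stand-ins for the interfaces introduced by this commit.
interface PTransformRunnerFactory { /* runner-creation method elided in this sketch */ }

interface Registrar {
  // Maps a PTransform URN (e.g. "urn:org.apache.beam:sink:runner:0.1") to its factory.
  Map<String, PTransformRunnerFactory> getPTransformRunnerFactories();
}

final class UrnServiceLocator {
  static Map<String, PTransformRunnerFactory> loadFactories() {
    Map<String, PTransformRunnerFactory> factoriesByUrn = new HashMap<>();
    // Each Registrar is discovered from META-INF/services entries on the classpath.
    for (Registrar registrar : ServiceLoader.load(Registrar.class)) {
      factoriesByUrn.putAll(registrar.getPTransformRunnerFactories());
    }
    return factoriesByUrn;
  }
}
{code}

ProcessBundleHandler can then look up the factory for each PTransform's URN instead of hard-coding every handler.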


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/c9c1a05d
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/c9c1a05d
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/c9c1a05d

Branch: refs/heads/master
Commit: c9c1a05dc07a9a7e57fefbe6e43f723b330499d5
Parents: 54f3078
Author: Luke Cwik 
Authored: Thu Jun 15 16:36:22 2017 -0700
Committer: Luke Cwik 
Committed: Fri Jun 16 12:03:06 2017 -0700

--
 sdks/java/harness/pom.xml   |   6 +
 .../harness/control/ProcessBundleHandler.java   | 293 +++
 .../beam/runners/core/BeamFnDataReadRunner.java |  70 ++-
 .../runners/core/BeamFnDataWriteRunner.java |  67 ++-
 .../beam/runners/core/BoundedSourceRunner.java  |  74 ++-
 .../beam/runners/core/DoFnRunnerFactory.java| 182 +++
 .../runners/core/PTransformRunnerFactory.java   |  81 +++
 .../control/ProcessBundleHandlerTest.java   | 521 +++
 .../runners/core/BeamFnDataReadRunnerTest.java  | 112 +++-
 .../runners/core/BeamFnDataWriteRunnerTest.java | 120 -
 .../runners/core/BoundedSourceRunnerTest.java   | 124 -
 .../runners/core/DoFnRunnerFactoryTest.java | 209 
 12 files changed, 1134 insertions(+), 725 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/c9c1a05d/sdks/java/harness/pom.xml
--
diff --git a/sdks/java/harness/pom.xml b/sdks/java/harness/pom.xml
index 61a170a..a35481d 100644
--- a/sdks/java/harness/pom.xml
+++ b/sdks/java/harness/pom.xml
@@ -154,6 +154,12 @@
   slf4j-api
 
 
+
+  com.google.auto.service
+  auto-service
+  true
+
+
 
 
   org.hamcrest

http://git-wip-us.apache.org/repos/asf/beam/blob/c9c1a05d/sdks/java/harness/src/main/java/org/apache/beam/fn/harness/control/ProcessBundleHandler.java
--
diff --git 
a/sdks/java/harness/src/main/java/org/apache/beam/fn/harness/control/ProcessBundleHandler.java
 
b/sdks/java/harness/src/main/java/org/apache/beam/fn/harness/control/ProcessBundleHandler.java
index e33277a..4c4f73d 100644
--- 
a/sdks/java/harness/src/main/java/org/apache/beam/fn/harness/control/ProcessBundleHandler.java
+++ 
b/sdks/java/harness/src/main/java/org/apache/beam/fn/harness/control/ProcessBundleHandler.java
@@ -18,51 +18,32 @@
 
 package org.apache.beam.fn.harness.control;
 
-import static com.google.common.base.Preconditions.checkArgument;
-import static com.google.common.collect.Iterables.getOnlyElement;
-
-import com.google.common.collect.Collections2;
+import com.google.common.annotations.VisibleForTesting;
 import com.google.common.collect.HashMultimap;
 import com.google.common.collect.ImmutableMap;
-import com.google.common.collect.ImmutableMultimap;
 import com.google.common.collect.Lists;
 import com.google.common.collect.Multimap;
-import com.google.protobuf.ByteString;
-import com.google.protobuf.BytesValue;
-import com.google.protobuf.InvalidProtocolBufferException;
+import com.google.common.collect.Sets;
 import com.google.protobuf.Message;
 import java.io.IOException;
 import java.util.ArrayList;
-import java.util.Collection;
-import java.util.HashSet;
 import java.util.List;
 import java.util.Map;
-import java.util.Objects;
+import java.util.ServiceLoader;
+import java.util.Set;
 import java.util.function.Consumer;
 import java.util.function.Function;
 import java.util.function.Supplier;
 import org.apache.beam.fn.harness.data.BeamFnDataClient;
-import org.apache.beam.fn.harness.fake.FakeStepContext;
 import org.apache.beam.fn.harness.fn.ThrowingConsumer;
 import org.apache.beam.fn.harness.fn.ThrowingRunnable;
 import org.apache.beam.fn.v1.BeamFnApi;
-import org.apache.beam.runners.core.BeamFnDataReadRunner;
-import org.apache.beam.runners.core.BeamFnDataWriteRunner;
-import org.apache.beam.runners.core.BoundedSourceRunner;
-import org.apache.beam.runners.core.DoFnRunner;
-import org.apache.beam.runners.core.DoFnRunners;
-import org.apache.beam.runners.core.DoFnRunners.OutputManager;
-import org.apache.beam.runners.core.NullSideInputReader;
-import org.apache.beam.runners.dataflow.util.DoFnInfo;
+import org.apache.beam.runners.core.PTransformRunnerFactory;
+import org.apache.beam.runners.core.PTransformRunnerFactory.Registrar;
 import org.apache.beam.sdk.common.runner.v1.RunnerApi;
-import org.apache.beam.sdk.io.BoundedSource;
 import org.apache.beam.sdk.options.PipelineOptions;
-import org.apache.beam.sdk.options.PipelineOptionsFactory;
-import org.apache.beam.sdk.transforms.DoFn;
-impor

[GitHub] beam pull request #3375: [BEAM-1347] Break apart ProcessBundleHandler to use...

2017-06-16 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/beam/pull/3375


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[jira] [Commented] (BEAM-1347) Basic Java harness capable of understanding process bundle tasks and sending data over the Fn Api

2017-06-16 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-1347?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16052277#comment-16052277
 ] 

ASF GitHub Bot commented on BEAM-1347:
--

Github user asfgit closed the pull request at:

https://github.com/apache/beam/pull/3375


> Basic Java harness capable of understanding process bundle tasks and sending 
> data over the Fn Api
> -
>
> Key: BEAM-1347
> URL: https://issues.apache.org/jira/browse/BEAM-1347
> Project: Beam
>  Issue Type: Improvement
>  Components: beam-model-fn-api
>Reporter: Luke Cwik
>Assignee: Luke Cwik
>
> Create a basic Java harness capable of understanding process bundle requests 
> and able to stream data over the Fn Api.
> Overview: https://s.apache.org/beam-fn-api



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


Jenkins build became unstable: beam_PostCommit_Java_ValidatesRunner_Spark #2385

2017-06-16 Thread Apache Jenkins Server
See 




[GitHub] beam pull request #3380: A few cleanups in CombineTest

2017-06-16 Thread jkff
GitHub user jkff opened a pull request:

https://github.com/apache/beam/pull/3380

A few cleanups in CombineTest

Better error messages and IntelliJ warning cleanups.

R: @tgroh 


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/jkff/incubator-beam combine-cleanups

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/3380.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #3380


commit 1663d70b6a0b969571249fab444018578b999f9d
Author: Eugene Kirpichov 
Date:   2017-06-16T20:07:48Z

A few cleanups in CombineTest

Better error messages and IntelliJ warning cleanups.




---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[2/2] beam git commit: This closes #3335: Tests for reading windowed side input from resumed SDF call

2017-06-16 Thread jkff
This closes #3335: Tests for reading windowed side input from resumed SDF call


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/e827642e
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/e827642e
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/e827642e

Branch: refs/heads/master
Commit: e827642ef17259a60d1392827b750a798d36f69e
Parents: aa555f5 9a6a277
Author: Eugene Kirpichov 
Authored: Fri Jun 16 13:10:15 2017 -0700
Committer: Eugene Kirpichov 
Committed: Fri Jun 16 13:10:15 2017 -0700

--
 .../beam/sdk/transforms/SplittableDoFnTest.java | 145 ++-
 1 file changed, 140 insertions(+), 5 deletions(-)
--




[1/2] beam git commit: Tests for reading windowed side input from resumed SDF call

2017-06-16 Thread jkff
Repository: beam
Updated Branches:
  refs/heads/master aa555f593 -> e827642ef


Tests for reading windowed side input from resumed SDF call


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/9a6a277c
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/9a6a277c
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/9a6a277c

Branch: refs/heads/master
Commit: 9a6a277cea4582f0a64eac97730cb85af5ba352b
Parents: aa555f5
Author: Eugene Kirpichov 
Authored: Thu Jun 8 16:54:12 2017 -0700
Committer: Eugene Kirpichov 
Committed: Fri Jun 16 13:09:50 2017 -0700

--
 .../beam/sdk/transforms/SplittableDoFnTest.java | 145 ++-
 1 file changed, 140 insertions(+), 5 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/9a6a277c/sdks/java/core/src/test/java/org/apache/beam/sdk/transforms/SplittableDoFnTest.java
--
diff --git 
a/sdks/java/core/src/test/java/org/apache/beam/sdk/transforms/SplittableDoFnTest.java
 
b/sdks/java/core/src/test/java/org/apache/beam/sdk/transforms/SplittableDoFnTest.java
index 02a44d2..646d8d3 100644
--- 
a/sdks/java/core/src/test/java/org/apache/beam/sdk/transforms/SplittableDoFnTest.java
+++ 
b/sdks/java/core/src/test/java/org/apache/beam/sdk/transforms/SplittableDoFnTest.java
@@ -18,10 +18,13 @@
 package org.apache.beam.sdk.transforms;
 
 import static com.google.common.base.Preconditions.checkState;
+import static org.hamcrest.Matchers.greaterThan;
 import static org.junit.Assert.assertEquals;
 import static org.junit.Assert.assertFalse;
+import static org.junit.Assert.assertThat;
 import static org.junit.Assert.assertTrue;
 
+import com.google.common.collect.Ordering;
 import java.io.Serializable;
 import java.util.ArrayList;
 import java.util.Arrays;
@@ -29,6 +32,7 @@ import java.util.List;
 import org.apache.beam.sdk.coders.BigEndianIntegerCoder;
 import org.apache.beam.sdk.coders.KvCoder;
 import org.apache.beam.sdk.coders.StringUtf8Coder;
+import org.apache.beam.sdk.coders.VarIntCoder;
 import org.apache.beam.sdk.testing.PAssert;
 import org.apache.beam.sdk.testing.TestPipeline;
 import org.apache.beam.sdk.testing.TestStream;
@@ -182,6 +186,12 @@ public class SplittableDoFnTest implements Serializable {
  private static class SDFWithMultipleOutputsPerBlock extends DoFn<String, Integer> {
 private static final int MAX_INDEX = 98765;
 
+private final TupleTag<Integer> numProcessCalls;
+
+private SDFWithMultipleOutputsPerBlock(TupleTag<Integer> numProcessCalls) {
+  this.numProcessCalls = numProcessCalls;
+}
+
 private static int snapToNextBlock(int index, int[] blockStarts) {
   for (int i = 1; i < blockStarts.length; ++i) {
 if (index > blockStarts[i - 1] && index <= blockStarts[i]) {
@@ -195,6 +205,7 @@ public class SplittableDoFnTest implements Serializable {
 public void processElement(ProcessContext c, OffsetRangeTracker tracker) {
   int[] blockStarts = {-1, 0, 12, 123, 1234, 12345, 34567, MAX_INDEX};
   int trueStart = snapToNextBlock((int) 
tracker.currentRestriction().getFrom(), blockStarts);
+  c.output(numProcessCalls, 1);
   for (int i = trueStart; tracker.tryClaim(blockStarts[i]); ++i) {
 for (int index = blockStarts[i]; index < blockStarts[i + 1]; ++index) {
   c.output(index);
@@ -211,10 +222,26 @@ public class SplittableDoFnTest implements Serializable {
   @Test
   @Category({ValidatesRunner.class, UsesSplittableParDo.class})
   public void testOutputAfterCheckpoint() throws Exception {
-PCollection<Integer> outputs = p.apply(Create.of("foo"))
-.apply(ParDo.of(new SDFWithMultipleOutputsPerBlock()));
-PAssert.thatSingleton(outputs.apply(Count.globally()))
+TupleTag<Integer> main = new TupleTag<>();
+TupleTag<Integer> numProcessCalls = new TupleTag<>();
+PCollectionTuple outputs =
+p.apply(Create.of("foo"))
+.apply(
+ParDo.of(new SDFWithMultipleOutputsPerBlock(numProcessCalls))
+.withOutputTags(main, TupleTagList.of(numProcessCalls)));
+PAssert.thatSingleton(outputs.get(main).apply(Count.globally()))
 .isEqualTo((long) SDFWithMultipleOutputsPerBlock.MAX_INDEX);
+// Verify that more than 1 process() call was involved, i.e. that there 
was checkpointing.
+PAssert.thatSingleton(
+
outputs.get(numProcessCalls).setCoder(VarIntCoder.of()).apply(Sum.integersGlobally()))
+.satisfies(
+new SerializableFunction<Integer, Void>() {
+  @Override
+  public Void apply(Integer input) {
+assertThat(input, greaterThan(1));
+return null;
+  }
+});
 p.run();
   }
 
@@ -287,9 +314,117 @@ public class SplittableDoFnTest implements Serializable {
 PAssert.that(res).containsInAnyOrder("a:

[GitHub] beam pull request #3335: Tests for reading windowed side input from resumed ...

2017-06-16 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/beam/pull/3335


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[1/2] beam git commit: Populate PBegin input when decoding from Runner API

2017-06-16 Thread robertwb
Repository: beam
Updated Branches:
  refs/heads/master e827642ef -> 0cabdf6e7


Populate PBegin input when decoding from Runner API


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/4519681e
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/4519681e
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/4519681e

Branch: refs/heads/master
Commit: 4519681ec3d2fb723a514128d7c9c531c8de9dbf
Parents: e827642
Author: Charles Chen 
Authored: Thu Jun 15 15:27:18 2017 -0700
Committer: Robert Bradshaw 
Committed: Fri Jun 16 13:53:59 2017 -0700

--
 sdks/python/apache_beam/pipeline.py | 13 -
 1 file changed, 12 insertions(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/4519681e/sdks/python/apache_beam/pipeline.py
--
diff --git a/sdks/python/apache_beam/pipeline.py 
b/sdks/python/apache_beam/pipeline.py
index ab77956..d84a2b7 100644
--- a/sdks/python/apache_beam/pipeline.py
+++ b/sdks/python/apache_beam/pipeline.py
@@ -515,7 +515,18 @@ class Pipeline(object):
 p.applied_labels = set([
 t.unique_name for t in proto.components.transforms.values()])
 for id in proto.components.pcollections:
-  context.pcollections.get_by_id(id).pipeline = p
+  pcollection = context.pcollections.get_by_id(id)
+  pcollection.pipeline = p
+
+# Inject PBegin input where necessary.
+from apache_beam.io.iobase import Read
+from apache_beam.transforms.core import Create
+has_pbegin = [Read, Create]
+for id in proto.components.transforms:
+  transform = context.transforms.get_by_id(id)
+  if not transform.inputs and transform.transform.__class__ in has_pbegin:
+transform.inputs = (pvalue.PBegin(p),)
+
 return p
 
 



[2/2] beam git commit: Closes #3373

2017-06-16 Thread robertwb
Closes #3373


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/0cabdf6e
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/0cabdf6e
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/0cabdf6e

Branch: refs/heads/master
Commit: 0cabdf6e776363e639f18779744da73f9d29bb5a
Parents: e827642 4519681
Author: Robert Bradshaw 
Authored: Fri Jun 16 13:54:00 2017 -0700
Committer: Robert Bradshaw 
Committed: Fri Jun 16 13:54:00 2017 -0700

--
 sdks/python/apache_beam/pipeline.py | 13 -
 1 file changed, 12 insertions(+), 1 deletion(-)
--




[GitHub] beam pull request #3373: Populate PBegin input when decoding from Runner API

2017-06-16 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/beam/pull/3373


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


Jenkins build is back to stable : beam_PostCommit_Java_ValidatesRunner_Spark #2386

2017-06-16 Thread Apache Jenkins Server
See 




[jira] [Created] (BEAM-2457) Error: "Unable to find registrar for hdfs" - need to prevent/improve error message

2017-06-16 Thread Stephen Sisk (JIRA)
Stephen Sisk created BEAM-2457:
--

 Summary: Error: "Unable to find registrar for hdfs" - need to 
prevent/improve error message
 Key: BEAM-2457
 URL: https://issues.apache.org/jira/browse/BEAM-2457
 Project: Beam
  Issue Type: Improvement
  Components: sdk-java-core
Affects Versions: 2.0.0
Reporter: Stephen Sisk
Assignee: Davor Bonaci


I've noticed a number of user reports where jobs are failing with the error 
message "Unable to find registrar for hdfs": 
* 
https://stackoverflow.com/questions/44497662/apache-beamunable-to-find-registrar-for-hdfs/44508533?noredirect=1#comment76026835_44508533
* 
https://lists.apache.org/thread.html/144c384e54a141646fcbe854226bb3668da091c5dc7fa2d471626e9b@%3Cuser.beam.apache.org%3E
* 
https://lists.apache.org/thread.html/e4d5ac744367f9d036a1f776bba31b9c4fe377d8f11a4b530be9f829@%3Cuser.beam.apache.org%3E
 

This isn't too many reports, but it is the only time I can recall so many users 
reporting the same error message in such a short amount of time. 

We believe the problem is one of two things: 
1) bad uber jar creation
2) incorrect HDFS configuration

However, it's highly possible this could have some other root cause. 

It seems like it'd be useful to:
1) Follow up with the above reports to see if they've resolved the issue, and 
if so what fixed it. There may be another root cause out there.
2) Improve the error message to include more information about how to resolve it
3) See if we can improve detection of the error cases to give more specific 
information (specifically, if HDFS is misconfigured, can we detect that somehow 
and tell the user exactly that?)
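
One possible shape for (2)/(3) above, sketched with a hypothetical registrar lookup (this is not the current FileSystems code; the interface, message text, and causes listed are illustrative assumptions):

{code}
import java.util.Map;
import java.util.ServiceLoader;
import java.util.TreeMap;

// Hypothetical stand-in for a scheme-based filesystem registrar.
interface FileSystemRegistrar {
  String getScheme();
}

class RegistrarLookup {
  static FileSystemRegistrar forScheme(String scheme) {
    Map<String, FileSystemRegistrar> registrars = new TreeMap<>();
    for (FileSystemRegistrar registrar : ServiceLoader.load(FileSystemRegistrar.class)) {
      registrars.put(registrar.getScheme(), registrar);
    }
    FileSystemRegistrar match = registrars.get(scheme);
    if (match == null) {
      // A more actionable message: list what *was* found plus the two suspected causes.
      throw new IllegalStateException(
          "Unable to find registrar for " + scheme
              + ". Registrars found on the classpath: " + registrars.keySet()
              + ". If you built an uber jar, check that META-INF/services files were merged;"
              + " if you are reading from HDFS, check that the Hadoop configuration was supplied.");
    }
    return match;
  }
}
{code}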



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[1/2] beam git commit: [BEAM-2443] apply AutoValue to BeamSqlRecordType

2017-06-16 Thread takidau
Repository: beam
Updated Branches:
  refs/heads/DSL_SQL abe0f1a0a -> dcd769c8a


[BEAM-2443] apply AutoValue to BeamSqlRecordType


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/20453733
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/20453733
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/20453733

Branch: refs/heads/DSL_SQL
Commit: 20453733ea8679e9fe421950a69921ace80dd381
Parents: abe0f1a
Author: James Xu 
Authored: Fri Jun 16 14:31:55 2017 +0800
Committer: Tyler Akidau 
Committed: Fri Jun 16 14:19:02 2017 -0700

--
 dsls/sql/pom.xml|  5 +++
 .../beam/dsls/sql/example/BeamSqlExample.java   |  9 ++---
 .../beam/dsls/sql/rel/BeamAggregationRel.java   | 20 ++-
 .../beam/dsls/sql/schema/BeamSqlRecordType.java | 38 +---
 .../transform/BeamAggregationTransforms.java| 22 +++-
 .../beam/dsls/sql/utils/CalciteUtils.java   | 11 +++---
 6 files changed, 51 insertions(+), 54 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/20453733/dsls/sql/pom.xml
--
diff --git a/dsls/sql/pom.xml b/dsls/sql/pom.xml
index e70c88c..d866313 100644
--- a/dsls/sql/pom.xml
+++ b/dsls/sql/pom.xml
@@ -190,5 +190,10 @@
   0.10.1.0
   provided
 
+
+  com.google.auto.value
+  auto-value
+  provided
+
   
 

http://git-wip-us.apache.org/repos/asf/beam/blob/20453733/dsls/sql/src/main/java/org/apache/beam/dsls/sql/example/BeamSqlExample.java
--
diff --git 
a/dsls/sql/src/main/java/org/apache/beam/dsls/sql/example/BeamSqlExample.java 
b/dsls/sql/src/main/java/org/apache/beam/dsls/sql/example/BeamSqlExample.java
index 6bb1617..31f8302 100644
--- 
a/dsls/sql/src/main/java/org/apache/beam/dsls/sql/example/BeamSqlExample.java
+++ 
b/dsls/sql/src/main/java/org/apache/beam/dsls/sql/example/BeamSqlExample.java
@@ -18,6 +18,8 @@
 package org.apache.beam.dsls.sql.example;
 
 import java.sql.Types;
+import java.util.Arrays;
+import java.util.List;
 import org.apache.beam.dsls.sql.BeamSql;
 import org.apache.beam.dsls.sql.schema.BeamSqlRecordType;
 import org.apache.beam.dsls.sql.schema.BeamSqlRow;
@@ -47,10 +49,9 @@ class BeamSqlExample {
 Pipeline p = Pipeline.create(options);
 
 //define the input row format
-BeamSqlRecordType type = new BeamSqlRecordType();
-type.addField("c1", Types.INTEGER);
-type.addField("c2", Types.VARCHAR);
-type.addField("c3", Types.DOUBLE);
+List<String> fieldNames = Arrays.asList("c1", "c2", "c3");
+List<Integer> fieldTypes = Arrays.asList(Types.INTEGER, Types.VARCHAR, 
Types.DOUBLE);
+BeamSqlRecordType type = BeamSqlRecordType.create(fieldNames, fieldTypes);
 BeamSqlRow row = new BeamSqlRow(type);
 row.addField(0, 1);
 row.addField(1, "row");

http://git-wip-us.apache.org/repos/asf/beam/blob/20453733/dsls/sql/src/main/java/org/apache/beam/dsls/sql/rel/BeamAggregationRel.java
--
diff --git 
a/dsls/sql/src/main/java/org/apache/beam/dsls/sql/rel/BeamAggregationRel.java 
b/dsls/sql/src/main/java/org/apache/beam/dsls/sql/rel/BeamAggregationRel.java
index 595563d..bcdc44f 100644
--- 
a/dsls/sql/src/main/java/org/apache/beam/dsls/sql/rel/BeamAggregationRel.java
+++ 
b/dsls/sql/src/main/java/org/apache/beam/dsls/sql/rel/BeamAggregationRel.java
@@ -17,8 +17,8 @@
  */
 package org.apache.beam.dsls.sql.rel;
 
+import java.util.ArrayList;
 import java.util.List;
-
 import org.apache.beam.dsls.sql.schema.BeamSqlRecordType;
 import org.apache.beam.dsls.sql.schema.BeamSqlRow;
 import org.apache.beam.dsls.sql.schema.BeamSqlRowCoder;
@@ -125,25 +125,29 @@ public class BeamAggregationRel extends Aggregate 
implements BeamRelNode {
*/
   private BeamSqlRecordType exKeyFieldsSchema(RelDataType relDataType) {
 BeamSqlRecordType inputRecordType = 
CalciteUtils.toBeamRecordType(relDataType);
-BeamSqlRecordType typeOfKey = new BeamSqlRecordType();
+List<String> fieldNames = new ArrayList<>();
+List<Integer> fieldTypes = new ArrayList<>();
 for (int i : groupSet.asList()) {
   if (i != windowFieldIdx) {
-typeOfKey.addField(inputRecordType.getFieldsName().get(i),
-inputRecordType.getFieldsType().get(i));
+fieldNames.add(inputRecordType.getFieldsName().get(i));
+fieldTypes.add(inputRecordType.getFieldsType().get(i));
   }
 }
-return typeOfKey;
+return BeamSqlRecordType.create(fieldNames, fieldTypes);
   }
 
   /**
* Type of sub-rowrecord, that represents the list of aggregation fields.
*/
   private BeamSqlRecordType exAggFieldsSchema() {
-BeamSqlRecordType typeOfAggFields = new BeamSqlRecordType();
+List fi

[2/2] beam git commit: [BEAM-2443] This closes #3377

2017-06-16 Thread takidau
[BEAM-2443] This closes #3377


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/dcd769c8
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/dcd769c8
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/dcd769c8

Branch: refs/heads/DSL_SQL
Commit: dcd769c8a53daac25b4cb64f7418e29c6687c1df
Parents: abe0f1a 2045373
Author: Tyler Akidau 
Authored: Fri Jun 16 14:20:27 2017 -0700
Committer: Tyler Akidau 
Committed: Fri Jun 16 14:20:27 2017 -0700

--
 dsls/sql/pom.xml|  5 +++
 .../beam/dsls/sql/example/BeamSqlExample.java   |  9 ++---
 .../beam/dsls/sql/rel/BeamAggregationRel.java   | 20 ++-
 .../beam/dsls/sql/schema/BeamSqlRecordType.java | 38 +---
 .../transform/BeamAggregationTransforms.java| 22 +++-
 .../beam/dsls/sql/utils/CalciteUtils.java   | 11 +++---
 6 files changed, 51 insertions(+), 54 deletions(-)
--




[GitHub] beam pull request #3381: Increases backoff in GcsUtil

2017-06-16 Thread jkff
GitHub user jkff opened a pull request:

https://github.com/apache/beam/pull/3381

Increases backoff in GcsUtil

3 tries starting with 200ms is not enough for some transient network issues.
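
For context, the kind of change described can be sketched with Beam's FluentBackoff helper; the retry count and initial delay below are placeholders, not necessarily the values chosen in the PR:

{code}
import com.google.api.client.util.BackOff;
import org.apache.beam.sdk.util.FluentBackoff;
import org.joda.time.Duration;

// Placeholder values for illustration only.
BackOff backoff =
    FluentBackoff.DEFAULT
        .withMaxRetries(10)                              // previously ~3 attempts
        .withInitialBackoff(Duration.standardSeconds(1)) // previously 200 ms
        .backoff();
{code}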

R: @lukecwik 

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/jkff/incubator-beam backoff

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/3381.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #3381


commit dbb2996316b914c6a299f5c3b523afed92afd3f4
Author: Eugene Kirpichov 
Date:   2017-06-16T21:27:51Z

Increases backoff in GcsUtil




---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[jira] [Updated] (BEAM-2457) Error: "Unable to find registrar for hdfs" - need to prevent/improve error message

2017-06-16 Thread Stephen Sisk (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-2457?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Stephen Sisk updated BEAM-2457:
---
Description: 
I've noticed a number of user reports where jobs are failing with the error 
message "Unable to find registrar for hdfs": 
* 
https://stackoverflow.com/questions/44497662/apache-beamunable-to-find-registrar-for-hdfs/44508533?noredirect=1#comment76026835_44508533
* 
https://lists.apache.org/thread.html/144c384e54a141646fcbe854226bb3668da091c5dc7fa2d471626e9b@%3Cuser.beam.apache.org%3E
* 
https://lists.apache.org/thread.html/e4d5ac744367f9d036a1f776bba31b9c4fe377d8f11a4b530be9f829@%3Cuser.beam.apache.org%3E
 

This isn't too many reports, but it is the only time I can recall so many users 
reporting the same error message in such a short amount of time. 

We believe the problem is one of two things: 
1) bad uber jar creation
2) incorrect HDFS configuration

However, it's highly possible this could have some other root cause. 

It seems like it'd be useful to:
1) Follow up with the above reports to see if they've resolved the issue, and 
if so what fixed it. There may be another root cause out there.
2) Improve the error message to include more information about how to resolve it
3) See if we can improve detection of the error cases to give more specific 
information (specifically, if HDFS is misconfigured, can we detect that somehow 
and tell the user exactly that?)
4) update documentation

  was:
I've noticed a number of user reports where jobs are failing with the error 
message "Unable to find registrar for hdfs": 
* 
https://stackoverflow.com/questions/44497662/apache-beamunable-to-find-registrar-for-hdfs/44508533?noredirect=1#comment76026835_44508533
* 
https://lists.apache.org/thread.html/144c384e54a141646fcbe854226bb3668da091c5dc7fa2d471626e9b@%3Cuser.beam.apache.org%3E
* 
https://lists.apache.org/thread.html/e4d5ac744367f9d036a1f776bba31b9c4fe377d8f11a4b530be9f829@%3Cuser.beam.apache.org%3E
 

This isn't too many reports, but it is the only time I can recall so many users 
reporting the same error message in such a short amount of time. 

We believe the problem is one of two things: 
1) bad uber jar creation
2) incorrect HDFS configuration

However, it's highly possible this could have some other root cause. 

It seems like it'd be useful to:
1) Follow up with the above reports to see if they've resolved the issue, and 
if so what fixed it. There may be another root cause out there.
2) Improve the error message to include more information about how to resolve it
3) See if we can improve detection of the error cases to give more specific 
information (specifically, if HDFS is misconfigured, can we detect that somehow 
and tell the user exactly that?)


> Error: "Unable to find registrar for hdfs" - need to prevent/improve error 
> message
> --
>
> Key: BEAM-2457
> URL: https://issues.apache.org/jira/browse/BEAM-2457
> Project: Beam
>  Issue Type: Improvement
>  Components: sdk-java-core
>Affects Versions: 2.0.0
>Reporter: Stephen Sisk
>Assignee: Davor Bonaci
>
> I've noticed a number of user reports where jobs are failing with the error 
> message "Unable to find registrar for hdfs": 
> * 
> https://stackoverflow.com/questions/44497662/apache-beamunable-to-find-registrar-for-hdfs/44508533?noredirect=1#comment76026835_44508533
> * 
> https://lists.apache.org/thread.html/144c384e54a141646fcbe854226bb3668da091c5dc7fa2d471626e9b@%3Cuser.beam.apache.org%3E
> * 
> https://lists.apache.org/thread.html/e4d5ac744367f9d036a1f776bba31b9c4fe377d8f11a4b530be9f829@%3Cuser.beam.apache.org%3E
>  
> This isn't too many reports, but it is the only time I can recall so many 
> users reporting the same error message in such a short amount of time. 
> We believe the problem is one of two things: 
> 1) bad uber jar creation
> 2) incorrect HDFS configuration
> However, it's highly possible this could have some other root cause. 
> It seems like it'd be useful to:
> 1) Follow up with the above reports to see if they've resolved the issue, and 
> if so what fixed it. There may be another root cause out there.
> 2) Improve the error message to include more information about how to resolve 
> it
> 3) See if we can improve detection of the error cases to give more specific 
> information (specifically, if HDFS is misconfigured, can we detect that 
> somehow and tell the user exactly that?)
> 4) update documentation



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[jira] [Created] (BEAM-2458) Move HashingFn from test -> main to make it more accessible

2017-06-16 Thread Stephen Sisk (JIRA)
Stephen Sisk created BEAM-2458:
--

 Summary: Move HashingFn from test -> main to make it more 
accessible
 Key: BEAM-2458
 URL: https://issues.apache.org/jira/browse/BEAM-2458
 Project: Beam
  Issue Type: Bug
  Components: sdk-java-extensions
Reporter: Stephen Sisk
Assignee: Stephen Sisk


HashingFn is currently only available as a test dependency. There's no reason 
for that (it could be generally useful to non-test parts of IOs if necessary), 
so we should move it over to the regular directory.



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[GitHub] beam pull request #3382: [BEAM-2458] Move HashingFn from test -> main

2017-06-16 Thread ssisk
GitHub user ssisk opened a pull request:

https://github.com/apache/beam/pull/3382

[BEAM-2458] Move HashingFn from test -> main

Be sure to do all of the following to help us incorporate your contribution
quickly and easily:

 - [X] Make sure the PR title is formatted like:
   `[BEAM-] Description of pull request`
 - [X] Make sure tests pass via `mvn clean verify`.
 - [X] Replace `` in the title with the actual Jira issue
   number, if there is one.
 - [X] If this contribution is large, please file an Apache
   [Individual Contributor License 
Agreement](https://www.apache.org/licenses/icla.pdf).

---


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/ssisk/beam move-hashingfn

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/3382.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #3382


commit 321b328135eda37ab84cf2aeb0653a24d28ae999
Author: Stephen Sisk 
Date:   2017-06-16T18:03:31Z

Move HashingFn from test -> main




---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[jira] [Updated] (BEAM-2453) The Java DirectRunner should exercise all parts of a CombineFn

2017-06-16 Thread Robert Bradshaw (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-2453?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Robert Bradshaw updated BEAM-2453:
--
Summary: The Java DirectRunner should exercise all parts of a CombineFn  
(was: The DirectRunner should exercise all parts of a CombineFn)

> The Java DirectRunner should exercise all parts of a CombineFn
> --
>
> Key: BEAM-2453
> URL: https://issues.apache.org/jira/browse/BEAM-2453
> Project: Beam
>  Issue Type: Bug
>  Components: runner-direct
>Reporter: Thomas Groh
>Assignee: Thomas Groh
>
> Specifically it should:
> Create some number of accumulators; add elements to these accumulators, merge 
> the created accumulators, and extract the output.
> This can be performed by replacing the {{Combine.perKey}} composite transform 
> with a multi-step combine {{CombineBundles -> GroupByKey -> 
> MergeAccumulators}}
> Where {{CombineBundles}} is a {{ParDo}} which takes input {{KV<K, InputT>}} 
> and produces {{KV<K, AccumT>}}, outputting in {{FinishBundle}} (this can only 
> be performed if the Combine takes no side inputs or does not have merging 
> windows). {{MergeAccumulators}} takes in {{KV<K, Iterable<AccumT>>}} and 
> produces {{KV<K, OutputT>}} by merging all of the accumulators and extracting 
> the output.
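
A minimal sketch of the CombineBundles step described above, assuming a globally windowed input with no side inputs (the class name, window, and timestamp handling are simplifications, not the actual DirectRunner change):

{code}
import java.util.HashMap;
import java.util.Map;
import org.apache.beam.sdk.transforms.Combine;
import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.transforms.windowing.GlobalWindow;
import org.apache.beam.sdk.values.KV;

class CombineBundlesDoFn<K, InputT, AccumT> extends DoFn<KV<K, InputT>, KV<K, AccumT>> {
  private final Combine.CombineFn<InputT, AccumT, ?> combineFn;
  private transient Map<K, AccumT> accumulators;

  CombineBundlesDoFn(Combine.CombineFn<InputT, AccumT, ?> combineFn) {
    this.combineFn = combineFn;
  }

  @StartBundle
  public void startBundle() {
    accumulators = new HashMap<>();
  }

  @ProcessElement
  public void processElement(ProcessContext c) {
    K key = c.element().getKey();
    AccumT accum = accumulators.computeIfAbsent(key, k -> combineFn.createAccumulator());
    accumulators.put(key, combineFn.addInput(accum, c.element().getValue()));
  }

  @FinishBundle
  public void finishBundle(FinishBundleContext c) {
    // Emit one partial accumulator per key seen in this bundle.
    for (Map.Entry<K, AccumT> entry : accumulators.entrySet()) {
      c.output(KV.of(entry.getKey(), entry.getValue()),
          GlobalWindow.INSTANCE.maxTimestamp(), GlobalWindow.INSTANCE);
    }
  }
}
{code}

The downstream MergeAccumulators step would then call mergeAccumulators and extractOutput per key, which is what forces every part of the CombineFn to be exercised.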



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[jira] [Commented] (BEAM-2458) Move HashingFn from test -> main to make it more accessible

2017-06-16 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-2458?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16052424#comment-16052424
 ] 

ASF GitHub Bot commented on BEAM-2458:
--

GitHub user ssisk opened a pull request:

https://github.com/apache/beam/pull/3382

[BEAM-2458] Move HashingFn from test -> main

Be sure to do all of the following to help us incorporate your contribution
quickly and easily:

 - [X] Make sure the PR title is formatted like:
   `[BEAM-] Description of pull request`
 - [X] Make sure tests pass via `mvn clean verify`.
 - [X] Replace `` in the title with the actual Jira issue
   number, if there is one.
 - [X] If this contribution is large, please file an Apache
   [Individual Contributor License 
Agreement](https://www.apache.org/licenses/icla.pdf).

---


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/ssisk/beam move-hashingfn

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/3382.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #3382


commit 321b328135eda37ab84cf2aeb0653a24d28ae999
Author: Stephen Sisk 
Date:   2017-06-16T18:03:31Z

Move HashingFn from test -> main




> Move HashingFn from test -> main to make it more accessible
> ---
>
> Key: BEAM-2458
> URL: https://issues.apache.org/jira/browse/BEAM-2458
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-extensions
>Reporter: Stephen Sisk
>Assignee: Stephen Sisk
>
> HashingFn is currently only available as a test dependency. There's no reason 
> for that (it could be generally useful to non-test parts of IOs if 
> necessary), so we should move it over to the regular directory.



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[GitHub] beam pull request #3383: [BEAM-2447] Reintroduces DoFn.ProcessContinuation (...

2017-06-16 Thread jkff
GitHub user jkff opened a pull request:

https://github.com/apache/beam/pull/3383

[BEAM-2447] Reintroduces DoFn.ProcessContinuation (Dataflow worker 
compatibility part)

Prerequisite for #3360. Will need a Dataflow worker update too.

R: @tgroh 

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/jkff/incubator-beam process-cont-base

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/3383.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #3383


commit 71599e1a7f0766ac5e7ea100f8bbc69152e44860
Author: Eugene Kirpichov 
Date:   2017-06-16T21:56:07Z

Reintroduces DoFn.ProcessContinuation (Dataflow worker compatibility part)




---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[jira] [Commented] (BEAM-2447) Reintroduce DoFn.ProcessContinuation

2017-06-16 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-2447?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16052430#comment-16052430
 ] 

ASF GitHub Bot commented on BEAM-2447:
--

GitHub user jkff opened a pull request:

https://github.com/apache/beam/pull/3383

[BEAM-2447] Reintroduces DoFn.ProcessContinuation (Dataflow worker 
compatibility part)

Prerequisite for #3360. Will need a Dataflow worker update too.

R: @tgroh 

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/jkff/incubator-beam process-cont-base

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/3383.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #3383


commit 71599e1a7f0766ac5e7ea100f8bbc69152e44860
Author: Eugene Kirpichov 
Date:   2017-06-16T21:56:07Z

Reintroduces DoFn.ProcessContinuation (Dataflow worker compatibility part)




> Reintroduce DoFn.ProcessContinuation
> 
>
> Key: BEAM-2447
> URL: https://issues.apache.org/jira/browse/BEAM-2447
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-core
>Reporter: Eugene Kirpichov
>Assignee: Eugene Kirpichov
>
> ProcessContinuation.resume() is useful for tailing files - when we reach 
> current EOF, we want to voluntarily suspend the process() call rather than 
> wait for runner to checkpoint us.
> In BEAM-1903, DoFn.ProcessContinuation was removed because there was 
> ambiguity about the semantics of resume() especially w.r.t. the following 
> situation described in 
> https://docs.google.com/document/d/1BGc8pM1GOvZhwR9SARSVte-20XEoBUxrGJ5gTWXdv3c/edit
>  : the runner has taken a checkpoint on the tracker, and then the 
> ProcessElement call returns resume() signaling that the work is still not 
> done - then there's 2 checkpoints to deal with.
> Instead, the proper way to refine this semantics is:
> - After checkpoint() on a RestrictionTracker, the tracker MUST fail all 
> subsequent tryClaim() calls, and MUST succeed in checkDone().
> - After a failed tryClaim() call, the ProcessElement method MUST return stop()
> - So ProcessElement can return resume() only *instead* of doing tryClaim()
> - Then, if the runner has already taken a checkpoint but tracker has returned 
> resume(), we do not need to take a new checkpoint - the one already taken 
> already accurately describes the remainder of the work.
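
To make the refined contract above concrete, a sketch of a tailing-style ProcessElement under these rules (recordAvailableAt/readRecordAt are hypothetical helpers; the exact API shape is what the linked design doc and PR are still settling):

{code}
@ProcessElement
public ProcessContinuation processElement(ProcessContext c, OffsetRangeTracker tracker) {
  for (long offset = tracker.currentRestriction().getFrom(); ; ++offset) {
    if (!recordAvailableAt(offset)) {
      // Reached current EOF: voluntarily suspend *instead of* claiming more work.
      return ProcessContinuation.resume();
    }
    if (!tracker.tryClaim(offset)) {
      // A failed tryClaim() means the runner checkpointed us, so we MUST stop.
      return ProcessContinuation.stop();
    }
    c.output(readRecordAt(offset));
  }
}
{code}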



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[GitHub] beam pull request #3377: [BEAM-2443] apply AutoValue to BeamSqlRecordType

2017-06-16 Thread xumingming
Github user xumingming closed the pull request at:

https://github.com/apache/beam/pull/3377


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[jira] [Commented] (BEAM-2443) BeamSqlRecordType should migrate to using a builder pattern via AutoValue/AutoBuilder

2017-06-16 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-2443?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16052444#comment-16052444
 ] 

ASF GitHub Bot commented on BEAM-2443:
--

Github user xumingming closed the pull request at:

https://github.com/apache/beam/pull/3377


> BeamSqlRecordType should migrate to using a builder pattern via 
> AutoValue/AutoBuilder
> -
>
> Key: BEAM-2443
> URL: https://issues.apache.org/jira/browse/BEAM-2443
> Project: Beam
>  Issue Type: Improvement
>  Components: dsl-sql
>Reporter: Luke Cwik
>Assignee: James Xu
>Priority: Minor
>  Labels: dsl_sql_merge
>
> This is a code health and usability issue. Performing the migration now to 
> use AutoValue/AutoBuilder will make it easier for people to create table row 
> types without needing to worry about mutability.
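
Roughly, the shape after the migration (a sketch only: the getter and factory names are taken from the merged diff, while Serializable and other details are assumptions):

{code}
import com.google.auto.value.AutoValue;
import java.io.Serializable;
import java.util.List;

@AutoValue
abstract class BeamSqlRecordType implements Serializable {
  public abstract List<String> getFieldsName();
  public abstract List<Integer> getFieldsType();

  public static BeamSqlRecordType create(List<String> fieldNames, List<Integer> fieldTypes) {
    // AutoValue generates the immutable implementation at compile time.
    return new AutoValue_BeamSqlRecordType(fieldNames, fieldTypes);
  }
}
{code}

Callers build the row type in one shot (BeamSqlRecordType.create(fieldNames, fieldTypes)) instead of mutating it field by field.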



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


Jenkins build became unstable: beam_PostCommit_Java_ValidatesRunner_Dataflow #3379

2017-06-16 Thread Apache Jenkins Server
See 




[jira] [Created] (BEAM-2459) Remove no-op UnsupportedOperationVisitor

2017-06-16 Thread Tyler Akidau (JIRA)
Tyler Akidau created BEAM-2459:
--

 Summary: Remove no-op UnsupportedOperationVisitor
 Key: BEAM-2459
 URL: https://issues.apache.org/jira/browse/BEAM-2459
 Project: Beam
  Issue Type: Task
  Components: dsl-sql
Reporter: Tyler Akidau
Assignee: Tyler Akidau


UnsupportedOperationVisitor appears to simply extend Calcite's SqlShuttle, with 
no overrides, which means nothing actually happens when it's used. We should 
remove it unless I'm missing something or there's a plan to use it differently 
in the future.
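
For reference: a SqlShuttle subclass with no overrides just returns the tree untouched, so to actually reject anything the visitor would need an override along these lines (the operator check is purely hypothetical):

{code}
import org.apache.calcite.sql.SqlCall;
import org.apache.calcite.sql.SqlNode;
import org.apache.calcite.sql.util.SqlShuttle;

class RejectingVisitor extends SqlShuttle {
  @Override
  public SqlNode visit(SqlCall call) {
    // Hypothetical check: reject an operator the planner cannot handle yet.
    if ("CUBE".equalsIgnoreCase(call.getOperator().getName())) {
      throw new UnsupportedOperationException(
          "Operator not supported: " + call.getOperator().getName());
    }
    return super.visit(call);
  }
}
{code}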



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[GitHub] beam pull request #3384: [BEAM-2459] Remove UnsupportedOperationVisitor, whi...

2017-06-16 Thread takidau
GitHub user takidau opened a pull request:

https://github.com/apache/beam/pull/3384

[BEAM-2459] Remove UnsupportedOperationVisitor, which is currently just a 
no-op

AFAICT, UnsupportedOperationVisitor just extends SqlShuttle with no 
overrides, which means it does nothing when it visits SqlNodes. As such, the 
one use of it in BeamQueryPlanner is a no-op. Removing, unless there is a good 
reason not to.

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/takidau/beam BEAM-2459

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/3384.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #3384


commit 5276d2381cb638e01ada16b09dd52cdda286590e
Author: Tyler Akidau 
Date:   2017-06-16T23:16:46Z

Remove UnsupportedOperationVisitor, which is currently just a no-op




---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[jira] [Commented] (BEAM-2459) Remove no-op UnsupportedOperationVisitor

2017-06-16 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-2459?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16052511#comment-16052511
 ] 

ASF GitHub Bot commented on BEAM-2459:
--

GitHub user takidau opened a pull request:

https://github.com/apache/beam/pull/3384

[BEAM-2459] Remove UnsupportedOperationVisitor, which is currently just a 
no-op

AFAICT, UnsupportedOperationVisitor just extends SqlShuttle with no 
overrides, which means it does nothing when it visits SqlNodes. As such, the 
one use of it in BeamQueryPlanner is a no-op. Removing, unless there is a good 
reason not to.

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/takidau/beam BEAM-2459

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/3384.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #3384


commit 5276d2381cb638e01ada16b09dd52cdda286590e
Author: Tyler Akidau 
Date:   2017-06-16T23:16:46Z

Remove UnsupportedOperationVisitor, which is currently just a no-op




> Remove no-op UnsupportedOperationVisitor
> 
>
> Key: BEAM-2459
> URL: https://issues.apache.org/jira/browse/BEAM-2459
> Project: Beam
>  Issue Type: Task
>  Components: dsl-sql
>Reporter: Tyler Akidau
>Assignee: Tyler Akidau
>
> UnsupportedOperationVisitor appears to simply extend Calcite's SqlShuttle, 
> with no overrides, which means nothing actually happens when it's used. We 
> should remove it unless I'm missing something or there's a plan to use it 
> differently in the future.



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


Jenkins build is back to stable : beam_PostCommit_Java_ValidatesRunner_Dataflow #3380

2017-06-16 Thread Apache Jenkins Server
See 




[GitHub] beam pull request #3385: Add dry run option to DataflowRunner in python SDK

2017-06-16 Thread vikkyrk
GitHub user vikkyrk opened a pull request:

https://github.com/apache/beam/pull/3385

Add dry run option to DataflowRunner in python SDK

Be sure to do all of the following to help us incorporate your contribution
quickly and easily:

 - [ ] Make sure the PR title is formatted like:
   `[BEAM-] Description of pull request`
 - [ ] Make sure tests pass via `mvn clean verify`.
 - [ ] Replace `` in the title with the actual Jira issue
   number, if there is one.
 - [ ] If this contribution is large, please file an Apache
   [Individual Contributor License 
Agreement](https://www.apache.org/licenses/icla.pdf).

---


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/vikkyrk/incubator-beam df_runner_dry_run

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/3385.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #3385


commit 388890df98c8de889a36e8b48db13c8faf4f9b4d
Author: Vikas Kedigehalli 
Date:   2017-06-17T00:18:57Z

Add dry run option to DataflowRunner




---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[jira] [Created] (BEAM-2460) Make python maven variables available elsewhere

2017-06-16 Thread Stephen Sisk (JIRA)
Stephen Sisk created BEAM-2460:
--

 Summary: Make python maven variables available elsewhere
 Key: BEAM-2460
 URL: https://issues.apache.org/jira/browse/BEAM-2460
 Project: Beam
  Issue Type: Task
  Components: sdk-java-core, sdk-py
Reporter: Stephen Sisk
Assignee: Stephen Sisk
Priority: Minor


We're planning to start using perfkit to make it easy for users to run the IO 
ITs. Perfkit is a python app, so we'll need to invoke the python interpreter. 

findSupportedPython calculates this and sets the python.interpreter.bin 
variable. It currently does so in the python project. 

I believe the correct answer here is to move findSupportedPython so that it's 
invoked in the root pom. (but perhaps there's a more maven-y approach to this 
problem?) 


cc [~altay] [~markflyhigh] [~jasonkuster] [~davor]



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


Build failed in Jenkins: beam_PostCommit_Java_JDK_Versions_Test » JDK 1.7 (latest),beam #94

2017-06-16 Thread Apache Jenkins Server
See 


Changes:

[lcwik] [BEAM-1347] Break apart ProcessBundleHandler to use service locator

[kirpichov] Tests for reading windowed side input from resumed SDF call

[robertwb] Populate PBegin input when decoding from Runner API

--
[...truncated 1.51 MB...]
2017-06-17\T\00:32:23.283 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/spark/spark-unsafe_2.10/1.6.2/spark-unsafe_2.10-1.6.2.pom
2017-06-17\T\00:32:23.309 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/spark/spark-unsafe_2.10/1.6.2/spark-unsafe_2.10-1.6.2.pom
 (5 KB at 178.9 KB/sec)
2017-06-17\T\00:32:23.315 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/com/fasterxml/jackson/module/jackson-module-scala_2.10/2.8.8/jackson-module-scala_2.10-2.8.8.pom
2017-06-17\T\00:32:23.342 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/com/fasterxml/jackson/module/jackson-module-scala_2.10/2.8.8/jackson-module-scala_2.10-2.8.8.pom
 (5 KB at 153.4 KB/sec)
2017-06-17\T\00:32:23.343 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/scala-lang/scala-library/2.10.6/scala-library-2.10.6.pom
2017-06-17\T\00:32:23.370 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/scala-lang/scala-library/2.10.6/scala-library-2.10.6.pom
 (2 KB at 76.8 KB/sec)
2017-06-17\T\00:32:23.371 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/scala-lang/scala-reflect/2.10.6/scala-reflect-2.10.6.pom
2017-06-17\T\00:32:23.397 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/scala-lang/scala-reflect/2.10.6/scala-reflect-2.10.6.pom
 (2 KB at 72.9 KB/sec)
2017-06-17\T\00:32:23.398 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/com/fasterxml/jackson/module/jackson-module-paranamer/2.8.8/jackson-module-paranamer-2.8.8.pom
2017-06-17\T\00:32:23.425 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/com/fasterxml/jackson/module/jackson-module-paranamer/2.8.8/jackson-module-paranamer-2.8.8.pom
 (4 KB at 141.1 KB/sec)
2017-06-17\T\00:32:23.427 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/com/fasterxml/jackson/module/jackson-modules-base/2.8.8/jackson-modules-base-2.8.8.pom
2017-06-17\T\00:32:23.453 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/com/fasterxml/jackson/module/jackson-modules-base/2.8.8/jackson-modules-base-2.8.8.pom
 (3 KB at 96.1 KB/sec)
2017-06-17\T\00:32:23.454 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/com/fasterxml/jackson/jackson-bom/2.8.8/jackson-bom-2.8.8.pom
2017-06-17\T\00:32:23.480 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/com/fasterxml/jackson/jackson-bom/2.8.8/jackson-bom-2.8.8.pom
 (11 KB at 391.0 KB/sec)
2017-06-17\T\00:32:23.483 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/com/thoughtworks/paranamer/paranamer/2.8/paranamer-2.8.pom
2017-06-17\T\00:32:23.509 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/com/thoughtworks/paranamer/paranamer/2.8/paranamer-2.8.pom
 (6 KB at 198.1 KB/sec)
2017-06-17\T\00:32:23.510 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/com/thoughtworks/paranamer/paranamer-parent/2.8/paranamer-parent-2.8.pom
2017-06-17\T\00:32:23.536 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/com/thoughtworks/paranamer/paranamer-parent/2.8/paranamer-parent-2.8.pom
 (12 KB at 433.4 KB/sec)
2017-06-17\T\00:32:23.539 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/spark/spark-streaming_2.10/1.6.2/spark-streaming_2.10-1.6.2.pom
2017-06-17\T\00:32:23.565 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/spark/spark-streaming_2.10/1.6.2/spark-streaming_2.10-1.6.2.pom
 (8 KB at 298.3 KB/sec)
2017-06-17\T\00:32:23.640 [INFO] Downloading: 
http://repository.apache.org/snapshots/org/apache/beam/beam-runners-apex/2.1.0-SNAPSHOT/beam-runners-apex-2.1.0-20170616.081739-37.jar
2017-06-17\T\00:32:23.641 [INFO] Downloading: 
http://repository.apache.org/snapshots/org/apache/beam/beam-sdks-java-harness/2.1.0-SNAPSHOT/beam-sdks-java-harness-2.1.0-20170616.074213-41.jar
2017-06-17\T\00:32:24.061 [INFO] Downloaded: 
http://repository.apache.org/snapshots/org/apache/beam/beam-sdks-java-harness/2.1.0-SNAPSHOT/beam-sdks-java-harness-2.1.0-20170616.074213-41.jar
 (2698 KB at 6408.4 KB/sec)
2017-06-17\T\00:32:24.061 [INFO] Downloaded: 
http://repository.apache.org/snapshots/org/apache/beam/beam-runners-apex/2.1.0-SNAPSHOT/beam-runners-apex-2.1.0-20170616.081739-37.jar
 (2754 KB at 6539.8 KB/sec)
2017-06-17\T\00:32:24.070 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/apex/apex-common/3.6.0/apex-common-3.6.0.jar
2017-06-17\T\00:32:24.071 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/com/datatorrent/netlet/1.3.0/netlet-1.3.0.jar
2017-06-17\T\00:32:24.071 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/apex/a

Build failed in Jenkins: beam_PostCommit_Java_MavenInstall_Windows #142

2017-06-16 Thread Apache Jenkins Server
See 


Changes:

[lcwik] [BEAM-1347] Break apart ProcessBundleHandler to use service locator

[kirpichov] Tests for reading windowed side input from resumed SDF call

[robertwb] Populate PBegin input when decoding from Runner API

--
[...truncated 2.65 MB...]
2017-06-17T00:34:43.277 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/sonatype/gossip/gossip/1.8/gossip-1.8.pom
2017-06-17T00:34:43.290 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/sonatype/gossip/gossip/1.8/gossip-1.8.pom
 (12 KB at 865.8 KB/sec)
2017-06-17T00:34:43.293 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/sonatype/forge/forge-parent/9/forge-parent-9.pom
2017-06-17T00:34:43.302 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/sonatype/forge/forge-parent/9/forge-parent-9.pom
 (13 KB at 1426.0 KB/sec)
2017-06-17T00:34:43.307 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/sonatype/gossip/gossip-core/1.8/gossip-core-1.8.pom
2017-06-17T00:34:43.315 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/sonatype/gossip/gossip-core/1.8/gossip-core-1.8.pom
 (3 KB at 271.6 KB/sec)
2017-06-17T00:34:43.319 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/sonatype/gossip/gossip-bootstrap/1.8/gossip-bootstrap-1.8.pom
2017-06-17T00:34:43.328 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/sonatype/gossip/gossip-bootstrap/1.8/gossip-bootstrap-1.8.pom
 (2 KB at 174.8 KB/sec)
2017-06-17T00:34:43.332 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/com/google/guava/guava/14.0.1/guava-14.0.1.pom
2017-06-17T00:34:43.339 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/com/google/guava/guava/14.0.1/guava-14.0.1.pom
 (6 KB at 750.0 KB/sec)
2017-06-17T00:34:43.343 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/com/google/guava/guava-parent/14.0.1/guava-parent-14.0.1.pom
2017-06-17T00:34:43.350 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/com/google/guava/guava-parent/14.0.1/guava-parent-14.0.1.pom
 (3 KB at 356.4 KB/sec)
2017-06-17T00:34:43.354 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/codehaus/plexus/plexus-classworlds/2.4.2/plexus-classworlds-2.4.2.pom
2017-06-17T00:34:43.362 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/codehaus/plexus/plexus-classworlds/2.4.2/plexus-classworlds-2.4.2.pom
 (4 KB at 428.3 KB/sec)
2017-06-17T00:34:43.366 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/codehaus/plexus/plexus-interpolation/1.16/plexus-interpolation-1.16.pom
2017-06-17T00:34:43.374 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/codehaus/plexus/plexus-interpolation/1.16/plexus-interpolation-1.16.pom
 (2 KB at 125.4 KB/sec)
2017-06-17T00:34:43.379 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/eclipse/aether/aether-api/0.9.0.M2/aether-api-0.9.0.M2.pom
2017-06-17T00:34:43.388 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/eclipse/aether/aether-api/0.9.0.M2/aether-api-0.9.0.M2.pom
 (2 KB at 188.5 KB/sec)
2017-06-17T00:34:43.394 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/codehaus/gmaven/gmaven-adapter-api/2.0/gmaven-adapter-api-2.0.pom
2017-06-17T00:34:43.401 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/codehaus/gmaven/gmaven-adapter-api/2.0/gmaven-adapter-api-2.0.pom
 (2 KB at 252.0 KB/sec)
2017-06-17T00:34:43.406 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/codehaus/gmaven/gmaven-adapter-impl/2.0/gmaven-adapter-impl-2.0.pom
2017-06-17T00:34:43.414 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/codehaus/gmaven/gmaven-adapter-impl/2.0/gmaven-adapter-impl-2.0.pom
 (3 KB at 331.8 KB/sec)
2017-06-17T00:34:43.418 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/codehaus/groovy/groovy-all/2.1.5/groovy-all-2.1.5.pom
2017-06-17T00:34:43.429 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/codehaus/groovy/groovy-all/2.1.5/groovy-all-2.1.5.pom
 (18 KB at 1601.5 KB/sec)
2017-06-17T00:34:43.433 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/ant/ant/1.8.4/ant-1.8.4.pom
2017-06-17T00:34:43.442 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/ant/ant/1.8.4/ant-1.8.4.pom (10 
KB at 1047.5 KB/sec)
2017-06-17T00:34:43.446 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/ant/ant-parent/1.8.4/ant-parent-1.8.4.pom
2017-06-17T00:34:43.457 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/ant/ant-parent/1.8.4/ant-parent-1.8.4.pom
 (5 KB at 404.9 KB/sec)
2017-06-17T00:34:43.462 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/ant/ant-launcher/1.8.4/ant-launcher-1.8.4.pom
2017-06-17T00:34:43.472 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/ant/ant-launcher/1.8.4/ant-launcher-1.8.4.pom
 (3 KB at 230.6 KB/sec

[2/2] beam git commit: [BEAM-2452] This closes #3371

2017-06-16 Thread takidau
[BEAM-2452] This closes #3371


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/738eb4dd
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/738eb4dd
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/738eb4dd

Branch: refs/heads/DSL_SQL
Commit: 738eb4dd0a231883cfb64f17869e08dfdb00ac51
Parents: dcd769c 86dea07
Author: Tyler Akidau 
Authored: Fri Jun 16 18:27:50 2017 -0700
Committer: Tyler Akidau 
Committed: Fri Jun 16 18:27:50 2017 -0700

--
 .../dsls/sql/BeamSqlDslAggregationTest.java | 260 +++
 .../apache/beam/dsls/sql/BeamSqlDslBase.java| 125 +
 .../beam/dsls/sql/BeamSqlDslFilterTest.java |  78 ++
 .../beam/dsls/sql/BeamSqlDslProjectTest.java| 163 
 .../beam/dsls/sql/planner/BasePlanner.java  | 108 
 .../sql/planner/BeamGroupByExplainTest.java | 106 
 .../sql/planner/BeamGroupByPipelineTest.java| 111 
 .../sql/planner/BeamInvalidGroupByTest.java |  51 
 .../BeamPlannerAggregationSubmitTest.java   | 152 ---
 .../sql/planner/BeamPlannerExplainTest.java |  67 -
 .../dsls/sql/planner/BeamPlannerSubmitTest.java |  56 
 .../sql/schema/BeamPCollectionTableTest.java|  73 --
 12 files changed, 626 insertions(+), 724 deletions(-)
--




[1/2] beam git commit: Update filter/project/aggregation tests to use BeamSql

2017-06-16 Thread takidau
Repository: beam
Updated Branches:
  refs/heads/DSL_SQL dcd769c8a -> 738eb4dd0


Update filter/project/aggregation tests to use BeamSql


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/86dea078
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/86dea078
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/86dea078

Branch: refs/heads/DSL_SQL
Commit: 86dea078eeb29ba92085dc6cd299aca00a23e7e9
Parents: dcd769c
Author: mingmxu 
Authored: Thu Jun 15 18:10:06 2017 -0700
Committer: Tyler Akidau 
Committed: Fri Jun 16 18:25:23 2017 -0700

--
 .../dsls/sql/BeamSqlDslAggregationTest.java | 260 +++
 .../apache/beam/dsls/sql/BeamSqlDslBase.java| 125 +
 .../beam/dsls/sql/BeamSqlDslFilterTest.java |  78 ++
 .../beam/dsls/sql/BeamSqlDslProjectTest.java| 163 
 .../beam/dsls/sql/planner/BasePlanner.java  | 108 
 .../sql/planner/BeamGroupByExplainTest.java | 106 
 .../sql/planner/BeamGroupByPipelineTest.java| 111 
 .../sql/planner/BeamInvalidGroupByTest.java |  51 
 .../BeamPlannerAggregationSubmitTest.java   | 152 ---
 .../sql/planner/BeamPlannerExplainTest.java |  67 -
 .../dsls/sql/planner/BeamPlannerSubmitTest.java |  56 
 .../sql/schema/BeamPCollectionTableTest.java|  73 --
 12 files changed, 626 insertions(+), 724 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/86dea078/dsls/sql/src/test/java/org/apache/beam/dsls/sql/BeamSqlDslAggregationTest.java
--
diff --git a/dsls/sql/src/test/java/org/apache/beam/dsls/sql/BeamSqlDslAggregationTest.java b/dsls/sql/src/test/java/org/apache/beam/dsls/sql/BeamSqlDslAggregationTest.java
new file mode 100644
index 000..f7349c6
--- /dev/null
+++ b/dsls/sql/src/test/java/org/apache/beam/dsls/sql/BeamSqlDslAggregationTest.java
@@ -0,0 +1,260 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.beam.dsls.sql;
+
+import java.sql.Types;
+import java.util.Arrays;
+import org.apache.beam.dsls.sql.schema.BeamSqlRecordType;
+import org.apache.beam.dsls.sql.schema.BeamSqlRow;
+import org.apache.beam.sdk.testing.PAssert;
+import org.apache.beam.sdk.values.PCollection;
+import org.apache.beam.sdk.values.PCollectionTuple;
+import org.apache.beam.sdk.values.TupleTag;
+import org.joda.time.Instant;
+import org.junit.Test;
+
+/**
+ * Tests for GROUP-BY/aggregation, with global_window/fix_time_window/sliding_window/session_window.
+ */
+public class BeamSqlDslAggregationTest extends BeamSqlDslBase {
+  /**
+   * GROUP-BY with single aggregation function.
+   */
+  @Test
+  public void testAggregationWithoutWindow() throws Exception {
+String sql = "SELECT f_int2, COUNT(*) AS `size` FROM TABLE_A GROUP BY 
f_int2";
+
+PCollection result =
+inputA1.apply("testAggregationWithoutWindow", 
BeamSql.simpleQuery(sql));
+
+BeamSqlRecordType resultType = BeamSqlRecordType.create(Arrays.asList("f_int2", "size"),
+Arrays.asList(Types.INTEGER, Types.BIGINT));
+
+BeamSqlRow record = new BeamSqlRow(resultType);
+record.addField("f_int2", 0);
+record.addField("size", 4L);
+
+PAssert.that(result).containsInAnyOrder(record);
+
+pipeline.run().waitUntilFinish();
+  }
+
+  /**
+   * GROUP-BY with multiple aggregation functions.
+   */
+  @Test
+  public void testAggregationFunctions() throws Exception{
+String sql = "select f_int2, count(*) as size, "
++ "sum(f_long) as sum1, avg(f_long) as avg1, max(f_long) as max1, 
min(f_long) as min1,"
++ "sum(f_short) as sum2, avg(f_short) as avg2, max(f_short) as max2, 
min(f_short) as min2,"
++ "sum(f_byte) as sum3, avg(f_byte) as avg3, max(f_byte) as max3, 
min(f_byte) as min3,"
++ "sum(f_float) as sum4, avg(f_float) as avg4, max(f_float) as max4, 
min(f_float) as min4,"
++ "sum(f_double) as sum5, avg(f_double) as avg5, "
++ "max(f_double) as max5, min(f_double) as min5,"
+  

[jira] [Commented] (BEAM-2452) Update filter/project/aggregation tests with BeamSql

2017-06-16 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-2452?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16052631#comment-16052631
 ] 

ASF GitHub Bot commented on BEAM-2452:
--

Github user XuMingmin closed the pull request at:

https://github.com/apache/beam/pull/3371


> Update filter/project/aggregation tests with BeamSql
> 
>
> Key: BEAM-2452
> URL: https://issues.apache.org/jira/browse/BEAM-2452
> Project: Beam
>  Issue Type: Test
>  Components: dsl-sql
>Reporter: Xu Mingmin
>Assignee: Xu Mingmin
>  Labels: dsl_sql_merge
>
> Update test cases in package {{org.apache.beam.dsls.sql.planner}}, using 
> methods in {{BeamSql}}. 
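
For reference, a condensed sketch of the pattern used by the new DSL tests (taken from 
testAggregationWithoutWindow in the commit above; inputA1 and pipeline come from 
BeamSqlDslBase, and the element type of the result PCollection is assumed to be 
BeamSqlRow):

{code}
// Condensed from the committed test; imports match those shown in the diff.
String sql = "SELECT f_int2, COUNT(*) AS `size` FROM TABLE_A GROUP BY f_int2";

PCollection<BeamSqlRow> result =
    inputA1.apply("testAggregationWithoutWindow", BeamSql.simpleQuery(sql));

BeamSqlRecordType resultType = BeamSqlRecordType.create(
    Arrays.asList("f_int2", "size"), Arrays.asList(Types.INTEGER, Types.BIGINT));

BeamSqlRow record = new BeamSqlRow(resultType);
record.addField("f_int2", 0);
record.addField("size", 4L);

PAssert.that(result).containsInAnyOrder(record);
pipeline.run().waitUntilFinish();
{code}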



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[GitHub] beam pull request #3371: [BEAM-2452] Update filter/project/aggregation tests...

2017-06-16 Thread XuMingmin
Github user XuMingmin closed the pull request at:

https://github.com/apache/beam/pull/3371


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[1/2] beam git commit: Remove UnsupportedOperationVisitor, which is currently just a no-op

2017-06-16 Thread takidau
Repository: beam
Updated Branches:
  refs/heads/DSL_SQL 738eb4dd0 -> 887cf3a1a


Remove UnsupportedOperationVisitor, which is currently just a no-op


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/98383b6f
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/98383b6f
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/98383b6f

Branch: refs/heads/DSL_SQL
Commit: 98383b6faa931bcf2df6c16f1098953723f83a28
Parents: 738eb4d
Author: Tyler Akidau 
Authored: Fri Jun 16 16:16:46 2017 -0700
Committer: Tyler Akidau 
Committed: Fri Jun 16 20:04:22 2017 -0700

--
 .../beam/dsls/sql/planner/BeamQueryPlanner.java |  4 +--
 .../planner/UnsupportedOperatorsVisitor.java| 28 
 2 files changed, 1 insertion(+), 31 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/98383b6f/dsls/sql/src/main/java/org/apache/beam/dsls/sql/planner/BeamQueryPlanner.java
--
diff --git a/dsls/sql/src/main/java/org/apache/beam/dsls/sql/planner/BeamQueryPlanner.java b/dsls/sql/src/main/java/org/apache/beam/dsls/sql/planner/BeamQueryPlanner.java
index ef71b53..2eaf9e7 100644
--- a/dsls/sql/src/main/java/org/apache/beam/dsls/sql/planner/BeamQueryPlanner.java
+++ b/dsls/sql/src/main/java/org/apache/beam/dsls/sql/planner/BeamQueryPlanner.java
@@ -152,9 +152,7 @@ public class BeamQueryPlanner {
   }
 
   private SqlNode validateNode(SqlNode sqlNode) throws ValidationException {
-SqlNode validatedSqlNode = planner.validate(sqlNode);
-validatedSqlNode.accept(new UnsupportedOperatorsVisitor());
-return validatedSqlNode;
+return planner.validate(sqlNode);
   }
 
   public Map getSourceTables() {

http://git-wip-us.apache.org/repos/asf/beam/blob/98383b6f/dsls/sql/src/main/java/org/apache/beam/dsls/sql/planner/UnsupportedOperatorsVisitor.java
--
diff --git a/dsls/sql/src/main/java/org/apache/beam/dsls/sql/planner/UnsupportedOperatorsVisitor.java b/dsls/sql/src/main/java/org/apache/beam/dsls/sql/planner/UnsupportedOperatorsVisitor.java
deleted file mode 100644
index 4a71024..000
--- a/dsls/sql/src/main/java/org/apache/beam/dsls/sql/planner/UnsupportedOperatorsVisitor.java
+++ /dev/null
@@ -1,28 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-package org.apache.beam.dsls.sql.planner;
-
-import org.apache.calcite.sql.util.SqlShuttle;
-
-/**
- * Unsupported operation to visit a RelNode.
- *
- */
-class UnsupportedOperatorsVisitor extends SqlShuttle {
-
-}



[2/2] beam git commit: [BEAM-2459] This closes #3384

2017-06-16 Thread takidau
[BEAM-2459] This closes #3384


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/887cf3a1
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/887cf3a1
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/887cf3a1

Branch: refs/heads/DSL_SQL
Commit: 887cf3a1a3bf6357bbb8efde519d470634ee6ea2
Parents: 738eb4d 98383b6
Author: Tyler Akidau 
Authored: Fri Jun 16 20:05:02 2017 -0700
Committer: Tyler Akidau 
Committed: Fri Jun 16 20:05:02 2017 -0700

--
 .../beam/dsls/sql/planner/BeamQueryPlanner.java |  4 +--
 .../planner/UnsupportedOperatorsVisitor.java| 28 
 2 files changed, 1 insertion(+), 31 deletions(-)
--




[GitHub] beam pull request #3384: [BEAM-2459] Remove UnsupportedOperationVisitor, whi...

2017-06-16 Thread takidau
Github user takidau closed the pull request at:

https://github.com/apache/beam/pull/3384


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[jira] [Commented] (BEAM-2459) Remove no-op UnsupportedOperationVisitor

2017-06-16 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-2459?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16052659#comment-16052659
 ] 

ASF GitHub Bot commented on BEAM-2459:
--

Github user takidau closed the pull request at:

https://github.com/apache/beam/pull/3384


> Remove no-op UnsupportedOperationVisitor
> 
>
> Key: BEAM-2459
> URL: https://issues.apache.org/jira/browse/BEAM-2459
> Project: Beam
>  Issue Type: Task
>  Components: dsl-sql
>Reporter: Tyler Akidau
>Assignee: Tyler Akidau
>
> UnsupportedOperationVisitor appears to simply extend Calcite's SqlShuttle, 
> with no overrides, which means nothing actually happens when it's used. We 
> should remove it unless I'm missing something or there's a plan to use it 
> differently in the future.
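
Purely for illustration, a hypothetical sketch of what a non-no-op visitor would have to 
look like if the check were ever reinstated (the choice of EXISTS as the rejected 
operator is arbitrary; only SqlShuttle and the Calcite SQL node types are real):

{code}
// Hypothetical and illustrative only: a SqlShuttle that actually rejects something.
import org.apache.calcite.sql.SqlCall;
import org.apache.calcite.sql.SqlKind;
import org.apache.calcite.sql.SqlNode;
import org.apache.calcite.sql.util.SqlShuttle;

class RejectingOperatorsVisitor extends SqlShuttle {
  @Override
  public SqlNode visit(SqlCall call) {
    // EXISTS stands in for "anything the DSL does not support yet".
    if (call.getKind() == SqlKind.EXISTS) {
      throw new UnsupportedOperationException(
          "Unsupported operator: " + call.getOperator().getName());
    }
    return super.visit(call);
  }
}
{code}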



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[jira] [Closed] (BEAM-2459) Remove no-op UnsupportedOperationVisitor

2017-06-16 Thread Tyler Akidau (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-2459?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Tyler Akidau closed BEAM-2459.
--
   Resolution: Fixed
Fix Version/s: Not applicable

> Remove no-op UnsupportedOperationVisitor
> 
>
> Key: BEAM-2459
> URL: https://issues.apache.org/jira/browse/BEAM-2459
> Project: Beam
>  Issue Type: Task
>  Components: dsl-sql
>Reporter: Tyler Akidau
>Assignee: Tyler Akidau
> Fix For: Not applicable
>
>
> UnsupportedOperationVisitor appears to simply extend Calcite's SqlShuttle, 
> with no overrides, which means nothing actually happens when it's used. We 
> should remove it unless I'm missing something or there's a plan to use it 
> differently in the future.



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Spark #2389

2017-06-16 Thread Apache Jenkins Server
See 


--
[...truncated 311.43 KB...]
2017-06-17T06:03:02.317 [WARNING] bootstrap class path not set in conjunction 
with -source 1.7
2017-06-17T06:03:02.370 [INFO] 
2017-06-17T06:03:02.370 [INFO] --- maven-resources-plugin:3.0.2:testResources 
(default-testResources) @ beam-sdks-common-fn-api ---
2017-06-17T06:03:02.372 [INFO] Using 'UTF-8' encoding to copy filtered 
resources.
2017-06-17T06:03:02.372 [INFO] Copying 1 resource
2017-06-17T06:03:02.373 [INFO] Copying 3 resources
2017-06-17T06:03:02.464 [INFO] 
2017-06-17T06:03:02.464 [INFO] --- maven-compiler-plugin:3.6.1:testCompile 
(default-testCompile) @ beam-sdks-common-fn-api ---
2017-06-17T06:03:02.467 [INFO] No sources to compile
2017-06-17T06:03:02.580 [INFO] 
2017-06-17T06:03:02.580 [INFO] --- maven-checkstyle-plugin:2.17:check (default) 
@ beam-sdks-common-fn-api ---
2017-06-17T06:03:02.632 [INFO] 
2017-06-17T06:03:02.632 [INFO] >>> findbugs-maven-plugin:3.0.4:check (default) 
> :findbugs @ beam-sdks-common-fn-api >>>
2017-06-17T06:03:02.635 [INFO] 
2017-06-17T06:03:02.635 [INFO] --- findbugs-maven-plugin:3.0.4:findbugs 
(findbugs) @ beam-sdks-common-fn-api ---
2017-06-17T06:03:02.740 [INFO] 
2017-06-17T06:03:02.740 [INFO] <<< findbugs-maven-plugin:3.0.4:check (default) 
< :findbugs @ beam-sdks-common-fn-api <<<
2017-06-17T06:03:02.741 [INFO] 
2017-06-17T06:03:02.741 [INFO] --- findbugs-maven-plugin:3.0.4:check (default) 
@ beam-sdks-common-fn-api ---
2017-06-17T06:03:02.797 [INFO] 
2017-06-17T06:03:02.797 [INFO] --- maven-surefire-plugin:2.20:test 
(default-test) @ beam-sdks-common-fn-api ---
[JENKINS] Recording test results
2017-06-17T06:03:02.851 [INFO] 
2017-06-17T06:03:02.851 [INFO] --- 
build-helper-maven-plugin:3.0.0:regex-properties (render-artifact-id) @ 
beam-sdks-common-fn-api ---

2017-06-17T06:03:02.988 [INFO] 
2017-06-17T06:03:02.988 [INFO] --- maven-jar-plugin:3.0.2:jar (default-jar) @ 
beam-sdks-common-fn-api ---
2017-06-17T06:03:03.014 [INFO] Building jar: 

2017-06-17T06:03:03.098 [INFO] 
2017-06-17T06:03:03.098 [INFO] --- maven-site-plugin:3.5.1:attach-descriptor 
(attach-descriptor) @ beam-sdks-common-fn-api ---
2017-06-17T06:03:03.200 [INFO] 
2017-06-17T06:03:03.201 [INFO] --- maven-jar-plugin:3.0.2:test-jar 
(default-test-jar) @ beam-sdks-common-fn-api ---
2017-06-17T06:03:03.204 [INFO] Building jar: 

2017-06-17T06:03:03.262 [INFO] 
2017-06-17T06:03:03.263 [INFO] --- maven-shade-plugin:3.0.0:shade 
(bundle-and-repackage) @ beam-sdks-common-fn-api ---
2017-06-17T06:03:03.265 [INFO] Excluding 
org.apache.beam:beam-sdks-common-runner-api:jar:2.1.0-SNAPSHOT from the shaded 
jar.
2017-06-17T06:03:03.265 [INFO] Excluding 
com.google.protobuf:protobuf-java:jar:3.2.0 from the shaded jar.
2017-06-17T06:03:03.265 [INFO] Excluding io.grpc:grpc-core:jar:1.2.0 from the 
shaded jar.
2017-06-17T06:03:03.265 [INFO] Including com.google.guava:guava:jar:20.0 in the 
shaded jar.
2017-06-17T06:03:03.265 [INFO] Excluding 
com.google.errorprone:error_prone_annotations:jar:2.0.15 from the shaded jar.
2017-06-17T06:03:03.265 [INFO] Excluding 
com.google.code.findbugs:jsr305:jar:3.0.1 from the shaded jar.
2017-06-17T06:03:03.265 [INFO] Excluding io.grpc:grpc-context:jar:1.2.0 from 
the shaded jar.
2017-06-17T06:03:03.265 [INFO] Excluding 
com.google.instrumentation:instrumentation-api:jar:0.3.0 from the shaded jar.
2017-06-17T06:03:03.265 [INFO] Excluding io.grpc:grpc-protobuf:jar:1.2.0 from 
the shaded jar.
2017-06-17T06:03:03.265 [INFO] Excluding 
com.google.protobuf:protobuf-java-util:jar:3.2.0 from the shaded jar.
2017-06-17T06:03:03.265 [INFO] Excluding com.google.code.gson:gson:jar:2.7 from 
the shaded jar.
2017-06-17T06:03:03.265 [INFO] Excluding io.grpc:grpc-protobuf-lite:jar:1.2.0 
from the shaded jar.
2017-06-17T06:03:03.265 [INFO] Excluding io.grpc:grpc-stub:jar:1.2.0 from the 
shaded jar.
2017-06-17T06:03:05.478 [INFO] Replacing original artifact with shaded artifact.
2017-06-17T06:03:05.478 [INFO] Replacing 

 with 

2017-06-17T06:03:05.478 [INFO] Replacing original test artifact with shaded 
test artifact.
2017-06-17T06:03:05.478 [INFO] Replacing 

 with 


Build failed in Jenkins: beam_PostCommit_Java_JDK_Versions_Test » JDK 1.7 (latest),beam #95

2017-06-16 Thread Apache Jenkins Server
See 


--
[...truncated 1.60 MB...]
  [javadoc] 
:319:
 warning - Tag @link: can't find withFilenamePolicy(FilenamePolicy) in 
org.apache.beam.sdk.io.AvroIO.Write
  [javadoc] 
:303:
 warning - Tag @link: can't find withFilenamePolicy(FilenamePolicy) in 
org.apache.beam.sdk.io.AvroIO.Write
  [javadoc] 
:358:
 warning - Tag @link: can't find withFilenamePolicy(FilenamePolicy) in 
org.apache.beam.sdk.io.AvroIO.Write
  [javadoc] 
:369:
 warning - Tag @link: can't find withFilenamePolicy(FilenamePolicy) in 
org.apache.beam.sdk.io.AvroIO.Write
  [javadoc] 
:534:
 warning - Tag @link: reference not found: 
FileBasedReader#startReading(ReadableByteChannel)
  [javadoc] 
:534:
 warning - Tag @link: reference not found: 
FileBasedReader#startReading(ReadableByteChannel)
  [javadoc] 
:184:
 warning - Tag @link: can't find FileBasedSource(Metadata, long, long, long) in 
org.apache.beam.sdk.io.FileBasedSource
  [javadoc] 
:295:
 warning - Tag @link: reference not found: BoundedReader#start
  [javadoc] 
:295:
 warning - Tag @see: reference not found: BoundedReader#start
  [javadoc] 
:309:
 warning - Tag @link: reference not found: BoundedReader#advance
  [javadoc] 
:309:
 warning - Tag @see: reference not found: BoundedReader#advance
  [javadoc] 
:80:
 warning - Tag @link: reference not found: ResourceIdT
  [javadoc] 
:70:
 warning - Tag @link: reference not found: ResourceIdT ResourceIds
  [javadoc] 
:93:
 warning - Tag @link: reference not found: ResourceIdT
  [javadoc] 
:70:
 warning - Tag @link: reference not found: ResourceIdT ResourceIds
  [javadoc] 
:70:
 warning - Tag @link: reference not found: ResourceIdT
  [javadoc] 
:70:
 warning - Tag @link: reference not found: Reso

Build failed in Jenkins: beam_PostCommit_Java_MavenInstall_Windows #143

2017-06-16 Thread Apache Jenkins Server
See 


--
[...truncated 2.65 MB...]
2017-06-17T06:33:35.884 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/sonatype/gossip/gossip/1.8/gossip-1.8.pom
2017-06-17T06:33:35.894 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/sonatype/gossip/gossip/1.8/gossip-1.8.pom
 (12 KB at 1125.6 KB/sec)
2017-06-17T06:33:35.898 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/sonatype/forge/forge-parent/9/forge-parent-9.pom
2017-06-17T06:33:35.909 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/sonatype/forge/forge-parent/9/forge-parent-9.pom
 (13 KB at 1166.7 KB/sec)
2017-06-17T06:33:35.914 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/sonatype/gossip/gossip-core/1.8/gossip-core-1.8.pom
2017-06-17T06:33:35.923 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/sonatype/gossip/gossip-core/1.8/gossip-core-1.8.pom
 (3 KB at 241.4 KB/sec)
2017-06-17T06:33:35.929 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/sonatype/gossip/gossip-bootstrap/1.8/gossip-bootstrap-1.8.pom
2017-06-17T06:33:35.938 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/sonatype/gossip/gossip-bootstrap/1.8/gossip-bootstrap-1.8.pom
 (2 KB at 174.8 KB/sec)
2017-06-17T06:33:35.942 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/com/google/guava/guava/14.0.1/guava-14.0.1.pom
2017-06-17T06:33:35.954 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/com/google/guava/guava/14.0.1/guava-14.0.1.pom
 (6 KB at 437.5 KB/sec)
2017-06-17T06:33:35.958 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/com/google/guava/guava-parent/14.0.1/guava-parent-14.0.1.pom
2017-06-17T06:33:35.967 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/com/google/guava/guava-parent/14.0.1/guava-parent-14.0.1.pom
 (3 KB at 277.2 KB/sec)
2017-06-17T06:33:35.970 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/codehaus/plexus/plexus-classworlds/2.4.2/plexus-classworlds-2.4.2.pom
2017-06-17T06:33:35.980 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/codehaus/plexus/plexus-classworlds/2.4.2/plexus-classworlds-2.4.2.pom
 (4 KB at 342.7 KB/sec)
2017-06-17T06:33:35.985 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/codehaus/plexus/plexus-interpolation/1.16/plexus-interpolation-1.16.pom
2017-06-17T06:33:35.995 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/codehaus/plexus/plexus-interpolation/1.16/plexus-interpolation-1.16.pom
 (2 KB at 100.3 KB/sec)
2017-06-17T06:33:36.000 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/eclipse/aether/aether-api/0.9.0.M2/aether-api-0.9.0.M2.pom
2017-06-17T06:33:36.007 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/eclipse/aether/aether-api/0.9.0.M2/aether-api-0.9.0.M2.pom
 (2 KB at 242.3 KB/sec)
2017-06-17T06:33:36.012 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/codehaus/gmaven/gmaven-adapter-api/2.0/gmaven-adapter-api-2.0.pom
2017-06-17T06:33:36.021 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/codehaus/gmaven/gmaven-adapter-api/2.0/gmaven-adapter-api-2.0.pom
 (2 KB at 196.0 KB/sec)
2017-06-17T06:33:36.026 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/codehaus/gmaven/gmaven-adapter-impl/2.0/gmaven-adapter-impl-2.0.pom
2017-06-17T06:33:36.035 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/codehaus/gmaven/gmaven-adapter-impl/2.0/gmaven-adapter-impl-2.0.pom
 (3 KB at 294.9 KB/sec)
2017-06-17T06:33:36.041 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/codehaus/groovy/groovy-all/2.1.5/groovy-all-2.1.5.pom
2017-06-17T06:33:36.053 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/codehaus/groovy/groovy-all/2.1.5/groovy-all-2.1.5.pom
 (18 KB at 1468.0 KB/sec)
2017-06-17T06:33:36.058 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/ant/ant/1.8.4/ant-1.8.4.pom
2017-06-17T06:33:36.070 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/ant/ant/1.8.4/ant-1.8.4.pom (10 
KB at 785.6 KB/sec)
2017-06-17T06:33:36.073 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/ant/ant-parent/1.8.4/ant-parent-1.8.4.pom
2017-06-17T06:33:36.081 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/ant/ant-parent/1.8.4/ant-parent-1.8.4.pom
 (5 KB at 556.8 KB/sec)
2017-06-17T06:33:36.085 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/ant/ant-launcher/1.8.4/ant-launcher-1.8.4.pom
2017-06-17T06:33:36.094 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/ant/ant-launcher/1.8.4/ant-launcher-1.8.4.pom
 (3 KB at 256.2 KB/sec)
2017-06-17T06:33:36.098 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/fusesource/jansi/jansi/1.11/jansi-1.11.pom
2017-06-17T06:33:36.109 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/fusesource/jansi/jans