flink git commit: [FLINK-3074] Add config option to start YARN AM on port range

2015-12-09 Thread rmetzger
Repository: flink
Updated Branches:
  refs/heads/master 1190f3b1d -> 74b535d56


[FLINK-3074] Add config option to start YARN AM on port range

This closes #1416


Project: http://git-wip-us.apache.org/repos/asf/flink/repo
Commit: http://git-wip-us.apache.org/repos/asf/flink/commit/74b535d5
Tree: http://git-wip-us.apache.org/repos/asf/flink/tree/74b535d5
Diff: http://git-wip-us.apache.org/repos/asf/flink/diff/74b535d5

Branch: refs/heads/master
Commit: 74b535d56a81fdf7460b9ce632e14e3c3d119355
Parents: 1190f3b
Author: Robert Metzger 
Authored: Thu Nov 26 17:38:45 2015 +0100
Committer: Robert Metzger 
Committed: Wed Dec 9 17:41:05 2015 +0100

--
 docs/setup/config.md| 13 +++-
 docs/setup/yarn_setup.md| 25 +++
 .../flink/configuration/ConfigConstants.java| 22 ++
 .../java/org/apache/flink/util/NetUtils.java| 73 ---
 .../org/apache/flink/util/NetUtilsTest.java | 33 +++--
 .../apache/flink/runtime/blob/BlobServer.java   | 26 +++
 flink-yarn/src/main/resources/log4j.properties  | 24 +++
 .../flink/yarn/ApplicationMasterBase.scala  | 74 ++--
 8 files changed, 251 insertions(+), 39 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/flink/blob/74b535d5/docs/setup/config.md
--
diff --git a/docs/setup/config.md b/docs/setup/config.md
index bc70e1d..a8aba49 100644
--- a/docs/setup/config.md
+++ b/docs/setup/config.md
@@ -224,7 +224,7 @@ Note: State backend must be accessible from the JobManager, 
use file:// only for
 - `blob.server.port`: Port definition for the blob server (serving user jars) on the TaskManagers.
 By default the port is set to 0, which means that the operating system picks an ephemeral port.
 Flink also accepts a list of ports ("50100,50101"), ranges ("50100-50200") or 
a combination of both.
-It is recommended to set a range of ports to avoid collisions when multiple 
TaskManagers are running
+It is recommended to set a range of ports to avoid collisions when multiple 
JobManagers are running
 on the same machine.
 
 - `execution-retries.delay`: Delay between execution retries. Default value "5 
s". Note that values
@@ -428,6 +428,17 @@ For example for passing `LD_LIBRARY_PATH` as an env 
variable to the ApplicationM
 - `yarn.taskmanager.env.` Similar to the configuration prefix above, this prefix allows setting custom
 environment variables for the TaskManager processes.
 
+
+- `yarn.application-master.port` (Default: 0, which lets the OS choose an ephemeral port)
+With this configuration option, users can specify a port, a range of ports, or a list of ports for the
+Application Master (and JobManager) RPC port. We recommend keeping the default value (0) to
+let the operating system choose an appropriate port. In particular, when multiple AMs are running on the
+same physical host, fixed port assignments can prevent an AM from starting.
+
+For example, when running Flink on YARN in an environment with a restrictive firewall, this
+option allows specifying a range of allowed ports.
+
+
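The two port options above accept single ports, lists, and ranges with the same syntax. As an illustration only (the port numbers below are hypothetical, not defaults), both keys could be pinned to firewall-friendly ranges in `flink-conf.yaml`:

```yaml
# BlobServer (user-jar distribution): accepts a list ("50100,50101"),
# a range ("50100-50200"), or a combination of both.
blob.server.port: 50100-50200

# YARN ApplicationMaster / JobManager RPC port: same syntax;
# 0 (the default) lets the operating system pick an ephemeral port.
yarn.application-master.port: 50201-50250
```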
 ## High Availability Mode
 
 - `recovery.mode`: (Default 'standalone') Defines the recovery mode used for 
the cluster execution. Currently,

http://git-wip-us.apache.org/repos/asf/flink/blob/74b535d5/docs/setup/yarn_setup.md
--
diff --git a/docs/setup/yarn_setup.md b/docs/setup/yarn_setup.md
index b95b8a5..a7309e4 100644
--- a/docs/setup/yarn_setup.md
+++ b/docs/setup/yarn_setup.md
@@ -257,6 +257,31 @@ It allows accessing log files for running YARN applications and shows diagnostic
 Users using Hadoop distributions from companies like Hortonworks, Cloudera or 
MapR might have to build Flink against their specific versions of Hadoop (HDFS) 
and YARN. Please read the [build instructions](building.html) for more details.
 
 
+## Running Flink on YARN behind Firewalls
+
+Some YARN clusters use firewalls to control the network traffic between the cluster and the rest of the network.
+In those setups, Flink jobs can only be submitted to a YARN session from within the cluster's network (behind the firewall).
+If this is not feasible for production use, Flink allows configuring a port range for all relevant services. With these
+ranges configured, users can also submit jobs to Flink across the firewall.
+
+Currently, two services are needed to submit a job:
+
 * The JobManager (ApplicationMaster in YARN)
+ * The BlobServer running within the JobManager.
+ 
+When submitting a job to Flink, the BlobServer will distribute the jars with 
the user code to all worker nodes (TaskManagers).
+The JobManager receives the job itself and triggers the execution.
+
+The two configuration parameters for specifying t
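The port options described here accept lists ("50100,50101"), ranges ("50100-50200"), and combinations of both. A minimal, self-contained sketch of how such a specification can be expanded into concrete ports (an illustration only, not Flink's actual `NetUtils` code):

```java
import java.util.ArrayList;
import java.util.List;

/** Hypothetical parser for port specs such as "50100,50101" or "50100-50200". */
public class PortRangeParser {

    /** Expands a comma-separated list of single ports and inclusive ranges into concrete ports. */
    public static List<Integer> parse(String spec) {
        List<Integer> ports = new ArrayList<>();
        for (String part : spec.split(",")) {
            part = part.trim();
            int dash = part.indexOf('-');
            if (dash >= 0) {
                // "start-end" denotes an inclusive range
                int start = Integer.parseInt(part.substring(0, dash).trim());
                int end = Integer.parseInt(part.substring(dash + 1).trim());
                for (int p = start; p <= end; p++) {
                    ports.add(p);
                }
            } else {
                // a single port number
                ports.add(Integer.parseInt(part));
            }
        }
        return ports;
    }

    public static void main(String[] args) {
        System.out.println(parse("50100,50102-50104")); // [50100, 50102, 50103, 50104]
    }
}
```

A real implementation would additionally validate the 0-65535 bounds and try the candidate ports in turn until a bind succeeds.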

flink git commit: [Storm Compatibility] Updated README.md and documentation

2015-12-09 Thread mjsax
Repository: flink
Updated Branches:
  refs/heads/master b2aa1d9e3 -> 1190f3b1d


[Storm Compatibility] Updated README.md and documentation


Project: http://git-wip-us.apache.org/repos/asf/flink/repo
Commit: http://git-wip-us.apache.org/repos/asf/flink/commit/1190f3b1
Tree: http://git-wip-us.apache.org/repos/asf/flink/tree/1190f3b1
Diff: http://git-wip-us.apache.org/repos/asf/flink/diff/1190f3b1

Branch: refs/heads/master
Commit: 1190f3b1d9911e4d098b3c7ef27f4a1659ef
Parents: b2aa1d9
Author: mjsax 
Authored: Wed Dec 9 16:27:35 2015 +0100
Committer: mjsax 
Committed: Wed Dec 9 16:27:35 2015 +0100

--
 docs/apis/storm_compatibility.md| 11 ---
 flink-contrib/flink-storm/README.md |  4 +++-
 2 files changed, 11 insertions(+), 4 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/flink/blob/1190f3b1/docs/apis/storm_compatibility.md
--
diff --git a/docs/apis/storm_compatibility.md b/docs/apis/storm_compatibility.md
index ccc7d78..852bbef 100644
--- a/docs/apis/storm_compatibility.md
+++ b/docs/apis/storm_compatibility.md
@@ -52,9 +52,12 @@ Add the following dependency to your `pom.xml` if you want 
to execute Storm code
 **Please note**: Do not add `storm-core` as a dependency. It is already 
included via `flink-storm`.
 
 **Please note**: `flink-storm` is not part of the provided binary Flink 
distribution.
-Thus, you need to include `flink-storm` classes (and their dependencies) in 
your program jar that is submitted to Flink's JobManager.
+Thus, you need to include `flink-storm` classes (and their dependencies) in your program jar (also called uber-jar or fat-jar) that is submitted to Flink's JobManager.
 See *WordCount Storm* within `flink-storm-examples/pom.xml` for an example how 
to package a jar correctly.
 
+If you want to avoid large uber-jars, you can manually copy `storm-core-0.9.4.jar`, `json-simple-1.1.jar` and `flink-storm-{{site.version}}.jar` into Flink's `lib/` folder of each cluster node (*before* the cluster is started).
+In this case, it is sufficient to include only your own Spout and Bolt classes (and their internal dependencies) in the program jar.
+
 # Execute Storm Topologies
 
 Flink provides a Storm compatible API (`org.apache.flink.storm.api`) that 
offers replacements for the following classes:
@@ -80,13 +83,15 @@ builder.setBolt("sink", new 
BoltFileSink(outputFilePath)).shuffleGrouping("count
 
 Config conf = new Config();
 if(runLocal) { // submit to test cluster
-   FlinkLocalCluster cluster = new FlinkLocalCluster(); // replaces: 
LocalCluster cluster = new LocalCluster();
+   // replaces: LocalCluster cluster = new LocalCluster();
+   FlinkLocalCluster cluster = new FlinkLocalCluster();
cluster.submitTopology("WordCount", conf, 
FlinkTopology.createTopology(builder));
 } else { // submit to remote cluster
// optional
// conf.put(Config.NIMBUS_HOST, "remoteHost");
// conf.put(Config.NIMBUS_THRIFT_PORT, 6123);
-   FlinkSubmitter.submitTopology("WordCount", conf, 
FlinkTopology.createTopology(builder)); // replaces: 
StormSubmitter.submitTopology(topologyId, conf, builder.createTopology());
+   // replaces: StormSubmitter.submitTopology(topologyId, conf, 
builder.createTopology());
+   FlinkSubmitter.submitTopology("WordCount", conf, 
FlinkTopology.createTopology(builder));
 }
 ~~~
 

http://git-wip-us.apache.org/repos/asf/flink/blob/1190f3b1/flink-contrib/flink-storm/README.md
--
diff --git a/flink-contrib/flink-storm/README.md 
b/flink-contrib/flink-storm/README.md
index 239780c..fd0f455 100644
--- a/flink-contrib/flink-storm/README.md
+++ b/flink-contrib/flink-storm/README.md
@@ -1,10 +1,12 @@
 # flink-storm
 
 `flink-storm` is a compatibility layer for Apache Storm that allows embedding Spouts or Bolts unmodified within a regular Flink streaming program (`SpoutWrapper` and `BoltWrapper`).
-Additionally, a whole Storm topology can be submitted to Flink (see `FlinkTopologyBuilder`, `FlinkLocalCluster`, and `FlinkSubmitter`).
+Additionally, a whole Storm topology can be submitted to Flink (see `FlinkLocalCluster` and `FlinkSubmitter`).
 Only a few minor changes to the original submitting code are required.
 The code that builds the topology itself can be reused unmodified. See `flink-storm-examples` for a simple word-count example.
 
+**Please note**: Do not add `storm-core` as a dependency. It is already 
included via `flink-storm`.
+
 The following Storm features are not (yet/fully) supported by the 
compatibility layer right now:
 * tuple meta information
 * no fault-tolerance guarantees (i.e., calls to `ack()`/`fail()` and anchoring are ignored)



flink git commit: [FLINK-3145][storm] pin Kryo version of transitive dependencies

2015-12-09 Thread mxm
Repository: flink
Updated Branches:
  refs/heads/release-0.10 f3f2ced46 -> f97803040


[FLINK-3145][storm] pin Kryo version of transitive dependencies

This closes #1441.


Project: http://git-wip-us.apache.org/repos/asf/flink/repo
Commit: http://git-wip-us.apache.org/repos/asf/flink/commit/f9780304
Tree: http://git-wip-us.apache.org/repos/asf/flink/tree/f9780304
Diff: http://git-wip-us.apache.org/repos/asf/flink/diff/f9780304

Branch: refs/heads/release-0.10
Commit: f97803040a08553fdc006574df07abfe858389c3
Parents: f3f2ced
Author: Maximilian Michels 
Authored: Tue Dec 8 18:51:31 2015 +0100
Committer: Maximilian Michels 
Committed: Wed Dec 9 16:18:32 2015 +0100

--
 docs/apis/storm_compatibility.md  | 2 ++
 flink-contrib/flink-storm/pom.xml | 7 +++
 2 files changed, 9 insertions(+)
--


http://git-wip-us.apache.org/repos/asf/flink/blob/f9780304/docs/apis/storm_compatibility.md
--
diff --git a/docs/apis/storm_compatibility.md b/docs/apis/storm_compatibility.md
index 5eb5f07..cc5ab38 100644
--- a/docs/apis/storm_compatibility.md
+++ b/docs/apis/storm_compatibility.md
@@ -49,6 +49,8 @@ Add the following dependency to your `pom.xml` if you want to 
execute Storm code
 
 ~~~
 
+**Please note**: Do not add `storm-core` as a dependency. It is already 
included via `flink-storm`.
+
 **Please note**: `flink-storm` is not part of the provided binary Flink 
distribution.
 Thus, you need to include `flink-storm` classes (and their dependencies) in 
your program jar that is submitted to Flink's JobManager.
 See *WordCount Storm* within `flink-storm-examples/pom.xml` for an example how 
to package a jar correctly.

http://git-wip-us.apache.org/repos/asf/flink/blob/f9780304/flink-contrib/flink-storm/pom.xml
--
diff --git a/flink-contrib/flink-storm/pom.xml 
b/flink-contrib/flink-storm/pom.xml
index 2424b55..71cbaf4 100644
--- a/flink-contrib/flink-storm/pom.xml
+++ b/flink-contrib/flink-storm/pom.xml
@@ -36,6 +36,13 @@ under the License.
 


+		<dependency>
+			<groupId>com.esotericsoftware.kryo</groupId>
+			<artifactId>kryo</artifactId>
+		</dependency>
+
		<dependency>
			<groupId>org.apache.flink</groupId>
			<artifactId>flink-streaming-java</artifactId>
			<version>${project.version}</version>



flink git commit: [FLINK-3143] update Closure Cleaner's ASM references to ASM5

2015-12-09 Thread mxm
Repository: flink
Updated Branches:
  refs/heads/release-0.10 bf9caa5fa -> f3f2ced46


[FLINK-3143] update Closure Cleaner's ASM references to ASM5

- This solves errors with reflectasm using Scala 2.11 and Java 8


Project: http://git-wip-us.apache.org/repos/asf/flink/repo
Commit: http://git-wip-us.apache.org/repos/asf/flink/commit/f3f2ced4
Tree: http://git-wip-us.apache.org/repos/asf/flink/tree/f3f2ced4
Diff: http://git-wip-us.apache.org/repos/asf/flink/diff/f3f2ced4

Branch: refs/heads/release-0.10
Commit: f3f2ced465584903a5bc7796408f576d287b
Parents: bf9caa5
Author: Maximilian Michels 
Authored: Wed Dec 9 11:42:09 2015 +0100
Committer: Maximilian Michels 
Committed: Wed Dec 9 16:15:58 2015 +0100

--
 .../org/apache/flink/api/java/ClosureCleaner.java |  6 +++---
 .../org/apache/flink/api/scala/ClosureCleaner.scala   | 14 +++---
 2 files changed, 10 insertions(+), 10 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/flink/blob/f3f2ced4/flink-java/src/main/java/org/apache/flink/api/java/ClosureCleaner.java
--
diff --git 
a/flink-java/src/main/java/org/apache/flink/api/java/ClosureCleaner.java 
b/flink-java/src/main/java/org/apache/flink/api/java/ClosureCleaner.java
index ec1ebd6..37cf8d3 100644
--- a/flink-java/src/main/java/org/apache/flink/api/java/ClosureCleaner.java
+++ b/flink-java/src/main/java/org/apache/flink/api/java/ClosureCleaner.java
@@ -105,7 +105,7 @@ class This0AccessFinder extends ClassVisitor {
private String this0Name;
 
public This0AccessFinder(String this0Name) {
-   super(Opcodes.ASM4);
+   super(Opcodes.ASM5);
this.this0Name = this0Name;
}
 
@@ -115,7 +115,7 @@ class This0AccessFinder extends ClassVisitor {
 
@Override
public MethodVisitor visitMethod(int access, String name, String desc, 
String sig, String[] exceptions) {
-   return new MethodVisitor(Opcodes.ASM4) {
+   return new MethodVisitor(Opcodes.ASM5) {
 
@Override
public void visitFieldInsn(int op, String owner, String 
name, String desc) {
@@ -125,4 +125,4 @@ class This0AccessFinder extends ClassVisitor {
}
};
}
-}
\ No newline at end of file
+}

http://git-wip-us.apache.org/repos/asf/flink/blob/f3f2ced4/flink-scala/src/main/scala/org/apache/flink/api/scala/ClosureCleaner.scala
--
diff --git 
a/flink-scala/src/main/scala/org/apache/flink/api/scala/ClosureCleaner.scala 
b/flink-scala/src/main/scala/org/apache/flink/api/scala/ClosureCleaner.scala
index aee57f5..00ffcfc 100644
--- a/flink-scala/src/main/scala/org/apache/flink/api/scala/ClosureCleaner.scala
+++ b/flink-scala/src/main/scala/org/apache/flink/api/scala/ClosureCleaner.scala
@@ -253,11 +253,11 @@ object ClosureCleaner {
 }
 
 private[flink]
-class ReturnStatementFinder extends ClassVisitor(ASM4) {
+class ReturnStatementFinder extends ClassVisitor(ASM5) {
   override def visitMethod(access: Int, name: String, desc: String,
sig: String, exceptions: Array[String]): 
MethodVisitor = {
 if (name.contains("apply")) {
-  new MethodVisitor(ASM4) {
+  new MethodVisitor(ASM5) {
 override def visitTypeInsn(op: Int, tp: String) {
   if (op == NEW && tp.contains("scala/runtime/NonLocalReturnControl")) 
{
 throw new InvalidProgramException("Return statements aren't 
allowed in Flink closures")
@@ -265,16 +265,16 @@ class ReturnStatementFinder extends ClassVisitor(ASM4) {
 }
   }
 } else {
-  new MethodVisitor(ASM4) {}
+  new MethodVisitor(ASM5) {}
 }
   }
 }
 
 private[flink]
-class FieldAccessFinder(output: Map[Class[_], Set[String]]) extends 
ClassVisitor(ASM4) {
+class FieldAccessFinder(output: Map[Class[_], Set[String]]) extends 
ClassVisitor(ASM5) {
   override def visitMethod(access: Int, name: String, desc: String,
sig: String, exceptions: Array[String]): 
MethodVisitor = {
-new MethodVisitor(ASM4) {
+new MethodVisitor(ASM5) {
   override def visitFieldInsn(op: Int, owner: String, name: String, desc: 
String) {
 if (op == GETFIELD) {
   for (cl <- output.keys if cl.getName == owner.replace('/', '.')) {
@@ -297,7 +297,7 @@ class FieldAccessFinder(output: Map[Class[_], Set[String]]) 
extends ClassVisitor
   }
 }
 
-private[flink] class InnerClosureFinder(output: Set[Class[_]]) extends 
ClassVisitor(ASM4) {
+private[flink] class InnerClosureFinder(output: Set[Class[_]]) extends 
ClassVisitor(ASM5) {
   var myName: String = null
 
   override def visit(version: Int, access: Int, name: String, sig: String,
@@ -307,7 +307,7 @@ private[flink] c

[3/3] flink git commit: [FLINK-3155][docker] update Flink version to 0.10.1

2015-12-09 Thread mxm
[FLINK-3155][docker] update Flink version to 0.10.1

This closes #1443.


Project: http://git-wip-us.apache.org/repos/asf/flink/repo
Commit: http://git-wip-us.apache.org/repos/asf/flink/commit/3de49f52
Tree: http://git-wip-us.apache.org/repos/asf/flink/tree/3de49f52
Diff: http://git-wip-us.apache.org/repos/asf/flink/diff/3de49f52

Branch: refs/heads/master
Commit: 3de49f521c275c3e3e053a0d83572e1a63b138b7
Parents: 4f12356
Author: Romeo Kienzler 
Authored: Tue Dec 8 12:25:54 2015 -0800
Committer: Maximilian Michels 
Committed: Wed Dec 9 15:48:30 2015 +0100

--
 flink-contrib/docker-flink/flink/Dockerfile | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/flink/blob/3de49f52/flink-contrib/docker-flink/flink/Dockerfile
--
diff --git a/flink-contrib/docker-flink/flink/Dockerfile 
b/flink-contrib/docker-flink/flink/Dockerfile
index 2a49819..b12ab66 100644
--- a/flink-contrib/docker-flink/flink/Dockerfile
+++ b/flink-contrib/docker-flink/flink/Dockerfile
@@ -25,8 +25,8 @@ RUN cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys && chmod 
600 ~/.ssh/*
 ##Flink 0.10.0 Installation
 ###Download:
 RUN mkdir ~/downloads && cd ~/downloads && \
-wget -q -O - 
http://apache.spd.co.il/flink/flink-0.10.0/flink-0.10.0-bin-hadoop27-scala_2.11.tgz
 | tar -zxvf - -C /usr/local/
-RUN cd /usr/local && ln -s ./flink-0.10.0 flink
+wget -q -O - 
http://mirror.switch.ch/mirror/apache/dist/flink/flink-0.10.1/flink-0.10.1-bin-hadoop27-scala_2.11.tgz|
 tar -zxvf - -C /usr/local/
+RUN cd /usr/local && ln -s ./flink-0.10.1 flink
 
 ENV FLINK_HOME /usr/local/flink
 ENV PATH $PATH:$FLINK_HOME/bin



[1/3] flink git commit: [FLINK-3145][storm] pin Kryo version of transitive dependencies

2015-12-09 Thread mxm
Repository: flink
Updated Branches:
  refs/heads/master 4f12356eb -> b2aa1d9e3


[FLINK-3145][storm] pin Kryo version of transitive dependencies

This closes #1441.


Project: http://git-wip-us.apache.org/repos/asf/flink/repo
Commit: http://git-wip-us.apache.org/repos/asf/flink/commit/b2aa1d9e
Tree: http://git-wip-us.apache.org/repos/asf/flink/tree/b2aa1d9e
Diff: http://git-wip-us.apache.org/repos/asf/flink/diff/b2aa1d9e

Branch: refs/heads/master
Commit: b2aa1d9e3741c77230d300d6c777d3eabe98d0ac
Parents: e28b62e
Author: Maximilian Michels 
Authored: Tue Dec 8 18:51:31 2015 +0100
Committer: Maximilian Michels 
Committed: Wed Dec 9 15:48:30 2015 +0100

--
 docs/apis/storm_compatibility.md  | 2 ++
 flink-contrib/flink-storm/pom.xml | 7 +++
 2 files changed, 9 insertions(+)
--


http://git-wip-us.apache.org/repos/asf/flink/blob/b2aa1d9e/docs/apis/storm_compatibility.md
--
diff --git a/docs/apis/storm_compatibility.md b/docs/apis/storm_compatibility.md
index fe6bf35..ccc7d78 100644
--- a/docs/apis/storm_compatibility.md
+++ b/docs/apis/storm_compatibility.md
@@ -49,6 +49,8 @@ Add the following dependency to your `pom.xml` if you want to 
execute Storm code
 
 ~~~
 
+**Please note**: Do not add `storm-core` as a dependency. It is already 
included via `flink-storm`.
+
 **Please note**: `flink-storm` is not part of the provided binary Flink 
distribution.
 Thus, you need to include `flink-storm` classes (and their dependencies) in 
your program jar that is submitted to Flink's JobManager.
 See *WordCount Storm* within `flink-storm-examples/pom.xml` for an example how 
to package a jar correctly.

http://git-wip-us.apache.org/repos/asf/flink/blob/b2aa1d9e/flink-contrib/flink-storm/pom.xml
--
diff --git a/flink-contrib/flink-storm/pom.xml 
b/flink-contrib/flink-storm/pom.xml
index 072ee75..5695ca5 100644
--- a/flink-contrib/flink-storm/pom.xml
+++ b/flink-contrib/flink-storm/pom.xml
@@ -36,6 +36,13 @@ under the License.
 


+		<dependency>
+			<groupId>com.esotericsoftware.kryo</groupId>
+			<artifactId>kryo</artifactId>
+		</dependency>
+
		<dependency>
			<groupId>org.apache.flink</groupId>
			<artifactId>flink-streaming-java</artifactId>
			<version>${project.version}</version>



[2/3] flink git commit: [FLINK-3143] update Closure Cleaner's ASM references to ASM5

2015-12-09 Thread mxm
[FLINK-3143] update Closure Cleaner's ASM references to ASM5

- This solves errors with reflectasm using Scala 2.11 and Java 8

This closes #1445.


Project: http://git-wip-us.apache.org/repos/asf/flink/repo
Commit: http://git-wip-us.apache.org/repos/asf/flink/commit/e28b62e0
Tree: http://git-wip-us.apache.org/repos/asf/flink/tree/e28b62e0
Diff: http://git-wip-us.apache.org/repos/asf/flink/diff/e28b62e0

Branch: refs/heads/master
Commit: e28b62e0e2973b01ad5b08ce319aaf0e7ce4c087
Parents: 3de49f5
Author: Maximilian Michels 
Authored: Wed Dec 9 11:42:09 2015 +0100
Committer: Maximilian Michels 
Committed: Wed Dec 9 15:48:30 2015 +0100

--
 .../org/apache/flink/api/java/ClosureCleaner.java |  6 +++---
 .../org/apache/flink/api/scala/ClosureCleaner.scala   | 14 +++---
 2 files changed, 10 insertions(+), 10 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/flink/blob/e28b62e0/flink-java/src/main/java/org/apache/flink/api/java/ClosureCleaner.java
--
diff --git 
a/flink-java/src/main/java/org/apache/flink/api/java/ClosureCleaner.java 
b/flink-java/src/main/java/org/apache/flink/api/java/ClosureCleaner.java
index ec1ebd6..37cf8d3 100644
--- a/flink-java/src/main/java/org/apache/flink/api/java/ClosureCleaner.java
+++ b/flink-java/src/main/java/org/apache/flink/api/java/ClosureCleaner.java
@@ -105,7 +105,7 @@ class This0AccessFinder extends ClassVisitor {
private String this0Name;
 
public This0AccessFinder(String this0Name) {
-   super(Opcodes.ASM4);
+   super(Opcodes.ASM5);
this.this0Name = this0Name;
}
 
@@ -115,7 +115,7 @@ class This0AccessFinder extends ClassVisitor {
 
@Override
public MethodVisitor visitMethod(int access, String name, String desc, 
String sig, String[] exceptions) {
-   return new MethodVisitor(Opcodes.ASM4) {
+   return new MethodVisitor(Opcodes.ASM5) {
 
@Override
public void visitFieldInsn(int op, String owner, String 
name, String desc) {
@@ -125,4 +125,4 @@ class This0AccessFinder extends ClassVisitor {
}
};
}
-}
\ No newline at end of file
+}

http://git-wip-us.apache.org/repos/asf/flink/blob/e28b62e0/flink-scala/src/main/scala/org/apache/flink/api/scala/ClosureCleaner.scala
--
diff --git 
a/flink-scala/src/main/scala/org/apache/flink/api/scala/ClosureCleaner.scala 
b/flink-scala/src/main/scala/org/apache/flink/api/scala/ClosureCleaner.scala
index aee57f5..00ffcfc 100644
--- a/flink-scala/src/main/scala/org/apache/flink/api/scala/ClosureCleaner.scala
+++ b/flink-scala/src/main/scala/org/apache/flink/api/scala/ClosureCleaner.scala
@@ -253,11 +253,11 @@ object ClosureCleaner {
 }
 
 private[flink]
-class ReturnStatementFinder extends ClassVisitor(ASM4) {
+class ReturnStatementFinder extends ClassVisitor(ASM5) {
   override def visitMethod(access: Int, name: String, desc: String,
sig: String, exceptions: Array[String]): 
MethodVisitor = {
 if (name.contains("apply")) {
-  new MethodVisitor(ASM4) {
+  new MethodVisitor(ASM5) {
 override def visitTypeInsn(op: Int, tp: String) {
   if (op == NEW && tp.contains("scala/runtime/NonLocalReturnControl")) 
{
 throw new InvalidProgramException("Return statements aren't 
allowed in Flink closures")
@@ -265,16 +265,16 @@ class ReturnStatementFinder extends ClassVisitor(ASM4) {
 }
   }
 } else {
-  new MethodVisitor(ASM4) {}
+  new MethodVisitor(ASM5) {}
 }
   }
 }
 
 private[flink]
-class FieldAccessFinder(output: Map[Class[_], Set[String]]) extends 
ClassVisitor(ASM4) {
+class FieldAccessFinder(output: Map[Class[_], Set[String]]) extends 
ClassVisitor(ASM5) {
   override def visitMethod(access: Int, name: String, desc: String,
sig: String, exceptions: Array[String]): 
MethodVisitor = {
-new MethodVisitor(ASM4) {
+new MethodVisitor(ASM5) {
   override def visitFieldInsn(op: Int, owner: String, name: String, desc: 
String) {
 if (op == GETFIELD) {
   for (cl <- output.keys if cl.getName == owner.replace('/', '.')) {
@@ -297,7 +297,7 @@ class FieldAccessFinder(output: Map[Class[_], Set[String]]) 
extends ClassVisitor
   }
 }
 
-private[flink] class InnerClosureFinder(output: Set[Class[_]]) extends 
ClassVisitor(ASM4) {
+private[flink] class InnerClosureFinder(output: Set[Class[_]]) extends 
ClassVisitor(ASM5) {
   var myName: String = null
 
   override def visit(version: Int, access: Int, name: String, sig: String,
@@ -307,7 +307,7 @@ private[flink] class InnerClosureFinder(output: 
Set[Class[_]]) extends ClassVisi
 
   ov

[2/2] flink git commit: [FLINK-3073] Replace Streaming Mode by Memory Allocation Mode

2015-12-09 Thread aljoscha
[FLINK-3073] Replace Streaming Mode by Memory Allocation Mode

Before, the streaming mode setting (either batch or streaming) specified how
memory is allocated on task managers.

This introduces a new configuration value, taskmanager.memory.allocation,
which can take the values "lazy" or "eager" and controls how memory is
allocated.
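Based only on the commit message above, the new option could be set in `flink-conf.yaml` like this (a sketch; the key and values are taken from the message, not verified against the shipped defaults):

```yaml
# "eager" allocates the TaskManager's managed memory at startup;
# "lazy" defers allocation until the memory is actually needed.
taskmanager.memory.allocation: lazy
```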


Project: http://git-wip-us.apache.org/repos/asf/flink/repo
Commit: http://git-wip-us.apache.org/repos/asf/flink/commit/4f12356e
Tree: http://git-wip-us.apache.org/repos/asf/flink/tree/4f12356e
Diff: http://git-wip-us.apache.org/repos/asf/flink/diff/4f12356e

Branch: refs/heads/master
Commit: 4f12356eb64af37909d18a5c2abb4a31a4c19472
Parents: 718a17b
Author: Aljoscha Krettek 
Authored: Tue Dec 1 16:13:55 2015 +0100
Committer: Aljoscha Krettek 
Committed: Wed Dec 9 15:33:10 2015 +0100

--
 docs/quickstart/setup_quickstart.md | 23 +++---
 docs/setup/cluster_setup.md | 28 ++--
 docs/setup/config.md| 39 +-
 docs/setup/jobmanager_high_availability.md  | 24 +++
 docs/setup/local_setup.md   |  3 -
 docs/setup/yarn_setup.md| 19 +++--
 .../flink/client/FlinkYarnSessionCli.java   |  3 -
 .../OperatorStatsAccumulatorTest.java   |  3 +-
 .../flink/storm/api/FlinkLocalCluster.java  |  3 +-
 .../flink/configuration/ConfigConstants.java| 11 +++
 flink-dist/src/main/flink-bin/bin/config.sh | 20 --
 flink-dist/src/main/flink-bin/bin/jobmanager.sh | 10 +--
 .../flink-bin/bin/start-cluster-streaming.sh| 24 ---
 .../src/main/flink-bin/bin/start-cluster.sh | 14 ++--
 .../main/flink-bin/bin/start-local-streaming.sh | 24 ---
 .../src/main/flink-bin/bin/start-local.bat  |  2 +-
 .../src/main/flink-bin/bin/start-local.sh   |  6 +-
 .../src/main/flink-bin/bin/taskmanager.sh   | 22 +++---
 flink-dist/src/main/resources/flink-conf.yaml   | 10 ++-
 .../webmonitor/WebRuntimeMonitorITCase.java |  2 -
 .../org/apache/flink/runtime/StreamingMode.java | 34 -
 .../jobmanager/JobManagerCliOptions.java| 21 --
 .../taskmanager/TaskManagerCliOptions.java  | 22 +-
 .../runtime/yarn/AbstractFlinkYarnClient.java   |  6 --
 .../flink/runtime/jobmanager/JobManager.scala   | 35 ++---
 .../runtime/minicluster/FlinkMiniCluster.scala  | 13 +---
 .../minicluster/LocalFlinkMiniCluster.scala | 14 +---
 .../flink/runtime/taskmanager/TaskManager.scala | 75 ++--
 .../JobManagerProcessReapingTest.java   |  3 +-
 .../jobmanager/JobManagerStartupTest.java   |  7 +-
 ...ManagerSubmittedJobGraphsRecoveryITCase.java |  7 +-
 .../flink/runtime/jobmanager/JobSubmitTest.java |  2 -
 .../JobManagerLeaderElectionTest.java   |  3 -
 .../LeaderChangeStateCleanupTest.java   |  3 +-
 .../LeaderElectionRetrievalTestingCluster.java  | 13 +---
 ...askManagerComponentsStartupShutdownTest.java |  2 -
 .../TaskManagerProcessReapingTest.java  |  4 +-
 .../TaskManagerRegistrationTest.java|  3 -
 .../taskmanager/TaskManagerStartupTest.java | 15 ++--
 .../runtime/testutils/JobManagerProcess.java|  7 +-
 .../runtime/testutils/TaskManagerProcess.java   |  9 +--
 .../jobmanager/JobManagerRegistrationTest.scala |  2 -
 .../runtime/testingUtils/TestingCluster.scala   | 19 ++---
 .../testingUtils/TestingJobManager.scala|  4 --
 .../runtime/testingUtils/TestingUtils.scala |  5 +-
 .../flink/api/scala/ScalaShellITCase.scala  |  2 -
 .../flink/tez/test/TezProgramTestBase.java  |  3 +-
 .../connectors/kafka/KafkaTestBase.java |  3 +-
 .../api/environment/LocalStreamEnvironment.java |  3 +-
 .../util/StreamingMultipleProgramsTestBase.java |  3 +-
 .../util/StreamingProgramTestBase.java  |  3 +-
 ...ScalaStreamingMultipleProgramsTestBase.scala |  2 -
 .../flink/test/util/AbstractTestBase.java   |  8 +--
 .../flink/test/util/JavaProgramTestBase.java|  3 +-
 .../test/util/MultipleProgramsTestBase.java |  2 -
 .../apache/flink/test/util/TestBaseUtils.java   |  7 +-
 .../apache/flink/test/util/FlinkTestBase.scala  |  2 -
 .../test/util/ForkableFlinkMiniCluster.scala| 13 +---
 .../EventTimeAllWindowCheckpointingITCase.java  |  3 +-
 .../EventTimeWindowCheckpointingITCase.java | 21 +-
 .../WindowCheckpointingITCase.java  |  6 +-
 .../manual/StreamingScalabilityAndLatency.java  |  3 +-
 ...tJobManagerProcessFailureRecoveryITCase.java |  3 +-
 ...ctTaskManagerProcessFailureRecoveryTest.java |  4 +-
 .../JobManagerCheckpointRecoveryITCase.java |  3 +-
 .../recovery/ProcessFailureCancelingITCase.java |  2 -
 .../flink/test/runtime/IPv6HostnamesITCase.java |  3 +-
 .../flink/yarn/TestingYarnJobManager.scala  |  3 -
 .../apache/flink/yarn/FlinkYarnClientBase.java  |  7 --
 .../flink/yarn/YarnTaskManagerRunner.java   | 10 +--
 .../flink/yarn/ApplicationMast

[1/2] flink git commit: [FLINK-3073] Replace Streaming Mode by Memory Allocation Mode

2015-12-09 Thread aljoscha
Repository: flink
Updated Branches:
  refs/heads/master 718a17b28 -> 4f12356eb


http://git-wip-us.apache.org/repos/asf/flink/blob/4f12356e/flink-runtime/src/test/java/org/apache/flink/runtime/leaderelection/JobManagerLeaderElectionTest.java
--
diff --git 
a/flink-runtime/src/test/java/org/apache/flink/runtime/leaderelection/JobManagerLeaderElectionTest.java
 
b/flink-runtime/src/test/java/org/apache/flink/runtime/leaderelection/JobManagerLeaderElectionTest.java
index c804830..47255fc 100644
--- 
a/flink-runtime/src/test/java/org/apache/flink/runtime/leaderelection/JobManagerLeaderElectionTest.java
+++ 
b/flink-runtime/src/test/java/org/apache/flink/runtime/leaderelection/JobManagerLeaderElectionTest.java
@@ -27,9 +27,7 @@ import akka.testkit.JavaTestKit;
 import akka.util.Timeout;
 import org.apache.curator.framework.CuratorFramework;
 import org.apache.curator.test.TestingServer;
-import org.apache.flink.configuration.ConfigConstants;
 import org.apache.flink.configuration.Configuration;
-import org.apache.flink.runtime.StreamingMode;
 import org.apache.flink.runtime.akka.AkkaUtils;
 import org.apache.flink.runtime.blob.BlobServer;
 import org.apache.flink.runtime.checkpoint.CheckpointRecoveryFactory;
@@ -185,7 +183,6 @@ public class JobManagerLeaderElectionTest extends 
TestLogger {
1,
1L,
AkkaUtils.getDefaultTimeout(),
-   StreamingMode.BATCH_ONLY,
leaderElectionService,
submittedJobGraphStore,
checkpointRecoveryFactory

http://git-wip-us.apache.org/repos/asf/flink/blob/4f12356e/flink-runtime/src/test/java/org/apache/flink/runtime/leaderelection/LeaderChangeStateCleanupTest.java
--
diff --git 
a/flink-runtime/src/test/java/org/apache/flink/runtime/leaderelection/LeaderChangeStateCleanupTest.java
 
b/flink-runtime/src/test/java/org/apache/flink/runtime/leaderelection/LeaderChangeStateCleanupTest.java
index 0b84474..c490a64 100644
--- 
a/flink-runtime/src/test/java/org/apache/flink/runtime/leaderelection/LeaderChangeStateCleanupTest.java
+++ 
b/flink-runtime/src/test/java/org/apache/flink/runtime/leaderelection/LeaderChangeStateCleanupTest.java
@@ -20,7 +20,6 @@ package org.apache.flink.runtime.leaderelection;
 
 import org.apache.flink.configuration.ConfigConstants;
 import org.apache.flink.configuration.Configuration;
-import org.apache.flink.runtime.StreamingMode;
 import org.apache.flink.runtime.instance.ActorGateway;
 import org.apache.flink.runtime.jobgraph.DistributionPattern;
 import org.apache.flink.runtime.jobgraph.JobGraph;
@@ -68,7 +67,7 @@ public class LeaderChangeStateCleanupTest extends TestLogger {
 		configuration.setInteger(ConfigConstants.LOCAL_NUMBER_TASK_MANAGER, numTMs);
 		configuration.setInteger(ConfigConstants.TASK_MANAGER_NUM_TASK_SLOTS, numSlotsPerTM);
 
-		cluster = new LeaderElectionRetrievalTestingCluster(configuration, true, false, StreamingMode.BATCH_ONLY);
+		cluster = new LeaderElectionRetrievalTestingCluster(configuration, true, false);
 		cluster.start(false); // TaskManagers don't have to register at the JobManager
 
 		cluster.waitForActorsToBeAlive(); // we only wait until all actors are alive

http://git-wip-us.apache.org/repos/asf/flink/blob/4f12356e/flink-runtime/src/test/java/org/apache/flink/runtime/leaderelection/LeaderElectionRetrievalTestingCluster.java
--
diff --git a/flink-runtime/src/test/java/org/apache/flink/runtime/leaderelection/LeaderElectionRetrievalTestingCluster.java b/flink-runtime/src/test/java/org/apache/flink/runtime/leaderelection/LeaderElectionRetrievalTestingCluster.java
index c83f548..c8cf868 100644
--- a/flink-runtime/src/test/java/org/apache/flink/runtime/leaderelection/LeaderElectionRetrievalTestingCluster.java
+++ b/flink-runtime/src/test/java/org/apache/flink/runtime/leaderelection/LeaderElectionRetrievalTestingCluster.java
@@ -20,7 +20,6 @@ package org.apache.flink.runtime.leaderelection;
 
 import org.apache.flink.configuration.ConfigConstants;
 import org.apache.flink.configuration.Configuration;
-import org.apache.flink.runtime.StreamingMode;
 import org.apache.flink.runtime.leaderretrieval.LeaderRetrievalService;
 import org.apache.flink.runtime.testingUtils.TestingCluster;
 import scala.Option;
@@ -39,7 +38,6 @@ public class LeaderElectionRetrievalTestingCluster extends TestingCluster {
 
 	private final Configuration userConfiguration;
 	private final boolean useSingleActorSystem;
-	private final StreamingMode streamingMode;
 
 	public List<TestingLeaderElectionService> leaderElectionServices;
 	public List<TestingLeaderRetrievalService> leaderRetrievalServices;

flink git commit: [FLINK-3136] Fix shaded imports in ClosureCleaner.scala

2015-12-09 Thread aljoscha
Repository: flink
Updated Branches:
  refs/heads/release-0.10 9268514b7 -> bf9caa5fa


[FLINK-3136] Fix shaded imports in ClosureCleaner.scala


Project: http://git-wip-us.apache.org/repos/asf/flink/repo
Commit: http://git-wip-us.apache.org/repos/asf/flink/commit/bf9caa5f
Tree: http://git-wip-us.apache.org/repos/asf/flink/tree/bf9caa5f
Diff: http://git-wip-us.apache.org/repos/asf/flink/diff/bf9caa5f

Branch: refs/heads/release-0.10
Commit: bf9caa5faa38dfd30df764c8e017637854517d43
Parents: 9268514
Author: Aljoscha Krettek 
Authored: Tue Dec 8 17:07:51 2015 +0100
Committer: Aljoscha Krettek 
Committed: Wed Dec 9 11:22:04 2015 +0100

--
 flink-libraries/flink-gelly-scala/pom.xml   | 21 +---
 .../apache/flink/api/scala/ClosureCleaner.scala |  4 ++--
 2 files changed, 16 insertions(+), 9 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/flink/blob/bf9caa5f/flink-libraries/flink-gelly-scala/pom.xml
--
diff --git a/flink-libraries/flink-gelly-scala/pom.xml b/flink-libraries/flink-gelly-scala/pom.xml
index 90d2971..2049450 100644
--- a/flink-libraries/flink-gelly-scala/pom.xml
+++ b/flink-libraries/flink-gelly-scala/pom.xml
@@ -38,6 +38,13 @@ under the License.
             <artifactId>flink-scala</artifactId>
             <version>${project.version}</version>
         </dependency>
+
+        <dependency>
+            <groupId>org.ow2.asm</groupId>
+            <artifactId>asm</artifactId>
+            <version>${asm.version}</version>
+        </dependency>
+
         <dependency>
             <groupId>org.apache.flink</groupId>
             <artifactId>flink-clients</artifactId>
@@ -48,13 +55,13 @@ under the License.
             <artifactId>flink-gelly</artifactId>
             <version>${project.version}</version>
         </dependency>
-			<dependency>
-				<groupId>org.apache.flink</groupId>
-				<artifactId>flink-tests</artifactId>
-				<version>${project.version}</version>
-				<scope>test</scope>
-				<type>test-jar</type>
-			</dependency>
+        <dependency>
+            <groupId>org.apache.flink</groupId>
+            <artifactId>flink-tests</artifactId>
+            <version>${project.version}</version>
+            <scope>test</scope>
+            <type>test-jar</type>
+        </dependency>
         <dependency>
             <groupId>org.apache.flink</groupId>
             <artifactId>flink-test-utils</artifactId>

http://git-wip-us.apache.org/repos/asf/flink/blob/bf9caa5f/flink-scala/src/main/scala/org/apache/flink/api/scala/ClosureCleaner.scala
--
diff --git a/flink-scala/src/main/scala/org/apache/flink/api/scala/ClosureCleaner.scala b/flink-scala/src/main/scala/org/apache/flink/api/scala/ClosureCleaner.scala
index c695988..aee57f5 100644
--- a/flink-scala/src/main/scala/org/apache/flink/api/scala/ClosureCleaner.scala
+++ b/flink-scala/src/main/scala/org/apache/flink/api/scala/ClosureCleaner.scala
@@ -26,8 +26,8 @@ import org.slf4j.LoggerFactory
 import scala.collection.mutable.Map
 import scala.collection.mutable.Set
 
-import com.esotericsoftware.reflectasm.shaded.org.objectweb.asm.{ClassReader, ClassVisitor, MethodVisitor, Type}
-import com.esotericsoftware.reflectasm.shaded.org.objectweb.asm.Opcodes._
+import org.objectweb.asm.{ClassReader, ClassVisitor, MethodVisitor, Type}
+import org.objectweb.asm.Opcodes._
 
 /* This code is originally from the Apache Spark project. */
 object ClosureCleaner {
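For context on why the old import path existed at all: reflectasm bundles its own copy of ASM, relocated under the `com.esotericsoftware.reflectasm.shaded` prefix at build time. A shade-style package relocation that produces exactly this kind of prefix can be sketched with the maven-shade-plugin as below (an illustrative sketch only, not reflectasm's actual build configuration):

```xml
<!-- Hypothetical maven-shade-plugin snippet: rewrites org.objectweb.asm.*
     into a relocated package so the bundled ASM cannot clash with the
     application's own ASM version on the classpath. -->
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <configuration>
        <relocations>
            <relocation>
                <pattern>org.objectweb.asm</pattern>
                <shadedPattern>com.esotericsoftware.reflectasm.shaded.org.objectweb.asm</shadedPattern>
            </relocation>
        </relocations>
    </configuration>
</plugin>
```

Compiling application code against such a relocated package ties it to one library's private copy of ASM, which is why this commit switches ClosureCleaner to the plain `org.objectweb.asm` imports and declares a direct `org.ow2.asm:asm` dependency in the gelly-scala pom instead.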



flink git commit: [FLINK-3136] Fix shaded imports in ClosureCleaner.scala

2015-12-09 Thread aljoscha
Repository: flink
Updated Branches:
  refs/heads/master fc8be1ca6 -> 718a17b28


[FLINK-3136] Fix shaded imports in ClosureCleaner.scala


Project: http://git-wip-us.apache.org/repos/asf/flink/repo
Commit: http://git-wip-us.apache.org/repos/asf/flink/commit/718a17b2
Tree: http://git-wip-us.apache.org/repos/asf/flink/tree/718a17b2
Diff: http://git-wip-us.apache.org/repos/asf/flink/diff/718a17b2

Branch: refs/heads/master
Commit: 718a17b282923c5d10a6ef68e2da27b5aa828b80
Parents: fc8be1c
Author: Aljoscha Krettek 
Authored: Tue Dec 8 17:07:51 2015 +0100
Committer: Aljoscha Krettek 
Committed: Wed Dec 9 11:12:37 2015 +0100

--
 flink-libraries/flink-gelly-scala/pom.xml   | 21 +---
 .../apache/flink/api/scala/ClosureCleaner.scala |  4 ++--
 2 files changed, 16 insertions(+), 9 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/flink/blob/718a17b2/flink-libraries/flink-gelly-scala/pom.xml
--
diff --git a/flink-libraries/flink-gelly-scala/pom.xml b/flink-libraries/flink-gelly-scala/pom.xml
index 385f5c6..fce7346 100644
--- a/flink-libraries/flink-gelly-scala/pom.xml
+++ b/flink-libraries/flink-gelly-scala/pom.xml
@@ -38,6 +38,13 @@ under the License.
             <artifactId>flink-scala</artifactId>
             <version>${project.version}</version>
         </dependency>
+
+        <dependency>
+            <groupId>org.ow2.asm</groupId>
+            <artifactId>asm</artifactId>
+            <version>${asm.version}</version>
+        </dependency>
+
         <dependency>
             <groupId>org.apache.flink</groupId>
             <artifactId>flink-clients</artifactId>
@@ -48,13 +55,13 @@ under the License.
             <artifactId>flink-gelly</artifactId>
             <version>${project.version}</version>
         </dependency>
-			<dependency>
-				<groupId>org.apache.flink</groupId>
-				<artifactId>flink-tests</artifactId>
-				<version>${project.version}</version>
-				<scope>test</scope>
-				<type>test-jar</type>
-			</dependency>
+        <dependency>
+            <groupId>org.apache.flink</groupId>
+            <artifactId>flink-tests</artifactId>
+            <version>${project.version}</version>
+            <scope>test</scope>
+            <type>test-jar</type>
+        </dependency>
         <dependency>
             <groupId>org.apache.flink</groupId>
             <artifactId>flink-test-utils</artifactId>

http://git-wip-us.apache.org/repos/asf/flink/blob/718a17b2/flink-scala/src/main/scala/org/apache/flink/api/scala/ClosureCleaner.scala
--
diff --git a/flink-scala/src/main/scala/org/apache/flink/api/scala/ClosureCleaner.scala b/flink-scala/src/main/scala/org/apache/flink/api/scala/ClosureCleaner.scala
index c695988..aee57f5 100644
--- a/flink-scala/src/main/scala/org/apache/flink/api/scala/ClosureCleaner.scala
+++ b/flink-scala/src/main/scala/org/apache/flink/api/scala/ClosureCleaner.scala
@@ -26,8 +26,8 @@ import org.slf4j.LoggerFactory
 import scala.collection.mutable.Map
 import scala.collection.mutable.Set
 
-import com.esotericsoftware.reflectasm.shaded.org.objectweb.asm.{ClassReader, ClassVisitor, MethodVisitor, Type}
-import com.esotericsoftware.reflectasm.shaded.org.objectweb.asm.Opcodes._
+import org.objectweb.asm.{ClassReader, ClassVisitor, MethodVisitor, Type}
+import org.objectweb.asm.Opcodes._
 
 /* This code is originally from the Apache Spark project. */
 object ClosureCleaner {