[ 
https://issues.apache.org/jira/browse/FLINK-8819?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16635388#comment-16635388
 ] 

ASF GitHub Bot commented on FLINK-8819:
---------------------------------------

zentol closed pull request #6642: [FLINK-8819][travis] Rework travis script to use stages
URL: https://github.com/apache/flink/pull/6642
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

diff --git a/.travis.yml b/.travis.yml
index 47ccf421555..28b7e4c752e 100644
--- a/.travis.yml
+++ b/.travis.yml
@@ -20,8 +20,11 @@ sudo: required
 dist: trusty
 
 cache:
+  # default timeout is too low
+  timeout: 600
   directories:
   - $HOME/.m2
+  - $HOME/flink_cache
 
 # do not cache our own artifacts
 before_cache:
@@ -31,62 +34,6 @@ install: true
 
 language: java
 
-# - define unique cache names in case the auto-generated ones are not unique
-#  (see https://docs.travis-ci.com/user/caching/#Caches-and-build-matrices)
-# - See https://issues.apache.org/jira/browse/FLINK-1072
-matrix:
-  include:
-    - jdk: "oraclejdk8"
-      env:
-        - TEST="core"
-        - PROFILE="-Dhadoop.version=2.8.3"
-        - CACHE_NAME=JDK8_H280_CO
-    - jdk: "oraclejdk8"
-      env:
-        - TEST="libraries"
-        - PROFILE="-Dhadoop.version=2.8.3"
-        - CACHE_NAME=JDK8_H280_L
-    - jdk: "oraclejdk8"
-      env:
-        - TEST="connectors"
-        - PROFILE="-Dhadoop.version=2.8.3 -Pinclude-kinesis"
-        - CACHE_NAME=JDK8_H280_CN
-    - jdk: "oraclejdk8"
-      env:
-        - TEST="tests"
-        - PROFILE="-Dhadoop.version=2.8.3"
-        - CACHE_NAME=JDK8_H280_T
-    - jdk: "oraclejdk8"
-      env:
-        - TEST="misc"
-        - PROFILE="-Dhadoop.version=2.8.3 -Dinclude_hadoop_aws"
-        - CACHE_NAME=JDK8_H280_M
-    - jdk: "openjdk8"
-      env:
-        - TEST="core"
-        - PROFILE="-Dhadoop.version=2.4.1"
-        - CACHE_NAME=JDK8_H241_CO
-    - jdk: "openjdk8"
-      env:
-        - TEST="libraries"
-        - PROFILE="-Dhadoop.version=2.4.1"
-        - CACHE_NAME=JDK8_H241_L
-    - jdk: "openjdk8"
-      env:
-        - TEST="connectors"
-        - PROFILE="-Dhadoop.version=2.4.1 -Pinclude-kinesis"
-        - CACHE_NAME=JDK8_H241_CN
-    - jdk: "openjdk8"
-      env:
-        - TEST="tests"
-        - PROFILE="-Dhadoop.version=2.4.1"
-        - CACHE_NAME=JDK8_H241_T
-    - jdk: "openjdk8"
-      env:
-        - TEST="misc"
-        - PROFILE="-Dhadoop.version=2.4.1"
-        - CACHE_NAME=JDK8_H241_M
-
 git:
   depth: 100
 
@@ -102,6 +49,9 @@ env:
 
 before_script:
    - "gem install --no-document --version 0.8.9 faraday "
+   - "export -f travis_nanoseconds"
+   - "export -f travis_time_start"
+   - "export -f travis_time_finish"
 
 # Install maven 3.2.5 since trusty uses 3.3.9 for which shading is broken
 before_install:
@@ -119,6 +69,57 @@ before_install:
    - chmod +x docker-compose
    - sudo mv docker-compose /usr/local/bin
 
-# We run mvn and monitor its output. If there is no output for the specified number of seconds, we
-# print the stack traces of all running Java processes.
-script: "./tools/travis_mvn_watchdog.sh 300"
+
+jdk: "oraclejdk8"
+jobs:
+  include:
+    # main profile
+    - stage: compile
+      script: ./tools/travis_controller.sh
+      env: PROFILE="-Dhadoop.version=2.8.3 -Pinclude-kinesis -Dinclude_hadoop_aws"
+      name: compile
+    - stage: test
+      script: ./tools/travis_controller.sh
+      env: PROFILE="-Dhadoop.version=2.8.3 -Pinclude-kinesis -Dinclude_hadoop_aws"
+      name: core
+    - script: ./tools/travis_controller.sh
+      env: PROFILE="-Dhadoop.version=2.8.3 -Pinclude-kinesis -Dinclude_hadoop_aws"
+      name: libraries
+    - script: ./tools/travis_controller.sh
+      env: PROFILE="-Dhadoop.version=2.8.3 -Pinclude-kinesis -Dinclude_hadoop_aws"
+      name: connectors
+    - script: ./tools/travis_controller.sh
+      env: PROFILE="-Dhadoop.version=2.8.3 -Pinclude-kinesis -Dinclude_hadoop_aws"
+      name: tests
+    - script: ./tools/travis_controller.sh
+      env: PROFILE="-Dhadoop.version=2.8.3 -Pinclude-kinesis -Dinclude_hadoop_aws"
+      name: misc
+    - stage: cleanup
+      script: ./tools/travis_controller.sh
+      env: PROFILE="-Dhadoop.version=2.8.3 -Pinclude-kinesis -Dinclude_hadoop_aws"
+      name: cleanup
+    # legacy profile
+    - stage: compile
+      script: ./tools/travis_controller.sh
+      env: PROFILE="-Dhadoop.version=2.4.1 -Pinclude-kinesis -DlegacyCode"
+      name: compile(legacy)
+    - stage: test
+      script: ./tools/travis_controller.sh
+      env: PROFILE="-Dhadoop.version=2.4.1 -Pinclude-kinesis -DlegacyCode"
+      name: core(legacy)
+    - script: ./tools/travis_controller.sh
+      env: PROFILE="-Dhadoop.version=2.4.1 -Pinclude-kinesis -DlegacyCode"
+      name: libraries(legacy)
+    - script: ./tools/travis_controller.sh
+      env: PROFILE="-Dhadoop.version=2.4.1 -Pinclude-kinesis -DlegacyCode"
+      name: connectors(legacy)
+    - script: ./tools/travis_controller.sh
+      env: PROFILE="-Dhadoop.version=2.4.1 -Pinclude-kinesis -DlegacyCode"
+      name: tests(legacy)
+    - script: ./tools/travis_controller.sh
+      env: PROFILE="-Dhadoop.version=2.4.1 -Pinclude-kinesis -DlegacyCode"
+      name: misc(legacy)
+    - stage: cleanup
+      script: ./tools/travis_controller.sh
+      env: PROFILE="-Dhadoop.version=2.4.1 -Pinclude-kinesis -DlegacyCode"
+      name: cleanup(legacy)
diff --git a/tools/travis/fold.sh b/tools/travis/fold.sh
new file mode 100644
index 00000000000..c567690946f
--- /dev/null
+++ b/tools/travis/fold.sh
@@ -0,0 +1,45 @@
+#!/usr/bin/env bash
+################################################################################
+#  Licensed to the Apache Software Foundation (ASF) under one
+#  or more contributor license agreements.  See the NOTICE file
+#  distributed with this work for additional information
+#  regarding copyright ownership.  The ASF licenses this file
+#  to you under the Apache License, Version 2.0 (the
+#  "License"); you may not use this file except in compliance
+#  with the License.  You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing, software
+#  distributed under the License is distributed on an "AS IS" BASIS,
+#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#  See the License for the specific language governing permissions and
+# limitations under the License.
+################################################################################
+
+# Hex-encoded travis-internal ANSI escape sequences
+# https://github.com/travis-ci/travis-build/blob/master/lib/travis/build/bash/travis_fold.bash
+# https://github.com/travis-ci/travis-build/blob/master/lib/travis/build/bash/travis_setup_env.bash
+#
+# \x1b = \033 = ESC
+# \x5b = [
+# \x4b = K
+# \x6d = m
+# \x30 = 0
+# \x31 = 1
+# \x33 = 3
+# \x3b = ;
+
+COLOR_YELLOW="\x1b\x5b\x33\x33\x3b\x31\x6d"
+ANSI_CLEAR="\x1b\x5b\x30\x6d"
+
+function start_fold {
+    local id=$1
+    local message=$2
+    echo -e "travis_fold:start:${id}\\r${ANSI_CLEAR}${COLOR_YELLOW}${message}${ANSI_CLEAR}"
+}
+
+function end_fold {
+    local message=$1
+    echo -en "travis_fold:end:${message}\\r${ANSI_CLEAR}"
+}
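
For context, start_fold/end_fold are meant to be paired with the travis_time_* functions that .travis.yml exports above (the "export -f travis_time_start" lines). A minimal usage sketch, assuming a Travis build environment; the section id and workload below are illustrative, not from the PR:

    source tools/travis/fold.sh

    start_fold "my_section" "Doing expensive work"
    travis_time_start            # Travis-internal timer, available after export -f
    sleep 5                      # stand-in for the real work
    travis_time_finish
    end_fold "my_section"        # collapses the section in the Travis log view
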
diff --git a/tools/travis/shade.sh b/tools/travis/shade.sh
new file mode 100644
index 00000000000..428a92b812f
--- /dev/null
+++ b/tools/travis/shade.sh
@@ -0,0 +1,196 @@
+#!/usr/bin/env bash
+################################################################################
+#  Licensed to the Apache Software Foundation (ASF) under one
+#  or more contributor license agreements.  See the NOTICE file
+#  distributed with this work for additional information
+#  regarding copyright ownership.  The ASF licenses this file
+#  to you under the Apache License, Version 2.0 (the
+#  "License"); you may not use this file except in compliance
+#  with the License.  You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing, software
+#  distributed under the License is distributed on an "AS IS" BASIS,
+#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#  See the License for the specific language governing permissions and
+# limitations under the License.
+################################################################################
+
+# Check the final fat jar for illegal or missing artifacts
+check_shaded_artifacts() {
+       jar tf build-target/lib/flink-dist*.jar > allClasses
+       ASM=`cat allClasses | grep '^org/objectweb/asm/' | wc -l`
+       if [ "$ASM" != "0" ]; then
+               echo "=============================================================================="
+               echo "Detected '$ASM' unshaded asm dependencies in fat jar"
+               echo "=============================================================================="
+               return 1
+       fi
+
+       GUAVA=`cat allClasses | grep '^com/google/common' | wc -l`
+       if [ "$GUAVA" != "0" ]; then
+               echo "=============================================================================="
+               echo "Detected '$GUAVA' guava dependencies in fat jar"
+               echo "=============================================================================="
+               return 1
+       fi
+
+       CODEHAUS_JACKSON=`cat allClasses | grep '^org/codehaus/jackson' | wc -l`
+       if [ "$CODEHAUS_JACKSON" != "0" ]; then
+               echo "=============================================================================="
+               echo "Detected '$CODEHAUS_JACKSON' unshaded org.codehaus.jackson classes in fat jar"
+               echo "=============================================================================="
+               return 1
+       fi
+
+       FASTERXML_JACKSON=`cat allClasses | grep '^com/fasterxml/jackson' | wc -l`
+       if [ "$FASTERXML_JACKSON" != "0" ]; then
+               echo "=============================================================================="
+               echo "Detected '$FASTERXML_JACKSON' unshaded com.fasterxml.jackson classes in fat jar"
+               echo "=============================================================================="
+               return 1
+       fi
+
+       SNAPPY=`cat allClasses | grep '^org/xerial/snappy' | wc -l`
+       if [ "$SNAPPY" == "0" ]; then
+               echo "=============================================================================="
+               echo "Missing snappy dependencies in fat jar"
+               echo "=============================================================================="
+               return 1
+       fi
+
+       IO_NETTY=`cat allClasses | grep '^io/netty' | wc -l`
+       if [ "$IO_NETTY" != "0" ]; then
+               echo "=============================================================================="
+               echo "Detected '$IO_NETTY' unshaded io.netty classes in fat jar"
+               echo "=============================================================================="
+               return 1
+       fi
+
+       ORG_NETTY=`cat allClasses | grep '^org/jboss/netty' | wc -l`
+       if [ "$ORG_NETTY" != "0" ]; then
+               echo "=============================================================================="
+               echo "Detected '$ORG_NETTY' unshaded org.jboss.netty classes in fat jar"
+               echo "=============================================================================="
+               return 1
+       fi
+
+       ZOOKEEPER=`cat allClasses | grep '^org/apache/zookeeper' | wc -l`
+       if [ "$ZOOKEEPER" != "0" ]; then
+               echo "=============================================================================="
+               echo "Detected '$ZOOKEEPER' unshaded org.apache.zookeeper classes in fat jar"
+               echo "=============================================================================="
+               return 1
+       fi
+
+       CURATOR=`cat allClasses | grep '^org/apache/curator' | wc -l`
+       if [ "$CURATOR" != "0" ]; then
+               echo "=============================================================================="
+               echo "Detected '$CURATOR' unshaded org.apache.curator classes in fat jar"
+               echo "=============================================================================="
+               return 1
+       fi
+
+       FLINK_PYTHON=`cat allClasses | grep '^org/apache/flink/python' | wc -l`
+       if [ "$FLINK_PYTHON" != "0" ]; then
+               echo "=============================================================================="
+               echo "Detected that the Flink Python artifact is in the dist jar"
+               echo "=============================================================================="
+               return 1
+       fi
+
+       HADOOP=`cat allClasses | grep '^org/apache/hadoop' | wc -l`
+       if [ "$HADOOP" != "0" ]; then
+               echo "=============================================================================="
+               echo "Detected '$HADOOP' Hadoop classes in the dist jar"
+               echo "=============================================================================="
+               return 1
+       fi
+
+       MAPR=`cat allClasses | grep '^com/mapr' | wc -l`
+       if [ "$MAPR" != "0" ]; then
+               echo "=============================================================================="
+               echo "Detected '$MAPR' MapR classes in the dist jar"
+               echo "=============================================================================="
+               return 1
+       fi
+
+       return 0
+}
+
+# Check the S3 fs implementations' fat jars for illegal or missing artifacts
+check_shaded_artifacts_s3_fs() {
+       VARIANT=$1
+       jar tf flink-filesystems/flink-s3-fs-${VARIANT}/target/flink-s3-fs-${VARIANT}*.jar > allClasses
+
+       UNSHADED_CLASSES=`cat allClasses | grep -v -e '^META-INF' -e "^org/apache/flink/fs/" | grep '\.class$'`
+       if [ "$?" == "0" ]; then
+               echo "=============================================================================="
+               echo "${VARIANT}: Detected unshaded dependencies in fat jar:"
+               echo "${UNSHADED_CLASSES}"
+               echo "=============================================================================="
+               return 1
+       fi
+
+       if [ ! `cat allClasses | grep '^META-INF/services/org\.apache\.flink\.core\.fs\.FileSystemFactory$'` ]; then
+               echo "=============================================================================="
+               echo "${VARIANT}: File does not exist: services/org.apache.flink.core.fs.FileSystemFactory"
+               echo "=============================================================================="
+               return 1
+       fi
+
+       UNSHADED_SERVICES=`cat allClasses | grep '^META-INF/services/' | grep -v -e '^META-INF/services/org\.apache\.flink\.core\.fs\.FileSystemFactory$' -e "^META-INF/services/org\.apache\.flink\.fs.*shaded" -e '^META-INF/services/'`
+       if [ "$?" == "0" ]; then
+               echo "=============================================================================="
+               echo "${VARIANT}: Detected unshaded service files in fat jar:"
+               echo "${UNSHADED_SERVICES}"
+               echo "=============================================================================="
+               return 1
+       fi
+
+       FS_SERVICE_FILE_CLASSES=`unzip -q -c flink-filesystems/flink-s3-fs-${VARIANT}/target/flink-s3-fs-${VARIANT}*.jar META-INF/services/org.apache.flink.core.fs.FileSystemFactory | grep -v -e '^#' -e '^$'`
+       EXPECTED_FS_SERVICE_FILE_CLASSES="org.apache.flink.fs.s3${VARIANT}.S3FileSystemFactory"
+       if [ "${VARIANT}" == "hadoop" ]; then
+               read -r -d '' EXPECTED_FS_SERVICE_FILE_CLASSES <<EOF
+org.apache.flink.fs.s3${VARIANT}.S3FileSystemFactory
+org.apache.flink.fs.s3${VARIANT}.S3AFileSystemFactory
+EOF
+       fi
+
+       if [ "${FS_SERVICE_FILE_CLASSES}" != "${EXPECTED_FS_SERVICE_FILE_CLASSES}" ]; then
+               echo "=============================================================================="
+               echo "${VARIANT}: Detected wrong content in services/org.apache.flink.core.fs.FileSystemFactory:"
+               echo "${FS_SERVICE_FILE_CLASSES}"
+               echo "=============================================================================="
+               return 1
+       fi
+
+       return 0
+}
+
+# Check the elasticsearch connectors' fat jars for illegal or missing artifacts
+check_shaded_artifacts_connector_elasticsearch() {
+       VARIANT=$1
+       find flink-connectors/flink-connector-elasticsearch${VARIANT}/target/flink-connector-elasticsearch${VARIANT}*.jar ! -name "*-tests.jar" -exec jar tf {} \; > allClasses
+
+       UNSHADED_CLASSES=`cat allClasses | grep -v -e '^META-INF' -e '^assets' -e "^org/apache/flink/streaming/connectors/elasticsearch/" -e "^org/apache/flink/streaming/connectors/elasticsearch${VARIANT}/" -e "^org/apache/flink/table/descriptors/" -e "^org/elasticsearch/" | grep '\.class$'`
+       if [ "$?" == "0" ]; then
+               echo "=============================================================================="
+               echo "Detected unshaded dependencies in flink-connector-elasticsearch${VARIANT}'s fat jar:"
+               echo "${UNSHADED_CLASSES}"
+               echo "=============================================================================="
+               return 1
+       fi
+
+       UNSHADED_SERVICES=`cat allClasses | grep '^META-INF/services/' | grep -v -e '^META-INF/services/org\.apache\.flink\.core\.fs\.FileSystemFactory$' -e "^META-INF/services/org\.apache\.flink\.fs\.s3${VARIANT}\.shaded" -e '^META-INF/services/'`
+       if [ "$?" == "0" ]; then
+               echo "=============================================================================="
+               echo "Detected unshaded service files in flink-connector-elasticsearch${VARIANT}'s fat jar:"
+               echo "${UNSHADED_SERVICES}"
+               echo "=============================================================================="
+               return 1
+       fi
+
+       return 0
+}
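
All three checks above follow the same pattern: dump the jar's entry list, grep for packages that must (or must not) appear, and fail on an unexpected count. A condensed sketch of that pattern; the jar path and package here are placeholders, not an addition to the PR:

    jar tf build-target/lib/flink-dist*.jar > allClasses
    COUNT=$(grep -c '^com/google/common' allClasses)    # count unshaded guava entries
    if [ "$COUNT" != "0" ]; then
        echo "Detected $COUNT unshaded guava classes in fat jar"
        exit 1
    fi
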
diff --git a/tools/travis/stage.sh b/tools/travis/stage.sh
new file mode 100644
index 00000000000..b751383f957
--- /dev/null
+++ b/tools/travis/stage.sh
@@ -0,0 +1,139 @@
+#!/usr/bin/env bash
+################################################################################
+#  Licensed to the Apache Software Foundation (ASF) under one
+#  or more contributor license agreements.  See the NOTICE file
+#  distributed with this work for additional information
+#  regarding copyright ownership.  The ASF licenses this file
+#  to you under the Apache License, Version 2.0 (the
+#  "License"); you may not use this file except in compliance
+#  with the License.  You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing, software
+#  distributed under the License is distributed on an "AS IS" BASIS,
+#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#  See the License for the specific language governing permissions and
+# limitations under the License.
+################################################################################
+
+STAGE_COMPILE="compile"
+STAGE_CORE="core"
+STAGE_LIBRARIES="libraries"
+STAGE_CONNECTORS="connectors"
+STAGE_TESTS="tests"
+STAGE_MISC="misc"
+STAGE_CLEANUP="cleanup"
+
+MODULES_CORE="\
+flink-test-utils-parent/flink-test-utils,\
+flink-state-backends/flink-statebackend-rocksdb,\
+flink-clients,\
+flink-core,\
+flink-java,\
+flink-optimizer,\
+flink-runtime,\
+flink-runtime-web,\
+flink-scala,\
+flink-scala-shell,\
+flink-streaming-java,\
+flink-streaming-scala"
+
+MODULES_LIBRARIES="\
+flink-contrib/flink-storm,\
+flink-contrib/flink-storm-examples,\
+flink-libraries/flink-cep,\
+flink-libraries/flink-cep-scala,\
+flink-libraries/flink-gelly,\
+flink-libraries/flink-gelly-scala,\
+flink-libraries/flink-gelly-examples,\
+flink-libraries/flink-ml,\
+flink-libraries/flink-python,\
+flink-libraries/flink-streaming-python,\
+flink-libraries/flink-table,\
+flink-queryable-state/flink-queryable-state-runtime,\
+flink-queryable-state/flink-queryable-state-client-java"
+
+MODULES_CONNECTORS="\
+flink-contrib/flink-connector-wikiedits,\
+flink-filesystems/flink-hadoop-fs,\
+flink-filesystems/flink-mapr-fs,\
+flink-filesystems/flink-s3-fs-hadoop,\
+flink-filesystems/flink-s3-fs-presto,\
+flink-formats/flink-avro,\
+flink-formats/flink-parquet,\
+flink-connectors/flink-hbase,\
+flink-connectors/flink-hcatalog,\
+flink-connectors/flink-hadoop-compatibility,\
+flink-connectors/flink-jdbc,\
+flink-connectors/flink-connector-cassandra,\
+flink-connectors/flink-connector-elasticsearch,\
+flink-connectors/flink-connector-elasticsearch2,\
+flink-connectors/flink-connector-elasticsearch5,\
+flink-connectors/flink-connector-elasticsearch6,\
+flink-connectors/flink-connector-elasticsearch-base,\
+flink-connectors/flink-connector-filesystem,\
+flink-connectors/flink-connector-kafka-0.8,\
+flink-connectors/flink-connector-kafka-0.9,\
+flink-connectors/flink-connector-kafka-0.10,\
+flink-connectors/flink-connector-kafka-0.11,\
+flink-connectors/flink-connector-kafka-base,\
+flink-connectors/flink-connector-nifi,\
+flink-connectors/flink-connector-rabbitmq,\
+flink-connectors/flink-connector-twitter"
+
+MODULES_TESTS="\
+flink-tests"
+
+if [[ ${PROFILE} == *"include-kinesis"* ]]; then
+    MODULES_CONNECTORS="$MODULES_CONNECTORS,flink-connectors/flink-connector-kinesis"
+fi
+
+function get_compile_modules_for_stage() {
+    local stage=$1
+
+    case ${stage} in
+        (${STAGE_CORE})
+            echo "-pl $MODULES_CORE -am"
+        ;;
+        (${STAGE_LIBRARIES})
+            echo "-pl $MODULES_LIBRARIES -am"
+        ;;
+        (${STAGE_CONNECTORS})
+            echo "-pl $MODULES_CONNECTORS -am"
+        ;;
+        (${STAGE_TESTS})
+            echo "-pl $MODULES_TESTS -am"
+        ;;
+        (${STAGE_MISC})
+            # compile everything since dist needs it anyway
+            echo ""
+        ;;
+    esac
+}
+
+function get_test_modules_for_stage() {
+    local stage=$1
+
+    case ${stage} in
+        (${STAGE_CORE})
+            echo "-pl $MODULES_CORE"
+        ;;
+        (${STAGE_LIBRARIES})
+            echo "-pl $MODULES_LIBRARIES"
+        ;;
+        (${STAGE_CONNECTORS})
+            echo "-pl $MODULES_CONNECTORS"
+        ;;
+        (${STAGE_TESTS})
+            echo "-pl $MODULES_TESTS"
+        ;;
+        (${STAGE_MISC})
+            NEGATED_CORE=\!${MODULES_CORE//,/,\!}
+            NEGATED_LIBRARIES=\!${MODULES_LIBRARIES//,/,\!}
+            NEGATED_CONNECTORS=\!${MODULES_CONNECTORS//,/,\!}
+            NEGATED_TESTS=\!${MODULES_TESTS//,/,\!}
+            echo "-pl $NEGATED_CORE,$NEGATED_LIBRARIES,$NEGATED_CONNECTORS,$NEGATED_TESTS"
+        ;;
+    esac
+}
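
The misc branch above builds everything except the modules owned by the other stages; the \!${VAR//,/,\!} expansion is what negates the list. A standalone illustration of that substitution, with example module names that are not from the PR:

    MODULES="flink-tests,flink-core,flink-java"
    NEGATED=\!${MODULES//,/,\!}     # replace every "," with ",!" and prefix the first entry with "!"
    echo "$NEGATED"                 # prints: !flink-tests,!flink-core,!flink-java
    # Maven interprets "-pl !a,!b" as "all modules except a and b".
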
diff --git a/tools/travis_controller.sh b/tools/travis_controller.sh
new file mode 100755
index 00000000000..18cbeee5983
--- /dev/null
+++ b/tools/travis_controller.sh
@@ -0,0 +1,211 @@
+#!/usr/bin/env bash
+################################################################################
+#  Licensed to the Apache Software Foundation (ASF) under one
+#  or more contributor license agreements.  See the NOTICE file
+#  distributed with this work for additional information
+#  regarding copyright ownership.  The ASF licenses this file
+#  to you under the Apache License, Version 2.0 (the
+#  "License"); you may not use this file except in compliance
+#  with the License.  You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing, software
+#  distributed under the License is distributed on an "AS IS" BASIS,
+#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#  See the License for the specific language governing permissions and
+# limitations under the License.
+################################################################################
+
+CACHE_DIR="$HOME/flink_cache"
+CACHE_BUILD_DIR="$CACHE_DIR/$TRAVIS_BUILD_NUMBER"
+CACHE_FLINK_DIR="$CACHE_BUILD_DIR/flink"
+
+HERE="`dirname \"$0\"`"                                # relative
+HERE="`( cd \"$HERE\" && pwd )`"       # absolutized and normalized
+if [ -z "$HERE" ] ; then
+       # error; for some reason, the path is not accessible
+       # to the script (e.g. permissions re-evaled after suid)
+       exit 1  # fail
+fi
+
+source "${HERE}/travis/fold.sh"
+source "${HERE}/travis/stage.sh"
+source "${HERE}/travis/shade.sh"
+
+function deleteOldCaches() {
+       while read CACHE_DIR; do
+               local old_number="${CACHE_DIR##*/}"
+               if [ "$old_number" -lt "$TRAVIS_BUILD_NUMBER" ]; then
+                       echo "Deleting old cache $CACHE_DIR"
+                       rm -rf "$CACHE_DIR"
+               fi
+       done
+}
+
+# delete leftover caches from previous builds
+find "$CACHE_DIR" -mindepth 1 -maxdepth 1 | grep -v "$TRAVIS_BUILD_NUMBER" | deleteOldCaches
+
+function getCurrentStage() {
+       STAGE_NUMBER=$(echo "$TRAVIS_JOB_NUMBER" | cut -d'.' -f 2)
+       case $STAGE_NUMBER in
+               (1)
+                       echo "$STAGE_COMPILE"
+                       ;;
+               (2)
+                       echo "$STAGE_COMPILE"
+                       ;;
+               (3)
+                       echo "$STAGE_CORE"
+                       ;;
+               (4)
+                       echo "$STAGE_LIBRARIES"
+                       ;;
+               (5)
+                       echo "$STAGE_CONNECTORS"
+                       ;;
+               (6)
+                       echo "$STAGE_TESTS"
+                       ;;
+               (7)
+                       echo "$STAGE_MISC"
+                       ;;
+               (8)
+                       echo "$STAGE_CORE"
+                       ;;
+               (9)
+                       echo "$STAGE_LIBRARIES"
+                       ;;
+               (10)
+                       echo "$STAGE_CONNECTORS"
+                       ;;
+               (11)
+                       echo "$STAGE_TESTS"
+                       ;;
+               (12)
+                       echo "$STAGE_MISC"
+                       ;;
+               (13)
+                       echo "$STAGE_CLEANUP"
+                       ;;
+               (14)
+                       echo "$STAGE_CLEANUP"
+                       ;;
+               (*)
+                       echo "Invalid stage detected ($STAGE_NUMBER)"
+                       return 1
+                       ;;
+       esac
+
+       return 0
+}
+
+STAGE=$(getCurrentStage)
+if [ $? != 0 ]; then
+       echo "Could not determine current stage."
+       exit 1
+fi
+echo "Current stage: \"$STAGE\""
+
+EXIT_CODE=0
+
+# Run actual compile&test steps
+if [ $STAGE == "$STAGE_COMPILE" ]; then
+       MVN="mvn clean install -nsu -Dflink.forkCount=2 -Dflink.forkCountTestPackage=2 -Dmaven.javadoc.skip=true -B -DskipTests $PROFILE"
+       $MVN
+       EXIT_CODE=$?
+
+    if [ $EXIT_CODE == 0 ]; then
+        printf "\n\n==============================================================================\n"
+        printf "Checking dependency convergence\n"
+        printf "==============================================================================\n"
+
+        ./tools/check_dependency_convergence.sh
+        EXIT_CODE=$?
+    else
+        printf "\n==============================================================================\n"
+        printf "Previous build failure detected, skipping dependency-convergence check.\n"
+        printf "==============================================================================\n"
+    fi
+    
+    if [ $EXIT_CODE == 0 ]; then
+        check_shaded_artifacts
+        EXIT_CODE=$(($EXIT_CODE+$?))
+        check_shaded_artifacts_s3_fs hadoop
+        EXIT_CODE=$(($EXIT_CODE+$?))
+        check_shaded_artifacts_s3_fs presto
+        EXIT_CODE=$(($EXIT_CODE+$?))
+        check_shaded_artifacts_connector_elasticsearch ""
+        EXIT_CODE=$(($EXIT_CODE+$?))
+        check_shaded_artifacts_connector_elasticsearch 2
+        EXIT_CODE=$(($EXIT_CODE+$?))
+        check_shaded_artifacts_connector_elasticsearch 5
+        EXIT_CODE=$(($EXIT_CODE+$?))
+    else
+        echo "=============================================================================="
+        echo "Previous build failure detected, skipping shaded dependency check."
+        echo "=============================================================================="
+    fi
+
+    if [ $EXIT_CODE == 0 ]; then
+        echo "Creating cache build directory $CACHE_FLINK_DIR"
+        mkdir -p "$CACHE_FLINK_DIR"
+    
+        cp -r . "$CACHE_FLINK_DIR"
+
+        function minimizeCachedFiles() {
+            # reduces the size of the cached directory to speed up
+            # the packing&upload / download&unpacking process
+            # by removing files not required for subsequent stages
+    
+            # original jars
+            find "$CACHE_FLINK_DIR" -maxdepth 8 -type f -name 'original-*.jar' | xargs rm -rf
+    
+            # .git directory
+            # not deleting this can cause build stability issues
+            # merging the cached version sometimes fails
+            rm -rf "$CACHE_FLINK_DIR/.git"
+        }
+    
+        start_fold "minimize_cache" "Minimizing cache"
+        travis_time_start
+        minimizeCachedFiles
+        travis_time_finish
+        end_fold "minimize_cache"
+    else
+        echo "=============================================================================="
+        echo "Previous build failure detected, skipping cache setup."
+        echo "=============================================================================="
+    fi
+elif [ $STAGE != "$STAGE_CLEANUP" ]; then
+       if ! [ -e $CACHE_FLINK_DIR ]; then
+               echo "Cached flink dir $CACHE_FLINK_DIR does not exist. Exiting build."
+               exit 1
+       fi
+       # merge compiled flink into local clone
+       # this prevents the cache from being re-uploaded
+       start_fold "merge_cache" "Merging cache"
+       travis_time_start
+       cp -RT "$CACHE_FLINK_DIR" "."
+       travis_time_finish
+       end_fold "merge_cache"
+
+       start_fold "adjust_timestamps" "Adjusting timestamps"
+       travis_time_start
+       # adjust timestamps to prevent recompilation
+       find . -type f -name '*.java' | xargs touch
+       find . -type f -name '*.scala' | xargs touch
+       find . -type f -name '*.class' | xargs touch
+       find . -type f -name '*.timestamp' | xargs touch
+       travis_time_finish
+       end_fold "adjust_timestamps"
+
+       TEST="$STAGE" "./tools/travis_mvn_watchdog.sh" 300
+       EXIT_CODE=$?
+else
+       echo "Cleaning up $CACHE_BUILD_DIR"
+       rm -rf "$CACHE_BUILD_DIR"
+fi
+
+# Exit code for Travis build success/failure
+exit $EXIT_CODE
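
To make the cache handoff above concrete: the compile job of each build copies the compiled tree into a directory keyed by $TRAVIS_BUILD_NUMBER, and every later job of the same build merges it back before testing. A rough sketch of what a test job effectively does; build number 1234 is illustrative only:

    CACHE_FLINK_DIR="$HOME/flink_cache/1234/flink"   # written by the compile stage
    cp -RT "$CACHE_FLINK_DIR" .                      # merge compiled artifacts into the clone
    find . -type f -name '*.class' | xargs touch     # keep mvn from recompiling unchanged sources
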
diff --git a/tools/travis_mvn_watchdog.sh b/tools/travis_mvn_watchdog.sh
index 160ca465042..63c177258c4 100755
--- a/tools/travis_mvn_watchdog.sh
+++ b/tools/travis_mvn_watchdog.sh
@@ -25,6 +25,8 @@ if [ -z "$HERE" ] ; then
        exit 1  # fail
 fi
 
+source "${HERE}/travis/stage.sh"
+
 ARTIFACTS_DIR="${HERE}/artifacts"
 
 mkdir -p $ARTIFACTS_DIR || { echo "FAILURE: cannot create log directory '${ARTIFACTS_DIR}'." ; exit 1; }
@@ -43,115 +45,8 @@ SLEEP_TIME=20
 
 LOG4J_PROPERTIES=${HERE}/log4j-travis.properties
 
-MODULES_CORE="\
-flink-test-utils-parent/flink-test-utils,\
-flink-state-backends/flink-statebackend-rocksdb,\
-flink-clients,\
-flink-core,\
-flink-java,\
-flink-optimizer,\
-flink-runtime,\
-flink-runtime-web,\
-flink-scala,\
-flink-scala-shell,\
-flink-streaming-java,\
-flink-streaming-scala"
-
-MODULES_LIBRARIES="\
-flink-contrib/flink-storm,\
-flink-contrib/flink-storm-examples,\
-flink-libraries/flink-cep,\
-flink-libraries/flink-cep-scala,\
-flink-libraries/flink-gelly,\
-flink-libraries/flink-gelly-scala,\
-flink-libraries/flink-gelly-examples,\
-flink-libraries/flink-ml,\
-flink-libraries/flink-python,\
-flink-libraries/flink-streaming-python,\
-flink-libraries/flink-table,\
-flink-queryable-state/flink-queryable-state-runtime,\
-flink-queryable-state/flink-queryable-state-client-java"
-
-MODULES_CONNECTORS="\
-flink-contrib/flink-connector-wikiedits,\
-flink-filesystems/flink-hadoop-fs,\
-flink-filesystems/flink-mapr-fs,\
-flink-filesystems/flink-s3-fs-hadoop,\
-flink-filesystems/flink-s3-fs-presto,\
-flink-formats/flink-avro,\
-flink-formats/flink-parquet,\
-flink-connectors/flink-hbase,\
-flink-connectors/flink-hcatalog,\
-flink-connectors/flink-hadoop-compatibility,\
-flink-connectors/flink-jdbc,\
-flink-connectors/flink-connector-cassandra,\
-flink-connectors/flink-connector-elasticsearch,\
-flink-connectors/flink-connector-elasticsearch2,\
-flink-connectors/flink-connector-elasticsearch5,\
-flink-connectors/flink-connector-elasticsearch6,\
-flink-connectors/flink-connector-elasticsearch-base,\
-flink-connectors/flink-connector-filesystem,\
-flink-connectors/flink-connector-kafka-0.8,\
-flink-connectors/flink-connector-kafka-0.9,\
-flink-connectors/flink-connector-kafka-0.10,\
-flink-connectors/flink-connector-kafka-0.11,\
-flink-connectors/flink-connector-kafka-base,\
-flink-connectors/flink-connector-nifi,\
-flink-connectors/flink-connector-rabbitmq,\
-flink-connectors/flink-connector-twitter"
-
-MODULES_TESTS="\
-flink-tests"
-
-if [[ $PROFILE == *"include-kinesis"* ]]; then
-       case $TEST in
-               (connectors)
-                       MODULES_CONNECTORS="$MODULES_CONNECTORS,flink-connectors/flink-connector-kinesis"
-               ;;
-       esac
-fi
-
-MVN_COMPILE_MODULES=""
-MVN_COMPILE_OPTIONS=""
-MVN_TEST_MODULES=""
-MVN_TEST_OPTIONS=""
-case $TEST in
-       (core)
-               MVN_COMPILE_MODULES="-pl $MODULES_CORE -am"
-               MVN_TEST_MODULES="-pl $MODULES_CORE"
-               MVN_COMPILE_OPTIONS="-Dfast"
-               MVN_TEST_OPTIONS="-Dfast"
-       ;;
-       (libraries)
-               MVN_COMPILE_MODULES="-pl $MODULES_LIBRARIES -am"
-               MVN_TEST_MODULES="-pl $MODULES_LIBRARIES"
-               MVN_COMPILE_OPTIONS="-Dfast"
-               MVN_TEST_OPTIONS="-Dfast"
-       ;;
-       (connectors)
-               MVN_COMPILE_MODULES="-pl $MODULES_CONNECTORS -am"
-               MVN_TEST_MODULES="-pl $MODULES_CONNECTORS"
-               MVN_COMPILE_OPTIONS="-Dfast"
-               MVN_TEST_OPTIONS="-Dfast"
-       ;;
-       (tests)
-               MVN_COMPILE_MODULES="-pl $MODULES_TESTS -am"
-               MVN_TEST_MODULES="-pl $MODULES_TESTS"
-               MVN_COMPILE_OPTIONS="-Dfast"
-               MVN_TEST_OPTIONS="-Dfast"
-       ;;
-       (misc)
-               NEGATED_CORE=\!${MODULES_CORE//,/,\!}
-               NEGATED_LIBRARIES=\!${MODULES_LIBRARIES//,/,\!}
-               NEGATED_CONNECTORS=\!${MODULES_CONNECTORS//,/,\!}
-               NEGATED_TESTS=\!${MODULES_TESTS//,/,\!}
-               # compile everything since dist needs it anyway
-               MVN_COMPILE_MODULES=""
-               MVN_TEST_MODULES="-pl $NEGATED_CORE,$NEGATED_LIBRARIES,$NEGATED_CONNECTORS,$NEGATED_TESTS"
-               MVN_COMPILE_OPTIONS=""
-               MVN_TEST_OPTIONS="-Dfast"
-       ;;
-esac
+MVN_COMPILE_MODULES=$(get_compile_modules_for_stage ${TEST})
+MVN_TEST_MODULES=$(get_test_modules_for_stage ${TEST})
 
 # Maven command to run. We set the forkCount manually, because otherwise Maven sees too many cores
 # on the Travis VMs. Set forkCountTestPackage to 1 for container-based environment (4 GiB memory)
@@ -160,10 +55,11 @@ esac
 # -nsu option forbids downloading snapshot artifacts. The only snapshot artifacts we depend on are from
 # Flink, which however should all be built locally. see FLINK-7230
 MVN_LOGGING_OPTIONS="-Dlog.dir=${ARTIFACTS_DIR} -Dlog4j.configuration=file://$LOG4J_PROPERTIES -Dorg.slf4j.simpleLogger.log.org.apache.maven.cli.transfer.Slf4jMavenTransferListener=warn"
-MVN_COMMON_OPTIONS="-nsu -Dflink.forkCount=2 -Dflink.forkCountTestPackage=2 -B $MVN_LOGGING_OPTIONS"
-MVN_COMPILE_OPTIONS="$MVN_COMPILE_OPTIONS -DskipTests"
+MVN_COMMON_OPTIONS="-nsu -Dflink.forkCount=2 -Dflink.forkCountTestPackage=2 -Dfast -B $MVN_LOGGING_OPTIONS"
+MVN_COMPILE_OPTIONS="-DskipTests"
+MVN_TEST_OPTIONS="$MVN_LOGGING_OPTIONS"
 
-MVN_COMPILE="mvn $MVN_COMMON_OPTIONS $MVN_COMPILE_OPTIONS $PROFILE $MVN_COMPILE_MODULES clean install"
+MVN_COMPILE="mvn $MVN_COMMON_OPTIONS $MVN_COMPILE_OPTIONS $PROFILE $MVN_COMPILE_MODULES install"
 MVN_TEST="mvn $MVN_COMMON_OPTIONS $MVN_TEST_OPTIONS $PROFILE $MVN_TEST_MODULES verify"
 
 MVN_PID="${ARTIFACTS_DIR}/watchdog.mvn.pid"
@@ -304,184 +200,6 @@ watchdog () {
        done
 }
 
-# Check the final fat jar for illegal or missing artifacts
-check_shaded_artifacts() {
-       jar tf build-target/lib/flink-dist*.jar > allClasses
-       ASM=`cat allClasses | grep '^org/objectweb/asm/' | wc -l`
-       if [ "$ASM" != "0" ]; then
-               echo "=============================================================================="
-               echo "Detected '$ASM' unshaded asm dependencies in fat jar"
-               echo "=============================================================================="
-               return 1
-       fi
-
-       GUAVA=`cat allClasses | grep '^com/google/common' | wc -l`
-       if [ "$GUAVA" != "0" ]; then
-               echo "=============================================================================="
-               echo "Detected '$GUAVA' guava dependencies in fat jar"
-               echo "=============================================================================="
-               return 1
-       fi
-
-       CODEHAUS_JACKSON=`cat allClasses | grep '^org/codehaus/jackson' | wc -l`
-       if [ "$CODEHAUS_JACKSON" != "0" ]; then
-               echo "=============================================================================="
-               echo "Detected '$CODEHAUS_JACKSON' unshaded org.codehaus.jackson classes in fat jar"
-               echo "=============================================================================="
-               return 1
-       fi
-
-       FASTERXML_JACKSON=`cat allClasses | grep '^com/fasterxml/jackson' | wc -l`
-       if [ "$FASTERXML_JACKSON" != "0" ]; then
-               echo "=============================================================================="
-               echo "Detected '$FASTERXML_JACKSON' unshaded com.fasterxml.jackson classes in fat jar"
-               echo "=============================================================================="
-               return 1
-       fi
-
-       SNAPPY=`cat allClasses | grep '^org/xerial/snappy' | wc -l`
-       if [ "$SNAPPY" == "0" ]; then
-               echo "=============================================================================="
-               echo "Missing snappy dependencies in fat jar"
-               echo "=============================================================================="
-               return 1
-       fi
-
-       IO_NETTY=`cat allClasses | grep '^io/netty' | wc -l`
-       if [ "$IO_NETTY" != "0" ]; then
-               echo "=============================================================================="
-               echo "Detected '$IO_NETTY' unshaded io.netty classes in fat jar"
-               echo "=============================================================================="
-               return 1
-       fi
-
-       ORG_NETTY=`cat allClasses | grep '^org/jboss/netty' | wc -l`
-       if [ "$ORG_NETTY" != "0" ]; then
-               echo "=============================================================================="
-               echo "Detected '$ORG_NETTY' unshaded org.jboss.netty classes in fat jar"
-               echo "=============================================================================="
-               return 1
-       fi
-
-       ZOOKEEPER=`cat allClasses | grep '^org/apache/zookeeper' | wc -l`
-       if [ "$ZOOKEEPER" != "0" ]; then
-               echo "=============================================================================="
-               echo "Detected '$ZOOKEEPER' unshaded org.apache.zookeeper classes in fat jar"
-               echo "=============================================================================="
-               return 1
-       fi
-
-       CURATOR=`cat allClasses | grep '^org/apache/curator' | wc -l`
-       if [ "$CURATOR" != "0" ]; then
-               echo "=============================================================================="
-               echo "Detected '$CURATOR' unshaded org.apache.curator classes in fat jar"
-               echo "=============================================================================="
-               return 1
-       fi
-
-       FLINK_PYTHON=`cat allClasses | grep '^org/apache/flink/python' | wc -l`
-       if [ "$FLINK_PYTHON" != "0" ]; then
-               echo "=============================================================================="
-               echo "Detected that the Flink Python artifact is in the dist jar"
-               echo "=============================================================================="
-               return 1
-       fi
-
-       HADOOP=`cat allClasses | grep '^org/apache/hadoop' | wc -l`
-       if [ "$HADOOP" != "0" ]; then
-               echo "=============================================================================="
-               echo "Detected '$HADOOP' Hadoop classes in the dist jar"
-               echo "=============================================================================="
-               return 1
-       fi
-
-       MAPR=`cat allClasses | grep '^com/mapr' | wc -l`
-       if [ "$MAPR" != "0" ]; then
-               echo "=============================================================================="
-               echo "Detected '$MAPR' MapR classes in the dist jar"
-               echo "=============================================================================="
-               return 1
-       fi
-
-       return 0
-}
-
-# Check the S3 fs implementations' fat jars for illegal or missing artifacts
-check_shaded_artifacts_s3_fs() {
-       VARIANT=$1
-       jar tf flink-filesystems/flink-s3-fs-${VARIANT}/target/flink-s3-fs-${VARIANT}*.jar > allClasses
-
-       UNSHADED_CLASSES=`cat allClasses | grep -v -e '^META-INF' -e "^org/apache/flink/fs/" | grep '\.class$'`
-       if [ "$?" == "0" ]; then
-               echo "=============================================================================="
-               echo "${VARIANT}: Detected unshaded dependencies in fat jar:"
-               echo "${UNSHADED_CLASSES}"
-               echo "=============================================================================="
-               return 1
-       fi
-
-       if [ ! `cat allClasses | grep '^META-INF/services/org\.apache\.flink\.core\.fs\.FileSystemFactory$'` ]; then
-               echo "=============================================================================="
-               echo "${VARIANT}: File does not exist: services/org.apache.flink.core.fs.FileSystemFactory"
-               echo "=============================================================================="
-               return 1
-       fi
-
-       UNSHADED_SERVICES=`cat allClasses | grep '^META-INF/services/' | grep -v -e '^META-INF/services/org\.apache\.flink\.core\.fs\.FileSystemFactory$' -e "^META-INF/services/org\.apache\.flink\.fs.*shaded" -e '^META-INF/services/'`
-       if [ "$?" == "0" ]; then
-               echo "=============================================================================="
-               echo "${VARIANT}: Detected unshaded service files in fat jar:"
-               echo "${UNSHADED_SERVICES}"
-               echo "=============================================================================="
-               return 1
-       fi
-
-       FS_SERVICE_FILE_CLASSES=`unzip -q -c flink-filesystems/flink-s3-fs-${VARIANT}/target/flink-s3-fs-${VARIANT}*.jar META-INF/services/org.apache.flink.core.fs.FileSystemFactory | grep -v -e '^#' -e '^$'`
-       EXPECTED_FS_SERVICE_FILE_CLASSES="org.apache.flink.fs.s3${VARIANT}.S3FileSystemFactory"
-       if [ "${VARIANT}" == "hadoop" ]; then
-               read -r -d '' EXPECTED_FS_SERVICE_FILE_CLASSES <<EOF
-org.apache.flink.fs.s3${VARIANT}.S3FileSystemFactory
-org.apache.flink.fs.s3${VARIANT}.S3AFileSystemFactory
-EOF
-       fi
-
-       if [ "${FS_SERVICE_FILE_CLASSES}" != "${EXPECTED_FS_SERVICE_FILE_CLASSES}" ]; then
-               echo "=============================================================================="
-               echo "${VARIANT}: Detected wrong content in services/org.apache.flink.core.fs.FileSystemFactory:"
-               echo "${FS_SERVICE_FILE_CLASSES}"
-               echo "=============================================================================="
-               return 1
-       fi
-
-       return 0
-}
-
-# Check the elasticsearch connectors' fat jars for illegal or missing artifacts
-check_shaded_artifacts_connector_elasticsearch() {
-       VARIANT=$1
-       find flink-connectors/flink-connector-elasticsearch${VARIANT}/target/flink-connector-elasticsearch${VARIANT}*.jar ! -name "*-tests.jar" -exec jar tf {} \; > allClasses
-
-       UNSHADED_CLASSES=`cat allClasses | grep -v -e '^META-INF' -e '^assets' -e "^org/apache/flink/streaming/connectors/elasticsearch/" -e "^org/apache/flink/streaming/connectors/elasticsearch${VARIANT}/" -e "^org/apache/flink/table/descriptors/" -e "^org/elasticsearch/" | grep '\.class$'`
-       if [ "$?" == "0" ]; then
-               echo "=============================================================================="
-               echo "Detected unshaded dependencies in flink-connector-elasticsearch${VARIANT}'s fat jar:"
-               echo "${UNSHADED_CLASSES}"
-               echo "=============================================================================="
-               return 1
-       fi
-
-       UNSHADED_SERVICES=`cat allClasses | grep '^META-INF/services/' | grep -v -e '^META-INF/services/org\.apache\.flink\.core\.fs\.FileSystemFactory$' -e "^META-INF/services/org\.apache\.flink\.fs\.s3${VARIANT}\.shaded" -e '^META-INF/services/'`
-       if [ "$?" == "0" ]; then
-               echo "=============================================================================="
-               echo "Detected unshaded service files in flink-connector-elasticsearch${VARIANT}'s fat jar:"
-               echo "${UNSHADED_SERVICES}"
-               echo "=============================================================================="
-               return 1
-       fi
-
-       return 0
-}
-
 # =============================================================================
 # WATCHDOG
 # =============================================================================
@@ -518,24 +236,6 @@ echo "Trying to KILL watchdog (${WD_PID})."
 rm $MVN_PID
 rm $MVN_EXIT
 
-# only run dependency-convergence in misc because it is the only profile building all of Flink
-case $TEST in
-       (misc)
-               if [ $EXIT_CODE == 0 ]; then
-                       printf "\n\n==============================================================================\n"
-                       printf "Checking dependency convergence\n"
-                       printf "==============================================================================\n"
-
-                       ./tools/check_dependency_convergence.sh
-                       EXIT_CODE=$?
-               else
-                       printf "\n==============================================================================\n"
-                       printf "Previous build failure detected, skipping dependency-convergence check.\n"
-                       printf "==============================================================================\n"
-               fi
-       ;;
-esac
-
 # Run tests if compilation was successful
 if [ $EXIT_CODE == 0 ]; then
 
@@ -574,30 +274,6 @@ fi
 case $TEST in
        (misc)
                put_yarn_logs_to_artifacts
-
-               if [ $EXIT_CODE == 0 ]; then
-                       check_shaded_artifacts
-                       EXIT_CODE=$?
-               else
-                       echo "=============================================================================="
-                       echo "Compilation/test failure detected, skipping shaded dependency check."
-                       echo "=============================================================================="
-               fi
-       ;;
-       (connectors)
-               if [ $EXIT_CODE == 0 ]; then
-                       check_shaded_artifacts_s3_fs hadoop
-                       EXIT_CODE=$(($EXIT_CODE+$?))
-                       check_shaded_artifacts_s3_fs presto
-                       check_shaded_artifacts_connector_elasticsearch ""
-                       check_shaded_artifacts_connector_elasticsearch 2
-                       check_shaded_artifacts_connector_elasticsearch 5
-                       EXIT_CODE=$(($EXIT_CODE+$?))
-               else
-                       echo "=============================================================================="
-                       echo "Compilation/test failure detected, skipping shaded dependency check."
-                       echo "=============================================================================="
-               fi
        ;;
 esac
 


 

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Rework travis script to use build stages
> ----------------------------------------
>
>                 Key: FLINK-8819
>                 URL: https://issues.apache.org/jira/browse/FLINK-8819
>             Project: Flink
>          Issue Type: Sub-task
>          Components: Build System, Travis
>            Reporter: Chesnay Schepler
>            Assignee: Chesnay Schepler
>            Priority: Trivial
>              Labels: pull-request-available
>
> This issue is for tracking efforts to rework our Travis scripts to use 
> [stages|https://docs.travis-ci.com/user/build-stages/].
> This feature allows us to define a sequence of jobs that are run one after 
> another. This implies that we can define dependencies between jobs, in 
> contrast to our existing jobs that have to be self-contained.
> As an example, we could have a compile stage, and a test stage with multiple 
> jobs.
> The main benefit here is that we no longer have to compile modules multiple 
> times, which would reduce our build times.
> The major issue here however is that there is no _proper_ support for passing 
> build-artifacts from one stage to the next. According to this 
> [issue|https://github.com/travis-ci/beta-features/issues/28] it is on their 
> to-do-list however.
> In the meantime we could manually transfer the artifacts between stages by 
> either using the Travis cache or some other external storage. The cache 
> solution would work by setting up a cached directory (just like the mvn 
> cache) and creating build-scoped directories within it that contain the 
> artifacts (I have a prototype that works like this).
> The major concern here is that of cleaning up the cache/storage.
>  We can clean things up if
>  * our script fails
>  * the last stage succeeds.
> We can *not* clean things up if
>  * the build is canceled
>  * travis fails the build due to a timeout or similar
> as apparently there is [no way to run a script at the end of a 
> build|https://github.com/travis-ci/travis-ci/issues/4221].
> Thus we would either have to periodically clear the cache, or encode more 
> information into the cached files that would allow _other_ builds to clean up 
> stale data (for example, the build number or date).
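
For illustration, the cleanup scheme described above can be as simple as naming each cache directory after its build number and letting any newer build purge older ones; a minimal sketch mirroring the deleteOldCaches logic in the PR (paths are assumptions):

    for dir in "$HOME/flink_cache"/*; do
        [ -d "$dir" ] || continue                # nothing cached yet
        old_number="${dir##*/}"                  # directory name == build number
        if [ "$old_number" -lt "$TRAVIS_BUILD_NUMBER" ]; then
            rm -rf "$dir"                        # stale data from an older build
        fi
    done
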



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
