Github user tzulitai commented on a diff in the pull request:

    https://github.com/apache/flink/pull/5733#discussion_r176029439
  
    --- Diff: flink-end-to-end-tests/test-scripts/test_resume_savepoint.sh ---
    @@ -0,0 +1,102 @@
    +#!/usr/bin/env bash
    +################################################################################
    +# Licensed to the Apache Software Foundation (ASF) under one
    +# or more contributor license agreements.  See the NOTICE file
    +# distributed with this work for additional information
    +# regarding copyright ownership.  The ASF licenses this file
    +# to you under the Apache License, Version 2.0 (the
    +# "License"); you may not use this file except in compliance
    +# with the License.  You may obtain a copy of the License at
    +#
    +#     http://www.apache.org/licenses/LICENSE-2.0
    +#
    +# Unless required by applicable law or agreed to in writing, software
    +# distributed under the License is distributed on an "AS IS" BASIS,
    +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    +# See the License for the specific language governing permissions and
    +# limitations under the License.
    +################################################################################
    +
    +source "$(dirname "$0")"/common.sh
    +
    +start_cluster
    +
    +# this test runs 2 streaming jobs; adding extra taskmanagers for more slots
    +add_taskmanagers 1
    +
    +# get Kafka 0.10.0
    +mkdir -p $TEST_DATA_DIR
    +if [ -z "$3" ]; then
    +  # need to download Kafka because no Kafka was specified on the invocation
    +  KAFKA_URL="https://archive.apache.org/dist/kafka/0.10.2.0/kafka_2.11-0.10.2.0.tgz"
    +  echo "Downloading Kafka from $KAFKA_URL"
    +  curl "$KAFKA_URL" > $TEST_DATA_DIR/kafka.tgz
    +else
    +  echo "Using specified Kafka from $3"
    +  cp $3 $TEST_DATA_DIR/kafka.tgz
    +fi
    +
    +tar xzf $TEST_DATA_DIR/kafka.tgz -C $TEST_DATA_DIR/
    +KAFKA_DIR=$TEST_DATA_DIR/kafka_2.11-0.10.2.0
    +
    +# fix kafka config
    +sed -i -e "s+^\(dataDir\s*=\s*\).*$+\1$TEST_DATA_DIR/zookeeper+" $KAFKA_DIR/config/zookeeper.properties
    +sed -i -e "s+^\(log\.dirs\s*=\s*\).*$+\1$TEST_DATA_DIR/kafka+" $KAFKA_DIR/config/server.properties
    +$KAFKA_DIR/bin/zookeeper-server-start.sh -daemon $KAFKA_DIR/config/zookeeper.properties
    +$KAFKA_DIR/bin/kafka-server-start.sh -daemon $KAFKA_DIR/config/server.properties
    +
    +# make sure to stop Kafka and ZooKeeper at the end
    +
    +function kafka_cleanup {
    +  $KAFKA_DIR/bin/kafka-server-stop.sh
    +  $KAFKA_DIR/bin/zookeeper-server-stop.sh
    +
    +  # make sure to run regular cleanup as well
    +  cleanup
    --- End diff --
    
    @zentol the `kafka_cleanup` trap also includes this, which shuts down the Flink cluster and checks logs for errors.
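    
    For reference, a minimal sketch of how such a cleanup hook is typically wired up in bash. The trap registration itself is not visible in this hunk, so the `trap` lines below are an assumption about the script rather than a quote of it:
    
    ```bash
    function kafka_cleanup {
      # stop the Kafka broker and the bundled ZooKeeper started above
      $KAFKA_DIR/bin/kafka-server-stop.sh
      $KAFKA_DIR/bin/zookeeper-server-stop.sh
    
      # regular cleanup (from the sourced common.sh): shuts down the
      # Flink cluster and checks the logs for errors
      cleanup
    }
    
    # assumed registration: run the cleanup on interrupt and on exit,
    # so Kafka, ZooKeeper and Flink are torn down even if the test fails midway
    trap kafka_cleanup INT
    trap kafka_cleanup EXIT
    ```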

