Github user ssuchter commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20697#discussion_r191966223
  
    --- Diff: resource-managers/kubernetes/integration-tests/scripts/setup-integration-test-env.sh ---
    @@ -0,0 +1,91 @@
    +#!/usr/bin/env bash
    +
    +#
    +# Licensed to the Apache Software Foundation (ASF) under one or more
    +# contributor license agreements.  See the NOTICE file distributed with
    +# this work for additional information regarding copyright ownership.
    +# The ASF licenses this file to You under the Apache License, Version 2.0
    +# (the "License"); you may not use this file except in compliance with
    +# the License.  You may obtain a copy of the License at
    +#
    +#    http://www.apache.org/licenses/LICENSE-2.0
    +#
    +# Unless required by applicable law or agreed to in writing, software
    +# distributed under the License is distributed on an "AS IS" BASIS,
    +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    +# See the License for the specific language governing permissions and
    +# limitations under the License.
    +#
    +TEST_ROOT_DIR=$(git rev-parse --show-toplevel)
    +UNPACKED_SPARK_TGZ="$TEST_ROOT_DIR/target/spark-dist-unpacked"
    +IMAGE_TAG_OUTPUT_FILE="$TEST_ROOT_DIR/target/image-tag.txt"
    +DEPLOY_MODE="minikube"
    +IMAGE_REPO="docker.io/kubespark"
    +IMAGE_TAG="N/A"
    +SPARK_TGZ="N/A"
    +
    +# Parse arguments
    +while (( "$#" )); do
    +  case $1 in
    +    --unpacked-spark-tgz)
    +      UNPACKED_SPARK_TGZ="$2"
    +      shift
    +      ;;
    +    --image-repo)
    +      IMAGE_REPO="$2"
    +      shift
    +      ;;
    +    --image-tag)
    +      IMAGE_TAG="$2"
    +      shift
    +      ;;
    +    --image-tag-output-file)
    +      IMAGE_TAG_OUTPUT_FILE="$2"
    +      shift
    +      ;;
    +    --deploy-mode)
    +      DEPLOY_MODE="$2"
    +      shift
    +      ;;
    +    --spark-tgz)
    +      SPARK_TGZ="$2"
    +      shift
    +      ;;
    +    *)
    +      break
    +      ;;
    +  esac
    +  shift
    +done
    +
    +if [[ $SPARK_TGZ == "N/A" ]];
    +then
    +  echo "Must specify a Spark tarball to build Docker images against with 
--spark-tgz." && exit 1;
    --- End diff --
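    
    For reference, a typical invocation of this script with the flags it parses might look like the following (the repo, tag, and tarball path shown here are only illustrative, not the values the PRB actually uses):
    
        ./resource-managers/kubernetes/integration-tests/scripts/setup-integration-test-env.sh \
          --deploy-mode minikube \
          --image-repo docker.io/kubespark \
          --image-tag sometag \
          --image-tag-output-file target/image-tag.txt \
          --spark-tgz /path/to/spark-dist.tgz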
    
    Ok, it's important for me to be clear here. There are currently two PRBs, and that will continue for the immediate future.
    
    1. General Spark PRB, mainly for unit tests. This can run on all hosts.
    
    2. K8s integration-specific PRB. This one exits early on the many PRs that don't appear relevant. It exists specifically to run the K8s integration tests, and it can only run on some hosts.
    
    Because of the host restriction issue, these are two separate PRBs.
    
    It is definitely true that each of these builds the main Spark jars separately, so that ~11-minute build time is spent twice. But since the K8s-integration PRB only does this for a small set of PRs, it's not a significant cost to the Jenkins infrastructure.
    
    Within the K8s-integration PRB, the entire maven reactor is built only once, during the make-distribution step. The integration test step doesn't rebuild it.
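    
    As a rough sketch of that flow (assuming the standard dev/make-distribution.sh script for the distribution build; the exact flags the PRB job passes may differ):
    
        # Build the entire maven reactor once and produce a distribution tarball.
        ./dev/make-distribution.sh --tgz -Pkubernetes
    
        # The integration-test step reuses that tarball rather than rebuilding the
        # reactor, e.g. by pointing the setup script above at it (values illustrative):
        ./resource-managers/kubernetes/integration-tests/scripts/setup-integration-test-env.sh \
          --deploy-mode minikube \
          --spark-tgz ./spark-*-bin-*.tgz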

