This is an automated email from the ASF dual-hosted git repository.

srowen pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new 85b50d4  [SPARK-34539][BUILD][INFRA] Remove stand-alone version Zinc server
85b50d4 is described below

commit 85b50d42586be2f3f19c7d94a8aa297215ebfbc2
Author: Yikun Jiang <yikunk...@gmail.com>
AuthorDate: Mon Mar 1 08:39:38 2021 -0600

    [SPARK-34539][BUILD][INFRA] Remove stand-alone version Zinc server
    
    ### What changes were proposed in this pull request?
    Clean up all standalone Zinc server code and related configuration.
    
    ### Why are the changes needed?
    
![image](https://user-images.githubusercontent.com/1736354/109154790-c1d3e580-77a9-11eb-8cde-835deed6e10e.png)
    - Zinc is an incremental compiler used to speed up compilation.
    - The scala-maven-plugin is the Maven plugin used by Spark; one of its functions is to integrate Zinc to enable incremental compilation.
    - Since Spark v3.0.0 ([SPARK-28759](https://issues.apache.org/jira/browse/SPARK-28759)), the scala-maven-plugin has been upgraded to v4.x, which means the standalone Zinc v0.3.13 server is no longer needed.
    
    However, we still download, install, and start the standalone Zinc server. We should remove all standalone Zinc server code and related configuration; a quick sanity check for leftovers is sketched below.
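    
    One way to confirm nothing is left behind (an illustrative sketch, not part of this patch; it assumes a POSIX shell at the repository root) is to search the files this commit touches for surviving references:
    ```
    # After the cleanup, this should print no remaining Zinc references.
    grep -rni "zinc" build/mvn dev/run-tests.py dev/create-release \
      pom.xml .github/workflows/build_and_test.yml
    ```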
    
    See more in [SPARK-34539](https://issues.apache.org/jira/projects/SPARK/issues/SPARK-34539) or the doc [Zinc standalone server is useless after scala-maven-plugin 4.x](https://docs.google.com/document/d/1u4kCHDx7KjVlHGerfmbcKSB0cZo6AD4cBdHSse-SBsM).
    
    ### Does this PR introduce _any_ user-facing change?
    No
    
    ### How was this patch tested?
    Run any Maven build:
    ./build/mvn -DskipTests clean package -pl core
    You can see that incremental compilation still works: the "scala-maven-plugin:4.3.0:compile (scala-compile-first)" stage prints incremental compilation info, like:
    ```
    [INFO] --- scala-maven-plugin:4.3.0:testCompile (scala-test-compile-first) @ spark-core_2.12 ---
    [INFO] Using incremental compilation using Mixed compile order
    [INFO] Compiler bridge file: /root/.sbt/1.0/zinc/org.scala-sbt/org.scala-sbt-compiler-bridge_2.12-1.3.1-bin_2.12.10__52.0-1.3.1_20191012T045515.jar
    [INFO] compiler plugin: BasicArtifact(com.github.ghik,silencer-plugin_2.12.10,1.6.0,null)
    [INFO] Compiling 303 Scala sources and 27 Java sources to /root/spark/core/target/scala-2.12/test-classes ...
    ```
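    
    An additional illustrative check (not part of the patch; it assumes a Linux host with `pgrep` and `ss` available, and that 3030 was the old default ZINC_PORT):
    ```
    # No standalone Zinc process, nothing listening on the old default
    # port, and no Zinc installation left under build/.
    pgrep -af zinc || echo "no zinc process"
    ss -ltn | grep 3030 || echo "nothing listening on 3030"
    ls -d build/zinc-* 2>/dev/null || echo "no zinc under build/"
    ```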
    
    Closes #31647 from Yikun/cleanup-zinc.
    
    Authored-by: Yikun Jiang <yikunk...@gmail.com>
    Signed-off-by: Sean Owen <sro...@gmail.com>
---
 .github/workflows/build_and_test.yml    | 12 +++------
 .gitignore                              |  1 -
 build/mvn                               | 47 +--------------------------------
 dev/create-release/do-release-docker.sh |  1 -
 dev/create-release/release-build.sh     | 24 ++++-------------
 dev/run-tests.py                        | 13 +--------
 docs/building-spark.md                  |  7 +++--
 pom.xml                                 |  1 -
 8 files changed, 14 insertions(+), 92 deletions(-)

diff --git a/.github/workflows/build_and_test.yml b/.github/workflows/build_and_test.yml
index 6c61281..8be24f1 100644
--- a/.github/workflows/build_and_test.yml
+++ b/.github/workflows/build_and_test.yml
@@ -99,12 +99,11 @@ jobs:
       if: ${{ github.event.inputs.target != '' }}
      run: git merge --progress --ff-only origin/${{ github.event.inputs.target }}
     # Cache local repositories. Note that GitHub Actions cache has a 2G limit.
-    - name: Cache Scala, SBT, Maven and Zinc
+    - name: Cache Scala, SBT and Maven
       uses: actions/cache@v2
       with:
         path: |
           build/apache-maven-*
-          build/zinc-*
           build/scala-*
           build/*.jar
           ~/.sbt
@@ -186,12 +185,11 @@ jobs:
       if: ${{ github.event.inputs.target != '' }}
      run: git merge --progress --ff-only origin/${{ github.event.inputs.target }}
     # Cache local repositories. Note that GitHub Actions cache has a 2G limit.
-    - name: Cache Scala, SBT, Maven and Zinc
+    - name: Cache Scala, SBT and Maven
       uses: actions/cache@v2
       with:
         path: |
           build/apache-maven-*
-          build/zinc-*
           build/scala-*
           build/*.jar
           ~/.sbt
@@ -254,12 +252,11 @@ jobs:
       if: ${{ github.event.inputs.target != '' }}
      run: git merge --progress --ff-only origin/${{ github.event.inputs.target }}
     # Cache local repositories. Note that GitHub Actions cache has a 2G limit.
-    - name: Cache Scala, SBT, Maven and Zinc
+    - name: Cache Scala, SBT and Maven
       uses: actions/cache@v2
       with:
         path: |
           build/apache-maven-*
-          build/zinc-*
           build/scala-*
           build/*.jar
           ~/.sbt
@@ -297,12 +294,11 @@ jobs:
     - name: Checkout Spark repository
       uses: actions/checkout@v2
     # Cache local repositories. Note that GitHub Actions cache has a 2G limit.
-    - name: Cache Scala, SBT, Maven and Zinc
+    - name: Cache Scala, SBT and Maven
       uses: actions/cache@v2
       with:
         path: |
           build/apache-maven-*
-          build/zinc-*
           build/scala-*
           build/*.jar
           ~/.sbt
diff --git a/.gitignore b/.gitignore
index 917eac1..021af9b 100644
--- a/.gitignore
+++ b/.gitignore
@@ -30,7 +30,6 @@ R/pkg/tests/fulltests/Rplots.pdf
 build/*.jar
 build/apache-maven*
 build/scala*
-build/zinc*
 cache
 checkpoint
 conf/*.cmd
diff --git a/build/mvn b/build/mvn
index 672599a..719d757 100755
--- a/build/mvn
+++ b/build/mvn
@@ -91,27 +91,6 @@ install_mvn() {
   fi
 }
 
-# Install zinc under the build/ folder
-install_zinc() {
-  local ZINC_VERSION=0.3.15
-  ZINC_BIN="$(command -v zinc)"
-  if [ "$ZINC_BIN" ]; then
-    local ZINC_DETECTED_VERSION="$(zinc -version | head -n1 | awk '{print $5}')"
-  fi
-
-  if [ $(version $ZINC_DETECTED_VERSION) -lt $(version $ZINC_VERSION) ]; then
-    local zinc_path="zinc-${ZINC_VERSION}/bin/zinc"
-    [ ! -f "${_DIR}/${zinc_path}" ] && ZINC_INSTALL_FLAG=1
-    local TYPESAFE_MIRROR=${TYPESAFE_MIRROR:-https://downloads.lightbend.com}
-
-    install_app \
-      "${TYPESAFE_MIRROR}/zinc/${ZINC_VERSION}" \
-      "zinc-${ZINC_VERSION}.tgz" \
-      "${zinc_path}"
-    ZINC_BIN="${_DIR}/${zinc_path}"
-  fi
-}
-
 # Determine the Scala version from the root pom.xml file, set the Scala URL,
 # and, with that, download the specific version of Scala necessary under
 # the build/ folder
@@ -131,31 +110,12 @@ install_scala() {
  SCALA_LIBRARY="$(cd "$(dirname "${scala_bin}")/../lib" && pwd)/scala-library.jar"
 }
 
-# Setup healthy defaults for the Zinc port if none were provided from
-# the environment
-ZINC_PORT=${ZINC_PORT:-"3030"}
-
-# Install the proper version of Scala, Zinc and Maven for the build
-if [ "$(uname -m)" != 'aarch64' ]; then
-  install_zinc
-fi
 install_scala
 install_mvn
 
 # Reset the current working directory
 cd "${_CALLING_DIR}"
 
-# Now that zinc is ensured to be installed, check its status and, if its
-# not running or just installed, start it
-if [ "$(uname -m)" != 'aarch64' ] && [ -n "${ZINC_INSTALL_FLAG}" -o -z 
"`"${ZINC_BIN}" -status -port ${ZINC_PORT}`" ]; then
-  export ZINC_OPTS=${ZINC_OPTS:-"$_COMPILE_JVM_OPTS"}
-  "${ZINC_BIN}" -shutdown -port ${ZINC_PORT}
-  "${ZINC_BIN}" -start -port ${ZINC_PORT} \
-    -server 127.0.0.1 -idle-timeout 3h \
-    -scala-compiler "${SCALA_COMPILER}" \
-    -scala-library "${SCALA_LIBRARY}" &>/dev/null
-fi
-
 # Set any `mvn` options if not already present
 export MAVEN_OPTS=${MAVEN_OPTS:-"$_COMPILE_JVM_OPTS"}
 
@@ -163,12 +123,7 @@ echo "Using \`mvn\` from path: $MVN_BIN" 1>&2
 
 # call the `mvn` command as usual
 # SPARK-25854
-"${MVN_BIN}" -DzincPort=${ZINC_PORT} "$@"
+"${MVN_BIN}" "$@"
 MVN_RETCODE=$?
 
-# Try to shut down zinc explicitly if the server is still running.
-if [ "$(uname -m)" != 'aarch64' ]; then
-  "${ZINC_BIN}" -shutdown -port ${ZINC_PORT}
-fi
-
 exit $MVN_RETCODE
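
With the Zinc logic removed, the surviving control flow of build/mvn reduces to roughly the following (a condensed sketch assembled from the lines this diff keeps, not a verbatim excerpt):

```
install_scala
install_mvn

# Reset the current working directory
cd "${_CALLING_DIR}"

# Set any `mvn` options if not already present
export MAVEN_OPTS=${MAVEN_OPTS:-"$_COMPILE_JVM_OPTS"}

# call the `mvn` command as usual (SPARK-25854); no -DzincPort anymore
"${MVN_BIN}" "$@"
exit $?
```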
diff --git a/dev/create-release/do-release-docker.sh b/dev/create-release/do-release-docker.sh
index 19a5345..f1632f01 100755
--- a/dev/create-release/do-release-docker.sh
+++ b/dev/create-release/do-release-docker.sh
@@ -133,7 +133,6 @@ ASF_PASSWORD=$ASF_PASSWORD
 GPG_PASSPHRASE=$GPG_PASSPHRASE
 RELEASE_STEP=$RELEASE_STEP
 USER=$USER
-ZINC_OPTS=${RELEASE_ZINC_OPTS:-"-Xmx4g -XX:ReservedCodeCacheSize=2g"}
 EOF
 
 JAVA_VOL=
diff --git a/dev/create-release/release-build.sh b/dev/create-release/release-build.sh
index a39ea6e..52665f7 100755
--- a/dev/create-release/release-build.sh
+++ b/dev/create-release/release-build.sh
@@ -179,8 +179,6 @@ if [[ "$1" == "package" ]]; then
   shasum -a 512 spark-$SPARK_VERSION.tgz > spark-$SPARK_VERSION.tgz.sha512
   rm -rf spark-$SPARK_VERSION
 
-  ZINC_PORT=3035
-
   # Updated for each binary build
   make_binary_release() {
     NAME=$1
@@ -198,17 +196,12 @@ if [[ "$1" == "package" ]]; then
       R_FLAG="--r"
     fi
 
-    # We increment the Zinc port each time to avoid OOM's and other craziness if multiple builds
-    # share the same Zinc server.
-    ZINC_PORT=$((ZINC_PORT + 1))
-
     echo "Building binary dist $NAME"
     cp -r spark spark-$SPARK_VERSION-bin-$NAME
     cd spark-$SPARK_VERSION-bin-$NAME
 
     ./dev/change-scala-version.sh $SCALA_VERSION
 
-    export ZINC_PORT=$ZINC_PORT
     echo "Creating distribution: $NAME ($FLAGS)"
 
    # Write out the VERSION to PySpark version info we rewrite the - into a . and SNAPSHOT
@@ -221,8 +214,7 @@ if [[ "$1" == "package" ]]; then
 
     echo "Creating distribution"
     ./dev/make-distribution.sh --name $NAME --mvn $MVN_HOME/bin/mvn --tgz \
-      $PIP_FLAG $R_FLAG $FLAGS \
-      -DzincPort=$ZINC_PORT 2>&1 >  ../binary-release-$NAME.log
+      $PIP_FLAG $R_FLAG $FLAGS 2>&1 >  ../binary-release-$NAME.log
     cd ..
 
     if [[ -n $R_FLAG ]]; then
@@ -380,14 +372,11 @@ if [[ "$1" == "publish-snapshot" ]]; then
   echo "<password>$ASF_PASSWORD</password>" >> $tmp_settings
   echo "</server></servers></settings>" >> $tmp_settings
 
-  # Generate random port for Zinc
-  export ZINC_PORT=$(python -S -c "import random; print(random.randrange(3030,4030))")
-
-  $MVN -DzincPort=$ZINC_PORT --settings $tmp_settings -DskipTests $SCALA_2_12_PROFILES $PUBLISH_PROFILES clean deploy
+  $MVN --settings $tmp_settings -DskipTests $SCALA_2_12_PROFILES $PUBLISH_PROFILES clean deploy
 
   if [[ $PUBLISH_SCALA_2_13 = 1 ]]; then
     ./dev/change-scala-version.sh 2.13
-    $MVN -DzincPort=$ZINC_PORT --settings $tmp_settings -DskipTests $SCALA_2_13_PROFILES $PUBLISH_PROFILES clean deploy
+    $MVN --settings $tmp_settings -DskipTests $SCALA_2_13_PROFILES $PUBLISH_PROFILES clean deploy
   fi
 
   rm $tmp_settings
@@ -417,18 +406,15 @@ if [[ "$1" == "publish-release" ]]; then
 
   tmp_repo=$(mktemp -d spark-repo-XXXXX)
 
-  # Generate random port for Zinc
-  export ZINC_PORT=$(python -S -c "import random; print(random.randrange(3030,4030))")
-
   if [[ $PUBLISH_SCALA_2_13 = 1 ]]; then
     ./dev/change-scala-version.sh 2.13
-    $MVN -DzincPort=$ZINC_PORT -Dmaven.repo.local=$tmp_repo -DskipTests \
+    $MVN -Dmaven.repo.local=$tmp_repo -DskipTests \
       $SCALA_2_13_PROFILES $PUBLISH_PROFILES clean install
   fi
 
   if [[ $PUBLISH_SCALA_2_12 = 1 ]]; then
     ./dev/change-scala-version.sh 2.12
-    $MVN -DzincPort=$((ZINC_PORT + 2)) -Dmaven.repo.local=$tmp_repo -DskipTests \
+    $MVN -Dmaven.repo.local=$tmp_repo -DskipTests \
       $SCALA_2_12_PROFILES $PUBLISH_PROFILES clean install
   fi
 
diff --git a/dev/run-tests.py b/dev/run-tests.py
index e54e098..83f9f02 100755
--- a/dev/run-tests.py
+++ b/dev/run-tests.py
@@ -20,7 +20,6 @@
 import itertools
 from argparse import ArgumentParser
 import os
-import random
 import re
 import sys
 import subprocess
@@ -257,21 +256,11 @@ def build_spark_documentation():
     os.chdir(SPARK_HOME)
 
 
-def get_zinc_port():
-    """
-    Get a randomized port on which to start Zinc
-    """
-    return random.randrange(3030, 4030)
-
-
 def exec_maven(mvn_args=()):
     """Will call Maven in the current directory with the list of mvn_args 
passed
     in and returns the subprocess for any further processing"""
 
-    zinc_port = get_zinc_port()
-    os.environ["ZINC_PORT"] = "%s" % zinc_port
-    zinc_flag = "-DzincPort=%s" % zinc_port
-    flags = [os.path.join(SPARK_HOME, "build", "mvn"), zinc_flag]
+    flags = [os.path.join(SPARK_HOME, "build", "mvn")]
     run_cmd(flags + mvn_args)
 
 
diff --git a/docs/building-spark.md b/docs/building-spark.md
index f9599b6..8e1c84a 100644
--- a/docs/building-spark.md
+++ b/docs/building-spark.md
@@ -51,7 +51,7 @@ You can fix these problems by setting the `MAVEN_OPTS` variable as discussed bef
 
 ### build/mvn
 
-Spark now comes packaged with a self-contained Maven installation to ease building and deployment of Spark from source located under the `build/` directory. This script will automatically download and setup all necessary build requirements ([Maven](https://maven.apache.org/), [Scala](https://www.scala-lang.org/), and [Zinc](https://github.com/typesafehub/zinc)) locally within the `build/` directory itself. It honors any `mvn` binary if present already, however, will pull down its own cop [...]
+Spark now comes packaged with a self-contained Maven installation to ease building and deployment of Spark from source located under the `build/` directory. This script will automatically download and setup all necessary build requirements ([Maven](https://maven.apache.org/), [Scala](https://www.scala-lang.org/)) locally within the `build/` directory itself. It honors any `mvn` binary if present already, however, will pull down its own copy of Scala regardless to ensure proper version re [...]
 
     ./build/mvn -DskipTests clean package
 
@@ -163,9 +163,8 @@ For the meanings of these two options, please carefully read the [Setting up Mav
 
 ## Speeding up Compilation
 
-Developers who compile Spark frequently may want to speed up compilation; e.g., by using Zinc
-(for developers who build with Maven) or by avoiding re-compilation of the assembly JAR (for
-developers who build with SBT).  For more information about how to do this, refer to the
+Developers who compile Spark frequently may want to speed up compilation; e.g., by avoiding re-compilation of the
+assembly JAR (for developers who build with SBT).  For more information about how to do this, refer to the
 [Useful Developer Tools page](https://spark.apache.org/developer-tools.html#reducing-build-times).
 
 ## Encrypted Filesystems
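
For SBT users nothing changes here either; incremental compilation comes from sbt itself. A minimal illustration (assuming the sbt launcher bundled with Spark and that `core` is the sbt project name; both are outside the scope of this diff):

```
# Launch the bundled sbt shell, then watch-recompile the core module;
# only changed sources are recompiled on each iteration.
./build/sbt
#   sbt> ~core/compile
```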
diff --git a/pom.xml b/pom.xml
index 4c300e4..e543cf2 100644
--- a/pom.xml
+++ b/pom.xml
@@ -2562,7 +2562,6 @@
             <checkMultipleScalaVersions>true</checkMultipleScalaVersions>
             <failOnMultipleScalaVersions>true</failOnMultipleScalaVersions>
             <recompileMode>incremental</recompileMode>
-            <useZincServer>true</useZincServer>
             <args>
               <arg>-unchecked</arg>
               <arg>-deprecation</arg>

