Repository: spark
Updated Branches:
  refs/heads/branch-1.5 0119edf52 -> 6ea33f5bf


[SPARK-1517] Refactor release scripts to facilitate nightly publishing

This update refactors the release scripts to make nightly publishing easier. 
I've been using these new scripts on Jenkins to cut and publish nightly 
snapshots for the last month or so, and it has been going well. I'd like to 
get them merged back upstream so they can be maintained by the community.

The main changes are:
1. Separate release tagging from the various build possibilities for an 
already tagged release (`release-tag.sh` and `release-build.sh`).
2. Allow credentials, including GPG keys, to be injected through the 
environment. This is paired with secure key injection in Jenkins.
3. Support copying build results to a remote directory, and also "rotating" 
results, i.e. keeping only the last N copies of binary or doc builds 
(sketched just below).
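
For a sense of what the rotation in (3) amounts to, here is a minimal sketch 
of the idea (the N value and parent directory are hypothetical placeholders, 
not the scripts' actual defaults):

    # Keep only the N newest build directories under a parent dir; remove the rest.
    # N and PARENT are illustrative placeholders.
    N=4
    PARENT=/home/jenkins/spark-nightly
    for old_dir in $(ls -t "$PARENT" | tail -n +$((N + 1))); do
      echo "Removing old build: $PARENT/$old_dir"
      rm -r "$PARENT/$old_dir"
    done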

I'm happy if anyone wants to take a look at this; it's not user-facing, just 
an internal utility used for generating releases.

Author: Patrick Wendell <patr...@databricks.com>

Closes #7411 from pwendell/release-script-updates and squashes the following 
commits:

74f9beb [Patrick Wendell] Moving maven build command to a variable
233ce85 [Patrick Wendell] [SPARK-1517] Refactor release scripts to facilitate 
nightly publishing

(cherry picked from commit 3ef0f32928fc383ad3edd5ad167212aeb9eba6e1)
Signed-off-by: Patrick Wendell <patr...@databricks.com>


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/6ea33f5b
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/6ea33f5b
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/6ea33f5b

Branch: refs/heads/branch-1.5
Commit: 6ea33f5bf1bf5577d99951b77e473c2b1479ac5c
Parents: 0119edf
Author: Patrick Wendell <patr...@databricks.com>
Authored: Tue Aug 11 21:16:48 2015 -0700
Committer: Patrick Wendell <patr...@databricks.com>
Committed: Tue Aug 11 21:16:59 2015 -0700

----------------------------------------------------------------------
 dev/create-release/create-release.sh | 267 -------------------------
 dev/create-release/release-build.sh  | 321 ++++++++++++++++++++++++++++++
 dev/create-release/release-tag.sh    |  79 ++++++++
 3 files changed, 400 insertions(+), 267 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/6ea33f5b/dev/create-release/create-release.sh
----------------------------------------------------------------------
diff --git a/dev/create-release/create-release.sh b/dev/create-release/create-release.sh
deleted file mode 100755
index 4311c8c..0000000
--- a/dev/create-release/create-release.sh
+++ /dev/null
@@ -1,267 +0,0 @@
-#!/usr/bin/env bash
-
-#
-# Licensed to the Apache Software Foundation (ASF) under one or more
-# contributor license agreements.  See the NOTICE file distributed with
-# this work for additional information regarding copyright ownership.
-# The ASF licenses this file to You under the Apache License, Version 2.0
-# (the "License"); you may not use this file except in compliance with
-# the License.  You may obtain a copy of the License at
-#
-#    http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-#
-
-# Quick-and-dirty automation of making maven and binary releases. Not robust at all.
-# Publishes releases to Maven and packages/copies binary release artifacts.
-# Expects to be run in a totally empty directory.
-#
-# Options:
-#  --skip-create-release       Assume the desired release tag already exists
-#  --skip-publish              Do not publish to Maven central
-#  --skip-package              Do not package and upload binary artifacts
-# Would be nice to add:
-#  - Send output to stderr and have useful logging in stdout
-
-# Note: The following variables must be set before use!
-ASF_USERNAME=${ASF_USERNAME:-pwendell}
-ASF_PASSWORD=${ASF_PASSWORD:-XXX}
-GPG_PASSPHRASE=${GPG_PASSPHRASE:-XXX}
-GIT_BRANCH=${GIT_BRANCH:-branch-1.0}
-RELEASE_VERSION=${RELEASE_VERSION:-1.2.0}
-# Allows publishing under a different version identifier than
-# was present in the actual release sources (e.g. rc-X)
-PUBLISH_VERSION=${PUBLISH_VERSION:-$RELEASE_VERSION} 
-NEXT_VERSION=${NEXT_VERSION:-1.2.1}
-RC_NAME=${RC_NAME:-rc2}
-
-M2_REPO=~/.m2/repository
-SPARK_REPO=$M2_REPO/org/apache/spark
-NEXUS_ROOT=https://repository.apache.org/service/local/staging
-NEXUS_PROFILE=d63f592e7eac0 # Profile for Spark staging uploads
-
-if [ -z "$JAVA_HOME" ]; then
-  echo "Error: JAVA_HOME is not set, cannot proceed."
-  exit -1
-fi
-JAVA_7_HOME=${JAVA_7_HOME:-$JAVA_HOME}
-
-set -e
-
-GIT_TAG=v$RELEASE_VERSION-$RC_NAME
-
-if [[ ! "$@" =~ --skip-create-release ]]; then
-  echo "Creating release commit and publishing to Apache repository"
-  # Artifact publishing
-  git clone https://$ASF_USERNAME:$ASF_PASSWORD@git-wip-us.apache.org/repos/asf/spark.git \
-    -b $GIT_BRANCH
-  pushd spark
-  export MAVEN_OPTS="-Xmx3g -XX:MaxPermSize=1g -XX:ReservedCodeCacheSize=1g"
-
-  # Create release commits and push them to github
-  # NOTE: This is done "eagerly", i.e. we don't check whether we can successfully build
-  # before we coin the release commit. This helps avoid races where
-  # other people add commits to this branch while we are in the middle of building.
-  cur_ver="${RELEASE_VERSION}-SNAPSHOT"
-  rel_ver="${RELEASE_VERSION}"
-  next_ver="${NEXT_VERSION}-SNAPSHOT"
-
-  old="^\( \{2,4\}\)<version>${cur_ver}<\/version>$"
-  new="\1<version>${rel_ver}<\/version>"
-  find . -name pom.xml | grep -v dev | xargs -I {} sed -i \
-    -e "s/${old}/${new}/" {}
-  find . -name package.scala | grep -v dev | xargs -I {} sed -i \
-    -e "s/${old}/${new}/" {}
-
-  git commit -a -m "Preparing Spark release $GIT_TAG"
-  echo "Creating tag $GIT_TAG at the head of $GIT_BRANCH"
-  git tag $GIT_TAG
-
-  old="^\( \{2,4\}\)<version>${rel_ver}<\/version>$"
-  new="\1<version>${next_ver}<\/version>"
-  find . -name pom.xml | grep -v dev | xargs -I {} sed -i \
-    -e "s/$old/$new/" {}
-  find . -name package.scala | grep -v dev | xargs -I {} sed -i \
-    -e "s/${old}/${new}/" {}
-  git commit -a -m "Preparing development version $next_ver"
-  git push origin $GIT_TAG
-  git push origin HEAD:$GIT_BRANCH
-  popd
-  rm -rf spark
-fi
-
-if [[ ! "$@" =~ --skip-publish ]]; then
-  git clone https://$ASF_USERNAME:$ASF_PASSWORD@git-wip-us.apache.org/repos/asf/spark.git
-  pushd spark
-  git checkout --force $GIT_TAG 
-  
-  # Substitute in case published version is different than released
-  old="^\( \{2,4\}\)<version>${RELEASE_VERSION}<\/version>$"
-  new="\1<version>${PUBLISH_VERSION}<\/version>"
-  find . -name pom.xml | grep -v dev | xargs -I {} sed -i \
-    -e "s/${old}/${new}/" {}
-
-  # Using Nexus API documented here:
-  # https://support.sonatype.com/entries/39720203-Uploading-to-a-Staging-Repository-via-REST-API
-  echo "Creating Nexus staging repository"
-  repo_request="<promoteRequest><data><description>Apache Spark $GIT_TAG (published as $PUBLISH_VERSION)</description></data></promoteRequest>"
-  out=$(curl -X POST -d "$repo_request" -u $ASF_USERNAME:$ASF_PASSWORD \
-    -H "Content-Type:application/xml" -v \
-    $NEXUS_ROOT/profiles/$NEXUS_PROFILE/start)
-  staged_repo_id=$(echo $out | sed -e "s/.*\(orgapachespark-[0-9]\{4\}\).*/\1/")
-  echo "Created Nexus staging repository: $staged_repo_id"
-
-  rm -rf $SPARK_REPO
-
-  build/mvn -DskipTests -Pyarn -Phive \
-    -Phive-thriftserver -Phadoop-2.2 -Pspark-ganglia-lgpl -Pkinesis-asl \
-    clean install
-
-  ./dev/change-scala-version.sh 2.11
-  
-  build/mvn -DskipTests -Pyarn -Phive \
-    -Dscala-2.11 -Phadoop-2.2 -Pspark-ganglia-lgpl -Pkinesis-asl \
-    clean install
-
-  ./dev/change-scala-version.sh 2.10
-
-  pushd $SPARK_REPO
-
-  # Remove any extra files generated during install
-  find . -type f |grep -v \.jar |grep -v \.pom | xargs rm
-
-  echo "Creating hash and signature files"
-  for file in $(find . -type f)
-  do
-    echo $GPG_PASSPHRASE | gpg --passphrase-fd 0 --output $file.asc --detach-sig --armour $file;
-    if [ $(command -v md5) ]; then
-      # Available on OS X; -q to keep only hash
-      md5 -q $file > $file.md5
-    else
-      # Available on Linux; cut to keep only hash
-      md5sum $file | cut -f1 -d' ' > $file.md5
-    fi
-    shasum -a 1 $file | cut -f1 -d' ' > $file.sha1
-  done
-
-  nexus_upload=$NEXUS_ROOT/deployByRepositoryId/$staged_repo_id
-  echo "Uploading files to $nexus_upload"
-  for file in $(find . -type f)
-  do
-    # strip leading ./
-    file_short=$(echo $file | sed -e "s/\.\///")
-    dest_url="$nexus_upload/org/apache/spark/$file_short"
-    echo "  Uploading $file_short"
-    curl -u $ASF_USERNAME:$ASF_PASSWORD --upload-file $file_short $dest_url
-  done
-
-  echo "Closing nexus staging repository"
-  repo_request="<promoteRequest><data><stagedRepositoryId>$staged_repo_id</stagedRepositoryId><description>Apache Spark $GIT_TAG (published as $PUBLISH_VERSION)</description></data></promoteRequest>"
-  out=$(curl -X POST -d "$repo_request" -u $ASF_USERNAME:$ASF_PASSWORD \
-    -H "Content-Type:application/xml" -v \
-    $NEXUS_ROOT/profiles/$NEXUS_PROFILE/finish)
-  echo "Closed Nexus staging repository: $staged_repo_id"
-
-  popd
-  popd
-  rm -rf spark
-fi
-
-if [[ ! "$@" =~ --skip-package ]]; then
-  # Source and binary tarballs
-  echo "Packaging release tarballs"
-  git clone https://git-wip-us.apache.org/repos/asf/spark.git
-  cd spark
-  git checkout --force $GIT_TAG
-  release_hash=`git rev-parse HEAD`
-
-  rm .gitignore
-  rm -rf .git
-  cd ..
-
-  cp -r spark spark-$RELEASE_VERSION
-  tar cvzf spark-$RELEASE_VERSION.tgz spark-$RELEASE_VERSION
-  echo $GPG_PASSPHRASE | gpg --passphrase-fd 0 --armour --output spark-$RELEASE_VERSION.tgz.asc \
-    --detach-sig spark-$RELEASE_VERSION.tgz
-  echo $GPG_PASSPHRASE | gpg --passphrase-fd 0 --print-md MD5 spark-$RELEASE_VERSION.tgz > \
-    spark-$RELEASE_VERSION.tgz.md5
-  echo $GPG_PASSPHRASE | gpg --passphrase-fd 0 --print-md SHA512 spark-$RELEASE_VERSION.tgz > \
-    spark-$RELEASE_VERSION.tgz.sha
-  rm -rf spark-$RELEASE_VERSION
-  
-  # Updated for each binary build
-  make_binary_release() {
-    NAME=$1
-    FLAGS=$2
-    ZINC_PORT=$3
-    cp -r spark spark-$RELEASE_VERSION-bin-$NAME
-    
-    cd spark-$RELEASE_VERSION-bin-$NAME
-
-    # TODO There should probably be a flag to make-distribution to allow 2.11 support
-    if [[ $FLAGS == *scala-2.11* ]]; then
-      ./dev/change-scala-version.sh 2.11
-    fi
-
-    export ZINC_PORT=$ZINC_PORT
-    echo "Creating distribution: $NAME ($FLAGS)"
-    ./make-distribution.sh --name $NAME --tgz $FLAGS -DzincPort=$ZINC_PORT 2>&1 > \
-      ../binary-release-$NAME.log
-    cd ..
-    cp spark-$RELEASE_VERSION-bin-$NAME/spark-$RELEASE_VERSION-bin-$NAME.tgz .
-
-    echo $GPG_PASSPHRASE | gpg --passphrase-fd 0 --armour \
-      --output spark-$RELEASE_VERSION-bin-$NAME.tgz.asc \
-      --detach-sig spark-$RELEASE_VERSION-bin-$NAME.tgz
-    echo $GPG_PASSPHRASE | gpg --passphrase-fd 0 --print-md \
-      MD5 spark-$RELEASE_VERSION-bin-$NAME.tgz > \
-      spark-$RELEASE_VERSION-bin-$NAME.tgz.md5
-    echo $GPG_PASSPHRASE | gpg --passphrase-fd 0 --print-md \
-      SHA512 spark-$RELEASE_VERSION-bin-$NAME.tgz > \
-      spark-$RELEASE_VERSION-bin-$NAME.tgz.sha
-  }
-
-  # We increment the Zinc port each time to avoid OOM's and other craziness if multiple builds
-  # share the same Zinc server.
-  make_binary_release "hadoop1" "-Psparkr -Phadoop-1 -Phive -Phive-thriftserver" "3030" &
-  make_binary_release "hadoop1-scala2.11" "-Psparkr -Phadoop-1 -Phive -Dscala-2.11" "3031" &
-  make_binary_release "cdh4" "-Psparkr -Phadoop-1 -Phive -Phive-thriftserver -Dhadoop.version=2.0.0-mr1-cdh4.2.0" "3032" &
-  make_binary_release "hadoop2.3" "-Psparkr -Phadoop-2.3 -Phive -Phive-thriftserver -Pyarn" "3033" &
-  make_binary_release "hadoop2.4" "-Psparkr -Phadoop-2.4 -Phive -Phive-thriftserver -Pyarn" "3034" &
-  make_binary_release "mapr3" "-Pmapr3 -Psparkr -Phive -Phive-thriftserver" "3035" &
-  make_binary_release "mapr4" "-Pmapr4 -Psparkr -Pyarn -Phive -Phive-thriftserver" "3036" &
-  make_binary_release "hadoop2.4-without-hive" "-Psparkr -Phadoop-2.4 -Pyarn" "3037" &
-  wait
-  rm -rf spark-$RELEASE_VERSION-bin-*/
-
-  # Copy data
-  echo "Copying release tarballs"
-  rc_folder=spark-$RELEASE_VERSION-$RC_NAME
-  ssh $ASF_USERNAME@people.apache.org \
-    mkdir /home/$ASF_USERNAME/public_html/$rc_folder
-  scp spark-* \
-    $ASF_USERNAME@people.apache.org:/home/$ASF_USERNAME/public_html/$rc_folder/
-
-  # Docs
-  cd spark
-  sbt/sbt clean
-  cd docs
-  # Compile docs with Java 7 to use nicer format
-  JAVA_HOME="$JAVA_7_HOME" PRODUCTION=1 RELEASE_VERSION="$RELEASE_VERSION" jekyll build
-  echo "Copying release documentation"
-  rc_docs_folder=${rc_folder}-docs
-  ssh $ASF_USERNAME@people.apache.org \
-    mkdir /home/$ASF_USERNAME/public_html/$rc_docs_folder
-  rsync -r _site/* $ASF_USERNAME@people.apache.org:/home/$ASF_USERNAME/public_html/$rc_docs_folder
-
-  echo "Release $RELEASE_VERSION completed:"
-  echo "Git tag:\t $GIT_TAG"
-  echo "Release commit:\t $release_hash"
-  echo "Binary location:\t http://people.apache.org/~$ASF_USERNAME/$rc_folder";
-  echo "Doc location:\t http://people.apache.org/~$ASF_USERNAME/$rc_docs_folder";
-fi
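
A side note on the version rewriting being deleted above: the new scripts 
drop the indentation-sensitive sed substitutions in favor of Maven's versions 
plugin, which is roughly equivalent to the following one-liner (the version 
value here is illustrative):

    # Set every module's version in one shot; versions:set walks nested
    # modules, unlike the sed pattern anchored on 2-4 spaces of indentation.
    build/mvn versions:set -DnewVersion=1.5.0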

http://git-wip-us.apache.org/repos/asf/spark/blob/6ea33f5b/dev/create-release/release-build.sh
----------------------------------------------------------------------
diff --git a/dev/create-release/release-build.sh b/dev/create-release/release-build.sh
new file mode 100755
index 0000000..399c73e
--- /dev/null
+++ b/dev/create-release/release-build.sh
@@ -0,0 +1,321 @@
+#!/usr/bin/env bash
+
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+
+function exit_with_usage {
+  cat << EOF
+usage: release-build.sh <package|docs|publish-snapshot|publish-release>
+Creates build deliverables from a Spark commit.
+
+Top level targets are
+  package: Create binary packages and copy them to people.apache.org
+  docs: Build docs and copy them to people.apache.org
+  publish-snapshot: Publish snapshot release to Apache snapshots
+  publish-release: Publish a release to Apache release repo
+
+All other inputs are environment variables
+
+GIT_REF - Release tag or commit to build from
+SPARK_VERSION - Release identifier used when publishing
+SPARK_PACKAGE_VERSION - Release identifier in top level package directory
+REMOTE_PARENT_DIR - Parent in which to create doc or release builds.
+REMOTE_PARENT_MAX_LENGTH - If set, parent directory will be cleaned to only
+ have this number of subdirectories (by deleting old ones). WARNING: This deletes data.
+
+ASF_USERNAME - Username of ASF committer account
+ASF_PASSWORD - Password of ASF committer account
+ASF_RSA_KEY - RSA private key file for ASF committer account
+
+GPG_KEY - GPG key used to sign release artifacts
+GPG_PASSPHRASE - Passphrase for GPG key
+EOF
+  exit 1
+}
+
+set -e
+
+if [ $# -eq 0 ]; then
+  exit_with_usage
+fi
+
+if [[ $@ == *"help"* ]]; then
+  exit_with_usage
+fi
+
+for env in ASF_USERNAME ASF_RSA_KEY GPG_PASSPHRASE GPG_KEY; do
+  if [ -z "${!env}" ]; then
+    echo "ERROR: $env must be set to run this script"
+    exit_with_usage
+  fi
+done
+
+# Commit ref to checkout when building
+GIT_REF=${GIT_REF:-master}
+
+# Destination directory parent on remote server
+REMOTE_PARENT_DIR=${REMOTE_PARENT_DIR:-/home/$ASF_USERNAME/public_html}
+
+SSH="ssh -o StrictHostKeyChecking=no -i $ASF_RSA_KEY"
+GPG="gpg --no-tty --batch"
+NEXUS_ROOT=https://repository.apache.org/service/local/staging
+NEXUS_PROFILE=d63f592e7eac0 # Profile for Spark staging uploads
+BASE_DIR=$(pwd)
+
+MVN="build/mvn --force"
+PUBLISH_PROFILES="-Pyarn -Phive -Phadoop-2.2"
+PUBLISH_PROFILES="$PUBLISH_PROFILES -Pspark-ganglia-lgpl -Pkinesis-asl"
+
+rm -rf spark
+git clone https://git-wip-us.apache.org/repos/asf/spark.git
+cd spark
+git checkout $GIT_REF
+git_hash=`git rev-parse --short HEAD`
+echo "Checked out Spark git hash $git_hash"
+
+if [ -z "$SPARK_VERSION" ]; then
+  SPARK_VERSION=$($MVN help:evaluate -Dexpression=project.version \
+    | grep -v INFO | grep -v WARNING | grep -v Download)
+fi
+
+if [ -z "$SPARK_PACKAGE_VERSION" ]; then
+  SPARK_PACKAGE_VERSION="${SPARK_VERSION}-$(date +%Y_%m_%d_%H_%M)-${git_hash}"
+fi
+
+DEST_DIR_NAME="spark-$SPARK_PACKAGE_VERSION"
+USER_HOST="$ASF_USERNAME@people.apache.org"
+
+rm .gitignore
+rm -rf .git
+cd ..
+
+if [ -n "$REMOTE_PARENT_MAX_LENGTH" ]; then
+  old_dirs=$($SSH $USER_HOST ls -t $REMOTE_PARENT_DIR | tail -n +$REMOTE_PARENT_MAX_LENGTH)
+  for old_dir in $old_dirs; do
+    echo "Removing directory: $old_dir"
+    $SSH $USER_HOST rm -r $REMOTE_PARENT_DIR/$old_dir
+  done
+fi
+
+if [[ "$1" == "package" ]]; then
+  # Source and binary tarballs
+  echo "Packaging release tarballs"
+  cp -r spark spark-$SPARK_VERSION
+  tar cvzf spark-$SPARK_VERSION.tgz spark-$SPARK_VERSION
+  echo $GPG_PASSPHRASE | $GPG --passphrase-fd 0 --armour --output spark-$SPARK_VERSION.tgz.asc \
+    --detach-sig spark-$SPARK_VERSION.tgz
+  echo $GPG_PASSPHRASE | $GPG --passphrase-fd 0 --print-md MD5 spark-$SPARK_VERSION.tgz > \
+    spark-$SPARK_VERSION.tgz.md5
+  echo $GPG_PASSPHRASE | $GPG --passphrase-fd 0 --print-md \
+    SHA512 spark-$SPARK_VERSION.tgz > spark-$SPARK_VERSION.tgz.sha
+  rm -rf spark-$SPARK_VERSION
+
+  # Updated for each binary build
+  make_binary_release() {
+    NAME=$1
+    FLAGS=$2
+    ZINC_PORT=$3
+    cp -r spark spark-$SPARK_VERSION-bin-$NAME
+
+    cd spark-$SPARK_VERSION-bin-$NAME
+
+    # TODO There should probably be a flag to make-distribution to allow 2.11 support
+    if [[ $FLAGS == *scala-2.11* ]]; then
+      ./dev/change-scala-version.sh 2.11
+    fi
+
+    export ZINC_PORT=$ZINC_PORT
+    echo "Creating distribution: $NAME ($FLAGS)"
+    ./make-distribution.sh --name $NAME --tgz $FLAGS -DzincPort=$ZINC_PORT 2>&1 > \
+      ../binary-release-$NAME.log
+    cd ..
+    cp spark-$SPARK_VERSION-bin-$NAME/spark-$SPARK_VERSION-bin-$NAME.tgz .
+
+    echo $GPG_PASSPHRASE | $GPG --passphrase-fd 0 --armour \
+      --output spark-$SPARK_VERSION-bin-$NAME.tgz.asc \
+      --detach-sig spark-$SPARK_VERSION-bin-$NAME.tgz
+    echo $GPG_PASSPHRASE | $GPG --passphrase-fd 0 --print-md \
+      MD5 spark-$SPARK_VERSION-bin-$NAME.tgz > \
+      spark-$SPARK_VERSION-bin-$NAME.tgz.md5
+    echo $GPG_PASSPHRASE | $GPG --passphrase-fd 0 --print-md \
+      SHA512 spark-$SPARK_VERSION-bin-$NAME.tgz > \
+      spark-$SPARK_VERSION-bin-$NAME.tgz.sha
+  }
+
+  # TODO: Check exit codes of children here:
+  # http://stackoverflow.com/questions/1570262/shell-get-exit-code-of-background-process
+
+  # We increment the Zinc port each time to avoid OOM's and other craziness if multiple builds
+  # share the same Zinc server.
+  make_binary_release "hadoop1" "-Psparkr -Phadoop-1 -Phive -Phive-thriftserver" "3030" &
+  make_binary_release "hadoop1-scala2.11" "-Psparkr -Phadoop-1 -Phive -Dscala-2.11" "3031" &
+  make_binary_release "cdh4" "-Psparkr -Phadoop-1 -Phive -Phive-thriftserver -Dhadoop.version=2.0.0-mr1-cdh4.2.0" "3032" &
+  make_binary_release "hadoop2.3" "-Psparkr -Phadoop-2.3 -Phive -Phive-thriftserver -Pyarn" "3033" &
+  make_binary_release "hadoop2.4" "-Psparkr -Phadoop-2.4 -Phive -Phive-thriftserver -Pyarn" "3034" &
+  make_binary_release "hadoop2.6" "-Psparkr -Phadoop-2.6 -Phive -Phive-thriftserver -Pyarn" "3035" &
+  make_binary_release "hadoop2.4-without-hive" "-Psparkr -Phadoop-2.4 -Pyarn" "3037" &
+  make_binary_release "without-hadoop" "-Psparkr -Phadoop-provided -Pyarn" "3038" &
+  wait
+  rm -rf spark-$SPARK_VERSION-bin-*/
+
+  # Copy data
+  dest_dir="$REMOTE_PARENT_DIR/${DEST_DIR_NAME}-bin"
+  echo "Copying release tarballs to $dest_dir"
+  $SSH $USER_HOST mkdir $dest_dir
+  rsync -e "$SSH" spark-* $USER_HOST:$dest_dir
+  echo "Linking /latest to $dest_dir"
+  $SSH $USER_HOST rm -f "$REMOTE_PARENT_DIR/latest"
+  $SSH $USER_HOST ln -s $dest_dir "$REMOTE_PARENT_DIR/latest"
+  exit 0
+fi
+
+if [[ "$1" == "docs" ]]; then
+  # Documentation
+  cd spark
+  echo "Building Spark docs"
+  dest_dir="$REMOTE_PARENT_DIR/${DEST_DIR_NAME}-docs"
+  cd docs
+  # Compile docs with Java 7 to use nicer format
+  # TODO: Make configurable to add this: PRODUCTION=1
+  PRODUCTION=1 RELEASE_VERSION="$SPARK_VERSION" jekyll build
+  echo "Copying release documentation to $dest_dir"
+  $SSH $USER_HOST mkdir $dest_dir
+  echo "Linking /latest to $dest_dir"
+  $SSH $USER_HOST rm -f "$REMOTE_PARENT_DIR/latest"
+  $SSH $USER_HOST ln -s $dest_dir "$REMOTE_PARENT_DIR/latest"
+  rsync -e "$SSH" -r _site/* $USER_HOST:$dest_dir
+  cd ..
+  exit 0
+fi
+
+if [[ "$1" == "publish-snapshot" ]]; then
+  cd spark
+  # Publish Spark to Maven release repo
+  echo "Deploying Spark SNAPSHOT at '$GIT_REF' ($git_hash)"
+  echo "Publish version is $SPARK_VERSION"
+  if [[ ! $SPARK_VERSION == *"SNAPSHOT"* ]]; then
+    echo "ERROR: Snapshots must have a version containing SNAPSHOT"
+    echo "ERROR: You gave version '$SPARK_VERSION'"
+    exit 1
+  fi
+  # Coerce the requested version
+  $MVN versions:set -DnewVersion=$SPARK_VERSION
+  tmp_settings="tmp-settings.xml"
+  echo "<settings><servers><server>" > $tmp_settings
+  echo "<id>apache.snapshots.https</id><username>$ASF_USERNAME</username>" >> $tmp_settings
+  echo "<password>$ASF_PASSWORD</password>" >> $tmp_settings
+  echo "</server></servers></settings>" >> $tmp_settings
+
+  # Generate random port for Zinc
+  export ZINC_PORT=$(python -S -c "import random; print random.randrange(3030,4030)")
+
+  $MVN -DzincPort=$ZINC_PORT --settings $tmp_settings -DskipTests $PUBLISH_PROFILES \
+    -Phive-thriftserver deploy
+  ./dev/change-scala-version.sh 2.11
+  $MVN -DzincPort=$ZINC_PORT -Dscala-2.11 --settings $tmp_settings \
+    -DskipTests $PUBLISH_PROFILES deploy
+
+  # Clean-up Zinc nailgun process
+  /usr/sbin/lsof -P |grep $ZINC_PORT | grep LISTEN | awk '{ print $2; }' | xargs kill
+
+  rm $tmp_settings
+  cd ..
+  exit 0
+fi
+
+if [[ "$1" == "publish-release" ]]; then
+  cd spark
+  # Publish Spark to Maven release repo
+  echo "Publishing Spark checkout at '$GIT_REF' ($git_hash)"
+  echo "Publish version is $SPARK_VERSION"
+  # Coerce the requested version
+  $MVN versions:set -DnewVersion=$SPARK_VERSION
+
+  # Using Nexus API documented here:
+  # https://support.sonatype.com/entries/39720203-Uploading-to-a-Staging-Repository-via-REST-API
+  echo "Creating Nexus staging repository"
+  repo_request="<promoteRequest><data><description>Apache Spark $SPARK_VERSION (commit $git_hash)</description></data></promoteRequest>"
+  out=$(curl -X POST -d "$repo_request" -u $ASF_USERNAME:$ASF_PASSWORD \
+    -H "Content-Type:application/xml" -v \
+    $NEXUS_ROOT/profiles/$NEXUS_PROFILE/start)
+  staged_repo_id=$(echo $out | sed -e "s/.*\(orgapachespark-[0-9]\{4\}\).*/\1/")
+  echo "Created Nexus staging repository: $staged_repo_id"
+
+  tmp_repo=$(mktemp -d spark-repo-XXXXX)
+
+  # Generate random port for Zinc
+  export ZINC_PORT=$(python -S -c "import random; print random.randrange(3030,4030)")
+
+  $MVN -DzincPort=$ZINC_PORT -Dmaven.repo.local=$tmp_repo -DskipTests $PUBLISH_PROFILES \
+    -Phive-thriftserver clean install
+
+  ./dev/change-scala-version.sh 2.11
+
+  $MVN -DzincPort=$ZINC_PORT -Dmaven.repo.local=$tmp_repo -Dscala-2.11 \
+    -DskipTests $PUBLISH_PROFILES clean install
+
+  # Clean-up Zinc nailgun process
+  /usr/sbin/lsof -P |grep $ZINC_PORT | grep LISTEN | awk '{ print $2; }' | xargs kill
+
+  ./dev/change-scala-version.sh 2.10
+
+  pushd $tmp_repo/org/apache/spark
+
+  # Remove any extra files generated during install
+  find . -type f |grep -v \.jar |grep -v \.pom | xargs rm
+
+  echo "Creating hash and signature files"
+  for file in $(find . -type f)
+  do
+    echo $GPG_PASSPHRASE | $GPG --passphrase-fd 0 --output $file.asc \
+      --detach-sig --armour $file;
+    if [ $(command -v md5) ]; then
+      # Available on OS X; -q to keep only hash
+      md5 -q $file > $file.md5
+    else
+      # Available on Linux; cut to keep only hash
+      md5sum $file | cut -f1 -d' ' > $file.md5
+    fi
+    sha1sum $file | cut -f1 -d' ' > $file.sha1
+  done
+
+  nexus_upload=$NEXUS_ROOT/deployByRepositoryId/$staged_repo_id
+  echo "Uploading files to $nexus_upload"
+  for file in $(find . -type f)
+  do
+    # strip leading ./
+    file_short=$(echo $file | sed -e "s/\.\///")
+    dest_url="$nexus_upload/org/apache/spark/$file_short"
+    echo "  Uploading $file_short"
+    curl -u $ASF_USERNAME:$ASF_PASSWORD --upload-file $file_short $dest_url
+  done
+
+  echo "Closing nexus staging repository"
+  repo_request="<promoteRequest><data><stagedRepositoryId>$staged_repo_id</stagedRepositoryId><description>Apache Spark $SPARK_VERSION (commit $git_hash)</description></data></promoteRequest>"
+  out=$(curl -X POST -d "$repo_request" -u $ASF_USERNAME:$ASF_PASSWORD \
+    -H "Content-Type:application/xml" -v \
+    $NEXUS_ROOT/profiles/$NEXUS_PROFILE/finish)
+  echo "Closed Nexus staging repository: $staged_repo_id"
+  popd
+  rm -rf $tmp_repo
+  cd ..
+  exit 0
+fi
+
+cd ..
+rm -rf spark
+echo "ERROR: expects to be called with 'package', 'docs', 'publish-release' or 'publish-snapshot'"
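
For reference, here is a hedged sketch of how release-build.sh might be 
driven from a Jenkins job. Every value is a placeholder; in practice the 
credentials come from the job's secure environment injection rather than 
being written out like this:

    # Hypothetical nightly-snapshot invocation; all values are placeholders.
    export GIT_REF=master                 # commit or tag to build
    export ASF_USERNAME=someuser
    export ASF_PASSWORD='...'             # injected from a credential store
    export ASF_RSA_KEY=/path/to/asf_id_rsa
    export GPG_KEY=ABCD1234               # ID of the signing key
    export GPG_PASSPHRASE='...'           # injected from a credential store
    export REMOTE_PARENT_DIR=/home/someuser/public_html/spark-nightly
    export REMOTE_PARENT_MAX_LENGTH=4     # rotate: keep only recent builds
    ./dev/create-release/release-build.sh publish-snapshot
    ./dev/create-release/release-build.sh package
    ./dev/create-release/release-build.sh docs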

http://git-wip-us.apache.org/repos/asf/spark/blob/6ea33f5b/dev/create-release/release-tag.sh
----------------------------------------------------------------------
diff --git a/dev/create-release/release-tag.sh b/dev/create-release/release-tag.sh
new file mode 100755
index 0000000..b0a3374
--- /dev/null
+++ b/dev/create-release/release-tag.sh
@@ -0,0 +1,79 @@
+#!/usr/bin/env bash
+
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+
+function exit_with_usage {
+  cat << EOF
+usage: release-tag.sh
+Tags a Spark release on a particular branch.
+
+Inputs are specified with the following environment variables:
+ASF_USERNAME - Apache Username
+ASF_PASSWORD - Apache Password
+GIT_NAME - Name to use with git
+GIT_EMAIL - E-mail address to use with git
+GIT_BRANCH - Git branch on which to make release
+RELEASE_VERSION - Version used in pom files for release
+RELEASE_TAG - Name of release tag
+NEXT_VERSION - Development version after release
+EOF
+  exit 1
+}
+
+set -e
+
+if [[ $@ == *"help"* ]]; then
+  exit_with_usage
+fi
+
+for env in ASF_USERNAME ASF_PASSWORD RELEASE_VERSION RELEASE_TAG NEXT_VERSION GIT_EMAIL GIT_NAME GIT_BRANCH; do
+  if [ -z "${!env}" ]; then
+    echo "$env must be set to run this script"
+    exit 1
+  fi
+done
+
+ASF_SPARK_REPO="git-wip-us.apache.org/repos/asf/spark.git"
+MVN="build/mvn --force"
+
+rm -rf spark
+git clone https://$ASF_USERNAME:$ASF_PASSWORD@$ASF_SPARK_REPO -b $GIT_BRANCH
+cd spark
+
+git config user.name "$GIT_NAME"
+git config user.email $GIT_EMAIL
+
+# Create release version
+$MVN versions:set -DnewVersion=$RELEASE_VERSION | grep -v "no value" # silence logs
+git commit -a -m "Preparing Spark release $RELEASE_TAG"
+echo "Creating tag $RELEASE_TAG at the head of $GIT_BRANCH"
+git tag $RELEASE_TAG
+
+# TODO: It would be nice to do some verifications here
+#       i.e. check whether ec2 scripts have the new version
+
+# Create next version
+$MVN versions:set -DnewVersion=$NEXT_VERSION | grep -v "no value" # silence logs
+git commit -a -m "Preparing development version $NEXT_VERSION"
+
+# Push changes
+git push origin $RELEASE_TAG
+git push origin HEAD:$GIT_BRANCH
+
+cd ..
+rm -rf spark
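
And a matching sketch for release-tag.sh, cutting a hypothetical RC tag (all 
values are placeholders; credentials again belong in the CI secret store):

    # Hypothetical invocation of release-tag.sh.
    export ASF_USERNAME=someuser
    export ASF_PASSWORD='...'             # injected, never hard-coded
    export GIT_NAME="Some User"
    export GIT_EMAIL=someuser@apache.org
    export GIT_BRANCH=branch-1.5
    export RELEASE_VERSION=1.5.0
    export RELEASE_TAG=v1.5.0-rc1
    export NEXT_VERSION=1.5.1-SNAPSHOT
    ./dev/create-release/release-tag.sh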

