Repository: spark-website
Updated Branches:
  refs/heads/asf-site 005a2a0d1 -> 44d255281


update the release-process document

Our release documentation is out-dated. This PR updates the release doc to 
match the process of the Spark 2.4.0 release.
1. Cutting an RC has been fully automated. People should always use the docker 
script instead of setting up the environment manually, so that we don't need to 
update the document every time the environment changes.
2. When finalizing the release, suggest that people retain the generated docs 
of the latest RC and copy them to spark-website, instead of re-generating them.

Author: Wenchen Fan <wenc...@databricks.com>

Closes #157 from cloud-fan/do-release.


Project: http://git-wip-us.apache.org/repos/asf/spark-website/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark-website/commit/44d25528
Tree: http://git-wip-us.apache.org/repos/asf/spark-website/tree/44d25528
Diff: http://git-wip-us.apache.org/repos/asf/spark-website/diff/44d25528

Branch: refs/heads/asf-site
Commit: 44d255281e2d02edfb59681761931e8e2675964b
Parents: 005a2a0
Author: Wenchen Fan <wenc...@databricks.com>
Authored: Wed Nov 7 10:17:40 2018 +0800
Committer: Wenchen Fan <wenc...@databricks.com>
Committed: Wed Nov 7 10:17:40 2018 +0800

----------------------------------------------------------------------
 release-process.md        | 89 ++++++++++++++++++++-------------------
 site/contributing.html    |  3 +-
 site/release-process.html | 94 +++++++++++++++++++++---------------------
 3 files changed, 96 insertions(+), 90 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark-website/blob/44d25528/release-process.md
----------------------------------------------------------------------
diff --git a/release-process.md b/release-process.md
index 7097c34..14c9c16 100644
--- a/release-process.md
+++ b/release-process.md
@@ -35,6 +35,34 @@ If you are a new Release Manager, you can read up on the 
process from the follow
 - gpg for signing https://www.apache.org/dev/openpgp.html
 - svn https://www.apache.org/dev/version-control.html#https-svn
 
+<h3>Preparing gpg key</h3>
+
+You can skip this section if you have already uploaded your key.
+
+After generating the gpg key, you need to upload your key to a public key server. Please refer to
+<a href="https://www.apache.org/dev/openpgp.html#generate-key">https://www.apache.org/dev/openpgp.html#generate-key</a>
+for details.
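+
+For example, a minimal sketch of publishing the key (the keyserver and the key ID
+`ABCD1234DEADBEEF` are placeholders; use whatever the ASF guide recommends):
+
+```
+# find the long ID of your code signing key
+gpg --list-secret-keys --keyid-format long
+# upload the public key to a keyserver
+gpg --keyserver keys.openpgp.org --send-keys ABCD1234DEADBEEF
+```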
+
+If you want to do the release on another machine, you can transfer your gpg key to that machine
+via the `gpg --export` and `gpg --import` commands.
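+
+A minimal sketch of such a transfer, assuming the same placeholder key ID:
+
+```
+# on the old machine: export the public and secret key to armored files
+gpg --export --armor ABCD1234DEADBEEF > signing-pub.asc
+gpg --export-secret-keys --armor ABCD1234DEADBEEF > signing-sec.asc
+# on the new machine: import both files
+gpg --import signing-pub.asc signing-sec.asc
+```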
+
+The last step is to update the KEYS file with your code signing key. See
+<a href="https://www.apache.org/dev/openpgp.html#export-public-key">https://www.apache.org/dev/openpgp.html#export-public-key</a>.
+
+```
+# Move dev/ to release/ when the voting is completed. See Finalize the Release below
+svn co --depth=files "https://dist.apache.org/repos/dist/dev/spark" svn-spark
+# edit svn-spark/KEYS file
+svn ci --username $ASF_USERNAME --password "$ASF_PASSWORD" -m"Update KEYS"
+```
+
+<h3>Installing docker</h3>
+
+The scripts to create release candidates are run through docker. You need to install docker before running
+these scripts. Please make sure that you can run docker as a non-root user. See
+<a href="https://docs.docker.com/install/linux/linux-postinstall">https://docs.docker.com/install/linux/linux-postinstall</a>
+for more details.
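+
+A quick way to verify the non-root setup, following docker's post-install guide
+(log out and back in after the group change):
+
+```
+# allow your user to talk to the docker daemon
+sudo usermod -aG docker $USER
+# this should now succeed without sudo
+docker run --rm hello-world
+```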
+
 <h2>Preparing Spark for Release</h2>
 
 The main step towards preparing a release is to create a release branch. This 
is done via
@@ -71,45 +99,15 @@ Also check that all build and test passes are green from 
the RISELab Jenkins: ht
 Note that not all permutations are run on PR therefore it is important to 
check Jenkins runs.
 
 
-The process of cutting a release candidate has been partially automated via 
the RISELab Jenkins. There are
-Jenkins jobs that can tag a release candidate and create various packages 
based on that candidate.
-At present the Jenkins jobs *SHOULD NOT BE USED* as they use a legacy shared 
key for signing.
+To cut a release candidate, there are four steps:
+1. Create a git tag for the release candidate.
+1. Package the release binaries & sources, and upload them to the Apache staging SVN repo.
+1. Create the release docs, and upload them to the Apache staging SVN repo.
+1. Publish a snapshot to the Apache staging Maven repo.
 
-
-Instead much of the same release logic can be accessed in 
`dev/create-release/release-tag.sh` and `dev/create-release/release-build.sh`. 
The general order of creating a release using the scripts is:
-
-- Verify Jenkins test pass on your desired commit
-- Set the shell environment variables used by the scripts (run with "help" for 
details)
-- Verify your JAVA_HOME is set to the correct Java version (2.2+ Java 8, 
pre-2.2 Java 7)
-- You may find Felix's docker env useful - 
https://github.com/felixcheung/spark-build/blob/master/Dockerfile .
-- Ensure you have the required dependencies to build the docs `docs/README.md`
-- R, for CRAN packaging tests, requires e1071 to be installed as part of the 
packaging tests.
-- In addition R uses LaTeX for some things, and requires some additional 
fonts. On Debian based systems you may wish to install 
`texlive-fonts-recommended` and `texlive-fonts-extra`.
-- Make sure you required Python packages for packaging (see 
`dev/requirements.txt`)
-- Ensure you have Python 3 having Sphinx installed, and `SPHINXPYTHON` 
environment variable is set to indicate your Python 3 executable (see 
SPARK-24530).
-- Tag the release candidate with `dev/create-release/release-tag.sh` (e.g. for 
creating 2.1.2 RC2 we did `ASF_USERNAME=holden ASF_PASSWORD=yoursecretgoeshere 
GIT_NAME="Holden Karau" GIT_BRANCH=branch-2.1 GIT_EMAIL="hol...@us.ibm.com" 
RELEASE_VERSION=2.1.2 RELEASE_TAG=v2.1.2-rc2 NEXT_VERSION=2.1.3-SNAPSHOT 
./dev/create-release/release-tag.sh`)
-- Package the release binaries & sources with 
`dev/create-release/release-build.sh package`
-- Create the release docs with `dev/create-release/release-build.sh docs`
-- For Spark versions prior to 2.1.2, change the SPARK_VERSION from X.Y.Z to 
X.Y.Z-rcA then run `dev/create-release/release-build.sh publish-release`.
-- Publish a snapshot to the Apache release repo 
`dev/create-release/release-build.sh publish-release`
-- If you are a new Release Manager, update the KEYS file with your code 
signing key https://www.apache.org/dev/openpgp.html#export-public-key
-
-```
-# Move dev/ to release/ when the voting is completed. See Finalize the Release below
-svn co --depth=files "https://dist.apache.org/repos/dist/dev/spark" svn-spark
-# edit svn-spark/KEYS file
-svn ci --username $ASF_USERNAME --password "$ASF_PASSWORD" -m"Update KEYS"
-```
-
-If the Jenkins jobs have been updated to support signing with your key you can 
look at the job required for a release are located in the [Spark Release 
Jobs](https://amplab.cs.berkeley.edu/jenkins/view/Spark%20Release/) collection.
-If you don't have access, talk to a previous release manager for guidance and 
to get access.
-The jobs can be launched with "Build with Parameters" and the general order is:
-
-- Create a tag for the current RC with 
[spark-release-tag](https://amplab.cs.berkeley.edu/jenkins/view/Spark%20Release/job/spark-release-tag/)
 job.
-- Kick off the rest of the jobs except spark-release-publish after the current 
RC has been configured.
-- Once the packaging and doc jobs have finished kick off the 
[spark-release-publish](https://amplab.cs.berkeley.edu/jenkins/view/Spark%20Release/job/spark-release-publish)
 job.
-
-The jobs are configured through build parameters. If the build parameters are 
unclear you can look at previous releases or if available, the recommended 
process is to ask the previous release manager to walk you through the Jenkins 
jobs as this document may not be 100% up to date.
+The process of cutting a release candidate has been automated via the `dev/create-release/do-release-docker.sh` script.
+Run this script, enter the information it asks for, and wait until it finishes. You can also run a single step via the `-s` option.
+Please run `do-release-docker.sh -h` for more details.
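+
+For example, a dry run of just the tagging step (the `-d` and `-n` flags match the
+script's help at the time of writing; the working directory path is a placeholder):
+
+```
+# dry-run only the "tag" step, using /home/you/spark-rc as the working directory
+./dev/create-release/do-release-docker.sh -d /home/you/spark-rc -n -s tag
+```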
 
 <h3>Call a Vote on the Release Candidate</h3>
 
@@ -195,7 +193,7 @@ Make sure to also remove the unpublished staging 
repositories from the
 
 <h4>Remove Old Releases from Mirror Network</h4>
 
-Spark always keeps the latest maintance released of each branch in the mirror network.
+Spark always keeps the latest maintenance release of each branch in the mirror network.
 To delete older versions simply use svn rm:
 
 ```
@@ -223,7 +221,9 @@ $ git push apache v1.1.1
 
 The website repository is located at
 <a href="https://github.com/apache/spark-website">https://github.com/apache/spark-website</a>.
-Ensure the docs were generated with the PRODUCTION=1 environment variable.
+
+It's recommended to keep the generated docs of the latest RC, so that you can copy
+them to spark-website directly; otherwise you will need to re-build the docs.
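+
+For example, assuming the docs generated for the final RC were kept in an `_site`
+directory (the paths here are illustrative):
+
+```
+# copy the docs built for the final RC into the website checkout
+cp -r /path/to/rc-docs/_site spark-website/site/docs/2.4.0
+```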
 
 ```
 # Build the latest docs
@@ -283,11 +283,14 @@ which fetches potential replacements from Github and 
JIRA. For instance:
 ```
 $ cd release-spark/dev/create-release
 # Set RELEASE_TAG and PREVIOUS_RELEASE_TAG
-$ vim generate-contributors.py
+$ export RELEASE_TAG=v1.1.1
+$ export PREVIOUS_RELEASE_TAG=v1.1.0
 # Generate initial contributors list, likely with warnings
 $ ./generate-contributors.py
-# Set JIRA_USERNAME, JIRA_PASSWORD, and GITHUB_API_TOKEN
-$ vim release-spark/dev/translate-contributors.py
+# Set JIRA_USERNAME, JIRA_PASSWORD, and GITHUB_API_TOKEN
+$ export JIRA_USERNAME=blabla
+$ export JIRA_PASSWORD=blabla
+$ export GITHUB_API_TOKEN=blabla
 # Translate names generated in the previous step, reading from 
known_translations if necessary
 $ ./translate-contributors.py
 ```

http://git-wip-us.apache.org/repos/asf/spark-website/blob/44d25528/site/contributing.html
----------------------------------------------------------------------
diff --git a/site/contributing.html b/site/contributing.html
index 59a26f6..c0d0a7b 100644
--- a/site/contributing.html
+++ b/site/contributing.html
@@ -282,7 +282,8 @@ first. Unreproducible bugs, or simple error reports, may be 
closed.</p>
 
 <p>It&#8217;s very helpful if the bug report has a description about how the 
bug was introduced, by 
 which commit, so that reviewers can easily understand the bug. It also helps 
committers to 
-decide how far the bug fix should be backported, when the pull request is merged.</p>
+decide how far the bug fix should be backported when the pull request is merged. The pull
+request to fix the bug should narrow down the problem to the root cause.</p>
 
 <p>Performance regression is also one kind of bug. The pull request to fix a 
performance regression 
 must provide a benchmark to prove the problem is indeed fixed.</p>

http://git-wip-us.apache.org/repos/asf/spark-website/blob/44d25528/site/release-process.html
----------------------------------------------------------------------
diff --git a/site/release-process.html b/site/release-process.html
index daac07b..d84a240 100644
--- a/site/release-process.html
+++ b/site/release-process.html
@@ -241,6 +241,33 @@
   <li>svn https://www.apache.org/dev/version-control.html#https-svn</li>
 </ul>
 
+<h3>Preparing gpg key</h3>
+
+<p>You can skip this section if you have already uploaded your key.</p>
+
+<p>After generating the gpg key, you need to upload your key to a public key server. Please refer to
+<a href="https://www.apache.org/dev/openpgp.html#generate-key">https://www.apache.org/dev/openpgp.html#generate-key</a>
+for details.</p>
+
+<p>If you want to do the release on another machine, you can transfer your gpg key to that machine
+via the <code>gpg --export</code> and <code>gpg --import</code> commands.</p>
+
+<p>The last step is to update the KEYS file with your code signing key. See
+<a href="https://www.apache.org/dev/openpgp.html#export-public-key">https://www.apache.org/dev/openpgp.html#export-public-key</a>.</p>
+
+<pre><code># Move dev/ to release/ when the voting is completed. See Finalize the Release below
+svn co --depth=files "https://dist.apache.org/repos/dist/dev/spark" svn-spark
+# edit svn-spark/KEYS file
+svn ci --username $ASF_USERNAME --password "$ASF_PASSWORD" -m"Update KEYS"
+</code></pre>
+
+<h3>Installing docker</h3>
+
+<p>The scripts to create release candidates are run through docker. You need to install docker before running
+these scripts. Please make sure that you can run docker as a non-root user. See
+<a href="https://docs.docker.com/install/linux/linux-postinstall">https://docs.docker.com/install/linux/linux-postinstall</a>
+for more details.</p>
+
 <h2>Preparing Spark for Release</h2>
 
 <p>The main step towards preparing a release is to create a release branch. 
This is done via
@@ -272,47 +299,17 @@ changes or in the release news on the website later.</p>
 <p>Also check that all build and test passes are green from the RISELab 
Jenkins: https://amplab.cs.berkeley.edu/jenkins/ particularly look for Spark 
Packaging, QA Compile, QA Test.
 Note that not all permutations are run on PR therefore it is important to 
check Jenkins runs.</p>
 
-<p>The process of cutting a release candidate has been partially automated via 
the RISELab Jenkins. There are
-Jenkins jobs that can tag a release candidate and create various packages 
based on that candidate.
-At present the Jenkins jobs <em>SHOULD NOT BE USED</em> as they use a legacy 
shared key for signing.</p>
-
-<p>Instead much of the same release logic can be accessed in 
<code>dev/create-release/release-tag.sh</code> and 
<code>dev/create-release/release-build.sh</code>. The general order of creating 
a release using the scripts is:</p>
-
-<ul>
-  <li>Verify Jenkins test pass on your desired commit</li>
-  <li>Set the shell environment variables used by the scripts (run with 
&#8220;help&#8221; for details)</li>
-  <li>Verify your JAVA_HOME is set to the correct Java version (2.2+ Java 8, 
pre-2.2 Java 7)</li>
-  <li>You may find Felix&#8217;s docker env useful - 
https://github.com/felixcheung/spark-build/blob/master/Dockerfile .</li>
-  <li>Ensure you have the required dependencies to build the docs 
<code>docs/README.md</code></li>
-  <li>R, for CRAN packaging tests, requires e1071 to be installed as part of 
the packaging tests.</li>
-  <li>In addition R uses LaTeX for some things, and requires some additional 
fonts. On Debian based systems you may wish to install 
<code>texlive-fonts-recommended</code> and 
<code>texlive-fonts-extra</code>.</li>
-  <li>Make sure you required Python packages for packaging (see 
<code>dev/requirements.txt</code>)</li>
-  <li>Ensure you have Python 3 having Sphinx installed, and 
<code>SPHINXPYTHON</code> environment variable is set to indicate your Python 3 
executable (see SPARK-24530).</li>
-  <li>Tag the release candidate with 
<code>dev/create-release/release-tag.sh</code> (e.g. for creating 2.1.2 RC2 we 
did <code>ASF_USERNAME=holden ASF_PASSWORD=yoursecretgoeshere GIT_NAME="Holden 
Karau" GIT_BRANCH=branch-2.1 GIT_EMAIL="hol...@us.ibm.com" 
RELEASE_VERSION=2.1.2 RELEASE_TAG=v2.1.2-rc2 NEXT_VERSION=2.1.3-SNAPSHOT 
./dev/create-release/release-tag.sh</code>)</li>
-  <li>Package the release binaries &amp; sources with 
<code>dev/create-release/release-build.sh package</code></li>
-  <li>Create the release docs with <code>dev/create-release/release-build.sh 
docs</code></li>
-  <li>For Spark versions prior to 2.1.2, change the SPARK_VERSION from X.Y.Z 
to X.Y.Z-rcA then run <code>dev/create-release/release-build.sh 
publish-release</code>.</li>
-  <li>Publish a snapshot to the Apache release repo 
<code>dev/create-release/release-build.sh publish-release</code></li>
-  <li>If you are a new Release Manager, update the KEYS file with your code 
signing key https://www.apache.org/dev/openpgp.html#export-public-key</li>
-</ul>
-
-<pre><code># Move dev/ to release/ when the voting is completed. See Finalize the Release below
-svn co --depth=files "https://dist.apache.org/repos/dist/dev/spark" svn-spark
-# edit svn-spark/KEYS file
-svn ci --username $ASF_USERNAME --password "$ASF_PASSWORD" -m"Update KEYS"
-</code></pre>
-
-<p>If the Jenkins jobs have been updated to support signing with your key you can look at the job required for a release are located in the <a href="https://amplab.cs.berkeley.edu/jenkins/view/Spark%20Release/">Spark Release Jobs</a> collection.
-If you don&#8217;t have access, talk to a previous release manager for guidance and to get access.
-The jobs can be launched with &#8220;Build with Parameters&#8221; and the general order is:</p>
-
-<ul>
-  <li>Create a tag for the current RC with <a href="https://amplab.cs.berkeley.edu/jenkins/view/Spark%20Release/job/spark-release-tag/">spark-release-tag</a> job.</li>
-  <li>Kick off the rest of the jobs except spark-release-publish after the current RC has been configured.</li>
-  <li>Once the packaging and doc jobs have finished kick off the <a href="https://amplab.cs.berkeley.edu/jenkins/view/Spark%20Release/job/spark-release-publish">spark-release-publish</a> job.</li>
-</ul>
+<p>To cut a release candidate, there are four steps:</p>
+<ol>
+  <li>Create a git tag for the release candidate.</li>
+  <li>Package the release binaries &amp; sources, and upload them to the Apache staging SVN repo.</li>
+  <li>Create the release docs, and upload them to the Apache staging SVN repo.</li>
+  <li>Publish a snapshot to the Apache staging Maven repo.</li>
+</ol>
 
-<p>The jobs are configured through build parameters. If the build parameters 
are unclear you can look at previous releases or if available, the recommended 
process is to ask the previous release manager to walk you through the Jenkins 
jobs as this document may not be 100% up to date.</p>
+<p>The process of cutting a release candidate has been automated via the <code>dev/create-release/do-release-docker.sh</code> script.
+Run this script, enter the information it asks for, and wait until it finishes. You can also run a single step via the <code>-s</code> option.
+Please run <code>do-release-docker.sh -h</code> for more details.</p>
 
 <h3>Call a Vote on the Release Candidate</h3>
 
@@ -392,7 +389,7 @@ the RC directories from the staging repository. For 
example:</p>
 
 <h4>Remove Old Releases from Mirror Network</h4>
 
-<p>Spark always keeps the latest maintance released of each branch in the mirror network.
+<p>Spark always keeps the latest maintenance release of each branch in the mirror network.
 To delete older versions simply use svn rm:</p>
 
 <pre><code>$ svn rm 
https://dist.apache.org/repos/dist/release/spark/spark-1.1.0
@@ -416,8 +413,10 @@ $ git push apache v1.1.1
 <h4>Update the Spark Website</h4>
 
 <p>The website repository is located at
-<a href="https://github.com/apache/spark-website">https://github.com/apache/spark-website</a>.
-Ensure the docs were generated with the PRODUCTION=1 environment variable.</p>
+<a href="https://github.com/apache/spark-website">https://github.com/apache/spark-website</a>.</p>
+
+<p>It&#8217;s recommended to keep the generated docs of the latest RC, so that you can copy
+them to spark-website directly; otherwise you will need to re-build the docs.</p>
 
 <pre><code># Build the latest docs
 $ git checkout v1.1.1
@@ -476,11 +475,14 @@ which fetches potential replacements from Github and 
JIRA. For instance:</p>
 
 <pre><code>$ cd release-spark/dev/create-release
 # Set RELEASE_TAG and PREVIOUS_RELEASE_TAG
-$ vim generate-contributors.py
+$ export RELEASE_TAG=v1.1.1
+$ export PREVIOUS_RELEASE_TAG=v1.1.0
 # Generate initial contributors list, likely with warnings
 $ ./generate-contributors.py
-# Set JIRA_USERNAME, JIRA_PASSWORD, and GITHUB_API_TOKEN
-$ vim release-spark/dev/translate-contributors.py
+# Set JIRA_USERNAME, JIRA_PASSWORD, and GITHUB_API_TOKEN
+$ export JIRA_USERNAME=blabla
+$ export JIRA_PASSWORD=blabla
+$ export GITHUB_API_TOKEN=blabla
 # Translate names generated in the previous step, reading from 
known_translations if necessary
 $ ./translate-contributors.py
 </code></pre>

