This is an automated email from the ASF dual-hosted git repository.

yao pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/spark-website.git
The following commit(s) were added to refs/heads/asf-site by this push:
     new 9e556e4f4a  Improve the navigational capability for release manager's guide (#535)
9e556e4f4a is described below

commit 9e556e4f4abd29a309a5382427342d304b2c6206
Author: Kent Yao <y...@apache.org>
AuthorDate: Fri Jul 19 16:00:53 2024 +0800

    Improve the navigational capability for release manager's guide (#535)

    * Improve the navigational capability for release manager's guide

    * add title
---
 release-process.md        | 112 +++++++++++++++++++++++++--------------
 site/release-process.html | 131 ++++++++++++++++++++++++++++++----------------
 2 files changed, 159 insertions(+), 84 deletions(-)

diff --git a/release-process.md b/release-process.md
index 3b522e84f6..57fed2fde7 100644
--- a/release-process.md
+++ b/release-process.md
@@ -7,27 +7,41 @@ navigation:
   show: true
 ---
 
-<h2>Preparing Spark releases</h2>
-
-<h3>Background</h3>
+<h1>Preparing Spark releases</h1>
 
 The release manager role in Spark means you are responsible for a few different things:
 
-1. Preparing your setup
-1. Preparing for release candidates:
-    1. cutting a release branch
-    1. informing the community of timing
-    1. working with component leads to clean up JIRA
-    1. making code changes in that branch with necessary version updates
-1. Running the voting process for a release:
-    1. creating release candidates using automated tooling
-    1. calling votes and triaging issues
-1. Finalizing and posting a release:
-    1. updating the Spark website
-    1. writing release notes
-    1. announcing the release
-
-<h2>Preparing your setup</h2>
+- [Preparing your setup](#preparing-your-setup)
+  - [Preparing gpg key](#preparing-gpg-key)
+    - [Generate key](#generate-key)
+    - [Upload key](#upload-key)
+    - [Update KEYS file with your code signing key](#update-keys-file-with-your-code-signing-key)
+  - [Installing Docker](#installing-docker)
+- [Preparing for release candidates](#preparing-for-release-candidates)
+  - [Cutting a release candidate](#cutting-a-release-candidate)
+  - Informing the community of timing
+  - Working with component leads to clean up JIRA
+  - Making code changes in that branch with necessary version updates
+- Running the voting process for a release:
+  - [Creating release candidates using automated tooling](#creating-release-candidates-using-automated-tooling)
+  - [Triaging issues](https://s.apache.org/spark-jira-versions)
+  - [Call a vote on the release candidate](#call-a-vote-on-the-release-candidate)
+- [Finalizing and posting a release](#finalize-the-release)
+  - [Upload to Apache release directory](#upload-to-apache-release-directory)
+  - [Upload to PyPI](#upload-to-pypi)
+  - [Publish to CRAN](#publish-to-cran)
+  - [Remove RC artifacts from repositories](#remove-rc-artifacts-from-repositories)
+  - [Remove old releases from Mirror Network](#remove-old-releases-from-mirror-network)
+  - [Update the Apache Spark<sup>TM</sup> repository](#update-the-apache-spark-repository)
+  - [Update the configuration of Algolia Crawler](#update-the-configuration-of-algolia-crawler)
+  - [Update the Spark website](#update-the-spark-website)
+    - [Upload generated docs](#upload-generated-docs)
+    - [Update the rest of the Spark website](#update-the-rest-of-the-spark-website)
+  - [Create and upload Spark Docker Images](#create-and-upload-spark-docker-images)
+  - [Create an announcement](#create-an-announcement)
+
+<h2 id="preparing-your-setup">Preparing your setup</h2>
+
 
 If you are a new Release Manager, you can read up on the process from the followings:
 
@@ -35,11 +49,11 @@ If you are a new Release Manager, you can read up on the process from the follow
 
 - gpg for signing [https://www.apache.org/dev/openpgp.html](https://www.apache.org/dev/openpgp.html)
 - svn [https://infra.apache.org/version-control.html#svn](https://infra.apache.org/version-control.html#svn)
 
-<h3>Preparing gpg key</h3>
+<h3 id="preparing-gpg-key">Preparing gpg key</h3>
 
 You can skip this section if you have already uploaded your key.
 
-<h4>Generate key</h4>
+<h4 id="generate-key">Generate key</h4>
 
 Here's an example of gpg 2.0.12. If you use gpg version 1 series, please refer to <a href="https://www.apache.org/dev/openpgp.html#generate-key">generate-key</a> for details.
@@ -97,7 +111,7 @@ sub rsa4096 2021-08-19 [E]
 
 Note that the last 8 digits (26A27D33) of the public key is the <a href="https://infra.apache.org/release-signing.html#key-id">key ID</a>.
 
-<h4>Upload key</h4>
+<h4 id="upload-key">Upload key</h4>
 
 After generating the public key, we should upload it to <a href="https://infra.apache.org/release-signing.html#keyserver">public key server</a>:
@@ -107,7 +121,7 @@ $ gpg --keyserver hkps://keys.openpgp.org --send-key 26A27D33
 
 Please refer to <a href="https://infra.apache.org/release-signing.html#keyserver-upload">keyserver-upload</a> for details.
 
-<h4>Update KEYS file with your code signing key</h4>
+<h4 id="update-keys-file-with-your-code-signing-key">Update KEYS file with your code signing key</h4>
 
 To get the code signing key (a.k.a ASCII-armored public key), run the command:
@@ -127,21 +141,24 @@ svn ci --username $ASF_USERNAME --password "$ASF_PASSWORD" -m"Update KEYS"
 
 If you want to do the release on another machine, you can transfer your secret key to that machine via the `gpg --export-secret-keys` and `gpg --import` commands.
 
-<h3>Installing Docker</h3>
+<p align="right"><a href="#top">Return to top</a></p>
+<h3 id="installing-docker">Installing Docker</h3>
 
 The scripts to create release candidates are run through docker. You need to install docker before running these scripts. Please make sure that you can run docker as non-root users. See <a href="https://docs.docker.com/install/linux/linux-postinstall/">https://docs.docker.com/install/linux/linux-postinstall</a> for more details.
 
-<h2>Preparing Spark for release</h2>
+<h2 id="preparing-for-release-candidates">Preparing for release candidates</h2>
+
 
 The main step towards preparing a release is to create a release branch. This is done via standard Git branching mechanism and should be announced to the community once the branch is created.
 
-<h3>Cutting a release candidate</h3>
+<p align="right"><a href="#top">Return to top</a></p>
+<h3 id="cutting-a-release-candidate">Cutting a release candidate</h3>
 
 If this is not the first RC, then make sure that the JIRA issues that have been solved since the last RC are marked as `Resolved` and has a `Target Versions` set to this release version.
@@ -160,6 +177,9 @@ Verify from `git log` whether they are actually making it in the new RC or not.
 with `release-notes` label, and make sure they are documented in relevant migration guide for breaking changes or in the release news on the website later.
 
+<p align="right"><a href="#top">Return to top</a></p>
+<h3 id="creating-release-candidates-using-automated-tooling">Creating release candidates using automated tooling</h3>
+
 To cut a release candidate, there are 4 steps:
 1. Create a git tag for the release candidate.
 1. Package the release binaries & sources, and upload them to the Apache staging SVN repo.
@@ -170,7 +190,8 @@ The process of cutting a release candidate has been automated via the `dev/creat
 Run this script, type information it requires, and wait until it finishes. You can also do a single step via the `-s` option. Please run `do-release-docker.sh -h` and see more details.
 
-<h3>Call a vote on the release candidate</h3>
+<p align="right"><a href="#top">Return to top</a></p>
+<h3 id="call-a-vote-on-the-release-candidate">Call a vote on the release candidate</h3>
 
 The release voting takes place on the Apache Spark developers list (the PMC is voting).
 Look at past voting threads to see how this proceeds. The email should follow
@@ -184,7 +205,7 @@ Look at past voting threads to see how this proceeds. The email should follow
 
 Once the vote is done, you should also send out a summary email with the totals, with a subject that looks something like `[VOTE][RESULT] ...`.
 
-<h3>Finalize the release</h3>
+<h2 id="finalize-the-release">Finalize the release</h2>
 
 Note that `dev/create-release/do-release-docker.sh` script (`finalize` step ) automates most of the following steps **except** for:
 - Publish to CRAN
@@ -196,7 +217,8 @@ Note that `dev/create-release/do-release-docker.sh` script (`finalize` step ) au
 
 Please manually verify the result after each step.
 
-<h4>Upload to Apache release directory</h4>
+<p align="right"><a href="#top">Return to top</a></p>
+<h3 id="upload-to-apache-release-directory">Upload to Apache release directory</h3>
 
 **Be Careful!**
 
@@ -229,7 +251,8 @@ and the same under [https://repository.apache.org/content/groups/maven-staging-g
 (look for the correct release version).
 After some time this will be sync'd to <a href="https://search.maven.org/">Maven Central</a> automatically.
 
-<h4>Upload to PyPI</h4>
+<p align="right"><a href="#top">Return to top</a></p>
+<h3 id="upload-to-pypi">Upload to PyPI</h3>
 
 You'll need your own PyPI account. If you do not have a PyPI account that has access to the `pyspark` and `pyspark-connect` projects on PyPI, please ask the <a href="mailto:priv...@spark.apache.org">PMC</a> to grant permission for both.
@@ -244,13 +267,15 @@ is incorrect (e.g. http failure or other issue), you can rename the artifact to
 `pyspark-version.post0.tar.gz`, delete the old artifact from PyPI and re-upload.
 
-<h4>Publish to CRAN</h4>
+<p align="right"><a href="#top">Return to top</a></p>
+<h3 id="publish-to-cran">Publish to CRAN</h3>
 
 Publishing to CRAN is done using <a href="https://cran.r-project.org/submit.html">this form</a>. Since it requires further manual steps, please also contact the <a href="mailto:priv...@spark.apache.org">PMC</a>.
 
-<h4>Remove RC artifacts from repositories</h4>
+<p align="right"><a href="#top">Return to top</a></p>
+<h3 id="remove-rc-artifacts-from-repositories">Remove RC artifacts from repositories</h3>
 
 **NOTE! If you did not make a backup of docs for approved RC, this is the last time you can make a backup. This will be used to upload the docs to the website in next few step. Check out docs from svn before removing the directory.**
 
@@ -267,7 +292,8 @@ Make sure to also remove the unpublished staging repositories from the
 <a href="https://repository.apache.org/">Apache Nexus Repository Manager</a>.
 
-<h4>Remove old releases from Mirror Network</h4>
+<p align="right"><a href="#top">Return to top</a></p>
+<h3 id="remove-old-releases-from-mirror-network">Remove old releases from Mirror Network</h3>
 
 Spark always keeps the latest maintenance released of each branch in the mirror network.
 To delete older versions simply use svn rm:
@@ -280,7 +306,8 @@ You will also need to update `js/download.js` to indicate the release is not mir
 anymore, so that the correct links are generated on the site.
 
-<h4>Update the Spark Apache<span class="tm">™</span> repository</h4>
+<p align="right"><a href="#top">Return to top</a></p>
+<h3 id="update-the-apache-spark-repository">Update the Spark Apache<span class="tm">™</span> repository</h3>
 
 Check out the tagged commit for the release candidate that passed and apply the correct version tag.
 
@@ -289,12 +316,15 @@ $ git tag v1.1.1 v1.1.1-rc2 # the RC that passed
 $ git push apache v1.1.1
 ```
 
-<h4>Update the configuration of Algolia Crawler</h4>
+<p align="right"><a href="#top">Return to top</a></p>
+<h3 id="update-the-configuration-of-algolia-crawler">Update the configuration of Algolia Crawler</h3>
+
 The search box on the <a href="https://spark.apache.org/docs/latest/">Spark documentation website</a> leverages the <a href="https://www.algolia.com/products/search-and-discovery/crawler/">Algolia Crawler</a>.
 Before a release, please update the crawler configuration for Apache Spark with the new version on the <a href="https://crawler.algolia.com/">Algolia Crawler Admin Console</a>. If you don't have access to the configuration, contact <a href="mailto:gengli...@apache.org">Gengliang Wa [...]
 
-<h4>Update the Spark website</h4>
+<p align="right"><a href="#top">Return to top</a></p>
+<h3 id="update-the-spark-website">Update the Spark website</h3>
 
-<h5>Upload generated docs</h5>
+<h4 id="upload-generated-docs">Upload generated docs</h4>
 
 The website repository is located at <a href="https://github.com/apache/spark-website">https://github.com/apache/spark-website</a>.
@@ -319,7 +349,7 @@ $ rm latest
 $ ln -s 1.1.1 latest
 ```
 
-<h5>Update the rest of the Spark website</h5>
+<h4 id="update-the-rest-of-the-spark-website">Update the rest of the Spark website</h4>
 
 Next, update the rest of the Spark website. See how the previous releases are documented (all the HTML file changes are generated by `jekyll`). In particular:
@@ -400,7 +430,8 @@ $ git shortlog v1.1.1 --grep "$EXPR" > contrib.txt
 $ git log v1.1.1 --grep "$expr" --shortstat --oneline | grep -B 1 -e "[3-9][0-9][0-9] insert" -e "[1-9][1-9][1-9][1-9] insert" | grep SPARK > large-patches.txt
 ```
 
-<h4>Create and upload Spark Docker Images</h4>
+<p align="right"><a href="#top">Return to top</a></p>
+<h3 id="create-and-upload-spark-docker-images">Create and upload Spark Docker Images</h3>
 
 The apache/spark-docker provides dockerfiles and Github Action for Spark Docker images publish.
 1. Upload Spark Dockerfiles to apache/spark-docker repository, please refer to [link](https://github.com/apache/spark-docker/pull/33).
@@ -410,10 +441,13 @@ The apache/spark-docker provides dockerfiles and Github Action for Spark Docker
 3. Select "The Spark version of Spark image", click "Publish the image or not", select "apache" as target registry.
 4. Click "Run workflow" button to publish the image to Apache dockerhub.
 
-<h4>Create an announcement</h4>
+<p align="right"><a href="#top">Return to top</a></p>
+<h3 id="create-an-announcement">Create an announcement</h3>
 
 Once everything is working (website docs, website changes) create an announcement on the website and then send an e-mail to the mailing list with a subject that looks something like `[ANNOUNCE] ...`. To create an announcement, create a post under `news/_posts` and then run `bundle exec jekyll build`.
 Enjoy an adult beverage of your choice, and congratulations on making a Spark release.
+
+<p align="right"><a href="#top">Return to top</a></p>
diff --git a/site/release-process.html b/site/release-process.html
index d316da36b7..9b1ba6bc5a 100644
--- a/site/release-process.html
+++ b/site/release-process.html
@@ -141,38 +141,60 @@
       <div class="container">
         <div class="row mt-4">
           <div class="col-12 col-md-9">
-          <h2>Preparing Spark releases</h2>
-
-<h3>Background</h3>
+          <h1>Preparing Spark releases</h1>
 
 <p>The release manager role in Spark means you are responsible for a few different things:</p>
 
-<ol>
-  <li>Preparing your setup</li>
-  <li>Preparing for release candidates:
-    <ol>
-      <li>cutting a release branch</li>
-      <li>informing the community of timing</li>
-      <li>working with component leads to clean up JIRA</li>
-      <li>making code changes in that branch with necessary version updates</li>
-    </ol>
+<ul>
+  <li><a href="#preparing-your-setup">Preparing your setup</a>
+    <ul>
+      <li><a href="#preparing-gpg-key">Preparing gpg key</a>
+        <ul>
+          <li><a href="#generate-key">Generate key</a></li>
+          <li><a href="#upload-key">Upload key</a></li>
+          <li><a href="#update-keys-file-with-your-code-signing-key">Update KEYS file with your code signing key</a></li>
+        </ul>
+      </li>
+      <li><a href="#installing-docker">Installing Docker</a></li>
+    </ul>
+  </li>
+  <li><a href="#preparing-for-release-candidates">Preparing for release candidates</a>
+    <ul>
+      <li><a href="#cutting-a-release-candidate">Cutting a release candidate</a></li>
+      <li>Informing the community of timing</li>
+      <li>Working with component leads to clean up JIRA</li>
+      <li>Making code changes in that branch with necessary version updates</li>
+    </ul>
   </li>
   <li>Running the voting process for a release:
-    <ol>
-      <li>creating release candidates using automated tooling</li>
-      <li>calling votes and triaging issues</li>
-    </ol>
+    <ul>
+      <li><a href="#creating-release-candidates-using-automated-tooling">Creating release candidates using automated tooling</a></li>
+      <li><a href="https://s.apache.org/spark-jira-versions">Triaging issues</a></li>
+      <li><a href="#call-a-vote-on-the-release-candidate">Call a vote on the release candidate</a></li>
+    </ul>
   </li>
-  <li>Finalizing and posting a release:
-    <ol>
-      <li>updating the Spark website</li>
-      <li>writing release notes</li>
-      <li>announcing the release</li>
-    </ol>
+  <li><a href="#finalize-the-release">Finalizing and posting a release</a>
+    <ul>
+      <li><a href="#upload-to-apache-release-directory">Upload to Apache release directory</a></li>
+      <li><a href="#upload-to-pypi">Upload to PyPI</a></li>
+      <li><a href="#publish-to-cran">Publish to CRAN</a></li>
+      <li><a href="#remove-rc-artifacts-from-repositories">Remove RC artifacts from repositories</a></li>
+      <li><a href="#remove-old-releases-from-mirror-network">Remove old releases from Mirror Network</a></li>
+      <li><a href="#update-the-apache-spark-repository">Update the Apache Spark<sup>TM</sup> repository</a></li>
+      <li><a href="#update-the-configuration-of-algolia-crawler">Update the configuration of Algolia Crawler</a></li>
+      <li><a href="#update-the-spark-website">Update the Spark website</a>
+        <ul>
+          <li><a href="#upload-generated-docs">Upload generated docs</a></li>
+          <li><a href="#update-the-rest-of-the-spark-website">Update the rest of the Spark website</a></li>
+        </ul>
+      </li>
+      <li><a href="#create-and-upload-spark-docker-images">Create and upload Spark Docker Images</a></li>
+      <li><a href="#create-an-announcement">Create an announcement</a></li>
+    </ul>
   </li>
-</ol>
+</ul>
 
-<h2>Preparing your setup</h2>
+<h2 id="preparing-your-setup">Preparing your setup</h2>
 
 <p>If you are a new Release Manager, you can read up on the process from the followings:</p>
 
@@ -182,11 +204,11 @@
   <li>svn <a href="https://infra.apache.org/version-control.html#svn">https://infra.apache.org/version-control.html#svn</a></li>
 </ul>
 
-<h3>Preparing gpg key</h3>
+<h3 id="preparing-gpg-key">Preparing gpg key</h3>
 
 <p>You can skip this section if you have already uploaded your key.</p>
 
-<h4>Generate key</h4>
+<h4 id="generate-key">Generate key</h4>
 
 <p>Here’s an example of gpg 2.0.12. If you use gpg version 1 series, please refer to <a href="https://www.apache.org/dev/openpgp.html#generate-key">generate-key</a> for details.</p>
 
@@ -243,7 +265,7 @@ sub rsa4096 2021-08-19 [E]
 
 <p>Note that the last 8 digits (26A27D33) of the public key is the <a href="https://infra.apache.org/release-signing.html#key-id">key ID</a>.</p>
 
-<h4>Upload key</h4>
+<h4 id="upload-key">Upload key</h4>
 
 <p>After generating the public key, we should upload it to <a href="https://infra.apache.org/release-signing.html#keyserver">public key server</a>:</p>
 
@@ -252,7 +274,7 @@ sub rsa4096 2021-08-19 [E]
 
 <p>Please refer to <a href="https://infra.apache.org/release-signing.html#keyserver-upload">keyserver-upload</a> for details.</p>
 
-<h4>Update KEYS file with your code signing key</h4>
+<h4 id="update-keys-file-with-your-code-signing-key">Update KEYS file with your code signing key</h4>
 
 <p>To get the code signing key (a.k.a ASCII-armored public key), run the command:</p>
 
@@ -270,20 +292,22 @@ svn ci --username $ASF_USERNAME --password "$ASF_PASSWORD" -m"Update KEYS"
 
 <p>If you want to do the release on another machine, you can transfer your secret key to that machine via the <code class="language-plaintext highlighter-rouge">gpg --export-secret-keys</code> and <code class="language-plaintext highlighter-rouge">gpg --import</code> commands.</p>
 
-<h3>Installing Docker</h3>
+<p align="right"><a href="#top">Return to top</a></p>
+<h3 id="installing-docker">Installing Docker</h3>
 
 <p>The scripts to create release candidates are run through docker. You need to install docker before running these scripts. Please make sure that you can run docker as non-root users. See <a href="https://docs.docker.com/install/linux/linux-postinstall/">https://docs.docker.com/install/linux/linux-postinstall</a> for more details.</p>
 
-<h2>Preparing Spark for release</h2>
+<h2 id="preparing-for-release-candidates">Preparing for release candidates</h2>
 
 <p>The main step towards preparing a release is to create a release branch.
 This is done via standard Git branching mechanism and should be announced to the community once the branch is created.</p>
 
-<h3>Cutting a release candidate</h3>
+<p align="right"><a href="#top">Return to top</a></p>
+<h3 id="cutting-a-release-candidate">Cutting a release candidate</h3>
 
 <p>If this is not the first RC, then make sure that the JIRA issues that have been solved since the last RC are marked as <code class="language-plaintext highlighter-rouge">Resolved</code> and has a <code class="language-plaintext highlighter-rouge">Target Versions</code> set to this release version.</p>
 
@@ -299,6 +323,9 @@ and click on the version link of its Target Versions field)</p>
 with <code class="language-plaintext highlighter-rouge">release-notes</code> label, and make sure they are documented in relevant migration guide for breaking changes or in the release news on the website later.</p>
 
+<p align="right"><a href="#top">Return to top</a></p>
+<h3 id="creating-release-candidates-using-automated-tooling">Creating release candidates using automated tooling</h3>
+
 <p>To cut a release candidate, there are 4 steps:</p>
 <ol>
   <li>Create a git tag for the release candidate.</li>
@@ -311,7 +338,8 @@ changes or in the release news on the website later.</p>
 Run this script, type information it requires, and wait until it finishes. You can also do a single step via the <code class="language-plaintext highlighter-rouge">-s</code> option. Please run <code class="language-plaintext highlighter-rouge">do-release-docker.sh -h</code> and see more details.</p>
 
-<h3>Call a vote on the release candidate</h3>
+<p align="right"><a href="#top">Return to top</a></p>
+<h3 id="call-a-vote-on-the-release-candidate">Call a vote on the release candidate</h3>
 
 <p>The release voting takes place on the Apache Spark developers list (the PMC is voting).
 Look at past voting threads to see how this proceeds. The email should follow
@@ -327,7 +355,7 @@ Look at past voting threads to see how this proceeds. The email should follow
 
 <p>Once the vote is done, you should also send out a summary email with the totals, with a subject that looks something like <code class="language-plaintext highlighter-rouge">[VOTE][RESULT] ...</code>.</p>
 
-<h3>Finalize the release</h3>
+<h2 id="finalize-the-release">Finalize the release</h2>
 
 <p>Note that <code class="language-plaintext highlighter-rouge">dev/create-release/do-release-docker.sh</code> script (<code class="language-plaintext highlighter-rouge">finalize</code> step ) automates most of the following steps <strong>except</strong> for:</p>
 <ul>
@@ -341,7 +369,8 @@ that looks something like <code class="language-plaintext highlighter-rouge">[VO
 
 <p>Please manually verify the result after each step.</p>
 
-<h4>Upload to Apache release directory</h4>
+<p align="right"><a href="#top">Return to top</a></p>
+<h3 id="upload-to-apache-release-directory">Upload to Apache release directory</h3>
 
 <p><strong>Be Careful!</strong></p>
 
@@ -371,7 +400,8 @@ select and click Release and confirm. If successful, it should show up under <a
 and the same under <a href="https://repository.apache.org/content/groups/maven-staging-group/org/apache/spark/spark-core_2.11/2.2.1/">https://repository.apache.org/content/groups/maven-staging-group/org/apache/spark/spark-core_2.11/2.2.1/</a>
 (look for the correct release version).
 After some time this will be sync’d to <a href="https://search.maven.org/">Maven Central</a> automatically.</p>
 
-<h4>Upload to PyPI</h4>
+<p align="right"><a href="#top">Return to top</a></p>
+<h3 id="upload-to-pypi">Upload to PyPI</h3>
 
 <p>You’ll need your own PyPI account. If you do not have a PyPI account that has access to the <code class="language-plaintext highlighter-rouge">pyspark</code> and <code class="language-plaintext highlighter-rouge">pyspark-connect</code> projects on PyPI, please ask the <a href="mailto:priv...@spark.apache.org">PMC</a> to grant permission for both.</p>
 
@@ -384,12 +414,14 @@ is incorrect (e.g. http failure or other issue), you can rename the artifact to
 <code class="language-plaintext highlighter-rouge">pyspark-version.post0.tar.gz</code>, delete the old artifact from PyPI and re-upload.</p>
 
-<h4>Publish to CRAN</h4>
+<p align="right"><a href="#top">Return to top</a></p>
+<h3 id="publish-to-cran">Publish to CRAN</h3>
 
 <p>Publishing to CRAN is done using <a href="https://cran.r-project.org/submit.html">this form</a>. Since it requires further manual steps, please also contact the <a href="mailto:priv...@spark.apache.org">PMC</a>.</p>
 
-<h4>Remove RC artifacts from repositories</h4>
+<p align="right"><a href="#top">Return to top</a></p>
+<h3 id="remove-rc-artifacts-from-repositories">Remove RC artifacts from repositories</h3>
 
 <p><strong>NOTE! If you did not make a backup of docs for approved RC, this is the last time you can make a backup. This will be used to upload the docs to the website in next few step. Check out docs from svn before removing the directory.</strong></p>
 
@@ -404,7 +436,8 @@ the RC directories from the staging repository. For example:</p>
 
 <p>Make sure to also remove the unpublished staging repositories from the <a href="https://repository.apache.org/">Apache Nexus Repository Manager</a>.</p>
 
-<h4>Remove old releases from Mirror Network</h4>
+<p align="right"><a href="#top">Return to top</a></p>
+<h3 id="remove-old-releases-from-mirror-network">Remove old releases from Mirror Network</h3>
 
 <p>Spark always keeps the latest maintenance released of each branch in the mirror network.
 To delete older versions simply use svn rm:</p>
 
@@ -415,7 +448,8 @@ To delete older versions simply use svn rm:</p>
 
 <p>You will also need to update <code class="language-plaintext highlighter-rouge">js/download.js</code> to indicate the release is not mirrored anymore, so that the correct links are generated on the site.</p>
 
-<h4>Update the Spark Apache<span class="tm">™</span> repository</h4>
+<p align="right"><a href="#top">Return to top</a></p>
+<h3 id="update-the-apache-spark-repository">Update the Spark Apache<span class="tm">™</span> repository</h3>
 
 <p>Check out the tagged commit for the release candidate that passed and apply the correct version tag.</p>
 
@@ -423,12 +457,15 @@ anymore, so that the correct links are generated on the site.</p>
 $ git push apache v1.1.1
 </code></pre></div></div>
 
-<h4>Update the configuration of Algolia Crawler</h4>
+<p align="right"><a href="#top">Return to top</a></p>
+<h3 id="update-the-configuration-of-algolia-crawler">Update the configuration of Algolia Crawler</h3>
+
 <p>The search box on the <a href="https://spark.apache.org/docs/latest/">Spark documentation website</a> leverages the <a href="https://www.algolia.com/products/search-and-discovery/crawler/">Algolia Crawler</a>.
 Before a release, please update the crawler configuration for Apache Spark with the new version on the <a href="https://crawler.algolia.com/">Algolia Crawler Admin Console</a>. If you don’t have access to the configuration, contact <a href="mailto:gengli...@apache.org">Gen [...]
 
-<h4>Update the Spark website</h4>
+<p align="right"><a href="#top">Return to top</a></p>
+<h3 id="update-the-spark-website">Update the Spark website</h3>
 
-<h5>Upload generated docs</h5>
+<h4 id="upload-generated-docs">Upload generated docs</h4>
 
 <p>The website repository is located at <a href="https://github.com/apache/spark-website">https://github.com/apache/spark-website</a>.</p>
 
@@ -452,7 +489,7 @@ $ rm latest
 $ ln -s 1.1.1 latest
 </code></pre></div></div>
 
-<h5>Update the rest of the Spark website</h5>
+<h4 id="update-the-rest-of-the-spark-website">Update the rest of the Spark website</h4>
 
 <p>Next, update the rest of the Spark website. See how the previous releases are documented (all the HTML file changes are generated by <code class="language-plaintext highlighter-rouge">jekyll</code>). In particular:</p>
 
@@ -532,7 +569,8 @@ $ git shortlog v1.1.1 --grep "$EXPR" > contrib.txt
 $ git log v1.1.1 --grep "$expr" --shortstat --oneline | grep -B 1 -e "[3-9][0-9][0-9] insert" -e "[1-9][1-9][1-9][1-9] insert" | grep SPARK > large-patches.txt
 </code></pre></div></div>
 
-<h4>Create and upload Spark Docker Images</h4>
+<p align="right"><a href="#top">Return to top</a></p>
+<h3 id="create-and-upload-spark-docker-images">Create and upload Spark Docker Images</h3>
 
 <p>The apache/spark-docker provides dockerfiles and Github Action for Spark Docker images publish.</p>
 <ol>
@@ -547,7 +585,8 @@ $ git log v1.1.1 --grep "$expr" --shortstat --oneline | grep -B 1 -e "[3-9][0-9]
   </li>
 </ol>
 
-<h4>Create an announcement</h4>
+<p align="right"><a href="#top">Return to top</a></p>
+<h3 id="create-an-announcement">Create an announcement</h3>
 
 <p>Once everything is working (website docs, website changes) create an announcement on the website and then send an e-mail to the mailing list with a subject that looks something like <code class="language-plaintext highlighter-rouge">[ANNOUNCE] ...</code>. To create an announcement, create a post under
@@ -555,6 +594,8 @@ and then send an e-mail to the mailing list with a subject that looks something
 
 <p>Enjoy an adult beverage of your choice, and congratulations on making a Spark release.</p>
 
+<p align="right"><a href="#top">Return to top</a></p>
+
 </div>
 <div class="col-12 col-md-3">
 <div class="news" style="margin-bottom: 20px;">
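For anyone applying the same navigational pattern to other pages of the site, the whole change reduces to two pieces of markup that recur throughout the patch above: a heading carrying an explicit id that matches the fragment used by the table of contents, and a right-aligned link pointing at #top. The snippet below is only an illustrative sketch assembled from elements of this patch (the "Upload to PyPI" heading and id are just one example taken from it); it assumes the usual browser behavior that a #top fragment with no matching element scrolls to the top of the document.

```
<!-- Table-of-contents entry in release-process.md; the fragment must match the heading id below. -->
- [Upload to PyPI](#upload-to-pypi)

<!-- Back-link placed above each section; "#top" falls back to the top of the page
     when no element actually carries id="top". -->
<p align="right"><a href="#top">Return to top</a></p>

<!-- Section heading with an explicit id, so the Markdown source and the generated
     site/release-process.html resolve the same anchor. -->
<h3 id="upload-to-pypi">Upload to PyPI</h3>
```

Because jekyll copies the raw HTML through to the generated site page, keeping the id attribute in the source is what makes the anchors identical in release-process.md and site/release-process.html.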