This is an automated email from the ASF dual-hosted git repository.

gurwls223 pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/spark-website.git


The following commit(s) were added to refs/heads/asf-site by this push:
     new a6e039b0a9 Update Spark Docker Images publish workflow
a6e039b0a9 is described below

commit a6e039b0a945d8f3c36940f452883d699e408230
Author: Yikun Jiang <yikunk...@gmail.com>
AuthorDate: Tue Apr 18 21:49:01 2023 +0900

    Update Spark Docker Images publish workflow
    
    This PR updates the `Create and upload Spark Docker Images` section to use the new workflow. The new workflow first builds the Docker image, then tests it (K8s / standalone), and finally publishes it; see an example run [here](https://github.com/apache/spark-docker/actions/runs/4728100554/jobs/8389323265).
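
    For reference, the same workflow can also be dispatched from the command line with the GitHub CLI. This is only a sketch: the input names (`spark_version`, `publish`, `registry`) are assumptions, not taken from `publish.yml`.

    ```shell
    # Manually dispatch the publish workflow (input names are hypothetical)
    gh workflow run publish.yml \
      --repo apache/spark-docker \
      -f spark_version=3.4.0 \
      -f publish=true \
      -f registry=apache
    ```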
    
    The difference between the old workflow and the new workflow:
    
    | Workflow    | Tag |
    | ----------- | ----------- |
    | Previous    | apache/spark:v3.4.0, apache/spark-py:v3.4.0, apache/spark-r:v3.4.0 |
    | New         | <img width="732" alt="image" src="https://user-images.githubusercontent.com/1736354/232733695-acc6b099-0c69-4638-b0ec-5a13cfe289dd.png"> |
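
    To illustrate the tagging change: the old scheme published per-binding repositories with `v`-prefixed tags, while the new scheme publishes everything under the single `apache/spark` repository. The exact new tag set is shown in the screenshot above; the new-style tag below is an assumption.

    ```shell
    # Old scheme: one repository per binding, "v"-prefixed tags
    docker pull apache/spark-py:v3.4.0
    # New scheme: single apache/spark repository (tag name is an assumption)
    docker pull apache/spark:3.4.0
    ```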
    
    After
    <img width="1152" alt="image" src="https://user-images.githubusercontent.com/1736354/232729016-11df3304-72e8-489f-b441-d92aa6840d71.png">
    
    We have already configured the `DOCKER_USER` and `DOCKER_TOKEN` secrets in https://issues.apache.org/jira/browse/INFRA-23882.
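
    The publish job uses these credentials to log in to Docker Hub before pushing. A minimal sketch of the equivalent manual login, assuming the secrets are exposed as environment variables:

    ```shell
    # Log in to Docker Hub with the configured credentials
    # (the real workflow performs this in its own login step)
    echo "$DOCKER_TOKEN" | docker login --username "$DOCKER_USER" --password-stdin
    ```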
    
    After this patch is merged, the 3.4.0 image can be published as the first Docker image in the new scheme to the `apache/spark` Docker Hub repository. In future releases, only new-style Docker images will be published.
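
    Once the workflow completes, the published image can be smoke-tested locally. The tag name and in-image path below are assumptions based on the new scheme.

    ```shell
    # Pull the newly published image and confirm the Spark version it reports
    docker pull apache/spark:3.4.0
    docker run --rm apache/spark:3.4.0 /opt/spark/bin/spark-submit --version
    ```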
    
    Author: Yikun Jiang <yikunk...@gmail.com>
    
    Closes #458 from Yikun/docker-publish.
---
 release-process.md        | 17 +++++++----------
 site/release-process.html | 19 ++++++++++++-------
 2 files changed, 19 insertions(+), 17 deletions(-)

diff --git a/release-process.md b/release-process.md
index 4030352f8b..e417d50891 100644
--- a/release-process.md
+++ b/release-process.md
@@ -397,16 +397,13 @@ $ git log v1.1.1 --grep "$expr" --shortstat --oneline | grep -B 1 -e "[3-9][0-9]
 
 <h4>Create and upload Spark Docker Images</h4>
 
-Please contact <a href="mailto:hol...@apache.org">Holden Karau</a>, <a href="mailto:gengli...@apache.org">Gengliang Wang</a> or <a href="mailto:dongj...@apache.org">Dongjoon Hyun</a> to do this step because of the [ASF has a limited number of Docker Hub seats](https://infra.apache.org/docker-hub-policy.html).
-
-
-The Spark docker images are created using the `./bin/docker-image-tool.sh` that is included in the release artifacts.
-
-
-You should install `docker buildx` so that you can cross-compile for multiple archs as ARM is becoming increasing popular. If you have access to both an ARM and an x86 machine you should set up a [remote builder as described here](https://scalingpythonml.com/2020/12/11/some-sharp-corners-with-docker-buildx.html), but if you only have one [docker buildx with QEMU works fine as we don't use cgo](https://docs.docker.com/buildx/working-with-buildx/).
-
-
-Once you have your cross-platform docker build environment setup, extract the build artifact (e.g. `tar -xvf spark-3.3.0-bin-hadoop3.tgz`), go into the directory (e.g. `cd spark-3.3.0-bin-hadoop3`) and build the containers and publish them to the Spark dockerhub (e.g. `./bin/docker-image-tool.sh -r docker.io/apache -p ./kubernetes/dockerfiles/spark/bindings/python/Dockerfile -t v3.3.0 -X -b java_image_tag=11-jre-slim build`)
+The apache/spark-docker repository provides the Dockerfiles and the GitHub Action used to publish Spark Docker images.
+1. Upload the Spark Dockerfiles to the apache/spark-docker repository; see [this PR](https://github.com/apache/spark-docker/pull/33) for an example.
+2. Publish the Spark Docker images:
+    1. Open the [publish workflow page](https://github.com/apache/spark-docker/actions/workflows/publish.yml).
+    2. Click "Run workflow".
+    3. Select the Spark version of the image, check "Publish the image or not", and select "apache" as the target registry.
+    4. Click the "Run workflow" button to publish the images to the Apache Docker Hub.
 
 <h4>Create an announcement</h4>
 
diff --git a/site/release-process.html b/site/release-process.html
index 72d6bd7982..396782863f 100644
--- a/site/release-process.html
+++ b/site/release-process.html
@@ -510,13 +510,18 @@ $ git log v1.1.1 --grep "$expr" --shortstat --oneline | grep -B 1 -e "[3-9][0-9]
 
 <h4>Create and upload Spark Docker Images</h4>
 
-<p>Please contact <a href="mailto:hol...@apache.org">Holden Karau</a>, <a href="mailto:gengli...@apache.org">Gengliang Wang</a> or <a href="mailto:dongj...@apache.org">Dongjoon Hyun</a> to do this step because of the <a href="https://infra.apache.org/docker-hub-policy.html">ASF has a limited number of Docker Hub seats</a>.</p>
-
-<p>The Spark docker images are created using the <code class="language-plaintext highlighter-rouge">./bin/docker-image-tool.sh</code> that is included in the release artifacts.</p>
-
-<p>You should install <code class="language-plaintext highlighter-rouge">docker buildx</code> so that you can cross-compile for multiple archs as ARM is becoming increasing popular. If you have access to both an ARM and an x86 machine you should set up a <a href="https://scalingpythonml.com/2020/12/11/some-sharp-corners-with-docker-buildx.html">remote builder as described here</a>, but if you only have one <a href="https://docs.docker.com/buildx/working-with-buildx/">docker buildx with Q [...]
-
-<p>Once you have your cross-platform docker build environment setup, extract the build artifact (e.g. <code class="language-plaintext highlighter-rouge">tar -xvf spark-3.3.0-bin-hadoop3.tgz</code>), go into the directory (e.g. <code class="language-plaintext highlighter-rouge">cd spark-3.3.0-bin-hadoop3</code>) and build the containers and publish them to the Spark dockerhub (e.g. <code class="language-plaintext highlighter-rouge">./bin/docker-image-tool.sh -r docker.io/apache -p ./kuber [...]
+<p>The apache/spark-docker repository provides the Dockerfiles and the GitHub Action used to publish Spark Docker images.</p>
+<ol>
+  <li>Upload the Spark Dockerfiles to the apache/spark-docker repository; see <a href="https://github.com/apache/spark-docker/pull/33">this PR</a> for an example.</li>
+  <li>Publish the Spark Docker images:
+    <ol>
+      <li>Open the <a href="https://github.com/apache/spark-docker/actions/workflows/publish.yml">publish workflow page</a>.</li>
+      <li>Click &#8220;Run workflow&#8221;.</li>
+      <li>Select the Spark version of the image, check &#8220;Publish the image or not&#8221;, and select &#8220;apache&#8221; as the target registry.</li>
+      <li>Click the &#8220;Run workflow&#8221; button to publish the images to the Apache Docker Hub.</li>
+    </ol>
+  </li>
+</ol>
 
 <h4>Create an announcement</h4>
 


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org
