This is an automated email from the ASF dual-hosted git repository.

yikun pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark-docker.git
The following commit(s) were added to refs/heads/master by this push:
     new e8f5b0a  [SPARK-42494] Add official image Dockerfile for Spark v3.3.2
e8f5b0a is described below

commit e8f5b0a1151c349d9c7fdb09cf76300b42a6946b
Author: Yikun Jiang <yikunk...@gmail.com>
AuthorDate: Tue Feb 21 14:22:19 2023 +0800

    [SPARK-42494] Add official image Dockerfile for Spark v3.3.2

    ### What changes were proposed in this pull request?
    Add the Apache Spark 3.3.2 Dockerfiles:
    - Add the 3.3.2 GPG key
    - Add .github/workflows/build_3.3.2.yaml
    - Run ./add-dockerfiles.sh 3.3.2

    ### Why are the changes needed?
    Apache Spark 3.3.2 has been released:
    https://lists.apache.org/thread/k8skf16wyn6rg9n0vd0t6l3bhw7c9svq

    ### Does this PR introduce _any_ user-facing change?
    Yes. New images will be published in the future (after the DOI review).

    ### How was this patch tested?
    Added the build workflow; CI passed.

    Closes #30 from Yikun/SPARK-42494.

    Authored-by: Yikun Jiang <yikunk...@gmail.com>
    Signed-off-by: Yikun Jiang <yikunk...@gmail.com>
---
 .github/workflows/build_3.3.2.yaml                 | 43 +++++++++++
 3.3.2/scala2.12-java11-python3-r-ubuntu/Dockerfile | 86 ++++++++++++++++++++++
 .../entrypoint.sh                                  |  0
 3.3.2/scala2.12-java11-python3-ubuntu/Dockerfile   | 83 +++++++++++++++++++++
 .../scala2.12-java11-python3-ubuntu/entrypoint.sh  |  0
 3.3.2/scala2.12-java11-r-ubuntu/Dockerfile         | 82 +++++++++++++++++++++
 .../scala2.12-java11-r-ubuntu/entrypoint.sh        |  7 --
 3.3.2/scala2.12-java11-ubuntu/Dockerfile           | 79 ++++++++++++++++++++
 .../scala2.12-java11-ubuntu/entrypoint.sh          |  7 --
 add-dockerfiles.sh                                 |  2 +-
 entrypoint.sh.template                             |  2 +
 tools/template.py                                  |  2 +
 12 files changed, 378 insertions(+), 15 deletions(-)

diff --git a/.github/workflows/build_3.3.2.yaml b/.github/workflows/build_3.3.2.yaml
new file mode 100644
index 0000000..9ae1a13
--- /dev/null
+++ b/.github/workflows/build_3.3.2.yaml
@@ -0,0 +1,43 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+#
+
+name: "Build and Test (3.3.2)"
+
+on:
+  pull_request:
+    branches:
+      - 'master'
+    paths:
+      - '3.3.2/**'
+      - '.github/workflows/build_3.3.2.yaml'
+      - '.github/workflows/main.yml'
+
+jobs:
+  run-build:
+    strategy:
+      matrix:
+        image-type: ["all", "python", "scala", "r"]
+    name: Run
+    secrets: inherit
+    uses: ./.github/workflows/main.yml
+    with:
+      spark: 3.3.2
+      scala: 2.12
+      java: 11
+      image-type: ${{ matrix.image-type }}
diff --git a/3.3.2/scala2.12-java11-python3-r-ubuntu/Dockerfile b/3.3.2/scala2.12-java11-python3-r-ubuntu/Dockerfile
new file mode 100644
index 0000000..b518021
--- /dev/null
+++ b/3.3.2/scala2.12-java11-python3-r-ubuntu/Dockerfile
@@ -0,0 +1,86 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+FROM eclipse-temurin:11-jre-focal
+
+ARG spark_uid=185
+
+RUN groupadd --system --gid=${spark_uid} spark && \
+    useradd --system --uid=${spark_uid} --gid=spark spark
+
+RUN set -ex && \
+    apt-get update && \
+    ln -s /lib /lib64 && \
+    apt install -y gnupg2 wget bash tini libc6 libpam-modules krb5-user libnss3 procps net-tools gosu && \
+    apt install -y python3 python3-pip && \
+    apt install -y r-base r-base-dev && \
+    mkdir -p /opt/spark && \
+    mkdir /opt/spark/python && \
+    mkdir -p /opt/spark/examples && \
+    mkdir -p /opt/spark/work-dir && \
+    touch /opt/spark/RELEASE && \
+    chown -R spark:spark /opt/spark && \
+    rm /bin/sh && \
+    ln -sv /bin/bash /bin/sh && \
+    echo "auth required pam_wheel.so use_uid" >> /etc/pam.d/su && \
+    chgrp root /etc/passwd && chmod ug+rw /etc/passwd && \
+    rm -rf /var/cache/apt/* && \
+    rm -rf /var/lib/apt/lists/*
+
+# Install Apache Spark
+# https://downloads.apache.org/spark/KEYS
+ENV SPARK_TGZ_URL=https://archive.apache.org/dist/spark/spark-3.3.2/spark-3.3.2-bin-hadoop3.tgz \
+    SPARK_TGZ_ASC_URL=https://archive.apache.org/dist/spark/spark-3.3.2/spark-3.3.2-bin-hadoop3.tgz.asc \
+    GPG_KEY=C56349D886F2B01F8CAE794C653C2301FEA493EE
+
+RUN set -ex; \
+    export SPARK_TMP="$(mktemp -d)"; \
+    cd $SPARK_TMP; \
+    wget -nv -O spark.tgz "$SPARK_TGZ_URL"; \
+    wget -nv -O spark.tgz.asc "$SPARK_TGZ_ASC_URL"; \
+    export GNUPGHOME="$(mktemp -d)"; \
+    gpg --keyserver hkps://keys.openpgp.org --recv-key "$GPG_KEY" || \
+    gpg --keyserver hkps://keyserver.ubuntu.com --recv-keys "$GPG_KEY"; \
+    gpg --batch --verify spark.tgz.asc spark.tgz; \
+    gpgconf --kill all; \
+    rm -rf "$GNUPGHOME" spark.tgz.asc; \
+    \
+    tar -xf spark.tgz --strip-components=1; \
+    chown -R spark:spark .; \
+    mv jars /opt/spark/; \
+    mv bin /opt/spark/; \
+    mv sbin /opt/spark/; \
+    mv kubernetes/dockerfiles/spark/decom.sh /opt/; \
+    mv examples /opt/spark/; \
+    mv kubernetes/tests /opt/spark/; \
+    mv data /opt/spark/; \
+    mv python/pyspark /opt/spark/python/pyspark/; \
+    mv python/lib /opt/spark/python/lib/; \
+    mv R /opt/spark/; \
+    cd ..; \
+    rm -rf "$SPARK_TMP";
+
+COPY entrypoint.sh /opt/
+
+ENV SPARK_HOME /opt/spark
+ENV R_HOME /usr/lib/R
+
+WORKDIR /opt/spark/work-dir
+RUN chmod g+w /opt/spark/work-dir
+RUN chmod a+x /opt/decom.sh
+RUN chmod a+x /opt/entrypoint.sh
+
+ENTRYPOINT [ "/opt/entrypoint.sh" ]
diff --git a/entrypoint.sh.template b/3.3.2/scala2.12-java11-python3-r-ubuntu/entrypoint.sh
similarity index 100%
copy from entrypoint.sh.template
copy to 3.3.2/scala2.12-java11-python3-r-ubuntu/entrypoint.sh
diff --git a/3.3.2/scala2.12-java11-python3-ubuntu/Dockerfile b/3.3.2/scala2.12-java11-python3-ubuntu/Dockerfile
new file mode 100644
index 0000000..37e1205
--- /dev/null
+++ b/3.3.2/scala2.12-java11-python3-ubuntu/Dockerfile
@@ -0,0 +1,83 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+FROM eclipse-temurin:11-jre-focal
+
+ARG spark_uid=185
+
+RUN groupadd --system --gid=${spark_uid} spark && \
+    useradd --system --uid=${spark_uid} --gid=spark spark
+
+RUN set -ex && \
+    apt-get update && \
+    ln -s /lib /lib64 && \
+    apt install -y gnupg2 wget bash tini libc6 libpam-modules krb5-user libnss3 procps net-tools gosu && \
+    apt install -y python3 python3-pip && \
+    mkdir -p /opt/spark && \
+    mkdir /opt/spark/python && \
+    mkdir -p /opt/spark/examples && \
+    mkdir -p /opt/spark/work-dir && \
+    touch /opt/spark/RELEASE && \
+    chown -R spark:spark /opt/spark && \
+    rm /bin/sh && \
+    ln -sv /bin/bash /bin/sh && \
+    echo "auth required pam_wheel.so use_uid" >> /etc/pam.d/su && \
+    chgrp root /etc/passwd && chmod ug+rw /etc/passwd && \
+    rm -rf /var/cache/apt/* && \
+    rm -rf /var/lib/apt/lists/*
+
+# Install Apache Spark
+# https://downloads.apache.org/spark/KEYS
+ENV SPARK_TGZ_URL=https://archive.apache.org/dist/spark/spark-3.3.2/spark-3.3.2-bin-hadoop3.tgz \
+    SPARK_TGZ_ASC_URL=https://archive.apache.org/dist/spark/spark-3.3.2/spark-3.3.2-bin-hadoop3.tgz.asc \
+    GPG_KEY=C56349D886F2B01F8CAE794C653C2301FEA493EE
+
+RUN set -ex; \
+    export SPARK_TMP="$(mktemp -d)"; \
+    cd $SPARK_TMP; \
+    wget -nv -O spark.tgz "$SPARK_TGZ_URL"; \
+    wget -nv -O spark.tgz.asc "$SPARK_TGZ_ASC_URL"; \
+    export GNUPGHOME="$(mktemp -d)"; \
+    gpg --keyserver hkps://keys.openpgp.org --recv-key "$GPG_KEY" || \
+    gpg --keyserver hkps://keyserver.ubuntu.com --recv-keys "$GPG_KEY"; \
+    gpg --batch --verify spark.tgz.asc spark.tgz; \
+    gpgconf --kill all; \
+    rm -rf "$GNUPGHOME" spark.tgz.asc; \
+    \
+    tar -xf spark.tgz --strip-components=1; \
+    chown -R spark:spark .; \
+    mv jars /opt/spark/; \
+    mv bin /opt/spark/; \
+    mv sbin /opt/spark/; \
+    mv kubernetes/dockerfiles/spark/decom.sh /opt/; \
+    mv examples /opt/spark/; \
+    mv kubernetes/tests /opt/spark/; \
+    mv data /opt/spark/; \
+    mv python/pyspark /opt/spark/python/pyspark/; \
+    mv python/lib /opt/spark/python/lib/; \
+    cd ..; \
+    rm -rf "$SPARK_TMP";
+
+COPY entrypoint.sh /opt/
+
+ENV SPARK_HOME /opt/spark
+
+WORKDIR /opt/spark/work-dir
+RUN chmod g+w /opt/spark/work-dir
+RUN chmod a+x /opt/decom.sh
+RUN chmod a+x /opt/entrypoint.sh
+
+ENTRYPOINT [ "/opt/entrypoint.sh" ]
diff --git a/entrypoint.sh.template b/3.3.2/scala2.12-java11-python3-ubuntu/entrypoint.sh
similarity index 100%
copy from entrypoint.sh.template
copy to 3.3.2/scala2.12-java11-python3-ubuntu/entrypoint.sh
diff --git a/3.3.2/scala2.12-java11-r-ubuntu/Dockerfile b/3.3.2/scala2.12-java11-r-ubuntu/Dockerfile
new file mode 100644
index 0000000..330c746
--- /dev/null
+++ b/3.3.2/scala2.12-java11-r-ubuntu/Dockerfile
@@ -0,0 +1,82 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+FROM eclipse-temurin:11-jre-focal
+
+ARG spark_uid=185
+
+RUN groupadd --system --gid=${spark_uid} spark && \
+    useradd --system --uid=${spark_uid} --gid=spark spark
+
+RUN set -ex && \
+    apt-get update && \
+    ln -s /lib /lib64 && \
+    apt install -y gnupg2 wget bash tini libc6 libpam-modules krb5-user libnss3 procps net-tools gosu && \
+    apt install -y r-base r-base-dev && \
+    mkdir -p /opt/spark && \
+    mkdir -p /opt/spark/examples && \
+    mkdir -p /opt/spark/work-dir && \
+    touch /opt/spark/RELEASE && \
+    chown -R spark:spark /opt/spark && \
+    rm /bin/sh && \
+    ln -sv /bin/bash /bin/sh && \
+    echo "auth required pam_wheel.so use_uid" >> /etc/pam.d/su && \
+    chgrp root /etc/passwd && chmod ug+rw /etc/passwd && \
+    rm -rf /var/cache/apt/* && \
+    rm -rf /var/lib/apt/lists/*
+
+# Install Apache Spark
+# https://downloads.apache.org/spark/KEYS
+ENV SPARK_TGZ_URL=https://archive.apache.org/dist/spark/spark-3.3.2/spark-3.3.2-bin-hadoop3.tgz \
+    SPARK_TGZ_ASC_URL=https://archive.apache.org/dist/spark/spark-3.3.2/spark-3.3.2-bin-hadoop3.tgz.asc \
+    GPG_KEY=C56349D886F2B01F8CAE794C653C2301FEA493EE
+
+RUN set -ex; \
+    export SPARK_TMP="$(mktemp -d)"; \
+    cd $SPARK_TMP; \
+    wget -nv -O spark.tgz "$SPARK_TGZ_URL"; \
+    wget -nv -O spark.tgz.asc "$SPARK_TGZ_ASC_URL"; \
+    export GNUPGHOME="$(mktemp -d)"; \
+    gpg --keyserver hkps://keys.openpgp.org --recv-key "$GPG_KEY" || \
+    gpg --keyserver hkps://keyserver.ubuntu.com --recv-keys "$GPG_KEY"; \
+    gpg --batch --verify spark.tgz.asc spark.tgz; \
+    gpgconf --kill all; \
+    rm -rf "$GNUPGHOME" spark.tgz.asc; \
+    \
+    tar -xf spark.tgz --strip-components=1; \
+    chown -R spark:spark .; \
+    mv jars /opt/spark/; \
+    mv bin /opt/spark/; \
+    mv sbin /opt/spark/; \
+    mv kubernetes/dockerfiles/spark/decom.sh /opt/; \
+    mv examples /opt/spark/; \
+    mv kubernetes/tests /opt/spark/; \
+    mv data /opt/spark/; \
+    mv R /opt/spark/; \
+    cd ..; \
+    rm -rf "$SPARK_TMP";
+
+COPY entrypoint.sh /opt/
+
+ENV SPARK_HOME /opt/spark
+ENV R_HOME /usr/lib/R
+
+WORKDIR /opt/spark/work-dir
+RUN chmod g+w /opt/spark/work-dir
+RUN chmod a+x /opt/decom.sh
+RUN chmod a+x /opt/entrypoint.sh
+
+ENTRYPOINT [ "/opt/entrypoint.sh" ]
diff --git a/entrypoint.sh.template b/3.3.2/scala2.12-java11-r-ubuntu/entrypoint.sh
similarity index 96%
copy from entrypoint.sh.template
copy to 3.3.2/scala2.12-java11-r-ubuntu/entrypoint.sh
index 4bb1557..159d539 100644
--- a/entrypoint.sh.template
+++ b/3.3.2/scala2.12-java11-r-ubuntu/entrypoint.sh
@@ -45,13 +45,6 @@ if [ -n "$SPARK_EXTRA_CLASSPATH" ]; then
   SPARK_CLASSPATH="$SPARK_CLASSPATH:$SPARK_EXTRA_CLASSPATH"
 fi
 
-if ! [ -z ${PYSPARK_PYTHON+x} ]; then
-  export PYSPARK_PYTHON
-fi
-if ! [ -z ${PYSPARK_DRIVER_PYTHON+x} ]; then
-  export PYSPARK_DRIVER_PYTHON
-fi
-
 # If HADOOP_HOME is set and SPARK_DIST_CLASSPATH is not set, set it here so Hadoop jars are available to the executor.
 # It does not set SPARK_DIST_CLASSPATH if already set, to avoid overriding customizations of this value from elsewhere e.g. Docker/K8s.
 if [ -n "${HADOOP_HOME}" ] && [ -z "${SPARK_DIST_CLASSPATH}" ]; then
diff --git a/3.3.2/scala2.12-java11-ubuntu/Dockerfile b/3.3.2/scala2.12-java11-ubuntu/Dockerfile
new file mode 100644
index 0000000..5bead91
--- /dev/null
+++ b/3.3.2/scala2.12-java11-ubuntu/Dockerfile
@@ -0,0 +1,79 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+FROM eclipse-temurin:11-jre-focal
+
+ARG spark_uid=185
+
+RUN groupadd --system --gid=${spark_uid} spark && \
+    useradd --system --uid=${spark_uid} --gid=spark spark
+
+RUN set -ex && \
+    apt-get update && \
+    ln -s /lib /lib64 && \
+    apt install -y gnupg2 wget bash tini libc6 libpam-modules krb5-user libnss3 procps net-tools gosu && \
+    mkdir -p /opt/spark && \
+    mkdir -p /opt/spark/examples && \
+    mkdir -p /opt/spark/work-dir && \
+    touch /opt/spark/RELEASE && \
+    chown -R spark:spark /opt/spark && \
+    rm /bin/sh && \
+    ln -sv /bin/bash /bin/sh && \
+    echo "auth required pam_wheel.so use_uid" >> /etc/pam.d/su && \
+    chgrp root /etc/passwd && chmod ug+rw /etc/passwd && \
+    rm -rf /var/cache/apt/* && \
+    rm -rf /var/lib/apt/lists/*
+
+# Install Apache Spark
+# https://downloads.apache.org/spark/KEYS
+ENV SPARK_TGZ_URL=https://archive.apache.org/dist/spark/spark-3.3.2/spark-3.3.2-bin-hadoop3.tgz \
+    SPARK_TGZ_ASC_URL=https://archive.apache.org/dist/spark/spark-3.3.2/spark-3.3.2-bin-hadoop3.tgz.asc \
+    GPG_KEY=C56349D886F2B01F8CAE794C653C2301FEA493EE
+
+RUN set -ex; \
+    export SPARK_TMP="$(mktemp -d)"; \
+    cd $SPARK_TMP; \
+    wget -nv -O spark.tgz "$SPARK_TGZ_URL"; \
+    wget -nv -O spark.tgz.asc "$SPARK_TGZ_ASC_URL"; \
+    export GNUPGHOME="$(mktemp -d)"; \
+    gpg --keyserver hkps://keys.openpgp.org --recv-key "$GPG_KEY" || \
+    gpg --keyserver hkps://keyserver.ubuntu.com --recv-keys "$GPG_KEY"; \
+    gpg --batch --verify spark.tgz.asc spark.tgz; \
+    gpgconf --kill all; \
+    rm -rf "$GNUPGHOME" spark.tgz.asc; \
+    \
+    tar -xf spark.tgz --strip-components=1; \
+    chown -R spark:spark .; \
+    mv jars /opt/spark/; \
+    mv bin /opt/spark/; \
+    mv sbin /opt/spark/; \
+    mv kubernetes/dockerfiles/spark/decom.sh /opt/; \
+    mv examples /opt/spark/; \
+    mv kubernetes/tests /opt/spark/; \
+    mv data /opt/spark/; \
+    cd ..; \
+    rm -rf "$SPARK_TMP";
+
+COPY entrypoint.sh /opt/
+
+ENV SPARK_HOME /opt/spark
+
+WORKDIR /opt/spark/work-dir
+RUN chmod g+w /opt/spark/work-dir
+RUN chmod a+x /opt/decom.sh
+RUN chmod a+x /opt/entrypoint.sh
+
+ENTRYPOINT [ "/opt/entrypoint.sh" ]
diff --git a/entrypoint.sh.template b/3.3.2/scala2.12-java11-ubuntu/entrypoint.sh
similarity index 96%
copy from entrypoint.sh.template
copy to 3.3.2/scala2.12-java11-ubuntu/entrypoint.sh
index 4bb1557..159d539 100644
--- a/entrypoint.sh.template
+++ b/3.3.2/scala2.12-java11-ubuntu/entrypoint.sh
@@ -45,13 +45,6 @@ if [ -n "$SPARK_EXTRA_CLASSPATH" ]; then
   SPARK_CLASSPATH="$SPARK_CLASSPATH:$SPARK_EXTRA_CLASSPATH"
 fi
 
-if ! [ -z ${PYSPARK_PYTHON+x} ]; then
-  export PYSPARK_PYTHON
-fi
-if ! [ -z ${PYSPARK_DRIVER_PYTHON+x} ]; then
-  export PYSPARK_DRIVER_PYTHON
-fi
-
 # If HADOOP_HOME is set and SPARK_DIST_CLASSPATH is not set, set it here so Hadoop jars are available to the executor.
 # It does not set SPARK_DIST_CLASSPATH if already set, to avoid overriding customizations of this value from elsewhere e.g. Docker/K8s.
 if [ -n "${HADOOP_HOME}" ] && [ -z "${SPARK_DIST_CLASSPATH}" ]; then
diff --git a/add-dockerfiles.sh b/add-dockerfiles.sh
index 4829ecd..1683f33 100755
--- a/add-dockerfiles.sh
+++ b/add-dockerfiles.sh
@@ -48,6 +48,6 @@ for TAG in $TAGS; do
     OPTS+=" --spark-version $VERSION"
 
     mkdir -p $VERSION/$TAG
-    cp -f entrypoint.sh.template $VERSION/$TAG/entrypoint.sh
+    python3 tools/template.py $OPTS -f entrypoint.sh.template > $VERSION/$TAG/entrypoint.sh
     python3 tools/template.py $OPTS > $VERSION/$TAG/Dockerfile
 done
diff --git a/entrypoint.sh.template b/entrypoint.sh.template
index 4bb1557..dd56d84 100644
--- a/entrypoint.sh.template
+++ b/entrypoint.sh.template
@@ -44,6 +44,7 @@ readarray -t SPARK_EXECUTOR_JAVA_OPTS < /tmp/java_opts.txt
 if [ -n "$SPARK_EXTRA_CLASSPATH" ]; then
   SPARK_CLASSPATH="$SPARK_CLASSPATH:$SPARK_EXTRA_CLASSPATH"
 fi
+{%- if HAVE_PY %}
 
 if ! [ -z ${PYSPARK_PYTHON+x} ]; then
   export PYSPARK_PYTHON
@@ -51,6 +52,7 @@ fi
 if ! [ -z ${PYSPARK_DRIVER_PYTHON+x} ]; then
   export PYSPARK_DRIVER_PYTHON
 fi
+{%- endif %}
 
 # If HADOOP_HOME is set and SPARK_DIST_CLASSPATH is not set, set it here so Hadoop jars are available to the executor.
 # It does not set SPARK_DIST_CLASSPATH if already set, to avoid overriding customizations of this value from elsewhere e.g. Docker/K8s.
diff --git a/tools/template.py b/tools/template.py
index 7f80dfe..7bacb23 100755
--- a/tools/template.py
+++ b/tools/template.py
@@ -26,6 +26,8 @@ GPG_KEY_DICT = {
     "3.3.0": "80FB8EBE8EBA68504989703491B5DC815DBF10D3",
     # issuer "yumw...@apache.org"
    "3.3.1": "86727D43E73A415F67A0B1A14E68B3E6CD473653",
+    # issuer "vii...@apache.org"
+    "3.3.2": "C56349D886F2B01F8CAE794C653C2301FEA493EE",
 }
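
Since add-dockerfiles.sh now renders entrypoint.sh through tools/template.py, the PYSPARK_PYTHON/PYSPARK_DRIVER_PYTHON export block is only emitted for images that include Python (the new HAVE_PY guard in entrypoint.sh.template). A minimal sketch of regenerating and building one of the 3.3.2 images locally from the repository root follows; the image tag used here is a hypothetical local name, not an official published tag:

    # Render the 3.3.2 Dockerfiles and entrypoint.sh files from the templates
    ./add-dockerfiles.sh 3.3.2

    # Build one of the generated images; the generated directory is the build
    # context, since its Dockerfile COPYs the rendered entrypoint.sh from there
    docker build \
        -t spark-local:3.3.2-scala2.12-java11-python3-ubuntu \
        3.3.2/scala2.12-java11-python3-ubuntu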