[ https://issues.apache.org/jira/browse/SPARK-41075?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17635071#comment-17635071 ]

Yikun Jiang commented on SPARK-41075:
-------------------------------------

I think `kubernetes/dockerfiles/spark/` is the path used by the binary distribution.

```
wget https://dlcdn.apache.org/spark/spark-3.3.1/spark-3.3.1-bin-hadoop3.tgz
tar -zxvf spark-3.3.1-bin-hadoop3.tgz
cd spark-3.3.1-bin-hadoop3
./bin/docker-image-tool.sh -r xxx -t 3.3.1 -f kubernetes/dockerfiles/spark/Dockerfile.java17 build
```
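
To double-check that the resulting image really runs on Java 17, something like the following should work (the image name is an assumption based on the `-r`/`-t` values above, since docker-image-tool.sh tags images as `<repo>/spark:<tag>`):

```
# image name below is assumed from the -r/-t values used in the build above
docker run --rm --entrypoint java xxx/spark:3.3.1 -version
```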

But if you are building from the source code, you should set the correct path, e.g. 
resource-managers/kubernetes/docker/src/main/dockerfiles.
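
For example, from the root of a source checkout the call would look roughly like this (a sketch; the exact Dockerfile location under that directory is an assumption, adjust it to wherever Dockerfile.java17 sits in your tree):

```
# run from the Spark source root; the Dockerfile path below is assumed
./bin/docker-image-tool.sh -r xxx -t 3.3.1 \
  -f resource-managers/kubernetes/docker/src/main/dockerfiles/spark/Dockerfile.java17 build
```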

BTW, here is some info you might be interested in:
- After 3.4.0, the Java 17 dockerfile is the default Dockerfile in the K8s IT: 
https://github.com/apache/spark/pull/38417
- We created spark-docker to make building images simpler: 
https://github.com/apache/spark-docker

> Can't build Spark docker image with Java 17
> -------------------------------------------
>
>                 Key: SPARK-41075
>                 URL: https://issues.apache.org/jira/browse/SPARK-41075
>             Project: Spark
>          Issue Type: Bug
>          Components: Build
>    Affects Versions: 3.3.1
>            Reporter: Yauheni Shahun
>            Priority: Minor
>
> I'm trying to build the Spark docker image with Java 17 according to the usage 
> description from {{docker-image-tool.sh}}:
> {code:java}
> ./bin/docker-image-tool.sh -r $REGISTRY -t ${VERSION} -f kubernetes/dockerfiles/spark/Dockerfile.java17 build
> {code}
> But I get the following error:
> {code:java}
> ./bin/docker-image-tool.sh: line 74: cd: kubernetes/dockerfiles/spark: No such file or directory
> {code}
> As far as I understand, the script verifies that the argument path exists before 
> it creates a build context and copies content from 
> {{resource-managers/kubernetes/docker/src/main/dockerfiles}}.


