[
https://issues.apache.org/jira/browse/SPARK-42788?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17702433#comment-17702433
]
Yikun Jiang commented on SPARK-42788:
-
Yes, I think we can add a line like:
{code:java}
ENV PATH="${PATH}:/opt/spark/bin:/opt/spark/sbin"
{code}
in the Dockerfile, or you can also achieve this by extending the base image:
{code}
FROM spark_base_image
ENV PATH="${PATH}:/opt/spark/bin:/opt/spark/sbin"
{code}
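The {{ENV}} line above simply appends the Spark bin directories to whatever {{PATH}} the base image already defines. The same expansion can be sketched in plain shell (the directories are illustrative, matching the image layout discussed here):

```shell
# Mimic what the ENV PATH line in the Dockerfile does:
# keep the existing PATH and append the Spark directories.
PATH="${PATH}:/opt/spark/bin:/opt/spark/sbin"
export PATH
echo "$PATH"
```

With this in place, {{spark-submit}}, {{pyspark}}, and the scripts under {{/opt/spark/sbin}} resolve without typing the full path.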
> spark binaries are not added to $PATH in public docker hub spark-py
> ---
>
> Key: SPARK-42788
> URL: https://issues.apache.org/jira/browse/SPARK-42788
> Project: Spark
> Issue Type: Improvement
> Components: Deploy
>Affects Versions: 3.1.3, 3.3.1
> Environment: https://hub.docker.com/r/apache/spark-py 3.1.3 and 3.3.1
> in Docker on M1 MacBook pro OSX ventura
> as in SPARK-42787, i'd do a PR but not too familiar with the mapping of
> Dockerfiles in the repo + the public docker hub images.
>Reporter: Max Rieger
>Priority: Minor
>
> i tested this for 3.1.3 and 3.3.1 from
> https://hub.docker.com/r/apache/spark-py/tags.
> for user 185, `/opt/spark/bin` is not added to {{$PATH}}, which hampers the
> user experience.