Hi, it seems

   - [SPARK-35391] <https://issues.apache.org/jira/browse/SPARK-36339>:
   Memory leak in ExecutorAllocationListener breaks dynamic allocation under
   high load

Does this link to the wrong JIRA ticket?

Mich Talebzadeh <mich.talebza...@gmail.com> wrote on Tue, 22 Feb 2022 at 15:49:

> Well, that is pretty easy to do.
>
> However, a quick fix for now could be to retag the images already created.
> The volume is small enough to do manually for now. For example, I just
> downloaded v3.1.3:
>
>
> docker image ls
>
> REPOSITORY     TAG       IMAGE ID       CREATED        SIZE
> apache/spark   v3.1.3    31ed15daa2bf   12 hours ago   531MB
>
> Retag it with
>
>
> docker tag 31ed15daa2bf apache/spark/tags/spark-3.1.3-scala_2.12-8-jre-slim-buster
>
> docker image ls
>
> REPOSITORY                                                    TAG       IMAGE ID       CREATED        SIZE
> apache/spark/tags/spark-3.1.3-scala_2.12-8-jre-slim-buster    latest    31ed15daa2bf   12 hours ago   531MB
>
> Then push it with (for example):
>
> docker push apache/spark/tags/spark-3.1.3-scala_2.12-8-jre-slim-buster
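>
> As an aside, as far as I know Docker Hub repositories only take two path
> components (<namespace>/<name>), so if the slash-separated name above is
> rejected on push, one alternative is to keep the descriptive part in a
> colon-separated tag. A minimal sketch, reusing the image ID above:
>
> docker tag 31ed15daa2bf apache/spark:3.1.3-scala_2.12-8-jre-slim-buster
> docker push apache/spark:3.1.3-scala_2.12-8-jre-slim-buster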
>
>
> HTH
>
> On Mon, 21 Feb 2022 at 23:51, Holden Karau <hol...@pigscanfly.ca> wrote:
>
>> Yeah, I think we should still adopt that naming convention; however, no one
>> has taken the time to write a script to do it yet, so until we get that
>> script merged I think we'll just have one build. I can try to do that for
>> the next release, but it would also be a great second issue for someone
>> getting more familiar with the release tooling.
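>>
>> A rough sketch of what such a retag script could look like, for
>> illustration only (this is not the actual release tooling; the variable
>> values are placeholders, and it assumes the spark, spark-py and spark-r
>> images for a release already exist locally under a v<version> tag):
>>
>> #!/usr/bin/env bash
>> # Illustrative retag-and-push helper; not the actual release tooling.
>> set -euo pipefail
>>
>> SPARK_VERSION="3.1.3"            # placeholder inputs for the sketch
>> SCALA_VERSION="2.12"
>> JAVA_BASE="11-jre-slim-buster"
>>
>> SUFFIX="${SPARK_VERSION}-scala_${SCALA_VERSION}-${JAVA_BASE}"
>>
>> for repo in spark spark-py spark-r; do
>>   # Assumes the images were published as apache/<repo>:v<version>.
>>   docker tag "apache/${repo}:v${SPARK_VERSION}" "apache/${repo}:${SUFFIX}"
>>   docker push "apache/${repo}:${SUFFIX}"
>> done
>>
>> In practice this would presumably hook into the existing release scripts
>> rather than run standalone.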
>>
>> On Mon, Feb 21, 2022 at 2:18 PM Mich Talebzadeh <
>> mich.talebza...@gmail.com> wrote:
>>
>>> OK, thanks for the correction.
>>>
>>> The docker pull line reads as follows:
>>>
>>> docker pull apache/spark:v3.2.1
>>>
>>>
>>> So this only tells me the Spark version, 3.2.1, and nothing about Scala or Java.
>>>
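>>> For what it's worth, the Scala and Java versions baked into an image can be
>>> checked by running spark-submit --version inside it. A sketch, assuming the
>>> image keeps Spark under /opt/spark (as the images built with
>>> docker-image-tool.sh do):
>>>
>>> docker run --rm --entrypoint /opt/spark/bin/spark-submit apache/spark:v3.2.1 --version
>>>
>>> That prints the Spark version together with the Scala and JVM versions it
>>> was built and run with, but it is no substitute for having them in the name.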
>>>
>>> I thought we discussed the Docker naming conventions in detail and broadly
>>> agreed on what needs to be in the name. For example, in this thread:
>>>
>>>
>>> "Time to start publishing Spark Docker Images?", dated 22nd July 2021
>>>
>>>
>>> Referring to that, I think the broad agreement was that the docker image
>>> name should encode:
>>>
>>>    - What it is built for: spark, spark-py (PySpark) or spark-r
>>>    - The Spark version: 3.1.1, 3.1.2, 3.2.1, etc.
>>>    - The Scala version: 2.12
>>>    - The OS/Java base image: 8-jre-slim-buster or 11-jre-slim-buster,
>>>    meaning Java 8 and Java 11 respectively
>>>
>>> I believe that is a good convention and we ought to adopt it. For
>>> example:
>>>
>>>
>>> spark-py-3.2.1-scala_2.12-11-jre-slim-buster
>>>
>>>
>>> HTH
>>>
>>> On Mon, 21 Feb 2022 at 21:58, Holden Karau <hol...@pigscanfly.ca> wrote:
>>>
>>>> My bad, the correct link is:
>>>>
>>>> https://hub.docker.com/r/apache/spark/tags
>>>>
>>>> On Mon, Feb 21, 2022 at 1:17 PM Mich Talebzadeh <
>>>> mich.talebza...@gmail.com> wrote:
>>>>
>>>>> Well, that Docker link is not found! Maybe a permission issue?
>>>>>
>>>>> [image: image.png]
>>>>>
>>>>> On Mon, 21 Feb 2022 at 21:09, Holden Karau <hol...@pigscanfly.ca>
>>>>> wrote:
>>>>>
>>>>>> We are happy to announce the availability of Spark 3.1.3!
>>>>>>
>>>>>> Spark 3.1.3 is a maintenance release containing stability fixes. This
>>>>>> release is based on the branch-3.1 maintenance branch of Spark. We
>>>>>> strongly recommend that all 3.1 users upgrade to this stable release.
>>>>>>
>>>>>> To download Spark 3.1.3, head over to the download page:
>>>>>> https://spark.apache.org/downloads.html
>>>>>>
>>>>>> To view the release notes:
>>>>>> https://spark.apache.org/releases/spark-release-3-1-3.html
>>>>>>
>>>>>> We would like to acknowledge all community members for contributing
>>>>>> to this
>>>>>> release. This release would not have been possible without you.
>>>>>>
>>>>>> *New Dockerhub magic in this release:*
>>>>>>
>>>>>> We've also started publishing Docker images to the Apache Dockerhub;
>>>>>> these contain non-ASF artifacts that are subject to different license
>>>>>> terms than the Spark release itself. The images are built for Linux
>>>>>> x86 and ARM64, since that's what I have access to (thanks to NV for
>>>>>> the ARM64 machines).
>>>>>>
>>>>>> You can get them from https://hub.docker.com/apache/spark (and
>>>>>> spark-r and spark-py) :)
>>>>>> (And version 3.2.1 is also now published on Dockerhub).
>>>>>>
>>>>>> Holden
>>>>>>
>>>>>> --
>>>>>> Twitter: https://twitter.com/holdenkarau
>>>>>> Books (Learning Spark, High Performance Spark, etc.):
>>>>>> https://amzn.to/2MaRAG9
>>>>>> YouTube Live Streams: https://www.youtube.com/user/holdenkarau
>>>>>>
>>>>>
>>>>
>>>>
>>>
>>
>>
>
