+1 (non-binding)

This will bring a good experience to customers. So excited about this. ;-)

Yuming Wang <wgy...@gmail.com> wrote on Mon, Sep 19, 2022 at 10:18:

> +1.
>
> On Mon, Sep 19, 2022 at 9:44 AM Kent Yao <y...@apache.org> wrote:
>
>> +1
>>
>> Gengliang Wang <ltn...@gmail.com> wrote on Mon, Sep 19, 2022 at 09:23:
>> >
>> > +1, thanks for the work!
>> >
>> > On Sun, Sep 18, 2022 at 6:20 PM Hyukjin Kwon <gurwls...@gmail.com>
>> wrote:
>> >>
>> >> +1
>> >>
>> >> On Mon, 19 Sept 2022 at 09:15, Yikun Jiang <yikunk...@gmail.com>
>> wrote:
>> >>>
>> >>> Hi, all
>> >>>
>> >>>
>> >>> I would like to start the discussion on supporting a Docker Official
>> Image for Spark.
>> >>>
>> >>>
>> >>> This SPIP proposes adding a Docker Official Image (DOI) to ensure
>> that the Spark Docker images meet Docker's quality standards, and to
>> provide these images for users who want to use Apache Spark via a
>> Docker image.
>> >>>
>> >>>
>> >>> Several other Apache projects already release Docker Official
>> Images, such as flink, storm, solr, zookeeper, and httpd (with 50M+ to
>> 1B+ downloads each). The huge download statistics show real user
>> demand, and the support from other Apache projects suggests we should
>> be able to do it too.
>> >>>
>> >>>
>> >>> After this is supported:
>> >>>
>> >>> The Dockerfile will still be maintained by the Apache Spark community
>> and reviewed by Docker.
>> >>>
>> >>> The images will be maintained by the Docker community, ensuring
>> they meet the Docker community's quality standards for Docker images.
>> >>>
>> >>>
>> >>> It will also reduce the Apache Spark community's extra Docker image
>> maintenance effort (such as frequent rebuilds and image security
>> updates).
>> >>>
>> >>>
>> >>> See more in SPIP DOC:
>> https://docs.google.com/document/d/1nN-pKuvt-amUcrkTvYAQ-bJBgtsWb9nAkNoVNRM2S2o
>> >>>
>> >>>
>> >>> cc: Ruifeng (co-author) and Hyukjin (shepherd)
>> >>>
>> >>>
>> >>> Regards,
>> >>> Yikun
>>
>> ---------------------------------------------------------------------
>> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>>
>>