[ 
https://issues.apache.org/jira/browse/SPARK-3821?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15292599#comment-15292599
 ] 

Nicholas Chammas commented on SPARK-3821:
-----------------------------------------

You can deploy Spark on Docker today just fine. It's just that Spark itself 
does not maintain any official Dockerfiles and likely never will, since the 
project is deliberately moving deployment tooling out of the main codebase 
(that's why spark-ec2 was moved out; you will not see spark-ec2 in the official 
docs once Spark 2.0 comes out). You may be more interested in the Apache 
Bigtop project, which focuses on big data system deployment (including Spark) 
and may have Docker-specific material.

Mesos is a separate matter, because it's a resource manager (analogous to YARN) 
that integrates with Spark at a low level.

If you still think Spark should host and maintain an official Dockerfile and 
Docker images that are suitable for production use, please open a separate 
issue. I think the maintainers will reject it on the grounds that I have 
explained here, though. (Can't say for sure; after all I'm just a random 
contributor.)
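
To illustrate the "works fine today" point: a minimal, purely illustrative 
Dockerfile (base image, Spark version, and download URL are my own choices 
here, not anything the project publishes or endorses) might look like this:

```dockerfile
# Illustrative sketch only -- NOT an official Spark image.
FROM java:7-jre

ENV SPARK_VERSION 1.6.1

# Fetch a prebuilt Spark distribution from the Apache archive and
# unpack it under /opt. The Hadoop build variant is an arbitrary pick.
RUN apt-get update && apt-get install -y curl \
 && curl -fsSL "https://archive.apache.org/dist/spark/spark-${SPARK_VERSION}/spark-${SPARK_VERSION}-bin-hadoop2.6.tgz" \
    | tar -xz -C /opt \
 && ln -s /opt/spark-${SPARK_VERSION}-bin-hadoop2.6 /opt/spark

ENV PATH /opt/spark/bin:$PATH

# Default to a local-mode shell; real deployments would point
# --master at a cluster manager instead.
CMD ["spark-shell", "--master", "local[*]"]
```

Anyone can build and run something like that without any support from the 
Spark project itself, which is exactly the point.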

> Develop an automated way of creating Spark images (AMI, Docker, and others)
> ---------------------------------------------------------------------------
>
>                 Key: SPARK-3821
>                 URL: https://issues.apache.org/jira/browse/SPARK-3821
>             Project: Spark
>          Issue Type: Improvement
>          Components: Build, EC2
>            Reporter: Nicholas Chammas
>            Assignee: Nicholas Chammas
>         Attachments: packer-proposal.html
>
>
> Right now the creation of Spark AMIs or Docker containers is done manually. 
> With tools like [Packer|http://www.packer.io/], we should be able to automate 
> this work, and do so in such a way that multiple types of machine images can 
> be created from a single template.
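
For context on the "single template, multiple image types" idea above: a 
Packer template declares one builder per target image format, all sharing the 
same provisioning steps. The sketch below is illustrative only; the AMI ID, 
region, instance type, and script name are placeholders, not values from any 
actual Spark tooling:

```json
{
  "builders": [
    {
      "type": "amazon-ebs",
      "region": "us-east-1",
      "source_ami": "ami-xxxxxxxx",
      "instance_type": "m3.large",
      "ssh_username": "ec2-user",
      "ami_name": "spark-{{timestamp}}"
    },
    {
      "type": "docker",
      "image": "ubuntu:14.04",
      "commit": true
    }
  ],
  "provisioners": [
    {
      "type": "shell",
      "scripts": ["install-spark.sh"]
    }
  ]
}
```

Running `packer build` against one such template would produce both an AMI 
and a Docker image from the same install script, which is the automation the 
issue asks for.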



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
