[ https://issues.apache.org/jira/browse/SPARK-8734?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14933403#comment-14933403 ]

Ondřej Smola commented on SPARK-8734:
-------------------------------------

The goal should be to provide as much value as we can given this limited 
format. As I said, judging from what other frameworks expose in their REST 
APIs, at this stage we need to handle at least ports, volumes, network type, 
env variables, and labels, without limitation. So the proposed model is 
(a parsing sketch follows the list):

spark.mesos.executor.docker.network    host
spark.mesos.executor.docker.portmaps  1234:1245:udp, 2546:1255:tcp
spark.mesos.executor.docker.volumes /usr/:/host/usr:ro, /opt:/host/opt
spark.mesos.executor.docker.env.FOO BAR
spark.mesos.executor.docker.label.STAGE PRODUCTION
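
A minimal sketch of how these strings could be parsed into the Mesos protobuf 
types, assuming the org.apache.mesos.Protos API already used by the Spark 
Mesos scheduler backend; the helper names and exact string formats are 
illustrative, not an agreed design:

import org.apache.mesos.Protos.Volume
import org.apache.mesos.Protos.ContainerInfo.DockerInfo

// "host" -> DockerInfo.Network.HOST (likewise "bridge", "none")
def parseNetwork(name: String): DockerInfo.Network =
  DockerInfo.Network.valueOf(name.trim.toUpperCase)

// "1234:1245:udp, 2546:1255:tcp" -> host:container:protocol port mappings
def parsePortMappings(spec: String): Seq[DockerInfo.PortMapping] =
  spec.split(",").map(_.trim).map { entry =>
    val Array(host, container, proto) = entry.split(":")
    DockerInfo.PortMapping.newBuilder()
      .setHostPort(host.toInt)
      .setContainerPort(container.toInt)
      .setProtocol(proto)
      .build()
  }.toSeq

// "/usr/:/host/usr:ro, /opt:/host/opt" -> host-path:container-path[:mode]
// volumes; the mode defaults to read-write when omitted.
def parseVolumes(spec: String): Seq[Volume] =
  spec.split(",").map(_.trim).map { entry =>
    val parts = entry.split(":")
    val mode =
      if (parts.length > 2 && parts(2) == "ro") Volume.Mode.RO
      else Volume.Mode.RW
    Volume.newBuilder()
      .setHostPath(parts(0))
      .setContainerPath(parts(1))
      .setMode(mode)
      .build()
  }.toSeq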


Support for custom Docker parameters:

spark.mesos.executor.docker.parameters.memory-swappiness  0

Limitation: only one value per key is allowed.

Multiple values can be given as a comma-separated list:

spark.mesos.executor.docker.parameters.attach  stdin,stdout

Limitation: the comma can only be used as a separator (see the sketch below).
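
A sketch of how the prefixed keys could be pulled out of SparkConf and turned 
into Parameter protos; the same prefix-stripping pattern would cover the 
env.* and label.* keys. Only the standard SparkConf.getAll API is assumed, 
and the prefix constant is illustrative:

import org.apache.spark.SparkConf
import org.apache.mesos.Protos.Parameter

val parametersPrefix = "spark.mesos.executor.docker.parameters."

// Every key under the prefix becomes one or more --key=value docker
// parameters; a comma in the value is always treated as a separator,
// e.g. attach=stdin,stdout -> --attach=stdin --attach=stdout.
def customParameters(conf: SparkConf): Seq[Parameter] =
  conf.getAll.toSeq
    .filter { case (key, _) => key.startsWith(parametersPrefix) }
    .flatMap { case (key, value) =>
      value.split(",").map(_.trim).map { v =>
        Parameter.newBuilder()
          .setKey(key.stripPrefix(parametersPrefix))
          .setValue(v)
          .build()
      }
    }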

Pros:
- consistent with other configuration sections
- we can easily move custom parameters to their own section in the future
- no escaping problems

Cons:
- not powerful enough to express some Docker parameter combinations 
  (multiple values containing commas)

> Expose all Mesos DockerInfo options to Spark
> --------------------------------------------
>
>                 Key: SPARK-8734
>                 URL: https://issues.apache.org/jira/browse/SPARK-8734
>             Project: Spark
>          Issue Type: Improvement
>          Components: Mesos
>            Reporter: Chris Heller
>            Priority: Minor
>         Attachments: network.diff
>
>
> SPARK-2691 only exposed a few options from the DockerInfo message. It would 
> be reasonable to expose them all, especially given one can now specify 
> arbitrary parameters to docker.


