[ https://issues.apache.org/jira/browse/SPARK-30667?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Xiangrui Meng updated SPARK-30667:
----------------------------------
    Description: 
Currently we offer task.barrier() to coordinate tasks in barrier mode, and tasks 
can see all IP addresses from BarrierTaskContext. Integrating with distributed 
frameworks like TensorFlow DistributionStrategy would be simpler if we provided 
an all-gather operation that lets tasks share additional information with each 
other, e.g., an available port.

Note that with all-gather, tasks share their IP addresses as well.
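To illustrate only the proposed semantics (not Spark's implementation): an all-gather is a barrier at which every task also contributes a message and, after the sync point, reads everyone's messages ordered by task ID. A minimal thread-based sketch, where run_all_gather and the message format are illustrative names:

{code}
import threading

def run_all_gather(num_tasks):
    slots = [None] * num_tasks            # one message slot per task
    barrier = threading.Barrier(num_tasks)
    results = [None] * num_tasks

    def task(task_id):
        # Each task contributes its message before the barrier...
        slots[task_id] = "msg-from-%d" % task_id
        barrier.wait()                    # same sync point as task.barrier()
        # ...and afterwards every task sees all messages, ordered by task ID.
        results[task_id] = list(slots)

    threads = [threading.Thread(target=task, args=(i,)) for i in range(num_tasks)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results

results = run_all_gather(4)
{code}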

{code}
port = ... # get an available port
ports = context.all_gather(port) # get all available ports, ordered by task ID
...  # set up distributed training service
{code}
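The "get an available port" step in the snippet above can be made concrete without any Spark API: binding a socket to port 0 asks the OS for a free ephemeral port. find_free_port is an illustrative helper, not part of the proposal:

{code}
import socket

def find_free_port():
    """Ask the OS for an available ephemeral port."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.bind(("", 0))                 # port 0: let the OS pick a free port
    port = sock.getsockname()[1]
    sock.close()
    return port

port = find_free_port()
{code}

Note that the port is only reserved while the socket is bound; in a real setup the training service should bind it promptly to avoid a race with other processes.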

  was:
Currently we offer task.barrier() to coordinate tasks in barrier mode. Tasks 
can see all IP addresses from BarrierTaskContext. It would be simpler to 
integrate with distributed frameworks like TensorFlow DistributionStrategy if 
we provide all gather that can let tasks share additional information with 
others, e.g., an available port.

Note that with all gather, tasks share their IP addresses as well.


> Support simple all gather in barrier task context
> -------------------------------------------------
>
>                 Key: SPARK-30667
>                 URL: https://issues.apache.org/jira/browse/SPARK-30667
>             Project: Spark
>          Issue Type: New Feature
>          Components: PySpark, Spark Core
>    Affects Versions: 3.0.0
>            Reporter: Xiangrui Meng
>            Priority: Major
>



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
