[ https://issues.apache.org/jira/browse/SPARK-3512?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Yongjia Wang updated SPARK-3512:
--------------------------------
    Description: 

I believe this is a common scenario: the YARN cluster runs behind a firewall, while people want to run the Spark driver locally for the best interactive experience, for example with an IPython notebook or a full-featured IDE. A potential solution is to set up a SOCKS proxy on the local machine outside the firewall by SSH-tunneling into a workstation inside the firewall. The Spark yarn-client would then only need to talk to the cluster through this proxy, without any configuration changes.

  (was: I believe this is a common scenario: the YARN cluster runs behind a firewall, while people want to run the Spark driver locally for the best interactive experience, for example with an IPython notebook or a full-featured IDE. A potential solution is to set up a SOCKS proxy on the local machine outside the firewall by SSH-tunneling into a workstation inside the firewall. The Spark yarn-client would then only need to talk through this proxy.)

> yarn-client through socks proxy
> -------------------------------
>
>                 Key: SPARK-3512
>                 URL: https://issues.apache.org/jira/browse/SPARK-3512
>             Project: Spark
>          Issue Type: Wish
>          Components: YARN
>            Reporter: Yongjia Wang
>
> I believe this is a common scenario: the YARN cluster runs behind a
> firewall, while people want to run the Spark driver locally for the best
> interactive experience, for example with an IPython notebook or a
> full-featured IDE. A potential solution is to set up a SOCKS proxy on the
> local machine outside the firewall by SSH-tunneling into a workstation
> inside the firewall. The Spark yarn-client would then only need to talk to
> the cluster through this proxy, without any configuration changes.
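A minimal sketch of the tunneling setup the description proposes. The gateway hostname, username, and port below are hypothetical placeholders; pointing the driver JVM at the tunnel relies on the standard java.net socksProxyHost/socksProxyPort properties, and whether yarn-client mode actually routes all of its cluster connections through them is precisely what this issue is asking about.

```shell
# Hypothetical names for illustration: gateway.example.com is a workstation
# inside the firewall; 1080 is an arbitrary local port for the SOCKS proxy.

# 1. Open a dynamic (SOCKS) tunnel from the local machine through the gateway.
#    -N: no remote command, -D: local dynamic application-level forwarding.
ssh -N -D 1080 user@gateway.example.com &

# 2. Ask the driver JVM to route its sockets through the local SOCKS proxy
#    using the standard JVM networking properties.
spark-submit \
  --master yarn-client \
  --conf "spark.driver.extraJavaOptions=-DsocksProxyHost=localhost -DsocksProxyPort=1080" \
  my_app.py
```

Note that even if the driver-side connections honor the proxy, executors still need to reach back to the driver, so this sketch covers only the outbound half of the problem.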
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org