Were there any other creative solutions for this? I am running into the same
issue with submitting to YARN from a Docker container, and the solutions
provided don't work. (1. the host doesn't work, even if I use the
hostname of the physical node, because when Spark tries to bind to the
hostname
All,
I was wondering if any of you have solved this problem:
I have pyspark (IPython mode) running in Docker, talking to
a YARN cluster (AM/executors are NOT running in Docker).
When I start pyspark in the Docker container, it binds to port 49460.
Once the app is submitted to YARN, the app (AM)
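
In case it helps anyone hitting the same wall: Spark 2.1+ exposes
spark.driver.bindAddress, which decouples the address the driver binds to
inside the container from the address it advertises to YARN. Below is a
rough sketch of that approach, not a confirmed fix for this exact setup;
the hostname and ports (docker-host.example.com, 40000, 40001) are
placeholders, and the container has to be started with those ports
published (e.g. docker run -p 40000:40000 -p 40001:40001):

# Sketch only: assumes Spark 2.1+ and that docker-host.example.com,
# 40000, and 40001 are replaced with values valid for your cluster.
from pyspark import SparkConf
from pyspark.sql import SparkSession

conf = (
    SparkConf()
    .setMaster("yarn")
    # Address the driver advertises to the AM/executors; must be the
    # Docker host (or another address reachable from the cluster).
    .set("spark.driver.host", "docker-host.example.com")
    # Address the driver actually binds to inside the container.
    .set("spark.driver.bindAddress", "0.0.0.0")
    # Pin the driver and block manager ports so they can be published
    # with docker run -p, instead of letting Spark pick a random port
    # (like the 49460 above).
    .set("spark.driver.port", "40000")
    .set("spark.driver.blockManager.port", "40001")
)

spark = SparkSession.builder.config(conf=conf).getOrCreate()

With the ports pinned and published, the AM and executors connect back to
the advertised host:port rather than the container-internal address.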