How do you make a driver running inside a Docker container reachable from the
Spark workers?
Would you share your driver Docker setup? I am trying to put only the driver in
Docker, with Spark running on YARN outside the container, and I don't want to
use --net=host.
Thx
Yanlin
> On Mar 10, 2016, at 11:06 AM,
> We would like to run multiple Spark drivers in Docker containers. Any
> suggestions for the port exposure and network settings for Docker so the
> driver is reachable by the worker nodes? --net=host is the last thing we
> want to do.
> Thx
> Yanlin
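[Editor's note: one approach that is often suggested for this, sketched below under assumptions (the host IP, image name, ports, and jar path are all placeholders): publish a fixed set of ports from the container and make the driver advertise the Docker host's address via `spark.driver.host`, pinning `spark.driver.port` and `spark.blockManager.port` so the published mappings are deterministic. This is a sketch, not a tested recipe.]

```shell
# Sketch only: pin the driver's ports and advertise the Docker host's address,
# so executors connect back through the published port mappings rather than
# the container's internal IP. HOST_IP and the image/jar names are placeholders.
HOST_IP=192.0.2.10   # address of the Docker host, reachable from the workers

docker run -d \
  -p 5001:5001 -p 5002:5002 \
  my-spark-driver-image \
  spark-submit \
    --master yarn \
    --conf spark.driver.host=$HOST_IP \
    --conf spark.driver.port=5001 \
    --conf spark.blockManager.port=5002 \
    --class com.example.Main /app/app.jar
```

On older Spark 1.x releases there are additional randomly chosen ports (e.g. `spark.fileserver.port`, `spark.broadcast.port`) that may also need pinning and publishing; later releases add `spark.driver.bindAddress` to separate the in-container bind address from the advertised one.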
Did anyone use Livy in a real-world high-concurrency web app? I think it uses
the spark-submit command line to create jobs... How do Spark Job Server or the
notebook servers compare with Livy?
Thx,
Yanlin
Sent from my iPhone
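[Editor's note: Livy exposes an HTTP interface, so a web app submits jobs by POSTing JSON rather than forking spark-submit itself. A minimal sketch of building a batch-submission body for Livy's `POST /batches` endpoint; the Livy host, jar path, and class name below are placeholders.]

```python
import json

LIVY_URL = "http://livy-host:8998"  # placeholder Livy endpoint

def batch_payload(jar, class_name, args=None):
    # JSON body for POST /batches; Livy launches the application
    # and tracks its state server-side.
    body = {"file": jar, "className": class_name}
    if args:
        body["args"] = list(args)
    return json.dumps(body)

payload = batch_payload("hdfs:///jars/app.jar", "com.example.Main",
                        ["--input", "hdfs:///data"])
print(payload)
# An actual submission would POST this payload with
# Content-Type: application/json to LIVY_URL + "/batches".
```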
> On Mar 2, 2016, at 6:24 AM, Guru Medasani wrote:
>
> Hi Don,
>