Currently a SparkContext and its executor pool are not shareable: each
SparkContext gets its own executor pool for the entire life of an application.
So what is the best way to share cluster resources across multiple long-running
Spark applications?

The only option I see is Spark dynamic allocation, but it has high latency
when it comes to real-time applications.
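
For context, this is roughly what I mean by dynamic allocation: a minimal
sketch of enabling it when building the SparkConf (the min/max executor counts
and idle timeout below are placeholder values, not a recommendation):

    import org.apache.spark.{SparkConf, SparkContext}

    // Enable dynamic allocation so idle executors are released back to the
    // cluster and requested again when load returns.
    val conf = new SparkConf()
      .setAppName("long-running-app")
      .set("spark.dynamicAllocation.enabled", "true")
      .set("spark.shuffle.service.enabled", "true") // external shuffle service, needed e.g. on YARN
      .set("spark.dynamicAllocation.minExecutors", "1")
      .set("spark.dynamicAllocation.maxExecutors", "20")
      .set("spark.dynamicAllocation.executorIdleTimeout", "60s")

    val sc = new SparkContext(conf)

The problem is the ramp-up delay: when a burst of real-time requests arrives,
new executors still have to be requested and started before work can run.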
