Standalone. I'd love to tell it that my one executor can serve, say, 16
tasks simultaneously across an arbitrary number of distinct jobs.


On Fri, Aug 29, 2014 at 11:29 AM, Matei Zaharia <matei.zaha...@gmail.com>
wrote:

> Yes, executors run one task per core of your machine by default. You can
> also manually launch them with more worker threads than you have cores.
> What cluster manager are you on?
>
> Matei
>
> On August 29, 2014 at 11:24:33 AM, Victor Tso-Guillen (v...@paxata.com)
> wrote:
>
>  I'm thinking of local mode, where multiple virtual executors occupy the
> same VM. Can we have the same configuration in Spark standalone cluster
> mode?
>
>
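
For reference, in standalone mode the number of cores a worker advertises is set with `SPARK_WORKER_CORES` in `conf/spark-env.sh`, and it can be set higher than the machine's physical core count so one executor runs more concurrent tasks, as Matei describes. A minimal sketch (the value 16 is illustrative, matching the example above):

```shell
# conf/spark-env.sh on each worker node.
# Advertise 16 cores regardless of physical core count, so an executor
# on this worker can run up to 16 tasks concurrently (illustrative value).
SPARK_WORKER_CORES=16

# Restart the worker for the setting to take effect, e.g.:
#   sbin/stop-worker.sh && sbin/start-worker.sh spark://<master-host>:7077
```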
