+user@
An executor is specific to an application, but an application can be
executing many jobs at once. So, as I understand it, many jobs' tasks
can be executing at once on a single executor.
You may not use your full 80-way parallelism if, for example, your
data set doesn't have 80 partitions.
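Sean's point about partition-limited parallelism can be sketched in plain
Scala. The 80-core figure comes from the thread; the partition count below
is a hypothetical example:

```scala
// Concurrent tasks in a stage are bounded by both the number of task
// slots (cores) and the number of partitions of the stage's data.
val totalCores = 80     // cluster-wide task slots (figure from the thread)
val numPartitions = 32  // hypothetical: a small input data set

// At most one task runs per partition, so the extra slots sit idle.
val concurrentTasks = math.min(totalCores, numPartitions)
println(concurrentTasks) // prints 32
```

To actually occupy all 80 slots, the data would need at least 80
partitions, e.g. via `rdd.repartition(80)`.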
I'm pretty sure the issue was an interaction with another subsystem. Thanks
for your patience with me!
On Tue, Sep 2, 2014 at 10:05 AM, Sean Owen so...@cloudera.com wrote:
I'm thinking of local mode, where multiple virtual executors occupy the same
JVM. Can we have the same configuration in Spark standalone cluster mode?
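For reference, the local-mode behavior being described is driven by the
master URL. A configuration sketch (assuming Spark's standard `SparkConf`
API; the app name is illustrative, and this needs a Spark installation to
run):

```scala
// Local mode packs N task slots into a single JVM via the master URL.
import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf()
  .setMaster("local[16]")          // 16 concurrent task slots, one JVM
  .setAppName("local-slots-demo")  // illustrative name
val sc = new SparkContext(conf)
```

In standalone mode there is no equivalent master-URL knob; the slot count
comes from the workers' configuration instead.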
Yes, executors run one task per core of your machine by default. You can also
manually launch them with more worker threads than you have cores. What cluster
manager are you on?
Matei
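Matei's suggestion of launching workers with more threads than physical
cores maps, in standalone mode, to the worker's advertised core count. A
configuration sketch (values illustrative):

```
# conf/spark-env.sh on each standalone worker: advertise 16 task slots
# even if the machine has fewer physical cores.
SPARK_WORKER_CORES=16
```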
On August 29, 2014 at 11:24:33 AM, Victor Tso-Guillen (v...@paxata.com) wrote:
Standalone. I'd love to tell it that my one executor can serve, say, 16
tasks at once for an arbitrary number of distinct jobs.
On Fri, Aug 29, 2014 at 11:29 AM, Matei Zaharia matei.zaha...@gmail.com
wrote:
Any more thoughts on this? I'm not sure how to do this yet.
On Fri, Aug 29, 2014 at 12:10 PM, Victor Tso-Guillen v...@paxata.com
wrote: