Hello!
 I've read the documentation about the Spark architecture, and I have the
following questions:
1. How many executors can run on a single worker process (JVM)?
2. Should I think of an executor like a Java thread-pool executor, where the
pool size equals the number of cores given to the worker (set via
SPARK_WORKER_CORES)?
3. If a worker can have many executors, how does Spark handle this? Or can I
configure the number of executors per JVM myself? (A sketch of the settings
I mean follows below.)
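For reference, this is a rough sketch of the standalone-mode settings I am
referring to; the host name, jar name, and numbers are just placeholders I
made up:

    # conf/spark-env.sh on each worker machine
    SPARK_WORKER_CORES=8        # cores this worker offers to executors
    SPARK_WORKER_MEMORY=16g     # memory this worker offers to executors
    SPARK_WORKER_INSTANCES=1    # number of worker processes (JVMs) per machine

    # per-application settings when submitting (placeholder master/jar):
    spark-submit \
      --master spark://master:7077 \
      --executor-memory 4g \
      --total-executor-cores 8 \
      my-app.jar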

I look forward to your answers.
  Regards,
  Florin
