Hi,

I don't know much about your particular use case, but most (if not all) of
the Spark command-line parameters can also be specified as properties.
You should try using

SparkLauncher.setConf("spark.executor.instances", "3")
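
For context, a minimal sketch of launching on YARN with that property set. The jar path and main class below are placeholders for your own application; the rest uses the standard SparkLauncher builder API:

```java
import org.apache.spark.launcher.SparkLauncher;

public class LaunchOnYarn {
    public static void main(String[] args) throws Exception {
        Process spark = new SparkLauncher()
            .setAppResource("/path/to/my-app.jar")      // placeholder: your application jar
            .setMainClass("com.example.MyApp")          // placeholder: your main class
            .setMaster("yarn-cluster")
            // Property equivalent of the --num-executors command-line flag:
            .setConf("spark.executor.instances", "3")
            .launch();
        spark.waitFor();
    }
}
```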

HTH,
Luc

Luc Bourlier
*Spark Team  - Typesafe, Inc.*
[email protected]

<http://www.typesafe.com>

On Wed, Oct 21, 2015 at 4:10 AM, [email protected] <
[email protected]> wrote:

> Hi all,
>  I want to launch spark job on yarn by java, but it seemes that there is
> no way to set numExecutors int the class SparkLauncher. Is there any way to
> set numExecutors ?
> Thanks
>
> ------------------------------
> [email protected]
>
