Re: Retrieve the count of spark nodes

2022-06-09 Thread Poorna Murali
Thanks, Stephen! I will try this out.

On Thu, 9 Jun 2022, 6:02 am Stephen Coy wrote: [quoted message trimmed; the full text appears in the message below]

Re: Retrieve the count of spark nodes

2022-06-08 Thread Stephen Coy
Hi there,

We use something like:

    /*
     * Force Spark to initialise the defaultParallelism by executing a dummy
     * parallel operation and then return the resulting defaultParallelism.
     */
    private int getWorkerCount(SparkContext sparkContext) {
        // The archive truncates the message after "List.of(1, 2, 3,"; the
        // rest of the method body is a reconstruction consistent with the
        // comment above: run a throwaway job, then read defaultParallelism.
        sparkContext.parallelize(List.of(1, 2, 3, 4)).collect();
        return sparkContext.defaultParallelism();
    }

Retrieve the count of spark nodes

2022-06-08 Thread Poorna Murali
Hi,

I would like to know whether it is possible to get the count of live master and worker Spark nodes running in a system. Please help clarify this.

Thanks,
Poorna
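For a standalone Spark cluster, one commonly used option (not mentioned in this thread) is the master's JSON status endpoint, typically served at http://<master-host>:8080/json, whose payload includes a "workers" array with a "state" field per worker. The sketch below counts ALIVE workers in such a payload. It is a minimal illustration: the embedded sample JSON is hypothetical, the exact field names and formatting of a real payload may differ, and a real client would fetch the URL and use a proper JSON library rather than a substring scan.

```java
// Hypothetical sketch: count ALIVE workers from a Spark standalone master's
// JSON status payload. The sample below is illustrative, not captured from
// a real cluster.
public class WorkerCount {

    static int countAliveWorkers(String json) {
        // Normalise whitespace so the naive substring scan is not sensitive
        // to the server's pretty-printing. A real client should parse the
        // JSON properly (e.g. with Jackson) instead of scanning strings.
        String compact = json.replaceAll("\\s+", "");
        String needle = "\"state\":\"ALIVE\"";
        int count = 0;
        int idx = 0;
        while ((idx = compact.indexOf(needle, idx)) != -1) {
            count++;
            idx += needle.length();
        }
        return count;
    }

    public static void main(String[] args) {
        // Illustrative payload shaped like the master's /json response.
        String sample = "{ \"workers\" : ["
            + "{ \"host\" : \"w1\", \"state\" : \"ALIVE\" },"
            + "{ \"host\" : \"w2\", \"state\" : \"ALIVE\" },"
            + "{ \"host\" : \"w3\", \"state\" : \"DEAD\" } ] }";
        System.out.println(countAliveWorkers(sample)); // prints 2
    }
}
```

This sidesteps starting a Spark job entirely, at the cost of depending on the master's web UI being reachable and on its (unversioned) JSON format.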