Re: Retrieve the count of spark nodes
Hi there,

We use something like:

/*
 * Force Spark to initialise the defaultParallelism by executing a dummy
 * parallel operation, then return the resulting defaultParallelism.
 */
private int getWorkerCount(JavaSparkContext sparkContext) {
    sparkContext.parallelize(List.of(1, 2, 3)).count();
    return sparkContext.defaultParallelism();
}
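For what it's worth, another option is to ask Spark's status tracker for the live executors instead of inferring from defaultParallelism. This is only a sketch (it assumes Spark 1.2+, a standalone build of the class, and that the app is submitted to a running cluster); getExecutorInfos() includes the driver, so subtract one to approximate the worker executor count:

```java
import org.apache.spark.SparkConf;
import org.apache.spark.SparkExecutorInfo;
import org.apache.spark.api.java.JavaSparkContext;

public class ExecutorCount {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("executor-count");
        try (JavaSparkContext sc = new JavaSparkContext(conf)) {
            // getExecutorInfos() reports the driver plus every live executor.
            SparkExecutorInfo[] infos = sc.statusTracker().getExecutorInfos();
            // Subtract one for the driver to approximate the worker executors.
            System.out.println("live executors (excluding driver): "
                    + (infos.length - 1));
        }
    }
}
```

Note this counts executors, not physical worker machines; with multiple executors per worker the two numbers differ.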
Retrieve the count of spark nodes
Hi,

I would like to know whether it is possible to get the count of live master and worker Spark nodes running in a cluster. Please help clarify.

Thanks,
Poorna