Thanks Stephen! I will try this out.
On Thu, 9 Jun 2022 at 6:02 am, Stephen Coy wrote:
Hi there,
We use something like:
/*
 * Force Spark to initialise the defaultParallelism by executing a dummy
 * parallel operation and then return the resulting defaultParallelism.
 */
private int getWorkerCount(JavaSparkContext sparkContext) {
    // The dummy job forces Spark to launch its executors before we read the value.
    sparkContext.parallelize(List.of(1, 2, 3, 4)).count();
    return sparkContext.defaultParallelism();
}
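If you run the standalone cluster manager, another option (not mentioned in the thread) is to query the master's web UI, which serves its status as JSON at `/json` and reports an `aliveworkers` count. This is a sketch under that assumption; `WorkerCount`, the host name, and the sample payload below are illustrative, and the field name may vary across Spark versions, so check your master's actual `/json` output.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class WorkerCount {

    // Matches the "aliveworkers" field in the master's /json status payload.
    private static final Pattern ALIVE_WORKERS =
            Pattern.compile("\"aliveworkers\"\\s*:\\s*(\\d+)");

    /** Extracts the alive-worker count from the master's JSON status string. */
    static int parseAliveWorkers(String json) {
        Matcher m = ALIVE_WORKERS.matcher(json);
        if (!m.find()) {
            throw new IllegalArgumentException("no aliveworkers field in response");
        }
        return Integer.parseInt(m.group(1));
    }

    /** Fetches the master's /json status page and returns the alive-worker count. */
    static int fetchAliveWorkers(String masterUiUrl) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder(URI.create(masterUiUrl + "/json")).build();
        HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());
        return parseAliveWorkers(response.body());
    }

    public static void main(String[] args) {
        // Offline demonstration against a sample payload; a real call would be
        // fetchAliveWorkers("http://spark-master-host:8080") (hypothetical host).
        String sample = "{\"url\":\"spark://master:7077\",\"aliveworkers\":3,\"status\":\"ALIVE\"}";
        System.out.println(parseAliveWorkers(sample)); // prints 3
    }
}
```

Unlike the defaultParallelism trick, this does not require a running SparkContext, so it can also answer the "is the master alive" half of the question: if the HTTP request fails, the master itself is down.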
Hi,
I would like to know if it is possible to get the count of live master and
worker Spark nodes running in a system.
Please help clarify.
Thanks,
Poorna