Hello, I'm running Spark on a cluster and I want to monitor how many nodes/cores are active at different (specific) points in the program.
Is there any way to do this? Thanks, Isca
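One way to check this from the driver at specific points is `SparkContext.getExecutorMemoryStatus`, which returns one entry per block manager (each executor plus the driver). Below is a hedged sketch in Scala, assuming a live `SparkContext` named `sc`; the helper name `logClusterStatus` and the label strings are my own, not part of any Spark API:

```scala
import org.apache.spark.SparkContext

// Sketch: call this at the specific points in the program you want to inspect.
// getExecutorMemoryStatus keys are "host:port" strings, one per block manager
// (all executors plus the driver itself), so subtract 1 for executor count.
def logClusterStatus(sc: SparkContext, label: String): Unit = {
  val blockManagers = sc.getExecutorMemoryStatus.keys
  val executorCount = blockManagers.size - 1  // exclude the driver's entry
  println(s"[$label] active executors: $executorCount, " +
    s"default parallelism: ${sc.defaultParallelism}")
  blockManagers.foreach(bm => println(s"[$label]   $bm"))
}
```

For per-executor detail (host, port, number of running tasks) there is also `sc.statusTracker.getExecutorInfos`, and outside the program the Spark UI on the driver (port 4040 by default) exposes the same information, including a REST endpoint at `/api/v1/applications/<app-id>/executors`. Note that with dynamic allocation enabled the executor count can change between the points where you sample it.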