It looks like an issue with your cluster setup. Can you paste your
conf/spark-env.sh and conf/slaves files here?
Your job runs fine because you set the master inside the job to
local[*], which makes Spark run in local mode (not in standalone
cluster mode), so the cluster configuration is never used.
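One common fix is to leave the master out of the code entirely and pass it with spark-submit, so the same jar runs in either mode. A sketch (the class name, jar name, and spark://master-host:7077 URL below are placeholders for your actual values):

```shell
# Submit the same jar twice; only the --master flag changes.
# spark://master-host:7077 stands in for your standalone master URL.

# Local mode (what the job currently does via local[*]):
spark-submit --class SimpleCount --master "local[*]" simple-count.jar

# Standalone cluster mode:
spark-submit --class SimpleCount --master spark://master-host:7077 simple-count.jar
```

If the code also calls setMaster(...) on its SparkConf, that value wins over the flag, so remove the hardcoded master from the code first.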
Thanks,
Best Regards
I have the following simple example program:
public class SimpleCount {
    public static void main(String[] args) {
        final String master = System.getProperty("spark.master", "local[*]");
        System.out.printf("Running job against spark master %s ...%n", master);
        final SparkCo