Hi,

I am trying to submit a job on Spark 1.4 (with Spark Master):

bin/spark-submit --master spark://<address>:7077 --driver-memory 4g \
  --executor-memory 4g --executor-cores 4 --num-executors 1 \
  spark/examples/src/main/python/pi.py 6

which returns:

ERROR SparkDeploySchedulerBackend: Asked to remove non-existent executor <a
steadily growing integer>

On the Spark Master UI I can see that the job is indeed submitted. Running it
locally works fine.

I am behind NAT. From what I have read, this might mean the Master cannot
reach the Worker nodes (or the driver). Could that be the case? Any ideas on
how to resolve it?
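In case it helps, this is roughly what I was considering trying: explicitly telling Spark which address the driver should advertise, so that executors behind the NAT can connect back to it. This is only a sketch; `<public-or-routable-ip>` is a placeholder for whatever address the cluster can actually reach, and I am not sure these are the right knobs on 1.4:

```shell
# Sketch: hint the driver's externally reachable address so executors
# can connect back to it (placeholder IP, untested on my setup).
export SPARK_LOCAL_IP=<public-or-routable-ip>

bin/spark-submit --master spark://<address>:7077 \
  --conf spark.driver.host=<public-or-routable-ip> \
  --driver-memory 4g --executor-memory 4g --executor-cores 4 \
  spark/examples/src/main/python/pi.py 6
```

Would something along these lines be the right direction, or is the problem elsewhere?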

Cheers,
Lucas

