Hi, Bruno!

You can send a message to an MQTT topic when the job finishes. This can be
done with the help of the Mist service (https://github.com/Hydrospheredata/mist)
or in a similar way.
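
A minimal sketch of that idea, assuming the paho-mqtt client library and
placeholder broker/topic names (Mist would wire this up for you in its own way):

    import paho.mqtt.publish as publish

    def notify_job_finished(status):
        # Publish a single message once the Spark job is done.
        # "spark/jobs/my-job/status" and "broker.example.com" are placeholders.
        publish.single(
            topic="spark/jobs/my-job/status",
            payload=status,                 # e.g. "SUCCESS" or "FAILED"
            hostname="broker.example.com",
            port=1883,
        )

You would call notify_job_finished("SUCCESS") as the last step of the job,
and have whatever monitors the topic react to the message.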

Regards,

Leonid


On 7 Dec 2016 at 6:03, "Bruno Faria" <brunocf...@hotmail.com> wrote:

I have a Python Spark job that runs successfully but never ends (never releases
the prompt). I get messages like "releasing accumulator" but never the expected
shutdown message, and the prompt is not released.


To handle this I used sys.exit(0). Now the job ends, but the tasks always
appear as KILLED, and I can't control or monitor whether the job ended
successfully or not.
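
Roughly what I'm doing now (simplified; the app name and structure are just
illustrative):

    import sys
    from pyspark import SparkContext

    sc = SparkContext(appName="my-job")

    # ... job logic ...

    # Without this the prompt is never released; with it the application
    # shows up as KILLED instead of FINISHED.
    sys.exit(0)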


Basically I have 2 questions:

1 - Is sys.exit(0) the best way to end a job, or am I missing something?
(I heard sc.stop() is not a good approach.)

2 - How can I make sure the job finished successfully or not? (The idea is to
use Airflow to monitor that.)


Any help is really appreciated.


Thanks
