Re: Terminate job without killing

2016-12-06 Thread Leonid Blokhin
Hi, Bruno!

You can publish a message to an MQTT topic when the job finishes. This can be
done with the help of the Mist service https://github.com/Hydrospheredata/mist,
or in a similar way.
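
For example, here is a minimal sketch in Python using the paho-mqtt client to
publish a "finished" message once the job completes; the broker host and topic
name are only placeholders, and Mist provides its own way to wire this up:

    import paho.mqtt.publish as publish

    # Publish a single status message after the Spark job has completed.
    # Broker host and topic are placeholders -- adjust for your setup.
    publish.single(
        topic="spark/jobs/my_job/status",
        payload="finished",
        hostname="mqtt.example.com",
    )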

Regards,

Leonid


On 7 Dec 2016 at 6:03, "Bruno Faria" wrote:

I have a Python Spark job that runs successfully but never ends (never
releases the prompt). I see messages like "releasing accumulator", but never
the expected shutdown message, and the prompt is not released.


To handle this I used sys.exit(0). It works now, but the task always appears
as KILLED, so I can't control or monitor whether the job ended successfully
or not.
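
In case it helps, a simplified sketch of what the driver script looks like now
(the app name and job logic are placeholders):

    import sys
    from pyspark import SparkContext

    sc = SparkContext(appName="my_job")  # placeholder app name

    # ... actual job logic here ...

    sys.exit(0)  # forces the driver to exit, but the app then shows up as KILLED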


Basically I have 2 questions

1 - Is sys.exit(0) the best way to end a job, or am I missing something
(I heard sc.stop() is not a good approach)?

2 - How can I make sure the job finished successfully or not (the idea is to
use Airflow to monitor that; see the rough sketch below)?
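
A rough sketch of the kind of Airflow task I have in mind, assuming the job is
launched via spark-submit and Airflow just watches the exit code (DAG name and
script path are placeholders):

    from datetime import datetime
    from airflow import DAG
    from airflow.operators.bash_operator import BashOperator

    dag = DAG("spark_job_monitor", start_date=datetime(2016, 12, 1),
              schedule_interval="@daily")

    # BashOperator marks the task as failed if spark-submit exits non-zero.
    run_job = BashOperator(
        task_id="run_spark_job",
        bash_command="spark-submit /path/to/my_job.py",  # placeholder path
        dag=dag,
    )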


Any help is really appreciated.


Thanks


Single context Spark from Python and Scala

2016-02-15 Thread Leonid Blokhin
Hello

I want to work with a single Spark context from both Python and Scala. Is it
possible?

For example, is it possible to share one context between a running
./bin/pyspark and ./bin/spark-shell?


Cheers,

Leonid