Hi all,

Cancellation seems to be supported at the application level. In other words, you
can call stop() on your instance of SparkContext in order to stop the
computation associated with that SparkContext. Is there any way to cancel a
single job? (To be clear, a job is "a parallel computation consisting of
multiple tasks that gets spawned in response to a Spark action", as defined on
the Spark website.) The current RDD API doesn't seem to provide this
functionality, but I'm wondering if there is any way to do anything similar.
I'd like to be able to cancel a long-running job that turns out to be
unnecessary without shutting down the SparkContext.
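
To make the question concrete, here is a rough Scala sketch of what I can do
today versus what I'd like to do. The local master URL, the dummy workload,
and the final cancelJob call are all just illustrative; as far as I can tell,
that last call does not exist in the current API.

    import org.apache.spark.SparkContext

    object CancellationSketch {
      def main(args: Array[String]): Unit = {
        // Illustrative local setup.
        val sc = new SparkContext("local[4]", "cancellation-sketch")

        // Kick off a long-running action on a background thread so the
        // driver thread stays free to cancel it.
        val worker = new Thread(new Runnable {
          def run(): Unit = {
            sc.parallelize(1 to 10000000)
              .map { i => Thread.sleep(1); i } // stand-in for expensive work
              .count()                         // action that spawns the job
          }
        })
        worker.start()

        Thread.sleep(5000)

        // Application-level cancellation: the only hook I see today.
        // This tears down the whole SparkContext, not just the one job.
        sc.stop()

        // What I'd like instead (hypothetical, not in the current RDD API):
        // sc.cancelJob(jobId)
      }
    }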

If there is no way to simulate the cancellation currently, is there any plan
to support this functionality? Or, is this just not part of the design or
desired uses of SparkContext?

Thanks!

Mingyu

