Repository: spark
Updated Branches:
  refs/heads/master 42279bff6 -> 69f539140


[SPARK-16398][CORE] Make cancelJob and cancelStage APIs public

## What changes were proposed in this pull request?

Make the SparkContext `cancelJob` and `cancelStage` APIs public. This allows 
applications to use a `SparkListener` to do their own management of jobs via 
scheduler events, without going through the REST API. A sketch of this pattern 
follows below.
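For illustration, a minimal sketch of the kind of listener-driven job management this enables. The `JobTimeoutListener` class, the `maxRuntimeMs` parameter, and the periodic `cancelOverdueJobs` call are hypothetical and not part of this patch; only `SparkContext.cancelJob` (made public here) and the standard `SparkListener` events are from Spark itself.

```scala
import scala.collection.concurrent.TrieMap

import org.apache.spark.SparkContext
import org.apache.spark.scheduler.{SparkListener, SparkListenerJobEnd, SparkListenerJobStart}

// Hypothetical example: track running jobs via listener events and cancel any
// job that has run longer than a deadline, using the now-public cancelJob API.
class JobTimeoutListener(sc: SparkContext, maxRuntimeMs: Long) extends SparkListener {

  // jobId -> submission time, populated and cleared by listener events
  private val startTimes = TrieMap.empty[Int, Long]

  override def onJobStart(jobStart: SparkListenerJobStart): Unit =
    startTimes.put(jobStart.jobId, jobStart.time)

  override def onJobEnd(jobEnd: SparkListenerJobEnd): Unit =
    startTimes.remove(jobEnd.jobId)

  // Intended to be called periodically by the application, e.g. from a
  // scheduled thread (the scheduling mechanism is up to the application).
  def cancelOverdueJobs(): Unit = {
    val now = System.currentTimeMillis()
    for ((jobId, startedAt) <- startTimes if now - startedAt > maxRuntimeMs) {
      sc.cancelJob(jobId)  // public after this change; previously private[spark]
    }
  }
}

// Registration, assuming the @DeveloperApi addSparkListener method:
//   sc.addSparkListener(new JobTimeoutListener(sc, maxRuntimeMs = 60000L))
```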

## How was this patch tested?

Existing tests (dev/run-tests)

Author: MasterDDT <mite...@live.com>

Closes #14072 from MasterDDT/SPARK-16398.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/69f53914
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/69f53914
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/69f53914

Branch: refs/heads/master
Commit: 69f5391408b779a400b553344fd61051004685fc
Parents: 42279bf
Author: MasterDDT <mite...@live.com>
Authored: Wed Jul 6 22:47:40 2016 -0700
Committer: Reynold Xin <r...@databricks.com>
Committed: Wed Jul 6 22:47:40 2016 -0700

----------------------------------------------------------------------
 .../scala/org/apache/spark/SparkContext.scala     | 18 ++++++++++++++----
 1 file changed, 14 insertions(+), 4 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/69f53914/core/src/main/scala/org/apache/spark/SparkContext.scala
----------------------------------------------------------------------
diff --git a/core/src/main/scala/org/apache/spark/SparkContext.scala b/core/src/main/scala/org/apache/spark/SparkContext.scala
index fe15052..57d1f09 100644
--- a/core/src/main/scala/org/apache/spark/SparkContext.scala
+++ b/core/src/main/scala/org/apache/spark/SparkContext.scala
@@ -2011,13 +2011,23 @@ class SparkContext(config: SparkConf) extends Logging with ExecutorAllocationCli
     dagScheduler.cancelAllJobs()
   }
 
-  /** Cancel a given job if it's scheduled or running */
-  private[spark] def cancelJob(jobId: Int) {
+  /**
+   * Cancel a given job if it's scheduled or running.
+   *
+   * @param jobId the job ID to cancel
+   * @throws InterruptedException if the cancel message cannot be sent
+   */
+  def cancelJob(jobId: Int) {
     dagScheduler.cancelJob(jobId)
   }
 
-  /** Cancel a given stage and all jobs associated with it */
-  private[spark] def cancelStage(stageId: Int) {
+  /**
+   * Cancel a given stage and all jobs associated with it.
+   *
+   * @param stageId the stage ID to cancel
+   * @throws InterruptedException if the cancel message cannot be sent
+   */
+  def cancelStage(stageId: Int) {
     dagScheduler.cancelStage(stageId)
   }
 

