[jira] [Commented] (SPARK-3874) Provide stable TaskContext API

2014-10-14  Apache Spark (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-3874?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14171095#comment-14171095 ]

Apache Spark commented on SPARK-3874:
-

User 'pwendell' has created a pull request for this issue:
https://github.com/apache/spark/pull/2803

> Provide stable TaskContext API
> --
>
> Key: SPARK-3874
> URL: https://issues.apache.org/jira/browse/SPARK-3874
> Project: Spark
>  Issue Type: Improvement
>  Components: Spark Core
>Reporter: Patrick Wendell
>Assignee: Prashant Sharma
>
> We made some improvements in SPARK-3543 but for Spark 1.2 we should convert 
> TaskContext into a fully stable API. To do this I’d suggest the following 
> changes - note that some of this reverses parts of SPARK-3543. The goal is to 
> provide a class that users can’t easily construct and that exposes only the 
> public functionality.
> 1. Separate TaskContext into a public abstract class (TaskContext) and a 
> private implementation called TaskContextImpl. The former should be a Java 
> abstract class - the latter should be a private[spark] Scala class to reduce 
> visibility (or maybe we can keep it as Java and tell people not to use it?).
> 2. The TaskContext abstract class will have (NOTE: this intentionally changes 
> getXX() to XX()):
> public isCompleted()
> public isInterrupted()
> public addTaskCompletionListener(...)
> public addTaskCompletionCallback(...) (deprecated)
> public stageId()
> public partitionId()
> public attemptId()
> public isRunningLocally()
> STATIC
> public get() 
> set() and unset() at default visibility
> 3. A new private[spark] static object TaskContextHelper in the same package 
> as TaskContext will exist to expose set() and unset() from within Spark using 
> forwarder methods that just call TaskContext.set(). If someone within Spark 
> wants to set this they call TaskContextHelper.set() and it forwards it.
> 4. TaskContextImpl will be used whenever we construct a TaskContext 
> internally.
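
To make the proposed split concrete, here is a rough Scala sketch of the layout
described in the four points above. It is not the actual Spark source: the
proposal asks for TaskContext itself to be a Java abstract class, and the
TaskCompletionListener trait, the constructor parameters of TaskContextImpl, the
ThreadLocal holder behind get()/set()/unset(), and the markTaskCompleted() /
markInterrupted() helpers are illustrative assumptions.

{code:scala}
package org.apache.spark

import scala.collection.mutable.ArrayBuffer

// Hypothetical stand-in for the listener interface taken by
// addTaskCompletionListener(...); not the real Spark definition.
trait TaskCompletionListener {
  def onTaskCompletion(context: TaskContext): Unit
}

// 1./2. Public abstract class: users see only this surface and cannot
// construct it directly.
abstract class TaskContext extends Serializable {
  def isCompleted(): Boolean
  def isInterrupted(): Boolean
  def isRunningLocally(): Boolean
  def addTaskCompletionListener(listener: TaskCompletionListener): TaskContext
  @deprecated("use addTaskCompletionListener", "1.2.0")
  def addTaskCompletionCallback(f: TaskContext => Unit): Unit
  def stageId(): Int
  def partitionId(): Int
  def attemptId(): Long
}

object TaskContext {
  // Thread-local holder so get() returns the context of the task running
  // on the current thread.
  private val taskContext = new ThreadLocal[TaskContext]

  def get(): TaskContext = taskContext.get()

  // "Default visibility" in the Java proposal; approximated here with
  // private[spark].
  private[spark] def set(tc: TaskContext): Unit = taskContext.set(tc)
  private[spark] def unset(): Unit = taskContext.remove()
}

// 3. Forwarder so other Spark packages can set/unset the context without
// widening the visibility of TaskContext.set()/unset().
private[spark] object TaskContextHelper {
  def set(tc: TaskContext): Unit = TaskContext.set(tc)
  def unset(): Unit = TaskContext.unset()
}

// 4. Internal implementation constructed by Spark whenever a task runs.
private[spark] class TaskContextImpl(
    stage: Int,
    partition: Int,
    attempt: Long,
    runningLocally: Boolean = false)
  extends TaskContext {

  @volatile private var completed = false
  @volatile private var interrupted = false
  private val listeners = ArrayBuffer.empty[TaskCompletionListener]

  override def isCompleted(): Boolean = completed
  override def isInterrupted(): Boolean = interrupted
  override def isRunningLocally(): Boolean = runningLocally
  override def stageId(): Int = stage
  override def partitionId(): Int = partition
  override def attemptId(): Long = attempt

  override def addTaskCompletionListener(l: TaskCompletionListener): TaskContext = {
    listeners += l
    this
  }

  override def addTaskCompletionCallback(f: TaskContext => Unit): Unit =
    addTaskCompletionListener(new TaskCompletionListener {
      def onTaskCompletion(context: TaskContext): Unit = f(context)
    })

  // Illustrative hooks for Spark internals to flip task state and fire
  // completion listeners.
  private[spark] def markInterrupted(): Unit = { interrupted = true }
  private[spark] def markTaskCompleted(): Unit = {
    completed = true
    listeners.foreach(_.onTaskCompletion(this))
  }
}
{code}

Under this layout, Spark internals would do something along the lines of
TaskContextHelper.set(new TaskContextImpl(stageId, partitionId, attemptId))
before running a task and TaskContextHelper.unset() afterwards, while user code
only ever sees the abstract TaskContext returned by TaskContext.get().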






[jira] [Commented] (SPARK-3874) Provide stable TaskContext API

2014-10-13  Apache Spark (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-3874?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14169239#comment-14169239 ]

Apache Spark commented on SPARK-3874:
-

User 'ScrapCodes' has created a pull request for this issue:
https://github.com/apache/spark/pull/2782







[jira] [Commented] (SPARK-3874) Provide stable TaskContext API

2014-10-09  Reynold Xin (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-3874?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14164874#comment-14164874 ]

Reynold Xin commented on SPARK-3874:


The proposal LGTM.



