[GitHub] spark pull request: Initial commit to provide pluggable strategy t...

2014-10-20 Thread andrewor14
Github user andrewor14 commented on the pull request:

https://github.com/apache/spark/pull/2849#issuecomment-59807087
  
Hey @olegz, is there an associated JIRA for this? If so, could you include it 
in the title?





[GitHub] spark pull request: Initial commit to provide pluggable strategy t...

2014-10-19 Thread AmplabJenkins
Github user AmplabJenkins commented on the pull request:

https://github.com/apache/spark/pull/2849#issuecomment-59672773
  
Can one of the admins verify this patch?





[GitHub] spark pull request: Initial commit to provide pluggable strategy t...

2014-10-19 Thread olegz
GitHub user olegz opened a pull request:

https://github.com/apache/spark/pull/2849

Initial commit to provide pluggable strategy to facilitate access to nat...

Initial commit to provide pluggable strategy to facilitate access to native 
Hadoop resources

Added HadoopExecutionContext trait and its default implementation 
DefaultHadoopExecutionContext
Modified SparkContext to instantiate and delegate to the instance of 
HadoopExecutionContext where appropriate

Changed HadoopExecutionContext to JobExecutionContext
Changed DefaultHadoopExecutionContext to DefaultExecutionContext
Name changes are due to the fact that when Spark executes outside of Hadoop, 
having Hadoop in the name would be confusing
Added initial documentation and tests

polished scaladoc

annotated JobExecutionContext with @DeveloperApi

eliminated TaskScheduler null checks in favor of NoOpTaskScheduler, 
to be used in cases where execution of the Spark DAG is delegated to an 
external execution environment

added execution-context check to SparkSubmit

Added recognition of execution-context to SparkContext
updated spark-class script to recognize when 'execution-context:' is used

polished merge

changed annotations from @DeveloperApi to @Experimental per a suggestion 
on the PR

externalized persist and unpersist operations

added classpath hooks to spark-class
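
The diff itself is not part of this thread, but the commit messages above outline 
the design: a JobExecutionContext trait, a DefaultExecutionContext fallback, 
externalized persist/unpersist operations, and recognition of an 
'execution-context:' setting. A minimal Scala sketch of that shape, assuming 
illustrative method signatures and a hypothetical fromMaster selector (neither 
taken from the actual patch), might look like this:

    // Minimal sketch only; this is NOT the diff from this pull request. The trait
    // name and the idea of a default implementation come from the description
    // above; every signature and the fromMaster helper are illustrative assumptions.
    import org.apache.hadoop.mapred.InputFormat
    import org.apache.spark.SparkContext
    import org.apache.spark.annotation.Experimental
    import org.apache.spark.rdd.RDD
    import org.apache.spark.storage.StorageLevel

    /**
     * Pluggable strategy for job execution: SparkContext would instantiate one of
     * these and delegate Hadoop-facing and persist/unpersist operations to it, so
     * an external execution environment can be swapped in without changing user code.
     */
    @Experimental
    trait JobExecutionContext {
      def hadoopFile[K, V](
          sc: SparkContext,
          path: String,
          inputFormatClass: Class[_ <: InputFormat[K, V]],
          keyClass: Class[K],
          valueClass: Class[V],
          minPartitions: Int): RDD[(K, V)]

      def persist[T](rdd: RDD[T], level: StorageLevel): RDD[T]

      def unpersist[T](rdd: RDD[T], blocking: Boolean): RDD[T]
    }

    /**
     * Fallback that preserves Spark's standard behavior. In the PR's design
     * SparkContext itself would delegate here; this standalone sketch simply
     * reuses the existing public API to illustrate the shape.
     */
    class DefaultExecutionContext extends JobExecutionContext {
      override def hadoopFile[K, V](
          sc: SparkContext,
          path: String,
          inputFormatClass: Class[_ <: InputFormat[K, V]],
          keyClass: Class[K],
          valueClass: Class[V],
          minPartitions: Int): RDD[(K, V)] =
        sc.hadoopFile(path, inputFormatClass, keyClass, valueClass, minPartitions)

      override def persist[T](rdd: RDD[T], level: StorageLevel): RDD[T] =
        rdd.persist(level)

      override def unpersist[T](rdd: RDD[T], blocking: Boolean): RDD[T] =
        rdd.unpersist(blocking)
    }

    object JobExecutionContext {
      /**
       * Hypothetical selection logic for the 'execution-context:' prefix mentioned
       * above: "execution-context:com.example.TezExecutionContext" would load a
       * custom context reflectively; anything else falls back to the default.
       */
      def fromMaster(master: String): JobExecutionContext =
        if (master.startsWith("execution-context:")) {
          val className = master.stripPrefix("execution-context:")
          Class.forName(className).newInstance().asInstanceOf[JobExecutionContext]
        } else {
          new DefaultExecutionContext
        }
    }

Whether the patch keys the selection off the master URL, a SparkConf property, or 
the spark-class/SparkSubmit arguments is not visible from this thread; the snippet 
only illustrates the general strategy-pattern shape the commit messages describe.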

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/olegz/spark-1 SH-1

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/2849.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2849


commit 84556c86f95500f89bb57f2bcc6c35f025799dc5
Author: Oleg Zhurakousky 
Date:   2014-09-16T15:26:48Z

Initial commit to provide pluggable strategy to facilitate access to native 
Hadoop resources
Added HadoopExecutionContext trait and its default implementation 
DefaultHadoopExecutionContext
Modified SparkContext to instantiate and delegate to the instance of 
HadoopExecutionContext where appropriate

Changed HadoopExecutionContext to JobExecutionContext
Changed DefaultHadoopExecutionContext to DefaultExecutionContext
Name changes are due to the fact that when Spark executes outside of Hadoop, 
having Hadoop in the name would be confusing
Added initial documentation and tests

polished scaladoc

annotated JobExecutionContext with @DeveloperApi

eliminated TaskScheduler null checks in favor of NoOpTaskScheduler, 
to be used in cases where execution of the Spark DAG is delegated to an 
external execution environment

added execution-context check to SparkSubmit

Added recognition of execution-context to SparkContext
updated spark-class script to recognize when 'execution-context:' is used

polished merge

changed annotations from @DeveloperApi to @Experimental per a suggestion 
on the PR

externalized persist and unpersist operations

added classpath hooks to spark-class



