[GitHub] spark issue #20347: [SPARK-20129][Core] JavaSparkContext should use SparkCon...

2018-01-26 Thread rekhajoshm
Github user rekhajoshm commented on the issue:

https://github.com/apache/spark/pull/20347
  
Thank you @srowen, I admire the work you do across all the JIRAs/PRs 
I have studied and followed up on. 
If it's OK, I will keep this PR open for a few days, and close it if the JIRA 
moves to a 'Not an issue'/'Won't fix' state. Thanks.


---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark issue #20347: [SPARK-20129][Core] JavaSparkContext should use SparkCon...

2018-01-26 Thread srowen
Github user srowen commented on the issue:

https://github.com/apache/spark/pull/20347
  
@rekhajoshm I think maybe the right resolution here is to do nothing. I 
haven't heard back from @mengxr on his old JIRA proposing this change. Thank 
you for chasing down open JIRAs like this, of course.


---




[GitHub] spark issue #20347: [SPARK-20129][Core] JavaSparkContext should use SparkCon...

2018-01-23 Thread jiangxb1987
Github user jiangxb1987 commented on the issue:

https://github.com/apache/spark/pull/20347
  
My major concern is that, if there is an existing `SparkContext`, some confs 
you set may not take effect, as described in `SparkContext.getOrCreate()`. It's 
hard to enumerate the use cases, but I'm sure there are some that pass in 
specific confs to create a new `JavaSparkContext`, so I tend to keep the 
current behavior here.

On the other hand, the following is copied from the class comment of 
`JavaSparkContext`:
```
 * Only one SparkContext may be active per JVM.  You must `stop()` the active SparkContext before
 * creating a new one.  This limitation may eventually be removed; see SPARK-2243 for more details.
```
If that is the case, there should be no active `SparkContext` before we 
initiate the `JavaSparkContext`, so the change doesn't bring any advantage in 
that regard.
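The conf-shadowing concern above can be illustrated without Spark itself. Below is a minimal Java sketch of the `getOrCreate` pattern in question (the `Context` and `GetOrCreateDemo` classes are hypothetical stand-ins, not Spark classes): once a context is active, a second caller's configuration is silently ignored, which is exactly why changing the constructor semantics could surprise callers.

```java
import java.util.HashMap;
import java.util.Map;

// Minimal stand-in for a context that holds a configuration (not Spark's API).
class Context {
    final Map<String, String> conf;

    Context(Map<String, String> conf) {
        this.conf = new HashMap<>(conf);
    }

    private static Context active;

    // Mirrors the getOrCreate contract: the conf argument only takes
    // effect if no context is active yet; otherwise it is ignored.
    static synchronized Context getOrCreate(Map<String, String> conf) {
        if (active == null) {
            active = new Context(conf);
        }
        return active;
    }
}

public class GetOrCreateDemo {
    public static void main(String[] args) {
        Map<String, String> first = new HashMap<>();
        first.put("executor.memory", "2g");
        Context a = Context.getOrCreate(first);

        Map<String, String> second = new HashMap<>();
        second.put("executor.memory", "8g"); // silently ignored
        Context b = Context.getOrCreate(second);

        // Same instance, and the second conf never took effect.
        System.out.println(a == b);
        System.out.println(b.conf.get("executor.memory"));
    }
}
```

A constructor that internally calls `getOrCreate` would exhibit the same behavior: `new JavaSparkContext(conf)` could hand back a context whose settings differ from `conf`, with no error.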


---




[GitHub] spark issue #20347: [SPARK-20129][Core] JavaSparkContext should use SparkCon...

2018-01-23 Thread srowen
Github user srowen commented on the issue:

https://github.com/apache/spark/pull/20347
  
Yes, you can already get the new semantics here with `new 
JavaSparkContext(SparkContext.getOrCreate())`.

Yes, probably better to add a new method, or else decide that it's not 
worth a new API method just as a shortcut for the above. Maybe that's the right 
conclusion, unless @mengxr comes back with a particular reason to change the 
behavior slightly.


---




[GitHub] spark issue #20347: [SPARK-20129][Core] JavaSparkContext should use SparkCon...

2018-01-22 Thread jerryshao
Github user jerryshao commented on the issue:

https://github.com/apache/spark/pull/20347
  
Using `getOrCreate` in the constructor seems to change the semantics. Maybe we 
can add a new static method to `JavaSparkContext` for such usage.
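Such a static method could look like the following sketch. This is an illustration only, not the actual Spark API: the method name `getOrCreate` and its placement on `JavaSparkContext` are assumptions, and the snippet depends on the real Spark core classes, so it is not runnable stand-alone.

```
// Hypothetical static factory on JavaSparkContext (not part of Spark's API).
// Delegates to SparkContext.getOrCreate, leaving the existing
// constructors' semantics untouched.
public static JavaSparkContext getOrCreate(SparkConf conf) {
    return new JavaSparkContext(SparkContext.getOrCreate(conf));
}
```

This would give Java users the same one-line shortcut mentioned earlier in the thread, without changing the behavior of any existing constructor.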


---




[GitHub] spark issue #20347: [SPARK-20129][Core] JavaSparkContext should use SparkCon...

2018-01-22 Thread srowen
Github user srowen commented on the issue:

https://github.com/apache/spark/pull/20347
  
@mengxr suggested this in the JIRA originally -- what was the reasoning? It 
makes some sense, but so does leaving the current behavior, where a constructor 
calls a constructor. It's a behavior change, albeit a slight one. 


---




[GitHub] spark issue #20347: [SPARK-20129][Core] JavaSparkContext should use SparkCon...

2018-01-22 Thread jerryshao
Github user jerryshao commented on the issue:

https://github.com/apache/spark/pull/20347
  
Can you please explain why we need to change to `getOrCreate`?


---




[GitHub] spark issue #20347: [SPARK-20129][Core] JavaSparkContext should use SparkCon...

2018-01-21 Thread AmplabJenkins
Github user AmplabJenkins commented on the issue:

https://github.com/apache/spark/pull/20347
  
Merged build finished. Test PASSed.


---




[GitHub] spark issue #20347: [SPARK-20129][Core] JavaSparkContext should use SparkCon...

2018-01-21 Thread AmplabJenkins
Github user AmplabJenkins commented on the issue:

https://github.com/apache/spark/pull/20347
  
Test PASSed.
Refer to this link for build results (access rights to CI server needed): 
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/86456/
Test PASSed.


---




[GitHub] spark issue #20347: [SPARK-20129][Core] JavaSparkContext should use SparkCon...

2018-01-21 Thread SparkQA
Github user SparkQA commented on the issue:

https://github.com/apache/spark/pull/20347
  
**[Test build #86456 has 
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/86456/testReport)**
 for PR 20347 at commit 
[`b1ae512`](https://github.com/apache/spark/commit/b1ae5125f65e0d8a59a4006a9777ed5185a758c9).
 * This patch passes all tests.
 * This patch merges cleanly.
 * This patch adds no public classes.


---




[GitHub] spark issue #20347: [SPARK-20129][Core] JavaSparkContext should use SparkCon...

2018-01-21 Thread SparkQA
Github user SparkQA commented on the issue:

https://github.com/apache/spark/pull/20347
  
**[Test build #86456 has 
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/86456/testReport)**
 for PR 20347 at commit 
[`b1ae512`](https://github.com/apache/spark/commit/b1ae5125f65e0d8a59a4006a9777ed5185a758c9).


---




[GitHub] spark issue #20347: [SPARK-20129][Core] JavaSparkContext should use SparkCon...

2018-01-21 Thread AmplabJenkins
Github user AmplabJenkins commented on the issue:

https://github.com/apache/spark/pull/20347
  
Merged build finished. Test PASSed.


---




[GitHub] spark issue #20347: [SPARK-20129][Core] JavaSparkContext should use SparkCon...

2018-01-21 Thread AmplabJenkins
Github user AmplabJenkins commented on the issue:

https://github.com/apache/spark/pull/20347
  
Test PASSed.
Refer to this link for build results (access rights to CI server needed): 

https://amplab.cs.berkeley.edu/jenkins//job/testing-k8s-prb-make-spark-distribution/84/
Test PASSed.


---
