[GitHub] spark issue #14421: [SPARK-16816] Add api to get JavaSparkContext from Spark...

2016-07-31 Thread phalodi
Github user phalodi commented on the issue: https://github.com/apache/spark/pull/14421 @srowen @rxin @petermaxlee I will close this pull request and create a new one with the documentation changes, and also update the JIRA issue.

[GitHub] spark issue #14421: [SPARK-16816] Add api to get JavaSparkContext from Spark...

2016-07-31 Thread srowen
Github user srowen commented on the issue: https://github.com/apache/spark/pull/14421 @phalodi these changes don't seem to be what Reynold suggested. You kept your old change and added some example change (?) but didn't update the documentation.

[GitHub] spark issue #14421: [SPARK-16816] Add api to get JavaSparkContext from Spark...

2016-07-31 Thread phalodi
Github user phalodi commented on the issue: https://github.com/apache/spark/pull/14421 @srowen @rxin @petermaxlee I made the changes as I thought best; you can suggest something else if you have a better idea, or just merge it if it looks good.

[GitHub] spark issue #14421: [SPARK-16816] Add api to get JavaSparkContext from Spark...

2016-07-31 Thread phalodi
Github user phalodi commented on the issue: https://github.com/apache/spark/pull/14421 @rxin OK, I will look into it soon and make the changes.

[GitHub] spark issue #14421: [SPARK-16816] Add api to get JavaSparkContext from Spark...

2016-07-31 Thread rxin
Github user rxin commented on the issue: https://github.com/apache/spark/pull/14421 (hence the lower case)

[GitHub] spark issue #14421: [SPARK-16816] Add api to get JavaSparkContext from Spark...

2016-07-31 Thread rxin
Github user rxin commented on the issue: https://github.com/apache/spark/pull/14421 I meant update the sparkContext field in SparkSession.

[GitHub] spark issue #14421: [SPARK-16816] Add api to get JavaSparkContext from Spark...

2016-07-31 Thread phalodi
Github user phalodi commented on the issue: https://github.com/apache/spark/pull/14421 @rxin But users read the docs from the beginning, and when they read about sparkContext they don't yet know about SparkSession, so I think we should just add a single line below the example where we …

[GitHub] spark issue #14421: [SPARK-16816] Add api to get JavaSparkContext from Spark...

2016-07-31 Thread rxin
Github user rxin commented on the issue: https://github.com/apache/spark/pull/14421 Can we update the doc for sparkContext?

[GitHub] spark issue #14421: [SPARK-16816] Add api to get JavaSparkContext from Spark...

2016-07-31 Thread phalodi
Github user phalodi commented on the issue: https://github.com/apache/spark/pull/14421 @petermaxlee @rxin OK, so I will go ahead and make the documentation changes! Where do you suggest is the right place to add this?

[GitHub] spark issue #14421: [SPARK-16816] Add api to get JavaSparkContext from Spark...

2016-07-31 Thread rxin
Github user rxin commented on the issue: https://github.com/apache/spark/pull/14421 Yea, seems like we should just update the documentation for this one.

[GitHub] spark issue #14421: [SPARK-16816] Add api to get JavaSparkContext from Spark...

2016-07-30 Thread petermaxlee
Github user petermaxlee commented on the issue: https://github.com/apache/spark/pull/14421 Isn't this just

```
new JavaSparkContext(session.sparkContext)
```

? Perhaps we should just update the documentation to say that.
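
For reference, a minimal self-contained sketch of the approach petermaxlee describes; the object name, appName, and master value are illustrative placeholders, not from the PR:

```scala
import java.util.Arrays

import org.apache.spark.api.java.JavaSparkContext
import org.apache.spark.sql.SparkSession

object JavaContextFromSession {
  def main(args: Array[String]): Unit = {
    val session = SparkSession.builder()
      .appName("java-context-from-session") // placeholder app name
      .master("local[*]")                   // placeholder master for a local run
      .getOrCreate()

    // JavaSparkContext is a thin wrapper around the session's existing
    // SparkContext; constructing it does not start a second context.
    val jsc = new JavaSparkContext(session.sparkContext)

    // The wrapper is used like a directly constructed JavaSparkContext.
    val rdd = jsc.parallelize(Arrays.asList(1, 2, 3))
    println(rdd.count()) // prints 3

    session.stop()
  }
}
```

This is the crux of the thread: the existing JavaSparkContext(SparkContext) constructor already covers the use case, so the discussion converges on documenting it rather than adding a new API.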