[GitHub] spark pull request #15134: [SPARK-17580][CORE]Add random UUID as app name wh...

2016-09-19 Thread phalodi
Github user phalodi closed the pull request at: https://github.com/apache/spark/pull/15134 --- If your project is set up for it, you can reply to this email and have your reply appear on GitHub as well. If your project does not have this feature enabled and wishes so, or if the feature

[GitHub] spark pull request #15133: [SPARK-17578][Docs] Add spark.app.name default va...

2016-09-19 Thread phalodi
Github user phalodi closed the pull request at: https://github.com/apache/spark/pull/15133

[GitHub] spark issue #15134: [SPARK-17580][CORE]Add random UUID as app name while app...

2016-09-19 Thread phalodi
Github user phalodi commented on the issue: https://github.com/apache/spark/pull/15134 @srowen OK, no problem, I will close this PR.

[GitHub] spark issue #15134: [SPARK-17580][CORE]Add random UUID as app name while app...

2016-09-19 Thread phalodi
Github user phalodi commented on the issue: https://github.com/apache/spark/pull/15134 @jerryshao OK, I will change it as you suggested, but then I think that if a UUID is not a good name, we should also change it for SparkSession; as you can see below, SparkSession generates a UUID while

[GitHub] spark issue #15134: [SPARK-17580][CORE]Add random UUID as app name while app...

2016-09-19 Thread phalodi
Github user phalodi commented on the issue: https://github.com/apache/spark/pull/15134 @jerryshao Yes, you are right, Jerry: for spark-submit and the launcher it works, but users also often have the use case of just using Spark locally, such as reading a file faster with Spark, even when we run

[GitHub] spark issue #15134: [SPARK-17580][CORE]Add random UUID as app name while app...

2016-09-19 Thread phalodi
Github user phalodi commented on the issue: https://github.com/apache/spark/pull/15134 @sadikovi Yes it will: if we do not define an app name in the SparkConf object or via spark-submit, it throws an exception. In this PR, a random UUID is generated for the app name if it is not defined
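The fallback described above can be sketched in plain Python (the actual PR changes Scala code in SparkContext; `resolve_app_name` is a hypothetical helper name used only for illustration):

```python
import uuid

def resolve_app_name(conf: dict) -> str:
    """Return the configured app name, or a random fallback.

    Illustrative sketch of the behavior proposed in SPARK-17580: instead of
    raising an error when spark.app.name is missing, generate a unique name.
    """
    name = conf.get("spark.app.name")
    if name:
        return name
    # No name configured: fall back to a random, prefixed UUID.
    return "spark-" + str(uuid.uuid4())
```

With this fallback, `resolve_app_name({})` yields a fresh name such as `spark-<uuid>` on every call, while an explicitly configured name is always returned unchanged.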

[GitHub] spark issue #15134: [SPARK-17580][CORE]Add random UUID as app name while app...

2016-09-18 Thread phalodi
Github user phalodi commented on the issue: https://github.com/apache/spark/pull/15134 @jerryshao Yes, if you suggest it, I will prefix spark- in front of the UUID; but still, when users use the core RDD part and do not need to deal with Datasets, they create a SparkContext, and some people also

[GitHub] spark issue #15134: [SPARK-17580][CORE]Add random UUID as app name while app...

2016-09-18 Thread phalodi
Github user phalodi commented on the issue: https://github.com/apache/spark/pull/15134 @jerryshao We can, I think: suppose that while creating a SparkContext for an application I did not define an app name in its configuration, and at spark-submit time I also did not provide an app name

[GitHub] spark issue #15133: [SPARK-17578][Docs] Add spark.app.name default value for...

2016-09-17 Thread phalodi
Github user phalodi commented on the issue: https://github.com/apache/spark/pull/15133 Also changed this one accordingly: the default value of the app name is now (random) for both session and context. ![random](https://cloud.githubusercontent.com/assets/8075390/18613106/c5e3b420-7d8f

[GitHub] spark issue #15133: [SPARK-17578][Docs] Add spark.app.name default value for...

2016-09-17 Thread phalodi
Github user phalodi commented on the issue: https://github.com/apache/spark/pull/15133 @andrewor14 What do you think about https://github.com/apache/spark/pull/15134? We add a random UUID as the app name when creating a SparkContext if it is not defined.

[GitHub] spark pull request #15134: [SPARK-17580][CORE]Add random UUID as app name wh...

2016-09-17 Thread phalodi
GitHub user phalodi opened a pull request: https://github.com/apache/spark/pull/15134 [SPARK-17580][CORE] Add random UUID as app name when app name is not defined while creating … ## What changes were proposed in this pull request? Assign a random UUID as the app name when the app

[GitHub] spark issue #15133: [SPARK-17578][Docs] Add spark.app.name default value for...

2016-09-17 Thread phalodi
Github user phalodi commented on the issue: https://github.com/apache/spark/pull/15133 @andrewor14 So, as you suggest, we should also change it in the SparkContext code, because right now we must set an app name when creating a SparkContext. So when we add random UUID generation for the default value

[GitHub] spark pull request #15133: [SPARK-17578][Docs] Add spark.app.name default va...

2016-09-17 Thread phalodi
GitHub user phalodi opened a pull request: https://github.com/apache/spark/pull/15133 [SPARK-17578][Docs] Add spark.app.name default value for spark session ## What changes were proposed in this pull request? Modify the spark.app.name configuration entry for SparkSession

[GitHub] spark pull request #15130: remove extra table tags in configuration document

2016-09-17 Thread phalodi
GitHub user phalodi opened a pull request: https://github.com/apache/spark/pull/15130 remove extra table tags in configuration document ## What changes were proposed in this pull request? Remove extra table tags in the configuration document. ## How

[GitHub] spark pull request #14684: [SPARK-17105][CORE] App name will be random UUID ...

2016-08-17 Thread phalodi
Github user phalodi closed the pull request at: https://github.com/apache/spark/pull/14684

[GitHub] spark issue #14684: [SPARK-17105][CORE] App name will be random UUID while c...

2016-08-17 Thread phalodi
Github user phalodi commented on the issue: https://github.com/apache/spark/pull/14684 @srowen OK, I will close this PR as you suggested.

[GitHub] spark issue #14684: [SPARK-17105][CORE] App name will be random UUID while c...

2016-08-17 Thread phalodi
Github user phalodi commented on the issue: https://github.com/apache/spark/pull/14684 @srowen Yes, you are right, but SPARK-16966 is only for SparkSession; when users create a SparkContext, they must still give an app name. I think for both SparkSession and SparkContext

[GitHub] spark pull request #14684: [SPARK-17105][CORE] App name will be random UUID ...

2016-08-17 Thread phalodi
GitHub user phalodi opened a pull request: https://github.com/apache/spark/pull/14684 [SPARK-17105][CORE] App name will be random UUID while creating spark context if it will … ## What changes were proposed in this pull request? The app name will be a random UUID while

[GitHub] spark pull request #14669: Remove api doc link for mapReduceTriplets operato...

2016-08-16 Thread phalodi
GitHub user phalodi opened a pull request: https://github.com/apache/spark/pull/14669 Remove api doc link for mapReduceTriplets operator ## What changes were proposed in this pull request? Remove the API doc link for the mapReduceTriplets operator, because in the latest API

[GitHub] spark issue #14436: [SPARK-16816] Modify java example which is also reflect ...

2016-08-01 Thread phalodi
Github user phalodi commented on the issue: https://github.com/apache/spark/pull/14436 @srowen Yes, I think it looks better: it completes the example as in earlier versions and also indirectly shows how to work with JavaSparkContext.

[GitHub] spark issue #14436: [SPARK-16816] Modify java example which is also reflect...

2016-08-01 Thread phalodi
Github user phalodi commented on the issue: https://github.com/apache/spark/pull/14436 @srowen @rxin Hey guys, I made the changes as you suggested, though not exactly as you suggested: I changed the example, which is also reflected in the documentation. So please review it and give

[GitHub] spark pull request #14436: modify java example which is also reflect in doc...

2016-08-01 Thread phalodi
GitHub user phalodi opened a pull request: https://github.com/apache/spark/pull/14436 Modify java example which is also reflected in documentation example ## What changes were proposed in this pull request? Modify the Java example, which is also reflected in the document

[GitHub] spark issue #14421: [SPARK-16816] Add api to get JavaSparkContext from Spark...

2016-07-31 Thread phalodi
Github user phalodi commented on the issue: https://github.com/apache/spark/pull/14421 @srowen @rxin @petermaxlee I will close this pull request, create a new one with the documentation changes, and also modify the JIRA issue.

[GitHub] spark pull request #14421: [SPARK-16816] Add api to get JavaSparkContext fro...

2016-07-31 Thread phalodi
Github user phalodi closed the pull request at: https://github.com/apache/spark/pull/14421

[GitHub] spark issue #14421: [SPARK-16816] Add api to get JavaSparkContext from Spark...

2016-07-31 Thread phalodi
Github user phalodi commented on the issue: https://github.com/apache/spark/pull/14421 @srowen @rxin @petermaxlee I made some changes on my own; you can suggest something else if you have a better idea, or just merge it if it looks good.

[GitHub] spark issue #14421: [SPARK-16816] Add api to get JavaSparkContext from Spark...

2016-07-31 Thread phalodi
Github user phalodi commented on the issue: https://github.com/apache/spark/pull/14421 @rxin OK, I will look into it soon and make the changes.

[GitHub] spark issue #14421: [SPARK-16816] Add api to get JavaSparkContext from Spark...

2016-07-31 Thread phalodi
Github user phalodi commented on the issue: https://github.com/apache/spark/pull/14421 @rxin But users read it from the beginning, and when they read about SparkContext they do not yet know about SparkSession, so I think we should just add a single line below the example where we

[GitHub] spark issue #14421: [SPARK-16816] Add api to get JavaSparkContext from Spark...

2016-07-31 Thread phalodi
Github user phalodi commented on the issue: https://github.com/apache/spark/pull/14421 @petermaxlee @rxin OK, so I will make the changes in the documentation. What do you suggest is the correct place to add this?

[GitHub] spark pull request #14421: [SPARK-16816] Add api to get JavaSparkContext fro...

2016-07-30 Thread phalodi
Github user phalodi commented on a diff in the pull request: https://github.com/apache/spark/pull/14421#discussion_r72888685 --- Diff: sql/core/src/main/scala/org/apache/spark/sql/SparkSession.scala --- @@ -122,6 +122,14 @@ class SparkSession private( val sqlContext

[GitHub] spark pull request #14421: [SPARK-16816] Add api to get JavaSparkContext fro...

2016-07-30 Thread phalodi
Github user phalodi commented on a diff in the pull request: https://github.com/apache/spark/pull/14421#discussion_r72888636 --- Diff: sql/core/src/main/scala/org/apache/spark/sql/SparkSession.scala --- @@ -122,6 +122,14 @@ class SparkSession private( val sqlContext

[GitHub] spark pull request #14421: [SPARK-16816] Add api to get JavaSparkContext fro...

2016-07-30 Thread phalodi
Github user phalodi commented on a diff in the pull request: https://github.com/apache/spark/pull/14421#discussion_r72888602 --- Diff: sql/core/src/main/scala/org/apache/spark/sql/SparkSession.scala --- @@ -122,6 +122,14 @@ class SparkSession private( val sqlContext

[GitHub] spark pull request #14421: [Spark-16916] Add api to get JavaSparkContext fro...

2016-07-30 Thread phalodi
GitHub user phalodi opened a pull request: https://github.com/apache/spark/pull/14421 [Spark-16916] Add api to get JavaSparkContext from SparkSession ## What changes were proposed in this pull request? Add api to get JavaSparkContext from SparkSession ## How
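The API pattern under discussion (exposing a Java-friendly wrapper of the underlying context as a lazily created field on the session object) can be sketched in plain Python. This is only an illustration of the design; the real change is in Scala's SparkSession, and all class and attribute names here are invented stand-ins:

```python
class JavaContextWrapper:
    """Illustrative stand-in for JavaSparkContext wrapping a core context."""
    def __init__(self, sc):
        self.sc = sc

class Session:
    """Illustrative stand-in for SparkSession holding a core context."""
    def __init__(self, sc):
        self.spark_context = sc
        self._java_wrapper = None

    @property
    def java_spark_context(self):
        # Build the wrapper once, on first access, then reuse it so every
        # caller sees the same wrapper around the same underlying context.
        if self._java_wrapper is None:
            self._java_wrapper = JavaContextWrapper(self.spark_context)
        return self._java_wrapper
```

The design choice sketched here is that callers never construct the wrapper themselves; the session owns both the core context and its Java-facing view, keeping the two consistent.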

[GitHub] spark pull request #14135: [Spark-16479] Add Example for asynchronous action

2016-07-15 Thread phalodi
Github user phalodi closed the pull request at: https://github.com/apache/spark/pull/14135

[GitHub] spark issue #14104: [SPARK-16438] Add Asynchronous Actions documentation

2016-07-12 Thread phalodi
Github user phalodi commented on the issue: https://github.com/apache/spark/pull/14104 @srowen Is this correct? If it is, please merge it; otherwise, leave a comment if needed.

[GitHub] spark issue #14135: [Spark-16479] Add Example for asynchronous action

2016-07-11 Thread phalodi
Github user phalodi commented on the issue: https://github.com/apache/spark/pull/14135 @srowen Yes, you are right; I will close this request. Can you please look into the documentation pull request https://github.com/apache/spark/pull/14104

[GitHub] spark issue #14104: [SPARK-16438] Add Asynchronous Actions documentation

2016-07-11 Thread phalodi
Github user phalodi commented on the issue: https://github.com/apache/spark/pull/14104 @srowen Below is the final screenshot, as you suggested; I hope it looks good. Please review it and merge it. ![screenshot from 2016-07-11 17-45-24](https://cloud.githubusercontent.com/assets

[GitHub] spark issue #14104: [SPARK-16438] Add Asynchronous Actions documentation

2016-07-11 Thread phalodi
Github user phalodi commented on the issue: https://github.com/apache/spark/pull/14104 @srowen I put backtick quotes around `foreach`, `foreachAsync` and `FutureAction`.

[GitHub] spark issue #14104: [SPARK-16438] Add Asynchronous Actions documentation

2016-07-11 Thread phalodi
Github user phalodi commented on the issue: https://github.com/apache/spark/pull/14104 @srowen Thanks, man. I know I am not using quite the right language, but my intention is the same. I hope this is the final commit :) If it is done, can you merge it?

[GitHub] spark issue #14104: [SPARK-16438] Add Asynchronous Actions documentation

2016-07-11 Thread phalodi
Github user phalodi commented on the issue: https://github.com/apache/spark/pull/14104 @srowen @rxin ![screenshot from 2016-07-11 16-51-53](https://cloud.githubusercontent.com/assets/8075390/16729090/e860ff50-4787-11e6-915e-7e39da6558e3.png)

[GitHub] spark pull request #14135: [Spark-16479] Add Example for asynchronous action

2016-07-11 Thread phalodi
GitHub user phalodi opened a pull request: https://github.com/apache/spark/pull/14135 [Spark-16479] Add Example for asynchronous action ## What changes were proposed in this pull request? Add Example for asynchronous action ## How was this patch tested

[GitHub] spark pull request #14134: [Spark-16479] Add Example for asynchronous action

2016-07-11 Thread phalodi
Github user phalodi closed the pull request at: https://github.com/apache/spark/pull/14134

[GitHub] spark pull request #14134: [Spark-16479] Add Example for asynchronous action

2016-07-11 Thread phalodi
GitHub user phalodi opened a pull request: https://github.com/apache/spark/pull/14134 [Spark-16479] Add Example for asynchronous action ## What changes were proposed in this pull request? Add examples for asynchronous actions ## How was this patch tested

[GitHub] spark issue #14104: [SPARK-16438] Add Asynchronous Actions documentation

2016-07-11 Thread phalodi
Github user phalodi commented on the issue: https://github.com/apache/spark/pull/14104 @srowen Now I think it is perfect: finally down from a table to one line. You are right that it is better to just mention that Spark provides these and not duplicate the table of actions.

[GitHub] spark issue #14104: [SPARK-16438] Add Asynchronous Actions documentation

2016-07-11 Thread phalodi
Github user phalodi commented on the issue: https://github.com/apache/spark/pull/14104 @srowen Sure, I will make the appropriate changes and push again.

[GitHub] spark issue #14104: [SPARK-16438] Add Asynchronous Actions documentation

2016-07-11 Thread phalodi
Github user phalodi commented on the issue: https://github.com/apache/spark/pull/14104 @srowen So what do you think, should we not add this API reference to the document?

[GitHub] spark issue #14104: [SPARK-16438] Add Asynchronous Actions documentation

2016-07-11 Thread phalodi
Github user phalodi commented on the issue: https://github.com/apache/spark/pull/14104 @srowen I know it is nothing major, just calling normal functions in a Future, but still, a naive user learning Scala and Spark for the first time does not know what Futures are, so at least we should add a reference

[GitHub] spark issue #14104: [SPARK-16438] Add Asynchronous Actions documentation

2016-07-10 Thread phalodi
Github user phalodi commented on the issue: https://github.com/apache/spark/pull/14104 @srowen @rxin Please review it and give your comments. ![screenshot from 2016-07-11 01-08-21](https://cloud.githubusercontent.com/assets/8075390/16715663/1305e0f6-4704-11e6-90e0

[GitHub] spark issue #14104: [SPARK-16438] Add Asynchronous Actions documentation

2016-07-10 Thread phalodi
Github user phalodi commented on the issue: https://github.com/apache/spark/pull/14104 @srowen I made the changes as you suggested: I removed the table, added a single line below the actions table, and also changed the statement to be clearer about non-blocking behavior.

[GitHub] spark issue #14104: [SPARK-16438] Add Asynchronous Actions documentation

2016-07-10 Thread phalodi
Github user phalodi commented on the issue: https://github.com/apache/spark/pull/14104 @srowen OK, no problem. So you suggest that I should remove the table, just add the line below the actions table, and link to the Scala and Java docs? And the line "Spark provide asynchr

[GitHub] spark issue #14104: [SPARK-16438] Add Asynchronous Actions documentation

2016-07-10 Thread phalodi
Github user phalodi commented on the issue: https://github.com/apache/spark/pull/14104 @srowen Yes, I agree with you that they are the same as the non-async actions, but we should still list them, because they are more useful in real-life applications where a single application runs multiple jobs of different

[GitHub] spark issue #14104: [SPARK-16438] Add Asynchronous Actions documentation

2016-07-10 Thread phalodi
Github user phalodi commented on the issue: https://github.com/apache/spark/pull/14104 @srowen Yes, you are right that they do not block the calling thread, but they execute sequentially, right? An async action returns a Future, so it runs on a different thread and does not run sequentially, so
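The blocking-vs-asynchronous distinction debated above can be illustrated in plain Python with concurrent.futures. Spark's actual async actions (`countAsync`, `foreachAsync`) return a Scala `FutureAction`; `count` and `count_async` below are hypothetical stand-ins used only to show the pattern of submitting a second job without waiting for the first:

```python
from concurrent.futures import ThreadPoolExecutor

def count(data):
    # A blocking "action": the caller waits for the result.
    return len(data)

def count_async(data, executor):
    # An async "action": returns a future immediately, so the caller's
    # thread is free to submit further jobs while this one runs.
    return executor.submit(count, data)

with ThreadPoolExecutor(max_workers=2) as ex:
    f1 = count_async([1, 2, 3], ex)
    f2 = count_async([4, 5], ex)   # submitted without waiting for f1
    results = (f1.result(), f2.result())  # block only when collecting
```

Both jobs are in flight before either result is requested; calling `.result()` is the only blocking step, which is the point being made about async actions letting one application run multiple jobs concurrently.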

[GitHub] spark issue #14104: [SPARK-16438] Add Asynchronous Actions documentation

2016-07-09 Thread phalodi
Github user phalodi commented on the issue: https://github.com/apache/spark/pull/14104 @rxin Below are a screenshot of Asynchronous Actions inside the actions section and a screenshot of the programming guide index. ![screenshot from 2016-07-09 14-20-54](https

[GitHub] spark issue #14104: [SPARK-16438] Add Asynchronous Actions documentation

2016-07-08 Thread phalodi
Github user phalodi commented on the issue: https://github.com/apache/spark/pull/14104 @AmplabJenkins I do not know; I just created the issue on JIRA, made the changes, and opened a pull request to merge them. I think it is a valid pull request because in the real world all applications need non

[GitHub] spark pull request #14104: Add Asynchronous Actions documentation

2016-07-08 Thread phalodi
GitHub user phalodi opened a pull request: https://github.com/apache/spark/pull/14104 Add Asynchronous Actions documentation ## What changes were proposed in this pull request? Add Asynchronous Actions documentation inside the actions section of the programming guide ## How