[GitHub] spark pull request: [SPARK-7704] Updating Programming Guides per S...

2015-05-19 Thread srowen
Github user srowen commented on a diff in the pull request: https://github.com/apache/spark/pull/6234#discussion_r30622337 --- Diff: docs/programming-guide.md --- @@ -821,9 +820,7 @@ by a key. In Scala, these operations are automatically available on RDDs containing

2015-05-19 Thread asfgit
Github user asfgit closed the pull request at: https://github.com/apache/spark/pull/6234

2015-05-19 Thread zsxwing
Github user zsxwing commented on the pull request: https://github.com/apache/spark/pull/6234#issuecomment-103598095 LGTM except for the minor comment, but I think @srowen will handle it.

2015-05-19 Thread srowen
Github user srowen commented on a diff in the pull request: https://github.com/apache/spark/pull/6234#discussion_r30573957 --- Diff: docs/programming-guide.md --- @@ -41,14 +41,15 @@ In addition, if you wish to access an HDFS cluster, you need to add a dependency artifactI

2015-05-18 Thread rxin
Github user rxin commented on the pull request: https://github.com/apache/spark/pull/6234#issuecomment-103336164 That looks good.

2015-05-18 Thread daisukebe
Github user daisukebe commented on the pull request: https://github.com/apache/spark/pull/6234#issuecomment-103308933 Thanks, guys. Then, does adding the following make sense? > If the Spark version is prior to 1.3.0, users need to explicitly import org.apache.spark.SparkContext._
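The note being debated above amounts to the following sketch. This is an illustrative example, not text from the PR itself: the extra `import org.apache.spark.SparkContext._` is needed only on Spark 1.2.x and earlier to bring the pair-RDD implicit conversions into scope, and `reduceByKey` stands in for any operation on an `RDD[(K, V)]`. It assumes a project with a Spark dependency on the classpath.

```scala
import org.apache.spark.{SparkConf, SparkContext}
// Required only on Spark 1.2.x and earlier: it pulls the implicit
// conversions (e.g. rddToPairRDDFunctions) into scope so that key-based
// operations appear on RDDs of tuples. From 1.3.0 onward these implicits
// are found automatically, so the line below is redundant.
import org.apache.spark.SparkContext._

object PairRDDExample {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("pair-rdd-example").setMaster("local[*]"))

    // reduceByKey is available on RDD[(String, Int)] via the implicits above
    val counts = sc
      .parallelize(Seq(("a", 1), ("b", 1), ("a", 1)))
      .reduceByKey(_ + _)

    counts.collect().foreach(println)
    sc.stop()
  }
}
```

On 1.3.0+ the same code compiles with the second import deleted, which is why the guide for the current version can drop the instruction while older versioned docs keep it.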

2015-05-18 Thread rxin
Github user rxin commented on the pull request: https://github.com/apache/spark/pull/6234#issuecomment-103163427 Yes, we should remove it, but maybe add a note somewhere to say "in Spark version xxx and below, you would need to explicitly ..."

2015-05-18 Thread zsxwing
Github user zsxwing commented on the pull request: https://github.com/apache/spark/pull/6234#issuecomment-103162608 @srowen Agreed. @rxin, what do you think?

2015-05-18 Thread zsxwing
Github user zsxwing commented on a diff in the pull request: https://github.com/apache/spark/pull/6234#discussion_r30534442 --- Diff: docs/programming-guide.md --- @@ -821,9 +820,7 @@ by a key. In Scala, these operations are automatically available on RDDs containing

2015-05-18 Thread srowen
Github user srowen commented on the pull request: https://github.com/apache/spark/pull/6234#issuecomment-103151663 The docs are versioned though. I can understand that people using old Spark might still look at the latest docs, but then there are a number of problems of that form. I t

2015-05-18 Thread zsxwing
Github user zsxwing commented on the pull request: https://github.com/apache/spark/pull/6234#issuecomment-103124021 I remember we keep `import org.apache.spark.SparkContext._` for people who use old Spark versions. They may read this doc even if they don't use the latest Spark.

2015-05-18 Thread srowen
Github user srowen commented on the pull request: https://github.com/apache/spark/pull/6234#issuecomment-103039119 Yes, I believe that's correct.

2015-05-18 Thread srowen
Github user srowen commented on the pull request: https://github.com/apache/spark/pull/6234#issuecomment-103039097 OK to test