GitHub user gatorsmile opened a pull request: https://github.com/apache/spark/pull/10149
[SPARK-12150] [SQL] [Minor] Add range API without specifying the slice number

For usability, this adds another sqlContext.range() method. Users can specify start, end, and step without specifying the slice number; the slice number defaults to the sparkContext's defaultParallelism. This makes the API consistent with the RDD range API.

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/gatorsmile/spark range

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/10149.patch

To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message:

    This closes #10149

----

commit 8c4bd8351f79db2ce2aebc8a641442ba882295b8
Author: gatorsmile <gatorsm...@gmail.com>
Date:   2015-12-04T20:23:36Z

    range API with a default partition number

commit 6655b9d9515819cf81844c63c7105eb59882be12
Author: gatorsmile <gatorsm...@gmail.com>
Date:   2015-12-04T20:25:52Z

    2.0->1.6

commit 72860c4e93de38da18ee13e46368493d04819094
Author: gatorsmile <gatorsm...@gmail.com>
Date:   2015-12-04T20:27:02Z

    Merge remote-tracking branch 'upstream/master' into range
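The overload pattern the PR describes can be sketched in plain Scala, without a Spark dependency: the new three-argument range() delegates to the existing four-argument variant, filling in a default slice count. The RangeApi class and its Seq-returning body are hypothetical stand-ins for illustration; in the actual PR, SQLContext.range returns a DataFrame and the default comes from sparkContext.defaultParallelism.

```scala
// Hypothetical sketch of the overload described in the PR.
// RangeApi stands in for SQLContext; defaultParallelism stands in
// for sparkContext.defaultParallelism.
class RangeApi(defaultParallelism: Int) {

  // Existing API: caller specifies the slice (partition) count explicitly.
  // The real method produces a DataFrame; a Seq[Long] is used here so the
  // sketch is self-contained.
  def range(start: Long, end: Long, step: Long, numSlices: Int): Seq[Long] =
    start until end by step

  // New convenience overload: the slice count is omitted and defaults to
  // the context's default parallelism, mirroring the RDD range API.
  def range(start: Long, end: Long, step: Long): Seq[Long] =
    range(start, end, step, defaultParallelism)
}
```

With the real API after this PR, a call like sqlContext.range(0, 100, 2) would behave like sqlContext.range(0, 100, 2, sparkContext.defaultParallelism).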