[ https://issues.apache.org/jira/browse/SPARK-8116?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14573752#comment-14573752 ]
Apache Spark commented on SPARK-8116:
-------------------------------------

User 'belisarius222' has created a pull request for this issue:
https://github.com/apache/spark/pull/6656

> sc.range() doesn't match python range()
> ---------------------------------------
>
>                 Key: SPARK-8116
>                 URL: https://issues.apache.org/jira/browse/SPARK-8116
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 1.4.0, 1.4.1
>            Reporter: Ted Blackman
>            Priority: Minor
>              Labels: easyfix
>
> Python's built-in range() and xrange() functions can take 1, 2, or 3 arguments. The single-argument form is probably the most frequently used, e.g.:
>
>     for i in range(len(myList)): ...
>
> However, in PySpark, the SparkContext range() method throws an error when called with a single argument, because of the way its arguments are passed into Python's range function.
>
> There is no good reason not to support the same syntax as the built-in function. To fix this, we can default the sc.range() method's `stop` argument to None, and then inside the method, if `stop` is None, replace `stop` with `start` and set `start` to 0, which is what the C implementation of range() does:
> https://github.com/python/cpython/blob/master/Objects/rangeobject.c#L87
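As a concrete illustration of the argument-swapping described above, here is a minimal sketch. It is not the code from the pull request: the wrapper name spark_range is hypothetical, and it assumes the Spark 1.x PySpark API, where an RDD of integers can be built with sc.parallelize() over an xrange (Python 2 era).

    # Sketch of the proposed behavior (assumed names; not the actual PySpark source).
    # `stop` defaults to None and the arguments are swapped the way CPython's
    # range() does, so the 1-, 2-, and 3-argument forms all work.
    def spark_range(sc, start, stop=None, step=1, numSlices=None):
        """Like sc.range(), but accepting 1, 2, or 3 arguments as built-in range() does."""
        if stop is None:
            # Single-argument form: range(n) is treated as range(0, n).
            stop = start
            start = 0
        return sc.parallelize(xrange(start, stop, step), numSlices)

With this change, a call such as spark_range(sc, 5).collect() would return [0, 1, 2, 3, 4], matching the built-in range(5), while the existing two- and three-argument calls keep their current behavior.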