[jira] [Updated] (SPARK-2862) DoubleRDDFunctions.histogram() throws exception for some inputs
     [ https://issues.apache.org/jira/browse/SPARK-2862?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Xiangrui Meng updated SPARK-2862:
---------------------------------
    Assignee: Chandan Kumar

> DoubleRDDFunctions.histogram() throws exception for some inputs
> ---------------------------------------------------------------
>
>                 Key: SPARK-2862
>                 URL: https://issues.apache.org/jira/browse/SPARK-2862
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 0.9.0, 0.9.1, 1.0.0, 1.0.1
>         Environment: Scala version 2.9.2 (OpenJDK 64-Bit Server VM, Java 1.7.0_55) running on Ubuntu 14.04
>            Reporter: Chandan Kumar
>            Assignee: Chandan Kumar
>             Fix For: 1.1.0
>
> A histogram() call throws an IndexOutOfBoundsException when the chosen bucketCount splits the value range into a bucket width that is not exactly representable as a Double, e.g. (99 - 6) / 9 = 10.333...:
>
> scala> val r = sc.parallelize(6 to 99)
> r: org.apache.spark.rdd.RDD[Int] = ParallelCollectionRDD[0] at parallelize at <console>:12
>
> scala> r.histogram(9)
> java.lang.IndexOutOfBoundsException: 9
>   at scala.collection.immutable.NumericRange.apply(NumericRange.scala:124)
>   at scala.collection.immutable.NumericRange$$anon$1.apply(NumericRange.scala:176)
>   at scala.collection.IndexedSeqLike$Elements.next(IndexedSeqLike.scala:66)
>   at scala.collection.IterableLike$class.copyToArray(IterableLike.scala:237)
>   at scala.collection.AbstractIterable.copyToArray(Iterable.scala:54)
>   at scala.collection.TraversableOnce$class.copyToArray(TraversableOnce.scala:241)
>   at scala.collection.AbstractTraversable.copyToArray(Traversable.scala:105)
>   at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:249)
>   at scala.collection.AbstractTraversable.toArray(Traversable.scala:105)
>   at org.apache.spark.rdd.DoubleRDDFunctions.histogram(DoubleRDDFunctions.scala:116)
>   at $iwC$$iwC$$iwC$$iwC.<init>(<console>:15)
>   at $iwC$$iwC$$iwC.<init>(<console>:20)
>   at $iwC$$iwC.<init>(<console>:22)
>   at $iwC.<init>(<console>:24)
>   at <init>(<console>:26)

--
This message was sent by Atlassian JIRA
(v6.2#6252)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
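For context, a minimal Scala sketch of the pitfall the trace points at. The report does not quote the Spark source, so this assumes, based on the org.apache.spark.rdd.DoubleRDDFunctions.histogram(DoubleRDDFunctions.scala:116) frame, that histogram() built its bucket boundaries with a fractional NumericRange; on the Scala versions the affected Spark releases ran on, such a range can fail to materialize when its step is not exactly representable:

// A minimal sketch, assuming the bucket boundaries came from a Double
// NumericRange (not quoted in this report; inferred from the stack trace).
// Double ranges via `to ... by` exist on the Scala 2.9/2.10 lines in use here.
val min = 6.0
val max = 99.0
val bucketCount = 9
val step = (max - min) / bucketCount   // 10.333..., inexact as a Double

val range = min to max by step         // NumericRange.Inclusive[Double]
// The range's size and the bounds checked by apply(idx) both derive from
// the inexact step and can disagree, so materializing it fails part-way:
val buckets = range.toArray            // java.lang.IndexOutOfBoundsException: 9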
[jira] [Updated] (SPARK-2862) DoubleRDDFunctions.histogram() throws exception for some inputs
     [ https://issues.apache.org/jira/browse/SPARK-2862?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Prashant Sharma updated SPARK-2862:
-----------------------------------
    Affects Version/s: 1.0.1
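The issue is marked Fix For 1.1.0 with the reporter assigned, though the patch itself is not shown in these messages. As a hedged sketch only, and not necessarily the change that shipped, one way to sidestep the NumericRange pitfall is to derive every boundary from an integer index so no fractional step is ever counted or accumulated (the name bucketBoundaries is hypothetical):

// Hypothetical helper, not the actual 1.1.0 patch: compute each boundary
// from an integer Range, whose length is always exact.
def bucketBoundaries(min: Double, max: Double, bucketCount: Int): Array[Double] = {
  require(bucketCount > 0, "bucketCount must be positive")
  val span = max - min
  // Pinning the last boundary to max keeps floating-point rounding from
  // overshooting or undershooting the true maximum.
  (0 until bucketCount).map(i => min + (i * span) / bucketCount).toArray :+ max
}

// bucketBoundaries(6.0, 99.0, 9) always yields exactly 10 boundaries, so the
// r.histogram(9) repro above cannot hit the off-by-one range materialization.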