[ https://issues.apache.org/jira/browse/SPARK-8309?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Sean Owen resolved SPARK-8309.
------------------------------
    Resolution: Fixed
 Fix Version/s: 1.4.1
                1.5.0
                1.3.2

Issue resolved by pull request 6763
[https://github.com/apache/spark/pull/6763]

> OpenHashMap doesn't work with more than 12M items
> -------------------------------------------------
>
>                 Key: SPARK-8309
>                 URL: https://issues.apache.org/jira/browse/SPARK-8309
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.4.0
>            Reporter: Vyacheslav Baranov
>            Fix For: 1.3.2, 1.5.0, 1.4.1
>
> The problem can be demonstrated with the following testcase:
> {code}
> test("support for more than 12M items") {
>   val cnt = 12000000 // 12M
>   val map = new OpenHashMap[Int, Int](cnt)
>   for (i <- 0 until cnt) {
>     map(i) = 1
>   }
>   val numInvalidValues = map.iterator.count(_._2 == 0)
>   assertResult(0)(numInvalidValues)
> }
> {code}

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
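For context on why the failure starts around 12M items: at the default load factor (~0.7), 12M entries force the open-addressing table to grow to 2^25 slots, so valid positions need bit 24. A minimal sketch of that arithmetic, assuming the root cause is a position mask with bit 24 accidentally cleared (a mask written as `0xEFFFFF` -style with one hex digit short; the exact constant is in the linked PR 6763, and the scaffolding below is illustrative, not Spark's code):

```scala
object MaskSketch {
  // 12M items at a ~0.7 load factor need ceil(12M / 0.7) ≈ 17.1M slots,
  // which rounds up to the next power of two: 2^25 = 33554432.
  val cnt = 12000000
  val capacity = Integer.highestOneBit(math.ceil(cnt / 0.7).toInt) * 2

  // Illustrative constants: 0xEFFFFFF (7 hex digits) has bit 24 cleared,
  // while 0x1FFFFFFF keeps all 29 low bits set.
  val brokenMask = 0xEFFFFFF
  val fixedMask = 0x1FFFFFFF

  def main(args: Array[String]): Unit = {
    val pos = 1 << 24 // a legal slot index once capacity reaches 2^25
    println(capacity) // 33554432
    println(pos & brokenMask) // 0 -- the position is silently remapped
    println(pos & fixedMask) // 16777216 -- the position is preserved
  }
}
```

A mask like this is only exercised once the table grows past 2^24 slots, which is consistent with the testcase passing for smaller counts and failing at 12M.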