[ https://issues.apache.org/jira/browse/SPARK-3098?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14102064#comment-14102064 ]
Guoqiang Li edited comment on SPARK-3098 at 8/19/14 9:41 AM:
-------------------------------------------------------------
Our cluster runs on YARN. You can reproduce the problem with the following code in cluster mode:
{code}
val c = sc.parallelize(1 to 7899).flatMap { i =>
  (1 to 10000).toSeq.map(p => i * 6000 + p)
}.distinct().zipWithIndex()
c.join(c).filter(t => t._2._1 != t._2._2).take(3)
{code}
=>
{code}
Array[(Int, (Long, Long))] = Array((1732608,(11,12)), (45515264,(12,13)), (36579712,(13,14)))
{code}

was (Author: gq):
Our cluster runs on YARN. You can reproduce the problem with the following code in cluster mode:
{code}
val c = sc.parallelize(1 to 7899).flatMap { i =>
  (1 to 10000).toSeq.map(p => i * 6000 + p)
}.distinct().zipWithIndex()
val e = c.map(t => (t._1, t._2))
e.join(e).filter(t => t._2._1 != t._2._2).take(3)
{code}
=>
{code}
Array[(Int, (Long, Long))] = Array((1732608,(11,12)), (45515264,(12,13)), (36579712,(13,14)))
{code}

> In some cases, the zipWithIndex operation returns wrong results
> ---------------------------------------------------------------
>
>                 Key: SPARK-3098
>                 URL: https://issues.apache.org/jira/browse/SPARK-3098
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.0.1
>            Reporter: Guoqiang Li
>            Priority: Critical
>
> I do not know how to reliably reproduce the bug. Here is the case: while processing about 10 billion records with groupByKey, the results were wrong:
> {noformat}
> (4696501, 370568)
> (4696501, 376672)
> (4696501, 374880)
> .....
> (4696502, 350264)
> (4696502, 358458)
> (4696502, 398502)
> ......
> {noformat}
> =>
> {noformat}
> (4696501,ArrayBuffer(350264, 358458, 398502 ........)),
> (4696502,ArrayBuffer(376621, ......))
> {noformat}
> code:
> {code}
> val dealOuts = clickPreferences(sc, dealOutPath, periodTime)
> val dealOrders = orderPreferences(sc, dealOrderPath, periodTime)
> val favorites = favoritePreferences(sc, favoritePath, periodTime)
> val allBehaviors = (dealOrders ++ favorites ++ dealOuts)
> val preferences = allBehaviors.groupByKey().map { ...
> }
> {code}
> spark-defaults.conf:
> {code}
> spark.default.parallelism 280
> {code}

--
This message was sent by Atlassian JIRA
(v6.2#6252)
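The self-join mismatch reported above can be illustrated outside Spark. This is a hypothetical plain-Python sketch, not Spark code: zipWithIndex derives an element's index from its enumeration position, so if the parent dataset (here, the output of distinct(), whose ordering is not guaranteed) is recomputed in a different order between the two sides of a self-join, the same key can receive two different indices — which is exactly what the non-empty filter result shows.

```python
def zip_with_index(elems):
    # Mimics the idea of RDD.zipWithIndex: pair each element with its
    # enumeration position. The index depends entirely on element order.
    return {e: i for i, e in enumerate(elems)}

data = [10, 20, 30, 40]

# First evaluation enumerates the elements in one order...
left = zip_with_index([10, 20, 30, 40])
# ...while a recomputation (e.g. after a lost task) yields another order.
right = zip_with_index([20, 10, 40, 30])

# Self-join on the key: with deterministic indices, every key would pair
# its index with itself and this list would be empty.
mismatches = [(k, left[k], right[k]) for k in data if left[k] != right[k]]
print(mismatches)  # non-empty: same key, two different indices
```

Under these assumed orderings all four keys mismatch; in the Spark report only some partitions are reordered, so only some keys show differing index pairs such as (1732608,(11,12)).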
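For reference, the groupByKey symptom in the issue description violates a basic invariant: every value must end up under the key it was emitted with, never under a neighbouring key. A minimal sketch of the expected semantics (plain Python, using the key/value numbers from the report as hypothetical sample data):

```python
from collections import defaultdict

def group_by_key(pairs):
    # Reference semantics of groupByKey: collect each value under exactly
    # the key it arrived with.
    out = defaultdict(list)
    for k, v in pairs:
        out[k].append(v)
    return dict(out)

pairs = [
    (4696501, 370568), (4696501, 376672),
    (4696502, 350264), (4696502, 358458),
]
grouped = group_by_key(pairs)
# The reported output pairs 4696501 with 350264, 358458, ... — values that
# were emitted under 4696502 — which this invariant forbids.
print(grouped)
```

The reported corruption is consistent with zipWithIndex (or another order-sensitive step upstream) assigning unstable keys between the shuffle's map and reduce sides, rather than with groupByKey itself misrouting values.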