[ https://issues.apache.org/jira/browse/SPARK-8765?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Xiangrui Meng updated SPARK-8765:
---------------------------------
    Labels: flaky-test  (was: )

> Flaky PySpark PowerIterationClustering test
> -------------------------------------------
>
>                 Key: SPARK-8765
>                 URL: https://issues.apache.org/jira/browse/SPARK-8765
>             Project: Spark
>          Issue Type: Test
>          Components: MLlib, PySpark
>            Reporter: Joseph K. Bradley
>            Assignee: Yanbo Liang
>            Priority: Critical
>              Labels: flaky-test
>
> See failure: [https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/36133/console]
> {code}
> **********************************************************************
> File "/home/jenkins/workspace/SparkPullRequestBuilder/python/pyspark/mllib/clustering.py", line 291, in __main__.PowerIterationClusteringModel
> Failed example:
>     sorted(model.assignments().collect())
> Expected:
>     [Assignment(id=0, cluster=1), Assignment(id=1, cluster=0), ...
> Got:
>     [Assignment(id=0, cluster=1), Assignment(id=1, cluster=1), Assignment(id=2, cluster=1), Assignment(id=3, cluster=1), Assignment(id=4, cluster=0)]
> **********************************************************************
> File "/home/jenkins/workspace/SparkPullRequestBuilder/python/pyspark/mllib/clustering.py", line 299, in __main__.PowerIterationClusteringModel
> Failed example:
>     sorted(sameModel.assignments().collect())
> Expected:
>     [Assignment(id=0, cluster=1), Assignment(id=1, cluster=0), ...
> Got:
>     [Assignment(id=0, cluster=1), Assignment(id=1, cluster=1), Assignment(id=2, cluster=1), Assignment(id=3, cluster=1), Assignment(id=4, cluster=0)]
> **********************************************************************
> 2 of 13 in __main__.PowerIterationClusteringModel
> ***Test Failed*** 2 failures.
> Had test failures in pyspark.mllib.clustering with python2.6; see logs.
> {code}
> CC: [~mengxr] [~yanboliang]

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
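For context on why this doctest is flaky: cluster IDs produced by a clustering algorithm are arbitrary, so a doctest that pins exact labels can break on a relabeling even when the clustering is correct. A minimal sketch in plain Python (not Spark's actual fix; `partition` and the sample assignments are hypothetical) of a label-invariant comparison such a test could use instead of exact output matching:

```python
# Hedged sketch, not Spark's actual fix: compare the partition of point
# ids induced by the assignments, ignoring the arbitrary label values.

def partition(assignments):
    """Group point ids by cluster label; the labels themselves drop out."""
    groups = {}
    for point_id, cluster in assignments:
        groups.setdefault(cluster, set()).add(point_id)
    return frozenset(frozenset(g) for g in groups.values())

# Hypothetical assignments: the same clustering with labels swapped.
a = [(0, 1), (1, 0), (2, 0), (3, 0), (4, 0)]
b = [(0, 0), (1, 1), (2, 1), (3, 1), (4, 1)]
print(partition(a) == partition(b))  # True: identical partitions

# The Jenkins log above shows a harder case: "Expected" and "Got" disagree
# on whether ids 0 and 1 share a cluster, so they are genuinely different
# partitions, not a relabeling; even a label-invariant check would fail
# there, pointing at run-to-run instability in the clustering itself.
```

Note that a label-invariant check only cures relabeling flakiness; run-to-run instability would still need a fixed seed or a looser assertion.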