[ https://issues.apache.org/jira/browse/SPARK-9875?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15009891#comment-15009891 ]
Jakob Odersky commented on SPARK-9875:
--------------------------------------

I'm not sure I understand the issue. Are you trying to force {{func}} to run on every partition to achieve some kind of side effect?

> Do not evaluate foreach and foreachPartition with count
> -------------------------------------------------------
>
>                 Key: SPARK-9875
>                 URL: https://issues.apache.org/jira/browse/SPARK-9875
>             Project: Spark
>          Issue Type: Improvement
>          Components: PySpark
>            Reporter: Zoltán Zvara
>            Priority: Minor
>
> It is evident that the summation inside count adds overhead, which it would be nice to remove from the current execution.
> {{self.mapPartitions(func).count()  # Force evaluation}} @ {{rdd.py}}

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
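For context, the line quoted from {{rdd.py}} forces evaluation of a lazy per-partition function by counting its (empty) output. A minimal pure-Python sketch of the pattern, with no Spark dependency and purely illustrative names ({{func}}, {{partitions}}, {{seen}}), shows why the count is overhead: the partition function yields nothing, so the sum contributes only bookkeeping, and simply draining each partition's iterator triggers the same side effects.

```python
# Toy illustration (not Spark source): a side-effecting per-partition
# function in the style of foreach/foreachPartition in rdd.py.
seen = []

def func(iterator):
    # Hypothetical side effect: record every element seen.
    for x in iterator:
        seen.append(x)
    return iter([])  # foreach-style: results are discarded

partitions = [[1, 2], [3, 4, 5]]

# count()-style forcing: tally elements of each partition's output
# just to trigger evaluation.
total = sum(sum(1 for _ in func(p)) for p in partitions)
assert total == 0  # func yields nothing, so the count is pure overhead

# Alternative forcing: drain each partition's iterator, no counting.
seen.clear()
for p in partitions:
    for _ in func(p):
        pass

print(seen)  # side effect ran on every partition: [1, 2, 3, 4, 5]
```

In real Spark the forcing still has to be an action (only actions run jobs), so removing the summation means using a cheaper action than {{count()}}; the sketch above only isolates the cost being discussed, not how Spark would implement the fix.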