Github user abehrens closed the pull request at:
https://github.com/apache/spark/pull/10710
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastructure@apache.org or file a JIRA ticket
with INFRA.
Github user abehrens commented on the pull request:
https://github.com/apache/spark/pull/10710#issuecomment-175017342
Thanks @holdenk, I see there are more functions than referenced in the
programming guide (http://spark.apache.org/docs/latest/programming-guide.html).
Probably should …
Github user holdenk commented on the pull request:
https://github.com/apache/spark/pull/10710#issuecomment-174660392
Hi @abehrens - you may want to read the contributing guidelines at
https://cwiki.apache.org/confluence/display/SPARK/Contributing+to+Spark (this
PR is missing a JIRA).
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/10710#issuecomment-170705375
Can one of the admins verify this patch?
GitHub user abehrens opened a pull request:
https://github.com/apache/spark/pull/10710
[pyspark] adding disjunction and difference functions for rdds
I was looking for a way to perform disjunction and difference operations,
in other words:
* disjunction: find all elements NOT present in both RDDs (the symmetric difference)
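As a hedged sketch of the set semantics the PR proposes, the two operations can be expressed with plain Python sets; the function names `difference` and `disjunction` here are illustrative and are not the names the PR actually adds. On RDDs, the same results can already be composed from the existing PySpark operations `subtract`, `union`, and `intersection` (e.g. `a.union(b).subtract(a.intersection(b))` for the symmetric difference), which may be why the maintainers pointed to the contributing guidelines rather than merging new methods.

```python
def difference(a, b):
    """Elements of a that are not in b (cf. RDD.subtract)."""
    return {x for x in a if x not in b}

def disjunction(a, b):
    """Elements in exactly one of a or b, i.e. the symmetric
    difference (cf. a.union(b).subtract(a.intersection(b)))."""
    return difference(a, b) | difference(b, a)

a = {1, 2, 3}
b = {3, 4}
print(difference(a, b))   # elements only in a: {1, 2}
print(disjunction(a, b))  # elements in exactly one set: {1, 2, 4}
```

Note that Python's built-in sets already provide these as the `-` and `^` operators; the explicit functions above only mirror how the equivalent RDD pipeline would be composed.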