Github user buckhx commented on the issue:
https://github.com/apache/spark/pull/12398
@davies how's this looking?
Github user buckhx commented on the pull request:
https://github.com/apache/spark/pull/12398#issuecomment-219136325
Changed addRequirementsFile to addPyRequirements, which takes a list of
requirement strings.
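For readers following along, a minimal sketch of how the renamed API might be
invoked, assuming the signature described in this comment (addPyRequirements
was proposed in this PR and never merged, so the exact call is illustrative):

    # Illustrative call to the proposed API; addPyRequirements comes from
    # this PR and is not part of released PySpark.
    from pyspark import SparkContext

    sc = SparkContext(appName="requirements-demo")
    sc.addPyRequirements(["simplejson>=3.8", "requests==2.9.1"])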
Github user buckhx commented on the pull request:
https://github.com/apache/spark/pull/12398#issuecomment-218478989
@robert3005 added --py-requirements to spark-submit. I'm currently looking at
passing the requirements as a list or a line-delimited string.
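A sketch of the conversion being weighed here: reading a requirements file
into the list of requirement strings (plain Python; the helper name is
hypothetical):

    # Hypothetical helper: turn a requirements file into a list of
    # requirement strings, skipping blanks and comments the way pip does.
    def read_requirements(path):
        with open(path) as f:
            lines = (line.strip() for line in f)
            return [ln for ln in lines if ln and not ln.startswith("#")]

    # e.g. reqs = read_requirements("requirements.txt")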
Github user buckhx commented on the pull request:
https://github.com/apache/spark/pull/12398#issuecomment-217902925
Added an addPyPackage example to the docstring and fixed that test formatting.
We're looking into adding the --py-requirements arg to spark-submit.
The pip API takes
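The comment is truncated above; as general background, one portable way to
drive pip from Python is to shell out to it (a common pattern, not
necessarily the invocation this PR used):

    # General pattern for installing requirements programmatically; the
    # PR's actual pip call is truncated above, so this is an assumption.
    import subprocess
    import sys

    def pip_install(requirements):
        subprocess.check_call(
            [sys.executable, "-m", "pip", "install"] + list(requirements))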
Github user buckhx commented on the pull request:
https://github.com/apache/spark/pull/12398#issuecomment-211420495
I could see the pipBase approach working.
The other testing blocker I have is that the context that's getting created
in the test seems to be able to use
GitHub user buckhx opened a pull request:
https://github.com/apache/spark/pull/12398
[SPARK-5929][PYSPARK] Context addPyPackage and addRequirementsFile
## What changes were proposed in this pull request?
Context.addPyPackage()
Context.addRequirementsFile()
Both
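The description is cut off above; for context, the manual workaround these
helpers would automate is zipping a local package and shipping it with the
existing addPyFile (a sketch using only public PySpark API; the package name
is hypothetical):

    # Manual pattern the proposed helpers would automate: archive a local
    # package and ship it to executors via the existing addPyFile.
    import shutil
    from pyspark import SparkContext

    sc = SparkContext(appName="manual-package-demo")
    archive = shutil.make_archive("mypackage", "zip",
                                  root_dir=".", base_dir="mypackage")
    sc.addPyFile(archive)  # workers can now `import mypackage`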
Github user buckhx commented on the pull request:
https://github.com/apache/spark/pull/4897#issuecomment-186259497
This is a pretty big pain when using pyspark (adding modules to workers)
and should definitely be included. I'll open a new pull request to push
this through
Github user buckhx commented on the pull request:
https://github.com/apache/spark/pull/4897#issuecomment-150020186
@davies removed the requirements file from the context constructor. I
looked into adding --py-requirements to spark-submit, but it looked a bit more
in-depth than I
Github user buckhx commented on the pull request:
https://github.com/apache/spark/pull/4897#issuecomment-145139329
Looking to add support for namespaces, which I have tested locally and
confirmed to work. I was also thinking of exposing an add_package function that
would add local
Github user buckhx commented on a diff in the pull request:
https://github.com/apache/spark/pull/4897#discussion_r40440718
--- Diff: python/pyspark/context.py ---
@@ -711,6 +721,30 @@ def addPyFile(self, path):
# for tests in local mode
Github user buckhx commented on the pull request:
https://github.com/apache/spark/pull/4897#issuecomment-143251274
The tests fail because one of them attempts to pip install a package and
doesn't have permissions to do so. Is there a way to enable that? Or just leave
the pip
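One way around the permissions problem described here is to gate the
pip-installing test behind an opt-in flag; a sketch (the env var and class
names are hypothetical, and this is not necessarily what Spark's suite
adopted):

    # Hypothetical guard: only run the pip-installing test when the
    # environment explicitly opts in, so unprivileged CI hosts skip it.
    import os
    import unittest

    @unittest.skipUnless(os.environ.get("ENABLE_PIP_TESTS"),
                         "requires permission to pip install packages")
    class AddPyRequirementsTests(unittest.TestCase):
        def test_install_requirements(self):
            pass  # would exercise sc.addPyRequirements here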