Github user Stibbons commented on the issue:

    https://github.com/apache/spark/pull/13599
  
    I see that everything in these "conf" settings may be put into a .conf file 
that defines the default values, so the user can call spark-submit relying on 
those defaults. I wonder how it will behave if there is no requirements.txt.
    
    In my patch (based on yours), the user can provide a Python package (whl, 
tar.gz, ...) that is installed directly by pip install (I am not sure how this 
behaves with conda, but we can run pip after conda). The admin can define a 
single default conf file with all the default values: the name of the 
virtualenv executable (which may not be an absolute path), the name of the 
requirements.txt, the name of the wheelhouse package, and so on. If the 
requirements.txt is sent through --files it is used; if not, it is simply 
ignored (and pip only installs the Python package with the dependencies 
described in its setup.py).
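
    For example, a submission could look like this (the virtualenv property 
names are again placeholders for whatever the patch finally exposes; only 
--conf and --files are existing spark-submit options):

        # hypothetical invocation: ship a wheel plus an optional requirements.txt;
        # pip would install the wheel, and the requirements.txt only if it was shipped
        spark-submit \
          --conf spark.pyspark.virtualenv.enabled=true \
          --files requirements.txt,mypackage-1.0.0-py3-none-any.whl \
          --conf spark.pyspark.virtualenv.package=mypackage-1.0.0-py3-none-any.whl \
          my_job.py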

