GitHub user srowen commented on the pull request:

    https://github.com/apache/spark/pull/12248#issuecomment-207562982
  
    I mean making a `Properties` object in the driver and referencing it in a
function that is executed on the executors. That's certainly in scope. For the
example you give, that seems equally simple and can still be bottled up inside
the library.
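
    As a minimal sketch of that closure-capture alternative (the object and
property names here are invented for illustration, not taken from the PR):

    ```scala
    import java.util.Properties

    import org.apache.spark.{SparkConf, SparkContext}

    object ClosureCaptureExample {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("closure-capture"))

        // Built on the driver; java.util.Properties is Serializable, so it
        // can ship to the executors inside the task closure.
        val props = new Properties()
        props.setProperty("tenant", "acme")

        // The map function closes over `props`; each executor reads its own
        // deserialized copy of the driver-side object.
        val tagged = sc.parallelize(1 to 4).map(x => (props.getProperty("tenant"), x))

        tagged.collect().foreach(println)
        sc.stop()
      }
    }
    ```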
    
    I understand Josh's use case more. There are certainly tasks and RDDs
entirely internal to some Spark process, but those also won't know what to do
with custom user properties. Maybe they eventually invoke a UDF that could use
these properties. In many cases that UDF could still just refer to whatever
config you like directly (right?), but I'm probably overlooking some case where
that fails to work.
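
    To make that concrete, a hedged sketch of a UDF that simply closes over
the config value it needs, rather than reading a property propagated through
the task context (the names and the system property are illustrative):

    ```scala
    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions.udf

    object UdfDirectConfigExample {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().appName("udf-direct-config").getOrCreate()
        import spark.implicits._

        // The "config" the UDF needs, read once on the driver and captured
        // in the UDF's closure instead of propagated as a task property.
        val suffix = sys.props.getOrElse("user.suffix", "-default")

        val addSuffix = udf((s: String) => s + suffix)

        Seq("a", "b").toDF("value")
          .select(addSuffix($"value").as("tagged"))
          .show()

        spark.stop()
      }
    }
    ```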
    
    I take the point about this already being an API for the caller anyway.

