Github user pwendell commented on the pull request:

    https://github.com/apache/spark/pull/1253#issuecomment-49570062
  
    IMO `-D` does not have the right semantics here, because the user isn't 
logically setting Java system properties for the submission tool; they are 
setting Spark configuration properties for their application. The application 
might run entirely remotely, for instance, so why should the user expect a 
`-D` set on the submission side to get packaged up and sent to the remote 
launcher? Also confusing is that we'd only triage `-D` options that are Spark 
properties, not other ones, so the semantics would differ depending on whether 
the user happened to set a Java property that started with `spark`. For these 
reasons I feel it's better to just have an explicit config-related flag.
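    To make the triage problem concrete, here is a minimal sketch. The 
`triage` function is a hypothetical stand-in for what spark-submit would have 
to do if it forwarded `-D` options: forward keys starting with `spark.` and 
silently ignore everything else, so behavior would differ based solely on the 
property's prefix.

    ```shell
    # Hypothetical triage logic, NOT actual spark-submit code: forward only
    # properties whose key starts with "spark.", drop the rest.
    triage() {
      for p in "$@"; do
        case "$p" in
          spark.*) echo "forwarded: $p" ;;
          *)       echo "dropped: $p" ;;
        esac
      done
    }

    # Two -D style properties set by the user; only one would survive.
    triage spark.executor.memory=4g java.io.tmpdir=/tmp
    ```

    An explicit flag (e.g. a dedicated config option on the submission tool) 
avoids this ambiguity: anything passed through it is unambiguously a Spark 
configuration property for the application, wherever the application runs.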

