Github user vanzin commented on the pull request:

    https://github.com/apache/spark/pull/10205#issuecomment-206038435
  
    > In what case do we need overloading?
    
    This was part of a previous conversation in this thread. One of the
features in the new API that wasn't present in the old one is that it handles
optional config entries properly; that introduced some issues with the API,
because to make it user-friendly you need different types for optional and
non-optional entries (so you can say `set(nonOptionalIntConf, 1)` and
`set(optionalIntConf, 1)`, for example). Overloaded methods were suggested
because the first approach was a little awkward, but Scala doesn't allow
default arguments on more than one overloaded alternative of the same method.
So I changed it to the current builder approach.
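    
    To illustrate the typing issue, here's a minimal sketch (the class and
method names are illustrative, not necessarily the ones in this PR). Distinct
entry types let both `set` calls take a plain value, so callers never have to
write `set(optionalIntConf, Some(1))`:
    
    ```scala
    class ConfigEntry[T](val key: String)
    
    // An optional entry is just an entry whose stored value is an Option[T].
    class OptionalConfigEntry[T](key: String) extends ConfigEntry[Option[T]](key)
    
    class SettableConf {
      private val settings = scala.collection.mutable.Map[String, Any]()
    
      // Non-optional entries: set(nonOptionalIntConf, 1)
      def set[T](entry: ConfigEntry[T], value: T): this.type = {
        settings(entry.key) = value
        this
      }
    
      // Optional entries: set(optionalIntConf, 1), wrapped internally.
      def set[T](entry: OptionalConfigEntry[T], value: T): this.type = {
        settings(entry.key) = Some(value)
        this
      }
    }
    ```
    
    Note that adding a default argument (e.g. a doc string) to both of those
`set` alternatives is exactly what Scala rejects, which is what pushed the
entry-creation side toward a builder.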
    
    > Overall, I think all of the different classes make this pretty 
complicated.
    
    I've explained this to Reynold: the goal is not necessarily to make the
implementation of the config builders simple, but to make their use simple. If
you can simplify the config builder code while still keeping the call sites
simple, I'm open to suggestions.
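    
    For reference, this is the kind of call site the builder is meant to keep
trivial. A sketch assuming the builder from this PR is in scope; the exact
method names (`doc`, `intConf`, `createWithDefault`, `createOptional`) may
differ from what ends up merged:
    
    ```scala
    // Non-optional entry with a default value: yields a ConfigEntry[Int].
    val MAX_RETRIES = ConfigBuilder("spark.test.maxRetries")
      .doc("How many times to retry before giving up.")
      .intConf
      .createWithDefault(3)
    
    // Optional entry with no default: yields an OptionalConfigEntry[String].
    val CHECKPOINT_DIR = ConfigBuilder("spark.test.checkpointDir")
      .stringConf
      .createOptional
    ```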
    
    The reason the new API has more classes than the old SQL conf is that it
has more features. The old SQL conf did not handle optional configs, and it
had no explicit fallback mechanism (relying on comments in the code to
indicate fallbacks instead).
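    
    As a sketch of the fallback mechanism (the key names are made up for
illustration, and `fallbackConf` is assumed as the builder method): the new
key declares its fallback explicitly instead of relying on a comment:
    
    ```scala
    val LEGACY_PORT = ConfigBuilder("spark.test.oldPort")
      .intConf
      .createWithDefault(4040)
    
    // Reading NEW_PORT falls back to LEGACY_PORT when the new key is unset;
    // the relationship lives in the entry itself, not in a code comment.
    val NEW_PORT = ConfigBuilder("spark.test.port")
      .fallbackConf(LEGACY_PORT)
    ```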


