[ https://issues.apache.org/jira/browse/SPARK-3859?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14345365#comment-14345365 ]
Andrew Or edited comment on SPARK-3859 at 3/3/15 5:27 PM:
----------------------------------------------------------

The problem is that we keep adding more and more of these inconsistent properties because there isn't really a guideline to follow. When I review other people's patches, there isn't really a "correct" way to name a new config. We will have to deprecate the old ones in a nicer fashion than what we can do today, which is why I opened SPARK-5933. HOWEVER, this one is duplicated by a more specific one I opened recently. I had forgotten that I had already opened this issue a while ago. I'm closing this one in favor of the new one.

> Use consistent config names for duration (with units!)
> ------------------------------------------------------
>
>                 Key: SPARK-3859
>                 URL: https://issues.apache.org/jira/browse/SPARK-3859
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 1.1.0
>            Reporter: Andrew Or
>
> There are many configs in Spark that refer to some unit of time. However, at first glance it is unclear what these units are. We should find a consistent way to append the units to the end of these config names and deprecate the old ones in favor of the more consistent ones.
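The kind of unit-suffixed duration values the description argues for could be handled by a parser along these lines. This is a hypothetical sketch for illustration only, not Spark's actual implementation; the function name, the set of accepted suffixes, and the default-unit fallback for legacy bare-number configs are all assumptions.

```python
import re

# Milliseconds per supported unit suffix (an assumed, illustrative set).
_UNITS_MS = {"ms": 1, "s": 1000, "m": 60_000, "min": 60_000, "h": 3_600_000}

def duration_to_ms(value: str, default_unit: str = "s") -> int:
    """Convert a duration string like '120s', '5m', or '500ms' to milliseconds.

    A bare number (e.g. '120', as a legacy unit-less config might supply)
    is interpreted in default_unit.
    """
    m = re.fullmatch(r"\s*(\d+)\s*([a-z]*)\s*", value.lower())
    if not m:
        raise ValueError(f"invalid duration: {value!r}")
    number, unit = int(m.group(1)), m.group(2) or default_unit
    if unit not in _UNITS_MS:
        raise ValueError(f"unknown time unit {unit!r} in {value!r}")
    return number * _UNITS_MS[unit]
```

With an explicit suffix, '120s' and '120000ms' read unambiguously as the same timeout, which is the point of the proposal: the unit travels with the value (or the config name) instead of being implicit.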
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org