Github user HyukjinKwon commented on the issue: https://github.com/apache/spark/pull/22531 @srowen, WDYT about adding a rule and excluding occurrences only when they are legitimate cases? I wrote a rule locally, and the cases below are (almost) all legitimate. I was thinking we could just explicitly exclude them in the rule (for instance via `scalastyle:off ...`), and define the rule to serve, at least, as a reminder to use `Locale.ROOT`. ```
[error] spark/resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/submit/KubernetesClientApplication.scala:263:6:
[error] spark/resource-managers/kubernetes/core/src/main/scala/org/apache/spark/scheduler/cluster/k8s/ExecutorPodsSnapshot.scala:55:40:
[error] spark/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveMetastoreCatalog.scala:62:71:
[error] spark/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveMetastoreCatalog.scala:63:17:
[error] spark/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveMetastoreCatalog.scala:277:57:
[error] spark/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveMetastoreCatalog.scala:279:44:
[error] spark/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveMetastoreCatalog.scala:284:22:
[error] spark/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveExternalCatalog.scala:871:38:
[error] spark/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveExternalCatalog.scala:899:38:
[error] spark/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveExternalCatalog.scala:919:31:
[error] spark/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveExternalCatalog.scala:925:46:
[error] spark/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveExternalCatalog.scala:934:42:
[error] spark/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveExternalCatalog.scala:993:76:
[error] spark/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveExternalCatalog.scala:1034:62:
[error] spark/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveExternalCatalog.scala:1185:31:
[error] spark/resource-managers/mesos/src/main/scala/org/apache/spark/deploy/mesos/MesosClusterDispatcher.scala:63:52:
[error] spark/mllib/src/main/scala/org/apache/spark/ml/feature/Tokenizer.scala:39:5:
[error] spark/mllib/src/main/scala/org/apache/spark/ml/feature/Tokenizer.scala:143:43:
[error] spark/mllib/src/main/scala/org/apache/spark/ml/feature/StopWordsRemover.scala:121:51:
[error] spark/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/higherOrderFunctions.scala:76:20:
[error] spark/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/StringUtils.scala:65:67:
[error] spark/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/StringUtils.scala:66:69:
[error] spark/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/NumberConverter.scala:96:26:
[error] spark/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/parser/ParseDriver.scala:182:18:
[error] spark/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/parser/AstBuilder.scala:666:30:
[error] spark/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/stringExpressions.scala:333:53:
[error] spark/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/stringExpressions.scala:336:38:
[error] spark/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/stringExpressions.scala:352:53:
[error] spark/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/stringExpressions.scala:355:38:
[error] spark/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/stringExpressions.scala:1392:35:
[error] spark/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/stringExpressions.scala:1395:40:
[error] spark/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/rules/RuleExecutor.scala:147:66:
[error] spark/sql/catalyst/src/main/scala/org/apache/spark/sql/util/SchemaUtils.scala:80:77:
[error] spark/sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala:181:72:
[error] spark/sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala:978:45:
[error] spark/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/InsertIntoHadoopFsRelationCommand.scala:98:55:
[error] spark/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/csv/CSVDataSource.scala:89:56:
[error] spark/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/csv/CSVDataSource.scala:98:62:
[error] spark/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/csv/CSVDataSource.scala:156:39:
[error] spark/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/csv/CSVDataSource.scala:157:39:
[error] spark/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/PartitioningUtils.scala:132:77:
[error] spark/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/PartitioningUtils.scala:327:59:
[error] spark/sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/WatermarkTracker.scala:39:14:
[error] spark/sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/state/SymmetricHashJoinStateManager.scala:266:61:
[error] spark/core/src/main/scala/org/apache/spark/metrics/sink/StatsdSink.scala:55:79:
[error] spark/core/src/main/scala/org/apache/spark/util/Utils.scala:2739:34:
[error] spark/core/src/main/scala/org/apache/spark/rdd/OrderedRDDFunctions.scala:38:53:
[error] spark/core/src/main/scala/org/apache/spark/rdd/OrderedRDDFunctions.scala:38:75:
``` Do you think it's too much, or should we add it? I slightly prefer this approach, but I'd like to hear your thoughts before I go and open a PR.
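For context on why such a rule is worth enforcing, here is a minimal, self-contained Java sketch (the `Locale` API in question is `java.util.Locale`, so the same behavior applies to Scala's `String` methods) of the classic Turkish-locale pitfall that locale-less `toLowerCase()`/`toUpperCase()` calls are exposed to:

```java
import java.util.Locale;

public class LocaleRootDemo {
    public static void main(String[] args) {
        Locale turkish = new Locale("tr", "TR");

        // Locale-sensitive lowercasing: in Turkish, uppercase 'I' maps to
        // dotless 'ı' (U+0131), so identifiers compared after lowercasing
        // can silently stop matching on a Turkish-locale JVM.
        System.out.println("TITLE".toLowerCase(turkish));      // prints "tıtle"

        // Locale.ROOT gives a stable, locale-neutral result everywhere.
        System.out.println("TITLE".toLowerCase(Locale.ROOT));  // prints "title"
    }
}
```

The no-argument `toLowerCase()` uses the JVM's default locale, which is why case-insensitive handling of internal identifiers (config keys, table names, and so on) should pin `Locale.ROOT`.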
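As a rough illustration of what the proposed check could look like (this is a hypothetical sketch using scalastyle's generic `RegexChecker`, not the rule I wrote locally; the regex and message are placeholders):

```xml
<check customId="caselocale" class="org.scalastyle.file.RegexChecker"
       level="error" enabled="true">
  <parameters>
    <!-- Flag case conversions that do not pass an explicit Locale. -->
    <parameter name="regex">\.toLowerCase\(\)|\.toUpperCase\(\)</parameter>
  </parameters>
  <customMessage>Use toLowerCase(Locale.ROOT) / toUpperCase(Locale.ROOT),
    or wrap legitimate locale-sensitive cases in scalastyle:off/on.</customMessage>
</check>
```

Legitimate locale-sensitive call sites would then be annotated with `// scalastyle:off caselocale` / `// scalastyle:on caselocale`.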