michaelzhan-db commented on code in PR #46626:
URL: https://github.com/apache/spark/pull/46626#discussion_r1612486829
##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/csv/CSVOptions.scala:
##########
@@ -149,7 +149,13 @@ class CSVOptions(
     parameters.getOrElse(DateTimeUtils.TIMEZONE_OPTION, defaultTimeZoneId))

   // A language tag in IETF BCP 47 format
-  val locale: Locale = parameters.get(LOCALE).map(Locale.forLanguageTag).getOrElse(Locale.US)
+  val locale: Locale = parameters.get(LOCALE)
+    .map {
+      case null =>

Review Comment:
   I see. PySpark also seems to skip the `None` check in `option()` or `options()` on the Python side. Would it make sense to just add null checks in Scala and in `option()`/`options()` in PySpark? However, I'm not sure whether any options accept `null` or `None` as a legitimate value.
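   For reference, a minimal standalone sketch of what such a Scala-side null check could look like. This is not the PR's actual change: the wrapper object, the sample `parameters` map, and the error message are illustrative assumptions that only mirror the `CSVOptions.locale` pattern shown in the diff above.

   ```scala
   import java.util.Locale

   // Illustrative sketch only; not the real CSVOptions code.
   object LocaleNullCheckSketch extends App {
     val LOCALE = "locale"

     // A map that can carry a null value, e.g. if a caller manages to pass
     // null/None through option() or options() without validation.
     val parameters: Map[String, String] = Map(LOCALE -> null)

     val locale: Locale = parameters.get(LOCALE)
       .map {
         case null =>
           // Fail fast with a clear message instead of the less informative
           // NullPointerException that Locale.forLanguageTag(null) would throw.
           throw new IllegalArgumentException(
             s"The value of option '$LOCALE' must not be null.")
         case tag => Locale.forLanguageTag(tag)
       }
       .getOrElse(Locale.US)
   }
   ```

   Whether a check like this should live in each options class or earlier, in `option()`/`options()` themselves, is exactly the open question above.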