Github user HyukjinKwon commented on a diff in the pull request:

https://github.com/apache/spark/pull/16750#discussion_r99989335

--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/json/JSONOptions.scala ---

```diff
@@ -31,10 +31,11 @@ import org.apache.spark.sql.catalyst.util.{CaseInsensitiveMap, CompressionCodecs
  * Most of these map directly to Jackson's internal options, specified in [[JsonParser.Feature]].
  */
 private[sql] class JSONOptions(
-    @transient private val parameters: CaseInsensitiveMap)
+    @transient private val parameters: CaseInsensitiveMap, defaultTimeZoneId: String)
```

--- End diff --

Ah, yes, without this constructor argument we would need to introduce logic like the following before creating `JSONOptions`/`CSVOptions`:

```scala
val options = extraOptions.toMap
val caseInsensitiveOptions = new CaseInsensitiveMap(options)
val optionsWithTimeZone = if (caseInsensitiveOptions.contains("timeZone")) {
  caseInsensitiveOptions
} else {
  new CaseInsensitiveMap(
    options + ("timeZone" -> sparkSession.sessionState.conf.sessionLocalTimeZone))
}
val parsedOptions: JSONOptions = new JSONOptions(optionsWithTimeZone)
```

So I suggested passing the default as a separate constructor argument because the default value of `timeZone` can vary; `ParquetOptions` also takes an extra argument for the same reason.

Another way I suggested is to make this an `Option[TimeZone]`, decoupling it from the varying default value (like `JSONOptions.columnNameOfCorruptRecord`). However, `timestampFormat` in both options depends on `timeZone`, so it would have to become an `Option` too, which introduces more complexity. So the approach above seems better. I am fine if we find a better, cleaner way.
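For reference, a minimal sketch of how `JSONOptions` could consume the new `defaultTimeZoneId` argument; the `FastDateFormat` wiring and the default `timestampFormat` pattern here are assumptions based on the discussion above, not the exact PR code:

```scala
import java.util.{Locale, TimeZone}

import org.apache.commons.lang3.time.FastDateFormat

import org.apache.spark.sql.catalyst.util.CaseInsensitiveMap

private[sql] class JSONOptions(
    @transient private val parameters: CaseInsensitiveMap,
    defaultTimeZoneId: String) {

  // Fall back to the session-local default when the user did not set "timeZone",
  // so callers no longer need to pre-populate the option map themselves.
  val timeZone: TimeZone =
    TimeZone.getTimeZone(parameters.getOrElse("timeZone", defaultTimeZoneId))

  // "timestampFormat" is interpreted relative to timeZone; this dependency is
  // why making timeZone an Option would force timestampFormat to become one too.
  val timestampFormat: FastDateFormat = FastDateFormat.getInstance(
    parameters.getOrElse("timestampFormat", "yyyy-MM-dd'T'HH:mm:ss.SSSXXX"),
    timeZone, Locale.US)
}
```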
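And, just to illustrate the extra complexity of the rejected `Option[TimeZone]` alternative, a hypothetical sketch (names mirror the one above and are likewise assumptions):

```scala
// Hypothetical alternative: leave timeZone unset instead of baking in a default.
val timeZone: Option[TimeZone] =
  parameters.get("timeZone").map(TimeZone.getTimeZone)

// Because the formatter needs a concrete TimeZone, timestampFormat must become
// an Option as well, and every consumer then has to handle the None case.
val timestampFormat: Option[FastDateFormat] = timeZone.map { tz =>
  FastDateFormat.getInstance(
    parameters.getOrElse("timestampFormat", "yyyy-MM-dd'T'HH:mm:ss.SSSXXX"),
    tz, Locale.US)
}
```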