[ https://issues.apache.org/jira/browse/SPARK-28955?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Bill Schneider updated SPARK-28955:
-----------------------------------
    Issue Type: New Feature  (was: Wish)

Changing to New Feature; this is a recurring issue when different Spark jobs run in different time zones and we want the time to remain fixed regardless of time zone (i.e., local-time semantics).

> Support for LocalDateTime semantics
> -----------------------------------
>
>                 Key: SPARK-28955
>                 URL: https://issues.apache.org/jira/browse/SPARK-28955
>             Project: Spark
>          Issue Type: New Feature
>          Components: SQL
>    Affects Versions: 2.3.0
>            Reporter: Bill Schneider
>            Priority: Major
>
> It would be great if Spark supported local times in DataFrames, rather than
> only instants.
> The specific use case I have in mind is something like:
> * parse "2019-01-01 17:00" (no timezone) from CSV -> LocalDateTime in a
> dataframe
> * save to Parquet: the LocalDateTime is stored with the same integer value as
> 2019-01-01 17:00 UTC, but with isAdjustedToUTC=false. (Currently Spark saves
> either INT96 or TIMESTAMP_MILLIS/TIMESTAMP_MICROS, which have
> isAdjustedToUTC=true.)
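A minimal Python sketch (plain stdlib, not Spark or Parquet APIs) of the storage semantics the reporter is asking for: the wall-clock value "2019-01-01 17:00" is parsed with no time zone and encoded as the same epoch-microseconds integer that a UTC reading of that wall clock would produce, which is what a Parquet TIMESTAMP with isAdjustedToUTC=false carries. The function name `local_micros` is illustrative, not part of any Spark API.

```python
from datetime import datetime, timezone

def local_micros(text: str) -> int:
    # Parse a wall-clock timestamp with no zone (LocalDateTime semantics).
    naive = datetime.strptime(text, "%Y-%m-%d %H:%M")
    # Encode the same integer a UTC interpretation of that wall clock
    # would give; under isAdjustedToUTC=false this integer is understood
    # as a local time, so it stays fixed regardless of the writer's zone.
    return int(naive.replace(tzinfo=timezone.utc).timestamp() * 1_000_000)

print(local_micros("2019-01-01 17:00"))  # 1546362000000000 microseconds
```

By contrast, an isAdjustedToUTC=true writer would first shift the naive value by the session time zone before encoding, so the stored integer would differ between jobs running in different zones.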