[ https://issues.apache.org/jira/browse/SPARK-24673?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Hyukjin Kwon resolved SPARK-24673.
----------------------------------
    Resolution: Fixed
      Assignee: Antonio Murgia
 Fix Version/s: 2.4.0

Fixed in https://github.com/apache/spark/pull/21693/files

> scala sql function from_utc_timestamp second argument could be Column instead of String
> ---------------------------------------------------------------------------------------
>
>                 Key: SPARK-24673
>                 URL: https://issues.apache.org/jira/browse/SPARK-24673
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 2.3.1
>            Reporter: Antonio Murgia
>            Assignee: Antonio Murgia
>            Priority: Minor
>             Fix For: 2.4.0
>
>
> As of 2.3.1, the Scala API for the built-in function from_utc_timestamp (org.apache.spark.sql.functions#from_utc_timestamp) is less powerful than its SQL counterpart. In particular, given a dataset/dataframe with the following schema:
> {code:java}
> CREATE TABLE MY_TABLE (
>   ts TIMESTAMP,
>   tz STRING
> ){code}
> from the SQL API I can do something like:
> {code:java}
> SELECT FROM_UTC_TIMESTAMP(TS, TZ){code}
> while from the programmatic API I simply cannot, because the second argument of
> {code:java}
> functions.from_utc_timestamp(ts: Column, tz: String){code}
> is a String, so a per-row time zone column cannot be passed.

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
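To illustrate the limitation and the fix, here is a hedged sketch (not part of the original issue): the DataFrame, column names (`ts`, `tz`), and local SparkSession setup are illustrative assumptions. On 2.3.x the `expr`-based workaround routes through the SQL parser, which does accept a column; the `Column`-typed overload shown in the comment is the one added for 2.4.0 by the linked PR.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, expr}

object FromUtcTimestampDemo {
  def main(args: Array[String]): Unit = {
    // Local session for illustration only.
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("from_utc_timestamp-demo")
      .getOrCreate()
    import spark.implicits._

    // Hypothetical data matching the schema in the issue: a timestamp plus a per-row time zone.
    val df = Seq(
      ("2018-07-01 12:00:00", "America/New_York"),
      ("2018-07-01 12:00:00", "Asia/Seoul")
    ).toDF("ts", "tz").withColumn("ts", col("ts").cast("timestamp"))

    // 2.3.x workaround: the SQL expression parser accepts a column for the
    // time-zone argument, even though the Scala function signature does not.
    df.withColumn("local_ts", expr("from_utc_timestamp(ts, tz)")).show(false)

    // As of 2.4.0 (this issue's fix), a Column-typed overload is available:
    // import org.apache.spark.sql.functions.from_utc_timestamp
    // df.withColumn("local_ts", from_utc_timestamp(col("ts"), col("tz")))

    spark.stop()
  }
}
```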