GuoPhilipse commented on pull request #28571:
URL: https://github.com/apache/spark/pull/28571#issuecomment-630558198

   Hi HyukjinKwon,
   This change only takes effect when casting a long to a timestamp; the
   other cases do not seem to need any change for now. We want to keep
   backward compatibility with Hive SQL.
   Do you have any other cases in mind? We can discuss further.

   Thanks.
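
   To make the difference concrete, here is a minimal standalone sketch (this
   is not the PR's code: the boolean parameter stands in for the
   SQLConf.LONG_TIMESTAMP_CONVERSION_IN_SECONDS conf in the diff below, and
   the milliseconds branch is an assumption based on Hive's historical
   CAST(BIGINT AS TIMESTAMP) behavior, cf. HIVE-3454):

       // Hypothetical sketch, not Spark code: Spark stores timestamps as
       // microseconds since the epoch, so the question is whether a bare
       // long means epoch seconds (Spark's current cast) or epoch
       // milliseconds (Hive's historical cast).
       def longToTimestampMicros(t: Long, inSeconds: Boolean): Long =
         if (inSeconds) t * 1000000L // epoch seconds -> microseconds
         else t * 1000L              // epoch millis  -> microseconds (assumed branch)

       // 1589848800 read as seconds is 2020-05-19T00:40:00Z; read as
       // milliseconds it is 1970-01-19T09:37:28.800Z.
       assert(longToTimestampMicros(1589848800L, inSeconds = true) == 1589848800000000L)
       assert(longToTimestampMicros(1589848800L, inSeconds = false) == 1589848800000L)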
   
   At 2020-05-19 10:44:13, "Hyukjin Kwon" <notificati...@github.com> wrote:
   
   @HyukjinKwon commented on this pull request.
   
   In sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/Cast.scala:
   
   >    // converting seconds to us
   -  private[this] def longToTimestamp(t: Long): Long = t * 1000000L
   +  private[this] def longToTimestamp(t: Long): Long = {
   +    if (SQLConf.get.getConf(SQLConf.LONG_TIMESTAMP_CONVERSION_IN_SECONDS)) t * 1000000L
   
   
   Let's not do this. It would require adding compatibility handling across
   all the date/time functionality in Spark, e.g. from_unixtime.
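
   To make the concern concrete: from_unixtime reads its argument as epoch
   seconds, so gating only Cast.longToTimestamp would let the cast and
   from_unixtime disagree on the same long. A hedged, Spark-free illustration
   using java.time (the value is illustrative, not from the PR):

       import java.time.Instant

       val v = 1589848800L
       // from_unixtime-style reading: the long is epoch seconds
       println(Instant.ofEpochSecond(v)) // 2020-05-19T00:40:00Z
       // a milliseconds-interpreting cast reads the same long very differently
       println(Instant.ofEpochMilli(v))  // 1970-01-19T09:37:28.800Z

   Keeping the two consistent would mean threading the new conf through every
   such date/time function, which is the compatibility burden noted above.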
   

