GitHub user dmateusp commented on the issue:

    https://github.com/apache/spark/pull/21706
  
    Just added it to the FunctionRegistry:
    ```scala
    scala> spark.sql("DESC function calendarinterval").show(truncate=false)
    
+--------------------------------------------------------------------------------------------------+
    |function_desc                                                              
                       |
    
+--------------------------------------------------------------------------------------------------+
    |Function: calendarinterval                                                 
                       |
    |Class: org.apache.spark.sql.catalyst.expressions.Cast                      
                       |
    |Usage: calendarinterval(expr) - Casts the value `expr` to the target data 
type `calendarinterval`.|
    
+--------------------------------------------------------------------------------------------------+
    
    
    scala> spark.sql("select calendarinterval('interval 10 days')").show()
    +------------------------------------------+
    |CAST(interval 10 days AS CALENDARINTERVAL)|
    +------------------------------------------+
    |                      interval 1 weeks ...|
    +------------------------------------------+
    ```
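
    For context on what "added it to the FunctionRegistry" means: the built-in cast aliases (`string`, `int`, `date`, ...) are declared in `org.apache.spark.sql.catalyst.analysis.FunctionRegistry` via its `castAlias` helper, and this new name follows the same shape. The snippet below is only a rough sketch of that pattern, not the literal diff of this PR:
    ```scala
    // Sketch (excerpt-style) of how cast aliases sit inside the
    // FunctionRegistry companion object's built-in expression map.
    val expressions: Map[String, (ExpressionInfo, FunctionBuilder)] = Map(
      // ...many other built-in expressions and existing cast aliases...
      castAlias("string", StringType),                      // existing alias
      castAlias("date", DateType),                          // existing alias
      castAlias("calendarinterval", CalendarIntervalType)   // the new alias
    )
    ```
    With an entry of that shape in place, `DESC FUNCTION calendarinterval` resolves to the `Cast` expression shown above.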
    
    I sent an email to the dev mailing list because I'm having trouble understanding how the sql-tests work (the structure, and how to run them). I'll add tests there as soon as I figure that out :)

