parthchandra opened a new pull request, #4256:
URL: https://github.com/apache/datafusion-comet/pull/4256

    ## Which issue does this PR close?
   
     Closes #3119, #3120
   
    ## Rationale for this change
   
     Spark 4.1 introduced `TimeType` along with the `make_time`, `to_time`, and `try_to_time` expressions. This PR adds native Comet support for these expressions to avoid falling back to the JVM for time-related queries.
   
    ## What changes are included in this PR?
   
     `make_time(hours, minutes, seconds)` — constructs a TIME value from integer hours, minutes, and decimal seconds.
     - Rust implementation validates ranges (hours 0-23, minutes 0-59, seconds 0-59) and computes nanoseconds from midnight
     - Intercepts the `StaticInvoke(DateTimeUtils.makeTime, ...)` pattern that Spark 4.1's `MakeTime` `RuntimeReplaceable` resolves to
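
     The range validation and nanoseconds-from-midnight arithmetic can be sketched as below. The function name, signature, and error type are illustrative only, not Comet's actual API:

     ```rust
     const NANOS_PER_SEC: i64 = 1_000_000_000;

     /// Build a TIME value (nanoseconds since midnight) from integer hours,
     /// integer minutes, and decimal seconds, rejecting out-of-range inputs.
     fn make_time(hours: i64, minutes: i64, seconds: f64) -> Result<i64, String> {
         if !(0..=23).contains(&hours) {
             return Err(format!("hour out of range [0, 23]: {hours}"));
         }
         if !(0..=59).contains(&minutes) {
             return Err(format!("minute out of range [0, 59]: {minutes}"));
         }
         // Decimal seconds: any value in [0, 60) keeps the whole-second part <= 59.
         if !(0.0..60.0).contains(&seconds) {
             return Err(format!("second out of range [0, 60): {seconds}"));
         }
         let whole_nanos = (hours * 3600 + minutes * 60) * NANOS_PER_SEC;
         let frac_nanos = (seconds * NANOS_PER_SEC as f64).round() as i64;
         Ok(whole_nanos + frac_nanos)
     }

     fn main() {
         assert_eq!(make_time(0, 0, 0.0), Ok(0));
         assert_eq!(make_time(1, 2, 3.5), Ok(3_723_500_000_000));
         assert!(make_time(24, 0, 0.0).is_err());
         println!("ok");
     }
     ```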
   
     `to_time(string)` / `try_to_time(string)` — parses a string to a TIME value using Spark's default `stringToTime` format (no format-pattern variant yet).
     - Rust parser handles: `HH:mm`, `HH:mm:ss`, fractional seconds (up to microsecond precision), single-digit components, a `T` prefix, and AM/PM (case-insensitive, with or without a space)
     - `to_time` raises an error on unparseable input; `try_to_time` returns null
     - Intercepts the `Invoke(Literal(ToTimeParser(None)), "parse", ...)` and `TryEval(Invoke(...))` patterns
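
     A minimal sketch of this style of parsing, assuming the nanoseconds-since-midnight representation; Comet's real parser covers more edge cases (padding, whitespace, precision limits) than this illustration:

     ```rust
     /// Parse "[T]H[H]:m[m][:s[s][.fraction]][ ][AM|PM]" into nanoseconds
     /// since midnight. Returns None on any unparseable or out-of-range input
     /// (the try_to_time behavior; to_time would surface an error instead).
     fn string_to_time(input: &str) -> Option<i64> {
         let s = input.trim();
         let s = s.strip_prefix('T').or_else(|| s.strip_prefix('t')).unwrap_or(s);
         let upper = s.to_ascii_uppercase();

         // Optional AM/PM suffix, case-insensitive, with or without a space.
         let (body, pm) = if let Some(rest) = upper.strip_suffix("AM") {
             (rest.trim_end(), Some(false))
         } else if let Some(rest) = upper.strip_suffix("PM") {
             (rest.trim_end(), Some(true))
         } else {
             (upper.as_str(), None)
         };

         let mut parts = body.split(':');
         let hours: i64 = parts.next()?.parse().ok()?;
         let minutes: i64 = parts.next()?.parse().ok()?;
         let (seconds, nanos): (i64, i64) = match parts.next() {
             None => (0, 0), // HH:mm form, seconds default to zero
             Some(sec) => {
                 let (whole, frac) = sec.split_once('.').unwrap_or((sec, ""));
                 if frac.len() > 9 || !frac.chars().all(|c| c.is_ascii_digit()) {
                     return None;
                 }
                 // Right-pad the fraction to 9 digits of nanoseconds.
                 let mut digits = frac.to_string();
                 while digits.len() < 9 {
                     digits.push('0');
                 }
                 (whole.parse().ok()?, digits.parse().ok()?)
             }
         };
         if parts.next().is_some() {
             return None; // too many ':'-separated components
         }

         // Normalize the 12-hour clock: 12 AM -> 0, 12 PM -> 12, 1-11 PM -> +12.
         let hours = match pm {
             Some(is_pm) => {
                 if !(1..=12).contains(&hours) {
                     return None;
                 }
                 hours % 12 + if is_pm { 12 } else { 0 }
             }
             None => hours,
         };
         if !(0..=23).contains(&hours) || !(0..=59).contains(&minutes) || !(0..=59).contains(&seconds) {
             return None;
         }
         Some((hours * 3600 + minutes * 60 + seconds) * 1_000_000_000 + nanos)
     }

     fn main() {
         assert_eq!(string_to_time("00:00"), Some(0));
         assert_eq!(string_to_time("T1:2:3"), Some(3_723_000_000_000));
         assert_eq!(string_to_time("11:59:59.5 PM"), Some(86_399_500_000_000));
         assert_eq!(string_to_time("not a time"), None);
         println!("ok");
     }
     ```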
   
     Infrastructure for `TimeType`:
     - Added `TIME = 17` to the protobuf `DataTypeId` enum
     - Added `Time64(Nanosecond)` handling in serde deserialization and columnar-to-row conversion
     - Added `TimeType` serialization in `QueryPlanSerde.serializeDataType`
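
     Since a `Time64(Nanosecond)` value is a plain 8-byte integer (nanoseconds since midnight), columnar-to-row conversion can copy it like any other fixed-width primitive. A sketch with illustrative helper names (Comet's actual conversion operates on Arrow buffers):

     ```rust
     /// Write a Time64(Nanosecond) value into a fixed-width row slot as a
     /// little-endian i64, like any other 8-byte primitive.
     fn write_time64_value(row_buf: &mut [u8], offset: usize, nanos_since_midnight: i64) {
         row_buf[offset..offset + 8].copy_from_slice(&nanos_since_midnight.to_le_bytes());
     }

     /// Read the value back from the same row slot.
     fn read_time64_value(row_buf: &[u8], offset: usize) -> i64 {
         i64::from_le_bytes(row_buf[offset..offset + 8].try_into().unwrap())
     }

     fn main() {
         let mut row = [0u8; 16];
         let midday = 12_i64 * 3600 * 1_000_000_000; // 12:00:00 as nanos since midnight
         write_time64_value(&mut row, 8, midday);
         assert_eq!(read_time64_value(&row, 8), midday);
         println!("ok");
     }
     ```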
   
     All expression interception is in the Spark 4.1/4.2 shims only — no impact 
on Spark 3.x/4.0 builds.
   
    ## How are these changes tested?
   
     - Rust unit tests for `string_to_time` parsing covering all format variants, AM/PM, invalid inputs, and edge cases
     - SQL test files (`make_time.sql`, `to_time.sql`) using the Comet SQL test framework, which validates that Comet produces results identical to Spark's, covering:
       - Column and literal arguments
       - Null propagation
       - Boundary values (midnight, end-of-day, single microsecond)
       - Error cases (`expect_error`) for invalid inputs
       - `try_to_time` returning null for invalid inputs
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

