andygrove opened a new issue, #3100:
URL: https://github.com/apache/datafusion-comet/issues/3100

   ## What is the problem the feature request solves?
   
   > **Note:** This issue was generated with AI assistance. The specification 
details have been extracted from Spark documentation and may need verification.
   
   Comet does not currently support Spark's `make_ym_interval` function, so 
queries that use it fall back to Spark's JVM execution instead of running 
natively on DataFusion.
   
   The `MakeYMInterval` expression creates a year-month interval value from 
separate year and month integer components. It constructs a 
`YearMonthIntervalType` object that represents a duration in terms of years and 
months, which is useful for date arithmetic operations in Spark SQL.
   
   Supporting this expression would allow more Spark workloads to benefit from 
Comet's native acceleration.
   
   ## Describe the potential solution
   
   ### Spark Specification
   
   **Syntax:**
   ```sql
   make_ym_interval(years, months)
   make_ym_interval(years)  -- months defaults to 0
   make_ym_interval()       -- both years and months default to 0
   ```
   
   **Arguments:**
   | Argument | Type | Description |
   |----------|------|-------------|
   | years | Integer | The number of years in the interval (optional, defaults to 0) |
   | months | Integer | The number of months in the interval (optional, defaults to 0) |
   
   **Return Type:** Returns `YearMonthIntervalType()` - a year-month interval 
data type.
   
   **Supported Data Types:**
   - **Input Types**: Only `IntegerType` for both years and months parameters
   - **Output Type**: `YearMonthIntervalType`
   - Implicit casting is applied to convert compatible numeric types to integers
   
   **Edge Cases:**
   - **Null Handling**: Expression is null-intolerant (`nullIntolerant = true`), meaning if either input is null, the result is null
   - **Empty Input**: Default constructor provides zero values for both years 
and months
   - **Overflow Behavior**: Spark computes `years * 12 + months` with exact 
arithmetic and throws an `ArithmeticException` at runtime if the result 
overflows the 32-bit integer range
   - **Negative Values**: Accepts negative integers to create negative 
intervals (e.g., intervals representing past durations)
   
   **Examples:**
   ```sql
   -- Create a 2-year, 3-month interval
   SELECT make_ym_interval(2, 3);
   -- Returns: INTERVAL '2-3' YEAR TO MONTH
   
   -- Create a 1-year interval (months default to 0)
   SELECT make_ym_interval(1);
   -- Returns: INTERVAL '1-0' YEAR TO MONTH
   
   -- Create zero interval
   SELECT make_ym_interval();
   -- Returns: INTERVAL '0-0' YEAR TO MONTH
   
   -- Use in date arithmetic
   SELECT date '2023-01-01' + make_ym_interval(1, 6);
   -- Returns: 2024-07-01
   ```
   
   ```scala
   // DataFrame API usage
   import org.apache.spark.sql.functions._
   
   df.select(expr("make_ym_interval(2, 3)"))
   
   // Using with date arithmetic
    df.withColumn("future_date", col("start_date") + expr("make_ym_interval(1, 6)"))
   ```
   
   ### Implementation Approach
   
   See the [Comet guide on adding new 
expressions](https://datafusion.apache.org/comet/contributor-guide/adding_a_new_expression.html)
 for detailed instructions.
   
   1. **Scala Serde**: Add expression handler in 
`spark/src/main/scala/org/apache/comet/serde/`
   2. **Register**: Add to appropriate map in `QueryPlanSerde.scala`
   3. **Protobuf**: Add message type in `native/proto/src/proto/expr.proto` if 
needed
   4. **Rust**: Implement in `native/spark-expr/src/` (check if DataFusion has 
built-in support first)
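   As a rough sketch of the shape the Rust kernel in step 4 might take, the 
following operates on plain `Option<i32>` slices standing in for Arrow 
`Int32Array` inputs and an `IntervalYearMonth` output (which Arrow stores as an 
i32 total month count). The function name and error text are illustrative, not 
Comet's actual API:

   ```rust
   /// Illustrative batch kernel: plain slices stand in for Arrow arrays.
   /// An IntervalYearMonth value is the total month count as an i32.
   fn make_ym_interval_batch(
       years: &[Option<i32>],
       months: &[Option<i32>],
   ) -> Result<Vec<Option<i32>>, String> {
       years
           .iter()
           .zip(months.iter())
           .map(|(y, m)| match (y, m) {
               (Some(y), Some(m)) => y
                   .checked_mul(12)
                   .and_then(|total| total.checked_add(*m))
                   .map(Some)
                   .ok_or_else(|| "year-month interval out of range".to_string()),
               // Null-intolerant: propagate nulls instead of computing.
               _ => Ok(None),
           })
           .collect() // Result<Vec<_>, _> short-circuits on the first error
   }
   ```

   A real implementation would work with `arrow` arrays and DataFusion's 
`ColumnarValue` instead of `Vec`, but the per-row arithmetic and null/overflow 
handling would follow this shape.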
   
   
   ## Additional context
   
   **Difficulty:** Large
   **Spark Expression Class:** 
`org.apache.spark.sql.catalyst.expressions.MakeYMInterval`
   
   **Related:**
   - `MakeDTInterval` - Creates day-time intervals
   - `INTERVAL` literal syntax - Alternative way to create intervals
   - Date/timestamp arithmetic functions that work with intervals
   - `YearMonthIntervalType` - The underlying data type
   
   ---
   *This issue was auto-generated from Spark reference documentation.*
   

