andygrove opened a new issue, #3091:
URL: https://github.com/apache/datafusion-comet/issues/3091
## What is the problem the feature request solves?
> **Note:** This issue was generated with AI assistance. The specification
> details have been extracted from Spark documentation and may need verification.
Comet does not currently support the Spark `make_date` function, causing
queries using this function to fall back to Spark's JVM execution instead of
running natively on DataFusion.
The MakeDate expression creates a date value from separate integer year,
month, and day components. It validates the input values and returns a DateType
value representing the constructed date, with configurable error handling based
on ANSI SQL compliance mode.
Supporting this expression would allow more Spark workloads to benefit from
Comet's native acceleration.
## Describe the potential solution
### Spark Specification
**Syntax:**
```sql
make_date(year, month, day)
```
```scala
// DataFrame API (make_date is available in
// org.apache.spark.sql.functions since Spark 3.3)
make_date(col("year_col"), col("month_col"), col("day_col"))
```
**Arguments:**
| Argument | Type | Description |
|----------|------|-------------|
| year | Expression (IntegerType) | The year component as an integer |
| month | Expression (IntegerType) | The month component as an integer (1-12) |
| day | Expression (IntegerType) | The day component as an integer (1-31, depending on month) |
| failOnError | Boolean | Internal parameter controlling error-handling behavior (defaults to `SQLConf.ansiEnabled`) |
**Return Type:** DateType - Returns a date value representing the
constructed date.
**Supported Data Types:**
- Input types: All three arguments must be IntegerType or expressions that
can be implicitly cast to IntegerType
- Output type: DateType
**Edge Cases:**
- Null handling: returns null if any input argument is null (the expression is marked `nullIntolerant`)
- Invalid dates with ANSI mode enabled (`failOnError = true`): throws `QueryExecutionErrors.ansiDateTimeArgumentOutOfRange`
- Invalid dates with ANSI mode disabled: returns null for invalid date combinations
- Date range: limited by the Java `LocalDate` supported range (`Year.MIN_VALUE` to `Year.MAX_VALUE`)
- Nullable behavior: when `failOnError` is true, the result is nullable only if any child is nullable; otherwise the result is always nullable
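Taken together, the non-ANSI edge cases describe a small pure function. The Rust sketch below is illustrative only (it is not Comet or DataFusion code): it returns `None` for invalid component combinations, mirroring the non-ANSI behavior, and encodes valid dates as days since 1970-01-01, which is how Arrow's `Date32` type represents dates.

```rust
// Spark's make_date in non-ANSI mode: None for invalid component
// combinations, otherwise days since 1970-01-01 (Arrow Date32 encoding).
fn make_date(y: i32, m: i32, d: i32) -> Option<i32> {
    if !(1..=12).contains(&m) {
        return None;
    }
    let leap = (y % 4 == 0 && y % 100 != 0) || y % 400 == 0;
    let dim = [31, if leap { 29 } else { 28 }, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31];
    if d < 1 || d > dim[(m - 1) as usize] {
        return None;
    }
    // Howard Hinnant's days-from-civil algorithm for the conversion.
    let (y, m, d) = (y as i64, m as i64, d as i64);
    let y = if m <= 2 { y - 1 } else { y };
    let era = (if y >= 0 { y } else { y - 399 }) / 400;
    let yoe = y - era * 400;
    let doy = (153 * ((m + 9) % 12) + 2) / 5 + d - 1;
    let doe = yoe * 365 + yoe / 4 - yoe / 100 + doy;
    Some((era * 146097 + doe - 719468) as i32)
}
```

With this sketch, `make_date(2023, 12, 25)` yields `Some(19716)` (2023-12-25 as a Date32 value) while `make_date(2023, 2, 30)` yields `None`, matching the SQL examples below.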
**Examples:**
```sql
-- Create a date from components
SELECT make_date(2023, 12, 25) AS christmas_2023;
-- Handle invalid dates (returns NULL in non-ANSI mode)
SELECT make_date(2023, 2, 30) AS invalid_date;
-- Use with table columns
SELECT make_date(birth_year, birth_month, birth_day) AS birth_date
FROM people;
```
```scala
// DataFrame API usage
import org.apache.spark.sql.functions._
// Create date from literal values
df.select(expr("make_date(2023, 12, 25)").as("christmas"))
// Create date from column values
df.select(expr("make_date(year_col, month_col, day_col)").as("constructed_date"))
```
### Implementation Approach
See the [Comet guide on adding new
expressions](https://datafusion.apache.org/comet/contributor-guide/adding_a_new_expression.html)
for detailed instructions.
1. **Scala Serde**: Add expression handler in
`spark/src/main/scala/org/apache/comet/serde/`
2. **Register**: Add to appropriate map in `QueryPlanSerde.scala`
3. **Protobuf**: Add message type in `native/proto/src/proto/expr.proto` if
needed
4. **Rust**: Implement in `native/spark-expr/src/` (check if DataFusion has
built-in support first)
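For step 4, the native kernel can be prototyped without Arrow as a function over nullable slices. The sketch below is hypothetical (the names and signatures are illustrative, not Comet's or DataFusion's actual API), but it shows the two behaviors the specification requires: null-intolerant propagation and the `failOnError` split between raising an error and returning null. A real implementation would consume `Int32Array` inputs and build a `Date32Array`.

```rust
// Scalar helper: validate the components and convert a valid Gregorian
// date to days since 1970-01-01 (the Date32 encoding).
fn make_date_scalar(y: i32, m: i32, d: i32) -> Option<i32> {
    if !(1..=12).contains(&m) {
        return None;
    }
    let leap = (y % 4 == 0 && y % 100 != 0) || y % 400 == 0;
    let dim = [31, if leap { 29 } else { 28 }, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31];
    if d < 1 || d > dim[(m - 1) as usize] {
        return None;
    }
    let (y, m, d) = (y as i64, m as i64, d as i64);
    let y = if m <= 2 { y - 1 } else { y };
    let era = (if y >= 0 { y } else { y - 399 }) / 400;
    let yoe = y - era * 400;
    let doy = (153 * ((m + 9) % 12) + 2) / 5 + d - 1;
    Some((era * 146097 + yoe * 365 + yoe / 4 - yoe / 100 + doy - 719468) as i32)
}

// Hypothetical batch kernel over nullable year/month/day columns.
fn make_date_kernel(
    years: &[Option<i32>],
    months: &[Option<i32>],
    days: &[Option<i32>],
    fail_on_error: bool, // mirrors Spark's failOnError / ANSI mode
) -> Result<Vec<Option<i32>>, String> {
    years
        .iter()
        .zip(months)
        .zip(days)
        .map(|((y, m), d)| match (y, m, d) {
            (Some(y), Some(m), Some(d)) => match make_date_scalar(*y, *m, *d) {
                Some(epoch_days) => Ok(Some(epoch_days)),
                None if fail_on_error => Err(format!("invalid date: {y}-{m}-{d}")),
                None => Ok(None),
            },
            // nullIntolerant: any null input produces a null output
            _ => Ok(None),
        })
        .collect()
}
```

Collecting into `Result<Vec<_>, _>` aborts on the first invalid date in ANSI mode, which is the batch-level analogue of Spark throwing per row.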
## Additional context
**Difficulty:** Large
**Spark Expression Class:**
`org.apache.spark.sql.catalyst.expressions.MakeDate`
**Related:**
- MakeTimestamp - Creates timestamp values from date and time components
- DateAdd - Adds days to a date
- DateSub - Subtracts days from a date
- Year, Month, Day - Extract components from existing dates
---
*This issue was auto-generated from Spark reference documentation.*