andygrove opened a new issue, #3117:
URL: https://github.com/apache/datafusion-comet/issues/3117
## What is the problem the feature request solves?
> **Note:** This issue was generated with AI assistance. The specification details have been extracted from Spark documentation and may need verification.
Comet does not currently support the Spark `try_make_timestamp` function, so queries that use it fall back to Spark's JVM execution instead of running natively on DataFusion.
The `TryMakeTimestamp` expression creates a timestamp from individual date
and time components (year, month, day, hour, minute, second) with optional
timezone specification. Unlike the standard `make_timestamp` function, this
variant returns null instead of throwing an exception when provided with
invalid date/time values.
Supporting this expression would allow more Spark workloads to benefit from
Comet's native acceleration.
## Describe the potential solution
### Spark Specification
**Syntax:**
```sql
try_make_timestamp(year, month, day, hour, min, sec [, timezone])
```
**Arguments:**
| Argument | Type | Description |
|----------|------|-------------|
| year | Expression | The year component (e.g., 2023) |
| month | Expression | The month component (1-12) |
| day | Expression | The day component (1-31) |
| hour | Expression | The hour component (0-23) |
| min | Expression | The minute component (0-59) |
| sec | Expression | The second component (0-59, may include fractional seconds) |
| timezone | Expression (Optional) | The timezone identifier (e.g., 'UTC', 'America/New_York') |
**Return Type:** Returns `TimestampType` - a timestamp value representing
the specified date and time, or null if the input values are invalid.
**Supported Data Types:**
- **year, month, day, hour, min**: Integer types or expressions that
evaluate to integers
- **sec**: Numeric types (integer or decimal) to support fractional seconds
- **timezone**: String type containing a valid timezone identifier
**Edge Cases:**
- **Null handling**: Returns null if any required input parameter is null
- **Invalid dates**: Returns null for impossible dates (e.g., February 30,
April 31)
- **Invalid times**: Returns null for invalid time values (e.g., hour = 25,
minute = 70)
- **Timezone handling**: Returns null if an invalid timezone identifier is
provided
- **Leap year handling**: Correctly handles February 29th validation based
on leap year rules
- **Fractional seconds**: Supports fractional seconds in the seconds
parameter
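The edge cases above can be made concrete with a small Rust sketch. This is not Comet's implementation: component validation is written out by hand for illustration, pre-1970 years and timezone handling are omitted, and the function name is only a placeholder.

```rust
// Illustrative sketch of try_make_timestamp's null-on-invalid semantics:
// validate each component and return microseconds since the Unix epoch,
// or None where Spark would return null.

fn is_leap(year: i32) -> bool {
    (year % 4 == 0 && year % 100 != 0) || year % 400 == 0
}

fn days_in_month(year: i32, month: u32) -> i64 {
    match month {
        1 | 3 | 5 | 7 | 8 | 10 | 12 => 31,
        4 | 6 | 9 | 11 => 30,
        2 => if is_leap(year) { 29 } else { 28 },
        _ => 0,
    }
}

// Days since 1970-01-01 for an already-validated date (year >= 1970).
fn days_from_epoch(year: i32, month: u32, day: i64) -> i64 {
    let mut days: i64 = 0;
    for y in 1970..year {
        days += if is_leap(y) { 366 } else { 365 };
    }
    for m in 1..month {
        days += days_in_month(year, m);
    }
    days + (day - 1)
}

fn try_make_timestamp(
    year: i32, month: i64, day: i64,
    hour: i64, min: i64, sec: f64,
) -> Option<i64> {
    if year < 1970 {
        return None; // sketch limitation, not part of Spark's semantics
    }
    if !(1..=12).contains(&month) {
        return None; // invalid month
    }
    let month = month as u32;
    if day < 1 || day > days_in_month(year, month) {
        return None; // impossible date, e.g. February 30
    }
    if !(0..=23).contains(&hour) || !(0..=59).contains(&min) {
        return None; // invalid time of day
    }
    if !(0.0..60.0).contains(&sec) {
        return None; // out-of-range (or NaN) seconds
    }
    let secs = days_from_epoch(year, month, day) * 86_400 + hour * 3_600 + min * 60;
    Some(secs * 1_000_000 + (sec * 1e6).round() as i64)
}
```

Note how leap-year validation falls out of `days_in_month`: February 29 is accepted only when `is_leap` holds, matching the edge case above.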
**Examples:**
```sql
-- Basic timestamp creation
SELECT try_make_timestamp(2023, 12, 25, 10, 30, 45.5);
-- With timezone
SELECT try_make_timestamp(2023, 6, 15, 14, 30, 0, 'America/New_York');
-- Invalid date returns null instead of error
SELECT try_make_timestamp(2023, 2, 30, 10, 0, 0); -- Returns null
```
```scala
// Example DataFrame API usage
import org.apache.spark.sql.functions._
// Basic usage
df.select(expr("try_make_timestamp(year_col, month_col, day_col, hour_col, min_col, sec_col)"))
// With timezone
df.select(expr("try_make_timestamp(2023, 6, 15, 14, 30, 0, 'UTC')"))
```
### Implementation Approach
See the [Comet guide on adding new
expressions](https://datafusion.apache.org/comet/contributor-guide/adding_a_new_expression.html)
for detailed instructions.
1. **Scala Serde**: Add expression handler in
`spark/src/main/scala/org/apache/comet/serde/`
2. **Register**: Add to appropriate map in `QueryPlanSerde.scala`
3. **Protobuf**: Add message type in `native/proto/src/proto/expr.proto` if
needed
4. **Rust**: Implement in `native/spark-expr/src/` (check if DataFusion has
built-in support first)
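If a strict `make_timestamp` kernel already exists on the Rust side, step 4 may reduce to a thin wrapper. The sketch below shows the pattern only; the function names are hypothetical and the validation body is a placeholder, not the real kernel.

```rust
// Hypothetical sketch of the "try" wrapper pattern: evaluate the strict
// variant and map any error to None (null), which is how Spark's try_*
// family differs from the throwing variants.

fn make_timestamp_strict(year: i32, month: i64, day: i64) -> Result<i64, String> {
    // stand-in for the real kernel: reject an obviously impossible date;
    // a full implementation would validate the calendar and time of day
    // and convert to microseconds since the epoch
    if !(1..=12).contains(&month) || !(1..=31).contains(&day) {
        return Err(format!("invalid date: {year}-{month}-{day}"));
    }
    Ok(0) // placeholder result for the sketch
}

fn try_make_timestamp(year: i32, month: i64, day: i64) -> Option<i64> {
    // the only behavioral difference from the strict variant: errors become null
    make_timestamp_strict(year, month, day).ok()
}
```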
## Additional context
**Difficulty:** Medium
**Spark Expression Class:**
`org.apache.spark.sql.catalyst.expressions.TryMakeTimestamp`
**Related:**
- `make_timestamp` - Similar function that throws exceptions instead of
returning null
- `to_timestamp` - Parses timestamp from string representation
- `current_timestamp` - Returns current system timestamp
- `TryMakeTimestampFromDateTime` - Creates timestamp from datetime string
input
---
*This issue was auto-generated from Spark reference documentation.*
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]