andygrove opened a new issue, #3111:
URL: https://github.com/apache/datafusion-comet/issues/3111

   ## What is the problem the feature request solves?
   
   > **Note:** This issue was generated with AI assistance. The specification 
details have been extracted from Spark documentation and may need verification.
   
   Comet does not currently support the Spark `timestamp_seconds` function 
(expression class `SecondsToTimestamp`), so queries that use it fall back to 
Spark's JVM execution instead of running natively on DataFusion.
   
   The `SecondsToTimestamp` expression converts numeric values representing 
seconds since Unix epoch (1970-01-01 00:00:00 UTC) into timestamp values. It 
supports various numeric data types and handles the conversion by multiplying 
the input value by the number of microseconds per second to produce a timestamp 
in Spark's internal microsecond representation.
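   The multiply-by-microseconds conversion can be sketched in plain Scala; the helper name and constant below are illustrative, not Comet's actual code:
   
   ```scala
   // Microseconds per second, matching Spark's internal timestamp resolution.
   val MicrosPerSecond = 1000000L

   // Convert integral seconds since the Unix epoch into Spark-style
   // microseconds. Math.multiplyExact throws ArithmeticException on overflow,
   // mirroring Spark's overflow protection for integral inputs.
   def secondsToMicros(seconds: Long): Long =
     Math.multiplyExact(seconds, MicrosPerSecond)
   ```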
   
   Supporting this expression would allow more Spark workloads to benefit from 
Comet's native acceleration.
   
   ## Describe the potential solution
   
   ### Spark Specification
   
   **Syntax:**
   ```sql
   timestamp_seconds(seconds)
   ```
   
   ```scala
   // DataFrame API
   import org.apache.spark.sql.functions._
   timestamp_seconds(col("seconds_column"))
   // or via a SQL expression string
   expr("timestamp_seconds(seconds_column)")
   ```
   
   **Arguments:**
   | Argument | Type | Description |
   |----------|------|-------------|
   | seconds | Numeric | The number of seconds since Unix epoch (1970-01-01 00:00:00 UTC) to convert to a timestamp |
   
   **Return Type:** `TimestampType` - Returns a timestamp value representing 
the specified number of seconds since Unix epoch.
   
   **Supported Data Types:**
   - IntegralType (Byte, Short, Integer, Long)
   - DecimalType 
   - FloatType
   - DoubleType
   
   **Edge Cases:**
   - **Null handling**: Expression is null-intolerant, meaning null inputs 
produce null outputs
   - **Float/Double special values**: NaN and Infinite values in float/double 
inputs return null instead of throwing exceptions
   - **Overflow protection**: Integral types use `Math.multiplyExact()` and 
decimal types use `longValueExact()` to detect arithmetic overflow
   - **Precision handling**: Decimal inputs maintain precision through 
`BigDecimal` arithmetic before final conversion
   - **Nullable behavior**: Float and Double inputs make the result nullable 
due to potential NaN/Infinite handling; other numeric types inherit nullability 
from the child expression
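   A minimal sketch of the fractional-input edge cases in plain Scala (helper names are hypothetical; `None` stands in for SQL `NULL`):
   
   ```scala
   // Float/Double inputs: NaN and infinities yield null (None) rather than
   // throwing; finite values are scaled to microseconds.
   def doubleSecondsToMicros(seconds: Double): Option[Long] =
     if (seconds.isNaN || seconds.isInfinite) None
     else Some((seconds * 1000000L).toLong)

   // Decimal inputs keep precision via BigDecimal arithmetic; longValueExact
   // throws ArithmeticException if the scaled value does not fit in a Long.
   def decimalSecondsToMicros(seconds: BigDecimal): Long =
     (seconds * 1000000).bigDecimal.longValueExact()
   ```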
   
   **Examples:**
   ```sql
   -- Convert integer seconds to timestamp
   SELECT timestamp_seconds(1640995200) as converted_timestamp;
   -- Result: 2021-12-31 16:00:00 (with session time zone America/Los_Angeles;
   -- equivalent to 2022-01-01 00:00:00 UTC)
   
   -- Convert decimal seconds to timestamp
   SELECT timestamp_seconds(1640995200.123) as converted_timestamp;
   -- Result: 2021-12-31 16:00:00.123
   
   -- Handle null input
   SELECT timestamp_seconds(null) as converted_timestamp;
   -- Result: NULL
   ```
   
   ```scala
   // DataFrame API usage
   import org.apache.spark.sql.functions._
   
   // Convert seconds column to timestamp
   df.select(expr("timestamp_seconds(epoch_seconds)").as("converted_timestamp"))
   
   // Using with literal values
   df.select(expr("timestamp_seconds(1640995200)").as("converted_timestamp"))
   
   // Handle floating point seconds
   df.select(expr("timestamp_seconds(1640995200.123)").as("converted_timestamp"))
   ```
   
   ### Implementation Approach
   
   See the [Comet guide on adding new expressions](https://datafusion.apache.org/comet/contributor-guide/adding_a_new_expression.html) for detailed instructions.
   
   1. **Scala Serde**: Add an expression handler in `spark/src/main/scala/org/apache/comet/serde/`
   2. **Register**: Add it to the appropriate map in `QueryPlanSerde.scala`
   3. **Protobuf**: Add a message type in `native/proto/src/proto/expr.proto` if needed
   4. **Rust**: Implement in `native/spark-expr/src/` (check whether DataFusion has built-in support first)
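   To make steps 1-2 concrete, here is a heavily simplified, self-contained sketch of the serde shape; the case classes below are illustrative stand-ins for Comet's generated protobuf builders and serde traits, whose real names and signatures differ:
   
   ```scala
   // Illustrative stand-in for a serialized native expression (the real
   // messages are defined in native/proto/src/proto/expr.proto).
   case class ExprProto(name: String, children: Seq[ExprProto])

   // Illustrative stand-in for a Spark Catalyst expression node.
   case class SparkExpr(prettyName: String, children: Seq[SparkExpr])

   // Sketch of a handler: serialize the single child, then wrap it in a
   // native seconds_to_timestamp message. Returning None signals
   // "unsupported" and triggers fallback to Spark.
   def convertSecondsToTimestamp(
       expr: SparkExpr,
       convertChild: SparkExpr => Option[ExprProto]): Option[ExprProto] =
     expr.children match {
       case Seq(child) =>
         convertChild(child).map(c => ExprProto("seconds_to_timestamp", Seq(c)))
       case _ => None
     }
   ```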
   
   
   ## Additional context
   
   **Difficulty:** Medium
   **Spark Expression Class:** `org.apache.spark.sql.catalyst.expressions.SecondsToTimestamp`
   
   **Related:**
   - `UnixSeconds` - Converts timestamps back to seconds since epoch
   - `UnixTimestamp` - Similar functionality with different precision handling
   - `FromUnixTime` - Converts Unix timestamp to formatted string
   - `ToTimestamp` - Converts string to timestamp with format parsing
   
   ---
   *This issue was auto-generated from Spark reference documentation.*
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

