weimingdiit opened a new issue, #2130:
URL: https://github.com/apache/auron/issues/2130

   ### Description
   
   Implement Spark-compatible `weekofyear()` function
   
   Auron already supports several Spark-compatible date/time extraction functions such as `year()`, `month()`, `dayofmonth()`, `dayofweek()`, `quarter()`, `hour()`, `minute()`, and `second()`.
   However, Spark also provides `weekofyear()`, which returns the week number of a given date using ISO-style week semantics.
   
   To achieve better compatibility with Spark SQL, we should implement `weekofyear()` with the following characteristics:
   
   > Expected behavior
   
   Function name: `weekofyear(expr)`
   
   Return type: `INT`
   
   Week semantics:
   - A week starts on Monday
   - Week 1 is the first week that contains more than 3 of its days in the new year (equivalently, the week containing the year's first Thursday)
   - This is consistent with ISO 8601 week numbering
   
   Examples:
   - `weekofyear('2008-02-20')` -> `8`
   - `weekofyear('2009-07-30')` -> `31`
   - `weekofyear('2016-01-01')` -> `53`
   - `weekofyear('2017-01-01')` -> `52`
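   The expected values above match standard ISO 8601 week numbering; a minimal reference sketch using Python's stdlib (illustrative only, not the proposed native implementation):

```python
from datetime import date

def weekofyear(d: date) -> int:
    # ISO 8601 week number: weeks start on Monday; week 1 is the
    # first week with more than 3 of its days in the new year.
    return d.isocalendar()[1]

print(weekofyear(date(2008, 2, 20)))  # 8
print(weekofyear(date(2009, 7, 30)))  # 31
print(weekofyear(date(2016, 1, 1)))   # 53
print(weekofyear(date(2017, 1, 1)))   # 52
```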
   
   Supports: `DATE`, `TIMESTAMP`, and compatible string/date inputs, consistent with the existing date extraction functions
   
   Additional expectations:
   - Null-safe: should return `NULL` if input is `NULL`
   - Array and scalar inputs: consistent with the current native date extraction function implementations
   - Cross-year boundary behavior should match Spark semantics exactly
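   For the cross-year boundary rule, the week number can be derived without a calendar library by anchoring on the Thursday of each Monday-based week; a rough sketch of the algorithm (illustrative only, not the proposed native implementation):

```python
from datetime import date, timedelta

def iso_week(d: date) -> int:
    # The ISO week-year of a date is the calendar year of the Thursday
    # in its Monday-based week; week 1 is the week containing that
    # year's first Thursday.
    thursday = d + timedelta(days=3 - d.weekday())  # weekday(): Mon=0
    jan1 = date(thursday.year, 1, 1)
    return (thursday - jan1).days // 7 + 1

# Cross-year boundary cases from the examples above:
print(iso_week(date(2016, 1, 1)))  # 53 (falls in 2015's last ISO week)
print(iso_week(date(2017, 1, 1)))  # 52 (falls in 2016's last ISO week)
```

   Because the Thursday anchor decides which year a boundary week belongs to, dates like `2016-01-01` correctly resolve to the previous year's final week, matching Spark's behavior.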
   
   ### Motivation
   - Ensure feature parity with Apache Spark SQL
   - Improve compatibility for expressions and queries migrated from Spark
   - Reduce fallbacks to the Spark expression/UDF wrapper
   - Provide correct native behavior for ISO week-based calculations, especially around year boundaries
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
