Copilot commented on code in PR #2131:
URL: https://github.com/apache/auron/pull/2131#discussion_r3007143507


##########
spark-extension-shims-spark/src/test/scala/org/apache/auron/AuronFunctionSuite.scala:
##########
@@ -137,6 +137,28 @@ class AuronFunctionSuite extends AuronQueryTest with BaseAuronSQLSuite {
     }
   }
 
+  test("weekofyear function") {
+    withTable("t1") {
+      sql("create table t1(c1 date, c2 date, c3 date, c4 date) using parquet")
+      sql("""insert into t1 values (
+          |  date'2009-07-30',
+          |  date'1980-12-31',
+          |  date'2016-01-01',
+          |  date'2017-01-01'
+          |)""".stripMargin)
+
+      val query =
+        """select
+          |  weekofyear(c1),
+          |  weekofyear(c2),
+          |  weekofyear(c3),
+          |  weekofyear(c4)
+          |from t1
+          |""".stripMargin
+      checkSparkAnswerAndOperator(query)

Review Comment:
   The new integration test exercises `weekofyear` on `DATE` columns only. 
Since the implementation is intended to support `TIMESTAMP` and compatible 
string inputs too, it would be good to add cases that call `weekofyear()` on a 
timestamp (ideally with a non-UTC `sessionLocalTimeZone` to catch boundary 
issues) and on a string literal/column, so the Spark->native translation and 
casting paths are covered end-to-end.
   ```suggestion
       withSQLConf("spark.sql.session.timeZone" -> "America/Los_Angeles") {
         withTable("t1") {
          sql("create table t1(c1 date, c2 date, c3 date, c4 date, c5 timestamp, c6 string) using parquet")
           sql(
             """insert into t1 values (
               |  date'2009-07-30',
               |  date'1980-12-31',
               |  date'2016-01-01',
               |  date'2017-01-01',
               |  timestamp'2016-01-03 23:30:00',
               |  '2016-01-01'
               |)""".stripMargin)
   
           val query =
             """select
               |  weekofyear(c1),
               |  weekofyear(c2),
               |  weekofyear(c3),
               |  weekofyear(c4),
               |  weekofyear(c5),
               |  weekofyear(c6)
               |from t1
               |""".stripMargin
           checkSparkAnswerAndOperator(query)
         }
   ```
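
Spark's `weekofyear` follows ISO-8601 week numbering, so the expected answers for the four dates in this test can be sanity-checked with Python's stdlib `isocalendar()` (an illustrative cross-check, not part of the suite):

```python
from datetime import date

# ISO-8601 week numbers for the dates inserted by the test above.
# Python's isocalendar() implements the same rule Spark's weekofyear()
# uses: Monday-first weeks, week 1 is the first week with 4+ days.
dates = {
    "2009-07-30": date(2009, 7, 30),
    "1980-12-31": date(1980, 12, 31),
    "2016-01-01": date(2016, 1, 1),
    "2017-01-01": date(2017, 1, 1),
}
weeks = {s: d.isocalendar()[1] for s, d in dates.items()}
print(weeks)
# 1980-12-31 (a Wednesday) falls in ISO week 1 of 1981,
# and 2016-01-01 (a Friday) falls in ISO week 53 of 2015.
```

Note that the year-boundary dates are exactly the cases where a timezone-shifted date would flip the result between week 1 and week 52/53.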



##########
native-engine/datafusion-ext-functions/src/spark_dates.rs:
##########
@@ -46,6 +46,30 @@ pub fn spark_day(args: &[ColumnarValue]) -> Result<ColumnarValue> {
     Ok(ColumnarValue::Array(date_part(&input, DatePart::Day)?))
 }
 
+/// `spark_weekofyear(date/timestamp/compatible-string)`
+///
+/// Matches Spark's `weekofyear()` semantics:
+/// ISO week numbering, with Monday as the first day of the week,
+/// and week 1 defined as the first week with more than 3 days.
+pub fn spark_weekofyear(args: &[ColumnarValue]) -> Result<ColumnarValue> {
+    let input = cast(&args[0].clone().into_array(1)?, &DataType::Date32)?;
+    let input = input
+        .as_any()
+        .downcast_ref::<Date32Array>()
+        .expect("internal cast to Date32 must succeed");
+
+    let epoch = NaiveDate::from_ymd_opt(1970, 1, 1).expect("1970-01-01 must be a valid date");
+    let weekofyear = Int32Array::from_iter(input.iter().map(|opt_days| {
+        opt_days.and_then(|days| {
+            epoch
+                .checked_add_signed(Duration::days(days as i64))
+                .map(|date| date.iso_week().week() as i32)
+        })
+    }));
+
+    Ok(ColumnarValue::Array(Arc::new(weekofyear)))

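
Editorial note: the `Date32` path above is just "Unix epoch plus N days, then take the ISO week". The same arithmetic can be sketched with stdlib Python (illustrative only, not part of the PR):

```python
from datetime import date, timedelta

# Mirror the native Date32 path sketched above: add days-since-epoch
# to 1970-01-01, then take the ISO week number of the resulting date.
EPOCH = date(1970, 1, 1)

def weekofyear_from_days(days: int) -> int:
    return (EPOCH + timedelta(days=days)).isocalendar()[1]

# 2016-01-01 is a Friday, so it belongs to ISO week 53 of 2015.
days = (date(2016, 1, 1) - EPOCH).days
print(weekofyear_from_days(days))  # 53
```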
Review Comment:
   `spark_weekofyear` currently casts the input to `Date32` using the generic 
Arrow/DataFusion cast. For `Timestamp` inputs this cast is timezone-agnostic, 
but Spark’s `weekofyear(timestamp)` uses the session local time zone when 
deriving the date, so results can differ around midnight/week boundaries. To 
match Spark semantics, consider accepting an optional timezone argument (like 
`spark_hour/minute/second` do), localizing timestamp epoch milliseconds before 
computing the ISO week, and keeping the current `Date32` path for `Date`/string 
inputs.
   ```suggestion
   /// `spark_weekofyear(date/timestamp/compatible-string[, timezone])`
   ///
   /// Matches Spark's `weekofyear()` semantics:
   /// ISO week numbering, with Monday as the first day of the week,
   /// and week 1 defined as the first week with more than 3 days.
   ///
    /// For `Timestamp` inputs, this function interprets epoch milliseconds in the
    /// provided timezone (if any) before deriving the calendar date and ISO week.
   /// If no timezone is provided, `UTC` is used by default. For `Date` and
   /// compatible string inputs, the behavior is unchanged: the value is cast to
   /// `Date32` and the ISO week is computed from the resulting date.
   pub fn spark_weekofyear(args: &[ColumnarValue]) -> Result<ColumnarValue> {
       // First argument as an Arrow array (date/timestamp/string, etc.)
       let array = args[0].clone().into_array(1)?;
   
       // Determine timezone (for timestamp inputs). Default to UTC to match
       // existing behavior when no timezone is provided.
       let default_tz = chrono_tz::UTC;
       let tz: Tz = if args.len() > 1 {
           match &args[1] {
               ColumnarValue::Scalar(ScalarValue::Utf8(Some(s)))
               | ColumnarValue::Scalar(ScalarValue::LargeUtf8(Some(s))) => {
                   s.parse::<Tz>().unwrap_or(default_tz)
               }
               _ => default_tz,
           }
       } else {
           default_tz
       };
   
       match array.data_type() {
            // Timestamp inputs: localize epoch milliseconds before computing ISO week
            DataType::Timestamp(TimeUnit::Millisecond, _) => {
                let ts_arr = array
                    .as_any()
                    .downcast_ref::<TimestampMillisecondArray>()
                    .expect("internal cast to TimestampMillisecondArray must succeed");

                let weekofyear = Int32Array::from_iter(ts_arr.iter().map(|opt_ms| {
                    opt_ms.and_then(|ms| {
                        // Localize epoch milliseconds to the chosen timezone, then
                        // derive the ISO week number from the resulting date.
                        // (timestamp_millis_opt avoids the deprecated, panicking
                        // timestamp_millis; ambiguous/invalid instants yield null.)
                        tz.timestamp_millis_opt(ms)
                            .single()
                            .map(|dt| dt.date_naive().iso_week().week() as i32)
                    })
                }));

                Ok(ColumnarValue::Array(Arc::new(weekofyear)))
            }
           // Non-timestamp inputs: preserve existing Date32-based behavior
           _ => {
               let input = cast(&array, &DataType::Date32)?;
               let input = input
                   .as_any()
                   .downcast_ref::<Date32Array>()
                   .expect("internal cast to Date32 must succeed");
   
                let epoch =
                    NaiveDate::from_ymd_opt(1970, 1, 1).expect("1970-01-01 must be a valid date");
                let weekofyear = Int32Array::from_iter(input.iter().map(|opt_days| {
                   opt_days.and_then(|days| {
                       epoch
                           .checked_add_signed(Duration::days(days as i64))
                           .map(|date| date.iso_week().week() as i32)
                   })
               }));
   
               Ok(ColumnarValue::Array(Arc::new(weekofyear)))
           }
       }
   ```
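
To make the midnight/week-boundary concern concrete: the same physical instant can land in different ISO weeks depending on which timezone is used to derive the calendar date. A stdlib Python illustration (not part of the PR; it reuses the `America/Los_Angeles` timestamp from the test suggestion above):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# A wall-clock Sunday evening in Los Angeles is already Monday in UTC,
# so a timezone-agnostic date derivation shifts the ISO week.
local = datetime(2016, 1, 3, 23, 30, tzinfo=ZoneInfo("America/Los_Angeles"))
as_utc = local.astimezone(timezone.utc)

local_week = local.date().isocalendar()[1]  # Sunday 2016-01-03 -> week 53 of 2015
utc_week = as_utc.date().isocalendar()[1]   # Monday 2016-01-04 -> week 1 of 2016
print(local_week, utc_week)
```

This is the divergence Spark's session-local semantics avoid: Spark would report week 53 here, while a UTC-based cast reports week 1.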



##########
spark-extension/src/main/scala/org/apache/spark/sql/auron/NativeConverters.scala:
##########
@@ -926,6 +926,8 @@ object NativeConverters extends Logging {
       case Year(child) => buildExtScalarFunction("Spark_Year", child :: Nil, IntegerType)
       case Month(child) => buildExtScalarFunction("Spark_Month", child :: Nil, IntegerType)
       case DayOfMonth(child) => buildExtScalarFunction("Spark_Day", child :: Nil, IntegerType)
+      case WeekOfYear(child) =>
+        buildExtScalarFunction("Spark_WeekOfYear", child :: Nil, IntegerType)

Review Comment:
   `WeekOfYear` is converted to the 1-arg ext function `Spark_WeekOfYear` 
regardless of input type. For `TimestampType` inputs this loses Spark’s session 
time zone semantics (Spark computes week-of-year after converting the timestamp 
to a local date in `SQLConf.sessionLocalTimeZone`, similar to how 
`Hour/Minute/Second` are handled via `buildTimePartExt`). Consider passing the 
session time zone as an additional argument when the child is a timestamp and 
updating the native function to accept/use it; otherwise timestamp results can 
differ from Spark around day/week boundaries.
   ```suggestion
           child.dataType match {
             case TimestampType =>
               buildExtScalarFunction(
                 "Spark_WeekOfYear",
                 child :: Literal(SQLConf.get.sessionLocalTimeZone) :: Nil,
                 IntegerType)
             case _ =>
              buildExtScalarFunction("Spark_WeekOfYear", child :: Nil, IntegerType)
           }
   ```



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
