HyukjinKwon commented on a change in pull request #32709:
URL: https://github.com/apache/spark/pull/32709#discussion_r642464454



##########
File path: R/pkg/tests/fulltests/test_sparkSQL_arrow.R
##########
@@ -68,7 +68,7 @@ test_that("createDataFrame/collect Arrow optimization - type specification", {
     callJMethod(conf, "set", "spark.sql.execution.arrow.sparkr.enabled", arrowEnabled)
   })
 
-  expect_equal(collect(createDataFrame(rdf)), expected)
+  expect_true(all(collect(createDataFrame(rdf)) == expected))

Review comment:
       Here I work around this to make the tests pass on any R version. The problem is that [R 4.1 introduced a `check.tzone` argument in `all.equal`](https://cran.r-project.org/doc/manuals/r-release/NEWS.html), which `testthat` apparently relies on.
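
   As a standalone sketch of the failure mode (plain R, no Spark involved, values made up): under R >= 4.1, `all.equal` on `POSIXct` compares the `tzone` attribute by default, while `==` compares only the underlying instant, which is what the workaround relies on:

   ```r
   a <- as.POSIXct("1990-02-24 12:34:56", tz = "UTC")
   b <- a
   attr(b, "tzone") <- ""  # same instant, but an empty tzone attribute

   all.equal(a, b)                       # R >= 4.1: reports inconsistent 'tzone' attributes
   all.equal(a, b, check.tzone = FALSE)  # TRUE: values equal once tzone is ignored
   a == b                                # TRUE: '==' compares the epoch instant only
   ```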
   
   When you collect a `POSIXct` column whose `tzone` is empty (the default), the Arrow conversion fills in the local timezone instead of leaving it empty:
   
   ```r
   # A one-row data.frame with a POSIXct column.
   rdf <- data.frame(list(list(t = as.POSIXct("1990-02-24 12:34:56", tz = "UTC"))))

   # Round-trip through Spark without Arrow optimization.
   SparkR:::callJMethod(SparkR:::callJMethod(spark, "conf"), "set", "spark.sql.execution.arrow.sparkr.enabled", "false")
   withoutArrow <- collect(createDataFrame(rdf))

   # Round-trip through Spark with Arrow optimization.
   SparkR:::callJMethod(SparkR:::callJMethod(spark, "conf"), "set", "spark.sql.execution.arrow.sparkr.enabled", "true")
   withArrow <- collect(createDataFrame(rdf))
   
   attr(withoutArrow$t, "tzone")
   attr(withArrow$t, "tzone")
   ```
   ```
   [1] ""
   [1] "Asia/Seoul"
   ```
   
   Spark returns local time instances in Scala, Python and R. Therefore, I think either an empty timezone or the local timezone can be considered correct in the Spark context, so it's not an issue IMO.
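
   As a minimal illustration of that point (values borrowed from the example above): two `POSIXct` vectors denoting the same instant compare equal with `==` even when their `tzone` attributes differ, so the value-only comparison passes both with and without Arrow:

   ```r
   utc   <- as.POSIXct("1990-02-24 12:34:56", tz = "UTC")
   seoul <- utc
   attr(seoul, "tzone") <- "Asia/Seoul"  # same instant, different display timezone

   utc == seoul   # TRUE: comparison uses the underlying epoch seconds
   format(utc)    # "1990-02-24 12:34:56"
   format(seoul)  # "1990-02-24 21:34:56" (UTC+9); only the display differs
   ```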
   
   FWIW, we're the ones who set the timezone on the JVM side, if I remember correctly.
   
   



