Github user felixcheung commented on a diff in the pull request:

    https://github.com/apache/spark/pull/16330#discussion_r105545762

    --- Diff: R/pkg/inst/tests/testthat/test_sparkSQL.R ---
    @@ -2897,6 +2898,27 @@ test_that("Collect on DataFrame when NAs exists at the top of a timestamp column
         expect_equal(class(ldf3$col3), c("POSIXct", "POSIXt"))
     })

    +compare_list <- function(list1, list2) {
    +  # get testthat to show the diff by first making the 2 lists equal in length
    +  expect_equal(length(list1), length(list2))
    +  l <- max(length(list1), length(list2))
    +  length(list1) <- l
    +  length(list2) <- l
    +  expect_equal(sort(list1, na.last = TRUE), sort(list2, na.last = TRUE))
    +}
    +
    +# This should always be the last test in this test file.
    +test_that("No extra files are created in SPARK_HOME by starting session and making calls", {
    +  # Check that it is not creating any extra file.
    +  # Does not check the tempdir which would be cleaned up after.
    +  filesAfter <- list.files(path = file.path(Sys.getenv("SPARK_HOME"), "R"), all.files = TRUE)
    +
    +  expect_true(length(sparkHomeFileBefore) > 0)
    +  compare_list(sparkHomeFileBefore, filesBefore)
    --- End diff ---

    I'm trying to catch a few things with this - will add some comments. For instance:
    1) what's created by calling `sparkR.session(enableHiveSupport = F)` (every test file except test_sparkSQL.R)
    2) what's created by calling `sparkR.session(enableHiveSupport = T)` (test_sparkSQL.R)

    Unfortunately this doesn't quite work as expected - it should have failed instead of passing, because the Scala tests run before the R tests and have already caused spark-warehouse and metastore_db to be created, before any R code is run. Reworking that now.
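    For readers following along, here is a minimal, self-contained sketch of the before/after snapshot approach this test depends on. Names like `sparkRDir` and `sparkRFilesBefore` are illustrative, not the identifiers in the PR; the key point is that the "before" snapshot must be taken before anything touches SPARK_HOME, or pre-existing artifacts (like the Scala tests' spark-warehouse and metastore_db) end up baked into the baseline:

    ```r
    library(testthat)

    # Snapshot taken before any sparkR.session() call (e.g. at the top of a
    # test runner script). If the Scala tests have already created
    # spark-warehouse or metastore_db at this point, the snapshot silently
    # includes them - which is exactly the problem described above.
    sparkRDir <- file.path(Sys.getenv("SPARK_HOME"), "R")
    sparkRFilesBefore <- list.files(path = sparkRDir, all.files = TRUE)

    # ... sparkR.session() is started and tests run here ...

    test_that("no extra files are created in SPARK_HOME/R", {
      filesAfter <- list.files(path = sparkRDir, all.files = TRUE)

      # Pad both vectors to the same length with NA so that, on failure,
      # testthat prints the element-wise mismatch instead of only
      # reporting that the lengths differ.
      l <- max(length(sparkRFilesBefore), length(filesAfter))
      length(sparkRFilesBefore) <- l
      length(filesAfter) <- l

      expect_equal(sort(sparkRFilesBefore, na.last = TRUE),
                   sort(filesAfter, na.last = TRUE))
    })
    ```

    The NA-padding trick is the same one `compare_list` in the diff uses: `sort(..., na.last = TRUE)` keeps the padding values at the end, so a genuine extra file surfaces as a named mismatch in testthat's output.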