[ https://issues.apache.org/jira/browse/SPARK-15091?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Shivaram Venkataraman resolved SPARK-15091.
-------------------------------------------
    Resolution: Fixed
      Assignee: Sun Rui
 Fix Version/s: 2.0.0

Issue resolved by https://github.com/apache/spark/pull/12867

> Fix warnings and a failure in SparkR test cases with testthat version 1.0.1
> ---------------------------------------------------------------------------
>
>                 Key: SPARK-15091
>                 URL: https://issues.apache.org/jira/browse/SPARK-15091
>             Project: Spark
>          Issue Type: Improvement
>          Components: SparkR
>    Affects Versions: 1.6.1
>            Reporter: Sun Rui
>            Assignee: Sun Rui
>             Fix For: 2.0.0
>
>
> After upgrading the "testthat" package to version 1.0.1, new warnings and a
> new failure were found in the SparkR test cases:
> ```
> Warnings -----------------------------------------------------------------
> 1. multiple packages don't produce a warning (@test_client.R#35) - `not()` is deprecated.
> 2. sparkJars sparkPackages as comma-separated strings (@test_context.R#141) - `not()` is deprecated.
> 3. spark.survreg (@test_mllib.R#453) - `not()` is deprecated.
> 4. date functions on a DataFrame (@test_sparkSQL.R#1199) - Deprecated: please use `expect_gt()` instead
> 5. date functions on a DataFrame (@test_sparkSQL.R#1200) - Deprecated: please use `expect_gt()` instead
> 6. date functions on a DataFrame (@test_sparkSQL.R#1201) - Deprecated: please use `expect_gt()` instead
> 7. Method as.data.frame as a synonym for collect() (@test_sparkSQL.R#1899) - `not()` is deprecated.
> Failure: showDF() (@test_sparkSQL.R#1513) --------------------------------
> `s` produced no output
> ```
> Changes in releases of testthat can be found at
> https://github.com/hadley/testthat/blob/master/NEWS.md

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
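The warnings above come from testthat 1.0.x deprecating the old `not()` and `expect_more_than()` expectation helpers. A minimal sketch of the kind of migration the fix applies, with hypothetical values (not taken from the actual SparkR test files):

```r
library(testthat)

test_that("deprecated testthat idioms are migrated", {
  # Hypothetical data standing in for the values checked in the SparkR tests.
  jars <- c("one.jar", "two.jar")

  # Old style, deprecated in testthat 1.0.x:
  #   expect_that(length(jars), not(equals(0)))
  # New style:
  expect_false(length(jars) == 0)

  # Old style, deprecated in testthat 1.0.x:
  #   expect_more_than(length(jars), 0)
  # New style:
  expect_gt(length(jars), 0)
})
```

Both replacements are behavior-preserving: `expect_false()` negates a condition directly instead of wrapping a matcher in `not()`, and `expect_gt()` is the renamed successor of `expect_more_than()`.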