This is an automated email from the ASF dual-hosted git repository.

kabhwan pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/master by this push:
     new bbe95cfcd05 [SPARK-45946][SS] Fix use of deprecated FileUtils write to pass default charset in RocksDBSuite

bbe95cfcd05 is described below

commit bbe95cfcd05728dca3810bbbf72c663729296587
Author: Anish Shrigondekar <anish.shrigonde...@databricks.com>
AuthorDate: Fri Nov 17 07:12:05 2023 +0900

    [SPARK-45946][SS] Fix use of deprecated FileUtils write to pass default charset in RocksDBSuite

    ### What changes were proposed in this pull request?

    Fix use of deprecated FileUtils write to pass default charset in RocksDBSuite

    ### Why are the changes needed?

    Without the change, we were getting this compilation warning:

    ```
    [warn] /Users/anish.shrigondekar/spark/spark/sql/core/src/test/scala/org/apache/spark/sql/execution/streaming/state/RocksDBSuite.scala:854:17: method write in class FileUtils is deprecated
    [warn] Applicable -Wconf / nowarn filters for this warning: msg=<part of the message>, cat=deprecation, site=org.apache.spark.sql.execution.streaming.state.RocksDBSuite, origin=org.apache.commons.io.FileUtils.write
    [warn]       FileUtils.write(file2, s"v2\n$json2")
    [warn]                 ^
    [warn] /Users/anish.shrigondekar/spark/spark/sql/core/src/test/scala/org/apache/spark/sql/execution/streaming/state/RocksDBSuite.scala:1272:17: method write in class FileUtils is deprecated
    [warn] Applicable -Wconf / nowarn filters for this warning: msg=<part of the message>, cat=deprecation, site=org.apache.spark.sql.execution.streaming.state.RocksDBSuite.generateFiles.$anonfun, origin=org.apache.commons.io.FileUtils.write
    [warn]       FileUtils.write(file, "a" * length)
    [warn]
    ```

    ### Does this PR introduce _any_ user-facing change?

    No

    ### How was this patch tested?
    Ran test suite:

    ```
    22:47:45.700 WARN org.apache.spark.sql.execution.streaming.state.RocksDBSuite: ===== POSSIBLE THREAD LEAK IN SUITE o.a.s.sql.execution.streaming.state.RocksDBSuite, threads: ForkJoinPool.commonPool-worker-6 (daemon=true), ForkJoinPool.commonPool-worker-4 (daemon=true), rpc-boss-3-1 (daemon=true), ForkJoinPool.commonPool-worker-5 (daemon=true), ForkJoinPool.commonPool-worker-3 (daemon=true), ForkJoinPool.commonPool-worker-2 (daemon=true), shuffle-boss-6-1 (daemon=true), ForkJoinPool.commonPool-worker-1 (daemon=true) =====
    [info] Run completed in 1 minute, 55 seconds.
    [info] Total number of tests run: 77
    [info] Suites: completed 1, aborted 0
    [info] Tests: succeeded 77, failed 0, canceled 0, ignored 0, pending 0
    [info] All tests passed.
    [success] Total time: 172 s (02:52), completed Nov 15, 2023, 10:47:46 PM
    ```

    ### Was this patch authored or co-authored using generative AI tooling?

    No

Closes #43832 from anishshri-db/task/SPARK-45946.

Authored-by: Anish Shrigondekar <anish.shrigonde...@databricks.com>
Signed-off-by: Jungtaek Lim <kabhwan.opensou...@gmail.com>
---
 .../org/apache/spark/sql/execution/streaming/state/RocksDBSuite.scala | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/sql/core/src/test/scala/org/apache/spark/sql/execution/streaming/state/RocksDBSuite.scala b/sql/core/src/test/scala/org/apache/spark/sql/execution/streaming/state/RocksDBSuite.scala
index ddef26224f2..e290f808f56 100644
--- a/sql/core/src/test/scala/org/apache/spark/sql/execution/streaming/state/RocksDBSuite.scala
+++ b/sql/core/src/test/scala/org/apache/spark/sql/execution/streaming/state/RocksDBSuite.scala
@@ -851,7 +851,7 @@ class RocksDBSuite extends AlsoTestWithChangelogCheckpointingEnabled with Shared
     withTempDir { dir =>
       val file2 = new File(dir, "json")
       val json2 = """{"sstFiles":[],"numKeys":0}"""
-      FileUtils.write(file2, s"v2\n$json2")
+      FileUtils.write(file2, s"v2\n$json2", Charset.defaultCharset)
       val e = intercept[SparkException] {
        RocksDBCheckpointMetadata.readFromFile(file2)
      }
@@ -1269,7 +1269,7 @@ class RocksDBSuite extends AlsoTestWithChangelogCheckpointingEnabled with Shared
  def generateFiles(dir: String, fileToLengths: Seq[(String, Int)]): Unit = {
    fileToLengths.foreach { case (fileName, length) =>
      val file = new File(dir, fileName)
-      FileUtils.write(file, "a" * length)
+      FileUtils.write(file, "a" * length, Charset.defaultCharset)
    }
  }

---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org
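[Editor's note] For readers outside the Spark tree, the pattern behind this patch — naming the charset explicitly instead of relying on a deprecated overload that uses the platform default implicitly — can be sketched with JDK-only APIs. The file and class names below are hypothetical, and `Files.writeString` stands in for the non-deprecated commons-io `FileUtils.write(File, String, Charset)` overload used in the diff:

```java
import java.io.File;
import java.io.IOException;
import java.nio.charset.Charset;
import java.nio.file.Files;

public class ExplicitCharsetWrite {
    public static void main(String[] args) throws IOException {
        // Hypothetical stand-in for the suite's checkpoint metadata file.
        File file = File.createTempFile("rocksdb-meta", ".json");
        file.deleteOnExit();

        String json = "{\"sstFiles\":[],\"numKeys\":0}";

        // As in the patched calls, the charset is passed explicitly rather
        // than left to an implicit platform default.
        Files.writeString(file.toPath(), "v2\n" + json, Charset.defaultCharset());

        // Read back with the same charset to show the round trip is lossless.
        String roundTrip = Files.readString(file.toPath(), Charset.defaultCharset());
        System.out.println(roundTrip.equals("v2\n" + json));
    }
}
```

Passing `Charset.defaultCharset()` preserves the old behavior byte-for-byte while silencing the deprecation warning; a fixed charset such as `StandardCharsets.UTF_8` would be the choice if platform independence mattered here.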