dongjoon-hyun commented on PR #43018:
URL: https://github.com/apache/spark/pull/43018#issuecomment-1728265731

   There was one Parquet test failure. I verified it manually; it seems to be a flaky test.
   ```
   $ java -version
   openjdk version "21" 2023-09-19 LTS
   ...
   
   $ build/sbt "sql/testOnly *.ParquetV1FilterSuite -- -z StringEndsWith"
   [info] ParquetV1FilterSuite:
   11:49:50.643 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
   [info] - filter pushdown - StringEndsWith (6 seconds, 241 milliseconds)
   11:49:57.656 WARN org.apache.spark.sql.execution.datasources.parquet.ParquetV1FilterSuite:
   
   ===== POSSIBLE THREAD LEAK IN SUITE o.a.s.sql.execution.datasources.parquet.ParquetV1FilterSuite, threads: ForkJoinPool.commonPool-worker-6 (daemon=true), ForkJoinPool.commonPool-worker-4 (daemon=true), ForkJoinPool.commonPool-worker-7 (daemon=true), rpc-boss-3-1 (daemon=true), ForkJoinPool.commonPool-worker-8 (daemon=true), ForkJoinPool.commonPool-worker-5 (daemon=true), shuffle-boss-6-1 (daemon=true), ForkJoinPool.commonPool-worker-1 (daemon=true), ForkJoinPool.commonPool-worker-3 (daemon=true), ForkJoi...
   [info] Run completed in 8 seconds, 36 milliseconds.
   [info] Total number of tests run: 1
   [info] Suites: completed 1, aborted 0
   [info] Tests: succeeded 1, failed 0, canceled 0, ignored 0, pending 0
   [info] All tests passed.
   [success] Total time: 21 s, completed Sep 20, 2023, 11:49:57 AM
   ```


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

