goldmedal commented on issue #15493:
URL: https://github.com/apache/datafusion/issues/15493#issuecomment-2764446051
I saw similar error messages when running the tpch sqllogictests on the latest main branch:
```
INCLUDE_TPCH=true cargo test --test sqllogictests -- tpch
   Compiling datafusion-common v46.0.1 (/Users/jax/git/datafusion/datafusion/common)
   Compiling datafusion-expr-common v46.0.1 (/Users/jax/git/datafusion/datafusion/expr-common)
   Compiling datafusion-physical-expr-common v46.0.1 (/Users/jax/git/datafusion/datafusion/physical-expr-common)
   Compiling datafusion-functions-aggregate-common v46.0.1 (/Users/jax/git/datafusion/datafusion/functions-aggregate-common)
   Compiling datafusion-functions-window-common v46.0.1 (/Users/jax/git/datafusion/datafusion/functions-window-common)
   Compiling datafusion-expr v46.0.1 (/Users/jax/git/datafusion/datafusion/expr)
   Compiling datafusion-macros v46.0.1 (/Users/jax/git/datafusion/datafusion/macros)
   Compiling datafusion-physical-expr v46.0.1 (/Users/jax/git/datafusion/datafusion/physical-expr)
   Compiling datafusion-execution v46.0.1 (/Users/jax/git/datafusion/datafusion/execution)
   Compiling datafusion-sql v46.0.1 (/Users/jax/git/datafusion/datafusion/sql)
   Compiling datafusion-functions v46.0.1 (/Users/jax/git/datafusion/datafusion/functions)
   Compiling datafusion-physical-plan v46.0.1 (/Users/jax/git/datafusion/datafusion/physical-plan)
   Compiling datafusion-functions-aggregate v46.0.1 (/Users/jax/git/datafusion/datafusion/functions-aggregate)
   Compiling datafusion-functions-window v46.0.1 (/Users/jax/git/datafusion/datafusion/functions-window)
   Compiling datafusion-optimizer v46.0.1 (/Users/jax/git/datafusion/datafusion/optimizer)
   Compiling datafusion-functions-nested v46.0.1 (/Users/jax/git/datafusion/datafusion/functions-nested)
   Compiling datafusion-session v46.0.1 (/Users/jax/git/datafusion/datafusion/session)
   Compiling datafusion-physical-optimizer v46.0.1 (/Users/jax/git/datafusion/datafusion/physical-optimizer)
   Compiling datafusion-datasource v46.0.1 (/Users/jax/git/datafusion/datafusion/datasource)
   Compiling datafusion-catalog v46.0.1 (/Users/jax/git/datafusion/datafusion/catalog)
   Compiling datafusion-datasource-csv v46.0.1 (/Users/jax/git/datafusion/datafusion/datasource-csv)
   Compiling datafusion-datasource-json v46.0.1 (/Users/jax/git/datafusion/datafusion/datasource-json)
   Compiling datafusion-datasource-parquet v46.0.1 (/Users/jax/git/datafusion/datafusion/datasource-parquet)
   Compiling datafusion-functions-table v46.0.1 (/Users/jax/git/datafusion/datafusion/functions-table)
   Compiling datafusion-catalog-listing v46.0.1 (/Users/jax/git/datafusion/datafusion/catalog-listing)
   Compiling datafusion-datasource-avro v46.0.1 (/Users/jax/git/datafusion/datafusion/datasource-avro)
   Compiling datafusion v46.0.1 (/Users/jax/git/datafusion/datafusion/core)
   Compiling datafusion-sqllogictest v46.0.1 (/Users/jax/git/datafusion/datafusion/sqllogictest)
    Finished `test` profile [unoptimized + debuginfo] target(s) in 59.21s
     Running bin/sqllogictests.rs (target/debug/deps/sqllogictests-09ecded463d2aafd)
[00:00:02] ################------------------------ 33/83
"tpch/tpch.slt" - 3 took > 500 ms
thread 'tokio-runtime-worker' panicked at
datafusion/physical-plan/src/repartition/mod.rs:618:22:
partition not used yet
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
thread 'tokio-runtime-worker' panicked at
datafusion/physical-plan/src/repartition/mod.rs:618:22:
partition not used yet
thread 'tokio-runtime-worker' panicked at
datafusion/physical-plan/src/repartition/mod.rs:618:22:
partition not used yet
Completed 1 test files in 29 seconds
```
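As the panic note in the output suggests, re-running with `RUST_BACKTRACE=1` should show where the `partition not used yet` panic in `repartition/mod.rs` originates. A possible invocation, assuming the same test filter as above (this is a command fragment that requires a local DataFusion checkout):

```shell
# Re-run the failing TPC-H sqllogictest with backtraces enabled,
# as recommended by the "run with `RUST_BACKTRACE=1`" note in the panic output.
RUST_BACKTRACE=1 INCLUDE_TPCH=true cargo test --test sqllogictests -- tpch
```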