This is an automated email from the ASF dual-hosted git repository.

agrove pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/datafusion-comet.git


The following commit(s) were added to refs/heads/main by this push:
     new c6c74bb1a docs: Add instructions for running individual Spark SQL tests from sbt (#1752)
c6c74bb1a is described below

commit c6c74bb1a3c36b8cae59e1a0589a396b5ac1b2df
Author: B Vadlamani <[email protected]>
AuthorDate: Tue May 20 06:32:04 2025 -0700

    docs: Add instructions for running individual Spark SQL tests from sbt (#1752)
---
 docs/source/contributor-guide/spark-sql-tests.md | 11 ++++++++++-
 1 file changed, 10 insertions(+), 1 deletion(-)

diff --git a/docs/source/contributor-guide/spark-sql-tests.md b/docs/source/contributor-guide/spark-sql-tests.md
index a53615bab..56e1262ad 100644
--- a/docs/source/contributor-guide/spark-sql-tests.md
+++ b/docs/source/contributor-guide/spark-sql-tests.md
@@ -54,7 +54,7 @@ git apply ../datafusion-comet/dev/diffs/3.4.3.diff
 
 ## 3. Run Spark SQL Tests
 
-Use the following commands to run the SQL tests locally.
+#### Run the full Spark SQL test suite locally
 
 ```shell
 ENABLE_COMET=true build/sbt catalyst/test
@@ -65,6 +65,15 @@ ENABLE_COMET=true build/sbt "hive/testOnly * -- -l org.apache.spark.tags.Extende
 ENABLE_COMET=true build/sbt "hive/testOnly * -- -n org.apache.spark.tags.ExtendedHiveTest"
 ENABLE_COMET=true build/sbt "hive/testOnly * -- -n org.apache.spark.tags.SlowHiveTest"
 ```
+#### Steps to run individual test suites
+1. Start an interactive sbt session with Comet enabled:
+```shell
+ENABLE_COMET=true sbt -Dspark.test.includeSlowTests=true
+```
+2. Run an individual test. The command below runs the test named `SPARK-35568` in the `sql` module:
+```sbt
+sql/testOnly org.apache.spark.sql.DynamicPartitionPruningV1SuiteAEOn -- -z "SPARK-35568"
+```
 
 ## Creating a diff file for a new Spark version
 

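The two interactive steps the patch adds can also be collapsed into a single non-interactive command. This is only a sketch, assuming the working directory is the patched Spark checkout and using Spark's bundled `build/sbt` launcher (the patch itself shows plain `sbt` for the interactive session):

```shell
# Run a single named test from a single suite with Comet enabled,
# without opening an interactive sbt session.
# Assumption: run from a Spark source tree with the Comet diff applied.
ENABLE_COMET=true build/sbt -Dspark.test.includeSlowTests=true \
  'sql/testOnly org.apache.spark.sql.DynamicPartitionPruningV1SuiteAEOn -- -z "SPARK-35568"'
```

The quotes around the `testOnly` argument keep sbt from splitting the suite name, the `--` separator, and the ScalaTest `-z` substring filter into separate sbt commands.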

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
