This is an automated email from the ASF dual-hosted git repository.
agrove pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/datafusion-comet.git
The following commit(s) were added to refs/heads/main by this push:
new 8f14c7262 add_intellij_debug_statements (#1760)
8f14c7262 is described below
commit 8f14c72624b678ec9c7843877fe63ccd5ac80183
Author: B Vadlamani <[email protected]>
AuthorDate: Wed May 21 04:23:23 2025 -0700
add_intellij_debug_statements (#1760)
---
docs/source/contributor-guide/img.png | Bin 0 -> 194412 bytes
docs/source/contributor-guide/spark-sql-tests.md | 10 +++++++++-
2 files changed, 9 insertions(+), 1 deletion(-)
diff --git a/docs/source/contributor-guide/img.png b/docs/source/contributor-guide/img.png
new file mode 100644
index 000000000..e9d460eed
Binary files /dev/null and b/docs/source/contributor-guide/img.png differ
diff --git a/docs/source/contributor-guide/spark-sql-tests.md b/docs/source/contributor-guide/spark-sql-tests.md
index 56e1262ad..2c74f36f1 100644
--- a/docs/source/contributor-guide/spark-sql-tests.md
+++ b/docs/source/contributor-guide/spark-sql-tests.md
@@ -65,7 +65,7 @@ ENABLE_COMET=true build/sbt "hive/testOnly * -- -l org.apache.spark.tags.Extende
ENABLE_COMET=true build/sbt "hive/testOnly * -- -n org.apache.spark.tags.ExtendedHiveTest"
ENABLE_COMET=true build/sbt "hive/testOnly * -- -n org.apache.spark.tags.SlowHiveTest"
```
-#### Steps to run individual test suites
+#### Steps to run individual test suites through SBT
1. Open SBT with Comet enabled
```sbt
ENABLE_COMET=true sbt -Dspark.test.includeSlowTests=true
@@ -74,6 +74,14 @@ ENABLE_COMET=true sbt -Dspark.test.includeSlowTests=true
```sbt
sql/testOnly org.apache.spark.sql.DynamicPartitionPruningV1SuiteAEOn -- -z "SPARK-35568"
```
+#### Steps to run individual test suites in IntelliJ IDE
+1. Add the configuration below to the VM Options of your test run configuration (apache-spark repository)
+```sbt
+-Dspark.comet.enabled=true -Dspark.comet.debug.enabled=true -Dspark.plugins=org.apache.spark.CometPlugin -Xmx4096m -Dspark.executor.heartbeatInterval=20000 -Dspark.network.timeout=10000 --add-exports=java.base/sun.nio.ch=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED
+```
+2. Set `ENABLE_COMET=true` in environment variables
+
+3. Once the above is configured, Spark tests can be run with debugging enabled on Spark/Comet code. Note that Comet is added as a dependency, so its classes are read-only while debugging from Spark; any new changes to Comet must be built and deployed locally with `PROFILES="-Pspark-3.4" make release`.
## Creating a diff file for a new Spark version
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]