This is an automated email from the ASF dual-hosted git repository.
github-bot pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/datafusion-comet.git
The following commit(s) were added to refs/heads/asf-site by this push:
new 9a5c199bb Publish built docs triggered by c6c74bb1a3c36b8cae59e1a0589a396b5ac1b2df
9a5c199bb is described below
commit 9a5c199bb2fd045c7478036c1ea30b7f96fb1b3e
Author: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
AuthorDate: Tue May 20 13:32:35 2025 +0000
Publish built docs triggered by c6c74bb1a3c36b8cae59e1a0589a396b5ac1b2df
---
_sources/contributor-guide/spark-sql-tests.md.txt | 11 +++++++-
contributor-guide/spark-sql-tests.html | 31 ++++++++++++++++++++++-
searchindex.js | 2 +-
3 files changed, 41 insertions(+), 3 deletions(-)
diff --git a/_sources/contributor-guide/spark-sql-tests.md.txt
b/_sources/contributor-guide/spark-sql-tests.md.txt
index a53615bab..56e1262ad 100644
--- a/_sources/contributor-guide/spark-sql-tests.md.txt
+++ b/_sources/contributor-guide/spark-sql-tests.md.txt
@@ -54,7 +54,7 @@ git apply ../datafusion-comet/dev/diffs/3.4.3.diff
## 3. Run Spark SQL Tests
-Use the following commands to run the SQL tests locally.
+### Use the following commands to run the Spark SQL test suite locally.
```shell
ENABLE_COMET=true build/sbt catalyst/test
@@ -65,6 +65,15 @@ ENABLE_COMET=true build/sbt "hive/testOnly * -- -l org.apache.spark.tags.Extende
ENABLE_COMET=true build/sbt "hive/testOnly * -- -n org.apache.spark.tags.ExtendedHiveTest"
ENABLE_COMET=true build/sbt "hive/testOnly * -- -n org.apache.spark.tags.SlowHiveTest"
```
+### Steps to run individual test suites
+1. Open SBT with Comet enabled
+```shell
+ENABLE_COMET=true sbt -Dspark.test.includeSlowTests=true
+```
+2. Run an individual test (the command below runs the tests whose names contain `SPARK-35568` in the `spark-sql` module)
+```sbt
+sql/testOnly org.apache.spark.sql.DynamicPartitionPruningV1SuiteAEOn -- -z "SPARK-35568"
+```
## Creating a diff file for a new Spark version
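The two-step sbt session documented above can also be collapsed into a single non-interactive invocation. A minimal sketch follows, assuming a Spark checkout with the Comet diff already applied; it only assembles and prints the command, since actually executing it requires the full Spark build:

```shell
# Build the one-shot sbt invocation for a single test case.
# Suite and test pattern are taken from the example above; ScalaTest's -z
# flag selects tests whose names contain the given substring.
SUITE="org.apache.spark.sql.DynamicPartitionPruningV1SuiteAEOn"
PATTERN="SPARK-35568"
CMD="sql/testOnly ${SUITE} -- -z \"${PATTERN}\""
# In a patched Spark checkout this would then be run as:
#   ENABLE_COMET=true build/sbt -Dspark.test.includeSlowTests=true "${CMD}"
echo "ENABLE_COMET=true build/sbt '${CMD}'"
```

This avoids keeping an interactive sbt session open, at the cost of paying sbt startup time on every run.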
diff --git a/contributor-guide/spark-sql-tests.html
b/contributor-guide/spark-sql-tests.html
index 4f5efb36e..3c9f8bcb5 100644
--- a/contributor-guide/spark-sql-tests.html
+++ b/contributor-guide/spark-sql-tests.html
@@ -304,6 +304,18 @@ under the License.
<a class="reference internal nav-link" href="#run-spark-sql-tests">
3. Run Spark SQL Tests
</a>
+ <ul class="nav section-nav flex-column">
+ <li class="toc-h3 nav-item toc-entry">
+ <a class="reference internal nav-link"
href="#use-the-following-commands-to-run-the-spark-sql-test-suite-locally">
+ Use the following commands to run the Spark SQL test suite locally.
+ </a>
+ </li>
+ <li class="toc-h3 nav-item toc-entry">
+ <a class="reference internal nav-link"
href="#steps-to-run-individual-test-suites">
+ Steps to run individual test suites
+ </a>
+ </li>
+ </ul>
</li>
<li class="toc-h2 nav-item toc-entry">
<a class="reference internal nav-link"
href="#creating-a-diff-file-for-a-new-spark-version">
@@ -398,7 +410,8 @@ git<span class="w"> </span>apply<span class="w">
</span>../datafusion-comet/dev/
</section>
<section id="run-spark-sql-tests">
<h2>3. Run Spark SQL Tests<a class="headerlink" href="#run-spark-sql-tests"
title="Link to this heading">¶</a></h2>
-<p>Use the following commands to run the SQL tests locally.</p>
+<section
id="use-the-following-commands-to-run-the-spark-sql-test-suite-locally">
+<h3>Use the following commands to run the Spark SQL test suite locally.<a
class="headerlink"
href="#use-the-following-commands-to-run-the-spark-sql-test-suite-locally"
title="Link to this heading">¶</a></h3>
<div class="highlight-shell notranslate"><div
class="highlight"><pre><span></span><span class="nv">ENABLE_COMET</span><span
class="o">=</span><span class="nb">true</span><span class="w">
</span>build/sbt<span class="w"> </span>catalyst/test
<span class="nv">ENABLE_COMET</span><span class="o">=</span><span
class="nb">true</span><span class="w"> </span>build/sbt<span class="w">
</span><span class="s2">"sql/testOnly * -- -l
org.apache.spark.tags.ExtendedSQLTest -l
org.apache.spark.tags.SlowSQLTest"</span>
<span class="nv">ENABLE_COMET</span><span class="o">=</span><span
class="nb">true</span><span class="w"> </span>build/sbt<span class="w">
</span><span class="s2">"sql/testOnly * -- -n
org.apache.spark.tags.ExtendedSQLTest"</span>
@@ -409,6 +422,22 @@ git<span class="w"> </span>apply<span class="w">
</span>../datafusion-comet/dev/
</pre></div>
</div>
</section>
+<section id="steps-to-run-individual-test-suites">
+<h3>Steps to run individual test suites<a class="headerlink"
href="#steps-to-run-individual-test-suites" title="Link to this
heading">¶</a></h3>
+<ol class="arabic simple">
+<li><p>Open SBT with Comet enabled</p></li>
+</ol>
+<div class="highlight-shell notranslate"><div
class="highlight"><pre><span></span>ENABLE_COMET=true sbt
-Dspark.test.includeSlowTests=true
+</pre></div>
+</div>
+<ol class="arabic simple" start="2">
+<li><p>Run an individual test (the command below runs the tests whose names contain <code class="docutils literal notranslate"><span class="pre">SPARK-35568</span></code> in the <code class="docutils literal notranslate"><span class="pre">spark-sql</span></code> module)</p></li>
+</ol>
+<div class="highlight-sbt notranslate"><div class="highlight"><pre><span></span>sql/testOnly org.apache.spark.sql.DynamicPartitionPruningV1SuiteAEOn -- -z "SPARK-35568"
+</pre></div>
+</div>
+</section>
+</section>
<section id="creating-a-diff-file-for-a-new-spark-version">
<h2>Creating a diff file for a new Spark version<a class="headerlink"
href="#creating-a-diff-file-for-a-new-spark-version" title="Link to this
heading">¶</a></h2>
<p>Once Comet has support for a new Spark version, we need to create a diff
file that can be applied to that version
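For context on the `-l`/`-n` flags that appear in the suite commands above: these are ScalaTest runner tag filters (`-n` runs only tests carrying the given tag, `-l` excludes tests carrying it). A small sketch assembling the two variants; the tag name is the one used in the docs, and the script only prints the commands rather than running sbt:

```shell
# ScalaTest tag filtering: -n <tag> includes only tagged tests,
# -l <tag> excludes tests carrying that tag.
TAG="org.apache.spark.tags.ExtendedSQLTest"
INCLUDE_ONLY="sql/testOnly * -- -n ${TAG}"
EXCLUDE="sql/testOnly * -- -l ${TAG}"
# In a patched Spark checkout these would be run via build/sbt:
echo "ENABLE_COMET=true build/sbt \"${INCLUDE_ONLY}\""
echo "ENABLE_COMET=true build/sbt \"${EXCLUDE}\""
```

Splitting the suite this way is what lets CI shard the long-running tagged tests away from the fast bulk of the suite.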
diff --git a/searchindex.js b/searchindex.js
index c01ba4bfd..05253655c 100644
--- a/searchindex.js
+++ b/searchindex.js
@@ -1 +1 @@
-Search.setIndex({"alltitles": {"1. Install Comet": [[11, "install-comet"]],
"2. Clone Spark and Apply Diff": [[11, "clone-spark-and-apply-diff"]], "3. Run
Spark SQL Tests": [[11, "run-spark-sql-tests"]], "ANSI mode": [[14,
"ansi-mode"]], "API Differences Between Spark Versions": [[0,
"api-differences-between-spark-versions"]], "ASF Links": [[13, null]],
"Accelerating Apache Iceberg Parquet Scans using Comet (Experimental)": [[19,
null]], "Adding Spark-side Tests for the New Expression": [...]
\ No newline at end of file
+Search.setIndex({"alltitles": {"1. Install Comet": [[11, "install-comet"]],
"2. Clone Spark and Apply Diff": [[11, "clone-spark-and-apply-diff"]], "3. Run
Spark SQL Tests": [[11, "run-spark-sql-tests"]], "ANSI mode": [[14,
"ansi-mode"]], "API Differences Between Spark Versions": [[0,
"api-differences-between-spark-versions"]], "ASF Links": [[13, null]],
"Accelerating Apache Iceberg Parquet Scans using Comet (Experimental)": [[19,
null]], "Adding Spark-side Tests for the New Expression": [...]
\ No newline at end of file
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]