This is an automated email from the ASF dual-hosted git repository.

github-bot pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/datafusion-comet.git


The following commit(s) were added to refs/heads/asf-site by this push:
     new 277268dbf Publish built docs triggered by 2bf2835812e818da976d192f5bb24e8947828dc4
277268dbf is described below

commit 277268dbf45ec6df84058ba8e8c647c0a28367c2
Author: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
AuthorDate: Mon Dec 29 15:30:01 2025 +0000

    Publish built docs triggered by 2bf2835812e818da976d192f5bb24e8947828dc4
---
 _sources/user-guide/latest/compatibility.md.txt | 2 +-
 searchindex.js                                  | 2 +-
 user-guide/latest/compatibility.html            | 2 +-
 3 files changed, 3 insertions(+), 3 deletions(-)

diff --git a/_sources/user-guide/latest/compatibility.md.txt b/_sources/user-guide/latest/compatibility.md.txt
index 35bf09724..31270404c 100644
--- a/_sources/user-guide/latest/compatibility.md.txt
+++ b/_sources/user-guide/latest/compatibility.md.txt
@@ -36,7 +36,7 @@ Comet will fall back to Spark for the following expressions when ANSI mode is en
 `spark.comet.expression.EXPRNAME.allowIncompatible=true`, where `EXPRNAME` is the Spark expression class name. See
 the [Comet Supported Expressions Guide](expressions.md) for more information on this configuration setting.
 
-- Average
+- Average (supports all numeric inputs except decimal types)
 - Cast (in some cases)
 
 There is an [epic](https://github.com/apache/datafusion-comet/issues/313) where we are tracking the work to fully implement ANSI support.
diff --git a/searchindex.js b/searchindex.js
index 14b760533..cd3cf6df9 100644
--- a/searchindex.js
+++ b/searchindex.js
@@ -1 +1 @@
-Search.setIndex({"alltitles": {"1. Install Comet": [[19, "install-comet"]], "1. Native Operators (nativeExecs map)": [[4, "native-operators-nativeexecs-map"]], "2. Clone Spark and Apply Diff": [[19, "clone-spark-and-apply-diff"]], "2. Sink Operators (sinks map)": [[4, "sink-operators-sinks-map"]], "3. Comet JVM Operators": [[4, "comet-jvm-operators"]], "3. Run Spark SQL Tests": [[19, "run-spark-sql-tests"]], "ANSI Mode": [[22, "ansi-mode"], [35, "ansi-mode"], [48, "ansi-mode"], [88, "ans [...]
\ No newline at end of file
+Search.setIndex({"alltitles": {"1. Install Comet": [[19, "install-comet"]], "1. Native Operators (nativeExecs map)": [[4, "native-operators-nativeexecs-map"]], "2. Clone Spark and Apply Diff": [[19, "clone-spark-and-apply-diff"]], "2. Sink Operators (sinks map)": [[4, "sink-operators-sinks-map"]], "3. Comet JVM Operators": [[4, "comet-jvm-operators"]], "3. Run Spark SQL Tests": [[19, "run-spark-sql-tests"]], "ANSI Mode": [[22, "ansi-mode"], [35, "ansi-mode"], [48, "ansi-mode"], [88, "ans [...]
\ No newline at end of file
diff --git a/user-guide/latest/compatibility.html b/user-guide/latest/compatibility.html
index 961a2f5a5..fcc95b07f 100644
--- a/user-guide/latest/compatibility.html
+++ b/user-guide/latest/compatibility.html
@@ -477,7 +477,7 @@ under the License.
 <code class="docutils literal notranslate"><span class="pre">spark.comet.expression.EXPRNAME.allowIncompatible=true</span></code>,
 where <code class="docutils literal notranslate"><span class="pre">EXPRNAME</span></code> is the Spark expression class name. See
 the <a class="reference internal" href="expressions.html"><span class="std std-doc">Comet Supported Expressions Guide</span></a> for more information on this configuration setting.</p>
 <ul class="simple">
-<li><p>Average</p></li>
+<li><p>Average (supports all numeric inputs except decimal types)</p></li>
 <li><p>Cast (in some cases)</p></li>
 </ul>
 <p>There is an <a class="reference external" href="https://github.com/apache/datafusion-comet/issues/313">epic</a> where we are tracking the work to fully implement ANSI support.</p>
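
Note for readers of this change: the documentation above describes a per-expression opt-in, spark.comet.expression.EXPRNAME.allowIncompatible=true, for expressions that otherwise fall back to Spark under ANSI mode. As a purely illustrative sketch (not part of this commit), here is roughly how that opt-in could be applied for the Average expression from a Spark session; the class name "Average" is taken from the list in the diff, and the query is a made-up example.

    // Sketch only: opting in to Comet's Average aggregate under ANSI mode.
    // The config key follows the documented pattern
    // spark.comet.expression.EXPRNAME.allowIncompatible=true with EXPRNAME = Average.
    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder()
      .appName("comet-ansi-average-example")
      .config("spark.sql.ansi.enabled", "true")                            // ANSI mode on
      .config("spark.comet.expression.Average.allowIncompatible", "true")  // allow Comet's Average
      .getOrCreate()

    // With the flag set, AVG over non-decimal numeric columns can stay in
    // Comet's native engine instead of falling back to Spark.
    spark.sql("SELECT AVG(value) FROM VALUES (1L), (2L), (3L) AS t(value)").show()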


---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
