This is an automated email from the ASF dual-hosted git repository.

git-site-role pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/beam.git


The following commit(s) were added to refs/heads/asf-site by this push:
     new e189c67fc0b Publishing website 2022/10/24 10:16:04 at commit 760c83e
e189c67fc0b is described below

commit e189c67fc0ba0fc2f3e2584ce25b6adebe203482
Author: jenkins <bui...@apache.org>
AuthorDate: Mon Oct 24 10:16:05 2022 +0000

    Publishing website 2022/10/24 10:16:04 at commit 760c83e
---
 website/generated-content/documentation/runners/spark/index.html | 4 ++--
 website/generated-content/sitemap.xml                            | 2 +-
 2 files changed, 3 insertions(+), 3 deletions(-)

diff --git a/website/generated-content/documentation/runners/spark/index.html 
b/website/generated-content/documentation/runners/spark/index.html
index 7f797a1b538..315d9519632 100644
--- a/website/generated-content/documentation/runners/spark/index.html
+++ b/website/generated-content/documentation/runners/spark/index.html
@@ -112,7 +112,7 @@ python -m apache_beam.examples.wordcount \
    --output_executable_path=<b><i>OUTPUT_JAR_PATH</i></b> \
     --output=gs://<b><i>BUCKET_NAME</i></b>/python-wordcount-out \
     --spark_version=3
-</pre><ul><li><code>--runner</code>(required): 
<code>SparkRunner</code>.</li><li><code>--output_executable_path</code>(required):
 path for the bundle jar to be 
created.</li><li><code>--output</code>(required): where output shall be 
written.</li><li><code>--spark_version</code>(optional): select spark version 2 
(default) or 3.</li></ul><ol start=5><li>Submit spark job to Dataproc 
cluster&rsquo;s master node.</li></ol><pre>
+</pre><ul><li><code>--runner</code>(required): 
<code>SparkRunner</code>.</li><li><code>--output_executable_path</code>(required):
 path for the bundle jar to be 
created.</li><li><code>--output</code>(required): where output shall be 
written.</li><li><code>--spark_version</code>(optional): select spark version 3 
(default) or 2 (deprecated!).</li></ul><ol start=5><li>Submit spark job to 
Dataproc cluster&rsquo;s master node.</li></ol><pre>
 gcloud dataproc jobs submit spark \
         --cluster=<b><i>CLUSTER_NAME</i></b> \
         --region=<b><i>REGION</i></b> \
@@ -130,7 +130,7 @@ The Spark runner reports user-defined Beam Aggregators 
using this same metrics s
 <a 
href=https://beam.apache.org/releases/javadoc/2.42.0/org/apache/beam/runners/spark/metrics/sink/GraphiteSink.html>GraphiteSink</a>
 and <a 
href=https://beam.apache.org/releases/javadoc/2.42.0/org/apache/beam/runners/spark/metrics/sink/CsvSink.html>CSVSink</a>.
 Providing support for additional Sinks supported by Spark is easy and 
straightforward.</p><p class=language-py>Spark metrics are not yet supported 
on the portable runner.</p><h3 id=streaming-execution>Streaming 
Execution</h3><p class=language-java><br><b>For RDD/DStream based 
runner:</b><br>If your pipeline uses an <code>UnboundedSource</code> the Spark 
Runner will automatically set streaming mode. Forcing streaming mode is mostly 
used for testing and is not recommended.<br><br><b>F [...]
-Instead, you should use <code>SparkContextOptions</code> which can only be 
used programmatically and is not a common <code>PipelineOptions</code> 
implementation.<br><br><b>For Structured Streaming based 
runner:</b><br>Provided SparkSession and StreamingListeners are not supported 
on the Spark Structured Streaming runner.</p><p class=language-py>Provided 
SparkContext and StreamingListeners are not supported on the Spark portable 
runner.</p><h3 id=kubernetes>Kubernetes</h3><p>An <a href=htt [...]
+Instead, you should use <code>SparkContextOptions</code> which can only be 
used programmatically and is not a common <code>PipelineOptions</code> 
implementation.<br><br><b>For Structured Streaming based 
runner:</b><br>Provided SparkSession and StreamingListeners are not supported 
on the Spark Structured Streaming runner.</p><p class=language-py>Provided 
SparkContext and StreamingListeners are not supported on the Spark portable 
runner.</p><h3 id=kubernetes>Kubernetes</h3><p>An <a href=htt [...]
 <a href=https://www.apache.org>The Apache Software Foundation</a>
 | <a href=/privacy_policy>Privacy Policy</a>
 | <a href=/feed.xml>RSS Feed</a><br><br>Apache Beam, Apache, Beam, the Beam 
logo, and the Apache feather logo are either registered trademarks or 
trademarks of The Apache Software Foundation. All other products or name brands 
are trademarks of their respective holders, including The Apache Software 
Foundation.</div></div><div class="footer__cols__col 
footer__cols__col__logos"><div class=footer__cols__col--group><div 
class=footer__cols__col__logo><a href=https://github.com/apache/beam><im [...]
\ No newline at end of file
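The hunk above updates the docs' two-step flow: build a portable bundle jar (with Spark 3 now the default and Spark 2 deprecated), then submit it to a Dataproc cluster's master node. A minimal sketch of those commands follows; `BUCKET_NAME`, `CLUSTER_NAME`, `REGION`, and the jar path are placeholders, and the `--class`/`--jars` flags on the `gcloud` side are an assumption (they are truncated out of this hunk), not taken from this commit:

```shell
# Step 4 (from the docs): create the portable bundle jar.
# Spark 3 is the default per this change; --spark_version=2 is deprecated.
python -m apache_beam.examples.wordcount \
    --runner=SparkRunner \
    --output_executable_path=/tmp/wordcount-bundle.jar \
    --output=gs://BUCKET_NAME/python-wordcount-out \
    --spark_version=3

# Step 5 (from the docs): submit the jar to the Dataproc cluster's master node.
# --class and --jars are assumed from the Beam Spark runner docs, not this diff.
gcloud dataproc jobs submit spark \
    --cluster=CLUSTER_NAME \
    --region=REGION \
    --class=org.apache.beam.runners.spark.SparkPipelineRunner \
    --jars=/tmp/wordcount-bundle.jar
```

This is a sketch only; running it requires an existing Dataproc cluster and a GCS bucket the caller can write to.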
diff --git a/website/generated-content/sitemap.xml 
b/website/generated-content/sitemap.xml
index 5d82be96940..a234265220a 100644
--- a/website/generated-content/sitemap.xml
+++ b/website/generated-content/sitemap.xml
@@ -1 +1 @@
-<?xml version="1.0" encoding="utf-8" standalone="yes"?><urlset 
xmlns="http://www.sitemaps.org/schemas/sitemap/0.9" 
xmlns:xhtml="http://www.w3.org/1999/xhtml"><url><loc>/blog/beam-2.42.0/</loc><lastmod>2022-10-17T09:50:38-07:00</lastmod></url><url><loc>/categories/blog/</loc><lastmod>2022-10-17T09:50:38-07:00</lastmod></url><url><loc>/blog/</loc><lastmod>2022-10-17T09:50:38-07:00</lastmod></url><url><loc>/categories/</loc><lastmod>2022-10-17T09:50:38-07:00</lastmod></url><url><loc>/catego
 [...]
\ No newline at end of file
+<?xml version="1.0" encoding="utf-8" standalone="yes"?><urlset 
xmlns="http://www.sitemaps.org/schemas/sitemap/0.9" 
xmlns:xhtml="http://www.w3.org/1999/xhtml"><url><loc>/blog/beam-2.42.0/</loc><lastmod>2022-10-17T09:50:38-07:00</lastmod></url><url><loc>/categories/blog/</loc><lastmod>2022-10-17T09:50:38-07:00</lastmod></url><url><loc>/blog/</loc><lastmod>2022-10-17T09:50:38-07:00</lastmod></url><url><loc>/categories/</loc><lastmod>2022-10-17T09:50:38-07:00</lastmod></url><url><loc>/catego
 [...]
\ No newline at end of file
