This is an automated email from the ASF dual-hosted git repository.

git-site-role pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/beam.git


The following commit(s) were added to refs/heads/asf-site by this push:
     new 0c9a60b  Publishing website 2021/01/28 18:03:18 at commit f24ebd3
0c9a60b is described below

commit 0c9a60bf7f267a2be47f72539392a9ab77a76547
Author: jenkins <bui...@apache.org>
AuthorDate: Thu Jan 28 18:03:18 2021 +0000

    Publishing website 2021/01/28 18:03:18 at commit f24ebd3
---
 website/generated-content/documentation/index.xml  | 49 +++++++++++++++++++-
 .../io/built-in/google-bigquery/index.html         | 53 +++++++++++++++++++++-
 2 files changed, 100 insertions(+), 2 deletions(-)

diff --git a/website/generated-content/documentation/index.xml 
b/website/generated-content/documentation/index.xml
index 4faffd6..f51e8f6 100644
--- a/website/generated-content/documentation/index.xml
+++ b/website/generated-content/documentation/index.xml
@@ -12734,7 +12734,54 @@ GitHub&lt;/a>.&lt;/p>
 &lt;/div>
 &lt;p>The following code snippet reads with a query string.&lt;/p>
 &lt;div class=language-java>
-&lt;div class="highlight">&lt;pre class="chroma">&lt;code 
class="language-java" data-lang="java">&lt;span class="o">//&lt;/span> &lt;span 
class="n">Snippet&lt;/span> &lt;span class="n">not&lt;/span> &lt;span 
class="n">yet&lt;/span> &lt;span class="nf">available&lt;/span> &lt;span 
class="o">(&lt;/span>&lt;span class="n">BEAM&lt;/span>&lt;span 
class="o">-&lt;/span>&lt;span class="n">7034&lt;/span>&lt;span 
class="o">).&lt;/span>&lt;/code>&lt;/pre>&lt;/div>
+&lt;div class="highlight">&lt;pre class="chroma">&lt;code 
class="language-java" data-lang="java">&lt;span class="kn">import&lt;/span> 
&lt;span 
class="nn">org.apache.beam.examples.snippets.transforms.io.gcp.bigquery.BigQueryMyData.MyData&lt;/span>&lt;span
 class="o">;&lt;/span>
+&lt;span class="kn">import&lt;/span> &lt;span 
class="nn">org.apache.beam.sdk.Pipeline&lt;/span>&lt;span class="o">;&lt;/span>
+&lt;span class="kn">import&lt;/span> &lt;span 
class="nn">org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO&lt;/span>&lt;span 
class="o">;&lt;/span>
+&lt;span class="kn">import&lt;/span> &lt;span 
class="nn">org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method&lt;/span>&lt;span
 class="o">;&lt;/span>
+&lt;span class="kn">import&lt;/span> &lt;span 
class="nn">org.apache.beam.sdk.transforms.MapElements&lt;/span>&lt;span 
class="o">;&lt;/span>
+&lt;span class="kn">import&lt;/span> &lt;span 
class="nn">org.apache.beam.sdk.values.PCollection&lt;/span>&lt;span 
class="o">;&lt;/span>
+&lt;span class="kn">import&lt;/span> &lt;span 
class="nn">org.apache.beam.sdk.values.TypeDescriptor&lt;/span>&lt;span 
class="o">;&lt;/span>
+&lt;span class="kd">class&lt;/span> &lt;span 
class="nc">BigQueryReadFromQueryWithBigQueryStorageAPI&lt;/span> &lt;span 
class="o">{&lt;/span>
+&lt;span class="kd">public&lt;/span> &lt;span class="kd">static&lt;/span> 
&lt;span class="n">PCollection&lt;/span>&lt;span 
class="o">&amp;lt;&lt;/span>&lt;span class="n">MyData&lt;/span>&lt;span 
class="o">&amp;gt;&lt;/span> &lt;span 
class="nf">readFromQueryWithBigQueryStorageAPI&lt;/span>&lt;span 
class="o">(&lt;/span>
+&lt;span class="n">String&lt;/span> &lt;span 
class="n">project&lt;/span>&lt;span class="o">,&lt;/span> &lt;span 
class="n">String&lt;/span> &lt;span class="n">dataset&lt;/span>&lt;span 
class="o">,&lt;/span> &lt;span class="n">String&lt;/span> &lt;span 
class="n">table&lt;/span>&lt;span class="o">,&lt;/span> &lt;span 
class="n">String&lt;/span> &lt;span class="n">query&lt;/span>&lt;span 
class="o">,&lt;/span> &lt;span class="n">Pipeline&lt;/span> &lt;span 
class="n">pipeline&lt;/span>&lt;span  [...]
+&lt;span class="c1">// String project = &amp;#34;my-project-id&amp;#34;;
+&lt;/span>&lt;span class="c1">&lt;/span> &lt;span class="c1">// String dataset 
= &amp;#34;my_bigquery_dataset_id&amp;#34;;
+&lt;/span>&lt;span class="c1">&lt;/span> &lt;span class="c1">// String table = 
&amp;#34;my_bigquery_table_id&amp;#34;;
+&lt;/span>&lt;span class="c1">&lt;/span>
+&lt;span class="c1">// Pipeline pipeline = Pipeline.create();
+&lt;/span>&lt;span class="c1">&lt;/span>
+&lt;span class="cm">/*
+&lt;/span>&lt;span class="cm"> String query = 
String.format(&amp;#34;SELECT\n&amp;#34; +
+&lt;/span>&lt;span class="cm"> &amp;#34; string_field,\n&amp;#34; +
+&lt;/span>&lt;span class="cm"> &amp;#34; int64_field,\n&amp;#34; +
+&lt;/span>&lt;span class="cm"> &amp;#34; float64_field,\n&amp;#34; +
+&lt;/span>&lt;span class="cm"> &amp;#34; numeric_field,\n&amp;#34; +
+&lt;/span>&lt;span class="cm"> &amp;#34; bool_field,\n&amp;#34; +
+&lt;/span>&lt;span class="cm"> &amp;#34; bytes_field,\n&amp;#34; +
+&lt;/span>&lt;span class="cm"> &amp;#34; date_field,\n&amp;#34; +
+&lt;/span>&lt;span class="cm"> &amp;#34; datetime_field,\n&amp;#34; +
+&lt;/span>&lt;span class="cm"> &amp;#34; time_field,\n&amp;#34; +
+&lt;/span>&lt;span class="cm"> &amp;#34; timestamp_field,\n&amp;#34; +
+&lt;/span>&lt;span class="cm"> &amp;#34; geography_field,\n&amp;#34; +
+&lt;/span>&lt;span class="cm"> &amp;#34; array_field,\n&amp;#34; +
+&lt;/span>&lt;span class="cm"> &amp;#34; struct_field\n&amp;#34; +
+&lt;/span>&lt;span class="cm"> &amp;#34;FROM\n&amp;#34; +
+&lt;/span>&lt;span class="cm"> &amp;#34; `%s:%s.%s`&amp;#34;, project, 
dataset, table)
+&lt;/span>&lt;span class="cm"> */&lt;/span>
+&lt;span class="n">PCollection&lt;/span>&lt;span 
class="o">&amp;lt;&lt;/span>&lt;span class="n">MyData&lt;/span>&lt;span 
class="o">&amp;gt;&lt;/span> &lt;span class="n">rows&lt;/span> &lt;span 
class="o">=&lt;/span>
+&lt;span class="n">pipeline&lt;/span>
+&lt;span class="o">.&lt;/span>&lt;span class="na">apply&lt;/span>&lt;span 
class="o">(&lt;/span>
+&lt;span class="s">&amp;#34;Read from BigQuery 
table&amp;#34;&lt;/span>&lt;span class="o">,&lt;/span>
+&lt;span class="n">BigQueryIO&lt;/span>&lt;span class="o">.&lt;/span>&lt;span 
class="na">readTableRows&lt;/span>&lt;span class="o">()&lt;/span>
+&lt;span class="o">.&lt;/span>&lt;span class="na">fromQuery&lt;/span>&lt;span 
class="o">(&lt;/span>&lt;span class="n">query&lt;/span>&lt;span 
class="o">)&lt;/span>
+&lt;span class="o">.&lt;/span>&lt;span 
class="na">usingStandardSql&lt;/span>&lt;span class="o">()&lt;/span>
+&lt;span class="o">.&lt;/span>&lt;span class="na">withMethod&lt;/span>&lt;span 
class="o">(&lt;/span>&lt;span class="n">Method&lt;/span>&lt;span 
class="o">.&lt;/span>&lt;span class="na">DIRECT_READ&lt;/span>&lt;span 
class="o">))&lt;/span>
+&lt;span class="o">.&lt;/span>&lt;span class="na">apply&lt;/span>&lt;span 
class="o">(&lt;/span>
+&lt;span class="s">&amp;#34;TableRows to MyData&amp;#34;&lt;/span>&lt;span 
class="o">,&lt;/span>
+&lt;span class="n">MapElements&lt;/span>&lt;span class="o">.&lt;/span>&lt;span 
class="na">into&lt;/span>&lt;span class="o">(&lt;/span>&lt;span 
class="n">TypeDescriptor&lt;/span>&lt;span class="o">.&lt;/span>&lt;span 
class="na">of&lt;/span>&lt;span class="o">(&lt;/span>&lt;span 
class="n">MyData&lt;/span>&lt;span class="o">.&lt;/span>&lt;span 
class="na">class&lt;/span>&lt;span class="o">)).&lt;/span>&lt;span 
class="na">via&lt;/span>&lt;span class="o">(&lt;/span>&lt;span 
class="n">MyData&lt [...]
+&lt;span class="k">return&lt;/span> &lt;span class="n">rows&lt;/span>&lt;span 
class="o">;&lt;/span>
+&lt;span class="o">}&lt;/span>
+&lt;span class="o">}&lt;/span>&lt;/code>&lt;/pre>&lt;/div>
 &lt;/div>
 &lt;div class=language-py>
 &lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-py" 
data-lang="py">&lt;span class="c1"># The SDK for Python does not support the 
BigQuery Storage API.&lt;/span>&lt;/code>&lt;/pre>&lt;/div>
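
(For readability: the Java snippet that this hunk adds, replacing the old "Snippet not yet available (BEAM-7034)" placeholder, renders as the plain code below once the Chroma span markup and HTML escaping are stripped. It assumes the MyData helper class from the Beam examples module named in the first import; the commented-out query and variable assignments are part of the snippet itself.)

import org.apache.beam.examples.snippets.transforms.io.gcp.bigquery.BigQueryMyData.MyData;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.TypeDescriptor;

class BigQueryReadFromQueryWithBigQueryStorageAPI {
  public static PCollection<MyData> readFromQueryWithBigQueryStorageAPI(
      String project, String dataset, String table, String query, Pipeline pipeline) {

    // String project = "my-project-id";
    // String dataset = "my_bigquery_dataset_id";
    // String table = "my_bigquery_table_id";

    // Pipeline pipeline = Pipeline.create();

    /*
    String query = String.format("SELECT\n" +
        "  string_field,\n" +
        "  int64_field,\n" +
        "  float64_field,\n" +
        "  numeric_field,\n" +
        "  bool_field,\n" +
        "  bytes_field,\n" +
        "  date_field,\n" +
        "  datetime_field,\n" +
        "  time_field,\n" +
        "  timestamp_field,\n" +
        "  geography_field,\n" +
        "  array_field,\n" +
        "  struct_field\n" +
        "FROM\n" +
        "  `%s:%s.%s`", project, dataset, table)
    */

    // Read the query results through the BigQuery Storage Read API (DIRECT_READ)
    // instead of the default file-export path, then map TableRows to MyData.
    PCollection<MyData> rows =
        pipeline
            .apply(
                "Read from BigQuery table",
                BigQueryIO.readTableRows()
                    .fromQuery(query)
                    .usingStandardSql()
                    .withMethod(Method.DIRECT_READ))
            .apply(
                "TableRows to MyData",
                MapElements.into(TypeDescriptor.of(MyData.class)).via(MyData::fromTableRow));

    return rows;
  }
}

(Calling the helper returns the mapped PCollection; the pipeline is then run as usual with pipeline.run().)
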
diff --git 
a/website/generated-content/documentation/io/built-in/google-bigquery/index.html
 
b/website/generated-content/documentation/io/built-in/google-bigquery/index.html
index 36e329f..2bf87cd 100644
--- 
a/website/generated-content/documentation/io/built-in/google-bigquery/index.html
+++ 
b/website/generated-content/documentation/io/built-in/google-bigquery/index.html
@@ -305,7 +305,58 @@ GitHub</a>.</p><div class=language-java><div 
class=highlight><pre class=chroma><
 
     <span class=k>return</span> <span class=n>rows</span><span class=o>;</span>
   <span class=o>}</span>
-<span class=o>}</span></code></pre></div></div><div class=language-py><div 
class=highlight><pre class=chroma><code class=language-py data-lang=py><span 
class=c1># The SDK for Python does not support the BigQuery Storage 
API.</span></code></pre></div></div><p>The following code snippet reads with a 
query string.</p><div class=language-java><div class=highlight><pre 
class=chroma><code class=language-java data-lang=java><span class=o>//</span> 
<span class=n>Snippet</span> <span class=n>not< [...]
+<span class=o>}</span></code></pre></div></div><div class=language-py><div 
class=highlight><pre class=chroma><code class=language-py data-lang=py><span 
class=c1># The SDK for Python does not support the BigQuery Storage 
API.</span></code></pre></div></div><p>The following code snippet reads with a 
query string.</p><div class=language-java><div class=highlight><pre 
class=chroma><code class=language-java data-lang=java><span 
class=kn>import</span> <span class=nn>org.apache.beam.examples.sn [...]
+<span class=kn>import</span> <span 
class=nn>org.apache.beam.sdk.Pipeline</span><span class=o>;</span>
+<span class=kn>import</span> <span 
class=nn>org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO</span><span 
class=o>;</span>
+<span class=kn>import</span> <span 
class=nn>org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method</span><span
 class=o>;</span>
+<span class=kn>import</span> <span 
class=nn>org.apache.beam.sdk.transforms.MapElements</span><span class=o>;</span>
+<span class=kn>import</span> <span 
class=nn>org.apache.beam.sdk.values.PCollection</span><span class=o>;</span>
+<span class=kn>import</span> <span 
class=nn>org.apache.beam.sdk.values.TypeDescriptor</span><span class=o>;</span>
+
+<span class=kd>class</span> <span 
class=nc>BigQueryReadFromQueryWithBigQueryStorageAPI</span> <span 
class=o>{</span>
+  <span class=kd>public</span> <span class=kd>static</span> <span 
class=n>PCollection</span><span class=o>&lt;</span><span 
class=n>MyData</span><span class=o>&gt;</span> <span 
class=nf>readFromQueryWithBigQueryStorageAPI</span><span class=o>(</span>
+      <span class=n>String</span> <span class=n>project</span><span 
class=o>,</span> <span class=n>String</span> <span class=n>dataset</span><span 
class=o>,</span> <span class=n>String</span> <span class=n>table</span><span 
class=o>,</span> <span class=n>String</span> <span class=n>query</span><span 
class=o>,</span> <span class=n>Pipeline</span> <span 
class=n>pipeline</span><span class=o>)</span> <span class=o>{</span>
+
+    <span class=c1>// String project = &#34;my-project-id&#34;;
+</span><span class=c1></span>    <span class=c1>// String dataset = 
&#34;my_bigquery_dataset_id&#34;;
+</span><span class=c1></span>    <span class=c1>// String table = 
&#34;my_bigquery_table_id&#34;;
+</span><span class=c1></span>
+    <span class=c1>// Pipeline pipeline = Pipeline.create();
+</span><span class=c1></span>
+    <span class=cm>/*
+</span><span class=cm>    String query = String.format(&#34;SELECT\n&#34; +
+</span><span class=cm>        &#34;  string_field,\n&#34; +
+</span><span class=cm>        &#34;  int64_field,\n&#34; +
+</span><span class=cm>        &#34;  float64_field,\n&#34; +
+</span><span class=cm>        &#34;  numeric_field,\n&#34; +
+</span><span class=cm>        &#34;  bool_field,\n&#34; +
+</span><span class=cm>        &#34;  bytes_field,\n&#34; +
+</span><span class=cm>        &#34;  date_field,\n&#34; +
+</span><span class=cm>        &#34;  datetime_field,\n&#34; +
+</span><span class=cm>        &#34;  time_field,\n&#34; +
+</span><span class=cm>        &#34;  timestamp_field,\n&#34; +
+</span><span class=cm>        &#34;  geography_field,\n&#34; +
+</span><span class=cm>        &#34;  array_field,\n&#34; +
+</span><span class=cm>        &#34;  struct_field\n&#34; +
+</span><span class=cm>        &#34;FROM\n&#34; +
+</span><span class=cm>        &#34;  `%s:%s.%s`&#34;, project, dataset, table)
+</span><span class=cm>    */</span>
+
+    <span class=n>PCollection</span><span class=o>&lt;</span><span 
class=n>MyData</span><span class=o>&gt;</span> <span class=n>rows</span> <span 
class=o>=</span>
+        <span class=n>pipeline</span>
+            <span class=o>.</span><span class=na>apply</span><span 
class=o>(</span>
+                <span class=s>&#34;Read from BigQuery table&#34;</span><span 
class=o>,</span>
+                <span class=n>BigQueryIO</span><span class=o>.</span><span 
class=na>readTableRows</span><span class=o>()</span>
+                    <span class=o>.</span><span class=na>fromQuery</span><span 
class=o>(</span><span class=n>query</span><span class=o>)</span>
+                    <span class=o>.</span><span 
class=na>usingStandardSql</span><span class=o>()</span>
+                    <span class=o>.</span><span 
class=na>withMethod</span><span class=o>(</span><span 
class=n>Method</span><span class=o>.</span><span 
class=na>DIRECT_READ</span><span class=o>))</span>
+            <span class=o>.</span><span class=na>apply</span><span 
class=o>(</span>
+                <span class=s>&#34;TableRows to MyData&#34;</span><span 
class=o>,</span>
+                <span class=n>MapElements</span><span class=o>.</span><span 
class=na>into</span><span class=o>(</span><span 
class=n>TypeDescriptor</span><span class=o>.</span><span 
class=na>of</span><span class=o>(</span><span class=n>MyData</span><span 
class=o>.</span><span class=na>class</span><span class=o>)).</span><span 
class=na>via</span><span class=o>(</span><span class=n>MyData</span><span 
class=o>::</span><span class=n>fromTableRow</span><span class=o>));</span>
+
+    <span class=k>return</span> <span class=n>rows</span><span class=o>;</span>
+  <span class=o>}</span>
+<span class=o>}</span></code></pre></div></div><div class=language-py><div 
class=highlight><pre class=chroma><code class=language-py data-lang=py><span 
class=c1># The SDK for Python does not support the BigQuery Storage 
API.</span></code></pre></div></div><h2 id=writing-to-bigquery>Writing to 
BigQuery</h2><p>BigQueryIO allows you to write to BigQuery tables. If you are 
using the Beam SDK
 for Java, you can also write different rows to different 
tables.</p><blockquote><p>BigQueryIO write transforms use APIs that are subject 
to BigQuery&rsquo;s
 <a href=https://cloud.google.com/bigquery/quota-policy>Quota</a> and
 <a href=https://cloud.google.com/bigquery/pricing>Pricing</a> 
policies.</p></blockquote><p>When you apply a write transform, you must provide 
the following information

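
(The quoted diff is truncated by the mail archive just as the rendered page moves into its "Writing to BigQuery" section. For orientation only, and not part of this commit, a minimal hypothetical sketch of the write path that section documents might look like the following; the table spec, schema, and sample row are made-up placeholders.)

import com.google.api.services.bigquery.model.TableFieldSchema;
import com.google.api.services.bigquery.model.TableRow;
import com.google.api.services.bigquery.model.TableSchema;
import java.util.Arrays;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.CreateDisposition;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.WriteDisposition;
import org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder;
import org.apache.beam.sdk.transforms.Create;

class BigQueryWriteSketch {
  public static void main(String[] args) {
    Pipeline pipeline = Pipeline.create();

    // Hypothetical destination table and schema; replace with real values.
    String tableSpec = "my-project-id:my_bigquery_dataset_id.my_bigquery_table_id";
    TableSchema schema =
        new TableSchema()
            .setFields(
                Arrays.asList(
                    new TableFieldSchema().setName("string_field").setType("STRING"),
                    new TableFieldSchema().setName("int64_field").setType("INTEGER")));

    pipeline
        // A single in-memory TableRow stands in for a real source.
        .apply(
            "Create rows",
            Create.of(new TableRow().set("string_field", "hello").set("int64_field", 42L))
                .withCoder(TableRowJsonCoder.of()))
        .apply(
            "Write to BigQuery",
            BigQueryIO.writeTableRows()
                .to(tableSpec)
                .withSchema(schema)
                .withCreateDisposition(CreateDisposition.CREATE_IF_NEEDED)
                .withWriteDisposition(WriteDisposition.WRITE_APPEND));

    pipeline.run().waitUntilFinish();
  }
}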