This is an automated email from the ASF dual-hosted git repository.

git-site-role pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/beam.git


The following commit(s) were added to refs/heads/asf-site by this push:
     new 5dce5d9  Publishing website 2021/04/19 18:03:04 at commit e0136ff
5dce5d9 is described below

commit 5dce5d9f5bc45d03339c486fd8090e9fb05d6a8a
Author: jenkins <bui...@apache.org>
AuthorDate: Mon Apr 19 18:03:04 2021 +0000

    Publishing website 2021/04/19 18:03:04 at commit e0136ff
---
 .../extensions/create-external-table/index.html    | 123 ++++++++++++++-------
 website/generated-content/sitemap.xml              |   2 +-
 2 files changed, 82 insertions(+), 43 deletions(-)

diff --git 
a/website/generated-content/documentation/dsls/sql/extensions/create-external-table/index.html
 
b/website/generated-content/documentation/dsls/sql/extensions/create-external-table/index.html
index fe7b590..1356e90 100644
--- 
a/website/generated-content/documentation/dsls/sql/extensions/create-external-table/index.html
+++ 
b/website/generated-content/documentation/dsls/sql/extensions/create-external-table/index.html
@@ -18,7 +18,7 @@
 function addPlaceholder(){$('input:text').attr('placeholder',"What are you 
looking for?");}
 function endSearch(){var 
search=document.querySelector(".searchBar");search.classList.add("disappear");var
 icons=document.querySelector("#iconsBar");icons.classList.remove("disappear");}
 function blockScroll(){$("body").toggleClass("fixedPosition");}
-function openMenu(){addPlaceholder();blockScroll();}</script><div 
class="clearfix container-main-content"><div class="section-nav closed" 
data-offset-top=90 data-offset-bottom=500><span class="section-nav-back 
glyphicon glyphicon-menu-left"></span><nav><ul class=section-nav-list 
data-section-nav><li><span 
class=section-nav-list-main-title>Languages</span></li><li><span 
class=section-nav-list-title>Java</span><ul class=section-nav-list><li><a 
href=/documentation/sdks/java/>Java SDK overvi [...]
+function openMenu(){addPlaceholder();blockScroll();}</script><div 
class="clearfix container-main-content"><div class="section-nav closed" 
data-offset-top=90 data-offset-bottom=500><span class="section-nav-back 
glyphicon glyphicon-menu-left"></span><nav><ul class=section-nav-list 
data-section-nav><li><span 
class=section-nav-list-main-title>Languages</span></li><li><span 
class=section-nav-list-title>Java</span><ul class=section-nav-list><li><a 
href=/documentation/sdks/java/>Java SDK overvi [...]
 <a href=/documentation/io/built-in/>external storage system</a>.
 For some storage systems, <code>CREATE EXTERNAL TABLE</code> does not create a 
physical table until
 a write occurs. After the physical table exists, you can access the table with
@@ -124,88 +124,127 @@ TBLPROPERTIES '{
 }'
 </code></pre><p>Write example:</p><pre><code>INSERT INTO writeTable(key, 
boolColumn, longColumn, stringColumn, doubleColumn)
   VALUES ('key', TRUE, 10, 'stringValue', 5.5)
-</code></pre><h2 id=pubsub>Pub/Sub</h2><h3 
id=syntax-3>Syntax</h3><pre><code>CREATE EXTERNAL TABLE [ IF NOT EXISTS ] 
tableName(
+</code></pre><h2 id=pubsub>Pub/Sub</h2><h3 id=syntax-3>Syntax</h3><h4 
id=nested-mode>Nested mode</h4><pre><code>CREATE EXTERNAL TABLE [ IF NOT EXISTS 
] tableName(
     event_timestamp TIMESTAMP,
-    attributes MAP&lt;VARCHAR, VARCHAR&gt;,
-    payload ROW&lt;tableElement [, tableElement ]*&gt;
+    attributes [MAP&lt;VARCHAR, VARCHAR&gt;, ARRAY&lt;ROW&lt;VARCHAR key, 
VARCHAR value&gt;&gt;],
+    payload [BYTES, ROW&lt;tableElement [, tableElement ]*&gt;]
 )
 TYPE pubsub
 LOCATION 'projects/[PROJECT]/topics/[TOPIC]'
-TBLPROPERTIES '{
-    &quot;timestampAttributeKey&quot;: &quot;key&quot;,
-    &quot;deadLetterQueue&quot;: &quot;projects/[PROJECT]/topics/[TOPIC]&quot;,
-    &quot;format&quot;: &quot;format&quot;
-}'
-</code></pre><ul><li><code>event_timestamp</code>: The event timestamp 
associated with the Pub/Sub message
-by PubsubIO. It can be one of the following:<ul><li>Message publish time, 
which is provided by Pub/Sub. This is the default
+</code></pre><h4 id=flattened-mode>Flattened mode</h4><pre><code>CREATE 
EXTERNAL TABLE [ IF NOT EXISTS ] tableName(tableElement [, tableElement ]*)
+TYPE pubsub
+LOCATION 'projects/[PROJECT]/topics/[TOPIC]'
+</code></pre><p>In nested mode, the following fields hold topic metadata. The 
presence of the
+<code>attributes</code> field triggers nested mode 
usage.</p><ul><li><p><code>event_timestamp</code>: The event timestamp 
associated with the Pub/Sub message
+by PubsubIO. It can be one of the following:</p><ul><li>Message publish time, 
which is provided by Pub/Sub. This is the default
 value if no extra configuration is provided.</li><li>A timestamp specified in 
one of the user-provided message attributes.
 The attribute key is configured by the <code>timestampAttributeKey</code> 
field of
 the <code>tblProperties</code> blob. The value of the attribute should conform 
to
 the <a 
href=https://beam.apache.org/releases/javadoc/2.4.0/org/apache/beam/sdk/io/gcp/pubsub/PubsubIO.Read.html#withTimestampAttribute-java.lang.String->requirements
 of
 PubsubIO</a>,
 which is either millis since Unix epoch or <a 
href=https://www.ietf.org/rfc/rfc3339.txt>RFC 3339
-</a>date string.</li></ul></li><li><code>attributes</code>: The user-provided 
attributes map from the Pub/Sub message;</li><li><code>payload</code>: The 
schema of the payload of the Pub/Sub message. If a record
+</a>date string.</li></ul></li><li><p><code>attributes</code>: The 
user-provided attributes map from the Pub/Sub 
message;</p></li><li><p><code>payload</code>: The schema of the payload of the 
Pub/Sub message. If a record
 can&rsquo;t be unmarshalled, the record is written to the topic specified in 
the
 <code>deadLetterQueue</code> field of the <code>tblProperties</code> blob. If 
no dead-letter queue
 is specified in this case, an exception is thrown and the pipeline will
-crash.</li><li><code>LOCATION</code>:<ul><li><code>PROJECT</code>: ID of the 
Google Cloud Project</li><li><code>TOPIC</code>: The Pub/Sub topic name. A 
subscription will be created
+crash.</p></li><li><p><code>LOCATION</code>:</p><ul><li><code>PROJECT</code>: 
ID of the Google Cloud Project</li><li><code>TOPIC</code>: The Pub/Sub topic 
name. A subscription will be created
 automatically, but the subscription is not cleaned up automatically.
-Specifying an existing subscription is not 
supported.</li></ul></li><li><code>TBLPROPERTIES</code>:<ul><li><code>timestampAttributeKey</code>:
 Optional. The key which contains the event
+Specifying an existing subscription is not 
supported.</li></ul></li><li><p><code>TBLPROPERTIES</code>:</p><ul><li><code>timestampAttributeKey</code>:
 Optional. The key which contains the event
 timestamp associated with the Pub/Sub message. If not specified, the
 message publish timestamp is used as an event timestamp for
 windowing/watermarking.</li><li><code>deadLetterQueue</code>: The topic into 
which messages are written if the
 payload was not parsed. If not specified, an exception is thrown for
-parsing failures.</li><li><code>format</code>: Optional. Allows you to specify 
the Pubsub payload format.
-Possible values are {<code>json</code>, <code>avro</code>}. Defaults to 
<code>json</code>.</li></ul></li></ul><h3 id=read-mode-2>Read 
Mode</h3><p>PubsubIO is currently limited to read access only.</p><h3 
id=write-mode-2>Write Mode</h3><p>Not supported. PubSubIO is currently limited 
to read access only in Beam SQL.</p><h3 id=schema-1>Schema</h3><p>Pub/Sub 
messages have metadata associated with them, and you can reference this
+parsing failures.</li><li><code>format</code>: Optional. Allows you to specify 
the Pub/Sub payload format.</li></ul></li></ul><h3 id=read-mode-2>Read 
Mode</h3><p>PubsubIO supports reading from topics by creating a new 
subscription.</p><h3 id=write-mode-2>Write Mode</h3><p>PubsubIO supports 
writing to topics.</p><h3 id=schema-1>Schema</h3><p>Pub/Sub messages have 
metadata associated with them, and you can reference this
 metadata in your queries. For each message, Pub/Sub exposes its publish time 
and
 a map of user-provided attributes in addition to the payload (unstructured in
 the general case). This information must be preserved and accessible from the
 SQL statements. Currently, this means that PubsubIO tables require you to
-declare a special set of columns, as shown below.</p><h3 
id=supported-payload>Supported Payload</h3><ul><li>JSON Objects 
(Default)<ul><li>Beam only supports querying messages with payload containing 
JSON
-objects. Beam attempts to parse JSON to match the schema of the
-<code>payload</code> field.</li></ul></li><li>Avro<ul><li>An Avro schema is 
automatically generated from the specified schema of
-the <code>payload</code> field. It is used to parse incoming 
messages.</li></ul></li></ul><h3 id=example-2>Example</h3><pre><code>CREATE 
EXTERNAL TABLE locations (event_timestamp TIMESTAMP, attributes MAP&lt;VARCHAR, 
VARCHAR&gt;, payload ROW&lt;id INTEGER, location VARCHAR&gt;)
+declare a special set of columns, as shown below.</p><h3 
id=supported-payload>Supported Payload</h3><ul><li>Pub/Sub supports <a 
href=#generic-payload-handling>Generic Payload Handling</a>.</li></ul><h3 
id=example-2>Example</h3><pre><code>CREATE EXTERNAL TABLE locations 
(event_timestamp TIMESTAMP, attributes MAP&lt;VARCHAR, VARCHAR&gt;, payload 
ROW&lt;id INTEGER, location VARCHAR&gt;)
 TYPE pubsub
 LOCATION 'projects/testing-integration/topics/user-location'
-</code></pre><h2 id=kafka>Kafka</h2><p>KafkaIO is experimental in Beam 
SQL.</p><h3 id=syntax-4>Syntax</h3><pre><code>CREATE EXTERNAL TABLE [ IF NOT 
EXISTS ] tableName (tableElement [, tableElement ]*)
+</code></pre><h2 id=pubsub-lite>Pub/Sub Lite</h2><h3 
id=syntax-4>Syntax</h3><pre><code>CREATE EXTERNAL TABLE [ IF NOT EXISTS ] 
tableName(
+    publish_timestamp DATETIME,
+    event_timestamp DATETIME,
+    message_key BYTES,
+    attributes ARRAY&lt;ROW&lt;key VARCHAR, `values` 
ARRAY&lt;VARBINARY&gt;&gt;&gt;,
+    payload [BYTES, ROW&lt;tableElement [, tableElement ]*&gt;]
+)
+TYPE pubsublite
+// For writing
+LOCATION 'projects/[PROJECT]/locations/[GCP-LOCATION]/topics/[TOPIC]'
+// For reading
+LOCATION 
'projects/[PROJECT]/locations/[GCP-LOCATION]/subscriptions/[SUBSCRIPTION]'
+</code></pre><ul><li><code>LOCATION</code>:<ul><li><code>PROJECT</code>: ID of 
the Google Cloud Project</li><li><code>TOPIC</code>: The Pub/Sub Lite topic 
name.</li><li><code>SUBSCRIPTION</code>: The Pub/Sub Lite subscription 
name.</li><li><code>GCP-LOCATION</code>: The location for this Pub/Sub Lite 
topic or 
subscription.</li></ul></li><li><code>TBLPROPERTIES</code>:<ul><li><code>timestampAttributeKey</code>:
 Optional. The key which contains the event
+timestamp associated with the Pub/Sub message. If not specified, the
+message publish timestamp is used as an event timestamp for
+windowing/watermarking.</li><li><code>deadLetterQueue</code>: Optional, 
supports
+<a href=#generic-dlq-handling>Generic DLQ 
Handling</a></li><li><code>format</code>: Optional. Allows you to specify the 
payload format.</li></ul></li></ul><h3 id=read-mode-3>Read 
Mode</h3><p>PubsubLiteIO supports reading from subscriptions.</p><h3 
id=write-mode-3>Write Mode</h3><p>PubsubLiteIO supports writing to 
topics.</p><h3 id=supported-payload-1>Supported Payload</h3><ul><li>Pub/Sub 
Lite supports <a href=#generic-payload-handling>Generic Payload 
Handling</a>.</li></ul><h3 id=example [...]
+TYPE pubsublite
+LOCATION 
'projects/testing-integration/locations/us-central1-a/topics/user-location'
+</code></pre><h2 id=kafka>Kafka</h2><p>KafkaIO is experimental in Beam 
SQL.</p><h3 id=syntax-5>Syntax</h3><h4 id=flattened-mode-1>Flattened 
mode</h4><pre><code>CREATE EXTERNAL TABLE [ IF NOT EXISTS ] tableName 
(tableElement [, tableElement ]*)
 TYPE kafka
-LOCATION 'kafka://localhost:2181/brokers'
+LOCATION 'my.company.url.com:2181/topic1'
 TBLPROPERTIES '{
-    &quot;bootstrap.servers&quot;:&quot;localhost:9092&quot;,
-    &quot;topics&quot;: [&quot;topic1&quot;, &quot;topic2&quot;],
-    &quot;format&quot;: &quot;avro&quot;
-    [, &quot;protoClass&quot;: &quot;com.example.ExampleMessage&quot; ]
+    &quot;bootstrap_servers&quot;: [&quot;localhost:9092&quot;, 
&quot;PLAINTEXT://192.168.1.200:2181&quot;],
+    &quot;topics&quot;: [&quot;topic2&quot;, &quot;topic3&quot;],
+    &quot;format&quot;: &quot;json&quot;
 }'
-</code></pre><ul><li><code>LOCATION</code>: The Kafka topic 
URL.</li><li><code>TBLPROPERTIES</code>:<ul><li><code>bootstrap.servers</code>: 
Optional. Allows you to specify the bootstrap
-server.</li><li><code>topics</code>: Optional. Allows you to specify specific 
topics.</li><li><code>format</code>: Optional. Allows you to specify the Kafka 
values format. Possible values are
-{<code>csv</code>, <code>avro</code>, <code>json</code>, <code>proto</code>, 
<code>thrift</code>}. Defaults to 
<code>csv</code>.</li><li><code>protoClass</code>: Optional. Use only when 
<code>format</code> is equal to <code>proto</code>. Allows you to
-specify full protocol buffer java class 
name.</li><li><code>thriftClass</code>: Optional. Use only when 
<code>format</code> is equal to <code>thrift</code>. Allows you to
-specify full thrift java class 
name.</li><li><code>thriftProtocolFactoryClass</code>: Optional. Use only when 
<code>format</code> is equal to <code>thrift</code>.
-Allows you to specify full class name of the <code>TProtocolFactory</code> to 
use for thrift
-serialization.</li></ul></li></ul><h3 id=read-mode-3>Read Mode</h3><p>Read 
Mode supports reading from a topic.</p><h3 id=write-mode-3>Write 
Mode</h3><p>Write Mode supports writing to a topic.</p><h3 
id=supported-formats>Supported Formats</h3><ul><li>CSV (default)<ul><li>Beam 
parses the messages, attempting to parse fields according to the
-types specified in the schema.</li></ul></li><li>Avro<ul><li>An Avro schema is 
automatically generated from the specified field
-types. It is used to parse incoming messages and to format outgoing
-messages.</li></ul></li><li>JSON Objects<ul><li>Beam attempts to parse JSON to 
match the schema.</li></ul></li><li>Protocol buffers<ul><li>Fields in the 
schema have to match the fields of the given 
<code>protoClass</code>.</li></ul></li><li>Thrift<ul><li>Fields in the schema 
have to match the fields of the given <code>thriftClass</code>.</li><li>The 
<code>TProtocolFactory</code> used for thrift serialization must match the
-provided <code>thriftProtocolFactoryClass</code>.</li></ul></li></ul><h3 
id=schema-2>Schema</h3><p>For CSV only simple types are supported.</p><h2 
id=mongodb>MongoDB</h2><h3 id=syntax-5>Syntax</h3><pre><code>CREATE EXTERNAL 
TABLE [ IF NOT EXISTS ] tableName (tableElement [, tableElement ]*)
+</code></pre><h4 id=nested-mode-1>Nested mode</h4><pre><code>CREATE EXTERNAL 
TABLE [ IF NOT EXISTS ] tableName (
+  event_timestamp DATETIME,
+  message_key BYTES,
+  headers ARRAY&lt;ROW&lt;key VARCHAR, `values` ARRAY&lt;VARBINARY&gt;&gt;&gt;,
+  payload [BYTES, ROW&lt;tableElement [, tableElement ]*&gt;]
+)
+TYPE kafka
+LOCATION 'my.company.url.com:2181/topic1'
+TBLPROPERTIES '{
+    &quot;bootstrap_servers&quot;: [&quot;localhost:9092&quot;, 
&quot;PLAINTEXT://192.168.1.200:2181&quot;],
+    &quot;topics&quot;: [&quot;topic2&quot;, &quot;topic3&quot;],
+    &quot;format&quot;: &quot;json&quot;
+}'
+</code></pre><p>The presence of the <code>headers</code> field triggers nested 
mode usage.</p><ul><li><code>LOCATION</code>: A URL with the initial bootstrap 
broker to use and the initial
+topic name provided as the 
path.</li><li><code>TBLPROPERTIES</code>:<ul><li><code>bootstrap_servers</code>:
 Optional. Allows you to specify additional
+bootstrap servers, which are used in addition to the one in 
<code>LOCATION</code>.</li><li><code>topics</code>: Optional. Allows you to 
specify additional topics, which are
+used in addition to the one in 
<code>LOCATION</code>.</li><li><code>format</code>: Optional. Allows you to 
specify the Kafka values format. Possible values are
+{<code>csv</code>, <code>avro</code>, <code>json</code>, <code>proto</code>, 
<code>thrift</code>}. Defaults to <code>csv</code> in
+flattened mode or <code>json</code> in nested mode. <code>csv</code> does not 
support nested
+mode.</li></ul></li></ul><h3 id=read-mode-4>Read Mode</h3><p>Read Mode 
supports reading from a topic.</p><h3 id=write-mode-4>Write Mode</h3><p>Write 
Mode supports writing to a topic.</p><h3 id=supported-formats>Supported 
Formats</h3><ul><li>CSV (default)<ul><li>Beam parses the messages, attempting 
to parse fields according to the
+types specified in the schema.</li></ul></li><li>Kafka supports all <a 
href=#generic-payload-handling>Generic Payload Handling</a>
+formats.</li></ul><h3 id=schema-2>Schema</h3><p>For CSV only simple types are 
supported.</p><h2 id=mongodb>MongoDB</h2><h3 
id=syntax-6>Syntax</h3><pre><code>CREATE EXTERNAL TABLE [ IF NOT EXISTS ] 
tableName (tableElement [, tableElement ]*)
 TYPE mongodb
 LOCATION 'mongodb://[HOST]:[PORT]/[DATABASE]/[COLLECTION]'
 </code></pre><ul><li><code>LOCATION</code>: Location of the 
collection.<ul><li><code>HOST</code>: Location of the MongoDB server. Can be 
localhost or an ip address.
 When authentication is required username and password can be specified
-as follows: 
<code>username:password@localhost</code>.</li><li><code>PORT</code>: Port on 
which MongoDB server is listening.</li><li><code>DATABASE</code>: Database to 
connect to.</li><li><code>COLLECTION</code>: Collection within the 
database.</li></ul></li></ul><h3 id=read-mode-4>Read Mode</h3><p>Read Mode 
supports reading from a collection.</p><h3 id=write-mode-4>Write 
Mode</h3><p>Write Mode supports writing to a collection.</p><h3 
id=schema-3>Schema</h3><p>Only simple types are suppor [...]
+as follows: 
<code>username:password@localhost</code>.</li><li><code>PORT</code>: Port on 
which MongoDB server is listening.</li><li><code>DATABASE</code>: Database to 
connect to.</li><li><code>COLLECTION</code>: Collection within the 
database.</li></ul></li></ul><h3 id=read-mode-5>Read Mode</h3><p>Read Mode 
supports reading from a collection.</p><h3 id=write-mode-5>Write 
Mode</h3><p>Write Mode supports writing to a collection.</p><h3 
id=schema-3>Schema</h3><p>Only simple types are suppor [...]
 TYPE mongodb
 LOCATION 'mongodb://localhost:27017/apache/users'
 </code></pre><h2 id=text>Text</h2><p>TextIO is experimental in Beam SQL. Read 
Mode and Write Mode do not currently
-access the same underlying data.</p><h3 
id=syntax-6>Syntax</h3><pre><code>CREATE EXTERNAL TABLE [ IF NOT EXISTS ] 
tableName (tableElement [, tableElement ]*)
+access the same underlying data.</p><h3 
id=syntax-7>Syntax</h3><pre><code>CREATE EXTERNAL TABLE [ IF NOT EXISTS ] 
tableName (tableElement [, tableElement ]*)
 TYPE text
 LOCATION '/home/admin/orders'
 TBLPROPERTIES '{&quot;format&quot;: &quot;Excel&quot;}'
 </code></pre><ul><li><code>LOCATION</code>: The path to the file for Read 
Mode. The prefix for Write 
Mode.</li><li><code>TBLPROPERTIES</code>:<ul><li><code>format</code>: Optional. 
Allows you to specify the CSV Format, which controls
 the field delimiter, quote character, record separator, and other properties.
-See the following table:</li></ul></li></ul><div 
class=table-bordered-wrapper><table><thead><tr><th>Value for 
<code>format</code></th><th>Field delimiter</th><th>Quote</th><th>Record 
separator</th><th>Ignore empty lines?</th><th>Allow missing column 
names?</th></tr></thead><tbody><tr><td><code>default</code></td><td><code>,</code></td><td><code>"</code></td><td><code>\r\n</code></td><td>Yes</td><td>No</td></tr><tr><td><code>rfc4180</code></td><td><code>,</code></td><td><code>"</code></td
 [...]
-types specified in the schema using 
org.apache.commons.csv.</li></ul></li></ul><h3 id=schema-4>Schema</h3><p>Only 
simple types are supported.</p><h3 id=example-4>Example</h3><pre><code>CREATE 
EXTERNAL TABLE orders (id INTEGER, price INTEGER)
+See the following table:</li></ul></li></ul><div 
class=table-bordered-wrapper><table><thead><tr><th>Value for 
<code>format</code></th><th>Field delimiter</th><th>Quote</th><th>Record 
separator</th><th>Ignore empty lines?</th><th>Allow missing column 
names?</th></tr></thead><tbody><tr><td><code>default</code></td><td><code>,</code></td><td><code>"</code></td><td><code>\r\n</code></td><td>Yes</td><td>No</td></tr><tr><td><code>rfc4180</code></td><td><code>,</code></td><td><code>"</code></td
 [...]
+types specified in the schema using 
org.apache.commons.csv.</li></ul></li></ul><h3 id=schema-4>Schema</h3><p>Only 
simple types are supported.</p><h3 id=example-5>Example</h3><pre><code>CREATE 
EXTERNAL TABLE orders (id INTEGER, price INTEGER)
 TYPE text
 LOCATION '/home/admin/orders'
-</code></pre></div></div><footer class=footer><div 
class=footer__contained><div class=footer__cols><div class="footer__cols__col 
footer__cols__col__logos"><div class=footer__cols__col__logo><img 
src=/images/beam_logo_circle.svg class=footer__logo alt="Beam logo"></div><div 
class=footer__cols__col__logo><img src=/images/apache_logo_circle.svg 
class=footer__logo alt="Apache logo"></div></div><div class=footer-wrapper><div 
class=wrapper-grid><div class=footer__cols__col><div class=footer__c [...]
+</code></pre><h2 id=generic-payload-handling>Generic Payload 
Handling</h2><p>Certain data sources and sinks support generic payload 
handling. This handling
+parses a byte array payload field into a table schema. The following schemas 
are
+supported by this handling. All require at least setting <code>"format": 
"&lt;type>"</code>,
+and may require other properties.</p><ul><li><code>avro</code>: Avro<ul><li>An 
Avro schema is automatically generated from the specified field
+types. It is used to parse incoming messages and to format outgoing
+messages.</li></ul></li><li><code>json</code>: JSON Objects<ul><li>Beam 
attempts to parse the byte array as UTF-8 JSON to match the 
schema.</li></ul></li><li><code>proto</code>: Protocol Buffers<ul><li>Beam 
locates the equivalent Protocol Buffer class and uses it to parse
+the payload.</li><li><code>protoClass</code>: Required. The proto class name to 
use. Must be built into
+the deployed JAR.</li><li>Fields in the schema have to match the fields of the 
given <code>protoClass</code>.</li></ul></li><li><code>thrift</code>: 
Thrift<ul><li>Fields in the schema have to match the fields of the given
+<code>thriftClass</code>.</li><li><code>thriftClass</code>: Required. Allows 
you to specify full thrift java class
+name. Must be built into the deployed 
JAR.</li><li><code>thriftProtocolFactoryClass</code>: Required. Allows you to 
specify full class
+name of the <code>TProtocolFactory</code> to use for thrift serialization. 
Must be
+built into the deployed JAR.</li><li>The <code>TProtocolFactory</code> used 
for thrift serialization must match the
+provided <code>thriftProtocolFactoryClass</code>.</li></ul></li></ul><h2 
id=generic-dlq-handling>Generic DLQ Handling</h2><p>Sources and sinks which 
support generic DLQ handling specify a parameter with
+the format <code>"&lt;dlqParamName>": "[DLQ_KIND]:[DLQ_ID]"</code>. The 
following types of
+DLQ handling are supported:</p><ul><li><code>bigquery</code>: 
BigQuery<ul><li>DLQ_ID is the table spec for an output table with an 
&ldquo;error&rdquo; string
+field and &ldquo;payload&rdquo; byte array 
field.</li></ul></li><li><code>pubsub</code>: Pub/Sub Topic<ul><li>DLQ_ID is 
the full path of the Pub/Sub Topic.</li></ul></li><li><code>pubsublite</code>: 
Pub/Sub Lite Topic<ul><li>DLQ_ID is the full path of the Pub/Sub Lite 
Topic.</li></ul></li></ul></div></div><footer class=footer><div 
class=footer__contained><div class=footer__cols><div class="footer__cols__col 
footer__cols__col__logos"><div class=footer__cols__col__logo><img 
src=/images/bea [...]
 <a href=http://www.apache.org>The Apache Software Foundation</a>
 | <a href=/privacy_policy>Privacy Policy</a>
 | <a href=/feed.xml>RSS Feed</a><br><br>Apache Beam, Apache, Beam, the Beam 
logo, and the Apache feather logo are either registered trademarks or 
trademarks of The Apache Software Foundation. All other products or name brands 
are trademarks of their respective holders, including The Apache Software 
Foundation.</div></div></div></div></footer></body></html>
\ No newline at end of file
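The Generic Payload Handling section added in the diff above says that for the `json` format, Beam attempts to parse a UTF-8 byte-array payload to match the declared schema, and that unparseable records go to the dead-letter queue. A minimal illustrative sketch of that idea (not Beam's actual implementation; `parse_json_payload` and the dict-based schema are invented for this example):

```python
import json

def parse_json_payload(payload: bytes, schema: dict) -> dict:
    """Parse a UTF-8 JSON byte payload into a row dict, coercing each
    field to its declared Python type; records missing a declared field
    are rejected, mimicking the dead-letter condition."""
    record = json.loads(payload.decode("utf-8"))
    row = {}
    for field, field_type in schema.items():
        if field not in record:
            raise ValueError(f"payload missing declared field: {field!r}")
        row[field] = field_type(record[field])
    return row

# Mirrors the example table's payload ROW<id INTEGER, location VARCHAR>
print(parse_json_payload(b'{"id": 5, "location": "Seattle"}',
                         {"id": int, "location": str}))
```

In the real tables, the failure branch would write the raw bytes to the configured `deadLetterQueue` rather than raising.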
diff --git a/website/generated-content/sitemap.xml 
b/website/generated-content/sitemap.xml
index 8775792..52a50ac 100644
--- a/website/generated-content/sitemap.xml
+++ b/website/generated-content/sitemap.xml
@@ -1 +1 @@
-<?xml version="1.0" encoding="utf-8" standalone="yes"?><urlset 
xmlns="http://www.sitemaps.org/schemas/sitemap/0.9" 
xmlns:xhtml="http://www.w3.org/1999/xhtml"><url><loc>/blog/beam-2.28.0/</loc><lastmod>2021-02-22T11:40:20-08:00</lastmod></url><url><loc>/categories/blog/</loc><lastmod>2021-03-31T21:00:41-04:00</lastmod></url><url><loc>/blog/</loc><lastmod>2021-03-31T21:00:41-04:00</lastmod></url><url><loc>/categories/</loc><lastmod>2021-03-31T21:00:41-04:00</lastmod></url><url><loc>/blog/k
 [...]
\ No newline at end of file
+<?xml version="1.0" encoding="utf-8" standalone="yes"?><urlset 
xmlns="http://www.sitemaps.org/schemas/sitemap/0.9" 
xmlns:xhtml="http://www.w3.org/1999/xhtml"><url><loc>/blog/beam-2.28.0/</loc><lastmod>2021-02-22T11:40:20-08:00</lastmod></url><url><loc>/categories/blog/</loc><lastmod>2021-03-31T21:00:41-04:00</lastmod></url><url><loc>/blog/</loc><lastmod>2021-03-31T21:00:41-04:00</lastmod></url><url><loc>/categories/</loc><lastmod>2021-03-31T21:00:41-04:00</lastmod></url><url><loc>/blog/k
 [...]
\ No newline at end of file
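The Generic DLQ Handling section added in the diff above describes parameters of the form `"<dlqParamName>": "[DLQ_KIND]:[DLQ_ID]"` with `bigquery`, `pubsub`, and `pubsublite` kinds. A rough sketch of how such a value might be split (illustrative only; `parse_dlq_spec` is a hypothetical helper, not a Beam API):

```python
def parse_dlq_spec(spec: str):
    """Split a generic DLQ parameter of the form '[DLQ_KIND]:[DLQ_ID]'.
    Only the first colon separates kind from id, since a BigQuery table
    spec used as the id can itself contain colons."""
    kind, sep, dlq_id = spec.partition(":")
    if not sep or kind not in {"bigquery", "pubsub", "pubsublite"}:
        raise ValueError(f"unrecognized DLQ spec: {spec!r}")
    return kind, dlq_id

print(parse_dlq_spec("pubsub:projects/my-project/topics/dead-letter"))
```

Splitting on the first colon only is the key design point: it keeps a full Pub/Sub topic path or a colon-containing BigQuery table spec intact as the DLQ_ID.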
