[ https://issues.apache.org/jira/browse/FLINK-39106?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

ASF GitHub Bot updated FLINK-39106:
-----------------------------------
    Labels: pull-request-available  (was: )

> Support ROW and ARRAY<ROW> data types in DynamoDB Table API / SQL sink
> ----------------------------------------------------------------------
>
>                 Key: FLINK-39106
>                 URL: https://issues.apache.org/jira/browse/FLINK-39106
>             Project: Flink
>          Issue Type: Improvement
>          Components: Connectors / DynamoDB
>    Affects Versions: 2.0.0, 1.20.0, 1.20.1, 1.20.2, 2.1.0, 2.0.1, 1.20.3, 
> 2.2.0, 2.1.1
>            Reporter: Dominik Dębowczyk
>            Priority: Major
>              Labels: pull-request-available
>
> h3. Motivation
> The DynamoDB Table API / SQL sink ({{RowDataToAttributeValueConverter}}) 
> does not support the Flink {{ROW}} data type. When a table schema includes a 
> {{ROW}} column, the converter falls through to 
> {{EnhancedType.of(dataType.getConversionClass())}}, which resolves to 
> Flink's internal {{RowData}} class. The DynamoDB Enhanced Client has no 
> built-in {{AttributeConverter}} for {{RowData}}, causing a runtime failure.
> This prevents users from writing nested/structured data to DynamoDB via Flink 
> SQL. DynamoDB's Map (M) / document attribute type is a natural fit for 
> Flink's {{ROW}} type.
> h3. Changes
> ROW columns are now serialized as DynamoDB Map (M) attributes. {{ARRAY<ROW>}} 
> columns are serialized as DynamoDB List (L) of Maps. Recursive nesting 
> ({{ROW}} within {{ROW}}, {{ARRAY<ROW>}} within {{ROW}}) is fully supported.
>  
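
For illustration, a minimal Flink SQL sketch of the use case this enables (the table name, connector options, and values below are hypothetical examples, not taken from the patch):

```sql
-- Hypothetical sink table: 'customer' is a ROW column, 'items' is ARRAY<ROW>.
CREATE TABLE orders_sink (
  order_id STRING,
  customer ROW<name STRING, email STRING>,   -- serialized as a DynamoDB Map (M)
  items ARRAY<ROW<sku STRING, qty INT>>      -- serialized as a DynamoDB List (L) of Maps
) WITH (
  'connector' = 'dynamodb',
  'table-name' = 'orders',
  'aws.region' = 'us-east-1'
);

-- Nested values are built with the ROW(...) and ARRAY[...] constructors:
INSERT INTO orders_sink VALUES
  ('o-1', ROW('Ada', 'ada@example.com'), ARRAY[ROW('A-1', 2), ROW('B-7', 1)]);
```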



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
