[ 
https://issues.apache.org/jira/browse/BEAM-9840?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Rahul Patwari updated BEAM-9840:
--------------------------------
    Summary: Support for Parameterized Types when converting from HCatRecords 
to Rows in HCatalogIO  (was: Support for Parameterized Types in HCatalogIO)

> Support for Parameterized Types when converting from HCatRecords to Rows in 
> HCatalogIO
> --------------------------------------------------------------------------------------
>
>                 Key: BEAM-9840
>                 URL: https://issues.apache.org/jira/browse/BEAM-9840
>             Project: Beam
>          Issue Type: Improvement
>          Components: io-java-hcatalog
>            Reporter: Rahul Patwari
>            Assignee: Rahul Patwari
>            Priority: Major
>
> In Hive, to use CHAR or VARCHAR as the data type of a column, the type has 
> to be parameterized with the length of the character sequence. See 
> https://github.com/apache/hive/blob/f37c5de6c32b9395d1b34fa3c02ed06d1bfbf6eb/serde/src/test/org/apache/hadoop/hive/serde2/typeinfo/TestTypeInfoUtils.java#L68
> In addition, for the DECIMAL data type, custom precision and scale can be 
> provided as parameters.
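>
> As an illustration (the type strings below are examples only), Hive's 
> TypeInfoUtils, exercised by the test linked above, parses such parameterized 
> type strings into TypeInfo objects that carry the length, precision, and 
> scale:
> {code:java}
> import org.apache.hadoop.hive.serde2.typeinfo.CharTypeInfo;
> import org.apache.hadoop.hive.serde2.typeinfo.DecimalTypeInfo;
> import org.apache.hadoop.hive.serde2.typeinfo.TypeInfoUtils;
>
> public class ParameterizedTypeStrings {
>   public static void main(String[] args) {
>     // "char(10)" parses into a CharTypeInfo that records the length parameter.
>     CharTypeInfo charType =
>         (CharTypeInfo) TypeInfoUtils.getTypeInfoFromTypeString("char(10)");
>     System.out.println(charType.getLength()); // 10
>
>     // "decimal(38,9)" parses into a DecimalTypeInfo with precision and scale.
>     DecimalTypeInfo decimalType =
>         (DecimalTypeInfo) TypeInfoUtils.getTypeInfoFromTypeString("decimal(38,9)");
>     System.out.println(decimalType.getPrecision() + "," + decimalType.getScale()); // 38,9
>   }
> }
> {code}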
> A user faced an exception while reading data from a table created in Hive 
> with a parameterized DECIMAL data type. See 
> [https://lists.apache.org/thread.html/r159012fbefce24d734096e3ec24ecd112de5f89b8029e57147d233b0%40%3Cuser.beam.apache.org%3E].
> This ticket tracks support for reading data from Hive tables whose column 
> types can be parameterized.
> To support parameterized data types, we can make use of HCatFieldSchema, as 
> sketched below. See 
> https://github.com/apache/hive/blob/f37c5de6c32b9395d1b34fa3c02ed06d1bfbf6eb/hcatalog/core/src/main/java/org/apache/hive/hcatalog/data/schema/HCatFieldSchema.java#L34
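>
> A minimal sketch of that idea, assuming HCatFieldSchema#getTypeInfo() is 
> available for primitive fields; toBeamField is a hypothetical helper name, 
> mapping CHAR/VARCHAR to STRING and DECIMAL to Beam's DECIMAL is only one 
> possible choice, and recording the parameters in the field description is 
> just for illustration:
> {code:java}
> import org.apache.beam.sdk.schemas.Schema;
> import org.apache.hadoop.hive.serde2.typeinfo.BaseCharTypeInfo;
> import org.apache.hadoop.hive.serde2.typeinfo.DecimalTypeInfo;
> import org.apache.hadoop.hive.serde2.typeinfo.PrimitiveTypeInfo;
> import org.apache.hive.hcatalog.data.schema.HCatFieldSchema;
>
> public class ParameterizedTypeMapping {
>
>   /** Hypothetical helper: derives a Beam field from a primitive HCatFieldSchema. */
>   static Schema.Field toBeamField(HCatFieldSchema field) {
>     PrimitiveTypeInfo typeInfo = field.getTypeInfo();
>     if (typeInfo == null) {
>       // Schemas built without a TypeInfo fall back to the existing mapping (elided).
>       return Schema.Field.of(field.getName(), Schema.FieldType.STRING);
>     }
>     switch (typeInfo.getPrimitiveCategory()) {
>       case DECIMAL:
>         DecimalTypeInfo decimalType = (DecimalTypeInfo) typeInfo;
>         // Precision and scale are read from the parameterized type.
>         return Schema.Field.of(field.getName(), Schema.FieldType.DECIMAL)
>             .withDescription(
>                 "decimal(" + decimalType.getPrecision() + ","
>                     + decimalType.getScale() + ")");
>       case CHAR:
>       case VARCHAR:
>         // Both carry a maximum length via their common BaseCharTypeInfo parent.
>         int maxLength = ((BaseCharTypeInfo) typeInfo).getLength();
>         return Schema.Field.of(field.getName(), Schema.FieldType.STRING)
>             .withDescription(typeInfo.getTypeName() + " (max length " + maxLength + ")");
>       default:
>         // Non-parameterized primitives keep the existing mapping (elided here).
>         return Schema.Field.of(field.getName(), Schema.FieldType.STRING);
>     }
>   }
> }
> {code}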



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
