[ https://issues.apache.org/jira/browse/HIVE-17984?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16240406#comment-16240406 ]

Syam commented on HIVE-17984:
-----------------------------

The mentioned issue was not seen while reading the ORC file from the local file
system.

It appears that the issue is specific to the WebHDFS file system.

Is this a known issue?
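
If it helps, here is a minimal comparison sketch that reads the same ORC file through both the local and the WebHDFS file system and prints the schema, so the declared char/varchar lengths can be compared side by side. The file locations (file:///tmp/sample.orc and webhdfs://namenode-host:50070/tmp/sample.orc) are placeholders, and the org.apache.orc reader API is assumed; substitute the file and namenode from the actual setup.

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.orc.OrcFile;
import org.apache.orc.Reader;
import org.apache.orc.TypeDescription;

public class CompareOrcSchema {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Hypothetical locations of the same ORC file; replace with real paths.
        printSchema(conf, "file:///tmp/sample.orc");
        printSchema(conf, "webhdfs://namenode-host:50070/tmp/sample.orc");
    }

    private static void printSchema(Configuration conf, String location) throws Exception {
        FileSystem fs = FileSystem.get(URI.create(location), conf);
        Reader reader = OrcFile.createReader(new Path(location),
                OrcFile.readerOptions(conf).filesystem(fs));
        TypeDescription schema = reader.getSchema();
        // toString() renders char(n)/varchar(n) with their declared lengths,
        // so differing output between the two runs would point at the file-system layer.
        System.out.println(location + " -> " + schema);
    }
}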

> getMaxLength is not returning the previously set length in ORC file
> -------------------------------------------------------------------
>
>                 Key: HIVE-17984
>                 URL: https://issues.apache.org/jira/browse/HIVE-17984
>             Project: Hive
>          Issue Type: Bug
>          Components: Hive, ORC
>         Environment: tested it against hive-exec 2.1
>            Reporter: Syam
>   Original Estimate: 24h
>  Remaining Estimate: 24h
>
> getMaxLength is not returning the correct length for char/varchar datatypes.
> I see that getMaxLength is returning 255 for the CHAR type and 65535 for the
> VARCHAR type.
> When I checked the same file using the orcfiledump utility, I could see the
> correct lengths.
> Here is the snippet of the code:
>   Reader _reader = OrcFile.createReader(new Path(_fileName),
>       OrcFile.readerOptions(conf).filesystem(fs));
>   TypeDescription metarec = _reader.getSchema();
>   List<TypeDescription> cols = metarec.getChildren();
>   List<String> colNames = metarec.getFieldNames();
>   for (int i = 0; i < cols.size(); i++)
>   {
>       TypeDescription fieldSchema = cols.get(i);
>       switch (fieldSchema.getCategory())
>       {
>         case CHAR:
>           header += "char(" + fieldSchema.getMaxLength() + ")";
>           break;
>         ----------
>         ----------
>       }
>   }
> Please let me know if you have any pointers.
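
For reproduction purposes, a self-contained version of the quoted loop might look like the sketch below. The file path, the VARCHAR case, and the default branch are filled in as illustrative assumptions, since the original snippet elides them.

import java.util.List;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.orc.OrcFile;
import org.apache.orc.Reader;
import org.apache.orc.TypeDescription;

public class DumpColumnTypes {
    public static void main(String[] args) throws Exception {
        // Hypothetical file name; replace with the file under test.
        String fileName = "/tmp/sample.orc";
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);

        Reader reader = OrcFile.createReader(new Path(fileName),
                OrcFile.readerOptions(conf).filesystem(fs));
        TypeDescription schema = reader.getSchema();
        List<TypeDescription> cols = schema.getChildren();
        List<String> colNames = schema.getFieldNames();

        StringBuilder header = new StringBuilder();
        for (int i = 0; i < cols.size(); i++) {
            TypeDescription fieldSchema = cols.get(i);
            header.append(colNames.get(i)).append(": ");
            switch (fieldSchema.getCategory()) {
                case CHAR:
                    // Expected to return the declared length, e.g. char(10).
                    header.append("char(").append(fieldSchema.getMaxLength()).append(")");
                    break;
                case VARCHAR:
                    header.append("varchar(").append(fieldSchema.getMaxLength()).append(")");
                    break;
                default:
                    header.append(fieldSchema.toString());
                    break;
            }
            header.append("\n");
        }
        System.out.print(header);
    }
}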



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
