[ https://issues.apache.org/jira/browse/DRILL-4759?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15363620#comment-15363620 ]

ASF GitHub Bot commented on DRILL-4759:
---------------------------------------

Github user parthchandra commented on a diff in the pull request:

    https://github.com/apache/drill/pull/540#discussion_r69666852
  
    --- Diff: exec/java-exec/src/main/java/org/apache/drill/exec/store/parquet/columnreaders/ParquetFixedWidthDictionaryReaders.java ---
    @@ -156,12 +156,18 @@ protected void readField(long recordsToReadInThisPass) {
           recordsReadInThisIteration = Math.min(pageReader.currentPageCount
               - pageReader.valuesRead, recordsToReadInThisPass - valuesReadInCurrentPass);
     
    -      for (int i = 0; i < recordsReadInThisIteration; i++){
    -        try {
    -        valueVec.getMutator().setSafe(valuesReadInCurrentPass + i, pageReader.dictionaryValueReader.readLong());
    -        } catch ( Exception ex) {
    -          throw ex;
    +      if (usingDictionary) {
    --- End diff --
    
    The fix is fine, but I agree that a unit test should be added.
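    For context, a minimal standalone sketch of the pattern the diff introduces (all class and method names below are illustrative, not Drill's actual APIs): when a page is dictionary-encoded, each stored value is an id that must be resolved through the dictionary, whereas a plain-encoded page stores the values directly. Reading one encoding as the other indexes past the page data, which is consistent with the reported ArrayIndexOutOfBoundsException.

    ```java
    // Hypothetical, simplified illustration of the usingDictionary guard.
    // None of these types correspond to Drill's real reader classes.
    import java.util.Arrays;

    public class DictionaryReadSketch {

        // Reads recordsToRead BIGINT values from a simulated page that is either
        // dictionary-encoded (ids + dictionary) or plain-encoded (raw values).
        static long[] readPage(boolean usingDictionary, int[] ids, long[] dictionary,
                               long[] plainValues, int recordsToRead) {
            long[] out = new long[recordsToRead];
            if (usingDictionary) {
                // Dictionary-encoded page: resolve each id through the dictionary.
                for (int i = 0; i < recordsToRead; i++) {
                    out[i] = dictionary[ids[i]];
                }
            } else {
                // Plain-encoded page: copy the values directly.
                for (int i = 0; i < recordsToRead; i++) {
                    out[i] = plainValues[i];
                }
            }
            return out;
        }

        public static void main(String[] args) {
            long[] dict = {100L, 200L, 300L};
            int[] ids = {2, 0, 1};
            long[] plain = {7L, 8L, 9L};
            // Dictionary page: ids resolve to [300, 100, 200].
            System.out.println(Arrays.toString(readPage(true, ids, dict, null, 3)));
            // Plain page: values pass through unchanged.
            System.out.println(Arrays.toString(readPage(false, null, null, plain, 3)));
        }
    }
    ```

    The point of the branch is simply that the decode path is chosen per page from its encoding, rather than assuming one encoding for the whole column; a unit test for the real fix would write a Parquet file containing both dictionary- and plain-encoded BIGINT pages and read it back through Drill.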


> Drill throwing array index out of bound exception when reading a parquet file 
> written by map reduce program.
> ------------------------------------------------------------------------------------------------------------
>
>                 Key: DRILL-4759
>                 URL: https://issues.apache.org/jira/browse/DRILL-4759
>             Project: Apache Drill
>          Issue Type: Bug
>          Components: Storage - Parquet
>    Affects Versions: 1.7.0
>            Reporter: Padma Penumarthy
>            Assignee: Padma Penumarthy
>             Fix For: 1.8.0
>
>
> An ArrayIndexOutOfBoundsException is thrown while reading BIGINT values 
> from dictionary-encoded Parquet data. 



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
