[
https://issues.apache.org/jira/browse/DRILL-1161?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Sudheesh Katkam updated DRILL-1161:
-----------------------------------
Due Date: 15/Aug/14
> Drill Parquet writer fails with an Out of Memory issue when the data is large
> enough
> ------------------------------------------------------------------------------------
>
> Key: DRILL-1161
> URL: https://issues.apache.org/jira/browse/DRILL-1161
> Project: Apache Drill
> Issue Type: Bug
> Components: Storage - Parquet, Storage - Writer
> Reporter: Rahul Challapalli
> Assignee: Parth Chandra
> Fix For: 0.5.0
>
> Attachments: error.log
>
>
> git.commit.id.abbrev=e5c2da0
> The query below fails with an out of memory issue:
> create table `wide-columns-100000` as
>   select columns[0] col0, cast(columns[1] as int) col1
>   from `wide-columns-100000.tbl`;
> The source file contains 100000 records. Each record has 2 columns: the first
> is a string of 100000 characters and the second is an integer.
> Adding a limit to the above query succeeds (a sketch follows below). I have
> attached the error messages from drillbit.log and drillbit.out.
> Let me know if you need anything more.
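> For comparison, here is a minimal sketch of the limited variant that succeeds.
> The target table name and the limit value of 1000 are illustrative, not taken
> from the report:
> -- limit value chosen for illustration; the report does not specify one
> create table `wide-columns-limited` as
>   select columns[0] col0, cast(columns[1] as int) col1
>   from `wide-columns-100000.tbl`
>   limit 1000;
> That the limited version completes is consistent with memory accumulating in
> the writer as rows stream in, rather than any single record being too large.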
--
This message was sent by Atlassian JIRA
(v6.2#6252)