[ 
https://issues.apache.org/jira/browse/PARQUET-2059?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gabor Szadovszky resolved PARQUET-2059.
---------------------------------------
    Resolution: Fixed

> Tests require too much memory
> -----------------------------
>
>                 Key: PARQUET-2059
>                 URL: https://issues.apache.org/jira/browse/PARQUET-2059
>             Project: Parquet
>          Issue Type: Test
>            Reporter: Gabor Szadovszky
>            Assignee: Gabor Szadovszky
>            Priority: Major
>
> Testing the fix for PARQUET-1633 requires ~3GB of memory, which is not 
> always available. As a workaround, we temporarily disabled the unit test 
> implemented for it.
> We need to ensure somehow that [this 
> test|https://github.com/apache/parquet-mr/blob/master/parquet-hadoop/src/test/java/org/apache/parquet/hadoop/TestLargeColumnChunk.java]
>  (and maybe some similar ones) is executed regularly. Some options we 
> might have:
> * Execute this test separately with a maven profile. I am not sure whether 
> the CI allows allocating that much memory, but with -Xmx options it might be 
> worth a try, creating a separate check for this test only.
> * Similar to the previous option with the profile, but never executing it in 
> the CI. Instead, we add a note to the release doc so this test is executed 
> at least once per release.
> * Configure the CI profile to skip this test but keep it in the default 
> scenario, meaning devs will execute it locally. There are a couple of cons, 
> though: there is no guarantee that devs execute all the tests, including this 
> one, and it can cause confusion if a dev doesn't have enough memory and 
> doesn't realize that the test failure is unrelated to their current change.
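A sketch of what the first option could look like as a Maven profile in the parquet-hadoop pom.xml. The profile id and the -Xmx value here are assumptions for illustration, not anything decided in the issue:

```xml
<!-- Hypothetical profile (id and heap size are assumptions): runs only the
     memory-heavy test in a forked JVM with a larger heap. -->
<profile>
  <id>large-mem-tests</id>
  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-surefire-plugin</artifactId>
        <configuration>
          <!-- give the forked test JVM enough heap for the ~3GB test data -->
          <argLine>-Xmx4g</argLine>
          <includes>
            <include>**/TestLargeColumnChunk.java</include>
          </includes>
        </configuration>
      </plugin>
    </plugins>
  </build>
</profile>
```

It could then be run on demand (locally or as a separate CI check) with something like `mvn test -Plarge-mem-tests -pl parquet-hadoop`, keeping the default `mvn test` run within normal memory limits.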



--
This message was sent by Atlassian Jira
(v8.3.4#803005)