[ https://issues.apache.org/jira/browse/SPARK-17524?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15487093#comment-15487093 ]

Apache Spark commented on SPARK-17524:
--------------------------------------

User 'a-roberts' has created a pull request for this issue:
https://github.com/apache/spark/pull/15079

> RowBasedKeyValueBatchSuite always uses 64 mb page size
> ------------------------------------------------------
>
>                 Key: SPARK-17524
>                 URL: https://issues.apache.org/jira/browse/SPARK-17524
>             Project: Spark
>          Issue Type: Improvement
>          Components: Tests
>    Affects Versions: 2.1.0
>            Reporter: Adam Roberts
>            Priority: Minor
>
> The appendRowUntilExceedingPageSize test in 
> sql/catalyst/src/test/java/org/apache/spark/sql/catalyst/expressions/RowBasedKeyValueBatchSuite.java
> always runs with the default page size, which is 64 MB.
> Users with less powerful machines (e.g. those with two cores) may opt to 
> set a smaller spark.buffer.pageSize value in order to prevent problems 
> acquiring memory.
> If this size is reduced, say to 1 MB, the test fails. Here is the problem 
> scenario:
> We run with a page size of 1,048,576 bytes (1 MB).
> The default is 67,108,864 bytes (64 MB).
> The test fails with: java.lang.AssertionError: expected:<14563> but was:<932067>
> 932,067 is roughly 64x larger than 14,563, and the default page size is 64x 
> larger than the 1 MB value at which we see the failure.
> The failing assertion is
> Assert.assertEquals(batch.numRows(), numRows);
> This minor improvement has the test use whatever page size the user has 
> specified (it looks for spark.buffer.pageSize), so the problem no longer 
> occurs for anyone testing Apache Spark on a box with a reduced page size.
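
A minimal sketch of the idea, not necessarily the exact change made in the
pull request: read spark.buffer.pageSize from SparkConf, falling back to the
64 MB default, and use that value instead of the hard-coded 67,108,864 bytes.
The class name PageSizeSketch below is illustrative only.

    import org.apache.spark.SparkConf;

    public class PageSizeSketch {
        public static void main(String[] args) {
            SparkConf conf = new SparkConf();
            // getSizeAsBytes accepts size strings such as "1m" or "64m"
            long pageSizeBytes = conf.getSizeAsBytes("spark.buffer.pageSize", "64m");
            System.out.println("Using page size: " + pageSizeBytes + " bytes");
            // The test would then append rows until pageSizeBytes is exceeded,
            // so the expected row count scales with the configured page size.
        }
    }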



