[ 
https://issues.apache.org/jira/browse/SPARK-11130?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Marcelo Vanzin updated SPARK-11130:
-----------------------------------
    Description: 
Filing so it doesn't get lost (again).

TestHive.scala has this code:

{code}
    new SparkContext(
      System.getProperty("spark.sql.test.master", "local[32]"),
{code}

On machines with fewer cores, this causes many tests to fail with "unable to 
allocate memory" errors, because the default page size calculation appears to 
be based on the machine's core count rather than the core count specified for 
the SparkContext.
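
To make the mismatch concrete, here is a rough back-of-the-envelope sketch. The numbers, the {{safetyFactor}}, and the {{nextPowerOf2}} helper are all made up for illustration; this is not Spark's actual memory-manager code.

{code}
// Rough illustration with made-up numbers; not Spark's actual code.
// The page size is derived from the machine's core count, but local[32]
// lets 32 tasks split the pool, so one page may not fit a task's share.
object PageSizeMismatch {
  // Hypothetical helper: round up to the next power of two.
  def nextPowerOf2(n: Long): Long =
    java.lang.Long.highestOneBit(math.max(n - 1, 1L)) << 1

  def main(args: Array[String]): Unit = {
    val maxTestMemory = 512L * 1024 * 1024  // hypothetical memory pool
    val machineCores  = 2                   // a few-core machine
    val safetyFactor  = 16                  // illustrative divisor

    // Page size computed from the machine's cores, not the master string:
    val pageSize = nextPowerOf2(maxTestMemory / machineCores / safetyFactor)

    // local[32] allows 32 concurrent tasks; under fair division a task's
    // guaranteed share can be as small as 1/(2 * 32) of the pool:
    val perTaskFloor = maxTestMemory / (2 * 32)

    // Prints: pageSize=16777216 perTaskFloor=8388608 fitsOnePage=false
    println(s"pageSize=$pageSize perTaskFloor=$perTaskFloor " +
      s"fitsOnePage=${perTaskFloor >= pageSize}")
  }
}
{code}

With 32 cores the same arithmetic gives a 1 MB page, which fits comfortably, matching the observation that the failures only show up on small machines.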
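
Until there is a proper fix, one possible workaround (the property name is taken from the snippet above; the call just has to run before TestHive builds its SparkContext) is to override the default master so it matches the machine:

{code}
// Possible workaround sketch: pick a master that matches the machine, so
// the hardcoded local[32] default is never used on a small box.
System.setProperty("spark.sql.test.master",
  s"local[${Runtime.getRuntime.availableProcessors()}]")
{code}

Passing {{-Dspark.sql.test.master=local[2]}} to the test JVM should have the same effect.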


> TestHive fails on machines with few cores
> -----------------------------------------
>
>                 Key: SPARK-11130
>                 URL: https://issues.apache.org/jira/browse/SPARK-11130
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.5.0, 1.6.0
>            Reporter: Marcelo Vanzin
>            Priority: Minor
>
> Filing so it doesn't get lost (again).
> TestHive.scala has this code:
> {code}
>     new SparkContext(
>       System.getProperty("spark.sql.test.master", "local[32]"),
> {code}
> On machines with fewer cores, this causes many tests to fail with "unable to 
> allocate memory" errors, because the default page size calculation appears to 
> be based on the machine's core count rather than the core count specified for 
> the SparkContext.


