[ 
https://issues.apache.org/jira/browse/SPARK-4970?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Sean Owen resolved SPARK-4970.
------------------------------
    Resolution: Not A Problem

> Do not read spark.executor.memory from spark-defaults.conf in SparkSubmitSuite
> ------------------------------------------------------------------------------
>
>                 Key: SPARK-4970
>                 URL: https://issues.apache.org/jira/browse/SPARK-4970
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>            Reporter: Takeshi Yamamuro
>            Priority: Minor
>
> The test 'includes jars passed in through --jars' in SparkSubmitSuite fails
> when spark.executor.memory is set above 512MiB in conf/spark-defaults.conf.
> An exception is thrown as follows:
> Exception in thread "main" org.apache.spark.SparkException: Asked to launch 
> cluster with 512 MB RAM / worker but requested 1024 MB/worker
>       at org.apache.spark.SparkContext$.org$apache$spark$SparkContext$$createTaskScheduler(SparkContext.scala:1889)
>       at org.apache.spark.SparkContext.<init>(SparkContext.scala:322)
>       at org.apache.spark.deploy.JarCreationTest$.main(SparkSubmitSuite.scala:458)
>       at org.apache.spark.deploy.JarCreationTest.main(SparkSubmitSuite.scala)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>       at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>       at java.lang.reflect.Method.invoke(Method.java:597)
>       at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:367)
>       at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
>       at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
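The trace shows the conflict: the child process launched by SparkSubmit picks up spark.executor.memory from the user's spark-defaults.conf, while the suite's local-cluster master only gives each worker 512 MB. A minimal shell sketch of a pre-flight check that flags the conflicting setting before running the suite (the function name, conf path fallback, and the size regex are illustrative assumptions, not part of Spark):

```shell
#!/bin/sh
# Hypothetical sketch: warn if the effective spark-defaults.conf asks for
# more executor memory than the suite's 512 MB local-cluster workers offer.
check_executor_memory() {
  # Spark reads defaults from $SPARK_CONF_DIR/spark-defaults.conf,
  # falling back here to ./conf for illustration.
  conf="${SPARK_CONF_DIR:-conf}/spark-defaults.conf"
  # Flag values of 4+ digit megabytes (>= 1000m) or anything in gigabytes.
  if [ -f "$conf" ] && \
     grep -Eq '^spark\.executor\.memory[[:space:]]+([0-9]{4,}m|[0-9]+g)' "$conf"; then
    echo "spark.executor.memory exceeds the 512 MB local-cluster worker; SparkSubmitSuite may fail"
  fi
}

check_executor_memory
```

Pointing SPARK_CONF_DIR at an empty directory before running the suite would sidestep the problem entirely, since no user defaults would be read.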



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
