[ 
https://issues.apache.org/jira/browse/SPARK-23145?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Azharuddin updated SPARK-23145:
-------------------------------
    Description: 
How can I increase the memory available to Apache Spark executor nodes?

I have a 2 GB file that is suitable for loading into [Apache 
Spark|https://mindmajix.com/apache-spark-training]. For the moment I am running 
Apache Spark on a single machine, so the driver and executor are on the same 
machine. The machine has 8 GB of memory.
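
For context, the shell session is roughly the following sketch (the file path is just a placeholder, not the real one):

{code:scala}
// spark-shell on Spark 2.1.x: sc is the SparkContext the shell provides
// "/data/big-file.txt" stands in for the actual 2 GB file
val lines = sc.textFile("/data/big-file.txt")
lines.cache()          // ask Spark to keep the partitions in memory
println(lines.count()) // first action: triggers the read and the caching attempt
{code}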

When I try to count the lines of the file, after setting the file to be cached in 
memory, I get these errors:

{code}
2014-10-25 22:25:12 WARN CacheManager:71 - Not enough space to cache partition rdd_1_1 in memory! Free memory is 278099801 bytes.
{code}

I looked at the documentation 
[here|http://spark.apache.org/docs/latest/configuration.html] and set 
{{spark.executor.memory}} to {{4g}} in {{$SPARK_HOME/conf/spark-defaults.conf}}.
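
Concretely, the entry added to the defaults file looks roughly like this (the rest of the file is untouched):

{code}
# $SPARK_HOME/conf/spark-defaults.conf
spark.executor.memory    4g
{code}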

The UI shows that this variable is set in the Spark Environment. You can find a 
screenshot 
[here|https://drive.google.com/file/d/0B0B_O5bxDDlsc3JmY0xfbjFtN0k/view?usp=sharing].

However, when I go to the [Executor 
tab|https://drive.google.com/file/d/0B0B_O5bxDDlsV25ka2lHd1ZFNzA/view?usp=sharing], 
the memory limit for my single executor is still set to 265.4 MB, and I still 
get the same error.

I tried various things mentioned 
[here|https://stackoverflow.com/questions/24242060/how-to-change-memory-per-node-for-apache-spark-worker], 
but I still get the error and don't have a clear idea of where I should change 
the setting.

I am running my code interactively from the spark-shell.
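
For completeness, here is a sketch of the launch-time variants I understand are possible from the spark-shell/spark-submit options (I am not sure which of these actually governs the limit when driver and executor share one local JVM):

{code}
# standard spark-shell options in 2.1.x; 4g is just an example value
./bin/spark-shell --driver-memory 4g                  # heap of the driver JVM
./bin/spark-shell --executor-memory 4g                # same effect as spark.executor.memory
./bin/spark-shell --conf spark.executor.memory=4g     # equivalent, via a generic --conf
{code}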

> How to set Apache Spark Executor memory
> ---------------------------------------
>
>                 Key: SPARK-23145
>                 URL: https://issues.apache.org/jira/browse/SPARK-23145
>             Project: Spark
>          Issue Type: Question
>          Components: EC2
>    Affects Versions: 2.1.0
>            Reporter: Azharuddin
>            Priority: Major
>             Fix For: 2.1.1
>
>   Original Estimate: 954h
>  Remaining Estimate: 954h
>



