[jira] [Assigned] (SPARK-27811) Docs of spark.driver.memoryOverhead and spark.executor.memoryOverhead are a little ambiguous

2019-06-01 Thread Sean Owen (JIRA)


 [ https://issues.apache.org/jira/browse/SPARK-27811?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen reassigned SPARK-27811:
-

Assignee: jiaan.geng

>  Docs of spark.driver.memoryOverhead and spark.executor.memoryOverhead are 
> a little ambiguous
> 
>
> Key: SPARK-27811
> URL: https://issues.apache.org/jira/browse/SPARK-27811
> Project: Spark
>  Issue Type: Improvement
>  Components: Documentation, Spark Core
>Affects Versions: 2.3.0, 2.4.0
>Reporter: jiaan.geng
>Assignee: jiaan.geng
>Priority: Trivial
>
> I found that the docs of {{spark.driver.memoryOverhead}} and 
> {{spark.executor.memoryOverhead}} are a little ambiguous.
> For example, the original docs of {{spark.driver.memoryOverhead}} start with 
> {{The amount of off-heap memory to be allocated per driver in cluster mode}}.
> But {{MemoryManager}} also manages a memory area named off-heap, which is used 
> to allocate memory in Tungsten mode.
> So I think the description of {{spark.driver.memoryOverhead}} is always 
> confusing.
> {{spark.executor.memoryOverhead}} has the same problem as 
> {{spark.driver.memoryOverhead}}.
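
For context, the two kinds of "off-heap" memory mentioned above are governed by separate settings: {{spark.driver.memoryOverhead}} / {{spark.executor.memoryOverhead}} request extra non-heap container memory (JVM overhead, native allocations, etc.), while {{spark.memory.offHeap.enabled}} and {{spark.memory.offHeap.size}} control the Tungsten off-heap pool that {{MemoryManager}} tracks. A minimal sketch (app name and sizes are illustrative, not taken from the issue) of setting both, assuming they are applied before the SparkContext starts:

{code:scala}
import org.apache.spark.sql.SparkSession

// Illustrative values only. memoryOverhead is extra non-heap memory requested
// per container; spark.memory.offHeap.* sizes the Tungsten off-heap pool
// managed by MemoryManager. The two are configured independently.
val spark = SparkSession.builder()
  .appName("memory-config-sketch")                 // hypothetical app name
  .config("spark.executor.memory", "4g")           // executor JVM heap
  .config("spark.executor.memoryOverhead", "1g")   // non-heap container overhead
  .config("spark.memory.offHeap.enabled", "true")
  .config("spark.memory.offHeap.size", "2g")       // Tungsten off-heap pool
  .getOrCreate()
{code}

In practice these would usually be passed via {{spark-submit --conf}}; the point is only that the "off-heap" in memoryOverhead is not the off-heap pool that {{MemoryManager}} manages.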



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Assigned] (SPARK-27811) Docs of spark.driver.memoryOverhead and spark.executor.memoryOverhead are a little ambiguous

2019-05-22 Thread Apache Spark (JIRA)


 [ https://issues.apache.org/jira/browse/SPARK-27811?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-27811:


Assignee: Apache Spark

>  Docs of spark.driver.memoryOverhead and spark.executor.memoryOverhead are 
> a little ambiguous
> 
>
> Key: SPARK-27811
> URL: https://issues.apache.org/jira/browse/SPARK-27811
> Project: Spark
>  Issue Type: Improvement
>  Components: Documentation, Spark Core
>Affects Versions: 2.3.0, 2.4.0
>Reporter: jiaan.geng
>Assignee: Apache Spark
>Priority: Major
>
> I found that the docs of {{spark.driver.memoryOverhead}} and 
> {{spark.executor.memoryOverhead}} are a little ambiguous.
> For example, the original docs of {{spark.driver.memoryOverhead}} start with 
> {{The amount of off-heap memory to be allocated per driver in cluster mode}}.
> But {{MemoryManager}} also manages a memory area named off-heap, which is used 
> to allocate memory in Tungsten mode.
> So I think the description of {{spark.driver.memoryOverhead}} is always 
> confusing.
> {{spark.executor.memoryOverhead}} has the same problem as 
> {{spark.driver.memoryOverhead}}.






[jira] [Assigned] (SPARK-27811) Docs of spark.driver.memoryOverhead and spark.executor.memoryOverhead are a little ambiguous

2019-05-22 Thread Apache Spark (JIRA)


 [ https://issues.apache.org/jira/browse/SPARK-27811?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-27811:


Assignee: (was: Apache Spark)

>  Docs of spark.driver.memoryOverhead and spark.executor.memoryOverhead are 
> a little ambiguous
> 
>
> Key: SPARK-27811
> URL: https://issues.apache.org/jira/browse/SPARK-27811
> Project: Spark
>  Issue Type: Improvement
>  Components: Documentation, Spark Core
>Affects Versions: 2.3.0, 2.4.0
>Reporter: jiaan.geng
>Priority: Major
>
> I found that the docs of {{spark.driver.memoryOverhead}} and 
> {{spark.executor.memoryOverhead}} are a little ambiguous.
> For example, the original docs of {{spark.driver.memoryOverhead}} start with 
> {{The amount of off-heap memory to be allocated per driver in cluster mode}}.
> But {{MemoryManager}} also manages a memory area named off-heap, which is used 
> to allocate memory in Tungsten mode.
> So I think the description of {{spark.driver.memoryOverhead}} is always 
> confusing.
> {{spark.executor.memoryOverhead}} has the same problem as 
> {{spark.driver.memoryOverhead}}.


