Github user ivoson commented on the issue:

    https://github.com/apache/spark/pull/22252
  
    Based on the discussion above, I think we should update the docs to guide users to 
specify the unit explicitly; the last commit updates the docs accordingly. 
cc @vanzin @srowen @xuanyuanking 
    
    I also think the unit for this conf should eventually be unified, because even 
in yarn-cluster or K8S mode, `spark.executor.memory` is not only used to allocate 
resources, where a unit-less value is parsed as MiB by default; in some other places 
a unit-less value is parsed as bytes by default during validation (see the sketch 
after these links), for example:
    
https://github.com/apache/spark/blob/ff8dcc1d4c684e1b68e63d61b3f20284b9979cca/core/src/main/scala/org/apache/spark/SparkContext.scala#L465-L470
    
https://github.com/apache/spark/blob/ff8dcc1d4c684e1b68e63d61b3f20284b9979cca/core/src/main/scala/org/apache/spark/memory/UnifiedMemoryManager.scala#L208-L233
    
https://github.com/apache/spark/blob/ff8dcc1d4c684e1b68e63d61b3f20284b9979cca/core/src/main/scala/org/apache/spark/memory/StaticMemoryManager.scala#L121-L143
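    
    To make the divergence concrete, here is a minimal sketch (the 
`UnitMismatchDemo` object name is made up for illustration) showing how the two 
`SparkConf` accessors used along these paths interpret the same unit-less value 
differently: `getSizeAsMb` treats a bare number as MiB, while `getSizeAsBytes` 
treats it as bytes:
    
```scala
import org.apache.spark.SparkConf

object UnitMismatchDemo {
  def main(args: Array[String]): Unit = {
    // A unit-less value, as a user might set it today.
    val conf = new SparkConf(false).set("spark.executor.memory", "2048")

    // Resource-allocation paths read the value as MiB by default ...
    println(conf.getSizeAsMb("spark.executor.memory"))    // 2048 (interpreted as MiB)

    // ... while the validation paths linked above read it as bytes by default.
    println(conf.getSizeAsBytes("spark.executor.memory")) // 2048 (interpreted as bytes)
  }
}
```
    
    So the same setting can pass allocation with 2 GiB yet fail validation as a 
2 KiB value, which is why unifying the default unit (or requiring an explicit 
unit) seems worthwhile.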
    
    


