GitHub user HyukjinKwon commented on a diff in the pull request:

    https://github.com/apache/spark/pull/23055#discussion_r238048542
  
    --- Diff: docs/configuration.md ---
    @@ -190,6 +190,8 @@ of the most common options to set are:
     and it is up to the application to avoid exceeding the overhead memory space
     shared with other non-JVM processes. When PySpark is run in YARN or Kubernetes, this memory
     is added to executor resource requests.
    +
    +    NOTE: This configuration is not supported on Windows.
    --- End diff ---
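
    For context, a minimal way to supply this option from an application (a sketch with illustrative values; the app name and "2g" are not from this PR):

    ```python
    # Hypothetical PySpark session that sets the option discussed in this diff.
    from pyspark.sql import SparkSession

    spark = (
        SparkSession.builder
        .appName("pyspark-memory-demo")  # illustrative name
        # Caps each executor's Python worker memory; on YARN/Kubernetes this
        # amount is also added to the executor resource request.
        .config("spark.executor.pyspark.memory", "2g")
        .getOrCreate()
    )
    ```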
    
    > Python memory usage may not be limited on platforms that do not support resource limiting, such as Windows
    
    Let me change it to this.
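
    The Windows caveat exists because the cap is enforced with Python's Unix-only `resource` module. A minimal sketch of that platform check (assumed helper name, not the actual PySpark worker code):

    ```python
    # `resource` is only available on Unix-like systems, so the limit
    # simply cannot be enforced on Windows.
    try:
        import resource
        _HAS_RESOURCE = True
    except ImportError:  # e.g. on Windows
        _HAS_RESOURCE = False

    def try_limit_memory(limit_bytes):
        """Try to cap this process's address space; False where unsupported."""
        if not _HAS_RESOURCE:
            return False
        soft, hard = resource.getrlimit(resource.RLIMIT_AS)
        # Lower only the soft limit; leave the hard limit unchanged.
        resource.setrlimit(resource.RLIMIT_AS, (limit_bytes, hard))
        return True

    # Hypothetical 2 GiB cap, purely for illustration.
    if not try_limit_memory(2 * 1024 ** 3):
        print("Python memory usage may not be limited on this platform")
    ```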

