Github user holdenk commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21977#discussion_r207595892
  
    --- Diff: python/pyspark/worker.py ---
    @@ -259,6 +260,26 @@ def main(infile, outfile):
                                  "PYSPARK_DRIVER_PYTHON are correctly set.") %
                                 ("%d.%d" % sys.version_info[:2], version))
     
    +        # set up memory limits
    +        memory_limit_mb = int(os.environ.get('PYSPARK_EXECUTOR_MEMORY_MB', "-1"))
    +        total_memory = resource.RLIMIT_AS
    +        try:
    +            (total_memory_limit, max_total_memory) = resource.getrlimit(total_memory)
    +            msg = "Current mem: {0} of max {1}\n".format(total_memory_limit, max_total_memory)
    +            sys.stderr.write(msg)
    +
    +            if memory_limit_mb > 0 and total_memory_limit < 0:
    --- End diff --
    
    So the logic of this block appears to be: the user has requested a memory limit and Python does not currently have one set. But if the user has requested a different memory limit than the one currently in effect, regardless of whether a limit is already set, would it make sense to set it?
    
    It's also possible I've misunderstood the rlimit return values here.
    
    That being said, even if that is the behaviour we want, should we use `resource.RLIM_INFINITY` to check whether it's unlimited?
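    
    Roughly what I mean (a sketch only; `memory_limit_mb` here is a stand-in for the value parsed from `PYSPARK_EXECUTOR_MEMORY_MB` in worker.py):
    
    ```python
    import resource
    
    memory_limit_mb = 512  # hypothetical requested limit in MiB
    
    # getrlimit returns a (soft, hard) pair for the given resource.
    soft_limit, hard_limit = resource.getrlimit(resource.RLIMIT_AS)
    
    # Compare against resource.RLIM_INFINITY rather than testing for a
    # negative value: "unlimited" is reported as RLIM_INFINITY, and its
    # concrete integer value is platform-dependent.
    should_set = memory_limit_mb > 0 and soft_limit == resource.RLIM_INFINITY
    print(should_set)
    ```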


---
