Github user squito commented on a diff in the pull request:

    https://github.com/apache/spark/pull/22247#discussion_r213398017
  
    --- Diff: python/pyspark/taskcontext.py ---
    @@ -108,38 +108,12 @@ def _load_from_socket(port, auth_secret):
         """
          Load data from a given socket, this is a blocking method thus only return when the socket
         connection has been closed.
    -
    -    This is copied from context.py, while modified the message protocol.
         """
    -    sock = None
    -    # Support for both IPv4 and IPv6.
    -    # On most of IPv6-ready systems, IPv6 will take precedence.
     -    for res in socket.getaddrinfo("localhost", port, socket.AF_UNSPEC, socket.SOCK_STREAM):
    -        af, socktype, proto, canonname, sa = res
    -        sock = socket.socket(af, socktype, proto)
    -        try:
    -            # Do not allow timeout for socket reading operation.
    -            sock.settimeout(None)
    -            sock.connect(sa)
    -        except socket.error:
    -            sock.close()
    -            sock = None
    -            continue
    -        break
    -    if not sock:
    -        raise Exception("could not open socket")
    -
     -    # We don't really need a socket file here, it's just for convenience that we can reuse the
    -    # do_server_auth() function and data serialization methods.
    -    sockfile = sock.makefile("rwb", 65536)
    -
    +    (sockfile, sock) = local_connect_and_auth(port, auth_secret)
    --- End diff --
    
    good catch, thanks!  updated
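
For context, the helper now used by _load_from_socket has to cover what the removed
lines did by hand: resolve localhost over both IPv4 and IPv6, connect with no read
timeout, wrap the socket in a buffered file object, and run the shared-secret
handshake. The sketch below is only an illustration of that shape; the handshake is a
simplified placeholder, not the exact protocol of PySpark's real local_connect_and_auth,
which the PR presumably imports from the existing auth utilities.

    import socket

    def local_connect_and_auth(port, auth_secret):
        """Sketch: connect to a local port and authenticate, returning
        (sockfile, sock). The handshake here is a stand-in, not PySpark's
        actual wire protocol."""
        sock = None
        errors = []
        # Try every address family reported for localhost; on most
        # IPv6-ready systems IPv6 is listed, and therefore tried, first.
        for res in socket.getaddrinfo("localhost", port, socket.AF_UNSPEC,
                                      socket.SOCK_STREAM):
            af, socktype, proto, _, sa = res
            try:
                sock = socket.socket(af, socktype, proto)
                sock.settimeout(None)  # block indefinitely on reads
                sock.connect(sa)
                break
            except socket.error as e:
                errors.append(str(e))
                if sock is not None:
                    sock.close()
                sock = None
        if sock is None:
            raise Exception("could not open socket: %s" % errors)

        # A buffered file object lets callers reuse the usual
        # serialization helpers on top of the raw socket.
        sockfile = sock.makefile("rwb", 65536)

        # Placeholder handshake: send the secret, expect an "ok" reply.
        sockfile.write(auth_secret.encode("utf-8") + b"\n")
        sockfile.flush()
        if sockfile.readline().strip() != b"ok":
            sock.close()
            raise Exception("authentication failed")
        return sockfile, sock

With something like this shared in one place, _load_from_socket and other callers
reduce to the single call shown in the diff above.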

