Source: python-requests-cache
Version: 0.9.8-2
Severity: serious
Justification: FTBFS
Tags: trixie sid ftbfs
User: lu...@debian.org
Usertags: ftbfs-20240615 ftbfs-trixie

Hi,

During a rebuild of all packages in sid, your package failed to build
on amd64.


Relevant part (hopefully):
> make[1]: Entering directory '/<<PKGBUILDDIR>>'
> dh_auto_build
> I: pybuild plugin_pyproject:129: Building wheel for python3.12 with "build" module
> I: pybuild base:311: python3.12 -m build --skip-dependency-check --no-isolation --wheel --outdir /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests_cache
> * Building wheel...
> Successfully built requests_cache-0.9.8-py3-none-any.whl
> I: pybuild plugin_pyproject:144: Unpacking wheel built for python3.12 with "installer" module
> I: pybuild plugin_pyproject:129: Building wheel for python3.11 with "build" module
> I: pybuild base:311: python3.11 -m build --skip-dependency-check --no-isolation --wheel --outdir /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests_cache
> * Building wheel...
> Successfully built requests_cache-0.9.8-py3-none-any.whl
> I: pybuild plugin_pyproject:144: Unpacking wheel built for python3.11 with "installer" module
> # /usr/bin/make -C docs html
> make[1]: Leaving directory '/<<PKGBUILDDIR>>'
>    dh_auto_test -O--buildsystem=pybuild
> I: pybuild base:311: cd /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests_cache/build; python3.12 -m pytest --ignore=tests/integration
> ============================= test session starts ==============================
> platform linux -- Python 3.12.4, pytest-8.2.2, pluggy-1.5.0
> rootdir: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests_cache/build
> configfile: pyproject.toml
> plugins: requests-mock-1.11.0
> collected 299 items
> 
> tests/compat/test_requests_mock_combine_cache.py .                       [  0%]
> tests/compat/test_requests_mock_disable_cache.py .                       [  0%]
> tests/compat/test_requests_mock_load_cache.py .                          [  1%]
> tests/compat/test_responses_load_cache.py F                              [  1%]
> tests/unit/models/test_raw_response.py F.....                            [  3%]
> tests/unit/models/test_request.py .                                      [  3%]
> tests/unit/models/test_response.py ...............                       [  8%]
> tests/unit/policy/test_actions.py ...................................... [ 21%]
> ..........................................                               [ 35%]
> tests/unit/test_cache_keys.py ..................                         [ 41%]
> tests/unit/test_patcher.py ..........                                    [ 44%]
> tests/unit/test_serializers.py .......                                   [ 47%]
> tests/unit/test_session.py ............................................. [ 62%]
> ........................................................................ [ 86%]
> .........................................                                [100%]
> 
> =================================== FAILURES ===================================
> ______________________________ test_mock_session _______________________________
> 
> self = <urllib3.connection.HTTPSConnection object at 0x7fa120524950>
> 
>     def _new_conn(self) -> socket.socket:
>         """Establish a socket connection and set nodelay settings on it.
>     
>         :return: New socket connection.
>         """
>         try:
> >           sock = connection.create_connection(
>                 (self._dns_host, self.port),
>                 self.timeout,
>                 source_address=self.source_address,
>                 socket_options=self.socket_options,
>             )
> 
> /usr/lib/python3/dist-packages/urllib3/connection.py:203: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
>     raise err
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> address = ('127.0.0.1', 9), timeout = None, source_address = None
> socket_options = []
> 
>     def create_connection(
>         address: tuple[str, int],
>         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
>         source_address: tuple[str, int] | None = None,
>         socket_options: _TYPE_SOCKET_OPTIONS | None = None,
>     ) -> socket.socket:
>         """Connect to *address* and return the socket object.
>     
>         Convenience function.  Connect to *address* (a 2-tuple ``(host,
>         port)``) and return the socket object.  Passing the optional
>         *timeout* parameter will set the timeout on the socket instance
>         before attempting to connect.  If no *timeout* is supplied, the
>         global default timeout setting returned by :func:`socket.getdefaulttimeout`
>         is used.  If *source_address* is set it must be a tuple of (host, port)
>         for the socket to bind as a source address before making the connection.
>         An host of '' or port 0 tells the OS to use the default.
>         """
>     
>         host, port = address
>         if host.startswith("["):
>             host = host.strip("[]")
>         err = None
>     
>         # Using the value from allowed_gai_family() in the context of getaddrinfo lets
>         # us select whether to work with IPv4 DNS records, IPv6 records, or both.
>         # The original create_connection function always returns all records.
>         family = allowed_gai_family()
>     
>         try:
>             host.encode("idna")
>         except UnicodeError:
>             raise LocationParseError(f"'{host}', label empty or too long") from None
>     
>         for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
>             af, socktype, proto, canonname, sa = res
>             sock = None
>             try:
>                 sock = socket.socket(af, socktype, proto)
>     
>                 # If provided, set socket level options before connecting.
>                 _set_socket_options(sock, socket_options)
>     
>                 if timeout is not _DEFAULT_TIMEOUT:
>                     sock.settimeout(timeout)
>                 if source_address:
>                     sock.bind(source_address)
> >               sock.connect(sa)
> E               ConnectionRefusedError: [Errno 111] Connection refused
> 
> /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
> 
> The above exception was the direct cause of the following exception:
> 
> self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7fa120524890>
> method = 'GET', url = '/gzip', body = None
> headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
> retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
> redirect = False, assert_same_host = False
> timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
> release_conn = False, chunked = False, body_pos = None, preload_content = False
> decode_content = False, response_kw = {}
> parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/gzip', query=None, fragment=None)
> destination_scheme = None, conn = None, release_this_conn = True
> http_tunnel_required = True, err = None, clean_exit = False
> 
>     def urlopen(  # type: ignore[override]
>         self,
>         method: str,
>         url: str,
>         body: _TYPE_BODY | None = None,
>         headers: typing.Mapping[str, str] | None = None,
>         retries: Retry | bool | int | None = None,
>         redirect: bool = True,
>         assert_same_host: bool = True,
>         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
>         pool_timeout: int | None = None,
>         release_conn: bool | None = None,
>         chunked: bool = False,
>         body_pos: _TYPE_BODY_POSITION | None = None,
>         preload_content: bool = True,
>         decode_content: bool = True,
>         **response_kw: typing.Any,
>     ) -> BaseHTTPResponse:
>         """
>         Get a connection from the pool and perform an HTTP request. This is the
>         lowest level call for making a request, so you'll need to specify all
>         the raw details.
>     
>         .. note::
>     
>            More commonly, it's appropriate to use a convenience method
>            such as :meth:`request`.
>     
>         .. note::
>     
>            `release_conn` will only behave as expected if
>            `preload_content=False` because we want to make
>            `preload_content=False` the default behaviour someday soon without
>            breaking backwards compatibility.
>     
>         :param method:
>             HTTP request method (such as GET, POST, PUT, etc.)
>     
>         :param url:
>             The URL to perform the request on.
>     
>         :param body:
>             Data to send in the request body, either :class:`str`, :class:`bytes`,
>             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
>     
>         :param headers:
>             Dictionary of custom headers to send, such as User-Agent,
>             If-None-Match, etc. If None, pool headers are used. If provided,
>             these headers completely replace any pool-specific headers.
>     
>         :param retries:
>             Configure the number of retries to allow before raising a
>             :class:`~urllib3.exceptions.MaxRetryError` exception.
>     
>             Pass ``None`` to retry until you receive a response. Pass a
>             :class:`~urllib3.util.retry.Retry` object for fine-grained control
>             over different types of retries.
>             Pass an integer number to retry connection errors that many times,
>             but no other types of errors. Pass zero to never retry.
>     
>             If ``False``, then retries are disabled and any exception is raised
>             immediately. Also, instead of raising a MaxRetryError on redirects,
>             the redirect response will be returned.
>     
>         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
>     
>         :param redirect:
>             If True, automatically handle redirects (status codes 301, 302,
>             303, 307, 308). Each redirect counts as a retry. Disabling retries
>             will disable redirect, too.
>     
>         :param assert_same_host:
>             If ``True``, will make sure that the host of the pool requests is
>             consistent else will raise HostChangedError. When ``False``, you can
>             use the pool on an HTTP proxy and request foreign hosts.
>     
>         :param timeout:
>             If specified, overrides the default timeout for this one
>             request. It may be a float (in seconds) or an instance of
>             :class:`urllib3.util.Timeout`.
>     
>         :param pool_timeout:
>             If set and the pool is set to block=True, then this method will
>             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
>             connection is available within the time period.
>     
>         :param bool preload_content:
>             If True, the response's body will be preloaded into memory.
>     
>         :param bool decode_content:
>             If True, will attempt to decode the body based on the
>             'content-encoding' header.
>     
>         :param release_conn:
>             If False, then the urlopen call will not release the connection
>             back into the pool once a response is received (but will release if
>             you read the entire contents of the response such as when
>             `preload_content=True`). This is useful if you're not preloading
>             the response's content immediately. You will need to call
>             ``r.release_conn()`` on the response ``r`` to return the connection
>             back into the pool. If None, it takes the value of ``preload_content``
>             which defaults to ``True``.
>     
>         :param bool chunked:
>             If True, urllib3 will send the body using chunked transfer
>             encoding. Otherwise, urllib3 will send the body using the standard
>             content-length form. Defaults to False.
>     
>         :param int body_pos:
>             Position to seek to in file-like body in the event of a retry or
>             redirect. Typically this won't need to be set because urllib3 will
>             auto-populate the value when needed.
>         """
>         parsed_url = parse_url(url)
>         destination_scheme = parsed_url.scheme
>     
>         if headers is None:
>             headers = self.headers
>     
>         if not isinstance(retries, Retry):
>             retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
>     
>         if release_conn is None:
>             release_conn = preload_content
>     
>         # Check host
>         if assert_same_host and not self.is_same_host(url):
>             raise HostChangedError(self, url, retries)
>     
>         # Ensure that the URL we're connecting to is properly encoded
>         if url.startswith("/"):
>             url = to_str(_encode_target(url))
>         else:
>             url = to_str(parsed_url.url)
>     
>         conn = None
>     
>         # Track whether `conn` needs to be released before
>         # returning/raising/recursing. Update this variable if necessary, and
>         # leave `release_conn` constant throughout the function. That way, if
>         # the function recurses, the original value of `release_conn` will be
>         # passed down into the recursive call, and its value will be respected.
>         #
>         # See issue #651 [1] for details.
>         #
>         # [1] <https://github.com/urllib3/urllib3/issues/651>
>         release_this_conn = release_conn
>     
>         http_tunnel_required = connection_requires_http_tunnel(
>             self.proxy, self.proxy_config, destination_scheme
>         )
>     
>         # Merge the proxy headers. Only done when not using HTTP CONNECT. We
>         # have to copy the headers dict so we can safely change it without those
>         # changes being reflected in anyone else's copy.
>         if not http_tunnel_required:
>             headers = headers.copy()  # type: ignore[attr-defined]
>             headers.update(self.proxy_headers)  # type: ignore[union-attr]
>     
>         # Must keep the exception bound to a separate variable or else Python 3
>         # complains about UnboundLocalError.
>         err = None
>     
>         # Keep track of whether we cleanly exited the except block. This
>         # ensures we do proper cleanup in finally.
>         clean_exit = False
>     
>         # Rewind body position, if needed. Record current position
>         # for future rewinds in the event of a redirect/retry.
>         body_pos = set_file_position(body, body_pos)
>     
>         try:
>             # Request a connection from the queue.
>             timeout_obj = self._get_timeout(timeout)
>             conn = self._get_conn(timeout=pool_timeout)
>     
>             conn.timeout = timeout_obj.connect_timeout  # type: ignore[assignment]
>     
>             # Is this a closed/new connection that requires CONNECT tunnelling?
>             if self.proxy is not None and http_tunnel_required and conn.is_closed:
>                 try:
> >                   self._prepare_proxy(conn)
> 
> /usr/lib/python3/dist-packages/urllib3/connectionpool.py:777: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> /usr/lib/python3/dist-packages/urllib3/connectionpool.py:1046: in _prepare_proxy
>     conn.connect()
> /usr/lib/python3/dist-packages/urllib3/connection.py:611: in connect
>     self.sock = sock = self._new_conn()
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> self = <urllib3.connection.HTTPSConnection object at 0x7fa120524950>
> 
>     def _new_conn(self) -> socket.socket:
>         """Establish a socket connection and set nodelay settings on it.
>     
>         :return: New socket connection.
>         """
>         try:
>             sock = connection.create_connection(
>                 (self._dns_host, self.port),
>                 self.timeout,
>                 source_address=self.source_address,
>                 socket_options=self.socket_options,
>             )
>         except socket.gaierror as e:
>             raise NameResolutionError(self.host, self, e) from e
>         except SocketTimeout as e:
>             raise ConnectTimeoutError(
>                 self,
>                 f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
>             ) from e
>     
>         except OSError as e:
> >           raise NewConnectionError(
>                 self, f"Failed to establish a new connection: {e}"
>             ) from e
> E           urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPSConnection object at 0x7fa120524950>: Failed to establish a new connection: [Errno 111] Connection refused
> 
> /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError
> 
> The above exception was the direct cause of the following exception:
> Traceback (most recent call last):
>   File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 203, in _new_conn
>     sock = connection.create_connection(
>            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
>   File "/usr/lib/python3/dist-packages/urllib3/util/connection.py", line 85, in create_connection
>     raise err
>   File "/usr/lib/python3/dist-packages/urllib3/util/connection.py", line 73, in create_connection
>     sock.connect(sa)
> ConnectionRefusedError: [Errno 111] Connection refused
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 777, in urlopen
>     self._prepare_proxy(conn)
>   File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 1046, in _prepare_proxy
>     conn.connect()
>   File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 611, in connect
>     self.sock = sock = self._new_conn()
>                        ^^^^^^^^^^^^^^^^
>   File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 218, in _new_conn
>     raise NewConnectionError(
>     raise NewConnectionError(
> urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPSConnection object at 0x7fa120524950>: Failed to establish a new connection: [Errno 111] Connection refused
> 
> The above exception was the direct cause of the following exception:
> 
> urllib3.exceptions.ProxyError: ('Unable to connect to proxy', NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7fa120524950>: Failed to establish a new connection: [Errno 111] Connection refused'))
> 
> The above exception was the direct cause of the following exception:
> 
> self = <requests.adapters.HTTPAdapter object at 0x7fa120b4be00>
> request = <PreparedRequest [GET]>, stream = False
> timeout = Timeout(connect=None, read=None, total=None), verify = True
> cert = None
> proxies = OrderedDict({'no': 'localhost', 'https': 'https://127.0.0.1:9/', 'http': 'http://127.0.0.1:9/'})
> 
>     def send(
>         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
>     ):
>         """Sends PreparedRequest object. Returns Response object.
>     
>         :param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
>         :param stream: (optional) Whether to stream the request content.
>         :param timeout: (optional) How long to wait for the server to send
>             data before giving up, as a float, or a :ref:`(connect timeout,
>             read timeout) <timeouts>` tuple.
>         :type timeout: float or tuple or urllib3 Timeout object
>         :param verify: (optional) Either a boolean, in which case it controls whether
>             we verify the server's TLS certificate, or a string, in which case it
>             must be a path to a CA bundle to use
>         :param cert: (optional) Any user-provided SSL certificate to be trusted.
>         :param proxies: (optional) The proxies dictionary to apply to the request.
>         :rtype: requests.Response
>         """
>     
>         try:
>             conn = self.get_connection_with_tls_context(
>                 request, verify, proxies=proxies, cert=cert
>             )
>         except LocationValueError as e:
>             raise InvalidURL(e, request=request)
>     
>         self.cert_verify(conn, request.url, verify, cert)
>         url = self.request_url(request, proxies)
>         self.add_headers(
>             request,
>             stream=stream,
>             timeout=timeout,
>             verify=verify,
>             cert=cert,
>             proxies=proxies,
>         )
>     
>         chunked = not (request.body is None or "Content-Length" in request.headers)
>     
>         if isinstance(timeout, tuple):
>             try:
>                 connect, read = timeout
>                 timeout = TimeoutSauce(connect=connect, read=read)
>             except ValueError:
>                 raise ValueError(
>                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
>                     f"or a single float to set both timeouts to the same value."
>                 )
>         elif isinstance(timeout, TimeoutSauce):
>             pass
>         else:
>             timeout = TimeoutSauce(connect=timeout, read=timeout)
>     
>         try:
> >           resp = conn.urlopen(
>                 method=request.method,
>                 url=url,
>                 body=request.body,
>                 headers=request.headers,
>                 redirect=False,
>                 assert_same_host=False,
>                 preload_content=False,
>                 decode_content=False,
>                 retries=self.max_retries,
>                 timeout=timeout,
>                 chunked=chunked,
>             )
> 
> /usr/lib/python3/dist-packages/requests/adapters.py:667: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen
>     retries = retries.increment(
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
> method = 'GET', url = '/gzip', response = None
> error = ProxyError('Unable to connect to proxy', NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7fa120524950>: Failed to establish a new connection: [Errno 111] Connection refused'))
> _pool = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7fa120524890>
> _stacktrace = <traceback object at 0x7fa12052e000>
> 
>     def increment(
>         self,
>         method: str | None = None,
>         url: str | None = None,
>         response: BaseHTTPResponse | None = None,
>         error: Exception | None = None,
>         _pool: ConnectionPool | None = None,
>         _stacktrace: TracebackType | None = None,
>     ) -> Retry:
>         """Return a new Retry object with incremented retry counters.
>     
>         :param response: A response object, or None, if the server did not
>             return a response.
>         :type response: :class:`~urllib3.response.BaseHTTPResponse`
>         :param Exception error: An error encountered during the request, or
>             None if the response was received successfully.
>     
>         :return: A new ``Retry`` object.
>         """
>         if self.total is False and error:
>             # Disabled, indicate to re-raise the error.
>             raise reraise(type(error), error, _stacktrace)
>     
>         total = self.total
>         if total is not None:
>             total -= 1
>     
>         connect = self.connect
>         read = self.read
>         redirect = self.redirect
>         status_count = self.status
>         other = self.other
>         cause = "unknown"
>         status = None
>         redirect_location = None
>     
>         if error and self._is_connection_error(error):
>             # Connect retry?
>             if connect is False:
>                 raise reraise(type(error), error, _stacktrace)
>             elif connect is not None:
>                 connect -= 1
>     
>         elif error and self._is_read_error(error):
>             # Read retry?
>             if read is False or method is None or not self._is_method_retryable(method):
>                 raise reraise(type(error), error, _stacktrace)
>             elif read is not None:
>                 read -= 1
>     
>         elif error:
>             # Other retry?
>             if other is not None:
>                 other -= 1
>     
>         elif response and response.get_redirect_location():
>             # Redirect retry?
>             if redirect is not None:
>                 redirect -= 1
>             cause = "too many redirects"
>             response_redirect_location = response.get_redirect_location()
>             if response_redirect_location:
>                 redirect_location = response_redirect_location
>             status = response.status
>     
>         else:
>             # Incrementing because of a server error like a 500 in
>             # status_forcelist and the given method is in the allowed_methods
>             cause = ResponseError.GENERIC_ERROR
>             if response and response.status:
>                 if status_count is not None:
>                     status_count -= 1
>                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
>                 status = response.status
>     
>         history = self.history + (
>             RequestHistory(method, url, error, status, redirect_location),
>         )
>     
>         new_retry = self.new(
>             total=total,
>             connect=connect,
>             read=read,
>             redirect=redirect,
>             status=status_count,
>             other=other,
>             history=history,
>         )
>     
>         if new_retry.is_exhausted():
>             reason = error or ResponseError(cause)
> >           raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
> E           urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='httpbin.org', port=443): Max retries exceeded with url: /gzip (Caused by ProxyError('Unable to connect to proxy', NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7fa120524950>: Failed to establish a new connection: [Errno 111] Connection refused')))
> 
> /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
> 
> During handling of the above exception, another exception occurred:
> 
> mock_http_adapter = <MagicMock name='get_connection' id='140330015172864'>
> 
>     @patch.object(
>         requests.adapters.HTTPAdapter, 'get_connection', side_effect=ValueError('Real request made!')
>     )
>     def test_mock_session(mock_http_adapter):
>         """Test that the mock_session fixture is working as expected"""
>         with get_responses():
>             # An error will be raised if a real request is made
>             with pytest.raises(ValueError):
> >               requests.get(PASSTHRU_URL)
> 
> tests/compat/test_responses_load_cache.py:53: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> /usr/lib/python3/dist-packages/requests/api.py:73: in get
>     return request("get", url, params=params, **kwargs)
> /usr/lib/python3/dist-packages/requests/api.py:59: in request
>     return session.request(method=method, url=url, **kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
>     resp = self.send(prep, **send_kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
>     r = adapter.send(request, **kwargs)
> /usr/lib/python3/dist-packages/responses/__init__.py:1175: in send
>     return self._on_request(adapter, request, **kwargs)
> /usr/lib/python3/dist-packages/responses/__init__.py:1079: in _on_request
>     return self._real_send(adapter, request, **kwargs)  # type: ignore
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> self = <requests.adapters.HTTPAdapter object at 0x7fa120b4be00>
> request = <PreparedRequest [GET]>, stream = False
> timeout = Timeout(connect=None, read=None, total=None), verify = True
> cert = None
> proxies = OrderedDict({'no': 'localhost', 'https': 'https://127.0.0.1:9/', 'http': 'http://127.0.0.1:9/'})
> 
>     def send(
>         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
>     ):
>         """Sends PreparedRequest object. Returns Response object.
>     
>         :param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
>         :param stream: (optional) Whether to stream the request content.
>         :param timeout: (optional) How long to wait for the server to send
>             data before giving up, as a float, or a :ref:`(connect timeout,
>             read timeout) <timeouts>` tuple.
>         :type timeout: float or tuple or urllib3 Timeout object
>         :param verify: (optional) Either a boolean, in which case it controls whether
>             we verify the server's TLS certificate, or a string, in which case it
>             must be a path to a CA bundle to use
>         :param cert: (optional) Any user-provided SSL certificate to be trusted.
>         :param proxies: (optional) The proxies dictionary to apply to the request.
>         :rtype: requests.Response
>         """
>     
>         try:
>             conn = self.get_connection_with_tls_context(
>                 request, verify, proxies=proxies, cert=cert
>             )
>         except LocationValueError as e:
>             raise InvalidURL(e, request=request)
>     
>         self.cert_verify(conn, request.url, verify, cert)
>         url = self.request_url(request, proxies)
>         self.add_headers(
>             request,
>             stream=stream,
>             timeout=timeout,
>             verify=verify,
>             cert=cert,
>             proxies=proxies,
>         )
>     
>         chunked = not (request.body is None or "Content-Length" in request.headers)
>     
>         if isinstance(timeout, tuple):
>             try:
>                 connect, read = timeout
>                 timeout = TimeoutSauce(connect=connect, read=read)
>             except ValueError:
>                 raise ValueError(
>                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
>                     f"or a single float to set both timeouts to the same value."
>                 )
>         elif isinstance(timeout, TimeoutSauce):
>             pass
>         else:
>             timeout = TimeoutSauce(connect=timeout, read=timeout)
>     
>         try:
>             resp = conn.urlopen(
>                 method=request.method,
>                 url=url,
>                 body=request.body,
>                 headers=request.headers,
>                 redirect=False,
>                 assert_same_host=False,
>                 preload_content=False,
>                 decode_content=False,
>                 retries=self.max_retries,
>                 timeout=timeout,
>                 chunked=chunked,
>             )
>     
>         except (ProtocolError, OSError) as err:
>             raise ConnectionError(err, request=request)
>     
>         except MaxRetryError as e:
>             if isinstance(e.reason, ConnectTimeoutError):
>                 # TODO: Remove this in 3.0.0: see #2811
>                 if not isinstance(e.reason, NewConnectionError):
>                     raise ConnectTimeout(e, request=request)
>     
>             if isinstance(e.reason, ResponseError):
>                 raise RetryError(e, request=request)
>     
>             if isinstance(e.reason, _ProxyError):
> >               raise ProxyError(e, request=request)
> E               requests.exceptions.ProxyError: HTTPSConnectionPool(host='httpbin.org', port=443): Max retries exceeded with url: /gzip (Caused by ProxyError('Unable to connect to proxy', NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7fa120524950>: Failed to establish a new connection: [Errno 111] Connection refused')))
> 
> /usr/lib/python3/dist-packages/requests/adapters.py:694: ProxyError
> ----------------------------- Captured stderr call -----------------------------
> INFO:responses:request.allowed-passthru
> ------------------------------ Captured log call -------------------------------
> INFO     responses:__init__.py:1078 request.allowed-passthru
> ______________________________ test_from_response ______________________________
> 
> mock_session = <CachedSession(cache=<SQLiteCache(name=http_cache)>, expire_after=-1, urls_expire_after=None, allowable_codes=(200,), ...wable_methods=['GET', 'HEAD', 'OPTIONS', 'POST', 'PUT', 'PATCH', 'DELETE'], stale_if_error=False, cache_control=False)>
> 
>     def test_from_response(mock_session):
>         response = mock_session.get(MOCKED_URL)
>         response.raw._fp = BytesIO(b'mock response')
>         raw = CachedHTTPResponse.from_response(response)
>     
>         assert dict(response.raw.headers) == dict(raw.headers) == {'Content-Type': 'text/plain'}
>         assert raw.read(None) == b'mock response'
>         assert response.raw.decode_content is raw.decode_content is False
>         assert response.raw.reason is raw.reason is None
>         if hasattr(response.raw, '_request_url'):
>             assert response.raw._request_url is raw.request_url is None
>         assert response.raw.status == raw.status == 200
> >       assert response.raw.strict == raw.strict == 0
> E       AttributeError: 'HTTPResponse' object has no attribute 'strict'
> 
> tests/unit/models/test_raw_response.py:19: AttributeError
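The `test_from_response` failure is an incompatibility with urllib3 2.x, which removed the legacy `HTTPResponse.strict` attribute. A version-tolerant assertion could guard on attribute presence, mirroring the `_request_url` guard the test already uses; a minimal sketch with a stand-in object (`FakeRawResponse` is illustrative, not the real urllib3 class):

```python
# Sketch: guard the `strict` assertion the same way the test already
# guards `_request_url`, so it passes on both urllib3 1.x and 2.x.
class FakeRawResponse:
    status = 200
    # urllib3 2.x: no `strict` attribute at all

raw = FakeRawResponse()
assert raw.status == 200
if hasattr(raw, 'strict'):   # only present on urllib3 < 2.0
    assert raw.strict == 0
```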
> =============================== warnings summary ===============================
> tests/unit/models/test_response.py:38
>   
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests_cache/build/tests/unit/models/test_response.py:38:
>  DeprecationWarning: datetime.datetime.utcnow() is deprecated and scheduled 
> for removal in a future version. Use timezone-aware objects to represent 
> datetimes in UTC: datetime.datetime.now(datetime.UTC).
>     (datetime.utcnow() + timedelta(days=1), False),
> 
> tests/unit/models/test_response.py:39
>   
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests_cache/build/tests/unit/models/test_response.py:39:
>  DeprecationWarning: datetime.datetime.utcnow() is deprecated and scheduled 
> for removal in a future version. Use timezone-aware objects to represent 
> datetimes in UTC: datetime.datetime.now(datetime.UTC).
>     (datetime.utcnow() - timedelta(days=1), True),
> 
> tests/unit/test_session.py:32
>   
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests_cache/build/tests/unit/test_session.py:32:
>  DeprecationWarning: datetime.datetime.utcnow() is deprecated and scheduled 
> for removal in a future version. Use timezone-aware objects to represent 
> datetimes in UTC: datetime.datetime.now(datetime.UTC).
>     YESTERDAY = datetime.utcnow() - timedelta(days=1)
> 
> tests/compat/test_requests_mock_combine_cache.py: 1 warning
> tests/unit/models/test_raw_response.py: 1 warning
> tests/unit/models/test_request.py: 1 warning
> tests/unit/models/test_response.py: 19 warnings
> tests/unit/policy/test_actions.py: 5 warnings
> tests/unit/test_serializers.py: 3 warnings
> tests/unit/test_session.py: 311 warnings
>   <attrs generated init requests_cache.models.response.CachedResponse>:11: 
> DeprecationWarning: datetime.datetime.utcnow() is deprecated and scheduled 
> for removal in a future version. Use timezone-aware objects to represent 
> datetimes in UTC: datetime.datetime.now(datetime.UTC).
>     self.created_at = __attr_factory_created_at()
> 
> tests/compat/test_requests_mock_load_cache.py::test_mock_session
> tests/compat/test_responses_load_cache.py::test_mock_session
> tests/unit/test_session.py::test_values
> tests/unit/test_session.py::test_values__with_invalid_responses[True-1]
> tests/unit/test_session.py::test_values__with_invalid_responses[False-2]
>   
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests_cache/build/requests_cache/backends/base.py:298:
>  DeprecationWarning: BaseCache.values() is deprecated; please use .filter() 
> instead
>     warn('BaseCache.values() is deprecated; please use .filter() instead', 
> DeprecationWarning)
> 
> tests/unit/models/test_raw_response.py: 1 warning
> tests/unit/models/test_request.py: 1 warning
> tests/unit/models/test_response.py: 10 warnings
> tests/unit/test_serializers.py: 1 warning
> tests/unit/test_session.py: 306 warnings
>   <cattrs generated unstructure 
> requests_cache.models.response.CachedResponse>:10: DeprecationWarning: 
> datetime.datetime.utcnow() is deprecated and scheduled for removal in a 
> future version. Use timezone-aware objects to represent datetimes in UTC: 
> datetime.datetime.now(datetime.UTC).
>     if instance.created_at != __c_def_created_at():
> 
> tests/unit/models/test_response.py: 9 warnings
> tests/unit/policy/test_actions.py: 4 warnings
> tests/unit/test_session.py: 52 warnings
>   
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests_cache/build/requests_cache/models/response.py:99:
>  DeprecationWarning: datetime.datetime.utcnow() is deprecated and scheduled 
> for removal in a future version. Use timezone-aware objects to represent 
> datetimes in UTC: datetime.datetime.now(datetime.UTC).
>     return self.expires is not None and datetime.utcnow() >= self.expires
> 
> tests/unit/models/test_response.py::test_revalidate__extend_expiration
>   
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests_cache/build/tests/unit/models/test_response.py:82:
>  DeprecationWarning: datetime.datetime.utcnow() is deprecated and scheduled 
> for removal in a future version. Use timezone-aware objects to represent 
> datetimes in UTC: datetime.datetime.now(datetime.UTC).
>     expires=datetime.utcnow() - timedelta(seconds=0.01),
> 
> tests/unit/models/test_response.py::test_revalidate__extend_expiration
>   
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests_cache/build/tests/unit/models/test_response.py:87:
>  DeprecationWarning: datetime.datetime.utcnow() is deprecated and scheduled 
> for removal in a future version. Use timezone-aware objects to represent 
> datetimes in UTC: datetime.datetime.now(datetime.UTC).
>     is_expired = response.revalidate(datetime.utcnow() + timedelta(seconds=0.01))
> 
> tests/unit/models/test_response.py::test_revalidate__shorten_expiration
>   
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests_cache/build/tests/unit/models/test_response.py:97:
>  DeprecationWarning: datetime.datetime.utcnow() is deprecated and scheduled 
> for removal in a future version. Use timezone-aware objects to represent 
> datetimes in UTC: datetime.datetime.now(datetime.UTC).
>     expires=datetime.utcnow() + timedelta(seconds=1),
> 
> tests/unit/models/test_response.py::test_revalidate__shorten_expiration
>   
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests_cache/build/tests/unit/models/test_response.py:102:
>  DeprecationWarning: datetime.datetime.utcnow() is deprecated and scheduled 
> for removal in a future version. Use timezone-aware objects to represent 
> datetimes in UTC: datetime.datetime.now(datetime.UTC).
>     is_expired = response.revalidate(datetime.utcnow() - timedelta(seconds=1))
> 
> tests/unit/policy/test_actions.py: 3 warnings
> tests/unit/test_session.py: 17 warnings
>   
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests_cache/build/requests_cache/policy/actions.py:179:
>  DeprecationWarning: datetime.datetime.utcnow() is deprecated and scheduled 
> for removal in a future version. Use timezone-aware objects to represent 
> datetimes in UTC: datetime.datetime.now(datetime.UTC).
>     return datetime.utcnow() + expire_after
> 
> tests/unit/policy/test_actions.py::test_get_expiration_datetime__relative[expire_after0-expected_expiration_delta0]
> tests/unit/policy/test_actions.py::test_get_expiration_datetime__relative[60-expected_expiration_delta1]
> tests/unit/policy/test_actions.py::test_get_expiration_datetime__relative[33.3-expected_expiration_delta2]
>   
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests_cache/build/tests/unit/policy/test_actions.py:294:
>  DeprecationWarning: datetime.datetime.utcnow() is deprecated and scheduled 
> for removal in a future version. Use timezone-aware objects to represent 
> datetimes in UTC: datetime.datetime.now(datetime.UTC).
>     expected_expiration = datetime.utcnow() + expected_expiration_delta
> 
> tests/unit/test_serializers.py::test_custom_serializer
>   <cattrs generated unstructure 
> requests_cache.models.response.CachedResponse-2>:10: DeprecationWarning: 
> datetime.datetime.utcnow() is deprecated and scheduled for removal in a 
> future version. Use timezone-aware objects to represent datetimes in UTC: 
> datetime.datetime.now(datetime.UTC).
>     if instance.created_at != __c_def_created_at():
> 
> tests/unit/test_session.py::test_keys
>   
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests_cache/build/requests_cache/backends/base.py:269:
>  DeprecationWarning: BaseCache.keys() is deprecated; please use .filter() or 
> BaseCache.responses.keys() instead
>     warn(
> 
> tests/unit/test_session.py::test_response_count[True-2]
> tests/unit/test_session.py::test_response_count[False-3]
>   
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests_cache/build/requests_cache/backends/base.py:280:
>  DeprecationWarning: BaseCache.response_count() is deprecated; please use 
> .filter() or len(BaseCache.responses) instead
>     warn(
> 
> tests/unit/test_session.py: 19 warnings
>   
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests_cache/build/requests_cache/backends/base.py:262:
>  DeprecationWarning: BaseCache.has_url() is deprecated; please use 
> .contains(url=...) instead
>     warn(
> 
> tests/unit/test_session.py::test_delete_url
> tests/unit/test_session.py::test_delete_url__request_args
> tests/unit/test_session.py::test_delete_url__nonexistent_response
> tests/unit/test_session.py::test_delete_url__nonexistent_response
> tests/unit/test_session.py::test_delete_url__nonexistent_response
> tests/unit/test_session.py::test_delete_url__redirect
>   
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests_cache/build/requests_cache/backends/base.py:241:
>  DeprecationWarning: BaseCache.delete_url() is deprecated; please use 
> .delete(urls=...) instead
>     warn(
> 
> tests/unit/test_session.py::test_delete_urls
>   
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests_cache/build/requests_cache/backends/base.py:248:
>  DeprecationWarning: BaseCache.delete_urls() is deprecated; please use 
> .delete(urls=...) instead
>     warn(
> 
> tests/unit/test_session.py::test_response_defaults
>   
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests_cache/build/tests/unit/test_session.py:450:
>  DeprecationWarning: datetime.datetime.utcnow() is deprecated and scheduled 
> for removal in a future version. Use timezone-aware objects to represent 
> datetimes in UTC: datetime.datetime.now(datetime.UTC).
>     mock_session.expire_after = datetime.utcnow() + timedelta(days=1)
> 
> tests/unit/test_session.py::test_do_not_cache
>   
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests_cache/build/requests_cache/policy/actions.py:165:
>  DeprecationWarning: datetime.datetime.utcnow() is deprecated and scheduled 
> for removal in a future version. Use timezone-aware objects to represent 
> datetimes in UTC: datetime.datetime.now(datetime.UTC).
>     return datetime.utcnow()
> 
> tests/unit/test_session.py::test_do_not_cache
> tests/unit/test_session.py::test_remove_expired_responses__per_request
> tests/unit/test_session.py::test_remove_expired_responses__per_request
> tests/unit/test_session.py::test_per_request__enable_expiration
>   
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests_cache/build/requests_cache/policy/actions.py:185:
>  DeprecationWarning: datetime.datetime.utcnow() is deprecated and scheduled 
> for removal in a future version. Use timezone-aware objects to represent 
> datetimes in UTC: datetime.datetime.now(datetime.UTC).
>     return ceil((expires - datetime.utcnow()).total_seconds()) if expires else NEVER_EXPIRE
> 
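The expression warned about above computes a remaining TTL in whole seconds. A timezone-aware version of that arithmetic, as a sketch (the function name and the `NEVER_EXPIRE` sentinel value here are illustrative; the real constant lives in requests_cache):

```python
from datetime import datetime, timedelta, timezone
from math import ceil

NEVER_EXPIRE = -1  # assumed sentinel; requests_cache defines its own

def seconds_until(expires):
    """Remaining TTL in whole seconds, or NEVER_EXPIRE if no expiry is set."""
    if not expires:
        return NEVER_EXPIRE
    # ceil() rounds partial seconds up, so a response never expires early.
    return ceil((expires - datetime.now(timezone.utc)).total_seconds())
```

Note that `expires` must now be timezone-aware too, or the subtraction raises TypeError.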
> tests/unit/test_session.py::test_remove_expired_responses
> tests/unit/test_session.py::test_remove_expired_responses
> tests/unit/test_session.py::test_remove_expired_responses__error
> tests/unit/test_session.py::test_remove_expired_responses__extend_expiration
> tests/unit/test_session.py::test_remove_expired_responses__shorten_expiration
> tests/unit/test_session.py::test_remove_expired_responses__per_request
> tests/unit/test_session.py::test_remove_expired_responses__per_request
> tests/unit/test_session.py::test_remove_expired_responses__per_request
>   
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests_cache/build/requests_cache/backends/base.py:288:
>  DeprecationWarning: BaseCache.remove_expired_responses() is deprecated; 
> please use .delete(expired=True) instead
>     warn(
> 
> tests/unit/test_session.py::test_remove_expired_responses__extend_expiration
>   
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests_cache/build/tests/unit/test_session.py:696:
>  DeprecationWarning: datetime.datetime.utcnow() is deprecated and scheduled 
> for removal in a future version. Use timezone-aware objects to represent 
> datetimes in UTC: datetime.datetime.now(datetime.UTC).
>     mock_session.expire_after = datetime.utcnow() - timedelta(seconds=0.01)
> 
> tests/unit/test_session.py::test_remove_expired_responses__extend_expiration
>   
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests_cache/build/tests/unit/test_session.py:700:
>  DeprecationWarning: datetime.datetime.utcnow() is deprecated and scheduled 
> for removal in a future version. Use timezone-aware objects to represent 
> datetimes in UTC: datetime.datetime.now(datetime.UTC).
>     mock_session.remove_expired_responses(expire_after=datetime.utcnow() + timedelta(seconds=1))
> 
> tests/unit/test_session.py::test_remove_expired_responses__shorten_expiration
>   
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests_cache/build/tests/unit/test_session.py:708:
>  DeprecationWarning: datetime.datetime.utcnow() is deprecated and scheduled 
> for removal in a future version. Use timezone-aware objects to represent 
> datetimes in UTC: datetime.datetime.now(datetime.UTC).
>     mock_session.expire_after = datetime.utcnow() + timedelta(seconds=1)
> 
> tests/unit/test_session.py::test_remove_expired_responses__shorten_expiration
>   
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests_cache/build/tests/unit/test_session.py:712:
>  DeprecationWarning: datetime.datetime.utcnow() is deprecated and scheduled 
> for removal in a future version. Use timezone-aware objects to represent 
> datetimes in UTC: datetime.datetime.now(datetime.UTC).
>     mock_session.remove_expired_responses(expire_after=datetime.utcnow() - timedelta(seconds=0.01))
> 
> tests/unit/test_session.py::test_remove_expired_responses__per_request
> tests/unit/test_session.py::test_remove_expired_responses__per_request
>   
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests_cache/build/tests/unit/test_session.py:731:
>  DeprecationWarning: datetime.datetime.utcnow() is deprecated and scheduled 
> for removal in a future version. Use timezone-aware objects to represent 
> datetimes in UTC: datetime.datetime.now(datetime.UTC).
>     print('Expires:', response.expires - datetime.utcnow() if response.expires else None)
> 
> -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
> =========================== short test summary info ============================
> FAILED tests/compat/test_responses_load_cache.py::test_mock_session - 
> request...
> FAILED tests/unit/models/test_raw_response.py::test_from_response - 
> Attribute...
> ================= 2 failed, 297 passed, 810 warnings in 15.81s =================
> E: pybuild pybuild:389: test: plugin pyproject failed with: exit code=1: cd /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests_cache/build; python3.12 -m pytest --ignore=tests/integration
> I: pybuild base:311: cd /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests_cache/build; python3.11 -m pytest --ignore=tests/integration
> ============================= test session starts ==============================
> platform linux -- Python 3.11.9, pytest-8.2.2, pluggy-1.5.0
> rootdir: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests_cache/build
> configfile: pyproject.toml
> plugins: requests-mock-1.11.0
> collected 299 items
> 
> tests/compat/test_requests_mock_combine_cache.py .                       [  0%]
> tests/compat/test_requests_mock_disable_cache.py .                       [  0%]
> tests/compat/test_requests_mock_load_cache.py .                          [  1%]
> tests/compat/test_responses_load_cache.py F                              [  1%]
> tests/unit/models/test_raw_response.py F.....                            [  3%]
> tests/unit/models/test_request.py .                                      [  3%]
> tests/unit/models/test_response.py ...............                       [  8%]
> tests/unit/policy/test_actions.py ...................................... [ 21%]
> ..........................................                               [ 35%]
> tests/unit/test_cache_keys.py ..................                         [ 41%]
> tests/unit/test_patcher.py ..........                                    [ 44%]
> tests/unit/test_serializers.py .......                                   [ 47%]
> tests/unit/test_session.py ............................................. [ 62%]
> ........................................................................ [ 86%]
> .........................................                                [100%]
> 
> =================================== FAILURES ===================================
> ______________________________ test_mock_session _______________________________
> 
> self = <urllib3.connection.HTTPSConnection object at 0x7f21ead327d0>
> 
>     def _new_conn(self) -> socket.socket:
>         """Establish a socket connection and set nodelay settings on it.
>     
>         :return: New socket connection.
>         """
>         try:
> >           sock = connection.create_connection(
>                 (self._dns_host, self.port),
>                 self.timeout,
>                 source_address=self.source_address,
>                 socket_options=self.socket_options,
>             )
> 
> /usr/lib/python3/dist-packages/urllib3/connection.py:203: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
>     raise err
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> 
> address = ('127.0.0.1', 9), timeout = None, source_address = None
> socket_options = []
> 
>     def create_connection(
>         address: tuple[str, int],
>         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
>         source_address: tuple[str, int] | None = None,
>         socket_options: _TYPE_SOCKET_OPTIONS | None = None,
>     ) -> socket.socket:
>         """Connect to *address* and return the socket object.
>     
>         Convenience function.  Connect to *address* (a 2-tuple ``(host,
>         port)``) and return the socket object.  Passing the optional
>         *timeout* parameter will set the timeout on the socket instance
>         before attempting to connect.  If no *timeout* is supplied, the
>         global default timeout setting returned by 
> :func:`socket.getdefaulttimeout`
>         is used.  If *source_address* is set it must be a tuple of (host, 
> port)
>         for the socket to bind as a source address before making the 
> connection.
>         An host of '' or port 0 tells the OS to use the default.
>         """
>     
>         host, port = address
>         if host.startswith("["):
>             host = host.strip("[]")
>         err = None
>     
>         # Using the value from allowed_gai_family() in the context of 
> getaddrinfo lets
>         # us select whether to work with IPv4 DNS records, IPv6 records, or 
> both.
>         # The original create_connection function always returns all records.
>         family = allowed_gai_family()
>     
>         try:
>             host.encode("idna")
>         except UnicodeError:
>             raise LocationParseError(f"'{host}', label empty or too long") from None
>     
>         for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
>             af, socktype, proto, canonname, sa = res
>             sock = None
>             try:
>                 sock = socket.socket(af, socktype, proto)
>     
>                 # If provided, set socket level options before connecting.
>                 _set_socket_options(sock, socket_options)
>     
>                 if timeout is not _DEFAULT_TIMEOUT:
>                     sock.settimeout(timeout)
>                 if source_address:
>                     sock.bind(source_address)
> >               sock.connect(sa)
> E               ConnectionRefusedError: [Errno 111] Connection refused
> 
> /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: 
> ConnectionRefusedError
> 
> The above exception was the direct cause of the following exception:
> 
> self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f21eb9c5c50>
> method = 'GET', url = '/gzip', body = None
> headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
> retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
> redirect = False, assert_same_host = False
> timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
> release_conn = False, chunked = False, body_pos = None, preload_content = False
> decode_content = False, response_kw = {}
> parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/gzip', query=None, fragment=None)
> destination_scheme = None, conn = None, release_this_conn = True
> http_tunnel_required = True, err = None, clean_exit = False
> 
>     def urlopen(  # type: ignore[override]
>         self,
>         method: str,
>         url: str,
>         body: _TYPE_BODY | None = None,
>         headers: typing.Mapping[str, str] | None = None,
>         retries: Retry | bool | int | None = None,
>         redirect: bool = True,
>         assert_same_host: bool = True,
>         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
>         pool_timeout: int | None = None,
>         release_conn: bool | None = None,
>         chunked: bool = False,
>         body_pos: _TYPE_BODY_POSITION | None = None,
>         preload_content: bool = True,
>         decode_content: bool = True,
>         **response_kw: typing.Any,
>     ) -> BaseHTTPResponse:
>         """
>         Get a connection from the pool and perform an HTTP request. This is 
> the
>         lowest level call for making a request, so you'll need to specify all
>         the raw details.
>     
>         .. note::
>     
>            More commonly, it's appropriate to use a convenience method
>            such as :meth:`request`.
>     
>         .. note::
>     
>            `release_conn` will only behave as expected if
>            `preload_content=False` because we want to make
>            `preload_content=False` the default behaviour someday soon without
>            breaking backwards compatibility.
>     
>         :param method:
>             HTTP request method (such as GET, POST, PUT, etc.)
>     
>         :param url:
>             The URL to perform the request on.
>     
>         :param body:
>             Data to send in the request body, either :class:`str`, 
> :class:`bytes`,
>             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
>     
>         :param headers:
>             Dictionary of custom headers to send, such as User-Agent,
>             If-None-Match, etc. If None, pool headers are used. If provided,
>             these headers completely replace any pool-specific headers.
>     
>         :param retries:
>             Configure the number of retries to allow before raising a
>             :class:`~urllib3.exceptions.MaxRetryError` exception.
>     
>             Pass ``None`` to retry until you receive a response. Pass a
>             :class:`~urllib3.util.retry.Retry` object for fine-grained control
>             over different types of retries.
>             Pass an integer number to retry connection errors that many times,
>             but no other types of errors. Pass zero to never retry.
>     
>             If ``False``, then retries are disabled and any exception is 
> raised
>             immediately. Also, instead of raising a MaxRetryError on 
> redirects,
>             the redirect response will be returned.
>     
>         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
>     
>         :param redirect:
>             If True, automatically handle redirects (status codes 301, 302,
>             303, 307, 308). Each redirect counts as a retry. Disabling retries
>             will disable redirect, too.
>     
>         :param assert_same_host:
>             If ``True``, will make sure that the host of the pool requests is
>             consistent else will raise HostChangedError. When ``False``, you 
> can
>             use the pool on an HTTP proxy and request foreign hosts.
>     
>         :param timeout:
>             If specified, overrides the default timeout for this one
>             request. It may be a float (in seconds) or an instance of
>             :class:`urllib3.util.Timeout`.
>     
>         :param pool_timeout:
>             If set and the pool is set to block=True, then this method will
>             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
>             connection is available within the time period.
>     
>         :param bool preload_content:
>             If True, the response's body will be preloaded into memory.
>     
>         :param bool decode_content:
>             If True, will attempt to decode the body based on the
>             'content-encoding' header.
>     
>         :param release_conn:
>             If False, then the urlopen call will not release the connection
>             back into the pool once a response is received (but will release 
> if
>             you read the entire contents of the response such as when
>             `preload_content=True`). This is useful if you're not preloading
>             the response's content immediately. You will need to call
>             ``r.release_conn()`` on the response ``r`` to return the 
> connection
>             back into the pool. If None, it takes the value of 
> ``preload_content``
>             which defaults to ``True``.
>     
>         :param bool chunked:
>             If True, urllib3 will send the body using chunked transfer
>             encoding. Otherwise, urllib3 will send the body using the standard
>             content-length form. Defaults to False.
>     
>         :param int body_pos:
>             Position to seek to in file-like body in the event of a retry or
>             redirect. Typically this won't need to be set because urllib3 will
>             auto-populate the value when needed.
>         """
>         parsed_url = parse_url(url)
>         destination_scheme = parsed_url.scheme
>     
>         if headers is None:
>             headers = self.headers
>     
>         if not isinstance(retries, Retry):
>             retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
>     
>         if release_conn is None:
>             release_conn = preload_content
>     
>         # Check host
>         if assert_same_host and not self.is_same_host(url):
>             raise HostChangedError(self, url, retries)
>     
>         # Ensure that the URL we're connecting to is properly encoded
>         if url.startswith("/"):
>             url = to_str(_encode_target(url))
>         else:
>             url = to_str(parsed_url.url)
>     
>         conn = None
>     
>         # Track whether `conn` needs to be released before
>         # returning/raising/recursing. Update this variable if necessary, and
>         # leave `release_conn` constant throughout the function. That way, if
>         # the function recurses, the original value of `release_conn` will be
>         # passed down into the recursive call, and its value will be 
> respected.
>         #
>         # See issue #651 [1] for details.
>         #
>         # [1] <https://github.com/urllib3/urllib3/issues/651>
>         release_this_conn = release_conn
>     
>         http_tunnel_required = connection_requires_http_tunnel(
>             self.proxy, self.proxy_config, destination_scheme
>         )
>     
>         # Merge the proxy headers. Only done when not using HTTP CONNECT. We
>         # have to copy the headers dict so we can safely change it without 
> those
>         # changes being reflected in anyone else's copy.
>         if not http_tunnel_required:
>             headers = headers.copy()  # type: ignore[attr-defined]
>             headers.update(self.proxy_headers)  # type: ignore[union-attr]
>     
>         # Must keep the exception bound to a separate variable or else Python 3
>         # complains about UnboundLocalError.
>         err = None
>     
>         # Keep track of whether we cleanly exited the except block. This
>         # ensures we do proper cleanup in finally.
>         clean_exit = False
>     
>         # Rewind body position, if needed. Record current position
>         # for future rewinds in the event of a redirect/retry.
>         body_pos = set_file_position(body, body_pos)
>     
>         try:
>             # Request a connection from the queue.
>             timeout_obj = self._get_timeout(timeout)
>             conn = self._get_conn(timeout=pool_timeout)
>     
>             conn.timeout = timeout_obj.connect_timeout  # type: ignore[assignment]
>     
>             # Is this a closed/new connection that requires CONNECT tunnelling?
>             if self.proxy is not None and http_tunnel_required and conn.is_closed:
>                 try:
> >                   self._prepare_proxy(conn)
> 
> /usr/lib/python3/dist-packages/urllib3/connectionpool.py:777: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> /usr/lib/python3/dist-packages/urllib3/connectionpool.py:1046: in _prepare_proxy
>     conn.connect()
> /usr/lib/python3/dist-packages/urllib3/connection.py:611: in connect
>     self.sock = sock = self._new_conn()
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> 
> self = <urllib3.connection.HTTPSConnection object at 0x7f21ead327d0>
> 
>     def _new_conn(self) -> socket.socket:
>         """Establish a socket connection and set nodelay settings on it.
>     
>         :return: New socket connection.
>         """
>         try:
>             sock = connection.create_connection(
>                 (self._dns_host, self.port),
>                 self.timeout,
>                 source_address=self.source_address,
>                 socket_options=self.socket_options,
>             )
>         except socket.gaierror as e:
>             raise NameResolutionError(self.host, self, e) from e
>         except SocketTimeout as e:
>             raise ConnectTimeoutError(
>                 self,
>                 f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
>             ) from e
>     
>         except OSError as e:
> >           raise NewConnectionError(
>                 self, f"Failed to establish a new connection: {e}"
>             ) from e
> E           urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPSConnection object at 0x7f21ead327d0>: Failed to establish a new connection: [Errno 111] Connection refused
> 
> /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError
> 
> The above exception was the direct cause of the following exception:
> Traceback (most recent call last):
>   File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 203, in _new_conn
>     sock = connection.create_connection(
>            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
>   File "/usr/lib/python3/dist-packages/urllib3/util/connection.py", line 85, in create_connection
>     raise err
>   File "/usr/lib/python3/dist-packages/urllib3/util/connection.py", line 73, in create_connection
>     sock.connect(sa)
> ConnectionRefusedError: [Errno 111] Connection refused
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 777, in urlopen
>     self._prepare_proxy(conn)
>   File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 1046, in _prepare_proxy
>     conn.connect()
>   File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 611, in connect
>     self.sock = sock = self._new_conn()
>                        ^^^^^^^^^^^^^^^^
>   File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 218, in _new_conn
>     raise NewConnectionError(
> urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPSConnection object at 0x7f21ead327d0>: Failed to establish a new connection: [Errno 111] Connection refused
> 
> The above exception was the direct cause of the following exception:
> 
> urllib3.exceptions.ProxyError: ('Unable to connect to proxy', NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f21ead327d0>: Failed to establish a new connection: [Errno 111] Connection refused'))
> 
> The above exception was the direct cause of the following exception:
> 
> self = <requests.adapters.HTTPAdapter object at 0x7f21ea4bbb10>
> request = <PreparedRequest [GET]>, stream = False
> timeout = Timeout(connect=None, read=None, total=None), verify = True
> cert = None
> proxies = OrderedDict([('no', 'localhost'), ('https', 'https://127.0.0.1:9/'), ('http', 'http://127.0.0.1:9/')])
> 
>     def send(
>         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
>     ):
>         """Sends PreparedRequest object. Returns Response object.
>     
>         :param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
>         :param stream: (optional) Whether to stream the request content.
>         :param timeout: (optional) How long to wait for the server to send
>             data before giving up, as a float, or a :ref:`(connect timeout,
>             read timeout) <timeouts>` tuple.
>         :type timeout: float or tuple or urllib3 Timeout object
>         :param verify: (optional) Either a boolean, in which case it controls whether
>             we verify the server's TLS certificate, or a string, in which case it
>             must be a path to a CA bundle to use
>         :param cert: (optional) Any user-provided SSL certificate to be trusted.
>         :param proxies: (optional) The proxies dictionary to apply to the request.
>         :rtype: requests.Response
>         """
>     
>         try:
>             conn = self.get_connection_with_tls_context(
>                 request, verify, proxies=proxies, cert=cert
>             )
>         except LocationValueError as e:
>             raise InvalidURL(e, request=request)
>     
>         self.cert_verify(conn, request.url, verify, cert)
>         url = self.request_url(request, proxies)
>         self.add_headers(
>             request,
>             stream=stream,
>             timeout=timeout,
>             verify=verify,
>             cert=cert,
>             proxies=proxies,
>         )
>     
>         chunked = not (request.body is None or "Content-Length" in request.headers)
>     
>         if isinstance(timeout, tuple):
>             try:
>                 connect, read = timeout
>                 timeout = TimeoutSauce(connect=connect, read=read)
>             except ValueError:
>                 raise ValueError(
>                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
>                     f"or a single float to set both timeouts to the same value."
>                 )
>         elif isinstance(timeout, TimeoutSauce):
>             pass
>         else:
>             timeout = TimeoutSauce(connect=timeout, read=timeout)
>     
>         try:
> >           resp = conn.urlopen(
>                 method=request.method,
>                 url=url,
>                 body=request.body,
>                 headers=request.headers,
>                 redirect=False,
>                 assert_same_host=False,
>                 preload_content=False,
>                 decode_content=False,
>                 retries=self.max_retries,
>                 timeout=timeout,
>                 chunked=chunked,
>             )
> 
> /usr/lib/python3/dist-packages/requests/adapters.py:667: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen
>     retries = retries.increment(
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
> method = 'GET', url = '/gzip', response = None
> error = ProxyError('Unable to connect to proxy', NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f21ead327d0>: Failed to establish a new connection: [Errno 111] Connection refused'))
> _pool = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f21eb9c5c50>
> _stacktrace = <traceback object at 0x7f21ea490c40>
> 
>     def increment(
>         self,
>         method: str | None = None,
>         url: str | None = None,
>         response: BaseHTTPResponse | None = None,
>         error: Exception | None = None,
>         _pool: ConnectionPool | None = None,
>         _stacktrace: TracebackType | None = None,
>     ) -> Retry:
>         """Return a new Retry object with incremented retry counters.
>     
>         :param response: A response object, or None, if the server did not
>             return a response.
>         :type response: :class:`~urllib3.response.BaseHTTPResponse`
>         :param Exception error: An error encountered during the request, or
>             None if the response was received successfully.
>     
>         :return: A new ``Retry`` object.
>         """
>         if self.total is False and error:
>             # Disabled, indicate to re-raise the error.
>             raise reraise(type(error), error, _stacktrace)
>     
>         total = self.total
>         if total is not None:
>             total -= 1
>     
>         connect = self.connect
>         read = self.read
>         redirect = self.redirect
>         status_count = self.status
>         other = self.other
>         cause = "unknown"
>         status = None
>         redirect_location = None
>     
>         if error and self._is_connection_error(error):
>             # Connect retry?
>             if connect is False:
>                 raise reraise(type(error), error, _stacktrace)
>             elif connect is not None:
>                 connect -= 1
>     
>         elif error and self._is_read_error(error):
>             # Read retry?
>             if read is False or method is None or not self._is_method_retryable(method):
>                 raise reraise(type(error), error, _stacktrace)
>             elif read is not None:
>                 read -= 1
>     
>         elif error:
>             # Other retry?
>             if other is not None:
>                 other -= 1
>     
>         elif response and response.get_redirect_location():
>             # Redirect retry?
>             if redirect is not None:
>                 redirect -= 1
>             cause = "too many redirects"
>             response_redirect_location = response.get_redirect_location()
>             if response_redirect_location:
>                 redirect_location = response_redirect_location
>             status = response.status
>     
>         else:
>             # Incrementing because of a server error like a 500 in
>             # status_forcelist and the given method is in the allowed_methods
>             cause = ResponseError.GENERIC_ERROR
>             if response and response.status:
>                 if status_count is not None:
>                     status_count -= 1
>                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
>                 status = response.status
>     
>         history = self.history + (
>             RequestHistory(method, url, error, status, redirect_location),
>         )
>     
>         new_retry = self.new(
>             total=total,
>             connect=connect,
>             read=read,
>             redirect=redirect,
>             status=status_count,
>             other=other,
>             history=history,
>         )
>     
>         if new_retry.is_exhausted():
>             reason = error or ResponseError(cause)
> >           raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
> E           urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='httpbin.org', port=443): Max retries exceeded with url: /gzip (Caused by ProxyError('Unable to connect to proxy', NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f21ead327d0>: Failed to establish a new connection: [Errno 111] Connection refused')))
> 
> /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
> 
> During handling of the above exception, another exception occurred:
> 
> mock_http_adapter = <MagicMock name='get_connection' id='139783641468816'>
> 
>     @patch.object(
>         requests.adapters.HTTPAdapter, 'get_connection', side_effect=ValueError('Real request made!')
>     )
>     def test_mock_session(mock_http_adapter):
>         """Test that the mock_session fixture is working as expected"""
>         with get_responses():
>             # An error will be raised if a real request is made
>             with pytest.raises(ValueError):
> >               requests.get(PASSTHRU_URL)
> 
> tests/compat/test_responses_load_cache.py:53: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> /usr/lib/python3/dist-packages/requests/api.py:73: in get
>     return request("get", url, params=params, **kwargs)
> /usr/lib/python3/dist-packages/requests/api.py:59: in request
>     return session.request(method=method, url=url, **kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
>     resp = self.send(prep, **send_kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
>     r = adapter.send(request, **kwargs)
> /usr/lib/python3/dist-packages/responses/__init__.py:1175: in send
>     return self._on_request(adapter, request, **kwargs)
> /usr/lib/python3/dist-packages/responses/__init__.py:1079: in _on_request
>     return self._real_send(adapter, request, **kwargs)  # type: ignore
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> self = <requests.adapters.HTTPAdapter object at 0x7f21ea4bbb10>
> request = <PreparedRequest [GET]>, stream = False
> timeout = Timeout(connect=None, read=None, total=None), verify = True
> cert = None
> proxies = OrderedDict([('no', 'localhost'), ('https', 'https://127.0.0.1:9/'), ('http', 'http://127.0.0.1:9/')])
> 
>     def send(
>         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
>     ):
>         """Sends PreparedRequest object. Returns Response object.
>     
>         :param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
>         :param stream: (optional) Whether to stream the request content.
>         :param timeout: (optional) How long to wait for the server to send
>             data before giving up, as a float, or a :ref:`(connect timeout,
>             read timeout) <timeouts>` tuple.
>         :type timeout: float or tuple or urllib3 Timeout object
>         :param verify: (optional) Either a boolean, in which case it controls whether
>             we verify the server's TLS certificate, or a string, in which case it
>             must be a path to a CA bundle to use
>         :param cert: (optional) Any user-provided SSL certificate to be trusted.
>         :param proxies: (optional) The proxies dictionary to apply to the request.
>         :rtype: requests.Response
>         """
>     
>         try:
>             conn = self.get_connection_with_tls_context(
>                 request, verify, proxies=proxies, cert=cert
>             )
>         except LocationValueError as e:
>             raise InvalidURL(e, request=request)
>     
>         self.cert_verify(conn, request.url, verify, cert)
>         url = self.request_url(request, proxies)
>         self.add_headers(
>             request,
>             stream=stream,
>             timeout=timeout,
>             verify=verify,
>             cert=cert,
>             proxies=proxies,
>         )
>     
>         chunked = not (request.body is None or "Content-Length" in request.headers)
>     
>         if isinstance(timeout, tuple):
>             try:
>                 connect, read = timeout
>                 timeout = TimeoutSauce(connect=connect, read=read)
>             except ValueError:
>                 raise ValueError(
>                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
>                     f"or a single float to set both timeouts to the same value."
>                 )
>         elif isinstance(timeout, TimeoutSauce):
>             pass
>         else:
>             timeout = TimeoutSauce(connect=timeout, read=timeout)
>     
>         try:
>             resp = conn.urlopen(
>                 method=request.method,
>                 url=url,
>                 body=request.body,
>                 headers=request.headers,
>                 redirect=False,
>                 assert_same_host=False,
>                 preload_content=False,
>                 decode_content=False,
>                 retries=self.max_retries,
>                 timeout=timeout,
>                 chunked=chunked,
>             )
>     
>         except (ProtocolError, OSError) as err:
>             raise ConnectionError(err, request=request)
>     
>         except MaxRetryError as e:
>             if isinstance(e.reason, ConnectTimeoutError):
>                 # TODO: Remove this in 3.0.0: see #2811
>                 if not isinstance(e.reason, NewConnectionError):
>                     raise ConnectTimeout(e, request=request)
>     
>             if isinstance(e.reason, ResponseError):
>                 raise RetryError(e, request=request)
>     
>             if isinstance(e.reason, _ProxyError):
> >               raise ProxyError(e, request=request)
> E               requests.exceptions.ProxyError: HTTPSConnectionPool(host='httpbin.org', port=443): Max retries exceeded with url: /gzip (Caused by ProxyError('Unable to connect to proxy', NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f21ead327d0>: Failed to establish a new connection: [Errno 111] Connection refused')))
> 
> /usr/lib/python3/dist-packages/requests/adapters.py:694: ProxyError
> ----------------------------- Captured stderr call -----------------------------
> INFO:responses:request.allowed-passthru
> ------------------------------ Captured log call -------------------------------
> INFO     responses:__init__.py:1078 request.allowed-passthru
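The test_mock_session failure above looks like an incompatibility with requests >= 2.32: the fixture patches HTTPAdapter.get_connection, but the traceback shows send() now obtaining its connection through get_connection_with_tls_context(), so the mock is bypassed and a real (proxied) request is attempted. A version-tolerant patch along these lines might work (untested sketch; the target URL here is only a placeholder):

```python
from unittest.mock import patch

import requests
import requests.adapters

# requests >= 2.32 routes connection setup through
# get_connection_with_tls_context(); older releases call get_connection().
# Patch whichever attribute this requests version actually uses.
attr = (
    'get_connection_with_tls_context'
    if hasattr(requests.adapters.HTTPAdapter, 'get_connection_with_tls_context')
    else 'get_connection'
)

with patch.object(requests.adapters.HTTPAdapter, attr,
                  side_effect=ValueError('Real request made!')):
    try:
        requests.get('https://example.com/')
        caught = None
    except ValueError as exc:
        # The patched adapter raises before any socket is opened
        caught = str(exc)

print(caught)
```

With a guard like this the fixture intercepts the request on both old and new requests, so no network access is attempted during the build.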
> ______________________________ test_from_response ______________________________
> 
> mock_session = <CachedSession(cache=<SQLiteCache(name=http_cache)>, expire_after=-1, urls_expire_after=None, allowable_codes=(200,), ...wable_methods=['GET', 'HEAD', 'OPTIONS', 'POST', 'PUT', 'PATCH', 'DELETE'], stale_if_error=False, cache_control=False)>
> 
>     def test_from_response(mock_session):
>         response = mock_session.get(MOCKED_URL)
>         response.raw._fp = BytesIO(b'mock response')
>         raw = CachedHTTPResponse.from_response(response)
>     
>         assert dict(response.raw.headers) == dict(raw.headers) == {'Content-Type': 'text/plain'}
>         assert raw.read(None) == b'mock response'
>         assert response.raw.decode_content is raw.decode_content is False
>         assert response.raw.reason is raw.reason is None
>         if hasattr(response.raw, '_request_url'):
>             assert response.raw._request_url is raw.request_url is None
>         assert response.raw.status == raw.status == 200
> >       assert response.raw.strict == raw.strict == 0
> E       AttributeError: 'HTTPResponse' object has no attribute 'strict'
> 
> tests/unit/models/test_raw_response.py:19: AttributeError
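The test_from_response failure is a urllib3 2.x incompatibility: HTTPResponse.strict was removed in urllib3 2.0, so the direct attribute access in the assertion raises AttributeError. Guarding the access, for example with getattr, keeps the check meaningful on 1.x and harmless on 2.x (hypothetical sketch using stand-in classes, not necessarily the upstream fix):

```python
# Minimal stand-ins for the raw responses the test compares; V2Raw mimics
# urllib3 2.x, where HTTPResponse.strict no longer exists.
class V1Raw:
    strict = 0  # present on urllib3 1.x responses


class V2Raw:
    pass  # urllib3 2.x removed the attribute


for raw in (V1Raw(), V2Raw()):
    # getattr with a default of 0 preserves the old assertion on 1.x and
    # avoids the AttributeError seen above on 2.x
    assert getattr(raw, 'strict', 0) == 0
```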
> =============================== warnings summary ===============================
> tests/compat/test_requests_mock_load_cache.py::test_mock_session
> tests/compat/test_responses_load_cache.py::test_mock_session
> tests/unit/test_session.py::test_values
> tests/unit/test_session.py::test_values__with_invalid_responses[True-1]
> tests/unit/test_session.py::test_values__with_invalid_responses[False-2]
>   /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests_cache/build/requests_cache/backends/base.py:298: DeprecationWarning: BaseCache.values() is deprecated; please use .filter() instead
>     warn('BaseCache.values() is deprecated; please use .filter() instead', DeprecationWarning)
> 
> tests/unit/test_session.py::test_keys
>   /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests_cache/build/requests_cache/backends/base.py:269: DeprecationWarning: BaseCache.keys() is deprecated; please use .filter() or BaseCache.responses.keys() instead
>     warn(
> 
> tests/unit/test_session.py::test_response_count[True-2]
> tests/unit/test_session.py::test_response_count[False-3]
>   /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests_cache/build/requests_cache/backends/base.py:280: DeprecationWarning: BaseCache.response_count() is deprecated; please use .filter() or len(BaseCache.responses) instead
>     warn(
> 
> tests/unit/test_session.py: 19 warnings
>   /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests_cache/build/requests_cache/backends/base.py:262: DeprecationWarning: BaseCache.has_url() is deprecated; please use .contains(url=...) instead
>     warn(
> 
> tests/unit/test_session.py::test_delete_url
> tests/unit/test_session.py::test_delete_url__request_args
> tests/unit/test_session.py::test_delete_url__nonexistent_response
> tests/unit/test_session.py::test_delete_url__nonexistent_response
> tests/unit/test_session.py::test_delete_url__nonexistent_response
> tests/unit/test_session.py::test_delete_url__redirect
>   /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests_cache/build/requests_cache/backends/base.py:241: DeprecationWarning: BaseCache.delete_url() is deprecated; please use .delete(urls=...) instead
>     warn(
> 
> tests/unit/test_session.py::test_delete_urls
>   /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests_cache/build/requests_cache/backends/base.py:248: DeprecationWarning: BaseCache.delete_urls() is deprecated; please use .delete(urls=...) instead
>     warn(
> 
> tests/unit/test_session.py::test_remove_expired_responses
> tests/unit/test_session.py::test_remove_expired_responses
> tests/unit/test_session.py::test_remove_expired_responses__error
> tests/unit/test_session.py::test_remove_expired_responses__extend_expiration
> tests/unit/test_session.py::test_remove_expired_responses__shorten_expiration
> tests/unit/test_session.py::test_remove_expired_responses__per_request
> tests/unit/test_session.py::test_remove_expired_responses__per_request
> tests/unit/test_session.py::test_remove_expired_responses__per_request
>   /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests_cache/build/requests_cache/backends/base.py:288: DeprecationWarning: BaseCache.remove_expired_responses() is deprecated; please use .delete(expired=True) instead
>     warn(
> 
> -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
> =========================== short test summary info ============================
> FAILED tests/compat/test_responses_load_cache.py::test_mock_session - request...
> FAILED tests/unit/models/test_raw_response.py::test_from_response - Attribute...
> ================= 2 failed, 297 passed, 42 warnings in 16.60s ==================
> E: pybuild pybuild:389: test: plugin pyproject failed with: exit code=1: cd /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests_cache/build; python3.11 -m pytest --ignore=tests/integration
> dh_auto_test: error: pybuild --test --test-pytest -i python{version} -p "3.12 3.11" returned exit code 13


The full build log is available from:
http://qa-logs.debian.net/2024/06/15/python-requests-cache_0.9.8-2_unstable.log

All bugs filed during this archive rebuild are listed at:
https://bugs.debian.org/cgi-bin/pkgreport.cgi?tag=ftbfs-20240615;users=lu...@debian.org
or:
https://udd.debian.org/bugs/?release=na&merged=ign&fnewerval=7&flastmodval=7&fusertag=only&fusertagtag=ftbfs-20240615&fusertaguser=lu...@debian.org&allbugs=1&cseverity=1&ctags=1&caffected=1#results

A list of current common problems and possible solutions is available at
http://wiki.debian.org/qa.debian.org/FTBFS . You're welcome to contribute!

If you reassign this bug to another package, please mark it as 'affects'-ing
this package. See https://www.debian.org/Bugs/server-control#affects

If you fail to reproduce this, please provide a build log and diff it with mine
so that we can identify if something relevant changed in the meantime.
