Source: python-requests-toolbelt
Version: 1.0.0-2
Severity: serious
Justification: FTBFS
Tags: trixie sid ftbfs
User: lu...@debian.org
Usertags: ftbfs-20240615 ftbfs-trixie

Hi,

During a rebuild of all packages in sid, your package failed to build
on amd64.


Relevant part (hopefully):
>  debian/rules binary
> dh binary --with python3 --buildsystem=pybuild
>    dh_update_autotools_config -O--buildsystem=pybuild
>    dh_autoreconf -O--buildsystem=pybuild
>    dh_auto_configure -O--buildsystem=pybuild
>       pybuild --configure -i python{version} -p "3.12 3.11"
> I: pybuild base:311: python3.12 setup.py config 
> running config
> I: pybuild base:311: python3.11 setup.py config 
> running config
>    dh_auto_build -O--buildsystem=pybuild
>       pybuild --build -i python{version} -p "3.12 3.11"
> I: pybuild base:311: /usr/bin/python3.12 setup.py build 
> running build
> running build_py
> creating 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt
> copying requests_toolbelt/__init__.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt
> copying requests_toolbelt/streaming_iterator.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt
> copying requests_toolbelt/exceptions.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt
> copying requests_toolbelt/_compat.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt
> copying requests_toolbelt/sessions.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt
> creating 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/adapters
> copying requests_toolbelt/adapters/__init__.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/adapters
> copying requests_toolbelt/adapters/source.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/adapters
> copying requests_toolbelt/adapters/socket_options.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/adapters
> copying requests_toolbelt/adapters/ssl.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/adapters
> copying requests_toolbelt/adapters/fingerprint.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/adapters
> copying requests_toolbelt/adapters/host_header_ssl.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/adapters
> copying requests_toolbelt/adapters/x509.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/adapters
> creating 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/auth
> copying requests_toolbelt/auth/__init__.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/auth
> copying requests_toolbelt/auth/guess.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/auth
> copying requests_toolbelt/auth/_digest_auth_compat.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/auth
> copying requests_toolbelt/auth/http_proxy_digest.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/auth
> copying requests_toolbelt/auth/handler.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/auth
> creating 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/downloadutils
> copying requests_toolbelt/downloadutils/__init__.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/downloadutils
> copying requests_toolbelt/downloadutils/tee.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/downloadutils
> copying requests_toolbelt/downloadutils/stream.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/downloadutils
> creating 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/multipart
> copying requests_toolbelt/multipart/__init__.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/multipart
> copying requests_toolbelt/multipart/decoder.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/multipart
> copying requests_toolbelt/multipart/encoder.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/multipart
> creating 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/threaded
> copying requests_toolbelt/threaded/__init__.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/threaded
> copying requests_toolbelt/threaded/pool.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/threaded
> copying requests_toolbelt/threaded/thread.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/threaded
> creating 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/utils
> copying requests_toolbelt/utils/__init__.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/utils
> copying requests_toolbelt/utils/dump.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/utils
> copying requests_toolbelt/utils/user_agent.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/utils
> copying requests_toolbelt/utils/formdata.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/utils
> copying requests_toolbelt/utils/deprecated.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/utils
> running egg_info
> creating requests_toolbelt.egg-info
> writing requests_toolbelt.egg-info/PKG-INFO
> writing dependency_links to requests_toolbelt.egg-info/dependency_links.txt
> writing requirements to requests_toolbelt.egg-info/requires.txt
> writing top-level names to requests_toolbelt.egg-info/top_level.txt
> writing manifest file 'requests_toolbelt.egg-info/SOURCES.txt'
> reading manifest file 'requests_toolbelt.egg-info/SOURCES.txt'
> reading manifest template 'MANIFEST.in'
> no previously-included directories found matching 'docs/_build'
> warning: no previously-included files matching '*.py[cdo]' found anywhere in 
> distribution
> warning: no previously-included files matching '__pycache__' found anywhere 
> in distribution
> warning: no previously-included files matching '*.so' found anywhere in 
> distribution
> warning: no previously-included files matching '*.pyd' found anywhere in 
> distribution
> adding license file 'LICENSE'
> adding license file 'AUTHORS.rst'
> writing manifest file 'requests_toolbelt.egg-info/SOURCES.txt'
> /usr/lib/python3/dist-packages/setuptools/command/build_py.py:204: _Warning: 
> Package 'requests_toolbelt.cookies' is absent from the `packages` 
> configuration.
> !!
> 
>         
> ********************************************************************************
>         ############################
>         # Package would be ignored #
>         ############################
>         Python recognizes 'requests_toolbelt.cookies' as an importable 
> package[^1],
>         but it is absent from setuptools' `packages` configuration.
> 
>         This leads to an ambiguous overall configuration. If you want to 
> distribute this
>         package, please make sure that 'requests_toolbelt.cookies' is 
> explicitly added
>         to the `packages` configuration field.
> 
>         Alternatively, you can also rely on setuptools' discovery methods
>         (for example by using `find_namespace_packages(...)`/`find_namespace:`
>         instead of `find_packages(...)`/`find:`).
> 
>         You can read more about "package discovery" on setuptools 
> documentation page:
> 
>         - 
> https://setuptools.pypa.io/en/latest/userguide/package_discovery.html
> 
>         If you don't want 'requests_toolbelt.cookies' to be distributed and 
> are
>         already explicitly excluding 'requests_toolbelt.cookies' via
>         `find_namespace_packages(...)/find_namespace` or 
> `find_packages(...)/find`,
>         you can try to use `exclude_package_data`, or 
> `include-package-data=False` in
>         combination with a more fine grained `package-data` configuration.
> 
>         You can read more about "package data files" on setuptools 
> documentation page:
> 
>         - https://setuptools.pypa.io/en/latest/userguide/datafiles.html
> 
> 
>         [^1]: For Python, any directory (with suitable naming) can be 
> imported,
>               even if it does not contain any `.py` files.
>               On the other hand, currently there is no concept of package data
>               directory, all directories are treated like packages.
>         
> ********************************************************************************
> 
> !!
>   check.warn(importable)
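(Aside on the warning above: setuptools is saying that requests_toolbelt.cookies is importable on disk but missing from the `packages` configuration. The usual fix is automatic discovery instead of a hand-maintained list. Below is a minimal, self-contained sketch of what `find_packages()` discovery does; it builds a throwaway directory tree, so the names are illustrative and requests-toolbelt's real setup.py may be configured differently:)

```python
# Sketch: demonstrate setuptools package discovery on a miniature tree
# that mimics the layout the warning complains about (hypothetical paths).
import os
import tempfile

from setuptools import find_packages

root = tempfile.mkdtemp()
for pkg in ("requests_toolbelt", "requests_toolbelt/cookies"):
    os.makedirs(os.path.join(root, pkg))
    # An __init__.py makes the directory a regular (non-namespace) package.
    open(os.path.join(root, pkg, "__init__.py"), "w").close()

# find_packages() walks the tree and returns dotted package names,
# so subpackages like requests_toolbelt.cookies are picked up
# automatically instead of being silently dropped from `packages`.
pkgs = sorted(find_packages(where=root))
print(pkgs)  # ['requests_toolbelt', 'requests_toolbelt.cookies']
```

With discovery, newly added subdirectories containing an __init__.py stay in sync with the `packages` field, which is exactly what the warning asks for.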
> creating 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/cookies
> copying requests_toolbelt/cookies/__init__.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/cookies
> copying requests_toolbelt/cookies/forgetful.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/cookies
> I: pybuild base:311: /usr/bin/python3 setup.py build 
> running build
> running build_py
> creating 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt
> copying requests_toolbelt/__init__.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt
> copying requests_toolbelt/streaming_iterator.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt
> copying requests_toolbelt/exceptions.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt
> copying requests_toolbelt/_compat.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt
> copying requests_toolbelt/sessions.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt
> creating 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/adapters
> copying requests_toolbelt/adapters/__init__.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/adapters
> copying requests_toolbelt/adapters/source.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/adapters
> copying requests_toolbelt/adapters/socket_options.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/adapters
> copying requests_toolbelt/adapters/ssl.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/adapters
> copying requests_toolbelt/adapters/fingerprint.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/adapters
> copying requests_toolbelt/adapters/host_header_ssl.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/adapters
> copying requests_toolbelt/adapters/x509.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/adapters
> creating 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/auth
> copying requests_toolbelt/auth/__init__.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/auth
> copying requests_toolbelt/auth/guess.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/auth
> copying requests_toolbelt/auth/_digest_auth_compat.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/auth
> copying requests_toolbelt/auth/http_proxy_digest.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/auth
> copying requests_toolbelt/auth/handler.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/auth
> creating 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/downloadutils
> copying requests_toolbelt/downloadutils/__init__.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/downloadutils
> copying requests_toolbelt/downloadutils/tee.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/downloadutils
> copying requests_toolbelt/downloadutils/stream.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/downloadutils
> creating 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/multipart
> copying requests_toolbelt/multipart/__init__.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/multipart
> copying requests_toolbelt/multipart/decoder.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/multipart
> copying requests_toolbelt/multipart/encoder.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/multipart
> creating 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/threaded
> copying requests_toolbelt/threaded/__init__.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/threaded
> copying requests_toolbelt/threaded/pool.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/threaded
> copying requests_toolbelt/threaded/thread.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/threaded
> creating 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/utils
> copying requests_toolbelt/utils/__init__.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/utils
> copying requests_toolbelt/utils/dump.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/utils
> copying requests_toolbelt/utils/user_agent.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/utils
> copying requests_toolbelt/utils/formdata.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/utils
> copying requests_toolbelt/utils/deprecated.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/utils
> running egg_info
> writing requests_toolbelt.egg-info/PKG-INFO
> writing dependency_links to requests_toolbelt.egg-info/dependency_links.txt
> writing requirements to requests_toolbelt.egg-info/requires.txt
> writing top-level names to requests_toolbelt.egg-info/top_level.txt
> reading manifest file 'requests_toolbelt.egg-info/SOURCES.txt'
> reading manifest template 'MANIFEST.in'
> no previously-included directories found matching 'docs/_build'
> warning: no previously-included files matching '*.py[cdo]' found anywhere in 
> distribution
> warning: no previously-included files matching '__pycache__' found anywhere 
> in distribution
> warning: no previously-included files matching '*.so' found anywhere in 
> distribution
> warning: no previously-included files matching '*.pyd' found anywhere in 
> distribution
> adding license file 'LICENSE'
> adding license file 'AUTHORS.rst'
> writing manifest file 'requests_toolbelt.egg-info/SOURCES.txt'
> /usr/lib/python3/dist-packages/setuptools/command/build_py.py:204: _Warning: 
> Package 'requests_toolbelt.cookies' is absent from the `packages` 
> configuration.
> !!
> 
>         
> ********************************************************************************
>         ############################
>         # Package would be ignored #
>         ############################
>         Python recognizes 'requests_toolbelt.cookies' as an importable 
> package[^1],
>         but it is absent from setuptools' `packages` configuration.
> 
>         This leads to an ambiguous overall configuration. If you want to 
> distribute this
>         package, please make sure that 'requests_toolbelt.cookies' is 
> explicitly added
>         to the `packages` configuration field.
> 
>         Alternatively, you can also rely on setuptools' discovery methods
>         (for example by using `find_namespace_packages(...)`/`find_namespace:`
>         instead of `find_packages(...)`/`find:`).
> 
>         You can read more about "package discovery" on setuptools 
> documentation page:
> 
>         - 
> https://setuptools.pypa.io/en/latest/userguide/package_discovery.html
> 
>         If you don't want 'requests_toolbelt.cookies' to be distributed and 
> are
>         already explicitly excluding 'requests_toolbelt.cookies' via
>         `find_namespace_packages(...)/find_namespace` or 
> `find_packages(...)/find`,
>         you can try to use `exclude_package_data`, or 
> `include-package-data=False` in
>         combination with a more fine grained `package-data` configuration.
> 
>         You can read more about "package data files" on setuptools 
> documentation page:
> 
>         - https://setuptools.pypa.io/en/latest/userguide/datafiles.html
> 
> 
>         [^1]: For Python, any directory (with suitable naming) can be 
> imported,
>               even if it does not contain any `.py` files.
>               On the other hand, currently there is no concept of package data
>               directory, all directories are treated like packages.
>         
> ********************************************************************************
> 
> !!
>   check.warn(importable)
> creating 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/cookies
> copying requests_toolbelt/cookies/__init__.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/cookies
> copying requests_toolbelt/cookies/forgetful.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/cookies
>    dh_auto_test -O--buildsystem=pybuild
>       pybuild --test --test-pytest -i python{version} -p "3.12 3.11"
> I: pybuild base:311: cd 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build; python3.12 
> -m pytest -k 'not test_reads_open_file_objects and not 
> test_reads_open_file_objects_using_to_string and not 
> test_reads_open_file_objects_with_a_specified_filename'
> ..........................................FF......ssss.................. [ 
> 43%]
> ................................FF.FF..............................sss.. [ 
> 86%]
> ......................                                                   
> [100%]
> =================================== FAILURES 
> ===================================
> ___________________ TestDumpRealResponses.test_dump_response 
> ___________________
> 
> self = <urllib3.response.HTTPResponse object at 0x7fc0bfbd2140>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
> >               yield
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:710: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> 
> self = <urllib3.response.HTTPResponse object at 0x7fc0bfbd2140>, amt = 10240
> 
>     def _raw_read(
>         self,
>         amt: int | None = None,
>     ) -> bytes:
>         """
>         Reads `amt` of bytes from the socket.
>         """
>         if self._fp is None:
>             return None  # type: ignore[return-value]
>     
>         fp_closed = getattr(self._fp, "closed", False)
>     
>         with self._error_catcher():
>             data = self._fp_read(amt) if not fp_closed else b""
>             if amt is not None and amt != 0 and not data:
>                 # Platform-specific: Buggy versions of Python.
>                 # Close the connection when no data is returned
>                 #
>                 # This is redundant to what httplib/http.client _should_
>                 # already do.  However, versions of python released before
>                 # December 15, 2012 (http://bugs.python.org/issue16298) do
>                 # not properly close the connection in all cases. There is
>                 # no harm in redundantly calling close.
>                 self._fp.close()
>                 if (
>                     self.enforce_content_length
>                     and self.length_remaining is not None
>                     and self.length_remaining != 0
>                 ):
>                     # This is an edge case that httplib failed to cover due
>                     # to concerns of backward compatibility. We're
>                     # addressing it here to make sure IncompleteRead is
>                     # raised during streaming, so all calls with incorrect
>                     # Content-Length are caught.
> >                   raise IncompleteRead(self._fp_bytes_read, 
> > self.length_remaining)
> E                   urllib3.exceptions.IncompleteRead: IncompleteRead(234 
> bytes read, 5 more expected)
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:835: IncompleteRead
> 
> The above exception was the direct cause of the following exception:
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
> >               yield from self.raw.stream(chunk_size, decode_content=True)
> 
> /usr/lib/python3/dist-packages/requests/models.py:820: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> /usr/lib/python3/dist-packages/urllib3/response.py:936: in stream
>     data = self.read(amt=amt, decode_content=decode_content)
> /usr/lib/python3/dist-packages/urllib3/response.py:907: in read
>     data = self._raw_read(amt)
> /usr/lib/python3/dist-packages/urllib3/response.py:813: in _raw_read
>     with self._error_catcher():
> /usr/lib/python3.12/contextlib.py:158: in __exit__
>     self.gen.throw(value)
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> 
> self = <urllib3.response.HTTPResponse object at 0x7fc0bfbd2140>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
>                 yield
>     
>             except SocketTimeout as e:
>                 # FIXME: Ideally we'd like to include the url in the 
> ReadTimeoutError but
>                 # there is yet no clean way to get at it from this context.
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") 
> from e  # type: ignore[arg-type]
>     
>             except BaseSSLError as e:
>                 # FIXME: Is there a better way to differentiate between 
> SSLErrors?
>                 if "read operation timed out" not in str(e):
>                     # SSL errors related to framing/MAC get wrapped and 
> reraised here
>                     raise SSLError(e) from e
>     
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") 
> from e  # type: ignore[arg-type]
>     
>             except (HTTPException, OSError) as e:
>                 # This includes IncompleteRead.
> >               raise ProtocolError(f"Connection broken: {e!r}", e) from e
> E               urllib3.exceptions.ProtocolError: ('Connection broken: 
> IncompleteRead(234 bytes read, 5 more expected)', IncompleteRead(234 bytes 
> read, 5 more expected))
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:727: ProtocolError
> 
> During handling of the above exception, another exception occurred:
> 
> self = <tests.test_dump.TestDumpRealResponses object at 0x7fc0bfc36db0>
> 
>     def test_dump_response(self):
>         session = requests.Session()
>         recorder = get_betamax(session)
>         with recorder.use_cassette('simple_get_request'):
> >           response = session.get('https://httpbin.org/get')
> 
> tests/test_dump.py:376: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> /usr/lib/python3/dist-packages/requests/sessions.py:602: in get
>     return self.request("GET", url, **kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
>     resp = self.send(prep, **send_kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:746: in send
>     r.content
> /usr/lib/python3/dist-packages/requests/models.py:902: in content
>     self._content = b"".join(self.iter_content(CONTENT_CHUNK_SIZE)) or b""
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
>                 yield from self.raw.stream(chunk_size, decode_content=True)
>             except ProtocolError as e:
> >               raise ChunkedEncodingError(e)
> E               requests.exceptions.ChunkedEncodingError: ('Connection 
> broken: IncompleteRead(234 bytes read, 5 more expected)', IncompleteRead(234 
> bytes read, 5 more expected))
> 
> /usr/lib/python3/dist-packages/requests/models.py:822: ChunkedEncodingError
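(Context for the failure above: the replayed cassette body is 5 bytes shorter than its recorded Content-Length, and urllib3 2.x enforces that header during streaming, surfacing as IncompleteRead and then ChunkedEncodingError. A minimal sketch of that length check follows; this is illustrative only, not urllib3's actual code:)

```python
# Sketch of Content-Length enforcement, the mechanism behind the
# "IncompleteRead(234 bytes read, 5 more expected)" in the log.
import io


class IncompleteRead(Exception):
    """Stand-in for urllib3.exceptions.IncompleteRead (sketch only)."""

    def __init__(self, partial, expected):
        super().__init__(f"{partial} bytes read, {expected} more expected")
        self.partial = partial
        self.expected = expected


def read_body(fp, content_length):
    # Read the whole body and compare against the advertised
    # Content-Length, roughly what enforce_content_length does.
    data = fp.read()
    if len(data) != content_length:
        raise IncompleteRead(len(data), content_length - len(data))
    return data


body = b"x" * 234  # what the recorded response actually contains
err = None
try:
    read_body(io.BytesIO(body), 239)  # the header claims 239 bytes
except IncompleteRead as exc:
    err = exc
print(err)  # 234 bytes read, 5 more expected
```

Older urllib3 tolerated such short reads on replayed responses, which is why cassettes recorded against it now fail under 2.x.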
> _____________________ TestDumpRealResponses.test_dump_all 
> ______________________
> 
> self = <urllib3.response.HTTPResponse object at 0x7fc0bfac0ee0>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
> >               yield
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:710: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> 
> self = <urllib3.response.HTTPResponse object at 0x7fc0bfac0ee0>, amt = 10240
> 
>     def _raw_read(
>         self,
>         amt: int | None = None,
>     ) -> bytes:
>         """
>         Reads `amt` of bytes from the socket.
>         """
>         if self._fp is None:
>             return None  # type: ignore[return-value]
>     
>         fp_closed = getattr(self._fp, "closed", False)
>     
>         with self._error_catcher():
>             data = self._fp_read(amt) if not fp_closed else b""
>             if amt is not None and amt != 0 and not data:
>                 # Platform-specific: Buggy versions of Python.
>                 # Close the connection when no data is returned
>                 #
>                 # This is redundant to what httplib/http.client _should_
>                 # already do.  However, versions of python released before
>                 # December 15, 2012 (http://bugs.python.org/issue16298) do
>                 # not properly close the connection in all cases. There is
>                 # no harm in redundantly calling close.
>                 self._fp.close()
>                 if (
>                     self.enforce_content_length
>                     and self.length_remaining is not None
>                     and self.length_remaining != 0
>                 ):
>                     # This is an edge case that httplib failed to cover due
>                     # to concerns of backward compatibility. We're
>                     # addressing it here to make sure IncompleteRead is
>                     # raised during streaming, so all calls with incorrect
>                     # Content-Length are caught.
> >                   raise IncompleteRead(self._fp_bytes_read, 
> > self.length_remaining)
> E                   urllib3.exceptions.IncompleteRead: IncompleteRead(234 
> bytes read, 5 more expected)
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:835: IncompleteRead
> 
> The above exception was the direct cause of the following exception:
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
> >               yield from self.raw.stream(chunk_size, decode_content=True)
> 
> /usr/lib/python3/dist-packages/requests/models.py:820: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> /usr/lib/python3/dist-packages/urllib3/response.py:936: in stream
>     data = self.read(amt=amt, decode_content=decode_content)
> /usr/lib/python3/dist-packages/urllib3/response.py:907: in read
>     data = self._raw_read(amt)
> /usr/lib/python3/dist-packages/urllib3/response.py:813: in _raw_read
>     with self._error_catcher():
> /usr/lib/python3.12/contextlib.py:158: in __exit__
>     self.gen.throw(value)
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> 
> self = <urllib3.response.HTTPResponse object at 0x7fc0bfac0ee0>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
>                 yield
>     
>             except SocketTimeout as e:
>                 # FIXME: Ideally we'd like to include the url in the 
> ReadTimeoutError but
>                 # there is yet no clean way to get at it from this context.
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") 
> from e  # type: ignore[arg-type]
>     
>             except BaseSSLError as e:
>                 # FIXME: Is there a better way to differentiate between 
> SSLErrors?
>                 if "read operation timed out" not in str(e):
>                     # SSL errors related to framing/MAC get wrapped and 
> reraised here
>                     raise SSLError(e) from e
>     
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") 
> from e  # type: ignore[arg-type]
>     
>             except (HTTPException, OSError) as e:
>                 # This includes IncompleteRead.
> >               raise ProtocolError(f"Connection broken: {e!r}", e) from e
> E               urllib3.exceptions.ProtocolError: ('Connection broken: 
> IncompleteRead(234 bytes read, 5 more expected)', IncompleteRead(234 bytes 
> read, 5 more expected))
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:727: ProtocolError
> 
> During handling of the above exception, another exception occurred:
> 
> self = <tests.test_dump.TestDumpRealResponses object at 0x7fc0bfc37890>
> 
>     def test_dump_all(self):
>         session = requests.Session()
>         recorder = get_betamax(session)
>         with recorder.use_cassette('redirect_request_for_dump_all'):
> >           response = session.get('https://httpbin.org/redirect/5')
> 
> tests/test_dump.py:392: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> /usr/lib/python3/dist-packages/requests/sessions.py:602: in get
>     return self.request("GET", url, **kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
>     resp = self.send(prep, **send_kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:724: in send
>     history = [resp for resp in gen]
> /usr/lib/python3/dist-packages/requests/sessions.py:265: in resolve_redirects
>     resp = self.send(
> /usr/lib/python3/dist-packages/requests/sessions.py:746: in send
>     r.content
> /usr/lib/python3/dist-packages/requests/models.py:902: in content
>     self._content = b"".join(self.iter_content(CONTENT_CHUNK_SIZE)) or b""
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
>                 yield from self.raw.stream(chunk_size, decode_content=True)
>             except ProtocolError as e:
> >               raise ChunkedEncodingError(e)
> E               requests.exceptions.ChunkedEncodingError: ('Connection broken: IncompleteRead(234 bytes read, 5 more expected)', IncompleteRead(234 bytes read, 5 more expected))
> 
> /usr/lib/python3/dist-packages/requests/models.py:822: ChunkedEncodingError
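For context on the chain above: the "IncompleteRead(234 bytes read, 5 more expected)" text comes from the stdlib exception that urllib3's own IncompleteRead subclasses, whose repr reports bytes received versus bytes still owed by Content-Length. A minimal sketch, with the 234/5 values copied from the failure output:

```python
# http.client.IncompleteRead is the stdlib base of urllib3's IncompleteRead;
# its repr reports body bytes read vs. bytes the Content-Length still promised.
from http.client import IncompleteRead

partial = b"x" * 234   # body bytes actually read back
expected_more = 5      # bytes still owed per Content-Length

err = IncompleteRead(partial, expected_more)
print(repr(err))  # IncompleteRead(234 bytes read, 5 more expected)
```

The root cause appears to be betamax cassettes whose recorded Content-Length no longer matches the stored body, a mismatch urllib3 2.x checks by default (enforce_content_length) where 1.x let it pass.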
> _____________ TestBasedSession.test_prepared_request_override_base _____________
> 
> self = <urllib3.response.HTTPResponse object at 0x7fc0bfa79b70>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
> >               yield
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:710: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> 
> self = <urllib3.response.HTTPResponse object at 0x7fc0bfa79b70>, amt = 10240
> 
>     def _raw_read(
>         self,
>         amt: int | None = None,
>     ) -> bytes:
>         """
>         Reads `amt` of bytes from the socket.
>         """
>         if self._fp is None:
>             return None  # type: ignore[return-value]
>     
>         fp_closed = getattr(self._fp, "closed", False)
>     
>         with self._error_catcher():
>             data = self._fp_read(amt) if not fp_closed else b""
>             if amt is not None and amt != 0 and not data:
>                 # Platform-specific: Buggy versions of Python.
>                 # Close the connection when no data is returned
>                 #
>                 # This is redundant to what httplib/http.client _should_
>                 # already do.  However, versions of python released before
>                 # December 15, 2012 (http://bugs.python.org/issue16298) do
>                 # not properly close the connection in all cases. There is
>                 # no harm in redundantly calling close.
>                 self._fp.close()
>                 if (
>                     self.enforce_content_length
>                     and self.length_remaining is not None
>                     and self.length_remaining != 0
>                 ):
>                     # This is an edge case that httplib failed to cover due
>                     # to concerns of backward compatibility. We're
>                     # addressing it here to make sure IncompleteRead is
>                     # raised during streaming, so all calls with incorrect
>                     # Content-Length are caught.
> >                   raise IncompleteRead(self._fp_bytes_read, self.length_remaining)
> E                   urllib3.exceptions.IncompleteRead: IncompleteRead(234 bytes read, 5 more expected)
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:835: IncompleteRead
> 
> The above exception was the direct cause of the following exception:
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
> >               yield from self.raw.stream(chunk_size, decode_content=True)
> 
> /usr/lib/python3/dist-packages/requests/models.py:820: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> /usr/lib/python3/dist-packages/urllib3/response.py:936: in stream
>     data = self.read(amt=amt, decode_content=decode_content)
> /usr/lib/python3/dist-packages/urllib3/response.py:907: in read
>     data = self._raw_read(amt)
> /usr/lib/python3/dist-packages/urllib3/response.py:813: in _raw_read
>     with self._error_catcher():
> /usr/lib/python3.12/contextlib.py:158: in __exit__
>     self.gen.throw(value)
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> 
> self = <urllib3.response.HTTPResponse object at 0x7fc0bfa79b70>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
>                 yield
>     
>             except SocketTimeout as e:
>                 # FIXME: Ideally we'd like to include the url in the ReadTimeoutError but
>                 # there is yet no clean way to get at it from this context.
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") from e  # type: ignore[arg-type]
>     
>             except BaseSSLError as e:
>                 # FIXME: Is there a better way to differentiate between SSLErrors?
>                 if "read operation timed out" not in str(e):
>                     # SSL errors related to framing/MAC get wrapped and reraised here
>                     raise SSLError(e) from e
>     
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") from e  # type: ignore[arg-type]
>     
>             except (HTTPException, OSError) as e:
>                 # This includes IncompleteRead.
> >               raise ProtocolError(f"Connection broken: {e!r}", e) from e
> E               urllib3.exceptions.ProtocolError: ('Connection broken: IncompleteRead(234 bytes read, 5 more expected)', IncompleteRead(234 bytes read, 5 more expected))
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:727: ProtocolError
> 
> During handling of the above exception, another exception occurred:
> 
> self = <tests.test_sessions.TestBasedSession testMethod=test_prepared_request_override_base>
> 
>     def test_prepared_request_override_base(self):
>         session = sessions.BaseUrlSession('https://www.google.com')
>         request = Request(method="GET", url="https://httpbin.org/get")
>         prepared_request = session.prepare_request(request)
>         recorder = get_betamax(session)
>         with recorder.use_cassette('simple_get_request'):
> >           response = session.send(prepared_request)
> 
> tests/test_sessions.py:53: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> /usr/lib/python3/dist-packages/requests/sessions.py:746: in send
>     r.content
> /usr/lib/python3/dist-packages/requests/models.py:902: in content
>     self._content = b"".join(self.iter_content(CONTENT_CHUNK_SIZE)) or b""
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
>                 yield from self.raw.stream(chunk_size, decode_content=True)
>             except ProtocolError as e:
> >               raise ChunkedEncodingError(e)
> E               requests.exceptions.ChunkedEncodingError: ('Connection broken: IncompleteRead(234 bytes read, 5 more expected)', IncompleteRead(234 bytes read, 5 more expected))
> 
> /usr/lib/python3/dist-packages/requests/models.py:822: ChunkedEncodingError
> _______________ TestBasedSession.test_prepared_request_with_base _______________
> 
> self = <urllib3.response.HTTPResponse object at 0x7fc0bfa7a560>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
> >               yield
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:710: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> 
> self = <urllib3.response.HTTPResponse object at 0x7fc0bfa7a560>, amt = 10240
> 
>     def _raw_read(
>         self,
>         amt: int | None = None,
>     ) -> bytes:
>         """
>         Reads `amt` of bytes from the socket.
>         """
>         if self._fp is None:
>             return None  # type: ignore[return-value]
>     
>         fp_closed = getattr(self._fp, "closed", False)
>     
>         with self._error_catcher():
>             data = self._fp_read(amt) if not fp_closed else b""
>             if amt is not None and amt != 0 and not data:
>                 # Platform-specific: Buggy versions of Python.
>                 # Close the connection when no data is returned
>                 #
>                 # This is redundant to what httplib/http.client _should_
>                 # already do.  However, versions of python released before
>                 # December 15, 2012 (http://bugs.python.org/issue16298) do
>                 # not properly close the connection in all cases. There is
>                 # no harm in redundantly calling close.
>                 self._fp.close()
>                 if (
>                     self.enforce_content_length
>                     and self.length_remaining is not None
>                     and self.length_remaining != 0
>                 ):
>                     # This is an edge case that httplib failed to cover due
>                     # to concerns of backward compatibility. We're
>                     # addressing it here to make sure IncompleteRead is
>                     # raised during streaming, so all calls with incorrect
>                     # Content-Length are caught.
> >                   raise IncompleteRead(self._fp_bytes_read, self.length_remaining)
> E                   urllib3.exceptions.IncompleteRead: IncompleteRead(234 bytes read, 5 more expected)
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:835: IncompleteRead
> 
> The above exception was the direct cause of the following exception:
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
> >               yield from self.raw.stream(chunk_size, decode_content=True)
> 
> /usr/lib/python3/dist-packages/requests/models.py:820: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> /usr/lib/python3/dist-packages/urllib3/response.py:936: in stream
>     data = self.read(amt=amt, decode_content=decode_content)
> /usr/lib/python3/dist-packages/urllib3/response.py:907: in read
>     data = self._raw_read(amt)
> /usr/lib/python3/dist-packages/urllib3/response.py:813: in _raw_read
>     with self._error_catcher():
> /usr/lib/python3.12/contextlib.py:158: in __exit__
>     self.gen.throw(value)
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> 
> self = <urllib3.response.HTTPResponse object at 0x7fc0bfa7a560>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
>                 yield
>     
>             except SocketTimeout as e:
>                 # FIXME: Ideally we'd like to include the url in the ReadTimeoutError but
>                 # there is yet no clean way to get at it from this context.
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") from e  # type: ignore[arg-type]
>     
>             except BaseSSLError as e:
>                 # FIXME: Is there a better way to differentiate between SSLErrors?
>                 if "read operation timed out" not in str(e):
>                     # SSL errors related to framing/MAC get wrapped and reraised here
>                     raise SSLError(e) from e
>     
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") from e  # type: ignore[arg-type]
>     
>             except (HTTPException, OSError) as e:
>                 # This includes IncompleteRead.
> >               raise ProtocolError(f"Connection broken: {e!r}", e) from e
> E               urllib3.exceptions.ProtocolError: ('Connection broken: IncompleteRead(234 bytes read, 5 more expected)', IncompleteRead(234 bytes read, 5 more expected))
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:727: ProtocolError
> 
> During handling of the above exception, another exception occurred:
> 
> self = <tests.test_sessions.TestBasedSession testMethod=test_prepared_request_with_base>
> 
>     def test_prepared_request_with_base(self):
>         session = sessions.BaseUrlSession('https://httpbin.org')
>         request = Request(method="GET", url="/get")
>         prepared_request = session.prepare_request(request)
>         recorder = get_betamax(session)
>         with recorder.use_cassette('simple_get_request'):
> >           response = session.send(prepared_request)
> 
> tests/test_sessions.py:37: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> /usr/lib/python3/dist-packages/requests/sessions.py:746: in send
>     r.content
> /usr/lib/python3/dist-packages/requests/models.py:902: in content
>     self._content = b"".join(self.iter_content(CONTENT_CHUNK_SIZE)) or b""
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
>                 yield from self.raw.stream(chunk_size, decode_content=True)
>             except ProtocolError as e:
> >               raise ChunkedEncodingError(e)
> E               requests.exceptions.ChunkedEncodingError: ('Connection broken: IncompleteRead(234 bytes read, 5 more expected)', IncompleteRead(234 bytes read, 5 more expected))
> 
> /usr/lib/python3/dist-packages/requests/models.py:822: ChunkedEncodingError
> _________________ TestBasedSession.test_request_override_base __________________
> 
> self = <urllib3.response.HTTPResponse object at 0x7fc0bfa1d2a0>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
> >               yield
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:710: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> 
> self = <urllib3.response.HTTPResponse object at 0x7fc0bfa1d2a0>, amt = 10240
> 
>     def _raw_read(
>         self,
>         amt: int | None = None,
>     ) -> bytes:
>         """
>         Reads `amt` of bytes from the socket.
>         """
>         if self._fp is None:
>             return None  # type: ignore[return-value]
>     
>         fp_closed = getattr(self._fp, "closed", False)
>     
>         with self._error_catcher():
>             data = self._fp_read(amt) if not fp_closed else b""
>             if amt is not None and amt != 0 and not data:
>                 # Platform-specific: Buggy versions of Python.
>                 # Close the connection when no data is returned
>                 #
>                 # This is redundant to what httplib/http.client _should_
>                 # already do.  However, versions of python released before
>                 # December 15, 2012 (http://bugs.python.org/issue16298) do
>                 # not properly close the connection in all cases. There is
>                 # no harm in redundantly calling close.
>                 self._fp.close()
>                 if (
>                     self.enforce_content_length
>                     and self.length_remaining is not None
>                     and self.length_remaining != 0
>                 ):
>                     # This is an edge case that httplib failed to cover due
>                     # to concerns of backward compatibility. We're
>                     # addressing it here to make sure IncompleteRead is
>                     # raised during streaming, so all calls with incorrect
>                     # Content-Length are caught.
> >                   raise IncompleteRead(self._fp_bytes_read, self.length_remaining)
> E                   urllib3.exceptions.IncompleteRead: IncompleteRead(234 bytes read, 5 more expected)
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:835: IncompleteRead
> 
> The above exception was the direct cause of the following exception:
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
> >               yield from self.raw.stream(chunk_size, decode_content=True)
> 
> /usr/lib/python3/dist-packages/requests/models.py:820: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> /usr/lib/python3/dist-packages/urllib3/response.py:936: in stream
>     data = self.read(amt=amt, decode_content=decode_content)
> /usr/lib/python3/dist-packages/urllib3/response.py:907: in read
>     data = self._raw_read(amt)
> /usr/lib/python3/dist-packages/urllib3/response.py:813: in _raw_read
>     with self._error_catcher():
> /usr/lib/python3.12/contextlib.py:158: in __exit__
>     self.gen.throw(value)
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> 
> self = <urllib3.response.HTTPResponse object at 0x7fc0bfa1d2a0>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
>                 yield
>     
>             except SocketTimeout as e:
>                 # FIXME: Ideally we'd like to include the url in the ReadTimeoutError but
>                 # there is yet no clean way to get at it from this context.
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") from e  # type: ignore[arg-type]
>     
>             except BaseSSLError as e:
>                 # FIXME: Is there a better way to differentiate between SSLErrors?
>                 if "read operation timed out" not in str(e):
>                     # SSL errors related to framing/MAC get wrapped and reraised here
>                     raise SSLError(e) from e
>     
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") from e  # type: ignore[arg-type]
>     
>             except (HTTPException, OSError) as e:
>                 # This includes IncompleteRead.
> >               raise ProtocolError(f"Connection broken: {e!r}", e) from e
> E               urllib3.exceptions.ProtocolError: ('Connection broken: IncompleteRead(234 bytes read, 5 more expected)', IncompleteRead(234 bytes read, 5 more expected))
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:727: ProtocolError
> 
> During handling of the above exception, another exception occurred:
> 
> self = <tests.test_sessions.TestBasedSession testMethod=test_request_override_base>
> 
>     def test_request_override_base(self):
>         session = sessions.BaseUrlSession('https://www.google.com')
>         recorder = get_betamax(session)
>         with recorder.use_cassette('simple_get_request'):
> >           response = session.get('https://httpbin.org/get')
> 
> tests/test_sessions.py:27: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> /usr/lib/python3/dist-packages/requests/sessions.py:602: in get
>     return self.request("GET", url, **kwargs)
> requests_toolbelt/sessions.py:76: in request
>     return super(BaseUrlSession, self).request(
> /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
>     resp = self.send(prep, **send_kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:746: in send
>     r.content
> /usr/lib/python3/dist-packages/requests/models.py:902: in content
>     self._content = b"".join(self.iter_content(CONTENT_CHUNK_SIZE)) or b""
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
>                 yield from self.raw.stream(chunk_size, decode_content=True)
>             except ProtocolError as e:
> >               raise ChunkedEncodingError(e)
> E               requests.exceptions.ChunkedEncodingError: ('Connection broken: IncompleteRead(234 bytes read, 5 more expected)', IncompleteRead(234 bytes read, 5 more expected))
> 
> /usr/lib/python3/dist-packages/requests/models.py:822: ChunkedEncodingError
> ___________________ TestBasedSession.test_request_with_base ____________________
> 
> self = <urllib3.response.HTTPResponse object at 0x7fc0bfc545e0>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
> >               yield
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:710: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> 
> self = <urllib3.response.HTTPResponse object at 0x7fc0bfc545e0>, amt = 10240
> 
>     def _raw_read(
>         self,
>         amt: int | None = None,
>     ) -> bytes:
>         """
>         Reads `amt` of bytes from the socket.
>         """
>         if self._fp is None:
>             return None  # type: ignore[return-value]
>     
>         fp_closed = getattr(self._fp, "closed", False)
>     
>         with self._error_catcher():
>             data = self._fp_read(amt) if not fp_closed else b""
>             if amt is not None and amt != 0 and not data:
>                 # Platform-specific: Buggy versions of Python.
>                 # Close the connection when no data is returned
>                 #
>                 # This is redundant to what httplib/http.client _should_
>                 # already do.  However, versions of python released before
>                 # December 15, 2012 (http://bugs.python.org/issue16298) do
>                 # not properly close the connection in all cases. There is
>                 # no harm in redundantly calling close.
>                 self._fp.close()
>                 if (
>                     self.enforce_content_length
>                     and self.length_remaining is not None
>                     and self.length_remaining != 0
>                 ):
>                     # This is an edge case that httplib failed to cover due
>                     # to concerns of backward compatibility. We're
>                     # addressing it here to make sure IncompleteRead is
>                     # raised during streaming, so all calls with incorrect
>                     # Content-Length are caught.
> >                   raise IncompleteRead(self._fp_bytes_read, self.length_remaining)
> E                   urllib3.exceptions.IncompleteRead: IncompleteRead(234 bytes read, 5 more expected)
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:835: IncompleteRead
> 
> The above exception was the direct cause of the following exception:
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
> >               yield from self.raw.stream(chunk_size, decode_content=True)
> 
> /usr/lib/python3/dist-packages/requests/models.py:820: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> /usr/lib/python3/dist-packages/urllib3/response.py:936: in stream
>     data = self.read(amt=amt, decode_content=decode_content)
> /usr/lib/python3/dist-packages/urllib3/response.py:907: in read
>     data = self._raw_read(amt)
> /usr/lib/python3/dist-packages/urllib3/response.py:813: in _raw_read
>     with self._error_catcher():
> /usr/lib/python3.12/contextlib.py:158: in __exit__
>     self.gen.throw(value)
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> 
> self = <urllib3.response.HTTPResponse object at 0x7fc0bfc545e0>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
>                 yield
>     
>             except SocketTimeout as e:
>                 # FIXME: Ideally we'd like to include the url in the ReadTimeoutError but
>                 # there is yet no clean way to get at it from this context.
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") from e  # type: ignore[arg-type]
>     
>             except BaseSSLError as e:
>                 # FIXME: Is there a better way to differentiate between SSLErrors?
>                 if "read operation timed out" not in str(e):
>                     # SSL errors related to framing/MAC get wrapped and reraised here
>                     raise SSLError(e) from e
>     
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") from e  # type: ignore[arg-type]
>     
>             except (HTTPException, OSError) as e:
>                 # This includes IncompleteRead.
> >               raise ProtocolError(f"Connection broken: {e!r}", e) from e
> E               urllib3.exceptions.ProtocolError: ('Connection broken: IncompleteRead(234 bytes read, 5 more expected)', IncompleteRead(234 bytes read, 5 more expected))
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:727: ProtocolError
> 
> During handling of the above exception, another exception occurred:
> 
> self = <tests.test_sessions.TestBasedSession testMethod=test_request_with_base>
> 
>     def test_request_with_base(self):
>         session = sessions.BaseUrlSession('https://httpbin.org/')
>         recorder = get_betamax(session)
>         with recorder.use_cassette('simple_get_request'):
> >           response = session.get('/get')
> 
> tests/test_sessions.py:15: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> /usr/lib/python3/dist-packages/requests/sessions.py:602: in get
>     return self.request("GET", url, **kwargs)
> requests_toolbelt/sessions.py:76: in request
>     return super(BaseUrlSession, self).request(
> /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
>     resp = self.send(prep, **send_kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:746: in send
>     r.content
> /usr/lib/python3/dist-packages/requests/models.py:902: in content
>     self._content = b"".join(self.iter_content(CONTENT_CHUNK_SIZE)) or b""
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
>                 yield from self.raw.stream(chunk_size, decode_content=True)
>             except ProtocolError as e:
> >               raise ChunkedEncodingError(e)
> E               requests.exceptions.ChunkedEncodingError: ('Connection broken: IncompleteRead(234 bytes read, 5 more expected)', IncompleteRead(234 bytes read, 5 more expected))
> 
> /usr/lib/python3/dist-packages/requests/models.py:822: ChunkedEncodingError
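All six failures funnel through the same `_error_catcher` context manager quoted repeatedly above. Its translate-and-chain pattern, reduced to a self-contained sketch (`LibraryError` and `error_catcher` are illustrative stand-ins, not urllib3 names):

```python
# Sketch of the error-translation pattern _error_catcher implements:
# low-level exceptions raised inside the `with` block are re-raised as a
# single library-level type, with `from e` preserving the cause chain.
from contextlib import contextmanager

class LibraryError(Exception):
    """Stand-in for urllib3.exceptions.ProtocolError."""

@contextmanager
def error_catcher():
    try:
        yield
    except OSError as e:
        raise LibraryError(f"Connection broken: {e!r}") from e

try:
    with error_catcher():
        raise OSError("peer reset")
except LibraryError as e:
    print(type(e.__cause__).__name__)  # OSError
```

This is why the log shows three layers for every failure: IncompleteRead becomes ProtocolError in urllib3, which requests in turn wraps as ChunkedEncodingError.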
> =============================== warnings summary ===============================
> .pybuild/cpython3_3.12_requests-toolbelt/build/tests/test_auth.py: 3 warnings
> .pybuild/cpython3_3.12_requests-toolbelt/build/tests/test_downloadutils.py: 9 warnings
> .pybuild/cpython3_3.12_requests-toolbelt/build/tests/test_dump.py: 2 warnings
> .pybuild/cpython3_3.12_requests-toolbelt/build/tests/test_fingerprintadapter.py: 1 warning
> .pybuild/cpython3_3.12_requests-toolbelt/build/tests/test_forgetfulcookiejar.py: 1 warning
> .pybuild/cpython3_3.12_requests-toolbelt/build/tests/test_multipart_encoder.py: 3 warnings
> .pybuild/cpython3_3.12_requests-toolbelt/build/tests/test_sessions.py: 4 warnings
> .pybuild/cpython3_3.12_requests-toolbelt/build/tests/test_ssladapter.py: 1 warning
>   /usr/lib/python3/dist-packages/betamax/adapter.py:105: DeprecationWarning: datetime.datetime.utcnow() is deprecated and scheduled for removal in a future version. Use timezone-aware objects to represent datetimes in UTC: datetime.datetime.now(datetime.UTC).
>     now = datetime.utcnow()
> 
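The betamax DeprecationWarning above is unrelated to the failures, but the replacement Python 3.12 suggests is mechanical; a minimal sketch:

```python
# Python 3.12 deprecates the naive datetime.utcnow(); the suggested
# replacement returns an aware datetime carrying UTC tzinfo instead.
from datetime import datetime, timezone

aware = datetime.now(timezone.utc)  # instead of datetime.utcnow()

print(aware.tzinfo)       # UTC
print(aware.utcoffset())  # 0:00:00
```

`timezone.utc` works on all supported Python versions; `datetime.UTC` is the 3.11+ alias the warning text uses.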
> -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
> =========================== short test summary info ============================
> FAILED tests/test_dump.py::TestDumpRealResponses::test_dump_response - reques...
> FAILED tests/test_dump.py::TestDumpRealResponses::test_dump_all - requests.ex...
> FAILED tests/test_sessions.py::TestBasedSession::test_prepared_request_override_base
> FAILED tests/test_sessions.py::TestBasedSession::test_prepared_request_with_base
> FAILED tests/test_sessions.py::TestBasedSession::test_request_override_base
> FAILED tests/test_sessions.py::TestBasedSession::test_request_with_base - req...
> 6 failed, 153 passed, 7 skipped, 3 deselected, 24 warnings in 3.55s
> E: pybuild pybuild:389: test: plugin distutils failed with: exit code=1: cd /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build; python3.12 -m pytest -k 'not test_reads_open_file_objects and not test_reads_open_file_objects_using_to_string and not test_reads_open_file_objects_with_a_specified_filename'
> I: pybuild base:311: cd /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build; python3.11 -m pytest -k 'not test_reads_open_file_objects and not test_reads_open_file_objects_using_to_string and not test_reads_open_file_objects_with_a_specified_filename'
> ..........................................FF......ssss.................. [ 43%]
> ................................FF.FF..............................sss.. [ 86%]
> ......................                                                   [100%]
> =================================== FAILURES ===================================
> ___________________ TestDumpRealResponses.test_dump_response ___________________
> 
> self = <urllib3.response.HTTPResponse object at 0x7fa19caa9f60>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
> >               yield
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:710: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> 
> self = <urllib3.response.HTTPResponse object at 0x7fa19caa9f60>, amt = 10240
> 
>     def _raw_read(
>         self,
>         amt: int | None = None,
>     ) -> bytes:
>         """
>         Reads `amt` of bytes from the socket.
>         """
>         if self._fp is None:
>             return None  # type: ignore[return-value]
>     
>         fp_closed = getattr(self._fp, "closed", False)
>     
>         with self._error_catcher():
>             data = self._fp_read(amt) if not fp_closed else b""
>             if amt is not None and amt != 0 and not data:
>                 # Platform-specific: Buggy versions of Python.
>                 # Close the connection when no data is returned
>                 #
>                 # This is redundant to what httplib/http.client _should_
>                 # already do.  However, versions of python released before
>                 # December 15, 2012 (http://bugs.python.org/issue16298) do
>                 # not properly close the connection in all cases. There is
>                 # no harm in redundantly calling close.
>                 self._fp.close()
>                 if (
>                     self.enforce_content_length
>                     and self.length_remaining is not None
>                     and self.length_remaining != 0
>                 ):
>                     # This is an edge case that httplib failed to cover due
>                     # to concerns of backward compatibility. We're
>                     # addressing it here to make sure IncompleteRead is
>                     # raised during streaming, so all calls with incorrect
>                     # Content-Length are caught.
> >                   raise IncompleteRead(self._fp_bytes_read, 
> > self.length_remaining)
> E                   urllib3.exceptions.IncompleteRead: IncompleteRead(234 
> bytes read, 5 more expected)
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:835: IncompleteRead
> 
> The above exception was the direct cause of the following exception:
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
> >               yield from self.raw.stream(chunk_size, decode_content=True)
> 
> /usr/lib/python3/dist-packages/requests/models.py:820: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> /usr/lib/python3/dist-packages/urllib3/response.py:936: in stream
>     data = self.read(amt=amt, decode_content=decode_content)
> /usr/lib/python3/dist-packages/urllib3/response.py:907: in read
>     data = self._raw_read(amt)
> /usr/lib/python3/dist-packages/urllib3/response.py:813: in _raw_read
>     with self._error_catcher():
> /usr/lib/python3.11/contextlib.py:158: in __exit__
>     self.gen.throw(typ, value, traceback)
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> 
> self = <urllib3.response.HTTPResponse object at 0x7fa19caa9f60>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
>                 yield
>     
>             except SocketTimeout as e:
>                 # FIXME: Ideally we'd like to include the url in the 
> ReadTimeoutError but
>                 # there is yet no clean way to get at it from this context.
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") 
> from e  # type: ignore[arg-type]
>     
>             except BaseSSLError as e:
>                 # FIXME: Is there a better way to differentiate between 
> SSLErrors?
>                 if "read operation timed out" not in str(e):
>                     # SSL errors related to framing/MAC get wrapped and 
> reraised here
>                     raise SSLError(e) from e
>     
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") 
> from e  # type: ignore[arg-type]
>     
>             except (HTTPException, OSError) as e:
>                 # This includes IncompleteRead.
> >               raise ProtocolError(f"Connection broken: {e!r}", e) from e
> E               urllib3.exceptions.ProtocolError: ('Connection broken: 
> IncompleteRead(234 bytes read, 5 more expected)', IncompleteRead(234 bytes 
> read, 5 more expected))
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:727: ProtocolError
> 
> During handling of the above exception, another exception occurred:
> 
> self = <tests.test_dump.TestDumpRealResponses object at 0x7fa19c731590>
> 
>     def test_dump_response(self):
>         session = requests.Session()
>         recorder = get_betamax(session)
>         with recorder.use_cassette('simple_get_request'):
> >           response = session.get('https://httpbin.org/get')
> 
> tests/test_dump.py:376: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> /usr/lib/python3/dist-packages/requests/sessions.py:602: in get
>     return self.request("GET", url, **kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
>     resp = self.send(prep, **send_kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:746: in send
>     r.content
> /usr/lib/python3/dist-packages/requests/models.py:902: in content
>     self._content = b"".join(self.iter_content(CONTENT_CHUNK_SIZE)) or b""
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
>                 yield from self.raw.stream(chunk_size, decode_content=True)
>             except ProtocolError as e:
> >               raise ChunkedEncodingError(e)
> E               requests.exceptions.ChunkedEncodingError: ('Connection 
> broken: IncompleteRead(234 bytes read, 5 more expected)', IncompleteRead(234 
> bytes read, 5 more expected))
> 
> /usr/lib/python3/dist-packages/requests/models.py:822: ChunkedEncodingError
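This traceback repeats, with only the failing test changed, for all six failures: urllib3 2.x enforces the declared Content-Length while streaming (enforce_content_length), and the betamax cassettes appear to replay bodies five bytes shorter than the Content-Length header they recorded, so requests surfaces the IncompleteRead as a ChunkedEncodingError. One possible workaround is to recompute the header from the stored body; this is purely a sketch (the helper name is mine, and it assumes betamax's VCR-style JSON cassette layout with list-valued headers):

```python
import json

def fix_content_length(cassette_path):
    """Hypothetical helper: rewrite Content-Length headers in a betamax
    JSON cassette so they match the length of the stored response body."""
    with open(cassette_path) as f:
        cassette = json.load(f)
    for interaction in cassette.get("http_interactions", []):
        response = interaction["response"]
        body = response["body"].get("string", "")
        # Betamax stores header values as lists of strings.
        response["headers"]["Content-Length"] = [str(len(body.encode("utf-8")))]
    with open(cassette_path, "w") as f:
        json.dump(cassette, f, indent=2)
```

Re-recording the cassettes against a live httpbin instance would be the cleaner fix, if that is feasible in the build environment.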
> _____________________ TestDumpRealResponses.test_dump_all 
> ______________________
> 
> self = <urllib3.response.HTTPResponse object at 0x7fa19c98b460>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
> >               yield
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:710: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> 
> self = <urllib3.response.HTTPResponse object at 0x7fa19c98b460>, amt = 10240
> 
>     def _raw_read(
>         self,
>         amt: int | None = None,
>     ) -> bytes:
>         """
>         Reads `amt` of bytes from the socket.
>         """
>         if self._fp is None:
>             return None  # type: ignore[return-value]
>     
>         fp_closed = getattr(self._fp, "closed", False)
>     
>         with self._error_catcher():
>             data = self._fp_read(amt) if not fp_closed else b""
>             if amt is not None and amt != 0 and not data:
>                 # Platform-specific: Buggy versions of Python.
>                 # Close the connection when no data is returned
>                 #
>                 # This is redundant to what httplib/http.client _should_
>                 # already do.  However, versions of python released before
>                 # December 15, 2012 (http://bugs.python.org/issue16298) do
>                 # not properly close the connection in all cases. There is
>                 # no harm in redundantly calling close.
>                 self._fp.close()
>                 if (
>                     self.enforce_content_length
>                     and self.length_remaining is not None
>                     and self.length_remaining != 0
>                 ):
>                     # This is an edge case that httplib failed to cover due
>                     # to concerns of backward compatibility. We're
>                     # addressing it here to make sure IncompleteRead is
>                     # raised during streaming, so all calls with incorrect
>                     # Content-Length are caught.
> >                   raise IncompleteRead(self._fp_bytes_read, 
> > self.length_remaining)
> E                   urllib3.exceptions.IncompleteRead: IncompleteRead(234 
> bytes read, 5 more expected)
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:835: IncompleteRead
> 
> The above exception was the direct cause of the following exception:
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
> >               yield from self.raw.stream(chunk_size, decode_content=True)
> 
> /usr/lib/python3/dist-packages/requests/models.py:820: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> /usr/lib/python3/dist-packages/urllib3/response.py:936: in stream
>     data = self.read(amt=amt, decode_content=decode_content)
> /usr/lib/python3/dist-packages/urllib3/response.py:907: in read
>     data = self._raw_read(amt)
> /usr/lib/python3/dist-packages/urllib3/response.py:813: in _raw_read
>     with self._error_catcher():
> /usr/lib/python3.11/contextlib.py:158: in __exit__
>     self.gen.throw(typ, value, traceback)
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> 
> self = <urllib3.response.HTTPResponse object at 0x7fa19c98b460>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
>                 yield
>     
>             except SocketTimeout as e:
>                 # FIXME: Ideally we'd like to include the url in the 
> ReadTimeoutError but
>                 # there is yet no clean way to get at it from this context.
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") 
> from e  # type: ignore[arg-type]
>     
>             except BaseSSLError as e:
>                 # FIXME: Is there a better way to differentiate between 
> SSLErrors?
>                 if "read operation timed out" not in str(e):
>                     # SSL errors related to framing/MAC get wrapped and 
> reraised here
>                     raise SSLError(e) from e
>     
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") 
> from e  # type: ignore[arg-type]
>     
>             except (HTTPException, OSError) as e:
>                 # This includes IncompleteRead.
> >               raise ProtocolError(f"Connection broken: {e!r}", e) from e
> E               urllib3.exceptions.ProtocolError: ('Connection broken: 
> IncompleteRead(234 bytes read, 5 more expected)', IncompleteRead(234 bytes 
> read, 5 more expected))
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:727: ProtocolError
> 
> During handling of the above exception, another exception occurred:
> 
> self = <tests.test_dump.TestDumpRealResponses object at 0x7fa19c733110>
> 
>     def test_dump_all(self):
>         session = requests.Session()
>         recorder = get_betamax(session)
>         with recorder.use_cassette('redirect_request_for_dump_all'):
> >           response = session.get('https://httpbin.org/redirect/5')
> 
> tests/test_dump.py:392: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> /usr/lib/python3/dist-packages/requests/sessions.py:602: in get
>     return self.request("GET", url, **kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
>     resp = self.send(prep, **send_kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:724: in send
>     history = [resp for resp in gen]
> /usr/lib/python3/dist-packages/requests/sessions.py:724: in <listcomp>
>     history = [resp for resp in gen]
> /usr/lib/python3/dist-packages/requests/sessions.py:265: in resolve_redirects
>     resp = self.send(
> /usr/lib/python3/dist-packages/requests/sessions.py:746: in send
>     r.content
> /usr/lib/python3/dist-packages/requests/models.py:902: in content
>     self._content = b"".join(self.iter_content(CONTENT_CHUNK_SIZE)) or b""
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
>                 yield from self.raw.stream(chunk_size, decode_content=True)
>             except ProtocolError as e:
> >               raise ChunkedEncodingError(e)
> E               requests.exceptions.ChunkedEncodingError: ('Connection 
> broken: IncompleteRead(234 bytes read, 5 more expected)', IncompleteRead(234 
> bytes read, 5 more expected))
> 
> /usr/lib/python3/dist-packages/requests/models.py:822: ChunkedEncodingError
> _____________ TestBasedSession.test_prepared_request_override_base 
> _____________
> 
> self = <urllib3.response.HTTPResponse object at 0x7fa19c79bc10>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
> >               yield
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:710: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> 
> self = <urllib3.response.HTTPResponse object at 0x7fa19c79bc10>, amt = 10240
> 
>     def _raw_read(
>         self,
>         amt: int | None = None,
>     ) -> bytes:
>         """
>         Reads `amt` of bytes from the socket.
>         """
>         if self._fp is None:
>             return None  # type: ignore[return-value]
>     
>         fp_closed = getattr(self._fp, "closed", False)
>     
>         with self._error_catcher():
>             data = self._fp_read(amt) if not fp_closed else b""
>             if amt is not None and amt != 0 and not data:
>                 # Platform-specific: Buggy versions of Python.
>                 # Close the connection when no data is returned
>                 #
>                 # This is redundant to what httplib/http.client _should_
>                 # already do.  However, versions of python released before
>                 # December 15, 2012 (http://bugs.python.org/issue16298) do
>                 # not properly close the connection in all cases. There is
>                 # no harm in redundantly calling close.
>                 self._fp.close()
>                 if (
>                     self.enforce_content_length
>                     and self.length_remaining is not None
>                     and self.length_remaining != 0
>                 ):
>                     # This is an edge case that httplib failed to cover due
>                     # to concerns of backward compatibility. We're
>                     # addressing it here to make sure IncompleteRead is
>                     # raised during streaming, so all calls with incorrect
>                     # Content-Length are caught.
> >                   raise IncompleteRead(self._fp_bytes_read, 
> > self.length_remaining)
> E                   urllib3.exceptions.IncompleteRead: IncompleteRead(234 
> bytes read, 5 more expected)
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:835: IncompleteRead
> 
> The above exception was the direct cause of the following exception:
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
> >               yield from self.raw.stream(chunk_size, decode_content=True)
> 
> /usr/lib/python3/dist-packages/requests/models.py:820: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> /usr/lib/python3/dist-packages/urllib3/response.py:936: in stream
>     data = self.read(amt=amt, decode_content=decode_content)
> /usr/lib/python3/dist-packages/urllib3/response.py:907: in read
>     data = self._raw_read(amt)
> /usr/lib/python3/dist-packages/urllib3/response.py:813: in _raw_read
>     with self._error_catcher():
> /usr/lib/python3.11/contextlib.py:158: in __exit__
>     self.gen.throw(typ, value, traceback)
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> 
> self = <urllib3.response.HTTPResponse object at 0x7fa19c79bc10>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
>                 yield
>     
>             except SocketTimeout as e:
>                 # FIXME: Ideally we'd like to include the url in the 
> ReadTimeoutError but
>                 # there is yet no clean way to get at it from this context.
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") 
> from e  # type: ignore[arg-type]
>     
>             except BaseSSLError as e:
>                 # FIXME: Is there a better way to differentiate between 
> SSLErrors?
>                 if "read operation timed out" not in str(e):
>                     # SSL errors related to framing/MAC get wrapped and 
> reraised here
>                     raise SSLError(e) from e
>     
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") 
> from e  # type: ignore[arg-type]
>     
>             except (HTTPException, OSError) as e:
>                 # This includes IncompleteRead.
> >               raise ProtocolError(f"Connection broken: {e!r}", e) from e
> E               urllib3.exceptions.ProtocolError: ('Connection broken: 
> IncompleteRead(234 bytes read, 5 more expected)', IncompleteRead(234 bytes 
> read, 5 more expected))
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:727: ProtocolError
> 
> During handling of the above exception, another exception occurred:
> 
> self = <tests.test_sessions.TestBasedSession 
> testMethod=test_prepared_request_override_base>
> 
>     def test_prepared_request_override_base(self):
>         session = sessions.BaseUrlSession('https://www.google.com')
>         request = Request(method="GET", url="https://httpbin.org/get")
>         prepared_request = session.prepare_request(request)
>         recorder = get_betamax(session)
>         with recorder.use_cassette('simple_get_request'):
> >           response = session.send(prepared_request)
> 
> tests/test_sessions.py:53: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> /usr/lib/python3/dist-packages/requests/sessions.py:746: in send
>     r.content
> /usr/lib/python3/dist-packages/requests/models.py:902: in content
>     self._content = b"".join(self.iter_content(CONTENT_CHUNK_SIZE)) or b""
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
>                 yield from self.raw.stream(chunk_size, decode_content=True)
>             except ProtocolError as e:
> >               raise ChunkedEncodingError(e)
> E               requests.exceptions.ChunkedEncodingError: ('Connection 
> broken: IncompleteRead(234 bytes read, 5 more expected)', IncompleteRead(234 
> bytes read, 5 more expected))
> 
> /usr/lib/python3/dist-packages/requests/models.py:822: ChunkedEncodingError
> _______________ TestBasedSession.test_prepared_request_with_base 
> _______________
> 
> self = <urllib3.response.HTTPResponse object at 0x7fa19caa8340>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
> >               yield
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:710: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> 
> self = <urllib3.response.HTTPResponse object at 0x7fa19caa8340>, amt = 10240
> 
>     def _raw_read(
>         self,
>         amt: int | None = None,
>     ) -> bytes:
>         """
>         Reads `amt` of bytes from the socket.
>         """
>         if self._fp is None:
>             return None  # type: ignore[return-value]
>     
>         fp_closed = getattr(self._fp, "closed", False)
>     
>         with self._error_catcher():
>             data = self._fp_read(amt) if not fp_closed else b""
>             if amt is not None and amt != 0 and not data:
>                 # Platform-specific: Buggy versions of Python.
>                 # Close the connection when no data is returned
>                 #
>                 # This is redundant to what httplib/http.client _should_
>                 # already do.  However, versions of python released before
>                 # December 15, 2012 (http://bugs.python.org/issue16298) do
>                 # not properly close the connection in all cases. There is
>                 # no harm in redundantly calling close.
>                 self._fp.close()
>                 if (
>                     self.enforce_content_length
>                     and self.length_remaining is not None
>                     and self.length_remaining != 0
>                 ):
>                     # This is an edge case that httplib failed to cover due
>                     # to concerns of backward compatibility. We're
>                     # addressing it here to make sure IncompleteRead is
>                     # raised during streaming, so all calls with incorrect
>                     # Content-Length are caught.
> >                   raise IncompleteRead(self._fp_bytes_read, 
> > self.length_remaining)
> E                   urllib3.exceptions.IncompleteRead: IncompleteRead(234 
> bytes read, 5 more expected)
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:835: IncompleteRead
> 
> The above exception was the direct cause of the following exception:
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
> >               yield from self.raw.stream(chunk_size, decode_content=True)
> 
> /usr/lib/python3/dist-packages/requests/models.py:820: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> /usr/lib/python3/dist-packages/urllib3/response.py:936: in stream
>     data = self.read(amt=amt, decode_content=decode_content)
> /usr/lib/python3/dist-packages/urllib3/response.py:907: in read
>     data = self._raw_read(amt)
> /usr/lib/python3/dist-packages/urllib3/response.py:813: in _raw_read
>     with self._error_catcher():
> /usr/lib/python3.11/contextlib.py:158: in __exit__
>     self.gen.throw(typ, value, traceback)
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> 
> self = <urllib3.response.HTTPResponse object at 0x7fa19caa8340>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
>                 yield
>     
>             except SocketTimeout as e:
>                 # FIXME: Ideally we'd like to include the url in the 
> ReadTimeoutError but
>                 # there is yet no clean way to get at it from this context.
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") 
> from e  # type: ignore[arg-type]
>     
>             except BaseSSLError as e:
>                 # FIXME: Is there a better way to differentiate between 
> SSLErrors?
>                 if "read operation timed out" not in str(e):
>                     # SSL errors related to framing/MAC get wrapped and 
> reraised here
>                     raise SSLError(e) from e
>     
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") 
> from e  # type: ignore[arg-type]
>     
>             except (HTTPException, OSError) as e:
>                 # This includes IncompleteRead.
> >               raise ProtocolError(f"Connection broken: {e!r}", e) from e
> E               urllib3.exceptions.ProtocolError: ('Connection broken: 
> IncompleteRead(234 bytes read, 5 more expected)', IncompleteRead(234 bytes 
> read, 5 more expected))
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:727: ProtocolError
> 
> During handling of the above exception, another exception occurred:
> 
> self = <tests.test_sessions.TestBasedSession 
> testMethod=test_prepared_request_with_base>
> 
>     def test_prepared_request_with_base(self):
>         session = sessions.BaseUrlSession('https://httpbin.org')
>         request = Request(method="GET", url="/get")
>         prepared_request = session.prepare_request(request)
>         recorder = get_betamax(session)
>         with recorder.use_cassette('simple_get_request'):
> >           response = session.send(prepared_request)
> 
> tests/test_sessions.py:37: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> /usr/lib/python3/dist-packages/requests/sessions.py:746: in send
>     r.content
> /usr/lib/python3/dist-packages/requests/models.py:902: in content
>     self._content = b"".join(self.iter_content(CONTENT_CHUNK_SIZE)) or b""
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
>                 yield from self.raw.stream(chunk_size, decode_content=True)
>             except ProtocolError as e:
> >               raise ChunkedEncodingError(e)
> E               requests.exceptions.ChunkedEncodingError: ('Connection 
> broken: IncompleteRead(234 bytes read, 5 more expected)', IncompleteRead(234 
> bytes read, 5 more expected))
> 
> /usr/lib/python3/dist-packages/requests/models.py:822: ChunkedEncodingError
> _________________ TestBasedSession.test_request_override_base 
> __________________
> 
> self = <urllib3.response.HTTPResponse object at 0x7fa19cae93c0>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
> >               yield
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:710: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> 
> self = <urllib3.response.HTTPResponse object at 0x7fa19cae93c0>, amt = 10240
> 
>     def _raw_read(
>         self,
>         amt: int | None = None,
>     ) -> bytes:
>         """
>         Reads `amt` of bytes from the socket.
>         """
>         if self._fp is None:
>             return None  # type: ignore[return-value]
>     
>         fp_closed = getattr(self._fp, "closed", False)
>     
>         with self._error_catcher():
>             data = self._fp_read(amt) if not fp_closed else b""
>             if amt is not None and amt != 0 and not data:
>                 # Platform-specific: Buggy versions of Python.
>                 # Close the connection when no data is returned
>                 #
>                 # This is redundant to what httplib/http.client _should_
>                 # already do.  However, versions of python released before
>                 # December 15, 2012 (http://bugs.python.org/issue16298) do
>                 # not properly close the connection in all cases. There is
>                 # no harm in redundantly calling close.
>                 self._fp.close()
>                 if (
>                     self.enforce_content_length
>                     and self.length_remaining is not None
>                     and self.length_remaining != 0
>                 ):
>                     # This is an edge case that httplib failed to cover due
>                     # to concerns of backward compatibility. We're
>                     # addressing it here to make sure IncompleteRead is
>                     # raised during streaming, so all calls with incorrect
>                     # Content-Length are caught.
> >                   raise IncompleteRead(self._fp_bytes_read, 
> > self.length_remaining)
> E                   urllib3.exceptions.IncompleteRead: IncompleteRead(234 
> bytes read, 5 more expected)
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:835: IncompleteRead
> 
> The above exception was the direct cause of the following exception:
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
> >               yield from self.raw.stream(chunk_size, decode_content=True)
> 
> /usr/lib/python3/dist-packages/requests/models.py:820: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> /usr/lib/python3/dist-packages/urllib3/response.py:936: in stream
>     data = self.read(amt=amt, decode_content=decode_content)
> /usr/lib/python3/dist-packages/urllib3/response.py:907: in read
>     data = self._raw_read(amt)
> /usr/lib/python3/dist-packages/urllib3/response.py:813: in _raw_read
>     with self._error_catcher():
> /usr/lib/python3.11/contextlib.py:158: in __exit__
>     self.gen.throw(typ, value, traceback)
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> 
> self = <urllib3.response.HTTPResponse object at 0x7fa19cae93c0>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
>                 yield
>     
>             except SocketTimeout as e:
>                 # FIXME: Ideally we'd like to include the url in the ReadTimeoutError but
>                 # there is yet no clean way to get at it from this context.
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") from e  # type: ignore[arg-type]
>     
>             except BaseSSLError as e:
>                 # FIXME: Is there a better way to differentiate between SSLErrors?
>                 if "read operation timed out" not in str(e):
>                     # SSL errors related to framing/MAC get wrapped and reraised here
>                     raise SSLError(e) from e
>     
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") from e  # type: ignore[arg-type]
>     
>             except (HTTPException, OSError) as e:
>                 # This includes IncompleteRead.
> >               raise ProtocolError(f"Connection broken: {e!r}", e) from e
> E               urllib3.exceptions.ProtocolError: ('Connection broken: IncompleteRead(234 bytes read, 5 more expected)', IncompleteRead(234 bytes read, 5 more expected))
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:727: ProtocolError
> 
> During handling of the above exception, another exception occurred:
> 
> self = <tests.test_sessions.TestBasedSession testMethod=test_request_override_base>
> 
>     def test_request_override_base(self):
>         session = sessions.BaseUrlSession('https://www.google.com')
>         recorder = get_betamax(session)
>         with recorder.use_cassette('simple_get_request'):
> >           response = session.get('https://httpbin.org/get')
> 
> tests/test_sessions.py:27: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> /usr/lib/python3/dist-packages/requests/sessions.py:602: in get
>     return self.request("GET", url, **kwargs)
> requests_toolbelt/sessions.py:76: in request
>     return super(BaseUrlSession, self).request(
> /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
>     resp = self.send(prep, **send_kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:746: in send
>     r.content
> /usr/lib/python3/dist-packages/requests/models.py:902: in content
>     self._content = b"".join(self.iter_content(CONTENT_CHUNK_SIZE)) or b""
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
>                 yield from self.raw.stream(chunk_size, decode_content=True)
>             except ProtocolError as e:
> >               raise ChunkedEncodingError(e)
> E               requests.exceptions.ChunkedEncodingError: ('Connection broken: IncompleteRead(234 bytes read, 5 more expected)', IncompleteRead(234 bytes read, 5 more expected))
> 
> /usr/lib/python3/dist-packages/requests/models.py:822: ChunkedEncodingError
> ___________________ TestBasedSession.test_request_with_base ____________________
> 
> self = <urllib3.response.HTTPResponse object at 0x7fa19c57e7a0>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
> >               yield
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:710: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> 
> self = <urllib3.response.HTTPResponse object at 0x7fa19c57e7a0>, amt = 10240
> 
>     def _raw_read(
>         self,
>         amt: int | None = None,
>     ) -> bytes:
>         """
>         Reads `amt` of bytes from the socket.
>         """
>         if self._fp is None:
>             return None  # type: ignore[return-value]
>     
>         fp_closed = getattr(self._fp, "closed", False)
>     
>         with self._error_catcher():
>             data = self._fp_read(amt) if not fp_closed else b""
>             if amt is not None and amt != 0 and not data:
>                 # Platform-specific: Buggy versions of Python.
>                 # Close the connection when no data is returned
>                 #
>                 # This is redundant to what httplib/http.client _should_
>                 # already do.  However, versions of python released before
>                 # December 15, 2012 (http://bugs.python.org/issue16298) do
>                 # not properly close the connection in all cases. There is
>                 # no harm in redundantly calling close.
>                 self._fp.close()
>                 if (
>                     self.enforce_content_length
>                     and self.length_remaining is not None
>                     and self.length_remaining != 0
>                 ):
>                     # This is an edge case that httplib failed to cover due
>                     # to concerns of backward compatibility. We're
>                     # addressing it here to make sure IncompleteRead is
>                     # raised during streaming, so all calls with incorrect
>                     # Content-Length are caught.
> >                   raise IncompleteRead(self._fp_bytes_read, self.length_remaining)
> E                   urllib3.exceptions.IncompleteRead: IncompleteRead(234 bytes read, 5 more expected)
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:835: IncompleteRead
> 
> The above exception was the direct cause of the following exception:
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
> >               yield from self.raw.stream(chunk_size, decode_content=True)
> 
> /usr/lib/python3/dist-packages/requests/models.py:820: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> /usr/lib/python3/dist-packages/urllib3/response.py:936: in stream
>     data = self.read(amt=amt, decode_content=decode_content)
> /usr/lib/python3/dist-packages/urllib3/response.py:907: in read
>     data = self._raw_read(amt)
> /usr/lib/python3/dist-packages/urllib3/response.py:813: in _raw_read
>     with self._error_catcher():
> /usr/lib/python3.11/contextlib.py:158: in __exit__
>     self.gen.throw(typ, value, traceback)
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> 
> self = <urllib3.response.HTTPResponse object at 0x7fa19c57e7a0>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
>                 yield
>     
>             except SocketTimeout as e:
>                 # FIXME: Ideally we'd like to include the url in the ReadTimeoutError but
>                 # there is yet no clean way to get at it from this context.
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") from e  # type: ignore[arg-type]
>     
>             except BaseSSLError as e:
>                 # FIXME: Is there a better way to differentiate between SSLErrors?
>                 if "read operation timed out" not in str(e):
>                     # SSL errors related to framing/MAC get wrapped and reraised here
>                     raise SSLError(e) from e
>     
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") from e  # type: ignore[arg-type]
>     
>             except (HTTPException, OSError) as e:
>                 # This includes IncompleteRead.
> >               raise ProtocolError(f"Connection broken: {e!r}", e) from e
> E               urllib3.exceptions.ProtocolError: ('Connection broken: IncompleteRead(234 bytes read, 5 more expected)', IncompleteRead(234 bytes read, 5 more expected))
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:727: ProtocolError
> 
> During handling of the above exception, another exception occurred:
> 
> self = <tests.test_sessions.TestBasedSession testMethod=test_request_with_base>
> 
>     def test_request_with_base(self):
>         session = sessions.BaseUrlSession('https://httpbin.org/')
>         recorder = get_betamax(session)
>         with recorder.use_cassette('simple_get_request'):
> >           response = session.get('/get')
> 
> tests/test_sessions.py:15: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> /usr/lib/python3/dist-packages/requests/sessions.py:602: in get
>     return self.request("GET", url, **kwargs)
> requests_toolbelt/sessions.py:76: in request
>     return super(BaseUrlSession, self).request(
> /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
>     resp = self.send(prep, **send_kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:746: in send
>     r.content
> /usr/lib/python3/dist-packages/requests/models.py:902: in content
>     self._content = b"".join(self.iter_content(CONTENT_CHUNK_SIZE)) or b""
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
>                 yield from self.raw.stream(chunk_size, decode_content=True)
>             except ProtocolError as e:
> >               raise ChunkedEncodingError(e)
> E               requests.exceptions.ChunkedEncodingError: ('Connection broken: IncompleteRead(234 bytes read, 5 more expected)', IncompleteRead(234 bytes read, 5 more expected))
> 
> /usr/lib/python3/dist-packages/requests/models.py:822: ChunkedEncodingError
> =========================== short test summary info ============================
> FAILED tests/test_dump.py::TestDumpRealResponses::test_dump_response - reques...
> FAILED tests/test_dump.py::TestDumpRealResponses::test_dump_all - requests.ex...
> FAILED tests/test_sessions.py::TestBasedSession::test_prepared_request_override_base
> FAILED tests/test_sessions.py::TestBasedSession::test_prepared_request_with_base
> FAILED tests/test_sessions.py::TestBasedSession::test_request_override_base
> FAILED tests/test_sessions.py::TestBasedSession::test_request_with_base - req...
> 6 failed, 153 passed, 7 skipped, 3 deselected in 3.76s
> E: pybuild pybuild:389: test: plugin distutils failed with: exit code=1: cd /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build; python3.11 -m pytest -k 'not test_reads_open_file_objects and not test_reads_open_file_objects_using_to_string and not test_reads_open_file_objects_with_a_specified_filename'
>       rm -fr -- /tmp/dh-xdg-rundir-dPeX5v9W
> dh_auto_test: error: pybuild --test --test-pytest -i python{version} -p "3.12 3.11" returned exit code 13


The full build log is available from:
http://qa-logs.debian.net/2024/06/15/python-requests-toolbelt_1.0.0-2_unstable.log

All bugs filed during this archive rebuild are listed at:
https://bugs.debian.org/cgi-bin/pkgreport.cgi?tag=ftbfs-20240615;users=lu...@debian.org
or:
https://udd.debian.org/bugs/?release=na&merged=ign&fnewerval=7&flastmodval=7&fusertag=only&fusertagtag=ftbfs-20240615&fusertaguser=lu...@debian.org&allbugs=1&cseverity=1&ctags=1&caffected=1#results

A list of current common problems and possible solutions is available at
http://wiki.debian.org/qa.debian.org/FTBFS . You're welcome to contribute!

If you reassign this bug to another package, please mark it as 'affects'-ing
this package. See https://www.debian.org/Bugs/server-control#affects

If you fail to reproduce this, please provide a build log and diff it with mine
so that we can identify if something relevant changed in the meantime.
