Package: src:python-scrapy
Version: 2.11.2-1
Severity: serious
Tags: ftbfs

Dear maintainer:

During a rebuild of all packages in unstable, your package failed to build:

--------------------------------------------------------------------------------
[...]
 debian/rules binary
dh binary --buildsystem=pybuild
   dh_update_autotools_config -O--buildsystem=pybuild
   dh_autoreconf -O--buildsystem=pybuild
   dh_auto_configure -O--buildsystem=pybuild
I: pybuild base:311: python3.12 setup.py config
running config
   dh_auto_build -O--buildsystem=pybuild
I: pybuild base:311: /usr/bin/python3 setup.py build
running build
running build_py
creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/scrapy
copying scrapy/interfaces.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/scrapy
copying scrapy/mail.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/scrapy

[... snipped ...]

KeyError: "namespace: urls key: {'url': 
'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat'}"

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 174, in 
_new_conn
    conn = connection.create_connection(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/urllib3/util/connection.py", line 73, in 
create_connection
    for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.12/socket.py", line 964, in getaddrinfo
    for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
socket.gaierror: [Errno -3] Temporary failure in name resolution

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 716, in 
urlopen
    httplib_response = self._make_request(
                       ^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 405, in 
_make_request
    self._validate_conn(conn)
  File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 1059, 
in _validate_conn
    conn.connect()
  File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 363, in 
connect
    self.sock = conn = self._new_conn()
                       ^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 186, in 
_new_conn
    raise NewConnectionError(
urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPSConnection object at 0x7f8330b89ca0>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/requests/adapters.py", line 486, in send
    resp = conn.urlopen(
           ^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 800, in 
urlopen
    retries = retries.increment(
              ^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/urllib3/util/retry.py", line 592, in 
increment
    raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='raw.githubusercontent.com', port=443): Max retries exceeded with url: /publicsuffix/list/master/public_suffix_list.dat (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f8330b89ca0>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution'))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/tldextract/suffix_list.py", line 46, in 
find_first_response
    return cache.cached_fetch_url(
           ^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/tldextract/cache.py", line 220, in 
cached_fetch_url
    return self.run_and_cache(
           ^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/tldextract/cache.py", line 211, in 
run_and_cache
    result = func(**kwargs)
             ^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/tldextract/cache.py", line 229, in 
_fetch_url
    response = session.get(url, timeout=timeout)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/requests/sessions.py", line 602, in get
    return self.request("GET", url, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/requests/sessions.py", line 589, in 
request
    resp = self.send(prep, **send_kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/requests/sessions.py", line 703, in send
    r = adapter.send(request, **kwargs)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/requests/adapters.py", line 519, in send
    raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: HTTPSConnectionPool(host='raw.githubusercontent.com', port=443): Max retries exceeded with url: /publicsuffix/list/master/public_suffix_list.dat (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f8330b89ca0>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution'))
------------------------------ Captured log call -------------------------------
ERROR    tldextract:suffix_list.py:50 Exception reading Public Suffix List url file:///usr/share/publicsuffix/effective_tld_names.dat
Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/tldextract/cache.py", line 209, in 
run_and_cache
    result = cast(T, self.get(namespace=namespace, key=key_args))
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/tldextract/cache.py", line 111, in get
    raise KeyError("namespace: " + namespace + " key: " + repr(key))
KeyError: "namespace: publicsuffix.org-tlds key: {'urls': 
('file:///usr/share/publicsuffix/effective_tld_names.dat', 
'https://publicsuffix.org/list/public_suffix_list.dat', 
'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat'), 
'fallback_to_snapshot': True}"

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/tldextract/cache.py", line 209, in 
run_and_cache
    result = cast(T, self.get(namespace=namespace, key=key_args))
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/tldextract/cache.py", line 111, in get
    raise KeyError("namespace: " + namespace + " key: " + repr(key))
KeyError: "namespace: urls key: {'url': 
'file:///usr/share/publicsuffix/effective_tld_names.dat'}"

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/tldextract/suffix_list.py", line 46, in 
find_first_response
    return cache.cached_fetch_url(
           ^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/tldextract/cache.py", line 220, in 
cached_fetch_url
    return self.run_and_cache(
           ^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/tldextract/cache.py", line 211, in 
run_and_cache
    result = func(**kwargs)
             ^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/tldextract/cache.py", line 230, in 
_fetch_url
    response.raise_for_status()
  File "/usr/lib/python3/dist-packages/requests/models.py", line 1021, in 
raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 404 Client Error: None for url: None
DEBUG    urllib3.connectionpool:connectionpool.py:1020 Starting new HTTPS connection (1): publicsuffix.org:443
ERROR    tldextract:suffix_list.py:50 Exception reading Public Suffix List url https://publicsuffix.org/list/public_suffix_list.dat
Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/tldextract/cache.py", line 209, in 
run_and_cache
    result = cast(T, self.get(namespace=namespace, key=key_args))
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/tldextract/cache.py", line 111, in get
    raise KeyError("namespace: " + namespace + " key: " + repr(key))
KeyError: "namespace: publicsuffix.org-tlds key: {'urls': 
('file:///usr/share/publicsuffix/effective_tld_names.dat', 
'https://publicsuffix.org/list/public_suffix_list.dat', 
'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat'), 
'fallback_to_snapshot': True}"

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/tldextract/cache.py", line 209, in 
run_and_cache
    result = cast(T, self.get(namespace=namespace, key=key_args))
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/tldextract/cache.py", line 111, in get
    raise KeyError("namespace: " + namespace + " key: " + repr(key))
KeyError: "namespace: urls key: {'url': 
'https://publicsuffix.org/list/public_suffix_list.dat'}"

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 174, in 
_new_conn
    conn = connection.create_connection(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/urllib3/util/connection.py", line 73, in 
create_connection
    for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.12/socket.py", line 964, in getaddrinfo
    for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
socket.gaierror: [Errno -3] Temporary failure in name resolution

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 716, in 
urlopen
    httplib_response = self._make_request(
                       ^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 405, in 
_make_request
    self._validate_conn(conn)
  File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 1059, 
in _validate_conn
    conn.connect()
  File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 363, in 
connect
    self.sock = conn = self._new_conn()
                       ^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 186, in 
_new_conn
    raise NewConnectionError(
urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPSConnection object at 0x7f8330b887d0>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/requests/adapters.py", line 486, in send
    resp = conn.urlopen(
           ^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 800, in 
urlopen
    retries = retries.increment(
              ^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/urllib3/util/retry.py", line 592, in 
increment
    raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='publicsuffix.org', port=443): Max retries exceeded with url: /list/public_suffix_list.dat (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f8330b887d0>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution'))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/tldextract/suffix_list.py", line 46, in 
find_first_response
    return cache.cached_fetch_url(
           ^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/tldextract/cache.py", line 220, in 
cached_fetch_url
    return self.run_and_cache(
           ^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/tldextract/cache.py", line 211, in 
run_and_cache
    result = func(**kwargs)
             ^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/tldextract/cache.py", line 229, in 
_fetch_url
    response = session.get(url, timeout=timeout)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/requests/sessions.py", line 602, in get
    return self.request("GET", url, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/requests/sessions.py", line 589, in 
request
    resp = self.send(prep, **send_kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/requests/sessions.py", line 703, in send
    r = adapter.send(request, **kwargs)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/requests/adapters.py", line 519, in send
    raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: HTTPSConnectionPool(host='publicsuffix.org', port=443): Max retries exceeded with url: /list/public_suffix_list.dat (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f8330b887d0>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution'))
DEBUG    urllib3.connectionpool:connectionpool.py:1020 Starting new HTTPS connection (1): raw.githubusercontent.com:443
ERROR    tldextract:suffix_list.py:50 Exception reading Public Suffix List url https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat
Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/tldextract/cache.py", line 209, in 
run_and_cache
    result = cast(T, self.get(namespace=namespace, key=key_args))
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/tldextract/cache.py", line 111, in get
    raise KeyError("namespace: " + namespace + " key: " + repr(key))
KeyError: "namespace: publicsuffix.org-tlds key: {'urls': 
('file:///usr/share/publicsuffix/effective_tld_names.dat', 
'https://publicsuffix.org/list/public_suffix_list.dat', 
'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat'), 
'fallback_to_snapshot': True}"

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/tldextract/cache.py", line 209, in 
run_and_cache
    result = cast(T, self.get(namespace=namespace, key=key_args))
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/tldextract/cache.py", line 111, in get
    raise KeyError("namespace: " + namespace + " key: " + repr(key))
KeyError: "namespace: urls key: {'url': 
'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat'}"

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 174, in 
_new_conn
    conn = connection.create_connection(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/urllib3/util/connection.py", line 73, in 
create_connection
    for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.12/socket.py", line 964, in getaddrinfo
    for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
socket.gaierror: [Errno -3] Temporary failure in name resolution

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 716, in 
urlopen
    httplib_response = self._make_request(
                       ^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 405, in 
_make_request
    self._validate_conn(conn)
  File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 1059, 
in _validate_conn
    conn.connect()
  File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 363, in 
connect
    self.sock = conn = self._new_conn()
                       ^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 186, in 
_new_conn
    raise NewConnectionError(
urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPSConnection object at 0x7f8330b89ca0>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/requests/adapters.py", line 486, in send
    resp = conn.urlopen(
           ^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 800, in 
urlopen
    retries = retries.increment(
              ^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/urllib3/util/retry.py", line 592, in 
increment
    raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='raw.githubusercontent.com', port=443): Max retries exceeded with url: /publicsuffix/list/master/public_suffix_list.dat (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f8330b89ca0>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution'))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/tldextract/suffix_list.py", line 46, in 
find_first_response
    return cache.cached_fetch_url(
           ^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/tldextract/cache.py", line 220, in 
cached_fetch_url
    return self.run_and_cache(
           ^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/tldextract/cache.py", line 211, in 
run_and_cache
    result = func(**kwargs)
             ^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/tldextract/cache.py", line 229, in 
_fetch_url
    response = session.get(url, timeout=timeout)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/requests/sessions.py", line 602, in get
    return self.request("GET", url, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/requests/sessions.py", line 589, in 
request
    resp = self.send(prep, **send_kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/requests/sessions.py", line 703, in send
    r = adapter.send(request, **kwargs)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/requests/adapters.py", line 519, in send
    raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: HTTPSConnectionPool(host='raw.githubusercontent.com', port=443): Max retries exceeded with url: /publicsuffix/list/master/public_suffix_list.dat (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f8330b89ca0>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution'))
=============================== warnings summary ===============================
scrapy/spidermiddlewares/offsite.py:15
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/scrapy/spidermiddlewares/offsite.py:15: ScrapyDeprecationWarning: The scrapy.spidermiddlewares.offsite module is deprecated, use scrapy.downloadermiddlewares.offsite instead.
    warnings.warn(

tests/test_addons.py: 2 warnings
tests/test_crawler.py: 2 warnings
tests/test_downloaderslotssettings.py: 1 warning
tests/test_extension_periodic_log.py: 16 warnings
tests/test_spider.py: 6 warnings
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/scrapy/utils/request.py:254: ScrapyDeprecationWarning: '2.6' is a deprecated value for the 'REQUEST_FINGERPRINTER_IMPLEMENTATION' setting.
  It is also the default value. In other words, it is normal to get this warning if you have not defined a value for the 'REQUEST_FINGERPRINTER_IMPLEMENTATION' setting. This is so for backward compatibility reasons, but it will change in a future version of Scrapy. See the documentation of the 'REQUEST_FINGERPRINTER_IMPLEMENTATION' setting for information on how to handle this deprecation.
    return cls(crawler)

tests/test_contracts.py::ContractsManagerTest::test_returns_async
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/scrapy/contracts/__init__.py:170: RuntimeWarning: coroutine 'TestSpider.returns_request_async' was never awaited
    results.addError(case, sys.exc_info())

tests/test_crawl.py: 2 warnings
tests/test_downloader_handlers.py: 161 warnings
tests/test_downloader_handlers_http2.py: 34 warnings
tests/test_webclient.py: 4 warnings
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/scrapy/core/downloader/contextfactory.py:90: DeprecationWarning: Passing method to twisted.internet.ssl.CertificateOptions was deprecated in Twisted 17.1.0. Please use a combination of insecurelyLowerMinimumTo, raiseMinimumTo, and lowerMaximumSecurityTo instead, as Twisted will correctly configure the method.
    return CertificateOptions(

tests/test_downloadermiddleware_offsite.py::test_process_request_invalid_domains
tests/test_downloadermiddleware_offsite.py::test_request_scheduled_invalid_domains
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/scrapy/downloadermiddlewares/offsite.py:67: UserWarning: allowed_domains accepts only domains, not URLs. Ignoring URL entry http:////b.example in allowed_domains.
    warnings.warn(message)

tests/test_feedexport.py::StdoutFeedStoragePreFeedOptionsTest::test_init
tests/test_feedexport.py::StdoutFeedStoragePreFeedOptionsTest::test_init
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/scrapy/extensions/feedexport.py:49: ScrapyDeprecationWarning: StdoutFeedStorageWithoutFeedOptions does not support the 'feed_options' keyword argument. Add a 'feed_options' parameter to its signature to remove this warning. This parameter will become mandatory in a future version of Scrapy.
    warnings.warn(

tests/test_feedexport.py::FileFeedStoragePreFeedOptionsTest::test_init
tests/test_feedexport.py::FileFeedStoragePreFeedOptionsTest::test_init
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/scrapy/extensions/feedexport.py:49: ScrapyDeprecationWarning: FileFeedStorageWithoutFeedOptions does not support the 'feed_options' keyword argument. Add a 'feed_options' parameter to its signature to remove this warning. This parameter will become mandatory in a future version of Scrapy.
    warnings.warn(

tests/test_feedexport.py::S3FeedStoragePreFeedOptionsTest::test_from_crawler
tests/test_feedexport.py::S3FeedStoragePreFeedOptionsTest::test_from_crawler
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/scrapy/extensions/feedexport.py:49: ScrapyDeprecationWarning: S3FeedStorageWithoutFeedOptionsWithFromCrawler.from_crawler does not support the 'feed_options' keyword argument. Add a 'feed_options' parameter to its signature to remove this warning. This parameter will become mandatory in a future version of Scrapy.
    warnings.warn(

tests/test_feedexport.py::S3FeedStoragePreFeedOptionsTest::test_init
tests/test_feedexport.py::S3FeedStoragePreFeedOptionsTest::test_init
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/scrapy/extensions/feedexport.py:49: ScrapyDeprecationWarning: S3FeedStorageWithoutFeedOptions does not support the 'feed_options' keyword argument. Add a 'feed_options' parameter to its signature to remove this warning. This parameter will become mandatory in a future version of Scrapy.
    warnings.warn(

tests/test_feedexport.py::FTPFeedStoragePreFeedOptionsTest::test_from_crawler
tests/test_feedexport.py::FTPFeedStoragePreFeedOptionsTest::test_from_crawler
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/scrapy/extensions/feedexport.py:49: ScrapyDeprecationWarning: FTPFeedStorageWithoutFeedOptionsWithFromCrawler.from_crawler does not support the 'feed_options' keyword argument. Add a 'feed_options' parameter to its signature to remove this warning. This parameter will become mandatory in a future version of Scrapy.
    warnings.warn(

tests/test_feedexport.py::FTPFeedStoragePreFeedOptionsTest::test_init
tests/test_feedexport.py::FTPFeedStoragePreFeedOptionsTest::test_init
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/scrapy/extensions/feedexport.py:49: ScrapyDeprecationWarning: FTPFeedStorageWithoutFeedOptions does not support the 'feed_options' keyword argument. Add a 'feed_options' parameter to its signature to remove this warning. This parameter will become mandatory in a future version of Scrapy.
    warnings.warn(

tests/test_utils_datatypes.py::CaseInsensitiveDictTest::test_getdefault
tests/test_utils_datatypes.py::CaselessDictTest::test_getdefault
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/tests/test_utils_datatypes.py:95: ScrapyDeprecationWarning: scrapy.utils.datatypes.CaselessDict is deprecated, please use scrapy.utils.datatypes.CaseInsensitiveDict instead
    d = CaselessDict()

tests/test_utils_datatypes.py::CaseInsensitiveDictTest::test_setdefault
tests/test_utils_datatypes.py::CaselessDictTest::test_setdefault
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/tests/test_utils_datatypes.py:101: ScrapyDeprecationWarning: scrapy.utils.datatypes.CaselessDict is deprecated, please use scrapy.utils.datatypes.CaseInsensitiveDict instead
    d = CaselessDict({"a": 1, "b": 2})

tests/test_utils_datatypes.py::CaselessDictTest::test_caseless
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/tests/test_utils_datatypes.py:79: ScrapyDeprecationWarning: scrapy.utils.datatypes.CaselessDict is deprecated, please use scrapy.utils.datatypes.CaseInsensitiveDict instead
    d = self.dict_class()

tests/test_utils_datatypes.py::CaselessDictTest::test_contains
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/tests/test_utils_datatypes.py:132: ScrapyDeprecationWarning: scrapy.utils.datatypes.CaselessDict is deprecated, please use scrapy.utils.datatypes.CaseInsensitiveDict instead
    d = self.dict_class()

tests/test_utils_datatypes.py::CaselessDictTest::test_copy
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/tests/test_utils_datatypes.py:185: ScrapyDeprecationWarning: scrapy.utils.datatypes.CaselessDict is deprecated, please use scrapy.utils.datatypes.CaseInsensitiveDict instead
    h1 = self.dict_class({"header1": "value"})

tests/test_utils_datatypes.py::CaselessDictTest::test_copy
tests/test_utils_datatypes.py::CaselessDictTest::test_copy
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/scrapy/utils/datatypes.py:55: ScrapyDeprecationWarning: scrapy.utils.datatypes.CaselessDict is deprecated, please use scrapy.utils.datatypes.CaseInsensitiveDict instead
    return self.__class__(self)

tests/test_utils_datatypes.py::CaselessDictTest::test_delete
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/tests/test_utils_datatypes.py:89: ScrapyDeprecationWarning: scrapy.utils.datatypes.CaselessDict is deprecated, please use scrapy.utils.datatypes.CaseInsensitiveDict instead
    d = self.dict_class({"key_lower": 1})

tests/test_utils_datatypes.py::CaselessDictTest::test_fromkeys
tests/test_utils_datatypes.py::CaselessDictTest::test_fromkeys
tests/test_utils_datatypes.py::CaselessDictTest::test_fromkeys
tests/test_utils_datatypes.py::CaselessDictTest::test_fromkeys
tests/test_utils_datatypes.py::CaselessDictTest::test_normvalue
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/scrapy/utils/datatypes.py:80: ScrapyDeprecationWarning: scrapy.utils.datatypes.CaselessDict is deprecated, please use scrapy.utils.datatypes.CaseInsensitiveDict instead
    return cls((k, value) for k in keys)

tests/test_utils_datatypes.py::CaselessDictTest::test_fromkeys
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/tests/test_utils_datatypes.py:122: ScrapyDeprecationWarning: scrapy.utils.datatypes.CaselessDict is deprecated, please use scrapy.utils.datatypes.CaseInsensitiveDict instead
    instance = self.dict_class()

tests/test_utils_datatypes.py::CaselessDictTest::test_init_dict
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/tests/test_utils_datatypes.py:24: ScrapyDeprecationWarning: scrapy.utils.datatypes.CaselessDict is deprecated, please use scrapy.utils.datatypes.CaseInsensitiveDict instead
    d = self.dict_class(seq)

tests/test_utils_datatypes.py::CaselessDictTest::test_init_mapping
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/tests/test_utils_datatypes.py:49: ScrapyDeprecationWarning: scrapy.utils.datatypes.CaselessDict is deprecated, please use scrapy.utils.datatypes.CaseInsensitiveDict instead
    d = self.dict_class(seq)

tests/test_utils_datatypes.py::CaselessDictTest::test_init_mutable_mapping
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/tests/test_utils_datatypes.py:74: ScrapyDeprecationWarning: scrapy.utils.datatypes.CaselessDict is deprecated, please use scrapy.utils.datatypes.CaseInsensitiveDict instead
    d = self.dict_class(seq)

tests/test_utils_datatypes.py::CaselessDictTest::test_init_pair_sequence
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/tests/test_utils_datatypes.py:30: ScrapyDeprecationWarning: scrapy.utils.datatypes.CaselessDict is deprecated, please use scrapy.utils.datatypes.CaseInsensitiveDict instead
    d = self.dict_class(seq)

tests/test_utils_datatypes.py::CaselessDictTest::test_normkey
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/tests/test_utils_datatypes.py:149: ScrapyDeprecationWarning: scrapy.utils.datatypes.CaselessDict is deprecated, please use scrapy.utils.datatypes.CaseInsensitiveDict instead
    d = MyDict()

tests/test_utils_datatypes.py::CaselessDictTest::test_normvalue
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/tests/test_utils_datatypes.py:161: ScrapyDeprecationWarning: scrapy.utils.datatypes.CaselessDict is deprecated, please use scrapy.utils.datatypes.CaseInsensitiveDict instead
    d = MyDict({"key": 1})

tests/test_utils_datatypes.py::CaselessDictTest::test_normvalue
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/tests/test_utils_datatypes.py:165: ScrapyDeprecationWarning: scrapy.utils.datatypes.CaselessDict is deprecated, please use scrapy.utils.datatypes.CaseInsensitiveDict instead
    d = MyDict()

tests/test_utils_datatypes.py::CaselessDictTest::test_normvalue
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/tests/test_utils_datatypes.py:170: ScrapyDeprecationWarning: scrapy.utils.datatypes.CaselessDict is deprecated, please use scrapy.utils.datatypes.CaseInsensitiveDict instead
    d = MyDict()

tests/test_utils_datatypes.py::CaselessDictTest::test_normvalue
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/tests/test_utils_datatypes.py:175: ScrapyDeprecationWarning: scrapy.utils.datatypes.CaselessDict is deprecated, please use scrapy.utils.datatypes.CaseInsensitiveDict instead
    d = MyDict()

tests/test_utils_datatypes.py::CaselessDictTest::test_pop
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/tests/test_utils_datatypes.py:137: ScrapyDeprecationWarning: scrapy.utils.datatypes.CaselessDict is deprecated, please use scrapy.utils.datatypes.CaseInsensitiveDict instead
    d = self.dict_class()

tests/test_utils_misc/test_return_with_argument_inside_generator.py::UtilsMiscPy3TestCase::test_generators_return_none
tests/test_utils_misc/test_return_with_argument_inside_generator.py::UtilsMiscPy3TestCase::test_generators_return_none
tests/test_utils_misc/test_return_with_argument_inside_generator.py::UtilsMiscPy3TestCase::test_generators_return_something
tests/test_utils_misc/test_return_with_argument_inside_generator.py::UtilsMiscPy3TestCase::test_generators_return_something
tests/test_utils_misc/test_return_with_argument_inside_generator.py::UtilsMiscPy3TestCase::test_generators_return_something
tests/test_utils_misc/test_return_with_argument_inside_generator.py::UtilsMiscPy3TestCase::test_generators_return_something
tests/test_utils_misc/test_return_with_argument_inside_generator.py::UtilsMiscPy3TestCase::test_generators_return_something
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/scrapy/utils/misc.py:249: DeprecationWarning: ast.NameConstant is deprecated and will be removed in Python 3.14; use ast.Constant instead
    value is None or isinstance(value, ast.NameConstant) and value.value is None

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
=========================== short test summary info ============================
FAILED tests/test_crawl.py::CrawlTestCase::test_unbounded_response - twisted....
FAILED tests/test_downloadermiddleware_cookies.py::CookiesMiddlewareTest::test_basic
FAILED tests/test_downloadermiddleware_cookies.py::CookiesMiddlewareTest::test_complex_cookies
FAILED tests/test_downloadermiddleware_cookies.py::CookiesMiddlewareTest::test_cookie_redirect_different_domain
FAILED tests/test_downloadermiddleware_cookies.py::CookiesMiddlewareTest::test_cookie_redirect_different_domain_forcing_get
FAILED tests/test_downloadermiddleware_cookies.py::CookiesMiddlewareTest::test_cookie_redirect_same_domain
FAILED tests/test_downloadermiddleware_cookies.py::CookiesMiddlewareTest::test_cookie_redirect_same_domain_forcing_get
FAILED tests/test_downloadermiddleware_cookies.py::CookiesMiddlewareTest::test_cookiejar_key
FAILED tests/test_downloadermiddleware_cookies.py::CookiesMiddlewareTest::test_do_not_break_on_non_utf8_header
FAILED tests/test_downloadermiddleware_cookies.py::CookiesMiddlewareTest::test_dont_merge_cookies
FAILED tests/test_downloadermiddleware_cookies.py::CookiesMiddlewareTest::test_invalid_cookies
FAILED tests/test_downloadermiddleware_cookies.py::CookiesMiddlewareTest::test_local_domain
FAILED tests/test_downloadermiddleware_cookies.py::CookiesMiddlewareTest::test_merge_request_cookies
FAILED tests/test_downloadermiddleware_cookies.py::CookiesMiddlewareTest::test_primitive_type_cookies
FAILED tests/test_downloadermiddleware_cookies.py::CookiesMiddlewareTest::test_request_cookies_encoding
FAILED tests/test_downloadermiddleware_cookies.py::CookiesMiddlewareTest::test_server_set_cookie_domain_public_period
FAILED tests/test_downloadermiddleware_cookies.py::CookiesMiddlewareTest::test_server_set_cookie_domain_suffix_private
FAILED tests/test_downloadermiddleware_cookies.py::CookiesMiddlewareTest::test_server_set_cookie_domain_suffix_public_period
FAILED tests/test_downloadermiddleware_cookies.py::CookiesMiddlewareTest::test_server_set_cookie_domain_suffix_public_private
FAILED tests/test_downloadermiddleware_cookies.py::CookiesMiddlewareTest::test_setting_disabled_cookies_debug
FAILED tests/test_downloadermiddleware_cookies.py::CookiesMiddlewareTest::test_setting_enabled_cookies_debug
FAILED tests/test_downloadermiddleware_cookies.py::CookiesMiddlewareTest::test_user_set_cookie_domain_public_period
FAILED tests/test_downloadermiddleware_cookies.py::CookiesMiddlewareTest::test_user_set_cookie_domain_suffix_private
FAILED tests/test_downloadermiddleware_cookies.py::CookiesMiddlewareTest::test_user_set_cookie_domain_suffix_public_period
FAILED tests/test_downloadermiddleware_cookies.py::CookiesMiddlewareTest::test_user_set_cookie_domain_suffix_public_private
= 25 failed, 3187 passed, 306 skipped, 5 deselected, 21 xfailed, 277 warnings in 379.84s (0:06:19) =
E: pybuild pybuild:389: test: plugin distutils failed with: exit code=1: cd /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build; python3.12 -m pytest --ignore tests/test_command_check.py -k 'not (test_start_requests_laziness or test_utf16)'
dh_auto_test: error: pybuild --test -i python{version} -p 3.12 returned exit code 13
make: *** [debian/rules:17: binary] Error 25
dpkg-buildpackage: error: debian/rules binary subprocess returned exit status 2
--------------------------------------------------------------------------------

The above is just the tail of the build log and not necessarily the
most relevant part.
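
From the excerpt, at least the 24 CookiesMiddlewareTest failures appear
to share a single root cause: those tests exercise tldextract, which
tries to load the Public Suffix List at test time. Name resolution
fails for both HTTPS mirrors in the isolated build environment, and the
file:// fallback (/usr/share/publicsuffix/effective_tld_names.dat)
yields "404 Client Error: None for url: None", which suggests the file
itself is missing from the reduced chroot. If that reading is correct,
keeping tldextract offline during the tests should avoid the failures.
A minimal sketch, assuming the test setup can be pointed at its own
extractor (illustrative only, not an actual patch):

    # Build a tldextract extractor that never touches the network and
    # instead uses the Public Suffix List snapshot bundled with
    # tldextract itself.
    import tldextract

    # An empty suffix_list_urls tuple disables all live fetches;
    # tldextract then falls back to its bundled snapshot.
    no_fetch_extract = tldextract.TLDExtract(suffix_list_urls=())

    result = no_fetch_extract("forums.bbc.co.uk")
    print(result.subdomain, result.domain, result.suffix)
    # With the bundled snapshot this prints: forums bbc co.uk

Alternatively, making sure the publicsuffix package is present in the
build chroot (so the file:// fallback resolves) might be enough; I have
not verified either approach.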

For a full build log, please see:

https://tests.reproducible-builds.org/debian/rb-pkg/unstable/amd64/python-scrapy.html

Note: You can reproduce this easily by building the package with
sbuild's unshare backend (e.g. sbuild --chroot-mode=unshare).
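
For the tldextract side, the failure path can also be checked without
sbuild, from any network-isolated shell. A hypothetical sketch (the
cache and URL choices below simply mirror what the log shows):

    # Reproduce the Public Suffix List load failure outside the build:
    # no on-disk cache, only the file:// URL from the log, and no
    # fallback to the snapshot bundled with tldextract.
    import tldextract

    strict = tldextract.TLDExtract(
        cache_dir=None,  # disable on-disk caching
        suffix_list_urls=(
            "file:///usr/share/publicsuffix/effective_tld_names.dat",
        ),
        fallback_to_snapshot=False,  # fail instead of using the snapshot
    )
    try:
        strict("example.com")  # first use triggers the (failing) PSL load
    except Exception as exc:
        print(type(exc).__name__, exc)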

About the archive rebuild: The build was performed on AWS virtual
machines of type m6a.large and r6a.large, using sbuild and a reduced
chroot containing only build-essential packages.

If you cannot reproduce the bug, please contact me privately; I am
willing to provide ssh access to a virtual machine where the bug is
fully reproducible.

If this is really a bug in one of the build-depends, please use
reassign and affects, so that it remains visible on the BTS web page
for this package.

Thanks.
