Script 'mail_helper' called by obssrc

Hello community,

here is the log from the commit of package python-Scrapy for openSUSE:Factory
checked in at 2021-04-29 01:38:33
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-Scrapy (Old)
 and      /work/SRC/openSUSE:Factory/.python-Scrapy.new.12324 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "python-Scrapy"

Thu Apr 29 01:38:33 2021 rev:9 rq:889037 version:2.5.0

Changes:
--------
--- /work/SRC/openSUSE:Factory/python-Scrapy/python-Scrapy.changes     2020-07-08 19:14:01.003323462 +0200
+++ /work/SRC/openSUSE:Factory/.python-Scrapy.new.12324/python-Scrapy.changes  2021-04-29 01:39:39.438685284 +0200
@@ -1,0 +2,53 @@
+Wed Apr 28 09:29:08 UTC 2021 - Ben Greiner <c...@bnavigator.de>
+
+- Update to 2.5.0:
+  * Official Python 3.9 support
+  * Experimental HTTP/2 support
+  * New get_retry_request() function to retry requests from spider
+    callbacks
+  * New headers_received signal that allows stopping downloads
+    early
+  * New Response.protocol attribute
+- Release 2.4.1:
+  * Fixed feed exports overwrite support
+  * Fixed the asyncio event loop handling, which could make code
+    hang
+  * Fixed the IPv6-capable DNS resolver CachingHostnameResolver
+    for download handlers that call reactor.resolve
+  * Fixed the output of the genspider command showing placeholders
+    instead of the import path of the generated spider module
+    (issue 4874)
+- Release 2.4.0:
+  * Python 3.5 support has been dropped.
+  * The file_path method of media pipelines can now access the
+    source item. This allows you to set a download file path based
+    on item data.
+  * The new item_export_kwargs key of the FEEDS setting allows
+    defining keyword parameters to pass to item exporter classes.
+  * You can now choose whether feed exports overwrite or append to
+    the output file. For example, when using the crawl or runspider
+    commands, you can use the -O option instead of -o to overwrite
+    the output file.
+  * Zstd-compressed responses are now supported if zstandard is
+    installed.
+  * In settings, where the import path of a class is required, it
+    is now possible to pass a class object instead.
+- Release 2.3.0:
+  * Feed exports now support Google Cloud Storage as a storage
+    backend
+  * The new FEED_EXPORT_BATCH_ITEM_COUNT setting allows delivering
+    output items in batches of up to the specified number of items.
+    It also serves as a workaround for delayed file delivery,
+    which causes Scrapy to only start item delivery after the
+    crawl has finished when using certain storage backends (S3,
+    FTP, and now GCS).
+  * The base implementation of item loaders has been moved into a
+    separate library, itemloaders, allowing usage from outside
+    Scrapy and a separate release schedule
+- Release 2.2.1:
+  * The startproject command no longer makes unintended changes to
+    the permissions of files in the destination folder, such as
+    removing execution permissions.
+
+-------------------------------------------------------------------

Old:
----
  Scrapy-2.2.0.tar.gz

New:
----
  Scrapy-2.5.0.tar.gz

++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Other differences:
------------------
++++++ python-Scrapy.spec ++++++
--- /var/tmp/diff_new_pack.4E8EGY/_old  2021-04-29 01:39:39.862685884 +0200
+++ /var/tmp/diff_new_pack.4E8EGY/_new  2021-04-29 01:39:39.866685890 +0200
@@ -1,7 +1,7 @@
 #
 # spec file for package python-Scrapy
 #
-# Copyright (c) 2020 SUSE LLC
+# Copyright (c) 2021 SUSE LLC
 #
 # All modifications and additions to the file contributed by third parties
 # remain the property of their copyright owners, unless otherwise agreed
@@ -19,7 +19,7 @@
 %{?!python_module:%define python_module() python-%{**} python3-%{**}}
 %define skip_python2 1
 Name:           python-Scrapy
-Version:        2.2.0
+Version:        2.5.0
 Release:        0
 Summary:        A high-level Python Screen Scraping framework
 License:        BSD-3-Clause
@@ -30,16 +30,17 @@
 BuildRequires:  %{python_module Protego >= 0.1.15}
 BuildRequires:  %{python_module PyDispatcher >= 2.0.5}
 BuildRequires:  %{python_module Twisted >= 17.9.0}
+BuildRequires:  %{python_module botocore}
 BuildRequires:  %{python_module cryptography >= 2.0}
 BuildRequires:  %{python_module cssselect >= 0.9.1}
 BuildRequires:  %{python_module dbm}
 BuildRequires:  %{python_module itemadapter >= 0.1.0}
+BuildRequires:  %{python_module itemloaders >= 1.0.1}
 BuildRequires:  %{python_module jmespath}
 BuildRequires:  %{python_module lxml >= 3.5.0}
-BuildRequires:  %{python_module mock}
 BuildRequires:  %{python_module parsel >= 1.5.0}
 BuildRequires:  %{python_module pyOpenSSL >= 16.2.0}
-BuildRequires:  %{python_module pytest-twisted}
+BuildRequires:  %{python_module pyftpdlib}
 BuildRequires:  %{python_module pytest-xdist}
 BuildRequires:  %{python_module pytest}
 BuildRequires:  %{python_module queuelib >= 1.4.2}
@@ -47,16 +48,20 @@
 BuildRequires:  %{python_module setuptools}
 BuildRequires:  %{python_module sybil}
 BuildRequires:  %{python_module testfixtures}
+BuildRequires:  %{python_module uvloop}
 BuildRequires:  %{python_module w3lib >= 1.17.2}
+BuildRequires:  %{python_module zope.interface >= 4.1.3}
 BuildRequires:  fdupes
 BuildRequires:  python-rpm-macros
 BuildRequires:  python3-Sphinx
+BuildRequires:  %{python_module dataclasses if (%python-base with python36-base)}
 Requires:       python-Protego >= 0.1.15
 Requires:       python-PyDispatcher >= 2.0.5
 Requires:       python-Twisted >= 17.9.0
 Requires:       python-cryptography >= 2.0
 Requires:       python-cssselect >= 0.9.1
 Requires:       python-itemadapter >= 0.1.0
+Requires:       python-itemloaders >= 1.0.1
 Requires:       python-lxml >= 3.5.0
 Requires:       python-parsel >= 1.5.0
 Requires:       python-pyOpenSSL >= 16.2.0
@@ -66,7 +71,7 @@
 Requires:       python-w3lib >= 1.17.2
 Requires:       python-zope.interface >= 4.1.3
 Requires(post): update-alternatives
-Requires(postun): update-alternatives
+Requires(postun):update-alternatives
 BuildArch:      noarch
 %python_subpackages
@@ -98,19 +103,17 @@
 %python_expand %fdupes %{buildroot}%{$python_sitelib}
 %check
-# boto package is broken (boto.cacerts module removed, but resulting errors are not fixed)
-skiplist="not S3AnonTestCase and not S3TestCase and not S3FeedStorageTest"
-skiplist="$skiplist and not FilesPipelineTestCaseFields"
-skiplist="$skiplist and not ImagesPipelineTestCaseFields"
-skiplist="$skiplist and not CrawlerTestCase"
-skiplist="$skiplist and not CrawlerRunnerTestCase"
-skiplist="$skiplist and not RFPDupeFilterTest"
-skiplist="$skiplist and not StopDownloadEngineTest"
 # tests/test_proxy_connect.py: requires mitmproxy == 0.10.1
-# tests/test_downloader_handlers.py: fails on https & tls tests
-%{python_expand PYTHONPATH=%{buildroot}%{$python_sitelib} py.test-%{$python_bin_suffix} \
+# tests/test_downloader_handlers_*.py and test_http2_client_protocol.py: no network
+# tests/test_command_check.py: twisted dns resolution of example.com error
+# no color in obs chroot console
+skiplist="not test_pformat"
+%{pytest \
   --ignore tests/test_proxy_connect.py \
+  --ignore tests/test_command_check.py \
   --ignore tests/test_downloader_handlers.py \
+  --ignore tests/test_downloader_handlers_http2.py \
+  --ignore tests/test_http2_client_protocol.py \
   -k "${skiplist}" \
   -W ignore::DeprecationWarning \
   tests}

++++++ Scrapy-2.2.0.tar.gz -> Scrapy-2.5.0.tar.gz ++++++
++++ 28418 lines of diff (skipped)
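As an aside for readers of the changelog above: the feed-export additions in Scrapy 2.4.0 (overwrite control and item_export_kwargs) are configured through the FEEDS setting. A minimal sketch of such a settings fragment follows; the output path and the export_empty_fields exporter option are illustrative assumptions, not values taken from this package.

```python
# Sketch of a Scrapy FEEDS settings fragment (Scrapy >= 2.4.0).
# The output path and exporter option are illustrative, not from this log.
FEEDS = {
    "output/items.json": {
        "format": "json",
        # New in 2.4.0: overwrite the output file instead of appending,
        # the same behavior selected by `scrapy crawl -O` vs. `-o`.
        "overwrite": True,
        # New in 2.4.0: keyword arguments forwarded to the item exporter class.
        "item_export_kwargs": {
            "export_empty_fields": True,
        },
    },
}
```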