Script 'mail_helper' called by obssrc
Hello community,
here is the log from the commit of package python-Werkzeug for openSUSE:Factory
checked in at 2023-03-15 18:53:01
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-Werkzeug (Old)
and /work/SRC/openSUSE:Factory/.python-Werkzeug.new.31432 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "python-Werkzeug"
Wed Mar 15 18:53:01 2023 rev:40 rq:1071237 version:2.2.3
Changes:
--------
--- /work/SRC/openSUSE:Factory/python-Werkzeug/python-Werkzeug.changes 2022-09-17 20:08:28.748834789 +0200
+++ /work/SRC/openSUSE:Factory/.python-Werkzeug.new.31432/python-Werkzeug.changes 2023-03-15 18:53:04.615926761 +0100
@@ -1,0 +2,30 @@
+Mon Mar 13 18:48:22 UTC 2023 - Dirk Müller <[email protected]>
+
+- update to 2.2.3 (bsc#1208283, CVE-2023-25577):
+ * Ensure that URL rules using path converters will redirect
+ with strict slashes when the trailing slash is missing.
+ * Type signature for ``get_json`` specifies that return type
+ is not optional when ``silent=False``.
+ * ``parse_content_range_header`` returns ``None`` for a value
+ like ``bytes */-1`` where the length is invalid, instead of
+ raising an ``AssertionError``.
+ * Address remaining ``ResourceWarning`` related to the socket
+ used by ``run_simple``.
+ * Remove ``prepare_socket``, which now happens when
+ creating the server.
+ * Update pre-existing headers for ``multipart/form-data``
+ requests with the test client.
+ * Fix handling of header extended parameters such that they
+ are no longer quoted.
+ * ``LimitedStream.read`` works correctly when wrapping a
+ stream that may not return the requested size in one
+ ``read`` call.
+ * A cookie header that starts with ``=`` is treated as an
+ empty key and discarded, rather than stripping the leading ``==``.
+ * Specify a maximum number of multipart parts, default 1000,
+ after which a ``RequestEntityTooLarge`` exception is
+ raised on parsing. This mitigates a DoS attack where a
+ larger number of form/file parts would result in disproportionate
+ resource use.
+
+-------------------------------------------------------------------
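[Editor's note: the ``LimitedStream.read`` item above reflects a common pitfall: a single ``read(n)`` call on a wrapped stream may legally return fewer than ``n`` bytes. The loop below is a minimal, stdlib-only sketch of that idea; the names (``read_exact``, ``ShortReadStream``) are illustrative and are not Werkzeug's internals.]

```python
def read_exact(stream, size):
    """Keep calling read() until `size` bytes are collected or the
    underlying stream is exhausted; one read() may return fewer bytes."""
    chunks = []
    remaining = size
    while remaining > 0:
        chunk = stream.read(remaining)
        if not chunk:  # EOF before the requested size was reached
            break
        chunks.append(chunk)
        remaining -= len(chunk)
    return b"".join(chunks)

class ShortReadStream:
    """Test double whose read() returns at most 3 bytes per call."""
    def __init__(self, data):
        self.data = data
        self.pos = 0
    def read(self, size):
        chunk = self.data[self.pos:self.pos + min(size, 3)]
        self.pos += len(chunk)
        return chunk
```

A naive single ``stream.read(8)`` against such a stream would return only 3 bytes; the loop collects all 8.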
Old:
----
Werkzeug-2.2.2.tar.gz
New:
----
Werkzeug-2.2.3.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ python-Werkzeug.spec ++++++
--- /var/tmp/diff_new_pack.MFq3Cu/_old 2023-03-15 18:53:07.583942549 +0100
+++ /var/tmp/diff_new_pack.MFq3Cu/_new 2023-03-15 18:53:07.587942571 +0100
@@ -1,7 +1,7 @@
#
# spec file
#
-# Copyright (c) 2022 SUSE LLC
+# Copyright (c) 2023 SUSE LLC
#
# All modifications and additions to the file contributed by third parties
# remain the property of their copyright owners, unless otherwise agreed
@@ -26,7 +26,7 @@
%endif
Name: python-Werkzeug%{psuffix}
-Version: 2.2.2
+Version: 2.2.3
Release: 0
Summary: The Swiss Army knife of Python web development
License: BSD-3-Clause
++++++ Werkzeug-2.2.2.tar.gz -> Werkzeug-2.2.3.tar.gz ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Werkzeug-2.2.2/CHANGES.rst new/Werkzeug-2.2.3/CHANGES.rst
--- old/Werkzeug-2.2.2/CHANGES.rst 2022-08-08 23:41:31.000000000 +0200
+++ new/Werkzeug-2.2.3/CHANGES.rst 2023-02-14 18:14:28.000000000 +0100
@@ -1,5 +1,32 @@
.. currentmodule:: werkzeug
+Version 2.2.3
+-------------
+
+Released 2023-02-14
+
+- Ensure that URL rules using path converters will redirect with strict slashes
+  when the trailing slash is missing. :issue:`2533`
+- Type signature for ``get_json`` specifies that return type is not optional
+  when ``silent=False``. :issue:`2508`
+- ``parse_content_range_header`` returns ``None`` for a value like ``bytes */-1``
+  where the length is invalid, instead of raising an ``AssertionError``. :issue:`2531`
+- Address remaining ``ResourceWarning`` related to the socket used by ``run_simple``.
+  Remove ``prepare_socket``, which now happens when creating the server. :issue:`2421`
+- Update pre-existing headers for ``multipart/form-data`` requests with the test
+  client. :issue:`2549`
+- Fix handling of header extended parameters such that they are no longer quoted.
+  :issue:`2529`
+- ``LimitedStream.read`` works correctly when wrapping a stream that may not
+  return the requested size in one ``read`` call. :issue:`2558`
+- A cookie header that starts with ``=`` is treated as an empty key and discarded,
+  rather than stripping the leading ``==``.
+- Specify a maximum number of multipart parts, default 1000, after which a
+  ``RequestEntityTooLarge`` exception is raised on parsing. This mitigates a DoS
+  attack where a larger number of form/file parts would result in disproportionate
+  resource use.
+
+
Version 2.2.2
-------------
@@ -54,8 +81,9 @@
debug console. :pr:`2439`
- Fix compatibility with Python 3.11 by ensuring that ``end_lineno``
and ``end_col_offset`` are present on AST nodes. :issue:`2425`
-- Add a new faster matching router based on a state
- machine. :pr:`2433`
+- Add a new faster URL matching router based on a state machine. If a custom
+  converter needs to match a ``/`` it must set the class variable
+  ``part_isolating = False``. :pr:`2433`
- Fix branch leaf path masking branch paths when strict-slashes is
disabled. :issue:`1074`
- Names within options headers are always converted to lowercase. This
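[Editor's note: the ``parse_content_range_header`` change above is about validating the stated length in the unsatisfied-range form. Below is a simplified, hypothetical parser for just that form (``bytes */N``); it sketches only the validation idea from the changelog, not Werkzeug's actual parser.]

```python
def parse_unsatisfied_content_range(value):
    """Sketch of the `bytes */-1` fix: for an unsatisfied range ("*"),
    the stated length must be "*" or a non-negative integer; anything
    else is rejected (None) instead of tripping an AssertionError."""
    units, _, rng_and_length = value.strip().partition(" ")
    rng, _, length_str = rng_and_length.partition("/")
    if rng != "*":
        return None  # only the unsatisfied-range form is sketched here
    if length_str == "*":
        return (units, None)
    try:
        length = int(length_str)
    except ValueError:
        return None
    if length < 0:
        return None  # e.g. "bytes */-1" -- previously an AssertionError
    return (units, length)
```

Werkzeug's real function also parses satisfied ranges and returns a ``ContentRange`` object; here ``None`` stands in for the rejected header.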
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Werkzeug-2.2.2/PKG-INFO new/Werkzeug-2.2.3/PKG-INFO
--- old/Werkzeug-2.2.2/PKG-INFO 2022-08-08 23:43:00.048206300 +0200
+++ new/Werkzeug-2.2.3/PKG-INFO 2023-02-14 18:14:50.419272000 +0100
@@ -1,6 +1,6 @@
Metadata-Version: 2.1
Name: Werkzeug
-Version: 2.2.2
+Version: 2.2.3
Summary: The comprehensive WSGI web application library.
Home-page: https://palletsprojects.com/p/werkzeug/
Author: Armin Ronacher
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Werkzeug-2.2.2/docs/installation.rst new/Werkzeug-2.2.3/docs/installation.rst
--- old/Werkzeug-2.2.2/docs/installation.rst 2022-08-08 16:19:18.000000000 +0200
+++ new/Werkzeug-2.2.3/docs/installation.rst 2023-02-14 18:14:28.000000000 +0100
@@ -9,12 +9,6 @@
Python 3.7 and newer.
-Dependencies
-------------
-
-Werkzeug does not have any direct dependencies.
-
-
Optional dependencies
~~~~~~~~~~~~~~~~~~~~~
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Werkzeug-2.2.2/docs/request_data.rst new/Werkzeug-2.2.3/docs/request_data.rst
--- old/Werkzeug-2.2.2/docs/request_data.rst 2022-08-08 16:19:18.000000000 +0200
+++ new/Werkzeug-2.2.3/docs/request_data.rst 2023-02-14 18:14:28.000000000 +0100
@@ -73,23 +73,26 @@
Limiting Request Data
---------------------
-To avoid being the victim of a DDOS attack you can set the maximum
-accepted content length and request field sizes. The :class:`Request`
-class has two attributes for that: :attr:`~Request.max_content_length`
-and :attr:`~Request.max_form_memory_size`.
-
-The first one can be used to limit the total content length. For example
-by setting it to ``1024 * 1024 * 16`` the request won't accept more than
-16MB of transmitted data.
-
-Because certain data can't be moved to the hard disk (regular post data)
-whereas temporary files can, there is a second limit you can set. The
-:attr:`~Request.max_form_memory_size` limits the size of `POST`
-transmitted form data. By setting it to ``1024 * 1024 * 2`` you can make
-sure that all in memory-stored fields are not more than 2MB in size.
-
-This however does *not* affect in-memory stored files if the
-`stream_factory` used returns a in-memory file.
+The :class:`Request` class provides a few attributes to control how much data is
+processed from the request body. This can help mitigate DoS attacks that craft the
+request in such a way that the server uses too many resources to handle it. Each of
+these limits will raise a :exc:`~werkzeug.exceptions.RequestEntityTooLarge` if they
+are exceeded.
+
+- :attr:`~Request.max_content_length` Stop reading request data after this number
+  of bytes. It's better to configure this in the WSGI server or HTTP server, rather
+  than the WSGI application.
+- :attr:`~Request.max_form_memory_size` Stop reading request data if any form part
+  is larger than this number of bytes. While file parts can be moved to disk,
+  regular form field data is stored in memory only.
+- :attr:`~Request.max_form_parts` Stop reading request data if more than this
+  number of parts are sent in multipart form data. This is useful to stop a very
+  large number of very small parts, especially file parts. The default is 1000.
+
+Using Werkzeug to set these limits is only one layer of protection. WSGI servers
+and HTTP servers should set their own limits on size and timeouts. The operating
+system or container manager should set limits on memory and processing time for
+server processes.
How to extend Parsing?
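[Editor's note: a stdlib-only illustration of how the three limits above interact. The constants and the ``check_limits`` helper are hypothetical; in Werkzeug these are ``Request`` attributes and the form parser raises :exc:`~werkzeug.exceptions.RequestEntityTooLarge` itself.]

```python
MAX_CONTENT_LENGTH = 16 * 1024 * 1024   # 16 MiB total request body
MAX_FORM_MEMORY_SIZE = 2 * 1024 * 1024  # 2 MiB per in-memory form field
MAX_FORM_PARTS = 1000                   # multipart part count (2.2.3 default)

class RequestEntityTooLarge(Exception):
    """Stand-in for werkzeug.exceptions.RequestEntityTooLarge."""

def check_limits(content_length, field_sizes):
    """Apply the three checks described above to a hypothetical request,
    given its total body size and the sizes of its form parts."""
    if content_length > MAX_CONTENT_LENGTH:
        raise RequestEntityTooLarge("body too large")
    if len(field_sizes) > MAX_FORM_PARTS:
        raise RequestEntityTooLarge("too many form parts")
    if any(size > MAX_FORM_MEMORY_SIZE for size in field_sizes):
        raise RequestEntityTooLarge("form field too large")
```

Note how the part-count check rejects many tiny parts even when the total body size is well under the content-length limit.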
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Werkzeug-2.2.2/docs/utils.rst new/Werkzeug-2.2.3/docs/utils.rst
--- old/Werkzeug-2.2.2/docs/utils.rst 2022-08-08 16:19:18.000000000 +0200
+++ new/Werkzeug-2.2.3/docs/utils.rst 2023-02-14 18:14:28.000000000 +0100
@@ -23,6 +23,8 @@
.. autofunction:: send_file
+.. autofunction:: send_from_directory
+
.. autofunction:: import_string
.. autofunction:: find_modules
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Werkzeug-2.2.2/requirements/build.txt new/Werkzeug-2.2.3/requirements/build.txt
--- old/Werkzeug-2.2.2/requirements/build.txt 1970-01-01 01:00:00.000000000 +0100
+++ new/Werkzeug-2.2.3/requirements/build.txt 2023-02-14 18:14:28.000000000 +0100
@@ -0,0 +1,17 @@
+# SHA1:80754af91bfb6d1073585b046fe0a474ce868509
+#
+# This file is autogenerated by pip-compile-multi
+# To update, run:
+#
+# pip-compile-multi
+#
+build==0.9.0
+ # via -r requirements/build.in
+packaging==23.0
+ # via build
+pep517==0.13.0
+ # via build
+tomli==2.0.1
+ # via
+ # build
+ # pep517
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Werkzeug-2.2.2/requirements/dev.txt new/Werkzeug-2.2.3/requirements/dev.txt
--- old/Werkzeug-2.2.2/requirements/dev.txt 2022-08-08 16:19:18.000000000 +0200
+++ new/Werkzeug-2.2.3/requirements/dev.txt 2023-02-14 18:14:28.000000000 +0100
@@ -8,55 +8,55 @@
-r docs.txt
-r tests.txt
-r typing.txt
-build==0.8.0
+build==0.9.0
# via pip-tools
+cachetools==5.2.0
+ # via tox
cfgv==3.3.1
# via pre-commit
+chardet==5.1.0
+ # via tox
click==8.1.3
# via
# pip-compile-multi
# pip-tools
-distlib==0.3.4
+colorama==0.4.6
+ # via tox
+distlib==0.3.6
# via virtualenv
-filelock==3.7.1
+filelock==3.9.0
# via
# tox
# virtualenv
-greenlet==1.1.2 ; python_version < "3.11"
- # via -r requirements/tests.in
-identify==2.5.1
+identify==2.5.12
# via pre-commit
nodeenv==1.7.0
# via pre-commit
-pep517==0.12.0
+pep517==0.13.0
# via build
-pip-compile-multi==2.4.5
+pip-compile-multi==2.6.1
# via -r requirements/dev.in
-pip-tools==6.8.0
+pip-tools==6.12.1
# via pip-compile-multi
-platformdirs==2.5.2
- # via virtualenv
-pre-commit==2.20.0
- # via -r requirements/dev.in
-pyyaml==6.0
- # via pre-commit
-six==1.16.0
+platformdirs==2.6.2
# via
# tox
# virtualenv
-toml==0.10.2
- # via
- # pre-commit
- # tox
+pre-commit==2.21.0
+ # via -r requirements/dev.in
+pyproject-api==1.4.0
+ # via tox
+pyyaml==6.0
+ # via pre-commit
toposort==1.7
# via pip-compile-multi
-tox==3.25.1
+tox==4.2.3
# via -r requirements/dev.in
-virtualenv==20.15.1
+virtualenv==20.17.1
# via
# pre-commit
# tox
-wheel==0.37.1
+wheel==0.38.4
# via pip-tools
# The following packages are considered to be unsafe in a requirements file:
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Werkzeug-2.2.2/requirements/docs.txt new/Werkzeug-2.2.3/requirements/docs.txt
--- old/Werkzeug-2.2.2/requirements/docs.txt 2022-08-08 16:19:18.000000000 +0200
+++ new/Werkzeug-2.2.3/requirements/docs.txt 2023-02-14 18:14:28.000000000 +0100
@@ -7,15 +7,15 @@
#
alabaster==0.7.12
# via sphinx
-babel==2.10.3
+babel==2.11.0
# via sphinx
-certifi==2022.6.15
+certifi==2022.12.7
# via requests
-charset-normalizer==2.1.0
+charset-normalizer==2.1.1
# via requests
-docutils==0.18.1
+docutils==0.19
# via sphinx
-idna==3.3
+idna==3.4
# via requests
imagesize==1.4.1
# via sphinx
@@ -23,23 +23,21 @@
# via sphinx
markupsafe==2.1.1
# via jinja2
-packaging==21.3
+packaging==22.0
# via
# pallets-sphinx-themes
# sphinx
-pallets-sphinx-themes==2.0.2
+pallets-sphinx-themes==2.0.3
# via -r requirements/docs.in
-pygments==2.12.0
+pygments==2.14.0
# via sphinx
-pyparsing==3.0.9
- # via packaging
-pytz==2022.1
+pytz==2022.7
# via babel
requests==2.28.1
# via sphinx
snowballstemmer==2.2.0
# via sphinx
-sphinx==5.0.2
+sphinx==6.1.1
# via
# -r requirements/docs.in
# pallets-sphinx-themes
@@ -61,5 +59,5 @@
# via sphinx
sphinxcontrib-serializinghtml==1.1.5
# via sphinx
-urllib3==1.26.10
+urllib3==1.26.13
# via requests
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Werkzeug-2.2.2/requirements/tests.txt new/Werkzeug-2.2.3/requirements/tests.txt
--- old/Werkzeug-2.2.2/requirements/tests.txt 2022-08-08 16:19:18.000000000 +0200
+++ new/Werkzeug-2.2.3/requirements/tests.txt 2023-02-14 18:14:28.000000000 +0100
@@ -5,40 +5,40 @@
#
# pip-compile-multi
#
-attrs==21.4.0
+attrs==22.2.0
# via pytest
cffi==1.15.1
# via cryptography
-cryptography==37.0.4
+cryptography==39.0.0
# via -r requirements/tests.in
ephemeral-port-reserve==1.1.4
# via -r requirements/tests.in
-greenlet==1.1.2 ; python_version < "3.11"
+exceptiongroup==1.1.0
+ # via pytest
+greenlet==2.0.1 ; python_version < "3.11"
# via -r requirements/tests.in
iniconfig==1.1.1
# via pytest
-packaging==21.3
+packaging==22.0
# via pytest
pluggy==1.0.0
# via pytest
-psutil==5.9.1
+psutil==5.9.4
# via pytest-xprocess
py==1.11.0
- # via pytest
+ # via pytest-xprocess
pycparser==2.21
# via cffi
-pyparsing==3.0.9
- # via packaging
-pytest==7.1.2
+pytest==7.2.0
# via
# -r requirements/tests.in
# pytest-timeout
# pytest-xprocess
pytest-timeout==2.1.0
# via -r requirements/tests.in
-pytest-xprocess==0.19.0
+pytest-xprocess==0.22.2
# via -r requirements/tests.in
tomli==2.0.1
# via pytest
-watchdog==2.1.9
+watchdog==2.2.1
# via -r requirements/tests.in
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Werkzeug-2.2.2/requirements/typing.txt new/Werkzeug-2.2.3/requirements/typing.txt
--- old/Werkzeug-2.2.2/requirements/typing.txt 2022-08-08 16:19:18.000000000 +0200
+++ new/Werkzeug-2.2.3/requirements/typing.txt 2023-02-14 18:14:28.000000000 +0100
@@ -1,11 +1,11 @@
-# SHA1:95499f7e92b572adde012b13e1ec99dbbb2f7089
+# SHA1:162796b1b3ac7a29da65fe0e32278f14b68ed8c8
#
# This file is autogenerated by pip-compile-multi
# To update, run:
#
# pip-compile-multi
#
-mypy==0.961
+mypy==0.991
# via -r requirements/typing.in
mypy-extensions==0.4.3
# via mypy
@@ -15,7 +15,11 @@
# via -r requirements/typing.in
types-dataclasses==0.6.6
# via -r requirements/typing.in
-types-setuptools==62.6.1
+types-docutils==0.19.1.1
+ # via types-setuptools
+types-setuptools==65.6.0.3
# via -r requirements/typing.in
-typing-extensions==4.3.0
+typing-extensions==4.4.0
# via mypy
+watchdog==2.2.1
+ # via -r requirements/typing.in
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Werkzeug-2.2.2/setup.cfg new/Werkzeug-2.2.3/setup.cfg
--- old/Werkzeug-2.2.2/setup.cfg 2022-08-08 23:43:00.051540000 +0200
+++ new/Werkzeug-2.2.3/setup.cfg 2023-02-14 18:14:50.423272100 +0100
@@ -58,19 +58,6 @@
src
*/site-packages
-[flake8]
-select = B, E, F, W, B9, ISC
-ignore =
- E203
- E402
- E501
- E722
- W503
-max-line-length = 80
-per-file-ignores =
- **/__init__.py: F401
- src/werkzeug/local.py: E731
-
[mypy]
files = src/werkzeug
python_version = 3.7
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Werkzeug-2.2.2/src/Werkzeug.egg-info/PKG-INFO new/Werkzeug-2.2.3/src/Werkzeug.egg-info/PKG-INFO
--- old/Werkzeug-2.2.2/src/Werkzeug.egg-info/PKG-INFO 2022-08-08 23:43:00.000000000 +0200
+++ new/Werkzeug-2.2.3/src/Werkzeug.egg-info/PKG-INFO 2023-02-14 18:14:50.000000000 +0100
@@ -1,6 +1,6 @@
Metadata-Version: 2.1
Name: Werkzeug
-Version: 2.2.2
+Version: 2.2.3
Summary: The comprehensive WSGI web application library.
Home-page: https://palletsprojects.com/p/werkzeug/
Author: Armin Ronacher
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Werkzeug-2.2.2/src/Werkzeug.egg-info/SOURCES.txt new/Werkzeug-2.2.3/src/Werkzeug.egg-info/SOURCES.txt
--- old/Werkzeug-2.2.2/src/Werkzeug.egg-info/SOURCES.txt 2022-08-08 23:43:00.000000000 +0200
+++ new/Werkzeug-2.2.3/src/Werkzeug.egg-info/SOURCES.txt 2023-02-14 18:14:50.000000000 +0100
@@ -163,6 +163,7 @@
examples/simplewiki/templates/recent_changes.html
examples/webpylike/example.py
examples/webpylike/webpylike.py
+requirements/build.txt
requirements/dev.txt
requirements/docs.txt
requirements/tests.txt
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Werkzeug-2.2.2/src/werkzeug/__init__.py new/Werkzeug-2.2.3/src/werkzeug/__init__.py
--- old/Werkzeug-2.2.2/src/werkzeug/__init__.py 2022-08-08 23:41:31.000000000 +0200
+++ new/Werkzeug-2.2.3/src/werkzeug/__init__.py 2023-02-14 18:14:28.000000000 +0100
@@ -3,4 +3,4 @@
from .wrappers import Request as Request
from .wrappers import Response as Response
-__version__ = "2.2.2"
+__version__ = "2.2.3"
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Werkzeug-2.2.2/src/werkzeug/_internal.py new/Werkzeug-2.2.3/src/werkzeug/_internal.py
--- old/Werkzeug-2.2.2/src/werkzeug/_internal.py 2022-08-08 16:19:18.000000000 +0200
+++ new/Werkzeug-2.2.3/src/werkzeug/_internal.py 2023-02-14 18:14:28.000000000 +0100
@@ -34,7 +34,7 @@
_legal_cookie_chars_re = rb"[\w\d!#%&\'~_`><@,:/\$\*\+\-\.\^\|\)\(\?\}\{\=]"
_cookie_re = re.compile(
rb"""
- (?P<key>[^=;]+)
+ (?P<key>[^=;]*)
(?:\s*=\s*
(?P<val>
"(?:[^\\"]|\\.)*" |
@@ -382,16 +382,21 @@
"""Lowlevel cookie parsing facility that operates on bytes."""
i = 0
n = len(b)
+ b += b";"
while i < n:
- match = _cookie_re.search(b + b";", i)
+ match = _cookie_re.match(b, i)
+
if not match:
break
- key = match.group("key").strip()
- value = match.group("val") or b""
i = match.end(0)
+ key = match.group("key").strip()
+
+ if not key:
+ continue
+ value = match.group("val") or b""
yield key, _cookie_unquote(value)
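[Editor's note: the hunk above changes the key group from ``[^=;]+`` to ``[^=;]*`` and then skips pairs whose key is empty, so a header starting with ``=`` is discarded rather than mis-parsed. Below is a self-contained sketch of the same idea on ``str`` input; the regex is a simplification of Werkzeug's private ``_cookie_re``, which also handles quoted values.]

```python
import re

# Key group is [^=;]* (may be empty), mirroring the change above.
_cookie_re = re.compile(r"(?P<key>[^=;]*)(?:\s*=\s*(?P<val>[^;]*))?\s*;\s*")

def parse_cookie_pairs(header: str):
    """Yield (key, value) pairs; pairs with an empty key are discarded."""
    header += ";"  # sentinel so every pair ends with a separator
    i, n = 0, len(header)
    pairs = []
    while i < n:
        match = _cookie_re.match(header, i)
        if not match:
            break
        i = match.end(0)
        key = match.group("key").strip()
        if not key:  # e.g. a header starting with "=" -- skip it
            continue
        pairs.append((key, (match.group("val") or "").strip()))
    return pairs
```

Using ``match`` anchored at the current position (rather than ``search``) is the same structural change the hunk makes to the byte-level parser.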
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Werkzeug-2.2.2/src/werkzeug/_reloader.py new/Werkzeug-2.2.3/src/werkzeug/_reloader.py
--- old/Werkzeug-2.2.2/src/werkzeug/_reloader.py 2022-08-08 16:19:18.000000000 +0200
+++ new/Werkzeug-2.2.3/src/werkzeug/_reloader.py 2023-02-14 18:14:28.000000000 +0100
@@ -20,7 +20,7 @@
if hasattr(sys, "real_prefix"):
# virtualenv < 20
- prefix.add(sys.real_prefix) # type: ignore[attr-defined]
+ prefix.add(sys.real_prefix)
_stat_ignore_scan = tuple(prefix)
del prefix
@@ -309,7 +309,7 @@
super().__init__(*args, **kwargs)
trigger_reload = self.trigger_reload
- class EventHandler(PatternMatchingEventHandler): # type: ignore
+ class EventHandler(PatternMatchingEventHandler):
def on_any_event(self, event): # type: ignore
trigger_reload(event.src_path)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Werkzeug-2.2.2/src/werkzeug/datastructures.py new/Werkzeug-2.2.3/src/werkzeug/datastructures.py
--- old/Werkzeug-2.2.2/src/werkzeug/datastructures.py 2022-08-08 23:36:27.000000000 +0200
+++ new/Werkzeug-2.2.3/src/werkzeug/datastructures.py 2023-02-14 18:14:28.000000000 +0100
@@ -1226,7 +1226,7 @@
(_unicodify_header_value(k), _unicodify_header_value(v))
for (k, v) in value
]
- for (_, v) in value:
+ for _, v in value:
self._validate_value(v)
if isinstance(key, int):
self._list[key] = value[0]
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Werkzeug-2.2.2/src/werkzeug/debug/__init__.py new/Werkzeug-2.2.3/src/werkzeug/debug/__init__.py
--- old/Werkzeug-2.2.2/src/werkzeug/debug/__init__.py 2022-08-08 23:36:27.000000000 +0200
+++ new/Werkzeug-2.2.3/src/werkzeug/debug/__init__.py 2023-02-14 18:14:28.000000000 +0100
@@ -329,7 +329,7 @@
app_iter = self.app(environ, start_response)
yield from app_iter
if hasattr(app_iter, "close"):
- app_iter.close() # type: ignore
+ app_iter.close()
except Exception as e:
if hasattr(app_iter, "close"):
app_iter.close() # type: ignore
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Werkzeug-2.2.2/src/werkzeug/debug/repr.py new/Werkzeug-2.2.3/src/werkzeug/debug/repr.py
--- old/Werkzeug-2.2.2/src/werkzeug/debug/repr.py 2022-08-08 16:19:18.000000000 +0200
+++ new/Werkzeug-2.2.3/src/werkzeug/debug/repr.py 2023-02-14 18:14:28.000000000 +0100
@@ -132,7 +132,7 @@
def regex_repr(self, obj: t.Pattern) -> str:
pattern = repr(obj.pattern)
- pattern = codecs.decode(pattern, "unicode-escape", "ignore")  # type: ignore
+ pattern = codecs.decode(pattern, "unicode-escape", "ignore")
pattern = f"r{pattern}"
return f're.compile(<span class="string regex">{pattern}</span>)'
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Werkzeug-2.2.2/src/werkzeug/debug/tbtools.py new/Werkzeug-2.2.3/src/werkzeug/debug/tbtools.py
--- old/Werkzeug-2.2.2/src/werkzeug/debug/tbtools.py 2022-08-08 16:19:18.000000000 +0200
+++ new/Werkzeug-2.2.3/src/werkzeug/debug/tbtools.py 2023-02-14 18:14:28.000000000 +0100
@@ -184,7 +184,7 @@
}
if hasattr(fs, "colno"):
- frame_args["colno"] = fs.colno # type: ignore[attr-defined]
+ frame_args["colno"] = fs.colno
frame_args["end_colno"] = fs.end_colno  # type: ignore[attr-defined]
new_stack.append(DebugFrameSummary(**frame_args))
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Werkzeug-2.2.2/src/werkzeug/exceptions.py new/Werkzeug-2.2.3/src/werkzeug/exceptions.py
--- old/Werkzeug-2.2.2/src/werkzeug/exceptions.py 2022-08-08 16:19:18.000000000 +0200
+++ new/Werkzeug-2.2.3/src/werkzeug/exceptions.py 2023-02-14 18:14:28.000000000 +0100
@@ -205,7 +205,7 @@
KeyError.__init__(self, arg)
@property # type: ignore
- def description(self) -> str: # type: ignore
+ def description(self) -> str:
if self.show_exception:
return (
f"{self._description}\n"
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Werkzeug-2.2.2/src/werkzeug/formparser.py new/Werkzeug-2.2.3/src/werkzeug/formparser.py
--- old/Werkzeug-2.2.2/src/werkzeug/formparser.py 2022-08-08 16:19:18.000000000 +0200
+++ new/Werkzeug-2.2.3/src/werkzeug/formparser.py 2023-02-14 18:14:28.000000000 +0100
@@ -179,6 +179,8 @@
:param cls: an optional dict class to use. If this is not specified
or `None` the default :class:`MultiDict` is used.
:param silent: If set to False parsing errors will not be caught.
+ :param max_form_parts: The maximum number of parts to be parsed. If this is
+ exceeded, a :exc:`~exceptions.RequestEntityTooLarge` exception is raised.
"""
def __init__(
@@ -190,6 +192,8 @@
max_content_length: t.Optional[int] = None,
cls: t.Optional[t.Type[MultiDict]] = None,
silent: bool = True,
+ *,
+ max_form_parts: t.Optional[int] = None,
) -> None:
if stream_factory is None:
stream_factory = default_stream_factory
@@ -199,6 +203,7 @@
self.errors = errors
self.max_form_memory_size = max_form_memory_size
self.max_content_length = max_content_length
+ self.max_form_parts = max_form_parts
if cls is None:
cls = MultiDict
@@ -281,6 +286,7 @@
self.errors,
max_form_memory_size=self.max_form_memory_size,
cls=self.cls,
+ max_form_parts=self.max_form_parts,
)
boundary = options.get("boundary", "").encode("ascii")
@@ -346,10 +352,12 @@
max_form_memory_size: t.Optional[int] = None,
cls: t.Optional[t.Type[MultiDict]] = None,
buffer_size: int = 64 * 1024,
+ max_form_parts: t.Optional[int] = None,
) -> None:
self.charset = charset
self.errors = errors
self.max_form_memory_size = max_form_memory_size
+ self.max_form_parts = max_form_parts
if stream_factory is None:
stream_factory = default_stream_factory
@@ -409,7 +417,9 @@
[None],
)
- parser = MultipartDecoder(boundary, self.max_form_memory_size)
+ parser = MultipartDecoder(
+ boundary, self.max_form_memory_size, max_parts=self.max_form_parts
+ )
fields = []
files = []
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Werkzeug-2.2.2/src/werkzeug/http.py new/Werkzeug-2.2.3/src/werkzeug/http.py
--- old/Werkzeug-2.2.2/src/werkzeug/http.py 2022-08-08 23:36:27.000000000 +0200
+++ new/Werkzeug-2.2.3/src/werkzeug/http.py 2023-02-14 18:14:28.000000000 +0100
@@ -190,6 +190,15 @@
SAME_ORIGIN = "same-origin"
+def _is_extended_parameter(key: str) -> bool:
+ """Per RFC 5987/8187, "extended" values may *not* be quoted.
+ This is in keeping with browser implementations. So we test
+ using this function to see if the key indicates this parameter
+ follows the `ext-parameter` syntax (using a trailing '*').
+ """
+ return key.strip().endswith("*")
+
+
def quote_header_value(
value: t.Union[str, int], extra_chars: str = "", allow_token: bool = True
) -> str:
@@ -254,6 +263,8 @@
for key, value in options.items():
if value is None:
segments.append(key)
+ elif _is_extended_parameter(key):
+ segments.append(f"{key}={value}")
else:
segments.append(f"{key}={quote_header_value(value)}")
return "; ".join(segments)
@@ -282,6 +293,8 @@
for key, value in iterable.items():
if value is None:
items.append(key)
+ elif _is_extended_parameter(key):
+ items.append(f"{key}={value}")
else:
items.append(
f"{key}={quote_header_value(value, allow_token=allow_token)}"
@@ -818,6 +831,9 @@
return None
if rng == "*":
+ if not is_byte_range_valid(None, None, length):
+ return None
+
return ds.ContentRange(units, None, None, length, on_update=on_update)
elif "-" not in rng:
return None
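[Editor's note: the ``_is_extended_parameter`` helper above drives the serialization change: RFC 5987/8187 ``ext-parameter`` values, whose keys end in ``*``, must not be quoted. The ``dump_options`` function below is a simplified, hypothetical stand-in for the dump side; in Werkzeug the branch lives in ``dump_options_header`` and ``dump_header``.]

```python
def _is_extended_parameter(key: str) -> bool:
    """Mirrors the helper added above: ext-parameter keys end with '*'
    and their values must be emitted unquoted."""
    return key.strip().endswith("*")

def dump_options(options):
    """Serialize header options, quoting ordinary values but passing
    extended (key*) values through verbatim."""
    segments = []
    for key, value in options.items():
        if value is None:
            segments.append(key)
        elif _is_extended_parameter(key):
            segments.append(f"{key}={value}")  # no quoting per RFC 8187
        else:
            segments.append(f'{key}="{value}"')  # simplified quoting
    return "; ".join(segments)
```

Quoting an extended value would make clients interpret the quotation marks as part of the encoded value, which is the bug the release fixes.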
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Werkzeug-2.2.2/src/werkzeug/local.py new/Werkzeug-2.2.3/src/werkzeug/local.py
--- old/Werkzeug-2.2.2/src/werkzeug/local.py 2022-08-08 23:36:27.000000000 +0200
+++ new/Werkzeug-2.2.3/src/werkzeug/local.py 2023-02-14 18:14:28.000000000 +0100
@@ -291,7 +291,7 @@
# A C function, use partial to bind the first argument.
def bind_f(instance: "LocalProxy", obj: t.Any) -> t.Callable:
- return partial(f, obj) # type: ignore
+ return partial(f, obj)
else:
# Use getattr, which will produce a bound method.
@@ -313,7 +313,7 @@
return self
try:
- obj = instance._get_current_object() # type: ignore[misc]
+ obj = instance._get_current_object()
except RuntimeError:
if self.fallback is None:
raise
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Werkzeug-2.2.2/src/werkzeug/middleware/lint.py new/Werkzeug-2.2.3/src/werkzeug/middleware/lint.py
--- old/Werkzeug-2.2.2/src/werkzeug/middleware/lint.py 2022-08-08 23:36:27.000000000 +0200
+++ new/Werkzeug-2.2.3/src/werkzeug/middleware/lint.py 2023-02-14 18:14:28.000000000 +0100
@@ -164,7 +164,7 @@
self.closed = True
if hasattr(self._iterator, "close"):
- self._iterator.close() # type: ignore
+ self._iterator.close()
if self.headers_set:
status_code, headers = self.headers_set
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Werkzeug-2.2.2/src/werkzeug/middleware/profiler.py new/Werkzeug-2.2.3/src/werkzeug/middleware/profiler.py
--- old/Werkzeug-2.2.2/src/werkzeug/middleware/profiler.py 2022-08-08 16:19:18.000000000 +0200
+++ new/Werkzeug-2.2.3/src/werkzeug/middleware/profiler.py 2023-02-14 18:14:28.000000000 +0100
@@ -106,7 +106,7 @@
response_body.extend(app_iter)
if hasattr(app_iter, "close"):
- app_iter.close() # type: ignore
+ app_iter.close()
profile = Profile()
start = time.time()
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Werkzeug-2.2.2/src/werkzeug/routing/matcher.py new/Werkzeug-2.2.3/src/werkzeug/routing/matcher.py
--- old/Werkzeug-2.2.2/src/werkzeug/routing/matcher.py 2022-08-08 16:19:18.000000000 +0200
+++ new/Werkzeug-2.2.3/src/werkzeug/routing/matcher.py 2023-02-14 18:14:28.000000000 +0100
@@ -127,7 +127,14 @@
remaining = []
match = re.compile(test_part.content).match(target)
if match is not None:
- rv = _match(new_state, remaining, values + list(match.groups()))
+ groups = list(match.groups())
+ if test_part.suffixed:
+ # If a part_isolating=False part has a slash suffix, remove the
+ # suffix from the match and check for the slash redirect next.
+ suffix = groups.pop()
+ if suffix == "/":
+ remaining = [""]
+ rv = _match(new_state, remaining, values + groups)
if rv is not None:
return rv
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Werkzeug-2.2.2/src/werkzeug/routing/rules.py new/Werkzeug-2.2.3/src/werkzeug/routing/rules.py
--- old/Werkzeug-2.2.2/src/werkzeug/routing/rules.py 2022-08-08 16:19:18.000000000 +0200
+++ new/Werkzeug-2.2.3/src/werkzeug/routing/rules.py 2023-02-14 18:14:28.000000000 +0100
@@ -36,6 +36,7 @@
content: str
final: bool
static: bool
+ suffixed: bool
weight: Weighting
@@ -631,7 +632,11 @@
argument_weights,
)
yield RulePart(
- content=content, final=final, static=static, weight=weight
+ content=content,
+ final=final,
+ static=static,
+ suffixed=False,
+ weight=weight,
)
content = ""
static = True
@@ -641,6 +646,12 @@
pos = match.end()
+ suffixed = False
+ if final and content[-1] == "/":
+ # If a converter is part_isolating=False (matches slashes) and ends with a
+ # slash, augment the regex to support slash redirects.
+ suffixed = True
+ content = content[:-1] + "(?<!/)(/?)"
if not static:
content += r"\Z"
weight = Weighting(
@@ -649,7 +660,17 @@
-len(argument_weights),
argument_weights,
)
- yield RulePart(content=content, final=final, static=static, weight=weight)
+ yield RulePart(
+ content=content,
+ final=final,
+ static=static,
+ suffixed=suffixed,
+ weight=weight,
+ )
+ if suffixed:
+ yield RulePart(
content="", final=False, static=True, suffixed=False, weight=weight
+ )
def compile(self) -> None:
"""Compiles the regular expression and stores it."""
@@ -665,7 +686,11 @@
if domain_rule == "":
self._parts = [
RulePart(
- content="", final=False, static=True, weight=Weighting(0, [], 0, [])
+ content="",
+ final=False,
+ static=True,
+ suffixed=False,
+ weight=Weighting(0, [], 0, []),
)
]
else:
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Werkzeug-2.2.2/src/werkzeug/sansio/http.py new/Werkzeug-2.2.3/src/werkzeug/sansio/http.py
--- old/Werkzeug-2.2.2/src/werkzeug/sansio/http.py 2022-08-08 16:19:18.000000000 +0200
+++ new/Werkzeug-2.2.3/src/werkzeug/sansio/http.py 2023-02-14 18:14:28.000000000 +0100
@@ -126,10 +126,6 @@
def _parse_pairs() -> t.Iterator[t.Tuple[str, str]]:
for key, val in _cookie_parse_impl(cookie): # type: ignore
key_str = _to_str(key, charset, errors, allow_none_charset=True)
-
- if not key_str:
- continue
-
val_str = _to_str(val, charset, errors, allow_none_charset=True)
yield key_str, val_str
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Werkzeug-2.2.2/src/werkzeug/sansio/multipart.py new/Werkzeug-2.2.3/src/werkzeug/sansio/multipart.py
--- old/Werkzeug-2.2.2/src/werkzeug/sansio/multipart.py 2022-08-08 16:19:18.000000000 +0200
+++ new/Werkzeug-2.2.3/src/werkzeug/sansio/multipart.py 2023-02-14 18:14:28.000000000 +0100
@@ -87,10 +87,13 @@
self,
boundary: bytes,
max_form_memory_size: Optional[int] = None,
+ *,
+ max_parts: Optional[int] = None,
) -> None:
self.buffer = bytearray()
self.complete = False
self.max_form_memory_size = max_form_memory_size
+ self.max_parts = max_parts
self.state = State.PREAMBLE
self.boundary = boundary
@@ -118,6 +121,7 @@
re.MULTILINE,
)
self._search_position = 0
+ self._parts_decoded = 0
def last_newline(self) -> int:
try:
@@ -191,6 +195,10 @@
)
self.state = State.DATA
self._search_position = 0
+ self._parts_decoded += 1
+
+ if self.max_parts is not None and self._parts_decoded > self.max_parts:
+ raise RequestEntityTooLarge()
else:
# Update the search start position to be equal to the
# current buffer length (already searched) minus a
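The part-counting guard added in the hunk above (the fix for CVE-2023-25577) can be sketched in isolation. The class and method names below are illustrative stand-ins, not Werkzeug's API; the point is just the counter-plus-cap pattern that turns unbounded multipart parsing into a bounded operation.

```python
# Minimal sketch of the max_parts guard; names are illustrative, not
# Werkzeug's own API.
class RequestEntityTooLarge(Exception):
    """Stand-in for werkzeug.exceptions.RequestEntityTooLarge."""


class PartCounter:
    def __init__(self, max_parts=None):
        self.max_parts = max_parts
        self.parts_decoded = 0

    def on_part_decoded(self):
        # Count each decoded multipart part; fail once the cap is exceeded.
        self.parts_decoded += 1
        if self.max_parts is not None and self.parts_decoded > self.max_parts:
            raise RequestEntityTooLarge()


counter = PartCounter(max_parts=2)
counter.on_part_decoded()
counter.on_part_decoded()
try:
    counter.on_part_decoded()  # third part exceeds max_parts=2
    exceeded = False
except RequestEntityTooLarge:
    exceeded = True
```

In Werkzeug itself the equivalent check runs inside the decoder's header-parsing state, so a request with too many parts fails fast instead of consuming memory per part.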
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Werkzeug-2.2.2/src/werkzeug/security.py new/Werkzeug-2.2.3/src/werkzeug/security.py
--- old/Werkzeug-2.2.2/src/werkzeug/security.py 2022-08-08 16:19:18.000000000 +0200
+++ new/Werkzeug-2.2.3/src/werkzeug/security.py 2023-02-14 18:14:28.000000000 +0100
@@ -12,7 +12,7 @@
DEFAULT_PBKDF2_ITERATIONS = 260000
_os_alt_seps: t.List[str] = list(
- sep for sep in [os.path.sep, os.path.altsep] if sep is not None and sep != "/"
+ sep for sep in [os.sep, os.path.altsep] if sep is not None and sep != "/"
)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Werkzeug-2.2.2/src/werkzeug/serving.py new/Werkzeug-2.2.3/src/werkzeug/serving.py
--- old/Werkzeug-2.2.2/src/werkzeug/serving.py 2022-08-08 23:36:27.000000000 +0200
+++ new/Werkzeug-2.2.3/src/werkzeug/serving.py 2023-02-14 18:14:28.000000000 +0100
@@ -221,9 +221,7 @@
try:
# binary_form=False gives nicer information, but wouldn't be compatible with
# what Nginx or Apache could return.
- peer_cert = self.connection.getpeercert(  # type: ignore[attr-defined]
- binary_form=True
- )
+ peer_cert = self.connection.getpeercert(binary_form=True)
if peer_cert is not None:
# Nginx and Apache use PEM format.
environ["SSL_CLIENT_CERT"] = ssl.DER_cert_to_PEM_cert(peer_cert)
@@ -329,7 +327,7 @@
self.wfile.write(b"0\r\n\r\n")
finally:
if hasattr(application_iter, "close"):
- application_iter.close() # type: ignore
+ application_iter.close()
try:
execute(self.server.app)
@@ -659,6 +657,7 @@
multithread = False
multiprocess = False
request_queue_size = LISTEN_QUEUE
+ allow_reuse_address = True
def __init__(
self,
@@ -710,10 +709,36 @@
try:
self.server_bind()
self.server_activate()
+ except OSError as e:
+ # Catch connection issues and show them without the traceback. Show
+ # extra instructions for address not found, and for macOS.
+ self.server_close()
+ print(e.strerror, file=sys.stderr)
+
+ if e.errno == errno.EADDRINUSE:
+ print(
+ f"Port {port} is in use by another program. Either identify and"
+ " stop that program, or start the server with a different"
+ " port.",
+ file=sys.stderr,
+ )
+
+ if sys.platform == "darwin" and port == 5000:
+ print(
+ "On macOS, try disabling the 'AirPlay Receiver' service"
+ " from System Preferences -> Sharing.",
+ file=sys.stderr,
+ )
+
+ sys.exit(1)
except BaseException:
self.server_close()
raise
else:
+ # TCPServer automatically opens a socket even if bind_and_activate is False.
+ # Close it to silence a ResourceWarning.
+ self.server_close()
+
# Use the passed in socket directly.
self.socket = socket.fromfd(fd, address_family, socket.SOCK_STREAM)
self.server_address = self.socket.getsockname()
@@ -879,60 +904,6 @@
return os.environ.get("WERKZEUG_RUN_MAIN") == "true"
-def prepare_socket(hostname: str, port: int) -> socket.socket:
- """Prepare a socket for use by the WSGI server and reloader.
-
- The socket is marked inheritable so that it can be kept across
- reloads instead of breaking connections.
-
- Catch errors during bind and show simpler error messages. For
- "address already in use", show instructions for resolving the issue,
- with special instructions for macOS.
-
- This is called from :func:`run_simple`, but can be used separately
- to control server creation with :func:`make_server`.
- """
- address_family = select_address_family(hostname, port)
- server_address = get_sockaddr(hostname, port, address_family)
- s = socket.socket(address_family, socket.SOCK_STREAM)
- s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
- s.set_inheritable(True)
-
- # Remove the socket file if it already exists.
- if address_family == af_unix:
- server_address = t.cast(str, server_address)
-
- if os.path.exists(server_address):
- os.unlink(server_address)
-
- # Catch connection issues and show them without the traceback. Show
- # extra instructions for address not found, and for macOS.
- try:
- s.bind(server_address)
- except OSError as e:
- print(e.strerror, file=sys.stderr)
-
- if e.errno == errno.EADDRINUSE:
- print(
- f"Port {port} is in use by another program. Either"
- " identify and stop that program, or start the"
- " server with a different port.",
- file=sys.stderr,
- )
-
- if sys.platform == "darwin" and port == 5000:
- print(
- "On macOS, try disabling the 'AirPlay Receiver'"
- " service from System Preferences -> Sharing.",
- file=sys.stderr,
- )
-
- sys.exit(1)
-
- s.listen(LISTEN_QUEUE)
- return s
-
-
def run_simple(
hostname: str,
port: int,
@@ -1059,12 +1030,7 @@
application = DebuggedApplication(application, evalex=use_evalex)
if not is_running_from_reloader():
- s = prepare_socket(hostname, port)
- fd = s.fileno()
- # Silence a ResourceWarning about an unclosed socket. This object is no longer
- # used, the server will create another with fromfd.
- s.detach()
- os.environ["WERKZEUG_SERVER_FD"] = str(fd)
+ fd = None
else:
fd = int(os.environ["WERKZEUG_SERVER_FD"])
@@ -1079,6 +1045,8 @@
ssl_context,
fd=fd,
)
+ srv.socket.set_inheritable(True)
+ os.environ["WERKZEUG_SERVER_FD"] = str(srv.fileno())
if not is_running_from_reloader():
srv.log_startup()
@@ -1087,12 +1055,15 @@
if use_reloader:
from ._reloader import run_with_reloader
- run_with_reloader(
- srv.serve_forever,
- extra_files=extra_files,
- exclude_patterns=exclude_patterns,
- interval=reloader_interval,
- reloader_type=reloader_type,
- )
+ try:
+ run_with_reloader(
+ srv.serve_forever,
+ extra_files=extra_files,
+ exclude_patterns=exclude_patterns,
+ interval=reloader_interval,
+ reloader_type=reloader_type,
+ )
+ finally:
+ srv.server_close()
else:
srv.serve_forever()
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Werkzeug-2.2.2/src/werkzeug/test.py new/Werkzeug-2.2.3/src/werkzeug/test.py
--- old/Werkzeug-2.2.2/src/werkzeug/test.py 2022-08-08 23:36:27.000000000 +0200
+++ new/Werkzeug-2.2.3/src/werkzeug/test.py 2023-02-14 18:14:28.000000000 +0100
@@ -107,7 +107,8 @@
and mimetypes.guess_type(filename)[0]
or "application/octet-stream"
)
- headers = Headers([("Content-Type", content_type)])
+ headers = value.headers
+ headers.update([("Content-Type", content_type)])
if filename is None:
write_binary(encoder.send_event(Field(name=key, headers=headers)))
else:
@@ -441,7 +442,7 @@
if input_stream is not None:
raise TypeError("can't provide input stream and data")
if hasattr(data, "read"):
- data = data.read() # type: ignore
+ data = data.read()
if isinstance(data, str):
data = data.encode(self.charset)
if isinstance(data, bytes):
@@ -449,7 +450,7 @@
if self.content_length is None:
self.content_length = len(data)
else:
- for key, value in _iter_data(data): # type: ignore
+ for key, value in _iter_data(data):
if isinstance(value, (tuple, dict)) or hasattr(value, "read"):
self._add_file_from_data(key, value)
else:
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Werkzeug-2.2.2/src/werkzeug/utils.py new/Werkzeug-2.2.3/src/werkzeug/utils.py
--- old/Werkzeug-2.2.2/src/werkzeug/utils.py 2022-08-08 16:19:18.000000000 +0200
+++ new/Werkzeug-2.2.3/src/werkzeug/utils.py 2023-02-14 18:14:28.000000000 +0100
@@ -221,7 +221,7 @@
filename = unicodedata.normalize("NFKD", filename)
filename = filename.encode("ascii", "ignore").decode("ascii")
- for sep in os.path.sep, os.path.altsep:
+ for sep in os.sep, os.path.altsep:
if sep:
filename = filename.replace(sep, " ")
filename = str(_filename_ascii_strip_re.sub("", "_".join(filename.split()))).strip(
@@ -352,7 +352,7 @@
Never pass file paths provided by a user. The path is assumed to be
trusted, so a user could craft a path to access a file you didn't
- intend.
+ intend. Use :func:`send_from_directory` to safely serve user-provided paths.
If the WSGI server sets a ``file_wrapper`` in ``environ``, it is
used, otherwise Werkzeug's built-in wrapper is used. Alternatively,
@@ -562,9 +562,10 @@
If the final path does not point to an existing regular file,
returns a 404 :exc:`~werkzeug.exceptions.NotFound` error.
- :param directory: The directory that ``path`` must be located under.
- :param path: The path to the file to send, relative to
- ``directory``.
+ :param directory: The directory that ``path`` must be located under. This *must not*
+ be a value provided by the client, otherwise it becomes insecure.
+ :param path: The path to the file to send, relative to ``directory``. This is the
+ part of the path provided by the client, which is checked for security.
:param environ: The WSGI environ for the current request.
:param kwargs: Arguments to pass to :func:`send_file`.
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Werkzeug-2.2.2/src/werkzeug/wrappers/request.py new/Werkzeug-2.2.3/src/werkzeug/wrappers/request.py
--- old/Werkzeug-2.2.2/src/werkzeug/wrappers/request.py 2022-08-08 16:19:18.000000000 +0200
+++ new/Werkzeug-2.2.3/src/werkzeug/wrappers/request.py 2023-02-14 18:14:28.000000000 +0100
@@ -83,6 +83,13 @@
#: .. versionadded:: 0.5
max_form_memory_size: t.Optional[int] = None
+ #: The maximum number of multipart parts to parse, passed to
+ #: :attr:`form_data_parser_class`. Parsing form data with more than this
+ #: many parts will raise :exc:`~.RequestEntityTooLarge`.
+ #:
+ #: .. versionadded:: 2.2.3
+ max_form_parts = 1000
+
#: The form data parser that should be used. Can be replaced to customize
#: the form date parsing.
form_data_parser_class: t.Type[FormDataParser] = FormDataParser
@@ -246,6 +253,7 @@
self.max_form_memory_size,
self.max_content_length,
self.parameter_storage_class,
+ max_form_parts=self.max_form_parts,
)
def _load_form_data(self) -> None:
@@ -543,6 +551,18 @@
# with sentinel values.
_cached_json: t.Tuple[t.Any, t.Any] = (Ellipsis, Ellipsis)
+ @t.overload
+ def get_json(
+ self, force: bool = ..., silent: "te.Literal[False]" = ..., cache: bool = ...
+ ) -> t.Any:
+ ...
+
+ @t.overload
+ def get_json(
+ self, force: bool = ..., silent: bool = ..., cache: bool = ...
+ ) -> t.Optional[t.Any]:
+ ...
+
def get_json(
self, force: bool = False, silent: bool = False, cache: bool = True
) -> t.Optional[t.Any]:
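The `@t.overload` pair added above is what lets type checkers infer a non-optional return from `get_json` when `silent` is the literal `False`. The same pattern can be reproduced on any method; the `Payload` class below is a hypothetical stand-in, not Werkzeug code.

```python
import typing as t


class Payload:
    """Hypothetical stand-in showing the overload pattern used above."""

    def __init__(self, data: t.Any) -> None:
        self._data = data

    # With silent=False (the default), checkers infer a non-optional result.
    @t.overload
    def get(self, silent: "t.Literal[False]" = ...) -> t.Any:
        ...

    # With an arbitrary bool, the result may be None.
    @t.overload
    def get(self, silent: bool = ...) -> t.Optional[t.Any]:
        ...

    def get(self, silent: bool = False) -> t.Optional[t.Any]:
        # Mirrors get_json semantics: fail loudly unless silent is requested.
        if self._data is None and not silent:
            raise ValueError("no data")
        return self._data
```

At runtime only the final implementation executes; the decorated stubs exist purely for static analysis, which is why the Werkzeug change is type-signature-only.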
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Werkzeug-2.2.2/src/werkzeug/wrappers/response.py new/Werkzeug-2.2.3/src/werkzeug/wrappers/response.py
--- old/Werkzeug-2.2.2/src/werkzeug/wrappers/response.py 2022-08-08 16:19:18.000000000 +0200
+++ new/Werkzeug-2.2.3/src/werkzeug/wrappers/response.py 2023-02-14 18:14:28.000000000 +0100
@@ -439,7 +439,7 @@
Can now be used in a with statement.
"""
if hasattr(self.response, "close"):
- self.response.close() # type: ignore
+ self.response.close()
for func in self._on_close:
func()
@@ -645,6 +645,14 @@
"""
return self.get_json()
+ @t.overload
+ def get_json(self, force: bool = ..., silent: "te.Literal[False]" = ...) -> t.Any:
+ ...
+
+ @t.overload
+ def get_json(self, force: bool = ..., silent: bool = ...) -> t.Optional[t.Any]:
+ ...
+
def get_json(self, force: bool = False, silent: bool = False) -> t.Optional[t.Any]:
"""Parse :attr:`data` as JSON. Useful during testing.
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Werkzeug-2.2.2/src/werkzeug/wsgi.py new/Werkzeug-2.2.3/src/werkzeug/wsgi.py
--- old/Werkzeug-2.2.2/src/werkzeug/wsgi.py 2022-08-08 16:19:18.000000000 +0200
+++ new/Werkzeug-2.2.3/src/werkzeug/wsgi.py 2023-02-14 18:14:28.000000000 +0100
@@ -611,9 +611,7 @@
self.end_byte = start_byte + byte_range
self.read_length = 0
- self.seekable = (
- hasattr(iterable, "seekable") and iterable.seekable()  # type: ignore
- )
+ self.seekable = hasattr(iterable, "seekable") and iterable.seekable()
self.end_reached = False
def __iter__(self) -> "_RangeWrapper":
@@ -665,7 +663,7 @@
def close(self) -> None:
if hasattr(self.iterable, "close"):
- self.iterable.close() # type: ignore
+ self.iterable.close()
def _make_chunk_iter(
@@ -930,37 +928,77 @@
raise ClientDisconnected()
- def exhaust(self, chunk_size: int = 1024 * 64) -> None:
- """Exhaust the stream. This consumes all the data left until the
- limit is reached.
+ def _exhaust_chunks(self, chunk_size: int = 1024 * 64) -> t.Iterator[bytes]:
+ """Exhaust the stream by reading until the limit is reached or the client
+ disconnects, yielding each chunk.
+
+ :param chunk_size: How many bytes to read at a time.
- :param chunk_size: the size for a chunk. It will read the chunk
- until the stream is exhausted and throw away
- the results.
+ :meta private:
+
+ .. versionadded:: 2.2.3
"""
to_read = self.limit - self._pos
- chunk = chunk_size
+
while to_read > 0:
- chunk = min(to_read, chunk)
- self.read(chunk)
- to_read -= chunk
+ chunk = self.read(min(to_read, chunk_size))
+ yield chunk
+ to_read -= len(chunk)
+
+ def exhaust(self, chunk_size: int = 1024 * 64) -> None:
+ """Exhaust the stream by reading until the limit is reached or the client
+ disconnects, discarding the data.
+
+ :param chunk_size: How many bytes to read at a time.
+
+ .. versionchanged:: 2.2.3
+ Handle case where wrapped stream returns fewer bytes than requested.
+ """
+ for _ in self._exhaust_chunks(chunk_size):
+ pass
def read(self, size: t.Optional[int] = None) -> bytes:
- """Read `size` bytes or if size is not provided everything is read.
+ """Read up to ``size`` bytes from the underlying stream. If size is not
+ provided, read until the limit.
- :param size: the number of bytes read.
+ If the limit is reached, :meth:`on_exhausted` is called, which returns empty
+ bytes.
+
+ If no bytes are read and the limit is not reached, or if an error occurs during
+ the read, :meth:`on_disconnect` is called, which raises
+ :exc:`.ClientDisconnected`.
+
+ :param size: The number of bytes to read. ``None``, default, reads until the
+ limit is reached.
+
+ .. versionchanged:: 2.2.3
+ Handle case where wrapped stream returns fewer bytes than requested.
"""
if self._pos >= self.limit:
return self.on_exhausted()
- if size is None or size == -1: # -1 is for consistence with file
- size = self.limit
+
+ if size is None or size == -1: # -1 is for consistency with file
+ # Keep reading from the wrapped stream until the limit is reached. Can't
+ # rely on stream.read(size) because it's not guaranteed to return size.
+ buf = bytearray()
+
+ for chunk in self._exhaust_chunks():
+ buf.extend(chunk)
+
+ return bytes(buf)
+
to_read = min(self.limit - self._pos, size)
+
try:
read = self._read(to_read)
except (OSError, ValueError):
return self.on_disconnect()
- if to_read and len(read) != to_read:
+
+ if to_read and not len(read):
+ # If no data was read, treat it as a disconnect. As long as some data was
+ # read, a subsequent call can still return more before reaching the limit.
return self.on_disconnect()
+
self._pos += len(read)
return read
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Werkzeug-2.2.2/tests/conftest.py new/Werkzeug-2.2.3/tests/conftest.py
--- old/Werkzeug-2.2.2/tests/conftest.py 2022-08-08 16:19:18.000000000 +0200
+++ new/Werkzeug-2.2.3/tests/conftest.py 2023-02-14 18:14:28.000000000 +0100
@@ -41,7 +41,9 @@
self.log = None
def tail_log(self, path):
- self.log = open(path)
+ # surrogateescape allows for handling of file streams
+ # containing junk binary values as normal text streams
+ self.log = open(path, errors="surrogateescape")
self.log.read()
def connect(self, **kwargs):
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Werkzeug-2.2.2/tests/live_apps/data_app.py new/Werkzeug-2.2.3/tests/live_apps/data_app.py
--- old/Werkzeug-2.2.2/tests/live_apps/data_app.py 2022-08-08 16:19:18.000000000 +0200
+++ new/Werkzeug-2.2.3/tests/live_apps/data_app.py 2023-02-14 18:14:28.000000000 +0100
@@ -5,12 +5,12 @@
@Request.application
-def app(request):
+def app(request: Request) -> Response:
return Response(
json.dumps(
{
"environ": request.environ,
- "form": request.form,
+ "form": request.form.to_dict(),
"files": {k: v.read().decode("utf8") for k, v in request.files.items()},
},
default=lambda x: str(x),
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Werkzeug-2.2.2/tests/test_formparser.py new/Werkzeug-2.2.3/tests/test_formparser.py
--- old/Werkzeug-2.2.2/tests/test_formparser.py 2022-08-08 16:19:18.000000000 +0200
+++ new/Werkzeug-2.2.3/tests/test_formparser.py 2023-02-14 18:14:28.000000000 +0100
@@ -127,6 +127,15 @@
req.max_form_memory_size = 400
assert req.form["foo"] == "Hello World"
+ req = Request.from_values(
+ input_stream=io.BytesIO(data),
+ content_length=len(data),
+ content_type="multipart/form-data; boundary=foo",
+ method="POST",
+ )
+ req.max_form_parts = 1
+ pytest.raises(RequestEntityTooLarge, lambda: req.form["foo"])
+
def test_missing_multipart_boundary(self):
data = (
b"--foo\r\nContent-Disposition: form-field; name=foo\r\n\r\n"
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Werkzeug-2.2.2/tests/test_http.py new/Werkzeug-2.2.3/tests/test_http.py
--- old/Werkzeug-2.2.2/tests/test_http.py 2022-08-08 16:19:18.000000000 +0200
+++ new/Werkzeug-2.2.3/tests/test_http.py 2023-02-14 18:14:28.000000000 +0100
@@ -354,6 +354,7 @@
assert http.dump_header([1, 2, 3], allow_token=False) == '"1", "2", "3"'
assert http.dump_header({"foo": "bar"}, allow_token=False) == 'foo="bar"'
assert http.dump_header({"foo": "bar"}) == "foo=bar"
+ assert http.dump_header({"foo*": "UTF-8''bar"}) == "foo*=UTF-8''bar"
def test_is_resource_modified(self):
env = create_environ()
@@ -411,7 +412,8 @@
def test_parse_cookie(self):
cookies = http.parse_cookie(
"dismiss-top=6; CP=null*; PHPSESSID=0a539d42abc001cdc762809248d4beed;"
- 'a=42; b="\\";"; ; fo234{=bar;blub=Blah; "__Secure-c"=d'
+ 'a=42; b="\\";"; ; fo234{=bar;blub=Blah; "__Secure-c"=d;'
+ "==__Host-eq=bad;__Host-eq=good;"
)
assert cookies.to_dict() == {
"CP": "null*",
@@ -422,6 +424,7 @@
"fo234{": "bar",
"blub": "Blah",
'"__Secure-c"': "d",
+ "__Host-eq": "good",
}
def test_dump_cookie(self):
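The new `==__Host-eq=bad` case above exercises the changelog fix: a cookie pair whose key is empty is discarded outright rather than having a leading `==` stripped. A simplified sketch of that behavior (werkzeug's real parser also handles quoting and encoding, which this deliberately omits):

```python
def parse_cookie_pairs(header: str) -> dict:
    # Split on ";", split each pair at the first "=", and drop pairs with
    # an empty key, so a pair starting with "=" cannot smuggle a value in
    # under a stripped "==" prefix.
    result = {}
    for part in header.split(";"):
        part = part.strip()
        if not part:
            continue
        key, _, value = part.partition("=")
        if not key:
            continue
        result[key] = value
    return result
```

With this rule, `==__Host-eq=bad` parses to an empty key and is dropped, while the later `__Host-eq=good` wins, matching the expected dict in the test.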
@@ -619,6 +622,9 @@
rv = http.parse_content_range_header("bytes 0-98/*asdfsa")
assert rv is None
+ rv = http.parse_content_range_header("bytes */-1")
+ assert rv is None
+
rv = http.parse_content_range_header("bytes 0-99/100")
assert rv.to_header() == "bytes 0-99/100"
rv.start = None
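The rejected `bytes */-1` case above comes down to validating the length field of a `Content-Range` value. A minimal sketch of such validation (this is not Werkzeug's implementation, which returns a `ContentRange` object; here a negative length simply fails the pattern and yields `None`):

```python
import re

# Accept "bytes <start>-<stop>/<length>" with "*" placeholders; anything
# else, including a negative length like "-1", fails to match.
_content_range_re = re.compile(r"bytes (\*|\d+-\d+)/(\*|\d+)$")


def parse_content_range(value: str):
    match = _content_range_re.match(value.strip())
    if match is None:
        return None
    rng, length = match.groups()
    return rng, (None if length == "*" else int(length))
```

Anchoring the pattern at the end also rejects trailing junk such as `bytes 0-98/*asdfsa`, mirroring the earlier assertion in this test.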
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Werkzeug-2.2.2/tests/test_routing.py new/Werkzeug-2.2.3/tests/test_routing.py
--- old/Werkzeug-2.2.2/tests/test_routing.py 2022-08-08 16:19:18.000000000 +0200
+++ new/Werkzeug-2.2.3/tests/test_routing.py 2023-02-14 18:14:28.000000000 +0100
@@ -163,6 +163,7 @@
r.Rule("/bar/", endpoint="get", methods=["GET"]),
r.Rule("/bar", endpoint="post", methods=["POST"]),
r.Rule("/foo/", endpoint="foo", methods=["POST"]),
+ r.Rule("/<path:var>/", endpoint="path", methods=["GET"]),
]
)
adapter = map.bind("example.org", "/")
@@ -170,6 +171,7 @@
# Check if the actual routes works
assert adapter.match("/bar/", method="GET") == ("get", {})
assert adapter.match("/bar", method="POST") == ("post", {})
+ assert adapter.match("/abc/", method="GET") == ("path", {"var": "abc"})
# Check if exceptions are correct
pytest.raises(r.RequestRedirect, adapter.match, "/bar", method="GET")
@@ -177,6 +179,9 @@
with pytest.raises(r.RequestRedirect) as error_info:
adapter.match("/foo", method="POST")
assert error_info.value.code == 308
+ with pytest.raises(r.RequestRedirect) as error_info:
+ adapter.match("/abc", method="GET")
+ assert error_info.value.new_url == "http://example.org/abc/"
# Check differently defined order
map = r.Map(
@@ -1434,6 +1439,9 @@
[
r.Rule("/path1", endpoint="leaf_path", strict_slashes=False),
r.Rule("/path2/", endpoint="branch_path", strict_slashes=False),
+ r.Rule(
+ "/<path:path>", endpoint="leaf_path_converter", strict_slashes=False
+ ),
],
)
@@ -1443,6 +1451,14 @@
assert adapter.match("/path1/", method="GET") == ("leaf_path", {})
assert adapter.match("/path2", method="GET") == ("branch_path", {})
assert adapter.match("/path2/", method="GET") == ("branch_path", {})
+ assert adapter.match("/any", method="GET") == (
+ "leaf_path_converter",
+ {"path": "any"},
+ )
+ assert adapter.match("/any/", method="GET") == (
+ "leaf_path_converter",
+ {"path": "any/"},
+ )
def test_invalid_rule():
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Werkzeug-2.2.2/tests/test_send_file.py new/Werkzeug-2.2.3/tests/test_send_file.py
--- old/Werkzeug-2.2.2/tests/test_send_file.py 2022-08-08 16:19:18.000000000 +0200
+++ new/Werkzeug-2.2.3/tests/test_send_file.py 2023-02-14 18:14:28.000000000 +0100
@@ -107,6 +107,9 @@
("Vögel.txt", "Vogel.txt", "V%C3%B6gel.txt"),
# ":/" are not safe in filename* value
("те:/ст", '":/"', "%D1%82%D0%B5%3A%2F%D1%81%D1%82"),
+ # general test of extended parameter (non-quoted)
+ ("(тест.txt", '"(.txt"', "(%D1%82%D0%B5%D1%81%D1%82.txt"),
+ ("(test.txt", '"(test.txt"', None),
),
)
def test_non_ascii_name(name, ascii, utf8):
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Werkzeug-2.2.2/tests/test_serving.py new/Werkzeug-2.2.3/tests/test_serving.py
--- old/Werkzeug-2.2.2/tests/test_serving.py 2022-08-08 16:19:18.000000000 +0200
+++ new/Werkzeug-2.2.3/tests/test_serving.py 2023-02-14 18:14:28.000000000 +0100
@@ -125,6 +125,7 @@
assert rv == argv
[email protected]("ignore::pytest.PytestUnraisableExceptionWarning")
@pytest.mark.parametrize("find", [_find_stat_paths, _find_watchdog_paths])
def test_exclude_patterns(find):
# Imported paths under sys.prefix will be included by default.
@@ -254,6 +255,7 @@
@pytest.mark.parametrize("endpoint", ["", "crash"])
[email protected]("ignore::pytest.PytestUnraisableExceptionWarning")
@pytest.mark.dev_server
def test_streaming_close_response(dev_server, endpoint):
"""When using HTTP/1.0, chunked encoding is not supported. Fall
@@ -265,6 +267,7 @@
assert r.data == "".join(str(x) + "\n" for x in range(5)).encode()
[email protected]("ignore::pytest.PytestUnraisableExceptionWarning")
@pytest.mark.dev_server
def test_streaming_chunked_response(dev_server):
"""When using HTTP/1.1, use Transfer-Encoding: chunked for streamed
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Werkzeug-2.2.2/tests/test_wsgi.py new/Werkzeug-2.2.3/tests/test_wsgi.py
--- old/Werkzeug-2.2.2/tests/test_wsgi.py 2022-08-08 16:19:18.000000000 +0200
+++ new/Werkzeug-2.2.3/tests/test_wsgi.py 2023-02-14 18:14:28.000000000 +0100
@@ -1,6 +1,9 @@
+from __future__ import annotations
+
import io
import json
import os
+import typing as t
import pytest
@@ -165,21 +168,63 @@
def test_limited_stream_disconnection():
- io_ = io.BytesIO(b"A bit of content")
-
- # disconnect detection on out of bytes
- stream = wsgi.LimitedStream(io_, 255)
+ # disconnect because stream returns zero bytes
+ stream = wsgi.LimitedStream(io.BytesIO(), 255)
with pytest.raises(ClientDisconnected):
stream.read()
- # disconnect detection because file close
- io_ = io.BytesIO(b"x" * 255)
- io_.close()
- stream = wsgi.LimitedStream(io_, 255)
+ # disconnect because stream is closed
+ data = io.BytesIO(b"x" * 255)
+ data.close()
+ stream = wsgi.LimitedStream(data, 255)
+
with pytest.raises(ClientDisconnected):
stream.read()
+def test_limited_stream_read_with_raw_io():
+ class OneByteStream(t.BinaryIO):
+ def __init__(self, buf: bytes) -> None:
+ self.buf = buf
+ self.pos = 0
+
+ def read(self, size: int | None = None) -> bytes:
+ """Return one byte at a time regardless of requested size."""
+
+ if size is None or size == -1:
+ raise ValueError("expected read to be called with specific limit")
+
+ if size == 0 or len(self.buf) < self.pos:
+ return b""
+
+ b = self.buf[self.pos : self.pos + 1]
+ self.pos += 1
+ return b
+
+ stream = wsgi.LimitedStream(OneByteStream(b"foo"), 4)
+ assert stream.read(5) == b"f"
+ assert stream.read(5) == b"o"
+ assert stream.read(5) == b"o"
+
+ # The stream has fewer bytes (3) than the limit (4), therefore the read returns 0
+ # bytes before the limit is reached.
+ with pytest.raises(ClientDisconnected):
+ stream.read(5)
+
+ stream = wsgi.LimitedStream(OneByteStream(b"foo123"), 3)
+ assert stream.read(5) == b"f"
+ assert stream.read(5) == b"o"
+ assert stream.read(5) == b"o"
+ # The limit was reached, therefore the wrapper is exhausted, not disconnected.
+ assert stream.read(5) == b""
+
+ stream = wsgi.LimitedStream(OneByteStream(b"foo"), 3)
+ assert stream.read() == b"foo"
+
+ stream = wsgi.LimitedStream(OneByteStream(b"foo"), 2)
+ assert stream.read() == b"fo"
+
+
def test_get_host_fallback():
assert (
wsgi.get_host(
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Werkzeug-2.2.2/tox.ini new/Werkzeug-2.2.3/tox.ini
--- old/Werkzeug-2.2.2/tox.ini 2022-08-08 16:19:18.000000000 +0200
+++ new/Werkzeug-2.2.3/tox.ini 2023-02-14 18:14:28.000000000 +0100
@@ -1,24 +1,31 @@
[tox]
envlist =
- py3{11,10,9,8,7},pypy3{8,7}
+ py3{12,11,10,9,8,7}
+ pypy39
style
typing
docs
skip_missing_interpreters = true
[testenv]
+package = wheel
+wheel_build_env = .pkg
deps = -r requirements/tests.txt
commands = pytest -v --tb=short --basetemp={envtmpdir} {posargs}
[testenv:style]
deps = pre-commit
skip_install = true
-commands = pre-commit run --all-files --show-diff-on-failure
+commands = pre-commit run --all-files
[testenv:typing]
+package = wheel
+wheel_build_env = .pkg
deps = -r requirements/typing.txt
commands = mypy
[testenv:docs]
+package = wheel
+wheel_build_env = .pkg
deps = -r requirements/docs.txt
commands = sphinx-build -W -b html -d {envtmpdir}/doctrees docs {envtmpdir}/html
++++++ moved_root.patch ++++++
--- /var/tmp/diff_new_pack.MFq3Cu/_old 2023-03-15 18:53:07.787943635 +0100
+++ /var/tmp/diff_new_pack.MFq3Cu/_new 2023-03-15 18:53:07.791943655 +0100
@@ -2,8 +2,10 @@
tests/test_serving.py | 12 ++++++++----
1 file changed, 8 insertions(+), 4 deletions(-)
---- a/tests/test_serving.py
-+++ b/tests/test_serving.py
+Index: Werkzeug-2.2.3/tests/test_serving.py
+===================================================================
+--- Werkzeug-2.2.3.orig/tests/test_serving.py
++++ Werkzeug-2.2.3/tests/test_serving.py
@@ -10,6 +10,7 @@ from pathlib import Path
import pytest
@@ -12,8 +14,8 @@
from werkzeug import run_simple
from werkzeug._reloader import _find_stat_paths
from werkzeug._reloader import _find_watchdog_paths
-@@ -127,12 +128,15 @@ def test_windows_get_args_for_reloading(
-
+@@ -128,12 +129,15 @@ def test_windows_get_args_for_reloading(
+ @pytest.mark.filterwarnings("ignore::pytest.PytestUnraisableExceptionWarning")
@pytest.mark.parametrize("find", [_find_stat_paths, _find_watchdog_paths])
def test_exclude_patterns(find):
- # Imported paths under sys.prefix will be included by default.