📦 Bump versions of multiple dependencies to address vulnerabilities #3
Lineaje has automatically created this pull request to resolve the following CVEs:
A vulnerability was discovered in the PyYAML library in
versions before 5.4, where it is susceptible to arbitrary
code execution when it processes untrusted YAML files through
the `full_load` method or with the `FullLoader` loader.
Applications that use the library to process untrusted input
may be vulnerable to this flaw. This flaw allows an attacker
to execute arbitrary code on the system by abusing the
python/object/new constructor. This flaw is due to an
incomplete fix for CVE-2020-1747.
`multipart/form-data` requests are vulnerable to resource exhaustion. A specially
crafted form body can bypass the `Request.max_form_memory_size` setting.
The `Request.max_content_length` setting, as well as resource
limits provided by deployment software and platforms, are
also available to limit the resources used during a request.
This vulnerability does not affect those settings. All three
types of limits should be considered and set appropriately
when deploying an application.
`os.path.isabs()` does not catch UNC paths like
`//server/share`. Werkzeug's `safe_join()` relies on this check, and so can produce a path that is not
safe, potentially allowing unintended access to data.
Applications using Python >= 3.11, or not using Windows, are
not vulnerable.
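As an illustration of the failure mode, a small stdlib-only sketch (the helper name is hypothetical, not Werkzeug's API) shows how a UNC-aware absoluteness check differs from a bare `ntpath.isabs()` on older interpreters:

```python
import ntpath

def is_windows_absolute(path: str) -> bool:
    # Hypothetical helper: treat UNC paths (//server/share or \\server\share)
    # as absolute even on interpreters where ntpath.isabs() predates the
    # Python 3.11 fix and misses them.
    normalized = path.replace("\\", "/")
    return ntpath.isabs(path) or normalized.startswith("//")

# A bare isabs() check on older Pythons can miss UNC paths,
# which is exactly the gap safe_join() inherited.
print(is_windows_absolute("//server/share"))   # True
print(is_windows_absolute("static/app.css"))   # False
```

The fix is the same in spirit: classify UNC paths as absolute before they reach any join logic.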
Werkzeug's multipart form data parser will parse an unlimited
number of parts, including file parts. Parts can be a small
amount of bytes, but each requires CPU time to parse and may
use more memory as Python data. If a request can be made to
an endpoint that accesses `request.data`, `request.form`, `request.files`, or
`request.get_data(parse_form_data=False)`, it can cause
unexpectedly high resource usage. This allows an attacker to
cause a denial of service by sending crafted multipart data
to an endpoint that will parse it. The amount of CPU time
required can block worker processes from handling legitimate
requests. The amount of RAM required can trigger an out of
memory kill of the process. Unlimited file parts can use up
memory and file handles. If many concurrent requests are sent
continuously, this can exhaust or kill all available workers.
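The mitigation Werkzeug adopted is a cap on the number of parsed parts. A minimal stdlib-only sketch of the idea (the names are illustrative, not Werkzeug's internals):

```python
MAX_FORM_PARTS = 1000  # illustrative cap, analogous in spirit to Werkzeug's max_form_parts

def limit_parts(parts: list) -> list:
    # Reject requests whose multipart body contains an excessive number
    # of parts before spending CPU time and memory parsing each one.
    if len(parts) > MAX_FORM_PARTS:
        raise ValueError("request entity contains too many form parts")
    return parts

limit_parts(["field1", "field2"])   # fine
# limit_parts(["x"] * 10_001)       # would raise ValueError
```

Capping part counts up front bounds both the CPU time and the Python-object memory a single request can consume.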
Browsers may allow "nameless" cookies that look like `=value` instead of
`key=value`. A vulnerable browser may allow a
compromised application on an adjacent subdomain to exploit
this to set a cookie like `=__Host-test=bad` for another
subdomain. Werkzeug <= 2.2.2 will parse the cookie
`=__Host-test=bad` as `__Host-test=bad`. If a Werkzeug
application is running next to a vulnerable or malicious
subdomain which sets such a cookie using a vulnerable
browser, the Werkzeug application will see the bad cookie
value but the valid cookie key.
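A minimal sketch of the corrected parsing behavior (a hypothetical helper, not Werkzeug's actual parser): a pair with an empty name is discarded rather than having a name re-split out of the value:

```python
def parse_cookie_pair(pair: str):
    # Split only on the first "=" and reject nameless cookies, so
    # "=__Host-test=bad" is ignored instead of becoming __Host-test=bad.
    name, sep, value = pair.partition("=")
    if not name or not sep:
        return None
    return name, value

print(parse_cookie_pair("session=abc123"))      # ('session', 'abc123')
print(parse_cookie_pair("=__Host-test=bad"))    # None
```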
In Werkzeug's multipart parser, the boundary may be between
consecutive chunks, which is why parsing is based on looking
for newline characters. Unfortunately, the code that looks for
a partial boundary in the buffer is written inefficiently: if
an upload starts with CR or LF and is then followed by
megabytes of data without these characters, all of these bytes
are appended chunk by chunk to an internal bytearray, and the
lookup for the boundary is performed on the growing buffer.
This allows an attacker to cause a denial
of service by sending crafted multipart data to an endpoint
that will parse it. The amount of CPU time required can block
worker processes from handling legitimate requests. The
amount of RAM required can trigger an out of memory kill of
the process. If many concurrent requests are sent
continuously, this can exhaust or kill all available workers.
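A linear-time alternative can be sketched in a few lines: only the last `len(boundary) - 1` bytes of previously seen data need to be re-examined when a new chunk arrives, so the retained buffer never grows with the upload (a simplified illustration, not Werkzeug's actual implementation):

```python
def stream_find(chunks, boundary: bytes) -> int:
    """Return the absolute offset of `boundary` across streamed chunks, or -1.

    Only the last len(boundary) - 1 bytes of earlier data are kept as a
    carry-over, so the total work is linear in the input size instead of
    rescanning an ever-growing buffer.
    """
    carry = b""
    offset = 0  # absolute position of the start of `carry + chunk`
    for chunk in chunks:
        window = carry + chunk
        idx = window.find(boundary)
        if idx != -1:
            return offset + idx
        keep = len(boundary) - 1
        carry = window[-keep:] if keep else b""
        offset += len(window) - len(carry)
    return -1
```

Because `carry` is bounded by the boundary length, a multi-megabyte prefix without newlines costs one scan per chunk rather than repeated scans of the accumulated data.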
The Werkzeug debugger may allow an attacker to execute code on a developer's machine under some
circumstances. This requires the attacker to get the
developer to interact with a domain and subdomain they
control, and enter the debugger PIN, but if they are
successful it allows access to the debugger even if it is
only running on localhost. This also requires the attacker to
guess a URL in the developer's application that will trigger
the debugger.
Processing a maliciously formatted PKCS12 file may lead
OpenSSL to crash, leading to a potential Denial of
Service attack. Impact summary: Applications loading files in
the PKCS12 format from untrusted sources might terminate
abruptly. A file in PKCS12 format can contain certificates
and keys and may come from an untrusted source. The PKCS12
specification allows certain fields to be NULL, but OpenSSL
does not correctly check for this case. This can lead to a
NULL pointer dereference that results in OpenSSL crashing. If
an application processes PKCS12 files from an untrusted
source using the OpenSSL APIs then that application will be
vulnerable to this issue. OpenSSL APIs that are vulnerable to
this are: PKCS12_parse(), PKCS12_unpack_p7data(),
PKCS12_unpack_p7encdata(), PKCS12_unpack_authsafes() and
PKCS12_newpass(). We have also fixed a similar issue in
SMIME_write_PKCS7(). However since this function is related
to writing data we do not consider it security significant.
The FIPS modules in 3.2, 3.1 and 3.0 are not affected by this
issue.
RSA decryption was vulnerable to Bleichenbacher timing
vulnerabilities, which would impact people using RSA
decryption in online scenarios. This is fixed in cryptography
3.2.
The cryptography wheels include a statically linked copy
of OpenSSL. The versions of OpenSSL included in cryptography
0.8.1-39.0.0 are vulnerable to a security issue. More details
about the vulnerabilities themselves can be found in
https://www.openssl.org/news/secadv/20221213.txt and
https://www.openssl.org/news/secadv/20230207.txt. If you are
building cryptography source ("sdist") then you are
responsible for upgrading your copy of OpenSSL. Only users
installing from wheels built by the cryptography project
(i.e., those distributed on PyPI) need to update their
cryptography versions.
`Cipher.update_into` would accept Python objects
which implement the buffer protocol, but provide only
immutable buffers:

```pycon
>>> outbuf = b"\x00" * 32
>>> c = ciphers.Cipher(AES(b"\x00" * 32), modes.ECB()).encryptor()
>>> c.update_into(b"\x00" * 16, outbuf)
16
>>> outbuf
b'\xdc\x95\xc0x\xa2@\x89\x89\xadH\xa2\x14\x92\x84 \x87\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
```

This would allow immutable objects (such as `bytes`) to
be mutated, thus violating fundamental rules of Python. This
is a soundness bug -- it allows programmers to misuse an API;
it cannot be exploited by attacker-controlled data alone.
This now correctly raises an exception. This issue has been
present since `update_into` was originally introduced in
cryptography 1.8.
The cryptography wheels include a statically linked copy
of OpenSSL. The versions of OpenSSL included in cryptography
0.5-40.0.2 are vulnerable to a security issue. More details
about the vulnerability itself can be found in
https://www.openssl.org/news/secadv/20230530.txt. If you are
building cryptography source ("sdist") then you are
responsible for upgrading your copy of OpenSSL. Only users
installing from wheels built by the cryptography project
(i.e., those distributed on PyPI) need to update their
cryptography versions.
The cryptography wheels include a statically linked copy
of OpenSSL. The versions of OpenSSL included in cryptography
0.8-41.0.2 are vulnerable to several security issues. More
details about the vulnerabilities themselves can be found in
https://www.openssl.org/news/secadv/20230731.txt,
https://www.openssl.org/news/secadv/20230719.txt, and
https://www.openssl.org/news/secadv/20230714.txt. If you are
building cryptography source ("sdist") then you are
responsible for upgrading your copy of OpenSSL. Only users
installing from wheels built by the cryptography project
(i.e., those distributed on PyPI) need to update their
cryptography versions.
A flaw was found in python-cryptography. This
issue may allow a remote attacker to decrypt captured
messages in TLS servers that use RSA key exchanges, which may
lead to exposure of confidential or sensitive data.
When all of the following conditions are met, a response
containing data intended for one client may be cached and
subsequently sent by a proxy to other clients. If the proxy
also caches `Set-Cookie` headers, it may send one client's
`session` cookie to other clients. The severity depends on
the application's use of the session and the proxy's
behavior regarding cookies. The risk depends on all these
conditions being met:

1. The application must be hosted behind a caching proxy that does not strip cookies or ignore responses with cookies.
2. The application sets `session.permanent = True`.
3. The application does not access or modify the session at any point during a request.
4. `SESSION_REFRESH_EACH_REQUEST` is enabled (the default).
5. The application does not set a `Cache-Control` header to indicate that a page is private or should not be cached.

This happens because vulnerable versions of Flask only set the
`Vary: Cookie` header when the session is accessed or modified,
not when it is refreshed (re-sent to update the expiration)
without being accessed or modified.
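Condition 5 above is the one an application can most easily control. A stdlib-only sketch (the helper is hypothetical, not Flask API) of marking session-bearing responses as uncacheable:

```python
def protect_session_response(headers: dict) -> dict:
    # Tell shared caches this response is per-user: a proxy must not
    # store it, and cached variants must be keyed on the Cookie header.
    headers["Cache-Control"] = "private, no-store"
    headers.setdefault("Vary", "Cookie")
    return headers

h = protect_session_response({"Set-Cookie": "session=abc; HttpOnly"})
print(h["Cache-Control"])  # private, no-store
```

Setting `Cache-Control` explicitly removes the dependence on Flask's `Vary: Cookie` behavior entirely.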
CWE-20: Improper Input Validation vulnerability in Flask that
can result in a large amount of memory usage, possibly leading
to denial of service. This attack appears to be exploitable
when an attacker provides JSON data in an incorrect encoding. This
vulnerability appears to have been fixed in 0.12.3.
Flask before 1.0 is affected by unexpected memory usage. The
impact is denial of service. The attack vector is crafted
encoded JSON data. The fixed version is 1.0. NOTE: this may
overlap CVE-2018-1000656.
Requests versions prior to 2.32.4 may leak `.netrc` credentials to third parties
for specific maliciously-crafted URLs.

### Workarounds

For older versions of Requests, use of the `.netrc` file can be
disabled with `trust_env=False` on your Requests Session (docs).

### References

psf/requests#6965
https://seclists.org/fulldisclosure/2025/Jun/2
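The workaround can be sketched as follows (this assumes the `requests` package is installed; no network traffic is made):

```python
import requests

session = requests.Session()
session.trust_env = False  # ignore environment config, including ~/.netrc credentials

# With trust_env disabled, Requests will not consult .netrc when
# building Authorization headers for requests made on this session.
print(session.trust_env)  # False
```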
Since Requests v2.3.0, Requests has been vulnerable to potentially leaking
`Proxy-Authorization` headers to destination servers, specifically during redirects
to an HTTPS origin. This is a product of how `rebuild_proxies`
is used to recompute and reattach the `Proxy-Authorization` header
to requests when redirected. Note this behavior has only
been observed to affect proxied requests when credentials are
supplied in the URL user information component (e.g.
`https://username:password@proxy:8080`). Current vulnerable
behavior(s):

1. HTTP → HTTPS: leak
2. HTTPS → HTTP: no leak
3. HTTPS → HTTPS: leak
4. HTTP → HTTP: no leak

For HTTP connections sent through the proxy, the
proxy will identify the header in the request itself and
remove it prior to forwarding to the destination server.
However, when sent over HTTPS, the `Proxy-Authorization`
header must be sent in the CONNECT request, as the proxy has
no visibility into further tunneled requests. This results in
Requests forwarding the header to the destination server
unintentionally, allowing a malicious actor to potentially
exfiltrate those credentials. The reason this currently works
for HTTPS connections in Requests is that the
`Proxy-Authorization` header is also handled by urllib3 with
our usage of the ProxyManager in adapters.py with
`proxy_manager_for`. This will compute the required proxy headers in
`proxy_headers` and pass them to the ProxyManager, avoiding
attaching them directly to the Request object. This will be
our preferred option going forward for default usage.

### Patches

Starting in Requests v2.31.0, Requests will no longer
attach this header to redirects with an HTTPS destination.
This should have no negative impacts on the default behavior
of the library, as the proxy credentials are already properly
being handled by urllib3's ProxyManager. For users with
custom adapters, this may be potentially breaking if you
were already working around this behavior. The previous
functionality of `rebuild_proxies` doesn't make sense in any
case, so we would encourage any users impacted to migrate any
handling of `Proxy-Authorization` directly into their custom
adapter.

### Workarounds

For users who are not able to update
Requests immediately, there is one potential workaround. You
may disable redirects by setting `allow_redirects` to `False`
on all calls through Requests' top-level APIs. Note that if
you're currently relying on redirect behaviors, you will need
to capture the 3xx response codes and ensure a new request is
made to the redirect destination.

```python
import requests

r = requests.get('http://github.com/', allow_redirects=False)
```

### Credits

This vulnerability was discovered and disclosed by the following individuals:

Dennis Brinkrolf, Haxolot (https://haxolot.com/)
Tobias Funke ([email protected])
When making requests through a Requests Session, if the
first request is made with `verify=False` to disable cert
verification, all subsequent requests to the same origin will
continue to ignore cert verification regardless of changes to
the value of `verify`. This behavior will continue for the
lifecycle of the connection in the connection pool.

### Remediation

Any of these options can be used to remediate the
current issue; we highly recommend upgrading as the preferred
mitigation.

* Upgrade to `requests>=2.32.0`.
* For `requests<2.32.0`, avoid setting `verify=False` for the first request to a host while using a Requests Session.
* For `requests<2.32.0`, call `close()` on `Session` objects to clear existing connections if `verify=False` is used.

### Related Links

* psf/requests#6655
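The `close()` remediation can be sketched as follows (assumes the `requests` package; no network traffic is made here):

```python
import requests

s = requests.Session()
# ... suppose an earlier call on this session used verify=False ...
s.close()  # discard pooled (unverified) connections before making verified requests

verified = requests.Session()  # a fresh session's connections will verify certs
```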
urllib3 handles redirects and retries using the same
mechanism, which is controlled by the `Retry` object. The
most common way to disable redirects is at the request level,
as follows:

```python
resp = urllib3.request("GET", "https://httpbin.org/redirect/1", redirect=False)
print(resp.status)
# 302
```

However, it is also possible to
disable redirects, for all requests, by instantiating a
`PoolManager` and specifying `retries` in a way that disables
redirects:

```python
import urllib3

http = urllib3.PoolManager(retries=0)  # should raise MaxRetryError on redirect
http = urllib3.PoolManager(retries=urllib3.Retry(redirect=0))  # equivalent to the above
http = urllib3.PoolManager(retries=False)  # should return the first response
resp = http.request("GET", "https://httpbin.org/redirect/1")
```

However, the `retries` parameter is currently ignored, which means all the above
examples don't disable redirects.

## Affected usages

Passing `retries` on `PoolManager` instantiation to disable redirects
or restrict their number. By default, requests and botocore
users are not affected.

## Impact

Redirects are often used to
exploit SSRF vulnerabilities. An application attempting to
mitigate SSRF or open redirect vulnerabilities by disabling
redirects at the `PoolManager` level will remain vulnerable.

## Remediation

You can remediate this vulnerability with the
following steps:

* Upgrade to a patched version of urllib3. If your organization would benefit from the continued support of urllib3 1.x, please contact [email protected] to discuss sponsorship or contribution opportunities.
* Disable redirects at the `request()` level instead of the `PoolManager()` level.

urllib3 does not remove the `Authorization` HTTP
header when following a cross-origin redirect (i.e., a
redirect that differs in host, port, or scheme). This can
allow for credentials in the authorization header to be
exposed to unintended hosts or transmitted in cleartext.
NOTE: this issue exists because of an incomplete fix for
CVE-2018-20060 (which was case-sensitive).
certain cases where the desired set of CA certificates is
different from the OS store of CA certificates, which results
in SSL connections succeeding in situations where a
verification failure is the correct outcome. This is related
to use of the
ssl_context,ca_certs, orca_certs_dirargument.
CRLF injection is possible if the attacker controls the request
parameter.
CRLF injection is possible if the attacker
controls the HTTP request method, as demonstrated by
inserting CR and LF control characters in the first argument
of `putrequest()`. NOTE: this is similar to CVE-2020-26116.

When using urllib3's proxy support with `ProxyManager`, the
`Proxy-Authorization` header is only sent to the configured
proxy, as expected. However, when sending HTTP requests
without using urllib3's proxy support, it's possible to
accidentally configure the `Proxy-Authorization` header even
though it won't have any effect, as the request is not using a
forwarding proxy or a tunneling proxy. In those cases,
urllib3 doesn't treat the `Proxy-Authorization` HTTP header
as one carrying authentication material and thus doesn't
strip the header on cross-origin redirects. Because this is a
highly unlikely scenario, we believe the severity of this
vulnerability is low for almost all users. Out of an
abundance of caution, urllib3 will automatically strip the
`Proxy-Authorization` header during cross-origin redirects to
avoid the small chance that users are doing this by accident.
Users should use urllib3's proxy support or disable automatic
redirects to achieve safe processing of the
`Proxy-Authorization` header, but we still decided to strip
the header by default in order to further protect users who
aren't using the correct approach.

## Affected usages

We believe the number of usages affected by this advisory is
low. It requires all of the following to be true to be
exploited:

* Setting the `Proxy-Authorization` header without using urllib3's built-in proxy support.
* Not disabling HTTP redirects.
* Either not using an HTTPS origin server, or for the proxy or target origin to redirect to a malicious origin.

## Remediation

* Using the `Proxy-Authorization` header with urllib3's `ProxyManager`.
* Disabling HTTP redirects using `redirects=False` when sending requests.
* Not using the `Proxy-Authorization` header.

urllib3 doesn't treat the `Cookie` HTTP header special or
provide any helpers for managing cookies over HTTP; that is
the responsibility of the user. However, it is possible for a
user to specify a `Cookie` header and unknowingly leak
information via HTTP redirects to a different origin if that
user doesn't disable redirects explicitly. Users must
handle redirects themselves instead of relying on urllib3's
automatic redirects to achieve safe processing of the
`Cookie` header, thus we decided to strip the header by
default in order to further protect users who aren't using
the correct approach.

## Affected usages

We believe the number of usages affected by this advisory is
low. It requires all of the following to be true to be
exploited:

* Using an affected version of urllib3 (patched in v1.26.17 and v2.0.6).
* Using the `Cookie` header on requests, which is mostly typical for impersonating a browser.
* Not disabling HTTP redirects.
* Either not using HTTPS, or for the origin server to redirect to a malicious origin.

## Remediation

* Upgrading to at least urllib3 v1.26.17 or v2.0.6.
* Disabling HTTP redirects using `redirects=False` when sending requests.
* Not using the `Cookie` header.

urllib3 previously wouldn't remove the HTTP request body when
an HTTP redirect response using status 303 "See Other" was received after
the request had its method changed from one that could accept
a request body (like `POST`) to `GET`, as is required by HTTP
RFCs. Although the behavior of removing the request body is
not specified in the section for redirects, it can be
inferred by piecing together information from different
sections and we have observed the behavior in other major
HTTP client implementations like curl and web browsers. From
RFC 9110 Section
9.3.1:
> A client SHOULD NOT generate content in a GET request
unless it is made directly to an origin server that has
previously indicated, in or out of band, that such a request
has a purpose and will be adequately supported.

## Affected usages

Because the vulnerability requires a previously
trusted service to become compromised in order to have an
impact on confidentiality, we believe the exploitability of
this vulnerability is low. Additionally, many users aren't
putting sensitive data in HTTP request bodies; if this is the
case, then this vulnerability isn't exploitable. Both of the
following conditions must be true to be affected by this
vulnerability:

* You're using urllib3 and submitting sensitive information in the HTTP request body (such as form data or JSON).
* The origin service is compromised and starts redirecting using 303 to a malicious peer, or the redirected-to service becomes compromised.

## Remediation

You can remediate this vulnerability with any of the following
steps:

* Upgrade to a patched version of urllib3 (v1.26.18 or v2.0.7).
* Disable redirects for services that you aren't expecting to respond with redirects with `redirects=False`.
* Disable automatic redirects with `redirects=False` and handle 303 redirects manually by stripping the HTTP request body.
Thank you for reviewing the update! 🚀