ensurepip optionally installs or upgrades 'pip' and 'setuptools' using
the version of those modules bundled with Python. By default, pip's
internal installation routine uses its cache if one exists. This is
undesirable because Python builds and installations may be independent
of the user running the build, whereas the pip cache location depends
on that user's environment and lies outside the build environment.
At the same time, there's no value in using the cache while installing
bundled modules.
This change disables pip caching when pip is invoked from ensurepip.
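A minimal sketch of the effect, assuming a directory of bundled wheels
(the path below is a placeholder, and the actual argument handling in
ensurepip may differ): the install command now runs with pip's real
--no-cache-dir option, so nothing is read from or written to the
per-user cache.

import subprocess
import sys

# Illustrative only: mirror the pip invocation with caching disabled.
# "/path/to/bundled/wheels" stands in for the wheels shipped with Python.
subprocess.run(
    [sys.executable, "-m", "pip", "install",
     "--no-cache-dir",                       # never touch the user's pip cache
     "--no-index",                           # don't contact PyPI
     "--find-links", "/path/to/bundled/wheels",
     "pip", "setuptools"],
    check=True,
)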
The reset_peak function sets the peak memory size to the current size,
representing a resetting of that metric. This allows for recording the
peak of specific sections of code, ignoring other code that may have
had a higher peak (since the most recent `tracemalloc.start()` or
`tracemalloc.clear_traces()` call).
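A short usage sketch of the new function, using only the public
tracemalloc API:

import tracemalloc

tracemalloc.start()

# Setup work with a large transient allocation; it raises the peak.
big = [dict.fromkeys(range(10_000)) for _ in range(10)]
del big

tracemalloc.reset_peak()          # peak is reset to the current traced size

# Only the code below contributes to the new peak.
data = [bytes(1_000) for _ in range(1_000)]

current, peak = tracemalloc.get_traced_memory()
print(f"current={current}, peak={peak}")  # peak reflects the section after reset

tracemalloc.stop()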
This reverts commit 0da5466650.
The commit is causing make failures on a FreeBSD buildbot.
Due to the imminent 3.9.0b1 cutoff, revert this commit for
now pending further investigation.
Add support to the configure script for OBJC and OBJCXX command line options so that the macOS builds can use the clang compiler for the macOS-specific Objective C source files. This allows third-party compilers, like GNU gcc, to be used to build the rest of the project since some of the Objective C system header files are not compilable by GNU gcc.
Co-authored-by: Jeffrey Kintscher <websurfer@surf2c.net>
Co-authored-by: Terry Jan Reedy <tjreedy@udel.edu>
Currently, when an asyncio.wait_for() timeout expires, it cancels the
inner future and then always raises TimeoutError. If that future is a
task, the task can handle the cancellation manually, and that handling
can raise some other exception. The current implementation silently
loses that exception.
To resolve this, wait_for now checks whether the cancellation actually
succeeded. If the inner future raised an exception instead, wait_for
re-raises it.
Co-authored-by: Roman Skurikhin <roman.skurikhin@cruxlab.com>
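A minimal sketch of the wait_for scenario being fixed (the names and
the cleanup exception here are illustrative):

import asyncio

async def inner():
    try:
        await asyncio.sleep(10)
    except asyncio.CancelledError:
        # The task handles cancellation manually and ends up raising
        # a different exception during cleanup.
        raise RuntimeError("cleanup failed")

async def main():
    task = asyncio.create_task(inner())
    try:
        await asyncio.wait_for(task, timeout=0.1)
    except RuntimeError as exc:
        # With this change the cleanup exception is re-raised instead of
        # being silently replaced by TimeoutError.
        print("re-raised:", exc)

asyncio.run(main())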
compileall can now use hardlinks to prevent duplicate files in cases
where the .pyc files for different optimization levels have the same
content.
Co-authored-by: Miro Hrončok <miro@hroncok.cz>
Co-authored-by: Victor Stinner <vstinner@python.org>
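A possible invocation of the compileall feature, assuming it is exposed
as a hardlink_dupes keyword of compileall.compile_dir together with a
list of optimization levels (treat the exact parameter names as an
assumption):

import compileall

# Assumed API sketch: compile for several optimization levels and let
# identical .pyc files be hardlinked instead of duplicated.
compileall.compile_dir(
    "Lib/",
    optimize=[0, 1, 2],     # levels whose .pyc output may be identical
    hardlink_dupes=True,    # assumed keyword enabling the new behaviour
    quiet=1,
)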
Added str.removeprefix and str.removesuffix methods and corresponding
bytes, bytearray, and collections.UserString methods to remove affixes
from a string if present. See PEP 616 for a full description.
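For example:

>>> "TestCase".removeprefix("Test")
'Case'
>>> "example.py".removesuffix(".py")
'example'
>>> b"tmp_data".removeprefix(b"tmp_")
b'data'
>>> "report".removesuffix(".txt")   # affix absent: value returned unchanged
'report'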
This pull request fixes the newline conversion bug originally reported in bpo-1812. When that issue was first submitted, the open builtin did not default to universal newline mode; now it does. That makes the fix simpler: the only code path that needs to change is the one in doctest._load_testfile where the file is loaded from a package whose loader has a get_data method.
* Update ChainMap to include | and |=
Created the __ior__, __or__ and __ror__ methods in the ChainMap class
(see the usage sketch after this list).
* Update ACKS
* Update docs
* Update test_collections.py to include test_issue584().
Added testing for | and |= operators for ChainMap objects.
* Update test_union_operators
Renamed test_union operators, fixed errors and style problems raised by brandtbucher.
* Update test_union_operators in TestChainMap
Added testing for union operator between ChainMap and iterable of key-value pairs.
* Update test_union operators in test_collections.py
Gave more descriptive variable names and eliminated unnecessary tmp variable.
* Update test_union_operators in test_collections.py
Added cm3
* Check .maps rather than ChainMap equality.
* Add news entry
* Update Lib/test/test_collections.py
Co-Authored-By: Brandt Bucher <brandtbucher@gmail.com>
* Removed whitespace
* Added Guido's changes
* Fixed Docs
* Removed whitespace
Co-authored-by: Brandt Bucher <brandtbucher@gmail.com>
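A short usage sketch of the new ChainMap operators (expected results
shown as comments):

from collections import ChainMap

cm = ChainMap({"a": 1}, {"b": 2})

merged = cm | {"c": 3}          # new ChainMap; cm itself is unchanged
print(merged["c"], "c" in cm)   # 3 False

cm |= {"d": 4}                  # updates the first mapping in place
print(cm.maps[0])               # {'a': 1, 'd': 4}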
When `allow_abbrev` was first added, disabling the abbreviation of
long options broke the grouping of short flags ([bpo-26967](https://bugs.python.org/issue26967)). As a fix,
b1e4d1b603 (contained in v3.8) ignores `allow_abbrev=False` for a
given argument string if the string does _not_ start with "--"
(i.e. it doesn't look like a long option).
This fix, however, doesn't take into account that long options can
start with alternative characters specified via `prefix_chars`,
introducing a regression: `allow_abbrev=False` has no effect on long
options that start with an alternative prefix character.
The most minimal fix would be to replace the "starts with --" check
with a "starts with two prefix_chars characters". But
`_get_option_tuples` already distinguishes between long and short
options, so let's instead piggyback off of that check by moving the
`allow_abbrev` condition into `_get_option_tuples`.
https://bugs.python.org/issue39546
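A small reproduction of the regression (the option name is illustrative,
and the pre/post-fix behaviour is described in the comments):

import argparse

parser = argparse.ArgumentParser(prefix_chars='+', allow_abbrev=False)
parser.add_argument('++verbose', action='store_true')

# Before this change, the allow_abbrev=False check only applied to options
# starting with "--", so the abbreviation below was still accepted:
#     parser.parse_args(['++verb'])      # -> Namespace(verbose=True)
# With the fix it is rejected, as requested:
#     parser.parse_args(['++verb'])      # -> error: unrecognized arguments
print(parser.parse_args(['++verbose']))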
The GNU docs describe the `devmajor` and `devminor` fields of the tar
header struct only in the context of character and block special files,
suggesting that in other cases they are not populated. Typical utilities
behave accordingly; this patch teaches `tarfile` to do the same.
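A rough way to observe the intended behaviour (the field offsets follow
the ustar header layout; treat the exact expected bytes as an
assumption):

import io
import tarfile

buf = io.BytesIO()
with tarfile.open(fileobj=buf, mode="w", format=tarfile.GNU_FORMAT) as tar:
    info = tarfile.TarInfo("regular.txt")      # a plain file, not a device node
    data = b"hello"
    info.size = len(data)
    tar.addfile(info, io.BytesIO(data))

header = buf.getvalue()[:512]
# devmajor/devminor occupy bytes 329-336 and 337-344 of the header; for a
# regular file they should now be left blank rather than written as "0000000".
print(repr(header[329:337]), repr(header[337:345]))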
Improve multi-threaded performance by dropping the GIL in the fast path
of bytes.join. To avoid increasing overhead for small joins, it is only
done if the output size exceeds a threshold.
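A rough illustration of the kind of workload that benefits (timings will
vary and the threshold value is an internal detail):

import threading
from time import perf_counter

chunks = [b"x" * 1_000_000] * 64     # large enough for the fast path

def work():
    for _ in range(20):
        b"".join(chunks)

threads = [threading.Thread(target=work) for _ in range(4)]
start = perf_counter()
for t in threads:
    t.start()
for t in threads:
    t.join()
print(f"elapsed: {perf_counter() - start:.2f}s")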
* Reorder the __aenter__ and __aexit__ checks for async with
* Add assertions for async with body being skipped
* Swap __aexit__ and __aenter__ loading in the documentation
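A sketch of the behaviour the new assertions cover (the exact exception
type and message differ between Python versions):

import asyncio

class MissingAexit:
    async def __aenter__(self):
        return self
    # no __aexit__ defined

async def main():
    ran_body = False
    try:
        async with MissingAexit():
            ran_body = True          # must not run if a method is missing
    except (AttributeError, TypeError) as exc:
        print(type(exc).__name__, exc)
    assert not ran_body              # the body was skipped

asyncio.run(main())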
The importlib.metadata documentation uses hardcoded links to internal
pages. This results in minor rendering issues. This change replaces
the hardcoded links with suitable Sphinx roles.
Signed-off-by: Oleg Höfling <oleg.hoefling@gmail.com>
On most platforms, the `environ` symbol is accessible everywhere.
In a dylib on OSX it's not easily accessible; you need to find it with
_NSGetEnviron().
The code was caching the *value* of environ. But a setenv() can change the value,
leaving garbage at the old value. Fix: don't cache the value of environ, just
read it every time.
The regex http.cookiejar.LOOSE_HTTP_DATE_RE was vulnerable to regular
expression denial of service (REDoS).
LOOSE_HTTP_DATE_RE.match is called when using http.cookiejar.CookieJar
to parse Set-Cookie headers returned by a server.
Processing a response from a malicious HTTP server can lead to extreme
CPU usage and execution will be blocked for a long time.
The regex contained multiple overlapping \s* capture groups.
Ignoring the ?-optional capture groups, the regex could be simplified to
\d+-\w+-\d+(\s*\s*\s*)$
Therefore, a long sequence of spaces can trigger bad performance.
Matching a malicious string such as
LOOSE_HTTP_DATE_RE.match("1-c-1" + (" " * 2000) + "!")
caused catastrophic backtracking.
The fix removes ambiguity about which \s* should match a particular
space.
You can create a malicious server that responds with Set-Cookie headers
to attack all Python programs that access it, e.g.:
from http.server import BaseHTTPRequestHandler, HTTPServer

def make_set_cookie_value(n_spaces):
    spaces = " " * n_spaces
    expiry = f"1-c-1{spaces}!"
    return f"b;Expires={expiry}"

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.log_request(204)
        self.send_response_only(204)  # Don't bother sending Server and Date
        n_spaces = (
            int(self.path[1:])  # Can GET e.g. /100 to test shorter sequences
            if len(self.path) > 1 else
            65506  # Max header line length 65536
        )
        value = make_set_cookie_value(n_spaces)
        for i in range(99):  # Not necessary, but we can have up to 100 header lines
            self.send_header("Set-Cookie", value)
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 44020), Handler).serve_forever()
This server returns 99 Set-Cookie headers. Each has 65506 spaces.
Extracting the cookies will pretty much never complete.
Vulnerable client using the example at the bottom of
https://docs.python.org/3/library/http.cookiejar.html :
import http.cookiejar, urllib.request
cj = http.cookiejar.CookieJar()
opener = urllib.request.build_opener(urllib.request.HTTPCookieProcessor(cj))
r = opener.open("http://localhost:44020/")
The popular requests library was also vulnerable without any additional
options (as it uses http.cookiejar by default):
import requests
requests.get("http://localhost:44020/")
* Regression test for http.cookiejar REDoS
If we regress, this test will take a very long time.
* Improve performance of http.cookiejar.ISO_DATE_RE
A string like
"444444" + (" " * 2000) + "A"
could cause poor performance due to the 2 overlapping \s* groups,
although this is not as serious as the REDoS in LOOSE_HTTP_DATE_RE was.