The reprlib code was copied here instead of importing reprlib. I'm not sure we really need to avoid the import, but since I expect dataclasses to be more common than reprlib, it seems wise. Plus, the code is small.
Add `max_num_fields` to `cgi.FieldStorage` to make DoS attacks harder by
limiting the number of `MiniFieldStorage` objects created by `FieldStorage`.
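A minimal sketch of how the new keyword could be used (the body and limit here are illustrative, not from the original change):

```python
import io
import cgi

# A urlencoded POST body with three fields, parsed with a limit of two.
body = b"a=1&b=2&c=3"
environ = {
    "REQUEST_METHOD": "POST",
    "CONTENT_TYPE": "application/x-www-form-urlencoded",
    "CONTENT_LENGTH": str(len(body)),
}

try:
    cgi.FieldStorage(fp=io.BytesIO(body), environ=environ, max_num_fields=2)
except ValueError as exc:
    print(exc)  # parsing stops instead of creating an unbounded number of fields
```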
Restores the use of pyexpatns.h to isolate our embedded copy of the expat C
library so that its symbols do not conflict at link or dynamic loading time
with an embedding application or other extension modules with their own
version of libexpat.
5dc3f23b5f (diff-3afaf7274c90ce1b7405f75ad825f545) inadvertently removed it when upgrading expat.
python-gdb.py now handles errors when computing the line number
of a Python frame.
Changes:
* PyFrameObjectPtr.current_line_num() now catches any Exception raised by
  addr2line(), instead of failing with a surprising "<class 'TypeError'>
  'FakeRepr' object is not subscriptable" error (a rough sketch of the new
  behavior follows below).
* All callers of current_line_num() now handle a None return value.
* PyFrameObjectPtr.current_line() now also catches IndexError when
  getting a line from the Python source file.
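A rough, self-contained sketch of the defensive pattern described above (the class and helpers are invented for illustration; this is not the actual python-gdb.py code):

```python
class FrameInfo:
    """Stand-in for PyFrameObjectPtr, illustrating the new error handling."""

    def __init__(self, addr2line, source_lines):
        self._addr2line = addr2line        # may raise on bad gdb values
        self._source_lines = source_lines

    def current_line_num(self):
        try:
            return self._addr2line()
        except Exception:
            # Computing the line number can fail (e.g. on a FakeRepr value);
            # callers must be prepared for a None result.
            return None

    def current_line(self):
        lineno = self.current_line_num()
        if lineno is None:
            return None
        try:
            return self._source_lines[lineno - 1]
        except IndexError:
            # The source file may be shorter than the reported line number.
            return None


def broken_lookup():
    raise TypeError("'FakeRepr' object is not subscriptable")


print(FrameInfo(broken_lookup, ["x = 1"]).current_line())  # prints None
```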
Allow annotated global names in the module namespace after the symbol is
declared as global. Previously, only symbols annotated before being declared
as global (i.e. inside a function) were allowed. This change allows a symbol
to be declared as global before the annotation happens in the global scope.
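For example (a hedged illustration of the newly accepted pattern; before this change the annotation raised a SyntaxError along the lines of "annotated name 'x' can't be global"):

```python
# At module (global) scope: the global declaration may now precede the
# annotation of the same name.
global x
x: int = 1
```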
Unconditional forcing of ``CHECKED_HASH`` invalidation was introduced in
3.7.0 in bpo-29708. The change is bad, as it unconditionally overrides
*invalidation_mode*, even if it was passed as an explicit argument to
``py_compile.compile()`` or ``compileall``. An environment variable
should *never* override an explicit argument to a library function.
That change leads to multiple test failures if the ``SOURCE_DATE_EPOCH``
environment variable is set.
This changes ``py_compile.compile()`` to only look at
``SOURCE_DATE_EPOCH`` if no explicit *invalidation_mode* was specified.
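For illustration (a sketch; the file name is a placeholder), an explicitly passed *invalidation_mode* is now honored even when ``SOURCE_DATE_EPOCH`` is set in the environment:

```python
import pathlib
import py_compile

# Placeholder source file for the example.
pathlib.Path("example.py").write_text("print('hello')\n")

# The explicit mode wins: SOURCE_DATE_EPOCH no longer forces CHECKED_HASH here.
py_compile.compile(
    "example.py",
    invalidation_mode=py_compile.PycInvalidationMode.TIMESTAMP,
)
```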
I also made various relevant tests run with explicit control over the
value of ``SOURCE_DATE_EPOCH``.
While looking at this, I noticed that ``zipimport`` does not work
with hash-based .pycs _at all_, though I left the fixes for
subsequent commits.
When Python is built with the Intel control-flow protection flags,
-mcet -fcf-protection, gdb is not able to read the stack without
actually jumping inside the function. This means an extra
'next' command is required to make the $pc (program counter)
enter the function and expose the function's stack to gdb.
Co-Authored-By: Marcel Plch <gmarcel.plch@gmail.com>
(cherry picked from commit 9b7c74ca32)
Let the .chm document display non-ASCII characters properly
Escape the `body` part of the .chm source file to 7-bit ASCII, to fix how it renders on some MBCS Windows systems.
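The idea, roughly (a simplified sketch, not the actual build-script change):

```python
def escape_body_to_ascii(html_body: str) -> str:
    # Replace non-ASCII characters with numeric character references so the
    # compiled .chm renders them correctly regardless of the system code page.
    return html_body.encode("ascii", "xmlcharrefreplace").decode("ascii")


print(escape_body_to_ascii("café"))  # caf&#233;
```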
This is needed to even run the test suite on buildbots for affected platforms; e.g.:
```
./python.exe ./Tools/scripts/run_tests.py -j 1 -u all -W --slowest --fail-env-changed --timeout=11700 -j2
/home/embray/src/python/test-worker/3.x.test-worker/build/python -u -W default -bb -E -W error::BytesWarning -m test -r -w -j 1 -u all -W --slowest --fail-env-changed --timeout=11700 -j2
Traceback (most recent call last):
File "./Tools/scripts/run_tests.py", line 56, in <module>
main(sys.argv[1:])
File "./Tools/scripts/run_tests.py", line 52, in main
os.execv(sys.executable, args)
PermissionError: [Errno 13] Permission denied
make: *** [Makefile:1073: buildbottest] Error 1
```
The C implementation of asyncio.Task currently fails to perform the
cancellation cleanup correctly in the following scenario.
    async def task1():
        async def task2():
            await task3  # task3 is never cancelled

        asyncio.current_task().cancel()
        await asyncio.create_task(task2())
The actual error is a hardcoded call to `future_cancel()` instead of
calling the `cancel()` method of a future-like object.
Thanks to Vladimir Matveev for noticing the code discrepancy and to
Yury Selivanov for coming up with a pathological scenario.
Fix a reference issue inside multiprocessing.Pool that caused the pool to remain alive if it was deleted without being closed or terminated explicitly.
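A sketch of the scenario described above (the helper function is illustrative):

```python
import multiprocessing


def drop_pool_without_cleanup():
    pool = multiprocessing.Pool(processes=2)
    pool.map(abs, range(4))
    # Neither pool.close() nor pool.terminate() is called; with the fix the
    # pool no longer keeps itself alive after this reference goes away.


if __name__ == "__main__":
    drop_pool_without_cleanup()
```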
property_descr_get() uses a "cached" tuple to optimize function
calls. But this tuple can be discovered in debug mode with
sys.getobjects(). Remove the optimization: it's not really worth it,
and it has caused 3 different crashes over the last years.
Microbenchmark:
./python -m perf timeit -v \
-s "from collections import namedtuple; P = namedtuple('P', 'x y'); p = P(1, 2)" \
--duplicate 1024 "p.x"
Result:
Mean +- std dev: [ref] 32.8 ns +- 0.8 ns -> [patch] 40.4 ns +- 1.3 ns: 1.23x slower (+23%)
* Compiling a string annotation containing a lambda with a keyword-only
  argument without a default value caused a crash (illustrated below).
* Remove the final "*" (it is incorrect syntax) in the representation of
  a lambda without *args and keyword-only arguments when compiling from an AST.
* Improve the representation of a lambda without arguments.
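An illustration of the first item, assuming PEP 563 postponed evaluation (``from __future__ import annotations``) is what triggers compiling the annotation to its string form; the function and annotation are made up for the example:

```python
from __future__ import annotations

# The return annotation is compiled to its string form at definition time;
# a lambda with a keyword-only argument and no default used to crash here.
def f() -> (lambda *, x: x):
    ...

print(f.__annotations__["return"])
```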
Waiting is pretty normal for any asyncio program; logging its duration just
adds noise to the logs without providing any useful information.
https://bugs.python.org/issue34849
Add the filename to the exception when raising {gdbm,dbm.ndbm}.error in the
dbm.gnu.open() and dbm.ndbm.open() functions, so it gets printed when the
exception is raised and can also be obtained from the filename attribute of
the exception object.
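A sketch of the improved error reporting (the path is a placeholder, and this assumes the GNU dbm module is available):

```python
import dbm.gnu

try:
    dbm.gnu.open("/nonexistent/dir/example.db")
except dbm.gnu.error as exc:
    print(exc)           # the error message now includes the filename
    print(exc.filename)  # "/nonexistent/dir/example.db"
```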