* Issue: gh-103726
This adds a CI check that will keep us from adding new unsupported (i.e. non-const) C global variables, which would break interpreter isolation.
FYI, historically it has been very uncommon for new global variables to be added, and it is rare for new code to break the c-analyzer, so the check should almost always pass unnoticed.
Note that I've removed test_check_c_globals: a test wasn't a great fit conceptually and was very slow on debug builds. A CI check is a better fit.
This also resolves gh-100237.
https://github.com/python/cpython/issues/81057
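For illustration, here is the kind of global the check is meant to catch versus the kind it allows (a hedged sketch; the names are made up and this is not code from the tree):
```c
#include "Python.h"

/* Fine: const data can be shared safely across interpreters. */
static const char * const _messages[] = {"spam", "eggs"};

/* Not fine: writable process-wide state.  With multiple interpreters they
 * would all read and write the same variables, breaking isolation.  The
 * c-analyzer flags globals like these unless they are moved into
 * per-runtime/per-interpreter state (e.g. _PyRuntimeState) or explicitly
 * allowlisted for the analyzer. */
static PyObject *cached_module = NULL;
static int initialized = 0;
```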
* Auto-cancel old builds when new commit pushed to branch
* Add a fallback
Co-authored-by: Ezio Melotti <ezio.melotti@gmail.com>
* Use the same group for all workflows.
Co-authored-by: Ezio Melotti <ezio.melotti@gmail.com>
This effectively reverts the Makefile change in gh-31637. I've added some notes so it is clearer what is going on.
We also update the "Check if generated files are up to date" job to run "make regen-deepfreeze", so that "make regen-global-objects" catches deepfreeze.c.
https://bugs.python.org/issue47146
Skip tests on ASAN and/or MSAN builds:
* multiprocessing tests
* test___all__
* test_concurrent_futures
* test_decimal
* test_peg_generator
* test_tools
The ASAN job of GitHub Actions no longer excludes these tests.
We're no longer using _Py_IDENTIFIER() (or _Py_static_string()) in any core CPython code. They are still used in a number of non-builtin stdlib modules.
The replacement is PyUnicodeObject fields (not pointers) under _PyRuntimeState, statically initialized as part of _PyRuntime. A new _Py_GET_GLOBAL_IDENTIFIER() macro facilitates lookup of the fields (along with _Py_GET_GLOBAL_STRING() for non-identifier strings).
https://bugs.python.org/issue46541#msg411799 explains the rationale for this change.
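As a rough sketch of what that shape looks like (the real declarations are generated, so this is illustrative only):
```c
#include "Python.h"

/* Hedged sketch only -- the real declarations are generated into
 * Include/internal/pycore_global_strings.h.  The point is that each string
 * is an embedded PyUnicodeObject field (not a PyObject * that gets
 * allocated lazily), and the whole struct is statically initialized as part
 * of _PyRuntime by the initializers in pycore_runtime_init.h. */
struct _sketch_global_strings {
    struct {
        PyUnicodeObject __name__;   /* one embedded object per identifier */
        PyUnicodeObject __dict__;
        /* ... one field per identifier, generated by
         * Tools/scripts/generate_global_objects.py ... */
    } identifiers;
};
```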
The core of the change is in:
* (new) Include/internal/pycore_global_strings.h - the declarations for the global strings, along with the macros
* Include/internal/pycore_runtime_init.h - added the static initializers for the global strings
* Include/internal/pycore_global_objects.h - where the struct in pycore_global_strings.h is hooked into _PyRuntimeState
* Tools/scripts/generate_global_objects.py - added generation of the global string declarations and static initializers
I've also added a --check flag to generate_global_objects.py (along with make check-global-objects) to check for unused global strings. That check is added to the PR CI config.
The remainder of this change updates the core code to use _Py_GET_GLOBAL_IDENTIFIER() instead of _Py_IDENTIFIER() and the related _Py*Id functions (likewise for _Py_GET_GLOBAL_STRING() instead of _Py_static_string()). This includes adding a few functions where there wasn't already an alternative to _Py*Id(), replacing the _Py_Identifier * parameter with PyObject *.
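A hedged before/after sketch of that call-site pattern, assuming _Py_GET_GLOBAL_IDENTIFIER() evaluates to a PyObject * for the embedded string (the exact expansion lives in the internal headers):
```c
#include "Python.h"

/* Old pattern: a lazily-interned static identifier, looked up via the
 * _Py*Id helper functions. */
static PyObject *
get_name_old(PyObject *obj)
{
    _Py_IDENTIFIER(__name__);   /* expands to a static _Py_Identifier */
    return _PyObject_GetAttrId(obj, &PyId___name__);
}

/* New pattern (sketch): fetch the statically initialized string that lives
 * in _PyRuntime and use the ordinary PyObject * APIs.  No new reference is
 * assumed to be needed, since the object lives for the whole runtime. */
static PyObject *
get_name_new(PyObject *obj)
{
    PyObject *name = _Py_GET_GLOBAL_IDENTIFIER(__name__);
    return PyObject_GetAttr(obj, name);
}
```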
The following are not changed (yet):
* stop using _Py_IDENTIFIER() in the stdlib modules
* (maybe) get rid of _Py_IDENTIFIER(), etc. entirely -- this may not be doable, as at least one package on PyPI is using this (private) API
* (maybe) intern the strings during runtime init
https://bugs.python.org/issue46541
Skip the 3 slowest tests on the GitHub Actions Address Sanitizer CI:
* test_tools
* test_peg_generator
* test_concurrent_futures
These tests take between 5 and 20 minutes on this CI, which makes it the
slowest CI job. Making this job faster makes the whole Python workflow
faster. These tests are still run on all the other CIs.
Example of Address Sanitizer output:
10 slowest tests:
- test_peg_generator: 17 min 33 sec
- test_tools: 8 min 27 sec
- test_concurrent_futures: 5 min 24 sec
- test_zipfile: 2 min 41 sec
- test_compileall: 2 min 21 sec
- test_asyncio: 2 min 17 sec
- test_gdb: 1 min 43 sec
- test_weakref: 1 min 35 sec
- test_pickle: 1 min 18 sec
- test_subprocess: 1 min 12 sec
Moreover, test_concurrent_futures also seems to be affected by the
bpo-45200 bug: a libasan deadlock in pthread_create().
Check that users don't push changes made with an outdated or patched autoconf.
The presence of the runstatedir option and aclocal 1.16.3 are good markers.
Use my container image to regenerate the autoconf files; the "Check for changes"
step will then fail if regeneration modifies any file.
Use ccache in check_generated_files to speed up testing.
Frozen modules must be added to several files in order to work properly. Before this change, this had to be done manually. Here we add a tool to generate the relevant lines in those files instead, which helps us avoid mistakes and omissions.
https://bugs.python.org/issue45019
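For illustration, this is roughly the kind of generated C such a tool keeps in sync: a hedged sketch using the classic three-field form of struct _frozen (the in-tree struct has grown additional fields over time), with a made-up module name and bytecode array:
```c
#include "Python.h"

/* Hypothetical frozen bytecode blob; the real arrays are generated into
 * headers by the freezing step. */
static const unsigned char _Py_M__toy[] = {0x63, 0x00, 0x00, 0x00};

/* A generated entry in the frozen-module table (plus a sentinel), of the
 * kind the tool emits so humans don't have to keep it in sync by hand. */
static const struct _frozen _toy_frozen_modules[] = {
    {"toy", _Py_M__toy, (int)sizeof(_Py_M__toy)},
    {0, 0, 0},  /* sentinel */
};
```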