We're no longer using _Py_IDENTIFIER() (or _Py_static_string()) in any core CPython code. They are still used in a number of non-builtin stdlib modules.
The replacement is: PyUnicodeObject fields (not pointers) under _PyRuntimeState, statically initialized as part of _PyRuntime. A new _Py_GET_GLOBAL_IDENTIFIER() macro facilitates lookup of these fields (along with _Py_GET_GLOBAL_STRING() for non-identifier strings).
https://bugs.python.org/issue46541#msg411799 explains the rationale for this change.
The core of the change is in:
* (new) Include/internal/pycore_global_strings.h - the declarations for the global strings, along with the macros
* Include/internal/pycore_runtime_init.h - added the static initializers for the global strings
* Include/internal/pycore_global_objects.h - where the struct in pycore_global_strings.h is hooked into _PyRuntimeState
* Tools/scripts/generate_global_objects.py - added generation of the global string declarations and static initializers
I've also added a --check flag to generate_global_objects.py (along with make check-global-objects) to check for unused global strings. That check is added to the PR CI config.
The remainder of this change updates the core code to use _Py_GET_GLOBAL_IDENTIFIER() instead of _Py_IDENTIFIER() and the related _Py*Id functions (likewise for _Py_GET_GLOBAL_STRING() instead of _Py_static_string()). This includes adding a few functions where there wasn't already an alternative to _Py*Id(), replacing the _Py_Identifier * parameter with PyObject *.
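A rough Python sketch of the kind of scan the --check flag mentioned above performs (hypothetical code, not the actual generate_global_objects.py; the STRUCT_FOR_ID declaration pattern and the scanned paths are assumptions): collect the declared names from pycore_global_strings.h and compare them with the macro uses in the C sources.

```
# Hypothetical sketch of an "unused global identifiers" check.
import re
from pathlib import Path

DECL_RE = re.compile(r"STRUCT_FOR_ID\((\w+)\)")             # assumed pattern
USE_RE = re.compile(r"_Py_GET_GLOBAL_IDENTIFIER\((\w+)\)")

declared = set(DECL_RE.findall(
    Path("Include/internal/pycore_global_strings.h").read_text()))

used = set()
for path in Path(".").glob("**/*.c"):                       # assumed scope
    used.update(USE_RE.findall(path.read_text(errors="replace")))

unused = sorted(declared - used)
if unused:
    print("unused global identifiers:", ", ".join(unused))
    raise SystemExit(1)
```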
The following are not changed (yet):
* stop using _Py_IDENTIFIER() in the stdlib modules
* (maybe) get rid of _Py_IDENTIFIER(), etc. entirely -- this may not be doable, as at least one package on PyPI is using this (private) API
* (maybe) intern the strings during runtime init
https://bugs.python.org/issue46541
Skip the 3 slowest tests of the GitHub Actions Address Sanitizer CI:
* test_tools
* test_peg_generator
* test_concurrent_futures
These tests take between 5 and 20 minutes on this CI, which makes this
CI job the slowest one. Making this job faster makes the whole Python
workflow faster. These tests are still run on all other CIs.
Example of Address Sanitizer output:
10 slowest tests:
- test_peg_generator: 17 min 33 sec
- test_tools: 8 min 27 sec
- test_concurrent_futures: 5 min 24 sec
- test_zipfile: 2 min 41 sec
- test_compileall: 2 min 21 sec
- test_asyncio: 2 min 17 sec
- test_gdb: 1 min 43 sec
- test_weakref: 1 min 35 sec
- test_pickle: 1 min 18 sec
- test_subprocess: 1 min 12 sec
Moreover, test_concurrent_futures also seems to be affected by the
bpo-45200 bug: a libasan deadlock in pthread_create().
Check that users don't push changes made with an outdated or patched autoconf.
The presence of the runstatedir option and of aclocal 1.16.3 are good markers.
Use my container image to regenerate the autoconf files; the "Check for changes"
step will fail later when any file is regenerated.
Use ccache in check_generated_files to speed up testing.
Remove the asyncore and asynchat modules, deprecated in Python 3.6:
use the asyncio module instead (a minimal asyncio example follows the
list below).
Remove the smtpd module, deprecated in Python 3.6: the aiosmtpd module,
which is based on asyncio, can be used instead.
* Remove asyncore, asynchat and smtpd documentation
* Remove test_asyncore, test_asynchat and test_smtpd
* Rename Lib/asynchat.py to Lib/test/support/_asynchat.py
* Rename Lib/asyncore.py to Lib/test/support/_asyncore.py
* Rename Lib/smtpd.py to Lib/test/support/_smtpd.py
* Remove DeprecationWarning from private _asyncore, _asynchat and
_smtpd modules
* _smtpd: remove deprecated properties
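For code that used asyncore or asynchat directly, the replacement is plain asyncio. A minimal illustrative echo server (an assumed example, not code taken from the removed modules):

```
import asyncio

async def handle_echo(reader, writer):
    # Echo a single line back to the client, then close the connection.
    data = await reader.readline()
    writer.write(data)
    await writer.drain()
    writer.close()
    await writer.wait_closed()

async def main():
    server = await asyncio.start_server(handle_echo, "127.0.0.1", 8025)
    async with server:
        await server.serve_forever()

if __name__ == "__main__":
    asyncio.run(main())
```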
Add Modules subdirs to SRCDIRS to generate directories for out-of-tree
object files.
Debian needs the ncurses library (this works on Fedora, too) and also
needs pkg-config to detect the correct flags.
Remove more outdated comments. Makefile now tracks header dependencies.
-lintl is injected by configure when needed. Build _dbm with
gdbm-compat.
Group some modules by purpose. socket, select, and mmap work on Windows,
too.
Frozen modules must be added to several files in order to work properly. Before this change this had to be done manually. Here we add a tool to generate the relevant lines in those files instead. This helps us avoid mistakes and omissions.
https://bugs.python.org/issue45019
- Remove HAVE_X509_VERIFY_PARAM_SET1_HOST check
- Update hashopenssl to require OpenSSL 1.1.1
- multissltests now covers only OpenSSL releases newer than 1.1.0
- ALPN is always supported (a quick runtime sanity check follows this list)
- SNI is always supported
- Remove deprecated NPN code. The Python wrappers are now no-ops.
- ECDH is always supported
- Remove OPENSSL_VERSION_1_1 macro
- Remove locking callbacks
- Drop PY_OPENSSL_1_1_API macro
- Drop HAVE_SSL_CTX_CLEAR_OPTIONS macro
- SSL_CTRL_GET_MAX_PROTO_VERSION is always defined now
- security level is always available now
- get_num_tickets is available with TLS 1.3
- X509_V_ERR MISMATCH is always available now
- Always set SSL_MODE_RELEASE_BUFFERS
- X509_V_FLAG_TRUSTED_FIRST is always available
- get_ciphers is always supported
- SSL_CTX_set_keylog_callback is always available
- Update Modules/Setup with static link example
- Mention PEP in whatsnew
- Drop 1.0.2 and 1.1.0 from GHA tests
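As a rough illustration of what "always supported" means in practice, the corresponding ssl feature flags are unconditionally true when Python is built against OpenSSL 1.1.1 or newer (an assumed sanity check, not part of the change itself):

```
import ssl

# The build now requires OpenSSL 1.1.1+, so these flags can no longer be False.
assert ssl.OPENSSL_VERSION_INFO >= (1, 1, 1), ssl.OPENSSL_VERSION
assert ssl.HAS_ALPN    # ALPN support is no longer optional
assert ssl.HAS_SNI     # SNI support is no longer optional
assert ssl.HAS_ECDH    # ECDH support is no longer optional
print(ssl.OPENSSL_VERSION)
```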
The new checks are only executed when one or more OpenSSL-related files are modified. The checks run a handful of networking and hashing test suites. All SSL checks are optional. This PR also introduces ccache to speed up compilation. In common cases it speeds up configure and compile time from about 90 seconds to less than 30 seconds.
Signed-off-by: Christian Heimes <christian@python.org>
The suspicious check is still executed as part of the release process, and release managers have
lately been fixing some actual errors that the suspicious target can find. For this reason, reactivate
the suspicious check until we decide what to do in a coordinated fashion.
- [x] Build OpenSSL 1.1.1k for macOS
- [x] Build OpenSSL 1.1.1k for Windows
I have also updated the multissl tester and various CI configurations to use the latest OpenSSL. The versions were all over the place.
Signed-off-by: Christian Heimes <christian@python.org>
Automerge-Triggered-By: GH:tiran
The `target-branch` field doesn't seem to support arrays.
Since it defaults to the default branch anyway, we should just remove the `target-branch` field from the config.
Add a private list of all stdlib modules: _Py_module_names.
* Add Tools/scripts/generate_module_names.py script.
* Makefile: Add "make regen-module-names" command.
* setup.py: Add --list-module-names option.
* GitHub Actions and Travis CI also run "make regen-module-names",
not only "make regen-all", to ensure that the module names remain
up to date.
There was a typo: we were checking whether the "GITHUB_BASE_REF" string
literal was empty instead of the $GITHUB_BASE_REF value. When
$GITHUB_BASE_REF is empty, the action that triggered the run was not a
pull request, so we always run the full test suite.
Signed-off-by: Filipe Laíns <lains@riseup.net>
Adding "stale" GitHub Action
Added the "stale" GitHub action to the CPython repo.
PRs older than 30 days will be labeled as stale using the "stale-pr" label.
Closes https://github.com/python/core-workflow/issues/372
Co-authored-by: Brett Cannon <brett@python.org>
It probably helped a lot a while back, but may not be as useful
today. We'll continue monitoring it before deletion, so true
positives can be migrated to rstlint.
The smelly.py script now also checks the Python dynamic library and
extension modules, not only the Python static library. Also make the
script more verbose: it explains what it does.
The GitHub Action job now builds Python with the libpython dynamic
library.
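For context, a rough sketch of the kind of scan smelly.py performs (hypothetical code, not the actual script; the nm invocation and library name are assumptions): list the exported symbols and flag anything that does not use the reserved Py/_Py prefix.

```
import subprocess

def smelly_symbols(library):
    # Dump the dynamic symbol table; defined symbols have three fields
    # (address, type, name), undefined ones only two and are skipped.
    proc = subprocess.run(["nm", "-p", "-D", library],
                          capture_output=True, text=True, check=True)
    smelly = []
    for line in proc.stdout.splitlines():
        parts = line.split()
        if len(parts) < 3:
            continue
        name = parts[-1]
        if not name.startswith(("Py", "_Py")):
            smelly.append(name)
    return smelly

print(smelly_symbols("libpython3.11.so"))   # hypothetical library name
```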
Bumps [actions/upload-artifact](https://github.com/actions/upload-artifact) from v1 to v2.2.0.
Release notes for v2.2.0 (from actions/upload-artifact's releases): support for artifact retention.
Automerge-Triggered-By: GH:Mariatta
Bumps [actions/cache](https://github.com/actions/cache) from v1 to v2.1.2.
Release notes for v2.1.2 (from actions/cache's releases): adds an input to limit the chunk upload size (useful for self-hosted runners with slower upload speeds); no-op when executing on GHES.
Automerge-Triggered-By: GH:Mariatta
On Git 2.28, "git diff master..." (3 dots) no longer works when
"fetch --depth=1" is used, whereas it worked on Git 2.26: the 3-dot form
diffs against the merge base, which may be missing from a shallow clone.
Replace "..." (3 dots) with ".." (2 dots) in the "git diff" command
computing the list of modified files between the base branch and the
PR branch.
This is the initial implementation of PEP 615, the zoneinfo module,
ported from the standalone reference implementation (see
https://www.python.org/dev/peps/pep-0615/#reference-implementation for a
link, which has a more detailed commit history).
This includes (hopefully) all functional elements described in the PEP,
but documentation is found in a separate PR (a short usage example is
included at the end of this entry). This includes:
1. A pure Python implementation of the ZoneInfo class
2. A C accelerated implementation of the ZoneInfo class
3. Tests with 100% branch coverage for the Python code (though C code
coverage is less than 100%).
4. A compile-time configuration option on Linux (though not on Windows)
Differences from the reference implementation:
- The module is arranged slightly differently: the accelerated module is
`_zoneinfo` rather than `zoneinfo._czoneinfo`, which also necessitates
some changes in the test support function. (Suggested by Victor
Stinner and Steve Dower.)
- The tests are arranged slightly differently and do not include the
property tests. The tests live at test/test_zoneinfo/test_zoneinfo.py
rather than test/test_zoneinfo.py or test/test_zoneinfo/__init__.py
because we may do some refactoring in the future that would likely
require this separation anyway; we may:
- include the property tests
- automatically run all the tests against both pure Python and C,
rather than manually constructing C and Python test classes (similar
to the way this works with test_datetime.py, which generates C
and Python test cases from datetimetester.py).
- This includes a compile-time configuration option on Linux (though not
on Windows); added with much help from Thomas Wouters.
- Integration into the CPython build system is obviously different from
building a standalone zoneinfo module wheel.
- This includes configuration to install the tzdata package as part of
CI, though only on the coverage jobs. Introducing a PyPI dependency as
part of the CI build was controversial, and this is seen as less of a
major change, since the coverage jobs already depend on pip and PyPI.
Additional changes that were introduced as part of this PR, most / all of
which were backported to the reference implementation:
- Fixed reference and memory leaks
With much debugging help from Pablo Galindo
- Added smoke tests ensuring that the C and Python modules are built
The import machinery can be somewhat fragile, and the "seamlessly falls
back to pure Python" nature of this module makes it so that a problem
building the C extension or a failure to import the pure Python version
might easily go unnoticed.
- Adjustments to zoneinfo.__dir__
Suggested by Petr Viktorin.
- Slight refactorings as suggested by Steve Dower.
- Removed unnecessary if check on std_abbr
Discovered this because of a missing line in branch coverage.
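A short usage sketch of the new module (assumes time zone data is available, either from the system tz database or the tzdata package mentioned above):

```
from datetime import datetime
from zoneinfo import ZoneInfo

# DST-aware wall time: 2020-10-31 is still PDT (UTC-7) in Los Angeles.
dt = datetime(2020, 10, 31, 12, tzinfo=ZoneInfo("America/Los_Angeles"))
print(dt.isoformat())            # 2020-10-31T12:00:00-07:00
print(dt.astimezone(ZoneInfo("Europe/Amsterdam")).isoformat())
```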
Always run GitHub Actions jobs, even on documentation-only pull
requests, so that it becomes possible to make a job, such as
the Windows (64-bit) job, mandatory.
* 📝 Add a GitHub-specific security page
It will show up at
https://github.com/python/cpython/security/policy
allowing users who get there from the "Security" tab in the
GitHub repo to navigate to the full article explaining the security
vulnerability reporting practices.
Co-Authored-By: Hugo <hugovk@users.noreply.github.com>
./.github/PULL_REQUEST_TEMPLATE.md:8: MD031 Fenced code blocks should be surrounded by blank lines
./.github/PULL_REQUEST_TEMPLATE.md:10: MD031 Fenced code blocks should be surrounded by blank lines
./.github/PULL_REQUEST_TEMPLATE.md:19: MD031 Fenced code blocks should be surrounded by blank lines
./.github/PULL_REQUEST_TEMPLATE.md:21: MD031 Fenced code blocks should be surrounded by blank lines
It's more trouble than it's worth, since AppVeyor only checks the HEAD commit of a PR rather than the full diff against the base branch to decide which files changed.
* Add Lib/test/pythoninfo.py: a script collecting various information
about Python to help debug test failures (a small sketch of the idea
follows below).
* regrtest: remove sys.hash_info and sys.flags from the header.
* Travis CI, AppVeyor: run pythoninfo before tests
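An illustrative sketch of the kind of data such a script gathers (the real Lib/test/pythoninfo.py collects much more and is careful about collection failures):

```
import platform
import sys
import sysconfig

info = {
    "platform.platform": platform.platform(),
    "sys.executable": sys.executable,
    "sys.maxsize": sys.maxsize,
    "sys.version": sys.version,
    "sysconfig.CC": sysconfig.get_config_var("CC"),
}
for key, value in sorted(info.items()):
    print(f"{key}: {value}")
```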
Fix logic for retrying nuget.exe download with Python.
Add support for HOST_PYTHON variable.
Clear internal environment variables used in find_python.bat
Use HOST_PYTHON as the actual Python if it is recent enough.
Adds HOST_PYTHON variable to AppVeyor configuration
This will allow for centralized management of the Codecov config to prevent skew as well as easier management going forward.
Closes python/core-workflow#81.
Badges are small images that give the status of Travis CI and
the coverage percentage from Codecov. They make it easy to check the
status of Travis CI and to get a link to Travis CI.
See also https://shields.io/