* pycore_pythread.h is now the central place to ensure that the
  _POSIX_THREADS and _POSIX_SEMAPHORES macros are defined if
  available.
* Make sure that pycore_pythread.h is included wherever the
  _POSIX_THREADS and _POSIX_SEMAPHORES macros are tested.
* PY_TIMEOUT_MAX is now defined as a constant rather than a macro,
  since its value depends on _POSIX_THREADS.
* Prevent integer overflow in the preprocessor when computing
  PY_TIMEOUT_MAX_VALUE on Windows:
  replace "0xFFFFFFFELL * 1000 < LLONG_MAX"
  with "0xFFFFFFFELL < LLONG_MAX / 1000" (see the sketch after this list).
* Document the change and give hints on how to fix affected code.
* Add an exception for the PY_TIMEOUT_MAX name to smelly.py.
* Add PY_TIMEOUT_MAX to the stable ABI.
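For illustration, here is a minimal sketch of the overflow-safe test (MAX_TIMEOUT_MS is a made-up macro name, not the real PY_TIMEOUT_MAX definition): the rewritten condition never forms the product in the test itself, so the multiplication is only performed once the comparison has proven it is in range.

```c
#include <limits.h>

/* Sketch only: MAX_TIMEOUT_MS is a hypothetical macro for illustration.
 * The condition "0xFFFFFFFELL < LLONG_MAX / 1000" proves that
 * 0xFFFFFFFELL * 1000 fits in a long long before the product is formed,
 * so the preprocessor never evaluates an expression that could overflow. */
#if 0xFFFFFFFELL < LLONG_MAX / 1000
#  define MAX_TIMEOUT_MS (0xFFFFFFFELL * 1000)
#else
#  define MAX_TIMEOUT_MS LLONG_MAX
#endif
```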
Fix a race condition in "make regen-all". The deepfreeze.c source and
files generated by Argument Clinic are now generated or updated
before the "global objects" are generated. Previously, some identifiers
could be missing depending on the order in which these files were generated.
* "make regen-global-objects": Make sure that deepfreeze.c is
generated and up to date, and always run "make clinic".
* "make clinic" no longer runs generate_global_objects.py script.
* "make regen-deepfreeze" now only updates deepfreeze.c (C file).
It doesn't build deepfreeze.o (object) anymore.
* Remove misleading messages from "make regen-global-objects" and
  "make clinic". They were outdated; these commands are now safe
  to use.
* Document the generated files in Doc/using/configure.rst.
Co-authored-by: Erlend E. Aasland <erlend@python.org>
Output with one wheel:
```
❯ GITHUB_ACTIONS=true ./Tools/build/verify_ensurepip_wheels.py
Verifying checksum for /Volumes/RAMDisk/cpython/Lib/ensurepip/_bundled/pip-23.2.1-py3-none-any.whl.
Expected digest: 7ccf472345f20d35bdc9d1841ff5f313260c2c33fe417f48c30ac46cccabf5be
Actual digest: 7ccf472345f20d35bdc9d1841ff5f313260c2c33fe417f48c30ac46cccabf5be
::notice file=/Volumes/RAMDisk/cpython/Lib/ensurepip/_bundled/pip-23.2.1-py3-none-any.whl::Successfully verified the checksum of the pip wheel.
```
Output with two wheels:
```
❯ GITHUB_ACTIONS=true ./Tools/build/verify_ensurepip_wheels.py
::error file=/Volumes/RAMDisk/cpython/Lib/ensurepip/_bundled/pip-22.0.4-py3-none-any.whl::Found more than one wheel for package pip.
::error file=/Volumes/RAMDisk/cpython/Lib/ensurepip/_bundled/pip-23.2.1-py3-none-any.whl::Found more than one wheel for package pip.
```
Output without wheels:
```
❯ GITHUB_ACTIONS=true ./Tools/build/verify_ensurepip_wheels.py
::error file=::Could not find a pip wheel on disk.
```
Argument Clinic now has partial support for the
Limited API:
* Add --limited option to clinic.py.
* Add a '_testclinic_limited' extension which is built with
  the limited C API version 3.13 (see the sketch after this list).
* For now, hardcode in clinic.py that "_testclinic_limited.c" targets
the limited C API.
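A rough sketch of what a limited-API extension source processed by Argument Clinic might look like (the module and function names are invented, and the generated code that clinic.py would insert is omitted):

```c
/* Hypothetical example; "_example_limited" and "echo" are made-up names. */
#define Py_LIMITED_API 0x030d0000   /* target the limited C API, version 3.13 */
#include "Python.h"

/*[clinic input]
module _example_limited
[clinic start generated code]*/

/*[clinic input]
_example_limited.echo

    arg: object
    /

Return *arg* unchanged.
[clinic start generated code]*/
/* clinic.py would insert the generated argument-parsing code here,
   restricted to functions available in the limited C API. */
```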
No longer export the _PyUnicode_FromId() internal C API function.
Change the comment style to "// comment" and add a comment explaining why
the other functions still have to be exported.
Update Tools/build/generate_token.py so that it also refreshes the comments
in Include/internal/pycore_token.h.
* Instead of calling get_identifiers_and_strings(), extract the identifiers and strings directly from pycore_global_strings.h.
* Avoid ast.literal_eval(); it is very slow.
We tried this before with a dict and for all interned strings. That ran into problems due to interpreter isolation. However, exclusively using a per-interpreter cache caused inconsistencies that could eliminate the benefit of interning. Here we circle back to using a global cache, but only for statically allocated strings, and we use a more basic _Py_hashtable_t for that global cache instead of a dict.
Ideally we would have only the global cache, but the optional isolation of each interpreter's allocator means that a non-static string object must not outlive its interpreter. Thus we would have to store a copy of each such interned string in the global cache, tied to the main interpreter.
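Roughly, the lookup-or-insert pattern with the internal hashtable looks like the sketch below (a simplification, not the actual unicodeobject.c code; the hash/compare helpers and function names are written out by hand for illustration, and error handling and reference counting are omitted):

```c
#ifndef Py_BUILD_CORE
#  define Py_BUILD_CORE        /* internal headers require this (normally set by the build) */
#endif
#include "Python.h"
#include "pycore_hashtable.h"  // _Py_hashtable_t, _Py_hashtable_new/get/set

// Sketch only: simplified stand-in for the global cache of statically
// allocated interned strings.
static _Py_hashtable_t *interned_static = NULL;

static Py_uhash_t
sketch_hash(const void *key)
{
    return (Py_uhash_t)PyObject_Hash((PyObject *)key);
}

static int
sketch_compare(const void *key1, const void *key2)
{
    return PyUnicode_Compare((PyObject *)key1, (PyObject *)key2) == 0;
}

// Return the canonical (interned) object for a statically allocated string.
static PyObject *
sketch_intern_static(PyObject *s)
{
    if (interned_static == NULL) {
        interned_static = _Py_hashtable_new(sketch_hash, sketch_compare);
    }
    PyObject *found = _Py_hashtable_get(interned_static, s);
    if (found != NULL) {
        return found;                          // already interned: reuse it
    }
    _Py_hashtable_set(interned_static, s, s);  // first time: record it
    return s;
}
```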
Remove the "cpython/pytime.h" header file: it only contained private
functions. Move functions to the internal pycore_time.h header file.
Move tests from _testcapi to _testinternalcapi. Rename also test
methods to have the same name than tested C functions.
No longer export these functions:
* _PyTime_Add()
* _PyTime_As100Nanoseconds()
* _PyTime_FromMicrosecondsClamp()
* _PyTime_FromTimespec()
* _PyTime_FromTimeval()
* _PyTime_GetPerfCounterWithInfo()
* _PyTime_MulDiv()
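For code built as part of CPython itself, switching from the removed header to the internal one looks roughly like this sketch (internal headers require Py_BUILD_CORE, so this is not for ordinary third-party extensions; the helper name is made up):

```c
/* Sketch only: illustrates where the private time functions now live. */
#ifndef Py_BUILD_CORE
#  define Py_BUILD_CORE         /* normally set by CPython's own build flags */
#endif
#include "Python.h"
#include "pycore_time.h"        // was: #include "cpython/pytime.h"

static _PyTime_t
sketch_monotonic_ns(void)
{
    // _PyTime_GetMonotonicClock() returns a _PyTime_t with nanosecond resolution.
    return _PyTime_GetMonotonicClock();
}
```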
This is the implementation of PEP 683.
Motivation:
This PR introduces the ability to immortalize instances in CPython, which bypasses reference counting. Tagging objects as immortal allows us to skip certain operations when we know that the object will be around for the entire execution of the runtime.
Note that this by itself will bring a performance regression to the runtime due to the extra reference count checks. However, it brings the ability to have truly immutable objects, which are useful in other contexts such as immutable data sharing between sub-interpreters.
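As a rough illustration of the mechanism (a conceptual sketch, not the actual implementation; the sentinel value and helper names below are invented), immortalization works by pinning the reference count at a sentinel that incref learns to recognize and skip:

```c
#include "Python.h"

/* Conceptual sketch of the PEP 683 idea, not the actual CPython code:
 * an object whose reference count is pinned at a sentinel value is treated
 * as immortal, so incref becomes a no-op for it. */
#define IMMORTAL_SENTINEL ((Py_ssize_t)0x3FFFFFFF)   /* made-up value */

static inline int
sketch_is_immortal(PyObject *op)
{
    return Py_REFCNT(op) == IMMORTAL_SENTINEL;
}

static inline void
sketch_incref(PyObject *op)
{
    if (sketch_is_immortal(op)) {
        return;   /* skip the refcount write: the object lives for the whole runtime */
    }
    Py_INCREF(op);
}
```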
Remove the bundled setuptools wheel from ensurepip, and stop installing setuptools in environments created by venv.
Co-Authored-by: Hugo van Kemenade <hugovk@users.noreply.github.com>
Co-authored-by: C.A.M. Gerlach <CAM.Gerlach@Gerlach.CAM>
Co-authored-by: Oleg Iarygin <oleg@arhadthedev.net>
* The majority of the monitoring code is in instrumentation.c
* The new instrumentation bytecodes are in bytecodes.c
* legacy_tracing.c adapts the new API to the old sys.settrace and sys.setprofile APIs (see the sketch after this list)
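As a reminder of the old interface that legacy_tracing.c keeps working, here is a minimal sketch of installing a legacy trace function through the public C API (the callback body and names are made up; under the new design such callbacks are driven by the instrumentation machinery rather than by per-opcode checks in the eval loop):

```c
#include "Python.h"

// Sketch only: a trivial legacy trace callback.
static int
sketch_trace(PyObject *obj, PyFrameObject *frame, int what, PyObject *arg)
{
    if (what == PyTrace_CALL) {
        // A real tracer would inspect the frame here.
    }
    return 0;
}

static void
sketch_install_tracer(void)
{
    // The old API still works; internally it is now adapted to the
    // instrumentation-based monitoring.
    PyEval_SetTrace(sketch_trace, NULL);
}
```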
We can revisit the options for keeping it global later, if desired. For now the approach seems quite complex, so we've gone with the simpler isolation solution in the meantime.
https://github.com/python/cpython/issues/100227
* Eliminate all remaining uses of Py_SIZE and Py_SET_SIZE on PyLongObject, adding asserts.
* Change layout of size/sign bits in longobject to support future addition of immortal ints and tagged medium ints.
* Add functions to hide some internals of long object, and for setting sign and digit count.
* Replace uses of the IS_MEDIUM_VALUE macro with _PyLong_IsCompact() (see the layout sketch below).
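A conceptual sketch of the new size/sign packing (the struct, field, and constant names below are invented for illustration and do not reproduce the actual longobject layout): the sign and the digit count share a single tag word, leaving low bits free for future uses such as immortal ints and tagged medium ints.

```c
#include <stddef.h>
#include <stdint.h>

/* Hypothetical layout sketch -- not CPython's actual struct or bit positions. */
#define SIGN_MASK     0x3u     /* low 2 bits encode the sign (e.g. 0=positive, 1=zero, 2=negative) */
#define NON_SIZE_BITS 3        /* low bits reserved; digit count is stored above them */

typedef struct {
    uintptr_t tag;             /* packed sign + digit count */
    uint32_t digits[1];        /* digit storage (sketch) */
} sketch_long;

static inline size_t
sketch_digit_count(const sketch_long *v)
{
    return (size_t)(v->tag >> NON_SIZE_BITS);
}

static inline int
sketch_is_negative(const sketch_long *v)
{
    return (v->tag & SIGN_MASK) == 2;
}

static inline int
sketch_is_compact(const sketch_long *v)
{
    /* "Compact" = zero or a single digit, i.e. the value fits the medium-int fast path. */
    return v->tag < (2u << NON_SIZE_BITS);
}
```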