- Make some string interpolations more readable using f-strings or
explicit parametrisation
- Remove unneeded open() mode specifiers
Co-authored-by: Erlend E. Aasland <erlend.aasland@protonmail.com>
The following local variables were assigned but never used:
- line 551: result
- line 1341: groups
- line 1431: default_return_converter
- line 1529: ignore_self
- line 1809: input_checksum
- line 4224: new
---
Co-authored-by: Alex Waygood <Alex.Waygood@Gmail.com>
For code readability. Instances of `builtins.dict` have been ordered since 3.6, and have been guaranteed by the language to be ordered since Python 3.7. Argument Clinic now requires Python 3.10+.
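As a minimal illustration of the guarantee being relied on here (not code from clinic.py itself):

    d = {}
    d["first"] = 1
    d["second"] = 2
    # Insertion order is preserved, and guaranteed by the language since 3.7,
    # so a plain dict can stand in where an ordered mapping is needed.
    assert list(d) == ["first", "second"]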
Annotate the following:
- methods of class Class
- methods of class Module
- methods of class PythonParser
- function compute_checksum()
- function parse_file()
- global variable unsupported_special_methods
- Convert `unspecified` and `unknown` to be members of a `Sentinels` enum, rather than instances of bespoke classes.
- An enum feels more idiomatic here, and works better with type checkers.
- Convert some `==` and `!=` checks for these values to identity checks, which are more idiomatic with sentinels.
- _Don't_ do the same for `Null`, as this needs to be a distinct type due to its usage in `clinic.py`.
- Use `object` as the annotation for `default` across `clinic.py`. `default` can be literally any object, so `object` is the correct annotation here.
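A rough sketch of the pattern described above, with names taken from the description (the actual clinic.py definitions may differ in detail; describe_default is a made-up helper just to show the identity check):

    import enum
    from typing import Final

    class Sentinels(enum.Enum):
        unspecified = "unspecified"
        unknown = "unknown"

    unspecified: Final = Sentinels.unspecified
    unknown: Final = Sentinels.unknown

    # Identity checks read naturally with enum sentinels and are understood
    # by type checkers; `default` itself stays annotated as plain `object`.
    def describe_default(default: object) -> str:
        if default is unspecified:
            return "no default"
        return repr(default)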
---
Co-authored-by: Erlend E. Aasland <erlend.aasland@protonmail.com>
Introduce TypeSet, and use it to annotate the 'accept' keyword of
various C converters. Also add some missing return annotations for
converter init functions.
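A sketch of the kind of alias and usage meant here (the converter class below is illustrative only, not the real clinic.py code):

    from typing import Any

    # A set of Python types, e.g. {int} or {str, bytes}, as passed to a
    # converter's 'accept' keyword.
    TypeSet = set[type[Any]]

    class example_converter:  # hypothetical converter, for illustration
        def converter_init(self, *, accept: TypeSet = {object}) -> None:
            self.accept = accept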
* Add basic mypy workflow to CI
* Make the type check pass
---------
Co-authored-by: Erlend E. Aasland <erlend.aasland@protonmail.com>
Co-authored-by: Nikita Sobolev <mail@sobolevn.me>
Co-authored-by: Hugo van Kemenade <hugovk@users.noreply.github.com>
When monitoring LINE events, instrument all instructions that can have a predecessor on a different line.
Then check that a new line has been hit in the instrumentation code.
This brings the behavior closer to that of 3.11, simplifying implementation and porting of tools.
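For tool authors, a minimal sketch of consuming LINE events through the sys.monitoring API (PEP 669); the callback and tool name here are illustrative:

    import sys
    from types import CodeType

    mon = sys.monitoring
    TOOL_ID = mon.COVERAGE_ID  # any free tool id works

    def on_line(code: CodeType, line_number: int) -> None:
        # Fires whenever execution reaches a new line, much like a 3.11
        # sys.settrace "line" event.
        print(f"{code.co_filename}:{line_number}")

    mon.use_tool_id(TOOL_ID, "line-demo")
    mon.register_callback(TOOL_ID, mon.events.LINE, on_line)
    mon.set_events(TOOL_ID, mon.events.LINE)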
Use the unused keyword param in the converter to explicitly
mark an argument as unused:
/*[clinic input]
SomeBaseClass.stubmethod
flag: bool(unused=True)
[clinic start generated code]*/
Replaces our built-in SHA3 implementation with a verified one from the HACL* project.
This implementation is used when OpenSSL does not provide SHA3 or is not present.
Python 3.11 shipped with a very slow but tiny SHA3 implementation in order to move off the <=3.10 reference implementation, which wound up having serious bugs. This brings us back to a reasonably performing built-in implementation, consistent with what we've just replaced our other guaranteed-available standard hash algorithms with: code from the HACL* project.
---------
Co-authored-by: Gregory P. Smith <greg@krypto.org>
This is strictly about moving the "obmalloc" runtime state from
`_PyRuntimeState` to `PyInterpreterState`. Doing so improves isolation
between interpreters, specifically most of the memory (incl. objects)
allocated for each interpreter's use. This is important for a
per-interpreter GIL, but such isolation is valuable even without it.
FWIW, a per-interpreter obmalloc is the proverbial
canary-in-the-coalmine when it comes to the isolation of objects between
interpreters. Any object that leaks (unintentionally) to another
interpreter is highly likely to cause a crash (on debug builds at
least). That's a useful thing to know, relative to interpreter
isolation.
This PR makes three minor linting adjustments to the `wasm` module
caught by [ruff](https://github.com/charliermarsh/ruff).
* Issue: gh-103801
---------
Co-authored-by: blurb-it[bot] <43283697+blurb-it[bot]@users.noreply.github.com>
This is the implementation of PEP 683.
Motivation:
The PR introduces the ability to immortalize instances in CPython, which bypasses reference counting. Tagging objects as immortal allows us to skip certain operations when we know that the object will be around for the entire execution of the runtime.
Note that by itself this brings a performance regression to the runtime due to the extra reference count checks. However, it also enables truly immutable objects, which are useful in other contexts such as immutable data sharing between sub-interpreters.
Remove the bundled setuptools wheel from ensurepip, and stop installing setuptools in environments created by venv.
Co-Authored-by: Hugo van Kemenade <hugovk@users.noreply.github.com>
Co-authored-by: C.A.M. Gerlach <CAM.Gerlach@Gerlach.CAM>
Co-authored-by: Oleg Iarygin <oleg@arhadthedev.net>
* The majority of the monitoring code is in instrumentation.c
* The new instrumentation bytecodes are in bytecodes.c
* legacy_tracing.c adapts the new API to the old sys.settrace and sys.setprofile APIs
I've also added a small comment to `Tools/c-analyzer/cpython/_parser.py` to trigger the `patchcheck` CI; it should pass now.
Automerge-Triggered-By: GH:ericsnowcurrently
On content update, builds `python` and the docs. Also adds a Dockerfile that should include everything necessary to build CPython and the entire stdlib on Fedora, except autoconf 2.69.
Co-authored-by: Ronald Oussoren <ronaldoussoren@mac.com>
Co-authored-by: Dusty Phillips <dusty@phillips.codes>
We can revisit the options for keeping it global later, if desired. That approach currently seems quite complex, so we've gone with the simpler isolation solution in the meantime.
https://github.com/python/cpython/issues/100227
* Eliminate all remaining uses of Py_SIZE and Py_SET_SIZE on PyLongObject, adding asserts.
* Change layout of size/sign bits in longobject to support future addition of immortal ints and tagged medium ints.
* Add functions to hide some internals of long object, and for setting sign and digit count.
* Replace uses of IS_MEDIUM_VALUE macro with _PyLong_IsCompact().
This essentially eliminates the global variable, with the associated benefits. It is also a precursor to isolating this bit of state to PyInterpreterState.
Folks that currently read _Py_RefTotal directly would have to start using _Py_GetGlobalRefTotal() instead.
https://github.com/python/cpython/issues/102304
This behavior is optional, because in some extreme cases it
may just make debugging harder. The tool defaults it to off,
but it is on in Makefile.pre.in.
Also note that this makes diffs to generated_cases.c.h noisier,
since whenever you insert or delete a line in bytecodes.c,
all subsequent #line directives will change.
This will keep us from adding new unsupported (i.e. non-const) C global variables, which would break interpreter isolation.
FYI, historically it is very uncommon for new global variables to get added. Furthermore, it is rare for new code to break the c-analyzer. So the check should almost always pass unnoticed.
Note that I've removed test_check_c_globals. A test wasn't a great fit conceptually and was super slow on debug builds. A CI check is a better fit.
This also resolves gh-100237.
https://github.com/python/cpython/issues/81057
distutils was removed in November. However, the c-analyzer relies on it. To solve that here, we vendor the parts the tool needs so it can be run against 3.12+. (Also see gh-92584.)
Note that we may end up removing this code later in favor of a solution in common with the peg_generator tool (which also relies on distutils). At the least, the copy here makes sure the c-analyzer tool works on 3.12+ in the meantime.
Some incompatible changes had gone in, and the "ignore" lists weren't properly updated. This change fixes that. It's necessary prior to enabling test_check_c_globals, which I hope to do soon.
Note that this does include moving last_resort_memory_error to PyInterpreterState.
https://github.com/python/cpython/issues/90110
Prevent test_tools from copying 1000M of "source"
It doesn't need a git repo, just the checkout. We skip the .git metadata, Doc/build, Doc/venv, and `__pycache__` subdirs that developers often have in their checkouts; skipping them reduces the size of the source tree copy ten-fold.
This should significantly reduce IO and presumably time on buildbots during this long test.
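The filtering is along these lines (a simplified sketch, not the exact test code; paths are placeholders):

    import os
    import shutil

    def _ignore(directory: str, names: list[str]) -> set[str]:
        # Skip VCS metadata and bytecode caches everywhere, plus the
        # generated Doc/build and Doc/venv directories.
        skipped = {name for name in names if name in (".git", "__pycache__")}
        if os.path.basename(directory) == "Doc":
            skipped |= {"build", "venv"} & set(names)
        return skipped

    shutil.copytree("path/to/cpython", "path/to/copy", ignore=_ignore)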
* Write output and metadata in a single run
This halves the time to run the cases generator
(most of the time goes into parsing the input).
* Declare or define opcode metadata based on NEED_OPCODE_TABLES
* Use generated metadata for stack_effect()
* compile.o depends on opcode_metadata.h
* Return -1 from _PyOpcode_num_popped/pushed for unknown opcode
New generator feature: Generate useful glue for output arrays, so you can just write values to the output array (no bounds checking). Rewrote UNPACK_SEQUENCE_TWO_TUPLE to use this, and also UNPACK_SEQUENCE_{TUPLE,LIST}.