Like #28744 but for the Tools directory.
[skip issue] Opening a related issue is pending python/psf-infra-meta#130
Automerge-Triggered-By: GH:pablogsal
In the list of generated frozen modules at the top of Tools/scripts/freeze_modules.py, you will find that some of the modules have a different name than the module (or .py file) that is actually frozen. Let's call each case an "alias". Aliases do not come into play until we get to the (generated) list of modules in Python/frozen.c. (The tool for freezing modules, Programs/_freeze_module, is only concerned with the source file, not the module it will be used for.)
Knowledge of which frozen modules are aliases (and the identity of the original module) normally isn't important. However, this information is valuable when we go to set __file__ on frozen stdlib modules. This change updates Tools/scripts/freeze_modules.py to map aliases to the original module name (or None if not a stdlib module) in Python/frozen.c. We also add a helper function in Python/import.c to look up a frozen module's alias and add the result of that function to the frozen info returned from find_frozen().
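For illustration, the alias mapping can be pictured as a small table like the following (a sketch only; apart from _frozen_importlib / _frozen_importlib_external the names are hypothetical, and the real data is generated into Python/frozen.c with the lookup helper written in C in Python/import.c):
```
# Sketch: frozen module name -> original stdlib module, or None if the
# frozen source is not a stdlib module.
ALIASES = {
    "_frozen_importlib": "importlib._bootstrap",
    "_frozen_importlib_external": "importlib._bootstrap_external",
    "__test_alias__": None,   # hypothetical test-only frozen module
}

def original_module(frozen_name):
    """Original stdlib name, the name itself if not an alias, or None."""
    return ALIASES.get(frozen_name, frozen_name)
```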
https://bugs.python.org/issue45020
I've added a number of test-only modules. Some of those cases are covered by the recently frozen stdlib modules (and some will be once we add encodings back in). However, I figured we'd play it safe by having a set of modules guaranteed to be there during tests.
https://bugs.python.org/issue45020
Currently we're freezing the __init__.py twice, duplicating the built data unnecessarily. With this change we do it once. There is no change in runtime behavior.
https://bugs.python.org/issue45020
* Makefile.pre.in: Add $(srcdir) when needed, remove it when it was
used by mistake.
* freeze_modules.py tool uses ./Programs/_freeze_module if the
executable doesn't exist in the source tree.
The update_file.py tool now preserves the line endings of the updated
file. Fix the "make regen-frozen" command: it no longer changes the
line endings of PCbuild/ files on Unix. Git converts line endings
depending on the platform.
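For reference, a minimal sketch of the newline-preserving rewrite (hypothetical code, not the actual Tools/scripts/update_file.py):
```
import os

def update_file(path, new_content):
    # Read without newline translation so the existing CRLF/LF style is visible.
    with open(path, "r", encoding="utf-8", newline="") as f:
        old_content = f.read()
    newline = "\r\n" if "\r\n" in old_content else "\n"
    new_content = new_content.replace("\n", newline)  # assumes the generator emits plain "\n"
    if new_content == old_content:
        return  # leave the file and its mtime untouched
    tmp = path + ".new"
    with open(tmp, "w", encoding="utf-8", newline="") as f:
        f.write(new_content)
    os.replace(tmp, path)
```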
The main advantage is that the files will no longer show up in diffs and PRs. That means, for a PR, the number of files / lines changed will more clearly reflect the actual change. (This is essentially an un-revert of gh-28375.)
https://bugs.python.org/issue45020
The main advantage is that the files will no longer show up in diffs and PRs. That means, for a PR, the number of files / lines changed will more clearly reflect the actual change.
https://bugs.python.org/issue45020
Here's one more small cleanup that should have been in PR gh-28319. We eliminate stdout side effects from importing the frozen __hello__ module, and update tests accordingly. We also move the module's source file from Tools/freeze/flag.py into Lib/.
https://bugs.python.org/issue45019
This will enable us to drop the frozen module header files from the repository.
It does currently cause many source files to be built twice, which just takes more time. For whoever comes to fix this in the future, the files shared between freeze_module and pythoncore should be put into a static library that is consumed by both.
Doing this provides significant performance gains for runtime startup (~15% with all the imported modules frozen). We don't yet freeze all the imported modules because there are a few hiccups in the build systems we need to sort out first. (See bpo-45186 and bpo-45188.)
Note that in PR GH-28320 we added a command-line flag (-X frozen_modules=[on|off]) that allows users to opt out of (or into) using frozen modules. The default is still "off" but we will change it to "on" as soon as we can do it in a way that does not cause contributors pain.
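As a side note, the flag can be inspected from Python via sys._xoptions (just an illustration of the flag mentioned above):
```
import sys

# "python -X frozen_modules=on" shows up here as the string "on";
# the default is still "off", as noted above.
print(sys._xoptions.get("frozen_modules", "off"))
```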
https://bugs.python.org/issue45020
There are a few things I missed in gh-27980. This is a follow-up that will make subsequent PRs cleaner. It includes fixes to tests and tools that reference the frozen modules.
https://bugs.python.org/issue45019
* Constructors of subclasses of some builtin classes (e.g. tuple, list,
frozenset) no longer accept arbitrary keyword arguments.
* A subclass of set can now define a __new__() method with additional
keyword parameters without also overriding __init__() (see the sketch below).
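A hedged sketch of the second point (the TaggedSet class is made up; on older versions the call below raised TypeError from set.__init__):
```
class TaggedSet(set):
    def __new__(cls, iterable=(), *, tag=None):
        self = super().__new__(cls)
        self.tag = tag        # extra keyword handled entirely in __new__...
        return self           # ...no __init__ override needed any more

s = TaggedSet([1, 2, 3], tag="example")
print(s.tag)                  # "example", and no TypeError from set.__init__
```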
Frozen modules must be added to several files in order to work properly. Before this change this had to be done manually. Here we add a tool to generate the relevant lines in those files instead. This helps us avoid mistakes and omissions.
https://bugs.python.org/issue45019
Places the locals between the specials and stack. This is the more "natural" layout for a C struct, makes the code simpler and gives a slight speedup (~1%)
* Convert "specials" array to InterpreterFrame struct, adding f_lasti, f_state and other non-debug FrameObject fields to it.
* Refactor calls, pushing the call to the interpreter upward toward _PyEval_Vector.
* Compute f_back when on thread stack, only filling in value when frame object outlives stack invocation.
* Move ownership of InterpreterFrame in generator from frame object to generator object.
* Do not create frame objects for Python calls.
* Do not create frame objects for generators.
* Remove struct _node from the stable ABI list
This struct was removed along with the old parser in Python 3.9 (PEP 617)
* Stable ABI list: Use the public name "PyFrameObject" rather than "_frame"
* Ensure limited API doesn't contain private names
Names prefixed by an underscore are private by definition.
* Add a blurb
* Specialize LOAD_ATTR with LOAD_ATTR_SLOT and LOAD_ATTR_SPLIT_KEYS
* Move dict-common.h to internal/pycore_dict.h
* Add LOAD_ATTR_WITH_HINT specialized opcode.
* Quicken in function if loopy
* Specialize LOAD_ATTR for module attributes.
* Add specialization stats
These were reverted in gh-26530 (commit 17c4edc) due to refleaks.
* 2c1e258 - Compute deref offsets in compiler (gh-25152)
* b2bf2bc - Add new internal code objects fields: co_fastlocalnames and co_fastlocalkinds. (gh-26388)
This change fixes the refleaks.
https://bugs.python.org/issue43693
* Revert "bpo-43693: Compute deref offsets in compiler (gh-25152)"
This reverts commit b2bf2bc1ec.
* Revert "bpo-43693: Add new internal code objects fields: co_fastlocalnames and co_fastlocalkinds. (gh-26388)"
This reverts commit 2c1e2583fd.
These two commits are breaking the refleak buildbots.
A number of places in the code base (notably ceval.c and frameobject.c) rely on mapping variable names to indices in the frame "locals plus" array (AKA fast locals), and thus opargs. Currently the compiler indirectly encodes that information on the code object as the tuples co_varnames, co_cellvars, and co_freevars. At runtime the dependent code must calculate the proper mapping from those, which isn't ideal and impacts performance-sensitive sections. This is something we can easily address in the compiler instead.
This change addresses the situation by replacing internal use of co_varnames, etc. with a single combined tuple of names in locals-plus order, along with a minimal array mapping each to its kind (local vs. cell vs. free). These two new PyCodeObject fields, co_fastlocalnames and co_fastlocalkinds, are not exposed to Python code for now, but co_varnames, etc. are still available with the same values as before (though computed lazily).
Aside from the (mild) performance impact, there are a number of other benefits:
* there's now a clear, direct relationship between locals-plus and variables
* code that relies on the locals-plus-to-name mapping is simpler
* marshaled code objects are smaller and serialize/de-serialize faster
Also note that we can take this approach further by expanding the possible values in co_fastlocalkinds to include specific argument types (e.g. positional-only, kwargs). Doing so would allow further speed-ups in _PyEval_MakeFrameVector(), which is where args get unpacked into the locals-plus array. It would also allow us to shrink marshaled code objects even further.
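For orientation, these are the per-kind tuples the paragraph refers to (the new co_fastlocalnames/co_fastlocalkinds fields themselves are internal and not shown):
```
def outer(a):
    b = 1
    def inner(c):
        d = a + b + c
        return d
    return inner

code = outer(1).__code__
print(code.co_varnames)   # ('c', 'd')  -- arguments and plain locals
print(code.co_cellvars)   # ()          -- locals captured by a nested function
print(code.co_freevars)   # ('a', 'b')  -- names captured from the enclosing scope
```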
https://bugs.python.org/issue43693
The invalid assignment rules are very delicate, since the parser can
easily raise an invalid-assignment error when a keyword argument is provided.
As they are very deep in the grammar tree, it is very difficult to
specify in which contexts these rules can be used and in which they cannot.
To handle this, we need to use a different version of the rule that doesn't do
error checking in those situations where we don't want the rule to raise
(keyword arguments and generator expressions).
We also need to check if we are in a left-recursive rule, as those can try
to eagerly advance the parser even if the parse will fail at the end of
the expression. Failing to do this allows the parser to start parsing a
call as a tuple and incorrectly identify a keyword argument as an
invalid assignment, before it realizes that it was not a tuple after all.
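A concrete illustration of the distinction the rules have to make (error messages vary slightly between versions):
```
# A keyword argument must not be misreported as an invalid assignment:
compile("f(x=1)", "<demo>", "exec")

# A genuine invalid assignment should still be reported:
try:
    compile("f(x) = 1", "<demo>", "exec")
except SyntaxError as exc:
    print(exc.msg)   # e.g. "cannot assign to function call"
```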
* Remove 'zombie' frames. We won't need them once we are allocating fixed-size frames.
* Add co_nlocalplus field to code object to avoid recomputing size of locals + frees + cells.
* Move locals, cells and freevars out of frame object into separate memory buffer.
* Use per-threadstate allocated memory chunks for local variables.
* Move globals and builtins from frame object to per-thread stack.
* Move (slow) locals from the frame object to the per-thread stack.
* Move internal frame functions to internal header.
The patch from bpo-44074 does not account for a possibly non-English locale and blindly greps for "HEAD branch" in a possibly localized text.
Automerge-Triggered-By: GH:pitrou
- Reformat the C API and ABI Versioning page (and extend/clarify a bit)
- Rewrite the stable ABI docs into a general text on C API Compatibility
- Add a list of Limited API contents, and notes for the individual items.
- Replace `Include/README.rst` with a link to a devguide page with the same info
"Zero cost" exception handling.
* Uses a lookup table to determine how to handle exceptions.
* Removes SETUP_FINALLY and POP_TOP block instructions, eliminating (most of) the runtime overhead of try statements.
* Reduces the size of the frame object by about 60%.
- Remove HAVE_X509_VERIFY_PARAM_SET1_HOST check
- Update hashopenssl to require OpenSSL 1.1.1
- multissltests: test only OpenSSL > 1.1.0
- ALPN is always supported
- SNI is always supported
- Remove deprecated NPN code. Python wrappers are now no-ops.
- ECDH is always supported
- Remove OPENSSL_VERSION_1_1 macro
- Remove locking callbacks
- Drop PY_OPENSSL_1_1_API macro
- Drop HAVE_SSL_CTX_CLEAR_OPTIONS macro
- SSL_CTRL_GET_MAX_PROTO_VERSION is always defined now
- security level is always available now
- get_num_tickets is available with TLS 1.3
- X509_V_ERR MISMATCH is always available now
- Always set SSL_MODE_RELEASE_BUFFERS
- X509_V_FLAG_TRUSTED_FIRST is always available
- get_ciphers is always supported
- SSL_CTX_set_keylog_callback is always available
- Update Modules/Setup with static link example
- Mention PEP in whatsnew
- Drop 1.0.2 and 1.1.0 from GHA tests
* Use instruction offset, rather than bytecode offset. Streamlines interpreter dispatch a bit, and removes most EXTENDED_ARGs for jumps.
* Change some uses of PyCode_Addr2Line to PyFrame_GetLineNumber
The stable_abi.py script no longer parses macros. Macro targets can be
static inline functions, which are not part of the stable ABI, only
part of the limited C API.
Run "make regen-limited-abi" to exclude PyType_HasFeature from
Doc/data/stable_abi.dat.
- [x] Build OpenSSL 1.1.1k for macOS
- [x] Build OpenSSL 1.1.1k for Windows
I have also updated multissl tester and various CI configurations to use latest OpenSSL. The versions were all over the place.
Signed-off-by: Christian Heimes <christian@python.org>
Automerge-Triggered-By: GH:tiran
Remove the pyarena.h header file with functions:
* PyArena_New()
* PyArena_Free()
* PyArena_Malloc()
* PyArena_AddPyObject()
These functions were undocumented, excluded from the limited C API,
and were only used internally by the compiler.
Add pycore_pyarena.h header. Rename functions:
* PyArena_New() => _PyArena_New()
* PyArena_Free() => _PyArena_Free()
* PyArena_Malloc() => _PyArena_Malloc()
* PyArena_AddPyObject() => _PyArena_AddPyObject()
Remove parser functions using the "struct _mod" type, because the
AST C API was removed:
* PyParser_ASTFromFile()
* PyParser_ASTFromFileObject()
* PyParser_ASTFromFilename()
* PyParser_ASTFromString()
* PyParser_ASTFromStringObject()
These functions were undocumented and excluded from the limited C
API.
Add pycore_parser.h internal header file. Rename functions:
* PyParser_ASTFromFileObject() => _PyParser_ASTFromFile()
* PyParser_ASTFromStringObject() => _PyParser_ASTFromString()
These functions are no longer exported (replace PyAPI_FUNC() with
extern).
Also remove the _PyPegen_run_parser_from_file() function. Update
test_peg_generator to use _PyPegen_run_parser_from_file_pointer()
instead.
Remove the compiler functions using "struct _mod" type, because the
public AST C API was removed:
* PyAST_Compile()
* PyAST_CompileEx()
* PyAST_CompileObject()
* PyFuture_FromAST()
* PyFuture_FromASTObject()
These functions were undocumented and excluded from the limited C API.
Rename functions:
* PyAST_CompileObject() => _PyAST_Compile()
* PyFuture_FromASTObject() => _PyFuture_FromAST()
Moreover, _PyFuture_FromAST() is no longer exported (replace
PyAPI_FUNC() with extern). _PyAST_Compile() remains exported for
test_peg_generator.
Also remove the compatibility functions:
* PyAST_Compile()
* PyAST_CompileEx()
* PyFuture_FromAST()
Rename Include/symtable.h to Include/internal/pycore_symtable.h,
don't export symbols anymore (replace PyAPI_FUNC and PyAPI_DATA with
extern) and rename functions:
* PyST_GetScope() to _PyST_GetScope()
* PySymtable_BuildObject() to _PySymtable_Build()
* PySymtable_Free() to _PySymtable_Free()
Remove PySymtable_Build(), Py_SymtableString() and
Py_SymtableStringObject() functions.
The Py_SymtableString() function was part of the stable ABI by mistake,
but it could not be used, since the symtable.h header file was
excluded from the limited C API.
The Python symtable module remains available and is unchanged.
test_peg_generator now defines _Py_TEST_PEGEN macro when building C
code to not call PyAST_Validate() in Parser/pegen.c. Moreover, it
defines Py_BUILD_CORE_MODULE macro to get access to the internal
C API.
Remove "global_ast_state" from Python-ast.c when it's built by
test_peg_generator: always get the AST state from the current interpreter.
Add frozen modules to sys.stdlib_module_names. For example, add
"_frozen_importlib" and "_frozen_importlib_external" names.
Add "list_frozen" command to Programs/_testembed.
Move Include/{odictobject.h,parser_interface.h,picklebufobject.h,pydebug.h,pyfpe.h}
into Include/cpython/.
Parser: peg_api: include Python.h instead of parser_interface.h.
Add a new configure --without-static-libpython option to not build
the libpythonMAJOR.MINOR.a static library and not install the
python.o object file.
Fix smelly.py and stable_abi.py tools when libpython3.10.a is
missing.
* Add to the peg generator a new directive ('&&') that makes it possible to expect
a token and hard-fail the parsing if the token is not found. This
allows quickly emitting syntax errors for missing tokens.
* Use the new grammar element to hard-fail if the ':' is missing before
suites.
Add a private list of all stdlib modules: _Py_module_names.
* Add Tools/scripts/generate_module_names.py script.
* Makefile: Add "make regen-module-names" command.
* setup.py: Add --list-module-names option.
* GitHub Actions and Travis CI also run "make regen-module-names",
not only "make regen-all", to ensure that the module names remain
up to date.
The distutils bdist_wininst command deprecated in Python 3.8 has been
removed. The bdist_wheel command (provided by the wheel package) is now
recommended to distribute binary packages on Windows.
* Remove Lib/distutils/command/bdist_wininst.py
* Remove PC/bdist_wininst/ project
* Remove Lib/distutils/command/wininst-*.exe programs
* Remove all references to bdist_wininst
On Fedora 31, gdb uses Python 3.7.9; calling `proxyval` on an instance with a dictionary fails because of the `dict.iteritems` usage. This PR changes the code to be compatible with py2 and py3.
This change seemed small enough to not need an issue and news blurb; if one is required, please let me know.
Automerge-Triggered-By: GH:benjaminp
* Tkinter functions and constructors which need a default root window
now raise RuntimeError with a descriptive message, instead of an obscure
AttributeError or NameError, if the root window has not been created yet
or cannot be created automatically (see the sketch after this list).
* Add tests for all functions which use default root window.
* Fix import in the pynche script.
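A rough sketch of the first point (requires Tcl/Tk; the exact message differs per function):
```
import tkinter

# With no Tk() instance created yet, functions that need the default root
# now fail with a clear RuntimeError instead of an obscure AttributeError.
try:
    tkinter.Variable()
except RuntimeError as exc:
    print(exc)
```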
The smelly.py script now also checks the Python dynamic library and
extension modules, not only the Python static library. Also make the
script more verbose: explain what it does.
The GitHub Action job now builds Python with the libpython dynamic
library.
Fix a race condition in "make regen-all" when the make -jN option is used
to run jobs in parallel. The clinic.py script now only uses atomic
writes to write files. Moreover, generated files are now left
unchanged if the content does not change, to avoid changing the file
modification time.
The "make regen-all" command runs "make clinic" and "make
regen-importlib" targets:
* "make regen-importlib" builds object files (ex: Modules/_weakref.o)
from source files (ex: Modules/_weakref.c) and clinic files (ex:
Modules/clinic/_weakref.c.h)
* "make clinic" always rewrites all clinic files
(ex: Modules/clinic/_weakref.c.h)
Since there is no dependency between "clinic" and "regen-importlib"
Makefile targets, these two targets can be run in parallel. Moreover,
half of clinic.py file writes are not atomic and so there is a race
condition when "make regen-all" runs jobs in parallel using make -jN
option (which can be passed in MAKEFLAGS environment variable).
Fix clinic.py to make all file writes atomic:
* Add a write_file() function to ensure that all file writes are
atomic: write into a temporary file and then use os.replace()
(see the sketch after this list).
* Moreover, write_file() doesn't recreate or modify the file if the
content does not change, to avoid changing the file modification
time.
* Update test_clinic to verify these assertions with a functional
test.
* Remove Clinic.force attribute which was no longer used, whereas
Clinic.verify remains useful.
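A minimal sketch of the write_file() idea (illustrative, not the exact clinic.py code):
```
import os

def write_file(filename, new_contents):
    # Leave the file (and its mtime) untouched if nothing changed.
    try:
        with open(filename, "r", encoding="utf-8") as f:
            if f.read() == new_contents:
                return
    except FileNotFoundError:
        pass
    # Write to a temporary file and atomically swap it into place, so a
    # parallel make job never sees a half-written clinic file.
    tmp = filename + ".new"
    with open(tmp, "w", encoding="utf-8") as f:
        f.write(new_contents)
    os.replace(tmp, filename)
```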
Adds support to Tools/i18n/pygettext.py for gettext calls in f-strings. This process is done by parsing the f-strings, processing each value, and flagging the ones which contain a gettext call.
Co-authored-by: Batuhan Taskaya <batuhanosmantaskaya@gmail.com>
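Conceptually the f-string handling works along these lines (a simplified ast-based sketch, not the actual pygettext.py code):
```
import ast

def gettext_calls_in_fstrings(source, names=("_", "gettext")):
    """Yield literal arguments of _()/gettext() calls found inside f-strings."""
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.JoinedStr):          # an f-string
            for call in ast.walk(node):
                if (isinstance(call, ast.Call)
                        and isinstance(call.func, ast.Name)
                        and call.func.id in names
                        and call.args
                        and isinstance(call.args[0], ast.Constant)
                        and isinstance(call.args[0].value, str)):
                    yield call.args[0].value

src = """msg = f"{_('hello world')}!" """
print(list(gettext_calls_in_fstrings(src)))   # ['hello world']
```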
Left-recursive rules need to check for errors explicitly, since
even if the rule returns NULL, the parsing might continue and lead
to long-distance failures.
Co-authored-by: Pablo Galindo <Pablogsal@gmail.com>
Use PyLong_FromLong(0) and PyLong_FromLong(1) of the public C API
instead. For Python internals, _PyLong_GetZero() and _PyLong_GetOne()
of pycore_long.h can be used.
* Implement running the parser a second time for the error messages
The first parser run is only responsible for detecting whether
there is a `SyntaxError` or not. If there isn't, the AST gets returned.
Otherwise, the parser is run a second time with all the `invalid_*`
rules enabled so that all the customized error messages get produced.
The original tool wasn't working right and it was simpler to create a new one, partially re-using some of the old code. At this point the tool runs properly on the master branch. (Try: ./python Tools/c-analyzer/c-analyzer.py analyze.) It takes ~40 seconds on my machine to analyze the full CPython code base.
Note that we'll need to iron out some OS-specific stuff (e.g. preprocessor). We're okay though since this tool isn't used yet in our workflow. We will also need to verify the analysis results in detail before activating the check in CI, though I'm pretty sure it's close.
https://bugs.python.org/issue36876
This commit reverts commit ac0333e1e1 as the original links are working again and they provide extended features such as comments and alternative versions.
Remove the global _Py_CheckRecursionLimit variable: it has been
replaced by ceval.recursion_limit of the PyInterpreterState
structure.
There is no need to keep the variable for the stable ABI, since
Py_EnterRecursiveCall() and Py_LeaveRecursiveCall() were not usable
in Python 3.8 and older: these macros accessed PyThreadState members,
whereas the PyThreadState structure is opaque in the limited C API.
* Add new capability to the PEG parser to type variable assignments. For instance:
```
| a[asdl_stmt_seq*]=';'.small_stmt+ [';'] NEWLINE { a }
```
* Add new sequence types from the asdl definition (automatically generated)
* Make `asdl_seq` type a generic aliasing pointer type.
* Create a new `asdl_generic_seq` for the generic case using `void*`.
* The old `asdl_seq_GET`/`asdl_seq_SET` macros are now typed.
* New `asdl_seq_GET_UNTYPED`/`asdl_seq_SET_UNTYPED` macros for dealing with generic sequences.
* Changes all possible `asdl_seq` types to use specific versions everywhere.
NuGet automatically includes the .props file from the build directory in the
target using the package, but only if the .props file has the correct
name: it must be $(id).props.
Rename python.props correspondingly in all the nuspec variants. Also
keep python.props as it was for backward compatibility.
Currently, empty sequences in gather rules make the conditional for
gather rules fail as empty sequences evaluate as "False". We need to
explicitly check for "None" (the failure condition) to avoid false
negatives.
Prevent installation on Windows 8 and earlier.
Download UCRT on demand when required (non-updated Windows 8.1 only)
Add reference to py launcher to post-install message
Replace MIDL-generated file with manual GUID definition.
Use the same .def file for release and debug builds.
Update setup build to support latest toolset
This commit removes the old parser, the deprecated parser module, the old parser compatibility flags and environment variables and all associated support code and documentation.
Fix :mod:`ssl` code to be compatible with OpenSSL 1.1.x builds that use
``no-deprecated`` and ``--api=1.1.0``.
Note: Tests assume full OpenSSL API and fail with limited API.
Signed-off-by: Christian Heimes <christian@python.org>
Co-authored-by: Mark Wright <gienah@gentoo.org>
Previously, the result could have been an instance of a subclass of int.
Also revert bpo-26202 and make the attributes start, stop and step of the range
object have exact type int (see the example below).
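For example (assuming the new behavior):
```
class MyInt(int):
    pass

r = range(MyInt(0), MyInt(10), MyInt(2))
# The stored attributes are now exact ints, not MyInt instances.
print(type(r.start) is int, type(r.stop) is int, type(r.step) is int)  # True True True
```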
Add private function _PyNumber_Index() which preserves the old behavior
of PyNumber_Index() for performance to use it in the conversion functions
like PyLong_AsLong().
These are like keywords but they only work in context; they are not reserved except when there is an exact match.
This would enable things like match statements without reserving `match` (which would be bad for the `re.match()` function and probably lots of other places).
Automerge-Triggered-By: @gvanrossum
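A hedged example of what "soft" means in practice (the match statement itself landed separately with PEP 634):
```
import re

# "match" keeps working as an ordinary identifier...
match = re.match(r"(\d+)", "123abc")
print(match.group(1))   # 123

# ...because a soft keyword is only treated as a keyword when the whole
# statement fits the corresponding grammar rule (e.g. "match <subject>:").
```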
The scripts in `Tools/peg_generator/scripts` mostly assume that
`ast.parse` and `compile` use the old parser, since this was the
state of things, while we were developing them. They need to be
updated to always use the correct parser. `_peg_parser` is being
extended to support both parsing and compiling with both parsers.
When there are 2 negative lookaheads in the same rule, let's say `!"(" blabla "," !")"`, there will be 2 `FunctionCall`s whose assigned value is None. Currently, when `add_var` is called,
the first one will be ignored, but when the second lookahead's var is sent to dedupe it
will be returned as `None_1`, and this won't be ignored by the declaration generator in `visit_Alt`. This patch adds an explicit check to `add_var` to distinguish whether there is a variable or not.
Create a `make venv` target, that creates a virtual environment
and installs the dependency in that venv. `make time` and all
the related targets are changed to use the virtual environment
python.
Automerge-Triggered-By: @pablogsal
The following improvements are implemented in this commit:
- `p->error_indicator` is set, in case malloc or realloc fail.
- Avoid memory leaks in the case that realloc fails.
- Call `PyErr_NoMemory()` instead of `PyErr_Format()`, because it requires no memory.
Co-authored-by: Pablo Galindo <Pablogsal@gmail.com>
This is the initial implementation of PEP 615, the zoneinfo module,
ported from the standalone reference implementation (see
https://www.python.org/dev/peps/pep-0615/#reference-implementation for a
link, which has a more detailed commit history).
This includes (hopefully) all functional elements described in the PEP,
but documentation is found in a separate PR. This includes:
1. A pure python implementation of the ZoneInfo class
2. A C accelerated implementation of the ZoneInfo class
3. Tests with 100% branch coverage for the Python code (though C code
coverage is less than 100%).
4. A compile-time configuration option on Linux (though not on Windows)
Differences from the reference implementation:
- The module is arranged slightly differently: the accelerated module is
`_zoneinfo` rather than `zoneinfo._czoneinfo`, which also necessitates
some changes in the test support function. (Suggested by Victor
Stinner and Steve Dower.)
- The tests are arranged slightly differently and do not include the
property tests. The tests live at test/test_zoneinfo/test_zoneinfo.py
rather than test/test_zoneinfo.py or test/test_zoneinfo/__init__.py
because we may do some refactoring in the future that would likely
require this separation anyway; we may:
- include the property tests
- automatically run all the tests against both pure Python and C,
rather than manually constructing C and Python test classes (similar
to the way this works with test_datetime.py, which generates C
and Python test cases from datetimetester.py).
- This includes a compile-time configuration option on Linux (though not
on Windows); added with much help from Thomas Wouters.
- Integration into the CPython build system is obviously different from
building a standalone zoneinfo module wheel.
- This includes configuration to install the tzdata package as part of
CI, though only on the coverage jobs. Introducing a PyPI dependency as
part of the CI build was controversial, and this is seen as less of a
major change, since the coverage jobs already depend on pip and PyPI.
Additional changes that were introduced as part of this PR, most / all of
which were backported to the reference implementation:
- Fixed reference and memory leaks
With much debugging help from Pablo Galindo
- Added smoke tests ensuring that the C and Python modules are built
The import machinery can be somewhat fragile, and the "seamlessly falls
back to pure Python" nature of this module makes it so that a problem
building the C extension or a failure to import the pure Python version
might easily go unnoticed.
- Adjustments to zoneinfo.__dir__
Suggested by Petr Viktorin.
- Slight refactorings as suggested by Steve Dower.
- Removed unnecessary if check on std_abbr
Discovered this because of a missing line in branch coverage.
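Basic usage of the new module (needs time zone data, either from the system or the tzdata package mentioned above):
```
from datetime import datetime
from zoneinfo import ZoneInfo

dt = datetime(2020, 10, 31, 12, tzinfo=ZoneInfo("America/Los_Angeles"))
print(dt.isoformat())   # 2020-10-31T12:00:00-07:00
print(dt.tzname())      # PDT
```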
OpenSSL can be built without support for TLS 1.0 and 1.1. The ssl module
now correctly adheres to the OPENSSL_NO_TLS1 and OPENSSL_NO_TLS1_1 flags.
Also update multissltest to test with latest OpenSSL and LibreSSL
releases.
Signed-off-by: Christian Heimes <christian@python.org>
Automerge-Triggered-By: @tiran
* 1.0.2u (EOL)
* 1.1.0l (EOL)
* 1.1.1g
* 3.0.0-alpha2 (disabled for now)
Build the FIPS provider and create a FIPS configuration file for OpenSSL
3.0.0.
Signed-off-by: Christian Heimes <christian@python.org>
Automerge-Triggered-By: @tiran
Don't hardcode defining_class parameter name to "cls":
* Define CConverter.set_template_dict(): do nothing by default
* CLanguage.render_function() now calls set_template_dict() on all
converters.
This is for the C generator:
- Disallow rule and variable names starting with `_`
- Rename most local variable names generated by the parser to start with `_`
Exceptions:
- Renaming `p` to `_p` will be a separate PR
- There are still some names that might clash, e.g.
- anything starting with `Py`
- C reserved words (`if` etc.)
- Macros like `EXTRA` and `CHECK`
Module C state is now accessible from C-defined heap type methods (PEP 573).
Patch by Marcel Plch and Petr Viktorin.
Co-authored-by: Marcel Plch <mplch@redhat.com>
Co-authored-by: Victor Stinner <vstinner@python.org>
This commit also allows passing flags to the new parser in all interfaces and fixes a bug in the parser generator that was causing rules with actions to be inlined, making them disappear.
This is one of the few files that has intimate knowledge of the pyc file
format. Since it lacks tests it tends to become outdated fairly quickly.
At present it has been broken since the introduction of PEP 552.
Previously every test was building an extension module and
loading it into sys.modules. The tearDown function was thus
not able to clean up correctly, resulting in memory leaks.
With this commit, every test function now builds the extension
module and runs the actual test code in a new process
(using assert_python_ok), so that sys.modules stays intact
and no memory gets leaked.
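Roughly the pattern being described (a sketch; the module and test names are hypothetical):
```
import unittest
from test.support.script_helper import assert_python_ok

class ExtensionTests(unittest.TestCase):
    def test_in_fresh_process(self):
        # Import the freshly built extension and run the checks in a child
        # interpreter, so the parent's sys.modules never holds the module
        # and tearDown has nothing left to leak.
        code = (
            "import _hypothetical_testext as m\n"   # hypothetical module
            "assert m.spam() == 42\n"
        )
        assert_python_ok("-c", code)
```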
When there is a SyntaxError after reading the last input character from
the tokenizer and if no newline follows it, the error message used to be
`unexpected EOF while parsing`, which is wrong.
python-gdb.py now checks for "take_gil" function name to check if a
frame tries to acquire the GIL, instead of checking for
"pthread_cond_timedwait" which is specific to Linux and can be a
different condition than the GIL.
Break up COMPARE_OP into four logically distinct opcodes (see the example after this list):
* COMPARE_OP for rich comparisons
* IS_OP for 'is' and 'is not' tests
* CONTAINS_OP for 'in' and 'not in' tests
* JUMP_IF_NOT_EXC_MATCH for checking exceptions in 'try-except' statements.
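For example (output abridged; exact bytecode varies by version):
```
import dis

dis.dis(compile("a is not b", "<demo>", "eval"))   # IS_OP (inverted)
dis.dis(compile("a in b", "<demo>", "eval"))       # CONTAINS_OP
dis.dis(compile("a < b", "<demo>", "eval"))        # plain COMPARE_OP remains
```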
Add ast.unparse() as a function in the ast module that can be used to unparse an
ast.AST object and produce a string with code that would produce an equivalent ast.AST
object when parsed.
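For example:
```
import ast

tree = ast.parse("x = (1 + 2) * 3")
print(ast.unparse(tree))   # x = (1 + 2) * 3
assert ast.dump(ast.parse(ast.unparse(tree))) == ast.dump(tree)
```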
test_urllib commented since 2007:
commit d9880d07fc
Author: Facundo Batista <facundobatista@gmail.com>
Date: Fri May 25 04:20:22 2007 +0000
Commenting out the tests until find out who can test them in
one of the problematic enviroments.
pynche code commented since 1998 and 2001:
commit ef30092207
Author: Barry Warsaw <barry@python.org>
Date: Tue Dec 15 01:04:38 1998 +0000
Added most of the mechanism to change the strips from color variations
to color constants (i.e. red constant, green constant, blue
constant). But I haven't hooked this up yet because the UI gets more
crowded and the arrows don't reflect the correct values.
Added "Go to Black" and "Go to White" buttons.
commit 741eae0b31
Author: Barry Warsaw <barry@python.org>
Date: Wed Apr 18 03:51:55 2001 +0000
StripWidget.__init__(), update_yourself(): Removed some unused local
variables reported by PyChecker.
__togglegentype(): PyChecker accurately reported that the variable
__gentypevar was unused -- actually this whole method is currently
unused so comment it out.
This is partly a cleanup of the code. It also is preparation for getting the variables from the source (cross-platform) rather than from the symbols.
The change only touches the tool (and its tests).
The "Slot" helper (descriptor) is leaking references due to its caching mechanism. The change includes a partial fix to Slot, but also adds Variable.storage to replace the problematic use of Slot.
https://bugs.python.org/issue38187
This fixes the exception `ValueError: invalid literal for int() with base 10`
if `str(gdbval)` returns a hexadecimal value (e.g. '0xa0'). This is the case if
the output-radix is set to 16 in gdb. See
https://sourceware.org/gdb/onlinedocs/gdb/Numbers.html for more information.
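For illustration only (not necessarily the exact fix applied), parsing with base 0 handles both radix settings:
```
def parse_gdb_int(text):
    # int(..., 0) accepts both "160" and "0xa0", so it works whichever
    # output-radix is configured in gdb.
    return int(text.strip(), 0)

print(parse_gdb_int("0xa0"))   # 160
print(parse_gdb_int("160"))    # 160
```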
In ArgumentClinic, value "NULL" should now be used only for unrepresentable default values
(like in the optional third parameter of getattr). "None" should be used if None is accepted
as argument and passing None has the same effect as not passing the argument at all.
Now the fields have names! Much easier to keep straight as a
reader than the elements of an 18-tuple.
Runs about 10-15% slower: from 10.8s to 12.3s, on my laptop.
Fortunately that's perfectly fine for this maintenance script.
bpo-37151: remove special case for PyCFunction from PyObject_Call
Also, make the undocumented function PyCFunction_Call an alias
of PyObject_Call and deprecate it.
Since PEP 393 in Python 3.3, this value is always 0x10ffff, the
maximum codepoint in Unicode; there's no longer such a thing as a
UCS-2 build of Python, which couldn't properly represent some
characters.
There are a couple of spots left where we still condition on the value
of this constant. Take them out.
This is the converse of GH-15353 -- in addition to plenty of
scripts in the tree that are marked with the executable bit
(and so can be directly executed), there are a few that have
a leading `#!` which could let them be executed, but it doesn't
do anything because they don't have the executable bit set.
Here's a command which finds such files and marks them. The
first line finds files in the tree with a `#!` line *anywhere*;
the next-to-last step checks that the *first* line is actually of
that form. In between we filter out files that already have the
bit set, and some files that are meant as fragments to be
consumed by one or another kind of preprocessor.
$ git grep -l '^#!' \
| grep -vxFf <( \
git ls-files --stage \
| perl -lane 'print $F[3] if (!/^100644/)' \
) \
| grep -ve '\.in$' -e '^Doc/includes/' \
| while read f; do
head -c2 "$f" | grep -qxF '#!' \
&& chmod a+x "$f"; \
done