Module C state is now accessible from C-defined heap type methods (PEP 573).
Patch by Marcel Plch and Petr Viktorin.
Co-authored-by: Marcel Plch <mplch@redhat.com>
Co-authored-by: Victor Stinner <vstinner@python.org>
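As a minimal sketch of the pattern this enables (assuming a module
whose state holds a counter field and a heap type created with
PyType_FromModuleAndSpec(); all names here are illustrative)::

    #include "Python.h"

    typedef struct {
        int counter;                    /* hypothetical per-module state */
    } examplemodule_state;

    /* METH_METHOD methods receive the defining class as the second
       argument, from which the module state can be reached even inside
       a method of a heap type (PEP 573). */
    static PyObject *
    Example_bump(PyObject *self, PyTypeObject *defining_class,
                 PyObject *const *args, Py_ssize_t nargs, PyObject *kwnames)
    {
        examplemodule_state *state = PyType_GetModuleState(defining_class);
        if (state == NULL) {
            return NULL;
        }
        state->counter++;
        return PyLong_FromLong(state->counter);
    }

    static PyMethodDef Example_methods[] = {
        {"bump", (PyCFunction)(void(*)(void))Example_bump,
         METH_METHOD | METH_FASTCALL | METH_KEYWORDS,
         "Increment and return the per-module counter."},
        {NULL, NULL, 0, NULL},
    };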
Add a new internal pycore_byteswap.h header file with the following
byte-swapping functions (sketched below):
* _Py_bswap16()
* _Py_bswap32()
* _Py_bswap64()
Use these functions in the _ctypes, sha256 and sha512 modules, and
also in the UTF-32 encoder. The sha256, sha512 and _ctypes modules
are now built with the internal C API.
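A portable sketch of the idea behind these helpers; the actual header
may defer to compiler builtins such as __builtin_bswap32() where
available, and the function name below is illustrative::

    #include <stdint.h>

    /* Swap the byte order of a 32-bit word,
       e.g. 0x11223344 -> 0x44332211. */
    static inline uint32_t
    bswap32_portable(uint32_t word)
    {
        return ((word & UINT32_C(0x000000FF)) << 24)
             | ((word & UINT32_C(0x0000FF00)) << 8)
             | ((word & UINT32_C(0x00FF0000)) >> 8)
             | ((word & UINT32_C(0xFF000000)) >> 24);
    }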
* _PyThreadState_DeleteCurrent() now takes tstate rather than
runtime.
* Add ensure_tstate_not_null() helper to pystate.c.
* Add _PyEval_ReleaseLock() function.
* _PyThreadState_DeleteCurrent() now calls
_PyEval_ReleaseLock(tstate) and frees PyThreadState memory after
this call, not before.
* PyGILState_Release(): rename "tcur" variable to "tstate".
The Py_FatalError() function is replaced with a macro which
automatically logs the name of the current function, unless the
Py_LIMITED_API macro is defined (the macro shape is sketched below).
Changes:
* Add _Py_FatalErrorFunc() function.
* Remove the function name from the message of Py_FatalError() calls
which included the function name.
* Update tests.
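A sketch of the macro shape, assuming the new helper takes the current
function name and the message::

    /* Logs the function name along with the message, then aborts. */
    void _Py_FatalErrorFunc(const char *func, const char *message);

    #ifndef Py_LIMITED_API
       /* __func__ expands to the name of the enclosing function (C99). */
    #  define Py_FatalError(message) _Py_FatalErrorFunc(__func__, (message))
    #endif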
bpo-36389, bpo-38376: The _PyObject_CheckConsistency() function is
now also available in release mode. For example, it can be used to
debug a crash in the visit_decref() function of the GC.
Modify the following functions to also work in release mode:
* _PyDict_CheckConsistency()
* _PyObject_CheckConsistency()
* _PyType_CheckConsistency()
* _PyUnicode_CheckConsistency()
Other changes:
* _PyMem_IsPtrFreed(ptr) now also returns 1 if ptr is NULL (equal
to 0).
* _PyBytesWriter_CheckConsistency() now returns 1 and is only used
with assert().
* Reorder _PyObject_Dump() to write safe fields first, and only
attempt to render repr() at the end.
* subprocess: Add user, group and extra_groups parameters to subprocess.Popen
This adds a `user` parameter to the Popen constructor that will call
setreuid() in the child before calling exec(). This allows processes
running as root to safely drop privileges before running the subprocess
without having to use a preexec_fn.
This also adds a `group` parameter that will call setregid() in
the child process before calling exec().
Finally, an `extra_groups` parameter was added that will call
setgroups() to set the supplemental groups.
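A rough sketch of the privilege drop the child performs between fork()
and exec(); this is illustrative only, the real code lives in the
_posixsubprocess extension and reports failures back to the parent::

    #include <grp.h>        /* setgroups() */
    #include <unistd.h>     /* setregid(), setreuid() */

    /* Order matters: the supplementary groups and the gid must be
       changed while the process still has the privilege to do so,
       i.e. before dropping the uid. */
    static int
    drop_privileges(const gid_t *extra_groups, size_t n_extra,
                    gid_t gid, uid_t uid)
    {
        if (extra_groups != NULL && setgroups(n_extra, extra_groups) < 0) {
            return -1;
        }
        if (setregid(gid, gid) < 0) {
            return -1;
        }
        if (setreuid(uid, uid) < 0) {
            return -1;
        }
        return 0;
    }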
The instance destructor for a type is responsible for preparing
an instance for deallocation by decrementing the reference counts
of its referents.
If an instance belongs to a heap type, the destructor also decrements
the reference count of the instance's type object; for static types,
which are permanently allocated, the type object is unaffected by the
instance destructor.
Previously, the default instance destructor searched the class
hierarchy for an inherited instance destructor and, if present,
would invoke it.
Then, if the instance type is a heap type, it would decrement the
reference count of that heap type. However, this could result in the
premature destruction of a type because the inherited instance
destructor should have already decremented the reference count
of the type object.
This change avoids the premature destruction of the type object
by suppressing the decrement of its reference count when an
inherited, non-default instance destructor has been invoked.
Finally, an assertion on the Py_SIZE of a type was deleted. Heap
types have a nonzero size, so the assertion was incorrect.
https://github.com/python/cpython/pull/15323
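For reference, a minimal sketch of the usual tp_dealloc pattern for a
heap type, where the instance destructor itself drops the reference on
the type after freeing the instance (the name is illustrative)::

    static void
    Example_dealloc(PyObject *self)
    {
        PyTypeObject *tp = Py_TYPE(self);
        /* ... clear and decref the instance's own fields here ... */
        tp->tp_free(self);
        /* Heap types are reference counted like any other object, so
           the destructor drops the reference the instance held on its
           type; static types skip this step. */
        Py_DECREF(tp);
    }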
Replace two Python function calls with a single one to ensure that no
memory allocation happens between the creation of the invalid object
and the call to _PyObject_IsFreed().
When a heap type subclasses a vectorcall class that sets
`.tp_call=PyVectorcall_Call` (as recommended in PEP 590), the subclass
does not inherit `_Py_TPFLAGS_HAVE_VECTORCALL`, and thus
`PyVectorcall_Call` does not work for it.
This attempts to solve the issue by:
* always inheriting `tp_vectorcall_offset` unless `tp_call` is overridden
in the subclass
* inheriting _Py_TPFLAGS_HAVE_VECTORCALL for static types, unless `tp_call`
is overridden
* making `PyVectorcall_Call` ignore `_Py_TPFLAGS_HAVE_VECTORCALL`
This means it becomes even more important to call `PyVectorcall_Call`
only on classes that support vectorcall. In `PyVectorcall_Call`'s
intended role as a `tp_call` filler, that's not a problem.
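A sketch of the PEP 590 arrangement referred to above: a static type
keeps a vectorcall pointer in each instance and installs
PyVectorcall_Call as its tp_call so that classic calls are forwarded
(names are illustrative, error handling omitted)::

    #include "Python.h"
    #include <stddef.h>               /* offsetof() */

    typedef struct {
        PyObject_HEAD
        vectorcallfunc vectorcall;    /* found via tp_vectorcall_offset */
    } CallableObject;

    static PyObject *
    Callable_vectorcall(PyObject *callable, PyObject *const *args,
                        size_t nargsf, PyObject *kwnames)
    {
        /* the real work would go here */
        Py_RETURN_NONE;
    }

    static PyObject *
    Callable_new(PyTypeObject *type, PyObject *args, PyObject *kwds)
    {
        CallableObject *self = (CallableObject *)type->tp_alloc(type, 0);
        if (self != NULL) {
            self->vectorcall = Callable_vectorcall;
        }
        return (PyObject *)self;
    }

    /* PyType_Ready(&Callable_Type) must be called before first use. */
    static PyTypeObject Callable_Type = {
        PyVarObject_HEAD_INIT(NULL, 0)
        .tp_name = "example.Callable",
        .tp_basicsize = sizeof(CallableObject),
        .tp_vectorcall_offset = offsetof(CallableObject, vectorcall),
        .tp_call = PyVectorcall_Call,   /* classic calls use vectorcall */
        .tp_flags = Py_TPFLAGS_DEFAULT | _Py_TPFLAGS_HAVE_VECTORCALL,
        .tp_new = Callable_new,
    };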
Add new trashcan macros to deal with a double deallocation that could occur when the `tp_dealloc` of a subclass calls the `tp_dealloc` of a base class and that base class uses the trashcan mechanism.
Patch by Jeroen Demeyer.
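A usage sketch for the new macros in a tp_dealloc (the type name is
illustrative); passing the deallocation function itself is what lets
the trashcan recognize a nested call from a subclass destructor into
the base class destructor::

    static void
    mytype_dealloc(PyObject *op)
    {
        PyObject_GC_UnTrack(op);
        Py_TRASHCAN_BEGIN(op, mytype_dealloc)
        /* ... decref referents and free owned buffers here ... */
        Py_TYPE(op)->tp_free(op);
        Py_TRASHCAN_END
    }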
Omit the serialno field from the debug hooks on Python memory
allocators to reduce the memory footprint by 5%.
When a fatal memory error is logged, enable tracemalloc instead to get
the traceback of where the memory block was allocated (for example, to
decide where to put a breakpoint).
Compile Python with PYMEM_DEBUG_SERIALNO defined to get the field
back.
This is effectively an un-revert of #11617 and #12024 (reverted in #12159). Portions of those were merged in other PRs (with lower risk) and this represents the remainder. Note that I found 3 different bugs in the original PRs and have fixed them here.
Modify the CLEANBYTE, DEADBYTE and FORBIDDENBYTE constants: use 0xCD,
0xDD and 0xFD, rather than 0xCB, 0xBB and 0xFB, to match the byte
patterns used by the Windows CRT debug malloc() and free().
Replace the _PyMem_IsFreed() function with the _PyMem_IsPtrFreed()
inline function. The new function is much more efficient: it performs
a simple integer comparison rather than a short loop (the idea is
sketched below). It also detects uninitialized bytes and "forbidden
bytes" filled by the debug hooks on memory allocators.
Add unit tests on _PyObject_IsFreed().
If tracemalloc is not tracing Python memory allocations,
_PyMem_DumpTraceback() now suggests enabling tracemalloc to get the
traceback of where the memory block was allocated.
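A sketch of the check this boils down to, assuming a 64-bit platform
for brevity (illustrative, not the exact internal code): with the
debug hooks active, freed and uninitialized memory is filled with
fixed byte patterns, so the test is a handful of integer comparisons::

    #include <stdint.h>

    static inline int
    looks_freed(const void *ptr)
    {
        uintptr_t value = (uintptr_t)ptr;
        return (value == 0                         /* NULL also counts */
                /* CLEANBYTE: newly allocated, uninitialized memory */
                || value == (uintptr_t)0xCDCDCDCDCDCDCDCDULL
                /* DEADBYTE: freed memory */
                || value == (uintptr_t)0xDDDDDDDDDDDDDDDDULL
                /* FORBIDDENBYTE: guard bytes around allocations */
                || value == (uintptr_t)0xFDFDFDFDFDFDFDFDULL);
    }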
* Add a Py_STATIC_INLINE() macro to declare a "static inline"
function. If the compiler supports it, try to always inline the
function, even when no optimization level is specified (a plausible
shape is sketched after this list).
* Modify pydtrace.h to use Py_STATIC_INLINE() when WITH_DTRACE is
not defined.
* Add a unit test on Py_DECREF() to make sure that
_Py_NegativeRefcount() reports the correct filename.
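One plausible shape for such a macro; this is an assumption for
illustration, not the actual definition::

    #if defined(__GNUC__) || defined(__clang__)
       /* Ask GCC/Clang to inline the function even at -O0. */
    #  define Py_STATIC_INLINE(TYPE) \
           static inline __attribute__((always_inline)) TYPE
    #else
    #  define Py_STATIC_INLINE(TYPE) static inline TYPE
    #endif

    /* Usage: */
    Py_STATIC_INLINE(int)
    add_one(int x)
    {
        return x + 1;
    }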
* Fix _PyMem_SetupAllocators("debug"): always restore allocators to
the defaults, rather than only calling _PyMem_SetupDebugHooks().
* Add _PyMem_SetDefaultAllocator() helper to set the "default"
allocator.
* Add _PyMem_GetAllocatorsName(): get the name of the allocators
* main() now uses debug hooks on memory allocators if Py_DEBUG is
defined, rather than calling malloc() directly (see the sketch after
this list)
* Document default memory allocators in C API documentation
* _Py_InitializeCore() now fails with a fatal user error if
PYTHONMALLOC value is an unknown memory allocator, instead of
failing with a fatal internal error.
* Add new tests on the PYTHONMALLOC environment variable
* Add support.with_pymalloc()
* Add the _testcapi.WITH_PYMALLOC constant and expose it as
support.with_pymalloc().
* sysconfig.get_config_var('WITH_PYMALLOC') doesn't work on Windows, so
replace it with support.with_pymalloc().
* pythoninfo: add _testcapi collector for pymem
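For embedders, a minimal sketch of opting into the same debug hooks
through the public API, regardless of how Python was built (roughly
what PYTHONMALLOC=debug selects at startup)::

    #include <Python.h>

    int
    main(void)
    {
        /* Install debug hooks on the Python memory allocators. */
        PyMem_SetupDebugHooks();
        Py_Initialize();
        /* ... embedder code ... */
        Py_Finalize();
        return 0;
    }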
Some parts of the C API are only relevant to larger
applications embedding CPython as a runtime engine.
The helpers to test those APIs are already separated
out into Programs/_testembed.c, this update moves
the associated test cases out into their own dedicated
test file.
bpo-32096, bpo-30860: Partially revert the commit
2ebc5ce42a8a9e047e790aefbf9a94811569b2b6:
* Move structures back from Include/internal/mem.h to
Objects/obmalloc.c
* Remove _PyObject_Initialize() and _PyMem_Initialize()
* Remove Include/internal/pymalloc.h
* Add test_capi.test_pre_initialization_api():
Make sure that it's possible to call Py_DecodeLocale(), and then call
Py_SetProgramName() with the decoded string, before Py_Initialize()
(the call sequence is sketched below).
PyMem_RawMalloc() and Py_DecodeLocale() can be called again before
_PyRuntimeState_Init().
Co-Authored-By: Eric Snow <ericsnowcurrently@gmail.com>
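The pre-initialization sequence exercised by that test looks roughly
like this (a minimal sketch with error handling kept to the
essentials)::

    #include <Python.h>

    int
    main(int argc, char *argv[])
    {
        /* Py_DecodeLocale() and PyMem_RawMalloc()/PyMem_RawFree() may
           be used before Py_Initialize(). */
        wchar_t *program = Py_DecodeLocale(argv[0], NULL);
        if (program == NULL) {
            fprintf(stderr, "Fatal error: cannot decode argv[0]\n");
            return 1;
        }
        Py_SetProgramName(program);   /* also allowed before Py_Initialize() */
        Py_Initialize();
        /* ... run the embedded interpreter ... */
        Py_Finalize();
        PyMem_RawFree(program);
        return 0;
    }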
Don't run Python in an empty environment; instead, copy the current
environment and set PYTHONIOENCODING, so that the test also works with
Python compiled in shared mode (using libpython).
- new PYTHONCOERCECLOCALE config setting
- coerces legacy C locale to C.UTF-8, C.utf8 or UTF-8 by default
- always uses C.UTF-8 on Android
- uses `surrogateescape` on stdin and stdout in the coercion
target locales
- configure option to disable locale coercion at build time
- configure option to disable C locale warning at build time