This is the implementation of PEP 683.
Motivation:
The PR introduces the ability to immortalize instances in CPython, which bypasses reference counting. Tagging objects as immortal allows us to skip certain operations when we know that the object will be around for the entire execution of the runtime.
Note that this by itself will bring a performance regression to the runtime due to the extra reference count checks. However, it brings the ability to have truly immutable objects, which are useful in other contexts such as immutable data sharing between sub-interpreters.
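As a hedged illustration of the mechanism (not the exact PEP 683 / CPython implementation; the sentinel value, struct layout and all names below are assumptions), immortalization amounts to reserving a special reference-count pattern and having the incref/decref paths skip any object that carries it:

```c
#include <stdint.h>

/* Illustrative sentinel; PEP 683 / CPython reserve their own bit pattern. */
#define MY_IMMORTAL_REFCNT ((int64_t)1 << 32)

typedef struct {
    int64_t ob_refcnt;
    /* ... type pointer and payload would follow in a real object ... */
} my_object;

static inline int
my_is_immortal(const my_object *op)
{
    return op->ob_refcnt == MY_IMMORTAL_REFCNT;
}

static inline void
my_decref(my_object *op)
{
    if (my_is_immortal(op)) {
        return;  /* the extra check mentioned above: immortals are never freed */
    }
    if (--op->ob_refcnt == 0) {
        /* deallocate op here */
    }
}
```

The matching check on the incref side is what makes such objects effectively immutable at the refcount level, which is the property the sub-interpreter data-sharing use case relies on.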
* Eliminate all remaining uses of Py_SIZE and Py_SET_SIZE on PyLongObject, adding asserts.
* Change the layout of the size/sign bits in longobject to support the future addition of immortal ints and tagged medium ints (see the sketch after this list).
* Add functions to hide some internals of long object, and for setting sign and digit count.
* Replace uses of IS_MEDIUM_VALUE macro with _PyLong_IsCompact().
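A minimal sketch of the size/sign packing idea referenced in the list above; the bit widths, encoding and names are illustrative assumptions, not CPython's actual longobject layout:

```c
#include <stddef.h>
#include <stdint.h>

#define SIGN_BITS 2   /* assumed encoding: 0 = positive, 1 = zero, 2 = negative */

/* Pack the digit count and the sign into a single tag word. */
static inline uintptr_t
make_tag(size_t ndigits, unsigned int sign)
{
    return ((uintptr_t)ndigits << SIGN_BITS) | sign;
}

static inline size_t tag_ndigits(uintptr_t tag) { return tag >> SIGN_BITS; }
static inline int    tag_sign(uintptr_t tag)    { return (int)(tag & ((1u << SIGN_BITS) - 1)); }

/* "Compact" here means at most one digit, so it is recognizable with a
 * single comparison instead of inspecting size and sign separately. */
static inline int tag_is_compact(uintptr_t tag) { return tag < ((uintptr_t)2 << SIGN_BITS); }
```

Keeping both values in one word leaves spare encodings that a later change could use for immortal ints or tagged medium ints.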
This essentially eliminates the global variable, with the associated benefits. It is also a precursor to isolating this bit of state to PyInterpreterState.
Folks that currently read _Py_RefTotal directly would have to start using _Py_GetGlobalRefTotal() instead.
https://github.com/python/cpython/issues/102304
The Py_CLEAR(), Py_SETREF() and Py_XSETREF() macros now only evaluate
their arguments once. If an argument has side effects, these side
effects are no longer duplicated.
Use temporary variables to avoid duplicating side effects of macro
arguments. If available, use _Py_TYPEOF() to avoid type punning.
Otherwise, use memcpy() for the assignment to prevent a
miscompilation with strict aliasing caused by type punning.
Add _Py_TYPEOF() macro: __typeof__() on GCC and clang.
Add test_py_clear() and test_py_setref() unit tests to _testcapi.
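A hedged sketch of the single-evaluation pattern described above, shown for a SETREF-style macro; the real CPython macros differ in their casts and in the memcpy() fallback when _Py_TYPEOF() is unavailable:

```c
#include <Python.h>

/* Evaluate `dst` and `src` exactly once by stashing them in temporaries, so
 * side effects in the arguments are not duplicated. Assumes a compiler with
 * __typeof__ (GCC/clang); the strict-aliasing-safe fallback is omitted. */
#define MY_SETREF(dst, src)                              \
    do {                                                 \
        __typeof__(dst) *_tmp_dst_ptr = &(dst);          \
        __typeof__(dst) _tmp_old_dst = (*_tmp_dst_ptr);  \
        *_tmp_dst_ptr = (src);                           \
        Py_DECREF(_tmp_old_dst);                         \
    } while (0)
```

With this shape, a call such as `MY_SETREF(array[i++], new_ref)` increments `i` only once, where a plain textual-substitution macro would have evaluated `array[i++]` twice.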
In the limited C API with a debug build, Py_INCREF() is implemented
by calling _Py_IncRef() which calls Py_INCREF(). Only call
_Py_INCREF_STAT_INC() once.
This is the first of several precursors to storing tp_subclasses (and tp_weaklist) on the interpreter state for static builtin types.
We do the following:
* add `_PyStaticType_InitBuiltin()`
* add `_Py_TPFLAGS_STATIC_BUILTIN`
* set it on all static builtin types in `_PyStaticType_InitBuiltin()`
* shuffle some code around to be able to use `_PyStaticType_InitBuiltin()`
* rename `_PyStructSequence_InitType()` to `_PyStructSequence_InitBuiltinWithFlags()`
* add `_PyStructSequence_InitBuiltin()`.
Added a new stable API function ``PyType_FromMetaclass``, which mirrors
the behavior of ``PyType_FromModuleAndSpec`` except that it takes an
additional metaclass argument. This is, e.g., useful for language
binding tools that need to store additional information in the type
object.
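A hedged usage sketch; the spec, slot table and helper below are illustrative placeholders, and only `PyType_FromMetaclass()` itself comes from the change described above:

```c
#include <Python.h>

static PyType_Slot my_slots[] = {
    {0, NULL},   /* no custom slots for this minimal example */
};

static PyType_Spec my_spec = {
    .name = "mymod.MyType",
    .basicsize = sizeof(PyObject),
    .flags = Py_TPFLAGS_DEFAULT,
    .slots = my_slots,
};

/* `module` is the extension module; `metaclass` is a subtype of `type`,
 * e.g. one a binding tool uses to carry extra per-type data. */
static PyObject *
make_type(PyObject *module, PyTypeObject *metaclass)
{
    return PyType_FromMetaclass(metaclass, module, &my_spec, NULL);
}
```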
Use the PyObject* type for parameters of static inline functions:
* Py_SIZE(): same parameter type as PyObject_Size()
* PyList_GET_SIZE(), PyList_SET_ITEM(): same parameter type as
  PyList_Size() and PyList_SetItem()
* PyTuple_GET_SIZE(), PyTuple_SET_ITEM(): same parameter type as
  PyTuple_Size() and PyTuple_SetItem().
In C++, the _PyObject_EXTRA_INIT macro now uses nullptr, rather than
0, to initialize the _ob_next and _ob_prev members of the PyObject
structure.
Fix test_cppext failure when Python is built with
./configure --with-trace-refs.
Fix C++ compiler warnings: "zero as null pointer constant"
(clang -Wzero-as-null-pointer-constant).
* Add the _Py_NULL macro used by static inline functions to use
nullptr in C++.
* Replace NULL with nullptr in _testcppext.cpp.
Fix C++ compiler warnings about "old-style cast"
(g++ -Wold-style-cast) in the Python C API. Use C++
reinterpret_cast<> and static_cast<> casts when the Python C API is
used in C++.
Example of fixed warning:
Include/object.h:107:43: error: use of old-style cast to
‘PyObject*’ {aka ‘struct _object*’} [-Werror=old-style-cast]
#define _PyObject_CAST(op) ((PyObject*)(op))
Add _Py_reinterpret_cast() and _Py_static_cast() macros.
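A hedged sketch of what such cast helpers can look like; the real definitions may differ in detail:

```c
/* In C++, expand to the proper C++ cast to silence -Wold-style-cast;
 * in C, fall back to a plain cast. */
#ifdef __cplusplus
#  define _Py_static_cast(type, expr)      static_cast<type>(expr)
#  define _Py_reinterpret_cast(type, expr) reinterpret_cast<type>(expr)
#else
#  define _Py_static_cast(type, expr)      ((type)(expr))
#  define _Py_reinterpret_cast(type, expr) ((type)(expr))
#endif

/* _PyObject_CAST() can then be expressed in terms of the helper: */
#define _PyObject_CAST(op) _Py_reinterpret_cast(PyObject*, (op))
```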
In the limited C API version 3.11 and newer, the following functions
no longer cast their object pointer argument with _PyObject_CAST() or
_PyObject_CAST_CONST():
* Py_REFCNT(), Py_TYPE(), Py_SIZE()
* Py_SET_REFCNT(), Py_SET_TYPE(), Py_SET_SIZE()
* Py_IS_TYPE()
* Py_INCREF(), Py_DECREF()
* Py_XINCREF(), Py_XDECREF()
* Py_NewRef(), Py_XNewRef()
* PyObject_TypeCheck()
* PyType_Check()
* PyType_CheckExact()
Split the Py_DECREF() implementation into 3 versions to make the code more
readable.
Update the xxlimited.c extension, which uses the limited C API
version 3.11, to pass PyObject* to these functions.
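A hedged sketch of the caller-side adjustment the xxlimited.c update reflects: with the implicit casts gone, code built against the limited C API 3.11+ spells out the PyObject* itself (MyObject is an illustrative extension struct):

```c
#include <Python.h>

typedef struct {
    PyObject_HEAD
    /* ... extension-specific fields ... */
} MyObject;

static void
drop_reference(MyObject *self)
{
    /* Previously "Py_DECREF(self);" compiled because the macro cast its
     * argument; under the limited C API 3.11+ the cast is explicit. */
    Py_DECREF((PyObject *)self);
}
```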
The argument type of the Py_REFCNT(), Py_TYPE(), Py_SIZE() and Py_IS_TYPE()
functions is now "PyObject*", rather than "const PyObject*".
* Also replace "const PyObject*" with "PyObject*" in the following functions:
* _Py_strhex_impl()
* _Py_strhex_with_sep()
* _Py_strhex_bytes_with_sep()
* Remove _PyObject_CAST_CONST() and _PyVarObject_CAST_CONST() macros.
* Py_IS_TYPE() can now use Py_TYPE() in its implementation.
Copying and pickling instances of subclasses of builtin types
bytearray, set, frozenset, collections.OrderedDict, collections.deque,
weakref.WeakSet, and datetime.tzinfo now copies and pickles instance attributes
implemented as slots.
Move forward declarations of Python C API types to a new pytypedefs.h
header file to solve interdependency issues between header files.
pytypedefs.h contains forward declarations of the following types:
* PyCodeObject
* PyFrameObject
* PyGetSetDef
* PyInterpreterState
* PyLongObject
* PyMemberDef
* PyMethodDef
* PyModuleDef
* PyObject
* PyThreadState
* PyTypeObject
When a static inline function is wrapped by a macro which casts its
arguments to the expected type, there is no need for the function to have
a different name than the macro. Use the same name for the macro and the
function to avoid confusion.
Rename _PyUnicode_get_wstr_length() to PyUnicode_WSTR_LENGTH().
Don't rename static inline _Py_NewRef() and _Py_XNewRef() functions,
since the C API exports Py_NewRef() and Py_XNewRef() functions as
regular functions. The name cannot be reused in this case.
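A hedged sketch of the same-name pattern described above, with illustrative names (CPython applies it to Py_SIZE(), Py_TYPE() and friends): the static inline function is defined first, then a function-like macro with the same name adds the argument cast; because function-like macros do not expand recursively, the call inside the macro resolves to the function.

```c
#include <Python.h>

static inline Py_ssize_t
My_SIZE(PyVarObject *ob)
{
    return ob->ob_size;
}
/* Same name as the function: callers may pass any object pointer type. */
#define My_SIZE(ob) My_SIZE(_PyVarObject_CAST(ob))
```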
* Never change types' cached keys. It could invalidate inline attribute objects.
* Lazily create object dictionaries.
* Update specialization of LOAD/STORE_ATTR.
* Don't update shared keys version for deletion of value.
* Update gdb support to handle instance values.
* Rename SPLIT_KEYS opcodes to INSTANCE_VALUE.
Convert the Py_TYPE() and Py_SIZE() macros to static inline
functions. The Py_SET_TYPE() and Py_SET_SIZE() functions must now be
used to set an object type and size.
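A hedged before/after sketch of the migration this requires in extension code (the function and names are illustrative):

```c
#include <Python.h>

static void
adopt_type_and_size(PyVarObject *op, PyTypeObject *new_type, Py_ssize_t n)
{
    /* Before: Py_TYPE(op) = new_type;  Py_SIZE(op) = n;  (no longer compiles) */
    Py_SET_TYPE((PyObject *)op, new_type);
    Py_SET_SIZE(op, n);
}
```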
* Remove code that checks Py_TPFLAGS_HAVE_VERSION_TAG
The field is always present in the type struct, as explained
in the added comment.
* Remove Py_TPFLAGS_HAVE_AM_SEND
The flag is not needed, and since it was added in 3.10 it can be removed now.
Add a new Py_TPFLAGS_DISALLOW_INSTANTIATION type flag to disallow
creating type instances: set tp_new to NULL and don't create the
"__new__" key in the type dictionary.
The flag is set automatically on static types if tp_base is NULL or
&PyBaseObject_Type and tp_new is NULL.
Use the flag on the following types:
* _curses.ncurses_version type
* _curses_panel.panel
* _tkinter.Tcl_Obj
* _tkinter.tkapp
* _tkinter.tktimertoken
* _xxsubinterpretersmodule.ChannelID
* sys.flags type
* sys.getwindowsversion() type
* sys.version_info type
Update MyStr example in the C API documentation to use
Py_TPFLAGS_DISALLOW_INSTANTIATION.
Add _PyStructSequence_InitType() function to create a structseq type
with the Py_TPFLAGS_DISALLOW_INSTANTIATION flag set.
type_new() calls _PyType_CheckConsistency() at exit.
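A hedged sketch of a static type carrying the flag described in this entry; the type name and layout are illustrative:

```c
#include <Python.h>

/* With Py_TPFLAGS_DISALLOW_INSTANTIATION set and tp_new left NULL, calling
 * mymod.result_type() from Python raises TypeError instead of creating an
 * instance, and no "__new__" entry appears in the type dictionary. */
static PyTypeObject MyResult_Type = {
    PyVarObject_HEAD_INIT(NULL, 0)
    .tp_name = "mymod.result_type",
    .tp_basicsize = sizeof(PyObject),
    .tp_flags = Py_TPFLAGS_DEFAULT | Py_TPFLAGS_DISALLOW_INSTANTIATION,
};
```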
* Add Py_TPFLAGS_SEQUENCE and Py_TPFLAGS_MAPPING, add to all relevant standard builtin classes.
* Set relevant flags on collections.abc.Sequence and Mapping.
* Use flags in MATCH_SEQUENCE and MATCH_MAPPING opcodes.
* Inherit Py_TPFLAGS_SEQUENCE and Py_TPFLAGS_MAPPING.
* Add NEWS
* Remove interpreter-state map_abc and seq_abc fields.
Introduce Py_TPFLAGS_IMMUTABLETYPE flag for immutable type objects, and
modify PyType_Ready() to set it for static types.
Co-authored-by: Victor Stinner <vstinner@python.org>
Add the Py_Is(x, y) function to test if the 'x' object is the 'y'
object, the same as "x is y" in Python. Also add the Py_IsNone(),
Py_IsTrue(), Py_IsFalse() functions to test if an object is,
respectively, the None singleton, the True singleton or the False
singleton.
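A hedged sketch of the corresponding C-side checks (the helper function is illustrative):

```c
#include <Python.h>

static int
classify(PyObject *obj, PyObject *sentinel)
{
    if (Py_IsNone(obj)) {                      /* like "obj is None" */
        return 0;
    }
    if (Py_IsTrue(obj) || Py_IsFalse(obj)) {   /* like "obj is True / obj is False" */
        return 1;
    }
    return Py_Is(obj, sentinel);               /* like "obj is sentinel" */
}
```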
The limited C API is now supported if Python is built in debug mode
(if the Py_DEBUG macro is defined). In the limited C API, the
Py_INCREF() and Py_DECREF() functions are now implemented as opaque
function calls, rather than accessing the PyObject.ob_refcnt member
directly, if Python is built in debug mode and the Py_LIMITED_API macro
targets Python 3.10 or newer. It became possible to support the
limited C API in debug mode because the PyObject structure is the
same in release and debug mode since Python 3.8 (see bpo-36465).
The limited C API is still not supported in the --with-trace-refs
special build (Py_TRACE_REFS macro).
This change partially reverts
commit ad3252bad9
and the commit fe2978b3b9.
Many third party C extension modules rely on the ability to use Py_TYPE()
to set an object type, "Py_TYPE(obj) = type;", or Py_SIZE() to set an
object size, "Py_SIZE(obj) = size;".
Convert Py_REFCNT() and Py_SIZE() macros to static inline functions.
They cannot be used as l-value anymore: use Py_SET_REFCNT() and
Py_SET_SIZE() to set an object reference count and size.
Replace &Py_SIZE(self) with &((PyVarObject*)self)->ob_size
in arraymodule.c.
This change is backward incompatible on purpose, to prepare the C API
for an opaque PyObject structure.
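A hedged sketch of the two adjustments mentioned above, with illustrative wrappers:

```c
#include <Python.h>

static void
reset_refcnt(PyObject *op)
{
    /* Before: Py_REFCNT(op) = 1;  (Py_REFCNT() is no longer an l-value) */
    Py_SET_REFCNT(op, 1);
}

static Py_ssize_t *
size_field(PyObject *self)
{
    /* Before: return &Py_SIZE(self);  Taking the address of a function call
     * is invalid, so spell out the field access, as arraymodule.c now does. */
    return &((PyVarObject *)self)->ob_size;
}
```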
Module C state is now accessible from C-defined heap type methods (PEP 573).
Patch by Marcel Plch and Petr Viktorin.
Co-authored-by: Marcel Plch <mplch@redhat.com>
Co-authored-by: Victor Stinner <vstinner@python.org>
PyType_HasFeature() now always calls PyType_GetFlags() to hide
implementation details. Previously, it accessed the
PyTypeObject.tp_flags member directly when the limited C API was not used.
Add fast inlined version _PyType_HasFeature() and _PyType_IS_GC()
for object.c and typeobject.c.
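A hedged usage sketch; the helper is illustrative, only PyType_HasFeature() is the API in question:

```c
#include <Python.h>

static int
supports_gc(PyObject *obj)
{
    /* PyType_HasFeature() now goes through PyType_GetFlags() rather than
     * reading tp_flags itself, keeping the field an implementation detail. */
    return PyType_HasFeature(Py_TYPE(obj), Py_TPFLAGS_HAVE_GC);
}
```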
* Add backcompat defines and move non-limited API declaration to cpython/
This partially reverts commit 2ff58a24e8
which added PyObject_CallNoArgs to the 3.9+ stable ABI. This should not
be done; there are enough other call APIs in the stable ABI to choose from.
* Adjust documentation
Mark all newly public functions as added in 3.9.
Add a note about the 3.8 provisional names.
Add notes on public API.
* Put PyObject_CallNoArgs back in the limited API
* Rename PyObject_FastCallDict to PyObject_VectorcallDict
In the limited C API, PyObject_INIT() and PyObject_INIT_VAR() are now
defined as aliases to PyObject_Init() and PyObject_InitVar() to make
their implementation opaque. This avoids leaking implementation details
in the limited C API.
Exclude the following functions from the limited C API, move them
from object.h to cpython/object.h:
* _Py_NewReference()
* _Py_ForgetReference()
* _PyTraceMalloc_NewReference()
* _Py_GetRefTotal()
Exclude trashcan mechanism from the limited C API: it requires access to
PyTypeObject and PyThreadState structure fields, whereas these structures
are opaque in the limited C API.
The trashcan mechanism never worked with the limited C API. Move it
from object.h to cpython/object.h.
_Py_NewReference() becomes a regular opaque function, rather than a
static inline function in the C API (object.h), to better hide
implementation details.
Move _Py_tracemalloc_config from public pymem.h to internal
pycore_pymem.h header.
Make _Py_AddToAllObjects() private.
* Remove the _Py_INC_REFTOTAL and _Py_DEC_REFTOTAL macros: modify
  _Py_RefTotal directly.
* _Py_ForgetReference() is no longer defined if the Py_TRACE_REFS
macro is not defined.
* Remove the _Py_NewReference() implementation from object.c:
  unify the two implementations in the object.h inline function.
* Fix Py_TRACE_REFS build: _Py_INC_TPALLOCS() macro has been removed.
Move the following functions from the public C API to the internal C
API:
* _PyDebug_PrintTotalRefs(),
* _Py_PrintReferenceAddresses()
* _Py_PrintReferences()
The filename and line numbers are not needed when Py_REF_DEBUG is not
defined.
The static inline _Py_DECREF() function was introduced by
commit 2aaf0c1204.
It is now allowed to add new fields at the end of the PyTypeObject struct without having to allocate a dedicated compatibility flag in tp_flags.
This will reduce the risk of running out of bits in the 32-bit tp_flags value.
Add new trashcan macros to deal with a double deallocation that could occur when the `tp_dealloc` of a subclass calls the `tp_dealloc` of a base class and that base class uses the trashcan mechanism.
Patch by Jeroen Demeyer.
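A hedged sketch of the begin/end style of trashcan usage inside a deallocator; the object type is illustrative and the exact macro protocol should be taken from the headers:

```c
#include <Python.h>

typedef struct {
    PyObject_HEAD
    PyObject *child;   /* may form deep chains, hence the trashcan */
} MyNode;

static void
mynode_dealloc(MyNode *self)
{
    PyObject_GC_UnTrack(self);
    Py_TRASHCAN_BEGIN(self, mynode_dealloc)
    Py_XDECREF(self->child);                   /* may recurse into more deallocs */
    Py_TYPE(self)->tp_free((PyObject *)self);
    Py_TRASHCAN_END
}
```

Passing the deallocation function itself to the begin macro is what lets the mechanism detect the subclass/base-class situation described above and avoid the double deallocation.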
Release builds and debug builds are now ABI compatible: the Py_DEBUG
define no longer implies the Py_TRACE_REFS define, which introduced the
only ABI incompatibility.
A new "./configure --with-trace-refs" build option is now required to
get the Py_TRACE_REFS define, which adds the sys.getobjects() function
and the PYTHONDUMPREFS environment variable.
Changes:
* Add ./configure --with-trace-refs
* Py_DEBUG no longer implies Py_TRACE_REFS
Fix the following clang warning:
Include/cpython/pystate.h:217:3: warning: redefinition of
typedef 'PyThreadState' is a C11 feature [-Wtypedef-redefinition]
* Move object.h code surrounded by "#ifndef Py_LIMITED_API"
to a new Include/cpython/object.h header file.
* "typedef struct _typeobject PyTypeObject;" is now always defined
in object.h, but if Py_LIMITED_API is not defined,
Include/cpython/object.h also defines the structure.
* Complete the printfunc comment to mention Py_LIMITED_API define.
* Add _PyObject_ASSERT_FROM() and _PyObject_ASSERT_FAILED_MSG()
macros.
* PyObject_GC_Track() now calls _PyObject_ASSERT_FAILED_MSG(),
instead of Py_FatalError(), if the object is already tracked, to
dump more information on error.
* _PyObject_GC_TRACK() no longer checks at runtime if the object is
  already tracked; it uses an assertion instead for best performance.
  PyObject_GC_Track() still checks at runtime.
* pycore_object.h now includes pycore_pystate.h.
* Convert _PyObject_GC_TRACK() and _PyObject_GC_UNTRACK() macros to
inline functions.
Configuring Python with ./configure --with-pydebug CFLAGS="-D COUNT_ALLOCS -O0"
makes "make smelly" fail, as some symbols were being exported without the "Py_" or
"_Py" prefixes.
_PyTraceMalloc_NewReference() is now called by _Py_NewReference(), so
move its definition to object.h. Moreover, define it even if
Py_LIMITED_API is defined, since _Py_NewReference() is also exposed
even if Py_LIMITED_API is defined.
Changes:
* Add _PyObject_AssertFailed() function.
* Add _PyObject_ASSERT() and _PyObject_ASSERT_WITH_MSG() macros.
* gc_decref(): replace assert() with _PyObject_ASSERT_WITH_MSG() to
  dump the faulty object if the assertion fails (see the sketch after
  this entry).
_PyObject_AssertFailed() calls:
* _PyMem_DumpTraceback(): try to log the traceback where the object
memory has been allocated if tracemalloc is enabled.
* _PyObject_Dump(): log repr(obj).
* Py_FatalError(): log the current Python traceback.
_PyObject_AssertFailed() uses the _PyObject_IsFreed() heuristic to check
if the object memory has been freed by a debug hook on Python memory
allocators.
Initial patch written by David Malcolm.
Co-Authored-By: David Malcolm <dmalcolm@redhat.com>
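A hedged sketch of the gc_decref() change from the list above; _PyObject_ASSERT_WITH_MSG() is an internal macro, so its exact home header and signature are assumptions here:

```c
#include <Python.h>
/* Assumes the internal header providing _PyObject_ASSERT_WITH_MSG() is
 * available (e.g. when building CPython itself). */

static void
gc_decref_sketch(PyObject *op, Py_ssize_t *gc_refs)
{
    /* On failure, the faulty object is dumped via _PyObject_AssertFailed()
     * instead of only printing the failed expression. */
    _PyObject_ASSERT_WITH_MSG(op, *gc_refs > 0, "refcount is too small");
    (*gc_refs)--;
}
```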
* Add Py_STATIC_INLINE() macro to declare a "static inline" function.
If the compiler supports it, try to always inline the function even if no
optimization level was specified.
* Modify pydtrace.h to use Py_STATIC_INLINE() when WITH_DTRACE is
not defined.
* Add a unit test on Py_DECREF() to make sure that
  _Py_NegativeRefcount() reports the correct filename.
* Modify object.h to ensure that pymem.h is included,
to get _Py_tracemalloc_config variable.
* Move _PyTraceMalloc_XXX() functions to tracemalloc.h,
they need PyObject type. Break circular dependency between pymem.h
and object.h.
tracemalloc now tries to update the traceback when an object is
reused from a "free list" (an optimization for faster object creation,
used by the builtin list type for example); see the sketch after this
entry.
Changes:
* Add the _PyTraceMalloc_NewReference() function, which tries to update
  the Python traceback of a Python object.
* _Py_NewReference() now calls _PyTraceMalloc_NewReference().
* Add a unit test.
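A hedged, simplified sketch of the free-list reuse path this change targets; everything except _Py_NewReference() is illustrative:

```c
#include <Python.h>

typedef struct my_node {
    PyObject_HEAD
    struct my_node *next_free;   /* links dead objects kept for reuse */
} my_node;

static my_node *free_list = NULL;   /* illustrative per-type free list */

static PyObject *
my_node_alloc(PyTypeObject *type)
{
    my_node *op = free_list;
    if (op != NULL) {
        free_list = op->next_free;
        /* Reviving a recycled object: _Py_NewReference() resets the refcount
         * and, with this change, lets tracemalloc record the new traceback. */
        _Py_NewReference((PyObject *)op);
        return (PyObject *)op;
    }
    return type->tp_alloc(type, 0);   /* slow path: a genuinely new object */
}
```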
_PyObject_Dump() now uses a heuristic to check if the object memory
has been freed: log "<freed object>" in that case.
The heuristic relies on the debug hooks on Python memory allocators,
which fill the memory with DEADBYTE (0xDB) when memory is
deallocated. Use PYTHONMALLOC=debug to always enable these debug
hooks.
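A hedged sketch of the kind of DEADBYTE scan such a heuristic relies on; this is simplified and not CPython's exact _PyObject_IsFreed() logic:

```c
#include <stddef.h>

/* The debug memory hooks fill freed blocks with 0xDB, so a run of 0xDB
 * bytes at the start of an object strongly suggests it was already freed. */
static int
looks_freed(const void *p, size_t n)
{
    const unsigned char *bytes = (const unsigned char *)p;
    for (size_t i = 0; i < n; i++) {
        if (bytes[i] != 0xDB) {
            return 0;
        }
    }
    return 1;
}
```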