Fix a bug I introduced in #9864 by which coroutines were treated as synonymous with coroutine functions.
Also fix the same mistake (coroutines == coroutine functions) already present in other parts of the reference.
I'm very sorry for the hassle.
There is only one trivial change to idle.rst. Nearly all of the changes to help.html remove chapter and section numbers from headers, following changes in the build system; help.py no longer requires header numbering.
Since `SourceFileLoader.set_data()` catches exceptions raised by `_write_atomic()` and logs an informative message when that happens, unconditionally logging a successful outcome in `SourceLoader.get_code()` seems redundant.
https://bugs.python.org/issue35024
Use _PyObject_ASSERT() in:
* _PyDict_CheckConsistency()
* _PyType_CheckConsistency()
* _PyUnicode_CheckConsistency()
_PyObject_ASSERT() dumps the faulty object if the assertion fails,
which helps debugging.
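
As a rough illustration of what this enables, here is a minimal sketch of a consistency check built on _PyObject_ASSERT(); the function name and the specific invariants are hypothetical, and it assumes the macro is reachable from the headers this change touches. Only the macro usage follows the description above.

    #include <Python.h>

    /* Hypothetical consistency check (illustrative only): on failure,
       _PyObject_ASSERT() dumps the faulty object instead of just aborting
       like a plain assert(), which makes post-mortem debugging easier. */
    static int
    check_object_sanity(PyObject *op)
    {
        _PyObject_ASSERT(op, Py_REFCNT(op) >= 1);   /* refcount must be positive */
        _PyObject_ASSERT(op, Py_TYPE(op) != NULL);  /* every object has a type */
        return 1;
    }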
Replace assert() with _PyObject_ASSERT() in Modules/gcmodule.c
to dump the faulty object on assertion failure to ease debugging.
Also fix the indentation of a large comment.
Initial patch written by David Malcolm.
Co-Authored-By: David Malcolm <dmalcolm@redhat.com>
Declare functions with EXTINLINE:
* mpd_del()
* mpd_uint_zero()
* mpd_qresize()
* mpd_qresize_zero()
* mpd_minalloc()
These functions are implemented with "inline" or "ALWAYS_INLINE" but
declared without inline, which causes a linker error on Visual Studio in
Debug mode when using /Ob1.
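
To illustrate the failure mode with a made-up function (this is not libmpdec code, and the exact MSVC behaviour depends on the compiler mode): when the header declares a function with no inline specifier while the definition is marked inline, the compiler is not required to emit an out-of-line symbol, so callers in other translation units can fail to link. Declaring and defining with a matching specifier, which is what the EXTINLINE declarations above accomplish, avoids the mismatch.

    /* header.h -- declaration without an inline specifier (problematic pattern) */
    void zero_buffer(unsigned int *buf, int len);

    /* impl.c -- definition marked plain "inline": in C99 this is an inline
       definition only and does not by itself provide an external symbol,
       so a call compiled in another translation unit may not find
       zero_buffer at link time. */
    inline void
    zero_buffer(unsigned int *buf, int len)
    {
        for (int i = 0; i < len; i++) {
            buf[i] = 0;
        }
    }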
Prior to this revision, after the shutdown of a `BaseServer`,
the server still accepted one final request
if it was sent between the polling of the server socket
and the polling timeout.
This can be problematic, for instance, for a server restart
during which the listening socket is kept open
so that the service is not interrupted.
Because of this behavior, one request would fail.
Note that only one request failed;
subsequent requests were not accepted, as expected.
Visual Studio solution: Set InlineFunctionExpansion to
OnlyExplicitInline ("/Ob1" option) on all projects (in
pyproject.props) in Debug mode on Win32 and x64 platforms to expand
functions marked as inline.
This change should make Python compiled in Debug mode a little bit
faster on Windows. On Unix, GCC uses -Og optimization level for
./configure --with-pydebug.
* Convert PyObject_INIT() and PyObject_INIT_VAR() macros to static
inline functions.
* Fix usage of these functions: cast to PyObject* or PyVarObject*.
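
A minimal sketch of the second point, using a hypothetical extension type (the type and function names are made up): once PyObject_INIT() expects a PyObject *, call sites that hold a concrete struct pointer need an explicit cast.

    #include <Python.h>

    /* Hypothetical object layout, for illustration only. */
    typedef struct {
        PyObject_HEAD
        int value;
    } MyObject;

    static PyTypeObject MyType;   /* assumed to be initialized elsewhere */

    static PyObject *
    my_new(void)
    {
        MyObject *self = PyObject_Malloc(sizeof(MyObject));
        if (self == NULL) {
            return PyErr_NoMemory();
        }
        /* The old macro accepted any pointer type; the static inline
           function takes a PyObject *, hence the explicit cast here. */
        PyObject_INIT((PyObject *)self, &MyType);
        self->value = 0;
        return (PyObject *)self;
    }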
inspect.isfunction() processes both inspect.isfunction(func) and
inspect.isfunction(partial(func, arg)) correctly but some other functions in the
inspect module (iscoroutinefunction, isgeneratorfunction and isasyncgenfunction)
lack this functionality. This commit adds a new check to the mentioned functions
in the inspect module so that they work correctly with arbitrarily nested partial
functions.
_PyTraceMalloc_NewReference() is now called by _Py_NewReference(), so
move its definition to object.h. Moreover, define it even if
Py_LIMITED_API is defined, since _Py_NewReference() is also exposed
even if Py_LIMITED_API is defined.
The MagicMock class supports many magic methods, but not __fspath__. To ease
testing with modules such as os.path, this method is now supported by default.
Changes:
* Add _PyObject_AssertFailed() function.
* Add _PyObject_ASSERT() and _PyObject_ASSERT_WITH_MSG() macros (a sketch
of the pattern is shown after this message).
* gc_decref(): replace assert() with _PyObject_ASSERT_WITH_MSG() to
dump the faulty object if the assertion fails.
_PyObject_AssertFailed() calls:
* _PyMem_DumpTraceback(): try to log the traceback where the object
memory has been allocated if tracemalloc is enabled.
* _PyObject_Dump(): log repr(obj).
* Py_FatalError(): log the current Python traceback.
_PyObject_AssertFailed() uses the _PyObject_IsFreed() heuristic to check
whether the object memory has been freed by a debug hook on Python memory
allocators.
Initial patch written by David Malcolm.
Co-Authored-By: David Malcolm <dmalcolm@redhat.com>
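
The sketch referenced above: an illustration of the assert-to-handler pattern this change introduces. The names and the argument order of the handler are assumptions rather than CPython's actual signatures; only _PyObject_Dump() and Py_FatalError() are real calls.

    #include <Python.h>
    #include <stdio.h>

    /* Illustrative handler: report where the assertion failed, dump the
       object (its repr, or "<freed object>"), then abort with the current
       Python traceback via Py_FatalError(). */
    static void
    my_object_assert_failed(PyObject *obj, const char *msg, const char *expr,
                            const char *file, int line, const char *function)
    {
        fprintf(stderr, "%s:%d: %s: Assertion \"%s\" failed", file, line,
                function, expr);
        if (msg != NULL) {
            fprintf(stderr, ": %s", msg);
        }
        fputc('\n', stderr);
        _PyObject_Dump(obj);
        Py_FatalError("object assertion failed");
    }

    /* Object-aware assert macro: evaluate the condition, and on failure
       hand the faulty object to the handler above. */
    #define MY_OBJECT_ASSERT_WITH_MSG(obj, expr, msg)                 \
        ((expr)                                                       \
          ? (void)0                                                   \
          : my_object_assert_failed((obj), (msg), #expr,              \
                                    __FILE__, __LINE__, __func__))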
* Add Py_STATIC_INLINE() macro to declare a "static inline" function.
If the compiler supports it, try to always inline the function, even if
no optimization level was specified (a sketch follows this list).
* Modify pydtrace.h to use Py_STATIC_INLINE() when WITH_DTRACE is
not defined.
* Add a unit test on Py_DECREF() to make sure that
_Py_NegativeRefcount() reports the correct filename.
* Modify object.h to ensure that pymem.h is included,
to get the _Py_tracemalloc_config variable.
* Move the _PyTraceMalloc_XXX() functions to tracemalloc.h,
since they need the PyObject type. This breaks the circular dependency
between pymem.h and object.h.
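
The sketch mentioned in the first bullet. The exact definition added by the patch is not reproduced here; this is only a plausible shape for such a macro, using a hypothetical name and assuming GCC/Clang-style attributes with a plain fallback elsewhere.

    /* Plausible sketch of a Py_STATIC_INLINE()-style macro: ask GCC/Clang to
       inline the function even when no optimization level is given, and fall
       back to a plain "static inline" on other compilers. */
    #if defined(__GNUC__) || defined(__clang__)
    #  define STATIC_INLINE(TYPE) static inline __attribute__((always_inline)) TYPE
    #else
    #  define STATIC_INLINE(TYPE) static inline TYPE
    #endif

    /* Usage: the return type goes through the macro. */
    STATIC_INLINE(int)
    add_one(int x)
    {
        return x + 1;   /* inlined even at -O0 when always_inline applies */
    }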
tracemalloc now tries to update the traceback when an object is
reused from a "free list" (an optimization for faster object creation,
used by the built-in list type, for example); a sketch follows the
list of changes below.
Changes:
* Add _PyTraceMalloc_NewReference() function which tries to update
the Python traceback of a Python object.
* _Py_NewReference() now calls _PyTraceMalloc_NewReference().
* Add a unit test.
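
The sketch referenced above, with a hypothetical free list (this is not CPython's list implementation): cached objects are revived with _Py_NewReference(), which after this change also calls _PyTraceMalloc_NewReference() so that tracemalloc records the traceback of the reuse instead of keeping the original allocation site.

    #include <Python.h>

    #define MAXFREE 8

    /* Hypothetical free list: instead of releasing an object's memory,
       keep a few blocks around and hand them back out on the next
       allocation request. */
    static PyObject *free_list[MAXFREE];
    static int numfree = 0;

    static PyObject *
    freelist_pop(void)
    {
        if (numfree > 0) {
            PyObject *op = free_list[--numfree];
            /* Revive the cached block: the refcount goes back to 1 and,
               with this change, tracemalloc re-records the allocation
               traceback for "op". */
            _Py_NewReference(op);
            return op;
        }
        return NULL;   /* empty cache: the caller allocates normally */
    }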
.o files generated by clang in LTO mode are actually LLVM bitcode files, which
leads to a few errors during the configure/build step:
- add the LTO flags to BASECFLAGS instead of CFLAGS, as CFLAGS is used
to build the autoconf test cases, and some of the flags are not compatible
with clang LTO (they assume a binary in the .o, not bitcode)
- force llvm-ar instead of ar, as ar is not aware of .o files generated
by clang -flto
_PyObject_Dump() now uses a heuristic to check whether the object memory
has been freed, and logs "<freed object>" in that case.
The heuristic relies on the debug hooks on Python memory allocators,
which fill the memory with DEADBYTE (0xDB) when it is
deallocated. Use PYTHONMALLOC=debug to always enable these debug
hooks.
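
A minimal sketch of the heuristic described above (the helper name is illustrative, not CPython's implementation): a block whose bytes all carry the 0xDB pattern written by the debug hooks is very likely freed memory, so the dump can print a placeholder instead of dereferencing it.

    #include <stddef.h>

    #define DEADBYTE 0xDB   /* pattern written by the debug hooks on free */

    /* Return 1 if the first "n" bytes all match the dead-byte pattern,
       meaning the block was most probably freed by a debug-enabled
       allocator; return 0 as soon as a live-looking byte is found. */
    static int
    looks_freed(const void *ptr, size_t n)
    {
        const unsigned char *bytes = (const unsigned char *)ptr;
        if (ptr == NULL) {
            return 1;
        }
        for (size_t i = 0; i < n; i++) {
            if (bytes[i] != DEADBYTE) {
                return 0;
            }
        }
        return 1;
    }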