Since _PyImport_ReInitLock() now calls _PyThread_at_fork_reinit() on
the import lock, the lock is now in a known state: unlocked. It is
therefore safe to acquire it after fork.
PyOS_AfterFork_Child() helper functions now return a PyStatus:
PyOS_AfterFork_Child() is now responsible for handling errors (a sketch
of this pattern follows the list of changes below).
* Move _PySignal_AfterFork() to the internal C API
* Add #ifdef HAVE_FORK on _PyGILState_Reinit(), _PySignal_AfterFork()
and _PyInterpreterState_DeleteExceptMain().
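As a rough illustration of the PyStatus error-handling pattern described above (not CPython's actual code), a helper reports failure through a PyStatus and the caller decides what to do with it; reinit_import_lock() and after_fork_child_sketch() are hypothetical names:
```
#include <Python.h>

/* Hypothetical helper: reinitialize a lock in the child process after fork().
 * On failure it reports the problem through a PyStatus instead of printing a
 * warning and leaving the lock in an unknown state. */
static PyStatus
reinit_import_lock(void)
{
    int ok = 1;  /* stand-in for a call like _PyThread_at_fork_reinit(&lock) */
    if (!ok) {
        return PyStatus_Error("failed to reinitialize the import lock");
    }
    return PyStatus_Ok();
}

/* The caller (conceptually PyOS_AfterFork_Child()) is now the one responsible
 * for handling the error. */
static void
after_fork_child_sketch(void)
{
    PyStatus status = reinit_import_lock();
    if (PyStatus_Exception(status)) {
        /* Shown here with Py_ExitStatusException() for brevity; CPython's
         * actual error handling may differ. */
        Py_ExitStatusException(status);
    }
}
```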
I can add another commit with the new test case I wrote, which verifies that the warning was printed before my change, is no longer printed after it, and that the function no longer returns NULL.
Automerge-Triggered-By: @brettcannon
Otherwise we leave a dangling pointer to freed memory. If we
then initialize a new interpreter in the same process and call
PyImport_ExtendInittab, we will (likely) crash when calling
PyMem_RawRealloc(inittab_copy, ...) since the pointer address
is bogus.
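A generic sketch of the hazard (using a stand-in pointer named table_copy rather than CPython's inittab_copy) is:
```
#include <stdlib.h>

static int *table_copy = NULL;   /* stand-in for inittab_copy */

static void
finalize_sketch(void)
{
    free(table_copy);
    /* Without this reset, table_copy keeps pointing at freed memory, and a
     * later realloc(table_copy, ...) in a fresh interpreter operates on a
     * bogus address. */
    table_copy = NULL;
}

static int
extend_table_sketch(size_t new_count)
{
    /* Safe only because finalize_sketch() reset the pointer to NULL:
     * realloc(NULL, n) behaves like malloc(n). */
    int *p = realloc(table_copy, new_count * sizeof *p);
    if (p == NULL) {
        return -1;
    }
    table_copy = p;
    return 0;
}
```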
Automerge-Triggered-By: @brettcannon
PyFrame_GetCode(frame): return a borrowed reference to the frame
code.
Replace frame->f_code with PyFrame_GetCode(frame) in most code,
except in frameobject.c, genobject.c and ceval.c.
Also add PyFrame_GetLineNumber() to the limited C API.
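A minimal sketch of using the accessors instead of touching frame->f_code directly; report_frame() is a hypothetical function, and the reference handling follows the borrowed-reference behaviour described in this commit message (so the code object is not Py_DECREF'ed here):
```
#include <Python.h>
#include <stdio.h>

static int
report_frame(PyFrameObject *frame)
{
    PyCodeObject *code = PyFrame_GetCode(frame);   /* borrowed, per above */
    int lineno = PyFrame_GetLineNumber(frame);

    PyObject *name = PyObject_GetAttrString((PyObject *)code, "co_name");
    if (name == NULL) {
        return -1;
    }
    const char *utf8 = PyUnicode_AsUTF8(name);
    printf("executing %s at line %d\n", utf8 != NULL ? utf8 : "<unknown>", lineno);
    Py_DECREF(name);
    return 0;
}
```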
Rename _PyInterpreterState_GET_UNSAFE() to _PyInterpreterState_GET()
for consistency with _PyThreadState_GET() and to have a shorter name
(helps fit into 80 columns).
Also add "assert(tstate != NULL);" to the function.
Don't access the PyInterpreterState.config member directly anymore;
use these new functions instead:
* _PyInterpreterState_GetConfig()
* _PyInterpreterState_SetConfig()
* _Py_GetConfig()
Replace _PyInterpreterState_Get() function calls with the
_PyInterpreterState_GET_UNSAFE() macro, which is more efficient but
doesn't check whether tstate or interp is NULL.
_Py_GetConfigsAsDict() now uses _PyThreadState_GET().
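A hedged sketch of using the accessors listed above instead of reading PyInterpreterState.config directly; these are internal C API, so building this requires Py_BUILD_CORE and the internal headers, and the exact header names below are assumptions:
```
#define Py_BUILD_CORE
#include <Python.h>
#include "pycore_interp.h"       /* assumed location of _PyInterpreterState_GetConfig() */
#include "pycore_pylifecycle.h"  /* assumed location of _Py_GetConfig() */

static int
interp_is_verbose(PyInterpreterState *interp)
{
    /* Read the interpreter's configuration through the accessor, not the
     * struct field. */
    const PyConfig *config = _PyInterpreterState_GetConfig(interp);
    return config->verbose;
}

static int
current_interp_is_verbose(void)
{
    /* _Py_GetConfig() reads the configuration of the current interpreter. */
    return _Py_GetConfig()->verbose;
}
```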
The Py_FatalError() function is replaced with a macro which automatically
logs the name of the current function, unless the Py_LIMITED_API macro is
defined (a sketch of the pattern follows the change list below).
Changes:
* Add _Py_FatalErrorFunc() function.
* Remove the function name from the message of Py_FatalError() calls
which included the function name.
* Update tests.
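A rough sketch of the macro pattern described above, using a hypothetical FATAL() macro rather than CPython's own definition, and assuming _Py_FatalErrorFunc() is available outside the limited API as stated above:
```
#include <Python.h>

/* Outside the limited API, forward the current function name automatically
 * via __func__; otherwise fall back to plain Py_FatalError(). */
#ifndef Py_LIMITED_API
#  define FATAL(message) _Py_FatalErrorFunc(__func__, message)
#else
#  define FATAL(message) Py_FatalError(message)
#endif

static void
broken_invariant(void)
{
    /* The log will already name broken_invariant(), so the message itself
     * no longer needs to repeat the function name. */
    FATAL("internal invariant violated");
}
```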
The bulk of this patch was generated automatically with:
for name in \
PyObject_Vectorcall \
Py_TPFLAGS_HAVE_VECTORCALL \
PyObject_VectorcallMethod \
PyVectorcall_Function \
PyObject_CallOneArg \
PyObject_CallMethodNoArgs \
PyObject_CallMethodOneArg \
;
do
echo $name
git grep -lwz _$name | xargs -0 sed -i "s/\b_$name\b/$name/g"
done
old=_PyObject_FastCallDict
new=PyObject_VectorcallDict
git grep -lwz $old | xargs -0 sed -i "s/\b$old\b/$new/g"
and then cleaned up:
- Revert changes in docs & news
- Revert changes to backcompat defines in headers
- Nudge misaligned comments
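A small sketch using the public (un-underscored) names listed above; the objects being called here are arbitrary examples, not anything from the patch itself:
```
#include <Python.h>

static PyObject *
call_examples(PyObject *callable, PyObject *obj, PyObject *arg)
{
    /* One positional argument, no argument tuple allocation. */
    PyObject *r1 = PyObject_CallOneArg(callable, arg);
    if (r1 == NULL) {
        return NULL;
    }
    Py_DECREF(r1);

    /* Method call with no arguments; the method name must be a str object. */
    PyObject *name = PyUnicode_FromString("close");
    if (name == NULL) {
        return NULL;
    }
    PyObject *r2 = PyObject_CallMethodNoArgs(obj, name);
    Py_DECREF(name);
    if (r2 == NULL) {
        return NULL;
    }
    Py_DECREF(r2);

    /* Lowest-level form: a C array of arguments plus an optional kwnames tuple. */
    PyObject *args[] = { arg };
    return PyObject_Vectorcall(callable, args, 1, NULL);
}
```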
Currently, during runtime destruction, `_PyImport_Cleanup` clears the interpreter state before clearing out the modules themselves. This leads to a segfault in modules that rely on the module state to clean themselves up.
For example, let's take the small snippet added in the issue by @DinoV:
```
import _struct
class C:
    def __init__(self):
        self.pack = _struct.pack
    def __del__(self):
        self.pack('I', -42)
_struct.x = C()
```
The module `_struct` uses the module state to run `pack`. Therefore, the module state has to stay alive until after the module has been cleared out, so that `C.__del__` can run successfully. This happens at line 606, when `_PyImport_Cleanup` calls `_PyModule_Clear`. In fact, the loop that calls `_PyModule_Clear` has in its comments:
> Now, if there are any modules left alive, clear their globals to minimize potential leaks. All C extension modules actually end up here, since they are kept alive in the interpreter state.
That means that we can't clear the module state (which is used by C extension modules) before we run that loop.
Moving `_PyInterpreterState_ClearModules` until after that loop fixes the segfault in the code snippet.
Finally, this updates a test in `io` to correctly assert the error that it now throws (since it now finds the io module state). The test in question is `test_create_at_shutdown_without_encoding`. The fact that this test now passes is proof that the module state stays alive even when `__del__` is called at module destruction time, so I didn't add a new test for this.
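A hedged sketch (not _struct's actual code) of why the teardown order matters: a module-level function reaches its per-module state through the module object, so that state must still exist when `__del__` runs the function during shutdown. The example_state type and example_pack() function are hypothetical:
```
#include <Python.h>

typedef struct {
    PyObject *error;   /* hypothetical per-module state */
} example_state;

static PyObject *
example_pack(PyObject *module, PyObject *args)
{
    /* Module-level functions receive the module object as 'self', so the
     * per-module state is one call away. */
    example_state *state = PyModule_GetState(module);
    if (state == NULL) {
        if (!PyErr_Occurred()) {
            PyErr_SetString(PyExc_SystemError, "module state already cleared");
        }
        return NULL;
    }
    /* ... real work would use state->error etc. ... */
    Py_RETURN_NONE;
}
```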
https://bugs.python.org/issue38076
new_interpreter() now calls _PyBuiltin_Init() to create the builtins
module and calls _PyImport_FixupBuiltin(), rather than using
_PyImport_FindBuiltin(tstate, "builtins").
pycore_init_builtins() is now responsible for initializing
interp->builtins_copy: inline _PyImport_Init() and remove this
function.
If _PyImport_FixupExtensionObject() is called from a subinterpreter,
leave extensions unchanged and don't copy the module dictionary
into def->m_base.m_copy.
* Add GCState type for readability
* gcmodule.c now gets its gcstate from tstate
* _PyGC_DumpShutdownStats() now expects tstate rather than runtime
* Rename "state" to "gcstate" for readability: to avoid confusion
between "state" and "tstate" for example.
* collect() now only expects tstate: it gets gcstate from tstate.
* Pass tstate to _PyErr_xxx() functions
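A hedged sketch of the lookup described in the list above. GCState is a readability typedef local to gcmodule.c, the gc field of PyInterpreterState is internal, and the header name below is an assumption, so this reflects the commit message rather than a stable API:
```
#define Py_BUILD_CORE
#include <Python.h>
#include "pycore_interp.h"   /* internal header; assumed to define interp->gc */

typedef struct _gc_runtime_state GCState;

static GCState *
get_gc_state_sketch(PyThreadState *tstate)
{
    /* The collector state hangs off the interpreter and is reached through
     * the thread state, instead of through the global runtime state. */
    return &tstate->interp->gc;
}
```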
Relative imports use resolve_name to get the absolute target name,
which first looks up the current module's absolute package name in the globals:
if __package__ (and __spec__.parent) are missing, then
import falls back to __name__, truncating the last segment if
the module is a submodule rather than a package __init__.py
(which it guesses from whether __path__ is defined).
The __name__ fallback should fail if there is no parent package (top-level modules),
if __name__ is '__main__' (-m entry points), or both (scripts).
That is, if __name__ has no subcomponents and the module does not appear
to be a package __init__ module, then the import should fail.
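A standalone illustration of that fallback (not CPython's resolve_name; guess_package() is a hypothetical name): derive the parent package from __name__ when __package__ and __spec__.parent are missing, and fail when there is no parent.
```
#include <stdio.h>
#include <string.h>

/* Returns the parent package for a relative import, or NULL when the module
 * has no parent package and the import must fail. */
static const char *
guess_package(const char *name, int is_package, char *buf, size_t bufsize)
{
    if (is_package) {
        /* A package's __init__ resolves relative imports against itself. */
        snprintf(buf, bufsize, "%s", name);
        return buf;
    }
    const char *last_dot = strrchr(name, '.');
    if (last_dot == NULL) {
        /* Top-level module or __main__: no parent package. */
        return NULL;
    }
    /* Submodule: truncate the last segment to get the parent package. */
    snprintf(buf, bufsize, "%.*s", (int)(last_dot - name), name);
    return buf;
}

int main(void)
{
    char buf[256];
    printf("%s\n", guess_package("pkg.sub.mod", 0, buf, sizeof buf));  /* pkg.sub */
    printf("%s\n", guess_package("pkg.sub", 1, buf, sizeof buf));      /* pkg.sub */
    const char *p = guess_package("__main__", 0, buf, sizeof buf);
    printf("%s\n", p != NULL ? p : "<no parent package: import fails>");
    return 0;
}
```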
Imports now raise `ImportError` instead of `ValueError` for relative import failures. This makes things consistent between `builtins.__import__` and `importlib.__import__` as well as using a more natural exception for the failure.
https://bugs.python.org/issue37444
Automerge-Triggered-By: @brettcannon
* Rename PyImport_Cleanup() to _PyImport_Cleanup() and move it to the
internal C API. Add 'tstate' parameters.
* Remove documentation of _PyImport_Init(), PyImport_Cleanup(),
_PyImport_Fini(). All three were documented as "For internal use
only.".
* Add 'tstate' parameter to many internal import.c functions.
* _PyImportZip_Init() now gets 'tstate' parameter rather than
'interp'.
* Add 'interp' parameter to _PyState_ClearModules() and rename it
to _PyInterpreterState_ClearModules().
* Move private _PyImport_FindBuiltin() to the internal C API; add
'tstate' parameter to it.
* Remove private _PyImport_AddModuleObject() from the C API:
use public PyImport_AddModuleObject() instead.
* Remove private _PyImport_FindExtensionObjectEx() from the C API:
use private _PyImport_FindExtensionObject() instead.