instead of failing with SystemError.
Relative import from non-package now fails with ImportError rather than
SystemError.
(cherry picked from commit 8a9cd20edc)
Add a test to check the current MAGIC_NUMBER against the
expected number for the release if the current release is
at candidate or final level. On test failure, describe to
the developer the procedure for changing the magic number.
This ensures that pre-merge CI will automatically pick up
on magic number changes in maintenance releases (and
explain why those are problematic), rather than relying on
all core developers to be aware of the implications of
such changes.
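A minimal sketch of what such a check can look like (EXPECTED_MAGIC_NUMBER
below is a placeholder, not the value pinned by the actual test):

    import sys
    import unittest
    from importlib.util import MAGIC_NUMBER

    EXPECTED_MAGIC_NUMBER = 3379  # placeholder; the real test pins the release's value

    class MagicNumberTests(unittest.TestCase):
        @unittest.skipUnless(
            sys.version_info.releaselevel in ('candidate', 'final'),
            'only enforced for release candidates and final releases')
        def test_magic_number_is_frozen(self):
            # MAGIC_NUMBER is four bytes: a little-endian 16-bit value plus b'\r\n'
            actual = int.from_bytes(MAGIC_NUMBER[:2], 'little')
            self.assertEqual(
                actual, EXPECTED_MAGIC_NUMBER,
                'bytecode magic number changed; doing so in a maintenance '
                'release invalidates existing .pyc files')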
Multi-phase extension module import now correctly allows the
``m_methods`` field to be used to add module level functions
to instances of non-module types returned from ``Py_mod_create``.
Patch by Xiang Zhang.
Windows.
Originally only b'PYTHONCASEOK' was checked for in os.environ, but that
won't work under Windows, where all environment variables are strings
(on OS X they are bytes).
Thanks to Eryk Sun for the bug report.
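The shape of the fix, as a rough sketch (the real check lives in
importlib's bootstrap code and takes the raw nt/posix environ dict, not
os.environ; the helper name here is made up):

    def _case_insensitivity_requested(raw_environ):
        # raw_environ is the low-level nt/posix module's environ dict:
        # keys are str on Windows and bytes on POSIX/OS X, so accept both.
        return ('PYTHONCASEOK' in raw_environ
                or b'PYTHONCASEOK' in raw_environ)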
modules can't be lazily loaded.
Thanks to Python 3.6 allowing instances of types.ModuleType to have their
__class__ mutated, the restriction can be lifted by calling
create_module() on the wrapped loader.
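A tiny illustration of the Python 3.6 behaviour being relied on (the
subclass below is just a stand-in for LazyLoader's internal lazy module
type):

    import types

    class _LazyModule(types.ModuleType):
        """Stand-in for a module type that intercepts attribute access."""

    mod = types.ModuleType('demo')
    mod.__class__ = _LazyModule  # reassigning a module's __class__ is allowed
    assert isinstance(mod, _LazyModule)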
* Rename libregrtest.main_in_temp_cwd() to libregrtest.main()
* Add regrtest.main_in_temp_cwd() alias to libregrtest.main()
* Move old main_in_temp_cwd() code into libregrtest.Regrtest.main()
* Update multiple scripts to call libregrtest.main()
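A sketch of the resulting entry points (exact file contents not
reproduced here):

    # Lib/test/regrtest.py after the rename: delegate to libregrtest and
    # keep the old name as a backwards-compatible alias.
    from test.libregrtest import main

    main_in_temp_cwd = main

    if __name__ == '__main__':
        main()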
importlib.util.LazyLoader.
The class was checking whether its argument's implementation of
create_module() came directly from importlib.abc.Loader. The problem is
that the classes coming from importlib.machinery do not directly inherit
from the ABC, as they come from _frozen_importlib. Because the
documentation has always said that create_module() was ignored, the
check has simply been removed.
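With the check gone, wrapping a concrete loader from importlib.machinery
works, as in this sketch (module name and path are illustrative):

    import importlib.machinery
    import importlib.util

    loader = importlib.machinery.SourceFileLoader('demo', '/path/to/demo.py')
    # No longer rejected, even though SourceFileLoader is not a direct
    # subclass of importlib.abc.Loader.
    lazy = importlib.util.LazyLoader(loader)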
with no known parent package.
Previously SystemError was raised if the parent package didn't exist
(e.g., __package__ was set to '').
Thanks to Florent Xicluna and Yongzhi Pan for reporting the issue.
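A quick way to see the new behaviour (the sibling module name is
hypothetical; a script run directly has no known parent package):

    # demo.py, run as "python demo.py" rather than via -m
    try:
        from . import helpers  # hypothetical sibling module
    except ImportError as exc:
        print(exc)  # attempted relative import with no known parent package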
In a previous change, __spec__.parent was prioritized over
__package__. That is a backwards-compatibility break, but we do
eventually want __spec__ to be the ground truth for module details. So
this change reverts the change in semantics and instead raises an
ImportWarning when __package__ != __spec__.parent to give people time
to adjust to using spec objects.
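A rough model of the fallback order being described, not the actual
_bootstrap code:

    import warnings

    def _calc_package(globals):
        # __package__ still wins, but a mismatch with __spec__.parent is
        # now reported instead of passing silently.
        package = globals.get('__package__')
        spec = globals.get('__spec__')
        if package is not None:
            if spec is not None and package != spec.parent:
                warnings.warn(
                    '__package__ != __spec__.parent ({!r} != {!r})'.format(
                        package, spec.parent),
                    ImportWarning, stacklevel=3)
            return package
        if spec is not None:
            return spec.parent
        # fall back to __name__/__path__ as before (omitted here)
        return globals.get('__name__')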
not defined for a relative import.
This is the start of work to try and clean up import semantics to rely
more on a module's spec than on the myriad attributes that get set on
a module. Thanks to Rose Ames for the patch.
Known limitations of the current implementation:
- documentation changes are incomplete
- there's a reference leak I haven't tracked down yet
The leak is most visible by running:
./python -m test -R3:3 test_importlib
However, you can also see it by running:
./python -X showrefcount
Importing the array or _testmultiphase modules, and
then deleting them from both sys.modules and the local
namespace shows significant increases in the total
number of active references each cycle. By contrast,
with _testcapi (which continues to use single-phase
initialisation) the global refcounts stabilise after
a couple of cycles.
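The manual check can be scripted roughly like this (run it on a debug
build, or under ./python -X showrefcount; the loop count is arbitrary):

    import gc
    import sys

    for _ in range(5):
        import array
        del sys.modules['array']
        del array
        gc.collect()
        # On a debug build the total should level off after a couple of
        # iterations; a steady climb points at the leak described above.
        if hasattr(sys, 'gettotalrefcount'):
            print(sys.gettotalrefcount())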
The concept of .pyo files no longer exists. Now .pyc files have an
optional `opt-` tag which specifies whether any extra optimizations
beyond the peephole optimizer were applied.
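The tag shows up in the cache file names computed by
importlib.util.cache_from_source (the interpreter tag in the comments
depends on the running version):

    import importlib.util

    print(importlib.util.cache_from_source('spam.py'))
    # e.g. __pycache__/spam.cpython-35.pyc
    print(importlib.util.cache_from_source('spam.py', optimization='1'))
    # e.g. __pycache__/spam.cpython-35.opt-1.pyc (asserts stripped)
    print(importlib.util.cache_from_source('spam.py', optimization='2'))
    # e.g. __pycache__/spam.cpython-35.opt-2.pyc (asserts and docstrings stripped)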
importlib.abc.Loader.exec_module() is also defined.
Before this change, create_module() was optional **and** could return
None to trigger default semantics. This change reduces the options for
choosing default semantics to one, and does so in the most
backport-friendly way (define create_module() to return None).
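A minimal loader sketch following that guidance (the class name and the
exec_module() body are illustrative):

    import importlib.abc

    class MiniLoader(importlib.abc.Loader):
        def create_module(self, spec):
            # Returning None keeps the default module creation semantics.
            return None

        def exec_module(self, module):
            # Populate the module object; this body is just a placeholder.
            module.answer = 42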