Remove sys.getcheckinterval() and sys.setcheckinterval() functions.
They were deprecated since Python 3.2. Use sys.getswitchinterval()
and sys.setswitchinterval() instead.
Also remove the check_interval field of the PyInterpreterState structure.
* regrtest: Add --cleanup option to remove "test_python_*" directories
of previous failed test jobs.
* Add "make cleantest" to run "python3 -m test --cleanup".
At the moment you can definitely use UDPLITE sockets on Linux systems, but it would be good if this support were formalized so that it can easily be detected at runtime.
At the moment, making and using a UDPLITE socket requires hard-coding the protocol number and socket option values, something like the following:
```
>>> import socket
>>> a = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, 136)
>>> b = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, 136)
>>> a.bind(('localhost', 44444))
>>> b.sendto(b'test'*256, ('localhost', 44444))
>>> b.setsockopt(136, 10, 16)
>>> b.sendto(b'test'*256, ('localhost', 44444))
>>> b.setsockopt(136, 10, 32)
>>> b.sendto(b'test'*256, ('localhost', 44444))
>>> b.setsockopt(136, 10, 64)
>>> b.sendto(b'test'*256, ('localhost', 44444))
```
If you look at this through Wireshark, you can see that the packets are different in that the checksums and checksum coverages change.
With the pull request that I am submitting momentarily, you could instead write:
```
>>> import socket
>>> a = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDPLITE)
>>> b = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDPLITE)
>>> a.bind(('localhost', 44444))
>>> b.sendto(b'test'*256, ('localhost', 44444))
>>> b.set_send_checksum_coverage(16)
>>> b.sendto(b'test'*256, ('localhost', 44444))
>>> b.set_send_checksum_coverage(32)
>>> b.sendto(b'test'*256, ('localhost', 44444))
>>> b.set_send_checksum_coverage(64)
>>> b.sendto(b'test'*256, ('localhost', 44444))
```
One can also detect support for UDPLITE just by checking
```
>>> hasattr(socket, 'IPPROTO_UDPLITE')
```
https://bugs.python.org/issue37345
When the test is run with `PYTHONWARNINGS=error`, the environment variable is passed to the Python interpreter used in `assert_python_ok`, where the `DeprecationWarning` from `@asyncio.coroutine` is converted into an error. Ignore the `DeprecationWarning` in `assert_python_ok`.
https://bugs.python.org/issue37323
test_gdb no longer fails if it gets an "unexpected" message on
stderr: it now ignores stderr. The purpose of test_gdb is to test
that python-gdb.py commands work as expected, not to test gdb.
For datetime.datetime.strptime(), the leading zero for some two-digit formats is optional.
This adds a footnote to the strftime/strptime documentation to reflect this fact, and adds some tests to ensure that it is true.
bpo-34903
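As a short illustration of the behavior the footnote documents (the values below are examples, not taken from the report), strptime accepts single-digit values for these fields:
```
>>> from datetime import datetime
>>> datetime.strptime('3/7/2019 4:05', '%m/%d/%Y %H:%M')
datetime.datetime(2019, 3, 7, 4, 5)
```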
The initialization options are: 1) add command-line options, which are appended to sys.argv as if passed on a real command line, and 2) skip the shell restart. The customization dialog is accessed via a new entry on the Run menu.
aifc.openfp() alias to aifc.open(), sunau.openfp() alias to
sunau.open(), and wave.openfp() alias to wave.open() have been
removed. They were deprecated since Python 3.7.
Add --upgrade-deps to venv module
- This allows for pip + setuptools to be automatically upgraded to the latest version on PyPI
- Update documentation to represent this change
bpo-34556: Add --upgrade to venv module
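A minimal sketch, assuming the new option is also mirrored by an `upgrade_deps` keyword of `venv.create()` (an assumption here, not something stated above):
```python
import venv

# Assumed keyword mirroring the new --upgrade-deps command-line option:
# create an environment and upgrade pip/setuptools to their latest PyPI releases.
venv.create("demo-env", with_pip=True, upgrade_deps=True)
```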
bpo-35031, bpo-35998: Reintroduce workaround on Windows and FreeBSD
in test_start_tls_server_1() of test_asyncio: disable TLS v1.3 on the
client context.
uuid could try fallback methods that had no chance of working on a particular
platform, and this could cause spurious test failures, as well as degraded
performance as fallback options were tried and failed.
This fixes both the uuid module and its tests' skipUnless logic to use a
prefiltered list of techniques that have at least a chance of working on that platform.
Patch by Michael Felt (aixtools).
In a subinterpreter, spawning a daemon thread now raises an
exception. Daemon threads were never supported in subinterpreters.
Previously, the subinterpreter finalization crashed with a Python
fatal error if a daemon thread was still running.
* Add _thread._is_main_interpreter()
* threading.Thread.start() now raises RuntimeError if the thread is a
daemon thread and the method is called from a subinterpreter.
* The _thread module now uses Argument Clinic for the new function.
* Use textwrap.dedent() in test_threading.SubinterpThreadingTests
Document reference cycle and resurrected objects issues in
sys.unraisablehook() and threading.excepthook() documentation.
Fix test.support.catch_unraisable_exception(): __exit__() no longer
ignores unraisable exceptions.
Fix test_io test_writer_close_error_on_close(): use a second
catch_unraisable_exception() to catch the BufferedWriter unraisable
exception.
Join the thread to prevent leaking a running thread and leaking a
reference.
Also clean up the test:
* asyncio.WindowsProactorEventLoopPolicy became the default policy, so
there is no need to set it manually.
* Only start the thread once the loop is running.
* Use a shorter sleep in the thread (100 ms rather than 1 sec).
* Use close_loop(loop) rather than loop.close().
* Use longer variable names.
Fix a regression introduced by af8646c805 that was causing code of the form:
if True and False:
do_something()
to be optimized incorrectly, eliminating the block.
The peephole optimizer was not correctly optimizing bytecode after negative deltas were introduced. This is because a special value (255) was being searched for in both the instruction pointer deltas and the line number deltas.
The __exit__() method of test.support.catch_unraisable_exception
context manager now ignores unraisable exception raised when clearing
self.unraisable attribute.
Allow pure Python implementation of pickle to work
even when the C _pickle module is unavailable.
Fix test_pickle when _pickle is missing: declare PyPicklerHookTests
outside "if has_c_implementation:" block.
Fix a race condition at Python shutdown when waiting for threads.
Wait until the Python thread state of all non-daemon threads get
deleted (join all non-daemon threads), rather than just wait until
Python threads complete.
* Add threading._shutdown_locks: set of Thread._tstate_lock locks
of non-daemon threads used by _shutdown() to wait until all Python
thread states get deleted. See Thread._set_tstate_lock().
* Add also threading._shutdown_locks_lock to protect access to
threading._shutdown_locks.
* Add test_finalization_shutdown() test.
regrtest now uses sys.unraisablehook() to mark a test as "environment
altered" (ENV_CHANGED) if it emits an "unraisable exception".
Moreover, regrtest logs a warning in this case.
Use "python3 -m test --fail-env-changed" to catch unraisable
exceptions in tests.
Use catch_unraisable_exception() to ignore 'Exception ignored in:'
error when the internal BufferedWriter of the BufferedRWPair is
destroyed. The C implementation doesn't give access to the
internal BufferedWriter, so just ignore the warning instead.
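For context, a minimal sketch of how test.support.catch_unraisable_exception() can be used (the Broken class is a made-up example):
```python
from test import support

class Broken:
    def __del__(self):
        raise ValueError("boom")   # surfaces as an "unraisable" exception

with support.catch_unraisable_exception() as cm:
    obj = Broken()
    del obj                        # __del__ raises; the hook records it
    assert cm.unraisable.exc_type is ValueError
```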
Rename compile() feature_version parameter to _feature_version and
convert it to a keyword-only parameter.
Also update test_type_comments to pass feature_version as a tuple.
Fix an unintended ValueError from :func:`subprocess.run` when checking for
conflicting `input` and `stdin`, or `capture_output` and `stdout`/`stderr`, arguments
that were explicitly provided with `None` values inside a passed-in
`**kwargs` dict rather than passed directly by name.
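A hedged sketch of the pattern this fixes (the command and values are illustrative only): `stdin` is explicitly present in `**kwargs` but is `None`, so it should no longer be treated as conflicting with `input`:
```python
import subprocess, sys

kwargs = {'stdin': None}   # explicitly provided, but None
result = subprocess.run(
    [sys.executable, '-c', 'import sys; print(sys.stdin.read())'],
    input='hello', text=True, capture_output=True, **kwargs)
print(result.stdout)       # 'hello' plus a newline
```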
Currently, inspect.getfile(str) will report nonsense:
```pytb
>>> inspect.getfile(str)
TypeError: <module 'builtins' (built-in)> is a built-in class
```
This fixes that
https://bugs.python.org/issue37173
There is a possibility that someone (like me) will accidentally omit the parentheses after `FileType`, passing the class itself as the `type` argument, and the parser will silently hold the wrong value until someone tries to use it.
Example:
```python
parser = argparse.ArgumentParser()
parser.add_argument('-x', type=argparse.FileType)
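# note: this passes the FileType class itself; argparse.FileType() (an instance) was intended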
```
https://bugs.python.org/issue37150
Replace two Python function calls with a single one to ensure that no
memory allocation happens between the creation of the invalid object and
the call to _PyObject_IsFreed().
* bpo-36520: reset the encoded word offset when starting a new
line during an email header folding operation
* bpo-36520: add an additional test case, and provide descriptive
comments for the test_folding_with_utf8_encoding_* tests
* bpo-36520: fix whitespace issue
* bpo-36520: changes per reviewer request -- remove extraneous
backslashes; add whitespace between terminating quotes and
line-continuation backslashes; use "bpo-" instead of
"issue #" in comments
In test_fromkeys() the derived test class now supports all arguments in its
constructor so that the class to be tested can use its own constructor in its
fromkeys() implementation.
In test_mutatingiteration() the test now fails as soon as iterating over a
dictionary with one entry, while adding new entries inside the loop, runs
more than one iteration (to avoid endless loops in faulty implementations).
https://bugs.python.org/issue2661
* bpo-21315: Fix parsing of encoded words with missing leading ws.
Because of missing leading whitespace, encoded word would get parsed as
unstructured token. This patch fixes that by looking for encoded words when
splitting tokens with whitespace.
Missing trailing whitespace after an encoded word now registers a defect
instead.
Original patch suggestion by David R. Murray on bpo-21315.
This PR deprecates explicit loop parameters in all public asyncio APIs.
The work is split into several PRs to make review easier.
Second step: streams.py
https://bugs.python.org/issue36373
* bpo-30835: email: Fix AttributeError when parsing invalid Content-Transfer-Encoding
Parsing an email containing a multipart Content-Type, along with a
Content-Transfer-Encoding containing an invalid (non-ASCII-decodable) byte
will fail. email.feedparser.FeedParser._parsegen() gets the header and
attempts to convert it to lowercase before comparing it with the accepted
encodings, but as the header contains an invalid byte, it's returned as a
Header object rather than a str.
Cast the Content-Transfer-Encoding header to a str to avoid this.
Found using the AFL fuzzer.
Reported-by: Daniel Axtens <dja@axtens.net>
Signed-off-by: Andrew Donnellan <andrew@donnellan.id.au>
* Add email and NEWS entry for the bugfix.
* bpo-35805: Add parser for Message-ID header.
This parser is based on the definition of Identification Fields from RFC 5322
Sec 3.6.4.
This should also prevent folding of Message-ID header using RFC 2047 encoded
words and hence fix bpo-35805.
* Prevent folding of non-ascii message-id headers.
* Add fold method to MsgID token to prevent folding.
Add BaseEventLoop.wait_executor_on_close attribute: true by default.
loop.close() now waits for the default executor to finish by default.
Set loop.wait_executor_on_close attribute to False to not wait for
the executor.
Replace asyncio.set_event_loop() with TestCase.set_event_loop() of
test_asyncio.utils: this method calls TestCase.close_loop() which
waits until the executor completes, to avoid leaking dangling
threads.
Inherit from test_asyncio.utils.TestCase rather than
unittest.TestCase.
Modify test_coroutines, test_cprofile, test_generators, test_raise,
test_ssl and test_yield_from to use
support.catch_unraisable_exception() rather than
support.captured_stderr().
test_thread: remove test_save_exception_state_on_error(), which is now
outdated. test_unraisable_exception() checks that sys.unraisablehook()
is called to handle _thread.start_new_thread() exceptions.
test_cprofile now relies on unittest for test discovery: replace
support.run_unittest() with unittest.main().
When inheriting a heap subclass from a vectorcall class that sets
`.tp_call=PyVectorcall_Call` (as recommended in PEP 590), the subclass does
not inherit `_Py_TPFLAGS_HAVE_VECTORCALL`, and thus `PyVectorcall_Call` does
not work for it.
This attempts to solve the issue by:
* always inheriting `tp_vectorcall_offset` unless `tp_call` is overridden
in the subclass
* inheriting _Py_TPFLAGS_HAVE_VECTORCALL for static types, unless `tp_call`
is overridden
* making `PyVectorcall_Call` ignore `_Py_TPFLAGS_HAVE_VECTORCALL`
This means it'll be ever more important to only call `PyVectorcall_Call`
on classes that support vectorcall. In `PyVectorcall_Call`'s intended role
as `tp_call` filler, that's not a problem.
* bpo-37014: Update docstring and Documentation of fileinput.FileInput()
* Explain the behavior of fileinput.FileInput() when reading stdin.
* Update blurb.
* bpo-37014: Fix typo in the docstring and documentation.
Fixed QueueListener in order to avoid random deadlocks.
Unable to add regression tests at the moment due to time constraints; I will add them shortly.
Regarding the implementation: although it's nested, it does not cause any performance issues, and it does not call task_done() in case of an exception (which is the right thing to do, IMHO).
https://bugs.python.org/issue36813
Adds a new option in trace that allows tracing runnable modules. It is
exposed as `--module module_name` as `-m` is already in use for another
argument.
The ssl module now can dump key material to a keylog file and trace TLS
protocol messages with a tracing callback. The default and stdlib
contexts also support SSLKEYLOGFILE env var.
The msg_callback and related enums are private members. The feature
is designed for internal debugging and not for end users.
Signed-off-by: Christian Heimes <christian@python.org>
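A minimal sketch of the key logging hook, assuming it is exposed as the SSLContext.keylog_filename attribute (the file name is illustrative):
```python
import ssl

ctx = ssl.create_default_context()
# Write TLS key material in the NSS key log format (readable by Wireshark).
ctx.keylog_filename = "keylog.txt"
```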
This is an old feature request that appears from time to time. After a year of experimenting with various introspection capabilities in `typing_inspect` on PyPI, I propose to add these two most commonly used functions: `get_origin()` and `get_args()`. These are essentially thin public wrappers around private APIs: `__origin__` and `__args__`.
As discussed in the issue and on the typing tracker, exposing some public helpers instead of `__origin__` and `__args__` directly will give us more flexibility if we will decide to update the internal representation, while still maintaining backwards compatibility.
The implementation is very simple and is essentially a copy from `typing_inspect`, with one exception: `ClassVar` was special-cased in `typing_inspect`, but I think this special-casing doesn't really help and only makes things more complicated.
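A short sketch of what the two helpers return:
```python
from typing import Dict, Union, get_args, get_origin

assert get_origin(Dict[str, int]) is dict
assert get_args(Dict[str, int]) == (str, int)
assert get_origin(Union[int, str]) is Union
assert get_args(Union[int, str]) == (int, str)
```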
Bump the removal to 3.9, indicate that collections.abc has been available since 3.3,
and replace the versionchanged directive with deprecated-removed.
https://bugs.python.org/issue36953
Starting with 3.8, async functions used with mock.patch return an `AsyncMock`. `_accept_connection2` is an async function where create_task is also mocked. Don't mock `create_task`, so that tasks are created from the coroutine returned by `AsyncMock` and the tasks are completed.
https://bugs.python.org/issue37015
As it changes the way functions are called, the PEP 590 implementation
skipped the functions that the GDB integration is looking for
(by name) to find function calls.
Looking for the new helper `cfunction_call_varargs` hopefully fixes the
tests, and thus buildbots.
The changed frame number in test_gdb is due to there being fewer
C calls when calling a built-in method.
* bpo-26836: Add os.memfd_create()
* Use the glibc wrapper for memfd_create()
Co-Authored-By: Christian Heimes <christian@python.org>
* Fix deletions caused by autoreconf.
* Use MFD_CLOEXEC as the default value for *flags*.
* Add memset_s to configure.ac.
* Revert memset_s changes.
* Apply the requested changes.
* Tweak the docs.
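A minimal sketch of the new call described in the list above (Linux-only; the name string is illustrative):
```python
import os

# memfd_create() returns an anonymous, memory-backed file descriptor;
# MFD_CLOEXEC is the default value for the flags argument.
fd = os.memfd_create("example")
with os.fdopen(fd, "wb+") as f:
    f.write(b"hello")
```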
* bpo-22385: Support output separators in hex methods.
Also in binascii.hexlify aka b2a_hex.
The underlying implementation behind all hex generation in CPython uses the
same pystrhex.c implementation. This adds support to bytes, bytearray,
and memoryview objects.
The binascii module functions exist rather than being slated for deprecation
because they return bytes rather than requiring an intermediate step through a
str object.
This change was inspired by MicroPython which supports sep in its binascii
implementation (and does not yet support the .hex methods).
https://bugs.python.org/issue22385
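A short illustration of the separator support (the values are arbitrary):
```
>>> b'\xde\xad\xbe\xef'.hex(':')
'de:ad:be:ef'
>>> b'\xde\xad\xbe\xef'.hex(' ', 2)
'dead beef'
>>> import binascii
>>> binascii.hexlify(b'\xde\xad\xbe\xef', b'-')
b'de-ad-be-ef'
```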
Add explicit `asyncSetUp` and `asyncTearDown` methods.
The rest is the same as for #13228
`AsyncTestCase` creates a loop instance for every test for the sake of test isolation.
Sometimes a loop shared between all tests can speed up test execution a lot, but it requires keeping control of closed resources after every test finishes. Basically, it requires nested supervisors support, which was discussed with @1st1 many times. Sorry, asyncio supervisors have no chance to land in Python 3.8.
The PR intentionally does not provide API for changing the used event loop or getting the test loop: use `asyncio.set_event_loop_policy()` and `asyncio.get_event_loop()` instead.
The PR adds four overridable methods to base `unittest.TestCase` class:
```
def _callSetUp(self):
self.setUp()
def _callTestMethod(self, method):
method()
def _callTearDown(self):
self.tearDown()
def _callCleanup(self, function, /, *args, **kwargs):
function(*args, **kwargs)
```
It allows using asyncio facilities with minimal influence on the unittest code.
Last but not least: the PR respects contextvars. A context variable installed by `asyncSetUp` is available in the test, in `tearDown`, and in a coroutine scheduled by `addCleanup`.
https://bugs.python.org/issue32972
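A minimal usage sketch of the async hooks described above, assuming the class is exposed as `unittest.IsolatedAsyncioTestCase`:
```python
import asyncio
import unittest

class QueueTest(unittest.IsolatedAsyncioTestCase):
    async def asyncSetUp(self):
        self.queue = asyncio.Queue()

    async def test_put_get(self):
        await self.queue.put(1)
        self.assertEqual(await self.queue.get(), 1)

    async def asyncTearDown(self):
        self.assertTrue(self.queue.empty())

if __name__ == "__main__":
    unittest.main()
```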
* Fix the implicit string concatenation in `assert_has_awaits` error message.
* Use "await" instead of "call" in `assert_awaited_with` error message.
https://bugs.python.org/issue37075
_thread.start_new_thread() now logs uncaught exception raised by the
function using sys.unraisablehook(), rather than sys.excepthook(), so
the hook gets access to the function which raised the exception.
I tried to get rid of the `_ProtocolMeta`, but unfortunately it didn't work. My idea to return a generic alias from `@runtime_checkable` made runtime protocols unpickleable. I am not sure what is worse (a custom metaclass or having some classes unpickleable), so I decided to stick with the status quo (since there have been no complaints so far). So essentially this is a copy of the implementation in `typing_extensions` with two modifications:
* Rename `@runtime` to `@runtime_checkable` (plus corresponding updates).
* Allow protocols that extend `collections.abc.Iterable` etc.
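A short sketch of a runtime-checkable protocol (the protocol name is illustrative):
```python
import io
from typing import Protocol, runtime_checkable

@runtime_checkable
class SupportsClose(Protocol):
    def close(self) -> None: ...

assert isinstance(io.StringIO(), SupportsClose)   # structural check at runtime
assert not isinstance(42, SupportsClose)
```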
It has been documented as deprecated and slated for removal in 3.8.
Per a comment on another thread (which I can't find), keep get_coro_wrapper() for now, but always return `None`.
https://bugs.python.org/issue36933
Return a coroutine while patching async functions with a decorator.
Co-authored-by: Andrew Svetlov <andrew.svetlov@gmail.com>
https://bugs.python.org/issue36996
Fix the destructors of _pyio.BytesIO and _pyio.TextIOWrapper: initialize
their _buffer attribute as soon as possible (in the class body),
because it is used by __del__(), which calls close().
Add a new threading.excepthook() function which handles uncaught
Thread.run() exceptions. It can be overridden to control how uncaught
exceptions are handled (see the sketch after the list below).
threading.ExceptHookArgs is not documented on purpose: it should not
be used directly.
* threading.excepthook() and threading.ExceptHookArgs.
* Add _PyErr_Display(): similar to PyErr_Display(), but accept a
'file' parameter.
* Add _thread._excepthook(): C implementation of the exception hook
calling _PyErr_Display().
* Add _thread._ExceptHookArgs: structseq type.
* Add threading._invoke_excepthook_wrapper() which handles the gory
details to ensure that everything remains alive during Python
shutdown.
* Add unit tests.
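A minimal sketch of overriding the hook (the logging behavior is just an example):
```python
import threading

def log_thread_exception(args):
    # args carries exc_type, exc_value, exc_traceback and thread attributes.
    print(f"Uncaught exception in {args.thread.name}: {args.exc_value!r}")

threading.excepthook = log_thread_exception

t = threading.Thread(target=lambda: 1 / 0, name="worker")
t.start()
t.join()
```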
When using the "=" debug functionality of f-strings, use another Constant node (or a merged constant node) instead of adding expr_text to the FormattedValue node.
This will address the common mistake many asyncio users make:
an "except Exception" clause breaking Tasks cancellation.
In addition to this change, we stop inheriting asyncio.TimeoutError
and asyncio.InvalidStateError from their concurrent.futures.*
counterparts. There's no point for these exceptions to share the
inheritance chain.
In 3.9 we'll focus on implementing supervisors and cancel scopes,
which should allow better handling of all exceptions, including
SystemExit and KeyboardInterrupt
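A hedged sketch of the mistake this addresses, assuming the change makes asyncio.CancelledError a BaseException subclass so it escapes an "except Exception" clause:
```python
import asyncio

async def worker():
    try:
        await asyncio.sleep(3600)
    except Exception:
        # An overly broad handler like this used to swallow cancellation;
        # with CancelledError based on BaseException it propagates past it.
        pass

async def main():
    task = asyncio.create_task(worker())
    await asyncio.sleep(0)
    task.cancel()
    try:
        await task
    except asyncio.CancelledError:
        print("cancelled as expected")

asyncio.run(main())
```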
* sys.unraisablehook: add 'err_msg' field to UnraisableHookArgs.
* Use _PyErr_WriteUnraisableMsg() in _ctypes _DictRemover_call()
and gc delete_garbage().
The implementation is straightforward: it just mimics `ClassVar` (since the latter is also a name/access qualifier, not really a type). It is also essentially copied from `typing_extensions`.
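A short illustration (the names are arbitrary):
```python
from typing import Final

MAX_RETRIES: Final = 5
MAX_RETRIES = 6   # a type checker flags this reassignment; runtime is unaffected
```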
It is also possible to link against a library or executable with a
statically linked libpython, but not both with the same DLL. In fact
building a statically linked python is currently broken on Cygwin
for other (related) reasons.
The same problem applies to other POSIX-like layers over Windows
(MinGW, MSYS) but Python's build system does not seem to attempt
to support those platforms at the moment.
Previously, it was hard to tell whether a function should be awaited. It was also incorrect (per PEP 484) to put this in the type hint for coroutine functions. Added this info to the output of builtins.help and pydoc.
https://bugs.python.org/issue36045
* _PyPreConfig_InitCompatConfig() sets utf8_mode to 0.
* Change Py_UTF8Mode default value to 0.
* Fix _PyPreConfig_Copy(): also copy the _config_init attribute.
* _PyPreConfig_AsDict() exports _config_init
* Fix _PyPreConfig_GetGlobalConfig(): use Py_UTF8Mode if it's greater
than 0, even if utf8_mode >= 0.
* Add unit tests on environment variables using Python API.
In development (-X dev) mode and in a debug build, IOBase finalizer
of the _pyio module now logs the exception if the close() method
fails. The exception is ignored silently by default in release build.
test_io: test_error_through_destructor() now uses
support.catch_unraisable_exception() rather than capturing stderr.
PyErr_WriteUnraisable() now creates a traceback object if there is no
current traceback. Moreover, call PyErr_NormalizeException() and
PyException_SetTraceback() to normalize the exception value. Ignore
silently any error.
* _PyPreConfig_GetGlobalConfig() and _PyCoreConfig_GetGlobalConfig()
now do nothing if the configuration was not initialized with
_PyPreConfig_InitCompatConfig() and _PyCoreConfig_InitCompatConfig()
* Remove utf8_mode=-2 special case: use utf8_mode=-1 instead.
* Fix _PyPreConfig_InitPythonConfig():
* isolated = 0 instead of -1
* use_environment = 1 instead of -1
* Rename _PyConfig_INIT to _PyConfig_INIT_COMPAT
* Rename _PyPreConfig_Init() to _PyPreConfig_InitCompatConfig()
* Rename _PyCoreConfig_Init() to _PyCoreConfig_InitCompatConfig()
* PyInterpreterState_New() now uses _PyCoreConfig_InitPythonConfig()
  as the default configuration, but it's very quickly overridden anyway.
* _freeze_importlib.c uses _PyCoreConfig_SetString() to set
program_name.
* Cleanup preconfig_init_utf8_mode(): cmdline is always non-NULL.
* Copy test_exceptions.test_unraisable() to
test_sys.UnraisableHookTest().
* Use catch_unraisable_exception() in test_coroutines,
test_exceptions, test_generators.
* Fixes issue 24882
* Add news file entry for change.
* Change test_concurrent_futures.ThreadPoolShutdownTest
Adjust the shutdown test so that, after submitting three jobs
to the executor, the test checks for fewer than three threads,
instead of looking for exactly three threads.
If idle threads are being recycled properly, then we should have
fewer than three threads.
* Switched idle count to a semaphore, updated tests
As suggested by reviewer tomMoral, swapped the lock-protected counter
with a semaphore to track the number of unused threads.
Adjusted test_threads_terminate to wait for completion of the
previous future before submitting a new one (and checking the
number of threads used).
Also added a new test to confirm the thread pool can be saturated.
* Updates tests as requested by pitrou.
* Correct minor whitespace error.
* Make test_saturation faster
Wrap the callback call inside concurrent.futures' `add_done_callback` so that a callback added to an already-completed future behaves identically to callbacks added to a running future, which are triggered once it completes.
This disallows things like `# type: ignoreé`, which seems wrong.
Also switch to using Py_ISALNUM for the alnum check, for consistency
with other code (and maybe correctness re: locale issues?).
https://bugs.python.org/issue36878
CVE-2019-9948: Avoid file reading by disallowing the unnecessary URL
scheme in URLopener().open() and URLopener().retrieve()
of urllib.request.
Co-Authored-By: SH <push0ebp@gmail.com>