- In functions where we already hold the same object in differently typed
pointers, use the correctly typed pointer instead of casting the other
pointer a second time.
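A minimal sketch of the pattern (the function and variable names here are
hypothetical, not from the actual change): when a routine already holds both
a typed pointer and a PyObject * to the same object, prefer the pointer that
already has the right type:

    static int
    remember_frame(PyObject *dict, PyObject *key, PyFrameObject *f)
    {
        PyObject *op = (PyObject *)f;   /* generic pointer we already have */

        /* Preferred: reuse op directly ... */
        if (PyDict_SetItem(dict, key, op) < 0)
            return -1;
        /* ... rather than casting f a second time:
           PyDict_SetItem(dict, key, (PyObject *)f);  */
        return 0;
    }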
objects before initializing it. It might be linked
already if there was a Py_Initialize/Py_Finalize
cycle earlier; not unlinking it would break the global
list.
Py_VISIT: cast the `op` argument to PyObject* when calling
`visit()`. Else the caller has to pay too much attention to
this silly detail (e.g., frame_traverse needs to traverse
`struct _frame *` and `PyCodeObject *` pointers too).
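For reference, after this change the macro looks roughly like the following
(a sketch of the Include/objimpl.h definition; the point is the PyObject *
cast, which lets traverse functions pass typed pointers straight through):

    #define Py_VISIT(op)                                        \
        do {                                                    \
            if (op) {                                           \
                int vret = visit((PyObject *)(op), arg);        \
                if (vret)                                       \
                    return vret;                                \
            }                                                   \
        } while (0)

With the cast inside the macro, frame_traverse can write Py_VISIT(f->f_back)
and Py_VISIT(f->f_code) directly, even though those members are
`struct _frame *` and `PyCodeObject *` rather than `PyObject *`.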
examples no longer require any explicit closing to avoid
leaking.
Why the tee-based examples still need explicit closing is (I think)
still a mystery. Part of the mystery is that gc.garbage remains
empty: if some generator in a trash cycle said it needed
finalization, suppressing collection of that cycle, that generator
_would_ show up in gc.garbage.
So this is acting more like, e.g., some tp_traverse slot
isn't visiting all the pointers it should (in which case
the skipped pointer(s) would act like an external root,
silently suppressing collection of everything reachable
from it (or them)).
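A sketch of that suspected failure mode (purely illustrative; the type and
member names are made up): a traverse slot that skips one of its members
hides that edge from the collector, so anything reachable only through it is
kept alive without ever appearing in gc.garbage:

    typedef struct {
        PyObject_HEAD
        PyObject *first;
        PyObject *second;
    } pairobject;

    static int
    pair_traverse(pairobject *self, visitproc visit, void *arg)
    {
        Py_VISIT(self->first);
        /* Hypothetical bug: self->second is never visited.  Any cycle
           that runs through it looks externally referenced, so the
           collector silently keeps it alive and gc.garbage stays empty. */
        return 0;
    }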
problems: first, PyGen_NeedsFinalizing() had an off-by-one bug that
prevented it from ever saying a generator didn't need finalizing, and
second, frame objects cleared themselves in a way that caused their
owning generator to think they were still executable, causing a double
deallocation of objects on the value stack if there was still a loop
on the block stack. This revision also removes some unnecessary
close() operations from test_generators that are now appropriately
handled by the cycle collector.
PySequence_GetItem of the time.strptime() result. Not a high-probability
bug, but not inconceivable either, considering people can provide their own
'time' module.
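A sketch of the defensive pattern involved (hypothetical helper, not the
actual datetime code): because a user-supplied 'time' module controls what
strptime() returns, each PySequence_GetItem() result has to be checked
before use:

    static int
    get_strptime_field(PyObject *st, Py_ssize_t i, long *result)
    {
        PyObject *obj = PySequence_GetItem(st, i);  /* st: strptime() result */
        if (obj == NULL)
            return -1;                  /* error already set; propagate it */
        *result = PyInt_AsLong(obj);
        Py_DECREF(obj);
        if (*result == -1 && PyErr_Occurred())
            return -1;
        return 0;
    }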
an incremental encoder that must retain part of the data between calls
to the encode() method.
Fix the incremental encoder and decoder for the IDNA encoding.
This closes SF patch #1453235.
The test case came from test_generators, not test_itertools.
Ensure we aren't counting any cyclic garbage.
This is weird because it leaks, then reaches a limit:
python.exe -i test_tee.py
>>> leak()
0
[26633 refs]
>>> leak()
0
[26658 refs]
>>> leak()
0
[26683 refs]
>>> leak()
0
[26708 refs]
>>> leak()
0
[26708 refs]
>>> leak()
0
[26708 refs]
>>> leak()
0
passing a string. Martin already fixed the actual crash by ensuring
Py_UNICODE is unsigned. As discussed on python-dev, this fix
removes the possibility of creating a unicode string from a raw buffer.
There is an outstanding question of how to fix the crash in 2.4.
appear. Get rid of them by nuking doctest's default DocTestRunner
instance as part of cleanup(). Also call cleanup() before running the
first test repetition (the test was run once before we get into
the -R branch).