Changed comments to make sense now that the LazyList-based
examples no longer require any explicit closing to avoid
leaking.

That the tee-based examples still do is (I think) still a
mystery.  Part of the mystery is that gc.garbage remains
empty:  if it were the case that some generator in a trash
cycle said it needed finalization, suppressing collection
of that cycle, that generator _would_ show up in gc.garbage.

So this is acting more like, e.g., some tp_traverse slot
isn't visiting all the pointers it should (in which case
the skipped pointer(s) would act like an external root,
silently suppressing collection of everything reachable
from it (or them)).
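
A minimal sketch, not part of this commit, of the kind of check described
above: it rebuilds a tee-based recursive Fibonacci modeled on the one in
Lib/test/test_generators.py (adapted here to Python 3 syntax), drops the
last external reference, forces a collection, and inspects gc.garbage.
With default collector settings an object lands in gc.garbage only when
something in a trash cycle suppresses its collection, which is why an
empty gc.garbage made the 2006 leak puzzling.

    import gc
    from itertools import tee

    def fib():
        def _isum(g, h):
            # Pairwise sum of two iterators.
            while True:
                yield next(g) + next(h)

        def _fib():
            yield 1
            yield 2
            next(fibTail)            # throw the first value away
            for res in _isum(fibHead, fibTail):
                yield res

        realfib = _fib()
        # Three views of the same underlying generator; two of them are
        # consumed by _fib itself, which is what creates the reference cycle.
        fibHead, fibTail, fibRes = tee(realfib, 3)
        return fibRes

    f = fib()
    print([next(f) for _ in range(10)])   # [1, 2, 3, 5, 8, 13, 21, 34, 55, 89]

    del f          # the tee objects, generator and frame are now cyclic trash
    gc.collect()
    print(gc.garbage)                     # [] -- no object reports needing finalization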
Tim Peters 2006-04-15 01:48:57 +00:00
parent 8ebb28df3a
commit 7f098112ee
1 changed file with 6 additions and 4 deletions


@@ -700,11 +700,12 @@ result for as long as it has not been "consumed" from all of the duplicated
 iterators, whereupon it is deleted. You can therefore print the hamming
 sequence during hours without increasing memory usage, or very little.
 
-The beauty of it is that recursive running after their tail FP algorithms
+The beauty of it is that recursive running-after-their-tail FP algorithms
 are quite straightforwardly expressed with this Python idiom. The problem is
-that this creates the same kind of reference cycle as the m235()
-implementation above, and again we have to explicitly close the innermost
-generator to clean up the cycle.
+that this creates an uncollectable reference cycle, and we have to explicitly
+close the innermost generator to clean up the cycle.
+XXX As of 14-Apr-2006, Tim doubts that anyone understands _why_ some cycle
+XXX is uncollectable here.
 
 Ye olde Fibonacci generator, tee style.
@@ -730,6 +731,7 @@ Ye olde Fibonacci generator, tee style.
 [1, 2, 3, 5, 8, 13, 21, 34, 55, 89, 144, 233, 377, 610, 987, 1597, 2584]
 >>> closer()
 
+XXX Again the tee-based approach leaks without an explicit close().
 """
 
 leak_test1 = """