A change from Duncan Booth, to deal with changes in the way pgen gets
built. Note that graminit.[ch] aren't normally built on Windows (they're
obtained from CVS).
Restoring a line that Neal removed; removing it broke the build.
I'm not sure what Neal's intent was,
since the line following the one he removed was "REQN(i, 1)" so i is
obviously used. ;)
Compiling source that ends with an indented code block but no newline
would raise SyntaxError.
This would have been a four-line change in parsetok.c... Except
codeop.py depends on this behavior, so a compilation flag had to be
invented that causes the tokenizer to revert to the old behavior;
this required extra changes to 2 .h files, 2 .c files, and 2 .py
files. (Fixes SF bug #501622.)
(see Patch 512981). I changed stdin to sys_stdin in the body of the
function, but did not change stderr to sys_stdout, though I suspect that may
be the correct course. I don't know the code involved well enough to judge.
-- replace them with slightly faster PyObject_Call(o,a,NULL). (The
difference is that the latter requires a to be a tuple; the former
allows other values and wraps them in a tuple if necessary; it
involves two more levels of C function calls to accomplish all that.)
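For illustration only (not part of the patch), a minimal sketch contrasting
the two call styles. It assumes the C API of that era, where PyEval_CallObject
is still available; the tuple-wrapping caller here is taken to be
PyEval_CallObject, which matches the description above even though the text
doesn't name it. math.fabs is just an arbitrary callable, and error checks
are omitted for brevity:

    /* call_sketch.c -- illustration only, not from the patch. */
    #include <Python.h>

    int
    main(void)
    {
        Py_Initialize();

        PyObject *math = PyImport_ImportModule("math");
        PyObject *func = PyObject_GetAttrString(math, "fabs");
        PyObject *args = Py_BuildValue("(d)", -3.5);     /* a real tuple */

        /* Old style: tolerates non-tuple args by wrapping them in a
         * tuple, at the cost of extra levels of C function calls. */
        PyObject *r1 = PyEval_CallObject(func, args);

        /* New style used here: args must already be a tuple (kwargs
         * NULL), so the wrapping layer is skipped. */
        PyObject *r2 = PyObject_Call(func, args, NULL);

        printf("%f %f\n", PyFloat_AsDouble(r1), PyFloat_AsDouble(r2));

        Py_XDECREF(r1); Py_XDECREF(r2);
        Py_DECREF(args); Py_DECREF(func); Py_DECREF(math);
        Py_Finalize();
        return 0;
    }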
Change the parser and compiler to use PyMalloc.
Only the files implementing processes that request memory
allocations small enough for PyMalloc to be a win have been
changed:
- Python/compile.c
- Parser/acceler.c
- Parser/node.c
- Parser/parsetok.c
This augments the aggressive overallocation strategy implemented by
Tim Peters in PyNode_AddChild() [Parser/node.c], in reducing the
impact of platform malloc()/realloc()/free() corner case behaviour.
Such corner cases are known to be triggered by test_longexp and
test_import.
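As a rough sketch of what routing an allocation through pymalloc looks like
(this uses the public allocator entry points PyObject_Malloc, PyObject_Realloc
and PyObject_Free purely for illustration; the actual patch edits the parser's
and compiler's own call sites in the files listed above):

    /* pymalloc_sketch.c -- illustration only.  Small requests like these
     * are the cases where pymalloc beats the platform
     * malloc()/realloc()/free(). */
    #include <Python.h>

    int
    main(void)
    {
        Py_Initialize();

        char *p = (char *) PyObject_Malloc(64);     /* was: malloc(64)      */
        p = (char *) PyObject_Realloc(p, 128);      /* was: realloc(p, 128) */
        PyObject_Free(p);                           /* was: free(p)         */

        Py_Finalize();
        return 0;
    }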
Jeremy Hylton, in accepting this patch, recommended this as a
bugfix candidate for 2.2. While the changes to Python/compile.c
and Parser/node.c backport easily (and could go in), the changes
to Parser/acceler.c and Parser/parsetok.c require other not
insignificant changes as a result of the differences in the memory
APIs between 2.3 and 2.2, which I'm not in a position to work
through at the moment. This is a pity, as the Parser/parsetok.c
changes are the most important after the Parser/node.c changes, due
to the size of the memory requests involved and their frequency.
disaster too, so this change is here to stay. Beefed up the comments
and added some stats Andrew reported. Also a small change to the
macro body, to make it obvious how XXXROUNDUP(0) ends up returning 0.
See SF patch 578297 for context.
Not a bugfix candidate, as the functional changes here have already
been backported to the 2.2 line (this patch just improves clarity).
This gets us closer to consistent Ctrl+C behaviour on NT and Win9x.
NT now reliably raises KeyboardInterrupt when a file I/O operation is
aborted. Bugfix candidate.
Added more trivial lexical helper macros so that uses of these guys expand
to nothing at all when they're not enabled. This should help sub-
standard compilers that can't do a good job of optimizing away the
previous "(void)0" expressions.
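The pattern is roughly this (the macro and symbol names below are
illustrative, not the ones in object.h):

    /* expand_to_nothing.c -- sketch of the pattern, not the real macros. */
    #include <stdio.h>

    static long ref_total = 0;

    #ifdef MY_REF_DEBUG
    #define BUMP_REFTOTAL()  (ref_total++)
    #else
    #define BUMP_REFTOTAL()  /* expands to nothing, not to "(void)0" */
    #endif

    int
    main(void)
    {
        BUMP_REFTOTAL();   /* the statement vanishes entirely in normal builds */
        printf("%ld\n", ref_total);
        return 0;
    }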
Py_DECREF: There's only one definition of this now. Yay! That
was the last one in the family defined multiple times in an #ifdef
maze.
Py_FatalError(): Changed the char* signature to const char*.
_Py_NegativeRefcount(): New helper function for the Py_REF_DEBUG
expansion of Py_DECREF. Calling an external function cuts down on
the volume of generated code. The previous inline expansion of abort()
didn't work as intended on Windows (the program often kept going, and
the error msg scrolled off the screen unseen). _Py_NegativeRefcount
calls Py_FatalError instead, which captures our best knowledge of
how to abort effectively across platforms.
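A sketch of what such a helper amounts to (the name and exact signature here
are illustrative, not copied from object.c):

    /* negative_refcount_sketch.c -- illustration of the idea only. */
    #include <Python.h>

    void
    report_negative_refcount(const char *fname, int lineno, PyObject *op)
    {
        char buf[300];
        PyOS_snprintf(buf, sizeof(buf),
                      "%s:%d object at %p has negative ref count %ld",
                      fname, lineno, (void *)op, (long)Py_REFCNT(op));
        /* Py_FatalError knows how to abort effectively on each platform,
         * unlike a bare abort() whose message may scroll away unseen. */
        Py_FatalError(buf);
    }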
Over-allocate a parse node's array of children when the number of
children gets large, to avoid severe platform realloc() degeneration
in extreme cases (like test_longexp).
Bugfix candidate.
This was doing extremely timid over-allocation, just rounding up to the
nearest multiple of 3. Now so long as the number of children is <= 128,
it rounds up to a multiple of 4 but via a much faster method. When the
number of children exceeds 128, though, and more space is needed, it
doubles the capacity. This is aggressive over-allocation.
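A sketch of the rounding strategy just described (the helper name is
illustrative; the real code is the XXXROUNDUP macro mentioned below):

    /* roundup_sketch.c -- illustration of the growth policy, not node.c. */
    #include <stdio.h>

    static int
    round_capacity(int n)
    {
        if (n <= 1)
            return n;                /* a request of 0 comes back as 0 */
        if (n <= 128)
            return (n + 3) & ~3;     /* next multiple of 4, via masking
                                        instead of division */
        /* Past 128 children, round up to the next power of two, so the
         * capacity at least doubles each time more space is needed. */
        int cap = 256;
        while (cap < n)
            cap <<= 1;
        return cap;
    }

    int
    main(void)
    {
        printf("%d %d %d %d\n",
               round_capacity(0),     /* 0   */
               round_capacity(5),     /* 8   */
               round_capacity(128),   /* 128 */
               round_capacity(129));  /* 256 */
        return 0;
    }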
SF patch <http://www.python.org/sf/578297> has Andrew MacIntyre using
PyMalloc in the parser to overcome platform malloc problems in
test_longexp on OS/2 EMX. Jack Jansen notes there that it didn't help
him on the Mac, because the Mac has problems with frequent ever-growing
reallocs, not just with gazillions of teensy mallocs. Win98 has no
visible problems with test_longexp, but I tried boosting the test-case
size and soon got "senseless" MemoryErrors out of it, and soon after
crashed the OS: as I've seen in many other contexts before, while the
Win98 realloc remains zippy in bad cases, it leads to extreme
fragmentation of user address space, to the point that the OS barfs.
I don't yet know whether this fixes Jack's Mac problems, but it does cure
Win98's problems when boosting the test case size. It also speeds
test_longexp in its unaltered state.
Highlights: import and friends will understand any of \r, \n and \r\n
as end of line. Python file input will do the same if you use mode 'U'.
Everything can be disabled by configuring with --without-universal-newlines.
See PEP 278 for details.
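As a hedged example of the mode 'U' behaviour from C (this assumes the
2.3-era C API -- PyFile_FromString no longer exists in Python 3 -- and
"example.txt" is just a placeholder name):

    /* universal_newlines_sketch.c -- illustration only (2.3-era C API). */
    #include <Python.h>

    int
    main(void)
    {
        Py_Initialize();

        /* With mode "U", \r, \n and \r\n in the file all read back as "\n". */
        PyObject *f = PyFile_FromString("example.txt", "U");
        if (f == NULL) {
            PyErr_Clear();                   /* file missing: nothing to show */
        }
        else {
            PyObject *line = PyFile_GetLine(f, 0);   /* reads one line */
            Py_XDECREF(line);
            Py_DECREF(f);
        }

        Py_Finalize();
        return 0;
    }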