PYTHONDUMPREFS output after most teardown. Attempts to use
PYTHONDUMPREFS with the Zope3 test suite died with Py_FatalError(),
since _Py_PrintReferences() can end up executing arbitrary Python code
(for objects that override __repr__), and that requires an intact
interpreter.
even farther down, to just before the call to
_PyObject_DebugMallocStats(). This required the following changes:
- pystate.c, PyThreadState_GetDict(): changed so that it neither raises
an exception nor issues a fatal error when no current thread state is
available; it simply returns NULL (and never raises an exception) --
see the sketch after this list.
- object.c, Py_ReprEnter(): when PyThreadState_GetDict() returns NULL,
don't raise an exception but return 0. This means that when
printing a container that's recursive, printing will go on and on
and on. But that shouldn't happen in the case we care about (see
first bullet).
- Updated Misc/NEWS and Doc/api/init.tex to reflect changes to
PyThreadState_GetDict() definition.
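Caller-side, the new contract looks like this sketch (hypothetical helper,
not part of the patch): a NULL result simply means no per-thread dict is
available, and no exception has been set.

    #include <Python.h>

    /* Hypothetical helper: stash a value in the per-thread dict if one is
       available, and silently do nothing otherwise -- the same policy
       Py_ReprEnter() now follows. */
    static int
    remember_value(const char *key, PyObject *value)
    {
        PyObject *dict = PyThreadState_GetDict();  /* may be NULL; no exception set */
        if (dict == NULL)
            return 0;
        return PyDict_SetItemString(dict, key, value);
    }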
variables to store internal data. As a result, any attempt to use the
unicode system with multiple active interpreters, or successive
interpreter executions, would fail.
Now that information is stored into members of the PyInterpreterState
structure.
import warnings.py _after_ site.py has run. This ensures that site.py
is again the first .py to be imported, giving it back full control over
sys.path.
with an indented code block but no newline would raise SyntaxError.
This would have been a four-line change in parsetok.c... Except
codeop.py depends on this behavior, so a compilation flag had to be
invented that causes the tokenizer to revert to the old behavior;
this required extra changes to 2 .h files, 2 .c files, and 2 .py
files. (Fixes SF bug #501622.)
Initialize the small integers and __builtins__ in startup.
This removes some if conditions.
Change XDECREF to DECREF for values which shouldn't be NULL.
- new import hooks in import.c, exposed in the sys module
- new module called 'zipimport'
- various changes to allow bootstrapping from zip files
I hope I didn't break the Windows build (or anything else for that
matter), but then again, it's been sitting on sf long enough...
Regarding the latest discussions on python-dev: zipimport sets
pkg.__path__ as specified in PEP 273, and likewise, sys.path items such as
/path/to/Archive.zip/subdir/ are supported again.
Py_Init crash". refchain cannot be cleared because objects can live across
Py_Finalize() and Py_Initialize() if they are kept alive by circular
references.
more trivial lexical helper macros so that uses of these guys expand
to nothing at all when they're not enabled. This should help sub-
standard compilers that can't do a good job of optimizing away the
previous "(void)0" expressions.
Py_DECREF: There's only one definition of this now. Yay! That
was the last one in the family defined multiple times in an #ifdef
maze.
Py_FatalError(): Changed the char* signature to const char*.
_Py_NegativeRefcount(): New helper function for the Py_REF_DEBUG
expansion of Py_DECREF. Calling an external function cuts down on
the volume of generated code. The previous inline expansion of abort()
didn't work as intended on Windows (the program often kept going, and
the error msg scrolled off the screen unseen). _Py_NegativeRefcount
calls Py_FatalError instead, which captures our best knowledge of
how to abort effectively across platforms.
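Very roughly, the Py_REF_DEBUG expansion now follows this pattern (a
simplified sketch, not the literal header text; the real macro also
maintains _Py_RefTotal):

    #include <Python.h>

    /* Sketch only: decrement, deallocate at zero, and call the out-of-line
       helper if the count has gone negative. */
    #define SKETCH_DECREF(op)                                   \
        do {                                                    \
            if (--(op)->ob_refcnt == 0)                         \
                _Py_Dealloc((PyObject *)(op));                  \
            else if ((op)->ob_refcnt < 0)                       \
                _Py_NegativeRefcount(__FILE__, __LINE__,        \
                                     (PyObject *)(op));         \
        } while (0)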
The SIGXFSZ signal is sent when the maximum file size limit is
exceeded (RLIMIT_FSIZE). Apparently, it is also sent when the 2GB
file limit is reached on platforms without large file support.
The default action for SIGXFSZ is to terminate the process and dump
core. When it is ignored, the system call that caused the limit to be
exceeded returns an error and sets errno to EFBIG. Python
always checks errno on I/O syscalls, so nothing more needs to be done
for the signal.
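The whole fix amounts to ignoring the signal during interpreter startup;
as a standalone sketch:

    #include <signal.h>

    /* After this, an over-limit write fails with errno == EFBIG instead of
       killing the process with a core dump. */
    static void
    ignore_sigxfsz(void)
    {
    #ifdef SIGXFSZ
        signal(SIGXFSZ, SIG_IGN);
    #endif
    }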
Added code to call this when PYMALLOC_DEBUG is enabled, and envar
PYTHONMALLOCSTATS is set, whenever a new arena is obtained and once
late in the Python shutdown process.
Big Hammer to implement -Qnew as PEP 238 says it should work (a global
option affecting all instances of "/").
pydebug.h, main.c, pythonrun.c: define a private _Py_QnewFlag flag, true
iff -Qnew is passed on the command line. This should go away (as the
comments say) when true division becomes The Rule. This is
deliberately not exposed to runtime inspection or modification: it's
a one-way one-shot switch to pretend you're using Python 3.
ceval.c: when _Py_QnewFlag is set, treat BINARY_DIVIDE as
BINARY_TRUE_DIVIDE.
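In effect the dispatch becomes the following (hypothetical helper, not the
actual ceval.c code):

    #include <Python.h>

    extern int _Py_QnewFlag;    /* set by -Qnew, see above */

    /* Hypothetical helper showing the effect: classic "/" is evaluated as
       true division when the one-shot flag is set. */
    static PyObject *
    do_binary_divide(PyObject *v, PyObject *w)
    {
        if (_Py_QnewFlag)
            return PyNumber_TrueDivide(v, w);
        return PyNumber_Divide(v, w);
    }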
test_{descr, generators, zipfile}.py: fiddle so these pass under
-Qnew too. This was just a matter of s!/!//! in test_generators and
test_zipfile. test_descr was trickier, as testbinop() is passed the
assumption that "/" is the same as calling a "__div__" method; put
a temporary hack there to call "__truediv__" instead when the method
name is "__div__" and 1/2 evaluates to 0.5.
Three standard tests still fail under -Qnew (on Windows; somebody
please try the Linux tests with -Qnew too! Linux runs a whole bunch
of tests Windows doesn't):
test_augassign
test_class
test_coercion
I can't stay awake longer to stare at this (be my guest). Offhand
cures weren't obvious, nor was it even obvious that cures are possible
without major hackery.
Question: when -Qnew is in effect, should calls to __div__ magically
change into calls to __truediv__? See "major hackery" at tail end of
last paragraph <wink>.
uninitialized memory reads reported in bug #478001.
Note that this doesn't address the following larger issues:
- Error conditions are not documented for PyOS_*sig() in the C API.
- Nothing that actually calls PyOS_*sig() in the core interpreter and
extension modules actually /checks/ the return value of the call.
Fixing those is left as an exercise for a later day.
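For what it's worth, the kind of check the second bullet says is missing
would look roughly like this (hypothetical wrapper; assuming SIG_ERR
indicates failure, as with plain signal()):

    #include <Python.h>
    #include <signal.h>

    /* Hypothetical wrapper: install a handler and actually look at the
       result instead of discarding it. */
    static int
    install_handler(int sig, PyOS_sighandler_t handler)
    {
        if (PyOS_setsig(sig, handler) == SIG_ERR)
            return -1;      /* caller decides how to report this */
        return 0;
    }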
This patch changes the logic to:
    if env.var. set and non-empty:
        if env.var. is an integer:
            set flag to that integer
        if flag is zero: # [actually, <= 0 --GvR]
            set flag to 1
Under this patch, anyone currently using
PYTHONVERBOSE=yes will get the same output as before.
PYTHONVERBOSE=2 will generate more verbosity than
before.
The only unusual case is that the following three are
still all equivalent:
PYTHONVERBOSE=yespleas
PYTHONVERBOSE=1
PYTHONVERBOSE=0
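As a standalone sketch of that rule (hypothetical helper; the real code
lives in the interpreter's startup path):

    #include <stdlib.h>

    /* Sketch of the rule above: unset/empty leaves the flag alone; an
       integer value replaces it; any set value that leaves the flag <= 0
       means 1. */
    static int
    flag_from_env(const char *name, int flag)
    {
        const char *s = getenv(name);
        char *end;
        long v;

        if (s == NULL || *s == '\0')
            return flag;
        v = strtol(s, &end, 10);
        if (end != s && *end == '\0')
            flag = (int)v;            /* "PYTHONVERBOSE=2" -> 2 */
        if (flag <= 0)
            flag = 1;                 /* "yes", "0", "-3", ... -> 1 */
        return flag;
    }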
PEP 238. Changes:
- add a new flag variable Py_DivisionWarningFlag, declared in
pydebug.h, defined in object.c, set in main.c, and used in
{int,long,float,complex}object.c. When this flag is set, the
classic division operator issues a DeprecationWarning message.
- add a new API PyRun_SimpleStringFlags() to match
PyRun_SimpleString(). The main() function calls this so that
commands run with -c can also benefit from -Dnew (see the embedding
sketch after the notes below).
- While I was at it, I changed the usage message in main() somewhat:
alphabetized the options, split it in *four* parts to fit in under
512 bytes (not that I still believe this is necessary -- doc strings
elsewhere are much longer), and perhaps most visibly, don't display
the full list of options on each command line error. Instead, the
full list is only displayed when -h is used, and otherwise a brief
reminder of -h is displayed. When -h is used, write to stdout so
that you can do `python -h | more'.
Notes:
- I don't want to use the -W option to control whether the classic
division warning is issued or not, because the machinery to decide
whether to display the warning or not is very expensive (it involves
calling into the warnings.py module). You can use -Werror to turn
the warnings into exceptions though.
- The -Dnew option doesn't select future division for all of the
program -- only for the __main__ module. I don't know if I'll ever
change this -- it would require changes to the .pyc file magic
number to do it right, and a more global notion of compiler flags.
- You can usefully combine -Dwarn and -Dnew: this gives the __main__
module new division, and warns about classic division everywhere
else.
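As promised above, a minimal embedding sketch of PyRun_SimpleStringFlags()
(assumptions: the bitmask form of PyCompilerFlags with a cf_flags member,
and CO_FUTURE_DIVISION as the relevant bit; 2.x-era code, print statement
and all):

    #include <Python.h>
    #include "compile.h"    /* CO_FUTURE_DIVISION, if not already pulled in */

    int main(void)
    {
        PyCompilerFlags cf;

        Py_Initialize();
        cf.cf_flags = CO_FUTURE_DIVISION;           /* future division for this command */
        PyRun_SimpleStringFlags("print 1/2", &cf);  /* prints 0.5, not 0 */
        Py_Finalize();
        return 0;
    }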
- Do not compile unicodeobject, unicodectype, and unicodedata if Unicode is disabled
- check for Py_USING_UNICODE in all places that use Unicode functions
- disable unicode literals and the Unicode-related builtin functions
- add the types.StringTypes list
- remove Unicode literals from most tests.
_PyImport_FixupExtension() on the exceptions module. Now
reload(exceptions) acts just like reload(sys) instead of raising
an ImportError.
This closes SF bug #422004.
Replace uses of PyCF_xxx with CO_xxx.
Replace individual feature slots in PyFutureFeatures with single
bitmask ff_features.
When flags must be transferred among the three parts of the interpreter
that care about them -- the pythonrun layer, the compiler, and the
future feature parser -- they can simply be or'ed (|) together.
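With everything a bitmask, the merge is a one-line OR; a sketch, assuming
the cf_flags member name for PyCompilerFlags:

    #include <Python.h>
    #include "compile.h"    /* PyFutureFeatures, CO_xxx bits */

    /* Sketch: fold the future-feature bits collected by the parser into
       the compiler flags (e.g. CO_NESTED, CO_GENERATOR_ALLOWED). */
    static void
    merge_future_flags(PyCompilerFlags *cf, PyFutureFeatures *ff)
    {
        cf->cf_flags |= ff->ff_features;
    }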
- Add an explicit call to PyType_Ready(&PyList_Type) to pythonrun.c
(just for the heck of it, really -- we should either explicitly
ready all types, or none).
that 'yield' is a keyword. This doesn't help test_generators at all! I
don't know why not. These things do work now (and didn't before this
patch):
1. "from __future__ import generators" now works in a native shell.
2. Similarly "python -i xxx.py" now has generators enabled in the
shell if xxx.py had them enabled.
3. This program (which was my doctest proxy) works fine:
from __future__ import generators
source = """\
def f():
    yield 1
"""
exec compile(source, "", "single") in globals()
print type(f())
that info to code dynamically compiled *by* code compiled with generators
enabled. Doesn't yet work because there's still no way to tell the parser
that "yield" is OK (unlike nested_scopes, the parser has its fingers in
this too).
Replaced PyEval_GetNestedScopes by a more-general
PyEval_MergeCompilerFlags. Perhaps I should not have? I doubted it was
*intended* to be part of the public API, so just did.
but apparently he had to go to school, so I am checking it in for him.
This makes PyRun_HandleSystemExit() a static instead, called
handle_system_exit(), and lets it use the current exception rather than
having one passed in. This slightly simplifies the code.
Update docstring and library reference section on 'sys' module.
New API PyErr_Display, just for displaying errors, called by excepthook.
Uncaught exceptions now call sys.excepthook; if that fails, we fall back
to calling PyErr_Display directly.
Also comes with sys.__excepthook__ and sys.__displayhook__.
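A sketch of the fallback path as seen from the embedding side (simplified;
the real logic lives in the error-printing code):

    #include <Python.h>

    /* Show the current exception the way the fallback does: hand it
       straight to PyErr_Display(), bypassing sys.excepthook. */
    static void
    show_current_exception(void)
    {
        PyObject *type, *value, *tb;

        PyErr_Fetch(&type, &value, &tb);
        if (type == NULL)
            return;                     /* no pending exception */
        PyErr_NormalizeException(&type, &value, &tb);
        PyErr_Display(type, value, tb);
        Py_XDECREF(type);
        Py_XDECREF(value);
        Py_XDECREF(tb);
    }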
If a module has a future statement enabling nested scopes, they are
also enabled for the exec statement and the functions compile() and
execfile() if they occur in the module.
If Python is run with the -i option, which enters interactive mode
after executing a script, and the script it runs enables nested
scopes, they are also enabled in interactive mode.
XXX The use of -i with -c "from __future__ import nested_scopes" is
not supported. What's the point?
To support these changes, many function variants have been added to
pythonrun.c. All the variant names end with Flags and they take an
extra PyCompilerFlags * argument. It is possible that this complexity
will be eliminated in a future version of the interpreter in which
nested scopes are not optional.
(Also remove warning about module-level global decl, because we can't
distinguish from code passed to exec.)
Define a PyCompilerFlags type that contains a single element,
cf_nested_scopes, which is true if a nested scopes future statement has
been entered at the interactive prompt.
New API functions:
PyNode_CompileFlags()
PyRun_InteractiveOneFlags()
-- same as their non-Flags counterparts except that they take an
optional PyCompilerFlags pointer
compile.c: In jcompile() use PyCompilerFlags argument. If
cf_nested_scopes is true, compile code with nested scopes. If it
is false, but the code has a valid future nested scopes statement,
set it to true.
pythonrun.c: Create a new PyCompilerFlags object in
PyRun_InteractiveLoop() and thread it through to
PyRun_InteractiveOneFlags().
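From the caller's side the Flags convention looks like this sketch (shown
with the later bitmask member cf_flags for brevity; this entry describes
the earlier cf_nested_scopes form):

    #include <Python.h>
    #include <stdio.h>

    /* The caller owns the PyCompilerFlags and passes it to every statement,
       so a future statement typed at one prompt affects later ones. */
    static void
    run_one_statement(FILE *fp)
    {
        PyCompilerFlags cf;

        cf.cf_flags = 0;
        (void) PyRun_InteractiveOneFlags(fp, "<stdin>", &cf);
        /* cf now records any future statement the user entered */
    }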
raised by the compiler.
XXX For now, text entered into the interactive interpreter is not
printed in the traceback.
Inspired by a patch from Roman Sulzhyk
compile.c:
Add helper fetch_program_text() that opens a file and reads until it
finds the specified line number. The code is a near duplicate of
similar code in traceback.c.
Modify com_error() to pass two arguments to SyntaxError constructor,
where the second argument contains the offending text when possible.
Modify set_error_location(), now used only by the symtable pass, to
set the text attribute on existing exceptions.
pythonrun.c:
Change parse_syntax_error() to continue if the offset attribute of a
SyntaxError is None. In this case, it sets offset to -1.
Move code from PyErr_PrintEx() into helper function
print_error_text(). In the helper, only print the caret for a
SyntaxError if offset > 0.
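For reference, a two-argument SyntaxError value of the kind described can
be built like this (hypothetical helper; the second element is the
conventional (filename, lineno, offset, text) tuple):

    #include <Python.h>

    /* Hypothetical helper: message plus location info, as the SyntaxError
       constructor expects. */
    static PyObject *
    make_syntax_error_args(const char *msg, const char *filename,
                           int lineno, int offset, const char *text)
    {
        return Py_BuildValue("(s(siis))", msg, filename, lineno, offset, text);
    }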
Bug was introduced by tricks played to make .pyc files executable
via cmdline arg. Then again, -x worked via a trick to begin with.
If anyone can think of a portable way to test -x, be my guest!
symtable.h, so that they can be used by external modules.
Improve error handling in symtable_enter_scope(), which returns an
error code that went unchecked by most callers. XXX The error handling
in symtable code is sloppy in general.
Modify symtable to record the line number that begins each scope.
This can help to identify which code block is being referred to when
multiple blocks are bound to the same name.
Add st_scopes dict that is used to preserve scope info when
PyNode_CompileSymtable() is called. Otherwise, this information is
tossed as soon as it is no longer needed.
Add Py_SymtableString() to pythonrun; analogous to Py_CompileString().
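A minimal sketch of the new entry point (assuming PySymtable_Free() as the
matching cleanup call):

    #include <Python.h>
    #include "symtable.h"

    /* Build a symbol table for a chunk of source without compiling it. */
    static void
    inspect_source(const char *source)
    {
        struct symtable *st;

        st = Py_SymtableString(source, "<string>", Py_file_input);
        if (st == NULL)
            return;                 /* error already set */
        /* ... examine the recorded scope information here ... */
        PySymtable_Free(st);
    }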
They're named as if public, so I did a Bad Thing by changing
PyMarshal_ReadObjectFromFile() to suck up the remainder of the file in one
gulp: anyone who counted on that leaving the file pointer merely at the
end of the next object would be screwed. So restored
PyMarshal_ReadObjectFromFile() to its earlier state, renamed the new greedy
code to PyMarshal_ReadLastObjectFromFile(), and changed Python internals to
call the latter instead.
pythonrun.c: In Py_Finalize, don't reset the initialized flag until after
the exit funcs have run.
atexit.py: in _run_exitfuncs, mutate the list of pending calls in a
threadsafe way. This wasn't a contributor to bug 128475, it just burned
my eyeballs when looking at that bug.
PyRun_FileEx(). These are the same as their non-Ex counterparts but
have an extra argument, a flag telling them to close the file when
done.
Then this is used by Py_Main() and execfile() to close the file after
it is parsed but before it is executed.
Adding APIs seems strange given the feature freeze but it's the only
way I see to close the bug report without incompatible changes.
[ Bug #110616 ] source file stays open after parsing is done (PR#209)
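An embedding-side sketch of the Ex convention (using PyRun_SimpleFileEx(),
the simple-file variant; the trailing argument is the close-the-file flag):

    #include <Python.h>
    #include <stdio.h>

    /* Run a script and let the interpreter fclose() the file once parsing
       is done, rather than holding it open for the whole run. */
    static int
    run_script(const char *path)
    {
        FILE *fp = fopen(path, "r");

        if (fp == NULL)
            return -1;
        return PyRun_SimpleFileEx(fp, path, 1 /* closeit */);
    }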
filename and lineno attributes, but do not mask the SyntaxError if we
fail.
This is part of what is needed to close SourceForge bug #110628
(Jitterbug PR#278).
comments, docstrings or error messages. I fixed two minor things in
test_winreg.py ("didn't" -> "Didn't" and "Didnt" -> "Didn't").
There is a minor style issue involved: Guido seems to have preferred English
spelling (behaviour, honour) in a couple of places. This patch changes that to
American spelling, which is the more prominent style in the source. I prefer English
myself, so if English is preferred, I'd be happy to supply a patch myself ;)
used for indentation-related errors. This patch includes Ping's
improvements for indentation-related error messages.
Closes SourceForge patches #100734 and #100856.
the number of children of a node exceeds the max possible value for
the short that is used to count them. The Python runtime converts
this parser error into the SyntaxError "expression too long."
need two phase init or fini of the builtin module. Change the call of
_PyBuiltin_Init_1() to _PyBuiltin_Init(). Add a call to
init_exceptions().
Py_Finalize(): Don't call _PyBuiltin_Fini_1(). Instead call
fini_exceptions() but move this to before the thread state is
cleared.
For more comments, read the patches@python.org archives.
For documentation read the comments in mymalloc.h and objimpl.h.
(This is not exactly what Vladimir posted to the patches list; I've
made a few changes, and Vladimir sent me a fix in private email for a
problem that only occurs in debug mode. I'm also holding back on his
change to main.c, which seems unnecessary to me.)
remaining object references if the environment variable PYTHONDUMPREFS
exists. The default behaviour caused problems for background or
otherwise invisible processes that use the debug build of Python.
The temporary variable v was never decref'd. Test this by starting up the
interpreter, hitting C-c, then immediately exiting.
Same potential leak can occur if error is E_NOMEM, since the return is
done in the case block. Added Py_XDECREF(v); to both blocks, just
before the return.
think we have our own DOS box (i.e. we're not started from a command
line shell), we print a message and wait for the user to hit a key
before the DOS box is closed.
The hacky heuristic for determining whether we have our *own* DOS box
(due to Mark Hammond) is to test whether we're on line zero...
- Add Py_FrozenFlag, intended to suppress error messages from
getpath.c in frozen binaries.
- Add Py_GetPythonHome() and Py_SetPythonHome(), intended to allow
embedders to force a different PYTHONHOME.
- Add new interface PyErr_PrintEx(flag); same as PyErr_Print() but
flag determines whether sys.last_* are set or not. PyErr_Print()
now simply calls PyErr_PrintEx(1).
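A small embedding sketch combining the last two items (2.x-era char*
signatures; /opt/mypython is a made-up path):

    #include <Python.h>

    int main(void)
    {
        PyObject *m;

        Py_SetPythonHome("/opt/mypython");   /* overrides PYTHONHOME for this embedder */
        Py_Initialize();
        m = PyImport_ImportModule("nosuchmodule");
        if (m == NULL)
            PyErr_PrintEx(0);                /* print the traceback, don't set sys.last_* */
        else
            Py_DECREF(m);
        Py_Finalize();
        return 0;
    }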
1) The __builtins__ variable in the __main__ module is set to the
__builtin__ module instead of its __dict__.
2) Get rid of the SIGHUP and SIGTERM handlers. They can't be made to
work reliably when threads may be in use, they are Unix specific, and
Python programmers can now program this functionality in a safer way
using the signal module.
Setting interp->builtins to the __builtin__ module instead of to its
dictionary had the unfortunate side effect of always running in
restricted execution mode :-(
I will check in a different way of setting __main__.__builtins__ to
the __builtin__ module later.
Also, there was a typo -- a comment was unfinished, and as a result
some finalizations were not being executed.
In Bart Simpson style,
I Will Not Check In Untested Changes.
I Will Not Check In Untested Changes.
I Will Not Check In Untested Changes.
I Will Not Check In Untested Changes.
I Will Not Check In Untested Changes.
I Will Not Check In Untested Changes.
I Will Not Check In Untested Changes.
I Will Not Check In Untested Changes.
I Will Not Check In Untested Changes.
I Will Not Check In Untested Changes.
- The interp->builtins variable (and hence, __main__.__builtins__) is
once again initialized to the built-in *module* instead of its
dictionary.
- The finalization order is once again changed. Signals are finalized
relatively early, because (1) it DECREF's the signal handlers, and if
a signal handler happens to be a bound method, deleting it could cause
problems when there's no current thread around, and (2) we don't want
to risk executing signal handlers during finalization.
- Changed semantics for the initialized flag (again); forget the ref
counting, forget the fatal errors -- redundant calls to
Py_Initialize() or Py_Finalize() are simply ignored.
- Automatically import site.py on initialization, unless a flag is set
not to do this by main().
the -X command line option.
Py_Initialize(): Handle the two phase initialization of the built-in
module.
Py_Finalize(): Handle the two phase finalization of the built-in
module.
parse_syntax_error(): New function which parses syntax errors that
PyErr_Print() will catch. This correctly parses such errors
regardless of whether PyExc_SyntaxError is an old-style string
exception or new-fangled class exception.
PyErr_Print(): Many changes:
1. Normalize the exception.
2. Handle SystemExit exceptions which might be class based. Digs
the exit code out of the "code" attribute. String based
SystemExit is handled the same as before.
3. Handle SyntaxError exceptions which might be class based. Digs
the various information bits out of the instance's attributes
(see parse_syntax_error() for details). String based
SyntaxError still works too.
4. Don't write the `:' after the exception if the exception is
class based and has an empty string str() value.
for more!).
- The global flags that can be set from environment variables are now
set in Py_Initialize (except the silly Py_SuppressPrint, which no
longer exists). This saves duplicate code in frozenmain.c and main.c.
- Py_GetProgramName() is now here; added Py_SetProgramName(). An
embedding program should no longer provide Py_GetProgramName(),
instead it should call Py_SetProgramName() *before* calling
Py_Initialize().
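The corresponding embedding pattern, as a 2.x-era sketch:

    #include <Python.h>

    int main(int argc, char **argv)
    {
        (void)argc;
        Py_SetProgramName(argv[0]);          /* must precede Py_Initialize() */
        Py_Initialize();
        PyRun_SimpleString("import sys; print sys.prefix");
        Py_Finalize();
        return 0;
    }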
Py_FdIsInteractive(). The flag is supposed to be set by the -i
command line option. The function is supposed to be called instead of
isatty(). This is used for Lee Busby's wish #1, to have an option
that pretends stdin is interactive even when it really isn't.
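Callers are then expected to do something like the following sketch instead
of calling isatty() directly:

    #include <Python.h>
    #include <stdio.h>

    /* True if fp is a tty, or if the interactive flag is set and the name
       says "this is stdin" -- which is what the -i pretense relies on. */
    static int
    stdin_is_interactive(void)
    {
        return Py_FdIsInteractive(stdin, "<stdin>");
    }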