2-digit years are now converted using rules that are (according to
Fredrik Lundh) recommended by POSIX or X/Open: 0-68 mean 2000-2068,
69-99 mean 1969-1999.
2-digit years are now only accepted if time.accept2dyear is set to a
nonzero integer; if it is zero or not an integer or absent, only year
values >= 1900 are accepted. Year values 100-1899 and negative year
values are never accepted.
The initial value of time.accept2dyear depends on the environment
variable PYTHONY2K: if PYTHONY2K is set and non-empty,
time.accept2dyear is initialized to 0; if PYTHONY2K is empty or not
set, time.accept2dyear is initialized to 1.
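For illustration, here is the conversion rule in Python (a sketch only;
the real implementation is C code in the time module):

    def convert_year(year, accept2dyear=1):
        # Sketch of the rule above, not the actual C code.
        if 0 <= year <= 99:
            if not accept2dyear:
                raise ValueError("two-digit years not accepted")
            if year <= 68:          # 0-68  -> 2000-2068
                return year + 2000
            return year + 1900      # 69-99 -> 1969-1999
        if year >= 1900:
            return year
        raise ValueError("year out of range")   # 100-1899 and negatives rejected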
I had to make a slight diddle to work with Python 1.4, which
we and some of our customers are still using. :(
I've also made a few minor enhancements:
- You can now both get and set the memo using a 'memo'
attribute. This is handy for certain advanced applications
that we have.
- Added a 'binary' attribute to get and set the binary
mode for a pickler.
- Added a somewhat experimental 'fast' attribute. When this
is set, objects are not placed in the memo during pickling.
This should lead to faster pickling and smaller pickles in
cases where:
o you *know* there are no circular references, and
o either:
    - you've preloaded the memo with class information
      by pickling classes in non-fast mode or by
      manipulating the memo directly, or
    - you aren't pickling instances.
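A usage sketch of the new attributes (Python 2 era cPickle, with the
attribute names as described above):

    import cPickle
    from cStringIO import StringIO

    out = StringIO()
    p = cPickle.Pickler(out, 1)     # binary mode (also exposed via the
                                    # new 'binary' attribute)
    p.fast = 1                      # experimental: don't record objects in
                                    # the memo; safe only without cycles
    p.dump(["no", "cycles", "here"])
    memo_snapshot = p.memo          # the memo can now be read (and assigned)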
1. Only DECREF the class's module when the module is retrieved via
PyImport_Import. If it is retrieved from the modules dictionary with
PyDict_GetItem, it is using a borrowed reference.
2. If the module doesn't define the desired class, raise the same
SystemError that pickle.py does instead of returning an AttributeError
(which is cryptic at best).
Also, fix the PyArg_ParseTuple() call in cpm_loads() (the externally
visible loads()): use "S" instead of "O" because cStringIO will croak
with a "bad arguments to internal function" if passed anything other
than a string.
gethostbyaddr(). (Plain gethostbyname() returns only the IP address.)
This moves the code shared by gethostbyaddr() and gethostbyname_ex()
to a subroutine.
Original patch by Dan Stromberg; some tweaks by GvR.
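For example (return shapes as in today's socket module):

    import socket

    # gethostbyname_ex() gives the same (hostname, aliaslist, ipaddrlist)
    # triple that gethostbyaddr() returns for an address:
    name, aliases, addrs = socket.gethostbyname_ex("localhost")

    # whereas plain gethostbyname() returns just one IP address string:
    addr = socket.gethostbyname("localhost")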
exceptions:
posix_error_with_filename(): New function which calls
PyErr_SetFromErrnoWithFilename()
The following methods have been changed to call
posix_error_with_filename():
posix_1str()
posix_strint()
posix_strintint()
posix_do_stat()
posix_mkdir()
posix_utime()
posix_readlink()
posix_open()
INITFUNC(): os.error (nee PosixError) is PyExc_OSError
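The user-visible effect is that the exception raised by these calls now
carries the offending filename; a small sketch, in modern syntax:

    import os

    try:
        os.rmdir("/no/such/directory")
    except os.error as exc:        # os.error is the same object as OSError
        print(exc.filename)        # now set to "/no/such/directory"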
low-level Python exit handler. This can attempt to call Python code
at a point where the interpreter and thread state have already been
destroyed, causing a Bus Error. Given the intended use of
Py_AtExit(), I'm not convinced that it's a good idea to call it
earlier during Python's finalization sequence... (Although this is
the only use for it in the entire distribution.)
PythonCmd_Error() but failed to return. The error wasn't very likely
(only when we run out of memory) but since the check is there we might
as well return the error. (I think that Barry introduced this buglet
when he added error checks everywhere.)
# from my PC at home, but it can't send email :-(
Add a clarifying comment about the new ENTER_OVERLAP and
LEAVE_OVERLAP_TCL macros; get rid of all the bogus tests for deleted
interpreters (Tcl already tests for this; they were left over from an
earlier misguided attempt to fix the threading).
There were some serious problems with the thread-safety code.
The basic problem was that often the result was gotten out of
the Tcl interpreter object after releasing the Tcl lock.
Of course, another thread might have changed the return value
already, and this was indeed happening. (Amazing what trying
it on a different thread implementation does!)
The solution is to grab the Python lock without releasing the
Tcl lock, so it's safe to create a string object or set the
exceptions from the Tcl interpreter. Once that's done, the
Tcl lock is released.
Note that it's now legal to acquire the Python lock while the
Tcl lock is held; but the reverse is not true: the Python
lock must be released before the Tcl lock is acquired. This
is in order to avoid deadlocks. Fortunately, there don't seem to
be any problems with this.
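To illustrate the ordering rule, here is an analogy using Python-level
locks (the real locks are C-level and managed by the ENTER_*/LEAVE_*
macros; this is not the _tkinter implementation):

    import threading

    tcl_lock = threading.Lock()
    python_lock = threading.Lock()

    def fetch_result_safely():
        with tcl_lock:              # Tcl lock held while the result is read
            with python_lock:       # OK: Python lock acquired second
                result = "..."      # build the Python object / set exceptions
        return result

    # The reverse nesting (python_lock outer, tcl_lock inner) is forbidden;
    # mixing both orders across threads is what leads to deadlock.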
(The "sort of" is because it uses kbhit() to detect that the user
starts typing, and then no events are processed until they hit
return.)
Also fixed a nasty locking bug: EventHook() is called without the Tcl
lock set, so it can't use the ENTER_PYTHON and LEAVE_PYTHON macros,
which manipulate both the Python and the Tcl lock. I now only acquire
and release the Python lock.
(Haven't tested this on Unix yet...)
Tkinter. This adds a separate lock -- read the comments. (This was
also needed for Mark Hammond's attempts to make PythonWin
Tkinter-friendly.)
The changes have affected the EventHook slightly, too; and I've done
some more cleanup of the code that deals with the different versions
of Tcl_CreateFileHandler().
so that our #ifdef test has the wrong effect. Substitute hardcoded
values for some important symbols (but not for the whole range -- some
are pretty obscure so it's not worth it).
registers an input file handler for stdin with Tcl and handles Tcl
events until something is available on stdin; it then deletes the
handler and returns from EventHook().
This works with or without GNU readline, and doesn't busy-wait.
It still doesn't work for Mac or Windows :-(
Rationalized the doc strings.
Also simplified the module initialization -- we don't need a __version__
which is set to "$Rev" :-) and we don't need a fatal error when the
initialization fails.
- When facility not specified to syslog() method, use default from openlog()
(This is how it was claimed to work in the documentation)
- Potential resource leak of o_ident, now cleaned up in closelog()
- Minor comment accuracy fix.
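For example (standard syslog module calls; the facility value is just an
illustration):

    import syslog

    syslog.openlog("myapp", 0, syslog.LOG_LOCAL0)   # sets the default facility
    syslog.syslog("starting up")                    # no facility given: uses LOG_LOCAL0
    syslog.syslog(syslog.LOG_ERR, "disk full")      # explicit priority still works
    syslog.closelog()                               # now also frees the saved ident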
most common interface to Tcl, the call() method, by maybe 20-25%.
The speedup code avoids the construction of a Tcl command string from
the argument list -- the Tcl argument list is immediately parsed back
by Tcl_Eval() into a list that is *guaranteed* (by Tcl_Merge()) to be
exactly the same list, so instead we look up the command info and call
the command function directly. If the lookup fails, we fall back to
the old method (Tcl_Merge() + Tcl_Eval()) so we don't need to worry
about special cases like undefined commands or the occasional command
("after") that sets the info.proc pointer to NULL -- let TclEval()
deal with these.
the address of the Tcl interpreter object, as an integer. Not very
useful for the Python programmer, but this can be called by another C
extension that needs to make calls into the Tcl/Tk C API and needs to
get the address of the Tcl interpreter object. A simple cast of the
return value to (Tcl_Interp *) will do the trick now.
type for all functions. However, many functions call PyArg_Parse() and
need a 0. This is so that when they didn't change anything, they can
do Py_INCREF(args); return args. Reverted this back. For atof(),
there's no reason not to use PyArg_ParseTuple(), so I changed the code
(atoi and atol already used that).
it seems harmless for other platforms. It plays tricks with the name
of the library used to link with. Apparently DG/UX really wants a
shared library to link with if it wants shared modules to use symbols
from the library. I'm not sure why this wasn't an issue with 1.4;
DG/UX seems to be the only platform where moving to a single library
made things harder!
BTW This adds a target to create libpython$(VERSION).so; however this
target is *only* for DG/UX.
- Loading non-binary string pickles checks for insecure
strings. This is needed because cPickle (still)
uses a restricted eval to parse non-binary string pickles.
This change is needed to prevent untrusted
pickles like::
"S'hello world'*2000000\012p0\012."
from hosing an application (see the example after this list).
- User-defined types can now support unpickling without
executing a constructor.
The second value returned from __reduce__ can now be None
rather than an argument tuple. On unpickling, if the second
value returned from __reduce__ during pickling was None, then
instead of calling the first returned value directly, its
__basicnew__ method is called without arguments.
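The insecure-string check referred to above, seen from Python (Python 2
cPickle; the exact exception type and message may differ):

    import cPickle

    evil = "S'hello world'*2000000\np0\n."
    try:
        cPickle.loads(evil)
    except (ValueError, cPickle.UnpicklingError) as exc:
        print(exc)           # rejected instead of evaluating the expression

And a rough sketch of the new __reduce__ dispatch rule (illustrative
only; this is not the actual cPickle code):

    def reconstruct(callable_, arg_tuple):
        # arg_tuple is the second value __reduce__ returned at pickling time
        if arg_tuple is None:
            return callable_.__basicnew__()   # skip the constructor entirely
        return callable_(*arg_tuple)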
- New option -x, to skip first line of script
- Use the correct platform-specific delimiter and library location in
the usage message
(Also removed two blank lines and moved one line around so that each
part of the usage message is again under 512 bytes and the whole usage
message still fits in 23 lines.)
the default build on Linux (because it requires -lcrypt which isn't
available everywhere).
Some improvements to the _tkinter build line suggested by Case Roole.
maxsplit which is implemented in string.py but wasn't here. The
reference manual doesn't define what happens when maxsplit is negative
or larger than the number of occurrences; in either case I implemented
it so that all occurrences are replaced. The default value is zero,
which also replaces all occurrences.
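A pure-Python sketch of those semantics (not the C implementation):

    def replace(s, old, new, maxsplit=0):
        # maxsplit <= 0 (the default), or any value larger than the number
        # of occurrences, means "replace every occurrence".
        if maxsplit <= 0:
            maxsplit = len(s) + 1        # effectively unlimited
        return new.join(s.split(old, maxsplit))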
signal handlers in a fork()ed child process when Python is compiled with
thread support. The bug was reported by Scott <scott@chronis.icgroup.com>.
What happens is that after a fork(), the variables used by the signal
module to determine whether this is the main thread or not are bogus,
and it decides that no thread is the main thread, so no signals will
be delivered.
The solution is the addition of PyOS_AfterFork(), which fixes the signal
module's variables. A dummy version of the function is present in the
intrcheck.c source file which is linked when the signal module is not
used.
to inside floatsleep(). This is necessary because floatsleep() does
the error handling and it must have grabbed the interpreter lock and
thread state before it can do so.
save and restore the tstate, but explicitly calling
PyEval_SaveThread() does reset it! While I think about how to fix
this for real, here's a fix that avoids getting a fatal error.
(1) Use PyErr_NewException("module.class", NULL, NULL) to create the
exception object.
(2) Remove all calls to Py_FatalError(); instead, return or
ignore the errors -- the import code now checks PyErr_Occurred()
after calling a module's init function, so it's no longer a
fatal error for the initialization to fail.
Also did some small cleanups, e.g. removed unnecessary test for
"already initialized" from initfpectl(), and unified
initposix()/initnt().
I haven't checked this very thoroughly; the changes are pretty
trivial, but beware of untested code!
This one works! However it requires using a modified version of
tclNotify.c (provided), which requires access to the Tcl source
to compile it. In order to enable this hack, add the following
to the Setup line for _tkinter:
tclNotify.c -DHAVE_PYTCL_WAITUNTILEVENT -I$(TCL)/generic
where TCL points to the source tree of Tcl 8.0. Other versions
of Tcl are not supported.
The tclNotify.c file is copyrighted by Sun Microsystems; the
licensing terms are in the file license.terms. According to this
file, no further permission to distribute this is required,
provided the file license.terms is included. Hence, I am checking
that in, too.
maps errno names (e.g. EINTR) to the corresponding errno numbers, and
errorcode maps the numbers back to the names. (The latter is somewhat
redundant because the new call posix.strerror() can also be used to
describe an error number, but alla...)
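For example (as the errno module and strerror() behave today; strerror()
is exposed as os.strerror):

    import errno, os

    errno.EINTR                        # e.g. 4 -- name to number
    errno.errorcode[errno.EINTR]       # 'EINTR' -- number back to name
    os.strerror(errno.EINTR)           # e.g. 'Interrupted system call'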
set_completer(function)
parse_and_bind(string)
read_init_file(filename)
The first is the most exciting feature: with an appropriate Python
completer function, it can do dynamic completion based on the contents
of your namespace!
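A minimal completer sketch (the word list is made up; the (text, state)
protocol is the real one used by set_completer()):

    import readline

    WORDS = ["spam", "spamalot", "eggs"]

    def complete(text, state):
        # readline calls this with state = 0, 1, 2, ... and stops at None;
        # return the state-th candidate that starts with `text`.
        matches = [w for w in WORDS if w.startswith(text)]
        return matches[state] if state < len(matches) else None

    readline.set_completer(complete)
    readline.parse_and_bind("tab: complete")
    # readline.read_init_file("/path/to/inputrc")  # or load bindings from a file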
Added 'p' format character for Pascal string (i.e. leading length
byte). This uses the count prefix like 's' does, except that the
count includes the length byte; i.e. '10p' takes 10 bytes packed but
has space for a length byte and 9 data bytes.
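For example:

    import struct

    struct.calcsize("10p")              # 10: one length byte + 9 data bytes
    struct.pack("10p", b"hi")           # b'\x02hi' padded with null bytes to 10
    struct.pack("10p", b"hello world")  # data truncated to 9 bytes: b'\thello wor'
    struct.unpack("10p", struct.pack("10p", b"hello world"))   # (b'hello wor',)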
1. Fix bug in (de)compression objects. The final string resize used
zst.total_out to determine the length of the string, but the
(de)compression object outputs data a little bit at a time, which
means total_out is not the size of this call's output. Fix: save the
original value of total_out at the start of the call. (See the sketch
after this list.)
2. Be sure to Py_DECREF the result value if you exit with an
exception.
3. Use PyInt_FromLong instead of Py_BuildValue
4. include more constants from the zlib header file
5. Use PyErr_Format instead of using a local buffer and sprintf.
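The sketch referred to in item 1: incremental use of the (de)compression
objects, whose output arrives piecemeal (standard zlib module calls):

    import zlib

    data = b"spam " * 10000
    co = zlib.compressobj()
    compressed = co.compress(data) + co.flush()     # output arrives in pieces

    do = zlib.decompressobj()
    restored = do.decompress(compressed) + do.flush()
    assert restored == data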
dealloc() functions contained code to free/DECREF the buffer
(there were differences between I and O objects but the logic bug was
the same). Fixed this by setting the buffer pointer to NULL and
testing for that. (This also makes it safe to call close() more than
once.)
XXX Worry: what if you try to read() or write() once the thing is
closed?
the executable must have that suffix. Note that there is no
corresponding support in the top-level Makefile because I'm not sure
that the install targets there make sense under these circumstances.
getpagesize() -- #ifdef doesn't work, Linux has conflicting decls in
its headers. Choice: only declare the return type, not the argument
prototype, and not on Linux.
-- initialize length to DEFAULTALLOC and not 0
-- resize string before returning (to remove '\000' padding)
Also converted some compression routines to use PyString instead of
buffer.
Change default alloc size for uncompressing to 16K.
Remove comment about core dumps when an invalid window size is used.
This bug has been fixed in zlib 1.0.4.
Two new optional arguments to decompress, wbits and bufsize. wbits
specifies the window size and bufsize specifies the initial output
string size.
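For example (the argument values are just illustrative):

    import zlib

    compressed = zlib.compress(b"x" * 100000)
    # wbits is the window size, bufsize the initial size of the output string:
    zlib.decompress(compressed, 15, 16384)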
In decompression code -- decompress and decompressobj methods -- use a
Python string (and _PyString_Resize) to collect the uncompressed
stream. Replaces a separate buffer that was copied into a string.
Fix bug in decompress that caused it to always realloc the buffer when
it was finished decompressing.
Modernized handling of optional arguments to compressobj.
Updated doc strings.
Removed handling of \e, \cX escapes, following a string-SIG discussion.
Fixed minor typos in re.py
re.error is now set equal to reop.error.
Moved the definition of constants like NORMAL and CHARCLASS into reop, which
exports them; re.py was changed to import them from reop.
Added C equivalents of _expand and expand_escape to reop, and changed
re.py to use them.