This patch allows building the Python 'mpzmodule' under SuSE Linux
without having to install the source package of the GMP library.
The gmp-mparam.h seems to be an internal header file. The patch
shouldn't hurt any other platforms.
The same problem (mixed mallocs) exists for the pcre stack.
The buffers md->... are allocated via PyMem_RESIZE in grow_stack(),
while in free_stack() they are released with free() instead of
PyMem_DEL().
The buffers self->regex and self->regex_extra are allocated in
pcre_compile() and pcre_study() via pcre_malloc, but are released
via free() instead of pcre_free.
(e.g. used for ZIP files).
The patch includes code that says:
+ Copyright (C) 1986 Gary S. Brown. You may use this program, or
+ code or tables extracted from it, as desired without restriction.
My interpretation (and Jim's) is that Gary S Brown has no claims under
copyright, patent or other rights or interests. Lawyers might disagree.
only. Through some mysterious interaction, they would take 9 separate
arguments as well. This misfeature is now disabled (to end a
difference with JPython).
building the dicts used to inform the user about the defined
constants when using the *conf*() APIs.
Thanks to Mark Hammond <mhammond@skippinet.com.au>.
strings to integers for the *conf*() functions.
Added code to sort the tables at module initialization. Three
dictionaries, confstr_names, sysconf_names, and pathconf_names, are
added to the module as well. These map known configuration setting
names to the numeric value which is used to represent the setting in
the system call. This code is always called.
Updated related comments.
pathconf() names, from Sjoerd.
Added code to verify that these tables are properly ordered, only
included and used when CHECK_CONFNAME_TABLES is defined. This is only
needed to test the tables, so I haven't enabled this by default.
available since the interface is poorly defined on at least one major
platform (Solaris).
Moved table of constant names for fpathconf() & pathconf() into the
conditional that defines the conv_path_confname() helper; Mark Hammond
reported that defining the table when none of the constants were
defined causes the compiler to complain (won't allow 0-length array,
imagine that!).
In posix_fpathconf(), use conv_path_confname() as the O& conversion
function, instead of the conv_confname() helper, which has the wrong
signature (posix_pathconf() already used the right thing).
and TMP_MAX.
Converted all functions that used PyArg_Parse() or PyArg_NoArgs() to
use PyArg_ParseTuple() and specified all function names using the
:name syntax in the format strings, to allow better error messages
when TypeError is raised for parameter type mismatches.
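The user-visible effect is that argument errors now name the function involved;
a quick check (the exact wording varies across versions):

    import os
    try:
        os.strerror()           # wrong number of arguments
    except TypeError, exc:
        print exc               # e.g. "strerror() takes exactly 1 argument (0 given)"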
Brian E Gallew, which were improved and adapted to OpenSSL 0.9.4 by
Laszlo Kovacs of HP. Both have kindly given permission to include
the patches in the Python distribution. Final formatting by GvR.
new:
readline.get_begidx() -> int
gets the beginning index in the command line string
delimiting the tab-completion scope. This would
probably be used from within a tab-completion
handler
readline.get_endidx() -> int
gets the ending index in the command line string
delimiting the tab-completion scope. This would
probably be used from within a tab-completion
handler
readline.set_completer_delims(string) -> None
sets the delimiters used by readline as word breakpoints
for tab-completion
readline.get_completer_delims() -> string
gets the delimiters used by readline as word breakpoints
for tab-completion
fixed:
readline.get_line_buffer() -> string
doesn't cause a debug message every other call
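A rough sketch of how these calls might fit together in a completion handler
(the word list and delimiter set here are made up for illustration):

    import readline

    readline.set_completer_delims(" \t\n\"'")   # word breakpoints for completion
    WORDS = ["spam", "spamalot", "eggs"]        # made-up completion vocabulary

    def complete(text, state):
        # The completion scope is line_buffer[get_begidx():get_endidx()],
        # which normally equals the 'text' argument readline passes in.
        begin = readline.get_begidx()
        end = readline.get_endidx()
        scope = readline.get_line_buffer()[begin:end]
        matches = []
        for word in WORDS:
            if word[:len(scope)] == scope:
                matches.append(word)
        if state < len(matches):
            return matches[state]
        return None

    readline.set_completer(complete)
    readline.parse_and_bind("tab: complete")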
simply moves the call to Tk_MainWindow() after the Tcl/Tk
initialization calls. The patch is unconditional; it works with
earlier and later versions as well.
reformatted.)
- Illegal padding is now ignored. (Recommendation by GvR.)
- Padding no longer removes characters from data string (resulting in
lost data/strings with negative lengths).
- Illegal characters outside the ASCII range are now ignored, instead
of possibly being remapped to a valid character.
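A small check of the more forgiving decoding described above (a sketch; handling
of odd input is best-effort by design):

    import binascii
    # Non-alphabet bytes such as whitespace or characters above the ASCII
    # range are skipped rather than being remapped or corrupting the result.
    print repr(binascii.a2b_base64("YWJj\n"))      # 'abc'
    print repr(binascii.a2b_base64("YW\xffJj"))    # also 'abc'; '\xff' is ignored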
the right variant of gethostbyname_r for us, since not all Linuxes are
equal in this respect. Reported by Laurent Pointal.
(2) On BeOS, Chris Herborth reports that instead of arpa/inet.h you
must include net/netdb.h to get the inet_ntoa() and inet_addr()
prototypes.
names match the documentation.
Removed broken code that supports the __methods__ attribute on ast
objects; the right magic was added to Py_FindMethod() since this was
originally written. <ast-object>.__methods__ now works, so dir() and
rlcompleter are happy.
Treat them as read-only, and make a copy as appropriate. This was
first reported by Bill Janssen and later by Craig Rowland and Ron
Sedlmeyer. This fix is mine.
"""
It fixes a memory corruption error resulting from BadPickleGet
exceptions in load_get, load_binget and load_long_binget. This was
initially reported on c.l.py as a problem with Cookie.py; see the thread
titled "python core dump (SIGBUS) on Solaris" for more details.
If the PyDict_GetItem(self->memo, py_key) call failed, then py_key was
being Py_DECREF'd out of existence before the call was made to
PyErr_SetObject(BadPickleGet, py_key).
The bug can be duplicated as follows:
import cPickle
cPickle.loads('garyp')
This raises a BadPickleGet exception whose value is a freed object. A
core dump will soon follow.
"""
Jim Fulton approves of the patch.
different values in the environ dict with the same key (although he
couldn't explain exactly how this came to be). Since getenv() uses
the first one, Python should do the same. (Some doubts about case
sensitivity, but for now this at least seems the right thing to do
regardless of platform.)
- Don't call Py_FatalError() when initialization fails.
- Fix bogus use of return value from PyRun_String().
- Fix misc. compiler errors on some platforms.
I've updated cPickle.c to use class exceptions:
Changed pickle error types to classes:
PickleError
PicklingError
UnpickleableError
UnpicklingError
And changed the handling of unpickleable objects so that an UnpickleableError
is raised with the unpickleable object as the argument. UnpickleableError
has a reasonable string representation and provides access to the problem
object, which is useful during debugging.
[I'm still waiting for patches to do the same to pickle.py.]
The test really wanted to distinguish between the two. So now we test
for __GLIBC__ instead. I have confirmed that this works for glibc and
I have an email from Christian Tanzer confirming that it works for
libc5, so it should be fine.
I have attached a new cPickle that adds a new control attribute
to unpicklers:
Added new Unpickler attribute, find_global. If set to None, then
global and instance pickles are disabled. Otherwise, it should be set to
a callable object that takes two arguments, a module name and an
object name, and returns an object. If the attribute is unset, then
the default mechanism is used.
This feature provides an additional mechanism for controlling which
classes can be used for unpickling.
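A sketch of how find_global might be used to restrict unpickling (the module
and class names are made up, and pickled_data stands for some pickle string
obtained elsewhere):

    import cPickle
    from StringIO import StringIO

    def restricted_find_global(module_name, class_name):
        # Only allow one known class; reject everything else.
        if (module_name, class_name) != ("safemodule", "SafeClass"):
            raise cPickle.UnpicklingError("global %s.%s is forbidden"
                                          % (module_name, class_name))
        mod = __import__(module_name)
        return getattr(mod, class_name)

    u = cPickle.Unpickler(StringIO(pickled_data))
    u.find_global = restricted_find_global   # or None to disable globals entirely
    obj = u.load()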
Without this, if inflate() returned Z_BUF_ERROR asking for more output
space, we would report the error; now, we increase the buffer size and
try again, just as for Z_OK.
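For reference, a round-trip that exercises this path (highly compressible
input, so the decompressed output far exceeds the initial buffer allocation):

    import zlib
    original = "spam and eggs " * 50000
    d = zlib.decompressobj()
    restored = d.decompress(zlib.compress(original)) + d.flush()
    assert restored == original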
"""
The GNU folks, in their infinite wisdom, have decided not to implement
altzone in libc6; this would not be horrible, except that timezone
(which is implemented) includes the current DST setting (i.e. timezone
for Central is 18000 in summer and 21600 in winter). So Python's
timezone and altzone variables aren't set correctly during DST.
Here's a patch relative to 1.5.2b2 that (a) makes timezone and altzone
show the "right" thing on Linux (by using the tm_gmtoff stuff
available in BSD, which is how the GLIBC manual claims things should
be done) and (b) should cope with the southern hemisphere. In pursuit
of (b), I also took the liberty of renaming the "summer" and "winter"
variables to "july" and "jan". This patch should also make certain
time calculations on Linux actually work right (like the tz-aware
functions in the rfc822 module).
(It's hard to find DST that's currently being used in the southern
hemisphere; I tested using Africa/Windhoek.)
"""
is not an empty string, this means that you have arrived at the
end of the stream of compressed data, and the contents of .unused_data are
whatever follows the compressed stream.
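A short demonstration of .unused_data (the trailing bytes here are arbitrary):

    import zlib
    stream = zlib.compress("hello world") + "TRAILING GARBAGE"
    d = zlib.decompressobj()
    print repr(d.decompress(stream))    # 'hello world'
    print repr(d.unused_data)           # 'TRAILING GARBAGE'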
data struct before calling gethostby{name,addr}_r(); (2) ignore the
3/5/6 args determinations made by the configure script and switch on
platform identifiers instead:
AIX, OSF have 3 args
Sun, SGI have 5 args
Linux has 6 args
On all other platforms, undef HAVE_GETHOSTBYNAME_R altogether.
- Use HAVE_GETHOSTBYNAME_R_6_ARG instead of testing for Linux and
glibc2.
- If gethostbyname takes 3 args, undefine HAVE_GETHOSTBYNAME_R --
don't know what code should be used.
- New symbol USE_GETHOSTBYNAME_LOCK defined iff the lock should be used.
- Modify the gethostbyaddr() code to also hold on to the lock until
after it is safe to release, overlapping with the Python lock.
(Note: I think that it could in theory be possible that Python code
executed while gethostbyname_lock is held could attempt to reacquire
the lock -- e.g. in a signal handler or destructor. I will simply say
"don't do that then.")
Here's a patch to fix the race condition, which wasn't fixed by Rob's
patch. It holds the gethostbyname lock until the results are copied out,
which means that this lock and the Python global lock are held at the same
time. This shouldn't be a problem as long as the gethostbyname lock is
always acquired when the global lock is not held.
He writes:
I had an off-by-1000 error in floatsleep(),
and the problem with time.clock() is that it's not implemented properly
on QNX... ANSI says it's supposed to return _CPU_ time used by the
process, but on QNX it returns the amount of real time used... so I was
confused.
guessing what happened when strftime() returns 0. Is it buffer
overflow or was the result simply 0 bytes long? (This happens for an
empty format string, or when the format string is a single %Z and the
timezone is unknown.) If the buffer is at least 256 times as long as
the format, assume the latter.
converted was a "digit" -- use isalnum(). This test is there only to
guard against "+" or "-" being interpreted as a valid int literal.
Reported by Takahiro Nakayama.
up the _tkinter main loop. Not clear why; the _kbhit() call _tkinter
makes probably confuses the stdio library when buffering isn't set to
whatever it is by default.
f_fsid field, since it's not a scalar on all systems supporting this
call (in particular, it's a tuple of two longs on AIX). Since it's
not particularly useful, just nuke it. Adapted the doc strings too.
was appended to a list. Lists are reference count neutral, so the
string must be DECREF'd. Also added some checks for the return value
of PyList_Append().
Note: there are still some memory problems reported by Purify (I get
two Array Bounds Reads still and an Uninitialized Memory Read). Also,
in scanning the code, there appear to be some potential problems
where return values aren't checked. Too much to attack now, though.
Purify) being caused by a bug in the readline library. Nothing we can
do about it.
Cause: readline_initialize_everything() throws away the return value
from rl_read_init_file(), but that happens to be the last reference to
a dynamically allocated char*.
decompressor object. This required adding a flag to the struct which is
true if initialisation was completed; on object destruction, deflateEnd()
is only called if the flag is true.
NOTE: There is still a bug of some sort in the behavior of zlib. In
at least one case, inflate returns Z_OK (which is typically
interpreted to mean that more output space is needed) when it has
finished inflating a buffer. This has been reported as a bug to the
zlib maintainers; we may need to change the Python interface.
> mpz.mpz('\xff') should return mpz(255). Instead it returns
> mpz(4294967295L). Looks like the constructor doesn't work with strings
> containing characters above chr(128).
Caused by using just 'char' where 'unsigned char' should have been used.
But IMHO, this problem really reveals an annoyance in Python's
makesetup. makesetup puts the global include directories "$(INCLUDEPY)
$(EXECINCLUDEPY)" in front of the directories defined by the module in
Setup. Therefore global (potentially older) header files are preferred
over the ones set by the module, which makes it hard to compile new
versions of modules when the old versions are installed. AFAIK, the
other way around is common practice for most other software.
This patch to makesetup would be a potential fix for this problem,
though I don't know if it breaks anything else.
- New copyright. (Open source)
- Added new protocol for binary string pickles that
takes out unneeded puts:
p=Pickler()
p.dump(x)
p.dump(y)
thePickle=p.getvalue()
This has little or no impact on pickling time, but
often reduces unpickling time and pickle size, sometimes
significantly.
- Changed unpickler to use internal data structure instead
of list to reduce unpickling times by about a third.
- Many cleanups to get rid of obfuscated error handling
involving 'goto finally' and status variables.
- Extensive reGuidofication. (formatting :)
- Fixed binary floating-point pickling bug. 0.0 was not
pickled correctly.
- Now use binary floating point format when saving
floats in binary mode.
- Fixed some error message spelling errors.
- New copyright. (Open source)
- Fixed problem in seek method. The seek method should (and now does)
fill with nulls when seeking past the end of the "file".
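For example (a small sketch of the intended behaviour):

    from cStringIO import StringIO
    f = StringIO()
    f.write("abc")
    f.seek(10)                  # past the current end of the "file"
    f.write("xyz")
    print repr(f.getvalue())    # 'abc' followed by seven NUL bytes, then 'xyz'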
device and control pseudo-device:
- first look for the device filename in the environment variable
AUDIODEV.
- if not found, use /dev/audio
- calculate the control device by tacking "ctl" onto the base device
name.
We now do this. Also, if the open fails, we call
PyErr_SetFromErrnoWithFilename() to give a more informative error
message.
Added a fileno() method to the audio object returned from open().
This returns the file descriptor which can be used by applications to
set up SIGPOLL notification, as per the manpage.
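The device selection described above amounts to roughly the following (a
sketch only; the real logic is in C, and sunaudiodev exists only on Sun
platforms):

    import os, sunaudiodev
    # What the module now does internally when picking device names:
    device = os.environ.get("AUDIODEV", "/dev/audio")   # AUDIODEV wins if set
    control = device + "ctl"                            # e.g. /dev/audioctl
    # The object returned by open() now exposes its descriptor:
    dev = sunaudiodev.open("r")
    fd = dev.fileno()       # handy for setting up SIGPOLL notification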
solves the conflict with curses over the 'clear' entry point much
more nicely. (Jack had checked in the changes to cstubs eons ago, but I
never regenerated glmodule.c :-( )
the compilation flags for the gl, fl and fm modules. This avoids a
name conflict with the curses module (both gl and curses have an entry
point called 'clear').
thread state of the thread calling mainloop() (or another event
handling function) rather than the thread state of the thread that
created the client data structure.
2-digit years are now converted using rules that are (according to
Fredrik Lundh) recommended by POSIX or X/Open: 0-68 mean 2000-2068,
69-99 mean 1969-1999.
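In other words (a small illustrative helper, not part of the module):

    def expand_year(y):
        # POSIX / X/Open convention for 2-digit years:
        #   0-68 -> 2000-2068, 69-99 -> 1969-1999
        if 0 <= y <= 68:
            return y + 2000
        if 69 <= y <= 99:
            return y + 1900
        return y    # already a full year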
2-digit years are now only accepted if time.accept2dyear is set to a
nonzero integer; if it is zero or not an integer or absent, only year
values >= 1900 are accepted. Year values 100-1899 and negative year
values are never accepted.
The initial value of time.accept2dyear depends on the environment
variable PYTHONY2K: if PYTHONY2K is set and non-empty,
time.accept2dyear is initialized to 0; if PYTHONY2K is empty or not
set, time.accept2dyear is initialized to 1.
I had to make a slight diddle to work with Python 1.4, which
we and some of our customers are still using. :(
I've also made a few minor enhancements:
- You can now both get and set the memo using a 'memo'
attribute. This is handy for certain advanced applications
that we have.
- Added a 'binary' attribute to get and set the binary
mode for a pickler.
- Added a somewhat experimental 'fast' attribute. When this
is set, objects are not placed in the memo during pickling.
This should lead to faster pickling and smaller pickles in
cases where:
o you *know* there are no circular references, and
o either:
- you've preloaded the memo with class information
by pickling classes in non-fast mode or by
manipulating the memo directly, or
- you aren't pickling instances.
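A brief sketch of using the new attributes (assumes the data really is acyclic):

    import cPickle
    from cStringIO import StringIO

    out = StringIO()
    p = cPickle.Pickler(out, 1)     # binary mode; also exposed as p.binary
    p.fast = 1                      # skip memo puts -- only safe without cycles
    p.dump(["plain", ("acyclic", "data")])
    pickled = out.getvalue()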
1. Only DECREF the class's module when the module is retrieved via
PyImport_Import. If it is retrieved from the modules dictionary with
PyDict_GetItem, it is using a borrowed reference.
2. If the module doesn't define the desired class, raise the same
SystemError that pickle.py does instead of returning an AttributeError
(which is cryptic at best).
Also, fix the PyArg_ParseTuple call in cpm_loads (the externally
visible loads function): use "S" instead of "O" because cStringIO will croak
with a "bad arguments to internal function" if passed anything other
than a string.