stack usage on FreeBSD, requiring the recursion limit to be lowered
further. Building with gcc 2.95 (the standard compiler on FreeBSD 4.x)
is now also affected.
The underlying issue is that FreeBSD's pthreads implementation has a
hard-coded 1MB stack size for the initial (or "primary") thread, which
cannot be changed without rebuilding libc_r. Exhausting this stack
results in a bus error.
Building without pthreads (configure --without-threads), or linking
with the port of the Linux pthreads library (aka Linuxthreads) instead
of libc_r, avoids this limitation.
On OS/2, only gcc 3.2 is affected and the stack size is controllable,
so the special handling has been removed.
build (assert(gc->gc.gc_refs != 0) in visit_decref()).
Because OSSAudioError is a global, we must compensate (twice!) for
PyModule_AddObject()'s "helpful" decref of the object it adds.
* it no longer takes ssize, which served no purpose apart from
scolding you if you got it wrong
* changed the order of the three remaining required arguments
to (format, channels, rate) to match the order in which they
must be set
* replaced the optional argument 'emulate' with 'strict': if strict
is true and the audio device does not accept the requested sampling
parameters, raise OSSAudioError
* return a tuple (format, channels, rate) reflecting the sampling
parameters that were actually set (see the sketch after this list)
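For illustration, a minimal sketch of the reworked call (device mode and
values are examples, not from this checkin):

    import ossaudiodev
    dsp = ossaudiodev.open('w')
    # With strict false (the default), the driver may adjust the request;
    # the returned tuple reports what was actually set.
    fmt, channels, rate = dsp.setparameters(ossaudiodev.AFMT_S16_LE, 2, 44100)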
Change the canonical name of ossaudiodev.error to
ossaudiodev.OSSAudioError (keep an alias for backwards compatibility).
Remove 'audio_types' list and 'n_audio_types' (no longer needed now that
setparameters() no longer has an 'ssize' argument to police).
* sync(), because it waits for hardware buffers to flush, which
can take several seconds depending on circumstances (according
to the OSS docs)
* close(), because it does an implicit sync()
tp_free is NULL or PyObject_Del at the end. Because it's a base type
it must call tp_free in its dealloc function, and because it's gc'able
it must not call PyObject_Del.
inherit_slots(): Don't inherit tp_free unless the type and its base
agree about whether they're gc'able. If the type is gc'able and the
base is not, and the base uses the default PyObject_Del for its
tp_free, give the type PyObject_GC_Del for its tp_free (the appropriate
default for a gc'able type).
cPickle.c: The Pickler and Unpickler types claim to be base classes
and gc'able, but their dealloc functions didn't call tp_free.
Repaired that. Also call PyType_Ready() on these typeobjects, so
that the correct (PyObject_GC_Del) default memory-freeing function
gets plugged into these types' tp_free slots.
one good use: a subclass adding a method to express the duration as
a number of hours (or minutes, or whatever else you want to add). The
native breakdown into days+seconds+us is often clumsy. Incidentally
moved a large chunk of object-initialization code closer to the top of
the file, to avoid worse forward-reference trickery.
the itertoolsmodule.
* Taught itertools.repeat(obj, n) to treat negative repeat counts as
zero. This behavior matches that for sequences and prevents
infinite loops.
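A quick interactive illustration of the clipped behavior:

>>> from itertools import repeat
>>> list(repeat('a', -1))   # negative counts now behave like zero
[]
>>> list(repeat('a', 2))
['a', 'a']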
that was used to start the thread. This is useful to track down the
source of the problem when there is no traceback, as can happen when a
daemon thread gets to run after Python is finalized (a new kind of
event, apparently now possible due to changes in Py_Finalize()).
to use LASTMARK_SAVE()/LASTMARK_RESTORE(), based on the discussion
in patch #712900.
- Cleaned up LASTMARK_SAVE()/LASTMARK_RESTORE() usage, based on the
established rules.
- Moved the upper part of the just committed patch (relative to bug #725106)
to outside the for() loop of BRANCH OP. There's no need to mark_save()
in every loop iteration.
This problem is related to a wrong behavior from mark_save/restore(),
which don't restore the mark_stack_base before restoring the marks.
Greg's suggestion was to change the asserts, which happen to be
the only recursive ops that can continue the loop, but the problem would
happen to any operation with the same behavior. So, rather than
hardcoding this into asserts, I have changed mark_save/restore() to
always restore the stackbase before restoring the marks.
Both solutions should fix these two cases, presented by Greg:
>>> re.match('(a)(?:(?=(b)*)c)*', 'abb').groups()
('b', None)
>>> re.match('(a)((?!(b)*))*', 'abb').groups()
('b', None, None)
The rest of the bug and patch in #725149 must be discussed further.
within repeats of alternatives. The only change to the original
patch was to convert the tests to the new test_re.py file.
This patch fixes cases like:
>>> re.match('((a)|b)*', 'abc').groups()
('b', '')
Which is wrong (it's impossible to match the empty string),
and incompatible with other regex systems, as the following
examples show:
% perl -e '"abc" =~ /^((a)|b)*/; print "$1 $2\n";'
b a
% echo "abc" | sed -r -e "s/^((a)|b)*/\1 \2|/"
b a|c
- The socket module now provides the functions inet_pton and inet_ntop
for converting between string and packed representation of IP addresses.
See SF patch #658327.
This still needs a bit of work in the doc area, because it is not
available on all platforms (especially not on Windows).
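An illustrative session, on a platform that provides the underlying C
functions:

>>> import socket
>>> socket.inet_pton(socket.AF_INET, '1.2.3.4')
'\x01\x02\x03\x04'
>>> socket.inet_ntop(socket.AF_INET, '\x01\x02\x03\x04')
'1.2.3.4'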
(contributed by logistix; substantially reworked by rhettinger).
To create a representation of non-string arrays, array_repr() was
starting with a base Python string object and repeatedly using +=
to concatenate the representation of individual objects.
Logistix had the idea to convert to an intermediate tuple form and
then join it all at once. I took advantage of existing tools and
formed a list with array_tolist() and got its representation through
PyObject_Repr(v) which already has a fast implementation for lists.
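In Python terms, the change amounts to something like this sketch (names
hypothetical):

    # Old: quadratic, because each += copies the whole string built so far.
    s = ''
    for item in a:
        s += repr(item) + ', '
    # New: build a list once and lean on the fast list repr.
    s = repr(a.tolist())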
docs here are best-guess: the MS docs I could find weren't clear, and
some even claimed _commit() has no effect on Win32 systems (which is
easily shown to be false just by trying it).
* UINT_MAX -> ULONG_MAX since we are dealing with longs
* ParseTuple needs &int for 'i' and &long for 'l'
There may be a better way to do this, but this works.
string does what is expected (i.e. unset [BEGIN|END]LIBPATH)
- set the size of the DosQuerySysInfo buffer correctly; it was safe,
but incorrect (allowing a 1 element overrun)
I've applied a modified version of Greg Chapman's patch. I've included
the fixes without introducing the reorganization mentioned, for the sake
of stability. Also, the second fix mentioned in the patch no longer
fixes that problem, because of the change introduced by patch
#720991 (by Greg as well). The new fix wasn't complicated though, and is
included as well.
As a note. It seems that there are other places that require the
"protection" of LASTMARK_SAVE()/LASTMARK_RESTORE(), and are just waiting
for someone to find out how to break them. In particular, I believe that every
recursion of SRE_MATCH() should be protected by these macros. I won't
do that right now since I'm not completely sure about this, and we don't
have much time for testing until the next release.
to be compliant with previous python versions, by backing out the changes
made in revision 2.84 which affected this. The bugfix for backtracking is
still maintained.
New functions:
unsigned long PyInt_AsUnsignedLongMask(PyObject *);
unsigned PY_LONG_LONG PyInt_AsUnsignedLongLongMask(PyObject *);
unsigned long PyLong_AsUnsignedLongMask(PyObject *);
unsigned PY_LONG_LONG PyLong_AsUnsignedLongLongMask(PyObject *);
New and changed format codes:
b unsigned char 0..UCHAR_MAX
B unsigned char none **
h unsigned short 0..USHRT_MAX
H unsigned short none **
i int INT_MIN..INT_MAX
I * unsigned int 0..UINT_MAX
l long LONG_MIN..LONG_MAX
k * unsigned long none
L long long LLONG_MIN..LLONG_MAX
K * unsigned long long none
Notes:
* New format codes.
** Changed from previous "range-and-a-half" to "none"; the
range-and-a-half checking wasn't particularly useful.
New test test_getargs2.py, to verify all this.
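The Mask variants do no overflow checking; conceptually they reduce the
value modulo 2**(8 * sizeof(type)). In Python terms, assuming a 32-bit
C long:

>>> (-1) & 0xFFFFFFFFL    # what PyInt_AsUnsignedLongMask(-1) yields
4294967295L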
A small fix for bug #545855 and Greg Chapman's
addition of op code SRE_OP_MIN_REPEAT_ONE for
eliminating recursion on simple uses of pattern '*?' on a
long string.
of PyObject_HasAttr(); the former promises never to execute
arbitrary Python code. Undid many of the changes recently made to
worm around the worst consequences of the fact that PyObject_HasAttr()
can execute arbitrary Python code.
Compatibility is hard to discuss, because the dangerous cases are
so perverse, and much of this appears to rely on implementation
accidents.
To start with, using hasattr() to check for __del__ wasn't only
dangerous, in some cases it was wrong: if an instance of an old-
style class didn't have "__del__" in its instance dict or in any
base class dict, but a getattr hook said __del__ existed, then
hasattr() said "yes, this object has a __del__". But
instance_dealloc() ignores the possibility of getattr hooks when
looking for a __del__, so while object.__del__ succeeds, no
__del__ method is called when the object is deleted. gc was
therefore incorrect in believing that the object had a finalizer.
The new method doesn't suffer that problem (like instance_dealloc(),
_PyObject_Lookup() doesn't believe __del__ exists in that case), but
does suffer a somewhat opposite -- and even more obscure -- oddity:
if an instance of an old-style class doesn't have "__del__" in its
instance dict, and a base class does have "__del__" in its dict,
and the first base class with a "__del__" associates it with a
descriptor (an object with a __get__ method), *and* if that
descriptor raises an exception when __get__ is called, then
(a) the current method believes the instance does have a __del__,
but (b) hasattr() does not believe the instance has a __del__.
While these disagree, I believe the new method is "more correct":
because the descriptor *will* be called when the object is
destructed, it can execute arbitrary Python code at the time the
object is destructed, and that's really what gc means by "has a
finalizer": not specifically a __del__ method, but more generally
the possibility of executing arbitrary Python code at object
destruction time. Code in a descriptor's __get__() executed at
destruction time can be just as problematic as code in a
__del__() executed then.
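A sketch of the oddity just described, with hypothetical classes (classic
class semantics):

>>> class D(object):
...     def __get__(self, obj, cls):
...         raise RuntimeError('exploding descriptor')
...
>>> class Base:                 # old-style class
...     __del__ = D()
...
>>> class C(Base):
...     pass
...
>>> hasattr(C(), '__del__')     # (b): hasattr says no
False

Roughly, the new gc lookup finds "__del__" in Base.__dict__ without
calling __get__, so it believes the instance has a finalizer -- that's (a).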
So I believe the new method is better on all counts.
Bugfix candidate, but it's unclear to me how all this differs in
the 2.2 branch (e.g., new-style and old-style classes already
took different gc paths in 2.3 before this last round of patches,
but don't in the 2.2 branch).
instead of looping. Smaller and clearer. Faster, too, when we're not
appending to gc.garbage: gc_list_merge() takes constant time, regardless
of the lists' sizes.
append_objects(): Moved up to live with the other list manipulation
utilities.
externally unreachable objects with finalizers, and externally unreachable
objects without finalizers reachable from such objects. This allows us
to call has_finalizer() at most once per object, and so limit the pain of
nasty getattr hooks. This fixes the failing "boom 2" example Jeremy
posted (a non-printing variant of which is now part of test_gc), via never
triggering the nasty part of its __getattr__ method.
to special-case classic classes, or to worry about refcounts;
has_finalizer() deleted the current object iff the first entry in
the unreachable list has changed. I don't believe it was correct
to check for ob_refcnt == 1, either: the dealloc routine would get
called by Py_DECREF then, but there's nothing to stop the dealloc
routine from resurrecting the object, and then gc would remain at
the head of the unreachable list even though its refcount temporarily
fell to 0 (and that would lead to an infinite loop in move_finalizers()).
I'm still worried about has_finalizer() resurrecting other objects
in the unreachable list: what's to stop them from getting collected?
delstr from initgc() into collect(). initgc() isn't called unless the
user explicitly imports gc, so can be used only for initialization of
user-visible module features; delstr needs to be initialized for proper
internal operation, whether or not gc is explicitly imported.
Bugfix candidate? I don't know whether the new bug was backported to
2.2 already.
pack_float, pack_double, save_float: All the routines for creating
IEEE-format packed representations of floats and doubles simply ignored
that rounding can (in rare cases) propagate out of a long string of
1 bits. At worst, the end-off carry can (by mistake) interfere with
the exponent value, and then unpacking yields a result wrong by a factor
of 2. In less severe cases, it can end up losing more low-order bits
than intended, or fail to catch overflow *caused* by rounding.
Bugfix candidate, but I already backported this to 2.2.
In 2.3, this code remains in severe need of refactoring.
for specific platforms. Use this to add plat-mac and
plat-mac/lib-scriptpackages on MacOSX. Also tested for not having adverse
effects on Linux, and I think this code isn't used on Windows anyway.
Fixes #661521.
invalid, rather than returning a string of random garbage of the
estimated result length. Closes SF patch #703471 by Hye-Shik Chang.
Will backport to 2.2-maint (consider it done.)
it instead of the OS-specific <linux/soundcard.h> or <machine/soundcard.h>.
Mixer devices have an ioctl-only interface, no read/write -- so the
flags passed to open() don't really matter. Thus, drop the 'mode'
parameter to openmixer() (ie. second arg to newossmixerobject()) and
always open mixers with O_RDWR.
- The applet logic has been replaced by bundlebuilder's bootstrap script
- Due to Apple being extremely strict about argv[0], we need a way to
specify the actual executable name for use with sys.executable. See
the comment embedded in the code.
- Implement the behavior as specified in PEP 277, meaning os.listdir()
will only return unicode strings if it is _called_ with a unicode
argument.
- And then return only unicode, don't attempt to convert to ASCII.
- Don't switch on Py_FileSystemDefaultEncoding, but simply use the
default encoding if Py_FileSystemDefaultEncoding is NULL. This means
os.listdir() can now raise UnicodeDecodeError if the default encoding
can't represent the directory entry. (This seems better than silencing
the error and falling back to a byte string.)
- Attempted to describe the above in Doc/lib/libos.tex.
- Reworded the Misc/NEWS items to reflect the current situation.
This checkin also fixes bug #696261, which was due to os.listdir() not
using Py_FileSystemDefaultEncoding, like all file system calls are
supposed to.
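An illustrative session (directory contents hypothetical):

>>> import os
>>> os.listdir('.')      # byte-string argument: byte-string results
['spam.py']
>>> os.listdir(u'.')     # unicode argument: unicode results
[u'spam.py']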
[ 555817 ] Flawed fcntl.ioctl implementation.
with my patch that allows for an array to be mutated when passed
as the buffer argument to ioctl() (details complicated by
backwards compatibility considerations -- read the docs!).
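A hedged example of the mutate flavor (the ioctl and result values are
illustrative; stdin must be a tty for this one):

>>> import array, fcntl, termios
>>> buf = array.array('h', [0])
>>> fcntl.ioctl(0, termios.TIOCGPGRP, buf, 1)   # mutate_flag true
0
>>> buf                                         # mutated in place
array('h', [13341])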
Change setup.py to catch all exceptions.
- Rename module if the exception was an ImportError
- Only warn if the exception was any other error
Revert _iconv_codec to raising a RuntimeError.
(see SF bug #690309) and raise ImportErrors instead of
RuntimeErrors, so building Python continues even
if importing iconv_codecs fails.
This is a temporary fix until we get proper configure
support for "broken" iconv implementations.
is required for the chosen internal encoding in the init function,
as this seems to have a better chance of working under Irix and
Solaris.
Also change the test character from '\x01' to '0'.
This might fix SF bug #690309.
Rather than trying to second-guess the various error returns
of a second connect(), use select() to determine whether the
socket becomes writable (which means connected).
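The technique, sketched at the Python level (the real change is C code
inside the socket module; the helper below is hypothetical):

    import errno, select, socket

    def connect_with_timeout(addr, timeout):
        s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        s.setblocking(0)
        if s.connect_ex(addr) in (errno.EINPROGRESS, errno.EWOULDBLOCK):
            # Wait for writability instead of second-guessing the error
            # returns of a second connect().
            if not select.select([], [s], [], timeout)[1]:
                raise socket.timeout('timed out')
        s.setblocking(1)
        return s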
reasons: importing the module can fail, or the attribute lookup
module.name can fail. We were giving the same error msg for
both cases, making it needlessly hard to guess what went wrong.
These cases give different error msgs now.
Fix off-by-1 error in normalize_line_endings():
when *p == '\0' the NUL was copied into q and q was auto-incremented,
the loop was broken out of,
then a newline was appended followed by a NUL.
So the function, in effect, was strcpy() but added two extra chars
which was caught by obmalloc in debug mode, since there was only
room for 1 additional newline.
Get test working under regrtest (added test_main).
the optional proto 2 slot state.
pickle.py, load_build(): CAUTION: Noted that cPickle's
load_build and pickle's load_build really don't do the same
things with the state, and didn't before this patch either.
cPickle never tries to do .update(), and has no backoff if
instance.__dict__ can't be retrieved. There are no tests
that can tell the difference, and part of what cPickle's
load_build() did looked accidental to me, so I don't know
what the true intent is here.
pickletester.py, test_pickle.py: Got rid of the hack for
exempting cPickle from running some of the proto 2 tests.
dictobject.c, PyDict_Next(): documented intended use.
exercised by the test suite before cPickle knows how to create NEWOBJ
too. For now, it was just tried once by hand (via loading a NEWOBJ
pickle created by pickle.py).
inet_aton() rather than inet_addr() -- the latter is obsolete because
it has a problem: "255.255.255.255" is a valid address but
indistinguishable from an error.
(I'm not sure if inet_aton() exists everywhere -- in case it doesn't,
I've left the old code in with an #ifdef.)
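The problem case, interactively:

>>> import socket
>>> socket.inet_aton('255.255.255.255')   # fine with inet_aton()
'\xff\xff\xff\xff'

With inet_addr() the same address comes back as INADDR_NONE, the error
value.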
* Modules/bz2module.c
(BZ2FileObject): Now the structure includes a pointer to a file object,
instead of "inheriting" one. Also, some members were copied from the
PyFileObject structure to avoid dealing with the internals of that
structure from outside fileobject.c.
(Util_GetLine,Util_DropReadAhead,Util_ReadAhead,Util_ReadAheadGetLineSkip,
BZ2File_write,BZ2File_writelines,BZ2File_init,BZ2File_dealloc,
BZ2Comp_dealloc,BZ2Decomp_dealloc):
These functions were adapted to the change above.
(BZ2File_seek,BZ2File_close): Use PyObject_CallMethod instead of
getting the function attribute locally.
(BZ2File_notsup): Removed, since it's not necessary anymore to overload
truncate() and readinto() with dummy functions.
(BZ2File_methods): Added xreadlines() as an alias to BZ2File_getiter,
and removed truncate() and readinto().
(BZ2File_get_newlines,BZ2File_get_closed,BZ2File_get_mode,BZ2File_get_name,
BZ2File_getset):
Implemented getters for "newlines", "mode", and "name".
(BZ2File_members): Implemented "softspace" member.
(BZ2File_init): Reworked to create a file instance instead of initializing
itself as a file subclass. Also, pass "name" object untouched to the
file constructor, and use PyObject_CallFunction instead of building the
argument tuple locally.
(BZ2File_Type): Set tp_new to PyType_GenericNew, tp_members to
BZ2File_members, and tp_getset to BZ2File_getset.
(initbz2): Do not set BZ2File_Type.tp_base nor BZ2File_Type.tp_new.
* Doc/lib/libbz2.tex
Do not mention that BZ2File inherits from the file type.
* Removed the ifilter flag wart by splitting it into two simpler functions.
* Fixed comment tabbing in C code.
* Factored module start-up code into a loop.
Documentation:
* Re-wrote introduction.
* Added examples for quantifiers.
* Simplified python equivalent for islice().
* Documented split of ifilter().
Sets.py:
* Replace old ifilter() usage with new.
__ne__ no longer complain if they don't know how to compare to the other
thing. If no meaningful way to compare is known, saying "not equal" is
sensible. This allows things like
if adatetime in some_sequence:
and
somedict[adatetime] = whatever
to work as expected even if some_sequence contains non-datetime objects,
or somedict non-datetime keys, because they only call __eq__.
It still complains (raises TypeError) for mixed-type comparisons in
contexts that require a total ordering, such as list.sort(), use as a
key in a BTree-based data structure, and cmp().
* Fixed typo in exception message for times()
* Filled in missing times_traverse()
* Document reasons that imap() did not adopt a None fill-in feature
* Document that count(sys.maxint) will wrap around on overflow
* Add overflow test to islice()
* Check that starmap()'s argument returns a tuple
* Verify that imap()'s tuple re-use is safe
* Make a similar tuple re-use (with safety check) for izip()
guarantee to keep valid pointers in its slots.
tests: Moved ExtensionSaver from test_copy_reg into pickletester, and
use it both places. Once extension codes get assigned, it won't be
safe to overwrite them willy nilly in test suites, and ExtensionSaver
does a thorough job of undoing any possible damage.
Beefed up the EXT[124] tests a bit, to check the smallest and largest
codes in each opcode's range too.
this clarifies that they are part of an internal API (albeit shared
between pickle.py, copy_reg.py and cPickle.c).
I'd like to do the same for copy_reg.dispatch_table, but worry that it
might be used by existing code. This risk doesn't exist for the
extension registry.
because it seems more consistent with the rest of the code.
cPickle_PyMapping_HasKey(): This extern function isn't used anywhere in
Python or Zope, so got rid of it.
extension implemented flush() was fixed. Scott also rewrote the
zlib test suite using the unittest module. (SF bug #640230 and
patch #678531.)
Backport candidate I think.
readability.
load_bool(): Now that I know the intended difference between _PUSH and
_APPEND, used the right one.
Pdata_grow(): Squashed out a redundant overflow test.
a function, then
p->f(arg1, arg2, ...)
is semantically the same as
(*p->f)(arg1, arg2, ...)
Changed all instances of the latter into the former. Given how often
the code embeds this kind of expression in an if test, the unnecessary
parens and dereferencing operator were a real drag on readability.
loops. Renamed DATA and BINDATA to DATA0 and DATA1. Included
disassemblies, but noted why we can't test them. Added XXX comment to
cPickle about a mysterious comment, where pickle and cPickle diverge
in how they number PUT indices.
Assorted code cleanups; e.g., sizeof(char) is 1 by definition, so there's
no need to do things like multiply by sizeof(char) in hairy malloc
arguments. Fixed an undetected-overflow bug in readline_file().
longobject.c: Fixed a really stupid bug in the new _PyLong_NumBits.
pickle.py: Fixed stupid bug in save_long(): When proto is 2, it
wrote LONG1 or LONG4, but forgot to return then -- it went on to
append the proto 1 LONG opcode too.
Fixed equally stupid cancelling bugs in load_long1() and
load_long4(): they *returned* the unpickled long instead of pushing
it on the stack. The return values were ignored. Tests passed
before only because save_long() pickled the long twice.
Fixed bugs in encode_long().
Noted that decode_long() is quadratic-time despite our hopes,
because long(string, 16) is still quadratic-time in len(string).
It's hex() that's linear-time. I don't know a way to make decode_long()
linear-time in Python, short of maybe transforming the 256's-complement
bytes into marshal's funky internal format, and letting marshal decode
that. It would be more valuable to make long(string, 16) linear time.
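For reference, the helpers traffic in 256's-complement, little-endian
strings (illustrative session):

>>> from pickle import encode_long, decode_long
>>> encode_long(255L)      # needs a sign byte to stay non-negative
'\xff\x00'
>>> decode_long('\xff\x00')
255L
>>> encode_long(-1L)
'\xff'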
pickletester.py: Added a global "protocols" vector so tests can try
all the protocols in a sane way. Changed test_ints() and test_unicode()
to do so. Added a new test_long(), but the tail end of it is disabled
because it "takes forever" under pickle.py (but runs very quickly under
cPickle: cPickle proto 2 for longs is linear-time).
functions. Reworked {time,datetime}_new() to do what their corresponding
setstates used to do in their state-tuple-input paths, but directly,
without constructing an object with throwaway state first. Tightened
the "is this a state tuple input?" paths to check the presumed state
string-length too, and to raise an exception if the optional second state
element isn't a tzinfo instance (IOW, check these paths for type errors
as carefully as the normal paths).
anymore either, so don't. This also allows us to get rid of obscure code
making __getnewargs__ identical to __getstate__ (hmm ... hope there
wasn't more to this than I realize!).
(pickling no longer needs them, and immutable objects shouldn't have
visible __setstate__() methods regardless). Rearranged the code to
put the internal setstate functions in the constructor sections.
Repaired the timedelta reduce() method, which was still producing
stuff that required a public timedelta.__setstate__() when unpickling.
Geoff writes:
This is yet another patch to _ssl.c that sets the
underlying BIO to non-blocking if the socket being
wrapped is non-blocking. It also correctly loops when
SSL_connect, SSL_write, or SSL_read indicates that it
needs to read or write more bytes.
This seems to fix bug #673797 which was not fixed by my
previous patch.
error handlers in the Unicode codecs: Negative
positions are treated as being relative to the end of
the input and out of bounds positions result in an
IndexError.
Also update the PEP and include an explanation of
this in the documentation for codecs.register_error.
Fixes a small bug in iconv_codecs: if the position
from the callback is negative, *add* it to the size
instead of subtracting it.
From SF patch #677429.
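A sketch of the new semantics (handler name and data hypothetical):

>>> import codecs
>>> def resume_near_end(exc):
...     # -1 is now interpreted relative to the end of the input
...     return (u'?', -1)
...
>>> codecs.register_error('resume-near-end', resume_near_end)
>>> '\xffabcd'.decode('ascii', 'resume-near-end')
u'?d'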
needs of pickling longs. Backed off to a definition that's much easier
to understand. The pickler will have to work a little harder, but other
uses are more likely to be correct <0.5 wink>.
_PyLong_Sign(): New teensy function to characterize a long, as to <0, ==0,
or >0.
classes have a __reduce__ that returns (self.__class__,
self.__getstate__()). tzinfo.__reduce__() is a bit smarter, calling
__getinitargs__ and __getstate__ if they exist, and falling back to
__dict__ if it exists and isn't empty.
for this iconv() implementation in the init function.
For encoding: use a byteswapped version of the input if
necessary.
For decoding: byteswap every piece returned by iconv()
if necessary (but not those pieces returned from the
callback).
Comment out test_sane() in the test script, because
whether this works depends on whether byte swapping
is necessary or not (and on Py_UNICODE_SIZE).
METH_NOARGS functions are still called with two arguments, one NULL,
so put that back into the function definitions (I didn't know this
until recently).
Make get_history_length() METH_NOARGS.
start for the C implemention of new pickle LONG1 and LONG4 opcodes (the
linear-time way to pickle a long is to call _PyLong_AsByteArray, but
the caller has no idea how big an array to allocate, and correct
calculation is a bit subtle).
compare against "the other" argument, we raise TypeError,
in order to prevent comparison from falling back to the
default (and worse than useless, in this case) comparison
by object address.
That's fine so far as it goes, but leaves no way for
another date/datetime object to make itself comparable
to our objects. For example, it leaves Marc-Andre no way
to teach mxDateTime dates how to compare against Python
dates.
Discussion on Python-Dev raised a number of impractical
ideas, and the simple one implemented here: when we don't
know how to compare against "the other" argument, we raise
TypeError *unless* the other object has a timetuple attr.
In that case, we return NotImplemented instead, and Python
will give the other object a shot at handling the
comparison then.
Note that comparisons of time and timedelta objects still
suffer the original problem, though.
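A sketch of the hook (FakeDate is hypothetical):

>>> from datetime import date
>>> class FakeDate:
...     timetuple = None       # mere presence of the attribute matters
...     def __eq__(self, other):
...         return True        # pretend we know how to compare
...
>>> # date sees the timetuple attr, returns NotImplemented, and Python
>>> # then gives FakeDate.__eq__ a shot.
>>> date(2003, 1, 1) == FakeDate()
True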
This gives much the same treatment to datetime.fromtimestamp(stamp, tz) as
the last batch of checkins gave to datetime.now(tz): do "the obvious"
thing with the tz argument instead of a senseless thing.
checked in two days ago:
Refactoring of, and new rules for, dt.astimezone(tz).
dt must be aware now, and tz.utcoffset() and tz.dst() must not return None.
The old dt.astimezone(None) no longer works to change an aware datetime
into a naive datetime; use dt.replace(tzinfo=None) instead.
The tzinfo base class now supplies a new fromutc(self, dt) method, and
datetime.astimezone(tz) invokes tz.fromutc(). The default implementation
of fromutc() reproduces the same results as the old astimezone()
implementation, but tzinfo subclasses can override fromutc() if the
default implementation isn't strong enough to get the correct results
in all cases (for example, this may be necessary if a tzinfo subclass
models a time zone whose "standard offset" (wrt UTC) changed in some
year(s), or in some variations of double-daylight time -- the creativity
of time zone politics can't be captured in a single default implementation).
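A minimal fixed-offset zone that leans on the inherited default fromutc()
(class and zone values hypothetical):

    from datetime import datetime, timedelta, tzinfo

    class FixedOffset(tzinfo):
        def __init__(self, minutes, name):
            self.__offset = timedelta(minutes=minutes)
            self.__name = name
        def utcoffset(self, dt):
            return self.__offset
        def dst(self, dt):
            return timedelta(0)
        def tzname(self, dt):
            return self.__name

    utc = FixedOffset(0, 'UTC')
    est = FixedOffset(-300, 'EST')
    dt = datetime(2003, 1, 15, 12, 0, tzinfo=utc)
    # astimezone() now routes through est.fromutc(); the default
    # implementation suffices for a zone with a constant offset.
    print dt.astimezone(est)    # 2003-01-15 07:00:00-05:00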
60: Added support for the SkippedEntityHandler, new in Expat 1.95.4.
61: Added support for namespace prefixes, which can be enabled by setting the
"namespace_prefixes" attribute on the parser object.
65: Disable profiling changes for Python 2.0 and 2.1.
66: Update pyexpat to export the Expat 1.95.5 XML_GetFeatureList()
information, and tighten up a type declaration now that Expat is using
an incomplete type rather than a void * for the XML_Parser type.
67: Clarified a comment.
Added support for XML_UseForeignDTD(), new in Expat 1.95.5.
68: Refactor to avoid partial duplication of the code to construct an
ExpatError instance, and actually conform to the API for the exception
instance as well.
69: Remove some spurious trailing whitespace.
Add a special external-entity-ref handler that gets installed once a
handler has raised a Python exception; this can cancel actual parsing
earlier if there's an external entity reference in the input data
after the Python exception has been raised.
70: Untabify APPEND.
71: Backport PyMODINIT_FUNC for 2.2 and earlier.
When daylight time ends, an hour repeats on the local clock (for example,
in US Eastern, the clock jumps from 1:59 back to 1:00 again). Times in
the repeated hour are ambiguous. A tzinfo subclass that wants to play
with astimezone() needs to treat times in the repeated hour as being
standard time. astimezone() previously required that such times be
treated as daylight time. There seems no killer argument either way,
but Guido wants the standard-time version, and it does seem easier the
new way to code both American (local-time based) and European (UTC-based)
switch rules, and the astimezone() implementation is simpler.
underlying DB has already been closed (and thus all of its cursors).
This fixes a potential segfault.
SF pybsddb bug id 667343
bugfix: close the DB object when raising an exception due to an error
during DB.open. This prevents an exception when closing the
environment about not all databases being closed.
SF pybsddb bug id 667340
hoped it would be, but not too bad. A test had to change:
time.__setstate__() can no longer add a non-None tzinfo member to a time
object that didn't already have one, since storage for a tzinfo member
doesn't exist in that case.
into time. This is little more than *exporting* the datetimetz object
under the name "datetime", and similarly for timetz. A good implementation
of this change requires more work, but this is fully functional if you
don't stare too hard at the internals (e.g., right now a type named
"datetime" shows up as a base class of the type named "datetime"). The
docs also need extensive revision, not part of this checkin.
The attached patch enables shared extension
modules to build cleanly under Cygwin without
moving the static initialization of certain function
pointers (i.e., ones exported from the Python
DLL core) to a module initialization function.
Additionally, this patch fixes the modules that
have been changed in the past to accommodate
Cygwin.
cases, plus even tougher tests of that. This implementation follows
the correctness proof very closely, and should also be quicker (yes,
I wrote the proof before the code, and the code proves the proof <wink>).
Lesson learned: kids should not be allowed to use API's starting
with an underscore :-/
zipimport in 2.3a1 is even more broken than I thought: I attempted
to _PyString_Resize a string created by PyString_FromStringAndSize,
which fails for strings with length 0 or 1 since the latter returns
an interned string in those cases. This would cause a SystemError
with empty source files (and no matching pyc) in the zip archive.
I rewrote the offending code to simply allocate a new buffer and
avoid _PyString_Resize altogether.
Added a test that would've caught the problem.
(or None) now. In 2.3a1 they could also return an int or long, but that
was an unhelpfully redundant leftover from an earlier version wherein
they couldn't return a timedelta. TOOWTDI.
On Windows, it was very common to get microsecond values (out of
.today() and .now()) of the form 480999, i.e. with three trailing
nines. The platform precision is .001 seconds, and fp rounding
errors account for the rest. Under the covers, that 480999 started
life as the fractional part of a timestamp, like .4809999978.
Rounding that times 1e6 cures the irritation.
Confession: the platform precision isn't really .001 seconds. It's
usually worse. What actually happens is that MS rounds a cruder value
to a multiple of .001, and that suffers its own rounding errors.
A tiny bit of refactoring added a new internal utility to round
doubles.
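The arithmetic, in short:

>>> frac = .4809999978          # fractional seconds from a timestamp
>>> int(frac * 1e6)             # truncation: the irritating 480999
480999
>>> int(round(frac * 1e6))      # rounding: the intended 481000
481000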
the test set as it only tested with a zip archive in the current directory,
but it doesn't work at all for packages when the zip archive was specified
as an absolute path. It's a real embarrassing bug: a strchr call should
have been strrchr; fever apparently implies dyslexia.
Second stupid bug: the zipimport test failed with a name error
__importer__ (which I had renamed to __loader__ everywhere but here).
I would've sworn I ran the test after that change but that can't be true.
What I don't understand is that no one reported a failing test_zipimport.py
before the release of 2.3a1.
suggestion from Guido, along with a formal correctness proof of the
trickiest bit. The intricacy of the proof reveals how delicate this
is, but also how robust the conclusion: correctness doesn't rely on
dst() returning +- one hour (not all real time zones do!), it only
relies on:
1. That dst() returns a (any) non-zero value if and only if daylight
time is in effect.
and
2. That the tzinfo subclass implements a consistent notion of time zone.
The meaning of "consistent" was a hidden assumption, which is now an
explicit requirement in the docs. Alas, it's an unverifiable (by the
datetime implementation) requirement, but so it goes.
find a more elegant algorithm (OTOH, the hairy new implementation allows
user-written tzinfo classes to be elegant, so it's a big win even if
astimezone() remains hairy).
Darn! I've only got 10 minutes left to get falling-down drunk! I suppose
I'll have to smoke crack instead now.
The attached patch enables Cygwin Python to
build cleanly against the latest Cygwin Tcl/Tk
which is based on Tcl/Tk 8.3. It also prevents
building against the real X headers, if installed.
A variety of changes from Michael Hudson to get the compiler working
with 2.3. The primary change is the handling of SET_LINENO:
# The set_lineno() function and the explicit emit() calls for
# SET_LINENO below are only used to generate the line number table.
# As of Python 2.3, the interpreter does not have a SET_LINENO
# instruction. pyassem treats SET_LINENO opcodes as a special case.
A few other small changes:
- Remove unused code from pycodegen and pyassem.
- Fix error handling in parsermodule. When PyParser_SimpleParseString()
fails, it sets an exception with detailed info. The parsermodule
was clobbering that exception and replacing it with a generic
"could not parse string" exception. Keep the original exception.
an idea from Guido. This restores that the datetime implementation
never passes a datetime d to a tzinfo method unless d.tzinfo is the
tzinfo instance whose method is being called. That in turn allows
enormous simplifications in user-written tzinfo classes (see the Python
sandbox US.py and EU.py for fully fleshed-out examples).
d.astimezone(tz) also raises ValueError now if d lands in the one hour
of the year that can't be expressed in tz (this can happen iff tz models
both standard and daylight time). That it used to return a nonsense
result always ate at me, and it turned out that it seemed impossible to
force a consistent nonsense result under the new implementation (which
doesn't know anything about how tzinfo classes implement their methods --
it can only infer properties indirectly). Guido doesn't like this --
expect it to change.
New tests of conversion between adjacent DST-aware timezones don't pass
yet, and are commented out.
Running the datetime tests in a loop under a debug build leaks 9
references per test run, but I don't believe the datetime code is the
cause (it didn't leak the last time I changed the C code, and the leak
is the same if I disable all the tests that invoke the only function
that changed here). I'll pursue that next.
devices(), stereodevices(), recdevices() ->
controls(), stereocontrols(), reccontrols()
Based on recommendation of Hannu Savolainen <hannu@opensound.com>:
The right term to use for things like bass/treble/mic/vol/etc is
"control".
"Device" refers to different mixer devices (/dev/mixer0 to /dev/mixerN).
"Channel" cannot be used because it refers to mono/stereo/multich
channels. In fact most mixer controls have left/right channels so ...
- new import hooks in import.c, exposed in the sys module
- new module called 'zipimport'
- various changes to allow bootstrapping from zip files
I hope I didn't break the Windows build (or anything else for that
matter), but then again, it's been sitting on sf long enough...
Regarding the latest discussions on python-dev: zipimport sets
pkg.__path__ as specified in PEP 273, and likewise, sys.path items such as
/path/to/Archive.zip/subdir/ are supported again.
make the callers figure out the right tzinfo arguments to pass, instead of
making the callees guess. The code is uglier this way, but it's less
brittle (when the callee guesses, the caller can get surprised).
of the timetz case. A tzinfo method will always see a datetimetz arg,
or None, now. In the former case, it's still possible that it will get
a datetimetz argument belonging to a different timezone. That will get
fixed next.
Check for readline 2.2 features. This should make it possible to
compile readline.c again with GNU readline versions 2.0 or 2.1; this
ability was removed in readline.c rev. 2.49. Apparently the older
versions are still in widespread deployment on older Solaris
installations. With an older readline, completion behavior is subtly
different (a space is always added).
* channels() -> devices()
* stereochannels() -> stereodevices()
* recchannels() -> recdevices()
* getvol() -> get()
* setvol() -> set()
This is for (slightly) more consistency with the OSS ioctl names
(READ_DEVMASK, READ_RECMASK, READ_STEREODEVS).
Also make sure the C function names correspond more closely to the
Python method names for mixer methods.
Obtain cleaner coding and a system-wide
performance boost by using the fast, pre-parsed
PyArg_UnpackTuple function instead of the
PyArg_ParseTuple function, which is driven by a format string.
This can cause core dumps when Python runs. Python relies on the 754-
(and C99-) mandated default "non-stop" mode for FP exceptions. This
patch from Ben Laurie disables at least one FP exception on FreeBSD at
Python startup time.
operands have identical tzinfo members (meaning object identity -- "is").
I misunderstood the intent here, reading wrong conclusion into
conflicting clues.
be trusted with years before 1900, so now we raise ValueError if a date or
datetime or datetimetz .strftime() method is called with a year before
1900.
such that the datetime tests failed if the envar PYTHON2K was set.
This is an utter mess, and the datetime module's strftime functions
inherit it. I suspect that, regardless of the PYTHON2K setting, and
regardless of platform limitations, the datetime strftime wrappers
will end up delivering nonsense results (or bogus exceptions) for
any year before 1900. I should probably just refuse to accept years
earlier than that -- else we'll have to implement strftime() by hand.
{timetz,datetimetz}.{utcoffset,dst}() now return a timedelta (or None)
instead of an int (or None).
tzinfo.{utcoffset,dst}() can now return a timedelta (or an int, or None).
Curiously, this was much easier to do in the C implementation than in the
Python implementation (which lives in the Zope3 code tree) -- the C code
already had lots of hair to extract C ints from offset objects, and used
C ints internally.
used that.
wrap_strftime(): Removed the most irritating uses of buf.
TestDate.test_ordinal_conversions(): The C implementation is fast enough
that we can afford to check the endpoints of every year. Also added
tm_yday tests at the endpoints.
PyMapping_Check() doesn't guarantee that PyMapping_Size() won't raise
an exception, nor that keys and values are lists.
Also folded some long lines and did a little whitespace normalization.
Probably a 2.2 backport candidate.
* rename oss_t to lad_t, Ladtype to OSSType,
* rename lad_*() methods to oss_*()
* rename lad_methods list to oss_methods
Patch and impetus supplied by Nicholas FitzRoy-Dale <wzdd@lardcave.net>.