the finally clause. An exception here could happen when a daemon
thread exits after the threading module has already been trashed by
the import finalization, and there's not much point in insisting on
doing the cleanup at that stage.
This should fix SF bug #497111: active_limbo_lock error at program
exit.
2.1.2 and 2.2.1 Bugfix candidate!
deepcopy(), _reconstruct(): pass the memo to the other function, so
that recursive data structures built out of new-style objects may be
deeply copied correctly.
2.2.1 bugfix!
Instead of sending the real user and host, use "anonymous@" (i.e. no
host name at all!) as the default anonymous FTP password. This avoids
privacy violations.
doesn't have the _HEAPTYPE flag set, e.g. for time.struct_time and
posix.stat_result.
This fixes the immediate symptoms of SF bug #496873 (cPickle /
time.struct_time loop), replacing the infinite loop with an exception.
Make dumbdbm merely "dumb", rather than "terminally broken". Without this
patch, it's almost impossible to use dumbdbm _without_ causing horrible
datalossage. With this patch, dumbdbm passes my own horrible torture test,
as well as the roundup test suite.
dumbdbm really could do with a smidgin of a rewrite or two, but that's not
suitable for the release21-maint branch.
rfc822.AddressList incorrectly handles empty address.
"<>" is converted to None and should be "".
AddressList.__str__() fails on None.
I got an email with such an address and my program
failed processing it.
Example:
>>> import rfc822
>>> rfc822.AddressList("<>").addresslist
[('', None)]
>>> str(rfc822.AddressList("<>"))
Traceback (most recent call last):
File "<stdin>", line 1, in ?
File "/usr/lib/python2.1/rfc822.py", line 753, in __str__
return ", ".join(map(dump_address_pair,
self.addresslist))
TypeError: sequence item 0: expected string, None found
[His solution: in the internal routine AddrlistClass.getrouteaddr(),
initialize adlist to "".]
metaclass, reported by Dan Parisien.
Objects that are instances of custom metaclasses, i.e. whose class is
a subclass of 'type', should be pickled the same as new-style classes
(objects whose class is 'type'). This can't be done through a
dispatch table entry, and the __reduce__ trick doesn't work for these,
since it finds the unbound __reduce__ for instances of the class
(inherited from 'object'). So check explicitly using issubclass().
This way, when a socket object is deleted after the socket module has
already been zapped by module shutdown, we don't get annoying warnings
about exceptions in __del__ methods.
crashes.
If no external zip-utility is found, the archive is created by the
zipfile module, which behaves differently now than in 2.1: if the
zip-file is created in the root directory of the distribution, it will
contain an (empty) version of itself.
This triggered the above bug - so it's better to create the zip-file
far away in the TMP directory.
paren. This was there to work around a stupid XEmacs bug, but since I
can't tickle the bug in newer XEmacsen (just tried w/21.4.5) it's
possible the problem has been fixed. We shouldn't have to be working
around editor bugs anyway.
If it crops up again, I'll report it (again) to the XEmacs crowd.
Dietmar Schwertberger.
Bugfix candidate.
"""
RISCOS/Modules/getpath_riscos.c:
Include trailing '\0' when using strncpy [copy
strlen(...)+1 characters].
Lib/plat-riscos/riscospath.py:
Use riscosmodule.expand for os.path.abspath.
[fixes problems with site.py where
abspath("<Python$Dir>") returned
join(os.getcwd(), "<Python$Dir>") as e.g.
"SCSI::SCSI4.$.<Python$Dir>" because "<Python$Dir>"
wasn't recognised as an absolute path.]
"""
Rev 1.20 introduced a call to getpeername() in the dispatcher
constructor. This only works for a connected socket. Apparently
earlier versions of the code worked with un-connected sockets, e.g. a
listening socket.
It's not clear that the code is supposed to accept these sockets,
because it sets self.connected = 1 when passed a socket. But it's
also not clear that it should be a fatal error to pass a listening
socket.
The solution, for now, is to put a try/except around the getpeername()
call and continue if it fails. The self.addr attribute is used
primarily (only?) to produce a nice repr for the object, so it hardly
matters. If there is a real error on a connected socket, it's likely
that subsequent calls will fail too.
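Roughly, the workaround amounts to this (a sketch in Python, not the
actual asyncore code; the names follow the description above):

    import socket

    def set_socket(dispatcher, sock):
        dispatcher.socket = sock
        dispatcher.connected = 1
        try:
            dispatcher.addr = sock.getpeername()
        except socket.error:
            # Not connected (e.g. a listening socket); addr is only
            # used to produce a nice repr, so just leave it unset.
            pass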
Fix for SF bug #492345. (I could've sworn I checked this in, but
apparently I didn't!)
This code:
class Classic:
pass
class New(Classic):
__metaclass__ = type
attempts to create a new-style class with only classic bases -- but it
doesn't work right. Attempts to fix it so it works caused problems
elsewhere, so I'm now raising a TypeError in this case.
>
> When using 'distutils' (shipped with Python 2.1) I've found that my
> Python scripts installed with a first line of:
>
> #!/usr/bin/python2.1None
>
> This is caused by distutils trying to patch the first line of the python
> script to use the current interpreter.
- the repr of unicode. Jython only adds the u'' if the string contains char
values > 255.
- A unicode arg to unicode() is perfectly valid in jython.
- A buffer() test. There is no buffer() on Jython.
This closes patch "[ #490920 ] Jython and test_unicode".
Using grid methods on ScrolledText widgets does not
work as expected. It either fails to pack a widget, or
can even cause Tk to lock up.
The problem is that the .grid method is being called on
the text widget, not the frame widget. This can lead
to the well-known lockup in Tk when a frame's children
are managed by both the pack and grid managers. Even
if it doesn't lock up, the frame is never placed within
the intended widget.
Program fragment:
>>> import ScrolledText
>>> s = ScrolledText.ScrolledText()
>>> s.grid(row=0, column=0, rowspan=2)
The following patch uses the same hack to copy the
'grid' and 'place' geometry manager methods to the
ScrolledText instance as is already used for the 'pack'
manager.
backed out of broken minimal repeat patch from July
also fixed a couple of minor potential resource leaks in pattern_subx
(Guido had already fixed the big one)
type.__module__ behavior.
This adds the module name and a dot in front of the type name in every
type object initializer, except for built-in types (and those that
already had this). Note that it touches lots of Mac modules -- I have
no way to test these but the changes look right. Apologies if they're
not. This also touches the weakref docs, which contains a sample type
object initializer. It also touches the mmap test output, because the
mmap type's repr is included in that output. It touches object.h to
put the correct description in a comment.
open_http():
In urllib.py library module, URLopener.open_https()
returns a class instance of addinfourl() with its
self.url property missing the protocol.
Instead of "https://www.someurl.com", it becomes
"://www.someurl.com".
1. Acknowledge the well-known difference that jython
allows continue in the finally clause.
2. Avoid using _testcapi when running with jython.
This closes patch "[ #490417 ] Jython and test_exceptions"
twice! Fixed this by avoiding the import of test_email, which loads
the module a second time in that situation, and fiddled the __main__
section to resemble other test suites using unittest.
annoying that often you have to hit ^C numerous times before it
works. The solution: before the "except:" clause, insert "except
KeyboardInterrupt: raise". This propagates KeyboardInterrupt out,
stopping the test in its tracks.
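The pattern, in the abstract (a sketch; run_one_test() and
record_failure() are illustrative stand-ins for the test driver's code):

    try:
        run_one_test()
    except KeyboardInterrupt:
        raise                  # let ^C propagate and stop the whole run
    except:
        record_failure()       # anything else is just a failing test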
distutils for the library modules built as shared objects. A better solution
appears possible, but with the threat that the distutils becomes more
magical ("complex").
This closes SF bug #458343.
initialized, this will be None, but the functions will still work (there will
simply be a bogus parent on the screen). Allowing the parent to be None
is useful when testing the functions from an interactive interpreter.
Add an optional keyword parameter "show" to the _QueryString class; when given
it is used to set the -show option to the entry widget. This allows passing
show="*" or the like to askstring(), making it useful for requesting
passwords/passphrases from the user.
This closes SF bug #438517.
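For instance, a password prompt could then be spelled like this (a sketch;
askstring() lives in tkSimpleDialog):

    import tkSimpleDialog
    password = tkSimpleDialog.askstring("Login", "Password:", show="*")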
Changed a docstring to be less font-lock-hostile.
Big Hammer to implement -Qnew as PEP 238 says it should work (a global
option affecting all instances of "/").
pydebug.h, main.c, pythonrun.c: define a private _Py_QnewFlag flag, true
iff -Qnew is passed on the command line. This should go away (as the
comments say) when true division becomes The Rule. This is
deliberately not exposed to runtime inspection or modification: it's
a one-way one-shot switch to pretend you're using Python 3.
ceval.c: when _Py_QnewFlag is set, treat BINARY_DIVIDE as
BINARY_TRUE_DIVIDE.
test_{descr, generators, zipfile}.py: fiddle so these pass under
-Qnew too. This was just a matter of s!/!//! in test_generators and
test_zipfile.  test_descr was trickier, as testbinop() makes
assumptions that "/" is the same as calling a "__div__" method; put
a temporary hack there to call "__truediv__" instead when the method
name is "__div__" and 1/2 evaluates to 0.5.
Three standard tests still fail under -Qnew (on Windows; somebody
please try the Linux tests with -Qnew too! Linux runs a whole bunch
of tests Windows doesn't):
test_augassign
test_class
test_coercion
I can't stay awake longer to stare at this (be my guest). Offhand
cures weren't obvious, nor was it even obvious that cures are possible
without major hackery.
Question: when -Qnew is in effect, should calls to __div__ magically
change into calls to __truediv__? See "major hackery" at tail end of
last paragraph <wink>.
It was easier than I thought, assuming that no other things contribute
to the instance size besides slots -- a pretty good bet. With a test
suite, no less!
happy if one could delete the __dict__ attribute of an instance. I
love to make Jim happy, so here goes...
- New-style objects now support deleting their __dict__. This is for
all intents and purposes equivalent to assigning a brand new empty
dictionary, but saves space if the object is not used further.
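A quick illustration (new-style instances only):

    class C(object):
        pass
    c = C()
    c.x = 1
    del c.__dict__        # same effect as c.__dict__ = {}, but cheaper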
int_mul(): new and vastly simpler overflow checking. Whether it's
faster or slower will likely vary across platforms, favoring boxes
with fast floating point. OTOH, we no longer have to worry about
people shipping broken LONG_BIT definitions <0.9 wink>.
There's now a new structmember code, T_OBJECT_EX, which is used for
all __slot__ variables (except __weakref__, which has special behavior
anyway). This new code raises AttributeError when the variable is
NULL rather than converting NULL to None.
In goahead(), use a bound version of rawdata.startswith() since we use the
same method all the time and never change the value of rawdata. This can
save a lot of bound method creation.
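The idiom, as a tiny self-contained sketch (the strings here are just
illustrative):

    rawdata = "<!-- comment --><p>text</p>"
    startswith = rawdata.startswith      # create the bound method once
    count = 0
    for i in range(len(rawdata)):
        if startswith("<", i):           # reuse it; no method object per call
            count = count + 1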
Rather than tweaking the inheritance of type object slots (which turns
out to be too messy to try), this fix adds a __hash__ to the list and
dict types (the only mutable types I'm aware of) that explicitly
raises an error. This has the advantage that list.__hash__([]) also
raises an error (previously, this would invoke object.__hash__([]),
returning the argument's address); ditto for dict.__hash__.
The disadvantage for this fix is that 3rd party mutable types aren't
automatically fixed. This should be added to the rules for creating
subclassable extension types: if you don't want your object to be
hashable, add a tp_hash function that raises an exception.
Also, it's possible that I've forgotten about other mutable types for
which this should be done.
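The observable effect, roughly:

    try:
        hash([1, 2])          # lists are mutable: now raises TypeError
    except TypeError:
        pass
    try:
        {}.__hash__()         # ditto for dicts, even when called directly
    except TypeError:
        pass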
SF patch #480716 by Greg Chapman fixes the problem that super's
__get__ method always returns an instance of super, even when the
instance whose __get__ method is called is an instance of a subclass
of super.
Other issues fixed:
- super(C, C()).__class__ would return the __class__ attribute of C()
rather than the __class__ attribute of the super object. This is
confusing. To fix this, I decided to change the semantics of super
so that it only applies to code attributes, not to data attributes.
After all, overriding data attributes is not supported anyway.
- While super(C, x) carefully checked that x is an instance of C,
super(C).__get__(x) made no such check, allowing for a loophole.
This is now fixed.
ZipFile.__del__(): call ZipFile.close(), like its docstring says it does.
ZipFile.close(): allow calling more than once (as all file-like objects
in Python should support).
More changes to the formatdate epoch test: the Mac epoch is in
localtime, so east of GMT it falls in 1903:-( Changed the test to
obtain the epoch in both local time and GMT, and do the right
thing in the comparisons. As a sanity measure also check that
day/month is Jan 1.
_verify(): Pass in the values of globals instead of eval()ing their
names. The use of eval() was obscure and unnecessary, and the patch
claimed random.py couldn't be used in Jython applets because of it.
- Fix for SF bug #482752: __getstate__ & __setstate__ ignored (by Anon.)
In fact, only __getstate__ isn't recognized. This fixes that.
- Separately, the test for base.__flags__ & _HEAPTYPE raised an
AttributeError exception when a classic class was amongst the
bases. Fixed this with a hasattr() bandaid (classic classes never
qualify as the "hard" base class anyway, which is what the code is
trying to find).
use the correct way to test for epoch, by looking at the year
component of gmtime(0). Add clause for Unix epoch and Mac epoch (Tim,
what is Windows epoch?).
Also, get rid of the strptime() test, it was way too problematic given
that strptime() is missing on many platforms and has issues with locales.
Instead, simply test that formatdate() gets the numeric timezone
calculation correct for the altzone and timezone.
incorrect for "uneven" timezones. This algorithm should work for even
timezones (e.g. America/New_York) and uneven timezones (e.g.
Australia/Adelaide and America/St_Johns).
Closes SF bug #483231.
split parameters from the last path segment. Introduces two new functions,
urlsplit() and urlunsplit(), that do the simpler job of splitting the URL
without monkeying around with the parameters field, since that was not being
handled properly.
This closes bug #478038.
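A rough illustration of the difference (urlsplit() lives in urlparse in
the 2.x tree; both names moved to urllib.parse much later):

    from urlparse import urlparse, urlsplit
    url = "http://host/path;param?query=1#frag"
    urlparse(url)    # 6-tuple; params are split off the last path segment
    urlsplit(url)    # 5-tuple; the path is left alone, no params field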
load_inst(): Implement the security hook that cPickle already had.
When unpickling callables which are not classes, we look to see if the
object has an attribute __safe_for_unpickling__. If this exists and
has a true value, then we can call it to create the unpickled object.
Otherwise we raise an UnpicklingError.
find_class(): We no longer mask ImportError, KeyError, and
AttributeError by transforming them into SystemError. The latter is
definitely not the right thing to do, so we let the former three
exceptions simply propagate up if they occur, i.e. we remove the
try/except!
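From the caller's side the hook looks roughly like this (make_widget is an
illustrative factory that pickled objects might name via __reduce__):

    def make_widget(kind):
        return {"kind": kind}
    # Without a true __safe_for_unpickling__, load would now raise
    # UnpicklingError instead of calling the factory.
    make_widget.__safe_for_unpickling__ = 1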
of multiple inheritance from a mix of new- and classic-style classes.
This is his patch, plus a start at some test cases from me. Will check
in more, plus a NEWS blurb, later tonight.
This gives mmap() on Windows the ability to create read-only, write-
through and copy-on-write mmaps. A new keyword argument is introduced
because the mmap() signatures diverged between Windows and Unix, so
while they (now) both support this functionality, there wasn't a way to
spell it in a common way without introducing a new spelling gimmick.
The old spellings are still accepted, so there isn't a backward-
compatibility issue here.
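A sketch of the new common spelling (the keyword and the ACCESS_*
constants are assumed from the mmap docs of that era):

    import mmap
    f = open("scratch.bin", "w+b")
    f.write("x" * 4096)
    f.flush()
    m = mmap.mmap(f.fileno(), 4096, access=mmap.ACCESS_COPY)  # copy-on-write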
message for bad mode argument -- so that it doesn't fail on Windows.
It's a hack.  We know that errno is set to 0 in this case on Windows, so
check for that specifically.
rfc822.py. The old rfc822.formatdate() produced date strings using
obsolete syntax. The new version produces the preferred RFC 2822
dates.
Also, an optional argument `localtime' is added, which if true,
produces a date relative to the local timezone, with daylight savings
time properly taken into account.
Fix contributed by Jeffrey C. Ollie.
I haven't tested the fix because the situation is non-trivial to
reproduce.
The basic solution is to get rid of the __current_realm attribute of
authentication handlers. Instead, prevent infinite retries by
checking for the presence of an Authenticate: header in the request
object that exactly matches the Authenticate: header that would be
added.
The problem prevented authentication from working correctly in the
presence of retries.
Ollie mentioned that digest authentication has the same problem and I
applied the same solution there.
Fix by Neil Schemenauer. Visit the Subscript node when trying to find
the operation for a statement.
XXX Not sure if there are other nodes that should be visited.
ntpath.join('a', '') was producing 'a' instead of 'a\\' as in 2.1.
Impossible to guess what was ever *intended*, but since split('a\\')
produces ('a', ''), I think it's best if join('a', '') gives 'a\\' back.
__getaddr(): Watch out for empty addresses that can happen when
something like "MAIL FROM:<CR>" is received. This avoids the
IndexError and rightly returns an SMTP syntax error.
parseargs(): We didn't handle the 2-arg case where both the localspec
and the remotespec were provided on the command line.
and NEWS. Bugfix candidate? That's a dilemma for Anthony <wink>: /F
did fix a longstanding bug here, but the fix can cause code to raise an
exception that previously worked by accident.
test_commands does not work on IRIX
It assumes the output of "ls /bin/ls" is a line
that starts with a '-'. On IRIX that file is
a symbolic link, so the first character is an l.
This causes test_getstatus to fail.
dictionary instead of building a new one, and provide an overridable method
to allow subclasses to catch ADD_INFO records that are not part of the
initial block of ADD_INFO records created by the profiler itself.
And SF patch 473223 -- infinite getattr loop
Wrap select() and poll() calls with try/except for EINTR. If EINTR is
raised, treat as a response where no fd is ready.
In dispatcher constructor, make sure self.socket is always
initialized.
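The EINTR handling boils down to something like this (an illustrative
helper, not asyncore's actual code):

    import select, errno

    def interruptible_select(r, w, e, timeout):
        try:
            return select.select(r, w, e, timeout)
        except select.error, err:
            if err.args[0] == errno.EINTR:
                return [], [], []     # pretend no fd is ready; caller loops
            raise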
outer level, the iterator protocol is used for memory-efficiency (the
outer sequence may be very large if fully materialized); at the inner
level, PySequence_Fast() is used for time-efficiency (these should
always be sequences of length 2).
dictobject.c, new functions PyDict_{Merge,Update}FromSeq2. These are
wholly analogous to PyDict_{Merge,Update}, but process a sequence-of-2-
sequences argument instead of a mapping object. For now, I left these
functions file static, so no corresponding doc changes. It's tempting
to change dict.update() to allow a sequence-of-2-seqs argument too.
Also changed the name of dictionary's keyword argument from "mapping"
to "x". Got a better name? "mapping_or_sequence_of_pairs" isn't
attractive, although more so than "mosop" <wink>.
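Rough Python-level counterpart of what the new C functions make possible
(the constructor was spelled dictionary() at the time of this checkin, and
dict() later on):

    d = dict([("a", 1), ("b", 2)])     # a sequence of 2-sequences
    d == dict({"a": 1, "b": 2})        # same result as the mapping form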
abstract.h, abstract.tex: Added new PySequence_Fast_GET_SIZE function,
much faster than going thru the all-purpose PySequence_Size.
libfuncs.tex:
- Document dictionary().
- Fiddle tuple() and list() to admit that their argument is optional.
- The long-winded repetitions of "a sequence, a container that supports
iteration, or an iterator object" are getting to be a PITA.  Many
months ago I suggested factoring this out into "iterable object",
where the definition of that could include being explicit about
generators too (as is, I'm not sure a reader outside of PythonLabs
could guess that "an iterator object" includes a generator call).
- Please check my curly braces -- I'm going blind <0.9 wink>.
abstract.c, PySequence_Tuple(): When PyObject_GetIter() fails, leave
its error msg alone now (the msg it produces has improved since
PySequence_Tuple was generalized to accept iterable objects, and
PySequence_Tuple was also stomping on the msg in cases it shouldn't
have even before PyObject_GetIter grew a better msg).
the separating semi-colon shows up on a continuation line (legal, but
weird).
Bug reported and fixed by Matthew Cowles. Test case and sample email
included.
non-standard but common types. Including Martin's suggestion to add
rejected non-standard types from patch #438790. Specifically,
guess_type(), guess_extension(): Both the functions and the methods
grow an optional "strict" flag, defaulting to true, which determines
whether to recognize non-standard, but commonly found types or not.
Also, I sorted, reformatted, and culled duplicates from the big
types_map dictionary. Note that there are a few non-equivalent
duplicates (e.g. .cdf and .xls) for which the first will just get
thrown away. I didn't remove those though.
Finally, use of the module as a script has grown the -l and -e options
to toggle strictness and to do guess_extension(), respectively.
Doc and unittest updates too.
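For instance (a sketch; the result depends on the platform's type tables):

    import mimetypes
    mimetypes.guess_type("movie.mov")             # strict: standard types only
    mimetypes.guess_type("movie.mov", strict=0)   # also allow common
                                                  # non-standard types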
As the comments in the module implied, pyclbr was easily confused by
"strange stuff" inside single- (but not triple-) quoted strings. It
isn't anymore. Its behavior remains flaky in the presence of nested
functions and classes, though.
Bugfix candidate.
ThreadingMixIn/TCPServer forgets close (Max Neunhöffer).
This ensures that handle_error() and close_request() are called when
an error occurs in the thread.
(I am not applying the second chunk of the patch, which moved the
finish() call into the finally clause in BaseRequestHandler's __init__
method; that would be a semantic change that I cannot accept at this
point - the data would be sent even if the handler raised an
exception.)
used by the weakref code since he didn't like the word "referencable".
Is it really necessary to be more specific than to test for TypeError here,
though?
There really isn't a good reason for instance method objects to have
their own __dict__, __doc__ and __name__ properties that just delegate
the request to the function (callable); the default attribute behavior
already does this.
The test suite had to be fixed because the error changes from
TypeError to AttributeError.
strings, not C strings)
removed USE_PYTHON defines, and related sre.py helpers
skip calling the subx helper if the template is callable.
interestingly enough, this means that
def callback(m):
return literal
result = pattern.sub(callback, string)
is much faster than
result = pattern.sub(literal, string)
removed (conceptually flawed) getliteral helper; the new sub/subn code
uses a faster code path for literal replacement strings, but doesn't
(yet) look for literal patterns.
added STATE_OFFSET macro, and use it to convert state.start/ptr to
char indexes
_handle_multipart(): If there is an epilogue and the epilogue does
not itself start with a newline, add a newline before writing the
epilogue. Closes SF bug #472481.
This adds unsetenv to posix, and uses it in the __delitem__ method of
os.environ.
(XXX Should we change the preferred name for putenv to setenv, for
consistency?)
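The effect, roughly (on platforms that provide unsetenv()):

    import os
    os.environ["SPAM"] = "1"     # putenv() under the covers, as before
    del os.environ["SPAM"]       # now also calls unsetenv()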
This is a big one, touching lots of files. Some of the platforms
aren't tested yet. Briefly, this changes the return value of the
os/posix functions stat(), fstat(), statvfs(), fstatvfs(), and the
time functions localtime(), gmtime(), and strptime() from tuples into
pseudo-sequences. When accessed as a sequence, they behave exactly as
before. But they also have attributes like st_mtime or tm_year. The
stat return value, moreover, has a few platform-specific attributes
that are not available through the sequence interface (because
everybody expects the sequence to have a fixed length, these couldn't
be added there). If your platform's struct stat doesn't define
st_blksize, st_blocks or st_rdev, they won't be accessible from Python
either.
(Still missing is a documentation update.)
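A quick illustration of the pseudo-sequences:

    import os, time
    st = os.stat(".")
    st.st_mtime == st[8]         # attribute and index access agree
    t = time.localtime()
    t.tm_year == t[0]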
The GUI-mode code to display properties blew up if the property functions
(get, set, etc) weren't simply methods (or functions).
"The problem" here is really that the generic document() method dispatches
to one of .doc{routine, class, module, other}(), but all of those require
a different(!) number of arguments. Thus document isn't general-purpose
at all: you have to know exactly what kind of thing it is you're going
to document first, in order to pass the correct number of arguments to
.document for it to pass on. As an expedient hack, just tacked "*ignored"
on to the end of the formal argument lists for the .docXXX routines so
that .document's caller doesn't have to know in advance which path
.document is going to take.
:-).
Add a test that prevents the __hello__ bytecode from going stale
unnoticed again.
The test also tests the loophole noted in SF bug #404545. This test
will fail right now; I'll check in the fix in a minute.
test_no_semis_header_splitter(): This actually should still split.
test_no_split_long_header(): An example of an unsplittable line.
test_no_semis_header_splitter(): Test for SF bug # 471918, Generator
splitting long headers.
_split_header(): Split on folding whitespace if the attempt to split
on semi-colons failed.
_split_header(): Patch by Matthew Cowles for fixing SF bug # 471918,
Generator splitting long headers.
There are now no known cases where the compiler package computes a
stack depth lower than the one computed by the builtin compiler. (To
achieve this state, we had to fix bugs in both compilers :-).
The chief change is to do the depth calculations with respect to basic
blocks.  The stack effect of each block is calculated.  Then the flow graph
is traversed using breadth-first search to find the max weight path
through the graph.
Had to fix the StackDepthTracker to calculate the right info for
several opcodes: LOAD_ATTR, CALL_FUNCTION (and friends), MAKE_CLOSURE,
and DUP_TOPX.
XXX Still need to handle free variables in MAKE_CLOSURE.
XXX There are still a lot of places where the computed stack depth is
larger than for the builtin compiler. These won't cause the
interpreter to overflow the frame, but they waste space.
Mostly by Toby Dickenson and Titus Brown.
Add an optional argument to a decompression object's decompress()
method. The argument specifies the maximum length of the return
value. If the uncompressed data exceeds this length, the excess data
is stored as the unconsumed_tail attribute. (Not to be confused with
unused_data, which is a separate issue.)
Difference from SF patch: Default value for unconsumed_tail is ""
rather than None. It's simpler if the attribute is always a string.
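A sketch of the new call (max_length is the name used in the docs; the
sample data is illustrative):

    import zlib
    data = zlib.compress("spam and eggs " * 100)
    d = zlib.decompressobj()
    first = d.decompress(data, 32)    # at most 32 bytes of output
    rest = d.unconsumed_tail          # compressed input not yet consumed
    more = d.decompress(rest)         # feed it back in to continue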
object.c, PyObject_Str: Don't try to optimize anything except exact
string objects here; in particular, let str subclasses go thru tp_str,
same as non-str objects. This allows overrides of tp_str to take
effect.
stringobject.c:
+ string_print (str's tp_print): If the argument isn't an exact string
object, get one from PyObject_Str.
+ string_str (str's tp_str): Make a genuine-string copy of the object if
it's of a proper str subclass type. str() applied to a str subclass
that doesn't override __str__ ends up here.
test_descr.py: New str_of_str_subclass() test.
changing an application to collect profile data on one part of the
app while still making use of the profiled component, without relying
on side effects.
failobj, and when getting the subtype use 'plain' as the failobj.
text/plain is supposed to be the default if the message contains no
Content-Type: header.
Remove the log file after we are done with it. This should clean up after
the test even on Windows, since the file is now closed before we attempt
removal.
Simply commented it out, and then test_hotshot passes on Windows.
Leaving to Fred to fix "the right way" (it seems to be a feature of
unittest that all unittests try to unlink open files <wink>).
inherit_slots(): tp_as_buffer was getting inherited as if it were a
method pointer, rather than a pointer to a vector of method pointers. As
a result, inheriting from a type that implemented buffer methods was
ineffective, leaving all the tp_as_buffer slots NULL in the subclass.
corresponding to a dispatch slot (e.g. __getitem__ or __add__) is set,
calculate the proper dispatch slot and propagate the change to all
subclasses. Because of multiple inheritance, there's no easy way to
avoid always recursing down the tree of subclasses. Who cares?
(There's more to do, but this works. There's also a test for this now.)
Try to be systematic about dealing with socket and ssl exceptions in
FakeSocket.makefile(). The previous version of the code caught all
ssl errors and treated them as EOF, even though most of the errors
don't mean EOF.
An SSL error can mean one of three things:
1. The SSL/TLS connection was closed.
2. The operation should be retried.
3. An error occurred.
Also, if a socket error occurred and the error was EINTR, retry the
call. Otherwise, it was a legitimate error and the caller should
receive the exception.
headers. It does not parse the body of the message, instead simply
assigning it as a string to the container's payload. This can be much
faster when you're only interested in a message's header.
value, so the programmer will have to catch OverflowError. I'm not sure
what /F's perspective is on this. Perhaps it should be caught and mapped to
an xmlrpclib-specific exception. None of the other type-specific dump
methods seem to do any exception handling though.
call, or via setting an instance or class vrbl.
Rewrote the calibration docs.
Modern boxes are so friggin' fast, and a profiler event does so much work
anyway, that the cost of looking up an instance vrbl (the bias constant)
per profile event just isn't a big deal.
the problem that slots weren't inherited properly. override_slots()
no longer exists; in its place comes fixup_slot_dispatchers() which
does more and different work and is table-based. (Eventually I want
this table also to replace all the little tab_foo tables.)
Also add a wrapper for __delslice__; this required a change in
test_descrtut.py.
When checking for strings, use:
    if isinstance(uri, (types.StringType, types.UnicodeType)):
Also get rid of some dodgy code that tried to guess whether attributes
were callable or not.
without the Py_TPFLAGS_CHECKTYPES flag) in the wrappers. This
required a few changes in test_descr.py to cope with the fact that the
complex type has __int__, __long__ and __float__ methods that always
raise an exception.
actual run of the profiler, instead of timing a simplified simulation of
part of what the profiler does. It computes a constant about 60% higher
on my Win98SE box than the old method, and the new constant appears much
more realistic. Deleted the undocumented simple(), instrumented(), and
profiler_simulation() methods (which existed only to support the previous
calibration method).
this type of test fails, vereq() does a better job of reporting than
verify().
Change vereq(x, y) to use "not x == y" rather than "x != y" -- it
makes a difference in some overloading tests.
Most of this code was old enough to vote. Examples of cleanups:
+ Backslashes were used for line continuation even inside unclosed
bracket structures, from back in the days that was still needed.
+ There was no use of % formats, and e.g. the old fpformat module was
still used to format floats "by hand" in conjunction with rjust().
+ There was even use of a do-nothing .ignore() method to tack on to the
end of a chain of method calls, else way back when Python would print
the non-None result (as it does now in an interactive session -- it
*used* to do that in batch mode too).
+ Perhaps controversial (although I can't imagine why for real <wink>),
used augmented assignment where helpful. Stuff like
self.total_calls = self.total_calls + other.total_calls
is just plain harder to follow than
self.total_calls += other.total_calls
seriously wrong. This started out by just fixing the docs, but then it
occurred to me that the doc confusion propagated into misleading vrbl names
too, so I also renamed those to match reality.  As a result, IMO the time
computations are much easier to understand now (within the limitations of
vast quantities of 3-character names <wink>).
many types were subclassable but had a xxx_dealloc function that
called PyObject_DEL(self) directly instead of deferring to
self->ob_type->tp_free(self). It is permissible to set tp_free in the
type object directly to _PyObject_Del, for non-GC types, or to
_PyObject_GC_Del, for GC types. Still, PyObject_DEL was a tad faster,
so I'm fearing that our pystone rating is going down again. I'm not
sure if doing something like
void xxx_dealloc(PyObject *self)
{
if (PyXxxCheckExact(self))
PyObject_DEL(self);
else
self->ob_type->tp_free(self);
}
is any faster than always calling the else branch, so I haven't
attempted that -- however those types whose own dealloc is fancier
(int, float, unicode) do use this pattern.
This patch allows ConfigParser.getboolean() to interpret TRUE,
FALSE, YES, NO, ON and OFF instead of just '0' and '1'.
While just allowing '0' and '1' sounds more correct, users often
demand to use more descriptive directives in configuration
files. Instead of forcing every programmer to brew his own
solution, a system should include the batteries for this.
[My modification to the patch is a slight rewording of the docstring
and use of lowercase instead of uppercase templates. The code is
still case sensitive. GvR.]
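A small illustration (lowercase values, since the code is still case
sensitive):

    import ConfigParser, StringIO
    cfg = ConfigParser.ConfigParser()
    cfg.readfp(StringIO.StringIO("[srv]\nenabled = yes\ndebug = off\n"))
    cfg.getboolean("srv", "enabled")    # -> true
    cfg.getboolean("srv", "debug")      # -> false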
optional, and default to `localhost' and ports 8025 and 25
respectively.
SMTPChannel.__init__(): Calculate __fqdn using socket.getfqdn()
instead of gethostby*() and friends. This allows us to run this
script even if we don't have access to dns (assuming the localhost is
configured properly).
Also, restore my precious page breaks. Hands off, oh Whitespace
Normalizer!
For a dynamically constructed type object, fill in the tp_doc slot with
a copy of the argument dict's "__doc__" value, provided the latter exists
and is a string.
NOTE: I don't know what to do if it's a Unicode string, so in that case
tp_doc is left NULL (which shows up as Py_None if you do Class.__doc__).
Note that tp_doc holds a char*, not a general PyObject*.
it deals correctly with some anomalous cases; according to this test
suite I've fixed it right.
The anomalous cases had to do with 'exception' events: these aren't
generated when they would be most helpful, and the profiler has to
work hard to recover the right information. The problems occur when C
code (such as hasattr(), which is used as the example here) calls back
into Python code and clears an exception raised by that Python code.
Consider this example:
def foo():
hasattr(obj, "bar")
Where obj is an instance from a class like this:
class C:
def __getattr__(self, name):
raise AttributeError
The profiler sees the following sequence of events:
call (foo)
call (__getattr__)
exception (in __getattr__)
return (from foo)
Previously, the profiler would assume the return event returned from
__getattr__. An if statement checking for this condition and raising
an exception was commented out... This version does the right thing.
test for modifying __getattr__ works, now that slot_tp_getattr_hook
zaps the slot if there's no hook. Added an XXX comment with a ref
back to slot_tp_getattr_hook.
Taught doctest about static methods, class methods, and property docstrings
in new-style classes. As for inspect.py/pydoc.py before it, the new stuff
needed didn't really fit into the old architecture (but was less of a
strain to force-fit here).
New-style class docstrings still aren't found, but that's the subject
of a different bug and I want to fix that right instead of hacking around
it in doctest.
instances).
Also added GC support to various auxiliary types: super, property,
descriptors, wrappers, dictproxy. (Only type objects have a tp_clear
field; the other types don't need one.)
One change was necessary to the GC infrastructure. We have statically
allocated type objects that don't have a GC header (and can't easily
be given one) and heap-allocated type objects that do have a GC
header. Giving these different metatypes would be really ugly: I
tried, and I had to modify pickle.py, cPickle.c, copy.py, invent a new
name for the new metatype and make it a built-in, change
affected tests... In short, a mess. So instead, we add a new type
slot tp_is_gc, which is a simple Boolean function that determines
whether a particular instance has GC headers or not. This slot is
only relevant for types that have the (new) GC flag bit set. If the
tp_is_gc slot is NULL (by far the most common case), all instances of
the type are deemed to have GC headers. This slot is called by the
PyObject_IS_GC() macro (which is only used twice, both times in
gcmodule.c).
I also changed the extern declarations for a bunch of GC-related
functions (_PyObject_GC_Del etc.): these always exist but objimpl.h
only declared them when WITH_CYCLE_GC was defined, but I needed to be
able to reference them without #ifdefs. (When WITH_CYCLE_GC is not
defined, they do the same as their non-GC counterparts anyway.)
- The test for deepcopy() in pickles() was indented wrongly, so it got
run twice (once for binary pickle mode, once for text pickle mode; but
the test doesn't depend on the pickle mode).
- In verbose mode, show which subtest (pickle/cPickle/deepcopy, text/bin).
The code in run_test() referenced two non-existent variables, and in
non-verbose mode the tests didn't report the actual number when it
differed from the expected number.  Fixed this.
Also added an extra call to gc.collect() at the start of test_all().
This will be needed when I check in the changes to add GC to new-style
classes.
from Tim Hochberg. Also mucho fiddling to change the way doctest
determines whether a thing is a function, module or class. Under 2.2,
this really requires the functions in inspect.py (e.g., types.ClassType
is close to meaningless now, if not outright misleading).
I modified nntplib so the body method can accept an
optional second parameter pointing to a filehandle or
filename (string). This way, really long body
articles can be stored to disk instead of kept in
memory. The way I made the modification should make
it easy to extend this functionality to other extended
return methods.
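Usage would look roughly like this (the server and message-id are
illustrative; the second argument may be a filename or an open file
object):

    import nntplib
    s = nntplib.NNTP("news.example.com")
    s.group("comp.lang.python")
    s.body("<some-message-id@example.com>", "article.txt")   # body goes to disk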
This is probably a little bit faster, but mostly is just cleaner code.
The old-style support is still used for Python versions < 2.2 so this
source file can be shared with PyXML.
staticness when __dynamic__ = 1 becomes the default:
- Some classes which are used to test the difference between static
and dynamic.
- Subclasses of complex: complex uses old-style numbers and the slot
wrappers used by dynamic classes only support new-style numbers.
(Ideally, the complex type should be fixed, but that looks like a
labor-intensive job.)
__rop__ now takes precedence over __op__.  Those circumstances are:
- Both arguments are new-style classes
- Both arguments are new-style numbers
- Their implementation slots for tp_op differ
- Their types differ
- The right argument's type is a subtype of the left argument's type
Also did this for the ternary operator (pow) -- only the binary case
is dealt with properly though, since __rpow__ is not supported anyway.
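The subclass case, in Python terms (a sketch):

    class A(object):
        def __add__(self, other):
            return "A.__add__"
    class B(A):
        def __radd__(self, other):
            return "B.__radd__"
    A() + B()    # B is a subtype of A and overrides __radd__, so it wins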