XXX [1] These changes aren't tested very thoroughly, because regrtest
doesn't do any SSL tests. I've done some trivial tests on my own, but
don't really know how to use the key and cert files. In one case, an
SSL-level error causes Python to dump core. I'll get that fixed in the
next round of changes.
XXX [2] The checkin removes the x_attr member of the SSLObject struct.
I'm not sure if this is kosher for backwards compatibility at the
binary level. Perhaps it's safer to keep the member but leave it
set to NULL.
And the leaks?
newSSLObject() called PyDict_New(), stored the result in x_attr
without checking it, and later stored NULL in x_attr without doing
anything to the dict. So the dict always leaks. There is no further
reference to x_attr, so I just removed it completely.
The error cases in newSSLObject() passed the return value of
PyString_FromString() directly to PyErr_SetObject().
PyErr_SetObject() expects a borrowed reference, so the string leaked.
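The leak-free pattern looks roughly like this (a sketch; the exception
object name and the message text are illustrative, not the checked-in code):

    /* Hold the new string in a local, set the error, then drop our
       reference -- PyErr_SetObject() does not steal it. */
    PyObject *v = PyString_FromString("illustrative error message");
    if (v != NULL) {
        PyErr_SetObject(SSLErrorObject, v);
        Py_DECREF(v);
    }
    return NULL;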
This simplifies the rounding in _PyObject_VAR_SIZE, makes it possible to
restore the pre-rounding calling sequence, and allows some nice little
simplifications
in its callers. I'm still making it return a size_t, though.
As Guido suggested, this makes the new subclassing code substantially
simpler. But the mechanics of doing it w/ C macro semantics are a mess,
and _PyObject_VAR_SIZE has a new calling sequence now.
Question: The PyObject_NEW_VAR macro appears to be part of the public API.
Regardless of what it expands to, the notion that it has to round up the
memory it allocates is new, and extensions containing the old
_PyObject_VAR_SIZE macro expansion (which was embedded in the
PyObject_NEW_VAR expansion) won't do this rounding. But the rounding
isn't actually *needed* except for new-style instances with dict pointers
after a variable-length blob of embedded data. So my guess is that we do
not need to bump the API version for this (as the rounding isn't needed
for anything an extension can do unless it's recompiled anyway). What's
your guess?
pad memory to properly align the __dict__ pointer in all cases.
gcmodule.c/objimpl.h, _PyObject_GC_Malloc:
+ Added a "padding" argument so that this flavor of malloc can allocate
enough bytes for alignment padding (it can't know this is needed, but
its callers do).
typeobject.c, PyType_GenericAlloc:
+ Allocated enough bytes to align the __dict__ pointer.
+ Sped and simplified the round-up-to-PTRSIZE logic.
+ Added blank lines so I could parse the if/else blocks <0.7 wink>.
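The round-up itself boils down to something like this (a sketch, not the
literal macro; the macro name here is made up):

    /* Sketch only: round a byte count up to a multiple of the pointer
       size, so a __dict__ pointer placed after variable-length embedded
       data is properly aligned.  SIZEOF_VOID_P comes from the config
       headers. */
    #define _ROUND_UP_TO_PTRSIZE(n) \
        (((n) + SIZEOF_VOID_P - 1) & ~(size_t)(SIZEOF_VOID_P - 1))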
Generalize PyLong_AsLongLong to accept int arguments too. The real point
is so that PyArg_ParseTuple's 'L' code does too. That code was
undocumented (AFAICT), so I documented it.
#424002.
Refactor init_path_from_argv0() and rename to copy_absolute(); add
absolutize() which does the same in-place.
Clean up whitespace (leading tabs -> spaces, delete trailing
spaces/tabs).
Add raise_exception() to the _testcapi module. It isn't a test, but
the C API exists only to support test_exceptions. raise_exception()
takes two arguments -- an exception class and an integer specifying
how many arguments it should be called with.
test_exceptions uses BadException() to test the interpreter's behavior
when there is a problem instantiating the exception. test_capi1()
calls it with too many arguments. test_capi2() causes an exception to
be raised in the Python code of the constructor.
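For reference, the helper has roughly this shape (a sketch; the checked-in
_testcapimodule.c code may differ in details):

    /* Sketch: build a tuple of num_args dummy ints and use it to raise
       the given exception class. */
    static PyObject *
    raise_exception(PyObject *self, PyObject *args)
    {
        PyObject *exc, *exc_args, *v;
        int num_args, i;

        if (!PyArg_ParseTuple(args, "Oi", &exc, &num_args))
            return NULL;
        exc_args = PyTuple_New(num_args);
        if (exc_args == NULL)
            return NULL;
        for (i = 0; i < num_args; i++) {
            v = PyInt_FromLong(i);
            if (v == NULL) {
                Py_DECREF(exc_args);
                return NULL;
            }
            PyTuple_SET_ITEM(exc_args, i, v);
        }
        PyErr_SetObject(exc, exc_args);
        Py_DECREF(exc_args);
        return NULL;
    }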
no backwards compatibility to worry about, so I just pushed the
'closure' struct member to the back -- it's never used in the current
code base (I may eliminate it, but that's more work because the getter
and setter signatures would have to change.)
As examples, I added actual docstrings to the getset attributes of a
few types: file.closed, xxsubtype.spamdict.state.
compatibility, this required changing every declaration of a "struct
memberlist" array that is referenced from a type's tp_members slot to
use PyMemberDef instead;
"struct memberlist" is now only used by old code that still calls
PyMember_Get/Set. The code in PyObject_GenericGetAttr/SetAttr now
calls the new APIs PyMember_GetOne/SetOne, which take a PyMemberDef
argument.
As examples, I added actual docstrings to the attributes of a few
types: file, complex, instance method, super, and xxsubtype.spamlist.
Also converted the symtable to new style getattr.
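For illustration, a member table in the new style looks roughly like this
(struct name, offset, and docstring are illustrative, not the actual
xxsubtype code):

    /* Illustrative only: a tp_members table using PyMemberDef, with a
       docstring in the final slot.  T_INT/READONLY come from
       structmember.h. */
    static PyMemberDef spamlist_members[] = {
        {"state", T_INT, offsetof(spamlistobject, state), READONLY,
         "an int demonstrating per-member docstrings"},
        {NULL}   /* sentinel */
    };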
#462270: sub-tle difference between pre.sub and sre.sub. PRE ignored
an empty match at the previous location, SRE didn't.
also synced with Secret Labs "sreopen" codebase.
Curious: the MS docs say stati64 etc are supported even on Win95, but
Win95 doesn't support a filesystem that allows partitions > 2 Gb.
test_largefile: This was opening its test file in text mode. I have no
idea how that worked under Win64, but it sure needs binary mode on Win98.
BTW, on Win98 test_largefile runs quickly (under a second).
requires that errno ever get set, and it looks like glibc is already
playing that game. New rules:
+ Never use HUGE_VAL. Use the new Py_HUGE_VAL instead.
+ Never believe errno. If overflow is the only thing you're interested in,
use the new Py_OVERFLOWED(x) macro. If you're interested in any libm
errors, use the new Py_SET_ERANGE_IF_OVERFLOW(x) macro, which attempts
to set errno the way C89 said it worked.
Unfortunately, none of these are reliable, but they work on Windows and I
*expect* under glibc too.
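In practice a wrapped libm call now looks roughly like this (a sketch of the
intended pattern; x is illustrative):

    /* Sketch: force errno to ERANGE on overflow the way C89 intended,
       then turn it into a Python-level OverflowError. */
    double r;
    errno = 0;
    r = exp(x);                      /* any libm call */
    Py_SET_ERANGE_IF_OVERFLOW(r);
    if (errno == ERANGE) {
        PyErr_SetString(PyExc_OverflowError, "math range error");
        return NULL;
    }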
getting Infs, NaNs, or nonsense in 2.1 and before; in yesterday's CVS we
were getting OverflowError; but these functions always make good sense
for positive arguments, no matter how large).
PEP 238. Changes:
- add a new flag variable Py_DivisionWarningFlag, declared in
pydebug.h, defined in object.c, set in main.c, and used in
{int,long,float,complex}object.c. When this flag is set, the
classic division operator issues a DeprecationWarning message.
- add a new API PyRun_SimpleStringFlags() to match
PyRun_SimpleString(). The main() function calls this so that
commands run with -c can also benefit from -Dnew.
- While I was at it, I changed the usage message in main() somewhat:
alphabetized the options, split it in *four* parts to fit in under
512 bytes (not that I still believe this is necessary -- doc strings
elsewhere are much longer), and perhaps most visibly, don't display
the full list of options on each command line error. Instead, the
full list is only displayed when -h is used, and otherwise a brief
reminder of -h is displayed. When -h is used, write to stdout so
that you can do `python -h | more'.
Notes:
- I don't want to use the -W option to control whether the classic
division warning is issued or not, because the machinery to decide
whether to display the warning or not is very expensive (it involves
calling into the warnings.py module). You can use -Werror to turn
the warnings into exceptions though.
- The -Dnew option doesn't select future division for all of the
program -- only for the __main__ module. I don't know if I'll ever
change this -- it would require changes to the .pyc file magic
number to do it right, and a more global notion of compiler flags.
- You can usefully combine -Dwarn and -Dnew: this gives the __main__
module new division, and warns about classic division everywhere
else.
visit_finalizer_reachable since it's the same as visit_reachable.
Rename visit_reachable to visit_move. Objects can now have the GC type
flag set, reachable by tp_traverse and not be in a GC linked list. This
should make the collector more robust and easier to use by extension
module writers. Add memory management functions for container objects
(new, del, resize).
pyport.h: typedef a new Py_intptr_t type.
DELICATE ASSUMPTION: That HAVE_UINTPTR_T implies intptr_t is
available as well as uintptr_t. If that turns out not to be
true, things must get uglier (C99 wants both, so I think it's
an assumption we're *likely* to get away with).
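The pyport.h logic is roughly this (a sketch; the real code has more
fallback cases):

    /* Sketch: prefer the C99 types when configure found them, otherwise
       fall back to an integer type wide enough to hold a pointer. */
    #ifdef HAVE_UINTPTR_T
    typedef uintptr_t       Py_uintptr_t;
    typedef intptr_t        Py_intptr_t;   /* the delicate assumption */
    #elif SIZEOF_VOID_P <= SIZEOF_INT
    typedef unsigned int    Py_uintptr_t;
    typedef int             Py_intptr_t;
    #else
    typedef unsigned long   Py_uintptr_t;
    typedef long            Py_intptr_t;
    #endif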
thread_nt.h, PyThread_start_new_thread: MS _beginthread is documented
as returning unsigned long; no idea why uintptr_t was being used.
Others: Always use Py_[u]intptr_t, never [u]intptr_t directly.
This patch attempts to do to cPickle what Guido did
for pickle.py v 1.50. That is: save_global tries
importing the module, and fetching the name from the
module. If that fails, or the returned object is not
the same one we started with, it raises a
PicklingError. (All this so pickling a lambda will
fail at save time, rather than load time).
right way"). Fiddle __future__.py to use them.
Jeremy's pyassem.py may also want to use them (by-hand duplication of
magic numbers is brittle), but leaving that to his judgment.
Beef up __future__'s test to verify the exported feature names appear
correct.
- Do not compile unicodeobject, unicodectype, and unicodedata if Unicode is disabled
- check for Py_USING_UNICODE in all places that use Unicode functions
- disable unicode literals and the related builtin functions
- add the types.StringTypes list
- remove Unicode literals from most tests.
the "#ifdef MS_WINDOWS" to "#ifdef SELECT_USES_HEAP" and by
setting SELECT_USES_HEAP when FD_SETSIZE > 1024.
The indirection seems useful since this subtly changes the path
that "normal" Windows programs take (where Timmie sez FD_SETSIZE =
512). If that's a problem for Windows, he has only one place to
change.
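The indirection looks roughly like this (a sketch, not the literal
selectmodule.c preprocessor block):

    /* Sketch: Windows keeps its current behavior; elsewhere the heap is
       used only when fd_set would be uncomfortably large on the stack. */
    #ifdef MS_WINDOWS
    #define SELECT_USES_HEAP
    #else
    #if FD_SETSIZE > 1024
    #define SELECT_USES_HEAP
    #endif
    #endif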
Peter Schneider-Kamp.
Clarified some docstrings in the spirit of the patch; left out the
degrees() and radians() functions (see the patch comments on SF).
- Add an explicit call to PyType_Ready(&PyList_Type) to pythonrun.c
(just for the heck of it, really -- we should either explicitly
ready all types, or none).
New functions getnameinfo, getaddrinfo. New exceptions socket.gaierror,
socket.herror. Various new constants, in particular AF_INET6 and error
codes and parameters for getaddrinfo.
AF_INET6 support in setipaddr, makesockaddr, getsockaddr, getsockaddrlen,
gethost_common, PySocket_gethostbyaddr.
The ob_sval member of a string object isn't necessarily aligned to better
than a native long, so the new "q" and "Q" struct codes can't get away w/
casting tricks on platforms where LONG_LONG requires stricter-than-long
alignment. After I thought of a few elaborate workarounds, Guido bashed
me over the head with the obvious memcpy approach, herewith implemented.
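The idea, in rough form (p is an illustrative pointer into the string data):

    /* Sketch: ob_sval may not be aligned well enough for LONG_LONG, so
       copy the bytes into an aligned local instead of casting the
       pointer. */
    LONG_LONG value;
    memcpy((char *)&value, p, sizeof(value));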
Check for slice/item deletion, which calls slice/item assignment with a NULL
value, and raise a TypeError instead of coredumping. Bug report and suggested
fix by Alex Martelli.
that info to code dynamically compiled *by* code compiled with generators
enabled. Doesn't yet work because there's still no way to tell the parser
that "yield" is OK (unlike nested_scopes, the parser has its fingers in
this too).
Replaced PyEval_GetNestedScopes by a more-general
PyEval_MergeCompilerFlags. Perhaps I should not have? I doubted it was
*intended* to be part of the public API, so just did.
Include sys/poll.h if it was found by the configure script. The OpenGroup
spec says poll.h is the correct header file to use, so that file is
preferred.
Also note that it isn't just Linux nice() that is broken: at least FreeBSD
and BSDI also have this problem. os.nice() should probably just be emulated
using getpriority()/setpriority(), if they are available, but I'll get to
that later.
This patch allows the readline module to build cleanly with GNU
readline 4.2 without breaking the build for earlier GNU readline
versions. The configure script checks for the presence of
rl_completion_matches in libreadline.
unicodeobject.h, which forces sizeof(Py_UNICODE) == sizeof(Py_UCS4).
(this may be good enough for platforms that don't have a 16-bit
type; the UTF-16 codecs don't work, though)
Protect several more uses of constants with #ifdefs; these are necessary on
(at least) SCO OpenServer 5. Fixes a non-SF-submitted bug report by Michael
Kent.
the new PyLong_{As,From}{Unsigned,}LongLong tests, so the bulk of the
code is in the new #include file testcapi_long.h, which generates
different code depending on how macros are set. This sucks, but I couldn't
think of anything that sucked less.
UNIX headache? If we still maintain dependencies by hand, someone who
knows what they're doing should teach whatever needs it that
_testcapimodule.c includes testcapi_long.h.
Replaced PyLong_{As,From}{Unsigned,}LongLong guts with calls
to _PyLong_{As,From}ByteArray.
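For instance, the signed 64-bit case reduces to roughly this (a sketch; v is
an illustrative Python long, and longobject.h has the exact signature):

    /* Sketch: convert Python long v into an 8-byte little-endian signed
       buffer; a negative return means OverflowError is already set. */
    unsigned char bytes[8];
    if (_PyLong_AsByteArray((PyLongObject *)v, bytes, sizeof(bytes),
                            1 /* little_endian */, 1 /* is_signed */) < 0)
        return NULL;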
_testcapimodule.c:
Added strong tests of PyLong_{As,From}{Unsigned,}LongLong.
Fixes SF bug #432552 PyLong_AsLongLong() problems.
Possible bugfix candidate, but the fix relies on code added to longobject
to support the new q/Q structmodule format codes.
functions. I intend to replace their guts with calls to the new
_PyLong_{As,From}ByteArray() functions, but AFAICT there's no tests for
them at all now; I also suspect PyLong_AsLongLong() isn't catching all
overflow cases, but without a std test to demonstrate that why should you
believe me <wink>.
Also added a raiseTestError() utility function.
This completes the q/Q project.
longobject.c _PyLong_AsByteArray: The original code had a gross bug:
the most-significant Python digit doesn't necessarily have SHIFT
significant bits, and you really need to count how many copies of the sign
bit it has else spurious overflow errors result.
test_struct.py: This now does exhaustive std q/Q testing at, and on both
sides of, all relevant power-of-2 boundaries, both positive and negative.
NEWS: Added brief dict news while I was at it.
native mode, and only when config #defines HAVE_LONG_LONG. Standard mode
will eventually treat them as 8-byte ints across all platforms, but that
likely requires a new set of routines in longobject.c first (while
sizeof(long) >= 4 is guaranteed by C, there's nothing in C we can rely
on x-platform to hold 8 bytes of int, so we'll have to roll our own;
I'm thinking of a simple pair of conversion functions, Python long
to/from sized vector of unsigned bytes; that may be useful for GMP
conversions too; std q/Q would call them with size fixed at 8).
test_struct.py: In addition to adding some native-mode 'q' and 'Q' tests,
got rid of unused code, and repaired a non-portable assumption about
native sizeof(short) (it isn't 2 on some Cray boxes).
libstruct.tex: In addition to adding a bit of 'q'/'Q' docs (more needed
later), removed an erroneous footnote about 'I' behavior.
*are* obsolete; three variables and the maketrans() function are not
(yet) obsolete.
Add a compensating warnings.filterwarnings() call to test_strop.py.
Add this to the NEWS.
return a (C) long: PyArg_ParseTuple and Py_BuildValue may not let us get
at the size_t we really want, but C int is clearly too small for a 64-bit
box, and both the start parameter and the return value should work for
large mapped files even on 32-bit boxes. The code really needs to be
rethought from scratch (not by me, though ...).
1) it didn't obey the "start" parameter (and when it does, we must validate
the value)
2) the return value needs to be an absolute index, rather than relative to
some arbitrary point in the file
(checking CVS, it appears this method never worked; these changes bring it
into line with typical .find() behavior)
When getting a string buffer for a string we just created, use
PyString_AS_STRING() instead of PyString_AsString() to avoid the
call overhead and extra type check.
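Roughly (a sketch; len and data are illustrative):

    /* Sketch: we just created the string ourselves, so the unchecked
       macro form is safe and skips a call plus a type check. */
    PyObject *buf = PyString_FromStringAndSize(NULL, len);
    if (buf == NULL)
        return NULL;
    memcpy(PyString_AS_STRING(buf), data, len);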
meaning infinity -- but at least warn about it in the code! I pissed
away a couple hours on this today, and don't wish the same on the next
in line.
Bugfix candidate.
means "replace everything". But the string module, string.replace()
and test_string.py believe a 0 count means "replace nothing".
"Nothing" wins, strop loses.
Bugfix candidate.
Platform blew up on "123".replace("123", ""). Michael Hudson pinned the
blame on platform malloc(0) returning NULL.
This is a candidate for all bugfix releases.
This closes SF bug #231328.
Added all constants needed to use the functions defined in this module
that are not defined elsewhere (the O_* symbols are available in the
os module). No additional modules are needed to use this now.
PyObject_AsFileDescriptor() -- it does the same thing everywhere, so
use it the same way everyone else does so that exceptions are
consistent. This means we have less code here, and we do not need to
resort to hackish ways of getting the Python-visible function name to
fdconv().
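The usual pattern (a sketch; obj is illustrative):

    /* Sketch: accepts an int fd or anything with a fileno() method, and
       leaves a consistent exception behind on failure. */
    int fd = PyObject_AsFileDescriptor(obj);
    if (fd < 0)
        return NULL;    /* exception already set */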
this could cause invalid paths to be returned for AF_UNIX sockets on some
platforms (including FreeBSD 4.2-RELEASE), apparently because there is
no assurance that the address will be nul-terminated when filled in by the
kernel.
PySocketSock_recvfrom(): Use PyString_AS_STRING() to get the data pointer
of a string we create ourselves; there is no need for the extra type
check from PyString_AsString().
This closes SF bug #416573.
This patch does several things to termios:
(1) changes all functions to be METH_VARARGS
(2) changes all functions to be able to take a file object as the
first parameter, as per
http://mail.python.org/pipermail/python-dev/2001-February/012701.html
(3) gives better error messages
(4) removes a bunch of comments that just repeat the docstrings
(5) #includes <termio.h> before #including <sys/ioctl.h> so more
constants are actually #defined.
(6) a couple of docstring tweaks
I have tested this minimally (i.e. it builds, and
doesn't blow up too embarrassingly) on OSF1/alpha and
on one of the sf compile farm's solaris boxes, and
rather more comprehensively on my linux/x86 box.
It still needs to be tested on all the other platforms
we build termios on.
This closes the code portion of SF patch #417081.
while not generally a good idea, this is used by RDF users, and works
to implement RDF-style namespace+localname concatenation as defined
in the RDF specifications. (This also corrects a backwards-compatibility
bug.)
Be more conservative while clearing out handlers; set the slot in the
self->handlers array to NULL before DECREFing the callback.
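That is, roughly (a sketch; i is an illustrative index):

    /* Sketch of the defensive ordering: clear the slot first so anything
       triggered during deallocation sees NULL, then drop the reference. */
    PyObject *tmp = self->handlers[i];
    self->handlers[i] = NULL;
    Py_XDECREF(tmp);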
Still more adjustments to make the code style internally consistent.
OpenSSL versions before 0.9.5. This is just too experimental to be
worth it, especially since the user would have to do some severe
hacking of the Modules/Setup file to even enable the EGD code, and
without the EGD code it would always spit out a warning on some
systems -- even when socket.ssl() is not used. Fixing that properly
is not my job; the EGD patch is clearly not so important that it
should hold up the 2.1 release.
always:
- #undef HAVE_CONFIG_H (because otherwise chardefs.h tries to include
strings.h)
- #include readline.h and history.h
and we never declare any readline function prototypes ourselves.
This makes it compile with readline 4.2, albeit with a few warnings.
Some of the remaining warnings are about completion_matches(), which
is renamed to rl_completion_matches().
I've tested it with various other versions, from 2.0 up, and they all
seem to work (some with warnings) -- but only on Red Hat Linux 6.2.
Fixing the warnings for readline 4.2 would break compatibility with
3.0 (and maybe even earlier versions), and readline doesn't seem to
have a way to test for its version at compile time, so I'd rather
leave the warnings in than break compilation with older versions.
problem reported by Neil Schemenauer on python-dev on 4/12/01, with
subject "Problem with SSL and socketmodule on Debian Potato?".
It's tentative because Moshe objected, but Martin rebutted, and Moshe
seems unavailable for comments.
(Note that with OpenSSL 0.9.6a, I get a lot of compilation warnings
for socketmodule.c -- I'm assuming I can safely ignore these until 2.1
is released.)
before calling any callbacks. This is important
since the callback objects only look at themselves
to determine that they are invalid. This change
avoids a segfault when callbacks use a different
reference to an object in the process of being
deallocated.
This fixes SF bug #415660.
(with modification of existing dict elements!).
This is part of SF patch #409864: lazy fix for Ping's bizarre scoping
crash.
The adaptation I made to Michael's patch was to change the error
handling to avoid masking other errors (moving the specific error
message to inside test_dict_inner()), and to insert a test for
dict==NULL at the start.
Michael Hudson suggested this fix for the Tru64 problem (SF bug
232597). It looks reasonable, it works on Tru64, and it doesn't break
anything on Linux, so I say go for it.
of 2-space and 4-space indents. Whatever, when I saw the checkin diff it
was clear that what my editor thinks a tab means didn't match this module's
belief. Removed all the tabs from the lines I added and changed, left
everything else alone.
pickled into the signed(!) 4-byte BININT format, so were getting unpickled
again as negative ints. Repaired that.
Added some minimal docs at the top about what I've learned about the pickle
format codes (little of which was obvious from staring at the code,
although that's partly because all the size-related bugs greatly obscured
the true intent of the code).
Happy side effect: because save_int() needed to grow a *proper* range
check in order to fix this bug, it can now use the more-efficient BININT1,
BININT2 and BININT formats when the long's value is small enough to fit
in a signed 4-byte int (before this, on a sizeof(long)==8 box it always
used the general INT format for negative ints).
test_cpickle works again on sizeof(long)==8 machines. test_pickle is
still busted big-time.
binary pickle, and the latter contains a pickle of a negative Python
int i written on a sizeof(long)==4 box (and whether by cPickle or
pickle.py), it's read incorrectly as i + 2**32. The patch repairs that,
and allows test_cpickle.py (to which I added a relevant test case earlier
today) to work again on sizeof(long)==8 boxes.
There's another (at least one) sizeof(long)==8 binary pickle bug, but in
pickle.py instead. That bug is still there, and test_pickle.py doesn't
catch it yet (try pickling and unpickling, e.g., 1 << 46).
where sizeof(long)==8. This *was* broken on boxes where signed right
shifts didn't sign-extend, but not elsewhere. Unfortunately, apart
from the Cray T3E I don't know of such a box, and Guido has so far
refused to buy me any Cray machines for home Python testing <wink>.
More immediately interesting would be if someone could please test
this on *any* sizeof(long)==8 box, to make sure I didn't break it.
handling of EAGAIN.
This may or may not fix the problem for me (Mandrake 7.2 on a Dell
Optiplex GX110 desktop): I can't hear the output, but it does pass the
test now. It doesn't fix the problem for Fred (Mandrake 7.2 on a Dell
Inspiron 7500 which has the Maestro sound drivers). Fred suspects
that it's the kernel version in combination with the driver.
gives the CVS revision of this file even if it does not include the
extra RCS "$Revision: " cruft.
initpyexpat(): Use get_version_string() instead of hard-coding magic
indexes into the RCS string (which may be affected by export options).
tracked as soon as it is clear; this can decrease the number of roots for
the cycle detector sooner rather than later in applications which hold on
to weak references beyond the time of the invalidation.
If a module has a future statement enabling nested scopes, they are
also enabled for the exec statement and the functions compile() and
execfile() if they occur in the module.
If Python is run with the -i option, which enters interactive mode
after executing a script, and the script it runs enables nested
scopes, they are also enabled in interactive mode.
XXX The use of -i with -c "from __future__ import nested_scopes" is
not supported. What's the point?
To support these changes, many function variants have been added to
pythonrun.c. All the variant names end with Flags and they take an
extra PyCompilerFlags * argument. It is possible that this complexity
will be eliminated in a future version of the interpreter in which
nested scopes are not optional.
with free variables. Thanks to Martin v. Loewis for finding two of
the problems. This fixes SF bug 405583.
There is also a C API change: PyFrame_New() is reverting to its
pre-2.1 signature. The change introduced by nested scopes was a
mistake. XXX Is this okay between beta releases?
cell_clear(), the GC helper, must decref its reference to break
cycles.
frame_dealloc() must dealloc all cell vars and free vars in addition
to locals.
eval_code2() setup code must INCREF cells it copies out of the
closure.
The STORE_DEREF opcode implementation must DECREF the object it passes
to PyCell_Set().
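Roughly, the opcode now does this (a sketch of the pattern, not the literal
ceval.c code):

    /* Sketch: PyCell_Set() takes its own reference to the value, so the
       reference popped off the stack must be released here. */
    w = POP();
    x = freevars[oparg];
    PyCell_Set(x, w);
    Py_DECREF(w);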
these can be missing on some (all?) Irix and Tru64 versions.
Protect the CRTSCTS value with a cast; this can be a larger value on
Solaris/SPARC.
This should fix SF tracker items #405092, #405350, and #405355.
defined and export both names.
Solaris also does not define CBAUDEX; it is not clear that CBAUDEXT
(which is defined there) is the same thing, so we only protect against
the lack of CBAUDEX.
Reported by Greg V. Wilson.
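For illustration, the guarded entries look roughly like this (the table
shape and names are simplified, not the actual termios.c layout):

    /* Illustrative sketch: only expose constants the platform defines,
       and widen CRTSCTS, which can exceed int range on Solaris/SPARC. */
    static struct constant {
        char *name;
        long value;
    } termios_constants[] = {
    #ifdef CBAUDEX
        {"CBAUDEX", CBAUDEX},
    #endif
    #ifdef CRTSCTS
        {"CRTSCTS", (long) CRTSCTS},
    #endif
        {NULL, 0}
    };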