Follow a suggestion in an /*XXX*/ comment [in com_add()] to speed up
compilation by using supplemental dictionaries to keep track of names
and constants, eliminating quadratic behavior. With this patch in
place, the time to import a 5000-line file with lots of constants [at
the global level] is reduced from 20 seconds to under 3 on my system.
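A minimal sketch of the technique, assuming a com_add()-style helper that
keeps the list of names/constants plus a parallel lookup dictionary (the
real compile.c change has more detail, e.g. distinguishing values that
compare equal but differ in type):

    #include <Python.h>

    static int
    com_add(PyObject *list, PyObject *dict, PyObject *v)
    {
        /* O(1) lookup in the supplemental dict instead of rescanning the
           whole list, which made N additions O(N**2). */
        PyObject *index = PyDict_GetItem(dict, v);   /* borrowed reference */
        int n;

        if (index != NULL)
            return (int)PyInt_AsLong(index);
        /* Not seen before: append it and remember its position. */
        if (PyList_Append(list, v) != 0)
            return -1;
        n = PyList_GET_SIZE(list) - 1;
        index = PyInt_FromLong(n);
        if (index == NULL || PyDict_SetItem(dict, v, index) != 0) {
            Py_XDECREF(index);
            return -1;
        }
        Py_DECREF(index);
        return n;
    }
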
Here's a patch which changes modsupport to add 'u' and 'u#',
to support building Unicode objects from a null-terminated
Py_UNICODE *, and a Py_UNICODE * with length, respectively.
[Conversion from 'U' to 'u' by Fred, based on python-dev comments.]
Note that the use of None for NULL Py_UNICODE* values is still in;
I'm not sure of the conclusion on that issue.
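For illustration, a small sketch of how the new format codes might be used
from C (assumes an initialized interpreter; error checking omitted):

    #include <Python.h>

    static PyObject *
    build_unicode_examples(void)
    {
        Py_UNICODE buf[] = {'s', 'p', 'a', 'm', 0};

        PyObject *whole  = Py_BuildValue("u", buf);       /* NUL-terminated */
        PyObject *prefix = Py_BuildValue("u#", buf, 2);   /* explicit length */
        PyObject *none   = Py_BuildValue("u", (Py_UNICODE *)NULL);  /* -> None */

        /* 'N' steals the references, so no extra DECREFs are needed here. */
        return Py_BuildValue("(NNN)", whole, prefix, none);
    }
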
Only dump the remaining object references if the environment variable
PYTHONDUMPREFS exists. The default behaviour caused problems for
background or otherwise invisible processes that use the debug build of
Python.
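Roughly, the gating looks like this (a sketch, not the literal Py_Finalize()
code; _Py_PrintReferences() is only available in Py_TRACE_REFS builds):

    #include <Python.h>
    #include <stdlib.h>

    static void
    maybe_dump_refs(void)
    {
    #ifdef Py_TRACE_REFS
        /* Dump the surviving references only when explicitly requested. */
        if (getenv("PYTHONDUMPREFS") != NULL)
            _Py_PrintReferences(stderr);
    #endif
    }
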
Fixed a reference leak in the allocator.
Renamed utf8_string to _PyUnicode_AsUTF8String() and made
it external for use by other parts of the interpreter.
Fixed a memory leak found by Fredrik Lundh. Instead of
PyUnicode_AsUTF8String() we now use _PyUnicode_AsUTF8String(), which
returns the string object without an incremented refcount (and ensures
that the object so obtained remains alive until the Unicode object is
garbage collected).
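The idea, sketched with made-up names (the struct, field and functions below
are illustrative only, not the real unicodeobject.c code): encode once, keep
the result alive on the owner, and hand out a borrowed reference that dies
with the owner.

    #include <Python.h>

    typedef struct {
        PyObject_HEAD
        Py_UNICODE *str;
        int length;
        PyObject *utf8_cache;          /* NULL until first requested */
    } ExampleUnicodeObject;

    static PyObject *
    example_as_utf8(ExampleUnicodeObject *u)
    {
        if (u->utf8_cache == NULL)
            u->utf8_cache = PyUnicode_EncodeUTF8(u->str, u->length, NULL);
        return u->utf8_cache;          /* borrowed: callers must not DECREF */
    }

    static void
    example_dealloc(ExampleUnicodeObject *u)
    {
        Py_XDECREF(u->utf8_cache);     /* cached string dies with its owner */
        PyMem_Free(u->str);
        PyObject_Del((PyObject *)u);
    }
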
even if it's already absolute. Currently only implemented for Unix; I'm
not entirely sure of the right thing to do for DOS/Windows, and have no
clue what to do for Mac OS.
This patch is a workaround for the Macintosh, where the GUSI I/O library
(time, stat, etc.) uses the MacOS epoch of 1-Jan-1904 and the MSL C
library (ctime, localtime, etc.) uses the (apparently ANSI standard)
epoch of 1-Jan-1900. Python programs see the MacOS epoch and we
convert values when needed.
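For concreteness, a sketch of the kind of conversion involved (constant and
function names are illustrative, not the actual Mac time module patch); from
1-Jan-1900 to 1-Jan-1904 is 1460 days (1900 is not a leap year), i.e.
126144000 seconds:

    #include <time.h>

    #define EPOCH_1900_TO_1904 126144000L   /* 1460 days * 86400 seconds */

    /* MacOS/GUSI (1904-based) value -> MSL C library (1900-based) value */
    static time_t
    gusi_to_msl(time_t t)
    {
        return t + EPOCH_1900_TO_1904;
    }

    /* MSL C library (1900-based) value -> MacOS/GUSI (1904-based) value */
    static time_t
    msl_to_gusi(time_t t)
    {
        return t - EPOCH_1900_TO_1904;
    }
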
The previous checkin (2.84) added a PyErr_Format call that made
raising an AttributeError much more expensive. In general
this doesn't matter, except that the checks for __init__ and
__del__ methods, where the exception is caught and cleared in C, also
got much more expensive.
The fix is to split instance_getattr1 into two calls:
instance_getattr2 checks the instance and the class for the attribute
and returns it, or returns NULL on failure. It does not raise an
exception.
instance_getattr1 does rexec checks, then calls instance_getattr2. It
raises an exception if instance_getattr2 returns NULL.
PyInstance_New and instance_dealloc now call instance_getattr2
directly.
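A heavily simplified sketch of the split (only the instance __dict__ lookup
is shown; the real classobject.c code also searches the class, binds methods,
and does the rexec checks):

    #include <Python.h>

    /* Lookup only: return a new reference, or NULL without setting
       AttributeError, so probing for optional hooks stays cheap. */
    static PyObject *
    instance_getattr2(PyInstanceObject *inst, PyObject *name)
    {
        PyObject *v = PyDict_GetItem(inst->in_dict, name);   /* borrowed */
        Py_XINCREF(v);
        return v;
    }

    /* Full protocol: the costly PyErr_Format call happens only here, for
       callers that actually want the exception. */
    static PyObject *
    instance_getattr1(PyInstanceObject *inst, PyObject *name)
    {
        PyObject *v = instance_getattr2(inst, name);
        if (v == NULL && !PyErr_Occurred())
            PyErr_Format(PyExc_AttributeError,
                         "instance has no attribute '%.400s'",
                         PyString_AS_STRING(name));
        return v;
    }

Probing for __init__ or __del__ through instance_getattr2 thus never builds
an exception object just to have it cleared again.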
This patch changes posixmodule.c:execv to
a) check for zero-length args (this applies to execve, too), raising
ValueError;
b) raise more rational exceptions for various flavours of duff arguments.
I *hate*
TypeError: "illegal argument type for built-in operation"
It has to be one of the most frustrating error messages ever.
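An illustrative sketch of the sort of checks meant (argument handling and
messages differ from the real posixmodule.c):

    #include <Python.h>

    static PyObject *
    posix_execv_sketch(PyObject *self, PyObject *args)
    {
        char *path;
        PyObject *argv;

        if (!PyArg_ParseTuple(args, "sO:execv", &path, &argv))
            return NULL;
        if (!PyList_Check(argv) && !PyTuple_Check(argv)) {
            PyErr_SetString(PyExc_TypeError,
                            "execv() arg 2 must be a tuple or list");
            return NULL;               /* a specific, readable TypeError */
        }
        if (PySequence_Size(argv) == 0) {
            PyErr_SetString(PyExc_ValueError,
                            "execv() arg 2 must not be empty");
            return NULL;               /* zero-length args -> ValueError */
        }
        /* ... convert argv to a char *[] and call execv(path, ...) ... */
        Py_INCREF(Py_None);
        return Py_None;
    }
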
get_rfc_url(): New function; returns the URL for a numbered IETF RFC.
do_cmd_rfc(): Use get_rfc_url() instead of hard-coding the URL in the
HTML formatting.
do_cmd_seerfc(): New function.
do_env_definitions(): Small change to avoid "local".
Config vars, tildes, etc. are now expanded in command-line options,
and in two phases at that: first, we expand 'install_base' and
'install_platbase', and then the other 'install_*' options. This lets
us do tricky stuff like
install --prefix='/tmp$sys_prefix'
...oooh, neat.
Simplified 'select_scheme()' -- it's no longer responsible for expanding
config vars, tildes, etc.
Define installation-specific config vars in 'self.config_vars', rather than
in a local dictionary of one method. Also factored '_expand_attrs()' out
of 'expand_dirs()' and added 'expand_basedirs()'.
Added a bunch of debugging output so I (and others) can judge the
success of this crazy scheme through direct feedback.
I think that after this patch, all objects in the os module (with names
that don't start with "_") that can have docstrings, do, on Linux at
least.
Also fix a nit in one of my spawn* docstrings.
project. [However I didn't add the other changes in his patch, which
were just taking away the source code control stuff -- this doesn't
hurt and would come back as soon as I make another change. --GvR]
Adds bztar format to generate .tar.bz2 tarballs
Uses the -f argument to overwrite old tarballs automatically; I am
assuming that if the old tarball was wanted it would have been moved or
else the version number would have been changed.
Uses the -9 argument to bzip2 and gzip to use maximum
compression. Compress uses the maximum compression by default.
Tests for a correct value of the 'compress' argument to make_tarball.
This is one less place for someone adding new compression programs to
forget to change.
Improvements:
- no longer needs any extra memory
- has no relationship to tstate
- works in debug mode
- can easily be modified for free threading (hi Greg:)
Side effects:
Trashcan does change the order of object destruction.
Preventing that would be quite an immense effort, as
my attempts have shown. This version always works
the same, with debug mode or not. The slightly
changed destruction order should therefore be no problem.
Algorithm:
While the old idea of delaying the destruction of some
objects at a certain recursion level was kept, we now
no longer allocate an object to hold these objects.
The delayed objects are instead chained together
via their ob_type field. The type is encoded via
ob_refcnt. When it comes to the destruction of the
chain of waiting objects, the topmost object is popped
off the chain and revived with its type and a refcount of 1,
then it gets a normal Py_DECREF.
I am confident that this solution is near optimum
for minimizing side effects and code bloat.
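A hedged sketch of the chaining idea (simplified to two object types; the
names are illustrative, not the literal object.c code):

    #include <Python.h>

    #define SKETCH_TRASH_TUPLE 1
    #define SKETCH_TRASH_LIST  2

    static PyObject *delete_later = NULL;      /* chain of waiting objects */

    /* Called at deep nesting levels: postpone destruction by pushing the
       object on an intrusive stack built out of its own header fields. */
    static void
    sketch_deposit(PyObject *op)
    {
        op->ob_refcnt = PyTuple_Check(op) ? SKETCH_TRASH_TUPLE
                                          : SKETCH_TRASH_LIST;
        op->ob_type = (PyTypeObject *)delete_later;  /* link via ob_type */
        delete_later = op;
    }

    /* Called once the nesting has unwound: destroy the waiting objects. */
    static void
    sketch_destroy_chain(void)
    {
        while (delete_later != NULL) {
            PyObject *op = delete_later;
            delete_later = (PyObject *)op->ob_type;  /* pop topmost object */
            op->ob_type = (op->ob_refcnt == SKETCH_TRASH_TUPLE)
                              ? &PyTuple_Type : &PyList_Type;
            op->ob_refcnt = 1;                       /* revive, refcount 1 */
            Py_DECREF(op);                           /* normal destruction */
        }
    }
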
socklen_t (unsigned int) for most size parameters. Apparently this is
part of the UNIX 98 standard.
[GvR: the changes to configure.in etc. that I just checked in make
sure that socklen_t is defined everywhere, so I deleted the little
part of Jack's mod to define socklen_t if not in GUSI2. I suppose I
will have to add it to the Windows config.h in a minute.]
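Illustrative use of socklen_t for such a size parameter (not the actual
socketmodule.c code):

    #include <sys/types.h>
    #include <sys/socket.h>
    #include <netinet/in.h>

    static int
    accept_example(int listen_fd)
    {
        struct sockaddr_in peer;
        socklen_t peerlen = sizeof(peer);   /* was a plain int before */
        return accept(listen_fd, (struct sockaddr *)&peer, &peerlen);
    }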