mirror of https://github.com/python/cpython

Merge.

This commit is contained in: commit 69cfcabae3
@@ -351,18 +351,20 @@ Note that most parent parsers will specify ``add_help=False``. Otherwise, the
:class:`ArgumentParser` will see two ``-h/--help`` options (one in the parent
and one in the child) and raise an error.

.. note::
   You must fully initialize the parsers before passing them via ``parents=``.
   If you change the parent parsers after the child parser, those changes will
   not be reflected in the child.


formatter_class
^^^^^^^^^^^^^^^

:class:`ArgumentParser` objects allow the help formatting to be customized by
-specifying an alternate formatting class. Currently, there are three such
-classes: :class:`argparse.RawDescriptionHelpFormatter`,
-:class:`argparse.RawTextHelpFormatter` and
-:class:`argparse.ArgumentDefaultsHelpFormatter`. The first two allow more
-control over how textual descriptions are displayed, while the last
-automatically adds information about argument default values.
+specifying an alternate formatting class.
+
+:class:`RawDescriptionHelpFormatter` and :class:`RawTextHelpFormatter` give
+more control over how textual descriptions are displayed.
By default, :class:`ArgumentParser` objects line-wrap the description_ and
epilog_ texts in command-line help messages::
@@ -386,7 +388,7 @@ epilog_ texts in command-line help messages::
   likewise for this epilog whose whitespace will be cleaned up and whose words
   will be wrapped across a couple lines

-Passing :class:`argparse.RawDescriptionHelpFormatter` as ``formatter_class=``
+Passing :class:`RawDescriptionHelpFormatter` as ``formatter_class=``
indicates that description_ and epilog_ are already correctly formatted and
should not be line-wrapped::
@@ -412,11 +414,11 @@ should not be line-wrapped::
   optional arguments:
    -h, --help  show this help message and exit

-:class:`RawTextHelpFormatter` maintains whitespace for all sorts of help text
+:class:`RawTextHelpFormatter` maintains whitespace for all sorts of help text,
+including argument descriptions.

-The other formatter class available, :class:`ArgumentDefaultsHelpFormatter`,
-will add information about the default value of each of the arguments::
+:class:`ArgumentDefaultsHelpFormatter` automatically adds information about
+default values to each of the argument help messages::

   >>> parser = argparse.ArgumentParser(
   ...     prog='PROG',
@@ -433,6 +435,25 @@ will add information about the default value of each of the arguments::
    -h, --help  show this help message and exit
    --foo FOO   FOO! (default: 42)

+:class:`MetavarTypeHelpFormatter` uses the name of the type_ argument for each
+argument as the display name for its values (rather than using the dest_
+as the regular formatter does)::
+
+   >>> parser = argparse.ArgumentParser(
+   ...     prog='PROG',
+   ...     formatter_class=argparse.MetavarTypeHelpFormatter)
+   >>> parser.add_argument('--foo', type=int)
+   >>> parser.add_argument('bar', type=float)
+   >>> parser.print_help()
+   usage: PROG [-h] [--foo int] float
+
+   positional arguments:
+     float
+
+   optional arguments:
+     -h, --help  show this help message and exit
+     --foo int
+

conflict_handler
^^^^^^^^^^^^^^^^
@@ -1314,13 +1335,24 @@ of :data:`sys.argv`. This can be accomplished by passing a list of strings to
   Namespace(accumulate=<built-in function sum>, integers=[1, 2, 3, 4])


-Custom namespaces
-^^^^^^^^^^^^^^^^^
+The Namespace object
+^^^^^^^^^^^^^^^^^^^^
+
+By default, :meth:`parse_args` will return a new object of type :class:`Namespace`
+where the necessary attributes have been set. This class is deliberately simple,
+just an :class:`object` subclass with a readable string representation. If you
+prefer to have a dict-like view of the attributes, you can use the standard
+Python idiom via :func:`vars`::
+
+   >>> parser = argparse.ArgumentParser()
+   >>> parser.add_argument('--foo')
+   >>> args = parser.parse_args(['--foo', 'BAR'])
+   >>> vars(args)
+   {'foo': 'BAR'}

It may also be useful to have an :class:`ArgumentParser` assign attributes to an
-already existing object, rather than the newly-created :class:`Namespace` object
-that is normally used. This can be achieved by specifying the ``namespace=``
-keyword argument::
+already existing object, rather than a new :class:`Namespace` object. This can
+be achieved by specifying the ``namespace=`` keyword argument::

   >>> class C:
   ...     pass
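The ``parents=`` caveat added in the note above can be illustrated with a short sketch; the argument names here are illustrative, not from the patch:

```python
import argparse

# The parent must be fully initialized before it is passed via parents=;
# it uses add_help=False so the child keeps the only -h/--help option.
parent = argparse.ArgumentParser(add_help=False)
parent.add_argument('--verbose', action='store_true')

child = argparse.ArgumentParser(parents=[parent])
child.add_argument('name')

# Arguments added to `parent` after this point would NOT appear in `child`.
args = child.parse_args(['--verbose', 'demo'])
print(args.verbose, args.name)  # True demo
```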
@@ -28,10 +28,10 @@ example, whether it is hashable or whether it is a mapping.
Collections Abstract Base Classes
---------------------------------

-The collections module offers the following ABCs:
+The collections module offers the following :term:`ABCs <abstract base class>`:

========================= ===================== ====================== ====================================================
-ABC                       Inherits              Abstract Methods       Mixin Methods
+ABC                       Inherits from         Abstract Methods       Mixin Methods
========================= ===================== ====================== ====================================================
:class:`Container`                              ``__contains__``
:class:`Hashable`                               ``__hash__``
@@ -44,15 +44,15 @@ ABC                       Inherits              Abstract Methods       Mixin
                          :class:`Iterable`,                           ``index``, and ``count``
                          :class:`Container`

-:class:`MutableSequence`  :class:`Sequence`     ``__setitem__``        Inherited Sequence methods and
+:class:`MutableSequence`  :class:`Sequence`     ``__setitem__``        Inherited :class:`Sequence` methods and
                                                ``__delitem__``,       ``append``, ``reverse``, ``extend``, ``pop``,
-                                               and ``insert``         ``remove``, ``clear``, and ``__iadd__``
+                                               ``insert``             ``remove``, ``clear``, and ``__iadd__``

:class:`Set`              :class:`Sized`,                              ``__le__``, ``__lt__``, ``__eq__``, ``__ne__``,
                          :class:`Iterable`,                           ``__gt__``, ``__ge__``, ``__and__``, ``__or__``,
                          :class:`Container`                           ``__sub__``, ``__xor__``, and ``isdisjoint``

-:class:`MutableSet`       :class:`Set`          ``add`` and            Inherited Set methods and
+:class:`MutableSet`       :class:`Set`          ``add``,               Inherited :class:`Set` methods and
                                                ``discard``            ``clear``, ``pop``, ``remove``, ``__ior__``,
                                                                       ``__iand__``, ``__ixor__``, and ``__isub__``
@@ -60,19 +60,61 @@ ABC                       Inherits              Abstract Methods       Mixin
                          :class:`Iterable`,                           ``get``, ``__eq__``, and ``__ne__``
                          :class:`Container`

-:class:`MutableMapping`   :class:`Mapping`      ``__setitem__`` and    Inherited Mapping methods and
+:class:`MutableMapping`   :class:`Mapping`      ``__setitem__``,       Inherited :class:`Mapping` methods and
                                                ``__delitem__``        ``pop``, ``popitem``, ``clear``, ``update``,
                                                                       and ``setdefault``


:class:`MappingView`      :class:`Sized`                               ``__len__``
-:class:`KeysView`         :class:`MappingView`,                        ``__contains__``,
-                          :class:`Set`                                 ``__iter__``
:class:`ItemsView`        :class:`MappingView`,                        ``__contains__``,
                          :class:`Set`                                 ``__iter__``
+:class:`KeysView`         :class:`MappingView`,                        ``__contains__``,
+                          :class:`Set`                                 ``__iter__``
:class:`ValuesView`       :class:`MappingView`                         ``__contains__``, ``__iter__``
========================= ===================== ====================== ====================================================


.. class:: Container
           Hashable
           Sized
           Callable

   ABCs for classes that provide respectively the methods :meth:`__contains__`,
   :meth:`__hash__`, :meth:`__len__`, and :meth:`__call__`.

.. class:: Iterable

   ABC for classes that provide the :meth:`__iter__` method.
   See also the definition of :term:`iterable`.

.. class:: Iterator

   ABC for classes that provide the :meth:`__iter__` and :meth:`__next__` methods.
   See also the definition of :term:`iterator`.

.. class:: Sequence
           MutableSequence

   ABCs for read-only and mutable :term:`sequences <sequence>`.

.. class:: Set
           MutableSet

   ABCs for read-only and mutable sets.

.. class:: Mapping
           MutableMapping

   ABCs for read-only and mutable :term:`mappings <mapping>`.

.. class:: MappingView
           ItemsView
           KeysView
           ValuesView

   ABCs for mapping, items, keys, and values :term:`views <view>`.


These ABCs allow us to ask classes or instances if they provide
particular functionality, for example::
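The table above can be exercised directly. A minimal sketch, using the modern :mod:`collections.abc` location of these ABCs (in the era of this commit they lived in :mod:`collections` itself):

```python
from collections.abc import Hashable, MutableMapping, Sequence

# Ask classes or instances whether they provide particular functionality.
print(issubclass(dict, MutableMapping))  # True
print(issubclass(str, Sequence))         # True
print(isinstance([], Hashable))          # False: lists set __hash__ to None
```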
@@ -68,6 +68,9 @@ compile Python sources.
.. versionchanged:: 3.2
   Added the ``-i``, ``-b`` and ``-h`` options.

+There is no command-line option to control the optimization level used by the
+:func:`compile` function, because the Python interpreter itself already
+provides the option: :program:`python -O -m compileall`.

Public functions
----------------
@@ -102,7 +102,7 @@ as a string. :class:`HeaderParser` has the same API as the :class:`Parser`
class.


-.. class:: Parser(_class=email.message.Message, strict=None)
+.. class:: Parser(_class=email.message.Message)

   The constructor for the :class:`Parser` class takes an optional argument
   *_class*. This must be a callable factory (such as a function or a class), and

@@ -110,13 +110,8 @@ class.
   :class:`~email.message.Message` (see :mod:`email.message`). The factory will
   be called without arguments.

-   The optional *strict* flag is ignored.
-
-   .. deprecated:: 2.4
-      Because the :class:`Parser` class is a backward compatible API wrapper
-      around the new-in-Python 2.4 :class:`FeedParser`, *all* parsing is
-      effectively non-strict. You should simply stop passing a *strict* flag to
-      the :class:`Parser` constructor.
+   .. versionchanged:: 3.2
+      Removed the *strict* argument that was deprecated in 2.4.

The other public :class:`Parser` methods are:
@@ -722,7 +722,7 @@ cookies (assumes Unix/Netscape convention for location of the cookies file)::

    import os, http.cookiejar, urllib.request
    cj = http.cookiejar.MozillaCookieJar()
-    cj.load(os.path.join(os.environ["HOME"], ".netscape/cookies.txt"))
+    cj.load(os.path.join(os.path.expanduser("~"), ".netscape", "cookies.txt"))
    opener = urllib.request.build_opener(urllib.request.HTTPCookieProcessor(cj))
    r = opener.open("http://example.com/")
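The switch from ``os.environ["HOME"]`` to :func:`os.path.expanduser` in this hunk is a portability fix; a minimal sketch of the difference:

```python
import os.path

# expanduser falls back to other mechanisms (USERPROFILE/HOMEPATH on
# Windows, the pwd database on POSIX) instead of raising KeyError when
# the HOME environment variable is not set.
cookie_path = os.path.join(os.path.expanduser("~"), ".netscape", "cookies.txt")
print(cookie_path)
```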
@@ -46,7 +46,7 @@ Iterator Arguments Results
==================== ============================ ================================================= =============================================================
Iterator             Arguments                    Results                                           Example
==================== ============================ ================================================= =============================================================
-:func:`accumulate`   p                            p0, p0+p1, p0+p1+p2, ...                          ``accumulate([1,2,3,4,5]) --> 1 3 6 10 15``
+:func:`accumulate`   p [,func]                    p0, p0+p1, p0+p1+p2, ...                          ``accumulate([1,2,3,4,5]) --> 1 3 6 10 15``
:func:`chain`        p, q, ...                    p0, p1, ... plast, q0, q1, ...                    ``chain('ABC', 'DEF') --> A B C D E F``
:func:`compress`     data, selectors              (d[0] if s[0]), (d[1] if s[1]), ...               ``compress('ABCDEF', [1,0,1,0,1,1]) --> A C E F``
:func:`dropwhile`    pred, seq                    seq[n], seq[n+1], starting when pred fails        ``dropwhile(lambda x: x<5, [1,4,6,4,1]) --> 6 4 1``

@@ -84,23 +84,46 @@ The following module functions all construct and return iterators. Some provide
streams of infinite length, so they should only be accessed by functions or
loops that truncate the stream.

-.. function:: accumulate(iterable)
+.. function:: accumulate(iterable[, func])

-   Make an iterator that returns accumulated sums. Elements may be any addable
-   type including :class:`Decimal` or :class:`Fraction`. Equivalent to::
+   Make an iterator that returns accumulated sums. Elements may be any addable
+   type including :class:`Decimal` or :class:`Fraction`. If the optional
+   *func* argument is supplied, it should be a function of two arguments
+   and it will be used instead of addition.
+
+   Equivalent to::

-      def accumulate(iterable):
+      def accumulate(iterable, func=operator.add):
          'Return running totals'
          # accumulate([1,2,3,4,5]) --> 1 3 6 10 15
          # accumulate([1,2,3,4,5], operator.mul) --> 1 2 6 24 120
          it = iter(iterable)
          total = next(it)
          yield total
          for element in it:
-            total = total + element
+            total = func(total, element)
             yield total

+   Uses for the *func* argument include :func:`min` for a running minimum,
+   :func:`max` for a running maximum, and :func:`operator.mul` for a running
+   product::
+
+      >>> data = [3, 4, 6, 2, 1, 9, 0, 7, 5, 8]
+      >>> list(accumulate(data, operator.mul))     # running product
+      [3, 12, 72, 144, 144, 1296, 0, 0, 0, 0]
+      >>> list(accumulate(data, max))              # running maximum
+      [3, 4, 6, 6, 6, 9, 9, 9, 9, 9]
+
+      # Amortize a 5% loan of 1000 with 4 annual payments of 90
+      >>> cashflows = [1000, -90, -90, -90, -90]
+      >>> list(accumulate(cashflows, lambda bal, pmt: bal*1.05 + pmt))
+      [1000, 960.0, 918.0, 873.9000000000001, 827.5950000000001]

   .. versionadded:: 3.2

+   .. versionchanged:: 3.3
+      Added the optional *func* parameter.

.. function:: chain(*iterables)

   Make an iterator that returns elements from the first iterable until it is
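The new *func* parameter (Python 3.3 and later) can be exercised directly, matching the pure-Python equivalent in the diff:

```python
from itertools import accumulate
import operator

print(list(accumulate([1, 2, 3, 4, 5])))                # [1, 3, 6, 10, 15]
print(list(accumulate([1, 2, 3, 4, 5], operator.mul)))  # [1, 2, 6, 24, 120]
print(list(accumulate([3, 4, 6, 2, 1], min)))           # [3, 3, 3, 2, 1]
```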
@@ -196,7 +196,7 @@ normally be executed automatically during interactive sessions from the user's

    import os
    import readline
-    histfile = os.path.join(os.environ["HOME"], ".pyhist")
+    histfile = os.path.join(os.path.expanduser("~"), ".pyhist")
    try:
        readline.read_history_file(histfile)
    except IOError:
@@ -227,33 +227,21 @@ always available.
   The struct sequence *flags* exposes the status of command line flags. The
   attributes are read only.

-   +------------------------------+------------------------------------------+
-   | attribute                    | flag                                     |
-   +==============================+==========================================+
-   | :const:`debug`               | -d                                       |
-   +------------------------------+------------------------------------------+
-   | :const:`division_warning`    | -Q                                       |
-   +------------------------------+------------------------------------------+
-   | :const:`inspect`             | -i                                       |
-   +------------------------------+------------------------------------------+
-   | :const:`interactive`         | -i                                       |
-   +------------------------------+------------------------------------------+
-   | :const:`optimize`            | -O or -OO                                |
-   +------------------------------+------------------------------------------+
-   | :const:`dont_write_bytecode` | -B                                       |
-   +------------------------------+------------------------------------------+
-   | :const:`no_user_site`        | -s                                       |
-   +------------------------------+------------------------------------------+
-   | :const:`no_site`             | -S                                       |
-   +------------------------------+------------------------------------------+
-   | :const:`ignore_environment`  | -E                                       |
-   +------------------------------+------------------------------------------+
-   | :const:`verbose`             | -v                                       |
-   +------------------------------+------------------------------------------+
-   | :const:`bytes_warning`       | -b                                       |
-   +------------------------------+------------------------------------------+
-   | :const:`quiet`               | -q                                       |
-   +------------------------------+------------------------------------------+
+   ============================= =============================
+   attribute                     flag
+   ============================= =============================
+   :const:`debug`                :option:`-d`
+   :const:`inspect`              :option:`-i`
+   :const:`interactive`          :option:`-i`
+   :const:`optimize`             :option:`-O` or :option:`-OO`
+   :const:`dont_write_bytecode`  :option:`-B`
+   :const:`no_user_site`         :option:`-s`
+   :const:`no_site`              :option:`-S`
+   :const:`ignore_environment`   :option:`-E`
+   :const:`verbose`              :option:`-v`
+   :const:`bytes_warning`        :option:`-b`
+   :const:`quiet`                :option:`-q`
+   ============================= =============================

   .. versionchanged:: 3.2
      Added ``quiet`` attribute for the new :option:`-q` flag.
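The attributes in the table map one-to-one onto interpreter options and can be inspected at runtime; a minimal sketch:

```python
import sys

# Each sys.flags attribute is a read-only int mirroring a command-line flag.
print(sys.flags.debug)                # non-zero only when run with -d
print(sys.flags.optimize)             # 1 under -O, 2 under -OO
print(sys.flags.dont_write_bytecode)  # non-zero when run with -B
```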
@@ -123,10 +123,7 @@ interpreter. ::

    # bound to the Esc key by default (you can change it - see readline docs).
    #
    # Store the file in ~/.pystartup, and set an environment variable to point
-    # to it: "export PYTHONSTARTUP=/home/user/.pystartup" in bash.
-    #
-    # Note that PYTHONSTARTUP does *not* expand "~", so you have to put in the
-    # full path to your home directory.
+    # to it: "export PYTHONSTARTUP=~/.pystartup" in bash.

    import atexit
    import os
@@ -1499,11 +1499,11 @@ filenames:
   >>> os.fsencode(filename)
   b'Sehensw\xc3\xbcrdigkeiten'

-Some operating systems allow direct access to the unencoded bytes in the
+Some operating systems allow direct access to encoded bytes in the
environment. If so, the :attr:`os.supports_bytes_environ` constant will be
true.

-For direct access to unencoded environment variables (if available),
+For direct access to encoded environment variables (if available),
use the new :func:`os.getenvb` function or use :data:`os.environb`
which is a bytes version of :data:`os.environ`.
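A small sketch of the bytes-environment access described above, guarded by the constant the doc change mentions:

```python
import os

if os.supports_bytes_environ:
    # POSIX: read the environment as raw bytes, with no implicit decoding.
    raw_path = os.environb.get(b'PATH', b'')
    print(type(raw_path))  # <class 'bytes'>
else:
    # e.g. Windows, where the environment is natively str.
    print('bytes environ not supported on this platform')
```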
@@ -133,3 +133,7 @@ that may require changes to your code:
  ``import site`` will not add site-specific paths to the module search
  paths. In previous versions, it did. See changeset for doc changes in
  various files. Contributed by Carl Meyer with editions by Éric Araujo.

+.. Issue #10998: -Q command-line flags and related artifacts have been
+   removed. Code checking sys.flags.division_warning will need updating.
+   Contributed by Éric Araujo.
@@ -16,7 +16,6 @@ PyAPI_DATA(int) Py_BytesWarningFlag;
PyAPI_DATA(int) Py_UseClassExceptionsFlag;
PyAPI_DATA(int) Py_FrozenFlag;
PyAPI_DATA(int) Py_IgnoreEnvironmentFlag;
-PyAPI_DATA(int) Py_DivisionWarningFlag;
PyAPI_DATA(int) Py_DontWriteBytecodeFlag;
PyAPI_DATA(int) Py_NoUserSiteDirectory;
PyAPI_DATA(int) Py_UnbufferedStdioFlag;
@@ -71,6 +71,7 @@ __all__ = [
    'ArgumentDefaultsHelpFormatter',
    'RawDescriptionHelpFormatter',
    'RawTextHelpFormatter',
+    'MetavarTypeHelpFormatter',
    'Namespace',
    'Action',
    'ONE_OR_MORE',

@@ -82,6 +83,7 @@ __all__ = [
]


+import collections as _collections
import copy as _copy
import os as _os
import re as _re

@@ -422,7 +424,8 @@ class HelpFormatter(object):

            # produce all arg strings
            elif not action.option_strings:
-                part = self._format_args(action, action.dest)
+                default = self._get_default_metavar_for_positional(action)
+                part = self._format_args(action, default)

                # if it's in a group, strip the outer []
                if action in group_actions:

@@ -444,7 +447,7 @@ class HelpFormatter(object):
                # if the Optional takes a value, format is:
                #    -s ARGS or --long ARGS
                else:
-                    default = action.dest.upper()
+                    default = self._get_default_metavar_for_optional(action)
                    args_string = self._format_args(action, default)
                    part = '%s %s' % (option_string, args_string)

@@ -530,7 +533,8 @@ class HelpFormatter(object):

    def _format_action_invocation(self, action):
        if not action.option_strings:
-            metavar, = self._metavar_formatter(action, action.dest)(1)
+            default = self._get_default_metavar_for_positional(action)
+            metavar, = self._metavar_formatter(action, default)(1)
            return metavar

        else:

@@ -544,7 +548,7 @@ class HelpFormatter(object):
            # if the Optional takes a value, format is:
            #    -s ARGS, --long ARGS
            else:
-                default = action.dest.upper()
+                default = self._get_default_metavar_for_optional(action)
                args_string = self._format_args(action, default)
                for option_string in action.option_strings:
                    parts.append('%s %s' % (option_string, args_string))

@@ -622,6 +626,12 @@ class HelpFormatter(object):
    def _get_help_string(self, action):
        return action.help

+    def _get_default_metavar_for_optional(self, action):
+        return action.dest.upper()
+
+    def _get_default_metavar_for_positional(self, action):
+        return action.dest
+

class RawDescriptionHelpFormatter(HelpFormatter):
    """Help message formatter which retains any formatting in descriptions.

@@ -662,6 +672,22 @@ class ArgumentDefaultsHelpFormatter(HelpFormatter):
        return help


+class MetavarTypeHelpFormatter(HelpFormatter):
+    """Help message formatter which uses the argument 'type' as the default
+    metavar value (instead of the argument 'dest')
+
+    Only the name of this class is considered a public API. All the methods
+    provided by the class are considered an implementation detail.
+    """
+
+    def _get_default_metavar_for_optional(self, action):
+        return action.type.__name__
+
+    def _get_default_metavar_for_positional(self, action):
+        return action.type.__name__
+
+
# =====================
# Options and Arguments
# =====================

@@ -1041,7 +1067,7 @@ class _SubParsersAction(Action):

        self._prog_prefix = prog
        self._parser_class = parser_class
-        self._name_parser_map = {}
+        self._name_parser_map = _collections.OrderedDict()
        self._choices_actions = []

        super(_SubParsersAction, self).__init__(

@@ -1294,6 +1320,13 @@ class _ActionsContainer(object):
        if not _callable(type_func):
            raise ValueError('%r is not callable' % type_func)

+        # raise an error if the metavar does not match the type
+        if hasattr(self, "_get_formatter"):
+            try:
+                self._get_formatter()._format_args(action, None)
+            except TypeError:
+                raise ValueError("length of metavar tuple does not match nargs")
+
        return self._add_action(action)

    def add_argument_group(self, *args, **kwargs):
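The metavar-tuple check added at the end of this file surfaces to users as a ``ValueError`` from :meth:`add_argument`; a short sketch:

```python
import argparse

parser = argparse.ArgumentParser(prog='PROG')
# A metavar tuple must have exactly nargs elements.
parser.add_argument('--range', nargs=2, metavar=('START', 'END'))  # ok
try:
    parser.add_argument('--bad', nargs=3, metavar=('A', 'B'))      # mismatch
except ValueError as e:
    print(e)  # length of metavar tuple does not match nargs
```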
@@ -66,14 +66,17 @@ import weakref
# workers to exit when their work queues are empty and then waits until the
# threads/processes finish.

-_live_threads = weakref.WeakSet()
+_threads_queues = weakref.WeakKeyDictionary()
_shutdown = False

def _python_exit():
    global _shutdown
    _shutdown = True
-    for thread in _live_threads:
-        thread.join()
+    items = list(_threads_queues.items())
+    for t, q in items:
+        q.put(None)
+    for t, q in items:
+        t.join()

# Controls how many more calls than processes will be queued in the call queue.
# A smaller number will mean that processes spend more time idle waiting for

@@ -116,11 +119,15 @@ def _process_worker(call_queue, result_queue, shutdown):
    """
    while True:
        try:
-            call_item = call_queue.get(block=True, timeout=0.1)
+            call_item = call_queue.get(block=True)
        except queue.Empty:
            if shutdown.is_set():
                return
        else:
+            if call_item is None:
+                # Wake up queue management thread
+                result_queue.put(None)
+                return
            try:
                r = call_item.fn(*call_item.args, **call_item.kwargs)
            except BaseException as e:

@@ -195,40 +202,56 @@ def _queue_manangement_worker(executor_reference,
        process workers that they should exit when their work queue is
        empty.
    """
    nb_shutdown_processes = 0
    def shutdown_one_process():
        """Tell a worker to terminate, which will in turn wake us again"""
        nonlocal nb_shutdown_processes
        call_queue.put(None)
        nb_shutdown_processes += 1
    while True:
        _add_call_item_to_queue(pending_work_items,
                                work_ids_queue,
                                call_queue)

        try:
-            result_item = result_queue.get(block=True, timeout=0.1)
+            result_item = result_queue.get(block=True)
        except queue.Empty:
-            executor = executor_reference()
-            # No more work items can be added if:
-            #   - The interpreter is shutting down OR
-            #   - The executor that owns this worker has been collected OR
-            #   - The executor that owns this worker has been shutdown.
-            if _shutdown or executor is None or executor._shutdown_thread:
-                # Since no new work items can be added, it is safe to shutdown
-                # this thread if there are no pending work items.
-                if not pending_work_items:
-                    shutdown_process_event.set()
-
-                    # If .join() is not called on the created processes then
-                    # some multiprocessing.Queue methods may deadlock on Mac OS
-                    # X.
-                    for p in processes:
-                        p.join()
-                    return
-            del executor
+            pass
        else:
-            work_item = pending_work_items[result_item.work_id]
-            del pending_work_items[result_item.work_id]
-
-            if result_item.exception:
-                work_item.future.set_exception(result_item.exception)
-            else:
-                work_item.future.set_result(result_item.result)
+            if result_item is not None:
+                work_item = pending_work_items[result_item.work_id]
+                del pending_work_items[result_item.work_id]
+
+                if result_item.exception:
+                    work_item.future.set_exception(result_item.exception)
+                else:
+                    work_item.future.set_result(result_item.result)
+                continue
+        # If we come here, we either got a timeout or were explicitly woken up.
+        # In either case, check whether we should start shutting down.
+        executor = executor_reference()
+        # No more work items can be added if:
+        #   - The interpreter is shutting down OR
+        #   - The executor that owns this worker has been collected OR
+        #   - The executor that owns this worker has been shutdown.
+        if _shutdown or executor is None or executor._shutdown_thread:
+            # Since no new work items can be added, it is safe to shutdown
+            # this thread if there are no pending work items.
+            if not pending_work_items:
+                shutdown_process_event.set()
+
+                while nb_shutdown_processes < len(processes):
+                    shutdown_one_process()
+                # If .join() is not called on the created processes then
+                # some multiprocessing.Queue methods may deadlock on Mac OS
+                # X.
+                for p in processes:
+                    p.join()
+                return
+            # Start shutting down by telling a process it can exit.
+            shutdown_one_process()
+        del executor

_system_limits_checked = False
_system_limited = None

@@ -289,10 +312,14 @@ class ProcessPoolExecutor(_base.Executor):
        self._pending_work_items = {}

    def _start_queue_management_thread(self):
+        # When the executor gets lost, the weakref callback will wake up
+        # the queue management thread.
+        def weakref_cb(_, q=self._result_queue):
+            q.put(None)
        if self._queue_management_thread is None:
            self._queue_management_thread = threading.Thread(
                    target=_queue_manangement_worker,
-                    args=(weakref.ref(self),
+                    args=(weakref.ref(self, weakref_cb),
                          self._processes,
                          self._pending_work_items,
                          self._work_ids,

@@ -301,7 +328,7 @@ class ProcessPoolExecutor(_base.Executor):
                          self._shutdown_process_event))
            self._queue_management_thread.daemon = True
            self._queue_management_thread.start()
-            _live_threads.add(self._queue_management_thread)
+            _threads_queues[self._queue_management_thread] = self._result_queue

    def _adjust_process_count(self):
        for _ in range(len(self._processes), self._max_workers):

@@ -324,6 +351,8 @@ class ProcessPoolExecutor(_base.Executor):
            self._pending_work_items[self._queue_count] = w
            self._work_ids.put(self._queue_count)
            self._queue_count += 1
+            # Wake up queue management thread
+            self._result_queue.put(None)

            self._start_queue_management_thread()
            self._adjust_process_count()

@@ -333,8 +362,10 @@ class ProcessPoolExecutor(_base.Executor):
    def shutdown(self, wait=True):
        with self._shutdown_lock:
            self._shutdown_thread = True
-        if wait:
-            if self._queue_management_thread:
+        if self._queue_management_thread:
+            # Wake up queue management thread
+            self._result_queue.put(None)
+            if wait:
                self._queue_management_thread.join()
        # To reduce the risk of opening too many files, remove references to
        # objects that use file descriptors.
@ -25,14 +25,18 @@ import weakref
|
|||
# workers to exit when their work queues are empty and then waits until the
|
||||
# threads finish.
|
||||
|
||||
_live_threads = weakref.WeakSet()
|
||||
_threads_queues = weakref.WeakKeyDictionary()
|
||||
_shutdown = False
|
||||
|
||||
def _python_exit():
|
||||
global _shutdown
|
||||
_shutdown = True
|
||||
for thread in _live_threads:
|
||||
thread.join()
|
||||
items = list(_threads_queues.items())
|
||||
for t, q in items:
|
||||
q.put(None)
|
||||
for t, q in items:
|
||||
t.join()
|
||||
|
||||
atexit.register(_python_exit)
|
||||
|
||||
class _WorkItem(object):
|
||||
|
@ -57,18 +61,23 @@ def _worker(executor_reference, work_queue):
|
|||
try:
|
||||
while True:
|
||||
try:
|
||||
work_item = work_queue.get(block=True, timeout=0.1)
|
||||
work_item = work_queue.get(block=True)
|
||||
except queue.Empty:
|
||||
executor = executor_reference()
|
||||
# Exit if:
|
||||
# - The interpreter is shutting down OR
|
||||
# - The executor that owns the worker has been collected OR
|
||||
# - The executor that owns the worker has been shutdown.
|
||||
if _shutdown or executor is None or executor._shutdown:
|
||||
return
|
||||
del executor
|
||||
pass
|
||||
else:
|
||||
work_item.run()
|
||||
if work_item is not None:
|
||||
work_item.run()
|
||||
continue
|
||||
executor = executor_reference()
|
||||
# Exit if:
|
||||
# - The interpreter is shutting down OR
|
||||
# - The executor that owns the worker has been collected OR
|
||||
# - The executor that owns the worker has been shutdown.
|
||||
if _shutdown or executor is None or executor._shutdown:
|
||||
# Notice other workers
|
||||
work_queue.put(None)
|
||||
return
|
||||
del executor
|
||||
except BaseException as e:
|
||||
_base.LOGGER.critical('Exception in worker', exc_info=True)
|
||||
|
||||
|
@@ -100,19 +109,25 @@ class ThreadPoolExecutor(_base.Executor):
     submit.__doc__ = _base.Executor.submit.__doc__
 
     def _adjust_thread_count(self):
+        # When the executor gets lost, the weakref callback will wake up
+        # the worker threads.
+        def weakref_cb(_, q=self._work_queue):
+            q.put(None)
         # TODO(bquinlan): Should avoid creating new threads if there are more
         # idle threads than items in the work queue.
         if len(self._threads) < self._max_workers:
             t = threading.Thread(target=_worker,
-                                 args=(weakref.ref(self), self._work_queue))
+                                 args=(weakref.ref(self, weakref_cb),
+                                       self._work_queue))
             t.daemon = True
             t.start()
             self._threads.add(t)
-            _live_threads.add(t)
+            _threads_queues[t] = self._work_queue
 
     def shutdown(self, wait=True):
         with self._shutdown_lock:
             self._shutdown = True
+            self._work_queue.put(None)
         if wait:
             for t in self._threads:
                 t.join()
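`_adjust_thread_count` now passes a second argument to `weakref.ref`: a callback invoked when the referent is garbage-collected, which drops a `None` sentinel into the work queue. The mechanism in isolation (illustrative names, not from the patch):

```python
import queue
import weakref

class Owner:
    """Stand-in for the executor object being tracked."""

wakeup = queue.Queue()

def weakref_cb(ref, q=wakeup):
    # Called by the interpreter once the referent has been collected.
    q.put(None)

owner = Owner()
ref = weakref.ref(owner, weakref_cb)
del owner                      # on CPython, refcounting collects immediately
sentinel = wakeup.get(timeout=1)
```

Binding the queue as a default argument (`q=wakeup`) avoids the callback keeping a reference chain back to the owner itself.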
@@ -187,6 +187,18 @@ class BasicWrapTestCase(unittest.TestCase):
         self.assertEqual((s8i.a, s8i.b, s8i.c, s8i.d, s8i.e, s8i.f, s8i.g, s8i.h),
                          (9*2, 8*3, 7*4, 6*5, 5*6, 4*7, 3*8, 2*9))
 
+    def test_recursive_as_param(self):
+        from ctypes import c_int
+
+        class A(object):
+            pass
+
+        a = A()
+        a._as_parameter_ = a
+        with self.assertRaises(RuntimeError):
+            c_int.from_param(a)
+
+
 #~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
 
 class AsParamWrapper(object):
@@ -66,9 +66,15 @@ def decode_header(header):
     otherwise a lower-case string containing the name of the character set
     specified in the encoded string.
 
+    header may be a string that may or may not contain RFC2047 encoded words,
+    or it may be a Header object.
+
     An email.errors.HeaderParseError may be raised when certain decoding error
     occurs (e.g. a base64 decoding exception).
     """
+    # If it is a Header object, we can just return the chunks.
+    if hasattr(header, '_chunks'):
+        return list(header._chunks)
     # If no encoding, just return the header with no charset.
     if not ecre.search(header):
         return [(header, None)]
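With this change `decode_header()` accepts a `Header` instance as well as a raw string; strings without encoded words still come back as a single `(string, None)` chunk. For example:

```python
from email.header import decode_header

# An RFC 2047 encoded word decodes to (bytes, charset) pairs...
encoded = decode_header('=?iso-8859-1?q?p=F6stal?=')

# ...while a header with no encoded words passes through unchanged.
plain = decode_header('plain ascii text')
```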
@@ -15,7 +15,7 @@ from email.message import Message
 
 
 class Parser:
-    def __init__(self, *args, **kws):
+    def __init__(self, _class=Message):
         """Parser of RFC 2822 and MIME email messages.
 
         Creates an in-memory object tree representing the email message, which

@@ -31,27 +31,7 @@ class Parser:
         must be created. This class must have a constructor that can take
         zero arguments. Default is Message.Message.
         """
-        if len(args) >= 1:
-            if '_class' in kws:
-                raise TypeError("Multiple values for keyword arg '_class'")
-            kws['_class'] = args[0]
-        if len(args) == 2:
-            if 'strict' in kws:
-                raise TypeError("Multiple values for keyword arg 'strict'")
-            kws['strict'] = args[1]
-        if len(args) > 2:
-            raise TypeError('Too many arguments')
-        if '_class' in kws:
-            self._class = kws['_class']
-            del kws['_class']
-        else:
-            self._class = Message
-        if 'strict' in kws:
-            warnings.warn("'strict' argument is deprecated (and ignored)",
-                          DeprecationWarning, 2)
-            del kws['strict']
-        if kws:
-            raise TypeError('Unexpected keyword arguments')
+        self._class = _class
 
     def parse(self, fp, headersonly=False):
         """Create a message structure from the data in a file.
@@ -303,13 +303,13 @@ class EditorWindow(object):
         return "break"
 
     def home_callback(self, event):
-        if (event.state & 12) != 0 and event.keysym == "Home":
-            # state&1==shift, state&4==control, state&8==alt
-            return # <Modifier-Home>; fall back to class binding
+        if (event.state & 4) != 0 and event.keysym == "Home":
+            # state&4==Control. If <Control-Home>, use the Tk binding.
+            return
         if self.text.index("iomark") and \
            self.text.compare("iomark", "<=", "insert lineend") and \
            self.text.compare("insert linestart", "<=", "iomark"):
+            # In Shell on input line, go to just after prompt
             insertpt = int(self.text.index("iomark").split(".")[1])
         else:
             line = self.text.get("insert linestart", "insert lineend")

@@ -318,30 +318,27 @@ class EditorWindow(object):
                 break
         else:
             insertpt=len(line)
 
         lineat = int(self.text.index("insert").split('.')[1])
 
         if insertpt == lineat:
             insertpt = 0
 
         dest = "insert linestart+"+str(insertpt)+"c"
 
         if (event.state&1) == 0:
-            # shift not pressed
+            # shift was not pressed
             self.text.tag_remove("sel", "1.0", "end")
         else:
             if not self.text.index("sel.first"):
-                self.text.mark_set("anchor","insert")
-
+                self.text.mark_set("my_anchor", "insert") # there was no previous selection
+            else:
+                if self.text.compare(self.text.index("sel.first"), "<", self.text.index("insert")):
+                    self.text.mark_set("my_anchor", "sel.first") # extend back
+                else:
+                    self.text.mark_set("my_anchor", "sel.last") # extend forward
             first = self.text.index(dest)
-            last = self.text.index("anchor")
+            last = self.text.index("my_anchor")
             if self.text.compare(first,">",last):
                 first,last = last,first
-
             self.text.tag_remove("sel", "1.0", "end")
             self.text.tag_add("sel", first, last)
-
         self.text.mark_set("insert", dest)
         self.text.see("insert")
         return "break"
@@ -1,4 +1,15 @@
-What's New in IDLE 3.1?
+What's New in IDLE 3.1.4?
+=========================
+
+*Release date: XX-XXX-XX*
+
+- <Home> toggle failing on Tk 8.5, causing IDLE exits and strange selection
+  behavior. Issue 4676. Improve selection extension behaviour.
+
+- <Home> toggle non-functional when NumLock set on Windows. Issue 3851.
+
+
+What's New in IDLE 3.1b1?
 =========================
 
 *Release date: 27-Jun-09*
@@ -944,8 +944,14 @@ def getcallargs(func, *positional, **named):
             f_name, 'at most' if defaults else 'exactly', num_args,
             'arguments' if num_args > 1 else 'argument', num_total))
     elif num_args == 0 and num_total:
-        raise TypeError('%s() takes no arguments (%d given)' %
-                        (f_name, num_total))
+        if varkw or kwonlyargs:
+            if num_pos:
+                # XXX: We should use num_pos, but Python also uses num_total:
+                raise TypeError('%s() takes exactly 0 positional arguments '
+                                '(%d given)' % (f_name, num_total))
+        else:
+            raise TypeError('%s() takes no arguments (%d given)' %
+                            (f_name, num_total))
 
     for arg in itertools.chain(args, kwonlyargs):
         if arg in named:
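The branch above refines the `TypeError` that `inspect.getcallargs` raises when a function with only keyword(-style) parameters receives a positional argument. Its normal job is mapping call arguments onto parameter names:

```python
import inspect

def f(a, b=1, *, c=2, **kw):
    pass

# Bind the call f(10, c=3, d=4) without actually calling f.
mapping = inspect.getcallargs(f, 10, c=3, d=4)

# A positional argument to a function with no positional parameters
# triggers the TypeError whose message the patch is adjusting.
def g(**kw):
    pass

try:
    inspect.getcallargs(g, 1)
    err = None
except TypeError as exc:
    err = exc
```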
@@ -1,4 +1,4 @@
-# Copyright 2001-2010 by Vinay Sajip. All Rights Reserved.
+# Copyright 2001-2011 by Vinay Sajip. All Rights Reserved.
 #
 # Permission to use, copy, modify, and distribute this software and its
 # documentation for any purpose and without fee is hereby granted,

@@ -18,7 +18,7 @@
 Logging package for Python. Based on PEP 282 and comments thereto in
 comp.lang.python, and influenced by Apache's log4j system.
 
-Copyright (C) 2001-2010 Vinay Sajip. All Rights Reserved.
+Copyright (C) 2001-2011 Vinay Sajip. All Rights Reserved.
 
 To use, simply 'import logging' and log away!
 """
@@ -1826,10 +1826,10 @@ class NullHandler(Handler):
     package.
     """
     def handle(self, record):
-        pass
+        """Stub."""
 
     def emit(self, record):
-        pass
+        """Stub."""
 
     def createLock(self):
         self.lock = None
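`NullHandler` is the no-op handler a library attaches to its top-level logger so that applications which never configure logging don't get "no handlers could be found" complaints. Typical library-side use:

```python
import logging

# In the library's package __init__: records are swallowed unless the
# application installs real handlers on the root or 'mylib' logger.
logger = logging.getLogger('mylib')
logger.addHandler(logging.NullHandler())
logger.warning('dropped silently; NullHandler.emit() is a stub')
```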
@@ -172,11 +172,10 @@ def add_tables(db, module):
         add_data(db, table, getattr(module, table))
 
 def make_id(str):
-    #str = str.replace(".", "_") # colons are allowed
-    str = str.replace(" ", "_")
-    str = str.replace("-", "_")
-    if str[0] in string.digits:
-        str = "_"+str
+    identifier_chars = string.ascii_letters + string.digits + "._"
+    str = "".join([c if c in identifier_chars else "_" for c in str])
+    if str[0] in (string.digits + "."):
+        str = "_" + str
     assert re.match("^[A-Za-z_][A-Za-z0-9_.]*$", str), "FILE"+str
     return str
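The revised `make_id` replaces the old substitute-a-few-characters approach with a whitelist: every character outside `[A-Za-z0-9._]` becomes an underscore, and a leading digit or dot gets an underscore prefix. A self-contained sketch of the new logic (standalone function, not the `msilib` module itself):

```python
import re
import string

def make_id(s):
    # Map every character outside [A-Za-z0-9._] to "_", then prefix "_"
    # if the result starts with a digit or a dot.
    identifier_chars = string.ascii_letters + string.digits + "._"
    s = "".join(c if c in identifier_chars else "_" for c in s)
    if s[0] in (string.digits + "."):
        s = "_" + s
    assert re.match(r"^[A-Za-z_][A-Za-z0-9_.]*$", s)
    return s

ids = [make_id("foo bar"), make_id("3to2"), make_id("a+b.py")]
```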
@@ -284,19 +283,28 @@ class Directory:
                          [(feature.id, component)])
 
     def make_short(self, file):
+        oldfile = file
+        file = file.replace('+', '_')
         file = ''.join(c for c in file if not c in ' "/\[]:;=,')
         parts = file.split(".")
-        if len(parts)>1:
+        if len(parts) > 1:
             prefix = "".join(parts[:-1]).upper()
             suffix = parts[-1].upper()
+            if not prefix:
+                prefix = suffix
+                suffix = None
         else:
-            prefix = file.upper()
+            prefix = parts[0].upper()
             suffix = None
-        if len(prefix) <= 8 and (not suffix or len(suffix)<=3):
+        if len(parts) < 3 and len(prefix) <= 8 and file == oldfile and (
+                not suffix or len(suffix) <= 3):
             if suffix:
                 file = prefix+"."+suffix
             else:
                 file = prefix
-            assert file not in self.short_names
         else:
+            file = None
+        if file is None or file in self.short_names:
             prefix = prefix[:6]
             if suffix:
                 suffix = suffix[:3]
@@ -80,7 +80,9 @@ def RawArray(typecode_or_type, size_or_initializer):
     type_ = typecode_to_type.get(typecode_or_type, typecode_or_type)
     if isinstance(size_or_initializer, int):
         type_ = type_ * size_or_initializer
-        return _new_value(type_)
+        obj = _new_value(type_)
+        ctypes.memset(ctypes.addressof(obj), 0, ctypes.sizeof(obj))
+        return obj
     else:
         type_ = type_ * len(size_or_initializer)
         result = _new_value(type_)
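After this change, a `RawArray` built from an integer length is explicitly zero-filled with `ctypes.memset` rather than relying on whatever the shared-memory allocator left behind:

```python
from multiprocessing.sharedctypes import RawArray

# Length form: five C ints, guaranteed to start as zeros.
arr = RawArray('i', 5)
values = list(arr)

# Initializer form copies the given sequence instead.
arr2 = RawArray('i', [3, 1, 4])
```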
21  Lib/pydoc.py
@@ -165,7 +165,7 @@ def _split_list(s, predicate):
             no.append(x)
     return yes, no
 
-def visiblename(name, all=None):
+def visiblename(name, all=None, obj=None):
     """Decide whether to show documentation on a variable."""
     # Certain special names are redundant.
     if name in {'__builtins__', '__doc__', '__file__', '__path__',

@@ -175,6 +175,9 @@ def visiblename(name, all=None):
         return 0
     # Private names are hidden, but special names are displayed.
     if name.startswith('__') and name.endswith('__'): return 1
+    # Namedtuples have public fields and methods with a single leading underscore
+    if name.startswith('_') and hasattr(obj, '_fields'):
+        return True
     if all is not None:
         # only document that which the programmer exported in __all__
         return name in all

@@ -642,7 +645,7 @@ class HTMLDoc(Doc):
             # if __all__ exists, believe it. Otherwise use old heuristic.
             if (all is not None or
                 (inspect.getmodule(value) or object) is object):
-                if visiblename(key, all):
+                if visiblename(key, all, object):
                     classes.append((key, value))
                     cdict[key] = cdict[value] = '#' + key
         for key, value in classes:

@@ -658,13 +661,13 @@ class HTMLDoc(Doc):
             # if __all__ exists, believe it. Otherwise use old heuristic.
             if (all is not None or
                 inspect.isbuiltin(value) or inspect.getmodule(value) is object):
-                if visiblename(key, all):
+                if visiblename(key, all, object):
                     funcs.append((key, value))
                     fdict[key] = '#-' + key
                     if inspect.isfunction(value): fdict[value] = fdict[key]
         data = []
         for key, value in inspect.getmembers(object, isdata):
-            if visiblename(key, all):
+            if visiblename(key, all, object):
                 data.append((key, value))
 
         doc = self.markup(getdoc(object), self.preformat, fdict, cdict)

@@ -789,7 +792,7 @@ class HTMLDoc(Doc):
 
         attrs = [(name, kind, cls, value)
                  for name, kind, cls, value in classify_class_attrs(object)
-                 if visiblename(name)]
+                 if visiblename(name, obj=object)]
 
         mdict = {}
         for key, kind, homecls, value in attrs:

@@ -1056,18 +1059,18 @@ doubt, consult the module reference at the location listed above.
             # if __all__ exists, believe it. Otherwise use old heuristic.
             if (all is not None
                 or (inspect.getmodule(value) or object) is object):
-                if visiblename(key, all):
+                if visiblename(key, all, object):
                     classes.append((key, value))
         funcs = []
         for key, value in inspect.getmembers(object, inspect.isroutine):
             # if __all__ exists, believe it. Otherwise use old heuristic.
             if (all is not None or
                 inspect.isbuiltin(value) or inspect.getmodule(value) is object):
-                if visiblename(key, all):
+                if visiblename(key, all, object):
                     funcs.append((key, value))
         data = []
         for key, value in inspect.getmembers(object, isdata):
-            if visiblename(key, all):
+            if visiblename(key, all, object):
                 data.append((key, value))
 
         modpkgs = []

@@ -1206,7 +1209,7 @@ doubt, consult the module reference at the location listed above.
 
         attrs = [(name, kind, cls, value)
                  for name, kind, cls, value in classify_class_attrs(object)
-                 if visiblename(name)]
+                 if visiblename(name, obj=object)]
 
         while attrs:
             if mro:
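The new `obj` parameter lets `visiblename()` special-case namedtuples, whose single-underscore attributes (`_fields`, `_replace`, ...) are documented public API. Roughly:

```python
import collections
import pydoc

Point = collections.namedtuple('Point', ['x', 'y'])

# Underscore-prefixed names are normally hidden...
hidden = pydoc.visiblename('_helper', all=None, obj=object)

# ...but on a namedtuple (detected via its _fields attribute) they show.
shown = pydoc.visiblename('_fields', all=None, obj=Point)
```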
@@ -224,8 +224,7 @@ def escape(pattern):
     if isinstance(pattern, str):
         alphanum = _alphanum_str
         s = list(pattern)
-        for i in range(len(pattern)):
-            c = pattern[i]
+        for i, c in enumerate(pattern):
             if c not in alphanum:
                 if c == "\000":
                     s[i] = "\\000"
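The `escape()` cleanup is the standard idiom swap: `for i, c in enumerate(pattern)` yields index and character together, replacing the `range(len(...))` plus indexing pair. The surrounding algorithm, sketched standalone with a simplified character whitelist:

```python
import re  # only used to confirm the escaped pattern matches literally

pattern = "a.b*c"
alphanum = set("abcdefghijklmnopqrstuvwxyz0123456789")
s = list(pattern)
for i, c in enumerate(pattern):
    if c not in alphanum:
        s[i] = "\\" + c
escaped = "".join(s)
```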
@@ -1723,68 +1723,3 @@ class Popen(object):
         """Kill the process with SIGKILL
         """
         self.send_signal(signal.SIGKILL)
-
-
-def _demo_posix():
-    #
-    # Example 1: Simple redirection: Get process list
-    #
-    plist = Popen(["ps"], stdout=PIPE).communicate()[0]
-    print("Process list:")
-    print(plist)
-
-    #
-    # Example 2: Change uid before executing child
-    #
-    if os.getuid() == 0:
-        p = Popen(["id"], preexec_fn=lambda: os.setuid(100))
-        p.wait()
-
-    #
-    # Example 3: Connecting several subprocesses
-    #
-    print("Looking for 'hda'...")
-    p1 = Popen(["dmesg"], stdout=PIPE)
-    p2 = Popen(["grep", "hda"], stdin=p1.stdout, stdout=PIPE)
-    print(repr(p2.communicate()[0]))
-
-    #
-    # Example 4: Catch execution error
-    #
-    print()
-    print("Trying a weird file...")
-    try:
-        print(Popen(["/this/path/does/not/exist"]).communicate())
-    except OSError as e:
-        if e.errno == errno.ENOENT:
-            print("The file didn't exist. I thought so...")
-            print("Child traceback:")
-            print(e.child_traceback)
-        else:
-            print("Error", e.errno)
-    else:
-        print("Gosh. No error.", file=sys.stderr)
-
-
-def _demo_windows():
-    #
-    # Example 1: Connecting several subprocesses
-    #
-    print("Looking for 'PROMPT' in set output...")
-    p1 = Popen("set", stdout=PIPE, shell=True)
-    p2 = Popen('find "PROMPT"', stdin=p1.stdout, stdout=PIPE)
-    print(repr(p2.communicate()[0]))
-
-    #
-    # Example 2: Simple execution of program
-    #
-    print("Executing calc...")
-    p = Popen("calc")
-    p.wait()
-
-
-if __name__ == "__main__":
-    if mswindows:
-        _demo_windows()
-    else:
-        _demo_posix()
@@ -42,6 +42,9 @@ Selecting tests
    -- specify which special resource intensive tests to run
 -M/--memlimit LIMIT
    -- run very large memory-consuming tests
+--testdir DIR
+   -- execute test files in the specified directory (instead
+      of the Python stdlib test suite)
 
 Special runs

@@ -265,7 +268,7 @@ def main(tests=None, testdir=None, verbose=0, quiet=False,
              'use=', 'threshold=', 'trace', 'coverdir=', 'nocoverdir',
              'runleaks', 'huntrleaks=', 'memlimit=', 'randseed=',
              'multiprocess=', 'coverage', 'slaveargs=', 'forever', 'debug',
-             'start=', 'nowindows', 'header'])
+             'start=', 'nowindows', 'header', 'testdir='])
     except getopt.error as msg:
         usage(msg)

@@ -315,7 +318,9 @@ def main(tests=None, testdir=None, verbose=0, quiet=False,
         elif o in ('-T', '--coverage'):
             trace = True
         elif o in ('-D', '--coverdir'):
-            coverdir = os.path.join(os.getcwd(), a)
+            # CWD is replaced with a temporary dir before calling main(), so we
+            # need join it with the saved CWD so it goes where the user expects.
+            coverdir = os.path.join(support.SAVEDCWD, a)
         elif o in ('-N', '--nocoverdir'):
             coverdir = None
         elif o in ('-R', '--huntrleaks'):

@@ -393,6 +398,10 @@ def main(tests=None, testdir=None, verbose=0, quiet=False,
             print()  # Force a newline (just in case)
             print(json.dumps(result))
             sys.exit(0)
+        elif o == '--testdir':
+            # CWD is replaced with a temporary dir before calling main(), so we
+            # join it with the saved CWD so it ends up where the user expects.
+            testdir = os.path.join(support.SAVEDCWD, a)
         else:
             print(("No handler for option {}. Please report this as a bug "
                    "at http://bugs.python.org.").format(o), file=sys.stderr)

@@ -467,7 +476,13 @@ def main(tests=None, testdir=None, verbose=0, quiet=False,
         print("== ", os.getcwd())
         print("Testing with flags:", sys.flags)
 
-    alltests = findtests(testdir, stdtests, nottests)
+    # if testdir is set, then we are not running the python tests suite, so
+    # don't add default tests to be executed or skipped (pass empty values)
+    if testdir:
+        alltests = findtests(testdir, list(), set())
+    else:
+        alltests = findtests(testdir, stdtests, nottests)
 
     selected = tests or args or alltests
     if single:
         selected = selected[:1]

@@ -713,6 +728,8 @@ def main(tests=None, testdir=None, verbose=0, quiet=False,
     sys.exit(len(bad) > 0 or interrupted)
 
 
+# small set of tests to determine if we have a basically functioning interpreter
+# (i.e. if any of these fail, then anything else is likely to follow)
 STDTESTS = [
     'test_grammar',
     'test_opcodes',

@@ -725,10 +742,8 @@ STDTESTS = [
     'test_doctest2',
 ]
 
-NOTTESTS = {
-    'test_future1',
-    'test_future2',
-}
+# set of tests that we don't want to be executed when using regrtest
+NOTTESTS = set()
 
 def findtests(testdir=None, stdtests=STDTESTS, nottests=NOTTESTS):
     """Return a list of all applicable test modules."""
@@ -1029,6 +1029,11 @@ def bigmemtest(minsize, memuse):
     return decorator
 
+def precisionbigmemtest(size, memuse):
+    """Decorator for bigmem tests that need exact sizes.
+
+    Like bigmemtest, but without the size scaling upward to fill available
+    memory.
+    """
     def decorator(f):
         def wrapper(self):
             size = wrapper.size
@@ -2837,16 +2837,22 @@ class TestHelpFormattingMetaclass(type):
             parser = argparse.ArgumentParser(
                 *tester.parser_signature.args,
                 **tester.parser_signature.kwargs)
-            for argument_sig in tester.argument_signatures:
+            for argument_sig in getattr(tester, 'argument_signatures', []):
                 parser.add_argument(*argument_sig.args,
                                     **argument_sig.kwargs)
-            group_signatures = tester.argument_group_signatures
-            for group_sig, argument_sigs in group_signatures:
+            group_sigs = getattr(tester, 'argument_group_signatures', [])
+            for group_sig, argument_sigs in group_sigs:
                 group = parser.add_argument_group(*group_sig.args,
                                                   **group_sig.kwargs)
                 for argument_sig in argument_sigs:
                     group.add_argument(*argument_sig.args,
                                        **argument_sig.kwargs)
+            subparsers_sigs = getattr(tester, 'subparsers_signatures', [])
+            if subparsers_sigs:
+                subparsers = parser.add_subparsers()
+                for subparser_sig in subparsers_sigs:
+                    subparsers.add_parser(*subparser_sig.args,
+                                          **subparser_sig.kwargs)
             return parser
 
         def _test(self, tester, parser_text):
@@ -3940,6 +3946,108 @@ class TestHelpVersionAction(HelpTestCase):
         '''
     version = ''
 
+
+class TestHelpSubparsersOrdering(HelpTestCase):
+    """Test ordering of subcommands in help matches the code"""
+    parser_signature = Sig(prog='PROG',
+                           description='display some subcommands',
+                           version='0.1')
+
+    subparsers_signatures = [Sig(name=name)
+                             for name in ('a', 'b', 'c', 'd', 'e')]
+
+    usage = '''\
+        usage: PROG [-h] [-v] {a,b,c,d,e} ...
+        '''
+
+    help = usage + '''\
+
+        display some subcommands
+
+        positional arguments:
+          {a,b,c,d,e}
+
+        optional arguments:
+          -h, --help     show this help message and exit
+          -v, --version  show program's version number and exit
+        '''
+
+    version = '''\
+        0.1
+        '''
+
+class TestHelpSubparsersWithHelpOrdering(HelpTestCase):
+    """Test ordering of subcommands in help matches the code"""
+    parser_signature = Sig(prog='PROG',
+                           description='display some subcommands',
+                           version='0.1')
+
+    subcommand_data = (('a', 'a subcommand help'),
+                       ('b', 'b subcommand help'),
+                       ('c', 'c subcommand help'),
+                       ('d', 'd subcommand help'),
+                       ('e', 'e subcommand help'),
+                       )
+
+    subparsers_signatures = [Sig(name=name, help=help)
+                             for name, help in subcommand_data]
+
+    usage = '''\
+        usage: PROG [-h] [-v] {a,b,c,d,e} ...
+        '''
+
+    help = usage + '''\
+
+        display some subcommands
+
+        positional arguments:
+          {a,b,c,d,e}
+            a            a subcommand help
+            b            b subcommand help
+            c            c subcommand help
+            d            d subcommand help
+            e            e subcommand help
+
+        optional arguments:
+          -h, --help     show this help message and exit
+          -v, --version  show program's version number and exit
+        '''
+
+    version = '''\
+        0.1
+        '''
+
+
+class TestHelpMetavarTypeFormatter(HelpTestCase):
+    """"""
+
+    def custom_type(string):
+        return string
+
+    parser_signature = Sig(prog='PROG', description='description',
+                           formatter_class=argparse.MetavarTypeHelpFormatter)
+    argument_signatures = [Sig('a', type=int),
+                           Sig('-b', type=custom_type),
+                           Sig('-c', type=float, metavar='SOME FLOAT')]
+    argument_group_signatures = []
+    usage = '''\
+        usage: PROG [-h] [-b custom_type] [-c SOME FLOAT] int
+        '''
+    help = usage + '''\
+
+        description
+
+        positional arguments:
+          int
+
+        optional arguments:
+          -h, --help      show this help message and exit
+          -b custom_type
+          -c SOME FLOAT
+        '''
+    version = ''
+
+
 # =====================================
 # Optional/Positional constructor tests
 # =====================================
@@ -4394,6 +4502,177 @@ class TestParseKnownArgs(TestCase):
         self.assertEqual(NS(v=3, spam=True, badger="B"), args)
         self.assertEqual(["C", "--foo", "4"], extras)
 
+# ==========================
+# add_argument metavar tests
+# ==========================
+
+class TestAddArgumentMetavar(TestCase):
+
+    EXPECTED_MESSAGE = "length of metavar tuple does not match nargs"
+
+    def do_test_no_exception(self, nargs, metavar):
+        parser = argparse.ArgumentParser()
+        parser.add_argument("--foo", nargs=nargs, metavar=metavar)
+
+    def do_test_exception(self, nargs, metavar):
+        parser = argparse.ArgumentParser()
+        with self.assertRaises(ValueError) as cm:
+            parser.add_argument("--foo", nargs=nargs, metavar=metavar)
+        self.assertEqual(cm.exception.args[0], self.EXPECTED_MESSAGE)
+
+    # Unit tests for different values of metavar when nargs=None
+
+    def test_nargs_None_metavar_string(self):
+        self.do_test_no_exception(nargs=None, metavar="1")
+
+    def test_nargs_None_metavar_length0(self):
+        self.do_test_exception(nargs=None, metavar=tuple())
+
+    def test_nargs_None_metavar_length1(self):
+        self.do_test_no_exception(nargs=None, metavar=("1"))
+
+    def test_nargs_None_metavar_length2(self):
+        self.do_test_exception(nargs=None, metavar=("1", "2"))
+
+    def test_nargs_None_metavar_length3(self):
+        self.do_test_exception(nargs=None, metavar=("1", "2", "3"))
+
+    # Unit tests for different values of metavar when nargs=?
+
+    def test_nargs_optional_metavar_string(self):
+        self.do_test_no_exception(nargs="?", metavar="1")
+
+    def test_nargs_optional_metavar_length0(self):
+        self.do_test_exception(nargs="?", metavar=tuple())
+
+    def test_nargs_optional_metavar_length1(self):
+        self.do_test_no_exception(nargs="?", metavar=("1"))
+
+    def test_nargs_optional_metavar_length2(self):
+        self.do_test_exception(nargs="?", metavar=("1", "2"))
+
+    def test_nargs_optional_metavar_length3(self):
+        self.do_test_exception(nargs="?", metavar=("1", "2", "3"))
+
+    # Unit tests for different values of metavar when nargs=*
+
+    def test_nargs_zeroormore_metavar_string(self):
+        self.do_test_no_exception(nargs="*", metavar="1")
+
+    def test_nargs_zeroormore_metavar_length0(self):
+        self.do_test_exception(nargs="*", metavar=tuple())
+
+    def test_nargs_zeroormore_metavar_length1(self):
+        self.do_test_no_exception(nargs="*", metavar=("1"))
+
+    def test_nargs_zeroormore_metavar_length2(self):
+        self.do_test_no_exception(nargs="*", metavar=("1", "2"))
+
+    def test_nargs_zeroormore_metavar_length3(self):
+        self.do_test_exception(nargs="*", metavar=("1", "2", "3"))
+
+    # Unit tests for different values of metavar when nargs=+
+
+    def test_nargs_oneormore_metavar_string(self):
+        self.do_test_no_exception(nargs="+", metavar="1")
+
+    def test_nargs_oneormore_metavar_length0(self):
+        self.do_test_exception(nargs="+", metavar=tuple())
+
+    def test_nargs_oneormore_metavar_length1(self):
+        self.do_test_no_exception(nargs="+", metavar=("1"))
+
+    def test_nargs_oneormore_metavar_length2(self):
+        self.do_test_no_exception(nargs="+", metavar=("1", "2"))
+
+    def test_nargs_oneormore_metavar_length3(self):
+        self.do_test_exception(nargs="+", metavar=("1", "2", "3"))
+
+    # Unit tests for different values of metavar when nargs=...
+
+    def test_nargs_remainder_metavar_string(self):
+        self.do_test_no_exception(nargs="...", metavar="1")
+
+    def test_nargs_remainder_metavar_length0(self):
+        self.do_test_no_exception(nargs="...", metavar=tuple())
+
+    def test_nargs_remainder_metavar_length1(self):
+        self.do_test_no_exception(nargs="...", metavar=("1"))
+
+    def test_nargs_remainder_metavar_length2(self):
+        self.do_test_no_exception(nargs="...", metavar=("1", "2"))
+
+    def test_nargs_remainder_metavar_length3(self):
+        self.do_test_no_exception(nargs="...", metavar=("1", "2", "3"))
+
+    # Unit tests for different values of metavar when nargs=A...
+
+    def test_nargs_parser_metavar_string(self):
+        self.do_test_no_exception(nargs="A...", metavar="1")
+
+    def test_nargs_parser_metavar_length0(self):
+        self.do_test_exception(nargs="A...", metavar=tuple())
+
+    def test_nargs_parser_metavar_length1(self):
+        self.do_test_no_exception(nargs="A...", metavar=("1"))
+
+    def test_nargs_parser_metavar_length2(self):
+        self.do_test_exception(nargs="A...", metavar=("1", "2"))
+
+    def test_nargs_parser_metavar_length3(self):
+        self.do_test_exception(nargs="A...", metavar=("1", "2", "3"))
+
+    # Unit tests for different values of metavar when nargs=1
+
+    def test_nargs_1_metavar_string(self):
+        self.do_test_no_exception(nargs=1, metavar="1")
+
+    def test_nargs_1_metavar_length0(self):
+        self.do_test_exception(nargs=1, metavar=tuple())
+
+    def test_nargs_1_metavar_length1(self):
+        self.do_test_no_exception(nargs=1, metavar=("1"))
+
+    def test_nargs_1_metavar_length2(self):
+        self.do_test_exception(nargs=1, metavar=("1", "2"))
+
+    def test_nargs_1_metavar_length3(self):
+        self.do_test_exception(nargs=1, metavar=("1", "2", "3"))
+
+    # Unit tests for different values of metavar when nargs=2
+
+    def test_nargs_2_metavar_string(self):
+        self.do_test_no_exception(nargs=2, metavar="1")
+
+    def test_nargs_2_metavar_length0(self):
+        self.do_test_exception(nargs=2, metavar=tuple())
+
+    def test_nargs_2_metavar_length1(self):
+        self.do_test_no_exception(nargs=2, metavar=("1"))
+
+    def test_nargs_2_metavar_length2(self):
+        self.do_test_no_exception(nargs=2, metavar=("1", "2"))
+
+    def test_nargs_2_metavar_length3(self):
+        self.do_test_exception(nargs=2, metavar=("1", "2", "3"))
+
+    # Unit tests for different values of metavar when nargs=3
+
+    def test_nargs_3_metavar_string(self):
+        self.do_test_no_exception(nargs=3, metavar="1")
+
+    def test_nargs_3_metavar_length0(self):
+        self.do_test_exception(nargs=3, metavar=tuple())
+
+    def test_nargs_3_metavar_length1(self):
+        self.do_test_no_exception(nargs=3, metavar=("1"))
+
+    def test_nargs_3_metavar_length2(self):
+        self.do_test_exception(nargs=3, metavar=("1", "2"))
+
+    def test_nargs_3_metavar_length3(self):
+        self.do_test_no_exception(nargs=3, metavar=("1", "2", "3"))
+
 # ============================
 # from argparse import * tests
 # ============================
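These tests pin down the rule that a tuple `metavar` must supply exactly as many labels as `nargs` consumes (with `'*'` and `'+'` also accepting a length-2 tuple, and `'...'` accepting anything). The feature itself, shown on an option that takes two values:

```python
import argparse

parser = argparse.ArgumentParser(prog='demo')
# A 2-tuple metavar gives each of the two expected values its own label
# in the usage and help output.
parser.add_argument('--range', nargs=2, metavar=('START', 'END'))
usage = parser.format_usage()
```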
@@ -1,3 +1,13 @@
+"""Bigmem tests - tests for the 32-bit boundary in containers.
+
+These tests try to exercise the 32-bit boundary that is sometimes, if
+rarely, exceeded in practice, but almost never tested. They are really only
+meaningful on 64-bit builds on machines with a *lot* of memory, but the
+tests are always run, usually with very low memory limits to make sure the
+tests themselves don't suffer from bitrot. To run them for real, pass a
+high memory limit to regrtest, with the -M option.
+"""
+
 from test import support
 from test.support import bigmemtest, _1G, _2G, _4G, precisionbigmemtest
@@ -6,30 +16,45 @@ import operator
 import sys
 import functools
 
+# These tests all use one of the bigmemtest decorators to indicate how much
+# memory they use and how much memory they need to be even meaningful. The
+# decorators take two arguments: a 'memuse' indicator declaring
+# (approximate) bytes per size-unit the test will use (at peak usage), and a
+# 'minsize' indicator declaring a minimum *useful* size. A test that
+# allocates a bytestring to test various operations near the end will have a
+# minsize of at least 2Gb (or it wouldn't reach the 32-bit limit, so the
+# test wouldn't be very useful) and a memuse of 1 (one byte per size-unit,
+# if it allocates only one big string at a time.)
+#
+# When run with a memory limit set, both decorators skip tests that need
+# more memory than available to be meaningful. The precisionbigmemtest will
+# always pass minsize as size, even if there is much more memory available.
+# The bigmemtest decorator will scale size upward to fill available memory.
+#
 # Bigmem testing houserules:
 #
 #  - Try not to allocate too many large objects. It's okay to rely on
-#    refcounting semantics, but don't forget that 's = create_largestring()'
+#    refcounting semantics, and don't forget that 's = create_largestring()'
 #    doesn't release the old 's' (if it exists) until well after its new
 #    value has been created. Use 'del s' before the create_largestring call.
 #
-#  - Do *not* compare large objects using assertEqual or similar. It's a
-#    lengthy operation and the errormessage will be utterly useless due to
-#    its size. To make sure whether a result has the right contents, better
-#    to use the strip or count methods, or compare meaningful slices.
+#  - Do *not* compare large objects using assertEqual, assertIn or similar.
+#    It's a lengthy operation and the errormessage will be utterly useless
+#    due to its size. To make sure whether a result has the right contents,
+#    better to use the strip or count methods, or compare meaningful slices.
 #
 #  - Don't forget to test for large indices, offsets and results and such,
-#    in addition to large sizes.
+#    in addition to large sizes. Anything that probes the 32-bit boundary.
 #
 #  - When repeating an object (say, a substring, or a small list) to create
 #    a large object, make the subobject of a length that is not a power of
 #    2. That way, int-wrapping problems are more easily detected.
 #
 #  - While the bigmemtest decorator speaks of 'minsize', all tests will
 #    actually be called with a much smaller number too, in the normal
|
||||
# test run (5Kb currently.) This is so the tests themselves get frequent
|
||||
# testing. Consequently, always make all large allocations based on the
|
||||
# passed-in 'size', and don't rely on the size being very large. Also,
|
||||
# - While the bigmem decorators speak of 'minsize', all tests will actually
|
||||
# be called with a much smaller number too, in the normal test run (5Kb
|
||||
# currently.) This is so the tests themselves get frequent testing.
|
||||
# Consequently, always make all large allocations based on the passed-in
|
||||
# 'size', and don't rely on the size being very large. Also,
|
||||
# memuse-per-size should remain sane (less than a few thousand); if your
|
||||
# test uses more, adjust 'size' upward, instead.
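The decorator contract described in the comments above can be sketched as follows. This is a simplified, hypothetical stand-in for `test.support.bigmemtest`, not the real implementation; the names `bigmem_sketch`, `limit`, and `Demo` are illustrative only. It shows the two behaviors the comments document: scaling `size` up to fill the available memory budget, and otherwise falling back to a tiny smoke-test size so the test code itself still runs.

```python
import functools

_1M = 1024 * 1024

def bigmem_sketch(minsize, memuse, limit=5 * 1024):
    # Illustrative stand-in for a bigmemtest-style decorator: inject a
    # scaled 'size' argument into the test method.
    def decorator(func):
        @functools.wraps(func)
        def wrapper(self):
            if limit >= minsize * memuse:
                size = limit // memuse   # scale size upward to fill the budget
            else:
                size = 5 * 1024          # tiny smoke-test run, as described above
            return func(self, size)
        return wrapper
    return decorator

class Demo:
    @bigmem_sketch(minsize=_1M, memuse=1)
    def test_repeat(self, size):
        # Per the house rules: always allocate based on the passed-in 'size'.
        s = '-' * size
        return len(s)

print(Demo().test_repeat())  # prints 5120 (the smoke-test size, since limit < minsize)
```

Because the default `limit` here is far below `minsize * memuse`, the test still executes, just with the small fallback size; a real run would raise the limit to probe the 32-bit boundary.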
@@ -92,7 +117,7 @@ class BaseStrTest:
        _ = self.from_latin1
        s = _('-') * size
        tabsize = 8
        self.assertEqual(s.expandtabs(), s)
        self.assertTrue(s.expandtabs() == s)
        del s
        slen, remainder = divmod(size, tabsize)
        s = _(' \t') * slen

@@ -519,19 +544,19 @@ class BaseStrTest:
        edge = _('-') * (size // 2)
        s = _('').join([edge, SUBSTR, edge])
        del edge
        self.assertIn(SUBSTR, s)
        self.assertNotIn(SUBSTR * 2, s)
        self.assertIn(_('-'), s)
        self.assertNotIn(_('a'), s)
        self.assertTrue(SUBSTR in s)
        self.assertFalse(SUBSTR * 2 in s)
        self.assertTrue(_('-') in s)
        self.assertFalse(_('a') in s)
        s += _('a')
        self.assertIn(_('a'), s)
        self.assertTrue(_('a') in s)

    @bigmemtest(minsize=_2G + 10, memuse=2)
    def test_compare(self, size):
        _ = self.from_latin1
        s1 = _('-') * size
        s2 = _('-') * size
        self.assertEqual(s1, s2)
        self.assertTrue(s1 == s2)
        del s2
        s2 = s1 + _('a')
        self.assertFalse(s1 == s2)

@@ -552,7 +577,7 @@ class BaseStrTest:
        h1 = hash(s)
        del s
        s = _('\x00') * (size + 1)
        self.assertFalse(h1 == hash(s))
        self.assertNotEqual(h1, hash(s))


class StrTest(unittest.TestCase, BaseStrTest):

@@ -633,7 +658,7 @@ class StrTest(unittest.TestCase, BaseStrTest):
    def test_format(self, size):
        s = '-' * size
        sf = '%s' % (s,)
        self.assertEqual(s, sf)
        self.assertTrue(s == sf)
        del sf
        sf = '..%s..' % (s,)
        self.assertEqual(len(sf), len(s) + 4)

@@ -743,7 +768,7 @@ class TupleTest(unittest.TestCase):
    def test_compare(self, size):
        t1 = ('',) * size
        t2 = ('',) * size
        self.assertEqual(t1, t2)
        self.assertTrue(t1 == t2)
        del t2
        t2 = ('',) * (size + 1)
        self.assertFalse(t1 == t2)

@@ -774,9 +799,9 @@ class TupleTest(unittest.TestCase):
    def test_contains(self, size):
        t = (1, 2, 3, 4, 5) * size
        self.assertEqual(len(t), size * 5)
        self.assertIn(5, t)
        self.assertNotIn((1, 2, 3, 4, 5), t)
        self.assertNotIn(0, t)
        self.assertTrue(5 in t)
        self.assertFalse((1, 2, 3, 4, 5) in t)
        self.assertFalse(0 in t)

    @bigmemtest(minsize=_2G + 10, memuse=8)
    def test_hash(self, size):

@@ -879,7 +904,7 @@ class ListTest(unittest.TestCase):
    def test_compare(self, size):
        l1 = [''] * size
        l2 = [''] * size
        self.assertEqual(l1, l2)
        self.assertTrue(l1 == l2)
        del l2
        l2 = [''] * (size + 1)
        self.assertFalse(l1 == l2)

@@ -925,9 +950,9 @@ class ListTest(unittest.TestCase):
    def test_contains(self, size):
        l = [1, 2, 3, 4, 5] * size
        self.assertEqual(len(l), size * 5)
        self.assertIn(5, l)
        self.assertNotIn([1, 2, 3, 4, 5], l)
        self.assertNotIn(0, l)
        self.assertTrue(5 in l)
        self.assertFalse([1, 2, 3, 4, 5] in l)
        self.assertFalse(0 in l)

    @bigmemtest(minsize=_2G + 10, memuse=8)
    def test_hash(self, size):

@@ -31,12 +31,6 @@ class CmdLineTest(unittest.TestCase):
        self.verify_valid_flag('-O')
        self.verify_valid_flag('-OO')

    def test_q(self):
        self.verify_valid_flag('-Qold')
        self.verify_valid_flag('-Qnew')
        self.verify_valid_flag('-Qwarn')
        self.verify_valid_flag('-Qwarnall')

    def test_site_flag(self):
        self.verify_valid_flag('-S')

@@ -332,37 +332,12 @@ class TestNamedTuple(unittest.TestCase):
        # verify that _source can be run through exec()
        tmp = namedtuple('NTColor', 'red green blue')
        globals().pop('NTColor', None) # remove artifacts from other tests
        self.assertNotIn('NTColor', globals())
        exec(tmp._source, globals())
        self.assertIn('NTColor', globals())
        c = NTColor(10, 20, 30)
        self.assertEqual((c.red, c.green, c.blue), (10, 20, 30))
        self.assertEqual(NTColor._fields, ('red', 'green', 'blue'))
        globals().pop('NTColor', None) # clean-up after this test
        self.assertNotIn('NTColor', globals())

    def test_source_importable(self):
        tmp = namedtuple('Color', 'hue sat val')

        compiled = None
        source = TESTFN + '.py'
        with open(source, 'w') as f:
            print(tmp._source, file=f)

        if TESTFN in sys.modules:
            del sys.modules[TESTFN]
        try:
            mod = __import__(TESTFN)
            compiled = mod.__file__
            Color = mod.Color
            c = Color(10, 20, 30)
            self.assertEqual((c.hue, c.sat, c.val), (10, 20, 30))
            self.assertEqual(Color._fields, ('hue', 'sat', 'val'))
        finally:
            forget(TESTFN)
            if compiled:
                unlink(compiled)
            unlink(source)


################################################################################

@@ -0,0 +1,3 @@
from test.test_email import test_main

test_main()

@@ -3925,6 +3925,20 @@ A very long line that must get split to something other than at the
        h.append(x, errors='replace')
        eq(str(h), e)

    def test_escaped_8bit_header(self):
        x = b'Ynwp4dUEbay Auction Semiar- No Charge \x96 Earn Big'
        x = x.decode('ascii', 'surrogateescape')
        h = Header(x, charset=email.charset.UNKNOWN8BIT)
        self.assertEqual(str(h),
            'Ynwp4dUEbay Auction Semiar- No Charge \uFFFD Earn Big')
        self.assertEqual(email.header.decode_header(h), [(x, 'unknown-8bit')])

    def test_modify_returned_list_does_not_change_header(self):
        h = Header('test')
        chunks = email.header.decode_header(h)
        chunks.append(('ascii', 'test2'))
        self.assertEqual(str(h), 'test')

    def test_encoded_adjacent_nonencoded(self):
        eq = self.assertEqual
        h = Header()

@@ -13,14 +13,14 @@ def get_error_location(msg):
class FutureTest(unittest.TestCase):

    def test_future1(self):
        support.unload('test_future1')
        from test import test_future1
        self.assertEqual(test_future1.result, 6)
        support.unload('future_test1')
        from test import future_test1
        self.assertEqual(future_test1.result, 6)

    def test_future2(self):
        support.unload('test_future2')
        from test import test_future2
        self.assertEqual(test_future2.result, 6)
        support.unload('future_test2')
        from test import future_test2
        self.assertEqual(future_test2.result, 6)

    def test_future3(self):
        support.unload('test_future3')

@@ -632,6 +632,16 @@ class TestGetcallargsFunctions(unittest.TestCase):
        self.assertEqualCallArgs(f, '2, c=4, **collections.UserDict(b=3)')
        self.assertEqualCallArgs(f, 'b=2, **collections.UserDict(a=3, c=4)')

    def test_varkw_only(self):
        # issue11256:
        f = self.makeCallable('**c')
        self.assertEqualCallArgs(f, '')
        self.assertEqualCallArgs(f, 'a=1')
        self.assertEqualCallArgs(f, 'a=1, b=2')
        self.assertEqualCallArgs(f, 'c=3, **{"a": 1, "b": 2}')
        self.assertEqualCallArgs(f, '**collections.UserDict(a=1, b=2)')
        self.assertEqualCallArgs(f, 'c=3, **collections.UserDict(a=1, b=2)')

    def test_keyword_only(self):
        f = self.makeCallable('a=3, *, c, d=2')
        self.assertEqualCallArgs(f, 'c=3')

@@ -643,6 +653,11 @@ class TestGetcallargsFunctions(unittest.TestCase):
        self.assertEqualException(f, 'a=3')
        self.assertEqualException(f, 'd=4')

        f = self.makeCallable('*, c, d=2')
        self.assertEqualCallArgs(f, 'c=3')
        self.assertEqualCallArgs(f, 'c=3, d=4')
        self.assertEqualCallArgs(f, 'd=4, c=3')

    def test_multiple_features(self):
        f = self.makeCallable('a, b=2, *f, **g')
        self.assertEqualCallArgs(f, '2, 3, 7')

@@ -656,6 +671,17 @@ class TestGetcallargsFunctions(unittest.TestCase):
                                 '(4,[5,6])]), **collections.UserDict('
                                 'y=9, z=10)')

        f = self.makeCallable('a, b=2, *f, x, y=99, **g')
        self.assertEqualCallArgs(f, '2, 3, x=8')
        self.assertEqualCallArgs(f, '2, 3, x=8, *[(4,[5,6]), 7]')
        self.assertEqualCallArgs(f, '2, x=8, *[3, (4,[5,6]), 7], y=9, z=10')
        self.assertEqualCallArgs(f, 'x=8, *[2, 3, (4,[5,6])], y=9, z=10')
        self.assertEqualCallArgs(f, 'x=8, *collections.UserList('
                                 '[2, 3, (4,[5,6])]), q=0, **{"y":9, "z":10}')
        self.assertEqualCallArgs(f, '2, x=8, *collections.UserList([3, '
                                 '(4,[5,6])]), q=0, **collections.UserDict('
                                 'y=9, z=10)')

    def test_errors(self):
        f0 = self.makeCallable('')
        f1 = self.makeCallable('a, b')

@@ -692,6 +718,13 @@ class TestGetcallargsFunctions(unittest.TestCase):
        # - for functions and bound methods: unexpected keyword 'c'
        # - for unbound methods: multiple values for keyword 'a'
        #self.assertEqualException(f, '1, c=3, a=2')
        # issue11256:
        f3 = self.makeCallable('**c')
        self.assertEqualException(f3, '1, 2')
        self.assertEqualException(f3, '1, 2, a=1, b=2')
        f4 = self.makeCallable('*, a, b=0')
        self.assertEqualException(f4, '1, 2')
        self.assertEqualException(f4, '1, 2, a=1, b=2')

class TestGetcallargsMethods(TestGetcallargsFunctions):

@@ -69,11 +69,21 @@ class TestBasicOps(unittest.TestCase):
        self.assertEqual(list(accumulate('abc')), ['a', 'ab', 'abc']) # works with non-numeric
        self.assertEqual(list(accumulate([])), []) # empty iterable
        self.assertEqual(list(accumulate([7])), [7]) # iterable of length one
        self.assertRaises(TypeError, accumulate, range(10), 5) # too many args
        self.assertRaises(TypeError, accumulate, range(10), 5, 6) # too many args
        self.assertRaises(TypeError, accumulate) # too few args
        self.assertRaises(TypeError, accumulate, x=range(10)) # unexpected kwd arg
        self.assertRaises(TypeError, list, accumulate([1, []])) # args that don't add

        s = [2, 8, 9, 5, 7, 0, 3, 4, 1, 6]
        self.assertEqual(list(accumulate(s, min)),
                         [2, 2, 2, 2, 2, 0, 0, 0, 0, 0])
        self.assertEqual(list(accumulate(s, max)),
                         [2, 8, 9, 9, 9, 9, 9, 9, 9, 9])
        self.assertEqual(list(accumulate(s, operator.mul)),
                         [2, 16, 144, 720, 5040, 0, 0, 0, 0, 0])
        with self.assertRaises(TypeError):
            list(accumulate(s, chr)) # unary-operation

    def test_chain(self):

        def chain2(*iterables):

@@ -40,7 +40,7 @@ from socketserver import ThreadingTCPServer, StreamRequestHandler
import struct
import sys
import tempfile
from test.support import captured_stdout, run_with_locale, run_unittest
from test.support import captured_stdout, run_with_locale, run_unittest, patch
import textwrap
import unittest
import warnings

@@ -1082,28 +1082,39 @@ class WarningsTest(BaseTest):
    def test_warnings(self):
        with warnings.catch_warnings():
            logging.captureWarnings(True)
            try:
                warnings.filterwarnings("always", category=UserWarning)
                file = io.StringIO()
                h = logging.StreamHandler(file)
                logger = logging.getLogger("py.warnings")
                logger.addHandler(h)
                warnings.warn("I'm warning you...")
                logger.removeHandler(h)
                s = file.getvalue()
                h.close()
                self.assertTrue(s.find("UserWarning: I'm warning you...\n") > 0)
            self.addCleanup(lambda: logging.captureWarnings(False))
            warnings.filterwarnings("always", category=UserWarning)
            stream = io.StringIO()
            h = logging.StreamHandler(stream)
            logger = logging.getLogger("py.warnings")
            logger.addHandler(h)
            warnings.warn("I'm warning you...")
            logger.removeHandler(h)
            s = stream.getvalue()
            h.close()
            self.assertTrue(s.find("UserWarning: I'm warning you...\n") > 0)

                #See if an explicit file uses the original implementation
                file = io.StringIO()
                warnings.showwarning("Explicit", UserWarning, "dummy.py", 42,
                                     file, "Dummy line")
                s = file.getvalue()
                file.close()
                self.assertEqual(s,
                    "dummy.py:42: UserWarning: Explicit\n Dummy line\n")
            finally:
                logging.captureWarnings(False)
            #See if an explicit file uses the original implementation
            a_file = io.StringIO()
            warnings.showwarning("Explicit", UserWarning, "dummy.py", 42,
                                 a_file, "Dummy line")
            s = a_file.getvalue()
            a_file.close()
            self.assertEqual(s,
                "dummy.py:42: UserWarning: Explicit\n Dummy line\n")

    def test_warnings_no_handlers(self):
        with warnings.catch_warnings():
            logging.captureWarnings(True)
            self.addCleanup(lambda: logging.captureWarnings(False))

            # confirm our assumption: no loggers are set
            logger = logging.getLogger("py.warnings")
            assert logger.handlers == []

            warnings.showwarning("Explicit", UserWarning, "dummy.py", 42)
            self.assertTrue(len(logger.handlers) == 1)
            self.assertIsInstance(logger.handlers[0], logging.NullHandler)


def formatFunc(format, datefmt=None):

@@ -2007,6 +2018,11 @@ class ManagerTest(BaseTest):

        self.assertEqual(logged, ['should appear in logged'])

    def test_set_log_record_factory(self):
        man = logging.Manager(None)
        expected = object()
        man.setLogRecordFactory(expected)
        self.assertEqual(man.logRecordFactory, expected)

class ChildLoggerTest(BaseTest):
    def test_child_loggers(self):

@@ -2198,6 +2214,479 @@ class LastResortTest(BaseTest):
        logging.raiseExceptions = old_raise_exceptions


class FakeHandler:

    def __init__(self, identifier, called):
        for method in ('acquire', 'flush', 'close', 'release'):
            setattr(self, method, self.record_call(identifier, method, called))

    def record_call(self, identifier, method_name, called):
        def inner():
            called.append('{} - {}'.format(identifier, method_name))
        return inner


class RecordingHandler(logging.NullHandler):

    def __init__(self, *args, **kwargs):
        super(RecordingHandler, self).__init__(*args, **kwargs)
        self.records = []

    def handle(self, record):
        """Keep track of all the emitted records."""
        self.records.append(record)


class ShutdownTest(BaseTest):

    """Test suite for the shutdown method."""

    def setUp(self):
        super(ShutdownTest, self).setUp()
        self.called = []

        raise_exceptions = logging.raiseExceptions
        self.addCleanup(lambda: setattr(logging, 'raiseExceptions', raise_exceptions))

    def raise_error(self, error):
        def inner():
            raise error()
        return inner

    def test_no_failure(self):
        # create some fake handlers
        handler0 = FakeHandler(0, self.called)
        handler1 = FakeHandler(1, self.called)
        handler2 = FakeHandler(2, self.called)

        # create live weakref to those handlers
        handlers = map(logging.weakref.ref, [handler0, handler1, handler2])

        logging.shutdown(handlerList=list(handlers))

        expected = ['2 - acquire', '2 - flush', '2 - close', '2 - release',
                    '1 - acquire', '1 - flush', '1 - close', '1 - release',
                    '0 - acquire', '0 - flush', '0 - close', '0 - release']
        self.assertEqual(expected, self.called)

    def _test_with_failure_in_method(self, method, error):
        handler = FakeHandler(0, self.called)
        setattr(handler, method, self.raise_error(error))
        handlers = [logging.weakref.ref(handler)]

        logging.shutdown(handlerList=list(handlers))

        self.assertEqual('0 - release', self.called[-1])

    def test_with_ioerror_in_acquire(self):
        self._test_with_failure_in_method('acquire', IOError)

    def test_with_ioerror_in_flush(self):
        self._test_with_failure_in_method('flush', IOError)

    def test_with_ioerror_in_close(self):
        self._test_with_failure_in_method('close', IOError)

    def test_with_valueerror_in_acquire(self):
        self._test_with_failure_in_method('acquire', ValueError)

    def test_with_valueerror_in_flush(self):
        self._test_with_failure_in_method('flush', ValueError)

    def test_with_valueerror_in_close(self):
        self._test_with_failure_in_method('close', ValueError)

    def test_with_other_error_in_acquire_without_raise(self):
        logging.raiseExceptions = False
        self._test_with_failure_in_method('acquire', IndexError)

    def test_with_other_error_in_flush_without_raise(self):
        logging.raiseExceptions = False
        self._test_with_failure_in_method('flush', IndexError)

    def test_with_other_error_in_close_without_raise(self):
        logging.raiseExceptions = False
        self._test_with_failure_in_method('close', IndexError)

    def test_with_other_error_in_acquire_with_raise(self):
        logging.raiseExceptions = True
        self.assertRaises(IndexError, self._test_with_failure_in_method,
                          'acquire', IndexError)

    def test_with_other_error_in_flush_with_raise(self):
        logging.raiseExceptions = True
        self.assertRaises(IndexError, self._test_with_failure_in_method,
                          'flush', IndexError)

    def test_with_other_error_in_close_with_raise(self):
        logging.raiseExceptions = True
        self.assertRaises(IndexError, self._test_with_failure_in_method,
                          'close', IndexError)


class ModuleLevelMiscTest(BaseTest):

    """Test suite for some module level methods."""

    def test_disable(self):
        old_disable = logging.root.manager.disable
        # confirm our assumptions are correct
        assert old_disable == 0
        self.addCleanup(lambda: logging.disable(old_disable))

        logging.disable(83)
        self.assertEqual(logging.root.manager.disable, 83)

    def _test_log(self, method, level=None):
        called = []
        patch(self, logging, 'basicConfig',
              lambda *a, **kw: called.append((a, kw)))

        recording = RecordingHandler()
        logging.root.addHandler(recording)

        log_method = getattr(logging, method)
        if level is not None:
            log_method(level, "test me: %r", recording)
        else:
            log_method("test me: %r", recording)

        self.assertEqual(len(recording.records), 1)
        record = recording.records[0]
        self.assertEqual(record.getMessage(), "test me: %r" % recording)

        expected_level = level if level is not None else getattr(logging, method.upper())
        self.assertEqual(record.levelno, expected_level)

        # basicConfig was not called!
        self.assertEqual(called, [])

    def test_log(self):
        self._test_log('log', logging.ERROR)

    def test_debug(self):
        self._test_log('debug')

    def test_info(self):
        self._test_log('info')

    def test_warning(self):
        self._test_log('warning')

    def test_error(self):
        self._test_log('error')

    def test_critical(self):
        self._test_log('critical')

    def test_set_logger_class(self):
        self.assertRaises(TypeError, logging.setLoggerClass, object)

        class MyLogger(logging.Logger):
            pass

        logging.setLoggerClass(MyLogger)
        self.assertEqual(logging.getLoggerClass(), MyLogger)

        logging.setLoggerClass(logging.Logger)
        self.assertEqual(logging.getLoggerClass(), logging.Logger)


class BasicConfigTest(unittest.TestCase):

    """Test suite for logging.basicConfig."""

    def setUp(self):
        super(BasicConfigTest, self).setUp()
        handlers = logging.root.handlers
        self.addCleanup(lambda: setattr(logging.root, 'handlers', handlers))
        logging.root.handlers = []

    def tearDown(self):
        logging.shutdown()
        super(BasicConfigTest, self).tearDown()

    def test_no_kwargs(self):
        logging.basicConfig()

        # handler defaults to a StreamHandler to sys.stderr
        self.assertEqual(len(logging.root.handlers), 1)
        handler = logging.root.handlers[0]
        self.assertIsInstance(handler, logging.StreamHandler)
        self.assertEqual(handler.stream, sys.stderr)

        formatter = handler.formatter
        # format defaults to logging.BASIC_FORMAT
        self.assertEqual(formatter._style._fmt, logging.BASIC_FORMAT)
        # datefmt defaults to None
        self.assertIsNone(formatter.datefmt)
        # style defaults to %
        self.assertIsInstance(formatter._style, logging.PercentStyle)

        # level is not explicitly set
        self.assertEqual(logging.root.level, logging.WARNING)

    def test_filename(self):
        logging.basicConfig(filename='test.log')

        self.assertEqual(len(logging.root.handlers), 1)
        handler = logging.root.handlers[0]
        self.assertIsInstance(handler, logging.FileHandler)

        expected = logging.FileHandler('test.log', 'a')
        self.addCleanup(expected.close)
        self.assertEqual(handler.stream.mode, expected.stream.mode)
        self.assertEqual(handler.stream.name, expected.stream.name)

    def test_filemode(self):
        logging.basicConfig(filename='test.log', filemode='wb')

        handler = logging.root.handlers[0]
        expected = logging.FileHandler('test.log', 'wb')
        self.addCleanup(expected.close)
        self.assertEqual(handler.stream.mode, expected.stream.mode)

    def test_stream(self):
        stream = io.StringIO()
        self.addCleanup(stream.close)
        logging.basicConfig(stream=stream)

        self.assertEqual(len(logging.root.handlers), 1)
        handler = logging.root.handlers[0]
        self.assertIsInstance(handler, logging.StreamHandler)
        self.assertEqual(handler.stream, stream)

    def test_format(self):
        logging.basicConfig(format='foo')

        formatter = logging.root.handlers[0].formatter
        self.assertEqual(formatter._style._fmt, 'foo')

    def test_datefmt(self):
        logging.basicConfig(datefmt='bar')

        formatter = logging.root.handlers[0].formatter
        self.assertEqual(formatter.datefmt, 'bar')

    def test_style(self):
        logging.basicConfig(style='$')

        formatter = logging.root.handlers[0].formatter
        self.assertIsInstance(formatter._style, logging.StringTemplateStyle)

    def test_level(self):
        old_level = logging.root.level
        self.addCleanup(lambda: logging.root.setLevel(old_level))

        logging.basicConfig(level=57)
        self.assertEqual(logging.root.level, 57)

    def _test_log(self, method, level=None):
        # logging.root has no handlers so basicConfig should be called
        called = []

        old_basic_config = logging.basicConfig
        def my_basic_config(*a, **kw):
            old_basic_config()
            old_level = logging.root.level
            logging.root.setLevel(100) # avoid having messages in stderr
            self.addCleanup(lambda: logging.root.setLevel(old_level))
            called.append((a, kw))

        patch(self, logging, 'basicConfig', my_basic_config)

        log_method = getattr(logging, method)
        if level is not None:
            log_method(level, "test me")
        else:
            log_method("test me")

        # basicConfig was called with no arguments
        self.assertEqual(called, [((), {})])

    def test_log(self):
        self._test_log('log', logging.WARNING)

    def test_debug(self):
        self._test_log('debug')

    def test_info(self):
        self._test_log('info')

    def test_warning(self):
        self._test_log('warning')

    def test_error(self):
        self._test_log('error')

    def test_critical(self):
        self._test_log('critical')


class LoggerAdapterTest(unittest.TestCase):

    def setUp(self):
        super(LoggerAdapterTest, self).setUp()
        old_handler_list = logging._handlerList[:]

        self.recording = RecordingHandler()
        self.logger = logging.root
        self.logger.addHandler(self.recording)
        self.addCleanup(lambda: self.logger.removeHandler(self.recording))
        self.addCleanup(self.recording.close)

        def cleanup():
            logging._handlerList[:] = old_handler_list

        self.addCleanup(cleanup)
        self.addCleanup(logging.shutdown)
        self.adapter = logging.LoggerAdapter(logger=self.logger, extra=None)

    def test_exception(self):
        msg = 'testing exception: %r'
        exc = None
        try:
            assert False
        except AssertionError as e:
            exc = e
            self.adapter.exception(msg, self.recording)

        self.assertEqual(len(self.recording.records), 1)
        record = self.recording.records[0]
        self.assertEqual(record.levelno, logging.ERROR)
        self.assertEqual(record.msg, msg)
        self.assertEqual(record.args, (self.recording,))
        self.assertEqual(record.exc_info,
                         (exc.__class__, exc, exc.__traceback__))

    def test_critical(self):
        msg = 'critical test! %r'
        self.adapter.critical(msg, self.recording)

        self.assertEqual(len(self.recording.records), 1)
        record = self.recording.records[0]
        self.assertEqual(record.levelno, logging.CRITICAL)
        self.assertEqual(record.msg, msg)
        self.assertEqual(record.args, (self.recording,))

    def test_is_enabled_for(self):
        old_disable = self.adapter.logger.manager.disable
        self.adapter.logger.manager.disable = 33
        self.addCleanup(lambda: setattr(self.adapter.logger.manager,
                                        'disable', old_disable))
        self.assertFalse(self.adapter.isEnabledFor(32))

    def test_has_handlers(self):
        self.assertTrue(self.adapter.hasHandlers())

        for handler in self.logger.handlers:
            self.logger.removeHandler(handler)
        assert not self.logger.hasHandlers()

        self.assertFalse(self.adapter.hasHandlers())


class LoggerTest(BaseTest):

    def setUp(self):
        super(LoggerTest, self).setUp()
        self.recording = RecordingHandler()
        self.logger = logging.Logger(name='blah')
        self.logger.addHandler(self.recording)
        self.addCleanup(lambda: self.logger.removeHandler(self.recording))
        self.addCleanup(self.recording.close)
        self.addCleanup(logging.shutdown)

    def test_set_invalid_level(self):
        self.assertRaises(TypeError, self.logger.setLevel, object())

    def test_exception(self):
        msg = 'testing exception: %r'
        exc = None
        try:
            assert False
        except AssertionError as e:
            exc = e
            self.logger.exception(msg, self.recording)

        self.assertEqual(len(self.recording.records), 1)
        record = self.recording.records[0]
        self.assertEqual(record.levelno, logging.ERROR)
        self.assertEqual(record.msg, msg)
        self.assertEqual(record.args, (self.recording,))
        self.assertEqual(record.exc_info,
                         (exc.__class__, exc, exc.__traceback__))

    def test_log_invalid_level_with_raise(self):
        old_raise = logging.raiseExceptions
        self.addCleanup(lambda: setattr(logging, 'raiseExceptions', old_raise))

        logging.raiseExceptions = True
        self.assertRaises(TypeError, self.logger.log, '10', 'test message')

    def test_log_invalid_level_no_raise(self):
        old_raise = logging.raiseExceptions
        self.addCleanup(lambda: setattr(logging, 'raiseExceptions', old_raise))

        logging.raiseExceptions = False
        self.logger.log('10', 'test message') # no exception happens

    def test_find_caller_with_stack_info(self):
        called = []
        patch(self, logging.traceback, 'print_stack',
              lambda f, file: called.append(file.getvalue()))

        self.logger.findCaller(stack_info=True)

        self.assertEqual(len(called), 1)
        self.assertEqual('Stack (most recent call last):\n', called[0])

    def test_make_record_with_extra_overwrite(self):
        name = 'my record'
        level = 13
        fn = lno = msg = args = exc_info = func = sinfo = None
        rv = logging._logRecordFactory(name, level, fn, lno, msg, args,
                                       exc_info, func, sinfo)

        for key in ('message', 'asctime') + tuple(rv.__dict__.keys()):
            extra = {key: 'some value'}
            self.assertRaises(KeyError, self.logger.makeRecord, name, level,
                              fn, lno, msg, args, exc_info,
                              extra=extra, sinfo=sinfo)

    def test_make_record_with_extra_no_overwrite(self):
        name = 'my record'
        level = 13
        fn = lno = msg = args = exc_info = func = sinfo = None
        extra = {'valid_key': 'some value'}
        result = self.logger.makeRecord(name, level, fn, lno, msg, args,
                                        exc_info, extra=extra, sinfo=sinfo)
        self.assertIn('valid_key', result.__dict__)

    def test_has_handlers(self):
        self.assertTrue(self.logger.hasHandlers())

        for handler in self.logger.handlers:
            self.logger.removeHandler(handler)
        assert not self.logger.hasHandlers()

        self.assertFalse(self.logger.hasHandlers())

    def test_has_handlers_no_propagate(self):
        child_logger = logging.getLogger('blah.child')
        child_logger.propagate = False
        assert child_logger.handlers == []

        self.assertFalse(child_logger.hasHandlers())

    def test_is_enabled_for(self):
        old_disable = self.logger.manager.disable
        self.logger.manager.disable = 23
        self.addCleanup(lambda: setattr(self.logger.manager,
                                        'disable', old_disable))
        self.assertFalse(self.logger.isEnabledFor(22))


class BaseFileTest(BaseTest):
    "Base class for handler tests that write log files"

@ -2319,6 +2808,8 @@ def test_main():
|
|||
EncodingTest, WarningsTest, ConfigDictTest, ManagerTest,
|
||||
FormatterTest,
|
||||
LogRecordFactoryTest, ChildLoggerTest, QueueHandlerTest,
|
||||
ShutdownTest, ModuleLevelMiscTest, BasicConfigTest,
|
||||
LoggerAdapterTest, LoggerTest,
|
||||
RotatingFileHandlerTest,
|
||||
LastResortTest,
|
||||
TimedRotatingFileHandlerTest
|
||||
|
|
|
@@ -869,8 +869,6 @@ class TestMaildir(TestMailbox):
        self.assertFalse((perms & 0o111))  # Execute bits should all be off.

    def test_reread(self):
        # Wait for 2 seconds
        time.sleep(2)

        # Initially, the mailbox has not been read and the time is null.
        assert getattr(self._box, '_last_read', None) is None

@@ -879,15 +877,21 @@ class TestMaildir(TestMailbox):
        self._box._refresh()
        assert getattr(self._box, '_last_read', None) is not None

        # Try calling _refresh() again; the modification times shouldn't have
        # changed, so the mailbox should not be re-reading. Re-reading causes
        # the ._toc attribute to be assigned a new dictionary object, so
        # we'll check that the ._toc attribute isn't a different object.
        # Put the last modified times more than one second into the past
        # (because mtime has a one second granularity, a refresh is done
        # unconditionally if called for within the same second, just in case
        # the mbox has changed).
        for subdir in ('cur', 'new'):
            os.utime(os.path.join(self._box._path, subdir),
                     (time.time()-5,)*2)

        # Re-reading causes the ._toc attribute to be assigned a new dictionary
        # object, so we'll check that the ._toc attribute isn't a different
        # object.
        orig_toc = self._box._toc
        def refreshed():
            return self._box._toc is not orig_toc

        time.sleep(1)  # Wait 1sec to ensure time.time()'s value changes
        self._box._refresh()
        assert not refreshed()
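The test above relies on pushing a directory's modification time into the past so the one-second mtime granularity check sees it as unmodified. A minimal standalone sketch of that trick, outside the test suite (variable names are my own):

```python
import os
import tempfile
import time

# Push a freshly created directory's mtime several seconds into the past,
# the same way the test ages the 'cur' and 'new' Maildir subdirectories.
d = tempfile.mkdtemp()
past = time.time() - 5
os.utime(d, (past, past))        # (atime, mtime)
shifted = os.path.getmtime(d)    # now reads as ~5 seconds old
os.rmdir(d)
assert shifted <= time.time() - 4
```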
@@ -0,0 +1,46 @@
""" Test suite for the code in msilib """
import unittest
import os
from test.support import run_unittest, import_module
msilib = import_module('msilib')

class Test_make_id(unittest.TestCase):
    # http://msdn.microsoft.com/en-us/library/aa369212(v=vs.85).aspx
    """The Identifier data type is a text string. Identifiers may contain the
    ASCII characters A-Z (a-z), digits, underscores (_), or periods (.).
    However, every identifier must begin with either a letter or an
    underscore.
    """

    def test_is_no_change_required(self):
        self.assertEqual(
            msilib.make_id("short"), "short")
        self.assertEqual(
            msilib.make_id("nochangerequired"), "nochangerequired")
        self.assertEqual(
            msilib.make_id("one.dot"), "one.dot")
        self.assertEqual(
            msilib.make_id("_"), "_")
        self.assertEqual(
            msilib.make_id("a"), "a")
        #self.assertEqual(
        #    msilib.make_id(""), "")

    def test_invalid_first_char(self):
        self.assertEqual(
            msilib.make_id("9.short"), "_9.short")
        self.assertEqual(
            msilib.make_id(".short"), "_.short")

    def test_invalid_any_char(self):
        self.assertEqual(
            msilib.make_id(".s\x82ort"), "_.s_ort")
        self.assertEqual(
            msilib.make_id(".s\x82o?*+rt"), "_.s_o___rt")


def test_main():
    run_unittest(__name__)

if __name__ == '__main__':
    test_main()
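Since `msilib` itself is Windows-only, the sanitizing rule these tests exercise can be sketched in portable Python. `make_id_sketch` below is a hypothetical stand-in of my own, written from the Identifier rule quoted in the docstring; it is not the library's implementation:

```python
import re

def make_id_sketch(s: str) -> str:
    """Hypothetical portable equivalent of the msilib ID-sanitizing rule."""
    # Replace every character outside [A-Za-z0-9_.] with an underscore.
    s = re.sub(r'[^A-Za-z0-9_.]', '_', s)
    # An identifier must begin with a letter or an underscore.
    if not s or not (s[0].isalpha() or s[0] == '_'):
        s = '_' + s
    return s

# Mirrors the expected values asserted in the test cases above.
assert make_id_sketch("short") == "short"
assert make_id_sketch("9.short") == "_9.short"
assert make_id_sketch(".s\x82o?*+rt") == "_.s_o___rt"
```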
@@ -928,6 +928,21 @@ class _TestArray(BaseTestCase):

        self.assertEqual(list(arr[:]), seq)

    @unittest.skipIf(c_int is None, "requires _ctypes")
    def test_array_from_size(self):
        size = 10
        # Test for zeroing (see issue #11675).
        # The repetition below strengthens the test by increasing the chances
        # of previously allocated non-zero memory being used for the new array
        # on the 2nd and 3rd loops.
        for _ in range(3):
            arr = self.Array('i', size)
            self.assertEqual(len(arr), size)
            self.assertEqual(list(arr), [0] * size)
            arr[:] = range(10)
            self.assertEqual(list(arr), list(range(10)))
            del arr

    @unittest.skipIf(c_int is None, "requires _ctypes")
    def test_rawarray(self):
        self.test_array(raw=True)
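The behavior being tested (issue #11675: arrays created from an integer size come back zeroed) can be seen directly with the public API:

```python
from multiprocessing import Array

# An Array built from a size, not a sequence, is zero-initialized.
arr = Array('i', 5)
assert list(arr[:]) == [0] * 5

# Slice assignment then overwrites the zeros.
arr[:] = range(5)
assert list(arr[:]) == [0, 1, 2, 3, 4]
```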
@@ -12,9 +12,10 @@ import unittest
import xml.etree
import textwrap
from io import StringIO
from collections import namedtuple
from contextlib import contextmanager
from test.support import TESTFN, forget, rmtree, EnvironmentVarGuard, \
     reap_children, captured_output
     reap_children, captured_output, captured_stdout

from test import pydoc_mod

@@ -379,6 +380,15 @@ class PydocDocTest(unittest.TestCase):
        finally:
            pydoc.getpager = getpager_old

    def test_namedtuple_public_underscore(self):
        NT = namedtuple('NT', ['abc', 'def'], rename=True)
        with captured_stdout() as help_io:
            help(NT)
        helptext = help_io.getvalue()
        self.assertIn('_1', helptext)
        self.assertIn('_replace', helptext)
        self.assertIn('_asdict', helptext)


class TestDescriptions(unittest.TestCase):
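The `rename=True` trick used by this test is worth seeing on its own: invalid field names (here the keyword `def`) are replaced with positional underscore names like `_1`, which issue #11666 makes `help()` display:

```python
from collections import namedtuple

# 'def' is a keyword, so rename=True substitutes the positional name '_1'.
NT = namedtuple('NT', ['abc', 'def'], rename=True)
assert NT._fields == ('abc', '_1')

nt = NT(abc=1, _1=2)
assert nt._asdict()['_1'] == 2   # the underscore methods still work
```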
@@ -1,7 +1,9 @@
from test.support import verbose, run_unittest
import re
from re import Scanner
import sys, traceback
import sys
import string
import traceback
from weakref import proxy

# Misc tests from Tim Peters' re.doc

@@ -411,31 +413,62 @@ class ReTests(unittest.TestCase):
        self.assertEqual(re.search("\s(b)", " b").group(1), "b")
        self.assertEqual(re.search("a\s", "a ").group(0), "a ")

    def test_re_escape(self):
        p = ""
        self.assertEqual(re.escape(p), p)
        for i in range(0, 256):
            p = p + chr(i)
            self.assertEqual(re.match(re.escape(chr(i)), chr(i)) is not None,
                             True)
            self.assertEqual(re.match(re.escape(chr(i)), chr(i)).span(), (0,1))
    def assertMatch(self, pattern, text, match=None, span=None,
                    matcher=re.match):
        if match is None and span is None:
            # the pattern matches the whole text
            match = text
            span = (0, len(text))
        elif match is None or span is None:
            raise ValueError('If match is not None, span should be specified '
                             '(and vice versa).')
        m = matcher(pattern, text)
        self.assertTrue(m)
        self.assertEqual(m.group(), match)
        self.assertEqual(m.span(), span)

        pat = re.compile(re.escape(p))
        self.assertEqual(pat.match(p) is not None, True)
        self.assertEqual(pat.match(p).span(), (0,256))
    def test_re_escape(self):
        alnum_chars = string.ascii_letters + string.digits
        p = ''.join(chr(i) for i in range(256))
        for c in p:
            if c in alnum_chars:
                self.assertEqual(re.escape(c), c)
            elif c == '\x00':
                self.assertEqual(re.escape(c), '\\000')
            else:
                self.assertEqual(re.escape(c), '\\' + c)
            self.assertMatch(re.escape(c), c)
        self.assertMatch(re.escape(p), p)

    def test_re_escape_byte(self):
        p = b""
        self.assertEqual(re.escape(p), p)
        for i in range(0, 256):
        alnum_chars = (string.ascii_letters + string.digits).encode('ascii')
        p = bytes(range(256))
        for i in p:
            b = bytes([i])
            p += b
            self.assertEqual(re.match(re.escape(b), b) is not None, True)
            self.assertEqual(re.match(re.escape(b), b).span(), (0,1))
            if b in alnum_chars:
                self.assertEqual(re.escape(b), b)
            elif i == 0:
                self.assertEqual(re.escape(b), b'\\000')
            else:
                self.assertEqual(re.escape(b), b'\\' + b)
            self.assertMatch(re.escape(b), b)
        self.assertMatch(re.escape(p), p)

        pat = re.compile(re.escape(p))
        self.assertEqual(pat.match(p) is not None, True)
        self.assertEqual(pat.match(p).span(), (0,256))
    def test_re_escape_non_ascii(self):
        s = 'xxx\u2620\u2620\u2620xxx'
        s_escaped = re.escape(s)
        self.assertEqual(s_escaped, 'xxx\\\u2620\\\u2620\\\u2620xxx')
        self.assertMatch(s_escaped, s)
        self.assertMatch('.%s+.' % re.escape('\u2620'), s,
                         'x\u2620\u2620\u2620x', (2, 7), re.search)

    def test_re_escape_non_ascii_bytes(self):
        b = 'y\u2620y\u2620y'.encode('utf-8')
        b_escaped = re.escape(b)
        self.assertEqual(b_escaped, b'y\\\xe2\\\x98\\\xa0y\\\xe2\\\x98\\\xa0y')
        self.assertMatch(b_escaped, b)
        res = re.findall(re.escape('\u2620'.encode('utf-8')), b)
        self.assertEqual(len(res), 2)

    def pickle_test(self, pickle):
        oldpat = re.compile('a(?:b|(c|e){1,2}?|d)+?(.)')
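The core property all of these tests rely on holds in any Python version: `re.escape` neutralizes regex metacharacters so a literal string matches itself exactly. A minimal illustration:

```python
import re

literal = 'a.b*c'

# Unescaped, '.' is a wildcard: the pattern also matches 'axb'.
assert re.match('a.b', 'axb')

# Escaped, the pattern matches only the literal text.
assert re.fullmatch(re.escape(literal), 'a.b*c')
assert not re.match(re.escape(literal), 'axb*c')
```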
@@ -1325,6 +1325,7 @@ class POSIXProcessTestCase(BaseTestCase):
                             stdout=subprocess.PIPE,
                             bufsize=0)
        f = p.stdout
        self.addCleanup(f.close)
        try:
            self.assertEqual(f.read(4), b"appl")
            self.assertIn(f, select.select([f], [], [], 0.0)[0])
@@ -501,7 +501,7 @@ class SysModuleTest(unittest.TestCase):

    def test_sys_flags(self):
        self.assertTrue(sys.flags)
        attrs = ("debug", "division_warning",
        attrs = ("debug",
                 "inspect", "interactive", "optimize", "dont_write_bytecode",
                 "no_user_site", "no_site", "ignore_environment", "verbose",
                 "bytes_warning", "quiet")
@@ -12,6 +12,7 @@ import time


class URLTimeoutTest(unittest.TestCase):
    # XXX this test doesn't seem to test anything useful.

    TIMEOUT = 30.0

@@ -24,7 +25,7 @@ class URLTimeoutTest(unittest.TestCase):
    def testURLread(self):
        with support.transient_internet("www.python.org"):
            f = urllib.request.urlopen("http://www.python.org/")
            x = f.read()
        x = f.read()

class urlopenNetworkTests(unittest.TestCase):
    """Tests urllib.request.urlopen using the network.

@@ -43,8 +44,10 @@ class urlopenNetworkTests(unittest.TestCase):

    def urlopen(self, *args, **kwargs):
        resource = args[0]
        with support.transient_internet(resource):
            return urllib.request.urlopen(*args, **kwargs)
        cm = support.transient_internet(resource)
        cm.__enter__()
        self.addCleanup(cm.__exit__, None, None, None)
        return urllib.request.urlopen(*args, **kwargs)

    def test_basic(self):
        # Simple test expected to pass.

@@ -135,8 +138,10 @@ class urlretrieveNetworkTests(unittest.TestCase):

    def urlretrieve(self, *args):
        resource = args[0]
        with support.transient_internet(resource):
            return urllib.request.urlretrieve(*args)
        cm = support.transient_internet(resource)
        cm.__enter__()
        self.addCleanup(cm.__exit__, None, None, None)
        return urllib.request.urlretrieve(*args)

    def test_basic(self):
        # Test basic functionality.
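The pattern the two helpers above adopt (enter a context manager by hand and defer its `__exit__` to test teardown via `addCleanup`, instead of a `with` block that would tear the resource down before the caller uses it) can be sketched without the network. The `resource` context manager and the `cleanups` list are illustrative stand-ins of my own:

```python
from contextlib import contextmanager

events = []

@contextmanager
def resource():
    events.append('enter')
    yield 'handle'
    events.append('exit')

cleanups = []                      # stand-in for TestCase.addCleanup
cm = resource()
handle = cm.__enter__()            # open the resource now...
cleanups.append(lambda: cm.__exit__(None, None, None))  # ...close it later

assert handle == 'handle'
assert events == ['enter']         # still open when the helper returns

for fn in reversed(cleanups):      # teardown runs cleanups LIFO
    fn()
assert events == ['enter', 'exit']
```

A plain `with resource(): return handle` would have appended `'exit'` before the caller ever saw the handle, which is exactly the bug the diff fixes for `urlopen`.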
@@ -12,6 +12,7 @@ class XDRTest(unittest.TestCase):
        a = [b'what', b'is', b'hapnin', b'doctor']

        p.pack_int(42)
        p.pack_int(-17)
        p.pack_uint(9)
        p.pack_bool(True)
        p.pack_bool(False)

@@ -29,6 +30,7 @@ class XDRTest(unittest.TestCase):
        self.assertEqual(up.get_position(), 0)

        self.assertEqual(up.unpack_int(), 42)
        self.assertEqual(up.unpack_int(), -17)
        self.assertEqual(up.unpack_uint(), 9)
        self.assertTrue(up.unpack_bool() is True)
@@ -50,7 +50,9 @@ class Packer:
    def pack_uint(self, x):
        self.__buf.write(struct.pack('>L', x))

    pack_int = pack_uint
    def pack_int(self, x):
        self.__buf.write(struct.pack('>l', x))

    pack_enum = pack_int

    def pack_bool(self, x):
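The fix replaces the `pack_int = pack_uint` alias because the unsigned format `'>L'` rejects negative values while the signed `'>l'` accepts them (issue #9696). The underlying `struct` behavior is easy to confirm:

```python
import struct

# Signed big-endian 32-bit: -17 packs to its two's-complement bytes.
assert struct.pack('>l', -17) == b'\xff\xff\xff\xef'

# Unsigned big-endian 32-bit: a negative value is an error.
overflowed = False
try:
    struct.pack('>L', -17)
except struct.error:
    overflowed = True
assert overflowed
```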
@@ -358,6 +358,7 @@ Jonathan Hartley
Larry Hastings
Shane Hathaway
Rycharde Hawkes
Ben Hayden
Jochen Hayek
Christian Heimes
Thomas Heller

@@ -573,6 +574,7 @@ Chris McDonough
Greg McFarlane
Alan McIntyre
Michael McLay
Mark Mc Mahon
Gordon McMillan
Caolan McNamara
Andrew McNamara
Misc/NEWS

@@ -10,6 +10,9 @@ What's New in Python 3.3 Alpha 1?
Core and Builtins
-----------------

- Issue #10998: Remove mentions of -Q, sys.flags.division_warning and
  Py_DivisionWarningFlag left over from Python 2.

- Issue #11244: Remove an unnecessary peepholer check that was preventing
  negative zeros from being constant-folded properly.

@@ -84,8 +87,33 @@ Core and Builtins
Library
-------

- Issue #11662: Make urllib and urllib2 ignore redirections if the
  scheme is not HTTP, HTTPS or FTP (CVE-2011-1521).

- Removed the 'strict' argument to email.parser.Parser, which has been
  deprecated since Python 2.4.

- Issue #11256: Fix inspect.getcallargs on functions that take only keyword
  arguments.

- Issue #11696: Fix ID generation in msilib.

- itertools.accumulate now supports an optional *func* argument for
  a user-supplied binary function.

- Issue #11692: Remove unnecessary demo functions in subprocess module.

- Issue #9696: Fix exception incorrectly raised by xdrlib.Packer.pack_int when
  trying to pack a negative (in-range) integer.

- Issue #11675: multiprocessing.[Raw]Array objects created from an integer size
  are now zeroed on creation. This matches the behaviour specified by the
  documentation.

- Issue #7639: Fix short file name generation in bdist_msi.

- Issue #11659: Fix ResourceWarning in test_subprocess introduced by #11459.
  Patch by Ben Hayden.

- Issue #11635: Don't use polling in worker threads and processes launched by
  concurrent.futures.

- Issue #6811: Allow importlib to change a code object's co_filename attribute
  to match the path to where the source code currently is, not where the code

@@ -108,6 +136,9 @@ Library

- Issue #11628: cmp_to_key generated class should use __slots__.

- Issue #11666: let help() display named tuple attributes and methods
  that start with a leading underscore.

- Issue #5537: Fix time2isoz() and time2netscape() functions of
  httplib.cookiejar for expiration year greater than 2038 on 32-bit systems.

@@ -284,6 +315,18 @@ Library

- Issue #11388: Added a clear() method to MutableSequence.

- Issue #11174: Add argparse.MetavarTypeHelpFormatter, which uses type names
  for the names of optional and positional arguments in help messages.

- Issue #9348: Raise an early error if argparse nargs and metavar don't match.

- Issue #8982: Improve the documentation for the argparse Namespace object.

- Issue #9343: Document that argparse parent parsers must be configured before
  their children.

- Issue #9026: Fix order of argparse sub-commands in help messages.

Build
-----

@@ -299,6 +342,12 @@ Tools/Demos

- Issue #11179: Make ccbench work under Python 3.1 and 2.7 again.

Extensions
----------

- Issue #1838: Prevent segfault in ctypes, when _as_parameter_ on a class is set
  to an instance of the class.

Tests
-----
@@ -37,10 +37,6 @@ python \- an interpreted, interactive, object-oriented programming language
.B \-O0
]
[
.B -Q
.I argument
]
[
.B \-s
]
[

@@ -152,15 +148,6 @@ Discard docstrings in addition to the \fB-O\fP optimizations.
Do not print the version and copyright messages. These messages are
also suppressed in non-interactive mode.
.TP
.BI "\-Q " argument
Division control; see PEP 238. The argument must be one of "old" (the
default, int/int and long/long return an int or long), "new" (new
division semantics, i.e. int/int and long/long returns a float),
"warn" (old division semantics with a warning for int/int and
long/long), or "warnall" (old division semantics with a warning for
all use of the division operator). For a use of "warnall", see the
Tools/scripts/fixdiv.py script.
.TP
.B \-s
Don't add user site directory to sys.path.
.TP
@@ -585,7 +585,10 @@ static PyObject *
CDataType_from_param(PyObject *type, PyObject *value)
{
    PyObject *as_parameter;
    if (1 == PyObject_IsInstance(value, type)) {
    int res = PyObject_IsInstance(value, type);
    if (res == -1)
        return NULL;
    if (res) {
        Py_INCREF(value);
        return value;
    }

@@ -598,10 +601,14 @@ CDataType_from_param(PyObject *type, PyObject *value)

    /* If we got a PyCArgObject, we must check if the object packed in it
       is an instance of the type's dict->proto */
    if(dict && ob
       && PyObject_IsInstance(ob, dict->proto)) {
        Py_INCREF(value);
        return value;
    if(dict && ob) {
        res = PyObject_IsInstance(ob, dict->proto);
        if (res == -1)
            return NULL;
        if (res) {
            Py_INCREF(value);
            return value;
        }
    }
    ob_name = (ob) ? Py_TYPE(ob)->tp_name : "???";
    PyErr_Format(PyExc_TypeError,

@@ -951,8 +958,7 @@ PyCPointerType_from_param(PyObject *type, PyObject *value)
        Py_INCREF(value); /* _byref steals a refcount */
        return _byref(value);
    case -1:
        PyErr_Clear();
        break;
        return NULL;
    default:
        break;
    }

@@ -1431,6 +1437,7 @@ static PyObject *
c_wchar_p_from_param(PyObject *type, PyObject *value)
{
    PyObject *as_parameter;
    int res;
    if (value == Py_None) {
        Py_INCREF(Py_None);
        return Py_None;

@@ -1451,7 +1458,10 @@ c_wchar_p_from_param(PyObject *type, PyObject *value)
        }
        return (PyObject *)parg;
    }
    if (PyObject_IsInstance(value, type)) {
    res = PyObject_IsInstance(value, type);
    if (res == -1)
        return NULL;
    if (res) {
        Py_INCREF(value);
        return value;
    }

@@ -1492,6 +1502,7 @@ static PyObject *
c_char_p_from_param(PyObject *type, PyObject *value)
{
    PyObject *as_parameter;
    int res;
    if (value == Py_None) {
        Py_INCREF(Py_None);
        return Py_None;

@@ -1512,7 +1523,10 @@ c_char_p_from_param(PyObject *type, PyObject *value)
        }
        return (PyObject *)parg;
    }
    if (PyObject_IsInstance(value, type)) {
    res = PyObject_IsInstance(value, type);
    if (res == -1)
        return NULL;
    if (res) {
        Py_INCREF(value);
        return value;
    }

@@ -1554,6 +1568,7 @@ c_void_p_from_param(PyObject *type, PyObject *value)
{
    StgDictObject *stgd;
    PyObject *as_parameter;
    int res;

    /* None */
    if (value == Py_None) {

@@ -1631,7 +1646,10 @@ c_void_p_from_param(PyObject *type, PyObject *value)
        return (PyObject *)parg;
    }
    /* c_void_p instance (or subclass) */
    if (PyObject_IsInstance(value, type)) {
    res = PyObject_IsInstance(value, type);
    if (res == -1)
        return NULL;
    if (res) {
        /* c_void_p instances */
        Py_INCREF(value);
        return value;

@@ -1990,10 +2008,14 @@ PyCSimpleType_from_param(PyObject *type, PyObject *value)
    PyCArgObject *parg;
    struct fielddesc *fd;
    PyObject *as_parameter;
    int res;

    /* If the value is already an instance of the requested type,
       we can use it as is */
    if (1 == PyObject_IsInstance(value, type)) {
    res = PyObject_IsInstance(value, type);
    if (res == -1)
        return NULL;
    if (res) {
        Py_INCREF(value);
        return value;
    }

@@ -2022,7 +2044,12 @@ PyCSimpleType_from_param(PyObject *type, PyObject *value)

    as_parameter = PyObject_GetAttrString(value, "_as_parameter_");
    if (as_parameter) {
        if (Py_EnterRecursiveCall("while processing _as_parameter_")) {
            Py_DECREF(as_parameter);
            return NULL;
        }
        value = PyCSimpleType_from_param(type, as_parameter);
        Py_LeaveRecursiveCall();
        Py_DECREF(as_parameter);
        return value;
    }

@@ -2714,6 +2741,7 @@ _PyCData_set(CDataObject *dst, PyObject *type, SETFUNC setfunc, PyObject *value,
             Py_ssize_t size, char *ptr)
{
    CDataObject *src;
    int err;

    if (setfunc)
        return setfunc(ptr, value, size);

@@ -2754,7 +2782,10 @@ _PyCData_set(CDataObject *dst, PyObject *type, SETFUNC setfunc, PyObject *value,
    }
    src = (CDataObject *)value;

    if (PyObject_IsInstance(value, type)) {
    err = PyObject_IsInstance(value, type);
    if (err == -1)
        return NULL;
    if (err) {
        memcpy(ptr,
               src->b_ptr,
               size);

@@ -4749,14 +4780,17 @@ Pointer_set_contents(CDataObject *self, PyObject *value, void *closure)
    stgdict = PyObject_stgdict((PyObject *)self);
    assert(stgdict); /* Cannot be NULL for pointer instances */
    assert(stgdict->proto);
    if (!CDataObject_Check(value)
        || 0 == PyObject_IsInstance(value, stgdict->proto)) {
        /* XXX PyObject_IsInstance could return -1! */
        PyErr_Format(PyExc_TypeError,
                     "expected %s instead of %s",
                     ((PyTypeObject *)(stgdict->proto))->tp_name,
                     Py_TYPE(value)->tp_name);
        return -1;
    if (!CDataObject_Check(value)) {
        int res = PyObject_IsInstance(value, stgdict->proto);
        if (res == -1)
            return -1;
        if (!res) {
            PyErr_Format(PyExc_TypeError,
                         "expected %s instead of %s",
                         ((PyTypeObject *)(stgdict->proto))->tp_name,
                         Py_TYPE(value)->tp_name);
            return -1;
        }
    }

    dst = (CDataObject *)value;
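The recursion guard added around the `_as_parameter_` lookup (issue #1838) is observable from Python: a class whose `_as_parameter_` is an instance of itself used to send `from_param()` into unbounded C recursion and segfault; with the guard it surfaces as a catchable exception instead:

```python
import ctypes

# A self-referential _as_parameter_: resolving it never terminates.
class Evil:
    pass

evil = Evil()
Evil._as_parameter_ = evil

caught = False
try:
    ctypes.c_int.from_param(evil)
except RecursionError:
    caught = True
assert caught
```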
@@ -34,9 +34,9 @@
const char *
Py_GetBuildInfo(void)
{
    static char buildinfo[50 + sizeof HGVERSION +
                          ((sizeof HGTAG > sizeof HGBRANCH) ?
                          sizeof HGTAG : sizeof HGBRANCH)];
    static char buildinfo[50 + sizeof(HGVERSION) +
                          ((sizeof(HGTAG) > sizeof(HGBRANCH)) ?
                          sizeof(HGTAG) : sizeof(HGBRANCH))];
    const char *revision = _Py_hgversion();
    const char *sep = *revision ? ":" : "";
    const char *hgid = _Py_hgidentifier();
@@ -2590,6 +2590,7 @@ typedef struct {
    PyObject_HEAD
    PyObject *total;
    PyObject *it;
    PyObject *binop;
} accumulateobject;

static PyTypeObject accumulate_type;

@@ -2597,12 +2598,14 @@ static PyTypeObject accumulate_type;
static PyObject *
accumulate_new(PyTypeObject *type, PyObject *args, PyObject *kwds)
{
    static char *kwargs[] = {"iterable", NULL};
    static char *kwargs[] = {"iterable", "func", NULL};
    PyObject *iterable;
    PyObject *it;
    PyObject *binop = NULL;
    accumulateobject *lz;

    if (!PyArg_ParseTupleAndKeywords(args, kwds, "O:accumulate", kwargs, &iterable))
    if (!PyArg_ParseTupleAndKeywords(args, kwds, "O|O:accumulate",
                                     kwargs, &iterable, &binop))
        return NULL;

    /* Get iterator. */

@@ -2617,6 +2620,8 @@ accumulate_new(PyTypeObject *type, PyObject *args, PyObject *kwds)
        return NULL;
    }

    Py_XINCREF(binop);
    lz->binop = binop;
    lz->total = NULL;
    lz->it = it;
    return (PyObject *)lz;

@@ -2626,6 +2631,7 @@ static void
accumulate_dealloc(accumulateobject *lz)
{
    PyObject_GC_UnTrack(lz);
    Py_XDECREF(lz->binop);
    Py_XDECREF(lz->total);
    Py_XDECREF(lz->it);
    Py_TYPE(lz)->tp_free(lz);

@@ -2634,6 +2640,7 @@ accumulate_dealloc(accumulateobject *lz)
static int
accumulate_traverse(accumulateobject *lz, visitproc visit, void *arg)
{
    Py_VISIT(lz->binop);
    Py_VISIT(lz->it);
    Py_VISIT(lz->total);
    return 0;

@@ -2654,7 +2661,10 @@ accumulate_next(accumulateobject *lz)
        return lz->total;
    }

    newtotal = PyNumber_Add(lz->total, val);
    if (lz->binop == NULL)
        newtotal = PyNumber_Add(lz->total, val);
    else
        newtotal = PyObject_CallFunctionObjArgs(lz->binop, lz->total, val, NULL);
    Py_DECREF(val);
    if (newtotal == NULL)
        return NULL;

@@ -2668,9 +2678,9 @@ accumulate_next(accumulateobject *lz)
    }

PyDoc_STRVAR(accumulate_doc,
"accumulate(iterable) --> accumulate object\n\
"accumulate(iterable[, func]) --> accumulate object\n\
\n\
Return series of accumulated sums.");
Return series of accumulated sums (or other binary function results).");

static PyTypeObject accumulate_type = {
    PyVarObject_HEAD_INIT(NULL, 0)

@@ -3628,7 +3638,7 @@ cycle(p) --> p0, p1, ... plast, p0, p1, ...\n\
repeat(elem [,n]) --> elem, elem, elem, ... endlessly or up to n times\n\
\n\
Iterators terminating on the shortest input sequence:\n\
accumulate(p, start=0) --> p0, p0+p1, p0+p1+p2\n\
accumulate(p[, func]) --> p0, p0+p1, p0+p1+p2\n\
chain(p, q, ...) --> p0, p1, ... plast, q0, q1, ... \n\
compress(data, selectors) --> (d[0] if s[0]), (d[1] if s[1]), ...\n\
dropwhile(pred, seq) --> seq[n], seq[n+1], starting when pred fails\n\
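From the Python side, the new optional *func* argument wired in above simply swaps the running `PyNumber_Add` for any user-supplied binary callable:

```python
import operator
from itertools import accumulate

# Default: running sums.
assert list(accumulate([1, 2, 3, 4])) == [1, 3, 6, 10]

# With func: running products, or any binary function such as max.
assert list(accumulate([1, 2, 3, 4], operator.mul)) == [1, 2, 6, 24]
assert list(accumulate([5, 3, 8, 2], max)) == [5, 5, 8, 8]
```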
@@ -41,10 +41,6 @@ static PyBytesObject *nullstring;
#define PyBytesObject_SIZE (offsetof(PyBytesObject, ob_sval) + 1)

/*
   For both PyBytes_FromString() and PyBytes_FromStringAndSize(), the
   parameter `size' denotes number of characters to allocate, not counting any
   null terminating character.

   For PyBytes_FromString(), the parameter `str' points to a null-terminated
   string containing exactly `size' bytes.

@@ -61,8 +57,8 @@ static PyBytesObject *nullstring;

   The PyObject member `op->ob_size', which denotes the number of "extra
   items" in a variable-size object, will contain the number of bytes
   allocated for string data, not counting the null terminating character. It
   is therefore equal to the equal to the `size' parameter (for
   allocated for string data, not counting the null terminating character.
   It is therefore equal to the `size' parameter (for
   PyBytes_FromStringAndSize()) or the length of the string in the `str'
   parameter (for PyBytes_FromString()).
*/
@@ -29,8 +29,6 @@ _Py_GetRefTotal(void)
}
#endif /* Py_REF_DEBUG */

int Py_DivisionWarningFlag;

/* Object allocation routines used by NEWOBJ and NEWVAROBJ macros.
   These are used by the individual routines for object creation.
   Do not call them otherwise, they do not initialize the object! */
@@ -1312,7 +1312,6 @@ static PyTypeObject FlagsType;

static PyStructSequence_Field flags_fields[] = {
    {"debug", "-d"},
    {"division_warning", "-Q"},
    {"inspect", "-i"},
    {"interactive", "-i"},
    {"optimize", "-O or -OO"},

@@ -1336,9 +1335,9 @@ static PyStructSequence_Desc flags_desc = {
    flags__doc__,       /* doc */
    flags_fields,       /* fields */
#ifdef RISCOS
    13
#else
    12
#else
    11
#endif
};

@@ -1356,7 +1355,6 @@ make_flags(void)
    PyStructSequence_SET_ITEM(seq, pos++, PyLong_FromLong(flag))

    SetFlag(Py_DebugFlag);
    SetFlag(Py_DivisionWarningFlag);
    SetFlag(Py_InspectFlag);
    SetFlag(Py_InteractiveFlag);
    SetFlag(Py_OptimizeFlag);
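The removal of the `division_warning` field (issue #10998) is visible directly on `sys.flags`; the remaining fields are ints keyed to their command-line switches:

```python
import sys

# The Python 2 leftover is gone...
assert not hasattr(sys.flags, 'division_warning')

# ...while the surviving fields documented above remain integer-valued.
assert isinstance(sys.flags.optimize, int)
assert isinstance(sys.flags.dont_write_bytecode, int)
```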
@@ -1,7 +1,10 @@
@rem Used by the buildbot "clean" step.
call "%VS90COMNTOOLS%\..\..\VC\vcvarsall.bat" x86_amd64
cd PCbuild
@echo Deleting .pyc/.pyo files ...
del /s Lib\*.pyc Lib\*.pyo
@echo Deleting test leftovers ...
rmdir /s /q build
cd PCbuild
vcbuild /clean pcbuild.sln "Release|x64"
vcbuild /clean pcbuild.sln "Debug|x64"
cd ..
@@ -2,6 +2,9 @@
call "%VS90COMNTOOLS%vsvars32.bat"
@echo Deleting .pyc/.pyo files ...
del /s Lib\*.pyc Lib\*.pyo
@echo Deleting test leftovers ...
rmdir /s /q build
cd PCbuild
vcbuild /clean pcbuild.sln "Release|Win32"
vcbuild /clean pcbuild.sln "Debug|Win32"
cd ..