Merge heads

Serhiy Storchaka 2014-07-07 13:46:09 +03:00
commit 3bc13cc8b0
23 changed files with 223 additions and 94 deletions

View File

@ -664,62 +664,6 @@ before you write any of the actual code. Of course Python allows you to be
sloppy and not write test cases at all.
Why are default values shared between objects?
----------------------------------------------
This type of bug commonly bites neophyte programmers. Consider this function::
   def foo(mydict={}):    # Danger: shared reference to one dict for all calls
       ... compute something ...
       mydict[key] = value
       return mydict
The first time you call this function, ``mydict`` contains a single item. The
second time, ``mydict`` contains two items because when ``foo()`` begins
executing, ``mydict`` starts out with an item already in it.
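To see this in action, here is an illustrative interactive session; the function
body below is a concrete stand-in, since the version above is only a sketch::

   >>> def foo(mydict={}):
   ...     mydict[len(mydict)] = 'value'
   ...     return mydict
   ...
   >>> len(foo())          # first call: one item in the default dict
   1
   >>> len(foo())          # second call: the same dict, now two items
   2
   >>> foo() is foo()      # every call returns that one shared object
   True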
It is often expected that a function call creates new objects for default
values. This is not what happens. Default values are created exactly once, when
the function is defined. If that object is changed, like the dictionary in this
example, subsequent calls to the function will refer to this changed object.
By definition, immutable objects such as numbers, strings, tuples, and ``None``,
are safe from change. Changes to mutable objects such as dictionaries, lists,
and class instances can lead to confusion.
Because of this feature, it is good programming practice to not use mutable
objects as default values. Instead, use ``None`` as the default value and
inside the function, check if the parameter is ``None`` and create a new
list/dictionary/whatever if it is. For example, don't write::
   def foo(mydict={}):
       ...

but::

   def foo(mydict=None):
       if mydict is None:
           mydict = {}     # create a new dict for local namespace
This feature can be useful. When you have a function that's time-consuming to
compute, a common technique is to cache the parameters and the resulting value
of each call to the function, and return the cached value if the same value is
requested again. This is called "memoizing", and can be implemented like this::
   # Callers will never provide a third parameter for this function.
   def expensive(arg1, arg2, _cache={}):
       if (arg1, arg2) in _cache:
           return _cache[(arg1, arg2)]

       # Calculate the value
       result = ... expensive computation ...
       _cache[(arg1, arg2)] = result           # Store result in the cache
       return result
You could use a global variable containing a dictionary instead of the default
value; it's a matter of taste.
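As an aside that is not part of the FAQ text above, the standard library also
ships a ready-made memoizing decorator, functools.lru_cache (available since
Python 3.2), which avoids relying on a mutable default at all; a minimal
sketch::

   import functools

   @functools.lru_cache(maxsize=None)
   def expensive(arg1, arg2):
       # ... stand-in for the expensive computation ...
       return arg1 + arg2

   expensive(1, 2)   # computed and cached
   expensive(1, 2)   # served from the cache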
Why is there no goto?
---------------------

View File

@ -352,6 +352,62 @@ the import inside the class but outside of any method still causes the import to
occur when the module is initialized.
Why are default values shared between objects?
----------------------------------------------
This type of bug commonly bites neophyte programmers. Consider this function::
   def foo(mydict={}):    # Danger: shared reference to one dict for all calls
       ... compute something ...
       mydict[key] = value
       return mydict
The first time you call this function, ``mydict`` contains a single item. The
second time, ``mydict`` contains two items because when ``foo()`` begins
executing, ``mydict`` starts out with an item already in it.
It is often expected that a function call creates new objects for default
values. This is not what happens. Default values are created exactly once, when
the function is defined. If that object is changed, like the dictionary in this
example, subsequent calls to the function will refer to this changed object.
By definition, immutable objects such as numbers, strings, tuples, and ``None``,
are safe from change. Changes to mutable objects such as dictionaries, lists,
and class instances can lead to confusion.
Because of this feature, it is good programming practice to not use mutable
objects as default values. Instead, use ``None`` as the default value and
inside the function, check if the parameter is ``None`` and create a new
list/dictionary/whatever if it is. For example, don't write::
   def foo(mydict={}):
       ...

but::

   def foo(mydict=None):
       if mydict is None:
           mydict = {}     # create a new dict for local namespace
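With the ``None`` default, every call that omits the argument gets a fresh
dictionary. An illustrative session (the assignment inside the function is a
stand-in for the elided body)::

   >>> def foo(mydict=None):
   ...     if mydict is None:
   ...         mydict = {}   # create a new dict for local namespace
   ...     mydict['key'] = 'value'
   ...     return mydict
   ...
   >>> foo() is foo()          # two calls, two independent dicts
   False
   >>> sorted(foo({'a': 1}))   # an explicitly passed dict is still used
   ['a', 'key']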
This feature can be useful. When you have a function that's time-consuming to
compute, a common technique is to cache the parameters and the resulting value
of each call to the function, and return the cached value if the same value is
requested again. This is called "memoizing", and can be implemented like this::
   # Callers will never provide a third parameter for this function.
   def expensive(arg1, arg2, _cache={}):
       if (arg1, arg2) in _cache:
           return _cache[(arg1, arg2)]

       # Calculate the value
       result = ... expensive computation ...
       _cache[(arg1, arg2)] = result           # Store result in the cache
       return result
You could use a global variable containing a dictionary instead of the default
value; it's a matter of taste.
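To see the cache at work, here is a concrete stand-in for the sketch above;
the multiplication and the ``print()`` call merely mark when the real
computation would run::

   >>> def expensive(arg1, arg2, _cache={}):
   ...     if (arg1, arg2) in _cache:
   ...         return _cache[(arg1, arg2)]
   ...     print('computing...')
   ...     result = arg1 * arg2          # stand-in for the expensive part
   ...     _cache[(arg1, arg2)] = result
   ...     return result
   ...
   >>> expensive(3, 4)
   computing...
   12
   >>> expensive(3, 4)       # second call hits the cache, no recomputation
   12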
How can I pass optional or keyword parameters from one function to another?
---------------------------------------------------------------------------

View File

@ -12,7 +12,7 @@ standard input, a script, or from an interactive prompt.
A module can discover whether or not it is running in the main scope by
checking its own ``__name__``, which allows a common idiom for conditionally
executing code in a module when it is run as a script or with ``python
-m`` but not when it is imported:
-m`` but not when it is imported::
   if __name__ == "__main__":
       # execute only if run as a script
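A minimal self-contained sketch of the idiom (the ``main()`` helper and the
module name are illustrative, not part of the documentation change above)::

   def main():
       print("running as a script")

   if __name__ == "__main__":
       # executed for ``python mymodule.py`` or ``python -m mymodule``,
       # but not for ``import mymodule``
       main()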

View File

@ -651,7 +651,10 @@ Print ``Hello World`` every two seconds, using a callback::
   loop = asyncio.get_event_loop()
   loop.call_soon(print_and_repeat, loop)
   try:
       loop.run_forever()
   finally:
       loop.close()
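For context, the complete documentation example with the new ``try``/``finally``
reads roughly as follows; the body of ``print_and_repeat()`` is reconstructed
from the surrounding documentation, so treat it as an approximation::

   import asyncio

   def print_and_repeat(loop):
       print('Hello World')
       loop.call_later(2, print_and_repeat, loop)

   loop = asyncio.get_event_loop()
   loop.call_soon(print_and_repeat, loop)
   try:
       loop.run_forever()
   finally:
       loop.close()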
.. seealso::
@ -679,5 +682,8 @@ Register handlers for signals :py:data:`SIGINT` and :py:data:`SIGTERM`::
print("Event loop running forever, press CTRL+c to interrupt.")
print("pid %s: send SIGINT or SIGTERM to exit." % os.getpid())
try:
loop.run_forever()
finally:
loop.close()
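For context, a complete version of this example could look as follows; the
``ask_exit()`` handler is not shown in the hunk above and is an assumption
reconstructed from the asyncio documentation::

   import asyncio
   import functools
   import os
   import signal

   def ask_exit(signame):
       print("got signal %s: exit" % signame)
       loop.stop()

   loop = asyncio.get_event_loop()
   for signame in ('SIGINT', 'SIGTERM'):
       loop.add_signal_handler(getattr(signal, signame),
                               functools.partial(ask_exit, signame))

   print("Event loop running forever, press CTRL+c to interrupt.")
   print("pid %s: send SIGINT or SIGTERM to exit." % os.getpid())

   try:
       loop.run_forever()
   finally:
       loop.close()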

View File

@ -89,7 +89,10 @@ Print ``"Hello World"`` every two seconds using a coroutine::
           yield from asyncio.sleep(2)

   loop = asyncio.get_event_loop()
   try:
       loop.run_until_complete(greet_every_two_seconds())
   finally:
       loop.close()
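The complete coroutine example, with the ``try``/``finally`` added above, reads
roughly as follows (reconstructed from the fragment shown)::

   import asyncio

   @asyncio.coroutine
   def greet_every_two_seconds():
       while True:
           print('Hello World')
           yield from asyncio.sleep(2)

   loop = asyncio.get_event_loop()
   try:
       loop.run_until_complete(greet_every_two_seconds())
   finally:
       loop.close()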
.. seealso::

View File

@ -64,6 +64,12 @@ class CoroWrapper:
        self.gen = gen
        self.func = func
        self._source_traceback = traceback.extract_stack(sys._getframe(1))
        # __name__, __qualname__, __doc__ attributes are set by the coroutine()
        # decorator

    def __repr__(self):
        return ('<%s %s>'
                % (self.__class__.__name__, _format_coroutine(self)))

    def __iter__(self):
        return self

View File

@ -316,6 +316,12 @@ class Future:
    # So-called internal methods (note: no set_running_or_notify_cancel()).

    def _set_result_unless_cancelled(self, result):
        """Helper setting the result only if the future was not cancelled."""
        if self.cancelled():
            return
        self.set_result(result)

    def set_result(self, result):
        """Mark the future done and set its result.

View File

@ -38,7 +38,7 @@ class _ProactorBasePipeTransport(transports._FlowControlMixin,
            self._server.attach(self)
        self._loop.call_soon(self._protocol.connection_made, self)
        if waiter is not None:
-           self._loop.call_soon(waiter.set_result, None)
+           self._loop.call_soon(waiter._set_result_unless_cancelled, None)

    def _set_extra(self, sock):
        self._extra['pipe'] = sock

View File

@ -173,7 +173,7 @@ class Queue:
            # run, we need to defer the put for a tick to ensure that
            # getters and putters alternate perfectly. See
            # ChannelTest.test_wait.
-           self._loop.call_soon(putter.set_result, None)
+           self._loop.call_soon(putter._set_result_unless_cancelled, None)

            return self._get()
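These call sites switch from ``set_result()`` to ``_set_result_unless_cancelled()``
because the waiter future may be cancelled before the scheduled callback runs,
and ``Future.set_result()`` raises ``InvalidStateError`` on a cancelled future.
A minimal sketch of the new helper's behaviour, mirroring the unit test added
later in this commit::

   import asyncio

   loop = asyncio.new_event_loop()
   fut = asyncio.Future(loop=loop)
   fut.cancel()

   # set_result() would raise InvalidStateError here; the helper
   # simply does nothing when the future is already cancelled.
   fut._set_result_unless_cancelled(42)
   assert fut.cancelled()

   loop.close()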

View File

@ -481,7 +481,7 @@ class _SelectorSocketTransport(_SelectorTransport):
        self._loop.add_reader(self._sock_fd, self._read_ready)
        self._loop.call_soon(self._protocol.connection_made, self)
        if waiter is not None:
-           self._loop.call_soon(waiter.set_result, None)
+           self._loop.call_soon(waiter._set_result_unless_cancelled, None)

    def pause_reading(self):
        if self._closing:
@ -690,7 +690,8 @@ class _SelectorSslTransport(_SelectorTransport):
        self._loop.add_reader(self._sock_fd, self._read_ready)
        self._loop.call_soon(self._protocol.connection_made, self)
        if self._waiter is not None:
-           self._loop.call_soon(self._waiter.set_result, None)
+           self._loop.call_soon(self._waiter._set_result_unless_cancelled,
+                                None)

    def pause_reading(self):
        # XXX This is a bit icky, given the comment at the top of

View File

@ -487,7 +487,8 @@ def as_completed(fs, *, loop=None, timeout=None):
def sleep(delay, result=None, *, loop=None):
    """Coroutine that completes after a given time (in seconds)."""
    future = futures.Future(loop=loop)
-   h = future._loop.call_later(delay, future.set_result, result)
+   h = future._loop.call_later(delay,
+                               future._set_result_unless_cancelled, result)
    try:
        return (yield from future)
    finally:

View File

@ -269,7 +269,7 @@ class _UnixReadPipeTransport(transports.ReadTransport):
        self._loop.add_reader(self._fileno, self._read_ready)
        self._loop.call_soon(self._protocol.connection_made, self)
        if waiter is not None:
-           self._loop.call_soon(waiter.set_result, None)
+           self._loop.call_soon(waiter._set_result_unless_cancelled, None)

    def _read_ready(self):
        try:
@ -353,7 +353,7 @@ class _UnixWritePipeTransport(transports._FlowControlMixin,
        self._loop.call_soon(self._protocol.connection_made, self)
        if waiter is not None:
-           self._loop.call_soon(waiter.set_result, None)
+           self._loop.call_soon(waiter._set_result_unless_cancelled, None)

    def get_write_buffer_size(self):
        return sum(len(data) for data in self._buffer)

View File

@ -179,7 +179,8 @@ def customize_compiler(compiler):
            # version and build tools may not support the same set
            # of CPU architectures for universal builds.
            global _config_vars
-           if not _config_vars.get('CUSTOMIZED_OSX_COMPILER', ''):
+           # Use get_config_var() to ensure _config_vars is initialized.
+           if not get_config_var('CUSTOMIZED_OSX_COMPILER'):
                import _osx_support
                _osx_support.customize_compiler(_config_vars)
                _config_vars['CUSTOMIZED_OSX_COMPILER'] = 'True'

View File

@ -1,6 +1,9 @@
"""Tests for distutils.sysconfig."""
import os
import shutil
import subprocess
import sys
import textwrap
import unittest
from distutils import sysconfig
@ -174,6 +177,25 @@ class SysconfigTestCase(support.EnvironGuard, unittest.TestCase):
        self.assertIsNotNone(vars['SO'])
        self.assertEqual(vars['SO'], vars['EXT_SUFFIX'])

    def test_customize_compiler_before_get_config_vars(self):
        # Issue #21923: test that a Distribution compiler
        # instance can be called without an explicit call to
        # get_config_vars().
        with open(TESTFN, 'w') as f:
            f.writelines(textwrap.dedent('''\
                from distutils.core import Distribution
                config = Distribution().get_command_obj('config')
                # try_compile may pass or it may fail if no compiler
                # is found but it should not raise an exception.
                rc = config.try_compile('int x;')
            '''))
        p = subprocess.Popen([str(sys.executable), TESTFN],
                stdout=subprocess.PIPE,
                stderr=subprocess.STDOUT,
                universal_newlines=True)
        outs, errs = p.communicate()
        self.assertEqual(0, p.returncode, "Subprocess failed: " + outs)


def test_suite():
    suite = unittest.TestSuite()

View File

@ -749,17 +749,20 @@ class PurePath(object):
"""Return a new path with the file name changed."""
if not self.name:
raise ValueError("%r has an empty name" % (self,))
drv, root, parts = self._flavour.parse_parts((name,))
if (not name or name[-1] in [self._flavour.sep, self._flavour.altsep]
or drv or root or len(parts) != 1):
raise ValueError("Invalid name %r" % (name))
return self._from_parsed_parts(self._drv, self._root,
self._parts[:-1] + [name])
def with_suffix(self, suffix):
"""Return a new path with the file suffix changed (or added, if none)."""
# XXX if suffix is None, should the current suffix be removed?
drv, root, parts = self._flavour.parse_parts((suffix,))
if drv or root or len(parts) != 1:
f = self._flavour
if f.sep in suffix or f.altsep and f.altsep in suffix:
raise ValueError("Invalid suffix %r" % (suffix))
suffix = parts[0]
if not suffix.startswith('.'):
if suffix and not suffix.startswith('.') or suffix == '.':
raise ValueError("Invalid suffix %r" % (suffix))
name = self.name
if not name:

View File

@ -343,6 +343,12 @@ class FutureTests(test_utils.TestCase):
        message = m_log.error.call_args[0][0]
        self.assertRegex(message, re.compile(regex, re.DOTALL))

    def test_set_result_unless_cancelled(self):
        fut = asyncio.Future(loop=self.loop)
        fut.cancel()
        fut._set_result_unless_cancelled(2)
        self.assertTrue(fut.cancelled())


class FutureDoneCallbackTests(test_utils.TestCase):

View File

@ -211,6 +211,10 @@ class TaskTests(test_utils.TestCase):
        coro = ('%s() at %s:%s'
                % (coro_qualname, code.co_filename, code.co_firstlineno))

        # test repr(CoroWrapper)
        if coroutines._DEBUG:
            self.assertEqual(repr(gen), '<CoroWrapper %s>' % coro)

        # test pending Task
        t = asyncio.Task(gen, loop=self.loop)
        t.add_done_callback(Dummy())

View File

@ -1,5 +1,6 @@
import gc
import sys
import types
import unittest
import weakref
@ -109,6 +110,57 @@ class ClearTest(unittest.TestCase):
        self.assertIs(None, wr())


class FrameLocalsTest(unittest.TestCase):
    """
    Tests for the .f_locals attribute.
    """

    def make_frames(self):
        def outer():
            x = 5
            y = 6
            def inner():
                z = x + 2
                1/0
                t = 9
            return inner()
        try:
            outer()
        except ZeroDivisionError as e:
            tb = e.__traceback__
            frames = []
            while tb:
                frames.append(tb.tb_frame)
                tb = tb.tb_next
        return frames

    def test_locals(self):
        f, outer, inner = self.make_frames()
        outer_locals = outer.f_locals
        self.assertIsInstance(outer_locals.pop('inner'), types.FunctionType)
        self.assertEqual(outer_locals, {'x': 5, 'y': 6})
        inner_locals = inner.f_locals
        self.assertEqual(inner_locals, {'x': 5, 'z': 7})

    def test_clear_locals(self):
        # Test f_locals after clear() (issue #21897)
        f, outer, inner = self.make_frames()
        outer.clear()
        inner.clear()
        self.assertEqual(outer.f_locals, {})
        self.assertEqual(inner.f_locals, {})

    def test_locals_clear_locals(self):
        # Test f_locals before and after clear() (to exercise caching)
        f, outer, inner = self.make_frames()
        outer.f_locals
        inner.f_locals
        outer.clear()
        inner.clear()
        self.assertEqual(outer.f_locals, {})
        self.assertEqual(inner.f_locals, {})


def test_main():
    support.run_unittest(__name__)

View File

@ -540,6 +540,10 @@ class _BasePurePathTest(object):
        self.assertRaises(ValueError, P('').with_name, 'd.xml')
        self.assertRaises(ValueError, P('.').with_name, 'd.xml')
        self.assertRaises(ValueError, P('/').with_name, 'd.xml')
        self.assertRaises(ValueError, P('a/b').with_name, '')
        self.assertRaises(ValueError, P('a/b').with_name, '/c')
        self.assertRaises(ValueError, P('a/b').with_name, 'c/')
        self.assertRaises(ValueError, P('a/b').with_name, 'c/d')

    def test_with_suffix_common(self):
        P = self.cls
@ -547,6 +551,9 @@ class _BasePurePathTest(object):
        self.assertEqual(P('/a/b').with_suffix('.gz'), P('/a/b.gz'))
        self.assertEqual(P('a/b.py').with_suffix('.gz'), P('a/b.gz'))
        self.assertEqual(P('/a/b.py').with_suffix('.gz'), P('/a/b.gz'))
        # Stripping suffix
        self.assertEqual(P('a/b.py').with_suffix(''), P('a/b'))
        self.assertEqual(P('/a/b').with_suffix(''), P('/a/b'))
        # Path doesn't have a "filename" component
        self.assertRaises(ValueError, P('').with_suffix, '.gz')
        self.assertRaises(ValueError, P('.').with_suffix, '.gz')
@ -554,9 +561,12 @@ class _BasePurePathTest(object):
        # Invalid suffix
        self.assertRaises(ValueError, P('a/b').with_suffix, 'gz')
        self.assertRaises(ValueError, P('a/b').with_suffix, '/')
        self.assertRaises(ValueError, P('a/b').with_suffix, '.')
        self.assertRaises(ValueError, P('a/b').with_suffix, '/.gz')
        self.assertRaises(ValueError, P('a/b').with_suffix, 'c/d')
        self.assertRaises(ValueError, P('a/b').with_suffix, '.c/.d')
        self.assertRaises(ValueError, P('a/b').with_suffix, './.d')
        self.assertRaises(ValueError, P('a/b').with_suffix, '.d/.')

    def test_relative_to_common(self):
        P = self.cls
@ -950,6 +960,10 @@ class PureWindowsPathTest(_BasePurePathTest, unittest.TestCase):
        self.assertRaises(ValueError, P('c:').with_name, 'd.xml')
        self.assertRaises(ValueError, P('c:/').with_name, 'd.xml')
        self.assertRaises(ValueError, P('//My/Share').with_name, 'd.xml')
        self.assertRaises(ValueError, P('c:a/b').with_name, 'd:')
        self.assertRaises(ValueError, P('c:a/b').with_name, 'd:e')
        self.assertRaises(ValueError, P('c:a/b').with_name, 'd:/e')
        self.assertRaises(ValueError, P('c:a/b').with_name, '//My/Share')

    def test_with_suffix(self):
        P = self.cls

View File

@ -27,6 +27,15 @@ Core and Builtins
Library
-------
- Issue #20639: calling Path.with_suffix('') allows removing the suffix
  again. Patch by July Tikhonov.

- Issue #21714: Disallow the construction of invalid paths using
  Path.with_name(). Original patch by Antony Lee.
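Taken together, the two pathlib entries above behave roughly as follows (an
illustrative session, not part of the changelog text)::

   >>> from pathlib import PurePosixPath
   >>> PurePosixPath('a/b.py').with_suffix('')    # suffix can be removed again
   PurePosixPath('a/b')
   >>> PurePosixPath('a/b.py').with_name('c.txt')
   PurePosixPath('a/c.txt')
   >>> PurePosixPath('a/b.py').with_name('c/d')   # now rejected
   Traceback (most recent call last):
     ...
   ValueError: Invalid name 'c/d'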
- Issue #21897: Fix a crash with the f_locals attribute with closure
  variables when frame.clear() has been called.

- Issue #21151: Fixed a segfault in the winreg module when ``None`` is passed
  as a ``REG_BINARY`` value to SetValueEx. Patch by John Ehresman.
@ -133,6 +142,9 @@ Library
- Issue #21801: Validate that __signature__ is None or an instance of Signature.
- Issue #21923: Prevent AttributeError in distutils.sysconfig.customize_compiler
due to possible uninitialized _config_vars.
Build
-----

View File

@ -465,11 +465,13 @@ io_open(PyObject *self, PyObject *args, PyObject *kwds)
  error:
    if (result != NULL) {
-       PyObject *exc, *val, *tb;
+       PyObject *exc, *val, *tb, *close_result;
        PyErr_Fetch(&exc, &val, &tb);
-       if (_PyObject_CallMethodId(result, &PyId_close, NULL) != NULL)
+       close_result = _PyObject_CallMethodId(result, &PyId_close, NULL);
+       if (close_result != NULL) {
+           Py_DECREF(close_result);
            PyErr_Restore(exc, val, tb);
-       else {
+       } else {
            PyObject *exc2, *val2, *tb2;
            PyErr_Fetch(&exc2, &val2, &tb2);
            PyErr_NormalizeException(&exc, &val, &tb);

View File

@ -786,7 +786,7 @@ map_to_dict(PyObject *map, Py_ssize_t nmap, PyObject *dict, PyObject **values,
        PyObject *key = PyTuple_GET_ITEM(map, j);
        PyObject *value = values[j];
        assert(PyUnicode_Check(key));
-       if (deref) {
+       if (deref && value != NULL) {
            assert(PyCell_Check(value));
            value = PyCell_GET(value);
        }

View File

@ -1372,9 +1372,8 @@ PyUnicode_CopyCharacters(PyObject *to, Py_ssize_t to_start,
    how_many = Py_MIN(PyUnicode_GET_LENGTH(from), how_many);
    if (to_start + how_many > PyUnicode_GET_LENGTH(to)) {
        PyErr_Format(PyExc_SystemError,
-                    "Cannot write %" PY_FORMAT_SIZE_T "i characters at %"
-                    PY_FORMAT_SIZE_T "i in a string of %"
-                    PY_FORMAT_SIZE_T "i characters",
+                    "Cannot write %zi characters at %zi "
+                    "in a string of %zi characters",
                     how_many, to_start, PyUnicode_GET_LENGTH(to));
        return -1;
    }
@ -4083,9 +4082,7 @@ unicode_decode_call_errorhandler_wchar(
    if (newpos<0)
        newpos = insize+newpos;
    if (newpos<0 || newpos>insize) {
-       PyErr_Format(PyExc_IndexError,
-                    "position %" PY_FORMAT_SIZE_T
-                    "d from error handler out of bounds", newpos);
+       PyErr_Format(PyExc_IndexError, "position %zd from error handler out of bounds", newpos);
        goto onError;
    }
@ -4178,9 +4175,7 @@ unicode_decode_call_errorhandler_writer(
    if (newpos<0)
        newpos = insize+newpos;
    if (newpos<0 || newpos>insize) {
-       PyErr_Format(PyExc_IndexError,
-                    "position %" PY_FORMAT_SIZE_T
-                    "d from error handler out of bounds", newpos);
+       PyErr_Format(PyExc_IndexError, "position %zd from error handler out of bounds", newpos);
        goto onError;
    }
@ -6443,9 +6438,7 @@ unicode_encode_call_errorhandler(const char *errors,
    if (*newpos<0)
        *newpos = len + *newpos;
    if (*newpos<0 || *newpos>len) {
-       PyErr_Format(PyExc_IndexError,
-                    "position %" PY_FORMAT_SIZE_T
-                    "d from error handler out of bounds", *newpos);
+       PyErr_Format(PyExc_IndexError, "position %zd from error handler out of bounds", *newpos);
        Py_DECREF(restuple);
        return NULL;
    }
@ -8468,9 +8461,7 @@ unicode_translate_call_errorhandler(const char *errors,
    else
        *newpos = i_newpos;
    if (*newpos<0 || *newpos>PyUnicode_GET_LENGTH(unicode)) {
-       PyErr_Format(PyExc_IndexError,
-                    "position %" PY_FORMAT_SIZE_T
-                    "d from error handler out of bounds", *newpos);
+       PyErr_Format(PyExc_IndexError, "position %zd from error handler out of bounds", *newpos);
        Py_DECREF(restuple);
        return NULL;
    }
@ -9752,8 +9743,7 @@ PyUnicode_Join(PyObject *separator, PyObject *seq)
        item = items[i];
        if (!PyUnicode_Check(item)) {
            PyErr_Format(PyExc_TypeError,
-                        "sequence item %" PY_FORMAT_SIZE_T
-                        "d: expected str instance,"
+                        "sequence item %zd: expected str instance,"
                         " %.80s found",
                         i, Py_TYPE(item)->tp_name);
            goto onError;
@ -14452,7 +14442,7 @@ unicode_format_arg_format(struct unicode_formatter_t *ctx,
    default:
        PyErr_Format(PyExc_ValueError,
                     "unsupported format character '%c' (0x%x) "
-                    "at index %" PY_FORMAT_SIZE_T "d",
+                    "at index %zd",
                     (31<=arg->ch && arg->ch<=126) ? (char)arg->ch : '?',
                     (int)arg->ch,
                     ctx->fmtpos - 1);