merge heads.

Gregory P. Smith 2011-05-14 15:26:35 -07:00
commit 873cab2890
25 changed files with 424 additions and 305 deletions


@@ -43,10 +43,10 @@ web server it's talking to uses both "server" sockets and "client" sockets.
 History
 -------
 
-Of the various forms of IPC (*Inter Process Communication*), sockets are by far
-the most popular. On any given platform, there are likely to be other forms of
-IPC that are faster, but for cross-platform communication, sockets are about the
-only game in town.
+Of the various forms of :abbr:`IPC (Inter Process Communication)`,
+sockets are by far the most popular. On any given platform, there are
+likely to be other forms of IPC that are faster, but for
+cross-platform communication, sockets are about the only game in town.
 
 They were invented in Berkeley as part of the BSD flavor of Unix. They spread
 like wildfire with the Internet. With good reason --- the combination of sockets
@@ -66,13 +66,14 @@ your browser did something like the following::
     # - the normal http port
     s.connect(("www.mcmillan-inc.com", 80))
 
-When the ``connect`` completes, the socket ``s`` can now be used to send in a
-request for the text of this page. The same socket will read the reply, and then
-be destroyed. That's right - destroyed. Client sockets are normally only used
-for one exchange (or a small set of sequential exchanges).
+When the ``connect`` completes, the socket ``s`` can be used to send
+in a request for the text of the page. The same socket will read the
+reply, and then be destroyed. That's right, destroyed. Client sockets
+are normally only used for one exchange (or a small set of sequential
+exchanges).
 
 What happens in the web server is a bit more complex. First, the web server
-creates a "server socket". ::
+creates a "server socket"::
 
     #create an INET, STREAMing socket
     serversocket = socket.socket(
@@ -96,7 +97,7 @@ Finally, the argument to ``listen`` tells the socket library that we want it to
 queue up as many as 5 connect requests (the normal max) before refusing outside
 connections. If the rest of the code is written properly, that should be plenty.
 
-OK, now we have a "server" socket, listening on port 80. Now we enter the
+Now that we have a "server" socket, listening on port 80, we can enter the
 mainloop of the web server::
 
     while True:
@@ -145,7 +146,7 @@ perhaps a signon. But that's a design decision - it's not a rule of sockets.
 
 Now there are two sets of verbs to use for communication. You can use ``send``
 and ``recv``, or you can transform your client socket into a file-like beast and
-use ``read`` and ``write``. The latter is the way Java presents their sockets.
+use ``read`` and ``write``. The latter is the way Java presents its sockets.
 I'm not going to talk about it here, except to warn you that you need to use
 ``flush`` on sockets. These are buffered "files", and a common mistake is to
 ``write`` something, and then ``read`` for a reply. Without a ``flush`` in
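
(Aside: a minimal, hedged sketch of the buffered file-object pattern this passage warns about; the host and request below are made up, and it assumes Python 3's ``socket.makefile``.)

    import socket

    s = socket.create_connection(("www.example.com", 80))   # hypothetical host
    f = s.makefile("rwb")             # buffered "file" view of the socket
    f.write(b"GET / HTTP/1.0\r\nHost: www.example.com\r\n\r\n")
    f.flush()                         # without flush(), the request can sit in the buffer
    reply = f.readline()              # e.g. the HTTP status line
    f.close()
    s.close()
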
@@ -166,11 +167,11 @@ this connection. Ever. You may be able to send data successfully; I'll talk
 about that some on the next page.
 
 A protocol like HTTP uses a socket for only one transfer. The client sends a
-request, the reads a reply. That's it. The socket is discarded. This means that
+request, then reads a reply. That's it. The socket is discarded. This means that
 a client can detect the end of the reply by receiving 0 bytes.
 
 But if you plan to reuse your socket for further transfers, you need to realize
-that *there is no "EOT" (End of Transfer) on a socket.* I repeat: if a socket
+that *there is no* :abbr:`EOT (End of Transfer)` *on a socket.* I repeat: if a socket
 ``send`` or ``recv`` returns after handling 0 bytes, the connection has been
 broken. If the connection has *not* been broken, you may wait on a ``recv``
 forever, because the socket will *not* tell you that there's nothing more to
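
(Aside: a hedged sketch of the receive loop this rule implies; the function name and chunk size are illustrative, and it assumes a connected socket and a known message length.)

    def recv_exactly(sock, expected):
        # Keep calling recv() until `expected` bytes have arrived; a return
        # of 0 bytes means the peer closed (or broke) the connection.
        chunks = []
        received = 0
        while received < expected:
            chunk = sock.recv(min(4096, expected - received))
            if not chunk:
                raise RuntimeError("socket connection broken")
            chunks.append(chunk)
            received += len(chunk)
        return b"".join(chunks)
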
@@ -336,7 +337,7 @@ Use ``select``.
 
 In C, coding ``select`` is fairly complex. In Python, it's a piece of cake, but
 it's close enough to the C version that if you understand ``select`` in Python,
-you'll have little trouble with it in C. ::
+you'll have little trouble with it in C::
 
     ready_to_read, ready_to_write, in_error = \
                    select.select(
@@ -353,9 +354,9 @@ call is blocking, but you can give it a timeout. This is generally a sensible
 thing to do - give it a nice long timeout (say a minute) unless you have good
 reason to do otherwise.
 
-In return, you will get three lists. They have the sockets that are actually
+In return, you will get three lists. They contain the sockets that are actually
 readable, writable and in error. Each of these lists is a subset (possibly
-empty) of the corresponding list you passed in. And if you put a socket in more
+empty) of the corresponding list you passed in. If you put a socket in more
 than one input list, it will only be (at most) in one output list.
 
 If a socket is in the output readable list, you can be
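
(Aside: a minimal sketch of the ``select`` call described above, with a timeout; the three input lists are placeholders for sockets built elsewhere.)

    import select

    ready_to_read, ready_to_write, in_error = select.select(
        potential_readers, potential_writers, potential_errs, 60)

    for s in ready_to_read:
        data = s.recv(4096)   # a readable socket should have data (or EOF) waiting
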


@@ -189,37 +189,105 @@ are converted to strings. The default implementation uses the internals of the
 .. _pprint-example:
 
-pprint Example
---------------
+Example
+-------
 
-This example demonstrates several uses of the :func:`pprint` function and its
-parameters.
+To demonstrate several uses of the :func:`pprint` function and its parameters,
+let's fetch information about a package from PyPI::
>>> import json
>>> import pprint >>> import pprint
>>> tup = ('spam', ('eggs', ('lumberjack', ('knights', ('ni', ('dead', >>> from urllib.request import urlopen
... ('parrot', ('fresh fruit',)))))))) >>> with urlopen('http://pypi.python.org/pypi/configparser/json') as url:
>>> stuff = ['a' * 10, tup, ['a' * 30, 'b' * 30], ['c' * 20, 'd' * 20]] ... http_info = url.info()
>>> pprint.pprint(stuff) ... raw_data = url.read().decode(http_info.get_content_charset())
['aaaaaaaaaa', >>> package_data = json.loads(raw_data)
('spam', >>> result = {'headers': http_info.items(), 'body': package_data}
('eggs',
('lumberjack',
('knights', ('ni', ('dead', ('parrot', ('fresh fruit',)))))))),
['aaaaaaaaaaaaaaaaaaaaaaaaaaaaaa', 'bbbbbbbbbbbbbbbbbbbbbbbbbbbbbb'],
['cccccccccccccccccccc', 'dddddddddddddddddddd']]
>>> pprint.pprint(stuff, depth=3)
['aaaaaaaaaa',
('spam', ('eggs', (...))),
['aaaaaaaaaaaaaaaaaaaaaaaaaaaaaa', 'bbbbbbbbbbbbbbbbbbbbbbbbbbbbbb'],
['cccccccccccccccccccc', 'dddddddddddddddddddd']]
>>> pprint.pprint(stuff, width=60)
['aaaaaaaaaa',
('spam',
('eggs',
('lumberjack',
('knights',
('ni', ('dead', ('parrot', ('fresh fruit',)))))))),
['aaaaaaaaaaaaaaaaaaaaaaaaaaaaaa',
'bbbbbbbbbbbbbbbbbbbbbbbbbbbbbb'],
['cccccccccccccccccccc', 'dddddddddddddddddddd']]
In its basic form, :func:`pprint` shows the whole object::
>>> pprint.pprint(result)
{'body': {'info': {'_pypi_hidden': False,
'_pypi_ordering': 12,
'classifiers': ['Development Status :: 4 - Beta',
'Intended Audience :: Developers',
'License :: OSI Approved :: MIT License',
'Natural Language :: English',
'Operating System :: OS Independent',
'Programming Language :: Python',
'Programming Language :: Python :: 2',
'Programming Language :: Python :: 2.6',
'Programming Language :: Python :: 2.7',
'Topic :: Software Development :: Libraries',
'Topic :: Software Development :: Libraries :: Python Modules'],
'download_url': 'UNKNOWN',
'home_page': 'http://docs.python.org/py3k/library/configparser.html',
'keywords': 'configparser ini parsing conf cfg configuration file',
'license': 'MIT',
'name': 'configparser',
'package_url': 'http://pypi.python.org/pypi/configparser',
'platform': 'any',
'release_url': 'http://pypi.python.org/pypi/configparser/3.2.0r3',
'requires_python': None,
'stable_version': None,
'summary': 'This library brings the updated configparser from Python 3.2+ to Python 2.6-2.7.',
'version': '3.2.0r3'},
'urls': [{'comment_text': '',
'downloads': 47,
'filename': 'configparser-3.2.0r3.tar.gz',
'has_sig': False,
'md5_digest': '8500fd87c61ac0de328fc996fce69b96',
'packagetype': 'sdist',
'python_version': 'source',
'size': 32281,
'upload_time': '2011-05-10T16:28:50',
'url': 'http://pypi.python.org/packages/source/c/configparser/configparser-3.2.0r3.tar.gz'}]},
'headers': [('Date', 'Sat, 14 May 2011 12:48:52 GMT'),
('Server', 'Apache/2.2.16 (Debian)'),
('Content-Disposition', 'inline'),
('Connection', 'close'),
('Transfer-Encoding', 'chunked'),
('Content-Type', 'application/json; charset="UTF-8"')]}
The result can be limited to a certain *depth* (ellipsis is used for deeper
contents)::
>>> pprint.pprint(result, depth=3)
{'body': {'info': {'_pypi_hidden': False,
'_pypi_ordering': 12,
'classifiers': [...],
'download_url': 'UNKNOWN',
'home_page': 'http://docs.python.org/py3k/library/configparser.html',
'keywords': 'configparser ini parsing conf cfg configuration file',
'license': 'MIT',
'name': 'configparser',
'package_url': 'http://pypi.python.org/pypi/configparser',
'platform': 'any',
'release_url': 'http://pypi.python.org/pypi/configparser/3.2.0r3',
'requires_python': None,
'stable_version': None,
'summary': 'This library brings the updated configparser from Python 3.2+ to Python 2.6-2.7.',
'version': '3.2.0r3'},
'urls': [{...}]},
'headers': [('Date', 'Sat, 14 May 2011 12:48:52 GMT'),
('Server', 'Apache/2.2.16 (Debian)'),
('Content-Disposition', 'inline'),
('Connection', 'close'),
('Transfer-Encoding', 'chunked'),
('Content-Type', 'application/json; charset="UTF-8"')]}
Additionally, maximum *width* can be suggested. If a long object cannot be
split, the specified width will be exceeded::
>>> pprint.pprint(result['headers'], width=30)
[('Date',
'Sat, 14 May 2011 12:48:52 GMT'),
('Server',
'Apache/2.2.16 (Debian)'),
('Content-Disposition',
'inline'),
('Connection', 'close'),
('Transfer-Encoding',
'chunked'),
('Content-Type',
'application/json; charset="UTF-8"')]
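
(Aside: :func:`pprint.pformat` accepts the same *depth* and *width* arguments and returns the formatted text as a string instead of writing it to a stream; the call below is only a sketch and its output is not reproduced here.)

>>> text = pprint.pformat(result, depth=2, width=50)
>>> isinstance(text, str)
True
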


@ -3,6 +3,8 @@ What's New in IDLE 3.2.1?
*Release date: 15-May-11* *Release date: 15-May-11*
- Issue #6378: Further adjust idle.bat to start associated Python
- Issue #11896: Save on Close failed despite selecting "Yes" in dialog. - Issue #11896: Save on Close failed despite selecting "Yes" in dialog.
- Issue #1028: Ctrl-space binding to show completions was causing IDLE to exit. - Issue #1028: Ctrl-space binding to show completions was causing IDLE to exit.
@ -63,7 +65,7 @@ What's New in IDLE 2.7? (UNRELEASED, but merged into 3.1 releases above.)
extract port from command line when warnings are present. extract port from command line when warnings are present.
- Tk 8.5 Text widget requires 'wordprocessor' tabstyle attr to handle - Tk 8.5 Text widget requires 'wordprocessor' tabstyle attr to handle
mixed space/tab properly. Issue 5120, patch by Guilherme Polo. mixed space/tab properly. Issue 5129, patch by Guilherme Polo.
- Issue #3549: On MacOS the preferences menu was not present - Issue #3549: On MacOS the preferences menu was not present


@ -1,4 +1,4 @@
@echo off @echo off
rem Start IDLE using the appropriate Python interpreter rem Start IDLE using the appropriate Python interpreter
set CURRDIR=%~dp0 set CURRDIR=%~dp0
start "%CURRDIR%..\..\pythonw.exe" "%CURRDIR%idle.pyw" %1 %2 %3 %4 %5 %6 %7 %8 %9 start "IDLE" "%CURRDIR%..\..\pythonw.exe" "%CURRDIR%idle.pyw" %1 %2 %3 %4 %5 %6 %7 %8 %9


@ -5,7 +5,7 @@ import re
import sys import sys
import struct import struct
from json.scanner import make_scanner from json import scanner
try: try:
from _json import scanstring as c_scanstring from _json import scanstring as c_scanstring
except ImportError: except ImportError:
@ -340,7 +340,7 @@ class JSONDecoder(object):
self.parse_array = JSONArray self.parse_array = JSONArray
self.parse_string = scanstring self.parse_string = scanstring
self.memo = {} self.memo = {}
self.scan_once = make_scanner(self) self.scan_once = scanner.make_scanner(self)
def decode(self, s, _w=WHITESPACE.match): def decode(self, s, _w=WHITESPACE.match):


@ -1,7 +1,46 @@
import os import os
import sys import sys
import unittest import json
import doctest import doctest
import unittest
from test import support
# import json with and without accelerations
cjson = support.import_fresh_module('json', fresh=['_json'])
pyjson = support.import_fresh_module('json', blocked=['_json'])
# create two base classes that will be used by the other tests
class PyTest(unittest.TestCase):
json = pyjson
loads = staticmethod(pyjson.loads)
dumps = staticmethod(pyjson.dumps)
@unittest.skipUnless(cjson, 'requires _json')
class CTest(unittest.TestCase):
if cjson is not None:
json = cjson
loads = staticmethod(cjson.loads)
dumps = staticmethod(cjson.dumps)
# test PyTest and CTest checking if the functions come from the right module
class TestPyTest(PyTest):
def test_pyjson(self):
self.assertEqual(self.json.scanner.make_scanner.__module__,
'json.scanner')
self.assertEqual(self.json.decoder.scanstring.__module__,
'json.decoder')
self.assertEqual(self.json.encoder.encode_basestring_ascii.__module__,
'json.encoder')
class TestCTest(CTest):
def test_cjson(self):
self.assertEqual(self.json.scanner.make_scanner.__module__, '_json')
self.assertEqual(self.json.decoder.scanstring.__module__, '_json')
self.assertEqual(self.json.encoder.c_make_encoder.__module__, '_json')
self.assertEqual(self.json.encoder.encode_basestring_ascii.__module__,
'_json')
here = os.path.dirname(__file__) here = os.path.dirname(__file__)
@ -17,12 +56,11 @@ def test_suite():
return suite return suite
def additional_tests(): def additional_tests():
import json
import json.encoder
import json.decoder
suite = unittest.TestSuite() suite = unittest.TestSuite()
for mod in (json, json.encoder, json.decoder): for mod in (json, json.encoder, json.decoder):
suite.addTest(doctest.DocTestSuite(mod)) suite.addTest(doctest.DocTestSuite(mod))
suite.addTest(TestPyTest('test_pyjson'))
suite.addTest(TestCTest('test_cjson'))
return suite return suite
def main(): def main():


@ -1,55 +1,38 @@
import decimal import decimal
from unittest import TestCase
from io import StringIO from io import StringIO
from contextlib import contextmanager
import json
import json.decoder
import json.scanner
from collections import OrderedDict from collections import OrderedDict
from test.json_tests import PyTest, CTest
@contextmanager class TestDecode:
def use_python_scanner():
py_scanner = json.scanner.py_make_scanner
old_scanner = json.decoder.make_scanner
json.decoder.make_scanner = py_scanner
try:
yield
finally:
json.decoder.make_scanner = old_scanner
class TestDecode(TestCase):
def test_decimal(self): def test_decimal(self):
rval = json.loads('1.1', parse_float=decimal.Decimal) rval = self.loads('1.1', parse_float=decimal.Decimal)
self.assertTrue(isinstance(rval, decimal.Decimal)) self.assertTrue(isinstance(rval, decimal.Decimal))
self.assertEqual(rval, decimal.Decimal('1.1')) self.assertEqual(rval, decimal.Decimal('1.1'))
def test_float(self): def test_float(self):
rval = json.loads('1', parse_int=float) rval = self.loads('1', parse_int=float)
self.assertTrue(isinstance(rval, float)) self.assertTrue(isinstance(rval, float))
self.assertEqual(rval, 1.0) self.assertEqual(rval, 1.0)
def test_empty_objects(self): def test_empty_objects(self):
self.assertEqual(json.loads('{}'), {}) self.assertEqual(self.loads('{}'), {})
self.assertEqual(json.loads('[]'), []) self.assertEqual(self.loads('[]'), [])
self.assertEqual(json.loads('""'), "") self.assertEqual(self.loads('""'), "")
def test_object_pairs_hook(self): def test_object_pairs_hook(self):
s = '{"xkd":1, "kcw":2, "art":3, "hxm":4, "qrt":5, "pad":6, "hoy":7}' s = '{"xkd":1, "kcw":2, "art":3, "hxm":4, "qrt":5, "pad":6, "hoy":7}'
p = [("xkd", 1), ("kcw", 2), ("art", 3), ("hxm", 4), p = [("xkd", 1), ("kcw", 2), ("art", 3), ("hxm", 4),
("qrt", 5), ("pad", 6), ("hoy", 7)] ("qrt", 5), ("pad", 6), ("hoy", 7)]
self.assertEqual(json.loads(s), eval(s)) self.assertEqual(self.loads(s), eval(s))
self.assertEqual(json.loads(s, object_pairs_hook = lambda x: x), p) self.assertEqual(self.loads(s, object_pairs_hook = lambda x: x), p)
self.assertEqual(json.load(StringIO(s), self.assertEqual(self.json.load(StringIO(s),
object_pairs_hook=lambda x: x), p) object_pairs_hook=lambda x: x), p)
od = json.loads(s, object_pairs_hook = OrderedDict) od = self.loads(s, object_pairs_hook = OrderedDict)
self.assertEqual(od, OrderedDict(p)) self.assertEqual(od, OrderedDict(p))
self.assertEqual(type(od), OrderedDict) self.assertEqual(type(od), OrderedDict)
# the object_pairs_hook takes priority over the object_hook # the object_pairs_hook takes priority over the object_hook
self.assertEqual(json.loads(s, self.assertEqual(self.loads(s, object_pairs_hook = OrderedDict,
object_pairs_hook = OrderedDict,
object_hook = lambda x: None), object_hook = lambda x: None),
OrderedDict(p)) OrderedDict(p))
@ -57,7 +40,7 @@ class TestDecode(TestCase):
# Several optimizations were made that skip over calls to # Several optimizations were made that skip over calls to
# the whitespace regex, so this test is designed to try and # the whitespace regex, so this test is designed to try and
# exercise the uncommon cases. The array cases are already covered. # exercise the uncommon cases. The array cases are already covered.
rval = json.loads('{ "key" : "value" , "k":"v" }') rval = self.loads('{ "key" : "value" , "k":"v" }')
self.assertEqual(rval, {"key":"value", "k":"v"}) self.assertEqual(rval, {"key":"value", "k":"v"})
def check_keys_reuse(self, source, loads): def check_keys_reuse(self, source, loads):
@ -68,7 +51,9 @@ class TestDecode(TestCase):
def test_keys_reuse(self): def test_keys_reuse(self):
s = '[{"a_key": 1, "b_\xe9": 2}, {"a_key": 3, "b_\xe9": 4}]' s = '[{"a_key": 1, "b_\xe9": 2}, {"a_key": 3, "b_\xe9": 4}]'
self.check_keys_reuse(s, json.loads) self.check_keys_reuse(s, self.loads)
# Disabled: the pure Python version of json simply doesn't work self.check_keys_reuse(s, self.json.decoder.JSONDecoder().decode)
with use_python_scanner():
self.check_keys_reuse(s, json.decoder.JSONDecoder().decode)
class TestPyDecode(TestDecode, PyTest): pass
class TestCDecode(TestDecode, CTest): pass


@ -1,9 +1,12 @@
from unittest import TestCase from test.json_tests import PyTest, CTest
import json
class TestDefault(TestCase): class TestDefault:
def test_default(self): def test_default(self):
self.assertEqual( self.assertEqual(
json.dumps(type, default=repr), self.dumps(type, default=repr),
json.dumps(repr(type))) self.dumps(repr(type)))
class TestPyDefault(TestDefault, PyTest): pass
class TestCDefault(TestDefault, CTest): pass


@ -1,21 +1,24 @@
from unittest import TestCase
from io import StringIO from io import StringIO
from test.json_tests import PyTest, CTest
import json
class TestDump(TestCase): class TestDump:
def test_dump(self): def test_dump(self):
sio = StringIO() sio = StringIO()
json.dump({}, sio) self.json.dump({}, sio)
self.assertEqual(sio.getvalue(), '{}') self.assertEqual(sio.getvalue(), '{}')
def test_dumps(self): def test_dumps(self):
self.assertEqual(json.dumps({}), '{}') self.assertEqual(self.dumps({}), '{}')
def test_encode_truefalse(self): def test_encode_truefalse(self):
self.assertEqual(json.dumps( self.assertEqual(self.dumps(
{True: False, False: True}, sort_keys=True), {True: False, False: True}, sort_keys=True),
'{"false": true, "true": false}') '{"false": true, "true": false}')
self.assertEqual(json.dumps( self.assertEqual(self.dumps(
{2: 3.0, 4.0: 5, False: 1, 6: True}, sort_keys=True), {2: 3.0, 4.0: 5, False: 1, 6: True}, sort_keys=True),
'{"false": 1, "2": 3.0, "4.0": 5, "6": true}') '{"false": 1, "2": 3.0, "4.0": 5, "6": true}')
class TestPyDump(TestDump, PyTest): pass
class TestCDump(TestDump, CTest): pass


@ -1,8 +1,6 @@
from unittest import TestCase
import json.encoder
from json import dumps
from collections import OrderedDict from collections import OrderedDict
from test.json_tests import PyTest, CTest
CASES = [ CASES = [
('/\\"\ucafe\ubabe\uab98\ufcde\ubcda\uef4a\x08\x0c\n\r\t`1~!@#$%^&*()_+-=[]{}|;:\',./<>?', '"/\\\\\\"\\ucafe\\ubabe\\uab98\\ufcde\\ubcda\\uef4a\\b\\f\\n\\r\\t`1~!@#$%^&*()_+-=[]{}|;:\',./<>?"'), ('/\\"\ucafe\ubabe\uab98\ufcde\ubcda\uef4a\x08\x0c\n\r\t`1~!@#$%^&*()_+-=[]{}|;:\',./<>?', '"/\\\\\\"\\ucafe\\ubabe\\uab98\\ufcde\\ubcda\\uef4a\\b\\f\\n\\r\\t`1~!@#$%^&*()_+-=[]{}|;:\',./<>?"'),
@ -21,19 +19,11 @@ CASES = [
('\u0123\u4567\u89ab\ucdef\uabcd\uef4a', '"\\u0123\\u4567\\u89ab\\ucdef\\uabcd\\uef4a"'), ('\u0123\u4567\u89ab\ucdef\uabcd\uef4a', '"\\u0123\\u4567\\u89ab\\ucdef\\uabcd\\uef4a"'),
] ]
class TestEncodeBaseStringAscii(TestCase): class TestEncodeBasestringAscii:
def test_py_encode_basestring_ascii(self): def test_encode_basestring_ascii(self):
self._test_encode_basestring_ascii(json.encoder.py_encode_basestring_ascii) fname = self.json.encoder.encode_basestring_ascii.__name__
def test_c_encode_basestring_ascii(self):
if not json.encoder.c_encode_basestring_ascii:
return
self._test_encode_basestring_ascii(json.encoder.c_encode_basestring_ascii)
def _test_encode_basestring_ascii(self, encode_basestring_ascii):
fname = encode_basestring_ascii.__name__
for input_string, expect in CASES: for input_string, expect in CASES:
result = encode_basestring_ascii(input_string) result = self.json.encoder.encode_basestring_ascii(input_string)
self.assertEqual(result, expect, self.assertEqual(result, expect,
'{0!r} != {1!r} for {2}({3!r})'.format( '{0!r} != {1!r} for {2}({3!r})'.format(
result, expect, fname, input_string)) result, expect, fname, input_string))
@ -41,10 +31,14 @@ class TestEncodeBaseStringAscii(TestCase):
def test_ordered_dict(self): def test_ordered_dict(self):
# See issue 6105 # See issue 6105
items = [('one', 1), ('two', 2), ('three', 3), ('four', 4), ('five', 5)] items = [('one', 1), ('two', 2), ('three', 3), ('four', 4), ('five', 5)]
s = json.dumps(OrderedDict(items)) s = self.dumps(OrderedDict(items))
self.assertEqual(s, '{"one": 1, "two": 2, "three": 3, "four": 4, "five": 5}') self.assertEqual(s, '{"one": 1, "two": 2, "three": 3, "four": 4, "five": 5}')
def test_sorted_dict(self): def test_sorted_dict(self):
items = [('one', 1), ('two', 2), ('three', 3), ('four', 4), ('five', 5)] items = [('one', 1), ('two', 2), ('three', 3), ('four', 4), ('five', 5)]
s = json.dumps(dict(items), sort_keys=True) s = self.dumps(dict(items), sort_keys=True)
self.assertEqual(s, '{"five": 5, "four": 4, "one": 1, "three": 3, "two": 2}') self.assertEqual(s, '{"five": 5, "four": 4, "one": 1, "three": 3, "two": 2}')
class TestPyEncodeBasestringAscii(TestEncodeBasestringAscii, PyTest): pass
class TestCEncodeBasestringAscii(TestEncodeBasestringAscii, CTest): pass


@ -1,6 +1,4 @@
from unittest import TestCase from test.json_tests import PyTest, CTest
import json
# Fri Dec 30 18:57:26 2005 # Fri Dec 30 18:57:26 2005
JSONDOCS = [ JSONDOCS = [
@ -61,15 +59,15 @@ SKIPS = {
18: "spec doesn't specify any nesting limitations", 18: "spec doesn't specify any nesting limitations",
} }
class TestFail(TestCase): class TestFail:
def test_failures(self): def test_failures(self):
for idx, doc in enumerate(JSONDOCS): for idx, doc in enumerate(JSONDOCS):
idx = idx + 1 idx = idx + 1
if idx in SKIPS: if idx in SKIPS:
json.loads(doc) self.loads(doc)
continue continue
try: try:
json.loads(doc) self.loads(doc)
except ValueError: except ValueError:
pass pass
else: else:
@ -79,7 +77,11 @@ class TestFail(TestCase):
data = {'a' : 1, (1, 2) : 2} data = {'a' : 1, (1, 2) : 2}
#This is for c encoder #This is for c encoder
self.assertRaises(TypeError, json.dumps, data) self.assertRaises(TypeError, self.dumps, data)
#This is for python encoder #This is for python encoder
self.assertRaises(TypeError, json.dumps, data, indent=True) self.assertRaises(TypeError, self.dumps, data, indent=True)
class TestPyFail(TestFail, PyTest): pass
class TestCFail(TestFail, CTest): pass


@ -1,15 +1,18 @@
import math import math
from unittest import TestCase from test.json_tests import PyTest, CTest
import json
class TestFloat(TestCase): class TestFloat:
def test_floats(self): def test_floats(self):
for num in [1617161771.7650001, math.pi, math.pi**100, math.pi**-100, 3.1]: for num in [1617161771.7650001, math.pi, math.pi**100, math.pi**-100, 3.1]:
self.assertEqual(float(json.dumps(num)), num) self.assertEqual(float(self.dumps(num)), num)
self.assertEqual(json.loads(json.dumps(num)), num) self.assertEqual(self.loads(self.dumps(num)), num)
def test_ints(self): def test_ints(self):
for num in [1, 1<<32, 1<<64]: for num in [1, 1<<32, 1<<64]:
self.assertEqual(json.dumps(num), str(num)) self.assertEqual(self.dumps(num), str(num))
self.assertEqual(int(json.dumps(num)), num) self.assertEqual(int(self.dumps(num)), num)
class TestPyFloat(TestFloat, PyTest): pass
class TestCFloat(TestFloat, CTest): pass


@ -1,10 +1,9 @@
from unittest import TestCase
import json
import textwrap import textwrap
from io import StringIO from io import StringIO
from test.json_tests import PyTest, CTest
class TestIndent(TestCase):
class TestIndent:
def test_indent(self): def test_indent(self):
h = [['blorpie'], ['whoops'], [], 'd-shtaeou', 'd-nthiouh', 'i-vhbjkhnth', h = [['blorpie'], ['whoops'], [], 'd-shtaeou', 'd-nthiouh', 'i-vhbjkhnth',
{'nifty': 87}, {'field': 'yes', 'morefield': False} ] {'nifty': 87}, {'field': 'yes', 'morefield': False} ]
@ -30,14 +29,13 @@ class TestIndent(TestCase):
\t} \t}
]""") ]""")
d1 = self.dumps(h)
d2 = self.dumps(h, indent=2, sort_keys=True, separators=(',', ': '))
d3 = self.dumps(h, indent='\t', sort_keys=True, separators=(',', ': '))
d1 = json.dumps(h) h1 = self.loads(d1)
d2 = json.dumps(h, indent=2, sort_keys=True, separators=(',', ': ')) h2 = self.loads(d2)
d3 = json.dumps(h, indent='\t', sort_keys=True, separators=(',', ': ')) h3 = self.loads(d3)
h1 = json.loads(d1)
h2 = json.loads(d2)
h3 = json.loads(d3)
self.assertEqual(h1, h) self.assertEqual(h1, h)
self.assertEqual(h2, h) self.assertEqual(h2, h)
@ -48,14 +46,18 @@ class TestIndent(TestCase):
def test_indent0(self): def test_indent0(self):
h = {3: 1} h = {3: 1}
def check(indent, expected): def check(indent, expected):
d1 = json.dumps(h, indent=indent) d1 = self.dumps(h, indent=indent)
self.assertEqual(d1, expected) self.assertEqual(d1, expected)
sio = StringIO() sio = StringIO()
json.dump(h, sio, indent=indent) self.json.dump(h, sio, indent=indent)
self.assertEqual(sio.getvalue(), expected) self.assertEqual(sio.getvalue(), expected)
# indent=0 should emit newlines # indent=0 should emit newlines
check(0, '{\n"3": 1\n}') check(0, '{\n"3": 1\n}')
# indent=None is more compact # indent=None is more compact
check(None, '{"3": 1}') check(None, '{"3": 1}')
class TestPyIndent(TestIndent, PyTest): pass
class TestCIndent(TestIndent, CTest): pass


@ -1,6 +1,5 @@
from unittest import TestCase from test.json_tests import PyTest, CTest
import json
# from http://json.org/JSON_checker/test/pass1.json # from http://json.org/JSON_checker/test/pass1.json
JSON = r''' JSON = r'''
@ -62,15 +61,19 @@ JSON = r'''
,"rosebud"] ,"rosebud"]
''' '''
class TestPass1(TestCase): class TestPass1:
def test_parse(self): def test_parse(self):
# test in/out equivalence and parsing # test in/out equivalence and parsing
res = json.loads(JSON) res = self.loads(JSON)
out = json.dumps(res) out = self.dumps(res)
self.assertEqual(res, json.loads(out)) self.assertEqual(res, self.loads(out))
try: try:
json.dumps(res, allow_nan=False) self.dumps(res, allow_nan=False)
except ValueError: except ValueError:
pass pass
else: else:
self.fail("23456789012E666 should be out of range") self.fail("23456789012E666 should be out of range")
class TestPyPass1(TestPass1, PyTest): pass
class TestCPass1(TestPass1, CTest): pass


@ -1,14 +1,18 @@
from unittest import TestCase from test.json_tests import PyTest, CTest
import json
# from http://json.org/JSON_checker/test/pass2.json # from http://json.org/JSON_checker/test/pass2.json
JSON = r''' JSON = r'''
[[[[[[[[[[[[[[[[[[["Not too deep"]]]]]]]]]]]]]]]]]]] [[[[[[[[[[[[[[[[[[["Not too deep"]]]]]]]]]]]]]]]]]]]
''' '''
class TestPass2(TestCase): class TestPass2:
def test_parse(self): def test_parse(self):
# test in/out equivalence and parsing # test in/out equivalence and parsing
res = json.loads(JSON) res = self.loads(JSON)
out = json.dumps(res) out = self.dumps(res)
self.assertEqual(res, json.loads(out)) self.assertEqual(res, self.loads(out))
class TestPyPass2(TestPass2, PyTest): pass
class TestCPass2(TestPass2, CTest): pass


@ -1,6 +1,5 @@
from unittest import TestCase from test.json_tests import PyTest, CTest
import json
# from http://json.org/JSON_checker/test/pass3.json # from http://json.org/JSON_checker/test/pass3.json
JSON = r''' JSON = r'''
@ -12,9 +11,14 @@ JSON = r'''
} }
''' '''
class TestPass3(TestCase):
class TestPass3:
def test_parse(self): def test_parse(self):
# test in/out equivalence and parsing # test in/out equivalence and parsing
res = json.loads(JSON) res = self.loads(JSON)
out = json.dumps(res) out = self.dumps(res)
self.assertEqual(res, json.loads(out)) self.assertEqual(res, self.loads(out))
class TestPyPass3(TestPass3, PyTest): pass
class TestCPass3(TestPass3, CTest): pass


@ -1,33 +1,16 @@
from unittest import TestCase from test.json_tests import PyTest, CTest
import json
class JSONTestObject: class JSONTestObject:
pass pass
class RecursiveJSONEncoder(json.JSONEncoder): class TestRecursion:
recurse = False
def default(self, o):
if o is JSONTestObject:
if self.recurse:
return [JSONTestObject]
else:
return 'JSONTestObject'
return json.JSONEncoder.default(o)
class EndlessJSONEncoder(json.JSONEncoder):
def default(self, o):
"""If check_circular is False, this will keep adding another list."""
return [o]
class TestRecursion(TestCase):
def test_listrecursion(self): def test_listrecursion(self):
x = [] x = []
x.append(x) x.append(x)
try: try:
json.dumps(x) self.dumps(x)
except ValueError: except ValueError:
pass pass
else: else:
@ -36,7 +19,7 @@ class TestRecursion(TestCase):
y = [x] y = [x]
x.append(y) x.append(y)
try: try:
json.dumps(x) self.dumps(x)
except ValueError: except ValueError:
pass pass
else: else:
@ -44,13 +27,13 @@ class TestRecursion(TestCase):
y = [] y = []
x = [y, y] x = [y, y]
# ensure that the marker is cleared # ensure that the marker is cleared
json.dumps(x) self.dumps(x)
def test_dictrecursion(self): def test_dictrecursion(self):
x = {} x = {}
x["test"] = x x["test"] = x
try: try:
json.dumps(x) self.dumps(x)
except ValueError: except ValueError:
pass pass
else: else:
@ -58,9 +41,19 @@ class TestRecursion(TestCase):
x = {} x = {}
y = {"a": x, "b": x} y = {"a": x, "b": x}
# ensure that the marker is cleared # ensure that the marker is cleared
json.dumps(x) self.dumps(x)
def test_defaultrecursion(self): def test_defaultrecursion(self):
class RecursiveJSONEncoder(self.json.JSONEncoder):
recurse = False
def default(self, o):
if o is JSONTestObject:
if self.recurse:
return [JSONTestObject]
else:
return 'JSONTestObject'
return pyjson.JSONEncoder.default(o)
enc = RecursiveJSONEncoder() enc = RecursiveJSONEncoder()
self.assertEqual(enc.encode(JSONTestObject), '"JSONTestObject"') self.assertEqual(enc.encode(JSONTestObject), '"JSONTestObject"')
enc.recurse = True enc.recurse = True
@ -76,11 +69,11 @@ class TestRecursion(TestCase):
# test that loading highly-nested objects doesn't segfault when C # test that loading highly-nested objects doesn't segfault when C
# accelerations are used. See #12017 # accelerations are used. See #12017
with self.assertRaises(RuntimeError): with self.assertRaises(RuntimeError):
json.loads('{"a":' * 100000 + '1' + '}' * 100000) self.loads('{"a":' * 100000 + '1' + '}' * 100000)
with self.assertRaises(RuntimeError): with self.assertRaises(RuntimeError):
json.loads('{"a":' * 100000 + '[1]' + '}' * 100000) self.loads('{"a":' * 100000 + '[1]' + '}' * 100000)
with self.assertRaises(RuntimeError): with self.assertRaises(RuntimeError):
json.loads('[' * 100000 + '1' + ']' * 100000) self.loads('[' * 100000 + '1' + ']' * 100000)
def test_highly_nested_objects_encoding(self): def test_highly_nested_objects_encoding(self):
# See #12051 # See #12051
@ -88,11 +81,20 @@ class TestRecursion(TestCase):
for x in range(100000): for x in range(100000):
l, d = [l], {'k':d} l, d = [l], {'k':d}
with self.assertRaises(RuntimeError): with self.assertRaises(RuntimeError):
json.dumps(l) self.dumps(l)
with self.assertRaises(RuntimeError): with self.assertRaises(RuntimeError):
json.dumps(d) self.dumps(d)
def test_endless_recursion(self): def test_endless_recursion(self):
# See #12051 # See #12051
class EndlessJSONEncoder(self.json.JSONEncoder):
def default(self, o):
"""If check_circular is False, this will keep adding another list."""
return [o]
with self.assertRaises(RuntimeError): with self.assertRaises(RuntimeError):
EndlessJSONEncoder(check_circular=False).encode(5j) EndlessJSONEncoder(check_circular=False).encode(5j)
class TestPyRecursion(TestRecursion, PyTest): pass
class TestCRecursion(TestRecursion, CTest): pass


@ -1,24 +1,10 @@
import sys import sys
from unittest import TestCase, skipUnless from test.json_tests import PyTest, CTest
import json
import json.decoder
try: class TestScanstring:
import _json def test_scanstring(self):
except ImportError: scanstring = self.json.decoder.scanstring
_json = None
class TestScanString(TestCase):
def test_py_scanstring(self):
self._test_scanstring(json.decoder.py_scanstring)
@skipUnless(_json, 'test requires the _json module')
def test_c_scanstring(self):
if json.decoder.c_scanstring is not None:
self._test_scanstring(json.decoder.c_scanstring)
def _test_scanstring(self, scanstring):
self.assertEqual( self.assertEqual(
scanstring('"z\\ud834\\udd20x"', 1, True), scanstring('"z\\ud834\\udd20x"', 1, True),
('z\U0001d120x', 16)) ('z\U0001d120x', 16))
@ -109,4 +95,9 @@ class TestScanString(TestCase):
('Bad value', 12)) ('Bad value', 12))
def test_overflow(self): def test_overflow(self):
self.assertRaises(OverflowError, json.decoder.scanstring, b"xxx", sys.maxsize+1) with self.assertRaises(OverflowError):
self.json.decoder.scanstring(b"xxx", sys.maxsize+1)
class TestPyScanstring(TestScanstring, PyTest): pass
class TestCScanstring(TestScanstring, CTest): pass


@ -1,10 +1,8 @@
import textwrap import textwrap
from unittest import TestCase from test.json_tests import PyTest, CTest
import json
class TestSeparators(TestCase): class TestSeparators:
def test_separators(self): def test_separators(self):
h = [['blorpie'], ['whoops'], [], 'd-shtaeou', 'd-nthiouh', 'i-vhbjkhnth', h = [['blorpie'], ['whoops'], [], 'd-shtaeou', 'd-nthiouh', 'i-vhbjkhnth',
{'nifty': 87}, {'field': 'yes', 'morefield': False} ] {'nifty': 87}, {'field': 'yes', 'morefield': False} ]
@ -31,12 +29,16 @@ class TestSeparators(TestCase):
]""") ]""")
d1 = json.dumps(h) d1 = self.dumps(h)
d2 = json.dumps(h, indent=2, sort_keys=True, separators=(' ,', ' : ')) d2 = self.dumps(h, indent=2, sort_keys=True, separators=(' ,', ' : '))
h1 = json.loads(d1) h1 = self.loads(d1)
h2 = json.loads(d2) h2 = self.loads(d2)
self.assertEqual(h1, h) self.assertEqual(h1, h)
self.assertEqual(h2, h) self.assertEqual(h2, h)
self.assertEqual(d2, expect) self.assertEqual(d2, expect)
class TestPySeparators(TestSeparators, PyTest): pass
class TestCSeparators(TestSeparators, CTest): pass


@ -1,29 +1,24 @@
from unittest import TestCase, skipUnless from test.json_tests import CTest
from json import decoder, encoder, scanner
try: class TestSpeedups(CTest):
import _json
except ImportError:
_json = None
@skipUnless(_json, 'test requires the _json module')
class TestSpeedups(TestCase):
def test_scanstring(self): def test_scanstring(self):
self.assertEqual(decoder.scanstring.__module__, "_json") self.assertEqual(self.json.decoder.scanstring.__module__, "_json")
self.assertIs(decoder.scanstring, decoder.c_scanstring) self.assertIs(self.json.decoder.scanstring, self.json.decoder.c_scanstring)
def test_encode_basestring_ascii(self): def test_encode_basestring_ascii(self):
self.assertEqual(encoder.encode_basestring_ascii.__module__, "_json") self.assertEqual(self.json.encoder.encode_basestring_ascii.__module__,
self.assertIs(encoder.encode_basestring_ascii, "_json")
encoder.c_encode_basestring_ascii) self.assertIs(self.json.encoder.encode_basestring_ascii,
self.json.encoder.c_encode_basestring_ascii)
class TestDecode(TestCase):
class TestDecode(CTest):
def test_make_scanner(self): def test_make_scanner(self):
self.assertRaises(AttributeError, scanner.c_make_scanner, 1) self.assertRaises(AttributeError, self.json.scanner.c_make_scanner, 1)
def test_make_encoder(self): def test_make_encoder(self):
self.assertRaises(TypeError, encoder.c_make_encoder, self.assertRaises(TypeError, self.json.encoder.c_make_encoder,
(True, False), (True, False),
b"\xCD\x7D\x3D\x4E\x12\x4C\xF9\x79\xD7\x52\xBA\x82\xF2\x27\x4A\x7D\xA0\xCA\x75", b"\xCD\x7D\x3D\x4E\x12\x4C\xF9\x79\xD7\x52\xBA\x82\xF2\x27\x4A\x7D\xA0\xCA\x75",
None) None)


@ -1,73 +1,75 @@
from unittest import TestCase
import json
from collections import OrderedDict from collections import OrderedDict
from test.json_tests import PyTest, CTest
class TestUnicode(TestCase):
class TestUnicode:
# test_encoding1 and test_encoding2 from 2.x are irrelevant (only str # test_encoding1 and test_encoding2 from 2.x are irrelevant (only str
# is supported as input, not bytes). # is supported as input, not bytes).
def test_encoding3(self): def test_encoding3(self):
u = '\N{GREEK SMALL LETTER ALPHA}\N{GREEK CAPITAL LETTER OMEGA}' u = '\N{GREEK SMALL LETTER ALPHA}\N{GREEK CAPITAL LETTER OMEGA}'
j = json.dumps(u) j = self.dumps(u)
self.assertEqual(j, '"\\u03b1\\u03a9"') self.assertEqual(j, '"\\u03b1\\u03a9"')
def test_encoding4(self): def test_encoding4(self):
u = '\N{GREEK SMALL LETTER ALPHA}\N{GREEK CAPITAL LETTER OMEGA}' u = '\N{GREEK SMALL LETTER ALPHA}\N{GREEK CAPITAL LETTER OMEGA}'
j = json.dumps([u]) j = self.dumps([u])
self.assertEqual(j, '["\\u03b1\\u03a9"]') self.assertEqual(j, '["\\u03b1\\u03a9"]')
def test_encoding5(self): def test_encoding5(self):
u = '\N{GREEK SMALL LETTER ALPHA}\N{GREEK CAPITAL LETTER OMEGA}' u = '\N{GREEK SMALL LETTER ALPHA}\N{GREEK CAPITAL LETTER OMEGA}'
j = json.dumps(u, ensure_ascii=False) j = self.dumps(u, ensure_ascii=False)
self.assertEqual(j, '"{0}"'.format(u)) self.assertEqual(j, '"{0}"'.format(u))
def test_encoding6(self): def test_encoding6(self):
u = '\N{GREEK SMALL LETTER ALPHA}\N{GREEK CAPITAL LETTER OMEGA}' u = '\N{GREEK SMALL LETTER ALPHA}\N{GREEK CAPITAL LETTER OMEGA}'
j = json.dumps([u], ensure_ascii=False) j = self.dumps([u], ensure_ascii=False)
self.assertEqual(j, '["{0}"]'.format(u)) self.assertEqual(j, '["{0}"]'.format(u))
def test_big_unicode_encode(self): def test_big_unicode_encode(self):
u = '\U0001d120' u = '\U0001d120'
self.assertEqual(json.dumps(u), '"\\ud834\\udd20"') self.assertEqual(self.dumps(u), '"\\ud834\\udd20"')
self.assertEqual(json.dumps(u, ensure_ascii=False), '"\U0001d120"') self.assertEqual(self.dumps(u, ensure_ascii=False), '"\U0001d120"')
def test_big_unicode_decode(self): def test_big_unicode_decode(self):
u = 'z\U0001d120x' u = 'z\U0001d120x'
self.assertEqual(json.loads('"' + u + '"'), u) self.assertEqual(self.loads('"' + u + '"'), u)
self.assertEqual(json.loads('"z\\ud834\\udd20x"'), u) self.assertEqual(self.loads('"z\\ud834\\udd20x"'), u)
def test_unicode_decode(self): def test_unicode_decode(self):
for i in range(0, 0xd7ff): for i in range(0, 0xd7ff):
u = chr(i) u = chr(i)
s = '"\\u{0:04x}"'.format(i) s = '"\\u{0:04x}"'.format(i)
self.assertEqual(json.loads(s), u) self.assertEqual(self.loads(s), u)
def test_unicode_preservation(self): def test_unicode_preservation(self):
self.assertEqual(type(json.loads('""')), str) self.assertEqual(type(self.loads('""')), str)
self.assertEqual(type(json.loads('"a"')), str) self.assertEqual(type(self.loads('"a"')), str)
self.assertEqual(type(json.loads('["a"]')[0]), str) self.assertEqual(type(self.loads('["a"]')[0]), str)
def test_bytes_encode(self): def test_bytes_encode(self):
self.assertRaises(TypeError, json.dumps, b"hi") self.assertRaises(TypeError, self.dumps, b"hi")
self.assertRaises(TypeError, json.dumps, [b"hi"]) self.assertRaises(TypeError, self.dumps, [b"hi"])
def test_bytes_decode(self): def test_bytes_decode(self):
self.assertRaises(TypeError, json.loads, b'"hi"') self.assertRaises(TypeError, self.loads, b'"hi"')
self.assertRaises(TypeError, json.loads, b'["hi"]') self.assertRaises(TypeError, self.loads, b'["hi"]')
def test_object_pairs_hook_with_unicode(self): def test_object_pairs_hook_with_unicode(self):
s = '{"xkd":1, "kcw":2, "art":3, "hxm":4, "qrt":5, "pad":6, "hoy":7}' s = '{"xkd":1, "kcw":2, "art":3, "hxm":4, "qrt":5, "pad":6, "hoy":7}'
p = [("xkd", 1), ("kcw", 2), ("art", 3), ("hxm", 4), p = [("xkd", 1), ("kcw", 2), ("art", 3), ("hxm", 4),
("qrt", 5), ("pad", 6), ("hoy", 7)] ("qrt", 5), ("pad", 6), ("hoy", 7)]
self.assertEqual(json.loads(s), eval(s)) self.assertEqual(self.loads(s), eval(s))
self.assertEqual(json.loads(s, object_pairs_hook = lambda x: x), p) self.assertEqual(self.loads(s, object_pairs_hook = lambda x: x), p)
od = json.loads(s, object_pairs_hook = OrderedDict) od = self.loads(s, object_pairs_hook = OrderedDict)
self.assertEqual(od, OrderedDict(p)) self.assertEqual(od, OrderedDict(p))
self.assertEqual(type(od), OrderedDict) self.assertEqual(type(od), OrderedDict)
# the object_pairs_hook takes priority over the object_hook # the object_pairs_hook takes priority over the object_hook
self.assertEqual(json.loads(s, self.assertEqual(self.loads(s, object_pairs_hook = OrderedDict,
object_pairs_hook = OrderedDict,
object_hook = lambda x: None), object_hook = lambda x: None),
OrderedDict(p)) OrderedDict(p))
class TestPyUnicode(TestUnicode, PyTest): pass
class TestCUnicode(TestUnicode, CTest): pass


@ -37,6 +37,7 @@ __all__ = [
"findfile", "sortdict", "check_syntax_error", "open_urlresource", "findfile", "sortdict", "check_syntax_error", "open_urlresource",
"check_warnings", "CleanImport", "EnvironmentVarGuard", "check_warnings", "CleanImport", "EnvironmentVarGuard",
"TransientResource", "captured_output", "captured_stdout", "TransientResource", "captured_output", "captured_stdout",
"captured_stdin", "captured_stderr",
"time_out", "socket_peer_reset", "ioerror_peer_reset", "time_out", "socket_peer_reset", "ioerror_peer_reset",
"run_with_locale", 'temp_umask', "transient_internet", "run_with_locale", 'temp_umask', "transient_internet",
"set_memlimit", "bigmemtest", "bigaddrspacetest", "BasicTestRunner", "set_memlimit", "bigmemtest", "bigaddrspacetest", "BasicTestRunner",
@ -92,19 +93,15 @@ def import_module(name, deprecated=False):
def _save_and_remove_module(name, orig_modules): def _save_and_remove_module(name, orig_modules):
"""Helper function to save and remove a module from sys.modules """Helper function to save and remove a module from sys.modules
Return True if the module was in sys.modules, False otherwise.
Raise ImportError if the module can't be imported.""" Raise ImportError if the module can't be imported."""
saved = True
try:
orig_modules[name] = sys.modules[name]
except KeyError:
# try to import the module and raise an error if it can't be imported # try to import the module and raise an error if it can't be imported
if name not in sys.modules:
__import__(name) __import__(name)
saved = False
else:
del sys.modules[name] del sys.modules[name]
return saved for modname in list(sys.modules):
if modname == name or modname.startswith(name + '.'):
orig_modules[modname] = sys.modules[modname]
del sys.modules[modname]
def _save_and_block_module(name, orig_modules): def _save_and_block_module(name, orig_modules):
"""Helper function to save and block a module in sys.modules """Helper function to save and block a module in sys.modules
@ -132,8 +129,8 @@ def import_fresh_module(name, fresh=(), blocked=(), deprecated=False):
If deprecated is True, any module or package deprecation messages If deprecated is True, any module or package deprecation messages
will be suppressed.""" will be suppressed."""
# NOTE: test_heapq and test_warnings include extra sanity checks to make # NOTE: test_heapq, test_json and test_warnings include extra sanity checks
# sure that this utility function is working as expected # to make sure that this utility function is working as expected
with _ignore_deprecated_imports(deprecated): with _ignore_deprecated_imports(deprecated):
# Keep track of modules saved for later restoration as well # Keep track of modules saved for later restoration as well
# as those which just need a blocking entry removed # as those which just need a blocking entry removed
@ -895,14 +892,8 @@ def transient_internet(resource_name, *, timeout=30.0, errnos=()):
@contextlib.contextmanager @contextlib.contextmanager
def captured_output(stream_name): def captured_output(stream_name):
"""Run the 'with' statement body using a StringIO object in place of a """Return a context manager used by captured_stdout/stdin/stderr
specific attribute on the sys module. that temporarily replaces the sys stream *stream_name* with a StringIO."""
Example use (with 'stream_name=stdout')::
with captured_stdout() as s:
print("hello")
assert s.getvalue() == "hello"
"""
import io import io
orig_stdout = getattr(sys, stream_name) orig_stdout = getattr(sys, stream_name)
setattr(sys, stream_name, io.StringIO()) setattr(sys, stream_name, io.StringIO())
@ -912,6 +903,12 @@ def captured_output(stream_name):
setattr(sys, stream_name, orig_stdout) setattr(sys, stream_name, orig_stdout)
def captured_stdout(): def captured_stdout():
"""Capture the output of sys.stdout:
with captured_stdout() as s:
print("hello")
self.assertEqual(s.getvalue(), "hello")
"""
return captured_output("stdout") return captured_output("stdout")
def captured_stderr(): def captured_stderr():
@ -920,6 +917,7 @@ def captured_stderr():
def captured_stdin(): def captured_stdin():
return captured_output("stdin") return captured_output("stdin")
def gc_collect(): def gc_collect():
"""Force as many objects as possible to be collected. """Force as many objects as possible to be collected.


@ -193,6 +193,7 @@ class CompressTestCase(BaseCompressTestCase, unittest.TestCase):
data = b'x' * size data = b'x' * size
try: try:
self.assertRaises(OverflowError, zlib.compress, data, 1) self.assertRaises(OverflowError, zlib.compress, data, 1)
self.assertRaises(OverflowError, zlib.decompress, data)
finally: finally:
data = None data = None
@ -360,6 +361,15 @@ class CompressObjectTestCase(BaseCompressTestCase, unittest.TestCase):
self.assertRaises(ValueError, dco.decompress, b"", -1) self.assertRaises(ValueError, dco.decompress, b"", -1)
self.assertEqual(b'', dco.unconsumed_tail) self.assertEqual(b'', dco.unconsumed_tail)
def test_clear_unconsumed_tail(self):
# Issue #12050: calling decompress() without providing max_length
# should clear the unconsumed_tail attribute.
cdata = b"x\x9cKLJ\x06\x00\x02M\x01" # "abc"
dco = zlib.decompressobj()
ddata = dco.decompress(cdata, 1)
ddata += dco.decompress(dco.unconsumed_tail)
self.assertEqual(dco.unconsumed_tail, b"")
def test_flushes(self): def test_flushes(self):
# Test flush() with the various options, using all the # Test flush() with the various options, using all the
# different levels in order to provide more variations. # different levels in order to provide more variations.
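
(Aside: a hedged interactive sketch of the behaviour the new ``test_clear_unconsumed_tail`` case exercises; it assumes an interpreter built with this fix.)

    import zlib

    cdata = zlib.compress(b"abc")
    dco = zlib.decompressobj()
    part = dco.decompress(cdata, 1)              # max_length=1 leaves an unconsumed tail
    rest = dco.decompress(dco.unconsumed_tail)   # no max_length: the tail should be cleared
    assert part + rest == b"abc"
    assert dco.unconsumed_tail == b""
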


@ -90,6 +90,9 @@ Core and Builtins
Library Library
------- -------
- Issue #12050: zlib.decompressobj().decompress() now clears the unconsumed_tail
attribute when called without a max_length argument.
- Issue #12062: Fix a flushing bug when doing a certain type of I/O sequence - Issue #12062: Fix a flushing bug when doing a certain type of I/O sequence
on a file opened in read+write mode (namely: reading, seeking a bit forward, on a file opened in read+write mode (namely: reading, seeking a bit forward,
writing, then seeking before the previous write but still within buffered writing, then seeking before the previous write but still within buffered
@ -379,6 +382,8 @@ Extension Modules
Tests Tests
----- -----
- Issue #5723: Improve json tests to be executed with and without accelerations.
- Issue #11873: Change regex in test_compileall to fix occasional failures when - Issue #11873: Change regex in test_compileall to fix occasional failures when
when the randomly generated temporary path happened to match the regex. when the randomly generated temporary path happened to match the regex.


@ -116,7 +116,7 @@ PyZlib_compress(PyObject *self, PyObject *args)
{ {
PyObject *ReturnVal = NULL; PyObject *ReturnVal = NULL;
Py_buffer pinput; Py_buffer pinput;
Byte *input, *output; Byte *input, *output = NULL;
unsigned int length; unsigned int length;
int level=Z_DEFAULT_COMPRESSION, err; int level=Z_DEFAULT_COMPRESSION, err;
z_stream zst; z_stream zst;
@ -127,20 +127,19 @@ PyZlib_compress(PyObject *self, PyObject *args)
if (pinput.len > UINT_MAX) { if (pinput.len > UINT_MAX) {
PyErr_SetString(PyExc_OverflowError, PyErr_SetString(PyExc_OverflowError,
"size does not fit in an unsigned int"); "Size does not fit in an unsigned int");
return NULL; goto error;
} }
length = pinput.len;
input = pinput.buf; input = pinput.buf;
length = pinput.len;
zst.avail_out = length + length/1000 + 12 + 1; zst.avail_out = length + length/1000 + 12 + 1;
output = (Byte*)malloc(zst.avail_out); output = (Byte*)malloc(zst.avail_out);
if (output == NULL) { if (output == NULL) {
PyBuffer_Release(&pinput);
PyErr_SetString(PyExc_MemoryError, PyErr_SetString(PyExc_MemoryError,
"Can't allocate memory to compress data"); "Can't allocate memory to compress data");
return NULL; goto error;
} }
/* Past the point of no return. From here on out, we need to make sure /* Past the point of no return. From here on out, we need to make sure
@ -203,7 +202,7 @@ PyDoc_STRVAR(decompress__doc__,
static PyObject * static PyObject *
PyZlib_decompress(PyObject *self, PyObject *args) PyZlib_decompress(PyObject *self, PyObject *args)
{ {
PyObject *result_str; PyObject *result_str = NULL;
Py_buffer pinput; Py_buffer pinput;
Byte *input; Byte *input;
unsigned int length; unsigned int length;
@ -218,11 +217,11 @@ PyZlib_decompress(PyObject *self, PyObject *args)
if (pinput.len > UINT_MAX) { if (pinput.len > UINT_MAX) {
PyErr_SetString(PyExc_OverflowError, PyErr_SetString(PyExc_OverflowError,
"size does not fit in an unsigned int"); "Size does not fit in an unsigned int");
return NULL; goto error;
} }
length = pinput.len;
input = pinput.buf; input = pinput.buf;
length = pinput.len;
if (r_strlen <= 0) if (r_strlen <= 0)
r_strlen = 1; r_strlen = 1;
@ -230,10 +229,8 @@ PyZlib_decompress(PyObject *self, PyObject *args)
zst.avail_in = length; zst.avail_in = length;
zst.avail_out = r_strlen; zst.avail_out = r_strlen;
if (!(result_str = PyBytes_FromStringAndSize(NULL, r_strlen))) { if (!(result_str = PyBytes_FromStringAndSize(NULL, r_strlen)))
PyBuffer_Release(&pinput); goto error;
return NULL;
}
zst.zalloc = (alloc_func)NULL; zst.zalloc = (alloc_func)NULL;
zst.zfree = (free_func)Z_NULL; zst.zfree = (free_func)Z_NULL;
@ -574,18 +571,23 @@ PyZlib_objdecompress(compobject *self, PyObject *args)
Py_END_ALLOW_THREADS Py_END_ALLOW_THREADS
} }
/* Not all of the compressed data could be accommodated in the output buffer
of specified size. Return the unconsumed tail in an attribute.*/
if(max_length) { if(max_length) {
/* Not all of the compressed data could be accommodated in a buffer of
the specified size. Return the unconsumed tail in an attribute. */
Py_DECREF(self->unconsumed_tail); Py_DECREF(self->unconsumed_tail);
self->unconsumed_tail = PyBytes_FromStringAndSize((char *)self->zst.next_in, self->unconsumed_tail = PyBytes_FromStringAndSize((char *)self->zst.next_in,
self->zst.avail_in); self->zst.avail_in);
if(!self->unconsumed_tail) { }
else if (PyBytes_GET_SIZE(self->unconsumed_tail) > 0) {
/* All of the compressed data was consumed. Clear unconsumed_tail. */
Py_DECREF(self->unconsumed_tail);
self->unconsumed_tail = PyBytes_FromStringAndSize("", 0);
}
if (self->unconsumed_tail == NULL) {
Py_DECREF(RetVal); Py_DECREF(RetVal);
RetVal = NULL; RetVal = NULL;
goto error; goto error;
} }
}
/* The end of the compressed data has been reached, so set the /* The end of the compressed data has been reached, so set the
unused_data attribute to a string containing the remainder of the unused_data attribute to a string containing the remainder of the