This commit is contained in:
Larry Hastings 2014-01-26 22:28:06 -08:00
commit 23105d8014
38 changed files with 1006 additions and 468 deletions

View File

@ -90,6 +90,16 @@ in various ways. There is a separate error indicator for each thread.
the class in that case. If the values are already normalized, nothing happens.
The delayed normalization is implemented to improve performance.
.. note::

   This function *does not* implicitly set the ``__traceback__``
   attribute on the exception value. If setting the traceback
   appropriately is desired, the following additional snippet is needed::

      if (tb != NULL) {
          PyException_SetTraceback(val, tb);
      }
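A Python-level sketch of the same behaviour (a hypothetical illustration, not part of the patch): the traceback that ``PyException_SetTraceback()`` installs is exposed in Python as the exception's ``__traceback__`` attribute, which ``with_traceback()`` sets explicitly.

```python
# The __traceback__ attribute is captured when an exception is raised;
# with_traceback() transplants it onto another exception object.
try:
    raise ValueError("boom")
except ValueError as exc:
    tb = exc.__traceback__                      # set implicitly by raise
    wrapped = ValueError("rewrapped").with_traceback(tb)

print(wrapped.__traceback__ is tb)  # True
```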
.. c:function:: void PyErr_Clear()

View File

@ -371,7 +371,8 @@ Module Functions
Returns an iterator over the :class:`Future` instances (possibly created by
different :class:`Executor` instances) given by *fs* that yields futures as
they complete (finished or were cancelled). Any futures given by *fs* that
are duplicated will be returned once. Any futures that completed
before :func:`as_completed` is called will be yielded first. The returned
iterator raises a :exc:`TimeoutError` if :meth:`~iterator.__next__` is
called and the result isn't available after *timeout* seconds from the
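The documented deduplication can be seen with a small usage sketch (hypothetical example, not from the patch): the same future passed twice is yielded only once.

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def square(x):
    return x * x

with ThreadPoolExecutor(max_workers=1) as pool:
    fut = pool.submit(square, 7)
    # A future that appears twice in *fs* is still yielded only once.
    results = [f.result() for f in as_completed([fut, fut])]

print(results)  # [49]
```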

View File

@ -319,27 +319,25 @@ Yield expressions
yield_atom: "(" `yield_expression` ")"
yield_expression: "yield" [`expression_list` | "from" `expression`]
The yield expression is only used when defining a :term:`generator` function and
thus can only be used in the body of a function definition. Using a yield
expression in a function's body causes that function to be a generator.
When a generator function is called, it returns an iterator known as a
generator. That generator then controls the execution of a generator function.
The execution starts when one of the generator's methods is called. At that
time, the execution proceeds to the first yield expression, where it is
suspended again, returning the value of :token:`expression_list` to generator's
caller. By suspended, we mean that all local state is retained, including the
current bindings of local variables, the instruction pointer, and the internal
evaluation stack. When the execution is resumed by calling one of the
generator's methods, the function can proceed exactly as if the yield expression
was just another external call. The value of the yield expression after
resuming depends on the method which resumed the execution. If
:meth:`~generator.__next__` is used (typically via either a :keyword:`for` or
the :func:`next` builtin) then the result is :const:`None`. Otherwise, if
:meth:`~generator.send` is used, then the result will be the value passed in to
that method.
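The suspend/resume behaviour described above can be sketched in a few lines (hypothetical example, not from the patch): the value of the yield expression depends on whether ``__next__()`` or ``send()`` resumed the generator.

```python
def gen():
    # Suspended here after the first next(); the yield expression's
    # value is supplied by whichever method resumes the generator.
    received = yield "first"
    yield ("got", received)

g = gen()
first = next(g)      # runs to the first yield; yields "first"
second = g.send(42)  # resumes; the yield expression evaluates to 42
print(first, second)
```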
.. index:: single: coroutine
@ -349,11 +347,11 @@ suspended. The only difference is that a generator function cannot control
where should the execution continue after it yields; the control is always
transferred to the generator's caller.
yield expressions are allowed in the :keyword:`try` clause of a :keyword:`try`
... :keyword:`finally` construct. If the generator is not resumed before it is
finalized (by reaching a zero reference count or by being garbage collected),
the generator-iterator's :meth:`~generator.close` method will be called,
allowing any pending :keyword:`finally` clauses to execute.
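A minimal sketch of that finalization rule (hypothetical example, not from the patch): calling ``close()`` on a generator suspended inside a ``try`` runs its pending ``finally`` clause.

```python
log = []

def gen():
    try:
        yield 1
    finally:
        log.append("finally")  # runs when the generator is finalized

g = gen()
next(g)    # suspend inside the try clause
g.close()  # triggers the pending finally clause
print(log)  # ['finally']
```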
When ``yield from <expr>`` is used, it treats the supplied expression as
a subiterator. All values produced by that subiterator are passed directly
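Delegation with ``yield from`` can be illustrated briefly (hypothetical example, not from the patch): values from the subiterator pass straight through, and its return value becomes the value of the ``yield from`` expression.

```python
def inner():
    yield 1
    yield 2
    return "done"              # becomes the value of the yield from

def outer():
    result = yield from inner()  # inner()'s values pass straight through
    yield result

values = list(outer())
print(values)  # [1, 2, 'done']
```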
@ -373,12 +371,24 @@ the yield expression. It can be either set explicitly when raising
.. versionchanged:: 3.3
   Added ``yield from <expr>`` to delegate control flow to a subiterator
The parentheses may be omitted when the yield expression is the sole expression
on the right hand side of an assignment statement.
.. seealso::

   :pep:`0255` - Simple Generators
      The proposal for adding generators and the :keyword:`yield` statement
      to Python.

   :pep:`0342` - Coroutines via Enhanced Generators
      The proposal to enhance the API and syntax of generators, making them
      usable as simple coroutines.

   :pep:`0380` - Syntax for Delegating to a Subgenerator
      The proposal to introduce the :token:`yield_from` syntax, making
      delegation to sub-generators easy.
.. index:: object: generator

Generator-iterator methods
^^^^^^^^^^^^^^^^^^^^^^^^^^
@ -395,13 +405,12 @@ is already executing raises a :exc:`ValueError` exception.
.. method:: generator.__next__()
Starts the execution of a generator function or resumes it at the last
executed yield expression. When a generator function is resumed with a
:meth:`~generator.__next__` method, the current yield expression always
evaluates to :const:`None`. The execution then continues to the next yield
expression, where the generator is suspended again, and the value of the
:token:`expression_list` is returned to :meth:`next`'s caller. If the
generator exits without yielding another value, a :exc:`StopIteration`
exception is raised.
This method is normally called implicitly, e.g. by a :keyword:`for` loop, or
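The ``__next__``/:exc:`StopIteration` protocol described above, sketched directly (hypothetical example, not from the patch):

```python
def counter(n):
    for i in range(n):
        yield i

g = counter(2)
a = g.__next__()   # 0 -- explicit method call
b = next(g)        # 1 -- the next() builtin calls __next__()
try:
    next(g)        # generator exits without yielding another value
    exhausted = False
except StopIteration:
    exhausted = True
print(a, b, exhausted)
```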
@ -411,12 +420,12 @@ is already executing raises a :exc:`ValueError` exception.
.. method:: generator.send(value)
Resumes the execution and "sends" a value into the generator function. The
*value* argument becomes the result of the current yield expression. The
:meth:`send` method returns the next value yielded by the generator, or
raises :exc:`StopIteration` if the generator exits without yielding another
value. When :meth:`send` is called to start the generator, it must be called
with :const:`None` as the argument, because there is no yield expression that
could receive the value.
.. method:: generator.throw(type[, value[, traceback]])
@ -478,20 +487,6 @@ For examples using ``yield from``, see :ref:`pep-380` in "What's New in
Python."
.. seealso::
:pep:`0255` - Simple Generators
The proposal for adding generators and the :keyword:`yield` statement to Python.
:pep:`0342` - Coroutines via Enhanced Generators
The proposal to enhance the API and syntax of generators, making them
usable as simple coroutines.
:pep:`0380` - Syntax for Delegating to a Subgenerator
The proposal to introduce the :token:`yield_from` syntax, making delegation
to sub-generators easy.
.. _primaries:

Primaries

View File

@ -445,53 +445,26 @@ The :keyword:`yield` statement
.. productionlist::
   yield_stmt: `yield_expression`
A :keyword:`yield` statement is semantically equivalent to a :ref:`yield
expression <yieldexpr>`. The yield statement can be used to omit the parentheses
that would otherwise be required in the equivalent yield expression
statement. For example, the yield statements ::

    yield <expr>
    yield from <expr>

are equivalent to the yield expression statements ::

    (yield <expr>)
    (yield from <expr>)

Yield expressions and statements are only used when defining a :term:`generator`
function, and are only used in the body of the generator function. Using yield
in a function definition is sufficient to cause that definition to create a
generator function instead of a normal function.

For full details of :keyword:`yield` semantics, refer to the
:ref:`yieldexpr` section.
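The equivalence between the two spellings can be checked directly (hypothetical example, not from the patch):

```python
def with_statement():
    yield 1        # yield statement: no parentheses needed

def with_expression():
    (yield 1)      # the equivalent parenthesized yield expression

stmt_values = list(with_statement())
expr_values = list(with_expression())
print(stmt_values, expr_values)  # [1] [1]
```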
.. _raise: .. _raise:

View File

@ -264,6 +264,9 @@ name of the codec responsible for producing the error::
>>> import codecs
>>> codecs.decode(b"abcdefgh", "hex")
Traceback (most recent call last):
File "/usr/lib/python3.4/encodings/hex_codec.py", line 20, in hex_decode
return (binascii.a2b_hex(input), len(input))
binascii.Error: Non-hexadecimal digit found

The above exception was the direct cause of the following exception:
@ -273,6 +276,11 @@ name of the codec responsible for producing the error::
binascii.Error: decoding with 'hex' codec failed (Error: Non-hexadecimal digit found)
>>> codecs.encode("hello", "bz2")
Traceback (most recent call last):
File "/usr/lib/python3.4/encodings/bz2_codec.py", line 17, in bz2_encode
return (bz2.compress(input), len(input))
File "/usr/lib/python3.4/bz2.py", line 498, in compress
return comp.compress(data) + comp.flush()
TypeError: 'str' does not support the buffer interface

The above exception was the direct cause of the following exception:

View File

@ -114,7 +114,6 @@ class BaseSubprocessTransport(transports.SubprocessTransport):
        assert returncode is not None, returncode
        assert self._returncode is None, self._returncode
        self._returncode = returncode
self._loop._subprocess_closed(self)
        self._call(self._protocol.process_exited)
        self._try_finish()

View File

@ -169,9 +169,6 @@ class _UnixSelectorEventLoop(selector_events.BaseSelectorEventLoop):
    def _child_watcher_callback(self, pid, returncode, transp):
        self.call_soon_threadsafe(transp._process_exited, returncode)
def _subprocess_closed(self, transp):
pass
def _set_nonblocking(fd):
    flags = fcntl.fcntl(fd, fcntl.F_GETFL)

View File

@ -178,9 +178,6 @@ class ProactorEventLoop(proactor_events.BaseProactorEventLoop):
        yield from transp._post_init()
        return transp
def _subprocess_closed(self, transport):
pass
class IocpProactor:
    """Proactor implementation using IOCP."""

View File

@ -475,15 +475,12 @@ class StreamReader(Codec):
        # read until we get the required number of characters (if available)
        while True:
            # can the request be satisfied from the character buffer?
            if chars >= 0:
                if len(self.charbuffer) >= chars:
                    break
            elif size >= 0:
                if len(self.charbuffer) >= size:
                    break
            # we need more data
            if size < 0:
                newdata = self.stream.read()
@ -491,6 +488,8 @@ class StreamReader(Codec):
                newdata = self.stream.read(size)
            # decode bytes (those remaining from the last call included)
            data = self.bytebuffer + newdata
if not data:
break
            try:
                newchars, decodedbytes = self.decode(data, self.errors)
            except UnicodeDecodeError as exc:
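The rewritten branch above can be exercised from Python through the public ``StreamReader.read()`` signature, which takes both *size* (bytes to read) and *chars* (characters to return); a small sketch (hypothetical example, not from the patch):

```python
import codecs
import io

# A UTF-8 StreamReader over an in-memory byte stream.
reader = codecs.getreader('utf-8')(io.BytesIO('abcdef'.encode('utf-8')))
head = reader.read(size=40, chars=3)  # returns exactly 3 characters
rest = reader.read()                  # the remainder of the stream
print(head, rest)  # abc def
```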

View File

@ -181,7 +181,8 @@ def as_completed(fs, timeout=None):
    Returns:
        An iterator that yields the given Futures as they complete (finished or
        cancelled). If any given Futures are duplicated, they will be returned
        once.
    Raises:
        TimeoutError: If the entire result iterator could not be generated
@ -190,11 +191,12 @@ def as_completed(fs, timeout=None):
    if timeout is not None:
        end_time = timeout + time.time()
fs = set(fs)
    with _AcquireFutures(fs):
        finished = set(
            f for f in fs
            if f._state in [CANCELLED_AND_NOTIFIED, FINISHED])
        pending = fs - finished
        waiter = _create_and_install_waiters(fs, _AS_COMPLETED)

    try:

View File

@ -5,16 +5,16 @@ parameter and docstring information when you type an opening parenthesis, and
which disappear when you type a closing parenthesis.
"""
import __main__
import inspect
import re
import sys
import textwrap
import types
import inspect
from idlelib import CallTipWindow
from idlelib.HyperParser import HyperParser
import __main__
class CallTips:

    menudefs = [
@ -117,8 +117,9 @@ def get_entity(expression):
        return None

# The following are used in get_argspec and some in tests
_MAX_COLS = 85
_MAX_LINES = 5  # enough for bytes
_INDENT = ' '*4 # for wrapped signatures
_first_param = re.compile('(?<=\()\w*\,?\s*')
_default_callable_argspec = "See source or doc"
@ -149,13 +150,15 @@ def get_argspec(ob):
            isinstance(ob_call, types.MethodType)):
        argspec = _first_param.sub("", argspec)
lines = (textwrap.wrap(argspec, _MAX_COLS, subsequent_indent=_INDENT)
if len(argspec) > _MAX_COLS else [argspec] if argspec else [])
    if isinstance(ob_call, types.MethodType):
        doc = ob_call.__doc__
    else:
        doc = getattr(ob, "__doc__", "")
    if doc:
        for line in doc.split('\n', _MAX_LINES)[:_MAX_LINES]:
            line = line.strip()
            if not line:
                break

View File

@ -82,9 +82,10 @@ class ConfigDialog(Toplevel):
        else:
            extraKwds=dict(padx=6, pady=3)
# Comment out button creation and packing until implement self.Help
##        self.buttonHelp = Button(frameActionButtons,text='Help',
##                command=self.Help,takefocus=FALSE,
##                **extraKwds)
        self.buttonOk = Button(frameActionButtons,text='Ok',
                command=self.Ok,takefocus=FALSE,
                **extraKwds)
@ -98,7 +99,7 @@ class ConfigDialog(Toplevel):
        self.CreatePageHighlight()
        self.CreatePageKeys()
        self.CreatePageGeneral()
##        self.buttonHelp.pack(side=RIGHT,padx=5)
        self.buttonOk.pack(side=LEFT,padx=5)
        self.buttonApply.pack(side=LEFT,padx=5)
        self.buttonCancel.pack(side=LEFT,padx=5)

View File

@ -1,5 +1,6 @@
import unittest
import idlelib.CallTips as ct
import textwrap
import types

default_tip = ct._default_callable_argspec
@ -55,32 +56,45 @@ class Get_signatureTest(unittest.TestCase):
        gtest(list.__new__,
              'T.__new__(S, ...) -> a new object with type S, a subtype of T')
        gtest(list.__init__,
              'x.__init__(...) initializes x; see help(type(x)) for signature')
        append_doc = "L.append(object) -> None -- append object to end"
        gtest(list.append, append_doc)
        gtest([].append, append_doc)
        gtest(List.append, append_doc)
        gtest(types.MethodType, "method(function, instance)")
        gtest(SB(), default_tip)
def test_signature_wrap(self):
self.assertEqual(signature(textwrap.TextWrapper), '''\
(width=70, initial_indent='', subsequent_indent='', expand_tabs=True,
replace_whitespace=True, fix_sentence_endings=False, break_long_words=True,
drop_whitespace=True, break_on_hyphens=True, tabsize=8, *, max_lines=None,
placeholder=' [...]')''')
def test_docline_truncation(self):
def f(): pass
f.__doc__ = 'a'*300
self.assertEqual(signature(f), '()\n' + 'a' * (ct._MAX_COLS-3) + '...')
    def test_multiline_docstring(self):
        # Test fewer lines than max.
        self.assertEqual(signature(list),
            "list() -> new empty list\n"
            "list(iterable) -> new list initialized from iterable's items")
        # Test max lines
        self.assertEqual(signature(bytes), '''\
bytes(iterable_of_ints) -> bytes
bytes(string, encoding[, errors]) -> bytes
bytes(bytes_or_buffer) -> immutable copy of bytes_or_buffer
bytes(int) -> bytes object of size given by the parameter initialized with null bytes
bytes() -> empty bytes object''')
# Test more than max lines
def f(): pass
f.__doc__ = 'a\n' * 15
self.assertEqual(signature(f), '()' + '\na' * ct._MAX_LINES)
    def test_functions(self):
        def t1(): 'doc'
@ -109,6 +123,16 @@ class Get_signatureTest(unittest.TestCase):
                (tc.__call__, '(ci)'), (tc, '(ci)'), (TC.cm, "(a)"),):
            self.assertEqual(signature(meth), mtip + "\ndoc")
def test_starred_parameter(self):
# test that starred first parameter is *not* removed from argspec
class C:
def m1(*args): pass
def m2(**kwds): pass
c = C()
for meth, mtip in ((C.m1, '(*args)'), (c.m1, "(*args)"),
(C.m2, "(**kwds)"), (c.m2, "(**kwds)"),):
self.assertEqual(signature(meth), mtip)
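The new test above relies on the module's first-parameter stripper leaving starred parameters alone; the regex (copied from CallTips.py) can be checked in isolation:

```python
import re

# The first-parameter stripper from CallTips.py: removes a leading
# plain identifier (plus comma and whitespace) right after '('.
_first_param = re.compile(r'(?<=\()\w*\,?\s*')

plain = _first_param.sub("", "(self, a, b)")
starred = _first_param.sub("", "(*args)")
print(plain, starred)  # (a, b) (*args)
```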
    def test_non_ascii_name(self):
        # test that re works to delete a first parameter name that
        # includes non-ascii chars, such as various forms of A.

Binary file not shown.


Binary file not shown.


Binary file not shown.


View File

@ -0,0 +1,3 @@
P4
16 16

Binary file not shown.

Binary file not shown.


Binary file not shown.

Binary file not shown.

Binary file not shown.

Binary file not shown.

View File

@ -0,0 +1,6 @@
#define python_width 16
#define python_height 16
static char python_bits[] = {
0xDF, 0xFE, 0x8F, 0xFD, 0x5F, 0xFB, 0xAB, 0xFE, 0xB5, 0x8D, 0xDA, 0x8F,
0xA5, 0x86, 0xFA, 0x83, 0x1A, 0x80, 0x0D, 0x80, 0x0D, 0x80, 0x0F, 0xE0,
0x0F, 0xF8, 0x0F, 0xF8, 0x0F, 0xFC, 0xFF, 0xFF, };

View File

@ -175,6 +175,40 @@ class ReadTest(MixInCheckStateHandling):
            size*"a",
        )
def test_mixed_readline_and_read(self):
lines = ["Humpty Dumpty sat on a wall,\n",
"Humpty Dumpty had a great fall.\r\n",
"All the king's horses and all the king's men\r",
"Couldn't put Humpty together again."]
data = ''.join(lines)
def getreader():
stream = io.BytesIO(data.encode(self.encoding))
return codecs.getreader(self.encoding)(stream)
# Issue #8260: Test readline() followed by read()
f = getreader()
self.assertEqual(f.readline(), lines[0])
self.assertEqual(f.read(), ''.join(lines[1:]))
self.assertEqual(f.read(), '')
# Issue #16636: Test readline() followed by readlines()
f = getreader()
self.assertEqual(f.readline(), lines[0])
self.assertEqual(f.readlines(), lines[1:])
self.assertEqual(f.read(), '')
# Test read() followed by read()
f = getreader()
self.assertEqual(f.read(size=40, chars=5), data[:5])
self.assertEqual(f.read(), data[5:])
self.assertEqual(f.read(), '')
# Issue #12446: Test read() followed by readlines()
f = getreader()
self.assertEqual(f.read(size=40, chars=5), data[:5])
self.assertEqual(f.readlines(), [lines[0][5:]] + lines[1:])
self.assertEqual(f.read(), '')
    def test_bug1175396(self):
        s = [
            '<%!--===================================================\r\n',
@ -2370,8 +2404,6 @@ class TransformCodecTest(unittest.TestCase):
    def test_readline(self):
        for encoding in bytes_transform_encodings:
if encoding in ['uu_codec', 'zlib_codec']:
continue
            with self.subTest(encoding=encoding):
                sin = codecs.encode(b"\x80", encoding)
                reader = codecs.getreader(encoding)(io.BytesIO(sin))
@ -2522,6 +2554,7 @@ class ExceptionChainingTest(unittest.TestCase):
        with self.assertRaisesRegex(exc_type, full_msg) as caught:
            yield caught
        self.assertIsInstance(caught.exception.__cause__, exc_type)
self.assertIsNotNone(caught.exception.__cause__.__traceback__)
    def raise_obj(self, *args, **kwds):
        # Helper to dynamically change the object raised by a test codec
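The added assertion checks that a chained exception's ``__cause__`` now carries a traceback; the same invariant holds for plain ``raise ... from`` at the Python level (hypothetical example, not from the patch):

```python
def convert():
    try:
        1 / 0
    except ZeroDivisionError as exc:
        # raise ... from sets __cause__; its __traceback__ was
        # filled in when the original exception was raised.
        raise RuntimeError("decode failed") from exc

try:
    convert()
except RuntimeError as exc:
    cause = exc.__cause__

print(type(cause).__name__, cause.__traceback__ is not None)
```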

View File

@ -350,6 +350,13 @@ class AsCompletedTests:
                SUCCESSFUL_FUTURE]),
            completed_futures)
def test_duplicate_futures(self):
# Issue 20367. Duplicate futures should not raise exceptions or give
# duplicate responses.
future1 = self.executor.submit(time.sleep, 2)
completed = [f for f in futures.as_completed([future1,future1])]
self.assertEqual(len(completed), 1)
class ThreadPoolAsCompletedTests(ThreadPoolMixin, AsCompletedTests, unittest.TestCase):
    pass

Lib/test/test_imghdr.py (new file, 131 lines)
View File

@ -0,0 +1,131 @@
import imghdr
import io
import os
import unittest
import warnings
from test.support import findfile, TESTFN, unlink
TEST_FILES = (
('python.png', 'png'),
('python.gif', 'gif'),
('python.bmp', 'bmp'),
('python.ppm', 'ppm'),
('python.pgm', 'pgm'),
('python.pbm', 'pbm'),
('python.jpg', 'jpeg'),
('python.ras', 'rast'),
('python.sgi', 'rgb'),
('python.tiff', 'tiff'),
('python.xbm', 'xbm')
)
class UnseekableIO(io.FileIO):
def tell(self):
raise io.UnsupportedOperation
def seek(self, *args, **kwargs):
raise io.UnsupportedOperation
class TestImghdr(unittest.TestCase):
@classmethod
def setUpClass(cls):
cls.testfile = findfile('python.png', subdir='imghdrdata')
with open(cls.testfile, 'rb') as stream:
cls.testdata = stream.read()
def tearDown(self):
unlink(TESTFN)
def test_data(self):
for filename, expected in TEST_FILES:
filename = findfile(filename, subdir='imghdrdata')
self.assertEqual(imghdr.what(filename), expected)
with open(filename, 'rb') as stream:
self.assertEqual(imghdr.what(stream), expected)
with open(filename, 'rb') as stream:
data = stream.read()
self.assertEqual(imghdr.what(None, data), expected)
self.assertEqual(imghdr.what(None, bytearray(data)), expected)
def test_register_test(self):
def test_jumbo(h, file):
if h.startswith(b'eggs'):
return 'ham'
imghdr.tests.append(test_jumbo)
self.addCleanup(imghdr.tests.pop)
self.assertEqual(imghdr.what(None, b'eggs'), 'ham')
def test_file_pos(self):
with open(TESTFN, 'wb') as stream:
stream.write(b'ababagalamaga')
pos = stream.tell()
stream.write(self.testdata)
with open(TESTFN, 'rb') as stream:
stream.seek(pos)
self.assertEqual(imghdr.what(stream), 'png')
self.assertEqual(stream.tell(), pos)
def test_bad_args(self):
with self.assertRaises(TypeError):
imghdr.what()
with self.assertRaises(AttributeError):
imghdr.what(None)
with self.assertRaises(TypeError):
imghdr.what(self.testfile, 1)
with self.assertRaises(AttributeError):
imghdr.what(os.fsencode(self.testfile))
with open(self.testfile, 'rb') as f:
with self.assertRaises(AttributeError):
imghdr.what(f.fileno())
def test_invalid_headers(self):
for header in (b'\211PN\r\n',
b'\001\331',
b'\x59\xA6',
b'cutecat',
b'000000JFI',
b'GIF80'):
self.assertIsNone(imghdr.what(None, header))
def test_string_data(self):
with warnings.catch_warnings():
warnings.simplefilter("ignore", BytesWarning)
for filename, _ in TEST_FILES:
filename = findfile(filename, subdir='imghdrdata')
with open(filename, 'rb') as stream:
data = stream.read().decode('latin1')
with self.assertRaises(TypeError):
imghdr.what(io.StringIO(data))
with self.assertRaises(TypeError):
imghdr.what(None, data)
def test_missing_file(self):
with self.assertRaises(FileNotFoundError):
imghdr.what('missing')
def test_closed_file(self):
stream = open(self.testfile, 'rb')
stream.close()
with self.assertRaises(ValueError) as cm:
imghdr.what(stream)
stream = io.BytesIO(self.testdata)
stream.close()
with self.assertRaises(ValueError) as cm:
imghdr.what(stream)
def test_unseekable(self):
with open(TESTFN, 'wb') as stream:
stream.write(self.testdata)
with UnseekableIO(TESTFN, 'rb') as stream:
with self.assertRaises(io.UnsupportedOperation):
imghdr.what(stream)
def test_output_stream(self):
with open(TESTFN, 'wb') as stream:
stream.write(self.testdata)
stream.seek(0)
with self.assertRaises(OSError) as cm:
imghdr.what(stream)
if __name__ == '__main__':
unittest.main()
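The tests above exercise imghdr's detection API (`imghdr.what` accepting a filename, a stream, or raw bytes, plus the extensible `imghdr.tests` list used in `test_register_test`). The dispatch mechanism itself is tiny; a minimal standalone sketch of it, with illustrative detector functions rather than the stdlib's own:

```python
# Sketch of imghdr-style detection: a list of test functions, each given the
# header bytes and returning a format name or None. Mirrors the imghdr.tests
# mechanism; the detectors below are illustrative, not the stdlib versions.

def _test_png(h):
    return 'png' if h.startswith(b'\211PNG\r\n\032\n') else None

def _test_gif(h):
    return 'gif' if h[:6] in (b'GIF87a', b'GIF89a') else None

tests = [_test_png, _test_gif]

def what(data):
    """Return the detected image type for the given header bytes, or None."""
    for tf in tests:
        result = tf(data)
        if result:
            return result
    return None
```

Appending a new function to `tests` (and popping it in cleanup) is exactly what `test_register_test` above does against the real module.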

View File

@ -6,7 +6,7 @@ import unittest
 class TestUntestedModules(unittest.TestCase):
     def test_untested_modules_can_be_imported(self):
-        untested = ('bdb', 'encodings', 'formatter', 'imghdr',
+        untested = ('bdb', 'encodings', 'formatter',
                     'nturl2path', 'tabnanny')
         with support.check_warnings(quiet=True):
             for name in untested:

View File

@ -1,4 +1,4 @@
"""wsgiref -- a WSGI (PEP 333) Reference Library """wsgiref -- a WSGI (PEP 3333) Reference Library
Current Contents: Current Contents:

View File

@ -725,6 +725,7 @@ Ronan Lamy
 Torsten Landschoff
 Łukasz Langa
 Tino Lange
+Glenn Langford
 Andrew Langmead
 Detlef Lannert
 Soren Larsen

View File

@ -48,6 +48,16 @@ Core and Builtins
 Library
 -------
 
+- Issue #20367: Fix behavior of concurrent.futures.as_completed() for
+  duplicate arguments. Patch by Glenn Langford.
+
+- Issue #8260: The read(), readline() and readlines() methods of
+  codecs.StreamReader returned incomplete data when were called after
+  readline() or read(size). Based on patch by Amaury Forgeot d'Arc.
+
+- Issue #20105: the codec exception chaining now correctly sets the
+  traceback of the original exception as its __traceback__ attribute.
+
 - asyncio: Various improvements and small changes not all covered by
   issues listed below. E.g. wait_for() now cancels the inner task if
   the timeout occcurs; tweaked the set of exported symbols; renamed

@ -145,6 +155,9 @@ Library

 IDLE
 ----
 
+- Issue #17721: Remove non-functional configuration dialog help button until we
+  make it actually gives some help when clicked. Patch by Guilherme Simões.
+
 - Issue #17390: Add Python version to Idle editor window title bar.
   Original patches by Edmond Burnett and Kent Johnson.

@ -154,9 +167,15 @@ IDLE

 Tests
 -----
 
+- Issue #19990: Added tests for the imghdr module. Based on patch by
+  Claudiu Popa.
+
 - Issue #20358: Tests for curses.window.overlay and curses.window.overwrite
   no longer specify min{row,col} > max{row,col}.
 
+- Issue #19804: The test_find_mac test in test_uuid is now skipped if the
+  ifconfig executable is not available.
+
 - Issue #19886: Use better estimated memory requirements for bigmem tests.

@ -553,9 +572,6 @@ IDLE

 Tests
 -----
 
-- Issue #19804: The test_find_mac test in test_uuid is now skipped if the
-  ifconfig executable is not available.
-
 - Issue #20055: Fix test_shutil under Windows with symlink privileges held.
   Patch by Vajrasky Kok.
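The issue #20367 entry above (as_completed() and duplicate arguments) is easy to observe: after the fix, a future passed several times in *fs* is yielded only once. A quick sketch of the post-fix behavior:

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

# Submit one task, then pass the same future three times to as_completed().
# The fixed behavior deduplicates the input, so the future is yielded once.
with ThreadPoolExecutor(max_workers=1) as pool:
    fut = pool.submit(pow, 2, 10)
    results = [f.result() for f in as_completed([fut, fut, fut])]

assert results == [1024]   # one result despite three references to fut
```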
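The issue #8260 entry above concerns mixing `readline()`/`read(size)` with later reads on a `codecs.StreamReader`; after the fix, no buffered data is lost. A minimal sketch of the expected behavior:

```python
import codecs
import io

# Wrap a byte stream in a codecs.StreamReader and mix readline() with read():
# the fixed StreamReader returns all remaining data, none of it swallowed
# by internal buffering.
raw = io.BytesIO('first line\nsecond line\nrest of the data'.encode('utf-8'))
reader = codecs.getreader('utf-8')(raw)

first = reader.readline()
rest = reader.read()

assert first == 'first line\n'
assert rest == 'second line\nrest of the data'
```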

View File

@ -6149,7 +6149,7 @@ static PyObject *
 load(UnpicklerObject *self)
 {
     PyObject *value = NULL;
-    char *s;
+    char *s = NULL;
 
     self->num_marks = 0;
     self->proto = 0;

View File

@ -1304,6 +1304,7 @@ audioop_ratecv_impl(PyModuleDef *module, Py_buffer *fragment, int width, int nch
"weightA should be >= 1, weightB should be >= 0"); "weightA should be >= 1, weightB should be >= 0");
return NULL; return NULL;
} }
assert(fragment->len >= 0);
if (fragment->len % bytes_per_frame != 0) { if (fragment->len % bytes_per_frame != 0) {
PyErr_SetString(AudioopError, "not a whole number of frames"); PyErr_SetString(AudioopError, "not a whole number of frames");
return NULL; return NULL;
@ -1370,7 +1371,7 @@ audioop_ratecv_impl(PyModuleDef *module, Py_buffer *fragment, int width, int nch
case ceiling(len/inrate) * outrate. */ case ceiling(len/inrate) * outrate. */
/* compute ceiling(len/inrate) without overflow */ /* compute ceiling(len/inrate) without overflow */
Py_ssize_t q = len > 0 ? 1 + (len - 1) / inrate : 0; Py_ssize_t q = 1 + (len - 1) / inrate;
if (outrate > PY_SSIZE_T_MAX / q / bytes_per_frame) if (outrate > PY_SSIZE_T_MAX / q / bytes_per_frame)
str = NULL; str = NULL;
else else
@ -1608,7 +1609,7 @@ audioop_lin2adpcm_impl(PyModuleDef *module, Py_buffer *fragment, int width, PyOb
Py_ssize_t i; Py_ssize_t i;
int step, valpred, delta, int step, valpred, delta,
index, sign, vpdiff, diff; index, sign, vpdiff, diff;
PyObject *rv, *str; PyObject *rv = NULL, *str;
int outputbuffer = 0, bufferstep; int outputbuffer = 0, bufferstep;
if (!audioop_check_parameters(fragment->len, width)) if (!audioop_check_parameters(fragment->len, width))
@ -1626,9 +1627,10 @@ audioop_lin2adpcm_impl(PyModuleDef *module, Py_buffer *fragment, int width, PyOb
index = 0; index = 0;
} else if (!PyTuple_Check(state)) { } else if (!PyTuple_Check(state)) {
PyErr_SetString(PyExc_TypeError, "state must be a tuple or None"); PyErr_SetString(PyExc_TypeError, "state must be a tuple or None");
return NULL; goto exit;
} else if (!PyArg_ParseTuple(state, "ii", &valpred, &index)) } else if (!PyArg_ParseTuple(state, "ii", &valpred, &index)) {
return NULL; goto exit;
}
step = stepsizeTable[index]; step = stepsizeTable[index];
bufferstep = 1; bufferstep = 1;
@ -1704,6 +1706,8 @@ audioop_lin2adpcm_impl(PyModuleDef *module, Py_buffer *fragment, int width, PyOb
bufferstep = !bufferstep; bufferstep = !bufferstep;
} }
rv = Py_BuildValue("(O(ii))", str, valpred, index); rv = Py_BuildValue("(O(ii))", str, valpred, index);
exit:
Py_DECREF(str); Py_DECREF(str);
return rv; return rv;
} }

View File

@ -195,6 +195,11 @@ class ascii_buffer_converter(CConverter):
     type = 'Py_buffer'
     converter = 'ascii_buffer_converter'
     impl_by_reference = True
+    c_default = "{NULL, NULL}"
+
+    def cleanup(self):
+        name = self.name
+        return "".join(["if (", name, ".obj)\n   PyBuffer_Release(&", name, ");\n"])
 
 [python start generated code]*/
 /*[python end generated code: checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709]*/

View File

@ -16,7 +16,7 @@ static PyObject *
 binascii_a2b_uu(PyModuleDef *module, PyObject *args)
 {
     PyObject *return_value = NULL;
-    Py_buffer data;
+    Py_buffer data = {NULL, NULL};
 
     if (!PyArg_ParseTuple(args,
         "O&:a2b_uu",

@ -25,6 +25,10 @@ binascii_a2b_uu(PyModuleDef *module, PyObject *args)
     return_value = binascii_a2b_uu_impl(module, &data);
 
 exit:
+    /* Cleanup for data */
+    if (data.obj)
+       PyBuffer_Release(&data);
+
     return return_value;
 }

@ -72,7 +76,7 @@ static PyObject *
 binascii_a2b_base64(PyModuleDef *module, PyObject *args)
 {
     PyObject *return_value = NULL;
-    Py_buffer data;
+    Py_buffer data = {NULL, NULL};
 
     if (!PyArg_ParseTuple(args,
         "O&:a2b_base64",

@ -81,6 +85,10 @@ binascii_a2b_base64(PyModuleDef *module, PyObject *args)
     return_value = binascii_a2b_base64_impl(module, &data);
 
 exit:
+    /* Cleanup for data */
+    if (data.obj)
+       PyBuffer_Release(&data);
+
     return return_value;
 }

@ -128,7 +136,7 @@ static PyObject *
 binascii_a2b_hqx(PyModuleDef *module, PyObject *args)
 {
     PyObject *return_value = NULL;
-    Py_buffer data;
+    Py_buffer data = {NULL, NULL};
 
     if (!PyArg_ParseTuple(args,
         "O&:a2b_hqx",

@ -137,6 +145,10 @@ binascii_a2b_hqx(PyModuleDef *module, PyObject *args)
     return_value = binascii_a2b_hqx_impl(module, &data);
 
 exit:
+    /* Cleanup for data */
+    if (data.obj)
+       PyBuffer_Release(&data);
+
     return return_value;
 }

@ -350,7 +362,7 @@ static PyObject *
 binascii_a2b_hex(PyModuleDef *module, PyObject *args)
 {
     PyObject *return_value = NULL;
-    Py_buffer hexstr;
+    Py_buffer hexstr = {NULL, NULL};
 
     if (!PyArg_ParseTuple(args,
         "O&:a2b_hex",

@ -359,6 +371,10 @@ binascii_a2b_hex(PyModuleDef *module, PyObject *args)
     return_value = binascii_a2b_hex_impl(module, &hexstr);
 
 exit:
+    /* Cleanup for hexstr */
+    if (hexstr.obj)
+       PyBuffer_Release(&hexstr);
+
     return return_value;
 }

@ -377,7 +393,7 @@ binascii_a2b_qp(PyModuleDef *module, PyObject *args, PyObject *kwargs)
 {
     PyObject *return_value = NULL;
     static char *_keywords[] = {"data", "header", NULL};
-    Py_buffer data;
+    Py_buffer data = {NULL, NULL};
     int header = 0;
 
     if (!PyArg_ParseTupleAndKeywords(args, kwargs,

@ -387,6 +403,10 @@ binascii_a2b_qp(PyModuleDef *module, PyObject *args, PyObject *kwargs)
     return_value = binascii_a2b_qp_impl(module, &data, header);
 
 exit:
+    /* Cleanup for data */
+    if (data.obj)
+       PyBuffer_Release(&data);
+
     return return_value;
 }

@ -427,4 +447,4 @@ exit:
     return return_value;
 }
 
-/*[clinic end generated code: checksum=abe48ca8020fa3ec25e13bd9fa7414f6b3ee2946]*/
+/*[clinic end generated code: checksum=8180e5be47a110ae8c89263a7c12a91d80754f60]*/

View File

@ -0,0 +1,411 @@
/*[clinic input]
preserve
[clinic start generated code]*/
PyDoc_STRVAR(zlib_compress__doc__,
"compress(module, bytes, level=Z_DEFAULT_COMPRESSION)\n"
"Returns a bytes object containing compressed data.\n"
"\n"
" bytes\n"
" Binary data to be compressed.\n"
" level\n"
" Compression level, in 0-9.");
#define ZLIB_COMPRESS_METHODDEF \
{"compress", (PyCFunction)zlib_compress, METH_VARARGS, zlib_compress__doc__},
static PyObject *
zlib_compress_impl(PyModuleDef *module, Py_buffer *bytes, int level);
static PyObject *
zlib_compress(PyModuleDef *module, PyObject *args)
{
PyObject *return_value = NULL;
Py_buffer bytes = {NULL, NULL};
int level = Z_DEFAULT_COMPRESSION;
if (!PyArg_ParseTuple(args,
"y*|i:compress",
&bytes, &level))
goto exit;
return_value = zlib_compress_impl(module, &bytes, level);
exit:
/* Cleanup for bytes */
if (bytes.obj)
PyBuffer_Release(&bytes);
return return_value;
}
PyDoc_STRVAR(zlib_decompress__doc__,
"decompress(module, data, wbits=MAX_WBITS, bufsize=DEF_BUF_SIZE)\n"
"Returns a bytes object containing the uncompressed data.\n"
"\n"
" data\n"
" Compressed data.\n"
" wbits\n"
" The window buffer size.\n"
" bufsize\n"
" The initial output buffer size.");
#define ZLIB_DECOMPRESS_METHODDEF \
{"decompress", (PyCFunction)zlib_decompress, METH_VARARGS, zlib_decompress__doc__},
static PyObject *
zlib_decompress_impl(PyModuleDef *module, Py_buffer *data, int wbits, unsigned int bufsize);
static PyObject *
zlib_decompress(PyModuleDef *module, PyObject *args)
{
PyObject *return_value = NULL;
Py_buffer data = {NULL, NULL};
int wbits = MAX_WBITS;
unsigned int bufsize = DEF_BUF_SIZE;
if (!PyArg_ParseTuple(args,
"y*|iO&:decompress",
&data, &wbits, uint_converter, &bufsize))
goto exit;
return_value = zlib_decompress_impl(module, &data, wbits, bufsize);
exit:
/* Cleanup for data */
if (data.obj)
PyBuffer_Release(&data);
return return_value;
}
PyDoc_STRVAR(zlib_compressobj__doc__,
"compressobj(module, level=Z_DEFAULT_COMPRESSION, method=DEFLATED, wbits=MAX_WBITS, memLevel=DEF_MEM_LEVEL, strategy=Z_DEFAULT_STRATEGY, zdict=None)\n"
"Return a compressor object.\n"
"\n"
" level\n"
" The compression level (an integer in the range 0-9; default is 6).\n"
" Higher compression levels are slower, but produce smaller results.\n"
" method\n"
" The compression algorithm. If given, this must be DEFLATED.\n"
" wbits\n"
" The base two logarithm of the window size (range: 8..15).\n"
" memLevel\n"
" Controls the amount of memory used for internal compression state.\n"
" Valid values range from 1 to 9. Higher values result in higher memory\n"
" usage, faster compression, and smaller output.\n"
" strategy\n"
" Used to tune the compression algorithm. Possible values are\n"
" Z_DEFAULT_STRATEGY, Z_FILTERED, and Z_HUFFMAN_ONLY.\n"
" zdict\n"
" The predefined compression dictionary - a sequence of bytes\n"
" containing subsequences that are likely to occur in the input data.");
#define ZLIB_COMPRESSOBJ_METHODDEF \
{"compressobj", (PyCFunction)zlib_compressobj, METH_VARARGS|METH_KEYWORDS, zlib_compressobj__doc__},
static PyObject *
zlib_compressobj_impl(PyModuleDef *module, int level, int method, int wbits, int memLevel, int strategy, Py_buffer *zdict);
static PyObject *
zlib_compressobj(PyModuleDef *module, PyObject *args, PyObject *kwargs)
{
PyObject *return_value = NULL;
static char *_keywords[] = {"level", "method", "wbits", "memLevel", "strategy", "zdict", NULL};
int level = Z_DEFAULT_COMPRESSION;
int method = DEFLATED;
int wbits = MAX_WBITS;
int memLevel = DEF_MEM_LEVEL;
int strategy = Z_DEFAULT_STRATEGY;
Py_buffer zdict = {NULL, NULL};
if (!PyArg_ParseTupleAndKeywords(args, kwargs,
"|iiiiiy*:compressobj", _keywords,
&level, &method, &wbits, &memLevel, &strategy, &zdict))
goto exit;
return_value = zlib_compressobj_impl(module, level, method, wbits, memLevel, strategy, &zdict);
exit:
/* Cleanup for zdict */
if (zdict.obj)
PyBuffer_Release(&zdict);
return return_value;
}
PyDoc_STRVAR(zlib_decompressobj__doc__,
"decompressobj(module, wbits=MAX_WBITS, zdict=b\'\')\n"
"Return a decompressor object.\n"
"\n"
" wbits\n"
" The window buffer size.\n"
" zdict\n"
" The predefined compression dictionary. This must be the same\n"
" dictionary as used by the compressor that produced the input data.");
#define ZLIB_DECOMPRESSOBJ_METHODDEF \
{"decompressobj", (PyCFunction)zlib_decompressobj, METH_VARARGS|METH_KEYWORDS, zlib_decompressobj__doc__},
static PyObject *
zlib_decompressobj_impl(PyModuleDef *module, int wbits, PyObject *zdict);
static PyObject *
zlib_decompressobj(PyModuleDef *module, PyObject *args, PyObject *kwargs)
{
PyObject *return_value = NULL;
static char *_keywords[] = {"wbits", "zdict", NULL};
int wbits = MAX_WBITS;
PyObject *zdict = NULL;
if (!PyArg_ParseTupleAndKeywords(args, kwargs,
"|iO:decompressobj", _keywords,
&wbits, &zdict))
goto exit;
return_value = zlib_decompressobj_impl(module, wbits, zdict);
exit:
return return_value;
}
PyDoc_STRVAR(zlib_Compress_compress__doc__,
"compress(self, data)\n"
"Returns a bytes object containing compressed data.\n"
"\n"
" data\n"
" Binary data to be compressed.\n"
"\n"
"After calling this function, some of the input data may still\n"
"be stored in internal buffers for later processing.\n"
"Call the flush() method to clear these buffers.");
#define ZLIB_COMPRESS_COMPRESS_METHODDEF \
{"compress", (PyCFunction)zlib_Compress_compress, METH_VARARGS, zlib_Compress_compress__doc__},
static PyObject *
zlib_Compress_compress_impl(compobject *self, Py_buffer *data);
static PyObject *
zlib_Compress_compress(compobject *self, PyObject *args)
{
PyObject *return_value = NULL;
Py_buffer data = {NULL, NULL};
if (!PyArg_ParseTuple(args,
"y*:compress",
&data))
goto exit;
return_value = zlib_Compress_compress_impl(self, &data);
exit:
/* Cleanup for data */
if (data.obj)
PyBuffer_Release(&data);
return return_value;
}
PyDoc_STRVAR(zlib_Decompress_decompress__doc__,
"decompress(self, data, max_length=0)\n"
"Return a bytes object containing the decompressed version of the data.\n"
"\n"
" data\n"
" The binary data to decompress.\n"
" max_length\n"
" The maximum allowable length of the decompressed data.\n"
" Unconsumed input data will be stored in\n"
" the unconsumed_tail attribute.\n"
"\n"
"After calling this function, some of the input data may still be stored in\n"
"internal buffers for later processing.\n"
"Call the flush() method to clear these buffers.");
#define ZLIB_DECOMPRESS_DECOMPRESS_METHODDEF \
{"decompress", (PyCFunction)zlib_Decompress_decompress, METH_VARARGS, zlib_Decompress_decompress__doc__},
static PyObject *
zlib_Decompress_decompress_impl(compobject *self, Py_buffer *data, unsigned int max_length);
static PyObject *
zlib_Decompress_decompress(compobject *self, PyObject *args)
{
PyObject *return_value = NULL;
Py_buffer data = {NULL, NULL};
unsigned int max_length = 0;
if (!PyArg_ParseTuple(args,
"y*|O&:decompress",
&data, uint_converter, &max_length))
goto exit;
return_value = zlib_Decompress_decompress_impl(self, &data, max_length);
exit:
/* Cleanup for data */
if (data.obj)
PyBuffer_Release(&data);
return return_value;
}
PyDoc_STRVAR(zlib_Compress_flush__doc__,
"flush(self, mode=Z_FINISH)\n"
"Return a bytes object containing any remaining compressed data.\n"
"\n"
" mode\n"
" One of the constants Z_SYNC_FLUSH, Z_FULL_FLUSH, Z_FINISH.\n"
" If mode == Z_FINISH, the compressor object can no longer be\n"
" used after calling the flush() method. Otherwise, more data\n"
" can still be compressed.");
#define ZLIB_COMPRESS_FLUSH_METHODDEF \
{"flush", (PyCFunction)zlib_Compress_flush, METH_VARARGS, zlib_Compress_flush__doc__},
static PyObject *
zlib_Compress_flush_impl(compobject *self, int mode);
static PyObject *
zlib_Compress_flush(compobject *self, PyObject *args)
{
PyObject *return_value = NULL;
int mode = Z_FINISH;
if (!PyArg_ParseTuple(args,
"|i:flush",
&mode))
goto exit;
return_value = zlib_Compress_flush_impl(self, mode);
exit:
return return_value;
}
PyDoc_STRVAR(zlib_Compress_copy__doc__,
"copy(self)\n"
"Return a copy of the compression object.");
#define ZLIB_COMPRESS_COPY_METHODDEF \
{"copy", (PyCFunction)zlib_Compress_copy, METH_NOARGS, zlib_Compress_copy__doc__},
static PyObject *
zlib_Compress_copy_impl(compobject *self);
static PyObject *
zlib_Compress_copy(compobject *self, PyObject *Py_UNUSED(ignored))
{
return zlib_Compress_copy_impl(self);
}
PyDoc_STRVAR(zlib_Decompress_copy__doc__,
"copy(self)\n"
"Return a copy of the decompression object.");
#define ZLIB_DECOMPRESS_COPY_METHODDEF \
{"copy", (PyCFunction)zlib_Decompress_copy, METH_NOARGS, zlib_Decompress_copy__doc__},
static PyObject *
zlib_Decompress_copy_impl(compobject *self);
static PyObject *
zlib_Decompress_copy(compobject *self, PyObject *Py_UNUSED(ignored))
{
return zlib_Decompress_copy_impl(self);
}
PyDoc_STRVAR(zlib_Decompress_flush__doc__,
"flush(self, length=DEF_BUF_SIZE)\n"
"Return a bytes object containing any remaining decompressed data.\n"
"\n"
" length\n"
" the initial size of the output buffer.");
#define ZLIB_DECOMPRESS_FLUSH_METHODDEF \
{"flush", (PyCFunction)zlib_Decompress_flush, METH_VARARGS, zlib_Decompress_flush__doc__},
static PyObject *
zlib_Decompress_flush_impl(compobject *self, unsigned int length);
static PyObject *
zlib_Decompress_flush(compobject *self, PyObject *args)
{
PyObject *return_value = NULL;
unsigned int length = DEF_BUF_SIZE;
if (!PyArg_ParseTuple(args,
"|O&:flush",
uint_converter, &length))
goto exit;
return_value = zlib_Decompress_flush_impl(self, length);
exit:
return return_value;
}
PyDoc_STRVAR(zlib_adler32__doc__,
"adler32(module, data, value=1)\n"
"Compute an Adler-32 checksum of data.\n"
"\n"
" value\n"
" Starting value of the checksum.\n"
"\n"
"The returned checksum is an integer.");
#define ZLIB_ADLER32_METHODDEF \
{"adler32", (PyCFunction)zlib_adler32, METH_VARARGS, zlib_adler32__doc__},
static PyObject *
zlib_adler32_impl(PyModuleDef *module, Py_buffer *data, unsigned int value);
static PyObject *
zlib_adler32(PyModuleDef *module, PyObject *args)
{
PyObject *return_value = NULL;
Py_buffer data = {NULL, NULL};
unsigned int value = 1;
if (!PyArg_ParseTuple(args,
"y*|I:adler32",
&data, &value))
goto exit;
return_value = zlib_adler32_impl(module, &data, value);
exit:
/* Cleanup for data */
if (data.obj)
PyBuffer_Release(&data);
return return_value;
}
PyDoc_STRVAR(zlib_crc32__doc__,
"crc32(module, data, value=0)\n"
"Compute a CRC-32 checksum of data.\n"
"\n"
" value\n"
" Starting value of the checksum.\n"
"\n"
"The returned checksum is an integer.");
#define ZLIB_CRC32_METHODDEF \
{"crc32", (PyCFunction)zlib_crc32, METH_VARARGS, zlib_crc32__doc__},
static PyObject *
zlib_crc32_impl(PyModuleDef *module, Py_buffer *data, unsigned int value);
static PyObject *
zlib_crc32(PyModuleDef *module, PyObject *args)
{
PyObject *return_value = NULL;
Py_buffer data = {NULL, NULL};
unsigned int value = 0;
if (!PyArg_ParseTuple(args,
"y*|I:crc32",
&data, &value))
goto exit;
return_value = zlib_crc32_impl(module, &data, value);
exit:
/* Cleanup for data */
if (data.obj)
PyBuffer_Release(&data);
return return_value;
}
/*[clinic end generated code: checksum=04f94bbaf2652717753e237e4021bf6c92ddffdd]*/
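The generated docstrings above describe the Python-level zlib API this commit converts to Argument Clinic. A short sketch of those entry points from Python, including the `zdict` parameter of `compressobj()`/`decompressobj()` documented above:

```python
import zlib

# compress()/decompress() round-trip, with an explicit level (0-9).
data = b'witch which has which witches wrist watch' * 10
packed = zlib.compress(data, 6)
assert zlib.decompress(packed) == data

# compressobj()/decompressobj() with a predefined dictionary: the same
# zdict must be supplied on both sides, as the docstring notes.
zdict = b'witch which watch'
comp = zlib.compressobj(zdict=zdict)
blob = comp.compress(data) + comp.flush()
decomp = zlib.decompressobj(zdict=zdict)
assert decomp.decompress(blob) == data
```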

View File

@ -28,10 +28,9 @@
#else #else
# define DEF_MEM_LEVEL MAX_MEM_LEVEL # define DEF_MEM_LEVEL MAX_MEM_LEVEL
#endif #endif
#define DEF_WBITS MAX_WBITS
/* The output buffer will be increased in chunks of DEFAULTALLOC bytes. */ /* Initial buffer size. */
#define DEFAULTALLOC (16*1024) #define DEF_BUF_SIZE (16*1024)
static PyTypeObject Comptype; static PyTypeObject Comptype;
static PyTypeObject Decomptype; static PyTypeObject Decomptype;
@ -82,42 +81,13 @@ zlib_error(z_stream zst, int err, char *msg)
} }
/*[clinic input] /*[clinic input]
output preset file
module zlib module zlib
class zlib.Compress "compobject *" "&Comptype" class zlib.Compress "compobject *" "&Comptype"
class zlib.Decompress "compobject *" "&Decomptype" class zlib.Decompress "compobject *" "&Decomptype"
[clinic start generated code]*/ [clinic start generated code]*/
/*[clinic end generated code: checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709]*/ /*[clinic end generated code: checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709]*/
PyDoc_STRVAR(compressobj__doc__,
"compressobj(level=-1, method=DEFLATED, wbits=15, memlevel=8,\n"
" strategy=Z_DEFAULT_STRATEGY[, zdict])\n"
" -- Return a compressor object.\n"
"\n"
"level is the compression level (an integer in the range 0-9; default is 6).\n"
"Higher compression levels are slower, but produce smaller results.\n"
"\n"
"method is the compression algorithm. If given, this must be DEFLATED.\n"
"\n"
"wbits is the base two logarithm of the window size (range: 8..15).\n"
"\n"
"memlevel controls the amount of memory used for internal compression state.\n"
"Valid values range from 1 to 9. Higher values result in higher memory usage,\n"
"faster compression, and smaller output.\n"
"\n"
"strategy is used to tune the compression algorithm. Possible values are\n"
"Z_DEFAULT_STRATEGY, Z_FILTERED, and Z_HUFFMAN_ONLY.\n"
"\n"
"zdict is the predefined compression dictionary - a sequence of bytes\n"
"containing subsequences that are likely to occur in the input data.");
PyDoc_STRVAR(decompressobj__doc__,
"decompressobj([wbits[, zdict]]) -- Return a decompressor object.\n"
"\n"
"Optional arg wbits is the window buffer size.\n"
"\n"
"Optional arg zdict is the predefined compression dictionary. This must be\n"
"the same dictionary as used by the compressor that produced the input data.");
static compobject * static compobject *
newcompobject(PyTypeObject *type) newcompobject(PyTypeObject *type)
{ {
@ -165,70 +135,20 @@ PyZlib_Free(voidpf ctx, void *ptr)
} }
/*[clinic input] /*[clinic input]
zlib.compress zlib.compress
bytes: Py_buffer bytes: Py_buffer
Binary data to be compressed. Binary data to be compressed.
[ level: int(c_default="Z_DEFAULT_COMPRESSION") = Z_DEFAULT_COMPRESSION
level: int
Compression level, in 0-9. Compression level, in 0-9.
]
/ /
Returns compressed string. Returns a bytes object containing compressed data.
[clinic start generated code]*/ [clinic start generated code]*/
PyDoc_STRVAR(zlib_compress__doc__,
"compress(module, bytes, [level])\n"
"Returns compressed string.\n"
"\n"
" bytes\n"
" Binary data to be compressed.\n"
" level\n"
" Compression level, in 0-9.");
#define ZLIB_COMPRESS_METHODDEF \
{"compress", (PyCFunction)zlib_compress, METH_VARARGS, zlib_compress__doc__},
static PyObject * static PyObject *
zlib_compress_impl(PyModuleDef *module, Py_buffer *bytes, int group_right_1, int level); zlib_compress_impl(PyModuleDef *module, Py_buffer *bytes, int level)
/*[clinic end generated code: checksum=5d7dd4588788efd3516e5f4225050d6413632601]*/
static PyObject *
zlib_compress(PyModuleDef *module, PyObject *args)
{
PyObject *return_value = NULL;
Py_buffer bytes = {NULL, NULL};
int group_right_1 = 0;
int level = 0;
switch (PyTuple_GET_SIZE(args)) {
case 1:
if (!PyArg_ParseTuple(args, "y*:compress", &bytes))
goto exit;
break;
case 2:
if (!PyArg_ParseTuple(args, "y*i:compress", &bytes, &level))
goto exit;
group_right_1 = 1;
break;
default:
PyErr_SetString(PyExc_TypeError, "zlib.compress requires 1 to 2 arguments");
goto exit;
}
return_value = zlib_compress_impl(module, &bytes, group_right_1, level);
exit:
/* Cleanup for bytes */
if (bytes.obj)
PyBuffer_Release(&bytes);
return return_value;
}
static PyObject *
zlib_compress_impl(PyModuleDef *module, Py_buffer *bytes, int group_right_1, int level)
/*[clinic end generated code: checksum=ce8d4c0a17ecd79c3ffcc032dcdf8ac6830ded1e]*/
{ {
PyObject *ReturnVal = NULL; PyObject *ReturnVal = NULL;
Byte *input, *output = NULL; Byte *input, *output = NULL;
@ -236,9 +156,6 @@ zlib_compress_impl(PyModuleDef *module, Py_buffer *bytes, int group_right_1, int
int err; int err;
z_stream zst; z_stream zst;
if (!group_right_1)
level = Z_DEFAULT_COMPRESSION;
if ((size_t)bytes->len > UINT_MAX) { if ((size_t)bytes->len > UINT_MAX) {
PyErr_SetString(PyExc_OverflowError, PyErr_SetString(PyExc_OverflowError,
"Size does not fit in an unsigned int"); "Size does not fit in an unsigned int");
@ -312,6 +229,7 @@ zlib_compress_impl(PyModuleDef *module, Py_buffer *bytes, int group_right_1, int
class uint_converter(CConverter): class uint_converter(CConverter):
type = 'unsigned int' type = 'unsigned int'
converter = 'uint_converter' converter = 'uint_converter'
c_ignored_default = "0"
[python start generated code]*/ [python start generated code]*/
/*[python end generated code: checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709]*/ /*[python end generated code: checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709]*/
@ -347,35 +265,38 @@ uint_converter(PyObject *obj, void *ptr)
return 1; return 1;
} }
PyDoc_STRVAR(decompress__doc__, /*[clinic input]
"decompress(string[, wbits[, bufsize]]) -- Return decompressed string.\n" zlib.decompress
"\n"
"Optional arg wbits is the window buffer size. Optional arg bufsize is\n" data: Py_buffer
"the initial output buffer size."); Compressed data.
wbits: int(c_default="MAX_WBITS") = MAX_WBITS
The window buffer size.
bufsize: uint(c_default="DEF_BUF_SIZE") = DEF_BUF_SIZE
The initial output buffer size.
/
Returns a bytes object containing the uncompressed data.
[clinic start generated code]*/
static PyObject * static PyObject *
PyZlib_decompress(PyObject *self, PyObject *args) zlib_decompress_impl(PyModuleDef *module, Py_buffer *data, int wbits, unsigned int bufsize)
/*[clinic end generated code: checksum=9e5464e72df9cb5fee73df662dbcaed867e01d32]*/
{ {
PyObject *result_str = NULL; PyObject *result_str = NULL;
Py_buffer pinput;
Byte *input; Byte *input;
unsigned int length; unsigned int length;
int err; int err;
int wsize=DEF_WBITS; unsigned int new_bufsize;
unsigned int bufsize = DEFAULTALLOC, new_bufsize;
z_stream zst; z_stream zst;
if (!PyArg_ParseTuple(args, "y*|iO&:decompress", if ((size_t)data->len > UINT_MAX) {
&pinput, &wsize, uint_converter, &bufsize))
return NULL;
if ((size_t)pinput.len > UINT_MAX) {
PyErr_SetString(PyExc_OverflowError, PyErr_SetString(PyExc_OverflowError,
"Size does not fit in an unsigned int"); "Size does not fit in an unsigned int");
goto error; goto error;
} }
input = pinput.buf; input = data->buf;
length = (unsigned int)pinput.len; length = (unsigned int)data->len;
if (bufsize == 0) if (bufsize == 0)
bufsize = 1; bufsize = 1;
@ -391,7 +312,7 @@ PyZlib_decompress(PyObject *self, PyObject *args)
zst.zfree = PyZlib_Free; zst.zfree = PyZlib_Free;
zst.next_out = (Byte *)PyBytes_AS_STRING(result_str); zst.next_out = (Byte *)PyBytes_AS_STRING(result_str);
zst.next_in = (Byte *)input; zst.next_in = (Byte *)input;
err = inflateInit2(&zst, wsize); err = inflateInit2(&zst, wbits);
switch(err) { switch(err) {
case(Z_OK): case(Z_OK):
@ -457,32 +378,45 @@ PyZlib_decompress(PyObject *self, PyObject *args)
if (_PyBytes_Resize(&result_str, zst.total_out) < 0) if (_PyBytes_Resize(&result_str, zst.total_out) < 0)
goto error; goto error;
PyBuffer_Release(&pinput);
return result_str; return result_str;
 error:
-    PyBuffer_Release(&pinput);
     Py_XDECREF(result_str);
     return NULL;
 }
 
+/*[clinic input]
+zlib.compressobj
+
+    level: int(c_default="Z_DEFAULT_COMPRESSION") = Z_DEFAULT_COMPRESSION
+        The compression level (an integer in the range 0-9; default is 6).
+        Higher compression levels are slower, but produce smaller results.
+    method: int(c_default="DEFLATED") = DEFLATED
+        The compression algorithm. If given, this must be DEFLATED.
+    wbits: int(c_default="MAX_WBITS") = MAX_WBITS
+        The base two logarithm of the window size (range: 8..15).
+    memLevel: int(c_default="DEF_MEM_LEVEL") = DEF_MEM_LEVEL
+        Controls the amount of memory used for internal compression state.
+        Valid values range from 1 to 9. Higher values result in higher memory
+        usage, faster compression, and smaller output.
+    strategy: int(c_default="Z_DEFAULT_STRATEGY") = Z_DEFAULT_STRATEGY
+        Used to tune the compression algorithm. Possible values are
+        Z_DEFAULT_STRATEGY, Z_FILTERED, and Z_HUFFMAN_ONLY.
+    zdict: Py_buffer = None
+        The predefined compression dictionary - a sequence of bytes
+        containing subsequences that are likely to occur in the input data.
+
+Return a compressor object.
+[clinic start generated code]*/
+
 static PyObject *
-PyZlib_compressobj(PyObject *selfptr, PyObject *args, PyObject *kwargs)
+zlib_compressobj_impl(PyModuleDef *module, int level, int method, int wbits, int memLevel, int strategy, Py_buffer *zdict)
+/*[clinic end generated code: checksum=89e5a6c1449caa9ed76f1baad066600e985151a9]*/
 {
     compobject *self = NULL;
-    int level=Z_DEFAULT_COMPRESSION, method=DEFLATED;
-    int wbits=MAX_WBITS, memLevel=DEF_MEM_LEVEL, strategy=0, err;
-    Py_buffer zdict;
-    static char *kwlist[] = {"level", "method", "wbits",
-                             "memLevel", "strategy", "zdict", NULL};
+    int err;
 
-    zdict.buf = NULL; /* Sentinel, so we can tell whether zdict was supplied. */
-    if (!PyArg_ParseTupleAndKeywords(args, kwargs, "|iiiiiy*:compressobj",
-                                     kwlist, &level, &method, &wbits,
-                                     &memLevel, &strategy, &zdict))
-        return NULL;
-
-    if (zdict.buf != NULL && (size_t)zdict.len > UINT_MAX) {
+    if (zdict->buf != NULL && (size_t)zdict->len > UINT_MAX) {
         PyErr_SetString(PyExc_OverflowError,
                         "zdict length does not fit in an unsigned int");
         goto error;
@@ -500,11 +434,11 @@ PyZlib_compressobj(PyObject *selfptr, PyObject *args, PyObject *kwargs)
     switch(err) {
     case (Z_OK):
         self->is_initialised = 1;
-        if (zdict.buf == NULL) {
+        if (zdict->buf == NULL) {
             goto success;
         } else {
             err = deflateSetDictionary(&self->zst,
-                                       zdict.buf, (unsigned int)zdict.len);
+                                       zdict->buf, (unsigned int)zdict->len);
             switch (err) {
             case (Z_OK):
                 goto success;
@@ -532,22 +466,28 @@ PyZlib_compressobj(PyObject *selfptr, PyObject *args, PyObject *kwargs)
     Py_XDECREF(self);
     self = NULL;
 success:
-    if (zdict.buf != NULL)
-        PyBuffer_Release(&zdict);
     return (PyObject*)self;
 }
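As an aside (not part of this diff), the Python-level behavior of the `zdict` parameter described in the docstring above is unchanged by the conversion: the same predefined dictionary must be given to both the compressor and the decompressor. A quick sketch:

```python
import zlib

# A dictionary of bytes likely to occur in the input data.
zdict = b"often-repeated boilerplate"

comp = zlib.compressobj(level=6, zdict=zdict)
payload = comp.compress(b"often-repeated boilerplate, plus new data") + comp.flush()

# The decompressor must be constructed with the same dictionary.
decomp = zlib.decompressobj(zdict=zdict)
assert decomp.decompress(payload) == b"often-repeated boilerplate, plus new data"
```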
-static PyObject *
-PyZlib_decompressobj(PyObject *selfptr, PyObject *args, PyObject *kwargs)
-{
-    static char *kwlist[] = {"wbits", "zdict", NULL};
-    int wbits=DEF_WBITS, err;
-    compobject *self;
-    PyObject *zdict=NULL;
+/*[clinic input]
+zlib.decompressobj
+
+    wbits: int(c_default="MAX_WBITS") = MAX_WBITS
+        The window buffer size.
+    zdict: object(c_default="NULL") = b''
+        The predefined compression dictionary. This must be the same
+        dictionary as used by the compressor that produced the input data.
+
+Return a decompressor object.
+[clinic start generated code]*/
+
+static PyObject *
+zlib_decompressobj_impl(PyModuleDef *module, int wbits, PyObject *zdict)
+/*[clinic end generated code: checksum=8ccd583fbd631798566d415933cd44440c8a74b5]*/
+{
+    int err;
+    compobject *self;
 
-    if (!PyArg_ParseTupleAndKeywords(args, kwargs, "|iO:decompressobj",
-                                     kwlist, &wbits, &zdict))
-        return NULL;
-
     if (zdict != NULL && !PyObject_CheckBuffer(zdict)) {
         PyErr_SetString(PyExc_TypeError,
                         "zdict argument must support the buffer protocol");
@@ -615,37 +555,41 @@ Decomp_dealloc(compobject *self)
     Dealloc(self);
 }
 
-PyDoc_STRVAR(comp_compress__doc__,
-"compress(data) -- Return a string containing data compressed.\n"
-"\n"
-"After calling this function, some of the input data may still\n"
-"be stored in internal buffers for later processing.\n"
-"Call the flush() method to clear these buffers.");
+/*[clinic input]
+zlib.Compress.compress
+
+    data: Py_buffer
+        Binary data to be compressed.
+    /
+
+Returns a bytes object containing compressed data.
+
+After calling this function, some of the input data may still
+be stored in internal buffers for later processing.
+Call the flush() method to clear these buffers.
+[clinic start generated code]*/
 
 static PyObject *
-PyZlib_objcompress(compobject *self, PyObject *args)
+zlib_Compress_compress_impl(compobject *self, Py_buffer *data)
+/*[clinic end generated code: checksum=5d5cd791cbc6a7f4b6de4ec12c085c88d4d3e31c]*/
 {
     int err;
     unsigned int inplen;
-    unsigned int length = DEFAULTALLOC, new_length;
-    PyObject *RetVal = NULL;
-    Py_buffer pinput;
+    unsigned int length = DEF_BUF_SIZE, new_length;
+    PyObject *RetVal;
     Byte *input;
     unsigned long start_total_out;
 
-    if (!PyArg_ParseTuple(args, "y*:compress", &pinput))
-        return NULL;
-    if ((size_t)pinput.len > UINT_MAX) {
+    if ((size_t)data->len > UINT_MAX) {
         PyErr_SetString(PyExc_OverflowError,
                         "Size does not fit in an unsigned int");
-        goto error_outer;
+        return NULL;
     }
-    input = pinput.buf;
-    inplen = (unsigned int)pinput.len;
+    input = data->buf;
+    inplen = (unsigned int)data->len;
 
     if (!(RetVal = PyBytes_FromStringAndSize(NULL, length)))
-        goto error_outer;
+        return NULL;
 
     ENTER_ZLIB(self);
 
@@ -668,7 +612,7 @@ PyZlib_objcompress(compobject *self, PyObject *args)
             new_length = UINT_MAX;
         if (_PyBytes_Resize(&RetVal, new_length) < 0) {
             Py_CLEAR(RetVal);
-            goto error;
+            goto done;
         }
         self->zst.next_out =
             (unsigned char *)PyBytes_AS_STRING(RetVal) + length;
@@ -686,18 +630,15 @@ PyZlib_objcompress(compobject *self, PyObject *args)
     if (err != Z_OK && err != Z_BUF_ERROR) {
         zlib_error(self->zst, err, "while compressing data");
-        Py_DECREF(RetVal);
-        RetVal = NULL;
-        goto error;
+        Py_CLEAR(RetVal);
+        goto done;
     }
     if (_PyBytes_Resize(&RetVal, self->zst.total_out - start_total_out) < 0) {
         Py_CLEAR(RetVal);
     }
 
- error:
+ done:
     LEAVE_ZLIB(self);
- error_outer:
-    PyBuffer_Release(&pinput);
     return RetVal;
 }
@@ -745,7 +686,6 @@ save_unconsumed_input(compobject *self, int err)
 }
 
 /*[clinic input]
 zlib.Decompress.decompress
 
     data: Py_buffer
@@ -756,61 +696,19 @@ zlib.Decompress.decompress
         the unconsumed_tail attribute.
     /
 
-Return a string containing the decompressed version of the data.
+Return a bytes object containing the decompressed version of the data.
 
 After calling this function, some of the input data may still be stored in
 internal buffers for later processing.
 Call the flush() method to clear these buffers.
 [clinic start generated code]*/
 
-PyDoc_STRVAR(zlib_Decompress_decompress__doc__,
-"decompress(self, data, max_length=0)\n"
-"Return a string containing the decompressed version of the data.\n"
-"\n"
-"  data\n"
-"    The binary data to decompress.\n"
-"  max_length\n"
-"    The maximum allowable length of the decompressed data.\n"
-"    Unconsumed input data will be stored in\n"
-"    the unconsumed_tail attribute.\n"
-"\n"
-"After calling this function, some of the input data may still be stored in\n"
-"internal buffers for later processing.\n"
-"Call the flush() method to clear these buffers.");
-
-#define ZLIB_DECOMPRESS_DECOMPRESS_METHODDEF    \
-    {"decompress", (PyCFunction)zlib_Decompress_decompress, METH_VARARGS, zlib_Decompress_decompress__doc__},
-
-static PyObject *
-zlib_Decompress_decompress_impl(compobject *self, Py_buffer *data, unsigned int max_length);
-
-static PyObject *
-zlib_Decompress_decompress(compobject *self, PyObject *args)
-{
-    PyObject *return_value = NULL;
-    Py_buffer data = {NULL, NULL};
-    unsigned int max_length = 0;
-
-    if (!PyArg_ParseTuple(args,
-        "y*|O&:decompress",
-        &data, uint_converter, &max_length))
-        goto exit;
-    return_value = zlib_Decompress_decompress_impl(self, &data, max_length);
-
-exit:
-    /* Cleanup for data */
-    if (data.obj)
-        PyBuffer_Release(&data);
-
-    return return_value;
-}
-
 static PyObject *
 zlib_Decompress_decompress_impl(compobject *self, Py_buffer *data, unsigned int max_length)
-/*[clinic end generated code: checksum=b7fd2e3b23430f57f5a84817189575bc46464901]*/
+/*[clinic end generated code: checksum=755cccc9087bfe55486b7e15fa7e2ab60b4c86d6]*/
 {
     int err;
-    unsigned int old_length, length = DEFAULTALLOC;
+    unsigned int old_length, length = DEF_BUF_SIZE;
     PyObject *RetVal = NULL;
     unsigned long start_total_out;
 
@@ -927,29 +825,31 @@ zlib_Decompress_decompress_impl(compobject *self, Py_buffer *data, unsigned int
     return RetVal;
 }
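For illustration (not part of this diff), the `max_length` / `unconsumed_tail` behavior documented in the Decompress.decompress docstring above works like this from Python:

```python
import zlib

compressed = zlib.compress(b"a" * 1000)
d = zlib.decompressobj()

chunk = d.decompress(compressed, 64)   # max_length caps the output size
assert len(chunk) <= 64

# Input that could not be consumed is saved on the object; feed it back in
# (with no max_length) to get the rest of the decompressed data.
rest = d.decompress(d.unconsumed_tail)
assert chunk + rest == b"a" * 1000
```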
-PyDoc_STRVAR(comp_flush__doc__,
-"flush( [mode] ) -- Return a string containing any remaining compressed data.\n"
-"\n"
-"mode can be one of the constants Z_SYNC_FLUSH, Z_FULL_FLUSH, Z_FINISH; the\n"
-"default value used when mode is not specified is Z_FINISH.\n"
-"If mode == Z_FINISH, the compressor object can no longer be used after\n"
-"calling the flush() method.  Otherwise, more data can still be compressed.");
+/*[clinic input]
+zlib.Compress.flush
+
+    mode: int(c_default="Z_FINISH") = Z_FINISH
+        One of the constants Z_SYNC_FLUSH, Z_FULL_FLUSH, Z_FINISH.
+        If mode == Z_FINISH, the compressor object can no longer be
+        used after calling the flush() method. Otherwise, more data
+        can still be compressed.
+    /
+
+Return a bytes object containing any remaining compressed data.
+[clinic start generated code]*/
 
 static PyObject *
-PyZlib_flush(compobject *self, PyObject *args)
+zlib_Compress_flush_impl(compobject *self, int mode)
+/*[clinic end generated code: checksum=a203f4cefc9de727aa1d2ea39d11c0a16c32041a]*/
 {
     int err;
-    unsigned int length = DEFAULTALLOC, new_length;
+    unsigned int length = DEF_BUF_SIZE, new_length;
     PyObject *RetVal;
-    int flushmode = Z_FINISH;
     unsigned long start_total_out;
 
-    if (!PyArg_ParseTuple(args, "|i:flush", &flushmode))
-        return NULL;
-
     /* Flushing with Z_NO_FLUSH is a no-op, so there's no point in
        doing any work at all; just return an empty string. */
-    if (flushmode == Z_NO_FLUSH) {
+    if (mode == Z_NO_FLUSH) {
         return PyBytes_FromStringAndSize(NULL, 0);
     }
 
@@ -964,7 +864,7 @@ PyZlib_flush(compobject *self, PyObject *args)
     self->zst.next_out = (unsigned char *)PyBytes_AS_STRING(RetVal);
 
     Py_BEGIN_ALLOW_THREADS
-    err = deflate(&(self->zst), flushmode);
+    err = deflate(&(self->zst), mode);
     Py_END_ALLOW_THREADS
 
     /* while Z_OK and the output buffer is full, there might be more output,
@@ -984,14 +884,14 @@ PyZlib_flush(compobject *self, PyObject *args)
         length = new_length;
 
         Py_BEGIN_ALLOW_THREADS
-        err = deflate(&(self->zst), flushmode);
+        err = deflate(&(self->zst), mode);
         Py_END_ALLOW_THREADS
     }
 
-    /* If flushmode is Z_FINISH, we also have to call deflateEnd() to free
+    /* If mode is Z_FINISH, we also have to call deflateEnd() to free
        various data structures. Note we should only get Z_STREAM_END when
-       flushmode is Z_FINISH, but checking both for safety*/
+       mode is Z_FINISH, but checking both for safety*/
-    if (err == Z_STREAM_END && flushmode == Z_FINISH) {
+    if (err == Z_STREAM_END && mode == Z_FINISH) {
         err = deflateEnd(&(self->zst));
         if (err != Z_OK) {
             zlib_error(self->zst, err, "while finishing compression");
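The flush modes named in the Compress.flush docstring above differ in whether the compressor stays usable. A small illustration (not part of this diff): Z_SYNC_FLUSH emits pending output but leaves the stream open, while the default Z_FINISH ends it.

```python
import zlib

c = zlib.compressobj()
first = c.compress(b"hello ") + c.flush(zlib.Z_SYNC_FLUSH)  # compressor still usable
second = c.compress(b"world") + c.flush()                   # Z_FINISH: stream is done

# Both parts belong to one deflate stream and decompress together.
assert zlib.decompress(first + second) == b"hello world"
```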
@@ -1031,25 +931,9 @@ zlib.Compress.copy
 Return a copy of the compression object.
 [clinic start generated code]*/
 
-PyDoc_STRVAR(zlib_Compress_copy__doc__,
-"copy(self)\n"
-"Return a copy of the compression object.");
-
-#define ZLIB_COMPRESS_COPY_METHODDEF    \
-    {"copy", (PyCFunction)zlib_Compress_copy, METH_NOARGS, zlib_Compress_copy__doc__},
-
-static PyObject *
-zlib_Compress_copy_impl(compobject *self);
-
-static PyObject *
-zlib_Compress_copy(compobject *self, PyObject *Py_UNUSED(ignored))
-{
-    return zlib_Compress_copy_impl(self);
-}
-
 static PyObject *
 zlib_Compress_copy_impl(compobject *self)
-/*[clinic end generated code: checksum=7aa841ad51297eb83250f511a76872e88fdc737e]*/
+/*[clinic end generated code: checksum=5144aa153c21e805afa5c19e5b48cf8e6480b5da]*/
 {
     compobject *retval = NULL;
     int err;
@@ -1099,11 +983,15 @@ error:
     return NULL;
 }
 
-PyDoc_STRVAR(decomp_copy__doc__,
-"copy() -- Return a copy of the decompression object.");
+/*[clinic input]
+zlib.Decompress.copy
+
+Return a copy of the decompression object.
+[clinic start generated code]*/
 
 static PyObject *
-PyZlib_uncopy(compobject *self)
+zlib_Decompress_copy_impl(compobject *self)
+/*[clinic end generated code: checksum=02a883a2a510c8ccfeef3f89e317a275bfe8c094]*/
 {
     compobject *retval = NULL;
     int err;
@@ -1155,24 +1043,26 @@ error:
 }
 #endif
 
-PyDoc_STRVAR(decomp_flush__doc__,
-"flush( [length] ) -- Return a string containing any remaining\n"
-"decompressed data. length, if given, is the initial size of the\n"
-"output buffer.\n"
-"\n"
-"The decompressor object can no longer be used after this call.");
+/*[clinic input]
+zlib.Decompress.flush
+
+    length: uint(c_default="DEF_BUF_SIZE") = DEF_BUF_SIZE
+        the initial size of the output buffer.
+    /
+
+Return a bytes object containing any remaining decompressed data.
+[clinic start generated code]*/
 
 static PyObject *
-PyZlib_unflush(compobject *self, PyObject *args)
+zlib_Decompress_flush_impl(compobject *self, unsigned int length)
+/*[clinic end generated code: checksum=db6fb753ab698e22afe3957c9da9e5e77f4bfc08]*/
 {
     int err;
-    unsigned int length = DEFAULTALLOC, new_length;
+    unsigned int new_length;
     PyObject * retval = NULL;
     unsigned long start_total_out;
     Py_ssize_t size;
 
-    if (!PyArg_ParseTuple(args, "|O&:flush", uint_converter, &length))
-        return NULL;
-
     if (length == 0) {
         PyErr_SetString(PyExc_ValueError, "length must be greater than zero");
         return NULL;
@@ -1248,12 +1138,12 @@ error:
     return retval;
 }
 
+#include "clinic/zlibmodule.c.h"
+
 static PyMethodDef comp_methods[] =
 {
-    {"compress", (binaryfunc)PyZlib_objcompress, METH_VARARGS,
-                 comp_compress__doc__},
-    {"flush", (binaryfunc)PyZlib_flush, METH_VARARGS,
-              comp_flush__doc__},
+    ZLIB_COMPRESS_COMPRESS_METHODDEF
+    ZLIB_COMPRESS_FLUSH_METHODDEF
 #ifdef HAVE_ZLIB_COPY
     ZLIB_COMPRESS_COPY_METHODDEF
 #endif
@@ -1263,11 +1153,9 @@ static PyMethodDef comp_methods[] =
 static PyMethodDef Decomp_methods[] =
 {
     ZLIB_DECOMPRESS_DECOMPRESS_METHODDEF
-    {"flush", (binaryfunc)PyZlib_unflush, METH_VARARGS,
-              decomp_flush__doc__},
+    ZLIB_DECOMPRESS_FLUSH_METHODDEF
 #ifdef HAVE_ZLIB_COPY
-    {"copy", (PyCFunction)PyZlib_uncopy, METH_NOARGS,
-             decomp_copy__doc__},
+    ZLIB_DECOMPRESS_COPY_METHODDEF
 #endif
     {NULL, NULL}
 };
@@ -1280,95 +1168,95 @@ static PyMemberDef Decomp_members[] = {
     {NULL},
 };
 
-PyDoc_STRVAR(adler32__doc__,
-"adler32(string[, start]) -- Compute an Adler-32 checksum of string.\n"
-"\n"
-"An optional starting value can be specified.  The returned checksum is\n"
-"an integer.");
+/*[clinic input]
+zlib.adler32
+
+    data: Py_buffer
+    value: unsigned_int(bitwise=True) = 1
+        Starting value of the checksum.
+    /
+
+Compute an Adler-32 checksum of data.
+
+The returned checksum is an integer.
+[clinic start generated code]*/
 
 static PyObject *
-PyZlib_adler32(PyObject *self, PyObject *args)
+zlib_adler32_impl(PyModuleDef *module, Py_buffer *data, unsigned int value)
+/*[clinic end generated code: checksum=51d6d75ee655c78af8c968fdb4c11d97e62c67d5]*/
 {
-    unsigned int adler32val = 1;  /* adler32(0L, Z_NULL, 0) */
-    Py_buffer pbuf;
-
-    if (!PyArg_ParseTuple(args, "y*|I:adler32", &pbuf, &adler32val))
-        return NULL;
     /* Releasing the GIL for very small buffers is inefficient
        and may lower performance */
-    if (pbuf.len > 1024*5) {
-        unsigned char *buf = pbuf.buf;
-        Py_ssize_t len = pbuf.len;
+    if (data->len > 1024*5) {
+        unsigned char *buf = data->buf;
+        Py_ssize_t len = data->len;
 
         Py_BEGIN_ALLOW_THREADS
         /* Avoid truncation of length for very large buffers. adler32() takes
           length as an unsigned int, which may be narrower than Py_ssize_t. */
         while ((size_t)len > UINT_MAX) {
-            adler32val = adler32(adler32val, buf, UINT_MAX);
+            value = adler32(value, buf, UINT_MAX);
             buf += (size_t) UINT_MAX;
            len -= (size_t) UINT_MAX;
         }
-        adler32val = adler32(adler32val, buf, (unsigned int)len);
+        value = adler32(value, buf, (unsigned int)len);
         Py_END_ALLOW_THREADS
     } else {
-        adler32val = adler32(adler32val, pbuf.buf, (unsigned int)pbuf.len);
+        value = adler32(value, data->buf, (unsigned int)data->len);
     }
-    PyBuffer_Release(&pbuf);
-    return PyLong_FromUnsignedLong(adler32val & 0xffffffffU);
+    return PyLong_FromUnsignedLong(value & 0xffffffffU);
 }
 
-PyDoc_STRVAR(crc32__doc__,
-"crc32(string[, start]) -- Compute a CRC-32 checksum of string.\n"
-"\n"
-"An optional starting value can be specified.  The returned checksum is\n"
-"an integer.");
+/*[clinic input]
+zlib.crc32
+
+    data: Py_buffer
+    value: unsigned_int(bitwise=True) = 0
+        Starting value of the checksum.
+    /
+
+Compute a CRC-32 checksum of data.
+
+The returned checksum is an integer.
+[clinic start generated code]*/
 
 static PyObject *
-PyZlib_crc32(PyObject *self, PyObject *args)
+zlib_crc32_impl(PyModuleDef *module, Py_buffer *data, unsigned int value)
+/*[clinic end generated code: checksum=c1e986e74fe7b62369998a71a81ebeb9b73e8d4c]*/
 {
-    unsigned int crc32val = 0;  /* crc32(0L, Z_NULL, 0) */
-    Py_buffer pbuf;
     int signed_val;
 
-    if (!PyArg_ParseTuple(args, "y*|I:crc32", &pbuf, &crc32val))
-        return NULL;
     /* Releasing the GIL for very small buffers is inefficient
        and may lower performance */
-    if (pbuf.len > 1024*5) {
-        unsigned char *buf = pbuf.buf;
-        Py_ssize_t len = pbuf.len;
+    if (data->len > 1024*5) {
+        unsigned char *buf = data->buf;
+        Py_ssize_t len = data->len;
 
         Py_BEGIN_ALLOW_THREADS
         /* Avoid truncation of length for very large buffers. crc32() takes
           length as an unsigned int, which may be narrower than Py_ssize_t. */
         while ((size_t)len > UINT_MAX) {
-            crc32val = crc32(crc32val, buf, UINT_MAX);
+            value = crc32(value, buf, UINT_MAX);
             buf += (size_t) UINT_MAX;
             len -= (size_t) UINT_MAX;
         }
-        signed_val = crc32(crc32val, buf, (unsigned int)len);
+        signed_val = crc32(value, buf, (unsigned int)len);
         Py_END_ALLOW_THREADS
     } else {
-        signed_val = crc32(crc32val, pbuf.buf, (unsigned int)pbuf.len);
+        signed_val = crc32(value, data->buf, (unsigned int)data->len);
     }
-    PyBuffer_Release(&pbuf);
     return PyLong_FromUnsignedLong(signed_val & 0xffffffffU);
 }
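The `value` parameter described in the adler32/crc32 docstrings above is what makes incremental checksumming work: feed the previous result back in as the starting value. A quick sketch (not part of this diff):

```python
import zlib

# Checksumming in one shot and in two chunks gives the same result.
whole = zlib.adler32(b"hello world")
running = zlib.adler32(b"hello ")
running = zlib.adler32(b"world", running)  # previous result as starting value
assert running == whole

assert zlib.crc32(b"world", zlib.crc32(b"hello ")) == zlib.crc32(b"hello world")
```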
 static PyMethodDef zlib_methods[] =
 {
-    {"adler32", (PyCFunction)PyZlib_adler32, METH_VARARGS,
-                adler32__doc__},
+    ZLIB_ADLER32_METHODDEF
     ZLIB_COMPRESS_METHODDEF
-    {"compressobj", (PyCFunction)PyZlib_compressobj, METH_VARARGS|METH_KEYWORDS,
-                    compressobj__doc__},
-    {"crc32", (PyCFunction)PyZlib_crc32, METH_VARARGS,
-              crc32__doc__},
-    {"decompress", (PyCFunction)PyZlib_decompress, METH_VARARGS,
-                   decompress__doc__},
-    {"decompressobj", (PyCFunction)PyZlib_decompressobj, METH_VARARGS|METH_KEYWORDS,
-                      decompressobj__doc__},
+    ZLIB_COMPRESSOBJ_METHODDEF
+    ZLIB_CRC32_METHODDEF
+    ZLIB_DECOMPRESS_METHODDEF
+    ZLIB_DECOMPRESSOBJ_METHODDEF
     {NULL, NULL}
 };
@@ -1482,6 +1370,7 @@ PyInit_zlib(void)
     PyModule_AddIntMacro(m, MAX_WBITS);
     PyModule_AddIntMacro(m, DEFLATED);
     PyModule_AddIntMacro(m, DEF_MEM_LEVEL);
+    PyModule_AddIntMacro(m, DEF_BUF_SIZE);
     PyModule_AddIntMacro(m, Z_BEST_SPEED);
     PyModule_AddIntMacro(m, Z_BEST_COMPRESSION);
     PyModule_AddIntMacro(m, Z_DEFAULT_COMPRESSION);
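The newly exported DEF_BUF_SIZE macro becomes visible as a module attribute; assuming a Python built with this change, it can be checked like so:

```python
import zlib

# DEF_BUF_SIZE is the default initial output-buffer size used by the module
# (exported to Python by this commit; present in 3.4 and later).
assert isinstance(zlib.DEF_BUF_SIZE, int)
assert zlib.DEF_BUF_SIZE > 0
```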

View File

@@ -2689,8 +2689,11 @@ _PyErr_TrySetFromCause(const char *format, ...)
      * types as well, but that's quite a bit trickier due to the extra
      * state potentially stored on OSError instances.
      */
-    Py_XDECREF(tb);
+    /* Ensure the traceback is set correctly on the existing exception */
+    if (tb != NULL) {
+        PyException_SetTraceback(val, tb);
+        Py_DECREF(tb);
+    }
 
 #ifdef HAVE_STDARG_PROTOTYPES
     va_start(vargs, format);