Issue #2134: Clarify token.OP handling rationale in tokenize documentation.
This commit is contained in:
parent 1b468af7be
commit 972cfb9169
@@ -15,6 +15,12 @@ implemented in Python. The scanner in this module returns comments as tokens
 as well, making it useful for implementing "pretty-printers," including
 colorizers for on-screen displays.
 
+To simplify token stream handling, all :ref:`operators` and :ref:`delimiters`
+tokens are returned using the generic :data:`token.OP` token type. The exact
+type can be determined by checking the ``string`` field of the
+:term:`named tuple` returned from :func:`tokenize.tokenize` for the character
+sequence that identifies a specific operator token.
+
 The primary entry point is a :term:`generator`:
 
 .. function:: tokenize(readline)
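The behaviour documented by the added paragraph can be exercised directly. The
snippet below is a minimal sketch, not part of the commit, using only the
standard :mod:`tokenize` and :mod:`token` modules: it tokenizes a small
expression and reports the :data:`token.OP` tokens, which are told apart by
their ``string`` field::

   import io
   import token
   import tokenize

   # Hypothetical example source; every operator and delimiter in it is
   # expected to come back with the generic token.OP type.
   source = b"x = (1 + 2) * 3\n"

   for tok in tokenize.tokenize(io.BytesIO(source).readline):
       if tok.type == token.OP:
           # The string field carries the exact character sequence,
           # e.g. '=', '(', '+', ')', '*'.
           print(token.tok_name[tok.type], repr(tok.string))

Each printed line shows the same generic OP type paired with a different
``string`` value, which is exactly the distinction the new paragraph asks
readers to make.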
@@ -418,6 +418,9 @@ Extension Modules
 Documentation
 -------------
 
+- Issue #2134: The tokenize documentation has been clarified to explain why
+  all operator and delimiter tokens are treated as token.OP tokens.
+
 - Issue #13513: Fix io.IOBase documentation to correctly link to the
   io.IOBase.readline method instead of the readline module.
 