mirror of https://github.com/python/cpython
Issue #2134: Clarify token.OP handling rationale in tokenize documentation.
parent d7664dee0c
commit da747c3d97
Doc/library/tokenize.rst

@@ -15,6 +15,12 @@ implemented in Python. The scanner in this module returns comments as tokens as
 well, making it useful for implementing "pretty-printers," including colorizers
 for on-screen displays.
 
+To simplify token stream handling, all :ref:`operators` and :ref:`delimiters`
+tokens are returned using the generic :data:`token.OP` token type. The exact
+type can be determined by checking the token ``string`` field on the
+:term:`named tuple` returned from :func:`tokenize.tokenize` for the character
+sequence that identifies a specific operator token.
+
 The primary entry point is a :term:`generator`:
 
 .. function:: generate_tokens(readline)
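For illustration, a minimal sketch (not part of the diff itself) of the behaviour the new paragraph documents: every operator and delimiter token is reported with the generic token.OP type, and the ``string`` field of the returned named tuple is what tells them apart::

    import io
    import token
    import tokenize

    source = b"x = (1 + 2) * 3\n"
    for tok in tokenize.tokenize(io.BytesIO(source).readline):
        if tok.type == token.OP:
            # '=', '(', '+', ')' and '*' all arrive as token.OP;
            # only the string field identifies the specific operator.
            print(tok.string)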
Misc/NEWS

@@ -495,6 +495,9 @@ Tests
 Documentation
 -------------
 
+- Issue #2134: The tokenize documentation has been clarified to explain why
+  all operator and delimiter tokens are treated as token.OP tokens.
+
 - Issue #13513: Fix io.IOBase documentation to correctly link to the
   io.IOBase.readline method instead of the readline module.
 