bpo-36654: Add examples for using tokenize module programmatically (GH-12947)

(cherry picked from commit 4b09dc79f4)

Co-authored-by: Windson yang <wiwindson@outlook.com>
Miss Islington (bot) 2020-01-25 11:36:04 -08:00 committed by Berker Peksag
parent 548685e364
commit 6dbd843ded
1 changed file with 19 additions and 0 deletions


@ -267,3 +267,22 @@ The exact token type names can be displayed using the :option:`-e` option:
4,10-4,11: RPAR ')'
4,11-4,12: NEWLINE '\n'
5,0-5,0: ENDMARKER ''
Example of tokenizing a file programmatically, reading unicode
strings instead of bytes with :func:`generate_tokens`::

    import tokenize

    with tokenize.open('hello.py') as f:
        tokens = tokenize.generate_tokens(f.readline)
        for token in tokens:
            print(token)

Or reading bytes directly with :func:`.tokenize`::

    import tokenize

    with open('hello.py', 'rb') as f:
        tokens = tokenize.tokenize(f.readline)
        for token in tokens:
            print(token)
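As a self-contained sketch of what these examples print, the tokens yielded by
:func:`generate_tokens` are named tuples with ``type``, ``string``, ``start``,
``end``, and ``line`` fields. Using :class:`io.StringIO` instead of a file on
disk (the ``hello.py`` file above is assumed to exist; this variant avoids
that dependency)::

    import io
    import tokenize

    source = "x = 1\n"
    # generate_tokens takes any readline callable over str, not just files
    tokens = list(tokenize.generate_tokens(io.StringIO(source).readline))
    for tok in tokens:
        # tok_name maps the numeric token type to its name, e.g. NAME, OP
        print(tokenize.tok_name[tok.type], repr(tok.string))

Unlike :func:`.tokenize`, :func:`generate_tokens` does not emit an initial
``ENCODING`` token, since the input is already decoded text.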