mirror of https://github.com/python/cpython
Issue #2333: Backport set and dict comprehensions syntax.
This commit is contained in:
parent 0ca7452794
commit b646547bb4
@@ -205,74 +205,100 @@ block, nesting from left to right, and evaluating the expression to produce a
list element each time the innermost block is reached [#]_.


.. _comprehensions:

Displays for sets and dictionaries
----------------------------------

For constructing a set or a dictionary Python provides special syntax
called "displays", each of them in two flavors:

* either the container contents are listed explicitly, or

* they are computed via a set of looping and filtering instructions, called a
  :dfn:`comprehension`.

Common syntax elements for comprehensions are:

.. productionlist::
   comprehension: `expression` `comp_for`
   comp_for: "for" `target_list` "in" `or_test` [`comp_iter`]
   comp_iter: `comp_for` | `comp_if`
   comp_if: "if" `expression_nocond` [`comp_iter`]

The comprehension consists of a single expression followed by at least one
:keyword:`for` clause and zero or more :keyword:`for` or :keyword:`if` clauses.
In this case, the elements of the new container are those that would be produced
by considering each of the :keyword:`for` or :keyword:`if` clauses a block,
nesting from left to right, and evaluating the expression to produce an element
each time the innermost block is reached.
Note that the comprehension is executed in a separate scope, so names assigned
to in the target list don't "leak" into the enclosing scope.
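
An editor's illustration, not part of the original patch (it assumes an
interpreter built with this backport): the new displays read like their list
counterpart, and the loop variable stays local::

   >>> sorted({x * x for x in range(5)})
   [0, 1, 4, 9, 16]
   >>> {n: n * n for n in range(4)}
   {0: 0, 1: 1, 2: 4, 3: 9}
   >>> x = 'outer'
   >>> sorted({x for x in 'abc'})
   ['a', 'b', 'c']
   >>> x
   'outer'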


.. _genexpr:

Generator expressions
---------------------

.. index:: pair: generator; expression
           object: generator

A generator expression is a compact generator notation in parentheses:

.. productionlist::
   generator_expression: "(" `expression` `genexpr_for` ")"
   genexpr_for: "for" `target_list` "in" `or_test` [`genexpr_iter`]
   genexpr_iter: `genexpr_for` | `genexpr_if`
   genexpr_if: "if" `old_expression` [`genexpr_iter`]
   generator_expression: "(" `expression` `comp_for` ")"

.. index:: object: generator

A generator expression yields a new generator object.  Its syntax is the same as
for comprehensions, except that it is enclosed in parentheses instead of
brackets or curly braces.

A generator expression yields a new generator object.  It consists of a single
expression followed by at least one :keyword:`for` clause and zero or more
:keyword:`for` or :keyword:`if` clauses.  The iterating values of the new
generator are those that would be produced by considering each of the
:keyword:`for` or :keyword:`if` clauses a block, nesting from left to right, and
evaluating the expression to yield a value each time the innermost block is
reached.

Variables used in the generator expression are evaluated lazily in a separate
scope when the :meth:`next` method is called for the generator object (in the
same fashion as for normal generators).  However, the :keyword:`in` expression
of the leftmost :keyword:`for` clause is immediately evaluated in the current
scope so that an error produced by it can be seen before any other possible

Variables used in the generator expression are evaluated lazily when the
:meth:`__next__` method is called for the generator object (in the same fashion
as normal generators).  However, the leftmost :keyword:`for` clause is immediately
evaluated, so that an error produced by it can be seen before any other possible
error in the code that handles the generator expression.  Subsequent
:keyword:`for` and :keyword:`if` clauses cannot be evaluated immediately since
they may depend on the previous :keyword:`for` loop. For example:
``(x*y for x in range(10) for y in bar(x))``.
:keyword:`for` clauses cannot be evaluated immediately since they may depend on
the previous :keyword:`for` loop. For example: ``(x*y for x in range(10) for y
in bar(x))``.

The parentheses can be omitted on calls with only one argument.  See section
:ref:`calls` for details.
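
A short sketch of both points (an editor's addition; ``undefined_name`` is a
deliberately unbound name used only to show the immediate evaluation of the
leftmost iterable)::

   >>> sum(x * x for x in range(10))        # parentheses omitted in the call
   285
   >>> gen = (x for x in undefined_name)    # leftmost "in" evaluated right away
   Traceback (most recent call last):
     ...
   NameError: name 'undefined_name' is not defined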


.. _dict:

Dictionary displays
-------------------

.. index:: pair: dictionary; display

.. index::
   single: key
   single: datum
   single: key/datum pair
   key, datum, key/datum pair
   object: dictionary

A dictionary display is a possibly empty series of key/datum pairs enclosed in
curly braces:

.. productionlist::
   dict_display: "{" [`key_datum_list`] "}"
   dict_display: "{" [`key_datum_list` | `dict_comprehension`] "}"
   key_datum_list: `key_datum` ("," `key_datum`)* [","]
   key_datum: `expression` ":" `expression`

.. index:: object: dictionary
   dict_comprehension: `expression` ":" `expression` `comp_for`

A dictionary display yields a new dictionary object.

The key/datum pairs are evaluated from left to right to define the entries of
the dictionary: each key object is used as a key into the dictionary to store
the corresponding datum.
If a comma-separated sequence of key/datum pairs is given, they are evaluated
from left to right to define the entries of the dictionary: each key object is
used as a key into the dictionary to store the corresponding datum.  This means
that you can specify the same key multiple times in the key/datum list, and the
final dictionary's value for that key will be the last one given.

A dict comprehension, in contrast to list and set comprehensions, needs two
expressions separated with a colon, followed by the usual "for" and "if" clauses.
When the comprehension is run, the resulting key and value elements are inserted
in the new dictionary in the order they are produced.
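
Both behaviours in one short example (an editor's addition, assuming the
backported interpreter; the dictionary reprs below follow CPython's usual
ordering for small integer keys)::

   >>> {1: 'old', 1: 'new'}            # duplicate key: the last datum wins
   {1: 'new'}
   >>> {k: k + 1 for k in (1, 2, 3)}   # key and value expressions, then the clauses
   {1: 2, 2: 3, 3: 4}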

.. index:: pair: immutable; object
           hashable

Restrictions on the types of the key values are listed earlier in section
:ref:`types`. (To summarize, the key type should be :term:`hashable`, which excludes

@@ -100,13 +100,13 @@ arith_expr: term (('+'|'-') term)*
term: factor (('*'|'/'|'%'|'//') factor)*
factor: ('+'|'-'|'~') factor | power
power: atom trailer* ['**' factor]
atom: ('(' [yield_expr|testlist_gexp] ')' |
atom: ('(' [yield_expr|testlist_comp] ')' |
       '[' [listmaker] ']' |
       '{' [dictorsetmaker] '}' |
       '`' testlist1 '`' |
       NAME | NUMBER | STRING+)
listmaker: test ( list_for | (',' test)* [','] )
testlist_gexp: test ( gen_for | (',' test)* [','] )
testlist_comp: test ( comp_for | (',' test)* [','] )
lambdef: 'lambda' [varargslist] ':' test
trailer: '(' [arglist] ')' | '[' subscriptlist ']' | '.' NAME
subscriptlist: subscript (',' subscript)* [',']

@@ -115,8 +115,8 @@ sliceop: ':' [test]
exprlist: expr (',' expr)* [',']
testlist: test (',' test)* [',']
dictmaker: test ':' test (',' test ':' test)* [',']
dictorsetmaker: ( (test ':' test (',' test ':' test)* [',']) |
                  (test (',' test)* [',']) )
dictorsetmaker: ( (test ':' test (comp_for | (',' test ':' test)* [','])) |
                  (test (comp_for | (',' test)* [','])) )

classdef: 'class' NAME ['(' [testlist] ')'] ':' suite

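As context for the reworked dictorsetmaker rule: a single pair of curly braces
now has to cover four surface forms. A small, runnable sketch (editor's
addition; the names are placeholders):

    seq = range(4)                       # sample input
    s_lit  = {1, 2, 3}                   # set literal:        test (',' test)* [',']
    s_comp = {x * x for x in seq}        # set comprehension:  test comp_for
    d_lit  = {1: 'a', 2: 'b'}            # dict literal:       test ':' test (',' test ':' test)* [',']
    d_comp = {x: x * x for x in seq}     # dict comprehension: test ':' test comp_for
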
@ -125,15 +125,15 @@ arglist: (argument ',')* (argument [',']
|
|||
|'**' test)
|
||||
# The reason that keywords are test nodes instead of NAME is that using NAME
|
||||
# results in an ambiguity. ast.c makes sure it's a NAME.
|
||||
argument: test [gen_for] | test '=' test
|
||||
argument: test [comp_for] | test '=' test
|
||||
|
||||
list_iter: list_for | list_if
|
||||
list_for: 'for' exprlist 'in' testlist_safe [list_iter]
|
||||
list_if: 'if' old_test [list_iter]
|
||||
|
||||
gen_iter: gen_for | gen_if
|
||||
gen_for: 'for' exprlist 'in' or_test [gen_iter]
|
||||
gen_if: 'if' old_test [gen_iter]
|
||||
comp_iter: comp_for | comp_if
|
||||
comp_for: 'for' exprlist 'in' or_test [comp_iter]
|
||||
comp_if: 'if' old_test [comp_iter]
|
||||
|
||||
testlist1: test (',' test)*
|
||||
|
||||
|
|
|
@ -186,10 +186,10 @@ struct _stmt {
|
|||
|
||||
enum _expr_kind {BoolOp_kind=1, BinOp_kind=2, UnaryOp_kind=3, Lambda_kind=4,
|
||||
IfExp_kind=5, Dict_kind=6, Set_kind=7, ListComp_kind=8,
|
||||
GeneratorExp_kind=9, Yield_kind=10, Compare_kind=11,
|
||||
Call_kind=12, Repr_kind=13, Num_kind=14, Str_kind=15,
|
||||
Attribute_kind=16, Subscript_kind=17, Name_kind=18,
|
||||
List_kind=19, Tuple_kind=20};
|
||||
SetComp_kind=9, DictComp_kind=10, GeneratorExp_kind=11,
|
||||
Yield_kind=12, Compare_kind=13, Call_kind=14, Repr_kind=15,
|
||||
Num_kind=16, Str_kind=17, Attribute_kind=18,
|
||||
Subscript_kind=19, Name_kind=20, List_kind=21, Tuple_kind=22};
|
||||
struct _expr {
|
||||
enum _expr_kind kind;
|
||||
union {
|
||||
|
@ -234,6 +234,17 @@ struct _expr {
|
|||
asdl_seq *generators;
|
||||
} ListComp;
|
||||
|
||||
struct {
|
||||
expr_ty elt;
|
||||
asdl_seq *generators;
|
||||
} SetComp;
|
||||
|
||||
struct {
|
||||
expr_ty key;
|
||||
expr_ty value;
|
||||
asdl_seq *generators;
|
||||
} DictComp;
|
||||
|
||||
struct {
|
||||
expr_ty elt;
|
||||
asdl_seq *generators;
|
||||
|
@ -458,6 +469,12 @@ expr_ty _Py_Set(asdl_seq * elts, int lineno, int col_offset, PyArena *arena);
|
|||
#define ListComp(a0, a1, a2, a3, a4) _Py_ListComp(a0, a1, a2, a3, a4)
|
||||
expr_ty _Py_ListComp(expr_ty elt, asdl_seq * generators, int lineno, int
|
||||
col_offset, PyArena *arena);
|
||||
#define SetComp(a0, a1, a2, a3, a4) _Py_SetComp(a0, a1, a2, a3, a4)
|
||||
expr_ty _Py_SetComp(expr_ty elt, asdl_seq * generators, int lineno, int
|
||||
col_offset, PyArena *arena);
|
||||
#define DictComp(a0, a1, a2, a3, a4, a5) _Py_DictComp(a0, a1, a2, a3, a4, a5)
|
||||
expr_ty _Py_DictComp(expr_ty key, expr_ty value, asdl_seq * generators, int
|
||||
lineno, int col_offset, PyArena *arena);
|
||||
#define GeneratorExp(a0, a1, a2, a3, a4) _Py_GeneratorExp(a0, a1, a2, a3, a4)
|
||||
expr_ty _Py_GeneratorExp(expr_ty elt, asdl_seq * generators, int lineno, int
|
||||
col_offset, PyArena *arena);
|
||||
|
|
|
@ -64,7 +64,7 @@
|
|||
#define power 317
|
||||
#define atom 318
|
||||
#define listmaker 319
|
||||
#define testlist_gexp 320
|
||||
#define testlist_comp 320
|
||||
#define lambdef 321
|
||||
#define trailer 322
|
||||
#define subscriptlist 323
|
||||
|
@ -80,9 +80,9 @@
|
|||
#define list_iter 333
|
||||
#define list_for 334
|
||||
#define list_if 335
|
||||
#define gen_iter 336
|
||||
#define gen_for 337
|
||||
#define gen_if 338
|
||||
#define comp_iter 336
|
||||
#define comp_for 337
|
||||
#define comp_if 338
|
||||
#define testlist1 339
|
||||
#define encoding_decl 340
|
||||
#define yield_expr 341
|
||||
|
|
|
@ -147,6 +147,9 @@ extern "C" {
|
|||
/* Support for opargs more than 16 bits long */
|
||||
#define EXTENDED_ARG 145
|
||||
|
||||
#define SET_ADD 146
|
||||
#define MAP_ADD 147
|
||||
|
||||
|
||||
enum cmp_op {PyCmp_LT=Py_LT, PyCmp_LE=Py_LE, PyCmp_EQ=Py_EQ, PyCmp_NE=Py_NE, PyCmp_GT=Py_GT, PyCmp_GE=Py_GE,
|
||||
PyCmp_IN, PyCmp_NOT_IN, PyCmp_IS, PyCmp_IS_NOT, PyCmp_EXC_MATCH, PyCmp_BAD};
|
||||
|
|
|
@ -42,6 +42,7 @@ typedef struct _symtable_entry {
|
|||
an argument */
|
||||
int ste_lineno; /* first line of block */
|
||||
int ste_opt_lineno; /* lineno of last exec or import * */
|
||||
int ste_tmpname; /* counter for listcomp temp vars */
|
||||
struct symtable *ste_table;
|
||||
} PySTEntryObject;
|
||||
|
||||
|
|
|
@ -890,6 +890,51 @@ class ListCompIf(Node):
|
|||
def __repr__(self):
|
||||
return "ListCompIf(%s)" % (repr(self.test),)
|
||||
|
||||
class SetComp(Node):
|
||||
def __init__(self, expr, quals, lineno=None):
|
||||
self.expr = expr
|
||||
self.quals = quals
|
||||
self.lineno = lineno
|
||||
|
||||
def getChildren(self):
|
||||
children = []
|
||||
children.append(self.expr)
|
||||
children.extend(flatten(self.quals))
|
||||
return tuple(children)
|
||||
|
||||
def getChildNodes(self):
|
||||
nodelist = []
|
||||
nodelist.append(self.expr)
|
||||
nodelist.extend(flatten_nodes(self.quals))
|
||||
return tuple(nodelist)
|
||||
|
||||
def __repr__(self):
|
||||
return "SetComp(%s, %s)" % (repr(self.expr), repr(self.quals))
|
||||
|
||||
class DictComp(Node):
|
||||
def __init__(self, key, value, quals, lineno=None):
|
||||
self.key = key
|
||||
self.value = value
|
||||
self.quals = quals
|
||||
self.lineno = lineno
|
||||
|
||||
def getChildren(self):
|
||||
children = []
|
||||
children.append(self.key)
|
||||
children.append(self.value)
|
||||
children.extend(flatten(self.quals))
|
||||
return tuple(children)
|
||||
|
||||
def getChildNodes(self):
|
||||
nodelist = []
|
||||
nodelist.append(self.key)
|
||||
nodelist.append(self.value)
|
||||
nodelist.extend(flatten_nodes(self.quals))
|
||||
return tuple(nodelist)
|
||||
|
||||
def __repr__(self):
|
||||
return "DictComp(%s, %s, %s)" % (repr(self.key), repr(self.value), repr(self.quals))
|
||||
|
||||
class Mod(Node):
|
||||
def __init__(self, leftright, lineno=None):
|
||||
self.left = leftright[0]
|
||||
|
|
|
@ -685,7 +685,9 @@ class StackDepthTracker:
|
|||
effect = {
|
||||
'POP_TOP': -1,
|
||||
'DUP_TOP': 1,
|
||||
'LIST_APPEND': -2,
|
||||
'LIST_APPEND': -1,
|
||||
'SET_ADD': -1,
|
||||
'MAP_ADD': -2,
|
||||
'SLICE+1': -1,
|
||||
'SLICE+2': -1,
|
||||
'SLICE+3': -2,
|
||||
|
|
|
@ -589,6 +589,55 @@ class CodeGenerator:
|
|||
self.emit('JUMP_ABSOLUTE', start)
|
||||
self.startBlock(anchor)
|
||||
|
||||
def visitSetComp(self, node):
|
||||
self.set_lineno(node)
|
||||
# setup the (initially empty) result set
|
||||
self.emit('BUILD_SET', 0)
|
||||
|
||||
stack = []
|
||||
for i, for_ in zip(range(len(node.quals)), node.quals):
|
||||
start, anchor = self.visit(for_)
|
||||
cont = None
|
||||
for if_ in for_.ifs:
|
||||
if cont is None:
|
||||
cont = self.newBlock()
|
||||
self.visit(if_, cont)
|
||||
stack.insert(0, (start, cont, anchor))
|
||||
|
||||
self.visit(node.expr)
|
||||
self.emit('SET_ADD', len(node.quals) + 1)
|
||||
|
||||
for start, cont, anchor in stack:
|
||||
if cont:
|
||||
self.nextBlock(cont)
|
||||
self.emit('JUMP_ABSOLUTE', start)
|
||||
self.startBlock(anchor)
|
||||
|
||||
def visitDictComp(self, node):
|
||||
self.set_lineno(node)
|
||||
# setup the (initially empty) result dict
|
||||
self.emit('BUILD_MAP', 0)
|
||||
|
||||
stack = []
|
||||
for i, for_ in zip(range(len(node.quals)), node.quals):
|
||||
start, anchor = self.visit(for_)
|
||||
cont = None
|
||||
for if_ in for_.ifs:
|
||||
if cont is None:
|
||||
cont = self.newBlock()
|
||||
self.visit(if_, cont)
|
||||
stack.insert(0, (start, cont, anchor))
|
||||
|
||||
self.visit(node.value)
|
||||
self.visit(node.key)
|
||||
self.emit('MAP_ADD', len(node.quals) + 1)
|
||||
|
||||
for start, cont, anchor in stack:
|
||||
if cont:
|
||||
self.nextBlock(cont)
|
||||
self.emit('JUMP_ABSOLUTE', start)
|
||||
self.startBlock(anchor)
|
||||
|
||||
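A usage sketch for the two new visitors above, along the lines of the unit tests
added later in this commit (assumes the patched pure-Python compiler package on
a 2.7 interpreter):

    import compiler

    # The package-level compile() now accepts set and dict comprehensions.
    code = compiler.compile('{x: x * x for x in range(4)}', '<string>', 'eval')
    print eval(code)                    # -> {0: 0, 1: 1, 2: 4, 3: 9}
    code = compiler.compile('{x * x for x in range(4)}', '<string>', 'eval')
    print eval(code) == {0, 1, 4, 9}    # -> True
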
def visitListCompFor(self, node):
|
||||
start = self.newBlock()
|
||||
anchor = self.newBlock()
|
||||
|
|
|
@ -581,8 +581,10 @@ class Transformer:
|
|||
testlist1 = testlist
|
||||
exprlist = testlist
|
||||
|
||||
def testlist_gexp(self, nodelist):
|
||||
if len(nodelist) == 2 and nodelist[1][0] == symbol.gen_for:
|
||||
def testlist_comp(self, nodelist):
|
||||
# test ( comp_for | (',' test)* [','] )
|
||||
assert nodelist[0][0] == symbol.test
|
||||
if len(nodelist) == 2 and nodelist[1][0] == symbol.comp_for:
|
||||
test = self.com_node(nodelist[0])
|
||||
return self.com_generator_expression(test, nodelist[1])
|
||||
return self.testlist(nodelist)
|
||||
|
@ -1001,7 +1003,7 @@ class Transformer:
|
|||
# loop to avoid trivial recursion
|
||||
while 1:
|
||||
t = node[0]
|
||||
if t in (symbol.exprlist, symbol.testlist, symbol.testlist_safe, symbol.testlist_gexp):
|
||||
if t in (symbol.exprlist, symbol.testlist, symbol.testlist_safe, symbol.testlist_comp):
|
||||
if len(node) > 2:
|
||||
return self.com_assign_tuple(node, assigning)
|
||||
node = node[1]
|
||||
|
@ -1099,7 +1101,6 @@ class Transformer:
|
|||
else:
|
||||
stmts.append(result)
|
||||
|
||||
if hasattr(symbol, 'list_for'):
|
||||
def com_list_constructor(self, nodelist):
|
||||
# listmaker: test ( list_for | (',' test)* [','] )
|
||||
values = []
|
||||
|
@ -1114,11 +1115,17 @@ class Transformer:
|
|||
return List(values, lineno=values[0].lineno)
|
||||
|
||||
def com_list_comprehension(self, expr, node):
|
||||
return self.com_comprehension(expr, None, node, 'list')
|
||||
|
||||
def com_comprehension(self, expr1, expr2, node, type):
|
||||
# list_iter: list_for | list_if
|
||||
# list_for: 'for' exprlist 'in' testlist [list_iter]
|
||||
# list_if: 'if' test [list_iter]
|
||||
|
||||
# XXX should raise SyntaxError for assignment
|
||||
# XXX(avassalotti) Set and dict comprehensions should have generator
|
||||
# semantics. In other words, they shouldn't leak
|
||||
# variables outside of the comprehension's scope.
|
||||
|
||||
lineno = node[1][2]
|
||||
fors = []
|
||||
|
@ -1126,43 +1133,51 @@ class Transformer:
|
|||
t = node[1][1]
|
||||
if t == 'for':
|
||||
assignNode = self.com_assign(node[2], OP_ASSIGN)
|
||||
listNode = self.com_node(node[4])
|
||||
newfor = ListCompFor(assignNode, listNode, [])
|
||||
compNode = self.com_node(node[4])
|
||||
newfor = ListCompFor(assignNode, compNode, [])
|
||||
newfor.lineno = node[1][2]
|
||||
fors.append(newfor)
|
||||
if len(node) == 5:
|
||||
node = None
|
||||
else:
|
||||
elif type == 'list':
|
||||
node = self.com_list_iter(node[5])
|
||||
else:
|
||||
node = self.com_comp_iter(node[5])
|
||||
elif t == 'if':
|
||||
test = self.com_node(node[2])
|
||||
newif = ListCompIf(test, lineno=node[1][2])
|
||||
newfor.ifs.append(newif)
|
||||
if len(node) == 3:
|
||||
node = None
|
||||
else:
|
||||
elif type == 'list':
|
||||
node = self.com_list_iter(node[3])
|
||||
else:
|
||||
node = self.com_comp_iter(node[3])
|
||||
else:
|
||||
raise SyntaxError, \
|
||||
("unexpected list comprehension element: %s %d"
|
||||
("unexpected comprehension element: %s %d"
|
||||
% (node, lineno))
|
||||
return ListComp(expr, fors, lineno=lineno)
|
||||
if type == 'list':
|
||||
return ListComp(expr1, fors, lineno=lineno)
|
||||
elif type == 'set':
|
||||
return SetComp(expr1, fors, lineno=lineno)
|
||||
elif type == 'dict':
|
||||
return DictComp(expr1, expr2, fors, lineno=lineno)
|
||||
else:
|
||||
raise ValueError("unexpected comprehension type: " + repr(type))
|
||||
|
||||
def com_list_iter(self, node):
|
||||
assert node[0] == symbol.list_iter
|
||||
return node[1]
|
||||
else:
|
||||
def com_list_constructor(self, nodelist):
|
||||
values = []
|
||||
for i in range(1, len(nodelist), 2):
|
||||
values.append(self.com_node(nodelist[i]))
|
||||
return List(values, lineno=values[0].lineno)
|
||||
|
||||
if hasattr(symbol, 'gen_for'):
|
||||
def com_comp_iter(self, node):
|
||||
assert node[0] == symbol.comp_iter
|
||||
return node[1]
|
||||
|
||||
def com_generator_expression(self, expr, node):
|
||||
# gen_iter: gen_for | gen_if
|
||||
# gen_for: 'for' exprlist 'in' test [gen_iter]
|
||||
# gen_if: 'if' test [gen_iter]
|
||||
# comp_iter: comp_for | comp_if
|
||||
# comp_for: 'for' exprlist 'in' test [comp_iter]
|
||||
# comp_if: 'if' test [comp_iter]
|
||||
|
||||
lineno = node[1][2]
|
||||
fors = []
|
||||
|
@ -1177,7 +1192,7 @@ class Transformer:
|
|||
if (len(node)) == 5:
|
||||
node = None
|
||||
else:
|
||||
node = self.com_gen_iter(node[5])
|
||||
node = self.com_comp_iter(node[5])
|
||||
elif t == 'if':
|
||||
test = self.com_node(node[2])
|
||||
newif = GenExprIf(test, lineno=node[1][2])
|
||||
|
@ -1185,7 +1200,7 @@ class Transformer:
|
|||
if len(node) == 3:
|
||||
node = None
|
||||
else:
|
||||
node = self.com_gen_iter(node[3])
|
||||
node = self.com_comp_iter(node[3])
|
||||
else:
|
||||
raise SyntaxError, \
|
||||
("unexpected generator expression element: %s %d"
|
||||
|
@ -1193,22 +1208,31 @@ class Transformer:
|
|||
fors[0].is_outmost = True
|
||||
return GenExpr(GenExprInner(expr, fors), lineno=lineno)
|
||||
|
||||
def com_gen_iter(self, node):
|
||||
assert node[0] == symbol.gen_iter
|
||||
return node[1]
|
||||
|
||||
def com_dictorsetmaker(self, nodelist):
|
||||
# dictorsetmaker: ( (test ':' test (',' test ':' test)* [',']) |
|
||||
# (test (',' test)* [',']) )
|
||||
# dictorsetmaker: ( (test ':' test (comp_for | (',' test ':' test)* [','])) |
|
||||
# (test (comp_for | (',' test)* [','])) )
|
||||
assert nodelist[0] == symbol.dictorsetmaker
|
||||
if len(nodelist) == 2 or nodelist[2][0] == token.COMMA:
|
||||
nodelist = nodelist[1:]
|
||||
if len(nodelist) == 1 or nodelist[1][0] == token.COMMA:
|
||||
# set literal
|
||||
items = []
|
||||
for i in range(1, len(nodelist), 2):
|
||||
for i in range(0, len(nodelist), 2):
|
||||
items.append(self.com_node(nodelist[i]))
|
||||
return Set(items, lineno=items[0].lineno)
|
||||
elif nodelist[1][0] == symbol.comp_for:
|
||||
# set comprehension
|
||||
expr = self.com_node(nodelist[0])
|
||||
return self.com_comprehension(expr, None, nodelist[1], 'set')
|
||||
elif len(nodelist) > 3 and nodelist[3][0] == symbol.comp_for:
|
||||
# dict comprehension
|
||||
assert nodelist[1][0] == token.COLON
|
||||
key = self.com_node(nodelist[0])
|
||||
value = self.com_node(nodelist[2])
|
||||
return self.com_comprehension(key, value, nodelist[3], 'dict')
|
||||
else:
|
||||
# dict literal
|
||||
items = []
|
||||
for i in range(1, len(nodelist), 4):
|
||||
for i in range(0, len(nodelist), 4):
|
||||
items.append((self.com_node(nodelist[i]),
|
||||
self.com_node(nodelist[i+2])))
|
||||
return Dict(items, lineno=items[0][0].lineno)
|
||||
|
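A quick way to exercise this dispatch from the interactive prompt (editor's
sketch, assuming the patched compiler package; the exact node reprs are omitted):

    from compiler import parse

    parse("{1, 2}", "eval")             # set literal         -> Set(...)
    parse("{x for x in y}", "eval")     # set comprehension   -> SetComp(...)
    parse("{1: 2}", "eval")             # dict literal        -> Dict(...)
    parse("{x: y for x in z}", "eval")  # dict comprehension  -> DictComp(...)
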
@ -1257,7 +1281,7 @@ class Transformer:
|
|||
kw, result = self.com_argument(node, kw, star_node)
|
||||
|
||||
if len_nodelist != 2 and isinstance(result, GenExpr) \
|
||||
and len(node) == 3 and node[2][0] == symbol.gen_for:
|
||||
and len(node) == 3 and node[2][0] == symbol.comp_for:
|
||||
# allow f(x for x in y), but reject f(x for x in y, 1)
|
||||
# should use f((x for x in y), 1) instead of f(x for x in y, 1)
|
||||
raise SyntaxError, 'generator expression needs parenthesis'
|
||||
|
@ -1269,7 +1293,7 @@ class Transformer:
|
|||
lineno=extractLineNo(nodelist))
|
||||
|
||||
def com_argument(self, nodelist, kw, star_node):
|
||||
if len(nodelist) == 3 and nodelist[2][0] == symbol.gen_for:
|
||||
if len(nodelist) == 3 and nodelist[2][0] == symbol.comp_for:
|
||||
test = self.com_node(nodelist[1])
|
||||
return 0, self.com_generator_expression(test, nodelist[2])
|
||||
if len(nodelist) == 2:
|
||||
|
|
|
@ -186,5 +186,7 @@ jrel_op('SETUP_WITH', 143)
|
|||
|
||||
def_op('EXTENDED_ARG', 145)
|
||||
EXTENDED_ARG = 145
|
||||
def_op('SET_ADD', 146)
|
||||
def_op('MAP_ADD', 147)
|
||||
|
||||
del def_op, name_op, jrel_op, jabs_op
|
||||
|
|
|
@ -74,7 +74,7 @@ factor = 316
|
|||
power = 317
|
||||
atom = 318
|
||||
listmaker = 319
|
||||
testlist_gexp = 320
|
||||
testlist_comp = 320
|
||||
lambdef = 321
|
||||
trailer = 322
|
||||
subscriptlist = 323
|
||||
|
@ -90,9 +90,9 @@ argument = 332
|
|||
list_iter = 333
|
||||
list_for = 334
|
||||
list_if = 335
|
||||
gen_iter = 336
|
||||
gen_for = 337
|
||||
gen_if = 338
|
||||
comp_iter = 336
|
||||
comp_for = 337
|
||||
comp_if = 338
|
||||
testlist1 = 339
|
||||
encoding_decl = 340
|
||||
yield_expr = 341
|
||||
|
|
|
@ -140,6 +140,36 @@ class CompilerTest(unittest.TestCase):
|
|||
'eval')
|
||||
self.assertEquals(eval(c), [(0, 3), (1, 3), (2, 3)])
|
||||
|
||||
def testSetLiteral(self):
|
||||
c = compiler.compile('{1, 2, 3}', '<string>', 'eval')
|
||||
self.assertEquals(eval(c), {1,2,3})
|
||||
c = compiler.compile('{1, 2, 3,}', '<string>', 'eval')
|
||||
self.assertEquals(eval(c), {1,2,3})
|
||||
|
||||
def testDictLiteral(self):
|
||||
c = compiler.compile('{1:2, 2:3, 3:4}', '<string>', 'eval')
|
||||
self.assertEquals(eval(c), {1:2, 2:3, 3:4})
|
||||
c = compiler.compile('{1:2, 2:3, 3:4,}', '<string>', 'eval')
|
||||
self.assertEquals(eval(c), {1:2, 2:3, 3:4})
|
||||
|
||||
def testSetComp(self):
|
||||
c = compiler.compile('{x for x in range(1, 4)}', '<string>', 'eval')
|
||||
self.assertEquals(eval(c), {1, 2, 3})
|
||||
c = compiler.compile('{x * y for x in range(3) if x != 0'
|
||||
' for y in range(4) if y != 0}',
|
||||
'<string>',
|
||||
'eval')
|
||||
self.assertEquals(eval(c), {1, 2, 3, 4, 6})
|
||||
|
||||
def testDictComp(self):
|
||||
c = compiler.compile('{x:x+1 for x in range(1, 4)}', '<string>', 'eval')
|
||||
self.assertEquals(eval(c), {1:2, 2:3, 3:4})
|
||||
c = compiler.compile('{(x, y) : y for x in range(2) if x != 0'
|
||||
' for y in range(3) if y != 0}',
|
||||
'<string>',
|
||||
'eval')
|
||||
self.assertEquals(eval(c), {(1, 2): 2, (1, 1): 1})
|
||||
|
||||
def testWith(self):
|
||||
# SF bug 1638243
|
||||
c = compiler.compile('from __future__ import with_statement\n'
|
||||
|
@ -248,6 +278,8 @@ l[0]
|
|||
l[3:4]
|
||||
d = {'a': 2}
|
||||
d = {}
|
||||
d = {x: y for x, y in zip(range(5), range(5,10))}
|
||||
s = {x for x in range(10)}
|
||||
s = {1}
|
||||
t = ()
|
||||
t = (1, 2)
|
||||
|
|
|
@ -0,0 +1,54 @@
|
|||
|
||||
doctests = """
|
||||
|
||||
>>> k = "old value"
|
||||
>>> { k: None for k in range(10) }
|
||||
{0: None, 1: None, 2: None, 3: None, 4: None, 5: None, 6: None, 7: None, 8: None, 9: None}
|
||||
>>> k
|
||||
'old value'
|
||||
|
||||
>>> { k: k+10 for k in range(10) }
|
||||
{0: 10, 1: 11, 2: 12, 3: 13, 4: 14, 5: 15, 6: 16, 7: 17, 8: 18, 9: 19}
|
||||
|
||||
>>> g = "Global variable"
|
||||
>>> { k: g for k in range(10) }
|
||||
{0: 'Global variable', 1: 'Global variable', 2: 'Global variable', 3: 'Global variable', 4: 'Global variable', 5: 'Global variable', 6: 'Global variable', 7: 'Global variable', 8: 'Global variable', 9: 'Global variable'}
|
||||
|
||||
>>> { k: v for k in range(10) for v in range(10) if k == v }
|
||||
{0: 0, 1: 1, 2: 2, 3: 3, 4: 4, 5: 5, 6: 6, 7: 7, 8: 8, 9: 9}
|
||||
|
||||
>>> { k: v for v in range(10) for k in range(v*9, v*10) }
|
||||
{9: 1, 18: 2, 19: 2, 27: 3, 28: 3, 29: 3, 36: 4, 37: 4, 38: 4, 39: 4, 45: 5, 46: 5, 47: 5, 48: 5, 49: 5, 54: 6, 55: 6, 56: 6, 57: 6, 58: 6, 59: 6, 63: 7, 64: 7, 65: 7, 66: 7, 67: 7, 68: 7, 69: 7, 72: 8, 73: 8, 74: 8, 75: 8, 76: 8, 77: 8, 78: 8, 79: 8, 81: 9, 82: 9, 83: 9, 84: 9, 85: 9, 86: 9, 87: 9, 88: 9, 89: 9}
|
||||
|
||||
>>> { x: y for y, x in ((1, 2), (3, 4)) } = 5 # doctest: +IGNORE_EXCEPTION_DETAIL
|
||||
Traceback (most recent call last):
|
||||
...
|
||||
SyntaxError: ...
|
||||
|
||||
>>> { x: y for y, x in ((1, 2), (3, 4)) } += 5 # doctest: +IGNORE_EXCEPTION_DETAIL
|
||||
Traceback (most recent call last):
|
||||
...
|
||||
SyntaxError: ...
|
||||
|
||||
"""
|
||||
|
||||
__test__ = {'doctests' : doctests}
|
||||
|
||||
def test_main(verbose=None):
|
||||
import sys
|
||||
from test import test_support
|
||||
from test import test_dictcomps
|
||||
test_support.run_doctest(test_dictcomps, verbose)
|
||||
|
||||
# verify reference counting
|
||||
if verbose and hasattr(sys, "gettotalrefcount"):
|
||||
import gc
|
||||
counts = [None] * 5
|
||||
for i in range(len(counts)):
|
||||
test_support.run_doctest(test_dictcomps, verbose)
|
||||
gc.collect()
|
||||
counts[i] = sys.gettotalrefcount()
|
||||
print(counts)
|
||||
|
||||
if __name__ == "__main__":
|
||||
test_main(verbose=True)
|
|
@ -808,6 +808,13 @@ hello world
|
|||
pass
|
||||
self.assertEqual(G.decorated, True)
|
||||
|
||||
def testDictcomps(self):
|
||||
# dictorsetmaker: ( (test ':' test (comp_for |
|
||||
# (',' test ':' test)* [','])) |
|
||||
# (test (comp_for | (',' test)* [','])) )
|
||||
nums = [1, 2, 3]
|
||||
self.assertEqual({i:i+1 for i in nums}, {1: 2, 2: 3, 3: 4})
|
||||
|
||||
def testListcomps(self):
|
||||
# list comprehension tests
|
||||
nums = [1, 2, 3, 4, 5]
|
||||
|
|
|
@ -76,9 +76,20 @@ class RoundtripLegalSyntaxTestCase(unittest.TestCase):
|
|||
self.check_expr("[x**3 for x in range(20)]")
|
||||
self.check_expr("[x**3 for x in range(20) if x % 3]")
|
||||
self.check_expr("[x**3 for x in range(20) if x % 2 if x % 3]")
|
||||
self.check_expr("[x+y for x in range(30) for y in range(20) if x % 2 if y % 3]")
|
||||
#self.check_expr("[x for x in lambda: True, lambda: False if x()]")
|
||||
self.check_expr("list(x**3 for x in range(20))")
|
||||
self.check_expr("list(x**3 for x in range(20) if x % 3)")
|
||||
self.check_expr("list(x**3 for x in range(20) if x % 2 if x % 3)")
|
||||
self.check_expr("list(x+y for x in range(30) for y in range(20) if x % 2 if y % 3)")
|
||||
self.check_expr("{x**3 for x in range(30)}")
|
||||
self.check_expr("{x**3 for x in range(30) if x % 3}")
|
||||
self.check_expr("{x**3 for x in range(30) if x % 2 if x % 3}")
|
||||
self.check_expr("{x+y for x in range(30) for y in range(20) if x % 2 if y % 3}")
|
||||
self.check_expr("{x**3: y**2 for x, y in zip(range(30), range(30))}")
|
||||
self.check_expr("{x**3: y**2 for x, y in zip(range(30), range(30)) if x % 3}")
|
||||
self.check_expr("{x**3: y**2 for x, y in zip(range(30), range(30)) if x % 3 if y % 3}")
|
||||
self.check_expr("{x:y for x in range(30) for y in range(20) if x % 2 if y % 3}")
|
||||
self.check_expr("foo(*args)")
|
||||
self.check_expr("foo(*args, **kw)")
|
||||
self.check_expr("foo(**kw)")
|
||||
|
@ -107,6 +118,7 @@ class RoundtripLegalSyntaxTestCase(unittest.TestCase):
|
|||
self.check_expr("lambda foo=bar, blaz=blat+2, **z: 0")
|
||||
self.check_expr("lambda foo=bar, blaz=blat+2, *y, **z: 0")
|
||||
self.check_expr("lambda x, *y, **z: 0")
|
||||
self.check_expr("lambda x: 5 if x else 2")
|
||||
self.check_expr("(x for x in range(10))")
|
||||
self.check_expr("foo(x for x in range(10))")
|
||||
|
||||
|
|
|
@ -0,0 +1,151 @@
|
|||
doctests = """
|
||||
########### Tests mostly copied from test_listcomps.py ############
|
||||
|
||||
Test simple loop with conditional
|
||||
|
||||
>>> sum({i*i for i in range(100) if i&1 == 1})
|
||||
166650
|
||||
|
||||
Test simple case
|
||||
|
||||
>>> {2*y + x + 1 for x in (0,) for y in (1,)}
|
||||
set([3])
|
||||
|
||||
Test simple nesting
|
||||
|
||||
>>> list(sorted({(i,j) for i in range(3) for j in range(4)}))
|
||||
[(0, 0), (0, 1), (0, 2), (0, 3), (1, 0), (1, 1), (1, 2), (1, 3), (2, 0), (2, 1), (2, 2), (2, 3)]
|
||||
|
||||
Test nesting with the inner expression dependent on the outer
|
||||
|
||||
>>> list(sorted({(i,j) for i in range(4) for j in range(i)}))
|
||||
[(1, 0), (2, 0), (2, 1), (3, 0), (3, 1), (3, 2)]
|
||||
|
||||
Make sure the induction variable is not exposed
|
||||
|
||||
>>> i = 20
|
||||
>>> sum({i*i for i in range(100)})
|
||||
328350
|
||||
|
||||
>>> i
|
||||
20
|
||||
|
||||
Verify that syntax errors are raised for setcomps used as lvalues
|
||||
|
||||
>>> {y for y in (1,2)} = 10 # doctest: +IGNORE_EXCEPTION_DETAIL
|
||||
Traceback (most recent call last):
|
||||
...
|
||||
SyntaxError: ...
|
||||
|
||||
>>> {y for y in (1,2)} += 10 # doctest: +IGNORE_EXCEPTION_DETAIL
|
||||
Traceback (most recent call last):
|
||||
...
|
||||
SyntaxError: ...
|
||||
|
||||
|
||||
Make a nested set comprehension that acts like set(range())
|
||||
|
||||
>>> def srange(n):
|
||||
... return {i for i in range(n)}
|
||||
>>> list(sorted(srange(10)))
|
||||
[0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
|
||||
|
||||
Same again, only as a lambda expression instead of a function definition
|
||||
|
||||
>>> lrange = lambda n: {i for i in range(n)}
|
||||
>>> list(sorted(lrange(10)))
|
||||
[0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
|
||||
|
||||
Generators can call other generators:
|
||||
|
||||
>>> def grange(n):
|
||||
... for x in {i for i in range(n)}:
|
||||
... yield x
|
||||
>>> list(sorted(grange(5)))
|
||||
[0, 1, 2, 3, 4]
|
||||
|
||||
|
||||
Make sure that None is a valid return value
|
||||
|
||||
>>> {None for i in range(10)}
|
||||
set([None])
|
||||
|
||||
########### Tests for various scoping corner cases ############
|
||||
|
||||
Return lambdas that use the iteration variable as a default argument
|
||||
|
||||
>>> items = {(lambda i=i: i) for i in range(5)}
|
||||
>>> {x() for x in items} == set(range(5))
|
||||
True
|
||||
|
||||
Same again, only this time as a closure variable
|
||||
|
||||
>>> items = {(lambda: i) for i in range(5)}
|
||||
>>> {x() for x in items}
|
||||
set([4])
|
||||
|
||||
Another way to test that the iteration variable is local to the list comp
|
||||
|
||||
>>> items = {(lambda: i) for i in range(5)}
|
||||
>>> i = 20
|
||||
>>> {x() for x in items}
|
||||
set([4])
|
||||
|
||||
And confirm that a closure can jump over the list comp scope
|
||||
|
||||
>>> items = {(lambda: y) for i in range(5)}
|
||||
>>> y = 2
|
||||
>>> {x() for x in items}
|
||||
set([2])
|
||||
|
||||
We also repeat each of the above scoping tests inside a function
|
||||
|
||||
>>> def test_func():
|
||||
... items = {(lambda i=i: i) for i in range(5)}
|
||||
... return {x() for x in items}
|
||||
>>> test_func() == set(range(5))
|
||||
True
|
||||
|
||||
>>> def test_func():
|
||||
... items = {(lambda: i) for i in range(5)}
|
||||
... return {x() for x in items}
|
||||
>>> test_func()
|
||||
set([4])
|
||||
|
||||
>>> def test_func():
|
||||
... items = {(lambda: i) for i in range(5)}
|
||||
... i = 20
|
||||
... return {x() for x in items}
|
||||
>>> test_func()
|
||||
set([4])
|
||||
|
||||
>>> def test_func():
|
||||
... items = {(lambda: y) for i in range(5)}
|
||||
... y = 2
|
||||
... return {x() for x in items}
|
||||
>>> test_func()
|
||||
set([2])
|
||||
|
||||
"""
|
||||
|
||||
|
||||
__test__ = {'doctests' : doctests}
|
||||
|
||||
def test_main(verbose=None):
|
||||
import sys
|
||||
from test import test_support
|
||||
from test import test_setcomps
|
||||
test_support.run_doctest(test_setcomps, verbose)
|
||||
|
||||
# verify reference counting
|
||||
if verbose and hasattr(sys, "gettotalrefcount"):
|
||||
import gc
|
||||
counts = [None] * 5
|
||||
for i in range(len(counts)):
|
||||
test_support.run_doctest(test_setcomps, verbose)
|
||||
gc.collect()
|
||||
counts[i] = sys.gettotalrefcount()
|
||||
print(counts)
|
||||
|
||||
if __name__ == "__main__":
|
||||
test_main(verbose=True)
|
|
@ -938,9 +938,9 @@ VALIDATER(subscriptlist); VALIDATER(sliceop);
|
|||
VALIDATER(exprlist); VALIDATER(dictorsetmaker);
|
||||
VALIDATER(arglist); VALIDATER(argument);
|
||||
VALIDATER(listmaker); VALIDATER(yield_stmt);
|
||||
VALIDATER(testlist1); VALIDATER(gen_for);
|
||||
VALIDATER(gen_iter); VALIDATER(gen_if);
|
||||
VALIDATER(testlist_gexp); VALIDATER(yield_expr);
|
||||
VALIDATER(testlist1); VALIDATER(comp_for);
|
||||
VALIDATER(comp_iter); VALIDATER(comp_if);
|
||||
VALIDATER(testlist_comp); VALIDATER(yield_expr);
|
||||
VALIDATER(yield_or_testlist); VALIDATER(or_test);
|
||||
VALIDATER(old_test); VALIDATER(old_lambdef);
|
||||
|
||||
|
@ -1342,17 +1342,17 @@ validate_list_iter(node *tree)
|
|||
return res;
|
||||
}
|
||||
|
||||
/* gen_iter: gen_for | gen_if
|
||||
/* comp_iter: comp_for | comp_if
|
||||
*/
|
||||
static int
|
||||
validate_gen_iter(node *tree)
|
||||
validate_comp_iter(node *tree)
|
||||
{
|
||||
int res = (validate_ntype(tree, gen_iter)
|
||||
&& validate_numnodes(tree, 1, "gen_iter"));
|
||||
if (res && TYPE(CHILD(tree, 0)) == gen_for)
|
||||
res = validate_gen_for(CHILD(tree, 0));
|
||||
int res = (validate_ntype(tree, comp_iter)
|
||||
&& validate_numnodes(tree, 1, "comp_iter"));
|
||||
if (res && TYPE(CHILD(tree, 0)) == comp_for)
|
||||
res = validate_comp_for(CHILD(tree, 0));
|
||||
else
|
||||
res = validate_gen_if(CHILD(tree, 0));
|
||||
res = validate_comp_if(CHILD(tree, 0));
|
||||
|
||||
return res;
|
||||
}
|
||||
|
@ -1379,18 +1379,18 @@ validate_list_for(node *tree)
|
|||
return res;
|
||||
}
|
||||
|
||||
/* gen_for: 'for' exprlist 'in' test [gen_iter]
|
||||
/* comp_for: 'for' exprlist 'in' test [comp_iter]
|
||||
*/
|
||||
static int
|
||||
validate_gen_for(node *tree)
|
||||
validate_comp_for(node *tree)
|
||||
{
|
||||
int nch = NCH(tree);
|
||||
int res;
|
||||
|
||||
if (nch == 5)
|
||||
res = validate_gen_iter(CHILD(tree, 4));
|
||||
res = validate_comp_iter(CHILD(tree, 4));
|
||||
else
|
||||
res = validate_numnodes(tree, 4, "gen_for");
|
||||
res = validate_numnodes(tree, 4, "comp_for");
|
||||
|
||||
if (res)
|
||||
res = (validate_name(CHILD(tree, 0), "for")
|
||||
|
@ -1421,18 +1421,18 @@ validate_list_if(node *tree)
|
|||
return res;
|
||||
}
|
||||
|
||||
/* gen_if: 'if' old_test [gen_iter]
|
||||
/* comp_if: 'if' old_test [comp_iter]
|
||||
*/
|
||||
static int
|
||||
validate_gen_if(node *tree)
|
||||
validate_comp_if(node *tree)
|
||||
{
|
||||
int nch = NCH(tree);
|
||||
int res;
|
||||
|
||||
if (nch == 3)
|
||||
res = validate_gen_iter(CHILD(tree, 2));
|
||||
res = validate_comp_iter(CHILD(tree, 2));
|
||||
else
|
||||
res = validate_numnodes(tree, 2, "gen_if");
|
||||
res = validate_numnodes(tree, 2, "comp_if");
|
||||
|
||||
if (res)
|
||||
res = (validate_name(CHILD(tree, 0), "if")
|
||||
|
@ -2459,7 +2459,7 @@ validate_atom(node *tree)
|
|||
if (TYPE(CHILD(tree, 1))==yield_expr)
|
||||
res = validate_yield_expr(CHILD(tree, 1));
|
||||
else
|
||||
res = validate_testlist_gexp(CHILD(tree, 1));
|
||||
res = validate_testlist_comp(CHILD(tree, 1));
|
||||
}
|
||||
break;
|
||||
case LSQB:
|
||||
|
@ -2539,26 +2539,26 @@ validate_listmaker(node *tree)
|
|||
return ok;
|
||||
}
|
||||
|
||||
/* testlist_gexp:
|
||||
* test ( gen_for | (',' test)* [','] )
|
||||
/* testlist_comp:
|
||||
* test ( comp_for | (',' test)* [','] )
|
||||
*/
|
||||
static int
|
||||
validate_testlist_gexp(node *tree)
|
||||
validate_testlist_comp(node *tree)
|
||||
{
|
||||
int nch = NCH(tree);
|
||||
int ok = nch;
|
||||
|
||||
if (nch == 0)
|
||||
err_string("missing child nodes of testlist_gexp");
|
||||
err_string("missing child nodes of testlist_comp");
|
||||
else {
|
||||
ok = validate_test(CHILD(tree, 0));
|
||||
}
|
||||
|
||||
/*
|
||||
* gen_for | (',' test)* [',']
|
||||
* comp_for | (',' test)* [',']
|
||||
*/
|
||||
if (nch == 2 && TYPE(CHILD(tree, 1)) == gen_for)
|
||||
ok = validate_gen_for(CHILD(tree, 1));
|
||||
if (nch == 2 && TYPE(CHILD(tree, 1)) == comp_for)
|
||||
ok = validate_comp_for(CHILD(tree, 1));
|
||||
else {
|
||||
/* (',' test)* [','] */
|
||||
int i = 1;
|
||||
|
@ -2571,7 +2571,7 @@ validate_testlist_gexp(node *tree)
|
|||
ok = validate_comma(CHILD(tree, i));
|
||||
else if (i != nch) {
|
||||
ok = 0;
|
||||
err_string("illegal trailing nodes for testlist_gexp");
|
||||
err_string("illegal trailing nodes for testlist_comp");
|
||||
}
|
||||
}
|
||||
return ok;
|
||||
|
@ -2746,7 +2746,7 @@ validate_arglist(node *tree)
|
|||
for (i=0; i<nch; i++) {
|
||||
if (TYPE(CHILD(tree, i)) == argument) {
|
||||
node *ch = CHILD(tree, i);
|
||||
if (NCH(ch) == 2 && TYPE(CHILD(ch, 1)) == gen_for) {
|
||||
if (NCH(ch) == 2 && TYPE(CHILD(ch, 1)) == comp_for) {
|
||||
err_string("need '(', ')' for generator expression");
|
||||
return 0;
|
||||
}
|
||||
|
@ -2813,7 +2813,7 @@ validate_arglist(node *tree)
|
|||
|
||||
/* argument:
|
||||
*
|
||||
* [test '='] test [gen_for]
|
||||
* [test '='] test [comp_for]
|
||||
*/
|
||||
static int
|
||||
validate_argument(node *tree)
|
||||
|
@ -2824,7 +2824,7 @@ validate_argument(node *tree)
|
|||
&& validate_test(CHILD(tree, 0)));
|
||||
|
||||
if (res && (nch == 2))
|
||||
res = validate_gen_for(CHILD(tree, 1));
|
||||
res = validate_comp_for(CHILD(tree, 1));
|
||||
else if (res && (nch == 3))
|
||||
res = (validate_equal(CHILD(tree, 1))
|
||||
&& validate_test(CHILD(tree, 2)));
|
||||
|
@ -2965,12 +2965,19 @@ validate_exprlist(node *tree)
|
|||
}
|
||||
|
||||
|
||||
/*
|
||||
* dictorsetmaker:
|
||||
*
|
||||
* (test ':' test (comp_for | (',' test ':' test)* [','])) |
|
||||
* (test (comp_for | (',' test)* [',']))
|
||||
*/
|
||||
static int
|
||||
validate_dictorsetmaker(node *tree)
|
||||
{
|
||||
int nch = NCH(tree);
|
||||
int ok = validate_ntype(tree, dictorsetmaker);
|
||||
int i = 0;
|
||||
int check_trailing_comma = 0;
|
||||
|
||||
assert(nch > 0);
|
||||
|
||||
|
@ -2984,6 +2991,23 @@ validate_dictorsetmaker(node *tree)
|
|||
&& validate_test(CHILD(tree, i+1)));
|
||||
i += 2;
|
||||
}
|
||||
check_trailing_comma = 1;
|
||||
}
|
||||
else if (ok && TYPE(CHILD(tree, 1)) == comp_for) {
|
||||
/* We got a set comprehension:
|
||||
* test comp_for
|
||||
*/
|
||||
ok = (validate_test(CHILD(tree, 0))
|
||||
&& validate_comp_for(CHILD(tree, 1)));
|
||||
}
|
||||
else if (ok && NCH(tree) > 3 && TYPE(CHILD(tree, 3)) == comp_for) {
|
||||
/* We got a dict comprehension:
|
||||
* test ':' test comp_for
|
||||
*/
|
||||
ok = (validate_test(CHILD(tree, 0))
|
||||
&& validate_colon(CHILD(tree, 1))
|
||||
&& validate_test(CHILD(tree, 2))
|
||||
&& validate_comp_for(CHILD(tree, 3)));
|
||||
}
|
||||
else if (ok) {
|
||||
/* We got a dict:
|
||||
|
@ -3007,9 +3031,9 @@ validate_dictorsetmaker(node *tree)
|
|||
&& validate_test(CHILD(tree, i+3)));
|
||||
i += 4;
|
||||
}
|
||||
check_trailing_comma = 1;
|
||||
}
|
||||
/* Check for a trailing comma. */
|
||||
if (ok) {
|
||||
if (ok && check_trailing_comma) {
|
||||
if (i == nch-1)
|
||||
ok = validate_comma(CHILD(tree, i));
|
||||
else if (i != nch) {
|
||||
|
|
|
@ -58,6 +58,8 @@ module Python version "$Revision$"
|
|||
| Dict(expr* keys, expr* values)
|
||||
| Set(expr* elts)
|
||||
| ListComp(expr elt, comprehension* generators)
|
||||
| SetComp(expr elt, comprehension* generators)
|
||||
| DictComp(expr key, expr value, comprehension* generators)
|
||||
| GeneratorExp(expr elt, comprehension* generators)
|
||||
-- the grammar constrains where yield expressions can occur
|
||||
| Yield(expr? value)
|
||||
|
|
|
@ -197,6 +197,17 @@ static char *ListComp_fields[]={
|
|||
"elt",
|
||||
"generators",
|
||||
};
|
||||
static PyTypeObject *SetComp_type;
|
||||
static char *SetComp_fields[]={
|
||||
"elt",
|
||||
"generators",
|
||||
};
|
||||
static PyTypeObject *DictComp_type;
|
||||
static char *DictComp_fields[]={
|
||||
"key",
|
||||
"value",
|
||||
"generators",
|
||||
};
|
||||
static PyTypeObject *GeneratorExp_type;
|
||||
static char *GeneratorExp_fields[]={
|
||||
"elt",
|
||||
|
@ -726,6 +737,10 @@ static int init_types(void)
|
|||
if (!Set_type) return 0;
|
||||
ListComp_type = make_type("ListComp", expr_type, ListComp_fields, 2);
|
||||
if (!ListComp_type) return 0;
|
||||
SetComp_type = make_type("SetComp", expr_type, SetComp_fields, 2);
|
||||
if (!SetComp_type) return 0;
|
||||
DictComp_type = make_type("DictComp", expr_type, DictComp_fields, 3);
|
||||
if (!DictComp_type) return 0;
|
||||
GeneratorExp_type = make_type("GeneratorExp", expr_type,
|
||||
GeneratorExp_fields, 2);
|
||||
if (!GeneratorExp_type) return 0;
|
||||
|
@ -1630,6 +1645,54 @@ ListComp(expr_ty elt, asdl_seq * generators, int lineno, int col_offset,
|
|||
return p;
|
||||
}
|
||||
|
||||
expr_ty
|
||||
SetComp(expr_ty elt, asdl_seq * generators, int lineno, int col_offset, PyArena
|
||||
*arena)
|
||||
{
|
||||
expr_ty p;
|
||||
if (!elt) {
|
||||
PyErr_SetString(PyExc_ValueError,
|
||||
"field elt is required for SetComp");
|
||||
return NULL;
|
||||
}
|
||||
p = (expr_ty)PyArena_Malloc(arena, sizeof(*p));
|
||||
if (!p)
|
||||
return NULL;
|
||||
p->kind = SetComp_kind;
|
||||
p->v.SetComp.elt = elt;
|
||||
p->v.SetComp.generators = generators;
|
||||
p->lineno = lineno;
|
||||
p->col_offset = col_offset;
|
||||
return p;
|
||||
}
|
||||
|
||||
expr_ty
|
||||
DictComp(expr_ty key, expr_ty value, asdl_seq * generators, int lineno, int
|
||||
col_offset, PyArena *arena)
|
||||
{
|
||||
expr_ty p;
|
||||
if (!key) {
|
||||
PyErr_SetString(PyExc_ValueError,
|
||||
"field key is required for DictComp");
|
||||
return NULL;
|
||||
}
|
||||
if (!value) {
|
||||
PyErr_SetString(PyExc_ValueError,
|
||||
"field value is required for DictComp");
|
||||
return NULL;
|
||||
}
|
||||
p = (expr_ty)PyArena_Malloc(arena, sizeof(*p));
|
||||
if (!p)
|
||||
return NULL;
|
||||
p->kind = DictComp_kind;
|
||||
p->v.DictComp.key = key;
|
||||
p->v.DictComp.value = value;
|
||||
p->v.DictComp.generators = generators;
|
||||
p->lineno = lineno;
|
||||
p->col_offset = col_offset;
|
||||
return p;
|
||||
}
|
||||
|
||||
expr_ty
|
||||
GeneratorExp(expr_ty elt, asdl_seq * generators, int lineno, int col_offset,
|
||||
PyArena *arena)
|
||||
|
@ -2610,6 +2673,41 @@ ast2obj_expr(void* _o)
|
|||
goto failed;
|
||||
Py_DECREF(value);
|
||||
break;
|
||||
case SetComp_kind:
|
||||
result = PyType_GenericNew(SetComp_type, NULL, NULL);
|
||||
if (!result) goto failed;
|
||||
value = ast2obj_expr(o->v.SetComp.elt);
|
||||
if (!value) goto failed;
|
||||
if (PyObject_SetAttrString(result, "elt", value) == -1)
|
||||
goto failed;
|
||||
Py_DECREF(value);
|
||||
value = ast2obj_list(o->v.SetComp.generators,
|
||||
ast2obj_comprehension);
|
||||
if (!value) goto failed;
|
||||
if (PyObject_SetAttrString(result, "generators", value) == -1)
|
||||
goto failed;
|
||||
Py_DECREF(value);
|
||||
break;
|
||||
case DictComp_kind:
|
||||
result = PyType_GenericNew(DictComp_type, NULL, NULL);
|
||||
if (!result) goto failed;
|
||||
value = ast2obj_expr(o->v.DictComp.key);
|
||||
if (!value) goto failed;
|
||||
if (PyObject_SetAttrString(result, "key", value) == -1)
|
||||
goto failed;
|
||||
Py_DECREF(value);
|
||||
value = ast2obj_expr(o->v.DictComp.value);
|
||||
if (!value) goto failed;
|
||||
if (PyObject_SetAttrString(result, "value", value) == -1)
|
||||
goto failed;
|
||||
Py_DECREF(value);
|
||||
value = ast2obj_list(o->v.DictComp.generators,
|
||||
ast2obj_comprehension);
|
||||
if (!value) goto failed;
|
||||
if (PyObject_SetAttrString(result, "generators", value) == -1)
|
||||
goto failed;
|
||||
Py_DECREF(value);
|
||||
break;
|
||||
case GeneratorExp_kind:
|
||||
result = PyType_GenericNew(GeneratorExp_type, NULL, NULL);
|
||||
if (!result) goto failed;
|
||||
|
@ -4974,6 +5072,118 @@ obj2ast_expr(PyObject* obj, expr_ty* out, PyArena* arena)
|
|||
if (*out == NULL) goto failed;
|
||||
return 0;
|
||||
}
|
||||
isinstance = PyObject_IsInstance(obj, (PyObject*)SetComp_type);
|
||||
if (isinstance == -1) {
|
||||
return 1;
|
||||
}
|
||||
if (isinstance) {
|
||||
expr_ty elt;
|
||||
asdl_seq* generators;
|
||||
|
||||
if (PyObject_HasAttrString(obj, "elt")) {
|
||||
int res;
|
||||
tmp = PyObject_GetAttrString(obj, "elt");
|
||||
if (tmp == NULL) goto failed;
|
||||
res = obj2ast_expr(tmp, &elt, arena);
|
||||
if (res != 0) goto failed;
|
||||
Py_XDECREF(tmp);
|
||||
tmp = NULL;
|
||||
} else {
|
||||
PyErr_SetString(PyExc_TypeError, "required field \"elt\" missing from SetComp");
|
||||
return 1;
|
||||
}
|
||||
if (PyObject_HasAttrString(obj, "generators")) {
|
||||
int res;
|
||||
Py_ssize_t len;
|
||||
Py_ssize_t i;
|
||||
tmp = PyObject_GetAttrString(obj, "generators");
|
||||
if (tmp == NULL) goto failed;
|
||||
if (!PyList_Check(tmp)) {
|
||||
PyErr_Format(PyExc_TypeError, "SetComp field \"generators\" must be a list, not a %.200s", tmp->ob_type->tp_name);
|
||||
goto failed;
|
||||
}
|
||||
len = PyList_GET_SIZE(tmp);
|
||||
generators = asdl_seq_new(len, arena);
|
||||
if (generators == NULL) goto failed;
|
||||
for (i = 0; i < len; i++) {
|
||||
comprehension_ty value;
|
||||
res = obj2ast_comprehension(PyList_GET_ITEM(tmp, i), &value, arena);
|
||||
if (res != 0) goto failed;
|
||||
asdl_seq_SET(generators, i, value);
|
||||
}
|
||||
Py_XDECREF(tmp);
|
||||
tmp = NULL;
|
||||
} else {
|
||||
PyErr_SetString(PyExc_TypeError, "required field \"generators\" missing from SetComp");
|
||||
return 1;
|
||||
}
|
||||
*out = SetComp(elt, generators, lineno, col_offset, arena);
|
||||
if (*out == NULL) goto failed;
|
||||
return 0;
|
||||
}
|
||||
isinstance = PyObject_IsInstance(obj, (PyObject*)DictComp_type);
|
||||
if (isinstance == -1) {
|
||||
return 1;
|
||||
}
|
||||
if (isinstance) {
|
||||
expr_ty key;
|
||||
expr_ty value;
|
||||
asdl_seq* generators;
|
||||
|
||||
if (PyObject_HasAttrString(obj, "key")) {
|
||||
int res;
|
||||
tmp = PyObject_GetAttrString(obj, "key");
|
||||
if (tmp == NULL) goto failed;
|
||||
res = obj2ast_expr(tmp, &key, arena);
|
||||
if (res != 0) goto failed;
|
||||
Py_XDECREF(tmp);
|
||||
tmp = NULL;
|
||||
} else {
|
||||
PyErr_SetString(PyExc_TypeError, "required field \"key\" missing from DictComp");
|
||||
return 1;
|
||||
}
|
||||
if (PyObject_HasAttrString(obj, "value")) {
|
||||
int res;
|
||||
tmp = PyObject_GetAttrString(obj, "value");
|
||||
if (tmp == NULL) goto failed;
|
||||
res = obj2ast_expr(tmp, &value, arena);
|
||||
if (res != 0) goto failed;
|
||||
Py_XDECREF(tmp);
|
||||
tmp = NULL;
|
||||
} else {
|
||||
PyErr_SetString(PyExc_TypeError, "required field \"value\" missing from DictComp");
|
||||
return 1;
|
||||
}
|
||||
if (PyObject_HasAttrString(obj, "generators")) {
|
||||
int res;
|
||||
Py_ssize_t len;
|
||||
Py_ssize_t i;
|
||||
tmp = PyObject_GetAttrString(obj, "generators");
|
||||
if (tmp == NULL) goto failed;
|
||||
if (!PyList_Check(tmp)) {
|
||||
PyErr_Format(PyExc_TypeError, "DictComp field \"generators\" must be a list, not a %.200s", tmp->ob_type->tp_name);
|
||||
goto failed;
|
||||
}
|
||||
len = PyList_GET_SIZE(tmp);
|
||||
generators = asdl_seq_new(len, arena);
|
||||
if (generators == NULL) goto failed;
|
||||
for (i = 0; i < len; i++) {
|
||||
comprehension_ty value;
|
||||
res = obj2ast_comprehension(PyList_GET_ITEM(tmp, i), &value, arena);
|
||||
if (res != 0) goto failed;
|
||||
asdl_seq_SET(generators, i, value);
|
||||
}
|
||||
Py_XDECREF(tmp);
|
||||
tmp = NULL;
|
||||
} else {
|
||||
PyErr_SetString(PyExc_TypeError, "required field \"generators\" missing from DictComp");
|
||||
return 1;
|
||||
}
|
||||
*out = DictComp(key, value, generators, lineno, col_offset,
|
||||
arena);
|
||||
if (*out == NULL) goto failed;
|
||||
return 0;
|
||||
}
|
||||
isinstance = PyObject_IsInstance(obj, (PyObject*)GeneratorExp_type);
|
||||
if (isinstance == -1) {
|
||||
return 1;
|
||||
|
@ -6419,6 +6629,10 @@ init_ast(void)
|
|||
if (PyDict_SetItemString(d, "Set", (PyObject*)Set_type) < 0) return;
|
||||
if (PyDict_SetItemString(d, "ListComp", (PyObject*)ListComp_type) < 0)
|
||||
return;
|
||||
if (PyDict_SetItemString(d, "SetComp", (PyObject*)SetComp_type) < 0)
|
||||
return;
|
||||
if (PyDict_SetItemString(d, "DictComp", (PyObject*)DictComp_type) < 0)
|
||||
return;
|
||||
if (PyDict_SetItemString(d, "GeneratorExp",
|
||||
(PyObject*)GeneratorExp_type) < 0) return;
|
||||
if (PyDict_SetItemString(d, "Yield", (PyObject*)Yield_type) < 0) return;
|
||||
|
|
231
Python/ast.c
231
Python/ast.c
|
@ -31,7 +31,7 @@ static asdl_seq *ast_for_exprlist(struct compiling *, const node *,
|
|||
expr_context_ty);
|
||||
static expr_ty ast_for_testlist(struct compiling *, const node *);
|
||||
static stmt_ty ast_for_classdef(struct compiling *, const node *, asdl_seq *);
|
||||
static expr_ty ast_for_testlist_gexp(struct compiling *, const node *);
|
||||
static expr_ty ast_for_testlist_comp(struct compiling *, const node *);
|
||||
|
||||
/* Note different signature for ast_for_call */
|
||||
static expr_ty ast_for_call(struct compiling *, const node *, expr_ty);
|
||||
|
@ -44,6 +44,9 @@ static PyObject *parsestrplus(struct compiling *, const node *n);
|
|||
#define LINENO(n) ((n)->n_lineno)
|
||||
#endif
|
||||
|
||||
#define COMP_GENEXP 0
|
||||
#define COMP_SETCOMP 1
|
||||
|
||||
static identifier
|
||||
new_identifier(const char* n, PyArena *arena) {
|
||||
PyObject* id = PyString_InternFromString(n);
|
||||
|
@ -268,7 +271,7 @@ PyAST_FromNode(const node *n, PyCompilerFlags *flags, const char *filename,
|
|||
case eval_input: {
|
||||
expr_ty testlist_ast;
|
||||
|
||||
/* XXX Why not gen_for here? */
|
||||
/* XXX Why not comp_for here? */
|
||||
testlist_ast = ast_for_testlist(&c, CHILD(n, 0));
|
||||
if (!testlist_ast)
|
||||
goto error;
|
||||
|
@ -430,6 +433,12 @@ set_context(struct compiling *c, expr_ty e, expr_context_ty ctx, const node *n)
|
|||
case ListComp_kind:
|
||||
expr_name = "list comprehension";
|
||||
break;
|
||||
case SetComp_kind:
|
||||
expr_name = "set comprehension";
|
||||
break;
|
||||
case DictComp_kind:
|
||||
expr_name = "dict comprehension";
|
||||
break;
|
||||
case Dict_kind:
|
||||
case Num_kind:
|
||||
case Str_kind:
|
||||
|
@ -573,7 +582,7 @@ seq_for_testlist(struct compiling *c, const node *n)
|
|||
int i;
|
||||
assert(TYPE(n) == testlist ||
|
||||
TYPE(n) == listmaker ||
|
||||
TYPE(n) == testlist_gexp ||
|
||||
TYPE(n) == testlist_comp ||
|
||||
TYPE(n) == testlist_safe ||
|
||||
TYPE(n) == testlist1);
|
||||
|
||||
|
@ -1150,33 +1159,33 @@ ast_for_listcomp(struct compiling *c, const node *n)
|
|||
return ListComp(elt, listcomps, LINENO(n), n->n_col_offset, c->c_arena);
|
||||
}
|
||||
|
||||
/* Count the number of 'for' loops in a generator expression.
|
||||
/*
|
||||
Count the number of 'for' loops in a comprehension.
|
||||
|
||||
Helper for ast_for_genexp().
|
||||
Helper for ast_for_comprehension().
|
||||
*/
|
||||
|
||||
static int
|
||||
count_gen_fors(struct compiling *c, const node *n)
|
||||
count_comp_fors(struct compiling *c, const node *n)
|
||||
{
|
||||
int n_fors = 0;
|
||||
node *ch = CHILD(n, 1);
|
||||
|
||||
count_gen_for:
|
||||
count_comp_for:
|
||||
n_fors++;
|
||||
REQ(ch, gen_for);
|
||||
if (NCH(ch) == 5)
|
||||
ch = CHILD(ch, 4);
|
||||
REQ(n, comp_for);
|
||||
if (NCH(n) == 5)
|
||||
n = CHILD(n, 4);
|
||||
else
|
||||
return n_fors;
|
||||
count_gen_iter:
|
||||
REQ(ch, gen_iter);
|
||||
ch = CHILD(ch, 0);
|
||||
if (TYPE(ch) == gen_for)
|
||||
goto count_gen_for;
|
||||
else if (TYPE(ch) == gen_if) {
|
||||
if (NCH(ch) == 3) {
|
||||
ch = CHILD(ch, 2);
|
||||
goto count_gen_iter;
|
||||
count_comp_iter:
|
||||
REQ(n, comp_iter);
|
||||
n = CHILD(n, 0);
|
||||
if (TYPE(n) == comp_for)
|
||||
goto count_comp_for;
|
||||
else if (TYPE(n) == comp_if) {
|
||||
if (NCH(n) == 3) {
|
||||
n = CHILD(n, 2);
|
||||
goto count_comp_iter;
|
||||
}
|
||||
else
|
||||
return n_fors;
|
||||
|
@ -1184,26 +1193,26 @@ count_gen_fors(struct compiling *c, const node *n)
|
|||
|
||||
/* Should never be reached */
|
||||
PyErr_SetString(PyExc_SystemError,
|
||||
"logic error in count_gen_fors");
|
||||
"logic error in count_comp_fors");
|
||||
return -1;
|
||||
}
|
||||
|
||||
/* Count the number of 'if' statements in a generator expression.
|
||||
/* Count the number of 'if' statements in a comprehension.
|
||||
|
||||
Helper for ast_for_genexp().
|
||||
Helper for ast_for_comprehension().
|
||||
*/
|
||||
|
||||
static int
|
||||
count_gen_ifs(struct compiling *c, const node *n)
|
||||
count_comp_ifs(struct compiling *c, const node *n)
|
||||
{
|
||||
int n_ifs = 0;
|
||||
|
||||
while (1) {
|
||||
REQ(n, gen_iter);
|
||||
if (TYPE(CHILD(n, 0)) == gen_for)
|
||||
REQ(n, comp_iter);
|
||||
if (TYPE(CHILD(n, 0)) == comp_for)
|
||||
return n_ifs;
|
||||
n = CHILD(n, 0);
|
||||
REQ(n, gen_if);
|
||||
REQ(n, comp_if);
|
||||
n_ifs++;
|
||||
if (NCH(n) == 2)
|
||||
return n_ifs;
|
||||
|
@ -1211,46 +1220,33 @@ count_gen_ifs(struct compiling *c, const node *n)
|
|||
}
|
||||
}
|
||||
|
||||
/* TODO(jhylton): Combine with list comprehension code? */
|
||||
static expr_ty
|
||||
ast_for_genexp(struct compiling *c, const node *n)
|
||||
static asdl_seq *
|
||||
ast_for_comprehension(struct compiling *c, const node *n)
|
||||
{
|
||||
/* testlist_gexp: test ( gen_for | (',' test)* [','] )
|
||||
argument: [test '='] test [gen_for] # Really [keyword '='] test */
|
||||
expr_ty elt;
|
||||
asdl_seq *genexps;
|
||||
int i, n_fors;
|
||||
node *ch;
|
||||
asdl_seq *comps;
|
||||
|
||||
assert(TYPE(n) == (testlist_gexp) || TYPE(n) == (argument));
|
||||
assert(NCH(n) > 1);
|
||||
|
||||
elt = ast_for_expr(c, CHILD(n, 0));
|
||||
if (!elt)
|
||||
return NULL;
|
||||
|
||||
n_fors = count_gen_fors(c, n);
|
||||
n_fors = count_comp_fors(c, n);
|
||||
if (n_fors == -1)
|
||||
return NULL;
|
||||
|
||||
genexps = asdl_seq_new(n_fors, c->c_arena);
|
||||
if (!genexps)
|
||||
comps = asdl_seq_new(n_fors, c->c_arena);
|
||||
if (!comps)
|
||||
return NULL;
|
||||
|
||||
ch = CHILD(n, 1);
|
||||
for (i = 0; i < n_fors; i++) {
|
||||
comprehension_ty ge;
|
||||
comprehension_ty comp;
|
||||
asdl_seq *t;
|
||||
expr_ty expression, first;
|
||||
node *for_ch;
|
||||
|
||||
REQ(ch, gen_for);
|
||||
REQ(n, comp_for);
|
||||
|
||||
for_ch = CHILD(ch, 1);
|
||||
for_ch = CHILD(n, 1);
|
||||
t = ast_for_exprlist(c, for_ch, Store);
|
||||
if (!t)
|
||||
return NULL;
|
||||
expression = ast_for_expr(c, CHILD(ch, 3));
|
||||
expression = ast_for_expr(c, CHILD(n, 3));
|
||||
if (!expression)
|
||||
return NULL;
|
||||
|
||||
|
@ -1258,21 +1254,20 @@ ast_for_genexp(struct compiling *c, const node *n)
|
|||
(x for x, in ...) has 1 element in t, but still requires a Tuple. */
|
||||
first = (expr_ty)asdl_seq_GET(t, 0);
|
||||
if (NCH(for_ch) == 1)
|
||||
ge = comprehension(first, expression, NULL, c->c_arena);
|
||||
comp = comprehension(first, expression, NULL, c->c_arena);
|
||||
else
|
||||
ge = comprehension(Tuple(t, Store, first->lineno, first->col_offset,
|
||||
comp = comprehension(Tuple(t, Store, first->lineno, first->col_offset,
|
||||
c->c_arena),
|
||||
expression, NULL, c->c_arena);
|
||||
|
||||
if (!ge)
|
||||
if (!comp)
|
||||
return NULL;
|
||||
|
||||
if (NCH(ch) == 5) {
|
||||
if (NCH(n) == 5) {
|
||||
int j, n_ifs;
|
||||
asdl_seq *ifs;
|
||||
|
||||
ch = CHILD(ch, 4);
|
||||
n_ifs = count_gen_ifs(c, ch);
|
||||
n = CHILD(n, 4);
|
||||
n_ifs = count_comp_ifs(c, n);
|
||||
if (n_ifs == -1)
|
||||
return NULL;
|
||||
|
||||
|
@ -1281,32 +1276,94 @@ ast_for_genexp(struct compiling *c, const node *n)
|
|||
return NULL;
|
||||
|
||||
for (j = 0; j < n_ifs; j++) {
|
||||
REQ(ch, gen_iter);
|
||||
ch = CHILD(ch, 0);
|
||||
REQ(ch, gen_if);
|
||||
REQ(n, comp_iter);
|
||||
n = CHILD(n, 0);
|
||||
REQ(n, comp_if);
|
||||
|
||||
expression = ast_for_expr(c, CHILD(ch, 1));
|
||||
expression = ast_for_expr(c, CHILD(n, 1));
|
||||
if (!expression)
|
||||
return NULL;
|
||||
asdl_seq_SET(ifs, j, expression);
|
||||
if (NCH(ch) == 3)
|
||||
ch = CHILD(ch, 2);
|
||||
if (NCH(n) == 3)
|
||||
n = CHILD(n, 2);
|
||||
}
|
||||
/* on exit, must guarantee that ch is a gen_for */
|
||||
if (TYPE(ch) == gen_iter)
|
||||
ch = CHILD(ch, 0);
|
||||
ge->ifs = ifs;
|
||||
/* on exit, must guarantee that n is a comp_for */
|
||||
if (TYPE(n) == comp_iter)
|
||||
n = CHILD(n, 0);
|
||||
comp->ifs = ifs;
|
||||
}
|
||||
asdl_seq_SET(genexps, i, ge);
|
||||
asdl_seq_SET(comps, i, comp);
|
||||
}
|
||||
return comps;
|
||||
}
|
||||
|
||||
return GeneratorExp(elt, genexps, LINENO(n), n->n_col_offset, c->c_arena);
|
||||
static expr_ty
|
||||
ast_for_itercomp(struct compiling *c, const node *n, int type)
|
||||
{
|
||||
expr_ty elt;
|
||||
asdl_seq *comps;
|
||||
|
||||
assert(NCH(n) > 1);
|
||||
|
||||
elt = ast_for_expr(c, CHILD(n, 0));
|
||||
if (!elt)
|
||||
return NULL;
|
||||
|
||||
comps = ast_for_comprehension(c, CHILD(n, 1));
|
||||
if (!comps)
|
||||
return NULL;
|
||||
|
||||
if (type == COMP_GENEXP)
|
||||
return GeneratorExp(elt, comps, LINENO(n), n->n_col_offset, c->c_arena);
|
||||
else if (type == COMP_SETCOMP)
|
||||
return SetComp(elt, comps, LINENO(n), n->n_col_offset, c->c_arena);
|
||||
else
|
||||
/* Should never happen */
|
||||
return NULL;
|
||||
}
|
||||
|
||||
static expr_ty
|
||||
ast_for_dictcomp(struct compiling *c, const node *n)
|
||||
{
|
||||
expr_ty key, value;
|
||||
asdl_seq *comps;
|
||||
|
||||
assert(NCH(n) > 3);
|
||||
REQ(CHILD(n, 1), COLON);
|
||||
|
||||
key = ast_for_expr(c, CHILD(n, 0));
|
||||
if (!key)
|
||||
return NULL;
|
||||
|
||||
value = ast_for_expr(c, CHILD(n, 2));
|
||||
if (!value)
|
||||
return NULL;
|
||||
|
||||
comps = ast_for_comprehension(c, CHILD(n, 3));
|
||||
if (!comps)
|
||||
return NULL;
|
||||
|
||||
return DictComp(key, value, comps, LINENO(n), n->n_col_offset, c->c_arena);
|
||||
}
|
||||
|
||||
static expr_ty
|
||||
ast_for_genexp(struct compiling *c, const node *n)
|
||||
{
|
||||
assert(TYPE(n) == (testlist_comp) || TYPE(n) == (argument));
|
||||
return ast_for_itercomp(c, n, COMP_GENEXP);
|
||||
}
|
||||
|
||||
static expr_ty
|
||||
ast_for_setcomp(struct compiling *c, const node *n)
|
||||
{
|
||||
assert(TYPE(n) == (dictorsetmaker));
|
||||
return ast_for_itercomp(c, n, COMP_SETCOMP);
|
||||
}
|
||||
|
||||
static expr_ty
|
||||
ast_for_atom(struct compiling *c, const node *n)
|
||||
{
|
||||
/* atom: '(' [yield_expr|testlist_gexp] ')' | '[' [listmaker] ']'
|
||||
/* atom: '(' [yield_expr|testlist_comp] ')' | '[' [listmaker] ']'
|
||||
| '{' [dictmaker] '}' | '`' testlist '`' | NAME | NUMBER | STRING+
|
||||
*/
|
||||
node *ch = CHILD(n, 0);
|
||||
|
@ -1365,7 +1422,7 @@ ast_for_atom(struct compiling *c, const node *n)
|
|||
if (TYPE(ch) == yield_expr)
|
||||
return ast_for_expr(c, ch);
|
||||
|
||||
return ast_for_testlist_gexp(c, ch);
|
||||
return ast_for_testlist_comp(c, ch);
|
||||
case LSQB: /* list (or list comprehension) */
|
||||
ch = CHILD(n, 1);
|
||||
|
||||
|
@ -1383,8 +1440,9 @@ ast_for_atom(struct compiling *c, const node *n)
|
|||
else
|
||||
return ast_for_listcomp(c, ch);
|
||||
case LBRACE: {
|
||||
/* dictorsetmaker: test ':' test (',' test ':' test)* [','] |
|
||||
* test (',' test)* [','])
|
||||
/* dictorsetmaker:
|
||||
* (test ':' test (comp_for | (',' test ':' test)* [','])) |
|
||||
* (test (comp_for | (',' test)* [',']))
|
||||
*/
|
||||
int i, size;
|
||||
asdl_seq *keys, *values;
|
||||
|
@ -1408,6 +1466,11 @@ ast_for_atom(struct compiling *c, const node *n)
|
|||
asdl_seq_SET(elts, i / 2, expression);
|
||||
}
|
||||
return Set(elts, LINENO(n), n->n_col_offset, c->c_arena);
|
||||
} else if (TYPE(CHILD(ch, 1)) == comp_for) {
|
||||
/* it's a set comprehension */
|
||||
return ast_for_setcomp(c, ch);
|
||||
} else if (NCH(ch) > 3 && TYPE(CHILD(ch, 3)) == comp_for) {
|
||||
return ast_for_dictcomp(c, ch);
|
||||
} else {
|
||||
/* it's a dict */
|
||||
size = (NCH(ch) + 1) / 4; /* +1 in case no trailing comma */
|
||||
|
@ -1916,7 +1979,7 @@ ast_for_call(struct compiling *c, const node *n, expr_ty func)
|
|||
/*
|
||||
arglist: (argument ',')* (argument [',']| '*' test [',' '**' test]
|
||||
| '**' test)
|
||||
argument: [test '='] test [gen_for] # Really [keyword '='] test
|
||||
argument: [test '='] test [comp_for] # Really [keyword '='] test
|
||||
*/
|
||||
|
||||
int i, nargs, nkeywords, ngens;
|
||||
|
@ -1934,7 +1997,7 @@ ast_for_call(struct compiling *c, const node *n, expr_ty func)
|
|||
if (TYPE(ch) == argument) {
|
||||
if (NCH(ch) == 1)
|
||||
nargs++;
|
||||
else if (TYPE(CHILD(ch, 1)) == gen_for)
|
||||
else if (TYPE(CHILD(ch, 1)) == comp_for)
|
||||
ngens++;
|
||||
else
|
||||
nkeywords++;
|
||||
|
@ -1979,7 +2042,7 @@ ast_for_call(struct compiling *c, const node *n, expr_ty func)
|
|||
return NULL;
|
||||
asdl_seq_SET(args, nargs++, e);
|
||||
}
|
||||
else if (TYPE(CHILD(ch, 1)) == gen_for) {
|
||||
else if (TYPE(CHILD(ch, 1)) == comp_for) {
|
||||
e = ast_for_genexp(c, ch);
|
||||
if (!e)
|
||||
return NULL;
|
||||
|
@ -2049,14 +2112,14 @@ ast_for_call(struct compiling *c, const node *n, expr_ty func)
|
|||
static expr_ty
|
||||
ast_for_testlist(struct compiling *c, const node* n)
|
||||
{
|
||||
/* testlist_gexp: test (',' test)* [','] */
|
||||
/* testlist_comp: test (',' test)* [','] */
|
||||
/* testlist: test (',' test)* [','] */
|
||||
/* testlist_safe: test (',' test)+ [','] */
|
||||
/* testlist1: test (',' test)* */
|
||||
assert(NCH(n) > 0);
|
||||
if (TYPE(n) == testlist_gexp) {
|
||||
if (TYPE(n) == testlist_comp) {
|
||||
if (NCH(n) > 1)
|
||||
assert(TYPE(CHILD(n, 1)) != gen_for);
|
||||
assert(TYPE(CHILD(n, 1)) != comp_for);
|
||||
}
|
||||
else {
|
||||
assert(TYPE(n) == testlist ||
|
||||
|
@ -2074,12 +2137,12 @@ ast_for_testlist(struct compiling *c, const node* n)
|
|||
}
|
||||
|
||||
static expr_ty
|
||||
ast_for_testlist_gexp(struct compiling *c, const node* n)
|
||||
ast_for_testlist_comp(struct compiling *c, const node* n)
|
||||
{
|
||||
/* testlist_gexp: test ( gen_for | (',' test)* [','] ) */
|
||||
/* argument: test [ gen_for ] */
|
||||
assert(TYPE(n) == testlist_gexp || TYPE(n) == argument);
|
||||
if (NCH(n) > 1 && TYPE(CHILD(n, 1)) == gen_for)
|
||||
/* testlist_comp: test ( comp_for | (',' test)* [','] ) */
|
||||
/* argument: test [ comp_for ] */
|
||||
assert(TYPE(n) == testlist_comp || TYPE(n) == argument);
|
||||
if (NCH(n) > 1 && TYPE(CHILD(n, 1)) == comp_for)
|
||||
return ast_for_genexp(c, n);
|
||||
return ast_for_testlist(c, n);
|
||||
}
|
||||
|
|
|
@ -1455,6 +1455,17 @@ PyEval_EvalFrameEx(PyFrameObject *f, int throwflag)
|
|||
}
|
||||
break;
|
||||
|
||||
case SET_ADD:
|
||||
w = POP();
|
||||
v = stack_pointer[-oparg];
|
||||
err = PySet_Add(v, w);
|
||||
Py_DECREF(w);
|
||||
if (err == 0) {
|
||||
PREDICT(JUMP_ABSOLUTE);
|
||||
continue;
|
||||
}
|
||||
break;
|
||||
|
||||
case INPLACE_POWER:
|
||||
w = POP();
|
||||
v = TOP();
|
||||
|
@ -2223,6 +2234,21 @@ PyEval_EvalFrameEx(PyFrameObject *f, int throwflag)
|
|||
if (err == 0) continue;
|
||||
break;
|
||||
|
||||
case MAP_ADD:
|
||||
w = TOP(); /* key */
|
||||
u = SECOND(); /* value */
|
||||
STACKADJ(-2);
|
||||
v = stack_pointer[-oparg]; /* dict */
|
||||
assert (PyDict_CheckExact(v));
|
||||
err = PyDict_SetItem(v, w, u); /* v[w] = u */
|
||||
Py_DECREF(u);
|
||||
Py_DECREF(w);
|
||||
if (err == 0) {
|
||||
PREDICT(JUMP_ABSOLUTE);
|
||||
continue;
|
||||
}
|
||||
break;
|
||||
|
||||
case LOAD_ATTR:
|
||||
w = GETITEM(names, oparg);
|
||||
v = TOP();
|
||||
|
|
188
Python/compile.c
188
Python/compile.c
|
@ -39,6 +39,10 @@ int Py_OptimizeFlag = 0;
|
|||
#define DEFAULT_CODE_SIZE 128
|
||||
#define DEFAULT_LNOTAB_SIZE 16
|
||||
|
||||
#define COMP_GENEXP 0
|
||||
#define COMP_SETCOMP 1
|
||||
#define COMP_DICTCOMP 2
|
||||
|
||||
struct instr {
|
||||
unsigned i_jabs : 1;
|
||||
unsigned i_jrel : 1;
|
||||
|
@ -674,9 +678,13 @@ opcode_stack_effect(int opcode, int oparg)
|
|||
case UNARY_INVERT:
|
||||
return 0;
|
||||
|
||||
case SET_ADD:
|
||||
case LIST_APPEND:
|
||||
return -1;
|
||||
|
||||
case MAP_ADD:
|
||||
return -2;
|
||||
|
||||
case BINARY_POWER:
|
||||
case BINARY_MULTIPLY:
|
||||
case BINARY_DIVIDE:
|
||||
|
@ -2639,32 +2647,43 @@ compiler_listcomp(struct compiler *c, expr_ty e)
|
|||
e->v.ListComp.elt);
|
||||
}
|
||||
|
||||
/* Dict and set comprehensions and generator expressions work by creating a
|
||||
nested function to perform the actual iteration. This means that the
|
||||
iteration variables don't leak into the current scope.
|
||||
The defined function is called immediately following its definition, with the
|
||||
result of that call being the result of the expression.
|
||||
The LC/SC version returns the populated container, while the GE version is
|
||||
flagged in symtable.c as a generator, so it returns the generator object
|
||||
when the function is called.
|
||||
This code *knows* that the loop cannot contain break, continue, or return,
|
||||
so it cheats and skips the SETUP_LOOP/POP_BLOCK steps used in normal loops.
|
||||
|
||||
Possible cleanups:
|
||||
- iterate over the generator sequence instead of using recursion
|
||||
*/
|
||||
|
||||
static int
|
||||
compiler_genexp_generator(struct compiler *c,
|
||||
compiler_comprehension_generator(struct compiler *c,
|
||||
asdl_seq *generators, int gen_index,
|
||||
expr_ty elt)
|
||||
expr_ty elt, expr_ty val, int type)
|
||||
{
|
||||
/* generate code for the iterator, then each of the ifs,
|
||||
and then write to the element */
|
||||
|
||||
comprehension_ty ge;
|
||||
basicblock *start, *anchor, *skip, *if_cleanup, *end;
|
||||
comprehension_ty gen;
|
||||
basicblock *start, *anchor, *skip, *if_cleanup;
|
||||
int i, n;
|
||||
|
||||
start = compiler_new_block(c);
|
||||
skip = compiler_new_block(c);
|
||||
if_cleanup = compiler_new_block(c);
|
||||
anchor = compiler_new_block(c);
|
||||
end = compiler_new_block(c);
|
||||
|
||||
if (start == NULL || skip == NULL || if_cleanup == NULL ||
|
||||
anchor == NULL || end == NULL)
|
||||
anchor == NULL)
|
||||
return 0;
|
||||
|
||||
ge = (comprehension_ty)asdl_seq_GET(generators, gen_index);
|
||||
ADDOP_JREL(c, SETUP_LOOP, end);
|
||||
if (!compiler_push_fblock(c, LOOP, start))
|
||||
return 0;
|
||||
gen = (comprehension_ty)asdl_seq_GET(generators, gen_index);
|
||||
|
||||
if (gen_index == 0) {
|
||||
/* Receive outermost iter as an implicit argument */
|
||||
|
@ -2673,77 +2692,164 @@ compiler_genexp_generator(struct compiler *c,
|
|||
}
|
||||
else {
|
||||
/* Sub-iter - calculate on the fly */
|
||||
VISIT(c, expr, ge->iter);
|
||||
VISIT(c, expr, gen->iter);
|
||||
ADDOP(c, GET_ITER);
|
||||
}
|
||||
compiler_use_next_block(c, start);
|
||||
ADDOP_JREL(c, FOR_ITER, anchor);
|
||||
NEXT_BLOCK(c);
|
||||
VISIT(c, expr, ge->target);
|
||||
VISIT(c, expr, gen->target);
|
||||
|
||||
/* XXX this needs to be cleaned up...a lot! */
|
||||
n = asdl_seq_LEN(ge->ifs);
|
||||
n = asdl_seq_LEN(gen->ifs);
|
||||
for (i = 0; i < n; i++) {
|
||||
expr_ty e = (expr_ty)asdl_seq_GET(ge->ifs, i);
|
||||
expr_ty e = (expr_ty)asdl_seq_GET(gen->ifs, i);
|
||||
VISIT(c, expr, e);
|
||||
ADDOP_JABS(c, POP_JUMP_IF_FALSE, if_cleanup);
|
||||
NEXT_BLOCK(c);
|
||||
}
|
||||
|
||||
if (++gen_index < asdl_seq_LEN(generators))
|
||||
if (!compiler_genexp_generator(c, generators, gen_index, elt))
|
||||
if (!compiler_comprehension_generator(c,
|
||||
generators, gen_index,
|
||||
elt, val, type))
|
||||
return 0;
|
||||
|
||||
/* only append after the last 'for' generator */
|
||||
/* only append after the last for generator */
|
||||
if (gen_index >= asdl_seq_LEN(generators)) {
|
||||
/* comprehension specific code */
|
||||
switch (type) {
|
||||
case COMP_GENEXP:
|
||||
VISIT(c, expr, elt);
|
||||
ADDOP(c, YIELD_VALUE);
|
||||
ADDOP(c, POP_TOP);
|
||||
break;
|
||||
case COMP_SETCOMP:
|
||||
VISIT(c, expr, elt);
|
||||
ADDOP_I(c, SET_ADD, gen_index + 1);
|
||||
break;
|
||||
case COMP_DICTCOMP:
|
||||
/* With 'd[k] = v', v is evaluated before k, so we do
|
||||
the same. */
|
||||
VISIT(c, expr, val);
|
||||
VISIT(c, expr, elt);
|
||||
ADDOP_I(c, MAP_ADD, gen_index + 1);
|
||||
break;
|
||||
default:
|
||||
return 0;
|
||||
}
|
||||
|
||||
compiler_use_next_block(c, skip);
|
||||
}
|
||||
compiler_use_next_block(c, if_cleanup);
|
||||
ADDOP_JABS(c, JUMP_ABSOLUTE, start);
|
||||
compiler_use_next_block(c, anchor);
|
||||
ADDOP(c, POP_BLOCK);
|
||||
compiler_pop_fblock(c, LOOP, start);
|
||||
compiler_use_next_block(c, end);
|
||||
|
||||
return 1;
|
||||
}
|
||||
|
||||
static int
|
||||
compiler_comprehension(struct compiler *c, expr_ty e, int type, identifier name,
|
||||
asdl_seq *generators, expr_ty elt, expr_ty val)
|
||||
{
|
||||
PyCodeObject *co = NULL;
|
||||
expr_ty outermost_iter;
|
||||
|
||||
outermost_iter = ((comprehension_ty)
|
||||
asdl_seq_GET(generators, 0))->iter;
|
||||
|
||||
if (!compiler_enter_scope(c, name, (void *)e, e->lineno))
|
||||
goto error;
|
||||
|
||||
if (type != COMP_GENEXP) {
|
||||
int op;
|
||||
switch (type) {
|
||||
case COMP_SETCOMP:
|
||||
op = BUILD_SET;
|
||||
break;
|
||||
case COMP_DICTCOMP:
|
||||
op = BUILD_MAP;
|
||||
break;
|
||||
default:
|
||||
PyErr_Format(PyExc_SystemError,
|
||||
"unknown comprehension type %d", type);
|
||||
goto error_in_scope;
|
||||
}
|
||||
|
||||
ADDOP_I(c, op, 0);
|
||||
}
|
||||
|
||||
if (!compiler_comprehension_generator(c, generators, 0, elt,
|
||||
val, type))
|
||||
goto error_in_scope;
|
||||
|
||||
if (type != COMP_GENEXP) {
|
||||
ADDOP(c, RETURN_VALUE);
|
||||
}
|
||||
|
||||
co = assemble(c, 1);
|
||||
compiler_exit_scope(c);
|
||||
if (co == NULL)
|
||||
goto error;
|
||||
|
||||
if (!compiler_make_closure(c, co, 0))
|
||||
goto error;
|
||||
Py_DECREF(co);
|
||||
|
||||
VISIT(c, expr, outermost_iter);
|
||||
ADDOP(c, GET_ITER);
|
||||
ADDOP_I(c, CALL_FUNCTION, 1);
|
||||
return 1;
|
||||
error_in_scope:
|
||||
compiler_exit_scope(c);
|
||||
error:
|
||||
Py_XDECREF(co);
|
||||
return 0;
|
||||
}
|
||||
|
||||
static int
|
||||
compiler_genexp(struct compiler *c, expr_ty e)
|
||||
{
|
||||
static identifier name;
|
||||
PyCodeObject *co;
|
||||
expr_ty outermost_iter = ((comprehension_ty)
|
||||
(asdl_seq_GET(e->v.GeneratorExp.generators,
|
||||
0)))->iter;
|
||||
|
||||
if (!name) {
|
||||
name = PyString_FromString("<genexpr>");
|
||||
if (!name)
|
||||
return 0;
|
||||
}
|
||||
assert(e->kind == GeneratorExp_kind);
|
||||
return compiler_comprehension(c, e, COMP_GENEXP, name,
|
||||
e->v.GeneratorExp.generators,
|
||||
e->v.GeneratorExp.elt, NULL);
|
||||
}
|
||||
|
||||
if (!compiler_enter_scope(c, name, (void *)e, e->lineno))
|
||||
static int
|
||||
compiler_setcomp(struct compiler *c, expr_ty e)
|
||||
{
|
||||
static identifier name;
|
||||
if (!name) {
|
||||
name = PyString_FromString("<setcomp>");
|
||||
if (!name)
|
||||
return 0;
|
||||
compiler_genexp_generator(c, e->v.GeneratorExp.generators, 0,
|
||||
e->v.GeneratorExp.elt);
|
||||
co = assemble(c, 1);
|
||||
compiler_exit_scope(c);
|
||||
if (co == NULL)
|
||||
}
|
||||
assert(e->kind == SetComp_kind);
|
||||
return compiler_comprehension(c, e, COMP_SETCOMP, name,
|
||||
e->v.SetComp.generators,
|
||||
e->v.SetComp.elt, NULL);
|
||||
}
|
||||
|
||||
static int
|
||||
compiler_dictcomp(struct compiler *c, expr_ty e)
|
||||
{
|
||||
static identifier name;
|
||||
if (!name) {
|
||||
name = PyString_FromString("<dictcomp>");
|
||||
if (!name)
|
||||
return 0;
|
||||
|
||||
compiler_make_closure(c, co, 0);
|
||||
Py_DECREF(co);
|
||||
|
||||
VISIT(c, expr, outermost_iter);
|
||||
ADDOP(c, GET_ITER);
|
||||
ADDOP_I(c, CALL_FUNCTION, 1);
|
||||
|
||||
return 1;
|
||||
}
|
||||
assert(e->kind == DictComp_kind);
|
||||
return compiler_comprehension(c, e, COMP_DICTCOMP, name,
|
||||
e->v.DictComp.generators,
|
||||
e->v.DictComp.key, e->v.DictComp.value);
|
||||
}
|
||||
|
||||
static int
|
||||
|
@ -2902,6 +3008,10 @@ compiler_visit_expr(struct compiler *c, expr_ty e)
|
|||
break;
|
||||
case ListComp_kind:
|
||||
return compiler_listcomp(c, e);
|
||||
case SetComp_kind:
|
||||
return compiler_setcomp(c, e);
|
||||
case DictComp_kind:
|
||||
return compiler_dictcomp(c, e);
|
||||
case GeneratorExp_kind:
|
||||
return compiler_genexp(c, e);
|
||||
case Yield_kind:
|
||||
|
|
|
@ -1550,42 +1550,57 @@ static state states_72[5] = {
|
|||
static arc arcs_73_0[1] = {
|
||||
{28, 1},
|
||||
};
|
||||
static arc arcs_73_1[3] = {
|
||||
static arc arcs_73_1[4] = {
|
||||
{23, 2},
|
||||
{29, 3},
|
||||
{157, 3},
|
||||
{29, 4},
|
||||
{0, 1},
|
||||
};
|
||||
static arc arcs_73_2[1] = {
|
||||
{28, 4},
|
||||
};
|
||||
static arc arcs_73_3[2] = {
|
||||
{28, 5},
|
||||
};
|
||||
static arc arcs_73_3[1] = {
|
||||
{0, 3},
|
||||
};
|
||||
static arc arcs_73_4[2] = {
|
||||
{29, 6},
|
||||
{28, 6},
|
||||
{0, 4},
|
||||
};
|
||||
static arc arcs_73_5[2] = {
|
||||
{29, 3},
|
||||
static arc arcs_73_5[3] = {
|
||||
{157, 3},
|
||||
{29, 7},
|
||||
{0, 5},
|
||||
};
|
||||
static arc arcs_73_6[2] = {
|
||||
{28, 7},
|
||||
{29, 4},
|
||||
{0, 6},
|
||||
};
|
||||
static arc arcs_73_7[1] = {
|
||||
{23, 2},
|
||||
static arc arcs_73_7[2] = {
|
||||
{28, 8},
|
||||
{0, 7},
|
||||
};
|
||||
static state states_73[8] = {
|
||||
static arc arcs_73_8[1] = {
|
||||
{23, 9},
|
||||
};
|
||||
static arc arcs_73_9[1] = {
|
||||
{28, 10},
|
||||
};
|
||||
static arc arcs_73_10[2] = {
|
||||
{29, 7},
|
||||
{0, 10},
|
||||
};
|
||||
static state states_73[11] = {
|
||||
{1, arcs_73_0},
|
||||
{3, arcs_73_1},
|
||||
{4, arcs_73_1},
|
||||
{1, arcs_73_2},
|
||||
{2, arcs_73_3},
|
||||
{1, arcs_73_3},
|
||||
{2, arcs_73_4},
|
||||
{2, arcs_73_5},
|
||||
{3, arcs_73_5},
|
||||
{2, arcs_73_6},
|
||||
{1, arcs_73_7},
|
||||
{2, arcs_73_7},
|
||||
{1, arcs_73_8},
|
||||
{1, arcs_73_9},
|
||||
{2, arcs_73_10},
|
||||
};
|
||||
static arc arcs_74_0[1] = {
|
||||
{162, 1},
|
||||
|
@ -1964,7 +1979,7 @@ static dfa dfas[86] = {
|
|||
"\000\040\040\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\044\015\000\000"},
|
||||
{319, "listmaker", 0, 5, states_63,
|
||||
"\000\040\040\000\000\000\000\000\000\000\000\000\000\040\010\000\200\041\044\015\000\000"},
|
||||
{320, "testlist_gexp", 0, 5, states_64,
|
||||
{320, "testlist_comp", 0, 5, states_64,
|
||||
"\000\040\040\000\000\000\000\000\000\000\000\000\000\040\010\000\200\041\044\015\000\000"},
|
||||
{321, "lambdef", 0, 5, states_65,
|
||||
"\000\000\000\000\000\000\000\000\000\000\000\000\000\040\000\000\000\000\000\000\000\000"},
|
||||
|
@ -1982,7 +1997,7 @@ static dfa dfas[86] = {
|
|||
"\000\040\040\000\000\000\000\000\000\000\000\000\000\040\010\000\200\041\044\015\000\000"},
|
||||
{328, "dictmaker", 0, 5, states_72,
|
||||
"\000\040\040\000\000\000\000\000\000\000\000\000\000\040\010\000\200\041\044\015\000\000"},
|
||||
{329, "dictorsetmaker", 0, 8, states_73,
|
||||
{329, "dictorsetmaker", 0, 11, states_73,
|
||||
"\000\040\040\000\000\000\000\000\000\000\000\000\000\040\010\000\200\041\044\015\000\000"},
|
||||
{330, "classdef", 0, 8, states_74,
|
||||
"\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\004\000"},
|
||||
|
@ -1996,11 +2011,11 @@ static dfa dfas[86] = {
|
|||
"\000\000\000\000\000\000\000\000\000\000\000\000\001\000\000\000\000\000\000\000\000\000"},
|
||||
{335, "list_if", 0, 4, states_79,
|
||||
"\000\000\000\000\000\000\000\000\000\000\000\020\000\000\000\000\000\000\000\000\000\000"},
|
||||
{336, "gen_iter", 0, 2, states_80,
|
||||
{336, "comp_iter", 0, 2, states_80,
|
||||
"\000\000\000\000\000\000\000\000\000\000\000\020\001\000\000\000\000\000\000\000\000\000"},
|
||||
{337, "gen_for", 0, 6, states_81,
|
||||
{337, "comp_for", 0, 6, states_81,
|
||||
"\000\000\000\000\000\000\000\000\000\000\000\000\001\000\000\000\000\000\000\000\000\000"},
|
||||
{338, "gen_if", 0, 4, states_82,
|
||||
{338, "comp_if", 0, 4, states_82,
|
||||
"\000\000\000\000\000\000\000\000\000\000\000\020\000\000\000\000\000\000\000\000\000\000"},
|
||||
{339, "testlist1", 0, 2, states_83,
|
||||
"\000\040\040\000\000\000\000\000\000\000\000\000\000\040\010\000\200\041\044\015\000\000"},
|
||||
|
|
|
@ -76,9 +76,10 @@ typedef unsigned short mode_t;
|
|||
introduce POP_JUMP_IF_FALSE and POP_JUMP_IF_TRUE)
|
||||
Python 2.7a0 62191 (introduce SETUP_WITH)
|
||||
Python 2.7a0 62201 (introduce BUILD_SET)
|
||||
Python 2.7a0 62211 (introduce MAP_ADD and SET_ADD)
|
||||
.
|
||||
*/
|
||||
#define MAGIC (62201 | ((long)'\r'<<16) | ((long)'\n'<<24))
|
||||
#define MAGIC (62211 | ((long)'\r'<<16) | ((long)'\n'<<24))
|
||||
|
||||
/* Magic word as global; note that _PyImport_Init() can change the
|
||||
value of this global to accommodate for alterations of how the
|
||||
|
|
|
@ -166,6 +166,8 @@ static int symtable_exit_block(struct symtable *st, void *ast);
|
|||
static int symtable_visit_stmt(struct symtable *st, stmt_ty s);
|
||||
static int symtable_visit_expr(struct symtable *st, expr_ty s);
|
||||
static int symtable_visit_genexp(struct symtable *st, expr_ty s);
|
||||
static int symtable_visit_setcomp(struct symtable *st, expr_ty e);
|
||||
static int symtable_visit_dictcomp(struct symtable *st, expr_ty e);
|
||||
static int symtable_visit_arguments(struct symtable *st, arguments_ty);
|
||||
static int symtable_visit_excepthandler(struct symtable *st, excepthandler_ty);
|
||||
static int symtable_visit_alias(struct symtable *st, alias_ty);
|
||||
|
@ -177,7 +179,8 @@ static int symtable_visit_params_nested(struct symtable *st, asdl_seq *args);
|
|||
static int symtable_implicit_arg(struct symtable *st, int pos);
|
||||
|
||||
|
||||
static identifier top = NULL, lambda = NULL, genexpr = NULL;
|
||||
static identifier top = NULL, lambda = NULL, genexpr = NULL, setcomp = NULL,
|
||||
dictcomp = NULL;
|
||||
|
||||
#define GET_IDENTIFIER(VAR) \
|
||||
((VAR) ? (VAR) : ((VAR) = PyString_InternFromString(# VAR)))
|
||||
|
@ -1222,6 +1225,14 @@ symtable_visit_expr(struct symtable *st, expr_ty e)
|
|||
if (!symtable_visit_genexp(st, e))
|
||||
return 0;
|
||||
break;
|
||||
case SetComp_kind:
|
||||
if (!symtable_visit_setcomp(st, e))
|
||||
return 0;
|
||||
break;
|
||||
case DictComp_kind:
|
||||
if (!symtable_visit_dictcomp(st, e))
|
||||
return 0;
|
||||
break;
|
||||
case Yield_kind:
|
||||
if (e->v.Yield.value)
|
||||
VISIT(st, expr, e->v.Yield.value);
|
||||
|
@ -1463,27 +1474,80 @@ symtable_visit_slice(struct symtable *st, slice_ty s)
|
|||
}
|
||||
|
||||
static int
|
||||
symtable_visit_genexp(struct symtable *st, expr_ty e)
|
||||
symtable_new_tmpname(struct symtable *st)
|
||||
{
|
||||
char tmpname[256];
|
||||
identifier tmp;
|
||||
|
||||
PyOS_snprintf(tmpname, sizeof(tmpname), "_[%d]",
|
||||
++st->st_cur->ste_tmpname);
|
||||
tmp = PyString_InternFromString(tmpname);
|
||||
if (!tmp)
|
||||
return 0;
|
||||
if (!symtable_add_def(st, tmp, DEF_LOCAL))
|
||||
return 0;
|
||||
Py_DECREF(tmp);
|
||||
return 1;
|
||||
}
|
||||
|
||||
static int
|
||||
symtable_handle_comprehension(struct symtable *st, expr_ty e,
|
||||
identifier scope_name, asdl_seq *generators,
|
||||
expr_ty elt, expr_ty value)
|
||||
{
|
||||
int is_generator = (e->kind == GeneratorExp_kind);
|
||||
int needs_tmp = !is_generator;
|
||||
comprehension_ty outermost = ((comprehension_ty)
|
||||
(asdl_seq_GET(e->v.GeneratorExp.generators, 0)));
|
||||
asdl_seq_GET(generators, 0));
|
||||
/* Outermost iterator is evaluated in current scope */
|
||||
VISIT(st, expr, outermost->iter);
|
||||
/* Create generator scope for the rest */
|
||||
if (!GET_IDENTIFIER(genexpr) ||
|
||||
!symtable_enter_block(st, genexpr, FunctionBlock, (void *)e, e->lineno)) {
|
||||
/* Create comprehension scope for the rest */
|
||||
if (!scope_name ||
|
||||
!symtable_enter_block(st, scope_name, FunctionBlock, (void *)e, 0)) {
|
||||
return 0;
|
||||
}
|
||||
st->st_cur->ste_generator = 1;
|
||||
st->st_cur->ste_generator = is_generator;
|
||||
/* Outermost iter is received as an argument */
|
||||
if (!symtable_implicit_arg(st, 0)) {
|
||||
symtable_exit_block(st, (void *)e);
|
||||
return 0;
|
||||
}
|
||||
/* Allocate temporary name if needed */
|
||||
if (needs_tmp && !symtable_new_tmpname(st)) {
|
||||
symtable_exit_block(st, (void *)e);
|
||||
return 0;
|
||||
}
|
||||
VISIT_IN_BLOCK(st, expr, outermost->target, (void*)e);
|
||||
VISIT_SEQ_IN_BLOCK(st, expr, outermost->ifs, (void*)e);
|
||||
VISIT_SEQ_TAIL_IN_BLOCK(st, comprehension,
|
||||
e->v.GeneratorExp.generators, 1, (void*)e);
|
||||
VISIT_IN_BLOCK(st, expr, e->v.GeneratorExp.elt, (void*)e);
|
||||
generators, 1, (void*)e);
|
||||
if (value)
|
||||
VISIT_IN_BLOCK(st, expr, value, (void*)e);
|
||||
VISIT_IN_BLOCK(st, expr, elt, (void*)e);
|
||||
return symtable_exit_block(st, (void *)e);
|
||||
}
|
||||
|
||||
static int
|
||||
symtable_visit_genexp(struct symtable *st, expr_ty e)
|
||||
{
|
||||
return symtable_handle_comprehension(st, e, GET_IDENTIFIER(genexpr),
|
||||
e->v.GeneratorExp.generators,
|
||||
e->v.GeneratorExp.elt, NULL);
|
||||
}
|
||||
|
||||
static int
|
||||
symtable_visit_setcomp(struct symtable *st, expr_ty e)
|
||||
{
|
||||
return symtable_handle_comprehension(st, e, GET_IDENTIFIER(setcomp),
|
||||
e->v.SetComp.generators,
|
||||
e->v.SetComp.elt, NULL);
|
||||
}
|
||||
|
||||
static int
|
||||
symtable_visit_dictcomp(struct symtable *st, expr_ty e)
|
||||
{
|
||||
return symtable_handle_comprehension(st, e, GET_IDENTIFIER(dictcomp),
|
||||
e->v.DictComp.generators,
|
||||
e->v.DictComp.key,
|
||||
e->v.DictComp.value);
|
||||
}
|
||||
|
|
Loading…
Reference in New Issue