I had the code below in Python 3.2 and I wanted to run it in Python 2.7. I did convert it (I have put the code of missing_elements in both versions), but I am not sure if that is the most efficient way to do it. Basically, what happens if there are two yield from calls like below, one for the upper half and one for the lower half, in the missing_elements function? Are the entries from the two halves (upper and lower) appended to each other in one list, so that the parent recursive call with yield from can use both halves together?
def missing_elements(L, start, end):  # Python 3.2
    if end - start <= 1:
        if L[end] - L[start] > 1:
            yield from range(L[start] + 1, L[end])
        return

    index = start + (end - start) // 2

    # is the lower half consecutive?
    consecutive_low = L[index] == L[start] + (index - start)
    if not consecutive_low:
        yield from missing_elements(L, start, index)

    # is the upper part consecutive?
    consecutive_high = L[index] == L[end] - (end - index)
    if not consecutive_high:
        yield from missing_elements(L, index, end)

def main():
    L = [10, 11, 13, 14, 15, 16, 17, 18, 20]
    print(list(missing_elements(L, 0, len(L)-1)))
    L = range(10, 21)
    print(list(missing_elements(L, 0, len(L)-1)))
def missing_elements(L, start, end):  # Python 2.7
    return_list = []
    if end - start <= 1:
        if L[end] - L[start] > 1:
            return range(L[start] + 1, L[end])

    index = start + (end - start) // 2

    # is the lower half consecutive?
    consecutive_low = L[index] == L[start] + (index - start)
    if not consecutive_low:
        return_list.append(missing_elements(L, start, index))

    # is the upper part consecutive?
    consecutive_high = L[index] == L[end] - (end - index)
    if not consecutive_high:
        return_list.append(missing_elements(L, index, end))
    return return_list
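To see what this Python 2.7 attempt actually returns, here is a sketch of a Python 3 port of it (with `list(range(...))` standing in for Python 2's list-returning `range`), plus a small flatten helper to show that the appends produce nested lists rather than the flat sequence of missing numbers:

```python
def missing_elements_27(L, start, end):
    # direct port of the Python 2.7 attempt above
    return_list = []
    if end - start <= 1:
        if L[end] - L[start] > 1:
            return list(range(L[start] + 1, L[end]))

    index = start + (end - start) // 2

    # is the lower half consecutive?
    if L[index] != L[start] + (index - start):
        return_list.append(missing_elements_27(L, start, index))

    # is the upper part consecutive?
    if L[index] != L[end] - (end - index):
        return_list.append(missing_elements_27(L, index, end))
    return return_list

def flatten(x):
    # recursively flatten arbitrarily nested lists
    if isinstance(x, list):
        for item in x:
            for leaf in flatten(item):
                yield leaf
    else:
        yield x

L = [10, 11, 13, 14, 15, 16, 17, 18, 20]
result = missing_elements_27(L, 0, len(L) - 1)
assert result != [12, 19]                 # nested, not flat
assert list(flatten(result)) == [12, 19]  # the values are in there, though
```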
6 Answers
#1
63
If you don't use the results of your yields,* you can always turn this:

yield from foo

… into this:

for bar in foo:
    yield bar

There might be a performance cost,** but there is never a semantic difference.
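As a minimal sketch of that mechanical translation (with a hypothetical foo), both versions produce identical output:

```python
def foo():
    yield 1
    yield 2

# Python 3 version
def py3_gen():
    yield from foo()

# Python 2-compatible translation
def py2_gen():
    for bar in foo():
        yield bar

assert list(py3_gen()) == list(py2_gen()) == [1, 2]
```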
Are the entries from the two halves (upper and lower) appended to each other in one list so that the parent recursion function with the yield from call and use both the halves together?
No! The whole point of iterators and generators is that you don't build actual lists and append them together.
But the effect is similar: you just yield from one, then yield from another.
If you think of the upper half and the lower half as "lazy lists", then yes, you can think of this as a "lazy append" that creates a larger "lazy list". And if you call list on the result of the parent function, you of course will get an actual list that's equivalent to appending together the two lists you would have gotten if you'd done yield list(…) instead of yield from ….

But I think it's easier to think of it the other way around: what it does is exactly the same thing the for loops do.
If you saved the two iterators into variables, and looped over itertools.chain(upper, lower), that would be the same as looping over the first and then looping over the second, right? No difference here. In fact, you could implement chain as just:

def chain(*iterables):
    for it in iterables:
        yield from it
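Under Python 2 the same idea needs the for-loop translation; a minimal sketch (chain2 is a hypothetical name, to avoid shadowing itertools.chain):

```python
def chain2(*iterables):
    # Python 2-compatible chain: loop and re-yield instead of `yield from`
    for it in iterables:
        for item in it:
            yield item

upper = iter([12])
lower = iter([19])
assert list(chain2(upper, lower)) == [12, 19]
```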
* Not the values the generator yields to its caller, but the values of the yield expressions themselves, within the generator (which come from the caller using the send method), as described in PEP 342. You're not using these in your examples, and I'm willing to bet you're not in your real code. But coroutine-style code often uses the value of a yield from expression (see PEP 3156 for examples). Such code usually depends on other features of Python 3.3 generators, in particular the new StopIteration.value from the same PEP 380 that introduced yield from, so it will have to be rewritten. But if not, the PEP also shows you the complete, horridly messy equivalent, and you can of course pare down the parts you don't care about. And if you don't use the value of the expression, it pares down to the two lines above.
** Not a huge one, and there's nothing you can do about it short of using Python 3.3 or completely restructuring your code. It's exactly the same case as translating list comprehensions to Python 1.5 loops, or any other case when there's a new optimization in version X.Y and you need to use an older version.
#2
4
I just came across this issue and my usage was a bit more difficult, since I needed the return value of yield from:
result = yield from other_gen()
This cannot be represented as a simple for loop, but can be reproduced with this:
_iter = iter(other_gen())
try:
    while True:  # broken by StopIteration
        yield next(_iter)
except StopIteration as e:
    if e.args:
        result = e.args[0]
    else:
        result = None
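A sketch of the pattern end to end, with a hypothetical other_gen (run under Python 3, where a generator's `return value` puts the value in the StopIteration's args; under Python 2 you would raise StopIteration(value) manually instead):

```python
def other_gen():
    yield 1
    yield 2
    return "finished"  # Python 3 syntax for a generator return value

def wrapper():
    # the answer's replacement for: result = yield from other_gen()
    _iter = iter(other_gen())
    try:
        while True:  # broken by StopIteration
            yield next(_iter)
    except StopIteration as e:
        result = e.args[0] if e.args else None
    yield ("return value was", result)

assert list(wrapper()) == [1, 2, ("return value was", "finished")]
```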
Hopefully this will help people who come across the same problem. :)
#3
3
I think I found a way to emulate the Python 3.x yield from construct in Python 2.x. It's not efficient and it is a little hacky, but here it is:
import types

def inline_generators(fn):
    def inline(value):
        if isinstance(value, InlineGenerator):
            for x in value.wrapped:
                for y in inline(x):
                    yield y
        else:
            yield value
    def wrapped(*args, **kwargs):
        result = fn(*args, **kwargs)
        if isinstance(result, types.GeneratorType):
            result = inline(_from(result))
        return result
    return wrapped

class InlineGenerator(object):
    def __init__(self, wrapped):
        self.wrapped = wrapped

def _from(value):
    assert isinstance(value, types.GeneratorType)
    return InlineGenerator(value)
Usage:
@inline_generators
def outer(x):
    def inner_inner(x):
        for x in range(1, x + 1):
            yield x
    def inner(x):
        for x in range(1, x + 1):
            yield _from(inner_inner(x))
    for x in range(1, x + 1):
        yield _from(inner(x))

for x in outer(3):
    print x,
Produces output:
1 1 1 2 1 1 2 1 2 3
Maybe someone finds this helpful.
Known issues: Lacks support for send() and various corner cases described in PEP 380. These could be added and I will edit my entry once I get it working.
#4
2
Replace them with for-loops:

yield from range(L[start] + 1, L[end])

==>

for i in range(L[start] + 1, L[end]):
    yield i

The same goes for the recursive calls:

yield from missing_elements(L, index, end)

==>

for el in missing_elements(L, index, end):
    yield el
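Applying both replacements to the question's function gives a Python 2.7-compatible version; a sketch, checked against the question's sample list:

```python
def missing_elements(L, start, end):
    # Python 2.7-compatible: each `yield from` replaced by a re-yielding loop
    if end - start <= 1:
        if L[end] - L[start] > 1:
            for i in range(L[start] + 1, L[end]):
                yield i
        return

    index = start + (end - start) // 2

    # is the lower half consecutive?
    if L[index] != L[start] + (index - start):
        for el in missing_elements(L, start, index):
            yield el

    # is the upper part consecutive?
    if L[index] != L[end] - (end - index):
        for el in missing_elements(L, index, end):
            yield el

L = [10, 11, 13, 14, 15, 16, 17, 18, 20]
assert list(missing_elements(L, 0, len(L) - 1)) == [12, 19]
```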
#5
1
What about using the definition from PEP 380 in order to construct a Python 2 syntax version?

The statement:

RESULT = yield from EXPR

is semantically equivalent to:
_i = iter(EXPR)
try:
    _y = next(_i)
except StopIteration as _e:
    _r = _e.value
else:
    while 1:
        try:
            _s = yield _y
        except GeneratorExit as _e:
            try:
                _m = _i.close
            except AttributeError:
                pass
            else:
                _m()
            raise _e
        except BaseException as _e:
            _x = sys.exc_info()
            try:
                _m = _i.throw
            except AttributeError:
                raise _e
            else:
                try:
                    _y = _m(*_x)
                except StopIteration as _e:
                    _r = _e.value
                    break
        else:
            try:
                if _s is None:
                    _y = next(_i)
                else:
                    _y = _i.send(_s)
            except StopIteration as _e:
                _r = _e.value
                break
RESULT = _r
In a generator, the statement:
return value
is semantically equivalent to
raise StopIteration(value)
except that, as currently, the exception cannot be caught by except clauses within the returning generator.
The StopIteration exception behaves as though defined thusly:
class StopIteration(Exception):
    def __init__(self, *args):
        if len(args) > 0:
            self.value = args[0]
        else:
            self.value = None
        Exception.__init__(self, *args)
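As a sanity check (runnable under Python 3, and deliberately ignoring the throw()/close() branches of the full expansion), a pared-down version of the expansion behaves the same as native yield from for a hypothetical sub():

```python
def sub():
    yield "a"
    yield "b"
    return "ret"

def native():
    result = yield from sub()
    yield ("result", result)

def expanded():
    # simplified PEP 380 expansion: no throw()/close() forwarding
    _i = iter(sub())
    _r = None
    try:
        _y = next(_i)
    except StopIteration as _e:
        _r = _e.value
    else:
        while True:
            _s = yield _y
            try:
                if _s is None:
                    _y = next(_i)
                else:
                    _y = _i.send(_s)
            except StopIteration as _e:
                _r = _e.value
                break
    yield ("result", _r)

assert list(native()) == list(expanded()) == ["a", "b", ("result", "ret")]
```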
#6
0
I've found using resource contexts (using the python-resources module) to be an elegant mechanism for implementing subgenerators in Python 2.7. Conveniently I'd already been using the resource contexts anyway.
If in Python 3.3 you would have:
@resources.register_func
def get_a_thing(type_of_thing):
    if type_of_thing == "A":
        yield from complicated_logic_for_handling_a()
    else:
        yield from complicated_logic_for_handling_b()

def complicated_logic_for_handling_a():
    a = expensive_setup_for_a()
    yield a
    expensive_tear_down_for_a()

def complicated_logic_for_handling_b():
    b = expensive_setup_for_b()
    yield b
    expensive_tear_down_for_b()
In Python 2.7 you would have:
@resources.register_func
def get_a_thing(type_of_thing):
    if type_of_thing == "A":
        with resources.complicated_logic_for_handling_a_ctx() as a:
            yield a
    else:
        with resources.complicated_logic_for_handling_b_ctx() as b:
            yield b

@resources.register_func
def complicated_logic_for_handling_a():
    a = expensive_setup_for_a()
    yield a
    expensive_tear_down_for_a()

@resources.register_func
def complicated_logic_for_handling_b():
    b = expensive_setup_for_b()
    yield b
    expensive_tear_down_for_b()
Note how the complicated-logic operations only require the registration as a resource.
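For comparison, the standard library's contextlib.contextmanager gives a similar setup/teardown shape without a third-party module, and it works unchanged on Python 2.7. A hedged sketch, with cheap stand-ins for the expensive setup and teardown:

```python
from contextlib import contextmanager

log = []

@contextmanager
def handling_a_ctx():
    a = "thing A"  # stand-in for expensive_setup_for_a()
    try:
        yield a
    finally:
        log.append("tore down A")  # stand-in for expensive_tear_down_for_a()

def get_a_thing(type_of_thing):
    if type_of_thing == "A":
        with handling_a_ctx() as a:
            yield a

assert list(get_a_thing("A")) == ["thing A"]
assert log == ["tore down A"]  # teardown ran when the generator finished
```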