Question: How to relinquish control to the event loop in Python 3.5
In Python < 3.5, you could do a yield or yield None in a coroutine to give control to the event loop. In Python 3.5, it is invalid to yield in an async def coroutine. So, what's the proper way of relinquishing control to the event loop?
One pattern is:
class AsyncNone(object):
    def __await__(self):
        # a single bare yield suspends the coroutine for one loop iteration
        yield

async def cor():
    for _ in range(100):
        await AsyncNone()
But this seems very hacky.
Oh, that's unfortunate indeed. @1st1 Any thoughts? I guess the most straightforward pattern would be await asyncio.sleep(0).
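For clarity, a minimal sketch of that pattern (the coroutine names here are only illustrative): awaiting asyncio.sleep(0) between chunks of work lets other ready tasks run for one iteration of the loop, so the two coroutines below interleave.

import asyncio

async def busy_work():
    for i in range(3):
        print("busy_work chunk", i)
        await asyncio.sleep(0)   # give other scheduled tasks a turn

async def ticker():
    for i in range(3):
        print("tick", i)
        await asyncio.sleep(0)

loop = asyncio.get_event_loop()
loop.run_until_complete(asyncio.gather(busy_work(), ticker()))
loop.close()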
I guess we can add asyncio.nop or something like that. I, personally, never needed that.
Could asyncio implement an AsyncNone-like sentinel object, and one could just do: await asyncio.AsyncNone?
or a noop
@1st1 beat me to it
Well, AsyncNone is not less typing than sleep(0), so why would you want that? Also, how common is this use case really? (My intuition is that it's most commonly used by people who don't yet have a lot of asyncio experience.)
Or when asyncio is being buggy #263 ;)
It can also be useful when you are doing some long computation in the same thread and don't want to block the loop. AsyncNone is probably not the best name, though. Something that makes it clear you are giving up control would probably be best. sleep(0) seems cryptic.
Hm, for a long computation you should probably put it in a thread.
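As a reference point, a long blocking computation can be handed to a worker thread with loop.run_in_executor() so the event loop stays responsive. This is only a rough sketch; the crunch() and main() names are made up for the example:

import asyncio

def crunch(n):
    # plain blocking function; runs in a worker thread, not on the event loop
    return sum(i * i for i in range(n))

async def main(loop):
    # offload the computation; the loop keeps serving other tasks meanwhile
    result = await loop.run_in_executor(None, crunch, 10 ** 7)
    print(result)

loop = asyncio.get_event_loop()
loop.run_until_complete(main(loop))
loop.close()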
Hi,

It's unclear to me if it's a bad practice or not to try to help the scheduler.

In eventlet code, it's "common" to put eventlet.sleep(0) to yield control to the next pending task.

In asyncio, you can write "yield from asyncio.sleep(0)" or "await asyncio.sleep(0)". sleep() is quite complex; a simpler version is:

import asyncio

@asyncio.coroutine
def relinquish_cpu():
    fut = asyncio.Future()
    fut.set_result(None)
    yield from fut  # or: await fut

Example of usage:

yield from relinquish_cpu()

Other suggestions for the function name: give_cpu(), give_control(), yield_cpu(), etc. Since I'm not a native English speaker, it's not easy for me to find a short and well-defined name :-/

"noop" (no operation) is not a good name since calling it has a real impact: it yields control to the next pending task.

Note: the pthread library has a pthread_yield() function. There is also the sched_yield() function from <sched.h>.

Victor
Maybe we can optimize sleep(0). I don't like the idea of introducing a new function just to save typing the '0'.
Agree. Please see PR #285
@jashandeep-sohi @Haypo With PR #285 merged, sleep(0) should be pretty fast! Closing the issue.
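For the curious, the optimization is conceptually along the lines of the sketch below: when the requested delay is zero, skip scheduling a timer and just yield once back to the loop. This is a simplified illustration, not the actual code from PR #285, and fast_sleep is a hypothetical name.

import asyncio

@asyncio.coroutine
def fast_sleep(delay, result=None, *, loop=None):
    # simplified sketch: a zero delay becomes a single bare yield,
    # which hands control back to the event loop for one iteration
    if delay == 0:
        yield
        return result
    # non-zero delays fall back to the regular, timer-based sleep
    return (yield from asyncio.sleep(delay, result, loop=loop))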
Hum, does it mean that "yield from asyncio.sleep(0[, loop=loop])" is the official way to "yield"? If yes, it's fine for me.
I think yes. If more people complain and ask for a dedicated function, we can add it later, right?
IMO sleep(0) is the perfect spelling for this, since everybody already knows what it means. :-)
@1st1 Yeah, sleep(0) should be fine. I don't think too many people need this, but a quick mention in the docs might help the odd soul.
Good reminder. The docs currently have yield for this purpose, but that doesn't work inside an async def (which is how this got started).
Perhaps this should be documented better. I just spent about 20 minutes looking for how to do a cooperative-multitasking "yield" before finally landing here. It's not a common use case, sure, but there are situations where it might be helpful.
For example, I have a function that is usually fast, so it's OK to run it in one go. But under some circumstances it can take longer to complete and should give background event processing an opportunity to run once in a while (e.g. every 10000 processed items). This is basically cooperative multitasking. It may be preferable to threads because the function can be interrupted only in well-controlled places where you don't have to worry about synchronization issues.
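To illustrate that pattern, a small sketch; process_items and the per-item work are made up for the example:

import asyncio

async def process_items(items):
    results = []
    for i, item in enumerate(items):
        results.append(item * 2)      # stand-in for the real per-item work
        if i % 10000 == 0:
            # hand control back to the event loop once in a while so
            # other callbacks and tasks can run during a long pass
            await asyncio.sleep(0)
    return results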
Sure. Perhaps you can submit a patch to bugs.python.org with an example?
(Resend because IIRC closed issues don't notify unless at-mentioned.) @regnarg Perhaps you can submit a patch to bugs.python.org with an example?
Ok, I'll try to do that in the near future. NB: I got notified about the first comment, too.