
I am working with Asyncio in an app I am making. However, I find it daunting to prefix my function calls with await.

So I want to encapsulate it in a decorator that adds the await to the function call.

import asyncio

def await_wrap(async_func):
    def wrapper(*args, **kwargs):
        return await async_func(*args, **kwargs)
    return wrapper

@await_wrap
async def my_async_class(num):
    await asyncio.sleep(num)

my_async_class(5)

Note: the above is pseudocode and has not been tested yet.

Is there any penalty to this other than the slight overhead of the decorator itself, or is the convenience worth it, given that there are many async def functions in my app?

  • Why are you concerned about performance when you haven't even tested whether this works at all? Commented Oct 3 at 12:56
  • It depends on your use case. If you want to invoke and wait for an async function from outside the event loop, then that is something you can achieve. If you're just trying to avoid putting await before your async functions from within another async function, then you shouldn't do that. The await keyword means special byte code instructions get executed that allow asynchronous tasks to run concurrently. E.g. when your database call has to wait on the network, the executing task gets swapped out for some other task. But it needs await in every function to be able to do that. Commented Oct 3 at 12:59
  • If you are ever worried about "the slight overhead of the decorator itself", you definitely should not be writing code in Python. That is really a non-issue. Commented Oct 3 at 13:17
  • Folks, please: it is not a good idea - but it is a question - it should not be downvoted just because it is not a good idea. (Although I have to agree the OP might at least have tried to run their code and see the ensuing SyntaxError - but that is hardly worth such a negative score for a question.) Commented Oct 3 at 13:17
  • @jsbueno I wanted to know if this was even a good idea before trying to get it to work. But I think I got my answer that it is not a good idea. And in regard to running the code: if the concept itself is a bad idea, then there should be no reason to start experimenting with the code itself - or am I wrong in this? Commented Oct 3 at 13:22

1 Answer


This won't work: you cannot use an await expression inside the non-asynchronous wrapper function returned by your decorator.

Just try to run it, and you will get a straight SyntaxError.
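
For instance, in the REPL (output abridged; the exact error formatting varies between Python versions):

>>> def wrapper():
...     return await some_coroutine()
...
  File "<stdin>", line 2
SyntaxError: 'await' outside async function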

Using await is part of writing asynchronous code, and requires a conscious decision to do so. Moreover, being able to create the co-routines so that they can be concurrently awaited with a gather (or equivalent) call is a very important degree of control one has when writing async code.
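
For example, a minimal sketch of what that control buys you (fetch here is just a placeholder for some real I/O-bound co-routine):

import asyncio

async def fetch(n):
    await asyncio.sleep(n)      # stand-in for real I/O
    return n

async def main():
    # creating the co-routines explicitly lets you decide how they are
    # awaited: here all three run concurrently, so this takes ~3 seconds
    # instead of 1 + 2 + 3
    results = await asyncio.gather(fetch(1), fetch(2), fetch(3))
    print(results)              # [1, 2, 3]

asyncio.run(main())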

So, I don't recommend you follow that path at all. Check out the trio project: its equivalent of create_task does not require you to create a co-routine object yourself, and the same calling pattern is used for both async and sync code. But it still needs the await.
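
As a rough illustration of that calling pattern (a minimal sketch, assuming trio is installed; worker is just a placeholder):

import trio

async def worker(n):
    await trio.sleep(n)         # still an await inside the async function
    print(f"worker {n} done")

async def main():
    async with trio.open_nursery() as nursery:
        # start_soon takes the function and its arguments - no co-routine
        # object is created by the caller, and no await at the call site
        nursery.start_soon(worker, 1)
        nursery.start_soon(worker, 2)

trio.run(main)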

That said, here are some routes you might pursue:

I have crafted a mechanism (considerably more complex than your proposed decorator) to allow async_to_sync and sync_to_async cross calls that are able to re-use the same event loop, even jumping through intermediate synchronous calls - but even with those, the parent caller ultimately has to use await, either on an async_to_sync(...) call or on the object returned by a synchronous function (which would be the case with your proposed decorator).
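
For reference, the naive form of such an async_to_sync helper looks roughly like this (a minimal sketch, not the loop-reusing mechanism described above):

import asyncio

def async_to_sync(async_func):
    # Naive sketch: run the co-routine to completion on a fresh event loop.
    # asyncio.run raises a RuntimeError if a loop is already running, which
    # is exactly the case a more sophisticated mechanism has to handle.
    def wrapper(*args, **kwargs):
        return asyncio.run(async_func(*args, **kwargs))
    return wrapper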

Wrapping a co-routine function in an object that, when called, schedules it as a task: that could work to transparently launch tasks - but if you want the return values of those tasks, you would have to await them anyway.

So, this should work for co-routines that return no value and that can run concurrently after they are called. (And if for any reason your main code finishes before any of those running tasks is complete, they will be cancelled. In other words, it is simpler to just await them in the first place):

import asyncio

class autoawait:
    # keep strong references to the launched tasks so they are not
    # garbage-collected before they finish
    running_tasks = set()

    def __init__(self, func):
        self.func = func

    def __call__(self, *args, **kwargs):
        loop = asyncio.get_running_loop()
        coro = self.func(*args, **kwargs)
        task = loop.create_task(coro)
        # drop the reference once the task is done
        task.add_done_callback(type(self).running_tasks.remove)
        type(self).running_tasks.add(task)
        return None  # sorry - you can't have the coro return value without an await

# and in the REPL:

>>> @autoawait
... async def blah():
...     await asyncio.sleep(1)
...     print("hello world!")
...     

>>> async def main():
...     # awaitless call:
...     blah()
...     # concurrent code for free:
...     print("before hello")
...     # spend time so that blah completes:
...     await asyncio.sleep(1.1)
...     print("after hello")
...

>>> asyncio.run(main())
before hello
hello world!
after hello


Using a similar strategy, but creating the task bound to a TaskGroup inside a task-group context, could also work - and would ensure your background tasks run to completion - but the decorator would be much more complex, as it would have to introspect the calling frame for an active task group. (I am not writing that code.) Also, it is possible to add a done callback that places the result of each task in a queue, so you could get the return values back.
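
A minimal sketch of that queue idea (autoawait_into, double and the queue are my own names, just for illustration):

import asyncio

def autoawait_into(queue):
    # Decorator factory: launch the co-routine as a task and push its
    # result into `queue` when it finishes.
    tasks = set()
    def deco(func):
        def wrapper(*args, **kwargs):
            task = asyncio.get_running_loop().create_task(func(*args, **kwargs))
            tasks.add(task)                        # strong reference, as above
            task.add_done_callback(tasks.discard)
            # .result() would raise here if the task failed; real code
            # would also want to handle exceptions
            task.add_done_callback(lambda t: queue.put_nowait(t.result()))
        return wrapper
    return deco

async def main():
    results = asyncio.Queue()

    @autoawait_into(results)
    async def double(n):
        await asyncio.sleep(0.1)
        return n * 2

    double(21)                    # no await at the call site
    print(await results.get())    # 42

asyncio.run(main())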

But all of these require a good understanding of the asyncio model, and when you get there, you will see that being able to put the awaits is actually more of a blessing. :-)
