This won't work, as you won't be able to add an await expression inside a non-asynchronous decorator function.
Just try to run it, and you will get a straight SyntaxError.
Using await is part of writing asynchronous code, and requires a conscious decision to do so. Moreover, being able to create the co-routines so that they can be concurrently awaited with a gather (or equivalent) call is a very important degree of control to have when writing async code.
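For illustration, that control boils down to creating the co-routine objects first and then awaiting them together (fetch here is just a placeholder name):

import asyncio

async def fetch(n):
    # stand-in for any I/O-bound co-routine
    await asyncio.sleep(0.1)
    return n * 2

async def main():
    # create the co-routine objects first, then await all of them concurrently:
    results = await asyncio.gather(fetch(1), fetch(2), fetch(3))
    print(results)   # [2, 4, 6]

asyncio.run(main())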
So, I don't recommend you follow that path at all.
Check out the trio project, as it has modified create_task-or-equivalent calls that don't require you to create a co-routine object yourself, and it uses the same pattern to call both async and sync code. But it still needs the await.
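Roughly, the trio pattern looks like this (just a sketch - worker is a made-up name; the awaiting effectively happens when the async with block exits):

import trio

async def worker(name):
    await trio.sleep(0.1)
    print(f"{name} done")

async def main():
    # start_soon takes the function and its arguments, not a ready-made co-routine
    async with trio.open_nursery() as nursery:
        nursery.start_soon(worker, "a")
        nursery.start_soon(worker, "b")
    # all tasks started on the nursery are finished once this point is reached

trio.run(main)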
That said, here are some routes you might pursue:
I have crafted a mechanism (considerably more complex than your proposed decorator) to allow async_to_sync and sync_to_async cross calls that can re-use the same event loop, even jumping through intermediate synchronous calls - but even with those, the parent caller ultimately has to use await, either on an async_to_sync(...) call or on the object returned by a synchronous function (which would be the case with your proposed decorator).
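Not those decorators themselves, but just to illustrate the last point - an async caller awaiting the object handed back by a synchronous intermediate (sync_layer and fetch are made-up names):

import asyncio

async def fetch():
    await asyncio.sleep(0.1)
    return 42

def sync_layer():
    # synchronous intermediate: just hands the co-routine object back to the caller
    return fetch()

async def main():
    value = await sync_layer()   # the parent caller still has to use await
    print(value)

asyncio.run(main())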
Wrapping a co-routine function in an object that would be awaited upon being collected: that could work to transparently launch tasks - but if you want the return values of those, you would have to await those tasks anyway.

So, this should work for non-value-returning co-routines that can run concurrently after they are called. (And if, by any chance, your main code finishes before any of those running tasks is complete, they will be cancelled. In other words, it is simpler to just await them in the first place):
import asyncio

class autoawait:
    # class-level set keeps strong references to the running tasks,
    # so they are not garbage-collected before completing
    running_tasks = set()

    def __init__(self, func):
        self.func = func

    def __call__(self, *args, **kwargs):
        loop = asyncio.get_running_loop()
        coro = self.func(*args, **kwargs)
        task = loop.create_task(coro)
        # the done-callback drops the reference once the task finishes
        task.add_done_callback(type(self).running_tasks.remove)
        type(self).running_tasks.add(task)
        return None  # sorry - you can't have the coro return value without an await
# and on the repl:
>>> @autoawait
... async def blah():
...     await asyncio.sleep(1)
...     print("hello world!")
...
>>> async def main():
...     # awaitless call:
...     blah()
...     # concurrent code for free:
...     print("before hello")
...     # spend time so that blah completes:
...     await asyncio.sleep(1.1)
...     print("after hello")
...
>>> asyncio.run(main())
before hello
hello world!
after hello
Using a similar strategy, but creating a task bound to a TaskGroup inside a task-group context, could work as well - and it would ensure your background tasks are complete. But the decorator would be much more complex, as it would have to introspect the calling frame for an active TaskGroup. (I am not writing that code.)
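For reference, the plain TaskGroup pattern (Python 3.11+) that such a decorator would have to plug into - without any frame introspection - is just:

import asyncio

async def blah():
    await asyncio.sleep(1)
    print("hello world!")

async def main():
    # every task created on the group is guaranteed to be finished
    # (or cancelled) before the async with block exits
    async with asyncio.TaskGroup() as tg:
        tg.create_task(blah())
        print("before hello")
    print("after hello")

asyncio.run(main())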
Also, it is possible to add a callback that places the result of each task in a queue, so you could get the return values back.
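A minimal sketch of that callback-plus-queue idea (work is a made-up name) - note how you end up awaiting the queue instead:

import asyncio

async def work(n):
    await asyncio.sleep(0.1)
    return n * 2

async def main():
    results = asyncio.Queue()
    task = asyncio.create_task(work(21))
    # done-callback: push the finished task's return value into the queue
    task.add_done_callback(lambda t: results.put_nowait(t.result()))
    print(await results.get())   # 42 - an await in disguise, after all

asyncio.run(main())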
But all of these require a good understanding of the asyncio model - and when you get there, you will see that being able to write the awaits explicitly is actually more of a blessing. :-)
If what you want is to avoid writing await before your async functions from within another async function, then you shouldn't do that. The await keyword means special byte code instructions get executed that allow asynchronous tasks to run concurrently. E.g. when your database call has to wait on the network, the executing task gets swapped out for some other task. But it needs await in every function to be able to do that.