
asyncio run_until_complete does not wait for all coroutines to finish

I am taking my first steps in Python and I am struggling to understand why I do not get the expected result here. Here is what I am trying to achieve:

I have a function that consumes an API. Since I am going through a proxy that adds extra lag while waiting for the API to answer, I thought that sending concurrent requests would speed up the process (I run 100 concurrent requests). It does. But asyncio run_until_complete always returns some unfinished coroutines.

Here is the code (simplified):

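(The original snippet was not preserved in this copy of the page. The sketch below is a reconstruction based only on the description above: the names consume_api and loop_on_api and the 100 concurrent requests come from the question, while the aiohttp usage, endpoint, proxy address, and error handling are my assumptions.)

```python
import asyncio
import aiohttp

API_URL = "https://example.com/api"   # placeholder endpoint (assumption)
PROXY = "http://my.proxy:8080"        # placeholder proxy address (assumption)

async def consume_api(session, payload):
    """Send one request to the API through the proxy and return the body as text."""
    try:
        async with session.get(API_URL, params=payload, proxy=PROXY) as resp:
            if resp.status >= 500:
                return None            # server-side error; caller decides what to do
            return await resp.text()
    except aiohttp.ClientError:
        return None                    # network or proxy error

async def loop_on_api(payloads):
    """Fire all requests concurrently and collect their results."""
    async with aiohttp.ClientSession() as session:
        coroutines = [consume_api(session, p) for p in payloads]
        return await asyncio.gather(*coroutines)

payloads = [{"page": str(i)} for i in range(100)]   # 100 concurrent requests
loop = asyncio.new_event_loop()
results = loop.run_until_complete(loop_on_api(payloads))
loop.close()
```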

When I run the debugger, the results returned by the loop_on_api function include a list of strings corresponding to the results of consume_api, plus some occurrences of <coroutine object consume_api at 0x00...>. Those objects have a cr_running attribute set to False and a cr_frame. However, if I check the coroutines variable, I can find all 100 coroutines, but none of them seems to have a cr_frame.
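As an aside (this reading is mine, not stated in the question), those attributes describe the coroutine's lifecycle: a coroutine object that still has a cr_frame has not finished running, while one whose cr_frame is None has already completed or been closed. A small standalone check using only the standard library:

```python
import asyncio
import inspect

async def consume_api():
    return "some response"

coro = consume_api()                             # created but never awaited
print(coro.cr_running, coro.cr_frame is None)    # False False -> still has a frame
print(inspect.getcoroutinestate(coro))           # CORO_CREATED

loop = asyncio.new_event_loop()
loop.run_until_complete(coro)
loop.close()

print(coro.cr_running, coro.cr_frame is None)    # False True  -> frame is gone once finished
print(inspect.getcoroutinestate(coro))           # CORO_CLOSED
```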

Any idea what I am doing wrong?

I am also thinking that my way of counting the 50 errors will be shared by all the coroutines.

Any idea how I can make it specific to each coroutine?


Answer

It seems the issue comes from the proxy I am using, which sometimes does not carry the request or the response. Forcing a rerun therefore seems to be the best answer, so I now check whether the returned results still contain coroutines and re-run loop_on_api() on them:

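(Again, the original snippet was not preserved. The loop below is a rough sketch of the retry described, reusing loop_on_api and payloads from the sketch in the question; the payload tracking and the inspect.iscoroutine check are my choices, not necessarily what the original code did.)

```python
import asyncio
import inspect

# consume_api, loop_on_api and payloads as in the sketch in the question

loop = asyncio.new_event_loop()
results = loop.run_until_complete(loop_on_api(payloads))

# Keep re-running until no bare coroutine objects are left among the results.
while any(inspect.iscoroutine(r) for r in results):
    # Payloads whose slot still holds a coroutine object need another pass.
    retry_payloads = [p for p, r in zip(payloads, results) if inspect.iscoroutine(r)]
    retried = iter(loop.run_until_complete(loop_on_api(retry_payloads)))
    # Splice the fresh results back into their original positions.
    results = [next(retried) if inspect.iscoroutine(r) else r for r in results]

loop.close()
```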