Concurrency for requests

I have a Python script that sends POST requests to the same URL with different data (different ids). I have to send a request for each id and check them continuously to see if anything has changed. Currently I handle this by iterating over an "ids" list with a for loop, sending a request for each id, and then iterating over the list again and again.

But I want to check every one of them at most every 10 seconds, and when there are 1000 ids in the list, it takes much longer than that before the first id is checked again. I could solve this by running 10 parallel scripts, each checking 100 ids. Is there an alternative you would suggest? Thanks.
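
For reference, the sequential pattern described above looks roughly like this (the URL, the ids, and the payload shape are placeholders, not details from the question):

```python
import requests

URL = "https://example.com/endpoint"   # placeholder: your actual URL
ids = ["id1", "id2", "id3"]            # placeholder: your full list of ids

while True:
    # Each id is checked one after another, so a full pass over
    # 1000 ids can take far longer than the 10-second target.
    for item_id in ids:
        response = requests.post(URL, data={"id": item_id}, timeout=10)
        # ... compare the response with the previous result here ...
```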

Answer

If you mean literally running 10 parallel scripts, then yes, you can improve on that by using multithreading from a single script.
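
Here is a minimal sketch of that idea using Python's concurrent.futures.ThreadPoolExecutor, assuming requests is used for the POST calls; the URL, payload shape, worker count, and change-detection logic are placeholders, not taken from the question:

```python
import time
from concurrent.futures import ThreadPoolExecutor

import requests

URL = "https://example.com/endpoint"  # placeholder: your actual URL
ids = ["id1", "id2", "id3"]           # placeholder: your full list of ids

def check_id(item_id):
    """Send the POST request for one id and return (id, response body)."""
    response = requests.post(URL, data={"id": item_id}, timeout=10)
    return item_id, response.text

def poll_forever(poll_interval=10, max_workers=50):
    previous = {}
    with ThreadPoolExecutor(max_workers=max_workers) as executor:
        while True:
            start = time.monotonic()
            # All ids are checked concurrently by the worker threads.
            for item_id, body in executor.map(check_id, ids):
                if previous.get(item_id) != body:
                    print(f"Change detected for id {item_id}")
                    previous[item_id] = body
            # Sleep out whatever remains of the polling window.
            elapsed = time.monotonic() - start
            time.sleep(max(0, poll_interval - elapsed))

if __name__ == "__main__":
    poll_forever()
```

With 50 worker threads, a pass over 1000 ids issues requests 50 at a time instead of one at a time, so the whole list can usually be revisited within the 10-second target from a single script. Because the threads spend most of their time waiting on network I/O, the GIL is not a bottleneck here.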
