
Running functions simultaneously in Python

I am making a small program in which I need a few functions to check for something in the background.

I used the threading module, and those functions do run simultaneously; everything works perfectly until I start adding more functions. Since the threading module keeps all of its threads inside a single process, they start slowing each other down as I add more.
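Roughly, a minimal sketch of that setup (the actual checks are simplified to a counter here, and background_check is just an illustrative name):

```python
import threading
import time

status = {"checks": 0}  # ordinary variable: all threads share the process's memory

def background_check():
    # stand-in for a function that checks something in the background
    while True:
        status["checks"] += 1
        time.sleep(0.1)

# every added function becomes another thread inside the same process
for _ in range(4):
    threading.Thread(target=background_check, daemon=True).start()

time.sleep(1)
print(status["checks"])  # the main thread sees the workers' updates directly
```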

The problem is not with the CPU, as its utilization never reaches 100% (i5-4460). I also tried the multiprocessing module, which creates a new process for each function, but then it seems that variables can't be shared between processes, or at least I don't know how to share them. (A newly started process seems to take a copy of the existing variables with it, but my main program cannot see any changes the function makes in the separate process, nor any new variables it creates.)

I tried using the global keyword, but it seems to have no effect across processes the way it does across threads.
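A minimal sketch of what I mean (background_check is again an illustrative name): the child process gets its own copy of the variable, so global only affects that copy.

```python
import multiprocessing

counter = 0

def background_check():
    global counter
    counter += 1                   # changes the child process's own copy
    print("in child:", counter)   # prints 1

if __name__ == "__main__":
    p = multiprocessing.Process(target=background_check)
    p.start()
    p.join()
    print("in parent:", counter)  # prints 0: the change never reaches the parent
```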

How could I solve this problem?

I am pretty sure that I have to create new processes for those background functions, but I need to get some feedback from them, and that part I don't know how to solve.


Answer

I ended up using multiprocessing.Value.
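A minimal sketch of that approach (the worker function is illustrative): multiprocessing.Value allocates a single value of a fixed C type in shared memory, so the parent process can read the child's updates.

```python
import multiprocessing
import time

def background_check(flag):
    # the child writes its feedback into the shared Value
    for _ in range(5):
        with flag.get_lock():  # a synchronized Value carries its own lock
            flag.value += 1
        time.sleep(0.1)

if __name__ == "__main__":
    flag = multiprocessing.Value("i", 0)  # "i" = signed C int, kept in shared memory
    p = multiprocessing.Process(target=background_check, args=(flag,))
    p.start()
    p.join()
    print(flag.value)  # the parent sees the child's updates: prints 5
```

Value holds a single scalar of a fixed type, which is enough for the kind of simple feedback the question describes.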
