Tag: multiprocessing

I have a very large number of file names (from my PC) inserted into a DB with status New by default. For every file name I want to do some operations (change the file). While a file is being changed, its status changes to Processing; after the operations it changes to Processed. I decided to do it with the Python multiprocessing module. Right now I have this solution, but …
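A minimal sketch of that status-update pattern, assuming a SQLite table files(name, status); the DB path, table layout, and the body of process_file() are placeholders, not the asker's actual code:

```python
import sqlite3
from multiprocessing import Pool

DB_PATH = "files.db"  # hypothetical path

def set_status(name, status):
    # Each worker opens its own connection: sqlite3 connections don't pickle.
    with sqlite3.connect(DB_PATH, timeout=30) as conn:  # wait out writer locks
        conn.execute("UPDATE files SET status = ? WHERE name = ?", (status, name))

def process_file(name):
    set_status(name, "Processing")
    # ... do the actual changes to the file here ...
    set_status(name, "Processed")

if __name__ == "__main__":
    with sqlite3.connect(DB_PATH) as conn:
        names = [row[0] for row in
                 conn.execute("SELECT name FROM files WHERE status = 'New'")]
    with Pool() as pool:
        pool.map(process_file, names)
```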
Using a class decorator to automatically run a method with a child process
I was asked to develop a consistent way to run (train, make predictions, etc.) any ML model from the command line. I also need to periodically check the DB for requests related to training, like abort requests. To minimize the effect checking the DB has on training, I want to create a separate process for fetching requests from the DB. So …
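One way to sketch the decorator idea; run_in_child, the trampoline _call_unwrapped, and the Trainer class are illustrative names, and the module-level trampoline exists only so the child can find the undecorated function by name under the "spawn" start method:

```python
import time
from functools import wraps
from multiprocessing import Event, Process

def _call_unwrapped(obj, name, args, kwargs):
    # Module-level, so the child process can locate it (and, through it, the
    # original undecorated method) by name even under "spawn".
    getattr(type(obj), name).__wrapped__(obj, *args, **kwargs)

def run_in_child(method):
    """Method decorator: calling the method spawns a child process running it."""
    @wraps(method)  # also sets wrapper.__wrapped__ = method
    def wrapper(self, *args, **kwargs):
        p = Process(target=_call_unwrapped,
                    args=(self, method.__name__, args, kwargs), daemon=True)
        p.start()
        return p
    return wrapper

class Trainer:
    def __init__(self):
        self.abort_requested = Event()  # shared flag, visible to both processes

    @run_in_child
    def poll_db(self, interval=5.0):
        # Periodically check the DB for abort requests without slowing training.
        while not self.abort_requested.is_set():
            # ... query the requests table; on an abort row, set the flag ...
            time.sleep(interval)

if __name__ == "__main__":
    trainer = Trainer()
    poller = trainer.poll_db(interval=1.0)
    # ... training loop here, checking trainer.abort_requested between steps ...
    trainer.abort_requested.set()
    poller.join()
```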
Python: will a thread ever unblock while hanging on a `Queue.get(block=True)` call if the queue is suddenly destroyed from another thread/process?
TL;DR: Would a blocking get() be unblocked if the queue were terminated in some way? Long question: Okay, so I know that the thread will hang if the queue (multiprocessing.Queue) remains empty forever while trying to fetch something from it with a blocking get(). But suppose now that in another thread or process, I close the queue with …
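In practice, closing a multiprocessing.Queue is not documented to wake a consumer blocked in get(), so the usual workarounds are a bounded timeout or an explicit sentinel; a small sketch of both, with SENTINEL as an illustrative convention:

```python
import queue
from multiprocessing import Process, Queue

SENTINEL = None  # convention: producer sends this to say "no more items"

def consumer(q):
    while True:
        try:
            item = q.get(timeout=1.0)  # bounded wait instead of blocking forever
        except queue.Empty:
            continue                   # nothing yet; loop (or check a shutdown flag)
        if item is SENTINEL:
            break
        print("got", item)

if __name__ == "__main__":
    q = Queue()
    p = Process(target=consumer, args=(q,))
    p.start()
    for i in range(3):
        q.put(i)
    q.put(SENTINEL)                    # cleanly unblocks the consumer
    p.join()
```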
Faster processing of a for loop with PUT requests in Python
I have a JSON file that I am reading, processing each element according to the API and the payload that is required. I am looking for ways to speed this up through multiprocessing or concurrency; I am not sure of the proper approach. I think either of those would work, as they are individual requests updating a specific …
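Since HTTP calls are I/O-bound, a thread pool is often the simpler fit here; a sketch assuming a hypothetical endpoint and a list of {id, payload} items in items.json:

```python
import json
from concurrent.futures import ThreadPoolExecutor, as_completed

import requests  # third-party: pip install requests

API_URL = "https://example.com/api/items/{id}"  # hypothetical endpoint

def send_update(item):
    # One PUT per element; item["id"] and item["payload"] are assumed field names.
    resp = requests.put(API_URL.format(id=item["id"]), json=item["payload"])
    resp.raise_for_status()
    return item["id"]

if __name__ == "__main__":
    with open("items.json") as f:
        items = json.load(f)
    # Threads overlap the network waits; tune max_workers to what the API allows.
    with ThreadPoolExecutor(max_workers=8) as pool:
        futures = [pool.submit(send_update, item) for item in items]
        for fut in as_completed(futures):
            print("updated", fut.result())
```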
Constant camera grabbing with OpenCV & Python multiprocessing
I want to constantly read images from an OpenCV camera in Python and read the latest image from the main program. This is needed because of problematic HW. After messing around with threads and getting very low efficiency (duh!), I’d like to switch to multiprocessing. Here’s the threading version: … Can someone please help me translate this to multiprocessing? …
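One common multiprocessing translation keeps only the newest frame in a size-1 queue, so the main program always reads something close to live; camera index 0 and the grab loop are assumptions, not the asker's threading code:

```python
import queue
from multiprocessing import Process, Queue

import cv2  # third-party: pip install opencv-python

def grabber(q):
    cap = cv2.VideoCapture(0)          # assumed camera index
    while True:
        ok, frame = cap.read()
        if not ok:
            continue
        if q.full():                   # drop the stale frame first, so the
            try:                       # single producer never blocks on put()
                q.get_nowait()
            except queue.Empty:
                pass                   # the consumer beat us to it
        q.put(frame)

if __name__ == "__main__":
    q = Queue(maxsize=1)               # holds only the newest frame
    Process(target=grabber, args=(q,), daemon=True).start()
    while True:
        frame = q.get()                # close to live, however slow this loop is
        cv2.imshow("latest", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
```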
Optimal way to use multiprocessing for many files
So I have a large list of files that need to be processed into CSVs. Each file itself is quite large, and each line is a string. Each line of the files could represent one of three types of data, each of which is processed a bit differently. My current solution looks like the following: I iterate through the files, …
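A sketch of one common layout: one worker per file, with imap_unordered so finished files stream back in completion order; classify_line() is a stand-in for the per-line logic that tells the three record types apart:

```python
import csv
from multiprocessing import Pool

def classify_line(line):
    # Placeholder for the real logic that distinguishes the three data types.
    return "type_a", [line.strip()]

def process_file(path):
    out_path = path + ".csv"
    with open(path) as src, open(out_path, "w", newline="") as dst:
        writer = csv.writer(dst)
        for line in src:
            record_type, fields = classify_line(line)
            writer.writerow([record_type, *fields])
    return out_path

if __name__ == "__main__":
    paths = ["a.txt", "b.txt", "c.txt"]  # stand-ins for the real file list
    with Pool() as pool:                 # one worker per CPU by default
        for done in pool.imap_unordered(process_file, paths):
            print("wrote", done)
```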
How to do a race between processes in Python
I have a function that factors a number; it depends on some random condition. What I am trying to do is run this function in multiple processes, where the process that finds a factor first returns the value and all the other processes terminate. What I have so far is very wrong: the processes are not terminating, and I also …
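A sketch of the first-one-wins pattern: every worker pushes its answer into a shared queue, the parent takes the first result and terminates the rest; find_factor() is a stand-in for the randomized factoring function:

```python
import random
from multiprocessing import Process, Queue

def find_factor(n, results):
    # Placeholder strategy: the real function's "random condition" goes here.
    while True:
        candidate = random.randrange(2, n)
        if n % candidate == 0:
            results.put(candidate)
            return

if __name__ == "__main__":
    n = 101 * 103
    results = Queue()
    workers = [Process(target=find_factor, args=(n, results)) for _ in range(4)]
    for w in workers:
        w.start()
    factor = results.get()   # blocks until the first worker reports a factor
    for w in workers:
        w.terminate()        # stop the losers; we already have our answer
        w.join()
    print("factor:", factor)
```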
Mocking a multiprocessing.Process target in unit tests
I want to test whether a mocked method is called by multiprocessing.Process. I am aware that a subprocess call makes a fork of the current process and that the information may only be available in the subprocess. However, I would like to know if there is a clean way to test such a use case anyway. One possibility would certainly be …
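One workaround is to patch multiprocessing.Process with a stub whose start() runs the target inline, so the call stays observable by the mock in the test process; worker and launch are illustrative stand-ins for the code under test:

```python
import multiprocessing
import unittest
from unittest import mock

def worker(x):
    pass  # stand-in for the function the real code hands to Process

def launch():
    # Stand-in for the code under test.
    p = multiprocessing.Process(target=worker, args=(42,))
    p.start()
    p.join()

class InlineProcess:
    """Test double: start() calls the target in this process, so mocks see it."""
    def __init__(self, target=None, args=(), kwargs=None, **_):
        self._call = lambda: target(*args, **(kwargs or {}))
    def start(self):
        self._call()
    def join(self, timeout=None):
        pass

class LaunchTest(unittest.TestCase):
    @mock.patch("__main__.worker")  # in a real suite, patch "mypkg.mymod.worker"
    def test_worker_called(self, fake_worker):
        with mock.patch("multiprocessing.Process", InlineProcess):
            launch()
        fake_worker.assert_called_once_with(42)

if __name__ == "__main__":
    unittest.main()
```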
Error 409 in telebot when using multiprocessing [exe file]
I want to start another process; it should start when I write “start” to the Telegram bot. But right after this, the error occurs. I am running ONLY one bot at a time; I already checked that. I found that the error occurs when the program tries to start a new process. Could anyone help me, please? UPD: Forgot to say, this …
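Telegram returns 409 Conflict when two clients poll getUpdates with the same token, and a likely cause here is that the spawned child in a frozen exe re-executes the module top level, starting a second polling loop. A sketch of the usual fix, assuming that diagnosis, with the token and the worker body as placeholders:

```python
import multiprocessing

import telebot  # third-party: pip install pyTelegramBotAPI

bot = telebot.TeleBot("TOKEN")  # placeholder token; creating the object is harmless

def background_job():
    pass  # whatever the extra process should do

@bot.message_handler(func=lambda m: m.text == "start")
def on_start(message):
    multiprocessing.Process(target=background_job).start()

if __name__ == "__main__":
    multiprocessing.freeze_support()  # must run first in a frozen (exe) child
    bot.polling()                     # guarded: only the parent ever polls
```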
How to close all the processes one by one in a multiprocessing program, by means of an ‘if’ check found in one of the processes?
I need that when the variable text is equal to “Close your program”, the 3 active processes (p1, p2, p3) are closed. I have tried to do something like this: … But it is not working for me, and I need better code that allows me to close them one by one in that code block if text is equal to “Close …
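A sketch using a shared multiprocessing.Event: setting the flag once lets every worker's loop exit cleanly. The check runs in the parent here because multiprocessing children inherit no usable stdin, but since the Event is shared, any of the workers could call stop.set() on its own condition instead:

```python
import time
from multiprocessing import Event, Process

def worker(name, stop):
    while not stop.is_set():
        # ... the real work of p1/p2/p3 goes here ...
        time.sleep(0.1)
    print(name, "closed")

if __name__ == "__main__":
    stop = Event()
    procs = [Process(target=worker, args=(f"p{i}", stop)) for i in (1, 2, 3)]
    for p in procs:
        p.start()
    while True:
        text = input()               # stand-in for wherever `text` comes from
        if text == "Close your program":
            stop.set()               # one flag shuts down p1, p2 and p3
            break
    for p in procs:
        p.join()                     # each worker exits its loop and ends
```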