
Tag: multiprocessing

multiprocessing loop over a simple list?

I have a function that calls a custom function, vt.make_breakpts, which compares rows in a dataframe and calculates some stats. It needs a dataframe (data), a key (unique identifier), and a datefield (date) to do its thing. I can run this and wait a very long time, and it will go through an entire dataframe and output a dataframe of …
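A minimal sketch of one way to parallelize this, assuming rows that must be compared together share the same unique identifier so the dataframe can be split into independent chunks. vt.make_breakpts and its data/key/datefield parameters come from the question; the column names, input file, and chunking helper are assumptions:

```python
import multiprocessing as mp

import pandas as pd
import vt  # the questioner's custom module

def process_chunk(chunk: pd.DataFrame) -> pd.DataFrame:
    # Each worker processes one key's rows independently.
    return vt.make_breakpts(data=chunk, key="id", datefield="date")

def run_parallel(df: pd.DataFrame, workers: int = 4) -> pd.DataFrame:
    # One chunk per unique key, so rows that must be compared stay together.
    chunks = [group for _, group in df.groupby("id")]
    with mp.Pool(processes=workers) as pool:
        results = pool.map(process_chunk, chunks)
    return pd.concat(results, ignore_index=True)

if __name__ == "__main__":  # guard is required with the spawn start method
    df = pd.read_csv("data.csv")  # hypothetical input
    print(run_parallel(df))
```

Grouping by the key keeps each worker's comparisons self-contained; pool.map runs the chunks in parallel and pd.concat stitches the partial results back together.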

How can I parallelize a function with multiple arguments?

I have written a function create_time_series(input_df1, info_df1, unit_name, start_date, end_date), which aims to create a time series based on log files saved in input_df1. The problem with my function is that execution is slow, so I thought of parallelizing it. The following code is my attempt at using the multiprocessing library: In Task Manager I can see the processes running; however, …
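For the question in the title, Pool.starmap is the standard way to fan a multi-argument function out over many inputs: pack each call's arguments into a tuple, and starmap unpacks them. A minimal sketch, assuming the per-call argument is unit_name and the other four arguments are shared; the function body, column names, and input files are placeholders:

```python
import multiprocessing as mp

import pandas as pd

def create_time_series(input_df1, info_df1, unit_name, start_date, end_date):
    # Placeholder body: select this unit's log rows and build its series.
    rows = input_df1[input_df1["unit"] == unit_name]
    return rows.set_index("timestamp").sort_index().loc[start_date:end_date]

def run_all(input_df1, info_df1, units, start_date, end_date):
    # One argument tuple per unit; starmap unpacks each tuple into the call.
    args = [(input_df1, info_df1, u, start_date, end_date) for u in units]
    with mp.Pool() as pool:
        return pool.starmap(create_time_series, args)

if __name__ == "__main__":  # without this guard, spawned workers re-import
    input_df1 = pd.read_csv("logs.csv")   # hypothetical inputs
    info_df1 = pd.read_csv("info.csv")
    series = run_all(input_df1, info_df1, ["unit_a", "unit_b"],
                     "2023-01-01", "2023-12-31")
```

Seeing workers in Task Manager but getting no output often means the results were never collected (for example, the script exits before joining, or return values are discarded); starmap blocks until every result has come back.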

Python Process Pool with custom Process not able to respawn child processes

I have overridden multiprocess.Process (a fork of the multiprocessing library) like so: When I create a normal Process using this class, everything works perfectly, including creating logs. Now I want to create a process Pool of such customized processes, but I encountered a problem with respawning these processes after their life comes to an end. Here is how I create the pool, with the additional maxtasksperchild=1 …
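A minimal sketch of wiring a custom Process subclass into a Pool so that respawned workers use it too; on Python 3.8+ the pool creates, and with maxtasksperchild re-creates, its workers through the static Process hook shown below. The class names here are assumptions, and the multiprocess fork mirrors this multiprocessing API:

```python
import multiprocessing
import multiprocessing.pool

class CustomProcess(multiprocessing.Process):
    def run(self):
        # e.g. set up per-process logging here before the worker task loop
        super().run()

class CustomPool(multiprocessing.pool.Pool):
    # The Pool calls this static hook for every worker it spawns or
    # respawns, passing the multiprocessing context as the first argument.
    @staticmethod
    def Process(ctx, *args, **kwds):
        # ctx is ignored here for brevity; the custom class uses the
        # default start method.
        return CustomProcess(*args, **kwds)

if __name__ == "__main__":
    # maxtasksperchild=1 forces a respawn after every task, exercising
    # the hook above for the replacement workers as well.
    with CustomPool(processes=2, maxtasksperchild=1) as pool:
        print(pool.map(abs, [-1, -2, -3]))
```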

Share RLock between multiple instances of Python with multiprocessing

Consider this MWE: When this script is executed, it instantiates the_setup and serves it. Then I want clients to be able to do things like this from other scripts: However, I get RuntimeError: RLock objects should only be shared between processes through inheritance. If the with the_setup.hold_hardware(): block is removed, it works fine, but then I cannot guarantee that the hardware …
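One minimal way around that error, sketched under assumptions since the MWE is not reproduced here: keep the RLock inside the server process and expose acquire/release methods on the served object, so the lock itself is never pickled across a process boundary. TheSetup and hold_hardware follow the question; the manager wiring, address, and authkey are hypothetical:

```python
import threading
from contextlib import contextmanager
from multiprocessing.managers import BaseManager

class TheSetup:
    def __init__(self):
        # The lock never leaves this process; clients only call methods.
        self._hardware_lock = threading.RLock()

    def acquire_hardware(self):
        self._hardware_lock.acquire()

    def release_hardware(self):
        self._hardware_lock.release()

class SetupManager(BaseManager):
    pass

@contextmanager
def hold_hardware(setup):
    # Client-side replacement for `with the_setup.hold_hardware():`.
    setup.acquire_hardware()
    try:
        yield
    finally:
        setup.release_hardware()

if __name__ == "__main__":
    the_setup = TheSetup()
    SetupManager.register("get_setup", callable=lambda: the_setup)
    manager = SetupManager(address=("127.0.0.1", 50000), authkey=b"secret")
    manager.get_server().serve_forever()  # clients connect and call methods
```

A client then registers "get_setup" on its own SetupManager (without a callable), calls connect(), and uses with hold_hardware(manager.get_setup()): — the context manager lives in the client, so only plain method calls cross the process boundary.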
