I have three large lists. The first contains bitarrays (module bitarray 0.8.0) and the other two contain arrays of integers. These data structures take quite a bit of RAM (~16GB total). If I start 12 sub-processes using: Does this mean that l1, l2 and l3 will be copied for each sub-process, or will the sub-processes share these lists? Or to be
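A minimal sketch of the situation described, assuming the three lists are module-level globals and the sub-processes are started with multiprocessing.Pool on a fork-based start method (the names l1, l2, l3 and the worker function are illustrative, not from the original):

```python
import multiprocessing
from bitarray import bitarray

# Hypothetical stand-ins for the three large lists described above.
l1 = [bitarray('10110') for _ in range(5)]   # list of bitarrays
l2 = [[1, 2, 3] for _ in range(5)]           # list of integer arrays
l3 = [[4, 5, 6] for _ in range(5)]           # list of integer arrays

def work(index):
    # Workers only read the module-level globals. With a fork start method
    # the memory pages are inherited copy-on-write, so nothing is duplicated
    # until a process (or the reference-count bookkeeping) writes to a page.
    return l1[index].count() + sum(l2[index]) + sum(l3[index])

if __name__ == '__main__':
    with multiprocessing.Pool(processes=12) as pool:
        results = pool.map(work, range(5))
    print(results)
```

With a spawn start method (the default on Windows and macOS in newer Python versions) the globals are re-imported or re-pickled in each worker instead, so the data would effectively be copied.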
Tag: shared-memory
Combine Pool.map with shared memory Array in Python multiprocessing
I have a very large (read-only) array of data that I want to be processed by multiple processes in parallel. I like the Pool.map function and would like to use it to calculate functions on that data in parallel. I saw that one can use the Value or Array class to share data between processes via shared memory. But when
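A common pattern for this question is to allocate a multiprocessing.Array once and hand it to the pool workers through an initializer, so the data itself is never pickled for each task. The sketch below assumes the array can be stored as ctypes doubles and that the workers only read it; the names init_worker and process_chunk are illustrative:

```python
import multiprocessing
import ctypes

shared_arr = None  # set per worker process by the initializer

def init_worker(arr):
    # Runs once in each worker; stores the shared Array in a module-level
    # global so the mapped function can reach it without re-sending the data.
    global shared_arr
    shared_arr = arr

def process_chunk(i):
    # Read-only access, so no lock is needed.
    return shared_arr[i] * 2

if __name__ == '__main__':
    data = [float(x) for x in range(10)]
    # lock=False because the workers never write to the array.
    arr = multiprocessing.Array(ctypes.c_double, data, lock=False)
    with multiprocessing.Pool(processes=4,
                              initializer=init_worker,
                              initargs=(arr,)) as pool:
        results = pool.map(process_chunk, range(len(arr)))
    print(results)
```

Passing the Array through initargs works because it happens at worker creation time; passing it as an argument to the mapped function would fail, since shared ctypes objects are meant to be inherited rather than sent through the task queue.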