I want to initialize a 300,000 x 300,000 sparse matrix using scipy, but it requires memory as if it were not sparse:
>>> from scipy import sparse
>>> sparse.rand(300000, 300000, .1)
It gives the error:
MemoryError: Unable to allocate 671. GiB for an array with shape (300000, 300000) and data type float64
This is the same error I get if I initialize a dense array with numpy:

np.random.normal(size=[300000, 300000])
Even when I go to a very low density, it reproduces the error:
>>> from scipy import sparse
>>> sparse.rand(300000, 300000, .000000000001)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File ".../python3.8/site-packages/scipy/sparse/construct.py", line 842, in rand
    return random(m, n, density, format, dtype, random_state)
  File ".../lib/python3.8/site-packages/scipy/sparse/construct.py", line 788, in random
    ind = random_state.choice(mn, size=k, replace=False)
  File "mtrand.pyx", line 980, in numpy.random.mtrand.RandomState.choice
  File "mtrand.pyx", line 4528, in numpy.random.mtrand.RandomState.permutation
MemoryError: Unable to allocate 671. GiB for an array with shape (90000000000,) and data type int64
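From the traceback, the allocation comes from random_state.choice(mn, size=k, replace=False), which builds a full permutation over all n * m candidate positions, so the memory required does not depend on the density at all. A quick back-of-the-envelope check (my own arithmetic, assuming one int64 index per candidate position):

n, m = 300000, 300000
print(n * m * 8 / 2**30)   # ~670.55 GiB -- matches the "671. GiB" in the error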
Is there a more memory-efficient way to create such a sparse matrix?
Answer
Just generate only what you need.
from scipy import sparse
import numpy as np

n, m = 300000, 300000
density = 0.00000001

# Number of nonzero entries to generate.
size = int(n * m * density)

# Draw random coordinates and values for just those entries.
rows = np.random.randint(0, n, size=size)
cols = np.random.randint(0, m, size=size)
data = np.random.rand(size)

# Assemble them directly into a CSR matrix.
arr = sparse.csr_matrix((data, (rows, cols)), shape=(n, m))
This lets you build monster sparse arrays provided they’re sparse enough to fit into memory.
>>> arr
<300000x300000 sparse matrix of type '<class 'numpy.float64'>'
        with 900 stored elements in Compressed Sparse Row format>
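As a rough sanity check (my own addition, not part of the original answer), the memory a CSR matrix actually holds is just its three underlying arrays, so for the arr built above it comes to a couple of megabytes rather than hundreds of GiB:

# Continuing with arr from the snippet above: total bytes in the CSR arrays
# (values + column indices + row pointers) -- a few MB, not 671 GiB.
print(arr.data.nbytes + arr.indices.nbytes + arr.indptr.nbytes)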
This is probably how the sparse.rand constructor should work anyway. If any (row, col) pairs collide, their data values are added together, which is probably fine for all applications I can think of.
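A tiny illustration of that collision behaviour (my own example, not from the original answer): when the same (row, col) coordinate appears more than once, the CSR constructor sums the corresponding values.

from scipy import sparse
import numpy as np

# Two entries land on the same coordinate (0, 2).
rows = np.array([0, 0, 1])
cols = np.array([2, 2, 1])
data = np.array([1.0, 2.0, 5.0])

mat = sparse.csr_matrix((data, (rows, cols)), shape=(3, 3))
print(mat.toarray())
# [[0. 0. 3.]   <- the 1.0 and 2.0 at (0, 2) were summed
#  [0. 5. 0.]
#  [0. 0. 0.]]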