
Tag: out-of-memory

Importing large IDL files into Python with SciPy

I currently use scipy.io.readsav() to import IDL .sav files into Python, which works well. However, if the .sav file is large (say > 1 GB), I get a MemoryError when trying to import it. Usually, iterating through the data (as one would with a .txt or .csv file) would solve this rather than loading it all into memory at once.
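readsav() materialises the entire file as in-memory NumPy arrays, so there is no built-in way to stream a .sav file in chunks. A minimal sketch of one common workaround follows, assuming a hypothetical file data.sav containing a variable image_cube (neither name is from the post): read the file once on a machine with enough RAM, re-save the array in NumPy's .npy format, and memory-map it afterwards on the constrained machine.

    import numpy as np
    from scipy.io import readsav

    # readsav() loads the whole .sav file into memory and returns a
    # dict keyed by the IDL variable names ('data.sav' and
    # 'image_cube' are placeholders, not names from the post).
    sav = readsav('data.sav', python_dict=True, verbose=True)
    cube = sav['image_cube']                # a plain NumPy array

    # One-time conversion on a machine with enough RAM: .npy files
    # can later be memory-mapped instead of loaded wholesale.
    np.save('image_cube.npy', cube)

    # Later, on the memory-constrained machine: only the slices you
    # actually touch are read from disk into RAM.
    cube_mm = np.load('image_cube.npy', mmap_mode='r')
    for i in range(0, cube_mm.shape[0], 100):
        chunk = np.asarray(cube_mm[i:i + 100])  # pull in one chunk
        # ... process chunk ...

If the .sav file cannot be opened even once, the usual fallback is to export the data from IDL itself into a format that supports partial reads, such as plain binary or HDF5.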

Python 3 causes a memory error at shuffle(X, Y), where X is 36000 3-channel images (36000, 256, 256, 3) and Y is 3-channel normal data (36000, 256, 256, 3)

A memory error occurs (the post includes a screenshot of the memory usage). I am using NumPy and Python 3. I have two NumPy arrays, X and Y, each of shape (36000, 256, 256, 3), and the memory error occurs when I run the following code, which prepares the training data. Is there another way to do this that uses less memory? This is my processor: Intel® Xeon(R)
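The code from the post is not included in this excerpt, but sklearn.utils.shuffle(X, Y) returns shuffled copies of both arrays, which roughly doubles peak memory at these shapes (each (36000, 256, 256, 3) array is about 7 GB as uint8 and about 28 GB as float32). A sketch of a lighter-weight alternative, using small placeholder arrays in place of the question's data: shuffle an index array and slice batches through it, so only one batch is copied at a time.

    import numpy as np

    # Small placeholder arrays; the question's real shapes are
    # (36000, 256, 256, 3) each.
    X = np.zeros((1000, 256, 256, 3), dtype=np.uint8)
    Y = np.zeros((1000, 256, 256, 3), dtype=np.uint8)

    # Shuffling an index array costs almost nothing, and fancy
    # indexing then copies only one batch at a time instead of
    # duplicating X and Y wholesale.
    rng = np.random.default_rng(seed=0)
    perm = rng.permutation(len(X))      # 1000 ints here, 36000 in the post

    batch_size = 32
    for start in range(0, len(perm), batch_size):
        idx = perm[start:start + batch_size]
        x_batch = X[idx]                # one-batch copy, keeps X/Y aligned
        y_batch = Y[idx]
        # ... feed (x_batch, y_batch) into training ...

Because both arrays are indexed with the same permutation, the pairing between each image and its target is preserved without ever holding a second full copy of the data.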
