I currently use scipy.io.readsav() to import IDL .sav files into Python, which works well, e.g.:
data = scipy.io.readsav('data.sav', python_dict=True, verbose=True)
However, if the .sav file is large (say > 1 GB), I get a MemoryError when trying to import it into Python.
Usually, iterating through the data rather than loading it all at once would solve this (if it were a .txt or .csv file), but I don't see how to do that with a .sav file, since readsav is the only method I know of for importing one.
Any ideas how I can avoid this memory error?
Answer
This was resolved by switching to 64-bit Python. A 32-bit process has only about 2–4 GB of addressable memory, and readsav loads the entire .sav file into memory at once, so a > 1 GB file can easily exhaust that limit even when the machine has plenty of RAM.
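For reference, here is a quick standard-library check you can run to confirm whether your interpreter is 32-bit or 64-bit before loading the file:

import struct
import platform

# Pointer size in bits: 32 on a 32-bit interpreter, 64 on a 64-bit one
print(struct.calcsize("P") * 8)

# platform.architecture() reports the same information as a string, e.g. '64bit'
print(platform.architecture()[0])

If this prints 32, installing a 64-bit Python build (and 64-bit SciPy) should let readsav handle files of this size, assuming the machine has enough physical RAM.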