
Get total and free GPU memory using PyTorch

I’m using Google Colab’s free GPUs for experimentation and wanted to know how much GPU memory is available to play around with. torch.cuda.memory_allocated() returns the GPU memory currently occupied by tensors, but how do we determine the total and free memory using PyTorch?


Answer

In recent versions of PyTorch you can use torch.cuda.mem_get_info, which returns the free and total memory (in bytes) of the given device:

https://pytorch.org/docs/stable/generated/torch.cuda.mem_get_info.html#torch.cuda.mem_get_info
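For example, a minimal sketch that reports free and total memory in GiB (the helper names `gpu_mem_info` and `bytes_to_gib` are just for illustration), guarded so it degrades gracefully when no CUDA device is present:

```python
import torch


def gpu_mem_info(device=0):
    """Return (free, total) GPU memory in bytes, or None if CUDA is unavailable."""
    if not torch.cuda.is_available():
        return None
    # mem_get_info returns (free_bytes, total_bytes) for the device
    return torch.cuda.mem_get_info(device)


def bytes_to_gib(n):
    """Convert a byte count to GiB."""
    return n / (1024 ** 3)


info = gpu_mem_info()
if info is None:
    print("No CUDA device available")
else:
    free, total = info
    print(f"Free:  {bytes_to_gib(free):.2f} GiB")
    print(f"Total: {bytes_to_gib(total):.2f} GiB")
```

Note that the "free" figure is reported by the CUDA driver for the whole device, so it also reflects memory held by other processes and by PyTorch's caching allocator, not just your own tensors.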

User contributions licensed under: CC BY-SA