How do I check if PyTorch is using the GPU?

How do I check if PyTorch is using the GPU? The nvidia-smi command can detect GPU activity, but I want to check, from inside the Python script, whether PyTorch is actually using the GPU.

Answer

These functions should help:

>>> import torch

>>> torch.cuda.is_available()
True

>>> torch.cuda.device_count()
1

>>> torch.cuda.current_device()
0

>>> torch.cuda.device(0)
<torch.cuda.device at 0x7efce0b03be0>

>>> torch.cuda.get_device_name(0)
'GeForce GTX 950M'

This tells us:

  • CUDA is available, and one CUDA-capable device is visible to PyTorch.
  • Device 0 refers to the GPU GeForce GTX 950M, and it is the device currently selected by PyTorch; the sketch below shows how to put that device to use from a script.
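If the checks above pass, a common follow-up is to select the device explicitly, move data onto it, and confirm placement and memory use from within the same script. Below is a minimal sketch using only standard torch / torch.cuda calls; torch.cuda.memory_reserved assumes a reasonably recent PyTorch release (older versions named this counter memory_cached), and nothing here is specific to the GTX 950M from the answer.

import torch

# Pick the GPU if CUDA is available, otherwise fall back to the CPU.
device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")
print(f"Using device: {device}")

if device.type == "cuda":
    print(torch.cuda.get_device_name(0))

# Move a tensor to the chosen device and confirm where it lives.
x = torch.randn(1000, 1000, device=device)
print(x.device)   # e.g. cuda:0
print(x.is_cuda)  # True when the tensor is on the GPU

if device.type == "cuda":
    # Memory counters are a rough in-script analogue of nvidia-smi.
    print(f"Allocated: {torch.cuda.memory_allocated(0) / 1024**2:.1f} MiB")
    print(f"Reserved:  {torch.cuda.memory_reserved(0) / 1024**2:.1f} MiB")

Note that torch.cuda.is_available() only says the GPU can be used; a tensor or model actually runs there only once it has been moved to the cuda device, which is why inspecting x.device after the move is the more direct check.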