
CUDA out of memory in Google Colab

I am trying to replicate a GAN study (StarGAN v2) by training the model on a smaller dataset in Google Colab, but I get a CUDA out-of-memory error:

(CUDA out-of-memory traceback omitted)

I changed batch_size, but it didn't work for me. Do you have any ideas? How can I fix this problem?

Paper: StarGAN v2: Diverse Image Synthesis for Multiple Domains

Original github repo: stargan-v2


Answer

If you aren’t using the Pro version of Google Colab, then you’re going to run into somewhat restrictive maximums for your memory allocation. From the Google Colab FAQ…

The amount of memory available in Colab virtual machines varies over time (but is stable for the lifetime of the VM)… You may sometimes be automatically assigned a VM with extra memory when Colab detects that you are likely to need it. Users interested in having more memory available to them in Colab, and more reliably, may be interested in Colab Pro.
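Before tuning anything, it is worth checking which GPU, and how much memory, your Colab session actually got. A minimal check, assuming a PyTorch runtime (StarGAN v2 is a PyTorch codebase):

```python
# Sketch: inspect the GPU Colab assigned to this session, if any.
import torch

if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    # total_memory is in bytes; report it in MiB for readability.
    print(f"{props.name}: {props.total_memory // 2**20} MiB total")
else:
    print("No GPU attached (Runtime > Change runtime type > GPU)")
```

If the reported total is far below what the paper's training configuration needs, no amount of code tweaking will close the gap, and the advice below about reducing the per-step footprint applies.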

You already have a good grasp of the issue, since you understand that lowering batch_size is a good way to reduce memory pressure. Ultimately, though, if you want to replicate this study, you'll either need more memory (e.g. Colab Pro) or a training setup whose per-step footprint fits the memory you have, for example a smaller batch size combined with gradient accumulation, or lower-resolution images.
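If lowering batch_size alone degrades training, gradient accumulation lets you keep the same effective batch while holding fewer samples in memory per step. A minimal sketch in PyTorch; the tiny linear model here is a hypothetical stand-in for the StarGAN v2 networks, but the same pattern drops into any training loop:

```python
# Sketch: gradient accumulation — 4 micro-batches of 2 yield the same
# gradient as one batch of 8, while only 2 samples are resident at a time.
import torch

torch.manual_seed(0)
model = torch.nn.Linear(4, 1)
data = torch.randn(8, 4)
target = torch.randn(8, 1)
loss_fn = torch.nn.MSELoss()

# Full batch in one step (the configuration that runs out of memory):
model.zero_grad()
loss_fn(model(data), target).backward()
full_grad = model.weight.grad.clone()

# Same effective batch, accumulated over smaller micro-batches:
accum_steps = 4
model.zero_grad()
for x_chunk, y_chunk in zip(data.chunk(accum_steps), target.chunk(accum_steps)):
    # Scale each micro-batch loss so the summed gradients match the full batch.
    loss = loss_fn(model(x_chunk), y_chunk) / accum_steps
    loss.backward()  # .grad accumulates across calls until zero_grad()

print(torch.allclose(full_grad, model.weight.grad, atol=1e-6))  # → True
```

In a real loop you would call `optimizer.step()` and `optimizer.zero_grad()` only once per `accum_steps` micro-batches; the trade-off is more forward/backward passes per update, i.e. time for memory.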

User contributions licensed under: CC BY-SA