I’m using Django with an SQLite backend, and write performance is a problem. I may graduate to a “proper” DB at some stage, but for the moment I’m stuck with SQLite. I think that my write performance problems are probably related to the fact that I’m creating a large number of rows, and presumably each time I save() one it’s locking, unlocking and syncing the DB on disk.

How can I aggregate a large number of save() calls into a single database operation?
Answer
EDITED: commit_on_success is deprecated and was removed in Django 1.8. Use transaction.atomic instead. See Fraser Harris’s answer.
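On Django 1.6+ the same idea looks like this, as a minimal sketch with transaction.atomic (mirroring the lot_of_saves example below; modify_item is a placeholder for your per-item change):

from django.db import transaction

@transaction.atomic
def lot_of_saves(queryset):
    # Everything inside this function commits as a single transaction;
    # an exception rolls the entire batch back.
    for item in queryset:
        modify_item(item)  # placeholder for your per-item change
        item.save()

transaction.atomic also works as a context manager (with transaction.atomic():) if you’d rather not wrap the loop in a function.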
Actually this is easier to do than you think. You can use transactions in Django. These batch database operations (specifically save, insert and delete) into one operation. I’ve found the easiest one to use is commit_on_success. Essentially you wrap your database save operations into a function and then apply the commit_on_success decorator.
from django.db.transaction import commit_on_success

@commit_on_success
def lot_of_saves(queryset):
    # Every save() below runs inside a single transaction,
    # committed once when the function returns successfully.
    for item in queryset:
        modify_item(item)
        item.save()
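For example, assuming a hypothetical MyModel, a single call then does one commit (and one disk sync) for the whole queryset instead of one per row:

lot_of_saves(MyModel.objects.all())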
This will give you a huge speed increase. You’ll also get the benefit of rollbacks if any of the items fail. If you have millions of save operations then you may have to commit them in blocks using commit_manually and transaction.commit(), but I’ve rarely needed that.
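Since commit_manually was removed along with commit_on_success, a rough sketch of the block-commit idea in modern Django uses one transaction.atomic block per chunk (the chunk size and the modify_item helper are assumptions for illustration, not from the original answer):

from django.db import transaction

def lot_of_saves_in_blocks(items, chunk_size=10000):
    # Commit in blocks so millions of saves don't pile up
    # in one enormous transaction.
    buffer = []
    for item in items:
        buffer.append(item)
        if len(buffer) >= chunk_size:
            save_chunk(buffer)
            buffer = []
    if buffer:
        save_chunk(buffer)

def save_chunk(chunk):
    # Each chunk commits once; a failure rolls back only this chunk.
    with transaction.atomic():
        for item in chunk:
            modify_item(item)  # placeholder, as above
            item.save()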