My problem is as follows:
I have a car dealer A, and a db table named sold_cars. When a car is being sold, I create an entry in this table.
Table ha
from contextlib import contextmanager

from django.db import transaction
from django.db.transaction import get_connection


@contextmanager
def lock_table(model):
    # LOCK TABLE is only valid inside a transaction; the lock is released
    # automatically when the transaction commits or rolls back.
    with transaction.atomic():
        cursor = get_connection().cursor()
        cursor.execute(f'LOCK TABLE {model._meta.db_table}')
        try:
            yield
        finally:
            cursor.close()
This is very similar to @jdepoix's solution, but a bit more dense.
You can use it like this:
with lock_table(MyModel):
    MyModel.do_something()
Note that this only works with PostgreSQL and uses Python 3.6's f-strings (a.k.a. literal string interpolation).
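If you don't need the strongest lock, PostgreSQL also lets you name the lock mode explicitly. Below is a minimal variation of the same idea; the lock_mode parameter and its default are my own additions, not part of the snippet above, and the value is assumed to be a trusted constant (it is interpolated into the SQL, not passed as a query parameter):

from contextlib import contextmanager

from django.db import transaction
from django.db.transaction import get_connection


@contextmanager
def lock_table(model, lock_mode='ACCESS EXCLUSIVE'):
    # lock_mode should be one of PostgreSQL's lock modes,
    # e.g. 'SHARE ROW EXCLUSIVE' or the default 'ACCESS EXCLUSIVE'.
    with transaction.atomic():
        cursor = get_connection().cursor()
        cursor.execute(
            f'LOCK TABLE {model._meta.db_table} IN {lock_mode} MODE'
        )
        try:
            yield
        finally:
            cursor.close()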
I think this code snippet meets your needs, assuming you are using MySQL. If not, you may need to tweak the syntax a little, but the idea should still work.
Source: Locking tables
import logging

from django.db import connection, models

logger = logging.getLogger(__name__)


class LockingManager(models.Manager):
    """ Add lock/unlock functionality to manager.

    Example::

        class Job(models.Model):

            objects = LockingManager()

            counter = models.IntegerField(null=True, default=0)

            @staticmethod
            def do_atomic_update(job_id):
                ''' Updates job integer, keeping it below 5 '''
                try:
                    # Ensure only one HTTP request can do this update at once.
                    Job.objects.lock()

                    job = Job.objects.get(id=job_id)
                    # If we don't lock the tables two simultaneous
                    # requests might both increase the counter
                    # going over 5
                    if job.counter < 5:
                        job.counter += 1
                        job.save()

                finally:
                    Job.objects.unlock()

    """

    def lock(self):
        """ Lock table.

        Locks the object model table so that atomic update is possible.
        Simultaneous database access requests pend until the lock is unlock()'ed.

        Note: If you need to lock multiple tables, you need to lock them
        all in one SQL clause and this function is not enough. To avoid
        deadlocks, all tables must be locked in the same order.

        See http://dev.mysql.com/doc/refman/5.0/en/lock-tables.html
        """
        cursor = connection.cursor()
        table = self.model._meta.db_table
        logger.debug("Locking table %s" % table)
        cursor.execute("LOCK TABLES %s WRITE" % table)
        row = cursor.fetchone()
        return row

    def unlock(self):
        """ Unlock the table. """
        cursor = connection.cursor()
        table = self.model._meta.db_table
        cursor.execute("UNLOCK TABLES")
        row = cursor.fetchone()
        return row
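The docstring's note about multiple tables matters in practice: once a MySQL session has issued LOCK TABLES, it may only touch the tables it locked, so every related table you read or write inside the critical section has to be locked in the same statement. Here is a rough sketch of a method you could add to the manager above for that case; lock_many and the Invoice model in the usage comment are illustrative, not part of the original snippet:

    def lock_many(self, *models):
        """ Lock this model's table together with the tables of *models.

        All tables are locked in a single LOCK TABLES statement, in the
        order they are passed in, to avoid deadlocks.
        """
        cursor = connection.cursor()
        tables = [self.model._meta.db_table] + [m._meta.db_table for m in models]
        clause = ", ".join("%s WRITE" % t for t in tables)
        cursor.execute("LOCK TABLES %s" % clause)

Usage would then look roughly like:

    Job.objects.lock_many(Invoice)  # hypothetical related model
    try:
        ...  # work on Job and Invoice rows
    finally:
        Job.objects.unlock()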
I would recommend using the F() expression instead of locking the entire table. If your app is heavily used, locking the table will have a significant performance impact.
The exact scenario you described is covered in the Django documentation here. Based on your scenario, here's the code you can use:
from django.db.models import F
# Populate sold_cars as you normally do..
# Before saving, use the "F" expression
sold_cars.order_num = F('order_num') + 1
sold_cars.save()
# You must do this before referring to order_num:
sold_cars.refresh_from_db()
# Now you have the database-assigned order number in sold_cars.order_num
Note that if you set order_num during an update() operation (update() is a queryset method, not an instance method), use the following instead:
sold_cars.update(order_num=F('order_num') + 1)
sold_cars.refresh_from_db()
Since the database is in charge of updating the field, there won't be any race conditions or duplicated order_num values. Plus, this approach is much faster than the one with locked tables.
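To make that concrete, here is a minimal sketch of the whole flow. The SoldCar model, its fields, and the bump_order_num helper are hypothetical stand-ins for whatever your actual sold_cars schema looks like:

from django.db import models
from django.db.models import F


class SoldCar(models.Model):
    # Hypothetical schema: adjust field names to your actual sold_cars table.
    dealer = models.CharField(max_length=100)
    order_num = models.IntegerField(default=0)


def bump_order_num(car_id):
    # Let the database perform the increment, so two concurrent requests
    # cannot both read the same stale value and write the same result.
    car = SoldCar.objects.get(id=car_id)
    car.order_num = F('order_num') + 1
    car.save()
    # Refresh to see the value the database actually stored.
    car.refresh_from_db()
    return car.order_num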
I know this question is a bit older, but I just had the same issue and wanted to share my learnings.
I wasn't quite satisfied with st0nes' answer, since (at least for PostgreSQL) a LOCK TABLE statement can only be issued within a transaction. And although in Django almost everything usually happens within a transaction, the LockingManager does not make sure that you actually are within one, at least to my understanding. Also, I didn't want to completely replace the model's Manager just to be able to lock it in one spot, so I was looking for something that works kind of like with transaction.atomic():, but also locks a given model.
So I came up with this:
from django.conf import settings
from django.db import DEFAULT_DB_ALIAS
from django.db.transaction import Atomic, get_connection


class LockedAtomicTransaction(Atomic):
    """
    Does an atomic transaction, but also locks the entire table for any transactions, for the duration of this
    transaction. Although this is the only way to avoid concurrency issues in certain situations, it should be used with
    caution, since it has impacts on performance, for obvious reasons...
    """
    def __init__(self, model, using=None, savepoint=None):
        if using is None:
            using = DEFAULT_DB_ALIAS
        # Note: on Django 3.2+, Atomic.__init__ also takes a `durable` argument.
        super().__init__(using, savepoint)
        self.model = model

    def __enter__(self):
        super(LockedAtomicTransaction, self).__enter__()

        # Make sure not to lock when sqlite is used, or you'll run into problems while running tests!!!
        if settings.DATABASES[self.using]['ENGINE'] != 'django.db.backends.sqlite3':
            cursor = None
            try:
                cursor = get_connection(self.using).cursor()
                cursor.execute(
                    'LOCK TABLE {db_table_name}'.format(db_table_name=self.model._meta.db_table)
                )
            finally:
                if cursor and not cursor.closed:
                    cursor.close()
So if I now want to lock the model ModelToLock, it can be used like this:
with LockedAtomicTransaction(ModelToLock):
    # do whatever you want to do
    ModelToLock.objects.create()
EDIT: Note that I have only tested this using PostgreSQL. But to my understanding, it should work on MySQL just like that as well.
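As a sketch of how this could be applied to the original question, the read-modify-write below assigns the next order number while the table is locked, so two concurrent sales cannot pick the same number. The SoldCar model and its fields are hypothetical, and the Max aggregate is just one way to compute the next number:

from django.db.models import Max


def sell_car(dealer):
    with LockedAtomicTransaction(SoldCar):
        # No other transaction can insert into sold_cars while we hold the
        # lock, so reading the current maximum and writing max + 1 cannot race.
        current_max = SoldCar.objects.filter(dealer=dealer).aggregate(
            max_num=Max('order_num')
        )['max_num'] or 0
        return SoldCar.objects.create(dealer=dealer, order_num=current_max + 1)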