I have the following setup:
- Several data processing workers get their configuration from a Django view get_conf() over HTTP.
- The configuration is stored in a Django model using the MySQL / InnoDB backend.
- The configuration model has an overridden save() method which tells the workers to reload their configuration.
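For concreteness, here is a minimal framework-free sketch of that setup; every name below is an illustrative stand-in, not the project's actual code (in the real project Configuration would subclass django.db.models.Model and notify_workers() would be an HTTP ping):

```python
# Illustrative stand-ins for the pieces described above.
reload_requests = []

def notify_workers():
    # stands in for the HTTP "reload your config" ping to each worker,
    # which then re-fetches its configuration via get_conf()
    reload_requests.append("reload")

class Configuration:
    def __init__(self, value):
        self.value = value

    def _write_to_db(self):
        # stands in for Model.save() -> MySQL INSERT/UPDATE under autocommit
        pass

    def save(self):
        self._write_to_db()
        notify_workers()  # the overridden save(): ping workers after saving

Configuration("new").save()
# one reload ping has been queued per save
```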
I have noticed that sometimes the workers do not receive the changed configuration correctly. In particular, when the configuration reload happened sooner than usual, the workers got the "old" configuration from get_conf() (missing the most recent change). The transaction handling used in Django is the default autocommit.
I have come up with the following scenario that could cause the behavior:
1. New configuration is saved.
2. save() returns, but MySQL / InnoDB is still processing the (auto)commit.
3. Workers are booted and make an HTTP request for the new configuration.
4. The MySQL (auto)commit finishes.
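The suspected race can be made concrete with a small simulation in which "save() has returned" and "the commit is visible" are modelled as two separate events. FakeDB and the step mapping are illustrative assumptions for the sake of the demonstration, not MySQL internals:

```python
# Minimal simulation of the suspected race: if save() could return
# before the commit is visible, a worker reading in between sees
# the old configuration.
class FakeDB:
    def __init__(self):
        self.visible_value = "old"   # what other connections can read
        self._pending = None

    def write(self, value):
        # save() returns here: the row is written but not yet committed
        self._pending = value

    def commit(self):
        # only now does the change become visible to other transactions
        self.visible_value = self._pending

db = FakeDB()
db.write("new")                          # steps 1-2: save() has returned
conf_seen_by_worker = db.visible_value   # step 3: worker's get_conf() request
db.commit()                              # step 4: the (auto)commit finishes

print(conf_seen_by_worker)  # -> "old": the worker missed the change
```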
Is step 2 in the above scenario possible? That is, can a Django model's save() return before the data is actually committed in the DB when the default autocommit transaction handling is used? Or, to go one layer down, can a MySQL autocommitted INSERT or UPDATE statement return before the commit is complete (i.e. before the change is visible to other transactions)?
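Regardless of whether step 2 can actually occur at the MySQL level, one robust pattern is to queue the worker notification and fire it only after the transaction has committed; Django provides exactly this hook as transaction.on_commit(). A framework-free sketch of the idea (FakeDB and the callback queue are illustrative, not Django's implementation):

```python
# "Notify only after commit" pattern, analogous to Django's
# transaction.on_commit(): callbacks registered during the write
# are deferred until the commit has made the change visible.
class FakeDB:
    def __init__(self):
        self.visible_value = "old"
        self._pending = None
        self._on_commit = []

    def write(self, value, on_commit=None):
        # save() returns here; the change is not yet visible
        self._pending = value
        if on_commit:
            self._on_commit.append(on_commit)  # queue, don't fire yet

    def commit(self):
        self.visible_value = self._pending     # change is now visible
        callbacks, self._on_commit = self._on_commit, []
        for cb in callbacks:
            cb()  # workers are pinged only after the commit

pings = []
db = FakeDB()
db.write("new", on_commit=lambda: pings.append("reload"))
assert pings == []   # no premature notification while uncommitted
db.commit()
assert pings == ["reload"] and db.visible_value == "new"
```

With this ordering a worker that reacts to the ping always reads the already-committed configuration, even if save() and the commit are separated in time.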
Comment (beginning truncated): "…ATOMIC_REQUESTS=True, then you could have a race where the other process tries to load before the commit occurs. Considering the workers get their conf via an HTTP connection, is there caching occurring at any layer?" – bimsapi