I have a bit of a strange problem that I'm not quite able to explain.
I have a Django project with some old, stale objects lying around. For example, let's say my model looks something like this:
class blog_post(models.Model):
    user_account = models.ForeignKey('accounts.Account')
    text = models.CharField(max_length=255)
    authors = models.ManyToManyField('author')
    created = models.DateTimeField(blank=True, null=True)
This is not an exact copy of my model, but is close enough.
I've created a management command that builds ordered querysets of these objects and then deletes them in batches with a Paginator.
My command looks something like this:
import datetime

from django.core.paginator import Paginator
from django.utils import timezone

all_accounts = Account.objects.all()
for act in all_accounts.iterator():
    stale_objects = blog_post.objects.filter(
        user_account=act,
        created=timezone.now() - datetime.timedelta(days=7))
    paginator = Paginator(stale_objects.order_by('id'), 100)
    for page in range(1, paginator.num_pages + 1):
        page_stale_objects = blog_post.objects.filter(
            id__in=paginator.page(page).object_list.values_list('id', flat=True))
        page_stale_objects.delete()
The problem I'm having is that after I run this command, there are still objects that match the queryset's parameters but were not deleted, so I have to run the command three or more times to find and remove all of them.
I first figured that my date range fell awkwardly on the edge of the DateTime, so the command was missing objects created shortly after the one-week cutoff. That is not the case: I've removed the created=... filter from the queryset entirely and get the same results.
Why are my querysets not catching all the objects the first time this command runs? The table is not excessively large, at most ~30,000 rows.
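As an aside on the created=... filter: an exact-equality match against a microsecond-precision timestamp will almost never hit anything, so presumably a range lookup such as created__lte (Django's less-than-or-equal lookup; an assumption about the intended filter) was meant. A plain-Python sketch of the difference:

```python
import datetime

cutoff = datetime.datetime(2020, 1, 1, 12, 0, 0)
# A row created even one microsecond before the cutoff...
created = cutoff - datetime.timedelta(microseconds=1)

print(created == cutoff)  # False: exact equality misses it
print(created <= cutoff)  # True: a <= comparison (i.e. __lte) catches it
```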
When you delete page_stale_objects, will only one page be deleted per run (which would mean the number of objects you find exceeds the per_page value of the paginator)? – Max M
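The comment points at a likely culprit: Paginator slices a live queryset, and num_pages is computed once up front, but each delete() shrinks the underlying result set, so later page offsets skip over rows that have slid into earlier pages. A minimal plain-Python simulation of that interaction (delete_in_pages is a hypothetical stand-in for the command's inner loop, using a list instead of a queryset):

```python
def delete_in_pages(items, per_page):
    """Mimic paging over a result set that shrinks as rows are deleted."""
    num_pages = -(-len(items) // per_page)  # page count, computed once up front
    for page in range(1, num_pages + 1):
        start = (page - 1) * per_page
        # Each "page" is re-sliced against the already-shrunken data,
        # just as paginator.page(n) re-queries the live queryset.
        for item in items[start:start + per_page]:
            items.remove(item)
    return items  # whatever this pass failed to delete

survivors = delete_in_pages(list(range(1, 11)), per_page=3)
print(survivors)  # [4, 5, 6, 10] -- 4 of the 10 "stale" rows survive the first run
```

Running the same pass again on the survivors leaves [10], and a third run finally empties the list, which matches the observed need to run the command three or more times.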