I have a large number of items (1M+) that I want to delete from a database. I fork a background job to take care of that, so that the user won't have to wait for it to finish.
As Kelvin Jones points out, the reason a seemingly random number of items is being deleted is that you're deleting records as you page through them.
`chunk` simply uses offset & limit to "paginate" through your table. But if you delete 100 records from page 1 (IDs 1-100) and then go to page 2, you're actually now skipping IDs 101-200 and jumping straight to 201-300.
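To make the skipping concrete, here is a minimal sketch of what offset/limit paging boils down to, using the same 100-per-page numbers. This is just the mechanics, not Laravel's actual `chunk()` internals.

```php
// Sketch of offset/limit paging with pages of 100 (assumes an
// auto-incrementing `id`). Not Laravel's real implementation.
$page = 0;
do {
    $posts = Post::where('arch_id', $posts_archive->id)
        ->orderBy('id')
        ->offset($page * 100)   // page 2 asks for rows 100-199 of whatever is *left*
        ->limit(100)
        ->get();

    foreach ($posts as $post) {
        $post->delete();        // deleting page 1 shifts all remaining rows down,
    }                           // so the next offset lands on IDs 201-300, not 101-200

    $page++;
} while ($posts->count() === 100);
```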
`chunkById` is a way around this:
```php
Post::where('arch_id', $posts_archive->id)->chunkById(1000, function ($posts) {
    // go through the collection and delete every post
    foreach ($posts as $post) {
        $post->delete();
    }
});
```
Literally just replace the method name. Now, instead of using offset & limit to paginate, it looks at the maximum primary key from the first page (100), and the next page queries for IDs greater than 100. So page 2 now correctly gives you IDs 101-200 instead of 201-300.
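Conceptually, the ID-based (keyset) paging that `chunkById` performs looks something like the sketch below. Again, this is an illustration of the idea, not the library's actual code.

```php
// Sketch of keyset paging with pages of 100: each page starts after the
// highest ID seen on the previous page.
$lastId = 0;
do {
    $posts = Post::where('arch_id', $posts_archive->id)
        ->where('id', '>', $lastId)   // start strictly after the last ID we saw
        ->orderBy('id')
        ->limit(100)
        ->get();

    foreach ($posts as $post) {
        $post->delete();
    }

    // deleting rows can't shift the window: the next page is keyed off the
    // highest ID in this page rather than a row offset
    $lastId = $posts->max('id');
} while ($posts->count() === 100);
```

As a side note (my own variation, not part of the original answer): if you don't need Eloquent model events to fire for each post, you can delete each chunk with a single query, which is considerably faster when you're working through millions of rows:

```php
Post::where('arch_id', $posts_archive->id)->chunkById(1000, function ($posts) {
    // one DELETE per chunk; note this bypasses per-model events and observers
    Post::whereIn('id', $posts->pluck('id'))->delete();
});
```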