Slow SQL query when selecting 1000 rows per page over 80 pages (loops)

To push data to a Solr server, we select 1000 postings in one SQL query and loop about 80 times to fetch the next 1000 postings each time.

When the script starts, the selects are fast, but with every next page the SQL query gets slower.

    $this->loadModel('Postings');

    $limit = 1000;

    $where = [
        'Postings.source' => $source,
        'Postings.approved' => 1,
        'Postings.deleted' => 0,
    ];
    if ($source === null) {
        unset($where['Postings.source']);
    }
    $count = $this->Postings
        ->find()
        ->where($where)
        ->count();

    $num_pages = $count <= $limit ? 1 : ceil($count/$limit);
    $page = 1;

    while ($page <= $num_pages) {
        $this->out($page.'/'. $num_pages);
        $postings = $this->Postings
            ->find()
            ->select([
                'Postings.id',
                'Postings.title',
                // ....
            ])
            ->where($where)
            ->contain([
                // ...
            ])
            ->limit($limit)
            ->page($page)
            ->toArray();
        // $this->push($postings);
        $postings = null;
        $page++;
    }
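
From what I can tell by dumping the generated SQL with $query->sql(), page() combined with limit() is turned into a plain LIMIT ... OFFSET ... clause, so by page 80 MariaDB has to walk past roughly 79,000 rows before it returns the next 1,000, which would explain why each page gets slower:

    // Rough check of what the ORM generates (Query::sql()); page 80 with
    // limit 1000 should end in something like: LIMIT 1000 OFFSET 79000
    $query = $this->Postings
        ->find()
        ->where($where)
        ->limit($limit)
        ->page(80);
    $this->out($query->sql());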

Any hints, ideas, or solutions?

CakePHP 3.6
PHP 7.2
MariaDB

As you can see, we are doing pagination because we have over 80,000 records, and we must push that data to a remote Solr server daily.

Nope, limit() and page() do all the work.

https://book.cakephp.org/3.0/en/orm/query-builder.html

To limit the number of rows or set the row offset you can use the limit() and page() methods:

// Fetch rows 50 to 100
$query = $articles->find()
    ->limit(50)
    ->page(2);
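
Since page() apparently boils down to an OFFSET, the only workaround we could come up with so far is a keyset-style loop that seeks past the last processed id instead of paging by offset. This is just a rough sketch and assumes Postings.id is an auto-incrementing integer primary key; is this the right direction?

    $lastId = 0;
    do {
        $postings = $this->Postings
            ->find()
            ->select([
                'Postings.id',
                'Postings.title',
                // ....
            ])
            ->where($where)
            ->where(['Postings.id >' => $lastId]) // seek past the previous batch
            ->order(['Postings.id' => 'ASC'])
            // ->contain([...]) as in the original query
            ->limit($limit)
            ->toArray();

        if (!empty($postings)) {
            // $this->push($postings);
            $lastId = end($postings)->id;
        }
    } while (count($postings) === $limit);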