
I am using https://docs.laravel-excel.com/3.1/exports and have an issue exporting 500,000 rows. I am using a queue to handle the export. It works fine for 10K records, but when dealing with 500,000 records and running php artisan queue:work --timeout=86000 to execute the job, nothing happens.

Any idea how to make Maatwebsite Excel work for a 500,000-record export? I have attached a snapshot of my code.

My controller

public function getDownload(Request $request)
{
    (new TagsExport)->queue('exports/tags' . date('Y-m-d H_i') . '.xlsx');

    return back()->withSuccess('Export started!');
}

My TagsExport Class

<?php

namespace App\Admin\Exports;

use Maatwebsite\Excel\Concerns\WithMultipleSheets;
use Maatwebsite\Excel\Concerns\Exportable;
use Maatwebsite\Excel\Concerns\FromQuery;
use Maatwebsite\Excel\Concerns\WithTitle;
use Maatwebsite\Excel\Concerns\WithHeadings;
use Maatwebsite\Excel\Concerns\WithMapping;
use Maatwebsite\Excel\Concerns\WithCustomQuerySize;

use Tag;

class TagsExport implements FromQuery, WithTitle, WithHeadings, WithMapping, WithCustomQuerySize
{
    use Exportable;

    /**
     * Get the sheet title
     *
     * @return string
     */
    public function title(): string
    {
        return 'Tags';
    }

    /**
     * Get the header rows
     *
     * @return array
     */
    public function headings(): array
    {
        return [
            'id',
            'Name',
        ];
    }


    public function query()
    {
        return Tag::query();
    }

    public function querySize(): int
    {
        // Provide the row count up front so the queued export can split the query into chunk jobs.
        return Tag::query()->count();
    }

    public function chunkSize(): int
    {
        return 1000;
    }

    /**
     * Map a single row
     *
     * @param  \Tag  $tag
     * @return array
     */
    public function map($tag): array
    {
        return [
            $tag->id,
            $tag->name,
        ];
    }
}

1 Answer


If your code runs fine with 10,000 records but breaks down at 500,000 rows, it is probably just taking far longer than you expect to generate the whole export file. It looks to you like nothing is happening, but the queue simply has not had enough time to process it all yet.
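
If you happen to be on the database queue driver (just an assumption; adapt for Redis, SQS, etc.), a quick way to confirm the export is still pending rather than silently dead is to look at the jobs and failed_jobs tables, for example from php artisan tinker:

// run these inside php artisan tinker (database queue driver assumed)
DB::table('jobs')->count();        // chunk jobs still waiting to run
DB::table('failed_jobs')->count(); // jobs that died (timeout, out of memory, ...)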

Do you really need one huge single file? Can't you split your dump across multiple sheets? A rough sketch of how that might look follows the ranges below.

[1 ~ 25000]
[25001 ~ 50000]
[50001 ~ 100000]
And so on...
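
Your export class already imports WithMultipleSheets, so a wrapper export could hand out one sheet per slice. The sketch below is only an illustration: TagsMultiSheetExport and TagsSheetExport are made-up class names, and it assumes tags has an auto-incrementing id. Each sheet constrains its query with whereBetween instead of skip/take, because a queued FromQuery export already applies its own limit/offset per chunk.

<?php

namespace App\Admin\Exports;

use Maatwebsite\Excel\Concerns\Exportable;
use Maatwebsite\Excel\Concerns\FromQuery;
use Maatwebsite\Excel\Concerns\WithMultipleSheets;
use Maatwebsite\Excel\Concerns\WithTitle;

use Tag;

// Hypothetical wrapper export: one sheet per 25,000-id slice of the tags table.
class TagsMultiSheetExport implements WithMultipleSheets
{
    use Exportable;

    public function sheets(): array
    {
        $sheets = [];

        $min = Tag::query()->min('id');
        $max = Tag::query()->max('id');

        if ($min === null) {
            return $sheets; // nothing to export
        }

        // Gaps in the id sequence only mean some sheets hold fewer rows.
        for ($from = $min; $from <= $max; $from += 25000) {
            $sheets[] = new TagsSheetExport($from, $from + 24999);
        }

        return $sheets;
    }
}

// Hypothetical per-sheet export: exports only its own id range.
class TagsSheetExport implements FromQuery, WithTitle
{
    private $from;
    private $to;

    public function __construct(int $from, int $to)
    {
        $this->from = $from;
        $this->to   = $to;
    }

    public function query()
    {
        return Tag::query()->whereBetween('id', [$this->from, $this->to]);
    }

    public function title(): string
    {
        return "Tags {$this->from}-{$this->to}";
    }
}

(Each class would normally live in its own file.) The per-sheet class can also pull in the same WithHeadings and WithMapping concerns you already have in TagsExport, and the wrapper is queued exactly like before, e.g. (new TagsMultiSheetExport)->queue('exports/tags.xlsx').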

It might also be a problem with the amount of memory PHP is allowed to use (not enough memory in this case). Try raising it in the php.ini and give your worker another try.
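
For example (the value is only illustrative, tune it to your server), in the php.ini used by the CLI:

memory_limit = 1024M

Note that the queue worker also has its own --memory option (128 MB by default) and stops itself once it goes above that, so it can be worth raising it as well:

php artisan queue:work --timeout=86000 --memory=1024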

Hope it helps you =)