
I am making a module that uploads records from an Excel file to a database. The records are just phone numbers. Here's my code:

        $file = Input::file('file');
        Excel::load($file, function($reader) {
            // Getting all results
            $results = $reader->get()->toArray();
            //var_dump($results);exit;
            foreach ($results as $key => $value) {
                $phone = new Phone();
                $phone->msisdn          =  $value['msisdn'];
                $phone->save();
            }
        });

I'm using https://github.com/Maatwebsite/Laravel-Excel to read the Excel file. It works, but 20,000 records take about 20 minutes to upload. Is there a way to do it faster? I know it also depends on the server, but are there other factors? I'm using MySQL.

Thanks

20,000 in 20 minutes seems very slow. Can you log how long each operation takes to see where the bottleneck is? - Halcyon
Can you not save the (relevant bits of the) Excel file as a CSV and then use LOAD DATA or mysqlimport? - eggyal
The main issue here is that you're inserting in a loop. Instead of using Eloquent, go for the Query Builder. Here is an example to get you going. - Andrei
The problem is almost certainly on the Excel side of things. That package uses PHPExcel, which is an incredible package, but extremely resource intensive. This question asks about different packages for reading Excel files in PHP. I haven't researched these, so I can't speak for any of them. If you can save the Excel file as CSV, that's going to give you the best performance - Kryten
I would think bulk inserts or LOAD DATA INFILE are your best bets here. I would also think you could save time by not reading the entire file into memory if going the bulk insert route. Just read each line of the file and keep building a bulk insert query, triggering the actual insert when the query reaches the number of records you want to insert at a time. I definitely would not take the extra step of instantiating a Phone object for each insert unless you need to do something like validate/transform the data before making the insert. - Mike Brant
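The streaming bulk-insert approach from the comments above could be sketched roughly like this, assuming the sheet has been exported to CSV and a `phones` table with an `msisdn` column exists (the file path and batch size of 500 are placeholders, not from the question):

```php
<?php
// Sketch: stream a CSV export line by line and insert in batches,
// instead of one Eloquent save() per row. Each DB::table()->insert()
// call with an array of rows issues a single multi-row INSERT.
use Illuminate\Support\Facades\DB;

$handle = fopen(storage_path('app/phones.csv'), 'r'); // hypothetical path
$batch  = [];

while (($row = fgetcsv($handle)) !== false) {
    $batch[] = ['msisdn' => $row[0]];
    if (count($batch) >= 500) {
        DB::table('phones')->insert($batch); // one INSERT for 500 rows
        $batch = [];
    }
}

if ($batch !== []) {
    DB::table('phones')->insert($batch); // flush the remainder
}
fclose($handle);
```

This keeps memory flat regardless of file size and cuts the round trips to MySQL from 20,000 down to about 40.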

2 Answers


Per their documentation, you can chunk the results, since you're working with such a large file. That way you read the file in chunks rather than loading the whole thing into memory.

http://www.maatwebsite.nl/laravel-excel/docs/import#chunk
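Based on the linked docs for Laravel Excel 2.x, that would look something like this (the chunk size of 250 is arbitrary):

```php
<?php
// Sketch: read the sheet in chunks of 250 rows so the whole file is
// never held in memory at once (Laravel Excel 2.x chunk filter).
Excel::filter('chunk')->load($file)->chunk(250, function ($results) {
    foreach ($results as $row) {
        $phone = new Phone();
        $phone->msisdn = $row->msisdn;
        $phone->save();
    }
});
```

Note this only addresses memory, not speed: it is still one INSERT per row, so combining it with a bulk insert would help more.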


You could try this faster alternative (https://github.com/rap2hpoutre/fast-excel):

(new FastExcel)->import($file, function ($line) {
    $phone = new Phone();
    $phone->msisdn = $line['msisdn'];
    $phone->save();
});
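Since `import()` collects whatever the callback returns, you could also combine FastExcel with a chunked bulk insert instead of saving row by row. A sketch, assuming a `phones` table with an `msisdn` column (the chunk size of 500 is arbitrary):

```php
<?php
// Sketch: map each line to a plain array, then bulk-insert in chunks
// rather than calling save() once per row.
use Illuminate\Support\Facades\DB;
use Rap2hpoutre\FastExcel\FastExcel;

$rows = (new FastExcel)->import($file, function ($line) {
    return ['msisdn' => $line['msisdn']];
});

foreach ($rows->chunk(500) as $chunk) {
    DB::table('phones')->insert($chunk->values()->toArray());
}
```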