Question
So, I have an Excel file with 28k rows.
I want to load it and then insert it into the database, but the request just stops (blank page).
I've tried reducing it to 5k rows, and that worked, but it's too slow.
I also tried using chunk, with only 5k rows, but I got "Maximum execution time of 300 seconds exceeded".
Here's the code:
Excel::filter('chunk')
    ->load(storage_path('excel/exports/') . $fileName)
    ->chunk(1000, function($results)
    {
        // iterate over each row in the current chunk
        foreach ($results as $row)
        {
            // even nothing to do
        }
    });
Are 5k rows really that much to handle?
Or am I doing it wrong?
Thanks.
Answer 1:
You're doing it by the book (using chunk, for example).
But 28k rows is a lot of data to handle.
You can raise your maximum execution time.
See: http://php.net/manual/en/function.set-time-limit.php
bool set_time_limit ( int $seconds )
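For example, a minimal sketch that raises the limit at the start of the import request (the 600-second value is illustrative, not from the original post):

// Allow this script up to 10 minutes; set_time_limit(0) would remove the limit entirely.
set_time_limit(600);

Excel::filter('chunk')->load(storage_path('excel/exports/') . $fileName)->chunk(1000, function($results)
{
    foreach ($results as $row)
    {
        // insert $row into the database here
    }
});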
Hope this will help.
Answer 2:
Using chunk is great for preventing memory exhaustion, but it slows down your execution time.
Increase the chunk size if you want it to run faster, but be careful with that.
Note: at the end of every chunk, your application re-reads the file, and that takes time.
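For instance, a sketch of the same import with a larger chunk size (5000 is an arbitrary trade-off value; choose it based on your memory budget):

// Fewer, larger chunks mean fewer re-reads of the file, at the cost of more memory per pass.
Excel::filter('chunk')->load(storage_path('excel/exports/') . $fileName)->chunk(5000, function($results)
{
    foreach ($results as $row)
    {
        // process/insert $row
    }
});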
Source: https://stackoverflow.com/questions/30121055/laravel-excel-massive-import