One of my friends has the same trouble:
I want to read data from a CSV file in chunks in Laravel. I am using this library: https://github.com/Maatwebsite/Laravel-Excel. My CSV file contains a large amount of data, and I am using this code:

Excel::filter('chunk')->load('sample.csv')->chunk(50, function($results) {
    // insert records into the database in chunks
}, false);

The code above reads the data in chunks, but it takes a long time and eventually fails with "Maximum execution time of 30 seconds exceeded". I know that increasing max_execution_time would suppress the error, but the import would still be slow. I don't want to raise max_execution_time in php.ini. Instead, after each chunk I'd like the cache to be cleared, the page to refresh, or some better mechanism to kick in, and the page should display which records (from-to) have been imported.
Reading large files: been there, done that. The problem is that most libraries load the entire document into memory, which blocks the request. If the file is large, it may not finish parsing before PHP runs out of memory or the execution time expires. I suggest one of two approaches.
In my experience with these kinds of tasks, I have had better results writing the code as an artisan command and then executing it either from the console or from within Laravel.
https://laravel.com/docs/5.2/artisan
Chunking does not mean PHP resets the execution timer each time a new chunk is processed. For that you need to make an Ajax call and wait for each chunk to finish before continuing with the next. If you choose this approach, you need to define a starting offset and a size limit and send those parameters with each call.
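As a minimal sketch of the server side of that Ajax approach: a plain-PHP helper (not the Laravel-Excel API; the function name and parameters are my own) that reads one chunk of rows given the `start` and `limit` the client sends with each call.

```php
<?php
// Read a single chunk of CSV rows, given a row offset and a size limit.
// Each Ajax call would pass the next $start until no rows come back.
function readCsvChunk(string $path, int $start, int $limit): array
{
    $file = new SplFileObject($path);
    $file->setFlags(SplFileObject::READ_CSV);

    $rows = [];
    $index = 0;
    foreach ($file as $row) {
        if (!is_array($row) || $row === [null]) {
            continue; // skip blank lines
        }
        if ($index++ < $start) {
            continue; // skip rows before the requested offset
        }
        $rows[] = $row;
        if (count($rows) === $limit) {
            break; // chunk is full
        }
    }
    return $rows;
}
```

The controller would insert the returned rows and respond with the next offset, which is also what lets you show the "records from-to imported" status the original poster asked for.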
I recommend the console command, especially if it's something an administrator will run. It also benefits from Laravel's nice progress bar.
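A rough sketch of what that command could look like (class name, table name, and column mapping are all assumptions; adapt them to your schema). The progress bar comes from the command's console output helper:

```php
<?php
// Sketch of an artisan command that imports a CSV in batches with a
// progress bar. Runs from the console, so the web request timeout
// never applies.
namespace App\Console\Commands;

use Illuminate\Console\Command;
use Illuminate\Support\Facades\DB;

class ImportCsv extends Command
{
    protected $signature = 'import:csv {path}';
    protected $description = 'Import a large CSV file in chunks';

    public function handle()
    {
        $path = $this->argument('path');
        $bar = $this->output->createProgressBar(count(file($path)));

        $file = new \SplFileObject($path);
        $file->setFlags(\SplFileObject::READ_CSV);

        $batch = [];
        foreach ($file as $row) {
            if (!is_array($row) || $row === [null]) {
                continue; // skip blank lines
            }
            // Assumed two-column CSV; map to your real columns.
            $batch[] = ['name' => $row[0], 'value' => $row[1]];
            if (count($batch) === 500) {
                DB::table('mytable')->insert($batch);
                $bar->advance(500);
                $batch = [];
            }
        }
        if ($batch) {
            DB::table('mytable')->insert($batch);
            $bar->advance(count($batch));
        }
        $bar->finish();
    }
}
```

You would then run it with something like `php artisan import:csv storage/app/sample.csv`.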
I recently had a similar problem. The way I tackled it was to break the large file into multiple smaller files and have an artisan command run every minute to process the smaller CSV files.
I documented how I went about it at daveismyname.blog/laravel-import-large-csv-file
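The splitting step can be sketched in plain PHP (this is my own minimal version, not the exact code from the blog post); the function name and file-naming scheme are illustrative:

```php
<?php
// Split a large CSV into files of $rowsPerFile lines each, so a
// scheduled artisan command can process them one at a time.
// Returns the number of part files written.
function splitCsv(string $path, string $outDir, int $rowsPerFile): int
{
    $in = fopen($path, 'r');
    $out = null;
    $part = 0;
    $count = 0;
    while (($line = fgets($in)) !== false) {
        if ($count % $rowsPerFile === 0) {
            if ($out) {
                fclose($out);
            }
            // part_001.csv, part_002.csv, ...
            $out = fopen(sprintf('%s/part_%03d.csv', $outDir, ++$part), 'w');
        }
        fwrite($out, $line);
        $count++;
    }
    if ($out) {
        fclose($out);
    }
    fclose($in);
    return $part;
}
```

A scheduled command (e.g. via `$schedule->command(...)->everyMinute()` in the console kernel) would then pick up and delete one part file per run.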
You could try loading the data from the CSV directly into MySQL with something like: LOAD DATA INFILE 'file.csv' INTO TABLE mytable; You would just use Laravel to build the query and point it at that file.
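A fuller version of that statement might look like the following (table and column names are examples; note that LOAD DATA INFILE requires the FILE privilege on the MySQL user, or LOAD DATA LOCAL INFILE with local_infile enabled, and in Laravel you would issue it through the raw PDO connection rather than the query builder):

```sql
LOAD DATA INFILE '/path/to/file.csv'
INTO TABLE mytable
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(name, value);
```

This pushes all the parsing into MySQL itself, which is typically far faster than inserting row by row from PHP.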
TJ
Building Sparkle ✨ for Laravel | Echo Labs | Curology
I'm with @emilmoe on this with regard to using an artisan command.
I've typically handled this with an async workflow that has worked well.
I've also used the approach @emilmoe mentioned of using Ajax to send chunk start and end identifiers back and forth.