Right way to create a report that consists of ~100k records


Hey Guys,

I hope someone can answer it.
I need to create a report with around 100k records in CSV, Excel, and PDF format.
I only have 4GB of RAM; so far I can run the report with around 20k records, but beyond that the script exhausts memory.
My method is to query all the records, put them in an array, pass that to the view, and save it to CSV/Excel/PDF.
Which isn't a good way, I think, since you're putting all the records in one array.
I was thinking about paginating the query in the view when saving it to a CSV/Excel file. Let's say 5k per query.
So is it possible to do the query in the view? Or is there a better way to handle this situation?
I'm using CakePHP 3, by the way.



Well the answer depends on many factors.

Based on what you wrote, I would add a limit parameter to the query in the model and call it from a Cake shell via a cronjob.
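As a rough sketch of that idea, a CakePHP 3 shell could page through the table in fixed-size chunks and write each chunk out before fetching the next, so memory stays flat. The `Records` table, the output path, and the 5k page size are placeholders, not from your app:

```php
<?php
// src/Shell/ReportShell.php — hypothetical shell, run via `bin/cake report`
namespace App\Shell;

use Cake\Console\Shell;

class ReportShell extends Shell
{
    public function main()
    {
        $this->loadModel('Records'); // placeholder table name
        $pageSize = 5000;
        $page = 1;

        $fh = fopen(TMP . 'report.csv', 'w');
        do {
            $rows = $this->Records->find()
                ->limit($pageSize)
                ->page($page)
                ->enableHydration(false) // plain arrays, cheaper than entities
                ->toArray();
            foreach ($rows as $row) {
                fputcsv($fh, $row);
            }
            $page++;
        } while (count($rows) === $pageSize); // a short page means we're done
        fclose($fh);
    }
}
```

Running it from cron means the web request never has to hold the report at all; the user just downloads the finished file.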


Start with the simple case of the CSV file.

In that case, you can loop through the query results and stream straight to the CSV, perhaps bypassing the view entirely.
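One way that could look in a controller action (a sketch, not a drop-in — the column names are placeholders, and the exact response API varies across CakePHP 3.x minor versions): iterate the query object directly so rows are pulled from the result set one at a time, and write them to `php://output`:

```php
<?php
// Hypothetical controller action — streams CSV without buffering all rows.
public function exportCsv()
{
    $this->autoRender = false; // bypass the view entirely
    $this->response = $this->response
        ->withType('csv')
        ->withDownload('report.csv');

    $query = $this->Records->find()->enableHydration(false);

    $out = fopen('php://output', 'w');
    fputcsv($out, ['id', 'name', 'created']); // header row (placeholder columns)
    foreach ($query as $row) { // iterates the result set, not a loaded array
        fputcsv($out, [$row['id'], $row['name'], $row['created']]);
    }
    fclose($out);

    return $this->response;
}
```

The key point is the `foreach` over `$query` itself: nothing calls `toArray()`, so PHP never holds all 100k rows at once.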

With Excel, if it really has to be a .xlsx file, then you're at the mercy of your Excel library. Note that the legacy .xls format is capped at 65,536 rows; .xlsx allows about a million, but many PHP libraries build the whole spreadsheet in memory. If you can, just export as CSV.
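If .xlsx is mandatory, a streaming writer keeps memory flat by flushing rows to disk as it goes. A sketch assuming the box/spout 3.x API (`$query` being an unhydrated query as in the CSV example above; the output path is a placeholder):

```php
<?php
use Box\Spout\Writer\Common\Creator\WriterEntityFactory;

$writer = WriterEntityFactory::createXLSXWriter();
$writer->openToFile(TMP . 'report.xlsx'); // rows are flushed as they are added

foreach ($query as $row) { // unbuffered iteration, one row in memory at a time
    $writer->addRow(WriterEntityFactory::createRowFromArray(array_values($row)));
}
$writer->close();
```

Any writer that appends row-by-row will do; the thing to avoid is a library that wants the full data set as one array up front.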

PDF might be in the same boat as the CSV. Use FPDF in the controller, loop through the results (not a pre-loaded array), and stream to the client as soon as you can.
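A minimal FPDF loop along those lines might look like this (column widths and field names are placeholders; `$query` is the same unbuffered query as before):

```php
<?php
require 'fpdf.php'; // FPDF is a plain include, not namespaced

$pdf = new FPDF();
$pdf->AddPage();
$pdf->SetFont('Arial', '', 9);

foreach ($query as $row) { // iterate the query, not a pre-loaded array
    $pdf->Cell(20, 6, $row['id'], 1);     // bordered cells, one row per record
    $pdf->Cell(80, 6, $row['name'], 1);
    $pdf->Ln(); // move to the next line; auto page break adds pages as needed
}

$pdf->Output('D', 'report.pdf'); // 'D' sends the file to the browser as a download
```

Fair warning: FPDF itself builds the document in memory before `Output()`, so a 100k-row PDF will still be heavy. It may be worth asking whether anyone actually wants a 100k-row PDF, or whether CSV covers that use case.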


I answered a similar question: