We're running 2.7.7+ and have found that exporting large sets of logs to an Excel file causes the CPU to spike and makes Moodle unresponsive. As an example, I tested this on a course that had 247 pages of logs. Exporting all of them to Excel took 3.5 minutes, and during that time CPU utilization was 100% and the site was not accessible (this is a test site). The resulting spreadsheet had approximately 30,000 entries. Larger sets of logs eventually time out.
As this is hosted on a VM, we can request more resources, but I wonder whether the CPU spike and the amount of time taken are expected (for reference, we host one instance of Moodle on a VM with one dedicated CPU and 4 GB of RAM; the database is on a separate VM). That is, does this sound like it could be a configuration issue? We have all recommended settings applied, including opcache.
Essentially it's a spot of poor design. The entire selected log data is loaded into an array in memory, and only then is the export plugin called.
So if the selected data is large (e.g. all the logs for your course), it exceeds memory limits. It needs some work... unfortunately, it won't be a trivial fix.
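To make the problem concrete, here is a minimal sketch of the two patterns, not Moodle's actual code (Moodle is PHP; this is Python, and all function and field names are invented for illustration). The first function mirrors the current design, materialising every row before the export writer sees any of them, so peak memory grows with the number of log entries. The second writes each row as it is fetched, so only the output buffer (which in a web app would be the response stream) grows:

```python
import csv
import io

def fetch_log_rows(n):
    # Stand-in for a database cursor yielding log records one at a time.
    for i in range(n):
        yield {"id": i, "action": "viewed", "course": "demo"}

FIELDS = ["id", "action", "course"]

def export_all_at_once(n):
    # Pattern described above: build the entire array in memory first,
    # then hand it to the export writer. Peak memory is O(n) rows.
    rows = list(fetch_log_rows(n))
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

def export_streaming(n):
    # Alternative: write each row as it arrives from the cursor, so the
    # row set is never held in memory all at once.
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=FIELDS)
    writer.writeheader()
    for row in fetch_log_rows(n):
        writer.writerow(row)
    return buf.getvalue()
```

Both produce identical output; the difference is only where the memory goes, which is why a fix along these lines touches the report code and the export plugin interface rather than being a one-line change.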
Thanks to you both! I thought I had searched the site thoroughly but evidently hadn't. We are looking into increasing system resources, and I will report back with how much of a difference it makes.