I've just done a Moodle 3.4.4 to 3.5.1 upgrade on a test server with a copy of the Open University live data. We have developer debug on, so it shows the timing of each upgrade step, and I thought it was interesting that only a few upgrade steps take most of the time.
Total time with our data was 7 minutes 9 seconds, and every individual upgrade step took under 1 second* except these ones:
System
2018-08-23T14:01:30.8855570Z ++ 2017121200: Success (1.02 seconds) ++
2018-08-23T14:01:42.8998230Z ++ 2017121900: Success (12.02 seconds) ++
2018-08-23T14:03:06.8454880Z ++ 2018020500: Success (83.93 seconds) ++
2018-08-23T14:03:16.8567460Z ++ 2018022800.02: Success (9.91 seconds) ++
mod_quiz
2018-08-23T14:07:42.7648640Z ++ 2018020701: Success (236.75 seconds) ++
So together, these five steps take about 80% of the time of the upgrade.
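For anyone who wants to do the same check on their own upgrade output, the timing lines above are easy to filter mechanically. Here is a minimal sketch, assuming the log format shown in the excerpt (`++ <version>: Success (<n> seconds) ++`); the threshold and function name are my own choices:

```python
import re

# Matches Moodle developer-debug upgrade log lines such as:
#   2018-08-23T14:01:42.8998230Z ++ 2017121900: Success (12.02 seconds) ++
# (format assumed from the excerpt above).
STEP_RE = re.compile(r'\+\+ (?P<step>[\d.]+): Success \((?P<secs>[\d.]+) seconds\) \+\+')

def slow_steps(lines, threshold=1.0):
    """Return (step, seconds) pairs slower than threshold, slowest first."""
    steps = []
    for line in lines:
        m = STEP_RE.search(line)
        if m:
            steps.append((m.group('step'), float(m.group('secs'))))
    return sorted((p for p in steps if p[1] > threshold),
                  key=lambda p: p[1], reverse=True)

log = """\
2018-08-23T14:01:30.8855570Z ++ 2017121200: Success (1.02 seconds) ++
2018-08-23T14:03:06.8454880Z ++ 2018020500: Success (83.93 seconds) ++
"""
print(slow_steps(log.splitlines()))
# -> [('2018020500', 83.93), ('2017121200', 1.02)]
```

Running it over a full upgrade log gives you the same "few steps dominate" picture at a glance.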
The mod_quiz one is about changes to random questions, and the 83.93-second system change is about top-level question categories.
These times depend heavily on our data: obviously there could be a slow upgrade step somewhere else in a module we don't use much, in which case it wouldn't show up for us. The quiz one is slow because we have about 32,000 random questions, and I guess we also have a lot of question categories for the other one.
The times are acceptable for us; 7 minutes for a major version upgrade is not bad.
However, I guess I'm kind of wondering:
- Does anybody else - at Moodle or elsewhere - check these times against sites with lots of data?
- Any chance of doing this prior to a major release, so that problematic steps can be identified and maybe optimised before it goes public? (Not saying there's necessarily room for optimising these particular ones, maybe there isn't.)
- Should potentially slow steps be called out in the release notes maybe? Like, 'the upgrade might be slow if you have a lot of random questions - on a site with 32,000 random questions, this added approximately 4 minutes to the upgrade time'.
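To make that kind of release note concrete, the figures from this post give a rough per-row rate you can extrapolate from. A back-of-envelope sketch (the linear-scaling assumption and the hypothetical larger site size are mine, not measured):

```python
# Figures from this post: the mod_quiz 2018020701 step took
# 236.75 seconds on a site with ~32,000 random questions.
step_seconds = 236.75
random_questions = 32_000

per_question_ms = step_seconds / random_questions * 1000
print(f"~{per_question_ms:.2f} ms per random question")  # ~7.40 ms

# Rough extrapolation to a hypothetical larger site,
# assuming the step scales linearly with question count.
bigger_site = 100_000
est_minutes = step_seconds / random_questions * bigger_site / 60
print(f"~{est_minutes:.1f} minutes for {bigger_site:,} random questions")
```

Of course the step may not scale linearly in practice, but even a crude estimate like this in the release notes would let admins plan their maintenance window.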