Test script
OK, I now have a test script that sets up the following:
- Creates a database connection with a fake table name prefix.
- Creates copies of the 'context', 'filter_active' and 'filter_config' tables from the definitions in install.xml (useful method in HEAD if you are writing test code: $dbman->install_one_table_from_xmldb_file).
- Creates 100 course category contexts, for each one choosing a parent randomly from the previously created contexts and the system context.
- Creates 1000 course contexts, picking a parent category randomly. (Seems to give an average of 5 levels of nesting, max about 10.)
- Creates 10000 module contexts, picking a parent course randomly.
- Randomly chooses a system level setting (disabled, off, on) for each filter.
- Randomly sets up 50000 local overrides (to on or off, filter and context chosen randomly).
- Randomly sets up 50000 random local config variables (filter and context chosen randomly, variable name chosen from a short list, value generated by random_string(rand(20, 40))).
(Setting this up takes a long time!)
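The random-hierarchy part of that setup can be sketched as follows. This is a simplified, hypothetical version (the helper names are mine, and the real script inserts rows via $DB rather than building an array), but it shows the "pick a parent at random from everything created so far" idea:

```php
<?php
// Sketch of the random context hierarchy described above: each new
// context picks its parent at random from everything created so far,
// including the system context. Hypothetical helper, not the real script.
function build_random_hierarchy(int $numcontexts): array {
    $parents = array(1 => null); // id 1 stands in for the system context.
    for ($id = 2; $id <= $numcontexts + 1; $id++) {
        $existing = array_keys($parents);
        $parents[$id] = $existing[array_rand($existing)];
    }
    return $parents;
}

// Depth of a context: number of ancestors up to the system context.
function context_depth(array $parents, int $id): int {
    $depth = 0;
    while ($parents[$id] !== null) {
        $id = $parents[$id];
        $depth++;
    }
    return $depth;
}

$parents = build_random_hierarchy(100);
// Every parent is created before its child, so parentid < id always holds.
```

Because each new context can attach anywhere in the existing tree, the depth distribution skews shallow, which is consistent with the "average 5 levels, max about 10" observation above.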
Then I have a test harness that basically looks like:
$contexts = $DB->get_records('context');
$starttime = microtime(true);
for ($j = 0; $j < $numcalls; $j++) {
    $function($contexts[array_rand($contexts)]);
}
$duration = microtime(true) - $starttime;
I call that with three functions:
- noop($context) {}. It turns out that randomly picking an element from an array in PHP is quite slow, so we have to adjust for that in the real timing runs.
- simple_get_record_by_id($context) { $DB->get_record('context', array('id' => $context->id)); }
- filter_get_active_in_context - which is the function we are actually worried about.
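The adjustment mentioned for noop amounts to subtracting the baseline run's duration before computing a rate. A minimal sketch (the figures plugged in are taken from the output in the Results section below):

```php
<?php
// Sketch of the noop-baseline adjustment described above: the cost of
// array_rand and the loop itself is measured once (the noop run) and
// subtracted from every other measurement before computing a rate.
function adjusted_rate(float $duration, float $baseline, int $numcalls): float {
    return $numcalls / ($duration - $baseline);
}

// With the figures from the Results section: 1.788s cumulative for
// simple_get_record_by_id against a 0.596s noop baseline.
echo round(adjusted_rate(1.788, 0.596, 1000)) . "\n"; // 839
```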
Running the tests
The test script is attached if you wish to review it.
To run it, you need to
- save it to lib/simpletest
- apply the patch series from MDL-7336 to a HEAD checkout
- set $CFG->unittestprefix to something safe in your config.php
- go to the URL .../lib/simpletest/filtersettingsperformancetester.php
- Click 'Set up test tables'
- Wait
- Click 'Run tests'
- Click 'Drop test tables' (or don't bother)
You can, of course, play with the numbers in the script.
Results
The simple summary is that filter_get_active_in_context seems to take only about twice as long as simple_get_record_by_id! I was expecting it to be worse than that.
In terms of scalability, that simple summary seems pretty stable. Dropping the test dataset size by a factor of 10 pushes the ratio closer to 2.5, and it seems very insensitive to the density of local overrides and local config. That is all on Postgres running on my desktop machine; I guess I should try an install on MySQL now. Typical output is copied and pasted below.
Time for 1000 calls to noop: 0.596s (0.596 - 0.000s) which is 1678 calls per second.
Time for 1000 calls to simple_get_record_by_id: 1.192s (1.788 - 0.596s) which is 839 calls per second.
Time for 1000 calls to filter_get_active_in_context: 1.821s (2.417 - 0.596s) which is 549 calls per second.
Total of 11101 contexts, 41721 filter_active and 48160 filter_config rows in the database.