Idea for Moodle performance metrics

by Tomasz Muras -

Hi All,

It's always difficult to talk about performance of any web application as it depends on many factors (hardware, software, usage patterns, data, ...). It's nearly impossible to compare one Moodle setup to another (and get meaningful results).

How about something like this:

we could create a special version of Moodle that anyone could install in their own environment, log in, and click a button to run some performance tests. That Moodle would come with users, courses, etc. pre-created, and would run tests like (just an example) submitting new forum posts. The results (such as the maximum number of forum posts added per second) would then be submitted to some central server, or simply posted on this forum.

This could give us some way of saying "on my hardware, I can handle X concurrent users per second; my Moodle Performance Score is ...".
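
The measuring harness for such a button could be quite small. Here is a minimal sketch in Python; `submit_forum_post` is a hypothetical stand-in (a real benchmark would drive the actual Moodle forum form), so only the throughput-measuring shape is shown:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def submit_forum_post(post_id):
    """Stand-in for a real Moodle forum submission.

    A real benchmark would POST to the site; here it is a stub that
    simulates ~1 ms of server work so the harness itself can be shown.
    """
    time.sleep(0.001)
    return post_id

def posts_per_second(total_posts=200, workers=10):
    """Submit total_posts posts across `workers` concurrent threads
    and return the achieved throughput in posts per second."""
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=workers) as pool:
        list(pool.map(submit_forum_post, range(total_posts)))
    elapsed = time.perf_counter() - start
    return total_posts / elapsed

if __name__ == "__main__":
    print(f"Posts per second: {posts_per_second():.0f}")
```

The single reported number (posts/second against a frozen data set) is what would make results comparable across installations.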

cheers,
Tomek

In reply to Tomasz Muras

Re: Idea for Moodle performance metrics

by Howard Miller -

I am somewhat ashamed to say that I have been left behind by the current automated testing in Moodle, but could some of these ideas be wrapped into that? Surely the testing already creates fake test data? The next step would be timing key events. For all I know, we are doing that already...

In reply to Tomasz Muras

Re: Idea for Moodle performance metrics

by David Monllaó -

Hi Tomasz,

At HQ we have been working in this direction and have released a tool to run performance tests using JMeter (https://github.com/moodlehq/moodle-performance-comparison).

This tool wraps all the steps required to compare Moodle sites' performance using the same data set, so you can compare different branches, different sets of site settings, database and web server changes...

The tool runs the same test plan twice and compares the results; you decide what you want to change between the two runs. To summarise, the whole process (which involves 3 different shell scripts):

  1. Installs a Moodle site
  2. Populates the database with courses, users, enrolments, resources and forums using static data sets (the data set size is the only variable) so we can also share results with other organizations
  3. Generates a JMeter test plan based on the system contents. The test plan mimics a student session; you can find more info in the project README, but basically it logs in, goes to a course, accesses a couple of resources, and reads and replies to a forum
  4. Backs up the database and the dataroot directory
  5. Runs the JMeter test plan
  6. Restores the database and the dataroot directory
  7. Here we introduce differences between the sites depending on what we want to compare:
    • Upgrades Moodle to a different branch (the tool can be configured to do this automatically) to compare a patch's performance, OR
    • You can log into the system and change site setting values (done manually) to see how they affect performance, OR
    • Tune services (more database connections, more threads...) OR
    • Whatever you can think of...
  8. Runs the test plan again
  9. Provides a web interface to compare results (charts, percentages, absolute numbers...)
    • The run results can also be downloaded and shared; they are PHP files including info about the run (site major version, commit, number of users, loops...)
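
The comparison in the last step boils down to percentage deltas between two runs of the same plan. A sketch of that calculation, with illustrative metric names and numbers (not real measurements):

```python
def compare_runs(before, after):
    """Return the per-metric percentage change between two benchmark runs.

    `before` and `after` map metric names to measured values, in the
    spirit of what the tool reports (db queries, memory usage, ...).
    Positive values mean the second run consumed more of the resource.
    """
    return {m: (after[m] - base) / base * 100.0 for m, base in before.items()}

# Illustrative numbers only:
run1 = {"db_queries": 180, "memory_mb": 42.0, "files_included": 520}
run2 = {"db_queries": 198, "memory_mb": 41.0, "files_included": 520}

for metric, pct in compare_runs(run1, run2).items():
    print(f"{metric}: {pct:+.1f}%")
```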

It currently reports about:

  • Database queries (we are working on adding the time spent running them)
  • Memory usage
  • Number of files included
  • Server load
  • Session size

Moodle's codebase includes a number of features that this tool uses to populate the site and generate the test plan (since Moodle 2.5.2+ Build: 20131004); they can also be used along with other tools you may have, or with other JMeter test plans you currently use.

The HQ integration team runs this tool on a daily basis to ensure that no performance regressions are integrated into core. You can find all the info in the project README (https://github.com/moodlehq/moodle-performance-comparison/blob/master/README.md). Feel free to comment, send feedback, or report issues on the project page; the tool was released recently and there are plans to extend its coverage to other widely used components and to report on other variables.

In reply to David Monllaó

Re: Idea for Moodle performance metrics

by Tomasz Muras -

Hi David,

Thanks for the detailed response. What I'm proposing here should definitely be related to the tool you described but I'm thinking about a different purpose.

I was thinking about giving any administrator an easy way to measure how well Moodle will perform in their setup. If the idea makes sense, it would bring us benefits (statistics) only if it's widely used across different environments, so it would have to be as simple to set up as possible. I was thinking about something like what you do, but with all data pre-generated.

For example, we would create a Moodle site with 20 courses and 100 users and use it as a baseline for testing. This would be a tagged, frozen data set, e.g. Moodle performance baseline version 1. If later on we discover there is some performance problem with Moodle installations that, say, have many categories, we would create another baseline and call it version 2.

A system administrator should be able to download the whole lot (data, database and source code) and install it on a specific environment. She would then simply click a button somewhere, get the performance results, and then scrap the installation. The things we could test could also include the number of megabytes/files hashed per second, the number of log entries inserted per second, etc.
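
A hashing micro-benchmark of that kind is easy to sketch. SHA-1 is used below because Moodle's file storage identifies file contents by SHA-1 hash; the sizes are arbitrary defaults, not values from the proposal:

```python
import hashlib
import time

def hash_throughput_mb_s(total_mb=64, chunk_mb=1):
    """Hash `total_mb` of in-memory data with SHA-1 and return
    the achieved throughput in megabytes per second."""
    chunk = b"\x00" * (chunk_mb * 1024 * 1024)
    digest = hashlib.sha1()
    start = time.perf_counter()
    for _ in range(total_mb // chunk_mb):
        digest.update(chunk)
    elapsed = time.perf_counter() - start
    return total_mb / elapsed

if __name__ == "__main__":
    print(f"{hash_throughput_mb_s():.0f} MB hashed per second")
```

A suite of such micro-benchmarks (hashing, log inserts, ...) plus the frozen data set would together produce the per-setup score described above.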

In general it would not be used to compare performance of two different settings / Moodle versions but to measure how fast your setup is.

Tomek

(Edited by Dan Poltawski to clean up excess whitespace - original submission Tuesday, 11 February 2014, 2:52 AM)

In reply to Tomasz Muras

Re: Idea for Moodle performance metrics

by Dan Poltawski -

In general it would not be used to compare performance of two different settings / Moodle versions but to measure how fast your setup is.

Hmm, perhaps I'm misunderstanding your objectives, but I'm pretty skeptical about this. It reminds me of Sam Marshall's performance perspective scripts, and I think that is a very blunt instrument to use for measurements.

Tuning Moodle (or any web application) for 'single threaded' performance doesn't really give much indication of wider scalability.

In reply to Dan Poltawski

Re: Idea for Moodle performance metrics

by sam marshall -

I definitely agree that there is little point in measuring single-threaded (or even single-server) performance. To illustrate Dan's point, consider two setups:

a) one machine running a web server and database server.

b) 4 machines of the same power running web servers, and another much faster machine running a database server.

Single-threaded performance will show a big advantage for setup (a) because there is no round-trip time for database queries (e.g. if a typical page makes 100 queries and there is a network round-trip time of 1ms per query, then doing it on the same machine will be 100ms faster). But the capacity of setup (b) is much higher because you now have five machines sharing the work.
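
The arithmetic above can be made explicit. The 200 ms of server-side work per page is an assumed figure for illustration; the 100 queries and 1 ms round trip are the numbers from the example:

```python
def page_time_ms(base_ms, queries, rtt_ms):
    """Time to serve one page: server-side work plus one network
    round trip per database query."""
    return base_ms + queries * rtt_ms

# Setup (a): web and database on one machine, negligible round trip.
single = page_time_ms(base_ms=200, queries=100, rtt_ms=0.0)
# Setup (b): separate database server, 1 ms round trip per query.
cluster = page_time_ms(base_ms=200, queries=100, rtt_ms=1.0)

# Single-threaded, setup (a) wins by exactly the 100 ms described above.
print(cluster - single)  # 100.0

# But capacity scales with the machines serving pages in parallel
# (assuming the database is not the bottleneck):
pages_per_sec_a = 1 * (1000 / single)    # one web+db machine
pages_per_sec_b = 4 * (1000 / cluster)   # four web servers
print(pages_per_sec_a, pages_per_sec_b)
```

So (a) looks faster per page, yet (b) sustains more pages per second overall, which is exactly why single-threaded numbers mislead.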

I believe that Moodle HQ either already do, or plan to do, performance testing comparisons on multi-server environments.

But just to defend myself, since my name was mentioned :-) - the original 'performance perspective' script was really intended simply to give a perspective on how slow different things are in PHP, i.e. that in PHP, basic things like function calls may be slow compared to higher-performance languages like Java, but they're still way faster than a database query. At the time I was encountering a bit of "omg, that makes an extra function call, you can't do that", and I was like, screw the function call, it makes a database query anyway, so who cares, and I wanted something to justify myself. :-)

--sam

In reply to sam marshall

Re: Idea for Moodle performance metrics

by David Monllaó -

Hi Sam,

One of the purposes of this tool is to allow all kinds of organisations to run performance tests on their own infrastructure without having to worry about creating a test plan, adding big data sets, preparing sites to always begin in the same state, warm-up processes... We can run (and we do run) performance tests on our own infrastructure, but the results change completely depending on many factors: network configuration, cache stores, number of servers, hardware, memcache, sessions, site settings... There are so many factors that we cannot test all the combinations, and the results we get can be completely different from yours. That's why we focused our efforts on creating a tool for you to run these tests yourself and tune your infrastructure and settings according to its results. All the results can be downloaded and shared, and we can create a knowledge base with them so that they can be helpful in future for other sysadmins.


In reply to David Monllaó

Re: Idea for Moodle performance metrics

by Visvanath Ratnaweera -
All these different approaches are justified because they answer different questions. IMO the common questions fall into two categories:

1. How does platform A compare to platform B for the same (standard) Moodle-load?

The platform being the combination of some hardware and a software stack. One may consider caching mechanisms also as part of the software stack.

2. How does Moodle version x compare to version y?

The idea here is to keep track of (relative) resource needs of Moodle versions for the same (standard) Moodle-load.

For example, if one needs to compare two machines (hardware), category 1 is the right one. Just keep the software stack and the caching mechanisms the same. Or, if the question is whether an upgrade will degrade the performance, type 2 is the right one.

The solution proposed by Tomasz falls into category 1, because it is bound to a particular Moodle version. Well, unless somebody provides this enhancement for every Moodle release.

Sam's "perspective" is also a rough approach to category 1. Rough because it does a (Moodle-independent) PHP-benchmark and "convert" it to a (hypothetical) Moodle load.

I haven't looked at the HQ-solution. But from David's description I guess it falls into category 2.
In reply to Visvanath Ratnaweera

Re: Idea for Moodle performance metrics

by David Monllaó -

Hi Visvanath,

https://github.com/moodlehq/moodle-performance-comparison was designed to be used in both scenarios: changing hardware, Moodle versions, or whatever you want. The basis is that you run the same test plan, using the same data set, as many times as you want; between runs you can change hardware, settings, Moodle versions, or whatever you want to compare.

The results of the test plan runs can be downloaded and restored (by copying them back into runs/) on other systems, so users control all the changes they have made to the system (both hardware and software), and the tool lets you compare them.