This is potentially a very complicated task, especially seeing as it would need to be as transparent as possible to retain the ease-of-use and ease-of-modification that makes Moodle so desirable.
In a professional application, for example, I would make all components of Moodle create a template file which already contains all of the structural information, and do only the bare necessities of data extraction to fill in the dynamic parts of the pages. However, with Moodle's current setup, creating this sort of framework is difficult, especially when a teacher wants to come along and modify something simple but can't understand why one PHP script is writing another PHP script.
What are people's thoughts on this topic? How far should Moodle be cached, and how much impact do loading times have on your use of Moodle?
1) The Filters area - remove all non-essential filters, especially autolinking of large glossaries
2) The Filters area - increase the cache time
3) The langcache setting in admin should be on
4) Messaging should be off
5) Experiment with turning dbsessions on (?)
6) Experiment with a PHP accelerator
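Several of these can also be forced from config.php rather than through the admin pages. A hedged sketch - the variable names below match the Moodle 1.x admin settings as I recall them, but do verify them against your version's config-dist.php:

```php
<?php
// Sketch only: forcing some of the tuning settings above from config.php.
// Setting names are from memory of Moodle 1.x -- check config-dist.php
// for your release before relying on any of them.

$CFG->cachetext  = 3600;   // 2) text/filter cache lifetime, in seconds
$CFG->langcache  = true;   // 3) cache the language menu between requests
$CFG->messaging  = false;  // 4) turn the messaging system off
$CFG->dbsessions = true;   // 5) store sessions in the database
```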
Any other non-hardware tips, anyone?
He claimed that the first page was particularly slow to load, and that subsequent pages are slow too. I presume this suggests that even with the present system of caching, old servers do face problems with a large PHP (L)CMS like Moodle.
The trouble is that with the planned move to the Smarty templating engine, and plans for increased separation of layout and content, not to mention ongoing development in general, it sounds like making a templating system is going to be pretty difficult.
Tim
A PHP optimizer is likely to increase speed only nominally; what makes Moodle slow is that it regenerates the page content and pulls all the data out of the database on EVERY request. Most of this work is duplicated. With an efficient file-caching scheme I think Moodle's speed could improve by a factor of two or three.
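To illustrate the kind of file caching I mean, here is a minimal sketch - the function name and cache location are invented for illustration, not real Moodle code:

```php
<?php
// Minimal file-cache sketch: serve a rendered fragment from disk while it
// is fresh, otherwise regenerate it. Not Moodle code -- the function name
// and cache path are invented for this example.

function cached_fragment(string $key, int $ttl, callable $generate): string {
    $file = sys_get_temp_dir() . '/moodle_frag_' . md5($key);
    if (is_readable($file) && (time() - filemtime($file)) < $ttl) {
        return file_get_contents($file);    // cache hit: no DB queries at all
    }
    $html = $generate();                    // cache miss: do the expensive work
    file_put_contents($file, $html, LOCK_EX);
    return $html;
}

// Usage: wrap an expensive block such as a course listing.
$html = cached_fragment('courselist', 300, function (): string {
    return '<ul><li>Course 1</li></ul>';    // imagine several DB queries here
});
```

The hard part, as noted below, is invalidation - Moodle pages are so dynamic that most fragments would need per-user or per-course keys.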
Looks like this one's being covered for the moment - I might leave it on the back burner for a while.
I noticed on the "Some nice little projects" page resource search functionality is listed. Maybe I'll have a crack at that instead.
There seems to be no unified search platform, which would be the easiest way to do things. Adding a central search index for all items in Moodle and offering module support through a search class seems the best approach. That way modules can choose whether or not to offer search functionality, and, depending on what you're looking for, different criteria can be set. Yes, I think I'm going to enjoy making this.
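A rough sketch of what that "search class" idea could look like - every name here is invented for illustration; nothing like this exists in Moodle yet:

```php
<?php
// Hypothetical sketch: each module that wants to be searchable implements
// an interface, and a central indexer collects documents from all of them.
// All names are invented -- this is not existing Moodle code.

interface searchable {
    /** Documents (id, title, content) to feed into the central index. */
    public function get_documents(int $since): array;
    /** Module-specific criteria, e.g. "discussions with no replies". */
    public function get_criteria(): array;
}

class forum_search implements searchable {
    public function get_documents(int $since): array {
        // Real code would SELECT posts modified since $since.
        return [['id' => 1, 'title' => 'Welcome', 'content' => 'Hello all']];
    }
    public function get_criteria(): array {
        return ['noreplies' => 'Discussions with no replies'];
    }
}

// The central indexer only ever needs the interface:
function index_module(searchable $mod): int {
    return count($mod->get_documents(0));  // pretend we added them to the index
}
```

Modules that don't implement the interface simply never appear in results, which gives the opt-in behaviour described above.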
Any caching system is going to have to do a lot of checking of data because Moodle pages are so dynamic - I doubt anything in PHP can compete with the native code solutions I outlined above.
Most of the time when people have speed issues with Moodle it's because they either haven't set up all of the above, or the server is simply overloaded with too many users and too many features enabled. Any car with ten people on board is going to drive slowly.
I'll move this to the Servers and Performance forum.
One active Apache process requires about 20-30 MB of RAM, depending on active modules. One lighttpd process requires 2-3 MB of RAM, depending on active modules.
My lighttpd installations (e.g. the newest Moodle live CD) are running PHP in CGI mode via FastCGI. All active lighttpd and FastCGI processes require only 18-25 MB of RAM. Speed increases dramatically with lighttpd (load balancing can be done as well...), but this has to be checked on really "big" Moodle installations, because php-cgi is said to be slow... The live CD now runs properly on a PII-400 with 192 MB of RAM; with Apache, speed was really poor. I am currently looking at migrating our Moodle site to lighttpd, which is no big thing...
loadbalancer (e.g. proxy)
two webservers
two MySQL-servers
just my ideas....
Maik
Also, you lose any advantages gained from the shared library that keeps PHP resident in the background (instead of loading it up every time), which would have been part of your Apache resource usage. You also lose the ability to use things like persistent database connections in PHP (because under CGI, PHP is executed separately for each hit instead of the same instance handling several hits).
I agree about the resource usage, though. Perhaps try a web server which supports PHP as a library rather than via CGI? I know there are some out there, although I've never had to use them.
Maik
About Swish-e
Swish-e is a fast, flexible, and free open source system for indexing collections of Web pages or other files. Swish-e is ideally suited for collections of a million documents or smaller. Using the GNOME libxml2 parser and a collection of filters, Swish-e can index plain text, e-mail, PDF, HTML, XML, Microsoft® Word/PowerPoint/Excel and just about any file that can be converted to XML or HTML text. Swish-e is also often used to supplement databases like the MySQL® DBMS for very fast full-text searching. Check out the full list of features.
Swish-e was recently featured in the Linux Journal article How to Index Anything by Josh Rabinowitz.
Swish-e is descended from the original web indexing program SWISH by WWW Hall of Famer Kevin Hughes.
- It's a native C application. That means it's not as easy to "plug and play" as Moodle, making it difficult for people wanting an LMS-in-a-box.
- It's very dependent on files. Everything has to be a file, and everything is converted to HTML for indexing (not really a problem for us, but it can be annoying with SCORM, etc.).
- It uses a lot of Perl, and although Perl is very popular, this may limit its use in a PHP-based application.
Key things it doesn't have:
- The ability to know what courses a user is in and tailor the results accordingly.
- Knowledge of a user's permission level, so indexing "non-released" content wouldn't be possible - a setback for teachers/creators/admins.
- Metadata knowledge: the ability to restrict results based on relevant fields for relevant data types - for example, all discussions with no replies, or all assignments submitted by user xyz across all courses.
It has provided me with a good code-base to implement stemming at a later date, though. This is something I'd like done eventually.
There are heaps of scalability/performance improvements that can be made in Moodle, but they are more about tuning how we work with the databases. Some small (well tested and profiled) changes in datalib can give us a huge boost. I've added a lot of profiling information in 1.5 so we can collect stats and focus on the problem areas...
Anyways, I'm leaving this for now. I'll work on the search engine first.
A programmer that I respect says negative things about Smarty.
http://fplanque.net/Blog/devblog/2005/09/27/mvc_smarty_templating_inefficient
Why I consider (MVC) Smarty templating inefficient
Most PHP developers (and other web developers too) seem to evolve along a similar path, which goes like this:
Step 1: Take HTML pages and add PHP tags to them.
Step 2: Realize that on a large scale this is getting very hard to maintain.
Step 3: Learn about the MVC (Model View Controller) paradigm and begin to think it's cool.
Step 4: Use Smarty or another templating engine like that.
The way those templating engines work is basically this:
a) Do a ton of PHP processing in the framework and fill in variables.
b) Call smarty and let it fill in variables in a template.
c) Send the output.
The issue here is that during steps a and b, nothing is sent back to the user. The user is just waiting for a response. Sadly enough, steps a and b will take a lot of processing time, so the user will wait quite a long time before getting any feedback on his action.
Even worse, when the application gets bigger, step a will take even more time!
And the more complex page you request, the more time you'll be left in the dark...
This is why I don't like all those mainstream templating engines. I want to send output back to the user as soon as possible. I want to send the page header at the beginning of step a, and as soon as something has been processed and is ready for display, I want to send it out.
The global processing time will be approximately the same, but if the user sees content beginning to display immediately, instead of all the content appearing at once two seconds after clicking, he will get the (false but humanly useful) impression that the application is faster. (Please do not say AJAX here, it is so not my point.)
I'm not saying MVC is a bad paradigm, I'm saying most implementations of MVC are flawed. Because they do the M & C stuff and then they do the V (view) stuff at the end. In my opinion they definitely need to interleave all this a lot more.
In evoCore (b2evolution's framework) our approach is to have the Controller really control the processing flow. In other words: for every block of information the View wants to display, the Controller will process data from the Model and immediately pass it to the View in order to display the result block by block.
Of course, this has drawbacks: you won't be able to change the block order of the output if it's hard coded into the Controller. But really, that's okay to some extent. Who really needs to move the global header and the global footer for example? This is what we use for the back-office.
However, for the front-office, we want more customization, and we want people to be able to rearrange blocks in any order. In this case the processing is a little more complex, with the View actually calling the Controller every time it needs to display something.
Anyway, in any situation, my point is: the View should not be handled last.
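In PHP terms, the streaming style the post advocates looks roughly like this (the block names are invented for illustration):

```php
<?php
// Sketch of the "send output as soon as it is ready" style the quoted
// post advocates. Block names are invented for this example.

function render_block(string $name): string {
    // imagine slow database work happening here
    return "<div>{$name} summary</div>\n";
}

echo "<html><body><h1>Course page</h1>\n";
flush();                                   // the browser can start rendering now

foreach (['forum', 'quiz', 'resource'] as $name) {
    echo render_block($name);
    flush();                               // each block appears as it is produced
}
echo "</body></html>\n";
```

The trade-off, picked up in the replies below, is that once the first flush() has happened you can no longer send a clean error page.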
However -- PHP's model of spitting everything at the user immediately is BAD in the context of serving webpages because you cannot do any decent error handling. Anything that could potentially ever trigger an error must happen before you send content to the user -- this is part of the HTTP protocol model, and PHP breaks it badly.
Perl strikes a great balance: all the output is buffered until you are done. So your code can print() from anywhere, and still die() and the user gets a correct error page (and apache logs a 500 as it should). If your page is long, and you want to print out early, you can force a buffer flush -- but then you know you can't die() on errors. And at any rate it's the exception, not the rule.
cheers~!
Back to templates, the idea is to split design away so that people who don't know programming can change the layout. I see it as more of a user issue than a programming/efficiency one.
However, with XHTML 1.0 Strict and good CSS the whole idea of templates becomes less and less useful, since so much can be done in CSS. If we really need to have different layouts (say blocks vs no blocks) then these can be handled by user interface variables in PHP (like we do now).
Smarty is really on the backburner for me. If we ever use it I think it will be restricted to just a few special pages.

Anyway, in my opinion, we must follow (and increase) the tendency to encapsulate more and more code inside functions, both for HTML readability and for logic-presentation separation. This will make things much easier for everybody, and the possibilities for CORRECT interaction with Moodle from external systems will be far better!
Ciao
Dull answer, www.m-w.com:
Main Entry: back burner
Function: noun
: the condition of being out of active consideration or development -- usually used in the phrase on the back burner

The assumptions of PHP are that we're on stage, the band is playing and we're printing to the client -- all the time. And I'm happy to play by PHP's rules, and there are certainly strategies to cope.
Luckily[*] Moodle uses really good strategies to address this. The most important thing is: we do a lot of processing before we print the HTML headers, and in all that process, we can error out safely. And most of the modules follow that quite well and only do trivial things after the HTML headers are out.
* What I call luck is usually MartinD's good judgement!
In some cases we have that as separate somefile.php and somefile.html -- which I think is PHP's sweet spot and a great practice. I like how 1.5 has gone down the track you describe of simplifying the HTML parts and letting people manage their page by (ab)using CSS. And abstracting things more into functions and pre-processing somefile.php files.
(I sometimes rant loudly at PHP, but it's just noise. It definitely hits a sweet spot for projects like Moodle, and what we want to do is play to its strengths rather than its weaknesses. Blame my rants on my being a Perl-head.)
While I don't fully agree with the original blog poster, I want to point out that this is what he was mostly arguing against: that everything is buffered and only sent out at the end.
I'm a bit prejudiced against his post, as he kind of equated Smarty with MVC. People can write 100% MVC-less code with Smarty too, or 100% MVC without it...
It's true that this is the way almost everybody codes in PHP. But even in PHP, you still have to explicitly print things before they are sent to the browser. Nothing a function returns is echoed to the browser automatically, so the language still leaves your hands free to do things "the right way" - which would be something like the current page object to the power of two: a display layer asking modules "do you have something for output here? this thing is going to the browser soon, you know". Most functions should not print at all, but just return arrays for processing.
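A sketch of that "functions return, the display layer prints" idea - all names here are invented for illustration:

```php
<?php
// Sketch: modules hand back arrays, and a single display layer decides
// when anything is echoed. All names are invented for this example.

function forum_get_fragment(): array {
    return ['title' => 'Forum', 'body' => '<p>3 new posts</p>'];
}

function quiz_get_fragment(): array {
    return ['title' => 'Quiz', 'body' => '<p>1 attempt pending</p>'];
}

function render(array $fragments): string {
    $html = '';
    foreach ($fragments as $f) {
        $html .= "<h2>{$f['title']}</h2>{$f['body']}\n";
    }
    return $html;           // the caller decides when this reaches the browser
}

echo render([forum_get_fragment(), quiz_get_fragment()]);
```

Because nothing prints until the final echo, the display layer is free to error out, reorder blocks, or cache the whole string before sending it.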
I think that any language powerful enough lets you do things the right way, the wrong way or the very wrong way. PHP does have a lot of glitches and weird/inconsistent things, but I think the low learning curve (<?php print("Hello, world!") ?>) is what produces most of the "bad code".
And some day, I'll have so many students at linuxcapacityplanning.com that this will matter to me and my hosting service.

I'll go and sign up now, your site looks interesting in any case.