I'm currently working on some plugins, and I've reached the point where I need to write test cases. To do this, I use the Moodle unit testing facilities.
Now I want to test my database functionality, which would be safest if it ran on tables with the unittest_ prefix. How can I make sure those test tables are created exactly as I defined them in XMLDB?
Besides that, I will need some dummy users and courses. I read something about the generator tool, but that also requires me to set up the tables myself.
Can anyone help me with this?
Thanks!
For some of my recent Moodle 2.0 work I have added infrastructure that makes it possible to write tests that use the DB, although they still execute slowly. See http://cvs.moodle.org/moodle/lib/simpletest/testfilterconfig.php?view=markup and the code in lib/simpletestlib.php that it uses. Unfortunately, that is not available in Moodle 1.9.
I'm afraid that that is not a helpful reply.
[code php]
function setUp() {
    global $CFG, $USER, $QTYPES;
    if ($CFG->dbtype != 'postgres7') {
        throw new Exception('Sorry, but this unit test cannot be run against MySQL. Try using Postgres!');
    }
    begin_sql();
    $this->user = (object)array(
        'firstname' => 'Test',
        'lastname'  => 'User',
        'email'     => 'test@user.org',
        'username'  => 'test',
    );
    $this->user->id = insert_record('user', $this->user);
    $this->realuser = $USER;
    $USER = $this->user;
}

function tearDown() {
    rollback_sql();
    global $USER;
    $USER = $this->realuser;
}
[/code]
And it worked pretty well. The reason it's not supported on MySQL is... well, MySQL has autocommit on by default, for starters. This means that if PHP dies (which it frequently does when you're developing and miss a `;`, for example), your data gets committed. I tend to get ranty about MySQL.
Anyway, this is a total hack, and Tim has solved it nicely in 2.0 as he said, but if you need something in 1.9 and can use Postgres, this works too.
Edit: Sorry, I cannot at all manage to get the syntax highlighting to work on this post.
So Moodle doesn't provide a way to set up a clean test database and insert records into it for unit tests? I think it would be useful if, whenever somebody runs the unit tests, the test tables were created and populated with dummy data, so that the unit tests could exercise DB functionality as well.
So in the meantime, use postgres, and revel in your transactional ability! ;)
There was a fantastic test facility in 2.0 which was very easy to use. It worked as follows:
1. you set up your tests to use the right base class
2. there was no step 2
All the tables already existed, ready for you, and you could make any change you liked to the test tables. Any tables you changed would be reset to their previous state after the test finished. This worked with the 'unit test prefix' feature: a second copy of all the tables was maintained and updated alongside the main copy.
But looking at Tim's example, I don't think it is using this system (or is it?). It looks like it needs to manually specify the tables it uses, and judging by the 'slow' comment, I'm guessing it may be recreating those tables from XML each time.
Has this good system gone? Because it's a huge shame if so...
--sam
- Guaranteeing to reset everything to the state before the tests started was possible about 99% of the time, which is not really good enough.
- There are various places where Moodle caches data for performance, and you need to think about whether your tests will see the test data you just inserted into the test database, or just the cached data from the real database.
Therefore, the completely magic system got disabled. So, when I needed to do some tests using the DB, I made a simpler system.
- Make your test case class extend UnitTestCaseUsingDatabase.
- Call create_test_table or create_test_tables to create the tables your test will need.
- Load any required test data.
- Do the test.
This means that you have to do a bit more work to write your tests. On the other hand, your tests then use a minimal, fresh fixture, which makes them more useful both as tests and as documentation.
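To make that recipe concrete, here is a minimal, self-contained analogue of the pattern (extend a base class, create only the tables you need, load a fixture, run the test), using an in-memory SQLite database via PDO. The class and method names below are illustrative stand-ins, not Moodle's actual UnitTestCaseUsingDatabase API:

```php
<?php
// Sketch of the "create your tables, load a fixture, test, tear down"
// pattern, using an in-memory SQLite database via PDO. This is an
// analogue of the workflow described above, not Moodle's real API.
class MiniDbTestCase {
    protected $db;

    public function setUp() {
        // A fresh in-memory database per test: nothing leaks between runs.
        $this->db = new PDO('sqlite::memory:');
        $this->db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
    }

    // Stand-in for create_test_table(): create only what this test needs.
    protected function createTestTable($name, $columns) {
        $this->db->exec("CREATE TABLE {$name} ({$columns})");
    }

    public function tearDown() {
        $this->db = null; // discards the whole in-memory database
    }
}

// A concrete test following the recipe from the post above.
class UserLookupTest extends MiniDbTestCase {
    public function testFindByUsername() {
        $this->createTestTable('user', 'id INTEGER PRIMARY KEY, username TEXT');
        $this->db->exec("INSERT INTO user (username) VALUES ('test')");

        $stmt = $this->db->query("SELECT id FROM user WHERE username = 'test'");
        $row  = $stmt->fetch(PDO::FETCH_ASSOC);
        return $row['id']; // a real test would assert on this
    }
}

$t = new UserLookupTest();
$t->setUp();
$id = $t->testFindByUsername();
$t->tearDown();
```

Because the whole fixture is built inside the test, anyone reading it can see exactly what data the test depends on.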
- The performance cost of constantly setting up and dropping tables is absolutely hideous, especially if we get serious about creating unit tests. I should think it could easily take more than a day to run the unit tests.
- Tests become rather fragile - most Moodle code calls out to other Moodle global functions, and in future versions, those Moodle global functions might access other tables that you haven't created yet...
- When writing a test you need to know lots of internal details just to create the right tables.
How about a special database driver for unit tests that is based on an in-memory database system? Presumably this could reliably reload a full DB install each time, because all changes would be stored in memory; it could also make testing faster. I guess I'm not offering to write this, though.
--sam
I've worked in several large projects with a similar approach, so seen this in action. Comments follow...
(alas, the blockquotes are still b0rked)
it could easily take more than a day to run unit tests.
Yes. That definitely happens, and happens even with lots of relatively fast tests. It's bound to happen on any large system.
The solution is a regression testing machine that checks out the tree and runs the full test suite -- Mozilla has something like that wrapped in a tool called tinderbox.
When something breaks, tinderbox flags it red, and lists all the commits between working and non-working versions. With git bisect, you can automate the bisection.
Tinderbox can also test performance metrics (it does for Mozilla ).
Tests become rather fragile (...)might access other tables that you haven't created yet...
The test framework does need the core tables set up. The tinderbox env can speed up 'core table' setup with some DB-specific tricks, like having a base Moodle DB 'raw' and 'restoring' it directly in the filesystem.
How about(...) an in-memory database system?
Easier: use MySQL and restore a sample Moodle DB using HEAP tables?
The testing framework could be a good reason to get sqlite support completed and merged. It's trivial to do the above with sqlite.
Any faking of the DB layer would completely mask problems with SQL, an important source of bugs.
Obviously using something like SQLite (which I alluded to in general terms, and you mentioned specifically) could still mask database-specific SQL problems; in order to find these, we would need to run the unit tests against MySQL, Postgres, SQL Server, and Oracle (any more?). Actually, that would theoretically be a great idea if possible. (And if it takes more than a day to run the unit tests on a normal database, then we're looking at a week for the Oracle run. )
I notice there are a few things SQLite doesn't support; however, these look quite limited and probably aren't something we actually use, even in more complex queries. It doesn't support ALTER TABLE, so it would not be possible to test upgrades, but other than that it seems OK.
Rod (Norfor) and I were discussing this problem. We also talked about the possibility of filesystem rollback (e.g. if you use something like ZFS, you can easily return to a snapshot of the database), but this requires a bit more entanglement between the DB server and the Moodle webserver than we liked. In the end, we didn't come up with any perfect solutions; it's a hard problem.
Personally, I think the best approach would be something like SQLite with a database that is installed once (e.g. at the start of a regular unit test run, with an option to skip this and reuse the previous install, which would be useful when running a single test repeatedly), then making/deleting a new copy of the file for each test to initialise it.
If this works out, another option that would be nice is predefined datasets. For example, let's say in any simpletest folder you could have a dataset.something.php script consisting of Moodle commands that initialise values in an empty database. [Note: it is safer to do it this way than to include a prebuilt database, because that would need redoing every time the tables change.] Using a test base class, you could write a test class that uses a particular dataset. This would:
* If dataset has not been created, load the 'clean' sqlite file, run these commands, and save this as a 'dataset' sqlite file with the relevant name (something.dat)
* For each test, make a copy of existing dataset file and work on that - just like as above for non-dataset tests, only starting from the dataset instead of the plain install.
* Make sure all datasets are erased at the same point when the 'clean' sqlite file is regenerated, so they will be recreated then.
That would be pretty cool and relatively fast and minimal work for test creator, wouldn't it?
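A minimal sketch of the copy-per-test scheme described in the list above, assuming a prebuilt 'clean' SQLite file. The function names, the `dataset.<name>.dat` naming, and the builder callback are all hypothetical, not an existing Moodle API:

```php
<?php
// Sketch of the dataset scheme above: derive a dataset file once from a
// 'clean' SQLite install, then give each test a throwaway copy of it.
// File names and the builder step are illustrative, not a Moodle API.

// Build the dataset file once, if it does not exist yet.
function get_dataset_file($clean, $name, $builder) {
    $dataset = dirname($clean) . "/dataset.{$name}.dat";
    if (!file_exists($dataset)) {
        copy($clean, $dataset);  // start from the plain install
        $builder($dataset);      // run the dataset.<name>.php commands
    }
    return $dataset;
}

// Each test works on a fresh copy, so it can trash the data freely.
function checkout_for_test($dataset) {
    $work = tempnam(sys_get_temp_dir(), 'unittest_');
    copy($dataset, $work);
    return $work;
}

// Regenerating the clean file must invalidate every derived dataset,
// so they get rebuilt from the new schema next time they are needed.
function purge_datasets($clean) {
    foreach (glob(dirname($clean) . '/dataset.*.dat') as $f) {
        unlink($f);
    }
}

// Demo with plain files standing in for SQLite databases.
$clean = tempnam(sys_get_temp_dir(), 'clean_');
file_put_contents($clean, 'plain install');
$dataset = get_dataset_file($clean, 'twousers', function ($file) {
    file_put_contents($file, ' + two users', FILE_APPEND);
});
$work = checkout_for_test($dataset);
```

The expensive work (building the dataset) happens once; each test pays only for a file copy.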
--sam
The 'test harness' infrastructure is _outside_ of Moodle -- it can run as root, stop databases, change their 'storage' directory, and start them up again -- just using cp, no need for ZFS ubercoolness.
(Using ZFS is overkill and actually awkward. ZFS won't let you roll back a directory, only a whole partition. So you'd need a separate partition for every dir you want to roll back. Ugh. With plain ol' cp you can have various DB snapshots around with no such complications...)
So SQLite is good for the convenience of running a unittest from the commandline (or even - bletch - from a webbrowser). But a tinderbox-style setup can do what you describe (setting up 'baseline' DBs snapshots that are cheap to revert to) with Pg and MySQL very easily.
By 'very easily' I mean it's a SMOP. It's still quite a bit of work to implement.
The recommended approach is to use an in-memory implementation of the database layer. In the Moodle case, that would mean creating a moodle_in_memory_test_database extending moodle_database, which implements most of the methods (basically all the methods without 'sql' in the name) in terms of in-memory arrays, and throws not-implemented exceptions for the other methods. That would let us test a lot of our code very fast, and code that uses more complex SQL would still have to be tested against the real DB.
(If I were doing this, I would parse the install.xml file to get the column types, and validate the arguments to insert_record, etc. strictly. I would also add a very simple switch so that all tests normally run against the in-memory database can easily be run against the real DB instead. However, I won't have time to do this any time soon.)
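To illustrate, here is a toy sketch of what such an in-memory fake might look like. The method names echo moodle_database (insert_record, get_record, get_records_sql), but this stand-alone class is an illustration of the idea, not the real interface:

```php
<?php
// Toy sketch of an in-memory test database in the spirit described above:
// the non-SQL methods work on plain PHP arrays, and anything that takes
// raw SQL throws. Method names echo moodle_database, but this is not the
// real interface.
class in_memory_test_database {
    private $tables = array();   // tablename => array of record objects
    private $nextid = array();   // tablename => next auto-increment id

    public function insert_record($table, $record) {
        $record = (object)(array)$record;   // accept array or object; copy
        if (!isset($this->nextid[$table])) {
            $this->nextid[$table] = 1;
            $this->tables[$table] = array();
        }
        $record->id = $this->nextid[$table]++;
        $this->tables[$table][$record->id] = $record;
        return $record->id;
    }

    public function get_record($table, array $conditions) {
        foreach ($this->tables[$table] ?? array() as $record) {
            foreach ($conditions as $field => $value) {
                if ($record->$field != $value) {
                    continue 2;             // this record does not match
                }
            }
            return $record;
        }
        return false;
    }

    public function get_records_sql($sql, $params = null) {
        // Code using raw SQL must be tested against a real database.
        throw new Exception('Not implemented in the in-memory fake');
    }
}

$db = new in_memory_test_database();
$id = $db->insert_record('user', array('username' => 'test', 'firstname' => 'Test'));
$user = $db->get_record('user', array('username' => 'test'));
```

Because everything lives in arrays, every test starts from a genuinely empty "database" at object-construction cost, with no setup or teardown of real tables.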
I think these predefined datasets would be a great solution, and I don't care what any expert says (arrogant? Moi?). I do think it's important to ensure that they are correct [i.e. generated from the latest install.xml files] and cannot be changed by any test, though. But I want to be clear: this isn't really about speed, although things being painfully slow *is* a problem. It's mostly about making it as easy as possible (or, let's be honest, as not-too-difficult as possible) to write unit tests. IMO that's the critical factor here. Having to create tables at the start of each test doesn't make it easier to write unit tests.
--sam