Pattern match
Question types ::: qtype_pmatch
Maintained by
Tim Hunt,
Phil Butcher
Allows a short response of one or a few sentences that is graded by comparing against various model answers, which are described using the OU's pattern match syntax.
1701 sites
426 downloads
41 fans
There are working examples in eAssessment with Moodle on our OpenLearn site.
The documentation is in How to create questions in Moodle on that site.
Sets
This plugin is part of the Open University set.
Contributors
Tim Hunt (Lead maintainer)
Phil Butcher: Question type designer
Jamie Pratt: Developer
Colin Chambers: Developer
John Beedell: Developer
Chris Nelson: Product owner
I tried to upgrade Pmatch qtype in my current installation (Moodle 3.5) and I got this error:
Excepción - syntax error, unexpected 'const' (T_CONST), expecting variable (T_VARIABLE)
Más información sobre este error ("More information about this error") <-- Notice that my default language is Spanish
Debug info:
Error code: generalexceptionmessage
Stack trace:
line 48 of /question/type/pmatch/classes/local/spell/qtype_pmatch_spell_checker.php: ParseError thrown
line ? of unknownfile: call to core_component::classloader()
line 129 of /question/type/pmatch/db/upgrade.php: call to spl_autoload_call()
line 632 of /lib/upgradelib.php: call to xmldb_qtype_pmatch_upgrade()
line 1857 of /lib/upgradelib.php: call to upgrade_plugins()
line 694 of /admin/index.php: call to upgrade_noncore()
Fortunately I could cancel the upgrade and restore my old version.
Before sending this message I tried one more time after changing my default language to English, and I got the same error but with much less additional information:
Exception - syntax error, unexpected 'const' (T_CONST), expecting variable (T_VARIABLE) <-- Just this!
I hope that will be easy to fix. Either way, it is not urgent for me as I can go on using the current version.
Thank you in advance Tim!
It probably could be improved if anyone was brave enough to dive into the code. Initially our priority was to get it to implement the matching correctly, and that was hard enough. Optimising performance is not something we have done much work on.
- A bug: Proper nouns fall foul of the spell-checking logic. For example, if you include the word Greek in your typed answer, the code converts it to greek before sending it to the spell checker. Aspell then chucks it out as a spelling error, even though Aspell has a case-insensitive option and Greek is in its dictionary.
- Optional words / phrases: I could vastly simplify the rules for many of my more exact questions just by being able to add an empty alternative using the or symbol, either by | on its own or perhaps |[]. Sometimes I can achieve the same effect if there is a simple anchor word before or after but often the words before or after are alternatives too and things get ugly fast.
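To illustrate with a made-up rule (the answer text here is invented for the example): today, making a qualifier optional means duplicating the whole rule:

```
match_w (a very|extremely hot liquid)
match_w (a hot liquid)
```

With the proposed empty alternative (which pmatch does not currently accept), this would collapse to a single rule:

```
match_w (a very|extremely| hot liquid)
```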
- More selective match_w option: often I want to allow extra words only before or after the match but not _in_ the match clause - making it possible to pick out one particular phrase or ignore extra detail or examples without rewriting the answer rules. You can do this with _ and p control, but it can get very fiddly.
- Support for multiple possible pmatch answers in the combined question.
- This is a kicker, and might be better handled via some option in the combined question. Often I want the student to include multiple answers. E.g. "State three reasons for the fall of the Weimar Republic" or "Describe three features of a contemporary CPU that improve performance". What I don't want is the student to be able to give the same (or same-meaning) answer multiple times, and I want to give them credit for fewer than 3. The number of permutations becomes ridiculous very fast. For four requested out of four in the main pmatch, it's:
  Answer 1 (100%): match all of 4 x match 1 possibilities
  Answer 2 (75%): match any of 4 x match all of 3 x match 1 possibilities
  Answer 3 (50%): match any of 2 x match all of 2 x match any of 2 x match 1 possibilities
  Answer 4 (25%): match any of 4 x match 1 possibilities
  That is 28 items, the same four matches repeated seven times. This could perhaps be handled either by having a match_n operator in the pmatch question type and/or by allowing the combined question type to give tags to any answer that can only be used once across the whole question.
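As a concrete sketch of the blow-up (A1-A4 are hypothetical placeholders standing in for the real match expressions for each acceptable point), the 75% answer alone already needs all four three-out-of-four combinations spelled out:

```
match_any (
    match_all (match_w (A1) match_w (A2) match_w (A3))
    match_all (match_w (A1) match_w (A2) match_w (A4))
    match_all (match_w (A1) match_w (A3) match_w (A4))
    match_all (match_w (A2) match_w (A3) match_w (A4))
)
```

A match_n-style operator could in principle reduce each graded answer to a single line referencing the same four sub-expressions, instead of enumerating every combination by hand.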
Thanks very much for the feedback, Stephen. Given the amount of feedback, it might be easier for Tim and/or me to discuss if you posted each point separately in the forums.
1) We'll check this as that does sound wrong (but just to check, has case sensitivity been switched on, and has 'Greek' been added to the question settings dictionary?).
2) In a forum post, could you expand on this, including an example question or two, please? It really helps us understand PMatch limitations when we see how others are trying to use it.
3) In a forum post, could you expand on this, including an example question or two, please? In general, aye, that's also on our improvement list for PMatch, although quite far down.
4) The Combined PMatch sub-question was designed as a lightweight implementation, and we did that to try to keep the question editing form as short and clean as possible. We're currently making a few other changes to Combined though (adding single choice, and shuffling/list options for single choice and multiple response - hopefully this will be available within the next few weeks), so we'll review.
5) We wouldn't want to promise anything specific at this stage, but we've been thinking for a while now about a new form of PMatch (tentatively called the "PMatch Sets" question type) that would allow educators to ask that sort of question. If we're lucky, we might start formal development in our next dev quarter. Once we release it, we'll bundle it with normal PMatch as a plug-in set. And then we'll make it available as a Combined sub-question type too.
Do you intend to update this for 3.9 when it's out?
(Also, it is rare for an upgrade to break a plugin. You don't have to wait; you can always test the current plugin with the latest Moodle. There is a good chance it will work. If you share your results here, that is a useful contribution to make.)
match_w (2<sup>n</sup>)
this will not work, but if I tell pmatch to remove '<' and '>' and convert to space, then the following
match_w (2 sup n /sup)
now works. It is a very bizarre thing. Note also that the subscript version does work, so ... weird. Is there any reason for this?
--Gareth
If possible, please attach an example question exported in Moodle XML format. (In the past, github did not like XML attachments. If that is still an issue just rename to .txt.)
kind regards - Gareth
Well... we'd kinda expect admins to go to the Quiz grading for manual corrections. But really, you should be able to cope with spelling (e.g. enable dictionary), synonyms, and ignoring spaces in the question settings. I appreciate that I'm inferring on very little data, but it suggests that the issue lies with the question wording or pattern-matching logic, although it could point to an oversight with what we've made available. I guess some examples of the specific problem would help us see if we need to add/tweak things. But don't forget about the 'Test this question' link that lets you import/review and check answers, to spot any holes in the pattern-matching ahead of 'live' student use.
It is the quiz grading for manual corrections I am talking about. I have had situations where the variations in the answers (I am in chemistry) were just too many to properly catch in the pattern-match algorithm. I know we should aim for 97% and one or two mistakes can creep in, but when you go to manual grading there is no option to grade only the responses that were never matched. That would be a useful feature where partial pattern-matching already allows, for example, removing obviously incorrect answers, minimising the amount of manual grading that needs to be done.
I do need to say that this question type is amazing - I use it in a course of close to 1000 students with two languages. My only real issue is one that IT have fobbed off: the processing time it takes every time I make a change and want to re-check during grading. Otherwise the thing I mention above would just be a nice feature.
Sounds interesting