Assessment for Learning

Assessment for Learning

by Tony Gardner-Medwin -
Number of replies: 22

To learn best from a quiz, I prefer students to have their answer marked and to see feedback immediately after entering the answer, while they are still thinking about it. Moodle can accomplish this with adaptive mode, but unless you put every Q on a separate page, v 1.7 seems to jump to the start of the quiz (or page) when you 'submit' the answer.  Two questions about this:

1.  Can it be made to jump back to the correct place after a submission, with >1Q per page?

2. Is this mode of use at risk of putting an intolerable burden on a server? Obviously it depends on usage, but say with a few dozen users, each answering a Q and requiring a server access every 10 seconds or so? Because of this concern, I have always in the past used client-side JavaScript to mark answers for practice/revision exercises, so that server accesses aren't required. But I'm not sure this is consistent with Moodle ways of doing things.

I'm looking at this in the context of adapting Moodle code for Certainty-Based Marking ( www.ucl.ac.uk/lapt ), which is also intended to improve 'assessment for learning'. This is working fine, but it really needs a facility to switch CBM on or off for a particular quiz, depending on how you want to use the quiz (without changing the Q types), just as you can switch other aspects of grading (e.g. 'adaptive mode' or 'apply penalties'). So:

3. Is there any way to add a new grading option switch as a plugin, or does it necessarily require altering core code?

Thanks for any help. Tony GM

In reply to Tony Gardner-Medwin

Re: Assessment for Learning

by Joseph Rézeau -
Tony > Can it be made to jump back to the correct place after a submission, with >1Q per page?

This request has been partly answered in this post. Unfortunately, this (desirable) behavior has not yet been implemented in current Moodle versions.

Joseph

In reply to Tony Gardner-Medwin

Re: Assessment for Learning

by Tim Hunt -
Do you know my colleague Phil Butcher? Sounds like you have a lot in common. Also, you may be interested in our system OpenMark http://www.open.ac.uk/openmarkexamples/ which is now open source: https://openmark.dev.java.net/.

2. A big quiz with all the questions on one page does put unnecessary load on the server. It is better to break it up into several smaller pages.

However, the whole point of web servers is to do processing for end users, so server-side is the natural place to do the processing. Especially since the code is then running on your server, whereas with JavaScript you always have problems getting it to work in all browsers.

And if it is all done client-side, and the student's browser crashes, they lose everything, whereas Moodle always stores everything in the database at every step, so you can always pick up where you left off.

If you do the processing server-side, then your pages can be fairly plain HTML, which is better for accessibility.

Finally, if you are processing client-side, then it is possible for students to cheat, although doing so probably takes more skill than just getting the quiz right, for most people.


1. I've often thought about implementing it, but never got around to it. The rough idea would be to add an onclick handler to the submit button for the question that does

form.action = form.action + '#q' + {$question->id}
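
Fleshed out a bit, the sort of thing I mean is the sketch below; the markup and names are only illustrative, not the actual Moodle code:

    // Illustrative sketch only: print the per-question submit button with an onclick
    // handler that appends a fragment to the form action, so that after the POST the
    // browser scrolls back to the anchor for this question rather than to the top of the page.
    echo '<input type="submit" name="submit' . $question->id . '" value="Submit"';
    echo " onclick=\"this.form.action = this.form.action + '#q{$question->id}'; return true;\"";
    echo ' />';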


If you get this working reliably, please post a patch here, so I can check it in to the CVS repository.


3. Would require altering core code. But all question marking goes through functions in lib/questionlib.php, indeed through a function called something like question_apply_penalty_and_timelimit. Similarly, the adaptive mode submit button is printed in one place (can't remember where), and you could control the printing of the certainty dropdown menu there. Then you would need to change the quiz editing form (and would probably need to add another table to store the options, unless you just wanted to hard-code the risk/reward scale). So it might not be so difficult to do.
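
To make 'control the printing of the certainty dropdown' concrete, I just mean a few lines like the following - everything here (field names, labels) is invented for illustration, not real Moodle code:

    // Illustration only - field names and labels are invented.
    if ($cbmenabled) {
        echo '<select name="' . $question->name_prefix . 'certainty">';
        foreach (array(1 => 'C=1 (unsure)', 2 => 'C=2 (mid)', 3 => 'C=3 (sure)') as $level => $label) {
            echo '<option value="' . $level . '">' . $label . '</option>';
        }
        echo '</select>';
    }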
In reply to Tim Hunt

Re: Assessment for Learning

by Tony Gardner-Medwin -
Thanks to Tim & Joseph.  Re my point (1), and following Tim's suggestion, it seems to work fine just adding the line
         echo " onclick='form.action=form.action + \"#q\" + \"{$question->id}\"; return true;' ";
before
           echo ' />';
at the end of the function print_question_submit_buttons() in question/type/questiontype.php

Re (3), for the moment I'll simply work around the problem of switching CBM on and off by triggering CBM with something in the quiz name to indicate whether or not it is required.

Tim's comments re (2) are all valid and worthy, but I think they rather overlook the other side of the coin, which is that the prime function of assessment in e-learning is really to help the student learn and self-assess. Online assessment is in any case never secure unless you check who may be helping the user, so JS insecurity isn't really a concern. If a student wants to cheat, that's his/her loss, through not learning to address the issues. With JavaScript a student can work offline, with a download or a CD (e.g. in a remote Pakistani village - where one of our med students chose to do her revision recently, prompting me to provide download facilities). With JS, web congestion delays never interfere with student learning, nor do crashes (unless submission of results is a teacher's requirement). We don't actually have much problem with JS compatibility across browsers, though I confess that avoiding problems has caused me some surprising headaches.

I do like the look of OpenMark, though it too needs a server access every time you respond, and doesn't, I think, do anything we don't do with LAPT. I'm not really suggesting Moodle should use JavaScript on a big scale (though it may be worth introducing a bit more). My query about server congestion is really a quantitative one, for which I have no feel at present: when will congestion start to show itself through response delays, or even crashes? With luck it won't be a problem, but we may have to find out by experiment.

Moodle has so many good features to aid learning and feedback. My concern is simply to make sure that at least in some modes of use the quiz presentations are optimised for student learning, as ultimately a more important function than assessment. Perhaps this could be a thread for discussion in Tim's proposed quiz workshop at the OU, which I should like to attend.

Tony GM
In reply to Tony Gardner-Medwin

Re: Assessment for Learning

by Joseph Rézeau -

Tony > Re my point (1), and following Tim's suggestion, it seems to work fine just

adding the line
echo " Xonclick='form.action=form.action + \"#q\" + \"{$question->id}\"; return true;' ";
before
echo ' />';
at the end of the function print_question_submit_buttons() in question/type/questiontype.php

Just tested it on my local install of Moodle 1.8... it works fine! Thanks, Tony.

Tim, what about putting this in CVS as soon as possible?

Joseph

In reply to Joseph Rézeau

Re: Assessment for Learning

by Tony Gardner-Medwin -

NB an X gets inserted before 'onclick' in this code line on the Moodle forum, and I can't edit it out. Bizarre. Delete any X you see if you want to use the line, but note that the space before 'onclick' is required.

echo " onclick='form.action=form.action + \"#q\" + \"{$question->id}\"; return true;' ";

before
echo ' />';
at the end of the function print_question_submit_buttons() in question/type/questiontype.php

Tony
 

In reply to Tony Gardner-Medwin

Re: Assessment for Learning

by Tim Hunt -
I know. This is a security feature in Moodle. It ensures that untrusted users like you and me cannot possibly add JavaScript to a forum post that will get executed by other users.
In reply to Tony Gardner-Medwin

Re: Assessment for Learning

by Tim Hunt -
Definitely a good topic of conversation for the MoodleMoot. Assessment for learning is something we are very interested in at the OU.

I'll check in the code change on Monday.
In reply to Tim Hunt

Re: Assessment for Learning

by Tony Gardner-Medwin -

This isn't finished yet, but here are the code changes I've developed for implementing CBM (Certainty-Based Marking) within Moodle. So far I've just done multichoice and TrueFalse Qs, but quite a lot of the issues are common to all Qtypes. Various points will need tidying up. These changes are for 1.7, and one reason to put them up, preliminary as they are, is in case someone can see at a glance whether the later changes in 1.8 and 1.9 are likely to have a major or minor impact on what's needed.

With these mods, any quiz (with MCQ or TF Qs) can be made to run either with CBM or in the normal manner, according to whether you include 'CBM' in its title or not (a temporary fudge).  CBM operates either with immediate feedback (much preferable pedagogically, though with greater server loading) or with feedback only after page or quiz submission - according to whether 'Adaptive Mode' is switched on or off.
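
For the record, the check involved in the fudge is nothing more elaborate than something like this (a sketch; the variable names are illustrative rather than the exact code):

    // Temporary fudge (illustrative sketch): a quiz is treated as CBM-marked
    // whenever 'CBM' appears in its name.
    $usecbm = (strpos($quiz->name, 'CBM') !== false);
    if ($usecbm) {
        // ... print the certainty selector with each question and apply CBM weights when grading ...
    }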

I may have missed documenting some changes in the attached document, but with luck not - in case anyone wants to try it out. Tell me if you have trouble.

Tony GM

In reply to Tony Gardner-Medwin

Re: Assessment for Learning

by Tony Gardner-Medwin -

I've put up a 1.7.2 installation where anyone can try out Moodle with  Certainty-Based Marking (CBM) using TF, MCQ and Text/Numeric Qs.  You can run the same demo quiz with 4 alternatives: with/without CBM, and with/without adaptive mode (giving immediate feedback). There is a forum on it for bug reports/ comments/ suggestions. The code is improved on that attached earlier in this thread, and I'll make it available once things are settled (or if anyone asks). It would be really helpful if people could take a few minutes to try the site out and react to what I'm on about in this context of CBM and 'assessment for learning'.

URL: http://www.ucl.ac.uk/lapt/moodle17/moodle .

You can create your own account or use the open account username=moodler and password=moodler .  You can run any number of attempts and put in right as well as wrong answers of course. There are a few improvements implemented along with the CBM code that are not specifically to do with CBM - e.g. distinction on review between Incorrect and Blank responses. Tell me if there are any problems ( ucgbarg@ucl.ac.uk ).

In reply to Tony Gardner-Medwin

Re: Assessment for Learning

by Joseph Rézeau -
Hi Tony,
Thanks for your CBM questions and for making them available on your test site. It all looks quite promising. I have one remark about the meaning you give to adaptive/non-adaptive mode. To me, "adaptive" mode means not only immediate feedback but also a chance to change my answer, which I cannot do in your system. But this is probably unavoidable, due to the special nature of the CBM tests.
Oh, and it's not possible to post new discussions to your 'Comments, suggestions' forum when logged in as moodler.
Joseph
In reply to Joseph Rézeau

Re: Assessment for Learning

by Tony Gardner-Medwin -

Thanks Joseph. Sorry to be excessively dumb, but I can't see how to enable postings for 'moodler' despite quite a hunt. I've set 'yes initially' for 'force all users to be subscribed'. Can't find anything more relevant seeming!

I'm only using 'adaptive' because it is the Moodle word for the switch that gives the functionality I need: when 'on', the server is contacted after each answer. The other use for 'adaptive mode' is a bit hard to reconcile in general with CBM. If initially you're 'sure' Glasgow is Scotland's capital but would enter Edinburgh on a second attempt, I think I'd rather just give you the feedback after your first entry, and not give you the comfort that at least you got it right second time round. You shouldn't have been sure. But with added complexity, this could be built in. At present, with CBM, the code ignores any set penalties and marks you on your first and only response.

The CBM code does take account of set fractional grades for partially correct answers (e.g. try answering 8 to the no. of Baltic States). 
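
To give a feel for the marking itself, here is a much-simplified sketch of the sort of function involved, using the usual LAPT scale (marks of 1/2/3 for a correct answer at certainty levels 1/2/3, and 0/-2/-6 for a wrong one); the way partial credit is blended with certainty below is only a guess, not the actual code:

    // Simplified sketch of CBM marking (not the actual Moodle mod).
    function cbm_mark($fraction, $certainty) {
        // $fraction is the ordinary Moodle grade fraction (0..1); $certainty is 1, 2 or 3.
        $reward  = array(1 => 1, 2 => 2, 3 => 3);   // mark for a fully correct answer
        $penalty = array(1 => 0, 2 => -2, 3 => -6); // mark for a fully wrong answer
        if ($fraction >= 1.0) {
            return $reward[$certainty];
        }
        if ($fraction <= 0.0) {
            return $penalty[$certainty];
        }
        // Partial credit: a simple linear interpolation between the two extremes (a guess).
        return $penalty[$certainty] + $fraction * ($reward[$certainty] - $penalty[$certainty]);
    }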

Multichoice Qs with multiple answers are tricky, and are not handled at present. They could be, and they are handled in the cbm-export plugin I created some time ago (25 March post at http://moodle.org/mod/forum/discuss.php?d=61169 ). There are two alternative strategies defined in LAPT, e.g. for the Q "Which of the following are vertebrates: {cat squid terrapin dinosaur coelacanth}", for which you may be sure of some and not of others. (1) You can be asked for your certainty that you are getting exactly the right selection all in one go, or (2) you can be asked to pick the right answers one by one, indicating your certainty for each one. The Q may or may not tell you the number of correct options.

In reply to Tony Gardner-Medwin

Re: Assessment for Learning

by Tony Gardner-Medwin -

The forum at the CBM test site www.ucl.ac.uk/lapt/moodle17/moodle is repaired (I had changed the name of the default News Forum instead of creating a new one, and it seems you can't change the 'type' of a news forum.)

Note that the Qs are not in any way "CBM questions". They are standard (core) Moodle Q types, optionally marked with CBM. If anyone would like to export and send me a set of their own Qs I can put these up for you to try. The code has rectified some bugs that appeared on review, and now handles MCQs with multiple answers.

There are also new exercises (150 Qs) on English idioms (for a project with English-test.net for ESL (English as a Second Language) students).

In reply to Tony Gardner-Medwin

Re: Assessment for Learning

by Tony Gardner-Medwin -

The code for Certainty-Based Marking is now fairly complete and can be tested by anyone at http://www.ucl.ac.uk/lapt/moodle17/moodle . Please report bugs, bad behaviours or ideas for improvement. The code mods are visible at http://www.ucl.ac.uk/lapt/moodle17/Moodle_CBM_Mods.html .

In reply to Tony Gardner-Medwin

Re: Assessment for Learning

by Tony Gardner-Medwin -

I have adapted 1.9 code to use Certainty-Based Marking (CBM) which I should welcome people trying out at www.ucl.ac.uk/lapt/moodle19/moodle . The code mods are zipped at www.ucl.ac.uk/lapt/moodle19/moodle.mod.zip .

There is one thing I'm stuck on. I have added a CBM option to the quiz options page, but at present it only works in an awkward way (using what I guess may be a little-used setting, decimaldigits=3, to switch on CBM). I haven't ventured to try to add a new option to $CFG because I don't know how to avoid the risk of corrupting existing databases. Ideally there may be some spare bits available for binary options like this on some existing database entries. Can anyone advise?

Is there support for including such code into the core Moodle files, once adequately tested & polished (e.g. for language support)?  So far  as I can see, it can't be done as a plugin.

Tony G-M

In reply to Tony Gardner-Medwin

Re: Assessment for Learning

by Tony Gardner-Medwin -

In 1.9 I've created a switch for turning CBM on and off by repurposing the 'penaltyscheme' switch as a more general 'mark scheme' switch, with the drop-down box shown below. I think this makes sense. The help page for this new option can be seen HERE. I'll also show an example below of how I'm handling feedback symbols. If you want to try things out from a course creator's as well as a student's point of view, create an account on my Moodle 1.9 site and I can give you the requisite role. Tony GM

[Image: mark scheme options drop-down]

[Image: feedback (f/b) example]

Attachment: markscheme.GIF
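
For the curious, in settings-form terms the switch amounts to roughly the following (a sketch only; the language string identifiers are invented, and the real code may differ):

    // Illustrative sketch: the old two-way penalty setting becomes a three-way
    // 'mark scheme' selector on the quiz settings form.
    $markschemes = array(
        0 => get_string('markscheme_nopenalties', 'quiz'),
        1 => get_string('markscheme_applypenalties', 'quiz'),
        2 => get_string('markscheme_cbm', 'quiz'),
    );
    $mform->addElement('select', 'penaltyscheme', get_string('markscheme', 'quiz'), $markschemes);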
In reply to Tony Gardner-Medwin

Re: Assessment for Learning

by Tim Florian -

Hello Tony,

It was a pleasure to meet you at Jones last week. I am having an issue with the CBM code in 1.9. I receive the following error message:

Notice: Undefined property: feedback in /home/content/f/l/o/floriantp/html/question/type/multichoice/questiontype.php on line 352

I checked the file against the ones you posted, and line 352 reads:

$a->feedback .= format_text($answer->feedback, true, $formatoptions, $cmoptions->course);

on both files. Do you have any ideas on the problem with the script?
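
(My only guess so far is that $a->feedback is being appended to before it has been set; something like the following just above that line would presumably silence the notice, though I haven't tried it:)

    // Untested guess: make sure the property exists before the .= on line 352.
    if (!isset($a->feedback)) {
        $a->feedback = '';
    }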

I am running 1.9 stable on a demo site, www.sageknowledge.net

Also, did you post the script for the mark scheme in the moodle19zip file on the web? I am having difficulty locating it.

-Tim

In reply to Tony Gardner-Medwin

Re: Assessment for Learning

by Phil Butcher -

Tim brought this thread to my attention, and there are many points here where I would have asked similar questions over the last 6-10 years. I thought you might be interested in a brief run-through of how the OU has got to where it is now.

1 Client-side or server-side processing. Well, 10 years ago we had no option. It was client-side only, and off a CD. It was formative only and we had no idea of student usage or individual responses. Students liked it, but it's not easy to 'sell' it to other courses without hard stats on usage.

2 Six years ago we moved to the internet and added comms so that we could record responses etc., but we kept the processing client-side in Java. I thought: why not use all that client-side computing power for what are interactive questions with light CPU use? No-one disagreed with me, so we did. On the whole it worked well, but we could never claim that it worked 100%. There was always someone somewhere with a weird problem. And during this time MS and Sun fell out and the Java Runtime fell out of the browser. I still remember where I was when I read that gem - on a small boat on the Norfolk Broads. The ducks wondered why I was the one doing the flapping.

Having to install the JRE before starting became a real drag. Anything that got in the way of students seeing the first question was an unnecessary hurdle. Couple this with the few students (out of each 1,000) who still had problems and it wasn’t something we could sell hard. But we’d made the point that it was useful educationally and the university backed a rewrite to move the processing server side.

3 So we moved server-side in 2005 and the hurdles went away. It became click-and-go and there's the first question. The result is OpenMark, to which Tim gives the link above. Within OpenMark we always show just one question per page and we always give immediate feedback. We also like to offer multiple attempts, with reducing scores, so that students with incorrect responses can read our feedback and have another go. We do this for both formative and summative tests (there's randomisation in the latter). Sure, there are delays in comms and server-side processing, but they have not been significant. The whole thing feels more secure, we have lots of stats on student use (including lots of complimentary comments), and we can show these to other course teams looking to use eAssessment. We've seen much more growth in use since we moved server-side. We've accepted that at present we have no offline capability. Compared to sending stuff down the line to be processed client-side, we're clearly sending much less traffic.

4 So when the OU adopted Moodle we were already attuned to just one question per page. The advice I’ve received is that within Moodle one question per page for say n pages is better than 1 page with n questions in terms of the hit on the server cpu. So the default setup for OU quizzes is one question per page.

5 And you’ll have guessed that we’re also big on feedback. We want Moodle adaptive mode to be much closer to how OpenMark works where multiple attempts are allowed with feedback at each stage and an end point where the answer is given. Guess what we’re working on now. Tim has described it in one of these Moodle forums.

6 And if the adaptive question being answered is on a page with others and it's not the first one, then yes: when the question is answered and the page refreshes, it will jump back so that the answered question appears at the top of the screen.

7 Finally CBM marking is on our wish list (on behalf of the OU I hold a large wish list!). So I’m rather pleased to see Tony GM in here bringing it to Moodle. Can’t think of anyone better.

8 At the moment we’re focussed on testing the new Gradebook but once we have the nuts and bolts in place I look forward to disseminating adaptive questions, CBM and more around the OU.

In reply to Phil Butcher

Re: Assessment for Learning

by Joseph Rézeau -

Hi Phil,

Thanks for a most instructive post. Thanks to the link provided by Tim I had looked at the OU's OpenMark examples and am quite impressed.

"And you’ll have guessed that we’re also big on feedback. We want Moodle adaptive mode to be much closer to how OpenMark works where multiple attempts are allowed with feedback at each stage and an end point where the answer is given."

I personally do not use the Moodle quiz module, Hot Potatoes or other such software for testing purposes, but for learning (and "self-testing") purposes. So, as for you, feedback is for me of paramount importance in the learning process. However, in order to achieve the most relevant (and thus most helpful for learning) feedback possible, the best possible answer analysis ought to be performed (on "open answer" questions, that is). This is what I have been working on for the past 15 years, and this work is reflected in my regexp question type. I do appreciate that the amount of preparatory work involved in creating questions of this type is such that only a handful of teachers are interested in it. I'd appreciate your own feedback, maybe in a new thread on this forum?

All the best,

Joseph

PS1.- "6 years ago we moved to the internet and added comms so that we could record responses etc" what are "comms"?

PS2.- It's unfortunate (for me) that all of the OpenMark examples are related to Science, Maths, etc. and none to languages.

In reply to Joseph Rézeau

Re: Assessment for Learning

by Phil Butcher -
Hi Joseph,
We clearly have much in common.

Regarding 'best possible answer analysis'
a) OpenMark uses an algorithm developed at the Computer-Based Learning Unit at the University of Leeds in the 1970s. I thought that we might bring that to Moodle.
b) ...or I thought that perhaps we should look to provide a regexp type. Now I see that I don't need to start thinking about this from scratch - thanks ;~)
c) but if you're really into this you might like to go to the OpenMark Examples site again and look under 'Text response | Free text'. On that page is a link to 'S103 trial...' where we're using Intelligent Assessment Technology's free-text analyser, which uses computational linguistics. Definitely not for the faint-hearted author, but we have two colleagues in our Science Faculty who have 'taken the plunge' and I think we're quite impressed with where we've got to. The IAT system has its own, sophisticated, authoring tool for handling response-matching.

This is an area that interests me, but as I indicated yesterday it will have to wait a while longer. The OU has put a considerable amount of money into the Gradebook in 1.9, so we need to ensure it's close to what we specified, and currently all our development resource for the quiz is tied up until the turn of the year.

Regarding 'comms': sorry, 'communications' between the client and server so we could collect student responses and mark them - for credit if we wish.

And now for the good news - we have an OpenMark project starting with our Languages Faculty in September. When that's done I should have some different questions to show.

Regards

Phil
In reply to Phil Butcher

Re: Assessment for Learning

by Tony Gardner-Medwin -

Interesting posts, Phil & Joseph. It would perhaps be good if we could get together and compare side by side the features these different systems and strategies offer. Perhaps by luck rather than good management I chose JavaScript not Java, and have had hardly any of the problems Phil sets out. The main plague has been the rash of commercial add-on popup blockers, all working in different ways. We have voluntary submission of data, but collect upwards of 10^6 answers per year. For required coursework, students must submit of course, and along with collecting the full data on the LAPT system we upload summary info automatically, with pretty secure authentication, into the WebCT (4.1) gradebook. I hope this will be possible with the Moodle gradebook developments.

Tony