However, the things we did while changing the Moodle quiz at the OU were done while trying to make minimal changes to the quiz/question code. (We wanted to avoid a fork.) I have since come to realise (yay, hindsight!) that what was actually needed was some big changes to clean up the mess that has accumulated over the years, and only then to add our new stuff.
I have been writing up the details of what I propose to do at Development:Question_Engine_2. The overview section is the one that tries to explain what I am going to change in a comprehensible way. That will be expanded on in the detailed design section, which I have not finished writing yet.
As I say, while necessarily a bit technical, I hope that this is comprehensible. If you would like anything explained further, please ask. I sometimes forget that not everyone is a software developer.
If you think this looks familiar then you may have read a previous proposal of mine (Development:Changing_the_Moodle_Question_Engine). The new document is the old one sliced up, and then expanded. In other words this is a well worked-out plan that I have been thinking about for a long time, and I am really happy that I finally get the chance to implement it.
However, even though (I think) the basic idea is well worked out, I would still love to hear any comments anyone has on the proposals. I have been wrong before, and one of the nice things about working in this community is that people can look at your proposals and point out your mistakes before it is too late to fix them. Well, when I say nice ... it doesn't actually feel nice when it happens ... but don't worry about my feelings if you want to criticise.
A quick look confirms how much this sums up your "remarkable" knowledge of the quiz and questions.
"the question types will be given access to the full state history" is one of the main points.
Pierre
P.S. More reading later, then back to the unit handling project.
Wow. As a non code person, I believe I learned some more about how things fit together by reading the new pages around Development:Question_Engine_2.
Dumb question. Question Engine, Quiz, Question Bank: how do they relate? At times I thought Question Engine was another name for Quiz. The question_attempts data table seems to say this is what is going on. Probably my ignorance.
For others I will share: I found the OpenMark S205 example interesting from both Lesson and Quiz standpoints. My last chemistry class was in 1961. I started with "Elements and compounds", with question types similar to short answer or numerical questions.
Note that not only did the feedback change with each attempt to answer the question, but there seemed to be a change in the image associated with the question with every attempt, while the question itself remained unchanged. I did sample some organic chemistry questions (lucky guesser, because these were MC) with similar adaptive feedback, without changes in the image.
In Lesson this "adaptive" nature can be achieved via jumps to another page based on the students answer/choice. The new page is either another content (basically a grade neutral MC page with labeled buttons) or another question page.
The concept of adaptive testing, as compared to adaptive feedback, is another level that the teacher can attempt in the Lesson module but not in Quiz. I wondered whether this could be done in the OpenMark example. What I could not tell was whether the "next" button could be programmed to jump to another specific question, based on the attempt and the answer, or if it was just the next question in the sequence, or another random question based on the quiz settings. This is probably going down the road of having one Engine do too much (i.e. where Lesson is, in my opinion).
It does get back to the separation issue of adaptive control being assigned to either the question or the question engine. How does Question Bank relate to both Lesson and Quiz in the longer run?
Fascinating and a heck of a lot of work there. Thanks for sharing. Chris
- Question bank
- This is a database that stores question definitions organised into categories and contexts. With facilities to browse, select, import and export the definitions.
- Question engine
- This is the subsystem that manages the execution of the question definitions when someone attempts them. That is, it turns the static definition of the question stored in the database into an interactive experience. This part of Moodle has not really had a name until now, but it is a relatively distinct part of the system, and it is useful to have a name for it. We used 'question engine' in OpenMark, and I like it.
- Quiz
- This is a particular activity that uses particular questions selected from the question bank, as executed by the question engine, to try to teach and assess students.
That is very helpful for me at least.
I also appreciate the post over in the Lesson forum about the core work. It would be nice if Lesson truly shared the question bank. It would seem to me that there is a Lesson Engine, with the Lesson module setting up the parameters of the activity (if I follow you in your post).
I will get out of your way here and into it in other places.
Best Chris
One technical comment:
The solution I used to handle number and unit is to create input elements (answer and unit) that are both responses to the same numerical question.
In Cloze, the input elements are each related to a different question.
The current code supposes that each question has only one $state->responses field, and puts the response in $state->responses[''].
When there are many subquestions, the question ids are added to $state->responses, which is saved in the one answer field with separators specific to the question type.
In numerical, the answer and unit elements are separated by '|||||', which in any case cannot occur in a numerical answer or a valid unit.
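As a rough illustration of this packing scheme (the helper function names here are made up for the sketch, not the real Moodle 1.9 API; only the '|||||' separator comes from the description above):

```php
<?php
// Illustrative sketch only: these helper functions are hypothetical, but the
// '|||||' separator is the one the current numerical question type uses to
// squeeze two input elements into the single stored answer field.
function pack_numerical_response($answer, $unit) {
    return $answer . '|||||' . $unit;
}

function unpack_numerical_response($packed) {
    list($answer, $unit) = explode('|||||', $packed);
    return array('answer' => $answer, 'unit' => $unit);
}

$stored = pack_numerical_response('1.92', 'm');
// $stored is '1.92|||||m'
$parts = unpack_numerical_response($stored);
// $parts is array('answer' => '1.92', 'unit' => 'm')
```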
Could your question engine 2.0 handle this (i.e. multiple-part responses) in a better way?
Pierre
P.S. Where should we put comments on your project?
With the new proposal, this will not be necessary. The multiple input elements naturally arrive at the Moodle server as an HTTP POST submission. That is, the server might receive data like
q123_answer = '1.92'
q123_unit = 'm'
q124_... = ...
The question engine will sort out the responses for each question, so the numerical question will be passed an array $responses like
array('answer' => '1.92', 'unit' => 'm');
and this will automatically be stored in the question_responses table, associated with the state of the attempt at the particular question.
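Just to illustrate the idea (this is a sketch of my own, not the actual engine code; only the q{id}_{field} naming comes from the example above), the grouping of submitted data by question could look something like:

```php
<?php
// Hypothetical sketch of how POST data such as q123_answer and q123_unit
// could be grouped into one $responses array per question.
function split_responses_by_question(array $postdata) {
    $responses = array();
    foreach ($postdata as $name => $value) {
        // Field names of the form q{questionid}_{fieldname}.
        if (preg_match('/^q(\d+)_(\w+)$/', $name, $matches)) {
            $responses[$matches[1]][$matches[2]] = $value;
        }
    }
    return $responses;
}

$post = array('q123_answer' => '1.92', 'q123_unit' => 'm', 'q124_answer' => 'true');
$responses = split_responses_by_question($post);
// $responses['123'] is array('answer' => '1.92', 'unit' => 'm')
// $responses['124'] is array('answer' => 'true')
```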
I hope I explained that OK.
I would prefer to discuss the proposal in forum threads here. Either in this thread, or start a new thread, whichever you think makes sense. However, you can also add comments to the talk page if you prefer (as Oleg is doing). I will also see and respond to those. However, I find a forum better suited to discussions.
This also opens up a lot of opportunities for new question types that are difficult to handle at the moment.
Joseph Rézeau will surely have comments on this.
Pierre
Would it be possible to enhance the Essay input forms so as to store the answer data as the student enters it, so that the student (and teacher and other peer reviewers) can go back and update it after ending a particular Attempt 1? Are there any other Moodle communities who would like such an enhancement? Any other suggestions would be greatly appreciated! Thank you for all of your support!
Thanks
Willy Felton
So far, I have written a lot of new code that works well in isolation from the existing parts of Moodle. That includes the core of the question engine, and most of the new interaction models (eight done, two to go, plus one ou-specific one). All that with thorough unit tests. I have also started on the code that loads and saves the state of play from the database.
At the moment, it uses fake question type classes with a new API and bits of the inner workings copied from the old classes. At some point I will have to start converting the real question type plugins.
Anyway, I have just updated all the latest code at http://timhunt.github.com/Moodle-Question-Engine-2/. If you want, you can check out the new_qe branch from there, which is based on the Moodle 1.9 codebase. Install it, then go to admin -> reports -> unit tests and run the tests in question/engine (12/12 test cases complete: 221 passes, 0 fails and 0 exceptions) and question/interaction (10/10 test cases complete: 312 passes, 0 fails and 0 exceptions). That is not very exciting, I will admit, but it does represent quite a lot of work.
I think the time has come to start making the new code work with the existing bits of Moodle. In particular, I think my next step should be to make the question preview pop-up window use the new code, and make that work for at least one question type, probably true-false.
All that is probably incomprehensible unless you are a developer and familiar with the design document I wrote. Oh well.
I tried to load the zip file but got an error message.
I even signed up with GitHub, but the problem persists. (The tar reference shows the same problem.)
Pierre
P.S. Back at the OU, you seem to have as much time to work on Moodle code as you did in Australia, given the work done since your return.
I normally just use git from the command line. That is, follow the instructions on Development:Tracking_Moodle_CVS_with_git (actually you just need the git clone command) but use the github URL instead of the git.moodle.org one.
One of the things I learned at Moodle HQ was how much more I could get done away from the distractions of a big institution (particularly all the meetings, management and reporting they use to make sure we are working efficiently). So, now that I am back, I am trying to keep the focus on writing code and not getting side-tracked. Also, they have just let me get on with this one project, which is great, but I only have a certain amount of time to get it done, which does focus the mind.
In the meantime, the link http://github.com/timhunt/Moodle-Question-Engine-2/zipball/master is not working, even on a Mac.
Pierre
http://github.com/timhunt/Moodle-Question-Engine-2/zipball/new_qe
and that is the only link I can see on the github site. I wonder where you found the other link.
By the way you can get git command line tools, and gitgui for Windows. But then learning a whole new tool just to get these files is probably overkill.
"I wonder where you found the other link."
I just followed the guide
http://timhunt.github.com/Moodle-Question-Engine-2/.
and clicked on the beautiful icon for the zip file, or the little link at the end of the page.
Pierre
Installed it, ran it, and it passes the two test suites as described.
I will look more closely...
Pierre
Can you put your project on an XREF site? That seems to me one of the best ways to understand this kind of complex code, and would complement the docs.
Pierre
As a first reaction, I very much like the way you use classes, which is closer to the standard style I have learned as a computer autodidact.
You have done a similar thing in the showbank code.
Perhaps class use in the question code was somehow limited by previous PHP versions' class handling.
You understand better than us the difficulties related to the current "plug-in" question types.
Do you have any statistics about the use of non-standard plug-in question types that are (or are not) catalogued on the moodle.org site?
As when using Fortran punch cards years ago, I find myself once again at the beginning of a steep learning curve before becoming familiar with the new engine.
Pierre
The other reason is just the history of Moodle coding style.
The only statistics are http://download.moodle.org/stats.php, which is a bit limited, as that is just a download count, not a use count, and it only counts downloads, not CVS checkouts.
Thinking about the question preview window. How about the following?
The main change there is adding the options form, so you can see and change the options that affect what you are seeing. (Currently there is nothing.) If you are previewing a question from a quiz, these settings will default to the quiz settings.
Also, making the navigation to previous states clearer. (Currently, a Previous button appears after you have done something.) Do we really need this, or would it be enough to have only the Start again button?
I only managed to get rid of one button (Submit page).
I was thinking that the default size for the popup would be such that you see the question, the buttons, and a bit of the top of the settings form.
Anyway, comments welcome.
Are you planning to allow changing of the quiz settings from a question used in the quiz as implied in the mock up? If so, I think that would be confusing to users. A display of the relevant quiz settings that is read only could be useful.
Actually, it is already implemented in my new code, but the new code is not finished.
About the question preview windows...
Does this mean that CBM will be incorporated into all question types, as an option?
Why not get rid of the "Submit all and finish" button? I think it is useless on that Preview window.
And a Start again button is enough, please let's keep the interface as simple as possible.
When shall we be able to CVS-update our Moodle 2.0 to see current state of the Quiz and Questions modules?
Joseph
CBM will be available for all question types (that do not need to be marked manually). It will be a quiz option, like the adaptive mode/non-adaptive mode choice you have now. Indeed, the
How questions behave: [... various choices ...| v ]
dropdown replaces the
Adaptive mode: [Yes / No| v ]
on the quiz form too.
If you had read all of this thread, you would know that my latest code is available from http://github.com/timhunt/Moodle-Question-Engine-2, but there is not much to see yet. I don't plan to put it into CVS until it is done.
Yes, Tim, I know (I have read all of the thread); but I'm only interested in doing a CVS update, so I'll just wait until it is done...
A teacher using Bb CE can develop a quiz and preview it by taking it exactly the way a student would. The quiz even gets scored, and the score gets dumped into the gradebook as "Demo student." Only down side is you get to take the quiz as "Demo student" only once (hmm - can't remember if a teacher can preview it a second time without it being scored - I think so).
The principle of making the preview behave as much as possible like the real thing is one worth pursuing, I think. I've been most confused when the preview doesn't behave like the real quiz - as with the essay question currently. Thanks for all of your efforts!
Peter
A particularly good way to do this is to use two web browsers. For example, you can be logged in as you in Firefox, and logged in as Dummy Student in Internet Explorer.
"
core_question_renderer
Renderers are responsible for generating the HTML to display a question in a particular state. The core_question_renderer is responsible for all the bits that do not depend on the current question type or interaction model.
qtype_renderer
Base class for
- qtype_truefalse_renderer
- qtype_multichoice_renderer - and possibly also qtype_multichoice_horizontal_renderer
- ...
Responsible for generating the bits of HTML that depend on the question type. For example the question text, and input elements.
"
Will the plug-in question types be able to add a new qtype_renderer, with the base classes used as much as possible to eliminate code duplication?
"and possibly also qtype_multichoice_horizontal_renderer" could indicate that this is so, as multichoice_horizontal is currently implemented in the Cloze question type.
Is the new unit display (left or right), with the options (text, text input, multichoice) for numerical and calculated, being defined as part of the project?
In the 2.0 version of the calculated question type, when creating the question, the user can choose between
- a standard version, with the parent virtual qtype being numerical,
or
- a multichoice version, with the parent virtual qtype being multichoice (single or multiple).
Once the question is created, it cannot be changed back to the other type.
Although this would mean creating a new qtype, it could be better to have the two options as separate question types.
Pierre
Short answer: yes.
For the full answer, you will have to wait until I have made more progress with that bit of the code. The details are being worked out as I go along.
I am not promising to implement any new features in the question types. My priority has to be to make sure the question types still work the same way they always have, but in the new framework. However, I may implement some new stuff as I go along if it seems easy and worthwhile, and backwards compatible.
This is much more difficult than creating from scratch.
I just mentioned these cases as examples of complex question structures or rendering.
When things are more clearly defined, if you think I could help, I volunteer to explore the new coding for the question types that I know better than the others.
Pierre
I think I need to get multichoice, truefalse, shortanswer and random working first, but then I will tip you off to have a go at calculated and possibly multianswer. multianswer is going to be the ultimate test of whether the new code can cope, and how much the new system lets us clean up old code.
Me too.
You have already set up a test for a multichoice single question.
Let's take the case where I want to test a multianswer question containing 2-3 multichoice questions (MCV, i.e. vertical radio buttons, which is the multichoice case you built), but without shuffling, as in the current multianswer options.
Many users ask for shuffling, but it appears to require another parameter to be stored in the currently restricted single-attempt question record.
What will be the general schema to create such a multianswer example?
Pierre
P.S. Although multianswer is more like a quiz within a quiz, in the current code, even if there is more than one subquestion, the multianswer is considered as one question, and displayed and graded accordingly.
The option is offered when the question is created and cannot be changed when editing the question.
This is as a matter of fact, two question types in the same box.
I think calculated should be split into calculated and calculated_multichoice (or calculatedmultichoice) question types.
This will simplify the code of the edit_calculated_form, by creating a new edit_calculated_multichoice_form, and all the rendering, attempts etc. code.
The calculated will be very similar to the numerical in rendering, grading etc.
The calculated_multichoice will be very similar to the multichoice (either single or multiple) in rendering, grading etc.
The calculated_multichoice could use the datasetdefinitions_form and datasetitems_form of calculated.
The necessary database fields have already been created in the calculated_options table.
You are experimenting on a 1.9 version because, I think, you are using some work done at OU.
In 1.9 the datasetdependent question type is present, but it was merged with the calculated question type in 2.0 (HEAD).
Should you migrate to 2.0?
Pierre
P.S. I put this first on the other thread "What are all ..." but deleted it and added it to this thread.
I just previewed a true-false question in the question preview window, and had it (mostly) work. Time to go home and have a beer.
(Latest code pushed to github, for those who care.)
Pierre
Have you set out somewhere the code flow for using a true-false question in a quiz with the new engine?
Pierre
More comments if I survive the Minotaur.
Pierre
What I understand is that 2.0 will change a lot of things (e.g. the file system), so that it will get quite complicated to upgrade directly from 1.9 to 2.0.
The new engine adds other upgrade constraints that should be included in 2.0 from the start.
The new engine's flexibility also fits the 2.0 philosophy of a better e-learning tool for students.
Pierre
However, I don't want anything I am doing to be the thing that delays Moodle 2.0 any more. If it is finished in time, this can go into 2.0, but if not, it is not a tragedy if it goes into 2.1. I think by Christmas, we will have a clearer picture how my work, and Moodle 2.0, are progressing.
Pierre
Pierre
On the working side, it is now possible to preview all the standard question types apart from multichoice, calculated and random short-answer matching, in the question preview pop-up. The question preview pop-up has been extended so you can change the options used to preview questions. In the first three days of next week, I am hoping to get attempting a quiz to work.
On the still-to-do side, well, I have a full todo list at http://github.com/timhunt/Moodle-Question-Engine-2/blob/new_qe/question/engine/todo.txt. The major tasks are
* Quiz reports
* Upgrading old data in the database
* Backup and restore
* Re-do a whole bunch of OU-specific changes (some to go into the main Moodle, and some not)
* Remaining question types.
* Testing and bugfixing.
Altogether it is about 45 days work for me to get this into the OU VLE, and another 15 days to get it into Moodle core.
Note that there may be important requirements the OU wants in other parts of Moodle, that I have to work on as well as this after Christmas.
On the Moodle 2.0 side, Martin is now guesstimating February.
I just downloaded the code, but as the database is not updated, the classic preview failed.
Did you create the new database tables by hand, or is there something I missed?
Incidentally, in the current code numerical's parent is shortanswer. This seems not to be the case for the renderer.
Pierre
P.S. Just to take a look before going south to Punta Cana.
Building them; I will come back later.
Pierre
I was experimenting with what should inherit from what to duplicate the least code. The current organisation may not be the best, and may change. Also, it is very complicated, with lots of base classes and clever design patterns. I definitely need to draw a diagram, and I may need to simplify it.
Works correctly.
I will try to see how calculated could be fit in this.
"it is very complicated, with lots of base classes and clever design patterns"
I agree.
Pierre
As the 2.0 beta should be released on 1 March 2010, can you manage, given all your workload, to get the question engine ready for this release?
Pierre
My detailed work breakdown for this project is at http://github.com/timhunt/Moodle-Question-Engine-2/blob/new_qe/question/engine/todo.txt. I think if you do the sums, then it is 50+ days before I have something ready to go into Moodle core. Therefore, it won't be ready on 1st March. However, this is the only bit of development I have to work on at the moment, so I should not get diverted, unless bugs occur, as they did this morning.
These are the types used in the Cloze question.
From what I have seen in their formulation_and_controls functions,
I conclude that the Cloze formulation_and_controls function will be built much like the current one, using cut and paste from the other question types.
If you agree, I could work on Cloze.
Pierre
P.S. Given the new way to store the attempts, we could add multiple-response multichoice to Cloze. This is often asked for on the forum, as in the recent thread
Embedded answers: can you add MCV questions with multiple answers required?
However, the whole point of the re-write has to be to make things more flexible in future. So whatever code you write, you should be thinking about how easy it will be to add new features in the future, and adding multiple-response multichoice would be good.
I was hoping it would be possible to avoid copy-and-paste. Renderers should be state-less, and each method should have everything it needs passed in as arguments.
So I hope it will be possible for qtype_multianswer_renderer, when it needs to output an embedded MC question, to do something like
// Prepare some data ...
$mcout = renderer_factory::get_renderer('qtype_multichoice', 'single');
$output .= $mcout->choices_horizontal($some, $arguments);
For that to work, the code inside the multichoice renderer must be split up into more separate functions, so that there are methods choices_horizontal and choices_vertical, and formulation_and_controls calls choices_vertical. (Later it might be nice to add an option to the MC qtype so that teachers can choose horizontal or vertical rendering there.)
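A minimal sketch of that method split might look like this (the method names follow the ones used in this thread, but the class is a stand-in of my own: the real renderers take question and attempt objects, not arrays of HTML strings):

```php
<?php
// Sketch only: the point is the split into choices_vertical/choices_horizontal,
// with formulation_and_controls delegating to one of them, so that another
// renderer (e.g. for multianswer) can call choices_horizontal directly instead
// of copying and pasting the rendering code.
class qtype_multichoice_renderer_sketch {
    public function formulation_and_controls(array $choices) {
        // The default multichoice layout is vertical.
        return $this->choices_vertical($choices);
    }

    public function choices_vertical(array $choices) {
        return '<div class="answer">' . implode("<br />\n", $choices) . '</div>';
    }

    public function choices_horizontal(array $choices) {
        return '<div class="answer">' . implode(' ', $choices) . '</div>';
    }
}

$renderer = new qtype_multichoice_renderer_sketch();
$html = $renderer->choices_horizontal(array('[A]', '[B]', '[C]'));
// $html is '<div class="answer">[A] [B] [C]</div>'
```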
Sorry, but the Cloze reality is a hopeless case.
The first constraint is that the short_answer, numerical and multichoice elements must be in-line with the question text, which means that for multichoice a select element is used.
Even if I use calls to the current renderers with questiontext = '', the result will not be in-line, because of the various <div> tags:
question_display_options $options) {
    $result = '';
    $result .= $this->output_tag('div', array('class' => 'qtext'),
            $question->format_questiontext());
    $result .= $this->output_start_tag('div', array('class' => 'ablock'));
    $result .= $this->output_tag('div', array('class' => 'prompt'),
            get_string('selectone', 'qtype_multichoice'));
    $result .= $this->output_start_tag('div', array('class' => 'answer'));
    foreach ($radiobuttons as $key => $radio) {
        $result .= $this->output_tag('span', array('class' => $classes[$key]),
                $radio . $feedbackimg[$key] . $feedback[$key]) . "\n";
    }
    $result .= $this->output_end_tag('div'); // answer
    $result .= $this->output_end_tag('div'); // ablock
Furthermore, the feedback and correct response are shown in a popup window when the mouse is over the input element.
As for horizontal multichoice, it is already offered in the current Cloze, because it is useful for yes / no / I don't know questions.
An almost complete quiz with many similar questions can easily be put in a table inside a single Cloze question.
The horizontal/vertical choice uses the layout column of the question_multichoice table.
I can, however, divide the different tasks (grading, feedback, display) of the multianswer formulation_and_controls function so as to refer to the specific qtype of each subquestion, and create a specific multichoice select renderer.
A similar problem occurs with calculated, which can be a numerical or a multichoice, although it is simpler because the standard rendering can be used.
My first tests done on calculated in December only covered the numerical version of calculated.
So we can also discuss the calculated case before the Cloze.
As there was a need for a simpler editing interface for a simpler calculated question using only private datasets, I created the calculated simple question type.
The question editor is simpler, and the question rendering can use the numerical version of the calculated question type.
There was also a need for a multichoice using datasets
- (i.e. in a single question you can ask the student to choose the response for a circle's area, offering a choice between the correct one (pi*r^2) and a common error (2*pi*r)).
The calculated question editor allows (over 3 pages) all the possibilities of datasets.
The choice between numerical and multichoice is made on the first page, and cannot be changed after that, in order to eliminate any problems when using the question in a quiz if the user could modify the question type later.
A better choice would have been to create a new question type, i.e. calculated multichoice, but it appeared easier to do it this way, given your concerns about the number of official question types.
Also, the calculated simple question type was created mostly because the plug-in structure does not allow two question_edit_forms, but this question type uses most of the calculated code.
Having calculated simple, calculated (as numerical) and calculated multichoice question types means the following:
- Calculated simple and calculated (as numerical) mostly use the numerical engine code, as the calculated question type code does now.
- Calculated multichoice
- will simplify the calculated question_edit_forms,
- use the same datasetdefinition code,
- use the calculated dataset code,
- use mostly the multichoice (single or multiple) engine code,
- and give the user a clear indication of the question type.
These three question type versions will be the easiest to maintain over the long, long term, as my interest in the calculated question type goes back to the 1970s and, although I am 67 years old, my life expectancy is around 20 years.
Pierre
All those three calculated versions use the same data tables, the necessary parameters being stored in the new question_calculated_options table added in 2.0.
I could not test the multichoice version of calculated on the new engine, as this is a new feature in current HEAD and not in 1.9.
Pierre
Also, feel free to make something that works, even if you have to copy and paste code. Then we can improve it later. I have been doing that a lot, and I find it a good way to make progress.
"you can take Moodle 2.0 code and use it with the new question engine"
So should I put the new 2.0 calculated code into the new engine (your code on github), which is 1.9, manually creating the necessary new calculated database tables,
or
the reverse?
"the comment I made in mod/quiz/db" ?
Where exactly ( in github or 2.0) ?
Pierre
I was talking about http://github.com/timhunt/Moodle-Question-Engine-2/blob/new_qe/mod/quiz/db/upgrade.php#L90
There have been important changes in the calculated question code from 1.9 to 2.0, including the fusion of calculated and datasetdependent, the new db table question_calculated_options, and the addition of the calculatedsimple and calculatedmulti question types.
There is also new unit handling code for numerical and calculated (MDL-20296).
All these changes are planned for 2.0.
The sooner you migrate to 2.0 on github, the better it will be.
Unless you create a 2.0 site on which I can fit calculated and cloze and numerical and then back to 1.9 for the classic calculated and cloze.
For units, the 1.9 code is simpler, and this can be solved in the current 1.9 github.
Pierre
There will eventually be a Moodle 2.0 based version on github, I just don't know when.
There are no important modifications to Cloze code for 2.0.
I can synchronize my work with yours better if I first solve the Cloze question on the current 1.9 github, as the short answer, numerical without units and single-response multichoice cases have been solved.
Once this is finished, if you have solved the unit problem at the 1.9 level, I can finish the classic 1.9 calculated.
Solving the Cloze also solves the MDL-20296 proposal for units, as the numerical becomes
- a Cloze with a numerical (the unit is given as text),
- a Cloze with a numerical and a shortanswer, or
- a Cloze with a numerical and a multichoice.
Pierre
The calculatedmulti/questiontype.php is minimal, using most of the calculated/questiontype.php functions.
The calculated_multichoice should allow the use of the multichoice question engine as easily as calculated uses the numerical one.
Stay tuned for more news.
Pierre
P.S. Incidentally, the qtype field length is 20 characters, so less than 'calculatedmultichoice'.
Do you prefer calcmultichoice or calculatedmulti?
I will try on Cloze something like the following:
create a
class qtype_multianswer_question extends question_graded_by_strategy
as graded_by_strategy is used by numerical and shortanswer,
and itself
extends question_graded_automatically, which is used by multichoice.
For each subquestion I will create the specific question type.
I need just one renderer,
class qtype_multianswer_renderer_base extends qtype_renderer {
and will use most of the code flow for the question text that is currently used, with the necessary modifications following what you have set up for the multichoice, numerical and shortanswer renderers.
Some first results expected next week.
Pierre
P.S. As usual, all comments will be appreciated.
That is one approach.
Another approach I was wondering about (but I am not sure if it would work) is this.
Make
class qtype_multianswer_question extends question_graded_automatically
private $subquestions = array();
Have $subquestions be an array of other question definitions. So, if the first sub-question is a shortanswer question, then $subquestions will contain a qtype_shortanswer_definition that has been set up appropriately. And if the next sub-question is a multiple choice, then $subquestions will next contain a qtype_multichoice_single_definition, initialised appropriately.
Then, when you need to grade responses, the qtype_multianswer_question asks all the subquestions to grade their bits.
The reason for doing it that way is to make it easier to use other question types in multianswer in the future.
However, as I say, this is just a broad outline that sounds good in theory. I don't know whether it would work in practice.
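Something like this, perhaps. Note this is only a rough sketch of the delegation idea: grade_response() returning array($fraction, $state) follows the new engine design, but the two helper methods here are made up for illustration and are not part of any proposed API.

```php
// Rough illustration of delegated grading; extract_subquestion_response()
// is a made-up helper, not real code.
class qtype_multianswer_question extends question_graded_automatically {
    /** @var array of question_definition objects, one per sub-question. */
    private $subquestions = array();

    public function grade_response(array $response) {
        $fractionsum = 0;
        $maxsum = 0;
        foreach ($this->subquestions as $key => $subq) {
            // Pick out the fields of $response belonging to this sub-question.
            $subresponse = $this->extract_subquestion_response($response, $key);
            list($fraction, $state) = $subq->grade_response($subresponse);
            $fractionsum += $fraction * $subq->defaultmark;
            $maxsum += $subq->defaultmark;
        }
        $overallfraction = $fractionsum / $maxsum;
        return array($overallfraction,
                question_state::graded_state_for_fraction($overallfraction));
    }
}
```

The point is that the multianswer question itself never inspects answers; each sub-question definition grades its own part of the response.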
(Note, I just pushed a bunch of changes to github, you may like to update.)
Your suggestion of

class qtype_multianswer_question extends question_graded_automatically
    private $subquestions = array();

is essentially a replica of what the actual cloze code does, as in:
while (preg_match('~\{#([^[:space:]}]*)}~', $qtextremaining, $regs)) {
    $qtextsplits = explode($regs[0], $qtextremaining, 2);
    echo $qtextsplits[0];
    echo "<label>"; // MDL-7497
    $qtextremaining = $qtextsplits[1];
    $positionkey = $regs[1];
    if (isset($question->options->questions[$positionkey]) &&
            $question->options->questions[$positionkey] != '') {
        $wrapped = &$question->options->questions[$positionkey];
        $answers = &$wrapped->options->answers;
        ...
        if ($correctanswers = $QTYPES[$wrapped->qtype]
                ->get_correct_responses($wrapped, $testedstate)) {
        ...
        foreach ($answers as $answer) {
            if ($QTYPES[$wrapped->qtype]
                    ->test_response($wrapped, $testedstate, $answer)) {
                $chosenanswer = clone($answer);
                break;
            }
        }
        ...
These calls will have to be changed to the equivalent ones in the new question classes.
For the class qtype_multianswer_question, I was under the impression that extending question_graded_by_strategy could give more flexibility than extending question_graded_automatically, as some functions needed for numerical and shortanswer would already be defined.
As you design these classes, I will switch back to the more general
class qtype_multianswer_question extends question_graded_automatically
Pierre
The first candidate should be Joseph's regex question type.
The main problem, as I understand it, is that the regex analysis can fail if the expression is not written correctly by the teacher.
Otherwise it is easy to include, and it would use some of the new aspects of interaction with the students that are included in your design.
Your expertise will be of great help when we come to this.
Pierre
P.S. I noticed your recent changes in github, thanks for the advice.
I will split

public function formulation_and_controls(question_attempt $qa,
        question_display_options $options) {

in two parts, so that the question text is printed independently.
I will first experiment with creating numerical_inline, shortanswer_inline, multiplechoice_inline and multiplechoice_horizontal classes and renderers, which should control the display of the necessary controls.
This way, multianswer can use these renderers much as the actual vertical multichoice does, passing questiontext = '' to each sub-renderer's formulation_and_controls(question_attempt $qa, question_display_options $options).
This should allow a cleaner addition of other question types.
The multianswer role will be to switch between the various renderers according to the subquestions in the question.
We should also allow users to position the various subquestion controls at their convenience (e.g. a table containing a series of horizontal multiple choice sub-questions: yes, no, I don't know).
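To illustrate the switching idea, the dispatch could look roughly like this. Everything here beyond the qtype_renderer base class and formulation_and_controls() signature is an assumption for the sketch: the per-subquestion $subq->layout field and the subquestion_controls() method are made-up names, not real API.

```php
// Sketch only: multianswer delegating each sub-question to an inline
// renderer chosen by the sub-question's qtype and layout.
class qtype_multianswer_renderer extends qtype_renderer {
    public function formulation_and_controls(question_attempt $qa,
            question_display_options $options) {
        $question = $qa->get_question();
        $output = '';
        foreach ($question->subquestions as $key => $subq) {
            // Fetch e.g. the qtype_multichoice 'inline' or 'horizontal'
            // renderer, depending on how this sub-question is declared.
            $renderer = $this->page->get_renderer(
                    'qtype_' . $subq->qtype, $subq->layout);
            $output .= $renderer->subquestion_controls($qa, $options, $subq, $key);
        }
        return $output;
    }
}
```

The multianswer renderer then only stitches the rendered controls into the question text; each inline renderer owns the HTML for its own control.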
Pierre
It has no answers or responses of its own; everything is in the subquestions.
As we have a parent parameter in the question type, we should add a similar parameter to question_attempt.
This will allow more flexibility in the future.
Or we could use something like the quiz attempt to store the cloze question attempt,
or
....
Pierre
Some of us at UCL are hoping to present something at the London MoodleMoot (13-14/4/10) about recent progress incorporating Self-Tests (designed to let students take more charge of their own learning) and Certainty-Based Marking (CBM) into Moodle 1.9.7.
We wonder if there are other plans for related presentations we should be coordinating with. Much of our progress and discussion material is available at www.ucl.ac.uk/lapt/moodle19/moodle (login with id=pwd=moodler).
Tony GM
I am planning to talk about what is new in Moodle 2.0, and also the further work that is the subject of this thread, and which includes CBM.
Phil Butcher is planning to talk about what we are doing with our assessment tools at the OU - how courses are using it, the feedback we get from students, and so on.
And another Tim, from the maths faculty, is planning to talk about their use of the Moodle quiz on their new first-level maths course (1,500 students, 15 summative and 5 formative quizzes, and 2,000 questions created in total!).
I think the proposals are 60, 20 and 20 minutes.
Before I put my proposal in, I emailed the organisers, and basically, once they get all the proposals, they may well contact the potential speakers to get them to coordinate a bit.
Anyway, I would certainly like to hear about what you have been doing, so please do put in a talk proposal.
Tony
I don't have a publicly accessible server I can put this on. I was wondering about asking if it can go on test.moodle.net, but I keep not getting around to it.
Recently, I have been working on converting the quiz reports to work with the new question engine. Last time we (the OU) wanted some major improvements to the quiz reports, we out-sourced the work to Jamie Pratt, and I am finding out quite how glad I am we did that. There are so many different settings and options to consider, and the code is necessarily very complicated. I know Jamie cleaned up the code a lot while he was working on it. I hate to think what it was like before. (These improved reports are already in Moodle 2.0.)
In particular I have been making use of the fact that the new question engine can correctly distinguish questions that the student did not answer from questions that they got wrong, and also questions that still need to be manually graded. (Currently essay questions are treated as having a score of 0 before they are graded, which leads to us displaying meaningless scores: MDL-3936.)
I think I am nearly there with the grades and responses reports. I still have the manual grading and statistics reports to go.
- A manually graded question that has not been graded yet.
- Hence that whole attempt does not have a grade yet,
- and that is also correctly reflected in the averages.
- You can see in the report which questions the student has flagged.
- The new system can cope with questions that give a negative grade.
- Questions that the student did not answer are shown similarly, but differently from questions that they answered and got wrong. (Still red, with an X, but no grade, rather than 0.)
Tim,
Although this is an "old" issue, it is now becoming reality, which is a real pleasure.
Could you say here what the calendar for the next few weeks will be?
Pierre
P.S. There are rumours that work on the remaining question types will be done after the attempt code. Is there a way I can help you find the Ariadne's thread of the calculated code?
Although perhaps you would rather climb the "challenging learning curve" on your own.
I suppose it is appropriate to resurrect this old thread now, since we are on the home straight, I hope.
I spent the first three days of this week at the UK Moodle Moot, which was most enjoyable. It is always nice to meet Moodlers (I hope I can make it to Canada one year).
In his keynote, Martin said that Moodle 2.1 would be released on 30th June, so new features have to be complete and integrated by then. The next 5 weeks will be busy, but I think it is possible.
I gave a presentation where I talked about the new question engine, and all the talks were recorded, so when the recordings are published, I will put the links here.
The bits that are finished are currently being tested here at the OU, and not many problems have been found so far. The bits that are not finished yet are:
- backup and restore of attempt data.
- upgrade from 2.0
- numerical
- calculated
- calculatedsimple
- calculatedmulti
- multianswer
- randomsamatch qtypes.
Note that question editing has not changed much, and I think that most of the complexity for calculated is there.
Whatever you set up for numerical editing will also apply to calculated and calculatedsimple, as they use the same (in need of review...) code in the editing form.
Once the calculated dataset variables have been substituted into the answers, the grading is the same as numerical, although there are more tolerance options.
The same applies to calculatedmulti, to which you could apply the multichoice code once the multichoice text has been converted, either by simply replacing {var} or by evaluating an equation in {=...}.
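For illustration only, the substitution step could look something like this. This is a standalone sketch, not the actual Moodle code; in particular, the real code uses its own safe formula evaluator, whereas eval() here is purely for brevity and would never be acceptable on teacher-supplied input.

```php
// Hypothetical illustration of substituting dataset values into a
// calculated question text; not the actual Moodle implementation.
function substitute_dataset_values($text, array $values) {
    // Replace each simple {var} placeholder with its dataset value.
    foreach ($values as $name => $value) {
        $text = str_replace('{' . $name . '}', $value, $text);
    }
    // Then evaluate any remaining {=...} expressions. The real code uses
    // a safe formula parser; eval() here is only to keep the sketch short.
    return preg_replace_callback('~\{=([^}]*)\}~',
            function ($matches) {
                return eval('return ' . $matches[1] . ';');
            }, $text);
}

// substitute_dataset_values('What is {a} + {b}? Answer: {={a}+{b}}',
//         array('a' => 3, 'b' => 4))
// gives 'What is 3 + 4? Answer: 7'
```

After this conversion the text is an ordinary multichoice (or numerical) statement, which is why the existing grading code can then be applied unchanged.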
Incidentally, I did not find a good way to test for errors in the formulas for all these types (MDL-26823).
The handling (i.e. moving) of question or category datasets could be done after the 2.1 release, unless a deeper analysis of all the possibilities of the new engine offers easier ways to handle the datasets.
Even just setting up the "automatic question testing code" is not an easy task...
Pierre
P.S. When I worked on multianswer last year, I explored how to add all the possibilities of standard multichoice to the cloze multichoice. Somewhere in the "engine", the cloze question, which is almost a quiz in itself, needs access to properties at a higher level than what the model allowed questions at that time. You will also surely find a better way to handle, within the cloze question, the "private" properties of the questions it includes.