Improvements to the quiz reports

by Tim Hunt -
Number of replies: 75
One of the things the OU will be doing to the quiz in the new year is getting a whole bunch of changes made to the quiz reports. I have been working on a specification for this over the last few weeks. You can see it here:

http://docs.moodle.org/en/Development:Quiz_report_enhancements

This is not quite finished: the section detailing exactly what we want to appear in the Item Analysis report still needs work, but the rest is done.


The plans do not make any really big changes. Basically, the quiz reports have not had much care and attention since about Moodle 1.6, so they have not really kept up with the introduction of Roles in 1.7 and formslib in 1.8. Also, the new gradebook in 1.9 has raised the bar for reporting in Moodle, and again the quiz has been left behind a bit. Finally, there are a bunch of bugs in the tracker that I have not had time to work on, so I have chucked them into this specification, which we hope to outsource.


If you have any comments, please make them in this thread.

Any good ideas, I may be able to incorporate. However, since the OU is paying, I can only really incorporate things where I can make a case that the OU needs them.
In reply to Tim Hunt

Re: Improvements to the quiz reports

by Pierre Pichet -
Tim,
See my comments on the docs, and here is an example of the preliminary work done for multianswer calculated questions, which have a structure similar to the Cloze multianswer.

(screenshot attachment: cal)

Pierre

P.S. In fact the structure is more complex, as it allows multiple responses (multichoice), which Cloze does not allow. This could set standards for quite complex plug-in types...

In reply to Tim Hunt

Re: Improvements to the quiz reports

by Pierre Pichet -
Tim,
What is the time schedule for this project?
I think I can figure out a proposal for retrieving the subquestion (or question) data in multianswer question types such as Cloze.
What is done once this data is obtained is the main part of your project, and I don't want to interfere.
The multianswer calculated question type is the natural next development for the calculated question type, and I cannot continue on it if this report analysis cannot be done on multianswer question types.

However if you already have a solution to this problem, I will wait.

Pierre

P.S. Given the new year holidays, I am leaving for the Dominican Republic from December 20 to January 3 :-). I hope you have similar plans...
In reply to Pierre Pichet

Re: Improvements to the quiz reports

by Tim Hunt -
We are hoping that it will be implemented in the first few months of 2008. (There is probably a couple of months' work in there.) We will include it in the OU Moodle internal release that we are planning for next July, and it will be included in Moodle 2.0, which is probably about a year away.

There is already a lot of work in there, and it represents a big step forwards, so I don't want to expand it much more. Therefore, I think you should wait until this work is done, and then once the code has been cleaned up, it should be easier to do your changes.
In reply to Tim Hunt

Re: STOP Work on calculated question type for the next year

by Pierre Pichet -
"So I don't want to expand it much more. Therefore, I think you should wait until this work is done, and then once the code has been cleaned up, it should be easier to do your changes."

So you don't want to implement the multianswer question types in the report analysis now.
And I should wait until all the code is cleaned up and ported to Moodle 2.0, so about one year from now.

Should I understand that it is better to stop working on the calculated question type and come back in January 2009!!!

Pierre




In reply to Pierre Pichet

Re: STOP Work on calculated question type for the next year

by Tim Hunt -
The work will actually be done and merged into Moodle 2.0 dev early in 2008.

A lot of the code in the reports is badly in need of cleaning up, and once it has been cleaned up, it will be much easier to make other changes. Therefore, I was merely suggesting that you might want to hold off for a few months, and focus instead on other parts that you are working on. However, it was only a suggestion.

Sorry I was not clearer. At the moment I am both dying for a holiday, and so can't wait for the end of this week, but also I have a bunch of bugs to fix before Christmas, and so need this week to last as long as possible. As a result, I am probably not communicating at my best.
In reply to Tim Hunt

Re: STOP Work on calculated question type for the next year

by Pierre Pichet -
Coming back from a two-week holiday in the Dominican Republic, I just hope that you got the same benefit from your holidays :-).

The OU project should have a structure that will be usable by the multianswer question types.
We should decide how to display the individual embedded questions of a multianswer.
I suggest that the question texts be displayed separately with their responses, something similar to this example.
Q#     Question text                                    Answer's text   Partial credit
#380   What is the surface of a square of 10 cm side    100             1.0
                                                         40             0.2
                                                         20             0
       What is the perimeter of this square              40             1.0
                                                         20             0
                                                         10             0

Pierre

P.S. The problem of how to retrieve the data from the multianswer question types is the one on which I plan to work.
In reply to Pierre Pichet

Re: STOP Work on calculated question type for the next year

by Pierre Pichet -
Additional comments
  • If we consider that a multianswer question is an internal quiz with distinct (or individual) questions, the report analysis should be done on each individual question, using the necessary (new) question type functions to access the subquestion data from the multianswer question type.
  • Looking more closely at the table question text rendering, I realize that it is actually done by report.php from the retrieved question database table. The question text rendering should be done by a QTYPES->print_question_text() function, like print_question_formulation_and_controls().
  • etc.
I think that we should first solve the BUG related to the actual report.php code, which is not able to analyze multianswer question types, BEFORE improving the statistical analysis.

Pierre



In reply to Pierre Pichet

Re: STOP Work on calculated question type for the next year

by Pierre Pichet -
There are different ways to extract the data necessary to obtain a valid item analysis for every question type.
I am trying to solve the problem by treating the multianswer question type as the general case, with the usual question types being a multianswer with only one embedded question.
That is to say, the data structure is rebuilt with a more universal way of structuring the response and answer data.
I should be able to have a working prototype in a few days (next week...).
More details will follow.

Pierre

In reply to Pierre Pichet

Re: STOP Work on calculated question type for the next year

by Pierre Pichet -
Just to say that this proposal gives correct preliminary item analysis stats both for each question embedded in a multianswer (Cloze) question type and for regular one-answer question types (short answer, multiple choice, numerical).

Pierre
In reply to Pierre Pichet

Re: An item analysis example for multianswers question type (Cloze)

by Pierre Pichet -
A first example showing the splitting of a multianswer question type.

(screenshot attachment: item)
Note that regular questions are displayed correctly

Pierre

P.S. The stats (facility, SD, Disc etc.) are not correctly displayed for each embedded question in this preliminary version.

In reply to Pierre Pichet

Re: An item analysis example for multianswers question type (Cloze)

by Joseph Rézeau -
Pierre > Note that regular questions are displayed correctly.

What do you mean by "regular questions"? Do you mean the standard questions as opposed to the 3rd party plugin ones?

Joseph

PS.- I am interested in this possibility of detailed reports on the Cloze question type. I do not care about the stats, though.

In reply to Joseph Rézeau

Re: An item analysis example for multianswers question type (Cloze)

by Pierre Pichet -
No, just single-question question types, like Regexp :-)

Pierre
In reply to Pierre Pichet

Re: An item analysis example for multianswers question type (Cloze)

by Pierre Pichet -
" The stats ( facility, SD, Disc etc) are not correclty display for each imbedded question in this preliminary version."
On a closer look to the question_states, although the answers from the different questions are stored correctly, the individual grades are not stored so the stats apply only to the multianswers question total grade and should not (could not) be repeated for the individual questions.

Pierre
In reply to Pierre Pichet

Re: The item analysis code: a simple example or bug-solving code?

by Pierre Pichet -
See
http://docs.moodle.org/en/Development:Quiz_Item_Analysis_of_Multianswers_Question_Types
for some explanations of the code used.

Tim, either you use it as a guideline for your project, or I continue to work on it and put it into Moodle, as this solves a bug at least for older versions...

Pierre

P.S. The actual version is full of print_r calls, and I will put a version on the tracker depending on your decision about this project.


In reply to Pierre Pichet

Re: The item analysis code: solving MDL-11441 for old versions...

by Pierre Pichet -
Tim,
From the other postings by Jamie and you, I realize that the changes to Item analysis will be major ones.
You also wrote
"But changes of this magnitude cannot go into the public 1.9 stable branch, they have to go into HEAD there."

Can I conclude that I can continue with my proposal as an MDL-11441 bug fix for 1.9, 1.8 etc., as long as I agree to do this work only for these "old" versions?

Pierre

P.S. As this Item analysis is a report and does not modify question data, there is no risk of major bugs in the quiz or the questions. All the question types will be tested before releasing the code.

In reply to Pierre Pichet

Re: The item analysis code: solving MDL-11441 for old versions...

by Tim Hunt -
Yes, the changes to Item Analysis will be major, but hopefully the kind of changes you are making, and the kind of changes that Jamie will be making, will be quite orthogonal, so it will be possible to merge them once the dust settles.

I really like the way you are displaying the analysis with each subquestion listed separately.
In reply to Tim Hunt

Re: An almost universal Item analysis report.php

by Pierre Pichet -
Just to announce that I have almost succeeded in building a first version of the item analysis that correctly displays all current question types, whether regular, multiple-answer or random.
I will put all the details in the docs.
Here is a new version of the Matching and Random Short-Answer Matching display, which are effectively multiple-answer question types.
match.jpg

Pierre
P.S. "Almost" because Moodle always has a special way of surprising you with a bug :-)
In reply to Pierre Pichet

Re: Adding a column to display the question texts

by Pierre Pichet -
A new column containing the question name was added, so that the question texts are in the same column for multianswer and regular questions.
This also allows the use of the sorting and hide/display options.
complet.jpg

Pierre

In reply to Tim Hunt

Re: Improvements to the quiz reports

by Pierre Pichet -
"and it will be included in Moodle 2.0, which is probably about a year away."

Can I conclude that this will not be merged into 1.9, 1.8, 1.7, 1.6?

If this is the case, I could explore the changes necessary to modify the report analysis for these versions so that it could display the Cloze multianswer and the multianswer calculated question type in the current format.
Something similar to the example in this discussion thread (or topic), with a more explicit display of the subquestions' question text.
Even if these modifications only lived in 1.9, 1.8, 1.7, 1.6, they would be useful for many years to many Moodle installations.

Pierre

P.S. Technically this implies the creation of a get_subquestions() function in questiontype, and some modifications to the actual code flow to retrieve these subquestions correctly. The default get_subquestions() for all the single-answer question types will return "" (or false?). See MDL-11441.
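
To make the idea concrete, here is a rough sketch of what the new method might look like (the name comes from the P.S. above; the exact signature is only a suggestion, and the override assumes the wrapped questions are available in $question->options->questions, as the current multianswer get_question_options() code provides):

// Default, in class default_questiontype (question/type/questiontype.php):
function get_subquestions($question) {
    return false; // single-answer question types have no subquestions
}

// Override, in class embedded_cloze_qtype (question/type/multianswer/questiontype.php):
function get_subquestions($question) {
    // The wrapped questions that get_question_options() has already loaded.
    return $question->options->questions;
}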
In reply to Pierre Pichet

Re: Improvements to the quiz reports

by Tim Hunt -
Major new functionality should never be merged into stable branches. Doing so would negate the whole concept of a stable branch.

Some of the changes that are bug fixes probably should be merged into other branches where possible. I must remember to add that to the spec.
In reply to Tim Hunt

Re: Improvements to the quiz reports

by Jamie Pratt -
Can you clarify for me whether these improvements should be developed against Moodle HEAD or the Moodle 1.9 stable branch or both?

The aim is to develop on a branch of HEAD, or in HEAD, presumably, but also to create patches to apply to the OU Moodle installation. It says in your development requirements that stuff for the OU should be developed against the 1.9 release version. I'm hoping it will be possible to get permission to work on HEAD or on a branch in CVS, and that you will be able to fetch patches from CVS to apply to the OU installation??
In reply to Jamie Pratt

Re: Improvements to the quiz reports

by Jamie Pratt -
Having read further into the spec, I can answer my own question. According to the spec, the requirement is to supply patches that can be applied to the OU's version of Moodle, which is based on Moodle 1.9, and also to merge the code into HEAD. Hopefully we can take the patches straight from CVS and apply them to the OU version of Moodle without any problems, since Moodle 1.9, especially in this area, will be similar to HEAD.

I would presume the best way to do this would be to work on a series of commits with proper commit comments to HEAD or a branch of HEAD. And presumably to just make life as easy as possible for the OU guys who will be merging the code into their Moodle.
In reply to Jamie Pratt

Re: Improvements to the quiz reports

by Tim Hunt -
That is basically right. For the whole of 2008, the OU version of Moodle will be based on the 1.9 branch, and we want to include these new reports in our next release, which is being developed over the next few months, then tested, then released in July.

Of course, being nice people, we don't just want these improvements in OU Moodle, we want to donate them to the community as well. But changes of this magnitude cannot go into the public 1.9 stable branch, they have to go into HEAD there.

So that is the problem we are having to solve. The 'OU guys who will be merging' is me, so I have a strong incentive to have the merge be as easy as the tools (CVS/patch/diff) will allow ;-) On the other hand, I have some practice at doing this sort of thing.

You are right that the areas of the code that this affects will be very similar between HEAD and 1.9 (until your changes go in) so merging should be quite easy. The difficulty is where this work involves improving core code like tablelib.php. It is probably good enough if you just keep a note of which files outside mod/quiz/reports get changed as you do this work.

We need to ask Martin D if these changes are big enough that he wants them on a development branch off HEAD, or if he is happy for them to go straight into core. A development branch probably makes merging slightly easier for me, but really that decision is for Martin and should be based on other considerations.

Of course, Martin Langhoff would tell us that Git is the appropriate tool for this problem, but I have never got my head round it, so it does not seem fair for me to expect you to.
In reply to Tim Hunt

Re: Improvements to the quiz reports - Improvements to tablelib

by Jamie Pratt -
It seems a good idea to add functionality to tablelib to optionally add buttons at the bottom of any table to download all the data in several file formats, as well as to display a table of sortable and paginated data.

Is this what you meant by improvements to tablelib?
In reply to Jamie Pratt

Re: Improvements to the quiz reports - Improvements to tablelib

by Tim Hunt -
Yes, that is basically what I meant. There is a slight complication, though, which is that when tables are printed on screen, we want things like the username to be an HTML link to the user profile, whereas in exports they need to be plain text. So it is not quite as simple as: tablelib collects all the data, then writes it out in one of several formats.
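
One way that might be handled (purely a sketch; the class name, the $downloading flag and the col_fullname() hook are hypothetical here, not existing tablelib API) is to let the table ask each column how to render itself depending on the output format:

class quiz_report_table extends flexible_table {
    var $downloading = false; // hypothetical flag, set when an export button is pressed

    // Hypothetical per-column formatting hook.
    function col_fullname($row) {
        global $CFG;
        if ($this->downloading) {
            return fullname($row); // plain text for XLS/ODS/CSV exports
        }
        // On screen, link the name to the user's profile.
        return '<a href="' . $CFG->wwwroot . '/user/view.php?id=' . $row->userid . '">' .
                fullname($row) . '</a>';
    }
}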
In reply to Tim Hunt

Re: Improvements to the quiz reports

by Jamie Pratt -
You say in the part of the spec detailing the item analysis report that:

"The detailed analysis of every single question should be downloadable in a single XLS/ODS/CSV."

This means the export option is not always just for the data displayed in the table. You want to drill down to the detailed item analysis report for each question from the overview report, but you also want a single download of the detailed analysis for every question to be available?


In reply to Jamie Pratt

Re: Improvements to the quiz reports

by Jamie Pratt -
I can answer my own question on this one as well, I think. I misunderstood the spec: I thought you wanted the item analysis report to be for just one item, and the overview to be for all the questions in the quiz.

It seems I was wrong: you want the item analysis to have two modes: display a single item, or display a table of analysis of all items in the quiz.

What is the advantage, though, of having an overview from which you can drill down to see the detailed item analysis for one question, and also having another mode of the item analysis report that displays a detailed list of all questions?

Could we make the overview report be the full list of all questions, and the item analysis the view of a single question? Perhaps by default in the 'overview' report we can collapse the center columns? Then the overview report would be called 'all questions' and the item analysis would be 'item analysis for one question only'. What do you think?
In reply to Jamie Pratt

Re: Improvements to the quiz reports

by Jamie Pratt -
Ignore the two parent posts here. I was confusing the Item Analysis Report overview mode and the separate Overview report.
In reply to Jamie Pratt

Re: Improvements to the quiz reports

by Tim Hunt -
I will answer anyway, to give the rationale behind this.

The analysis for a single question can get quite big. Maybe not for a multiple-choice question, but suppose you have a shortanswer question, where it has a row for each different answer that at least one student entered. If you have a quiz taken by a few hundred users, there can be tens of different answers between them.

And there are times when you just want to see the summary statistics for each question in the quiz, to be able to see at a glance whether there is one question that looks like it has a mistake in it, because the discrimination index is much lower than for the other questions. A table with one row for each quiz question is also a fair chunk of data.

Hence my thought that for online use, what you want is one screen for the summary statistics of each question in the quiz, with drill-down to the detailed analysis of each question.

But you also need a third mode, for the benefit of people who like to print things out on dead trees and stick them in a filing cabinet (wink), or people who want to get all the data in one Excel spreadsheet. For them, you want a single document, obtained by concatenating first the summary table and then the drill-down for each individual question. Hopefully that is not too much extra work to implement.
In reply to Tim Hunt

Re: Improvements to the quiz reports

by Anthony Borrow -
Tim - I'm excited about the proposed work for quiz reports. I'm especially keen on the improvements to regrading. Peace - Anthony
In reply to Tim Hunt

quiz reports - summary graph

by Jamie Pratt -
Am I right that the summary graph you suggest is a summary of grades for all attempts? So we just need to query the 'quiz_attempts' table??
In reply to Jamie Pratt

Re: quiz reports - summary graph

by Jamie Pratt -
I guess you probably want to include just the first attempt as with the item analysis calculations??
In reply to Jamie Pratt

Re: quiz reports - summary graph

by Jamie Pratt -
Please ignore the two parent posts of this post as well. Confused the item analysis graph and the summary graph.
In reply to Tim Hunt

Re: Improvements to the quiz reports

by Jamie Pratt -
You say about the item analysis report:

"This report has to perform a lot of expensive calculations. Instead of re-computing this every time the report is viewed, we probably need to cache the computed values in a separate database table."

I was wondering if it would make sense to use the db table to store the results of the calculations every time the grades are calculated. Then we could make previous results available as well: print a table of when the calculations were performed, and allow users to click through to see the stored results from any of them. We would provide delete links for each set of results.
In reply to Jamie Pratt

Re: Improvements to the quiz reports

by Pierre Pichet -
  • Storing this in the database will be OK if the quiz is closed and there are no further attempts. If the last attempt is also stored, you could have a data structure that allows additional attempts to be added.
  • Recalculation will also be necessary if there is any manual grading or an automatic regrade procedure.
  • The database storage should be added after the current bugs are solved and the statistical parameters have been "officialized", as any changes will imply database table modifications.

A much more interesting development would be to create a complete data file output that can be used in external statistical packages.

Pierre

In reply to Jamie Pratt

Re: Improvements to the quiz reports

by Tim Hunt -
I'm not sure how useful a complete historical record is. I would just expect it to delete any previously cached results before recomputing them. (Hmm, or possibly only after successfully recomputing the new stats, so if an error occurs during recomputation, there is at least something to display).

But maybe make the delete a separate function, so that if people want to retain the history, they can just comment out the one line that calls the function.
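
A minimal sketch of that flow (all the function names and the cache table name here are made up for illustration; insert_record() is the standard 1.9 DB call):

function quiz_report_item_analysis_recompute($quizid) {
    $stats = quiz_report_item_analysis_calculate($quizid); // the expensive calculations
    if ($stats === false) {
        return false; // on error, keep the old cached results so there is something to display
    }
    // Deleting the old results is a separate function call, so anyone who wants to keep
    // a history of past runs can just comment out this one line.
    quiz_report_item_analysis_delete_cached($quizid);
    foreach ($stats as $questionstats) {
        insert_record('quiz_question_statistics', $questionstats); // hypothetical cache table
    }
    return true;
}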
In reply to Tim Hunt

Re: Improvements to the quiz reports

by Jamie Pratt -
You say here about questions in the item analysis report:

"if the question is used in other quizzes (that the current user has reports access to) then we should provide cross links the detailed analysis of this question in those quizzes too."

Does this mean only reports in the same course or do you think we should also link to reports for quizzes in other courses as well where the user has permissions to see these reports?
In reply to Jamie Pratt

Re: Improvements to the quiz reports

by Tim Hunt -
I was expecting one DB query like

SELECT qqi.quiz AS quizid, cm.id AS cmid
FROM mdl_quiz_question_instances qqi
JOIN mdl_course_modules cm ON cm.instance = qqi.quiz
JOIN mdl_modules mods ON cm.module = mods.id
WHERE qqi.question = $questionid AND mods.name = 'quiz';

Then loop over those other quizzes to print the links, and inside the loop, do

if (has_capability('mod/quiz:viewreports', get_context_instance(CONTEXT_MODULE, $quiz->cmid))) {
    // ... print the link to this question's analysis in that quiz ...
}

So, yes, any quiz in any course where the current user has permission to view the reports.


Of course, I have failed to take into account random questions here, but we have the usual problem that it is almost impossible to tell, in SQL, which actual question a random question represented for a particular student. We can do without covering this case for now.

In reply to Tim Hunt

Re: Improvements to the quiz reports

by Matt Oquist -
Hi Tim,

I'm the tech department for a new virtual high school we've opened here in New Hampshire, USA, and our teachers are suffering greatly under the [very sad] disappointing manual grading reporting for quizzes in Moodle 1.8. We have rolling enrolments, so we have students at all points of all courses at all times. The manual grading workflow in Moodle 1.8 makes this about as miserable as it could be.

I've already added a report that shows at a glance how many ungraded questions have been submitted for each assessment in the course, and just now I discovered that the viewquestion report action (clicking the "Manual Grading") doesn't present ungraded questions any differently from graded ones, which makes the page basically useless for our teachers. So now I'll have to fix that, too.*,**

All this is to say that improving this workflow is something I'm *very* interested in working on, and I can hopefully squeeze in some time to help out with the design and implementation of something better that fits within your spec (which I haven't read yet but will).

In addition, just to get these points into the conversation, here are other things vexing us:
  • auto-graded questions that don't display the correct answers to the teachers (Cloze, at least. We purchased our courses and the teachers sometimes don't know why some things are marked "incorrect".)
  • no possibility of randomly selecting essay-type questions
  • Why, oh why, are the questionids in a comma-separated-list in the quiz table? If there's a good reason not to put the sequence information in a separate table with a sequence field, I don't know what it is.
  • We need a better flow for students to naturally find themselves viewing teacher comments, just like when a paper gets handed back in a classroom.
  • Teachers need to be able to assign grades to quizzes that haven't been submitted by the student.
  • make it easy to tell by a single database query whether grading is actually complete for a quiz attempt (this may be possible now; I'm still working on grokking the manual grading stuff, as you can guess by my initial comments, above smile )
Cheers,
Matt

* I might be wrong about this. Looking at the code, it's checking to see if the questions are graded, so my conclusion that no distinction is made may be an artifact of the limited testing I've done so far, and/or user error on the part of the teacher who complained. Maybe the world is a better place than I thought. smile
** OK, so it looks like this problem comes from the standardwhite CSS not defining manual-graded and manual-ungraded.
In reply to Matt Oquist

Re: Improvements to the quiz reports

by Pierre Pichet -
Hi Matt,
"(Cloze, at least. We purchased our courses and the teachers sometimes don't know why some things are marked "incorrect".)"
Could you be more specific about this problem?

The current feedback on a Cloze question is displayed using the "onmouseover" feature: you have to put your mouse over the specific question to get the display.
This does not respect new HTML display standards.
However, the Cloze question is often used in language courses to ask students to fill in the right punctuation mark, verb tense etc. in a text.
This means that the question element (input or select) should remain in-line with the text.
Hence the current feedback display.

There is a project for a new multianswer question type that will use the standard question type display, but not necessarily with the in-line feature.
Improving the multianswer(cloze) question, a proposal

Pierre

P.S.1 If you know how to display the feedback text in-line underneath the question element (input or select) without disrupting the question text flow, I will be happy to apply your solution.
I have done some tests using div and table without finding a really universal solution.
P.S.2 Perhaps we could use the standard display in the quiz report?
In reply to Pierre Pichet

Re: Improvements to the quiz reports

by Matt Oquist -
Hi Pierre,

See the attached screenshot. The data we imported wants the answer to be "830,000.", but that string isn't included anywhere in the markup behind the page shown to the teacher in the screenshot. So the teacher is left thinking "well, that looks right to me" instead of giving the student comments such as "remember that the directions instructed you to include commas and a decimal..."

Of course, IMO it's just weird that the decimal is included in the expected answer with no significant figures after it, and while that is a content issue in some sense, Moodle should be telling the teacher what answer it was expecting so the teacher can understand why the question was automatically graded the way it was.

Regards,
Matt
Attachment cloze-incorrect.png
In reply to Matt Oquist

Re: Improvements to the quiz reports

by Pierre Pichet -
Hi Matt,
The problem you illustrate is related to the numerical question type, which currently does not analyze the numerical response using the regional settings for expressing numerical quantities. See MDL-3282.

Pierre

In reply to Pierre Pichet

Re: Improvements to the quiz reports

by Matt Oquist -
Hi Pierre,

Close but no cigar. The data I imported into Moodle didn't make it easy for me to distinguish between numerical and non-numeric fill-in-the-blank answers, so it's actually shortanswer. In any case, the issue isn't that it's being graded incorrectly, it's only that the teacher has no straightforward way of finding out what the expected answer is.

I haven't tried to find a bug report for this one yet, but as far as I've looked into this it looks like a feature that just isn't implemented.

Cheers,
Matt
In reply to Matt Oquist

Re: Improvements to the quiz reports

by Pierre Pichet -
Hi Matt,
This is not easy to implement, as there can be many answers with various gradings defined.
Take a look at the Cloze example available by clicking on the help button when editing a cloze question.
"and right after that you will have to deal with this short answer {1:SHORTANSWER:Wrong answer#Feedback for this wrong answer~=Correct answer#Feedback for correct answer~%50%Answer that gives half the credit#Feedback for half credit answer} "

This is why this feature cannot easily be implemented. The only way is to write the correct answer (or a clue to it) in the feedback field; then you put the mouse over the question input field and you will see it.
You could also edit the question in another window and look at the question text.

Pierre

In reply to Matt Oquist

Re: Improvements to the quiz reports

by Tim Hunt -
  • no possibility of randomly selecting essay-type questions
Long-standing issue. MDL-4004.
  • Why, oh why, are the questionids in a comma-separated-list in the quiz table? If there's a good reason not to put the sequence information in a separate table with a sequence field, I don't know what it is.
No good reason. It has been like that for ages and would be a pain to clean up. As they say if it ain't (too badly) broke, don't fix it. See also the 'answers' column in the qtype_multiplechoice table.

However, you don't need to use that field if you don't want to; there is also the quiz_question_instances table, which has most of the necessary information.

This diagram: http://docs.moodle.org/en/Development:Quiz_database_structure#Overview may help.
  • We need a better flow for students to naturally find themselves viewing teacher comments, just like when a paper gets handed back in a classroom.
Yes. Presumably you are talking about manually graded quizzes here. Specific proposals for how the interface should work are welcome. See also your last point.
  • Teachers need to be able to assign grades to quizzes that haven't been submitted by the student.
Really? This is dangerous. Are you sure you don't mean "Teachers should be able to close a student's attempt when the student has forgotten to finish it"?
  • make it easy to tell by a single database query whether grading is actually complete for a quiz attempt (this may be possible now; I'm still working on grokking the manual grading stuff, as you can guess by my initial comments, above smile )
Yes. The bottom diagram on http://docs.moodle.org/en/Development:Quiz_state_diagrams may help. The states are stored in the mdl_question_states table in the event column. What we really need, but don't currently have, is a 'Submitted, but in need of manual grading' state.
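
For what it's worth, a very rough sketch of the kind of single query being asked for, assuming the 1.9 tables (quiz_attempts.uniqueid links to question_sessions.attemptid, question_sessions.newest points at the latest state of each question, and the QUESTION_EVENT* constants in lib/questionlib.php identify graded states); it would need checking against a real database:

$graded = QUESTION_EVENTGRADE . ',' . QUESTION_EVENTCLOSEANDGRADE . ',' . QUESTION_EVENTMANUALGRADE;
$attempts = get_records_sql("
    SELECT qa.id AS attemptid, COUNT(1) AS ungraded
      FROM {$CFG->prefix}quiz_attempts qa
      JOIN {$CFG->prefix}question_sessions qs ON qs.attemptid = qa.uniqueid
      JOIN {$CFG->prefix}question_states qst ON qst.id = qs.newest
     WHERE qa.quiz = $quizid AND qa.timefinish > 0
       AND qst.event NOT IN ($graded)
  GROUP BY qa.id"); // attempts with at least one question still needing a grade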
In reply to Tim Hunt

Re: Improvements to the quiz reports

by Mark Miller -
I'm really new to Moodle and not a programmer, so there is a lot I don't understand. I tried to look at the links to the bug tracker, etc. and hardly understood a word of it. I have two questions:

a. What's "HEAD"? I see references to making the changes there, but I can't figure out what that means.

b. Will the new stats reports include difficulty factors for each question?

I apologize if I've made a netiquette error here; the only excuse I can offer is that for someone like me, a teacher-user, some of this can be hard to follow.

tnx and regards

mark c. miller
noblesville indiana
In reply to Mark Miller

Re: Improvements to the quiz reports

by Matt Oquist -
Hi Mark,

HEAD is where the core of Moodle code lives. The point is that the fix needs to go into the development version of Moodle and be included in all future versions.

I don't know about your question 'b' (I haven't read the spec that thoroughly).

NP WRT netiquette; you're at least 50% directly on-topic, so that's doing pretty well. approve

Taylor University (Upland) is my alma mater, so I'm familiar with your area. Welcome to Moodle and I hope somebody helps you out with question b.

Cheers,
Matt
In reply to Mark Miller

Re: Improvements to the quiz reports

by Tim Hunt -
a) http://docs.moodle.org/en/CVS_for_Administrators may help.

b) I don't know exactly what you mean by "difficulty factor". I suspect it is the same as what Moodle calls the Facility Index.

No breach of netiquette. Quite the reverse. Asking good questions is one of the most important contributions you can make to these forums, and yours were good questions.
Average of ratings: Useful (1)
In reply to Tim Hunt

Re: Improvements to the quiz reports

by Matt Oquist -
I should give some more background about our school and how we're using Moodle; that will make quite a bit more sense out of my comments about what we need.

We have rolling enrollments, so students may be spread out over the different sections ("topics") of any course at any time. This means that assessments (we're only using Moodle's quiz module ATM) never "open" or "close", and there are technically never due dates. Students are allowed to submit new attempts (up to a limit) for each assessment at any point in the course.

In case it isn't already obvious, we're using Moodle as a tool for each student to have a very individual relationship with her teacher, and occasionally some interactions with other students in forums, chat, etc.

I just checked, and 6246 out of our 46302 questions in the DB are 'essay', so the majority of our assessments are automatically graded. However, we're expecting our teachers to provide lots of 1-on-1 feedback to students, and this takes the form of comments (and occasionally grade overrides). Unfortunately, the 9606 random-selection questions in our database cannot be commented upon (effectively), and their automatically-assigned grades cannot be overridden (effectively). I believe this is my #1 Moodle issue for our school right now.

  • We need a better flow for students to naturally find themselves viewing teacher comments, just like when a paper gets handed back in a classroom.
Yes. Presumably you are talking about manually graded quizzes here. Specific proposals for how the interface should work are welcome. See also your last point.

Actually, I meant it as a general statement, since we're wanting our teachers to provide so much feedback (even on auto-graded assessments) to students. I'd like students to land in Moodle and find themselves looking at a stack of graded assessments with comments, as it were. I haven't even begun to think of an interface for this, but I think a block could handle it nicely, with a link to a report page (analogous to the grading reports page(s)) that provides links to assignments with unseen comments. This does imply that Moodle needs to keep track of which comments a student has viewed before.

Analogously, it needs to be easy for our teachers to see which submitted assessments they've reviewed (and commented on) and which they haven't. This implies that Moodle needs to track these things.

I'm planning to implement such a tracking system (perhaps sharing some code and DB tables for the student & teacher sides) and present it to the user in a block, at least to demo the idea to other devs.
  • Teachers need to be able to assign grades to quizzes that haven't been submitted by the student.
Really? This is dangerous. Are you sure you don't mean "Teachers should be able to close a student's attempt when the student has forgotten to finish it"?

I meant what I said, but I totally see why you see it as dangerous. I think this is a site-wide policy issue (and it should not be available by default), but it makes sense for us. Right now we can't call a student "100% complete" in the course until she's clicked a Submit button for every last assignment, even if she's lazy and/or difficult and willing to take zeros on some assignments. A teacher may get stuck with a 98% student who isn't interacting any longer, but who has for all intents and purposes completed the course (and even passed, potentially). We need to be able to assign the zeros, get Moodle to report 100% to our student information system, and send the report off to the state so we and the teacher can get paid.
  • make it easy to tell by a single database query whether grading is actually complete for a quiz attempt (this may be possible now; I'm still working on grokking the manual grading stuff, as you can guess by my initial comments, above smile )
Yes. The bottom diagram on http://docs.moodle.org/en/Development:Quiz_state_diagrams may help. The states are stored in the mdl_question_states table in the event column. What we really need, but don't currently have, is a 'Submitted, but in need of manual grading' state.

Thx for the link; it's very helpful (and I only looked at the first paragraph so far smile). What I've done for us at this point (and it'll have to be good enough for now) is hack grading/report.php to make an API* out of the code that finds ungraded essay questions, and add a routine ("viewquizzes") that calls the API for every quiz in a course and prints out a table with popup links (b/c this page is CPU-expensive) and the sum of ungraded questions for each assessment. It's not a pretty solution and it should only serve to demonstrate the functionality I'm looking for (in case my explanation is unclear); I've attached it here for that reason.

I'll be gone all of next week, but this week I'm going to try to address the issue of randomly-selected questions not showing comments or taking grade overrides. That should get me more familiar with the entire quiz module overall, so that I can perhaps address random selection of essay questions after that.

Cheers,
Matt

* This is a general complaint I have, and something I'm interested to help fix (if I can). Far too often I find code that I want in an API making the assumption that it's in the business of being a UI. I want to pass parameters and receive error codes and return values, but instead it's busy getting parameters from the URL and telling the human things through the Web browser. We should have UIs using APIs, so that I can use the same APIs to build new UIs. I would guess a lot of this is legacy code, and my [vague, out-of-date] sense of the direction of Moodle development is that we're heading in a very positive direction. So there's an opinion for you, dear reader.
In reply to Matt Oquist

Re: Improvements to the quiz reports

by Tim Hunt -
This is getting out of hand. We really need separate threads for each separate point, so I'll be making multiple replies for the different bits.
  • We need a better flow for students to naturally find themselves viewing teacher comments, just like when a paper gets handed back in a classroom.
A block is the most natural Moodlish way to implement this. I wonder if the recent activity block does what you want, but I suspect that it is not assessment focussed enough for you.

I don't think you should need any extra database tables to do this. All the data you need should already be there, it is just a matter of getting it out in the right format.

By the way, I assume you are using groups to associate your teachers with their students.
In reply to Tim Hunt

Re: Improvements to the quiz reports

by Tim Hunt -
Hey! I've just noticed that if you turn on "Show mark details" in the results overview report, it tells you which essay questions need grading. That's useful.
In reply to Matt Oquist

Re: Improvements to the quiz reports

by Tim Hunt -
  • Teachers need to be able to assign grades to quizzes that haven't been submitted by the student.
I still think a better way to do this is for teachers to be able to close a student's attempt for them (a long-standing feature request). Then if they need to do what you say, first they close the abandoned quiz attempts, and only then assign a grade to them.
In reply to Matt Oquist

Re: Improvements to the quiz reports

by Tim Hunt -
This is a general complaint I have, and something I'm interested to help fix (if I can). Far too often I find code that I want in an API making the assumption that it's in the business of being a UI. [...] I would guess a lot of this is legacy code, and my [vague, out-of-date] sense of the direction of Moodle development is that we're heading in a very positive direction. So there's an opinion for you, dear reader.

You are quite right. It's a valid complaint, it is mostly legacy code, and we are moving (slowly) in the right direction.

But you do have to be careful, when refactoring, not to break stuff. That is why it is slow, and why I don't tend to do it unless I am already changing a certain area of code for another reason.
In reply to Matt Oquist

Re: Improvements to the quiz reports

by Pierre Pichet -
Hi Matt (and Tim),
As you can see in the upper part of this discussion, I am working on Cloze questions as an example of multiple questions, for the Item analysis report.

If we consider that Cloze is just a convenient way to create questions, each Cloze subquestion is a valid question on which we might want a complete Item analysis.
However, there is just one actual question_states record for each Cloze question, so we cannot directly get the grade of the individual questions and do a complete item analysis (facility index, discrimination index etc.), although the student response for each individual subquestion is stored in the question_states along with the total grade and the total raw_grade.

These individual subquestions cannot be manually graded.

So I propose to do the complete item analysis of each subquestion by regrading each of them as they were initially graded.

This will allow teachers to evaluate the individual subquestions like any other question in the quiz.

Pierre


In reply to Pierre Pichet

Re: Improvements to the quiz reports

by Jestin VanScoyoc -
Sorry if this question diverges a bit, but it's quiz report related... I would like to change the default quiz report overview. For example, by default I want to see all quiz attempts within a group. It's currently set to 10 per page and I have to change it manually every time - workable, but annoying.

Thanks,
Jestin
In reply to Jestin VanScoyoc

Re: Improvements to the quiz reports

by Tim Hunt -
Well, the code is in mod/quiz/report/overview/report.php

Something like that should be easy to change.

Yep, looking at the code, the line

$pagesize = optional_param('pagesize', 10, PARAM_INT);

The 10 there, the middle argument, is the default value. You just need to bump that up to 100 or something.
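
So, for example, if you wanted 100 attempts per page by default, that line would become:

$pagesize = optional_param('pagesize', 100, PARAM_INT);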
In reply to Tim Hunt

Re: Improvements to the quiz reports

by Jestin VanScoyoc -
Aha! I changed it in 30 seconds and it works just like I wanted :-D I'm not a code monkey, so these little tips are just what I need. Love the power of community learning.

Thanks,

Jestin
In reply to Tim Hunt

Re: Are multiquestion (Cloze) one question or multiple?

by Pierre Pichet -
Tim,
I think that multiple-question types like Cloze (and Matching) should be treated as a quiz inserted in the quiz, i.e. each subquestion should be analyzed individually.
Could you comment on this thread or on
http://docs.moodle.org/en/Development:Quiz_Item_Analysis_of_All_Question_Types

Here is an example of the work done (I am working on getting the separate grade of each subquestion).
indiquestion.jpg

Pierre

In reply to Pierre Pichet

Re: Are multiquestion (Cloze) one question or multiple?

by Tim Hunt -
I think you are basically right, that level of extra detail would be really helpful to teachers.

What you are proposing will probably fit nicely with the enhancements to Item Analysis that I am proposing: http://docs.moodle.org/en/Development:Quiz_report_enhancements#Improvements_to_the_Item_Analysis_report. The document now has crude screen mock-ups to help explain the proposal. I have included random questions in the proposal in a way that I like. I have not had time to specify (and the OU does not want to fund development of) Cloze and matching question breakdowns; however, I feel that it will be easier to add the extra detail on top of the new system than to incorporate it into the existing report.

At the moment, we are close to signing a contract to get this developed, and it should then take a few months to do everything in that document. I will, of course, let you know who is doing the work, once the deal is signed.
In reply to Tim Hunt

Re: Are multiquestion (Cloze) one question or multiple?

by Pierre Pichet -
Tim,
As I need no contract :-) but time, I will be able to put more time into Moodle, as I won't be continuing my job as chemistry undergraduate program director in the next months.
Retrieving the subquestion grades can be done nicely if we use the quiz_question_states tables to store them. This asks for a revision of the multiquestion grading process...?

Pierre
P.S. Otherwise the subquestion grades have to be recalculated each time...
In reply to Pierre Pichet

Re: Are multiquestion (Cloze) one question or multiple?

by Tim Hunt -
Hmm... Interesting thought. Possibly big implications. I'll ponder it.
In reply to Tim Hunt

Re: storing the database search in $SESSION

by Pierre Pichet -
I have committed to CVS a version that stores the attempts database search and the resulting $question data in the $SESSION, so that they are not done again when sorting the table or downloading.

Pierre

As this version does not use a new database table, it could be merged into older versions.
In reply to Pierre Pichet

Re: storing the database search in $SESSION

by Tim Hunt -
It is generally a bad idea to store lots of data in the SESSION. It all has to get saved to disc and reloaded into memory on every page request, so large sessions hurt performance. It can also go wrong when you do things in two different tabs in your web browser at the same time.

Please can you undo this change, it is definitely the wrong direction to go in.
In reply to Tim Hunt

Re: storing the database search in $SESSION

by Pierre Pichet -
Sorry, I suspected some problems but could not find the right info.

In any case, the current code is quite inefficient, as everything has to be loaded again for every operation: sorting, downloading, etc.

So the question array contains less than the table display, as the question texts are not there.
So Moodle users will have to wait until 2.0 to have a good report.php.
The other possibility is to use hidden input elements, but they use more memory, although not more than some large images, even with a quiz of 100 questions and 1000 students.
So I will undo it and check if this can easily be transformed to use hidden elements.

Pierre

P.S. We could discuss this memory problem a lot, because this report is used by teachers, and the loading is much less than all the Moodle server time necessary to compute the values.
Normally the browser also caches the pages...

The design was planned to take the quiz id into account, to eliminate the tab problem.

Anyway, the design could be useful, at least as an exercise in Moodle computing :-)...


In reply to Pierre Pichet

Re: storing the database search in $SESSION

by Tim Hunt -
I can't actually remember where I learnt about what was or wasn't appropriate use of session. I think I picked it up by osmosis just by hanging around in the same forums as Martin Langhoff. Certainly I don't know any good docs about it.

I think generally, moodle developers are moving in the direction of attaching patches to the tracker, and asking other people to comment, before committing to CVS. Even Moodle HQ developers, who are probably allowed to do anything they like to Moodle, have been doing it, e.g. MDL-13155, MDL-13903 and MDL-13412. This is probably a good thing in terms of the stability of Moodle. I probably ought to do it more often myself.

It does depend, though. It is obviously overkill for a simple bug fix, but for anything that is a bigger change, or some type of programming you are less familiar with, it is worth it. But, of course, it is a pain to have changes hanging around for a few days before you commit them to CVS while you wait for comments.


In reply to Tim Hunt

Re: storing the database search in $SESSION

by Pierre Pichet -
"attaching patches to the tracker, and asking other people to comment"
In an ideal world.
"I probably ought to do it more often myself."
In the real world ;-)

As a teacher, I worry a lot about code stability, and I have not made any changes to the quiz code flow.
I have proceeded this way because:
  1. This report is a useful tool to analyze question grading performance, but does not change the actual student grading in any way.
  2. I have worked a lot on this report code to integrate multianswer question types.
  3. Doing the $SESSION option, I learned more about the code flow.
  4. I consider that HEAD has an experimental status, and often code committed to HEAD has a lot of bugs that have not been checked, even on official Moodle projects.
  5. This code flow modification is one of the possible ways to integrate my work with the OU project.
  6. I have done quite heavy checking of the code.
  7. I have left the original code as comments, so you can uncomment it and remove the call to the new get_questions_atttempts_data() function, and you are back to the original code.
  8. There are not many other people who master this code.
  9. I am waiting for your approval before doing any merging.
This is not a general trend, but perhaps a practical one for non-core code proposals ;-)

Pierre

In reply to Pierre Pichet

Re: storing the database search in $SESSION

by Tim Hunt -
Can I just check, these are the changes in MDL-13427, right? I will try and have a look at them, but it is unlikely to happen before Thursday, because that is the deadline we are currently aiming at in the OU.
In reply to Tim Hunt

Re: storing the database search in $SESSION

by Pierre Pichet -
I put a diff file with all the changes and some comments on MDL-13427.

If OK, I will do the help and doc changes and merge to 1.9 and other versions.
But not until next week.
So Happy Easter Holidays...

Pierre
P.S. Add some good chocolate :-)
In reply to Tim Hunt

Re: splitting the report code for easy concurrent development

by Pierre Pichet -
I have committed to CVS a version that does not use $SESSION.
In this version I have split the code somewhat, so that once the question attempts are selected, the function get_questions_atttempts_data() builds the $question array data that is used to build the output table.
So I can modify this function to include the multiquestions, and the OU's new question selection can be done independently ("orthogonal", as you wrote somewhere).

I have also revised the ODS, XLS and CSV downloads by adding some index columns so that the data is more easily sorted.
These formats should be used for further analysis, and the HTML for printing, text processing etc.

Pierre

P.S. From here I will merge this new version into older versions (1.9, 1.8 etc.) if you approve these modifications.
Before merging I will modify the help files.
After this, I will return to the multiquestions problem.
In reply to Tim Hunt

Re: Improvements to the quiz reports

by Heike Ebelt -
I have a very similar problem and don't want to open a new topic:

Our Moodle page has become VERY VERY SLOW recently (sometimes I even receive errors, such as that the course id couldn't be found, or other error messages). We have only 20 users!!! And none of them does exercises, because most of the courses still need content.

In addition, when I run cron.php it won't finish and show me the results. However, if I reopen the admin page I see that the option for running cron.php is gone, which I interpret as meaning that my run was successful.

I haven't changed any settings since the problem has started.

I am using Moodle 1.9 Beta 4, PHP 5 and MySQL.

Please help me. I don't know much about the technical stuff... sad

Heike
(heikeebelt@yahoo.com)
In reply to Heike Ebelt

Re: Improvements to the quiz reports

by Pierre Pichet -
Heike
As I understand it, your problem is not related to the quiz reports.
It should be posted to another forum, like General problems, with more details.

Pierre
In reply to Tim Hunt

Re: Improvements to the quiz reports

by Dennis Daniels -
Greetings! This thread looks very old but I thought I should post this message.

--------
Greetings,

My name is Dennis Daniels and I'm a long time user of Moodle and I have a specific request for new functionality to be added to Moodle. I am not a customer of yours but I sincerely believe that my idea below will provide significant value to your clients.

The purpose of this proposed software is to allow 'students' to submit questions to a Moodle question database. Answers supplied by students while taking tests will be easily incorporated into the question database for further reuse, after teacher review and approval. There will be an option for users to submit questions, rate questions, suggest categories etc. while taking a test, to capture those related ideas and to give students the opportunity to improve the test, impress the teacher, and so on. There will also be options for course administrators to create assignments that permit users to submit questions in the many Moodle question formats.

Questions will be matched to a larger selection of standards. Teachers may be responsible for multiple subjects, grades and standards. One question might meet many education standards. These tools are to make it easier for the teacher to reuse students' questions and answers and gather feedback. The teacher must also be able to easily identify which questions with standards have had the most correct and incorrect answers. This system will make it easier for students to post questions that comply with Moodle's question database format. The advantages of having the students write test questions are invaluable from a time management perspective as well as a pedagogic one.

The full document specifications are found here: http://docs.google.com/View?id=dhq9ft55_48gzd44gd5

There are a number of Moodle partners; perhaps you could work with them to build this very important functionality into Moodle? http://moodle.com/partners/list/

Please help me create this new functionality for Moodle.

with many thanks,
Dennis Daniels
dennisgdaniels@gmail.com