Question specific usage analytics

Re: Question specific usage analytics

by Tim Hunt -
Not a full answer, but https://docs.moodle.org/dev/Overview_of_the_Moodle_question_engine#Detailed_data_about_an_attempt gives a sample query for pulling out question attempt data. That should be a place to start, though you will need to modify it.
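
For instance, the shape of it is roughly this (a sketch from memory, not the exact query on that page; $usageid stands for whichever question usage you are interested in):

$sql = "SELECT qas.id, qa.slot, qa.questionid, qas.sequencenumber, qas.state,
               qas.fraction, qas.timecreated
          FROM {question_attempts} qa
          JOIN {question_attempt_steps} qas ON qas.questionattemptid = qa.id
         WHERE qa.questionusageid = :usageid
      ORDER BY qa.slot, qas.sequencenumber";
$steps = $DB->get_records_sql($sql, ['usageid' => $usageid]);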
In reply to Tim Hunt

Re: Question specific usage analytics

by Arvind Murali -
Hi Tim, I went through that. Firstly, thank you for your work on all of this. This is truly transformational. The question attempt steps and attempt step data tables are populated only by the quiz module, and possibly the adaptive quiz module, which is derived from the quiz module. My point is that the same questions are being used by different types of activities, and similar attempt info should be logged from the point of view of the question. Instead, what happens is that activities get questions from the DB and use them within their activity without leaving a trace. In a certain sense, I'm talking about every question leaving a trail of breadcrumb data that could be tracked as a way to get information about the question and how students interact with it. I'm not sure if I'm making myself clear, but if not I'd be happy to clarify.
In reply to Arvind Murali

Re: Question specific usage analytics

by Arvind Murali -
For example, Quizventure, a very engaging game, does this in renderer.php:

[screenshot of the question-loading code in renderer.php]

This gets it an array of questions, which it then displays within the game for students to shoot the answers to. If they shoot the right answer, their score goes up; if they shoot the wrong answer, it goes down. Great stuff. I have seen a great deal of interest from students in this gamified approach. The problem is that even though questions from the question bank are being used here, and students show that they know the answers to those questions, there is no way to track which questions showed up in the game, how many students shot the right answer on the first try, and how many did not. In this case, my plan is to dig into the core quiz rendering code, see how the attempts are being logged, and try to replicate that within the game. What I was hoping could happen long term is that the Moodle question engine kept track of its questions being used and tracked the results of the question usage. I'm not entirely sure how we'd go about something like that, but I wanted to throw it out there for developers much, much better than me to think about.
In reply to Arvind Murali

Re: Question specific usage analytics

by Tim Hunt -
I've seen quizventure, and it is cool, but I've never looked at the code.

The Moodle question engine does keep a record of all attempts at questions, provided the code that uses questions uses them 'properly', as described at https://docs.moodle.org/dev/Using_the_question_engine_from_module. I know the query I linked you to had a join on the quiz_attempts table, but that is just to illustrate how the link works. It is not essential. The question_usages table is generic.

So, it is a choice made by the creators of quizventure that they are not recording the 'attempts' at questions that students make in the game. They have chosen to just load the question definitions from the question bank and use the data themselves. Fairly simple proof that quizventure is not storing the data: https://github.com/xow/moodle-mod_quizgame/search?q=usage&unscoped_q=usage

However, several other plugins that use questions (including mod_studentquiz, filter_embedquestion, ...) do store their data in the standard tables.

Quizventure could be fixed to store the attempts. Basically, you would need to create a 'question usage' behind the scenes each time a user starts a game, then, as they shoot aliens, add the questions and record the 'response' the student gave. You would need to talk to the maintainers of quizventure.
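
In outline, that could look something like this (a sketch only; 'mod_quizgame' and the variable names are my assumptions, not taken from the plugin):

// When the game starts: create and save a usage for this activity.
$context = context_module::instance($cm->id);
$quba = question_engine::make_questions_usage_by_activity('mod_quizgame', $context);
$quba->set_preferred_behaviour('immediatefeedback');
question_engine::save_questions_usage_by_activity($quba);

// Each time the student shoots at a question: add it, start it, and
// record the response they gave.
$slot = $quba->add_question(question_bank::load_question($questionid));
$quba->start_question($slot);
$quba->process_action($slot, $response); // $response depends on the question type.
question_engine::save_questions_usage_by_activity($quba);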
In reply to Tim Hunt

Re: Question specific usage analytics

by Arvind Murali -
Awesome! Yup, it looks like the aggregation I was on about exists in the form of question usages! We'd just have to make sure developers stick to the protocols! I'll send a note to quizventure. If I manage to add the code myself, I'll try to help them out with the changes as well. Thanks Tim!
In reply to Tim Hunt

Re: Question specific usage analytics

by Arvind Murali -
As I try to make this work with Quizventure, I'm running into something interesting, albeit quite confusing, in the question engine. I managed to create a usage entry every time the game is started.

[screenshot: Question]

The game uses the question_load_questions function to load the questions. The $quba needs question_bank::load_question, which should give similar output according to https://docs.moodle.org/dev/Question_data_structures. However, they do not match. question_load_questions gives a plain array of questions, but question_bank::load_question gives question definitions (one layer above the raw questions). I'm trying to figure out a way to add questions to the $quba directly from the $questions array, or to create a $questions array from the $quba. I tried question_bank::make_question with each question definition, which causes the qtype to go missing for some reason. I'm looking for some way to bridge this disconnect. The other issue I haven't figured out is that the question array gets sent to a JavaScript file, which is where the questions are shuffled and displayed with their options. I've not figured out how to get that info recorded. Any thoughts?
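
To show the two paths I mean (simplified; the variable names are mine):

// What the game currently does - raw question records from the DB:
$rawquestions = question_load_questions($questionids);

// What the $quba seems to want - a question_definition object:
$question = question_bank::load_question($questionid);
$slot = $quba->add_question($question);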

Also, is the make_question issue a bug?
This is what I did:
[two screenshots of debugging output]
In reply to Arvind Murali

Re: Question specific usage analytics

by Tim Hunt -
Sharing code as images makes it hard to comment on. I can't copy and paste bits.

load_question returns an instance of a question_definition class. Internally, it does get_question_data, then calls make_question. So, of course, calling make_question again will give an error.

Putting a lot of questions into a usage in PHP code is not going to do much good. You need to add the question to the usage when the student 'attempts' it. And the attempting happens in JavaScript in the student's browser, so you are going to have to think about how you will send the data from there back to Moodle.
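
In other words, the object load_question gives you is already usable; no further make_question call is needed:

$question = question_bank::load_question($questionid); // Already a question_definition.
$slot = $quba->add_question($question); // So it can go straight into a usage.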
In reply to Tim Hunt

Re: Question specific usage analytics

by Arvind Murali -
That makes sense. I'll post code as text next time. My apologies. Just out of curiosity, why is adding questions in PHP a bad idea? The renderer PHP only gets called once, at the beginning, right? Is it a concern about sending a large usage array into JavaScript at one time? I'm not from a computer science background, so my question might be pretty silly; pardon me if it is. Any thoughts on the 'how you will send data back to Moodle' part? Pointers on where I should look? So far I've just learnt I should look at AJAX, but I have no knowledge of best practices with it.

So far I'm thinking: create the usage in PHP, pass the variable to the JavaScript file, add questions in JS (can I call the question_bank::load_question function from JS?), save the responses within JS, then send the usage variable back to PHP through AJAX. Does that sound right?

I'm trying something that is obviously above my head, but I haven't really been able to get support from developers with a CS background. I do learn quickly when pointed in the right direction, though. So thank you Tim! I appreciate it.
In reply to Arvind Murali

Re: Question specific usage analytics

by Tim Hunt -
You are right, it requires AJAX. And Moodle has a recommended way to do Ajax calls (https://docs.moodle.org/dev/AJAX), which is linked to Moodle's way of doing web service APIs (https://docs.moodle.org/dev/Adding_a_web_service_to_a_plugin).

I have to say, this is not a good choice of project for learning Moodle development. It involves several different complicated concepts, and requires them all to work at once. If you were trying to learn just one concept at a time, I am sure you could do it. Trying to learn them all in the same development just seems like a recipe for disaster, or at least frustration.

On the other hand, if this is the feature you want, .... How could we break it down so you only have to learn one thing at a time?

I guess you could start like this:

First, try to change the JavaScript in quizventure so that, every time the student shoots an alien, it makes an AJAX/web service call back to Moodle which passes the questionid and which answer they shot (or something like that). Don't worry about making that web service do anything useful. It can just write something to the server logs, so you know it is working (for example, debugging("$questionid, $answerid");). If you can get that much working, then think about learning how to make the AJAX call record the user's attempt at a question in a useful way.
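
As a sketch of what that skeleton web service might look like (the name log_shot and everything around it is made up for illustration; follow the docs pages above for the real structure):

// db/services.php
$functions = [
    'mod_quizgame_log_shot' => [
        'classname'   => 'mod_quizgame_external',
        'methodname'  => 'log_shot',
        'description' => 'Record which answer the student shot.',
        'type'        => 'write',
        'ajax'        => true,
    ],
];

// externallib.php
require_once($CFG->libdir . '/externallib.php');

class mod_quizgame_external extends external_api {
    public static function log_shot_parameters() {
        return new external_function_parameters([
            'questionid' => new external_value(PARAM_INT, 'Id of the question shown'),
            'answerid'   => new external_value(PARAM_INT, 'Id of the answer that was shot'),
        ]);
    }
    public static function log_shot($questionid, $answerid) {
        $params = self::validate_parameters(self::log_shot_parameters(),
                ['questionid' => $questionid, 'answerid' => $answerid]);
        // Does nothing useful yet - just proves the round trip works.
        debugging("{$params['questionid']}, {$params['answerid']}", DEBUG_DEVELOPER);
        return true;
    }
    public static function log_shot_returns() {
        return new external_value(PARAM_BOOL, 'Always true');
    }
}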

If you want a simple example of an AJAX call in Moodle, I suggest you have a look at block_starredcourses. That looks like about the simplest possible example. Hope that helps.
In reply to Tim Hunt

Re: Question specific usage analytics

by Arvind Murali -

This is great mentorship, Tim. Thank you. I will follow your recommendation. I'm a product manager focused on the user experience I'm trying to provide, but since I haven't got any tech friends interested yet, I've had to go at it myself. Like you said, it's not been easy, but with the help of pointers I have managed to get a lot of the user experience features I'm working on done. I'll keep this topic posted on how I fare with this. Much appreciated. Thanks.

In reply to Arvind Murali

Re: Question specific usage analytics

by Arvind Murali -
I managed to get the questionid and answerid for each shot. The only thing I'm not sure of now is how to add this info back to the quba. I create the quba in the PHP script on game initialisation. I pass the question id and answer id to the JavaScript file separately from the quba. The quba does not even have questions added to it yet, because quizventure does not currently use a quba. However, I do have the questionid and answerid for each attempt within the browser, pushed back to a PHP file through AJAX. Can I just load the quba by its id in this PHP file and add the question and answer info to it?

Or would I have to pass the entire quba object out, render the question from it, record the answer attempts, and then pass it back?
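
(By 'load the quba by its id' I mean something like this, in the script that receives the Ajax call:

$quba = question_engine::load_questions_usage_by_activity($usageid);
// ... add the question and response info here, then:
question_engine::save_questions_usage_by_activity($quba);

if that is a sane thing to do.)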
In reply to Arvind Murali

Re: Question specific usage analytics

by Arvind Murali -
This is how I am thinking of executing this. I'm not sure if this is correct, though. What I'm unsure of is whether an attempt would have to be created manually, since the usage is a series of attempts and each attempt is a series of steps!
[diagram of the proposed implementation]
In reply to Arvind Murali

Re: Question specific usage analytics

by Tim Hunt -

Sorry for the slow reply. (It is a holiday weekend, after all.)

I think your diagram shows more complexity than is needed. I think you hardly need to change quizventure at all - just the change you already made to get the JavaScript to make the Ajax call in addition to whatever it was doing before. All you need to do is make the PHP code that receives that Ajax call do something useful.

I think the something useful needs to look like this (see the sketch after the list):

  1. Find the $quba for the current user on this quizventure. (If one does not exist, create one. Probably easiest to set the behaviour to 'immediatefeedback'.)
  2. Add the question that was just answered, and start it.
  3. Process the answer that the student gave, to record it.
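
A minimal sketch of step 1 (the quizgame_usages table for remembering each user's usage id is my invention; you would need to store that mapping somewhere):

$record = $DB->get_record('quizgame_usages',
        ['quizgameid' => $quizgame->id, 'userid' => $USER->id]);
if ($record) {
    $quba = question_engine::load_questions_usage_by_activity($record->usageid);
} else {
    $quba = question_engine::make_questions_usage_by_activity('mod_quizgame', $context);
    $quba->set_preferred_behaviour('immediatefeedback');
    question_engine::save_questions_usage_by_activity($quba);
    // ... then remember $quba->get_id() in quizgame_usages for next time.
}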

You can sort of see the code needed by looking at a typical unit test for the multichoice question type (https://github.com/moodle/moodle/blob/master/question/type/multichoice/tests/walkthrough_test.php#L53). The only problem is that these tests use a number of helper methods to reduce duplication. This makes the tests less verbose, but also means that they are less clear as an example of how to interact with questions. You will need to un-pick what the helpers are doing.

There is also a decision about how long to use each $quba. You could have just one, which records all of a student's interaction with a quizventure. Or you could try having one per session of game-play. Either could work.

My suggestion above was to just create one if, when you need one, you find one does not already exist. I guess the alternative is to do what you suggest in your diagram, and create it when the student launches the game. Either method could work. I am not sure which would be easier.

Of course, the other thing to do is to use the fact that you are now saving the responses to make this information available to students and/or teachers in a useful way.

I hope that helps.

In reply to Tim Hunt

Re: Question specific usage analytics

by Arvind Murali -
Hi Tim,

Thank you for the walkthrough. I have 1 and 2 in your list done. I chose to create a new quba every time a new game is started. I see the instance in the DB. I start questions, like you mention, once they are rendered. I'm able to start the question after every response, and it creates entries as expected in the attempt table. The only thing I can't do right now is get the response summary to reflect the answer that was selected - basically step 3 in your comment. It seems from the test example you mentioned that I'd have to create an array for the response and then use $quba->process_action($slot, $data) to process it. I just need to figure out what the array needs to look like. I'll continue to work on the final piece of this puzzle and update here! Thanks a bunch! It is great learning something complex. Feels like an accomplishment!
In reply to Arvind Murali

Re: Question specific usage analytics

by Arvind Murali -
This is what I have so far :

function quizgame_add_attempt($quizgameid, $usageid, $questionid, $qsummary,
        $asummary, $aid, $fraction, $slot, $ansidx) {
    global $DB;

    $cm = get_coursemodule_from_instance('quizgame', $quizgameid, 0, false, MUST_EXIST);
    $course = $DB->get_record('course', array('id' => $cm->course), '*', MUST_EXIST);
    $context = context_module::instance($cm->id);
    $timenow = time();

    // Reload the usage that was created when the game started.
    $quba = question_engine::load_questions_usage_by_activity($usageid);

    // Add the question that was just answered, and start it.
    $qtmp = question_bank::load_question($questionid);
    $res = $quba->add_question($qtmp);
    $preferredbehaviour = 'immediatefeedback';
    $quba->set_preferred_behaviour($preferredbehaviour);
    $variantstrategy = new question_variant_random_strategy(); // Not used yet.
    $quba->start_question($res);
    // $quba->process_action($slot, ['answer' => 0]);

    // Build the submitted data the same way the question form would post it.
    $data = ['answer' => $ansidx];
    $prefix = $quba->get_field_prefix($slot);
    $qa = $quba->get_question_attempt($slot);

    $fulldata = array(
        'slots' => $slot,
        $prefix . ':sequencecheck' => $qa->get_sequence_check_count(),
    );
    foreach ($data as $name => $value) {
        $fulldata[$prefix . $name] = $value;
    }

    $quba->process_all_actions(time(), $fulldata);

    question_engine::save_questions_usage_by_activity($quba);

    return $res;
}

I see the index of the answer responses in the attempt_step_data table, but I don't see them reflected in the attempt or attempt_step tables. I'm sure I'm missing the finish_question method here, but I'm not sure when to call it. If I call it after every response, it seems to create a new question instance every time. I think there is a method that just updates an attempt with responses, rather than creating a new attempt, but I'm not sure what method that is. Also, a bunch of the args I pass are useless; I'll remove them once I get this working.
In reply to Arvind Murali

Re: Question specific usage analytics

by Tim Hunt -

That is very close, and the one thing you are missing is not terribly obvious. Probably a good way to see the issue is to go into the question bank and preview a multiple choice question. Make sure the preview is set to use the immediatefeedback behaviour. (Here is a direct link to the moodle demo site.) Open your browser developer tools, so you can see the post data as you do things.

At the moment, the $fulldata array you are creating matches what happens when you click the 'Save' button under the question.

What you want to happen is what happens when you click the Check button. That button belongs to the question behaviour, and you need to include its value. That is, add

$fulldata[$prefix . '-submit'] = 1;

to the array.

Also, a small point, but you should probably only call

$quba->set_preferred_behaviour($preferredbehaviour);

once, when you create the usage. It should not be necessary to call it again.
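
Putting the two fixes together, the processing part of your function would become something like this (a sketch based on your code above):

$fulldata = array(
    'slots' => $slot,
    $prefix . ':sequencecheck' => $qa->get_sequence_check_count(),
    $prefix . 'answer' => $ansidx,
    $prefix . '-submit' => 1, // The Check button, which belongs to the behaviour.
);
$quba->process_all_actions(time(), $fulldata);
question_engine::save_questions_usage_by_activity($quba);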

In reply to Tim Hunt

Re: Question specific usage analytics

by Arvind Murali -
Awesome! This was a great learning experience. Thank you Tim. I'll extend this to the other plugins I use that work with questions. I want to share my vision for the output with you, and get your pedagogical thoughts and technical pointers as well. Here is what I am thinking of: I want a report on a teacher's dashboard (preferably as a report in the dashboard itself, rather than a link to a report). The report would show the students within the courses they teach, and the number of questions each student worked on the previous day (or any given day). Eventually, the UI would look something like this:
[mockup of the proposed report UI]
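
On the data side, I imagine the count could come from the standard tables with something like this (a sketch; I'm only guessing at the right filters and date boundaries):

$sql = "SELECT qas.userid, COUNT(DISTINCT qa.questionid) AS numquestions
          FROM {question_attempt_steps} qas
          JOIN {question_attempts} qa ON qa.id = qas.questionattemptid
         WHERE qas.timecreated >= :daystart AND qas.timecreated < :dayend
      GROUP BY qas.userid";
$counts = $DB->get_records_sql($sql, ['daystart' => $daystart, 'dayend' => $dayend]);
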
I have my eye on the custom SQL report tool you built. Do you have pointers for me on how I should go about executing this?
In reply to Tim Hunt

Re: Question specific usage analytics

by Arvind Murali -
I do have a small bug. Every time an enemy ship is created with an answer, I assign it an index from 0 to x, which identifies the option. I log it and see that a certain option is identified by a certain index.

if (questions[level].type == 'multichoice') {
    questions[level].answers.forEach(function(answer, i) {
        console.log(i);
        console.log(answer.text);
        var enemy = new MultiEnemy(Math.random() * bounds.width, -Math.random() * bounds.height / 2,
                answer.text, answer.fraction, questions[level].single, answer.aid, i);
    });
}

I pass this index to the Ajax call above, which is what gets used as $data = ['answer' => $ansidx];

I check the DB after I have shot an answer, and the responsesummary entry is either different from what I shot, or NULL. I'm thinking that if the options get shuffled before being rendered in the game, then the option index would not match the order that is created when the question is added and started in the Ajax call. I do have the id of the answer that was shot. Is there a way to pass that info, instead of the array index of the answer, in $data = ['answer' => $ansidx]?
In reply to Arvind Murali

Re: Question specific usage analytics

by Tim Hunt -
Your problem will be to do with randomisation. Probably the most reliable way is to make sure the actual question_answer.id gets passed to the JavaScript with the other data, and used in the Ajax call, and then used to work out the correct index when you call process_submitted_data.
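
For a multiple choice question, something like this should work (a sketch; it relies on qtype_multichoice storing the shuffled answer order in the first step's '_order' variable):

$qa = $quba->get_question_attempt($slot);
$order = explode(',', $qa->get_step(0)->get_qt_var('_order'));
$ansidx = array_search($answerid, $order);
$quba->process_action($slot, ['answer' => $ansidx, '-submit' => 1]);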
In reply to Tim Hunt

Re: Question specific usage analytics

by Arvind Murali -
Got this working! Almost there! The only problem left now is that the first response for a slot gets logged correctly, but subsequent responses for the same slot (if the first one was wrong, for example) do not get a responsesummary. I still add the new question and all that, but use the same slot.
In reply to Arvind Murali

Re: Question specific usage analytics

by Arvind Murali -
The state in the steps table seems to be 'todo' instead of 'gradedright' or 'gradedwrong'. I'm not sure if a finish_question type method needs to be called. The slot number in the attempts table also seems to auto-increase, despite my creating the questions in the same slot. Not sure if this is expected behaviour.
In reply to Arvind Murali

Re: Question specific usage analytics

by Tim Hunt -
If you want to track all the responses, then perhaps you need to use a different question behaviour. Perhaps adaptive? Depending on what you do, you may also need to call $quba->finish_question($slot) at the appropriate time.
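
That is, something like this (a sketch):

// The adaptive behaviour grades each submit but lets the student try the
// same slot again, so every response ends up in the steps table.
$quba->set_preferred_behaviour('adaptive');
// ... add, start, and process each '-submit' action as you do now ...
// When the game (or that question) is over, close it, so its state moves
// from 'todo' to a graded state:
$quba->finish_question($slot);
question_engine::save_questions_usage_by_activity($quba);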
In reply to Tim Hunt

Re: Question specific usage analytics

by Arvind Murali -
Got it. Thanks. I'll figure this out. Any thoughts on the report I mentioned a few posts ago? 
In reply to Arvind Murali

Re: Question specific usage analytics

by Tim Hunt -
Sorry, not really. But you should care more about what the teachers you are building this for think than what I think.