Plugin Feedback Requested


by Marty Gilbert -
Number of replies: 25

Hello all,

After hearing about the Purdue Signals Project at the 2013 Moodle Moot in Portland, I - along with one of my senior students at the time - began work on a simple project, Moodle Meter, that would:

  1. Calculate each student's course activity*, relative to others in the course.

  2. Assign each student to one of five levels: 1 being well below average, 3 average, 5 well above average.

  3. Show each student their level, as an indication of how active they are on the course site relative to their classmates.

     [Screenshot: Activity Levels]

  4. Show the instructor a list of all students' levels, including the ability to graph the change in activity over time.

     [Screenshot: Teacher Block]

[Screenshot: Graph]


*The calculation of a student's activity score is terribly simple: it essentially scans the Moodle logs for occurrences of certain types of activities, and each class of activity is weighted in the block config. I feel this is one of the weak points of our calculations.
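To make the footnote concrete, here is a minimal sketch (in Python, for illustration; the actual block is a PHP Moodle plugin) of the kind of weighted log scoring described. The category names and weights are invented examples, not Moodle Meter's actual configuration:

```python
# Hypothetical sketch of weighted log scoring: count log events per
# category, multiply by instructor-configured weights, and sum.
# Category names and weights below are invented for illustration.
from collections import Counter

def activity_score(log_events, weights):
    """Sum of per-category event counts times their configured weights."""
    counts = Counter(event["category"] for event in log_events)
    return sum(weights.get(cat, 0) * n for cat, n in counts.items())

weights = {"quiz": 50, "forum": 20, "assign": 30}
logs = [{"category": "quiz"}, {"category": "quiz"}, {"category": "forum"}]
print(activity_score(logs, weights))  # 2*50 + 1*20 = 120
```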


[Screenshot: Meter Config]


I beta-tested this block with an opt-in group of faculty this spring, and the feedback was positive. Most said that it simply reinforced their gut feeling, but it was good to have data to back it up.

I was hoping to get some general feedback from the developers on this board about the block's overall weaknesses, and ways to improve it if it seems worthwhile.

The project, as it stands, is somewhat stable, but lacks testing code and externalization of strings (among other things, I'm sure) to bring it up to Moodle plugin standards.

Thanks for any and all comments,


Marty Gilbert



Average of ratings: Useful (5)
In reply to Marty Gilbert

Re: Plugin Feedback Requested

by Howard Miller -

Link to your code (or better yet, Github page)??

Average of ratings: Useful (2)
In reply to Howard Miller

Re: Plugin Feedback Requested

by Marty Gilbert -

Hi Howard,

The code needs much work, but you can find it here: https://github.com/MarsHillUniversityCS/Moodle-Meter

I think it supports Moodle 2.6 - 2.9, but needs much more testing.

Thanks,

Marty

In reply to Marty Gilbert

Re: Plugin Feedback Requested

by Dr. Indira Koneru -

Hi Marty,

The current version of Moodle Meter doesn't support Moodle 2.8; it requires Moodle 2015051100.03.

Indira

In reply to Dr. Indira Koneru

Re: Plugin Feedback Requested

by Just H -
The 2.8 version requires Moodle 2014111006.03.
In reply to Dr. Indira Koneru

Re: Plugin Feedback Requested

by Marty Gilbert -

Hi Indira -

As Mr. H indicated, I believe you might have the wrong branch checked out. Try the MOODLE_28_STABLE branch instead of the master branch, which is most likely tracking the most recent (2.9) version.

Marty

In reply to Marty Gilbert

Re: Plugin Feedback Requested

by Dr. Indira Koneru -

I installed the 2.8 stable version, but am still stuck with the error "Dependencies check failed for block_meter". Screenshot attached.

Attachment Meter.png
In reply to Dr. Indira Koneru

Re: Plugin Feedback Requested

by Marty Gilbert -

Hi Indira,

I'll contact you offline, so we don't hijack this thread, and send you the code for Moodle Meter for Moodle 2.8. Unfortunately, you can't just download the code from GitHub; you need to run

git checkout MOODLE_28_STABLE

in the 'meter' directory to get the correct version.

Thanks to everyone for the valuable feedback. If you decide to use Moodle Meter at your institution, let me know what improvements you'd like to see!

Thanks,

Marty


In reply to Marty Gilbert

Re: Plugin Feedback Requested

by Rob Monk -

I love it!

Just the kind of thing that engages students and gives staff easy-to-read feedback on what students are doing.

Our school will definitely use this.

In reply to Marty Gilbert

Re: Plugin Feedback Requested

by Roland Sherwood -

This looks great, Marty, and we'd definitely be keen to make use of the block too (and, yes, please could you provide a GitHub link if possible).

In terms of additional features, perhaps the option to factor in overall time spent in the course might prove useful also?

Really looking forward to seeing how this progresses.

In reply to Roland Sherwood

Re: Plugin Feedback Requested

by steve miley -

Roland, the question of how much time a student spends on the course site is a great one, but it's not clear to me how to determine that number. If an instructor uses the lesson module, there is definitely a series of clicks, and likewise for going through a quiz. But uploading an assignment (if it's a file) can be pretty fast.

Do you have ideas on how the "total time" might be computed?


In reply to steve miley

Re: Plugin Feedback Requested

by James McLean -

Without a definitive method of determining whether a student has 'left' a course, the number will never be accurate. Reading a long forum thread or a long page in a book may take many minutes with no direct activity in that time frame, even though the student is still very much active in the course.

Looking at things like the logout button being clicked is not very reliable; with the exception of online banking, I never bother to log out of any system anymore, and I'd bet the majority of people are the same.

I suppose you could say that if there was no activity for, say, 30 minutes, or there was activity outside the course, then they're no longer engaged in this course, but again that's not really definitive enough if you ask me.
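One common heuristic along the lines of that 30-minute idea is to sort a student's event timestamps and sum the gaps between consecutive events, discarding any gap longer than the cutoff. A sketch in Python (the cutoff and the use of raw second timestamps are assumptions for illustration, not anything Moodle provides):

```python
def estimated_time_on_site(timestamps, cutoff=30 * 60):
    """Approximate time on site by summing gaps between consecutive
    events, ignoring any gap longer than `cutoff` seconds (the student
    is assumed to have 'left' during such gaps)."""
    ts = sorted(timestamps)
    return sum(cur - prev
               for prev, cur in zip(ts, ts[1:])
               if cur - prev <= cutoff)

# Three clicks five minutes apart, then a two-hour break, then one click:
clicks = [0, 300, 600, 600 + 2 * 3600]
print(estimated_time_on_site(clicks))  # 600: the two-hour gap is dropped
```

As the posts above note, this only ever approximates engagement: a student reading a long page registers no events, and a student who walks away mid-session registers a short gap.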

In reply to steve miley

Re: Plugin Feedback Requested

by Rob Monk -

Total time is problematic to calculate. Do I log in and make a coffee to get my stats up?

For me, working in a K-12 school, a major indicator of engagement is logins outside school hours. This is effectively a "homework" checker: weekend and after-hours logins show commitment, and you should be able to give them additional weighting.

In reply to Roland Sherwood

Re: Plugin Feedback Requested

by Marty Gilbert -

Hi Roland,

I posted a link to the Github site above - and thanks for the feedback!

I'm not sure how I could/would capture time on the site - but I, too, think that would be valuable data.

Thanks,

Marty

In reply to Marty Gilbert

Re: Plugin Feedback Requested

by steve miley -

Marty - I like what you've done here. I think what you have is a great start and might be plenty to get going.

A few thoughts, though (future ideas):

Think of having multiple "engagement" indexes that can be viewed, as we don't currently really know what makes an engaged student. This way an instructor could view one to five different engagement indexes.

Recording this activity per week would be a bit more complicated, but it would allow tracking trends and identifying students who are slipping, or who might be more engaged, over time. This would require an instructor to set the week start date, which might differ between instructors.

Being able to correlate engagement with the course total could be used to assess the accuracy of an engagement index. I've used scatterplots in the past to compare activity vs. grades.

I like the method you have of applying weights to different items.

There can be lots of "noise" in the logging of activity, such as quiz views. The logging calls were inserted by a programmer (nothing against the programmer), not a pedagogical expert, who might have chosen different locations to place these log recordings or sensors. So analyzing the logging that happens with each activity, and possibly pruning out the noise, might be useful.

Great work! 

Steve


Average of ratings: Useful (1)
In reply to steve miley

Re: Plugin Feedback Requested

by Marty Gilbert -

Steve - thanks for the excellent feedback.

I like the idea of computing different engagement indexes - as I said earlier, the metric we're currently using is very basic, and, while it allows for some flexibility, it's still just one metric.

Moodle Meter records activity daily, so you can see trends over time via the graph feature - is there an advantage to doing it weekly versus daily?

I had a social science professor do some unprompted correlation analysis early in the semester. He had modified the weights of his engagement metric for how he presented his course materials on Moodle, and found an R value of 0.393, which suggested a moderate positive correlation. I had hoped to do more such analysis this summer to see if this held true across every course, but it just hasn't happened. It may do nothing more than measure how well (or if) a professor modified the weights to match what they deemed important.

A (somewhat) major flaw (or feature?) of Moodle Meter is how it counts different types of activity. If the log event says "quiz", it counts it as a mark in the 'quiz' category and weights it as such, regardless of whether it was a simple 'view' or a 'submit'. Thus, if quizzes were weighted 50 points and a student hit 'refresh' on the quiz view page 100 times, that student just earned 5,000 engagement points. I had some students trying to "win" the Moodle Meter game - not exactly what I was going for. :)
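One possible mitigation, sketched here in Python with assumed field names (the 'viewed' action value mirrors the new Moodle logs, but this scoring function is hypothetical, not Moodle Meter's code): give bare 'viewed' actions their own zero or low weight, so refreshes stop accruing full category points.

```python
def score_event(event, weights, view_weight=0):
    """Weight an event by its category, but give bare 'viewed' actions
    their own (low or zero) weight so refreshes can't be farmed."""
    if event["action"] == "viewed":
        return view_weight
    return weights.get(event["category"], 0)

weights = {"quiz": 50}
refreshes = [{"category": "quiz", "action": "viewed"}] * 100
submit = [{"category": "quiz", "action": "submitted"}]
print(sum(score_event(e, weights) for e in refreshes + submit))  # 50, not 5050
```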

Thanks,

Marty



Average of ratings: Useful (1)
In reply to Marty Gilbert

Re: Plugin Feedback Requested

by Elizabeth Dalton -

Glad to hear someone looked at the correlation data. :) I'm doing a lot of this kind of reporting and analysis with our data. Currently (using legacy logs) I distinguish between "views" and "edits", and I don't count "views" as activity. I think in your widget they should at least be different categories of activity, so they could be weighted differently by the instructor (e.g. include "views" but with a very low weighting compared to "edits").

In the legacy logs, I looked for mdl_log.action NOT LIKE "view%". For the new logs, it looks like something similar will work, e.g. mdl_logstore_standard_log.action NOT LIKE "view%" (or mdl_logstore_standard_log.action NOT LIKE "viewed", or even mdl_logstore_standard_log.action <> 'viewed', depending on which is fastest), though I want to do some testing. I'll be working on this over the next couple of weeks as we transition to the new log system.
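As a quick, runnable illustration of that filter (using SQLite in place of MySQL, with a toy table standing in for the real mdl_logstore_standard_log):

```python
import sqlite3

# Toy stand-in for mdl_logstore_standard_log with only the 'action' column.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE log (action TEXT)")
conn.executemany("INSERT INTO log VALUES (?)",
                 [("viewed",), ("viewed",), ("created",), ("submitted",)])

# Count only non-view actions, per the NOT LIKE 'view%' filter above.
edits = conn.execute(
    "SELECT COUNT(*) FROM log WHERE action NOT LIKE 'view%'"
).fetchone()[0]
print(edits)  # 2
```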

Average of ratings: Useful (1)
In reply to Marty Gilbert

Re: Plugin Feedback Requested

by Elizabeth Dalton -

Also, while there is an appealing simplicity to comparing student activity to other students in the same course in terms of "most" and "least," I'm mildly concerned that there is a temptation to equate "most" with "enough" and "least" with "not enough," which may not be valid at all. If all students in a given class are participating approximately equally, what does this block display? Are they all going to get "average" participation? If the whole group is participating well, this may undercut the value of that participation. On the other hand, if the whole group is disengaged and they're all getting "average," this may justify low participation to students and/or instructors. I'm reluctant to suggest site-wide comparisons, both for performance reasons and because different subject areas and types of courses can have vastly different participation patterns, but some kind of disclaimer seems appropriate to keep people from misinterpreting (or over-interpreting) these results.

Average of ratings: Useful (1)
In reply to Elizabeth Dalton

Re: Plugin Feedback Requested

by Marty Gilbert -

Hi Elizabeth,

I believe that both of your posts make excellent points.

I think the next tweak I'll look at is differentiating between a category's 'view' and 'update'/'add'/'etc'. That's fairly trivial, as far as the code goes, and could go a long way to filtering out the "noise" mentioned by Steve Miley upthread. Thanks for the suggestion.

As to your second post re: students mis/over-interpreting their Moodle Meter level, I think that's a valid concern. I tried to carefully word the description with their "gauge" ("Your Moodle activity is ______ compared to your peers"), and I'd love to hear suggestions for different wording, or perhaps adding a link under the gauge to further explanation? 

The two examples you've offered, where the entire class over- or underachieves equally...well, this tool might not be great for that situation. I wonder how plausible those situations are. I could see a class equally underachieving as the more likely scenario, I guess; an equally overachieving class I find much less likely. I would posit that as activity increases, the chance that mean/sd calculations can highlight the differences also increases.
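For what it's worth, here's a sketch of binning scores into the five levels from the original post via mean/sd (the z-score cutoffs here are my own assumption, not necessarily what Moodle Meter uses). It also illustrates Elizabeth's scenario: with equal scores the standard deviation is zero and everyone lands at level 3, "average".

```python
import statistics

def level(score, all_scores):
    """Map a raw activity score to a 1-5 level via z-score cutoffs.
    Cutoffs (+/-0.5, +/-1.5 sd) are illustrative assumptions."""
    mu = statistics.mean(all_scores)
    sd = statistics.pstdev(all_scores) or 1  # equal scores: everyone 'average'
    z = (score - mu) / sd
    if z < -1.5:
        return 1
    if z < -0.5:
        return 2
    if z <= 0.5:
        return 3
    if z <= 1.5:
        return 4
    return 5

scores = [10, 20, 30, 40, 100]
print([level(s, scores) for s in scores])  # [2, 2, 3, 3, 5]
print([level(s, [5, 5, 5]) for s in [5, 5, 5]])  # [3, 3, 3]
```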

Thanks for the valued input,

Marty


In reply to Marty Gilbert

Re: Plugin Feedback Requested

by Bob Puffer -

Marty, any chance you'll be at the Moodle Moot US 2015? Would love to have you on the Learning Analytics working group.

In reply to Bob Puffer

Re: Plugin Feedback Requested

by Marty Gilbert -

Hi Bob,

I'm actually going to the Moodle MountainMoot instead of Moodle Moot US 2015 this year; the dates were a little friendlier to my calendar, since school begins in mid-August.

I would love to be a fly on the wall of the Learning Analytics working group, though. Will any notes/minutes be posted? Anyone accepting bribes to post video? :)

Thanks,

Marty

In reply to Marty Gilbert

Re: Plugin Feedback Requested

by Rob Monk -

I've included a link to this discussion in the Learning Analytics working group at MoodleMoot Australia next week.




In reply to Marty Gilbert

Re: Plugin Feedback Requested

by Bob Puffer -

All, bear in mind that the reason Purdue's Course Signals can show some success is that it messages students in trouble directly.

In reply to Bob Puffer

Re: Plugin Feedback Requested

by Elizabeth Dalton -

Do you think automatically sending messages to students is the most helpful component, or alerting instructors and/or advisors? I wonder how much automated "canned" reminders motivate students to participate more, compared to a personal note from an instructor.

In reply to Elizabeth Dalton

Re: Plugin Feedback Requested

by Marty Gilbert -

Bob and Elizabeth -

You both make excellent points, and I'm actually already in the process of adding a block config setting that will allow automatic feedback to be sent periodically to students who have lower relative activity.

But because the 'canned' feedback might be ignored, I'm also adding the ability for the instructor to send them personal notes through the Moodle Meter interface.

Lastly, I'm working on an interface, accessible to users with elevated privileges, that will show a summary of a student's Moodle Meter levels across all courses. We have an office of student success that can use that as yet another piece of information to help catch students before they pass the point of no return, so to speak.

All good points - thanks for the valuable feedback.

Marty