Analytics and reporting

When people have different uses for Moodle - what are the consistent concepts that can be analysed?

 
Gavin Henrick

I remember when I wrote my blog post on rollovers and how people did them; it was based on experience with many organisations, all different, and all of which pretty much had their own local culture of how they
a) used a Moodle course
b) enrolled people
c) kept their coursework in Moodle, and to what extent

So some of the things I noticed were:

  • course start exists as a field on a course, but it was often not the actual start date of the specific module for the users in it; frequently it was just the date the course was created, manually or by an external bulk-create process (a quick sanity check is sketched after this list)
  • the end date probably has no value; it is new in 3.2, so most SIS integrations (manual or automatic) will not populate it yet
  • a course may have the same shortname/ID year on year
  • most do not use activity completion
  • assignment activities are liable to be hidden after their end dates, and only sometimes unhidden again
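
For example, a rough sanity check could compare the stored start date against the first logged event per course. This is only a sketch: it assumes the default "mdl_" table prefix and the standard log store, conn is any Python DB-API connection to the Moodle database, and the 14-day threshold is an arbitrary choice.

DRIFT_DAYS = 14  # arbitrary threshold for "this startdate looks wrong"

QUERY = """
SELECT c.id, c.shortname, c.startdate, MIN(l.timecreated)
FROM mdl_course c
JOIN mdl_logstore_standard_log l ON l.courseid = c.id
WHERE c.startdate > 0
GROUP BY c.id, c.shortname, c.startdate
"""

def suspicious_startdates(conn, drift_days=DRIFT_DAYS):
    """Yield (courseid, shortname, drift_in_days) for courses whose stored
    startdate is more than drift_days away from the first logged event."""
    cur = conn.cursor()
    cur.execute(QUERY)
    for cid, shortname, startdate, first_event in cur.fetchall():
        drift = abs(first_event - startdate) / 86400  # unix seconds -> days
        if drift > drift_days:
            yield cid, shortname, round(drift, 1)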


About course enrolments

  • There may be multiple sets of students from different modules in the course at the same time, or back to back without resets; some students are left with access for 3-4 years while at college
  • a course may be reused year on year with just a reset
  • a course may exist for years on the Moodle site and be used just for reference, while each new running of the module is a clone of it


Grades

  • The final grade for the overall course is most likely not in Moodle (as most courses are blended, not fully online)
  • The gradebook may have formative and summative items mixed together, as many institutions export the summative (the actual serious graded activities) into their student record system (a way to audit this is sketched after this list)
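
One way to see how mixed a site's gradebooks really are is to profile what each one contains. A sketch, again assuming the standard schema: in mdl_grade_items, itemtype 'course' is the course total, 'mod' rows are activity grades, 'manual' rows are teacher-entered items and 'category' rows are category totals. Whether a given item is formative or summative still needs local knowledge, which is exactly the problem.

QUERY = """
SELECT courseid, itemtype, itemmodule, COUNT(*)
FROM mdl_grade_items
GROUP BY courseid, itemtype, itemmodule
ORDER BY courseid
"""

def gradebook_profile(conn):
    """Return {courseid: {(itemtype, itemmodule): count}} - a rough picture
    of how each course actually uses its gradebook."""
    profile = {}
    cur = conn.cursor()
    cur.execute(QUERY)
    for courseid, itemtype, itemmodule, n in cur.fetchall():
        profile.setdefault(courseid, {})[(itemtype, itemmodule)] = n
    return profile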


What does this mean for doing generic analytics?

  • Can we rely on the course start/end dates, or are the enrolment start/end dates more dependable?
  • Can we use activity completion, or are view/submit/graded events more dependable?
  • The gradebook is only what is in it, and it is all we have to go on; how do we know what it really means?
  • Some users may be in one assignment and some in others, if groups are used
  • Some gradebook entries may not actually contribute to a real final grade


So what can an analytics engine potentially and reliably use across most types of course?

Dates

  • Perhaps ongoing "periods of time" can be used, looking at who/what is happening within a period of time
  • Students who are active during a set period (if analysing on a week, then only those active in that week, for example), comparing only within that window (see the sketch after this list)
  • Grading of activities; we can only go on what is there (or not)
  • Activity on activities (or the lack of it)
  • The detail within rubric responses on assignments
  • The detail within marking guide responses on assignments
  • Overall access patterns to non-activity items (messaging, gradebook, calendar)
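
The window idea can be made concrete. A minimal sketch, assuming the events have already been pulled from the standard log as (userid, timestamp) pairs for one course; the "below the median of active peers" flag is just one illustrative comparison:

import datetime
import statistics
from collections import defaultdict

def weekly_activity(events):
    """events: iterable of (userid, unix_timestamp) log rows for one course.
    Returns {(iso_year, iso_week): {userid: event_count}}."""
    weeks = defaultdict(lambda: defaultdict(int))
    for userid, ts in events:
        year, week, _ = datetime.date.fromtimestamp(ts).isocalendar()
        weeks[(year, week)][userid] += 1
    return weeks

def below_week_median(weeks):
    """Flag (week, userid) pairs below the median of users active in that
    week; users inactive that week are simply outside the comparison."""
    flags = []
    for week, counts in weeks.items():
        med = statistics.median(counts.values())
        flags.extend((week, uid) for uid, n in counts.items() if n < med)
    return flags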


In short: what is reliable, consistent data within inconsistent course structures, design and implementation?

My 2 cents.

What do you think?

 
Elizabeth Dalton
Re: When people have different uses for Moodle - what are the consistent concepts that can be analysed?

Thanks, Gavin.

All of these points are valid. We do have a way of estimating course start and end for courses that are used once (based on the majority of activity in the course), but this gets much harder with courses that are reused without reset. We have also been hoping to develop a bulk grade import tool so that we have some valid measures of success (for academic courses) on which to base predictions.
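
To make that estimate concrete: one way it could work (the heuristic Moodle core actually uses may differ) is to take low and high percentiles of the event timestamps in a course as its effective start and end. A course reused for years without reset would smear into one long period, which is exactly the failure mode above.

import datetime

def estimate_course_period(timestamps, lo=0.05, hi=0.95):
    """timestamps: unix times of all logged events in one course.
    Returns (estimated_start, estimated_end) dates, or (None, None)."""
    ts = sorted(timestamps)
    if not ts:
        return None, None
    start = ts[int(lo * (len(ts) - 1))]
    end = ts[int(hi * (len(ts) - 1))]
    return (datetime.date.fromtimestamp(start),
            datetime.date.fromtimestamp(end))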

We are looking into adding "expect start" and "expect complete" date fields to enrolment records as one way of improving predictions, but obviously these fields won't help if they aren't filled in.

There is an unavoidable tradeoff between data quality for predictions and effort involved in maintaining the data. Much of this effort can be automated if courses and enrolments are generated from SIS data, but changing the automation scripts is an effort in itself.
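
If courses are already provisioned from SIS data, the same feed can carry real dates with them. A sketch only: the record shape is invented for illustration, and the column names would need to match whatever your provisioning route (the admin "Upload courses" tool, or a custom script) actually expects.

import csv

def write_course_feed(sis_records, path="courses.csv"):
    """sis_records: iterable of dicts with shortname, fullname, and start/end
    dates - a shape assumed purely for this sketch."""
    with open(path, "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(["shortname", "fullname", "startdate", "enddate"])
        for rec in sis_records:
            writer.writerow([rec["shortname"], rec["fullname"],
                             rec["start"].isoformat(), rec["end"].isoformat()])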

Our question in the analytics team is, what tradeoff are people willing to make? How much data would site admins, teachers, etc. be willing to enter to improve predictions, and conversely, how low an accuracy rate on predictions are people willing to accept if their data is incomplete?

We have no assumptions about this in the Learning Analytics team. We need to hear from the Moodle community on this.

 
Mary Cooch
Re: When people have different uses for Moodle - what are the consistent concepts that can be analysed?

From a K12 point of view, I can definitely relate to much of what Gavin says. In my former establishment, course start dates were indeed meaningless, as they were the date the admin made the course. They'd enrol the students then, or later (e.g. at the start of a term). For courses with younger, non-exam students, the courses would simply be reset each year: previous students kicked out, new classes brought in and none of the data saved. For older, exam-class students, the students would be retained in an archived group ("Class of 2015") and the new students brought in, meaning Moodle courses with years of students together.

The gradebook was used inconsistently across departments, and (as this was a school) much grading was done offline and not synced with any Moodle grades. The greatest use of Moodle assignments was (and still is) merely to record that work had been submitted. One department made a custom scale, "handed in"/"not handed in", for that purpose. They hardly ever use(d) activity completion, and that's a shame because I think such reports are really useful. Course completion, however, is kind of meaningless in such environments.

During the time I was there and we had Moodle, final exam results went up. Management was often asked "what evidence do you have that Moodle helped improve the grades?", but there wasn't any, really. It would have been (and will be) excellent to have something easily used by schools and colleges. As an aside, my old school, and I suspect many others, are in love with SISRA Analytics, but I doubt anything Moodley is taken into account.

 
Marcus Green
Re: When people have different uses for Moodle - what are the consistent concepts that can be analysed?
My background is in UK FE (mainly technical education, possibly similar to Australian TAFE) and university degree courses. I don't recall whether start and end dates were stored in FE, but they would not have been particularly meaningful, as a large percentage of students would trickle onto courses for weeks after the start and a small percentage would hang around for up to a month after the official course end.

In both organisations courses were not re-used; a new course was created automatically, typically close to the course start date. The final course grade was not stored in Moodle but was hand-entered into the School Management System. The SMS was the final arbiter of a student's overall grade.

The gradebook contained graded activities that were formative, summative, and sometimes just a way of getting the students to submit stuff that was never expected to be graded.

To address Gavin's final question about "what is reliable data within inconsistent course structures": my answer is that there is none, and to get it you need to change organisational policies and individual attitudes.

I rather like activity completion as a measure of progress, as it is turned off by default, meaning that where it is turned on there has been a deliberate decision to address it. It is also binary, i.e. you have either completed or not completed an activity, in contrast to other areas of grading, which can vary wildly between two different teachers.
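
That binary signal is also easy to aggregate. A sketch, assuming the standard schema, where (as I recall) completionstate 0 means incomplete and 1/2/3 mean complete, complete-pass and complete-fail; the %s placeholder style assumes a MySQL- or PostgreSQL-style DB-API driver:

QUERY = """
SELECT cmc.userid,
       SUM(CASE WHEN cmc.completionstate >= 1 THEN 1 ELSE 0 END),
       COUNT(*)
FROM mdl_course_modules_completion cmc
JOIN mdl_course_modules cm ON cm.id = cmc.coursemoduleid
WHERE cm.course = %s
GROUP BY cmc.userid
"""

def completion_rates(conn, courseid):
    """Return {userid: fraction completed}, counted only over the modules
    each user has a completion row for - an approximation."""
    cur = conn.cursor()
    cur.execute(QUERY, (courseid,))
    return {uid: done / tracked for uid, done, tracked in cur.fetchall()}

It is only meaningful, of course, in courses where completion was deliberately enabled.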
 
Tom Murdock
Re: When people have different uses for Moodle - what are the consistent concepts that can be analysed?

The questions you raise make perfect sense to me; they are the conversations I always want to have with Moodlers.

  • Do you know what settings are necessary to make the best use of a Moodle dashboard?
  • What learning outcomes do you want to address within Moodle?
  • How will you know if outcomes have been fulfilled?
  • What elements are consistently designed across a set of courses so that they can be compared?
  • What have your teachers/educators/trainers agreed upon as "consistent practice"?
  • What factors make certain elements more or less successful than other elements? 
  • Do the various course designs create value for students, or are the different patterns confusing?
  • If you teach hybrid courses, what activities do you want to save for face-to-face meetings?

Backward Design (Wiggins) can't be ignored in online courses, especially when you intend to compare apples to apples with analytics.  Plus, it is really liberating to find some agreement among peers on what they want to see in online instruction.

-Tom  

 