Learning Analytics tools available in Moodle

by Michael de Raadt -
Number of replies: 17

There are a number of reports, blocks and other plugins for Moodle that provide learning analytics. Some are standard plugins (part of the standard distribution) and some are third-party plugins (available from the Plugins Directory).

Starting with a definition...

Learning Analytics is any information that can help an LMS user improve learning outcomes. Users include students, teachers, administrators and decision-makers.

...the following plugins are available.

| Plugin | Type | Standard/Third-party | Useful for | Reported usage* |
| --- | --- | --- | --- | --- |
| Logs | Report | Standard | Teachers, Admins, Decision-makers | 71.4% |
| Activity | Report | Standard | Teachers | 69.1% |
| Activity completion | Report | Standard | Teachers | 66.3% |
| Live logs | Report | Standard | Teachers, Admins | 55.2% |
| (Quiz) Statistics | Report | Standard | Teachers | 53.0% |
| (Course) Participation | Report | Standard | Teachers | 49.9% |
| Course overview | Report | Standard | Admins, Decision-makers | 45.0% |
| Course completion status | Block | Standard | Students | 41.4% |
| Progress Bar | Block | Third-party | Students, Teachers | 32.0% |
| Events list | Report | Standard | Teachers, Admins | 28.6% |
| Activity results | Block | Standard | Students | 26.1% |
| Configurable Reports | Block | Third-party | Teachers, Admins, Decision-makers | 22.7% |
| Ad-hoc database queries | Report | Third-party | Teachers, Admins, Decision-makers | N/A |
| Course Dedication | Block | Third-party | Students, Teachers | N/A |
| Graph Stats | Block | Third-party | Teachers, Admins | N/A |
| Engagement Analytics | Block, Report, Activity | Third-party | Teachers | N/A |

* Reported usage is drawn from the Plugins Usage Survey from 2015.

Questions for you...

  • What is your experience using these plugins as a source of Learning Analytics?
  • Do you know of any other Learning Analytics plugins for use within Moodle not in this list?
  • Would you classify these plugins differently?


Average of ratings: Useful (2)
In reply to Michael de Raadt

Re: Learning Analytics tools available in Moodle

by Matt Bury -

Hi Michael,

AFAIK, the research on learning analytics is mostly inconclusive, with some exceptions. The most consistent findings I've seen (by no means a comprehensive review, though) are that how soon learners start work on assignments, problem sets, etc. and how much time they spend online (presumably rough indicators of "time on task") are the two strongest predictors of learner attrition (i.e. failure and/or drop-outs). If anyone has links to research/articles on more predictors, please post them. I'd love to know. :)
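For what it's worth, here is a rough sketch of how those two indicators might be computed from exported activity logs (the log format, field names and the 30-minute session heuristic are all assumptions for illustration, not any particular plugin's behaviour):

```python
from datetime import datetime

# Hypothetical exported log rows: (user, timestamp, event name)
logs = [
    ("alice", datetime(2015, 3, 2, 10, 0), "assign_viewed"),
    ("alice", datetime(2015, 3, 2, 10, 20), "forum_posted"),
    ("bob",   datetime(2015, 3, 9, 22, 0), "assign_viewed"),
]
assignment_opened = datetime(2015, 3, 1)

def indicators(user):
    events = sorted(t for u, t, _ in logs if u == user)
    if not events:
        return None
    # Indicator 1: how soon the learner first started work after the assignment opened
    days_to_start = (events[0] - assignment_opened).days
    # Indicator 2: crude "time online" - sum of gaps between events shorter than 30 minutes
    hours_online = sum(
        (b - a).total_seconds() / 3600
        for a, b in zip(events, events[1:])
        if (b - a).total_seconds() < 30 * 60
    )
    return {"days_to_start": days_to_start, "hours_online": round(hours_online, 2)}

for user in sorted({u for u, _, _ in logs}):
    print(user, indicators(user))
```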

In my experience, although I haven't used them much, the Engagement Analytics suite of plugins is a fairly good indicator/predictor. Some of the stats from the standard reports can be useful, but they are difficult to interpret in meaningful, actionable ways for anticipating all but the more extreme cases of lack of participation.

Also worth a mention are the Participation Forum and Participation Map plugins, although it doesn't look like they're maintained and they are difficult to install (they require some code modification). It's a shame, because I think they're great plugins:

http://www.participationforum.org/

http://www.participationmap.org/

There are presentations on them here: http://brant.knutzen.se/seminars-workshops/

In reply to Michael de Raadt

Re: Learning Analytics tools available in Moodle

by David Jones -
G'day Michael,

I've used Activity Completion in a way that fits the definition of learning analytics. I also use BIM (I imagine I'm one of a very small number that does). Haven't used any of the others you have listed.

Activity Completion and BIM are used because they were the primary components of how I designed the course, and hence are most likely to provide the information that is deemed useful for my students and me.

In both cases, I've adapted the data provided by both tools to better fit the design of the course.

The gap between generic tools and the specifics of the learning design is an interest of mine. I think it will remain one of the challenges learning analytics has to overcome before it can make a huge impact.

Different classifications?


There doesn't seem to be much classification in the list above that has resulted directly from learning analytics, except perhaps "useful for" and whether or not these are learning analytics at all.

"Useful for" perhaps comes down to design and intent.

Under your definition of learning analytics, I'd tend to suggest that Activity Completion is useful for both
  • teachers, e.g. it allows identification of students who may be falling behind; and
  • students. I've had comments from students that the "ticking of boxes" helps keep them on track, organised and motivated.


YMMV.

David.



Average of ratings: Useful (2)
In reply to David Jones

Re: Learning Analytics tools available in Moodle

by Derek Chirnside -

Big analytics or Little analytics?

This post was prompted more by the blog post over at MoodleNews on Intelliboard. You can get a ton of data from Intelliboard.


However, most of the tutors I work with need something a lot simpler.

---

Case study/illustration: Moodle forum use

e.g. I've set a question in a forum and I want to see who has answered.
Corollary: I'd like to remind the non-posters.

Is this 'analytics' in the accepted use of the word?

This simple addition would enable teachers to manage an emergent constructionist oriented class a little more effectively.

Here is what Moodle currently offers:


Think of an average busy tutor trying to make sense of this and actually use the information here. Not only is the information hidden away a little, it is not processed into a user-friendly format, and you cannot do anything with it without massaging.

D2L and Canvas have had this sorted for at least two, maybe four, years. Both products have 'analytics' as a marketing tool; in fact, D2L has gone over the top and made promises I suspect are unrealistic. But they also have something a bit more accessible at this very basic level.

-Derek


Average of ratings: Useful (2)
In reply to Derek Chirnside

Re: Learning Analytics tools available in Moodle

by Michael de Raadt -

Hi, Derek.

For ad-hoc questions, a generic reporting API was suggested and explored a couple of years ago. A spec was started. I think a lot of people would have liked to see this become part of Moodle, but it hasn't gone far due to the complexity of making it happen across Moodle. It still appears on the Roadmap and I hope it gets some attention.

For the more specific questions of involvement and risk, the Learning Analytics working group at MootUS15 had a go at starting a Learning Analytics API. A spec has been started, but it's early days yet. Something like that may be able to identify people who are not involved in activities like forums and allow teachers to contact such students.

There was also an idea floated by Steve Miley at the Learning Analytics working group, which could be a very simple, useful LA tool for students. By comparing activity to other students in the course, students could be prompted to view/post/submit. For example:

43% of students have viewed PAGE X

82% of students have posted to FORUM Y

29% of students have submitted ASSIGNMENT Z

This could be implemented as a block in Moodle (using an observer to track things), but that assumes students are coming to the course. Perhaps it could also be used to trigger messages to students periodically, perhaps with some conditions.
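As a rough sketch of the calculation such a block would do (the data structures below are invented for illustration, not Moodle's actual tables or APIs):

```python
# Hypothetical records of which enrolled students have "done" each activity
enrolled = {"alice", "bob", "carol", "dave"}
activity_done = {
    "PAGE X (viewed)":          {"alice", "bob"},
    "FORUM Y (posted)":         {"alice", "bob", "carol"},
    "ASSIGNMENT Z (submitted)": {"carol"},
}

def prompts(student):
    """Messages such a block might show a student, comparing them to the cohort."""
    lines = []
    for activity, done in activity_done.items():
        pct = 100 * len(done) / len(enrolled)
        if student not in done:
            lines.append(f"{pct:.0f}% of students have completed {activity} - you haven't yet.")
    return lines

print("\n".join(prompts("dave")))
```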


Average of ratings: Useful (1)
In reply to Michael de Raadt

Re: Learning Analytics tools available in Moodle

by Derek Chirnside -

This post doubled in size as I wrote it.

Post abstract: let's have some simple analytics [based on workflows needed by real tutors] that are easily gathered, with an easy way to respond to the information, rather than trying to analyse every data point, and rather than having to subject data to spreadsheet analysis first.

Hit 'delete' now if you like.

-Derek

---

Post:

Question: Michael, what is an 'ad hoc' question in this context?

Comment: "By comparing activity to other students in the course, students could be prompted to view/post/submit"
This is an American-centric, competitive and non-collaborative "grading on the curve", "use the stick", content/transmission-oriented kind of approach to education. I would rather have something more emergent and supportive of the learner. I apologise to Steve if this is not what he meant.

Question: where does the "Learning Analytics Working Group" hang out?

Comment: "It still appears on the Roadmap and I hope it gets some attention still"
Maybe. As I have said before (https://moodle.org/mod/forum/discuss.php?d=265939), when Martin blew away all the links on the roadmap to the various working documents, forum discussions, developer meetings and tracker items, the roadmap became so high-level and non-specific as to be nearly useless for any sort of planning about where we put our effort and thinking. The recurrent word from HQ (i.e. the wonderful coders there) is 'hope'.

---

Not all learning 'things' on a Moodle site are the same. Who cares if 47% have viewed page X? If your site is set up well, you will know what is in page X, and as a student you may not NEED to read it - you have read the article, listened in class, etc. and don't need to. But interaction - that's different. How do we KNOW students are interacting? The best we can do is forums (#3 on your most-used plugins list). And I don't buy this "I learn by lurking" view.

This may sound a little grumpy, Michael; maybe it's a reflection of the fact that I got up early for my first webinar in many months, which was a waste of time: it was not presented to the stated goals, had no chat dialogue for conversation, closed off questions, and was basically death by PowerPoint; towards the end I began to suspect the presenters were not really familiar with the topic.

But I do think some of you guys in the analytics field (and I mean that in the widest sense, even well outside Moodle) have yet to really engage with workflows and with what teachers need to enhance their role. (I'm setting aside the question of students here, and bean-counting administrators.)

The question "in your teaching, what do you find difficult to do that you want to do" becomes for analytics: "what do you as a tutor want to know about your users and their engagement with the course?"

  • Q1-type data: simple questions like "Who has done it?", "Who has not?", "How do I communicate with them (individually or as a group)?" This is the first type of question.
    I'm very clear here: I want to send a message to Fred and his buddies like this: "Hey, I notice you have not posted yet on Task three. I'd suggest the easiest way to get started is to read (link) with your notes from (xxx) in front of you. Please get back to me ASAP if there is a problem."
    I want information and functionality so I can do this. I'd prefer to do it myself rather than have a machine do it. I don't want to have to download a report and do analysis to achieve this.
  • Q2-type data: the second type of question is "Who is logging on and when, who is NOT there early enough in the cycle (usually the week), and how can I communicate with them?"
    I want to be able to say "You guys are leaving the task work until too late. You are not engaging until halfway through the unit. This is not good for your learning - what do you want me to do? Add in another required task earlier on? What's the problem?"
  • I won't touch on Q3-type data here...
This is important:

I can manage the second type of question to a degree IF I have the answers to Q1, so Q1 information is more vital: central to forums, interaction and collaborative learning - and, I believe, useful for lock-step transmission stuff as well. It is not there in Moodle, and it is in D2L and Canvas. This is high-impact information that at present is difficult to get in Moodle. You are doing a lot of work, huge investments of time, which we appreciate (for instance, the plugin survey was great), but the basic information from event monitoring is still hidden in the logs and not easily available to a lowly tutor.

If you want to point me to any presentations on the topic of analytics I'm happy to listen.  I may of course have got the wrong end of the stick.

I'll respond in due course to Mike's question.

-Derek

Average of ratings: Useful (2)
In reply to Derek Chirnside

Re: Learning Analytics tools available in Moodle

by Michael de Raadt -

Hi, Derek and all.

Answering some of your specific questions to me...

  • By ad hoc I mean not pre-defined. If a teacher, one day, wants to check who has participated in the activity due that day, and they hadn't considered that before, that seems like an ad hoc question they wish to answer and perhaps act on.
  • The Learning Analytics working group took place at MootUS15 (site, notes, spec, summary). You can be part of that by helping to improve the spec at this stage.
On what I see as the main thrust of your message, you have a good point. I guess what you are terming "workflow" from a teacher's perspective is analogous to "time management" and "progress" from a student perspective. It also relates to "completion".

Quite a few years back (pre 2.0) I wrote the Progress Bar block to aid students with their time management. I noticed students were being confronted with a large number of activities. Research at the time showed that students who focus on the short-term (what do I have to do next) were more successful than long-term planners or students who didn't plan at all. The Progress Bar isn't proactive, it won't remind students if they are about to miss something that is due, but it does give a quick visual prompt on what students have and haven't done and what they need to do next.

A while later I added an Overview page, which collects together all the progress bars of students in a course and allows comparison by a teacher. The teacher can also communicate with students from that page. That seems similar to the sort of idea you are talking about.

[Image: Overview page]

Later (2.0+), Course and Activity completion came along. Completion is a system that can do much of what I set up in the Progress Bar (by the way, the Progress Bar can be used with Completion). There is a Course Completion Status block and an Activity Completion report.

[Image: Completion report]

The Activity Completion report doesn't allow teachers to act on what they are seeing. They can view individual students and communicate with them from there. There is also no proactive action based on completion, such as a warning to a student that they have not completed something that was or is about to come due.

The Progress Bar and Completion don't address course access, but they cover non-participation. There's no reason there couldn't be a completion aspect that requires students to log in, but ideally we want to get students involved on a regular/continual basis and the only way to do that is through activities, so I think relying on activity/resource-based completion is probably enough.

For a single activity, it is possible to look at the Participation report. Unlike the Progress Bar and Completion, this report doesn't require any prior setup. It can be used to see which students have or haven't participated in a particular activity, and from that list it is possible to contact those students and prompt them. It seems like a good tool for answering those ad hoc questions and acting on them, but it doesn't consider the greater context of the course.
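The underlying logic for that kind of ad hoc "who hasn't participated, and how do I prompt them?" question is essentially a set difference plus a message template. A minimal sketch, with invented data rather than the report's real internals:

```python
# Invented data: enrolled students and those who have posted to the Task 3 forum
enrolled = {"alice", "bob", "carol", "dave", "erin"}
posted_to_task3 = {"alice", "carol"}

non_posters = sorted(enrolled - posted_to_task3)

template = (
    "Hi {name}, I notice you have not posted yet on Task three. "
    "The easiest way to get started is to read the linked article with your "
    "notes in front of you. Please get back to me ASAP if there is a problem."
)

for name in non_posters:
    # A real tool would queue a Moodle message here rather than print
    print(template.format(name=name.capitalize()))
```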

Here are some questions back:

  • Based on these existing tools, or perhaps using the same underlying APIs they use, what tools can we improve/create/combine to help teachers with their "workflow"? What should they look like?
  • How can we introduce these into courses so they don't need manual setup by teachers? Should we turn completion on by default?
  • Can we make any of these things more proactive/automatic, so they do not rely on teachers as much, but still have the same effect?

This is a productive conversation, let's keep it going.

Average of ratings: Useful (2)
In reply to Michael de Raadt

Re: Learning Analytics tools available in Moodle

by Derek Chirnside -

This is a placeholder post.

Thought: is the work done on 'reports' aware of work done on 'analytics'  ??

@David and @Michael, thanks for the responses. I'm actually recovering from rotator cuff surgery, and I probably shouldn't delve into this conversation at the moment - it is uncomfortable to type. But I will. I'll check out the links and be back sometime.

Friday 2.52pm here. As it arrives through the time-zones, have a good weekend. 8-)

-Derek

In reply to Derek Chirnside

Re: Learning Analytics tools available in Moodle

by David Jones -
G'day All,

Sorry, this has grown.

Summary: I explain how work we've (not just me) been doing echoes much of what Derek said, outline a problem facing Moodle and other large-scale (multiple courses/contexts) learning analytics projects, and suggest a possible (difficult) solution.

The 4 paths

Colin Beer, Damien Clark, and I have been thinking about this sort of problem for a while.

Michael referenced some of this work earlier in this thread with what we now call the 4 paths for learning analytics. It was 3 paths in a 2014 paper but had evolved into 4 paths by the time of a talk at MoodleMootAU'2015.

The paths are based on our observations of ways in which learning analytics are implemented. The 4 paths are:
  1. Do it to teachers.
  2. Do it for teachers.
  3. Do it with teachers.
  4. Teacher DIY.
    "Know thy students" is an example of Teacher DIY from my practice.


Learning analytics projects in institutions tend to be to or for: large-scale institutional projects that assume a particular tool is applicable across all courses.

The problem of context


The problem with this is that the projects have to be designed for reuse across the broadest possible range of courses. As per Wiley's reusability paradox, this means that contextual details (e.g. specific details of the individual learners, the learning design, or information specific to the course) have to be removed. The cost of removing context is reduced pedagogical value.

The do it to and for paths have a tendency to remove context and reduce pedagogical value.


This might be why we get Dawson & McWilliam (2008) reporting that
"current LMS present poor data aggregation and similarly poor visualisation tools in terms of assisting staff in understanding ... student learning behaviour" (p. 3)
and then, five years later, Corrin et al (2013) reporting that
"A common request that emerged across the focus groups was the ability to correlate data across systems"

since the context for most courses is that Moodle is not the only system containing relevant data.

Categorising Moodle analytics tools with the 4 paths?


Earlier in the thread Michael asked
Could there be a way of categorising tools according to the three paths you suggested in your ASCILITE 2014 paper (do it to, do it for, do it with teachers)?


The problem or bias (one that perhaps needs to be disrupted) I bring to this question is that most of the Moodle tools have (almost by definition) been the result of the to or for paths, due to the nature of how these tools are developed and then evolve.

There's also the problem of how plugins get added to an institutional Moodle installation. There are typically institutional gatekeepers that evaluate what gets added, which tends to make the DIY approach impossible.

Of course, reality is never that simple and I'm sure there's typically a mix of the (first three) paths, but I do think there's a tendency toward to and for.

Aligning learning analytics with learning design


Lockyer et al (2013) suggest that learning analytics needs to be better aligned with the specifics of the learning design and include arguments such as (emphasis added)
This leads to questions surrounding how analytics can begin to bridge the technical–educational divide to provide just-in-time, useful, and context-sensitive feedback on how well the learning design is meeting its intended educational outcomes. Here we argue that a critical step for moving forward on this agenda entails the establishment of methods for identifying and coding learning and teaching contexts. (p. 1446)


Subsequently they propose "two broad categories of analytic applications" (p. 1448)
  1. checkpoint analytics; and,
    that is, the snapshot data that indicate a student has met the prerequisites for learning by accessing the relevant resources of the learning design. (p. 1448)
  2. process analytics.
    These data and analyses provide direct insight into learner information processing and knowledge application (Elias, 2011) within the tasks that the student completes as part of a learning design.


The challenge is that these work best when aligned with the specific learning design.

I'm not sure that many of the existing Moodle (and elsewhere) learning analytics tools are specific to learning designs.

What might this look like?


A few weeks ago I gave a talk to Uni of South Australia that expanded upon some of the above and closed off with a couple of examples of what this might look like in Moodle. Here's one example.

Near the start of my course I use a Moodle forum to run an icebreaker activity. Students have to
  1. Post their introduction.
  2. Say hi to someone they think is similar to them.
  3. Say hi to someone they think is different to them.


i.e. they have to write a post and two replies. I use activity completion on this, but that doesn't tell me which students have written how many posts. It also doesn't scaffold the students - i.e. clearly tell them that they have 2 more posts to write.

With this learning design, a useful example of process analytics would be (as shown in the following image) a button I could press (in context) that would show me a list of students and their progress. It would be even more useful if it included a "Remind" button that would help me nudge students into completing the task.
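To make the idea concrete, a sketch of the counting such a button would do might look like the following (the forum data is made up; a real implementation would read Moodle's forum tables):

```python
# Made-up forum messages: (author, parent_author); parent_author is None for a new post
messages = [
    ("alice", None), ("alice", "bob"), ("alice", "carol"),
    ("bob",   None), ("bob",   "alice"),
    ("carol", None),
]
students = {"alice", "bob", "carol", "dave"}

def icebreaker_progress(student):
    posts   = sum(1 for a, p in messages if a == student and p is None)
    replies = sum(1 for a, p in messages if a == student and p is not None)
    complete = posts >= 1 and replies >= 2   # the learning design: one intro + two replies
    return posts, replies, complete

for s in sorted(students):
    posts, replies, complete = icebreaker_progress(s)
    status = "complete" if complete else f"remind ({posts} post(s), {replies} repl(ies))"
    print(s, "-", status)
```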

[Image: iceBreaker CASA 1]

Another example, not quite related to learning analytics, might be to add some scaffolding to the Feedback tool so that it would actively support people in using it to implement minute papers. Stead (2005) suggests that limited use of the minute paper is
"largely due to lack of knowledge of its existence and the perception that it would be too time-consuming to analyse responses".
Sounds like a job for learning analytics, in particular process analytics.

[Image: minute paper CASA]

The challenge


The challenge here is how Moodle, and the institutions using it, could overcome the reusability paradox. Moodle has to be a tool that is usable across many different types of institutions, hence adding the type of learning-design-specific functionality in the examples above is difficult. But at the same time it's the sort of contextual functionality that is required to ultimately increase pedagogical value.

Due to our individual contexts, we're currently exploring solutions to this via the with and DIY paths, i.e. we have to work outside of Moodle, e.g. Damien's Moodle Activity Viewer and my "Know thy students".

I imagine that people within the Moodle HQ context would see different possibilities. It would be interesting to hear what those might be.

David.
In reply to David Jones

Re: Learning Analytics tools available in Moodle

by Michael de Raadt -

Thanks for sharing your paths work here, David.

I liked your distinction between checkpoint and process analytics. Within Moodle, I suppose process analytics need to reside with activities. The only activity that I know shares such process information is Quiz, through its reports. Assignment almost does, although it only shows submission status.

On the docs page for Learning Analytics, perhaps we could add a section for integrated systems used for Learning Analytics. Things like the MAV (which I wish was an actual Moodle plugin) and Intelliboard could go there. I hadn't seen your Know thy students tool before.

In reply to Michael de Raadt

Re: Learning Analytics tools available in Moodle

by David Jones -

MAV/know thy students


That it's not a Moodle plugin is largely (but not entirely) a contextual factor. Basically, a Moodle plugin version of MAV would never have gotten installed at that institution. Similar with "know thy students". It took me a year to get BIM installed, let alone trying to do something like "know thy students".

I do think addressing the "reusability paradox" problem is a big challenge for Moodle (or any LMS). Moodle's API work is a good start on this. It will be interesting to see how that is (allowed to be) used within institutions.

I see you've added a link to MAV. Damien has released the code on GitHub. The MAV repo might be a better link.

Process analytics


Agree, process analytics does appear to be activity specific, but also potentially learning design specific. e.g. the type of data/analysis I want from a forum when I'm using it for an icebreaker activity, could be different if I'm using the forum for a debate, or some other learning design.

Checkpoint analytics


I also wonder about the relationship with checkpoint analytics. I believe when Lockyer et al (2013) say "met the prerequisites for learning by accessing the relevant resources of the learning design", that they aren't meaning just having accessed content. "resources" may include completing pre-cursor activities.

e.g. before engaging in a debate, I might want to ensure that students have completed (or achieved a certain level) in a collection of preparation activities. Activity completion is one Moodle way of doing this, but I can also see some potential use where activity completion is expanded to be able to link with the "process analytics" of particular activities (to get at some specifics, rather than just completed).

The PIRAC Model



In terms of analysing learning analytics applications, the PIRAC model might be another option. It's the framework Col is going to be using in his thesis. It was first talked about in an earlier ASCILITE paper. We later found that it overlaid nicely with a learning analytics model from Siemens (2013, p. 13). The following image is from an early presentation and shows the relationship.

[Image: IRAC and LAM]

PIRAC is an acronym. Each element of the acronym is meant to have a range of questions you ask about the particular learning analytics application - a summary follows, with a bit more description in the ASCILITE paper:
  • Purpose - why and who developed it? what is it intended to do? who is it intended to help? why? How?
  • Information - what information is used? how is it analysed? privacy and ethics?
  • Representation - how is the analyses displayed? what is required to understand it? What is missing? etc.
  • Affordances - what does it allow/enable the user to do with that analyses/representation?
  • Change - what type of change to the P, I, R and A components is being done? how often? who can make these changes?


The rationale is that there needs to be a way to understand/analyse what is (and isn't) being done around learning analytics. Our feeling is that most learning analytics research places too much emphasis on the Information component; after all, that's the research driving much of this. Representation is perhaps next. The remainder are under-represented, especially Change and Affordances.

My feeling is that if PIRAC were used to analyse the Moodle learning analytics plugins listed above, you'd find:
  • Information - an understandably limited (to Moodle data) range of information being analysed, and possibly only fairly limited analysis being done (statistical analysis, not much machine learning, etc.).
  • Representation - tending to be limited to tabular representations or the odd image/bar chart; not much in the way of heat maps like MAV's, and most of the representation being done outside of the learning space.
  • Affordances - very little.
  • Change - maybe just a bit of re-configuration being allowed and, perhaps increasingly, the ability to export the data. And maybe, slowly, a bit more with the growth of APIs.


Perhaps identifying some opportunities for development.

References



Siemens, G. (2013). Learning Analytics: The Emergence of a Discipline. American Behavioral Scientist, 57(10), 1371–1379. doi:10.1177/0002764213498851
In reply to Derek Chirnside

Re: Learning Analytics tools available in Moodle

by Mike Churchward -

Hi Derek -

I've been trying to pin down exactly what it is people want from Moodle analytics, but have remained largely unsuccessful.

With regards to Intelliboard, you said: "However, most of the tutors I work with need something a lot simpler." What are they looking for, then? I'm sure it can be built.

Do you have examples (or can point me to them) of what Canvas and D2L have sorted out? Again, I'd like to get this onto a wish list.

mike

In reply to David Jones

Re: Learning Analytics tools available in Moodle

by Michael de Raadt -

Hi, David.

I think you're correct about completion within Moodle being useful to many users. The underlying system has a wide range of users. The plugins mentioned above were specific plugins making use of completion, which could have been made a bit clearer.

For different classifications, what would you suggest? Could there be a way of categorising tools according to the three paths you suggested in your ASCILITE 2014 paper (do it to, do it for, do it with teachers)?

In reply to Michael de Raadt

Re: Learning Analytics tools available in Moodle

by David Jones -
G'day All,

Came across the following paper this morning

Gašević, D., Dawson, S., Rogers, T., & Gasevic, D. (2015). Learning analytics should not promote one size fits all: The effects of instructional conditions in predicting learning success. The Internet and Higher Education, 28, 68–84. doi:10.1016/j.iheduc.2015.10.002

which has some connection to these discussions. Also includes some suggestions about ways forward for learning analytics research.

It's also reporting on the application of predictive models to a range of Moodle-based courses (at Uni South Australia I believe).

The abstract is
This study examined the extent to which instructional conditions influence the prediction of academic success in nine undergraduate courses offered in a blended learning model (n = 4134). The study illustrates the differences in predictive power and significant predictors between course-specific models and generalized predictive models. The results suggest that it is imperative for learning analytics research to account for the diverse ways technology is adopted and applied in course-specific contexts. The differences in technology use, especially those related to whether and how learners use the learning management system, require consideration before the log data can be merged to create a generalized model for predicting academic success. A lack of attention to instructional conditions can lead to an over- or under-estimation of the effects of LMS features on students' academic success. These findings have broader implications for institutions seeking generalized and portable models for identifying students at risk of academic failure.


David.
In reply to Michael de Raadt

Re: Learning Analytics tools available in Moodle

by Elizabeth Dalton -

As Michael knows (and as I presented at the US Moodle Moot 2015), I am working in this area in my doctoral dissertation. I have quite a lengthy bibliography, if anyone else is interested.

In general, learning analytics need to be designed to solve specific learning problems, and those are dependent on what the goals of the institution are. One way to approach this is to look at what the curriculum theory of the institution is, which often drives the learning theory as well. For example, an institution might be intending to prepare future employees for a workforce. It will probably use a cognitive mastery model, and will want to know if learners are on track to achieve mastery of a required set of defined outcomes; "efficiency" in achieving those outcomes quickly might be more important than maximizing time on task. On the other hand, a traditional academic model might emphasize grades or even rankings between students, might emphasize full exploration of the available resources, and might consider higher time on task a positive predictor. Or a learner-centric model might primarily emphasize learner satisfaction with the program, using ratings of individual components of a course to predict whether the learner is likely to provide positive feedback at the end of the course. It really depends on the model of the program, so there is no "one size fits all" model of learning analytics.

A generalized learning analytics system needs to allow any of several possible data sources to be included in the calculation used to predict the desired outcome, which also needs to be configurable. Ideally, weightings between different factors in the prediction model would be suggested by retroactively analyzing historical student data.
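As an illustration of that idea (assuming per-student features have already been extracted from the various data sources; the feature names and the use of scikit-learn are just for the sketch, not a proposal for Moodle's actual architecture):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Candidate data sources, one column per configurable predictor
feature_names = ["days_to_first_activity", "forum_posts", "hours_online"]
X_history = np.array([
    [1.0, 12, 30.0],
    [9.0,  0,  4.5],
    [2.0,  7, 22.0],
    [14.0, 1,  3.0],
])
y_history = np.array([1, 0, 1, 0])   # 1 = achieved the outcome the institution cares about

# Fit on historical student data
model = LogisticRegression().fit(X_history, y_history)

# The fitted coefficients play the role of the suggested "weightings"
print(dict(zip(feature_names, model.coef_[0].round(2))))

# Estimated probability of the desired outcome for a current student
print("P(success) =", model.predict_proba([[10.0, 2, 5.0]])[0, 1].round(2))
```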

I looked at Intelliboard a year ago and I didn't see the kinds of data sources and calculations we would need for our institution at that time. We also had problems with their security-- we had a survey we needed them to complete in order to work with them, and they weren't responsive at that time. I'd much rather see the calculations we need performed within Moodle itself, for that reason.

I'm still reviewing other plugins and research efforts within the Moodle community. Quite a lot has been happening over the past few years. I am still not seeing strong quantitative validation of some of the proposed analytics, though.

In reply to Elizabeth Dalton

Re: Learning Analytics tools available in Moodle

by David Jones -
G'day Elizabeth,

Would love to see your bibliography and any other work you've done in the open.

*warning: academic nitpick*

I can agree that "learning analytics need to be designed to solve specific learning problems", but as you may have picked up from my earlier posts in this thread, I disagree that this is "dependent on what the goals of the institution are". I'd argue that it's dependent upon the intent behind the particular learning design.

It may be possible that you work within an institution where there is some common pedagogical intent used across all courses. But personally I've yet to see an institution that works that way, or an institution (of sufficient diversity) where such an intent would be a good thing.

A focus on LA at an institutional level (fairly common) tends to result in an approach following the do it to or do it for paths - approaches that tend to result in less-than-stellar outcomes. The Gasevic et al (2015) paper provides some evidence of the limitations of the one-size-fits-all (in the institution) approach.

Sorry, but the common institutional approach to anything around learning and teaching is a bugbear of mine. Senior management in institutions tend to like these approaches, but I just don't see the possibility of them having significant impact.

David.

In reply to David Jones

Re: Learning Analytics tools available in Moodle

by Derek Chirnside -
Elizabeth,


Warning, "Practitioner's nitpick"

You say: In general, learning analytics need to be designed to solve specific learning problems, and those are dependent on what the goals of the institution are.

So is this a learning problem in your sense:

At any given time, not all of the class have done an activity, and they need a reminder.

Some deep ideas in your post.  I too would like to see your bibliography.

-Derek