Analysis..

by Gavin Henrick -

Let me give you an example.

There is a class of 100 students in a course.


For a specific handout from a lecture, most people have accessed it twice.

Looking at three students:

  • Bob has not looked at it at all.
  • Tom has looked at it once.
  • Paul has looked at it 10 times.

Purely from looking at this one analytic point, what can be inferred?

Perhaps

-> Bob should look at it, and is possibly at a disadvantage if he does not

-> Paul really likes this subject

-> Paul does not understand the information and is struggling

-> Tom likes pie.

Thoughts?

In reply to Gavin Henrick

Re: Analysis..

by Joseph Rézeau -

Hi Gavin,

Your example illustrates one of the two extreme points of view regarding the usefulness of analysis, statistics, etc., i.e. that you do not believe data analysis carries any value.

The other, equally extreme, point of view is that data analysis is extremely important and should be part and parcel of any LMS.

As always, the "truth" of the matter lies somewhere between those two extremes.

Joseph

PS. I cannot resist including my favorite quotation here:

"Statistics are like bikinis. What they reveal is suggestive, but what they conceal is vital."

~Aaron Levenstein

In reply to Joseph Rézeau

Re: Analysis..

by Gavin Henrick -

Hi Joseph,

Actually, I was trying to show both, but with a significant spin on it.

I recall a lecturer once telling me that if a student had not logged into Moodle to access the materials within one week of the course beginning, they were already behind. This is not necessarily the case for every module, every course, everywhere, but for that lecturer, that college, that course, it was.

I have seen and heard of similar cases in different places; however, context is king.

Only the teacher can really know whether the person who has not accessed it is going to be at a disadvantage, or whether the person who is accessing it a lot may be struggling.

The only thing a report or analytics can say is that the behaviour of those two people is not the norm, and that one is statistically quite far from the norm, i.e. relative to their peers.

So if that document is crucial practical guidance, or assessment guidelines which need to be read by a certain time, that is another piece of information which needs to be analysed.

Now yes, perhaps this should be a completion element, which can then be tracked as complete or not to identify the non-activity, but hyperactivity cannot be tracked by that approach.

So how does one approach such things?

I love statistics; I love the buzz from analysis and the identification of habits and/or abnormalities.

I hope to be starting a test project where relative analytics can be tested against specific criteria:

  • relative to expectations
  • relative to peers
  • relative to overall individual performance

So if an expectation (e.g. the student logs into Moodle within 1 day of the course beginning) is not met, this is identified.

So if a student's grade on an assignment is significantly different from that of their peers on that activity, it can be tracked.

So if a student's performance in one module is significantly different from their performance in another module, this can be identified.
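To make the peer-comparison idea concrete, here is a minimal sketch (Python, with invented grades and an arbitrary 2-standard-deviation threshold; this is not from any existing Moodle report) of the kind of z-score check such a project might run:

    from statistics import mean, stdev

    # Invented example data: one assignment's grades, keyed by student.
    grades = {"bob": 20, "tom": 52, "paul": 58, "ann": 55, "dee": 50,
              "eve": 47, "fay": 61, "gus": 53, "hal": 49, "ivy": 56}

    def flag_outliers(scores, threshold=2.0):
        """Flag students whose grade sits more than `threshold`
        standard deviations from the peer mean."""
        mu, sigma = mean(scores.values()), stdev(scores.values())
        return {name: round((g - mu) / sigma, 2)
                for name, g in scores.items()
                if sigma and abs(g - mu) / sigma >= threshold}

    print(flag_outliers(grades))   # {'bob': -2.64}

The same arithmetic covers the other two criteria: compare a login timestamp against the expected deadline, or compute the student's z-score within each module and flag large gaps between modules.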

What someone does with these, or how they interpret them, is a different matter; they are statistically significant, although once context is added they may turn out not to be important, or they may be very important.

In reply to Gavin Henrick

Re: Analysis..

by Joseph Rézeau -

Thanks for the detailed response, Gavin. And never mind the bikinis.

Joseph

In reply to Joseph Rézeau

Re: Analysis..

by Gavin Henrick -

Hey, np!

The one huge challenge with reporting is having things tracked so that they can be reported on. (Maybe this post should be a different thread, maybe not.)

I may be wrong, but if I recall correctly, when using the official Moodle mobile app to download a file, it did not:

  1. log, in the course logs, that the user had accessed the file
  2. mark the activity as completed for the user

So a web service which performs an action not only needs to do it, but, I would suggest, also needs to log that the action was done and/or mark the activity complete, just as if it were being completed via the web interface.

This would then work for all apps and integrations which use the web services, but that is just one point.
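To illustrate the pattern (a toy sketch in Python; these stand-in structures and names are invented, not Moodle's real web service API):

    # Toy sketch of the principle: a service that performs an action also
    # records the log entry and the completion state, like the web UI does.
    access_log = []        # stand-in for Moodle's log table
    completions = set()    # stand-in for activity-completion records

    def download_file_service(user_id, course_id, file_id):
        content = f"<bytes of file {file_id}>"        # 1. do the action

        access_log.append((user_id, course_id,        # 2. log the access so
                           file_id, "viewed"))        #    reports see app users

        completions.add((user_id, file_id))           # 3. mark completion,
        return content                                #    same as the web UI

    download_file_service(user_id=7, course_id=101, file_id=42)
    print(access_log)     # [(7, 101, 42, 'viewed')]
    print(completions)    # {(7, 42)}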

Just some random thoughts ;-)

In reply to Joseph Rézeau

Push or Pull

by Gavin Henrick -

I will move this to another thread later, but am adding it here for some fun..


Analytics and reporting...  how would you like that served?

Moodle of course already has some reports, in different places. You can go and access them and compile them; you very much pull the reports out of Moodle.

However, Pull requires effort.

Personally I love push reporting: reports that arrive daily/weekly, or in reaction to an "event", as an emailed PDF/HTML summary with a link to the full report, rather than having to log into a site and go looking for the report. So many distractions that way, instead of: get email, print. (Yes, I kill too many trees, but it may surprise you to know I do not like reading too much on a screen.)
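As a toy example of what push could look like (Python's standard smtplib; the addresses, report body, and local mail relay are assumptions for the sketch, not a real Moodle integration):

    import smtplib
    from email.message import EmailMessage

    def build_summary():
        """Placeholder for the real report query."""
        return "<p>3 students have not logged in this week.</p>"

    def push_report(recipient):
        msg = EmailMessage()
        msg["Subject"] = "Weekly course activity summary"
        msg["From"] = "reports@example.edu"          # invented address
        msg["To"] = recipient
        msg.set_content("HTML summary attached.")    # plain-text fallback
        msg.add_alternative(build_summary(), subtype="html")
        with smtplib.SMTP("localhost") as smtp:      # assumes a mail relay
            smtp.send_message(msg)

    # Schedule with cron for daily/weekly, or call it when an "event" fires:
    # push_report("teacher@example.edu")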

What do you like?

Pull or Push?

In reply to Gavin Henrick

Re: Push or Pull

by john whitmer -

I like push, served with a milkshake.  

Seriously, push is great - assuming that we've got a meaningful report/indicator pre-identified, like "no login within first week", or "differential student activity of greater than 2 SD from other students", or something of that sort.

Seems like this is especially the case with busy faculty, who already have a long list of things to do that they already know about.

Research/evaluation, though, usually requires pull reports (or more detailed logfile analysis), which does have the thrill of the statistical hunt you mentioned in another post. That's probably the easiest place to identify the most meaningful reports to push.

Best, John

In reply to john whitmer

Re: Push or Pull

by David Jones -

I like push, served with a five-course meal.

My interest is around the question of how you enable teachers (students as well, but my focus tends to be teachers) to use the insight provided by analytics to inform what they do.

Shane Dawson and the SNAPP folk found that even with a pretty, graphical representation it was difficult for teaching staff to understand how it related to learning and teaching, let alone translate that into what steps to take next.

So, reports - be they push or pull - appear to be useful for researchers and middle managers, but not so useful for teachers (though there are always exceptions).

For teaching staff, I wonder whether insights from nudge theory or distributed cognition might be useful guides for embedding information/features from reports into various Moodle/LMS tools and interfaces in ways that encourage action/thinking.

David.

In reply to Gavin Henrick

Re: Analysis..

by Martin Dougiamas -

IMO, analysis of what someone has read, or how long/often they access materials, is completely useless for understanding learning, and I've always thought so. That's why I've always resisted the many people who have called for stats on "how long people have spent learning". On the web it's particularly meaningless, as you don't actually know how long someone has been looking at a screen or what their brain was doing at the time. It completely distracts and misleads teachers.

The best place to look is around outcomes/competencies and the grading (automatic or manual) that is associated with those. I'll post a new thread with some thoughts there.

In reply to Martin Dougiamas

Re: Analysis..

by john whitmer -

Great thread.  

I'm a self-confessed "data junkie", but I believe it's important to start with a realistic scope of what we hope to learn (accurately) from analytics. I've taken to paraphrasing George Siemens: "Analytics doesn't provide answers; it just helps us to ask better questions."

If all teachers would use the gradebook, frequently and often (heck, even just consistently, while we're aspiring), we would get great predictive indicators of student success. Until that time, clickstream/usage data seems like the best common data measure that we have. Maybe we're trying to solve a teaching/learning issue with technology, but, that said, there is a lot we can ask that will advance our learning.

Here's a chart pulled from usage data on a single course (373 students) that shows some of that usefulness, and some of the complexity in the data. There's an overall positive correlation between activity (hits and time) and grade (i.e. more time in Moodle goes with a higher grade), but students who are Pell-eligible (i.e. low income) spend much more time than students who are not Pell-eligible.

So in this example, the relationship between time and grade is positive in some cases, but possibly problematic in others.

[Chart: Pell vs. non-Pell usage by activity and grade]

What does this mean?  Is this good or bad?  That’s not a question we can answer with this data – just the what, not the “why” or the “so what”. 

That said, we're beginning our Moodle analytics effort with more robust usage measures around adoption. It looks like Configurable Reports might be a good step in that direction; maybe we could decide on a few to get "adopted" by core?


In reply to john whitmer

Re: Analysis..

by Evan Donovan -

Interesting chart. We're also attempting to determine the correlation between Pell eligibility and academic performance.

What do the different measures on here mean, and how was this chart generated? It seems like something similar would help us.

In reply to Evan Donovan

Re: Analysis..

by john whitmer -

The Y axis is the average number of seconds spent per student (i.e. dwell time), disaggregated on the X axis by groupings of the tool used (e.g. assessment = quiz; administration = calendar, announcements, etc.).
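For anyone curious how dwell time can be derived at all (Moodle's logs store event timestamps, not durations): the usual estimate takes the gap between a user's consecutive log events as the time spent on the earlier one, capped so idle periods don't inflate it. A rough sketch with invented log rows (note the last event per user gets no measured dwell):

    # Estimate dwell time from gaps between consecutive log events.
    # Rows are (user_id, unix_timestamp, tool_group) -- invented data.
    CAP = 30 * 60        # assumption: ignore gaps beyond 30 minutes

    log = [(7, 1000, "quiz"), (7, 1300, "quiz"),
           (7, 1600, "forum"), (7, 9999, "quiz")]   # big gap before last row

    def dwell_per_tool(rows):
        totals = {}
        rows = sorted(rows)                          # by user, then time
        for (u1, t1, tool), (u2, t2, _) in zip(rows, rows[1:]):
            if u1 != u2:
                continue                             # don't span users
            totals[tool] = totals.get(tool, 0) + min(t2 - t1, CAP)
        return totals

    print(dwell_per_tool(log))   # {'quiz': 600, 'forum': 1800}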

Charts were created outside of Moodle using Tableau. I don't know how you could import student info into Moodle, but it seems like you could do that with a custom registration field.

Best, John

In reply to Martin Dougiamas

Re: Analysis..

by Gavin Henrick -

Martin, I agree that there are many aspects which are not about learning outcomes specifically, but more about behaviour and performance outcomes.

Generally speaking, knowing that a student:

- accessed material does not indicate that they know it;

- not accessing it does not indicate that they do not know it;

- accessing it a lot does not mean that they don't get it; maybe it fails to download for them, or they don't know how to take a local copy, or can't.

However, it does help lead to other questions, and maybe other answers.

It is the same if someone has not logged into their online course area within a week of starting: if the area is central to the course, that could be problematic. Maybe they don't know the URL, can't find it, don't know what password to log in with, can't follow the guidelines...


I think analytics goes beyond just grades or pure learning outcomes, into performance and behaviour.


In reply to Martin Dougiamas

Re: Analysis..

by Elizabeth Dalton -

How often or long someone has looked at materials doesn't tell you if they've learned... but it may warn you that they're having trouble learning. We use Moodle for asynchronous distance learning, and one of the most difficult problems our instructors face is knowing how the students are doing before they achieve or fail to achieve an outcome. Outcomes are inherently summative; the learning process also needs to include attention to formative indicators.

I'd like to see a way for instructors to rate the importance of resources and activities, beyond simply marking them for completion tracking or not, so students can see which resources are most important and instructors can see whether students are paying attention appropriately. I'd also like to see built-in tools to allow students to rate the helpfulness of resources, and possibly ways to automatically sort resources by student ratings. (For that matter, I'd like to see more formalized ways for students to contribute and rate resources, but I'm planning to use the Database activity for that.)
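For what it's worth, the sorting part could be very simple. A sketch with invented ratings (nothing like this exists in core, as far as I know):

    # Order resources by mean student rating (1-5 stars, invented data).
    ratings = {
        "Lecture 3 handout": [5, 4, 5],
        "Week 1 reading":    [3, 2],
        "Revision quiz":     [4, 4, 5, 3],
    }

    by_helpfulness = sorted(ratings,
                            key=lambda r: sum(ratings[r]) / len(ratings[r]),
                            reverse=True)            # most helpful first
    print(by_helpfulness)
    # ['Lecture 3 handout', 'Revision quiz', 'Week 1 reading']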

In reply to Elizabeth Dalton

Re: Analysis..

by Scott Studham -

Elizabeth,

Thanks for your help. I agree with you 100% (looking at material doesn't mean you learned anything), but it can be interesting. I'm trying to understand how the "dwell time" was calculated. I've looked at the Moodle database and didn't see a field I could use to estimate/calculate the "dwell time".


Thanks!

Scott