What analytics do you want?

by Michael de Raadt -
Number of replies: 23

Hi, analytics fans.

I recently wrote a blog post about analytics for the LMS. I attempted to create a solid definition of analytics, listed analytics I thought would be useful to students, teachers and institutions, and discussed where we could get analytics information from. You can see the entire blog post here.

I don't think we have clear consensus on what analytics people want, so I thought I would copy that list here to create a starting point and promote discussion.

Analytics useful to Students

Progress
My progress bar block
With an LMS, it is possible to achieve regular assessment within a course based on a rich set of finely chunked multi-modal activities, and while this can lead to deep learning, it can also be overwhelming for students. It is, therefore, useful for a student to know where they are up to in a course and what they have to do next. Students who use short-term planning tend to be more successful; they just need a quick snapshot of their progress.
Relative success
Deep learners are more successful, and deep learners are characterised by meta-cognition about their learning. Providing analytics about their relative success can allow students to know whether they are on track or if they need further exposure to a topic. Relative success can also be used to introduce a competitive element into a cohort, which some educationalists recommend.
Opportunities to interact
If students are studying in isolation, it may not always be apparent when there are chances for them to interact with peers or teachers. Determining the level at which a student is interacting could be seen as an analytic that can be used to direct them to opportunities for communication and collaboration.

Analytics useful to Teachers

Student participation
In an online world, it is more difficult for a teacher to know which students are participating and which need a push. Students can fail to participate for numerous reasons, usually valid ones. Sometimes a student may need to be encouraged to withdraw from a course and re-enrol later. Where analytics can help is in determining the timing of when such decisions need to be made. That's not to say that such information needs to be complex; it could be as simple as "traffic light" coloured icons next to a list of students' names, ordered by risk.
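Such a traffic-light ordering is easy to prototype once a couple of participation signals are available. The following is an illustrative Python sketch; the thresholds, field names and sample data are assumptions for demonstration, not anything Moodle currently computes:

```python
from datetime import date

def risk_colour(last_access: date, submitted: int, expected: int,
                today: date) -> str:
    """Map simple participation signals to a traffic-light colour.

    Thresholds are illustrative: red for 14+ days of inactivity or
    less than half the expected work submitted; amber for 7+ idle days
    or incomplete submissions; green otherwise.
    """
    days_idle = (today - last_access).days
    completion = submitted / expected if expected else 1.0
    if days_idle >= 14 or completion < 0.5:
        return "red"
    if days_idle >= 7 or completion < 1.0:
        return "amber"
    return "green"

# Hypothetical students: (last access, submissions made, submissions expected).
today = date(2013, 3, 1)
students = {
    "alice": (date(2013, 2, 28), 4, 4),
    "bob":   (date(2013, 2, 20), 3, 4),
    "carol": (date(2013, 2, 10), 1, 4),
}
colour_rank = {"red": 0, "amber": 1, "green": 2}
# The class list, ordered most at-risk first.
ranked = sorted(students,
                key=lambda s: colour_rank[risk_colour(*students[s], today=today)])
```

Ordering by risk rather than alphabetically is the key design choice: the students needing a push surface at the top of the list.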
Student success
Engagement analytics block
Assuming a student is involved, a teacher also wants to know how successful they are. This could be the product of assessment and views of resources. If students are progressing through the course with unsuccessful results, then they may need to be encouraged to re-expose themselves to a topic within the course before progressing further.
Student exposures
Moving away from a course modality where “one size fits all”, it is useful to know how many times a student was exposed to a topic before they were successful. This is a differentiating factor among students in a cohort. If students are progressing with few exposures, perhaps they are finding the course too easy, perhaps even boring, and may need to be challenged further. If students are requiring numerous exposures before they are successful, then perhaps alternate presentations of a topic need to be created to suit the learning preference of particular learners. Such an analytical tool can assist a teacher to deliver learning at an individual level.
Student difficulty in understanding
Through an analysis of exposures and assessment results, it may be possible to determine which topics, or areas within a topic, students are finding difficult. This may indicate areas that need to be revisited in the current delivery or enhanced in a future delivery of the course.
Student difficulty in technical tasks
When students are undertaking learning, the last thing they want is to be stifled by an inability to express their understanding because of the way a course is set up within the LMS. Students' patterns of use within the LMS may indicate they are having such difficulties, and a teacher can be alerted to take action.
Feedback attention
Teachers spend time and effort creating feedback for students as a reflection of their understanding. It is useful to know which students have paid attention to such feedback, and which students may need to be encouraged to do so. Going beyond this, it may be possible to deliver information to a teacher about the effectiveness of their feedback on students' understanding, as reflected in subsequent assessment.
Course quality
In several institutions that I know of, part of the measurement of a teacher's effectiveness is a judgement of the quality of the courses they are producing within the LMS, based on a set of metrics. Such measurements can be used for promotions and to drive the development of PD activities. If such metrics can be automated, then analytics can be produced for teachers that encourage them to improve their courses by increasing the richness of their resources, improving the quality of their activities, including more activities of different kinds, and providing more opportunities for students to interact or collaborate.

Analytics useful to Institutions

Student retention
Analytics can provide more information about students than simple pass/fail rates. Analytics can help determine when students may be at risk of failing and in which courses this is more likely to happen. Such analytics can help an institution to send resources to where they are needed most and to plan resources for the future.
Teacher involvement
There may be ethical implications in monitoring teacher involvement in a course, as it is akin to workplace surveillance. However, there is information in an LMS that can be presented in a useful way in relation to training and promotions. It might also be useful to anonymously tie a teacher involvement analytic in with other analytics to find correlations.
Teacher success
As well as looking at success in terms of pass and fail, it may also be possible to determine where teacher interventions have encouraged students to achieve beyond their expected outcomes.
Relative course quality
Clearly not all courses are equal, but how do you determine which is better? There have been a number of attempts to manually measure aspects of a course such as accessibility, organisation, goals and objectives, content, opportunities for practice and transfer, and evaluation mechanisms (Criteria for Evaluating the Quality of Online Courses, Clayton R. Wright). If such metrics can be automated, then analytics can be created that reflect the quality of courses. Such metrics could also be fed back to teachers as an incentive to improve their courses.
What analytics would you add to this list?
In reply to Michael de Raadt

Re: What analytics do you want?

by Mahesh Agrawal -

Dear Sir,

I want the analytics report similar to that of Khan Academy. The analytic tool that Khan Academy has used is very nice.

Rgds

mahesh

In reply to Mahesh Agrawal

Re: What analytics do you want?

by Michael de Raadt -

Hi, Mahesh.

I've seen some video of analytics tools from the Khan Academy.

Do you have specific analytics tools in mind? What are your favourites?

Do you have links to information about the analytics used at the Khan Acadamy?

In reply to Michael de Raadt

Re: What analytics do you want?

by john whitmer -

This is a very interesting and comprehensive list. Some of them (e.g. "student difficulty in understanding", "relative course quality", "teacher success") include value judgements that I'm not sure we can make with analytics - or at least in large-scale analytics applied to multiple courses.

To me, it's important to consider carefully the distinction between empirical behaviors (e.g. high amounts of discussion forum activity) and the meaning of those behaviors (e.g. students being engaged with the course). When you get "big data", and into large numbers, these often even out, but I think it's still important to start with that distinction - especially if we're thinking about creating analytics for Moodle, which will (should) be based in activities/behaviors.

In reply to john whitmer

Re: What analytics do you want?

by Michael Penney -

"or at least in large-scale analytics applied to multiple courses."

Well, you *could* do this if you had standardized assessments - for example, the standardized end-of-course exams that are rolling out here in Florida (wink).

*Assuming the exams have a reasonable degree of validity.

In reply to Michael Penney

Re: What analytics do you want?

by Tim Hunt -
In reply to Tim Hunt

Re: What analytics do you want?

by Michael Penney -

What do you recommend it for? It looks like a polemic to me. 

In reply to Michael Penney

Re: What analytics do you want?

by Tim Hunt -

I was recommending it as something to read. It is a fairly factual account from someone who worked in the standardised testing industry for many years, starting as just a marker and ending up as a consultant. It is well written, not too long, and not overly polemic.

In reply to Michael Penney

Re: What analytics do you want?

by john whitmer -

Hi Michael!  Hope you're doing well in sunny FL.

Standardized assessments and/or standardized content is a whole different (data) animal, if you'll pardon the metaphor, and it's possible to do some very fine-grained analytics with them (such as Kaplan or Khan Academy is known for).

For me, the problem of creating "generic" analytics that can scale out across the multiple ways faculty adopt the LMS is an altogether different issue that requires a different type of analysis - and probably accepting that there's going to be a fairly high amount of error built-in.  Still better than (continuing to) ignore the data, IMO.

What do others think about the potential of "generic" LMS analytics to assess learning and/or academic achievement?

In reply to john whitmer

Re: What analytics do you want?

by Michael de Raadt -

Hi, John.

I do like your distinction between behaviour and meaning. That's certainly an important consideration.

I am a big fan of objectivity over subjectivity, based on a background in assessing students and research.

However, I'm not sure I agree with your inference that measures like "student difficulty in understanding", "relative course quality" and "teacher success" cannot be quantified sufficiently to be generalised and useful as a basis for analytics.

If student understanding is based on a combination of exposures during learning with assessment results, this can be quantified as a relative measure.

Relative course quality can be measured with established, verified rubrics, and I see no reason why these cannot be automated.

Teacher success can be measured by a statistical analysis of the teacher's interactions with students and the level of success of those students over time.

I don't think any of these analytics is beyond our potential to create. The barriers are having appropriate data and then having the time to create, test and share such analytics.
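To illustrate the point about quantifying understanding as a relative measure, here is a Python sketch that standardises assessment scores and exposure counts across a cohort and combines them. The particular weighting is an assumption for illustration only, not an established instrument:

```python
from statistics import mean, pstdev

def relative_understanding(scores, exposures):
    """Combine assessment scores and exposure counts into a relative
    measure: standardise each signal across the cohort, then reward
    high scores achieved with few exposures.

    Inputs are parallel lists, one entry per student.
    """
    def z(values):
        # Standardise within the cohort (z-scores); a flat cohort maps to zeros.
        m, s = mean(values), pstdev(values)
        return [(v - m) / s if s else 0.0 for v in values]

    score_z = z(scores)
    exposure_z = z(exposures)
    # High score with a low exposure count => above-average understanding.
    return [s - e for s, e in zip(score_z, exposure_z)]

# Hypothetical cohort of three students.
measure = relative_understanding(scores=[90, 70, 50], exposures=[1, 3, 5])
```

Because the measure is relative to the cohort, it answers the student's question "am I on track?" without needing an absolute benchmark.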

In reply to Michael de Raadt

Re: What analytics do you want?

by Sérgio Gonçalves -

Hello, I've posted some questions/problems that I have to solve using Moodle. The issue is:

http://moodle.org/mod/forum/discuss.php?d=209905

From your experience, is it possible to solve or somehow work around this with the logs' information?

I'm analysing the log tables from the Moodle database, but I'm almost sure that with the current log table format it is impossible, or very hard, to get the information that I need.

Thanks.

Sérgio

In reply to Michael de Raadt

Re: What analytics do you want?

by Steven Parker -

Hi Michael

I would like it to be able to gather and interpret data on the variations of teaching and learning activity under course groupings, to determine how a teacher's activity within a grouping has led (or not) to good learning outcomes, student results and engagement with the content.

Being able to see data on the differences two teachers can make within a shared course through different teaching approaches would be really useful to see what is working. This could tie in with the New Feature Enhancement for Groupings, which is about using groupings to enable teachers to enhance a course without impacting others sharing the same core course content.

Cheers

Steven

In reply to Steven Parker

Re: What analytics do you want?

by Joseph Rézeau -

Steven: "I would like it to be able to gather and interpret data on the variations of teaching and learning activity under course groupings, to determine how a teacher's activity within a grouping has led (or not) to good learning outcomes, student results and engagement with the content."

A tall order! - or total delusion?

Joseph

In reply to Joseph Rézeau

Re: What analytics do you want?

by Steven Parker -

Hi Joseph

Hmmm, a tall order, perhaps... total delusion? I don't think so, though I do need to break down my Learning Analytics forum post more to explain what I mean.

“variations of teaching and learning activity under course groupings”

For context I have been working on a new Moodle “Groupings” enhancement feature request to enable teams of teachers to better:

  • Deliver to multiple student groups from a single course yet be free to teach in different ways to achieve common learning outcomes based on the principles of Universal Design for Learning (whereby it is recognised that there are multiple ways of teaching).
  • Analyse the success of the multiple ways of teaching within a course, (i.e. presenting instructional goals, methods, materials, and assessments)

“gather and interpret data on the variations of teaching and learning activity”

I'd like Moodle to make it as easy as possible for teams of teachers sharing a single course to visually gather data about each other's approaches. This is where I think the new Moodle Learning Analytics reporting based on Groupings would be great!

Michael de Raadt has a great article on the types of Learning Analytics data Moodle HQ is looking at gathering and making available which is quite student focused, see: Analytics: getting helpful information out of an LMS

The institutional analytics data will also be really useful; I look forward to seeing how it will be implemented.

However, as Michael says, there is definitely an issue with teachers being willing to let others see data about the impact of their teaching practice. Selling the benefits of sharing resources and discussing each other's approaches, rather than surveillance, will need to be communicated as part of project team-building exercises.

"led (or not) to good learning outcomes, student results and engagement with the content."

What would “success” benchmarks be based on? This would vary from team to team, but they would need to be agreed to by the team of teachers at the start of a project, based on the available Moodle Learning Analytics data.

I would expect a major part of our learning analytics "success" benchmarks will be based on reporting on how well teachers' different teaching approaches have led to students achieving good learning outcomes.

Moodle HQ is looking at enabling better reporting on outcomes; see: http://docs.moodle.org/dev/Outcomes_stage2

Cheers

Steven

In reply to Michael de Raadt

Re: What analytics do you want?

by Evan Donovan -

Thanks for this very helpful & comprehensive post.

On the visualizations front, I think GISMO is awesome. I wonder if there would be interest in moving things like that into Moodle itself. My main issue with it is that I have various needs (such as sending out emails) where I would prefer that the system generate its visualizations server-side.

On the reporting front, here's the data that I've been spec'd to pull from Moodle. I will share, hopefully by Monday, the code I'm currently using to get it.

  1. For each course a student takes
    1. Last Login Date
    2. Last Assignment Submission Date
    3. Last Quiz Submission Date
    4. Last Forum Date
    5. Active for Pell Grant Purposes: Defined as 60% participation (4.8 weeks)
  2. For each course a professor teaches
    1. Last login date
    2. Last Assignment Graded Date
    3. Last Quiz Graded Date
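For anyone attempting something similar, these "last activity" dates can be derived from Moodle 2.x's legacy log table with a grouped query. The sketch below runs against an in-memory SQLite stand-in shaped like `mdl_log`; the module/action values used are simplified assumptions and should be checked against a real site's log before use:

```python
import sqlite3

# Demonstration against an in-memory table shaped like Moodle 2.x's
# legacy log (mdl_log: time is a Unix timestamp; module/action name
# the event). On a real site you would point this at the Moodle DB.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE mdl_log (time INT, userid INT, course INT,"
             " module TEXT, action TEXT)")
rows = [
    (1362096000, 7, 101, "course", "view"),        # course access
    (1362182400, 7, 101, "assignment", "upload"),  # assignment submission
    (1362268800, 7, 101, "quiz", "attempt"),
    (1362355200, 7, 101, "forum", "add post"),
]
conn.executemany("INSERT INTO mdl_log VALUES (?,?,?,?,?)", rows)

# Last date of each kind of activity per student per course.
query = """
SELECT userid, course, module, MAX(time) AS last_time
FROM mdl_log
WHERE module IN ('course', 'assignment', 'quiz', 'forum')
GROUP BY userid, course, module
"""
latest = {(u, c, m): t for u, c, m, t in conn.execute(query)}
```

From `latest`, the Pell-Grant participation check reduces to comparing the most recent timestamp against the 60% point of the term.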
In reply to Evan Donovan

Re: What analytics do you want?

by Evan Donovan -

We also are planning on doing custom reporting on Questionnaire module results, since we need to graph those. We use Questionnaire for our end-of-course evaluation surveys, which are a DETC requirement.

Also, we are ultimately planning on integration between our reporting and our student management system in Salesforce.

Our institution is City Vision College (www.cityvision.edu).

In reply to Michael de Raadt

Re: What analytics do you want?

by Nelson King -

For starters, go back to Moodle 1.x analytics.

Item analysis in Moodle 1.x was simple, clean and one click away. Unless you live and breathe Moodle 2 daily, it is unlikely a faculty member using it once or twice a semester can find the equivalent feature. I don't understand the logic of having to go to each question and then finding the link to click to see the results of a question, and only IF all the questions have been graded - otherwise no results are displayed. If you want to see the item analysis for all questions, you must export to XML and then look at the data.

Moodle 1 displayed results as questions were graded, thereby allowing target grading. Moodle 2 seems to presume everyone uses only multiple choice, so there is no intermediate state of grading. Since we combine target grading with progressive Bloom case questions, our only way of getting intermediate analytics with Moodle 2 is to constantly export the data and do our own calculations.

Moodle should also prominently inform faculty that you can't get useful question level analytics if you randomize the questions in a quiz.  If you don't care how students are meeting learning objectives, then I guess it doesn't matter.

Where should Moodle be? 

We grade at the sub-question level, since we ask students to put multi-part responses in tables that help them organize their thoughts. We then add up the sub-question scores and record the grade. If you want learning analytics, being able to record sub-question scores within a question is necessary.

We look at our data across all assessment activities (i.e., longitudinal learning).  The only way to do this in Moodle is to manually export and then align the student records manually.  I would assume you could use an SQL query since Moodle data sits in a database.  Apparently you can't.

Questions should be taggable with attributes such as learning objective, Bloom's level, etc.  Then meaningful assessment could be done easily.  Now it has to be done manually outside of Moodle.  If I have to export the data into another package, what is the value of Moodle?
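For what it's worth, once the scores are exported, statistics like the facility index that Moodle 1.x's item analysis reported are straightforward to compute externally. A Python sketch (the discrimination measure here is a crude covariance, not Moodle's exact formula):

```python
def facility_index(scores, max_score):
    """Facility index: mean score on an item as a fraction of the
    maximum, i.e. how easy the item was for this cohort
    (1.0 = everyone scored full marks)."""
    return sum(scores) / (len(scores) * max_score)

def discrimination_signal(item_scores, totals):
    """Crude discrimination signal: do students who did well overall
    also do well on this item? Returns the covariance between item
    and total scores; only the sign is meaningful in this sketch."""
    n = len(totals)
    mi = sum(item_scores) / n
    mt = sum(totals) / n
    return sum((i - mi) * (t - mt) for i, t in zip(item_scores, totals)) / n

# Hypothetical exported data: four students, one question (max mark 1).
q1 = [1, 1, 0, 1]       # scores on question 1
totals = [9, 8, 3, 7]   # total quiz scores
easy = facility_index(q1, max_score=1)
disc = discrimination_signal(q1, totals)
```

A positive discrimination value suggests the item separates stronger from weaker students; a facility index near 1.0 suggests the item is too easy to tell you much.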


In reply to Nelson King

Pull grades off multiple courses

by Tiffany oakes -

I want to be able to pull a transcript off multiple courses. For example, if a student is in Math 101 and English 101, can I pull a grade sheet of these two classes combined?

Or, how do I export both of these grade sheets into one spreadsheet without a lot of manual work?

In reply to Tiffany oakes

Re: Pull grades off multiple courses

by Vernon Spain -

Hi Tiffany,

The Subcourse module works for us, but the code is out of date and MUST be tested extensively before being used on a live site.

It pulls the grades from course X into course Y and adds them to the gradebook.

(Info here) http://docs.moodle.org/23/en/Subcourse_module

(Version here) https://github.com/mudrd8mz/moodle-mod_subcourse/blob/master/version.php

Regards,

Vernon


In reply to Michael de Raadt

Re: What analytics do you want?

by Eric Strom -

Glad to see attention to these concepts. Learning analytics will be a key feature set for institutions moving forward.

Michael, have you seen the feature set available in the D2L environment lately? It is very rich, gauging student-to-student engagement and offering predictive success algorithms.

http://www.desire2learn.com/newsletters/Horizon/Issue29/articles/?id=4

I would value the ability to build automated action logic based on student activity. For example, notify instructor when student hasn't accessed moodle course in 'x' days, or when student hasn't accessed an assignment at all 'x' hours before the assignment is due, or when 'x' percent of students get a particular quiz question incorrect.
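Rules like these are simple to express once the underlying data is queryable. An illustrative Python sketch, with made-up thresholds and data structures:

```python
from datetime import datetime, timedelta

def inactivity_alerts(last_access, now, max_idle_days=7):
    """Return userids whose last course access was more than
    max_idle_days ago. last_access maps userid -> datetime."""
    cutoff = now - timedelta(days=max_idle_days)
    return sorted(u for u, t in last_access.items() if t < cutoff)

def question_alerts(wrong_counts, attempts, threshold=0.5):
    """Return question ids where at least `threshold` of students
    answered incorrectly. wrong_counts maps question id -> count."""
    return sorted(q for q, wrong in wrong_counts.items()
                  if wrong / attempts >= threshold)

# Hypothetical data for two students and two quiz questions.
now = datetime(2013, 3, 1, 9, 0)
idle = inactivity_alerts(
    {1: datetime(2013, 2, 27), 2: datetime(2013, 2, 10)}, now)
hard = question_alerts({"q1": 2, "q2": 9}, attempts=10)
```

The notification step (email, dashboard flag) would then be driven by the `idle` and `hard` lists; the analytics and the action are usefully kept separate.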


In reply to Eric Strom

Re: What analytics do you want?

by Jason Hardin -

For the automatic notifications, we have something like this as part of Joule, called the Personalized Learning Designer (it needs a better name). It is something we are determining how we could release to the Moodle community. The problem currently is that there are a significant number of core patches needed to make PLD work, which means an extensive installation process and work to upgrade it with every Moodle release.

Our clients have found it to be a powerful tool for student engagement and retention.

In reply to Eric Strom

Re: What analytics do you want?

by Anatoliy Kochnev -

As part of the analytic/reporting tools for Moodle-based environments, we created a project called IntelliBoard. Any version of Moodle can be connected and its data presented in a very user-friendly format. Check out www.intelliboard.net. All reports are available with delivery options as well.

We will be happy to have the Moodle community test it out.

In reply to Eric Strom

Re: What analytics do you want?

by Wen Hao Chuang -

How about some of those MOOC platforms, such as edX, Coursera, Udemy, and Udacity? I'm sure they have some dashboards or some kind of "learning analytics" tools available in their software suites. Can someone post some screenshots here? Thanks!