Moodle research

Paper review: Teachers’ Experiences with Learning Analytics


Higher Education Teachers’ Experiences with Learning Analytics in Relation to Student Retention

Authors: Deborah West (Charles Darwin University), Henk Huijser (Batchelor Institute of Indigenous Tertiary Education), David Heath (Charles Darwin University), Alf Lizzio (Griffith University), Danny Toohey (Murdoch University) and Carol Miles (The University of Newcastle)
Publication: Proceedings from ascilite 2015
Date: 2015
Links: Details, Paper link


A number of papers describing work on projects funded by the Australian Office of Learning and Teaching were presented at last month's ASCILITE conference in Perth. A significant project involving academics from 24 universities in Australia and New Zealand was reported in a paper titled "Higher Education Teachers’ Experiences with Learning Analytics in Relation to Student Retention".

The paper reports on attempts to clarify what learning analytics tools are being made available to teachers and how they are being used to increase student retention. Data was gathered through large-scale surveys, and the results are interesting.

The project started with a broad institutional survey and found that Australasian institutions are at various stages of adoption of LA tools, with many unsure what perspective such tools should take. Moving on, the next stage of surveying focused on teachers to "elicit their views of key issues identified in the literature" (p. 308).

The study reports several notable findings about the use of learning analytics data in relation to retention.

  • Most data used to identify students at risk comes from students themselves (self-reported or requested) and from the LMS. Lesser sources include the student information system and support staff.
  • The most commonly used indicators for identifying students-at-risk are:
    • task completion
    • grades
    • LMS access
  • When provided with LA data, 37% of respondents said they acted on it in a systematic way, most commonly with manual interventions such as emails or phone calls. Only a small group (<5%) used automated responses.
  • Teachers rarely discuss learning analytics with other staff or students.
  • Teachers rate the provision of LA information as being poor on a number of dimensions.
  • Training is being provided to teachers where LA results are made available, but few teachers take the opportunity to attend such training.

In general, while there is interest in learning analytics, participation in LA activities is limited. The authors suggest that the "dizzyingly complicated and numerous" variety of tools, and the current lack of discussion about their use, will evolve as institutions adopt policies.

According to the project website, the paper was published as the project wrapped up.

 