Moodle research

Structured feedback in an online/blended context


I'm preparing for a presentation on assessment at this year's MoodlePosium and one aspect I've been looking at is structured feedback. I hope what I've found will benefit you, but I also have a number of questions that you might be able to help me with.

Structured Feedback Types

Generally, two types of structured feedback are suggested.

  • Rubrics
    • Criteria + levels
    • Can be more instructional
  • Checklists
    • Criteria
    • Can be more objective
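To make the distinction concrete, here is a minimal sketch of how the two types might be modelled and scored. This is illustrative only (it is not Moodle's internal representation): a rubric maps each criterion to ordered levels with point values, while a checklist is a flat set of yes/no criteria.

```python
# Illustrative sketch only -- not Moodle's actual data model.

def score_rubric(rubric, selections):
    """Sum the points of the level selected for each criterion."""
    return sum(rubric[criterion][level]
               for criterion, level in selections.items())

def score_checklist(checklist, ticked):
    """Count one point per ticked criterion."""
    return sum(1 for item in checklist if item in ticked)

rubric = {
    "Analysis": {"Developing": 1, "Proficient": 2, "Excellent": 3},
    "Referencing": {"Missing": 0, "Present": 1},  # criteria need not share level counts
}
checklist = ["Has title", "Cites sources", "Under word limit"]

print(score_rubric(rubric, {"Analysis": "Proficient", "Referencing": "Present"}))  # 3
print(score_checklist(checklist, {"Has title", "Cites sources"}))                  # 2
```

Note that the rubric carries instructional detail in its level names, while the checklist reduces each criterion to an objective yes/no, which matches the bullets above.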

Q: Can you think of any other types of structured feedback?

Structured Feedback Benefits

A number of benefits are described in studies of structured feedback.

  • Objective, obvious, fair → reduced anxiety
  • Results comparable between students
  • Reliability and consistency
  • Reduced effort, increased efficiency, faster feedback
  • Feedback equally satisfactory to students
  • Improved academic performance (sometimes)

Q: Has anyone seen a good study that compares structured and unstructured feedback?

Structured Assessment Potential

Structured feedback can be...

  • measured for assessment validity
  • instructional to students
  • used to focus and improve courses
  • used in peer and self assessment
  • used to facilitate discussion
  • co-created with students → higher order thinking
  • used in conjunction with unstructured feedback

Q: Can you think of other potential uses of structured feedback?


I found a number of papers discussing rubrics. Here is a selection, including one of my own.

  • Andrade, H. G. (2005). Teaching with rubrics: The good, the bad, and the ugly. College teaching, 53(1), 27-31. (PDF)
  • Anglin, L., Anglin, K., Schumann, P. L., & Kaliski, J. A. (2008). Improving the Efficiency and Effectiveness of Grading Through the Use of Computer‐Assisted Grading Rubrics. Decision Sciences Journal of Innovative Education, 6(1), 51-73. (Pay-walled)
  • de Raadt, M., Lai, D., & Watson, R. (2007). An Evaluation of Electronic Individual Peer Assessment in an Introductory Programming Course. Proceedings of the Seventh Baltic Sea Conference on Computing Education Research (Koli Calling 2007), Koli, Finland. (PDF)
  • Jonsson, A., & Svingby, G. (2007). The use of scoring rubrics: Reliability, validity and educational consequences. Educational research review, 2(2), 130-144. (PDF)
  • Reddy, Y. M., & Andrade, H. (2010). A review of rubric use in higher education. Assessment & Evaluation in Higher Education, 35(4), 435-448. (PDF)

Tim at Lone Pine Koala Sanctuary
Re: Structured feedback in an online/blended context

It may not be what you are thinking of, since it is given to the student automatically (once the teacher has set it up), but I would say that the feedback a student gets from a quiz is highly structured. (Actually, looking at your 'potential' section, a lot of those are exactly what you can do with quiz feedback, especially via the quiz statistics report.)

I think if you look at some of the adaptive comparative judgement literature, a lot of the papers will start with a critique of why criterion-referenced assessment is not all it is cracked up to be. That might be a good way to get to some appropriate references.

Matt Bury
Re: Structured feedback in an online/blended context

Hi Michael and Tim,

Firstly, thanks for the references. I'll be reading those with interest.

A few interesting papers I've got on rubrics:

  • Knight, L. V., & Steinbach, T. A. (2011). Adapting Peer Review to an Online Course: An Exploratory Case Study. Journal of Information Technology Education, 10, 81–100.
  • Curran, V., Hollett, A., Casimiro, L. M., Mccarthy, P., Banfield, V., Hall, P., … Wagner, S. (2011). Development and validation of the interprofessional collaborator assessment rubric (ICAR). Journal of Interprofessional Care, 25(5), 339–344.
  • Lipnevich, A. A., McCallen, L. N., Miles, K. P., & Smith, J. K. (2013). Mind the gap! Students' use of exemplars and detailed rubrics as formative assessment. Instructional Science, 42(4), 539–559.
  • Greenberg, K. P. (2015). Rubric Use in Formative Assessment: A Detailed Behavioral Rubric Helps Students Improve Their Scientific Writing Skills. Teaching of Psychology, 42(3), 211–217.
  • Goodridge, H. (1996). Teaching for Authentic Student Performance: Understanding Rubrics. Educational Leadership, 54(4), 14–17.
  • The Basics of Rubrics. (2007). Schreyer Institute for Teaching Excellence, Penn State. Retrieved from
  • Pecka, S., Schmid, K., & Pozehl, B. (2014). Psychometric Testing of the Pecka Grading Rubric for Evaluating Higher-Order Thinking in Distance Learning. AANA Journal, 82(6), 449–456.

Re: Structured feedback in an online/blended context

Thanks to Tim and Matt for their suggestions. They were useful.

The presentation I gave at MoodlePosium, which involved structured feedback, was successful. I've attached a PDF version of my slides on the topic, with some updates.

One detrimental aspect of rubric design I just came across was turning a rubric into a matrix/grid, where all criteria have the same number of levels. Apparently this can cause raters to take a biased global/general view of performance (a "halo effect"), rather than judging criteria independently. According to a study by Humphry and Heldsinger, this effect is real and can be avoided.

So the tip of the day is: avoid matrix rubric design, consider the structure of each criterion independently.

The paper is available online and the reference is...

Humphry, S. M., & Heldsinger, S. A. (2014). Common structural design features of rubrics may represent a threat to validity. Educational Researcher, 43(5), 253-263.
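As a design-time sanity check, the matrix pattern that Humphry and Heldsinger warn about can be detected mechanically. This is a hypothetical helper, not a Moodle feature: it simply flags rubrics whose criteria all share the same number of levels.

```python
# Hypothetical design-time check, not a Moodle feature:
# flag rubrics whose criteria all have the same number of levels,
# since that uniform grid may invite halo-effect rating.

def looks_like_matrix(rubric):
    """Return True if every criterion has the same number of levels."""
    level_counts = {len(levels) for levels in rubric.values()}
    return len(level_counts) == 1

grid = {"Clarity": ["Poor", "Fair", "Good"],
        "Depth":   ["Poor", "Fair", "Good"]}
varied = {"Clarity": ["Poor", "Fair", "Good"],
          "Referencing": ["Missing", "Present"]}

print(looks_like_matrix(grid))    # True
print(looks_like_matrix(varied))  # False
```

A flagged rubric is not necessarily invalid, but it is a prompt to ask whether each criterion genuinely has the same number of meaningful performance levels.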

Matt Bury
Re: Structured feedback in an online/blended context

Thanks Michael. I hadn't heard of that particular critique of rubrics before. It'll be interesting reading :)

One of the "take-home" messages I've got is that Likert-style rubrics with numerical levels or adjectives/adverbs of degree are no better than traditional grading: they don't provide learners with meaningful, actionable information about why they got those numbers or adjectives, i.e. what does a "4" or an "excellent" actually mean?

An alternative that I've found productive is behaviourally anchored rating scales (BARS), which provide explicit descriptors of which learner behaviours merit which criterion levels. However, because they are so specific and context-dependent, they tend not to be reusable for other tasks or projects, meaning a lot of extra work for teachers/curriculum developers, who must design a unique rubric for each assessment. In my experience, valid, productive rubrics are difficult and time-consuming to design, but they really make you think more carefully about learning objectives and (sufficient) evidence of learning, which is no bad thing.

Rob Monk
Re: Structured feedback in an online/blended context

When will Moodle allow students to self-assess against a rubric and provide structured feedback like this?

The "adverb soup" of rubrics.

There's a great speech by Daisy Christodoulou, "Life Beyond Levels": at around the 20 minute mark she makes a good case for not using criteria and rubrics for summative judgements.

We need to build software like this into Moodle.
