I developed "Modular" Course Design Reviews for our university, organized by areas of the Moodle LMS:
- Course Header (Welcome & Orientation)
- Main Course Page (Navigation & Clutter)
- Modules - Big Picture (Alignment with Outcomes)
- Assignments - Individual Prompts (Rigor & Standards)
- Syllabus & Gradebook (Alignment & Accuracy)
- Media Check (Quality & Accessibility)
Each one asks the reviewer to answer 10-20 questions using branching logic on a Microsoft Form. I have it set up with a Power Automate flow that emails the results to the reviewer.
The advantages of this modular approach to Course Design Review:
- You can delegate specific reviews to those with relevant expertise (e.g., Media Check by a Multimedia Specialist, Modules - Big Picture to an Instructional Designer)
- These shorter "mini-reviews" are easier to complete in one sitting than a single comprehensive review
- Collating everything that needs to be reviewed for a particular "View" / "Area" of the LMS saves reviewers time, since they can review all of it in that one location.
Here is what I want Moodle to be able to do:
An Annotated Course Review Functionality based on Area of LMS
The new feature would perform two functions:
- Function 1 - Based on Moodle LMS role, it would allow customizable questions to be created to evaluate a given area of the course. Questions could be quantitative, multiple-choice style ("Meets Standard", "Almost Meets Standard", "Does Not Meet Standard", "Not Applicable") or qualitative, open-ended feedback questions. Users could create their own question sets, and the feature would be able to generate a CSV file or report of some kind from the results.
- Function 2 - Both quantitative and qualitative feedback could be displayed within each area of the LMS as "annotated comments," visible only to certain roles. So an Instructor or Instructional Designer could see them, but students could not. Instructors could use the annotations as a checklist to clean up and adjust the course, then delete them or mark them complete once addressed.
So one function would tie customizable feedback questions to a specific area of the course shell (e.g., an Activity, the Main Page, a Module), and another function would capture that feedback as annotations that show up for certain user roles in those areas of the course.
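To make the two functions concrete, here is a minimal sketch of the data model they imply: questions tied to an LMS area, answers stored as role-gated annotations, and a CSV report. This is Python pseudocode for illustration only; an actual Moodle plugin would be PHP, and every class, field, and role name below (e.g. `instructionaldesigner`) is an assumption, not an existing Moodle API.

```python
# Illustrative sketch only: names, roles, and structure are assumptions,
# not Moodle's real (PHP) plugin API.
import csv
import io
from dataclasses import dataclass

@dataclass
class Question:
    area: str    # e.g. "Main Course Page", "Module", "Activity"
    prompt: str
    kind: str    # "quantitative" or "qualitative"
    choices: tuple = ("Meets Standard", "Almost Meets Standard",
                      "Does Not Meet Standard", "Not Applicable")

@dataclass
class Annotation:
    question: Question
    answer: str
    # Function 2: only these roles can see the annotation in the course area
    visible_to: frozenset = frozenset({"editingteacher", "instructionaldesigner"})
    resolved: bool = False   # "mark complete when addressed"

def visible_annotations(annotations, role):
    """Return the unresolved annotations a given role is allowed to see."""
    return [a for a in annotations if role in a.visible_to and not a.resolved]

def export_csv(annotations):
    """Function 1's report: dump all review answers as CSV text."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["area", "prompt", "answer", "resolved"])
    for a in annotations:
        writer.writerow([a.question.area, a.question.prompt, a.answer, a.resolved])
    return buf.getvalue()
```

Used this way, a student role would see no annotations while an Instructional Designer would see the open ones, and the same records feed the CSV report.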
What is the value of this feature to users of Moodle? Instead of relying on external tools and methods for Course Design Reviews and Quality Assurance, this functionality would be integrated into the LMS, in the exact areas where the feedback is relevant and may need adjustment or improvement. It also lets universities choose their own, customizable review standards: a university that wanted to use Quality Matters could enter those standards mapped to the relevant areas of the LMS, while another could use the OSCQR Rubric or create its own.
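The "choose your own standards" idea boils down to mapping an institution's rubric onto the same area taxonomy. A toy sketch, where the rubric names and question texts are placeholders I made up (not actual Quality Matters or OSCQR language):

```python
# Placeholder config: two institutions loading different standards into
# the same LMS-area taxonomy. Texts are invented, not real rubric items.
RUBRICS = {
    "qm_style": {
        "Course Header": ["Instructions make clear how to get started."],
        "Syllabus & Gradebook": ["The grading policy is stated clearly."],
    },
    "oscqr_style": {
        "Course Header": ["Course includes a welcome and getting-started overview."],
        "Media Check": ["Videos include accurate captions."],
    },
}

def questions_for(rubric_name, area):
    """Look up the configured review questions for one area of the LMS."""
    return RUBRICS.get(rubric_name, {}).get(area, [])
```

Each institution edits only its own mapping; the review and annotation machinery stays identical.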
This would be invaluable functionality that could really set Moodle apart from competitors, none of which offer anything close to it.
I am also open to making the modular reviews I created open source so others can benefit from them and the community can improve on them. They could even be shipped as "defaults" to show end-users how best to use this functionality before customizing it for their own needs.
A bonus would be if it were also possible to award a Badge or certify a course once it reaches a certain score on these questions!
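One way the badge bonus could work, as a sketch: score the quantitative answers, skip "Not Applicable," and certify above a threshold. The point values and the 90% cutoff here are assumptions I chose for illustration, not anything Moodle defines.

```python
# Assumed scoring rule: 2/1/0 points per answer, "Not Applicable" skipped,
# badge awarded at an (arbitrary) 90% threshold.
POINTS = {
    "Meets Standard": 2,
    "Almost Meets Standard": 1,
    "Does Not Meet Standard": 0,
}

def review_score(answers):
    """Percentage score over the applicable answers only."""
    scored = [POINTS[a] for a in answers if a in POINTS]
    if not scored:
        return 0.0
    return 100.0 * sum(scored) / (2 * len(scored))

def earns_badge(answers, threshold=90.0):
    """True if the course review clears the certification threshold."""
    return review_score(answers) >= threshold
```

So nine "Meets Standard" plus one "Almost Meets Standard" scores 95% and earns the badge, while a review full of "Does Not Meet Standard" does not.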