When people have different uses for Moodle - what are the consistent concepts that can be analysed?

Re: When people have different uses for Moodle - what are the consistent concepts that can be analysed?

by Tom Murdock -
Number of replies: 2

The questions you raise make perfect sense to me as conversations I always want to have with Moodlers.  

  • Do you know what settings are necessary to make the best use of a Moodle dashboard?
  • What learning outcomes do you want to address within Moodle?
  • How will you know if outcomes have been fulfilled?
  • What elements are consistently designed across a set of courses so that they can be compared?
  • What have your teachers/educators/trainers agreed upon as "consistent practice"?
  • What factors make certain elements more or less successful than other elements? 
  • Do the various course designs create value for students, or are the different patterns confusing?
  • If you teach hybrid courses, what activities do you want to save for face-to-face meetings?

Backward Design (Wiggins) can't be ignored in online courses, especially when you intend to compare apples to apples with analytics.  Plus, it is really liberating to find some agreement among peers on what they want to see in online instruction.

-Tom  

In reply to Tom Murdock

Re: When people have different uses for Moodle - what are the consistent concepts that can be analysed?

by Dan McGuire -

I agree with your points, Tom. I do think this discussion is in the wrong forum, however. The points you make are best addressed, and analyzed to produce learning analytics, through competencies and thorough, detailed reporting on those competencies.

IMO, analyzing behaviors and course characteristics that aren't necessarily related to authentic evidence of student learning (a normal and necessary part of the course) is not learning analytics, even though the term "learning analytics" in current parlance usually does mean analyzing behaviors and course characteristics. The problem Mary points out, that faculty don't often use Moodle to record assessments (especially of off-line activities) even though doing so is actually quite efficient, illustrates the difficulty of analyzing learning. It is only partially a software feature development issue. It is more of a professional services / professional development issue, which will become easier as competency reporting gets easier to use. Software developers will need to listen long and frequently to professional services people, which is anathema to some developers.

One other, somewhat related point: all teaching is either online or hybrid (certainly for our purposes on this forum ;) ); it's just a matter of how much time is face to face. Here's an example from my classroom days, which I'll be missing this time of year for a week or so: http://dangerouslyirrelevant.org/2010/09/writing-the-elephant-in-the-living-room.html


In reply to Dan McGuire

Re: When people have different uses for Moodle - what are the consistent concepts that can be analysed?

by Elizabeth Dalton -

There are certainly "best practices" we can offer institutions for improving their predictions through more complete data capture, e.g. competencies. But I think we also have to acknowledge that not every institution works the same way, and competencies are not always appropriate. For example, in a Social Reconstruction curriculum using Social Constructivist pedagogy, bringing students into participation with the community of practice may be the main goal, and participation patterns in the forums, etc. may be the most important predictor.
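To make that last point concrete, here is a minimal, hypothetical sketch of what a participation-based predictor feature might look like. The post records are invented for illustration; a real implementation would pull them from Moodle's database or web-service API rather than a hard-coded list.

```python
# Hypothetical sketch: turning forum participation into predictor features.
# The data below is invented; it is NOT Moodle's actual data model or API.
from collections import Counter

# (student_id, forum_id) pairs, one per forum post
posts = [
    ("alice", "f1"), ("alice", "f1"), ("alice", "f2"),
    ("bob", "f1"),
    ("carol", "f2"), ("carol", "f2"),
]

def participation_features(posts):
    """Return per-student counts of posts and of distinct forums touched."""
    post_counts = Counter(student for student, _ in posts)
    forums_touched = {}
    for student, forum in posts:
        forums_touched.setdefault(student, set()).add(forum)
    return {
        s: {"posts": post_counts[s], "forums": len(forums_touched[s])}
        for s in post_counts
    }

features = participation_features(posts)
print(features["alice"])  # {'posts': 3, 'forums': 2}
```

Features like these could then feed whatever prediction model an institution chooses, alongside (or instead of) competency evidence, depending on which signals fit its curriculum.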