OK: Would this survey be a useful addition to your online courses/programmes? What issues do you see with using this as an evaluation tool for courses and programmes?
It is a quantitative methodology: a data collection tool gathers feedback, which is then number-crunched to evaluate courses. This is not new. I would add some qual questions as well, to seek student reasons for their yes/no and Likert-scale responses.
As for the research, all the stats were 'high'...e.g. the Cronbach's alpha: not significant, not low, not medium (he he, am playing, stretching my gag a bit). Whether the reader is convinced of the validity of the tool (the questionnaire) is up to the reader; yet I fail to understand such a focus on refining a questionnaire, only to describe the 'high' number outcomes without any links to student voice or triangulation with participants (e.g. links in the discussion-of-results section).
It seems to me the focus of the paper is on the development of the tool rather than an in-depth analysis of the tool's impact for data collection on the key areas: (a) community, (b) comfort, (c) facilitation, and (d) interaction and collaboration. In other words, I have not developed my knowledge of how the tool supported movement forward in knowledge development about these key areas for online learning. To over-egg the pud...I have no idea what is meant by the term/concept 'comfort', or what the measures/outcomes are, or what they mean for online learning...and on...and on.