Fourth Friday: Moodle Research Review for January 2020

by Elizabeth Dalton
Number of replies: 4

Welcome to "Fourth Friday," a new series of monthly reviews of research published in top-tier journals that uses Moodle in pursuit of our mission, "Empowering educators to improve our world."

This month there are four journal articles that particularly caught my eye:

Arpaci, I., & Basol, G. (2020). The impact of preservice teachers’ cognitive and technological perceptions on their continuous intention to use flipped classroom. Education and Information Technologies, 1–12. https://doi.org/10.1007/s10639-020-10104-8
In this study of preservice teachers preparing to work in elementary and secondary schools in Türkiye, researchers implemented a flipped classroom model using Moodle. The study then evaluated how experience with this model affected self-regulation, self-efficacy, perceived ease of use of teaching with the flipped classroom model, and intention to use the model in future teaching. The findings suggest that instructors may need to provide more scaffolding when first exposing students (even preservice teachers) to the flipped classroom model. Instructors in preservice teaching programs can help to build self-regulation and self-efficacy by modeling enthusiasm about new blended learning strategies.

Meo, F. D., & Martí-Ballester, C.-P. (2020). Effects of the perceptions of online quizzes and electronic devices on student performance. Australasian Journal of Educational Technology, 36(1), 111–125. https://doi.org/10.14742/ajet.4888
Educators sometimes wonder whether providing unlimited attempts at online quizzes in Moodle undermines learning by enabling students to retake the quizzes repeatedly rather than study the original source material. In this study, online quizzes on the Moodle platform were implemented for each topic of the Introduction to Accounting course at the Universitat Autònoma de Barcelona during the second term of the 2015–2016 academic year. In addition, students’ perceptions of online quizzes for continuous assessment, and of the electronic devices used to take them, were obtained through a survey distributed at the end of the course.

Participants who scored best on the online quizzes also obtained significantly better examination scores, regardless of whether they used the online multiple-choice questions to check their knowledge or to study the content of an accounting topic.

Kamaghe, J. S., Luhanga, E. T., & Kisangiri, M. (2020). The Challenges of Adopting M-Learning Assistive Technologies for Visually Impaired Learners in Higher Learning Institution in Tanzania. International Journal of Emerging Technologies in Learning (IJET), 15(01), 140–151.
This study describes the use of assistive technologies in mobile learning among students with visual impairments in Tanzania, where Moodle is widely used in higher education institutions. The study concluded that while Moodle itself provides accessibility support, and many mobile devices provide assistive technology such as text-to-speech, many institutions have not enabled Moodle Mobile, and course content often lacks appropriate support for assistive technologies (e.g. alternative text for images). Additionally, few learners were aware of how to use mobile learning features to improve accessibility, indicating a need for student orientation to these services.

Rabin, E., Kalman, Y. M., & Kalz, M. (2020). The cathedral’s ivory tower and the open education bazaar – catalyzing innovation in the higher education sector. Open Learning: The Journal of Open, Distance and e-Learning, 35(1), 82–99. https://doi.org/10.1080/02680513.2019.1662285
This article contrasts "C-Type" ("cathedral-type") organizations, such as traditional research-oriented universities, major publishers, and proprietary educational technology providers, with "B-Type" ("bazaar-type") organizations, here interpreted as open-enrollment higher education institutions, open textbook publishers, and open source educational technology providers, including Moodle. The narrative of "disrupt and replace" is rejected, with a complementary, interdependent ecosystem model proposed instead.

Do you know of research based on or relevant to Moodle? Please feel free to post here, or deposit items to our Moodle Research Repository at https://research.moodle.org. And thanks!

In reply to Elizabeth Dalton

Re: Fourth Friday: Moodle Research Review for January 2020

by Tim Hunt

First, can I say thank you very much for sharing this. Please keep it up.

There has been some interesting work done recently by Sally Jordan and collaborators at the OU, trying to unpick the difference in performance we see for male and female students on our physics degrees, which is a worrying problem. See some of the recent papers here: http://oro.open.ac.uk/view/person/sej3.html. Some of that is using data about what goes on in our Moodle.

Obviously, given my background, I am going to comment on the "Effects of the perceptions of online quizzes and electronic devices on student performance" paper you linked to.

I am afraid I am not very impressed with their results. "Participants who [engaged most with our ed tech] also obtained significantly better examination scores" is what everyone always finds in these sorts of studies where students self-select their use of the technology. It does not mean "the use of online quizzes helps to enhance students’ examination scores". It means some students were always going to do better than their peers. The students who are probably going to get an A for the course are probably going to knock off all the during-the-course activities without breaking a sweat too.

To actually show that your educational technology helps students, you probably need a proper experimental design for your study. Often that is not feasible, in which case you may get somewhere if you can control for the student intake in some way (e.g. do your study in a second-year course, and control your findings using the first-year exam results). I have seen some convincing data along those lines at internal OU seminars, but I don't think most of it has ever been published. Also, even if you find that students who spend time with your technology genuinely do learn more, well, the main contributor to any learning is time on task, so if spending time with your tech helps students, that is good, but it is not necessarily better than whatever else the students could be doing with their time. Anyway, for the 'strongest' result of the paper, let's all say it together: "Correlation does not imply causation."
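To make the self-selection problem concrete, here is a minimal simulation (all numbers invented, nothing from the paper): quiz use is given zero causal effect on exam scores by construction, yet the naive analysis finds a large "effect", and controlling for a prior exam score (as a proxy for intake ability) shrinks it substantially.

    # Hypothetical simulation of self-selection: stronger students use the
    # quizzes more, so quiz use "predicts" exam scores even though the
    # quizzes have NO causal effect here by construction.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 5000

    ability = rng.normal(0, 1, n)                 # unobserved student ability
    quiz_use = ability + rng.normal(0, 1, n)      # self-selection into quiz use
    prior_exam = ability + rng.normal(0, 0.5, n)  # first-year exam: ability proxy
    exam = ability + rng.normal(0, 1, n)          # quiz_use does not appear here

    # Naive analysis: regress exam score on quiz use alone.
    X1 = np.column_stack([np.ones(n), quiz_use])
    b1, *_ = np.linalg.lstsq(X1, exam, rcond=None)
    print(f"naive quiz-use coefficient:      {b1[1]:.2f}")  # about 0.5

    # Controlled analysis: include the prior exam score as a covariate.
    X2 = np.column_stack([np.ones(n), quiz_use, prior_exam])
    b2, *_ = np.linalg.lstsq(X2, exam, rcond=None)
    print(f"controlled quiz-use coefficient: {b2[1]:.2f}")  # about 0.17

Even the controlled estimate is not zero, because the prior exam is only a noisy proxy for ability, which is exactly why a randomised design gives stronger evidence.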

The other thing about the paper that annoys me is that the authors seem to think that students can only use "online multiple-choice questions to check their knowledge", i.e. for recall. That is the lowest level of Bloom's taxonomy, and not really the point of university education.

In my opinion (based on observing what works at the OU) the most effective use of online assessment is in practicing skills (so, maths, science calculations, languages). There, being able to practice the skill with immediate feedback is very powerful for learning.

It is also the case that multiple choice is the least useful sort of question type for practice. Rather than selected response, it is much better to set open-ended questions (numerical, short-answer, calculated, pattern-match, STACK, ...), ideally with lots of variants of each question, so when students repeat the quiz to practice, they get different questions. This paper was about accounting, after all; numerical questions should have been easy.
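To illustrate what "lots of variants" can look like, here is a toy generator (question wording and numbers invented) that emits ten numerical variants of one accounting question in Moodle's GIFT import format. In a real course the calculated question type, or STACK, would generate variants for you; this just shows the idea.

    # Toy sketch: generate variants of a straight-line depreciation question
    # as GIFT numerical questions ({#answer:tolerance}) for import into Moodle.
    import random

    random.seed(42)

    TEMPLATE = (
        "::Depreciation v{i}:: "
        "An asset costs {cost} EUR, has a residual value of {residual} EUR "
        "and a useful life of {years} years. What is the annual "
        "straight-line depreciation charge, in EUR? "
        "{{#{answer}:0.01}}"
    )

    for i in range(1, 11):  # ten variants practising the same skill
        cost = random.randrange(10_000, 50_000, 1_000)
        years = random.choice([4, 5, 8, 10])
        residual = random.randrange(0, 5_000, 500)
        answer = round((cost - residual) / years, 2)
        print(TEMPLATE.format(i=i, cost=cost, residual=residual,
                              years=years, answer=answer))
        print()

Each attempt at a quiz drawing randomly from a bank like this gives the student fresh numbers to work through, which is the point of practice.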

The paper worries about what happens if "quizzes are used to learn course content rather than to test knowledge". This does not worry me at all. Generally active learning > passive learning. Read Richard Lobb's rant "How programming quizzes should work" in the CodeRunner documentation:  https://github.com/trampgeek/moodle-qtype_coderunner/blob/master/Readme.md#appendix-how-programming-quizzes-should-work. Amen to that say I.

Anyway, given that their quizzes seem to have been quite weakly designed, I am not that surprised that many of their other findings were weak. Also, it is very well established in the literature that what students like, or what they think will be effective, often bears no relationship to what actually helps them, further weakening the results of this paper.

In reply to Tim Hunt

Re: Fourth Friday: Moodle Research Review for January 2020

by Elizabeth Dalton
Hi Tim,
Thanks for the link to OU research! I encourage anyone who is doing well-designed research in Moodle to post to our research repository. :) Maybe I’ll do a special edition of “Fourth Friday” all about OU research!

Regarding the paper you critiqued, yeah, I agree with all your points (which probably won’t surprise you). It wouldn’t have occurred to me that someone would seriously worry about students using quizzes as a study tool (even limited ones like these). But then, I have always felt that any quiz or other assessment should test real application, not memorization, and if students want to start with a quiz to find out which material they need to spend time on, why not? Evidently some colleagues of these researchers objected to that idea, so their research was designed to show that, for the students who were likely to retake quizzes, there was no harm done. (They didn’t mention the usual problem that the highest-performing students tend, if anything, to over-study, while the weakest students don’t realize they need to study more.)

At least, that was my take on it. It wasn’t the strongest paper I’ve read in the past year by a long shot, but hey, it was a paper published in the past month in a relatively significant journal that used Moodle in the study... as you can see, I only found 4 such papers this month. ;) (I need to finish up some of my papers and submit them!)
In reply to Elizabeth Dalton

Re: Fourth Friday: Moodle Research Review for January 2020

by Matt Bury

Thanks, Elizabeth, for starting this. It's much appreciated & I hope it grows into a SIG :)

Tim & Elizabeth, hopefully I can shed some light on the findings of the Meo & Martí-Ballester (2020) paper. There's a large & growing body of research on the effects of tests & other similar strategies, collectively known as retrieval practice*. The principle is that retrieval practice is a more effective & efficient way to strengthen memorised ideas, concepts, & processes than strategies such as re-study/re-reading. The evidence is so strong & the effect sizes so large that a number of researchers have acquired funding & established campaigns to promote retrieval practice, as well as 5 other strategies, to students, teachers, & instructional designers, e.g. https://www.learningscientists.org/downloadable-materials & https://www.retrievalpractice.org/

With this in mind, my take on the Meo & Martí-Ballester (2020) paper is that it doesn't establish a causal relationship between the testing & students' academic performance; an experimental design would be necessary to do so. However, the abundance of evidence that retrieval practice is effective across a wide variety of disciplines, age groups, & contexts leads me to believe that we'd likely see similar results had the present study incorporated an experimental design.

In the spirit of sharing,

Matt

*To give you an idea of the sheer quantity of research on retrieval practice, here are the results from a search in my bibliography manager for the name of one of the leading researchers, Jeffrey Karpicke:

Agarwal, P. K., Karpicke, J. D., Kang, S. H. K., Roediger, H. L., & McDermott, K. B. (2008). Examining the testing effect with open- and closed-book tests. Applied Cognitive Psychology, 22(7), 861–876. https://doi.org/10.1002/acp.1391
Ariel, R., & Karpicke, J. D. (2018). Improving self-regulated learning with a retrieval practice intervention. Journal of Experimental Psychology: Applied, 24(1), 43–56. https://doi.org/10.1037/xap0000133
Blunt, J. R., & Karpicke, J. D. (2014). Learning with retrieval-based concept mapping. Journal of Educational Psychology, 106(3), 849–858. https://doi.org/10.1037/a0035934
Butler, A. C., Karpicke, J. D., & Roediger, H. L. (2007). The effect of type and timing of feedback on learning from multiple-choice tests. Journal of Experimental Psychology: Applied, 13(4), 273–281. https://doi.org/10.1037/1076-898X.13.4.273
Butler, A. C., Karpicke, J. D., & Roediger, H. L. (2008). Correcting a metacognitive error: Feedback increases retention of low-confidence correct responses. Journal of Experimental Psychology: Learning, Memory, and Cognition, 34(4), 918–928. https://doi.org/10.1037/0278-7393.34.4.918
Carmichael, M., Reid, A.-K., & Karpicke, J. D. (2018). Assessing the Impact of Educational Video on Student Engagement, Critical Thinking and Learning. 21.
Grimaldi, P. J., & Karpicke, J. D. (2012). When and why do retrieval attempts enhance subsequent encoding? Memory & Cognition, 40(4), 505–513. https://doi.org/10.3758/s13421-011-0174-0
Grimaldi, P. J., & Karpicke, J. D. (2014). Guided retrieval practice of educational materials using automated scoring. Journal of Educational Psychology, 106(1), 58–68. https://doi.org/10.1037/a0033208
Grimaldi, P. J., Poston, L., & Karpicke, J. D. (2015). How does creating a concept map affect item-specific encoding? Journal of Experimental Psychology: Learning, Memory, and Cognition, 41(4), 1049–1061. https://doi.org/10.1037/xlm0000076
Hartwig, M. K., & Dunlosky, J. (2011). Study strategies of college students: Are self-testing and scheduling related to achievement? Psychonomic Bulletin & Review, 19(1), 126–134. https://doi.org/10.3758/s13423-011-0181-y
Karpicke, J. D., & Blunt, J. R. (2011a). Retrieval Practice Produces More Learning than Elaborative Studying with Concept Mapping. Science, 331(6018), 772–775. https://doi.org/10.1126/science.1199327
Karpicke, J. D., & Blunt, J. R. (2011b). Response to Comment on “Retrieval Practice Produces More Learning than Elaborative Studying with Concept Mapping.” Science, 334(6055), 453–453. https://doi.org/10.1126/science.1204035
Karpicke, J. D., & Roediger, H. L. (2010). Is expanding retrieval a superior method for learning text materials? Memory & Cognition, 38(1), 116–124. https://doi.org/10.3758/MC.38.1.116
Karpicke, J. D., & Roediger, H. L. (2007). Repeated retrieval during learning is the key to long-term retention. Journal of Memory and Language, 57(2), 151–162. https://doi.org/10.1016/j.jml.2006.09.004
Karpicke, J. D. (2009). Metacognitive control and strategy selection: Deciding to practice retrieval during learning. Journal of Experimental Psychology: General, 138(4), 469–486. https://doi.org/10.1037/a0017341
Karpicke, J. D. (2012). Retrieval-Based Learning: Active Retrieval Promotes Meaningful Learning. Current Directions in Psychological Science, 21(3), 157–163. https://doi.org/10.1177/0963721412443552
Karpicke, J. D. (2016, June). A powerful way to improve learning and memory: Practicing retrieval enhances long-term, meaningful learning. Psychological Science Agenda. http://www.apa.org/science/about/psa/2016/06/learning-memory.aspx
Karpicke, J. D. (2017). Retrieval-Based Learning: A Decade of Progress. In Learning and Memory: A Comprehensive Reference (pp. 487–514). Elsevier. https://doi.org/10.1016/B978-0-12-809324-5.21055-9
Karpicke, J. D., & Aue, W. R. (2015). The Testing Effect Is Alive and Well with Complex Materials. Educational Psychology Review, 27(2), 317–326. https://doi.org/10.1007/s10648-015-9309-3
Karpicke, J. D., & Bauernschmidt, A. (2011). Spaced retrieval: Absolute spacing enhances learning regardless of relative spacing. Journal of Experimental Psychology: Learning, Memory, and Cognition, 37(5), 1250–1257. https://doi.org/10.1037/a0023436
Karpicke, J. D., Blunt, J. R., Smith, M. A., & Karpicke, S. S. (2014). Retrieval-based learning: The need for guided retrieval in elementary school children. Journal of Applied Research in Memory and Cognition, 3(3), 198–206. https://doi.org/10.1016/j.jarmac.2014.07.008
Karpicke, J. D., Butler, A. C., & Roediger, H. L. (2009). Metacognitive strategies in student learning: Do students practise retrieval when they study on their own? Memory, 17(4), 471–479. https://doi.org/10.1080/09658210802647009
Karpicke, J. D., & Grimaldi, P. J. (2012). Retrieval-Based Learning: A Perspective for Enhancing Meaningful Learning. Educational Psychology Review, 24(3), 401–418. https://doi.org/10.1007/s10648-012-9202-2
Karpicke, J. D., Lehman, M., & Aue, W. R. (2014). Retrieval-Based Learning. In Psychology of Learning and Motivation (Vol. 61, pp. 237–284). Elsevier. https://doi.org/10.1016/B978-0-12-800283-4.00007-1
Karpicke, J. D., McCabe, D. P., & Roediger, H. L. (2008). False memories are not surprising: The subjective experience of an associative memory illusion. Journal of Memory and Language, 58(4), 1065–1079. https://doi.org/10.1016/j.jml.2007.12.004
Karpicke, J. D., & Pisoni, D. B. (2004). Using immediate memory span to measure implicit learning. Memory & Cognition, 32(6), 956–964.
Karpicke, J. D., & Roediger, H. L. (2007). Expanding retrieval practice promotes short-term retention, but equally spaced retrieval enhances long-term retention. Journal of Experimental Psychology: Learning, Memory, and Cognition, 33(4), 704–719. https://doi.org/10.1037/0278-7393.33.4.704
Karpicke, J. D., & Roediger, H. L. (2008). The Critical Importance of Retrieval for Learning. Science, 319(5865), 966–968. https://doi.org/10.1126/science.1152408
Karpicke, J. D., & Smith, M. A. (2012). Separate mnemonic effects of retrieval practice and elaborative encoding. Journal of Memory and Language, 67(1), 17–29. https://doi.org/10.1016/j.jml.2012.02.004
Karpicke, J. D., & Zaromb, F. M. (2010). Retrieval mode distinguishes the testing effect from the generation effect. Journal of Memory and Language, 62(3), 227–239. https://doi.org/10.1016/j.jml.2009.11.010
Lehman, M., & Karpicke, J. D. (2016). Elaborative retrieval: Do semantic mediators improve memory? Journal of Experimental Psychology: Learning, Memory, and Cognition, 42(10), 1573–1591. https://doi.org/10.1037/xlm0000267
Lehman, M., Smith, M. A., & Karpicke, J. D. (2014). Toward an episodic context account of retrieval-based learning: Dissociating retrieval practice and elaboration. Journal of Experimental Psychology: Learning, Memory, and Cognition, 40(6), 1787–1794. https://doi.org/10.1037/xlm0000012
McCabe, D. P., Roediger, H. L., & Karpicke, J. D. (2011). Automatic processing influences free recall: Converging evidence from the process dissociation procedure and remember-know judgments. Memory & Cognition, 39(3), 389–402. https://doi.org/10.3758/s13421-010-0040-5
Nunes, L. D., & Karpicke, J. D. (2015). Retrieval-Based Learning: Research at the Interface between Cognitive Science and Education. In R. A. Scott & S. M. Kosslyn (Eds.), Emerging Trends in the Social and Behavioral Sciences (pp. 1–16). John Wiley & Sons, Inc. https://doi.org/10.1002/9781118900772.etrds0289
Roediger, H. L., & Karpicke, J. D. (2006a). Test-Enhanced Learning: Taking Memory Tests Improves Long-Term Retention. Psychological Science, 17(3), 249–255. https://doi.org/10.1111/j.1467-9280.2006.01693.x
Roediger, H. L., & Karpicke, J. D. (2006b). The Power of Testing Memory: Basic Research and Implications for Educational Practice. Perspectives on Psychological Science, 1(3), 181–210. https://doi.org/10.1111/j.1745-6916.2006.00012.x
Roediger, H. L., & Karpicke, J. D. (2018). Reflections on the Resurgence of Interest in the Testing Effect. Perspectives on Psychological Science, 13(2), 236–241. https://doi.org/10.1177/1745691617718873
Smith, M. A., Blunt, J. R., Whiffen, J. W., & Karpicke, J. D. (2016). Does Providing Prompts During Retrieval Practice Improve Learning?: Retrieval-based learning. Applied Cognitive Psychology, 30(4), 544–553. https://doi.org/10.1002/acp.3227
Smith, M. A., & Karpicke, J. D. (2014). Retrieval practice with short-answer, multiple-choice, and hybrid tests. Memory, 22(7), 784–802. https://doi.org/10.1080/09658211.2013.831454
Smith, M. A., Roediger, H. L., & Karpicke, J. D. (2013). Covert retrieval practice benefits retention as much as overt retrieval practice. Journal of Experimental Psychology: Learning, Memory, and Cognition, 39(6), 1712–1725. https://doi.org/10.1037/a0033569
Weinstein, Y., Nunes, L. D., & Karpicke, J. D. (2016). On the placement of practice questions during study. Journal of Experimental Psychology: Applied, 22(1), 72–84. https://doi.org/10.1037/xap0000071
Whiffen, J. W., & Karpicke, J. D. (2017). The role of episodic context in retrieval practice effects. Journal of Experimental Psychology: Learning, Memory, and Cognition, 43(7), 1036–1046. https://doi.org/10.1037/xlm0000379
Zaromb, F. M., Karpicke, J. D., & Roediger, H. L. (2010). Comprehension as a basis for metacognitive judgments: Effects of effort after meaning on recall and metacognition. Journal of Experimental Psychology: Learning, Memory, and Cognition, 36(2), 552–557. https://doi.org/10.1037/a0018277

In reply to Matt Bury

Re: Fourth Friday: Moodle Research Review for January 2020

by Tim Hunt
Retrieval practice is a very strong effect for recall of specific facts. It is an important effect that has its place. However, it is not a universal panacea. It only helps you recall the specific facts you practice recalling. It does not help with related facts, and it does not help with understanding what you have memorised. https://www.sciencedirect.com/science/article/abs/pii/S2211368114000588 is one of the useful papers exploring the limits of the testing effect. Worth a read.

Actually, does anyone know a good recent review article that summarises what has been found about the limits of the testing effect?

I hope I am not being too negative. Although there are limits to what the effect can be used for, it is a great way to learn things. I use it when learning music for choir, and it's why I do Go problems every day.