An interesting workshop about self-assessment tools

by Tim Hunt -
Number of replies: 30

About 10 days ago, I took part in a very interesting workshop about the use of assessment tools to promote learning:

Self-assessment: strategies and software to stimulate learning

The day was organised by Sally Jordan from the OU and Tony Gardner-Medwin from UCL, and was supported by the HEA, so thanks to all of them for making it happen.

People talked about different assessment tools (not all Moodle), but also how they were getting students to use them, and in some cases what evidence there was for whether that was effective.

Parts of the event were recorded, and you can now access the recordings at http://stadium.open.ac.uk/stadia/preview.php?whichevent=1955&s=1. There is a total of 3.5 hours of video there, so you may not want to watch it all. My presentation is in Part 3, which also includes the final discussion, all in 30 minutes, and provides a reasonable summary of the day.

Despite having spent the whole day at the event discussing various aspects of what self-assessment is, I don't think we reached a single definition of it. Actually, I think it is clear that it is not one thing, but rather a useful way of looking at many different things, from the point of view of what most helps students learn.

One of the tools discussed during the day was PeerWise. If you have not come across that yet, then you should take a look, because it looks like a very interesting tool. There is a good introduction on YouTube.

Average of ratings: Useful (3)
In reply to Tim Hunt

Re: An interesting workshop about self-assessment tools

by Tony Gardner-Medwin -

I've put up a more informative way to access the recordings and PDF files of the presentations at our self-assessment workshop last week, at http://www.ucl.ac.uk/~ucgbarg/OU_workshop.htm (linked as "more information" from other sites to do with the workshop).

I'm puzzled why people should think the term 'self-assessment' unclear. 

Summative assessment is for examiners to pass judgement on you.

Formative assessment is for teachers to find out if they're succeeding in teaching you, and to provide stimulus and feedback to you.

Self-assessment is entirely private - your business alone, to help you learn and direct your learning efficiently. It's even fun.

Ipsative assessment was a new concept for me at the workshop: part of the core function of self-assessment as I see it, whereby a student gets some measure of whether he or she is improving, whatever their absolute standards.

Average of ratings: Useful (2)
In reply to Tony Gardner-Medwin

Re: An interesting workshop about self-assessment tools

by Tim Hunt -

I have been thinking about it for a few days, and your attempt to clarify self-assessment does not satisfy me, although it captures some of the essentials. Self-assessment, as we discussed it, was not entirely private. What you (the student) did while engaging with the tool may have been entirely private (unless, say, in LAPT, you chose to comment on a question - and actually one of the speakers was able to present data showing that students who did not engage with LAPT were much more likely to fail the exam ...). However, the task was, in most cases, determined by the teacher. 'The task' may, or may not, include authoring the questions.

In reply to Tim Hunt

Re: An interesting workshop about self-assessment tools

by Tony Gardner-Medwin -

I think this debate about what self-assessment should be is quite important, so I've copied it, and responded again myself, in a forum I've been setting up for the meeting - perhaps more appropriate than here for those interested - at http://tmedwin.net/forum/index.php?board=3.0

In reply to Tony Gardner-Medwin

Re: An interesting workshop about self-assessment tools

by Rick Jerz -

I have been delivering quizzes, which I call "self-assessments", since I started using Moodle six years ago.  What I typically do is create a multiple-choice/true-false quiz with 15 to 20 questions picked randomly from a pool of 60-80 questions, and I allow students to repeat these until they "are satisfied that they have learned the topic".  In other words, satisfied with their grade.  If a student stops at 16 out of 20, for example, that is fine.

Of course, I tell them that they will always get (grade) points for anything they do.  And I tell Moodle to keep their highest score so that they are encouraged to keep trying.

I find that this is one of the best methods to get students to read the book and learn on their own.  Maybe these "self-assessments" become enjoyable for them, like a game, and they try to get the best score.

For exams, I then draw questions from the same bank of questions.  If they have practiced with their self-assessments, they should do well on the exams.  Of course, for exams I give them only one attempt.

Moodle's quizzing mechanism makes this method very effective.  If I wanted to carry this method a little further, I would probably try the conditional features that I have read about in 2.3, when I upgrade to that version soon.
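As a rough sketch of the setup described above (plain Python, not Moodle code; the pool size, the number of attempts, and the 50 "mastered" questions are made-up numbers just for illustration), each attempt draws a random subset of the pool and only the highest score is recorded:

    import random

    POOL_SIZE = 80              # a pool of 60-80 questions
    QUESTIONS_PER_ATTEMPT = 20  # each quiz attempt shows 15-20 of them

    pool = list(range(POOL_SIZE))
    mastered = set(random.sample(pool, 50))   # pretend the student has learned 50 of the questions

    def attempt_score():
        """One attempt: draw random questions and count how many the student gets right."""
        drawn = random.sample(pool, QUESTIONS_PER_ATTEMPT)
        return sum(1 for q in drawn if q in mastered)

    scores = [attempt_score() for _ in range(5)]   # the student repeats the quiz five times
    recorded_grade = max(scores)                   # "keep the highest score" grading method
    print(scores, recorded_grade)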

(Attachment: self-assessment)

In reply to Rick Jerz

Re: An interesting workshop about self-assessment tools

by Tony Gardner-Medwin -

Rick - It sounds a bit as if you are offering the students a chance to see their exam Qs in advance, ad lib! Not surprising they like it! A sprinkling of overlap between the self-assessment and exam databases must be good, to motivate the students to work on it - but surely not too much, or they will rote-learn. I would prefer to use different and varied formats for Qs in self-tests than in exams (T/F vs MCQ vs free text, etc.) so that things are coming from different perspectives.

Average of ratings: Useful (1)
In reply to Tony Gardner-Medwin

Re: An interesting workshop about self-assessment tools

by Rick Jerz -

Yes, they can see exam questions ahead of time.  However, the more the questions are repeated, the more likely it is that the textbook material will sink into their heads.

Yes, I am hoping to eventually use more question types.  The problem right now is that the publishers do a good job of supplying T/F and multiple-choice questions, but when you get to some of the more advanced Moodle question types, the publishers don't supply any, so one ends up creating these from scratch, which takes time.

In reply to Rick Jerz

Re: An interesting workshop about self-assessment tools

by Joseph Rézeau -

Rick "... so one must end up creating these from scratch, which takes time."

Well, isn't that a teacher's job? wink

Joseph

Average of ratings: Useful (1)
In reply to Joseph Rézeau

Re: An interesting workshop about self-assessment tools

by Itamar Tzadok -

I'd argue that it can't be the teacher's job. Creating enough variants to make practice/assessment exercises effective takes a lot of time, and to do that efficiently one needs to master various tools beyond Moodle's question-authoring interface, and sometimes invent new tools. With such exercises, students' awareness of the details increases dramatically, and the teacher may have to work much harder to prepare for answering in-class questions and connecting the dots. smile

In reply to Joseph Rézeau

Re: An interesting workshop about self-assessment tools

by Rick Jerz -

Maybe so, but often textbook authors (or publishers) will do this for you.  Additionally, when the publisher/author does this, there is some assurance that the answers can be found in the textbook, and that the questions have been tried out on a bigger population and are less likely to be in error.

However, creating quizzes and exams is among the many tasks that teachers might take on.  (One might argue that teachers are supposed to create their own textbooks, and course management systems, too smile, just kidding).

In reply to Tony Gardner-Medwin

Re: An interesting workshop about self-assessment tools

by Itamar Tzadok -

IMO your concern is somewhat misguided. It seems to suggest (by analogy) that we should not let toddlers walk too much lest they rote-learn how to walk as opposed to genuinely understanding walking. Insofar as you are concerned about rote-learning as opposed to genuine understanding, I would challenge you to clarify what exactly understanding is and how it is measured and assessed. If 'coming from different perspectives' means that we should test students with problem types they did not practice, there seems to be something fundamentally wrong with that. I'm pretty sure that not too many teachers would agree to be tested and assessed on their teaching skills by having to teach in ways they have not experienced and practiced. smile

In reply to Itamar Tzadok

Re: An interesting workshop about self-assessment tools

by Joshua Bragg -

I see nothing wrong with testing students with problem types they did not practice.  It all depends on how you're doing it.  If the students know the basic concepts and skills that are required for the problem, and solving problems of types they have never seen before is discussed and practiced in class with general problem-solving skills, then it is fine.  Learning to solve problems of types you have never seen before is a valuable real-world skill.

On the other hand, if you've been giving students rote practice with clearly defined parameters in class and then test them on something unexpected, you didn't do a good job of preparing them for the test.  That preparation doesn't have to be a discussion of the exact type and format of the question, though.

As I tell my students in my chemistry class, if you want to make the big bucks in the world you're going to be applying your skills and understanding to unique situations.  The people who can do that are the most valuable commodity in any organization.

I'm tested everyday by the questions and concerns that students bring to me that I've never encountered before.  I am evaluated by that student on how well I seem to deal with those questions and concerns.  My wife works in advertising and no two clients are ever the same.  

In reply to Joshua Bragg

Re: An interesting workshop about self-assessment tools

by Itamar Tzadok -

... and solving problems of types they have never seen before is discussed and practiced in class with general problem solving skills ...

There may be some confusion here between problem types and problem instances. Testing the students with never-seen-before instances of problem type A, which has been discussed and practiced in the course, is fine. But testing students with instances of problem type B, which has not been discussed and practiced, is questionable. This may be testing the students on their ability to move from type A to type B without giving them a fair opportunity to practice and acquire the skill of transferring from A to B. It is legitimate if that's the requirement, but it's rarely the requirement, even on the rare occasions where teachers make the requirements clear.

if you want to make the big bucks in the world you're going to be applying your skills and understanding to unique situations

Not really. It's probably more important to be willing to do what most people won't do, most of which has nothing to do with skills or understanding of unique situations. But I suppose those who actually make the big bucks can be better judges of what it really takes.

I'm tested everyday by the questions and concerns that students bring to me that I've never encountered before ...

Only if at the end of the term you get a grade that affects your paycheck prospects. Otherwise, it's not really a test and you're not really evaluated by the students. smile

In reply to Itamar Tzadok

Re: An interesting workshop about self-assessment tools

by Joshua Bragg -

I don't think I'm confusing problem types and problem instances.  I honestly mean types of problems never seen before.

I don't know what your background is in terms of teaching.  Mine is chemistry.  Chemistry, Physics, and Math classes are very problem oriented.  It is impossible to show every possible type of problem to students in these classes, even when limiting the types of problems to ones that only require skills that are taught in that class, not ones that would be taught later.  There are just too many ways to combine the skills that you learn together.

Don't get me wrong, I'm not testing them on skills they haven't learned.  I'm testing them on the ability to apply those skills in novel problems.  They are problems that they have the skills to solve, they just need to figure out how to apply those skills to that situation.

I think it is easiest to demonstrate what I'm talking about with a problem:

One of the many super acids, which are defined as any acid stronger than 100% sulfuric acid, can be prepared by the following reaction:
2HF + SbF5 → [H2F]+ + [SbF6]-
How many octahedral ions can be produced from 2.93 mL of anhydrous HF (density = 0.975 g/mL)?

To solve this problem, they need to have the following skills (a rough numeric sketch follows the list):

  • Draw a Lewis structure for the ions and identify which one is octahedral.
  • Calculate a mass from a volume using density.
  • Use the mass of the HF to calculate moles of HF
  • Use a mole ratio to calculate the moles of SbF6-
  • Use the moles of SbF6- to calculate the number of ions using Avogadro's number.
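
As a rough numeric sketch of those steps (plain Python; the molar mass of HF and Avogadro's number are standard values I've filled in, and identifying [SbF6]- as the octahedral ion is the Lewis-structure step):

    # Working through the steps listed above
    volume_ml = 2.93          # mL of anhydrous HF
    density = 0.975           # g/mL
    molar_mass_hf = 20.01     # g/mol (1.008 for H + 19.00 for F)
    avogadro = 6.022e23       # ions per mole

    mass_hf = volume_ml * density          # ~2.86 g of HF
    moles_hf = mass_hf / molar_mass_hf     # ~0.143 mol HF
    moles_sbf6 = moles_hf / 2              # 2 HF : 1 [SbF6]- from the balanced equation
    ions = moles_sbf6 * avogadro           # ~4.3e22 octahedral ions
    print(f"{ions:.2e}")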

All of those individual parts are things that I teach in class.  But I think it is a requirement that students learn to solve complex multi-step, multi-skill problems.  They haven't learned how to do that by the time they get to me, and I do my best to get them to learn.

I can't remember if I created that problem or partially or completely stole it from another source.  But I can be pretty certain that they're fairly unlikely to see a problem of that same type again.  That doesn't mean it's not worth solving and figuring out how to do.

On your big bucks point, based on what I'm told every day when I tell people that I teach chemistry, I should be making the big bucks, because very few other people I've met are willing to do it.  Or to teach at all, for that matter.

Finally, I live in North Carolina in the US.  The state is reworking the teacher evaluation system.  It is likely to include student surveys as part of a measurement of teacher effectiveness.  (At least from the last report I heard from our state Board of Education.)  The state legislature recently approved a budget that tells school districts that they may submit plans to use teacher effectiveness data for merit pay.  So, my paycheck is not affected by my students' evaluations right now, but in 5 years I'm not going to be surprised if it is.

Average of ratings: Useful (2)
In reply to Joshua Bragg

Re: An interesting workshop about self-assessment tools

by Itamar Tzadok -

Appreciate your detailed response. I don't have a chemistry background, but I do have math and, at the other end of the circle, philosophy/history. Both ends, as well as everything in between, are very problem oriented. Just like anything important in life. The only thing that makes chemistry and math seem more (or exclusively) problem oriented compared to philosophy/history is that they are accepted as inherently symbolic, and it is much easier to introduce and solve a complex problem without getting lost in details when the problem/solution description can be abbreviated by symbolization. Subjects such as philosophy/history are traditionally unsymbolized and consequently do not lend themselves easily to complex problem solving. The closest they get to that is a short moment of formal logic, where you can find problems such as:

Provide a proof for the following argument

∃x∃y(((Ax ∧ Ox) ∧ (Ay ∧ Oy)) ∧ ~(x = y))
∀x(Ax → Px)
∀x∀y∀z((((Px ∧ Ox) ∧ (Py ∧ Oy)) ∧ (Pz ∧ Oz)) → ((x = y) ∨ ((x = z) ∨ (y = z))))
-------------------------------------------------------------------
∃x∃y((((Px ∧ Ox) ∧ (Py ∧ Oy)) ∧ ~(x = y)) ∧ ∀z((Pz ∧ Oz) → ((z = x) ∨ (z = y))))

and, depending on the system of rules you are provided with, can take some 40 steps to solve (and up to 80 for some students, but only because I put the limit at 80).

So we seem to be talking about the same sort of complex multi-step, multi-skill problems. Consider now that I require students to solve 10-20 such problems (alongside instances of a couple of other types) every week for 12 weeks. The questions are drawn randomly from a large bank, so in each attempt the student may get a different subset, but even with repeating instances the letters and order of premises will be changed randomly. The virtual question bank is pretty huge, so much so that it ensures that even if the final exam draws from that same question bank, as it does, students are likely to get "never-seen-before" and yet well-practiced instances.
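
As a rough sketch of that kind of per-attempt randomisation (plain Python, not the actual implementation; the predicate letters A, O, P and the replacement alphabet are just illustrative), the letters are remapped consistently and the premises shuffled, so a repeated instance still looks new:

    import random
    import re

    premises = [
        "∃x∃y(((Ax ∧ Ox) ∧ (Ay ∧ Oy)) ∧ ~(x = y))",
        "∀x(Ax → Px)",
        "∀x∀y∀z((((Px ∧ Ox) ∧ (Py ∧ Oy)) ∧ (Pz ∧ Oz)) → ((x = y) ∨ ((x = z) ∨ (y = z))))",
    ]
    conclusion = "∃x∃y((((Px ∧ Ox) ∧ (Py ∧ Oy)) ∧ ~(x = y)) ∧ ∀z((Pz ∧ Oz) → ((z = x) ∨ (z = y))))"

    def randomise(premises, conclusion):
        old = ["A", "O", "P"]                              # predicate letters in the stored template
        new = random.sample("BCDFGHJKLMNQRS", len(old))    # fresh, distinct letters
        mapping = dict(zip(old, new))
        def remap(formula):
            # replace a predicate letter only where it is applied to a variable x, y or z
            return re.sub(r"[AOP](?=[xyz])", lambda m: mapping[m.group(0)], formula)
        shuffled = random.sample(premises, len(premises))  # random premise order
        return [remap(p) for p in shuffled], remap(conclusion)

    new_premises, new_conclusion = randomise(premises, conclusion)
    print(*new_premises, "-" * 20, new_conclusion, sep="\n")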

Beyond formal logic we need to work harder to generate problem domains where symbols are not a natural/traditional extension.

Your big bucks counterpoint granted, but I was referring to things in demand, and especially those in the grey area.

Sorry to hear about the rework by your state. It's amazing, albeit not surprising, to see institutional education keep complaining about the superficiality of students' learning and performance while submitting to what is arguably a very superficial rating culture. How can students who are not professional instructors evaluate the professional performance of instructors? And if they can't, what are these surveys good for? From what I see, all the surveys do is drive instructors to be very lenient and give higher grades so as to get better student evaluations. And then institutional education starts complaining about grade inflation.

smile

Average of ratings: Useful (2)
In reply to Joshua Bragg

Re: An interesting workshop about self-assessment tools

by Joseph Rézeau -

Joshua wrote "2HF + SbF5 → [H2F]+ + [SbF6]-"

and Itamar replied "∃x∃y(((Ax ∧ Ox) ∧ (Ay ∧ Oy)) ∧ ~(x = y))
∀x(Ax → Px)
∀x∀y∀z((((Px ∧ Ox) ∧ (Py ∧ Oy)) ∧ (Pz ∧ Oz)) → ((x = y) ∨ ((x = z) ∨ (y = z))))
-------------------------------------------------------------------
∃x∃y((((Px ∧ Ox) ∧ (Py ∧ Oy)) ∧ ~(x = y)) ∧ ∀z((Pz ∧ Oz) → ((z = x) ∨ (z = y))))"

and Joseph quietly left this discussion. thoughtful

In reply to Joseph Rézeau

Re: An interesting workshop about self-assessment tools

by Lesli Smith -

Hee, hee, Joseph. I think both Joshua and Itamar are ultimately agreeing on the concept of scaffolding learning, albeit in slightly different languages. It reminds me of one of my favorite Gregory Corso poems. But I don't think this time it has quite as dramatic an ending for the two speakers.

Thanks, Joshua, for reminding me how much I forgot from my chemistry classes 20+ years ago--BUT what I do remember is how we approached problem solving and used the scientific method, etc. in those classes.  You are right on, there.  Similarly, and in my own field, I've forgotten key passages of Shakespeare that I had to memorize for class A, B, or C.  However, I haven't forgotten how to look at a passage and apply analytical skill sets to it.  That is our goal anyway, I think, right?  Ultimately, in whatever subject we teach, we want students to learn how to read critically, respond reflectively, and apply their understanding effectively when facing new subjects/new knowledge.  

What I still need to improve on in my own practice is figuring out ways to better involve students in the process of creating their own agency in relation to this learning. The idea of incorporating self-assessment gets me partway there as I ponder whether we are currently asking the right questions regarding some of the choices we are making in creating formative through summative assessments in our current systems of learning.

Average of ratings: Useful (3)
In reply to Lesli Smith

Re: An interesting workshop about self-assessment tools

by Itamar Tzadok -

Here is a question for you Lesli.

What role did the forgotten passages of Shakespeare play in your acquisition of the unforgotten ability to apply analytical skill sets to passages of Shakespeare and others?

It should be pretty hard to answer this question and not only because we don't remember the things we have forgotten. But arguably those memorised passages did play an important role. It's like watching a movie for the third and fourth time. You start noticing all kinds of details you haven't noticed before. That's because many of the other details that attracted your attention the first couple of times are by the third and fourth times familiar enough to be perceived without the need to be aware of perceiving them. And so you can turn your attention to some details that were there all those times but you were just too busy to notice them. In a similar way, you can start paying attention to ideas between the lines after the lines themselves are well absorbed. smile

Average of ratings: Useful (2)
In reply to Itamar Tzadok

Re: An interesting workshop about self-assessment tools

by Lesli Smith -

Exactly.  And  that's precisely the part of learning that can't be measured in the here and now (or maybe even quantitatively measured ever), but that doesn't seem to stop us from trying, apparently.  Through analytics we keep trying to find the secret to the precise placement of the seed, the correct amount of irrigation, and the right exposure to the sun.  Sure, there is much to be learned from studying the effects of the variables, but we may never know for certain what the totality of variables were that made up the "success" we see twenty years or so later.   Personally, I like the mystery, but policy makers would prefer a formula.  smile

Average of ratings: Useful (1)
In reply to Lesli Smith

Re: An interesting workshop about self-assessment tools

by Itamar Tzadok -

But if you can't measure it how can you assess and grade it? And yet you do assess and grade it. And what if your mysterious subjective criteria are flat wrong for the here and now of a particular demonstration of learning? Arguably you can't be critical and reflective about that because it's a mystery to you. And if that's the case then it's not a very good example of how to read critically, respond reflectively, and apply understanding effectively when facing new subjects/new knowledge, but you say that you want your students to learn from you precisely that.

The fact that we currently don't know whether we may ever know for certain the totality of variables that make up the success we see later, is not a good reason to call it a mystery, especially when considering the possible consequences of treating it as a mystery.

When it comes to assessment, personally, I like the formula, contrary to many instructors who prefer the mystery.

I've started compiling my take on essay writing at http://substantialmethods.com/subject/view.php?id=8&topic=-2. I'd love to see you challenging there my ideas and practical recommendations. smile

In reply to Itamar Tzadok

Re: An interesting workshop about self-assessment tools

by Lesli Smith -

Hi, Itamar.  Sorry for the delay.  I was offline on holiday for a bit with my kids.  smile

Yes, as I tried to explain in a post below, every time I articulate my concerns about using assessment standards, it seems to come off that I don't believe in standards.  That is ironic since one of my chief goals this summer has been to bridge US Common Core State Standards for my discipline with outcomes and benchmarking systems in Moodle.  I wholeheartedly believe in the power of tracking standards and benchmarks.  It is just when people start turning from conversations of improving craft through the use of benchmark data and instead try to use this data to somehow "package" a successful teacher that I start getting cranky.  I'm not trying to create ways for us to carbon copy one another.  We are supposed to be finding more powerful ways to help us be successful at what we do as US, not as someone else.  That is how the data should be used.

If you take issue with my use of the word mystery, I suppose the roots of my ambivalence won't strike a better cord as I blame it on the year I picked up Hal Foster's Return of the Real while trying to keep up with my husband and his fellow artists as they went through their respective MFA programs.   I was thinking of this search for the "Real" when I used the word mystery.

Put another way, various writers, even writers who teach, have often discussed the way that taking apart a piece of literature can be rather like dissecting a frog.  The learning gained in the act of the dissection is very valuable (and may keep other frogs alive longer in the future), BUT for all practical purposes, this particular frog is dead.  

I guess it seems to me that too often policy makers look at the data with the frame of trying to re-animate that "frog," instead of with the frame of how it can help other frogs in the future, which is futile at best, and like Frankenstein at worst.

I took a look at your site and am intrigued, but will have to come back to comment further after I've had more time to review it without little ones interrupting me.  smile

In reply to Lesli Smith

Re: An interesting workshop about self-assessment tools

by Lesli Smith -

Argh.  I surpassed my editing time.  I would correct "cord" if I could.  It should be "chord."   sad

In reply to Itamar Tzadok

Re: An interesting workshop about self-assessment tools

by Lesli Smith -

Hi, Itamar.

Regarding feedback on your essay course so far, I really like it.  Have you seen this site, "Thou shalt not commit logical fallacies"?  It's pretty funny.  I think I came across it when a fellow Moodler tweeted it a while back, but I can't remember for sure.  The Moodle geek in me is loving the course format.  Is that the new course format in 2.3?  Nice!

Now, just so Tim doesn't get upset with us for hijacking his topic, I'm going to bring it back around to the matter at hand: self-assessment strategies, in this case, for essay writing.  For me, I've been trying to master the many variables of outcomes and roles permissions in the hopes that maybe someday I might have a good handle on how to create a peer-critique structure that would open up those types of assessment tools to students.  The workshop module already offers quite a bit in this arena, but I think we can push the envelope more.  Of course, first one has to have a really good understanding of the math behind all of these assessment tools and how it all works together.  The math part has been my Achilles heel in this endeavor.  Looking forward to hearing how you are thinking about it for this particular scenario.

Average of ratings: Useful (1)
In reply to Lesli Smith

Re: An interesting workshop about self-assessment tools

by Itamar Tzadok -

Have just seen the fallacies site (thanks for sharing!). Indeed pretty funny and nicely designed. Not sure, though, that I agree with the opening statement.

A logical fallacy is usually what has happened when someone is wrong about something.

Surely, we can usually be wrong without making any logical fallacies. We just need to start with being wrong about some things, and then conclude from those things many other wrong things (along with occasional right things) by perfectly valid reasoning. Logical fallacies do not entail wrongness, just as valid reasoning does not entail correctness.

The course format has not been released yet. It currently works on 2.2.2 and will be adjusted to 2.3 in a month or so. I've been asked to release the current version and plan to do that next week.

What is so sacred about essay writing? (Here's a retrospectively amusing anecdote.) Put another way, what do we need it for? If it is just a written expression of critical reasoning, then maybe it isn't good for much. As hinted above, valid reasoning is not necessarily more useful than logical fallacies (or, more broadly, rhetoric) for the good life (a long-standing debate from the dawn of rationality). If it serves some practical purpose, then often using templates and filling in the blanks does the trick. But for the latter not much assessment is needed.

It seems to me that for self-assessment of essay writing the problem is not so much the self part as the assessment part, which in turn is a problem of the self which manifests itself in our inclination to preserve the right to make subjective human errors just to remind ourselves that we are better than other things.

That said, peer review (where the peer could be one's other self) is important for engagement, and engagement is important for learning. I will be adjusting and posting more of my experimental work for essay writing, and we could take it from there. smile

In reply to Itamar Tzadok

Re: An interesting workshop about self-assessment tools

by Tony Gardner-Medwin -

Itamar - re 2 July 2012, 2:25 PM

There's lots of interesting debate to be had round the relationship between rote learning, knowledge and understanding. I didn't mean to imply that rote learning was itself a bad thing. I am eternally grateful I learned my multiplication tables by rote. But I also understand them - so if I hesitate over whether I am sure that 7x12 is 84, I can immediately relate this to other things - e.g. that this is 12 more than 72 which I am confident is 6x12, so it  has to be right. A parrot can rote-learn, but we expect students in most contexts to do a bit better. We shouldn't be giving high marks to students who can simply reproduce textbook paragraphs.


Assessment of understanding surely isn't hard - a matter of asking some of the almost limitless questions whose answers will follow from a textbook paragraph, but that may never have been encountered in quite the same form. That's why I regard it as so important in self-assessment for students to think how their answer to a Q relates to other pieces of knowledge. If you ask a medical student "Inulin injections are used to treat Type 1 diabetes. T/F?" you are not just finding out if she (incorrectly) associates these names, but you are trying to stimulate thinking about what has gone wrong in diabetes 1&2, what the pancreas normally secretes and how the hormone name is similar to something - yes, inulin - used to measure kidney glomerular filtration. Any of these points might be a question that could arise in an exam. Understanding is a matter of inter-relating correctly all the different pieces of knowledge that you have.

In reply to Tony Gardner-Medwin

Re: An interesting workshop about self-assessment tools

by Itamar Tzadok -

Here is the problem Tony. You say

Understanding is a matter of inter-relating correctly all the different pieces of knowledge that you have.

If what counts as correct is not just your personal whim, please give me a list of all the inter-relations I should master, or a way to systematically extract them from the knowledge you do make available, and let me learn them the way that works for me, even if that way doesn't work for you and even if it's mere parroting. Otherwise you only measure my ability to guess what you think is right. And that's wrong.

At any rate, I don't think your multiplication tables example works. If I hesitate over whether I am sure that 7x12 is 84, I may be able to immediately relate this to other things - e.g. that this is 10 more than 74 which I am confident is 6x12, so it  has to be right. It's right but I'm wrong and that's because confidence has nothing to do with it. smile

In reply to Itamar Tzadok

Re: An interesting workshop about self-assessment tools

by Tony Gardner-Medwin -

Itamar - Two misconceptions (7x12 is 10 more than 6x12, and 6x12=74) can indeed combine to appear to give justification for a correct belief. Epistemologists name this paradox after Gettier. But more commonly errors do not cancel, and the process of using related pieces of knowledge to test something at issue can work well - providing justification or reason for doubt. Your example surely illustrates the merit of asking Qs from different perspectives: your two misconceptions that arise as you check the hypothesis 7x12=84 are two questions that might be asked directly in a future test. If you recognise that these themselves are unreliable ideas - wherever they come from -  and check them, you will benefit. There is an infinity of Qs like this that you could base on tables up to 12x12. You are welcome to derive your own way, perhaps different from mine, of developing what I would call the 'understanding' that enables you to answer any of them correctly. However you do this, it isn't going to be just parrot learning - because I can almost certainly think up an unfamiliar Q that would stump any parrot (e.g. Is 6x4 two times 3x4?).

In reply to Tony Gardner-Medwin

Re: An interesting workshop about self-assessment tools

by Itamar Tzadok -

Not quite a Gettier problem, because such a problem requires two levels of justification and transitivity, and in my example there is no justification for 6x12=74 (confidence doesn't count as justification). It's rather basic logic: true statements can follow validly from false ones.

At the end you seem to agree with what I've been proposing. If understanding is anything that enables me to answer your questions correctly, then we can safely dismiss the word 'understanding' as unhelpful for measurement and admit that ultimately we measure correct answers. Your responsibility as an instructor is to ensure that the correct answers are available to me, one way or another (or preferably in multiple ways), when I prepare for the exam. If the exam requires me to make inferences to answers that were not available, you need to make sure I explicitly practice making such inferences in the same problem domain.

There is no need to underestimate the extent of parrot learning and assume that any parrot could not possibly answer 6x4=24, 3x4=12, 2x3x4=24, 2x12=24, Yes. But then again, by your definitions you may be inclined to call such a display 'understanding' and look for the human ghost in the parrot machine so as to preserve the elusive distinction between parrot learning and "real" learning ("understanding"). smile

In reply to Itamar Tzadok

Re: An interesting workshop about self-assessment tools

by Tony Gardner-Medwin -

Itamar - At least we agree that students should be able to do more than just recognise or repeat utterances they have encountered. So rote learning of the expression of facts (as a parrot might achieve, or you might achieve in a language you don't understand) is not enough. The scary thing with many students is that their ability to rote-learn is phenomenal, and they can be tempted to rely on just that. Medical students who cram for exams that way can even pass with little real understanding of what they have learned. This is short-lived knowledge that is no foundation for a career as a doctor, and it is dangerous when students pass exams that way. What is missing is the network of relationships between all these things: A is an instance of B; B results in C, etc. If you lay down such a network, then you can use the relationships, even after many of them have been forgotten or don't immediately come to mind, to arrive at or check the reliability of pieces of knowledge that you need to use. If you don't want to call this 'understanding', OK - but you need it, and it seems to me the essence of understanding. We can develop it and assess it by asking questions that have different perspectives on the issues, to which the answers cannot all have been simply rote-learned. All this has little to do with Moodle, so I guess it shouldn't really be here.

In reply to Tim Hunt

Re: An interesting workshop about self-assessment tools

by Frankie Kam -

Hi Tim

Thank you for introducing me to PeerWise! What a wonderful collaborative learning and assessment tool/website!! Thanks to your post, I've already created my instructor account, and I can't wait to create my own institution account/page.

Frankie Kam
Stamford College Malacca
Malaysia

In reply to Tim Hunt

Re: An interesting workshop about self-assessment tools

by Lesli Smith -

Hi, Tim.  Thanks so much for posting these resources.  At the moment I am seeing all kinds of interesting intersections between tools and pedagogical theory/practices as they relate to figuring out better ways to put the learner at the center of the ways courses and assessments are framed.  

With regard to PeerWise, back in the day when I was fielding various related versions of "does this tool integrate with Moodle" questions, I was starting to get asked about PeerWise fairly often.  Those who asked used it and really loved it.  I looked at it briefly enough to see that there were some key overlaps with the Workshop module at the time, but that there were many things it offered that couldn't yet be found anywhere else.

Lately, I've been concerned about certain intersections between personalized learning and self-assessment opportunities, but I've had difficulty articulating those concerns.  Luckily, a blogger I follow has articulated them for me here: 

The Wide Space Between Personalized Learning and Personal Learning is Choice of Action by Mary Ann Reilly

This question of agency is, I think, at the heart of how we really use the tools you all have created for us in ways that actually create learning spaces, versus just spaces for rote learning and regurgitation.  

Changing gears a bit (and I know you are the Quiz master, not the gradebook forum master), there is an interesting question being asked over in the gradebook forum on how we can better involve students in active self-assessment of their individual grades, perhaps using comments in a gradebook feedback report to track these conversations.  The how interests me a little bit here, of course, as the devil is always in the details, but I'm also interested in the why and the what.

I'm therefore curious to get the feedback of the current crew participating in this discussion on the different types of assessment feedback we have or will have available to us soon.

Thanks again for starting a really great conversation! 

Average of ratings: Useful (2)