Please help us with your input:
* I find it very difficult to sustain this in my classroom when I can't be accountable, with an explicit description of how scores are composed that students can validate for themselves.
* A second issue is getting a bit more control over scoring adjustments when compositing summary grades -- perhaps a 'bonus' override option would be terrific.
At present it is not easy (or even possible) to explain how grades are arrived at. This makes Workshop activities virtually worthless in settings where the teacher/trainer may be asked to explain the results achieved by a participant. IMO this represents a greater barrier to the use of this module than its acknowledged complexity.
In the Workshop module before Moodle 1.5 it was possible to set the parameters used; however, in the current "Comparison of Assessments" setting there is no explanation of the underlying logic. For Workshop to be viable in future, at the very least there must be a clear and detailed explanation of the logic employed. As Steve remarked in the post above:
"* I find it very difficult to sustain this in my classroom when I can't be accountable, with an explicit description of how scores are composed that students can validate for themselves."
Some other things:
- I would like to see the option of setting the assignment type to mirror the on-line text / upload a single file type in the Assignment module (retaining the option to attach additional files if necessary - as in the current Workshop).
- Split view - I don't have a suggestion as to how this might be improved, but I'm not altogether sure it is that user friendly - if I recall correctly this approach is not used elsewhere in Moodle.
- If the system of "keys", i.e. the different brackets used to denote who graded/assessed submissions and assessments, could be made clearer, that would help immensely - perhaps this is just a layout issue.
- Negative weightings - are they still experimental?
- One of the things I really like in the Workshop module is the assessment form. I've long felt that this should also be a feature of the Assignment module (not for peer review, obviously). Learners having sight of the assessment criteria, and having the opportunity to see in advance the actual form used to deliver the assessment, is always very positive in my experience. In addition, this helps promote consistency and saves time for the assessor. I have no doubt that this will remain in the Workshop module, but if it could be developed so that it could slot into the Assignment module too, that would be great!
Finally, I'd also like to say Thanks to Ray Kingdon for his original work and (even though I've bashed some of his refinements in my comments above) Gustav Delius for his work prior to Moodle 1.5.
I'm arriving a little late to this discussion, and I'm relatively new to Moodle, but I have already used the workshop very extensively (and would hope that all the features are retained, because I use practically all of it in various ways). The students love it, after initially struggling a little. Assessment after the group work, however, is an issue.
Here is what I would like to see, it's a small thing:
Would it be possible to create the option of hiding the "Specimen Assessment Form"?
This would offer great possibilities for using the workshop for more self-assessment. Students could compare their submissions with what they are offered in the assessment form. It would save teacher time (allowing it to be focused on other issues), because teachers could just check the plausibility of high self-assessments. Conversely, it would allow students a real self-assessment on reflective/critical-thinking questions which are not suitable as short-answer questions.
This really would be a great boon for distance learning, which we are trying to introduce here in the Caucasus to qualify lecturers. They have to work by themselves, in non-formal arrangements (i.e., not in the same cohort) with very limited trainer time, which we therefore would like to focus on the most important stuff.
This was a very quick outline of some ideas I had about new helpful features a workshop module might have. I will be happy to clarify and expand on any parts that you might not understand.
Basically, most of those features would be useful for large classes where the professor might need a quick way to assess assessments, so to speak.
Currently, if the workshop is assigned to be worth credit, and you want to use the peer-evaluation/critique feature, then this function must be activated for both the instructor and the students who evaluate it. However, often instructors want to be the sole individual who determines the grades, yet require that students provide feedback to one another.
This is not possible with the current module.
For the student evaluations, you can set a low value for the students' scoring when assigning a score to the assignment. However, there isn't a way to neutralize their score altogether.
In addition, because the instructor is able to write the "leading" questions that prompt student answers, these questions may not lend themselves to the "Activity Credit Yes/No" options that accompany each criterion. This can be confusing to students. Yet, if the instructor wants the Workshop to be worth credit, those MUST appear after each evaluation statement, because student grading cannot be turned off without disabling the peer critique function altogether.
By separating the teacher and peer critiquing and allowing the scoring to be activated for the teacher (i.e. worth a grade/score) yet deactivated for the students (allowing critique, but not able to assign credit), this problem would be solved.
As regards how I use it and would like to use it...
I'd use the current workshop module if I was able to run practice exam papers with it - I currently can't, as it seems to want just one monolithic assignment. My exam papers (for AQA Specification A Psychology) are mainly short-answer ones, so I need to be able to set a kind of short-answer quiz, then have the students read and assess each other's short answers.
I also have one larger essay question in each exam section. This is assessed on two dimensions: Knowledge & Understanding, and Analysis & Evaluation. Depending on the unit, these are given varying grades, the final mark being the two added together.
The complex bit is that these two dimensions have several sub-elements. For example, the Knowledge & Understanding dimension is made up of Content, Detail & Accuracy, Organisation & Structure, and Breadth/Depth of content & synoptic possibilities.
The students (and me, for that matter) need to have a rubric for each of these sub-elements, with one description covering a range of marks. The overall mark for the dimension is then calculated as an average. I'm aware this is a little complex, so here is the marking grid I have:
| Marks | Content | Detail and accuracy | Organisation and structure | Breadth/Depth of content and synoptic possibilities |
|-------|---------|---------------------|----------------------------|-----------------------------------------------------|
| 15-13 | Substantial | Accurate and well detailed | Coherent | Substantial evidence of both |
| 12-10 | Slightly limited | Accurate and reasonably detailed | Coherent | Evidence of both |
| 9-7 | Limited | Generally accurate and reasonably detailed | Reasonably constructed | Evidence of both |
| 6-4 | Basic | Lacking detail | Sometimes focused | Little evidence |
|  | Wholly/mainly irrelevant |  |  | Little or no evidence |
I need to be able to record which rubric item in each of the columns the student is achieving, then choose an overall grade for that dimension. Note that some of the rubric descriptors cover a wide range of marks. I need:
- A way of constructing such a rubric table within the workshop2 module so that I can specify what the range of each rubric statement is, rather than a separate statement for each mark.
- A way of presenting this table at grading time so that the peer or teacher can clearly see how the rubric elements add up to make the overall score for that dimension.
- A way of allowing the relevant rubric statements to all be recorded as well as the overall mark for that dimension.
- A way of assigning more than one rubric table to each essay (essays as either the whole workshop, or as a sub-section of a workshop e.g. exam paper)
- A way of re-using the tables across different sections and assignments.
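To illustrate the arithmetic described above - sub-element marks averaged into a dimension mark, and the two dimension marks added for the final mark - here is a minimal sketch. The band values chosen are hypothetical examples, not part of the AQA scheme:

```python
# Illustrative sketch (not Moodle code): computing an essay mark from
# rubric sub-element selections. All mark values are hypothetical.

def dimension_mark(sub_element_marks):
    """Average the marks chosen for each sub-element of a dimension."""
    return sum(sub_element_marks) / len(sub_element_marks)

# Knowledge & Understanding: one mark per sub-element (Content,
# Detail & Accuracy, Organisation & Structure, Breadth/Depth).
ku = dimension_mark([14, 12, 11, 13])   # -> 12.5
# Analysis & Evaluation sub-elements (hypothetical values).
ae = dimension_mark([10, 9])            # -> 9.5

final_mark = ku + ae                    # the two dimensions are added
print(final_mark)                       # 22.0
```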
I would also, separately, like to use the workshop for a large piece of coursework. This has a large number of sections (15-ish), each of which would need a simple rubric marking scale (please, not limited to just 5 items!!), and would run for several months.
As time progressed, more and more of the sections would be completed (although they would actually be written as a continuous document), and the student would signal that completion had happened using a 'submit this section' button so that the peers and tutor could mark it. At the same time, revisions may have been made to earlier sections, so ideally a list of sections would be displayed for the student, with radio buttons saying 'please mark this section', so that they could request marking of several at once. At that review point, the peers/tutor would see the whole document displayed along with the marking rubrics that the student had indicated were to be used. The idea is to make possible a kind of continuous re-submission and re-marking of selected parts.
Again, I hope this is reasonably clear.
I'm with matt.
Our school is using rubrics more and more. I want to be able to click the rubric cells online and have moodle calculate the score and give the feedback to the students.
I want the rubric displayed in table form. Like Matt showed above.
Hope you can deliver.
A table to change scores will be in the new light module; a plug-in to create your own calculation methods will be in the advanced module.
A name suggestion would be "Peer Assessment" (peer).
Hmm, that could be confusing, as we are about to release a Peer Review module based on work done by Virginia Tech and research done by Dr. Beth Eschenbach. (Originally announced here last August:
Our work discussed here:
If folks would like to register at http://cdc.humboldt.edu/collab and let me know, I can add you as teachers to the peer review course to demo it.
The module lets you peer review potentially any Moodle activity (currently internal, assignment, and resource), with optional rubrics based on the quiz question types.
It is much simpler than Workshop, with hopefully less of a learning curve.
Why not leave Workshop titled as it is? That kind of captures the more advanced and complex nature of the module, IMO.
Another idea: Advanced Assessment, which would let the title tell folks that this is a complex and powerful tool?
Looking for solutions to Workshop troubles, I only arrived at this post now: very, very nice, that idea of having peer review combined with rubrics Moodle-wide.
I think that a repaired or new workshop module could stay in place for stepwise peer-review training before you allow students to do peer review on other topics.
I see two streams in Rubrics-land:
- The true believers, who try to create real rubric schemes and try to describe the behaviour at different levels without cheating with the words "less" and "more"...
- The pragmatists, who try to describe a set of qualities and use a scale for every quality: to avoid confusion with real scales they use pictograms such as stars or stamps. The interesting part of this approach is that you can create a "star-profile description" for every job, and students can compare their own "star profile" with that of their preferred job (...and work on their competencies to meet their ideal...)
- It would be nice to welcome both options in your new module.
Again Moodle will set a new standard for education support.
I would like pupils to be able to assess work using the scales that we have used in the other assignments in a course, but allowing the pupils to use rubrics or assessment criteria to help them to achieve these assessments.
Pupils need to be able to restart assessments. I have had problems with pupils starting to assess work, then finding that they do not have the correct software to open one of the files. They were unable to go back to the assessment, and when I deleted the assessment to allow them to re-do it, due to the random allocation of assessments they often got new pupils' work to assess. When working outside lessons, pupils often have short blocks of time that they can use, which may not allow them to complete an assessment but can allow them to preview files or assess some of the independent criteria.
I think you may have mentioned these points, but this is what we would like to see:
I like the idea of an option for manual assignment of submissions for peer review. That will be handy.
We would also like the student to initially see the entire class's submissions for review (I believe the current limit is 20, and we often have more than 20 students in a class. I know that after they assess some, they will be given more randomly, but we prefer to just see them all upfront.)
Also, we like to use groups, and would like group members to see and assess only the submissions of their fellow group members.
Thanks so much!
- There will be a manual assignment.
- The current limit is 100 ... no idea why, but I think we will cancel that limit in future.
- What we are discussing is also showing the history and working process (links into forum, wiki, ...)
- The extended version will have group-methods
You are working in K12? The easy version will be very handy for beginners!
What timescales do you have in mind for the Workshop development?
I am eager to work with the new Workshop. When can we expect it to be ready?
Second, I also think it's one of the tools that needs the most work in terms of usability for newbies. In some ways it's very confusing in its instructions. But it also has the most potential for excellence!
Here's what I'd like to see:
1. Create a new workshop based on existing workshop.
2. Store criteria sets on the server, and then reuse one or more of those sets in a new workshop (4 out of the 10 criteria for any given project in my course are identical; for some instructors the exact same rubric is used throughout the course).
3. Upload Workshop scoring criteria. Maybe as an xml file? This would be easier for me because I keep all this stuff on my hard drive.
4. More sophisticated/unusual features in the build screen hidden by default. WebCT does this pretty well, by grouping some of the less-used features together in a hidden element that you reveal by clicking (though I absolutely hate that you can't change the default to hidden or revealed). This would certainly make things much easier for new users.
5. I'm not sure about this one, as I just noticed it today (and I posted a question about it earlier on this forum too), but it seems that if a student's submission has been scored by the instructor but NOT scored by another student, the submission score penalizes them as if they had received a ZERO score from the other student. Is this a problem with the tool or a problem with my understanding?
Allow me to introduce myself. I am Elena Hernández, Faculty Professor at the Department of Chemical Engineering of the Universidad de Guadalajara, México. I am a recent user of Moodle in my classes and I want to use the Workshop Module.
I congratulate you on developing a better Workshop module for the 1.6 version. Unfortunately, I believe the Moodle version we have at the U is 1.5. More than suggestions, I have a request. I have read the Professor Manual for the 1.5 version (written in Spanish) and I still have some questions about the distribution of the deliveries, how grades are assessed and, most importantly, what happens to a student who did not send his assignment: is he going to get other people's deliveries to evaluate? Is he out of the "pool" where the deliveries are distributed?
I would appreciate it if you could point me to where I can get this sort of info.
Thanks a lot!
Greetings from sunny Guadalajara, México - the land where the real Tequila is made!
I would like to see extensive user-interface testing with real human students who have never seen workshop2, workshop1, or the peer module developed at the Univ of VA. I have friends who are UI engineers. One good UI engineer can do wonders.
I think a good way to explain the flow of workshop2 and/or the peer module would be this:
- student turns in work, hits submit button
- computer feedback says thanks, sends email receipt to student & prof.
- computer says "I will take you through some steps to evaluate other work"
- student reads a simple screen and clicks next
- each screen has one bit of information or question
- a wizard next, next interface is given to the student
- in the end a "preview" is shown of their assessment of the teacher sample / peer work
- final submit button is pressed
- feedback says thanks and email receipt sent to student & prof.
- feedback given as to being done or not done (n more assessments needed)
- time passes
- student receives email that their work has been judged by a student/teacher - the thing all kids love: "how many times has your work been viewed"
- student follows URL in email or simply visits moodle
- student receives welcome and wizard interface
- student is then taken on a tour of how others viewed/scored their work
- student revises and creates perfection!
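The wizard-style, one-screen-at-a-time flow proposed above can be sketched as follows. The step names are taken from the post; the implementation itself is purely illustrative, not a design for the actual module.

```python
# Minimal sketch of the proposed wizard flow: one piece of information
# or question per screen, advanced with "next". Step names come from
# the post above; everything else is hypothetical.

STEPS = [
    "submit work",
    "email receipt to student & prof",
    "intro: 'I will take you through some steps to evaluate other work'",
    "one question per screen (next, next, ...)",
    "preview assessment of teacher sample / peer work",
    "final submit",
    "receipt + progress feedback (n more assessments needed)",
]

def run_wizard(steps):
    """Yield each screen in order, mimicking a next/next interface."""
    for number, screen in enumerate(steps, start=1):
        yield f"Screen {number}: {screen}"

for line in run_wizard(STEPS):
    print(line)
```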
Sorry this is a late suggestion. Teaching 7th grade leaves me no free time to help with moodle while school is in session.
I love the workshop module. But the layout of the workshop screen in version 1.5 can cause complete frustration for most students.
ps: see the proposed peer module below
Due to problems with the current Workshop module, including restoring courses that use it onto Moodle 1.6, and because I don't want to introduce staff to a deprecated feature, I was hoping to disable the workshop module. But if I did, when could we expect this new module? I guess we wouldn't see it by the start of the Autumn term?
I did not think it would be in 1.8 but it is.
So will it be in future releases?
Hope we can finish it by 12/2007...
Did you really mean to say December 2007 -- which would be more than a year from today? TIA for clarifying this.
I hope to apply it to a paper system we use - attached "IB_essay_grade", which has five 1-5 criteria, five yes/no criteria, a mark which is not calculable from these criteria, and a grade - while allowing peer assessment in an organised and managed way. The existing workshop seems to do this well.
However, I agree with Jeff Rowe above [suggest wizard based UI for workshop2 and peer module] some simplification for students coming to the workshop would be very welcome. I can see many of my students failing to follow through the present set-up.
So please maintain core functions but make the workshop more user-friendly to the student.
Workshop has great potential, but I think the real basic hurdle is the underlying learning-theoretical framework...
In a few words... how do I use the workshop? Can you give me some examples of learning paths, assessment models and so on for different subjects? How can I use it in cooperative learning?
I teach humanities and I am particularly interested in constructivist paths involving peer review.
I am running some trials with my students, and my difficulties lie in:
* explaining assessment criteria to students
* outputting a fair grade when the elements of evaluation are many and varied: the assessment comparison setting (strict, fair, lax and so on) doesn't work as a fair defence against "monkey" answers (i.e. random ones). At the moment the degree of agreement between the student's and the teacher's assessment is based on the differences between the scores on individual elements (actually the squared differences are used). What is needed is a system that compares each element of the peer's assessment with the teacher's assessment, not just one that converts the mean of these differences into a meaningful grade.
The "Comparison of Assessments" option is a rather complex way of allowing the teacher to control the overall outcomes of the workshop.
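The comparison logic described above - per-element squared differences between a peer's scores and the teacher's, averaged and converted into an agreement grade - can be sketched roughly like this. The strictness factors and the conversion formula are illustrative assumptions, not Moodle's actual constants:

```python
# Rough sketch of the described "Comparison of Assessments" idea:
# agreement between a peer's assessment and the teacher's is derived
# from the squared differences of the element scores. The STRICTNESS
# factors below are hypothetical, not taken from Moodle's source.

STRICTNESS = {"very strict": 1.0, "strict": 2.0, "fair": 3.0, "lax": 5.0}

def grading_grade(peer_scores, teacher_scores, max_scores, comparison="fair"):
    """Return a 0..100 agreement grade from element-by-element differences."""
    diffs = [
        ((p - t) / m) ** 2                 # normalised squared difference
        for p, t, m in zip(peer_scores, teacher_scores, max_scores)
    ]
    mean_sq = sum(diffs) / len(diffs)
    factor = STRICTNESS[comparison]        # stricter = smaller tolerance
    return max(0.0, 100.0 * (1.0 - factor * mean_sq))

# Identical assessments get full agreement credit:
print(grading_grade([3, 4, 5], [3, 4, 5], [5, 5, 5]))  # 100.0
```

Note how a random ("monkey") assessment only loses marks in proportion to the *mean* of the squared differences, which is exactly the poster's complaint: a per-element comparison would have to inspect each entry of `diffs` individually rather than collapsing them into one mean.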
1. Instructors need to be able to upload their own files when scoring and giving submission feedback. I had heard that this was planned, but can't seem to find the post that suggested this.
2. I'd love to see a way to quickly QuickMail or Message a student from the Assessment form
3. I'd really really really like a way to download all student submissions from the Workshop's page a la WebCT CE 4
4. I love the ability to store feedback on the assessment form, but I personally need to be able to use HTML or plain text.
5. Again, with the idea of storing feedback, I'd like to be able to store feedback on the self/peer assessment review feedback page (that is, when I am commenting on or assessing a student's peer or self assessment)
By the way, I do have a fairly talented part-time PHP developer that I would love to assign to this project if you're looking for an extra brain to help you move this forward. Just e-mail me!
I would like to stress the possibility of assigning reviews manually - not randomly. I want everyone to review two peers, but I want to determine which ones. That would be really helpful for me (and maybe others).