Does anyone know if there is a solution for tracking the time students spend studying lessons?
Thanks!
Shimoda
Implant a chip in a student?
Not that unreal: in one Dutch pub it is the latest "hype" (or "camp": what can you do next, after all those eyebrow rings and tattoos?) to voluntarily have a chip implanted in your arm with your pub account on it, scannable by the waitress.
(I am not joking this time.)
In the past we had AICC. You could register that student number 5 took 0.3 seconds more for question 234.b part 7a. What information can you extract from that piece of data?
What I really want to say is that, from a social-constructionist point of view, you are moving in the wrong direction: over-control and double-checking. You take the responsibility out of the hands of the student... worse than old-fashioned classrooms.
Even put positively: what help could you give this student with that information? (Products in the eighties, like Authorware, had all this built in, AND NOBODY USED IT.)
Moralist remark: spend your time on developing tools that help students visualise their thinking (in a popup window?) when they do these lessons or readings.
"Show me in one graph what my students did last 2 weeks in my course instead of all these logs I must study", is what teachers ask.
Students along the vertical axis, activities along the horizontal, cell colour as a time indicator (with a colour legend in the corner, like an altitude map?).
(or something like the good old GISMO graphs?)
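The chart Ger describes can be sketched in a few lines. This is only an illustration, not anything Moodle provides: the student names, activities, minute counts, and binning thresholds below are all made up, and a real block would aggregate the minutes from Moodle's logs.

```python
# A rough text rendering of the "students x activities" time heat map.
# All data here is hypothetical; in Moodle the minutes would come from logs.

LEVELS = " .o#"  # colour legend: none, short, medium, long (like altitude bands)

def colour_level(minutes):
    """Bin minutes spent into a legend index (thresholds are arbitrary)."""
    if minutes <= 0:
        return 0
    if minutes < 10:
        return 1
    if minutes < 30:
        return 2
    return 3

def heatmap(times, students, activities):
    """Return one legend character per (student, activity) cell."""
    return [[LEVELS[colour_level(times.get((s, a), 0))] for a in activities]
            for s in students]

times = {("ann", "quiz1"): 45, ("ann", "forum"): 5, ("bob", "quiz1"): 2}
for student, row in zip(["ann", "bob"],
                        heatmap(times, ["ann", "bob"], ["quiz1", "forum"])):
    print(f"{student:5}" + " ".join(row))
```

A graphical block would replace the legend characters with cell colours, but the binning step is the same.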
Dear Audun
Thank you for this screenshot. It looks good for tracking student participation. I am presuming it is part of Statistics; is that right, or is it a block, as Stuart suggests?
All the same I think that I need a display of *studying time*.
There are people here who may want to evaluate Moodle against another system to see which one makes students study longer, in terms of the number of minutes per week that students are coerced into studying.
(Largely in answer to Joseph Rézeau: leaving aside whips and electric shocks, failing students is probably the most common way of making sure that students study, and my institution uses that method increasingly, especially since the introduction of a requirement to achieve a stipulated grade in a recognised test (TOEIC™) as a condition of graduation. I much prefer the use of an LMS over a TOEIC™ score or final exam. A judiciously used LMS can spread out the pain: students can fail themselves, or study, one mini-test at a time.)
So some sort of average logged-in time (minus a final, or any other, timeout), or the average total time on weekly tests, would be good.
I would not use it to evaluate students. Having said that, there are some students who just give random answers to tests in a few seconds, whereas others are simply bad at tests. It would be nice to see at a glance which students are doing badly, and doing badly very quickly.
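One way to approximate "total logged-in time minus the final timeout" is to sum the gaps between a student's consecutive log events, capping each gap so that an abandoned session is not counted as study time. A minimal sketch, assuming the log has already been reduced to a list of event timestamps per student; the function name and the 10-minute cap are my own assumptions, not anything Moodle provides:

```python
def estimated_study_seconds(timestamps, cap=600):
    """Sum the gaps between consecutive log events, capping each gap at
    `cap` seconds so a long idle period (or the final timeout) does not
    inflate the estimate. Timestamps are seconds since any epoch."""
    ts = sorted(timestamps)
    return sum(min(b - a, cap) for a, b in zip(ts, ts[1:]))

# Three clicks a minute apart, then a two-hour gap, then one more click:
# the gap is capped at 600 s, so 60 + 60 + 600 = 720 seconds.
print(estimated_study_seconds([0, 60, 120, 7200]))
```

This is of course only an approximation of the approximation: it says nothing about what the student was doing between clicks.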
Thank you again; as always, I remain in your debt for your course organiser javascript Christmas present, which I am still using as an essential part of course management.
Tim
I suppose that was a slip of the keyboard and you actually meant to write "flailing* students is probably the most common way of making sure that students study". Spare the rod and spoil the child, etc.
Joseph
*to flail = to beat someone or something violently, usually with a stick [LONGMAN Dictionary of Contemporary English]
I like that chart very much. I have some similar charts as well, which receive "real-time" usage indications for resources and "real reading time" (Ajax-based tracking) on chapters. (Have you heard of "Moodlus Oculi"?)
It was presented at the French MoodleMoot 2007 in Castres.
I fear this is still a rather theoretical approach, unlike yours, but it may be interesting to mix the two... maybe.
Hi,
In all the years I've been working with another CMS, a common question has been to find out how much actual time a student spends on each page in a module (Book in Moodle, Content Module in WebCT CE, Learning Module in WebCT Vista, Learning Unit in Blackboard). And the answer is basically "can't be done" given the generally "stateless" nature of the Web. The question I will often toss back is "does anyone know how much time a student spends reading each page of a printed textbook?"
Short of sitting next to a student, there is no way to get actual time spent on a page but there are some indicators available in CMSs as mentioned in other messages. For example, in WebCT we can look at Student tracking that shows when a student accessed each page in a content module. If we start seeing very short intervals between the entries for each page, it's a good indication that the student was "clicking through".
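Bob's "very short intervals" heuristic can be sketched directly. This is a hypothetical illustration (the function name and the 10-second threshold are my assumptions), working from a list of page-access timestamps for one student:

```python
def looks_like_clicking_through(access_times, min_seconds=10):
    """Flag a student whose median interval between page accesses is
    shorter than any plausible reading time (threshold is arbitrary)."""
    ts = sorted(access_times)
    gaps = sorted(b - a for a, b in zip(ts, ts[1:]))
    if not gaps:
        return False  # fewer than two accesses: nothing to judge
    median = gaps[len(gaps) // 2]
    return median < min_seconds

print(looks_like_clicking_through([0, 2, 5, 7, 9]))     # rapid clicks
print(looks_like_clicking_through([0, 120, 300, 480]))  # plausible reading
```

Using the median rather than the mean keeps one long coffee break from masking an otherwise rapid click-through.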
But we need to be more realistic in that students read and comprehend at different rates. For those of us "older farts" with bifocals and trifocals it takes longer to read something on a screen than when we were much younger. My concern is not so much "how long?" but "did they comprehend when finished?" For that we can use all the other typical assessment and evaluation techniques.
Cheers,
Bob
These kinds of questions turn up every so often, and usually, as with Tim's comment above, because the requested 'features' are needed to convince a non-technical, non-teaching bureaucracy rather than for any real benefit to those teaching or learning (often these suggestions will actually harm the learning experience).
Rather than just shoot down each request individually as it comes in, would it be possible to collect together some of the very real online teaching and learning experience present in the community to produce 'position papers' that explain why such requests arise and why they rarely help?
I'm not saying that every bureaucrat is susceptible to reason, but having a page to point to with a comprehensive canned answer makes you seem less negative than simply saying "no, that's a stupid idea" or "that'll never work".
So would this crazy idea work? Is collecting the threads addressing these in a wiki a good start? Does anyone have suggestions for commonly asked questions that require a polite "no"?
Dear David, Bernard
Thanks very much for the hack above, Bernard. I will definitely be installing it at the end of term.
Sorry, David, I should not have blamed the bureaucrats. I am interested in the information as well, and I do not think that it is a stupid idea.
I think that from a social-constructionist point of view it probably is a stupid idea: it smacks of control and behaviourism. For technical reasons too (like the uncopiable test), perfection is impossible to achieve.
I am not a social constructionist (perhaps a behaviouro-social-constructivist?). I think it is possible to obtain fairly good, approximate data on the amount of time students spend studying online.
The social constructivists amongst us should probably start the wiki you recommend, but those of us who are not would be advised to share hacks. I know that the mainstream of the Moodle community is not investing time in implementing this feature, but I am glad that the Moodle community is pluralist (is that the right word?), and that Bernard is investing his valuable time in this way.
Tim
If you make a version of this function that the student can use to reflect on his own time-management behaviour, it could become a nice social-constructionist tool. (This student is also the only one who can interpret the meaning of these values.)
A nice idea for a student block? Like this one: http://www.ucc.vt.edu/stdysk/TMInteractive.html
Ger,
Nice idea! It would be good if student users were presented with these sorts of statistics together with class averages.
I think there is no way to whip students into studying more, since ultimately the measures of student studying time are not 100% (perhaps only 80%) reliable. Really good students might read the materials offline and do the quizzes in the twinkling of an eye. If they occasionally get the quiz answers wrong, they should not be penalised as if they were answering the quiz at random.
However, just displaying the data, both times and average scores, might shame students into trying harder. "Shame" is a bad word in the West, so how about "increase their public self-awareness to the extent that they try harder"?
A horrible thought? In Japan, until recently, they used to display everyone's scores, until they imported Western sensitivities and "only one, not number one" education. They also used to teach their teenagers to fly into ships, one might add. To be honest, I think I approve across the board.
Tim
Folks
There are two very good reasons for having such a feature.
One is that, from a course-development perspective, the content developer can identify where users are pausing or skipping content, providing clues about where further improvement may be required. Together with assessment results, the time users take on course content becomes a useful guide to improving content quality and usability.
Commercially, the time spent in courses is a valuable metric for a salesperson promoting the value of eLearning to a client organisation. Yes, results are important, but time spent on each course AND the desired learning/business outcome are both metrics valued by managers, for obvious reasons. Less time spent for better outcomes = a better-performing learning product.
Business is less concerned with methodologies and ideologies than results and cost. Time is definitely a factor here.
My experience with another system has been that, with enough users, there is sufficient evidence to make a valid assessment of how users are working through the course. This is not a micro-level analysis, of course, but you can say with confidence that if numerous users paused at a spot for an extended period, you had better take a closer look and figure out why.
It has been pointed out that unsupervised users could be doing anything, and I agree. Hands up anyone whose students have never played a game or surfed during a lesson.
Anyway, I need to investigate the student activity report as it may provide a similar sort of useful facility.