How do you know if a learner has watched a video or recorded
lecture? It can be logged as having been played remotely, but you have no guarantee that
the learner has attentively watched it. You need some way of knowing they have actually
watched the content.
My friends Enrique Canessa,
Carlo Fonda and Marco Zennaro at the ICTP (International
Centre for Theoretical Physics) in Trieste, who also produced a brilliantly
innovative lecture capture system, have come up with a clever form of
video assessment that solves this problem. Their system places
spoken, randomised numbers, unique to every download, in the silences detected
by their software. As a student you need to pay attention to the entire video
to hear the numbers and write them down. Note that you can’t just fast-forward
to catch the numbers, as they’re modulated to match the sound of the original
video. Clever or what? The idea has been used successfully on TV shows such as ‘Watch
& Win’ and in advertising, where viewers watch out for codes then submit
them for prizes.
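To make the mechanism concrete, here is a minimal sketch of the general idea in Python. It assumes the pydub and gTTS libraries (plus ffmpeg) and uses invented file names; it is an approximation of ‘find silences, overlay spoken random numbers, roughly level-match them’, not the authors’ PINVOX implementation.

```python
# Sketch only: detect pauses in a lecture's audio and overlay spoken,
# randomly chosen numbers at those points (assumes pydub, gTTS, ffmpeg).
import random
from gtts import gTTS
from pydub import AudioSegment
from pydub.silence import detect_silence

lecture = AudioSegment.from_file("lecture.mp4")

# Pauses of at least 1.5 s that sit below -40 dBFS, as [start_ms, end_ms] pairs.
pauses = detect_silence(lecture, min_silence_len=1500, silence_thresh=-40)

codes = []
for start_ms, end_ms in pauses[:4]:           # use the first few pauses found
    number = random.randint(10, 99)           # random two-digit spoken code
    codes.append(number)
    gTTS(text=str(number), lang="en").save("code.mp3")
    spoken = AudioSegment.from_file("code.mp3")
    # Crudely match the loudness of the inserted code to the lecture.
    spoken = spoken.apply_gain(lecture.dBFS - spoken.dBFS)
    lecture = lecture.overlay(spoken, position=start_ms)

lecture.export("lecture_with_codes.mp3", format="mp3")
print("Codes embedded for this download:", codes)
```

Because the numbers are drawn afresh for each export, every download carries its own code sequence, which is what makes the later submission step meaningful.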
Online certification
The idea is to provide online certification on submission of
these numbers. This is an improvement on current certification for simply
attending lectures and seminars, and it can be awarded for remote attendance. A certificate
of attendance is provided on the fly.
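A back end for this could be as small as the hypothetical sketch below, in which the codes embedded in each download are stored and a certificate of attendance is generated on the fly only when the submitted codes match. The names and data structures are invented for illustration, not taken from the paper.

```python
# Hypothetical sketch of the certification step: codes are stored per
# download and a certificate is issued on the fly if they all match.
from datetime import date

# Populated when each download is generated, e.g. {"dl-1042": [37, 81, 12, 59]}.
EMBEDDED_CODES: dict[str, list[int]] = {}

def certify(download_id: str, submitted: list[int], student: str) -> str | None:
    expected = EMBEDDED_CODES.get(download_id)
    if expected is None or submitted != expected:
        return None  # no certificate: the codes don't match this download
    return (f"Certificate of attendance: {student} watched download "
            f"{download_id} in full on {date.today().isoformat()}.")
```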
Increased recall through increased attention
They don’t pretend that this is sure-fire assessment, as
there’s no evidence that the student has paid enough attention to learn, retain and
recall the content. However, it does, by definition, force you to sustain
attention for the entire length of the video. In this sense it is
psychologically sound, as attention is a necessary condition for learning and one
of the main causes of failure in learning is lack of attention. By raising attention
you’re likely to increase the effectiveness of the learning.
Increased recall through note taking
I’d add that we know that attention falls away in lectures, as does
heartbeat, the performance of the lecturer and note taking. This technique may
also encourage more consistent note taking, again increasing retention. This is
not a trivial point, as note taking can increase retention by 20-30%.
Khan Academy
Now let’s think of some concrete applications. One problem
with the flipped classroom, perhaps its most serious weakness, is knowing
whether the student has actually made the effort to watch the
videos attentively at home. This system could add some psychological punch to the flipped
model. Those educational establishments that record lectures could also
increase the effectiveness of the recorded viewing by adopting this technique.
Cues
It also got me thinking. A possible enhancement could be the
insertion of words, not numbers, with those words acting as ‘cues’. Tulving, the man
responsible for the semantic/episodic distinction in memory, also identified
‘cues’ that overlap with memories as playing a key role in recall. If you
encode memories with these ‘cues’ it makes them easier to recall, like attaching
a handle to your memories, making them easier to pull back out. So imagine
placing cues in these silences, so that you note the cues, improving later
recall. (I've written about this here.)
Conclusion
At first this sounds a little mechanical, but on examination
it rests on some sound psychological principles that make it an indirect and
possibly very strong form of assessment. As we move, inevitably, towards
recorded lectures and video content as a feature of blended learning experiences,
this is a solution that could have a powerful effect in ensuring that actual
learning takes place.
Reference
"On-line
Certification for All: The PINVOX Algorithm" Inter. J. Emerging Tech. in
Learning (iJET), Sept 2012
12 comments:
This is an interesting idea, but why not simply put them through a quick online assessment based on the lecture? A few 'fact retention' style MCQs, a worked example, ask them to pick out the 4 key themes of the lecture from a list of 20, identify an image used, or some genuinely domain-specific test of understanding and retention?
The answer is, of course, because those are hard to design. Work for lecturers...
This doesn't sound too great to me. We really need to get students mentally interacting and engaging with the content. I've just been trying a free service called Vialogues which I think could work well: https://vialogues.com/ I reviewed it here: http://quickshout.blogspot.co.uk/2012/06/engage-students-with-flipped-video.html It enables you to add questions and polls at specific times as the video progresses. Students can then add their comments and respond to the polls as they listen. This makes listening much more active.
Best
Nik Peachey
Robert - sure but the aim is to have an automated system that takes minutes to complete that could be applied to all lectures/videos, without the creation of extra content and assessment items. The bottom line is that lecturers rarely ever bother to make any effort to assess within or beyond lectures - I suppose that's the problem that this addresses.
Nick. I agree but I'd make the same point as in last comment. Lecturers will not do this. This system simply addresses the problem of whether the recorded lectures/videos are attentively watched. It's automated.
I don't really agree with this. The only thing it proves is that the learner paid attention to the code. It has nothing to do with the content. If we're talking online learning we need to consider cognitive load and this is certainly extraneous cognitive load.
I disagree. Your attention is raised during the actual content and, as you need to note the codes down, you'll also be more notes-conscious. To simply listen to a one-hour lecture just to hear the codes would be more effort than to listen to the content attentively. The extra cognitive load is tiny - a few numbers over one hour. However, I do agree that this needs to be tested. These guys have a good track record of researching students' use of recorded lectures.
I can understand some of the criticism that the commenters above have shown.
I nevertheless think that they miss the most important point (maybe because I had access to the full text, though I think Donald summarizes it quite well).
The thing is that the paper raises a proposal to certify "online attendance", as we certify attendance to an event.
And that is "all".
I mean: of course attendance is not paying attention, understanding or even learning, but that is not the point of the paper. The paper does not speak about all these issues, but just about providing a way to be able to tell whether someone watched a video, or sat in front of a screen.
In an offline world we can perfectly well tell the difference between attending an event and making the best of it. And that is why we give certificates of attendance to everyone in the room and grades only to the ones that completed an assessment based on what was talked about inside that very same room. And to some of them we will even give degrees if other conditions are met.
And that is the idea of PinVOX: to try and certify just the first phase of the whole process.
And I think it works quite well :)
i.
FYI - This is the abstract of the mentioned paper,
which we would like to share here in the hope that, all together, we can improve the algorithm:
"On-line Certification for All: The PINVOX Algorithm"
"A prototype algorithm, PINVOX (“Personal Identification Number by Voice”), for on-line certification is introduced to guarantee that scholars have followed, i.e., listened to and watched, a complete recorded lecture, with the option of earning a certificate or diploma of completion after remotely attending courses. It is based on the injection of unique, randomly selected and pre-recorded integer numbers (or single letters or words) within the audio trace of a video stream at places where silence is automatically detected. The certificate of completion or “virtual attendance” is generated on-the-fly after the successful identification of the embedded PINVOX code by a video-viewer student."
To appear in: International Journal of Emerging Technologies in Learning (iJET), September 2012.
With interactive videos you can now include questions throughout the video, which not only check that the student is paying attention but also whether they have actually learnt anything.
Nowadays you can embed questions within videos.
Lynne - the creators of this system are well aware of this. The point is that this approach can be automated and done at scale. With questions, they have to be created and coded.