Education and training have always coveted data. But in any honest appraisal of this data collection, we have to admit that it is largely the wrong data. There has, historically, been too much focus on start-point and end-point data. All dull inputs and outputs, it's like judging a person simply by measuring what they eat and then excrete. It may even stretch to how long that process took to complete. What we need to focus on is the cognitive improvement of the learner. Here are five examples of the mostly superficial data that accounts for the vast bulk of the data collected in education and training:
1. Bums on seats
To measure attendees, or bums on seats, is to measure the wrong end of the learner. Yet this is what so much ‘contact time’ is in our colleges and universities. I once attended a talk by the Head of Training for a global bank, where she proudly reported that x meals had been served in the canteen on the training campus. And they wonder why banks failed?
2. Contact time
Contact time is essentially an excuse for not measuring what is learnt. Turning up is hardly a measure of learning; attendance is not attainment. In some cases the contact time is even more illusory: in UK Higher Education, institutions do not even count the number of students who turn up for lectures.
3. Course completion
Completion is not a measure of attainment or competence, yet so many courses measure simply this. We have already seen that turning up is not a great measure, but completion is often just a measure of how many people hung around until the end.
4. Summative assessments
The problem with final test and exam data is that it arrives too late. The deed is done. Exams are too often the final act in learning, an end-point. As Professor Black has shown, a final mark so often stops even the best learners from trying any harder, and marks the poorer students out as failures. There is also the problem of cramming and short-term memory.
5. Happy sheets
The evaluation of education and training is plagued with end-point data, and none is more futile than the obsession with happy sheets, as they measure nothing. They are a staple of classroom courses and often the only data that is collected, yet they say nothing about what has actually been learnt.
Even in online learning, SCORM was really just a packaging and delivery-tracking mechanism for LMS vendors, built on the false premise of learning objects. Although it provided a ‘standard’ for interoperability, it largely measured simple inputs and outputs.
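To see how thin this tracking is, here is a minimal sketch of a SCORM 1.2 run-time exchange. The MockScormApi class is an invented stand-in for the API object a real LMS exposes to course content, but the call names (LMSInitialize, LMSSetValue, LMSFinish) and the cmi data-model elements are genuine SCORM 1.2; note that everything recorded is a start/end input or output.

```typescript
// Minimal sketch of a SCORM 1.2 run-time exchange.
// MockScormApi is a stand-in for the API object an LMS injects into
// course content; the method names and cmi.* elements are real SCORM 1.2.

type CmiData = Record<string, string>;

class MockScormApi {
  data: CmiData = {};
  LMSInitialize(_: string): string { return "true"; }
  LMSSetValue(element: string, value: string): string {
    this.data[element] = value;
    return "true";
  }
  LMSGetValue(element: string): string { return this.data[element] ?? ""; }
  LMSCommit(_: string): string { return "true"; }
  LMSFinish(_: string): string { return "true"; }
}

const API = new MockScormApi();
API.LMSInitialize("");
// The tracked data model is essentially inputs and outputs:
API.LMSSetValue("cmi.core.lesson_status", "completed"); // did they finish?
API.LMSSetValue("cmi.core.score.raw", "85");            // end-point score
API.LMSSetValue("cmi.core.session_time", "00:42:10");   // how long it took
API.LMSCommit("");
API.LMSFinish("");

console.log(API.data);
```

Nothing in this exchange says anything about what the learner can now do; it records completion, a score and a duration, which is exactly the start/end data criticised above.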
What’s so often missing is data on competence. We teach what is easy to test and test what is easy to teach. That means lots of academic knowledge tested through paper exercises, from multiple choice to essays. The actual competence measured is often just the ability to cram and remember information long enough to pass the test, after which it is quickly forgotten.
Conclusion
Most data collection in education and training skates over the surface with data about superficial attendance, end-point assessment and opinion. What’s missing is hard data on actual performance, competences and retained knowledge. What really matters is data collected from learners as they learn. This is when data is really needed, so that we can help learners succeed. So much data focuses on the deficit model in education: failure and drop-outs.