Everywhere I go these days there’s a Data Analytics initiative. They’re collecting data like kids at a wedding scatter: data at the organisational level, data at the course level, data at the student level. But it’s trainspotting, and all very ‘groupthinky’. They’re gathering the wrong data on the wrong things, because the things that really matter are awkward and uncomfortable. Worse still, there’s perhaps a wilful avoidance of the prickly data, the data that really could lead to decisions and change. So let’s look at seven serious areas where data should be gathered but is not.
1. University rankings data
These must go down as the most obvious example of the
‘misuse’ of data in HE.
There are many uncomfortable truths about the data con that is University rankings. It’s a case study in fraudulent data analytics. Universities play the game by being selective about which ranking table they choose, baiting and switching by pretending the rankings reflect teaching quality (when they don’t), and ignoring the fact that these are qualitative, self-fulfilling ratings that often compare apples with oranges. Even worse, the rankings skew strategies, spending and agendas, while feeding crude marketing. Will HE stop playing fast and loose with this data? Not a chance.
2. Data on pay differentials
Want to be ethical about data collection? That’s fair. So let’s look at the salaries across the organisation and work out the differential between those at the top and the bottom. Now look at how that range has widened, not narrowed. Senior academic staff pay is rising three times faster than that of other academics, with Vice-Chancellors’ pay rising fastest (and that’s not counting the perks). Average pay for senior staff is £82,321; average pay for non-professorial staff is £43,327. We have this data – it’s damning. Are senior academics robbing other academics blind? Yes. Will any decisions be made? No.
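For what it’s worth, the sums involved are trivial. Here’s a minimal sketch in Python using only the two published averages quoted above; the growth rates further down are illustrative assumptions, not real payroll data.

# Minimal sketch of the pay-differential arithmetic HE prefers not to do.
# The two averages are the published figures quoted above; the growth
# rates below are illustrative assumptions, not real payroll data.

senior_avg = 82_321      # average pay for senior staff (GBP)
non_prof_avg = 43_327    # average pay for non-professorial staff (GBP)

ratio = senior_avg / non_prof_avg
gap = senior_avg - non_prof_avg
print(f"Senior staff earn {ratio:.2f}x the non-professorial average")
print(f"Absolute gap: £{gap:,}")

# If senior pay really is rising three times faster, the gap only widens.
senior_growth, other_growth = 0.03, 0.01   # assumed annual rises
for year in range(1, 6):
    senior_avg *= 1 + senior_growth
    non_prof_avg *= 1 + other_growth
    print(f"Year {year}: ratio now {senior_avg / non_prof_avg:.2f}")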
3. Data on pensions
Forget the astonishing data point that cancer researchers, among others, invested £211 million in British American Tobacco last year. The Universities Superannuation Scheme is the UK’s largest private sector scheme, with a whopping £57 billion of liabilities and an £8 billion deficit. At present, the contributions being paid in are nowhere near enough to fill this hole. Why? Because paying in enough would wipe out the surpluses of many universities across the land. It’s a badly run scheme and there is little chance that the hole will be filled. The consequences, however, are severe: it is likely that costs will rise and that student fees will rise with them. Not good news.
4. Data on researchers as teachers
You would like to know whether researchers make good teachers? But what if the data suggests they don’t? Teaching demands social skills, communication skills, sound pedagogy, the ability to hold an audience, to pitch at the right level, to avoid cognitive overload and to deliver constructive feedback. Some contend that the personality types suited to teaching are very different and, statistically, do not largely match those of researchers. So what about the evidence? The major study is Astin’s, which analysed longitudinal data on 24,847 students at 309 different institutions, looking at the correlation between faculty orientation towards research and orientation towards students and teaching. The two were strongly negatively correlated. Student orientation was also negatively related to compensation. He concludes that “there is a significant institutional price to be paid, in terms of student development, for a very strong faculty emphasis on research” and recommends stronger leadership to rebalance the incentives and the time spent on teaching versus research. To be fair, academics are now largely judged on research, not teaching. Too much effort in teaching can hamper your career; it rarely enhances your prospects. The dramatic swing away from teaching towards research was tracked by Boyer, who identified a year-on-year abandonment of commitment to teaching in favour of research. However, it wouldn’t matter how much data we collected on this topic; the Red Sea has a greater chance of parting than this gap has of being closed.
5. Data on Journals
More journals, more papers, fewer readers. Good data on actual citation rates and readership is hard to come by. The famous Indiana figure of 90% of papers remaining unread has been refuted, but other sources show that, while the picture is not that awful, it’s still pretty awful. There is a pretty good state-of-play paper on this, as it’s a complicated issue – but the picture is nevertheless worrying. We really should be concerned, as the push towards research (the data suggests lots of it is second- and third-rate, and goes unread) and away from teaching may be doing the real harm in HE. We simply don’t know, nor do we dare ask.
6. Data on lecture attendance
Let’s start with a question. Do you record (accurately and consistently) the number of students who attend lectures? If the answer is NO, don’t even pretend that you’re off the blocks on data analytics in teaching. A recent study from Harvard, across ten courses, showed that on average only 60% of students even turned up. In any other area of human endeavour, where one has paid tens of thousands of dollars up front for high-quality teaching, this would come as a shock. Similar studies show similar results at other institutions. If you were running a restaurant and 40% of your booked customers failed to turn up (often it’s a lot worse), you’d seriously question your ability, not only as a chef, but as a restaurant owner. Remember that students have paid for the meal, through the nose, and if they aren’t turning up, something’s wrong. So, I have the data showing that students are not turning up to lectures; what do I do with it? Nothing – we’ll still call ourselves ‘lecturers’ and lecture on.
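And recording attendance really is that simple. Here’s a minimal sketch in Python; the cohort size and per-lecture counts are invented for illustration, standing in for the real records that mostly don’t exist.

# Minimal sketch of lecture-attendance tracking. The cohort size and the
# per-lecture headcounts below are invented for illustration only.

enrolled = 200                             # hypothetical cohort size
attendance = [142, 131, 118, 104, 97, 91]  # hypothetical headcounts per lecture

rates = [count / enrolled for count in attendance]
for week, rate in enumerate(rates, start=1):
    print(f"Lecture {week}: {rate:.0%} attendance")
print(f"Average attendance: {sum(rates) / len(rates):.0%}")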
Suppose I also show you data that says students, especially those who are struggling, those with English as a second language and so on, benefit from recorded lectures. You’d seriously consider recording your lectures. My point is that collecting data is unlikely to change anything, other than dig deeper holes, if you’re not asking the right questions and taking action on the answers.
7. Data on assessment
Take assessment. What data do you have on this? The grades for the occasional essay? That’s not big data; it’s not even little data; it’s back-of-an-envelope data. The student submits an essay, then waits weeks for a grade and some light feedback – too late to be of much use. Why bother? The problem is the method of assessment. Will this change? No. What you really need at this level is constant data from regular low-stakes testing. Not only is this proven to increase retention and recall, it also tells you where students are in terms of progress. Here’s a tougher question: do you really know how many essays submitted by students are written by others? And by others, I mean parents or essay mills. Look at the number of these companies and their accounts. That’s interesting data. Why will this not happen? Because there are reputations at risk and pesky journalists to appease.
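For the sake of illustration, here’s a minimal sketch in Python of what that constant data from low-stakes testing could look like; the students and quiz scores are invented, and a real system would pull them from whatever quiz tool is actually in use.

# Minimal sketch of tracking regular low-stakes testing. The names and
# weekly quiz scores are invented for illustration only.

from statistics import mean

quiz_scores = {
    "Student A": [4, 6, 7, 8],    # scores out of 10, one quiz per week
    "Student B": [5, 5, 4, 3],
    "Student C": [9, 8, 9, 10],
}

for student, scores in quiz_scores.items():
    trend = scores[-1] - scores[0]
    status = "falling behind" if trend < 0 or mean(scores) < 5 else "on track"
    print(f"{student}: average {mean(scores):.1f}, trend {trend:+d} -> {status}")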
Conclusion
Henry Ford famously said that if he had gathered data from his customers, they’d have asked for faster horses. Actually, he said no such thing, but what do we care in education? A good quote is good data, and it captures exactly what is happening with data analytics in HE. Data is not the new oil in education. Data is the new slurry: dead pools of lifeless, inert stuff that end up in unread and unloved reports. Going for low-relevance data, in areas that are less than critical, is wilful avoidance of the real issues. It gives the illusion of problem solving and neatly avoids the hard, often detailed decisions that have to be made around cultural and fiscal change.