Friday, May 27, 2016

Why the tablets in schools debacle is over

After the California debacle, schools in five states (Virginia, California, Maine, Texas and North Carolina) are starting to swap out tablets for the laptops they should have purchased in the first place. It started with a survey in Maine, where teachers and students expressed a preference for laptops over tablets.
To be exact, 88.5% of teachers and 74% of grade 7-12 students wanted laptops, not iPads. The observations were clear: while iPads may be appropriate for young children, they are not suitable for older children, who need to acquire writing and other more sophisticated skills using tools that don’t work on iPads. The comments were blunt:
“shortcomings for older students”
"provide no educational function in the classroom” “students use them as toys”
“word processing near to impossible … I applaud this change.”
“largely students’ gaming devices”
“a disaster”
“WE NEED LAPTOPS!!!” a student said, three times.
Apple has caved in and swapped the tablets for reduced price MacBook Air laptops. This reflects the fall in sales of iPads, now at their lowest since 2011. What went wrong?
Tablets: disaster in the taking
So tablets have been swallowed by the hundreds of thousands in education, but have been shown to have serious side effects. I’ve been writing and talking about this impending disaster since the start of 2013. My claim is that for learners beyond young children in primary schools, tablets do more harm than good.
7 reasons why tablets should NOT be used in education
When this madness began, in 2013, in ‘Too cool for school: 7 reasons why tablets should NOT be used in education’, I argued that tablets were not the device of choice for teachers and students, poor for writing, encouraged facile creativity, were consumer not producer devices and awful for coding. They were vanity projects, too expensive, as well as teacher and student unfriendly.
7 reasons why buying tablets is lousy advice
In ‘Keep on taking the tablets: 7 reasons why this is lousy advice’, I argued that a perfect storm of aggressive vendors, naïve buyers, little or no cost-effectiveness analysis (different from cost-benefit analysis), placebo research and groupthink led to a tsunami of poor procurement. I’d add that the cult of ‘Leadership’ in schools, which has become shorthand for a few folk making decisions without consulting the rest, also contributed to the lemming-like rush to buy them.
7 researched ways 'tablets' can inhibit learning
At that time, I also detailed ‘7 researched ways 'tablets' can inhibit learning’. Physical and cognitive ergonomic principles were used to show that tablets are inferior in all sorts of learning tasks, especially writing, where they inhibit the development of complex writing but also in coding, graphics, sustained tasks and so on. 
Device fetish
Beyond this, I have argued that education suffers from ‘device fetish’: concentrating on the wrong end of the problem. Student opinion bears out that tablets are unsuitable for sustained skills development. When students reach secondary school they have to learn higher-order skills, and tablets do not, in general, support the sort of digital literacy they need. To progress, they need an input device that allows quick, low-error input with haptic feedback, not a touchscreen keyboard. They also need more control over what that device does. iPads were designed as consumer, not producer, devices; they inhibit progress.
Conclusion
This is a near perfect example of how and why technology in education so often shoots itself in the foot. Obsessed by devices, itself a function of the refusal to do any serious analysis on what is actually needed, schools, ‘leaders’ and vendors opt for easy, but ill-fated, solutions - it was a gold-rush mentality. Rather than focus on good tools, content and services, they rushed towards hardware. Why? I suspect it’s less challenging, doesn’t threaten ‘teaching’ and is seen as an adjunct, rather than core, pedagogic approach. Here's the solution - do the research, listen to what learners actually want. Stop this amateurish, device madness. Note that I'm happy with their use in primary school and also with tablets that have full keyboards (those are really laptops) but even here one has to be careful on costs.   


Tuesday, April 10, 2012

Steiner (1861-1925) mysticism, three 7 year stages & eurythmics


Rudolph Steiner, an Austrian, developed his own philosophical system, ‘Anthroposophy’, based on spirituality. It is, in fact, a mish-mash of Eastern thought, neo-Platonism, Christianity and Hegel. There is much talk of ‘inner experience’ and its amplification through the ‘secret society’, but its philosophical ideas are based on three realms: the physical, the soul and the spiritual. From this rather unlikely theoretical basis, Steiner schools have grown into one of the biggest not-for-profit school systems in the world, headquartered in Switzerland. Founded in Germany in 1919, and initially funded by a cigarette factory owner (hence the alternative name ‘Waldorf’ schools, after the Waldorf-Astoria cigarette company), they have flourished for nearly 100 years.
Education as development
Education, for Steiner, is not so much teaching, or even learning, as a process of spiritual development defined within Steiner’s ‘Anthroposophy’. The system assumes ‘three births of men’, in three seven-year periods: up to 7, 7-14, then 14-21. These stages are based on obscure and esoteric ‘astral’ and ‘ethereal’ principles. There is a curious neo-Platonic idea of the soul, where the mind needs to recover the soul’s memory through a gentle, empathetic education. There is also a curious Medieval throwback, where teachers use the ‘choleric, phlegmatic, melancholic and sanguine’ temperaments to judge their pupils.
Children start school at seven and are encouraged to play, as well as develop their creative and imaginative abilities. Early competition is avoided in favour of collaboration and students are allowed to develop at their own pace. The curriculum from 7 onwards covers common academic subjects but, compared to mainstream schooling, has more emphasis on the arts, with the addition of a subject unique to Steiner schools, ‘eurythmics’, a form of slow-motion dance.
Radical appeal
Whatever the occult origins of the Steiner philosophy, it has some radical approaches to learning that appeal to progressive learning theorists. Children start schooling later (at 7), with reading held off until that age; there is no marking or grading, and the developmental process studiously avoids placing pressure and stress on children. They are taught in groups, often by the same teacher, for up to seven years, to foster the idea of the school as a family and the teacher as a parent. The schools are non-selective and co-educational, teachers are given a great deal of autonomy, and parents are encouraged to be part of the school community. Long regarded by parents as an alternative to the pressurised environment of state schooling, the system seems to satisfy a need for parents who see schools, whether state or private, as too rigid, uncaring, non-spiritual and obsessed with assessment.
Evidence
Steiner’s philosophy is derivative and scarcely credible; clairvoyance and the astral and ethereal realms are just a few of his mystical ideas. He has also been criticised for racism, believing that reincarnation proceeds through three races, African, Asian and European, in that order. People for Legal and Non-Sectarian Schools (PLANS) is a group of former Steiner students, parents, teachers and administrators who want to expose the hidden missionary and religious agenda in the movement.
Does it work? There is no definitive evidence as little real comparative research has been done. However, in several countries, independent reports are favourable towards Steiner (Waldorf) schools in terms of English, literacy and the arts.
Conclusion
Esoteric claims about the soul, spirituality and process aside, Steiner schools do practise some methods that many regard as positive and progressive. They have a counter-cultural appeal that avoids the commonly held view that education is a grind, designed to filter and fail rather than develop children into autonomous adults. It is not unusual, in the history of educational theory, to come across outliers that have survived despite their sometimes naïve, even bizarre, underlying theory. They survive because they develop strong brands, financial models that work, their own teacher training and an appeal to a clearly defined need or group.
Bibliography
Steiner, R. (1973) Theosophy. Rudolf Steiner Press.
Wilkinson, R. (1993) Rudolf Steiner on Education: A Compendium. Hawthorn Press.
Rudolph Steiner College http://steinercollege.edu/


Monday, December 16, 2013

MOOCs: how ‘open’ are they? (7 dimensions)


A MOOC is open in several senses of the word, but by far the most important is the idea that it is OPEN IN SPIRIT: not open in any technical sense but open in a moral sense. This means a genuine attempt to open up education to all through open access, low cost and online delivery. Access to powerful educational tools and resources, free at the point of access, is available through Google, YouTube, Wikipedia and a myriad of other online resources. The Open Educational Resources movement also provided the ground from which MOOCs could sprout. More specifically, the Khan Academy came along with more structured, video-based learning experiences. All of this was, from the learner’s point of view, open in the sense of being accessible and free.
7 dimensions of openness
But openness has several other dimensions relevant to MOOCs: 
1. Open access
2. Open structure
3. Open educational resources
4. Open collaboration
5. Open accreditation
6. Open source code
7. Open data

1. Open access - cost really matters
The original intention was open in the sense of access, i.e. anyone could simply sign up without prior qualifications. This signalled a moral agenda about opening up education for all, freeing it from scarcity and high cost, towards a model of abundance and no cost. Cost is a big issue. There’s no such thing as a free munch (m for MOOC), but education wants to be free, and this is a vital condition for universal, global access. The cost is being reduced to cents/pence per learner; that is a great achievement.
2. Open structure - don't copy synchronous, semester model
Many, though not all, MOOCs are still tethered to the HE 6/8/10-week semester, with a start date, end date, timetable and timed weekly releases of content. As the market progresses, we see that the audience is not the ‘18 year old undergraduate’ but busy people with jobs and other commitments: lifelong learners. Courses are getting more asynchronous, available anytime, and shorter. Coursera is still restrictive, delivered at set times; Udacity less so; and EdX does have archived courses. This is good for access. Another access issue is functionality on devices: some platforms are excellent, some appalling.
3. Open educational resources
The degree to which you can reuse, repurpose MOOCs and MOOC content is interesting. Many of the video resources on some platforms are on YouTube, similarly with other media shared-resources. Coursera is the least open with no open licensed content available. Udacity uses YouTube to host its videos and allows reuse under Creative Commons. EdX is more explicit stating that they hope to do much more in terms of open content.
4. Open collaboration
Almost all MOOCs offer forums of one description or another but this is still quite weak. What learners have been doing is spilling out into social media and physical meetups. Interestingly the data from the six Edinburgh Coursera MOOCs showed relatively low forum use (15%) but there can be no doubt that this is a dimension in openness. One could, and some do, argue that learner created content is another dimension of openness but let’s tuck it in here for the moment.
5. Open accreditation
MOOCs assess and therefore accredit on a number of levels from statements of completion (fine for most), certificates of distinction, through to online and offline proctored exams. It is important not to be too hung up on closure through certification and accreditation, as the majority of lifelong learners appear not to want even certification. Nevertheless, openness of accreditation would be desirable, perhaps through OpenBadges and freeing others to accredit.
6. Open source code
EdX have become a major player in MOOCland by making their code open source. This encourages participation, lowers costs and stimulates innovation. Openness in this sense may give them a market advantage, especially as it’s in line with the spirit of openness I mentioned earlier.
7. Open data
The University of Edinburgh have published data from their six-MOOC experiment and the Gates Foundation are funding research into MOOC data. But the degree to which data is harvested and disseminated is quite sparse; this is not an ‘open data’ environment (yet). Questions still need to be asked about who owns what data and what happens to that data after it is collected. At the moment we have lots of bare-number stats about registration, who did what, and when people stopped (a category mistake called ‘drop-out’), but as many platforms are not gathering meaningful data about the learners, even age, background and so on, entrance and exit surveys are still being done. Some interesting research is starting around harvesting qualitative data from social media, such as blogs, Twitter and Facebook, from MOOC users. How open is MOOC data? Not very.
Conclusion
Let’s push at the door to see how open MOOCs can be. While it is important to be realistic on costs, ownership and data protection, we need to see how far we can take open access, structure, resources, collaboration, accreditation, code and data. Note that complete openness is not always a virtue. It is largely a matter of degree. The schema above could be used to score MOOCs on ‘openness’ but it is more important to move forward and accept that the MOOC landscape will have many players with many different models. To repeat what was said at the start, it is important to hold true to the spirit of openness, while allowing different models to flourish. To achieve this we must rise above the simple public v private, dropout v dropin, xMOOC v cMOOC dualisms. Let’s not skewer ourselves on the horns of false dilemmas just as the show is getting on the road.
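The schema could even be turned into a crude scoring rubric. A minimal sketch in Python, in which the 0-5 scale, the equal weighting of dimensions and the example scores are all illustrative assumptions, not real assessments of any platform:

```python
# Crude 'openness' rubric over the seven dimensions discussed above.
# Scale, weighting and the example scores are illustrative placeholders.

DIMENSIONS = ["access", "structure", "resources", "collaboration",
              "accreditation", "source code", "data"]

def openness_score(scores):
    """Average of per-dimension scores (each 0-5), as a 0-100 percentage."""
    if set(scores) != set(DIMENSIONS):
        raise ValueError("score every dimension exactly once")
    return round(100 * sum(scores.values()) / (5 * len(DIMENSIONS)))

# A hypothetical platform: very open access, closed code and data.
example = {"access": 5, "structure": 3, "resources": 2, "collaboration": 3,
           "accreditation": 1, "source code": 0, "data": 1}
print(openness_score(example))  # a single 0-100 'openness' figure
```

Weighting the dimensions differently (access arguably matters more than open source code) would be an obvious refinement.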


Thursday, April 11, 2013

Keep on taking the tablets: 7 reasons why this is lousy advice

Purveyors of the metaphor ‘Keep taking the tablets’ often fail to realise its downside. Like much over-prescribed medication, it can be pushed by aggressive sales, lack adequate investigation and diagnosis, be too narrow a treatment, lack evidence other than placebo effects and have some nasty side-effects.
I’ve written about tablets in two previous posts: 1) Too cool for school: 7 reasons why tablets should NOT be used in education; 2) Tablets: 7 researched ways they can INHIBIT learning. This final piece in the triad looks at the bigger picture, with seven reasons for being wary of the tablet bandwagon.
1. Aggressive vendors
In an interesting lunch with someone who works for a tablet manufacturer (someone with a great deal of experience and expertise in education and technology, whom I respect), I encountered some horror stories that do not appear in the ‘research’. He, like me, despairs of the mad rush towards 1:1 tablets in schools. Both of us own and use tablets, and both of us have spent a lifetime trying to get technology used in education, but this latest bandwagon effect is worrying. Vendors are too close to Gove and government, with freebies and special meetings. Beware also the shadowy figures, connected to government, who have an eye on the low-hanging fruit of government contracts. Witness Rachel Wolf’s rapid rise from Gove researcher to Newscorp tablet salesperson.
2. Naïve funders
‘Keep taking the tablets’ was the title of the E-learning Foundations Conference. This lack of objectivity by a funding body is simply bandwagon behaviour. A few of the Trustees are no doubt proud of their ability to look at their Board papers on their new iPads. Note that they rarely take notes, annotate or do anything productive on their screens, and often have the printed papers out at the side. A little technology is a dangerous thing!
3. Poor diagnosis
Procurement is not just a few columns in a spreadsheet. It involves the calculation, or at least best-effort approximation, of risk. These risks come in all sorts of shapes: fiscal (purchase, insurance, maintenance etc.), technical (bandwidth, networking, printing, storage, security, technical support etc.), adoption (teacher use in classroom, collection from leavers, illicit use etc.) and change management (governance, leadership issues, parent reactions, teacher CPD etc.). Few do a really thorough job on this, and fewer still demand an evaluative approach that builds in quantitative measures on the evaluation of attainment.
Governors of schools and colleges tend to be older, not particularly tech savvy and certainly not capable of doing the necessary governance checks on procurement. With schools breaking free from being overseen by Local Authorities, that layer of procurement expertise has gone. To be fair, it was never that good, but its disappearance has led to a lot of idiosyncratic buying.
Have all of these tablet-taking projects really completed a cost-benefit analysis against a) a Bring Your Own Device (BYOD) strategy, b) a notebook/laptop strategy or c) flipping technology out of the classroom? Have lost-opportunity costs been taken into account? I doubt it.
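For illustration, the bones of such a comparison fit in a few lines. A back-of-envelope sketch in which every figure is a hypothetical placeholder; the point is the structure of the calculation (purchase plus recurring costs over the device’s life), not the numbers:

```python
# Back-of-envelope total cost of ownership (TCO) per pupil for three
# procurement strategies. All figures below are invented placeholders.

def tco_per_pupil(device_cost, insurance_py, support_py, training_py, years=3):
    """Purchase cost plus recurring per-year costs over the device's life."""
    return device_cost + years * (insurance_py + support_py + training_py)

strategies = {
    "1:1 tablets": tco_per_pupil(400, 30, 40, 25),
    "laptops":     tco_per_pupil(350, 25, 35, 15),
    "BYOD":        tco_per_pupil(0, 10, 50, 20),  # school funds support only
}
for name, cost in sorted(strategies.items(), key=lambda kv: kv[1]):
    print(f"{name}: £{cost} per pupil over 3 years")
```

Even this toy version makes lost-opportunity costs visible: the gap between any two rows is money that could have gone on content, training or services.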
4. Narrow treatment
Pedagogy matters. Failure to plan and cost teacher training can lead to technology slowing down, not accelerating, learning. For example, the respected educational technologist I lunched with witnessed iPads being collected in by a teacher at the end of the class ‘for marking’. There are cases of tablets being distributed to students only, and not teachers! Without an implementation plan that involves teachers before the tablets are distributed, you are simply creating more problems than you solve.
Learning needs differ horizontally across subjects and vertically through age and increasing complexity. The failure to see the need for long-form writing in English, History and other subjects, for detailed editing in coding, for pixel-accurate control in graphics packages, and so on, renders tablets useless as one moves across the curriculum. Similarly with the increasing demands of productive tasks in learning: the further up the educational attainment ladder the learner climbs, the less useful tablets become.
5. Placebo research
Why do the well-known disasters and negative findings rarely appear in the research? Medicine is built upon randomised double-blind trials with rigorous research; education is not. Even in the oft-quoted Hull report, one disaster (iPads handed out months before the wifi was installed) was treated as a mere blip. Most of the research is akin to the placebo effects one sees in homeopathic medicine. It’s qualitative, survey-monkey-level research that merely confirms the known fact that if you give kids and teachers a free iPad they like them. This is the allure of consumer electronics: attention, not proven attainment.
6. Nasty side-effects
Idiosyncratic tablet projects are pregnant with problems. The technology may bite back as teachers struggle with connectivity, printing, storage and so on. Teachers can come out in a rash of negativity when their practical and training needs are ignored. Students may spend huge amounts of time on relatively shallow learning or other distracting activities.
7. Groupthink
This is an old story in education and technology: the over-prescription of untried technology as if it were a wonder-drug. Something new and shiny comes along and before long it’s become a bandwagon; we jump aboard without thinking too much about where it’s taking us, then the wheels start to fall off. Even when the wheels have fallen off you don’t get to hear the bad news, as there’s been so much invested.
Conclusion
I’m not against the use of tablets in schools, I just think that turning it into a ‘movement’ is a mistake and that too many of these projects are poorly planned, badly procured and lack proper evaluation.


Thursday, September 27, 2012

Books = paper (hardware) + texts (software) & it’s the software that matters


Book v scroll
The book, or codex, was invented in the 2nd century AD by early Christians. It is thought that they were keen to distinguish the physical form of their holy book from that of the Jewish Torah, which remains loyal to the scrolled format. Before this the scroll was the dominant form for writing, but a scroll had its weaknesses:
1. needs to be held in both hands
2. difficult to unroll
3. rolls up on a flat surface
4. difficult to carry
5. difficult to store
6. written on only one side
7. difficult to navigate to a specific place

There were many advantages of the book over the scroll:
1. can be held in one hand
2. notes can be taken with the spare hand
3. sits flat on a surface
4. easy to carry
5. easy to store
6. pages give a sense of place
7. written on both sides
The book’s hardware gave its software, the text, a boost: page numbers, contents pages and indexes could all be added to aid navigation.
Scroll v page in learning
This scroll-versus-page divide lives on in today’s screen technology, where page-based web pages live alongside vertical scrolling. We can see this in Wikipedia, with its page structure for entries and scrolling for reading. Window panes add depth, making multiple tasks possible. Another interesting scroll-versus-page structural debate concerns the modern scroll of film, then video. The media of the moving image were literally technologies of reels, or scrolls, though they are now delivered as fast-refreshing frames. In terms of learning, however, the distinction between the continuous, scrolled presentation of content and pages under the control of the user remains a sharp divide. It is still difficult to search and navigate video content for learning purposes; forward, back and fast-forward remain at the navigational level of ancient scrolls.
Book as hardware
It is useful to separate the hardware and software components of books, as the word ‘book’ has two meanings: first, the whole physical object of paper and text; second, just the text. Authors don’t write books, they write texts. It is publishers who package texts into books by commissioning covers, paper type and weight, fonts and other features. A book, as hardware, is light, portable and never runs out of battery. It is undoubtedly an attractive, well-bound object that doesn’t break when dropped and is easy to hold for reading. Even its flexibility makes it comfortable to hold or rest on one’s lap when read. The physical pages make it easy to know where you are in a book and how much you’ve completed. Paper, as a reflective medium, is also eminently readable. The block shape of books also makes them easy to store on shelves. There can be no doubt that the physicality of the book contributes to its appeal.
Book as software
The most useful part of a book is, of course, its software, or text. We think of the book as a single text but early books tended to contain a miscellaneous mixture of different texts on different subjects, often in different languages by different authors. Paged books encouraged the development of readable content as texts were:
1. chunked into chapters & paragraphs
2. spaced (words and sentences)
3. punctuated to aid reading
4. capitalised for sentences & emphasis
5. listed by contents
6. indexed
7. given appendices & bibliographies
All of this took centuries of slow, incremental progress. Note that these are features of the text, not the physical book. This is the software, not the hardware. For most of the book’s history, technological advances were either in the process of production (printing, ink and paper) or software improvements.
Book and screen technology
Although the book, as a physical technology, has developed over nearly eighteen centuries into a finely-honed, much loved object, that technology is being challenged by screen-based reading and writing. There has been a social explosion of publishing, writing and reading on screens, aided by the internet. This has been boosted by good, readable screen technology, mobile devices and inexpensive e-book readers.
Traditionalists may wave their reading glasses in horror, but to turn books into a fetish is simply to deny the inevitable. Real books are great, but let’s not confuse the medium with the content, or hardware with software; namely, books with texts. Just as journalists and newspaper owners fail to realise they’re in the ‘news’ not the ‘newspaper’ business, so book fans and publishers sometimes fail to realise that this is about writing and reading, not books. Books are simply a piece of technology.
Just as the book was a hardware improvement on the technology of the scroll, so screen technology is an improvement on the hardware of the physical book. Books destroy trees, require landfill and are expensive to transport and store. In turning atoms into bits, books become weightless, distribution trivial and the problem of storage disappears.
Screen-based delivery also puts books in the realm of software control, so that it is easier to:
1. download
2. store
3. search
4. hyperlink
5. change fonts etc.
6. highlight
7. comment
We can now see where this can lead us, or more specifically lead us in improving learning. Why lock up knowledge and the ability to learn in libraries, schools and physical books, when we can publish and distribute it at marginal cost to everyone?
Books and learning
There is a tendency to think of books as an intrinsic good, but many would question the role of Adolf Hitler’s Mein Kampf or Mao Tse Tung’s Little Red Book as instructive or progressive. Similarly, many would doubt that literal readings of sacred texts, such as the Bible, Torah and Koran, are always forces for good.
Professor Pierre Bayard’s How to Talk About Books You Haven’t Read is a deep analysis of the ambiguous role of readers and books. We take books too seriously, forgetting that many are bought and not read, skimmed, talked about as if they had been read, or simply forgotten. Bayard throws the book at books.
Books have a special status as ‘almost objects of worship’ and non-readers are stigmatised. Yet reading is often non-reading, as we forget most of what we read almost as quickly as we read it. As we forge forward, content is forgotten and disappears in the wake of memory. Most reading is forgetting. He’s really on to something here. I habitually underline, mark, comment on and summarise the books I read. Yet it is almost taboo to underline or mark books, and blasphemous to tear out a page or chapter. Life is short and books are long. It’s OK to skim, as many books are padded out to conform to the standard 250-page norm. In fact, for many, the fact that most of what you read will be forgotten means a summary is adequate.
Academics cook the books
As an academic, he is at his best in describing a world he knows well, where academics discuss and teach books to students who have also not read the book. Teaching pressurises teachers into talking about books they have not read. Students respond by pretending to read long reading lists they never in fact read. Short-cuts are taken by all. It's a game where reading is the facade and non-reading the reality.
Every trick in the book
What’s clever is the way he hauls in authors to support his case: Montaigne’s honest reflections on reading, Oscar Wilde’s ‘100 worst books’ (books we should not read), David Lodge’s exposé of the Academy’s dependence on unread books. Umberto Eco, Balzac, Greene, Shakespeare, Joyce, Proust and others are all used to build a case, not against books, but against the bogus idea of books as pure and sacrosanct.
You can’t judge a book by its lover
So reading, and the culture of reading, is not what we think it is. It’s full of deceit, snobbery and false claims. Bayard exposes many of these taboos. Take a leaf out of his book and see reading, not as being synonymous with books, but in all its wonderful variations in terms of style, length, authors and media. New media and self-publishing are tearing apart the myth that reading is synonymous with books. It may well be that reading in many ways has freed itself from the tyranny of books.
Conclusion
The book (codex) was a superior technology to the scroll and, in the form of handwritten manuscripts, had a good 1,200-year run. The printing press scaled up the process of replication and has had another good 500-year run. Building on this, screen-based reading has given us another massive boost in scalability, making books weightless, volumeless, easy to distribute and searchable.
What we are witnessing is, perhaps, the death of the book as the dominant form of written expression. A much wider range of forms of expression has emerged. Wikipedia is not really a book in the sense that the Encyclopaedia Britannica was a book. Texting, posting, commenting and blogging are challenging the long-form book as the writing and reading medium of choice. Books themselves are being seen as just one form of expression among many.


Sunday, November 12, 2017

7 ways to use AI to massively reduce costs in the NHS

I once met Tony Blair and asked him, “Why are you not using technology in learning and health to free it up for everyone, anyplace, anytime?” He replied with an anecdote: “I was in a training centre for the unemployed and did an online module – which I failed. The guy next to me also failed, so I said, ‘Don’t worry, it’s OK to fail, you always get another chance…’ To which the unemployed man said, ‘I’m not worried about me failing, I’m unemployed – you’re the Prime Minister!’” It was his way of fobbing me off.
Nevertheless, 25 years later, he publishes this solid document on the use of technology in policy, especially education and health. It’s full of sound ideas around raising our game through the current wave of AI technology. It forms the basis for a rethink around policy, even the way policy is formulated, through increased engagement with those who are disaffected and direct democracy. Above all, it offers concrete ideas in education, health and a new social contract with the tech giants to move the UK forward.
In healthcare, given the challenges of a rising and ageing population, the focus should be on increasing productivity in the NHS. To see all solutions in terms of increasing spend is to stumble blindly onto a never-ending escalator of increasing costs. Increasing spend does not necessarily increase productivity; in some cases it can decrease productivity. The one thing that can fit the bill, without inflating the bill, is technology, AI in particular. So how can AI increase productivity in healthcare?
1. Prevention
2. Presentation
3. Investigation
4. Diagnosis
5. Treatment
6. Care
7. Training
1. Prevention
Personal devices have taken data gathering down to the level of the individual. It wasn’t long ago that we knew far more about our cars than our own bodies. Now we can measure signs, critically, across time. Lifestyle changes can have a significant effect on the big killers: heart disease, cancer and diabetes. Nudging, by providing the individual with data on lifestyle, especially exercise and diet, is now possible. Linked to personal accounts online, personalised prevention could do exactly what Amazon and Netflix do by nudging patients towards desired outcomes. In addition, targeted AI-driven advertising campaigns could also have an effect. Public health initiatives should be digital by default.
2. Presentation
Accident and Emergency can quickly turn into a war zone, especially when General Practice becomes difficult to access. This pushes up costs. The trick is to lower demand and costs at the front end, in General Practice. First, GPs must adopt technology such as email, texting and Skype for selected patients. There is a double dividend here, as this also increases productivity at work: millions need not take time off to travel to a clinic, sit in a waiting room and get back home or to work. This is a particular problem for the disabled, the mentally ill and those who live far from a surgery. Remote consultation also means less need for expensive real estate – especially in cities. Several components of presentation are now possible online: talking to the patient, visual examination, even high-definition images from a mobile for dermatological investigation. As personal medical kits become available, more data can be gathered on symptoms and signs. Trials show patients love it, and successful services are already being offered in the private sector.
Beyond the simple GP visit lies a much bigger prize. I worked with Alan Langlands, the CEO of the NHS and the man who implemented NHS Direct. He was adamant that a massive expansion of NHS Direct was needed, but commented that they were too risk averse to make that expansion possible. He was right, and now that these risks have fallen, and the automation of diagnostic techniques has risen, the time is right for such an expansion. Chatbots, driven by precise discovery techniques, can start to do what even doctors can’t: preliminary diagnosis at any time, 24/7, efficiently and, in some areas, more accurately than most doctors. Progress is being made here; AI already has successes under its belt and progress will accelerate.
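The probability-based reasoning behind such triage chatbots can be illustrated with a toy naive Bayes step: given a handful of reported symptoms, score a few candidate conditions by prior times symptom likelihoods. Every condition, symptom and probability below is invented for illustration – real systems use clinical data and far richer models:

```python
# Toy naive Bayes triage step. All conditions, symptoms and probabilities
# here are invented for illustration, not clinical data.
PRIORS = {"common cold": 0.6, "flu": 0.3, "strep throat": 0.1}
LIKELIHOODS = {  # P(symptom | condition), invented values
    "common cold":  {"fever": 0.1, "sore throat": 0.5, "cough": 0.7},
    "flu":          {"fever": 0.9, "sore throat": 0.4, "cough": 0.8},
    "strep throat": {"fever": 0.6, "sore throat": 0.9, "cough": 0.2},
}

def rank_conditions(symptoms):
    """Rank conditions by posterior probability given the reported symptoms."""
    scores = {}
    for condition, prior in PRIORS.items():
        score = prior
        for symptom in symptoms:
            score *= LIKELIHOODS[condition].get(symptom, 0.01)
        scores[condition] = score
    total = sum(scores.values())  # normalise so posteriors sum to 1
    return sorted(((c, s / total) for c, s in scores.items()),
                  key=lambda pair: pair[1], reverse=True)

for condition, posterior in rank_conditions(["fever", "sore throat"]):
    print(f"{condition}: {posterior:.2f}")
```

With these made-up numbers, “fever plus sore throat” ranks flu above the (more common) cold – the prior is outweighed by the likelihoods, which is exactly the kind of weighing a chatbot can do consistently at 3 a.m.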
3. Investigation
Technology is what speeds up the bulk of investigative techniques: blood tests, urine tests, tissue pathology, the reading of scans and other standard tests have all benefited from technology. In pathology, looking at tissue under a microscope is how most cancer diagnosis takes place. Observer variability will always be a problem, but image analysis algorithms are already doing a good job here. Digitising slides and scans also means the death of distance. Faster and more accurate investigation is now possible. Digital pathology and radiology, using data and machine learning, is the future. If you need further convincing, look at this famous test for radiologists.
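At its simplest, image analysis on a digitised slide starts with operations such as intensity thresholding. The sketch below is a deliberately toy version – the 4×4 ‘patch’ and the cut-off are invented, and real digital pathology pipelines use trained models on whole slides, not a fixed threshold – but it shows the kind of pixel-level measurement a machine can repeat without the observer variability mentioned above:

```python
# Toy sketch of thresholding a digitised slide patch: pixels darker than
# a cut-off count as stained tissue. The patch and threshold are invented;
# real pipelines use trained models on whole-slide images.
def stained_fraction(patch, threshold=100):
    """Fraction of pixels with intensity below threshold (0=black, 255=white)."""
    pixels = [value for row in patch for value in row]
    dark = sum(1 for value in pixels if value < threshold)
    return dark / len(pixels)

patch = [
    [240, 235,  90,  85],
    [238,  60,  70, 230],
    [ 95,  80, 225, 240],
    [230, 228, 236,  75],
]
print(f"stained fraction: {stained_fraction(patch):.2f}")
```

The point is not the arithmetic but the repeatability: the same patch always yields the same measurement, which is what makes digitised slides comparable across labs and across distance.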
4. Diagnosis
AI already outperforms doctors in some areas, matches them in others, and it is clear that progress will be rapid elsewhere. Esteva et al. in Nature (2017) describe an AI system trained on a data set of 129,450 clinical images covering 2,032 different diseases, and compare its diagnostic performance to that of 21 board-certified dermatologists. The AI system classified skin cancer at a level of competence comparable to the dermatologists. This does not mean that doctors will disappear, but it does mean they, and other health professionals, will have a lighter workload and be able to focus more on the emotional needs of their patients. Many symptoms are relatively undifferentiated, some conditions are rare, and probability-based reasoning is often beyond the unaided brain of the clinician. AI and machine learning offer a way past this natural, rate-limiting step. We must accept that this is the way forward.
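The point about probability-based reasoning is easy to demonstrate with a worked Bayes calculation: even a fairly accurate test for a rare condition produces mostly false positives, a base-rate effect that humans routinely misjudge. The numbers below are illustrative, not drawn from any real test:

```python
# Bayes' theorem on an illustrative screening test: even a sensitive,
# specific test for a rare disease yields a surprisingly low posterior.
def posterior_given_positive(prevalence, sensitivity, specificity):
    """P(disease | positive test) via Bayes' theorem."""
    true_pos = prevalence * sensitivity
    false_pos = (1 - prevalence) * (1 - specificity)
    return true_pos / (true_pos + false_pos)

# Illustrative numbers: 1% prevalence, 90% sensitivity, 90% specificity.
p = posterior_given_positive(prevalence=0.01, sensitivity=0.9, specificity=0.9)
print(f"P(disease | positive) = {p:.2f}")  # roughly 0.08, not 0.9
```

A machine applies this arithmetic consistently for every symptom and sign; a busy clinician, however skilled, cannot – which is the rate-limiting step referred to above.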
5. Treatment
Robot pharmacies already select and package prescriptions. They are safer and more accurate than humans. Wearable technology can provide treatment for many conditions, as can technology provided for the patient at home. Repeat prescriptions and ongoing treatment could certainly be better managed by GPs and pharmacists online, further reducing workload and the pressure on patients’ time. Above all, patient data could be used for more effective treatment and a vast reduction in the waste caused by over-prescribing.
In hospitals, automated robots such as TUG are already delivering medication, food and test samples, reducing the humdrum tasks that health professionals have to do, day in, day out. In effect a self-driving vehicle, it negotiates hospital corridors, even lifts, using lasers and internally built AI maps. The online management of treatment regimes would increase compliance with those regimes and save costs.
6. Care
Health and social care are intertwined. Much attention has been given to robots in social care, but it is AI-driven personalised care plans and decision support for care workers, along with more self-care, that hold most promise and are already being trialled. AI will help the elderly stay at home longer by providing detailed support, and it also gives support to carers. It may also, through VR and AR, provide some interesting applications in autism, ADHD, PTSD, phobias, frailty and dementia.
7. Medical education
Huge sums are spent on largely inefficient medical training, with immense duplication in the design and delivery of courses. AI can create high-quality, high-retention content in minutes, not months (WildFire). Adaptive, personalised learning gets us out of the trap of batched, one-size-fits-all courses. On-demand courses can be delivered, and online assessment – now possible with AI-driven digital identification, keystroke tests and automated marking – becomes easier. Healthcare must get out of the ‘hire a room with round tables, a flipchart and PowerPoint (often awful)’ approach to training. The one body that is trying here is HEE, with its E-learning for Health initiative. Online learning can genuinely increase knowledge and skills at a much lower cost.
Conclusion

It is now clear that AI can alleviate clinical workload, speed up doctor–patient interaction and investigation, improve diagnosis and provide cheaper treatment options, as well as lower the cost of medical training. We have a single, public institution, the NHS, where, with some political foresight, a policy around the accelerated research and application of AI in healthcare could help alleviate the growing burden of care. Europe has 7% of the world’s population, 25% of its wealth and 50% of its welfare spending, so simply spending more on labour is not the solution. We need to support healthcare professionals and make them more effective by taking away the mundane sides of their jobs through AI, automation and data analysis.
