Monday, February 22, 2021

From Nazi Punchcards to the Cambrian Explosion of LMSs and now LXPs - a short history of enterprise learning software


Introduction

The LMS (Learning Management System) and now the LXP (Learning Experience Platform) are best seen historically as growing from several roots: hardware, software, business software, connectivity and pedagogy.

Hardware

One early hardware development was the punch-card system used in Jacquard's loom, invented in 1804, which allowed complex inputs to automate the weaving of intricate patterns. Babbage wanted to incorporate punched cards into his Analytical Engine, but the idea was carried forward by Herman Hollerith, who worked for the US Census. He patented a punch-card system in 1884. His card eventually had 12 rows and 24 columns, enough to record human characteristics for the 1890 census. Cleverly, the punched cards were run over cups of mercury, and spring-loaded pins would complete a circuit wherever there was a hole: an early use of electricity for data storage. His machine could calculate totals but also combine traits. The time taken to complete the US census fell from eight years to a single year in 1890. Hollerith's company eventually became IBM in 1924.

In 1933 the Germans, under Hitler, were keen to run a national census, as it would identify undesirable races and traits. Thomas Watson, the first CEO of IBM, flew to meet Hitler in 1933 and made a sizeable investment in IBM's German subsidiary. As Hitler expanded into other countries, these Hollerith systems were used to identify Jews and other groups, as told in the excellent book ‘IBM and the Holocaust’ by Edwin Black. The system stored data on skills, race and sexual orientation. Jews, Gypsies, the disabled and homosexuals were identified and selected for slave labour and death trains to the concentration camps. The LMS did not start well. 

Software

There were other notable events around software, such as the famous AI conference at Dartmouth in 1956, from which several projects emerged, even an early chatbot. There was a long period of experimentation before usable computers came along. This experimentation is summarised by Atkinson & Wilson (1969) in 21 papers looking at the then trends in CAI (Computer Assisted Instruction). The experimentation was especially strong in the military, as explained by Fletcher and Rockway (1986). PLATO and its rival TICCIT were largely confined to academic experimentation, although there were some corporate examples. All of this played a role, albeit a slow and relatively minor one, in this period of experimentation. 

Down to business

Then home computers, made possible by the mass production of the microprocessor from 1971, led to an explosion of activity in the 1980s. But it was the IBM PC, released in 1981, that gave real impetus to CBT (Computer Based Training), alongside a range of consumer computers such as the Commodore 64. This gave rise to an embryonic computer-based training industry. My first learning programme, to teach Russian, was written in the early 1980s on a Commodore 64. Other machines, such as the BBC Micro in the early 1980s in the UK, were seen immediately as having educational uses.

Many programmes were produced and distributed on floppy discs of various sizes. Other storage media, such as interactive videotape, videodisc (laserdisc), CD-i and CD-ROM, were used to store much larger amounts of data and media. There was a burst of creative activity, as video, audio and images could be used with an overlay of text from computers.

Then came networked enterprise systems with client-server architectures. IBM, a hardware then a software company, competed directly with Microsoft on software through its Lotus SmartSuite: Lotus 1-2-3, Lotus Notes and so on. This was a rival to Office, with the last release as late as 2014! SAP, founded in 1972 as a spin-off by former IBM employees, stuck to ERP software. Microsoft was a software company built on its operating system and then the Office suite, which came along in the late 1980s. Cisco, founded in 1984 out of Stanford, was a networking company. Enterprise software became the norm, as it did for learning. All of these set the scene for learning systems that operated at the enterprise level.

LMS Cambrian Explosion

This all led to a Cambrian explosion of LMSs between 1999 and 2001, tracked in detail by Brandon Hall. Sales of LMSs took off, Brandon Hall published a specification list, and before long there were around 250 systems. On top of this, HR companies like SAP, PeopleSoft and Oracle entered the LMS market. A split then emerged between the LMS (corporate market) and the VLE (education market), the latter served by eCollege, Blackboard and WebCT.

There were eventually two main groups: those that developed out of client-server training systems and those that were born on the web. Saba, Click2Learn, Pathlore, Learnframe and Thinq were originally client-server and had to be rewritten for the web. They often used Java applets, client-end software and plug-ins, with a client-server back-end for administration. The born-on-the-web group included Docent, KnowledgePlanet and Teamscape, which had web browser interfaces. They had the advantage of being more scalable and easier to roll out and maintain, with fewer technical changes. Later, open source LMSs emerged, the most successful being Moodle, which was in turn forked into corporate versions. In late 2001 the market was made more complicated by the introduction of the LCMS (Learning Content Management System). These vendors claimed additional functionality around authoring, learning object repositories and dynamic delivery. The distinction soon blurred as LMS vendors adopted these features. Interestingly, the learning object approach has returned with micro-learning, 20 years later.

At the same time ADL and others came up with de facto standards. It is important to note that there are few real standards in e-learning: what we have instead is a collection of specifications, guidelines and reference models, a set of de facto (not de jure) standards. Across the range of LMSs on offer there were varying degrees of adoption of AICC, IMS and SCORM. Then there were accessibility standards, an increasing demand, especially in the public sector.

Dominant model

At the start, Brandon Hall issued a specification list that led to procurement against the list, and so LMS complexity grew. Brandon Hall reported 27 LMSs in 1998, 50 in 2000, and by 2003 they had selected 70; there were many more, and the market continued to grow. Since then there have been many failures, mergers and acquisitions, but it remains a large $7 billion market, having shifted to a SaaS model. For the last 20 years this has been the dominant model, but there has always been dissatisfaction over integration, lack of data and shortfalls in functionality and delivery. 

As repositories for content, they were more about management than learning. There had always been dissatisfaction with the model: poor interfaces, cumbersome sign-on, clumsy menu systems and the delivery of ‘courses’. In a sense, they still mimic classroom courses, managing them after they have been converted to online delivery. They fail to provide the flexibility the workplace needs in both push and pull, in moments of need, and in more sophisticated pedagogy, especially around motivation.

Learning Experience Design

Learning Experiences came from a different root. An early mention of Learning Experience Design is by McLellan (2002) who, prophetically, mentions Harvard case studies, simulations, virtual reality and artificial intelligence, and recommends the rehabilitation of the emotional side of learning. She cites Pine and Gilmore (1999), who talk of the ‘Experience Economy’: transformative experiences that change us in some way. This line of thought was heavily influenced by the attention-grabbing experiences people were getting from games, imagery, TV and film on the web.

There was also a growing interest in UI and UX. The web was delivering a UI experience that was personalised, used recommendation engines and looked slick. The LXP world was similarly data-driven and started to use recommendation engines, AI and sentiment analysis.

We should also remember the deep roots of media design in radio, TV and computer games. The web delivered media and multimedia experiences that the learning community wanted to mimic. Hence the rise of video with Netflix- and YouTube-style interfaces; audio, which had its roots in distance learning with the School of the Air in Australia, now appears as podcasts.

Another development was the rise of corporate social platforms, such as Yammer and Slack; Yammer was eventually folded into the likes of Teams. There was also a move by Microsoft, SAP and others to focus more on workflow products.

Shift to LXP

The move from LMS to LXP came from a specific line of thought that had been around for 30 years: performance support. Gloria Gery (1991) defined this as EPSS, Electronic Performance Support Systems. Jay Cross (2011) worked tirelessly on the concept, and more recent practitioners such as Bob Mosher (2011) have focused on moments of need and innovative forms of curation and performance support. The 70:20:10 movement, spearheaded by Charles Jennings and Jos Arets, has also helped highlight the need for real traction in the workplace, with more of a mixture of formal content and informal techniques.

In addition, Degreed and some other companies entered the corporate market. LMS vendors are now busily transforming their LMSs into LXPs or building LXPs from scratch, while LXP vendors are having to become LMSs. They will, in the end, be single platforms and tools.


These new platforms use the technology of the day, AI and data, to signpost, recommend and automate workflow processes. xAPI will replace SCORM and data-driven approaches will push the old static forms of delivery aside. We live in the age of algorithms, and just as everything we do online is mediated by AI and personalised by using personal and aggregated data, so it will be with learning. This is outlined in my book AI for Learning.
