Wednesday, March 06, 2019

Summarising learning materials using AI - paucity of data, abundance of stuff

We’ve been using AI to create online learning for some time now. Our approach is to avoid big data, analytics and prediction software, as there are almost no contexts with anywhere near enough data to meet the expectations of the buyer. AI, we believe, is far better at precise goals, such as identifying key learning points, creating links to external content, creating podcasts using text to speech and semantically interpreting free-text input by learners. We’ve done all of this, but one thing always plagues the use of AI in learning: although there’s a paucity of data, there’s an abundance of stuff!

Paucity of data, abundance of stuff
Walk into many large organisations and you’ll encounter a ton of documents and PowerPoints. They’re often over-written and far too long to be useful in an efficient learning process. That doesn’t put people off, and in many organisations 50-120 or more PowerPoint slides are still delivered in a room with a projector, as training. It’s not much better in Higher Education, where the one-hour lecture is still the dominant teaching method. The trick is to have a filter that can automate the shortening of all this stuff.

To summarise or précis documents (text) down to the ‘need to know’ content, there are three processes:
1. Human edit
No matter what AI techniques you use to précis text, it is wise to first edit out, by hand, the extraneous material that learners will not be expected to learn: supplementary information, disclaimers, who wrote the document and so on. With large, well-structured documents, PDFs and PPTs it is often easy to identify the introductions or summaries in each section; these form ready-made summaries of the essential content for learning. Regard this step as simple data cleansing, or hand washing! Now you are ready for further steps with AI.
2. Extractive AI
This technique keeps sentences intact and only ‘extracts’ the most relevant ones to form a summary. We usually do a quick human edit first, then extract the relevant shortened text, which can be used in WildFire or on its own. This is especially useful where the content is already subject to regulatory control (approved by an expert, lawyer or regulator), for example medical content in the pharmaceutical industry or compliance.
3. Abstractive AI
This is a summary that is rewritten, using a set of training data and machine learning to produce new text. Note that this approach needs a large, domain-specific training set, and by large we mean as large as possible: some training sets run to gigabytes of data. That data also has to be cleaned.
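To make the extractive approach concrete, here is a minimal sketch in Python of frequency-based sentence scoring. This is an illustration of the general technique only, not WildFire’s actual pipeline; the sentence splitter, stopword list and scoring heuristic are our own assumptions:

```python
import re
from collections import Counter

def extractive_summary(text, num_sentences=2):
    """Score each sentence by the frequency of its topical words and keep
    the top-scoring ones, preserving original order (sentences stay intact)."""
    sentences = re.split(r'(?<=[.!?])\s+', text.strip())
    # Ignore very common words so scores reflect topical terms
    stopwords = {'the', 'a', 'an', 'and', 'or', 'of', 'to', 'in', 'is',
                 'it', 'that', 'for', 'on', 'as', 'are', 'be', 'with'}
    freq = Counter(w for w in re.findall(r'[a-z]+', text.lower())
                   if w not in stopwords)
    # (score, position, sentence) for every sentence
    scored = [(sum(freq[w] for w in re.findall(r'[a-z]+', s.lower())), i, s)
              for i, s in enumerate(sentences)]
    # Take the highest-scoring sentences, then restore document order
    top = sorted(sorted(scored, reverse=True)[:num_sentences],
                 key=lambda t: t[1])
    return ' '.join(s for _, _, s in top)
```

In practice, regulated content would go through this kind of extraction precisely because every output sentence appears verbatim in the approved source, which an abstractive model cannot guarantee.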


The end result is automatically shortened documents, from original large documents, PowerPoints, even video transcripts. These we can input into WildFire; rather than delivering intense training on huge pieces of content, you get the essentials. The summaries themselves can be useful in the context of the learning experience. So if you have a ton of documents and PowerPoints, we can shorten them quickly and produce online learning in minutes not months, at a fraction of the cost of traditional online learning, with very high retention.
