Wednesday, June 07, 2023

Apple will shift us from 2D to 3D and set the pace in learning with the Vision Pro

Artificial Intelligence is like a black hole, sucking in all attention in learning technology, but I published a book, 'Learning in the Metaverse' (2023), on the shift taking place in parallel, from 2D to 3D. The publisher insisted on using the word 'Metaverse' but it is really about mixed reality.

The world’s major religions have all posited a virtual 3D afterlife; we build monumental 3D spaces as theatres, cinemas and sports stadia for social gatherings; and we have had full-blown 3D video games since the early 90s. Roblox, Fortnite and Minecraft have hundreds of millions of users. We are 3D people who live and work in a 3D world, yet most learning is 2D: text on paper, PowerPoint, or 2D images and text on screens.

Vision Pro

I have always maintained that the shift into virtual worlds will happen and have written about this extensively. What it needed was consumer tech that would make it happen. Apple have just released the Vision Pro, a high-end VR headset. Apple have set the standard and trajectory going forward. It is a springboard product. What they're after is the redefinition of the human-machine interface. It has an eye-watering price of $3,500 and, at 2-4 hours, limited battery life, but oh what a product. To be fair, it is called ‘Pro’ because they’re releasing it to the research and professional market.

Apple is selling a dream machine here, a window into new immersive realities. This opens the mind up to heavens on earth, but also to combinations of the real and unreal. You are not looking at a screen, you are in a world. It also redefines the boundary between the real and virtual worlds. That is the mind shift Apple is selling. This is not using a device, it is being inside a device. It genuinely blurs the real and virtual: mixed reality becomes a matter of degree, controlled by one small wheel on the headset. This is the shift, redefining that boundary as a matter of degree.

Sure, it's a little heavy, but it packs a lot into a standalone unit. It has a speaker on either side, and lots of cameras, sensors and fans, all run from M2 and R1 chips, all inside the headset. There's a cable to an external battery, which you put in your pocket, giving 2-4 hours of battery life.


Superb interface

It frees apps from the restrictions and boundaries of a perceived screen. They can be placed anywhere in the new 3D space, which is especially useful for collaboration. Watching TV and movies will be like watching a 100-foot-wide screen - superbly immersive, with 180-degree experiences and spatial audio. If you have a Mac you can AirPlay from the Mac into the headset at varying resolutions, ultra-high resolution being one. It is superb quality.

It is a complete computer on your head, with a ton of sensors and cameras inside and outside. The interface is wholly eyes, hands and voice. No controllers: you just look (eye tracking is sensational, though it takes a little getting used to), speak and pinch your fingers. Your hand can be anywhere when you pinch. It is fast and accurate, highlights what you are looking at, and lets you click wherever you want. A virtual keyboard can pop up, and you can also talk to type. As it has its own OS, called visionOS, it’s the real deal. Like touching on an iPad, you can look and use your hands to select, scroll, throw, resize and drag stuff around, with low latency. It will also sync with your Mac so you can use your desktop and other applications. It will, of course, mirror your display as if it were a Mac with a giant screen. You can play games, watch movies or use it as a desktop.
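As a rough mental model, look-then-pinch is a gaze-ray hit-test followed by a pinch acting as the 'click'. Here is a minimal sketch of that idea in Python - all names and numbers are hypothetical illustrations, not visionOS APIs (Apple exposes this through SwiftUI and RealityKit):

```python
import math

def gaze_target(targets, gaze_dir, max_angle_deg=2.0):
    """Return the target whose direction is closest to the gaze ray,
    if it falls within a small angular tolerance (eye tracking is
    accurate but not pixel-perfect, so targets get a small 'cone')."""
    best, best_angle = None, max_angle_deg
    for name, direction in targets.items():
        # angle between gaze ray and target direction (both unit vectors)
        dot = sum(g * d for g, d in zip(gaze_dir, direction))
        angle = math.degrees(math.acos(max(-1.0, min(1.0, dot))))
        if angle < best_angle:
            best, best_angle = name, angle
    return best

def handle_frame(targets, gaze_dir, pinched):
    """One frame of the loop: highlight whatever the eyes rest on,
    'click' it only when the fingers pinch."""
    hovered = gaze_target(targets, gaze_dir)
    return ("click", hovered) if (pinched and hovered) else ("hover", hovered)

# Two app windows placed in space, as unit direction vectors from the user.
windows = {"Safari": (0.0, 0.0, -1.0), "Music": (0.707, 0.0, -0.707)}

print(handle_frame(windows, (0.0, 0.0, -1.0), pinched=False))  # ('hover', 'Safari')
print(handle_frame(windows, (0.0, 0.0, -1.0), pinched=True))   # ('click', 'Safari')
```

The point of the sketch is why the hand can be anywhere: the eyes do the pointing, the pinch only supplies the commit signal.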

Customizing an environment with a large number of simultaneous windows is the big win. This takes computing into a manipulable 3D space.


You can open multiple apps, move them around and let go to lock them in 3D space. Remember, you are seeing passthrough, in the sense of cameras showing you the outside world - it's not actually AR. The passthrough is pretty much real time: you can play ping-pong in this reconstructed real world. Super-close-up is difficult, but you can use your phone while inside the headset. You can dial the passthrough down for ever more immersion until you are fully immersed - on the moon, wherever - there are worlds provided.

How does text input work?

You can poke a virtual keyboard with your real fingers, or look at a key and pinch. You can also look at the microphone icon and dictate.

This is a typical Apple move: refine the user experience and make it as simple and intuitive as possible. Imagine this combination of eye tracking, gestures and voice recognition on all future devices. Optic ID is included for privacy.

Voice recognition matters because AI has now provided that interface into chatbot functionality, where the chatbot truly understands your meaning. I suspect Apple already have their own LLM-driven version of ChatGPT that will eventually be integrated. The learning possibilities are mind-blowing.


This interface opens the device up to learning, as you’re not taking up tons of cognitive bandwidth - only looking, pinching and talking. I can already see training in real contexts taking place with AI-generated avatars and 3D worlds, sophisticated learning pathways, real assessment of performance and great data tracking, even of eye movements and behaviours. This may, at last, be the way we can really train and assess skills. Its possibilities in training and performance support are clear.


Some stock apps come with Vision Pro: Apple Music, set in a music room; Apple TV and Disney+ have their own environments; the photo app adds parallax; and you can view the night sky. JigSpace lets you import 3D models and play around with them - a dead cert for training. Keynote lets you practise a talk in an immersive environment. Then there's a ton of existing compatible apps you can use straight off. No Netflix, Spotify or YouTube yet.

It will automatically connect to your Mac or iPhone, black out the screen and make it usable within the headset. Remember, multiple apps can be used, so you can use these alongside your Mac screen. This is seamless and useful.



The eyes on the outside of the headset are your virtual eyes. This is powered by your persona - impressive and strange at the same time. When you are in passthrough, others can see your eyes; they are blurred out when you are immersed. It detects people nearby, and they shine through your immersive images as they approach. That's clever.

To get your eyes, you have to capture them, along with your hands; then you take the headset off, look at it, and turn your head right and left, up and down. Then facial expressions: smile, raised eyebrows, closed eyes and so on. This is pretty good - it really looks like you. Once the capture is complete it has your persona, or avatar. You can edit glasses, skin tone and so on. Meta have done this too, but it's pretty amazing.

You can use your persona in FaceTime, and if you all have Vision Pros on, it will look as though each of you is looking towards the other person's face at the right angle.

I already have Synthesia and HeyGen avatars, as well as Digital-Don, my GPT. Then there's my identity on Facebook, Twitter, Blogger and LinkedIn, and on top of that my email addresses. Our personas are multiplying as we re-present ourselves in the virtual world.

Spatial audio

Voices come from where people are in the room. Put them at a distance and they seem far away. It's not yet fully real, but it's getting there. One could easily run collaborative training sessions, seminars and meetings. One could eventually have patients, customers or employees as automated avatars.
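The 'voices come from where people are' effect rests on simple physics: sound arrives quieter with distance, and slightly later and quieter at the far ear. A toy sketch of that idea - illustrative only, nothing like Apple's actual spatial audio pipeline:

```python
import math

SPEED_OF_SOUND = 343.0  # metres per second
EAR_SPACING = 0.2       # approximate distance between the ears, metres

def spatialise(source_x, source_z):
    """Given a source position (x to the right, z ahead, listener at the
    origin), return a per-ear gain and the inter-ear delay that make the
    voice seem to come from that direction and distance."""
    left_ear, right_ear = (-EAR_SPACING / 2, 0.0), (EAR_SPACING / 2, 0.0)

    def dist(ear):
        return math.hypot(source_x - ear[0], source_z - ear[1])

    d_left, d_right = dist(left_ear), dist(right_ear)
    # inverse-distance attenuation: farther voices are quieter
    gain_left = 1.0 / max(d_left, 1.0)
    gain_right = 1.0 / max(d_right, 1.0)
    # the far ear hears the sound a fraction of a millisecond later
    # (negative delay here means the right ear hears it first)
    delay_ms = (d_right - d_left) / SPEED_OF_SOUND * 1000.0
    return gain_left, gain_right, delay_ms

near = spatialise(0.0, 1.0)   # directly ahead, 1 m away
far = spatialise(0.0, 4.0)    # directly ahead, 4 m away - much quieter
side = spatialise(2.0, 0.0)   # off to the right - right ear louder, first
```

Move a voice from 1 m to 4 m and its gain drops to roughly a quarter; place it to the right and the right ear gets it louder and earlier. That tiny asymmetry is what the brain decodes as direction.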



Quite cleverly, it renders the part you’re looking at in more detail, not the whole screen - so it looks super-sharp. For learning, the passthrough opens up all sorts of possibilities in mixed reality, as layers of reality can be overlaid on the real world; the AR learning opportunities are endless. With a turn of the cog on the headset, you can control the degree of immersion. This is neat.
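This eye-driven trick is known as foveated rendering: full resolution only where the eye is pointed, progressively coarser towards the periphery. A small sketch of the budgeting idea - the tiers and numbers are made up for illustration, not Apple's:

```python
def render_scale(angle_from_gaze_deg):
    """Fraction of full resolution to spend on a screen region, given its
    angular distance from the current gaze point. The fovea only covers a
    couple of degrees, so detail can fall off quickly outside it."""
    if angle_from_gaze_deg <= 2.0:    # foveal region: full detail
        return 1.0
    if angle_from_gaze_deg <= 10.0:   # near periphery: half detail
        return 0.5
    return 0.25                       # far periphery: quarter detail

# Pixel budget for a frame, relative to naively rendering everything sharp:
regions = [1.0, 5.0, 15.0, 40.0]      # sample angular offsets of screen tiles
budget = sum(render_scale(a) for a in regions) / len(regions)
print(budget)  # 0.5 - half the pixels of rendering the whole view at full detail
```

The payoff is that the image looks uniformly sharp to the wearer (the eye can only resolve detail at the gaze point anyway) while the GPU does a fraction of the work - which is exactly why accurate, low-latency eye tracking matters so much in this headset.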


There is one very strange feature. The front of the headset has a screen that displays your eyes - not your real eyes, but a representation of them. It is activated if someone comes close. Clever, if a bit weird, but the idea is to make you seem more human from the outside. Very Apple.

A shout-out also to Meta's Quest 3, as it also has passthrough; you can get apps up and watch high-res movies. It is also more comfortable to wear as it is lighter - and a LOT cheaper. Don't write off other manufacturers here.


I've seen some fantastic applications using the VIVE and Oculus recently - for example, projects on CPR, as well as crime scene investigation for the police. The trainers tell us that the reaction from trainees is overwhelmingly positive. More to the point, the simple fact that you can 'look where you want' is the special feature that makes it real and therefore relevant. Far too much training for real-world jobs is done in classrooms; here we can bring the real world into the classroom. Better still, the headset is standalone, so you can also take it out into the world. There are already videos of people using it while walking around, doing exercise and so on.

The book covers a ton in learning:

Spatial thinking, Extended mind, Motivation, Self-Determination Theory

Presence, Agency, Embodied learning, Vision, Sound, Touch, Autonomy and generative learning, Implementation

Tyranny of the real, Tyranny of place, Tyranny of time, Tyranny of 2D, Learning transfer, Assessing competence

Social metaverse, Collaboration, Social learning, Social skills

Healthcare data, Data in learning, Eye tracking, Hearing, Haptics


Apple will watch and see what developers come up with. I suspect entertainment, ringside seats at sports events, cinema and games will figure large. This, by all accounts, will deliver stunning experiences. But it is the desktop market that is the new battleground, and they’ve made a big move, way ahead of Microsoft's awful HoloLens.

An all-new App Store provides users with access to more than 1 million compatible apps across iOS and iPadOS, as well as new experiences that take advantage of the unique capabilities of Vision Pro. 

It's expensive, and the separate battery hanging from a cord, with only 2-4 hours of life, is a bit of a disappointment - one movie and you’re out. But oh what a product. I have to tip my hat to Apple here; they're starting to take risks again.
