Thursday, June 20, 2024

British Library. Books look like museum pieces, as that is what they are becoming.

Make it real! Can we actually deliver AI through current networks?


A talk and chat at the Nokia event held in the British Library. Wonderful venue, and I made the point that we first abstracted our ideas onto shells 500,000 years ago, invented writing 5,000 years ago and printing 500 years ago, and here we are discussing a technology that may eclipse them all – AI.

Bo heads up Nokia’s Bell Labs, who are working on lots of edge computing and other network research, and we did what we do with ChatGPT – engaged in dialogue. I like this format, as it’s closer to a podcast: more informal, and it seems more real than a traditional keynote.

It was also great to be among real technology experts discussing the supply problems. There's something about focused practitioner events that makes them more relevant. Microsoft told us about GPT-5 testing, and there were some great case studies showing the massive impact AI is having on productivity.

Quantum computing was shown and discussed, and there was an interesting focus on the backend network and telco problems in delivering AI. We have unprecedented demand for compute and for the delivery of data at ever lower latency, yet much of the system was never designed for this purpose.

Energy solutions

The race is on to find energy solutions such as:

Fusion is now on the horizon

Battery innovation progresses

AI to optimise power use now common

Low-power quantum computing beginning to be realised

Compute solutions

Models have to be trained, but low-latency dialogue also has to be delivered:

Chip wars with increasing capability at lower costs

Quantum computers with massive compute power

Application-Specific Integrated Circuits (ASICs) and Field-Programmable Gate Arrays (FPGAs), optimised for AI workloads with lower power consumption

Edge computing moves processing closer to the data source at the edge of the network, reducing the need for centralised compute resources and lowering latency

Federated learning allows multiple decentralised devices to collaboratively train models while keeping the data localised (a minimal sketch follows this list)

Neuromorphic computing with chips that mimic neural structures, offering potential efficiency gains for AI workloads
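To make the federated learning item above concrete, here is a minimal sketch of federated averaging in Python. The toy linear model, the three simulated devices and their data are all assumptions for illustration; the point is simply that only weights, never raw data, leave a device.

```python
# Minimal federated averaging (FedAvg) sketch: a toy linear model trained
# across devices that each keep their data local. Illustrative only.
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """Train on one device's private data; only the weights leave the device."""
    w = weights.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)  # gradient of mean squared error
        w -= lr * grad
    return w

def federated_round(global_weights, devices):
    """One round: every device trains locally, the server averages the results."""
    local_weights = [local_update(global_weights, X, y) for X, y in devices]
    return np.mean(local_weights, axis=0)

# Three hypothetical devices, each holding its own private (X, y) data
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
devices = []
for _ in range(3):
    X = rng.normal(size=(50, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=50)
    devices.append((X, y))

w = np.zeros(2)
for _ in range(20):
    w = federated_round(w, devices)
print(w)  # approaches [2.0, -1.0] even though no raw data was ever pooled
```

The same averaging idea scales up to neural networks on phones or edge nodes; what changes is the model and the communication cost, not the basic loop.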

Software efficiency

There’s also a ton of stuff on software and algorithmic efficiency, such as:

Model compression through pruning, quantisation and distillation to reduce the size and computational requirements of AI models (see the sketch after this list)

More efficient training methods like transfer learning, few-shot learning, and reinforcement learning to reduce the computational cost of building AI models.
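
As a rough illustration of the compression ideas above, here is a sketch in plain numpy of magnitude pruning and 8-bit quantisation applied to one weight matrix. The matrix, the 50% sparsity level and the bit width are illustrative choices, not any particular framework's API.

```python
# Two compression ideas in miniature: zero out small weights (pruning) and
# store the rest as 8-bit integers plus one scale factor (quantisation).
import numpy as np

def prune(weights, sparsity=0.5):
    """Zero out the smallest-magnitude weights (here, half of them)."""
    threshold = np.quantile(np.abs(weights), sparsity)
    return np.where(np.abs(weights) < threshold, 0.0, weights)

def quantise(weights, bits=8):
    """Map float weights to signed integers, plus a scale to map them back."""
    scale = np.max(np.abs(weights)) / (2 ** (bits - 1) - 1)
    q = np.round(weights / scale).astype(np.int8)
    return q, scale

rng = np.random.default_rng(0)
W = rng.normal(size=(4, 4))

W_pruned = prune(W)              # half the entries are now zero
W_q, scale = quantise(W_pruned)  # stored as int8 plus a single float
W_restored = W_q * scale         # cheap to reconstruct at inference time

print(np.abs(W_pruned - W_restored).max())  # small quantisation error
```

Distillation, the third technique mentioned, works differently: a small model is trained to reproduce the outputs of a large one, so the saving comes from the architecture rather than from how the weights are stored.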

Delivery

Network infrastructure moves towards 5G to provide high-speed, low-latency connectivity, essential for real-time AI applications and global delivery. Content Delivery Networks (CDNs) can cache AI models and results closer to users, reducing latency and bandwidth usage.
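
The caching point can be shown in the simplest possible terms with a cache-aside sketch; the in-memory dictionary standing in for an edge cache and the simulated origin latency are assumptions for illustration, not a real CDN API.

```python
# Cache-aside sketch: answer from a nearby cache when possible, and only go
# back to the origin model on a miss. Latencies here are simulated.
import time

edge_cache = {}  # imagine this living at a point of presence near the user

def origin_model(prompt):
    """Stand-in for a round trip to a centralised model (the slow path)."""
    time.sleep(0.2)  # pretend network plus inference latency
    return f"answer to: {prompt}"

def serve(prompt):
    if prompt in edge_cache:       # cache hit: answered at the edge
        return edge_cache[prompt]
    result = origin_model(prompt)  # cache miss: fetch from the origin
    edge_cache[prompt] = result    # populate the edge cache for next time
    return result

for attempt in (1, 2):
    start = time.perf_counter()
    serve("what is edge computing?")
    print(f"attempt {attempt}: {time.perf_counter() - start:.3f}s")
# the first call pays the origin latency; the second is served locally
```

Real CDNs add eviction, invalidation and geography, but the latency argument is the same: the closer the answer sits to the user, the less the backhaul network has to carry.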

Two-horse race

Of course, all of this has to be delivered, and it is now clear that the biggest companies in the world are AI companies. NVIDIA is now the most valuable company on the planet at $3.34 trillion, delivering the spades to the gold miners; Microsoft is at $3.32 trillion, Apple a touch less at $3.29 trillion, Google at $2.17 trillion and Facebook at $1.27 trillion. In China, Tencent sits at $3.65 trillion, Alibaba at £1.43. This is a two-horse race, with the US well ahead and China chasing and copying. Europe is still in the paddock.

Conclusion

Afterwards, I went to the Treasures of the British Library gallery. There lay the first books, written, then printed: a 2,000-year-old homework book, early Korans, the Gutenberg Bible. We made this work by developing paper and printing technologies, block printing, moveable type, book formats, and networks for publishing and distribution. This was undermined by the internet, but something much more profound has just happened.



It struck me that in that same building we had just witnessed a revolution that surpasses both. The sum total of all that written material, globally, is now being used to train a new technology, AI, that allows us to have a dialogue with it and make the next leap in cultural advancement. We have invented a technology (books and printing were also technologies) that transcends even the digital presentation of print, into a world where the limitations of that format are clear. We are almost returning to an oral world, where we talk with our past achievements to move forward into the future.

We are no longer passive consumers of print but in active dialogue with its legacy. These books really did look like museum pieces, as that is what print has become.

 
