Sunday, November 21, 2021

Stickgold & Walker Sleep and learning

Robert Stickgold is Professor of Psychiatry at Harvard Medical School, whose sleep research looks at the links between sleep and learning, especially the effects of sleep deprivation. He was a colleague and mentor to Matthew Walker, an English sleep researcher who is now Professor of Neuroscience and Psychology at the University of California, Berkeley. Walker's research is on sleep, and his international bestseller Why We Sleep (2017) contains much that is relevant to sleep, memory and learning. His 2019 TED talk Sleep is Your Superpower was also hugely popular, watched by millions.

Sleep and memory

Walker has written about the effects of sleep on student learning and recommends a rethink of end-of-semester exams that encourage cramming, even all-nighters. He has changed his own teaching to avoid final exams, splitting his courses into thirds to spread the assessment load.

Sleep before learning

Sleep is an active process that improves memory. When we are awake, the hippocampus acts as a short-term store for new memories of what we experience and learn in the world. It is limited in capacity, and deals with this by shifting memories to other locations, namely the cortex, during sleep. This can be tested using daytime naps: Walker compared a 90 minute 'nap' group with a 'no-nap' group after both had performed a taxing 100 face-name pair task. Later that day, another intense learning task was performed, to see if the ability to learn had declined. Those who napped actually increased their ability to learn, while those who stayed awake showed a decline, the difference being a staggering 20%. Light, Stage 2 NREM sleep and short sleep spindles appeared to drive the effect, concentrated in the later period of a night's sleep. Sleep, it seems, refreshes our ability to learn, so getting up too early and shortening your sleep appears deleterious to learning. This benefit also seems to decline with age.

Sleep after learning

What about after one has learnt something? Consolidation of memories has been posited for 2,000 years, but it was Jenkins and Dallenbach (1924) who first tested forgetting of verbal facts over eight hours spent either awake or asleep. The result has been replicated many times: forgetting is greater in the group that stays awake, with the sleep groups retaining 20-40% more.

REM and NREM sleep were then discovered in the 1950s, and the link between memory consolidation and deep NREM sleep was established, with MRI evidence indicating that memories literally move from the hippocampus to the neocortex during sleep. It would appear that your cache of memories gets cleared and stored every night, leaving you ready for the next day's learning. Sleep can also improve learning by rescuing memories that seemed lost during learning. Motor skills, too, are consolidated and enhanced during sleep.

Stimulating learning during sleep

Sleepers have been stimulated with electrical voltage pulses during deep NREM sleep. They felt nothing but doubled their ability to recall facts learnt just before going to sleep. Quiet auditory tones synchronised with brain waves, played from speakers next to the bed, have also been found to have an effect, namely a 40% improvement in recall.

Sleep to forget  

When two groups were presented with words to remember, but told to remember some (tagged R) and forget others (tagged F), the group that had a 90 minute nap remembered more R words and forgot more F words. It would seem that sleep is quite intelligent, or at least selective, about which memories it stores.

Sleep and emotions

Emotions, the affective side of learning, are also influenced by sleep. The brain reprocesses and modulates emotions through sleep, and sleep deprivation encourages heightened emotional responses, including aggression, bullying and behavioural problems in children.

Critique

Walker has been criticised for being slapdash with his data and references in his book. He has responded and apologised for some of its weaknesses.

Influence

Walker’s book and TED talk popularised sleep research and although he has been criticised for some inaccuracies, the benefits of sleep are now well known, especially among teachers and parents, worried by the rise in late night screen time.

Bibliography

Walker, M., 2017. Why we sleep: Unlocking the power of sleep and dreams. Simon and Schuster.

Stickgold, R. and Walker, M.P., 2013. Sleep-dependent memory triage: evolving generalization through selective processing. Nature Neuroscience, 16(2), pp.139-145.

Walker, M.P. and van der Helm, E., 2009. Overnight therapy? The role of sleep in emotional brain processing. Psychological Bulletin, 135(5), p.731.

Walker, M.P. and Stickgold, R., 2004. Sleep-dependent learning and memory consolidation. Neuron, 44(1), pp.121-133.

Walker, M.P. and Stickgold, R., 2006. Sleep, memory, and plasticity. Annu. Rev. Psychol., 57, pp.139-166.

Jenkins, J.G. and Dallenbach, K.M., 1924. Obliviscence during sleep and waking. The American Journal of Psychology, 35(4), pp.605-612.


Vannevar Bush Internet visionary

Vannevar Bush (1890 - 1974) was Dean of the School of Engineering at MIT, a founder of Raytheon and the top US science administrator during World War II. He widened research to include partnerships between government, the private sector and universities, a model that survives to this day in the US. He claimed that his leadership qualities came from his family, who were sea captains and whalers. He was also a practical man, with inventions and dozens of patents to his name, most famously his Differential Analyzer. As an administrator and visionary he not only created the environment for much of US technological development during and after World War II, leading to the internet, but also gave us a powerful and influential vision for what became the World Wide Web.

Differential Analyzer

Bush built his Differential Analyzer in 1931, arguably the first computer. It was an analogue, electrical-mechanical device with six disc integrators. The size of a small room, it could solve differential equations with up to 18 variables. By the late 1930s the digital mindset and technology began to emerge, with the English engineer Tommy Flowers using vacuum tubes as binary switches in electrical circuits. A century after Babbage, the concept of the modern computer was established.
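
It helps to restate the machine's job in modern terms: each disc integrator mechanically accumulated one quantity with respect to another, and chaining integrators solved a differential equation. Below is a minimal digital sketch of the same idea, using simple Euler integration on a damped oscillator (the example and values are mine, for illustration, not one of Bush's problems):

```python
# Digital analogue of the Differential Analyzer's job: integrate a
# differential equation step by step. Here x'' = -k*x - c*x', a damped
# oscillator, with two chained 'integrators'.

def solve_oscillator(k=1.0, c=0.1, x=1.0, v=0.0, dt=0.01, steps=1000):
    trajectory = []
    for _ in range(steps):
        a = -k * x - c * v   # the equation supplies the acceleration
        v += a * dt          # 'integrator' 1: acceleration -> velocity
        x += v * dt          # 'integrator' 2: velocity -> position
        trajectory.append(x)
    return trajectory

print(round(solve_oscillator()[-1], 3))  # position after 10 simulated seconds
```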

Innovation and the internet

When World War II came along he headed up Roosevelt's National Defense Research Committee and oversaw the Manhattan Project, among many other programmes. Basic science, especially physics, he saw as the bedrock of innovation. It was technological innovation, he thought, that led to better work conditions and more “study, for learning how to live without the deadening drudgery which has been the burden for the common man for past ages”. His post-war report led to the founding of the National Science Foundation, and Bush's triad model of government, private sector and universities became the powerhouse for America's post-war technological success. Research centres such as Bell Labs, RAND Corporation, SRI and Xerox PARC were bountiful in their innovation, and all contributed to that one huge invention - the internet.

As We May Think

Bush was fascinated by the concept of augmented memory and, in his wonderful 1945 article As We May Think, described the idea of a 'Memex'. It was a vision he came back to time and time again: the storage of books, records and communications, an immense augmentation of human memory that could be accessed quickly and flexibly - essentially the internet and World Wide Web.

Fundamental to his vision was the associative trail: creating new trails of content by linking items together in chained sequences, with personal contributions as side trails. Here we have the concepts of hyperlinking and personal communications. This he saw as mimicking the associative nature of the human brain. He saw users calling up this indexed motherlode of augmenting knowledge with just a few keystrokes, a process that would accelerate progress in research and science.
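
The associative trail is easy to picture as a data structure. Here is a minimal sketch, assuming a simple chain of documents with named side trails; the modelling is mine (Bush described the behaviour, not an implementation), though the 'Turkish bow' example is his own from As We May Think:

```python
# Bush's 'associative trails' sketched as a data structure: documents
# linked into named trails, with personal side trails branching off
# any item in the chain.

class Document:
    def __init__(self, title, content=""):
        self.title = title
        self.content = content

class Trail:
    def __init__(self, name):
        self.name = name
        self.items = []        # ordered chain of documents
        self.side_trails = {}  # index in chain -> list of side Trails

    def append(self, doc):
        self.items.append(doc)

    def branch(self, index, side_trail):
        self.side_trails.setdefault(index, []).append(side_trail)

# Usage: a main research trail with a personal side trail of comments.
main = Trail("Turkish bow")
main.append(Document("Encyclopedia: the bow and arrow"))
main.append(Document("Article: elastic properties of materials"))

notes = Trail("my comments")
notes.append(Document("Note", "Compare with the English long bow."))
main.branch(1, notes)
```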

More than this, he realised that users would be able to personally create and add knowledge and resources to the system, such as text, comments and photos, linked to main trails or in personal side trails - thus predicting concepts such as social media. He was quite precise about creating, say, a personal article, sharing it and linking it to other articles, anticipating blogging. The idea of creating, connecting, annotating and sharing knowledge on an encyclopedic scale anticipated Wikipedia and other knowledge bases. Lawyers, doctors, historians and other professionals would have access to the knowledge they needed to do their jobs more effectively.

In a book published 22 years later, Science Is Not Enough (1967), he relished the idea that recent advances in electronics, such as photocells, transistors, magnetic tape, solid-state circuits and cathode ray tubes, had brought his vision closer to reality. He saw in erasable magnetic tape the possibility of erasure and correction, namely editing, as an important feature of his system of augmentation. Even more remarkable were his prophetic ideas around voice control and user-generated content, anticipating the personal assistants so familiar to us today. He even foresaw the automatic creation of trails, suggesting that AI and machine learning might also play a part in our interaction with such knowledge bases.

What is astonishing is the range and depth of his vision, coupled with a realistic view of how technology could be combined with knowledge to accelerate progress, all in the service of the creative brain. It was an astounding thought experiment.

Critique

Some, echoing Eisenhower's warning, argue that the military-industrial complex became, and continues to be, too large and powerful, no longer serving its original purpose of innovation, although DARPA may be a counter-argument to that thesis.

Influence

Douglas Engelbart, the visionary of the modern computer, was profoundly influenced by Bush's vision of man-machine interaction and quoted Bush repeatedly as the inspiration for his ideas and practical inventions, such as the mouse, computer screen and personal computer. It was Bush who inspired the 'Mother of all demos', the manifestation of a vision that was eventually realised through the personal computer and the internet. The vision was not just technological; it continued Bush's idea of the augmentation of human capabilities for the common good, something Engelbart was to call 'collective intelligence'. Ted Nelson, who coined the term 'hypertext', also acknowledged his deep debt to Vannevar Bush, as did Tim Berners-Lee, who specifically mentioned Bush, Engelbart and As We May Think as inspirations for his development of the World Wide Web.

Bibliography

Bush, V., 1967. Science is not enough.

Bush, V., 1945. As we may think. The Atlantic Monthly, 176(1), pp.101-108.

Houston, R.D. and Harmon, G., 2007. Vannevar Bush and memex. Annual Review of Information Science and Technology, 41, p.55.

Saturday, November 20, 2021

Engelbart Collective intelligence and IQ

Douglas Carl Engelbart (1925 – 2013) was instrumental in establishing human-computer interaction as an area of technical and psychological research, playing a key role in the invention of the computer mouse, joystick and tracker-ball, as well as bitmapped screens and hypertext. These and other prophetic features were shown in the famous ‘Mother of all demos’ in 1968. He also put forward an early and full vision of collective intelligence and the idea of collective IQ. He envisioned much of this before the advent of the internet but foresaw the importance of networked knowledge and the networked organisation.

Collective intelligence

While in the Navy he read Vannevar Bush's article As We May Think and saw the possibility of a shared network being more than the sum of its parts.

We need to get better at getting better. To do this, Engelbart argued, we need to augment our individual intellects with techniques that leverage collective knowledge. He saw this as the solution to complex problems. He called it Bootstrapping, and at the heart of the Bootstrapping Paradigm was his Dynamic Knowledge Repository (DKR), which enabled a process called the Concurrent Development, Integration and Application of Knowledge (CoDIAK). The DKR is itself also subject to the CoDIAK process.

There is, of course, the human activity with tools and within networked technology, but Engelbart's focus was on three levels of activity:

A-level: core, business-as-usual activities.

B-level: improvements to that process, such as quality control.

C-level: improving on the improvements.

This C-level is the most important for exponential improvement. It is what he meant by getting better at getting better: an iterative process where lessons learnt are fed back into the process of improvement itself.
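
A toy model makes the point. In the sketch below (the numbers are mine, purely illustrative, not Engelbart's), B-level work improves output at a steady rate, while C-level work improves the rate of improvement itself, and the gains compound:

```python
# Why C-level work compounds: improving the improvement process turns
# a steady gain into an accelerating one.

def with_b_level(output=100.0, gain=0.05, years=10):
    return [output * (1 + gain) ** y for y in range(years)]  # steady 5% gains

def with_c_level(output=100.0, gain=0.05, meta_gain=0.2, years=10):
    results = []
    for _ in range(years):
        results.append(output)
        output *= 1 + gain
        gain *= 1 + meta_gain  # C-level: the rate of improvement itself improves
    return results

print(round(with_b_level()[-1]))  # ~155 after a decade
print(round(with_c_level()[-1]))  # ~265, and pulling away
```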

Collective IQ

Beyond the mere qualitative description of the web being a place where collective intelligence could flourish, in 1994 he proposed a measure for such intelligence - collective IQ. It measures ‘effectiveness’ or how well groups work together to anticipate and respond to problems and situations. 

This could be a product, service or research goal. Whatever the goal, Collective IQ determines the effectiveness of the response. Speed and quality of response are the key measures, along with development and deployment. It is not an abstract measure of reason but a measure of getting things done and completed, to meet the goal.

Complex goals need more collective IQ, so it is challenging projects, such as the Apollo moonshot or the Manhattan Project, that are often quoted as examples, where collective effort met goals faster and more effectively than a less collective effort would have.

Unlike individual IQ, seated in a single brain, Collective IQ has two components:

  1. Group process - collective ability to develop, integrate and apply knowledge to a goal

  2. Shared memory - gained, captured, accessed as a shared resource

Collective IQ can be raised by supporting the bootstrapping process, or lowered by ignoring or obstructing it.

Critique

The measurement of individual intelligence is hard enough; the measurement of collective intelligence is that much harder, not only in terms of how one combines the individual inputs but also any additional value that emerges from it being a collective. It is not clear that any general measure is possible.

Influence

Engelbart's invention of the mouse and initial work in envisioning the internet is reason enough to see him as an influential pioneer. His further work on collective intelligence saw the start of serious analysis of networks in terms of their emergent features, as forms of collective effort and intelligence.

Bibliography

Engelbart, D.C., 1995. Boosting Our Collective IQ.


Lanier Virtual Reality and Digital Maoism

Jaron Lanier is a US technologist and musician. He founded the first Virtual Reality (VR) company, VPL Research, in 1985 to sell VR goggles; it went bankrupt in 1990. He is seen as the inventor, the father or, more realistically, the person who coined the phrase ‘Virtual Reality’. His insights on technology, such as ‘Digital Maoism’ and ‘micropayments’, and his disdain for the collective output of the web and social media, have also been influential. He has for many years been a gadfly for the big tech companies, critical of how Silicon Valley has turned out. He now works at Microsoft.

Virtual reality

In Dawn of the New Everything: Encounters with Reality and Virtual Reality (2017), he takes a wide view of VR, explaining his journey in exploring the medium and presenting its strengths, such as healthcare applications for PTSD and work with those with disabilities, as well as its weaknesses. He presents it as a new medium, not a gadget, and explains its complex, fluid and evolving role.

VR now undoubtedly plays a growing role in learning, and Lanier's early work with headsets, gloves, bodysuits and virtual, multi-user spaces laid the foundation for further developments. His focus on user interaction within those spaces has become easier to realise as the technology has improved and the research has been implemented, and its learning and training applications have grown with it.

Digital Maoism

Lanier, in a clever phrase, warns against Digital Maoism (2006), aided and abetted by Google, which may take the wisdom of the crowd and turn us all into slavish followers of it and other monolithic services, such as Wikipedia, as the single font of all authoritative knowledge. It drags us towards a single view of authoritative truth presented online, a drift he compares to the drift we see towards totalitarian regimes and their single view of the world.

He also sees the online world as dehumanising people, a place where subtlety and individuality are lost. The subterfuge is that Google monetises your search data and is “selling people (their advertiser-targetable personal identities, buying habits, etc.) back to themselves“. This is an interesting counter to the many theorists, such as Engelbart and Shirky, who see the internet as the source of collective intelligence or the wisdom of crowds. For Lanier it has too many inherent vested interests, structures and limitations.

Internet

He has little time for Kurzweil's Singularity, part of his general critique of AI as incapable of being, or even replicating, what it is to be human. In You Are Not a Gadget (2010), he again criticises the mob mentality and structures of the web. His individualism sees collective production, such as Wikipedia, open source and open educational resources, as using the labour of individuals without attribution or reward. There is a sense in which innovation is curbed by collective builds, as they crush the efforts and flair of the individual. Attempts at standardisation, such as MIDI in music, he attacks as dumbing down.

He takes up the attack again in Who Owns the Future? (2013), where he again sees users as being duped into handing over their data while large online organisations profit from it. He puts forward a micropayments solution to this problem, one that repays individuals for their efforts. In Ten Arguments for Deleting Your Social Media Accounts Right Now (2018), he takes up Postman's idea that social media users are becoming the tool they use, turned into fractious, tribal addicts, losing their sense of well-being and their place in the real world.
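
The micropayments idea can be made concrete with a toy allocation: revenue is shared pro rata with the users whose data contributed to a service. Everything in this sketch, the names, numbers and the platform's share, is my illustrative assumption, not Lanier's specification:

```python
# Pro-rata micropayments: a service's revenue is split between the
# platform and the users whose data it used, in proportion to use.

def micropayments(revenue, contributions, platform_share=0.3):
    """contributions: user -> units of data the service used."""
    total = sum(contributions.values())
    pool = revenue * (1 - platform_share)   # what goes back to users
    return {user: pool * units / total
            for user, units in contributions.items()}

print(micropayments(1000.0, {"alice": 50, "bob": 30, "carol": 20}))
# {'alice': 350.0, 'bob': 210.0, 'carol': 140.0}
```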

Critique

Lanier has become more critical, polemical and prescriptive in his attacks on online technology, social media and its uses, but it is not clear that prescribing abstinence is justified. Individuals, small businesses and organisations have bloomed online, and many see the transaction of data for useful free services, such as search, translation and communication, as fair. His views on technology also, at times, seem somewhat dated as events overtake his analysis, especially in AI, where generated code and transformers break free from what he sees as the limitations of the field.

It is also not clear that Digital Maoism has actually emerged on the web. If anything, it has gone in the opposite direction, producing diversity and tribal divisions. On the whole, Lanier tends to play the technology card, then swipe it back. This is best summed up in the fact that he has written several books about the hegemony of large tech companies, yet works for Microsoft.

Influence

Lanier’s contribution to initiating and maintaining interest in VR is undisputed and useful. His provocations on the nature of the web have also contributed greatly to the debate around who profits on the web and some of the dangers around its commoditisation. His role as a writer and speaker has allowed him to present interesting and novel theories, such as Digital Maoism, as well as solutions such as micropayments. His work in VR is now also bearing fruit with a growing body of research and learning applications, see Clark (2021).

Bibliography

Lanier, J., 2018. Ten arguments for deleting your social media accounts right now. Random House.

Lanier, J., 2017. Dawn of the new everything: Encounters with reality and virtual reality. Henry Holt and Company.

Lanier, J., 2014. Who owns the future?. Simon and Schuster.

Lanier, J., 2006. Digital Maoism: The hazards of the new online collectivism. The Edge, 183(30), p.2.

Lanier, J., 2010. You are not a gadget: A manifesto. Vintage.

Clark, D., 2021. Learning Experience Design: How to Create Effective Learning that Works. Kogan Page Publishers.


China's second Cultural Revolution

Some years back Gil, our sons and I travelled in a huge loop around China, but we especially loved Shaolin, the temple where they combine Buddhism with martial arts; almost all modern martial arts originated there. Callum, all these years later, is just about to go to the European Championships with the ITF TaeKwon Do England team, but on that day 10,000 kids were out practising martial arts in their silks (we were lucky). Everywhere there were disciplined kids demonstrating their skills, students at one of the many schools. It was spectacular.

Turning my thoughts to the China of today, that discipline has turned the country into a superpower. We saw the poverty that existed in the countryside, almost feudal, and who can deny its great success in lifting hundreds of millions out of poverty. They did that by using socialist principles to leverage hard-working, productive labour, but who really paid for this? You and I. We lapped up those cheap goods. Almost every Christmas decoration and present you buy this year will have come from China. The complaints about low wages and what is called slave labour are a bit rich coming from those with iPhones and wardrobes full of rarely worn clothes.

However, that brings me to some strange goings-on… a second cultural revolution, or clampdown, led by the enigmatic Wang Huning. There's a brilliant portrait of him here...

https://palladiummag.com/.../the-triumph-and-terror.../...

Wang, like all iconoclasts in the past, is using state control to do some pretty unusual things.

TECH COMPANIES

There has been a fierce clampdown on large tech companies trying to become monopolies: Alibaba was fined US$2.8 billion, Didi was removed from all app stores, and all have been issued with a strict warning - comply or we'll crush you. They've also banned Bitcoin, a clever move for all sorts of reasons, not least weaponising finance to blame the US and others for climate change.

SOCIAL MEDIA

Wang is also behind the move to control large tech companies with a ‘Minor Protection Law’, implemented in June: a crackdown on social media use that means time management, content restrictions and consumption limits for minors.

On Douyin, the Chinese version of TikTok (owned by the same company), with 600 million daily users, the following applies if you're under 14 (sketched as code below):

  • Swipe and you get a compulsory science lesson or museum exhibit before you can get back to the fun. It's a mass nudge campaign.
  • Limited to 40 mins a day.
  • Mandatory 5 second delay on scrolling.
  • Not available 10pm to 6am.
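
Taken together, the under-14 rules reduce to a simple policy check: a daily quota plus a curfew. Here is a minimal sketch of those rules as described (my illustration; Douyin's actual enforcement is of course more complex):

```python
from datetime import datetime, time

DAILY_LIMIT_MINUTES = 40                             # 40 mins a day
CURFEW_START, CURFEW_END = time(22, 0), time(6, 0)   # not available 10pm to 6am

def may_use(now: datetime, minutes_used_today: int, age: int) -> bool:
    """Return True if the user is allowed on the app right now."""
    if age >= 14:
        return True  # these restrictions apply to under-14s only
    in_curfew = now.time() >= CURFEW_START or now.time() < CURFEW_END
    return not in_curfew and minutes_used_today < DAILY_LIMIT_MINUTES

print(may_use(datetime(2021, 11, 21, 15, 30), 25, 13))  # True: afternoon, under quota
print(may_use(datetime(2021, 11, 21, 23, 0), 10, 13))   # False: curfew
```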

ONLINE GAMING

A clampdown on online gamers targets what they see as the decadence of this activity:

  • Under-18 gamers can only play on public holidays and on Fridays, Saturdays and Sundays, from 8pm to 9pm.
  • Then there’s censorship in the form of not being able to play games that feature ‘cross-dressing’, ‘gay romance’ and ‘effeminate males’.
  • Changing ‘established’ historical narratives is also frowned upon; this includes ‘politically harmful’ or ‘historically nihilistic’ content.

ONLINE TUTORING

This sector has been decimated:

  • Private sector tutoring companies have been banned.
  • Foreign education companies have come under strict regulatory scrutiny.
  • Compulsory state registration controls all education companies and platforms.
  • No approval for new off-campus tutoring centres providing core/compulsory education.
  • All must become registered as non-profit institutions.
  • Tutoring banned on weekends, public holidays, and winter and summer vacations, which are popular times for off-campus education.
  • All new enrolments and payments stopped until registration complete by end of 2021.

This is all backed up with sophisticated tracking. Of course, since the dawn of the internet kids have circumvented restrictions through illegal websites, VPNs and foreign sites. Wonder how your kids got to watch that recent cinema-only movie release?

Now, we may feel repulsed by all of this, but for years I've seen keynote speakers and so-called liberal and progressive people call for similar restrictions on screen time and gaming, and many agree that tutoring 'games' the education system in favour of the wealthy. It's no more than many have demanded. We should use the Chinese clampdown to reflect on what we in the liberal, democratic world really want here.

Then there's the social credit system, which is widely misunderstood - more on that in another post.


Friday, November 19, 2021

Wiley OER (Open Educational Resources)

David Wiley was an early and influential figure in the OER (Open Educational Resources) movement, involved not only in open content but also in the definition and evangelising of OER and the encouragement of open learning communities. He started the Open Content Project, with its Open Content Licences; it closed in 2003, superseded by Creative Commons. Creative Commons, a not-for-profit where Wiley was Director of Educational Licenses, has over 2 billion licensed works, including on Wikipedia and Flickr.

Open Educational Resources (OER)

OER was a development that encouraged the open licensing of educational content, in all forms of media, for teaching, learning and assessment, as well as research. Open is not a simple concept in opposition to closed; it is a matter of degree, in terms of both the nature of the content (resources, courses, books and so on) and the degree to which it is in the public domain.

Four R’s of openness

Wiley (2009), and later Hilton, Wiley, Stein and Johnson in The Four R’s of Openness and ALMS Analysis: Frameworks for Open Educational Resources (2010), identified four things that an open license allows:

Reuse—The most basic level of openness. People are allowed to use all or part of the work for their own purposes (e.g. download an educational video to watch at a later time). 

Redistribute—People can share the work with others (e.g. email a digital article to a colleague). 

Revise—People can adapt, modify, translate, or change the form of the work (e.g. take a book written in English and turn it into a Spanish audio book).

Remix—People can take two or more existing resources and combine them to create a new resource (e.g. take audio lectures from one course and combine them with slides from another course to create a new derivative work). Allowing more of these R’s increases the openness of an OER, as the sketch below illustrates.
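
Seen this way, the four R's are simply permissions, and openness a matter of degree. A minimal sketch of that framing (my modelling, not Wiley's code), where openness is the number of rights a license grants:

```python
# The four R's as license permissions, with openness as a 0-4 score.

from dataclasses import dataclass

@dataclass
class OERLicense:
    reuse: bool = False
    redistribute: bool = False
    revise: bool = False
    remix: bool = False

    def openness(self) -> int:
        """0 (closed) to 4 (fully open)."""
        return sum([self.reuse, self.redistribute, self.revise, self.remix])

fully_open = OERLicense(reuse=True, redistribute=True, revise=True, remix=True)
no_derivatives = OERLicense(reuse=True, redistribute=True)  # no revise/remix

print(fully_open.openness(), no_derivatives.openness())  # 4 2
```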

Critique

OER has been somewhat confused with the availability of free resources, such as MOOCs. Free is different from open, and can give the impression of low quality. There is also the NIH (Not Invented Here) syndrome, which often prevents such resources from being used by research-oriented faculty, who have a natural disposition towards creating everything themselves, even the content that is to be taught.

Influence

The Open Educational Resources movement has been growing steadily through the relentless work of theorists and contributors such as David Wiley. In practice, it turned out not to be reuse, redistribution, revision and remixing that mattered most. OER targeted directly at learners, not teachers, has had far more success, and this happened on a greater scale with resources such as YouTube, MOOCs, Khan Academy and Duolingo, almost in spite of the institutional OER movement. In fact, there seems to have been a bifurcation in OER between the flood of publicly funded projects, which tended to atrophy, even die, and a successful crop of global successes.

Bibliography

Hilton, J., Wiley, D., Stein, J., & Johnson, A. (2010). The four R’s of openness and ALMS Analysis: Frameworks for open educational resources. Open Learning: The Journal of Open and Distance Learning, 25(1), 37–44.

Wiley, D., 2009. Creating Open Educational Resources. Materials prepared for an independent study class on Open Educational Resources.

Wiley, D.A., 2002. The instructional use of learning objects (Vol. 1). Bloomington, IN: Agency for instructional technology.


Baudrillard Virtual reality philosopher

Jean Baudrillard (1929 - 2007), the postmodernist French philosopher and cultural theorist, examined the ways in which we see and interact with the world as simulacra, simulations and hyperreality. Neo, in the film The Matrix (1999), carries a copy of Baudrillard's Simulacra and Simulations (1981), which examines technology, or more accurately its virtual outputs, in culture. More than this, he created several concepts and theories that redefine what technology is in our lives and culture, way beyond face-to-face and print media.

His focus on the idea of ‘simulations’ is a break with the past: the simulation is disassociated from reference to the reality it pretends to represent. This is illustrated in his infamous book The Gulf War Did Not Take Place (1991). His core idea is that the virtual world(s) we have created are now more important, cognitively and culturally, than their supposed referents in the real world. More than this, he thinks the virtual has cleaved away from this assumed real world.

Hyperreality

Consumer Society (1970) rejects the Marxist (and Freudian) idea of the free agent. The conspicuousness of consumption, he thinks, is far more complex. Malls have their perpetual springtime and perpetual shopping. Our created needs and all the possibilities of pleasure make us not producers but consumers with a huge capacity for consumption. Prophetically, he saw the real role of credit as lubricating this desire and its excesses. His critique of Marxism reaches its peak in The Mirror of Production (1973), where each of the major elements of Marxism is demolished – dialectic, modes of production and so on. Indeed, he turns Marxism on its head, arguing that it is a justification for the system it purports to destroy. For all its machinations around labour, production and value, Marxism has no distance from the system it critiques.

He rejects traditional Marxist descriptions and explanations of economics, with their focus on ‘production’, describing instead a new era of culture based on consumerism, communications and commodities. ‘Hyperreality’ is the new state, free from the anchors of reason and materialism. For Baudrillard, consumerist communication has its own set of codes related to the desires of the consumer, and this new form of living has demoted the idea of people as producers.

In what he calls the ‘code’: floating signifiers, ads, virtual experiences and so on, we live within a system of signs. As his leftism gave way to fatalism in Symbolic Exchange and Death (1976), death became the only escape, confirming the irrelevance of the system in which we find ourselves trapped. In Seduction (1979), he renews his broadsides against Marx, Freud and the structuralists, opting for a Nietzschean perspectivism. It is a blow to liberalism and Marxism alike: citizens are not a community, they are consumers.

Simulacra, simulations, virtual reality

He picks up on Nietzsche’s rejection of oppositional thought to move the debate beyond appearance and reality, subject and object, oppressors and oppressed – to a world of Simulacra and Simulations (1981): ads, TV news and soap operas. Even in the realm of divinity, the battle between simulacra and the iconoclasts, who confirm the power of ‘icons’, shows that our concerns lie in this battle of signs. Representation first reflects a reality, then masks and perverts that reality, then masks the absence of that reality, and finally bears no relation to reality. Disneyland, for example, is the US ‘embalmed and pacified’.

He had been using the term ‘virtual’ for 25 years, and in The Gulf War Did Not Take Place (1991) he shocked traditional commentators by claiming that the war, as shown through the media, was not grounded in the actual conflict but was a created reality. History itself collapses through dilution, as we move beyond an ‘event’-based culture to a non-historical state.

In The Conspiracy of Art (1996) he trounces modern art as no longer relevant and part of the very system it pretends to critique. It is the art of collusion and has no special status. Art is everywhere and nowhere, part of a consumerist nexus with its careers, commerce and tawdry fame. Worse, it has become mediocre; worse still, null. Adored by the art world after Simulacra and Simulations (1981), he came back to destroy its view of itself as superior, even relevant.

Critique

Many find his philosophical position vague and ill-defined, especially the use of the word ‘real’. Baudrillard also took an odd position, which has puzzled many – that the best reaction to this all-consuming storm of simulacra and simulations is ‘silence’. This, many argue, is inappropriate, as technology can be a force for good. But Baudrillard’s challenge is to take the debate beyond good and bad. It is an existential, not moral, position.

Influence

Despite writing much of his work in the era of traditional broadcast media, his ideas have gathered strength in what he calls the ‘virtual’ world. With the advent of the internet and web, along with social media, augmented reality, virtual reality and artificial intelligence, the world he described has come to pass. The ‘virtual’ moves us further away from reality. With the advent of virtual reality as a consumer device and the movement towards virtual worlds, such as Facebook’s Metaverse, Non-Fungible Tokens and our increasing participation in online spaces, Baudrillard’s thought seems to have increased relevance.

Twenty-five years later, we see this code of signs, simulations and virtual experiences, often as an end in itself. Wars are created to be filmed, now Tweeted, Facebooked and YouTubed; wars are more virtual than ever. Never easy, always challenging, certainly original – Baudrillard may no longer be with us, but he is a philosopher for our age.

Bibliography

Baudrillard, J., 1995. The Gulf War did not take place. Indiana University Press.

Baudrillard, J., 2019. Simulacra and simulations (1981). In Crime and Media. Routledge.

Baudrillard, J., 2016. The consumer society: Myths and structures. Sage.

Baudrillard, J., 1975. The mirror of production (Vol. 17). St. Louis: Telos Press.

Baudrillard, J. and Singer, B., 1990. Seduction. New World Perspectives.

Baudrillard, J., 2005. The conspiracy of art. New York: Semiotext(e).


Shirky Cognitive surplus

Clay Shirky is a US writer and commentator on technology. He is Vice Provost of Educational Technologies at New York University (NYU), after being CIO at NYU Shanghai. He has commented on the immense impact the internet has had on politics, economics and education. He foresaw the disintermediating, decentralising and democratising effects of the internet, but is not uncritical of the negative side of such phenomena.

Mass amateurisation

Here Comes Everybody (2008) was an appeal for the power of collaboration and crowdsourcing which the internet had enabled. He sees this as a bottom-up process, a sharing culture of the internet, in terms of sharing by individuals of content, links, reposting and so on. This is followed by conversations, with one to many dialogue, where people learn from each other, in a meeting of minds. This can develop into collaboration where people coalesce around a common goal, a problem to fix or idea to be developed. Finally, a team forms for collective action, with division of labour, to get something made or done. This removes the traditional blocks in business and general human endeavour, in what he calls ‘mass amateurisation’. Wikipedia and mass publishing have been examples of this.

Cognitive Surplus

Clay Shirky’s Cognitive Surplus (2010) moves us beyond the descriptive to the prescriptive. Cognitive Surplus is a direct assault on TV, as the post-war medium that soaked up almost all of our free time. TV “immobilizes even moderately attentive users, freezing them on chairs and couches”. This 50 year aberration made us less happy, pushing us more towards material satisfaction than social satisfaction. Year on year we spent more time in this “vast wasteland”. It became a medium of ‘social surrogacy’ replacing time spent with family and friends with imaginary friends. 

Shirky then posed a fascinating question: what if even some of that cognitive effort and time were put to better use? Shirky’s cardinal argument is that this ‘cognitive surplus’, squandered on passive consumption, could be bountiful. For example, one year of US TV watching is the equivalent of 2,000 Wikipedias. In practice, the internet has allowed us to ‘make and share’, with sharing being the driver. We produce rather than just consume, and he says that “the stupidest possible creative act is still a creative act”. It is Shirky’s belief that “It’s in our nature to interact – we enjoy it.”
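
The '2,000 Wikipedias' figure is simple arithmetic on Shirky's own estimates: Wikipedia represents roughly 100 million hours of cumulative human thought, while Americans watch roughly 200 billion hours of TV a year:

```python
# Shirky's back-of-envelope: annual US TV hours divided by the hours
# of human effort embodied in all of Wikipedia.

WIKIPEDIA_HOURS = 100_000_000            # Shirky's estimate for Wikipedia
US_TV_HOURS_PER_YEAR = 200_000_000_000   # Shirky's estimate for annual US viewing

print(US_TV_HOURS_PER_YEAR / WIKIPEDIA_HOURS)  # 2000.0 Wikipedias per year
```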

TV, unlike the telephone and internet, is unbalanced. Being part of the web is being part of a global network, and the numbers matter: more is better, as we can harness this global cognitive surplus to create a less passive future. It is a matter not of using the surplus up but of using it constructively, through broad experimentation. He compares the web with the print and telephone revolutions, in that it results not in a monoculture but in increased and unpredictable forms of communication, arguing for ‘as much chaos as we can stand’. Fundamentally, he sees interactivity and social communication as a more natural form of behaviour, destroyed by TV but now coming back to the fore.

Education

In 2015, in the article The Digital Revolution in Higher Education Has Already Happened. No One Noticed, Shirky claimed that online learning was already the norm, with most colleges and most students doing it in some form. He sees the whole debate as being, wrongly, about elite four-year colleges, when the real revolution is taking place elsewhere. He also sees the Higher Education system shrinking to fewer and larger organisations as online learning takes hold.

Critique

Shirky’s vision of a world increasingly shaped by mass amateurs and collaboration has not quite come to pass. Yet progress has been steady. It has opened up opportunities for small businesses, projects, activities and publishing on a global scale. On the other hand, the power of large tech companies has intensified. Similarly, his predictions for the changes in Higher Education have not yet come to pass, although the Covid period is certainly putting them under strain.

Influence

Shirky was a new type of commentator, active online, largely concerned with the shift from traditional mass media to online culture. He had and still has wide popular appeal, based on his published books. The shift from old to new media continues as does his predicted shift to a more decentralised world.

Bibliography

Shirky, C., 2015. The Digital Revolution in Higher Education Has Already Happened. No One Noticed.

Shirky, C., 2010. Cognitive surplus: Creativity and generosity in a connected age. Penguin UK.

Shirky, C., 2009. Here comes everybody: How change happens when people come together. Penguin UK.