
AI creates a risk paradox: it promises 'Goldilocks' education while posing superrisks in the real world

AI is infantilising education with promises of automation and 'just-right' personalised learning. That's the opposite of what we should teach our children.


For most people, intelligence is understood as a scale ranging from the village idiot at one end to the proverbial genius at the other. Similarly, we can imagine AI beginning at zero intelligence, progressing to match and then surpass human intellect, and potentially reaching some kind of superintelligence. If AI is seen to surpass the human genius, we can apply a similar scale to risk – and expect some kinds of superrisks.

 

Sociologist Ulrich Beck described modern societies as 'risk societies', living with uncertainties like the climate crisis and job insecurity. With AI, not only do we face these risks, but we cannot be sure we can understand or control the new kinds of risk it can generate.

 

This is where societies tend to rely on education, in the hope that an educated individual has a better chance of dealing with – and even diminishing – risks such as unemployment and poverty. Economist Tyler Cowen once warned of a fiercely competitive future job market that would favour those with the right education and training. Educated young people, Cowen reassured, will earn significantly more than their less educated peers. His perspective, shared by many mainstream economists and politicians, envisions a precarious future where job opportunities are scarce – now further exacerbated by competition with the very AI systems entering education. This view has led to a global emphasis on schools teaching subjects like data science, AI, and coding – ironically, things at which Large Language Models like ChatGPT are fast becoming expert.

 

We need education to prepare children for a superrisky future, yet we make them reliant on the very technologies enabling it. This creates an obvious paradox. In education, AI promises to avert many risks. AI can diagnose students to 'personalise' their learning pathways, decolonise the curriculum (i.e., remove controversial topics), and relieve teachers of repetitive work like grading and crafting lesson plans. AI promises to tailor education so that it is neither too difficult nor too easy – as Goldilocks would say, just right. Examples include Century Tech's data-driven learning 'nuggets', Carnegie Learning's real-time recommendations, and Knewton Alta's just-in-time remediation based on supposedly reading children's minds.

 

But this kind of AI-driven education is sterile, even deceitful, as it overshadows both long-standing risks like poverty, gender inequality, antisemitism, and Islamophobia, and other problems like the breakdown of family bonds, sedentary lifestyles, and the climate crisis (to which AI is a big contributor).

 



Data crunching and machine learning in the service of risk-averse education teach children nothing about the real world and how to negotiate and manage risks – be they job-related, societal, or even planetary. Governments are turning to ever more advanced 'smart' technologies, seeing them as neutral tools that treat issues like inequality and racism as mere design glitches, not as deep-seated problems requiring social and political effort.


The UK Department for Education aims to fully harness AI to prepare students for rapidly changing workplaces, despite ongoing problems like teacher attrition, mental-health and family problems, and destitution that leaves over 1 million children without proper beds. Education becomes a moving target. One is expected to catch up with precarious futures and evolving technologies – the very technologies contributing to the precarity.

 

The sense of urgency that the new generation must keep up with AI is misplaced, and the exact opposite of what education really is – a slow and risky endeavour.

 

Egocentric and driven by desires they cannot fully understand, children often seek instant gratification. Only slowly do they move away from this 'infantile' state and begin to learn about the world and what is possible. As philosopher Gert Biesta argues, the 'slow work of the educator' – and of education itself – is to guide children in understanding their desires and discerning which of them will help them live well in a world with others, and which will not.

 

Governments globally encourage education systems to adopt AI. Former UK Prime Minister Rishi Sunak planned to fund personalised AI assistants for every teacher in England – though he hardly needed to: over £1.3 billion has already been invested in educational technologies, many of which are already integrating AI functionalities into their products.

 

Meanwhile, AI threatens over 8 million jobs in the UK, in fields like law, business management, and accountancy, as well as teaching. Goldman Sachs estimates that AI could automate the equivalent of 300 million full-time jobs. Ironically, while more education is supposed to reduce uncertainty, those with advanced qualifications are more susceptible to AI automation.

 

Job-market risks extend beyond job loss. There are risks of harm from AI used to hire and fire employees, to expedite the screening of job applicants, and to assess employee performance. AI may not steal your job, but it may stop you from getting it.

 

Global organisations and governments are pushing AI into education, but they fail to explain how this will prepare children for a world of superrisks.


When funding AI development for education, it is important to carefully assess who truly benefits from it (a 'pro-innovation' approach primarily benefits businesses). It is also crucial to establish effective accountability mechanisms: how will companies providing AI services in education be held accountable, transparent, and responsible for their use of children's data and the quality of their products?

 

Bringing AI into education is debatable, not inevitable. Education should prioritise teaching children about the multitude of risks in the world – risks that often depend on the role societies choose to play, not on the tools at hand.
