On 2 May 2023, the education sector witnessed a sharp decline in share prices after Chegg, an online edtech company, confirmed that its revenue had been adversely impacted by generative AI tools such as ChatGPT.
The company’s stock tumbled by nearly 50%, and investor fears spread to rival Pearson, whose shares fell by 15% in London trading.
Chegg’s chief executive, Dan Rosensweig, said, “Since March we have seen a significant student interest in ChatGPT. We now believe it’s having an impact on our new customer growth rate.”
The company’s recently launched AI companion, CheggMate, has not succeeded in reassuring spooked investors. The new learning service, launched in April, was built with OpenAI’s GPT-4 and trained by over 150,000 subject experts, leveraging Chegg’s proprietary content.
How, then, will the AI boom affect education companies?
Pearson – a different kettle of fish?
Pearson insists that its diversified business model differs fundamentally from Chegg’s offering. Chegg distinguishes itself by its “ask an expert” service, whilst Pearson focuses more heavily on content creation. Indeed, following the fallout from Chegg’s ChatGPT warning, JP Morgan maintained its “overweight” rating on Pearson, arguing that the publisher is not as exposed to the threat of AI as some of its competitors.
The investment bank said: “While ChatGPT could be seen as an alternative for students seeking answers to their homework, we do not see it as an alternative to Pearson’s text books, courseware and learning platforms that provide trusted programmes that are adopted by colleges, and have to be followed and consumed by students for around 70% of higher education courses.
“In our view, ChatGPT/AI will not provide complete and structured learning programmes with trusted content.”
Moreover, Pearson’s CEO, Andy Bird, told the Financial Times that the emergence of large language models (LLMs) represents a lucrative opportunity for the company, saying: “The output of these generative AI models is largely predicted by the quality of the data sets that are inputted into them. We are the owners of some very rich, pure data sets – when you start to input them into generative AI models, you get better outputs.”
The company has, in fact, used AI across its product portfolio for many years. For example, it uses LLMs in its workforce skills division to develop predictive algorithms that assess trends in global demand for skills and occupations and recommend career pathways, whilst the Pearson Test of English (PTE) uses AI to examine proficiency in English as a foreign language, claiming to be free from human bias.
Analysts at GlobalData, however, remain sceptical about the competitiveness of companies such as Pearson and Chegg, rating them poorly against rivals when it comes to AI strategy. Emma Taylor, analyst at GlobalData, says: “Chegg was only seen as a harbinger of the latest developments because it was not perceived to have acted fast enough to co-opt AI for its own use. If companies are investing in the technology, partnering with the right providers, and considering how best to integrate it into their products and business processes, they will not be vulnerable, but at an advantage.”
Is the end of the textbook era nigh?
Could generative AI replace the traditional textbook and teacher model of education? Emma Taylor offers her insight: “Generative AI models should not be used to inform. Generative AI, like ChatGPT, uses LLMs; in their current state of development, these tend to ‘hallucinate’. This is when the content they have created is fabricated or nonsensical. In the context of education, hallucinations could mean people are informed inaccurately.”
David Bicknell, principal analyst at GlobalData, also worries about the already entrenched attainment gap that may grow between disadvantaged students and their more privileged counterparts: “There is a risk that AI will only serve to further empower the already strong, leaving behind the less fortunate. The gap between learning leaders and laggards is now put at around five years. With AI, the gap could widen. So, will AI be the key to unlocking the potential of every learner? Possibly not.”
There is also the pressing matter of trust and safety. On 20 May, a group of UK school leaders, led by the educationist Sir Anthony Seldon, said in a letter to The Times that they “have no confidence that the large digital companies will be capable of regulating themselves in the interests of students, staff and schools and in the past the government has not shown itself capable or willing to do so.”
The letter declares that school leaders are forming their own AI advisory body, composed of leading teachers and guided by a panel of independent digital and AI experts.
Seldon also worries about the implications for students’ development of critical thinking skills, telling The Times: “Learning is at its best, human beings are at their best, when they are challenged and overcome those challenges. AI will make life easy and strip away learning and teaching – unless we get ahead of it.”
While certain technologies can do some “heavy lifting” for teachers, such as marking and assessments, human intelligence, Seldon says, must remain at the heart of education.