MPs are to investigate possible links between social media algorithms and the spread of harmful, false content in the wake of rioting which took place in the UK over the summer.
The science, innovation and technology committee has launched a new inquiry which will consider the role of false claims, spread via profit-driven social media algorithms, in triggering the wave of anti-immigration riots in July and August 2024.
Far-right disorder erupted on England’s streets in July following the spread of misinformation about the fatal stabbing of three young girls in Southport. False speculation suggested the suspect was an asylum seeker who had arrived in the UK on a small boat.
The inquiry, the first of the newly appointed select committee, will also investigate the effectiveness of current and proposed regulation for these technologies, including the Online Safety Act, and what further measures might be needed.
Commenting ahead of the inquiry’s launch, the chair of the science, innovation and technology committee, Chi Onwurah, said: “The violence we saw on UK streets this summer has shown the dangerous real-world impact of spreading misinformation and disinformation across social media.
“We shouldn’t accept the spread of false and harmful content as part and parcel of using social media. It’s vital that lessons are learnt, and we ensure it doesn’t fuel riots and violence on our streets again.”
The Labour MP added: “This is an important opportunity to investigate to what extent social media companies and search engines encourage the spread of harmful and false content online.
“As part of this, we’ll examine how these companies use algorithms to rank content, and whether their business models encourage the spread of content that can mislead and harm us.
“We’ll look at how effective the UK’s regulations and legislation are in combating content like this, weighing up the balance with freedom of speech, and at who is accountable.”
The terms of reference for the inquiry include investigating the extent to which the business models of social media companies and search engines encourage the spread of harmful content and contribute to wider social harms.
MPs will assess how social media companies and search engines use algorithms to rank content, and whether generative artificial intelligence (AI) and large language models (LLMs) play a role in the creation and spread of misinformation, disinformation and harmful content.
The committee will also look at what role Ofcom and the National Security Online Information Team play in preventing the spread of harmful and false content online, and which bodies should be held accountable for the spread of misinformation, disinformation and harmful content.
The MPs sitting on the science, innovation and technology committee are as follows:
- Chi Onwurah (chair) (Labour)
- Emily Darlington (Labour)
- George Freeman (Conservative)
- Dr Allison Gardner (Labour)
- Tom Gordon (Liberal Democrat)
- Rt Hon Kit Malthouse (Conservative)
- Steve Race (Labour)
- Josh Simons (Labour)
- Dr Lauren Sullivan (Labour)
- Adam Thompson (Labour)
- Martin Wrigley (Liberal Democrat)
Josh Self is Editor of Politics.co.uk, follow him on Bluesky here.
Politics.co.uk is the UK’s leading digital-only political website. Subscribe to our daily newsletter for all the latest news and analysis.