Three thousand social media posts containing “hostile or concerning content” were directed at the England team during the 2025 European Championship final on Sunday, a report has found.
Moonshot, a UK-based threat monitoring service, analysed threats of violence and abuse targeting England’s players and head coach Sarina Wiegman during the victory over Spain, and found social media platforms have failed to take down the majority of abuse targeted at the Lionesses.
The company assessed 73,400 posts during the final across 30 social media platforms. It found that 3,000 contained “hostile or concerning content” such as non-targeted abusive language or misogyny.
Moonshot reported 95 posts which specifically targeted a player or the head coach to the platforms, as they were classified as abuse and clear violations of the platforms’ terms of service. However, 99 per cent of those posts remain online.
“Moderation is reducing,” Catriona Scholes, director of insight at Moonshot, told The Athletic. “There are fewer moderators being recruited and retained by the platforms to take this kind of content down. I think that the experience of the players is a reflection of that reality.
“We are seeing the Online Safety Act in the UK showing a real intention to crack down on that. But what we aren’t seeing is an associated significant reduction in that content online.”
Those 95 posts included racist abuse (42 per cent of posts), misogynistic sentiment (29 per cent), sexual objectification (12 per cent), and anti-LGBTQ+ abuse (11 per cent).
Michelle Agyemang was the most targeted player in the dataset. Agyemang, the most junior member of Wiegman’s squad at 19 years old, was voted young player of the tournament and scored two late equalisers that were crucial in England’s progress to the final.
The Arsenal forward only came on in the 71st minute in the final but was the target of 18 per cent of abusive posts during the game, which were primarily of a racist nature.
Before the semi-finals, England defender Jess Carter stepped back from social media after receiving racist abuse during the tournament. Following her statement, the team decided to stop taking the knee before their matches.
“It is clear we and football need to find another way to tackle racism,” the Lionesses squad said in a statement. “Those behind this online poison must be held accountable.”
“It feels like there can be a place where we can control abuse online, especially racism online, because everything’s monitored online,” England defender Lucy Bronze said in a news conference, “so it just doesn’t make sense to us.”
According to Moonshot, moderation standards are declining across social media platforms. Last year, it recorded a 28 per cent take-down rate for all posts which violate the platforms’ terms of service or community guidelines on X, Instagram, TikTok, Reddit, and Facebook. By July 2025, this take-down rate declined to 6 per cent.
“The social media companies can improve their proactive identification capabilities so that more content is identified through automated means and then goes through assessments by moderators,” Scholes added. “They can also do more to actually apply their own terms of service as they’re written.
“It’s not even that they need to change the policies … it’s actually that they need to put in the enforcement and implementation steps.
“We’re seeing a decline (in moderation), so we can also say that they were better than they are now. So that’s probably a reflection of there being less focus on moderation in the large tech companies right now.”
Scholes explained that some threats are more sophisticated and “challenging” for moderators, such as AI-generated content that promotes disinformation about a player and sparks abuse, or emojis, misspellings and coded language used to disguise abuse.
The company says the volume of abuse and threats during the final was lower than expected, which it believes is due to England’s victory. However, the report says the figures demonstrate “a systemic failure in platform moderation, where even the most explicit violations of stated community guidelines remain visible and accessible”.
The report added that the failure in platform moderation was “particularly stark in the context of the UK Online Safety Act which designates much of this content as ‘priority illegal content’.”
After Carter’s statement, anti-discrimination charity Kick It Out said: “Social media companies have failed to prevent exposure to this toxicity, and football must continue to use its collective power to hold them to account. We have been working with the government and the regulator, but we know that more urgency is needed from everyone involved.”
This article originally appeared in The Athletic.
© 2025 The Athletic Media Company