Twitter fails to delete 99% of racist tweets aimed at footballers in run-up to World Cup | Twitter

Tweets hurling racist abuse at footballers, including the N-word, monkey emojis and calls for them to be deported, are not being removed by Twitter.

New research shows the platform failed to act on 99 out of 100 racist tweets reported to it in the week before the World Cup.

Only one was removed after being flagged on Wednesday, a tweet that repeated a racial slur 16 times. All the others remained live this weekend.

The abuse was aimed at 43 players including England stars Raheem Sterling and Bukayo Saka, who were among several players targeted after the Euro 2020 final.

The analysis, conducted by researchers at the Center for Countering Digital Hate (CCDH) and seen by the Observer, included 100 tweets reported to Twitter. Of those, 11 used the N-word to describe footballers, 25 used monkey or banana emojis directed at players, 13 called for players to be deported, and 25 attacked players by telling them to “go back to” other countries. Thirteen tweets targeted footballers over their English skills.

The findings come at a turbulent time for Twitter and will fuel concerns that players will be targeted during the World Cup.

Thousands of staff have left the company since Elon Musk’s takeover on 27 October. Musk has insisted that moderation capabilities remain strong and that he is committed to preventing the platform from becoming a “free-for-all hellscape”.

In an update to the platform’s rules on hate speech last week, however, Musk said “negative/hate tweets” would be “deboosted & demonetized”, but not necessarily removed. He added that users “won’t find the tweet unless you specifically seek it out, which is no different from the rest of [the] internet”.

It is unclear how this will apply to abuse that tags individuals or mentions them by name, since those targeted are likely to see the post without seeking it out.

All the tweets identified in the CCDH’s analysis mentioned footballers by name or tagged their Twitter handle. Many were posts beneath official tweets from football clubs or news sites.

They included tweets telling footballers to “go back to Africa”, likening players to apes and chimps and calling for them to be deported. The tweets were flagged through Twitter’s in-app reporting tool.

Twitter was contacted for comment but did not respond. It has laid off much of its communications team.

The content policy currently on its website says it prohibits “targeting others with repeated slurs” and that in cases of “severe, repetitive usage of slurs, where the primary intent is to harass”, tweets may be removed.

It also prohibits “dehumanisation” of a group of people based on characteristics including race.