UK: Technical explainer on X’s recommender system and the 2024 racist riots

In the immediate aftermath of a tragic triple murder in Southport on 29 July 2024, social media platform X (formerly Twitter) became a hotspot for racist, Islamophobic and xenophobic rhetoric. False claims alleging that the perpetrator of the attack was a Muslim immigrant or asylum-seeker gained significant traction online. As hateful narratives spread on X, offline violence erupted: mobs targeted mosques, refugee shelters, and Asian, Black and Muslim communities in a wave of violent racist riots that swept across multiple UK towns and cities. As outlined in the UN Guiding Principles on Business and Human Rights (UN Guiding Principles), companies like X have a responsibility to respect human rights. This includes taking steps to avoid causing or contributing to human rights harms through their design and operations, and to address those harms when they occur.