Twitter still failing women over online violence and abuse

Twitter is still not doing enough to protect women from online violence and abuse, despite repeated promises to do so, new analysis by Amnesty International reveals.

The Twitter Scorecard grades the social media company’s record on implementing a series of recommendations to tackle abuse against women on the platform since Amnesty International first highlighted the scale of the problem in its 2018 Toxic Twitter report. Despite some welcome progress, Twitter needs to do much more to address the problem. The company has fully implemented just one of ten concrete recommendations, and has made only limited progress in increasing transparency about how it handles reports of abuse.

“Twitter is still not doing enough to tackle the deluge of abuse women face on the platform. Our analysis shows that despite some progress, Twitter is not doing enough to protect women users, leading many women to silence or censor themselves on the platform,” said Rasha Abdul Rahim, Co-Director of Amnesty Tech.

“We have outlined clear, straightforward steps that Twitter can take to make its platform a safer place for women to express their views. Twitter can and must do more to protect women from abuse.”

Since the release of Toxic Twitter in 2018, Amnesty International has continued to highlight the scale of abuse women face on Twitter, including in Argentina, India, the UK and the USA. Meanwhile, women have continued to speak out about the abuse they experience on Twitter, and about the company’s failure to adequately respond.

The persistent abuse women face on the platform undermines their right to express themselves equally, freely and without fear. This abuse is highly intersectional: women from ethnic or religious minorities, marginalized castes, lesbian, bisexual and transgender women, non-binary individuals, and women with disabilities are disproportionately affected by abuse on the platform.

Indian author and activist Meena Kandasamy said: “Being a Tamil, mixed-caste woman, who speaks out against India’s discriminatory caste system, has proved an explosive mix on Twitter. I receive a torrent of racist and misogynistic abuse, including rape threats. Twitter always seems to be playing catch-up and is too slow to address the different types of abuse women face. Twitter is a powerful place to express ourselves, but Twitter needs to do more to clean up the platform and make it a safe place for women.”

Amnesty International provided Twitter with concrete recommendations on how it can better meet its human rights responsibilities, highlighting ten that we believe are key to tackling online abuse against women. The Twitter Scorecard uses a traffic light system to grade Twitter’s progress in implementing the recommendations, which cover transparency, reporting mechanisms, and enhanced privacy and security features. Red means the recommendation has not been implemented, amber indicates work in progress, and green means the recommendation has been fully implemented.

Given the lack of meaningful data Twitter provides, it is difficult even to gauge the full extent of the problem. For example, Twitter still does not provide detailed country-level breakdowns of user reports of abuse, nor does it provide data on how many users report specific kinds of abusive language, such as abuse based on gender or race.

Twitter is also reluctant to disclose detailed information about the number of content moderators it employs, including what coverage they provide across different countries and languages.

The social media platform also needs to be more transparent about how it designs and implements automated processes to identify online abuse against women. While Twitter has disclosed details of how it is using algorithms to combat misinformation during the current COVID-19 pandemic, it has yet to provide the same level of transparency about how algorithms are used to address abusive tweets.

Twitter has made welcome progress in some areas, including improving the appeals process by offering more guidance to users on how the process works and how decisions are made. The company was graded amber for its efforts to increase users’ awareness of privacy and security features and to educate users on the harm such abuse causes.

Twitter has a responsibility to respect human rights, including the rights to live free from discrimination and violence and to freedom of expression and opinion.

“It is totally in Twitter’s power to implement these changes that would make a real difference to millions of women’s experience on the platform,” said Michael Kleinman, Director of Amnesty International’s Silicon Valley Initiative.

“Twitter CEO Jack Dorsey needs to match words with action to show he is genuinely committed to making Twitter a safer place for women. We will continue to press the company until we see more changes that truly show that abuse against women is not welcome on the platform.”

Twitter’s response

In response to our analysis, Twitter acknowledged it needs to do more. However, the company said its combination of human moderation and technology allows it to respond more proactively to online abuse. On publishing data disaggregated by country or region, Twitter argued this could be open to misinterpretation and give a misleading impression of the problem.

While Amnesty International acknowledges that context is important, there is nothing to stop Twitter providing context alongside the data, and the company’s human rights responsibilities mean it has a duty to be transparent about how it deals with reports of violence and abuse.