South Korea: Google fails to tackle online sexual abuse content despite complaints by survivors

Google has failed to fix its flawed system for removing non-consensual sexually violative content from its search results, despite a long-running campaign by South Korean women and girls targeted with digital sex crimes, Amnesty International said today – exactly one year after first highlighting the problem.

Survivors attempting to remove non-consensual sexually explicit videos from the internet told Amnesty International they continued to find Google’s removal request process difficult to navigate, incomplete and unresponsive. 

A year ago today, Amnesty International launched a global petition calling on Google to address the flaws in its reporting system, following years of lobbying by survivors and local activists.

“Women and girls in South Korea who have been targeted with online sexual abuse have a simple aim: to remove this violative and traumatizing content from the internet. But Google continues to make this difficult due to its dysfunctional reporting system,” said Ming Yu Hah, Amnesty International’s Deputy Regional Director for Campaigns.

“Google has had ample time to fix, or at least alleviate, this problem since long before Amnesty first shared survivors’ stories with the company last year, but progress has been insufficient. Google must urgently act to prevent the harm to sexual abuse survivors that is being perpetuated by flaws in its services.”

Many survivors have been trying for years to get sexually explicit videos of themselves removed from Google searches, but the company’s convoluted reporting system and slow responses to requests mean that many of the videos are still online.

In its research published on 8 December 2022, Amnesty International found that Google could greatly reduce survivors’ suffering by quickly processing content takedown requests. The company’s failure to do so had exposed women and girls to prolonged physical and mental harm during the reporting process.

Specifically, survivors said the appropriate reporting forms were difficult to find, and they contained ambiguous categories about the type of content being reported.

In the ensuing year, Google has made alterations and updates to many of the problematic pages identified. For example, there is now a single button within some forms to access the “Report a legal removal issue” form. However, survivors say these changes have not made their situation any easier.

They said problems remained with confusing forms, a lack of transparency in the reporting process and Google’s failure to adequately address survivors’ trauma.

“Nth Room” shadow still hangs over South Korea

In March 2020, a group of South Korean journalists exposed the existence of eight secret chat rooms on the messaging app Telegram where thousands of sexually explicit videos of women and girls were being shared and sold without their consent, with payments made in cryptocurrency. South Korean police said more than 60,000 people participated in the crimes by entering these rooms, in what became known as the “Nth Room” case.

Despite the jailing of some “Nth Room” chat operators, some content connected with the case still appears in Google search results. Perpetrators habitually use existing video content to threaten survivors and coerce them into producing more sexually abusive material.

Unless survivors’ non-consensual content and personal information are deleted, women and girls are subjected to further harm and crimes even after the original perpetrators are punished.

However, survivors have repeatedly had their problems exacerbated when confronted with the slow and confusing process of trying to remove the content from the web using Google’s reporting functionality.

“Google may have made some attempts to address this issue in the past year, but it must do more. Amnesty is echoing the call from women’s rights activists for Google to simplify its reporting processes,” Ming Yu Hah said.

“Google must also start taking more practical measures to prevent re-distribution of non-consensual sexually explicit content, and must take a survivor-centred approach when reviewing materials.”

Google must meet its human rights responsibilities to survivors

The responsibility of all companies to respect human rights is articulated in the UN Guiding Principles on Business and Human Rights, which state that business enterprises should avoid causing or contributing to adverse human rights impacts through their own activities, and address such impacts when they occur.

Google’s own human rights policy states its commitment to “upholding the standards established in the United Nations Guiding Principles on Business and Human Rights”.

“By failing to facilitate the swift removal of this harmful content, Google is failing to meet its human rights responsibilities. The company must put the experiences of survivors at the heart of its reporting system, making it more intuitive and taking steps to reduce the risk of re-traumatization,” said Ming Yu Hah.

“Survivors of digital sex crimes are reliant on Google for the support they need – namely the removal of non-consensual sexually explicit content – and Google must do better in providing this support.” 

On 5 December 2023, Google told Amnesty International it had “a longstanding commitment to fighting technology-facilitated gender-based harassment and violence”. It acknowledged the difficulties survivors faced navigating removal requests “from multiple online platforms and services”, and pointed to its Transparency Center, which helps users understand its product policies, and its Help Center page, created in response to the South Korean Telecommunications Business Act.