There are photographs and videos of so many of them: Talmenes, Al-Lataminah, Kafr Zita, Khan Sheikhoun – all chemical weapons attacks in Syria. They have been documented and verified, but rarely have they been so central to how states publicly justify their policies.
But an alleged chemical attack on Douma, in the Damascus Countryside, on 7 April changed all that. In their justification for retaliatory military strikes on targets in Syria, official speeches by UK Prime Minister Theresa May referred specifically to “harrowing images of men, women and children lying dead with foam in their mouths”, and the White House to videos and images showing “the remnants of at least two chlorine barrel bombs from the attacks with features consistent with chlorine barrel bombs from past attacks.”
We are not doubting the veracity of these images. Indeed, we at Amnesty International worked to verify many of them. A core part of our work in conflict areas is to assess whether militaries and armed groups adhere to applicable international legal obligations. Verified open source evidence can be crucial to assessing compliance with the rules of international humanitarian law and human rights law.
What is remarkable about these images, though, is how front and centre they have been in these states’ justifications for their actions in Syria on 14 April. Yes, clearly, there is other evidence behind them. The French government mentions interviews with people on the ground, and the US cites “reliable information indicating coordination between Syrian military officials before the attack”. But it is also true that, with the Organisation for the Prohibition of Chemical Weapons (OPCW) initially unable to enter Douma, authoritative, publicly available evidence has been hard to come by. This is in contrast, for example, to the US airstrike following the Khan Sheikhoun attack in April 2017, before which the Turkish Health Ministry issued a statement saying that sarin gas had been used, a finding the OPCW subsequently confirmed.
With our repeated requests to access Syria continually denied or ignored by the government, Amnesty International has turned to open source information – the videos and photographs posted on the internet or shared on social messaging networks such as WhatsApp – to support our research and campaigning to protect civilians caught in the conflict. Had we not done so, our work would have been greatly hampered.
As someone who has spent long hours sifting through, verifying and mapping videos and photos depicting attacks against civilians and destruction of civilian objects, I am certain that the recent images from Douma that have been collated and verified – using well-established methodologies – by trusted organizations are not faked.
That is not to say, however, that there are no fake images out there. There are – and this is part of a wider problem. Time and time again, with examples of faked content to hand, governments, armed groups and their partisans can trot out the tired trope of “fake news” to mask something horrendous. A human rights violation is committed and images from a different time or place suddenly appear on social media. The Syrian government then exploits the falsity of those images to undermine other, verified material – and the suffering, trauma and distress it documents.
The aftermath of the 7 April attack on Douma saw a spike in this type of false content – a Pentagon spokesperson cited a 2,000% increase in activity by Russian bots. A particularly striking example, spotted by the online investigation collective Bellingcat, was footage shared as “proof” that the White Helmets had staged the attacks. In fact, it was taken from “Revolution Man” – a fictional film, funded by the Syrian Ministry of Culture, about a journalist who enters Syria and fabricates chemical weapons attacks. Even if there were rigorously verified public information to counter such false narratives, the old adage that “a lie spreads around the world before the truth has got its trousers on” reigns supreme in the online sphere. Indeed, a recent MIT study published in Science concluded that, on Twitter, false news spreads faster and further than the truth.
While Amnesty International recognizes how useful open source intelligence can be to corroborate and verify events, it rarely forms the backbone of our research and analysis. In Syria, we continue to work as hard as we can to get first-hand interviews from victims, eyewitnesses and experts on the ground. We can’t always access all parts of the country officially or safely, but our research teams are in constant contact with their networks across the region. Open source video and images are now part of that process, but they’re not the only part. Our Digital Verification Corps – a team of around 120 volunteers at prestigious universities in five countries – employs a robust methodology for sourcing and verifying such content. We can only use the content when it meets certain rigorous standards; otherwise we have to discard it. Why? Because anything less would threaten the credibility of our – and the wider human rights community’s – fact-finding, at a time and in an information environment that desperately needs it.
In the digital age, perhaps it is not surprising that online, open source content is increasingly central to states’ public diplomacy, and can even inform decisions by international institutions. A recent International Criminal Court arrest warrant for war crimes in Libya, for example, hinged in large part on verified digital evidence of the alleged crimes. When carried out properly, open source investigations can form a crucial new route to justice and accountability for the victims of atrocities in hard-to-reach places, where a mobile phone may be the only witness capable of sharing the story with the wider world.
But just as the “fog of war” can lead to misunderstanding and doubt, states should be attuned to the morass of conflicting and competing narratives on digital platforms in the wake of atrocities such as the Douma attack. If we are to ensure that open source content is effectively used to hold perpetrators to account, we all – states, international institutions, the media and civil society organizations such as Amnesty International – need to ensure that our verification methodologies are clear, transparent and robust. Doing otherwise leaves the door open to the spread of misinformation and propaganda that undermines the truth – with potentially devastating consequences for victims of actual violations and crimes.