Stop Killer Robots

Governments and companies are rapidly developing weapons systems with increasing autonomy using new technology and artificial intelligence. These ‘killer robots’ could be used in conflict zones, by police forces and in border control. A machine should not be allowed to make a decision over life and death. Let’s act now to protect our humanity and make the world a safer place.

We are facing the unthinkable: drones and other advanced weapons are being developed with the ability to choose and attack their own targets – without human control. Once the stuff of science fiction, autonomous weapons, or ‘killer robots’, are no longer a problem of the future.

Machines can’t make complex ethical choices. They lack compassion and understanding, and they make decisions based on biased, flawed and oppressive processes. Emerging technologies like facial and vocal recognition often fail to recognize women, people of colour and persons with disabilities. This means that autonomous weapons can never be adequately programmed to substitute for human decision-making.

The replacement of troops with machines makes the decision to go to war easier – and through illegal transfers and battlefield capture, these weapons will fall into the hands of others. What’s more, these technologies will be used in policing and border control, and to threaten human rights and protections, including the right to protest, the right to life, and the prohibition of torture and other ill-treatment. Despite these concerns, countries like the US, China, Israel, South Korea, Russia, Australia, India and the UK continue to invest in autonomous weapons.

We have an opportunity to act now. As companies and defence departments around the world race to develop these technologies, we must act fast before we lose meaningful human control over the use of force – with devastating consequences.

Sign the petition and stop killer robots from making decisions over life and death.

We call on government leaders around the world to launch negotiations for new international law on autonomy in weapons systems – to ensure human control over the use of force and to prohibit machines that target people, reducing us to objects, stereotypes and data points.