Killer Drone Hunts Human Targets Without Instruction

Published on 4 July 2021 at 13:01

According to a United Nations report, military drones may have attacked human targets autonomously last year. The complete details of the event have not been released to the public, so it is currently unclear whether there were any casualties, but the incident suggests that international movements and petitions to ban the use of autonomous weapons may already be too late.


The drone in question is a Turkish-built Kargu-2 quadcopter, a deadly attack drone designed for asymmetric warfare and anti-terrorist operations. According to the UN report, the drone targeted one of Libyan general Khalifa Haftar's soldiers as he attempted to retreat.


The drone, which can be directed to detonate on impact, was operating in a “highly effective autonomous mode that required no human controller,” the New York Post reported. “The lethal autonomous weapons systems were programmed to attack targets without requiring data connectivity between the operator and the munition: in effect, a true ‘fire, forget and find’ capability,” the report from the UN Security Council’s panel of experts on Libya said.


According to the drone maker’s website, “the KARGU can be effectively used against static or moving targets through its indigenous and real-time image processing capabilities and machine learning algorithms embedded on the platform.” It is currently in the inventory of the Turkish Armed Forces, where it enables soldiers to detect and eliminate threats in a region without having to enter dangerous areas themselves.


The STM website also includes a video that demonstrates the Kargu-2 in action. Essentially, it flies in like a Second World War kamikaze pilot, detonating an explosive charge in close proximity to the target. On detonation, it disperses pre-loaded munitions designed, depending on the mission, to kill personnel or damage equipment.


The UN and other organizations have protested the proliferation of fully autonomous weapons, or “killer robots,” on the grounds that they would violate the rule of distinction under international humanitarian law, which requires parties to an armed conflict to distinguish at all times between the civilian population and combatants.


Human Rights Watch has called for an end to so-called “killer robots” and is campaigning for a “pre-emptive ban on the development, production, and use of fully autonomous weapons,” according to a report by the non-profit. The organization states on its website that there are serious doubts that fully autonomous weapons would be capable of meeting international humanitarian law standards, including the rules of distinction, proportionality, and military necessity, and that they would threaten the fundamental right to life and the principle of human dignity.

In fact, it’s not just national security forces deploying drones. Criminal gangs and terrorist groups have repeatedly staged attacks by strapping explosives to commercial drones that are freely available to the public.


At the end of the day, legislation on autonomous weaponry is still in the works, with the UN continuing to review possible regulations and laws before any wider adoption of such systems. It is too early to know how things will develop, but the two most likely outcomes are either governments getting the green light to purchase and develop autonomous weapons on par with the Kargu-2, or the worldwide push for a ban succeeding and the production and deployment of autonomous weapons being discontinued.
