THE French army is training with robot dogs as a controversial new defence report says soldiers should be allowed to use Terminator-style killer droids.
Named Spot, the robotic canine has been used in battle exercises by France's armed forces after first being deployed by cops in New York City earlier this year.
Developed by US firm Boston Dynamics, it is being used by cadets at the Saint-Cyr military college to help during simulated street fighting.
This comes as a defence ministry report, ordered by minister Florence Parly, said French soldiers should be allowed to use “partially autonomous” killing machines while fighting.
The Defence Ethics Committee said the Terminator-style systems would help identify and engage enemy targets while keeping human fighters informed.
Soldiers would be able to stop the machines, the report says.
Western countries including Britain and the US have refused to halt the development of robotic fighting machines over fears that China and Russia will race ahead with the technology.
The French report did call for a ban on fully independent systems “programmed to be able to change their rules of operation”.
[Image caption: The French army used the AI machine during street fighting exercises]
The automation of weapons will become essential to cope with the speed of future threats such as missiles, which are already approaching five times the speed of sound, the French committee says.
Unmanned attack drones remotely operated by humans have been used for years in the Middle East, with a 2014 report estimating that 2,400 people were killed by US-operated strikes in the previous five years.
However, drones can already fire at enemy targets without a human’s order. Meanwhile, AI naval vessels and tanks are currently being developed.
Although Spot, the 31kg robot dog, has been used by police and soldiers, Boston Dynamics said it did not want the machine used for hurting people.
Michael Perry, the company’s VP, told The Verge: “We do not want any customer using the robot to harm people.
“This forward-deployment model…is something that we need to better understand to determine whether it is actively being used to harm people.”
Source: The Sun