
Saturday 24 January 2015

The US Military Is Building Gangs of Autonomous Flying War Bots

Defense One 

For the Pentagon, drones are cheaper to buy and to operate than regular fighter jets. An armed MQ-9 Reaper drone runs about $14 million, compared to $180 million or more for an F-35 Joint Strike Fighter. But unlike barrel-rolling a jet, the business of actually operating an unmanned aerial vehicle, or UAV, for the military is lonely, thankless, and incredibly difficult. It’s no wonder the Pentagon doesn’t have enough drone pilots to meet its needs, a problem certain to persist as the military increases its reliance on unmanned systems, especially in areas where it has no interest in putting boots on the ground, like Pakistan or Iraq. The solution the military is exploring: increasing the level of autonomy in UAVs to allow one pilot to manage several drones at once.

The Defense Advanced Research Projects Agency, DARPA, put out a call for ideas this week as part of the “Collaborative Operations in Denied Environment,” or CODE, project. Today, the majority of the drones that the military is using in the fight against ISIL require two pilots. The agency is looking to build packs of flying machines that communicate more with one another than with their operator, which, in turn, would allow a single operator to preside over a unit of six or more drones. Together, the flying robot pack would “collaborate to find, track, identify and engage targets,” according to a press release.

It’s the “engage” portion of that release that rings of Skynet, the robotic tyrant system made famous by the “Terminator” movie franchise. But the drones that DARPA is envisioning would not fire on human targets without approval from another human. The request also states that the targeting would be “under established rules of engagement.” What are those rules when it comes to robots? In deciding what drones should and should not be allowed to do, the Defense Department relies on a 2012 directive that states that autonomous weapons “shall be designed to allow commanders and operators to exercise appropriate levels of human judgment over the use of force.” DOD officials are always eager to remind reporters that, when it comes to armed robots being allowed to select and fire on targets, the department doesn’t want to take humans out of the loop.

Even so, the legality of U.S. drone strikes, particularly those in which civilians die, remains a matter of some dispute. Ben Emmerson, the United Nations’ special rapporteur on human rights and counter-terrorism, authored a report in 2013 that found that 33 drone strikes may have violated international humanitarian law.
 
