About The Episode
We look at the future of automation within the military: current examples, possible future implications, and the ethical issues that come with it.
Additional Notes:
A fictional dystopian scenario featuring palm-sized mini explosive drones, or 'slaughterbots', very reminiscent of a Black Mirror episode. The video went viral and is used as communications material for the campaign to ban lethal autonomous weapons: https://www.youtube.com/watch?v=TlO2gcs1YvM
Transcript
This week we will be looking at automation in the military. Even though automation is having, and will have, a massive impact in the commercial and civilian sectors, it has been argued that the greatest beneficiary of automation technologies is going to be the military. It has been a topic in the back of my mind for over a year now. I went to a conference on exoskeletons last year in Berlin where a number of passive exoskeletons (meaning no battery or power, just springs and weights) were showcased. The most impressive presentation was of a US military contractor exhibiting their full-body passive exoskeleton on an actual soldier. The soldier got on the stage with a 40kg backpack and ran back and forth along the length of the stage a number of times. He then talked a bit about how the task was quite easy and how he didn't feel like he was exerting himself very much. The exoskeleton was very flexible and ran down the length of his spine to his hips, and then down the outside of his legs into specially made boots, where the weight was transferred. The cost was still quite high for such a device, at some €50,000, but given that most soldiers need to carry ample amounts of gear in all sorts of environments, this device was argued to be more than worth the price tag; the contractor also claimed that with further development and different materials the price could drop significantly. Now this is obviously not an example of full automation but rather of augmenting a human to modify the required work tasks. One could argue that most if not all new technologies brought to the military do some aspect of this. But I think the more interesting (and perhaps scary) thing to discuss is the fully autonomous systems that are being put in place nowadays.
Now I think most of us have come across this idea in one form or another, perhaps most famously through the various Hollywood films touting the danger that automated, weaponized robots could pose to us in the future, the Terminator movies being the classic example. Personally I never gave much credit to these ideas, but after looking further into this for this episode specifically, I'm much less hopeful for the automated military future than before, mainly because of one technology in particular which I'll get into later in the episode. No, I don't believe the robots will take over, BUT there are better arguments for a problematic future than I originally thought.
We’ve seen military operations trend towards less direct human involvement, mainly with drone strikes, specifically Predator drones, which are unmanned but remote-controlled by a human operator, notoriously used in Afghanistan and Iraq. It is estimated that more than 80,000 surveillance drones and almost 2,000 attack drones could be purchased around the world in the next 10 years. In 2019 alone, the world’s 10 biggest drone users, from the USA, China, and Russia to Israel and Turkey, are expected to spend some $8bn on new drones. Pilots on rotation control devices that are able to loiter in a conflict zone for about 16 to 20 hours, and are able in theory to hit a target the size of a household pane of glass. Justin Bronk, from the military think tank the Royal United Services Institute, says “drones are five to six times more efficient than conventional air missions” – hence the reason for their continued increase in use. But, as discussed in the episode on autonomous vehicles, DARPA, the research arm of the US military, has been interested in automated systems for quite some time, and its autonomous vehicle challenge really acted as the initial launch for the entire autonomous vehicle boom we’ve been seeing over the last decade. Unlike autonomous cars on our streets, a number of autonomous military systems are already in use, and a number more are still being developed.
Examples
One of the more publicized examples was BigDog from Boston Dynamics, funded by DARPA. BigDog was an autonomous four-legged robot that could cross difficult terrain and carry up to 150kg, though it was eventually cancelled in 2015 for being too noisy for combat. Also in 2015, a semi-autonomous Russian tank, the T-14 Armata, was showcased by the Russian military at the Victory Day parade, marking the first next-generation tank to enter serial production. The T-14 Armata has an unmanned turret; instead of a traditional three-person crew to drive the tank and man the turret, the new T-14 only requires two crew members to operate. This reduces the weight and vulnerability of the vehicle, as the tank requires less space for the extra crewman and leaves the enemy with one less spot to land a kill shot. Since the turret is unmanned, that part of the structure is essentially autonomous. Unpredictable land terrain is a particularly difficult issue to overcome, with hard and soft ground, sand, bog, water, trenches and more; there are far fewer obstacles to consider in the air domain. For this reason the future of air combat is expected to shift towards autonomous systems and the use of the “autonomous wingman”: Boeing’s autonomous fighter jet, as mentioned in episode #6 on autonomous vehicles, is currently under development, and Boeing plans to sell it to customers around the world in 2020. A unique aspect of this is that an autonomous aircraft could feasibly travel faster and farther than a human pilot’s physiology would allow, opening the door to new types of missions. Similarly, navigation on water is much easier. The US Navy’s AN-2 Anaconda gunboat is being developed as a “completely autonomous watercraft equipped with artificial intelligence capabilities” that can “stay in an area for long periods of time without human intervention”.
The US Sea Hunter autonomous warship is an unarmed 40 metre-long prototype that has already been launched and can cruise the ocean’s surface without any crew for two to three months at a time.
Jobs in general
As with most other professions, tasks in the military are susceptible to automation. A study of the US Navy used highly detailed descriptions of the tasks done in each of its 1,500 occupations. Around a quarter of the occupations, about 400, were found to have a higher than 70 percent chance of being automated, but more than half of the occupations in the Navy are not likely to be directly affected by the coming of the robots. Almost all of the occupations deemed at high risk of being automated are in support services. Typical examples are culinary specialist, data transcriber, and accounting, along with a range of occupations involved in the maintenance and operation of equipment; we can also add pilots and soldiers themselves to a certain extent. The study concludes that, based on its findings, there is reason to believe the potential for automation is somewhat lower in the armed forces than in the economy as a whole. It is safe to assume that, to a certain degree, the same applies to other parts of the military, and not just in the US.
Ethical considerations
Though the automation, in full or in part, of the jobs and tasks of the 20 million people across the world who serve in the military is important, as I mentioned at the start of this episode I wanted to spend a bit more time on the ethical issues of an automated military force. Though this is something I try not to dive into in other episodes, I think it’s especially important given this specific topic. The overall push for using autonomous military systems is risk reduction for ground forces: when commanders can engage a threat with a non-human tool, the risk to actual soldiers drops drastically. This will be especially valuable for the combat operations of the future, such as subterranean combat and operations in the confines of megacities and dense urban environments, where small hallways, lack of cover, and lack of navigation options greatly increase the risk for infantry units, along with explosive device clearing. This matters all the more now that the US Army has announced it is investing $572 million into training and equipping 26 of its 31 active combat brigades to fight in large-scale subterranean facilities. Linked to this is the deployment of fully autonomous combat drones, a mixture of AI and drones (see the episodes on each), which is argued to support this form of modern warfare. Because remotely operated drones are susceptible to jamming and hacking, as experienced in the 2011 spoofing attack that brought down a US drone over Iran, drones will have to shoulder more decision-making. They’ll know their mission objective, and they’ll react to new circumstances without human guidance. They’ll ignore external radio signals and send very few of their own. Beyond this, through cyber espionage it is very likely that a successful drone design will proliferate on the gray market. And in that situation, sifting through the wreckage of a suicide drone attack, it will be very difficult to say who sent the weapon.
Anonymous lethal weapons could make lethal action an easy choice for all sorts of competing interests, and this would put a chill on free speech and popular political action, the very heart of democracy. These ideas in particular are taken from a powerful TED talk that I will link in the show notes.
But it isn’t baseless speculation. Laura Nolan, an engineer at Google, quit over fears of AI killer drones after being recruited in 2017 to Project Maven, a project focused on dramatically enhancing US military drone technology. Nolan and others were asked to build a system where AI machines could differentiate potential enemy targets, people, and objects at a far faster rate than human military operatives. She outlined how external forces, ranging from changing weather systems to machines being unable to work out complex human behaviour, might throw killer robots off course, with possibly fatal consequences: “The other scary thing about these autonomous war systems is that you can only really test them by deploying them in a real combat zone.” Nolan said killer robots not guided by human remote control should be outlawed by the same type of international treaty that bans chemical weapons. Nolan and several others are pushing for a stop to killer robots and have a campaign trying to accomplish this; you can find it at stopkillerrobots.org. But it might already be too late: there are already examples of suicide drones being developed. I’ll link videos to both in the show notes, but I was able to find an Israeli suicide drone video showing the drone tracking and moving towards a wooden human target before exploding. There is also the US Switchblade, which is marketed for use on board naval ships. This was the first episode so far where I had to take several breaks while researching it. The videos I watched and the articles I read painted quite a bleak picture of the automated military future and the massive negative consequences that accompany it. I highly recommend you go to the show notes, where I’ve listed a number of the articles and videos talked about in this episode.
That’s it for this episode. I hope you enjoyed it and that it helps you think about the future of automation within the military. It honestly took quite a bit out of me and definitely made me think a lot about the possible problems associated with automation. Next week we’ll look at space exploration, which I hope is a little more positive. As always, leave a review or comment, and thanks for listening.