Stop the Robots protest at SXSW. (Photo: SXSW)

The worldwide movement led by prominent scientists and technology entrepreneurs to ban further research into autonomous weapons and killer robots misses the point of the entire problem, argues a research team from the State University of New York at Buffalo, otherwise known as the University at Buffalo.


A paper published by the team argues that the rush to ban and demonize autonomous weapons, or "killer robots," is at best a temporary solution. The deeper problem is that society is already entering a situation in which these deadly weapons will become a reality.

"We have to deconstruct the term 'killer robot' into smaller cultural techniques," said Tero Karppi, assistant professor of media study, whose paper with Marc Böhlen, UB professor of media study, and Yvette Granta, a graduate student at the university, appears in the International Journal of Cultural Studies.

"We need to go back and look at the history of machine learning, pattern recognition and predictive modeling, and how these things are conceived," said Karppi, an expert in critical platform and software studies whose interests include automation, artificial intelligence and how these systems fail.

"What are the principles and ideologies of building an automated system? What can it do?"

By looking at killer robots, we are forced to address questions that are set to define the coming age of automation, artificial intelligence and robotics.

"The distinctions between combatant and non-combatant, human and machine, life and death are not drawn by a robot," write the study authors.

"While it may be the robot that pulls the trigger, the actual operation of pulling is a consequence of a vast chain of operations, processes and calculations."

Karppi says it's necessary to unpack two different elements in the case of killer robots.

"We shouldn't focus on what is technologically possible," he says, "but rather on the ideological, cultural and political motivations that drive these technological developments."

The Pentagon allocated $18 billion of its latest budget to develop systems and technologies that could form the basis of fully autonomous weapons, instruments that independently seek, identify and attack enemy combatants or targets.

"Are humans better than robots at making decisions? If not, then what separates humans from robots? When we define what robots are and what they do, we also define what it means to be a human in this culture and this society," said Karppi.

Cultural techniques are the principles that precede and shape technical developments.

Originally related to agriculture, cultural techniques were once about cultivation and the processes, labors and actions necessary to render land productive and habitable.

In media theory, however, the cultural-techniques approach is interested in various working parts and multiple evolutionary chains of thought, technology, imagination and knowledge production. It also focuses on how these practices turn into actual systems, products and concepts.

Cultural techniques provide insight into the process of becoming: How we got to now.

"Cultural techniques create distinctions in the world," said Karppi.

"Previously, humans have had the agency on the battlefield to pull the trigger. But what happens when this agency is given to a robot and, because of its complexity, we can't even trace why particular decisions are made in particular situations?"

Karppi and his fellow authors argue in their paper that "there is a need to reconsider the composition of the actual threat."

"Consider how both software and ethical systems operate on certain rules," argued Karppi. "Can we take the ethical rule-based system and code that into the software? Whose ethics do we choose? What does the software allow us to do?"
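Karppi's question can be made concrete with a toy sketch. The following Python fragment is purely illustrative and invented for this article: it is not drawn from the paper or from any real weapons system. It shows how encoding "an ethical rule-based system" as software forces specific choices, and how two different rule sets, each a defensible ethical stance, produce opposite decisions from the same input.

```python
# Illustrative sketch only: a hypothetical rule-based filter showing how
# "coding ethics into software" forces concrete choices. All names,
# thresholds and rules here are invented for illustration.

from dataclasses import dataclass

@dataclass
class Target:
    is_combatant: bool     # a classifier's output -- itself the product of a chain of calculations
    confidence: float      # the model's confidence in that classification
    civilians_nearby: int  # estimated bystanders in the area

# Two rule sets: each hard-codes a different ethical stance.
def permissive_rules(t: Target) -> bool:
    """Engage any target classified as a combatant with moderate confidence."""
    return t.is_combatant and t.confidence > 0.5

def restrictive_rules(t: Target) -> bool:
    """Engage only with near-certain classification and zero bystanders."""
    return t.is_combatant and t.confidence > 0.95 and t.civilians_nearby == 0

# The same ambiguous input yields opposite decisions depending on
# whose rules were chosen -- which is exactly Karppi's point.
ambiguous = Target(is_combatant=True, confidence=0.8, civilians_nearby=2)

print(permissive_rules(ambiguous))   # True
print(restrictive_rules(ambiguous))  # False
```

The software "works" either way; the ethics live entirely in which rules and thresholds were written in, and by whom.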