killer robots in the uncanny valley

Recently, 1,000 leading artificial intelligence experts and researchers signed an open letter calling for a ban on the development of “offensive autonomous weapons beyond meaningful human control.” The letter was released at the International Joint Conference on Artificial Intelligence in Buenos Aires, Argentina. Initial signatories included Tesla’s Elon Musk, Apple co-founder Steve Wozniak, Google DeepMind chief executive Demis Hassabis and Professor Stephen Hawking. Since then, the number of signatories has approached 20,000.

The letter focuses on autonomous weapons – that is, those over which humans have no “meaningful control”.

Autonomous weapons select and engage targets without human intervention. They might include, for example, armed quadcopters that can search for and eliminate people meeting certain pre-defined criteria, but do not include cruise missiles or remotely piloted drones for which humans make all targeting decisions.

The crucial dimension setting AWs apart from other highly technological/cybernetic weapons, such as drones and cruise missiles, is the automated selection and engagement of targets. In its 2012 report Losing Humanity, Human Rights Watch provided a somewhat expanded version of the difference between autonomous weapons and others:

Unmanned technology possesses at least some level of autonomy, which refers to the ability of a machine to operate without human supervision. At lower levels, autonomy can consist simply of the ability to return to base in case of a malfunction. If a weapon were fully autonomous, it would “identify targets and … trigger itself.” Today’s robotic weapons still have a human being in the decision-making loop, requiring human intervention before the weapons take any lethal action. The aerial drones currently in operation, for instance, depend on a person to make the final decision whether to fire on a target.


The Findable Cyborg Part 4

The US military has an ongoing project in Afghanistan to collect comprehensive biometric data for the entire population. Derek Gregory’s post Biometric War outlines the program and links to a number of resources helpful in understanding it. One of these is Public Intelligence’s Identity Dominance: The U.S. Military’s Biometric War in Afghanistan:

Despite this lack of formal doctrine, the U.S. military is currently using more than 7,000 devices to collect biometric data from the Afghan population… [T]he biometric identifiers being collected in Afghanistan consist primarily of fingerprints, iris scans and facial photographs. Other biological characteristics, which are referred to as modalities, that can be used to identify a person include certain types of voice patterns, palm prints, DNA, as well as behavioral characteristics such as gait and even keystroke patterns on a keyboard…. The stated goal of the Afghan effort is no less than the collection of biometric data for every living person in Afghanistan… [T]he collection of biometric data is not simply about “identifying terrorists and criminals,” but that “it can be used to enable progress in society and has countless applications for the provision of services to the citizens of Afghanistan.”

The lack of formal doctrine is, I think, important.  The Army has operated the program since 2010.  A doctrine would both define the program’s objectives and methods, and exclude other possible uses.

The Army does say that the program is useful for “identifying terrorists and criminals”. Who can argue with that? This data is increasingly used for criminal prosecutions. However, the Army has not discussed the accuracy of the scans, or of the forensic evidence they are compared against, so it is hard to evaluate the soundness of the convictions in Afghanistan this program has been instrumental in obtaining. Many of these were obtained on the sole evidence of biometrics. The Army has not specified the number of convictions obtained, nor what the “countless other applications” it foresees might be. The lack of formal doctrine creates a freedom to pursue uses without justification. In a paper Dr. Gregory cites, Colleen Bell discusses this:

That is, this emergent technology is poised to capture peoples’ biometrics without their consent or knowledge…It also offers the chance to scan whole populations deemed problematic or risky. It is one way forward in the trend towards automating warfare. The course underway suggests that spaces of the global South deemed to be terrorist havens, actual or perhaps even potential zones of conflict are key targets for the development and implementation of new regimes of securitization. This pattern of activity is consistent with experiments in preliberal government that animated colonial rule…

Colonial modes of governance were also experiments in public order, … render[ing] colonized peoples and spaces as laboratories for the limits and possibilities for disciplinary rule (1999:108–111).

Though the hierarchy of relations between the North and South is not one of direct colonial control, in attempting to secure the identity of crisis populations — and by extension the future — there is a rejuvenation of earlier forms of colonial governance evident in the patterns of illiberal governance over subject populations in which local control is circumscribed by coalition mandates, sovereignty is contingent, and practices that are legally taboo in metropolitan settings are permissible in borderlands settings.

Grey’s Anatomy Goes South: Global Racism and Suspect Identities in the Colonial Present by Colleen Bell

Drones as an instrument of warfare have received much attention, becoming a cultural trope. Pervasive biometric gathering and analysis has not. Yet biometrics are essential if remote forms of warfare, like drones, are to succeed in their cultural/mythic mission to create a discourse of surgical war that appears always bloodless for the surgeon and beneficial to the etherized patient.

The gathering of metadata also receives much attention in the West. Even as many object to it, its scale creates a remote, abstract quality. The individual scale of the practices that create the data makes their benefits much more concrete.

Right now, the technology of gathering biometric data is very much “in your face”. Perhaps the technology being developed now in Afghanistan will make that untrue in the future. Perhaps in the near future, not only our financial and communicative movements but our public bodily movements will make us always findable.

The Technological Horizon – The Cyborg Terrain System

An experiential horizon structures one’s gaze into the world.  Jodi Dean expressed this well in a talk on the Communist Horizon, which she asserts is the political horizon of our age.

The term “horizon” marks a division.

Understood spatially, the horizon is the line dividing the visible, separating earth from sky. Understood temporally, the horizon converges with loss in a metaphor for privation and depletion. The “lost horizon” suggests abandoned projects, prior hopes that have now passed away. Astrophysics offers a thrilling, even uncanny, horizon: the “event horizon” surrounding a black hole. The event horizon is the boundary beyond which events cannot escape. Although “event horizon” denotes the curvature in space/time effected by a singularity, it’s not much different from the spatial horizon. Both evoke a fundamental division, that we experience as impossible to reach, and that we can neither escape nor cross.

I use “horizon” not to recall a forgotten future but to designate a dimension of experience that we can never lose, even if, lost in a fog or focused on our feet, we fail to see it.

The horizon is Real in the sense of impossible—we can never reach it—and in the sense of actual. The horizon shapes our setting. We can lose our bearings, but the horizon is a necessary dimension of our actuality. Whether the effect of a singularity or the meeting of earth and sky, the horizon is the fundamental division establishing where we are.
The Communist Horizon by Jodi Dean

As much as anything, the horizon is a boundary, but like all boundaries, it seems definite yet turns out to be queer.

Standing on a shore, perhaps, we see another shoreline about 2.9 miles away across a lake that exactly defines the horizon. Boarding a suitable vessel, we set off and land on that beach, only to turn around and see the shore we left a short while ago now exactly defining the horizon.
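The 2.9-mile figure is roughly what the standard geometric-horizon approximation gives for a standing observer. As a minimal sketch — assuming an eye height of about 1.7 m and ignoring atmospheric refraction — the distance to the horizon is d ≈ √(2Rh):

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in meters

def horizon_distance_m(eye_height_m: float) -> float:
    """Distance to the geometric horizon for an observer eye_height_m
    above a smooth sphere, ignoring atmospheric refraction."""
    return math.sqrt(2 * EARTH_RADIUS_M * eye_height_m)

# Assumed eye height of 1.7 m (about 5'7"); 1 mile = 1609.344 m.
miles = horizon_distance_m(1.7) / 1609.344
print(f"{miles:.1f} miles")  # prints 2.9 miles
```

Refraction typically extends the visible horizon a few percent beyond this geometric figure, so the shoreline in the example may sit slightly farther than the calculation suggests.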

Whatever beach we see in the distance is the technological horizon.  Both beaches, the intervening water, sky, the boat and we, with our vivid perceptions, are all the cyborg terrain.

I take this phrase from the US military’s Human Terrain System, which sets out to describe “the human population in the operational environment”. The Technological Horizon creates the cyborg, not technological artifacts. That is to say that Heidegger’s technological understanding of Being, the Enframing of the counted and measured components of the world [1], is also the Technological Horizon.

It is the Technological Horizon, not the cell phones, tablets etc, that converts the Human to the Cyborg.

It is the Technological Horizon that converts the military operational Human Terrain System to the planetary Cyborg Terrain System.

It is the Technological Horizon that evokes in Cyborgs Dean’s sense of loss of “the forgotten future”.

A horizon, perhaps, is not so much a division as an expression of the relationship among perceiver, figure and ground. We can never reach the horizon, but at the same time the horizon is where we live. A version, perhaps, of Heidegger’s “splendor of radiant appearing.”

[1] The Question Concerning Technology, Martin Heidegger