Thunderstorm

A lot of rain here the past few weeks, although mostly showers and “Euclidean rain” (a phrase from Scott Bakker’s evocative post The Lesser Sound and Fury). Here the trees are close in, and it can be difficult to really appreciate a good storm.

So Scott’s piece reminded me of when we lived on the other side of the valley – in a former creamery on top of a hill above a bend in the Susquehanna River. Thunderstorms would come down the valley from the west. Sitting in Adirondack chairs on the front lawn, 800 feet above the valley, we would watch each storm come toward us. The blur of rain and hail falling from the thunderhead’s floor, sometimes for a while below where we sat; the visible full height of the cumulonimbus cloud; the advancing thunder; the ionized air: all of it terrified and thrilled us until, in a panic, we would run into the building, itself barely more than a ruin, which seemed in those moments a place of safety.

The Anxious Cyborg – 2

As the human/machine relationship continues to develop, information processing increasingly defines how work is done. In turn, it enters into human considerations of what it means to exist, to intend, and to act.

The flow of information, mediated by computer code, enables a built environment that creates ever-increasing opportunities for more information, and so for more machine work and functioning. As I previously wrote:

From a machinic perspective, the development of M2M technology introduces a reverse instrumentality.  Technology continues to serve cyborg ends, but cyborgs also become data factories for machines.   Technology has begun to have as its end its own growth and evolution as much as whatever human function it may nominally have….  The world becomes the operational environment of technology.  The Anxious Cyborg

This, of course, is simply my particular iteration of a complex of ideas that others have discussed for a long time now. Yet, even accounting for the self-reinforcing character of much of my blog reading, I feel I have been encountering an unusual number of variations and improvements on this theme.

A recent post by R. Scott Bakker gives a flavor of the broadly integrative approach that characterizes his blog. He has a particular interest in exploring the vivid deceptiveness of human self-awareness.

Meanwhile, it seems almost certain that the future is only going to become progressively more post-intentional, more difficult to adequately cognize via our murky, apriori intuitions regarding normativity. Even as we speak, society is beginning a second great wave of rationalization, an extraction of organizational efficiencies via the pattern recognition power of Big Data: the New Social Physics. The irrelevance of content—the game of giving and asking for reasons—stands at the root of this movement, whose successes have been dramatic enough to trigger a kind of Moneyball revolution within the corporate world. Where all our previous organizational endeavours have arisen as products of consultation and experimentation, we’re now being organized by our ever-increasing transparency to ever-complicating algorithms. As Alex Pentland (whose MIT lab stands at the forefront of this movement) points out, “most of our beliefs and habits are learned by observing the attitudes, actions, and outcomes of peers, rather than by logic or argument” (Social Physics, 61). The efficiency of our interrelations primarily turns on our unconscious ability to ape our peers, on automatic social learning, not reasoning. Thus first person estimations of character, intelligence, and intent are abandoned in favour of statistical models of institutional behaviour.  Arguing No One: Wolfendale and the Penury of ‘Pragmatic Functionalism’ R. Scott Bakker

Taking a more political turn, Robin James describes the mutual arising of behavior and data in the context of capitalism.

Big data capital wants to get in synch with you just as much as post-identity MRWaSP wants you to synch up with it. [2] Cheney-Lippold calls this process of mutual adaptation “modulation” (168).   A type of “perpetual training” (169) of both us and the algorithms that monitor us and send us information, modulation compels us to temper ourselves by the scales set out by algorithmic capitalism, but it also re-tunes these algorithms to fall more efficiently in phase with the segments of the population it needs to control.

The algorithms you synch up with determine the kinds of opportunities and resources that will be directed your way, and the number of extra hoops you will need to jump through (or not) to be able to access them. Modulation “predicts our lives as users by tethering the potential for alternative futures to our previous actions as users” (Cheney-Lippold 169). Your past patterns of behavior determine the opportunities offered you, and the resources you’re given to realize those opportunities.  Robin James Visible Social Identities vs Algorithmic Identities
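To make the mechanics of that mutual adaptation a little more concrete, here is a minimal sketch in Python of the kind of feedback loop James and Cheney-Lippold are describing. Everything in it is invented for illustration; the point is only the shape of the loop: the model re-tunes itself to the user’s clicks, while the user can only click on what the model has already chosen to show.

```python
# A toy model (not any real system) of "modulation": the platform's
# picture of the user adapts to the user's clicks, while the user can
# only click on what the model chooses to show. All names are hypothetical.

import random
from collections import defaultdict

CATEGORIES = ["news", "music", "sports", "recipes", "politics"]

# The user's underlying tastes, fixed for the simulation.
user_taste = {"news": 0.30, "music": 0.40, "sports": 0.10,
              "recipes": 0.15, "politics": 0.05}

# The platform's model of the user, learned only from observed clicks.
model_weights = defaultdict(lambda: 1.0)

def recommend(k=2):
    """Show the k categories the model currently scores highest."""
    return sorted(CATEGORIES, key=lambda c: model_weights[c], reverse=True)[:k]

def simulate_click(shown):
    """The user chooses among what is shown, in proportion to taste."""
    return random.choices(shown, weights=[user_taste[c] for c in shown], k=1)[0]

for _ in range(50):
    shown = recommend()
    clicked = simulate_click(shown)
    model_weights[clicked] += 1.0   # the model re-tunes to the user...
    # ...while the user only ever acts within the model's selection.

print("learned weights:", dict(model_weights))
print("what the user is still shown:", recommend())
```

After a few dozen iterations the model’s top slots harden around whatever happened to get clicked early on, a crude picture of how “previous actions as users” come to tether “the potential for alternative futures.”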

Shifting the focus from a systemic and political view, Alistair Croll discusses the individual ethical dimensions of these issues.

Big data is about reducing the cost of analyzing our world. The resulting abundance is triggering entirely new ways of using that data. Visualizations, interfaces, and ubiquitous data collection are increasingly important, because they feed the machine — and the machine is hungry….

Perhaps the biggest threat that a data-driven world presents is an ethical one. Our social safety net is woven on uncertainty. We have welfare, insurance, and other institutions precisely because we can’t tell what’s going to happen — so we amortize that risk across shared resources. The better we are at predicting the future, the less we’ll be willing to share our fates with others. And the more those predictions look like facts, the more justice looks like thoughtcrime.  Alistair Croll New ethics for a new world 
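Croll’s point about prediction and shared risk can be put in nearly arithmetic terms. The numbers below are invented, and the sketch assumes nothing beyond a single pooled loss and a predictor that can sort people into risk tiers; it is an illustration, not anything from Croll’s piece.

```python
# Invented numbers only: how better prediction erodes risk pooling.
# Everyone faces, on average, a 10% chance of a 10,000 loss.

LOSS = 10_000
BASE_RISK = 0.10

# With no prediction, risk is pooled and everyone pays the same premium.
pooled_premium = BASE_RISK * LOSS
print(f"pooled premium, everyone: {pooled_premium:,.0f}")        # 1,000

# Suppose a model can separate out a high-risk fifth of the population
# (risk 0.40); the remaining low-risk group's rate follows from the average.
high_share, high_risk = 0.20, 0.40
low_risk = (BASE_RISK - high_share * high_risk) / (1 - high_share)

print(f"high-risk premium: {high_risk * LOSS:,.0f}")             # 4,000
print(f"low-risk premium:  {low_risk * LOSS:,.0f}")              # 250
```

Once that sorting exists, the low-risk majority has every incentive to leave the pool; the uncertainty the safety net was “woven on” has been priced away.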

Of course, many cyborgs look forward to all of this with optimism and a sense of opportunity.

When you can use AI as a conduit, as an orchestrating mechanism to the world of information and services, you find yourself in a place where services don’t need to be discovered by an app store or search engine. It’s a new space where users will no longer be required to navigate each individual application or service to find and do what they want. Rather they move effortlessly from one need to the next with thousands of services competing and cooperating to accomplish their desires and tasks simply by expressing their desires. Just by asking….

At this contextual “just arranged a date” moment lies an opportunity to intelligently prompt if the user would like to see what is going on on friday night in the area, get tickets, book dinner reservations, send an Uber to pick them up or send flowers to the table. Incremental revenue nirvana.  Dag Kittlaus A Cambrian Explosion in AI Is Coming

But then it’s always been swell to have money.

The Whole Is Greater Than the Part (Part 4)

Euclid Ave Pawn Shop © Mark Wolfe used with permission
Mark Wolfe Documentary Photography

While a fine-grained focus characterizes much of Code/Space, the final chapter takes, at points, a panoramic view. The accumulation of the specifics of coded applications becomes Everyware.

Everyware is the notion that computational power will soon be distributed and available at any point on the planet…With everyware, life unfolds enveloped within software-enabled environments. 216

Taken together, it is envisioned that these various forms of everyware will generate “ambient intelligence” — objects and spaces that are sensitive and responsive to presence of people or other coded objects. 221

K&D analyze this using cost/ benefit binaries such as surveillance vs empowerment.  This enables them throughout the book to present codeness as a tool that we can use in either positive and negatives ways. This is different from the approach I outline in my Findable Cyborg posts Part 1 and Part 3 and imply in posting the DARPA video in Part 2.  These posts discuss pervasively rationalized environments in the context of the technological understanding of being.  Up to this final part of Code/Space, K&D deemphasize this kind of analysis preferring a mostly functional approach. While they do idenify code’s role in extending the negative aspects of neo-liberal capitalism, there is nothing up to this point like R Scott Bakker’s view:

Modern technological society constitutes a vast, species-wide attempt to become more mechanical, more efficiently integrated in nested levels of superordinate machinery. (You could say that the tyrant attempts to impose from without, capitalism kindles from within.) The Blind Mechanic II: Reza Negarestani and the Labour of Ghosts R. Scott Bakker

Yet something like this sentiment is there, in less explicit form.

Everyware promises new opportunities to monitor, link, and make sense of the interactions, transactions, and mobilities of people, goods and information at a spatial and temporal resolution previously impossible…to create a fine-grained net of automated management. 228

This anxiety becomes more explicit in their discussion of life-logging, a practice that takes typical human self-monitoring beyond augmentation by coded devices and makes it a pervasive and ubiquitous part of living.

The aim of life-log developers is to provide a record of the past that includes every action, every event, every conversation and every material expression of an individual’s life. 230

The combination of pervasive automated management and life-logging “has the potential to create a society that never forgets…a detailed spatialization of the history of everything, everywhere.” K&D propose a solution that is both elegant and impossible: the converging of parallel lines of thought on the curved surface of code/space.

One path…is to construct an ethics of forgetting in relation to pervasive computing….[T]echnologies that “store and manage a lifetime’s worth of everything” should always be complemented by forgetting…So rather than focus on the prescriptive [ethics], we envision necessary processes of forgetting…that should be built into code, ensuring a sufficient degree of imperfection, loss and error. 253
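K&D leave open what forgetting “built into code” would literally look like. Here is one minimal sketch, under my own assumptions rather than theirs: a hypothetical life-log store in which every record’s chance of surviving a purge decays with its age, so that imperfection, loss and error are designed properties of the archive rather than failures of it.

```python
# A hypothetical life-log store with forgetting built in. Each record's
# chance of surviving a purge decays with its age, so the archive is
# guaranteed a degree of "imperfection, loss and error". Nothing here
# comes from Kitchin & Dodge; it is one possible reading of their proposal.

import random
import time

HALF_LIFE_DAYS = 365           # after a year, a record is as likely gone as kept
SECONDS_PER_DAY = 86_400

class ForgetfulLog:
    def __init__(self):
        self.records = []      # list of (timestamp, event) tuples

    def log(self, event, when=None):
        self.records.append((when if when is not None else time.time(), event))

    def purge(self, now=None):
        """Keep each record with a probability that halves every HALF_LIFE_DAYS."""
        now = now if now is not None else time.time()
        kept = []
        for ts, event in self.records:
            age_days = (now - ts) / SECONDS_PER_DAY
            survival = 0.5 ** (age_days / HALF_LIFE_DAYS)
            if random.random() < survival:
                kept.append((ts, event))
        self.records = kept

# Usage: old entries mostly fade, recent ones mostly persist.
log = ForgetfulLog()
log.log("walked to the river", when=time.time() - 5 * 365 * SECONDS_PER_DAY)      # five years ago
log.log("thunderstorm over the valley", when=time.time() - 30 * SECONDS_PER_DAY)  # last month
log.purge()
print(len(log.records), "record(s) survive this purge")
```

In a scheme like this, loss has to be deliberately scheduled; it does not arise on its own in systems built to retain.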

Pervasive computing relentlessly increases the signal-to-noise ratio, seeking to eliminate noise altogether. Forgetting is purposely generating noise to reconstitute the human in the face of the totalizing machine. Yet the machines must also be the agents of this forgetting, accepting, as they become more and more powerful, “imperfection, loss and error.” Animal perception functions by filling in the gaps of its always incomplete sensory information. That is why the expression “The whole is greater than the sum of its parts” makes any sense: the perceived whole is always greater than the parts we can perceive. Machine perception has the potential to vastly reduce the unperceived, unprocessed parts. In such a situation, though, the idea of the whole itself becomes dispensable, replaced by a stream of amorphous parts defined by their temporary function. Perhaps there would be hope for K&D’s strategy of forgetting if humans could first provide an example of accepting “imperfection, loss and error.” It remains, though, a measure of the predicament we find ourselves in, and this alone recommends Code/Space.

See Also: The Whole Is Greater Than the Part (Part 1)
The Whole Is Greater Than the Part (Part 2)
The Whole Is Greater Than the Part (Part 3)