The speculative turn here can lead me to omit from my discussions the high-quality blogging of more Reasonable Cyborgs. I call them reasonable not because I necessarily agree with each of them, but because of their approach. Generally, a Reasonable Cyborg writes along a continuum from informed illustration of a trend to a contribution meant to help solve the challenges that the ubiquitous coding of the earth presents.
For the most part, they are doing something different from me, but their work is relevant here, covering ground I mostly neglect. Reasonableness goes beyond mere practicality, to include a vision of the human/technology relationship quite different from what I discuss here.
One of the ideas many Reasonable Cyborgs hold is that technology is entirely an instrument of human subjectivity and decision-making. Much of the time, this idea forms an important part of the conclusion of a Reasonable Cyborg’s argument.
Recently, Michael Sacasas, writing about the prospect of self-driving cars, drove such a point home economically in his post Fit the Tool to the Person, Not the Person to the Tool.
If autonomous cars become the norm and transportation systems are designed to accommodate their needs, it will not have happened because of some force inherent in the technology itself. It will happen because interested parties will make it happen, with varying degrees of acquiescence from the general public…
Choices were made; political will was exerted; money was spent. So it is now, and so it will be tomorrow.
Extending this approach in the post Do Artifacts have Ethics, Sacasas develops a list of 41 questions one might ask before engaging with a new technology. The preamble to this list includes this:
When we do think about technology’s moral implications, we tend to think about what we do with a given technology. We might call this the “guns don’t kill people, people kill people” approach to the ethics of technology. What matters most about a technology on this view is the use to which it is put. …
But is this really the only morally relevant question one could ask? For instance, pursuing the example of the hammer, might I not also ask how having the hammer in hand encourages me to perceive the world around me? Or, what feelings having a hammer in hand arouses.
Cyborgs exhibit reasonableness not only in theoretical contexts, but in practical, problem-solving ones as well.
danah boyd reports in her post Data & Civil Rights: What do we know? What don’t we know? on the Data & Civil Rights Conference she attended. The Executive Summary of the Conference described the approach of those participating as follows:
The event had three main narratives: (1) the roots and contemporary state of civil rights issues, which centered primarily on discrimination on the basis of protected classes, and issues of privacy; (2) the inner workings of the technology and how and when it can create discriminatory outcomes and impacts, particularly through algorithmic decision-making; and (3) the next steps for these discussions, especially in the areas of policymaking, government actions, technology development, generating social change, industry innovation, and new research.
Here, the instrumentality of technology was more implicit than in Sacasas’ approach, but just as central. In her post, boyd implores the reader to take up this task personally.
Moving forward, we need your help. We need to go beyond hype and fear, hope and anxiety, and deepen our collective understanding of technology, civil rights, and social justice. We need to work across sectors to imagine how we can create a more robust society, free of the cancerous nature of inequity… It means that those working to create a more fair and just society need to understand how technology works. And it means that all of us need to come together and get creative about building the society that we want to live in.
Reasonable Cyborgs differ in how much weight they give personal responsibility versus structural societal factors in creating the instrumentality of technology. boyd and Sacasas take an approach that seeks to acknowledge and balance both.
Others, like Nicholas Carr, focus their efforts more on the relationship between individuals and technology. His post A litmus test for technology critics responds to criticism by Evgeny Morozov that Carr’s approach stresses the individual too much.
What particularly galls Morozov is any phenomenological critique of technology, any critical approach that begins by examining the way that the tools people use shape their actual experience of life — their behavior, their perceptions, their thoughts, their relations with others and with the world. The entire tradition of such criticism, a rich and vital tradition that I’m proud to be a part of, is anathema to him, a mere distraction from the political.
Carr does not deny the importance of the structural, the political, but seeks to emphasize the more individual part of the human/technological relationship. For example, his most recent post The illusion of knowledge discusses a study showing “that searching the web gives people an ‘illusion of knowledge.’ They start to confuse what’s online with what’s in their head, which gives them an exaggerated sense of their own intelligence.”
I might use Morozov as a counterpoint to Carr here, but the two have a somewhat contentious relationship that could cloud whatever clarity I’m achieving. Instead, I’ll refer to Jenny Davis’ recent post Exclusionary Algorithms: Jobaline’s Voice Analyzer. Here the observation of a technology application goes beyond the personal to a structural analysis.
Davis discusses Jobaline’s voice analyzer. The company asserts that its analysis of job applicants’ voices can simplify screening and make it less prone to unconscious discrimination. Davis observes:
Algorithms sort in the way that humans tell them to sort. They are necessarily imbued with human values. Hidden behind a veil of objectivity, algorithms have a powerful potential to reinforce existing cultural arrangements and render these arrangements natural, moral and inevitable…
Technological processes are, always, human processes.
This last sentence clearly states the radical instrumentality of technology at the heart of the Reasonable Cyborg’s outlook.