In her essay about the hidden bias of machine learning algorithms, Cathy O’Neil offers a simple and obvious truth about the nature of the algorithmic tools that increasingly influence many of the most important decisions in our society: “AI and machine learning algorithms are marketed as unbiased, objective tools. They are not.”
Her words indict not just the failings of a specific technology, but also the larger fiction of neutrality that invisibly coats so much of the world as we understand it. In linguistics, there is the notion of deixis: the idea that language is coded, inevitably, with countless markers of positional context. The words “I” or “here” or “now” are not monoliths of meaning that stand self-sufficient and independent of each other, but indicators of position that are inescapably enmeshed in notions of “you” and “there.” You cannot exist, or express that existence, without coming from a particular place and pointing at another.
As David Foster Wallace observed in his speech “This is Water,” the formative conditions of our lives and our society are often as invisible to us as they are real, and difficult to see precisely because we are so suffused in them. A fish does not contemplate the nature of water, much as we do not always contemplate the foundational impact of our families, our governments, our societal perceptions of value and ability. We simply move through them, if we are able to do so without friction, and call them the world.
Remaining empathetic while staying critical of the darker and unquestioned forces that move beneath the surface of our culture is often a challenging task, but also an essential one if we hope to escape the solipsism of walking through life from behind only one set of eyes, and one set of experiences.
“The really important kind of freedom,” said Wallace, “involves attention and awareness and discipline,” the ability to question the ground we walk upon with every step we take, rather than striding forth with the incurious blinders of inherited “truth” wrapped firmly around our eyes. We do this only by cultivating our own awareness about “what is so real and essential, so hidden in plain sight all around us, all the time, that we have to keep reminding ourselves over and over.”
Too often, we consider the world—and the data, tools, and systems that make it run—from a perspective that takes the biases of its creators for granted, and institutionalizes their blind spots in ever more troubling ways. In the case of algorithmic decision-making tools, when we ignore the bias of the data that informs them and the notions of success and accuracy that their code expresses, we create tools that can become substantially worse than human decision-making. A human, even a biased one, can always decide simply to take a chance on someone, or to make an exception based on a “gut feeling.” A machine that is trained to reiterate and even amplify existing biases cannot. Our technological children have no empathy or situational awareness to temper their worst impulses, and ours; they merely run the scripts we have given them, without context or contemplation.
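To make this dynamic concrete, here is a minimal sketch with entirely synthetic, hypothetical data (no real system or dataset is depicted) of how a model trained on biased historical decisions learns to reproduce that bias even when the two groups are equally qualified:

```python
# A minimal, hypothetical sketch: a model trained on biased hiring
# decisions reproduces the bias. All data here is synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Two groups with identical underlying qualification distributions.
group = rng.integers(0, 2, n)        # 0 = historically favored, 1 = not
skill = rng.normal(0, 1, n)          # true qualification, same for both groups

# Historical labels: past human decisions favored group 0 regardless of skill.
hired = (skill + 1.0 * (group == 0) + rng.normal(0, 0.5, n)) > 0.5

# The model treats group membership as just another feature and learns the bias.
X = np.column_stack([skill, group])
model = LogisticRegression().fit(X, hired)

for g in (0, 1):
    rate = model.predict(X[group == g]).mean()
    print(f"predicted hire rate, group {g}: {rate:.2%}")
# Despite identical skill distributions, the model recommends group 0 far
# more often: it has faithfully learned the historical bias, not merit.
```

The point of the sketch is that nothing in the code is malicious; the model simply optimizes for agreement with the past, and the past was not neutral.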
When we talk about technology being neutral, we are engaging in a more specialized version of a much larger argument: the idea that the world itself is neutral, and that the particular configuration of reality that we engage with from day to day is somehow natural and fair, rather than a product of countless historical, economic, environmental, and cultural forces that were as incidental as they were formative.
Algorithms do not emerge fully formed into existence like digital Athenas springing from the head of a perfectly objective Zeus. They are crafted by the hands of flawed human beings and operate on data that is generated by the same. Much like our words, they are always saying something about where we are coming from and where we are going. We should pay attention to what we are saying, especially in the moments when we think we are not speaking at all. To ignore this is to perpetuate all of the biases that course unexamined through the veins of the world.
“Neutrality” has always been the province of the powerful. Much as whiteness is often perceived as not having a race and maleness as not having a gender, existing within the dominant social groups of a society has historically meant not having to examine the forces that empower them or produce them. The ability to bathe in the frictionless ease of being the default—of never having to consider these questions of bias at all—is the height and definition of unearned advantage.
So let us question those forces, and let us stop handing over the most important decisions that our institutions can make—who gets jobs, loans, prison sentences, and opportunities—to systems whose methodologies are opaque and whose impacts on the most vulnerable members of society may be substantially worse than the mistakes we might make with our own human hands. If algorithms can truly do better than us and create fairer outcomes, then let them prove it first, before we hand them the keys to the kingdom and absolve ourselves of what happens next simply because the outcomes are generated by machines.
We will always be coming from somewhere, a place that is as complicated, conflicted, and complicit with the forces of power that shaped us as the things we shape in return. There is no neutral way to approach algorithms, no way to reduce the complexity of the world to the sleek simplicity of ones and zeros, no matter how seductive the idea may seem. To be responsible and ethical is to demand acknowledgement and transparency about the choices that we make—especially the ones that might not initially seem like “choices” to us at all—and the values that they code into the technological foundation of the world.
“It is unimaginably hard to do this, to stay conscious and alive in the adult world day in and day out,” concluded Wallace, “which means yet another grand cliché turns out to be true: your education really is the job of a lifetime. And it commences: now. I wish you way more than luck.”