Book Notes: “Stand Out of Our Light”
Stand Out of Our Light might be the best and most accessible introduction to the human stakes of the attention economy.
Much of our digital tech has goals for us that are not our own, and this is unacceptable.
That’s the key idea of James Williams’ book, Stand Out of Our Light: Freedom and Resistance in the Attention Economy. Williams is a former Google ad strategist turned Oxford-trained philosopher, and he has a few ideas about what makes technology ethical or unethical. Stand Out of Our Light is among the best (and shortest) introductions to the stakes of the attention economy and of digital media more generally. Sure, readers won’t get a detailed breakdown of the inner workings of surveillance capitalism, as they would from Shoshana Zuboff’s The Age of Surveillance Capitalism, but they’ll understand why it matters. And that’s crucial, because that’s how you build an appetite for reform (or rejection).
Williams argues that the tools claiming to help people organize their lives and pursue their passions are too often at cross-purposes with the goals of the very users they’re meant to help. A helpful term that Williams deploys throughout the book is “adversarial design,” which he defines as any technology that deliberately privileges our impulses over our intentions. Adversarial design that “militates against our pursuit of higher goals and values” is unacceptable. Rejecting adversarial design starts with understanding that attention and distraction run deeper than most of us think.
Deep Distraction
Why make such a big fuss about distraction, though? Because the consequences of pervasive distraction may run deeper than losing one’s focus. Williams’ argument depends on his wider definition of attention, which he divides into the “spotlight,” “starlight,” and “daylight” varieties. He presents these human faculties of attention (and, crucially, intention) as under threat from a collective distraction that, under the attention economy, is not made of mere moments of interruption but constitutes a disposition of distraction. Illustrating the issue at hand, Williams writes about what it’s like to use a faulty GPS device:
Who would continue to put up with a device they knew would take them somewhere other than where they wanted to go? What reasons could anyone possibly have for continuing to tolerate such a thing? No one would put up with this sort of distraction from a technology that directs them through physical space. Yet we do precisely this, on a daily basis, when it comes to the technologies that direct us through informational space. We have a curiously high tolerance for poor navigability when it comes to the GPSes for our lives – the information and communication systems that now direct so much of our thought and action.
Our tolerance of adversarial technology (epitomized by “free” services) is so high that some writers charting a way through the digital age believe the burden is on users to make the best of things. In a review of IRL: Finding Realness, Meaning, and Belonging in Our Digital Lives, the reviewer notes the irony of seeking belonging and meaning in tools that are designed to direct our attention and sell things. Yet so many still believe that “the more we are able to be aware of the dangers online life poses to our quest for self-knowledge, the more we can enable ourselves and others to avoid those dangers.”
This thinking, where self-regulation is the solution to industrial-scale distraction, is at best a stopgap. It’s accommodation where there should be resistance. Building on ideas from Matthew Crawford’s The World Beyond Your Head, Williams explains the cost of “bringing your own boundaries”:
Sometimes, taking on this additional self-regulatory burden is totally worth it. Other times, though, the cost is too high. […] So when the self-regulatory cost of bringing your own boundaries is high enough, it takes away willpower that could have been spent on something else.
It’s the “something else” that’s the crucial missing piece in most discussions of the attention economy, and it prompts Williams to go so far as to question the existence of advertising in an age of information abundance.
Wanting What We Want to Want
The increasingly hostile demands on our attention are eroding our ability to pursue our intentions, and they should lead us to ask what tools like Facebook and much of Google’s empire are even for. Are human goals and intentions only incidental to the purposes of the machines rumbling beneath the surface? Self-regulation is a necessary part of life, but self-regulation in the face of a vast industrial infrastructure of persuasion and distraction is no solution. Tools that are adversarial to human intentions are unacceptable, and the assumptions and systems that encourage the worst of the attention economy should be interrogated, repudiated, and replaced. That’s a big part of what I want to do here at Good Words.
As mentioned earlier, Stand Out of Our Light might be the best and most accessible introduction to the human stakes of the attention economy. While I’ve done my best to share some key insights here, I encourage you to read the whole thing (available here, via Open Access).
I’ll leave you with this passage from the preface:
For too long, we’ve minimized the threats of this intelligent, adversarial persuasion as mere “distraction,” or minor annoyance. In the short term, these challenges can indeed frustrate our ability to do the things we want to do. In the longer term, however, they can make it harder for us to live the lives we want to live, or, even worse, undermine fundamental capacities such as reflection and self-regulation, making it harder, in the words of philosopher Harry Frankfurt, to “want what we want to want.” Seen in this light, these new attentional adversaries threaten not only the success but even the integrity of the human will, at both individual and collective levels.
Want to help? You can spread good words by sharing this post with a friend or support the work by subscribing (there's a free option). It's also possible to buy me a beer.