Amusing Ourselves to Irrelevance?
On Algorithms
(4 minute read)
By now it should come as no surprise that the device you are using to read this newsletter is intentionally addictive. It has been designed and continually updated to be more effective at keeping your eyes on its screen.
The past year deepened our fatigue with our technological devices, likely because we were all but forced to make them an even more central part of our working and personal lives while being deprived of a slew of in-person experiences.
When the balance of life tips more heavily toward online vs. off, it is perhaps easier to notice the extent to which algorithms play a role in our daily lives, if only to realize that even the most informed of us can’t comprehend all that our devices know about us.
In recent months, I’ve observed friends and colleagues reaching for “Orwellian” to describe the direction our society is heading, and while it’s true that aspects of social policing in the name of ‘wokeness’ can evoke 1984, I’ve been thinking more about Brave New World and Amusing Ourselves to Death.
If you took a college-level journalism or communications class after the mid-eighties, chances are you read at least parts of Neil Postman’s prescient book about what awaits us when every aspect of society is driven by entertainment. Though Postman zeroes in on television as the source of our destruction, the foreword to Amusing Ourselves to Death proved a powerful and timely re-read last week:
“We were keeping our eye on 1984. When the year came and the prophecy didn't, thoughtful Americans sang softly in praise of themselves. The roots of liberal democracy had held. Wherever else the terror had happened, we, at least, had not been visited by Orwellian nightmares.
But we had forgotten that alongside Orwell's dark vision, there was another, slightly older, slightly less well known, equally chilling: Aldous Huxley's Brave New World. Contrary to common belief even among the educated, Huxley and Orwell did not prophesy the same thing. Orwell warns that we will be overcome by an externally imposed oppression. But in Huxley's vision, no Big Brother is required to deprive people of their autonomy, maturity and history. As he saw it, people will come to love their oppression, to adore the technologies that undo their capacities to think.
What Orwell feared were those who would ban books. What Huxley feared was that there would be no reason to ban a book, for there would be no one who wanted to read one. Orwell feared those who would deprive us of information. Huxley feared those who would give us so much that we would be reduced to passivity and egoism. Orwell feared that the truth would be concealed from us. Huxley feared the truth would be drowned in a sea of irrelevance. Orwell feared we would become a captive culture. Huxley feared we would become a trivial culture, preoccupied with some equivalent of the feelies, the orgy porgy, and the centrifugal bumble puppy.
As Huxley remarked in Brave New World Revisited, the civil libertarians and rationalists who are ever on the alert to oppose tyranny "failed to take into account man's almost infinite appetite for distractions." In 1984, Huxley added, people are controlled by inflicting pain. In Brave New World, they are controlled by inflicting pleasure. In short, Orwell feared that what we hate will ruin us. Huxley feared that what we love will ruin us.
This book is about the possibility that Huxley, not Orwell, was right.”
- Neil Postman, Amusing Ourselves to Death (1985)
We can only imagine what Postman might say about Facebook, Google, Instagram, and TikTok, but in 21 Lessons for the 21st Century, Professor Yuval Noah Harari discusses algorithms and their possible impact on humanity with a lens that, in many ways, echoes Postman.
“Even in allegedly free societies, algorithms might gain authority because we will learn from experience to trust them on more and more issues, and we will gradually lose our ability to make decisions for ourselves. Just think of the way that within a mere two decades, billions of people have come to entrust the Google search algorithm with one of the most important tasks of all: searching for relevant and trustworthy information. We no longer search for information. Instead, we google. And as we increasingly rely on Google for answers, so our ability to search for information by ourselves diminishes.”
- Yuval Noah Harari, 21 Lessons for the 21st Century (2019)
I’ve begun to question the value of algorithms for consumers. Conceptually, of course, some version of an algorithm is needed for something as functional as a search engine to work, and yet, does it have to be so tailored to my needs as opposed to anonymously aggregating the data of everyone who has ever used it? Is the more seamless shopping experience I get from sharing my data worth everything I might be giving up in return?
Meanwhile, those of us who have ever marketed or sold a product online have most likely found ourselves on the other side of this equation: using data and algorithms to target potential customers more effectively. But as time passes, this game only seems to get more expensive and frustrating for advertisers and business owners, particularly those striving to be nimble and cost-effective. Ultimately, companies that have grown accustomed to using these tools to generate business might face a breaking point before long:
“The data giants probably aim higher than any previous attention merchant. Their true business isn’t to sell advertisements at all. Rather, by capturing our attention, they manage to accumulate immense amounts of data about us, which is worth more than any advertising revenue. We aren’t their customers — we are their product. In the medium term, this data hoard opens a path to a radically different business model whose first victim will be the advertising industry itself.”
- Yuval Noah Harari, 21 Lessons for the 21st Century (2019)
In a world where people lead ever more lonely lives on an ever more connected planet, we’ve developed a habit of putting convenience above all. We’re hampered by short-term thinking that is almost never comprehensive. We celebrate how much smarter we’ve programmed our machines to be — so smart they can learn on their own — without realizing that we’ve grown stupider, or at least lazier.
“We could end up with downgraded humans using upgraded computers.”
To overestimate the ability of algorithms to make the right decisions for us is to dismiss the value of consciousness in the human experience. Yes, data is making it easier every day for our devices to ‘know us better’, and yet, perhaps we would be best served by investing in understanding ourselves in ways that AI cannot (at least for now).
If we’re continually fed more of that which we liked before, what does it mean for our capacity to change our minds or evolve? Is the best human experience that with the least amount of friction? If we eliminate that which is disappointing, can we ever know true satisfaction?
It’s impossible to ask these questions without getting existential. For thousands of years, humans have pondered what the ‘point’ of human life is, and though the answer largely remains a mystery (as far as I am aware), if our time on Earth is increasingly detached from consciousness and decisions both big and small are determined by algorithms, is there really any point at all?
We might not fall prey to the exploitation that Orwell feared, but instead to something more disturbing — irrelevance.
— A
Scott Galloway discussed Covid bailouts, NFTs, and income inequality last week on New York Magazine’s Intelligencer blog. Galloway seems to be everywhere these days, and while this piece echoes ideas you’ve likely heard from him elsewhere, it’s still worth reading. He’s particularly effective at outlining how, in our current economy, we protect the shareholder class at seemingly any cost, which means fewer economic opportunities for young people.
One place, though, where I think Galloway misses the complete picture is his take on how rich people’s needs impacted the national Covid response. While he hypothesizes that rich people would care a lot more if ‘this had cut their wealth in half instead of doubled their wealth’, I imagine our response to the pandemic might have looked quite different if the rich were as susceptible to bad health outcomes from the virus as everyone else. Yes, a lot of it is about money, but it’s not just about money. Health is wealth.
Last week, the CDC announced that “The vast majority—78%—of U.S. patients hospitalized with COVID-19 were overweight or had obesity. The numbers for intensive care, invasive mechanical ventilation and death were nearly the same.”
And yet, we don’t like to talk about obesity because it’s uncomfortable, it’s taboo. But what we’re also not talking about is how, in the U.S., a lower socioeconomic status often translates to a sentence of excess weight, of diabetes, of heart disease. And we continue to do almost nothing in the way of preventative care.
On a more fun note, Hype Williams shot a lookbook for Jay-Z’s new cannabis line, and until you look closer, the images look like they came straight out of Slim Aarons’ camera. I’m amazed at how Aarons’ images never go out of style, no matter how played out they seem. I’d be happy to be at this pool party.
Have a great week!