I recently rewatched one of my favourite television series, Halt and Catch Fire, and had a thought I really wanted to share. So, allow me to take you along on a mental exercise: how would the world in 2050 remember the technological boom that took place between 2010 and 2025?
If a showrunner in 2050 were to pitch a prestige drama about the period that gave birth to social media and generative AI, I doubt the tone would be anything like Halt and Catch Fire's. There will be no romanticizing of scruffy visionaries in hoodies coding the future in garages, driven by the pure energy of innovation. Instead, a show looking back at this era will probably be more of a moment-by-moment black-box examination of a plane crash.
Except the scale of the destruction would be quite a bit larger. Not to sound too grandiose, but I think this period has cost our species the foundations of what makes us human: our ability to communicate, to learn, to think critically, and to create. I fear the show would view it through the lens of HBO's Chernobyl: a gripping, horrifying autopsy of a system that prioritized output over safety.
Believe it or not, there is graphite on the roof
In Chernobyl, the horror comes from the incompetence of authority figures who refuse to believe a reactor can explode. In our case, the reactor is Big Tech, and the operators are… well, you know who they are. We will look back at the “Move Fast and Break Things” philosophy not as a plucky manifesto of disruption, but as a confession of criminal negligence. The “things” they broke were not merely the taxi and hotel business models. Perhaps unknowingly, but eagerly nevertheless, they sacrificed our democratic process, our ability to distinguish truth, and our mental health at the altar of “hockey stick growth”.
The tragedy of this process wasn’t that the technology was evil. It was that “innovation” became a synonym for “extraction.” Driven by the insatiable greed of VCs and protected by legislators who were either too senile to understand the difference between a JPEG and a PDF, or too corrupt to care, the tech sector stopped building tools for humans and started building traps. And we gladly fell into them. We got questionable “value” for free, and ended up paying dearly (and mental healthcare doesn’t come cheap).
So, when future generations look upon our disruptive innovations, they will see some pretty horrific shit:
- The Infinite Scroll should be studied the way we look back at the use of lead in Roman aqueducts. A design choice so toxic, yet so ubiquitous, that it slowly poisoned the population while functioning exactly as intended.
- The Like Button will be viewed as primitive gamification of social validation – a psychological Skinner box that traded dopamine for data.
- The Influencer Economy will be remembered as the weaponized form of social media in service of capitalist greed, one that exploited our society’s already eroded media literacy and self-esteem. And the fact that its main targets happen to be children makes it all the more disgusting.
- The Cycle of Ragebait is perhaps the worst of all. We fed the most vitriolic parts of our nature into the machine and built systems where polarizing statements were rewarded regardless of their truthfulness.
Each one of these deserves its own examination, and I am hereby adding them to my list of things to cover on this blog. Once ready, the above will be made clickable links for you to enjoy a trip down the rabbit hole with me.
And we haven’t even started talking about AI
Holy shit, if things were bad before, they sure are doomed now. Take a population made vulnerable by social media, who no longer interact with real humans but with their highly curated digital avatars, and then take away even the last vestiges of authenticity, only to replace them with the regurgitated slop of a large language model.
And once again, this is a textbook case of a truly fascinating technology’s complete and utter bastardization. Machine learning at its core could be the key to unlocking invaluable speed and efficiency in medical research, climate prediction (and perhaps even the reversal of climate change), manufacturing, error prevention, etc. Instead, it’s at best been utilized to cut costs in processes that we didn’t necessarily need in the first place – like creating cheap visual and text content for advertising – and at worst to create falsehoods that an untrained eye can mistake for truth, thus further eroding our concept of reality.
Going back to Halt and Catch Fire and Chernobyl as my storytelling crutch: on one side, I can picture the elation when the engineers hit that “deploy” button and made the machine speak for itself for the first time. It talks in other people’s words, creates images and music mashed up from the stolen labor of artists (who, by the way, are already not the richest members of our society), but still, fundamentally the code works and the ticket is moved to “release”. Huzzah!
On the other hand, I am reminded of the detailed examination of how the different participants in the Soviet nuclear disaster each contributed to making a terrible thing monumentally worse. I watched as ignorance, negligence, egoism, and political zeal combined to inflict a wound on our planet that will not heal for millennia.
With these two thoughts pitted against each other, I suddenly ask myself:
Will we even be here in 25 years?
Too morose? Perhaps. But the point I am trying to make is that something needs doing. Simply accepting things as they are would just send us careening into a spiral of irreversible damage to our environment, socio-economic structures, and psychological (and perhaps even physical) wellbeing.
So, my answer is both straightforward and impossibly challenging. Regulate the living daylights out of Big Tech. Put extremely strict guardrails on the applications of LLMs. And most importantly, stop legislating after the fact, so that the next Chernobyl-scale disaster never even takes place. Elect people who are knowledgeable enough, or who at least surround themselves with advisors whose main priority is to inform rather than to line their own pockets with lobby money.
Easy, right?
-Bo