Fifteen Minutes of Shame
How Mark Zuckerberg Created The World's First Algorithmic Cyberbullying Machine
I want you to imagine that you just got married. Your wedding was the happiest day of your life, with those nearest to you gathered to celebrate your union with the person you’ve chosen to spend the rest of your days with. Rings were exchanged, cake was cut, and in your first dance, you wrapped yourself in your partner’s arms and swayed to Presley’s “Can't Help Falling in Love.”
And then I want you to imagine that someone you don’t even know managed to get ahold of your wedding photographs, and, for reasons you’ll perhaps never know, that person thinks they’re hilarious. So funny, in fact, they have to share them with other people. Each person who sees them tries to one-up the last, making jokes about the way you looked, or your choice of spouse, or the decor in the room.
If such a person existed, you’d rightly call them a bastard. But here’s the thing: that person exists, and he has a name, and that name is Mark Zuckerberg.
In my last newsletter, I talked about the power of the algorithms that now dominate the products we use, and how they’ve systematically stripped us of agency. I spent a lot of time in the piece — perhaps too much — talking about Facebook’s News Feed, and how rather than show us posts from our friends and family, it instead suppresses them in favor of randomly-selected content that it believes will increase “engagement,” and thus, the amount of time we spend on the site.
I didn’t, however, spend that much time talking about what that content is, in part because I wanted to talk about something that’s been bothering me for a long time.
Have you ever seen something pop up on your Facebook News Feed that made you ask “why the fuck am I seeing this?” Not just because it was random, but because it was a kind of highly-specific type of random that would only really matter to a handful of people.
It might be a McDonald’s franchise in another state — or another country — recognizing its employee of the month, complete with a picture of the worker being handed a certificate and a gift card by their manager. Someone who works at a car dealership hit a big target. A charity event, or a eulogy, or a birthday party.
And then you look closer and realize that the post has thousands of comments from people all over the world, and they’re making fun of the people in the picture, or making lewd jokes about their appearance, writing the kind of things that, in most workplaces, would earn you a one-way ticket to HR.
The reason why I started with the hypothetical example of someone’s wedding was because it wasn’t, in fact, hypothetical, but rather the thing that broke my brain and prompted me to write this newsletter.
For whatever reason, Facebook’s News Feed decided to show me a post from a registry office in Yorkshire celebrating the nuptials of a young couple. The comments underneath were, to put it mildly, incongruent with the happy nature of the occasion.
Sidenote: For the Americans reading this — and most of the people who’ve subscribed to this newsletter are from the US — a registry office is, quite simply, a government-run building where you can get married. They provide the equivalent of a courthouse wedding, and are a secular alternative to a house of worship.
They’re also — and this is pertinent to the story — pretty cheap. While you can, under UK law, get married in other non-religious locations, like hotels and football stadiums (unless, that is, your spouse-to-be isn’t from the UK, as was the case with me and my American wife), you generally have to pay for the privilege. A registry office wedding, meanwhile, can cost as little as £60, or $80.
The registry office was in a former mining town — the kind that never really recovered from the Thatcher years, and where the only industries left are charity shops, vape shops, and FOBTs. The kind of place that we once described as the “red wall,” because “red wall” sounds a lot kinder than “poor and neglected.”
The couple, who were probably in their mid-20s, looked ordinary enough. They weren’t supermodels — but, then again, most people aren’t. The groom was wearing a waistcoat over a white collared shirt, and the bride sported an ordinary red dress. Not exactly Hugo Boss and Vera Wang, but nothing strange, either.
In essence, they were just a normal working class couple getting married, with their basic wedding likely reflecting their limited economic means. I don’t know how else I can convey the fact that there was nothing inherently weird about this wedding, other than the fact that I was now aware of its existence.
I don’t pretend to understand the machinations that led to that post being seen by me, or, indeed, by the thousands of other people who left cry-laughing emojis, or jibes about “council estate weddings,” or the couple’s pale complexion. I don’t know how Facebook identified the early green shoots of viral potential, and then shot the post out to News Feeds around the country — and perhaps the world — so that people could spend more time on the site.
And, truthfully, I don’t need to. What matters is that it’s wrong, and it’s hard to imagine that Facebook — with its billions of dollars and its tens of thousands of employees — is unaware of how it’s created the world’s first algorithmic game of public shaming Russian roulette, where anyone who uses the site is at risk of becoming the subject of derision through no fault of their own.
It’s the worst kind of random, algorithmically-driven cruelty, where Facebook serves up victims to an audience of billions, who are then “rewarded” with likes and reactions and congratulatory comments, and the jolt of dopamine that they bring.
The irony that such a system was created by Zuckerberg — a man who, despite now adorning himself with comically-oversized shades, gold chains, and a haircut that would make McLovin cringe, still resembles the type of person that spent half his time at high school in AP classes, and the rest with his head down a toilet — isn’t lost on me. This man is a dweeb, and a bully, and he always has been. And like any bully, he doesn’t have the courage of his convictions to stand by what he does or says.
Mark Zuckerberg is a hypocrite and a coward and a liar who, at no point, has taken responsibility for his company’s horrific record. In 2012, right before the company debuted on the public markets, he wrote about how Facebook was “not originally created to be a company” but to “accomplish a social mission,” namely to “make the world more open and connected.” In 2015, after the company passed its one billion daily user milestone, he wrote about how Facebook “stands for giving every person a voice, for promoting understanding and for including everyone in the opportunities of our modern world.”
Sidenote: You may, perhaps rightfully, accuse me of hypocrisy by decrying the acerbities thrown at random strangers on Facebook in one breath, and then insulting Zuckerberg’s appearance in the next. I don’t really have an intellectually rigorous response to that, except: fuck it, he’s a c… can’t say that word, and he deserves it, and more.
All lovely stuff. And totally, I might add, incongruent with the fact that Facebook was arguably the tinder that ignited the Rohingya genocide in Myanmar, with its algorithms turbocharging the spread of hate speech on the platform and radicalizing Burmese society. I’m not sure how lying about video being the future of the platform — which, in turn, killed a bunch of promising publications — and amplifying mendacious far-right dipshits like Dan Bongino and Ben Shapiro is “giving people a voice,” unless you consider being horrible a prerequisite for personhood. I’m curious to learn how hiding research that shows Instagram is catastrophic for teen mental health fits into the company’s “social mission.”
The reason why I called Zuckerberg a coward — other than the fact that it’s true — is because in every single one of those instances (save for Myanmar, where Facebook admitted that it had helped “incite offline violence”), Facebook tried to minimize its responsibility, to deflect, or to obfuscate.
I think that’s why, as I discussed last week, so much of Facebook is now algorithmically-driven. Sure, the fact that Facebook is hyper-optimized for growth — or, rather, to maximize engagement — is a factor. But it’s also true that, by using AI wherever possible, Facebook can shift whatever blame comes its way — not onto another human, but onto an artificial neural network.
The News Feed is the perfect example of that. The algorithm provides distance for Facebook, with the decision about what content to surface — who to shame — being made by a machine, not a person. And the insults? The sexually-aggressive comments made about absolute strangers? They aren’t coming from Facebook, but rather from the users. Any culpability is, therefore, transferred away from Facebook as an institution — rotten though it is — and onto misfiring algorithms and its users.
Facebook has created a system that panders to our cruelest impulses — our desire to tear down absolute strangers, and to one-up everyone else by crafting the cruelest insults — a system that’s entirely automated, and that only surfaces things people will respond to, whether that response is positive or just plainly cruel. In doing so, it has given itself cover, and it can absolve itself of the consequences of its actions.
Unless, of course, we call Facebook out on this.
“Unpaid public shaming interns for Facebook”
I’m a massive fan of Jon Ronson’s work, and in particular, his 2015 book, “So You’ve Been Publicly Shamed.” Despite being almost a decade old, and many of the examples cited coming from the nascent days of Twitter and Reddit, it hasn’t aged a day, in part because it addresses the eternal human desire to inflict public humiliation on others.
Ronson — who, in addition to being an exceptional writer, is also a wonderful, inquisitive, gentle human being, as I can attest from having met him several times in person — charts a path that weaves from early Colonial America, and the punishments meted out against heretics and societal transgressors, to populist judges in Texas who used humiliation in lieu of incarceration, to modern-day Internet pile-ons.
Sidenote: When I say Ronson is a wonderful human being, I mean it. In 2019, Lyra McKee, one of my closest friends (and the groomswoman at my wedding) was murdered in Derry by the New IRA. Lyra also shared my love of Ronson’s work, and the year before her death, we’d met up in Belfast to watch him deliver a talk about his book, The Psychopath Test.
Obviously, Lyra’s death impacted me in ways that I’m still, over six years later, coming to terms with. It was only one week later, at the funeral, when I properly started to grieve. That’s because, as a journalist, I realized that her death would inevitably attract a bunch of media attention, and her family would be inundated with calls and messages from reporters trying to get interviews.
In my head, my way of honoring Lyra’s memory would be to shoulder as much of that burden as possible, and to give as much space as possible to Lyra’s partner, and her family.
And so, I made a point of doing every interview that came my way, and of speaking to every journalist who dropped me a DM or sent me an email. I spoke to the majority of the British and Irish print newspapers, wrote a eulogy for The Telegraph, went on TV in Germany and the UK, and even did an interview with the CBC’s Carol Off for As It Happens. In that interview, while talking about who Lyra was as a person — she was one of my best friends, and an incredibly tenacious journalist, and I didn’t want her to be remembered by the brutal last moments of her life — I made a throwaway comment about how we both loved Ronson’s work.
I don’t know how, but Ronson somehow found that comment, found my Twitter profile, and then dropped me a DM to pass on his condolences. He didn’t have to, but he did. I’ve met him two times since then, and he’s always remembered who I am, and who Lyra was.
He’s just a good fucking human being, and I’ll always feel grateful to him for the kindness he shared with me during that incredibly difficult time.
One of the subjects of the book is Justine Sacco, the publicist who, infamously, tweeted “Going to Africa. Hope I don’t get AIDS. Just kidding. I’m white!” right before her flight to Cape Town departed from London Heathrow.
Sacco, with just a hundred or so Twitter followers, wasn’t a public figure. It was Sam Biddle, a writer for Gawker’s Valleywag publication, who would bring the tweet to a larger audience, and set in motion the chain of events that would make her the most hated woman on the Internet. While she coursed through the air, fast asleep in the comfort of her first-class lie-flat seat, she unknowingly became Twitter’s biggest trending topic. Thousands tracked her flight, waiting for the moment she landed and realized that her life as she knew it was over.
And it was. She was fired by her employer, IAC. Her name was mentioned in over 100,000 tweets. Justine was front-page news all over the world, appearing on CNN, the BBC, and more. Ironically, it was the kind of coverage that, normally, any publicist would die for — except, perhaps, not in these circumstances.
Ronson would later interview Sacco for his book, where she would say that her tweet wasn’t her flaunting her privilege, but rather mocking it. It was, she claimed, a case of someone making a joke that cut a little too close to the bone, albeit one that, divorced from any context, looked really, really bad.
Whether you believe that or not, Sacco’s motivations are, arguably, less interesting than the role that the various tech platforms — primarily Twitter, but also Google — played in elevating a previously unknown woman into our collective consciousness, and, consequently, into public infamy, and how these platforms benefited financially from her destruction.
Ronson, speaking about Sacco’s story at TED 2015, said:
“A lot of companies were making good money that night. You know, Justine’s name was normally Googled 40 times a month. That month, between December the 20th and the end of December, her name was Googled 1,220,000 times. One internet economist told me that meant that Google made somewhere between 120,000 dollars and 468,000 dollars from Justine’s annihilation, whereas those of us doing the actual shaming — we got nothing. We were like unpaid shaming interns for Google.”
The really weird thing — the truly disturbing thing — is how, even if unspoken, public shaming has become integral to the business models of these companies.
Facebook, Twitter, and YouTube all benefit whenever a new lolcow emerges from their paddock, or whenever there’s a new bastard du jour, or just another person we don’t like for whatever reason, or just someone who looks poor, or strange, or has a low-paid and low-status job, and thus is someone whom we have license to mock.
I’ve wondered how much money Facebook made from that young newlywed couple in Yorkshire. I wish I’d had the foresight to screenshot the post, if only so that I could make a vaguely-accurate guesstimate of how many people saw it, and thus, how much money Facebook made.
Let’s assume, conservatively, that there were about 400 reactions and another 100 comments — I’m drawing from memory, so this may be inaccurate. And let’s assume that only two percent of the people who saw the post interacted with it at all (and, honestly, I have no idea whether this figure is accurate). That gives us 25,000 views. Assuming a CPM of $10, which is probably on the low side of things, Facebook made $250 from that one post alone.
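For what it’s worth, here’s that back-of-envelope math written out as a quick sketch in Python, rather than anything rigorous. Every input is an assumption — the counts are from memory, and the interaction rate and CPM are guesses — so the output is illustrative, nothing more.

```python
# Back-of-envelope estimate of what a single viral post might earn Facebook.
# Every input is an assumption: the interaction counts are from memory, and the
# two-percent interaction rate and $10 CPM are guesses, not figures from Meta.

reactions = 400          # roughly remembered
comments = 100           # roughly remembered
interactions = reactions + comments                  # 500 total interactions

interaction_rate = 0.02                              # assume 2% of viewers interacted
estimated_views = interactions / interaction_rate    # 500 / 0.02 = 25,000 views

cpm_usd = 10.0                                       # assumed revenue per 1,000 impressions
estimated_revenue = (estimated_views / 1_000) * cpm_usd

print(f"Estimated views: {estimated_views:,.0f}")        # 25,000
print(f"Estimated revenue: ${estimated_revenue:,.2f}")   # $250.00
```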
Scale that across thousands — or millions — of unwitting victims, and you have a nice little revenue stream.
One thing that Ronson didn’t quite anticipate in his book was how the focus of these public shamings would shift, going from people who had — at least, arguably — transgressed, to those who have done nothing wrong, except to look a bit funny, or to have an unconventional appearance, or to be poor.
And how could he? At that time, Facebook’s News Feed algorithm was still in its infancy, and was nowhere near as aggressive as it is today. Facebook was still growing, and so it didn’t need to depend on amoral growth-hacking techniques to artificially inflate the activity of a dwindling user base.
TikTok — which is one big random content recommendation machine — wasn’t a thing. Twitter was still, for the most part, chronological. And while there were cases of people who had, through sheer misfortune, become unwilling internet celebrities (for lack of a better term), they were the exception and not the norm.
Let me give you an example: Ghyslain Raza, better known as The Star Wars Kid. In 2002, Raza filmed himself mimicking the actions of Darth Maul with a golf ball retriever at his high school’s film studio. A classmate found it and shared it with a few friends, who shared it with their friends, who posted it to Kazaa, causing it to spread further to a bunch of small blogs and forums.
The reason why, more than two decades later, we remember the Star Wars Kid isn’t because he was exceptionally strange, or because the video was exceptionally funny, but because he was a rare example of internet virality, and because his fame came at a time when there wasn’t the infrastructure to mass-produce other Star Wars Kids.
The Star Wars Kid was an example of artisan public shaming — his infamy could only have happened through a chain of deliberate human interventions. Someone would have had to copy the video to a thumb drive, pass it to their friends, upload it to a P2P service, and so on.
Today’s online humiliation victims are mass-produced in the equivalent of vast algorithmic factories that are ultimately stripped of human cognition, and conscience, and deliberate action, and are created purely for the benefit of nameless, faceless, and morally-void companies like Facebook.
And while the effects are largely the same — although not on the same scale as the Star Wars Kid’s public evisceration — I’d argue that the presence of a commercial incentive, and the fact that Facebook arguably knows what it’s doing, makes it feel so much worse.
Raza’s classmates had no idea what chain of events they’d started. They weren’t optimizing for virality, using thousands of data points and complex AI models, nor were they benefitting financially from the video. They were just dumb, thoughtless, idiot kids, acting like dumb, thoughtless, idiot kids do.
Sidenote: The other big difference between Raza’s classmates and Facebook is that only one faced any real consequences. Raza’s parents filed suit against the families of his tormentors, seeking CA$250,000 for the anguish he suffered — a suit that was later settled out of court.
I’ve yet to hear of anyone taking action against Facebook for exposing them to public humiliation through the News Feed’s suggested content feature, however.
That’s the thing. We all have, within ourselves, the capacity for good and the capacity for cruelty. And while we may tend to skew towards one of those poles, no person is exclusively one or the other.
We’re all human. We all fuck up. Sometimes, we fuck up without really understanding why, or that we’re fucking up in the first place.
The problem with Facebook — and with TikTok, and with YouTube, and with every other platform that uses “recommendations” to push content towards its users — is that it makes it far too easy to be cruel, and that it rewards cruelty. Rather than pander to our better angels, it does the opposite — because cruelty is good for business.
These platforms don’t distinguish between “bad engagement” and “good engagement.” Engagement is engagement, even if it’s a bunch of greasy strangers ogling the breasts of an 18-year-old fast food worker who just made employee of the month.
And the reason why they don’t make that distinction is because they can’t.
Just like an LLM — like those powering ChatGPT — doesn’t understand the underlying meaning of the words it spits out, Facebook’s algorithm can’t distinguish between elevating a person and bringing a person to their knees. It only tracks whether you linger on a post, whether you click through to read the comments, whether you share or like, or otherwise react. The underlying context is meaningless. It’s just telemetry. It’s just math.
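To make that concrete, here’s a deliberately crude, hypothetical sketch of what “just telemetry” looks like. This is not Facebook’s actual ranking code — nobody outside the company gets to see that — but any engagement score built purely from behavioral signals shares the same blind spot: a mocking pile-on and a genuinely warm comment thread produce identical numbers.

```python
# A deliberately crude, hypothetical engagement score. This is NOT Facebook's
# real ranking model; it exists only to illustrate the point that a score built
# purely from behavioral signals has no notion of kindness or cruelty.

def engagement_score(dwell_seconds: float, clicked_comments: bool,
                     reactions: int, shares: int, comments: int) -> float:
    score = dwell_seconds * 0.1                 # lingering on the post
    score += 2.0 if clicked_comments else 0.0   # clicking through to the comments
    score += reactions * 1.0                    # a laugh-react scores the same as a love
    score += shares * 3.0
    score += comments * 2.0                     # a cruel joke scores the same as congratulations
    return score

# A wedding photo being mocked and one being celebrated look identical to the math.
mocked = engagement_score(45.0, True, reactions=400, shares=60, comments=100)
celebrated = engagement_score(45.0, True, reactions=400, shares=60, comments=100)
assert mocked == celebrated
```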
Bad For Us
I called this newsletter What We Lost because I feel that the technology industry’s current trajectory has, in meaningful ways, stripped us of stuff that matters. That can be in small ways — like whether we actually control or understand the products we use — or in big ways, like the entire bottom half of the career ladder.
I’d argue that another item to add to that list is the sense of safety online. Or, phrased in a more specific way — and one that doesn’t sound like I’m angling for a job at Ofcom — I’d say that the increasingly recommendations-driven nature of social media means that we’re all at risk of becoming unwilling online celebrities, “enjoying” our own fifteen minutes of shame.
The Internet has always been a rough, raucous space, and I’m not denying that. But, at the very least, the 2011 version of Facebook wouldn’t push your wedding photos out to thousands of strangers to be laughed at.
When we participate in these pile-ons, we’re not showing the world how funny we are, or how quick-witted we are. Rather, we’re doing these massive tech giants a favor, padding their bottom line at the expense of an absolute stranger. And we win by refusing to play, and by listening to our better angels.
Over the past few weeks, Facebook has shown me several posts from a page that shows retail employees — often cart-pushers at supermarkets — wearing tactical-style clothing at work. And yes, while the idea of someone wearing a Walmart nametag on a full military vest is kinda funny, we need to take a step back and ask: “why do we care?”
Seriously, why the fuck should we spend any time thinking about what someone — a person likely earning minimum wage, or near-enough — wears to their thankless job? Why aren’t we questioning why someone thinks it’s appropriate to slyly take a picture of said retail worker and then post it to an online audience of thousands, which then becomes an audience of tens of thousands — or more — when the algorithm does its job?
That’s creepy, right?
I know this makes me sound like a killjoy, but I honestly don’t care. This type of low-effort, zero-thought content is just as intellectually offensive as Shrimp Jesus — and, in fact, it’s worse, as it comes at the expense of another human being who, even considering their questionable sartorial choices, has done nothing wrong.
But, just as I feel bad for the people who fall victim to Facebook’s random drive-by cruelty (as well as that of other platforms, like TikTok and Twitter), I also feel sympathy for those who are writing the abuse.
Sidenote: Remember how I called Mark Zuckerberg a coward earlier? Watch how I’m heroically leaving the most contentious point right to the very end of this post which is, as I write this sentence, already over 4,000 words long and not even finished.
Glass houses, and all that.
That said, if you get mad at me, I’ll know you read to the very end, and I’ll know you’re actually a fan. Check. And. Mate.
I bet the majority of the people who leave the comments I described earlier think of themselves as good people. Decent people. And I’m sure that they are! I believe that these people are, in a way, manipulated into behaving in the way they do.
While Facebook requires you to use your real name on the platform, the fact that content is innately ephemeral — appearing on your News Feed one minute, then disappearing the next — provides a level of… well, not quite anonymity, but perhaps the next best thing. It empowers people to act appallingly, because they know that, as atrocious as their comments are, they’ll drop off the reader’s page as soon as they refresh their browser window.
Combine that with the little jolts of dopamine that you get whenever someone likes your comment, or gives you a laugh react, or compliments your joke, and you can sort-of see how you’d end up writing hateful things about a newlywed couple you’ve never met before, and likely never will meet.
I’ve called this newsletter What We Lost for a reason — and I’d argue that tech is incredibly effective at stripping away our moral compass, and our sense of decency, and perhaps even our humanity.
We need to question why these platforms are showing us the things we see. We need to ask why, and what this is doing to us.
More importantly, we need to find another word to describe this sluice of inanity and hate that isn’t the banal “recommendations,” or “suggested content,” because they’re based on a lie.
These “recommendations” aren’t driven by any sense of utility for the end user, or by our interests or passions, or by anything to do with us as people, but by how they can control us — how they can keep us scrolling and typing and clicking, and keep us watching ad after ad after ad.
Because that’s what this is all about. Control. And, as I pointed out in my last newsletter, the tech we use is designed to strip as much control as possible from the end user. The only way we can win — the only way to fight back — is to not engage. That’s the only weapon at our disposal.
But that’s the only weapon we need.
Footnote
Thanks for making it this far. A few things to note before I wrap up:
If you’ve got any thoughts, or want to tell me how wrong I am about something, or have an idea, or just want to chat, feel free to drop me an email. My address is me@matthewhughes.co.uk. You can also send me a death threat on Bluesky.
Two weeks after launching this newsletter, I’ve now got over 300 free subscribers. That is… insane. Thank you. Thank you. Thank you. Seriously, thank you.
Actually, indulge me for a second. I’m going to say more about that. So, as a reporter, I was used to people leaving comments about the stories I wrote, whether to talk about the subject, or to say how shit I am, or how I’m “an bias,” or a shill for <insert tech company here>. Very rarely did I get people saying nice things about me, or my work.
Since starting this Substack, I’ve had more people say nice things about my writing than… well. Perhaps in my entire career? It’s strange. Lovely, though I’m very unused to it.
As embarrassing as this sounds, I’ve always felt pretty shit about my abilities. I’ve always lacked confidence. My self-esteem is… well. It’s pretty much non-existent. And as cringe as this might sound, this is actually the first time I’ve ever felt proud — truly, truly proud — of something I’ve written. And so, I want to just say a massive thank you to everyone who has left a comment, or shared my stuff, or reacted.

Also, last weekend brought my fifth paid subscriber, which is also insane considering that I don’t have any premium content — nor any immediate plans to launch any — and that this publication is only a couple of weeks old. Two of those subs came before I even turned on monetization.
I’m incredibly grateful. Honestly, it just makes me feel… nice. And if it sounds like I’m lacking the words to describe how I feel, it’s because I am.

Right, I’ll shut up now. See you next week.
"More importantly, we need to find another word to describe this sluice of inanity and hate that isn’t the banal “recommendations,” or “suggested content,” because they’re based on a lie." Strongly agree with this point - most of the time when I "recommend" something it's because I think it's good or that someone will like it, not because it (directly or indirectly) makes me money. Technically (as I understand it) Facebook doesn't make money directly off of you seeing/interacting with a post like the wedding picture one discussed here unless it was sponsored onto your feed, but they absolutely do make money off of it indirectly by that reaction prompting you to continue scrolling your feed and seeing more ads otherwise. In another sense, though, you can pretty much think of non-sponsored algorithmically chosen content (like the post in question) as identical to a sponsored post, as just as much of an advertisement - it's just an ad that Facebook is paying for themselves. Thinking of this article in those terms really drove home exactly how depraved this behavior by Facebook is. If you could somehow make a standard video ad out of the wedding post and comments, no reasonable company would pay to air it on TV, but that's basically equivalent to what Facebook did.
I can't remember how I came to be here, but I'm glad I am.