
Maverick News


‘Careless People’ — the tell-all book that Facebook tried to ban

Sarah Wynn-Williams. (Photo: Sarah Wynn-Williams / Bookseller)
We live in a world where tech bros wield extraordinary power, but one of the consoling fantasies is that they are, at least, clever. A new Facebook tell-all dispels this illusion.

Only a few pages into “Careless People: A story of where I used to work” by Sarah Wynn-Williams, you’ll understand why Facebook — now Meta — tried so hard to block the publication of this memoir by its former employee.

Wynn-Williams is no jaded intern. She was Facebook’s global public policy manager from 2011 to 2017. A former diplomat from New Zealand, she didn’t arrive by way of the usual Harvard-to-Menlo-Park conveyor belt. She had a sharp eye for Facebook’s political potential long before most, and she convinced the top brass to let her in.

From there, she became a high-level operator, one of the rare insiders with a front-row seat to both the internal chaos and the global reach of Mark Zuckerberg’s empire.

Wynn-Williams has been touted as the most high-ranking Facebook whistle-blower to date. She is also, as the ending of the book makes clear, an aggrieved former employee — having been fired, in her telling, for accusing her boss of harassment. Facebook would have you believe this renders her account fundamentally unreliable.

Perhaps, but her insider’s vantage point also means that Wynn-Williams comes with documents. Lots of them. Internal emails, memos, messages: some written by Zuckerberg and COO Sheryl Sandberg. It’s this documentary weight that gives her gossipy, fast-paced narrative its punch.

Facebook is extremely unlike other workplaces, Wynn-Williams quickly learns.

“I learn soon enough that I have no reference points for the obscene wealth that flows through Facebook. What makes it so strange is that it’s based on tenure, rather than title. So assistants and junior staff are often worth vastly more than their bosses,” she writes.

This compensation comes at a considerable personal cost: employees are expected to answer emails at any point between 5am and 1am the following morning. For Wynn-Williams’ team, the work piles up because, as Facebook’s influence grows, the political complexities mount. Crisis after crisis hits: governments demand content take-downs, and debates flare over what constitutes permissible speech.

Facebook, Wynn-Williams writes, isn’t just indifferent to ideology; it is hostile to it: “There is no grand ideology here. No theory about what Facebook should be in the world. The company is just responding to stuff as it happens.”

The only consistent organising principle, she finds, is profit.

There are moments of queasy hilarity which remind one powerfully of a Guardian op-ed by Rebecca Shaw which went viral in January this year, headlined “I knew one day I’d have to watch powerful men burn the world down – I just didn’t expect them to be such losers”.

Facebook’s top management team, for instance, ensures that Zuckerberg wins every board game he plays. Zuckerberg requests that the global team arrange for him to be “gently mobbed” by a crowd in Indonesia for a kind of ersatz rock god experience. He pairs this with karaoke on his private jet, where his team dutifully cheers him on in his rendition of the Backstreet Boys’ “I Want It That Way”.

Wynn-Williams gives us tremendously entertaining sketches behind the scenes of global summits like Davos, which are almost cartoonishly chaotic — and once again, often laced with pure cringe. Zuckerberg asks Chinese President Xi Jinping if he will “do him the honour of naming his unborn child”. Xi refuses.

In one of the most extraordinary episodes, Facebook COO Sandberg demands that Wynn-Williams get into bed with her on a private plane — with Sandberg apparently accustomed to sleeping on female employees’ laps while they stroke her hair. (It gives “Lean In” a whole new meaning.)

How Facebook warped global politics


Beyond the Silicon Valley soap opera, the book’s most chilling revelations lie in its detailed account of how Facebook warped global politics. We learn how a seemingly innocuous “I voted” button on Facebook rolled out in the US midterms in 2010 nudged more than 300,000 extra people to the polls — a discovery that would kickstart a decade-long experiment in political influence.

The growth team quickly sees the potential in getting cosy with politicians.

“Let’s dial up the algorithm to give politicians some love,” is the line they use. In other words, they begin skewing the algorithm to ensure that politicians’ posts reach more people than they would organically.

By 2014, the strategy was formalised. A new team was built to woo politicians around the world, to train them how to use the platform effectively — and to subtly remind them that their re-election campaigns would go better if Facebook liked them. Or, more precisely, if the algorithm did.

This is where things get deeply sinister. Not only did Facebook experiment with artificially boosting politicians’ posts to curry favour, it also considered trading user data — including Hong Kong protesters’ information — to authoritarian regimes like China in exchange for market access.

It’s a win-win situation: “If politicians depend on Facebook to win elections, they’ll be less likely to do anything that’ll harm Facebook,” writes Wynn-Williams.

By 2016, the Donald Trump campaign understood Facebook’s political utility better than Facebook itself did.

“Facebook rewards outsider candidates who post inflammatory content that drives engagement,” writes Wynn-Williams.

Worse: “We charge less money for ads that are more incendiary and reach more people.”

She recounts how Facebook allowed micro-targeted voter suppression campaigns against three groups of Democrats, including a black audience shown secret ads reviving Hillary Clinton’s “super predators” soundbite. These “dark posts” were invisible to the broader public. Just like that, Facebook helped reshape an election, all while insisting it was just a neutral platform.

When Zuckerberg and Co are confronted with the consequences of their product after Trump’s win, they are less remorseful than furious. Particularly Zuckerberg, who fumes after a dressing-down from Barack Obama about “fake news”, and is deeply resentful about what he sees as unfair media criticism.

So much so that he begins scheming ways to “crush” the press entirely.

How clean is the whistle-blower’s whistle?


Wynn-Williams is a mostly likeable narrator, though her own culpability gnaws at the edges of the story. She spent six years at a company she repeatedly says she found ethically indefensible. Her claim that she stayed on for financial reasons rings a little hollow: as Wired pointed out, Wynn-Williams would have been taking home millions of dollars annually.

Other critics with a front-row seat to the action have suggested that Wynn-Williams downplays the role she played in these destructive global practices and simultaneously takes too much credit for the few positive measures put in place.

Still, what she offers is invaluable: a granular, infuriating, and often absurd portrait of a company with terrifying reach and no moral compass. Even if Facebook dismisses her claims as “old news” (and that admission is in itself surely a tremendous own goal), the reality is that we are now living with the consequences, in a world where tech barons Zuckerberg, Elon Musk and Jeff Bezos enjoyed front-row seats to Trump’s second inauguration.

This book won’t change how Big Tech behaves.

But it might, at least, make us stop pretending these guys are the geniuses they claim to be. Sometimes, they’re just very rich weirdos with terrible karaoke taste and an awful lot of power. DM

Excerpt from ‘Careless People: A story of where I used to work’, by Sarah Wynn-Williams (published in South Africa by Pan Macmillan)


By now, it feels like the day-to-day at Facebook is lurching from one dismaying shit show to the next. Mark and Sheryl seem completely removed. Focused on presidential runs or promoting new books or commencement speeches or whatever.

In April 2017, a confidential document is leaked that reveals Facebook is offering advertisers the opportunity to target thirteen- to seventeen-year-olds across its platforms, including Instagram, during moments of psychological vulnerability when they feel “worthless,” “insecure,” “stressed,” “defeated,” “anxious,” “stupid,” “useless,” and “like a failure.” Or to target them when they’re worried about their bodies and thinking of losing weight. Basically, when a teen is in a fragile emotional state.

Facebook’s advertising team had made this presentation for an Australian client that explains that Instagram and Facebook monitor teenagers’ posts, photos, interactions, conversations with friends, visual communications, and internet activity on and off Facebook’s platforms and use this data to target young people when they’re vulnerable. In addition to the moments of vulnerability listed, Facebook finds moments when teenagers are concerned with “body confidence” and “working out & losing weight.”

At first blush it sounds pretty gross, sifting through teens’ private information to identify times when they might be feeling worthless and vulnerable to an advertiser flogging flat-tummy tea or whatever other rubbish.

But apparently Facebook’s proud of it. They’ve placed a story in Australia explaining how the company uses targeting based on emotions: “How Brands Can Tap into Aussie and Kiwis [sic] Emotions: Facebook Research,” which touts how Facebook and Instagram use the “emotional drivers of behavior” to allow advertisers to “form a connection.” The advertising industry understands that we buy more stuff when we are insecure, and it’s seen as an asset that Facebook knows when that is and can target ads when we’re in this state.

It’s a reporter for an Australian newspaper who’s got his hands on one of the internal documents about how Facebook actually does this, and he reaches out for a comment from Facebook before publishing. That’s when I hear about it. I didn’t know anything about this and neither did the policy team in Australia. It’s an advertising thing. I’m put on a response team of communications specialists, members from the privacy team and measurement team, and safety policy specialists that’s supposed to figure out what to say publicly.

No one in that group, other than me and my Australian team, seems surprised that Facebook made an advertising deck like this. One person messages the group, “I have a very strong feeling that she [the Australian staffer who prepared the deck] is not the only researcher doing this work. So do we want to open a giant can of worms or not?” And they’re right. At first, we think the leaked document is one Facebook made to pitch a gum manufacturer to target teenagers during vulnerable emotional states. Then eventually the team realize, no, the one that got leaked was for a bank. There are obviously many decks like this.

The privacy staffer explains that teams do this type of customized work targeting insecurities for other advertisers, and there are presentations for other clients specifically targeting teens. We discuss the possibility that this news might lead to investigations by state attorneys general or the Federal Trade Commission, because it might become public that Facebook commercializes and exploits Facebook’s youngest users.

To me, this type of surveillance and monetization of young teens’ sense of worthlessness feels like a concrete step towards the dystopian future Facebook’s critics had long warned of.

A statement is quickly drafted and the response team debates whether Facebook can include the line, “We take this very seriously and are taking every effort to remedy the situation,” since in fact this is apparently just normal business practice. A comms staffer points out what should be obvious: that “we can’t say we’re taking efforts to remedy it if we’re not.”

This prompts other team members to confirm his take, revealing other examples they know of. Facebook targets young mothers, based on their emotional states, and targets racial and ethnic groups — for example, “Hispanic and African American Feeling Fantastic Over-index.” Facebook does work for a beauty product company tracking when thirteen- to seventeen-year-old girls delete selfies, so it can serve a beauty ad to them at that moment.

We don’t know what happens to young teen girls when they’re targeted with beauty advertisements after deleting a selfie. Nothing good. There’s a reason why you erase something from existence. Why a teen girl feels that it can’t be shared. And surely Facebook shouldn’t then be using that moment to bombard them with extreme weight loss ads or beauty industry ads or whatever else they push on teens feeling vulnerable. The weird thing is that the rest of our Facebook coworkers seem unbothered about this.

My team and I are horrified; one of them messages me, “Also wondering about asking my apparently morally bankrupt colleagues if they are aware of any more. The Facebook advertising guy who is cited in the [Australian] article has three children — I talked him through his kid being bullied — what was he thinking?”

I’m still struggling to get a better picture of what we’re dealing with here. So I ask for an independent audit by a third party to understand everything that Facebook has done like this around the world, targeting vulnerable people, so I can try to stop it. Who has this information and how many advertisers has it been shared with? The team is not enthusiastic. Elliot nixes any audit and cautions against using the word “audit” at all, even as an ask like mine, saying that “lawyers have discouraged that description in similar contexts.” He doesn’t say why but I’m guessing he doesn’t want to create a paper trail, a report with damning details that could be leaked or subpoenaed. Years later I would learn that British teenager Molly Russell had saved Instagram posts including one from an account called Feeling Worthless, before committing suicide. “Worthless” being one of the targeting fields. This only emerged due to a lawsuit that revealed internal documents acknowledging a “palpable risk” of “similar incidents.”

The initial statement Facebook gives the Australian journalist who discovered the targeting and surveillance back in 2017 does not acknowledge that this sort of ad targeting is commonplace at Facebook. In fact, it pretends the opposite: “We have opened an investigation to understand the process failure and improve our oversight. We will undertake disciplinary and other processes as appropriate.”

A junior researcher in Australia is fired. Even though that poor researcher was most likely just doing what her bosses wanted. She’s just another nameless young woman who was treated as cannon fodder by the company.

When that doesn’t stop media interest, Elliot says, “We need to push back hard on the idea that advertisers were enabled to target based on emotions. Can you share to group so Sheryl et al can i) see the article, ii) understand next steps.” Joel wants a new, stronger statement, one saying that we’ve never delivered ads targeted on emotion. He directs that “our comms should swat that down clearly,” but he’s told that it’s not possible. Joel’s response: “We can’t confirm that we don’t target on the basis of insecurity or how someone is feeling?” Facebook’s deputy chief privacy officer responds, “That’s correct, unfortunately.” Elliot asks whether it is possible to target on words like “depressed” and the deputy chief privacy officer confirms that, yes, Facebook could customize that for advertisers. He explains that not only does Facebook offer this type of customized behavioural targeting, there’s a product team working on a tool that would allow advertisers to do this themselves, without Facebook’s help.

Despite this, Elliot, Joel, and many of Facebook’s most senior executives devise a cover-up. Facebook issues a second statement that’s a flat-out lie: “Facebook does not offer tools to target people based on their emotional state.” The new statement is circulated to a large group of senior management who know it’s a lie, and approve it anyway. It reads,

On May 1, 2017, The Australian posted a story regarding research done by Facebook and subsequently shared with an advertiser. The premise of the article is misleading. Facebook does not offer tools to target people based on their emotional state.

The analysis done by an Australian researcher was intended to help marketers understand how people express themselves on Facebook. It was never used to target ads and was based on data that was anonymous and aggregated.

I take a couple of days off for a family trip and to celebrate Xanthe’s birthday, and the response team continues on without me. I’m glad to miss it. DM
