Review | Facebook whistleblower Frances Haugen warns of algorithmic dangers

On one level, Frances Haugen’s new book, “The Power of One: How I Found the Strength to Tell the Truth and Why I Blew the Whistle on Facebook,” is as basic as the title makes it sound. In 2021, Haugen, who had worked at Facebook for two years, took 22,000 pages of documents from inside the company, leaking information that led to a string of devastating stories about the company’s inner workings and a number of congressional and parliamentary hearings. “Facebook knew its platforms were causing harm,” Haugen writes. (The company has since been renamed Meta, but Haugen sticks with Facebook throughout.) “Its stock price continued to soar because nobody else knew.” Profits were contingent on “no one knowing how large the gap between Facebook’s and Instagram’s public narratives and the truth had grown.”

If all Haugen’s book did was present her whistleblowing case (the legal merits of which have yet to be decided), it might still be an important part of the ongoing chronicling of how we allowed social media’s dangers to creep up on us. But what really makes the book worth reading is the broader wisdom in her story (and the absence of the self-importance implied by the book’s unfortunate title).

Haugen began her career at Google in 2006. One of her assignments was helping to create Google Books, back in the innocent time when scanning every book was considered just a service to humanity. Her journey through social media took her from those early days, when the platforms promised to make the world a better place, to the much darker days when algorithms we can’t see or understand try to serve us information that has one purpose, and one purpose only: turning our attention into profits.

Haugen didn’t start out a cynic. The academically gifted child of an Iowa college professor, she came to recognize in high school what she calls her “significant interpersonal deficits,” like not being able to understand humor. Always an iconoclast, she got her degree at the Franklin W. Olin College of Engineering — now ranked among the top undergraduate programs in the country but then a start-up — and interrupted her career at Google to get a master’s degree from Harvard Business School. She writes that Olin “believed integrating the humanities into its engineering curriculum was essential because it wanted its alumni to understand not just whether a solution could be built, but whether it should be built.” That was an uncommon luxury among engineering programs, most of which, Haugen says, do not leave students any time to contemplate the nature of “what their responsibilities are as computer science engineers wielding their ever more godlike powers.” As we head into an AI-driven world, that’s a terrifying observation.

After Haugen’s career was derailed by health problems, she left Google. Following stints at Yelp and Pinterest, where she got more insight into the dark algorithmic arts, she joined Facebook in 2019. Interestingly, the job she ended up taking there had remained open for months, partly, she writes, because having Facebook on your résumé wasn’t a good thing. Haugen says she took the job in the hope that she could help reduce misinformation.

She quickly realized that this was going to be difficult — but not because Facebook had ever made the decision to be evil. You might call it the banality of metrics. “At Facebook, nothing mattered if it couldn’t be measured,” she writes — and part of the conundrum, she observes, is that the measuring mind-set isn’t all bad. “Companies that have cultures unmoored from metrics descend into politics and favoritism,” she writes. On the other hand, if you’re blinded by the data, you might miss the rot that metrics can’t catch. “The reality is that companies governed by metrics and experimentation have trouble identifying slow causal patterns,” she writes. “They’re used to seeing changes that cleanly demonstrate that X caused Y. Most of the worst impacts of Facebook’s product or algorithmic design choices happened step by step, with each step adding thin layers of sediment to the mountain of harm.” There are lessons in here for anyone running a business today.

Although Mark Zuckerberg doesn’t play a starring role in Haugen’s book — and it helps her credibility that she doesn’t pretend to have been closer to his decisions than she was — she clearly thinks he set the tone for the culture. When an internal group proposed the conditions under which Facebook should step in and take down speech from political actors, Zuckerberg discarded its work. He said he’d address the issue himself over a weekend. His “solution”? Facebook would not touch speech by any politician, under any circumstances — a decision that, as Haugen points out, is fraught beneath its simple surface. After all, who gets to count as a politician? The municipal dogcatcher? It was also Zuckerberg, she says, who refused to make a small change that would have made the content in people’s feeds less incendiary — possibly because doing so would have caused a key metric to decline.

When the Wall Street Journal’s Jeff Horwitz began to break the stories that Haugen helped him document, the most damning one concerned Facebook’s horrifyingly disingenuous response to a congressional inquiry asking if the company had any research showing that its products were dangerous to teens. Facebook said it wasn’t aware of any consensus indicating how much screen time was too much. What Facebook did have was a pile of research showing that kids were being harmed by its products. Allow a clever company a convenient deflection, and you get something awfully close to a lie.

Haugen is quick to admit that harm to younger users wasn’t her chief concern. She credits Horwitz with helping her see outside her own preoccupations, which were mainly about genocide. After a trip to Cambodia, where neighbors killed neighbors in the 1970s because of a “story that labeled people who had lived next to each other for generations as existential threats,” she’d started to wonder about what caused people to turn on one another to such a horrifying degree. “How quickly could a story become the truth people perceived?”

And after the military in Myanmar used Facebook to stoke the murder of the Rohingya people, Haugen began to worry that this was a playbook that could be infinitely repeated — and only because Facebook chose not to invest in safety measures, such as detecting hate speech in poorer, more vulnerable places. “The scale of the problems was so vast,” she writes. “I believed people were going to die (in certain countries, at least) and for no reason other than higher profit margins.”

Haugen is solution-oriented — another thing to admire about her — and in a way that moves beyond what she points out is the false choice posited by most social media companies: free speech vs. censorship. She argues that lack of transparency is what contributed most to the problems at Facebook. No one on the outside can see inside the algorithms. Even many of those on the inside can’t. “You can’t take a single academic course, anywhere in the world, on the tradeoffs and choices that go into building a social media algorithm or, more importantly, the consequences of those choices,” she writes.

In its lack of accountability, social media is a very different ecosystem from the one that helped Ralph Nader take on the auto industry back in the 1960s. Then, there was a network of insurers and plaintiffs’ lawyers who also wanted change — and the images of mangled bodies were a lot more visible than what happens inside the mind of a teenage girl. But what if the government forced companies to share their inner workings in the same way it mandates that food companies disclose the nutritional content of what they make? What if the government forced social media companies to allow academics and other researchers access to the algorithms they use?

Haugen is not completely naive about the difficulties of change given the manifold conflicts of interest in our system. She recounts figuring out that as soon as congressional staffers get up to speed on the complexities of social media, they’re “poached by Big Tech with salaries five times what their senators could afford to pay them.” But the title of her book betrays that she’s earnest — and perhaps not cynical enough. She writes that the scandal she helped create caused Facebook’s stock, which had peaked at nearly $380 in 2021, to plummet to below $100. It’s now back up to $267, as of this writing, and Zuckerberg is still solidly in charge. One person might have the power to cause an outcry, but real change, it appears, is a challenge of a different order.

Bethany McLean is a contributing editor at Vanity Fair and the author of “Saudi America: The Truth About Fracking and How It’s Changing the World.”

The Power of One: How I Found the Strength to Tell the Truth and Why I Blew the Whistle on Facebook

By Frances Haugen

Little, Brown. 340 pp. $30
