Facebook: The Inside Story

I’m at a loss for what to say about Steven Levy’s book, Facebook: The Inside Story. At 500 pages, it’s a deep dive into the history of Facebook (the startup and all that’s happened since). The excerpts below are just a few of the things that caught my eye. It would be a mistake to judge the book (or Mark Zuckerberg) based on the passages I underlined.

Solely by analyzing Likes, they successfully determined whether someone was straight or gay 88 percent of the time. In nineteen out of twenty cases, they could figure out whether one was white or African American. And they were 85 percent correct in guessing one’s political party. Even by clicking innocuous subjects, people were stripping themselves naked. […] In subsequent months, Kosinski and Stillwell would improve their prediction methods and publish a paper that claimed that using Likes alone, a researcher could know someone better than the people who worked with, grew up with, or even married that person. “Computer models need 10, 70, 150, and 300 Likes, respectively, to outperform an average work colleague, cohabitant or friend, family member, and spouse.”

A 2012 study found that Facebook was mentioned in a third of divorces.

It was a natural evolution to put (content moderators) in factories. They became the equivalent of digital janitors, cleaning up the News Feed like the shadow workforce that comes at night and sweeps the floors when the truly valued employees are home sleeping. Not a nice picture. And this kind of cleaning could be harrowing, with daily exposure to rapes, illegal surgery, and endless images of genitals.

Between January and March 2019, (Facebook) blocked 2 billion attempts to open fake accounts — almost as many as actual users on the system. […] The company concedes that around 5 percent of active accounts are fake. That’s well over 100 million.

It’s left to the 15,000 or so content moderators to actually determine what stuff crosses the line, forty seconds at a time. In Phoenix (site of one of the moderator “factories”) I asked the moderators I was interviewing whether they felt that artificial intelligence could ever do their jobs. The room burst out in laughter.

A computer-science teacher at one of the big AI schools told me that Facebook used to be the top employment choice. Now he guesses that about 30 percent of his students won’t consider it, for moral reasons.

“We’ve actually built an AI that’s more powerful than the human mind and we hid it from all of society by calling it something else,” Harris says. “By calling it the Facebook News Feed, no one noticed that we’d actually built an AI that’s completely run loose and out of control.” Harris says that using the News Feed is like fighting an unbeatable computer chess player—it knows your weaknesses and beats you every time. — Tristan Harris (former Google interface engineer)

A few takeaways:

  • Facebook might be the most powerful (influential) organization in the world. And therefore — potentially — the most dangerous.
  • Everyone on the planet is affected by what Facebook does (or doesn’t do). Even those of us without accounts.
  • Mark Zuckerberg is brilliant and has surrounded himself with other brilliant people. He seems to believe he is always the smartest person in the room.
  • Zuckerberg is on a mission to save/change the world. Combined with the above, this makes him very dangerous.
  • People who use Facebook (and those of us who do not) have no idea of the extent to which we are influenced by the people running the platform.
  • Users will never — voluntarily — stop using Facebook.

The book has left me a bit shaken. I always considered religion — some religion — the greatest danger to humanity. Facebook seems a greater threat.