Karen Hao for the MIT Technology Review, 'How Facebook got addicted to spreading misinformation':
By the time thousands of rioters stormed the US Capitol in January, organized in part on Facebook and fueled by the lies about a stolen election that had fanned out across the platform, it was clear from my conversations that the Responsible AI team had failed to make headway against misinformation and hate speech because it had never made those problems its main focus. More important, I realized, if it tried to, it would be set up for failure.
The reason is simple. Everything the company does and chooses not to do flows from a single motivation: Zuckerberg’s relentless desire for growth. Quiñonero’s AI expertise supercharged that growth. His team got pigeonholed into targeting AI bias, as I learned in my reporting, because preventing such bias helps the company avoid proposed regulation that might, if passed, hamper that growth. Facebook leadership has also repeatedly weakened or halted many initiatives meant to clean up misinformation on the platform because doing so would undermine that growth.
In other words, the Responsible AI team’s work—whatever its merits on the specific problem of tackling AI bias—is essentially irrelevant to fixing the bigger problems of misinformation, extremism, and political polarization. And it’s all of us who pay the price.
“When you’re in the business of maximizing engagement, you’re not interested in truth. You’re not interested in harm, divisiveness, conspiracy. In fact, those are your friends,” says Hany Farid, a professor at the University of California, Berkeley, who collaborates with Facebook to understand image- and video-based misinformation on the platform.
Hao’s editor, Gideon Lichfield, shared on Twitter some of the PR stunts Facebook (and other companies) use to push back on articles like these.
It’s not surprising that Facebook wants to push back on stories like these, but it almost doesn’t matter. The damning stories keep coming, and Facebook continues to thrive.
Hao’s article is also the story of how we got addicted to Facebook, and why we keep coming back to it as a society. But Facebook isn’t an essential system; for most of us, it’s a daily habit we mindlessly tap on when we’re bored. That makes this the easiest problem to solve: delete Facebook, and once enough people do, the company — and Zuckerberg — go away.