
Facebook internal documents show execs knew platform spread misinformation and failed to act at times


A Facebook researcher on the company's "integrity" team wrote in an August 2020 resignation letter reviewed by CBS News that "promising interventions" to clean up the site were "prematurely stifled or severely constrained by key decision makers" on multiple occasions.

One internal document from July 2019 titled "Carol's Journey to QAnon" describes a dummy account created by a researcher who then engaged with content suggested by Facebook's technology. It was a short journey – after just a few days, the account saw only conspiracy theories, lies, and graphic content.

Those are just two of thousands of pages of internal Facebook documents provided to Congress by lawyers for Frances Haugen, the whistleblower who told lawmakers at a Senate hearing that the company chooses profits over the safety of users. Haugen, who first revealed her identity on "60 Minutes," also filed complaints with the Securities and Exchange Commission.

A consortium of 17 U.S. news organizations, including CBS News, has reviewed redacted versions of the documents received by Congress, files that include internal research, presentations and employee comments. The Wall Street Journal published a series of reports based on some of the same documents last month, before Haugen's "60 Minutes" interview.


"If you look at the history of Facebook, they clean up the mess after the fact," said Brian Boland, a former Facebook executive who told CBS News he quit last year over concerns the platform was contributing to polarization of society.

In a statement to CBS News, a Facebook spokesperson said, "at the heart of these stories is a premise which is false."

"Yes, we're a business and we make profit, but the idea that we do so at the expense of people's safety or well being misunderstands where our own commercial interests lie. The truth is we've invested $13 billion and have over 40,000 people to do one job: keep people safe on Facebook," the company spokesperson said. 

But that doesn't exactly square with what the internal documents reveal: that Facebook executives knew the platform's technology recommends content that leads users down a rabbit hole of misinformation and conspiracy theories. The documents also show that over the past year, as vaccine hesitancy spread, executives knew about "rampant" comments promoting vaccine hesitancy on the platform.

Another document noted that Facebook lacks the tools to effectively moderate content in foreign languages. One researcher evaluating content moderation efforts in Afghanistan estimated that Facebook catches less than 1% of hate speech posted by users in that country.


Facebook has said that the internal research leaked by Haugen, a former product manager who worked on algorithmic recommendation systems that power the platform's News Feed, is a "curated selection" of documents that don't fairly represent the company's work. 

But Boland, who spent more than a decade at Facebook and most recently oversaw market and product strategy, told CBS News the documents accurately reflect his own experiences at the company.

"They hired the world's best researchers to look into these questions around misinformation on these platforms. They asked them to come up with solutions," said Boland. "The problem is that when the moment of choice mattered most, where you choose to put safety and reducing misinformation first, or growing the business first, they chose to grow the business." 

Boland, who reviewed some of the documents obtained by CBS News, said they confirm his "greatest fears": that a focus on "growth and innovation" means "protections against real world harm are constantly undervalued and neglected."

He called an April 2020 memo containing reported feedback from CEO Mark Zuckerberg on proposed safety changes "a perfect, straightforward example of engagement and growth being at odds with these safety levers."


In that memo, a researcher shared feedback from Zuckerberg with other colleagues on a proposal to "reduce bad content in News Feed," writing that Zuckerberg did not want to move forward with parts of the plan "if there was material tradeoff" with engagement numbers.

"Mark chose to stick with the growth side of the business," Boland said. "This culture of focus on growth and the things that people focus on starts at the top with Mark, and then permeates down into each of the teams to do their work," he added.

Facebook conducted research studies and penned memos analyzing the impact of a focus on what it calls "Meaningful Social Interactions." The effort, adopted in 2018, aims to get users to interact with content rather than passively consume it. The company has said it brings families and friends closer by showing them more of what they want to see.

But internal Facebook research found "unhealthy" side effects. The same models that were designed to increase meaningful interactions between family and friends were driving users towards misinformation and divisive content.

Facebook told CBS News that the ranking changes on its platforms are not the source of the world's divisions. "Research shows certain partisan divisions in our society have been growing for many decades, long before platforms like Facebook even existed. It also shows that meaningful engagement with friends and family on our platform is better for people's well-being than the alternative," a company spokesperson said.

In a December 2020 internal memo, a researcher analyzing the influence of politics on content policy wrote that "Facebook routinely makes exceptions for powerful actors when enforcing content policy" and added that "in multiple cases, the final judgements about whether a prominent post violates certain written policy, are made by senior executives, sometimes Mark Zuckerberg."


The same memo claims that in early 2020, Facebook removed the "repeat offenders" designation, applied to accounts that repeatedly violate community standards policies, from popular conservative voices, including radio talk show host Charlie Kirk, because the policy and PR teams feared backlash.

After the 2020 election, Facebook researchers found evidence that Stop the Steal groups, whose members used the platform to organize their plans to storm the U.S. Capitol, were growing "rapidly." Researchers said the company was aware that the extremist group pages included high levels of hate and violence-inciting content, but noted that Facebook's response and enforcement were "piecemeal."

In an internal memo analyzing the growth of Stop the Steal groups following the presidential election, researchers said that Facebook "lacks the tools and protocols for handling evolution of movements" and had "little policy around coordinated authentic harm."

Facebook has long said it wants lawmakers to step in and provide policy solutions and a regulatory framework. At a Senate subcommittee hearing earlier this month, Republicans and Democrats said they want the company to be more transparent about the platform's impact on the public.

Last week, Senator Richard Blumenthal, the chair of the same subcommittee that heard from Haugen earlier this month, called on Zuckerberg to testify before Congress and answer questions raised by the Facebook Papers.

"Unless we get regulators to create an oversight body and to create regulation around transparency, we're at a loss, and we've lost this battle," Boland warned. "My hope is that the company will become more transparent both with employees and externally. My fear is that it won't."
