Facebook Whistleblower Outlines Conflicting Motivations in the Company’s Approach to News Content


Is Facebook bad for society, and does the company knowingly contribute to division and angst, in order to maximize usage and profit?

This is a key question that’s lingered for the past few years, especially since the 2016 US election. And now, we have some insight into Facebook’s own thinking on the subject – over the last two weeks, The Wall Street Journal has reported on a range of internal studies, and the responses to them from Facebook executives, which were leaked by a former Facebook staffer who sought to expose the company’s inaction in addressing key flaws in its design.

That former employee was last night revealed by CBS to be Frances Haugen, an algorithmic design expert who had worked on Facebook’s Civic Integrity team before it was disbanded in the wake of the 2020 US election. According to the information shared by Haugen, Facebook has indeed knowingly avoided taking stronger action to address the worst aspects of its platform, due to the impacts any such moves could have on usage, and thus profits.

And while Facebook has disputed Haugen’s claims, her statements do align with what many previous reports have suggested, underlining key concerns around the societal impacts of Zuckerberg’s social behemoth.

Haugen’s key contention is that Facebook has knowingly overlooked or played down concerning findings, based on its own research, in favor of maintaining usage and user engagement.

As explained by Haugen:

“The thing I saw at Facebook over and over again was there were conflicts of interest between what was good for the public and what was good for Facebook. And Facebook, over and over again, chose to optimize for its own interests, like making more money.”

Which, to some degree, makes sense – Facebook is, after all, a business, and as such, it’s driven by profit, and delivering maximum value for its shareholders.

The problem, in Facebook’s case, is that it operates the largest interconnected network of humans in history, closing in on 3 billion users, many of whom use the app to stay informed, on various fronts, and gather key insights around the news of the day. As such, it has significant power to influence opinion.

That means, as Haugen notes, that its decisions can have big impacts.

“Facebook makes more money when you consume more content. People enjoy engaging with things that elicit an emotional reaction. And the more anger that they get exposed to, the more they interact and the more they consume.”

Indeed, among the various findings highlighted in Haugen’s Facebook Files, the thousands of internal documents which she essentially smuggled out of Facebook HQ, are suggestions that Facebook has:

  • Overlooked the prevalence and impact of hate speech on its platforms, due to the fact that such content also drives more engagement among users
  • Played down the negative impacts of Instagram on young users, with the findings showing that the platform amplifies negative body image
  • Failed to address major concerns around Facebook usage in developing regions, in part due to cost/benefit analysis
  • Failed to address the spread of anti-vaccine content

Again, many of these elements have been widely reported elsewhere, but Haugen’s files provide direct evidence that Facebook is indeed well aware of each of these aspects, and has chosen, at times, not to act, or to take significant counter action, largely due to a conflict with its business interests.

Facebook’s PR team has been working hard to counter such claims, providing point-by-point responses to each of the Facebook Files reports, noting that the existence of these research reports, in themselves, shows that Facebook is working to address such concerns, and combat these problematic elements.

Facebook points to various changes it’s made on Instagram to provide more protection and control options for users, while it’s also working to improve its ranking algorithms to limit exposure to divisive, angst-inducing content.

But at the same time, Facebook has played down the broader impacts of such content.

As Facebook’s vice president of policy and global affairs, Nick Clegg, has noted in response to the suggestion that Facebook played a key role in contributing to the post-election riot at the Capitol building:

“I think the assertion [that] January 6th can be explained because of social media, I just think that’s ludicrous.” 

Clegg’s view is that Facebook is only one small part of a broader societal shift, and that it simply can’t be the core problem that’s led to such major conflict, in various regions.

It’s impossible to know the full extent of Facebook’s impact in this respect, but based on Haugen’s files, the platform is clearly a key contributor.


Anger is the emotion that sparks the most response, the most engagement, and Haugen essentially contends that Facebook is profiting from that, by facilitating the spread of hate-inspiring content that then, as a by-product, amplifies division.

“When we live in an information environment that is full of angry, hateful, polarizing content it erodes our civic trust, it erodes our faith in each other, it erodes our ability to want to care for each other, the version of Facebook that exists today is tearing our societies apart and causing ethnic violence around the world.”

There are two sides to this, and both can be equally correct. One, as Haugen notes, is that Facebook has an underlying motivation to facilitate the spread of hate-inducing content, which sparks more engagement among its users, while also exacerbating societal division – which, at Facebook’s scale, can have a significant impact.

On the other hand, as Facebook notes, it doesn’t conduct such research for nothing. If it were turning a blind eye to these issues completely, it wouldn’t commission such studies at all, and while Zuck and Co. may not be taking as much action as some would like, there is evidence to suggest that the company is working to address these concerns, albeit in a more measured way that, ideally, also lessens the business impact.

The question is, should ‘business impact’ be factored into such consequential decisions?

Again, Facebook operates the largest interconnected network of people in history, so we don’t know what the full impacts of its algorithmically influenced sharing can be, because we don’t have another example to refer to. There’s no precedent for Facebook and its broader impact.

In some ways, Facebook, at its scale and influence, should really be a public utility, which would then change the company’s motivations – as Haugen notes:

“No one at Facebook is malevolent, but the incentives are misaligned, right? Like, Facebook makes more money when you consume more content. People enjoy engaging with things that elicit an emotional reaction. And the more anger that they get exposed to, the more they interact and the more they consume.”

At its core, this is the main issue – we now have a situation where one of the key vehicles for information distribution and dissemination is motivated not by keeping people reliably informed, but by sparking the most engagement that it possibly can. And the way to do that is to incite emotional response, with hate and anger being among the most powerful motivators to lure people into reacting.

According to research, almost a third of US adults regularly access news content on Facebook – that means that at least 86 million Americans are getting direct insights into the latest happenings from a platform that has a clear motivation to show them the most angst-inducing, emotionally-charged takes on each issue.

News publishers also know this, as do politicians – in fact, as per the Facebook Files, various political groups have shifted to more partisan, divisive takes in their approaches in order to appease Facebook’s algorithm.

When you consider the scale of reach, and the impact the platform has on such messaging, it’s clear that Facebook does have an influence over how we engage.

But with conflicting motivations, and a need to maximize engagement in the face of rising competition, can we really expect Facebook to change its approach for the greater good?


