Facebook Announces New Experiment to Reduce the Reach of Divisive Political Content in User Feeds

Facebook has announced a new experiment which will see it de-emphasize political posts and updates about current events in user feeds, in response to ongoing user feedback. Facebook will also limit the amount of political content that people see in their News Feeds, which could have a transformative impact on broader engagement – both on and off the platform itself.

As reported by Axios:

“Moving forward, Facebook will expand some of its current News Feed tests that put less emphasis on certain engagement signals, like the probability that a user will share or comment on a post, in its ranking algorithm. Instead, it will begin placing a higher emphasis on other types of user feedback, like responses to surveys.”

As Axios notes, the move is an extension of the experiment Facebook launched earlier in the year which has seen it reduce the amount of political content in some user News Feeds.

That test was developed in response to concerns about the impacts of divisive political debates on the platform, and Facebook has been running the experiment with ‘a small percentage of users’ in the US since February 17th.

Now, Facebook says, it has enough data to suggest that this is potentially a viable path forward for development.

“We’ve seen positive results from our tests to address the feedback we’ve received from people about wanting to see less political content in their News Feed. As a result, we plan to expand these tests to Costa Rica, Sweden, Spain and Ireland.”

Facebook further notes the experiment has shown that ‘some engagement signals can better indicate what posts people find more valuable than others’.

“Based on that feedback, we’re gradually expanding some tests to put less emphasis on signals such as how likely someone is to comment on or share political content. At the same time, we’re putting more emphasis on new signals such as how likely people are to provide us with negative feedback on posts about political topics and current events when we rank those types of posts in their News Feed.”

So the updated algorithm approach – which, worth noting again, is only being launched in four nations in this initial test phase – will reduce the emphasis on content that prompts negative emotional responses and comments. That addresses a key criticism leveled at Facebook over time: that its algorithm essentially incentivizes divisive debate by amplifying posts that are more likely to spark back-and-forth discussion, often angry and divisive in nature, because not all ‘engagement’ is positive in this respect.

Which could be a helpful update, because it’s true that not all engagement is beneficial. If Facebook uses ‘engagement’ as a blanket proxy for interaction, and looks to fuel it in any way it can, without regard for what that engagement actually represents, then it ends up amplifying content that sparks argument – the system simply sees that people are commenting and shows that content to even more people.

Which makes sense from a pure interaction standpoint, as it keeps people active in the app. But the problem that creates is that it incentivizes creators and publishers to post more ‘hot takes’ as a means to get that dopamine rush from the subsequent responses and alerts, as well as more reach and clicks from a publishing standpoint.

People want to feel like they’re being heard, and social media provides a means to connect with a wide audience. But if you’re not saying something that gets attention – like a controversial comment, a funny remark or an inspirational quote – the chances of really gaining traction, and getting the subsequent buzz of notifications, are very slim.

That’s why everyone on social media is a comedian, or a life coach, or a political pundit – because that’s what generates response. And that response, in a political sense, often leads to division, as the algorithms then amplify that content based on engagement, prompting more people to essentially take a side in the debate.

And these are often debates that many users wanted no real part in. But once you start engaging with a certain topic, the algorithm will show you more of it, and eventually your Facebook feed becomes a swamp of political turmoil, built largely on people’s need for attention, and the excitement of response in the app.

This update could work to address this, by reducing the emphasis on your likelihood to comment, and factoring in the increasing direct feedback Facebook gets that people want to see fewer political posts in their feeds.

Which Facebook CEO Mark Zuckerberg noted back in February:

“One of the top pieces of feedback we’re hearing from our community right now is that people don’t want politics and fighting to take over their experience on our services. So one theme for this year is that we’re going to continue to focus on helping millions more people participate in healthy communities and we’re going to focus even more on being a force for bringing people closer together.”

This new update is the next stage of this exploration, and it could have a major impact, with Facebook already seeing positive response when it reduced political content in user feeds in the aftermath of the US election. That change led to what Facebook staffers internally referred to as the ‘nicer’ News Feed, lessening the intensity of debate and division across the board.

Could this update enact a similar shift across all of Facebook?

Overall, it seems like a positive test either way, but as with any Facebook experiment, given its scale, some are going to lose out as well.

Facebook notes that the changes will affect public affairs content more broadly, and that publishers will likely see an impact on their traffic.

“Knowing this, we’re planning a gradual and methodical rollout for these tests, but remain encouraged, and expect to announce further expansions in the coming months.”

In other words, don’t expect a major change in your Facebook feed any time soon, but it is the beginning of a significant experiment, and that could have a major impact on how Facebook engagement works.

Will that also impact Page reach more broadly? It seems like it will only be focused on political content at this stage, which could, as a result, benefit Pages in other categories, as there may well be more room in feeds for their content to fill instead. But it’s impossible to know – Facebook itself is only just beginning this next phase of testing, so it can’t be sure what the full impacts will be in this respect.

Overall, it seems like a positive push from The Social Network, responding to ongoing concerns, and moving in a more measured, positive direction.

Because while Facebook has sought to counter claims that it has helped to fuel divisive, negative movements through content amplification – wittingly or not – the overwhelming evidence suggests that it has had an impact in this regard. Its News Feed algorithm, which, again, incentivizes content that will see the most debate and discussion, has changed the incentive structure for publishers, pushing them towards more emotion-charged headlines and reports.

It has shifted attention in this respect, and with most people having an active Facebook account, that can have, and arguably has had, a transformational impact on societal perspective, which has exacerbated existing divides.

As such, this could be a huge positive step, and one that reaches beyond Facebook itself.


