A group of former Facebook workers who served as “news curators” have disclosed how they manipulated the news shown on Facebook pages, prioritizing some stories and suppressing others.
One curator revealed that workers kept conservative articles out of prominent sections even when those articles were trending organically elsewhere.
This revelation should be of high interest, and sobering concern, to anyone who obtains news from Facebook’s pages and believes they are being shown all available information.
These former Facebook news curators, who preferred to remain anonymous to avoid possible retribution from their former employer, describe an operating environment at odds with the company’s claim that trending posts are “topics that have recently become popular on Facebook.”
Facebook is said to use a complex, opaque algorithm that analyzes article content for suitability to a given audience. According to these news curators, however, both worker bias and institutional imperatives also influence how content is treated.
One former curator confirmed that the operation had a particular aversion to various right-wing news sources. “It was absolutely bias. We were doing it subjectively. It just depends on who the curator is and what time of day it is. Every once in a while a Red State or conservative news source would have a story. But we would have to go and find the same story from a more neutral outlet that wasn’t as biased.”
Not all curators concurred with the claim, and it remains unconfirmed whether left-leaning news sources or topics were similarly suppressed.
In addition to suppressing and promoting topics, curators had the liberty of injecting content from other sources that was not currently trending on Facebook. For example, one curator stated, “Facebook got a lot of pressure about not having a trending topic for Black Lives Matter. They realized it was a problem, and they boosted it in the ordering. They gave it preference over other topics. When we injected it, everyone started saying, ‘Yeah, now I’m seeing it as number one.’”
It remains unclear how far up the corporate hierarchy these directives originated. Managers of news teams were known to instruct curators to artificially manipulate the trending modules, but the practice appears to contradict stated corporate philosophy.
Readers on Facebook pages might not be aware that the information they receive is controlled or restricted.
For example, this article will be posted on the eHeadlines Facebook page, which currently has more than 270,000 followers who have expressed interest in content from the page. However, the average eHeadlines article reaches only between 5,000 and 10,000 of those followers because of how the Facebook algorithm manages the distribution of content on the site. As a result of the algorithm, more than 90 percent of the page’s followers, people who may well want notification of new content, never receive it.
Indeed, a Facebook page is not the equivalent of an RSS feed: although Facebook proposes to become a news source, its distribution of content to page followers is erratic and unreliable.
The choice of what a Facebook reader receives does not appear to be entirely under the user’s control.