A new study shows that misinformation on Facebook got six times more engagement than factual news, according to The Washington Post.
This new study is likely to bolster critics’ arguments that the company’s algorithms are fueling the spread of misinformation at the expense of trusted sources.
The study looked at posts from the Facebook pages of more than 2,500 news publishers between August 2020 and January 2021.
News publishers known for spreading misinformation received six times more likes, shares, and interactions on the platform than trusted news sources.
This elevated engagement appeared across the political spectrum. But the study found that right-leaning publishers were far more likely to share misinformation than publishers in other political categories, according to The Washington Post.
The researchers are presenting the study at the 2021 Internet Measurement Conference in November, though researcher Laura Edelson said it could be released sooner.
In response, the company said the report measured how many people interacted with the content, not how many people actually saw it.

“This report looks at how people interact with content, which is not to be confused with the number of people who view it on Facebook,” a Facebook spokesperson said, adding that when you look at the content that gets the most reach across the platform, it is not at all like what this study suggests.
He added: “The company has 80 fact-checking partners covering more than 60 languages that work to categorize misinformation and reduce its distribution.”
Facebook does not give researchers access to reach data. Instead, researchers who want to understand and quantify the spread of misinformation on the platform turn to a tool called CrowdTangle, which Facebook owns.

But in August, the company cut off this group of researchers’ access to that data, as well as to its library of political ads on the platform.
People interact with misinformation on Facebook more than with news
The company said continuing to give third-party researchers access to the data could violate a settlement with the Federal Trade Commission it entered into after the Cambridge Analytica scandal.
The commission responded, in a rare rebuke, that the settlement exempts researchers and that Facebook should not use it as an excuse to deny the public the ability to understand people’s behavior on social networks.
New York Times technology columnist Kevin Roose used CrowdTangle to compile regular lists of the posts that got the most engagement on the platform.

This reportedly angered senior employees within the company, because the lists were regularly dominated by right-wing pages spreading large amounts of misinformation.
In an attempt to counter claims that misinformation is a problem on the platform, the company released a transparency report in August showing the most viewed posts on the platform during the second quarter of the year.
Days later, The New York Times revealed that the company had shelved plans to release a similar report for the first quarter.

It did so because the most viewed post in that period was an article that falsely linked the coronavirus vaccine to the death of a doctor in Florida, a post that many right-wing pages had used to cast doubt on the safety of vaccines.