
Facebook encourages hate speech for profit


Frances Haugen, the former employee responsible for leaking a huge trove of Facebook’s internal documents to The Wall Street Journal, appeared on the program 60 Minutes to reveal more of the inner workings of the world’s most powerful social media platform.

Haugen revealed her identity on national television. She described a company so focused on optimizing its product that it adopted algorithms that amplify hate speech.

“It is paying for its profits with our safety,” Haugen told 60 Minutes host Scott Pelley.

According to a since-deleted LinkedIn account, Haugen was a product manager at the company within its Civic Integrity group. She chose to leave the company in 2021 after the group was dissolved.

She said she did not trust that the company was willing to invest what it needed to in order to keep the platform from being dangerous. Consequently, she leaked a large body of internal research to the Securities and Exchange Commission in the hope of prompting better regulation of the company.

She noted that she had worked at a number of tech companies, including Google, but said things were substantially worse at Facebook because of the company’s willingness to put its profits before the interests of its users.

“There was a conflict between what was good for the public and what was good for Facebook,” she said. Time and again, the company opted for changes that served its own interests, such as making more money.

And while the company claims it is stopping hate speech on its own products, an internal document leaked by Haugen states: “We estimate that we may action as little as 3 to 5 percent of hate and roughly 0.6 percent of violence and incitement.”


Facebook does not care about the interests of its users

Another document states that the company has evidence from a variety of sources that hate speech, divisive political discourse, and disinformation across its platform and suite of apps affect communities around the world.

Haugen says the root of the problem lies in the algorithms introduced in 2018 that govern what users see across the platform. According to her, their goal is to increase engagement, and the company found that the content driving the most engagement is the kind that instills fear and anger in users. “It’s easier to inspire people to anger than it is to other emotions,” Haugen said.

At the time, Mark Zuckerberg presented the algorithm changes as positive. “We feel a responsibility to make sure our services aren’t just fun to use, but also good for people’s well-being,” he said.

But according to a Wall Street Journal report on Haugen’s disclosures, the result was a sharp turn toward anger and hatred. One internal memo cited by the newspaper, assessing the effects of the change, noted that misinformation and violent content were inordinately prevalent among reshares.

The Wall Street Journal began publishing the documents as “The Facebook Files” in September. One report, alleging that the company’s own research showed Instagram harms teenage girls, has since led to a congressional hearing.

