Leaked documents show Facebook algorithm spread misinformation
November 11, 2021
What to know
- Former Facebook employee Frances Haugen released thousands of pages of internal company documents to prominent American news organizations, highlighting Facebook’s alleged role in inciting violence and spreading political misinformation.
- The redacted documents provided to Congress by the whistleblower revealed the company may have been downplaying Instagram’s negative impact on teen girls’ mental health.
- Reporting on the documents illustrates how the company ignored misuse of its products overseas, where ineffective safety measures allowed human trafficking and exploitation.
Beginning Oct. 22, a series of articles dubbed the “Facebook Papers” was released by various news outlets, including the Associated Press, the New York Times, the Washington Post, the Wall Street Journal and CNN. The documents outline the extent of the company’s awareness of its own faults and their ill effects on its user base.
These news organizations were sent internal documents compiled by Frances Haugen, a former Facebook employee. Haugen alleges that the company was fully aware of its flaws, such as an algorithm that promotes misinformation and inadequate content moderation in other countries. Despite warnings from employees and researchers, Facebook’s high-ranking employees and executives failed to appropriately address these problems.
“I am here today because I believe Facebook’s products harm children, stoke division and weaken our democracy,” Haugen said in an opening statement to the U.S. Senate on Oct. 5. “The company’s leadership knows how to make Facebook and Instagram safer but won’t make the necessary changes because they have put their astronomical profits before people.”
Algorithm that promotes misinformation
Among the faults outlined by the documents are algorithms that promote political division and misinformation, Instagram’s harmful effects on young people’s mental health and ineffective content moderation overseas.
“In many cases, it (Facebook’s A.I.) has fostered and fueled conflict, and that’s because they find that on their platform, conflict leads to people spending more time interacting with their products,” said David Gunkel, professor of media studies with specialties in information and communication technology. “It is not a technical issue as to whether or not the A.I. is able or not able to do this. It’s really a management decision.”
The goal of a publicly traded company like Facebook is to produce revenue for its shareholders. It does this by keeping its user base on the app and engaged with content.
One document leaked by Haugen details an experiment in which Facebook employees created a fake account in the summer of 2019. The fake account followed prominent conservative Facebook pages, and two days later, the algorithm promoted several QAnon conspiracy pages to the account.
“QAnon purports that America is run by a cabal of pedophiles and Satan-worshippers who run a global child sex-trafficking operation and that former President Trump is the only person who can stop them,” according to a CBS News article about the conspiracy theory group.
The account was subjected to what the Facebook employee running it called a constant flow of misleading and polarizing content.
“Facebook we think of as a social media company, but it isn’t,” Gunkel said. “Facebook is an A.I. company, and what they are in the business of doing is collecting large amounts of data that they can then monetize and utilize to create various products like facial recognition algorithms, like recommendation algorithms for advertising, etc. The more time you spend interacting with the platform, the more value you supply the company.”
Facebook, as outlined in the internal research documents, has prioritized keeping certain polarizing, divisive content on the platform to keep users engaged and coming back to the site.
Facebook’s damage to teens’ well-being
However, this approach has led to a bevy of detrimental byproducts for the company’s user base.
Researchers inside Instagram found that three out of 10 teen girls said that when they felt bad about their bodies, Instagram made them feel worse, as reported by the Wall Street Journal.
The researchers also indicated that the platform made thoughts of suicide, self-injury and eating issues worse for teen girls using the website.
Following these reports, Facebook expressed its intent to pause plans for ‘Instagram Kids,’ a project targeting an untapped social media demographic: children under the age of 13.
Many of these flaws are the result of the company’s prioritization of growth and usage over necessary preventative safety measures that might impede profit.
Facebook’s past issues and its intent to change
In 2019, Apple threatened to pull Instagram and Facebook from the App Store because the platforms were being used for human trafficking in the Middle East.
“Even today, a quick search for ‘khadima,’ or ‘maids’ in Arabic, will bring up accounts featuring posed photographs of Africans and South Asians with ages and prices listed next to their images,” according to the Associated Press.
Facebook has publicly expressed its intent to crack down on human exploitation, but language barriers have hindered its moderation efforts, and human traffickers continue to use its platform to facilitate the sale of domestic workers.
“Facebook brought certain products within countries, but the perspective of the staff captured in these memos or reports is they did not have the adequate support to deal with the local context of communication within those countries,” said Andrea Guzman, an associate professor of journalism with specialties in human-machine communication and the intersection of artificial intelligence and communication.
Days after the documents were released, Facebook CEO Mark Zuckerberg announced that the company would be changing its name to ‘Meta.’
“(The rebranding) is maybe not in response directly to the Facebook papers, but it’s in response to increasing social, economic and most importantly governmental concerns, questions and pressures regarding the company,” Guzman said.