Yet another major controversy has erupted concerning social media platforms, and it involves none other than the social media giant Facebook.
Following an interview on "60 Minutes" a week ago, the identity of the Facebook whistleblower was revealed to the world as Frances Haugen. She released thousands of pages of internal research and documents, sparking a huge debate about the unethical practices of social media companies like Facebook.
After all, companies run for profit, but adhering to basic standards of business ethics is both desirable and necessary, especially for a social media company, which not only carries and displays public sentiments and emotions but is powerful enough to shape public sensibilities. Recently, Facebook Vice-President of Global Affairs Nick Clegg told CNN's Brian Stelter, "There is no perfection on social media as much as in any other walk of life." Even if that is true, little or no mercy can be shown to social media companies, because the public stake in the way they operate runs quite high.
Frances Haugen, the 37-year-old former Facebook product manager and whistleblower, who worked on civic integrity issues at the company, has provided documents to US Government agencies as well as to The Wall Street Journal showing the extent to which the social media giant is consciously aware of the harm and damage its applications cause, and has kept that knowledge out of the public domain.
According to her, and as the documents she has shared with various agencies show, "Facebook knows its platforms are used to spread hate, violence and misinformation, and the company has tried to hide that evidence." Haugen started working at Facebook in 2019 and had previously worked for tech giants like Google and Pinterest. What her "60 Minutes" interview chiefly reveals is the fundamental contradiction between the public good, what social media appears to have been designed for, and what it actually is.
She said, "The thing I saw at Facebook over and over again was there were conflicts of interest between what was good for the public and what was good for Facebook, and Facebook over and over again chose to optimise for its own interests like making more money."
Afterwards, the "60 Minutes" correspondent Scott Pelly reportedly quoted one internal Facebook (FB) document as, "We have evidence from a variety of sources that hate speech, divisive political speech and misinformation on Facebook and the family of apps are affecting societies around the world."
According to these reports and Haugen's claims, there is no evidence that Facebook has actually acted on the information it possesses; rather, it has kept silent to advance its own interests.
As per reports, Haugen filed at least eight complaints a month ago with the Securities and Exchange Commission, alleging that the company hid its research about these shortcomings from investors and the public. She is also reported to have shared the documents with The Wall Street Journal, which show that "Facebook was aware of problems with its apps, including the negative effects of misinformation and the harm caused, especially to young girls, by Instagram." Most memorably, Haugen said in the interview that while "no one at Facebook is malevolent ..., the incentives are misaligned".
She further said, "Facebook makes more money when you consume more content. People enjoy engaging with things that elicit an emotional reaction, and the more anger that they get exposed to, the more they interact and the more they consume."
The claims made and the documents leaked by Haugen suggest that Facebook's "safeguards" against hate speech, incitement to violence, and content harmful to the mental wellbeing of young people exist in name only.
The company also appears to be aware of the role its apps play in inciting ethnic violence in many parts of the world, and of how its product Instagram fosters body shame and depression among teenage girls.
As per Haugen's disclosures, the AI-based algorithms Facebook uses are designed to keep people on the site or app as long as possible so that their time, and the data so collected, can be monetised. It is particularly disturbing that the company is reported to have allowed hatred and negative emotions and sentiments on its platform and then monetised the heightened engagement they generate.
This is dangerous in the long run: such an internal policy at any social media company can harm many walks of social life, be it politics, cultural practices, or relationships, as it propels sexual violence, hatred, misogyny, depression, and suicidal tendencies, to the point of creating complete social disharmony.
It is therefore necessary for national governments to bring in stringent social media regulations to control such malpractices, while ensuring that these platforms remain truly democratic and free.
(The writer is a lawyer and a public policy expert and a Distinguished Adjunct Professor of Law and Media Studies at School of Mass Communication, KIIT Deemed University. He can be reached at email@example.com)