Facebook is once again entangled in a government and media controversy after a whistleblower from its research ranks shared findings the company had tried to keep hidden from the public. Frances Haugen is a data engineer who worked at Facebook from 2019 to 2021. On Tuesday, she testified before a subcommittee of the Senate Commerce Committee about how decisions are made at the largest social media platform in the world. Her findings range across many topics but share common themes: misinformation, detriments to our mental health and attention spans, and the algorithmic spread of hate.
Since the late 2000s, when the first social media platforms were in their infancy, we as a society have noticed a distinct change not only in how we use and manipulate these platforms but, more importantly, in how they manipulate us. Fast forward to 2021, where Facebook sits atop the social networking market, having absorbed WhatsApp, Instagram, and most recently Oculus, a virtual reality company. The United States government and the Federal Trade Commission are now questioning whether Facebook is a monopoly. Although it faces competitors like LinkedIn, Twitter, TikTok, and even YouTube, there is something to be said for owning the most personal of platforms, the ones where people express who "they are," as the new head of the FTC, Lina Khan, has put it. With this responsibility and power comes massive control of personal data, which is why Facebook keeps landing in hot water.
The data that Frances Haugen and her fellow researchers collected highlights how Facebook's algorithm tees up content for each individual user, holding their attention and leading them into ever-deeper filter bubbles of reactive content. Andrew Marantz, a New Yorker staff writer, has described this feature by its internal name, "meaningful social interactions." The feed you see is no longer in chronological order but arranged to maximize your engagement with the content, re-sorting every few seconds. Marantz uses the example of an ex-boyfriend's wedding photos, which makes sense if you have been actively looking at that person's page for the last several months or years. The same happens with opposing political views, which are served to you because the language of the comment section touches on keywords related to your beliefs and has been shown to draw your engagement. Negative reactions create a back-and-forth between users that generates even more engagement as more people from opposite sides of the table join in. This has, in turn, polarized views on politics, human rights movements, and the legitimacy of Covid-19.
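The ranking logic described above can be sketched in a deliberately simplified form: instead of sorting posts by recency, the feed scores each post by predicted engagement, with reactive signals like comments weighted more heavily than passive ones. Every field name and weight below is hypothetical, invented purely for illustration; this is not Facebook's actual system.

```python
# Hypothetical sketch of engagement-based feed ranking, for illustration only.
# All weights and field names are invented; they do not reflect any real platform.

def engagement_score(post):
    """Weight reactive signals (comments, angry reactions) more heavily
    than passive ones (likes), echoing the incentive Haugen describes."""
    return (5.0 * post["comments"]
            + 3.0 * post["angry_reactions"]
            + 1.0 * post["likes"])

def rank_feed(posts):
    # A chronological feed would sort by post["timestamp"]; an
    # engagement-maximizing feed sorts by predicted interaction instead.
    return sorted(posts, key=engagement_score, reverse=True)

posts = [
    {"id": "a", "timestamp": 3, "comments": 0, "angry_reactions": 0, "likes": 10},
    {"id": "b", "timestamp": 2, "comments": 4, "angry_reactions": 2, "likes": 1},
    {"id": "c", "timestamp": 1, "comments": 1, "angry_reactions": 0, "likes": 2},
]

# The newest post ("a") is no longer first; the most provocative one ("b") is.
print([p["id"] for p in rank_feed(posts)])  # ['b', 'a', 'c']
```

The point of the sketch is the incentive it encodes: because provocative posts accumulate comments and angry reactions, a scorer like this surfaces them above newer but calmer content.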
Even more frightening is what Frances Haugen describes as the "toggling on and off" of hate speech and misinformation safeguards around the end of the 2020 election. Under pressure from government agencies after Russia's involvement in the previous election, Facebook regulated political speech for misinformation and hate speech. The minute the election was over, however, those safeguards were toggled off, allowing a boil-over of extremist content on both sides of the political coin. On January 6th, a vast mob of Trump supporters, organized in part through social media, gathered at the Capitol Building and sought to overturn the result of the 2020 election.
Mark Zuckerberg has denied Facebook's responsibility in this situation, as well as the accusations Frances Haugen brought to the FTC's attention. In the past, he has apologized for the platform's imperfections and agreed to create teams and initiatives to address hate speech. Research from Haugen's team just this year found that Facebook reduced hate speech by only 2 percent, and speech surrounding violence and incitement by about six-tenths of a percent, despite being "the best in the world at it," to quote Zuckerberg himself. After the latest round of questioning from the government, Facebook has taken a more defensive approach, saying Haugen's claims are illogical and taken out of the context of Facebook's wide portfolio.
What’s the Solution?
Among researchers, media experts, and everyday users, the lingering question seems to be: how do we fix a problem that seems too far gone? Asking users to stay off a platform when loneliness and the need for information and connection are at an all-time high seems impossible. The next option would be a government agency stepping in to monitor more closely what people post and say online. That, however, seems unlikely given the First Amendment's guarantee of freedom of speech.
We could hope that the FTC sees the true effect social media platforms (specifically Facebook and Instagram) have on our mental health, our polarized political system, and our overall media literacy, and decides that the companies need to be broken up. The problem with this notion is that the FTC is reluctant to break up companies without concrete evidence of price inflation or monopolization of a single market.
On paper, the overarching Facebook brand is doing neither: it is a free public platform, and it has not acquired every social media platform out there. At this point, relying on Mark Zuckerberg to prioritize safety over monetization seems like a stretch, given what we know about the Harvard grad and his humble beginnings hacking Harvard's class directory just to rate the attractiveness of his female classmates. As a society, we are left to attempt a cultural shift: admitting that these platforms are stronger and smarter than our human weaknesses, such as jealousy, impulsive anger, and envy. The citizens of this country need greater media literacy, which calls for an all-encompassing curriculum or online resource covering how social media platforms started, how they operate and thrive, and what effects they have on our innate human psyche. Considering that in the U.S. we cannot even get every state to teach basic reproductive health and contraception, this type of tech education may be years away.