The FDA should regulate the Instagram algorithm as a drug

The Wall Street Journal reported Tuesday on Silicon Valley’s worst-kept secret: Instagram harms teens’ mental health. Its impact is so severe that, for some users, it introduces suicidal thoughts.

Thirty-two percent of teenage girls who feel bad about their bodies report that Instagram makes them feel worse. Among teenagers with suicidal thoughts, 13% of British users and 6% of American users trace those thoughts to Instagram, according to the WSJ report. And this is Facebook’s own internal data. The truth is surely worse.

President Theodore Roosevelt and Congress formed the Food and Drug Administration in 1906 precisely because large food and pharmaceutical companies failed to protect the general welfare. As its executives parade at the Met Gala in celebration of the unattainable 0.01% of lifestyles and bodies that mere mortals will never achieve, Instagram’s unwillingness to do the right thing is a call for regulation: the FDA should assert its codified right to regulate the algorithm powering the drug of Instagram.

The FDA should view algorithms as a drug affecting our nation’s mental health: the Federal Food, Drug, and Cosmetic Act gives the FDA the right to regulate drugs, defining drugs in part as “articles (other than food) intended to affect the structure or any function of the body of man or other animals.” Instagram’s internal data shows that its technology is an article that alters our brains. If this effort fails, Congress and President Joe Biden should create a mental health FDA.

The public needs to understand what the Facebook and Instagram algorithms prioritize. Our government is equipped to run clinical trials on products that can physically harm the public. Researchers can likewise study what Facebook privileges and the impact those decisions have on our minds. How do we know this is possible? Because Facebook is already doing it; it is just burying the results.

In November 2020, as Cecilia Kang and Sheera Frenkel report in “An Ugly Truth,” Facebook made an emergency change to its News Feed, putting more emphasis on “News Ecosystem Quality” (NEQ) scores. Sources with high NEQ scores were reliable; sources with low scores were not. Facebook altered the algorithm to privilege content from high-NEQ sources. As a result, for five days around the election, users saw a “nicer News Feed” with less fake news and fewer conspiracy theories. But Mark Zuckerberg reversed the change because it led to less engagement and could provoke a conservative backlash. The public suffered for it.
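
To make the mechanism concrete, here is a minimal sketch of the kind of re-ranking the reporting describes: engagement-driven ordering, optionally reweighted by source reliability. Every name here (Post, neq_score, rank_feed) and the weighting scheme are hypothetical illustrations; Facebook’s actual ranking system is not public.

```python
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    engagement_score: float  # predicted likes, comments and shares
    neq_score: float         # News Ecosystem Quality of the source, 0..1

def rank_feed(posts: list[Post], neq_weight: float = 0.0) -> list[Post]:
    """Order posts by predicted engagement, optionally boosted by
    source reliability. With neq_weight = 0 the feed is pure
    engagement ranking; raising it privileges high-NEQ sources,
    as in the "nicer News Feed" of November 2020."""
    def score(p: Post) -> float:
        return p.engagement_score * (1.0 + neq_weight * p.neq_score)
    return sorted(posts, key=score, reverse=True)

# The emergency change amounted to raising neq_weight above zero;
# reverting it restored pure engagement ranking.
```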

Facebook has also studied what happens when the algorithm privileges content that is “good for the world” over content that is “bad for the world.” Lo and behold, engagement drops. Facebook knows that its algorithm has a remarkable impact on the minds of the American public. How can the government let one man set the standard based on his business imperatives rather than the general welfare?
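
A hypothetical sketch of that experiment, in the style of the A/B tests such platforms run: compare the aggregate engagement of a feed with and without demoting “bad for the world” content. The data, labels, and demotion factor below are invented for illustration.

```python
import random

random.seed(0)

# Each post: (predicted_engagement, is_good_for_the_world)
posts = [(random.uniform(0, 1), random.random() < 0.5) for _ in range(1000)]

def feed_engagement(posts, demotion: float) -> float:
    """Total engagement of the top-100 feed when posts that are
    'bad for the world' are down-weighted by `demotion` (0 = none)."""
    ranked = sorted(
        posts,
        key=lambda p: p[0] if p[1] else p[0] * (1.0 - demotion),
        reverse=True,
    )
    return sum(engagement for engagement, _ in ranked[:100])

baseline = feed_engagement(posts, demotion=0.0)
treatment = feed_engagement(posts, demotion=0.5)
print(f"engagement lost by demoting 'bad' content: "
      f"{100 * (1 - treatment / baseline):.1f}%")
```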

Upton Sinclair memorably exposed dangerous abuses in “The Jungle,” sparking a public outcry. The free market had failed; consumers needed protection. The Pure Food and Drug Act of 1906 enacted the first safety standards, regulating consumable goods that affect our physical health. Today, we need to regulate the algorithms that affect our mental health. Adolescent depression has risen alarmingly since 2007, and suicide among those aged 10-24 rose almost 60% between 2007 and 2018.

Of course, it is impossible to prove that social networks are solely responsible for this increase, but it is absurd to argue that they have not contributed. Filter bubbles distort our views and make them more extreme. Online bullying is easier and more constant. Regulators should audit the algorithm and question Facebook’s choices.

When it comes to the biggest problem Facebook poses, what the product does to us, regulators have struggled to articulate the problem. Section 230 is right in its intention and application; the internet cannot work if platforms are liable for every user statement. And a private company like Facebook loses the trust of its community if it enforces arbitrary rules that target users based on their background or political beliefs. Facebook as a company has no explicit duty to uphold the First Amendment, but public perception of its fairness is essential to the brand.

Thus, Zuckerberg has dithered over the years before belatedly banning Holocaust deniers, Donald Trump, anti-vaccine activists and other bad actors. When deciding which speech is privileged or allowed on its platform, Facebook will always be too slow to react, too cautious and ineffective. Zuckerberg cares only about engagement and growth. Our hearts and minds are caught in the balance.

The scariest part of “An Ugly Truth,” the passage that got everyone in Silicon Valley talking, was the eponymous memo: “The Ugly,” written by Andrew “Boz” Bosworth in 2016.

In the memo, Bosworth, Zuckerberg’s former deputy, writes:

So we connect more people. That can be bad if they make it negative. Maybe it costs a life by exposing someone to bullies. Maybe someone dies in a terrorist attack coordinated on our tools. And still we connect people. The ugly truth is that we believe in connecting people so deeply that anything that allows us to connect more people more often is de facto good.

Zuckerberg and Sheryl Sandberg made Bosworth walk back his remarks when employees objected, but to outsiders the memo represents Facebook’s unvarnished id, the ugly truth. Facebook’s monopoly, its absolute dominance over our social and political fabric, its growth-at-all-costs mantra of “connection,” is not de facto good. As Bosworth acknowledges, Facebook causes suicides and allows terrorists to organize. So much power concentrated in the hands of one corporation, run by one man, is a threat to our democracy and way of life.

Critics of FDA regulation of social media will claim that this is a Big Brother invasion of our personal freedoms. But what is the alternative? Why would it be bad for our government to require Facebook to account to the public for its internal calculations? Are we sure that session counts, time spent and revenue growth are the only results that matter? What about the collective mental health of the country and the world?

Refusing to study the problem does not mean it does not exist. In the absence of action, we are left with one man deciding what is right. What price do we pay for “connection”? That is not up to Zuckerberg. The FDA must decide.

Joshua Smith
https://startupgazzete.com
