Senators Push for Stricter Social Media Guidelines for Kids: Will This Cause More Harm Than Good?


Young kids look at their devices together. Source for photo: Daily Sabah.

During a Senate Judiciary Committee hearing on February 14th, Senators from both parties voiced support for the Kids Online Safety Act, which pushes for stricter guidelines for social media platforms such as Instagram and TikTok in order to protect young people from harm.

While Instagram requires all users to be at least thirteen, many who join are younger than that.

Senators have advocated for bans on content that promotes eating disorders, low self-esteem, bullying, self-harm, or drug abuse. Research has shown a more than 60% increase in depression among adolescents from 2011 to 2018. A 2019 study found ties between this decline in mental health and the rise of social media.

Stricter guidelines for tech companies would include more extensive parental controls over children’s social media use. However, this move could disadvantage children and teens with marginalized identities who use social media as an outlet and a place of community where they can be themselves with others of similar backgrounds.

Many LGBTQ+ youth living in hostile households would be cut off from resources, such as a supportive community and sex education, that they would not otherwise have access to. This deprivation could prevent youth from escaping dangerous situations or push them toward more extreme and risky outlets of expression.

This “strict parents lead to sneaky children” pipeline was demonstrated when QAnon conspiracy theorists were banned from various social media apps after the January 6th insurrection. The dangerous lies being spread did not die; rather, they found a new home on darker, more inconspicuous platforms, perhaps fostering even more dangerous ideas without the alternative perspectives found on mainstream social media.

The debate surrounding harm prevention versus censorship is not limited to youth, as seen with former President Donald Trump’s ban from Facebook for his role in spreading QAnon theories and the phrase “Stop the steal” leading up to the January 6th insurrection at the Capitol. However, earlier this year, Trump was reinstated on the site. Nick Clegg, president of global affairs at Meta, Facebook’s parent company, stated, “The fact is people will always say all kinds of things on the internet. We default to letting people speak, even when what they have to say is distasteful or factually wrong.”

The main question these guidelines raise is who gets to dictate what is and is not allowed on social media. Concerns about censorship come amid claims that Elon Musk, Twitter’s new CEO, has suspended accounts for criticizing him, as well as a flood of library book bans targeting stories centering LGBTQ+ characters and characters of color.

Opponents of the Kids Online Safety Act, such as Evan Greer, director of the group Fight for the Future, voice valid concerns about censorship and authoritarianism. Banning specific content deemed inappropriate for children leaves room for bias in determining what is and is not safe. While Senators from both parties stand united in protecting young people, opposing views on the safety of content involving guns, the LGBTQ+ community, race, and other topics of frequent debate on Capitol Hill could prove divisive.


Furthermore, at what age should filtering end? When would one be mature enough to view content that promotes suicide, self-harm, sexual trafficking, and more, when we’ve seen how adults are also susceptible to the dangers of social platforms, as evidenced by the widespread mistrust in the 2020 Presidential election that culminated in the attack on the Capitol?


Bans on certain content and greater parental controls are not a sustainable solution for protecting children from the risks of social media, nor are children the only group being harmed. Instead, the U.S. needs stronger data privacy laws, such as the European Union’s General Data Protection Regulation, which gives users greater control over how much of their data they share and with whom. Such laws would help children and parents alike by preventing platforms like TikTok from feeding users targeted content that could lead to dangerous rabbit holes, and by guarding against data abuses like the Facebook-Cambridge Analytica scandal, in which millions of users’ data were harvested to push voter support for Donald Trump. Additionally, stricter laws could prevent the possibility of one’s data being used as evidence in court, a concern that led many women to delete their menstrual cycle apps after the overturning of Roe v. Wade last year. While the Kids Online Safety Act aims to protect young children, more data collection and tracking will only make all internet users more vulnerable.