Editorial: How do we protect the young from social media?

10:06 PM March 23, 2026

There are currently 11 bills proposed in Congress that cover Child Online Protection and Social Media Safety. A good number of these have been filed because other countries have begun to either pass or entertain similar legislation.

To assert that we need laws to protect the young from social media, we have to accept two premises. 

The first is that social media platforms contain significant potential harms that require legislation. 

The second is that legislation and regulation are the right tools to protect the young, and the government is the agent to enforce this.

There are many studies already establishing a correlation between social media use and depression, anxiety, and other internalizing symptoms. But the point that social media platforms stress in their own defense is that correlation does not establish causation.

So we are in a position where we can all intuit or sense that there’s something wrong with the way social media platforms run, but we are still asking, is there sufficient evidence for legislation? 

And on the other end of this, if there are actual harms and we don’t do something now, what if we act too late to reverse, undo, or prevent those harms?

I think the other way to approach this is to look at what we have learned from social media platforms themselves. 

Litigation in the US has revealed documents that show that social media platforms were aware of the potential harms they were designing into their products, and they did it anyway. Meta’s own researchers described Instagram internally as “a drug” and its employees as “basically pushers.” 

An internal TikTok report acknowledged that “minors do not have executive mental function to control their screen time.” It’s also been documented that social media companies use the same tricks as casinos and the gambling industry to keep people hooked.

We also know that social media platforms are spaces where children and young people face real, documented harms such as algorithmic manipulation, exploitation, and cyberbullying. Between the available studies — including a 2024 meta-analysis published in JAMA Pediatrics establishing correlation between social media use and internalizing symptoms across adolescent populations — and the unethical practices we already know these platforms engage in, I believe there’s more than enough reason to explore legislation.

We need to fix platforms, but not just for kids

I do have one major concern about targeting only a specific age group for protection. Of course we need to protect children, the young, and the vulnerable from the harms of social media.

But what happens when they turn 16 (the age limit set by several of the proposed bills)? If social media is a mess that radicalizes people, makes them dislike themselves, and actively causes them harm, then why aren’t we trying to legislate and hold social media accountable for everyone?

That’s my first challenge to those advancing these bills. Let’s start with the kids, but more importantly, we need to fix platforms so that they are actually good spaces to inhabit. Imagine if it were a physical space that was unsafe. 

You wouldn’t just restrict young people from entering that space, you would demand that the space be fixed or demolished. Well, this is a space that shapes our minds. We should be even more demanding.

We need to fix social media for everyone, not just for kids. Or, if these platforms don’t fix their designs and uphold more pro-social standards, then we need to get off them, though that may be even harder given how entrenched they are in our societies. I don’t know if that’s something legislation can fix.

We need to know exactly where the interventions lie

I think one of the challenges facing any legislation is identifying where exactly the levers are. This is especially challenging with the many different kinds of platforms in play and how many of these are conflated in the many bills. 

But before I get to the details, I will argue that this isn’t just a matter of regulating specific social media platforms. Rather, we need to frame this within a context where we are navigating a world together and share the responsibility for making spaces safe and enriching for children.

When I say we, I mean the government, platforms, parents, schools, and anyone who uses these platforms. Setting an age limit is not enough, but we need to understand that the age limit is the first thing we are looking at because it is one of the easier things to enforce, and even then enforcement will be contested depending on where age verification is done.

First off, in terms of parents and social media platforms sharing responsibility, we can ask, should we explore a “kids” mode? This would be similar to how parents can have a kid mode for their streaming services like Netflix or YouTube. And if there is such a mode, what age is appropriate for that kind of mode?

Another thing to look at is screen usage limits, which appears in proposed bills. This would be incredibly difficult to enforce, and the question is, who would be the one enforcing this? Would this be something that a parent does? Or the platform? Or something on the child’s device? This is where we see how a sensible suggestion like “let’s limit screen time” can become a challenging policy puzzle which might prove ultimately unenforceable.

This leads to the challenge of understanding where the real interventions come in. For example, when we say that we need to ensure no one under 16 uses a social media platform, whose job is it to enforce that? Do the platforms do the verification? Do we connect devices to identities, so that the device communicates the age of the user? These are the technological questions we will need to decide on as we develop how to implement any legislation.

One thing I do believe is enforceable is that social media platforms such as Instagram and Facebook should not be communication channels for students. Other platforms like Discord or Viber may fill that gap, but if we are to implement restrictions on social media usage, schools would need to identify and accredit specific platforms for student communication — similar to how companies designate internal comms tools like Slack. The choice of communication platforms should be accredited by the relevant government entities.

Media Literacy and maintaining safe spaces

Perhaps the biggest concern for me is that we need to be equally focused on media literacy, AI literacy, and critical thinking and engagement with the online world. It’s easy to see that the online world has become far more dangerous, contentious, and problematic than it was when people first started using social networks in the mid-to-late 2000s. Increasing literacy and awareness has to be a crucial, and well-funded, component of any legislation.

Lastly, I do have a personal concern about allowing the young to access social media and online networks. I believe that any framework rooted in child protection must also be rooted in children’s rights.

For young people from marginalized groups such as LGBTQ+ youth, those from minority communities, and those whose home environments are not safe or affirming, online spaces have often been lifelines.

Legislation that places sweeping controls in the hands of government and parents without nuance risks replicating, in digital form, the same exclusions these young people already face in physical spaces. A child whose identity is not accepted at home should not find that the state has handed their parents another mechanism of control over who they can talk to and what they are allowed to know. Protection must be defined broadly enough to include the right to access community, information, and one’s own developing sense of self.

The goal of legislation in this space should be to protect young people from harm while actively safeguarding their right to connect, to learn, and to belong.

TOPICS: child safety laws, social media safety, technology

© Copyright 1997-2026 INQUIRER.net | All Rights Reserved