Social media must be reined in
Countries are adopting higher age limits, but they need to tackle addictive design practices that target everyone

Nearly half (47%) of people between the ages of 16 and 21 would prefer to be young in a world with no internet. That startling finding comes from a new survey released Tuesday by the British Standards Institution, which also found that 68% of respondents feel worse about themselves after spending time on social media platforms.
The survey results are sure to fuel a growing conversation about the harms and drawbacks of social media platforms. Executives like X CEO Elon Musk and Meta CEO Mark Zuckerberg are championing a future of social media with even less moderation and fewer limits on what kind of information can spread on their platforms.
For those concerned about the mental health impacts of social media or the information environment created when false information and AI slop can easily spread to millions of people, those plans sound like a disaster waiting to happen. It should come as no surprise that governments are starting to act.
Curbing social media use
Last year, Australia passed a ban on under-16s using social media platforms, due to come into effect in December 2025. It was a shot heard around the world, and has emboldened other governments to consider their own responses. New Zealand is planning to follow suit, while the European Union is looking at setting an age of “digital majority” at 15 years. A growing number of governments are also banning smartphone use by students in schools, or at least during class time.
These conversations are playing out beyond Western countries, and have become particularly common in Asia. China’s policies are probably the most well known, with limits on the amount of time young people can spend on devices and “youth modes” on social media that promote educational content while limiting posts deemed more harmful. But it doesn’t stop there.
Malaysia and Pakistan are looking at licensing systems for social media, which would require platform companies to register with a government body and abide by certain rules to continue operating. Indonesia and Singapore are considering minimum-age laws for using the platforms, while Vietnam is moving forward with account verification requirements. Companies will be watching these developments closely, as they have significant user bases in that part of the world. Some 40% of Facebook users are in South and Southeast Asia, and they’re among the most engaged.
As the debate over social media platforms, and particularly their effects on young people, gathers steam, we need to make sure we’re taking the right approach: one that addresses the real harms without reaching for easy fixes that fail to tackle the underlying issues while creating serious unintended consequences. Despite what you might hear from some digital rights groups, there is a strong argument to be made for doing something about kids’ social media use.
Targeting the right problem
We have a long history of doing that in other media, whether through regulation of advertising targeting children, limits on their ability to view or access certain forms of entertainment or information, or the promotion of books, films, and television made exclusively for kids. Some of these measures have been rolled back in recent decades — not because they were deemed inappropriate, but because companies have consistently pushed for a greater commercialization of childhood. There’s little reason this ethos shouldn’t also apply in the digital environment.
It’s about time governments started getting serious about this issue after years of echoing tech company narratives and declining to regulate the industry. Social media has been able to expand in an environment with few rules while the founders behind it championed the notion that connection was an inherent good and ignored the drawbacks of what they were building. Now that regulation is arriving, we need to make sure it’s done properly — and I’m not convinced the dominant approaches are the right ones.
While I can see the argument for a higher age limit on social media, I’m not sure it’s the most effective approach or that it targets the right problems. Australia’s eSafety Commissioner recently found the existing minimum age of 13 is very poorly enforced, with at least 80% of Australians aged 8 to 12 already accessing social media. Even if that age limit were raised and enforcement became more effective — something I would argue is unlikely — the BSI findings show quite clearly that the negative effects of social media aren’t exclusive to those under 16 years of age.
It seems to me there are two better approaches. The first, specifically addressing young people, is something like what China has done on social media with “youth modes.” We already have something like this with YouTube Kids, but as many parents can attest, Google doesn’t do a very good job of actually making sure it’s a safe, let alone educational, space. There’s little profit in that specific offering. Dedicated apps like the one made by recently defunded PBS Kids would be a better alternative — a non-commercial, publicly funded digital experience that isn’t trying to advertise toys to children or otherwise turn a profit. Other public broadcasters, like Canada’s CBC and Australia’s ABC, have their own versions.
There could easily be a graduated approach, with a more limited “kids” experience designed for the youngest users and more freedom being offered as children move into their teenage years. Undoubtedly, young people will be able to get around those limits if they so choose, especially if their parents allow it — but it will take a little more work, and potentially inappropriate content won’t just be thrown in front of them; they’d have to search it out.
The second approach is much broader, covering every social media user, and that’s tackling the way the platforms function. We have plenty of reporting and research that shows social media platforms are designed to be addictive — to ensure people spend more time on the app, thus generating more advertising profits. Social media platforms learned techniques from gambling companies to keep users hooked by using likes, notifications, and other methods to entice people to keep coming back, triggering dopamine responses that their brains craved even if the platforms made them feel worse at the same time.
Tackling those addictive design practices, the dark patterns in interface design that nudge people toward certain actions, and the way the platforms’ algorithms spread and amplify certain (often extreme or sensationalist) content to keep people engaged is a more difficult task than imposing a hard age limit. Despite the country’s decision to move forward with an age limit, Australia’s eSafety Commissioner has already been examining and reporting on platform design practices and the problems with algorithmic recommendation systems.
If we’re serious about minimizing the harms of social media platforms, design interventions and algorithmic limits are a much more promising approach. Those policies will undoubtedly spur backlash from tech companies and cyberlibertarian rights groups that call out problems but too often stand in the way of governments addressing them, yet that’s not a good enough reason not to be bold in facing up to this challenge.
Digital harms are a collective issue
Social media companies need to be held to a higher standard. They had the opportunity to address these problems themselves, but they’ve chosen to double down in a self-serving effort to increase their political power and short-term profits. Outside the United States, countries should get far more aggressive to ensure social media aligns with local values and expectations. If the US and Chinese companies that increasingly dominate the space refuse to abide by new rules in other parts of the world, their refusal should be wielded to build public support for domestic alternatives and funding for local tech development — including in the public sector.
Some people might look at this problem and simply ask, “why don’t they just leave social media?” But this isn’t solely an individual issue. These platforms are central to how we communicate, how information spreads, and, for some people, how they make a living. It seems unlikely that social media as a form of communication can — or should — be put back in the box, but that doesn’t mean the platforms need to operate the way the most powerful companies in the world have designed them. Instead of maximizing profits and shareholder value, we could take a different approach: one that maximizes public benefit instead.
This is a collective problem, and that requires collective solutions. When so many young people think it would be better to get rid of the internet altogether than to continue living with the effects of social media on their mental states, it’s clear that action is necessary. We owe it to everyone in society, particularly those just getting their start in life, to find a better path.