PARIS – Across the globe the headlines come fast and furious.
A mother discovers her 8-year-old daughter exchanging messages with a 22-year-old Florida man the child met on TikTok. In California, a 36-year-old man poses as a delivery driver and appears at the home of a young girl, one of more than 20 girls with whom he had initiated sexually explicit conversations. An Oregon mother wakes to find her son has committed suicide in the garage – the result of cyberbullying on social media.
If these things happened at a neighborhood hangout, would you let your child go there? Of course not. Yet they happen every day in the spaces teens frequent most: online social networks like TikTok, Instagram and Snapchat.
Here in France, where I live and raise my daughter, parents have had enough – and lawmakers are finally listening. On March 2, France’s Parliament began debating, and then advanced, new legislation that would bar teens under 15 from opening social media accounts without parental consent, and would require that consent for under-15s to keep any accounts they currently have. In the U.S., legislators have introduced bills in the House and Senate that would set the minimum age for social media use at 16.
In both nations, these measures are long overdue. Unfortunately, society needs to do much more to address the risks that social media poses to young people – psychologically, emotionally and cognitively – and especially young girls. Particularly challenging is the scope of this issue, which transcends countries and cultures.
Wherever social media exists, so does potential danger for young people. Studies in Britain, Spain and Scotland have linked social media to increased aggression, anxiety, bullying, psychological distress and thoughts of suicide in young people between the ages of 11 and 16. An Australian study found that after spending just 10 minutes on Facebook, young women reported being in a more negative mood than those who browsed a control website.
In the U.S., researchers set up fake social media accounts. Accounts with feminine usernames received an average of 100 sexually explicit or threatening messages a day, while those with masculine names received fewer than four.
The Reboot Foundation, which I founded in 2018 to encourage critical thinking, reflective thought and media literacy, polled 1,000 French citizens in February on whether they would support new restrictions on social media platforms. Support for government intervention was overwhelming: 77% of French people supported raising the minimum age for social media use from 13 years to 15 years old. When it comes to digital advertising, 86% supported banning ads targeting minors. And the algorithms responsible for amplifying and recommending harmful, hateful content to young people? Three-quarters of the French favored prohibiting them.
In the U.S., support is equally strong. A new Reboot survey of 1,049 U.S. adults found that 62% agreed with the statement, “Children younger than 16 should not be allowed to create social media accounts.” Meanwhile, 73% agreed that digital platforms should not be allowed to examine a user’s profile, internet history or search history when recommending new content. And an overwhelming 80% agreed that social media companies should warn users that research has linked social media use to increased mental health problems in young people.
For lawmakers in both France and the U.S., it all adds up to a fraught debate over when and how best to rein in social media access for young people, who easily skirt current age restrictions. In response, France’s National Assembly has passed a measure requiring parental consent for users under 15. The proposal now heads to the Senate, where its odds of passing are good.
Meanwhile, in the United States, the government has until very recently been largely silent, leaving it to the courts and local leaders to combat social media’s ills. Now the U.S. has an opportunity to learn from efforts in France and elsewhere in the EU. On the table is legislation proposed in February by Sen. Josh Hawley (R-Mo.) that would raise the legal age for social media use in the U.S. to 16 and, like its French equivalent, require social networks to verify the identity and age of their users. Rep. Chris Stewart (R-Utah) has introduced a similar bill in the House.
All of this is a much-needed start, but these are only the first steps on a long road to social media safety. The time has come for social media to be regulated much like tobacco or alcohol, with vigorously enforced age restrictions, warning labels for users and tough penalties for violators. As we reach for these goals, governments and the platforms themselves must also provide more funding to schools and consumer protection groups working to safeguard young people online.
One example: funding additional research into “prebunking,” which teaches users the tricks and tactics of misinformation spreaders so they can recognize them. Requiring tech firms to fund comprehensive, high-quality, systemic media literacy and critical thinking education programs to fight misinformation – similar to the approach taken in Finland – should also be pursued.
While new regulations on social media platforms are necessary, we must also invest more in teaching critical thinking skills at the K-12 level. It is not enough simply to restrict the brain candy that young minds can overindulge in online. We must also provide tools and support that feed positive cognitive development – building a strong foundation that helps adolescent minds learn to question, assess and objectively evaluate what they see and hear on social networks.
Another recent Reboot survey, of France’s youngest TikTok users, found alarming results: 46% of 11- to 14-year-olds said they thought TikTok “promotes reliable resources,” and 47% said a “certified” account on TikTok is synonymous with expertise in its field.
I created a foundation to elevate critical thinking and reflective thought, and I recognize that it can be dangerous to give the government more control over what people in a free society can see, hear, and say. But it’s clear from our research, and that of others, that it’s past time for action.
The social media companies cannot – or will not – police themselves adequately. The time has come for the government to intercede with thoughtful, well-crafted laws, like it did to promote automobile safety by mandating seatbelt use; like it did by setting the legal drinking age in the U.S. at 21; and just like it did when it forced tobacco companies to put warning labels on their products.
Big Tech and its social media platforms are emerging as this decade’s Big Tobacco. It’s time we started treating them that way, on both sides of the Atlantic.