Earlier this month, the U.S. surgeon general issued an extraordinary warning: social media is an "important driver" of a mental health crisis among young people.
For Facebook whistleblower Frances Haugen, that came as no surprise. A former data scientist at the company, Haugen in 2021 released internal documents showing how Facebook prioritized profits over tackling harmful content like misinformation and human trafficking, and over addressing its impact on children's mental health.
Yet even though social media's harms are well known, the announcement marked a shift in the global push to regulate these companies — one that could open the floodgates to "the most public debate we're going to have in the next 15 to 20 years," she said. Haugen was speaking at a Monday event co-sponsored by Canada's National Observer and McGill University's Centre for Media, Technology and Democracy, where she is a fellow-in-residence this year.
The big question will be whether the policies that emerge from those discussions will bolster democracy by protecting social media users' "autonomy and dignity" online instead of using an authoritarian, censorship-based approach.
"I'm a little scared about what's being discussed right now," she said. Utah recently passed laws that effectively eliminate children's online privacy, while Montana has moved to ban TikTok entirely. Both approaches take a heavy toll on people's freedom that more "moderate, sensible laws" could avoid.
The early fixes are relatively easy, she said. For instance, forcing social media companies to be more transparent about everything from how their algorithms work to how many people moderate content in languages other than English — including French — could help hold them accountable to their own safety pledges.
"These are the kinds of things that unlock what I call the ecosystem of accountability. You can start to do lawsuits, you can start to boycott, you can start to do divestment," she said. "These laws can be transformative."
Implementing more oversight is key, particularly with the rise of TikTok. A growing number of under-35s get most of their news from the Chinese-owned platform, which she said is designed so that users see less content from people they know and more from a handful of popular accounts. That model makes it easier for the company — which has close links to the Chinese government — to control users' "news ecosystem."
Even small tweaks to the platform's user interface itself can make a big difference. Modifying news feeds to make it harder for users to see and share content generated by strangers reduces the spread of harmful content and hate speech. Tweaking the platform so it loads more slowly near a user-set bedtime could also help people sleep more by making "doom scrolling" less appealing than a good night's rest.
While the European Union is furthest along when it comes to implementing these types of laws, she said Canada has a "huge" opportunity to shape how social media companies are regulated worldwide. In addition to pursuing domestic laws around social media accountability and transparency, the country is well-placed to spearhead global alliances on the issue.
Canada's geographic and cultural proximity to the U.S. also means that laws passed here will seem more palatable to Americans than European ones, she said. In light of U.S. state-level laws and right-wing chatter that have focused on heavy-handed measures like Montana's TikTok ban, that influence could be key to ensuring a democratic social media future.
"It's about giving people choice," she said. "Transparency laws and risk-based laws that are about giving people information so they get to make real choices."