Journalism is a public good and fundamental to a healthy democracy.

But as we all know, journalism is currently in crisis. Newsrooms are shrinking. Newspapers are dying. And trust in the news media is at an all-time low. More and more people are getting their news from digital platforms like social media — particularly younger folks.

The recently passed Bill C-18, the Online News Act, is the Canadian government’s attempt to support journalism in this rapidly changing information ecosystem. The law, which was influenced by similar legislation in Australia, seeks to create “revenue sharing between digital platforms and news outlets” — basically, companies like Google and Meta will be expected to pay news media companies for using their content. The tech giants, not thrilled with this development, have responded by threatening to block the sharing of local news on their platforms.

Much has been written about the pros, cons and challenges associated with Bill C-18. But I’d like to focus on an aspect that hasn’t received as much attention: the possible impact on the spread and uptake of misinformation — which is, no doubt, one of the greatest challenges of our time. (Spoiler: I don’t have an answer.)

Quality journalism is essential to the fight against harmful online bunk. Studies have consistently shown that if you get your information from the established news media — what online trolls love to call the MSM (mainstream media) — you are less likely to be susceptible to misinformation. This is the kind of content Bill C-18 is meant to support. But if you get your news from some other online sources — often called the alternative media — you are more likely to believe and share misinformation.

A study from 2022, for example, found that individuals who are anti-vaccine are more likely to get their information from places like Facebook, Instagram and YouTube. Those who are pro-vaccine consume content from the legacy media — that is, content created by journalists.

Another study published this year found that online anti-vaccine discourse is largely driven by content from “low-reliability media sites” (read: not quality journalism). Similar trends can be found with other misinformation-filled topics, such as the Big Lie and climate change denialism.

Obviously, social media platforms like Meta are not the only online sources of misinformation. Search engines also surface inaccurate or misleading content. Research I’ve done with our team at the University of Alberta has found that search engine results are often filled with misinformation.

In one study, we looked at Google search results for the phrase “immune boosting” and found that 85.5 per cent of the 227 websites we examined inaccurately portrayed its benefits, and only 10 per cent provided a much-needed critique of the concept. In other words, the content recommended by Google was mostly misleading.


Of course, the rise of AI-generated content will only make all of this much more problematic.

A recent analysis by NewsGuard identified dozens and dozens of news websites that are entirely written by AI. Some of these sites publish hundreds of articles a day, and some include false narratives and misleading information. And studies consistently show people can’t reliably distinguish between social media content generated by humans and content generated by AI.

For me, the controversies surrounding Bill C-18 highlight the tremendous challenges and conflicts associated with crafting policy to deal with the spread of misinformation. Social media and search engines aren’t going away.

On the contrary, they are increasingly becoming a central part of our information universe. Given this reality, we need to take steps to ensure these platforms are filled with reliable, trustworthy and evidence-informed content.

By supporting good journalism, this law is, at least in spirit, designed to help do exactly that. But if the platforms follow through on their threat to block news content, might it end up making room for low-reliability media sites, thus heightening our exposure to harmful misinformation?

I don’t think we should allow large tech companies to bully national governments into policies that favour their bottom lines. While Bill C-18 isn’t perfect — some have noted the law will benefit larger existing news media companies more than smaller and emerging entities — I’m also happy to see countries exploring ways to support journalism.

Still, the current kerfuffle highlights the complexity and potential unintended consequences of regulating this space. The last thing we need is a law that facilitates the spread of misinformation and misleading content.

Timothy Caulfield is a Canada Research Chair in Health Law and Policy at the University of Alberta and author of Relax: A Guide to Everyday Health Decisions with More Facts and Less Worry (Penguin Canada, 2022).
