Dr. Taylor Owen speaks about fake news and the state of the media
Please enjoy this uncut video of Dr. Taylor Owen, Professor at McGill University, speaking at a National Observer fundraiser about fake news, social media's role in how we digest media, the future of news, and the current state of the media.
So I want to talk about journalism, but I’ll do it in the context of something a little bit broader, I think. That clock in the background, very ominous. So, the space I’m working in now has, for the last couple of years, been really focused on this crisis of misinformation we’re seeing, the foreign intervention in elections happening around the world, the rise of corporate surveillance and state surveillance. And I think there’s this confluence of issues around technology and democracy that is really coming to the fore. And I want to step back from that a little bit and suggest that I actually think we’re going through a real crisis in democracy. And I really do worry about this, about the ways technology is negatively affecting the integrity of our democracy. I think the collapse in trust we’re seeing in democratic institutions, the collapse of the media, the collapse of reliable information in our society, the way we’re being divided against each other, are a real, fundamental problem in our western democracies right now.
And so that’s my starting point for this. What I want to try and argue is that I actually think this is, in part, caused by the current nature of our digital technology. There’s a big debate over this, right? Whether the technology we have and the way we use it are the cause or the consequence of this decline we’re seeing, of this change in our democracy, our public sphere: the way we get information, the way we share information, and what we know about the world in our society. But I actually think technology is a significant cause here. And I want to talk through why I think that and some things that we could do about it. So, let me just make four broad arguments about this. The first is that the internet is not what it used to be. I think we’ve been through waves of evolution of this digital infrastructure that is the internet.
The original internet, what is called Web 1.0, empowered individuals in some fundamental ways. It gave voice to people who weren’t able to speak and broadcast their voice in public to large audiences, people who were excluded by the filtering the media provided: if you wanted to reach a broad audience, you needed access to a publication. The original internet challenged that. Anybody could now speak. And that was legitimately empowering, and the technology of the internet facilitated it. Individual nodes on the internet were given real autonomy and real power. A second wave of the internet, very broadly, we could call the social web. What it did was take all these different nodes, all these people who could now speak, and allow them to organize with each other. It enabled a form of collective action where, for the first time, you didn’t need to be a hierarchical institution, a large, 20th-century industrial organization, to get large numbers of people to do things. All of a sudden, they could connect with each other in this social web and act collectively. And I believed at the time, and I worked and wrote on this at the time, that this was fundamentally empowering and aligned with democratic goods and democratic values. This was a positive force in democratic society. It was enabling activists, it was enabling new media organizations, and it was ultimately making our democracy richer, more participatory, and more responsive to the demands of citizens.
I would suggest that we’re in a different internet now. It’s what I’d call the platform internet, or the platform web: an internet that’s largely dominated, in the western world, by between four and six companies. I’ll talk a bit more about the divisions going on globally, but for large parts of the world, the internet is essentially governed by Amazon, Microsoft, Google, Apple, and Facebook. For most people, most of the time, these companies are their experience of the internet. Twitter is a bit of an anomaly (we’ll be talking about Twitter); it’s not really a platform in the same sense, or at the scale, that these others are. So, for most people, the way they experience this infrastructure of the internet, the way they receive their information, is filtered by these companies. I think if we want to understand the nature of the problems we’re facing in our information space, in our public sphere, we need to understand how these companies function. We need to understand the structures of the companies themselves.
So that’s the second point I want to make, which is that when you look at these companies, which are ultimately global monopolies, we see some real structural problems, I think. Let me just describe two of them. One is the way they’ve monetized, and this is what’s often called the attention economy, or surveillance capitalism, where data about our lives that we provide to companies are then used to target and filter content back to us. So what we see and what we experience are a function of the monetization of that data. There are two attributes of the attention economy that are problematic. One is that the data we provide are used to create incredibly detailed profiles of our lives, of us. A Facebook user, for example, might have 30,000 data points about them collected and stored by the company. And those data points are then used, by anybody who wants to influence our behaviour for any reason, to target information to us. This could be to sell an advertisement, but it could also be to make us vote a certain way, or make us not vote, or make us hate someone, or make us disbelieve something, right? Anyone who wants to influence our behaviour has access to this system to target us based on these highly detailed profiles of our likes, our dislikes, and so forth. Facebook, for example, has told advertisers that it knows the moment a teenager feels insecure and worthless, and that in that moment, they can be targeted. So it’s an incredibly powerful tool of persuasion that’s been built, and it’s incredibly profitable.
Facebook made $60 billion last year, 99 per cent of which was from targeted advertising, which is about changing behaviour. The second challenge with the attention economy is that the primary metric of value for these companies is engagement. They need you to stay on these sites as long as possible and not go to another site, right? And it turns out that the things that engage us are not necessarily aligned with the public interest or reliable information or good journalism or whatever it might be. The things that engage us are more likely to be things we disagree with, things we hate, things that enrage us, things that are blatantly false. This is just a structural problem in the system: fake information and disinformation spread faster than reliable information, partly because of this engagement metric. So that’s one structural challenge. A second structural challenge in these platforms is the way they deal with scale. These companies are operating at a truly astounding global scale: there are about a billion new posts to Facebook every day. And that creates some challenges.
If I’m an individual user and I log onto Facebook, how is it decided what I see? That is increasingly determined by artificial intelligence. The way this scale is managed is through automation. Content is filtered based on what the system thinks we want to see, and the platform is increasingly moderated by artificial intelligence. So what is allowed to be said, and what we are allowed to hear, is filtered by these automated systems. And this is a challenge because these artificial intelligence systems are necessarily opaque. Humans can’t necessarily understand how they’re making the decisions they’re making. And this has a real determinative effect: whether we as individuals are seen, how we’re heard, what we are able to see, what we are able to consume, is determined by systems that we fundamentally don’t understand and that are incredibly difficult to hold accountable. And this is scaling considerably as these companies start building virtual worlds, as they start augmenting our lives. I think we’re heading into a very worrying space, in my view, in terms of the role A.I. is playing in determining aspects of our lives. There’s even the creation of media that’s now happening, what’s called synthetic media, or what people sometimes call deepfakes, where you literally have A.I. creating versions of reality that are then targeted back to us using the kinds of profiles I was talking about.
So we’re really heading into an uncertain space, I think, for the credibility and reliability of evidence and information in our society. The third point I want to make is that I think those structural problems are the cause of some of the challenges we’re seeing in our democracy at the moment. Let me just list a few of what I think are the direct challenges they’re causing. One is that the nature of information is becoming increasingly unreliable. Simply put, I think this platform ecosystem, the way we receive our information, is an increasingly toxic place. Highly gendered and racialized speech is often incentivized, political discourse has become extreme and divisive, and speech has been weaponized, often with a censoring effect. More speech often has a censoring effect on the people who are the victims of hate speech and harmful speech. It’s shutting out certain voices in our society in some worrying ways. At the same time as this discourse is getting more toxic and more divisive, we’re seeing the collapse of the industry of journalism, and this is continuing apace.
This year, the revenue of the hundred newspapers in Canada dipped below the revenue of the CBC. So we essentially have two pools of media funding in this country, and they’re both… Well, the CBC’s isn’t declining; that’s another thing we can talk about. The collapse of journalism and the industry of journalism is in part a function of the economic model that came out of this platform economy. For a while, I worked at the Columbia Journalism School, where I ran the centre on digital journalism, and we were studying this at a time when there was a ton of optimism about how digital start-ups were going to fill the gap left by the collapse of legacy media organizations. This was going to be a really positive thing. But that hasn’t really happened, and it certainly hasn’t happened in Canada, despite some exceptions like the National Observer here. We have not had the kind of scaled growth of the digital journalism sector that happened in the US, and even there it’s now looking quite vulnerable. It looks like a lot of it was just a VC-funded bubble, frankly, where the hiring of journalists at BuzzFeed and Vice and Mic and Vox was largely subsidized by VC funding, and that’s now coming home to roost in some ways. We’re going to see a real downturn there, I think. We can talk about that separately. But I think we’re at a point where the legacy companies are continuing their decline and you’re not necessarily seeing the digital start-ups replace this kind of civic content, and that’s precarious at a time when we’re facing all these other challenges to reliable information. The second consequence of this structural challenge is that we’re becoming fragmented.
I think the nature of the information ecosystem leads us into filter bubbles and echo chambers of information that we either agree with or that is targeted against us to divide us. The result is that in this information system, and this is something we’re really starting to see, tribalism and polarization can emerge incredibly quickly. It used to be that if you wanted to spread a conspiracy theory, like the anti-vax movement for example, it would take a decade. It would be incredibly difficult to reach a lot of people with a conspiracy theory or a blatantly false rumour or whatever it might be. That can now happen incredibly rapidly, just because of the ability for content to go viral and to pay for access to the populations most vulnerable to the message you want to spread. The anti-vax movement, largely funded by individuals in California, has spread incredibly rapidly through western Europe over the past two years, to the point where there were 41,000 cases of measles in France last year and only 5,000 the year before. And the head of whatever their health agency is has blamed that on the spread of the anti-vax movement on social media, funded by promoted, micro-targeted ads coming from the United States. So it’s an incredibly powerful tool for this kind of thing, and that’s really worrying. And it’s increasingly leading to violence. One study found that any town in Germany with one standard deviation greater Facebook use had a 50 per cent higher chance of violence against immigrants. Violence against immigrants. So this is triggering violent behaviour, not just division.
So here in Canada, I think we kind of lag on a lot of tech trends, and I’m curious what you think, but I suspect that some of the division we’re seeing around pipelines, around immigration, is being fueled by these kinds of phenomena. I’ll come back to this in a minute, but we don’t know for sure, because we don’t have a good grasp of the nature of this problem in Canada. We need more journalism on it and more research on it in this country. The third and final structural implication of this for our democracy, I think, is that our elections themselves are becoming more vulnerable. We saw this with what happened in the 2016 election in the US: the hacking of the email servers of political campaigns, the spreading of that material on WikiLeaks, the behavioural nudging that happened with Cambridge Analytica, funded by the Mercers. But it can actually be far more subtle, and I just want to mention one example that I think gets at how challenging this problem is.
One of the things the Russian government did in the 2016 election was create Facebook pages, a year before the election, that were fan pages for prominent figures in the African American community in the US. One was for Beyoncé and one was for Malcolm X. And for a year, they just posted fan content. For Beyoncé, it was concert videos, gossip about her, whatever. And they built hundreds of thousands of followers across the country. Two days before the election, they then weaponized those pages by posting content intended to suppress the African American vote. The main thing they posted was content about Bill Clinton’s welfare reform package and what it did to the criminal sentencing of African Americans in the United States. That’s what they had tested, in all sorts of other ways, would be the most powerful message to suppress the vote, and they targeted the districts where they knew it could suppress the Clinton vote the most. So, how do you spot that? It’s just a normal Facebook page. It was posting Beyoncé content for a year, and then, two days before the election, it turned into this very powerful voter suppression tool. And I don’t think we know how to stop that. I think it’s a real problem.
So the final point I want to make is that, in my view, this is ultimately an epistemological problem, a problem of how we know about the world, and I think it’s in crisis. How we decide things collectively in our democracy demands that we share some similar information, and if that is at risk, I think we have real problems. And I think ultimately that’s due to a governance failure. I’ve spent a lot of time over the last couple of years trying to convince governments that this is a space where they need to govern, and working through with governments ways they could try to govern this space. This gets into sensitive terrain, right? Because it touches on freedom of speech, it touches on the free press, as with the journalism fund that was just announced. But I do think there’s a governance failure here, and there are a few reasons for it. One is that this market will not self-regulate. For a long time, we thought this technology space could remain unregulated, that any sort of regulation would stifle innovation in the tech space, and I think for a while that was probably true. It was the fastest industry in modern history to grow without real government intervention; it was left relatively unchecked. But I think now we’re at a moment I would describe as similar to the moment just before the 2008 financial crisis, where the financial incentives of a lightly regulated industry were fundamentally misaligned with the need for reform. And I think that’s where we are here too. In some ways, it’s even worse, because these are global monopolies. Even worse than global monopolies: they are publicly traded, privately owned monopolies with public consequences. And we know from history that these types of companies never self-regulate.
The financial incentives and their shareholder interests prevent them from that. And so to me, when you have publicly traded private monopolies with negative social consequences, that is the time for governance, and in this case, I think, regulation. So that’s where we are: we need a new regulatory regime for this space. The second governance issue is that I think this is primarily a supply-side problem, not a demand-side problem. You often hear that if individuals were just smarter, if they fact-checked better, if kids were given digital literacy training, then we would stop sharing crappy content and fake information and stop being vulnerable to foreign interference, right? That individuals can solve this problem. To me, that just misses the fact that this is a structural problem, and that there are very powerful incentives working against us as individuals. We are subject to A.I.-driven behavioural nudges that play on information about our lives that we can’t possibly even know. That is not something that knowing how to fact-check a story is going to solve. This is something bigger than that. And I think it’s going to require some pretty broad-ranging governance changes. I think we’re going to need to rethink data rights: move from regimes of data privacy to actually giving individuals fundamental rights over the data they produce and how that data is used, allowing us to take the data that companies collect and build about us and give it to other companies.
I think we’re going to need new competition law that looks at new forms of anti-trust, that values data as a commercial commodity that could be subject to anti-trust law. And I think we’re going to need to look at what is allowed to be said, and who is responsible for what is said, in this public sphere. At the moment, platform companies operate under a condition of safe harbour, where they are not responsible for what is said on their platforms unless it breaches a few very significant thresholds, like pornography or terrorist activity, where there is some liability. But I think we’re going to have to rethink that. I’m not sure why the primary disseminating body of information in our society should not be in some form liable for content that’s spread deliberately and with intent to harm. But that’s going to involve a conversation about free speech, about the limits of hate speech. We’ll probably have to revisit our hate speech legislation in Canada, for example, which is going to be hard and complicated and fraught. We’re going to need to fund journalism. I worked on a report that recommended this latest package the government has just announced, and look, if you had asked me five years ago whether there should be a half-billion-dollar fund to support journalism, I would have said that was crazy. Now, given the state of where we are, I think we need a backstop of reliable information, and if that’s not coming from a viable, free press, I don’t know where it’s going to come from.
We need to reform the CBC. That’s a whole other area of conversation. It’s crazy that the CBC is a competitor to media organizations in this country. I think it should be an enabler and an amplifier of Canadian journalism in a meaningful way, and it’s fundamentally not right now. So I think there’s a bunch of things we need to do. But that means starting with the assumption that there’s a set of problems here that requires government solutions, that the market is just not going to solve. And I think we’re getting there. I think we’re getting to that point. A year ago, we weren’t. I think now there’s a real recognition by governments, and by broad swaths of society in our public debate, that there’s something wrong with our information system. I think we need to get at that. The final thing I’ll say is that we don’t understand this problem in Canada as well as they do in other democracies. I’m not quite sure why that is. In most western European countries, and certainly in the United States, there are monitoring labs studying this information environment. There are significant journalistic enterprises and large non-profits devoting real resources to reporting just on this problem. In the United States in particular, there are four or five non-profit accountability journalism organizations that focus on just this problem. We don’t do that here to the extent we need to, and I think that’s partly a fault of the research community, which hasn’t prioritized this, and part of what I’m going to be able to do is start a research centre to study this problem. But journalism needs to step up on this too, in a real way. And there’s a relationship there, I think, between the research community and journalism that needs to be fostered. So I would just leave it at that, and say that I’m heartened to hear you guys are going to head in that direction.
And I would just encourage it in the strongest possible way: we need to understand how vulnerable our media ecosystem is at the moment to this kind of intervention and corruption. So, I think I’ll leave it at that. I’m happy to talk about any or all of it.