MAY 9, 2022

Making the Media S2E11 with Jessica Cecil

What do media and tech companies need to do to fight disinformation and what are the dangers to journalism and journalists if they do not?

In this episode of Making the Media, host Craig Wilson talks in-depth with disinformation expert and former BBC executive Jessica Cecil.

Listen to Hear:

  • Why disinformation is so corrosive to journalism
  • The ways in which it can be combated
  • The need for a transnational approach which will last years


Our Guest This Episode

Jessica Cecil

Jessica is a leading media industry figure and an expert in the field of disinformation, currently working as a consultant to media and tech companies. She founded the Trusted News Initiative (TNI), the world’s only alliance of major international tech companies and news organizations formed to counter the most harmful disinformation in real time. The TNI’s membership includes Meta, Google, Twitter, and Microsoft, alongside the BBC, AFP, AP, Reuters, and other leading global news providers, and its members operate a fast-alert system to flag the most harmful disinformation to one another.

Jessica’s media industry leadership experience was honed over a 30-year career at the BBC, which serves an audience of 450 million people around the world. As chief of staff to four Directors-General, she had a track record of creating and leading global alliances responding to the changes tech is having on audiences’ lives.

Jessica’s background is as a news journalist and documentary maker. She was an international news producer and assistant editor of BBC Newsnight. She is an Emmy nominee for the prime time TV science documentary, Human Instinct. Jessica is also a trustee of the University of Bristol, where she chairs the Equality, Diversity, and Inclusion Oversight Committee. She sits on the Council of Advisors for RAND Europe and is an adjunct fellow of the Queen Elizabeth II Academy for Leadership in International Affairs at Chatham House.

“Tech companies need to be working with the news organizations to find and agree a common set of principles about where free speech ends and threat to democracy, threat to life begins, because these are really difficult questions.” – Jessica Cecil, Disinformation Expert

Mentioned in This Episode

Making the Media S1E15: Do You Trust Me?
Find out more about Project Origin and the technical solution to authenticate media

Episode Transcript

Craig Wilson: Hi, welcome to the latest Making the Media podcast. My name is Craig Wilson, and I am delighted you’ve joined us.

Being able to trust the information you are presented with is fundamental when it comes to news. That bond between the news provider and the viewer is one which can be broken if there is an element of uncertainty that what you are seeing, hearing, or reading isn’t real or true.

But in a world where disinformation is being used, where how people consume news is changing fundamentally — moving away from the familiar TV anchor to the wild west of online platforms and social media — and where faith in journalism more generally is being eroded, what can be done to restore and repair that bond?

In a previous episode of the podcast, we spoke with Bruce MacCormack from Project Origin about how major news and tech organizations are working to provide tools to authenticate media — check out the show notes for details on that episode. In this episode we are focusing more on the journalistic side.

What are news organizations doing? What are the dangers if they do not do anything? What does this mean for individual journalists who are now often finding themselves under personal attack, particularly through social media?

My guest for this episode is Jessica Cecil — a former journalist and senior BBC executive, who was a founder of the Trusted News Initiative. Rather than me explain what that is, let’s hear now from Jessica about its aims, how it works, and what the future holds for trust in news. But first of all, why disinformation prompted its formation in 2019.

Jessica Cecil: Generally, we were clearly concerned about it. Lots of people were concerned about it in the 2016 election in the U.S., but there was a very specific trigger for us as well, which was the 2019 Indian elections.

The BBC has a really big presence in India, in several languages. And we found at that point that there were fake polls coming out suggesting that the BBC was saying that various parties were or were going to win and were well ahead. They were totally spurious, but they had BBC branding on them. And at that point, there was absolutely nothing we could do about it.

We found that all we could do was put out press releases to say that wasn't us. And we realized that there was no way in which we could talk to Facebook, now called Meta, which is where they were running. We had no way of working out how to act fast and also to tell other media organizations that these were entirely fake, at a point where they could make a difference to the way, in this case, Indian voters chose to vote. So it was the real-world harm. It was the effect on democracy. It was the fact that we didn't know how to tell, in this case, Facebook, but also a wider set of technology organizations and media organizations. And there had to be a better way. So that, in a way, was the origin, as well as the wider issues of disinformation.

CW: So, for people who don't know what the Trusted News Initiative is, maybe you could just sum up who it is and who's involved in it?

JC: Of course. So, the Trusted News Initiative started, as I say, in 2019. It involves four big tech companies: Meta, Twitter, Google (including YouTube), and Microsoft, and several of the biggest news organizations, in the West anyway: the BBC, Reuters, AFP, the European Broadcasting Union, the Financial Times, the Washington Post, The Hindu, the Nation Group, plus the Reuters Institute for the Study of Journalism, and First Draft.

So, it's a grouping of tech companies and news organizations. One, it's about defining what the most dangerous disinformation is, and I can talk more about that. Secondly, it's having protocols and ways of alerting each other fast when that most dangerous disinformation occurs. And then thirdly, it's having a framework for talking about it.

CW: One thing you mentioned earlier on was the 2016 election in the U.S. One phrase that obviously came out of that tough time was “fake news,” and I wanted to ask your view on that: has fake news become a term not to describe something which is fake, but to describe something which somebody doesn't like?

JC: Yes. I think that the term fake news, clearly President Trump used it as a way to talk about various news organizations he didn't like, but lots of other people have done that as well. So I don't think fake news per se is a very useful term because it's got various connotations. That's why I like to use the word disinformation because it's got a clarity about the fact that you're talking about whether something is true or not by agreed measures of veracity, rather than it being a slogan that has been quite widely used and thrown around.

CW: So when we talk about disinformation then, Jessica, and you talk about the most harmful types of disinformation, what do you mean by that?

JC: What we defined and indeed has become pretty well accepted, I think subsequently, is first of all, real-world harm. It's disinformation where actually it's going to have an effect. We defined it as immediate threat to democracy, the integrity of the democratic system, and immediate threat to life. And those were the kind of two pillars.

I think the second thing is it's got the ability, or the potential, or the likelihood to go viral. And as you will know, once a lie starts moving, it can jump across platforms very very quickly, and we can talk about lots of examples, not least in Ukraine, of this happening.

So if you look at those criteria, since 2019, when we started the Trusted News Initiative, you've seen, I think, real-world harm in several places. Clearly, with COVID, you have seen the dangers of health disinformation that costs lives. You know, there has been disinformation across the world, in the Philippines for instance, suggesting that you should drink bleach.

You've also seen disinformation that has been very, very harmful to democracy, in Ethiopia, in the United States. And on real-world harm, you look at the fact that it's pretty well documented that some one-third of adults in the U.S. don't believe that President Biden is there legitimately. And then you've seen the way it's been weaponized in the war in Ukraine. Clearly there are many, many examples, from the hospital attack in Mariupol to suggestions that Ukrainian soldiers were using stage makeup, which was Russian propaganda attempting to say that the Ukrainians have done it all to themselves, and clearly, these are all instances that have real-world harm. And I guess the final thing I would say is that in areas where there are communal divisions, disinformation can be really, really pernicious, and Myanmar and the attacks on the Rohingya there are clearly another very, very important real-world example. So, I think that we can see that disinformation really matters because it has huge effect.

CW: One area that we’ve talked about on the podcast before: we did an episode with Bruce MacCormack talking about Project Origin. So is that part of the overall strategy to combat this?

JC: Absolutely, so they're complementary. One is making sure that you tell other organizations you detect, you define, and you alert others, but clearly the provenance of information and whether it comes from the organization it purports to is really, really important. And indeed, Eric Horvitz from Microsoft and Tony Hall, the then-director general, and I had breakfast right at the beginning of the Trusted News Initiative and talked about what was the beginning of Project Origin and has now gone further and faster.

And as we move into the world of more and more deep fakes, it's going to become more and more important. You can't look at a piece of media and work out whether or not it’s been tampered with unless you know its provenance and where it comes from.

CW: We’ll come on to talk about what organizations can do about it, and even looking at governments and what they can do about it, Jessica. But I'm also interested in what challenges this poses to individual journalists who are trying to make sense of the information that comes in. You know, how are they trying to deal with this?

JC: It's really difficult, and indeed, I think we've seen an evolution with the way in which journalists have looked at disinformation. There is much more awareness of being played, much more awareness of not amplifying disinformation because it looks interesting, and actually checking your sources more.

That said, it's really, really difficult to get to the truth because often journalists are attempting to work out what's true or what's not, whereas the disinformation has a real immediate resonance that's very emotionally satisfying and therefore it can move very, very quickly, so you've got emotionally charged disinformation running very, very quickly, whereas as we know, the nature of finding out what's true and what's not true has to be painstaking because it's fact-based and can take some more time.

I think what's been great is the way in which we've seen much, much more collaboration between journalists and between fact checkers. And, I'd like to call out Bellingcat here as an extraordinary organization that not only has taught many other organizations about how to find clues in open source material, but also has worked across coalitions of journalists to try and get to the truth in Ukraine. But also long before that, as well.

So, those are all the ways in which this is posing a real threat to journalists. But I think you also have to call out something different as well, which is that journalists are often at the sharp end of this. You have a lot of attacks on individual journalists, particularly women and journalists of color. And you also have attacks by state actors on particular organizations and journalists that they don't particularly like, in many countries.

So this has thrown up lots and lots of individual issues for journalists, and they are responding, and newsrooms are responding, as I say, first of all by collaborating. And I think the second way is this sense of, and it's a horrible phrase, pre-bunking. A sense of being able to anticipate where you think disinformation stories and narratives will go, and trying to put out in advance what you think the true story is, before that false narrative gets to take hold. And you've seen that particular pre-bunking playbook also taken on by U.S. and U.K. military intelligence, saying “This is what we think is going to happen in the Ukraine war” as well. Same principle.

CW: I mean, journalists, they're taught to be skeptical. You know, they're taught to question, you know, information that they do receive, but the landscape that you've described is so complex for them to deal with, I presume by extension of that, that landscape is almost impossible for the audience to then figure out what's actually going on.

JC: Exactly—I think there is a lot of emphasis, quite rightly, put on media education. And I think in the long-term fight against disinformation, making the audience much, much more aware of what they might be able to trust and might not be able to trust is incredibly important. However, as you say, journalists are the specialists, the audience are not, and therefore it's much, much more important to know that the journalists themselves can navigate, on behalf of the audience, some of these very, very difficult, tricky waters of disinformation.

And part of this is also making sure that journalists' brands are protected. I talked about the Indian election, and we are still seeing the brands of trusted news organizations being used as a way of trying to pretend a story has a legitimacy that it hasn't. We've, you know, seen only in the last few weeks the rocket attack on the station in Ukraine, and Russian disinformation putting BBC branding on it and pretending that it was actually the Ukrainians who shot at themselves and that the BBC was claiming that, which it obviously wasn't. So, it's really important that, one, journalists make sense of the disinformation and call it out when and how they can. Two, that their own brands are protected. But, as I say, three, and this is where the Trusted News Initiative comes in, when one organization knows about really dangerous disinformation, all the other organizations know that, too.

CW: Because the other point I was going to come on to was, you mentioned right back at the start the way that digital technology has transformed how audiences behave, and I guess part of that is also about how audiences consume and receive their news. So, you know, when I was a lot younger, it was the newspapers, or radio, or television, and those were the outlets that existed. Whereas now, a lot of people, I think, don't necessarily consider themselves to be consuming news when they're looking at something on Facebook or on Twitter, for example. And so the method of how people consume the news is very, very different, and that landscape has changed significantly. And so that, I guess, also poses enormous challenges for news organizations, when the word of John on the street could be viewed as just as truthful as the word of James, the reporter working for the BBC who's in Lviv reporting on the Ukraine crisis.

And I really struggle to understand what it is that organizations can really do to combat things like that.

JC: This is why I think it's really important that news organizations work with tech companies. Tech companies themselves have evolved their approach hugely in the last few years, and they now promote news sources that are credible so that actually you do get a sense that you can go to a trusted fact checker or a trusted news organization if there is a contentious piece of information. And I think that we've seen through the U.S. elections, really contentious elections, through COVID, and now through Ukraine an evolution in the way in which the tech companies are thinking about that.

But clearly they need to be working with the news organizations to find and agree a common set of principles about where free speech ends and threat to democracy, threat to life begins, because these are really difficult questions, and there is much more transparency and much more safety if you do that in wider coalitions.

CW: Because I guess the other thing about this is, we've talked about the very large organizations that are involved in the Trusted News Initiative, but if you're a journalist working on a local newspaper, or at a local radio or television station here in the U.K. or in the U.S., or really anywhere, in essence, that's really where things start. And that's why I think there's so much concern about disinformation: because of the corrosive nature it has, not just against the large organizations, but against everyone who's working in journalism.

JC: Totally, totally agree and we haven't talked about the corrosive nature of disinformation on democracy, which is ultimately the thing that is very, very scary, I think, in the long term for all of us. And that's because if you don't know which facts to trust, how can you know where and how to vote? And clearly that starts with local information and local news sources. Totally agree.

I mean, my personal view is that we now need a much broader coalition which brings in many more journalists, many more tech companies, and many more civil organizations and fact checkers, really, to take many of the principles of the Trusted News Initiative, around fast alert and around a common and transparent approach to where some of these difficult issues lie, in order to find out about disinformation. But also, I think, to work out what acceptable free speech is and isn't, and to do something, finally, around some of the really difficult issues to do with politicians flirting with disinformation and how and when they're called out. And that is a really, really difficult position.

I think unless you have a clear set of principles of when flirting with disinformation can be dangerous for anybody or everybody, you're going to have a situation where politicians, as well as state actors, as well as pressure groups, as well as those in it for financial gain, are going to feel that frankly there's nothing stopping them.

CW: I think it’s the type of thing, Jessica, that’s quite easy to get down about. There are lots of challenges. But let’s look at things in a more positive way: what do you see as needing to be done to combat the situation at the moment?

JC: I'm positive because the stakes are so high: there are lives at risk and the integrity of democracy is at stake. So, I think we will do things. I think we need to have a bigger transnational coalition that brings together tech companies, media organizations, civil society, and fact checkers on a much, much wider scale. I think we need to have a fast alert, and I think we need to define the limits of really dangerous disinformation: what defines dangerous disinformation, and when it might pose a threat to democracy.

And I think from there, you will allow two things to happen. One is it will create a situation where some of the issues that tech companies have in being more transparent could be overcome. They argue that there is a patchwork of legislation across the world that makes it very difficult to share information, because in some places that could be illegal and in some places it won't be. I think you have to have a way in which there is safety in numbers. Secondly, I think there is safety in numbers in terms of looking at some of the policy issues that arise, and we saw, particularly with Twitter and Facebook during the 2020 elections, really, really difficult issues arising. Again, these need to be much more openly debated, and I think we need to have the forum to debate those rather than have individual tech companies making these decisions by themselves.

And I think that allows you finally to look at, share intel on, and work out what is an evolving disinformation environment. And as soon as you have more work being done with one platform, you have disinformation jumping onto others, which is why you've got to have a very very wide coalition.

It's really interesting to see that Twitch, which is owned by Amazon and known, as you know, as a live-streaming gaming platform, has, according to the Financial Times, in the last few weeks been home to disinformation suggesting that the invasion of Ukraine is a legitimate way of de-Nazifying the country.

You know, it jumps from platform to platform. It goes into unusual places. Unless there is a much more coordinated conversation around some of the policy issues and what to do about them, transparency around decisions that are made, as well as fast alert and sharing of information around intelligence, we're not going to be in a position where we get on top of something which, let's be clear, is undermining democracy and costing lives. It's really, really serious.

CW: I guess one of the other challenges that you have is that, you know, the nature of media organizations is that you compete with each other.

JC: Of course.

CW: And, so, I guess I'm interested in how the Trusted News Initiative evolved, where there are a lot of large media organizations on it. Same with Project Origin: there are a number of large organizations involved in that, where, in essence, they have to come together. Was that in itself quite challenging to achieve? Or did people sense that there was a common enemy here, if you like, that they had to defend themselves against?

JC: There was a common enemy. I mean, I think that what was really interesting was how open-minded and collaborative all the organizations were about coming together. The challenges we had were about defining precisely what our common definition, for fast alerts, of the most dangerous disinformation was: how would we define that so that we knew when those sort of break-glass moments would be, when we would alert each other?

The second challenge was actually operationalizing that: turning it from good words into ways in which you put that into the workflows of organizations so they are actually in a position to act quickly on the information that they've got. But I think we have seen a change in the nature of fighting disinformation through collaboration. And, as I say, you now have much more solidarity, if you like, on the issue of journalists' safety, on the issue of fighting disinformation, and on the issue of sharing best practice, which is really, really welcome. It's moved an awful long way in the last few years.

CW: Clearly, I guess Jessica, this is a long game. This is not something that's going to be sorted overnight.

JC: No. It's not going to be sorted out overnight, but I think we have to use each shock to the system as a way of pushing forward collaboration. And I think the last few weeks in Ukraine have been the latest shock to the system in terms of disinformation, and I hope that they will be the shock to the system we need to foster this greater collaboration, this leadership, and frankly the resources we need, particularly from the tech companies, and others, to make this happen.

I think this has to be the way in which the future fight against disinformation is fought. It is going to take years and years and years, let's be clear, but I think we can get started now.

CW: Jessica, there is one question that I ask everyone who joins the podcast, so I will ask you. What is it, if anything, that keeps you up at night?

JC: The thing that keeps me up at night is the way in which my entire career has been about working in and around journalism. Particularly because it sustains democracy, which is an essential part, in my view, of people across the world living good, meaningful lives in which they have agency. Disinformation cuts at the roots of that. And that's really, really serious. So, I'm very committed because I absolutely believe in sustaining democracy by sustaining journalism, by making sure that disinformation is countered not just locally in specific ways but in collaborations. Because that's got to be the way in which it's done most effectively.

CW: I think all of us who have faith in journalism would echo those sentiments. Thanks to Jessica for joining us and detailing what is clearly going to be an ongoing battle.

As I mentioned earlier, check out the show notes for links to the episode on Project Origin and their work, and also an article on how social storytelling is changing how news is being created.

Don’t forget to get in touch. We are on email at [email protected] and on social, I am @CraigAW1969 on both Twitter and Instagram.

And also please share the podcast with friends and colleagues on your podcast platform of choice.

That’s all for now, thanks once again to Jessica, thanks to our producer Matt Diggs, and thanks to you for listening. Join me next time for more in-depth discussion with the people Making the Media.
