Global Engagement Center leader talks disinformation, technology and reauthorization
As the Global Engagement Center faces potential elimination, a leader within the State Department’s foreign disinformation-fighting unit is emphasizing the success that it’s had using technology to combat disinformation created by foreign actors and spread outside the United States.
The center will lose its funding at the end of the year without congressional intervention.
In an interview with Scoop News Group, Carrie Goux, an acting deputy coordinator who helps lead the Global Engagement Center, said that her team plays a critical role in the foreign information space — and that the government needs to understand emerging technology’s place in it. The Global Engagement Center has repeatedly emphasized that its work does not focus on Americans or the United States.
“We need to continue to invest in our ability to understand the information space, to use emerging technologies, [and] to understand how emerging technologies are being used,” Goux said. “That means we need to have the technical expertise to do that and to have the technical solutions to do that. This is only moving faster, and it’s not going away.”
The GEC does face some constraints: For starters, it’s not yet able to focus on audio and visual disinformation, though that’s a goal. And like those doing similar work in academia and the private sector, the center finds it difficult to measure its ultimate impact, which involves understanding how online behavior might affect real-life behavior, like the outcome of an election.
Still, Goux argues there’s a strong reason to keep the organization intact.
This interview was edited for clarity and length. Rebecca Heilweil asked questions for FedScoop and Derek Johnson asked questions for CyberScoop.
FedScoop: For a lot of people now, the term disinformation is a pretty loaded term. When you talk about disinformation, what are you talking about exactly? Can you give some examples?
Carrie Goux: At the GEC, when we’re talking about disinformation, we’re looking at foreign actors overseas and networks of disinformation, how they are spreading disinformation, how they are spreading narratives in ways that are hidden. This content may look like it comes organically from local sources, but it is lies from foreign actors.
There are many channels. This is intentional messaging meant to manipulate the information environment. … Disinformation is as old as time, but right now we see it moving a lot faster because of technology.
FS: I know your team had a major success identifying a big effort from the Kremlin to push disinformation narratives in Africa. I’m curious if you can talk about the role that AI plays for those actors. Is there AI involved? How does it manifest? What are you seeing?
CG: What we’re seeing is that these networks are using all sorts of new and emerging technologies in very, very dangerous ways. What we’re trying to do is kind of reduce some of those risks that these new technologies present, while also harnessing the benefits ourselves. … Technology is allowing the narratives to move quicker. It’s allowing foreign actors to get these narratives to their purveyors in a format where they can move it out through social media platforms — in ways that are not only fast and nefarious, but also hidden.
FS: What is the GEC doing with AI and machine-learning algorithms in terms of its own research and the studies that you’re trying to conduct? How do you see AI playing a role in what you’re doing?
CG: We are using innovative tools. … For example, you’ve talked to our team about text similarity tools, text similarity analysis, which is a very novel way for us to be able to deploy natural language processing and to really understand — at volume — the reach of narratives and how those narratives spread.
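Text similarity analysis of the kind Goux describes can be illustrated with a minimal sketch. The snippet below is an illustration of the general technique, not the GEC’s actual tooling; the sample articles are invented, and the approach (TF-IDF vectors scored with cosine similarity via scikit-learn) is one common way to flag near-identical passages across nominally independent sources.

```python
# A minimal sketch of text-similarity analysis (not the GEC's tooling):
# flag near-identical narratives that surface across separate outlets.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Invented article snippets; a real pipeline would ingest thousands of posts.
articles = [
    "Western sanctions have collapsed the local grain market, officials say.",
    "Officials say the local grain market has collapsed under Western sanctions.",
    "The city council approved a new public transit budget on Tuesday.",
]

# Represent each article as a TF-IDF vector over word unigrams and bigrams.
vectors = TfidfVectorizer(ngram_range=(1, 2)).fit_transform(articles)

# Cosine similarity near 1.0 suggests the same underlying narrative.
scores = cosine_similarity(vectors)
for i in range(len(articles)):
    for j in range(i + 1, len(articles)):
        print(f"articles {i} and {j}: similarity {scores[i, j]:.2f}")
```

At volume, the same pairwise scoring (or an approximate nearest-neighbor variant) is what lets analysts see how far a narrative has spread.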
FS: Has anything changed or evolved — this institution has been around for a few years — in terms of the types of tools that you have at your disposal, given the development of AI? Have things gotten more advanced? Are there improvements that you’re making?
CG: We’re broadening how we can see the structure of the information environment. The natural language processing — this particular [text similarity] tool that we’re talking about — allows us to look at really specific narratives, but now we can look at themes. We can look at specific pieces of language. We can start to understand how these networks form so that we can get ahead of these narratives, so we can preempt or even disrupt these networks.
We’re actually seeing how these foreign actors are trying to covertly distribute content from state media outlets to local media outlets without that clear attribution. They’re also making changes to text to avoid detection. These tools could identify that activity.
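The evasion Goux mentions, small edits made to avoid detection, is the kind of thing fuzzy matching can catch. The sketch below is an assumption about the general approach rather than the GEC’s method; it uses Python’s standard-library difflib to score how close a reworded passage is to a known source text.

```python
# A minimal sketch (not GEC tooling) of catching text edited to evade
# exact-match detection: compare a known state-media passage against a
# lightly reworded repost using a character-level similarity ratio.
from difflib import SequenceMatcher

original = "The alliance is provoking conflict to justify its expansion."
reworded = "The alliance provokes conflict in order to justify expansion."

# ratio() returns a score between 0 and 1; a high score on non-identical
# strings can flag paraphrased or laundered copies of the same source text.
ratio = SequenceMatcher(None, original.lower(), reworded.lower()).ratio()
print(f"similarity: {ratio:.2f}")
```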
CyberScoop: One of the biggest challenges that we see a lot of times in the disinformation space is measuring impact. When it comes to measuring impact, how do you quantify [your] work, particularly when you’re talking to outside parties or in the context of your upcoming reauthorization challenge?
CG: In the communications world, this is always one of the most difficult things to do: measuring your impact. First of all, you’re going through so many different channels. There’s no easy data to measure. You’re trying to make a connection between something that’s gone out [on social media] and then a real-life behavior change. The question is: how do you measure something that doesn’t happen? We look for signs. One of the things that we did recently, together with our partners, the UK and Canada, is we exposed networks of disinformation that were trying to intervene in the election in Moldova.
We were pretty aggressive about that kind of exposure and taking action against that. I’m not saying there’s causation here, but I am saying that we’re looking for signs that perhaps some of what we did contributed to certain outcomes. We hope that’s the case in the Moldova election, where President Sandu won — and that the nefarious Russian networks that were working against her did not achieve their goals.
CS: I think a counter to that would be what happened in Georgia, where the pro-Russian party maintained control in the elections. How do you measure some of the work that you do in instances like that?
CG: I can’t really speak to Georgia. At the GEC, we weren’t deeply involved in that. … I can only speak to places where … we’ve done some of that exposure work.
FS: One of the things that the Framework to Counter Foreign State Information Manipulation talks about is technical capacity. I’m curious about how you would describe the technical capacity of not just the GEC, but the partners you work with. Does it seem like everyone’s up to date and using the most advanced techniques?
CG: Because technology is moving so quickly, and because this is somewhat of a new area, channels of communication are proliferating quickly on many levels. What we want to do is create a common operating model of how we see the information space together with like-minded partners.
Different partners are at different levels of capacity. We are leading the charge to help build capacity, working with our higher-capacity partners to support those who need additional resources with analytics, and creating that common way of looking at the information environment.
We’re working on some technology systems and platforms that allow us to share information among partners and have this structured understanding of tactics, techniques, and procedures. It is an ongoing project.
We have made a lot of progress with partners to build the system, have these data exchanges and have a platform that will work for everybody.
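What a “structured understanding of tactics, techniques, and procedures” could look like as shared data can be sketched, with the caveat that the record layout and field names below are hypothetical; the actual exchange format the GEC and its partners use is not described in the interview.

```python
# A hypothetical record of an observed tactic, serialized for exchange.
# The schema is an assumption for illustration, not the partners' format.
import json
from dataclasses import dataclass, field, asdict

@dataclass
class TTPObservation:
    actor: str                                         # attributed network
    tactic: str                                        # what was observed
    channels: list[str] = field(default_factory=list)  # where it appeared
    summary: str = ""                                  # analyst description

obs = TTPObservation(
    actor="example-network-01",
    tactic="covert redistribution of state-media content",
    channels=["local news sites", "social media"],
    summary="State outlet text republished without attribution, lightly edited.",
)

# Serializing to JSON gives every partner the same machine-readable view.
print(json.dumps(asdict(obs), indent=2))
```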
CS: When you look at the work that you do, and particularly when you compare it to the commercial threat intelligence and other entities out there that track a lot of this activity in different ways, how do you justify your value add? How do you make the case that the work that the GEC is doing is invaluable?
CG: It is critical that the government have this capability to combat foreign information manipulation overseas. It is important that Iranian, [Chinese], and Russian networks of information manipulation — nefarious and covert — are exposed and disrupted, and that we can work with our international partners to protect our national security overseas.
I hope people recognize that this is a critical piece. It’s also important that we work together with the private sector and civil society and our international partners. We do communicate with private-sector partners to share tactics, techniques, and procedures that we see and to understand what they’re seeing. We do not hold any kind of directive meetings. We do not tell them what to do. They do not tell us what we should be doing. But it’s important that we’re able to share that information, because it is a whole-of-society approach that we need to take.
FS: If funding does continue, what are the things that you might hope to do, especially thinking about the role that AI might play in the information environment?
CG: Looking forward, we want to be able to build our expertise on these technologies, work with the interagency — as AI grows — to find ways that we can label, identify and authenticate content. That would be critical. But also, [part of] being able to do that is having the technology expertise in-house.
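One plausible building block for labeling and authenticating content is a cryptographic fingerprint recorded when content is published, so later copies can be checked against it. The sketch below is a simplified assumption, not the interagency’s approach; real provenance standards are considerably more involved.

```python
# A minimal, hypothetical sketch of content authentication via hashing.
import hashlib

def fingerprint(content: bytes) -> str:
    """Return a SHA-256 hex digest identifying this exact content."""
    return hashlib.sha256(content).hexdigest()

published = b"Official statement text or media bytes."
record = fingerprint(published)  # stored at publication time

# Later: any altered copy produces a different digest.
altered = b"Official statement text or media bytes, subtly altered."
print(record == fingerprint(altered))  # False -> content does not match
```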