Part 2: Challenges and Solutions for Combating Nuclear Mis- and Disinformation  



This is the second part in a two-part series on nuclear mis- and disinformation. The first part explores past cases of WMD-related disinformation and gives examples of nuclear mis- and disinformation and its impact on the American public. This article continues the focus on nuclear mis- and disinformation by exploring the challenges of combating false information about nuclear weapons and possible solutions. The topic of nuclear weapons is not free from mis- and disinformation. As with many other salient topics today – climate change, pandemics, international conflicts, domestic politics – the digital environment is filled with false information about it. How can the problem of mis- and disinformation be solved, and when the topic of concern is nuclear weapons, are there specific challenges and solutions to be aware of? 

A challenging environment 

There are many challenges to combating mis- and disinformation. One challenge is that not just people but also bots spread false information. Bot networks can be used to artificially boost the numbers next to a post, increasing its reach. AI bots can also generate content. A recent Notre Dame experiment in which participants interacted with humans or AI bots running on large language models (LLMs) found that 58 percent of the time, participants could not tell whether they were interacting with a bot or a human. Even if bots generate factual content through LLMs, there is still some erosion of trust that whoever you interact with online is another human. If AI bots spread mis- or disinformation, that loss of trust is compounded. Humans still have a role to play in spreading false information as well. In fact, some research has found that humans were equally or more responsible than bots for the virality of false news. Finally, these technologies lower the cost of producing mis- and disinformation, allowing malign nonstate actors to create and spread false information more cheaply than before.  

Why, if the information is false, are so many people exposed to it? This might seem simple on the surface: increasing volumes of information mean increasing volumes of mis- and disinformation, and with more humans and bots creating and spreading false content, there is bound to be more of it. However, the problem goes deeper than that. A report in Science, analyzing rumor cascades on Twitter from 2006 to 2017, found that lies and false news spread farther, faster, deeper, and more broadly than the truth. The mechanisms of social media also perpetuate the sharing of false information: social media’s rewards-based learning system causes users to form habits of sharing information that attracts others’ attention.  

Perhaps the biggest challenge to combating mis- and disinformation is that there is little understanding of the actual effect false information has on internet users. There is much debate over the effectiveness or ineffectiveness of disinformation campaigns, and research on the influence of such campaigns poses methodological challenges. Further, foreign disinformation campaigns can be hyped up as the cause of problems that are domestic in nature. While outside the scope of this post, it is important to understand the debates over how much influence mis- and disinformation actually has. The spread of mis- and disinformation does not have a single cause; there is no single solution to combat it, nor is it the sole contributor to global threats and conflict.  

The nuclear dimension 

The above challenges are present in nuclear mis- and disinformation, which also comes with its own set of difficulties. Nuclear weapons are a highly technical, classified, and emotive topic. It is also a topic about which foreign state and nonstate actors have spread false information in the past and could do so again today, this time with more digital communication methods at their disposal. These features make it hard to find practical solutions for combating nuclear mis- and disinformation. 

A Chicago Council on Global Affairs survey found that the public remains unfamiliar with aspects of nuclear weapons. That is not to say the public has no knowledge or is not thinking about nuclear weapons. Multiple surveys of the public have been conducted over the years on nuclear deterrence, the relationship between nuclear weapons and nuclear power, and hypothetical nuclear conflict scenarios. Yet formal education on nuclear weapons is lacking. A 2019 study of 75 of the country’s top-ranked public, private, and military institutions found that each institution offered, on average, seven nuclear weapons courses over a two-year academic period; by contrast, the same institutions offered 19 to 30 climate change-related courses.  

A lack of educational opportunities and unfamiliarity with the subject lead to gaps in knowledge. With little prior knowledge of a technical subject, the public may believe sources that sound scientific but are not accurate. People are also more likely to believe something they see more often. Put together, widespread and frequent nuclear mis- and disinformation might have a better chance of being believed, especially if the reader lacks knowledge of nuclear weapons topics.   

Nuclear weapons can also be an emotionally charged and highly debated issue. The public holds many different views of nuclear weapons, as the surveys mentioned above indicate. And when people think about nuclear weapons and nuclear war, different emotional patterns occur. Some people report feelings of worry, fear, and sadness. Others may feel hopelessness and despair and ignore the issue altogether. One researcher stated, “the general public is also likely dissociated because of this huge existential terror—even if they’re not conscious of it. There’s never been a time in human history prior to the invention of nuclear weapons that had that quality of total annihilation.”  

A disinformation campaign could take advantage of these emotions, for example by mimicking the accidental ballistic missile alert in Hawaii. Residents expressed uncertainty and anxiety during that incident, which led to varied actions or inaction. In a crisis or emergency involving nuclear mis- and disinformation, emotion could override reason and cause panic, hopelessness, or other negative reactions. People showing heightened emotionality have been found to be more likely to believe fake – but not real – news. As such, this would be a perfect situation for an adversary to muddy the information environment, causing uncertainty and fear in a population and throwing it off guard. Nuclear mis- and disinformation could be a catalyst for spreading further false information into an emotionally charged population.  

Solving the problem 

There are many ways to combat general mis- and disinformation. These include proactive measures, such as media literacy education, reducing data collection and targeted ads, or labeling social media content. There are also reactive solutions, such as fact-checking social media posts or counter-messaging strategies, as well as several online tools to help inform the public about mis- and disinformation.  

Each of these measures has limitations and weaknesses. For example, a “backfire effect” was found when gauging the effectiveness of fact-checking: participants in a study who were presented with factual information (in this case, evidence that no WMDs were found in Iraq) that went against their ideological and partisan attachments became more entrenched in their false beliefs. And who would oversee fact-checking? Would it be the government, social media platforms, or some citizen-led group? Whatever the group, there would be pushback from some segments of society. 

On a more analytical level, it is also necessary to understand the different actors and their reasons for spreading false information, as well as the differing narratives between populations and the root causes of false information in the media landscape. Solving the problem of mis- and disinformation requires being actively engaged with digital content, questioning it where appropriate and using critical thinking to discern its truthfulness. In the nuclear sphere, who would spread nuclear disinformation, and in what form? Would it be a deepfake meant to cause widespread uncertainty, internet trolls engaging in nuclear signaling on behalf of a state, or something else? While a subject for a different paper, understanding the actors behind and intentions of nuclear disinformation, as well as its target audience, is important.  

Spreading the nuclear know-how  

Combating nuclear mis- and disinformation does not fall solely on the public becoming better informed and more willing to engage with the topic of nuclear weapons. Members of the nuclear enterprise and adjacent fields such as academia have a role to play as well.  

It appears that, while Americans say they do not know much about nuclear weapons and policy, they want to learn more about the subject. Although opportunities to learn about this topic are not available to everyone, some avenues do exist. HighlyNRiched is an online resource that connects those who want to learn more about nuclear policy with nuclear experts and resources. At a local level, Los Alamos National Laboratory’s official public museum, the Bradbury Science Museum, helps visitors learn about the Lab’s beginnings during the Manhattan Project and how the Lab’s continuing work enables global security. It also offers a Science on Wheels program in which educators come to the classroom to lead students through hands-on activities. On the academic side, more classes and activities (tabletop exercises, wargames) on nuclear weapons, nuclear policy, and nuclear deterrence should be offered at the middle school, high school, and university levels. Exercises could take the form of a Model UN-type simulation involving nuclear weapons and arms control, or more personal exercises around nuclear weapons use and effects.   

Increasing access to knowledge and involvement in the nuclear enterprise alone will not stop mis- and disinformation from spreading or being believed. At a time of decreased trust in the government – in 2023, public trust in the federal government returned to record low levels – there is no guarantee that more information will be believed, especially when it comes from official sources. Realizing that the Strategic Posture Commission Report’s findings reflected bipartisan consensus might surprise and give hope to those who believed no serious policy conclusions could be reached in a hyperpartisan environment, but it is not enough to increase trust in the wider government system. 

The nuclear enterprise cannot solve the root causes of decreased trust or stop mis- and disinformation. But one easy step it could take is to increase and update its online presence. The example of an arms control negotiating team using social media to compete with foreign governments (or trolls) could be applied to the nuclear policy community more broadly. Likewise, updating official government websites on arms control – and on nuclear policy generally – from hard-to-find, insider-language, mathematics-heavy content to material the public can more easily engage with could help prevent false information about U.S. nuclear policy from spreading.  

A more novel approach comes from a workshop on social media and nuclear early warning systems, which focuses on the role of civil society and local government. Regarding the potential for mis- or disinformation-driven false alarms and warnings, the workshop found that cities and civil society emerged as a set of actors and networks that could create forms of governance and public information goods that restrain the aggressive use of social media that may contribute to false alarms. They would do so by contributing independent, impartial, and validated information during early warning events. 

There is no catchall solution for completely stopping mis- and disinformation. It might be impossible to untangle the technology, psychology, and political and social dynamics that all contribute to the spread and believability of false information. This is true for nuclear mis- and disinformation as well. But recognizing the challenges and thinking about possible solutions is an important start, especially for a topic as impactful and increasingly salient as nuclear weapons. 

*The views and opinions expressed in this paper are those of the author and do not necessarily represent the official positions of Los Alamos National Laboratory, the Department of Energy, or the United States Government.* 
