Part 1: Nuclear Mis- and Disinformation and Impacts on the American Public



Russia’s nuclear saber rattling throughout the war in Ukraine, reports of China’s increasing nuclear modernization, and the 2023 release of Oppenheimer have reignited public discussion of nuclear weapons. With the increasing prevalence of conversation around nuclear weapons comes increasing opportunity for mis- and disinformation to spread about them and the nuclear enterprise (NE). Mis- and disinformation are not new phenomena. However, the speed and reach offered by current digital communications technology are novel. What are the impacts of misinformation or disinformation about nuclear weapons on the American public?

This first article of a two-part series explores that question. It introduces the current mis- and disinformation landscape, provides historic examples of disinformation related to weapons of mass destruction (WMD), and outlines the possible risks posed by nuclear mis- and disinformation. Nuclear disinformation aimed at the public has received less research attention than other technological threats to the nuclear enterprise, such as artificial intelligence and cyber-attacks. It is important to fill this gap and consider the detrimental impacts nuclear mis- or disinformation might have.

(Dis)information Overload

There are many definitions of mis- and disinformation. Keeping to the theme of nuclear weapons, I will refer to the Defense Threat Reduction Agency’s (DTRA) definitions. Disinformation is “False information which is intended to mislead, especially propaganda issued by a government organization to a rival power or the media,” while misinformation is “spreading, sometimes unknowingly, false information such as rumors, insults, or falsehoods.” The key difference is that disinformation is always spread with the intent to mislead or harm, while misinformation is not. Yet the two can be connected. A social media user may genuinely believe disinformation created by a malign actor and share the unknowingly false information. Or a state actor can sponsor a news agency or online influencer to spread disinformation on its behalf, as the People’s Republic of China (PRC) is said to do.

Most if not all readers of this article have been exposed to some form of online mis- or disinformation. This is not surprising, as false information spreads more rapidly than ever before thanks to social media. Furthermore, research has shown that false news reaches more people than the truth. Recent world events have been a ripe target for mis- and disinformation as well, cluttering social media feeds and sowing public confusion. Examples include COVID-19 information manipulation by state actors such as Russia and China, climate disinformation spread through X (Twitter) and Reddit, and disinformation throughout the war in Ukraine. Mis- and disinformation will follow any major event or news story. WMDs, nuclear weapons among them, are no exception.

WMDisinformation

Disinformation related to WMDs has occurred in the past. An overview of historic cases helps in understanding what risks nuclear disinformation may carry and the effects it has had on the public. From 1951 to 1953, the Soviet Union, North Korea, and the PRC spread disinformation accusing the United States of using biological weapons in both North Korea and the PRC. The disinformation led left-wing organizations in Western European nations to mobilize millions of people in protest against the United States’ supposed use of bioweapons.

The Soviet Union went on to spread more bio-related disinformation when, in 1985, a Soviet newspaper republished and amplified the accusation that HIV originated from a U.S. lab leak. This campaign was known as Operation Infektion. The false information spread around the world, and further insidious claims circulated that the United States had released the virus in Africa to kill Africans. Although these beliefs were not held by the majority of the American public, “20 public opinion surveys of African-Americans between 1990 and 2009 showed that an average of 28 percent of respondents believed that the purpose of genocide was involved in the origin of HIV”.

A third example of WMD disinformation concerns Iraq. Disinformation, alongside groupthink and flawed intelligence, may have played a role in the decision to invade Iraq in 2003. After 9/11, journalists and U.S. officials received a stream of false information from Iraqi exiles regarding Saddam Hussein’s WMD aspirations and connections to al-Qaeda. Throughout late 2001, Western media spread false information from Iraqi defectors and anonymous U.S. officials about a sophisticated WMD architecture in Iraq. While not aimed at the public specifically, the dissemination of this disinformation may have shaped the public’s view of Iraq. In 2002, 77 percent of American participants in a Pew survey said that Iraq was developing WMDs.

There are important takeaways from the above examples of disinformation. First, misinformation and disinformation are linked. A state or nonstate actor will intentionally spread disinformation for its own gain. Once it enters the media landscape, though, things become murky. A social media user or a reporter may pick up the disinformation and spread it without malicious intent, either because they believe it is true or because they did not do due diligence in fact-checking. As described in the previous section, a state actor could also sponsor a news agency or influencer to spread false information. Second, the examples indicate that segments of the public will believe false claims. Government officials, as in the case of Iraq, and the media can fall prey to disinformation as well. This may complicate the public’s efforts to get the truth from the government or news sources.

Mis- and Disinformation Go Nuclear

Mis- and disinformation have already appeared in the nuclear weapons space. In 2020, Pakistan’s Defense Minister responded to a fake news article alleging an Israeli military threat against Pakistan with a tweet reminding Israel that Pakistan was a nuclear state. After drones struck the Kremlin in 2023, verified blue-checkmark Twitter accounts spread misinformation about Russia preparing a nuclear response. And in 2018, a false missile warning was sent out to Hawaii residents by a state emergency management employee. With these examples in mind, what risks does nuclear disinformation pose to the American public?

Panic and apathy:

The false missile warning in Hawaii was not part of a disinformation campaign from a malicious actor. Yet public reactions to it may indicate what a similar situation caused by disinformation would look like. The alert, “BALLISTIC MISSILE THREAT INBOUND TO HAWAII. SEEK IMMEDIATE SHELTER. THIS IS NOT A DRILL” went uncorrected for 38 minutes. In those minutes people thought they were going to die, rushed home to be with loved ones, or even hid their children in manholes. It is easy to imagine even greater desperation and panic if more time had passed before the alert was corrected. For an adversary seeking to sow panic and confusion in a population, especially one already aware of its vulnerability to a nuclear strike, such false warnings are an attractive option.

The Hawaii false missile warning also raises the question: what happens if the public ignores or mistrusts these warnings? Someone who receives multiple false warnings on their phone about an impending nuclear missile strike may begin to ignore them. Over-alerting is a common fear in the emergency-management community. False alerts could also muddy the waters of what is actually going on and begin to erode trust in the warning system and those responsible for it. For all the panic caused in Hawaii, a common theme among residents was to seek further information from social media, the news, or the radio. While it is important to verify a warning’s validity, information seeking would be risky under tight time constraints or during a social media disinformation campaign.

Knowledge gaps:

Oppenheimer may have led many theatergoers to think more about nuclear weapons, but is the information they are exposed to accurate? Nuclear mis- and disinformation threatens to make it more difficult to research nuclear matters, an already difficult task given their technical nature and the amount of classified information involved. In addition, the unique destructiveness of nuclear weapons makes comprehending them fully both an emotional and a cerebral exercise.

While most people would not believe the conspiracy theory that the U.S. faked nuclear tests or that the explosion at the port of Beirut came from a nuclear weapon, more realistic disinformation could keep people from wanting to learn more about nuclear weapons. This could include disinformation about nuclear policy, threats, or weapons effects. For example, a Soviet information campaign influenced peace movements to forestall NATO intermediate-range nuclear forces (INF) deployments in Europe and make the Soviet Union look peace-loving. If something similar occurred today on social media, it could quickly confuse portions of the public, leading to distrust of the nuclear enterprise. Further, mis- and disinformation on scientific topics, such as those focused on nuclear non-proliferation, could “lead the public to believe objectively inaccurate or even wholly false information related to nuclear non-proliferation, ultimately leading to support for wrong-headed or dangerous policies”.

What the public believes about nuclear weapons has implications for the nuclear enterprise. This is true not just for support of policy but also for hiring. At a time when many retirement-eligible personnel are expected to exit the nuclear enterprise and the U.S. faces a range of nuclear issues, it is important to have an informed public, some of whom are willing to work for the nuclear enterprise. Mis- or disinformation that keeps someone from entering the NE workforce should be a concern. Moreover, new employees benefit from a starting knowledge of nuclear issues grounded in accurate information.

Escalation:

Could nuclear mis- or disinformation contribute to increased crisis escalation between states? Would the American public play a role in such a situation? With both the public and decision-makers active on social media and exposed to a 24/7 news cycle, mis- and disinformation could affect escalation dynamics. Social media users and bots expand the volume, variety, and velocity of information, misinformation, and disinformation available to decision-makers, which could pressure them to act or cause decision paralysis. Similarly, an adversary influence operation could fabricate sensitive information such as war plans or internal communications, increasing public demands for information and action from decision-makers. Further, a state may use disinformation to escalate intentionally in order to gain an advantage, signal resolve, or avoid defeat.

The public has a role to play in escalation dynamics. Public opinion can shape political salience by creating pressure for policymakers to respond to popular concerns; conversely, political salience can influence public opinion. A political leader could frame the narrative on nuclear weapons policy and shape the information and arguments presented to the public. Escalation may then depend on how leaders frame nuclear weapons to the public; a leader could become caught in a ‘commitment trap’ of sorts, forced to hold onto previously stated views. Mis- or disinformation could not only be part of that framing but also shape the leader’s framing in the first place. Once made public on social media, the framing may be hard to control, creating a ‘rally round the flag’ effect with unintended international consequences.

Neither an information campaign nor nuclear mis- or disinformation will be the sole or main cause of crisis escalation. Yet during a crisis, these can interact with traditional forms of military or economic power, potentially exacerbating the situation. It can be difficult to predict what effects an information campaign may have, if any at all. However, an information environment partly built on social media, filled with mis- and disinformation, and used by the public and, increasingly, by political leaders has the potential to contribute to crisis escalation. This calls for increased caution and awareness about who is sharing what, and when.

Conclusion

There could be many impacts on the American public from nuclear mis- or disinformation. Like WMD disinformation campaigns in the past, it could fuel social movements against, and distrust of, the nuclear enterprise and the U.S. government. Mis- and disinformation can also cause undue panic or apathy in the event of false alarms. Without a public willing and able to learn about nuclear weapons, the nuclear enterprise as a whole risks being unable to hire future generations of nuclear practitioners. Finally, a muddied information landscape presents a risk of crisis escalation. While the effects of other technologies on nuclear weapons, such as AI, cyber, and automation, are becoming a major focus, comparable attention to information and communication technology is lacking. To better understand the potential risks of nuclear mis- and disinformation, more cross-disciplinary research is needed.

LA-UR-24-22468

*The views and opinions expressed in this paper are those of the author and do not necessarily represent the official positions of Los Alamos National Laboratory, the Department of Energy, or the United States Government.*
