Friday, May 27, 2022

Is cold exposure beneficial for metabolic health?

Peter Attia, MD

When I was in Norway in February 2020, I shared on Instagram that a friend and I were experimenting with cold exposure by spending about 20 minutes outside in a wind chill of roughly -10°F. Our hope in putting ourselves through such misery? Activation of brown fat.

What is cold exposure therapy?

Cold exposure therapies are forms of short exposure to cold temperatures (which don’t have to be as extreme as Norway in mid-winter). Cold showers, ice baths, open water swimming, and even short walks in cold weather are all forms of cold exposure therapy. Goals of cold exposure therapy include boosting energy expenditure and improving metabolic health. Though the precise activities and durations may vary, for the purpose of improving metabolic health, cold exposure therapies are all rooted in the same simple rationale: production of heat (known as “thermogenesis”) requires an expenditure of energy. In cold environments, the body must produce more heat to maintain its temperature, thus burning more calories. One mechanism by which mammals are known to produce heat is through skeletal muscle shivering. Another is known as “nonshivering thermogenesis” and relies principally on a specialized type of adipose tissue called brown fat.

To learn more about both cold exposure therapy and heat therapy, check out AMA #16.

What is brown fat?

Brown fat, or brown adipose tissue (BAT), gets its name from its difference in color relative to white adipose tissue (WAT), which constitutes the vast majority of fat in the human body. Unlike WAT, BAT contains a high concentration of iron-rich mitochondria, which, in addition to giving the tissue a brown color, make BAT far more metabolically active than WAT. The mitochondria in BAT differ from those in other body tissues, however. Instead of harnessing energy and converting it to chemical forms (ATP) that the cell can use to support its normal functions, BAT mitochondria are designed to “waste” energy in the form of heat.

When mammals are exposed to cold conditions, the sympathetic nervous system releases norepinephrine, stimulating brown fat to burn energy and create heat to maintain core body temperature. Smaller mammals lose heat easily due to their relatively high surface area to volume ratio, and thus need to generate proportionally more heat than larger mammals to maintain core body temperature in cold environments. For this reason, small rodents have been used extensively to study BAT, since BAT activity contributes up to 60% of the nonshivering thermogenesis that enables their survival in winter conditions. One study found that eight weeks of constant exposure to 41°F led to a 42% increase in nonshivering thermogenesis capacity over control mice housed at 72°F. In another study, hairless mice housed at an ambient temperature of 50°F for four weeks had 3x more BAT by percent body weight than their counterparts housed at 86°F (1.5% compared to 0.5%).

How important is brown fat for humans?

In humans, the thermogenic capacity of brown fat is most important during infancy. Newborns, like other small mammals, have a high surface area to volume ratio and a relatively low quantity of insulating WAT at birth. Therefore, it makes sense that about 5% of a newborn’s body weight is BAT, a much higher percentage than in adults. As we age, we lose much of the brown fat with which we were born and transition to muscle shivering as the predominant form of thermogenesis in cold conditions. However, small deposits of brown fat persist through adulthood in vital regions of the body, especially around the neck and upper shoulders.

In humans, BAT volume cannot be measured directly, as BAT cells are intermingled with WAT cells and other tissues. Instead, BAT volume is estimated from measurement of BAT metabolic activity. The gold standard for measuring BAT activity in laboratory studies is positron emission tomography-computed tomography (PET/CT) imaging, in which 18F-fluorodeoxyglucose (18F-FDG), a radiolabeled glucose analog, is used to visualize glucose uptake in BAT. When activated, BAT increases its uptake of glucose and free fatty acids from the bloodstream, acting as a “metabolic sink” to fuel thermogenesis. BAT is thus identified as regions of adipose tissue that show high levels of 18F-FDG uptake – indicating high metabolic activity.
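To make that identification step concrete, here is a minimal sketch (in Python) of how an analysis might flag candidate BAT voxels on a PET/CT scan: a voxel is treated as adipose tissue based on its CT density, and as “active BAT” if its 18F-FDG uptake (standardized uptake value, SUV) crosses a threshold. The specific cutoffs, names, and numbers below are illustrative assumptions of mine, not values taken from this article or from any clinical protocol.

```python
import numpy as np

def flag_active_bat(ct_hu: np.ndarray, pet_suv: np.ndarray,
                    hu_range=(-190, -30), suv_min=1.5) -> np.ndarray:
    """Return a boolean mask of voxels that look like metabolically active BAT.

    ct_hu   : CT attenuation in Hounsfield units (used to identify fat tissue)
    pet_suv : 18F-FDG standardized uptake values (used to identify high glucose uptake)
    hu_range, suv_min : illustrative thresholds only (assumptions, not from the article)
    """
    is_adipose = (ct_hu >= hu_range[0]) & (ct_hu <= hu_range[1])
    is_active = pet_suv >= suv_min
    return is_adipose & is_active

# Toy example: three voxels -- white fat (low uptake), active brown fat, muscle
ct_hu = np.array([-100.0, -80.0, 50.0])
pet_suv = np.array([0.4, 3.2, 2.0])
print(flag_active_bat(ct_hu, pet_suv))  # [False  True False]
```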

18F-FDG studies have shown that BAT activity in humans increases with cold acclimation. In one study, young adults underwent a ten-day protocol of acclimation to intermittent, mildly cold temperatures of 60°F. BAT activity was measured before and after acclimation with 18F-FDG PET/CT during a more acute cold exposure, just above the threshold for shivering. After acclimation, glucose uptake increased, nonshivering thermogenesis increased from 10.8% to 17.8%, and the subjective experience of cold became more comfortable over the ten-day period. In addition to the increase in BAT activity, there is also evidence from rodent studies suggesting that cold exposure may cause BAT induction, in which WAT cells become more “brown-like.” This process – termed “beiging” – has limited and inconclusive evidence in humans for short-term cold exposures. However, there is some evidence that long-term, regular cold exposure promotes this type of WAT-to-BAT transition in humans.

Measuring Brown Fat Activity

For my self-experimentation in Norway, I was not going to have a PET/CT scan following my stint in the cold, so instead I used an infrared camera to capture a thermal map of my skin surface.



As you’ll hear us discuss in the video below, the measurement of BAT activation is based on a “delta” – or the change in the thermal measurements before vs. after cold exposure. And it turns out, I had the largest change in BAT activity in response to the cold!
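For the curious, the “delta” itself is trivial to compute once you have a mean skin temperature over the supraclavicular region (where BAT is concentrated) before and after cold exposure. The sketch below is only an illustration; the function name and example readings are hypothetical, not my actual measurements.

```python
def bat_activation_delta(temp_before_c: float, temp_after_c: float) -> float:
    """Change in mean supraclavicular skin temperature (°C), after minus before.

    A positive delta is taken as a proxy for BAT activation: active brown fat
    generates heat, warming the overlying skin even as the rest of the body cools.
    """
    return temp_after_c - temp_before_c

# Hypothetical readings taken from an infrared thermal map (not my actual data)
print(bat_activation_delta(34.1, 34.9))  # +0.8 °C over the supraclavicular region
```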

Impact of BAT Activation on Energy Expenditure

So how much of a difference can BAT activation make for overall energy expenditure? The estimated mass of BAT in adults is small, ranging from 10 to 300 g (whereas WAT mass is typically 13.5-18 kg for a normal-weight adult male). It is estimated that 50 g of BAT is responsible for roughly 5% of resting energy expenditure, or 75-100 kcal per day. During short-term mild cold exposure, when BAT is most active, glucose uptake per gram of BAT is greater than uptake per gram of skeletal muscle. However, skeletal muscle mass far exceeds BAT mass, so when each tissue is considered in total, BAT is not going to have nearly the effect on metabolism that skeletal muscle has. This is clear from measurements of energy expenditure changes during cold exposure: while overall energy expenditure increases by up to 250-300 kcal/24 h, the contribution of BAT activity accounts for <20 kcal/24 h, even in individuals with relatively high amounts of BAT. The remaining increase in energy expenditure is attributed to thermogenesis in skeletal muscle.
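To see just how small that contribution is, here is a quick back-of-the-envelope calculation using the figures above. The assumed 2,000 kcal/day resting energy expenditure and the use of range midpoints are my own illustrative choices, not numbers from the studies cited.

```python
# Back-of-the-envelope arithmetic using the figures quoted above.
# The 2,000 kcal/day resting expenditure and the range midpoints are assumptions.
resting_ee_kcal = 2000                 # assumed total resting energy expenditure
bat_rest_kcal = (75 + 100) / 2         # ~50 g of BAT: 75-100 kcal/day at rest
cold_increase_kcal = (250 + 300) / 2   # total cold-induced increase per 24 h
bat_cold_kcal = 20                     # upper bound on BAT's cold-induced share

print(f"BAT share of resting energy expenditure: {bat_rest_kcal / resting_ee_kcal:.1%}")
print(f"BAT share of the cold-induced increase:  {bat_cold_kcal / cold_increase_kcal:.1%}")
# -> roughly 4% and 7%, respectively: measurable, but small
```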

The Bottom Line

Cold exposure therapies are based on a known and reliable mechanism of BAT activation. Even seasonal changes in temperature have been correlated with increased BAT activity. Still, despite the high metabolic capacity of active BAT, its overall contribution to total energy expenditure is fairly small in adults. Indeed, even when looking exclusively at cold-induced increases in energy expenditure, BAT activity contributes only a relatively small percentage compared with both shivering and nonshivering activity in skeletal muscle. In other words, the rationale for cold therapies may be valid at a cellular, mechanistic level, but it falls apart when we consider the minuscule impact that cold-induced BAT activation has on whole-body energy expenditure in humans.

It’s possible that cold exposure therapies may also influence alternative pathways besides BAT, which might account for the observed effects of these practices on inflammation and the cardiovascular, endocrine, and nervous systems. Further, pharmacological interventions may someday be capable of activating BAT and/or triggering beiging of WAT to a far greater extent than appears possible through cold exposure, potentially opening the door for BAT activation to become a viable option for driving meaningful increases in energy expenditure. But when it comes to using cold exposure therapies to activate BAT, the benefits are likely too small to make any difference for metabolic health. It appears that my suffering in Norway did nothing for my health. But hey, at least I left with some pretty unique bragging rights.



Friday, May 13, 2022

 The horrors of Hiroshima and Nagasaki made the whole world afraid of the atomic bomb – even those who might launch one. Today that fear has mostly passed out of living memory, and with it we may have lost a crucial safeguard


On an August morning in 1945, 600 metres over the Japanese city of Hiroshima, a small sun came briefly into existence. Few remember a sound, but the flash printed shadows on the pavements and sent buildings thrashing. The explosion – 2,000 times greater than that of any bomb yet used – announced not only a new weapon but a new era.

It was a stunning military victory for the United States. Yet jubilation there was undercut by “uncertainty and fear”, the newsman Edward R Murrow observed. It took only a moment’s reflection on the bomb’s existence to see the harrowing implication: what had happened in Hiroshima, and three days later in Nagasaki, could happen anywhere.

The thought proved impossible to shake, especially as, within the year, on-the-ground accounts emerged. Reports came of flesh bubbling, of melted eyes, of a terrifying sickness afflicting even those who’d avoided the blast. “All the scientists are frightened – frightened for their lives,” a Nobel-winning chemist confessed in 1946. Despite scientists’ hopes that the weapons would be retired, in the coming decades they proliferated, with nuclear states testing ever-more-powerful devices on Pacific atolls, the Algerian desert and the Kazakh steppe.

The fear – the pervasive, enduring fear – that characterised the cold war is hard to appreciate today. It wasn’t only powerless city-dwellers who were terrified (“select and fortify a room in which to shelter”, the UK government grimly advised). Leaders themselves were shaken. It was “insane”, US president John F Kennedy felt, that “two men, sitting on the opposite sides of the world, should be able to decide to bring an end to civilisation”. Yet everyone knowingly lived with that insanity for decades. It was as if, wrote the historian Paul Boyer, “the Bomb” were “one of those categories of Being, like Space and Time, that, according to Kant, are built into the very structure of our minds, giving shape and meaning to all our perceptions”.


Except that the threat of nuclear war, as Vladimir Putin is reminding the world, has not gone away. Russia has amassed the world’s largest collection of nuclear weapons, and Putin has threatened to “use them, if we have to”. The odds that he may are steadily increasing as Nato countries inch toward direct conflict with Russia. They are now sending Ukraine tanks and missiles, amassing troops in eastern Europe and providing intelligence that has allowed Ukrainians to target and kill Russian generals and sink a Russian warship. If this continues, the risk of nuclear war will be “considerable”, Russia’s foreign minister has warned. “The danger is serious, real, and we must not underestimate it.”

Yet many of Putin’s adversaries seem either unconvinced or, worse, unbothered by his threats. Boris Johnson has flatly dismissed the idea that Russia may use a nuclear weapon. Three former Nato supreme allied commanders have proposed a no-fly zone over Ukraine. This would almost certainly entail direct military conflict between Nato and Russia, and possibly trigger the world’s first all-out war between nuclear states. Still, social media boils over with calls to action, and a poll found that more than a third of US respondents wanted their military to intervene “even if it risks a nuclear conflict”.

Nuclear norms are fraying elsewhere, too. Nine countries collectively hold some 10,000 warheads, and six of those countries are increasing their inventories. Current and recent leaders such as Kim Jong-un, Narendra Modi and Donald Trump have, like Putin, spoken brazenly of firing their weapons. After North Korea promised “thousands-fold” revenge in 2017 for sanctions on its accelerating nuclear weapons programme, Trump threatened a pre-emptive strike, pledging to unleash “fire and fury like the world has never seen”. “This is analogous to the Cuban missile crisis,” one of Trump’s former aides, Sebastian Gorka, insisted.

Leaders have talked tough before. But now their talk seems less tethered to reality. This is the first decade when not a single head of a nuclear state can remember Hiroshima.

Does that matter? We’ve seen in other contexts what happens when our experience of a risk attenuates. In rich countries, the waning memory of preventable diseases has fed the anti-vaccination movement. “People have become complacent,” notes epidemiologist Peter Salk, whose father, Jonas Salk, invented the polio vaccine. Not having lived through a polio epidemic, parents are rejecting vaccines to the point where measles and whooping cough are coming back and many have needlessly died of Covid-19.

That is the danger with nuclear war. Using declassified documents, historians now understand how close we came, multiple times, to seeing the missiles fired. In those heartstopping moments, a visceral understanding of what nuclear war entailed helped keep the launch keys from turning. It’s precisely that visceral understanding that’s missing today. We’re entering an age with nuclear weapons but no nuclear memory. Without fanfare, without even noticing, we may have lost a guardrail keeping us from catastrophe.

The nuclear age started at 8.15am, 6 August 1945, with the release of a 4,400kg bomb from a B-29 over Hiroshima. Forty-three seconds later, an enormous explosion shattered the city. That so much destruction could be wrought so quickly was shocking news, which until then only a small coterie of scientists and military officials had known. “One senses the foundations of one’s own universe trembling,” a New York paper wrote in reaction to the terrifying new weapon.

What did the bomb mean? Atomic scientists, who’d had time to contemplate the question, rushed in to explain. The important thing wasn’t that atomic bombs could incinerate cities – conventional weapons were already accomplishing that. What distinguished atomic weapons was how easy they made it, noted J Robert Oppenheimer, who had helped oversee the bomb’s development. Atomic weapons “profoundly upset the precarious balance” between offence and defence that had governed war hitherto, Oppenheimer explained. A single plane, a single payload – no city was safe.

The implications were, the scientists admitted, horrifying. Albert Einstein had proposed in 1939 that the US government develop nuclear weapons, in order to ensure that Adolf Hitler wouldn’t acquire them first. But immediately after the war, he worried about any country possessing such power. Along with many of his colleagues, he concluded that the only solution was a global government, sovereign above existing countries, that would hold the world’s nuclear arsenal, enforce laws and prevent wars (other details were vague). One World Or None was the title of a bestselling book that Einstein, Oppenheimer and other scientists produced in March 1946. Even the head of the US air force contributed a chapter pleading for “a world organisation that will eliminate conflict by air power”.

Five months later, the world heard from another set of voices. The US occupation authorities in Japan had censored details of the bomb’s aftermath. But, without consulting the censors, the American writer John Hersey published in the New Yorker one of the most important long-form works of journalism ever written, a graphic account of the bombing. Born to missionaries in China, Hersey was unusually sympathetic to Asian perspectives. His Hiroshima article rejected the bomber’s-eye view and instead told the stories of six survivors.


For many readers, this was the first time they registered that Hiroshima wasn’t a “Japanese army base”, as US president Harry Truman had described it when announcing the bombing, but a city of civilians – doctors, seamstresses, factory workers – who had watched loved ones die. Nor did they die cleanly, vaporised in the puff of a mushroom cloud. Hersey profiled a Methodist pastor, Kiyoshi Tanimoto, who raced to the aid of his ailing but very much still-living neighbours. As Tanimoto grasped one woman, “her skin slipped off in huge, glove-like pieces”. Tanimoto “was so sickened by this that he had to sit down for a minute”, wrote Hersey. “He had to keep consciously repeating to himself, ‘These are human beings.’”

Hersey’s contemporaries understood the significance of these accounts. The New Yorker dedicated its full issue to Hersey’s article, and within an hour sold out its entire newsstand print run of 300,000 (plus another 200,000 copies to subscribers). Knopf published it as a book, which eventually sold millions. The text was reprinted in newspapers from France to China, the Netherlands to Bolivia. The massive ABC radio network broadcast Hersey’s text – with no commercials, music or sound effects – over four consecutive evenings. “No other publication in the American 20th century,” the journalism historian Kathy Roberts Forde has written, “was so widely circulated, republished, discussed, and venerated.”

Tanimoto, boosted to celebrity by Hersey’s reporting, made speaking tours of the US. By the end of 1949, he had visited 256 cities. Like Einstein, he pleaded for world government.

World government: it seems now like a wild utopia. Yet an astonishing number of people – responsible, sober people – felt it to be the only way of preventing more Hiroshimas. Winston Churchill and Clement Attlee both supported the idea. In France, Jean-Paul Sartre and Albert Camus championed it. France’s postwar constitution provided for the “limitations of sovereignty” a future world state might require. So did Italy’s.

Even in the US, which stood to lose its nuclear monopoly and global supremacy, support for a world state hovered between a third and a half in opinion polls. “World Government shall come – this is practically the consensus in this generation,” wrote the University of Chicago’s chancellor; he even convened a committee to draft its constitution. When candidates in the 1948 elections were asked if they favoured a global government with direct jurisdiction over individuals and peacekeeping powers, 57% said yes, including John F Kennedy and Richard Nixon. In 1946, the movie star Ronald Reagan donated $200 to the cause.

Such enthusiasm, however, counted for little in the face of geopolitics. Rising tensions between Washington and Moscow erased the possibility of global government. Still, they didn’t change the fact: across the west, leading thinkers felt nuclear weapons to be so dangerous that they required, in Churchill’s words, remoulding “the relationships of all men of all nations” so that “international bodies by supreme authority may give peace on earth and justice among men”.


For Albert Einstein, a world state was “the only one way out”: it was either that or annihilation. Yet the world state never came, and neither did another nuclear war. One of the most surprising and important facts of modern history is a quiet one: in the 76 years since Hiroshima and Nagasaki, not a single nuclear weapon has been detonated in anger.

Most weapons don’t work like that. Poison gas is one of the few other military technologies that, despite its effectiveness, has been shunned – the first world war was a gas war, but the second world war mostly wasn’t. Yet even chemical weapons have seen intermittent use, as by Iraq in the 1980s or Syria in 2013. By contrast, the number of nuclear weapons used since 1945 is zero.

Why? The conventional explanation is deterrence. The very thing that terrified Oppenheimer about nuclear weapons – that they made attacking easy and defending nearly impossible – meant that any country nuking a similarly armed foe would have to expect a counterstrike. “Because catastrophic outcomes of nuclear exchanges are easy to imagine, leaders of states will shrink in horror from initiating them,” the political scientist Kenneth Waltz argued. This logic led Waltz to a counterintuitive position: nuclear proliferation might be good. It’s not just that nuclear weapons deter nuclear weapons, it’s that they deter major wars in general by making the risks too high. The greater the number of states with nuclear arms, the argument goes, the less likely we are to see violence on the scale of the two world wars.

Conflicts have indeed become smaller since 1945. And although there have been border skirmishes between nuclear states – China and the USSR in 1969, India and Pakistan more recently – there haven’t been full wars. The bomb “gave peace to Europe”, the nuclear physicist Abdul Qadeer Khan argued. Khan led Pakistan’s nuclear programme starting in the 1970s and then transferred nuclear technologies to Iran, North Korea and Libya in the 1980s and 1990s. His active role in proliferation horrified many, but Khan felt that Pakistan’s acquisition of nuclear weapons (it tested its first in 1998) had saved it “from many wars”. By this grim reasoning, we might celebrate the fact that nearly half of humanity now lives in countries with nuclear weapons (and mourn Ukraine’s decision in the 1990s to destroy its warheads).

Yet central to Waltz’s deterrence theory was that the “catastrophic outcomes” of nuclear war were “easy to imagine”. For Waltz, who served in Japan in the immediate aftermath of the second world war, no imagination was necessary. Others needed help. This is why John Hersey’s Hiroshima article and Kiyoshi Tanimoto’s speaking tours were so important: they turned nuclear war from an abstraction into a reality.

A ‘duck and cover’ nuclear survival drill in a school in New York in 1962. Photograph: GraphicaArtis/Getty Images

It was a reality many lived with daily. Today, telling schoolchildren to hide under their desks if a hydrogen bomb strikes seems quaintly unhinged (though, actually, it’s good advice). But besides whatever trauma they wrought, preparedness drills like that created a widely shared nuclear consciousness. People regularly envisioned themselves in the position of the Hiroshima survivors. And if they needed help, such films as On the Beach (1959, US), The Last War (1961, Japan), The Day After (1983, US but broadcast globally) and Threads (1984, UK/Australia) vividly dramatised what a nuclear war would be like.

“Is it possible never to think about nuclear weapons?” asked the novelist Martin Amis in the 1980s. “The man with the cocked gun in his mouth may boast that he never thinks about the cocked gun. But he tastes it, all the time.” In the same decade, the psychiatrist Robert Lifton assessed the “psychic toll” such nuclear dread had taken. Hiroshima and Nagasaki weren’t just historical events, he argued, they were psychological ones, with rippling consequences. Living with the threat of annihilation threw “all relationships” into question. How could children trust their parents to keep them safe or churches provide spiritual continuity in such a world? Lifton attributed the rise of divorce, fundamentalism and extremism to the “radical futurelessness” the bomb had engendered.

Maybe one could dismiss the fallout shelters as theatre and the films as fiction. But then there were the bomb tests – great belches of radioactivity that previewed the otherworldly dangers of nuclear weapons. By 1980, the nuclear powers had run 528 atmospheric tests, raising mushroom clouds everywhere from the Pacific atoll of Kiritimati to the Chinese desert. A widely publicised 1961 study of 61,000 baby teeth collected in St Louis showed that children born after the first hydrogen bombs were tested had markedly higher levels of the carcinogen strontium-90, a byproduct of the tests, despite being some 1,500km away from the closest test site.

Unsurprisingly, nuclear tests stoked resistance. In 1954, a detonation by the US at Bikini Atoll in the Pacific got out of hand, irradiating the inhabited atoll of Rongelap and an unfortunate Japanese tuna fishing boat. When the boat’s sickened crew returned to Japan, pandemonium erupted. Petitions describing Japan as “thrice victimised by nuclear bombs” and calling for a ban collected tens of millions of signatures. Ishiro Honda, a film director who’d seen the Hiroshima damage firsthand, made a wildly popular film about a monster, Gojira, awakened by the nuclear testing. Emitting “high levels of H-bomb radiation”, Gojira attacks a fishing boat and then breathes fire on a Japanese city.

Godzilla, 1954. Photograph: Allstar

Gojira – or Godzilla, as he’s known in English – wasn’t the only one awakened by the 1954 Bikini test. The test put Hiroshima back in the spotlight and raised the profile of its survivors. In 1955, Kiyoshi Tanimoto brought 25 women, the “Hiroshima Maidens”, to the US for reconstructive surgery. He appeared before 15 million viewers on the television show This Is Your Life and recounted his ordeal the day Hiroshima was attacked. (In an agonising moment, he was then made to shake hands with a surprise guest, a drunken Robert A Lewis, one of the two pilots who had bombed it.)

As the anti-nuclear movement spread, “Hiroshima” became less an unfortunate event in Japan’s past than a semi-sacred one in world history, to be commemorated by morally serious people no matter their nationality. Tanimoto promoted “Hiroshima Day” and by the early 1960s there were protests and memorials on that day throughout the world. Denmark alone held demonstrations in 45 towns in 1963.

By then, Hiroshima occupied a similar place in public memory to Auschwitz, the other avatar of the unspeakable. The resemblance ran deep. Both terms identified specific events within the broader violence of the second world war – highlighting the Jews among Hitler’s victims, and the atomic bomb victims among the many Japanese who were bombed – and marked them as morally distinct. Both Hiroshima and Auschwitz had been the site of “holocausts” (indeed, early writers more often used that term to describe atomic war than European genocide). And both Hiroshima and Auschwitz sent forth a new type of personage: the “survivor”, a hallowed individual who had borne witness to a historically unique horror. What Elie Wiesel did to raise the stature of Europe’s survivors, Tanimoto did for Japan’s. In their hands, Hiroshima and Auschwitz shared a message: never forget, never again.

Yet the analogy was imperfect. The European Holocaust was the work of many hands. Mass killing, ordered from on high, had to be carried out by countless willing executioners, who snatched the victims from their homes, stuffed them on to trains, kept them in camps, shot them, gassed them and disposed of their corpses. By contrast, the nuclear apparatus, once in place, could be set into motion by a handful of men in only a few minutes.

This also meant, the world soon realised, that another Hiroshima could come by accident. The murder of Europe’s Jews was many things, but it wasn’t inadvertent. In nuclear standoffs, a plane crash, system malfunction or miscalibrated threat could all plausibly trigger annihilation.


Nuclear standoffs are dangerous by design. As in the game of chicken, the point is to set off on a collision course and frighten your opponent into swerving first. “Fill the nuclear glass to the brim,” Soviet premier Nikita Khrushchev advised his colleagues, “but don’t pour the last drop.”

Such brinksmanship requires leaders to quell their doubts, possibly even to convince themselves that they’re willing to see the glass spill over. Perhaps some are. “The whole idea is to kill the bastards,” said US general Thomas Power, when presented in 1960 with a nuclear plan designed to minimise casualties. “Look. At the end of the war, if there are two Americans and one Russian, we win.” This is the man who led the US Strategic Air Command – responsible for its nuclear bombs and missiles – during the Cuban missile crisis.

A duck and cover defence poster, 1950. Photograph: Corbis/Getty Images

Generals like Power, tasked with winning wars, often pressed for pre-emptive strikes. Fortunately, they were overruled. Deterrence is surely one reason why, but memory played an important part, too. At key moments, decision-makers vividly imagined what would happen if they fired their weapons. They knew what nuclear aftermath looked like.

Even Truman, who had initially considered the bombing of Hiroshima “the greatest thing in history”, found cause for restraint. When the UN forces became stalemated in the Korean war, their commander Douglas MacArthur requested “atomic capability”, later explaining that he’d wanted to drop “between 30 and 50 atomic bombs”. Although Truman readied nuclear weapons, he fired MacArthur and declined to use them.

Why? Looking back, Truman complained of his “locally minded” field generals who couldn’t grasp what going nuclear would have meant. It would have meant an escalating war destroying cities containing millions of “innocent women, children and noncombatants”, Truman imagined. Truman “just could not” drop the bombs, he wrote. And, he added, “I know I was right.”

Truman’s successor, Dwight Eisenhower, felt the same. He built up his country’s nuclear arsenal, but when his military advisers urged a preventative attack on the Soviet Union, he refused. He’d seen war, and he could easily envision a nuclear conflict. It would mean, he told them, a “great area from the Elbe to Vladivostok and down through south-east Asia torn up and destroyed without government, without its communications, just an area of starvation and disaster”. He couldn’t do it, either.

Such familiarity with war’s horrors proved essential in the Cuban missile crisis. The US placement of nuclear missiles in Turkey, followed by the Soviet placement of them in Cuba, brought the two powers terrifyingly close to war. But after a series of escalating threats, Kennedy’s Soviet counterpart Khrushchev changed the tone with a frantic, personal appeal. “I have participated in two wars and know that war ends when it has rolled through cities and villages, everywhere sowing death and destruction,” he wrote. Kennedy, as a fellow “military man” who had also seen combat (Khrushchev mentioned this twice), would “understand perfectly what terrible forces” could be unleashed.

Kennedy did understand, and he veered away from what he called the “final failure”. Yet even as Kennedy and Khrushchev were de-escalating their dangerous confrontation, a “far more dangerous” one was developing at sea, the late historian Martin Sherwin has argued. Its resolution suggests just how important experiential knowledge has been in keeping disaster at bay.

The situation involved a Cuba-bound Soviet submarine, carrying a nuclear warhead as powerful as the one that had decimated Hiroshima. The submarine, out of radio contact for days, had missed all the hasty diplomacy between Kennedy and Khrushchev. And so, when the sub’s captain found himself beneath a US warship, he refused to surface when signalled. Instead, he stayed submerged while the US sailors grew more aggressive in their signalling. One skipper tried the unauthorised and reckless tactic of dropping grenades on the sub.

A leaflet distributed in the UK in May 1980, giving advice about surviving a nuclear attack. Photograph: PA

The explosions were terrifying, and the Soviet captain understandably assumed the war was on. He ordered the torpedo readied. “We’re gonna blast them now! We will die, but we will sink them all – we will not become the shame of the fleet,” the ship’s radio officer remembered him shouting. Firing the first nuclear weapon since Nagasaki at the US navy near Cuba at the height of the missile crisis would have almost certainly triggered nuclear retaliation. “This was not only the most dangerous moment of the cold war,” believed Kennedy aide and historian Arthur Schlesinger Jr. “It was the most dangerous moment in human history.”

The disaster was prevented by a Soviet officer, Vasily Arkhipov, who by chance had been assigned to travel with the submarine. Fifteen months earlier, he’d served aboard a nuclear-powered submarine whose reactor coolant system had failed, exposing the crew to radiation and killing 22 of his 138 shipmates. “He’d seen with his own eyes what radiation did to people,” his wife told a historian. “This tragedy was the reason he would say no to nuclear war.” Now facing the likelihood of war in the Caribbean, Arkhipov talked the enraged captain down.

It was astonishing luck: the submarine poised to start a nuclear war randomly had aboard one of the few individuals on the planet with recent experience of nuclear disaster. The ability to clearly remember and imagine nuclear war’s consequences had been, once again, essential to averting them.


In 1985, John Hersey returned to Hiroshima for the 40th anniversary of the bombing. He came to commemorate the past, but he found it fading. The survivors, on average, were now 62. Two of the six people he had profiled in his article were dead. Kiyoshi Tanimoto was still alive, but he was over 70 and retired. “His memory, like the world’s, was getting spotty,” Hersey wrote.

By then, the above-ground nuclear tests had stopped (the achievement of decades of antinuclear activism and treaties). Apocalyptic movies continued, but more often featuring zombies, aliens, intelligent machines, diseases or climate change than nuclear catastrophe. Hiroshima’s resonance, which once matched Auschwitz’s, was growing muffled. Today, knowledge of the Holocaust is kept alive by more than 100 museums and memorials, including in such unexpected countries as Cuba, Indonesia and Taiwan. But there is no comparable memory industry outside of Japan to remind people of nuclear war.

The result is a profound generational split, evident in nearly every family in a nuclear state. My father, born a month after Hiroshima was bombed, remembers going to a concert during the Cuban missile crisis “wondering if I would survive to the end”. My mother had constant nightmares of nuclear war. By contrast, I was born the year the atmospheric testing stopped, and such thoughts never crossed my mind. The extent of my nuclear consciousness was the hours I spent playing a video game called Duke Nukem. That game debuted in 1991, the year the cold war ended and Mikhail Gorbachev declared that “the risk of a global nuclear war has practically disappeared”.

We should feel relief, but the dispelling of dread has made it hard for many to take nuclear war seriously. “I hear people talk about nuclear weapons,” arms control expert Jeffrey Lewis told me recently, “and it’s just so divorced from reality.” They’ve become “dead metaphors”, Lewis feels, lacking the concreteness to disturb our thoughts or constrain our behaviours.

With nuclear threats far from mind, voters seem more tolerant of reckless politicians. Donald Trump, a case in point, has made outrageous threats, praised his own “unpredictability” in nuclear affairs and suggested using nuclear bombs against hurricanes (“you could hear a gnat fart in that meeting”, a source told Axios). Yet in the energetic discussion about the risks of Trump’s running for president in 2024 and winning, nuclear issues are far from central.

Nor is it only Trump. The nine nuclear states have had an impressive string of norm-breakers among their recent leaders, including Trump, Vladimir Putin, Narendra Modi, Kim Jong-un and Benjamin Netanyahu. With such erratic men talking wildly and tearing up rulebooks, it’s plausible that one of them might be provoked to break the ultimate norm: don’t start a nuclear war.

Caution does not seem in abundance nowadays. Invading Ukraine, Russia turned Chernobyl into a battlefield and recklessly shelled Europe’s largest nuclear power plant at Zaporizhzhia. The Zaporizhzhia attack unsurprisingly set part of the site aflame. It was “the first time in history” that a nuclear plant had been attacked, Ukrainian president Volodymyr Zelenskiy pointed out. “If there is an explosion, it is the end of everything.”

But how guided are leaders by such fears? In the past 20 years, the US has pulled out of the 2015 nuclear deal with Iran and two of the three main treaties restraining its arms race with Russia (the third is in bad shape). Meanwhile, China has been developing aggressive new weapons. In 2019, India made an airstrike in Pakistan, the first time its planes crossed the military border in Kashmir – known as the Line of Control – since either state acquired nuclear weapons. India has “stopped the policy of getting scared of Pakistan’s threats”, its prime minister, Modi, declared. India has the “mother of nuclear bombs” and its arsenal is “not being kept for Diwali”.

The cost of the shredded norms and torn-up treaties may be paid in Ukraine. Russia invested heavily in its nuclear arsenal after the cold war; it now has the world’s largest. The worse the war in Ukraine goes, the more Putin might be tempted to reach for a tactical nuclear weapon to signal his resolve. Already, Russia has threatened nuclear war multiple times, and yet Nato countries increase their aid to Ukraine. The “current generation of Nato politicians”, Russia’s exasperated ambassador in Washington has complained, “does not take the nuclear threat seriously”.

Maybe they don’t. Hiroshima lies just outside their collective memory. The oldest of the 30 Nato leaders, Joe Biden, was two years old in August 1945. The youngest, prime minister Dritan Abazović of Montenegro, may not even remember the cold war, as he was five when the Soviet Union collapsed.

That is what time does. Traumas fade, fortunately for us all. It is a profound achievement – though surely aided by luck – that no nuclear war has refreshed our memories since 1945. We should rejoice, too, that the looming dread engulfing past generations has largely dissipated. This is what we want nuclear war to be: an archaic practice, relegated safely to the past.

But we can’t drive nuclear war to extinction by ignoring it. Instead, we must dismantle arsenals, strengthen treaties and reinforce antinuclear norms. Right now, we’re doing the opposite. And we’re doing it just at the time when those who have most effectively testified to nuclear war’s horrors – the survivors – are entering their 90s. Our nuclear consciousness is badly atrophied. We’re left with a world full of nuclear weapons but emptying of people who understand their consequences. 

This article was amended on 13 May 2022. An earlier version included The Day the Earth Caught Fire among films that dramatised what a nuclear war would be like; however, that 1961 film is about the consequences of simultaneous nuclear tests. This has been replaced by a reference to the 1984 film Threads.
