Atomic Medicine: the Cold War Origins of Biological Research
Military concerns drove the development of nuclear weapons. But a by-product of this huge deployment of scientific resources by the US and the UK was an upsurge in biological research leading to a new age of regenerative medicine. Alison Kraft discusses the history of stem cell biology.
The detonation of atomic bombs over the Japanese cities of Hiroshima and Nagasaki in August 1945 announced to the world the arrival of the ‘atomic age’. Twelve years later, a team in New York led by E. Donnall Thomas reported the first use of bone marrow transplantation in cancer patients. These striking but very different developments may appear unconnected; but both are in fact intimately linked to the history of stem cell biology. In different ways each was important for the identification in the late 1950s of the blood stem cell, the origin of the entire blood system and the first stem cell to be recognised. In the decades that followed, stem cell biology was to develop into a cutting-edge biomedical enterprise that today offers hope for a host of prevalent, debilitating diseases. Amid the contemporary expectations surrounding the prospect of new and powerful stem cell therapies, the atomic heritage of stem cell biology has largely been forgotten.
The Manhattan Project
The atomic bombs dropped on Japan in 1945 were the culmination of a top-secret American enterprise, the Manhattan Engineering District, more commonly known as the Manhattan Project. The initiative was prompted by growing fears that Nazi Germany might be developing an atomic bomb following a number of advances in nuclear physics that suggested the theoretical possibility of such a weapon. Authorised by President Roosevelt in the summer of 1942, the bomb project was placed under military control and commanded by General Leslie R. Groves. The Manhattan Project was a scientific mission unprecedented in scale: lasting for three years, it involved over 150,000 personnel and cost over $2 billion. The vast and complex organisation involved a number of sites across the US: secrecy was paramount and maintained by stringent security, including a policy of strict compartmentalisation at and between sites. Only a few senior figures knew of the project: President Truman, who over the summer of 1945 agreed with his advisors on the use of the atomic weapon, learned of it only after succeeding Roosevelt in April.
Mobilisation of the project was swift and involved military personnel, scientists, commercial contractors and a large construction workforce. Each site had responsibility for a specific task: initial small-scale development of the nuclear reactor, for example, took place at the University of Chicago. Establishing ‘proof-of-principle’ of the chain-reacting nuclear reactor was key to the entire project: bomb production rested wholly upon a supply of uranium and plutonium generated by fission reactions within the reactor. Reactor scale-up, together with uranium separation, was the province of a new installation at Oak Ridge, Tennessee. Here a quiet valley east of Nashville was cleared of its few inhabitants and transformed in under a year into the fifth largest city in the state, with a population of 55,000 drawn largely from neighbouring towns and states. Two thousand miles away, towering reactors were under construction at Hanford, Washington, principally for plutonium production. The final phase, weaponisation, was overseen by the charismatic theoretical physicist J. Robert Oppenheimer at the remote site of Los Alamos in the New Mexico desert. The first nuclear bomb test, code-named ‘Trinity’, was carried out in July 1945 at Alamogordo, 200 miles from Los Alamos. Timing was crucial: the successful ‘Trinity’ test allowed Truman to hint to Stalin, at the Potsdam Conference held in July and August 1945, that the US now possessed a powerful new weapon. It also foreshadowed by less than a month the use of the bombs against Japan.
A new era of radiation biology
The Manhattan Project culminated in events that profoundly shaped the postwar geopolitical map and in technologies that formed the basis of a new world order. It has been seen as exemplifying ‘Big Science’, a concept that highlights the changing scale of science after the Second World War and draws attention to shifts in its organisation, culture and practice. Some historians have emphasised how the Manhattan Project marked a turning point in the relationship between science and the military and between scientists and the state. A less well-known story is that of the central place of the Project in the development of radiobiology in the ‘atomic age’ and its place in the history of stem cell biology. In 1942 it was recognised that the fission reaction at the heart of the nuclear reactor would generate new and intense forms of ionising radiation (a form of radiation that causes biological damage, such as burns or genetic mutations). It was envisaged too that further processing of fission products to generate bomb-grade uranium and plutonium would likewise present new and daunting radiation hazards. Yet the precise nature and extent of the hazard remained uncertain because the dangers to health of ionising radiation remained poorly understood.
Following Wilhelm Roentgen’s discovery of X-rays in 1895, the widespread medical use of X-rays was by the 1920s being linked to an increased incidence of leukaemia among radiologists. Indications of a link between radiation exposure and cancer also came from the profligate use of the highly radioactive element radium, discovered in 1898 by the Curies. For instance, radium-based health ‘tonics’ such as Radithor were popular during the 1920s until links with cancer curtailed their use. Most notorious in this respect was the case of the radium dial painters in the US, which served as a sombre harbinger of the health hazards of ionising radiation. Radium was used industrially as a luminising compound, for example in the manufacture of luminous dials for aircraft instruments and watches: typically it was applied with paintbrushes which the painters tapered to a fine point with their lips. In the mid-1920s, Theodore Blum and Harrison Martland, dentist and physician respectively, became concerned about the incidence of oral and jawbone cancers in young women employed by the United States Radium Corporation in New Jersey. Following protracted and acrimonious litigation, these cancers were finally acknowledged to be linked to radium – that is, to exposure to low-level radiation.
The hazards of low-level radiation were by this time also being emphasised by the radiation biologist H.J. Muller, who proposed that this kind of exposure could cause genetic mutations, some of which might lead to cancer. Furthermore, it was suggested that such acquired mutations might also be passed between generations. In the absence of conclusive evidence, the association between radiation exposure and cancer remained contested. Concerns were sufficient, however, to prompt the creation in 1928 of national and international radiation protection committees which established guidelines for the handling and use of radiation: international regulations setting out new ‘safe’ limits on occupational exposure followed six years later. Unease over radiation dangers spurred research into the harmful effects of ionising radiation. If radiobiology remained a nascent and somewhat arcane field in the interwar period, important foundations were laid: for example, blood was established to be the tissue most acutely sensitive to the damaging effects of ionising radiation. However, knowledge of the physiological effects of radiation remained at best partial and exposure to it was recognised to pose ‘risks’ to health. Mindful of this, radiation safety formed an integral part of the Manhattan Project. That said, the ‘special hazard’ – the code word for radiation – was not permitted to impede progress on what was a military mission.
Radiation and blood
The University of Chicago, home to the first step in the technical sequence leading to the bomb – the development of the nuclear reactor – was also home to a Health Division charged with responsibility for radiation safety for the entire project. Initially focused on monitoring radiation exposure through a systematic programme of regular blood tests for all personnel, the work of the Health Division expanded and changed over the course of the project. In particular, after a team led by Enrico Fermi achieved the first controlled, self-sustaining nuclear chain reaction (‘criticality’) in December 1942, its role moved beyond health surveillance into a wide-ranging programme of radiobiological research. This involved thousands of experiments, mainly on mice, investigating the biological effects of acute and chronic exposure to different forms and intensities of ionising radiation.
The Chicago-based physician Leon Orris Jacobson, selected for his expertise in blood diseases, emerged as a key figure in this research enterprise. Through his involvement in the Manhattan Project, Jacobson became a leading expert on the biological effects of ionising radiation upon the blood system. After the war, he remained in Chicago where, as well as directing a new cancer hospital in the city, he continued his research into the damaging effects of radiation on the blood system. In this, Jacobson worked closely with the émigré German biophysicist Egon Lorenz from the National Cancer Institute in Bethesda, Maryland. This collaboration, which continued until Lorenz’s death in 1954, contributed scientific insights and technical advances that were fundamental to radiation biology; it also set in train the radiobiological research from within which the blood stem cell was to emerge.
The scope and scale of the radiation hazard was transformed after the war: the possibility of mass exposure of the public to potentially large doses of radiation seemed all too real. Radiation research took on new importance as several nations embarked on peacetime nuclear programmes encompassing weapons but also energy and medical applications – technologies that came to symbolise the ‘peaceful atom’, which were later to become a focus of President Eisenhower’s 1953 ‘Atoms for Peace’ initiative. In the US, the peacetime nuclear enterprise was placed under the control of a new body, the Atomic Energy Commission, created by the 1946 Atomic Energy Act. Here, military and civilian figures forged an uneasy alliance as they vied for primacy in nuclear matters. Against this backdrop, the strategic significance of radiobiology rocketed, its newfound status reflecting its central role in managing and minimising the dangers of radiation. Radiobiology flourished on both sides of the Atlantic. In the US, the network of labs created as part of the Manhattan Project became a national laboratory system serving the strategic scientific needs of the nuclear state: the University of Chicago and Oak Ridge, for example, became major centres for radiobiology.
In Britain, similarly alert to the dangers of radiation, provision for radiobiological research was included within the Atomic Energy Research Establishment (AERE) at Harwell in Oxfordshire. Created in 1945 by the newly elected Attlee government and headed by Sir John Cockcroft, AERE was charged with oversight of key parts of Britain’s nuclear enterprise, including the medical radioisotope programme. A new Radiobiological Research Unit (RRU) was established in 1947 as part of the AERE, with the haematologist John F. Loutit appointed as its director. By the early 1950s this unit had become the country’s leading centre for radiobiology, where the ‘radiation and blood’ theme formed a principal focus of research.
The radiation recovery factor
Meanwhile, in the US, Jacobson and Lorenz were revisiting a puzzling observation made by Jacobson during his wartime work on the Manhattan Project, which suggested that one part of the blood system, the spleen, offered some form of ‘protection’ against radiation damage. Jacobson postulated the existence of a radiation ‘recovery factor’ produced by the spleen and responsible for ‘radiation protection’. Jacobson was alert to the therapeutic potential inherent in this intriguing phenomenon: if the ‘recovery factor’ could be isolated, it might be useful as a means to protect against or treat radiation injury. This resonated strongly with growing political concern and public unease about ionising radiation. In 1951, Lorenz devised the technique of bone marrow transplantation (BMT) to test whether bone marrow could also give rise to this protective effect. This experiment, which involved the transfer of bone marrow from a healthy mouse into a recipient mouse that had been given a lethal dose of radiation, confirmed the recovery factor to be present in bone marrow. The transplanted marrow prevented the death of the recipient. Bone marrow superseded the spleen as the key site for investigations of the recovery factor while BMT rapidly became the principal research tool for studying the effects of radiation on the blood system.
In the first half of the 1950s the identity of the recovery factor became the subject of intense debate within the small research community that coalesced around the ‘radiation and blood’ theme. This resolved into two competing explanations: Leon Jacobson thought the factor was probably a hormone; by contrast, the British group at RRU Harwell thought it more likely to be a distinctive kind of cell. In favouring the hormonal explanation, Jacobson may have had in mind the possibility of an anti-radiation drug based on the recovery factor. This vision of a hormone-based anti-radiation drug did not go unnoticed by a pharmaceutical industry keen to repeat its recent runaway clinical and commercial success with the hormone insulin for the treatment of diabetes. Radiation was a ‘hot’ topic and a potentially lucrative commercial market. This was especially compelling amid intensifying concerns about nuclear war and nuclear accidents, which were accompanied by fears of mass exposure to large doses of radiation. An anti-radiation drug would certainly be of interest to the military and to those bodies – public health, civil defence – concerned with the welfare of the public in the atomic age.
To concerns about single large doses of radiation arising from bombs and accidents could be added growing unease about radiation ‘fallout’, the low-level radioactive contamination that accompanies nuclear explosions. Between 1945 and 1949 the US enjoyed a nuclear monopoly: keen to make technical improvements on the relatively ‘crude’ bombs deployed against Japan, it had begun nuclear testing in the Pacific in 1946. The US nuclear monopoly was brought to an end by the first Soviet nuclear test in August 1949. Reeling from this and grappling with wider events including the Berlin airlift, the ‘fall’ of China, the emerging Korean crisis and revelations of atomic espionage, the Truman administration in 1950 quadrupled its defence budget and sanctioned the development of the hydrogen bomb.
The onset of the Cold War and the arms race was accompanied by a period of intense nuclear testing by the US and Britain in the Pacific and by the Soviets at Semipalatinsk in Kazakhstan; from 1951 onwards the US also conducted tests at the Nevada Proving Grounds. Concerns about fallout were amplified by accidents such as occurred during the Upshot-Knothole test series in Nevada in 1953, which resulted in contamination of people and land in Nevada and neighbouring states, and at the Bravo test in the Pacific the following year, when a Japanese fishing trawler, the Fukuryu Maru (Lucky Dragon), was heavily contaminated with radioactive fallout. Widely covered in the press, such incidents served to heighten fears of fallout and, in underlining the link between nuclear technologies and radiation, fuelled further public wariness of the nuclear enterprise generally. Fallout and lingering deaths from radiation sickness featured in a burgeoning nuclear genre, including films and books such as Nevil Shute’s post-apocalyptic novel On the Beach (1957).
Public unease about radiation fallout, nuclear war and nuclear accidents lent urgency to the search for anti-radiation therapies. ‘Nuclear fears’ also meshed in complex ways with mounting public concerns in the 1950s about cancer. Cancers of the blood – the leukaemias – were especially dreaded. Although new chemotherapeutic agents were in this period showing great promise, the leukaemias remained invariably fatal and they also afflicted children disproportionately. Moreover, reports that the incidence of leukaemia was increasing led some to view this disease as the ‘pestilence of the atomic age’. Since blood was known to be acutely sensitive to radiation, some argued that the rising incidence of leukaemia was the result of increased radiation exposure – perhaps from fallout. The longstanding and unresolved link between radiation and leukaemia now fed into the debate over the dangers of fallout. This link was strengthened by data emerging from the US Atomic Bomb Casualty Commission, established on the orders of President Truman in 1946 to study the long-term health of the Hibakusha (survivors of Hiroshima and Nagasaki), which suggested a disproportionate incidence of leukaemia within this population. These findings, however, were fiercely contested among scientists who remained divided on the issue of a causal link between leukaemia and radiation.
Attempts to bring to the fore the ‘peaceful’ uses of ‘the atom’ – energy and the medical radioisotope – such as Eisenhower’s 1953 Atoms for Peace initiative, failed to assuage growing public concern about the nuclear enterprise. Later in the 1950s this concern found expression in movements such as the Campaign for Nuclear Disarmament. Meanwhile, the search for anti-radiation therapies proceeded along two main lines, both of which attracted media attention. One focused on a group of chemical compounds that had been shown to reduce radiation damage. For a time it was hoped that these compounds might form the basis of an anti-radiation pill which, it was envisaged, could sit in the bathroom cabinet alongside other medications to be used if, or when, necessary. In the event, these compounds proved too toxic for such use.
The other centred on Leon Jacobson’s recovery factor, which had become the subject of a great deal of research both in the US and at Harwell. However, hopes that a hormone-based recovery factor might form the basis of an anti-radiation therapy were not to be realised. Instead, in 1956 experiments by the Harwell team and by other groups confirmed the recovery factor to be the blood stem cell – from which all blood cells originate. Its existence had been postulated since the late 19th century, when the blood system was first recognised to be a dynamic cell-based system comprising nine different kinds of cell, with a turnover estimated at over one million cells per second. Uniquely endowed with regenerative powers, the blood stem cell could give rise to the entire blood system. Resident in small numbers within the bone marrow, this cell was responsible for the ‘radiation recovery’ first reported by Jacobson: the ‘radioprotective effect’ observed following marrow transplantation reflected the regeneration of the blood system by this distinctive cell.
BMT, a technique devised to study the biological effects of lethal radiation, had, quite unexpectedly, provided a window onto the blood stem cell and its regenerative powers.
Recognition that the recovery factor was the blood stem cell marked a new beginning, one that rested on harnessing its remarkable regenerative powers for therapeutic purposes in the cancer clinic. BMT was now recast as an adjunct to radiotherapy in the treatment of leukaemia. Here, Loutit’s group at Harwell laid important ground when, later in 1956, they reported the successful treatment of leukaemic mice with high-intensity radiotherapy followed by BMT. At a time when pioneering work in transplantation medicine was hitting the headlines – the first human kidney transplant had taken place in 1954 – the radically new technique of bone marrow transplantation now moved rapidly into the cancer clinic. Here it formed a new treatment paradigm for the leukaemias: BMT formed the second component of a two-stage therapeutic regime, the first element being high doses of radiation, which, it was hoped, would better eradicate the malignancy arising within the patient’s blood system. The first report of its use in humans was published in 1957 by the team led by the US physician E.D. Thomas, who was driven by the plight of his cancer patients. By the late 1980s, and drawing on advances from many fields, especially immunology, BMT had become the therapy of choice for a range of blood cancers. In 1990, Thomas received the Nobel Prize for his pioneering work in the transplantation field. This story of clinical success rested fundamentally upon the regenerative powers of the blood stem cell.
The atomic heritage of stem cell biology illustrates the contingent nature of scientific advance and reveals how the direction in which science moves is sometimes a pragmatic response to wider political and public concerns. Radiobiologists seeking to understand the biological effects of radiation and to develop anti-radiation therapies devised a new technique, BMT, through which they then stumbled upon the blood stem cell. BMT then found use in the treatment of leukaemia, a disease which became more prominent after the war and for which new therapies were desperately needed and especially welcome.
Political, public and, increasingly, commercial forces are also at play within contemporary stem cell biology. In the past two decades, the field has expanded rapidly and turned in wholly unanticipated directions. It now encompasses a diverse range of stem cell populations, including, since its identification in 1981, the embryonic stem cell. Currently, stem cells occupy the vanguard of a new therapeutic paradigm, that of regenerative medicine. The hopes vested in them derive from a vision that their unique regenerative powers can form the basis of novel therapies for a range of prevalent and debilitating diseases for which there is currently no effective treatment. Commercial interest and investment have been considerable. The journey from recovery factor to blood stem cell, from radiobiology to regenerative medicine, illuminates how our understanding of contemporary science is enriched by an appreciation of its past.