
Germ Warfare

Robert Bud says we should remember the Asian flu epidemic of 1957 as a turning point in the history of antibiotics.

Drawing of the 1918 Influenza. Wellcome Collection.

A fiftieth anniversary is an appropriate occasion to recall the first great influenza pandemic of the antibiotic age – one that caused more than a million deaths worldwide. The ‘Asian flu’ identified in China in February 1957 reached Britain that autumn, where it killed, directly and indirectly, more than 16,000 people, and possibly as many as 40,000.

More dangerous than the influenza itself was a secondary infection that caused a sometimes fatal pneumonia and confronted doctors with rich evidence of the growing resistance of bacteria to the hitherto triumphant penicillin. Even while the world was celebrating the miraculous discovery and deep impact of this new wonder drug and others like it, these drugs were being used in ways that vitiated their effects by encouraging the breeding of resistant organisms.

To modern ears, the phrase ‘antibiotic age’ may sound archaic, but half a century ago it expressed some of the revolutionary hopes for the years after the Second World War. The writer Peter Vansittart would later recall in his autobiography In the Fifties (1995) that, along with Fascism, mass unemployment and (soon) colonialism, the threat from germs seemed to have been routed. Fear of infectious disease appeared a thing of the past. The plague epidemics of the fourteenth and seventeenth centuries, the outbreaks of cholera in the nineteenth century, even the influenza epidemic that followed the First World War, are part of Britain’s ‘island story’, but now pioneering sociologists of medicine reported that the enduring human experience of infectious disease had apparently been superseded. A major textbook of the day, Macfarlane Burnet’s Natural History of Infectious Disease (2nd edition, 1953), celebrated ‘the virtual elimination of infectious disease as a significant factor in social life’. These attitudes were underpinned by dramatic change in the doctor’s surgery: carbuncles were now easily treated with antibiotics, pneumonia could be cured within a few days, even tuberculosis was tamed.

Antibiotics have been credited with being the major factor in ending the risks consequent on infection. Penicillin, the first, was introduced during the Second World War. It was followed by the anti-tuberculosis drug streptomycin and then by a host of powerful and widely effective members of the ‘golden horde’ of tetracycline drugs. Through the 1950s other important families of antibiotics were quickly identified. It can be argued that too much importance was attributed to this one piece of new technology. In wealthier countries at least, social factors such as improved housing and nutrition were responsible for a long-term decline in the threat from diseases such as tuberculosis. Nonetheless, antibiotics complemented these factors, both by reducing the dangers of infection and by providing reassurance in the certainty of a cure.

With time, of course, such confidence came to seem facile. The 1980s witnessed the terror of AIDS, followed by ‘flesh-eating viruses’, CJD, SARS and avian flu. Bacteria resistant to antibiotic treatment came to infest hospitals and communities; the management of MRSA (methicillin-resistant Staphylococcus aureus), the latest fear, became a political issue in the 2005 general election, and policy-makers have openly discussed the implications of a ‘post-antibiotic’ age.

The dramatic shift in narrative from the post-war celebration of victory to fear of defeat in our own times might be compared to a Greek tragedy. Indeed, Aristotle’s defining characteristics of a ‘complex tragedy’ – the turning point (peripeteia) and the moment of the hero’s awareness (anagnorisis) – have their counterparts in this most modern of dramas. The 1957 epidemic can surely serve as the turning point, and the years since the mid-1990s be portrayed as the point of awareness by the general public.

The Asian flu proved highly infectious: one in six of the British population suffered its symptoms of a high but generally short-lived fever and aching joints. More working time was lost to the illness in the first two weeks of October than to all the strikes in 1956. With miners laid low, production of coal fell by one-and-a-half million tons. The press reported that the infection was gripping schools and incapacitating naval vessels: in September, two ships and two submarines had to be excused from a major NATO exercise because their crews had fallen ill. According to The Times of October 8th, half the boys at Eton were sick.

Despite a brief high fever, with a temperature quickly topping 39°C (102°F), the symptoms were generally quite mild, and the treatment of aspirin, plenty to drink and bed rest usually proved effective. The British Medical Association discouraged advertisements that urged patients to call their doctor. A vaccine, given by two injections at a three-week interval, was available in limited quantities only, to such priority cases as medical and welfare staff. The Queen was given a course before visiting North America that year.

For a proportion of the epidemic’s victims – small as a percentage of the whole, but nonetheless large in actual numbers of people – the illness was far from trivial. The degree of suffering was much higher among the elderly, the infirm and the very young, and in many cases they succumbed, not to the influenza itself, but to pneumonia caused by a recently characterized bacterium resistant to all antibiotics, Staphylococcus aureus 80/81. This germ could produce an enzyme, penicillinase, that destroyed the penicillin molecule. Not only resistant to medicine, the bacterium was also virulent in its attack on humans.

Staphylococcal pneumonia proved fatal to more than a quarter of patients who contracted the infection, normally after entering hospital suffering from influenza. Its effects were rapid: within three days, the victims turned blue and asphyxiated. In America, childhood deaths from pneumonia had declined radically through the 1950s; now figures showed a brief upsurge. For those who wished to see, the fragility of the antibiotic age had been exposed.  

Even before the flu epidemic, Staph. aureus had posed the major bacterial threat to the curative powers of antibiotics. The species had been named in 1884 to denote its golden colour and resemblance to a bunch of grapes. For all their attractive appearance and poetic name, the germs had long been recognized as a danger in hospitals. Indeed it was the effect of Penicillium mould on a staphylococcal culture that led Alexander Fleming to recognize the mould’s potential power and ultimately led to the isolation and use of penicillin.

At first, Staph. aureus was almost universally susceptible to penicillin. However, antibiotic-resistant strains of the bacterium displaced susceptible ones much more quickly than was the case with other species – a bacteriologist at London’s Hammersmith Hospital noticed in 1948 that the character of a Staph. aureus population had shifted from predominantly susceptible to generally immune to penicillin within just two years. One immune strain of the bacterium in particular, Staph. aureus 80/81, spread rapidly in the 1950s to infect hospitals across the world.

Staph. aureus 80/81 did not just cause pneumonia in influenza sufferers, it also infected the skin of newborn children. At that time infants born in hospital were kept separate from their mothers in large crèches, within which infection spread easily. About one in ten contracted a skin infection, usually very slight, which they then passed on to the breasts of their nursing mothers. The number could be higher: a South Wales hospital reported that one in six of its newborn infants suffered from some skin infection or conjunctivitis. The wounds of surgical patients, protected by bandages that needed frequent changing, could also be infected. A 1958 American conference was told that between 5 and 9 per cent of clean wounds would become infected with staphylococci.

The ‘antibiotic age’, while providing more tools for medicine, was putting more pressure on individual doctors and on facilities worldwide. A Swedish physician who addressed the British Medical Association in 1964 talked not just of the triumph of technology but also of a demand for medical care vastly exceeding its supply. The results of this growth in demand were rushed doctors’ appointments or an ever higher turnover of patients through hospital beds. Antibiotics could be used effectively in response, as patients swiftly departed surgeries armed with a prescription or drugs were used to manage hospital infection. Poorly managed use, however, could lead to the natural selection of resistant strains that rendered the antibiotics sometimes ineffective.

The press became aware of the threat from Staph. aureus 80/81 in January 1958. The file of the Standing Medical Advisory Committee, now in the National Archives, is full of the press clippings assembled by civil servants at the time. On January 12th, following a report from the Public Health Laboratory Service, the Sunday Express gave high prominence to the closure of wards. Next day, the Daily Mail and Daily Telegraph both dealt with the threat of staphylococci, while the Daily Herald spoke of a ‘Mystery Germ X’. On January 21st, more soberly but with the same theme, the Standing Medical Advisory Committee warned about the dangerous spread of antibiotic-resistant bacteria.

These bacteria were clearly not just a British problem. In October 1958, the US Surgeon General told a public health conference that ‘every man and woman here knows that the stakes in this national problem are truly awful.’ He also acknowledged that British scientists were among the world leaders in identifying and managing the threat. At the Central Public Health Laboratory in Colindale, north London, a widely used technique for identifying bacteria had been devised during and after the war. Robert Williams, director of Colindale’s streptococcus, staphylococcus and air hygiene laboratory, was at the centre of an international network of analysts supported by the World Health Organization.

The analysts came to the same conclusion. In the antibiotic age, standards of cleanliness had been allowed to slip. This was a salutary lesson. In 1963 the Royal Society of Health organized a conference entitled ‘The Prevention of Hospital Infection’. Mrs D. Sissons, nursing tutor at the Royal Liverpool Children’s Hospital, suggested that nurses

should not be afraid to go to a houseman and say, ‘What are you doing with that mask half hanging off? Why are you walking over to outpatients with theatre clothes on? Are doctors sterile and nurses not?’ People would say they were old dragons, but she herself was proud of being one. Was it not clear to everyone, professional and lay, that they had a patient’s life in their hands, and that strict discipline should be restored? Absolute authority should be given to the matron or medical superintendent, not to a lay person.

Mrs Sissons’ message went down well. Dr Jackson of Ashton-under-Lyne complained:

The level of infection in the ward had gone up because the beds were never cooled, the wards were never properly cleaned, and the time was reached three times in a year when that exploded – there was widespread infection and the place had to be closed ... Hospitals now were composed of sergeant-majors and very few leaders, and they should go back to the people who were really intimately concerned with the question of ward infection.

But if the 1957 flu crisis dealt a clear warning to public health specialists that antibiotics by themselves could not protect against widespread infection, the public remained unaware that a turning point had been reached. A study of the American press conducted forty years later for the Office of Technology Assessment highlights the public’s continuing faith in antibiotics in the 1950s. While the threats of Staph. aureus 80/81 provoked professional disquiet, the general press coverage was neither political nor outraged, presenting the epidemic of virulent and resistant Staph. aureus not as a political problem, but as a professional challenge. To use the expression of John Sheehan, a leading antibiotics chemist, if bacteria could be ‘wily’, so could chemists.

Coincidentally, just as the 1957 influenza pandemic was beginning, chemists working for the Beecham Company had discovered how to brew the ‘trunk’ of penicillin on to which they could attach branches to produce novel antibiotics. Within three years they went from scientific to medical breakthrough, and in 1960 Beecham launched methicillin, whose structure made it immune to the destructive enzyme produced by Staph. aureus 80/81. Methicillin seemed to demonstrate that chemists could outsmart bacteria.

The benefits of the new drug could be dramatic. When Elizabeth Taylor contracted staphylococcal pneumonia on the movie set of Cleopatra, methicillin saved her. Nonetheless, the decline of the Staph. aureus 80/81 threat was not entirely due to the deployment of a single medicine; it was perhaps fading in any case. Moreover, it took just two years from the drug’s launch for the dangers of MRSA (methicillin-resistant Staphylococcus aureus) to become very apparent, after the death of an infected child at the Queen Mary’s Children’s Hospital in Roehampton.

This death from MRSA in 1962 coincided with the emergence of a new questioning of authority that would have an ironic consequence for the control of antibiotic use. That same year the British physician Maurice Pappworth issued a warning that the public were being used as unwitting ‘human guinea pigs’ in medical experimentation, and the scandal of the foetal abnormalities caused by the drug thalidomide became widely publicized. New anxieties spurred the foundation of the Patients’ Association and the early signs that the public was growing more sceptical of the goodwill of the medical profession and indeed of the drug companies.

This increased scepticism did not, however, temper the demand for drugs and medical treatment. It seems instead to have made doctors more cautious about resisting the expectations they believed patients brought into the surgery – of an antibiotic prescription on the way out. Whatever patients actually wanted (a complex question in itself), many doctors felt it necessary to earn trust by giving an antibiotic prescription.

MRSA and other threatening bacteria such as the antibiotic-resistant pneumococcus did not become widespread until the 1980s. In the intervening years, the anxieties of an earlier age had been forgotten, but confidence in the power of technology to beat bacteria had also declined. After a brief, and often unrewarding, enthusiasm for the promise of genomics and of robots in finding new and potent agents, the development of new antibiotics attracted rather less commitment from the pharmaceutical companies than half a century earlier. The late 1990s did, however, see intense anxiety about excessive use of antibiotics, with much greater emphasis being put on the management of antibiotic use. New forms of collaboration between doctor and patient were explored, by which patients themselves shared the decision on whether and how to medicate. Such attempts to alter contemporary experience remind us that we are living within, and consciously engineering, our own historical narratives. In reflecting on the experience of 1957 and the threat of Staph. aureus 80/81 we experience the turning point in the story of antibiotics, one of the iconic projects of the modern era.

Robert Bud is the author of Penicillin: Triumph and Tragedy (Oxford University Press, 2007).