James Bradley, University of Melbourne
It’s easy, in retrospect, to portray World War II as a major turning point in the history of medical ethics. But it’s a portrayal we should resist, because it blinds us to the problems with informed consent that persist to this day.
When Germany surrendered in May 1945, the victors’ worst fears were confirmed: the Nazis had committed innumerable and horrific war crimes, including the attempt to annihilate the Jewish people. So severe were the depredations of Hitler and his henchmen and henchwomen that new words were needed to describe their actions: genocide (the obliteration of an entire people) and thanatology (the science of producing death).
As the extermination camps in Poland and the concentration camps in Germany were liberated, the scale of human destruction wrought by Nazi ideology confounded and shocked the world. Meanwhile, in their efforts to reap the benefits of Nazi scientific research, Allied intelligence officers rapidly uncovered the devastating extent of medical experimentation on human subjects.
Nuremberg trials
Jews, political prisoners and other “undesirables” were subjected to a range of experiments that resulted in death and disability. The Luftwaffe, for instance, wished to know how to protect and revive pilots who had been shot down over the sea and were suffering from hypothermia.
The Nazi solution was to immerse experimental subjects in freezing water to the point of death and beyond.
Following the assassination of the high-ranking Nazi Reinhard Heydrich, who died of a wound infection, a call went out to experiment more boldly with sulphonamides (drugs that curb the growth of bacteria).
The solution: scarify the legs of experimental subjects, deliberately infect the wounds, and observe what happened with and without the drugs.