
Medicine of WWII

Discussion in 'War44 General Forums' started by Jim, Jun 19, 2008.

  1. Jim

    Jim Active Member

    Joined:
    Sep 1, 2006
    Messages:
    3,324
    Likes Received:
    15
    via War44
    As in most wars, disease and accidents were more common in World War II than battle wounds. Two-thirds of hospital admissions, on both sides, resulted from sickness or injuries not received in combat. Typhus was the scourge of the eastern front, while in North Africa dysentery, hepatitis, malaria, and skin diseases were rampant, especially among the Germans. Malaria was universal throughout Asia and the South Pacific. There was no cure for it, but antimalarial drugs existed, as did mosquito nets. The failure to provide either of these to U.S. and Filipino troops defending Bataan in 1942 had ruinous consequences. Hospital admissions for malaria ranged from 500 to 1,000 a day. Up to 80 percent of the troops on the line are thought to have been infected with it. Along with shortages of food and ammunition, disease was the main reason why the Bataan garrison surrendered when it did.

    2nd Lieutenant Frances Bullock applies a dressing to a wounded soldier's hand in an Army hospital.


    Venereal diseases have been a problem throughout history for almost every army, including those of the United States. Prevention was the best approach, especially because penicillin did not become widely available until 1944. The military staged propaganda campaigns designed to create “syphilophobia,” and thereby frighten men into abstinence. These drives proved ineffective, however, for troops facing death in battle could not be made to fear a curable social disease. Rejecting inductees found to be suffering from syphilis was not workable either. Of the first 2 million men inducted, 48 in every 1,000 had it, and among blacks the rate was 272 per 1,000. Most black inductees were from the South, and southern males, black and white, had a venereal disease rate four times that of northerners. Early in 1942, the Army began treating infected men instead of declaring them 4F (physically unfit for induction). By 1945, 170,000 inductees had been cured of syphilis, a process that took about 10 days before antibiotics became available.

    Prevention was effective too. Once the military conceded that enlisted men could not be scared away from having sex, it established hundreds of prophylactic stations in the United States as well as overseas to treat men medically after intercourse. Large quantities of condoms were passed out as well, some 50 million a month, and they became so popular that by the end of the war demand outstripped supply. Although this commonsense approach to social disease offended religious groups, the military could not afford to give it up. While in 1940 the venereal disease rate for the army was 42.5 cases for every 1,000 men, by 1943 it had fallen to 24; for the entire war it came to 37 per 1,000, about the same rate as among civilians. In 1940, for every 1,000 soldiers, 1,278 days a year were lost on account of venereal disease, but by 1943 the figure was only 368. This improvement was a public health victory of great importance.

    Many medical gains in this period were directly related to the war. Penicillin was still in the research stage when war broke out, which inspired the British scientists involved to redouble their efforts. Because manufacturing the drug was beyond Britain’s means, it was arranged for a U.S. firm to do so. As a result of crash programs, wounded soldiers were being treated in the field with penicillin as early as 1943. Progress was also made in surgery and anesthetics.

    No achievement was more impressive than the new treatments devised for combat fatigue, known to the army as neuropsychiatric cases, or NP for short. Although wounded rates were higher in Europe, NP disorders were more frequent in the Pacific, which was a far worse environment in which to fight.
    After three months of combat on New Georgia in the Solomon Islands, a force of 30,000 men had 13,000 hospital admissions for illness and injury. Of these, 27 percent were wounded in action, 15 percent were otherwise injured, 21 percent had malaria, 18 percent had diarrheal diseases, and 19 percent had neuropsychiatric disorders. The incidence of NP cases on New Georgia was not exceptional by World War II standards, however, for it was a rule of thumb that the longer an action lasted, the more NPs there would be, regardless of other conditions.

    U.S. Army Medics remove a casualty from the battlefield to an aid station in an air raid shelter, near Brest, France, 28 August 1944. The site was formerly used by the Germans.


    In Europe the Army would experience 110,000 NP cases in all, but they were highly concentrated. During one 44-day period of intense combat on the Gothic Line in Italy, 54 percent of all casualties were neuropsychiatric. The high incidence of NPs caught both psychiatrists and army leaders by surprise, despite the relatively large number of mentally damaged soldiers that World War I had produced. Those cases had been so many and so serious that as late as 1942 some 58 percent of all patients in Veterans Administration hospitals were World War I “shell shock” victims. World War II psychiatrists believed they had developed a screening process that would keep most men liable to become NP cases out of the Army. Their confidence in this process was such that, of 5.2 million men who appeared at army recruiting stations, 1.6 million were rejected for “mental deficiencies.” All the same, psychiatric discharges from the Army would prove two and a half times as common as in the previous war.
    Every army suffered from combat fatigue, which was inevitable given the horrors of modern war. But in addition to war itself, many NP cases resulted from how men were employed in it. The basic problem in the U.S. Army by 1944 was that a manpower shortage led to keeping men in action for excessive periods of time. Even before doctors thought to address the problem, officers had noticed that men reached their peak of efficiency after about 90 days on the line. After that they all started to deteriorate, regardless of their individual strength or courage. In a survey of platoon leaders in two veteran infantry divisions, carried out by the Army Research Bureau, the leaders were asked which soldiers they would most hate to lose; their answers were concentrated among enlisted men with four to five months of experience (including days spent in rear areas).
    Men reached their highest level of performance between their third and seventh months of combat, and after eight months they were less effective than men with less combat time. Contrary to the earlier view that certain types of men were predisposed to break down, it was now found that after enough time in battle (with 200 days being about the maximum, and somewhere between 140 and 180 the average) everyone broke down. Mental health experts recommended that men be given more time behind the lines to ease the stress of battle, and doctors wanted soldiers to be rotated home before they reached the breaking point. But in an Army where there were never enough riflemen to go around, these proposals were out of the question. Although the Army did not change its way of doing business, psychiatrists did, introducing more aggressive therapies. North African NP casualties had initially been sent to hospitals hundreds of miles to the rear, from which fewer than 10 percent returned to duty. Later, a series of psychiatric care levels was established, starting with battalion surgeons operating close to the front.

    This World War II photo shows personnel of the U.S. Army Air Forces medical services learning how to load a simulated casualty into a Dodge WC-54 3/4-ton ambulance. Once dispatched, the ambulance would transfer the injured soldier to a waiting transport ready to evacuate the wounded.



    These battalion surgeons provided psychiatric first aid, consisting mostly of mild sedation, a good night’s sleep, and hot food. More serious cases went to division clearing stations two to five miles farther back, where patients were sedated longer and allowed to bathe. For tough cases the next level of treatment was at “exhaustion centers,” where patients might receive actual psychiatric treatment. Then, after a week or 10 days, if all else had failed, the most seriously disturbed went into neuropsychiatric hospitals, from which they seldom returned to combat.
    This approach produced impressive results, with about 60 percent of NPs returning to their outfits within five days. Some 70 percent of those hospitalized were later given noncombat assignments, where they replaced men fit for battle. By revolutionizing neuropsychiatric care, U.S. doctors made their biggest contribution to victory.
    Improvements in conventional medicine also helped soldiers recover, but they were offset to a large degree by developments in weaponry and munitions. Thus, despite the fact that by World War II medical care had progressed enormously since the 1860s, battlefield death rates remained similar to those of the Civil War. It was only in psychiatry and hospital treatment that doctors made real advances.
     
  2. Froix

    Froix New Member

    Joined:
    Jun 20, 2008
    Messages:
    12
    Likes Received:
    0
    Location:
    Asia
    via War44
    Great thread Jim!

    It's interesting to learn that such diseases quite possibly caused more deaths during the Second World War than combat did.

    I also find it interesting that when people hear "casualties" in discussions of war, the term is often associated with the number of deaths, when in truth it also covers injuries. I'm not quite sure whether it includes psychological damage incurred during the war as well, not necessarily through physical trauma.
     
