WWI Centennial: Britain Grants Women’s Suffrage

Imperial War Museum, Wikimedia Commons // Public Domain

Erik Sass is covering the events of the war exactly 100 years after they happened. This is the 300th installment in the series. Read an overview of the war to date here.

The First World War triggered a wave of political reform, as country after country gave women the vote in recognition of their many contributions to the war effort, including working in war industries, serving as nurses and ambulance drivers, and running businesses and public services. There were other arguments besides: some pundits said that women, naturally inclined to pacifism, would exert a moderating influence over male politics. Others worried women would refuse to bear a new generation of children, needed to make good the loss of millions of lives in the war, unless they got the vote.

One month after the U.S. House of Representatives approved the 19th Amendment giving women the vote (the Senate rejected it later that year, and the amendment was not ratified until 1920), on February 6, 1918, Britain’s Parliament passed the Representation of the People Act, also known as the Fourth Reform Act, granting the vote to women householders and university graduates ages 30 and over, as well as establishing universal male suffrage. The law added 8.4 million women and 5.6 million men to the franchise nationally, although women would remain outnumbered in the British electorate until full female suffrage was granted in 1928.

Although activists had been pursuing women’s suffrage for decades in Britain, there were no huge public celebrations following Parliament’s historic vote, due partly to the grim wartime context—but also because many had long taken the outcome for granted. The arrival of women’s suffrage was something of an anticlimax, following the revolution in gender relations brought about by the war.

WOMEN'S WAR, WOMEN'S WORLDS

Across Europe and much of the world, war brought women new freedoms in other spheres, but also new pressures and concerns. In addition to war work, women were expected to continue serving in their traditional roles as homemakers and caregivers, leaving them torn between work and family, a still-familiar dilemma. For women working in the war zone, this meant the constant threat of being forced to abandon their patriotic duties. The diarist Vera Brittain, who served as a volunteer nurse's aide for three years in France and Malta, recalled:

"Because we were women we feared perpetually that, just as our work was reaching its climax, our families would need our youth and vitality for their own support. One of my cousins, the daughter of an aunt, had already been summoned home from her canteen work in Boulogne; she was only one of many, for as the war continued to wear out strength and spirits, the middle-aged generation, having irrevocably yielded up its sons, began to lean with increasing weight upon its daughters. Thus the desperate choice between incompatible claims—by which the women of my generation, with their carefully trained consciences, have always been tormented."

For women working factory jobs “on the home front,” in addition to the tedium and dangers of such work, every day was a juggling act, especially for married women with young children. To help with the burden, many factories started providing nurseries and daycare, while older children went to school. However, millions of women still had to rely on relatives, friends, religious or charitable establishments, or paid arrangements (as in the early Industrial Revolution, some women supported themselves running informal daycares for the children of factory workers). Female workers were also still responsible for feeding their families, which often meant waiting in long lines for basics like meat and bread. One British factory worker, Elsie McIntyre, remembered scrambling for groceries to feed her mother and siblings:

"The most awful thing was food. It was very scarce. And as we were coming off shift someone would say 'There is a bit of steak at the butchers.' And I would get off the train and then go on a tram. And can get off at Burley Road and run to the shop only to find a long queue. And by [the time] it got to my turn there would be no more meat, only half a pound of sausage, you see. And that’s coming off the night shifts. You went straight into a queue before you could go to bed."

As this account hints, just getting to and from work was often a struggle for women relying on overtaxed public transportation. One worker, Peggy Hamilton, recalled that it took 90 minutes to get to her job at a Royal Arsenal factory in London’s Woolwich Square:

“The buses were always full and when we arrived in the square it would be teeming with people fighting for a place on the bus. No one ever paid because the conductor had no chance of collecting the fares. Each bus was crowded to the suffocation point … We had to fight and push to get on board and were often ejected from several buses.”

Mill workers during World War I
Imperial War Museum, Wikimedia Commons // Public Domain

Many factory workers came from the countryside or provincial towns, leaving low-paid domestic, agricultural, or textile work for well-paid munitions and heavy industrial jobs in the bigger cities, often too far from home to commute. So across Britain and Europe, factory owners and private individuals established hostels and boarding houses for young women. These usually offered primitive accommodations, with shared bedrooms and communal washrooms that left residents little if any privacy (and, along with factories and army barracks, provided a perfect breeding ground for communicable diseases, including the flu).

MORAL ANXIETY

Reflecting the Victorian sensibilities of the older generations, parents, politicians, and clergy anxious about “loose morals” among young female factory workers demanded that towns, factories, and hostels hire female police officers, matrons, and other older women to keep an eye on them both at work and off duty. Concerns for morality and propriety covered everything from swearing and horseplay to drinking and smoking, and, of course, relations with men; members of the opposite sex were strictly forbidden in hostels and factory dormitories.

In a small concession to human nature, young women were allowed to establish “girls clubs” attached to factories and hostels where they could entertain male visitors for dances and parties in a chaste, supervised setting. But morality police had less control over young women out on the town, using their newfound spending power to visit bars, tearooms, movie theaters, and dancehalls, where it was much easier to meet members of the opposite sex including fellow factory workers and soldiers on leave. Although it is hard to generalize about the behavior of young women—most seemed determined to remain “respectable” or at least maintain that appearance—many clearly exercised their new freedom to meet, socialize, and have romantic encounters with men. Ray Strachey, a British feminist, remembered two decades later:

"It was during the war, and after it, that the changing moral standard of women became definitely noticeable. Thousands of women had seen their actual or potential mates swallowed up in that ever-increasing wave of death which was the Great War. Life was less than cheap; it was thrown away … All moral standards have been submerged … Little wonder that the old ideals of chastity and self-control in sex were, for many, also lost."

By the same token not every assignation ended in sexual intercourse. A.B. Baker, a volunteer in the Women’s Auxiliary Army Corps serving in France, remembered one comparatively tame—but intense—kiss with a young soldier bound for Passchendaele:

"He said that he was afraid—more afraid than he had ever been in his life. He was sure that this time he was going to 'collect something worse than a packet.' He wanted to know what I believed about death. I forget what I told him. He made me promise to write to his mother if anything happened to him. When I promised he said that I was a “dear kid.” I was very near to crying. He asked me if he could kiss me. I said, “Yes.” He kissed me many times, and held me very tight. He held me so tight that he hurt me and frightened me. His whole body was shaking. I felt for him as I had never felt for any man before. I know now that it wasn’t love. It was just the need to comfort him a little."

Sexual morality was just one of the areas policed, rather ineffectively, by paragons from the older generations. The war also saw large numbers of women take up smoking, as tobacco was made more convenient and “feminine” with mass-produced cigarettes. Daniel Poling, an American YMCA lecturer and temperance advocate, was scandalized by the scene that greeted him in his London hotel in 1917:

"In the dining room of my hotel I found literally scores of women, perhaps as many as 300, smoking. The young, the middle-aged, and the old, were all at it. I saw a young mother calmly blow smoke over the head of her 8-year-old son, who displayed only a mild interest … For a man who is old-fashioned enough to prefer womanhood à la his wife and mother, the 'woman of the cigarette' is very disquieting, to say the least."

But for young women cigarettes came to symbolize elegance, sophistication, and worldliness, according to Brittain, who recalled her first visit home after picking up the habit:

"After supper I settled down luxuriously to smoke—a new habit originally acquired as a means of defense against the insect life of Malta—and to talk to my father about the hazards and adventures of my journey home. My parents took a gratifying pleasure in my assumption of worldly wisdom and the sophistication of the lighted cigarette; after 20 continuous months of Army service I was almost a stranger to them."

SEPARATION AND ALIENATION

War was broadly disruptive to couples, both married and unmarried, as women and men contended with long separations and uncertainty. In Britain and most other combatant nations, the marriage rate surged in the first year of the war and then plunged. Similarly, birth rates across Europe plummeted during the war, as couples put off childbearing for happier times.

Graph showing birth rates in Europe during World War I
Erik Sass

In addition to the ordinary obstacles presented by romantic relationships, during the war women and men also contended with a profound experiential barrier, as men tried to shield women back home from the grim reality of the trenches. Mildred Aldrich, an American retiree living in the French countryside, noted:

"One of the striking features about this war is that the active soldiers almost never talk with the civilians about the war. In a sense, it is forbidden, but the reason goes deeper than that. The soldier and the civilian seem today to speak a different language. It almost seems as if a dark curtain hung between the realities of life 'out there,' and the life into which the soldier enters en repos [on leave]."

Similarly, Brittain worried that the war was creating a barrier between her and her fiancé, Roland Leighton:

"To this constant anxiety for Roland’s life was added, as the end of the fighting moved ever further into an incalculable future, a new fear that the war would come between us—as indeed, with time, the war always did, putting a barrier of indescribable experience between men and the women whom they loved, thrusting horror deeper and deeper inward … Quite early I realized this possibility of a permanent impediment to understanding."

Of course the dynamic sometimes worked the other way as well, as women who served at or near the front experienced physical danger on a regular basis, alienating them from older adults of both genders who never saw the war zone. A.B. Baker, the volunteer W.A.A.C., remembered scoffing at “spiritual advice” about the war received from a male clergy member who’d remained safely at home:

"A few days later I had a letter from our curate. In it he talked about war as a noble discipline. He said it purged men of selfishness, and by its pity and terror brought men nearer to God. I felt sick for a second time. He put with his letter a printed Prayer for Victory, and told me to say it every night. I remembered that my prayer in the dug-out had been just this, said over and over again: “O God, stop this war; stop it, and let me go home.” At home the curate had been rather a hero of mine. He wasn’t my hero any more."

The war saw a wide variety of new kinds of relationships form, from the casual and the practical to the purely formal. Some women married men they didn’t really love out of a sense of desperation or patriotic duty, according to an American volunteer ambulance driver, William Yorke Stevenson, who heard about one situation from a French acquaintance in March 1916:

“She says a friend of hers who nursed a man, blind and without arms, is going to marry him because she thinks it is her duty, although she does not care for him. She is not pretty; but as the man is blind it will not matter, she says. Such cases are not rare.”


Imperial War Museum, Wikimedia Commons // Public Domain

On the other hand, the disruptions of war weren’t always unwelcome to married women and widows, depending on their previous circumstances, which might have seen them trapped in unhappy marriages. Mildred Aldrich confided an awkward truth about the lives of French peasant women in her diary in April 1916:

"I often wonder if some of the women are not better off than in the days before the war. They do about the same work, only they are not bothered by their men … for nearly two years they have had no drinking man to come home at midnight either quarrelsome or sulky; no man’s big appetite to cook for; no man to wash for or to mend for. They have lived in absolute peace, gone to bed early to a long, unbroken sleep, and get 25 cents a day government aid, plus 10 cents for each child … under my breath, I can assure you that there is many a woman of that class a widow today who is better off for it, and so are her children."

GRIEF AND DEDICATION

Finally, women would also bear for decades the lasting burden of grief for family members killed during the war. Visitors described crowds of Parisian women dressed in black in church and other public places, and some women continued to dress in mourning for many years. Privately, the grieving process began with the returned possessions of the dead, as vividly described by Brittain in January 1916:

"All Roland’s things had just been sent back from the front through Cox’s; they had just opened them and they were all lying on the floor. I had no idea before of the after-results of an officer’s death, or what the returned kit, of which so much has been written in the papers, really meant. It was terrible … Everything was damp and worn and simply caked with mud … the smell of those clothes was the smell of graveyards and the dead. The mud of France which covered them was not ordinary mud; it had not the usual clean pure smell of earth, but it was as though it were saturated with dead bodies."

So much importance was attached to these items that soldiers and civilians sometimes sent the possessions of dead enemy soldiers to their families on the opposing side, typically via neutral countries. Evelyn Blucher, an Englishwoman married to a German aristocrat and living in Berlin, tried to identify the possessions of British soldiers killed in battle and send them home. In August 1917 she wrote in her diary of one such occasion:

"A feeling of hopeless sadness crept over me as I saw these trays of things, the only mementoes left of men who had such a short time ago been alive in the full flush of manhood. There was a whole stack of battered and bloodstained cigarette cases, some with inscriptions or monograms engraved on them, many containing small photos or a few written words … Then there were all the other various small articles generally to be found in a man’s pocket—fountain pens, handkerchiefs, torn letters, purses, coins, etc.; and I felt the tears come into my eyes when I thought of what value they would be to some in England now."

At the same time, many women cited their own grief, as well as awareness of the losses suffered by others, as motivation for their own continuing war work. After Roland’s death Brittain wrote in her diary:

“Well, one of the things this final part of Roland’s story has made me feel is that as long as the war lasts … I cannot lead any but an active life, even though it should last for five years … No, it must be some form of active service, and if it implies discomforts, so much the better. I am beginning to feel that to leave nursing now would be a defeat."

Women drinking tea during World War I
Imperial War Museum, Wikimedia Commons // Public Domain

In the same vein, a French woman, Marguerite Lesage, wrote in March 1916:

“There are times when I wonder if I’m going to give in to le cafard [depression] … Yes … but having mentally run through this list for the thousandth time, it is enough to think of our soldiers—and in what conditions!—to think, once again, that as long as I can, I must be worthy of them and stay here.”

Unsurprisingly even the most dedicated women workers found their spirits flagging as the war went on, leading to a regime of self-criticism and emotional self-policing. In 1916, now stationed in Malta, Brittain admitted in a letter to her brother:

“One’s personal interest wears one’s patriotism rather threadbare by this time … After all it is a garment one has had to wear for a very long time, so there’s not much wonder if it is beginning to get a little shabby.”

And Julia Stimson, an American volunteer head nurse, wrote in a letter home in June 1917:

"It is so pathetic the way one can lose sight of one’s inspirations if one’s feet are tired, or the way one can forget one is on a crusade if there is no drinking water to be had for half a day, and can be just an ordinary uninspired human female and be fretful and discouraged because you don’t like the tone of voice of a supervisor. It is my job of course to keep before my people the why of our coming and to keep their spirits up."

NEW CONFIDENCE

Despite numerous hardships, the First World War marked an expansion of women’s horizons. Again, it’s worth noting this didn’t result from the granting of women’s suffrage, but rather the reverse, as male politicians and voters were forced to recognize women’s contributions to the war effort, which had already brought new freedoms and greater economic power in its train. Two decades after the war, Robert Roberts, a boy at the time, remembered that the right to vote was granted almost as an afterthought, as even children could see the huge changes in the adult world:

"Whatever war did to women in home, field, service, or factory, it undoubtedly snapped strings that had bound them in so many ways to the Victorian age. Even we, the young, noticed their new self-confidence. Wives in the shop no longer talked about ‘my boss,’ or ‘my master.’ Master had gone to war and Missis ruled the household, or he worked close to her in the factory … earning little more than she did herself. Housewives left their homes and immediate neighborhood more frequently, and with money in their purses went foraging for goods even into the city shops … She discovered her own rights."

See the previous installment or all entries, or read an overview of the war.

12 Facts About Japanese Internment in the United States

Portrait of internee Tom Kobayashi at Manzanar War Relocation Center, Owens Valley, California, 1943
Ansel Adams, Library of Congress/Wikimedia Commons // No Known Copyright Restrictions

On February 19, 1942, President Franklin Delano Roosevelt issued Executive Order 9066, which sanctioned the removal of Japanese immigrants and Americans of Japanese heritage from their homes to be imprisoned in internment camps throughout the country.

At the time, the move was sold to the public as a strategic military necessity. Following the attack on Pearl Harbor on December 7, 1941, the government argued that it was impossible to know where the loyalties of Japanese-Americans rested.

Between 110,000 and 120,000 people of Japanese ancestry were relocated to internment camps along the West Coast and as far east as Louisiana. Here are 12 facts about what former first lady Laura Bush has described as "one of the most shameful episodes in U.S. history."

1. The government was already discussing detaining people before the Pearl Harbor attack.

In 1936, President Franklin Roosevelt—who was concerned about Japan’s growing military might—instructed William H. Standley, his chief of naval operations, to clandestinely monitor "every Japanese citizen or non-citizen on the island of Oahu who meets these Japanese ships [arriving in Hawaii] or has any connection with their officers or men" and to secretly place their names "on a special list of those who would be the first to be placed in a concentration camp in the event of trouble."

This sentiment helped lead to the creation of the Custodial Detention List, which would later guide the U.S. in detaining 31,899 Japanese, German, and Italian nationals, separate from the 110,000-plus later interned, without charging them with a crime or offering them any access to legal counsel.

2. Initial studies of the “Japanese problem” proved that there wasn’t one.

In early 1941, Curtis Munson, a special representative of the State Department, was tasked with interviewing West Coast-based Japanese-Americans to gauge their loyalty levels in coordination with the FBI and the Office of Naval Intelligence. Munson reported that there was extraordinary patriotism among Japanese immigrants, saying that "90 percent like our way best," and that they were "extremely good citizen[s]" who were "straining every nerve to show their loyalty." Lieutenant Commander K.D. Ringle’s follow-up report showed the same findings and argued against internment because only a small percentage of the community posed a threat, and most of those individuals were already in custody.

3. The general in charge of Western defense command took nothing happening after Pearl Harbor as proof that something would happen.

Minidoka Relocation Center. Community Store in block 30
National Archives at College Park, Wikimedia Commons // CC BY 3.0

Despite both Munson and Ringle debunking the concept of internment as a strategic necessity, the plan moved ahead—spurred largely by Western Defense Command head General John L. DeWitt. One month after Pearl Harbor, DeWitt created the central ground for mass incarceration by declaring: "The fact that nothing has happened so far is more or less ... ominous in that I feel that in view of the fact that we have had no sporadic attempts at sabotage that there is a control being exercised and when we have it, it will be on a mass basis."

DeWitt, whose ancestors were Dutch, didn’t want anyone of Japanese descent on the West Coast, stating that “American citizenship does not necessarily determine loyalty.”

4. Almost no one protested internment.

Alongside General DeWitt, Wartime Civil Control Administration director Colonel Karl Bendetsen avowed that anyone with even “one drop of Japanese blood” should be incarcerated, and the country generally went along with that assessment. Some newspapers ran op-eds opposing the policy, and the American Baptist Home Mission Societies created pamphlets to push back, but as historian Eric Foner wrote in The Story of American Freedom, "One searches the wartime record in vain for public protests among non-Japanese." Senator Robert Taft was the only congressperson to condemn the policy.

5. Supporting or opposing internment were both matters of economics.

White farmers and landowners on the West Coast had great economic incentives to get rid of Japanese farmers who had come to the area only decades before and found success with new irrigation methods. They fomented deep hatred for their Japanese neighbors and publicly advocated for internment, which is one reason so many of the more than 110,000 Japanese individuals sent to camps came from the West Coast. In Hawaii, it was a different story. White business owners opposed internment, but not for noble reasons: They feared losing their workforce. Thus, only between 1200 and 1800 Japanese-Americans from Hawaii were sent to internment camps.

6. People were tagged for identification.

Children in a drawing class at Minidoka Relocation Center
National Archives at College Park, Wikimedia Commons // CC BY 3.0

Moving entire communities of people to camps in California, Colorado, Texas, and beyond was a gargantuan logistical task. The military assigned tags with ID numbers to families, including the children, to ensure they would be transferred to the correct camp. In 2012, artist Wendy Maruyama recreated thousands of these tags for an art exhibition she titled "The Tag Project."

"The process of replicating these tags using government databases, writing thousands of names, numbers, and camp locations became a meditative process," Maruyama told Voice of San Diego. “And for the hundreds of volunteers, they could, for a minute or two as they wrote the names, contemplate and wonder what this person was thinking as he or she was being moved from the comforts of home to the spare and bare prisons placed in the foreboding deserts and wastelands of America. And could it happen again?”

7. Not everyone went quietly.

Directly combatting the image of the “polite” Japanese-Americans who acquiesced to internment without protest, collections of resistance stories paint a disruptive picture of those who refused to go to the camps or made trouble once inside. Among those who were considered "problematic" were individuals who refused to register for the compulsory loyalty questionnaire, which asked questions about whether the person was a registered voter and with which party, as well as marital status and "citizenship of wife" and "race of wife."

“A broadly understood notion of resistance represents a more complete picture of what happened during World War II,” David Yoo, a professor of Asian American Studies and History and vice provost at UCLA's Institute of American Cultures, told NBC News about collecting these resistance stories. “Because these stories touch upon human rights, they are important for all peoples.”

8. The government converted unused buildings into camp facilities.

For the most part, camps were set against desert scrub land or infertile Ozark hills bordered with barbed wire. Before getting on buses to be transported to their new "homes," detainees had to go through processing centers housed in converted racetracks and fairgrounds, where they might stay for several months. The largest and most noteworthy center was Santa Anita Park, a racetrack in Arcadia, California, which was shut down so that makeshift barracks could be assembled and horse stables could be used for sleeping quarters.

9. Ansel Adams took hundreds of photographs inside the most famous camp, as did an internee with a smuggled camera.

Wooden sign at entrance to the Manzanar War Relocation Center with a car at the gatehouse in the background
Ansel Adams, Library of Congress/Wikimedia Commons // Public Domain

Approximately 200 miles north of Santa Anita Park, at the foot of the Sierra Nevada mountain range, was Manzanar—which, with its 11,000 internees, was perhaps the most famous of America's 10 relocation centers. It was also the most photographed facility. In the fall of 1942, famed photographer Ansel Adams—who was personally outraged by the situation when a family friend was taken from his home and moved halfway across the country—shot more than 200 images of the camp. In a letter to a friend about a book being made of the photos, Adams wrote that, "Through the pictures the reader will be introduced to perhaps 20 individuals ... loyal American citizens who are anxious to get back into the stream of life and contribute to our victory."

While Adams may have successfully offered a small glimpse at life inside Manzanar, Tōyō Miyatake—a photographer and detainee who managed to smuggle a lens and film into the camp, which he later fashioned into a makeshift camera—produced a series of photos that offered a much more intimate depiction of what everyday life was like for the individuals who were imprisoned there between 1942 and 1945. Today, Manzanar is a National Historic Site.

10. Detainees were told they were in camps for their own protection.

Japanese-Hawaiian hula dancers on an improvised stage during one of the frequent talent shows at Santa Anita (California) Assembly Center
U.S. Signal Corps, Library of Congress, Wikimedia Commons // Public Domain

Just as the justification for internment was an erroneous belief in mass disloyalty among a single racial group, the argument given to those incarcerated was that they were better off inside the barbed wire compounds than back in their own homes, where racist neighbors could assault them. When presented with that logic, one detainee rebutted, “If we were put there for our protection, why were the guns at the guard towers pointed inward, instead of outward?”

11. Internees experienced long-term health problems because of the camps, and children had it the worst.

Internment officially lasted through 1944, with the last camp closing in early 1946. In those years, Japanese-Americans did their best to make lives for themselves on the inside. That included jobs and governance, as well as concerts, religion, and sports teams. Children went to school, but there were also dances and comic books to keep them occupied. But the effects of their internment were long-lasting.

There have been multiple studies of the physical and psychological health of former internees. They found those placed in camps had a greater risk for cardiovascular disease and death, as well as traumatic stress. Younger internees experienced low self-esteem, as well as psychological trauma that led many to shed their Japanese culture and language. Gwendolyn M. Jensen’s The Experience of Injustice: Health Consequences of the Japanese American Internment found that younger internees “reported more post-traumatic stress symptoms of unexpected and disturbing flashback experiences than those who were older at the time of incarceration.”

12. A congressional panel called it a “grave injustice" ... 40 years later.

Japanese Americans going to Manzanar gather around a baggage car at the old Santa Fe Station. (April 1942)
Russell Lee, Library of Congress, Wikimedia Commons // Public Domain

It wasn’t until 1983 that a special Congressional commission determined that the mass internment was a matter of racism and not of military strategy. Calling the incarceration a “grave injustice,” the panel cited the ignored Munson and Ringle reports, the absence of any documented acts of espionage, and delays in shutting down the camps due to weak political leadership from President Roosevelt on down as factors in its conclusion. The commission paved the way for President Reagan to sign the Civil Liberties Act, which gave each surviving internee $20,000 and officially apologized. Approximately two-thirds of the more than 110,000 people detained were U.S. citizens.

This list first ran in 2018.

The Disturbing Reason Schools Tattooed Their Students in the 1950s

Kurt Hutton, Hulton Archive/Getty Images

When Paul Bailey was born at Beaver County Hospital in Milford, Utah on May 9, 1955, it took less than two hours for the staff to give him a tattoo. Located on his torso under his left arm, the tiny marking was rendered in indelible ink with a needle gun and indicated Bailey’s blood type: O-Positive.

“It is believed to be the youngest baby ever to have his blood type tattooed on his chest,” reported the Beaver County News, coolly referring to the infant as an “it.” A hospital employee was quick to note that parental consent had been obtained first.

The permanent tattooing of a child who was only hours old was not met with any hysteria. Just the opposite: In parts of Utah and Indiana, local health officials had long been hard at work instituting a program that would facilitate potentially life-saving blood transfusions in the event of a nuclear attack. By branding children and adults alike with their blood type, officials could immediately identify donors and use them as “walking blood banks” for the critically injured.

Taken out of context, it seems unimaginable. But in the 1950s, when the Cold War was at its apex and atomic warfare appeared not only possible but likely, children willingly lined up at schools to perform their civic duty. They raised their arms, gritted their teeth, and held still while the tattoo needle pierced their flesh.


The practice of subjecting children to tattoos for blood-typing has appropriately morbid roots. Testifying at the Nuremberg Tribunal on War Crimes in the 1940s, American Medical Association physician Andrew Ivy observed that members of the Nazi Waffen-SS carried body markings indicating their blood type [PDF]. When he returned to his hometown of Chicago, Ivy carried with him a solution for quickly identifying blood donors—a growing concern due to the outbreak of the Korean War in 1950. The conflict was depleting blood banks of inventory, and it was clear that reserves would be necessary.

School children sit next to one another circa the 1950s
Reg Speller, Fox Photos/Getty Images

If the Soviet Union targeted areas of the United States for destruction, it would be vital to have a protocol for blood transfusions to treat radiation poisoning. Matches would need to be found quickly. (Transfusions depend on matching blood to avoid the adverse reactions that come from mixing different types. When a person receives blood different from their own, the body creates antibodies that destroy the foreign red blood cells.)

In 1950, the Department of Defense placed the American Red Cross in charge of blood donor banks for the armed forces. In 1952, the Red Cross was the coordinating agency [PDF] for obtaining blood from civilians for the National Blood Program, which was meant to replenish donor supply during wartime. Those were both measures for soldiers. Meanwhile, local medical societies were left to determine how best to prepare their civilian communities for a nuclear event and its aftermath.

As part of the Chicago Medical Civil Defense Committee, Ivy promoted the use of the tattoos, declaring them as painless as a vaccination. Residents would get blood-typed by having a finger pricked and a tiny droplet smeared on a card. From there, they would be tattooed with their ABO blood group and Rhesus factor (or Rh factor), which denotes whether a person’s red blood cells carry a certain blood protein.

The Chicago Medical Society and the Board of Health endorsed the program and citizens voiced a measure of support for it. One letter to the editor of The Plainfield Courier-News in New Jersey speculated it might even be a good idea to tattoo Social Security numbers on people's bodies to make identification easier.

Despite such marked enthusiasm, the project never entered a pilot testing stage in Chicago.

Officials with the Lake County Medical Society in nearby Lake County, Indiana, were more receptive to the idea. In the spring of 1951, 5,000 residents were blood-typed using the card method. But, officials cautioned, the cards could be lost in the chaos of war or even the relative quiet of everyday life. Tattoos and dog tags were encouraged instead. When 1,000 people lined up for blood-typing at a county fair, two-thirds agreed to be tattooed as part of what the county had dubbed “Operation Tat-Type.” By December 1951, 15,000 Lake County residents had been blood-typed. Roughly 60 percent opted for a permanent marking.

The program was so well-received that the Lake County Medical Society quickly moved toward making children into mobile blood bags. In January 1952, five elementary schools in Hobart, Indiana enrolled in the pilot testing stage. Children were sent home with permission slips explaining the effort. If parents consented, students would line up on appointed tattoo days to get their blood typed with a finger prick. From there, they’d file into a room—often the school library—set up with makeshift curtains behind which they could hear a curious buzzing noise.

When a child stepped inside, they were greeted by a school administrator armed with indelible ink and wielding a Burgess Vibrotool, a medical tattoo gun featuring 30 to 50 needles. The child would raise their left arm to expose their torso (since arms and legs might be blown off in an attack) and was told the process would take only seconds.

A child raises his hand in class circa the 1950s
Vecchio/Three Lions/Getty Images

Some children were stoic. Some cried before, during, or after. One 11-year-old recounting her experience with the program said a classmate emerged from the session and promptly fainted. All were left with a tattoo less than an inch in diameter on their left side, intentionally pale so it would be as unobtrusive as possible.

At the same time that grade schoolers—and subsequently high school students—were being imprinted in Indiana, kids in Cache and Rich counties in Utah were also submitting to the program, despite potential religious obstacles for the region’s substantial Mormon population. In fact, Bruce McConkie, a representative of the Church of Jesus Christ of Latter-day Saints, declared that blood-type tattoos were exempt from the typical prohibitions on Mormons defacing their bodies, giving the program a boost among the devout. The experiment would not last much longer, though.


By 1955, 60,000 adults and children had gotten tattooed with their blood types in Lake County. In Milford, health officials persisted in promoting the program widely, offering the tattoos for free during routine vaccination appointments. But despite the cooperation exhibited by communities in Indiana and Utah, the programs never spread beyond their borders.

The Korean conflict had come to an end in 1953, reducing the strain on blood supplies and, along with it, the need for citizens to double as walking blood banks. More importantly, outside of the program’s avid boosters, most physicians were extremely reluctant to rely solely on a tattoo for blood-typing. They preferred to do their own testing to make certain a donor was a match with a patient.

There were other logistical challenges that made the program less than useful. In a post-nuclear landscape, bodies might be charred, burning off tattoos and rendering the entire operation largely pointless. With the Soviet Union’s growing nuclear arsenal—1,600 warheads were ready to take to the skies by 1960—the idea of civil defense became outmoded. Ducking and covering under desks, which might have shielded some from the immediate effects of a nuclear blast, would be meaningless in the face of such mass destruction.

Programs like tat-typing eventually fell out of favor, yet tens of thousands of adults consented to participate even after the flaws in the program were publicized, and a portion allowed their young children to be marked, too. Their motivation? According to Carol Fischler, who spoke with the podcast 99% Invisible about being tattooed as a young girl in Indiana, the paranoia over the Cold War in the 1950s drowned out any thought of the practice being outrageous or harmful. Kids wanted to do their part. Many nervously bit their lip but still lined up with the attitude that the tattoo was part of being a proud American.

Perhaps equally important, children who complained of the tattoo leaving them particularly sore received another benefit: They got the rest of the afternoon off.
