Opponents called it the first step towards socialized medicine. The law was too expensive, they complained, and it violated states’ rights. One woman who testified during a congressional committee hearing even suggested its passage would lead to “bureaucratic control of family life.” Had he been around at the time, John Roberts surely would have found a way to strike it down in court.
You don’t hear about it much anymore, but the Sheppard-Towner Act—or the “Better Babies Bill,” as some reporters referred to it at the time—was a big f’ing deal. No Congress in U.S. history had ever approved a federally funded social welfare program before S-T came up for debate in the early 1920s; aside from the Volstead Act, it was the most controversial law of its era. A Boston Globe writer summed it up this way: “It ranks next in importance, in the opinion of its advocates, to the legislation which finally gave women the right to vote.” And there are some striking parallels between the fight over Sheppard-Towner and the recent debate surrounding President Obama’s embattled health reform law. With the Supreme Court set to rule on the constitutionality of the Affordable Care Act next month, it’s worth investigating the legacy of its earliest legislative antecedent.
The story begins in 1912, when President Taft created the U.S. Children’s Bureau and hired Julia Lathrop to run it. Housed within the Labor Department, the agency was designed to investigate and report “upon all matters pertaining to the welfare of children and child life among all classes of our people.” Lathrop (pictured above) had been doing essentially the same work for two decades at Chicago’s Hull House, undertaking extensive surveys to document the brutal living conditions in her city’s slums, mental health institutions, orphanages, and poorhouses. She was a natural fit. As one senator’s wife gushed to the Washington Post, choosing Lathrop was “the finest and most just recognition of a woman’s ability, and her place in the nation, that has ever been made by any president” (November 6, 1912).
“Young America’s Aunt”* knew instantly what problem her department should tackle first: infant mortality. When she arrived in Washington, Lathrop’s office launched an eight-city examination of American childbirth habits. The results were startling; the nation’s overall infant mortality rate was a whopping 111.2 per 1,000 live births, higher than that of almost every other industrialized country in the world. Annually, 250,000 American babies died during their first year, and another 23,000 mothers died during delivery. (Childbirth was the second leading cause of death among women between the ages of 18 and 45, behind tuberculosis.) There was also a correlation between poverty and the mortality rate—for families earning less than $450 annually, one baby in six died before his or her first birthday. The respected Johns Hopkins pediatrician Dr. J. H. Mason Knox made clear at the time that nearly all of those deaths were preventable if families simply received proper prenatal care. Only 20 percent of expectant moms did.
With firm data in hand, Lathrop set about drafting a piece of legislation that would use federal funds to provide “public protection of maternity and infancy.”** Modeling her plan on the newly established Smith-Lever Act, which authorized the Department of Agriculture to distribute matching funds to the states for extension work by county agents, Lathrop envisioned a program in which Washington partnered with local nurses, universities, and social workers to subsidize the instruction of mothers in the care of infants. “The bill,” Lathrop wrote, “is designed to emphasize public responsibility for the protection of life just as already through our public schools we recognize public responsibility in the education of children.”
In 1919, U.S. Rep. Horace Mann Towner (R-Iowa) and U.S. Sen. Morris Sheppard (D-Texas) submitted a bill that contained the basics of Lathrop’s proposal. The Hull House veteran wasted no time stumping for her idea. Over the next three years, Lathrop enlisted support wherever she could, relying heavily on women’s associations that were emboldened by the recent extension of suffrage. The Children’s Bureau sponsored “The Year of the Child,” in which the agency appealed to groups across the nation and published catchy graphics to illustrate the country’s poor international standing. Lathrop convinced popular magazines like Good Housekeeping and the Ladies’ Home Journal to editorialize in favor of the measure. Ultimately, 13 of the most powerful women’s groups in America rallied behind Sheppard-Towner, too; in the final weeks of negotiations, the Women’s Joint Congressional Committee—a massive umbrella group—conducted interviews with congressmen at the rate of 50 per day. “It is doubtful,” reported the Globe (December 18, 1921), “if any single piece of legislation enacted by Congress in recent years—apart from equal suffrage—has had the organized influence of so great a body of the citizenship of the country back of it.”
The final version of the bill passed both the House (279 to 39) and the Senate (63 to 7) by wide margins in late 1921, in part because the law was modest in scope. Congress agreed to appropriate just $1.24 million annually (about $15 million today) for the program, with each participating state receiving $5,000 outright and then dollar-for-dollar matching funds as determined by its population. After five years, the funding would need to be reauthorized as well. (The bill’s advocates were confident that half a decade was “sufficiently long to demonstrate the real value of the measure.”) One year after its passage, a reporter for the Detroit Free Press described Sheppard-Towner as “mild and rather helpless” (December 1, 1922). He wasn’t wrong.
But the idea behind the bill, at least in the United States, was revolutionary. Social insurance, in any form, just didn’t exist. In her book “Protecting Soldiers and Mothers,” historian Theda Skocpol writes that Lathrop’s brainchild “extended what was once domestic action into a new understanding of governmental action for societal welfare.” Put another way, the new law was “a fragile seed growing in isolation from the then-traditional health programs.”***
That seed quickly bloomed. Within the first year of implementation, 45 of the 48 states passed enabling legislation to receive matching S-T funds. (Illinois, Connecticut, and Massachusetts never participated.) Each state used its subsidy in different ways; some organized conferences where physicians ran demonstrations on maternal and infant care and hygiene, while others paid nurses to visit new or expectant mothers. However it was deployed, the money went a long way. Between 1922 and 1929, the Bureau distributed over 22 million pieces of literature, conducted 183,252 health conferences, established 2,978 permanent prenatal centers, and visited over 3 million homes. Lathrop’s successor at the Children’s Bureau, Grace Abbott, estimated that one-half of U.S. babies had benefited from the government’s childrearing information.
Not surprisingly, infant mortality dropped precipitously while Sheppard-Towner was on the books. A new working paper published last month by the National Bureau of Economic Research estimates that Sheppard-Towner activities accounted for 12 percent of the drop in infant mortality during the 1920s, with one-on-one interventions producing the most statistically significant results. Combined with rising incomes and better nutrition, preventive health education helped cut the infant mortality rate to 67.6 deaths per 1,000 live births in 1929. Considering how little money Congress actually spent on the law, the results were thrilling.
Not everyone was so excited by the precedent Sheppard-Towner was setting. During the initial debate in Congress, several opponents delivered unhinged criticism of both the bill and its supporters. U.S. Sen. James Reed (D-Missouri) declared (incorrectly) that Sheppard-Towner would permit officials to “invade” the homes of mothers-to-be. “We would better reverse the proposition,” he charmingly added, “and provide for a committee of mothers to take charge of the old maids (at the Children’s Bureau) and teach them how to acquire a husband and have babies of their own.” Not to be outdone, his colleague in the House, U.S. Rep. Henry Tucker (D-Virginia), characterized the bill as an attempt to “make Uncle Sam the midwife of every expectant woman in the United States.” And Mary G. Kilbreth of the National Association Opposed to Woman Suffrage argued that Sheppard-Towner advocates were both “inspired by foreign experiments in Communism” and “connected with the birth-control movement.” A wealthy socialite from Boston went so far as to challenge the law before the U.S. Supreme Court, contending unsuccessfully that it violated the Tenth Amendment.
Ideologues couldn’t rescind the law, but doctors had a better shot. The American Medical Association board was initially skeptical of Sheppard-Towner, calling it an “imported socialistic scheme unsuited to our form of government” at its annual meeting in 1922. Four years later, however, the association fully mobilized for the funding reauthorization fight, lobbying Congress and writing letters to the president. It’s clear that many physicians moved to incorporate preventive health education into their private practices only when they saw the benefits of prenatal care play out in new clinics across the country. In a very real sense, the Children’s Bureau had become private medicine’s primary competitor, and thus its own worst enemy.
Desperate to keep their projects operating, directors of the state Sheppard-Towner programs and Abbott cut what the historian Skocpol deemed a “deal with the devil,” agreeing to terminate the law altogether in exchange for two more years of full financial support. In 1929, seven years after reformers printed their first informational flyers, Sheppard-Towner came off the books. Over the next four years, progressives introduced 14 different bills that would have funded maternity and infancy health programs using federal dollars. All of them failed. When the Great Depression hit, most states dropped their existing programs altogether.
The lesson, though, had been learned. And while the United States’ current infant mortality rate is still not where it should be, it’s decidedly safer for babies and mothers now than it was a century ago. For that, we can thank Julia Lathrop and her small, ambitious staff.
*Headline in the Post on June 9, 1912
**Children’s Bureau’s Fifth Annual Report of the Chief, 1917
***The Sheppard-Towner Era: A Prototype Case Study in Federal-State Relationships; June 1967
Derrick Rose needs to get healthy, quickly. Chicago’s star guard has missed over 40 percent of his team’s games this season because of nagging injuries, the latest of which is causing a wave of panic to wash over otherwise-optimistic Bulls fans who fear that bumps and bruises could spoil what’s been a dominant season thus far. Sure, the Bulls carry the deepest bench in the league, yet a poor run of form (and common sense) suggests their GQ cover boy must play at a high level if they have any shot at winning the NBA title. “I’m just trying to survive,” Rose joked earlier this week. So are we, Derrick. So are we.
Thankfully, Rose and other modern athletes now have at their disposal a ton of sophisticated medical procedures and medications to help the body heal, from physical therapy and acupuncture to cortisone injections and advanced surgeries. Professionals—whose livelihoods depend on properly functioning arms and legs—will spend thousands and thousands of dollars on injections of anti-inflammatory drugs like Toradol, and even on experimental remedies that have not yet passed clinical trials, in which the “patient’s own tissues are extracted, carefully manipulated, and then reintroduced to the body.” There are obvious risks in stepping back onto the field or court after undergoing experimental treatments—just ask the owners of drugged thoroughbreds. But with the biological clock ticking, the more options available, the better.
Dr. George Bennett, a sports medicine pioneer, would be thrilled to see these innovations. Born in the Catskill Mountains in 1885, Bennett was himself a solid baseball player, landing a roster spot on a local semi-pro team by the age of 16. (Friends later described him as a “rather undisciplined little tough guy.”) But medicine was Bennett’s true passion. After high school, he worked a series of odd jobs throughout the Midwest, stashing away his earnings to pay for medical school tuition. Bennett eventually matriculated at the University of Maryland, graduated in 1908, and landed a job at the Johns Hopkins Hospital two years later. He was 25.
It was an interesting time for a sports fan to enter the field, such as it was. “Sports medicine,” as we understand it today, was in no way a recognized discipline. In the locker room, “it was considered effete and unnecessary to have a doctor in attendance” (Washington Post; March 10, 1962), and trainers—most of whom had no science background—applied the lion’s share of treatments, which often meant rubbing sore muscles with balms. At the same time, doctors were starting to use x-rays with more regularity, producing detailed images of the body without having to physically penetrate the skin. If an entrepreneurial physician studied how the athlete’s body worked and used that knowledge to create procedures that sped up recovery times, he could give daring ballplayers a competitive advantage while making a tidy profit for himself.
So Bennett pored over x-rays, starting with baseball pitchers. And what he found was troubling. While baseball players were subject to the same disabilities as the average laborer, repeating the overhand throwing motion over and over dramatically increased the frequency of degenerative joint injuries. The ligaments, tendons, and muscles in the human arm are simply not designed to exert the force necessary to propel a baseball 60 feet at rapid speeds, much less make it curve in flight. “Pitching,” Bennett would famously say, “is a most unnatural motion.”
Bennett penned an article in the American Physical Education Review in 1925 laying out in plain detail the case that pitching can create long-term structural damage. He followed that piece up with another influential article in 1941, titled “Shoulder and Elbow Lesions of the Professional Baseball Pitcher,” which included x-ray photos and a controversial suggestion that pitchers should use a side-arm delivery (like Walter Johnson) to lengthen their careers. It seems obvious now, but the conclusion was revelatory at the time; pitcher workloads didn’t begin to drop dramatically until the mid-1920s, after Bennett’s first paper was published.
While he studied joints in the lab, Bennett simultaneously built a successful practice, which he would leave Johns Hopkins to run full-time in 1947. Over time, the doctor garnered what sports columnist Red Smith called “the enviable and deserved reputation for remantling athletes” (Baltimore Sun; May 28, 1950). Famous ballplayers liked him for a number of reasons: he was clearly bright, he took sports seriously, and he was not afraid to take orthopedic chances if his client requested it of him. Most importantly, he kept his mouth shut; an AP reporter once joked that the only two words the humble Bennett ever said in public were “operation successful.”
Over the course of his career, Bennett opened up stars like Joe DiMaggio, Dizzy Dean, Lefty Gomez, Pee Wee Reese, and Johnny Unitas.* (Clark Gable and Lord Halifax sought out his counsel, too.) With the help of a colleague at Hopkins, he also invented the first batting helmet, a hat designed with a specialty zipper pocket that held two hard plastic slabs. And once in a while, he worked miracles.
The career of Roy Sievers (pictured above) is an instructive example. A hulking left fielder, Sievers won the American League Rookie of the Year award in 1949, hitting .306 and slugging 16 home runs for the St. Louis Browns. But in 1951, after struggling during his sophomore season, he broke his right collarbone diving for a ball in the outfield. The next spring, he dislocated the same shoulder making a throw across the diamond. His career appeared finished. Then Sievers visited Bennett. In what the doctor described as “an experiment,” he drilled a hole in Sievers’ bone, cut his tendons, slipped them through the opening, and knotted them together on the other side to keep the bone from rolling out of the shoulder socket. The procedure drastically limited Sievers’ throwing power, forcing a positional move to first base. While supportive of the initial operation, Browns president Bill Veeck and his colleagues in the front office weren’t convinced he would return to form, so they shuffled him off to Washington in a trade for the unremarkable Gil Coan.
This turned out to be a mammoth mistake; Sievers gradually redeveloped strength in his arm and subsequently took the majors by storm, blasting over 20 home runs in nine straight seasons. His best year came in 1957, when Sievers finished third in the AL MVP race, logging 42 home runs and an on-base plus slugging percentage of .967. According to Bennett, Sievers’ recovery was a “miracle of modern medicine” (Washington Post; September 20, 1957). The Senator agreed; during an awards dinner for Bennett the following year, Sievers came up to the doctor with a tear in his eye and thanked him for saving his career. Red Smith aptly described Bennett’s enduring reputation: “This sort of thing has become such a familiar story—the halt and lame of sports have been shuffling off to Baltimore for so long now and in such numbers—that a newspaper reader might be excused if he got the notion that Dr. Bennett had invented the practice of medicine.”
Bennett died in 1962, so he didn’t get to see the creative surgical work of the doctors who followed in his wake. That includes Frank Jobe, who successfully repaired Tommy John’s elbow and launched a medical revolution. But Bennett’s impact on sports, and the medical profession more broadly, was undeniable. NFL commissioner Roger Goodell might even want to revisit the doctor’s thoughts on football, delivered in an AP interview on December 18, 1947. “The present helmet is simply equipping a player with armor and the steel mask in front is an open invitation to crush someone’s jaw or knock his teeth out,” he said. “The toll of injuries will continue to mount unless the face mask is legislated out of the game immediately.”
Prescient words from a thoughtful man.
*“After listening to that all-star team of players Dr. Bennett has mended,” Joe Garagiola said at an awards dinner in 1958, “I’m sort of sorry I didn't break my leg."
On Friday, I sat at my desk and dreamt that I owned a treadmill.
I’m no Mr. Universe, but I try to stay in shape. So last year, I joined a simple and clean fitness club that let me run comfortably and get my pump on at a reasonable price. When I left my full-time job and we moved back across town, I had to ditch my membership, and I haven’t found a suitable replacement yet. That means when it snows several inches in one day, as it did late last week, you’re more likely to find me drinking porters and watching the Bulls than battling the elements to burn a few calories.
This lack of willpower makes me feel rotten. I like to work out, and the benefits are self-evident. Just last week, the New England Journal of Medicine released a new study showing that Medicare Advantage enrollees who were offered new fitness-membership benefits as part of their health insurance plan were demonstrably healthier than Medicare beneficiaries who were not provided free health club access. This study is no outlier. When we get off our asses and get to the gym, we feel better and live longer.
For the millions of Americans trying to make good on their New Year’s resolution to exercise and lose weight, we have the fathers of the American fitness movement, Vic Tanny and Jack LaLanne, to thank. You’ve probably heard of the latter, an exercise zealot who stayed in the public eye deep into old age by performing intense and often bizarre feats of strength.* (To mark his 70th birthday, for instance, he towed a flotilla of 70 rowboats for one mile along the Pacific coastline.) But it’s his transformation from troubled teen into quasi-superhero that inspires weight training buffs. LaLanne was no natural athlete; as a Bay Area adolescent, he was skinny, addicted to sugar, covered in acne, and prone to violent outbursts and suicidal thoughts. One day, his frustrated mother -- a member of the health-focused Seventh-day Adventist Church -- dragged Jack to a lecture at a women’s club by nutritionist Paul Bragg. As LaLanne told it, he experienced a feeling akin to a religious awakening sitting in the audience. When he got home, he ditched the cookies and started experimenting with weights at the Berkeley Y.M.C.A., an activity his peers would never consider. To learn about musculature, he found a copy of the famous textbook Gray’s Anatomy and enrolled in chiropractic school. Then in 1936, at the age of 21, he procured a space in an old Oakland office building, stocked it with workout equipment (plus a juice bar and health food supplies), and sought out weak teens looking to “turn their lives around.” The first-of-its-kind health studio was born.
On the other side of the country, in Rochester, New York, a hulking first-generation Italian-American named Vic Tanny (pictured above) was making similar moves. With the help of his brother -- a bodybuilder and former Mr. USA -- Tanny (born Victor Iannidinardo) created a proto-gym in his garage using broom handles and sandbags. Neighbors didn’t take to the idea,** so after four slow years, the pair moved out west in 1940 and rented a space on Second Street in Santa Monica. The location was ideal; vets who first trained with weights during World War II flooded Southern California in the 1940s, and a slice of nearby shoreline called Muscle Beach -- once famous for its vaudevillian performances -- developed into a meeting ground for a new generation of fitness-conscious strongmen. Tanny, for his part, had the foresight to take the gym out of the cellar. While the few facilities that existed at the time were dank and stuffed with rusty barbells, Vic’s spots used bright colors, carpeting, background music, and chromium-plated weights to draw in prospective members with more delicate sensibilities. (Later iterations even offered adjacent spas, tennis courts, and swimming pools.) And because Tanny allowed customers to pay their fees on an installment plan, working- and middle-class people weren’t shut out. Armed with a killer marketing slogan -- “Take It Off, Build It Up, Make It Firm” -- and a small army of brawny trainers who were willing to stand in high-traffic areas and give demonstrations, Tanny’s brand exploded. By 1947, he had opened several gyms around Los Angeles. Thirteen years later, he owned over 100 locations in North America and grossed $34 million. As one colleague told the Los Angeles Times in 1985, “Vic Tanny was to the gym business what Henry Ford was to the automobile.”
LaLanne, meanwhile, parlayed his modest success as a trainer into a contract with a San Francisco television station. His idea was simple: host a fitness show that would spread the gospel of exercise to a wider audience while relying on cheap household equipment like chairs and towels. In 1951, the channel gave the charismatic but untested guru a shot, offering an early-morning time slot reserved primarily for children’s programming. LaLanne, as he was wont to do, took the challenge in stride, bringing a dog named Happy on set as a way to appeal to kids, who were urged to wake up their moms for a workout. The show was an instant hit. After eight years on the local airwaves, it was syndicated nationally in 1959 and ran for 26 additional years, helping popularize the radical idea that Americans could live more comfortably if they ate well and got their heart rates up once in a while.
Unfortunately for Tanny, his empire crumbled just as LaLanne’s viewers started seeking out gyms in which to practice their new routines. Like Borders would do 40 years later, Tanny expanded way too quickly, and the company’s cash supply dried up. To compensate, the executives halted advertising, slashed maintenance and staff budgets, and attempted to sell stock to raise capital, but the ledger never properly balanced. By 1961, the government moved to collect $2.6 million in back taxes. Shortly thereafter, Tanny closed more than half of his clubs -- some so quickly that members weren’t given sufficient notice to retrieve clothes from the locker room -- and sold the rest, many of which had fallen into disrepair because of improper upkeep. Those who had invested in the company, supplied Tanny with equipment, or bought “lifetime” memberships (for $500 up front) lost a combined $15 million in the collapse. LaLanne’s star faded as well, but not until the 1980s, when “a new generation of glitzy instructors promised dramatic results fast” and the long-running television show was cancelled. He stayed active in the ensuing years, hawking vitamins, supplements, juicers, and inventions like the “Glamour Stretcher” on infomercials before passing away last January.
I urge you to watch this clip of LaLanne, donning his trademark workout suit, talking about “sugarholics.” It’s worth it just to hear the unorthodox patter of his speaking voice. And here’s a demonstration of some goofy “fingertip push-ups,” with a comic appearance from Happy the Dog.
Now get to the gym!
*LaLanne’s justification for staging wild physical stunts is as hilarious as it is delusional: "Jesus, when he was on Earth, he was out there helping people, right? Why did he perform those miracles? To call attention to his profession. Why do you think I do these incredible feats? To call attention to my profession!"
**Tanny theorized that New Yorkers in the 1930s were “ashamed to admit they wanted to improve their appearance.”
Scarfing down rotting flesh is no way to endear oneself to neighbors, but it’s the delicacy of choice for Old World vultures, the bald scavenging birds that for centuries have circled the skies above Africa and Asia in search of animal carcasses. Putrid meat is a viable food source for this particular breed of avifauna, whose corrosive stomach acids break down toxins that make competitors retch. And their grimy rummaging rids towns and villages of waste that can cause deadly diseases like tuberculosis, brucellosis, and foot-and-mouth. Vultures are indispensable, albeit foul, health care providers.
That’s what makes the population’s precipitous decline in India so alarming. The birds were once a constant presence on the subcontinent, but almost 50 million have perished in the last 15 years, leaving fewer than 60,000 in the wild, according to Britain’s Royal Society for the Protection of Birds. These deaths were completely preventable, too. After years of confusion, a microbiologist from Washington State University identified a mild painkiller called diclofenac as the cause. Indian farmers, it turns out, started giving cows a generic version of the medicine en masse to ease pain from swollen udders and hooves, not realizing that the substance, when ingested by vultures, induced kidney failure. Some scientists now consider the species “functionally extinct,” and the death spiral has drawn comparisons to the American Passenger Pigeon, which was carelessly wiped out in the 19th century by over-aggressive hunters. Without birds to dispose of the remains, cow cadavers now dot swaths of the Indian landscape,* attracting rabid, nasty canines.
Government officials and conservationists have launched multiple ventures to save the vultures, including banning the use of diclofenac in animals, which is reducing contamination, though not as quickly as originally hoped. This year, 10 chicks were also bred in captivity as part of a government-initiated recovery plan, which the RSPB regards as “exciting news.” Yet that strategy is not guaranteed to replenish the genus, as Meera Subramanian outlines in her excellent article on the topic for the Virginia Quarterly Review:

Breeding vultures in captivity is a tentative experiment, and basic biology is against the scientists’ chances for success. Of the thirty-two vultures I watched in the white-backed aviary, there were only twelve established pairs. They build nests for six weeks before the mother lays a single egg. Together, the parents keep it warm during two months of incubation, and, if the egg hatches successfully, keep the young fed for another four nest-bound months before the inaugural flight of a fledgling bird. It will take the offspring five years to sexually mature. The process is slow and yields minimal results; only seventeen vultures have been bred successfully at Pinjore in the last three years, not even enough to make a dent in the population’s continued rate of decline. [...]

Even if the breeding is successful, if BNHS can raise the funds—more each year—and find the biologists willing to do the unpraised work, even if vultures accept their new confines, what then? There is no hope for the ultimate stage of a captive breeding program—release—unless diclofenac is completely removed from the environment. Each year, there will be more vultures to care for, and the ark will need to expand, and yet the floodwaters continue to rise each time a farmer pricks the rump of an ailing cow with a shot of diclofenac.
For more on “the most rapid population collapse of any animal in recorded history,” page through this 2007 report.
*Hindus won’t consume the meat, Muslims will only if the animal is killed according to halal traditions, and Parsis lay out the dead bodies for a ritual known as a “sky burial.”
Do yourself a favor and leaf through rockstar neuroscientist David Eagleman’s fascinating cover story in The Atlantic this month. The premise of the piece is fairly straightforward: while our criminal justice system presupposes that all humans are equally capable of using reason to decide whether or not to commit a crime (and are therefore equal before the law), advances in science are teaching us that factors both genetic and social can warp the neural networks in the brain that control impulses, make decisions, and comprehend consequences. There is no “neural equality,” as Eagleman puts it, but we still treat criminals of every stripe as if their brains were identical, missing the opportunity to hand down customized punishments and rehabilitation strategies that take into account the differences in how each criminal mind (literally) operates.
One of the more extreme examples Eagleman uses to illustrate the interplay between neural activity and villainous intent is the case of Kenneth Parks. In 1987, this young Ontario resident and new father lost a considerable sum of cash gambling at the race track, leading him to embezzle money from his employer and crack open his family’s savings account without his wife’s knowledge. Unfortunately for Parks, his boss caught wind of the scheme and fired him.* With few options left, he and his wife subsequently put their house up for sale, a decision that surely would have saddened and embarrassed Parks’ in-laws, with whom the couple was close.
That’s when tragedy struck. One May evening, in what seemed like a desperate attempt to steal money or keep his humiliating financial secrets hidden, Parks drove 14 miles to the in-laws’ townhouse, parked his car in a basement garage, entered their house with a key they’d given him, and, using a tire iron he had brought along and a knife he stole from their kitchen, attacked the couple in their bedroom, killing his mother-in-law and choking his father-in-law nearly to death. Drenched in blood, he then hopped back into his car and drove to a nearby police station to confess.
An open and shut murder case, right? Not quite. Parks and his family had a long history of sleep disorders, and the night before his fateful drive, he had slept poorly. The following morning, Parks also received a small blow to his right temple during a friendly rugby match. Katherine Ramsland offers more context in her summary of the court proceedings:

The experts described Parks’s actions as the result of many circumstances converging: he had plans to fix his in-laws’ furnace, he was used to the route he would take to get to their house, and he was restless from stress and anxious about his upcoming embezzlement trial. In his sleep, something spurred him to take care of the favor, and when he went in to fix the furnace, he was startled by his in-laws. He attacked both without knowing what he was doing. To strengthen the defense, his family’s history of sleep disorders was submitted.
After months of testimony in which he could not recall a single moment of the event, and an eventual expression of extreme remorse over the death of his wife’s mom, a jury determined that Parks was the victim of homicidal somnambulism (sleepwalking) and that his actions were totally involuntary. The decision was eventually upheld by Canada’s Supreme Court, and Parks left court with a minor prescription and a new lease on life.
If you have some beef you want squashed and think this unique defense may work for you, don’t get your hopes up. Last year, in a blog post for the New York Times about the intersection of sleep violence and the law, Virginia Tech historian A. Roger Ekirch pointed out that “courts were less lenient in the event of ill will between a defendant and his victim.” These days, a little therapy and/or some meds can placate the afflicted, too. Feigning parasomnia, in other words, is a strategy best left inside one’s dreams.
* I picture Parks as William H. Macy in “Fargo,” frantically stammering inside some huge parka.
Last night, my book club met to talk about “Cutting for Stone,” Abraham Verghese’s mythic tome about medicine and family. (The verdict on the novel was mixed; I found it engaging and affecting, though a bit simplistic.) The protagonist and narrator, Marion, an Ethiopian-born doctor, is named after Marion Sims, a 19th-century American surgeon whose personal biography is novelistic in its own right.
Verghese initially describes Sims as “a simple practitioner in Alabama, USA, who had revolutionized women’s surgery.” That almost undersells the southerner’s varied accomplishments. From his small private practice in Montgomery, Sims refined techniques to remedy painful ailments like cleft palate and newborn lockjaw. His major contribution to the field was a surgical treatment for vesico-vaginal and recto-vaginal fistulas, a gruesome injury (of which I’ll spare you the grisly details) largely caused by prolonged labor or violent rape. In 1853, Sims moved to New York City, where he established the first hospital for women in the United States, and eventually opened up the Cancer Hospital, now known as Memorial Sloan-Kettering Cancer Center. A monument in his likeness still stands at the corner of Fifth Avenue and 103rd Street in Central Park.
His legacy, however, is severely complicated by the approach he employed to perfect his craft: operating on slave women. From a 2003 story in the New York Times:

By all accounts, Sims, like a vast majority of his antebellum Southern white counterparts, was a strong proponent of slavery. Thus, when Sims wanted fistula patients, he simply bought or rented the slaves from their owners. Sims operated on at least 10 slave women from 1846 to 1849, perfecting his technique. It took dozens of operations before he finally reported success, having used special silver sutures to close the fistulas. Three of the slaves -- Lucy, Anarcha and Betsy -- all underwent multiple procedures without anesthesia, which had recently become available. Sims’s records show that he operated on Anarcha 30 times.

Sims’s persistence aroused some alarm, and several physicians urged him to stop experimenting. In response, he later reported that the slave women had been “clamorous” for the operation and had even assisted him with surgery.
Some activists in East Harlem have requested that the city take down Sims’ statue, though a 2007 petition circulated by New York City Council member Charles Barron fell short. Meanwhile, the medical condition the doctor helped eradicate in the developed world is still ravaging women in sections of Africa and Asia where obstetric care is scarce. Though statistics are difficult to gather, the United Nations Population Fund estimates that 2 million women remain untreated globally and that at least 50,000 to 100,000 new cases occur each year. Over the past decade, Nicholas Kristof has devoted several columns to the problem, one the media and global leaders seldom discuss because the patients affected are almost all poor and stigmatized. Here’s another reason domestic pols with a bully pulpit need to correct the misconception that America overspends on foreign aid.
Feel like ruining your day? Take a glance at this horrifying report published by Human Rights Watch detailing widespread lead poisoning in China. While the pollutant is tightly regulated in most developed countries, investigators with the human rights group claim that hundreds of thousands of villagers and children in at least nine of China’s 31 provinces now suffer from toxic levels of lead exposure, largely caused by run-off from battery factories and metal smelters. And local officials in the Chinese government, keen on “optimizing economic development” (as their Five-Year Plan for Environmental Protection requires), have essentially choked off access to tests and treatment for at-risk kids, bringing back frightening memories of the government’s lackluster response to the 2003 SARS epidemic.
When lead enters the bloodstream, even in tiny amounts, it can cause serious damage, particularly among kids whose bodies absorb the substance quickly and whose nervous systems are still developing. “A gradual build-up of lead in the bloodstream,” the Guardian notes, “can … lead to anaemia, muscle weakness, arrested development, attention disorder and brain damage.” The latter risk, which is often irreversible, is most unnerving. A 2003 article in The New England Journal of Medicine convincingly linked elevated lead levels to reduced IQ. And Jonah Lehrer penned a great post last month explaining lead’s effect on the brain’s prefrontal cortex, which controls impulses and “executive function.” Here’s the conclusion:

The tragedy of lead exposure is that it undermines one of the most essential mental skills we can give our kids, which is the ability to control what they’re thinking about. While the unconscious will always be full of impulses we can’t prevent, and the world will always be full of dangerous temptations, we don’t have to give in. We can choose to direct the spotlight of attention elsewhere, so that instead of thinking about the marshmallow we’re thinking about Sesame Street, or instead of thinking about our anger we’re counting to ten. And so there is no fight. We walk away.
If the Chinese government doesn’t take care of this problem immediately, it’s entirely conceivable that violent crime will rise dramatically over the next several decades. That trend would pose a greater threat to the country’s international image than slightly diminished economic growth.
King Tut’s life in Egypt was brief. New biological evidence suggests that his funeral was, too.
Since Howard Carter stumbled upon the famous boy king’s preserved tomb 90 years ago, archaeologists and Egyptologists have tried desperately to determine the cause of his premature death. Was he done in by an assassin (unlikely)? Malaria (getting warmer, though inconclusive)? A leg fracture made lethal by a degenerative bone condition (most convincing)? Or maybe he gave his life for tourism (funniest option)? Whatever the final cause, it’s clear Tut was cursed with bad genes and inadequate medical care; a DNA study by Egyptian raconteur Zahi Hawass concluded last year that various members of the 19-year-old’s family exhibited “cleft palate, clubfeet, flat feet, and bone degeneration.” A Victorian court, this lot was not.
Microbiologist Ralph Mitchell added another piece to the Tut puzzle this week: according to his research, the Pharaoh was buried much more quickly than most Egyptian sovereigns. And the evidence is literally stuck on his tomb’s walls, in the form of brown spots covering a famous painting of the goddess Hathor. From Scientific American:

The Egyptian Supreme Council of Antiquities asked the Getty Conservation Institute. They in turn posed the question to Harvard microbiologist Ralph Mitchell. His lab cultured material from the spots and sequenced its DNA. It turns out that the brown marks contain melanins — by-products of fungus metabolism. But the fungus is no longer alive. And photos show that the spots haven’t grown in the past almost 90 years.

Mitchell thinks this evidence indicates that King Tut was buried in a hurry. Because the paint on the walls was probably still wet. And that moisture, along with the body and the food buried there, would have fed the wall fungus, until the tomb ultimately dried out.
Nobody is ready to speculate about what might have necessitated the hasty memorial. He was, after all, embalmed and given customary last rites. For all we know, attendants could have rushed his body inside for some benign reason, like a bad storm. I don’t think a sickly teenager deserved all the hubbub anyway.
Sorry for the radio silence these past few days, everyone. I have no excuses, other than a vacation hangover and an uptick in my NBA reading diet. So let’s get back to it.
Australian researchers made a heartening discovery this week: the “hole” in the ozone layer over the South Pole is beginning to heal. The observation required environmental scientists to isolate dramatic annual fluctuations in ozone levels, a tricky task because of a weather pattern known as dynamical forcing that I won’t even pretend to understand. But the basic conclusion, neatly summarized by the good folks at Nature, is that average springtime Antarctic ozone levels have already recovered by 15 percent since the late 1990s. And the monumental Montreal Protocol treaty, which mandated a 50 percent reduction in the use of ozone-depleting substances, seems to be the primary cause. (Environmental regulation skeptics, take note.)

Without a strong stratospheric ozone layer to absorb the Sun’s potent ultraviolet light, life on Earth would not exist. But not all O3 is created equal. Closer to the ground, low-level ozone known as tropospheric ozone -- formed when air laced with exhaust or industrial emissions interacts with sunlight -- actually serves as a pollutant, and can trigger or intensify serious respiratory problems. (It’s the main ingredient of urban smog, for example.)
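For the chemistry-inclined, a rough sketch of that formation process may help. This is the textbook photochemical-smog cycle, my own simplification rather than anything spelled out in the articles linked above:

$$\mathrm{NO_2} \xrightarrow{\;h\nu\;} \mathrm{NO} + \mathrm{O}, \qquad \mathrm{O} + \mathrm{O_2} + M \longrightarrow \mathrm{O_3} + M$$

Here the NO2 comes largely from tailpipe and smokestack emissions, hν is a photon of sunlight energetic enough to split the molecule, and M is any bystander molecule (usually N2 or O2) that carries off the excess energy. No sunlight, no photolysis -- which is why smog tends to be worst on hot, bright afternoons.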
Luckily for our lungs, humans have a natural defender in the fight against ground-level ozone. According to a new paper in the journal Environmental Science & Technology, chemist Charles Weschler has determined that oil found on flakes of human skin quite effectively breaks apart the toxin:

In the dust, the team found significant amounts of squalene and cholesterol. Weschler was not surprised, since humans shed skin rapidly: People shed their outer skin layer every two to four weeks. Given the amounts of the compounds, Weschler estimates that dust could be responsible for 2 to 15 percent of indoor ozone removal, depending on the amount of squalene in the dust.
"Basically, human beings are large ozone sinks,” Weschler wrote
in a subsequent email to LiveScience. “We have only found this out within the last five years!" Next time you’re at The Body Shop, in other words, pass on the loofah. It might help save your life.
Chris Bond is tired of his constituents expectorating on the sidewalk. The councilman, who represents Enfield, a borough in far north London, is petitioning his country’s Secretary of State for Justice to give local officials the authority to outlaw public spitting. “It is my belief,” he said in an interview with the BBC this week, “that most people find spitting a wholly obnoxious, filthy habit which can spread germs and causes health issues.”
One hundred years ago, Bond’s request would not have seemed peculiar. Indeed, by 1916, 195 out of 213 American cities with populations over 25,000 had prohibitions against the practice on their books. The laws were imposed to prevent the spread of tuberculosis, which can be contracted by inhaling just a tiny bit of particulate bacteria embedded in sputum. The National Association for the Study and Prevention of Tuberculosis, the antecedent of the American Lung Association, was actually formed in 1904 with the direct goal of educating the public about the dangers of unloading saliva near the feet of unsuspecting neighbors.
These days, we in the West live cleaner lives and have access to antibiotics and vaccines that largely protect us from that deadly infection. Chewing tobacco, a once-prevalent substance that produces excess saliva, has also fallen out of favor with the non-baseball-playing public. (The Onion had some fun with that development a few years back.) So the problem today is largely cosmetic, not health-related. And enforcing spitting bans, it turns out, is incredibly difficult pretty much everywhere, especially when law enforcement is apathetic about the statute itself. As one police official in the Indian state of West Bengal (which does outlaw spitting) told The Telegraph three years ago, “if we started imposing the law, we would perhaps end up having to arrest every second person on the streets of Calcutta.”
Some other politicians have tried alternative methods to dissuade spitters. The city of Kunming, China, for example, began distributing tiny green phlegm bags for people to hock into and then trash. And India has launched the Spit Free India campaign in an effort to raise awareness about the potential health risks. Perhaps it’s a good time to invest in modern-day spittoon manufacturers, if any actually exist.
(H/T The Awl)