Ed. note: I'm on deadline this week, so I'm handing the keys over to Jayne Kelley, a Chicago-based writer and editor who graduated last May with a master's degree in Design Criticism from the University of Illinois at Chicago. The piece below, written in 2009 and updated briefly after the passing of Adam Yauch, is the first essay she penned for the program. I think it's really funny and sharp.
"The Beastie Boys' first album should bear a prominent label, not to protect impressionable teens so much as their elders. Warning: certain scenes and references contained herein may seem offensive, even dangerous, until you realize that it's all a colossal joke.”
– Mark Coleman, Rolling Stone review
of Licensed to Ill
, Feb. 26, 1987
“I went inside the deli and my man’s like, ‘What?’
I write the songs that make the whole world suck.”
– Ad-Rock (Adam Horovitz), “Unite,” 1998
Beastie Boys lyrics occupy a not-insignificant portion of my memory’s real estate. Obviously, the fervor with which I acquired this knowledge does not remotely match the likelihood of my ever using it. An obsessive impulse common in teenage girls offers some explanation, but considering that the back catalog of rap (and hardcore punk) I memorized spanned 15 years, Beatlemaniacs had it easy. Still, the Beastie Boys’ lyrical contributions haven’t gone totally unrecognized: the Oxford English Dictionary credits them with coining “mullet,” a term that has taken on a cultural life all its own.
In retrospect, for a kid who took herself much too seriously, it was the Beasties’ signature sense of humor that propelled my dive into their discography. “The group is well-known for its eclecticism, jocular and flippant attitude toward interviews and interviewers, obscure cultural references and kitschy lyrics, and for performing in outlandish matching suits,” according to Wikipedia. And indeed, what’s not to like about outlandish matching suits? From red and black adidas track gear to multicolor workman’s one-piece uniforms to sober business attire, the Beastie Boys’ style has always conveyed a commitment to looking good that is somehow ironic and wholly genuine at once. That this 20-year commitment would repeatedly take the form of outlandish matching suits is no accident (and is actually quite impressive). In fact, I’d argue that outlandish matching suits are emblematic of a highly particular, appealing sensibility: what I’ll call the “half-serious.”
The half-serious aesthetic hinges on another career-long consistency that the Boys themselves might deny. 1986’s Licensed to Ill is often considered one of the first gangster rap albums; alongside the stupid, harmless lyrics of hedonistic anthems like “Fight for Your Right (to Party)” are more overtly, and confusingly, violent lines, such as “You better keep your mouth shut because I’m fully strapped” (“The New Style”). By 1998’s Hello Nasty, the group had entirely abandoned the trappings of their blend of punk gangsterism. On “Unite,” Adam Yauch (MCA) asserts, “I don’t like to fight, I don’t carry a piece / I wear a permanent press so I’m always creased.” The idea that it’s Yauch’s perfectly wrinkle-free clothes that give him street credibility comes off as partially self-deprecating but mostly just silly, even as listeners recognize he was nearing 40 and probably did take that kind of care to press his pants. In interviews, Mike Diamond (Mike D) has framed this shift as a progression, an eventual, conscious evolution toward maturity: “Obviously there are moments in the past that you look back at and cringe…but it’s actually a privilege to be able to change and to be making records that reflect that change,” he told SPIN.
It makes sense that the Beastie Boys themselves would understand what they’ve done in this way, and I was certainly aware of that narrative when I became a fan. Still, my own growing up didn’t match this arc at all; because I heard everything at once—beginning with a copy of Paul’s Boutique I had blindly ordered from BMG Music Service, moving in both directions quickly from there—Hello Nasty felt just as new to me as the puerile gangster posturing that came before. More importantly, all the output seemed equally absurd. To position the 1986 Beastie Boys as somehow incompatible with today’s group misses the point of their project—after all, they’ve kept the same alter egos. For me, it’s wrong to argue that a “mature” straight-faced pun like “You better think twice before you start flossin’ / I been in your bathroom often” (“Crawlspace,” 2004) is categorically different from a line like “My pistol is loaded / I shot Betty Crocker” (“Rhymin’ and Stealin’,” 1986). Both are idiotic, just not in the same way.
The notion that the Beastie Boys are fundamentally idiots originates with the (perhaps apocryphal) title of Rolling Stone’s Licensed to Ill review, “Three Idiots Create a Masterpiece,” which suggests these goofy Jewish kids somehow weren’t aware they were making what would become the decade’s best-selling rap album. In truth, it’s hard to know how self-conscious the Boys were. While I don’t believe their success was an accident, the album doesn’t qualify as total parody. The group was thoroughly faithful to hip-hop’s tropes (and thus mostly respected in the rap community); it’s the fact that they’re white boys from Manhattan and Brooklyn that makes their committed engagement with these conceits ridiculous. The Beastie Boys’ early success is evidence that they understood how to exploit their inherent incongruity for maximum aesthetic effect. This embrace of complexity—intentionally taking on two seemingly opposite qualities at once, those of “real” rappers and “fake” rappers—is key to the half-serious approach.
By the time I started to listen to them, the Beastie Boys had largely apologized for their obnoxious behavior and devoted themselves to less foolish pursuits, notably activism for a Free Tibet. The five albums after Licensed to Ill had enjoyed near-unanimous acclaim; time legitimized the group’s status as hip-hop pioneers. In this context, a half-serious attitude generates another type of idiocy: because their personas are no longer ridiculous, the Beastie Boys must engage rap’s conventions in a facetious way. What was once a threat of murder is now a warning that MCA “will steal your keys and then…check your mail” (“Oh Word?”, 2004). On the same track, Adam Horovitz (Ad-Rock) politely disses an ugly adversary: “Talk about your face, now don’t get pissed / but I suggest you see a dermatologist.” It’s clear that the Beastie Boys are no longer idiots, but their nonsensical plays on typical forms of hip-hop posturing seem just as effortless as their early “accidental” success. Right at the moment when we realize they really are skilled rappers, the Beasties begin to undermine their image by haplessly failing at self-aggrandizement.
As an overachieving, self-doubting teenager, part of what I really loved about the Beastie Boys was how the half-serious approach seemed to obviate a lot of criticism. For me, this sort of calculated remove was a very desirable model to emulate; even though they’re experts, no one would ever accuse the Beastie Boys of trying too hard, which to me was (and in many ways still is) the telltale mark of the uncool. The music video for “Sabotage,” for example, works so well because it’s clear the Boys’ appropriation of the look of a 1970s cop show is ironic, while their excitement over and dedication to the appropriation is total. I’m obviously not the first to point out how the technique of sampling is analogous to the construction of a persona—the intentional amalgamation of different aspects of personality or culture a rapper adopts. But there’s something so great about the way the Beastie Boys have pulled off this balancing act—something inviting rather than alienating, flexible (always fresh), and above all, good-natured, with a knowing, and self-knowing, sense of humor.
As a model for working, the half-serious doesn’t preclude actual effort; it just insists that the effort look easy. And it provides mechanisms for absorbing and reconciling contradictory influences (and even realities) without a fuss. The result is genuine, but not earnest—sophisticated enough to know when sophistication is overrated; familiar, but always a shade unexpected. It’s a little bit like the story of the term “mullet”: the Boys called wide attention to a ridiculous hairdo in what was essentially a very public in-joke (with the song “Mullet Head”), but doing so had a real impact, solidifying the mullet’s cultural import. Anyway, especially now, I’m still listening, memorizing lyrics and taking notes.
“Heroes and Antiheroes: Beastie Boys,” SPIN, April 2000, 66.
On October 1, 1935, New York City mayor Fiorello La Guardia jumped onto the radio and declared war. He warned residents that his crusade would not be “a mere flash”; in addition to an elaborate education campaign, the Republican planned to dispatch special police squads all over the city in an effort to run down perpetrators. This battle had raged, in varying degrees, for two decades. La Guardia hoped his assault would finally put it to rest. “Everyone is vitally affected … by the ‘gigantic common enemy,’” he added. “Let’s turn it into a harmless dwarf” (New York Times, September 29, 1935).
What was the mysterious menace terrorizing America’s largest city? The plague that “has increased to a point beyond all reasonable tolerance,” according to one early observer? Noise. To be more precise, unnecessary noise, flung from cars and nighttime revelers, trucks and tourists. Reformers in Gotham had grown tired of the urban din, and in the mid-1930s, their man in City Hall took it upon himself to silence the five boroughs once and for all.
La Guardia may have been Manhattan’s most prominent anti-noise activist, but he wasn’t the first. That distinction belongs to Julia Barnett Rice, a New Orleans-born debutante whose unique political career George Prochnik briefly profiles in his 2010 book “In Pursuit of Silence.” In many ways, Rice was an unlikely campaigner; after studying music and then medicine at the Women’s Medical College of the New York Infirmary, she married a successful corporate lawyer and started a family, raising six children inside the couple’s custom-built mansion overlooking the Hudson River on the Upper West Side. From their secluded perch at Riverside Drive and 89th Street, the Rice clan was insulated from all but the most thunderous downtown clatter. Tugboat horns, on the other hand, were a constant nuisance, particularly in the summer, when the family propped open the house’s many windows to let in fresh air. Rice didn’t believe for a second that all of the toots she heard were actually preventing collisions.
In 1905, two years after moving into “Villa Julia,” Rice used her family’s wealth in a novel way, hiring six Columbia Law students to track the number of whistle blasts pouring up from the Hudson River each night. Their final report—which ran to 33 pages and included testimonials from both police officers and neighbors—charged that tugboat captains “murder sleep and therefore menace health,” shooting off their horns 2,000-3,000 times in a typical evening, often to greet passing ships or servant girls working along the river. It was just the evidence Rice needed. Because the waterway was under federal jurisdiction, the socialite took her report to Washington. With the help of her congressman, U.S. Rep. William Bennett, and letters from hospitals and patients about the health effects of errant noise, Rice convinced lawmakers to pass a bill regulating the “useless and indiscriminate tooting of sirens and steam whistles.” Boisterous skippers along the eastern seaboard would now face aggressive fines, all thanks to Rice.
Her advocacy had struck a nerve. At the turn of the 20th century, city dwellers were subject to more clamor than any population in history, and public health professionals were beginning to understand the severe risks that excessive noise carried.* Audible disturbances were now a community problem, requiring a community response. Once the tugboat campaign earned some publicity, Rice was inundated with letters of support from regular citizens and public officials alike; she realized that New York’s patrician class—eager to preserve its genteel way of life while paternalistically protecting the poor and infirm—was ready, in Prochnik’s words, “to extend the battle to an array of targets.” And so in 1906, Rice enlisted friends and acquaintances and launched the Society for the Suppression of Unnecessary Noise, the world’s first anti-noise organization.
For two decades, the aristocrat fought day and night for a silent city, writing op-eds and pamphlets, enlisting the aid of prominent individuals, and urging local politicos to draft and update municipal bylaws. Data played a key role in her organizing—each new offensive was backed by an almost maniacal number of measurements. To be sure, there were some triumphs, as when Rice—with the help of Mark Twain—convinced thousands of New York schoolchildren to stay quiet while they walked by or played near hospitals. But the rise of the automobile complicated the society’s already difficult work, and its victories were ultimately modest. Prochnik contends that Rice’s determination couldn’t mask one core flaw in her position: “When it comes to noise, how do we tell the necessary from the unnecessary?”
That’s a question New York City’s health commissioner, Shirley Wynne, tried to answer conclusively after Rice’s retirement. In 1929, Wynne established the Noise Abatement Commission, a government-sponsored panel whose task was to quantify the problem of noise. Over the course of six months, staff members—most of whom were experts in science, engineering, or medicine—canvassed huge swaths of the city, using questionnaires, audio recorders, and a “roving noise laboratory” to gather sound-level readings on the street and inside buildings. (The measuring truck alone covered over 500 miles, making 7,000 observations at 113 locations.) Once the voluminous data was organized, NAC members paired their results with scholarly reports detailing the impact of noise on concentration and productivity, publishing City Noise, a landmark study of the acoustical intensity of urban life.
What did the NAC find? Street traffic, elevated trains, and subways accounted for 52 percent of New York’s clatter, while construction work, automobile horns, and various other sources made up the rest. The existing medical literature also convinced members, as it had Rice, that exposure to constant loud noise could lead to hearing impairment, interfere with sleep, and reduce the efficiency of workers. In framing the problem, however, the NAC took a different approach than its activist predecessor. As Lilian Radovac argues in her fascinating American Quarterly essay “The ‘War on Noise’: Sound and Space in La Guardia’s New York,” the commission considered unwanted sound a technological obstacle, one that could be solved (or at least muted) relatively easily by improving industrial design, reforming construction practices, informing the public, and updating the city’s noise ordinances. If New Yorkers wanted to “do away with unnecessary noise and reduce to a minimum such noises as are necessary,” the commission wrote, they could do so “if they are willing to take a little trouble.”
Initially, the recommendations embedded in City Noise fell on deaf ears. Even though his own administration had completed the legwork, Mayor Jimmy Walker—immersed in a corruption scandal that would soon force him to resign—ignored the directives entirely. The issue only gained political traction when La Guardia, eager to establish his reputation as a reformer, stormed City Hall in 1934 and made noise reduction a priority.
La Guardia’s “War on Noise” was simple in conception. First, city officials, with the help of civic organizations and the police commissioner, would educate the public about what behavior they considered unacceptable—the unnecessary blasting of horns, for example, and attempts to call people to windows by shouting. Residents would then be urged to cooperate with their neighbors and ensure “noiseless nights.” For his part, the mayor made a series of personal commitments, assuring residents that “many miles of [elevated train tracks] will be torn down” and that cabarets in residential districts would be closed.*** He also set up a decibel machine in his own office so he could track the sounds drifting up from the street below. “Most of the city officials including Mayor La Guardia are shouting at the tops of their voices about how quiet everything is going to be when the metropolis gets its noiseless nights,” the Washington Post sarcastically noted that summer (August 17, 1935).
The second plank of La Guardia’s quiet campaign was legal. When the Republican took office, New York’s existing noise bylaws were a total mess—Radovac describes them as “a series of discrete clauses that had accrued over several decades in different sections of the administrative code”—so he signed various executive orders he hoped would serve as an adequate substitute until aides could convince the Board of Aldermen to pass a comprehensive anti-noise ordinance. One prohibited the use of political campaign trucks after 10:30 p.m.; another allowed police to ticket any driver who sounded his horn after 11 p.m. The Noise Abatement Council, a non-profit organization that sprang up after the original NAC disbanded, dubbed this approach “Quiet by Fiat” (New York Times, May 19, 1934) and suggested that 75 percent of unnecessary noise could be eliminated by administrative orders alone. While that estimate was wildly optimistic, it’s clear the restrictions made an initial impact. During the first four days of the drive, the NYPD issued over 5,000 warnings, and officers wrote more than 1,500 summonses by the end of 1935, a dramatic increase from the year prior. “Several of the more exasperating noises faded,” anti-noise activist Edward Peabody later told The New Yorker (November 22, 1941). “Cats induced to stop yowling, gravediggers laid off hitting the tombstones with shovels, etc.”
By the spring of 1936, La Guardia finally muscled through a revamped ordinance, one that banned 14 types of noise and greatly expanded the powers of the police. In one fell swoop, the administration’s focus shifted almost entirely from the preventative to the punitive. Those who played their radios too loudly or worked on construction projects at night, among others, would now face a graduated series of fines: $1 for the first infraction, $2 for the second, $4 for the third, and $10 for the fourth. (During the Depression, those tickets were not cheap.) Labor organizers, immigrants, and those reliant on the street-based economy—itinerant musicians, pushcart sellers, junkmen—were disproportionately targeted. In 1938 alone, 16,000 residents were ticketed for violations and another 293,000 were issued warnings. Like two Republican mayors who would follow in his footsteps decades later, La Guardia came to believe that enforcement was the most sensible way to tame the chaos of his hometown. Radovac offers a less generous interpretation: La Guardia, she writes, “conflated the everyday annoyances of city life with criminal acts.”
And in the end, the same enigma that beguiled Julia Rice frustrated La Guardia and his allies: when the line between “necessary” and “unnecessary” commotion is so fuzzy, it’s nearly impossible to convince people that excessive noise is a problem worth solving. Five years after the War on Noise began, Peabody complained that the public was apathetic and that “you couldn’t say things are so hot” in his movement. In the 1940s and early 1950s, a coalition of executives from companies that manufactured soundproofing products organized “National Noise Abatement Week,” an annual education campaign that was cynical in its formulation and lackluster in its execution. Eventually, noise pollution fell out of the headlines entirely. As one member of the group would admit to The New Yorker (May 15, 1948), “we feel in our heart rather hopeless about New York.”
*Via the CDC: hearing loss; sleep disturbances; cardiovascular and psychophysiologic problems; performance reduction; annoyance responses; and adverse social behavior.
***“There is no reason why they should annoy the neighborhood with what they call music late at night” (AP; September 10, 1935).
Opponents called it the first step toward socialized medicine. The law was too expensive, they complained, and it violated states’ rights. One woman who testified during a congressional committee hearing even suggested its passage would lead to “bureaucratic control of family life.” Had he been alive then, John Roberts surely would have found a way to strike it down in court.
You don’t hear about it much anymore, but the Sheppard-Towner Act—or the “Better Babies Bill,” as some reporters referred to it at the time—was a big f’ing deal. No Congress in U.S. history had ever approved a federally funded social welfare program before S-T came up for debate in the early 1920s; aside from the Volstead Act, it was the most controversial law of its era. A Boston Globe writer summed it up this way: “It ranks next in importance, in the opinion of its advocates, to the legislation which finally gave women the right to vote.” And there are some striking parallels between the fight over Sheppard-Towner and the recent debate surrounding President Obama’s embattled health reform law. With the Supreme Court set to rule on the constitutionality of the Affordable Care Act next month, it’s worth investigating the legacy of its earliest legislative antecedent.
The story begins in 1912, when President Taft created the U.S. Children’s Bureau and hired Julia Lathrop to run it. Housed within the Labor Department, the agency was designed to investigate and report “upon all matters pertaining to the welfare of children and child life among all classes of our people.” Lathrop had been doing essentially the same work for two decades at Chicago’s Hull House, undertaking extensive surveys to document the brutal living conditions in her city’s slums, mental health institutions, orphanages, and poorhouses. She was a natural fit. As one senator’s wife gushed to the Washington Post, choosing Lathrop was “the finest and most just recognition of a woman’s ability, and her place in the nation, that has ever been made by any president” (November 6, 1912).
“Young America’s Aunt”* knew instantly what problem her department should tackle first: infant mortality. When she arrived in Washington, Lathrop’s office launched an eight-city examination of American childbirth practices. The results were startling: the nation’s overall infant mortality rate was a whopping 111.2 per 1,000 live births, higher than in almost every other industrialized country in the world. Annually, 250,000 American babies died during their first year, and another 23,000 mothers died during delivery. (Childbirth was the second leading cause of death among women between the ages of 18 and 45, behind tuberculosis.) There was also a correlation between poverty and the mortality rate—among families earning less than $450 annually, one baby in six died before his or her first birthday. The respected Johns Hopkins pediatrician Dr. J. H. Mason Knox made clear at the time that nearly all of those deaths were preventable if families simply received proper prenatal care. Only 20 percent of expectant mothers did.
With firm data in hand, Lathrop set about drafting legislation that would use federal funds to provide “public protection of maternity and infancy.”** Taking a cue from the newly established Smith-Lever Act, which authorized the Department of Agriculture to distribute matching funds to the states for extension work by county agents, Lathrop envisioned a program in which Washington partnered with local nurses, universities, and social workers to subsidize the instruction of mothers in the care of infants. “The bill,” Lathrop wrote, “is designed to emphasize public responsibility for the protection of life just as already through our public schools we recognize public responsibility in the education of children.”
In 1919, U.S. Rep. Horace Mann Towner (R-Iowa) and U.S. Sen. Morris Sheppard (D-Texas) submitted a bill that contained the basics of Lathrop’s proposal. The Hull House veteran wasted no time stumping for her idea. Over the next three years, Lathrop enlisted support wherever she could, relying heavily on women’s associations that were emboldened by the recent extension of suffrage. The Children’s Bureau sponsored “The Year of the Child,” in which the agency appealed to groups across the nation and published catchy graphics to illustrate the country’s poor international standing. Lathrop convinced popular magazines like Good Housekeeping and the Ladies’ Home Journal to editorialize in favor of the measure. Ultimately, 13 of the most powerful women’s groups in America rallied behind Sheppard-Towner, too; in the final weeks of negotiations, the Women’s Joint Congressional Committee—a massive umbrella group—conducted interviews with congressmen at the rate of 50 per day. “It is doubtful,” reported the Globe (December 18, 1921), “if any single piece of legislation enacted by Congress in recent years—apart from equal suffrage—has had the organized influence of so great a body of the citizenship of the country back of it.”
The final version of the bill passed both the House (279 to 39) and the Senate (63 to 7) by wide margins in late 1921, in part because the law was modest in scope. Congress agreed to appropriate just $1.24 million annually (about $15 million today) for the program, with each participating state receiving $5,000 outright and then dollar-for-dollar matching funds as determined by its population. After five years, the funding would need to be reauthorized. (The bill’s advocates were confident that half a decade was “sufficiently long to demonstrate the real value of the measure.”) One year after its passage, a reporter for the Detroit Free Press described Sheppard-Towner as “mild and rather helpless” (December 1, 1922). He wasn’t wrong.
But the idea behind the bill, at least in the United States, was revolutionary. Social insurance, in any form, just didn’t exist. In her book “Protecting Soldiers and Mothers,” historian Theda Skocpol writes that Lathrop’s brainchild “extended what was once domestic action into a new understanding of governmental action for societal welfare.” Put another way, the new law was “a fragile seed growing in isolation from the then-traditional health programs.”***
That seed quickly bloomed. Within the first year of implementation, 45 of the 48 states passed enabling legislation to receive matching S-T funds. (Illinois, Connecticut, and Massachusetts never participated.) Each state used its subsidy in different ways; some organized conferences where physicians ran demonstrations on maternal and infant care and hygiene, while others paid nurses to visit new or expectant mothers. However it was deployed, the money went a long way. Between 1922 and 1929, the Bureau distributed over 22 million pieces of literature, conducted 183,252 health conferences, established 2,978 permanent prenatal centers, and visited over 3 million homes. Lathrop’s successor at the Children’s Bureau, Grace Abbott, estimated that half of all U.S. babies had benefited from the government’s childrearing information.
Not surprisingly, infant mortality dropped precipitously while Sheppard-Towner was on the books. A working paper published last month by the National Bureau of Economic Research estimates that Sheppard-Towner activities accounted for 12 percent of the drop in infant mortality during the 1920s, with one-on-one interventions producing the most statistically significant results. Combined with rising incomes and better nutrition, preventative health education helped cut the infant mortality rate to 67.6 deaths per 1,000 live births by 1929. Considering how little money Congress actually spent on the law, the results were thrilling.
Not everyone was so excited by the precedent Sheppard-Towner was setting. During the initial debate in Congress, several opponents delivered unhinged criticisms of both the bill and its supporters. U.S. Sen. James Reed (D-Missouri) declared (incorrectly) that Sheppard-Towner would permit officials to “invade” the homes of mothers-to-be. “We would better reverse the proposition,” he charmingly added, “and provide for a committee of mothers to take charge of the old maids (at the Children’s Bureau) and teach them how to acquire a husband and have babies of their own.” Not to be outdone, his colleague in the House, U.S. Rep. Henry Tucker (D-Virginia), characterized the bill as an attempt to “make Uncle Sam the midwife of every expectant woman in the United States.” And Mary G. Kilbreth of the National Association Opposed to Woman Suffrage argued that Sheppard-Towner advocates were both “inspired by foreign experiments in Communism” and “connected with the birth-control movement.” A wealthy socialite from Boston went so far as to challenge the law before the U.S. Supreme Court, contending unsuccessfully that it violated the Tenth Amendment.
If ideologues couldn’t rescind the law, doctors had a better shot. The American Medical Association board was initially skeptical of Sheppard-Towner, calling it an “imported socialistic scheme unsuited to our form of government” at its annual meeting in 1922. Four years later, however, the association fully mobilized for the funding reauthorization fight, lobbying Congress and writing letters to the president. Many physicians moved to incorporate preventive health education into their private practices only after they saw the benefits of prenatal care play out in new clinics across the country. In a very real sense, the Children’s Bureau had become a competitor to private medicine, and in doing so had created its own worst enemy.
Desperate to keep their projects operating, Abbott and the directors of the state Sheppard-Towner programs cut what Skocpol deems a “deal with the devil,” agreeing to terminate the law altogether in exchange for two more years of full financial support. In 1929, seven years after reformers printed their first informational flyers, Sheppard-Towner came off the books. Over the next four years, progressives introduced 14 different bills that would have funded maternity and infancy health programs using federal dollars. All of them failed. When the Great Depression hit, most states dropped their existing programs altogether.
The lesson, though, had been learned. And while the United States’ current infant mortality rate is still not where it should be, it’s decidedly safer for babies and mothers now than it was a century ago. For that, we can thank Julia Lathrop and her small, ambitious staff.
*Headline in the Post, June 9, 1912.
** Children’s Bureau’s Fifth Annual Report of the Chief, 1917
***The Sheppard-Towner Era: A Prototype Case Study in Federal-State Relationships; June 1967.