George Gallup, the father of modern opinion polling, ran his first political survey because his mother-in-law asked him to.
The year was 1932, and Iowa’s Democratic party bosses had just placed Ola Babcock Miller
on the ticket as candidate for Secretary of State. It was a token gesture. Miller—a former teacher and amateur painter—had only modest political experience, but her deceased husband Alex had edited the only important Democratic newspaper in the traditionally Republican state, and his friends in the party thought nominating his suffragette widow for statewide office would be a worthy tribute. She was flattered, but overwhelmed; early in the race, Miller described her campaign as “a martyrdom to the cause.”
She didn’t take the responsibility lightly, though. Instead, Miller enlisted the help of her son-in-law, George
. Raised by an Iowa dairy farmer and land speculator, Gallup had spent the past decade at the University of Iowa and Drake University studying the reading patterns of American newspaper and magazine consumers. Using Gallup’s survey method, revolutionary at the time, an interviewer would take readers through the pages of a newspaper’s most recent issue, glean how much or how little of it they read, and then break down the items by category so publishers could learn what types of features drew significant eyeballs. His doctoral dissertation was titled, quite plainly, “A New Technique For Measuring Reader Interest in Newspapers,” and a host of Midwestern rags eventually hired him to evaluate their products. His mother-in-law, looking for any advantage she could secure, wondered whether Gallup’s technique could be used to figure out what issues mattered most to her potential constituents.
It was an insightful idea. Throughout the fall, George knocked on doors, asking voters what they valued in a candidate. (Highway safety, he found, was a top priority.) He passed those details on to Miller, who worked them into her campaign stump speeches. Riding the coattails of Franklin Delano Roosevelt, who took home 2,225 out of 2,435 precincts in the Hawkeye State that November, the widow upset her opponent by fewer than 3,000 votes, becoming one of the first Democrats since the Civil War to win statewide office in Iowa. She had Gallup to thank for the extra edge.
Gallup’s own experience on the trail was revelatory. He had come to the conclusion during his academic training that people “arrive at their preferences and opinions in an orderly way” (Journal of Marketing
; October, 1962). It followed that one could get a firm understanding of the public’s political
desires if he or she asked the right number of Americans the right questions. And if voters’ collective intentions were pure and their preferences widely known, politicians would have no choice but to bend to their will. Democracy would flourish. The trick was figuring out an efficient and accurate technique to ascertain what the public actually wanted.
Political junkies had taken straw polls for decades, but they invariably lacked statistical rigor. (“Of the 16 people on this cable car, 12 are Harding men.”) In the early 20th century, the most comprehensive and influential practitioner was Literary Digest, a weekly general interest magazine. During each presidential election cycle, starting in 1916, its editors mailed out ersatz ballots to its subscribers (and eventually to the publicly listed names of people who owned telephones and cars) and solicited responses. Heaps of data poured back into the magazine’s offices, where hundreds of clerks tallied the results. Saturation was the name of the game; if enough Americans were contacted, the thinking went, the nation’s predilections would inevitably emerge. One or two weeks before Election Day, the magazine would release its final forecast. In the first five races it tracked, Literary Digest’s
data spit out the correct winner five times.
Gallup knew there was a better way. In December of 1933, after moving to New York City to head up the research arm of the advertising agency Young & Rubicam, Gallup and his colleagues began conducting nationwide opinion polls, employing the sampling method, which he likened to government inspectors who test wheat. (“They take a sample here, another there,” he told the Washington Post on December 8, 1935, “and by choosing the samples properly are able to judge the quality of the whole amount from what the sample shows.”) Gallup hired hundreds of staffers to interview not just people who owned cars and phones but also laborers and persons on public aid in all 48 states, folks who intended to vote but whose names could not be traced back to assets they owned or luxury magazines to which they subscribed. (This was the height of the Depression, so that demographic was significant.) The firm deliberately surveyed a cross-section of Americans it felt was representative of the electorate at large: farmers and city dwellers, rich and poor, Republicans and Democrats. The approach seems intuitive, but nobody in politics had ever tried it before.
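Gallup’s wheat-inspector logic is easy to demonstrate with a little arithmetic. Below is a minimal simulation sketch (the sample sizes and the assumed Landon lean of the Digest’s mailing list are hypothetical; only Roosevelt’s roughly 61 percent popular-vote share and the Digest’s 2.4 million returns come from the record) showing why a small representative sample beats an enormous biased one:

```python
import random

# Hypothetical illustration: a huge poll drawn from a skewed frame
# versus a small poll drawn from a cross-section of the electorate.
random.seed(1936)

TRUE_ROOSEVELT_SHARE = 0.61   # FDR's actual 1936 popular-vote share (~61%)
DIGEST_FRAME_SHARE = 0.43     # assumed FDR share among phone/car owners

def poll(n, roosevelt_share):
    """Simulate n respondents and return the observed Roosevelt share."""
    votes = sum(random.random() < roosevelt_share for _ in range(n))
    return votes / n

digest = poll(2_400_000, DIGEST_FRAME_SHARE)   # saturation, biased frame
gallup = poll(3_000, TRUE_ROOSEVELT_SHARE)     # small, representative sample

print(f"Digest (n=2,400,000, biased frame): {digest:.1%} for Roosevelt")
print(f"Gallup (n=3,000, representative):   {gallup:.1%} for Roosevelt")
# The enormous sample is precisely wrong; the small one is roughly right.
```

No quantity of additional mailings fixes the first poll, because its error comes from who gets asked, not from how many.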
After gathering data for two years, Gallup created a new organization—the American Institute of Public Opinion (AIPO)—and published its initial findings in a syndicated newspaper column he named “America Speaks.” Within the year, over 60 newspapers had purchased the package, which Gallup updated with new information weekly. Columnist Franklyn Waltman dubbed it “the periodic gauging of the public heartbeat” (Washington Post
; July 13, 1936). The concept of a running poll was so novel that some newspapers had to convince readers that the information Gallup presented did not reflect the political views of their editorial boards.
Gallup didn’t hide his modest disdain for Literary Digest’s
survey work, either. A few months before that magazine was set to release its enormous report, the Iowa native predicted in one of his columns that his competitor’s data would skew toward Republican challenger Alf Landon
, despite what he considered strong evidence that the incumbent, President Roosevelt, held a commanding lead. (As early as November 24, 1935, Gallup had identified 30 states in which the Democrat was garnering at least 53 percent support, a bloc worth 269 electoral college votes, more than the 266 then needed to win, without his securing even one “borderline state.”) Wilfred Funk, LD’s
editor, clapped back, calling Gallup’s prediction “a gratuitous statement.” His staff had “never been able to discover how many ‘rich men, poor men, G-men, racketeers, and candlestick makers’ voted in a given election,” he added sarcastically (New York Times
; July 19, 1936).
They should have tried harder. After mailing out 10 million postcards and receiving a whopping 2.4 million responses, Literary Digest projected that Landon would upend Roosevelt with 370 electoral votes. Gallup’s last “America Speaks” column of the cycle argued that Roosevelt could take home anywhere from 315 to 477 electoral votes. He carried 46 states and won with 523.
Funk and company had no explanation for their poor showing. “We were far from correct,” they wrote in their first post-election issue. “Why? We ask that question in all sincerity, because we want to know.” Refusing to admit that they over-sampled wealthier, Republican-leaning voters, Literary Digest
ultimately just threw its hands up in the air: “So we were wrong, although we did everything we knew to assure ourselves of being right.” It was the last poll they’d conduct; after promising to try their luck again four years down the road, the magazine merged with a competitor in 1937, Funk fled for Reader’s Digest
, and the whole enterprise sank the following year. The Washington Post
was so thrilled with the accuracy of its partner that the paper printed, on two different days, letters from “political and business leaders” praising Gallup’s work; according to the chairman of the Federal Trade Commission, AIPO “undoubtedly demonstrated its superiority over other devices for prejudging the results of the election” (November 5, 1936). Gallup used the high-profile victory as a launching pad, eventually building with his son an international network of research offices that ask private citizens, to this day, what they are thinking. Political polling, as is painfully obvious this time of year, took on a life of its own, for better and worse. As Jason Zengerle wrote
in New York
last week, “the polling industry has never been less confident in its ability to reduce a series of interviews to a number that is an accurate reflection of the opinions and future behavior of the populace.” Pollsters, it turns out, are still grasping for that elusive representative sample.
As for Gallup’s mother-in-law? She was re-elected to two additional full terms, in 1934 and 1936, before she succumbed to pneumonia at the age of 57. Her most meaningful contribution to Iowa public life was the establishment of a state highway patrol unit in 1934, one that boasted 15 inspectors and helped reduce the highway accident rate considerably
. More often than not, it helps to know what voters want.
 One of Gallup’s early discoveries was that people loved to look at pictures and comic strips. According to the Journal of Marketing
(October, 1962), “advertisers, and particularly Ralph Starr Butler of General Foods, seized on these findings, and the first comic-strip advertising resulted.”
 Via the Wall Street Journal
, they were "named for the way farmers threw a fistful of straw in the air to see which way the wind was blowing."
It was the moment she had worked toward for years. On June 25, 1924, Lena Springs
—a decorated 41-year-old academic and political activist from Lancaster, South Carolina—was set to address the Democratic National Convention in New York’s famed Madison Square Garden. Her task was routine but symbolically important: Springs was to submit a report for the committee on credentials, of which she was chairman, the first woman to hold that title. In a wide-ranging interview published by the Washington Post
six weeks later (August 10, 1924), Springs admitted that she was “not one of those who think a woman needs to submerge her femininity in order to be a politician,” so she dressed up for the occasion, donning a crepe meteor dress and a “very Parisian and very chic” straw leghorn hat (New York Times
; June 26, 1924). As she walked purposefully across the stage and approached the lectern on that sweltering summer day, the convention’s band leader cued up a song he felt encapsulated all that Springs had to offer the party of Jackson and Wilson. It was an old ragtime ditty titled “Oh You Beautiful Doll.”
Two days later, a group of delegates formally placed Springs in nomination for the office of Vice President. No woman at a major party convention had ever received that distinction, one that “heralded the time, according to the fair sex, when one of their number will be entered in the lists for the Presidency itself” (Post
). She was honored and humbled by the gesture; the leading woman Democrat of the South, as a colleague described her, promised she would not treat the nomination “lightly.” Springs didn’t win, but her 15 minutes in the spotlight—zealously documented by news and style reporters of the time—show how much progress we’ve made in equalizing political opportunities for women, and how much further we still have to go.
Lena Jones was born in 1883 in Pulaski, Tennessee. She took to school instantly, first in the Dallas-area private schools where she studied for the first decade of her life and later at Sullins College, where she graduated with a B.A. in English. Like few other women of her era, Jones continued her scholarship in post-graduate programs at Virginia College in Roanoke and Presbyterian College in South Carolina, eventually earning a prestigious D. Litt. and landing a job as head of the Queens College (North Carolina) English Department, all before her 30th birthday.
Love interrupted her academic ascent. In 1913, Jones married Leroy Springs, a cotton mill owner and one of the most prominent industrialists of the South. She adopted her husband’s name, moved 50 miles south to Lancaster, and redirected her energy into politics, taking on key roles in the South Carolina Equal Suffrage League and the South Carolina Federation of Women’s Clubs. During the Great War, Springs organized the Lancaster Red Cross, establishing two emergency hospitals in her town to treat influenza. Once the 19th Amendment was ratified, she eagerly cast her ballot for the Democratic slate, and ultimately secured a position as Democratic National Committeeman. The convention in 1924—the first held since women were extended the franchise—was an event Springs would not miss.
She wanted plenty of women to join her, too. Springs had spent a portion of her spring alongside national Democratic vice chairman Emily Newell Blair lobbying party members to increase the number of delegates-at-large from four to eight, with the explicit intention of reserving half of the new slots for women. (Just one female Democrat served as a delegate at the DNC’s 1908 convocation in Denver, though the number of lady envoys increased slowly over the next 16 years.) The provision passed. When Springs arrived in New York, another 475 female delegates and alternates were in attendance—almost one-fourth of the entire membership, and the largest delegation of women ever to participate officially in a U.S. political convention.
The Democratic women came from all walks of life—elected officials, Tammany Hall leaders, school teachers, business women, housewives—and made clear there was “no one outstanding issue on which [they], as a group, will make a fight” (Chicago Tribune
; June 23, 1924). What they wanted was to be treated seriously as political thinkers and organizers. Right away, they launched a campaign to gain control of a major committee chairmanship, canvassing delegations for pledges “in a professional way” (AP; June 22, 1924). It was a display of strength, and Springs was the beneficiary of this exerted clout, taking the reins of the Credentials Committee as the convention got underway. Not surprisingly, she handled the job like a veteran, “winning the plaudits of her colleagues not only for her ability but her courtesy and fairness” (Times
; June 28, 1924). Her first appearance on stage was a direct result of her skillful work behind the scenes.
And how did reporters depict the South Carolina suffragette upon her initial foray into national politics? Like the band that played her onto the stage, the press overwhelmingly treated Springs like little more than a pin-up model, or “an ornament to the convention,” as the Boston Globe
so delicately put it (July 6, 1924). Even for readers who watched some members of the media objectify Vice Presidential nominee Sarah Palin four years ago, the reports from 1924 are jarring. Springs was “a highly personable and sumptuously-draped young woman” with “a complexion which Helen of Troy might envy.” She had “coal black hair and eyes, and a dynamic, scintillating vitality.” Her dresses and jewels were “particularly lovely and becoming,” catching the eyes of fashion editors, “who have been shown that the women from the Southland know how to wear clothes as well as the women on Fifth Ave.” Springs, in short, was “distinctly the type that men ‘fall for’ good and hard.”
The coverage got even more ridiculous when 18 of Springs’ fellow South Carolina delegates—as a way to honor her service to their state and national party—nominated her for the vice presidency. The former English professor understood that her chances were minuscule; 29 other men were nominated, and she’d never held elected office. But even the modest tribute was partially spoiled by rampant sexism. It took two forms: Springs’ nomination was understood as either a novelty to mock or a nightmare to fear. With a creepy wink and nod that practically leaps off the page, Senator Burton Wheeler (D-MT) provided a clear example of the former, noting that Springs “would most certainly improve the situation in the Senate” (Times
; June 28, 1924). Her (jackass) husband Leroy joined the latter camp, initially telling reporters that he “did not take kindly to the idea of having his wife’s name ‘coupled’ with that of another man’s” (Globe
), and adding two weeks later he would “leave for some other country” on the day a woman won the presidency (Post
). What would become of the United States if a woman took a seat in the vice presidential chair? According to the Post
, “speculation [on the convention floor] ranged from making every day a Mother’s day to appointing the head of the Thursday Mah Jong club as secretary of war.”
Springs, for her part, said she would keep a low profile and “try to keep the Senate from getting into too great a parliamentary tangle.” (Where is she when we need her
?) And while she maintained that no woman would lead a major party until “men get over this old-fashioned jealousy or sex superiority complex,” she was optimistic about the future, despite the treatment she herself received. “The day is not very far off,” she told the Post
, “when a woman will be President of the United States.”
Springs took home 2.7 percent of the vote at the 1924 convention, not enough to eclipse eventual nominee Charles Bryan. Afterwards, she stayed modestly active in local affairs before moving to New York following her husband’s death in 1931. Nine years later, at the age of 59, she passed away, the victim of a sudden heart attack. And with no women on either 2012 ticket, her wish to see a female president remains unfulfilled.
 The credentials committee confirms the identity of the party’s delegates and their authority to vote.
 The Republican Party, she would later say, “stands for special privilege.”
Chicago Ald. John Hoellen
(47th Ward)—a Republican reformer who represented Lincoln Square from 1947 to 1975—loved rooting out government waste. He relished a fight, particularly with The Boss, Mayor Richard J. Daley. He was always good for a snappy quote. And he hated
the statue that would become his city’s most iconic piece of public art.
On July 8, 1967, a month before Pablo Picasso’s unnamed sculpture
was to be unveiled in the plaza of the newly-built Civic Center (now the Richard J. Daley Center), Hoellen threw an epic hissy fit in the chambers of the City Council. He started by introducing a resolution demanding his colleagues suspend council rules and immediately replace the “rusting heap of iron” with a statue honoring Mr. Cub, Ernie Banks. When Daley and Ald. Thomas Keane (31st Ward), the administration’s floor leader, tried to proceed with other business, Hoellen whipped out a homemade helmet, “fashioned from a metal lampshade with a cardboard image of the Picasso on it” (Chicago Tribune
; July 8, 1967), and paraded around the floor, screaming comments like “five stories of boilerplate” and “it doesn’t belong in our city.”
As the council clerk attempted to enter the next resolution into the city’s record, Hoellen marched up to the mayor and deposited the helmet on his rostrum. Without hesitating, Daley tossed it on the floor next to his feet, a move that drew laughs from the gallery. Two weeks later, in an interview with the Tribune (
July 30, 1967), Hoellen clarified his artistic analysis: “The statue represents the power of City Hall, stark, ugly, overpowering, frightening,” he said. “If you want to get junk [in the Civic Center plaza], get two junk automobiles that have been involved in a head-on collision on the Kennedy expressway. They’ll attract attention, but, much more important than that, they’ll create a meaningful idea. They’ll tell a powerful story.”
Hoellen may have been the most vocal opponent of Chicago’s Picasso, presented to the public for the first time 45 years ago next month, but he wasn’t the only local frustrated with or confused by Daley’s new acquisition. It took time, and a copyright dispute, for Chicagoans to fully embrace the brooding steel abstraction that now sits, universally admired, at their city’s heart. Or as Hoellen later called it, “the heroic monument to some dead dodo.”
Without William Hartmann, principal at the architecture firm Skidmore, Owings & Merrill (SOM), there would have been no piece of work for Hoellen to mock. When the Public Building Commission of Chicago hired SOM (among a few other firms) to design a 31-story civic center in the city’s central business district, it was Hartmann who concluded that an attention-grabbing statue would be a suitable anchor for the spacious plaza—345 feet by 220 feet—adjoining the giant courthouse. And instead of showcasing another sober historical monument, likely with some old war hero on a horse, Hartmann dreamed big. Carrying with him the imprimatur of the culturally conservative mayor—“If you gentlemen think he’s the greatest, that’s what we want for Chicago, and you go ahead”—Hartmann traveled to the home of 82-year-old Pablo Picasso on the French Riviera, bearing an album of Chicago photographs and a model of their site.
Over the next three years, whenever he was in Europe, Hartmann dropped in on the famous artist to rehash his elevator pitch and deliver an assortment of Chicago-themed gifts: a Sioux war bonnet, a White Sox blazer, a Bears helmet, a Chicago Fire Department hat, photos of Ernest Hemingway and Carl Sandburg. Though Picasso had never visited the Windy City, Hartmann was charming, and the idea of creating a dramatic monumental sculpture for a major American metropolis appealed to him. Finally, in 1965, the master completed a 42-inch model based on a series of drawings he had started 35 years prior. He turned down a $100,000 payment from the Building Commission (a pittance for what his services would have commanded on the open market), preferring to give the design as a “gift to the people of Chicago.”
Hartmann rushed it back home, where city officials authorized SOM to investigate its practicality. The firm estimated that for $300,000, welders at U.S. Steel in nearby Gary could translate the model into a finished sculpture
, clocking in at 50 feet and 162 tons. Three local foundations jumped at the opportunity to underwrite the construction. It wasn’t easy to build; a 12-man crew of iron workers pounded away for three months, “[rolling] steel to sizes which never have been rolled” (WTTW
). But on August 15, the completed steel sculpture stood at the corner of Washington and Clark, and 50,000 Chicagoans descended on the plaza to watch Mayor Daley commemorate his giant attraction.
At the time, the AP speculated that it was the “largest crowd ever gathered to watch the introduction of a piece of sculpture.” The city, hoping to capitalize on the attention, ramped up the pageantry. There was a performance from the Chicago Symphony Orchestra, invocations from a minister and rabbi, and speeches from dignitaries like Hartmann. Daley read a telegram sent by President Lyndon Johnson, and Gwendolyn Brooks read a poem she wrote specifically for the occasion. Then came the moment the curious crush had waited for: the mayor stepped back up to the mic, announced his administration was “[dedicating] this celebrated work … with the belief that what is strange to us today will be familiar tomorrow,” and yanked off a turquoise cloth that was draping the statue. The cover snagged on its steel nose—nearly sending Daley’s press secretary into cardiac arrest—before tumbling past the marble base and onto the ground.
There it stood, 50 feet of it, for the city and world to see. And all the crowd did, at least initially, was gasp. “The weakest pinch hitter on the Cubs receives more cheers,” Mike Royko joked in his Chicago Daily News
column the next day.
Most just didn’t know what to make of it. They’d never seen a piece of art so big, so centrally located, and so mysterious. It didn’t really look like the head of a woman, as they’d been told it would. Surveyed by reporters that day, crowd members delivered their own amateur interpretations: an Afghan hound, a rib cage and appendix, a sea horse, a character from “Kukla, Fran, and Ollie
,” a baboon, a Barbary ape, an aardvark. My favorite? “Nothing, absolutely nothing” (Tribune
, August 15, 1967).
A few high-profile residents joined Hoellen’s anti-Picasso brigade. Even before the statue was unveiled, Col. Jack Reilly—the mayor's director of special events—made headlines for urging its removal. Royko, the most widely read newsman in the region, wrote a withering (and frankly philistine) column
in which he described the piece as “some giant insect that is about to eat a smaller, weaker insect.” Less preoccupied with its literal meaning, experts fell in love with Picasso’s design. “Like a fine bridge,” an art critic for the New York Times
wrote (August 15, 1967), “it combines absolute firmness with an effect of lightness.” Time
declared the acquisition “one of the most magnificent windfalls in [Chicago’s] history.” “There will come a time,” surmised James Brown IV, longtime director of the Chicago Community Trust, “when we can’t imagine anything else being in the plaza except the Chicago Picasso because it is so appropriate to the site and backdrop.”
Brown’s intuition was right, but Chicagoans took a few years to grow accustomed to their new female. Were it not for a 1970 court decision, in which a judge determined that the statue’s copyright belonged to the public and not to the Public Building Commission, it might not have happened at all. The year before, Letter Edged in Black Press, an art publisher, filed suit against the city, arguing that it had no right to prevent merchants and artists from reproducing Picasso’s design. For those who enjoy legalese, the details of the decision are here
. Essentially, the court ruled that the city—as part of its publicity blitz—had authorized the press to photograph Picasso’s model and publish those shots in newspapers and magazines without first affixing a copyright notice to the design. Because of that technicality, the Copyright Act of 1909
did not apply, and the work was tossed into the public domain. According to the Tribune’s
25th anniversary retrospective (August 14, 1992), “the sculpture's big eye and flowing mane soon found its way onto postcards and keychains.” That explosion in swag “bred familiarity, the first step toward love.”
Indeed, it’s tough to find any Chicagoan these days who doesn’t appreciate Ald. Hoellen's “rusting heap of iron,” the city’s first piece of public art for art's sake. The steel behemoth has stood for 45 years, staring us in the face, and we keep staring back at her.
 Walter Netsch, another partner at SOM, later told the Tribune
(March 7, 2003) that Hartmann would consistently ask, “How are we going to amuse him?”
 "You have demonstrated once again that Chicago is a city second to none."
 “Does man love Art? Man visits Art, but squirms/Art hurts.”
 Royko did conclude that the statue’s “pitiless, cold, mean” eyes effectively captured the city’s “I will get you before you will get me spirit.” So in some ways, he was a fan.
Tim Reid and Tom Dreesen were desperate as hell when they wandered into the Manhattan studio of “The David Frost Show.” It was late 1971, and the Chicago-based comedy team had been working on their act, with only modest success, for two years. They needed a break. Specifically, they needed to land an appearance before a national audience. “Television,” Ron Rapoport writes in his 2008 book
“Tim and Tom,” “was the holy grail.”
Brushed off by bookers at “The Tonight Show” down the block, the duo stormed the offices of Frost’s talent coordinator like salesmen on a cold call. There stood Ken Reynolds, an African-American from Chicago’s South Side, the duo’s home turf. They’d hit the jackpot. It took just a few minutes of small talk—mostly about mutual friends back home—and a short audition before Reynolds tossed Tim and Tom in front of the camera and let them rip. They killed.
In the days between their set and the show’s air date, Reid and Dreesen hustled like they had never hustled before, passing along the details of their upcoming showcase to everyone they knew in the industry. The former, so confident the Frost spot would generate substantial buzz, quit his day job to focus exclusively on comedy. Then the episode aired … and landed with a thud. No agents called. No club owners sent over contracts. “It was as if nobody had seen it,” Rapoport writes. “As if they had never even been on.”
So went the career of Tim and Tom
, the first interracial comedy team in U.S. history. To say their act was influential, or even popular, would be an overstatement; Reid and Dreesen’s “sure-fire, can’t-miss idea”—built on the simple premise that people should put aside their racial differences and laugh together—more or less failed. But their rocky five-year run was undeniably interesting
, shedding light on race relations and the cutthroat world of show business in the early 1970s. Both members of the unlikely partnership would become bankable performers in Hollywood eventually. First, they had to take a few lumps standing side-by-side.
Though they grew up 850 miles apart—Reid in segregated Norfolk, Virginia, and Dreesen in the integrated, working-class town of Harvey, Illinois—Tim and Tom’s childhoods mirrored each other in fundamental ways: both went to Catholic elementary schools, had parental figures who battled addiction (Reid’s stepfather, Dreesen’s father), and spent time living above establishments (Dreesen a bar, Reid a brothel) they were too young to enter legally. As adolescents, however, their lives diverged sharply. Inspired by a trip to see the March on Washington in 1963, Reid buckled down in high school, put himself through college at Norfolk State by waiting tables, joined the school’s nascent drama department, and landed a marketing job in Chicago with DuPont, one of the company’s first black hires. Dreesen didn’t even sniff college, having dropped out of high school at 16 after his dad’s drinking habit intensified and his family moved into a rat-infested shack (with no hot water) on the black side of town. The series of odd jobs he picked up—as a caddy, a pinsetter—didn’t pay enough to make ends meet, so he followed his older brother into the Navy before stumbling into a job as a life insurance salesman back home, one that required few tangible skills aside from charm.
Their paths first crossed in 1969, after each joined the United States Junior Chamber in Chicago’s south suburbs, Reid because he wanted to be active in his community’s affairs and Dreesen because he thought it’d be a clever way to find potential clients. At the first meeting Reid attended, Dreesen suggested the organization launch a drug-prevention program in Harvey’s grade schools. The pitch—"open the session with music, tell a few jokes to get everybody relaxed, talk to them on their own level” (Rapoport, 67)—appealed instantly to Reid, a former thespian who had spent far too many years living with a heroin addict. After a few introductory conversations, the pair called up some local principals to schedule sessions.
While the field was admittedly thin, it didn’t take long for Tim and Tom to establish a reputation as the hippest anti-drug crusaders in Chicago’s southland; unlike cops or other authority figures, they were laid back, relatable, and shared a natural rapport. Schools requested their services constantly. After one particularly inspired assembly, an eighth grader named Joanne Cerfuka made an offhand suggestion that caught the men flat-footed. “You guys are so funny,” she said. “You ought to be a comedy team.” It took a few days for the idea to sink in, mostly because neither had any clue how to write a routine. “If we had known how hard it was going to be,” Reid would later tell the Tribune
(October 19, 1980), “I don’t think either of us would have tried it.” But both found the prospect of a career on stage appealing, so Reid bought every stand-up album he could find, the pair cleared some space in his kitchen, and they started experimenting.
From the outset, the two men knew that race would be the predominant theme of their work—it’s what they knew and wanted to lampoon. If the act was going to work, though, they had to deal with each other on even terms—Reid wouldn’t stand for any Amos ’n’ Andy-style minstrelsy. As one of their friends told Rapoport, “[Reid] could be laughed with, but not at.”
Fortunately, this was an easy problem to avoid; Reid was naturally the more urbane and articulate of the two, so it made sense for him to play the “cool and collected observer of his partner’s antics” (Rapoport, 71), the Dean Martin to Dreesen’s Jerry Lewis. “Black audiences had never seen that,” Reid said, “and liberal white audiences didn’t want to see a black man playing the buffoon.” The pair’s coordinated attire—colorful dinner jackets, formal tuxedos—reinforced the balanced relationship they hoped to project.
Tim and Tom developed material remarkably quickly, some of which they would use for the duration of their run. A lot of it was broad and silly. They invented a superhero named Super Spade who fought crime alongside his sidekick, The Courageous Caucasian. They poked fun at Tom’s Italian heritage. (“Are you sure he’s not with the bossa nova?” Tim would ask of Tom’s father.) In one of their best-known bits, Reid grows increasingly frustrated with Dreesen’s “anxiety” about hanging out in black neighborhoods.
After just a few months of writing in their spare time, Reid and Dreesen convinced the owner of Party Mart Supper Club on Chicago’s South Side to put them on the evening bill. (Dreesen had no problem approaching complete strangers, a useful quality in an industry that requires one to sell himself constantly.) And in their first professional performance, in September 1969, the nervous pair “bolted through their act as if they were double parked” (Rapoport, 76). An adult audience out for a night on the town, it turned out, was tougher to impress than teens in a classroom.
Yet over time, they grew more comfortable with each other and with performing generally. Their relentless networking helped secure new gigs, too. A popular deejay they met at WBEE, a black radio station in Harvey, took them to Chicago-area jazz clubs, record parties, and charity functions. They hit it off with Don Cornelius and Merri Dee, two luminaries in the Chicago black entertainment business. Within a year, they made a name for themselves on the local circuit, opening for Gladys Knight and the Pips, Dionne Warwick, and Anita Bryant. They were even added to the lineup of the first annual Black Exposition, held before 25,000 people in the International Amphitheater and headlined by Bill Cosby. Within two, they were playing some of Chicago’s biggest supper clubs. Their following at home was strong enough that by 1973, Dreesen was able to convince club owner Henry Norton to let Tim and Tom host a comedy night at his bar, LePub, on Monday nights. Improv and sketch were thriving in Chicago, but their stand-up showcase was the city’s first.
Chicago, as big as it was, only contained so many stand-up venues and potential fans. To make a living full-time behind the mic, which Tim and Tom hoped to do, the team needed to get out on the road.
Building a national brand proved more difficult than they anticipated. Finding gigs wasn’t the problem; Tim and Tom played rooms big and small, in venues as diverse as the Playboy Clubs and theaters on the southern Chitlin’ Circuit. “Whenever another act fell out, missed a plane or something, boom, we were there,” Reid remembered (Rapoport, 93). “We were the National Guard of comedy teams.” The trick was finding the right gigs. More often than other stand-ups, they’d encounter racially hostile crowds, folks who came out just to heckle, not laugh. Bookers at top-shelf venues and television shows weren’t so eager to hire an interracial pair either, particularly one whose material was charming but never excellent. (They were relative beginners, after all.) During their five years together, the biggest paycheck the pair ever earned was $750 … split two ways. And the only album they ever recorded, 1973’s “The Classic Comedy of Tim & Tom,” didn’t sell well. Reid’s enthusiasm ultimately waned.
"The thought of a black-and-white comedy team really intrigued us,” Dreesen once told the Tribune (July 11, 1991), “so we went on the road for [five] years and bombed and struggled and bombed.” In show business, you need the right combination of talent and luck to take off. It never coalesced for Tim and Tom.
The duo had an acrimonious falling out in the mid-1970s. Reid, having forged a relationship with the connected singer and actress Della Reese, turned his attention to Hollywood. “I began picturing myself doing what they were doing,” he tells Rapoport (143) about his trips to Los Angeles, “becoming an actor more than a comedian.” In November 1974, after a painful set before four people at the Hyatt Hotel in Houston, Reid flew west for good, effectively ending the partnership. It was a difficult, albeit fruitful, decision; in his first two years, after studying at the Film Actors Workshop in Burbank, he landed a series of bit parts on television shows and movies before securing the lucrative role of Venus Flytrap on the sitcom “WKRP in Cincinnati.” He’s been acting and directing ever since.
Dreesen took the breakup much harder. Like his partner, he picked up and moved to Los Angeles, where he struggled to develop a mainstream solo act based on his childhood in Harvey and slept in his car to save money. He sounds downright bitter in a 1975 interview with the Tribune’s Bruce Vilanch, placing blame for the group’s demise squarely on Reid’s shoulders. But Dreesen got back on his feet, too. A talent coordinator for “The Tonight Show” caught his act one night in the winter of 1975 and offered him a spot, the first of 500 television appearances he would make over the next 35 years. He parlayed that exposure into a job opening for Frank Sinatra, a high-profile gig that he held for 14 years. And he’s forgiven Reid, the man with whom he’d tried to change the face of American comedy.
The buck stood in the doorway of the dining room atop a thin layer of leaves. He was hulking, weighing in at 250 pounds, and looked as “natural as life.” A quail pecked at food around his hooves, unbothered by the orchestra pumping in the background or the hungry aristocrats who streamed past, gazing at the majestic animal they would consume—among many others—later that evening. They’d traveled in from Paris, New York City, Vienna, and Montreal. Game was on their minds. They were ready for “Chicago’s Greatest Feed.”
Long before Charlie Trotter diced vegetables or Grant Achatz picked up a blowtorch—hell, before the railroad companies opened up the Union Stockyards—John B. Drake
turned Chicago into a foodie destination. The Ohio native was an unlikely gastronome, in that he spent virtually no time in a kitchen. Instead, Drake operated hotels, first in Cincinnati and then in Chicago as steward (and eventual owner) of the Tremont House
, the city’s “first hotel of metropolitan proportions.” 
Between 1860 and 1880, Chicago’s population would more than quadruple, but in 1855, when Drake arrived, it was still very much a frontier town. And it was teeming with game. Hunters didn’t have to travel further than the modern city limits to find an abundant supply of wild geese, turkeys, and prairie chickens. According to Edwin O. Gale’s “Reminiscences of Early Chicago,” a local pioneer once killed a 400-pound bear that occupied a tree near where LaSalle and Adams now intersect. The Tremont’s earliest guests, from the hotel’s front steps, could even shoot ducks swimming in the surrounding marshland. What better way to harness those local resources and build his reputation in a new city, Drake thought, than by hosting an elaborate dinner party for Chicago’s elite? The menu would feature the region’s finest game, in quantities and variety unavailable anywhere else—Manifest Destiny on a plate, served up by Mr. Drake.
The first meal, in 1855, drew just 40 people. But what started as a relatively modest affair grew quickly in both popularity and culinary ambition. On a Sunday afternoon in 1860, for the fifth anniversary, the proprietors of the Tremont supplied 20 different meats. In 1864, the menu featured 25 cuts. Three years later, Drake served up 26. Virtually every newspaper article published about the event included a variation of this phrase: “the dinner is bigger and better than the year before.” Invitations, meanwhile, were rarely turned down. “To be invited to one of these dinners was a sign that a young man had been accepted by the business community,” writes Emmett Dedmon in his book “Fabulous Chicago” (1983), “which in Chicago was identical with being accepted socially.”
The Great Chicago Fire of 1871 was both a blessing and a curse for Drake. Like 17,500 other buildings torched by the inferno, the Tremont House burned to the ground, and its owner was able to salvage only the money from the safe and a few pillowcases full of silver before the gorgeous structure collapsed. His game dinner, scheduled for the next month, was postponed indefinitely. On Drake’s walk home, however, he passed the Michigan Avenue Hotel at Congress Street (where the Congress Hotel now sits) and noticed that it was still standing. Feeling ambitious, he strolled into the lobby, found the owner, and plopped $1,000 in cash from his moneybox on the man’s desk—enough to cover an advance payment on the hotel’s lease and furniture. The startled hotelier, convinced the flames would engulf his property like everyone else’s, hastily drew up a contract handing control over to his competitor before fleeing to safety.
Drake was rewarded for his courage. The next morning, when he strolled down to the edge of the burned district, he found the Michigan Avenue Hotel standing along the boulevard, still untouched. Immediately, Drake renamed it the Tremont House and enlarged it by taking in some adjoining buildings. Because so few hotels were still operational, the new Tremont faced very little competition; Wagenknecht writes that between 1871 and 1875, Drake “did a rushing business.” Those new profits allowed him to purchase, in 1874, the lease of the famous Grand Pacific Hotel. They also subsidized his game dinners, a tradition he would resume at a new location—and with renewed enthusiasm—the following year. “The game dinners given heretofore by Mr. Drake at the old Tremont House was one of the features of Chicago and the West,” the Tribune
wrote on November 7, 1875, “and we are glad to see he intends to keep up the custom.”
So were the lucky invitees. For 18 years, beginning in 1875, Drake threw an annual party like nothing the city had ever seen … or would ever experience again. For starters, the guest list was massive. By the mid-1880s, over 500 people routinely attended, including luminaries like former President Ulysses Grant, Gen. Philip Sheridan, and Marshall Field. Every nook and cranny of the Grand Pacific ballroom was covered in flowers, ferns, and smilax, simulating the forests from which their food was gathered. Stuffed birds and fowl were perched above the greenery, their wings outspread. Tables were set with the “daintiest of china and glass” (Tribune
; November 22, 1885). A game piece and an elaborate display of confectionery served as the centerpiece. “In a conspicuous place of honor,” the Tribune
noted in 1888, “was set a cute little black bear cub, harnessed in dainty scarlet ribbon, and was ridden by a squirrel which announced itself the prize ‘bare’ back rider.”
The menu was just as outrageous as the decorations. Under Drake’s exacting supervision, a well-drilled army of waiters—often more than 100—brought out course after course of the rarest meat imaginable
: ham of black bear, leg of elk, loin of moose, buffalo tongue, ragout of squirrel à la Francaise, roasted Sand Peeps. Diners could sample over 40 different animals. It was an orgy of flesh, or a “saturnalia of blood,” as Stefan Bechtel writes in his new book “Mr. Hornaday’s War.” Dedmon put it more delicately: “The standards of Chicago were those of the gourmand rather than the gourmet.”
What made the later banquets even more impressive was the increasing lengths to which Drake had to go to procure his food. Game was far scarcer in 1890 than it was in 1855, particularly around Chicago, which developed rapidly after railroad companies laid tracks around the Chicago River. Instead of hunting in his own backyard, Drake contacted suppliers by telegraph, who would then kill and ship the animals to Chicago in refrigerated rail cars. Each November, the meat would pour in—at great expense—from the Rockies and the Catskill Mountains, the shores of the Chesapeake and the swamps of the Carolinas. “A game dinner,” a Tribune
reporter noted in November 1885, “now means a great deal more than an expert shot and a good cook.”
In 1894, “with the glory of the World’s Fair full upon it,” John Drake decided to discontinue his celebrated game dinner. The advance planning needed to pull all of the ingredients together was just too much for the elderly entrepreneur, who died the following November. Fifteen years later, members of the Union League Club attempted to revive the tradition, throwing a dinner for 300 men that “furnished a menu that compared favorably with those old times” (Tribune
; December 9, 1909).
The reboot lasted just one year. In a letter to a friend, printed by the Tribune
in 1957, Drake-era regular Martha Freeman Esmond said her husband “came home somewhat disappointed,” adding that the Union League’s shindig was “a fair substitute … for those who hadn’t had that privilege” of eating with Mr. Drake.
For those who had, it was only a mediocre imitation.
 New York Times
; November 21, 1886
 “Chicago”; Edward Wagenknecht (1964)
 Chicago Tribune
; November 14, 1894
Ed. note: I'm on deadline this week, so I'm handing the keys over to Jayne Kelley, a Chicago-based writer and editor who graduated last May with a master's degree in Design Criticism from the University of Illinois at Chicago. The piece below, written in 2009 and updated briefly after the passing of Adam Yauch, is the first essay she penned for the program. I think it's really funny and sharp.
"The Beastie Boys' first album should bear a prominent label, not to protect impressionable teens so much as their elders. Warning: certain scenes and references contained herein may seem offensive, even dangerous, until you realize that it's all a colossal joke.”
– Mark Coleman, Rolling Stone review
of Licensed to Ill
, Feb. 26, 1987
“I went inside the deli and my man’s like, ‘What?’
I write the songs that make the whole world suck.”
– Ad-Rock (Adam Horovitz), “Unite,” 1998
Beastie Boys lyrics occupy a not-insignificant portion of my memory’s real estate. Obviously, the fervor with which I acquired this knowledge does not remotely match the likelihood of my being able to use it. An obsessive impulse common in teenage girls offers some explanation, but considering that the Beasties’ back catalog of rap (and hardcore punk) was already 15 years deep when I memorized it, Beatlemaniacs had it easy. Still, the Beastie Boys’ lyrical contributions haven’t gone totally unrecognized: the Oxford English Dictionary credits them with coining “mullet,” a term that has taken on a cultural life all its own.
In retrospect, for a kid who took herself much too seriously, it was the Beasties’ signature sense of humor that propelled my dive into their discography. “The group is well-known for its eclecticism, jocular and flippant attitude toward interviews and interviewers, obscure cultural references and kitschy lyrics, and for performing in outlandish matching suits,” according to Wikipedia
. And indeed, what’s not to like about outlandish matching suits? From red and black adidas track gear to multicolor workman’s one-piece uniforms to sober business attire, the Beastie Boys’ style has always conveyed a commitment to looking good that is somehow ironic and wholly genuine at once. That this 20-year commitment would repeatedly take the form of outlandish matching suits is no accident (and is actually quite impressive). In fact, I’d argue that outlandish matching suits are emblematic of a highly particular, appealing sensibility: what I’ll call the “half-serious.”
The half-serious aesthetic hinges on another career-long consistency that the Boys themselves might deny. 1986’s Licensed to Ill
is often considered one of the first gangster rap albums; alongside stupid, harmless lyrics of hedonistic anthems like “Fight for your Right (to Party)” are more overtly, and confusingly, violent lines, such as “You better keep your mouth shut because I’m fully strapped” (“The New Style”). By 1998’s Hello Nasty
, the group had entirely abandoned the trappings of their blend of punk gangsterism. On “Unite,” Adam Yauch (MCA) asserts, “I don’t like to fight, I don’t carry a piece / I wear a permanent press so I’m always creased.” The idea that it’s Yauch’s perfectly wrinkle-free clothes that give him street credibility comes off as partially self-deprecating but mostly just silly, even as listeners recognize he was nearing 40 and probably did take that kind of care to press his pants. In interviews, Mike Diamond (Mike D) has framed this shift as a progression, an eventual, conscious evolution toward maturity: “Obviously there are moments in the past that you look back at and cringe…but it’s actually a privilege to be able to change and to be making records that reflect that change,” he told SPIN.
It makes sense that the Beastie Boys themselves would understand what they’ve done in this way, and I was certainly aware of that narrative when I became a fan. Still, my own growing up didn’t match this arc at all; because I heard everything at once—beginning with a copy of Paul’s Boutique
I had blindly ordered from BMG Music Service, moving in both directions quickly from there—Hello Nasty felt just as new to me as the puerile gangster posturing that came before. More importantly, all the output seemed equally absurd. To position the 1986 Beastie Boys as somehow incompatible with today’s group misses the point of their project—after all, they’ve kept the same alter egos. For me, it’s wrong to argue that a “mature” straight-faced pun like “You better think twice before you start flossin’ / I been in your bathroom often” (“Crawlspace,” 2004) is categorically different from a line like “My pistol is loaded / I shot Betty Crocker” (“Rhymin’ and Stealin’,” 1986). Both are idiotic, just not in the same way.
The notion that the Beastie Boys are fundamentally idiots originates with the (perhaps apocryphal) title of Rolling Stone’s Licensed to Ill
review. “Three Idiots Create a Masterpiece” suggests these goofy Jewish kids somehow weren’t aware they were making what would become the decade’s best-selling rap album. In truth, it’s hard to know how self-conscious the Boys were. While I don’t believe their success was an accident, the album doesn’t qualify as total parody. The group was thoroughly faithful to hip-hop’s tropes (and thus mostly respected in the rap community); it’s the fact that they’re white boys from Manhattan and Brooklyn that makes their committed engagement with these conceits ridiculous. The Beastie Boys’ early success is evidence that they understood how to exploit their inherent incongruity for maximum aesthetic effect. This embrace of complexity—intentionally taking on two seemingly opposite qualities at once, those of “real” rappers and “fake” rappers—is key to the half-serious approach.
By the time I started to listen to them, the Beastie Boys had largely apologized for their obnoxious behavior and devoted themselves to less foolish pursuits, notably activism for a Free Tibet. The five albums after Licensed to Ill
had enjoyed near-unanimous acclaim; time legitimized the group’s status as hip-hop pioneers. In this context, a half-serious attitude generates another type of idiocy: because their personas are no longer ridiculous, the Beastie Boys must engage rap’s conventions in a facetious way. What was once a threat of murder is now a warning that MCA “will steal your keys and then…check your mail” (“Oh Word?”, 2004). On the same track, Adam Horovitz (Ad-Rock) politely disses an ugly adversary: “Talk about your face, now don’t get pissed / but I suggest you see a dermatologist.” It’s clear that the Beastie Boys are no longer idiots, but their nonsensical plays on typical forms of hip-hop posturing seem just as effortless as their early “accidental” success. Right at the moment when we realize they really are skilled rappers, the Beasties begin to undermine their image by haplessly failing at self-aggrandizement.
As an overachieving, self-doubting teenager, I especially loved how the half-serious approach seemed to obviate a lot of criticism. For me, this sort of calculated remove was a very desirable model to emulate; even though they’re experts, no one would ever accuse the Beastie Boys of trying too hard, which to me was (and in many ways still is) the telltale mark of the uncool. The music video for “Sabotage,” for example, works so well because it’s clear the Boys’ appropriation of the 1970s cop-show look is ironic while their excitement over and dedication to the appropriation is total. I’m obviously not the first to point out how the technique of sampling is analogous to the construction of a persona, the intentional amalgamation of different aspects of personality or culture a rapper adopts. But there’s something so great about the way the Beastie Boys have pulled off this balancing act—something inviting rather than alienating, flexible (always fresh), and above all, good-natured, with a knowing, and self-knowing, sense of humor.
As a model for working, the half-serious doesn’t preclude the actual effort required to make things look easy, but it provides mechanisms for absorbing and reconciling contradictory influences (and even realities) without a fuss. The result is genuine, but not earnest—sophisticated enough to know when sophistication is overrated; familiar, but always a shade unexpected. It’s a little bit like the story of the term “mullet”: the Boys called wide attention to a ridiculous hairdo in what was essentially a very public in-joke (with the song “Mullet Head”), but doing so had a real impact, solidifying the mullet’s cultural import. Anyway, especially now, I’m still listening, memorizing lyrics and taking notes.
 “Heroes and Antiheroes: Beastie Boys,” SPIN
, April 2000, 66.
On October 1, 1935, New York City mayor Fiorello La Guardia
jumped onto the radio and declared war. He warned residents that his crusade would not be “a mere flash”; in addition to an elaborate education campaign, the Republican planned to dispatch special police squads all over the city in an effort to run down perpetrators. This battle had raged, in varying degrees, for two decades. La Guardia hoped his assault would finally put it to rest. “Everyone is vitally affected … by the gigantic common enemy,” he added. “Let’s turn it into a harmless dwarf” (New York Times
; September 29, 1935).
What was the mysterious menace terrorizing America’s largest city? The plague that “has increased to a point beyond all reasonable tolerance,” according to one early observer? Noise. To be more precise, unnecessary noise, flung from cars and nighttime revelers, trucks and tourists. Reformers in Gotham had grown tired of the urban din, and in the mid-1930s, their man in City Hall took it upon himself to silence the five boroughs once and for all.
La Guardia may have been Manhattan’s most prominent anti-noise activist, but he wasn’t the first. That distinction belongs to Julia Barnett Rice
, a New Orleans-born debutante whose unique political career George Prochnik briefly profiles in his 2010 book “In Pursuit of Silence
.” In many ways, Rice was an unlikely campaigner; after studying music and then medicine at Women’s Medical College of the New York Infirmary, she married a successful corporate lawyer and started a family, raising six children inside the couple’s custom-built mansion
overlooking the Hudson River on the Upper West Side. From their secluded perch at Riverside Drive and 89th Street, the Rice clan was insulated from all but the most thunderous downtown clatter. Tugboat horns, on the other hand, were a constant nuisance, particularly in the summer, when the couple propped open their many windows to let in fresh air. Rice didn’t think for a second that all of the toots she heard were actually preventing crashes.
In 1905, two years after moving into “Villa Julia,” Rice used her family’s wealth in a novel way, hiring six Columbia Law students to track the number of whistle blasts pouring up from the Hudson River each night. Their final report—which ran to 33 pages and included testimonials from both police officers and neighbors—charged that tugboat captains “murder sleep and therefore menace health,” shooting off their horns 2,000-3,000 times in a typical evening, often to greet passing ships or servant girls working along the river. It was just the evidence Rice needed. Because the waterway was under federal jurisdiction, the socialite took her report to Washington. With the help of her congressman, U.S. Rep. William Bennett, and letters from hospitals and patients about the health effects of errant noise, Rice convinced lawmakers to pass a bill regulating the “useless and indiscriminate tooting of sirens and steam whistles.” Boisterous skippers along the eastern seaboard would now face aggressive fines, all thanks to Rice.
Her advocacy had struck a nerve. At the turn of the 20th century, city dwellers were subject to more chatter than any population in history, and public health professionals were beginning to understand the severe risks excessive noise carried.* Audible disturbances were now a community problem, requiring a community response. Once the tugboat campaign earned some publicity, Rice was inundated with letters of support from regular citizens and public officials alike; she realized that New York’s patrician class—eager to preserve its genteel way of life while paternalistically protecting the poor and infirm—was ready, in Prochnik’s words, “to extend the battle to an array of targets.” And so in 1906, Rice enlisted friends and acquaintances and launched the Society for the Suppression of Unnecessary Noise, the world’s first anti-noise organization.
For two decades, the aristocrat fought day and night for a silent city, writing op-eds and pamphlets, enlisting the aid of prominent individuals, and urging local politicos to draft and update municipal bylaws. Data played a key role in her organizing—each new offensive was backed by an almost-maniacal number of measurements. To be sure, there were some triumphs, like when Rice—with the help of Mark Twain—convinced thousands of New York schoolchildren to stay quiet while they walked by or played near hospitals. But the rise of the automobile complicated the society’s already difficult work, and its victories were ultimately modest. Prochnik contends that Rice’s determination couldn’t mask one core flaw in her position: “When it comes to noise, how do we tell the necessary from the unnecessary?”
That’s a question New York City’s health commissioner, Shirley Wynne, tried to answer conclusively after Rice’s retirement. In 1929, Wynne established the Noise Abatement Commission, a government-sponsored panel whose task was to quantify the problem of noise. Over the course of six months, staff members—most of whom were experts in science, engineering, or medicine—canvassed huge swaths of the city, using questionnaires, audio recorders, and a “roving noise laboratory” to gather sound-level readings on the street and inside buildings. (The measuring truck alone covered over 500 miles, making 7,000 observations at 113 locations.) Once the voluminous data was organized, NAC members paired their results with scholarly reports detailing the impact of noise on concentration and productivity, publishing City Noise, a massive landmark study of the acoustical intensity of urban life.
What did the NAC find? Street traffic, elevated trains, and subways accounted for 52 percent of New York’s clatter, while construction work, automobile horns, and various other sources made up the rest. Like Rice, members were convinced by the existing medical literature that exposure to constant loud noise could lead to hearing impairment, interfere with sleep, and reduce the efficiency of workers. In framing the problem, however, the NAC took a different approach than its activist predecessor. As Lilian Radovac argues in her fascinating American Quarterly essay “The ‘War on Noise’: Sound and Space in La Guardia’s New York,” the commission considered unwanted sound a technological obstacle, one that could be solved (or at least muted) relatively easily by improving industrial design, reforming construction practices, informing the public, and updating the city’s noise ordinances. If New Yorkers wanted to “do away with unnecessary noise and reduce to a minimum such noises as are necessary,” the commission wrote, they could do so “if they are willing to take a little trouble.”
Initially, the recommendations embedded in City Noise fell on deaf ears. Even though his own administration completed the legwork, Mayor Jimmy Walker—immersed in a corruption scandal that would force him to resign—ignored the directives entirely. The issue only gained political traction when La Guardia, trying to establish his reputation as a reformer, stormed City Hall in 1934 and made noise reduction a priority.
La Guardia’s “War on Noise” was simple in conception. First, city officials, with the help of civic organizations and the police commissioner, would educate the public about what behavior they considered unacceptable—the unnecessary blasting of horns, for example, and attempts to call people to windows by shouting. Residents would then be urged to cooperate with their neighbors and ensure “noiseless nights.” For his part, the mayor made a series of personal commitments, assuring residents that “many miles of [elevated train tracks] will be torn down” and that cabarets in residential districts would be closed.*** He also set up a decibel machine in his own office so he could track the sounds lofting up from the street below. “Most of the city officials including Mayor La Guardia are shouting at the tops of their voices about how quiet everything is going to be when the metropolis gets its noiseless nights,” the Washington Post sarcastically noted that summer (August 17, 1935).
The second plank of La Guardia’s quiet campaign was legal. When the Republican took office, New York’s existing noise bylaws were a total mess—Radovac describes them as “a series of discrete clauses that had accrued over several decades in different sections of the administrative code”—so he signed various executive orders he hoped would serve as an adequate substitute until aides could convince the Board of Aldermen to pass a comprehensive anti-noise ordinance. One prohibited the use of political campaign trucks after 10:30 p.m.; another allowed police to ticket any driver who sounded his horn after 11 p.m. The Noise Abatement Council, a non-profit organization that sprung up after the original NAC disbanded, dubbed this approach “Quiet by Fiat” (New York Times; May 19, 1934) and suggested that 75 percent of unnecessary noise could be eliminated by administrative orders alone. While that estimate was wildly optimistic, it’s clear the restrictions did make an initial impact. During the first four days of the drive, the NYPD issued over 5,000 warnings, and officers wrote more than 1,500 summonses by the end of 1935, a dramatic increase from the year prior. “Several of the more exasperating noises faded,” anti-noise activist Edward Peabody later told The New Yorker (November 22, 1941). “Cats induced to stop yowling, gravediggers laid off hitting the tombstones with shovels, etc.”
By the spring of 1936, La Guardia finally muscled through a revamped ordinance, one that banned 14 types of noise and greatly expanded the powers of the police. In one swoop, the administration’s focus shifted almost entirely from the preventative to the punitive. Those who played their radios too loudly or worked on construction projects at night, among others, would now face a graduated series of fines: $1 for the first infraction, $2 for the second, $4 for the third, and $10 for the fourth. (During the Depression, those tickets were not cheap.) Labor organizers, immigrants, and those reliant on the street-based economy—itinerant musicians, pushcart sellers, junkmen—were disproportionately targeted. In 1938 alone, 16,000 residents were ticketed for violations and another 293,000 were issued warnings. Like two Republican mayors who would follow in his footsteps decades later, La Guardia came to believe that enforcement was the most sensible way to tame the chaos of his hometown. Radovac offers a less generous interpretation; La Guardia, she writes, “conflated the everyday annoyances of city life with criminal acts.”
And in the end, the same enigma that beguiled Julia Rice frustrated La Guardia and his allies: when the line between “necessary” and “unnecessary” commotion is so fuzzy, it’s nearly impossible to convince people that excessive noise is a problem worth solving. Five years after the War on Noise began, Peabody complained that the public was apathetic and that “you couldn’t say things are so hot” in his movement. In the 1940s and early 1950s, a coalition of executives from companies that manufactured soundproofing products organized “National Noise Abatement Week,” an annual awareness campaign that was cynical in its formulation and lackluster in its execution. Eventually, noise pollution fell out of the headlines entirely. As one member of the group would admit to The New Yorker (May 15, 1948), “we feel in our heart rather hopeless about New York.”
*Via the CDC: hearing loss; sleep disturbances; cardiovascular and psychophysiologic problems; performance reduction; annoyance responses; and adverse social behavior.
***“There is no reason why they should annoy the neighborhood with what they call music late at night” (AP; September 10, 1935).
Opponents called it the first step toward socialized medicine. The law was too expensive, they complained, and it violated states’ rights. One woman who testified during a congressional committee hearing even suggested its passage would lead to “bureaucratic control of family life.” Had he been alive at the time, John Roberts surely would have found a way to strike it down in court.
You don’t hear about it much anymore, but the Sheppard-Towner Act—or the “Better Babies Bill,” as some reporters referred to it at the time—was a big f’ing deal. No Congress in U.S. history had ever approved a federally funded social welfare program before S-T came up for debate in the early 1920s; aside from the Volstead Act, it was the most controversial law of its era. A Boston Globe writer summed it up this way: “It ranks next in importance, in the opinion of its advocates, to the legislation which finally gave women the right to vote.” And there are some striking parallels between the fight over Sheppard-Towner and the recent debate surrounding President Obama’s embattled health reform law. With the Supreme Court set to rule on the constitutionality of the Affordable Care Act next month, it’s worth investigating the legacy of its earliest legislative antecedent.
The story begins in 1912, when President Taft created the U.S. Children’s Bureau and hired Julia Lathrop to run it. Housed within the Labor Department, the agency was designed to investigate and report “upon all matters pertaining to the welfare of children and child life among all classes of our people.” Lathrop (pictured above) had been doing essentially the same work for two decades at Chicago’s Hull House, undertaking extensive surveys to document the brutal living conditions in her city’s slums, mental health institutions, orphanages, and poorhouses. She was a natural fit. As one senator’s wife gushed to the Washington Post, choosing Lathrop was “the finest and most just recognition of a woman’s ability, and her place in the nation, that has ever been made by any president” (November 6, 1912).
“Young America’s Aunt”* knew instantly what problem her department should tackle first: infant mortality. When she arrived in Washington, Lathrop’s office launched an eight-city examination of American childbirth habits. The results were startling; the nation’s overall infant mortality rate was a whopping 111.2 per 1,000 live births, higher than almost every other industrialized country in the world. Annually, 250,000 American babies died during their first year, and another 23,000 mothers died during delivery. (Childbirth was the second leading cause of death among women between the ages of 18 and 45, behind tuberculosis.) There was also a correlation between poverty and the mortality rate—for families earning less than $450 annually, one baby in six died before his or her first birthday. Respected Johns Hopkins pediatrician Dr. J. H. Mason Knox made clear at the time that nearly all of those deaths were preventable if families received proper prenatal care. Only 20 percent of expectant moms did.
With firm data in hand, Lathrop set about drafting a piece of legislation that would use federal funds to provide “public protection of maternity and infancy.”** Modeling her proposal on the newly established Smith-Lever Act, which authorized the Department of Agriculture to distribute matching funds to the states for extension work by county agents, Lathrop envisioned a program in which Washington partnered with local nurses, universities, and social workers to subsidize the instruction of mothers on the care of infants. “The bill,” Lathrop wrote, “is designed to emphasize public responsibility for the protection of life just as already through our public schools we recognize public responsibility in the education of children.”
In 1919, U.S. Rep. Horace Mann Towner (R-Iowa) and U.S. Sen. Morris Sheppard (D-Texas) submitted a bill that contained the basics of Lathrop’s proposal. The Hull House veteran wasted no time stumping for her idea. Over the next three years, Lathrop enlisted support wherever she could, relying heavily on women’s associations that were emboldened by the recent extension of suffrage. The Children’s Bureau sponsored “The Year of the Child,” in which the agency appealed to groups across the nation and published catchy graphics to illustrate the country’s poor international standing. Lathrop convinced popular magazines like Good Housekeeping and the Ladies’ Home Journal to editorialize in favor of the measure. Ultimately, 13 of the most powerful women’s groups in America rallied behind Sheppard-Towner, too; in the final weeks of negotiations, the Women’s Joint Congressional Committee—a massive umbrella group—conducted interviews with congressmen at the rate of 50 per day. “It is doubtful,” reported the Globe (December 18, 1921), “if any single piece of legislation enacted by Congress in recent years—apart from equal suffrage—has had the organized influence of so great a body of the citizenship of the country back of it.”
The final version of the bill passed both the House (279 to 39) and the Senate (63 to 7) by wide margins in late 1921, in part because the law was modest in scope. Congress agreed to appropriate just $1.24 million annually (about $15 million today) for the program, with each participating state receiving $5,000 outright and then dollar-for-dollar matching funds as determined by its population. The funding would need to be reauthorized after five years, as well. (The advocates of the bill were confident that half a decade was “sufficiently long to demonstrate the real value of the measure.”) One year after its passage, a reporter for the Detroit Free Press described Sheppard-Towner as “mild and rather helpless” (December 1, 1922). He wasn’t wrong.
But the idea behind the bill, at least in the United States, was revolutionary. Social insurance, in any form, just didn’t exist. In her book “Protecting Soldiers and Mothers,” historian Theda Skocpol writes that Lathrop’s brainchild “extended what was once domestic action into a new understanding of governmental action for societal welfare.” Put another way, the new law was “a fragile seed growing in isolation from the then-traditional health programs.”***
That seed quickly bloomed. Within the first year of implementation, 45 out of 48 states passed enabling legislation to receive matching S-T funds. (Illinois, Connecticut, and Massachusetts never participated.) Each state used its subsidy in different ways; some organized conferences where physicians ran demonstrations on maternal and infant care and hygiene, while others paid nurses to visit new or expectant mothers. However it was deployed, the money went a long way. Between 1922 and 1929, the Bureau distributed over 22 million pieces of literature, conducted 183,252 health conferences, established 2,978 permanent prenatal centers, and visited over 3 million homes. Lathrop’s successor at the Children’s Bureau, Grace Abbott, estimated that one-half of U.S. babies had benefited from the government’s childrearing information.
Not surprisingly, infant mortality dropped precipitously while Sheppard-Towner was on the books. A new working paper published last month by the National Bureau of Economic Research estimates that Sheppard-Towner activities accounted for 12 percent of the drop in infant mortality during the 1920s, with one-on-one interventions creating the most statistically significant results. Combined with rising incomes and better nutrition, preventive health education helped cut the infant mortality rate to 67.6 deaths per 1,000 live births in 1929. Considering how little money Congress actually spent on the law, the results were thrilling.
Not everyone was so excited by the precedent Sheppard-Towner was setting. During the initial debate in Congress, several opponents delivered unhinged criticism of both the bill and its supporters. U.S. Sen. James Reed (D-Missouri) declared (incorrectly) that Sheppard-Towner would permit officials to “invade” the homes of mothers-to-be. “We would better reverse the proposition,” he charmingly added, “and provide for a committee of mothers to take charge of the old maids (at the Children’s Bureau) and teach them how to acquire a husband and have babies of their own.” Not to be outdone, his colleague in the House, U.S. Rep. Henry Tucker (D-Virginia), characterized the bill as an attempt to “make Uncle Sam the midwife of every expectant woman in the United States.” And Mary G. Kilbreth of the National Association Opposed to Woman Suffrage argued that Sheppard-Towner advocates were both “inspired by foreign experiments in Communism” and “connected with the birth-control movement.” A wealthy socialite from Boston went so far as to challenge the law before the U.S. Supreme Court, contending unsuccessfully that it violated the Tenth Amendment.
If ideologues couldn’t rescind the law, doctors had a better shot. The American Medical Association board was initially skeptical of Sheppard-Towner, calling it an “imported socialistic scheme unsuited to our form of government” at its annual meeting in 1922. Four years later, however, the association fully mobilized for the funding reauthorization fight, lobbying Congress and writing letters to the president. It’s clear that many physicians moved to incorporate preventive health education into their private practices only when they saw the benefits of prenatal care play out in new clinics across the country. In a very real sense, the Children’s Bureau had become private medicine’s primary competitor, and thus its own worst enemy.
Desperate to keep their projects operating, directors of the state Sheppard-Towner programs and Abbott cut what the historian Skocpol deemed a “deal with the devil,” agreeing to terminate the law altogether in exchange for two more years of full financial support. In 1929, seven years after reformers printed their first informational flyers, Sheppard-Towner came off the books. Over the next four years, progressives introduced 14 different bills that would have funded maternity and infancy health programs using federal dollars. All of them failed. When the Great Depression hit, most states dropped their existing programs altogether.
The lesson, though, had been learned. And while the United States’ current infant mortality rate is still not where it should be, it’s decidedly safer for babies and mothers now than it was a century ago. For that, we can thank Julia Lathrop and her small, ambitious staff.
*Headline in the Post on June 9, 1912.
**Children’s Bureau’s Fifth Annual Report of the Chief, 1917.
***The Sheppard-Towner Era: A Prototype Case Study in Federal-State Relationships; June 1967.
One passenger train leaves Chicago’s Union Station at 12:35 p.m. and travels west at 85 miles per hour. Another passenger train leaves Chicago’s Union Station at 12:35 p.m., merges onto the same track, and travels at an identical speed two minutes behind the first. What happens when the lead train abruptly stops and the second doesn’t?
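For a sense of scale, a quick back-of-the-envelope sketch (my own arithmetic, not a figure from any of the accident reports): at 85 miles per hour, a two-minute headway works out to nearly three miles of track.

    # Illustrative arithmetic only: how much track separates two trains
    # running at 85 mph, two minutes apart?
    speed_mph = 85
    headway_minutes = 2

    speed_fps = speed_mph * 5280 / 3600            # ~124.7 feet per second
    headway_feet = speed_fps * headway_minutes * 60

    print(f"Speed: {speed_fps:.1f} ft/s")
    print(f"Headway: {headway_feet:,.0f} ft ({headway_feet / 5280:.2f} miles)")
    # ~14,960 ft, or about 2.8 miles -- a cushion that vanishes quickly
    # once the lead train makes an unscheduled stop.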
Naperville is the fifth-largest city in Illinois, a giant and affluent Chicago suburb voted five years ago the second-best place to live in the entire country. It’s home to well-performing schools, green space, and plenty of jobs. “It’s a suburb that does all the suburban things,” says UIC urban planning professor Robert Bruegmann, “but slightly better.”
In the mid-1940s, Naperville was vastly different. Neither entirely urban nor rural, its 5,000 residents worked primarily on farms or at a factory run by the Kroehler Furniture Company. There was a college on the edge of town, but no hospital. The city still hadn’t razed the Pre-Emption House—the oldest continuously operating bar in the state and a vestige of Naperville’s pioneer roots. And the Chicago, Burlington, and Quincy Railroad operated tracks that ran right along 4th Avenue.
On April 26, 1946, around noon, 150 people boarded Burlington’s Advance Flyer, a nine-car “fast train” heading from downtown Chicago to Omaha and Lincoln, Nebraska. Another 175 hopped on the Oakland-bound Exposition Flyer, advertised as “The Scenic Way to California—Thru the Colorado Rockies and the Feather River Canyon by Daylight.” (The trip took two days, with stops in Denver and Salt Lake City.) At the helm of the second train was W.W. Blaine, a 68-year-old engineer who had worked 40 years at the railroad and had operated diesel locomotives since 1933, the first year they were put in service on his line. To be sure, Blaine was old for his job; the railroad’s standard retirement age was 70. But he had passed all of his signal tests, and the Illinois Interstate Commerce Commission ranked Burlington first in safety every year between 1930 and 1944. The passengers on board expected a smooth, relaxing ride into the western plains.
Burlington operated three tracks just west of Chicago’s city limit; the two outside tracks were reserved for freight and commuter trains, while intercity liners used the center track. Since the pair of Flyers were scheduled to depart Chicago at the exact same time, the railroad decided to treat them as one train, letting the Advance Flyer speed along in the lead at a marginally faster pace. Everything went just as planned for about 25 minutes. And then everything went terribly awry; the Tribune would call it a “caprice of fate” (April 26, 1946). Nobody ever figured out what actually happened. But something—a small rock, perhaps, or a piece of metal—shot out from the Advance Flyer’s undercarriage, spooking the engineer enough to force an unscheduled stop near the Naperville station. Slowing down to check the running gear so quickly after taking off was an unusual move, and the crew employed every available safeguard to protect its passengers, setting the emergency control system into operation and sending flagman James Tangey out of the rear car to, in his words, “try to stop the train behind us.”
That proved impossible. Blaine and his Exposition Flyer blew through both a yellow caution and a red stop signal, rounded a curve, and roared past Tangey. Blaine’s fireman, a frightened man named E.H. Crayton, saw the parked train in the distance and leapt from the speeding locomotive, only to hit the ground and die instantly upon contact. Blaine stayed inside and leaned on the brake for as long as he could. A mere 90 seconds after the Advance Flyer rolled to a stop, the Exposition Flyer—chugging along at 45 miles per hour—barreled into its caboose, tore through its roof, and “plunge[d] down with terrific force upon the very floor and trucks of the car” (Tribune). Blaine’s front wheels were sheared off by the impact. “I never heard anything like it before or since to compare it to,” Jim Dudley, then an eighth grader at a nearby school, told the Tribune in a 1988 retrospective. “It was like an explosion.”
Dust, smoke, and debris scattered across the nearby countryside. The smell of ashes hung in the air. “The scene of the disaster,” the Tribune noted later that day, “was one of twisted and gnarled confusion, with huge luxury passenger coaches strewn across torn tracks like abandoned toy trains.” For a few seconds after the collision, the passengers on board made little noise. Then the shock wore off. “A moment of tragic silence was broken,” the AP wrote, “by screams and cries for help from the dying and injured.” The rear of the Advance Flyer absorbed the bulk of the damage—most of those sitting in the rear coach and diner car were killed straightaway. Those seated further up the train escaped the worst, but were rocked nonetheless. “Things happened so fast,” one passenger said, “that I don’t remember what happened to me. I was doubled up suddenly and my knees were pushed against my chest.”
Startled by the clamor, all 800 employees at the Kroehler Furniture factory ran out to help. So did 50 students studying at North Central College. A police officer nearby made a series of frantic phone calls, recruiting doctors, nurses, and ambulances from neighboring towns. Within a matter of minutes, a full-blown rescue crew was assembled. They worked feverishly, but the task of pulling out bodies from the wreckage proved difficult. To reach the injured and dead, the police were forced to burn through the train plates using acetylene torches; eight hours after the crash, the authorities still hadn’t cut through every upturned car.
Those who were fished out were carried into the Kroehler warehouse—set up as a temporary hospital—on mattresses, because Naperville didn’t even own stretchers at the time. Miraculously, Blaine survived, crawling out through his cab’s window before making his way to first aid, where he was treated for a skull fracture. Others weren’t so lucky. Delbert Boon, a sailor from Missouri, was rushed to a hospital in adjacent Aurora, where he sent a cryptic telegram to his parents: “Come and see me. Was in train accident.” He died 30 minutes later.
It took 27 hours to clear one of Burlington’s three tracks, and three days to remove the entirety of the rubble. While crews worked, thousands of curious locals jammed Naperville’s highways and streets to catch a glimpse of the disaster. In total, 47 people eventually lost their lives in the accident, while another 125 were injured. It was, and still is, one of the worst crashes in state history.
So what the hell happened? Burlington surveyed its automatic signal systems right away and found that their lights had indeed functioned properly. From his hospital bed, Blaine—charged with manslaughter by state’s attorney Lee Daniels* to ensure he appeared at an inquest—insisted he saw the yellow caution and applied his brakes at once, but couldn’t slow the train down in time because he was moving too fast and his train was too light. (The Exposition was pulling nine cars that day, instead of its usual haul of 12.)
His crew mates weren’t convinced. At a public hearing set up by Burlington officials (and assailed by Blaine’s lawyer) on April 28, a road foreman testified that he inspected the locomotive shortly after the wreck and found the brake valve in the “service” position, not the “emergency” position. The Exposition’s conductor went so far as to say he noticed “no application of brakes whatsoever.” Brakeman C.W. Norris agreed with the foreman, telling his bosses that “there was never any emergency application the day of the accident.”
To test this hypothesis, the ICC and Burlington ran a series of simulations on the Naperville track a week after the crash, using a diesel train that paralleled the Exposition in length and weight. Speeding along at 85 miles per hour, a different (and younger) engineer applied the brakes immediately when he saw the yellow light and was able to slow his train to a stop 934 feet from the rear of the standing Advance Flyer. During the final test, in which he applied both service and emergency brakes when he saw the red light, he still nearly avoided contact, stopping with the engine and just one car past the collision point. The evidence did not reflect well on Blaine.
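Those test figures invite a rough sanity check. A constant-deceleration model (the braking rates below are my assumptions for illustration, not numbers from the ICC inquiry) shows that stopping a train from 85 miles per hour consumes roughly half a mile to a mile of track, which is why the yellow caution, well before the red, was the engineer’s real margin for error.

    # Rough constant-deceleration model; the braking rates are assumptions
    # chosen for illustration, not figures from the ICC report.
    v = 85 * 5280 / 3600                  # 85 mph in feet per second (~124.7 ft/s)

    for a in (1.5, 2.0, 3.0):             # assumed braking rates, ft/s^2
        distance = v ** 2 / (2 * a)       # stopping distance: d = v^2 / (2a)
        seconds = v / a                   # time to stop: t = v / a
        print(f"a = {a} ft/s^2: stops in {distance:,.0f} ft "
              f"({distance / 5280:.2f} mi) over {seconds:.0f} s")
    # Even the most aggressive rate here needs about half a mile of track,
    # so an engineer who runs the caution has almost no room left to stop.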
In the end, though, the embattled engineer was absolved of major blame by both the ICC and a DuPage County grand jury. In an October verdict, the latter declined to take action against the Burlington railroad or the crews of either train, instead charging everyone involved with nine “negligent acts,” ranging from improper scheduling to poor intercommunication between conductors. Rule changes followed: the ICC mandated in 1951 that trains were only permitted to exceed 79 miles per hour if automatic train stop equipment was in place, and most rail agencies still don’t mix cars of different weights on the same train. Blaine retired shortly thereafter.
Cult street photographer Charles Cushman was on hand to document the grisly scene. His photos, along with the rest of his work, are hosted online by Indiana University. Also keep an eye out for Naperville resident Chuck Spinner’s upcoming book, which will detail the stories of the victims.
*Daniels would later serve as the Speaker of the House in Illinois.
Derrick Rose needs to get healthy, quickly. Chicago’s star guard has missed over 40 percent of his team’s games this season because of nagging injuries, the latest of which is causing a wave of panic to wash over otherwise-optimistic Bulls fans who fear that bumps and bruises could spoil what’s been a dominant season thus far. Sure, the Bulls carry the deepest bench in the league, yet a poor run of form (and common sense) suggests their GQ cover boy must play at a high level if they have any shot at winning the NBA title. “I’m just trying to survive,” Rose joked earlier this week. So are we, Derrick. So are we.
Thankfully, Rose and other modern athletes now have at their disposal a ton of sophisticated medical procedures and medications to help the body heal, from physical therapy and acupuncture to cortisone injections and advanced surgeries. Professionals—whose livelihood depends on properly functioning arms and legs—will even spend thousands and thousands of dollars on remedies that have not yet passed clinical trials, like platelet-rich plasma therapy, in which the “patient’s own tissues are extracted, carefully manipulated, and then reintroduced to the body.” There are obvious risks in stepping back onto the field or court after undergoing experimental treatments—just ask the owners of drugged thoroughbreds. But with the biological clock ticking, the more options available, the better.
Dr. George Bennett, a sports medicine pioneer, would be thrilled to see these innovations. Born in the Catskill Mountains in 1885, Bennett was himself a solid baseball player, landing a roster spot on a local semi-pro team by the age of 16. (Friends later described him as a “rather undisciplined little tough guy.”) But medicine was Bennett’s true passion. After high school, he worked a series of odd jobs throughout the Midwest, stashing away his earnings to pay for medical school tuition. Bennett eventually matriculated at the University of Maryland, graduated in 1908, and landed a job at the Johns Hopkins Hospital two years later. He was 25.
It was an interesting time for a sports fan to enter the field, such as it was. “Sports medicine,” as we understand it today, was in no way a recognized discipline. In the locker room, “it was considered effete and unnecessary to have a doctor in attendance” (Washington Post; March 10, 1962), and trainers—most of whom had no science background—applied the lion’s share of treatments, which often meant rubbing sore muscles with balms. At the same time, doctors were starting to use x-rays with more regularity, producing detailed images of the body without having to penetrate the skin physically. If an entrepreneurial physician studied how the athlete’s body works and used that knowledge to create procedures that sped up recovery times, he could give daring ballplayers a competitive advantage while making a tidy profit for himself.
So Bennett pored over x-rays, starting with baseball pitchers. And what he found was troubling. While baseball players were subject to the same disabilities as the average laborer, repeating the overhand throwing motion over and over dramatically increased the frequency of degenerative joint injuries. The ligaments, tendons, and muscles in the human arm are just not designed to exert the pressure necessary to propel a baseball 60 feet at rapid speeds, much less make it curve in flight. “Pitching,” Bennett would famously say, “is a most unnatural motion.”
Bennett penned an article in the American Physical Education Review in 1925 laying out, in plain detail, the case that pitching can create long-term structural damage. He followed that piece with another influential article in 1941, titled “Shoulder and Elbow Lesions of the Professional Baseball Pitcher,” which included x-ray photos and a controversial suggestion that pitchers use a side-arm delivery (like Walter Johnson’s) to lengthen their careers. It seems obvious now, but the conclusion was revelatory at the time; pitcher workloads didn’t begin to drop dramatically until the mid-1920s, after Bennett’s first paper was published.
While he studied joints in the lab, Bennett simultaneously built a successful practice, which he would leave Johns Hopkins to run full-time in 1947. Over time, the doctor garnered what sports columnist Red Smith called “the enviable and deserved reputation for remantling athletes” (Baltimore Sun; May 28, 1950). Famous ballplayers liked him for a number of reasons: he was clearly bright, he took sports seriously, and he was not afraid to take orthopedic chances if a client requested it. Most importantly, he kept his mouth shut; an AP reporter once joked that the only two words the humble Bennett ever said in public were “operation successful.”
Over the course of his career, Bennett opened up stars like Joe DiMaggio, Dizzy Dean, Lefty Gomez, Pee Wee Reese, and Johnny Unitas.* (Clark Gable and Lord Halifax sought out his counsel, too.) With the help of a colleague at Hopkins, he also invented the first batting helmet, a hat designed with a specialty zipper pocket that held two hard plastic slabs. And once in a while, he worked miracles.
The career of Roy Sievers (pictured above) is an instructive example. A hulking left fielder, Sievers won the American League Rookie of the Year award in 1949, hitting .306 and slugging 16 home runs for the St. Louis Browns. But in 1951, after struggling during his sophomore season, he broke his right collarbone diving for a ball in the outfield. The next spring, he dislocated the same shoulder making a throw across the diamond. His career appeared finished. Then Sievers visited Bennett. In what the doctor described as “an experiment,” he drilled a hole in Sievers’ bone, cut his tendons, slipped them through the opening, and knotted them together on the other side to keep the bone from rolling out of the shoulder socket. The procedure drastically limited Sievers’ throwing power, forcing a positional move to first base. While supportive of the initial operation, Browns president Bill Veeck and his colleagues in the front office weren’t convinced he would return to form, so they shuffled him off to Washington in a trade for the unremarkable Gil Coan.
This turned out to be a mammoth mistake; Sievers gradually redeveloped strength in his arm and subsequently took the majors by storm, blasting over 20 home runs in nine straight seasons. His best year came in 1957, when Sievers finished third in the AL MVP race, logging 42 home runs and an on-base plus slugging percentage of .967. According to Bennett, Sievers’ recovery was a “miracle of modern medicine” (Washington Post; September 20, 1957). The Senator agreed; during an awards dinner for Bennett the following year, Sievers came up to the doctor with a tear in his eye and thanked him for saving his career. Red Smith aptly described Bennett’s enduring reputation: “This sort of thing has become such a familiar story—the halt and lame of sports have been shuffling off to Baltimore for so long now and in such numbers—that a newspaper reader might be excused if he got the notion that Dr. Bennett had invented the practice of medicine.”
Bennett died in 1962, so he didn’t get to see the creative surgical work of the doctors who followed in his wake. That includes Frank Jobe, who successfully reconstructed Tommy John’s elbow and launched a medical revolution. But Bennett’s impact on sports, and the medical profession more broadly, was undeniable. NFL commissioner Roger Goodell might even want to revisit the doctor’s thoughts on football, broadcast in an AP interview on December 18, 1947: “The present helmet is simply equipping a player with armor and the steel mask in front is an open invitation to crush someone’s jaw or knock his teeth out,” he said. “The toll of injuries will continue to mount unless the face mask is legislated out of the game immediately.”
Prescient words from a thoughtful man.
*“After listening to that all-star team of players Dr. Bennett has mended,” Joe Garagiola said at an awards dinner in 1958, “I’m sort of sorry I didn’t break my leg.”