George Gallup, the father of modern opinion polling, ran his first political survey because his mother-in-law asked him to.
The year was 1932, and Iowa’s Democratic party bosses had just placed Ola Babcock Miller on the ticket as candidate for Secretary of State. It was a token gesture. Miller—a former teacher and amateur painter—had only modest political experience, but her late husband Alex had edited the only important Democratic newspaper in the traditionally Republican state, and his friends in the party thought nominating his suffragette widow for statewide office would be a worthy tribute. She was flattered, but overwhelmed; early in the race, Miller described her campaign as “a martyrdom to the cause.”
She didn’t take the responsibility lightly, though. Instead, Miller enlisted the help of her son-in-law, George. Raised by an Iowa dairy farmer and land speculator, Gallup had spent the past decade at the University of Iowa and Drake University studying the reading patterns of American newspaper and magazine consumers. Using Gallup’s survey method, revolutionary at the time, an interviewer would take readers through the pages of a newspaper’s most recent issue, glean how much or how little of it they read, and then break down the items by category so publishers could learn what types of features drew significant eyeballs.* His doctoral dissertation was titled, quite frankly, “A New Technique For Measuring Reader Interest in Newspapers,” and a host of Midwestern rags eventually hired him to evaluate their products. His mother-in-law, looking for any advantage she could secure, wondered whether Gallup’s technique could be used to figure out what issues mattered most to her potential constituents.
It was an insightful idea. Throughout the fall, George knocked on doors, asking voters what they valued in a candidate. (Highway safety, he found, was a top priority.) He passed those details on to Miller, who worked them into her campaign stump speeches. Riding the coattails of Franklin Delano Roosevelt, who took home 2,225 out of 2,435 precincts in the Hawkeye State that November, the widow upset her opponent by less than 3,000 votes, becoming one of the first Democrats since the Civil War to win statewide office in Iowa. She had Gallup to thank for the extra edge.
Gallup’s own experience on the trail was revelatory. He had come to the conclusion during his academic training that people “arrive at their preferences and opinions in an orderly way” (Journal of Marketing; October 1962). It followed that one could get a firm understanding of the public’s political desires if he or she asked the right number of Americans the right questions. And if voters’ collective intentions were pure and preferences widely known, politicians would have no choice but to bend to their will. Democracy would flourish. The trick was figuring out an efficient and accurate technique to ascertain what the public actually wanted.
Political junkies had taken straw polls** for decades, but they invariably lacked statistical rigor. (“Of the 16 people on this cable car, 12 are Harding men.”) In the early 20th century, the most comprehensive and influential practitioner was Literary Digest, a weekly general interest magazine. During each presidential election cycle, starting in 1916, its editors mailed out ersatz ballots to its subscribers (and eventually to publicly listed names of people who owned telephones and cars) and solicited responses. Heaps of data poured back into the magazine’s offices, where hundreds of clerks tallied the results. Saturation was the name of the game; if enough Americans were contacted, the thinking went, the nation’s predilections would inevitably emerge. One or two weeks before Election Day, the magazine would release its final forecast. In the first five races it tracked, Literary Digest’s data spit out the correct winner five times.
Gallup knew there was a better way. In December of 1933, after moving to New York City to head up the research arm of the advertising agency Young and Rubicam, Gallup and his colleagues began conducting nationwide opinion polls, employing the sampling method, which he likened to government inspectors who test wheat. (“They take a sample here, another there,” he told the Washington Post on December 8, 1935, “and by choosing the samples properly are able to judge the quality of the whole amount from what the sample shows.”) Along with people who owned cars and phones, Gallup hired hundreds of staffers to interview laborers and persons on public aid in all 48 states, folks who intended to vote but whose names could not be traced back to assets they owned or luxury magazines to which they subscribed. (This was the height of the Depression, so that demographic was significant.) The firm deliberately surveyed a cross-section of Americans it felt was representative of the electorate at large: farmers and city dwellers, rich and poor, Republicans and Democrats. The approach seems intuitive, but nobody in politics had ever tried it before.
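The statistical logic behind Gallup’s bet is easy to demonstrate. Below is a minimal Python sketch (with invented, illustrative numbers, not the actual 1936 data) of why a small representative sample beats an enormous but skewed one:

```python
import random

random.seed(1936)

# Hypothetical electorate: phone/car owners (a wealthier minority)
# lean against the incumbent; everyone else leans toward him.
# All percentages here are assumptions chosen for illustration.
N = 1_000_000
electorate = []
for _ in range(N):
    wealthy = random.random() < 0.30          # owns a phone or a car
    p_incumbent = 0.40 if wealthy else 0.70
    electorate.append((wealthy, random.random() < p_incumbent))

def support(sample):
    """Percent of a sample backing the incumbent."""
    return 100 * sum(vote for _, vote in sample) / len(sample)

# Literary Digest-style "saturation" poll: huge, but drawn only
# from the phone-and-car crowd.
skewed = random.sample([v for v in electorate if v[0]], 200_000)

# Gallup-style quota poll: tiny, but mirrors the electorate's mix.
representative = random.sample(electorate, 3_000)

print(f"True support:           {support(electorate):.1f}%")
print(f"200,000 skewed ballots: {support(skewed):.1f}%")
print(f"3,000 representative:   {support(representative):.1f}%")
```

No matter how many ballots the skewed poll collects, it confidently calls the race for the wrong man; the representative handful lands within about a point of the truth.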
After gathering data for two years, Gallup created a new organization—the American Institute of Public Opinion (AIPO)—and published its initial findings in a syndicated newspaper column he named “America Speaks.” Within the year, over 60 newspapers had purchased the package, which Gallup updated with new information weekly. Columnist Franklyn Waltman dubbed it “the periodic gauging of the public heartbeat” (Washington Post; July 13, 1936). The concept of a running poll was so novel that some newspapers had to convince readers that the information Gallup presented did not reflect the political views of their editorial board.
Gallup didn’t hide his modest disdain for Literary Digest’s survey work, either. A few months before that magazine was set to release its enormous report, the Iowa native predicted in one of his columns that his competitor’s data would skew toward Republican challenger Alf Landon, despite what he considered strong evidence that the incumbent, President Roosevelt, held a commanding lead. (As early as November 24, 1935, Gallup had identified 30 states in which the Democrat was garnering at least 53 percent support, a bloc that would have given him enough electoral college votes (269) to win without securing even one “borderline state.”) Wilfred Funk, LD’s editor, clapped back, calling Gallup’s prediction “a gratuitous statement.” His staff had “never been able to discover how many ‘rich men, poor men, G-men, racketeers, and candlestick makers’ voted in a given election,” he added sarcastically (New York Times; July 19, 1936).
They should have tried harder. After mailing out 10 million postcards and receiving a whopping 2.4 million responses, Literary Digest projected that Landon would upend Roosevelt with 370 electoral votes. Gallup’s last “America Speaks” column of the cycle argued that Roosevelt could take home anywhere from 315 to 477 electoral votes. The president carried 46 of the 48 states and won with 523.
Funk and company had no explanation for their poor showing. “We were far from correct,” they wrote in their first post-election issue. “Why? We ask that question in all sincerity, because we want to know.” Refusing to admit that they had over-sampled wealthier, Republican-leaning voters, Literary Digest ultimately just threw its hands up in the air: “So we were wrong, although we did everything we knew to assure ourselves of being right.” It was the last poll they’d conduct; after promising to try their luck again four years down the road, the magazine merged with a competitor in 1937, Funk fled for Reader’s Digest, and the whole enterprise sank the following year. The Washington Post was so thrilled with the accuracy of its partner that the paper printed, on two different days, letters from “political and business leaders” praising Gallup’s work; according to the chairman of the Federal Trade Commission, AIPO “undoubtedly demonstrated its superiority over other devices for prejudging the results of the election” (November 5, 1936). Gallup used the high-profile victory as a launching pad, eventually building with his son an international network of research offices that ask private citizens, to this day, what they are thinking. Political polling, as is painfully obvious this time of year, took on a life of its own, for better and worse. As Jason Zengerle wrote in New York last week, “the polling industry has never been less confident in its ability to reduce a series of interviews to a number that is an accurate reflection of the opinions and future behavior of the populace.” Pollsters, it turns out, are still grasping for that elusive representative sample.
As for Gallup’s mother-in-law? She was re-elected for two additional full terms, in 1934 and 1936, before she succumbed to pneumonia at the age of 57. Her most meaningful contribution to Iowa public life was the establishment of a state highway patrol unit in 1934, one that boasted 15 inspectors and helped reduce the highway accident rate considerably. More often than not, it helps to know what voters want.
*One of Gallup’s early discoveries was that people loved to look at pictures and comic strips. According to the Journal of Marketing (October 1962), “advertisers, and particularly Ralph Starr Butler of General Foods, seized on these findings, and the first comic-strip advertising resulted.”
**Via the Wall Street Journal, straw polls were “named for the way farmers threw a fistful of straw in the air to see which way the wind was blowing.”
It was the moment she had worked toward for years. On June 25, 1924, Lena Springs—a decorated 41-year-old academic and political activist from Lancaster, South Carolina—was set to address the Democratic National Convention in New York’s famed Madison Square Garden. Her task was routine but symbolically important: Springs was to submit a report for the committee on credentials,* of which she was chairman, the first woman to hold that title. In a wide-ranging interview published by the Washington Post six weeks later (August 10, 1924), Springs admitted that she was “not one of those who think a woman needs to submerge her femininity in order to be a politician,” so she dressed up for the occasion, donning a crepe meteor dress and a “very Parisian and very chic” straw leghorn hat (New York Times; June 26, 1924). As she walked purposefully across the stage and approached the lectern on that sweltering summer day, the convention’s band leader cued up a song he felt encapsulated all that Springs had to offer the party of Jackson and Wilson. It was an old ragtime ditty titled “Oh You Beautiful Doll.”
Two days later, a group of delegates formally placed Springs in nomination for the office of Vice President. No woman at a major party convention had ever received that distinction, one that “heralded the time, according to the fair sex, when one of their number will be entered in the lists for the Presidency itself” (Post). She was honored and humbled by the gesture; the leading woman Democrat of the South, as a colleague described her, promised she would not treat the nomination “lightly.” Springs didn’t win, but her 15 minutes in the spotlight—zealously documented by news and style reporters of the time—show how much progress we’ve made in equalizing political opportunities for women, and how much further we still have to go.
Lena Jones was born in 1883 in Pulaski, Tennessee. She took to school instantly, first in the Dallas-area private schools where she studied for the first decade of her life and later at Sullins College, where she graduated with a B.A. in English. Like few other women of her era, Jones continued her scholarship in post-graduate programs at Virginia College in Roanoke and Presbyterian College in South Carolina, eventually earning a prestigious D. Litt. and landing a job as head of the Queens College (North Carolina) English Department, all before her 30th birthday.
Love interrupted her academic ascent. In 1913, Jones married Leroy Springs, a cotton mill owner and one of the most prominent industrialists of the South. She adopted her husband’s name, moved 50 miles south to Lancaster, and redirected her energy into politics, taking on key roles in the South Carolina Equal Suffrage League and the South Carolina Federation of Women’s Clubs. During the Great War, Springs organized the Lancaster Red Cross, establishing two emergency hospitals in her town to treat influenza. Once the 19th Amendment was ratified, she eagerly cast her ballot for the Democratic slate, and ultimately secured a position as Democratic National Committeeman.** The convention in 1924—the first held since women were extended the franchise—was an event Springs would not miss.
She wanted plenty of women to join her, too. Springs had spent a portion of her spring alongside national Democratic vice chairman Emily Newell Blair lobbying party members to increase the number of delegates-at-large from four to eight, with the explicit intention of reserving half of the new slots for women. (Just one female Democrat served as a delegate at the DNC’s 1908 convocation in Denver, though the number of lady envoys increased slowly over the next 16 years.) The provision passed. When Springs arrived in New York, another 475 female delegates and alternates were in attendance—almost one-fourth of the entire membership, and the largest delegation of women ever to participate officially in a U.S. political convention.
The Democratic women came from all walks of life—elected officials, Tammany Hall leaders, school teachers, business women, housewives—and made clear there was “no one outstanding issue on which [they], as a group, will make a fight” (Chicago Tribune; June 23, 1924). What they wanted was to be treated seriously as political thinkers and organizers. Right away, they launched a campaign to gain control of a major committee chairmanship, canvassing delegations for pledges “in a professional way” (AP; June 22, 1924). It was a display of strength, and Springs was the beneficiary of this exerted clout, taking the reins of the Credentials Committee as the convention got underway. Not surprisingly, she handled the job like a veteran, “winning the plaudits of her colleagues not only for her ability but her courtesy and fairness” (Times; June 28, 1924). Her first appearance on stage was a direct result of her skillful work behind the scenes.
And how did reporters depict the South Carolina suffragette upon her initial foray into national politics? Like the band that played her on stage, the press overwhelmingly treated Springs like little more than a pin-up model, or “an ornament to the convention,” as the Boston Globe so delicately put it (July 6, 1924). Even after watching some members of the media objectify Vice Presidential nominee Sarah Palin four years ago, the reports from 1924 are jarring. Springs was “a highly personable and sumptuously-draped young woman” with “a complexion which Helen of Troy might envy.” She had “coal black hair and eyes, and a dynamic, scintillating vitality.” Her dresses and jewels were “particularly lovely and becoming,” catching the eyes of fashion editors, “who have been shown that the women from the Southland know how to wear clothes as well as the women on Fifth Ave.” Springs, in short, was “distinctly the type that men ‘fall for’ good and hard.”
The coverage got even more ridiculous when 18 of Springs’ fellow South Carolina delegates—as a way to honor her service to their state and national party—nominated her for the vice presidency. The former English professor understood that her chances were minuscule; 29 other men were nominated, and she’d never held elected office. But even the modest tribute was partially spoiled by rampant sexism. It took two forms: Springs’ nomination was understood as either a novelty to mock or a nightmare to fear. With a creepy wink and nod that practically leaps off the page, Senator Burton Wheeler (D-MT) provided a clear example of the former, noting that Springs “would most certainly improve the situation in the Senate” (Times; June 28, 1924). Her (jackass) husband Leroy joined the latter camp, initially telling reporters that he “did not take kindly to the idea of having his wife’s name ‘coupled’ with that of another man’s” (Globe), and adding two weeks later that he would “leave for some other country” on the day a woman won the presidency (Post). What would become of the United States if a woman took a seat in the vice presidential chair? According to the Post, “speculation [on the convention floor] ranged from making every day a Mother’s day to appointing the head of the Thursday Mah Jong club as secretary of war.”
Springs, for her part, said she would keep a low profile and “try to keep the Senate from getting into too great a parliamentary tangle.” (Where is she when we need her?) And while she maintained that no woman would lead a major party until “men get over this old-fashioned jealousy or sex superiority complex,” she was optimistic about the future, despite the treatment she herself received. “The day is not very far off,” she told the Post, “when a woman will be President of the United States.”
Springs took home 2.7 percent of the vote at the 1924 convention, not enough to eclipse eventual nominee Charles Bryan. Afterwards, she stayed modestly active in local affairs before moving to New York following her husband’s death in 1931. Nine years later, at the age of 59, she passed away, the victim of a sudden heart attack. And with no women on either 2012 ticket, her wish to see a female president remains unfulfilled.
*The credentials committee confirms the identity of the party’s delegates and their authority to vote.
**The Republican Party, she would later say, “stands for special privilege.”
Opponents called it the first step towards socialized medicine. The law was too expensive, they complained, and it violated states’ rights. One woman who testified during a congressional committee hearing even suggested its passage would lead to “bureaucratic control of family life.” Had he been around at the time, John Roberts surely would have found a way to strike it down in court.
You don’t hear about it much anymore, but the Sheppard-Towner Act—or the “Better Babies Bill,” as some reporters referred to it at the time—was a big f’ing deal. No Congress in U.S. history had ever approved a federally funded social welfare program before S-T came up for debate in the early 1920s; aside from the Volstead Act, it was the most controversial law of its era. A Boston Globe writer summed it up this way: “It ranks next in importance, in the opinion of its advocates, to the legislation which finally gave women the right to vote.” And there are some striking parallels between the fight over Sheppard-Towner and the recent debate surrounding President Obama’s embattled health reform law. With the Supreme Court set to rule on the constitutionality of the Affordable Care Act next month, it’s worth investigating the legacy of its earliest legislative antecedent.
The story begins in 1912, when President Taft created the U.S. Children’s Bureau and hired Julia Lathrop to run it. Housed within the Labor Department, the agency was designed to investigate and report “upon all matters pertaining to the welfare of children and child life among all classes of our people.” Lathrop had been doing essentially the same work for two decades at Chicago’s Hull House, undertaking extensive surveys to document the brutal living conditions in her city’s slums, mental health institutions, orphanages, and poorhouses. She was a natural fit. As one senator’s wife gushed to the Washington Post, choosing Lathrop was “the finest and most just recognition of a woman’s ability, and her place in the nation, that has ever been made by any president” (November 6, 1912).
“Young America’s Aunt”* knew instantly what problem her department should tackle first: infant mortality. When she arrived in Washington, Lathrop’s office launched an eight-city examination into American childbirth habits. The results were startling; the nation’s overall infant mortality rate was a whopping 111.2 per 1,000 live births, higher than in almost every other industrialized country in the world. Annually, 250,000 American babies died during their first year, and another 23,000 mothers died during delivery. (Childbirth was the second leading cause of death among women between the ages of 18 and 45, behind tuberculosis.) There was also a correlation between poverty and the mortality rate—for families earning less than $450 annually, one baby in six died before his or her first birthday. Respected Johns Hopkins pediatrician Dr. J. H. Mason Knox made clear at the time that nearly all of those deaths were preventable if families just received proper prenatal care. Only 20 percent of expectant moms did.
With firm data in hand, Lathrop set about drafting a piece of legislation that would use federal funds to provide “public protection of maternity and infancy.”** Modeling her plan on the newly established Smith-Lever Act, which authorized the Department of Agriculture to distribute matching funds to the states for extension work by county agents, Lathrop envisioned a program in which Washington partnered with local nurses, universities, and social workers to subsidize the instruction of mothers on the care of infants. “The bill,” Lathrop wrote, “is designed to emphasize public responsibility for the protection of life just as already through our public schools we recognize public responsibility in the education of children.”
In 1919, U.S. Rep. Horace Mann Towner (R-Iowa) and U.S. Sen. Morris Sheppard (D-Texas) submitted a bill that contained the basics of Lathrop’s proposal. The Hull House veteran wasted no time stumping for her idea. Over the next three years, Lathrop enlisted support wherever she could, relying heavily on women’s associations that were emboldened by the recent extension of suffrage. The Children’s Bureau sponsored “The Year of the Child,” in which the agency appealed to groups across the nation and published catchy graphics to illustrate the country’s poor international standing. Lathrop convinced popular magazines like Good Housekeeping and the Ladies Home Journal to editorialize in favor of the measure. Ultimately, 13 of the most powerful women’s groups in America rallied behind Sheppard-Towner, too; in the final weeks of negotiations, the Women’s Joint Congressional Committee—a massive umbrella group—conducted interviews with congressmen at the rate of 50 per day. “It is doubtful,” reported the Globe (December 18, 1921), “if any single piece of legislation enacted by Congress in recent years—apart from equal suffrage—has had the organized influence of so great a body of the citizenship of the country back of it.”
The final version of the bill passed both the House (279 to 39) and the Senate (63 to 7) by wide margins in late 1921, in part because the law was modest in scope. Congress agreed to appropriate just $1.24 million annually (about $15 million today) for the program, with each participating state receiving $5,000 outright and then dollar-for-dollar matching funds as determined by its population. After five years, the funding would need to be reauthorized as well. (The advocates of the bill were confident that half a decade was “sufficiently long to demonstrate the real value of the measure.”) One year after its passage, a reporter for the Detroit Free-Press described Sheppard-Towner as “mild and rather helpless” (December 1, 1922). He wasn’t wrong.
But the idea behind the bill, at least in the United States, was revolutionary. Social insurance, in any form, just didn’t exist. In her book “Protecting Soldiers and Mothers,” historian Theda Skocpol writes that Lathrop’s brainchild “extended what was once domestic action into a new understanding of governmental action for societal welfare.” Put another way, the new law was “a fragile seed growing in isolation from the then-traditional health programs.”***
That seed quickly bloomed. Within the first year of implementation, 45 out of 48 states passed enabling legislation to receive matching S-T funds. (Illinois, Connecticut, and Massachusetts never participated.) Each state used its subsidy in different ways; some organized conferences where physicians ran demonstrations on maternal and infant care and hygiene, while others paid nurses to visit new or expectant mothers. However it was deployed, the money went a long way. Between 1922 and 1929, the Bureau distributed over 22 million pieces of literature, conducted 183,252 health conferences, established 2,978 permanent prenatal centers, and visited over 3 million homes. Lathrop’s successor at the Children’s Bureau, Grace Abbott, estimated that one-half of U.S. babies had benefited from the government’s childrearing information.
Not surprisingly, infant mortality dropped precipitously while Sheppard-Towner was on the books. A new working paper published last month by the National Bureau of Economic Research estimates that Sheppard-Towner activities accounted for 12 percent of the drop in infant mortality during the 1920s, with one-on-one interventions creating the most statistically significant results. Combined with rising incomes and better nutrition, preventive health education helped cut the infant mortality rate to 67.6 deaths per 1,000 live births in 1929. Considering how little money Congress actually spent on the law, the results were thrilling.
Not everyone was so excited by the precedent Sheppard-Towner was setting. During the initial debate in Congress, several opponents delivered unhinged criticism of both the bill and its supporters. U.S. Sen. James Reed (D-Missouri) declared (incorrectly) that Sheppard-Towner would permit officials to “invade” the homes of mothers-to-be. “We would better reverse the proposition,” he charmingly added, “and provide for a committee of mothers to take charge of the old maids (at the Children’s Bureau) and teach them how to acquire a husband and have babies of their own.” Not to be outdone, his colleague in the House, U.S. Rep. Henry Tucker (D-Virginia), characterized the bill as an attempt to “make Uncle Sam the midwife of every expectant woman in the United States.” And Mary G. Kilberth of the National Association Opposed to Woman Suffrage argued that Sheppard-Towner advocates were both “inspired by foreign experiments in Communism” and “connected with the birth-control movement.” A wealthy socialite from Boston went so far as to challenge the law before the U.S. Supreme Court, contending unsuccessfully that it violated the Tenth Amendment.
If ideologues couldn’t rescind the law, doctors had a better shot. The American Medical Association board was initially skeptical of Sheppard-Towner, calling it an “imported socialistic scheme unsuited to our form of government” at its annual meeting in 1922. Four years later, however, the association fully mobilized for the funding reauthorization fight, lobbying Congress and writing letters to the president. It’s clear that many physicians moved to incorporate preventive health education into their private practices only when they saw the benefits of prenatal care play out in new clinics across the country. In a very real sense, the Children’s Bureau had become private medicine’s primary competitor, and thus its own worst enemy.
Desperate to keep their projects operating, directors of the state Sheppard-Towner programs and Abbott cut what the historian Skocpol deemed a “deal with the devil,” agreeing to terminate the law altogether in exchange for two more years of full financial support. In 1929, seven years after reformers printed their first informational flyers, Sheppard-Towner came off the books. Over the next four years, progressives introduced 14 different bills that would have funded maternity and infancy health programs using federal dollars. All of them failed. When the Great Depression hit, most states dropped their existing programs altogether.
The lesson, though, had been learned. And while the United States’ current infant mortality rate is still not where it should be, it’s decidedly safer for babies and mothers now than it was a century ago. For that, we can thank Julia Lathrop and her small, ambitious staff.
*Headline in the Post on June 9, 1912
**Children’s Bureau’s Fifth Annual Report of the Chief, 1917
***The Sheppard-Towner Era: A Prototype Case Study in Federal-State Relationships; June 1967
If you’re into history or genealogy, or just get a kick out of rummaging through government documents, Monday was an exciting day. That’s because the U.S. National Archives released complete records from the 1940 U.S. Census and made the entire set, for the first time in history, accessible online and free of charge.
The data dump is a blessing for the caretakers of family trees, who can now mine that census for personal information about family members who passed on before their progeny could jot down key biographical facts. It’s also a long time coming; while aggregate statistics for cities or counties are published without restrictions as soon as they are available, specific records pertaining to individual citizens are sealed from the public, by law, for 72 years.
If that seems like a random amount of time to keep the decennial findings hidden, it kind of is. Sixty years ago, in an attempt to mollify both civil libertarians and statisticians who thought the value of the census depended on confidentiality, Census Bureau Director Roy Peel and U.S. Archivist Wayne Grover wrote an informal rule (later codified by Congress) that forced the government to keep particulars under lock and key for seven decades. In 1952, female life expectancy in the States was 71.6 years. According to their logic, very few people would still be living who had participated in the census 72 years earlier, so any harm caused by the disclosure would be minimal. As it turns out, female life expectancy is now 79.5 years, and 21 million Americans alive in 1940 are still kicking today, a full 16 percent of those counted that year. Luckily, few have complained about a breach of privacy. Most, like 100-year-old Verla Morris, seem to enjoy the novelty of reading their name in America’s history book.
Back in the late 1800s, it took years to tabulate the census results at all. Bureaucrats didn’t put the 1880 census to bed until 1887, and they knew finishing the 1890 census by 1900, when Congress was constitutionally required to reapportion district boundaries, would be even tougher. Not only was the nation’s population expanding by about 25 percent each decade, but the Census Office added a series of new questions to the document, including queries about home ownership, war service, and race. Counting the data by hand, as they had done for a full century, wasn’t going to cut it.

Herman Hollerith knew just how inefficient the process was. Born to German immigrants in Buffalo, the eccentric Hollerith graduated in 1879 with an engineering degree from the Columbia University School of Mines and followed one of his professors into the Census Office, where he watched in horror as his new colleagues slogged through an endless pile of paper forms, one by one. Hollerith wanted desperately for the government to organize its records mechanically, thereby saving time and reducing errors. He just needed to figure out how best to do it.
Inspiration struck, as it so often does, on the train. As Hollerith recalled, he was taking a ride out from Washington when he watched a conductor use a punch card to certify a passenger’s ticket. That got him thinking: what if the government could transfer census questionnaires onto punch cards, with each hole representing a different data point (location, gender, occupation), and then feed the cards into an electrical machine that tallied the results? After five years of trial and error, the engineer finally figured out a design that worked. Using the same principles as a Jacquard loom, his prototype featured a series of tiny cups, all filled with mercury and each connected to a wire nail. Each cup corresponded with a different hole on the punch card. When a card was inserted and the machine was set into motion, any punched hole would leave empty space through which the nail could dip into the mercury and complete a circuit, thereby setting off an electrical charge. Those charges were sent to the machine’s dashboard, which contained a series of clock-like dials. All the census worker had to do was plug in a card, mark down which dials moved, take it out, and grab the next one.
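For modern readers, the tallying logic Hollerith built out of mercury cups and dials maps neatly onto a few lines of code. Here is a toy Python sketch of the idea (the card layout and field names are invented for illustration, not taken from Hollerith’s actual 1890 schema):

```python
from collections import Counter

# Each position on a hypothetical card encodes one data point;
# a punched hole at that position means "this attribute applies."
CARD_FIELDS = {0: "male", 1: "female", 2: "farmer",
               3: "city dweller", 4: "homeowner"}

# A card is just the set of punched positions (the holes).
cards = [
    {0, 2, 4},  # male farmer who owns his home
    {1, 3},     # female city dweller
    {0, 3, 4},  # male city dweller who owns his home
]

# The "dials": one counter per field, bumped whenever a hole
# would have completed the circuit for that position.
dials = Counter()
for card in cards:
    for hole in card:
        dials[CARD_FIELDS[hole]] += 1

for field, count in dials.most_common():
    print(f"{field}: {count}")
```

Hollerith’s machine performed the same increment, one card at a time, except the counter was a mercury circuit nudging a clock hand.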
Hollerith filed his first patent in 1884 and tested the gadget in Baltimore three years later. His old colleagues were impressed with the results and offered him a contract when they reopened for business in 1890. It was a profitable decision. Using the electrical invention, the Census Office was able to analyze more information in a shorter amount of time (five years) and at a discount to taxpayers (an estimated $5 million in savings). “This apparatus works unerringly as the mills of the gods,” The Electrical Engineer wrote in November 1891, “but beats them hollow as to speed.”
Government officials may have been impressed with their new machine, but they were awfully cavalier with the documents it eventually tabulated. At the turn of the century, it was the job of individual agencies to maintain their own records, and some were more careful than others. Short on space in their vaults, archivists in the Commerce Department opted to stack the voluminous 1890 census neatly on pine shelves in their building’s basement. Few questioned the decision until January 10, 1921, when building fireman James Foster noticed smoke spewing through openings around some pipes that ran from the boiler room into the file room. Minutes later, another watchman upstairs smelled something burning in the men’s bathroom. Both made their way downstairs, where they ran right into an inferno. The pair pulled the house alarm, evacuated the office, and then watched as “five alarms quickly brought every piece of apparatus in downtown Washington to the scene” (New York Times, January 11, 1921). It took 20 hoses and two and a half hours to extinguish the blaze.
It was impossible to determine how long the fire had burned before anyone noticed, nor was it clear what set it off in the first place. (An errant cigarette is one potential culprit.) But the damage it caused was obvious. Kellee Blake, who wrote a big piece on the incident for Prologue, called it “an archivist’s nightmare.” One-quarter of the 1890 census burned instantly. Another 50 percent suffered heavy smoke and water damage. Census Bureau Clerk T. J. Fitzgerald told reporters the morning after that Hollerith’s data was “certain to be absolutely ruined” (Washington Post, January 11, 1921). And without modern preservation technology, the salvageable remains further deteriorated in the temporary storage space to which they were relocated. Today, only about 6,000 names survive from the almost 63 million census returns, a fact that still frustrates genealogists.
If there’s a silver lining to the story, it’s that the fire helped convince enough people in the capital that it would be useful to store important documents in a centralized and safe location. In 1926, Congress appropriated $1 million for an archival building, and eight years later, President Roosevelt signed a law establishing the National Archives as an independent agency. Hollerith, meanwhile, took the proceeds from his government contract and formed the Tabulating Machine Company, which would eventually change its name to the International Business Machines Corporation. He never became a rich man—the engineer did not get along with the company’s top salesman, Thomas Watson, and stepped aside from day-to-day operations in 1921—but his work revolutionized the field of information processing.
For more on that original contraption, be sure to read this article Hollerith wrote in 1890 describing its mechanics. The illustrations are particularly charming.
In Iowa and South Carolina, Stephen Colbert -- master improviser, co-creator of the bizarre and hilarious “Strangers With Candy,” inventor of “truthiness” -- is doing the most brilliant work of his career. The launch of his SuperPAC Americans for a Better Tomorrow, Tomorrow and his “entrance” into tomorrow’s Republican primary as an unofficial (and “uncoordinated”) supporter of the disgraced Herman Cain are masterstrokes, allowing the comedian to expose the absurdities of presidential politicking in a post-Citizens United world by “crossing the line that separates a TV stunt from reality.” If his participation in the process allows the performer to poke fun at some seriously flawed arch-conservative candidates? All the better.
While Colbert’s satirical political operation may be the wealthiest and most sophisticated in U.S. history, he’s not the only comedian who has mined material out of a quixotic bid for America’s highest office. Seventy-two years ago, with the country recovering slowly from a massive depression and on the brink of war, a lonely nation turned its eyes to Gracie Allen, a tiny woman with a high-pitched voice who offered voters, for a few buoyant months, a truly unique political platform.
A San Francisco native, Allen got her start in vaudeville, dancing with her older sister and then performing comedy bits with her partner-turned-husband George Burns. In real life, Allen was brainy and clever. Paired with Burns, she crafted a persona that was scatterbrained and obtuse. “The secret of Gracie’s humor,” Burns once said, “was her ability to deliver the most incredible lines with absolute sincerity” (New York Times, August 29, 1964). After a guest spot on the BBC, fellow comedian Eddie Cantor invited the duo to make select radio appearances on his show in the States. Cantor’s audience lapped up their goofy banter. Before long, Burns and Allen inked a contract to start their own NBC radio show as well as appear in movies for Paramount Pictures. The ensuing exposure transformed the vaudevillian and her cigar-chomping hubby into national stars.
In the late 1930s, with comedy radio shows still generating big ratings, on-air performers tried countless gimmicks to broaden their listener base. One day, knitting at her home in Beverly Hills, Allen turned to Burns and casually mentioned that she thought it might be fun to run for president in 1940. On February 7, during their regular broadcast, Gracie told her audience she was contemplating the idea, too. They roared with approval. Two months later, on April 21, she formally entered the fray, announcing to a Texas audience that she hoped “you'll vote for me, at least once.”
Allen’s campaign, such as it was, was more absurdist than satirical, relying on word play and the candidate’s unorthodox delivery to generate laughs.* Allen represented the Surprise Party, an affiliation cemented in infancy. (“My mother was a Democrat, my father a Republican, and I was born a Surprise.”) The party’s mascot was a kangaroo named Laura, adopted because 1940 was a leap year, and its presumptuous slogan -- “It’s in the bag!” -- was printed on sew-on campaign buttons, an innovation Allen pioneered to discourage supporters from changing their minds before Election Day. Once Allen and her team realized their joke bid had serious legs, she committed to the character fully, jumping repeatedly and sometimes unannounced onto popular radio shows to broadcast her views on the day’s hot topics. President Gracie would welcome foreign relations “so long as they bring their own bedding and don’t stay too long.” Under her administration, Allen promised the government would offer free correspondence courses “so that people who can’t find jobs in their own line will soon be without jobs in three or four different types of work.” And she assured the public she would hold no fireside chats from the White House between April 15 and October 15, unlike incumbent Franklin Roosevelt. “It is asking too much and I don’t know how President Roosevelt stands it,” she quipped. “Washington is awfully hot in summer.”
In May, with Burns at her side, Allen embarked on a 34-city whistle-stop tour from Los Angeles to Omaha, stopping along the way to make stump speeches from the back of her train. The journey concluded at the Ak-Sar-Ben Auditorium, where 10,000 delegates nominated her unanimously for president. (Her ticket had no vice-presidential candidate because Gracie had warned that she would tolerate no vice in her administration.) According to a United Press correspondent on hand (May 19, 1940), the first promise Allen made during her acceptance speech, which NBC radio aired live, was to bring Maine and Vermont back into the Union. A worthy goal, indeed.
Following the Surprise Party convention, Allen let her campaign peter out. That summer, she wrote a book of advice for future candidates called “How to Become President,” which a Los Angeles Times review (July 7, 1940) called an “effective antidote to the various viruses emanating from the current campaigns.” The popularity of the Burns & Allen show soared, and in November, on the day President Roosevelt won his third term in office, Allen earned a few thousand write-in votes for the highest post in the country. Reflecting in 2008, Allen and Burns’ colleague Robert Easton suggested that their dogged dedication to silliness “brought a much needed sense of comedy relief to very tense times.” If Colbert’s SuperPAC had existed in 1940, I’m confident its director would have cut a few ads for little Gracie.
*She never made any effort to get herself on the ballot.
On Native American reservations, “disenrollment” is serious business. If a family’s bloodline is determined to be impure, its members can lose their home, health insurance, educational stipend, and cultural heritage with just one signature from a tribal leader. There are no appeals. There are no second chances. “That’s it,” one excommunicated Chukchansi woman recently told the New York Times. “We’re tribeless.”
According to reporter James Dao, tribal leaders in California have purged at least 2,500 Native Americans since 2001, and the rationale has virtually nothing to do with proper ancestry. Sadly, the decisions are driven primarily by greed over revenue from casinos, a massive industry that has fully transformed reservation life during the past three decades. In 2009, the last year for which data was available, Indian casinos* generated $26.4 billion, outperforming nationally their older and often smaller commercial counterparts. With that much money on the table, activists and academics contend, some elder statesmen have taken the most extreme step possible -- slimming down the tribe -- to consolidate control over the profitable gaming operations.
That gambling plays such an outsized role in the lives of Native Americans is essentially an historical accident, the byproduct of an obscure legal decision and the subsequent entrepreneurialism of one desperate indigenous administrator. A few years ago, University of New Mexico law professor Kevin Washburn wrote a detailed law review article about Bryan v. Itasca County, the case he calls the “bedrock upon which the Indian gaming industry began.” The plaintiffs were Helen Charwood and Russell Bryan, members of the Leech Lake Band of Ojibwe in northern Minnesota. In late 1971, the couple purchased a two-bedroom trailer to replace a dwelling that had burned down on their property. Most Native American land is held in trust by the federal government, a legal arrangement that shields homeowners from state property taxes, but Itasca tax collectors considered double-wides personal property, which is not similarly exempt. One day the following June, a tax bill for the last two months of 1971 arrived in Charwood’s mailbox; the levy equaled $29.85. In July, USPS delivered another notice indicating a six-month assessment of $118.10. Both bills had to be paid off within 30 days.
Charwood, a Head Start teacher fearful of foreclosure, immediately contacted the newly established office of the Leech Lake Reservation Legal Services Project, which helped her file suit. Whether or not the mobile home constituted federal trust property turned out to be largely irrelevant; instead, the case centered on the court’s interpretation of Public Law 280. While the U.S. Constitution generally grants the federal government plenary power over Indian tribes, that law -- passed by Congress in 1953 -- transferred to six states (including Minnesota) federal law enforcement authority over certain tribal nations. Itasca’s lawyers assumed tax recovery was one of the regulatory powers the county could bring to bear. The Ojibwe disagreed. The case worked its way up to the U.S. Supreme Court, which released a broad and blistering decision in favor of the Native Americans. Justice William Brennan, writing for a unanimous Court, found nothing “remotely resembling an intention to confer general state civil regulatory control over Indian reservations.” The purpose of 280 was to grant the state jurisdiction over criminal offenses committed on the reservations, the justices reasoned, not to control all behavior. Charwood and Bryan were off the hook.
As Washburn writes, the expansive and forgiving SCOTUS ruling was “pregnant with possibility.” Jim Billie was the first to capitalize. Born in 1944 on the grounds of a chimp farm, and initially left for dead** by his bruising and proud people, Billie learned at a young age that tourists would flock to Indian reservations if the home tribe provided a novel reason to visit. His first entertainment venture was alligator wrestling, for which he became regionally famous. “It was P.T. Barnum, that’s all,” he later told Sarasota Magazine in 2004. “All them boys could catch ’gators. I just added a flair to it.” Twenty years later, after two tours in Vietnam (as a decorated Army Ranger), a spell in college studying cosmetology, and a victory in the race for Seminole tribal chairman, the charismatic Floridian turned his attention towards big-time gambling.
The appeal was obvious. The year Billie was elected, the incumbent chairman controlled a budget of just $400 per member. Poverty was rampant, human services and cultural programs nearly absent. And on his first day in office, the tribe’s comptroller dumped on his desk a proposal from several South Florida financiers to open a high-stakes bingo parlor on Seminole land, a plan his predecessor -- fearful of the reaction he’d receive from his many Southern Baptist members -- had ignored. Bingo was already legal in Florida, but state law sanctioned a maximum jackpot of just $100. Thanks to the Bryan ruling, which solidified the regulatory sovereignty of Native American tribes, Billie and his potential partners figured they could offer payouts that were both legal and massive. He quickly signed a contract that gave the tribe a 55 percent cut of all revenue, secured a $3 million bank loan for construction, and set about building a bingo hall outside of Fort Lauderdale.
Americans, it turned out, really like to gamble. The first hall, which initially featured a $4,000 super jackpot, opened to capacity crowds. The state of Florida tried to shut down the operation, but Billie’s team of lawyers outmaneuvered the AG’s office and established legal precedent for their enterprise in 1981. The following year, the Seminoles opened a second building in Tampa, which included poker and video slot machines. Players stormed the gates. Before long, they expanded to Immokalee, Brighton, and Coconut Creek, and the cash poured in; between 1980 and 2000, the tribe’s budget grew from $500,000 to $650 million, of which over 90 percent came directly from gambling. Billie went on a construction spree, opening schools, museums, hotels, tourist ranches, farms, office parks, and even an aircraft plant. The Seminole political action committees donated more money to Florida politicians than any other interest group in the 1990s, ensuring their gaming rights would not be challenged. Every man, woman, and child in the tribe was sent a monthly subsidy. And their exuberant and flirtatious chief even bought a 47-foot yacht, a $9 million jet, and three multimillion-dollar helicopters to travel from his home in the Everglades to tribal headquarters.
Money, while useful, can also breed resentment and suspicion, feelings from which the nouveau riche Seminole leaders were not immune. When Billie hired an outside investigator to audit the tribe’s ostentatious spending habits, and the administrator recommended the Seminoles eliminate a massive pool of contingency cash the five councilmen doled out without any oversight (and often for political favors), his colleagues turned on him. After five straight landslide electoral victories, the council went public with a sexual harassment complaint it would historically have dealt with privately and forced Billie to resign from his post without severance in 2003. The man who had launched the Indian gambling industry was escorted off his own reservation by police.
But that’s not the end of Billie’s tale. This May, the former leader was reelected as tribal chairman in a “house-cleaning that saw several other long-entrenched tribal leaders defeated in bids to stay in office.” Back in power, he’ll need to confront some intractable health and social problems afflicting his tribe; although most members live in free housing and collect monthly dividends of $3,500, Seminoles still suffer from disproportionate rates of joblessness, Type II diabetes, and substance abuse. Those are issues penny slots alone can’t solve.
For more on Billie, be sure to read the excellent Sarasota Magazine piece I linked to above.
*Nearly 450 casinos now operate in 28 states.
**Billie’s father was an Irish soldier and “mixed bloods” were once thought to possess evil spirits.
President Obama is close to setting a modern record, one his liberal base is unlikely to applaud. The Democrat still has not commuted a single prison sentence since taking office, a constitutional power every president except George W. Bush exercised earlier in his first term. Obama’s been stingy with his presidential pardons, too, officially forgiving the crimes of just 17 Americans, despite a massive backlog of applications. The Village Voice published a lengthy analysis of the trend a few weeks back, assigning blame to Obama’s political team, a skittish Attorney General Eric Holder, and the “arcane internal activities of the Justice Department,” which author Graham Rayman argues has “dismantled the administrative functions of the pardon office.” For those who’ve been wronged by the criminal justice system or have rehabilitated themselves following their initial incarceration, the light at the end of the tunnel is dimmer than it used to be.
Of the 17 people who did receive clemency from the current administration, most committed minor crimes (coin mutilation, alligator hide possession) decades ago. Not that they aren’t grateful; depending on the jurisdiction, ex-felons forfeit some of their core civil rights, including the right to vote, hold public office, or serve on a jury. Indeed, Junior Johnson -- one of the most accomplished Americans ever to file for clemency -- considered his 1986 presidential pardon one of the great thrills of his life.
And Johnson had many. The first superstar of NASCAR, Junior won 50 major stock car races between the mid-1950s and the mid-1960s and another 139 as a team owner in the three succeeding decades. Behind the wheel, he perfected two racing techniques that forever changed the sport: the power slide and drafting. The former, in which the driver slows through a turn by cocking the wheel to the left and gunning it before hitting a bend in the track, allows the car to come out of turns more quickly than consistently laying on the brake; the latter, in which one car trails in the wake of another to minimize wind resistance, helps slower cars keep pace with souped-up competitors.
It was on the dusty roads of Wilkes County, North Carolina, that Junior gained his dexterity behind the wheel. Like thousands of Scots-Irish descendants before him, Johnson ran untaxed moonshine* through the backcountry woods. To the locals, it was a business with no less legitimacy than any other. Junior worked for his father, who operated one of the biggest individual copper stills in the region. On a given night, young drivers could make $300 transporting a few dozen cases of mountain dew**, so long as they outran the federal tax agents known as “revenuers,” which Johnson always did. “I never got caught behind the wheel running,” he later bragged. “I’ve still got my marks on a bunch of those trees all along the roads there.”
Yet he was clipped once, in 1957, when officials staked out his father’s home and found the son on his way to fire up the family’s still. Johnson was arrested and served 11 months and three days in federal prison, the crime for which he was eventually pardoned. In a funny way, the booking was a blessing; as an older man, the driver claimed that in prison he “found out that I could listen to another fellow and be told what to do and h’it wouldn’t kill me.” His felony helped reinforce his bona fides as a rebel, too, setting the stage for his emergence as a legitimate southern folk hero.
That image was solidified in 1963, the most memorable season of his career. The good ole boy grossed $100,000 in earnings and set multiple qualifying records driving a Chevy, an inferior car not bankrolled by Detroit’s two earliest NASCAR investors, Ford and Chrysler. Those qualifying runs were often more thrilling to fans than the races themselves; because a stock car track is a giant oval and circling it requires very little handling ability, the sprints became, as Tom Wolfe describes it in his famous profile of Johnson, “a test of raw nerve -- of how fast a man is willing to take a curve.” In those early days, nobody was quite as fearless as Junior.
Last year, Johnson was one of the inaugural inductees of the NASCAR Hall of Fame, an honor he shared with legendary mustachioed driver Richard Petty. Since 2007, he’s also returned to his liquor roots, distilling and distributing Junior Johnson’s Midnight Moon, a legal moonshine. (Reviews are mixed.) Working through an authorized dealer is certainly preferable to rejoining the underground industry, which investigative reporter Max Watman says “shows no signs of letting up.” If he ran into trouble again, Johnson would probably have less success convincing the new occupants of the White House to throw him a bone.
*A home-brewed whiskey distilled from corn, potatoes, or anything that would ferment.
**Other hilarious nicknames include White Lightning, Kickapoo Joy Juice, Hooch, Ruckus Juice, Happy Sally, Hillbilly Pop, and Panther’s Breath.
The only thing presidential hopeful Rick Perry loves more than locking and loading at his racist hunting camp is running for office. In an insightful profile for The New Republic, reporter Alec MacGillis surveyed the Texas governor’s career and concluded that Perry is not really motivated by ideological principles or an appreciation of well-crafted public policy. Rather, he’s moved by “the business of politics: ladder-climbing, deal-making, campaigning, and, most of all, winning.” This explains why a conservative hell-bent on slashing government spending has no compunction about doling out grants, tax breaks, and contracts to serious campaign supporters like he was Rod Blagojevich on some drunken binge. For Perry and his coiffed ’do, the game means everything.
If we assume MacGillis’ characterization is accurate, I’m curious what a young Perry -- bored at home in West Texas following college and a stint in the Air Force -- took away from his visit to Washington, D.C. in 1978, when he witnessed the first major protest of the fascinating, controversial, and short-lived American Agriculture Movement.

To understand the significance of this jolly band of farmers, an amateur lesson in agrarian economic policy is helpful. Thanks largely to technological improvements, farmers have gotten better at their jobs over time; between 1948 and 2002, total U.S. agricultural output rose by a factor of 2.6, while the population didn’t even double. When suppliers harvest more food than Americans can gobble up in a given year, the surplus exerts downward pressure on prices. That’s good news for families trying to save a few bucks at the grocery store, but bad news for the laborers whose income depends on the sale of those crops. The problem gets worse when thousands of farmers across the country all decide individually that the only way their families can eke out a living is by planting more seeds. What seems like a rational economic calculation in the face of falling prices actually floods the market further, lowering prices even more in the process.
For 40 years, beginning during the Great Depression, the United States tried to solve this dilemma by limiting the amount of seed farmers sowed. If American agriculturalists churned out more than their neighbors could chew, the government would simultaneously purchase and store excess grain and pay the farmers to leave portions of their land fallow. This command-and-control system worked reasonably well until the early 1970s, when synthetic pesticides and new machinery overwhelmed the government's ability to rein in agricultural output. “Given such leaps in productivity,” wrote Tom Philpott in his concise overview
of the Roosevelt-era initiative, “it was inevitable that the New Deal paradigm would break down.”
In stepped the Nixon Administration and one of the great names in American political history, former USDA Secretary Earl “Rusty” Butz
. To boost prices, Butz and his colleagues decided to focus their efforts on the demand side of the equation by opening up foreign markets to U.S. crops. Quickly, he engineered a massive grain sale to the Soviet Union and then pressed farmers to till their bare farmland “from fencerow to fencerow.” If his strategy caused prices to dip below the cost of production in the short term, the free-marketer agreed to make direct payments to the landowners to protect them from falling into debt. A new era was born.
This is where the AAM enters the story. In 1973, with the passage of the Agriculture and Consumer Protection Act, Congress established its first set of “target prices” for crops. If the average price for a certain commodity fell below the given estimate, the government would simply write a check to the farmers who grew it. Four years later, with that first omnibus bill scheduled to expire, lawmakers and the Carter White House renegotiated target price levels. Surveys taken at the time showed
that a huge majority of food producers -- whose purchasing power was at its lowest level since the 1930s -- were keen to keep the new law in place so long as target prices were raised. The final legislation did, but only modestly. And that’s when some young American farmers got radical.
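(A quick aside before the radicalism: the target-price mechanism boils down to simple arithmetic. Here’s a minimal sketch, with every figure invented rather than pulled from the actual 1977 bill.)

```python
# Minimal sketch of a "deficiency payment" under target pricing.
# The target, market price, and harvest size below are all hypothetical.

def deficiency_payment(target_price, market_price, bushels):
    """Washington covers the gap whenever the market falls short of the target."""
    return max(target_price - market_price, 0.0) * bushels

# Invented wheat farmer: $3.00/bushel target, $2.40 market, 10,000 bushels.
print(deficiency_payment(3.00, 2.40, 10_000))  # 6000.0 -> a $6,000 check
```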
That fall, 3,000 small farmers from 24 states convened in Pueblo, Colorado, to berate U.S. Agriculture Secretary Bob Bergland for ignoring their plight. His response -- that they should “just wait a while and things will get better” -- was unsatisfying, to say the least. After the meeting, frustrated farmhands decided to organize the AAM formally. Their sole demand? Establish “100 percent parity,”* described informally as a “minimum wage for farmers.” The movement spread
like wildfire. Optimistically claiming the support of one-third of U.S. farmers, AAM allies held rallies at statehouses across the country that winter, and even secured a meeting with President Carter on December 24, one that the Washington Post
described at the time as “mostly symbolic.” Their agitation culminated in a giant tractor rally (or “tractorcade”) in the nation’s capital just a few weeks later. The AAM’s resident historian, biased as he or she may be, sets the frantic scene:
AAM's tactics in those early days brought harsh criticism, but also much needed publicity. Farmers learned to tell their story in front of TV cameras and on radio talk shows. Soon the whole nation knew there was a problem, whether they agreed with the farmers or not and whether they condoned their tactics or not. When Congress reconvened on January 18, 1978, 50,000 farmers were in Washington, D.C. to greet them. Again, all of this was done with no formal communications network. On March 15, 1978, 30,000 farmers marched down Pennsylvania Avenue in one of the largest farm demonstrations ever. Some farmers from Missouri had brought along some goats, and somehow they got loose just as the parade approached the Capitol. The versatile goats nimbly climbed the steps, the statues, the fences or whatever else they wanted to. Police, not used to herding goats, tried to catch them. The news media barely noticed what the farmers were saying for the antics of the goats.
Though not for lack of effort, AAM leaders failed to initiate a “national strike,” whereby farmers across the States would stop buying equipment and selling produce until their demands were met. But as a result of their actions, the Carter administration agreed to temporarily halt all foreclosures conducted by the Farmers Home Administration. Six months after the D.C. fracas, the Post
offered a largely positive assessment of the AAM’s experiment. “What the movement brought can be measured largely in intangibles: a new political awareness for farmers, some small political victories, and a new sense of community among people who pride themselves on their individualism.”
The next winter, a smaller and more aggressive band of AAM members, again riding their John Deeres, returned to Washington and snarled rush-hour traffic in an attempt to drum up publicity. In just a matter of hours, 19 farmers were arrested, 17 tractors were impounded, and the goodwill AAM officials had generated 12 months earlier was squandered. “This is not a legislative year, and there ain’t much going to happen,” the chairman of the House Ag subcommittee told Scripps-Howard
. “You cannot find the sympathy for them they had last year.” It was the group’s last true moment in the limelight.
These days, it’s large corporate conglomerates that suck up
a growing percentage of the farm subsidies Rusty Butz first set aside. (Archer Daniels and the like then use the government cheese to buy out the young farmers whom the USDA initially intended to support.) It’s not yet clear whether Rick Perry -- himself a mild beneficiary
of direct subsidies -- would do anything to alter the status quo if elected president. I guess we will just have to wait to find out what lessons he gleaned from 1978.
*Adjusted for inflation, the farmers wanted to earn as much profit-per-acre as their predecessors did in 1910.
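(For the curious, here’s a back-of-the-envelope version of that parity math under the informal definition above: take 1910 profit-per-acre and scale it up by how much farmers’ costs have risen since. Every figure below is invented for illustration.)

```python
# Rough sketch of "100 percent parity" per the informal definition above.
# All numbers are hypothetical, purely to show the shape of the calculation.

profit_per_acre_1910 = 12.0   # dollars per acre, 1910 (invented)
cost_index_1910 = 10.0        # index of prices farmers paid, 1910 (invented)
cost_index_1978 = 65.0        # same index, 1978 (invented)

parity = profit_per_acre_1910 * (cost_index_1978 / cost_index_1910)
print(f"100 percent parity: ${parity:.2f} per acre")  # $78.00 per acre
```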
In Mentor, Minnesota, a town from which a portion of my family hails
, there stands a white cottage. It’s tucked away on the tree-lined shores of Maple Lake
, connected to the highway that runs through town by just a slim, winding road. And for years, it housed the area’s most unlikely and infamous resident, Texas-bred Bascom Giles.
From 1938 through the mid-1950s, Giles was elected eight times to lead the Texas General Land Office, a powerful agency with the sole responsibility
of managing the public domain. For most of his tenure, the entrepreneurial Democrat and Grand Master
mason served with distinction. Texans were not surprised, then, when Giles wrote and lobbied for an amendment to the state constitution authorizing $100 million in public bonds ($800 million today) to buy land that would be resold to returning war veterans at a low interest rate (3 percent) and with a minimal down payment (5 percent). The plan seemed both patriotic and beneficial to the state’s growing economy, and voters approved the measure in 1946 by a wide margin.
Their trust in Giles and his colleagues in Austin was misguided. The big pot of money set aside for the GI program attracted the attention of grafters, who in turn convinced officials inside Gov. Allan Shivers’ administration that there was cash to be made off hoopleheads returning from war. Some despicable predatory behavior ensued:
Soon, land promoters and dishonest public servants were waxing fat at the expense of the veterans and the taxpayers, with an ingenious racket. The racketeers 1) got options on land at market prices, 2) duped veterans into signing the necessary papers, 3) with the aid of crooked officials, got the land appraised at several times its actual worth, 4) put on pressure to get state loans on it, in the names of the bamboozled veterans, and 5) pocketed the profits made in the jacked-up prices for the land. The Veterans Land Board, composed of Giles, Governor Allan Shivers and Attorney General John Ben Shepperd, eased the way by hastening its approval of the hot transactions, and often acted so expeditiously that the promoters were able to pick up the options with the state's money. The fact that Shivers and Shepperd rarely attended board meetings undoubtedly helped Giles work out his plan. Usually the ex-servicemen had no idea what they were signing. Many thought the papers were some sort of application for a cash bonus.
An estimated $10 million in state money was siphoned off to thieves before a cub reporter named Ken Towery, working on his first real story for the tiny Cuero Record
, unraveled the entire scheme. Towery received a tip that several prominent businessmen in his county were entertaining black and Mexican laborers at a private bottle club, a curious occurrence in the segregated South. “Down in this country,” he later told Time
, “white people just don't set up big parties for colored field hands.” The reporter subsequently interviewed dozens of vets and figured out that land owners were rounding up former troops, some of them functionally illiterate, and tricking them into “applying” for land grants without their knowledge. Towery published his explosive findings on November 14, 1954, which set off an intensive statewide probe the following year, resulting in 20 indictments in nine different counties. The “Veterans Land Board scandal” rocked the state; Giles was convicted on charges of bribery, theft, and conspiracy and served three years in Texas lock-up, the highest-ranking pol to go down in the case. His boss, weakened by the scandal, did not seek a fourth term in 1956.
It was after his release that Giles moved up north, presumably for a fresh start in a town that knew little about his Lone Star improprieties. He never served in public office again and was eventually killed in an auto accident in Florida. Towery came away from the affair with a Pulitzer Prize for local affairs reporting. Not bad for your first journalism assignment.
There’s more on the scandal here
. And here’s a recent (infuriating) example
of predatory lenders taking advantage of American service members and their families, shenanigans that Holly Petraeus will try to prevent
if and when the Consumer Financial Protection Bureau ever gets off the ground.
The debt deal Congress struck over the weekend wasn’t the only time-consuming legislative project lawmakers have focused on at the expense
of job creation. For months, congressmen in the capital “debated
” the merits of a major patent reform bill, one that led to a “byzantine war between powerful interests,” according to the Huffington Post’s Zach Carter. Our Constitution gives Congress the power to issue patents, whose purpose is to encourage the development of new and productive ideas. Carter’s reporting shows that the latest melee on the Hill did little to clarify or improve existing intellectual property law. Instead, it provided lawmakers a justification to hold meetings and accept donations from tech giants, drug companies, and major banks. It was a massive, if profitable, distraction.
If the nation’s first patent statute were still the law of the land, this process would look considerably different. Under that statute, on the books for just three years, from 1790 to 1793, applicants were required
to file a petition with the U.S. Secretary of State (who at the time was Thomas Jefferson). Along with the Secretary of War and the U.S. Attorney General, the cabinet member evaluated each application to determine whether or not the discovery was “sufficiently useful and important.” If approved, the patent holder would receive exclusive rights for 14 years.
Jefferson and his fellow “Commissioners for the Promotion of the Useful Arts,” who were a little tied up running a brand new country, granted just three patents during their first year on the job. The original awardee
was a man named Samuel Hopkins, a Quaker from Pennsylvania who claimed to have perfected
a new and improved technique for manufacturing potash
, a substance (derived from the ashes of hardwood trees) that served as a crucial ingredient in soap, glass, and gunpowder. Hopkins speculated that potash producers could boost yields by burning the raw tree ash in a furnace before dissolving and boiling the plant product in water. With patent in hand, Hopkins staked his financial future on a licensing scheme, asking asheries in forest-rich New England for a $50 down payment (roughly $650 today) in exchange for a five-year furnace contract.
Hopkins' price, it turned out, was way too high to attract serious investors. As David Maxey wrote in his 1998 report on the subject, not-so-subtly titled “A Study of Failure
,” “the cost to licensees far outweighed the increased yield … especially when a much cheaper, noninfringing alternative for producing potash was within easy reach.” Hopkins couldn’t dump his own resources into an advertising blitz either, because making a major financial commitment to an unproven venture could have been construed as gambling, behavior that was inconsistent with the values of the Religious Society of Friends
. Financial ruin inevitably followed. Here’s an excerpt from the minutes of one monthly church meeting in 1802, which Maxey dug out of the archives:
"Samuel Hopkins, who has been some years removed from us with his Family to Rahway, hath been treated with by the Overseers on account of having, by entering into Engagements beyond his Ability to manage, failed to fulfill his Contracts, and pay his just Debts, whereby Reproach hath been brought on the Profession of Truth which he hath made. At our last Sitting, . . . Samuel came forward and asked the aid of his Friends, and the Meeting verbally named a Committee to hear his Request in company with the Overseers. With their united Consent he now offers a paper acknowledging that by leaving a Business in which he had been instructed, and for want of Patience too speedily embracing another Employment which though appearing more eligible, had led him into a Train of Difficulties and Embarrassments, and prevented his doing Justice to his Creditors, whereby he had been brought into deep and exercising Conflicts on these Accounts.
America’s first patent holder, in other words, was a massive failure. And we have the nation’s third president to blame.