Current, R., T. H. Williams, F. Freidel, and A. Brinkley. American History: A Survey. 7th ed. New York, 1987.



PART SIX. Prosperity, Depression, and War, 1920-1945

In many respects the period between the end of World War I and the end of World War II was one of sharp discontinuities. Few eras in American history present such vivid contrasts compressed into so short a time.

Politically, the nation experienced what many considered a fundamental change after the election of 1920. For a full decade, the government remained in the hands of the Republican party and—for eight of those years at least—in the hands of two conservative presidents who rejected most of the liberal assumptions of the progressive era. An age of reform seemed to have given way to an era of reaction. Then, beginning in 1933 with the inauguration of Franklin Delano Roosevelt, the Democratic party—the minority organization for most of the previous seventy-five years—established a dominance that it was not to relinquish for decades. And with it came a new administration and a new Congress that would together produce major reforms and a drastic expansion of the role of government in American life. Finally, after the outbreak of World War II, the nation witnessed a rapid strengthening of conservative forces that brought the march toward liberal reform once again to a virtual halt.

Economically, the nation experienced equally profound shifts. Beginning in 1921 the American economy embarked on a period of growth without precedent in the history of the world. The nation's industrial capacity grew rapidly; the income of its citizens soared; America's position in world trade became one of unrivaled supremacy. And the American corporate world, after having been on the defensive for many years, basked in a widespread public popularity that turned once-despised captains of industry into national heroes. Then, starting with a dramatic stock market crash in 1929, the imposing economic edifice collapsed, and the country entered the worst economic crisis in its history. Industrial production declined; new investment virtually ceased; unemployment reached epic proportions. Although the magnitude of the crisis varied from one year to the next, the Great Depression, as it soon became known, continued for nearly a full decade. Only the outbreak of World War II and the stimulus that the conflict gave to the economy brought the misery to an end. By 1942 the United States had once again embarked on a period of economic growth—one that soon put even the prosperous 1920s to shame.

Culturally, there seemed to be equally sharp contrasts. In the 1920s a bitter conflict emerged between the forces of modernism associated with the new urban-industrial society and the forces of traditionalism associated with more provincial, often rural communities. On issues such as prohibition, religion, and race, the tensions between the new society and the old were vividly displayed. In the 1930s, by contrast, the nation's outlook appeared to shift dramatically. Cultural divisions now seemed less important than economic ones, and the controversies of the 1930s centered less on questions of values than on questions of wealth and power. The outbreak of World War II, however, dampened the economic conflicts and helped to produce a new search for national unity.

Yet for all the very real differences between the 1920s and the 1930s, and between both periods and World War II, there were also important continuities. For the United States during these years remained involved in a struggle that transcended the changing fortunes of the moment: a struggle between the forces promoting national consolidation and the forces sustaining social disunity.

Through prosperity, depression, and war, the nation experienced a steady increase in government influence on the lives of its people, a continuing movement toward centralization of economic power in great national institutions, and the emergence of a social and cultural outlook that helped to overcome local and regional differences and to draw the entire country together. At the same time, America in those years encountered strong resistance to all these developments—resistance that could not reverse the trend toward consolidation but that could and did limit and reshape it. The story of American life in these years, therefore, is both the story of how the United States became a more modern, united nation and the story of how it remained at the same time in many ways a fragmented, localized, individualistic society.

Chapter 24. The New Era

"America's present need is not heroics, but healing; not nostrums, but normalcy," Warren G. Harding, soon to become president of the United States, told the nation in 1920. "The world needs to be reminded that all human ills are not curable by legislation."

The 1920s have often been portrayed as an era fully in accord with Harding's conservative words—an era of affluence, conservatism, and cultural frivolity; the Roaring Twenties; the age of "normalcy." In reality, however, the decade was a time of significant, even dramatic social, economic, and political change. It was an era in which the American economy not only enjoyed spectacular growth but began to assume its modern forms of organization. It was a time in which American popular culture reshaped itself to reflect the realities of urban, industrial society. And it was a decade in which American government, for all its conservatism, experimented with new approaches to public policy that helped pave the way for the important period of reform that was to follow. Contemporaries liked to refer to the 1920s as the "New Era"—an age in which America was becoming a modern nation. In many ways, it was an appropriate label.

At the same time, however, the decade saw the rise of a series of spirited and at times effective rebellions against the modern developments that were transforming American life. The intense cultural conflicts that characterized the 1920s were evidence of how much of American society remained unreconciled to the modernizing currents of the New Era.

The New Economy

Growth and affluence were the most striking and visible characteristics of American life in the 1920s; and the remarkable performance of the economy in those years lay at the heart of the many other social and cultural changes of the era. After the recession of 1921-1922, the United States began a period of almost uninterrupted prosperity and economic expansion; indeed, American industry came to seem one of the wonders of the world. Less visible at the time, but equally significant, was the survival (and even extension) of serious inequalities and imbalances, in which lay the seeds of future economic troubles.

Economic Performance

No one could deny the remarkable, some believed miraculous, feats that the American economy was performing in the 1920s. The nation's manufacturing output rose by more than 60 percent during the decade; the gross national product increased at an average of 5 percent a year; output per worker rose by more than 33 percent. Per capita income grew by a third. Inflation was negligible. A mild recession in 1923 momentarily interrupted the pattern of growth; but when it subsided early in 1924, the economy expanded with even greater vigor than before.

The economic boom was a result of many things. The most obvious immediate cause was the debilitation of Europe after World War I, leaving the United States for a time the only truly vigorous industrial power in the world. More important, however, was technology, and the great industrial expansion it made possible. The automobile industry, as a result of the invention of the assembly line and other technological innovations, grew from a relatively modest size in the years before the war to become one of the most important forces in the nation's economy. Americans bought 1.5 million cars in 1921; in 1929 they purchased more than 5 million. Expansion in one industry meant, of course, expansion in others. Auto manufacturers purchased the products of steel, rubber, glass, and tool companies. Auto owners bought gasoline from the oil corporations. Road construction in response to the proliferation of motor vehicles became itself an important industry. The increased mobility that the automobile afforded increased the demand for suburban housing, fueling a boom in the construction industry.

Other new industries benefiting from technological innovations contributed as well to the economic growth. The new radio industry became a booming concern within a few years of its commercial debut in 1920. The motion picture industry expanded dramatically, especially after the introduction of sound in 1927. Aviation, electronics, and home appliances all helped sustain American economic growth. The invention of new plastics and synthetic fibers helped the chemical industry to become an important force. Improved methods of extraction and transportation, as well as new production techniques, helped the aluminum and magnesium industries to develop.

Cheap, readily available energy—from newly discovered oil reserves, from the expanded network of electric power, and from the nation's abundant coal fields—further enhanced the ability of industry to produce. Improvements in management techniques also played a role in increasing productivity. More and more industries were subscribing to the "scientific management" theories of Frederick Winslow Taylor, making deliberate efforts to improve the efficiency of their operations.

Economic Organization

This quest for improved efficiency in the factory was only part of a larger trend. Large sectors of American business in the 1920s were making rapid strides toward national organization and consolidation. The process had begun, of course, decades before; but the New Era witnessed an acceleration of such trends. By the end of the decade, 8,000 small mining and manufacturing companies had been swallowed up into larger combinations; 5,000 utilities had disappeared, most of them into great holding companies. Local merchants foundered and vanished as national chain stores cornered more than a quarter of the nation's food, apparel, and general merchandise markets. In some industries, power resided in so few firms that competition had all but vanished. U.S. Steel, the nation's largest corporation, controlled its industry almost alone; its dominance was suggested by the widely accepted use of the term "Little Steel" to refer to all of its competitors combined.

This consolidation did not occur in all segments of the economy. Some industries—notably those dependent on large-scale mass production—seemed naturally to move toward concentrating production in a few large firms. Others—industries less dependent on technology, less susceptible to great economies of scale—proved resistant to consolidation, despite the efforts of many businessmen to promote it. By the end of the decade, it was becoming clear that the American economy would not be dominated by any single form of organization; that in some industries there would be a high degree of consolidation, while in others power would remain widely dispersed.

In those areas where industry did consolidate, new forms of corporate organization emerged to advance the trend. General Motors, which was by 1920 not only the largest automobile manufacturer but the fifth largest American corporation, was a classic example. Under the leadership of Alfred P. Sloan, GM developed a modern administrative system with an efficient divisional organization, which replaced a chaotic management structure. With the new system, not only was it easier for GM to control its many subsidiaries; it was also a simpler matter for it—and for the many other corporations that adopted similar administrative systems—to expand further.

Some industries less susceptible to domination by a few great corporations attempted to stabilize themselves not through consolidation but through cooperation. An important vehicle was the trade association—a national organization created by various members of an industry to encourage coordination in production and marketing techniques. Trade associations often succeeded in limiting competition and stabilizing the market in industries dominated by a few large firms. But in industries such as cotton textiles, where power remained widely dispersed, their effectiveness was limited.

The strenuous efforts by industrialists throughout the economy to find ways to curb competition through consolidation or cooperation reflected the survival of a basic corporate fear: the fear of overcapacity. Even in the booming 1920s, industrialists remembered how too rapid expansion and overproduction had helped produce disastrous recessions in 1893, 1907, and 1920. The great dream of the New Era—a dream that remained unfulfilled—was to find a way to stabilize the economy so that such collapses would never occur again.

Labor's Dilemma

The remarkable economic growth was only one side of the American economy in the 1920s. Another was the maldistribution of wealth and purchasing power that persisted during the decade. New Era prosperity was real enough, but it was restricted to a minority of the population. More than two-thirds of the American people in 1929 lived at no better than what one study described as the "minimum comfort level." Half of those languished at or below the level of "subsistence and poverty." Large segments of the society remained unable to organize, and they found themselves without sufficient power to protect their economic interests.

American labor experienced both the benefits and the deficiencies of the 1920s as much as any other group. On the one hand, most workers saw their standard of living rise during the decade; many enjoyed greatly improved working conditions and other benefits. Employers in the 1920s, eager to avoid disruptive labor unrest and forestall the growth of unions, adopted paternalistic techniques that came to be known as "welfare capitalism." Industrialists such as Henry Ford shortened the work week for employees and instituted paid vacations. Manufacturers such as U.S. Steel spent millions of dollars installing safety devices and improving sanitation in the workplace. Most important, perhaps, many employers offered their workers substantial raises in pay and other financial benefits. By 1926, nearly 3 million industrial workers were eligible for pensions on retirement. In some companies, employees were permitted to buy stock at below-market value. When labor grievances surfaced despite these efforts, workers could voice them through the so-called company unions that were emerging in many industries—workers' councils and shop committees, organized by the corporations themselves. Welfare capitalism brought workers important economic benefits. It did not, however, offer employees any real control over their own fates.

Company unions may have been psychologically comforting, but they were for the most part feeble vehicles for demanding benefits. In most companies, the workers' councils were forbidden to deal with questions of wages and hours. And welfare capitalism survived only as long as industry prospered. After 1929, with the economy in crisis, the entire system quickly collapsed.

Welfare capitalism affected only a relatively small number of workers, in any case. Most laborers worked for employers interested only in keeping their labor costs to a minimum, and workers as a whole, therefore, received wage increases that were proportionately far below the increases in corporate profits. Unskilled workers, in particular, saw their wages increase very slowly—by only a little over 2 percent between 1920 and 1926. Many workers, moreover, enjoyed no real security in their jobs. Unemployment in the 1920s was lower than during most of the previous decades. But while historians and economists disagree about the levels of joblessness in these years, the most recent evidence suggests that an average of between 5 and 7 percent of nonfarm workers were unemployed between 1923 and 1929.

In the end, American workers remained in the 1920s a relatively impoverished and powerless group. Their wages rose; but the average annual income of a worker remained below $1,500 a year at a time when $1,800 was considered necessary to maintain a minimally decent standard of living. In some industries, such as coal mining and textiles, hours remained long and wages rose scarcely at all. Nor could workers do very much to counter the effects of technological unemployment. Total factory employment increased hardly at all during the 1920s, even while manufacturing output was soaring.

Some laborers continued to regard an effective union movement as the best hope for improving their position. But the New Era was a bleak time for labor organization. Part of the blame lay with the workers themselves, many of whom were seduced by the benefits of welfare capitalism and displayed no interest in organizing. Even more of the blame rested with the unions, which failed to adapt to the realities of the modern economy. The conservative American Federation of Labor remained wedded to the concept of the craft union, in which workers were organized on the basis of particular skills. In the meantime, a huge new segment of the work force was emerging: unskilled industrial workers, many of them immigrants from southern or eastern Europe. They received little sympathy or attention from the craft unions and found themselves, as a result, with no organizations to join. The AFL, moreover, remained throughout the 1920s painfully timid about supporting strikes—partly in reaction to the disastrous setbacks it had suffered in 1919. William Green, who became president of the organization in 1924, was committed to peaceful cooperation with employers and strident opposition to communism and socialism.

A growing proportion of the work force consisted of women, who were concentrated in what have since become known as "pink-collar" jobs—low-paying service occupations with many of the same problems as manufacturing employment. Large numbers of women worked as secretaries, salesclerks, and telephone operators, and in similar capacities. Because such positions were not technically industrial jobs, the AFL and other labor organizations were uninterested in organizing these workers.

Black workers were another group that could hope for little help from the unions. The half-million blacks who had migrated from the rural South into the cities during the Great Migration after 1914 constituted a small but significant proportion of the unskilled work force in industry; but as unskilled workers, they had few opportunities for union representation. The skilled crafts represented in the AFL often worked actively to exclude blacks from their trades and from their organizations. Most blacks worked in jobs in which the AFL took no interest at all—as janitors, dishwashers, and garbage collectors, and in other menial service jobs.

But however much the workers and unions themselves contributed to the weakness of the labor movement, corporate and government policies contributed more. If welfare capitalism was the carrot for inducing workers to accept the status quo, the antiunion policies of most industrialists constituted the stick. Corporate leaders worked hard after the turmoil of 1919 to spread the doctrine that unionism was somehow subversive and un-American, that a crucial element of democratic capitalism was the protection of the open shop (a shop in which no worker could be required to join a union). The crusade for the open shop, euphemistically titled the "American Plan," received the endorsement of the National Association of Manufacturers in 1920 and became a pretext for a harsh campaign of union busting across the country.

When such tactics proved insufficient to counter union power, government assistance often made the difference. In 1921, the Supreme Court upheld a ruling that declared picketing illegal and supported the right of lower courts to issue injunctions against strikers. In 1922, the Justice Department intervened to quell a strike by 400,000 railroad workers. In 1924, the courts refused protection to members of the United Mine Workers Union when mine owners launched a violent campaign in western Pennsylvania to drive the union from the coal fields.

The result of all these factors was that union membership suffered a serious decline in the 1920s. Union membership as a whole fell from more than 5 million in 1920 to under 3 million in 1929. Not until the mid-1930s, when a combination of increased labor militancy and active government assistance added strength to the labor movement, would the antiunion syndrome be broken.

The Plight of the Farmer

Despite their other problems, many American workers gained at least an increase in income during the 1920s. In contrast, most American farmers of the New Era experienced only decline. Agriculture, like industry, was discovering the advantages of new technology for increasing production. The number of tractors at work on American farms, for example, quadrupled during the 1920s, helping to open 35 million new acres to cultivation. But while the increases in industrial production were matched by increases in consumer demand, the expansion of agricultural production was not. The European market for American foodstuffs contracted rapidly after the war, as European agriculture began to resume production. At the same time, domestic demand for food rose only slightly.

The result was a disastrous decline in food prices and a severe drop in income for farmers. The per capita annual income for Americans not engaged in agriculture in 1929 was $870. For farmers, it was $223. In 1920, farm income had been 15 percent of the national total; by 1929, it was 9 percent. More than 3 million people left agriculture altogether in the course of the decade. Of those who remained, an alarming number were forced into tenancy—losing ownership of their lands and having to rent instead from banks or other landlords.

In response, farmers began to demand government relief. A few gravitated to such vaguely radical organizations as the Nonpartisan League of North Dakota or its successor, the Farmer-Labor party, which established a foothold as well in Minnesota and other Midwestern states. Most farmers, however, adopted a more moderate approach, agitating for some form of government price supports.

Through such organizations as the Farm Bureau Federation, they put increasing pressure on Congress (where farmers continued to enjoy disproportionately high representation); and while reform sentiment in most other areas made little headway in the 1920s, the movement for agrarian reform rapidly gathered strength.

One price-raising scheme in particular came to dominate agrarian demands: the idea of parity. "Parity" referred to a price for crops determined by a complicated formula. The parity price of agricultural goods was to reflect what farmers called a "fair exchange formula," which was based on the average price of the crop during the decade preceding the war (a good time for farmers) as compared with the general average of all prices during the same period. Its purpose was to ensure that farmers would earn back at least their production costs no matter how the national or world agricultural market might fluctuate. The government would guarantee parity to farmers in two ways: first, by maintaining a high tariff barrier against foreign competition, thus enabling American agriculture to sustain high prices at home; second, by buying up any surplus crops at parity and selling them abroad at whatever the world market would bring. An "equalization fee"—that is, a general tax on all crops—would compensate the government for any loss while spreading the burden evenly among all farmers.
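
Stated as arithmetic, the "fair exchange formula" reduces to a simple ratio: the parity price is the crop's average price in the prewar base period, scaled by however much the general price level has moved since then. The sketch below restates this in modern notation; the wheat price and index values are illustrative assumptions, not figures from the text.

\[
P_{\text{parity}} \;=\; P_{\text{crop}}^{\,1909\text{--}1914} \times \frac{I_{\text{current}}}{I_{1909\text{--}1914}}
\]

% Illustrative example with assumed numbers: if wheat averaged $0.88 per
% bushel in the prewar base period, and the general price index now stands
% at 150 against a base of 100, the guaranteed price would be
\[
P_{\text{parity}} \;=\; \$0.88 \times \frac{150}{100} \;=\; \$1.32 \text{ per bushel}
\]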

The legislative expression of the demand for parity was the McNary-Haugen bill, named after its two principal sponsors in Congress and introduced repeatedly between 1924 and 1928. In 1924, a bill requiring parity only for grain failed in the House. Two years later, with cotton, tobacco, and rice added to win Southern support, the measure passed, only to fall victim to a veto by President Coolidge. In 1928, it won congressional approval again, only to succumb to another presidential veto. Although farmers had impressive political strength, as long as agrarian problems did not seem to affect the general prosperity there was little hope for reform.

Despite the inequities, and despite structural flaws that would ultimately contribute to the coming of a great crisis, the American economy in the 1920s did experience real and important growth. It helped spur the development of a host of new industries that would be of long-range importance to American economic health. It increased the size of the affluent middle class and thus helped create the mass consumer market that would be so crucial to future economic growth. It changed the American landscape by producing new residential and travel patterns and, in the process, breaking down the isolation of rural areas. And it helped create the outlines of a new national culture, which reflected both the fruits of American industry and the growing importance of urban life.

The New Culture

Americans in the 1920s experienced a series of profound changes in the way they lived and thought. A new urban culture emerged that helped people in all regions to live their lives and perceive their world in increasingly similar ways; and it exposed them to a new set of values that reflected the prosperity and complexity of the modern economy.

Consumerism

The United States of the 1920s was for the first time becoming a true consumer society—a society in which not only the affluent but many ordinary men and women bought items not just because of need but for the sheer pleasure of buying. What they bought, moreover, helped change the way they lived. Middle-class families rushed to purchase such new appliances as electric refrigerators, washing machines, and vacuum cleaners. Men and women wore wristwatches and smoked cigarettes. Women purchased cosmetics and mass-produced fashions. Americans in every part of the country ate commercially processed foods distributed nationally through chain stores and supermarkets. The clearest illustration of the new consumerism was the frenzied excitement with which Americans greeted the automobile, which was in the 1920s becoming more widely available and affordable than ever before. By the end of the decade, there were more than 30 million cars on American roads. Automobiles had, in the process, become not just a means of transportation but the first great national consumer obsession.

No group was more aware of the emergence of consumerism (or more responsible for creating it) than a new and growing sector of the economy: the advertising industry. The first advertising and public relations firms (N. W. Ayer and J. Walter Thompson) had appeared well before World War I; but it was in the 1920s, partly as a result of techniques pioneered by wartime propaganda, that advertising truly came of age. Publicists began to see themselves as more than purveyors of information. They viewed themselves, rather, as agents of the growing American economy; and they advertised products by attempting to invest them with glamour and prestige.

They also encouraged the public to absorb the values of promotion and salesmanship and to admire those who were effective "boosters" and publicists. One of the most successful books of the 1920s was the work of an advertising executive, Bruce Barton. In The Man Nobody Knows, Barton drew a portrait of Jesus Christ as less a religious prophet than a "super salesman" who "picked up twelve men from the bottom ranks of business and forged them into an organization that conquered the world." The parables, Barton argued, were "the most powerful advertisements of all time." Barton's message, a message apparently in tune with the new spirit of the consumer culture, was that Jesus had been a man concerned with living a full and rewarding life in this world; twentieth-century men and women should do the same. "Life," Barton wrote on another occasion, "is meant to live and enjoy as you go along."

National Communications

The advertising industry could never have had the impact it did but for the emergence of new vehicles of communication that made it possible to reach large audiences quickly and easily. Some of these vehicles were traditional media in changing guises. The number of local newspapers was shrinking rapidly; and those that survived often became members of great national chains—which meant that readers in widely scattered cities were reading the same material in their various newspapers. There was, as well, a growing number of national, mass-circulation magazines—Time, Reader's Digest, The Saturday Evening Post, and others—aimed at the widest possible audience. Fewer and fewer sources of information were serving larger and larger groups of people.

Even more influential in shaping the popular culture of the 1920s was the growing popularity of the movies. Over 100 million people saw films in 1930, as compared to only 40 million in 1922. The addition of sound to motion pictures, beginning with the first "talkie" in 1927—The Jazz Singer with Al Jolson—created nationwide excitement. All across the nation, Americans were watching the same films, idolizing the same screen stars, and absorbing the same set of messages and values.

The most important communications vehicle of all, however, was the only one that was truly new to the 1920s: radio. The first commercial radio station in America, KDKA in Pittsburgh, began broadcasting on election night in 1920; and the first national radio network, the National Broadcasting Company, took form in 1927. By 1923, there were more than 500 radio stations, covering virtually every area of the country; by 1929, more than 12 million families owned radio sets. Broadcasting became the ultimate vehicle for linking the nation together, providing Americans everywhere with instant access to a common source of information and entertainment.

One result of the communications revolution was that America became in the 1920s a society in which fads and obsessions could emerge suddenly and powerfully. Radio helped elevate professional sports, and in particular professional baseball, from the level of limited local activities to that of a national craze. Men and women across the country shared a pleasure in popular stunts—flagpole sitting, marathon dancing, goldfish swallowing. They shared an interest in national sensations, such as the famous murder trial of Leopold and Loeb or the tortuous progress of the Sacco-Vanzetti case. It is not surprising that Frederick Lewis Allen, the celebrated chronicler of the 1920s, referred to the decade as the "ballyhoo years."

Modern Religion

It was not only fads that were engaging the attention of the nation. Americans in the 1920s were being exposed as well to a wide range of new standards of thought and behavior. Such changes affected some of the nation's most basic institutions, among them religion.

The scientific advances of the late nineteenth and early twentieth centuries had by the 1920s already produced profound changes in American theology. Liberal Protestant clergymen in particular had revised religious doctrine in an effort to reconcile traditional faith with the theories of Charles Darwin. Ministers in the progressive era had played an important role in promoting social issues; churches had become not only centers of worship but agents of reform. After World War I, the increasing secularism of American society worked even further changes on both religious faith and religious behavior. Theological modernists—among them Harry Emerson Fosdick and A. C. McGiffert—taught their followers to abandon many of the traditional trappings of religion (literal interpretation of the Bible, belief in the Trinity, attribution of human traits to the deity) and to accept a faith that would help individuals to live more fulfilling lives in the modern world. Critics considered the most advanced forms of this new, more secular Protestantism only one step removed from agnosticism.

The extremes of religious modernism found acceptance among only a relatively few people; Americans remained, by the standards of a later time, highly religious. Even so, changes in popular religious assumptions and patterns were widespread. The sociologists Robert and Helen Merrell Lynd discovered during a study of community life in Muncie, Indiana, for example, that while most people continued to attend church and express a belief in God, their faith was also experiencing important changes. Fewer people seemed to believe in hell; many admitted that they "think of Heaven less than they used to." "One infers," the Lynds reported in their famous study Middletown (1929), "that doubts and uneasiness among individuals may be greater than a generation ago."

The New Woman

The decade of the 1920s was particularly important in redefining the role of women, both within the family and within American society as a whole. The new economy and culture affected women in a number of ways. For many, it meant a turning away from the social activism of the progressive era and into a more personal, private search for satisfaction. For others, it meant new kinds of professional careers. And for still others, it meant political activism, an effort to keep feminism alive as a vital force after the victory of the suffrage crusade.

College-educated women were no longer pioneers in the 1920s. They were forming the second and third generations of graduates of women's or coeducational colleges and universities; and they were occasionally making their presence felt in professional areas that in the past they had rarely penetrated. Whereas middle-class women in the progressive era had generally had to choose between work and family, a substantial group now attempted to combine marriage and careers; 25 percent of all woman workers were married in the 1920s. Still, professional opportunities remained limited by society's assumptions about what were suitable occupations for women. Although there were notable success stories about woman business executives, journalists, doctors, and lawyers, most professional women remained confined to such fields as fashion, education, social work, nursing, and the lower levels of business management.

The "new professional woman" was a vivid and widely publicized image in the 1920s. In reality, however, most employed women were nonprofes-sional, lower-class workers. Middle-class women, in the meantime, remained largely in the home. The number of employed women rose by several million in the 1920s, but the percentage of women employed rose scarcely at all. Society as a whole still had little tolerance for the idea of combining marriage and a career, and women who might otherwise have been inclined to try to do so found little support for their ambitions.

Yet the 1920s nevertheless constituted a new era for non-professional middle-class women. In particular, the decade saw a redefinition of motherhood. In the first years of the twentieth century, most Americans had believed that a woman's principal mission was to bear and raise children. They had assumed that women were uniquely and instinctively qualified for parenthood. After World War I, however, an influential group of psychologists—the "behaviorists," led by John B. Watson—began to challenge such assumptions. Maternal affection was not, they claimed, sufficient preparation for child rearing. Instead, mothers should rely on the advice and assistance of experts and professionals: doctors, nurses, and trained educators in nursery schools and kindergartens.

For many middle-class women, these changes devalued what had been an important and consuming activity. Motherhood and housekeeping continued to occupy a large proportion of most women's time (except for the affluent few able to afford servants). But such work no longer provided satisfaction commensurate with its costs. Some middle-class women turned to professional careers to find fulfillment. Many more, however, attempted to enrich their lives by devoting new attention to their roles as wives and companions. A woman's relationship with her husband assumed a greatly enhanced importance. She increasingly shared in her husband's social life; she devoted attention to cosmetics and seductive clothing in an effort to please her husband; she tried to prevent her children from interfering with the development of the marital relationship. Most of all, perhaps, a woman was encouraged to think of her sexual relationship with her husband not simply as a means of procreation, as earlier generations had been taught, but as an important and pleasurable experience in its own right, as the culmination of romantic love.

Thus it was that the 1920s saw important new advances in the creation of a national birth-control movement. The pioneer of American birth control was Margaret Sanger, who had spent most of her adult life promoting and publicizing new birth-control techniques (especially the diaphragm). At first, she had been principally concerned with birth control for working-class women, believing that large families were among the major causes of poverty and distress in poor communities. By the 1920s (partly because she had limited success in persuading working-class women to accept her teachings), she was becoming more concerned with persuading middle-class women of the benefits of birth control. Women should, Sanger argued, be free to enjoy the pleasures of sexual activity without relation to the bearing of children. Birth-control devices began to find a large market among middle-class women, even though some techniques remained illegal in many states (and abortion remained illegal nearly everywhere).

In some senses, these changes offered women a form of liberation. The declining birth rate meant that many women had to spend fewer years caring for children. The introduction of labor-saving appliances (washing machines, refrigerators, vacuum cleaners) in the home reduced some of the burdens of housework (although not always the amount of time devoted to housework, since standards of cleanliness rose simultaneously). Many middle-class women experienced a significant increase in their leisure time. And the new secular view of womanhood, the new emphasis on woman as part of her husband's social world, meant that no longer did women have to maintain a rigid, Victorian "respectability." They were free to adopt far less inhibited life styles. They could smoke, drink, dance, wear seductive clothes and make-up, and attend lively parties. The popular image of the "flapper"—the modern woman whose liberated life style found expression in dress, hair style, speech, and behavior—became one of the most widely discussed features of the era.

But the changes were not always as liberating as they might appear. For one thing, they generally affected only a relatively small group of middle-class women, leaving the lives of vast numbers of rural and working-class women unchanged. And even among the middle class, the transformation was not without cost. By placing more and more emphasis on their relationships with men, women were increasing their vulnerability to frustration and unhappiness when those relationships proved unsatisfactory. It was not surprising, perhaps, that the national divorce rate climbed dramatically in the 1920s; nor that many women who remained married experienced boredom and restlessness.

The realization that the "new woman" was more myth than reality inspired some American feminists to continue their crusade for reform. The National Woman's party, under the leadership of Alice Paul, pressed onward with its campaign for the Equal Rights Amendment, although members found little support in Congress (and met continued resistance from other feminist groups such as the League of Women Voters). The campaign for the ERA made little headway in the 1920s.

Feminists of another sort won an apparent triumph in 1921 when they helped secure passage of a measure more in keeping with the traditional goal of securing "protective" legislation for women: the Sheppard-Towner Act. It provided federal funds to states to establish prenatal and child health-care programs. From the start, however, it produced controversy both inside and outside women's ranks. Alice Paul and her supporters opposed the measure, complaining that it classified all women as mothers. Margaret Sanger complained that the new programs would discourage birth-control efforts. More important, the American Medical Association fought Sheppard-Towner, warning that it would introduce untrained outsiders into the health-care field, which should remain solely the province of doctors. In 1929, Congress terminated the program. On the whole, feminists willing to challenge the belief that women occupied a separate sphere and had special needs were relatively few in the 1920s. Most feminists continued to accept the idea of a distinct place for women in society; and with suffrage now achieved, many retreated from controversial political efforts and, like many other groups in American life, concentrated instead on working for personal fulfillment.

Education and Youth

The growing secularism of American culture and its expanding emphasis on training and expertise found reflection in the changing role of education, which was beginning to play an increasingly important part in the lives of American youth. The changes were evident in numerous ways. First, more people were going to school in the 1920s than ever before. High-school attendance more than doubled during the decade: from 2.2 million to over 5 million. Enrollment in colleges and universities increased threefold between 1900 and 1930, with much of that increase occurring after World War I. In 1918, there had been 600,000 college students; in 1930, there were 1.2 million, nearly 20 percent of the college-age population. Attendance was increasing as well at trade and vocational schools and in other institutions providing the specialized training that the modern economy demanded. Schools were, in addition, beginning to perform new and more varied functions. Instead of offering simply the traditional disciplines, they were providing training in modern technical skills: engineering, management, economics.

The growing importance of education was contributing as well to the emergence of a separate youth culture. The idea of adolescence as a distinct period in the life of an individual was for the most part new to the twentieth century. It was a result in some measure of the influence of Freudian psychology. But it was a result, too, of society's recognition that a more extended period of training and preparation was necessary before a young person was ready to move into the workplace. Schools and colleges provided adolescents with a setting in which they could develop their own social patterns, their own hobbies, their own interests and activities. An increasing number of students saw school as a place not just for academic training but for organized athletics, other extracurricular activities, clubs, and fraternities and sororities—that is, as an institution that allowed them to define themselves less in terms of their families and more in terms of their peer group.

The Decline of the "Self-Made Man"

The increasing importance of education and the changing nature of adolescence underscored one of the most important changes in American society: the gradual disappearance of the reality, and to some degree even of the ideal, of the "self-made man." The belief that any person could, simply through hard work and innate talent, achieve wealth and renown had always been largely a myth; but it had had enough basis in reality to remain a convincing myth for generations. By the 1920s, however, it was becoming difficult to believe any longer that success was possible without education and training. "The self-made manager in business," wrote Century Magazine in 1925, "is nearing the end of his road. He cannot escape the relentless pursuit of the same forces that have eliminated self-made lawyers and doctors and admirals."

The "Doom of the Self-Made Man," as Century described it, was a difficult development for Americans to accept. It suggested that individuals were no longer entirely in control of their own destinies, that a person's future depended in large part on factors over which he or she had only limited control. And like many of the other changes of the decade, many Americans greeted this one with marked ambivalence. These mixed feelings were reflected in the identity of three men who became the most widely admired heroes of the New Era: Thomas Edison, the inventor of the electric light bulb and many other technological marvels; Henry Ford, the creator of the assembly line and one of the founders of the automobile industry; and Charles Lindbergh, the first aviator to make a solo flight across the Atlantic Ocean. All received the adulation of much of the American public. Lindbergh, in particular, became a national hero the like of which the country had never seen before.

The reasons for their popularity indicated much about how Americans viewed the new epoch in which they were living. On the one hand, all three men represented the triumphs of the modern technological and industrial society. On the other hand, all three had risen to success without the benefit of formal education and largely through their own private efforts. They were, it seemed, genuinely self-made men. Even many Americans who were happily embracing a new society and a new culture were doing so without entirely diverting their gaze from a simpler past.

The Disenchanted

To a generation of artists and intellectuals coming of age in the 1920s, the new society in which they lived was even more disturbing. Many were experiencing a disenchantment with modern America so fundamental that they were often able to view it only with contempt. As a result, they adopted a role sharply different from that of most intellectuals of earlier eras. Rather than involving themselves with their society's popular or political culture and attempting to influence and reform the mass of their countrymen, they isolated themselves and embarked on a restless search for personal fulfillment. Gertrude Stein once referred to the young Americans emerging from World War I as a "Lost Generation." For many writers and intellectuals, at least, it was an apt description.

At the heart of the Lost Generation's critique of modern society was a sense of personal alienation, a belief that contemporary America no longer provided the individual with avenues by which he or she could achieve personal fulfillment. Modern life, they argued, was cold, impersonal, materialistic, and thus meaningless. The sensitive individual could find no happiness in the mainstream of American society.

This disillusionment had its roots in many things, but in nothing so deeply as the experience of World War I. To those who had fought in France and experienced the horror and savagery of modern warfare—and even to those who had not fought but who nevertheless had been aware of the appalling costs of the struggle—the aftermath of the conflict was shattering. Nothing, it seemed, had been gained. The war had been a fraud; the suffering and the dying had been in vain. Ernest Hemingway, one of the most celebrated (and most commercially successful) of the new breed of writers, expressed the generation's contempt for the war in his novel A Farewell to Arms (1929). Its hero, an American officer fighting in Europe, decides that there is no justification for his participation in the conflict and deserts the army with a nurse with whom he has fallen in love. Hemingway made it clear that his hero was to be admired for doing so.

At least equally dispiriting was the character of the nation these young intellectuals found on their return home at war's end. It was, they believed, a society utterly lacking in vision or idealism, obsessed with materialism, steeped in outmoded, priggish morality. Worst of all, it was one in which the individual had lost the ability to control his or her own fate. It was a sleek, new, industrialized and professionalized world that was organized in a dehumanizing way.

One result of this alienation was a series of savage critiques of modern society by a wide range of writers, some of whom were often described as the "debunkers." Particularly influential was the Baltimore journalist H. L. Mencken. In the pages of his magazines, first the Smart Set and later the American Mercury, he delighted in ridiculing everything Americans held dear: religion, politics, the arts, even democracy itself. He found it impossible to believe, he claimed, that "civilized life was possible under a democracy," because it was a form of government that placed power in the hands of the common people, whom he ridiculed as the "booboisie." When someone asked Mencken why he continued to live in a society he found so loathsome, he replied: "Why do people go to the zoo?"

Echoing Mencken's contempt was the novelist Sinclair Lewis, the first American to win a Nobel Prize in literature. In a series of savage novels, he lashed out at one aspect of modern society after another. In Main Street (1920), he satirized life in a small Midwestern town (much like the one in which he himself had grown up). In Babbitt (1922), he ridiculed life in the modern city. Arrowsmith (1925) attacked the medical profession (and by implication professionalism in general). Elmer Gantry (1927) satirized popular religion.

To those who held the values of their society in such contempt, the standard avenues for advancement held little appeal. Intellectuals of the 1920s turned their backs on the traditional goals of their parents. They claimed to reject the "success ethic" that they believed dominated American life (even though many of them hoped for—and a few achieved—commercial and critical success on their own terms). F. Scott Fitzgerald, whose first novel, This Side of Paradise (1920), established him as a spokesman for his generation, ridiculed the American obsession with material success in The Great Gatsby (1925). The novel's hero, Jay Gatsby, spends his life accumulating wealth and social prestige in order to win the woman he loves. The world to which he has aspired, however, turns out to be one of pretension, fraud, and cruelty, and Gatsby is ultimately destroyed by it. Fitzgerald and his intellectual contemporaries claimed to want nothing to do with conventional American society (although Fitzgerald himself seemed at the same time desperately to crave acceptance by it). They chose, instead, to search elsewhere for fulfillment.

A Refuge in Art

Their quest took them in several different directions, often at the same time. Many Lost Generation intellectuals left America to live in France, making Paris for a time a center of American artistic life. Some adopted hedonistic life styles, indulging in conspicuous debauchery: drinking, drugs, casual sex, wild parties, and a generally flamboyant way of life. (The publicity they received helped set the tone for other less alienated members of their generation, who began to imitate this uninhibited pursuit of pleasure.) Many intellectuals resorted to an outspoken self-absorption, openly repudiating any responsibility for anyone but themselves. For most of these young men and women, however, the only real refuge from the travails of modern society was art—not art for any social purpose, but art for its own sake. Only art, they argued, could allow them full individual expression; only the act of creation could offer them fulfillment.

The result of this quest for fulfillment through art was not, for the most part, personal satisfaction for the writers and artists involved. They remained throughout the 1920s a restless, usually unhappy generation, searching in vain for contentment. They did, however, produce a body of work that made the decade one of the great eras of American art. Most notable were the writers: Hemingway, Fitzgerald, Lewis, as well as others such as Thomas Wolfe, John Dos Passos, Ezra Pound, Gertrude Stein, and Eugene O'Neill—the first great American playwright and the only one ever to win a Nobel Prize. T. S. Eliot, a native of St. Louis who spent most of his adult life in England, led a generation of poets in breaking with the romanticism of the nineteenth century. His epic work The Waste Land (1922) brought to poetry much of the harsh tone of despair that was invading other areas of literature.

The writers of the 1920s were notable not only for the effectiveness of their critiques but for their success in pioneering new literary styles and techniques. Some incorporated Freudian psychology into their work, using literature to explore the workings of the psyche as well as the external actions of characters. Others produced innovations in form, structure, and dialogue: Ernest Hemingway, with his spare, clean prose; Sinclair Lewis, with his biting satire; John Dos Passos, with his use of the techniques of journalism as well as of literature. The literature of the 1920s was escapist; but it was also intensely creative, even revolutionary.

Other Visions

Not all intellectuals of the 1920s, however, expressed such total alienation and despair. Some expressed reservations about their society not by withdrawing from it but by advocating reform. Older progressive theorists continued to expound the values they had celebrated in the years before the war. Thorstein Veblen, for example, continued to attract a wide audience with his argument that modern society should adopt the "discipline of the machine" and assign control to engineers and technocratic experts. John Dewey remained influential with his appeals for "practical" education and experimentation in social policy. Charles and Mary Beard, perhaps the most influential historians of their day, also promoted progressive principles. In their book The Rise of American Civilization (1927), they stressed economic factors in tracing the development of modern society and suggested the need for social and economic planning.

These progressive intellectuals were often harshly critical of the society of the 1920s; yet they were, indirectly, legitimizing some of its most important features. Society was not, they were saying, excessively routinized and disciplined, as members of the Lost Generation were complaining. If anything, it was not disciplined and organized enough.

To another group of intellectuals, the solution to contemporary problems lay neither in escapism nor in progressivism, but in an exploration of their own regional or cultural origins. In New York City, a new generation of black intellectuals created a flourishing Afro-American culture widely described as the "Harlem Renaissance." The Harlem poets, novelists, and artists drew heavily from their African roots in an effort to prove the richness of their own racial heritage (and not incidentally to prove to the white race that the black was worthy of respect). The poet Langston Hughes captured much of the spirit of the movement in a single sentence: "I am a Negro—and beautiful." Other black writers in Harlem and elsewhere—James Weldon Johnson, Countee Cullen, Zora Neale Hurston, Claude McKay, Alain Locke—as well as emerging black artists and musicians helped to establish a thriving culture rooted in the historical legacy of their race.

A similar effort was under way among an influential group of Southern intellectuals. Known first as the "Fugitives" and later as the "Agrarians," these young poets, novelists, and critics sought to counter the depersonalization of industrial society by evoking the strong rural traditions of their own region. In their controversial manifesto I'll Take My Stand (1930), a collection of twelve essays by twelve Southern intellectuals, they issued a simultaneously radical and conservative appeal for a rejection of the doctrine of "economic progress" and the spiritual debilitation that had accompanied it. The supposedly "backward" South, they argued, could serve as a model for a nation drunk with visions of limitless growth and modernization.

One of the greatest of all American writers of this era also expressed the Southerner's strong sense of place and of cultural heritage. William Faulkner, in a remarkable series of novels set in the fictional Mississippi county of Yoknapatawpha—The Sound and the Fury (1929), Absalom, Absalom! (1936), and others—was, like many of his contemporaries, concerned with the problems of the individual seeking fulfillment in the modern world. But unlike others, he painstakingly re-created the bonds of region, family, and community, rather than expressing a detachment from society.

A Conflict of Cultures

The modern, secular culture of the 1920s was not unchallenged. It grew up alongside an older, more traditional culture, with which it continually and often bitterly competed. The new culture reflected the values and aspirations of an affluent, largely urban middle class, committed to a new, increasingly uninhibited life style and linked to a national cultural outlook. The older culture expressed the outlook of generally less affluent, less urban, more provincial Americans—men and women who continued to revere traditional values and customs and who feared and resented the modernist threats to their way of life. Beneath the apparent stability of the New Era and its celebrated business civilization, therefore, raged a series of harsh cultural controversies.

Prohibition

When the prohibition of the sale and manufacture of alcohol went into effect in January 1920, it had the support of most members of the middle class and most of those who considered themselves progressives. Within a year, however, it had become clear that the "noble experiment," as its defenders called it, was not working well. For a time, prohibition did seem substantially to reduce drinking, at least in many regions of the country. But it also produced conspicuous and growing violations that made the law an almost immediate source of disillusionment and controversy.

The first prohibition commissioner promised rigorous enforcement of the new law. But violations were soon so rampant that the resources available to him proved ludicrously insufficient. The government hired only 1,500 agents to do the job. Before long, it was almost as easy to acquire illegal alcohol in much of the country as it had once been to acquire legal alcohol. More disturbing than the laughable ineffectiveness of the law, however, was the role prohibition played in stimulating organized crime. An enormous, lucrative industry was now barred to legitimate businessmen; underworld figures quickly and decisively took it over. In Chicago, Al Capone built a vast criminal empire based largely on illegal alcohol. He guarded it against interlopers with an army of as many as 1,000 gunmen, whose zealousness contributed to the violent deaths of more than 250 people in the city between 1920 and 1927. Other regions produced gangsters and gang wars of their own. Prohibition, in short, became not only a national joke but a national scandal.

Nevertheless, it survived. The middle-class progressives who had originally supported prohibition may have lost interest; but an enormous constituency of provincial, largely rural, overwhelmingly Protestant Americans continued vehemently to defend it. To them, drinking and the general sinfulness with which they associated it were an assault on their conservative code of morality. Prohibition had taken on implications far beyond the issue of drinking itself. It had come to represent the effort of an older America to maintain its dominance in a society that was moving forward in spite of it.

As the decade proceeded, opponents of prohibition (or "wets," as they came to be known) gained steadily in influence. Not until 1933, however, when the Great Depression added weight to their appeals, were they finally able effectively to challenge the "drys" and win repeal of the Eighteenth Amendment.

Nativism and the Klan

Hostility to immigrants was not new to the 1920s. Nor was it restricted to the defenders of traditional, provincial society. Like prohibition, agitation for a curb on immigration had begun in the nineteenth century; and like prohibition, it had gathered strength in the years before the war largely because of the support of middle-class progressives. Such concerns had not been sufficient in the first years of the century to win passage of curbs on immigration; but when, in the years immediately following the war, immigration began to be associated with radicalism, popular sentiment on behalf of restriction grew rapidly. In 1921, therefore, Congress passed an emergency immigration act, establishing a quota system by which annual immigration from any country could not exceed 3 percent of the number of persons of that nationality who had been in the United States in 1910. The new law cut immigration from 800,000 to 300,000 in a single year, but the nativists remained unsatisfied. In 1924, Congress enacted an even harsher law: the National Origins Act, which banned immigration from east Asia entirely (deeply angering Japan) and reduced the quota for Europeans from 3 to 2 percent. The quota would be based, moreover, not on the 1910 census, but on the census of 1890, a year in which there had been far fewer southern and eastern Europeans in the country. What immigration there was, in other words, would heavily favor northwestern Europeans—people of "Nordic" or "Teutonic" stock. The 1924 act cut the yearly flow almost in half, to 164,000. Five years later, a further restriction set a rigid limit of 150,000 immigrants a year. In the years that followed, immigration officials seldom permitted even half that number actually to enter the country.

The legislative expression of nativism reflected largely the doctrines of progressivism, even if a harsh and narrow progressivism. Restricting immigration, its proponents believed, would contribute to the efficient and productive operation of society. There were, however, other expressions of nativism that reflected very different sentiments. To defenders of an older, more provincial America, the growth of large communities of foreign peoples, alien in their speech, their habits, and their values, came to seem a direct threat to their own embattled way of life. This provincial nativism took a number of forms. But the most prominent was the resurgence of the Ku Klux Klan as a major force in American life.

The Klan was originally the product of the first years after the Civil War. That early organization had died in the 1870s. But in 1915, shortly after the premiere of the film The Birth of a Nation, which celebrated the early Klan, a new group of Southerners gathered on Stone Mountain outside Atlanta, Georgia, to establish a modern version of the society. At first the new Klan, like the old, was largely concerned with intimidating blacks, who were, Klan leader William J. Simmons claimed, becoming dangerously insubordinate. After World War I, however, concern about blacks gradually became secondary to concern about Catholics, Jews, and foreigners. The Klan would devote itself, its leaders proclaimed, to purging American life of impure, alien influences.

It was then that the modern Klan experienced its greatest growth. Membership in the small towns and rural areas of the South soon expanded dramatically; more significantly, the Klan was now spreading northward, establishing a strong foothold particularly in the industrial cities of the Midwest. By 1923, there were reportedly 3 million members; by 1924, 4 million.

In some communities, where Klan leaders came from the most "respectable" segments of society, the organization operated much like a fraternal society, engaging in nothing more dangerous than occasional political pronouncements. Often, however, the Klan also operated as a brutal, even violent, opponent of "alien" groups and as a defender of traditional, fundamentalist morality. Klansmen systematically terrorized blacks, Jews, Catholics, and foreigners: boycotting their businesses, threatening their families, and attempting to drive them out of their communities. Occasionally, they resorted to violence: public whipping, tarring and feathering, arson, and lynching.

What the Klan most deeply feared, it soon became clear, was not simply "foreign" or "racially impure" groups; it was anyone who posed a challenge to traditional values. Klansmen persecuted not only immigrants and blacks but those white Protestants they considered guilty of irreligion, sexual promiscuity, or drunkenness. The Klan worked to enforce prohibition; it attempted to institute compulsory Bible reading in schools; it worked to punish divorce. The Ku Klux Klan, in short, was fighting not just to preserve racial homogeneity but to defend a traditional culture against the values and morals of modernity. The organization itself began to decline in influence after 1925, when a series of internal power struggles and several sordid scandals discredited some of its most important leaders. The issues it had raised, however, retained strength among some Americans for many years.

Religious Fundamentalism

Another great cultural controversy of the 1920s revealed even more starkly the growing gulf between the new culture and the old. It was a bitter conflict over questions of religious doctrine and, even more, over the place of religion in contemporary society. By 1921, American Protestantism was already divided into two warring camps. On one side stood the modernists: mostly urban, middle-class people who had attempted to adapt religion to the teachings of modern science and to the realities of their modern, secular society. On the other side stood the fundamentalists: provincial, largely (although not exclusively) rural men and women, fighting to preserve traditional faith and to maintain the centrality of religion in American life. The fundamentalists looked with horror at the new morality of the modern city. (They formed a substantial part of the constituency defending prohibition in the 1920s.) They expressed outrage at the abandonment of traditional beliefs in the face of scientific discoveries, insisting that the Bible was to be interpreted literally. Above all, they opposed the teachings of Charles Darwin, who had openly challenged the biblical story of the Creation. Human beings had not evolved from lower orders of animals, the fundamentalists insisted. They had been created by God, as described in Genesis.

Fundamentalism had been growing in strength in American Protestantism since the 1870s, but for many years it had found expression chiefly within the denominations. Yet it was also an evangelical movement, interested in spreading its doctrine to new groups. Evangelists, among them the celebrated Billy Sunday, traveled from state to state (particularly in the South and parts of the West) attracting huge crowds to their revival meetings.

Protestant modernists looked on much of this activity with condescension and amusement. But by the mid-1920s evangelical fundamentalism was beginning to take a form that many regarded with real alarm. In a number of states, fundamentalists were gaining political strength with their demands for legislation to forbid the teaching of evolution in the public schools. To the modernists, such laws were almost unthinkable. Darwinism had to them become indisputable scientific fact; to forbid the teaching of evolution, they believed, would be like forbidding teachers to tell their students that the world was round. Yet they watched with incredulity as one state after another seriously considered the fundamentalist demands. In Tennessee in March 1925, the legislature actually adopted a measure making it illegal for any public school teacher "to teach any theory that denies the story of the divine creation of man as taught in the Bible."

The result was one of the most celebrated events of the decade. When the American Civil Liberties Union offered free counsel to any Tennessee educator willing to defy the law and become the defendant in a test case, a twenty-four-year-old biology teacher in the town of Dayton, John T. Scopes, arranged to have himself arrested. And when the ACLU decided to send the famous attorney Clarence Darrow to defend Scopes, the aging William Jennings Bryan (now an important fundamentalist spokesman) announced that he would travel to Dayton to assist the prosecution. Journalists from across the country, among them H. L. Mencken, flocked to Tennessee to cover the trial, which opened in an almost circus atmosphere. Scopes had, of course, clearly violated the law; and a verdict of guilty was a foregone conclusion, especially when the judge refused to permit "expert" testimony by evolution scholars. Scopes was fined $100, and the case was ultimately dismissed in a higher court because of a technicality. Nevertheless, Darrow scored an important victory for the modernists by calling Bryan himself to the stand to testify as an "expert on the Bible." In the course of the cross-examination, Darrow made Bryan's churlish defense of biblical truths appear increasingly foolish and finally tricked him into admitting the possibility that not all religious dogma was subject to only one interpretation.

The Scopes trial did not resolve the conflict between fundamentalists and modernists. Indeed, four other states soon proceeded to pass antievolution laws of their own. The issue continued to smolder for decades until it emerged in full force once again in the form of the creationist movement of the early 1980s.

The Democrats' Ordeal

The anguish of provincial Americans attempting to defend an embattled way of life proved particularly troubling to the Democratic party, which suffered a serious debilitation during the 1920s as a result of tensions between its urban and rural factions. Far more than the Republicans, the Democrats consisted of a diverse coalition of interest groups, linked more by local tradition than by common commitment. Among those interest groups were prohibitionists, Klansmen, and fundamentalists on one side and Catholics, urban workers, and immigrants on the other.

In 1924, the tensions between them proved devastating. At the Democratic National Convention in New York that summer, bitter conflict broke out over the platform when the party's urban wing attempted to win approval of planks calling for the repeal of prohibition and a denunciation of the Klan. Both planks narrowly failed. More serious was a deadlock in the balloting for a presidential candidate. Urban Democrats supported Alfred E. Smith, the Irish Catholic Tammanyite who had risen to become a progressive governor of New York; rural Democrats backed William McAdoo, Woodrow Wilson's Treasury secretary (and son-in-law), later to become a senator from California, who had skillfully positioned himself to win the support of Southern and Western delegates suspicious of Tammany Hall and modern urban life. For 103 ballots, the convention dragged on, until finally both Smith and McAdoo withdrew and the party settled on a compromise: the corporate lawyer John W. Davis.

In the years that followed, the schism between the two wings of the party continued to plague the Democrats. In 1928, Al Smith finally did manage to secure his party's nomination for president after another acrimonious but less prolonged battle. He was not, however, able to unite his divided party. As a result, he became the first Democrat since the Civil War to fail to carry the South. (He won only six of the eleven states of the former Confederacy.) Elsewhere, although he did well in the large cities, he carried no states at all except Massachusetts and Rhode Island.

Smith's opponent, and the victor in the presidential election, was a man who perhaps more than any other personified the modern, prosperous, middle-class society of the New Era: Herbert Hoover. The business civilization of the 1920s, with its new institutions, fashions, and values, continued to arouse the animosity of large portions of the population; but the majority of the American people appeared to have accepted and approved it. In 1928, at least, the New Era seemed to be permanently enshrined—as the success of the Republican party, its political embodiment, suggested.

Republican Government

For twelve years, beginning in 1921, both the presidency and the Congress rested securely in the hands of the Republican party—a party in which the power of reformers had greatly dwindled since the heyday of progressivism before the war. For most of those years, the federal government expressed a profound conservatism and enjoyed a warm and supportive relationship with the American business community. Yet the government of the New Era was more than the passive, pliant instrument that critics often described. It attempted to serve in many respects as an active and powerful agent of economic change.

Warren G. Harding

Nothing seemed more clearly to illustrate the death of crusading idealism in the 1920s than the characters of the two men who served as president during most of the decade: Warren G. Harding and Calvin Coolidge.

Harding was elected to the presidency in 1920, having spent many years in public life doing little of note. He had advanced from the editorship of a newspaper in his hometown of Marion, Ohio, to the state legislature by virtue of his good looks, polished speaking style, and geniality. He had moved from there to the United States Senate as a result of his party regularity. And he had moved from there to the White House as a result of a political agreement among leaders of his party who considered him, as one noted, a "good second-rater."

The new president had few illusions about his own qualifications for office. Awed by his new responsibilities, he made sincere efforts to perform them with distinction. He appointed capable men to the most important cabinet offices; he attempted to stabilize the nation's troubled foreign policy; and he displayed on occasion a vigorous humanity, as when he pardoned socialist Eugene V. Debs in 1921. Even as he attempted to rise to his office, however, he exhibited a sense of bafflement about his situation, as if he recognized his own unfitness. "I am a man of limited talents from a small town," he reportedly told friends on one occasion. "I don't seem to grasp that I am President." Unsurprisingly, perhaps, Harding soon found himself delegating much of his authority to others: to members of his cabinet, to political cronies, to Congress, to party leaders. In the meantime, the nation's press, overwhelmingly Republican, was portraying him as a wise and effective leader.

Harding's personal weaknesses as much as his political naivete finally resulted in his demise. He realized the importance of capable subordinates in an administration in which the president himself was reluctant to act. At the same time, however, he lacked the strength to abandon the party hacks who had helped create his political success. One of them, Harry Daugherty, the Ohio party boss principally responsible for his meteoric political ascent, he appointed attorney general. Another, Albert B. Fall, he made secretary of the interior. Members of the so-called Ohio Gang filled important offices throughout the administration. It was widely known within the government that the president's cronies led active, illicit social lives; that they gathered nightly at the famous "House on K Street" to drink illegal alcohol, play poker, and entertain attractive women; and that the president himself often joined in all these activities.

The Harding Scandals

What remained for a time generally unknown was that Daugherty, Fall, and others were engaged in a widespread pattern of fraud and corruption. They sold government offices and favors, bribed congressmen and senators to support legislation favorable to their interests, and plundered the agencies and departments in which they worked.

The most spectacular scandal involved the rich naval oil reserves at Teapot Dome, Wyoming, and Elk Hills, California. At the urging of Albert Fall, Harding transferred control of those reserves from the Navy Department to the Interior Department. Fall then secretly leased them to two wealthy businessmen—Harry F. Sinclair and Edward L. Doheny—and received in return nearly half a million dollars in "loans" to ease his private financial troubles. Fall was ultimately convicted of bribery and sentenced to a year in prison; Harry Daugherty barely avoided a similar fate for his part in another scandal.

For several years, apparently, Harding himself remained generally unaware of the corruption infecting his administration. But by the summer of 1923, only months before Senate investigations and press revelations brought the scandals to light, he began to realize how desperate his situation had become. Tired and depressed, the president left Washington for a speaking tour in the West and a visit to Alaska. In Seattle late in July, he suffered severe pain, which his doctors wrongly diagnosed as food poisoning. A few days later, he seemed to rally and traveled on to San Francisco. There, on August 2, he died. He had suffered two major heart attacks.

Calvin Coolidge

In many ways, Calvin Coolidge, who succeeded to the presidency on the death of Harding, was utterly different from his predecessor. Where Harding was genial and garrulous, Coolidge was dour and silent. Where Harding adopted a loose, even debauched life style, Coolidge lived soberly and puritanically. And while Harding was, if not personally corrupt, then at least tolerant of corruption in others, Coolidge was honest beyond reproach. The image of stolid respectability that he projected was so unassailable that the Republican party managed to avoid any lasting damage from Teapot Dome and related scandals. In other ways, however, Harding and Coolidge were similar figures. Both represented not soaring ideals but an unadventurous conservatism. Both took a passive approach to their office.

Like Harding, Coolidge rose to the presidency on the basis of few substantive accomplishments. During his years in Massachusetts politics, he had won a reputation as a safe, trustworthy figure; and largely as a result of that, he had become governor in 1919. His response to the Boston police strike won him national attention and, in 1920, his party's vice-presidential nomination. Three years later, news of Harding's death reached him in Vermont; and there, by the light of a kerosene lamp on a kitchen table, he took the oath of office from his father, a justice of the peace.

If anything, Coolidge was an even less active president than Harding, partly as a result of his conviction that government should interfere as little as possible in the life of the nation and partly as a result of his own personal lassitude. He took long naps every afternoon. He kept official appointments to a minimum and engaged in little conversation with those who did manage to see him. He proposed no significant legislation and took little part in the running of the nation's foreign policy. "He aspired," wrote one of his contemporaries, "to become the least President the country ever had. He attained his desire."

In 1924, he received his party's presidential nomination virtually unopposed. Running against Democrat John W. Davis, a wealthy corporate lawyer who had served in the Wilson administration, he won a comfortable victory: 54 percent of the popular vote and 382 of the 531 electoral votes. Robert La Follette, the candidate of the reincarnated Progressive party, received 16 percent of the popular vote but carried only his home state of Wisconsin. Coolidge's negative, custodial view of the presidency clearly had the approval of the great majority of the American people. Four years later, it still did. The president could probably have won renomination and reelection easily in 1928. Instead, in characteristically laconic fashion, he walked into a press room one day and handed each reporter a slip of paper containing a single sentence: "I do not choose to run for president in 1928."

Government and Business

The story of Harding and Coolidge themselves, however, is only a part—and by no means the most important part—of the story of their administrations. However inert the New Era presidents may have been, much of the federal government was working effectively and efficiently during the 1920s to adapt public policy to the widely accepted goal of the time: helping business and industry to operate with maximum efficiency and productivity. The close relationship between the private sector and the federal government that had been forged during World War I continued.

In the executive branch, the most active efforts came from members of the cabinet. Secretary of the Treasury Andrew Mellon, the wealthy steel and aluminum tycoon who became one of the most influential and respected figures in government, devoted himself to working for substantial reductions in taxes on corporate profits and personal incomes and inheritances. Largely because of his efforts, Congress cut them all by more than half. The result, Mellon claimed, would be to stimulate investment and ensure general prosperity. Mellon also worked closely with President Coolidge after 1924 on a series of measures to trim dramatically the already modest federal budget. The administration even managed to retire half the nation's World War I debt.

The most prominent member of the cabinet was Commerce Secretary Herbert Hoover, who considered himself, and was considered by others, a notable progressive. Hoover was active in so many areas that he often seemed to be running the entire federal government single-handedly. He used his position to promote a better organized, more efficient national economy. Only thus, he claimed, could the nation hope to fulfill its most important task: the elimination of poverty.

During his eight years in the Commerce Department, Hoover constantly encouraged voluntary cooperation in the private sector as the best avenue to stability. But the idea of voluntarism did not require the government to remain passive; on the contrary, public institutions, Hoover believed, had a duty to play an active role in creating the new, cooperative order. Above all, he became the champion of the concept of business associationalism—a concept that envisioned the creation of national organizations of businessmen in particular industries. Through such trade associations, private entrepreneurs could, Hoover believed, stabilize their industries and promote efficiency in production and marketing. Hoover strongly resisted those who urged that the government sanction collusion among manufacturers to fix prices, arguing that competition was essential to a prosperous economy. He did, however, believe that shared information and limited cooperation would keep that competition from becoming destructive and thus improve the strength of the economy as a whole.

The Supreme Court in the 1920s further confirmed the business orientation of the federal government, particularly after the appointment of William Howard Taft as chief justice in 1921. The Court struck down federal legislation regulating child labor (Bailey v. Drexel Furniture Company, 1922); it nullified a minimum wage law for women in the District of Columbia (Adkins v. Children's Hospital, 1923); and it sanctioned the creation of trade associations, ruling in Maple Flooring Manufacturers Association v. United States (1925) that such organizations did not violate antitrust statutes as long as some competition survived within an industry. Five years earlier, in United States v. U.S. Steel, the Court had applied the same doctrine to the monopolistic United States Steel Corporation; there was no illegal "restraint of trade," it ruled, as long as U.S. Steel continued to face any competition, no matter how slight.

The probusiness policies of the Republican administrations were not without their critics. There survived in Congress throughout the 1920s a large and influential group of progressive reformers of the old school, whose vision of public power as an antidote to private privilege remained very much alive. They continued to criticize the monopolistic practices of big business, to attack government's alliance with the corporate community, to decry social injustices, and to call for economic reform. Occasionally, they were able to mobilize enough support to win congressional approval of progressive legislation, most notably the McNary-Haugen plan for farmers and an ambitious proposal to use federal funds to develop public electric power projects on the Tennessee River at Muscle Shoals. But the progressive reformers were clearly no longer the dominant force in American political life. When the president vetoed the legislation they had promoted, as he almost always did, they lacked the strength to override him.

Some progressives derived encouragement from the results of the 1928 election, which elevated Herbert Hoover—widely regarded as the most progressive member of the Harding and Coolidge administrations—to the presidency. Hoover easily defeated Alfred Smith, the Democratic candidate. And he entered office promising bold new efforts to solve the nation's remaining economic problems. But Hoover had scant opportunity to demonstrate his commitment to extending American prosperity to those who had not shared in it: less than a year after his inauguration, the nation plunged into the severest and most prolonged economic crisis in its history—a crisis that brought many of the optimistic assumptions of the New Era crashing to the ground and launched the nation into a period of unprecedented social innovation and reform.

 


 