
More Equal Than Others:
America from Nixon to the New Century
Godfrey Hodgson




Chapter 1

STATE OF THE UNION

And so we have gone on, and so we will go on, puzzled and prospering beyond example in the history of man.
--Thomas Jefferson to John Adams, January 21, 1812

IN THE YEAR 1975, the mood of the United States was perplexed, morose, and uncertain. For the first time in the modern era, the nation had lost a war. For the first time, a president had been driven from office in disgrace. It was said that the American Dream would be denied to many, because for the first time a generation of Americans would be worse off than their parents.1

For the first time, Americans, "people of plenty,"2 used to a culture of abundance, confronted the prospect of not being self-sufficient in energy and in some key raw materials.3 A quarter of a century later, the national mood was buoyant. The Soviet Union had disintegrated, and its communist creed was utterly discredited. Americans had not so much come to agree with one another about politics as to lose interest in the political process. By the end of the twentieth century there was an echo in the air of that epoch in nineteenth-century French politics whose motto was "enrich yourselves."4 Americans were busy enriching themselves, and a significant minority did so to impressive effect.

Most striking of all was the transformation of the nation's public philosophy from liberal to conservative. By the 1990s, few cared to identify themselves as liberal. In 1992, for example, a poll showed that only 20 percent of the voters regarded themselves as liberals, compared with 31 percent who identified themselves as conservatives.5 Whereas in 1973 only 32 percent agreed with the proposition that "the best government is the government that governs the least," by 1998, 56 percent agreed.6 In the middle 1970s Americans were just coming to the conclusion--painful for some, liberating for others--that the ideas of the New Deal had served their time.7 People were turning from the warm inclusiveness of the liberal consensus to the energizing astringency of a new conservative philosophy. By the 1980s free-market capitalism was being enthroned, not just as a useful system for wealth creation that needed to be kept under watchful scrutiny, but also as one of the twin keys, with democracy, to the American belief system and the American future.8

In politics and journalism, and in a welter of what can only be called corporate propaganda, the idea was ceaselessly reiterated that giant corporations and the stock exchange were the true democracy, and that anyone who dared to challenge their hegemony was no friend to the working families of America, but an elitist and, as such, a traitor to the nation's best traditions. Thus a college professor or working journalist living on a few thousand dollars a month was condemned as an oppressor, whereas a CEO, who paid himself--with the connivance of an intimidated or collusive remuneration committee--several hundred times as much as his average employee, was held up to sycophantic praise as the working man's friend.9

By the late 1990s Americans had put the hesitations of the 1970s far behind them. They had made it, the majority felt. Whatever private fears or misgivings individuals might have, the public mood was robustly confident. It was never more trenchantly expressed than in the State of the Union Message with which President Bill Clinton celebrated the millennium on January 27, 2000, his last after eight helter-skelter years in the White House.10

His tone was triumphant, not to say triumphalist. It was as if the United States had finally attained a state of economic and social perfection, nirvana now.11 We were, he said to an applauding audience of senators and members of Congress, "fortunate to be alive at this moment in history." Never before, Clinton went on, "has our nation enjoyed, at once, so much prosperity and social progress with so little internal crisis and so few external threats." Never before, he said, "have we had such a blessed opportunity to build the more perfect union of our founders' dreams."

This was, of course, political rhetoric. But it was not mere idle bragging. The United States really did enter the new millennium with impressive achievements to look back on, and exciting prospects for the future. The economy seemed to have overcome all hesitations. Clinton could claim, with some exaggeration, that under his administration the nation had achieved the "fastest economic growth in more than 30 years." In sober truth, the economy really had created some 20 million new jobs in a few years. As a consequence, the number of poor people had fallen to the lowest figure in twenty years and the unemployment rate to the lowest level in thirty years. What was more, unemployment for African Americans and for Hispanics was also lower than ever before. For the first time in forty-two years, there had been budget surpluses in two consecutive years, and--the president accurately predicted--in the very next month America would achieve the longest period of economic growth in its entire history.

That was not all. Economic revolution, Clinton claimed, had been matched by what he called "a revival of the American spirit." That might be hard to measure. But crime, for example, was said to be down by 20 percent, to its lowest level in twenty-five years. The number of children born to teenage mothers had fallen for seven years in a row, and the welfare rolls, those stubborn indicators of hidden misery, had been cut in half. Such statistical measures of societal, let alone spiritual, advancement are always suspect. But at home there certainly was a widespread sense of pride and optimism.

Abroad, the United States in 2000 was dominant in the world as never before. American military power was arguably greater, relative to all possible rivals, than ever. Even in 1945 the Soviet Union had formidable military forces under arms. At the beginning of the millennium the United States stood unchallenged. It had put down such truculent breakers of the peace as Iraq and Serbia with little help from allies and with few casualties. Less than two years later, American military supremacy was challenged once again, this time by terrorism. Once again, at least so far as war in Afghanistan and Iraq was concerned, it was confirmed.

The American economy, at least temporarily, had outdistanced former rivals in Europe and Asia.12 Moreover, never before, unless perhaps in the first few years after World War II, had America been so much admired around the world. American fashions, American music, even American movies, were seen as the last thing in cool. America basked in the prestige earned, from Budapest to Bangalore, by American domination of the new frontiers of computing, information technology, and the Internet. The continued rise of the stock market seemed to confirm that the American economy defied the law of economic gravity.

President Clinton was quick to claim some of the credit for these achievements for his own administration. To renewed applause, he said, "We have built a new economy," and proceeded to set forth his vision of a social utopia as well: "We will make America the safest big country on earth, pay off the national debt, reverse the process of climatic change, and become at last what the Founding Fathers promised: 'one nation, under God, indivisible, with liberty and justice for all.' " With dollars cascading into the U.S. Treasury in a profusion that was unimaginable when his second term began, commented R. W. Apple Jr. in the New York Times, Clinton spoke "with all the expansiveness of a man who had hit the lottery."13

The president's euphoria was no doubt sharpened by relief, and his triumphant tone by an understandable desire to have his revenge on ruthlessly vindictive political opponents. Less than a year previously, after all, he had been acquitted in an impeachment trial before the United States Senate, the first chief executive to face that humiliating ordeal since Andrew Johnson more than 130 years earlier. He would not have been human if he had not taken advantage of the opportunity to confound his enemies and rub the doubters' noses in his success.

Yet Clinton's millennium speech by no means reflected a mere personal or partisan version of how things stood as the twentieth century ended. As early as 1997, Fortune magazine, for example, hardly a mouthpiece for the narrow political contentions of Democrats, claimed that the U.S. economy was "stronger than it's ever been,"14 something it could in truth have said at most points in the past fifty years. Business Week consistently preached the gospel of the new economy. Gurus like Nicholas Negroponte of the Massachusetts Institute of Technology and innumerable brokers drummed away at the idea that, between them, the Internet and the stock market had changed the rules of the game of success.15 From Wall Street the same message came drumming from the million hooves of Merrill Lynch, the thundering herd of people's capitalism. The broker proclaimed in a circular to its happy investors that this was "Paradise Found: The Best of All Possible Economies"--except, presumably, for the next day's.16 Later Merrill Lynch admitted to the government that its salesmen had been urging clients to "buy" or "accumulate" stocks its analysts privately regarded as "crap," "a dog," and even "a piece of shit."17 Late in 2002 ten banks, including Credit Suisse First Boston, Merrill Lynch, and Salomon Smith Barney, agreed to pay $1.4 billion in a settlement that revealed that published research on stocks was "essentially bought and paid for by the issuer." One supposedly independent analyst at Goldman Sachs, the settlement found, was asked what his three most important goals for 2002 were. The response was "1. Get more investment banking revenue. 2. Get more investment banking revenue. 3. Get more investment banking revenue." Another analyst, at Lehman Brothers, said that misleading "the little guy who isn't smart about the nuances" was "the nature of my business."18

Much media discourse in the 2000 election campaign took as real the idea that the country as a whole was enjoying unimaginable prosperity. This was an exaggeration. It might be more credible in Washington or Manhattan, Boston or Seattle, or the San Francisco Bay area, than in some less favored parts of the country. It might be truer of lawyers, doctors, and editorial writers than of Nebraska farmers, laid-off Indiana machinists, or Hispanic immigrants to southern California, let alone African American single mothers on the South Side of Chicago. But it was true enough for enough people that it became the key signal of the year, and even of the decade of the 1990s.

It was against this bass line of euphoria and optimism that Clinton moved to his peroration: "After 224 years," he declaimed to his applauding audience--made up in its majority of the Republicans who had opposed his policies every inch of the way and actually prevented him from carrying out many of the policies he had advocated--"the American revolution continues. We remain a young nation. And as long as our dreams outweigh our memories, America will be forever young. That is our destiny. And this is our moment."

* * * * *

LESS THAN TWO SHORT YEARS LATER, that shining moment was tarnished in many ways. Clinton's Democratic heir, Al Gore, had been defeated in the 2000 presidential campaign. The election itself was so close that its result was doubtful through weeks of litigation. On balance it is likely that the victor, George W. Bush, was not elected. One of the most careful and authoritative of the many analyses of the election result concluded that Gore was denied victory by a Supreme Court opinion "doomed to infamy" and that "the wrong man was inaugurated on January 20, 2001, and this is no small thing." No small thing, indeed, for a country that would be the teacher of democracy to the world.19

At the heart of the optimism so fervently expressed by Clinton in his State of the Union Address, but very widely shared in the nation, were two beliefs that had been sharply challenged, if not discredited, within months. One was the upward march of the stock market, rewarding the talent and enterprise of the few but also spreading its beneficence over the many. The Standard & Poor's composite stock price index, corrected for inflation, which trotted along modestly by comparison through the crash of 1929 and the boom of the 1960s and 1970s, soared dizzily from 1995 on to spike in the very month of Clinton's millennium speech.20 The second idea behind the "irrational exuberance" of the market was the confidence that the new technology of the computer and the Internet promised a New Economy. By early 2001, the Dow Jones index of common stocks had fallen from its high by some 40 percent. The Nasdaq index, dedicated to charting the heroics of the new technology stocks, had fallen even more catastrophically. The Nasdaq composite index, to be specific, which passed 5000 in early 2000, had fallen below 2000 by late 2001, or by more than 60 percent in less than two years.21 After the September 11 attacks, the markets rallied to the point where some investment analysts claimed to see the green shoots of a new bull market. Even if that were so, the idea that the new technology had simply abolished the laws of economic gravity had been exploded for good. And by the summer of 2002 the market had fallen to the point where it threatened the health of the economy as a whole.

The other heartening economic statistics Clinton cited had also been swept away. By March 2001 the economy was technically in recession.22 Unemployment was rising, from 3.9 percent in October 2000 to 5.4 percent a year later. In October 2001 alone it rose half a percentage point, the biggest increase in a single month since February 1996.23 And the economic climate was harshest for those very high technology sectors that had led the way in the boom of the later 1990s. "Dot.com" startups were worst hit. The major winners of the high-technology market--Yahoo!, AOL, Compaq, Sun, even Intel and Microsoft--all announced profit warnings, layoffs, even in some cases actual losses, and their stock fell heavily. Indeed, contradicting a central assumption of the boosters of the New Economy, the stock market would have done better without the new technology companies than with them.

By the summer of 2002 any talk of a New Economy would have appeared fatuous. The Dow Jones index had indeed fallen below 8000. Disillusion had spread far beyond the overhyped dot.com stocks to most sectors of the market. Worse, the collapse of Enron and WorldCom, and serious problems in many other major corporations, had shaken public faith in Kenneth Lay of Enron, Bernie Ebbers of WorldCom, and Jack Welch of General Electric, all of whom, two short years earlier, had been credited with virtually magical powers. Those corporate meltdowns, in turn, exposed the collusive behavior of major accounting firms, especially Arthur Andersen.

In the summer of 2002, the economist-turned-columnist Paul Krugman, writing in the New York Times, drew a political moral from the economic events: "The current crisis in American capitalism isn't just about the specific details--about tricky accounting, stock options, loans to executives, and so on. It is about the way the game has been rigged on behalf of insiders."24

As Krugman and many others pointed out, the Bush administration was full of insiders. George W. Bush's secretary of the army, Thomas White, came from Enron, where he had headed a division whose profits were manipulated to the tune of $500 million, and he sold $12 million of stock before the company collapsed. Bush's vice president, Richard Cheney, had been chairman and chief executive of Halliburton, whose stock was one of the hottest counters in the boom, and where dubious accounting turned a loss into a profit. The president himself was not above suspicion. He had been a typical player in the same corporate world. He profited from insider knowledge to unload stock in his own company, Harken Energy. He made $15 million personally out of the sale of the Texas Rangers baseball club, which benefited from favors squeezed out of a suburban community government. And as governor of Texas, he allowed a friend and major political contributor, Tom Hicks, to benefit financially from the privatization of a large portion of the University of Texas's substantial endowment.25 So far from demonstrating the brilliance of the New Economy, the Bush administration seemed to be drawing on Republican traditions as old as Teapot Dome26 and Credit Mobilier.27 And, to judge by the polls, the public seemed to draw its own conclusions.

As for Clinton's rhetoric about the United States remaining "forever young," it is tempting to quote Oscar Wilde's cynical witticism: "[T]he youth of America," he said more than a hundred years ago, "is its oldest tradition."28 Certainly the graying of America was attested by the new political clout of the American Association of Retired Persons. Without the new immigrants who continued to pour in, especially from Mexico, Central America, and the Caribbean, the birthrate in the United States would for the first time have fallen, as it had done in some countries in Western Europe, below the point where it guaranteed a natural increase in population. As for Clinton's confidence that the country faced no external threat, that too was to be disproved in a bizarre and terrifying way less than two years later, on September 11, 2001.

* * * * *

AT THE BEGINNING of the twenty-first century, the United States was a mature civilization marked by striking, well-rooted contradictions. It is (and the list of pairs by no means exhausts the difficulties facing anyone who attempts a simplistic analysis) generally pacific but occasionally bellicose; religious yet secular; innovative but conservative; tough but tender; aggressive yet reluctant to incur casualties; egalitarian by instinct but stratified in tiers of wide and growing inequality; puritan yet self-indulgent; conformist but full of independent-minded people; devoted to justice, yet in many ways remarkably unfair; idealistic yet given to cynicism. ("Nice guys finish last" is almost a national motto.) At some times it can be self-confident to the verge of complacency, at others self-doubting to the point of neurosis.

A quarter of a century ago, while Richard Nixon was falling from grace, I was at work on a book published just as Jimmy Carter was on course for the White House and the intractable problems he found waiting for him there.29 Its main argument was that, by the early 1950s, the Depression, the New Deal, World War II, and the Cold War had forged what, for want of a more precise term, I called a "liberal consensus" and that this was a victim of the tumultuous events of the 1960s. The consensus I had in mind was in effect a gigantic deal, by which conservatives accepted the main principles of a welfare state, made necessary by the Depression, and in return liberals accepted the need for the national security state, demanded by the Cold War. The book described how that consensus was shattered by three converging crises. There was the upheaval against racial oppression, first in the rural South, then in the great cities of the North and West. There was the long agony of the Vietnam War and the growing opposition to it, culminating in the humiliation of defeat. And there was a pervasive crisis of authority. That had its roots, no doubt, in gradual, subtle changes inside the nation's families and schoolrooms. But it was brought home by a series of political shocks, among them the assassinations of John F. Kennedy, Robert Kennedy, and Martin Luther King, and the rise of a new, defiantly intransigent conservatism that directly challenged both liberalism and consensus. There was a fourth crisis gathering like a storm over the horizon: the 1960s saw the first widespread expressions of concern at the impact American industry and the American consumption of water, forests, fossil fuels, and other natural resources were having on the environment.

In the last words of that book, I summed up the prospect from the middle 1970s in a phrase from a letter that Thomas Jefferson wrote to his former antagonist and later friend, John Adams. "So we shall go on," Jefferson predicted, "puzzled and prospering beyond example in the history of man." Puzzled and prospering? Prospering, certainly, in the aggregate, but still puzzled. That is another pair of contrasts that describes the true state of the Union over the last quarter of the twentieth century.

One obvious clue to resolving this endless string of apparent contradictions, of course, lies in the sheer size and diversity of America. "Do I contradict myself?" asked the preeminent national poet, Walt Whitman. "Very well, I contradict myself. I am large, I contain multitudes."30 By the end of the twentieth century the United States contained some twenty times more people than when Whitman wrote those lines, and those people lived in an infinitely richer variety of ways.

This is a nation on a continental scale, a country uniquely devoted to change, extraordinarily quick to adopt new fashions--including new fashions in thinking about itself. Yet the United States at the close of the twentieth century had assumed the patterns of belief and practice of a mature society, patterns that were strongly established, even when they were contradictory. In certain respects, it was even an imperial society, led by a self-confident elite, sure that its destiny was to lead and, if necessary, to dominate the world. In economic, intellectual, and cultural life its achievements were extraordinary. Not the least of them was the freedom that not just allowed but actually encouraged people to explore many different ways of living their lives.

Perhaps it was this freedom, rather than a fluctuating and ill-distributed prosperity, that accounted for the buoyancy and cheerfulness of many individual lives and for the resilience and adventurousness of America as a whole. It was on the whole an unprecedentedly tolerant society. It was one in which social arrangements made it possible for trout fishermen, dancers, scholars, architects, oceanographers, gamblers, golfers, grandmothers, writers, musicians, rock climbers, theologians, stockbrokers and many others--even, within limits, radicals--to pursue their diverse personal grails.

This was a rich vein of individualism. What was less often celebrated was the remarkable array of institutions through which individuals could achieve their diverse goals. Universities, graduate schools, night schools, foundations, charities, websites, and chatrooms were all dedicated to supporting individual obsessions and making individual dreams come true. So was an extraordinary range of commercial institutions. Pension plans, savings companies, trust companies, brokerages, real estate agencies, insurance policies, mutual funds, hedge funds, and every other instrument imaginable for saving and investing money were all in the dream business, while malls, markets, stores tiny and titanic, travel agents, and fashion emporia offered every kind of temptation to spend. This proliferation of commercial energy, dramatized by downtown towers with marble atriums as well as by suburban temples to consumption, was supported by an immense, only half-visible infrastructure of financial institutions. Collisions were avoided or mitigated by those pillars of American rectitude (also exemplars of American competitiveness), law firms of all shapes and sizes.

Science was passing through a golden age in the late twentieth century. Foundations and universities vied with one another to encourage original work and to recruit donors to finance it. Secondary and undergraduate education were less impressive. Research depended heavily on immigrant scientists, but as long as they continued to be attracted to work in America, there seemed no cause for concern.

This was a civilization on the move. An immense web of communications was already in place by the beginning of the period we are examining. Indeed, by the 1970s infrastructure was beginning to show signs of wear and tear. The interstate highway system, like the Internet a spin-off of Cold War expenditure, was approaching completion.31 But the ingenuity of the franchising system was creating a national network of predictable caravanserais: the traveling businessman could eat, drink, sleep, use his laptop, and watch movies on TV in an environment as far as possible identical from Key West to Puget Sound.

This rich abundance of provision and the unprecedented personal freedom of American life in the late twentieth century had not come about by accident. They were the willed consequences of tens of millions of lives devoted to foresight, hard work, planning, and, above all, investment. This was a society that had focused on its future even more than on its past. But here, too, there were contradictions. One was that it chose not to have any detailed blueprint for planning that future. Another was that Americans saved less than citizens of other developed countries, relying instead on the willingness of foreigners, especially the Japanese, to invest in America.

Americans might dislike government, as many of them never ceased to repeat. But they experienced more government than anyone. This vast display of individual and institutional opportunities was organized and protected by the most elaborate and opulently funded pyramid of government in the world. It rose from the minimalist administrations of poor counties and dying towns to state governments like those in Albany or Sacramento that challenged comparison with all but a handful of foreign capitals. At the apex stood the grand vistas and marble palaces of the federal government in imperial Washington.

By the end of the century, government, while widely derided, had become one of the nation's most prosperous businesses, as witness the great arc of office buildings that stretches for twenty miles from Dulles to Reagan airports on the west side of Washington. They accommodate enterprises that live by selling goods and services to the federal government, even as their occupants grumble about it. Four of the twenty richest counties in America are contiguous to the federal capital.32

Yet here too contrast and contradiction are everywhere. Washington is resented by the voters who in theory control it and is accused of every corruption by the citizens it serves. For years the preferred strategy for politicians hoping to be elected to national office in Washington has been to campaign against the Beltway Babylon and all its works. Yet on any summer weekend, the well-educated government interns playing their rowdy softball on the Mall can see, decanted from tour buses, the plain folks who have brought their well-scrubbed children to stare at the White House and tour the Capitol, like pilgrims to the shrines of the national political religion.

* * * * *

AMID THIS WELTER of diversity and contradiction, two massive changes stand out between the middle 1970s and the end of the century. The first is the revival of national confidence, so shaken by the events of the 1960s and the 1970s. The second is the replacement of the liberal consensus by the conservative ascendancy.

The coming of the millennium, said Bill Clinton, was America's moment. His older listeners would have been struck by the contrast with the state of the Union little more than twenty years earlier. Throughout the 1970s, Americans' self-esteem was punctured with a frequency that damaged the national psyche and indeed the national credibility in the outside world.

The blows were internal and external, economic and political, public and psychic, and they fell relentlessly. With the fall of Saigon in the spring of 1975, the United States had lost a war, for the first time. The might of an American expeditionary force with carriers, fighter aircraft, helicopters, and more than half a million men had been defeated by small, lightly armed men in black pajamas.33 Among other things, Vietnam seemed to mean "the end of exceptionalism."34

Everywhere in the world the Soviet Union seemed to be on the attack--in Afghanistan and at half a dozen points in Africa, in Chile, and in the Middle East.35 The Democratic Party, in the ascendancy since 1933, was discredited in the eyes of many of its core supporters because of its ambitious dreams of building a Great Society. In 1968, partly because of its own internal divisions, it lost the presidency to its archenemy, Richard Nixon. Then the integrity of the U.S. government at its highest level was soiled by the Watergate scandal. Nixon himself escaped impeachment only by a humiliating resignation. But it is probable that, if Nixon's personal insecurity had not led him to misuse his power, a conservative turnover would have come seven years earlier than it did.36

As it was, the Watergate debacle both enabled the Democrats to win the presidency against Nixon's successor, Gerald Ford, and created the circumstances in which they chose as their candidate Jimmy Carter. Pious and self-denying, he was well cast as the symbolic leader of that element in the nation that had lost all confidence in itself.

The background to this nadir of frustration and failure was the oil crisis that resulted from the embargo imposed by Arab members of the Organization of Petroleum Exporting Countries in 1973-74. Americans were used to an unthinking abundance in energy and other natural resources. Few noticed that, even before the oil price rise, their country had ceased to be self-sufficient. Now they were suddenly obliged to import more than half of their oil. They could scarcely comprehend what was happening when gas lines appeared even in the most affluent suburbs. A handful of oil-producing countries, many of them desert kingdoms, which as enemies of Israel rated low on any measure of American sympathy, had shown that they were far from powerless. They succeeded in forcing up the price of oil by a factor of four in 1973. After the fall of the shah of Iran in 1979 the price doubled again. No economy could have survived an eightfold increase in its basic fuel in less than seven lean years without being severely shaken. "I could feel it everywhere," said a Gulf Oil executive, "it was the ebbing of American power--the Romans retreating from Hadrian's Wall."37

Oil was only part of the shock. American industry faced the unfamiliar problem of a lack of competitiveness in the face of exports from Europe and Japan. Helped by American aid and American investments, not to mention by the lessons they had learned from the American business economy, first the Europeans and then the Japanese and Koreans began to pour their exports into the American market. It was the first time since the nineteenth century that any foreign industries had shown an ability to compete successfully with Americans. Military expenditure, tourism, increased imports, and the loss of competitiveness all weakened the dollar. In 1971 Nixon devalued it for the first time since the 1930s. By the middle 1970s the American economy faced a new danger, the combination of low growth and high inflation known as "stagflation."

That was not all. Many Americans were shocked and alarmed by changes in society. They were disproportionately to be found among traditional Democratic voters both in the South and among the unionized working class in the North and Middle West, the very people whose loyalty had sustained Democratic presidents from Franklin Roosevelt to Lyndon Johnson. Many combined liberal ideas about economics with deeply conservative social values. They venerated American patriotism. They cherished marriage and the family. Many of them, even those who could not by any stretch of language be called racist, nevertheless felt uncomfortable about the accelerated social change brought about by the civil rights movement and the physical movement of black people into their neighborhoods.

The radical ideas, the demeanor, and the language of the "counterculture" appalled them. In this, there was often an element of class resentment. Men and women who had worked hard and never expected to study at Berkeley or Columbia resented it when students at such institutions flouted deeply held values, seeming ungrateful to the system that had afforded them privileged opportunity.

To crown everything, it began to be said that, for the first time in American history, a new generation might not be as well off as its predecessor. Substantial groups of voters, in short, were jolted by events at home and abroad into reconsidering their basic political allegiance. They included unionized workers; white inner-city dwellers; white southerners alienated by black enfranchisement; Southern Baptists and other evangelicals angry at the decline in public morals and especially at the Carter administration's withdrawal of tax exemption from church schools; Jews angered by black hostility at home and alarmed by developments in the Middle East. Such voters, many of them for the first time in their lives, were available to listen to the arguments of the missionary new conservatism. For many, the evidence of America's apparent impotence when faced with Islamic militancy and communist aggression was the last straw. Energy was the issue that brought home to many Americans both the threat to America's position in the world and a perceived threat to the comfortable American way of life at home.

In 1979 President Jimmy Carter returned from an ill-tempered summit with the world's leading industrial nations in Tokyo to scare headlines about the energy crisis. Canceling yet another announced energy speech to the nation, he retreated to the presidential hideaway at Camp David. There, sitting on the floor in his trademark cardigan, he conferred with a shifting cast of pundits and wise men.38 After ten days of something approaching a national panic, he emerged on July 15, 1979, and summed up the nature of the problem as he saw it. There was, he said, a growing disrespect for churches, schools, news media, and other institutions, and not without cause.

We were sure that ours was a nation of the ballot, not the bullet, until the murders of John Kennedy, Robert Kennedy and Martin Luther King, Jr. We were taught that our armies were always invincible and our causes always just, only to suffer the agony of Vietnam. We respected the Presidency as a place of honor until the shock of Watergate. We remember when the phrase "sound as a dollar" was an expression of absolute dependability, until the years of inflation began to shrink our dollar and our savings. We believed that our nation's resources were limitless until 1973, when we had to face a growing dependence on foreign oil. The wounds are still very deep. They have never been healed.39

Carter's analysis of the situation was true enough at the time, so far as it went, but he was never forgiven for it. His trouble was that no one really believed he knew what to do about what came to be called "the national malaise."40 With puritanical rigor, he insisted that salvation would have to come not just from the White House but from every house in America. It was not what the American majority wanted to hear. With relief, many voters turned from this painfully honest man and his awkward truths to the genial simplicities of Ronald Reagan and his assurance that it was, after all, "morning in America."

There were many causes of the conservative ascendancy. The widespread popular reaction against the social upheavals of the 1960s and also against the liberal agenda of the Johnson administration would probably have led to a Republican victory in the 1976 presidential elections and also to mass defections from the Democratic Party in Congress, perhaps even earlier.41 Anger at the national humiliation in Vietnam and at other perceived defeats abroad, including the Carter administration's renegotiation of the Panama Canal treaties, stirred in a fiery condiment of outraged patriotism. But it was economic discontent and a revolution in economic thinking that did most to prepare the ground for the sweeping ideological change of the 1970s.

In the course of that decade, the faith in a mixed economy that had sustained a broadly liberal consensus since the New Deal was replaced by a new belief in the superior ability of "the market" to allocate resources and make social as well as economic decisions. The causes of this ideological revolution were exceptionally complex. Some lay in the disappointing performance of the American economy. Some could be traced to populist politics, like the tax revolt that spread like wildfire after the success of Proposition 13 in California. The rebellion of businessmen against what they saw as the constricting regime of high taxation, greedy unions, and heavy-handed regulation played its part.42

Most important of all, perhaps, was the reaction to "stagflation" among academic economists, politicians, and the more thoughtful businessmen. For a generation, we had all been Keynesians, as even Richard Nixon put it. The real John Maynard Keynes was a subtle, sometimes apparently self-contradictory thinker, far more conservative than--in the United States, at least--he was popularly supposed to be. But the heart of American neo-Keynesian doctrine was the idea that there was a trade-off between unemployment and inflation. The implication, and this was central to the economic orthodoxy of the liberal consensus, was that government could reduce unemployment and stimulate economic activity by administering careful doses of inflationary stimulation. "Stagflation," that is, the plain fact, evident to all in the 1970s, that one could have rising inflation and a sluggish economy at the same time, seemed utterly to disprove the central plank of the prevailing Keynesian doctrine. This intellectual shock made economists and others who thought about economic policy open to conversion to conservative doctrines they previously rejected.

Now the hour had struck for conservative theorists, like Milton Friedman and his followers in the Chicago school, like monetarists and all others who rejected the ideal of a mixed economy, in which the free market was directed and restrained by public action. According to Robert Kuttner:

When economic growth faltered after 1973, a new radically classical economics gradually gained influence in the academy and in politics. Resurgent business groups, once cowed by the New Deal-Great Society era, became unabashed crusaders for laissez-faire. The increasing marketization of global commerce undermined the institutional capacity of nation-states to manage a mixed economy, and discredited center-left parties.43

The ascendancy of free-market economics in academic and business circles was firmly established even before Ronald Reagan became president in 1981. Enthusiastically, if not always consistently, the Reagan administration did all it could to demean and diminish government itself. It cut back on both the regulatory and the redistributive functions of the welfare state. The famous decision in August 1981 to dismiss more than 11,000 air traffic controllers, which led to the bankruptcy and collapse of their union, and subsequent tough action in response to other transportation strikes, sent a powerful message. The airline and banking industries were radically deregulated in response to free-market theory. Long before Bill Clinton defeated George Bush senior in 1992, two Republican administrations had fully adopted conservative social theory and an uncompromising version of free-market economics as taught by the University of Chicago, whose leaders succeeded one another on the podium in Stockholm as Nobel laureates.44

By the end of the 1980s, the economy and society had been dramatically changed by the ascendancy not only of free-market theory but also of markets, and in particular financial markets. When money is short, as it was in the 1980s, the power of those who have it increases. The people with the money were in the first place the banks and the other financial institutions, and--more broadly--the well-to-do. The thirty years after World War II had been a golden age for big industrial corporations and their largely unionized workers. It was a time of regulation and imperfect competition but also of low unemployment and high growth in output and productivity. The last quarter of the twentieth century, in contrast, was a flush time for the financial sector.

Virtually every important change in the structure and performance of the economy was better for Wall Street than for anyone else.45 Lower taxes disproportionately favored the wealthy and those with high incomes, leaving them with more funds to invest. With the exception of a few years after the sharp market break in October 1987, the stock market rose steeply. The gains went to all shareholders, including the growing number of Americans who owned stock indirectly through mutual funds, but they went disproportionately to a small number of big shareholders. If by the end of the century almost half of all Americans had some stake, direct or indirect, in the stock market, for most the stake was small: for half of the stock-owning half, it was worth less than $5,000, or less than the value of a second-hand automobile.46 No doubt it was to be expected that new investors would have a small stake. But this was hardly what was suggested by the promoters of a stock-owning democracy.

The big prizes went disproportionately to the insiders: to the bankers, the brokers, the arbitrageurs, the speculators, and those directors who were lucky enough to hold stock when it was run up by one financial operation or another.47 Luckiest of all were those--bankers, lawyers, accountants, and other professionals--who commanded large fees for their role in facilitating the endless series of mergers, acquisitions, hostile takeovers, leveraged buyouts, and other even more recondite exploits of financial engineering. Inequality was further increased by the regressive character of the tax code, especially after the George W. Bush administration's selective tax cut of 2001, which raised the after-tax income of households in the top 1 percent by 6.3 percent, compared with 2.8 percent for other groups. This was a $45,000-a-year bonus for the wealthiest 1 percent of American families.

In free-market theory, what was happening was that the pitiless but ultimately benevolent "creative destruction"48 of capitalism was allocating investment funds to the best-managed companies where they could earn the highest returns. All too often the process benefited not the shareholders, still less the corporation's employees or the economy as a whole, but the corporate raiders. The latter could borrow the money to take a company over and then leave it, panting for life like a beached whale, with a mountain of debt. Often, too, they could take advantage of some tax break, making the money cheap to them.

Deregulation and globalization together meant that virtually all the money in the world was available for speculation on the New York Stock Exchange. The orgy of financial imprudence has been well charted in a number of accounts, some disapproving, others unable to conceal their admiration.49 That was part of the cultural shift to the free market. Americans had first admired, then pilloried the Robber Barons of the Gilded Age and first followed, then blamed the unscrupulous businessmen responsible for the Great Crash. Now the heroes of Wall Street were cultural icons, and to disapprove of them smacked of either envy or leftism. Tom Frank summed up the doctrine of what he called "market populism":

From Deadheads to Nobel laureate economists, from paleoconservatives to New Democrats, American leaders in the nineties came to believe that markets were a popular system, a far more democratic form of organization than (democratically elected) governments . . . That in addition to being mediums of exchange, markets were mediums of consent . . . markets were a friend of the little guy; markets brought down the pompous and the snooty; markets gave us what we wanted; markets looked out for our interests.50

The financial insanity culminated in the collapse of Long-Term Capital Management in 1998.51 This was a so-called hedge fund, constructed according to the precepts of two academic economists, Robert Merton and Myron Scholes. (They were awarded the Nobel Prize for the ingenuity of their theories about the new derivative instruments that were all the rage.) Months later, the fund was obliged to reveal that by pyramiding and kiting investments in time-honored fashion, albeit under cover of a barrage of newly coined pseudoscientific jargon, it was in danger of collapse. Its portfolio was valued at $200 billion. The derivatives hanging from it like baubles from a Christmas tree were put at $1.2 trillion. Putting ideological commitments to free-market theory aside, the great and the good of Wall Street loyally got together to bail out their overeager competitors. Broke or not, LTCM was too big to be allowed to fail.

Such casino capitalism in the financial markets earned great fortunes for those whom the novelist Tom Wolfe called the new Masters of the Universe. These ruthless raiders sacked venerable companies like financial Mongols. "Greed is good" was their motto. They produced rich profits for shareholders, though in the nature of things only those who could afford to invest large sums made enough money to make much difference to their life-style.

The effect on the employees of once solid corporations in the real economy was not happy. One easy way for management, under pressure from such raiders, to cut costs was to reduce the work force. Unions in many industries were no longer in a position to protect their members. "For much of the union movement," wrote labor historian Nelson Lichtenstein, "the 1970s and 1980s were a disaster."52 Union membership as a proportion of the entire work force fell from 29 percent in 1973 to just above 16 percent in 1991. In traditionally unionized industries (the needle trades, meat-packing, engineering, and the trucking and warehousing organized by the Teamsters, as well as automobiles) the losses were proportionately more severe. Membership in the International Ladies' Garment Workers Union fell by no less than two-thirds.53 In the construction industry, too, traditionally dominated by the conservative craft unions, as early as the late 1960s employers consciously set out to break the unions. "It's time for a showdown," said Winton Blount, the Alabama contractor who became Nixon's postmaster general.54

The unions' political clout fell commensurately. New jobs were being created, and at an impressive clip. But many of the new jobs were at or even (in the underground economy) below the minimum wage, while many of the jobs that were disappearing were well-paid jobs with fringe benefits covered by union contracts.

Manufacturing jobs were being "exported" to developing countries, in two ways. Sometimes the U.S. corporation physically moved plants abroad to take advantage of far lower wages. Many manufacturers, for example, moved production to maquiladora plants in Mexico where workers, using the same equipment as in the United States, were paid one-seventh as much. Sometimes it was easier to buy semifinished or finished products from countries with even lower wages and with minimal or nonexistent costs for health and safety regulation, taxes, or environmental protection. Scanning the labels of garments in mass market stores like Gap or Banana Republic became a geography lesson, as U.S. retailers brought in goods from Surinam or the Andaman Islands.

This "outsourcing" to lower-cost foreign producers was a major reason why domestic employment in manufacturing fell from 27 percent in 1970 to 19 percent in 1986.55 Moreover a Brookings Institution study showed that the trend toward assembly in low-wage foreign countries was encouraged by a favorable tax and tariff regime. Under tariff regulations, U.S. companies are permitted to reimport, without duty, goods that originate in the United States if they are fabricated or assembled overseas. While half of these duty-free goods were assembled in Japan, Germany, or Canada, half came from developing countries, notably Mexico, Malaysia, Singapore, the Philippines, Korea, Taiwan, and Hong Kong.56

Back home, corporate management systematically replaced unionized workers with unprotected "consultants" or contracted workers, often with no pension or health plan rights. Sometimes employers in effect asked existing workers to bid for their own jobs at lower wages. Sometimes even unionized plants installed two-tier wage patterns whereby older workers retained higher rates but newcomers entered at a lower wage that would never catch up.57

In obedience to the ancient philosophical fallacy that to name something is to explain it, some analysts attributed the decline in union membership to something called "post-industrial society."58 Others saw it as the result of conscious strategies on the part of management to weaken labor.59 According to Barry Bluestone and Bennett Harrison,

[M]ost mainstream economists have rather cavalierly concluded that globalization (in both trade and investment) has not brought much downward pressure on the wages of lower-skilled American workers. One noteworthy exception is Harvard's Dani Rodrik, otherwise very much the orthodox economist, who surprised his colleagues with the publication in 1997 of an argument that more open trade . . . reduces the domestic bargaining power of labor, possibly leading to lower wages or lower growth in wages.60

Enthusiasts for the free market praised the new labor market for its realism or flexibility, and no doubt there had been plants and whole industries where wage costs were unrealistically high in a globalized world. That, after all, was what had made management go global: to drive down U.S. wages with competition from developing or scarcely developing countries.

Undeniably, however, what this added up to was a massive shift of economic and ultimately political power from labor to management, from the industrial sector to the financial, from the unionized North and Midwest to the largely nonunion South and the less unionized West, from workers and middle managers to top managers, major shareholders, and bankers--in short, from poor to rich. Overall, the effect of the free-market ideology in a deregulated economy was absolutely predictable, because intended. It was sharply to increase inequality.

* * * * *

THIS RIGHTWARD SHIFT was not just a matter of conservatives winning more elections. It was also a matter of those who had called themselves liberals being converted to conservative ideas. There was no more interesting example than President Clinton himself. By stages, although originally an economic populist of a kind, he espoused the free-market creed. He belonged to that faction in the Democratic Party that had responded to the decline in the Democratic vote, and especially among white males, by moving to centrist positions. Clinton won in 1992 by repeating the mantra that it was "the economy, stupid." He accepted the strategy of the Democratic Leadership Council and was much influenced by its leader, Al From. Even so, the decisive stage in the framing of his economic policy came not from within the Democratic Party but from Wall Street.61

In his 1992 election campaign and in his manifesto, Putting People First, Clinton promised a middle-class tax cut and, at the same time, elimination of three-quarters of the deficit. Even before he was inaugurated, the Congressional Budget Office warned that the deficit would be much higher than expected. Next, the Office of Management and Budget predicted that the federal deficit would rise sharply, to more than $300 billion. Clinton's own budget chief, Leon E. Panetta, a former chairman of the House budget committee, and the holdover Republican chairman of the Federal Reserve, Alan Greenspan, agreed that long-term interest rates would rise. The danger was that stock and bond markets would lose confidence and might fall catastrophically.

Greenspan was a lifelong Wall Street conservative and disciple of the ultraconservative novelist and guru Ayn Rand. He was first appointed as chairman of the Fed by Ronald Reagan and then reappointed by George Bush senior and by Clinton. Clinton's first treasury secretary was the conservative Texas Democrat, Senator Lloyd Bentsen; and Robert Rubin, a Wall Street Democrat who had been chairman of Goldman Sachs, was brought in as chairman of a new body, the National Economic Council, before succeeding Bentsen two years later. These advisers persuaded Clinton that his first priority must be to reduce the federal deficit.

Suddenly Clinton's problem was how to persuade Congress not to cut taxes, as he had promised, but to increase them. In a startling turnaround, Clinton in effect abandoned Democratic economic and budget policy and adopted Wall Street orthodoxy. In the near term, it was a brilliantly successful move. Clinton's policies can be credited with having played a large part in making possible the stock market spike of the late 1990s. They reduced the burden of servicing the government's debt and so made more money available to lower taxes. They increased savings and damped down inflation and so contributed to creating a favorable impression on Wall Street.

After the Republican successes in the 1994 mid-term elections, widely attributed to Newt Gingrich and his Contract with America, Clinton again moved to the right.62

It was a good time for corporate profits, and good, exceptionally good, for the stock market. The most spectacular rises were in high-technology stocks, and especially in initial public offerings like those of Yahoo! (1996) and America Online (1995). The rise in equity values made people feel rich. Consumer spending roared ahead, much of it financed by credit card debt. But the newly tough management ethos, coupled with global competition and such technical innovations as sophisticated inventory management, not to mention the constant raider warfare on Wall Street, made people's jobs less secure. They might continue to spend. But they saved far less than their parents, and they went far deeper into debt.

So the economic policies of the 1990s caused, or at least allowed, a spectacular boom, especially on Wall Street. But they also did nothing to reverse the steady rise in inequality. Income inequality has increased since the 1970s. By 1997, the first year of Clinton's second term, a careful study showed that this had already extended to inequality of wealth. "The gap between haves and have-nots," it concluded, "is greater than at any time since 1929. The sharp increase in inequality since the late 1970s has made wealth distribution in the United States more unequal than in what used to be perceived as the class-ridden societies of northwestern Europe."63

Under Reagan and George Bush senior and their allies in Congress and in the conservative think tanks, the Republicans had set out to reverse what seemed to them an unhealthy trend toward equality. And in that they had succeeded.

Now the boom is over, but the inequality remains. It is clear that the increase in inequality is squarely the consequence of free-market policies promoted by conservatives in politics and by their allies in the economics profession. It is also plain that mainstream Democrats, including President Clinton, have to some extent adopted this ideology. In a number of major speeches, Clinton buckled together, as essentials of the American belief system, political democracy and free-market capitalism.

"We have a free market now," says Gordon Gekko, the antihero of Oliver Stone's movie, Wall Street. "We don't have a democracy."64 America has always been a capitalist country. But it has not always been usual to couple free-market economics with democracy as the two coequal and essential foundations of the American public philosophy. Once, wrote one of the few who dared to challenge the omnipotence of the market at the very height of the great 1990s bull market, "Americans imagined that economic democracy meant a reasonable standard of living for all Today, however, American opinion leaders seem generally convinced that democracy and the free market are simply identical."65

Between 1975 and the end of the century, many Americans, including many who previously thought of themselves in some way as liberals, espoused the quintessentially conservative belief that business knows best and that the imperfect competition between giant corporations, guided by a few thousand stock market professionals trying to make their own fortunes, could do more to promote the prosperity of the many than a democratically elected government.

* * * * *

THE THEME of this book, then, is the way the American polity has changed over the past twenty-five years. It argues that those changes have been misunderstood, largely because of a sometimes naive, often self-interested, adoption of conservative free-market ideology.66 Between Richard Nixon's departure from the White House in 1974 and the return of the Republican George W. Bush in 2001, a new conservative consensus was forged. Some Democrats felt they had no alternative but to join it. In place of the New Deal philosophy, in which the workings of the free market were to be restrained and controlled by government intervention, the new public philosophy sought to set the market free. Whereas in the middle third of the century the political consensus encouraged modest redistribution of wealth, in its last quarter the free market was set free. If the result was a winner-take-all society, in which the devil took the hindmost, that was acceptable. The new assumption was that, left to itself, the mighty engine of free-market capitalism would generate wealth on a scale that would make redistribution unnecessary, even unpopular.

This book argues that this new worship of the market was doubly mistaken. While unrestrained markets and a deregulated economy could indeed create wealth on a dazzling scale, they could not abolish the laws of economic gravity. Bust would still follow boom, and perhaps in a cycle that was even more destructive than in the past. At the same time, the working of the free market, and of the conservative political philosophy that imposed it and made propaganda for it, would cause--had indeed already caused--social consequences that would be undesirable and in the end dangerous.

* * * * *

PART OF THE OVERCOOKED triumphalism of the late 1990s was based on a misunderstanding of the genuinely thrilling opportunities afforded by new technology in general and especially the Internet. Chapter 3 traces the history of the new technology, and argues that this history has been misunderstood. Credit for its dramatic achievements has been wrongly awarded. It shows that, so far from being mainly the product of youthful entrepreneurs taking advantage of a deregulated free-market economy, both the basic science and the technology of the Internet were largely the products of research and development done under the impetus of the Cold War. Furthermore they can hardly be claimed as evidence of the creative élan of deregulated free-market capitalism, because the essential work was done either in government institutions or in universities or industrial laboratories shielded from the operations of the free market by government contracts. It was not the market but the government that built the Internet.

Chapter 4 shows that, in spite of the dizzy stock market boom and the really favorable economic conditions in the middle and late 1990s--low unemployment, relatively high productivity, low inflation--it is an illusion to see the economic record of the past twenty-five years as a brilliant one. To the contrary, until the mid-1990s the economy performed in only a mediocre way. Substantial gains for the wealthiest few in society concealed the fact that the situation for the majority of Americans actually got worse over most of the period. The spectacular gains of the last half-dozen years of the twentieth century compare very unfavorably with the record of the years after World War II, when--under a mixed-economy philosophy--productivity and economic growth achieved a record far superior to that boasted of by conservatives in more recent years. In short, the New Economy is largely an illusion, and talk of new parameters is intellectually disreputable.

Contrary to President Clinton's predictions of lasting prosperity, and in spite of widespread self-congratulation in conservative and business circles, a sudden, short-lived spike in financial markets benefited the few rather than the many. And this is against the American grain. For the life-giving strength of the American economy has always been, until the past few decades, the breadth of its benevolence. The genius of the American economy has not lain in the ruthlessness with which the few were allowed to trample on their defeated competitors but in the generosity with which the many were allowed to share in the common prosperity.

Three chapters look at how the promise of real progress for significant groups in society, together making up a large majority of the population, has been disappointed by the reactionary character of the political revolution since the 1970s. Chapter 5 looks at the consequences of the new immigration, which has added more than 30 million people to the population since the late 1960s. It will transform the American gene pool, turning what was statistically a population of transplanted Europeans into one in which people of European descent will probably be a minority within half a century. It will also change American politics by concentrating immigrants and their descendants in a few metropolitan areas, rather as the last mass immigration of the early twentieth century did, though this time the majority of migrants will be drawn from the developing world. And, of course, immigration will introduce new issues and new stresses into American politics.

Chapter 6 shows how the high hopes of the women's movement in the 1960s have given way to division and frustration. It argues that there have been in effect two women's movements. The first, largely confined to a small group of highly educated women in metropolitan areas, evoked a backlash by its unwisely enthusiastic adoption of radical rhetoric and tactics. The far larger movement of women into the work force has done more to change the status and expectations of all women. Yet it, too, has encountered frustration.

The argument of chapter 7 is that something similar has happened to the hopes of African Americans. Here the contrast, and the inequality, is even greater than for women. There are now almost no heights a black man or even a black woman cannot aspire to in America--Colin Powell, after all, is secretary of state and was a credible candidate for president. Black actors, entertainers, musicians, sportsmen and sportswomen have achieved astonishing feats. Yet equality of condition and equality of esteem remain elusive for all minorities. And a proportion of all African Americans that is depressingly high in the light of the ambitions of the civil rights movement has drifted down into the condition of an underclass, without jobs, without opportunities, too often without hope.

Chapter 8 returns to the mainstream. It looks at changes in the way all Americans live at the beginning of a new century. It examines the decline of civic engagement. It inquires into the responsibility of media, especially news media, for some disquieting trends. It confronts the implications of the fact that a growing majority of Americans live neither in cities nor in the countryside, but in suburbs. It suggests that one reason why such social issues as poverty have disappeared from the agenda is because the poor, and other unfortunate groups, are geographically remote and largely invisible. It asks, too, whether the economic inequality of the past few decades may not be reintroducing those hierarchies of social class Americans were so proud of abolishing.

The paradox of American power is that, at the very moment when American influence in the world is greater than ever, Americans seem less and less interested in the rest of the world: more self-sufficient, but also more self-satisfied. Chapter 9 points out some implications of this paradox for the United States and for the rest of the world and inquires into the revival of American exceptionalism. Finally, chapter 10 seeks to tie these themes together in the context of Jefferson's vision of a nation still sorely puzzled in many ways, even though prospering "beyond example in the history of man." But first we must analyze some of the changes in the political system over the past quarter of a century, and how they have converted the American majority to a new public philosophy that is willing to trade inequality for prosperity.
