THE EVOLUTION OF AMERICAN COLLEGES
Undergraduate education today bears no resemblance to the instruction masters and tutors gave to the trickle of adolescents entering one of the nine colleges that existed prior to the American Revolution. Even a century later, less than 2 percent of young people were attending college. Serious research had barely gained a foothold on the nation's campuses, and entire fields of knowledge that are common today were still unknown. As late as 1940, fewer than 1 in 20 adults had a B.A. degree. It is only within the past 50 years that universities have come to boast the huge enrollments, the elaborately equipped research laboratories, and the legions of faculty members and other instructors that fill their campuses today.
Some understanding of the evolution of American colleges is needed as the background for a serious discussion of the contemporary college experience. Only through acquaintance with this history can one tell whether critics are correct in asserting that the quality of liberal education is in serious decline. Without some knowledge of the past, one cannot fully appreciate which aspects of the undergraduate program are amenable to change and which seem to stubbornly resist reform. Lacking historical perspective, one cannot even be sure whether "new" proposals are truly new or merely nostrums that have been trotted out before with disappointing results. At the very least, anyone wishing to criticize or reform undergraduate education should know its history well enough to understand what important changes have occurred and what features of undergraduate education have remained essentially the same over time.1
THE EVOLUTION OF UNDERGRADUATE EDUCATION: A BRIEF SUMMARY
Until the Civil War, colleges in the United States were linked to religious bodies and resembled finishing schools more closely than institutions of advanced education. Student behavior was closely regulated both inside and outside the classroom, and teachers spent much of their time enforcing regulations and punishing transgressors. Rules of behavior were written in exquisite detail. Columbia's officials took two full pages merely to describe the proper forms of behavior during compulsory chapel. Yale turned "Sabbath Profanation, active disbelief in the authenticity of the Bible, and extravagant [personal] expenditures" into campus crimes.2
Most courses were prescribed in a curriculum that usually included mathematics, logic, English, and classics, with a heavy dose of Latin and Greek. In a typical class, students recited passages from an ancient text under the critical eye of the instructor. Although many colleges offered courses in the sciences, such as astronomy or botany, classes were taught more often by invoking Aristotle and other authorities than by describing experiments and the scientific method. By most accounts, the formal education was sterile. Many students felt that they learned much more outside the classroom in informal clubs and literary societies, where they engaged in debates, read modern literature, and discussed serious subjects.3
Despite their quaint ways, colleges before the Civil War were deliberately organized to pursue two important objectives: training the intellect and building character. The most influential defense of the prevailing model appeared in an 1828 report from Yale College, which held that the principal aim of college instruction was not to supply all of the important information that students might some day use but to instill mental discipline.4 According to the report's authors, a classical education was ideally suited to this purpose.
Mental discipline was supposed to emerge from hours of demanding work translating ancient languages, disputing arcane questions in class, and solving mathematical problems. As one college president put it, "If you seek to bring your mental powers up to a high degree of efficiency, put them to work, and upon studies that will tax them to the uttermost. When one has been mastered, take a second, and a third, and so go on conquering and to conquer, victory succeeding victory in your march to mental conquests and triumphs."5 Not until the end of the century did this inspiring message fall victim to Edward Thorndike's experiments suggesting that the skills acquired through painstaking translations of Cicero and Virgil would rarely help students to analyze and solve problems outside the realm of Latin texts.
Character would be forged by having undergraduates study classical texts, demanding strict compliance with the detailed rules of campus behavior, and requiring daily attendance at chapel. As a culminating experience, most colleges prior to the Civil War offered a mandatory course for seniors on issues of moral philosophy, often taught by the president himself. Ranging over ethical principles, history, politics, and such issues of the day as immigration, slavery, and freedom of the press, this capstone course served multiple objectives. It set forth precepts of ethical behavior, it prepared students for civic responsibility, and it brought together knowledge from several fields of learning. For many students, it was the high point of an otherwise dull and stultifying education.
By the middle of the nineteenth century, the traditional program was showing signs of strain. Experimental scientists and scholars of modern languages and literature were gradually gaining a foothold in the curriculum. Instructors chafed at having to spend so much time enforcing rules of behavior. Expressing their displeasure with the status quo, students began to vote with their feet. From 1850 to 1870, undergraduate enrollments in America actually declined as a proportion of the total population. As Francis Wayland, the president of Brown, succinctly put it, "We have produced an article for which the demand is diminishing."6
With the end of the Civil War, higher education began a period of unprecedented reform. Aided by federal land grants and by the philanthropy born of industrial fortunes, college presidents such as Charles W. Eliot, Andrew White, William Rainey Harper, and Daniel Coit Gilman built new institutions and radically transformed old ones. Inspired by the model of the great German universities, these leaders encouraged research, welcomed science, and introduced Ph.D. programs to build new cadres of scholar-teachers.
Undergraduate education soon felt the impact of these changes. The old classical curriculum gave way to offerings of a newer and more practical kind. Instruction in modern languages and literature continued to spread. Courses in physics, biology, and chemistry sprang up everywhere. Private universities introduced new programs in vocational subjects such as commerce and engineering. Public universities carried occupational training even further. According to Laurence Veysey, "such untraditional disciplines as pedagogy, domestic science, business administration, sanitary science, physical education, and various kinds of engineering were all becoming firmly established at a number of leading universities by the turn of the century."7
More radical still were the reforms at some of America's most prominent institutions. At Harvard, for example, President Charles W. Eliot not only rejected the old prescribed classical curriculum, he urged that all requirements be abolished, leaving students free to study whatever appealed to their interests. By the end of his 40-year term of office in 1909, only a course in English composition and the study of one foreign language were required of freshmen. Sophomores, juniors, and seniors were left completely free to study what they chose. At Cornell, another advocate of student choice, President Andrew White, explained the reasons for shifting to a freer curriculum: "The attempt to give mental discipline by studies which the mind does not desire is as unwise as to attempt to give physical nourishment by food which the body does not desire . . . Vigorous, energetic study, prompted by enthusiasm or a high sense of the value of the subject, is the only kind of study not positively hurtful to mental power."8
Religious orthodoxy also lost its grip on many colleges. Nonsectarianism was increasingly considered conducive to sound university governance. Faith was no longer thought central to the development of moral character. Compulsory chapel began to give way on many campuses, making religious observance little more than another option within a broad array of extracurricular pursuits.
Not all college presidents agreed with the trend toward greater freedom of choice. Some clung tenaciously to the old classical model and mounted a spirited defense of the status quo. President James McCosh of Princeton was particularly outspoken in opposing Harvard's reforms, denouncing President Eliot in words reminiscent of William Bennett's ripest prose during his term as secretary of education:
Tell it not in Berlin and Oxford that the once most illustrious university in America no longer requires its graduates to know the most perfect language, the grandest literature, the most elevated thinking of all antiquity. Tell it not in Paris, tell it not in Cambridge in England, tell it not in Dublin, that Cambridge in America does not make mathematics obligatory on its students. Let not Edinburgh and Scotland and the Puritans in England know that a student may pass through the one Puritan college of America without having taken a single class of philosophy or a lesson in religion.9
President Eliot was unmoved. As time went on, he would watch the currents of reform begin to turn in his direction. In 1890, 80 percent of the curriculum was required in the average college. By 1901, curricula in more than one-third of American colleges were at least 70 percent elective.10 By 1940, the share of mandatory courses in the typical college curriculum had declined to 40 percent.
In the end, however, Eliot's vision proved too extreme to survive intact even at Harvard. Although no one wanted to return to the old, classical curriculum, most educators felt that the doctrine of total elective choice went too far in the other direction. Such freedom clearly did not produce "the vigorous, energetic study" that enthusiasts like White had promised. By the time Eliot finally retired, 55 percent of Harvard students were graduating having taken virtually nothing but elementary courses. More than 70 percent did not pursue any single field of knowledge in real depth.11 Many undergraduates studied as little as possible and relied on paid tutors--or "crammers"--to fill their heads with just enough information at semester's end to pass their exams.
Meanwhile, social clubs and fraternities flourished. Intercollegiate sports took hold, as football games attracted tens of thousands of raucous students and alumni. At colleges across the nation, undergraduates decorated their rooms with posters reading "Don't let your studies interfere with your education."12 For many undergraduates, college was not a serious intellectual experience but an excuse for making social contacts and enjoying the good life. As one dean of students, LeBaron Briggs, candidly admitted, "Social ambition is the strongest power in many a student's life."13
In retrospect, it is likely that the casual attitude toward coursework reflected the spirit of the times more than the nature of the curriculum. Even in the more conservative atmosphere of Yale, the typical student was described as "a careless boy-man who is chiefly anxious to 'have a good time,' and who shirks his work and deceives his instructors in every possible way."14 Whatever the underlying causes, critics of the elective system seized on such carefree undergraduate behavior as a justification for imposing greater structure on the curriculum. By the early twentieth century, both the extreme free-choice model embraced by universities such as Stanford and Cornell and the more rigid, traditional system still in place at Princeton seemed equally out of touch with the times.
Once Eliot retired, revisionist forces took over at Harvard. His successor, A. Lawrence Lowell, soon persuaded the faculty to require students to choose a major, or field of concentration, to stop them from taking a long series of introductory courses. The resulting curriculum, with its combination of breadth and depth of study, had already been adopted by most other colleges. Depth was achieved through concentrations that consisted of a number of courses within a single discipline. Breadth was typically ensured by requiring students to take two or three courses in each of several broad areas of knowledge, such as the humanities, social sciences, and sciences.
By the start of World War II, college curricula were divided between two models. Most public universities offered a wide assortment of vocational majors along with the standard liberal arts concentrations, while achieving breadth through some form of distribution requirement. Most leading private universities tended to resist occupational majors (save for engineering and business). A few, among them Stanford and Columbia, went beyond distribution requirements by requiring students to complete specially created survey courses on such broad topics as Western Civilization or the Great Books, in an effort to ensure that every student graduated with a basic grounding in the intellectual heritage of the West.
These patterns of breadth and depth were nourished by constant growth in the number of courses, made possible by the steady expansion of university faculties. Entirely new disciplines, with courses of their own, gave undergraduates a wider range of options from which to choose electives, fulfill their distribution requirements, or select a major.
In the aftermath of World War II, universities underwent further substantial change. Encouraged by the GI Bill and later by the demands of an increasingly sophisticated economy, larger and larger numbers of young people crowded into colleges. Existing universities expanded, and new ones were founded. From 1945 to 2000, the number of B.A. degrees awarded annually rose almost eightfold, from 157,349 to approximately 1.2 million.
The rapid growth in the undergraduate population meant that higher education was no longer reserved for the elite but now attracted a majority of American youth. Student bodies became more diverse, as blacks, Hispanics, Asians, and other ethnic minorities entered private and public colleges alike. As applicant pools grew larger, the best-known institutions became highly selective, teachers' colleges evolved into multipurpose universities, and community colleges sprouted like mushrooms. Many of the new students (and their parents) were more interested in preparing for jobs than in acquiring a broad liberal arts education. Responding to this demand, more and more colleges began to offer vocational programs. Before long, the number of students choosing vocational majors exceeded the numbers concentrating in traditional arts and sciences disciplines.
The rapid rise in the undergraduate population was matched by growth in other dimensions. The number of faculty members increased severalfold. Aided by generous federal support, especially in the sciences and social sciences, the volume of research expanded massively. Academic specialties proliferated, producing new majors, new academic journals, and ever greater intellectual fragmentation.
University faculties responded to these developments in various ways. Although the basic structure of the curriculum remained intact, with its provision for breadth and depth, the steady growth of new knowledge pushed aspects of science once reserved for graduate students back into intermediate and even introductory college texts. As researchers separated themselves into more and narrower specialties, colleges began developing interdisciplinary programs to focus on large societal issues, such as environmental problems or the impact of science and technology on society. Challenged by a more diverse student population, many faculties launched other multidisciplinary ventures in fields such as women's studies, Afro-American studies, and ethnic studies. In response to America's new international prominence, and aided by significant outside support, other faculty members created research centers and interdepartmental programs aimed at understanding major regions of the world, such as Western and Eastern Europe, Africa, and East Asia.
As student numbers continued to rise and individual universities grew larger, colleges launched a variety of experiments to provide more individualized instruction, at least for portions of their student bodies. Honors programs were established for qualified students. Research internships offered opportunities for undergraduates to work in laboratories alongside experienced investigators. Freshman seminars, group tutorials, and small senior colloquia afforded students at least a modicum of personal contact with faculty members.
Meanwhile, advances in technology brought changes in the way professors taught their classes. In the 1950s, the spread of paperback books and photocopiers expanded the depth and variety of course materials far beyond the single hardcover text that had been the staple of most earlier college courses. Several decades later, the Internet brought an even wider array of readings within easy reach of students. Some professors actually put entire courses on line so that students could not only obtain syllabi and homework assignments at their computers but also participate in discussions with their classmates or ask questions of the instructor. Never before had such extensive intellectual resources been so readily available to enhance the undergraduate educational experience.
RECENT CRITICISMS IN HISTORICAL PERSPECTIVE
All the new courses, interdisciplinary programs, and other curricular innovations could not forestall the wave of criticism briefly described in the Introduction. Disapproving voices from within the academy and beyond attacked universities for lacking a clear vision for undergraduate education, failing to counteract the growing fragmentation of knowledge, sacrificing the liberal arts in favor of vocationalism, and neglecting undergraduates to concentrate on research.
Most of the critical writings suggest that the quality of undergraduate education today has declined from some elusive pinnacle of prior greatness. For Allan Bloom, the moment of excellence seems to have occurred between World War II and the 1960s. For Charles Sykes, the best days came decades earlier. For Martin Anderson, the peak of educational achievement is never specified.
Sorting through fragments from the past, however, one is hard put to discover any true golden age. What Henry Adams said of Harvard in the mid-nineteenth century--"it taught little and taught that ill"--could have been said equally well of most other colleges of the period.15 By the end of the century, old universities had been reformed and new ones added, but few who remember the indolent, socially ambitious students of that era would regard the education obtained in the leading colleges as a model worthy of emulation.
Colleges in the late 1930s likewise seem to have offered something less than a peak educational experience, even for the best students at the most prestigious institutions. In terms reminiscent of Henry Adams, McGeorge Bundy dismissed most of his studies at Yale as "a terrible waste of time."16 Elliot Richardson was no kinder in describing his own Ivy League experience: "I did not think much of Harvard education in those days before the Second World War. In most cases, it wasn't worth going to class."17
The end of the war brought to American campuses a flood of GIs with a new seriousness of purpose that gladdened the hearts of many professors. Even so, by the 1950s studies appeared that described undergraduates in unflattering terms and concluded that "colleges rarely succeed in bringing about important changes in attitudes or values."18 Contemporary critics who deplore the political correctness of today seem to forget the social pressures of the postwar years, which stunted the ambitions of so many college women, ignored any need to recruit minority students, and silenced radical voices for a generation. The conformities of that age may have seemed less visible than the campus orthodoxies of today, but that is only because they were much more closely aligned with the prevailing social attitudes of the time. According to one large-scale study of undergraduates in the early 1950s, "the main overall effect of higher education upon student values is to bring about general acceptance of a body of standards and attitudes characteristic of college-bred men and women . . . There is more homogeneity and greater consistency of values among students at the end of their four years than when they begin."19
It is equally difficult to find any period during the past century and a half when educators were united around a common unifying vision of liberal education. After the Civil War, as previously noted, colleges took very different paths, with some advocating almost total freedom in the choice of courses, others clinging to a largely prescribed, traditional curriculum, and still others adding programs of a more practical, vocational sort. Humanists argued for the primacy of liberal learning and the cultivation of intellect, refinement, and judgment. University presidents such as Woodrow Wilson spoke of a commitment to public service. Research-minded faculty were preoccupied with attracting and preparing their successors. Far from choosing among these visions, leading educators such as Henry Tappan, Andrew White, and Charles W. Eliot seemed to embrace them all. As Lincoln Steffens put it in describing the University of Wisconsin, the university stood ready "to teach anybody-anything-anywhere."20 Observing this smorgasbord, Abraham Flexner, the leading authority on higher education at the time, remarked that "the [American] college is without clear-cut notions of what a liberal education is and how it is to be secured . . . and the pity of it is that this is not a local or special disability, but a paralysis affecting every college in America."21
The interwar period did not bring new clarity. In 1942, the Progressive Education Association concluded its eight-year study of undergraduate education with the mournful lament that "Liberal arts college faculties seldom state what they mean by liberal or general education. Perhaps they do not know."22 One noted educator of the period, President Robert Maynard Hutchins of the University of Chicago, did propose a new model to counteract what he termed the "disunity, discord, and disorder" that had overtaken undergraduate education.23 But his idea of a curriculum founded on a study of the great works of Western civilization never gained much of a following among other colleges in the United States.
After World War II, several major universities tried anew to build a model program of general education that would prepare young people to take their place as knowledgeable, thoughtful members of a free and democratic society. Again, however, no consensus emerged. The vast majority of colleges were content to go on imparting breadth of knowledge merely by requiring students to choose a designated number of courses in the catalogue from each of the three major divisions of learning: the humanities, sciences, and social sciences. By 1977, Frederick Rudolph concluded his history of curricular reform in America by observing that "the general education movement is hopelessly engaged in the respiration of a lifeless ideal."24
In short, anyone seeking a common purpose must go all the way back to a time before the Civil War, when colleges united around a classical curriculum aimed at mental discipline and character building. No one today would willingly return to that antebellum model of student recitations, ancient languages, and rigid disciplinary codes. Ever since the demise of the classical curriculum, faculties have clung to several different visions of education, with no one model proving itself superior in a clearly demonstrable way. As Laurence Veysey points out in his survey of curricular change since 1900, "when one stands back and looks at the entire pattern of the American curriculum from a distance, the changes (aside from course proliferation) seem usually to mark variations on themes begun long ago."25
There is nothing surprising about the variety of aims and philosophies represented in contemporary American colleges. Uniformity of purpose and curriculum has always been more characteristic of European universities than of their counterparts in the United States. The traditional hallmark of higher education in this country has been its variety, featuring large colleges and small, secular and religiously affiliated institutions, single-sex and coed student bodies, and a rich mix of educational opportunities and programs. Now that student populations here and in Europe have expanded to include young people with widely differing interests and needs, many foreign educators look upon the diversity of our colleges as a strength rather than a weakness.
While the aims of a liberal education may be no more confused or contested today than they were a hundred years ago, specialization has undoubtedly increased. But specialization itself is not a new phenomenon. Even at the beginning of the twentieth century, William James complained of the narrowness of "the Ph.D. octopus."26 When educators called for a convention in 1904 to reaffirm the unity of knowledge "in this time of scattered, specializing work," the effort failed completely.27 By the 1920s, Rexford Tugwell could declare that "no one any more can hope to understand the whole of science as was very definitely hoped by the scientists a century or two ago; and so modern scientists gradually have abandoned the hope of a great intensive integration in favor of frankly specialized work in particular fields."28
Ensuing decades of academic research have fragmented knowledge even further, and colleges have admittedly paid a price. Many academic books and articles published today seem uncomfortably narrow, bound too closely by the confines and conventions of their discipline to do full justice to the problems they address. At the same time, it is hardly fair to ignore the progress that specialization has produced, or to argue as though different branches of knowledge could be combined in some finely integrated whole if only blinkered professors would try a little harder. No one yet has demonstrated convincingly that the drawbacks of fragmentation have outweighed the contributions to knowledge made possible by specialization. Nor has any general theory or universal method emerged to knit the separate disciplines together. The unity of knowledge remains an elusive ideal. Although college presidents prior to the Civil War may have drawn on knowledge from many different sources to teach their seniors about the major issues of the day, few knowledgeable observers would bring these opinionated patriarchs back into the classroom if they were alive today.
As for the rise of vocational education, it is true that the proportion of B.A. recipients graduating with occupational majors has grown since the 1970s. Yet the proportion today is not significantly higher than it was in the 1940s and 1950s.29 Moreover, vocational courses are hardly new; they have been a fixture in American higher education at least since the Morrill Act of 1862 declared that the leading object of the new land grant universities would be, "without excluding other scientific or classical studies, to teach such branches of learning as are related to agriculture and the mechanic arts."30 As Frederick Rudolph reported in his classic study of the curriculum, undergraduate education in the United States has always had a practical, vocational aspect.31 In the words of Christopher Jencks and David Riesman, "the question has always been how an institution mixed the academic with the vocational, not whether it did so."32
The recent growth in the number of students pursuing vocational degrees seems to have come about primarily for two reasons, both external to the university. One is the tendency of American employers to demand higher levels of knowledge and skill from those whom they employ. Thus increasing numbers of young people believe that they must look to college for the competence they need to secure a good job. Many of them are students who would previously have gone directly into the workforce to learn the necessary skills on the job.
The other reason for the growth of vocational majors is the marked increase in the number of students who look upon making money and succeeding in one's career as primary motivations for going to college. Since 1970, the percentage of freshmen who rate "being very well off financially" as an "essential" or "very important" goal has risen from 36.2 to 73.6 percent, while the percentage who attach similar importance to "acquiring a meaningful philosophy of life" has fallen from 79 to 39.6 percent.33 It is hardly a surprise that these trends have been accompanied by a growing number of students seeking to prepare themselves for a career.*
Against this backdrop, can one really blame universities for offering more vocational programs? Surely colleges have some responsibility to respond to the desires of their students. In a system in which colleges must compete vigorously for enough applicants to fill their classrooms, they could hardly do otherwise and survive. Oddly, critics rarely recognize this fact, nor do they even pause to explain just why it is wrong for colleges to offer vocational programs; the very mention of the term vocational is considered enough to demonstrate the unworthiness of the offending institutions. Such disdain has its roots in nineteenth-century England, at a time when influential writers, such as Cardinal Newman, championed a liberal education free of any practical vocational instruction, and the young gentlemen attending Oxford and Cambridge were not supposed to "go into trade."34 Such attitudes have little relevance to present-day America, where college students know they will spend most of the waking hours of their adult lives at work and are naturally concerned about choosing a career and preparing themselves to get a job.
There is admittedly much more to college than acquiring occupational skills. Nevertheless, it is misleading to speak of students in vocational programs as if they study nothing but practical subjects. What a vocational B.A. typically means is simply that a student has substituted a practical, job-oriented concentration for a major in a liberal arts discipline. The remaining courses in the undergraduate program (often more than half the total number required) are still available for general education and electives.35 It may be that vocational majors do not offer the same benefits as a concentration in a traditional discipline or that some vocational majors have expanded to the point that they interfere with the general education program. But these are narrower arguments that call for discussion. Even if true, they suggest a need to revise vocational concentrations, not to abandon them entirely.
As for the overall quality of undergraduate teaching, there is little indication of a growing neglect on the part of college faculties. The widely publicized complaints are usually focused on the 125 research universities whose professors divide their attention among research, graduate instruction, and teaching undergraduates. Even within this restricted group of institutions, critics have leveled serious charges, but no one has offered convincing evidence that the quality of instruction was ever better than it is today. After castigating Harvard for failing to promote several popular younger faculty, Charles Sykes observes: "This [neglect of teaching] was, by no means, always the case at Harvard. At one time, Harvard boasted such brilliant scholar-teachers as Henry James, Irving Babbitt (a teacher of T. S. Eliot), George Santayana, Joseph Schumpeter, and William Ernest Hocking."36 True enough, but it is also worth noting that, during the golden age to which Sykes refers, the Harvard College Committee on Instruction reported that "certain lecturers failed to interest, some were inaudible, and some wasted time dictating data or having it copied from the blackboard."37 In this respect, Harvard was not unique. As Robert Angell of the University of Michigan observed in 1928, "No one can deny that professors are interested in their fields of study; but many believe that frequently they have little ability in, or enthusiasm for, imparting their knowledge and interest to immature undergraduates."38 A perusal of student evaluations today at most major universities would tell much the same story, with many instructors receiving high praise along with others whose teaching could clearly stand improvement.39
History, then, offers weak support at best for the reports of a decline in the quality of undergraduate education. Loose allegations to that effect have little foundation in fact but instead rest on fanciful visions of some previous golden age.40 Other charges alleging a growing neglect of teaching or the loss of some grand unifying purpose are likewise unsubstantiated. Even the recent growth of vocationalism is not unprecedented and results from underlying causes that most universities could not and should not ignore.
Although the prominent critics of undergraduate education may have an imperfect grasp of history, nothing that has been said proves that colleges are above reproach. It may well be that undergraduate education has not suffered any discernible decline in quality over the past 50 or 100 years. But is that really a satisfactory outcome? Most human enterprise improves with time and experience. That is certainly true of consumer goods, athletic performances, health care, the effectiveness of our armed forces, the speed of our transportation and communication systems, and much else. Given the vastly expanded resources colleges have acquired, thanks to growing private donations, steadily rising tuitions, and massive infusions of federal financial aid, isn't it fair to expect the quality of education to improve as well?
To be sure, the undergraduate enterprise has grown in several dimensions. Millions more students enter college today than half a century ago. Countless new buildings have gone up; faculties have greatly increased in numbers; new courses of every kind fill college catalogues to overflowing. Undergraduates can now watch PowerPoint lectures, print out articles at their personal computers, and receive homework assignments via the Internet. But all these changes, however broad in scope, say very little about what is truly important. Has the quality of teaching improved? More important, are students learning more than they did in 1950? Can they write with greater style and grace? Do they speak foreign languages more fluently, read a text with greater comprehension, or analyze problems more rigorously?
The honest answer to these questions is that we do not know. In fact, we do not even have an informed guess that can command general agreement.
Colleges are not alone, of course, in finding it hard to demonstrate progress. Consumer products, medical treatments, and track and field performances may have improved demonstrably over the past 50 years, but there are plenty of activities besides undergraduate education for which it is difficult, even impossible, to render a convincing verdict. No clear consensus exists on whether the quality of architecture, poetry, or painting has improved since 1950, or whether lawyers are practicing their craft more skillfully or philosophers writing with greater insight.
This chapter ends, therefore, with an important question unresolved. Is it fair to judge the current state of undergraduate education as one might evaluate a consumer product, and ask for demonstrable improvements in quality? Or is the experience of college more like the writing of poetry and the practice of architecture, activities that normally defy such judgments, at least over periods of 50 or 100 years? This question will lurk beneath the surface of much that follows and will require at least a tentative answer before this book concludes.
Princeton University Press