From Christian Gentleman to Bewildered Seeker: The Transformation of American Higher Education

Russell K. Nieli

When a student arrives at the university, he finds a bewildering variety of departments and a bewildering variety of courses. And there is no official guidance, no university-wide agreement, about what he should study....The net effect of the student’s encounter with the college catalogue is bewilderment and very often demoralization.

—Allan Bloom, The Closing of the American Mind1

Contemporary higher education in America is faulted on many grounds. But no criticism has been more enduring over the past fifty or sixty years than the charge that the typical college or university curriculum in the United States offers students a smorgasbord of courses and choices without coherence, interconnection, or relevance to the deeper purposes of life. “Over-specialization,” “fragmentation,” “supermarket sweeps,” “incoherence,” and “alienating irrelevance” are but a few of the terms that have been employed to describe this situation; they are just as likely to be applied to education at some of the better liberal arts colleges as at the larger research universities. Even the most prestigious and venerable of America’s older institutions of higher learning, including Harvard, Princeton, Amherst, and Yale, come under this indictment.

The current state of the typical undergraduate curriculum was long in developing. In this essay, I will outline the major transformation in American higher education that began in the late nineteenth and early twentieth centuries and continued during the tumultuous period of the late 1960s and early 1970s. In this later period, many of the trends attacked by critics in the first half of the twentieth century greatly accelerated, and new, even more destructive developments were added to the educational mix. To understand where we are today it will be helpful to understand from whence we have come.

The Protestant Ascendancy

Virtually all the institutions of higher learning established in America in colonial times and most of those started in the nineteenth century were founded by individuals or groups affiliated with Protestant Christian religious denominations. A major goal of these founders was to pass on the moral, intellectual, and religious heritage of Christianity and Greco-Roman high culture to succeeding generations of the nation’s youth. At first, Congregationalists, Presbyterians, and Anglicans dominated the college-creation business, followed in the nineteenth and early twentieth centuries by a surge of college-creation activity by Methodists, Baptists, Lutherans, and members of several other Protestant denominations. Late in the game, following the large influx of Catholics from Ireland and then from Italy, Poland, and other parts of Southern and Eastern Europe, the Roman Catholic Church got into the college-founding business, establishing several denominational institutions of its own. The number of Catholic institutions, however, was dwarfed by the number of Protestant ones throughout the eighteenth and nineteenth centuries.

It is sometimes difficult for us to imagine today how many of our best-known colleges and universities, now so research- and vocation-oriented and far removed from any ecclesiastical influence, got their start as Christian religious institutions under the inspiration of evangelizing founders and clerics. Harvard, Yale, and Dartmouth were each created by pious New England Congregationalists; Princeton by “New Light” Presbyterians inspired by the first Great Awakening; William and Mary, Columbia, and the University of Pennsylvania by English-style Episcopalians; Boston University, Northwestern, Southern California, Syracuse, Vanderbilt, and Duke by Wesleyan Methodists; Brown and the University of Chicago by northern Baptists; and Georgetown, Fordham, and Notre Dame by Roman Catholics.

Even some of the most famous colleges and universities that were deliberately established on nondenominational and nonsectarian lines, including Stanford and Cornell, were conceived by their founders to reflect a general liberal Protestant or Unitarian religious spirit rather than a purely secular or rationalistic outlook. Leland and Jane Stanford, for instance, established Stanford in the late 1880s on nonsectarian grounds but directed the university’s trustees “to prohibit sectarian instruction, but to have taught in the University the immortality of the soul, the existence of an all-wise and benevolent Creator, and that obedience to His laws is the highest duty of man.”2 The thoroughly secularized university or college, with no claim to a higher moral or spiritual purpose, would have to await the early decades of the twentieth century for its maturation and development.

Even state-sponsored colleges and universities in the eighteenth and nineteenth centuries had a distinctly Christian religious flavor to them. With the elites of the day harboring very different views of desirable church/state relations from those that reign today, the public colleges in America usually reflected the attitudes, values, and worldview of Protestant Christianity, albeit one stripped to its essentials and devoid of many of the divisive theological doctrines that had led in the past to denominational strife.

The University of North Carolina at Chapel Hill, founded in 1789, was typical in this respect. Reflecting the educational views of some of its founding trustees, its early curriculum was patterned closely after Presbyterian Princeton, complete with the requirement that all students attend daily morning and evening religious services, in addition to Sunday services and Sunday evening examinations “on the general principles of morality and religion.”3 In the eighteenth and early nineteenth centuries, required attendance at chapel services was almost as universal among public colleges as at private schools. As late as 1890, one survey found that twelve of twenty-four state institutions of higher learning required chapel attendance, while most of the remainder offered Christian chapel services on a voluntary basis.4

During the long period when education was dominated by Protestant clergymen and other dedicated Reformation Christians, the educational system had a clear purpose, focus, and coherence. There was one simple and overriding goal: the production of morally earnest Christian gentlemen, well versed in liberal learning and in the classics of Greco-Roman and Biblical high culture, who would be able to assume leadership positions in American society, especially as clergymen, businessmen, lawyers, and elected officials. Not only the curriculum but the overall college experience in the age of ascendant Protestantism was marked by an overriding unity and a sense of collective purpose at once instructive and morally elevating. The culmination of this enterprise was the senior year course on moral philosophy, often taught by the college’s clergyman president, which explained to the students their ultimate moral obligations to God, their families, their nation, and their church.

The typical college curriculum in the eighteenth and early nineteenth centuries consisted of something very close to the medieval trivium (grammar, rhetoric, and logic) and quadrivium (arithmetic, astronomy, geometry, and music), with the emphasis on classical language learning (Latin and Greek) and mathematical knowledge. Besides the Bible, students would be expected to read many of the best of the ancient Greek writers, including the tragedians, Plato, Aristotle, Xenophon, and Plutarch, as well as the best of the Latin authors, especially Cicero, Virgil, Livy, Seneca, and Marcus Aurelius. Instruction was generally uniform—everyone took the same prescribed course of study over a four-year period, with few if any “electives.” The instructors, a large portion of whom were ordained clergymen, were expected to be “generalists” who could be called upon when necessary to teach a variety of courses. There were no “research professors,” and none whose work was not concentrated on the instruction of the young. College professors and tutors were also expected to set an example of moral rectitude and good Christian living for the students under their charge.

In addition to the unified structure and purpose embodied in the college curriculum, most American colleges in the eighteenth and early nineteenth centuries possessed what might be called a ritual or ceremonial unity born of the fact that prayer services in the college chapel, typically required of all students and attended by many of their professors and tutors, began and ended each academic day. These daily prayer activities helped to drive home symbolically the ennobling ideal that the entire educational enterprise was to be dedicated to the service of God and the furtherance of a higher moral and spiritual purpose.5 Even when colleges and universities began their rapid transition to secularism in the early years of the twentieth century, removal of the required chapel service was slow in coming. Arthur Hadley, for instance, who in 1899 became Yale’s first non-clergyman president, resisted the growing tendency of his era to make chapel attendance optional on the grounds that optional attendance would “[interfere] with the coherence of the student body.” As one Yale faculty member wrote at the time: “Students and alumni in large numbers express their approval of required prayers because of the inspiration which comes from seeing so many students together, and feeling one’s self a member of a great institution.”6 The required chapel service would remain an integral part of many of America’s foremost universities until well into the twentieth century.

Evangelical Protestantism today is often associated with fundamentalist Biblicism, anti-intellectualism, and suspicion of, if not hostility toward, much of modern natural science and its tradition of free inquiry. This, however, is largely a product of late nineteenth and early twentieth century developments, especially the controversies over Darwinian evolutionary theory and modern Biblical criticism. In the two-hundred-year period between the European settlement of New England and the Civil War, Protestant Christianity presented a decidedly different public face. Evangelical Christians typically saw themselves as being in the forefront of efforts to combine the best in secular learning with the spiritual and religious truths contained in the Bible. They saw no conflict between theological and secular knowledge. Both were ultimately understood as rooted in God’s providential design—in the gift of Creation, the formation of man and nature, and in the gracious disclosure of the soteriological and spiritual truths contained in the life of Jesus and the Bible.

Different methods and faculties might be used to discern the nature of these two truths, but no inherent contradiction was seen between them. Indeed, no contradiction could exist between God’s two great gifts to mankind—the created natural order and the divinely revealed Scripture. As one influential Williams College professor wrote in a popular apologetic textbook of the 1840s, “If God has made a revelation in one mode it must coincide with what he has revealed in another.”7 The vast majority of the Congregationalist, Presbyterian, Episcopalian, Methodist, and Baptist clergymen of the era would all have subscribed to some version of this claim. Nature properly investigated, it was held, would disclose the mystery and wonder of God’s creative hand, just as the Bible, properly explored and meditated upon, disclosed the truths of his supernatural revelation.

Protestants at this time often took pride in their openness to secular science and their belief in free inquiry, as well as in their adoption of congregationalist or republican-style principles of governance in both church and state, contrasting each of these with what they saw as the darkness of Roman Catholicism. The Roman Catholic Church was associated with the persecution of Galileo and other early explorers of nature, as well as with the defense of monarchic tyranny, both secular and religious. As in England and Holland, the Protestant clergy in America often saw itself as the more advanced segment of modern society and looked down upon the obscurantism and backwardness it associated with the pope and Catholicism. As historian George Marsden writes, in late eighteenth and early nineteenth century America, “it was a widely shared article of faith that science, common sense, morality, and true religion were firmly allied....All the major Protestant parties agreed that commonsensically based scientific understanding of God’s revelation in Nature confirmed his revelation in Scripture....Protestantism was identified with the advances of civilization and the cause of freedom.”8

A phrase commonly used during this early period of ascendant Protestantism was “unity of knowledge.” All knowledge, whether secular or religious, was seen as united in a coherent and integrated whole, and it was the purpose of higher education to cultivate an understanding of the various elements of knowledge in their relationship both to one another and to the more encompassing manifold. There was no sharp distinction to be made between moral and empirical knowledge, and both were to be employed in the service of individual character formation. As a famous 1828 Yale report stated, education must have as its goal “a proper balance of character”—and this balance was to be achieved by a broadly based liberal arts education, by regular devotional and prayer exercises, and by the living example of cultivated Christian gentlemen who provided day-to-day role models for those entrusted to their care.9

This holistic approach to theological and secular learning was greatly facilitated in the eighteenth and early nineteenth centuries by the popularity in America of Anglo-Scottish philosophy, particularly what came to be known as the school of “common sense.” Combining some of the salient features of Protestant Christianity with the common-sense style of philosophical and scientific inquiry that came to dominate the eighteenth-century Scottish universities, Anglo-Scottish philosophy stamped the character of American colleges in the eighteenth and early nineteenth centuries much the way Thomism would later stamp the character of the Catholic universities of the twentieth century.

The fact that the Anglo-Scottish Enlightenment—in stark contrast to its French counterpart—was integrative, balanced, and not hostile to religion had a salutary effect on the character of early American higher education. The most frequently read thinkers of the period, including Joseph Butler, John Locke, Adam Smith, Dugald Stewart, William Hamilton, and Thomas Reid, integrated moral, spiritual, and social concerns in varying ways that tried to do justice to the dual imperatives of high morals and sound practical judgment. Science and religion, the humanities and the mathematical arts, Christianity and material culture, were all seen as part of a single unified body of knowledge. The later tears in the intellectual fabric that would give rise to the acrimonious controversies pitting science against religion were blissfully absent.

From Christian Gentleman to Research Specialist

The decades following the Civil War saw a steady and cumulative alteration in the nature of higher education in America. Over time, this change would transform the American college into something that would have been all but unrecognizable to the college founders of an earlier era. During this period, several of the nation’s most influential educators came to believe that the older style of liberal arts education in America, represented by the Protestant denominational college, was not up to the task of educating young men to assume creative roles in an increasingly urban, industrial, and science-based society. While acknowledging its strengths, these critics charged that the older curriculum did not leave enough room for advancing knowledge in the expanding sciences of the day. The traditional emphasis on humanistic, religious, and linguistic education, they said, impeded the growth of the kind of technical knowledge that would be necessary for America to continue as a leader in economic and scientific progress. What might have been an appropriate curriculum for an agriculturally based nation of artisans, shopkeepers, and small farmers, some contended, would no longer suffice for an increasingly industrial and science-based economy.

A number of prominent college presidents of this period were in the forefront of educational reform. These included Charles Eliot of Harvard, Andrew Dickson White of Cornell, Frederick Barnard of Columbia, Daniel Gilman of Johns Hopkins, and James Angell of the University of Michigan. All of these educators believed that American higher education had to undergo dramatic changes to keep up with the changing demands of America’s increasingly science- and technology-based economy and the explosion of knowledge in many academic fields. Failure to do so, they believed, would consign America to the backwaters of educational progress. Columbia’s president Frederick Barnard declared in 1866 that the central educational problem of the day involved the development of new kinds of programs in America’s colleges and universities that would meet the needs of a rapidly growing, modern technological society. “Until a comparatively recent period,” Barnard declared, “our higher institutions of learning...have been equal to the wants of the country. But with the advancement of human knowledge and the growing diversity of the arts of civilized life new fields are opening and new wants springing up which imperatively demand the creation of new agencies.”10

The “new agencies” Barnard had in mind included new and more specialized college courses (particularly in the natural and social sciences), an expanded faculty and student body, and a new focus on cutting-edge scholarship and research. Although early reformers may not have been aware of it—and some would probably not have approved of it if they had—they were embarking on an enterprise that within little more than a generation would transform many of the older Protestant liberal arts colleges into modern, research-oriented universities that retained little if anything of their classical or Christian past.

For many of the educational reformers, the research universities in Germany offered some of the qualities that American colleges should seek to emulate. Many leading scholars in the latter half of the nineteenth century had studied in Germany as graduate students and were impressed by many of the features of the German university. In particular, they admired the work of the full-time professors who labored in highly specialized branches of knowledge seeking to expand the frontiers of their respective disciplines. As Marsden writes, “It would be rare to find either a university leader or a major scholar [in the latter part of the nineteenth century] who had not spent some years studying in Germany....Americans stood in awe of the German universities....For Americans, who in university building were behind just about every European country, an appeal to a German precedent could be an intimidating argument.”11

The advancement of knowledge, men like Frederick Barnard believed, required a huge expansion in the size of the college faculty and the scope of the college curriculum. Just as important, they held, was hiring faculty strictly on the basis of their excellence in a specialized field of knowledge, without regard to their moral or religious convictions. Such a basis of selection had far-ranging and often unforeseen consequences for the overall cultural and religious cohesiveness of an institution.

With the expansion in the size and scope of faculty and course offerings, the older ideal of the “unity of knowledge” was placed under the severest of strains. Since students could only take a small fraction of the courses offered in the ever-expanding college course catalogue, some principle of selection had to be devised. The most common response to this problem was the adoption of the elective system.

Pioneered by Harvard College in the 1870s under the leadership of Charles Eliot, the elective system, which would gradually be adopted by other leading colleges and universities over the next generation, represented the triumph of diversity and educational choice over the older ideal of a common curriculum. It was well fitted to the American college’s new emphasis on being in the forefront of expanding knowledge. When first introduced, Harvard’s elective system gave undergraduates virtually free rein to take whatever courses they wanted—or didn’t want—although in most colleges and universities (including Harvard) the elective system would in time be modified by a “majoring” or “concentrating” requirement similar to the one in place today.

Undoubtedly, the reforms instituted by men like Eliot and Barnard helped elevate American institutions of higher learning. With the destruction of the German universities by the Nazis in the 1930s, American colleges and universities assumed their present status as the finest research institutions in the world. But there is also no doubt that something precious had been lost in this transition. With the declining importance of literature and the humanities within the ever-expanding list of college course offerings and the rise in importance of natural science, economics, and vocationally oriented graduate and business programs, there was a clear loss of educational cohesiveness and shared educational mission. By the early twentieth century, colleges and universities were rapidly abandoning their older mission of passing on a valued spiritual and intellectual heritage to succeeding generations. This trend was greatly accelerated in the early twentieth century by the declining presence of clergymen in higher education as teachers, tutors, college presidents, and trustees.

While these changes opened many exciting new educational opportunities, for students not focused on a specialized vocational course of study the undergraduate college experience could be marked by an acute sense of bewilderment and anomie. This loss of coherence and civilizing mission, and the resulting bewilderment and sense of drift that it produced in many students, became the target of many critics of American higher education, especially in the years from the early twentieth century to the present.

Early Voices of Protest

The shift from the older Christian liberal arts college to the modern research-oriented “multiversity” did not occur without protest. From the very beginning of the modern reform era in the 1870s and 1880s, there was a spirited and articulate opposition to most of the things that reformers at places like Harvard, Columbia, Cornell, and Michigan were trying to do. Many of these early educational reformers sincerely believed that it would be possible to combine the culture- and character-forming aspects of a Christian liberal arts college with the greater choice and opportunities offered in a larger institution that conducted state-of-the-art research and provided an expanded and diversified array of courses. But their hopes proved to be an illusion, as their many critics accurately foresaw.

The two most influential representatives of the anti-reformist position in the closing decades of the nineteenth century were Yale’s president Noah Porter and Princeton’s president James McCosh.12 Both were devout Christians who saw such changes as Harvard’s adoption of the elective system and its abandonment of the daily chapel requirement as mortal threats to the traditional meaning and mission of America’s institutions of higher learning. Porter was a distinguished philosopher who had spent two years as a young man studying in Germany and was by no means hostile to all of the changes that colleges were undergoing. But he believed that scholarship and research should be carried on within the college setting primarily by liberally educated gentlemen or clerics who all shared a basic commitment to fundamental Christian values. “American colleges,” he wrote in 1869 shortly before assuming the helm at Yale, “should have a positively religious and Christian character.”13 This was particularly important in his own day, he contended, as a way of correcting and offsetting what he saw as the selfish and materialistic tendencies in much of modern culture and intellectual life.

Porter was particularly disturbed by the increasing popularity among the intellectually advanced segments of the public of atheistic, agnostic, and otherwise anti-Christian thinkers like Auguste Comte, Herbert Spencer, and T. H. Huxley. In a much-publicized dispute, he objected when William Graham Sumner, Yale’s high-profile Social Darwinist professor of sociology, used a textbook by Herbert Spencer in his main undergraduate course. While Porter admired some aspects of the German research universities and did not believe in a strict denominational test to teach at Yale, he thought that no one should be allowed to use his college teaching position to espouse views hostile to Christianity or to religious faith. Colleges inevitably teach some kind of value system or “theology,” Porter argued, so the real question is “what theology it shall teach—theology according to Comte and Spencer, or according to Bacon and Christ.”14 “Religious influences and religious teachings,” he said, “should be employed in colleges in order to exclude and counteract the atheistic tendencies of much of modern science, literature, and culture.”15 Porter also believed in drawing upon classical education, as traditionally conceived, to further the college’s mission as educator and guide to the young. Failing to do so, he feared, would gravely undermine the traditional Christian educating and civilizing mission of colleges like Yale.

Three-quarters of a century after Porter’s time, many of his complaints about the direction in which modern higher education was moving would be echoed by William F. Buckley Jr., then a young firebrand conservative and recent Yale graduate. In 1951 Buckley published a stinging indictment of what he believed was Yale’s anti-Christian bias that restated many of Porter’s arguments against the trends of his day. Buckley’s book, God and Man at Yale,16 touched off a spirited round of soul-searching among Yale’s administrators and trustees, but it was no more able to change the secular and modernist direction in which Yale and other top colleges and universities were headed than the earlier protests of Porter.17

James McCosh, president of Princeton from 1868 to 1888, was also deeply troubled by many of the developments in American colleges during this period. Harvard’s free-ranging elective system was a special target of his criticism. He was alarmed by the fact that one could go through four years at an institution like Harvard and never be exposed to a course on religion or morality. In 1885, in a much-publicized debate in New York City with Harvard’s president Charles Eliot, McCosh accused Harvard of having abandoned its religious and character-forming mission and scornfully suggested that the motto over the gates of Harvard Yard should read, “All knowledge imparted here except religion.”18 Like all the founders of America’s early colleges, McCosh believed that religion and the classical heritage were important for building up and reinforcing good morals among the young. But he believed that Christianity was important for other reasons as well. In modern times, McCosh argued, students were asking troubling questions about ultimate meaning and whether life is worth living. The prevailing secular and agnostic trend in modern science, he said, offered no answer to such searching questions and thus was often a source of great anguish for students. Religion, and more specifically Christian faith, could provide comfort and assurance to such troubled souls, McCosh held, in ways that the philosophies of men like Herbert Spencer and T. H. Huxley could not.19

McCosh supported chapel attendance requirements and believed that colleges like Princeton should hire only committed Christians to the faculty. He also believed in the traditional classical curriculum, although he would depart from it in allowing somewhat wider student choice of subjects. A nationally respected philosopher and scholar, McCosh exerted considerable influence during his many years at Princeton and was probably one reason why Princeton retained important elements of its classic and Christian heritage longer than most of its Ivy League rivals.

The atomizing, secularizing, and fragmenting trends in higher education that alarmed critics like Porter and McCosh continued at an accelerated pace in the first half of the twentieth century and produced a host of outspoken critics. The most influential in the early decades were the so-called “new humanists,” a loosely affiliated group of literary-oriented intellectuals whose most influential spokesman was the Harvard comparative literature professor Irving Babbitt. Babbitt had himself studied in Europe, but in Paris at the Sorbonne rather than in Germany, and he was highly critical of the effect that German scholarship was having on the older style of American classical and literary education. Babbitt particularly disliked German philology—a prestigious scholarly enterprise in his day—which he saw as too often leading to trivial pursuits and to the loss of appreciation for great literature.

In his first book, Literature and the American College (1908), Babbitt charged that the German philologists, in their obsession with tracing the history of words, had missed the meaning of literature and were often engaged in a thoroughly useless project: “So far from asking himself whether his work will ever serve any practical purpose,” he wrote, the German specialist in philology “never stops to inquire whether it will serve any purpose at all.”20 Classical and literary studies in America, Babbitt believed, had been greatly harmed and trivialized by the influence of German scholarship, which had lost the ability to discern what was truly great in great literature. “We should be grateful to the Germans for all we have learned from them,” Babbitt conceded, “but at the same time we should not be their dupes.”21

Babbitt, like his fellow humanists, believed that a revival of literary and classical studies was needed to counteract the American overemphasis on science and business. Exposure to great literature, philosophy, and art, Babbitt and like-minded humanists believed, had an ennobling effect on the conscientious reader that was at once aesthetic, intellectual, and moral. Babbitt also opposed the branch of modern literature, descended from Rousseau, that celebrated sentimentality and unrestrained feeling over self-discipline and cultivated taste. He believed that a revival of the best in Christian and particularly classical Greek literature and philosophy would provide an antidote to both Rousseauean romanticism and scientific materialism and help stop the precipitous decline in moral and aesthetic tastes.22

Although not formally affiliated with the “new humanists,” one of the most influential educational innovators of the early twentieth century was the Columbia literature professor John Erskine. Like Babbitt, Erskine believed that college education in the late nineteenth and early twentieth century had lost its coherence and its sense of a civilizing mission, and that this development resulted from an over-emphasis on scientific and material progress and a neglect of the older ideal of character-formation. Erskine’s response was to introduce in 1920 a two-year undergraduate honors course on “the masterpieces” of Western philosophy and literature. Erskine’s honors course at Columbia became a model for what would later be referred to as the Great Books approach to learning. In 1937 it was adopted by Columbia University as a required sequence for all entering freshmen. Erskine believed that exposing students to the greatest literature and philosophy that Western culture had produced would not only pass on to the young a valued heritage, but would help to expand human sympathies and counter what he saw as the shallowness and materialism of much of contemporary American life.23

Erskine’s mission was carried on with even greater zeal and effect by the most famous graduate of his honors course, the New York-born Jewish intellectual Mortimer Adler. Adler, the son of immigrant parents, continued at Columbia through the Ph.D. degree, and after teaching there for a short time, moved on in the early 1930s to the University of Chicago, where he began a long-time collaboration with the university’s young Wunderkind president, Robert Maynard Hutchins. In the 1930s and 1940s Adler and Hutchins became the great apostles of the Great Books approach to learning and both also supported the revival of the kind of metaphysical and natural law philosophy associated with the great Western thinkers of the past, especially Plato, Aristotle, and Aquinas. In the early 1950s they helped edit a 54-volume collection of Great Books of the Western World intended for use in college courses.

Like most intellectuals who came of age in the era of European totalitarianism, Adler and Hutchins believed that Western civilization had to spell out clearly what it stood for if it was ever to retain the allegiance of those fighting in its name against fascism and Stalinism. Adler, in fact, believed that the loss of the older classic and Christian vision that had once sustained the West, especially the Platonic-Aristotelian-Thomistic tradition of metaphysics and natural law, was the real source of Western civilization’s weakness in the face of totalitarian danger. The positivism and philosophic materialism of the intellectual elite, Adler charged, were at the root of America’s cultural weakness and decline. In an incendiary speech titled “God and the Professors,” delivered at New York’s Jewish Theological Seminary in 1940, Adler accused the American professoriate of being the main vehicle of the positivist corruption. His speech drew widespread commentary and impassioned opposition. “The most serious threat to Democracy,” Adler declared, “is the positivism of the professors, which dominates every aspect of modern education and is the central corruption of modern culture.” Democracy, he went on, “has much more to fear from the mentality of its teachers than from the nihilism of Hitler.”24

In a more relaxed mode, Hutchins tried to make the case for a revival of classic philosophy and metaphysics in a book he published in 1936, The Higher Learning in America, which became the most discussed work on higher education over the ensuing decade.25 In this thin book Hutchins charged that higher learning in America had fallen into a state of anarchy and chaos due to the loss of its organizing center and the bewildering proliferation of new disciplines and sub-disciplines. This development was closely tied to the worship of money and to an overemphasis on natural science: “Our erroneous notion of progress,” Hutchins charged, “has thrown the classics and the liberal arts out of the curriculum, overemphasized the empirical sciences, and made education the servant of any contemporary movements in society, no matter how superficial.”26 This development, he continued, had robbed education of its function in providing a common culture and was responsible for education’s current state of fragmentation and incoherence. “Unless students and professors have a common intellectual training,” he wrote, “a university must remain a series of disparate schools and departments, united by nothing except the fact that they have the same president and board of trustees. Professors cannot talk to one another, not at least about anything important.”27

What was needed most, Hutchins believed, was a knowledge of “first principles” that would enable the college or university to see how all the disciplines fit into a hierarchically structured and integrated whole. Such first principles, he believed, were explored by Greek and Christian metaphysicians and could be rediscovered today through the exercise of human reason. While Hutchins, like Adler, did not believe that theology, as traditionally understood, could provide the moral and spiritual foundation for genuine cultural revival (“we are a faithless generation and take no stock in revelation”), he believed that metaphysical knowledge of the higher things in life was both possible and urgently necessary. Hutchins proposed a radical solution: a) provide only a general education based on the Great Books for the first two undergraduate years; b) eliminate the current disciplinary boundaries and consolidate the university’s departments into just three divisions—natural science, social science, and metaphysics; and c) establish an Aristotelian-like hierarchy and interconnection of all the branches of knowledge, with metaphysical knowledge providing the unifying matrix. Hutchins’ proposals, however, were severely criticized—John Dewey and Sidney Hook mocked the ideas of Adler and Hutchins as a “new medievalism”—although the Great Books component of his program would gain considerable currency and be adopted at least in a scaled-back version by several American colleges and universities in the 1940s and 1950s.28

One of the universities to give serious consideration to the Great Books idea in the 1940s was Harvard. By the middle of the decade it had become clear to many on Harvard’s faculty that the massive proliferation of courses, departments, fields, and sub-fields, coupled with the huge expansion in the size of the faculty, staff, and student body, had produced a loss of unity and common purpose among members of the university community, one most acutely felt among the undergraduates at Harvard College. To address this situation and the more general issue of education in America, Harvard’s president James Bryant Conant appointed a distinguished committee of the Harvard faculty, chaired by the dean of the faculty of arts and sciences, Paul H. Buck. The committee issued a comprehensive report in 1945 titled General Education in a Free Society.29 This report, the most thoughtful and comprehensive treatment of the state of higher education in America published up to that time, gained a large and respectful audience.

At Harvard College and elsewhere, the report concluded, liberal education had been deprived of “any clear, coherent meaning” by the proliferation of courses and expansion in student choice. Even with the system of majoring and concentrating that had been instituted to offset the pure elective system, there was a universally perceived need, the report suggested, for greater coherence and some unifying bond to hold together the undergraduate curriculum. The “headlong growth of knowledge,” the report stated, was largely responsible for this situation, and there was certainly much to be said for this development: progress itself had caused the problem, but progress was still progress and could not and should not be rolled back. The report described the contemporary situation in the following paragraph:

A supreme need of American education is for a unifying purpose and idea. As recently as a century ago, no doubt existed about such a purpose: it was to train the Christian citizen. Nor was there doubt how this training was to be accomplished. The student’s logical powers were to be formed by mathematics, his taste by the Greek and Latin classics, his speech by rhetoric, and his ideals by Christian ethics. College catalogues commonly began with a specific statement about the influence of such a training on the mind and character. The reasons why this enviable certainty both of goal and of means has largely disappeared have already been set forth. For some decades the mere excitement of enlarging the curriculum and making place for new subjects, new methods, and masses of new students seems quite pardonably to have absorbed the energies of schools and colleges. It is fashionable now to criticize the leading figures of that expansive time for failing to replace, or even to see the need of replacing, the unity which they destroyed. But such criticisms, if just in themselves, are hardly just historically. A great and necessary task of modernizing and broadening education waited to be done, and there is credit enough in its accomplishment. In recent times, however, the question of unity has become insistent. We are faced with a diversity of education which, if it has many virtues, nevertheless works against the good of society by helping to destroy the common ground of training and outlook on which any society depends.30

How to restore unity of purpose as well as the morally and spiritually elevating features of the older Christian liberal arts college had become the challenge of the day, the report said. A variety of solutions had been proposed. One solution, which the report was quick to dismiss, was the Roman Catholic one. “Sectarian, particularly Roman Catholic, colleges have of course their solution,” the report explained, “which was generally shared by American colleges until less than a century ago: namely, the conviction that Christianity gives meaning and ultimate unity to all parts of the curriculum, indeed to the whole life of the college.” However, such a solution, the report concluded, “is out of the question in publicly supported colleges and is practically, if not legally, impossible in most others.”31

How can a new unity and a new common culture be fostered when the older style of denominational college, well represented by the early motto on Harvard’s seal, Christo et Ecclesiae, is the product of a bygone era that cannot be restored? One possibility suggested in the report is a Great Books program, such as that started at St. John’s College in Maryland, where students would spend their entire four undergraduate years reading many of the greatest books in philosophy, religion, mathematics, science, and the like, drawn from the entire sweep of Western history beginning with the Jews and Greeks. Electives would be largely eliminated, and all would share in a common curriculum. The Harvard committee, however, rejected this approach, at least in its radical form, as inconsistent with the current departmental structure of modern colleges and universities. “The much criticized departmentalization of the colleges is but a product of the enormous growth and specialization of learning during the past two or three generations,” the report explained, “and it would be entirely unrealistic and out of keeping with the growth of higher learning in modern times to propose that this differentiation should be supplanted by an organizational scheme unrelated to the existing specialization and diversification.”32

While the committee rejected the total-immersion Great Books approach of a college like St. John’s, the report’s final recommendation was for two courses to be required of all undergraduates that would draw upon the Great Books principle. One of the proposed courses, “Great Texts of Literature,” would survey many of the classics of Western literature, while the other proposed course, “Western Thought and Institutions,” would deal with the leading philosophers and political thinkers of the West, beginning with the classical and Christian periods. Required readings for the first course, the report suggested, might include “Homer, one or two of the Greek tragedians, Plato, the Bible, Virgil, Dante, Shakespeare, Milton, [and Tolstoy],” while the other might include selections from the writings of Aristotle, Aquinas, Machiavelli, Luther, Bodin, Locke, Montesquieu, Rousseau, Adam Smith, Bentham, and Mill.33 Something like the second proposed course had been successfully taught at Columbia for many years, the report noted, and the committee members believed it could be successfully taught as a required course at Harvard and elsewhere. The committee’s recommendations would later influence many colleges in instituting required survey courses in Western civilization and other “core curriculum” subjects. While this by no means reestablished the unity and coherence of the older Christian liberal arts college, it was clearly a step in that direction.

Destructive Generation: The 1960s and Beyond

The 1940s and 1950s saw a continued expansion in the size of America’s university population as the G.I. Bill and a return to economic normalcy following the Great Depression enabled larger segments of the population to afford a college education. Many educators during this period had absorbed the criticisms leveled against American higher education by critics like Adler, Hutchins, and Harvard’s Buck committee. They tried to maintain some kind of balance between “core curriculum” imperatives that added meaning and coherence to undergraduate instruction and the demands of ever more specialized academic and vocational disciplines.

The huge migration of some of Europe’s most outstanding scholars during the fascist and World War II years, in addition to greatly enriching American academic life, helped slow down the long-term trend toward greater specialization and fragmentation of university instruction. European émigré scholars, including many who rose to great prominence in American universities during this period, had often received a rigorous classical education in advanced secondary schools like the German and Austrian Gymnasien, and were among the severest critics of the narrowly specialized modes of inquiry in many American academic disciplines. Some of the better-known and more influential academics in this category included Jacques Maritain, Hannah Arendt, Leo Strauss, Erich Fromm, Paul Ricoeur, Mircea Eliade, Eric Voegelin, Carl Friedrich, Ludwig von Mises, Erik Erikson, Hans Morgenthau, Joseph Schumpeter, Friedrich Hayek, Walter Kaufmann, Pitirim Sorokin, Paul Tillich, and Waldemar Gurian (to name just a few). So great was the influence of these scholars that Allan Bloom, who met many of them as a student at the University of Chicago in the early 1950s, could later write that “no universities were better than the best American universities [in the early 1950s] in the things that have to do with a liberal education and arousing in students the awareness of their intellectual needs.”34

Americans who went to college during this period, many of whom were the first in their families to afford study beyond high school, were an unusually serious, mature, and appreciative lot. Having passed through the crucible of the Depression years and the Second World War, they would later be celebrated in some quarters as America’s “Greatest Generation.” With the Cold War as backdrop, few from this generation needed convincing that, in the West’s civilizational clash with Communism, Americans required some understanding of what they believed in. Thus colleges were only too eager to offer courses in “The Western Tradition.” Exactly what was good (or not good) in the tradition of the West was a matter of dispute—the good could be located in the American political tradition of democracy and freedom, in the Reformation theological tradition of religious protest and Biblical faith, in the humanism of the Renaissance, in the Aristotelian-Thomistic natural law tradition, in the Anglo-Scottish tradition of common sense, in the classical tradition of Greco-Roman high culture, or (as was more commonly assumed) in some judiciously selected and qualified combination of these. But it was generally agreed that valuable elements in the Western past urgently needed to be explored and defended, and both the high school and college curriculum were seen as appropriate places for this to be done.

In the tumultuous period that began in the late 1960s, however, all these assumptions would be called into question. The 1960s today are remembered for many things including student protests against the Vietnam War, urban race riots, romantic fantasies about new political utopias being created in Cuba and other Communist lands, and revolutionary changes in mainstream attitudes towards hair, authority, sex, drugs, gender roles, and music.35 The period also saw dramatic changes in the way colleges and universities viewed themselves in their relationship to students and student demands. These changes in perspective would have a momentous impact on those meager remnants of the older liberal arts education that had managed to survive until that time.

The situation has been well described by former education secretary William Bennett: “When students demanded a greater role in setting their own educational agendas,” Bennett writes, colleges and universities “eagerly responded by abandoning course requirements of any kind and with them the intellectual authority to say to students what the outcome of a college education ought to be.” There was, Bennett continues, “a collective loss of nerve and faith on the part of both faculty and academic administrators during the late 1960s and early 1970s [that] was undeniably destructive of the curriculum....The curriculum was no longer a statement about what knowledge mattered; instead, it became the product of a political compromise among competing schools and departments overlaid by marketing considerations....Because colleges and universities believed they no longer could or should assert the primacy of one fact or one book over another, all knowledge came to be seen as relative in importance, relative to consumer or faculty interest.”36

The final outcome of this process, says Bennett, was the transformation of American institutions of higher learning into educational bazaars where students go about like tourists looking for cheap bargains. This pernicious development, moreover, quickly spread downward. It was no surprise, Bennett says, that “once colleges and universities decided the curriculum did not have to represent a vision of an educated person, the secondary schools and their students took the cue and reached the same conclusion.”37

Educational scholar Diane Ravitch offers a similar assessment of the late 1960s and beyond. During this period, Ravitch says, mandatory courses and structured curriculum requirements were either diminished or abolished and ever fewer demands of any kind were made upon students. “There was no end to the bold ideas to reduce the academic focus of the schools,” Ravitch writes, “or to demonstrate that students could somehow lead themselves to water and then persuade themselves to drink.”38 University professors in this post-1960s dispensation, she says, no longer saw themselves as educators whose purpose was to help students to stand on the shoulders of giants; it was not their task to expose students to any accumulated wisdom of the ages or to transmit a valued culture or tradition. Professors had been reduced to the servants of immature youth and their fickle demands.39

For those who had witnessed the great leavening effect upon the universities of the European migration of the 1930s and 1940s, the developments of the late 1960s and early 1970s came as a traumatic shock. As they saw it, the barbarians stood before the gates, and the gatekeepers eagerly invited them in. A combination of cowardice, bewilderment, and confusion, they believed, had destroyed what remnants of the older liberal high culture had managed to survive the many eroding forces of Western modernity. The college curriculum that existed at places like Cornell in the 1950s and early 1960s, Allan Bloom wrote, was at least “a threadbare reminiscence of the unity of knowledge”; it provided “an obstinate little hint that there are some things one must know about if one is to be educated.” “You don’t replace something with nothing,” says Bloom, but that is exactly what the educational reformers of the 1960s, in their do-your-own-thing approach to the college curriculum, successfully managed to do.40 It was the triumph of nihilism.

Bloom ruefully describes how he sat on various committees at Cornell during this period “and continuously and futilely voted against dropping one requirement after the next.”41 Something was indeed being replaced by nothing, and even the most impassioned protests by educators like Bloom were unable to stop the trend.

As if this assault hadn’t been enough, the late 1980s and early 1990s witnessed a renewed attack on the idea of a common core curriculum, at least one structured along traditional lines. The leaders of this attack were a motley coalition of students and young professors (many of the latter aging 1960s-era radicals) who identified with various radical feminist, black activist, gay rights activist, or Third-Worldist causes. They pressured many colleges and universities into changing those elements of the “Western Civ” curriculum that still survived in order to accommodate their very different agendas. The watchword of the movement was “multiculturalism,” an ambiguous term with a politically charged meaning. “Hey, hey, ho, ho, Western culture’s got to go” was how student activists at Stanford expressed their demands, and the Stanford faculty’s surrender to their importunities provided a model for student activism that would be successfully emulated elsewhere.42

Under student pressure, the Stanford Faculty Senate voted overwhelmingly in the spring of 1988 to change the required three-semester core curriculum sequence, which had previously focused on European history and the classics of Western literature and philosophy, into a more diffuse, less coherent mélange of courses and subject matter bearing the mushy title “Cultures, Ideas, and Values.” The new CIV sequence focused on previously marginalized and oppressed groups and their perspectives and was frequently accompanied by an unspoken contempt for what was seen as the traditional viewpoints of “dead white Western males.” Besides the new three-semester CIV curriculum, Stanford adopted an additional course in late 1990, one required of all entering freshmen, that focused on the writings of blacks, Hispanics, feminists, and homosexuals. Once again this would set an example for other, less prestigious institutions, some of which would require that students take courses on non-Western cultures or multiethnic studies but not courses on Western history or Western culture. The result was an even greater neglect of the once venerable masterpieces of Western literature and philosophy. “Alice Walker’s The Color Purple,” one knowledgeable observer remarked in the early 1990s, “is taught in more English departments today than all of Shakespeare’s plays combined.”43

The protesters’ demand for “multiculturalism,” however, was in many ways very different from what it appeared to be on the surface. If the only change sought had been to include some of the great masterpieces of literature and philosophy from non-Western cultures in a Great Books or similar approach to learning, few would have objected. Including the Bhagavad Gita, the Tao Te Ching, the Tibetan Book of the Dead, or the poetry of Rumi in a required course on literature or philosophy is a curriculum expansion even many conservative traditionalists would applaud. “The humanities,” former education secretary Bennett writes (echoing Matthew Arnold), can be described as “the best that has been said, thought, written, and otherwise expressed about the human experience. The humanities tell us how men and women of our own and other civilizations have grappled with life’s enduring, fundamental questions.”44 Bennett, like virtually every other prominent educator in America during the rage for multiculturalism, would have agreed that non-Westerners have produced some of “the best that has been said, thought, written, and otherwise expressed about the human experience.”

But the multiculturalist protesters of the late 1980s and early 1990s were more intent on deprecating Western achievements and projecting onto a worldwide canvas their own fantasies and hostilities about a supposedly unique Western proclivity for oppression and violence than on coming to understand the greatness of any of the intellectual achievements of non-Westerners. Their goals were primarily symbolic and political, not intellectual or scholarly. This can be readily discerned from the fact that, for all the changes in the required reading lists that multiculturalists brought about at places like Stanford, their crusade produced little discernible surge in serious student interest in any of the major non-Western literary languages such as Sanskrit, Persian, Hindi, Swahili, Arabic, or Chinese, or in the study of non-Western high cultures more generally. Once again, something was being replaced, if not by nothing, then by a politically correct stew of pallid and largely non-nutritious ingredients, if not actual junk food.

There was another, equally serious problem with the multiculturalist project of this era, one trenchantly analyzed by American Enterprise Institute scholar Dinesh D’Souza. The political ideals that the multiculturalists believed were more highly respected in Third World societies than in the supposedly racist-sexist-homophobic West, D’Souza explains, were in large part Western-derived ideals that had little resonance outside the West. “The movement for curricular expansion,” D’Souza writes, “arose in the aftermath of the civil rights, feminist and homosexual rights struggles of the 1960s and 1970s. For its advocates the purpose of studying other cultures is to affirm them as alternatives to Western mores, to celebrate the new pluralism and diversity.” The basic difficulty here, D’Souza goes on, “is that, by and large, non-Western cultures have no developed tradition of racial equality. Not only do they violate equality in practice, but the very principle is alien to them, regarded by many with suspicion and contempt. Moreover, many of these cultures have deeply ingrained ideas of male superiority....[For instance], the renowned Islamic scholar Ibn Taymiyya advises, ‘When a husband beats his wife for misbehavior, he should not exceed ten lashes.’...Feminism is simply not indigenous to non-Western cultures.”45

In regard to the treatment of homosexuals, the non-Western attitude is even more divergent from attitudes in contemporary America and Europe. “It is perhaps pointless,” D’Souza writes, “even to bring up the issue of non-Western attitudes toward homosexuality or other ‘alternative lifestyles,’ which in various societies are enough to warrant segregation, imprisonment, even capital punishment. In Cuba homosexuals are often thrown in jail and in China, they are sometimes subjected to shock treatment.”46

The late Harvard historian Arthur Schlesinger Jr. made similar points in his influential critique of the multicultural agenda, The Disuniting of America. “No doubt Europe has done terrible things, not least to itself,” Schlesinger writes. “But what culture has not? The sins of the West are no worse than the sins of Asia or of the Middle East or of Africa. There remains, however, a crucial difference between the Western tradition and the others. The crimes of the West have produced their own antidotes. They have provoked great movements to end slavery, to raise the status of women, to abolish torture, to combat racism, to defend freedom of inquiry and expression, to advance personal liberty and human rights.”47

“Whatever the particular crimes of Europe,” Schlesinger went on, “that continent is also the source—the unique source—of those liberating ideas of individual liberty, political democracy, the rule of law, human rights, and cultural freedom that constitute our most precious legacy and to which most of the world today aspires. These are European ideas, not Asian, nor African, nor Middle Eastern ideas, except by adoption. The freedoms of inquiry and of artistic creation, for example, are Western values. Consider the differing reactions to the case of Salman Rushdie: what the West saw as an intolerable attack on individual freedom the Middle East saw as a proper punishment for an evildoer who had violated the mores of his group.”48

Schlesinger continued in this vein:

It was the West, not the non-Western cultures that launched the crusade to abolish slavery—and in doing so encountered mighty resistance, especially in the Islamic world (where Moslems, with fine impartiality, enslaved whites as well as blacks). Those many brave and humane Africans who are struggling these days for decent societies are animated by Western, not by African ideals....Today it is the Western democratic tradition that attracts and empowers people of all continents, creeds, and colors. When the Chinese students cried and died for democracy in Tiananmen Square, they brought with them not representations of Confucius or Buddha but a model of the Statue of Liberty.49

Schlesinger, of course, would not deny that people in other cultures have made great contributions to “the best that has been said, thought, written, and otherwise expressed about the human experience.” But different cultures excel at different things. If one is looking for the kind of defense of individual liberty, political freedom, and freedom from imperialist oppression that 1990s-era proponents of multiculturalism claimed to seek, a good place to start would be a reading list heavily weighted with Western thinkers such as Marsilius of Padua, John Milton, John Locke, Tom Paine, Thomas Jefferson, James Madison, Mary Wollstonecraft, Friedrich Schiller, Johann Gottfried Herder, William Wilberforce, Benjamin Constant, John Stuart Mill, Alexis de Tocqueville, Jacob Burckhardt, Wilhelm von Humboldt, Frederick Douglass, Susan B. Anthony, Lord Acton, Leo Tolstoy, Woodrow Wilson, Eleanor Roosevelt, Martin Luther King Jr., Alexander Solzhenitsyn, Vaclav Havel, and John Paul II. Such a reading list is a far cry from what undergraduates read today, and college-educated adults are much the poorer as a result.

Concluding Thought: A Clear Vision of What Is Worth Knowing

“Most students enter college,” writes Los Angeles Times editor David Savage, “expecting that the university and its leaders have a clear vision of what is worth knowing and what is important in our heritage that all educated persons should know.” “They also have a right,” he continues, “to expect that the university sees itself as more than a catalogue of courses.”50 Savage is clearly correct, and on this score modern universities have been conspicuously failing their undergraduates for many decades.

Is there an easy way out of the dilemma? The answer is probably “no,” as there is undoubtedly a conflict between the demands of a modern research university and the kind of liberal arts education that was the goal of American institutions of higher learning during the long period of the Protestant ascendancy. Yet healthy compromises can surely be struck, and there is no reason to believe that the extreme curriculum incoherence and fragmentation that plague most of our major universities today are an unalterable fate.

The details of “What is to be done?” must be left for another day, but a good place to start thinking about these issues is with the recommendations of the 1945 report of Harvard’s Buck committee, General Education in a Free Society. It is amazing how little has changed in the area of “general education” (the older term for “liberal arts education”) from the situation outlined in this excellent report. The report’s main recommendation for restoring curriculum coherence and cultural continuity to Harvard College’s educational offerings is still valid today. As previously explained, it consists of a required core curriculum focusing on the Great Books and Great Thinkers of the Western past—the thinkers and writers who have exerted the greatest impact on the development of Western philosophy, religion, politics, culture, and social life and thus on individual liberty and political freedom.

Such a core curriculum would leave plenty of room for vocational training, specialization, and electives, as well as for studying the cultures and traditions of peoples outside the West. But it would guarantee that the college-educated in America really will be educated—and not just entertained or narrowly trained—and it holds the promise of alleviating much of the drift and anomie among college students that is so endemic today to the à la carte university. A core curriculum focused on “the best that has been said, thought, written, and otherwise expressed about the human experience” by Western thinkers would not only benefit the students involved enormously but would also have a leavening effect upon modern culture.
