The University of Stonehenge

Oct 11, 2011

Peter Wood


Part 1

How old is the university? Old by most reckoning. The University of Bologna, the oldest official one, was founded in 1088. By a looser definition, it is even older. Plato founded his Academy in 387 BC. Of course, there were teachers with their own schools of advanced thought in ancient Greece still earlier. Pythagoras established his school at Croton about 530 BC. Before that?

We don’t know, but presumably ancient scribes learned their art in some systematic form; Babylonians must have had some organized means of passing along their relatively sophisticated observations of the heavens; and the builders of Stonehenge clearly possessed the sort of knowledge of solstices and other calendrical phenomena that implied rigorous training of successive generations of adepts.

The University of Stonehenge, however, went the way of Antioch College—but unlike Antioch it seems unlikely to resurrect itself.

The question occurred to me while looking at the row of punctured seashells on the front page of The New York Times “Arts” section. No, not some postmodernist conceptual show in a Chelsea gallery. These shells were punched out by some Cro Magnon jeweler living in Greece 33,000 to 35,000 years ago, to be “strung on a cord,” and worn to declare beauty, prestige, or power.

To cover the distance between those ancient beads and the modern university will require several steps. This is the first of three posts to close the necklace.

Bead Work

The Times story, “History That’s Written in Beads as well as in Words,” by Patricia Cohen, is really about “an unusual coalition of scholars” that is “trying to stage an intellectual coup.” The members of this coalition are trying to swing the pendulum back from the micro-history that currently dominates the academy to a much broader and more synthetic approach—one that recaptures the realm of human activities before the invention of writing.

It is a surprising story to read in the Times, which generally shines its sun on the fatuities of postmodern history. Cohen, however, makes some worrying observations, such as:

Three out of four historians, for example, specialize in the post-industrial era or the 20th century, the American Historical Association reports.

This has prompted a hardy band of dissenters in the discipline of history to start looking at “the long march of human existence” that is “being ignored.”

The immediate occasion of Cohen’s article is the publication of Deep History: The Architecture of Past and Present, edited by Andrew Shryock and Daniel Lord Smail (University of California Press). I hustled out and bought Deep History, and may have something to say about it once I’ve read it. But for the moment, I am taking my cue from Cohen and those seashells.

They come into the story like this. We of course lack written records for the Paleolithic, but that doesn’t mean we can’t reconstruct some important historical developments. Humans had been knocking out beads here and there for millennia before the shell necklaces in question were made. But at that particular point, perforated shells took off. Cohen quotes the authors:

Relative to population size, after all, shell beads were perhaps being produced in the Upper Paleolithic at the rate iPhones are being manufactured today.

The profusion suggests a number of things, including wider trading networks, greater exchange, and virtuosity in handicrafts, but it also points to “political alliances” and economic activity. Beads “allowed ancient people to transform food surpluses, created by shifts in production, into commerce and political power.” They were, in that sense, precursors of money, “credit cards, bank notes, [and] gold coins.”

Part 2

In Part 1 of this series, I introduced a new academic field, “deep history,” that is emerging in history departments. A New York Times article this week drew attention to “deep history” as grounded in a critique of contemporary history departments as overly focused on the 20th century. These critics assert both the need for historians to pay attention to “the long march of human existence,” and the possibility of making valid historical inferences from data such as 35,000-year-old perforated shells found in Greece, which might even be understood as a step toward the development of “credit cards, bank notes, [and] gold coins.”

Up to the Minute

Well, maybe. The concept of “deep history” is a welcome one, but the practice may prove a little tricky. One important reason we need “deep history” is that our academic detour into five-minutes-ago history leaves undergraduate students in the dark about matters of real significance. For example:

  • Earlier this year, my organization, the National Association of Scholars, released a study, The Vanishing West, 1964-2010, which traced the disappearance of Western Civilization survey courses at elite private and top public colleges and universities over the last half century. A few universities have made feints in the direction of “world history” as a substitute overview, but even world history is seldom required. American higher education has essentially kicked out the one course that attempted a broad narrative overview and replaced it with…nothing.
  • We hear with great lament that the idea of evolution is under assault because the advocates of creationism and intelligent design have pushed so hard for their agenda, but in truth, colleges and universities that are entirely secular in spirit pay scant attention to teaching the evolutionary macro-narrative of human development.
  • A few years ago, the Intercollegiate Studies Institute administered a 60-question multiple-choice exam on civic literacy to 14,000 freshmen and graduating seniors. It produced the astonishing result that at several top universities (Princeton, Yale, Cornell, Duke, and Berkeley) seniors scored lower than freshmen. The strong suggestion was that a four-year liberal arts education had resulted in a net loss of historical knowledge. The study, “Failing Our Students, Failing America” (2007), was conducted by researchers at the University of Connecticut, and combined multiple-choice factual questions (e.g., “What battle brought the Revolutionary War to an end?”) with simple interpretive ones (“The dominant theme in the Lincoln-Douglas debates was: _________”).

College students today have abundant access to courses that explore race, gender, and class in all their permutations, and a superfluity of courses on niche historical topics, but relatively few courses that attempt anything in the way of integrating historical knowledge over the wider arcs—human development, civilizations, or the American experiment.

Ann Arbor and Reed

To offer such a criticism is a bit perilous. Thousands of history professors offer tens of thousands of history courses in American colleges and universities. Many members of the history faculty are very good at what they do, and history curricula are inevitably wide ranging. Any loose generalization can be met with dozens of examples of exceptions. But to look at the undergraduate history offerings at almost any college or university is to encounter a mix that includes numerous niche courses. Here are two examples, one from a major research university, the other from an elite liberal arts college.

The University of Michigan has a robust undergraduate history major, but it includes courses such as History 331, “Poland in the 20th and 21st Centuries,” and History 397, “Colloquium, Occult Internationalism: The Global Spread of Secret Knowledge.” These may be fine courses in their own right, but an undergraduate student has time for only so many history courses. Are these the best choices to fill out a student’s knowledge of history?

At Reed College, a student can take History 311, “Food in American History: Burgers, Fries, and Apple Pie”; History 379, “The 50s in America”; and History 321, “Support the Qing, Destroy the Foreign: Interpreting and Remembering the Boxer Uprising.” Interesting topics these, but are these courses grace notes that add a little flourish to a Reed student’s otherwise systematic exploration of history? A student seeking a context at Reed for the “burgers and fries” course can take History 230, “Empire and Liberty: The United States in the Nineteenth Century”; History 270, “Nature, Culture, and Society in American History”; History 276, “Culture and Society in Twentieth Century America”; History 278, “U.S. Politics and Culture, 1929-1979”; History 302, “Origins of the Second World War”; History 303, “The Cold War”; and quite a few other courses. These latter courses add up to an intellectually serious undergraduate history curriculum, and Reed’s “Food in American History” course looks a good deal less frivolous in that context.

But there is at least one other way to consider that particular collection of courses: what does it leave out? It would take me too far down a different road to attempt that analysis here, but I’ll offer it as a homework assignment to the interested reader; Reed’s history curriculum is posted on the college’s website.

As for demonstrating the nonce character of history offerings around the country, that’s as easy as canoeing downstream, as in Guilford College’s History 324, “American Rivers,” which “uses American rivers and their watersheds as focal points to study the various ways in which people have interacted with their environments and each other,” and invites students to “select a river of their choice on which they conduct a semester-long research project.”

Or History 3530 at Auburn, “Science Fiction as Intellectual History.”

Or History 2452 at Cornell, “Dress, Cloth and Identity in Africa and the Diaspora.”

Second homework assignment: Find a college at which the undergraduate course offerings do not include at least one such divertissement.

Part 3

In Part 1 of this series, I introduced a new academic field, “deep history,” that is grounded in a critique of contemporary history departments as overly focused on the 20th century. In Part 2, I expanded on the reasons why this critique is needed. Not only do college and university history departments devote disproportionate attention in the undergraduate curriculum to the very recent past, they also scant crucial questions about the development of culture and civilization, and often divert students into overly specialized and sometimes trivial topics. In this concluding part, I acknowledge the difficulty of pursuing the “deep history” project beyond some initial insights, but urge the profound importance of one of those insights about the transmission of complex knowledge from generation to generation.

Good as Gold

The new “deep history” is a promising corrective to what many see as higher education’s focus on the minute, the recent, and the ideological.

It does, however, face numerous hurdles. Historians who are attracted to the approach will have to defer a great deal to biologists, archaeologists, anthropologists, and others who specialize in teasing out the clues to what happened to humanity before people started marking up clay tablets.

A few years ago the Cambridge archaeologist Colin Renfrew published a short book calling attention to the ways that the material remains of the past can illuminate emergent qualities of the human mind. In the absence of a system of writing, the mind does not leave an easily deciphered record of itself. But Renfrew offers some intriguing examples in Prehistory: The Making of the Human Mind (2007) of where and when certain intellectual leaps occurred. Gold, for instance, is common enough on the earth’s surface that our ancient ancestors must have come across it fairly often. They apparently kicked it aside. The earliest instance we know of in which gold came to be considered valuable dates to about 6,500 years ago: a Neolithic cemetery in Varna, Bulgaria, has yielded gold beads and ornaments in graves that contain other prestige items such as long blades of flint and copper swords and daggers.

Renfrew also draws attention to stone cubes found in the Indus Valley that are exact multiples of one another in weight: undeniable evidence that the Indus Valley civilization had created a standard system for measuring mass. But Renfrew adds, “in a sense these stone cubes serving as weights are symbolic of themselves: weight as a symbol of weight.” On the whole, he emphasizes the way in which manipulations of the physical world, social institutions, and intellectual advances cohere. The architects of Stonehenge weren’t so much celebrating an already achieved social order as bringing one into existence.

Renfrew’s approach is probably compatible with “deep history,” though he is mentioned only once in passing in the new book. Both are about restoring the search for the larger narratives in human development, and both treat human consciousness as forged by social experience. But Renfrew’s earlier excursion in this line was also a reminder of the thinness of such history. It is exciting to recognize that gold wasn’t always as good as gold and that weights and measures were once uncountable. But, as William Cronon, a history professor at the University of Wisconsin and a critic of “deep history,” told Patricia Cohen, “Lee Harvey Oswald’s assassination of President John F. Kennedy depended on the discovery of iron, but so what?”

Gathering, Keeping, and Teaching

A “So what?” question hangs over everything in the postmodern university. “Burgers, Fries, and Apple Pie” depend on—respectively—the Neolithic domestication of cattle and grain; the Andean domestication of the potato; and the domestication of apple trees in Kazakhstan. That quintessential American meal was gathered from the ends of the Earth.

Which of course is true of the university too. While I am much in favor of students studying Western civilization as a core part of the curriculum, I am in favor too of students acquiring the even greater breadth of understanding of the cultural predicates that make any civilization possible. The most important of these predicates is our species-wide capacity to acquire complex knowledge and transmit it from generation to generation. The university itself is just one of many ways a society can organize that transmission.

The success of such an enterprise is never guaranteed. If we neglect the larger purpose too long or too often, it goes the way of Stonehenge. Picturesque, for sure, and a reminder of great labors once thought worthwhile, but no longer serving any real purpose.

I venture the guess that too few professors of history see themselves as shouldering a special responsibility in this area and too few colleges and universities expect it of them. In that sense, I welcome the rise of “deep history.” It won’t solve all of the problems in history instruction—the “vanishing” West, the scientific a-literacy, the evaporation of even simple historical detail from the minds of students after a four-year education at an elite college, the profusion of overly specialized undergraduate courses, the obsessive presentism, the overloading of race-class-gender themes, and the frequent descent into triviality—but it is a large step in the right direction.

This article originally appeared in a three-part series at the Chronicle of Higher Education's Innovations blog on September 28, September 29, and October 3.
