This article was originally published on the Chronicle of Higher Education's Innovations blog.
’Tis the season of paradox. In a widely noted op-ed in The New York Times, Judah Cohen, identified by the Times merely as “director of seasonal forecasting at an atmospheric and environmental research firm,” explained that the frigid temperatures and heavy snowfalls afflicting Europe and much of North America this year are, mirabile dictu, the result of “the overall warming of the atmosphere.” Quick-draw skeptics made the obvious retorts: (1) that advocates of the theory of global warming seem to have constructed a one-way street for interpreting data—no matter what happens in the actual atmosphere of our planet, whether temperatures rise, fall, or remain the same (ditto the level of precipitation; ditto the severity of storms), the theory of anthropogenic global warming (AGW) is vindicated; (2) that the public is growing more and more jaundiced about this theoretical legerdemain; and (3) that a fair amount of the skepticism now focuses on the capacity of climate scientists to be honest judges of the global warming evidence, in view of the enormous amounts of money that flow their way and will continue to flow only if AGW retains its legitimacy.
Judah Cohen might be taken as an exemplar of the latter problem. He is a qualified scientist, having received his Ph.D. from Columbia University in atmospheric sciences in 1994. According to the firm he directs, Dr. Cohen “has since focused on conducting numerical experiments with global climate models and advanced statistical techniques to better understand climate variability and to improve climate prediction.” He has a research affiliate appointment at M.I.T. in civil engineering and has published “over two dozen articles.” There is no gainsaying Dr. Cohen’s credentials. Except that there is the little matter of his livelihood.
He works for Atmospheric and Environmental Research (AER), a company whose whole business depends on maintaining alarm over the state of the climate. AER advises insurance and investment firms, alternative energy companies, and government agencies on how to be “proactive about their risks” arising from “weather-related challenges” including “climate change.” There is nothing wrong with this, of course, but it is the sort of context that would normally summon a little apprehension. We are appropriately on guard when the head of research at a tobacco company tells us that studies of the dangers of smoking are unreliable, or when researchers at an oil company minimize the dangers of offshore drilling. But when advocates of global warming enunciate their views, many people, including many in the academic community, put their sensitivity to conflicts of interest on hold.
This observation might sound like a prelude to my dismissing Dr. Cohen’s explanation of why and how a frigid winter is the result of global warming—but it isn’t. Dr. Cohen’s account, though counterintuitive, might be correct. He essentially argues that the warming of the Arctic in the summer has freed up water that evaporates and then returns as a heavy Siberian snow cover; the snow cover reflects sunlight, cools the Northern Hemisphere, and the jet stream, meandering off course, brings the Siberian air and snow to lower latitudes in North America and Europe.
Dr. Cohen’s story has so many variables that someone who is not an atmospheric specialist—and maybe even someone who is—would be hard put to judge it on its merits. It sounds quite specific. He invokes “exceptionally high mountain ranges, including the Himalayas, the Tien Shan, and the Altai,” and he offers compelling analogies, such as the jet stream dividing like water in a stream around a boulder. But truly, the story sounds more like a string of untestable hypotheses snapped together in sequence like Lego pieces. Perhaps this is the way the world works; perhaps not. I am studying to keep an open mind, but keeping oneself receptive to counterintuitive explanations requires some discipline as the snow mounts up.
On the same page of the Times as Dr. Cohen’s op-ed, columnist Nicholas Kristof calls for cuts in American military spending. At one point in his argument he cites the since-abandoned expensive military bases the U.S. kept in Saudi Arabia after the first Gulf War. He tries to drive the lesson home with a rhetorical question, “Wouldn’t our money have been better spent helping American kids get a college education?” I have nothing to say here one way or the other about military bases or defense spending, but I don’t think Kristof’s invocation of educational spending as the wholesome alternative works anymore—at least to the degree he seems to suppose.
The indubitable virtue of increased public spending on higher education has become another theory, like global warming, that has a divided life. As the general public grows more and more skeptical about it, the people society pays to be skeptical—professors and journalists—by and large continue to see nothing amiss.
Over the last year, numerous observers have been calling attention to an emerging higher education “bubble,” likened to the real estate bubble, in which the public awakens to find that it has been paying way too much for something on the mistaken assumption that the high prices would be covered by an even higher return. Housing prices, however, peaked and then rapidly descended, leaving many people with mortgages higher than the resale values of their homes. As for higher education, it has been clear for a while that many students pay tuition and pile on debt far in excess of what their college degrees are likely to bring them by way of augmented lifetime earnings. The situation has been dramatized by a few extreme cases, such as Kelli Space, the sociology major who graduated from Northeastern University in 2009 with $200,000 in student-loan debt. Recently, my fellow Innovations blogger Richard Vedder has unearthed Department of Labor statistics that are dispositive: 60 percent of the increase in the number of college graduates from 1992 to 2008 ended up working in low-skill jobs, the kind of jobs for which the Bureau of Labor Statistics regards a college degree as irrelevant.
Like global warming, the topic is intrinsically complex, though probably nowhere near as imponderable as the dynamics of heat transfer in the atmosphere. Clearly having a degree from the right college in the right field can translate (on average) into a larger premium in lifetime earnings. For many students, however, a college experience can end in no degree and a substantial debt burden. And many others graduate having learned little, possessed of a credential that carries little weight in the job market and yet still saddled with student loans that will take decades to pay off. These days, in any given week one can find half a dozen articles decrying this situation. (This week, for example, I’d include in the count Neal McCluskey from the Cato Institute, “Hurrah for ‘Draconian’ Education Cuts!”; Hans Bader from the Competitive Enterprise Institute, “Time for Big Cuts in Education Spending”; and Katherine Mangu-Ward writing in Reason, “Easy Money For College Can Really Mess You Up, Man.”)
Higher education’s response? Generally, if the topic is acknowledged at all, it is with scorn for the philistines who would reduce the “value” of a college degree to the job prospects and earnings of graduates. Never mind that higher education has been busy selling itself to the public in precisely those terms for the last fifty years, and that the official position of the Obama administration is that our “national competitiveness” depends on a huge expansion in the number of young people who earn college degrees.
But I’m ready to concede the point. Higher education should be about more than gaining a credential that gives one a leg up in the marketplace. But if we are going to re-focus the debate on the non-utilitarian substance of higher learning—on the transmission of disciplined intellectual inquiry, on developing civilized discernment, on aspiration for genuinely higher knowledge—we had better be prepared to rethink our national preoccupation with mass higher education. Judged by those standards, contemporary American undergraduate education as a whole is a colossal failure.
Which is it? Do we want to run a mass credentialing service that the public increasingly views as an expensive con? Or do we want to engage in rigorous higher education as something that has intrinsic value, but which our current system is ill-suited to provide?
There may be clear-cut answers to these questions, deflected in the winds high over the Tien Shan and Altai Mountains, reflected in the glare of Siberian snowfields, and twisted in the vacillations of the jet stream. But I’m not sure. I do know that when I encounter the offhand assurance of those who simply assume that more and more college degrees at greater and greater public expense are unquestionably a good thing, I get a chill.