Art and Delusion: Unreality in Art School

Ross Neher

I teach painting in a Master of Fine Arts program at Pratt Institute in Brooklyn, New York. Typically, the students have anywhere from $60,000 to $100,000 of student loan debt that by law must be repaid. Because of mismanagement of the college’s finances in the 1970s and 1980s (there is practically no endowment), only a pittance is available for graduate scholarships. Most students have to pay the entire $29,900 yearly tuition.

What do they get for their money? Or, to pose the question in a more general way, what is the value of an MFA degree? The answer is complicated, for it involves among other things student expectations, educational trends, a changing art world, and the state of the U.S. economy.

Each fall semester I ask my students why they have come to Pratt and what they want to do when they graduate. The common answer is to develop as artists and find a commercial gallery to show and sell their work. Some want the MFA degree so they can teach on a college level. I look at these young adults and see hope and ambition. For many, the two years they spend in school will be the best years of their lives. They can experiment with media, don multiple styles, and hone their message—all in an extraordinarily supportive environment that keeps the outside world at bay. They will finish up with a thesis exhibition showcasing their best work. At their openings they will beam with pride and accept the accolades of the faculty. And then, after they’ve earned their diplomas, most will fail to have successful careers as artists.

One obvious problem is that there are just too many artists, whether they hold degrees or not, for society to absorb. Especially in this distressed economic climate, the odds of an artist living off sales or finding a teaching job approach lottery ratios. Art schools have a poor record of providing students with “real world skills”; for most graduates it is sink or swim. The lucky ones will get jobs as bartenders, where they can work a few nights a week and earn enough to support their vocation. Others return to school to learn computer graphics, since web design can be a lucrative sideline. Traditionally for men, construction and carpentry have been fallback positions. Within the art world itself, artists can find employment as gallery assistants, museum guards, and art handlers. However, if art world success remains elusive after a few years of such toil, many grow discouraged and give up. And many of those who give up still must pay off that student debt, which has to be a bitter reminder of a poor degree choice. The remembrance of two halcyon years can hardly compensate for a career gone south.

An MFA degree does not come with a money-back guarantee of success. Certainly artists with an MFA have gotten teaching jobs, while those lacking the degree have been turned down. And the more an applicant is willing to compromise and accept a position in a less desirable part of the country, the more likely he is to get a job. That goes for graduates in any liberal arts field. So, the degree has some value. But to assess an MFA’s value in other ways means asking how it first came to be a defining factor of an artist’s identity. After all, Rembrandt didn’t have an MFA.

Picasso didn’t have an MFA, the abstract expressionist Willem de Kooning didn’t have an MFA, and Jackson Pollock was a high school dropout. At least since the Romantic era, artists have believed that art flows from intuitive or subconscious sources; overly intellectualized art has long been deemed contrived and inauthentic. Indeed, the abstract expressionist Robert Motherwell, who held a BA in philosophy from Stanford and taught philosophy at the University of Oregon, was regarded by his peers with suspicion, if not downright contempt.1 And yet, by the mid-1960s, the MFA was considered an essential component of an American artist’s résumé.

For much of Western history art was a trade, like that of cobbler or cooper. A boy learned a craft in a workshop, first as an apprentice, then as a journeyman, until he became a master and thus was eligible to join the guild. The pair of shoes that didn’t fall apart, the barrel that wouldn’t leak—those were the “masterpieces” that demonstrated the professional competence necessary to become a reputable artisan. The crafts of painting and stone carving were no different.

The guild system eventually gave way to the academy, where the techniques of painting and sculpture, refined during the High Renaissance, were codified and taught in a rigorous, some would say rigid, manner. The most influential of these academies was the French Académie Royale de Peinture et de Sculpture, founded in 1648. The first major American art academy, the Pennsylvania Academy of the Fine Arts, was founded in Philadelphia in 1805.

Exactly one hundred years later came the first attempt to make training for professional artists a part of the university. On April 23, 1905, Charles de Kay, literary and art critic for the New York Times, reported that Columbia University and the National Academy of Design were in talks that would have allowed the University to take over the Academy.2 This development was of great concern to de Kay, who worried that “having already separated the disciple from the master by massing numbers in a school where they receive occasional visits from instructors, it is proposed now to go further and practically place the students under what is really lay guidance.”3 De Kay reasoned that there was more to be lost than gained under such an arrangement. Art students would find themselves isolated within the student body by virtue of their temperament and common interests, and would chafe under restrictive university rules. Moreover, what began badly would only get worse:

The two systems under the same head really seem incompatible. One must look to see the art side of a university dwindle to nothing.4

I cannot imagine more prophetic words.

De Kay exposed what was very likely the first instance of academe’s encroachment into academy territory. In this case, the “merger” talks failed and the National Academy of Design remained, and continues to remain, proudly independent. But though the stand-alone art academy continued to function as the training ground for a generation of exceptional illustrators up through World War II (Norman Rockwell, a high school dropout, was a product of this system), its days were numbered. The decisive shift in American art education was the passage of the G.I. Bill of Rights in 1944, which provided tuition to homecoming veterans to attend an accredited institution of higher learning. That same year the National Association of Schools of Art and Design (NASAD) was formed to accredit art schools and provide standards for bachelor’s and master’s degree programs. To ensure their own survival, independent art schools formed alliances with nearby colleges and universities (e.g., the Boston Museum School with Tufts, the Pennsylvania Academy of the Fine Arts with the University of Pennsylvania) and became degree-granting institutions. Academe and the academy married, but it was a shotgun wedding.

Other events contributed to this union. The space race that followed the Soviets’ successful launch of the Sputnik satellite in 1957 brought massive government funding for scientific research in its wake. Much of this money went to university chemistry and physics departments, but the “soft” sciences benefited too. Because the Soviet launch caught Americans by surprise, a chastened nation acted to close a presumed education gap with the USSR. All facets of the educational system were perceived to need improvement. Americans became education crazy. Depression-era parents, deprived of the opportunity themselves, insisted their children go to college. The government abetted this desire by providing very low-interest National Defense loans. In 1964 the first of the baby boom generation entered college. Further swelling the ranks of the college population were draft-eligible males who, attempting to escape participating in an escalating war in Vietnam, sought refuge in college deferments.

While the art academy was now solidly ensconced in academe, developments outside of the college environment helped to undermine traditional methods of teaching drawing and painting. Photography, for example, seemed to render the oil portrait and the painted documentation of historic events obsolete. Of course, there continued to be a demand for the presidential and society oil portrait; John Singer Sargent’s dazzling portraits revealed just how wanting their mechanical equivalents could be. The golden age of American illustration of the 1930s and 1940s also pointed to photography’s technological deficiencies. Book and magazine covers and print advertising remained the illustrator’s bailiwick. Coca-Cola’s Santa Claus ad campaign, which began in 1931, demonstrated how vividly a gifted artist could re-imagine a fictional character. However, the technology of photography continued to advance steadily, and by the 1960s there was little call for illustration apart from detective novel covers and some fashion ads.

Moreover, abstract art had been at the forefront of the American avant-garde since the late 1940s. In the context of such new art movements as hard-edged abstraction or minimalism, the mastery of traditional drawing and painting skills seemed increasingly irrelevant. On the graduate level it seemed a waste of resources to teach representational drawing and painting when the only students benefiting were a few doughty portrait and landscape painters. The MFA, as educators never fail to point out, is a terminal degree and represents as much a mark of professionalism as the prospective guild member’s properly made shoes and barrels. But as craft became increasingly unimportant to the contemporary artist, something else was needed to take its place. That something was theory.

A representational still life needs no theory to support it. A Jackson Pollock painting needs a goodly amount of theory, if only to explain why it should be considered art in the first place. To know why a Pollock should be considered art, and important art at that, one needed a familiarity with Clement Greenberg’s formalist theories, which owed their force to a Marxist-Hegelian notion of historical inevitability. Only a certain kind of art could be considered significant at a specific point in history, and at the twentieth century’s midpoint it was (according to Greenberg) abstract art of Jackson Pollock’s kind.5

But by the mid-1960s Greenbergian formalism came under attack, not because it was too theoretical, but because it wasn’t theoretical enough. “Conceptual” or “Idea” art captured the imagination of many because of its apparent simplicity and purity. Conceptual artists maintained that art should exist only as an idea, free of the constraint of a physical object. (One such piece, by Joseph Kosuth, consisted of nothing more than a dictionary definition of “idea” displayed on a gallery wall.) The conceptual art movement was unabashedly leftist. Because it existed only as an idea, conceptual art (theoretically) could not be bought or sold and therefore thwarted the profit mongering of capitalists. Art that conformed to Western esthetics was excoriated as elitist and bourgeois. But the real appeal of conceptual art to a coddled, self-indulgent generation was the absence of discipline. The minimalist sculptor Donald Judd said, “If someone calls it art, it is art.”6 Yes, and I am an artist if I say I am. What could be easier?

The very open-endedness of conceptual art meant that after its obligatory Marxist phase it quickly succumbed to feminist and gay permutations. Graduate art departments embraced these ideologies and soon incorporated them into their programs. (At Pratt, for example, a course titled “Painting” is not a hands-on studio course that addresses issues of perception and technique, but rather a weekly three-hour “talk” seminar on any subject of the professor’s choosing.) Today no functional difference exists between graduate departments of art and departments of English, sociology, psychology, and anthropology. What we have instead is one big ideological stew:

These [mandatory] courses provide an in-depth look at contemporary media and art discourse historically contextualized by a wide range of theoretical approaches, including: aesthetic theories, new media theories, structuralism, semiotics, phenomenology, Marxism, gender and queer studies, post-structuralism, deconstruction, issues of authorship, postcolonial theory, multiculturalism, and theories of social and environmental justice.7

This description lists “Areas of Study” for the MFA program on the San Francisco Art Institute website. Omit the words “art,” “aesthetic,” and (possibly) “media,” and the text could serve as a template for liberal arts programs in practically every college in America.

This account of how art schools moved from teaching techniques of individual mediums to the ubiquitous “interdisciplinary” approach is overly compressed, but it takes us to today. Those who get teaching jobs at the higher levels will perpetuate the indoctrination of leftist ideology endemic to academe and have as their mission the rooting out of every last vestige of Western culture. Those are the few. For the many, the lure of the MFA is the lure of fame and fortune.

The Yale School of Art cemented its reputation as the country’s premier graduate art school with its class of 1964, which included Chuck Close, Richard Serra, and Nancy Graves (the minimalist painter Brice Marden graduated in 1963 and is considered a part of this group). After obtaining their MFAs, these artists achieved almost instantaneous art world success and today are considered solidly blue chip. Built up in the 1950s by the great artist and pedagogue Josef Albers, Yale’s art program had, by the mid-1960s, established an old-boy/girl network with leading New York galleries, critics, and collectors that still exists. (New Haven’s proximity to New York City was a distinct advantage.) The correlation of the MFA and art world success was thus made manifest.

After the economic doldrums of the 1970s the art market exploded during the Reagan era. Art “stars” such as Julian Schnabel, David Salle, and Eric Fischl seemed to print their own money. “Artist” became a glamorous profession and one was as likely to see successful artists appearing in Vogue as in Artforum.

But that was nothing compared to the last art boom, which ended with the current recession. In 2006 the Jack Tilton Gallery in Manhattan mounted an exhibition called “School Days” that consisted of work by nineteen art students still in graduate school. Within two days 70 percent of the show sold out, with prices ranging from $1,200 to $16,000. Not one “artist” was over twenty-six years old.8 When students in MFA programs heard about this show and about exhibitions in Chelsea selling out with works going for princely sums, a $60,000 tuition bill suddenly seemed like small beer. One or two solo exhibitions would easily pay back the student loans, and a future of easy riches awaited. How difficult could that be?

Well, given the numbers, it was difficult in 2006 and it is nigh impossible today. According to U.S. Census data, there were 2,196,000 “artists” in 2001.9 (The quotation marks come from Princeton University’s Center for Arts and Cultural Policy Studies, but we know what they mean.) This staggering figure includes architects, authors, actors, dancers, musicians, etc., as well as visual artists. If we total up the number of painters, sculptors, craft artists, and photographers in the survey, the actual figure comes to 414,000. But these statistics are almost a decade old.

A look at statistics released in 2008 by the U.S. Department of Labor’s Bureau of Labor Statistics indicates that the number of “fine artists” (which here includes painters, sculptors, and illustrators but excludes craft artists and photographers) is only 30,000—a figure I regard as far too small.10 Indeed, I estimate that there are approximately 40,000 visual artists in New York alone.11 For an idea of the scope of the problem, in October 2007, at the height of the art boom, there were 360 galleries in the neighborhood of West Chelsea, considered the center of the New York (and therefore the international) art market.12 One needn’t be a math whiz to grasp the daunting odds. For those artists not wealthy or socially connected, it is delusional to think that art world success will come in anything but the most limited terms. For a graduate armed with nothing but “Queer Theory” and no definite skills, the chances of “living off sales,” an oft-stated goal of the graduate art student, are close to zero. There is simply not a large enough collector base to support all but a handful of artists.

Moreover, the extravagant salaries and bonuses that fueled the last art boom are mostly things of the past (Goldman Sachs, as always, is the exception). Wall Street has been decimated, the easy money is gone, and galleries are closing at a rapid rate. And while we don’t yet know the ultimate size of the federal deficit, we do know the heaviest share of the tax burden will fall upon those most likely to collect art, donate to museums, and back art galleries, i.e., the wealthy (those making over $250,000 a year?), who will hardly regard tax hikes as an incentive to buy art.

* * *

I just taught my first painting class of the fall 2009 semester. I had twelve students. They were fresh-faced, wide-eyed, and optimistic. I explained to them the present gloomy situation and the difficulties they will face. They would hear none of it. But that’s fine. They are about to have the two best years of their lives.
