College Is All About Curiosity. And That Requires Free Speech.
"True learning can only happen on campuses where academic freedom is paramount — within and outside the classroom.
I have served happily as a professor at Yale for most of my adult life, but in my four-plus decades at the lectern, I have never seen campuses roiled as they’re roiling today. On the one hand are gleeful activists on the right, taking victory laps over the tragic tumble from grace of Harvard’s president, Claudine Gay. On the other is a campus left that has spent years crafting byzantine and vague rules on hate speech that it suddenly finds turned back on its allies. For those of us who love the academy, these are unhappy times.
The controversy began with criticisms of some universities, Harvard included, for soft-pedaling their responses to the horrific Oct. 7 Hamas attack on Israel, and for then ignoring the overheated rhetoric of many pro-Palestinian protesters on campus. It has since spiraled into a full-bore battle in the never-ending culture wars.
There’s something sad but deeply American about the way that the current crisis stems not from the terror attacks but from a subsequent congressional hearing at which the presidents of Harvard, the University of Pennsylvania and the Massachusetts Institute of Technology gave such cautious responses that it was hard to understand their positions. It was all very embarrassing; and, in its way, very McCarthyist.
Still, some good may yet come of the debacle. I have in mind not, as the left might think, a fresh rallying of the angry troops; nor, as the right might think, an eager readiness for the next battle. Rather, the controversy provides us with an opportunity to engage in a serious debate about what higher education is for.
The Oct. 7 attack was hardly an auspicious moment to unfurl campus demands that the world pay attention to the context underlying the vicious assault. Measured by casualties, the Hamas attack was the third-deadliest terror incident in the half century for which we have data; measured in per capita terms, it was by far the worst, with more than 1 in every 10,000 Israelis killed. I have a fair degree of sympathy — quite a lot, actually — for many aspects of decolonial theory. I have taught, for example, the works of Frantz Fanon and Talal Asad, both of whom seek, in different ways, to offer an explanation for anti-Western violence that most observers find inexplicable. I have no sympathy whatsoever for the intentional targeting of children and the weaponization of sexual violence. Drawing a distinction between civilian and military targets might benefit the more powerful side, but the distinction is nonetheless correct.
In the wake of so traumatizing and polarizing an episode, to say nothing of its aftermath, students of different ideological persuasions will nevertheless behave like the rest of us, selecting facts from sources that share their predilections. This behavior is intellectually inside-out. A goal of higher education must be to train our students (and often ourselves) away from re-enacting endlessly the famous 1954 experiment by Albert H. Hastorf and Hadley Cantril, in which students at Dartmouth and Princeton watched a film of a penalty-ridden football game between the two schools. When questioned about what they witnessed, each side’s partisans saw mainly the fouls committed by their opponents.
In short, they found only what they were looking for.
The university classroom is the place to press students to look for more.
The classroom is said to be an outdated technology: A teacher stands up front explaining a subject as students take notes. Future-minded critics insist that the model is near its inevitable end. The Covid-19 shutdowns accelerated the drive to develop better tools for asynchronous learning. The rise of generative A.I., together with the mutually parasitic relationship between young people and their screens, suggests that the old way is nearly done.
Here I am moved to argue for the defense. These dire predictions might fall into line with what’s happening on campus, but what universities are becoming bears little relation to what the classroom should be. The difficulty stems, as my Yale Law School colleague Anthony Kronman has accurately noted, from the yielding of universities to the “vocational ideal,” the notion that what’s most important in life is work (together with the wallet), and concomitantly that the purpose of higher education is to inculcate the skills needed to excel in the labor market. In short, college as a path to a better job and a higher income.
If that’s how the mavens of higher education see their institutions, they will deservedly lose the battle over how students are taught. For if the purpose of the university classroom is simply to impart information, digital agents will soon perform the task better than experienced teachers, and colleges will find themselves unnecessary, merely duplicative of what students can get at home, cheaply and efficiently.
But this stumble toward redundancy rests upon a fatal misunderstanding of why the classroom exists. Teaching a subject is important; it is also in a sense incidental. The classroom is, first and foremost, a place to train young minds toward a yearning for knowledge and a taste for argument — to be intellectually curious — even if what they wind up discovering challenges their most cherished convictions. If the behavioral economist George Loewenstein is right that curiosity is a result of an “information gap” — a desire to know more than we do — then the most vital tasks of higher education are to help students realize that the gap always exists and to stoke their desire to bridge it.
Impediments to free speech are impediments to free thought and can only interfere with that search. That’s why academic freedom is so precious.
The lineage of academic freedom can be traced back at least to the 15th century, but historians tell us that the modern conception comes largely from the German universities of the 19th century. As the physicist and philosopher Hermann von Helmholtz put the point in 1869, scholars must be “more fearless of the consequences of the whole truth than any other people.” In a world where so many “other people” are on the lookout for wrongthink, the ability to pursue truth becomes more precious still.
A 1938 essay in Annals of the American Academy of Political and Social Science described academic freedom as the professor’s “special need of elbowroom in his work”; with apologies to Virginia Woolf, we might call it the need for elbowroom of one’s own. The image is one of adequate space to do your work, secure in the knowledge that nobody is looking over your shoulder to enforce ideological conformity. For the university to do its job of inculcating curiosity, academic freedom should be a near-absolute, within the classroom as well as without. College administrators who believe otherwise shouldn’t be college administrators.
In her fine book “Curiosity: A Cultural History of Early Modern Inquiry,” Barbara M. Benedict shows how this quality that we nowadays consider a virtue was in the past often regarded as a dangerous disruption of the social order. The view on campus today is often much the same, with scholarly work guided along particular ideological paths, and with those who transgress often marked down as unpersons. What the essayist and literary critic George Steiner derided half a century ago as a “nostalgia for the absolute” is these days everyday academic life.
What made the congressional hearing so sad was not merely the accusatory quality of the committee’s questions, or even the evasive quality of the presidents’ answers. It was that the presidents were being asked to interpret their own rules on campus speech — and couldn’t.
They’re not alone. Existing campus speech rules have led to all sorts of horror stories. Many are true. Because the regulations tend to be standardless — often, deciding what’s hateful based on the response of the listener, a so-called “heckler’s veto” — they give no fair warning of what’s forbidden, leading to such absurdities as stopping a student from passing out copies of the Constitution on Constitution Day; or investigating a professor for the sin of stopping to watch a “Back the Blue” rally; or rebuking an untenured lecturer who in a discussion about race showed a documentary that included graphic images of lynching, and read aloud from the Rev. Dr. Martin Luther King Jr.’s “Letter from Birmingham Jail.” Evidently the film, like the letter, included what we’re now supposed to call the N-word. (Full disclosure: I’ve used the word often in my books — fiction and nonfiction alike — and, seemingly only yesterday, I used it in the pages of The Times.)
But even were the rules crystal clear, they’d have both students and faculty looking over their shoulders, wondering which of their ideas might bring forth not disagreement — the mother’s milk of academic life — but condemnation from their fellows and, most dreaded of all, investigation. The inculcation of fear as part of daily work on campus is very McCarthyist; more McCarthyist, even, than hauling college presidents before Congress to try to force them to place even more speech off limits. Because having to look over your shoulder is something you contend with every day.
Many investigations and rules are justified on the grounds of ensuring that students feel safe. But I’m not at all sure the college classroom should be a place of comfort. It should be a place, rather, where students regularly face the challenge of difficult questions, as their professors help prepare them for the life of the mind. The more successful the teaching, the more students will carry their taste for the give-and-take of argument out of the classroom and into the larger world.
This is why those who insist that pro-Palestinian protesters be punished for their views have matters inside-out: What campuses need is less regulation of speech, not more.
This year marks the 60th anniversary of the “free speech” movement that began among students at the University of California, Berkeley, and spread swiftly across the country. The anniversary is a reminder of a sort that students, too, are entitled to academic freedom — and I don’t just mean the freedom to protest, cherished as that right may be. What’s far more important is the freedom of students to grow into distinct thinking individuals. The student must be, in the phrasing of the German philosopher Friedrich Paulsen, “free to devote himself wholly to his task of forming himself into an independent personality.”
I agree with the philosopher Seana Valentine Shiffrin that when we search for the justifications for free speech, we tend to overlook its value in crafting our own identities, the way that a self can try on ideas like clothes, to discover which fit best. Sometimes the ideas will be beautiful; sometimes they’ll be ugly; sometimes they’ll look better on one person than another.
This process of testing ideas should be encouraged, particularly among the young. But it carries risks, not least because of what we might call influencers, who wind up dictating which ideas it’s fashionable to wear and which should be tossed out. When large majorities of college students report pressure to self-censor, this is what they’re talking about. Surveys suggest that the principal reason students keep controversial ideas to themselves is to avoid the disdain not of their professors but of their peers.
That is unfortunate, not least because it tells us how badly the educational process has failed.
The strength of my commitment to free inquiry and free speech might be traceable in part to an adolescence spent in mostly white schools, full of students and teachers alike who sought to tell me what I should think and say. I didn’t like it then; I don’t like it now. My father used to say that from the time I was a toddler, all I seemed to do was ask questions. Yet a chunk of my formative years was spent among peers who thought answers were obvious and questions were in poor taste. As it happened, the views they pressed upon me were mostly conservative; but I didn’t fight them off so that I might later subject myself to the same condescending nonsense from progressives.
I was probably attracted to academic life because I believed I had found in it the antithesis of that attitude. But then I was fortunate enough to come along at a time when campuses, still embarrassed by their feckless surrender to the McCarthyist right in the 1950s, had developed a broad and rich tolerance for uncomfortable speech.
My undergraduate education at Stanford in the 1970s was full of serious argument over controversial propositions. Little was out of bounds. In my history courses, we eagerly debated such subjects as whether slavery was more efficient than wage labor, or whether the influence of Christian missionaries on Asia and Africa and Latin America had been a net negative or net positive. When the great Carl Degler solemnly told a lecture class that slavery in Brazil had been harsher than slavery in the United States, nobody got mad, nobody circulated an outraged petition; instead, a group of Black students, myself among them, went to the lectern afterward to question, argue and learn.
Few professors tried to foist their own views on students. I thoroughly enjoyed a philosophy-of-religion seminar where the professor, an adamant atheist, took obvious glee in slapping down student arguments against the existence of God; and a political science course in which the professor was unapologetic in his view that Brown v. Board of Education was wrongly decided. Nobody boycotted him; in fact, his classes were hugely popular, even though his use of the Socratic method, for most of us our first exposure, had us trembling. Yet arguing with him over fine points of constitutional law not only improved my ability to defend my positions; it also helped establish my love of argument, of studying the evidence and listening with care.
Perhaps the high-water mark of my tiny corner of the Golden Age was my brief encounter with William Shockley, a brilliant engineer who shared the Nobel Prize in Physics as an inventor of the transistor, a device whose descendants we use every time we start our cars, fire up our laptops or pick up our phones. By the time he crossed my path, Shockley had become notorious for his avowed certainty that racial differences in I.Q. were driven principally by genetics — and for his proposal that we could increase national intelligence by paying those with low I.Q. scores not to reproduce.
Student protesters argued that Shockley shouldn’t be allowed to teach; on more than one occasion, they disrupted his classes. But I didn’t want to see him punished. I wanted to see him refuted.
As it happened, I worked at The Stanford Daily, which sponsored regular lunches at which a faculty member would be peppered with questions. During my sophomore year, The Daily invited Shockley, who much to our surprise agreed to participate. I spent weeks before his appearance haunting the campus libraries, reading every article I could find on the relationship among race, genetics and I.Q. — and learning a lot along the way about a subject into which I otherwise might never have delved. (Again, curiosity had its innings.)
On the appointed day, Shockley showed up, bringing along a tape recorder. He gave his talk (which I didn’t find impressive) and then answered questions in his polite but firm way. After all these decades I don’t remember what I asked, except that it was long and complicated and well rehearsed, and that it was related to potential weaknesses in a particular study he kept citing; what I remember instead is the way his eyes narrowed as he listened. “That’s rather clever,” he said when I was done. I like to think that I put him off his game a bit; but, in truth, I doubt it.
That same year Shockley appeared on a campus stage, debating the issue of I.Q. and inheritance with Luigi Luca Cavalli-Sforza, one of the world’s great geneticists. Cavalli-Sforza made mincemeat of Shockley’s arguments — which is the way that academic disagreements should be decided. It’s a tragedy that subsequent debates at Harvard (1973) and Yale (1974) were canceled because of student protests. The more exposure Shockley got, the weaker his claims appeared.
This, I thought then and think now — this is how one lives the life of the mind! No, not everyone on campus need see things this way; but no one should interfere with those who do.
What should happen to those who do interfere? Yale suspended a dozen students from among those who, in the words of the physicist’s biographer, “jeered and booed for more than an hour until Shockley finally left the stage.” President Kingman Brewster Jr. accused them publicly of “storm trooper tactics.” Well, maybe; or maybe they were passionate about their cause and, in their passion, went too far. I quite agree that schools should punish those who disrupt the educational process, no matter how noble-sounding their reasons. Interrupting classes, making noise so that others can’t hear speakers, blocking access, taking over buildings — all of these are less acts of dissent than assertions of raw power.
The commitment to curiosity includes a commitment to understanding why those whose views are objectionable think as they do. The sharp line between propounding a position and trying to suppress dissenting voices is one that the campus should always honor.
I find something admirable in protesters who willingly suffer punishment for their actions rather than disrupting the academic enterprise and expecting (even demanding) no consequences. A student who takes an ugly position should be tolerated. A student who tries to prevent someone else’s ugly position from being heard should be suspended.
Stanford could have taken action against Shockley on other grounds, entirely consistent with academic freedom. He was accused of telling a Nigerian graduate student in a quantum-mechanics seminar that he did not belong in the class because of his race and should consider merely auditing the course. That act of singling out an individual illustrates how poorly the famous physicist understood statistics; it also constitutes a form of discrimination that a university can regulate without violating academic freedom, for Shockley was evaluating not the abilities of the student as an individual but the abilities of the student as representative of a class. But in the end Stanford took no action. (The student in question subsequently earned a physics Ph.D.)
If telling students and faculty what they must not say is bad, telling them what they must say is often worse. The success of the university, Paulsen wrote, rests upon the notion “that truth is the sole aim and not the proof of officially prescribed and quasi-officially desired or at least permitted views.” During the second Red Scare, which began after World War II, the most significant threat to this view was the loyalty oath. Faculty members across the country were asked to swear allegiance to the United States and, often, to affirm that they were not and had never been Communists. Many who refused lost their jobs — including at the nation’s wealthiest and most powerful institutions.
Nowadays, I worry that we’re repeating the same mischief, with mandated (or strongly encouraged) “diversity statements” and the like. I’m often told that because I obviously agree with the goals the statements promote, I shouldn’t make a big deal out of them. But this objection misses the point. The “what’s the big deal?” approach puts me in mind of the philosopher Sidney Hook, who in a 1953 essay in The Times argued that an academic who refused to swear to not being a member of the Communist Party was like a chef who refused to say whether he was the one who poisoned the food. Not for a moment did Hook entertain the notion that the hypothetical professor might simply believe, as a matter of principle, that it is wrong to screen the professorate for ideological conformity.
It was wrong then; it is wrong now.
Don’t mistake me. I’m not against ideology and social movements, except when they interfere with academic curiosity. On campus, at least, you should be able to support Israel in the Gaza War, yet feel free to argue that the Israelis have prosecuted the conflict too aggressively; or to support the aspirations of the Palestinians, yet be willing to condemn the Oct. 7 attack unequivocally.
This understanding points toward the proper resolution of other campus conflicts as well. Consider, for instance, the recent controversy at the University of California, Berkeley, law school over student organizations that require invited speakers to be “anti-Zionist,” an explicit ideological screen. The answer is not to argue over whether such demands violate school rules, but to emphasize the ways in which such restrictions violate the norms of curiosity and engagement that lie at the heart of higher education. Hanging out only with those they already agree with is one of the surest ways for young people to waste their years on campus.
Should the scope of campus debate be unlimited? Of course not. There will of necessity always be a palisade marking the boundary line of civilized conversation, and some opinions will always rest in the wilderness beyond freedom’s protection. Most obviously, calls for violence based on racial or other differences rest in that wilderness, and need not be tolerated in the classroom. Neither should playing Shockley’s game, judging intellectual ability according to the color of a student’s skin. But the force that exiled these topics from serious discussion was history, not top-down ordering. If we believe in the values of the academy, then the wilderness must be small, and the palisade in the far distance, scarcely visible to the unaided eye.
All of which brings us back to the tragedy of Claudine Gay. One way to look at the question she had such trouble answering was whether the universe of rational discourse — the space education properly occupies — is broad enough to accommodate demands for genocide. Of course, the only reasonable answer is no, and that is what the presidents should have said, clearly and concisely. Yet I think I know what they were worried about. On today’s campuses, the word “genocide” is tossed about frequently by activists, to describe what might better be called “sharp disagreement.” Part of the responsibility of the classroom teacher is to guide debate away from apocalyptic terminology, in the hope that students, rather than reaching for the language of horror, will seek to engage their opponents — that they’ll be more curious.
I’ve argued elsewhere that each generation should be permitted to place, at most, one issue beyond the fence. I believe it still. But that exclusion should follow, not precede, the waves of social change. Especially today, as the world teems with ferment, as every activist left or right pushes to move this issue or that outside the universe of rational discourse, the academy must be the last to yield.
Stephen L. Carter is the William Nelson Cromwell professor of law at Yale and the author of 17 books. He is working on a memoir titled “The Case for the Classroom.” He most recently wrote for the magazine about his long association with Thurgood Marshall.