Democrats Need a Programmatic Alternative to Mass Deportation, Not Just “Resistance”
This op-ed was first written in May 2025 and then modified over the month of June as President Trump’s detentions of unauthorized immigrants began to increase in intensity and as the “No Kings” demonstrations of June 14 attracted millions to their day of protest. A shortened version appeared in the Pittsburgh Post-Gazette on July 12, 2025, under the mostly misleading headline, “On Immigration, Trump and the Democrats both went too far.” That headline implied that the op-ed was a piece of detached analysis, when it was really meant as a call to political action for the Democratic party.
The recent focus on President Trump’s provocative and possibly illegal militarization of the immigration issue has obscured the vulnerability of his underlying policy on the subject itself. Democrats would do well to refocus attention on that policy. With the number of new unauthorized arrivals now down to near zero, the moment is ripe for Democrats to advance a broad proposal for bipartisan immigration reform that substantially accepts the Trump-era restrictions on new border crossings and visa violations – which began during the final year of the Biden administration – while urging a path to citizenship for the millions of illegal immigrants, other than convicted criminals, who are already here.
In brief, Democrats should say yes to border restrictions, no to mass deportations. There are strong reasons to believe that a majority of Americans would support such a middle position.
The more the U.S. Immigration and Customs Enforcement agency shifts its attention away from alleged gang members and foreign student radicals and begins to detain the likes of nonthreatening workers from car washes or locally popular proprietors of family-owned businesses – as has happened in the Philadelphia area, where I live – or restaurant workers, as just happened outside of Pittsburgh, the more likely opposition will develop to such inhumane and seemingly incomprehensible practices. The same goes for more nationally publicized raids on garment, construction, and landscape workers in California, Florida, and Massachusetts, which have often resulted in painful separations between family members. These actions, notwithstanding the president’s latest pledge to avoid going after agricultural and hospitality workers, are creating an opening for more sensible immigration proposals to be considered.
Despite Mr. Trump’s repeated assertions that voters delivered him a “mandate” for all his actions on immigration, there are signs that a solid majority of Americans oppose mass deportations on both humanitarian and practical grounds. As one Trump supporter in a rural part of Missouri explained her decision to aid a recently jailed restaurant coworker, a forty-five-year-old woman who came to the United States from Hong Kong on a tourist visa twenty years ago and now has three American-born children, “I voted for Donald Trump, and so did practically everyone here [in Kennett, Missouri]. But no one voted to deport moms. We were all under the impression we were just getting rid of the gangs….” The detained woman’s church has organized a prayer vigil for her, and hundreds of local residents have signed petitions to bring her home.
A recent national poll conducted by the centrist organization Third Way found that while 90% of voters agreed that “We should deport any immigrant who is convicted of a violent crime,” fully 75% also stated that “We need to establish a pathway to citizenship for hardworking immigrants who have been living and working here for years, even if they came here illegally.”
Democrats can look to several past precedents for models of a programmatic alternative to the president’s policies. The most promising initiative came in June 2013 when the Senate passed a comprehensive immigration reform bill by a 68-32 vote, with 14 Republican senators (including Marco Rubio and Lindsey Graham) joining 52 Democrats and 2 Independents in favor. The bill promised strict border security; it reorganized visa preferences for legal immigrants, including temporary workers; and it established a five-year path to citizenship for “Dreamers” (children who arrived illegally in the U.S. under age 16) and a 13-year path to citizenship for those immigrants who arrived illegally as adults. The latter path required applicants to pay a $1,000 fine for having broken the law when they first entered the country or overstayed their visa, to pass English-language proficiency and United States history and civics tests, to be free of a criminal record, and to pay an additional $1,000 in processing fees for the verification of their applications. Unfortunately, the effort died when the House refused to take up the bill.
More recently (2023-24), a bipartisan group of 38 Congressional representatives (30 Democrats and 8 Republicans), including three Pennsylvanians (Democrats Chris Deluzio and Chrissy Houlahan and Republican Brian Fitzpatrick) proposed the Dignity Act, which called for strong border security and a streamlined, 60-day asylum process, along with a path toward legalized residency for Dreamers and other undocumented immigrants. To be legalized, these immigrants would have to pass a criminal background check and pay $6,000 in fines and restitution fees.
Given President Trump’s hold over the current GOP, it might seem foolish to aim for bipartisan legislation. But that assessment fails to reckon with what the coming months may bring in terms of chaos and likely clashes between extremists on the right (including within the federal government) and left. We have already seen plenty of evidence of such extremism in Los Angeles. Under these changing circumstances, it is critical that Democrats hold out a civically responsible path toward sensible immigration reform, to which disaffected Republicans, even if small in number at first, might gravitate. Such an approach will pave the way for significant gains in the 2026 midterm elections. Prying just 10% of Trump voters away from the Republicans on this issue would likely bring Congress back under Democratic control.
Democrats are absolutely correct to be vocal in condemning the Trump administration’s violations of due process rights of detained students and other immigrants, along with its other likely infractions of the law. These high-profile cases touch on important legal and moral principles. But these cases are mostly a sideshow for the general population. Because of the unpopularity of most targeted student radicals and alleged gang members and the resort to vandalism by the most extreme protesters, the president even uses these few examples to build support for the far wider deportations he has now set in motion. Democratic “resistance” cannot substitute for programmatic opposition that will actually cut into the wider base of support for President Trump’s immigration policy.
Democrats also need to acknowledge where they went wrong from 2016 through 2023, when they overlooked the negative consequences of far too many people entering or remaining in the country illegally. Only by making this admission will they regain the trust of American voters. On this basis they can then return to their party’s more longstanding and valuable commitment to a controlled flow of legal immigrants. Democrats must not shy away from explaining why immigration is good for America’s future, both economically and demographically, provided the flow of newcomers can be kept within limits and conducted with sensitivity to occupational competition and settlement patterns. With these principles in mind, Democrats can forcefully proclaim that there is no good reason to deport the very same kinds of people today who have always added so much to our nation’s wealth and character.
The “Settler Colonial” Trap for Israel
A review of Adam Kirsch’s On Settler Colonialism: Ideology, Violence, and Justice (Norton, 2024)
This extended review of Adam Kirsch’s book, On Settler Colonialism: Ideology, Violence, and Justice (Norton, 2024), appeared in Athenaeum Review, Issue 11 (Spring 2025): 66-71. Among the journal’s minor changes to my article was the omission of parenthetical page references to Kirsch’s book. I have retained these page references from my original draft in the copy reprinted here. The footnotes are the same in both versions.
Adam Kirsch’s new book, a powerful critique of the concept of settler colonialism as applied to the history of Israel, could not have arrived at a more timely moment. Kirsch outlines a number of essential points that collectively show the shallowness of viewing the establishment of the state of Israel through the lens of this concept, one that has become ubiquitous on college campuses. Moreover, he demonstrates how the concept has been adopted by pro-Palestinian and anti-Israeli activists not for its value as a term of scholarly analysis but rather as a component of a political ideology aimed at undermining Israel’s legitimacy. At a length of a mere 140 small-sized pages, the book’s brevity and lucidity make it readily assignable for undergraduate and graduate courses (and even for high school social studies classes), thus addressing exactly the audience of idealistic, yet naïve and hot-headed students (and teachers) who today are most in need of reading it. Along with these achievements, however, Kirsch’s book exhibits two shortcomings that point in the direction of further work needed to complete the task the author has so forcefully begun.
The term “settler colonialism” emerged in the 1960s and ’70s among academics interested in studying the phenomenon of African and Asian decolonization, a process that unfolded with great rapidity following the end of World War II. Scholars used the term to distinguish between colonies like India and Vietnam, to which the European mother country sent few of its own people as settlers, and those like Rhodesia or Algeria, where European settlers came to comprise significant minorities of the population. However, by the 1980s and ’90s the concept had become a pejorative label – really, little more than a slogan – with which to stigmatize the nation of Israel, itself formed during the same era.
As Kirsch points out, almost none of the characteristics common to settler colonies fits the history of modern Israel’s creation. There was no “mother country” which “sent” its Jewish inhabitants to the Ottoman Empire’s Syrian and Sidonian (later, Beirut) provinces or its Jerusalem district (together including the area referred to intermittently as Palestine)1 when the Zionist movement began in the 1870s and ’80s. Jews fled the Russian Empire and other countries of Eastern and later Western Europe as refugees from the persecution of pogroms and then the Holocaust. They came under their own steam, in any way they could muster, though helped by the philanthropy of well-to-do Jews in Europe and the United States. They bought land from Turkish and Arab landowners. These newcomers to the place that had given birth to both Judaism and Christianity, moreover, could make their own justifiable claim to being its indigenous people, a term frequently employed by today’s settler-colonial ideologues in an attempt to draw a sharp distinction between those said to be the original inhabitants of an area and its later colonial invaders. Not only had the ancient kingdoms of Israel and Judah and then the kingdom of Judea persisted from early in the first millennium BCE until the Romans killed or enslaved and carried off the great majority of the region’s Jews (perhaps about one million in number) in the first and second centuries CE, but Jewish people trickled back and continued to live in Jerusalem, Safed, Hebron, and other Palestinian towns and villages, sometimes in substantial numbers, during the many centuries following the conquest of the area by Arabs and Islam in the seventh century.2
Even when the British endorsed the Zionist project by proclaiming their support for a “Jewish homeland” in Palestine with the Balfour Declaration of 1917, this important decision did not mean that the British would now call all the shots there. The growing Jewish population in Palestine faced off with the British on numerous occasions during the period of the British Mandate (1920-1948) over the issue of how many Jewish immigrants would be allowed in. Indeed, the British ultimately turned their back on the Jews in 1939. Were it not for the moral pull, particularly on President Truman and the Americans, exerted by the Holocaust and especially by the plight of more than 100,000 displaced European Jews who had survived the Second World War, the modern nation of Israel might never have come into existence.3
Since Kirsch’s book centers on how the notion of settler colonialism serves a broader ideology, some of the book’s strongest parts go toward identifying that ideology, which turns out to be a species of revolutionary leftism that adopts a particularly uncompromising, almost nihilistic stance. The clue to understanding this outlook resides in the fact that once a country has been labeled a settler colonial society, there are no remedies to ameliorate this oppressive situation short of complete decolonization. Yet, significantly, the ideology began by taking aim at nations like the United States, Canada, and Australia, countries in which the earlier inhabitants had been reduced in number – typically, these theorists assert, through genocide – to tiny minorities next to the huge populations of settlers and their descendants, who, of course, no longer think of themselves as settlers or colonists. Since there is no possibility of removing the invaders from these countries, the only outcome for those among the settlers’ descendants who become aware of their nations’ terrible histories is to feel the guilt that comes with having committed an original sin. The current practice, for example, of “land acknowledgments” to Native American groups that once occupied the territory on which a museum or university now sits, offers but a poor substitute for any genuine atonement. “The goal,” Kirsch writes, “is not to change this or that public policy but to engender a permanent disaffection, a sense that the social order ought not to exist” (34).
Naturally enough, beneath such a deep sense of cultivated alienation often lurks the potential for violence, or at least for the glorification of violence, since few people can sustain a sense of severe personal guilt without experiencing a corresponding desire to do something cathartic to remove that stain from themselves. Enter the case of Israel. Here, to these ideologues, one confronts a fairly recent example of a new nation seemingly formed through the displacement of most of its original inhabitants, and one in which the roughly equal number of Jews and Arab Palestinians (about 7 million each) in the lands of the former British Mandate (Israel, Gaza, and the occupied West Bank) make imaginable the theoretical, if thoroughly unrealistic, possibility of “decolonization.” For leftists of many stripes, from anti-capitalist and antiracist fighters to environmentalists, feminists, and gay rights activists, the Israel-Palestine conflict provides, as Kirsch puts it, “a local address to a struggle that can otherwise feel all too abstract” (85). It was but a short step from accepting this premise to the corollary of numerous groups and individuals on the left cheering on the murderous actions of Hamas on October 7, 2023.
Kirsch is hard-hitting against this ideology. He condemns its all-or-nothing, purist stance toward perceived social injustice, arguing instead for such moderate, reform-minded strategies of amelioration as those adopted in the past by the organizations of the civil rights movement of the 1960s or the National Congress of American Indians from the 1940s up through today. Such groups recognized that everything good and everything bad do not line up neatly on two separate sides in any conflict. Extremist ideologies like that of settler colonialism fail to see that migrations of people and displacement (or partial displacement) of one group by another, whether by force or by assimilation (or, as in the case of the Americas, Kirsch omits to add, principally by the inadvertent spread of disease), have been a common occurrence throughout human history. Naturally, such upheavals in the past were not carried out in ways that would meet today’s standards of human rights and dignity; indeed, brutality and cruelty were more often the rule than the exception. But when revolutionary ideologies have occasionally taken power (as they did in eighteenth-century France and twentieth-century Russia and China), their attempts to radically restart history in the name of utopian ideals only succeeded in producing new atrocities atop the ones they believed they were avenging. Such disastrous ends would surely be the result if the settler-colonial ideologues’ goal of “decolonizing Palestine” were to be put in motion. Hamas on October 7 offered a preview of just such a result.
Given that Kirsch has so plainly – and correctly, it seems to me – identified the settler-colonial misreading of the history of Israel and Palestine with the historic left, it is peculiar that he chooses to end his book with a mostly laudatory treatment of an essay, “On the Concept of History,” written by the German Jewish Marxist theorist Walter Benjamin in 1940 while Benjamin was living in Paris, on the run from the Nazis. What Kirsch likes about this essay is Benjamin’s reliance on the concept of despair to describe what history has meant for the majority of the world’s people. Such a bleak vision of the past could certainly make sense for someone writing in the face of fascism’s successful early conquests in Europe at that moment. But, in truth, the essay itself reveals Benjamin’s outlook to derive less from his immediate circumstances and more from his commitment to what he calls historical materialism, the Marxist diagnosis of history as a record of “barbarism,” a “tradition of oppression,” against which only a decisive, revolutionary break with the past will deliver relief. All attempts to produce slow but steady “progress,” Benjamin writes, are illusory and doomed to failure.4 This is exactly the same extremist stance as that adopted by today’s settler-colonial activists toward the social ills of the United States and Israel, with far less apparent justification than Benjamin’s precarious position in 1940 might have provided.
Kirsch does in the end depart from Benjamin’s maximalist outlook, so why privilege him with such extended treatment in the book’s final chapter? I think Kirsch’s decision to do so stems from a desire to add a religious dimension to his discussion. Benjamin was an unusual thinker of his era in astutely recognizing a religious longing beneath Marxist doctrine, even going so far in his “Concept of History” essay as to call historical materialism a puppet – he meant this as a positive characterization – whose strings were being pulled by theology. He was similarly at home in referring to his hoped-for revolution as the coming of the Messiah, an event he imagined as modeled after past moments of apocalyptic renewal that would wipe away all the despairs of history.5 Kirsch clearly finds Benjamin’s religio-political vision, based on the weighty notion of despair, compelling, but Kirsch rejects it in favor of what he calls a different kind of despair, one linked to a future that can find hope in repairing past wrongs through partial means of improvement. Kirsch believes he can find this second understanding of despair in portions of the Talmud, which, he relates, speak of the despair that owners of lost or stolen articles – a garment, a donkey, some coins – feel over ever recovering their possessions, especially once the items have been passed along or resold to new and innocent parties. The Talmud’s ancient rabbis urge the original owners to accept compensation of some other sort for these goods so as not to create new injustices.
The discipline of religious studies has names to apply to these two different visions of despair and redemption: premillennialism and postmillennialism. In the first, there can be no justice (no thousand years of happiness) until after the Messiah has come. In other words, the Messiah must appear before the millennium can begin; all is dark before that apocalyptic moment arrives, delivering its Day of Judgment. In the second, justice (the thousand years of happiness) comes gradually through small steps of improvement, culminating in the Messiah’s return almost as an afterthought or reward. As Kirsch’s entire critique of the settler-colonial ideology makes clear, the author sides with the second of these redemptive visions, entailing a need for human beings to grasp the particulars of perceived social injustice with care and to proceed deliberately toward gradual changes that will produce more good than harm. But it is not evident to me how framing this choice in religious terms, as Kirsch does, helps bring about his admirable goal. If anything, the history of conflict in the Middle East suggests how frequently competing religious authorities have made compromise and small improvements difficult to achieve.
Kirsch’s turn toward religion at the end of his book, not to mention his elevation of Walter Benjamin’s thought, also obscures his earlier identification of the settler-colonial ideology as part of a longstanding leftist political tradition. This tradition owes more to the romantic distortions of psychological alienation from modern, capitalist society, to which intellectuals have so often been susceptible, than to any genuinely religious impulse. The discomfort left-wing intellectuals, often well-to-do themselves, feel from their participation in modern Western societies best explains the ideology’s underlying guilt and the way in which Israel has come to function as a convenient and vulnerable stand-in for these activists’ more powerful home nations.
Instead of leaving readers with a choice between two different religious approaches, I suggest that Kirsch’s work points to the need for a renewed focus on the political and social history of the Israel-Palestine conflict. Throughout his study, Kirsch alludes to a number of counterarguments taken from The Hundred Years’ War on Palestine: A History of Settler Colonialism and Resistance, 1917-2017 (2020), by Rashid Khalidi, probably the most prominent American historian writing about the conflict from a pro-Palestinian perspective. Yet Kirsch fails sufficiently to rebut these arguments (a lack of precise footnotes doesn’t help his case). Two favorable reviewers of Khalidi’s book, both established scholars in the field of Middle Eastern Studies, have called it the best “single book” that they would now recommend both to students and general readers in order to understand the Israel-Palestine dispute.6 His book has also appeared consistently on The New York Times’s weekly best-seller list since October 7, 2023.7 Khalidi’s observations deserve careful evaluation and explicit rebuttal, if only because his book has brought the settler-colonial ideology to countless readers and will continue to do so. There have been several critical reviews of the book, including a strong one by Benny Morris in Jewish Review of Books, but a much greater volume of critical commentary is needed in terms of both scope and influence in order to avoid losing this vital contest of ideas over Israel’s legitimacy as a nation.8
Kirsch states at the outset of his study that he will not be addressing either the conduct of the current war in Gaza – an important subject of its own – or how to resolve the Israel-Palestine conflict itself (x). However, toward the end of the book he does rather ambiguously endorse the idea of a future two-state settlement, presumably based roughly on the divisions of 1967 (116). Indeed, this vision offers the only foreseeable way to resolve peacefully a dispute between two equally legitimate moral claims to the same piece of land, given the existing record of previous armed conflict. Khalidi, by contrast, endorses either the idea of a single Palestinian state in which Jews and Arabs somehow live together as equals or two separate states based roughly on the boundaries, drastically different from those within the region today, of the 1947 United Nations partition plan9 – both thoroughly unrealistic prospects. Bridging this gap in goals and expectations will require intense negotiations, now pushed well into the future owing to Hamas’s horrifying attack and the violence of the subsequent war in Gaza. Whenever these negotiations do begin, however, they will require the support of sober historical understanding, not the partisan distortions of ideology.
1 Rashid Khalidi, Palestinian Identity: The Construction of Modern National Consciousness (New York: Columbia University Press, 2010), pp. 218n37, 28, 34, 221n69.
2 Howard M. Sachar, A History of Israel from the Rise of Zionism to Our Time, 2nd ed., revised and updated (New York: Alfred A. Knopf, 1996), pp. 18-20.
3 Sachar, pp. 287-295.
4 Walter Benjamin, Illuminations: Essays and Reflections, ed. Hannah Arendt (Boston: Mariner Books, 1968), pp. 196-209, esp. pp. 200 and 202.
5 Benjamin, pp. 196, 205-206, 209.
6 Reviews in Journal of Palestine Studies 51.4 (2022): 109-112, by Laila Parsons, Professor of Modern Middle East History, McGill University; and in Journal of Islamic and Muslim Studies 8.2 (2023), by Michael Vicente Perez, Associate Professor of Anthropology at the University of Memphis, TN.
7 Khalidi’s book had been on The New York Times’s best-seller list for 33 weeks, as of June 2, 2024. See www.nytimes.com/books/best-sellers/2024/06/02/paperback-nonfiction/
8 Benny Morris, “The War on History,” Jewish Review of Books, Spring 2020. See also the valuable review by Michael Rubner in Middle East Policy 27.2 (Summer 2020): 173-177.
9 Rashid Khalidi, The Hundred Years’ War on Palestine: A History of Settler Colonialism and Resistance, 1917-2017 (New York: Metropolitan Books, 2020), p. 251 outlines four possible options, but the weight of the book’s overall argument falls on the third and fourth of these options, the two noted in my text.
Reply to Carla Pestana’s “Origins of Witchcraft Crisis”
This 1000-word reply appeared as a Letter to the Editor in the American Historical Review 130.2 (June 2025): 1007-1008, in response to a featured retrospective review of Paul Boyer and Stephen Nissenbaum’s Salem Possessed: The Social Origins of Witchcraft (1974), entitled “The Origins of Witchcraft Crisis 50 Years Later,” by Carla Gardina Pestana (American Historical Review 129.4 [December 2024]: 1751-1754). I have linked here my reply and Pestana’s review to pdf copies of the original journal articles. The AHR’s paywall prevents nonsubscribers from accessing these web articles directly.
In printing my letter to the editor, the AHR misprinted the word “capitalism” as “capitalist” in the second-to-last paragraph, 6th line from the bottom. (I had taken the word from a direct quotation from Salem Possessed.) I have corrected the pdf copy in pen.
TO THE EDITOR:
In her featured review of Paul Boyer and Stephen Nissenbaum’s 1974 book, Salem Possessed (“The Origins of Witchcraft Crisis 50 Years Later,” AHR, December 2024, 1751-1754), Carla Gardina Pestana laments that a work whose “assumptions underlying the interpretation have crumbled” should “still resonate” and remain “a seductive story” (p. 1754). Her conclusion would have been more persuasive had she summarized the scholarship undermining those assumptions and better captured why the book has nevertheless proved so popular.
In order to explain the pattern of witchcraft accusations in Salem village in 1692, Boyer and Nissenbaum posited a twenty-year-long factional face-off between the economically declining farmers residing on the Puritan village’s western side (they predominated among the accusers) and the economically advancing farmers and tradesfolk residing on the village’s eastern side, next to Salem town (they predominated among the accused and their defenders). The authors contended that this factional strife accounted for the selections and departures of the village’s four ministers during these years, culminating in the witch-hunting ministry of Samuel Parris. They argued that the prosecution of presumed witches provided an outlet for the economically struggling Puritans to get back at their more prosperous neighbors.
This depiction of the Salem witch hunt was accepted by nearly all academic reviewers until 2008. In that year Richard Latner demonstrated that Boyer and Nissenbaum’s reliance on tax-assessment lists from the single year 1695 had yielded a misleading portrait of the two sides. Adding tax data from 1681, 1690, 1694, and 1700 to those from 1695, Latner showed that the supporters of Parris and the witch hunt were not a declining group prior to 1692 (or later, for that matter), having gained back an earlier deficit by 1690. “The tax rolls do not support the claim that the pro-Parris group lashed out in resentment in 1692 against those [the presumed witches] who represented the superior forces of modernization,” Latner concluded. “If any group had reason to complain, it is the minister’s opponents” (“Salem Witchcraft, Factionalism, and Social Change Reconsidered: Were Salem’s Witch-Hunters Modernization’s Failures?” William and Mary Quarterly, 3rd ser., 65 [July 2008]: 423-448; quotation pp. 446-447).
In the same year Benjamin Ray revisited Boyer and Nissenbaum’s most famous graphic (appearing on p. 34 of Salem Possessed), a map displaying a large selection of the residences of the leading witchcraft accusers, accused witches, and witches’ defenders in 1692 (90 data points in all). The geographic distribution of these residences dramatically lined up on opposite sides of the authors’ posited east-west divide, with accusers greatly overrepresented (94%) on the village’s western side and accused (77%) and their defenders (81%) overrepresented on its eastern side. Through a careful (though not flawless) reexamination of all the village’s participants in the witch hunt, Ray was able persuasively to correct mistakes and omissions in the original map to produce a less lopsided pattern between accusers (now 59% on the western side) and accused (now 68% on the eastern side); Ray omitted defenders from his corrections (“The Geography of Witchcraft Accusations in 1692 Salem Village,” William and Mary Quarterly, 3rd ser., 65 [July 2008]: 449-478). Yet Ray could not account for what remained of this geographic variance.
Ten years later my own chapter-length critique of Salem Possessed solved the problem of the geographic divide left over from Ray’s amended map by reminding readers of the larger contexts of family ties and religious beliefs within which witchcraft accusations proceeded, typically owing to personal grievances and fears. A number of extended families – with the Putnams and Wilkinses in the lead – produced the lion’s share of witchcraft accusations at Salem, and because generations of family members held contiguous lands through subdivisions, accusers tended to live near each other and at some distance from those they accused. The Wilkins clan alone, which owned most of the land in the northwest corner of Salem village, accounted for eleven accusers on the western side of Boyer and Nissenbaum’s map, while its members lived near just one accused witch and a single defender. The accused also tended to be grouped in extended families, not, as Boyer and Nissenbaum thought, because they held commercial ties to Salem town, but rather because it was believed that witchcraft ran in families. Boyer and Nissenbaum’s “factions” were chiefly an illusion fostered by reading later events and alliances backward into the pre-1692 period. (See Switching Sides: How a Generation of Historians Lost Sympathy for the Victims of the Salem Witch Hunt [Baltimore: Johns Hopkins University Press, 2018], chap. 2.)
Omitting all this scholarship from her review, Pestana is left criticizing Salem Possessed simply for its alignment with historians’ town studies of the 1960s-80s, which depicted colonial towns (mostly positively) as traditional and insular when we now know them to have been bound up with capitalist values and trans-Atlantic commerce right from the start. In truth, these settlements shared in both these characteristics, as most historians of that era acknowledged. What made Salem Possessed so seductive to readers and scholars from the 1970s all the way up through today was not its denial of the commercial element but rather its ideological hostility to it. The Salem story was one where, as the authors put it, “a subsistence, peasant-based economy was being subverted by mercantile capitalism….[T]he social order was being profoundly shaken by a superhuman force which had lured all too many into active complicity with it. We have chosen to construe this force as emergent mercantile capitalism. [Cotton] Mather, and Salem Village, called it witchcraft” (pp. 178, 209).
As for witch hunting, it will not do to banish “village infighting, spite, and envy” from our understanding, as Pestana advocates (p. 1754). That’s the part of the phenomenon that joins early modern European witch hunting (including in its colonial outposts) to witch hunting in all times and places. The other essential part of this phenomenon comes from the religious ideas and fervor of the era that turned small-scale scapegoating into mass panics.
Tony Fels
Professor Emeritus of History
University of San Francisco
“Hate” Was Not the Problem at Penn (or Other Universities), Radicalism Was and Still Is
This essay appeared in slightly altered form on the website of Tablet Magazine on April 1, 2024 under the title, “Jew-Hatred Is Not the Problem at Penn (or Other Universities). Radicalism Is.”
Late last year Pennsylvania’s Governor Josh Shapiro interjected himself forcefully into the uproar over former University of Pennsylvania president Liz Magill’s Congressional testimony (in which she failed to say that calling for “the genocide of Jews” would necessarily violate the school’s code of conduct) and the events on the Penn campus that would soon culminate in her resignation. The governor came out strongly against antisemitism in all its forms – never bad in itself. But Shapiro’s widely reported speech on Sunday, December 10, 2023, at Philadelphia’s Rodeph Shalom synagogue, in which he proclaimed, “Hate has no place here,” misnamed the chief problem that had plagued Penn and a number of other universities this past fall. That problem was not the expression of group hatred toward Jews, of which only a handful of examples have existed on most American campuses for many years now, but rather the radical politicization of higher education to the detriment of the free expression of ideas, which constitutes the lifeblood of any college. Threatening in this way to undermine the very idea of a university, political extremism may also predictably endanger the safety and well-being of individuals – Jews among them – who live, study, or work at one.
In the flood of commentary that followed President Magill’s ouster, right-leaning columnists fairly pilloried elite institutions like Penn for their hypocrisy in scrupulously defending the legal principles of free speech on campus concerning criticism of Israel while ignoring years of restrictions on faculty and outside speakers whose views challenged “social-justice” norms on race, gender, religion, and other topics. Meanwhile, left-leaning columnists properly warned of dangers to academic freedom if wealthy donors, politicians, or other self-interested parties can bypass normal university procedures to influence educational content – in this case, in the name of opposition to antisemitism. Neither side in this clash has been keen to acknowledge its own contributions toward undermining academic freedom and diversity of thought at universities, turning the conversation into yet another skirmish in the “culture wars.” A focus on what has provided the stimulus for so many recent campus controversies, the perception of speech and actions that are considered hateful, may offer some clarity toward useful university reforms and an assessment of the current moment’s dangers for Jews.
Incidents of reported antisemitism at Penn this past fall received a boost from two singular events – a high-profile conference showcasing Palestinian literature and political activism, which took place on the campus in late September, and the savage assault by Hamas on Israeli civilians on October 7, precipitating the ongoing war between Israel and the terrorist organization in Gaza. As a result, the record of these incidents (detailed in the next three paragraphs), while at first glance startling in number, suggests, on further examination of what’s known about their circumstances, somewhat less cause for alarm. Overall, this record comports with the general findings of the Anti-Defamation League for two recent years, which downplay the significance of universities as settings for antisemitic attacks. For both calendar years 2021 and 2022, the ADL found that just under 6% of the total number of antisemitic incidents occurring throughout the United States (which rose to its highest number on record in 2022 at 3,697 incidents; no doubt that number will be far higher for 2023) took place on college campuses. And most of the recent ones at Penn, as we will see, are best classified as political in nature.
The record for Penn this past fall is as follows: On September 13 students discovered a swastika painted on an inside surface of Penn’s Stuart Weitzman School of Design, with no apparent leads turning up as to the identity of the perpetrator or the significance of the precise target. On September 21 Penn’s Division of Public Safety apprehended a man for entering the campus’s Hillel building, overturning some furniture, and shouting, “F—k the Jews. They killed JC.” The man, who the Penn police said was “experiencing a crisis,” had been spotted earlier overturning trash cans on a nearby city street. His relationship to the Penn community has never been clarified. (The Washington Free Beacon, citing an unnamed Hillel spokesperson, reports that the intruder was a Penn student, but all other sources refer only to “an individual.”) As part of the three-day “Palestine Writes” conference, held on the campus September 22-24, speakers excoriated Israel from multiple angles [see link at paragraphs 6, 115], including as a nation of “settlers from Europe” who became “occupants of our country.”
On September 27 a foliage-covered booth for the Jewish holiday Sukkot, erected by Penn’s Chabad organization, was desecrated with unreadable graffiti, but Penn’s police did not consider the incident antisemitic. On October 16 a pro-Palestinian demonstrator, not affiliated with Penn, told students in a pro-Israeli counter-demonstration that they should “leave us in peace or go back to Moscow or Brooklyn.” He later pushed a bystander and ripped down pictures of Israelis held hostage by Hamas, for which he was apprehended by Penn’s police. Two days later a Penn library staffer also tore down pictures of the people assaulted and taken captive by Hamas. When a Jewish student confronted him over what he was doing, words were exchanged and the staffer swore at the student [see link at p. 17]. On October 20 students at the off-campus Jewish fraternity Alpha Epsilon Pi found the phrase, “The Jews R Nazis,” written on the door of an adjoining empty building (owned by a Jewish landlord). There are no leads as to the perpetrator(s).
On October 28 an Israeli flag was ripped down and taken from an off-campus residence hall for Orthodox Jewish students. The perpetrator was found to be a Penn student involved in the campus’s anti-Israel group, Penn Against Occupation [see link at p. 18]. On the night of November 8 Penn Against Occupation projected pro-Palestinian slogans, including “Let Gaza live,” “From the river to the sea, Palestine will be free,” “Zionism is racism,” and “Penn funds Palestinian genocide,” onto the faces of a number of campus buildings, messages which President Magill denounced the next day as “vile” and “antisemitic,” promising a full investigation by the Penn Police. Throughout this period (with dates unspecified), according to Penn Hillel’s rabbi, “a small number of Penn staff members” received hateful, antisemitic messages and violent threats that targeted the recipients’ personal identities [see link at p. 21]. And on December 3 in a citywide protest, some 500 pro-Palestinian demonstrators ended a march by spray painting graffiti on several Penn properties and stores that line the campus.
Even a single one of these reprehensible incidents is one too many. But overall, how should we understand this record? Among these roughly ten incidents (leaving aside the “Palestine Writes” conference), two of the perpetrators were identified as Penn students (or a student group), a third as a Penn employee, and a fourth as holding an unspecified relationship to the campus community. A fifth perpetrator was an off-campus community radical. The perpetrators of four more of these incidents remain unknown, and one incident may not have been antisemitic at all. In terms of its content, despite the presence of occasional generic symbols of antisemitism, this record of attacks on Jewish targets is best described as an extremist outgrowth of political radicalism stemming from the longstanding Israeli-Palestinian conflict. The political character of most of these attacks is underscored by the fact that the Jewish population at Penn, by no means weak at 16% of the undergraduate student body, sponsored a variety of its own public, political stands, most supportive but some critical of Israel, during this same period. None of this interpretation is meant to minimize the potential for violence against Jews embodied in the anti-Israeli radicalism at Penn (as I will turn to in a moment). Rather, it serves to name the threat in a manner that connects it to the dominance at so many American colleges today of radical left-wing ideology on behalf of causes said to represent such “oppressed groups” as African Americans, other people of color, and a variety of sexual minorities. At Penn, as at other college campuses, Palestinians, not Israelis or Jews, are considered an oppressed group.
For universities, the fact that these incidents derive more from radical political sentiments than from traditional Jew-hatred points the way toward how these institutions should handle the problem. It is lucky that ethnic hatred per se does not lie at the root of today’s campus woes, because colleges are not – or should not be – in the business of inculcating moral values or teaching civics. Those responsibilities are best left to families, lower-level schooling, religious bodies, and voluntary organizations. Universities exist for the purpose of furthering higher education and fostering the pursuit of truth through advanced research, both of which goals demand wide-open forums for the presentation and discussion of ideas. At the same time, universities must proceed with internal rules that enable their mission to go forward and not be impeded by illiberal elements (whether arising from among faculty, students, administrators, or outside parties) who would disrupt their educational and research functions.
For many years now, universities have been doing exactly the opposite of what is required to support these goals. They have restricted the free flow of ideas by “canceling” presentations they think might offend an “oppressed group,” while allowing protesters presumed to represent “oppressed groups” to take over campus buildings, block pathways, or interfere with quiet learning environments. Examples abound, from left-wing students at Middlebury College in 2017 shouting down political scientist Charles Murray’s guest lecture on cultural and genetic differences among social groups, to Penn’s own ongoing disciplinary investigation of law professor Amy Wax for, among other things, inviting a white supremacist to make a presentation to one of her classes. If anyone believes that radicals on the political right might not act similarly to restrict the speech they dislike, were they to be in control of these same universities, one need only glance at the attempt to institute more conservative tenets of orthodoxy in the teaching of American history at colleges in Florida.
As it happens, the threat to university life posed by political radicalism can best be mitigated by colleges adhering to these twin principles of encouraging wide-open speech – excluding foul language or any true threat of violence or intimidation directed at an individual or group (which would include, for example, any “call for the genocide of Jews”) – and placing strict physical limits on campus protests. There is no reason why, for example, the claim that Israel has committed “genocide” against the Palestinian people, or even that the nation of Israel should not exist as a refuge for Jewish people, abhorrent as these ideas are to me and many others, should be ruled out of order at a university. The best way to discredit such radical misconceptions and convictions is precisely by airing them to reasoned criticism and debate, including by experts in related fields of study, through lectures, classes, teach-ins, and written work. That’s what universities are for. The problem with the “Palestine Writes” conference was not that it was allowed to take place but rather that the faculty who set it up made no effort to seek balance or diversity in the perspectives and expertise that were represented on its panels. Meanwhile, plenty of college rules and criminal laws already exist for prosecuting anyone committing acts of disruption, vandalism, harassment, or personal assault on a college campus. They need to be enforced.
In responding to the recent increase in reported incidents of antisemitism, universities should resist the temptation simply to add “antisemitism awareness” to the list of topics already covered in the mandatory DEI (“Diversity, Equity, Inclusion”) orientation sessions that have become commonplace on campuses. There is little evidence that such efforts at overt moralizing accomplish their stated aims, while they more reliably inhibit the expression of unpopular views. Given that the greatest threat to the universities today stems from political radicalism, a far more effective counter to the ugly manifestations of campus protests lies in demonstrating the shallowness and dangers of the radicals’ ideas and rhetoric.
For Jews, the fact that recent campus actions perceived as antisemitic proceed from left-wing political beliefs as opposed to ages-old myths about the Jewish people, or, for that matter, as opposed to newer, right-wing political ideas like the “great replacement” theory, which holds Jews responsible for encouraging illegal immigrants to come to the United States, may offer little comfort. After all, acts of vandalism, shoving and swearing at individuals, or leaving anonymous, threatening messages are frightening and intimidating regardless of their perpetrators’ motives. Radical beliefs, which so often arise from misplaced anger and poorly understood historical relationships, also have a way of migrating from one side of the political spectrum to the other. Marx, for example, contributed an early, derisive text on Jewish commercialism (On the Jewish Question) that figured in the later evolution of European fascist thought. Additionally, at any point along the way a mentally ill individual might act on these radical ideas to produce terrible violence, or mob psychology might take hold of a portion of a radically engaged crowd, resulting in similar consequences. The latter development never happened at Penn this fall, but it almost did at New York City’s Cooper Union, where pro-Palestinian demonstrators banged on the glass windows of the campus library, frightening some of the Jewish students inside, as a security guard kept the door closed.
Moreover, in the current era of collegiate-based pro-Palestinian radicalism, which appears to have arrived at Penn as early as 2015 with the rise of groups pushing the goals of the “Boycott, Divestment, and Sanctions” campaign [see link at paragraphs 56-90], there really is an element of hatred involved. This is the hatred of Israel. One has only to witness the fury expressed by so many of the speakers at pro-Palestinian campus rallies to recognize that for most of the leaders, if not the followers, at these rallies, Israel is perceived as an illegitimate nation, whose majority Jewish population is living on stolen land that rightfully belongs to Palestinian refugees. This anger toward Israel not infrequently spills over into attacks on Jews who have no immediate connection to Israel. “[B]ecause you have never known the sanctuary of a home,” one presenter at Penn’s “Palestine Writes” conference put it, “…it’s no wonder you want our land for your own.” Who is the “you” in this sentence if not worldwide Jewry? And how else to explain the Penn rally speaker’s retort, noted above, to American Jews in the crowd to “go back to Moscow or Brooklyn,” or the target of one of the threatening messages, also noted above, in this instance conveying a bomb threat aimed at the Lauder College House, Penn’s newest, large dormitory, which was named for its biggest donors, Jewish members of the family behind the Estée Lauder company? One cannot read the 84-page civil complaint, filed in December by two Jewish students at Penn, alleging that Penn has allowed a hostile environment for its Jewish students to be created on its campus, without acknowledging the genuine sense of fear that evidently gripped many of these students (over 200 placed their names on one petition), as they watched and heard boisterous displays of anti-Israeli sentiment and received occasional antisemitic slurs week after week throughout the fall [see link at paragraphs 2, 101, 104, 107, 141-145, 161, 165-173, 191].
And yet, it would be a mistake to think the recent events at Penn and other American college campuses signify a true resurgence of virulent antisemitism akin to the widespread abuse Jews suffered during the 1930s in the United States, let alone in the cities of Europe. A number of factors serve to limit the current wave of anti-Jewish sentiment, but perhaps the main one is that the hostility at present really is focused on Israel, not on Jewish people as such. (See Eitan Hersh’s valuable observations about this distinction, as revealed in attitudes held by far left-wing as opposed to far right-wing college students.) It is probably not an accident that the lead student plaintiff in the civil lawsuit against Penn is a dual Israeli-American citizen [see link at paragraph 15], for he has reason to feel particularly vulnerable to attack under these circumstances. And while this young man succeeded in obtaining the signatures of roughly 200 Penn students on a petition to prod the university to curtail pro-Palestinian activism, that number is still a relatively small fraction (about 12%) of Penn’s overall Jewish student population. It is likely that a majority of Jews at Penn did not feel personally threatened by the events of last fall. (Two post-October 7 surveys that purport to show widespread fear and anxiety among Jewish college students have drawn their respondents from those students with particularly strong attachments to Israel, in one case from a pool of young adult Jews who had applied to Birthright Israel, in the other case from students who appear to have been selected by Hillel campus organizations [see note on methodology at end of link]. A more relevant recent survey, one specifically designed not to exclude students with more minimal Jewish identities, found that roughly one third of all Jewish students expressed anxieties about being visibly Jewish on campus, about the same proportion who said they had been personally targeted by antisemitic comments, slurs, or threats. That proportion rose to somewhat less than two-thirds when respondents were asked if they believed Jewish students “pay a social penalty” for supporting Israel as a Jewish state.)
Indeed, some of Penn’s Jewish students conspicuously joined in many of the pro-Palestinian demonstrations, either in formal groups or as individuals. One Jewish student group found itself in a confrontation with the university administration, when it insisted on going ahead with showing a documentary film critical of Israel’s West Bank policies despite the university’s decision to delay the showing until passions on the campus had cooled. We should not be surprised by this split among Penn’s Jewish students, because American Jews under the age of forty hold considerably more critical opinions about Israel’s general policies toward Palestinians than do those older than forty.
The demographic characteristics of the campus protesters, so far as can be determined by second-hand observation, also fit with the demonstrators’ focus on Israel. Palestinian Americans appear to have dominated the protests at Penn, both as the leading speakers at rallies and in the make-up of the supporting crowds. Some are even Palestinians attending American colleges as foreign students – the Penn student who ripped down the Israeli flag from above the Orthodox Jewish student residence hall appears to belong to this category. Many are likely to be in contact with relatives and friends living in the West Bank or Gaza. To some extent, the radicalism of these ethnic Americans, focused on harsh legacies from “the old country” and fueled by the desire for upward mobility in the face of perceived prejudices in their new country, resembles past waves of second-generation immigrant radicalism (among, for example, Irish, Italian, Jewish, and Mexican Americans) common throughout our country’s history. Knowing this history, however, doesn’t make anger-driven radical actions any less worrisome for institutions, such as universities, that require openness and reasonability to operate, or for individuals, who may easily be demonized as “enemies of the people.”
Radical movements tend to suffer from an unwillingness to look inward and to recognize the failings of their own group’s past leadership, choosing instead to place all blame on their historical antagonists and the latter’s perceived representatives in the present. The pro-Palestinian campus radicals clearly suffer from this flaw, as they have uncritically carried forward the tragic failure of past Palestinian leaders to seize numerous opportunities since 1947 to build a Palestinian nation alongside Israel. As today’s pro-Palestinian radicals have attracted support from among young black, feminist, and other radicals, they have allowed themselves to demonize Israel, just as the Black Lives Matter movement and certain gender radicals have demonized white people as “privileged racists” or men as “cis-gendered patriarchs.”
As a species of scapegoating, antisemitism is inherently unpredictable in its trajectory. It is well to remain on guard in case today’s political antisemitism should burst out of its current anti-Israeli boundaries or spread beyond college campuses, their adjacent youthful urban enclaves, and Arab-American ethnic communities. For now, this worry remains muted by the firewall of sorts created by the overwhelming support for Israel shown by most Americans after the attack of October 7. However, the threat posed by left-wing political radicalism itself, particularly to college campuses, is real enough and must be countered by reasoned argument and the enforcement of lawful behavior.
What Penn Got Right and Wrong about Antisemitism on Campus Last Fall
A series of incidents occurring at the University of Pennsylvania last fall, culminating in Penn president Liz Magill’s fateful Congressional testimony (in which she failed to say that calling for “the genocide of Jews” would necessarily violate the school’s code of conduct) and her subsequent resignation, brought widespread charges that the university had ignored or even encouraged an outburst of antisemitic “hate” on its campus. As the war between Israel and Hamas continues and students have now returned to campuses throughout the country, it pays to look back at what Penn got right and wrong in its handling of this volatile subject so as to minimize future confrontations.
Readers may find that six links in this unpublished op-ed are blocked from connecting to their source: https://brandeiscenter.com/wp-content/uploads/2023/11/University-of-Pennsylvania-Title-VI-Complaint-1_Redacted.pdf. All six go to a formal complaint filed on November 9, 2023, by the Louis D. Brandeis Center for Human Rights Under Law with the United States Department of Education’s Office for Civil Rights under the title, “Civil Rights Violations by the University of Pennsylvania.” Each of these six links appears in the op-ed followed in the text by a parenthetical page reference to that document. This essay was completed on February 9, 2024.
A series of incidents occurring at the University of Pennsylvania last fall, culminating in Penn president Liz Magill’s fateful Congressional testimony (in which she failed to say that calling for “the genocide of Jews” would necessarily violate the school’s code of conduct) and her subsequent resignation, brought widespread charges that the university had ignored or even encouraged an outburst of antisemitic “hate” on its campus. As the war between Israel and Hamas continues and students have now returned to campuses throughout the country, it pays to look back at what Penn got right and wrong in its handling of this volatile subject so as to minimize future confrontations.
Even before the savage assault by Hamas on Israeli civilians on October 7, precipitating the ongoing war, the fall semester at Penn began with a provocative event: a high-profile, three-day conference in September showcasing Palestinian literature and political activism. The university took heat for allowing this conference to go forward, because some of the invited speakers had been accused of making antisemitic statements in earlier appearances. But in greenlighting the conference, Penn got something important right. Universities exist to promote the pursuit of truth in both research and education. This pursuit, as John Stuart Mill pointed out almost two centuries ago in his classic, On Liberty, requires the widest berth for the free expression of ideas. Only in this way can wrong-headed and even dangerous assertions be shown to rest on erroneous facts or faulty logic, while truthful elements might be discovered within even the most unpopular positions.
As long as speakers did not threaten any individual or group, or use foul language in their presentations, Penn was correct to refrain from stepping in to halt the conference. Some critics said that the Penn administration should at least have rebutted any statements made at the conference that could be construed as antisemitic (see link at p. 12). But this is not the job of a university administration. The academic committees and departments responsible for setting up and endorsing the conference should have arranged in advance for a wide range of views to be represented, thus enabling criticism and reflection to be built into the conference itself. It appears these sponsoring bodies did nothing of the sort, which points to a deep, underlying problem at Penn and other universities: the low value placed on viewpoint diversity among their academic staffs. This was something Penn got wrong, though the fault lay with the university’s faculty more than with its administration.
When Hamas attacked southern Israel on October 7, slaughtering some 1200 people and dragging another 240 people back to Gaza as captives, Penn’s president equivocated before issuing a statement of condemnation (see link at p. 15). Critics pointed to the hypocrisy of university presidents quickly condemning other national or international atrocities, like the killing of George Floyd or the Russian invasion of Ukraine, but then hesitating over what to say about an attack on Jews and others in Israel. Here, Penn walked into a mistake of its own making. Universities should not be in the habit of supporting or condemning any political event, precisely because to do so undermines a university’s special mission of promoting the all-sided pursuit of truth. All viewpoints must feel welcome at a university, and that can’t happen when the top officer of a school – and that goes as well for department chairs and other institutional heads – embraces a political position. (Former president Magill might have spoken as an individual citizen about the Hamas attack, provided she made clear she was not representing Penn in doing so.)
When a number of incidents of vandalism and harassment hit Penn’s campus – including a swastika found painted inside a campus building; a man entering the school’s Hillel center, overturning furniture and yelling an anti-Jewish obscenity; a pro-Palestinian demonstrator shoving a bystander; a library staffer tearing down pictures of Israelis held hostage by Hamas (see link, at p. 17); graffiti, “The Jews R Nazis,” discovered on a door next to a Jewish fraternity; an Israeli flag ripped down from atop an Orthodox Jewish residence hall (see link at p. 18); pro-Palestinian slogans projected at night on campus buildings; and antisemitic threats left on the voicemails of some Penn staff members (see link at p. 21) – Penn’s campus police acted with swiftness in investigating these incidents and apprehending a few of the perpetrators. Penn got this right, assuming it goes forward with disciplinary actions where merited. However, it should have done more to rein in campus protesters when they blocked pathways, took over a section of the student union, or interfered with the ability of students to study quietly in the library, as was reported (see link at p. 17).
When a Jewish student group tried to show a documentary film critical of Israel, Penn told them to wait several months until tensions had cooled. This was a mistake. Not only was there little reason to believe that the film showing would have resulted in violence, but even if that threat were real, it’s the university’s job to provide police to safeguard any legitimate educational function and, if necessary, to bar outsiders from attending a campus event.
In response to criticism from organizations like the Anti-Defamation League concerning some of these instances, former president Magill agreed before she resigned to add “antisemitism awareness” to the topics already covered in Penn’s mandatory diversity orientation sessions. This, too, was a mistake. Colleges should not be in the business of teaching morals or civics. That job is best left to families, religious bodies, primary and secondary schools, and voluntary organizations. There is little evidence that so-called diversity training can accomplish its stated goals, while it more reliably puts a chill on the voicing of unpopular views. There is no reason, in the case at hand, why even such propositions as that Israel is committing genocide in Gaza, or that the nation of Israel should not exist as a refuge for Jewish people, abhorrent as these notions are to me and many others, should not be civilly voiced and debated at a university. Teach-ins, provided they make every effort to present a wide range of contexts and views, offer ideal settings for universities, led by their faculties, to take up even the most controversial subjects.
Finally, President Magill’s resignation itself did not need to happen. No doubt she answered poorly the question posed to her toward the end of a grueling Congressional hearing, a lapse for which she did try to make amends in the days that followed. (She is certainly no antisemite.) Her confusion at the time, as at many other times throughout the fall semester, however, reflected a fundamental failure, common to university leaders today, to understand and articulate just how free speech and academic freedom should function when confronted by radical political passions. This failure has been evident for years, leaving Penn and many other universities open to the charge of hypocrisy in defending free speech only when it fits with the left-leaning political views shared by those who currently dominate these campuses. A college president who was more sure of the proper boundaries separating academic freedom from politics might have been able to stand up to the forces that drove her out of office.
In going forward, the general rule for Penn and other colleges should be that universities need to become less permissive about disruptive behavior and more permissive about unorthodox ideas.
Compromise is possible in Central Bucks Pride flag conflict
As readers of Philadelphia area newspapers know, a battle has been raging in the Central Bucks School District over how a number of sensitive cultural topics should be handled in the classrooms and school libraries. While subjects concerning race, ethnicity, religion, and political party leanings have all merited mention as examples of thorny challenges to district-wide policy, no single issue has proved as explosive as the question of whether to permit teachers to display the rainbow-colored Pride flag, signifying dignity and rights for sexual minorities, in their classrooms. If compromise can be found for that conflict, it seems safe to say that a similar approach might be applied to the remaining areas of discord.
This article appeared first on the website of the Bucks County Herald on February 2, 2023 (without links), and then on the Broad and Liberty website on March 16, 2023.
As readers of Philadelphia area newspapers know, a battle has been raging in the Central Bucks School District over how a number of sensitive cultural topics should be handled in the classrooms and school libraries. While subjects concerning race, ethnicity, religion, and political party leanings have all merited mention as examples of thorny challenges to district-wide policy, no single issue has proved as explosive as the question of whether to permit teachers to display the rainbow-colored Pride flag, signifying dignity and rights for sexual minorities, in their classrooms. If compromise can be found for that conflict, it seems safe to say that a similar approach might be applied to the remaining areas of discord.
On January 10, 2023, the school board voted 6-3 to ban the display of “any flag, banner, poster, sign, sticker, pin, button, insignia, paraphernalia, photograph or other similar material that advocates concerning any partisan, political, or social policy issue.” The board exempted from this ban such display if it were part of a curriculum unit; if flags were those of the United States, Pennsylvania, or a federal or state military branch; or if school personnel wore a small piece of jewelry representing an individual’s personal beliefs (see Policy 321). The board majority argues that education proceeds best when teachers check their politics at the classroom door, thereby encouraging students to develop and express their own views. The majority acknowledges that its ban would prevent the hanging of a Pride flag but adds that it would similarly prohibit, for example, a pro-Life banner, signifying opposition to abortion.
The three minority members of the school board counter that the new ban on partisan, political or social advocacy is really a smokescreen for eliminating views from the classroom with which the majority disagrees, especially, they write, “positive representations of diversity that reflect the beauty in our society.” Education proceeds best, in the minority’s view, when students feel they belong in their schools. “[F]or historically marginalized groups, most notably the LGBTQ community,” achieving this goal requires a welcoming environment fostered through the display of such symbols as the Pride flag. As one poster at a recent protest on behalf of the minority’s position put it, “Pride is not political.”
So far the battle has remained nonviolent, but angry statements, name-calling (“indoctrination” vs. “censorship”), and protest actions by parents, teachers, and students threaten to turn the conflict in an uncivil direction. A pending investigation by the U.S. Department of Education into a formal complaint brought by the ACLU against the district for creating a “hostile environment” for gay and transgender students has also caused the school board to begin to spend large sums on legal advice. For these reasons, a compromise acceptable to both sides could head off a waste of future resources or worse troubles.
Here’s how a compromise could work. The school system would replace the Pride flags with a conspicuous sign placed at the front of every classroom. The sign would read, “This school does not tolerate discrimination against or bullying of any student.” Teachers would be required to talk about the sign to their classes when the sign first appears and periodically thereafter, explaining why its addition to the classroom environment came about and how it is meant to make the classroom feel safe for all students, including but not limited to members of historically stigmatized minorities. Teachers would also be charged with remaining on the lookout for any acts of discrimination or bullying that occur within their purview, taking steps established by the school administration to bring an end to such acts.
The school board should take the further initiative of establishing a well-publicized mechanism, with protections of privacy and due process for all parties, that encourages students to come forward to report any acts that they believe constitute discrimination or bullying. These reports, like those originating from teachers about student behavior, should be subjected to a transparent procedure for remedying the situation. As the school district shows through its actions its determination to end any discrimination and bullying, the ACLU’s complaint, brought on behalf of seven unnamed students said to have suffered such treatment, is likely to be dismissed, and the district can in turn dispense with its costly attorneys’ fees.
In this dispute both sides make valuable points. The school board majority is fundamentally correct that education needs to be kept distinct from advocacy. The minority is also right to be concerned about the emotional needs of vulnerable students. There is a way forward that can satisfy both these positions.
What Elizabeth Johnson's Exoneration Teaches about the Salem Witch Hunt
On July 28, 2022, the state of Massachusetts formally exonerated the last innocent victim of the infamous seventeenth-century Salem witch hunt. Elizabeth Johnson Jr., known to her contemporaries as Betty, was a twenty-two-year-old resident of Andover, Massachusetts, when she got swept up in the frenzy of accusations, judicial examinations, jailings, trials, and executions that convulsed the communities of Essex County in 1692. All of the witch hunt’s other victims had already been exonerated by previous legislation.
This essay appeared first on Witches of Massachusetts Bay on June 6, 2022, and then in slightly altered form on History News Network on August 22, 2022. I include here the latter version, with three changes made to indicate Elizabeth Johnson Jr.’s relatives with greater precision.
On July 28, 2022, the state of Massachusetts formally exonerated the last innocent victim of the infamous seventeenth-century Salem witch hunt. Elizabeth Johnson Jr., known to her contemporaries as Betty, was a twenty-two-year-old resident of Andover, Massachusetts, when she got swept up in the frenzy of accusations, judicial examinations, jailings, trials, and executions that convulsed the communities of Essex County in 1692. All of the witch hunt’s other victims had already been exonerated by previous legislation. For some the process began shortly after the trials ended. By 1711 fourteen of the twenty who were executed at Salem had had their names cleared and their legal rights restored for purposes of inheritance. A 1957 state law added the name of one more person to this list, and the act’s 2001 amendment brought the remaining five executed suspects under its purview.
But Betty Johnson fell into a different category of victims. She was one of eleven individuals who had been convicted of witchcraft but, for a variety of reasons, never executed. Betty’s trial occurred in January, 1693, at the first proceedings of a new court that was established to take the place of the original but now discredited witchcraft court and to dispense with the remaining witchcraft accusations. Just three individuals were convicted under the revised rules of this later court, but the governor granted them last-minute reprieves, and they were soon pardoned and released along with those convicted by the earlier court who were still alive. In subsequent years, two of the three witchcraft suspects convicted in January 1693, along with the other eight convicted by the earlier court, had their names cleared and their legal rights restored. Despite personally petitioning the Massachusetts legislature for legal restitution in 1712 (paralleling a claim filed two years earlier by her brother, Francis, for monetary compensation for her six months spent in jail), Betty Johnson, alone among all those convicted at Salem, never did receive such a simple declaration of justice – until now.
But what larger lessons does Betty Johnson’s story hold for understanding the Salem witch hunt? The most interesting one for me stems from the fact that Johnson had confessed. Over the course of roughly a year, the panic yielded over 150 suspects who were formally accused of witchcraft, fully one-third of whom confessed to the crime, some before they were even arrested. Since it is well established that nobody in eastern Massachusetts at that time was practicing witchcraft in any meaningful sense of the term (attempting to harness supernatural power to harm others), the question arises why so many of these individuals falsely admitted to committing a felony that carried the death penalty.
In Betty’s case her two statements of confession were made back to back on August 10 and 11, 1692, the first to the local Andover justice of the peace, Dudley Bradstreet, and the second to an examining board led by John Hathorne, the Salem town magistrate who sat on the colony’s special witchcraft court and who was certainly one of the prime movers in the witch hunt. What is most striking about Betty’s confessions is how stereotypical they are. She simply drew from the known lore about witchcraft, including being baptized by Satan, who also appeared to her in the form of two black cats, in order “to pull down the kingdom of Christ and to set up the Devil’s kingdom,” taking these common notions on herself as if she were an avid follower. She claimed she had hurt a number of her neighbors by having her invisible specter sit on one’s stomach, by pinching or sticking pins in cloth likenesses of several others, and by invisibly attacking yet another with a spear made of iron or wood (though she wasn’t sure which). She said she had a “familiar” (left undescribed but typically thought to be an invisible animal) that nourished itself by sucking on her knuckle and at two other places, one behind her arm, a claim that examining women corroborated by noting two little red specks on her body. Throughout her confessions she cited as her criminal accomplices individuals who had already been named as suspects, including her relative (first cousin, one generation removed) Martha Carrier, the most prominent of the Andover suspects, a woman long believed to be a witch by many of her neighbors, and George Burroughs, the former minister from Salem village, who was widely regarded during the panic as the witches’ ringleader. Carrier and Burroughs were both tried and convicted in early August, just one week before Betty’s confession, and both were executed on August 19, a little more than a week after Betty turned herself in.
Why did she take this step of falsely accusing herself? Although Betty came from a prominent Andover family -- she was the granddaughter of the town’s elder minister, Francis Dane -- the extended Dane family, itself part of the larger and more significantly targeted Ingalls clan, had already been attacked by the young and middle-aged people who began accusing their Andover neighbors of witchcraft starting in mid-July. Even more directly, Betty's confession was preceded (on the same day, August 10) by those of two of Martha Carrier’s children, eight-year-old Sarah and ten-year-old Thomas, both of whom implicated Betty Johnson as a member of the witches’ “company.” Most likely, Betty knew that she would be named by her second cousins, the Carrier children, and all three may have thought, in the context of accusations that were wildly whipping around their community, that by confessing they might increase their chances of being treated with leniency. This was not an unreasonable assumption, since the Puritans valued repentance, even as they also showed determination to rid their communities of those they believed had allowed the Devil to grant them the power to practice witchcraft. Twenty were executed before the witch hunt effectively came to an end in mid-October, but none of these twenty came from the ranks of those who had confessed, even though this association was probably not discernible until mid-July and, even so, could never be guaranteed.
Confessions also tended to deflect blame. In Betty’s case, she made clear that it was the forty-two-year-old Martha Carrier who had “persuaded her to be a witch." Carrier, Betty said, had also "threatened to tear [her] in pieces,” if she did not do as she was told. Betty probably hoped that this aspect of her statements would also be protective, even as she must have equally known that confessions were regarded as the highest form of legal proof of actual witchcraft.
Beneath all these likely strategic motives, however, lies the fact that members of the Puritan communities of early Massachusetts could readily convince themselves that in some way or other, perhaps at a moment of weakness, they really had allowed Satan into their lives. A form of strict Calvinism, Anglo-American Puritanism held out virtually impossible standards of piety for its followers to live up to. Puritans sought to live in the truest, loving fellowship of Christ, yet even a stray thought of getting back at someone for a perceived grievance, or a failure to carry out one’s dutiful role as husband, wife, parent, or child, might occasion deep anguish. There is no explicit sign of such religious self-doubt in Betty’s own confessions, but other confessions during the witch hunt were filled with such self-recriminations. Fourteen-year-old Abigail Hobbs, for example, began her witchcraft confession with the admission, “I have been very wicked. I hope I shall be better if God will help me.” Collateral evidence suggests that Hobbs was referring to having been disobedient to her parents, lying out in the woods at night, pretending to baptize her mother, and not caring what anybody said to her. When Abigail Dane Faulkner, Betty’s aunt, confessed at the end of August, she acknowledged that all the accusations made against her kinsfolk had led her to “look with the evil eye” on those doing the accusing, “consent that they should be afflicted,” and “kn[o]w not but that the Devil might take that advantage,” even as she asserted that it was he, not she, who had done the afflicting. In one of the saddest examples of self-recrimination leading on to a witchcraft confession (though this episode was not part of the Salem events), Mary Parsons of Springfield, Massachusetts, imagined that she had entered into a pact with the Devil so she could see her deceased child again.
As the power and momentum of the Salem panic began to recede, many of those who had confessed to the crime of witchcraft recanted their earlier confessions. While there is no remaining record of Betty taking this step, as there is for many of the Andover confessors, we do know that she pleaded not guilty at her January trial, proof that she no longer held to her confession of August 10-11. The people of Essex County were coming back to their senses. Historical records suggest that Betty may have done well in her later years, successfully selling lands in 1709 and 1716 that she had inherited from her father and likely living until the age of 77. By that time – the 1740s – Puritanism itself was well on its way toward softening its spiritual message by moving in two directions, on the one hand toward the more compassionate piety of the evangelical movement and on the other hand toward the version of the Enlightenment’s rational faith that would soon be called Unitarianism.
Review of “The American Encounter with Buddhism, 1844-1912: Victorian Culture and the Limits of Dissent,” by Thomas Tweed
In 1897 the German-American philosopher and editor, Paul Carus, was searching for an American painter to present Americans with a visual image of Buddha, “not according to Japanese and Chinese style, but according to more modern American notions” (p. 111). Through such means, Carus hoped to attract more of his countrymen to the Asian faith. Carus’s quest, which proved unsuccessful, may be taken as a symbol of the predicament facing the two to three thousand European-descended American Buddhists of his day: they were drawn to Buddhism as a radical alternative to the Judeo-Christian religions of their upbringing, but they were unwilling to break with some of the basic assumptions of those traditions.
This review appeared in Critical Review of Books in Religion 7 (1994): 280-282, an annual publication sponsored by the Journal of the American Academy of Religion and the Journal of Biblical Literature.
Is Freemasonry a Religion? Learning from a Nineteenth-Century Masonic Debate
During the last three decades of the nineteenth century a fascinating debate took place inside the Masonic Fraternity of northern California. This debate centered on the question: Is Freemasonry a religion? It is relevant today for two reasons. For one, this debate was not an isolated phenomenon. There is evidence the same controversy was occurring at this time not only in northern California but across many jurisdictions of the Fraternity in the United States.
This article was published in Heredom: The Transactions of the Scottish Rite Research Society 15 (2007): 167-177. I have made a few corrections in pen to errors or omissions that appeared in the published version.
Is Freemasonry a Religion? Learning from a Nineteenth-Century Masonic Debate (Video)
This video presentation is part of the Worldwide Exemplification of Freemasonry, a Masonic lecture series produced by the Grand Lodge of Indiana. It was first shown on December 24, 2011.
Religious Assimilation in a Fraternal Organization: Jews and Freemasonry in Gilded-Age San Francisco
The historical study of American fraternal organizations can yield surprising insights into the complex social processes associated with the term assimilation. A look at the Masonic fraternity in Gilded Age San Francisco provides a case in point. There the Freemasons brought together foreign-born and native-born Protestants and Jews to form a distinct subset within the broad middle class of the city. By examining in detail the extent of Jewish integration within the fraternity, it is possible to show some of the accomplishments and limitations in this process of Masonic identity formation.
This article was first published in American Jewish History 74.4 (June 1985): 369-403. It was reprinted in Freemasonry on Both Sides of the Atlantic: Essays Concerning the Craft in the British Isles, Europe, the United States, and Mexico, edited by R. William Weisberger, Wallace McLeod, and S. Brent Morris (East European Monographs, Boulder, distributed by Columbia University Press, New York, 2002): 621-656.
The "Non-Evangelical Alliance": Freemasonry in Gilded Age San Francisco
Evangelical Protestantism occupies an established place of prominence in the history of the American West. Successive outbursts of revivalism, which enlivened rural areas and cities alike during the nineteenth century, and the great interdenominational associations to promote Bible-reading, Sabbath-keeping and temperance which followed in their wake did much to bring social order and civilization to the vast regions of new settlement. Symbolically capping this united spiritual effort during the post-Civil War decades stood the Evangelical Alliance, a national body of Protestant leaders in roughly forty cities formed to coordinate the multi-faceted evangelical crusade.
This article was published as chapter 10 in Religion and Society in the American West: Historical Essays, edited by Carl Guarneri and David Alvarez (Lanham, Maryland: University Press of America, 1987): 221-253.
A Compromise Is Possible on CRT in the Schools
Fierce argument has recently broken out over whether and how to teach secondary school students about race relations—or CRT, for critical race theory. Here in southeastern Pennsylvania, where I live, White parents at a number of school board meetings have expressed worries that their children will be made to feel guilty for the past racist practices of earlier Americans. These parents have demanded to know what is being taught. Administrators trying to protect coverage of racial prejudice within the school curriculum have sometimes abruptly cut off discussion. In one school district, months-long conflict reached a point in mid-November at which a federal district judge granted a preliminary injunction against the school board for curtailing free speech at its meetings.
This article appeared on American Purpose on January 17, 2022. The subheads (printed bold below), supplied by the website’s managing editor, are somewhat at odds with my intent to propose a solution to this conflict that stands as a true middle ground between the two sides.
We need impassioned, even radical voices as we teach race relations, just not to the exclusion of all others.
Fierce argument has recently broken out over whether and how to teach secondary school students about race relations—or CRT, for critical race theory. Here in southeastern Pennsylvania, where I live, White parents at a number of school board meetings have expressed worries that their children will be made to feel guilty for the past racist practices of earlier Americans. These parents have demanded to know what is being taught. Administrators trying to protect coverage of racial prejudice within the school curriculum have sometimes abruptly cut off discussion. In one school district, months-long conflict reached a point in mid-November at which a federal district judge granted a preliminary injunction against the school board for curtailing free speech at its meetings.
Predictably, these battles have degenerated into slogans hurled from the right and left: “indoctrination” vs. “historical reckoning.” But compromise is possible on this important subject if educators and parents can be persuaded to draw a distinction between the disciplines of history and political thought, both of which deserve respect within the secondary school curriculum.
Debates about CRT often suffer from a failure to specify exactly which texts—or videos, or field trips—proponents and opponents are talking about. A good place to start is the lead essay in the 1619 Project by Nikole Hannah-Jones. Originally published in the New York Times in 2019 and now part of a larger anthology, this essay has probably reached the widest audience and prompted the greatest controversy among the new writings about race in American history. Engagingly written and only about twenty-five pages long, it could readily be assigned in high school classes.
But in which classes should it be assigned?
Soon after the essay first appeared, criticisms by leading U.S. historians flagged many misleading statements of fact and unbalanced judgments in its account of American history. As someone who taught the introductory course in U.S. history to college students for over thirty years, I can attest that it doesn’t take a specialist to recognize the essay’s shortcomings as history.
The essay mistakenly implies that the American colonists fought for independence from Britain in order to protect the institution of slavery from British abolitionists. In reality, the primary motivation behind the Revolution was the colonists’ perception that increases in British taxation, imposed without the consent of the colonial assemblies, signified an entering wedge of British tyranny. Similarly, the essay treats the Constitution at its writing as simply a pro-slavery document; instead, it was deeply ambiguous, even contradictory, on the subject of slavery, which is not surprising in a practical plan of government that aimed to hold together in a single union states that were completely reliant on the institution of slavery and other states that were already doing away with it.
Hannah-Jones’ essay also misstates the principal cause of the Civil War, which was initially fought by the North not simply to prevent the South from seceding but equally to stop the spread of slavery to new territories in the West (the reason why the South seceded in the first place). In her essay, Lincoln emerges as more racially prejudiced than in fact he was. Hannah-Jones also minimizes the role played by White supporters of racial equality throughout U.S. history and oversimplifies the American record on suffrage and immigration. Such lapses as these are enough to disqualify the use of the 1619 essay as a text in the average eleventh-grade U.S. history course.
But if we see Hannah-Jones’ essay as a piece of political literature, we get a much more positive picture. It is worth recalling that the author is a journalist, not a historian. Her essay uses powerful figures of speech to advance an important argument: that the United States owes much of its success, not simply as a nation but as a democracy, to the unrecognized labor, suffering, creativity, and perseverance of its African-American population.
“Now is not the time to exclude thoughtful, impassioned political voices from any discussion of race relations in American history.”
Hannah-Jones uses the metaphor of 1619, the year in which the first African slaves were brought to the English colonies that would eventually become the United States, as an alternative to 1776 as a point of origin for some of our nation’s leading characteristics—both bad ones, like racial prejudice, and good ones, like cultural expressiveness. She employs hyperbole in referring to plantations as forced-labor camps in order to evoke the coercion and terror that so often confronted the lives of slaves.
She movingly begins her essay with reflections on her father, who served in the U.S. Army and kept an American flag raised in the family’s front yard in Waterloo, Iowa, despite having endured the indignities of residential segregation, job discrimination, and police harassment. She ends the essay, again, with the image of the American flag, this time claimed for herself, as she declares Black people to be the most devoted patriots to America’s twin ideals of liberty and equality that the country has ever produced.
At its best, Hannah-Jones’ 1619 essay recalls the strengths of James Baldwin’s The Fire Next Time (1963) and Ta-Nehisi Coates’ Between the World and Me (2015). All three works are impassioned, angry pleas for recognition and justice. All three are deeply personal accounts, moving back and forth between biography and sociology or history. All three reject despair and conclude, despite their anger, by embracing America. Like the two earlier works, Hannah-Jones’ essay deserves a place in the curriculum of a twelfth-grade class on civics or government, where it could be fruitfully paired with the more moderate voices of Black thinkers like John McWhorter, Glenn Loury, or Shelby Steele to offer students a window into the urgent political debates now energizing the Black community and America as a whole.
Now is not the time—there is never such a time—to exclude thoughtful, impassioned political voices, even radical ones, from any discussion of race relations in American history and current life, as our most anxious parents of school-age children (and a few state legislatures) are inclined to do. But neither is it proper for any one political viewpoint to dominate the teaching of racial issues to such a degree as to crowd out all opposing views, much less to sacrifice the disciplinary standards of history along the way, as our most morally driven school personnel seem prepared to do. The aspirations and fears of both sides in these conflicts can be accommodated through compromise, by separating the teaching of history from the teaching of political thought and going forward with both.
Anti-racist Lens Distorts History on New Jersey "Freeholders"
I grant there is no compelling reason for New Jersey’s counties to retain the traditional term “chosen freeholders” as the name for their elected officials. In a bill signed into law by Governor Phil Murphy on August 21, the title of these lawmakers will become “county commissioners” at the beginning of 2021. The new term certainly conveys better than “chosen freeholder” what these elected representatives do. But there is little basis for tying the older term to the history of slavery or racial prejudice, as many of New Jersey’s political leaders have done.
This opinion essay was published on History News Network on September 27, 2020. Internal citations for this article are available on request. Contact Tony Fels.
I grant there is no compelling reason for New Jersey’s counties to retain the traditional term “chosen freeholders” as the name for their elected officials. In a bill signed into law by Governor Phil Murphy on August 21, the title of these lawmakers will become “county commissioners” at the beginning of 2021. The new term certainly conveys better than “chosen freeholder” what these elected representatives do.
But there is little basis for tying the older term to the history of slavery or racial prejudice, as many of New Jersey’s political leaders have done. Governor Murphy, for example, lent his support to the legislation by tweeting, “let us tear down words born from racism.” State Senate President Stephen Sweeney (D., Gloucester) claimed the title “is mired in the language of slavery.” And Felicia Hopson, Director of Burlington County’s Board of Freeholders, linked retiring the term to the goal of “[c]ontinuing our work to end systemic racism…by eliminating an antiquated title from an era when slavery and racism [were] tolerated….”
The term “freeholder,” first brought to the American colonies from England in the early seventeenth century, meant only a person who owned land (or other property) free of debt. The holding did not have to be a particularly large estate; by the mid-eighteenth century farms as small as half an acre were likely adequate to qualify. The idea was that such people, by virtue of their property ownership, would have the economic independence to be free from the influence of more powerful figures and could therefore be trusted with the vote. The “chosen freeholders” were simply the people selected by the freeholders at large to make the administrative decisions for a county until the next election came around.
The freeholders had a profoundly positive effect on the early development of liberal democracy, and nowhere more so than in New Jersey. A remarkable document, “The Concessions and Agreements of the Proprietors, Freeholders and Inhabitants of the Province of West Jersey in America” (1677), for example, established for the new settlements around Burlington the principle of rule by consent of the governed. Signed by 150 individuals, this early constitution contained a bill of rights, guaranteed religious liberty, and proclaimed it had “put the power in the people.” On its basis, the province’s first representative assembly, elected by the freeholders, convened at Burlington in 1681. Two other elected assemblies had begun even earlier in East Jersey. Once East and West Jersey came together to form the Crown colony of New Jersey in 1702, the freeholders continued to stand up for their rights against the royal governor and his council all the way until the American Revolution.
Who were the freeholders? Certainly, nearly all were men. And because Europeans had founded the colonies, the freeholders were overwhelmingly white. By the mid-1700s, Black Africans comprised about 7% of New Jersey’s population, the great majority of whom were enslaved, including by some of the freeholders.
But another unique feature of New Jersey’s history points to a way in which the ideal of freeholder democracy challenged even these limitations. New Jersey holds the distinction of being the only state, just after the start of the American Revolution, to have allowed both some white women (single and with a certain amount of property) and some Black men and women (those who were free, owned property, and, if female, unmarried) to vote. This unusual development in the history of American suffrage, which lasted for about thirty years, began without fanfare, indeed without any special notice at all – which in turn suggests that single, propertied women, both free Black and white, and free Black men of property had likely joined the ranks of the freeholders for some stretch of years prior to the Revolution.
The world of the colonial period was not the same as ours today. It was a world based on a principle of social hierarchy that remained largely unquestioned. Racial distinctions, at least in the northern colonies, did not lie at the center of this social system. A sizable minority – including the wealthy and the middle-class freeholders – occupied positions of independence. Beneath them stood a number of dependent classes: married women, tenant farmers, wage workers, servants, slaves, and the poor. And just as today we cherish the principle of freedom from arbitrary arrest (what came to be known as habeas corpus) that a group of English lords, who probably cared little about anyone other than themselves, won from their king back in 1215 with the Magna Carta, we can similarly pay tribute to the significant, if still limited, gains the freeholders of New Jersey made toward the expansion of popular participation in government.
Sex and The Crucible: A Look into Arthur Miller's Inspiration with Tony Fels
In doing the research for my book, Switching Sides, I came across a surprising realization. Not only did Arthur Miller take nearly the whole story of the Salem witch hunt for his famous play, The Crucible (1953), from his having read Marion Starkey’s The Devil in Massachusetts (1949), but he very likely drew the play’s central dramatic tension, concerning a former affair between the accuser, Abigail Williams, and the accused protagonist, John Procter, from Starkey’s history as well.
This article was originally published on the Johns Hopkins University Press News and Events Blog on January 26, 2018.
WHERE DID ARTHUR MILLER GET THE IDEA FOR THE SEXUAL THEME IN THE CRUCIBLE?
In doing the research for my book, Switching Sides: How a Generation of Historians Lost Sympathy for the Victims of the Salem Witch Hunt, I came across a surprising realization. Not only did Arthur Miller take nearly the whole story of the Salem witch hunt for his famous play, The Crucible (1953), from his having read Marion Starkey’s The Devil in Massachusetts (1949), but he very likely drew the play’s central dramatic tension, concerning a former affair between the accuser, Abigail Williams, and the accused protagonist, John Procter, from Starkey’s history as well.
Two famous literary archives, the Harry Ransom Center at the University of Texas and Yale University’s Beinecke Rare Book and Manuscript Library, are currently jousting over custody of Arthur Miller’s voluminous private papers, including 160 boxes of materials and another 8,000 pages of private journals (see Jennifer Schuessler’s article, “Fight for Arthur Miller’s Archive,” New York Times, January 10, 2018). When scholars gain access to these documents, longstanding mysteries concerning the central plot line of The Crucible may finally be solved. In the play Miller reduced the real age of tavern-owner and farmer John Procter from sixty to the mid-thirties, elevated Abigail Williams’s age from eleven to seventeen, and posited a sexual liaison – wholly made up, as best as the Salem primary documents reveal – when Williams was said, erroneously, to have lived as a servant in the Procter household. In reality, Williams, along with her nine-year-old cousin, Betty Parris, lived in the parsonage of the Salem village minister Samuel Parris. Williams and Betty Parris were the two young girls whose contorted bodies and catatonic states led to the first accusations of witchcraft in the village.
Miller himself obscured the sources for his play. In the 1950s he would only say that he had traveled to Essex County, Massachusetts, and combed through the manuscript records of the trials to find his story, even though we know from Miller’s leading biographer, Christopher Bigsby, that the playwright only allotted seven days for this journey, far too short a time within which to have constructed The Crucible from scratch. Only much later, in his autobiography, Timebends (1987), did Miller for the first time, it appears, acknowledge his debt to Starkey’s history, which he now revealed had served as the model for the play. “In fact, there was little new [beyond Starkey’s account] I could learn from the court record,” he conceded, now explaining his trip to Salem as a way of hearing the sort of language that his seventeenth-century Puritan characters would have spoken.
Yet Miller never disclosed his source for the play’s central drama, in which John Procter must reveal his former affair with the Procters’ “servant” Williams in a failed attempt to save Procter’s wife, Elizabeth, whom Williams had earlier accused of witchcraft. Indeed, Miller did much to throw researchers off the trail. In the 1950s he said he had found the evidence for this sexual relationship in the witch hunt’s examination records themselves. In his later autobiography Miller stated he had discovered evidence for the affair in the leading nineteenth-century history of the witch hunt, Charles W. Upham’s Salem Witchcraft. Neither of these claims has been corroborated by any researcher into these same sources – not surprising, because Williams was only eleven at the time of the witch hunt and was never a servant in the Procter household! However, a hint of exactly the sort of personal situation, entailing extra-marital sexual attraction and jealousy, that Miller developed into his play’s plot line can be found again in Starkey’s history, when she describes the relationships between the Procters and their actual servant, twenty-year-old Mary Warren. Starkey’s hint may have struck Miller with particular force, because the playwright’s own marriage was dissolving under the impact of his affair with Marilyn Monroe just in the years when he was writing The Crucible.
The Devil in Massachusetts would thus seem to have supplied virtually everything Miller needed to produce the plot of his famous play. And, as I show in my book, Starkey’s account offered Miller something even more important than this: the moral lessons to be drawn from the Salem witch hunt – how when a group of people fear something in themselves, they can run roughshod over all restraints in an effort to externalize blame onto scapegoats. Under these circumstances, the call of conscience to speak the truth is extremely hard to follow, because it threatens the competing desire of each person to belong to a community, not to mention the desire to evade accusation oneself.
Traditional Understanding Overshadows Academic Explanations at Rebecca Nurse Commemoration
On June 7, 2021, the NPR show, “Here and Now,” aired a segment on the 400th birthday of Rebecca Nurse, broadcast from the Rebecca Nurse Homestead in Danvers (formerly Salem Village), Massachusetts. Readers of Witches of Massachusetts Bay will doubtless recognize Nurse as one of the most well-known of the 20 individuals executed at Salem for alleged…
This article was originally published on Witches of Massachusetts Bay on June 28, 2021.
On June 7, 2021, the NPR show, “Here and Now,” aired a segment on the 400th birthday of Rebecca Nurse, broadcast from the Rebecca Nurse Homestead in Danvers (formerly Salem Village), Massachusetts. Readers of Witches of Massachusetts Bay will doubtless recognize Nurse as one of the most well-known of the 20 individuals executed at Salem for alleged witchcraft.
The radio program caught my interest for revealing the enduring strength of what might be called the “traditional” understanding of the Salem witch hunt over more recent explanations advanced by some of the many scholars who have studied the tragedy. By the “traditional” understanding, I mean the one made famous by Arthur Miller’s 1953 play, The Crucible, though Miller’s play in fact owed practically everything to journalist-historian Marion Starkey’s The Devil in Massachusetts, which appeared four years earlier. As Starkey (and then Miller) saw it, the witch hunt was a product of social hysteria, brought on by a lethal combination of extreme religious values, calling on people to live up to impossible standards of piety, and ages-old communal scapegoating based on personal enmities. When individuals can’t meet their own community’s norms for a life of rectitude, their sense of guilt may lead them either to imagine they have committed terrible transgressions or else to deflect the blame onto others. Intolerance toward oneself in effect breeds intolerance of others. The heroes in both accounts (Starkey’s gripping narrative and Miller’s equally chilling drama) were the 20 martyrs, who, like Rebecca Nurse, went to their deaths rather than confess to the falsehood that they had made a compact with the Devil.
In an early part of the 11-minute segment, “Here and Now” host Robin Young discusses some recent academic explanations for the witch hunt with Kathryn Rutkowski, curator and president of the Rebecca Nurse Homestead. “Historians say the witch trials were to keep women in line,” Young suggests, referring, without naming the source, to the feminist argument advanced especially by Carol F. Karlsen in The Devil in the Shape of a Woman (1987). Young, however, omits the fact that Karlsen’s study actually showed little interest in Rebecca Nurse or any of the other courageous Salem martyrs (14 were women, 6 were men) in favor of concentrating on the young women who, out of the anguish Puritans are said to have foisted onto women in general, did the accusing.
Rutkowski responds by referencing two other recent scholarly interpretations (again without mentioning the names of authors). One, set forth by Mary Beth Norton in her book, In the Devil’s Snare (2002), argued that the Puritans’ continuing conflicts with Native Americans to the north brought on the witch scare, by depositing orphaned victims of Indian attacks in Salem Village, where they reenacted their childhood traumas by accusing other people of attacking them through witchcraft. Another, advanced by Emerson Baker in his A Storm of Witchcraft (2015), proposed a catch-all explanation for the witch hunt under the phrase, “a perfect storm,” said to include the Native American context, the insecurities of a new colonial charter, a harsh winter, village factionalism, and the local pastor Samuel Parris’ rigid orthodoxy. In truth, no such extraneous circumstances or “perfect storms” are needed to account for witch hunting, which occurred with deadly regularity across nearly 300 years of history throughout western Europe, including in its colonial outposts like New England. Indeed, Hartford, Connecticut, was the scene of a lesser version of the Salem events in 1662, when another witch panic led to 14 indictments and four likely executions.
But all these considerations fall by the wayside as soon as the program turns to Beth Lambright, one of a large number of proud Rebecca Nurse descendants who live throughout the United States. As Lambright tells Robin Young, Nurse, age 71 at the time of her death, lived a quite ordinary colonial life, raising eight children and helping with the work on her family farm. “Yet this ordinary life became an extraordinary moment of, really, heroism,” Lambright explains, when by “standing in the truth, [Nurse] paid for that with her life.” Lambright took her family to visit the Danvers homestead a few years ago because she wanted to pass on to her children the important lesson of what their colonial ancestor had accomplished. As Lambright puts it, “No matter what your community might say about you, if you do not believe it’s true, you stand in what you know to be true.” These are lines that Arthur Miller might have included in The Crucible, a work that Lambright knows well, both from having read it and from having watched her daughter perform in a high school production of the play.
Hoping to draw out a political lesson for today’s times, Young asks Lambright if she doesn’t see some parallels to what’s been happening lately, with America menaced by “conspiracy theorists” and “angry mobs” with “pitchforks.” It’s clear from Young’s left-leaning political perspective that she sees these Trumpian manifestations as the equivalent of 1692’s witch hunters. Lambright appears to agree, but I’m not so sure. She observes, “We’re seeing loud voices. They might look like the majority for a while, but it doesn’t mean that they’re always speaking truth. We have to be really careful that we understand who we are and what our truth is.” Most recently, it’s the Democrats, not the Republicans, who have been in the majority. And antiracist zealots on the left are just as capable of trying to enforce conformity of belief on a particular community through scapegoating as are extremists on the right.
Arthur Miller himself might similarly have seen threatening “pitchforks” coming from the margins of both ideological extremes. While it is well known that The Crucible offered up the Salem witch hunt as an allegory for Senator McCarthy’s red scare of the 1950s, in his later life the playwright acknowledged that the lessons of the Salem witch hunt fit the murderous excesses of the Chinese Communists’ Cultural Revolution just as well. The Salem story for good reason continues to resonate with Americans now nearly 330 years after it drew to a close.
(The NPR program may be heard at https://www.wbur.org/hereandnow/2021/06/07/rebecca-nurse-salem-witch-trials. A popular show like this one naturally comes with some factual errors. In the introduction, Robin Young speaks of about 200 people who were tried at Salem, when she means the number who were accused. The Salem Court of Oyer and Terminer (the special witchcraft court) tried 27 suspects, while the later Superior Court of Judicature (which produced no lasting punishments) handled about 70 remaining cases. Later in the show, Young refers to “one man” who was executed at Salem, when actually there were six men. Beth Lambright meant to say that George Jacobs Sr.’s body, not George Burroughs’, is also buried on the Rebecca Nurse Homestead grounds.)
Confessions of Accused Witches
This exchange was originally published on Witches of Massachusetts Bay on July 30th, 2021. The speaker in the first paragraph is Robin Mason, creator of the Witches of Massachusetts Bay website. Credentials for Salem researcher Margo Burns appear at the conclusion of the article.
After publishing “Traditional Understanding Overshadows Academic Explanations at Rebecca Nurse Commemoration” by Tony Fels, a fascinating discussion ensued in the Comments section between Tony Fels and Margo Burns. Since readers often skip the Comments section, I wanted to share this important conversation about the meaning of the Salem confessions. As Tony put it, “The Salem witch hunt is one of those subjects that simply crosses the boundaries between what interests academics and what interests the general public. We’re all involved in its meaning simply as people, as evidenced again and again by events like the 400th anniversary of Rebecca Nurse’s birthday.”
Margo Burns responds to original post:
Something that I can’t get through to people, both those who adhere to the traditional understanding and those who favor academic explanations, is that the notion that confession somehow spared people is simply not accurate. Just because no confessors were hanged does not mean it was the intention of the Court to spare confessors—that’s a historian’s fallacy. The Chief Magistrate wrote a warrant for the execution of several confessors in January, but they and the rest of the people sentenced to die then were all spared by the Governor.
Confession was the gold standard of convictive evidence in witchcraft cases in that era, mentioned in all the contemporary books about witchcraft, and it was not legally controversial the way spectral evidence was. The belief that a confession, even a false one, could spare one from being hanged in 1692 makes it easier to then cast those who were executed as martyrs: they had a way to save themselves but refused to tell a lie, even though it would have spared them from hanging. So noble! It’s a nice story, but it is not based on historical facts.
Tony Fels responds:
I can’t agree with Margo Burns on this point. She’s technically correct: Confession was the best of all evidence of witchcraft, and those who confessed would have had no assurance that they would not ultimately be hanged for the crime. Indeed, six confessors were convicted by the first witchcraft court and three later on by the second court. But all those trials and convictions occurred late in the witch hunt (mid-September 1692 and then January 1693).
Meanwhile, Tituba had confessed back in March 1692, followed by Abigail Hobbs in mid-April, Deliverance Hobbs a couple of days later, Margaret Jacobs in May, Ann Foster and her daughter Mary Lacey Sr. in mid-July, and then a great many more from Andover. A pattern must have become discernible: confessors were at least being held temporarily without trial, either so that they could name others or because the authorities wished to rid the community of the more dangerous, recalcitrant suspects first. Thus, to confess at least bought a suspect time.
By contrast, those suspects who early on proclaimed their innocence, even as they were brought to the first trials in June, July, and August, refused to take that step of falsely confessing. We can surely sympathize with those who were intimidated into confessing, but the actions of those who resisted such pressures do present us with a noble story!
Margo Burns responds:
Tony, respectfully, it’s necessary to look at the historical data more closely—per case and on a timeline—before making claims about patterns that may have been discernible by the accused at the time they were accused. It’s simply not possible that the 11 people who confessed between February and May could have discerned any “pattern” in how their cases would be handled and chosen to confess accordingly. The magistrates easily forced confessions out of these people, people who were vulnerable and easily manipulated into saying anything the authorities demanded of them—youths, people of low social status, or people with some mental defect. These were hardly people who were looking at some “big picture” or pursuing some kind of “legal strategy.” No one knew anything about the plans or timing for prosecution anyway, or for certain who the Crown’s attorney or Chief Magistrate would be. At that point, June 2, over 70 people were in custody and 11 had confessed. Before then no one could have thought that confession might be some kind of get out of jail free card, especially considering that in the most recent witchcraft case in Boston, just four years earlier, with Stoughton on the bench, Goody Glover confessed and was hanged. Why would they think it would be different for them?
The first mittimus, in late May, to bring accused people back to Salem from jail in Boston for trial comprised a list of eight people who would ultimately be put on trial that summer, plus Tituba, a confessor. While Tituba was the only one not tried that summer, she completely disappears from the legal record until she pops up again a whole year later to have her case dismissed. There is no way to figure out why. She is not part of any of the trials, including Sarah Good’s, at which she should have been a prime witness. The second-best convictive standard as evidence in a witchcraft case was the testimony of a confessed witch—so why wasn’t Tituba called as a witness? By mid-July, this is all anyone knew about how things were going to unfold. A single data point, Tituba, does not make a pattern, and she wasn’t used as a witness against anyone.
By late June, before the court hanged five more people, the first prosecutor left, and frankly, a lot of things were up in the air about how the following cases would be handled. Ann Foster was interrogated five separate times in mid-July to produce a pretty amazing confession. How could she have concluded anything except that the authorities demanded a confession from her and would not stop until she gave them one? And so she did. That is the purpose of interrogation: to elicit a confession to make prosecution easier. It’s hard to argue with evidence of someone speaking against their own self-interest. Before the Court had even convened in early June, only those 11 people had confessed. ALL the rest of the confessions, 43 of them, starting with Ann Foster’s, came from Andover residents or those who lived near enough to attend the church in Andover or were part of a family from Andover. You’d think that if there was a pattern to be discerned, people in other towns would have figured it out, too, to save themselves. Maybe some people who were already being prosecuted would have caught on to the “deal,” recanted their claims of innocence at trial, and thrown themselves on the mercy of the court, but no one did.
It’s also important to look at the recantations from several fully covenanted members of the Andover church who confessed in August under pressure and immediately recanted when the interrogations ended. Why would they recant? None of them claimed they’d confessed because they knew it would help them in any way, despite what they may have been told during the interrogations. For the rest of that summer, the interrogators used high-pressure tactics to coerce false confessions. The case of Samuel Wardwell in September is telling. He was the first confessor to be tried, and he was hanged. When the time came for him to acknowledge his confession, he refused. He had discerned a pattern: everyone who was indicted ended up being hanged. He knew that it didn’t matter whether he confessed or not, and he knew his confession had been coerced. The court was going to hang him either way, so he recanted.
In September, Dorcas Hoar may have made a last-ditch legal effort to gain some extra time before certain execution by confessing after she was sentenced. She probably did see that the four confessors sentenced to die got temporary stays, but it seems really unlikely that she was in a position to leverage four ministers to come to her aid to close the deal, unless it was somehow in their best interest, perhaps to show that it was still possible to save one’s soul.
I appreciate your effort to make the people who were executed “noble” for not confessing, but it’s revisionist history.
Tony Fels responds:
Margo, I’m afraid you have posited a straw-man argument concerning the confessors in order to knock it down. No serious historian of the Salem witch hunt believes that the confessors thought that, in confessing, they had obtained a “get out of jail free card” or had “caught on to the deal” about how to handle the witchcraft interrogators. Nor would any serious historian contend that, simply because no suspect who confessed was executed, the authorities must have decided on a policy to spare those suspects’ lives. Indeed, we know that the witchcraft court convicted five confessed suspects (leaving aside Samuel Wardwell, who recanted his confession) at the court’s fourth and last session in mid-September. These individuals might have met their deaths if events had turned out differently.
The whole Salem witch hunt process was a terrifying ordeal that unfolded without any certain outcome. As you point out, confession was nothing anyone would take lightly, since the last person who had confessed to witchcraft, Goody Glover in Boston just four years earlier, had been put to death for the crime. For strictly religious reasons alone, no pious Puritan—and nearly all of the adult confessors could be classified as such—would have casually acknowledged such terrible acts of blasphemy in their own behavior. And yet, of the 150 or so people accused in the Salem witch hunt, roughly one-third confessed to the crime, and none of these confessors was ultimately executed. Plenty of evidence, much of which is included in your own 2012 article (“‘Other Ways of Undue Force and Fright’: The Coercion of False Confessions by the Salem Magistrates,” Studia Neophilologica 84: 24-39), suggests why this outcome was not purely coincidental: confessing increased one’s chances of survival.
I agree with you that such a likelihood could not have been discerned before the trials themselves got underway with the court’s first session on June 2-3. Eight people had confessed by this point (Tituba, Dorothy Good, Abigail Hobbs, Deliverance Hobbs, Mary Warren, Sarah Churchill, Margaret Jacobs, and Rebecca Jacobs). In your post, you mention 11 confessors before the first trial, but I’ve never seen the names of the three additional people you are referring to. You know the examination and related records better than I do, and these additional names may have surfaced since the publication of your own article. But just focusing on these eight, while one (Good) was a young child and two (Warren and Churchill) quickly recanted their confessions, the other five were all people who could have been selected to be tried at the court’s first session (June 2-3) or its second session (June 28–July 2), but none was. Instead, one non-confessing suspect (Bridget Bishop) was tried and convicted at the first session and on June 10 hanged, followed by five non-confessing suspects (Sarah Good, Susannah Martin, Rebecca Nurse, Elizabeth How, and Sarah Wilds) tried and convicted at the second session and hanged on July 19. At this point (roughly mid-July; there were no confessions in June) it seems possible to imagine that some of the remaining suspects and others still to be named might have begun to see an advantage to confessing.
One (the Salem slave Candy) did so on July 4, followed by five people (Ann Foster on July 15, and then Mary Lacey Sr., Mary Lacey Jr., Richard Carrier, and Andrew Carrier, all on July 21-22), all from Andover, the town to which the witch hunt had by now spread. These latter five were all linked to Martha (Allen) Carrier, an Andover woman who was strongly suspected of witchcraft by many of her neighbors and had been accused and arrested at the end of May. It is reasonable to believe, though we have no direct evidence to this effect, that all five, a group that included two of Carrier’s children, confessed in the hope that their confessions might insulate them from sharing in what appeared to be the impending fate of Martha Carrier. The non-confessing Carrier was indeed tried first at the court’s third session (August 2-5) and was hanged, along with the session’s four male convicted suspects, all also non-confessors (John Willard, George Jacobs Sr., John Procter, and George Burroughs), on August 19.
The approach and aftermath of the court’s third session opened a floodgate of further confessions coming from Andover or Andover-related suspects: two more relatives of Martha Carrier on July 23 (niece Martha Emerson) and July 30 (sister Mary Allen Toothaker); a middle-aged woman (Mary Bridges Sr.) on July 30 and her five daughters on August 3 (Mary Post) and August 25 (Mary Bridges Jr., Sarah Bridges, Susannah Post, and Hannah Post); two more of Martha Carrier’s children (Sarah and Thomas) on August 11; Rebecca Eames on August 19; and at least seven more Andover individuals (Elizabeth Johnson Jr., Mary Barker, William Barker Sr., Mary Marston, Elizabeth Johnson Sr., Abigail Johnson, and Abigail Dane Faulkner) by the end of the month. September brought perhaps another 22 confessions along with the court’s fourth session (September 6-17), during which some of the first confessing suspects (Abigail Hobbs, Ann Foster, Mary Lacey Sr., Rebecca Eames, and Abigail Dane Faulkner) were convicted based either on their guilty pleas or by a jury’s decision after a trial. Still, even these convicted confessing suspects avoided execution on September 22, on which date eight more convicted non-confessors (Martha Cory, Mary Esty, Alice Parker, Ann Pudeator, Margaret Scott, Wilmot Redd, Mary Parker, and Samuel Wardwell) were hanged.
(I have checked all of the above names and dates against the authoritative Records of the Salem Witch-Hunt, ed. Bernard Rosenthal, Margo Burns, et al., 2009. The same information may be found in Margo’s article, referenced above. Most historians, including Margo, seem to rely on Thomas Brattle’s assertion, written on October 8, 1692, that there were 55 confessors among the accused. Nobody, so far as I know, has published a complete list of these names. Based on data found in Records, I include Abigail Dane Faulkner among the August confessors. When she, along with Elizabeth Johnson Sr. and Abigail Johnson, is added to the other August confessors, the total for that month reaches 15, not 12, as noted in Margo’s Table 2 on p. 26 of her article. If 55 is the correct total for the overall number of confessors, then 22 additional suspects must have confessed in September.)
Why did all these individuals confess to crimes we now know they had never committed? We cannot expect the suspects themselves to have explained their motives at the time, because a confession by definition offered an admission of guilt. To the examiners and their surrounding communities, these people acknowledged they had entered into a pact with the Devil to hurt others through witchcraft. In your own article on the subject, Margo, you have emphasized the role played by judicial intimidation, which included everything from intense questioning and incarceration under harsh conditions to the occasional use of physical torture. This is undoubtedly a part of the story. For myself, I would emphasize the role played by guilt among these highly religious people. Under the frenzied conditions of a witch hunt, it was not hard for many of them to imagine that in some way or other they had allowed Satan to enter into their lives by wishing someone harm or by hoping to gain personal advantage in some way that the Puritan community frowned upon. There is explicit evidence of this motivation in the confessions of Abigail Hobbs, Margaret Jacobs, Abigail Dane Faulkner, and others.
But confession also carried the hope that the Puritan belief in public repentance might take precedence over the Biblical injunction to “not suffer a witch to live.” Most confessions, beginning with Tituba’s, included anguished portions in which blame was shifted to someone else, typically to suspects who had previously been named. Confessors claimed that these other persons—for example, Sarah Good and Sarah Osburn for Tituba, Martha Carrier for many of the Andover confessors, George Burroughs for nearly all of them—had forced them to carry out the Devil’s wishes. In so doing, they likely hoped to elicit some sympathy for their plight as victims. Family members and friends also played key roles in exerting pressure on suspects to confess, believing that this might be the only way to save their lives. Andover resident Mary Tyler’s brother insisted repeatedly that she do so, both because he thought she must be a witch if so many people had said so and also because “she would be hang’d, if she did not confesse.” A petition submitted in January 1693, urging the newly reconstituted court to ignore the confessions made earlier by some of the Andover suspects, acknowledged the same motivation when it stated, “Confessing was the only way to obtain favor, [and] might be too powerful a temptation for timorous women to withstand, in the hurry and distraction that we have heard they were then in.” That these desperate strategies probably worked to some degree is suggested by two facts: it took until the witchcraft court’s fourth session before any of the confessing suspects were brought to trial, and when the first group of confessors were finally convicted, this step seems to have been forced on the justices, who were coming under criticism for apparent hypocrisy in overlooking such “obviously” guilty suspects in favor of going after only those who had forthrightly proclaimed their innocence. Even after their conviction, these confessed suspects were still shown a final, and, as it turned out, decisive bit of leniency in receiving temporary stays of sentencing or execution, granted, as Thomas Brattle stated, “for two or three [of them] because they are confessours.” (Abigail Faulkner received a stay of execution by reason of her pregnancy, and Dorcas Hoar, convicted during the same fourth session of the court, also received an unusual stay of execution following her confession just after her sentencing.)
Confession also had a larger impact on the overall course of the witch hunt. From Tituba’s admission of guilt at its start all the way up through the first group of Andover confessors in mid-July, confessions gave credence to the accusations of witchcraft and accelerated the drive to uncover more witches in the communities. Only toward the witch hunt’s end did the sheer number of confessions serve to undermine the credibility of the charges and help bring the panic to a close.
As I see it, the crux of the dispute between you, Margo, and me lies, as with so many of the controversies generated by the study of the Salem witch hunt, in the question of where blame should be placed. In rejecting what you see as a “nice [but fictitious] story” that draws a moral distinction between those suspects who went to their deaths upholding the truth that they were not witches and those suspects who confessed to crimes they had not committed, you appear to want to concentrate all of the blame for the witch hunt on the Puritan judicial establishment, making sure that nobody gets distracted into thinking that the confessors bear at least part of the blame. Hence your emphasis as well on the coerced nature of these confessions. There really was no meaningful choice for a suspect to make, you assert, since all were headed for execution anyway. Confessors did no greater harm than truth-tellers at Salem.
But the Salem magistrates, it’s worth remembering, were not autocrats but elected officials. The Puritan colony of Massachusetts, from top to bottom, fully supported the witch hunt when it was at its height, and even after the English-appointed governor in early October had abolished the first witchcraft court (a step the Massachusetts House of Representatives endorsed only by the very close vote of 33-29), it took years for most residents to recognize that a serious miscarriage of justice had been done. In 1695, three years after the witch hunt’s end, a majority of Salem villagers could still sign a petition in support of Rev. Samuel Parris, perhaps the chief instigator of the panic.
In my view, the colony as a whole bears the lion’s share of the blame for the witch hunt, chiefly because of the extremism of its religious views, which lent themselves to picturing the world as a Manichean struggle between Christ and Satan, good and evil. In this context, the determination of thoroughly average people like Rebecca Nurse, Martha Carrier, and George Jacobs Sr. to tell the truth about themselves at all costs—itself one of the great virtues taught by Puritanism—may be seen as genuinely heroic, because it was the accumulated truth-telling of those 20 martyred individuals that did more than anything else to put an end to the catastrophe Massachusetts had brought on itself. The confessors, too, ironically testified to the great power of telling the truth, because when they later recanted their confessions after the witch hunt was over, the aspect of their behavior that they regretted most was that they had “belied themselves” before God.
Margo Burns responds:
Tony: Thanks for your thoughtful reply, but I still don’t accept your claim that my argument is based on a “straw man.” It is very common in popular explanations of the trials to claim that people consciously confessed to save themselves. As for your statement that “No serious historian of the Salem witch hunt believes that the confessors thought that, in confessing, they had obtained a ‘get out of jail free card’ or had ‘caught on to the deal’ about how to handle the witchcraft interrogators,” here are four—Norton, Rosenthal, Baker, and Ray—who suggest that the confessors themselves believed that confession would spare their lives:
1) Mary Beth Norton, In the Devil’s Snare, p. 303: “By [August and September], as other scholars have pointed out, it had become clear to the accused that confessors were not being tried.”
2) Bernard Rosenthal, Salem Story, p. 151: “Some did manage to escape; those who could not generally opted to save their lives by confession.” p. 155: “On September 1, [Samuel] Wardwell, in a move that he had every right to believe would protect him, confessed to his witchcraft.”
3) Emerson W. Baker, A Storm of Witchcraft, p. 154: “So when [Samuel] Wardwell was questioned about witchcraft on September 1, he and others appear to have believed that confessing would at least delay their trial and execution, and might possibly even spare their lives.” p. 155: “[B]y the time [George] Burroughs was executed on August 19, it was clear that straightforward denials would be no use. Anyone who had pled not guilty was quickly convicted and executed…. Confession and cooperation at least gave the advantage of delay and offered some hope that the individual might ultimately be spared.”
4) Benjamin Ray, Satan & Salem, p. 123: “[Sarah] Churchill never formally retracted her confession. She almost certainly realized that to have done so would have forced the judges to put her on trial.” p. 125: “Hobbs and [Mary] Lacey clearly believed themselves to be free from trial because of their confessions.”
When I return to my original post in this thread, the point I was trying to make is that I do not accept the popular portrayal of those executed as martyrs. A martyr, by definition, is “a person who sacrifices something of great value and especially life itself for the sake of principle.” For this to be true, those who were hanged would have had to feel or know that they had a choice that could affect whether they lived or died. That is just not true. This is all part of the general origin myth of America portraying our ancestors as noble. Then of course there had to be a reason why the condemned didn’t confess and save themselves, right? Maybe they were really principled Puritans, not willing to “belie” themselves. Really? This is not the case. Part of dismantling this whole portrayal is careful examination of what the accused could actually have known and when they could possibly have known it. The timelines of prosecutions and confessions show no correlation, whether viewed then or now. The confessions were coerced, which removes the possibility that the confessors knew what they were doing. The people who were executed are not martyrs, including my own ancestor, Rebecca Nurse. They were victims, and it was tragic what happened to them, but they had no more agency in the outcome than the people who confessed had.
You are correct, Tony, that I put the blame and responsibility for the whole episode on the judges, because they controlled everything. They decided which legal precedents to follow and which to reject. From the start, local magistrates John Hathorne and Jonathan Corwin made multiple decisions to accept all accusations. They entertained spectral evidence as valid, and then held everyone over in jail without the option of being released on bond, against legal precedent. These and other local magistrates were the ones coercing the false confessions. As for the assertion that the judges were all elected, that was not the case. William Phips and William Stoughton received their commissions as Governor and Lt. Governor from King William & Queen Mary in the new charter. Phips handed the management of the legal system over to Stoughton—when precedent would have put the Governor himself in charge of such a court. Stoughton processed all these cases rapidly and left no opportunity for the convicted to appeal their sentences to the General Court, again against precedent. Stoughton had been a judge on a variety of courts across Massachusetts and Maine for two decades and had served on the bench during numerous witchcraft cases before this, and he chose to handle things differently in 1692.
Margo Burns is the associate editor and project manager of Records of the Salem Witch-Hunt (Cambridge University Press, 2009), the most complete compendium of the trial documents. She’s been the expert featured on several Who Do You Think You Are? TV episodes and regularly speaks on the Salem witch trials at History Camp, historical societies, and libraries. Check out her 17th-Century Colonial New England website.
The Return of the “Witch Hunt” Analogy
This article was originally published in Quillette on October 27th, 2019. Internal citations for this article are available on request. Contact Tony Fels.
The political slur “witch hunt” is back. After continually using the term to discredit Special Counsel Robert Mueller’s investigation—84 times over a seven-month period of tweets, by one reporter’s count—President Trump has invoked the term anew to defend against the House Democrats’ impeachment inquiry. Rudy Giuliani, the president’s personal attorney, went one step further in an interview on October 8, 2019, with Fox News’s Laura Ingraham. Referring to the Salem witch trials of 1692, Giuliani said that the impeachment inquiry is “worse than a witch hunt.” The accused witches back then “had more rights”; the court “required witnesses to face the witch and some witches were acquitted.”
Giuliani claimed he was so angered by the House Democrats’ recent actions that he “went back to read two books about the Salem witch trials.” If so, he either picked deficient accounts, or else he failed to read them very carefully. In truth, all twenty-three individuals who were tried by the specially empowered witchcraft court at Salem were convicted. Nineteen of these were executed by hanging (along with one other accused suspect who was pressed to death under heavy stones for resisting the proceedings), two avoided execution by reason of pregnancy, one was later pardoned, and one escaped. Dismissal of charges, acquittals, or reprieves for the approximately 130 additional suspects came about only after the colonial governor disbanded the original court. The court’s use of “spectral evidence”—ethereal likenesses of the accused, visible only to the accusers—had been discredited by the dawning realization that at least some innocent people were being put to death. As for the accused having the opportunity to face their accusers, this feature of seventeenth-century jurisprudence did the defendants little good, since the accusers fell into fits of torment at the sight of the accused, results that were taken to corroborate the suspects’ powers of bewitchment.
Clearly, whatever deficiencies exist in the Democrats’ handling of the impeachment inquiry—and there appear to be some, addressed below—they pale next to the legal inadequacies of the witch hunting era, when criminal defendants did not yet have the right to counsel, judges felt no obligation to remain neutral, and crowds of onlookers could influence the legal process. And yet, despite its obvious flaws, the “witch hunt” analogy’s reintroduction into today’s partisan battle in Washington does provide the opportunity to explain why the president and his supporters have reached for this particular epithet and why it can be effectively employed, just as it was when defenders of Bill Clinton used it in the 1990s against Kenneth Starr and the Republicans in their own quest to remove a president through impeachment.
* * *
The term “witch hunt” itself gained currency at the outset of the twentieth century, used to denote an episode of social psychology in which individuals are punished by a group, with or without official backing, for an alleged offense, but without any procedures of due process. Suspects are presumed guilty as soon as they are accused. They stand little hope of exonerating themselves, even if innocent, because the crowd and whatever judicial apparatus exists provide them no fair and impartial means to mount a defense and clear their names.
Guilty consciences play a critical role in the genesis of a witch hunt. In the first instance there has to be a trait that the community at large regards with such stigma that most people are prepared to shun anyone who may be seen as openly tainted by its presence. But equally important, this same trait must be thought to exist to a lesser degree or just beneath the surface in enough people, so that when accusations begin to fly, the average person has an interest in clearing his or her own guilty conscience by denying the trait in themselves and foisting all of its blame on the named suspect or suspects. This is the mechanism of scapegoating, which always comes into play in a witch hunt. Personal guilt provides the fuel, ignited by the fear that one’s own sharing in the stigmatized trait will be discovered.
The American prototype for witch hunting (though without the name) took place in and around Salem, Massachusetts, in 1692. In this colonial Puritan outpost, twenty people accused of witchcraft were executed, five more died in custody, and over 150 people were jailed for months, including over forty whose false confessions helped seal the fate of those who were convicted. The twenty who were executed went to their deaths proclaiming their innocence in the face of judicial badgering and enraged public opinion. These individuals, fourteen women and six men, refused to “belie themselves” before God by confessing to crimes they had never committed.
Because most people today no longer believe that witchcraft is real, it is sometimes thought that the essence of a witch hunt lies in persecuting people for entirely made-up crimes. This is a misunderstanding. In the context of seventeenth-century cosmology, in which nearly everyone believed they lived in a world of spirits and demons, it was entirely reasonable to think that certain individuals could be enlisted by Satan to draw on supernatural powers to inflict harm on other people or tempt them away from the Puritans’ utopian experiment. And who better for Satan to designate as witches than those who appeared on the outside to be pious members of Puritan congregations? This is why most of the people who falsely confessed to the crime of witchcraft (and often implicated others) were actually among the most, not the least, pious Puritans. These were the sensitive ones who, when they examined their own behavior and saw occasional signs of malice or greed or envy, were consumed by guilt and imagined that their sinfulness had already turned them in the direction of becoming witches. The Salem witch hunt did not manufacture the crime of witchcraft; it exaggerated the presence of a stigmatized trait that most everyone in the community believed really existed.
* * *
These days Americans on the left of the political spectrum are most given to engaging in the social psychology of witch hunting—our first hint about why Trump and his supporters have seized on the term in their own defense. The fear of harboring “racist” or “sexist” thoughts or of being discovered to have engaged in behavior that can be so labeled by the community has produced numerous rushes to judgment (witch hunts) that have unduly injured a number of both famous and ordinary Americans. Virginia Governor Ralph Northam briefly supplied a recent example of a witch hunting suspect. Why was a sincere apology for his insensitive racial behavior thirty-five years ago (appearing in blackface) insufficient to end the controversy, considering the man’s subsequent record as a physician and public servant lacking in racial prejudice? Why did so many Democrats believe he needed to resign, that nothing short of such drastic punishment would do? A similar situation confronted Minnesota Senator Al Franken two years ago in the wake of sexual misconduct charges that stopped well short of assault. Angry Democratic leaders forced Franken to resign before the authorized Senate Ethics Committee could carry out an investigation of the incidents in question. Franken had denied most of the charges, while apologizing for his actions in some of them.
A particularly striking small-scale example of the same phenomenon occurred in Albany, California, in 2017. In this San Francisco Bay Area community, enraged white and black high school students, over one hundred in number and backed by parents and teachers, yelled epithets at several white and Asian-American students and chased them off the campus when they returned to school after serving a suspension for having endorsed derogatory images posted about African-American students and a coach at the school. The crowd apparently deemed the school’s own disciplinary procedures insufficient. One of the targeted students was injured in the melee. In a similar way, local communities and anonymous internet users hounded various Americans, given scornful names such as “BBQ Betty,” “Permit Patty,” and “Cornerstone Caroline,” for alleged acts of racial prejudice before anyone cared to learn the details of their transgressions or their own explanations for their actions. Meanwhile, certain liberal universities—Middlebury College and Evergreen State College are two leading examples—have become notorious for permitting students and faculty to stifle the speech of those accused of holding “racist” views, even when such views are either noninflammatory or entirely lacking in prejudice.
During the late 1940s and early 1950s right-wing Americans took their turn at witch hunting. The stigmatized trait at that time was to be a communist sympathizer. Several thousand Americans lost their jobs as teachers, engineers, actors, film directors, and especially government employees for fear that they would undermine the resolve of the United States in its cold war with Communist Russia. As at Salem and as again today concerning what is taken to be insensitive racial and sexual behavior, confessions of guilt played a central part in the “Red Scare” of the era, adding to the seeming truthfulness of the charges and contributing to their spread. Here, too, the existence of communist sympathizers among professionals and within the government bureaucracy was not made up. A significant portion of Americans had developed anti-capitalist leanings during the Great Depression and the period of the World War II alliance between the U.S. and the Soviet Union. A small number of these individuals (perhaps a little over 300, according to historians Harvey Klehr and John Earl Haynes) carried such leanings to the point of spying for the Russians. But the witch hunt of the 1940s and ’50s exaggerated the threat posed by all these people, the vast majority of whom were loyal and idealistic Americans whose chief fault lay in their ignorance and naivete about what life under communism was really like.
* * *
Knowing the propensity of Americans to engage in these extreme sorts of moral and political purges, especially the most recent crusades against racism and sexism, allows us to understand why a number of conservative politicians have lately fancied themselves the victims of witch hunting. President Trump’s repeated charge that the Mueller investigation was a “witch hunt” offers the most prominent example, but similar charges were voiced in defense of former New Jersey Governor Chris Christie and former Missouri Governor Eric Greitens when both of these Republicans faced allegations of wrongdoing. In fact, not one of these cases constituted an example of a witch hunt, since the ensuing investigations or trials operated in line with customary legal proceedings and respected the principle of due process. The Mueller probe, a prosecutorial inquiry, found insufficient basis to bring criminal charges against a sitting president. Even at this pre-trial stage of investigation, the president had the opportunity to testify in person or, as he chose to do, to answer questions in writing under the guidance of his attorneys. In the New Jersey case a jury convicted two Christie aides of illegal actions taken to get back at a political rival, while charges were ultimately dropped against Governor Greitens. Moreover, in none of these cases was the element of scapegoating present, because the public harbored no guilty consciences about similar behavior in themselves.
In the current impeachment inquiry, Democrats, who control the House of Representatives and all of its committees, have indeed shut minority Republicans out of exercising their own subpoena power, and they have allowed witnesses to testify in closed sessions without compelling reasons for doing so (protecting the identity of a formal whistle-blower, of course, would be one such compelling reason). Democrats liken the House inquiry to the secretive, prosecutorial grand jury stage in a criminal case, and they contend that if the impeachment process leads to a formal trial in the Senate, the president and his supporters will have the opportunity at that point to mount their own defense. While constitutionally defensible, this position appears to lack consideration for what might be called political due process. Since the success of any impeachment drive is dependent on ensuring the public’s perception of fairness throughout the process, Democrats are likely being shortsighted in some of these early procedural decisions. Still, the president, aided by his formidable legal staff, will be fully able to defend himself against any articles of impeachment, should the case move to a Senate trial.
Republican politicians have nevertheless cleverly employed the countercharge of “witch hunt” in all these instances, because, much as Americans have historically been prone to conduct campaigns of moral and political purification (i.e., witch hunts), another side of the American character has typically reasserted itself after each such episode and condemned the earlier miscarriages of justice. Following the catharsis of a witch hunt, tolerance for human foibles returns and more humane paths toward reform are found. Politicians can thus cynically appeal to these anti-witch hunting sentiments as a way to discredit legitimate investigations into their own actions. Close to 50 percent of the American population, according to a poll taken just prior to the release of the Mueller report, accepted President Trump’s mischaracterization of the special counsel’s investigation as a “witch hunt.” Even Joseph McCarthy, the leading witch hunter in the Red Scare of the early 1950s, could misleadingly cast his Senate opponents as a “lynch party,” when the Senate in 1954 finally acted to censure him, and Trump himself recently invoked the same concept (“a lynching”) to describe the impeachment inquiry.
The best way to prevent such perennial misuse of the “witch hunt” label would be for Americans to stop themselves before they allow their moral fervor to get out of control and run roughshod over the legal rights of others—in other words, to refrain from witch hunting in the first place.
L’identité américaine à travers le miroir de Salem
This article, a translation of the Introduction and Conclusion from my book, Switching Sides: How a Generation of Historians Lost Sympathy for the Victims of the Salem Witch Hunt, appeared in the French journal Perspectives Libres, No. 30 (July 2022 – June 2023).
The Fog of Youth: The Cornell Student Takeover, 50 Years On
This article appeared on Quillette, June 25, 2019. In that publication the word "urgent" was mistakenly inserted into the third sentence of the ninth paragraph. Internal citations for this article are available on request. Contact Tony Fels.
On April 20, 1969, an era of student rebellions that had rocked American campuses at Berkeley, Columbia, San Francisco State, and Harvard reached a culmination of sorts with the triumphant exit of armed black students from Cornell’s Willard Straight student union building after a two-day occupation. The students had just won sweeping concessions from the university’s administration, including a pledge to urge faculty governing bodies to nullify reprimands of several members of the Afro-American Society (AAS) for previous campus disruptions on behalf of starting up a black studies program, judicial actions that had prompted the takeover. White student supporters cheered the outcome. And when the faculty, at an emergency meeting attended by 1,200 professors, initially balked at the administration’s request to overturn the reprimands, the radical Students for a Democratic Society (SDS) led a body that grew to six thousand students in a three-day possession of the university’s Barton gymnasium. Amid threats of violence by and against the student activists, the faculty, in a series of tumultuous meetings, voted to reverse themselves, allowing the crisis to end. Student protestors claimed victory for a blow successfully dealt to what they held to be a racist institution.
This positive interpretation of the meaning of the Cornell events has surprisingly remained mostly in place among the left-leaning participants (all within the SDS orbit) with whom I have kept in touch over the past 50 years. Most other former New Leftists whom I have spoken with or who have written about the crisis see it roughly the same way. One might have thought that decades of personal maturation would have produced profound doubts about the wisdom of such extreme actions taken when we were still in, or just past, our teenage years.
The continuity in interpretation by former SDSers is all the more remarkable in light of the fact that the nation at large took a distinctly critical view of the same events right from the start. Most Americans immediately recoiled at the sight of the widely reproduced image, captured in a Pulitzer prize-winning photograph, of the bandolier-wearing student leading the Willard Straight Hall activists, rifles at their side, out of the building.
Headlines describing Cornell’s “capitulation” and “disgrace” typified national news coverage. Among 4,000 letters written to Cornell’s top administrators after the crisis, fewer than five percent viewed the administrators’ actions favorably, and the student rebellion no doubt helped reinforce the country’s shift toward conservative dominance that had begun the previous November with the election of Richard Nixon. Yet through this immediate aftermath and on into the future, most of the aging participants have shown little evidence of rethinking.
In searching for a way to explain this insularity in left-liberal interpretation on the occasion of the rebellion’s fiftieth anniversary, I am struck by how little we activists really knew about the details of the events that were unfolding before our eyes, and how we wanted to know these details even less, both then and later. I gained this appreciation of our ignorance by reading Donald Alexander Downs’s Cornell ’69: Liberalism and the Crisis of the American University (Cornell University Press, 1999), an invaluable narrative and analysis of one of the era’s major campus uprisings. A political scientist today, Downs was himself a Cornell undergraduate during the late 1960s, although his book says nothing of any role he may have played in the crisis (and I have no personal recollection of him from those days). The book apparently came much later, a project for which he carried out extensive research in the Cornell archives, reading through local newspaper accounts and other written sources, and interviewing dozens of former participants in the 1990s.
While Downs presents his own argument about the threat posed by the Cornell protests to academic freedom—an argument I find persuasive—his carefully written and thoroughly documented account can be detached from that argument by those who might disagree with the lessons he draws. His study deserves the attention of anyone today who still wishes to hold to a romantically positive version of those events. Much as the “fog of war” obscures an accurate assessment of a large-scale battle from the range of vision held by any particular combatant, the “fog of youth” may be said to have prevented the vast majority of Cornell’s students at the time from grasping the implications of the conflict as a whole. Thanks to Downs’s history—to which I have added a few minor corrections from Bruce Dancis’s memoir, Resister: A Story of Protest and Prison during the Vietnam War (Cornell University Press, 2014), and several observations from Divided We Stand: Reflections on the Crisis at Cornell, ed. Cushing Strout and David I. Grossvogel (Anchor Books, 1971), and Anita M. Harris’s Ithaca Diaries: Coming of Age in the 1960s (Cambridge Common Press, 2015)—we can acquire a far more informed view today of the entire picture, revealing just how adolescent, intolerant, and frightening the Cornell protests actually were. Looking back now, there is little to be proud of.
White Radicals Take the Initiative
As Downs shows, two mostly separate streams of student activism, one predominantly white, the other exclusively black, came together in spring 1969 to produce the rebellion at Cornell. The mostly white leftists centered their attention on opposition to the American war in Vietnam. As early as May 1965, radicals in a variety of organizations (SDS came to Cornell in 1963 but did not dominate the campus left until fall 1966) disrupted a speech by Averell Harriman, U.S. ambassador to South Vietnam, taking his microphone and insulting him as an “agent of imperialism.” A few days later students interrupted the annual ROTC (Reserve Officers’ Training Corps) review with a sit-in, sparking an angry reaction by members of the audience. A similar demonstration against Marine recruiters in November 1967 by about 200 protesters and 30 counter-demonstrators led to pushing, shoving, shouting, and obstruction.
Not all such student activism resulted so quickly in confrontation. Predominantly white student radicals pursued other issues along with their antiwar activities, including support for the civil rights movement earlier in the 1960s, and later a push for educational reforms (in class size, the grading system, and other areas), a drive to have Cornell build low-income housing for the residents of the adjoining town of Ithaca (the focus for the SDS faction of which I was a leading member), and a campaign for greater freedom of speech and expression in campus publications. Even the latter cause, however, came to a potentially violent head in January 1967, when the district attorney from the surrounding county directed sheriff’s deputies to seize a literary magazine for its sexually explicit material and arrest its student distributors. Repeating a famous tactic from the 1964 free speech movement at Berkeley, students threatened the county official by ominously surrounding his car, while trying to trip a deputy. Local authorities soon backed off, though not before some students had vandalized the empty police car. An effort to encourage students to resist the draft, begun peacefully in spring 1967, similarly ended in a sit-in at the university proctor’s office when the administrator forcibly tried to stop the organizing by suspending several students.
Whether peaceful or confrontational in design, nearly all these forms of campus activism framed themselves as “demands.” With the barest of exceptions, radical students showed little interest in putting forth proposals, making suggestions, or engaging in reasoned dialogue in order to bring about reforms at Cornell or in their wider communities. Leftists produced plenty of leaflets and other information, increasingly in the name of anti-capitalist or anti-imperialist ideals, aimed at attracting more students to their side, but their unstated goal was expression far more than persuasion. Persuasion by its very nature proceeds slowly, whereas student demands were expected to be met immediately. Even negotiations were frowned upon as likely to lead to unacceptable compromise.
The adolescent character of this sort of rebelliousness displayed itself clearly at the largest campus protest to hit Cornell back in the 1950s, an era that most SDSers would have thought bore little relationship to the antiwar and antiracism activism of the later 1960s. The issues in contention at that time concerned university rules requiring chaperones for women students and prohibiting parties in off-campus student apartments. In 1958 over 1,000 students gathered outside university president Deane W. Malott’s house, shouting obscenities and throwing rocks while the president was meeting inside with the chairman of the school’s board of trustees. Students chanted, “We have parents now, who needs more?!” It cannot be simply coincidental that the occupation of Willard Straight Hall on April 19, 1969, an act that turned the university on its head, would take place during Parents’ Weekend.
African American Students Join the Fray
Meanwhile, alongside these actions by mostly white activists, radicals among Cornell’s African American students were pursuing their own agenda. When James A. Perkins succeeded Deane Malott as Cornell’s president in 1963 during the height of the civil rights movement, he quickly undertook measures to increase the number of African American students. From four black students admitted to a freshman class of 2,300 in 1963, their numbers grew to 94 in the 1968 incoming cohort. By the spring of 1969, Cornell’s undergraduate and graduate student population included 259 African Americans.
As was true among the white students, radical activists never comprised more than a relatively small minority within the total black student population. But the situation black students faced of being a distinct racial minority on a large campus, together with the heightened racial consciousness that came with the rise of the black power movement in the summer of 1966, meant that radical leaders were able to attract a significant following during this time. Coercion of more politically moderate individuals, especially ones who tried to maintain personal relationships with white students, also played a role, as a number of black former students and others whom Downs interviewed reported. The Afro-American Society, founded in early 1966, typically had 50 to 100 members but could occasionally bring as many as 150 students to its meetings and actions.
Black students at the time faced incidents of racial prejudice and cultural misunderstanding. Examples included a derogatory remark with racial overtones made by some white players toward some black players during tryouts for the men’s freshman basketball team, roommate conflicts in the dorms between white and black women over the procedures involved in fashioning Afro hairdos, and arguments over what music should be played on the cafeteria jukebox. Cornell’s popular fraternity system produced the biggest incident. While some African Americans, including several AAS leaders, belonged to predominantly white fraternities, many black students encountered barriers at the time of rushing. In October 1966, midway into a dance party at one fraternity house, a doorman began charging blacks an entry fee that he waived for whites. The Interfraternity Council was sympathetic to Cornell’s black students and quickly carried out an investigation. Its judicial body found that, while the fraternity did not originally set out to exclude black students (earlier in the party both white and black students had been admitted), discrimination had occurred. In response, the council placed the fraternity on probation for a year and then co-sponsored (with the AAS) a “Soul Week” on campus that brought black power advocate Stokely Carmichael and other national figures to Cornell. Nevertheless, this incident led to the formation of a racially exclusive residence for men and, a little later, another for women (Wari House) for those black students who wished to move to them. As described in a slightly more detailed account of this episode than Downs’s, written a year after the Cornell takeover by AAS member Cleveland Donald, Jr., for an anthology, Divided We Stand: Reflections on the Crisis at Cornell, the fraternity’s blatant act of discrimination had a radicalizing effect on the university’s black students.
Militant actions did not start right away, but a building takeover at predominantly black Howard University in spring 1967 on behalf of a black studies curriculum, among other issues, spurred on African American students elsewhere. Black activism at Cornell, much like its white radical counterpart, now acquired the character of making nonnegotiable demands and using the power of group intimidation to get results. Black student radicals had the added tool of appealing to the social guilt felt by sensitive white students, faculty, and especially administrators, a factor cited by many of the people Downs interviewed for his book. Donald recognized this factor as well in his own essay, though he added that “the act of haranguing whites” also produced frustration for AASers, “because blacks knew that whites enjoyed the punishment…[and] by enjoying the punishment, deprived blacks of the therapeutic value inherent in the act of punishing.”
The First Black Radical Actions
The new militancy found expression principally in the demand for an African American studies program at Cornell. An economics course on development offered in the spring 1968 semester provided the immediate catalyst. Although the course instructor, a visiting professor from the Philippines, was not explicitly addressing the situation of blacks in the United States (but rather poor people in general), he made a number of statements in class about poverty that three radicals among the seven or eight African Americans in the class found to be racist. When the professor made it difficult for these students, or any students, to raise objections in class (though not out of class), the three radicals took matters into their own hands.
They registered a complaint with a dean and then with the Economics chair, asking for an apology from the professor, his dismissal, and the appointment of a black professor in his place. A couple of weeks later, after spring break, the radical students returned to the classroom, taking over the podium to read a statement. Chaos broke out before the professor canceled the class. The radical students then gathered about 40 to 60 supporters, marched over to the Economics Department, and took over the office. There they held the chair hostage for the next seven hours (they also briefly detained three secretaries), declaring the office closed until a mechanism had been established to address their three demands. With student supporters on the outside and plainclothes campus guards called to the scene, the situation grew increasingly tense. At one point a fight broke out when five black students pushed past the guards to join those inside. Two guards and one student were injured in the melee.
The occupation came to an end when the university provost agreed to meet with the students to discuss their demands, hire an outside lecturer selected by the AAS, and investigate the whole matter. A nine-member commission composed of faculty, administrators, and students (the Williams Commission) expeditiously carried out the ensuing investigation, and unanimously concluded that the economics professor had not been guilty of overt racism, although a minority of three believed that unconscious or institutional racism had been at work in some of the professor’s presentations. The commission also censured the radical students’ actions in the episode, referring them to Cornell’s judicial board for adjudication while recommending against severe punishment.
Despite the findings of the Williams Commission, however, the university administration decided not to charge any students in the disruption, and the provost even indicated to leaders of the AAS that he and other administrators took their side morally. As the dean of the College of Arts and Science put it in a public report at the time, “[The economics professor] and I and most whites are racists in some degree. We are all in some degree ignorant of and insensitive to the plight of black people….I think they [black students] have the right to demand of us…that we make an immediate and resolute effort to teach ourselves about black problems, and that we dedicate ourselves as an institution to finding solutions to these problems.” The willingness on the part of Cornell’s administrators to overlook these unlawful campus actions, equally true for some of the disruptions caused by white radical students, would contribute enormously to the armed takeover of Willard Straight Hall one year later.
The Push for a Black Studies Program
In hiring an AAS-approved outside lecturer in the wake of the Economics Department takeover, Cornell in effect took the first step toward establishing a black studies program at the university. In the fall of 1968, the university set up an advisory committee of faculty and students to plan the program, but AAS radicals soon articulated their own proposal, characteristically set forth as a list of demands. Rejecting the advice of faculty and several African American students on the advisory committee to structure the program as an interdisciplinary major with professors hired by contributing departments, the radicals insisted on an autonomous College of Afro-American Studies with powers over its own finances and hiring. In early December, the radicals arrived at the advisory committee meeting with close to 50 supporters and announced that the planning group had been disbanded in favor of a new all-black body, the assembled group voting 50-0 in favor of the change. On the same day, six radicals precipitously evicted a professor and two employees from a building that the university had already designated as the program’s future headquarters, to be occupied a year later when the program was expected to start up. Three days later, the AAS presented its autonomous college plan to President Perkins and demanded his approval within 24 hours.
When the “deadline” passed without the president’s authorization, AAS radicals initiated what became known as the “December actions.” Seven militants pointed toy guns at students in front of the student union and disrupted traffic. They invaded the administration building, committing petty vandalism (knocking over a sand-filled container of cigarette butts and two candy machines, discharging a fire extinguisher, and banging on office doors). Back at the student union, they surrounded a campus police car, striking its hood and roof, and barged into a closed dining room pretending to demand service. The following day, 75 African American students, accompanied by some children, staged a brief sit-in in front of the president’s office. When Perkins offered to speak with them and sent out a cart of food, they refused his offer and overturned the food cart. Another group of 30 went to three different campus libraries, removing an estimated 3,700 books from the shelves, dumping them in front of the circulation desks, and proclaiming that they had “no relevance to me as a black student.” The December actions came to an end a day later, when a radical contingent delayed that evening’s basketball game by marching across the court while playing music. It would be the reprimands of three AAS students involved in the toy-gun harassment episode, handed down by the student-faculty judicial board after a nearly five-hour meeting that lasted until 2:00am on April 18, that precipitated the Willard Straight takeover the following day.
The SDS and the Afro-American Society Join Forces
But the intervening four months between the December actions and the judicial board’s decision had not been devoid of additional and even greater provocations. The new semester on campus (spring 1969) brought SDS and the AAS together in two protests that turned violent. The goal of both protests was ending the university’s perceived support for the apartheid regime in South Africa through the school’s investments in banks that did business in that country. For Cornell, the principal bank in question was Chase Manhattan. Towards the end of February, Cornell’s international studies program sponsored a four-day symposium on the subject of South Africa, and trouble arose at the first evening session when an SDSer tried to interrupt a liberal South African defender of apartheid by asking the audience to decide whether he should be permitted to keep talking. Only the intervention of another SDSer, a law student who appealed to the audience to uphold the principle of free speech, allowed the speaker to continue. But at the keynote event two evenings later, held at the Hotel Administration School’s Statler Auditorium, President Perkins did not fare as well. The president had earlier promised to use his remarks introducing the evening’s main speaker to explain the trustees’ reluctance to sell Cornell stock in the Chase Manhattan bank, and SDS and AAS members in the audience looked forward to the opportunity to make his position appear weak.
Even before Perkins could get to the podium, an AAS leader grabbed the microphone and an SDS leader shouted from the audience to demand that the president make good on his promise to explain the university’s investment policy and either break with it or defend it. That much was planned, but what happened next was not. As Perkins began to speak, one AAS member moved from the side of the stage to the lectern and lifted the president up by the collar. Black students in the audience began to beat drums they had brought with them, but soon boos from the audience took over. When a safety officer approached the stage to help free the president, another AAS member moved in from the other side of the stage, pointing a two-by-four board at the officer to stop him. A shaken Perkins was soon released and escorted off the stage to be driven home. The crowd in the auditorium was visibly shocked by what had occurred, and most cheered when a black South African anti-apartheid leader rose to condemn the two attackers, as did an SDS leader. But Downs quotes another eyewitness, an administrator, who observed that as the evening went on and more people spoke, “It was amazing as well as very disturbing to see the reaction of many members in the crowd change from one of concern about the uncalled-for treatment of the President to one of almost outright anger that the President didn’t remain in order that they could criticize him publicly.”
A less ugly but still violent protest broke out a little more than a week later, when about 200 SDSers and a considerably smaller number of AASers joined together to stop Chase Manhattan representatives from recruiting future employees at Malott Hall, home to Cornell’s business school. The demonstrators forced their way into the room where the recruiters’ table had been set up. A campus patrolman later recounted what happened next: “When we got totally overrun, I got pushed, I got knocked down on the floor, and [there was] glass all over the place….There were ten or fifteen students. I mean, they just literally chomped all over tables, literally, everything went flying. It all happened just, whoosh! So fast!…I went right through a window, head first….I could have been killed…Several of the recruiters that were there that were sitting in the chairs, I mean, their chairs went over backwards, they just left their briefcases and everything and just walked away.” The university cancelled Chase’s recruiting efforts for the foreseeable future.
Violent Acts and Cornell’s Response
Neither the disturbance at Statler Auditorium nor the one at Malott Hall resulted in any university judicial action. Proceedings against the two individuals who had taken part in the physical attack on President Perkins might have occurred, except that one of these men abruptly left the area after being cited and, perhaps more importantly, was expelled from the AAS, while the other had already dropped out of school. Sporadic cases of violent assault in fact hovered around the edges of the AAS’s activism at Cornell. In a fierce conflict between two factional leaders of the AAS that broke out in fall 1968, one small group went after another with guns and knives, and both sides were armed with chains, even if the only explicitly violent result was a smashed car window. During the December actions, an AAS member, one of the two men later involved in the attack on Perkins, struck a Cornell Daily Sun reporter in the face and roughed up a photographer after noticing them in front of the building the organization had just seized as the future headquarters of the black studies program.
The worst outbreak of violence occurred over the weekend following the Malott Hall protest. Three white students were assaulted on campus. In two cases the victims were attacked from behind but were able to identify their attackers as black; the third victim was left unconscious for four days from head injuries and was unable to remember anything about the assault. No suspects were ever identified, but an anonymous letter published soon after in the Sun under the title “One Black’s View” expressed “shock” and “shame” that “some of my brothers have found it necessary to attack white students.” As Downs notes, “He or she then claimed that those black students who opposed the AAS’s direction of action were afraid to speak out. ‘Even though I am black, if I signed my name to this letter, I would be intimidated. I have seen it happen to others.’” The perpetrators of these violent acts probably comprised a tiny minority within the AAS, but the fact that such actions had taken place and were widely discussed on campus—to which might be added the outbreak of a number of fires of unexplained origin—enhanced the frightening quality of all the Cornell protests. In the background, too, was the violence taking place in the country at large, most especially the assassination of Martin Luther King (the news of which was reported just a few hours after the Economics Department takeover had ended), which provoked anger among African Americans everywhere.
The context of this violent era of social change helps explain why Cornell administrators responded so timidly to the radical actions of the university’s white and black students. In a few cases the university’s judicial system had reprimanded student protesters and even placed some on probation. This was true for the demonstrators in the ROTC and marine recruitment altercations (in the latter case 129 students received reprimands), as well as for the few students charged in the December actions. But, as Downs points out, over time the mixed faculty-student judicial boards (which themselves underwent structural change during this era) lost legitimacy, not because they lacked fair-minded and dedicated personnel but because the growing influence of left-wing ideology undercut the value of individual responsibility in favor of group accountability. And if a group, like African Americans, was seen as a historical victim of prejudice, then that group’s rule breakers deserved to be treated with special leniency—at least that is what many at Cornell, including its leading administrators, thought. By the time of the Willard Straight takeover, according to “many sources” whom Downs consulted, “the administration had adopted a ‘hands off’ policy when it came to potentially illegal actions of dissident students, especially blacks.”
The same sort of compensatory thinking, which could never be openly acknowledged, led Cornell’s administrators to avoid speaking honestly to the faculty about most of their educational policies toward minority students. These covered everything from the university’s admissions requirements (which were altered for incoming African American students) to the president’s final proposal for the new black studies program, submitted to the university’s board of trustees in early April 1969 and reflecting most of the AAS’s original demands. The Cornell administration’s weak and deceptive style of leadership helped strain its relationship with the faculty no less than it encouraged continued student disruptions, even after the board of trustees approved the new Afro-American Studies Center at its April 10-12 meeting.
A Deceptive Cross-Burning Incident
The Cornell administration, however, was not alone in its reliance on deception to further its aims. With the trustees’ acceptance of the new black studies program, the AAS’s principal goal, the only demand of the society that remained unfulfilled was that none of its members be disciplined for their actions the previous December—actions needed, as they saw things, to bring that program into existence. Having come this far in obtaining everything it wanted, the AAS must have felt there was no reason to back down now on its final demand. In addition, the society’s group ideology, in which all acts were deemed collective in nature, virtually required that it mount a major demonstration that would rescue the three cited members from their anticipated reprimands. Perhaps sensing, however, that a critique of “judicial racism” might not provide sufficient justification for the audacious step that the AAS was now planning, some radicals—how many and who they were is not known—in all likelihood decided to add the provocation of a cross-burning in front of Wari House, the black women’s residence, together with a brick thrown through the residence window a little before 3:00am on the morning before the AAS seized the student union building.
The circumstantial evidence behind the claim that these events were staged is overwhelming in Downs’s account, although the truth of the matter was probably known to only a few in the AAS. Many people, from sympathetic university officials to police officers from the town of Ithaca, suspected a ruse at the time. There were no physical traces pointing to the involvement of non-Cornellians, to which may be added the fact, omitted from Downs’s book, that the wood used to construct the cross was purchased from art supplies sold at the campus store, as a subsequent report by the university’s trustees revealed. With the passage of years, more and more testimonies by individuals, both black and white, involved in the Cornell takeover have accumulated to buttress the claim made by then-university provost Dale Corson in a 1996 interview that he was “99.9 percent sure” that it was an inside job. In April 1969, however, nobody dared voice these suspicions, and the appearance of such an overtly racist act added momentum to the student rebellion, as it was cited again and again by participants in favor of overturning the reprimands. In her memoir, Ithaca Diaries: Coming of Age in the 1960s, Anita M. Harris wrote that a group of Jewish students issued a statement pledging their support for the AAS based on the “full [historical] implications” of such a “vile act.”
The Takeover
The takeover of Willard Straight Hall was not without violence, even though rifles would only be brought into the building later on the first day of the occupation. At the outset, some of the AAS students were armed with chains, knives, and clubs. Arriving at around 5:00am on the morning of Saturday, April 19, the occupiers roused and expelled the 28 parents who were staying in the building’s upstairs hotel rooms for Parents’ Weekend. Some of the parents endured insults and were compelled to exit their rooms in their nightclothes, leaving their belongings behind. All were led down a long flight of stairs to the building’s garbage area, where they were forced to jump off a three-foot loading dock. Though none was injured, most were left shocked, frightened, and angry. During the occupation itself, a fair amount of vandalism occurred: damage to the doors and the contents of the visitors’ rooms, to the locks on vending machine coin boxes (from which $1,000 was taken), to interior floors and paintings, and to stores of food from the kitchen.
The AAS began bringing the infamous rifles (and some hatchets) into the building about eight hours into the takeover, after 25 white fraternity men had entered the student union through a side window in an attempt to break the society’s hold over the building. In the resulting melee, the occupiers were able to repel the fraternity men with only slight injuries on both sides, but this forcible effort to end the takeover added to the AAS’s fears—unfounded, it turned out—that whites from the surrounding community, including sheriffs’ deputies or even the national guard, were planning an armed attack. The AAS justified its introduction of rifles on grounds of self-defense (the New York state legislature would make the presence of guns on a university campus illegal only after the Cornell rebellion), but “self-defense” could be asserted so aggressively as to carry the potential for violence itself—two days later, an AAS leader threatened that if Cornell’s faculty did not reverse its vote on the reprimands, various of its “racist” members were “going to die in the gutter like dogs.”
Just as AAS leaders manipulated the society’s own membership by staging (or, at the very least, failing to repudiate) the phony cross-burning incident in advance of the takeover, SDS leaders (from its “Action Faction”) carried on secret planning of their own to ensure that the predominantly white organization would rally behind the anticipated occupation. Downs demonstrates that a number of these white radical leaders had been alerted to the planned takeover by their African American counterparts several days in advance. A few had purchased rifles for the AAS leadership several months earlier. By 7:00am on the first day, SDS had thrown up a picket line around the student union as “protection” for the occupiers inside, and the number of these dedicated supporters grew as the day wore on.
In truth, little manipulation of SDS’s membership was needed to bring about this support. Ever since the widely reported and explosive student rebellion at Columbia University the previous spring, most members had been looking for some way to provoke a similar confrontation with Cornell’s administration. In addition, nearly all SDSers accepted the radical critique of the university’s judicial system as inherently rigged against black students, thus justifying in their minds the AAS’s demand to nullify the reprimands. Throughout the first three days of the rebellion, SDS managed to speak for an ever-increasing number of white students, who came to see the takeover, and the tense showdown between administration and faculty that developed after the administrators’ initial agreement with the AAS, through the eyes of campus radicals. At one mass meeting of 2,500 students, the few who voiced misgivings were drowned out by chants of “Fight racism—meet the black demands NOW!” In the background lay SDS’s frequently voiced threat to take over the university’s administration building (Day Hall) if the faculty failed to reverse its first vote refusing to go along with the nullification agreement. Most students at the school seemed to endorse that plan.
In the end, however, SDS became a victim of its own success. Once guns had been brought into the occupation, the Cornell administration never wavered from its determination to accede to the AAS’s demand concerning the reprimands. “Saving lives,” in the words of one of the university’s main negotiators, became the administration’s sole objective. Yet even apart from the genuine fear of terrible violence had Cornell, for example, sought an injunction to vacate the student union backed by the threat of law enforcement action, the university’s record in the years leading up to the crisis left the administration positioned to do nothing but capitulate. It had no intellectual resources at its disposal to convince the student body, white and black alike, that reforms in university policies cannot come about through intimidation and force without sacrificing essential elements of any civic community, much less a university. It had given in to these sorts of actions too many times before.
Facing such a weak administration, SDS never got the confrontation it desired. As the number of its student supporters grew into the thousands, young people of more moderate dispositions inevitably came to dominate the huge meetings that took place. These students accepted the radicals’ interpretation of the AAS’s goals and, remarkably, even most of the society’s tactics. Doubtless the cross-burning incident played a large role in shaping this consensus. But when it became apparent on the evening of the third day (April 21) that the faculty was likely to overturn its initial vote, this great mass of moderate students blocked SDS from moving forward with its projected administration building takeover in favor of giving the faculty one more chance to decide. The following day, the faculty endorsed the agreement, although most said they did so only out of fear of a worse outcome if they had not. Six thousand or more students joined AAS leaders and President Perkins at Barton gymnasium in cheering this resolution.
Legacy
Eldon Kenworthy, a young government professor who had done more than anyone else to articulate the moderates’ position at a critical moment, later quipped in one of the Divided We Stand essays, “The Mensheviks had won,” a reference to Lenin’s less ruthless but still revolutionary opponents at the time of the Russian Revolution. The analogy was apt, because by arbitrarily overturning the reprimands, the Cornell community had broken, albeit nonviolently, with a fundamental principle of liberal democracy: that all mentally competent individuals, regardless of status or ethnic background, be held accountable to the same set of laws. Confusion on this score would remain a lasting legacy of the Cornell rebellion, particularly because the campus judicial system that the university had in place in the late 1960s, Downs shows, had never been racist to begin with.
Beyond this confusion, the student rebellion produced few lasting results. Experiments in greater student participation in university governance that issued from the “Barton Hall community” proved fleeting. Black student activists achieved an Afro-American studies program, but that goal had been won before the dramatic building takeover occurred. The new program also suffered, Downs points out, from the extreme separatism of the AAS’s campaign to bring it into existence. Had the program been structured less autonomously and brought more fully into relationship with the university’s academic disciplines, as was the case with a similar program established around the same time at Yale, it might have gotten off to a stronger start. SDS declined in importance in the years following the takeover. Several prominent Cornell faculty members resigned immediately, while quite a few more began to look for positions at other schools. Downs quotes a number of professors who stated that they now began to edit or censor their lectures for fear of incurring student disapproval, knowing that they could not count on the university administration to back up their academic freedom.
The ethical shortcomings of the 1969 Cornell student rebellion, which appear so glaring today, were anything but clear to us radical activists at the time. In those days, what were taken to be moral ends—advancing racial justice and ending the American war in Vietnam—justified an abundance of coercive means, as a leading Cornell activist, Bruce Dancis, acknowledges in his thoughtful memoir, Resister, although his criticisms do not go as far as mine. We thought little about the negative consequences of the tactics we adopted and probed not very deeply even the positive goals we pursued—what, for example, would Vietnam be like if U.S. forces withdrew?
Downs helpfully warns against overemphasizing the differences he has recorded in the tactics adopted by Cornell’s black and white student radicals. One of the AAS members he interviewed in 1997, he tells us, “would often punctuate her recollections of events with the exclamation, ‘We were so young!’” Indeed, the category of youth offers greater insight into the era’s excesses than that of race. The Cornell events formed not just part of a national outburst on American college campuses but also an element within a worldwide explosion of youthful energies, ranging from students opposing communist tyranny in Prague to those who provided the shock troops for Mao’s murderous Cultural Revolution. Perhaps Cornell economics professor George Hildebrand put it best at the time when he castigated the university administration’s “incredibly naïve and romantic permissiveness that prevailed over the last three years,” stemming from its “misplaced faith in youth.” How many veterans of those student days, now in their late sixties or early seventies, would be willing to agree?