We’ve just emerged from another cycle of what my father-in-law likes to refer to sarcastically as “Hallothanksmas.” I doubt he’s the only person to use that label to capture the smearing together of Halloween, Thanksgiving, and Christmas from early October through New Year’s Day. Indeed, it’s become commonplace to observe that our national marketplace appears to push the material elements and practices of these holidays uncomfortably close to each other. Charlie Brown and the gang complained about this exact phenomenon close to 40 years ago, and it has only become more obvious ever since.
I confess that I’m one of the complainers as well. I feel that holiday traditions may, and often do, play a determinative, even normative role in our lives, whether we realize it or not. Thus there is value in being able to conceive of and respond to them distinctly. I feel that way because traditions are deeply associated with many other things I take seriously: local engagement, cultural identity, historical memory, familial attachment, and other “communitarian” goods. These don’t constitute a perfectly indivisible bundle, of course, but “traditionalism” is a thread that runs through them and to a degree connects them. Speaking up for tradition in our economically globalized and hyper-mobile world may be essential to making a case for the communitarian perspective as a whole.
Critics of the attempt to build up traditional beliefs and all their material accouterments aren’t necessarily opponents of holidays; I’m not trafficking in “war on Christmas” accusations here. Instead, their disagreement is usually with the moral claims of traditionalism in general — the idea that giving recognition and support to traditions can serve as both a personal and a public good, and consequently that observing traditions may carry some moral obligation. One of these opponents’ assertions is that what appears to adherents of various traditions as morally worthy is really only subjectively perceived as such. Tradition, on this view, is constructed out of nostalgia, the result of paying undue attention to isolated moments that can be prettified in our memories. Those who don’t live in a fully reactionary environment find themselves put on the spot: if their lives are in any way characterized by pluralism, then they must acknowledge that there is an element of willful construction involved in how a traditional belief or practice comes to include (and exclude) whatever it does. And that, supposedly, undermines the theoretical force of the moral claim made on behalf of traditions. How, the argument runs, could a subjectively experienced and consciously elaborated-upon moment, plucked from the whole historical sweep of events, be construed as truly serving normative personal or public ends? There is no reason to think that the resulting practice or belief is anything other than somewhat arbitrary.
A wonderful summation of this perspective was written by Scott McLemee several years ago, in an essay celebrating the Seinfeld-inspired “holiday” of Festivus. The fact that I just put “holiday” in quotation marks is, in a sense, McLemee’s point: Festivus is so wholly manufactured, so completely a creature of the mass media and the narcissistic world of ironic detachment, that it can’t be commemorated without the observance itself becoming a comment on the constructed nature of all “holidays.” Festivus, McLemee wrote, is the “postmodern ‘invented tradition’ par excellence.” The implication is that all traditions are equally invented. Thus, narratives that presume some kind of moral authority associated with their maintenance, and that talk about change in terms of decline and loss, deserve the postmodern puncturing that Festivus provides. McLemee invoked the Marxist historian Eric Hobsbawm[1] to assert that the traditions we associate with holidays are primarily indicative of our historical position. Only when they are no longer binding, no longer economically necessary — in other words, only once the world has sufficiently modernized — do any of these holidays actually suggest anything that could be consciously expressed as “traditional.” As McLemee put it:
Once upon a time — let’s call this “the premodern era” and not get too picky about dates — people lived in what we now think of as “traditional societies.” Imagine being in a village where few people are literate, everybody knows your name, and not many people leave. A place with tradition, and plenty of it, right? Well, yes and no. There are holidays and rituals and whatnot. As spring draws near, everybody thinks, “Time for the big party where we all eat and drink a lot and pretend for a few days not to notice each other humping like bunnies”….And yet people don’t say, “We do X because it is our tradition.” You do X because everybody else around here does it — and as far as you know, they always have. Not doing it would be weird, almost unimaginable.
But then, starting maybe 300 years ago, things got modern….Well before Queen Victoria planted her starchy skirt upon the throne, people were nostalgic for the old days. And so…they started inventing traditions from bits and pieces of the past. In the 19th century, for example, folks started singing “traditional Christmas carols” — even though, for a couple of hundred years, they had celebrated the holiday with pretty much the same hymns they sang in church the rest of the year. In short, if you say, “We do X because it’s traditional,” that is actually a pretty good sign that you are modern. It means you have enjoyed (and/or endured) a certain amount of progress. What you are really saying, in effect, is, “We ought to do X, even though we sort of don’t actually have to.”
To be fair to McLemee, this really wasn’t so much an argument against traditionalism as it was a snark about it; as he concluded: “We gather with family at Christmas or Hanukkah in order to recapture the toasty warmth of community and family. And because, well, we have to.” But there is an assumption underneath the snark, an assumption holding that the ability to meaningfully affirm things through “mere” traditional practices and materiality depends upon a “naïveté” which has been destroyed by the self-consciousness of modernity. Talk of “tradition” therefore presumably means little more than aspiring to some kind of “second naïveté,” to use Paul Ricoeur’s phrase,[2] one that covers up our constructive role in establishing rituals and observances in the first place. The point is that such aspirations are flawed and a little silly; acceptable in their limited place, perhaps, as situated resources that individuals inclined to nostalgia can make use of if they so choose, but problematic if anyone starts using them in a way that might actually involve shaping public options or personal desires.
While there are other varieties of the anti-traditionalist position, this is perhaps the most defensible one. It isn’t an outright rejection of the communitarian claims made on behalf of traditions, but it is a fundamental weakening of them, such as one may see in the work of Will Kymlicka.[3] He presents “culture” as a repository of stories, behaviors, schemes of judgment and valuation, and the like, all of which ought to be available for individuals to enter into or exit from according to how rewarding they find them. This leads him to speak of the importance of positive action to support various traditional communities, beliefs, and practices, such as providing for group-specific rights of various forms. That certainly isn’t an unfriendly position to traditionalism, as such, but it is still a weakening. It presents traditions as a tool for individuals to use or disregard, not as anything constitutive of individuals. The argument thus presumes moral reasoning to be, ultimately, a kind of individual calculation, pushing the whole issue in an explicitly economic direction.
Much, however, is elided in this view. What if the primary question pertaining to the value of tradition is how one conceives of practices, how one conceives of flourishing, and how one chooses among the former to achieve the latter, in the first place? This is a point strongly made by the philosopher Charles Taylor, who, in a debate with Kymlicka over the ability of schemes of individual rights to protect cultural traditions, argued straightforwardly for the primacy of tradition:
The liberal accords a culture value as the only common resource of its kind available for the group in question. It is the only available medium for its members to become aware of their options. If these same individuals could dispose of another medium, then the case for defending the culture would evaporate. For the people concerned, their way of life is a good worth preserving; indeed, it is something invaluable and irreplaceable, not just in the absence of an alternative, but even if alternatives are available. The difference comes out clearly in the issue of long-term survival. People who have lived in or near French Canada know the resonance of this goal of survivance…The goal that unborn people, say, my great-grandchildren, should speak Cree or French or Athabaskan, is not one that Kymlicka’s liberalism can endorse….The people of French-Canadian ancestry, now assimilated in New England, are doing just as well as any other segment of the U.S. population in leading their lives in the English-language medium they share with their present compatriots. But the loss from the point of view of survivance is clear.[4]
Traditions, perhaps particularly as instantiated in holidays, form a part of the “medium” (historical, linguistic, moral, and otherwise) through which we interact with and make choices about the beliefs and practices available in the world. To reject the idea that traditions contribute to this collective background, and to reject the idea that they contribute importantly enough to potentially warrant some level of both personal and civic obligation to them, is to grant too much weight to the supposedly revolutionizing idea that this medium is a “constructed” one.
Consider Hobsbawm’s argument again. To work, it had to assume that there was a privileged historical moment when beliefs and practices endured without a consciousness of change, and without interpretive responses to it. But that is a strange notion; it depends on a kind of materialist absolutism, wherein we assume that no “real” self-consciousness existed until the critical innovations and economic revolutions of “modernity” (meaning the eighteenth century, or thereabouts). It is not as though the holidays and traditions of the premodern world somehow existed without interpretation. The constructive identification of rituals and observances with particular ends has always been a part of their own evolution, and of their celebration.[5] The increased subjective awareness that attends our own rituals and observances does not mean that our appreciation of them is categorically different from what came before; we may well be inventing something when we celebrate holidays today, but whatever we come up with isn’t necessarily arbitrary. Our inventing may be better understood as a kind of “adaptive remembering” — and potentially every bit as morally valid as the innovations of a hundred or even a thousand years ago.
To be sure, increased pluralism makes us into interpreters and inventors of a significantly different sort from those of earlier centuries. But this difference is not necessarily a tradition-shattering realization of arbitrariness; rather, it is what Ricoeur was getting at with his idea of a “second naïveté,” which I mentioned earlier. As he wrote in The Symbolism of Evil:
In every way, something has been lost, irremediably lost: immediacy of belief. But if we can no longer live the great symbolisms of the sacred in accordance with the original belief in them, we can, we modern men, aim at a second naivete in and through criticism. In short, it is by interpreting that we can hear again. Thus it is in hermeneutics that the symbol’s gift of meaning and the endeavor to understand by deciphering are knotted together.[6]
If modernity has meant anything, it has meant a change in our accounting of subjectivity. We can now think of ourselves as standing apart from our received “medium” of evaluation. And so the old naïveté, with its “immediacy,” won’t do any longer. But the result isn’t necessarily a radical change in how we orient ourselves toward traditions; it is a difference in the environment within which we do it. It’s the difference between someone who has only ever been immersed in a single musical tradition making distinctions between good and bad musical expressions, and someone who has been introduced to a plurality of musical traditions and must now make those distinctions knowing that the evaluative criteria provided by their own tradition can themselves be evaluated. So now we have to “wager” on interpretation; we have to use it self-consciously and therefore critically. But this does not warrant labeling all traditional claims as “subjective” and therefore incapable of playing any kind of normative or constitutive role in how we live our lives, much less in how we mark the calendar; to do so would require a much wider, much more radical claim about the nature of our consciousness. When has interpretation ever not been involved in our orientation to the world?
Unfortunately, many defenders of tradition either fail to recognize or refuse to take seriously the changes of history; their preference is to reify particular elements of a tradition into static performances or professions of belief, from which any deviation would be catastrophic. Charles Taylor struggled with this when he attempted to articulate a defense of traditionalist thinking while addressing controversies over cultural accommodation in Quebec (for my comments on the report submitted by Taylor and Gérard Bouchard, see here); Christopher Lasch also discussed this tendency, criticizing much communitarian argument as trafficking in lazy sociology, drenched in a nostalgia for exactly the kind of stable, traditional (and unreal) community which Hobsbawm’s argument implicitly relies upon.[7]
Lasch was by no means a complete defender of tradition, but his distinction between popular “memory” and sociological “custom” is important nonetheless. “Memory,” as he presents it, is that which active agents, working with and through (and, therefore, inevitably sometimes also against) their community contexts, enact and vivify (or revivify) through their collective choices. “Custom,” on the other hand, consists of actions in which the “judgment, choice, and free will” that made memory valuable are no longer necessary. Customs, in his sense, devolve “into patterns that repeat themselves in a predictable fashion.” Someone who is serious about tradition will not allow customary behaviors to get in the way of the responsible, interpretive action of “memory.”
Would such a serious person include my father-in-law, with his grumbling jokes about “Hallothanksmas”? I wonder. Around Thanksgiving, one of my favorite books to read to our younger children is Thank You, Sarah: The Woman Who Saved Thanksgiving, by Laurie Halse Anderson and Matt Faulkner.[8] It tells the story of Sarah Hale, an abolitionist, editor, and social reformer, who spent thirty years writing letters and publishing articles, trying to get the federal government to officially acknowledge (and thus hopefully resuscitate) Thanksgiving, a religious and cultural holiday which dated back to the early colonial days, but whose observance, by the mid-nineteenth century, was slowly dying out. She finally succeeded, and the book presents President Lincoln’s declaration of a national Thanksgiving Day holiday as Sarah’s greatest triumph. What should we make of that? One could, of course, dismiss Hale as a sentimental busybody. But maybe it would be better to say that she was committed to helping her country engage in a little “creative remembering.” The fact that what she accomplished was, strictly speaking, a political invention doesn’t take anything away from the moral connections it makes possible for all Americans. In committing herself to a belief and practice, and in interpretively responding to the reality of those traditions as they existed in the decades leading up to the Civil War, she contributed to the maintenance of a normative factor in the lives of her fellow citizens, a factor that might well not have been there otherwise. This is not about whether you like Thanksgiving, or find it important to distinguish it from the dominant holidays immediately before and after it in our national calendar; this is about preserving a space for both a personal and a civic recognition of what a tradition of giving thanks, however one chooses to interpret that, can mean. That’s not blindly following custom (though there may well be elements of custom involved; there almost always are); that’s doing the kind of important thinking which holidays have always made possible, whether we were self-conscious about it or not.
Well, enough of that. Off to get the Chinese New Year decorations out of the box in the garage. Can’t be too early for that; it’s become so commercialized lately, don’t you know.
Notes
[1] On Hobsbawm’s arguments about national identity and traditions, see his essays “Introduction: Inventing Traditions” and “Mass-Producing Traditions: Europe, 1870–1914,” in The Invention of Tradition, Eric Hobsbawm and Terence Ranger, eds. (Cambridge University Press, 1992).
[2] Ricoeur discusses this concept in the final chapter of The Symbolism of Evil, Emerson Buchanan, trans. (Beacon Press, 1967).
[3] See Liberalism, Community, and Culture (Oxford University Press, 1989) and Politics in the Vernacular: Nationalism, Multiculturalism, and Citizenship (Oxford University Press, 2001).
[4] Taylor, “Can Liberalism Be Communitarian?” Critical Review, Spring 1994, 259–260.
[5] Adrian Hastings considers this dynamic, primarily by way of the constitutive role of language in shaping national traditions, in The Construction of Nationhood: Ethnicity, Religion, and Nationalism (Cambridge University Press, 1997).
[6] Ricoeur, The Symbolism of Evil, 351.
[7] See The True and Only Heaven: Progress and Its Critics (W.W. Norton, 1991), 133–134, 139–143.
[8] Simon & Schuster, 2002.