On p. 250, Todorov approvingly quotes: "The man who finds his country sweet is only a raw beginner; the man for whom each country is as his own is already strong; but only the man for whom the whole world is as a foreign country is perfect." Is he correct?
Let me take a stab at a response here; I will run out of time, and then try to post more later tonight or tomorrow.
I think Doug makes an excellent point that "what, for example, people get an MA in at SAIS is called International Relations, and so is theorizing about, say, epistemic communities and hegemony of domestic systems of government" -- even though these activities have, as far as I can tell, less in common with one another than (for instance) the playing of American football and the Monday-morning quarterbacking of a game after it's been played out. At least the Monday-morning quarterbacks have the actual practice of on-field football as a point of reference; no such tight linkage is necessarily the case in IR.
Let me elaborate a bit. For me, "scholarship" is basically a synonym for "producing systematic knowledge that is in some sense valid." [There's a lot packed into that phrase "in some sense," and by "a lot" here I mean "the entire field of the philosophy of science and probably epistemology in general," but I want to set that aside for the moment.] Hence, a scholarly approach to anything involves generating systematic, in-some-sense-valid knowledge about it. This is a qualitatively different matter than actually engaging in the thing in question. Scholarship about baseball is not playing baseball; scholarship about music is not the performance of music; likewise scholarship about world politics is not "doing" world politics.
I don't think I've said anything particularly controversial yet. But where it gets controversial is the relationship between scholarship and object. We have two ideal-typical positions on this: scholarship ought to improve practice, and scholarship can't possibly improve practice, at least not directly. Rob clearly prefers door #1; I prefer door #2. Rob's position is the classic Enlightenment hope for the sciences of society: place practice on a more rational basis, achieve better results, produce a world that looks more like the world we want to live in; I think that's both dangerous and a little naive -- dangerous because it puts a potential transcendental justification for coercion in the hands of would-be reformers (after all, if the experts told us that we can do this, and you disagree, then you're either stupid or obstinate, and either way you're in the way, so forcibly removing you starts to look like a good idea) and naive because it presumes that scholarly knowledge translates more or less simply to the actual world (and once again, if it doesn't, maybe we ought to use force to make the world look more like the model . . .).
I prefer option #2 -- scholarship can't possibly improve practice, at least not directly -- in part because people claiming to have Reason/God/Truth on their sides ("Jesus/Buddha/Muhammad/science likes my policy better!") have been responsible for most of the senseless death in human history, in part because systematic scholarly knowledge is by nature an abstraction (and sometimes a severe abstraction, in which the actual practice of anyone in particular disappears -- the sports analogy here would be to sabermetric analyses of baseball, and we've seen what happens when actual baseball teams try to directly implement strategies that look valid sabermetrically) and therefore not fit for any sort of direct translation into practice, and in part because scholarly knowledge is irreducibly perspectival and thus does not seem to me to be a good solid basis for decision-making (although it can certainly inform decision-making as one element among others).
A concrete example. Rob references the Counter-Insurgency Manual; I can see two scholarly things to do with that work: conduct a discourse analysis of how it implicitly and explicitly enframes the issues of personhood and rights and the like (my preference), and conduct research on the manual's perhaps implicit claims about the success of various tactics to see if those claims stand up (maybe what Rob would prefer). Both of these kinds of scholarship can inform decision-making, the first by highlighting the ethical issues involved in our enframing of actions in one way rather than another, the second by giving some sense of likely results of courses of action. But in neither case do they tell someone involved in counter-insurgency what to do, let alone instruct someone in how to engage in counter-insurgency operations! In neither case does scholarship -- systematic, in-some-sense-valid knowledge -- solve the fundamental practical and political question of what someone ought to do in a specific situation.
To tie this back to MAs and practitioners: if I want to go into the practice of world politics, I want to learn how to make policy decisions. If I am teaching someone who wants to go into the practice of world politics, I want to give them a sense of the irresolvable dilemmas that they are going to face, and help them to develop a critical disposition that can help them grapple with those dilemmas. None of this has anything whatsoever to do with the systematic results of my scholarly investigations into anything; it has to do with exercises designed to clarify value-commitments and their implications.
If, on the other hand, I am advising policymakers, I probably want to present my results but then realize that it is not my job to make the tough decisions surrounding their implementation (as Weber said, politics is the slow boring of hard boards) and leave that to the policymakers. But that's not teaching students, it's offering a scholarly input to a policymaking process that a scholar has to remain independent of lest she or he compromise her or his detachment and turn into a partisan for one or another group or party (and thus, by definition, no longer be engaged in doing scholarship).
And in neither case does this have anything to do with "certifying idealism," which was my original point.
The part I promised to post later reads like this:
Three other things that occurred to me.
1) I still don't see the intellectual point of a terminal MA because, contra Doug, I don't see those classes operating at a level any higher than what one finds in undergrad. In fact, I teach my MA theory course like I taught my undergraduate theory classes: we read Hobbes, Locke, etc., and talk through their arguments and implications thereof. Precisely what I did and do for undergrads. So from my perspective a terminal MA in IR looks like more undergrad plus a few "policy" classes (talk about some issues, generally in a completely a-theoretical way) and some "methods" classes (basic stats -- which they probably had in undergrad already anyway -- and sometimes "risk analysis," which to a social scientist like myself just looks like bad research design and flawed data analysis). Yes, I get that this helps people get jobs. What I don't get is why it helps people get jobs, and what people think that a graduate of such a program can possibly do that a well-educated undergrad can't already do.
2) I still maintain that the most helpful thing any of us in the academy can do for any of our students is to temper their idealism. This is especially true for Americans, who tend to be pretty unreflective idealists by default. This is why political realism is actually a critical theory in the US context, since it maintains that there are limits on political possibilities, limits that can't be overcome with a little effort and a clever slogan. Again, I think that this is no different than a good undergraduate program, but if MA students are not getting that until their post-graduate work, I suppose better late than never.
3) I pity the student who comes to me hoping to be trained in job-relevant skills if their anticipated job is someplace other than academia. Academia is where I work, and I know how to do that job pretty well, so I can pass on bits of practical advice and professional wisdom. The State Department? I can find it on a map, but I've never worked there and have no desire to do so, so I am not likely to be of any use to students looking to be trained in how to succeed at State (or in any other DC institution).
UPDATE: this also got picked up by Daniel Drezner on his blog, and there's more discussion of the issue there.
Suppose for a moment that the options for organizing a political-economic system are only two, embedded liberalism and disembedded liberalism, which is to say, a market system in which the market serves non-market social purposes or a market system in which the market is as unfettered as possible and serves only its own goals of economic efficiency. Knowing what you now know about the institutional and political requisites and consequences of each system, which system would you rather have? Why?
But I don't really have a lot to say. Dumbledore's sexual orientation matters as little to me as his mother's middle name or what kind of music he likes, because none of these things really seem to make much of a difference to what he does during the course of the Potter novels. Yes, I know that according to Rowling Dumbledore's sexual orientation explains why he was so taken with Grindelwald, and thus ought to be relevant to the plot . . . except that Dumbledore's fascination with Grindelwald was perfectly well explained in the books without introducing that kind of attraction, which makes the detail about Dumbledore's preferences in partners rather irrelevant. (What sweets Dumbledore likes is, however, a relevant issue, given his proclivity for making the names of those sweets into the passwords used to access his office.)
I find it fascinating that so many people thought that I ought to have something to say on the matter. I also find it fascinating that Rowling herself waited until after the novels were completed before revealing this bit of information, as though (and according to what one can read between the lines of her public comments) she were expecting people to be uncomfortable with that fact. Certainly some people might be, particularly those who think that homosexuality is some kind of mortal sin, but most of those people probably aren't reading Harry Potter in the first place, given the conservative religious opposition to the whole series. As far as Rowling's avid readers go, I'd say that for most of them sexual orientation is as uncontroversial and ultimately as irrelevant as shoe size. They're like Harry, through whose eyes the books are largely written -- and Harry couldn't care less about such things.
Whether this is a good development or a bad development depends on your perspective, though. For some people, the absence of explicit markers of differences in sexual orientation constitutes a form of erasure: the lack of explicit identification of difference means that difference is being elided and a universal norm is implicitly being proclaimed in its place. For others, the disappearance of identity-categories like "gay" as critical pieces of information, particularly for fictional characters, constitutes a form of progress. Tough call -- and probably looks very different depending on whether you are self-consciously a part of the group not being explicitly portrayed.
Indeed, I suspect that part of the reason why the question of Dumbledore's sexual orientation doesn't seem all that important to me is that I am not interested in claiming him as a fellow-member of a group on that basis. Were I interested in identifying powerful gay male characters in literature, then it would probably bother me that we only have the author's post facto comments to go on. And I suspect that I might even be bothered by the fact that Dumbledore is not explicitly portrayed as a gay man anywhere in the books -- nothing about his experience really seems to have much of anything to do with his sexual orientation. Does this hetero-normalize Dumbledore? In a way, I suppose, in that opportunities to disrupt the reader's sense of the ordinary are foregone. But Harry Potter isn't about disruption, since it's a piece of myth or legend; rather, it's about re-establishing continuities and sustaining themes. It is not, in that sense, political: like Star Wars, its aims are to do something to the basic structure of how we engage with the world rather than to effect any specific changes within that world. And I am firmly convinced that not every work of literature -- heck, not even every work of social science or social theory -- has to be narrowly political (or, to use the term of art for political science, "policy relevant") in order to have value. I don't read the Potter novels looking to advance a political -- even an identity-political -- agenda, so the fact that the novels are very poor sources for such a thing doesn't bother me at all. A different agenda might make me considerably less sanguine about the whole thing, but I rather like the mythical (or maybe "meta-political") agenda I have and I think it would get very boring and frustrating to always evaluate everything in terms of its portrayal of some specific identity-category.
At the end of the day I suppose I agree with the position that says that when writing fictional literature one ought to reveal those aspects and details of a character's life that are actually relevant to the story. Is Obi-Wan gay? (The potential answer changes if you read the expanded universe novels, because in those novels he falls in love with a female Jedi apprentice named Siri Tachi.) Who cares? Would it matter to the story if he were? I can't imagine that it would. That being said, does the complete absence of any openly gay characters from the Star Wars universe subtly reinforce the heterosexual norm? Yes, it does. But the point of a piece of fictional literature is to tell a story, not to advance a specific political agenda. Losing sight of that is what makes for preachy novels, like (for example) every piece of fiction that Ayn Rand ever wrote -- contrast those to the works of Robert Heinlein, who had roughly the same set of values but was less a propagandist than a writer, so his stories work as stories.
So: Dumbledore's gay and I really don't care. It changes nothing.
The place we left off in class on Tuesday concerned the proper basis for policymaking, especially for the making of foreign policy. We talked a lot about "interests," but this introduces a dilemma: if states have real, objective interests, then they ought to follow them even if their publics disagree, but if states do not have real, objective interests (as Wolfers appears to argue) then the entire debate about "interests" is just a political game. This suggests the following question: is public opinion, or the will of the people, a sufficient basis for state policy? Or should we be looking for something else, something outside of what people might think, as the proper foundation for state action?
It might be easiest to try to tackle this question with an example. Alternatively, click over here and bounce off of what Liz already posted.
Hmm. I think there's a niche there.
Last week my World Politics class was having a discussion about Iran's nuclear program, and I posed the question: why is the US worried about a nuclear Iran? We talked about regional instability a bit, and then I pressed a bit on the notion of "threat" by asking why we were threatened by Iran's potential possession of nuclear weapons -- seeking to suggest the point that maybe "threat" was not just about material capacities. [This followed my usual rule of pedagogical discussion-facilitation: if I'm going to intervene, it's going to be on the opposite side of whatever group consensus is emerging.] Now, I could have made that point by talking about North Korea, or the Cuban Missile Crisis, but I saw an opportunity for a pedagogical bonus and instead asked what the second-largest nuclear arsenal in the world was. People tossed out guesses, and I said "Britain" -- a country no one had named. My point: we don't immediately think of weapons possessed by our allies as threats, even if there are a substantial number of weapons involved.
Now, the claim that Britain has the world's second-largest nuclear arsenal is, strictly speaking, wrong. Russia has a far larger arsenal, both in terms of sheer number of warheads and in terms of potential destructive capacity. As Greg rather conclusively demonstrated, there are a lot of Russian nukes out there, considerably more than we would find in Britain. Greg e-mailed me a more elaborate comparative analysis of the issue, the conclusion of which was
By these calculations, the Russian Federation has approximately 1347.85-1697.85 deployed megatons (and this doesn’t count the approximately 6,000 warheads in reserve!). Now, since Britain has a total of four Vanguard-class SSBNs, each of which carries 16 Trident II D5 SLBMs armed with up to three 100 kiloton warheads, they can have a total deployed megatonnage of only 19.2.
19.2 megatons versus 1347.85-1697.85 megatons…I sure wouldn’t want to be Britain in that fight!
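Greg's back-of-the-envelope arithmetic is easy enough to reproduce; here's a minimal sketch using only the figures from his e-mail (his warhead counts and yields, not independently verified):

```python
# Back-of-the-envelope deployed-megatonnage comparison,
# using the figures from Greg's e-mail (not independently verified).

# Britain: 4 Vanguard-class SSBNs, each carrying 16 Trident II D5 SLBMs,
# each missile armed with up to 3 warheads at 100 kilotons apiece.
uk_megatons = 4 * 16 * 3 * (100 / 1000)  # convert kilotons to megatons

# Russia: Greg's estimated range of deployed megatonnage.
ru_low, ru_high = 1347.85, 1697.85

print(f"UK deployed megatonnage: {uk_megatons:.1f}")          # 19.2
print(f"Russia/UK ratio: roughly {ru_low / uk_megatons:.0f}x "
      f"to {ru_high / uk_megatons:.0f}x")
```

Which makes the disparity concrete: Russia's deployed megatonnage exceeds Britain's by a factor of roughly seventy or more on these numbers.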
I checked the figures with some friends downtown, and the consensus was that if anything Greg's numbers for Russia were a bit low, given the possibility of mating other nuclear devices with delivery systems in the event of a nuclear emergency.
So: my assertion was wrong, as I knew at the time. (I wonder what I was thinking, making such an assertion in front of someone quite knowledgeable about the Russian military . . . hmm.) This doesn't affect the overall point I was trying to introduce, but it does illustrate that no one should ever feel the slightest bit awkward about challenging a factual assertion as long as they are willing to put in the effort to check out the data. The fact that I have degrees in political science does not mean that everything I say should be unquestionably believed.
[Indeed, there's a further wrinkle to this story: part of why I wanted to push this claim was that I had what seems like a plausible conceptual argument to back it up. If it were the case that the Russian nuclear arsenal was largely inoperable because of command-and-control issues, then it might be possible that the actual Russian nuclear capacity would be quite a bit lower than the simple calculation of yield-rates would suggest. Then we'd have an illustration of the old Ben Kenobi line -- "many of the truths we cling to depend greatly on our own point of view" -- since the issue would move to a discussion about the proper meaning or sense that we should assign to the word "capacity." But it turns out that I was just in error about this, because the Russian command-and-control system has apparently gotten a lot better in the last few years. Were it to come to a nuclear crisis of some sort, Russia would command the world's second-most-powerful arsenal by an order of magnitude. No clever pedagogy this time -- thinking the opposite was just a flat-out error on my part. Yet another reason to check my facts, and everyone else's facts: no one is right all the time!]
Note that it's perfectly okay to vehemently disagree with the entire premise of this question, if you'd like. And remember that this post, and the reflective post, are both due on Monday of next week since you have the essay due on Thursday.
Of course, things were a bit different back in 1992 in terms of the kind and amount of information that I had about the concert before going to see it. I was then, as I am now, a member of the "Paperlate" online group, but back then it was a text-only listserv. I did have access to setlists from earlier shows on the tour, and reviews of those shows, so I had a reasonably good idea what to expect when walking into the stadium. But this time around, before going to see the show tomorrow I have available to me:
1) complete and detailed reviews of every show that they've performed on this tour to date;
2) fan-produced video footage of other concerts (through YouTube, among other sources);
3) bootleg recordings of earlier shows on this tour [and yes, I am one of those people of the opinion that fan-traded bootlegs of live performances are perfectly acceptable products as long as no one is making any money off of them -- and they certainly don't cut into any profits that the band would otherwise make, since if anything they -- like fan fiction -- only make those of us obsessive enough to go out and find those recordings more excited about the official releases and performances]; and
4) "official" soundboard recordings of the earlier shows on the tour, purchased through a commercial service (http://www.themusic.com/) that is making recordings of every show on the tour available for purchase. I have two already -- Manchester and Munich -- and have the DC show on order.
Some may think this a bad thing; "no surprises?" is the complaint I often hear. But for me, not having surprises at a concert is a good thing, because it allows me to appreciate all the more precisely what is going on, secure in the knowledge that I have a pretty good idea what to expect. That way I can get my mind out of the way, as it were, and just enjoy the experience. And since Genesis concerts are such rare events, I want to make sure to enjoy this one to the fullest.
If I get inspired I may post a review after the concert.
If a state pursues wealth as its highest goal, will that make it more peaceful? Does the pursuit of wealth (instead of other possible goals that a state might pursue) lead to peace?
Note that the best answers to this question will involve both a theoretical argument about the relationship between wealth and peace (even if that argument is that there is no relationship), and some reference to empirical examples. You want to make a case as strongly as you can for your point of view, and incorporating both of these styles of reasoning will help you to do so.
Also, note that if you are arguing that the pursuit of wealth does not necessarily lead to peace, you might want to use that as an occasion to argue for the primacy of something other than wealth as a state goal and assess the impact of that on peace. Just a suggestion.
Can sovereign states agree on global standards? If so, how and why; if not, why not?
I am looking for arguments here: state your position and then support it. In other words: provide claims and warrants.
The question, in case anyone missed it: what is the most important issue in world politics today, and why?
This is not one of those "there's a right answer" questions (I try not to ask those anyway); this is more of a "make a case for what you think is important" kind of exercise.
| Name | Course Title | Hogwartian Equivalent |
| --- | --- | --- |
| Anthony Riley | Psychology as a Natural Science (PSYC-115.080C) | Potions |
| Christopher Tudge | General Biology I (BIO-110.080C) | Herbology |
| Gary Weaver | Cross-Cultural Communication (SIS-140.080C) | Muggle Studies |
| Borden Flanagan | Individual Freedom vs. Authority (GOVT-105.080C) | Defense Against the Dark Arts |
| Caleen Sinnette Jennings | Theater: Principles, Plays and Performances (PERF-115.080C) | Charms |
| Keith Leonard | Great Books That Shaped The Western World (LIT-125.080C) | Ancient Runes |
| Sarah Menke-Fish | Visual Literacy (COMM-105.080C) | former Keeper for the Chudley Cannons |
| Steven Taylor | Politics in the United States (GOVT-110.080C) | Muggle Government |
| Barb Palmer | Politics in the United States (GOVT-110.081C) | Muggle Government |
| Walter Park | Macroeconomics (ECON-100.080C) | former chief strategist for Gringott’s Wizarding Bank |
| Jeffrey Middents | Critical Approach to Cinema (LIT-135.080C) | Care of Magical Creatures |
| Patrick Thaddeus Jackson | World Politics (SIS-105.080C) | Headmaster |
| Tiffany Sanchez | Director of New Student Programs | Transfiguration |
| Jamie Wyatt | Assistant to the Director of | |
In a little while I will be heading over to this year's opening University College reception; this is the first one that I am attending as the person in charge of the program (the Headmaster, if they'll let me use that title -- the jury's still out on whether I can officially call myself Headmaster, although that's how I think of my job when it comes to the UC program). As such I will be making a few remarks.
The theme, or rather, the point of departure for my remarks will be the sorting hat that I've brought with me today. [Normally it lives in my office, which is where sorting hats are supposed to live, right?] Why bring such a thing to an opening reception? Largely because it's a reminder, here in the midst of our opening-of-school celebration, that we have all been selected to be here. We, all of us, have been sorted, and we will continue to be sorted during our time together. Indeed, you might even say that the central purpose of the University College program is to help prepare students to be properly sorted as their college career proceeds.
What do I mean that we've all been sorted? Well, every student in the UC has already been sorted at least three times: they were admitted to AU, they were admitted to the UC, and they were placed in their particular seminar. Every faculty-member was hired, and then specially selected to teach in the program. Every administrator was chosen to take on the responsibilities with respect to the program. And every parent, every family-member, has been placed in a position of giving support and encouragement to the student in their family; thus family-members also have been cast in a role in this particular drama.
I am deliberately doing something that high school English teachers would not like, and that is speaking in the passive voice. I am not identifying a sorter or a decider; I'm leaving that open. See, in the fictional world of Harry Potter, there is a magical object -- the Sorting Hat -- that is uniquely responsible for all of those decisions. There's a doer for the deed of sorting, some-one or some-thing that is responsible for placing students in houses. But in our world, things aren't quite that simple; there's rarely a single identifiable sorter that is completely responsible for where people end up. Finding oneself someplace is usually the kind of thing that can be traced to the influence of many, many people and events; generally, no one person puts anyone anywhere.
Even those of us with some authority to decide where people are going are only presenting people (UC students, in this case) with a set of options and a set of experiences; what they do with those options and experiences will largely determine the kind of person they become, and bereft of the magical insight of the Sorting Hat we mere mortals just basically stumble along doing the best we can with no guarantee that we are placing people where they really belong. So we all play roles, but none of us play definitive roles.
So why talk about "sorting" at all? Why not just talk about people deciding to be where they are? I think that the language of "sorting" is important as a corrective to our culturally-induced habit of assuming that everything that happens is somehow traceable to something we've more or less deliberately done. If we're honest, how many of us ever are in a position to know all of the consequences of the choices we make? We make decisions, and then things happen that we couldn't possibly have foreseen -- and we end up someplace different than we'd planned to be, perhaps because of the unintended consequence of someone else's choices. So, to be philosophical for a moment, we might say that choices are causally connected to outcomes, but not because of the goals of those choices. Instead, how we come to be at a certain place is a function of a very complex and subtle interaction of different factors, many of which we may not even be aware of -- and many of which, I would bet, have little to do with anyone's deliberate decision. And even those factors which are related to choice don't invalidate the notion of "sorting," because even in the Harry Potter universe the Sorting Hat takes people's choices into account when placing people into specific houses.
Speaking in terms of "sorting" also, I think, puts us into a different frame of reference. If we've all been sorted, then the question we all have to ask ourselves is: how do I make the most of this situation into which I have been placed? How do I live into this house, this seminar, this program, this university, this city, in which I find myself? If we focus on decisions instead of on sorting, we might miss large portions of the experiences that await us because we're too narrowly committed to the reasons why we thought we came here in the first place. And we thus close ourselves off to the kinds of growth that can take place when we open ourselves up to the present and its possibilities.
Sorting is about possibilities, not about clearly defined outcomes. We've all been sorted, which means that we have been placed in situations with various possibilities for growth -- possibilities that need to be seized and implemented in order to be realized. Our job in the University College is to help you specially selected students to develop your potential, and in particular to learn how to best take advantage of the experiences that await you in college. We have a lot of resources to help you do just that, and if you adopt the attitude appropriate to one who has been sorted, an attitude of trying to explore all of the myriad possibilities available to you instead of just leaping quickly to some pre-defined end, I have no doubt that you will live into your true potential and prove the sorting process right in the end.
[Not sure exactly how much of this I will use in my remarks. Stay tuned.]
The sorting hat says that I belong in Ravenclaw!
Said Ravenclaw, "We'll teach those whose intelligence is surest."
Ravenclaw students tend to be clever, witty, intelligent, and knowledgeable.
Notable residents include Cho Chang and Padma Patil (objects of Harry and Ron's affections), and Luna Lovegood (daughter of The Quibbler magazine's editor).
Take the most scientific Harry Potter Quiz ever created.
A Day in the Life of Oscar the Cat
Being-towards-death is one of the basic constituents of Dasein, according to Heidegger. Oscar, it seems, has this at least as strongly as anyone else in the hospice. Truly, cats are people too!
A: a sabbatical is a time of leave, within which a person is supposed to rest and recuperate. It is an extended respite from one's regular labors.
Q: Not working? Where did a crazy idea like this come from? It sounds vaguely socialist.
A: it's Biblical. First and foremost in the Hebrew Bible (a.k.a. the Christian Old Testament); check out Exodus 23:10-11 and Deuteronomy 15:1-6 for good blunt statements. It's related to the idea of a "jubilee," a kind of omnibus divine forgiveness and celebration during which everyday human considerations (e.g., one's daily labors) are suspended in favor of a re-membering (or re-collection) of oneself as a created being -- something that gets lost in the travails of everyday living.
Q: so this is a religious thing?
A: yes, but like so many other "religious things" it's become part of the fabric of the everyday secular world (cf. Max Weber's famous thesis on the "Protestant ethic" and how it was rationalized into the fabric of contemporary capitalist free-market economies). Indeed, in common parlance "sabbatical" now seems to mean any kind of prolonged hiatus, without any kind of religious connotation at all. (Further illustration of Weber's thesis about rationalization, but that's a different post altogether.)
Q: how can I get myself one of these sabbatical things?
A: several pathways. You could become a famous artist and then simply declare that you are taking a sabbatical, like Gary Larson and Bobby McFerrin did. You could be employed by a firm or business that regards such leave-time to be a productive investment in the long-term health and productivity of their workers. Or you could enter one of the two professions where the notion of sabbatical is quite well-established: the pastorate and academia.
Q: I get the link between sabbaticals and being a pastor, because of that whole religious thing . . .
A: You'd be surprised how many people don't get that. Being a pastor is a really draining vocation, because you're tending your flock (to coin a phrase) pretty much non-stop when you're actively serving a church. It's hard to remain energized and focused amidst the daily grind of keeping the church running -- even amidst the daily grind of preaching sermons and helping people develop their spiritual vocations. So a sabbatical, a time of "letting the fields lie fallow," is an indispensable part of being an effective pastor; one needs that time to recharge, refresh, revitalize.
Q: . . . but as I was saying before you interrupted me,
Q: why do professors get sabbaticals? Why do they need them?
A: let me split that up. Professors get sabbaticals because universities grew out of monasteries, and there are all kinds of ways that they continue to bear the traces of that parentage. Universities are very medieval places, with arcane hierarchies and hereditary privileges that are usually just taken for granted (like "tenure" and "academic freedom"). Okay, that's oversimplified; some places are in fact questioning tenure, and academic freedom is pretty much continuously under assault by political activists on both sides of the spectrum. [I am not going to give them the satisfaction of linking to any of them; google around a bit if you're curious.] But the overall point stands: universities are in a lot of ways deliberately archaic. Sabbatical leave is just another one of those deliberately archaic things.
As for why professors need sabbaticals, let me start off by saying that they do not need sabbaticals for the reasons that they usually have to give to administrators in order to get one. Sabbatical leave is not supposed to be about finishing a project or starting a new project; that's called "work," and sabbaticals are supposed to be a hiatus from work. Nor is sabbatical leave a reward for getting tenure, although the six-year timetable of most tenure processes means that usually someone's first semester or year of sabbatical leave happens in year seven of their employment, just like in the initial Biblical injunction about how often sabbaticals are supposed to occur. And while sabbatical leave does in fact contribute to having a more productive faculty workforce, that's not why I think that professors need sabbaticals either. [There are probably administrators and boards of trustees who would disagree with me about all of these points; so be it.]
Put bluntly: I believe that professors need sabbaticals for the same reasons that pastors need sabbaticals. It is so easy to lose your vocation amidst the daily grind of "being a professor": teaching classes, grading students, etc. The work piles up, and like a pastor's it really never stops, especially in the age of e-mail and IM: this is a more than full-time job. As it should be, because I am not at all sure how one would do it any other way without (to be blunt once again) betraying the academic vocation. If one is going to do this right, it takes time and effort and work, and that's why professors need sabbaticals: because they need to recharge, refresh, revitalize.
Q: so you're doing nothing on your sabbatical?
A: not exactly "nothing." I am reading some things I wouldn't usually get a chance to read if I were teaching, things that demand a bit more of an exclusive focus of attention. (Kant's Critique of Pure Reason, at the moment; Hegel and Vico and Wittgenstein and Durkheim and Dewey and James and Simmel a bit later in the semester.) I am also working on a couple of papers I didn't have a chance to write during the past couple of years; this is actually not what I am supposed to be doing with my sabbatical, I don't think, but under modern conditions one has to demonstrate that one used one's time "productively," so there you go. And I'm actually quite excited about these papers: one is the conclusion I wanted to write to my book but didn't have the time or the space for, and the other is a paper on causality that I have been toying with for about a year now, and am finally making some headway on. Also, the reading I'm doing at the moment is leading up to a paper on pragmatism on both sides of the Atlantic that I am going to be presenting at a conference in September, so in that way it's "productive" too. [Damn Protestant ethic -- can't shake that thing!] I've also been doing some things I have wanted to do for a while: reading and commenting on some colleagues' papers; teaching myself to use Final Cut Express; thinking about multi-sensory assignments for courses; and so on.
Q: sounds like you're wasting a lot of time instead of spending it thriftily.
A: maybe. But isn't that the point of a sabbatical? That's also why I haven't been blogging much. Eventually I'll feel like talking again and I'll post more, and occasionally something might surface that is just begging to be posted about, like an article in the Chronicle called "Blog Overload" that a colleague sent me today. But otherwise, consider me on hiatus for a few months.
Q: you mean hiatus as in a sabbatical?
"Yesterday I got an email asking me to be a discussant at ISA. I've never done this before and I want to do a good job. Do you have any tips on being a good discussant?"
With the student's permission I am posting my reply here in the hopes that it may prove interesting or informative. [ISA = International Studies Association, the major professional association for IR scholars; their annual meeting is where most of us go to present work in progress and/or to hang out with colleagues from other universities and colleges.]
"Indeed I do.
First of all, I'm not sure that it's a good idea for graduate students to serve as discussants in the first place. The presenters-and-discussant(s) format lends itself to the posing of thorny questions by the discussant directed at the presenters, and this might lead to some role strain if the discussant is a graduate student and the presenters are established scholars. Far better, in my view, is an arrangement in which the presenters can run the gamut from graduate students to senior folks, and the discussant is at least a tenure-track professor someplace or has a comparable level of job security. I have no problem with a panel where all the presenters are the same rank, even if they are all graduate students, but I generally think that discussants should be a bit more established. So proceed at your own risk.
Second, a discussant in the traditional presenters-and-discussant(s) format has two distinct tasks: to discuss the papers, and to help to foment a discussion among the panelists and perhaps even members of the audience. Many people make a serious mistake and overemphasize the former task to the detriment of the latter. This is generally a mistake in the conference format because you cannot presume that the audience has read the papers in advance; if you could presume this, then commenting on the papers would be a good way of starting a discussion. But otherwise, comments on specific passages from the papers are likely to just confuse or bore the audience. In my view such feedback (which is in fact one of the tasks of a discussant) is better handled through e-mail or in some other more interpersonal and private setting.
The most boring discussants I've ever seen are those who proceed step-by-step through each of the papers on the panel, making suggestions that are generally of interest only to the author(s) of the individual papers, and then sit back as though they have completed their job. They have not. A panel is not, in my opinion, a kind of feedback session to which the audience has been invited as spectators; it's not a "fishbowl" situation in which the audience is simply observing. Rather, a panel is -- or can and should be -- something of a conversation, a discussion, a clash, a debate: at any rate, something more active and participatory.
It is the second task of the discussant to jump-start that conversation. There are better and worse ways to go about doing this. Often the worst way is to try as hard as possible to find some common thread running through all of the papers, and to display that for the audience regardless of how strained and awkward it is -- as though the point of a panel was for people to agree! I think this is largely silly. A panel is a public forum for disagreement, not agreement; it is contentious, not conciliatory. And it's a lot easier for the audience to participate in a debate than it is in a long train of agreement, because in a debate speakers can take positions -- even if those positions are sometimes "I agree with you about X but disagree with you about Y." The goal here is not to divide people into camps, but to give people an opportunity to articulate stances and to have those stances challenged -- and then give them an opportunity to defend them.
In my experience, a discussant manages this best by articulating opposing points of view -- opposed to the views presented by the panelists. Sometimes this involves setting the panelists against one another by drawing out their differences and disagreements; sometimes it involves explicitly mentioning the elephant in the middle of the room, the implicit Other against which everyone is arguing; sometimes it involves setting the papers in a broader disciplinary context so as to invite other parts of the discipline into the discussion; and yes, sometimes it just involves going to town against a paper and demolishing its absurdity. (This last possibility is also a large part of why I don't think that graduate students should be discussants, because they are most likely going to be more restrained in commenting on the papers of more senior scholars. There's also another caveat here, which is that I think it basically unconscionable to publicly flay a graduate student presenter; you can press them, you can raise objections to their argument, but you have to be somewhat more restrained because a graduate student is still learning the ropes. If the presenter is an established scholar, then they know better and in my opinion you can do things like bluntly point out that they're making no bloody sense -- and do so with impunity, and with a clear conscience.)
Hence: I'd say that the job of a panel discussant is to serve as the living exponent of Weber's "uncomfortable facts," to explicitly introduce those lines of inquiry that raise problems and challenges for the paper presenters and their arguments. And then sit back and let the discussion proceed -- unless you're also the chair, you are not responsible for shaping the discussion, just for kick-starting it. Precisely what this involves is going to vary tactically from panel to panel: if the papers are all close together, play the role of the alternative point of view; if the papers are implicitly disagreeing, fan that into a full-blown clash; if there are gaps and silences, call attention to them.
Finally, remember that the panel is not about you. A discussant should not seize the opportunity of being a discussant to present a fully-formed alternative perspective or paper of her or his own. You can allude to it, reference components of it, but do not fall into the trap of actually presenting another quasi-paper of your own. This is both unfair to the participants (neither the other panelists, nor the members of the audience, will have had any opportunity to read your "paper" in advance -- hence they have to react to your arguments on the spot, which not everyone is particularly good at doing) and something of a betrayal of the format. Save your own presentation for another occasion.
Of course, all of this only applies in my opinion to a traditionally-organized panel in the traditional format. The funkier one gets in panel organization, the less these rules apply. I have seen more collaborative panel discussions, in which what transpires is less of a debate and more of a genuine exchange of ideas; I have seen and participated in autobiographical roundtables where people share their stories in preference to making and defending claims; and we all know about "moosehead" panels where the Usual (senior scholar) Suspects show up and give their established song-and-dance routines, and the megawattage of the star-power in the room effectively squelches anything like a real debate or discussion. (That last one is a particularly difficult place to play discussant, I think, unless one is already shining as an established disciplinary star; otherwise, the potential consequences are likely to prevent one from raising anything particularly challenging or difficult. But interestingly, one senior moosehead on a panel with other less luminous scholars is a great opportunity for a discussant; that's an environment where many such scholars are much more comfortable being pressed and challenged. And it's a lot of fun to be able to do that as a relatively junior scholar, believe me.) So I suppose that my last piece of advice is to figure out what kind of panel you are serving on, and adjust accordingly."
[cross-posted at the Duck]