
Sunday 2 August 2015

“Be Yourself!” Is This Really Good Advice?

I’m not sure telling people to be themselves is good advice, but my saying so never seemed to have much purchase with undergraduates in my Intro to Lit course.   The sadist, the homicidal maniac, the pedophile--aren’t they “being themselves” when they commit their crimes?  Shouldn’t we tell people, and ourselves, to “be better”?


The context of the discussion was H.G. Wells's short story, "The Country of the Blind."  Nunez, a mountain climber in the Andes, tumbles in an avalanche into the Country of the Blind--a society cut off from the world for over 14 generations, which has adapted to the fact that everyone living there is blind.  The concept and all memory of sight have disappeared.  Nunez struggles and fails to explain to the people that he is a superior being because he can see.  They perceive him as inferior, unformed, a missing link in the evolutionary chain stricken with occasional delusions and bouts of violent madness.  Nonetheless this world is prepared to offer him an idyllic life: peace, sustenance, acceptance, the requited love of a beautiful woman--but in exchange Nunez must agree to surrender his eyes (which doctors have concluded are tumors causing his madness).


The story is an allegory of cultural blindness; my challenge to students was always to recognize the various perspectives from which the allegory might be applied.  In general, students were a bit too quick to condemn the people of the Country of the Blind for their refusal to “see” beyond their own culture.  Some students remained adamant in their refusal to abandon this position, insisting that the story should be viewed from only one perspective, that of Nunez looking down on the inferiority of the Country of the Blind. 

In the face of this conviction, I would point out some of the merit of the other position--that of the people of the Country of the Blind.  The Nunez story is typical of our time: an immigrant arrives in a new culture, bringing with her the baggage of her own culture and a host of superior skills associated with that culture.  We, the receiving, settler culture, having been in place for generations, are the people of the Country of the Blind.

For some students this was a tough pill to swallow, but I invited them to consider how we would react, as individuals and as a society, if someone ragged, ill-kempt, poorly spoken and perceptibly alien aggressively insisted that he was superior, explaining this superiority with words that had no meaning to us.  Our prisons, asylums, homeless shelters and streets are filled with such people.

It was fairly easy to win a few adherents to this perspective.  For students who had lived the immigrant experience, the allegory was beyond obvious.  Others I invited to generalize the experience: to imagine that they brought skills, talents and abilities to a new social group, and to consider how quickly and easily any social group (club, team, neighbourhood, school, peer group, etc.) would accept a new member who expected to be "king."  (Nunez's mantra in the story is "In the country of the blind the one-eyed man is king.")  Above all, we expect immigrants and newbies to be humble in fact and in manner.

And so the discussion progressed, with students slowly, tentatively approaching the realization that we are all culturally blind.  We all view our own culture not only as what is best but what is normal, real, and paradoxically “natural.”  The one culture that we can never see clearly is our own.  Trying to understand your own culture is like a fish trying to understand water.

As we reached the end of the story, I would hit a wall.  In the final line of the story it appears that Nunez commits suicide rather than surrender his sight.  (In fact, a coherent and typical reading of the story is that Nunez does not survive the avalanche at the opening of the story, and the entire narrative is his imaginings in the moments before death.  This is an interpretation that I never presented or pursued for the simple reason that it would preempt much of the productive exchange that the story provokes.)

When I criticized and even mocked Nunez’s apparent suicide, the backlash response from students was tidal:  “he was being true to himself,” “true to who he was,”  “true to his beliefs and principles,” “true to what he loved and found beautiful--seeing,” “he was refusing to give up who he was,”  “he was being himself!”  Yeah, maybe!

In the first place, I wasn't going to surrender to a romantic advocacy of suicide (see Do No Harm).  In passing I would mention that the child psychologist Piaget defined intelligence as the ability to adapt.  Then I would underline the illogic of Nunez's apparent decision to choose "seeing" over "living"--a decision that was neither intelligent nor admirable.  Finally, I would point out that endorsing Nunez's decision was itself an example of cultural blindness: we live in a visual culture, one that emphasizes and exaggerates the importance of seeing (see Ong, Havelock, McLuhan and Falling in Love is Unprofessional), and, like Nunez, we fail to "see" beyond the visual culture that we have come to accept as no less than life itself.

To further the point, I would typically bookend the course with Kurt Vonnegut's short story "Who Am I This Time?", collected in the anthology Welcome to the Monkey House.


The satiric short story was adapted into a made-for-TV romantic comedy on PBS, also called "Who Am I This Time?"  The story focusses on a young couple who defy "being themselves" by finding happiness in acting and playing roles with one another.


Monday 13 July 2015

Postmodern Shibboleths


In contemporary usage a “shibboleth” is a word or style or behaviour or custom which identifies you as being part of an in-group--or not.  Postmodern shibboleths are numerous.  If you encounter people who consistently say “discourse” when they mean “theme,”   “the signified” when they mean “the meaning,”  or “deconstruct” when they mean “analyze,” you can be sure you are dealing with postmodernists.  



In the not too distant past, the ultimate identifier of a postmodernist was the frequency with which s/he used the word "postmodern"--although this might be taken more as a team cheer than as a shibboleth.  Upon encountering anything that was kitsch, ironic, self-referential, or lacking unity, coherence and conclusion, the postmodernist would loudly declare, in the hope someone might overhear, that it was postmodern.

The irony of postmodernism is that its only redeeming social value has been the promotion of tolerance, yet the postmodern catchphrase “political correctness”--a hallmark of intolerance--promises to outlive postmodernism itself.  A postmodernist is someone who can tell you, with conviction, to shut up, while arguing in favour of the right to free speech.


One of the postmodern concepts which I have found to be occasionally useful is “the subject.”  In postmodern speak “the subject” stands in for a variety of possibilities:  the self, the individual, the person, the ego, the “I,” and is a strong counterpoint to the soul, the personality, character and spirit.  In attempting to use “the subject” in my writing, I discovered the other side of employing postmodern shibboleths.  Once you have used an established postmodern catchphrase, you are pretty well locked in, by reader expectation, to following with a typical, well-worn postmodern argument about how the victims of power suffer and the terrible things we already thought about power are even worse than we imagined--which is why most postmodern essays turn out to be convoluted on the surface, obvious underneath, disingenuous overall, and incredibly tedious to read.

Sunday 5 July 2015

Binary Thinking Versus the Other Kind

I still remember from my first-year undergraduate "Philosophy of Mind" course that the human brain is incapable of thinking, imagining or understanding one thing in isolation without bringing in another: a background, a difference, an opposite.  You can test yourself by trying to think of just one thing.  The notion of a dialectic is based on the binary functioning of the mind; every concept contains its opposite: the notion "long" requires "short," "big" invokes "small."  In an even more rudimentary fashion, in order to know a "thing," you must be able to say what is "not that thing."

If you have ever found yourself in a debate with a postmodernist, chances are the postmodernist turned on you at some point to announce dismissively, "Oh, that's binary thinking!"  The postmodernist's gambit is based on the assumption of binary thinking.  The bluff works because you find yourself thinking, "Gee, there must be a superior, more advanced form of thinking that isn't binary."  Is there?

No, there isn’t, but the trickle-down effect of postmodern intellectualizing results in something like this claim from the online “Postmodern Literature Dictionary”:

“If you use ‘binary thinking,’ you are a person who sees no gray, no fuzziness between your categories. Everything is black or white.”

In postmodern speak “binary thinking” has become a synonym for the already well-known and understood idea of “simplistic thinking,” again with the implication that those “non-binary” thinkers must be smarter than the rest of us. How did we arrive at this “two legs bad” juncture?  

The cause is rooted in "poststructuralism," the theoretical backbone of postmodernism.  In order to understand "poststructuralism" (literally "after structuralism," therefore a continuation and improvement of it), it is necessary to have some grasp of structuralism.  Structuralism is closely aligned with "semiotics," the science of signs, which traces its modern origins to the linguist Saussure.  John Fiske offers a clear, accessible and succinct description of semiotics/structuralism in his Introduction to Communication Studies.

Semiotics is a form of structuralism, for it argues that we cannot know the world on its own terms, but only through the conceptual and linguistic structures of our culture. [. . . .] While structuralism does not deny the existence of an external, universal reality, it does deny the possibility of human beings having access to this reality in an objective, universal, non-culturally-determined manner. Structuralism’s enterprise is to discover how people make sense of the world, not what the world is.  (Fiske, 115)



Fiske's description anticipates the core dispute in the feud which would eventually take place between postmodernists and empirical scientists like Sokal, as I have described in my post The Postmodern Hoax.  Current repudiations of "binary thinking" find their origin in a paper delivered by Jacques Derrida at a structuralism conference at Johns Hopkins University in 1966, entitled "Structure, Sign and Play in the Discourse of the Human Sciences."  (The French-language original, "La structure, le signe et le jeu dans le discours des sciences humaines," is slightly more readable than the English translation.)


In this essay, Derrida dismantles (Derrida uses the term "deconstructs") the work of the anthropologist Claude Lévi-Strauss, in particular Lévi-Strauss's The Raw and the Cooked.  Although Derrida never explicitly refers to "binary thinking" or "binary opposition" in his essay, it is understood that the structure Lévi-Strauss uses--derived from the linguists Saussure and Jakobson and from all of structuralism--is the binary functioning of human thought, and that this is the target of Derrida's critical inquiry into the "structurality of structure" ("Structure, Sign and Play in the Discourse of the Human Sciences").

The Longman anthology Contemporary Literary Criticism, in addition to a translation of Derrida's paper, offers as an addendum a transcription/translation of the discussion which took place between Derrida and the leading lights of structuralism immediately after his presentation.  It's interesting to see some of the finest minds in structuralism struggling to understand what the hell Derrida was talking about and, at the same time, to see Derrida cornered into giving a straightforward definition of "deconstruction."  Okay, "straightforward" is never a word that can be applied to Derrida, but with my ellipses eliminating all the asides and parentheses, this is what he said: "déconstruction [. . .] is simply a question [. . .] of being alert [. . .] to the historical sedimentation of the language which we use [. . .]" (497).  This is the definition of "deconstruction" that I typically gave students and, at the same time, I pointed out that even though "deconstruction" was supposed to be something innovative, radical and distinctly postmodern, the Oxford English Dictionary has been "deconstructing" the English language for literally hundreds of years--meaning that the OED gives you the multiple meanings of a word and the year ("the historical sedimentation") in which a particular meaning/definition can be proven to have come into usage.

 Back to structuralist anthropology. As Fiske explains:
The construction of binary oppositions is, according to Lévi-Strauss, the fundamental, universal sense-making process. It is universal because it is a product of the physical structure of the human brain and is therefore specific to the species and not to any one culture or society. (116)
Contrary to popular understandings of "binary thinking," the whole point of structuralist anthropology (the binary approach) is to understand how societies, through their mythologies for example, deal with the failures of and exceptions to binary opposition.  Fiske applies the Lévi-Strauss approach to a Hollywood Western and concomitantly demonstrates both how the approach teases out subtextual themes at play in the movie and how this particular interpretation of the film might stretch credibility.  Even today, 50 years later, it is difficult to fathom exactly what new, radical, distinctly postmodern objection Derrida is raising.

Certainly it makes sense to challenge how binary thinking is applied in a particular case.  The objection isn't to binary thinking but to a particular application.  If you are going to launch a campaign against food on the grounds that it causes obesity, you should at the same time be ready to present an alternative to eating food, something that goes beyond the absurd claim that "eating food is bad."





Friday 26 June 2015

Falling in Love is Unprofessional

"Falling in Love and Crying in the Academic Workplace"

In the wake of Nobel laureate Professor Tim Hunt’s ironic comments on women in science, a draft article entitled “Falling in love and crying in the academic workplace: ‘Professionalism’, gender and emotion” has been circulating in social media.  

Do We Need Gender?

The challenge that this type of article faces, and that this one doesn't quite overcome, is that such articles end up reinforcing the gender stereotypes they ostensibly set out to oppose.



I used to challenge students to imagine a world where the words (and concepts) “man” and “woman” didn’t exist, and we were all just people: some of us with brown eyes, some with blue, some of us left-handed, some of us right, some with vulvas, others with penises, some capable of bearing children, some better at lifting heavy objects--no absolute, mutually exclusive binary categories necessary.  Intellectually speaking we don’t “need” the categories “men” and “women.”  The intent of this “thought experiment” was to show the intellectual ease with which gender difference could be erased and to demonstrate how, in the abstract, gender is a fragile and superficial concept.  

However, the fact that students never show much interest in the project of gender erasure shows how culturally attached we are to this dichotomy.  If I pushed the discussion, eventually a fastidious female would vociferously declare: “There is no way I want to share a bathroom with a bunch of smelly guys!”  End of discussion.

Stereotypes and Prejudices

The problem isn't that gender differences and stereotypes exist; the problem, as Judith Butler would point out, is that these differences and stereotypes are policed and enforced.  There is a difference between a stereotype and a prejudice.  A stereotype is an extreme or rigid form of assigning type ("stereo" means "hard" or "firm"), but it usually has some basis in fact when applied in general to a large group of people.  A prejudice is assuming and insisting that a stereotype applies to any and all individuals of a type or category.  It is a gender stereotype that men are physically stronger than women.  It is a scientifically verifiable correlation that, on average, people with penises enjoy more muscle mass than do those endowed with vulvas.

Enforcing Stereotypes

The problem begins when this generalization is enforced on an individual: we tell John that he is failing as a man because he is not stronger than the average woman, and suspect Mary of not being a real woman because she is stronger than the average man; of course, John and Mary cannot be a couple because she is stronger than he is; meanwhile John could get a construction job, but Mary can't, etc., etc.  As a society, we extrapolate, police and enforce these stereotypes.
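To put the average-versus-individual point in concrete terms, here is a minimal sketch (in Python, using made-up illustrative numbers rather than real anthropometric data) of how a clear difference in group averages still leaves the comparison between any two individuals wide open:

```python
import random

# Made-up illustrative numbers: two overlapping bell curves.
# Group A has a higher average "strength" score than group B,
# but the spread within each group is large relative to the gap.
MEAN_A, MEAN_B, SPREAD = 100, 90, 20
TRIALS = 100_000

wins_for_b = 0
for _ in range(TRIALS):
    a = random.gauss(MEAN_A, SPREAD)  # one randomly chosen individual from group A
    b = random.gauss(MEAN_B, SPREAD)  # one randomly chosen individual from group B
    if b > a:
        wins_for_b += 1

print(f"Group A's average is {MEAN_A - MEAN_B} points higher,")
print(f"yet a random B outdoes a random A in about {wins_for_b / TRIALS:.0%} of pairings.")
```

With these invented numbers, the "weaker" group comes out ahead in roughly a third of individual match-ups: the stereotype (a difference in averages) can be true while the prejudice (a confident prediction about John or Mary) remains unjustified.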

Solving Prejudice

How do we get beyond stereotypes and prevent them from devolving into prejudices?  It is too easy to say that stereotypes and prejudices are products of ignorance.  We are all ignorant and prejudiced in varying degrees.  In a world of Twitter, instant messaging and an up-to-the-minute news cycle, we are constantly being called upon to "pre-judge," our sympathies and outrage solicited long before anything approaching a comprehensive knowledge of the facts is possible.  The only solution is to question and to withhold judgment until a sufficient number of facts have come our way; to apply our reading skills and logic rigorously to the facts available; and then to cut the world some slack without slipping into apathy.

The other solution when facing stereotypical differences is to consider other possible paradigms, other axes of comparison.  I admired that  in “Falling in Love and Crying in the Academic Workplace,” the author, Rachel Moss, at least temporarily shifted the discussion to “professionalism.”  Falling in love is unprofessional, mostly because the root of the word “amateur” is “amour,” “to love.”  Even in the study of theatre and drama, I have found ample reason to prefer amateur productions and performances over the professional, though the value system runs in the other direction.  It is not without reason that we describe prostitution as a profession.   It has its rules, and one of them is not falling in love.   

How to Talk about Cultural Differences

In my research I have tried to talk about some of the same differences that Rachel Moss discusses in her article.  I tried to talk about them as the differences between oral and visual cultures (following from Havelock, Ong and McLuhan), and when that didn’t quite work I turned to what John Vernon called “garden” and “map” culture.   Ultimately we have to admit that what we are talking about is “human” culture versus “machine” culture and our society shows an ever-increasing admiration for humans who behave like machines.

"You Fall in Love with Them, They Fall in Love with You"

On that note, a concluding word about Tim Hunt.  Apparently, he has two daughters who love his cooking, but I’ll bet he’s seen the girls cry when he criticized them.   His wife, Professor Mary Collins, was once his student.  So when he said the trouble with girls in the lab is that “you fall in love with them, they fall in love with you” could he have been thinking about himself and his wife?  What an amateur!




Tuesday 23 June 2015

After “the Death of the Author” It Only Takes 39 Words to End an Academic Career

39 Words versus curing cancer

It only takes 39 words to end an academic career even if you are a Nobel laureate in physiology . . . or maybe it's because you are a Nobel laureate.  The sexist comments of the average schmuck don't go viral on Twitter.

I can't help imagining some futuristic Wikipedia article on "the cure of cancer."  It would go something like this: "Professor Tim Hunt's work on cell division proved instrumental in developing the cure for cancer; however, he became notorious and his career was ended in 2015 over his off-the-cuff remarks on women in science at a conference in Korea."

The 39 words in question

According to The Guardian these are the 39 words which Professor Hunt uttered:


The Danger of irony

His wife, Professor Mary Collins, an immunologist, concurs with most of the critical commentary that “It was an unbelievably stupid thing to say.”  Hunt himself confessed apologetically,  “I was very nervous and a bit confused but, yes, I made those remarks – which were inexcusable – but I made them in a totally jocular, ironic way.”  (I’ve already covered the problems with irony but if you need a refresher see  Do No Harm Part II: Avoid Irony).



The Context is the meaning

No-one is denying that Professor Hunt said what he said, but my reason for commenting is that his words are being so widely reported and repeated out of context.  The context is the meaning.  The only way to understand what an action or an utterance means is to consider the context.  In saying this I know I am indirectly defending "the bad guys" (and "girls"): the politician who complains of being quoted "out of context" and the adulterer who claims that the sex "didn't mean anything."  The truth is that politicians are frequently quoted out of context and their words attributed meanings that are different from, worse than or in complete opposition to their intentions.  And yes, a single act of coitus can be as meaningless as friction.  The only way to know what sex means is to consider the context, and the spectrum of possibilities ranges from criminal sadism to love.

To Read is to put a text in its proper context

For at least a generation now (the Twitter generation?), we have been training university students to read out of context.  As a professor of literature, I thought of my job as teaching my students to be the best possible readers, able to analyze and re-synthesize some of the best works that have ever been written.  Reading well meant having a thorough understanding and appreciation of the various contexts within which a work could be read.  As time marches on, new meanings of old works keep emerging, but if we care about meaning, we have to consider the many contexts within which literature is, and was, written and read.

The "Death of the author" is the death of meaning

However, I noted with chagrin that many of my postmodernist professors and colleagues were quickly and firmly attached to Roland Barthes' proclamation of "the Death of the Author."  Fixed meanings were no longer possible, according to Barthes, because professional readers (i.e., postmodern professors) no longer considered the author (who she was, her context or her intentions) when interpreting a literary work.  Looking at the author to determine the meaning of a text simply wasn't done.  Whether Barthes was reporting what he witnessed around him or announcing what should and had to be, on the ground in university classrooms the idea of considering the life of the author as part of the study of a literary work had become so passé that reviving it would have seemed radical.

The "Death of the author" is a power grab by pro readers

To my knowledge no-one has ever pointed out how self-serving the "Death of the Author" was for university professors.  In the new postmodern context, meaning no longer resided with the author but with the reader, and if you wanted to know what a literary work "really" meant (even though such an absolute was never possible), you had to turn to a professional reader, a professor of literature.  It was clearly a power grab, but no-one seemed to notice--or was it that no-one cared?

The precedents  and procedures for quoting Professor Hunt out of context have been established  and taught.  Everyone is invited to posture self-righteously by attacking him and his un-contextualized utterances.

Tim Hunt is the context of his remarks

When that gets old we might consider challenging the "Death of the Author," and taking to heart Professor Collins' observation that what her husband said "could be taken as offensive if you didn't know Tim" and her assurance that "he is not sexist. I am a feminist, and I would not have put up with him if he were sexist."

What are the proper contexts within which we should read Professor Hunt's utterance?  My counsel is that we need to be conscious that we are reading different contexts and, in this case, Tim Hunt is one important context of the utterance, not the other way around.  We won't get the meaning of Tim Hunt by reading the 39 words he uttered in Korea.

Friday 12 June 2015

Mateus da Costa, the Very First, Original, Authentic, Pure Laine Québécois de Souche and the Real Santa Claus (with Addendum)

Here’s a scenario I used to play out for undergraduate students.

Your roommate comes home from Christmas shopping  and announces enthusiastically that he just saw a guy at the mall who looks “just like the REAL Santa Claus!”

 You, an adult sceptic, reply in your most practiced sarcastic tone, “Duhh! Dude, there is no REAL Santa Claus!”

Most people over the age of eight might agree with you, but you have to admit that you sort of understand what your roommate means.  In fact, on second thought, you understand exactly what he means: he saw an elderly, roly-poly gentleman with white hair and beard, rosy cheeks and a twinkle in his eye, dressed in a red suit and cap trimmed with ermine.  The man he saw captured with surprising precision the various quintessential images of Santa Claus he has seen on TV, in movies, on posters, on Christmas cards and in Coca-Cola commercials.


The lesson here is that what we typically consider "real" and "true" are those ideas, images and notions that fit with what we already happen to believe, the ideas and icons that our culture has preconditioned us to accept.  Even though what we might spontaneously describe as "real" and "true" may have nothing to do with facts, logic, science, truth or reality, our feelings about what is true and real have enormous influence in our lives.

One of the casualties of last year’s PQ-Marois-Drainville “charter of values” ploy (in addition to the Marois government itself) was the expression “Québécois de souche.”   Until the 1970s a Québécois was a citizen of Quebec City.  The idea of identifying all residents of the province of Quebec as “Québécois” didn’t become current until the mid-70s.   The original expression, whose origin I have not been able to trace or date, was “Québécois de vieille souche” (literally “from the old stump”) and is usually translated as “old stock.”  

Since the word "Québec" is Mi'kmaq (for "where the river narrows"), it's pretty obvious that the Mi'kmaq and their First Nations brothers and sisters are the oldest stock in Quebec and Canada, but "Québécois de vieille souche" implies being able to trace your lineage to the first European settlers.  Over time "Québécois de souche" has come to mean any resident of Quebec with a French-sounding name who happens to speak French.

As a resistant expression of pride in the heritage, culture and history of a disadvantaged, oppressed and denigrated minority, I seconded and celebrated the expression "Québécois de souche," but context changes meaning.  In fact, the context is the meaning.  Certainly some commentators had long claimed that the expression smacked of ethnocentrism and xenophobia, but when a nationalist government with aspirations of statehood under the slogan "Nous sommes un peuple" ("We are a people/nation") came forward with a proposed "charter of values" to rewrite the existing "Quebec charter of individual rights and freedoms" and to guarantee that the history of Quebec could only be retold in one correct, Catholic, French way, it became impossible to disassociate the expression "Québécois de souche" from ethnocentrism and xenophobia.

But there is a way to salvage the expression.  Just as the iconic Santa Claus is taken to be based on Saint Nicholas, the 4th-century bishop born in what is now Turkey, we might ask (and answer) who was the first, original, authentic, pure laine Québécois--the person sailing from Europe to settle in New France in advance of all other Europeans, whom we could identify as the primordial source of the expression "Québécois de souche."  My candidate is Mateus da Costa, for the very simple reason that he was Samuel de Champlain's secretary on the voyage to settle New France, and da Costa was chosen for the job because he already knew the native languages, which implies that he had lived in what would become New France for some years before Champlain's voyage.

Mateus da Costa must have been a brilliant, resilient and resourceful man.  He was of Black African descent and a resident of Portugal, but most of his history is based on speculation from contracts and court documents.  We know from those documents that his services were much sought after by anyone who wanted to explore the new lands across the Atlantic.

Mateus da Costa’s Portuguese connection is important in order to understand how it is possible for him to have settled, at least temporarily, in what would become New France prior to Champlain.  The evidence is ample that Portuguese fishers travelled back and forth to Canada, and in particular to Newfoundland’s Grand Banks well in advance of Champlain, Jacques Cartier and Christopher Columbus.  Part of the evidence is the frequency of Portuguese place names all over Newfoundland.  

(Living in Portugal, I discovered that “canada” is a Portuguese word.  Literally it means “here is nothing;” that is, “ca nada,”  but it is used as an equivalent of “rural route” in Portuguese addresses . . . but I digress.)

My point is simply that the next time you hear or use the expression “Québécois de souche” perhaps this (speculative) image of Mateus da Costa should come to mind, and dissipate any sub-text of xenophobia.

Wednesday 14 January 2015

Terrorism and Madness: Between Sympathy and Understanding

When I was researching the uses of madness in literature I came across a paradox from the philosophy of causality. If you are able to analyze the etiology, the causes, of madness, you can no longer claim that what you have analyzed is madness.  You can’t claim that you have found rational causes and effects for a behaviour, and continue to claim that the behaviour is mad.  

Thomas Szasz, a trained and practicing psychiatrist, made a career out of denying the existence of mental illness.  According to Szasz, madness was a "legal fiction."  Like other such fictions--things that cannot be proven to exist--it was useful for institutions like hospitals, the courts, police forces and governments to pretend that it existed, in order to establish procedures and policies for how to deal with particular situations, behaviours and people.

In recent days and weeks and years, the distinction between madness and terrorism has become a matter of significant debate.  Acts which are identified as "terrorism" supply significant political capital to governments interested in the paradoxical project of curtailing the citizenry's rights and freedoms in the name of protecting those same rights and freedoms.  Madness, on the other hand, is perceived as an individual, personal, private phenomenon, outside the purview of government.  Announcing a war on mental illness has none of the purchase, cachet or logic of a war on terrorism.

I have come to the conclusion that in both cases we are dealing with "legal fictions," but legal fictions with powerful, even lethal, consequences and repercussions.  Although we might assume that putting actions and behaviours into particular categories is designed to help us understand and deal with them, historically these two legal fictions--terrorism and madness--have been used to prevent us from looking further into their causes.  As legal fictions, we can deal with them without having to understand them.

If you suggest that we should attempt to "understand" terrorist acts, you will be accused of weakness, naivety, inexperience and, ultimately, of "sympathizing" with the enemy.  In some quarters, escalation is considered the only legitimate response to terrorism.  In the Middle Ages, the common treatment for madness was to lock the madman in a dark room, then starve and beat him.¹  Do you see a parallel?

But really, what are the risks of our trying to understand the causes of terrorism?  Is it that, as with madness, if we begin to understand, we will cease to believe that terrorism is terrorism?  What is the 21st-century alternative to our understanding terrorism?

I get that the intention of a terrorist act is to make a point.  If we admit that we get the point, then the terrorist can claim success.  The dominant, current strategy is a repeated public message that we simply do not understand.  In fact, the message often goes beyond just not understanding all the way to  . . . well, perhaps “madness” is too dramatic a word, but “irrational” and “illogical” seem un-dramatic and appropriate descriptors.   

The example I am thinking of is that it has become common to describe terrorist acts in which terrorists have sacrificed their own lives as “cowardly.” I can see the argument that suicide is cowardly, a refusal to brave life’s hardships, and that terrorists attack “soft” rather than military targets--children, women, men, civilians all. Nonetheless, of all the derogatory descriptions that could be applied to terrorists--evil, immoral, bestial, cruel, inhuman, misguided, foolhardy, disgusting, tyrannical, heartless, mindless, fanatical--what is the logic of describing them as “cowardly”?

Bill Maher lost his job as the host of the TV show Politically Incorrect for contradicting President George W. Bush’s claim that the 9/11 attack on the World Trade Centre was “cowardly.”  “Say what you want about it,” Maher said, “not cowardly.”



This insistence on the idea that terrorism must be described as cowardly brought to my mind the penultimate chapter of George Orwell's 1984.  Winston is in prison, under the thumb of his interrogator and torturer, O'Brien, and realizes that he has been under the surveillance of the Thought Police for the last seven years.  He writes these three sentences in an attempt to prove that he has been rehabilitated and is ready to think whatever the regime thinks:

FREEDOM IS SLAVERY.

TWO AND TWO MAKE FIVE.

GOD IS POWER.



Whatever stance we adopt toward terrorism, we must ensure that our position makes sense and we do not become what we oppose.

¹ Shakespeare used this treatment of madness for comic effect in his play Twelfth Night.
