
Monday 26 October 2015

Are Canadian Elections Democratic?

Are Canadian elections democratic? The short answer is “no!”

Our electoral system is based on the British system which, if you watch CGP Grey's Why the UK Election Results Are the Worst in History, you will quickly understand is far from democratic.


To be fully upfront, I did not vote for the Liberal Party in the election of October 19, 2015, but I am relatively content with the results. However, in light of the disparity between the popular vote and the number of seats won in Canada's most recent election, CGP Grey will have to retitle his post as "Why the UK Election Results Are the Second Worst in History."

Here are the seat counts and percentages of the popular vote won by each of the major political parties in the Canadian election:

Party   Seats   % of vote
LIB     184     39.5%
CON      99     31.9%
NDP      44     19.7%
BQ       10      4.7%
GRN       1      3.5%

In case the disparity and level of misrepresentation doesn't immediately jump out at you, here it is in round numbers: the Liberals got less than 40% of the votes but over 54% of the seats; the Conservatives got 32% of the votes but 29% of the seats; the NDP got 20% of the votes and 13% of the seats; the Bloc got 5% of the votes and 3% of the seats; and the Greens got 3.5% of the votes but far less than 1% (in fact 0.3%) of the seats. Or, viewed the other way around, based on the popular vote the Green Party should have around 12 seats, the Bloc around 16, the NDP around 67, the Conservatives around 108 and the Liberals around 134. In short, Canadians voted for a coalition government, but they didn't get one.
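For anyone who wants to check the arithmetic, here is a minimal sketch in Python (written for this post, not drawn from any official source) that compares each party's actual seat count with a naive proportional allocation, simply rounding vote share times the 338 seats in the House. A real proportional system would use a specific apportionment formula (d'Hondt, Sainte-Lague, largest remainder) and possibly thresholds, so treat the output as ballpark figures only.

# Naive proportional allocation for the 2015 Canadian federal election.
# Illustration only: real PR systems use a specific apportionment formula
# and sometimes vote thresholds, so these are ballpark numbers.

TOTAL_SEATS = 338

results = {                 # party: (actual seats won, share of popular vote)
    "LIB": (184, 0.395),
    "CON": (99,  0.319),
    "NDP": (44,  0.197),
    "BQ":  (10,  0.047),
    "GRN": (1,   0.035),
}

print(f"{'Party':<7}{'Actual':>7}{'% vote':>8}{'Proportional':>14}")
for party, (seats, share) in results.items():
    proportional = round(share * TOTAL_SEATS)   # e.g. 0.395 * 338 = 133.5 -> 134
    print(f"{party:<7}{seats:>7}{share * 100:>7.1f}%{proportional:>14}")

Running this reproduces the rough figures quoted above: 134, 108, 67, 16 and 12 seats respectively.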

Much as I hate to be fatalistic, the situation is not likely to change: in the first place, the problem only shows up every four years or so, and afterwards people are likely to say, "oh well, the election is over now." Secondly, and more importantly, the system favours the winning party, which consequently has a vested interest in keeping things the way they are. The smaller the party, the more unfairly the system treats it--another reason the system is unlikely to change.

One glimmer of hope is that one of the first-announced planks in the Liberal Party's campaign was a promise to reform the electoral process. Here's what the Liberals announced in their campaign literature:

MAKE EVERY VOTE COUNT
We are committed to ensuring that 2015 will be the last federal election conducted under the first-past-the-post voting system. As part of a national engagement process, we will ensure that electoral reform measures – such as ranked ballots, proportional representation, mandatory voting, and online voting – are fully and fairly studied and considered. This will be carried out by a special all-party parliamentary committee, which will bring recommendations to Parliament on the way forward, to allow for action before the succeeding federal election. Within 18 months of forming government, we will bring forward legislation to enact electoral reform. 
It would be very interesting to see this list of measures being implemented, but take note that the promise is only that these measures will be "fully and fairly studied and considered."  (Why do I find myself finishing the sentence with "before they are rejected"?)  A year and a half from now the Liberals will be bringing forward legislation.  Unfortunately, when you promise to "bring forward legislation," you are leaving the door open so you can later claim that "I tried to bring forward legislation but my dog ate it."

Despite my instinctive cynicism on this issue, it is going to be interesting and challenging for a government in power to even begin discussion of these issues. A typical European format is for each party to present a list of candidates, and the number of candidates who become members of parliament (or its equivalent) is determined by the percentage of the popular vote which the party wins. I suspect that Canadians will be reluctant to give up the idea of voting for their local riding representative, but the European system ensures that the party has the representatives in government that it considers its best people.

Although I must confess that if this system were in place in past elections, my favourite two candidates would not have been high enough on the NDP list to get elected: Pierre-Luc Dusseault, first elected in 2011 at the age of 19 as the youngest-ever member of parliament and re-elected in 2015, and Ruth Ellen Brosseau, the candidate everyone thought was a joke in 2011 when she was elected in a largely French riding despite media claims that she couldn't speak French and the fact that she went on a pre-paid vacation in the middle of the election campaign. Ms. Brosseau turned out to be a dream MP for her riding and won an easy victory in 2015.

In conclusion: the system we should be looking for is one that, in the first place, is democratic, so that how people actually voted is reflected in the make-up of parliament; that respects regional and even local representation and distribution; and that still leaves open the possibility of wild-card outliers being elected. A lot to ask for, maybe, but in the end we will get the system we deserve--meaning the system we are willing to ask for, to work for, and maybe even to fight for. Don't let 19 May 2017 slip by without your serious consideration of our "new" electoral process.

Saturday 5 September 2015

The Truth about English Grammar

The “joke” below about the distinction between “can” and “may” crossed my Facebook feed a couple of times this week. The last time I heard this joke I was around 10 years old, meaning more than 50 years ago, so I am more than a little surprised that anyone today would comment on the distinction between “can” and “may” in making polite requests, or even think that such a distinction exists. Nonetheless, the post has received tens of thousands of likes and shares.


After nearly 40 years of teaching English Language and Literature, I think I can say, based on my own authority and that of most grammar books published in the last three or four decades, that if there ever was a polite-request distinction between “can” and “may” it disappeared at least 40 years ago. 

I have a strong suspicion that the reason this kind of false distinction persists is that if you press a less-than-fully-competent teacher of English, or an uninformed speaker of the language, to explain the difference between, say, “going to” and “will,” or between “have seen” and “saw,” or between “a few” and “a number of,” they will retreat into pure nonsense claims that one is more “polite and formal” than the other. Euphemisms and dysphemisms notwithstanding (see the Sour Glossary for definitions), the distinction between polite and less polite grammar disappeared from English usage in the 16th century with the abandonment of the distinction between “you” and “thee” (the equivalents of the French “vous” and “tu”). “Thee,” “thou” and “thine” persisted in prayers, bible translations and poetry specifically because these words had disappeared from common usage and, being rarefied, seemed special and more poetic. Ironically, “thee,” “thou” and “thine” are the less polite, less formal and less deferential forms, yet they are the most typical way that Christians address God.

Register is an important concept in linguistics. The concept reminds us that different words and expressions fit better in particular contexts. The language you use with your grandparents will likely be different from the language you use with your friends. If the context is legal or scientific or formal, we expect to hear words and expressions that fit that particular context, but the idea that certain common grammatical expressions, particularly modal auxiliaries like “can,” “should,” “may,” “might,” “will” and so on, can be distinguished from one another on the basis of politeness is simply wrong.

This being said, the truth about “correct” English grammar is that it is simply a collection of the most recently accepted errors. Just about anything that is now considered “good English” was a mistake at some point in history. You know all those words in English that end in “e” (like “bite,” “kite,” “courage,” “wide”): we were supposed to pronounce those final “e”s, but since most people in the 16th century were mistakenly leaving them silent, the silent final “e” became the correct thing to do. It’s an idea worth keeping in mind if you have ever had your self-esteem battered by a know-it-all grammar maven, and based on the vitriolic responses to the above post, it seems like a lot of people have.

Based on these comments, you might think I am opposed to grammar.  On the contrary, I think the posting above proves that we need to re-introduce grammar instruction into the school system.  The problem with the above posting (beyond satirizing a problem that hasn't existed for 40 years) is that it mistakenly describes "Can I borrow . . ." as a "colloquial" irregularity.  It isn't.  Grammar is constantly changing and at this point in time, "Can I borrow a pencil?" is correct standard English.

Tuesday 1 September 2015

Will the Government Use C-51, Anti-Terrorism Legislation, to Track Canadian University Students with Outstanding Loans?

Ottawa has instructed the Canada Revenue Agency (CRA) to be more aggressive in collecting outstanding student loans. According to the Globe and Mail:

The Government annually has to write off some of the $16 billion owing in student loans for a number of reasons:  a debtor may file for bankruptcy, the debt passes a six-year legal limit on collection, or the debtor can’t be found.  (B2, 31 Aug 2015)
For more detail on how the government has disallowed university graduates from declaring bankruptcy and extended the six-year limit to 15 years, see my earlier post When Should You Repay Your Student Loan? How about . . . Never!  However, the real cause (“90% of cases”) of non-payment is that the CRA has lost track of student borrowers because “the CRA wasn’t allowed to ask other departments for help because of privacy laws” (B2, 31 Aug 2015).

What the Globe article doesn’t mention is the possibility of using C-51, anti-terrorism legislation, to solve the problem.  In case you have forgotten, the official title of the legislation is the “Security of Canada Information Sharing Act” and the purpose of the Act is 

TO ENCOURAGE AND FACILITATE INFORMATION SHARING BETWEEN GOVERNMENT OF CANADA INSTITUTIONS IN ORDER TO PROTECT CANADA AGAINST ACTIVITIES THAT UNDERMINE THE SECURITY OF CANADA

You might not think of a Canadian University grad with a student loan as a terrorist, but that’s because you have forgotten how the Conservative Government has used C-51 to re-define terrorism.  Here, unabridged, is how C-51, which is now law, defines terrorism:

The following definitions apply in this Act.
“activity that undermines the security of Canada” means any activity, including any of the following activities, if it undermines the sovereignty, security or territorial integrity of Canada or the lives or the security of the people of Canada:
interference with the capability of the Government of Canada in relation to intelligence, defence, border operations, public safety, the administration of justice, diplomatic or consular relations, or the economic or financial stability of Canada;
With students currently owing $16 billion in loans, and a good chunk of them refusing to pay up (28% in 2004, 13% in 2014), guess what: they are potentially interfering with “the economic or financial stability of Canada” and therefore qualify as terrorists under C-51.



Both the Conservative and Liberal Parties are in favour of C-51; only the NDP has promised to repeal this legislation.


Sunday 2 August 2015

“Be Yourself!” Is This Really Good Advice?

I’m not sure telling people to be themselves is good advice, but my saying so never seemed to have much purchase with undergraduates in my Intro to Lit course.   The sadist, the homicidal maniac, the pedophile--aren’t they “being themselves” when they commit their crimes?  Shouldn’t we tell people, and ourselves, to “be better”?


The context of the discussion was H.G. Wells's short story, “The Country of the Blind.”  Nunez, a mountain climber in the Andes, tumbles in an avalanche into the Country of the Blind--a society cut off from the world for over 14 generations, which has adapted to the fact that everyone living there is blind.  The concept and all memory of sight have disappeared.  Nunez struggles and fails to explain to the people that he is a superior being because he can see.  They perceive him as inferior, unformed, a missing link in the evolutionary chain, stricken with occasional delusions and bouts of violent madness. Nonetheless this world is prepared to offer him an idyllic life: peace, sustenance, acceptance, the requited love of a beautiful woman; but in exchange Nunez must agree to surrender his eyes (which doctors have concluded are tumours causing his madness).


The story is an allegory of cultural blindness; my challenge to students was always to recognize the various perspectives from which the allegory might be applied.  In general, students were a bit too quick to condemn the people of the Country of the Blind for their refusal to “see” beyond their own culture.  Some students remained adamant in their refusal to abandon this position, insisting that the story should be viewed from only one perspective, that of Nunez looking down on the inferiority of the Country of the Blind. 

In the face of this conviction, I pointed out some of the merit of this position.  The Nunez story is typical of our time: an immigrant arrives in a new culture bringing with her the baggage of her own culture and a host of superior skills associated with that culture.  We, the receiving, settler culture, having been in place for generations, are the people of the Country of the Blind.

For some students this was a tough pill to swallow, but I invited them to consider how we would react as individuals and as a society if someone ragged, ill-kempt, poorly spoken and perceptibly alien aggressively insisted that he was superior, explaining this superiority with words that had no meaning to us.  Our prisons, asylums, homeless shelters and streets are filled with such people.

It was fairly easy to win a few adherents to this perspective.  For students who had lived the immigrant experience the allegory was beyond obvious.  For the others, I invited them to generalize the experience by imagining that they brought skills, talents and abilities to a new social group, and to think about how quickly and easily any social group (club, team, neighbourhood, school, peer group, etc.) would accept a new member expecting to be “king.”  (Nunez’s mantra in the story is “In the country of the blind the one-eyed man is king.”)  Above all, we expect immigrants and newbies to be humble in fact and manner.

And so the discussion progressed, with students slowly, tentatively approaching the realization that we are all culturally blind.  We all view our own culture not only as what is best but what is normal, real, and paradoxically “natural.”  The one culture that we can never see clearly is our own.  Trying to understand your own culture is like a fish trying to understand water.

As we reached the end of the story, I would hit a wall.  In the final line of the story it appears that Nunez commits suicide rather than surrender his sight.  (In fact, a coherent and typical reading of the story is that Nunez does not survive the avalanche at the opening of the story, and the entire narrative is his imaginings in the moments before death.  This is an interpretation that I never presented or pursued for the simple reason that it would preempt much of the productive exchange that the story provokes.)

When I criticized and even mocked Nunez’s apparent suicide, the backlash response from students was tidal:  “he was being true to himself,” “true to who he was,”  “true to his beliefs and principles,” “true to what he loved and found beautiful--seeing,” “he was refusing to give up who he was,”  “he was being himself!”  Yeah, maybe!

In the first place, I wasn’t going to surrender to a romantic advocacy of suicide (see Do No Harm).  In passing I would mention that the child psychologist Piaget defined intelligence as the ability to adapt.  Then I would underline the illogic of Nunez’s apparent decision to choose “seeing” over “living”--a decision that was neither intelligent nor admirable.  Finally, I would point out that advocating Nunez’s decision was itself an example of cultural blindness: we live in a visual culture, one that emphasizes and exaggerates the importance of seeing (see Ong, Havelock, McLuhan and Falling in Love is Unprofessional), and like Nunez we fail to “see” beyond the visual culture that we have come to accept as no less than life itself.

To further the point, I would typically bookend the course with Kurt Vonnegut’s short story “Who Am I this Time?”, collected in the anthology Welcome to the Monkey House.


The satiric short story was adapted into a made-for-TV romantic comedy on PBS, also called “Who Am I this Time?”  The story focusses on a young couple who defy “being themselves” by finding happiness in acting and playing roles with one another.


Monday 13 July 2015

Postmodern Shibboleths


In contemporary usage a “shibboleth” is a word or style or behaviour or custom which identifies you as being part of an in-group--or not.  Postmodern shibboleths are numerous.  If you encounter people who consistently say “discourse” when they mean “theme,”   “the signified” when they mean “the meaning,”  or “deconstruct” when they mean “analyze,” you can be sure you are dealing with postmodernists.  



In the not-too-distant past the ultimate identifier of a postmodernist was the frequency with which s/he used the word “postmodern”--although this might be taken more as the team cheer than as a shibboleth.  Upon encountering anything that was kitsch, ironic, self-referential, or lacking unity, coherence and conclusion, the postmodernist would loudly declare, in the hope someone might overhear, that it was postmodern.

The irony of postmodernism is that its only redeeming social value has been the promotion of tolerance, yet the postmodern catchphrase “political correctness”--a hallmark of intolerance--promises to outlive postmodernism itself.  A postmodernist is someone who can tell you, with conviction, to shut up, while arguing in favour of the right to free speech.


One of the postmodern concepts which I have found to be occasionally useful is “the subject.”  In postmodern speak “the subject” stands in for a variety of possibilities:  the self, the individual, the person, the ego, the “I,” and is a strong counterpoint to the soul, the personality, character and spirit.  In attempting to use “the subject” in my writing, I discovered the other side of employing postmodern shibboleths.  Once you have used an established postmodern catchphrase, you are pretty well locked in, by reader expectation, to following with a typical, well-worn postmodern argument about how the victims of power suffer and the terrible things we already thought about power are even worse than we imagined--which is why most postmodern essays turn out to be convoluted on the surface, obvious underneath, disingenuous overall, and incredibly tedious to read.

Sunday 5 July 2015

Binary Thinking Versus the Other Kind

I still remember from my first-year-undergraduate “Philosophy of Mind” course that the human brain is incapable of thinking, imagining or understanding one thing in isolation without bringing in another, a background, a difference, an opposite.  You can test yourself by trying to think of just one thing.  The notion of a dialectic is based on the binary functioning of the mind; every concept contains its opposite:  the notion “long” requires “short,” “big” invokes “small.”  In an even more rudimentary fashion, in order to know a “thing,” you must be able to say what is “not that thing.”

If you have ever found yourself in a debate with a postmodernist, chances are the postmodernist turned on you at some point to announce dismissively, “oh, that’s binary thinking!”  The postmodernist’s gambit is based on the assumption of binary thinking.  The bluff works because you find yourself thinking, “Gee, there must be a superior, more advanced form of thinking that isn’t binary.”  Is there?

No, there isn’t, but the trickle-down effect of postmodern intellectualizing results in something like this claim from the online “Postmodern Literature Dictionary”:

“If you use ‘binary thinking,’ you are a person who sees no gray, no fuzziness between your categories. Everything is black or white.”

In postmodern speak “binary thinking” has become a synonym for the already well-known and understood idea of “simplistic thinking,” again with the implication that those “non-binary” thinkers must be smarter than the rest of us. How did we arrive at this “two legs bad” juncture?  

The cause is rooted in “poststructuralism,” the theoretical backbone of postmodernism.  In order to understand “poststructuralism” (literally “after structuralism,” and therefore a continuation and improvement of it), it is necessary to have some grasp of structuralism.  Structuralism is closely aligned with “semiotics,” a term coined by the linguist Saussure meaning the science of signs.  John Fiske offers a clear, accessible and succinct description of semiotics/structuralism in his Introduction to Communication Studies.

Semiotics is a form of structuralism, for it argues that we cannot know the world on its own terms, but only through the conceptual and linguistic structures of our culture. [. . . .] While structuralism does not deny the existence of an external, universal reality, it does deny the possibility of human beings having access to this reality in an objective, universal, non-culturally-determined manner. Structuralism’s enterprise is to discover how people make sense of the world, not what the world is.  (Fiske, 115)



Fiske’s description anticipates the core dispute in the feud which would eventually take place between postmodernists and empirical scientists like Sokal, as I have described in my post The Postmodern Hoax.  Current repudiations of “binary thinking” find their origin in a paper delivered by Jacques Derrida at a structuralism conference at Johns Hopkins University in 1966, entitled “Structure, Sign and Play in the Discourse of the Human Sciences.”  (The French-language original, "La structure, le signe et le jeu dans le discours des sciences humaines," is slightly more readable than the English translation.)


In this essay, Derrida dismantles (Derrida uses the term "deconstructs") the work of the anthropologist Claude Lévi-Strauss, in particular Lévi-Strauss's The Raw and the Cooked.  Although Derrida never explicitly refers to "binary thinking" or "binary opposition" in his essay, it is understood that the structure Lévi-Strauss uses--derived from the linguists Saussure and Jakobson and from all of structuralism--is the binary functioning of human thought, and that it is the target of Derrida's critical inquiry into the "structurality of structure" (“Structure, Sign and Play in the Discourse of the Human Sciences”).

The Longman anthology Contemporary Literary Criticism, in addition to a translation of Derrida's paper, offers as an addendum a transcription/translation of the discussion which took place between Derrida and the leading lights of structuralism immediately after his presentation.  It's interesting to see some of the finest minds in structuralism struggling to understand what the hell Derrida was talking about and, at the same time, to see Derrida cornered into giving a straightforward definition of "deconstruction."  Okay, "straightforward" is never a word that can be applied to Derrida, but with my ellipses eliminating all the asides and parentheses, this is what he said:  "déconstruction [. . .] is simply a question [. . .] of being alert [. . .] to the historical sedimentation of the language which we use [. . .]" (497). This is the definition of "deconstruction" that I typically gave students, and, at the same time, I pointed out that even though "deconstruction" was supposed to be something innovative, radical and distinctly postmodern, the Oxford English Dictionary has been "deconstructing" the English language for literally hundreds of years--meaning that the OED gives you the multiple meanings of a word and the year ("the historical sedimentation") in which a particular meaning/definition can be proven to have come into usage.

 Back to structuralist anthropology. As Fiske explains:
The construction of binary oppositions is, according to Lévi-Strauss, the fundamental, universal sense-making process. It is universal because it is a product of the physical structure of the human brain and is therefore specific to the species and not to any one culture or society. (116)
Contrary to popular understandings of "binary thinking,"  the whole point of structuralist anthropology (the binary approach) is to understand how societies, through their mythologies for example, deal with the failures of and exceptions to binary opposition.  Fiske applies the Lévi-Strauss approach to a Western and concomitantly demonstrates how the approach teases out subtextual themes at play in the movie, and how this particular interpretation of the film might stretch credibility.  Even today, 50 years later, it is difficult to fathom exactly what new, radical, distinctly postmodern objection Derrida is raising.  

Certainly it makes sense to challenge how binary thinking is applied in a particular case.  The objection isn't to binary thinking but to a particular application.  If you are going to launch a campaign against food on the grounds that it causes obesity, you should at the same time be ready to present an alternative to eating food, something that goes beyond the absurd claim that "eating food is bad."

Friday 26 June 2015

Falling in Love is Unprofessional

"Falling in Love and Crying in the Academic Workplace"

In the wake of Nobel laureate Professor Tim Hunt’s ironic comments on women in science, a draft article entitled “Falling in love and crying in the academic workplace: ‘Professionalism’, gender and emotion” has been circulating in social media.  

Do We Need Gender?

The challenge that this type of article faces, and that this one doesn’t quite overcome, is that such articles end up reinforcing the gender stereotypes they ostensibly set out to oppose.



I used to challenge students to imagine a world where the words (and concepts) “man” and “woman” didn’t exist, and we were all just people: some of us with brown eyes, some with blue, some of us left-handed, some of us right, some with vulvas, others with penises, some capable of bearing children, some better at lifting heavy objects--no absolute, mutually exclusive binary categories necessary.  Intellectually speaking we don’t “need” the categories “men” and “women.”  The intent of this “thought experiment” was to show the intellectual ease with which gender difference could be erased and to demonstrate how, in the abstract, gender is a fragile and superficial concept.  

However, the fact that students never showed much interest in the project of gender erasure shows how culturally attached we are to this dichotomy.  If I pushed the discussion, eventually a fastidious female would vociferously declare: “There is no way I want to share a bathroom with a bunch of smelly guys!”  End of discussion.

Stereotypes and Prejudices

The problem isn’t that gender differences and stereotypes exist, the problem, as Judith Butler would point out, is that these differences and stereotypes are policed and enforced.  There is a difference between a stereotype and a prejudice.  A stereotype is an extreme or rigid form of assigning type (“stereo” means “hard” or “firm”), but it usually has some basis in fact when applied in general to a large group of people. A prejudice is assuming and insisting that a stereotype applies to any and all individuals of a type or category.  It is a gender stereotype that men are physically stronger than women.  It is a scientifically verifiable correlation that, on average, people with penises enjoy more muscle mass than do those endowed with vulvas. 

Enforcing Stereotypes

The problem begins when this generalization is enforced on an individual and we tell John that he is failing as a man because he is not stronger than the average woman, and suspect Mary of not being a real woman because she is stronger than the average man and, of course, John and Mary cannot be a couple because she is stronger than he is; nonetheless John could get a construction job, but Mary can’t, etc, etc.  As a society, we extrapolate, police and enforce these stereotypes.

Solving Prejudice

How do we get beyond stereotypes and prevent them from devolving into prejudices?  It is too easy to say that stereotypes and prejudices are products of ignorance.  We are all ignorant and prejudiced in varying degrees.  In a world of Twitter, instant messaging and an up-to-the-minute news cycle, we are constantly being called upon to “pre-judge,” our sympathies and outrage solicited long before anything approaching a comprehensive knowledge of the facts is possible.  The only solution is to question and to withhold judgment until a sufficient number of facts have come our way, to rigorously apply our reading skills and logic to the facts available, and then to cut the world some slack without slipping into apathy.

The other solution when facing stereotypical differences is to consider other possible paradigms, other axes of comparison.  I admired that  in “Falling in Love and Crying in the Academic Workplace,” the author, Rachel Moss, at least temporarily shifted the discussion to “professionalism.”  Falling in love is unprofessional, mostly because the root of the word “amateur” is “amour,” “to love.”  Even in the study of theatre and drama, I have found ample reason to prefer amateur productions and performances over the professional, though the value system runs in the other direction.  It is not without reason that we describe prostitution as a profession.   It has its rules, and one of them is not falling in love.   

How to Talk about Cultural Differences

In my research I have tried to talk about some of the same differences that Rachel Moss discusses in her article.  I tried to talk about them as the differences between oral and visual cultures (following from Havelock, Ong and McLuhan), and when that didn’t quite work I turned to what John Vernon called “garden” and “map” culture.   Ultimately we have to admit that what we are talking about is “human” culture versus “machine” culture and our society shows an ever-increasing admiration for humans who behave like machines.

"You Fall in Love with Them, They Fall in Love with You"

On that note, a concluding word about Tim Hunt.  Apparently, he has two daughters who love his cooking, but I’ll bet he’s seen the girls cry when he criticized them.   His wife, Professor Mary Collins, was once his student.  So when he said the trouble with girls in the lab is that “you fall in love with them, they fall in love with you” could he have been thinking about himself and his wife?  What an amateur!




Tuesday 23 June 2015

After “the Death of the Author” It Only Takes 39 Words to End an Academic Career

39 Words versus curing cancer

It only takes 39 words to end an academic career even if you are a Nobel laureate in physiology . . . or maybe it’s because you are a Nobel laureate.  The sexist comments of the average schmuck don’t go viral on Twitter.

I can’t help imagining some futuristic Wikipedia article on “the cure for cancer.”  It would go something like this: “Professor Tim Hunt’s work on cell division proved instrumental in developing the cure for cancer; however, he became notorious and his career was ended in 2015 by his off-the-cuff remarks on women in science at a conference in Korea.”

The 39 words in question

According to The Guardian these are the 39 words which Professor Hunt uttered:


The Danger of irony

His wife, Professor Mary Collins, an immunologist, concurs with most of the critical commentary that “It was an unbelievably stupid thing to say.”  Hunt himself confessed apologetically,  “I was very nervous and a bit confused but, yes, I made those remarks – which were inexcusable – but I made them in a totally jocular, ironic way.”  (I’ve already covered the problems with irony but if you need a refresher see  Do No Harm Part II: Avoid Irony).



The Context is the meaning

No-one is denying that Professor Hunt said what he said, but my reason for commenting is that his words are being so widely reported and repeated out of context.  The context is the meaning.  The only way to understand what an action or an utterance means is to consider the context.  In saying this I know I am indirectly defending “the bad guys” (and "girls"):  the politician who complains of being quoted “out of context” and the adulterer who claims that the sex “didn’t mean anything.”  The truth is that politicians are frequently quoted out of context and their words attributed meanings that are different from, worse than or in complete opposition to their intentions.  And yes, a single act of coitus can be as meaningless as friction.  The only way to know what sex means is to consider the context, and the spectrum of possibilities ranges from criminal sadism to love.

To Read is to put a text in its proper context

For at least a generation now (the Twitter generation?), we have been training university students to read out of context.  As a professor of literature, I thought of my job as teaching my students to be the best possible readers, able to analyze and re-synthesize some of the best works that have ever been written.  Reading well meant having a thorough understanding and appreciation of the various contexts within which a work could be read.  As time marches on, the meanings of old works are constantly changing, but if we care about meaning, we have to consider the many contexts within which literature is/was written and read.

The "Death of the author" is the death of meaning

However, I noted with chagrin that many of my postmodernist professors and colleagues were quickly and firmly attached to Roland Barthes’ proclamation of “the Death of the Author.”  Fixed meanings were no longer possible, according to Barthes, because professional readers (i.e., postmodern professors) no longer considered the author (who she was, her context or intentions) when interpreting a literary work.  Looking at the author to determine the meaning of a text simply wasn’t done. Whether Barthes was reporting what he witnessed around him or was announcing what should and had to be, on the ground in university classrooms the idea of considering the life of the author as part of the study of a literary work had become so passé that it would be radical to consider this approach.

The "Death of the author" is power grab by pro readers

To my knowledge no-one has ever pointed out how self-serving the “Death of the Author” was for university professors.  In the new postmodern context, meaning no longer resided with the author but with the reader, and if you wanted to know what a literary work “really” meant (even though such an absolute was never possible) you had to turn to a professional reader, a professor of literature.  It was clearly a power grab, but no-one seemed to notice--or was it that no-one cared?

The precedents and procedures for quoting Professor Hunt out of context have been established and taught.  Everyone is invited to posture self-righteously by attacking him and his un-contextualized utterances.

Tim Hunt is the context of his remarks

When that gets old, we might consider challenging the “Death of the Author” and taking to heart Professor Collins’ observation that what her husband said “could be taken as offensive if you didn’t know Tim” and her assurance that “he is not sexist. I am a feminist, and I would not have put up with him if he were sexist.”

What are the proper contexts within which we should read Professor Hunt’s utterance?  My counsel is that we need to be conscious that we are reading different contexts and, in this case, Tim Hunt is one important context of the utterance, not the other way around.  We won’t get the meaning of Tim Hunt by reading the 39 words he uttered in Korea.

Friday 12 June 2015

Mateus da Costa, the Very First, Original, Authentic, Pure Laine Québécois de Souche and the Real Santa Claus (with Addendum)

Here’s a scenario I used to play out for undergraduate students.

Your roommate comes home from Christmas shopping  and announces enthusiastically that he just saw a guy at the mall who looks “just like the REAL Santa Claus!”

 You, an adult sceptic, reply in your most practiced sarcastic tone, “Duhh! Dude, there is no REAL Santa Claus!”

Most people over the age of eight might agree with you, but you have to admit that you sort of understand what your roommate means.  In fact, on second thought, you understand exactly what he means:  he saw an elderly, roly-poly gentleman with white hair and beard, rosy cheeks and a twinkle in his eye, dressed in a red suit and cap trimmed with ermine.  The man he saw captured with surprising precision the various quintessential images of Santa Claus he has seen on TV, in movies, on posters, on Christmas cards and in Coca-Cola commercials.


The lesson here is that what we typically consider “real” and “true” are those ideas, images and notions that fit with what we already happen to believe, the ideas and icons that our culture has preconditioned us to accept.  Even though what we might spontaneously describe as “real” and “true” may have nothing to do with facts, logic, science, truth or reality, our feelings about what is true and real have enormous influence in our lives.

One of the casualties of last year’s PQ-Marois-Drainville “charter of values” ploy (in addition to the Marois government itself) was the expression “Québécois de souche.”   Until the 1970s a Québécois was a citizen of Quebec City.  The idea of identifying all residents of the province of Quebec as “Québécois” didn’t become current until the mid-70s.   The original expression, whose origin I have not been able to trace or date, was “Québécois de vieille souche” (literally “from the old stump”) and is usually translated as “old stock.”  

Since the word “Québec” is Mi'kmaq (for “where the river narrows”), it’s pretty obvious that the Mi'kmaq and their First Nations brothers and sisters are the oldest stock in Quebec and Canada, but “Québécois de vieille souche” implies being able to trace your lineage to the first European settlers.  Over time, “Québécois de souche” has come to mean any resident of Quebec with a French-sounding name who happens to speak French.

As a resistant expression of pride in the heritage, culture and history of a disadvantaged, oppressed and denigrated minority, I seconded and celebrated the expression “Québécois de souche,” but context changes meaning. In fact, the context is the meaning.  Certainly some commentators had long claimed that the expression smacked of ethnocentrism and xenophobia, but when a nationalist government with aspirations of statehood under the slogan “Nous sommes un peuple” (“We are a people/nation”) came forward with a proposed “charter of values” to rewrite the existing Quebec charter of individual rights and freedoms and to guarantee that the history of Quebec could only be retold in one correct, Catholic, French way, it became impossible to disassociate the expression “Québécois de souche” from ethnocentrism and xenophobia.

But there is a way to salvage the expression.  Just as the iconic Santa Claus is taken to be based on Saint Nicolas, the 4th-century monk born in what is now Turkey, we might ask (and answer) who was the first, original, authentic, pure laine Québécois: the person who sailed from Europe to settle in New France in advance of all other Europeans, and whom we could identify as the primordial source of the expression “Québécois de souche.”  My candidate is Mateus da Costa, for the very simple reason that he was Samuel de Champlain’s secretary on the voyage to settle New France, and da Costa was chosen for the job because he already knew the native languages, which implies that he had lived in what would become New France for some years before Champlain’s voyage.

Mateus da Costa must have been a brilliant, resilient and resourceful man.  He was of Black-African descent and a resident of Portugal, but most of his history is based on speculation from contracts and court documents.  We know from contract documents that his services were much in demand for anyone who wanted to explore the new lands across the Atlantic.

Mateus da Costa’s Portuguese connection is important in order to understand how it is possible for him to have settled, at least temporarily, in what would become New France prior to Champlain.  The evidence is ample that Portuguese fishers travelled back and forth to Canada, and in particular to Newfoundland’s Grand Banks well in advance of Champlain, Jacques Cartier and Christopher Columbus.  Part of the evidence is the frequency of Portuguese place names all over Newfoundland.  

(Living in Portugal, I discovered that “canada” is a Portuguese word.  Literally it means “here is nothing” (“cá nada”), but it is used as an equivalent of “rural route” in Portuguese addresses . . . but I digress.)

My point is simply that the next time you hear or use the expression “Québécois de souche” perhaps this (speculative) image of Mateus da Costa should come to mind, and dissipate any sub-text of xenophobia.

Wednesday 14 January 2015

Terrorism and Madness: Between Sympathy and Understanding

When I was researching the uses of madness in literature I came across a paradox from the philosophy of causality. If you are able to analyze the etiology, the causes, of madness, you can no longer claim that what you have analyzed is madness.  You can’t claim that you have found rational causes and effects for a behaviour, and continue to claim that the behaviour is mad.  

Thomas Szasz, a trained and practicing psychiatrist, made a career out of denying the existence of mental illness.  According to Szasz, madness was a “legal fiction.”  Like other such fictions--things that could not be proven to  exist--it was useful for institutions like hospitals, the courts, police forces, governments, and so on, to pretend that they existed, in order to establish procedures and policies for how to deal with particular situations, behaviours and people.

In recent days and weeks and years, the distinction between madness and terrorism has become a matter of significant debate.  Acts which are identified as “terrorism” supply significant political capital to governments interested in the paradoxical project of curtailing the citizenry’s rights and freedoms in the name of protecting those same rights and freedoms.  Madness, on the other hand, is perceived as an individual, personal, private phenomenon, outside the purview of government.  Announcing a war on mental illness has none of the purchase, cachet or logic of a war on terrorism.

I have come to the conclusion that in both cases we are dealing with “legal fictions,” but legal fictions with powerful, even lethal, consequences and repercussions.  Although we might assume that putting actions and behaviours into particular categories is designed to help us understand and deal with them, historically these two legal fictions--terrorism and madness--have been used to prevent us from looking further into their causes.  As legal fictions, we can deal with them without having to understand them.

If you suggest that we should attempt to “understand” terrorist acts, you will be accused of weakness, naivety, inexperience and, ultimately, of “sympathizing” with the enemy.  In some quarters, escalation is considered the only legitimate response to terrorism.  In the Middle Ages, the common treatment for madness was to lock the madman in a dark room, then starve and beat him.1  Do you see a parallel?

But really, what are the risks of our trying to understand the causes of terrorism?  Is it that, as with madness, if we begin to understand, we will cease to believe that terrorism is terrorism?  What is the 21st-century alternative to our understanding terrorism?

I get that the intention of a terrorist act is to make a point.  If we admit that we get the point, then the terrorist can claim success.  The dominant, current strategy is a repeated public message that we simply do not understand.  In fact, the message often goes beyond just not understanding all the way to  . . . well, perhaps “madness” is too dramatic a word, but “irrational” and “illogical” seem un-dramatic and appropriate descriptors.   

The example I am thinking of is that it has become common to describe terrorist acts in which terrorists have sacrificed their own lives as “cowardly.” I can see the argument that suicide is cowardly, a refusal to brave life’s hardships, and that terrorists attack “soft” rather than military targets--children, women, men, civilians all. Nonetheless, of all the derogatory descriptions that could be applied to terrorists--evil, immoral, bestial, cruel, inhuman, misguided, foolhardy, disgusting, tyrannical, heartless, mindless, fanatical--what is the logic of describing them as “cowardly”?

Bill Maher lost his job as the host of the TV show Politically Incorrect for contradicting President George W. Bush’s claim that the 9/11 attack on the World Trade Center was “cowardly.”  “Say what you want about it,” Maher said, “not cowardly.”



This insistence on the idea that terrorism must be described as cowardly brought to my mind the penultimate chapter of George Orwell’s 1984.  Winston is in prison, under the thumb of his interrogator and torturer, O’Brien, and realizes that he has been under the surveillance of the Thought Police for the last seven years.  He writes these three sentences in an attempt to prove that he has recuperated and is ready to think whatever the regime thinks:

FREEDOM IS SLAVERY.

TWO AND TWO MAKE FIVE.

GOD IS POWER.



Whatever stance we adopt toward terrorism, we must ensure that our position makes sense and we do not become what we oppose.

1 Shakespeare used this treatment of madness for comic effect in his play Twelfth Night.

Sunday 11 January 2015

The Many Ways that the Evaluations of Teachers Go Astray

The difficulty of arguing against the evaluation of teachers is that it seems so obvious, so common-sensical:  you evaluate the teachers and, as a result, the good ones get better, the incorrigibles move on, and education overall improves.  Despite the rock-solid logic of the abstract theory, in practice these evaluations typically turn out to be half-baked, ill-conceived, closed-minded, stifling, corrupt, vacuous, bluffing attempts at intimidation, bureaucratic absurdities, and grounds for peevishness and petty jealousies.  Thanks to the good luck of the right chemistry and the generosity of my students, I enjoyed favourable and gratifying evaluations throughout the last nineteen years of my teaching.  In terms of self-interest, I should be strongly in favour of, in fact a defender of, these evaluations.  I am in favour of some system, procedure or custom that would allow teachers’ work to be recognized, supported, encouraged, rewarded and improved, but evaluations as they are carried out never seem to come close to these objectives.  I am only in favour of “some system” because without it good teaching would receive no recognition at all, but I remain ambivalent about whether this pursuit of recognition improves or undermines the quality of education overall.



Prior to my being evaluated by students, my teaching was evaluated by a “Senior Teacher” who, despite the title, was a full-time administrator, not a teacher, and who would observe one class of my choosing every year.  I was a temporary Senior Teacher myself for a couple of years, and despite my determination to be conscientious in my observations, feedback and support, the tokenism of the process was beyond obvious, not to mention that every teacher’s objective was to get through the observation and not have to think about it for another year.  Tokenism, defensiveness and scapegoating replaced what should have been happening:  regular meetings of teachers to discuss teaching, exchange ideas and offer invitations to observe each other’s classes.  These are the things that never happened at the university level and rarely happened in the other institutions where I taught.  To get at the real problems of evaluation you have to look at the details of specific cases.

For example, when I taught at a military college, the novelist Roch Carrier, who was Principal of the college, announced to the gathered faculty with an aporetic shrug that 80% of us had been evaluated as excellent the previous year.  “How could 80% of the faculty be excellent?” he asked rhetorically.  A directive had been received from Ottawa, the bureaucratic equivalent of “the word of God,” specifying that for the current year only 20% of teachers could be assessed as excellent.  I wondered how Roch Carrier would take it if the media decided that one of his novels could not receive a favourable review because the 20% quota of excellent novels had already been reached.

For teachers of English like me, the situation was even worse because there were only four of us.  As our Senior Teacher pointed out, if he gave even one of the four of us an excellent assessment that would be 25% and in contravention of the directive from Ottawa.  He bravely suggested that since we had all been excellent teachers throughout our careers, he would be our Don Quixote and rebel against the regulations if we agreed to take turns being assessed as excellent teachers.  Each year, by mutual agreement, one of us would be excellent and the other three would volunteer to be something less.  The idea that teachers’ performances were actually being evaluated went out the window, and bureaucracy reigned. 
