
Saturday 5 September 2015

The Truth about English Grammar

The “joke” below about the distinction between “can” and “may” crossed my Facebook feed a couple of times this week.  The last time I heard this joke I was around 10 years old, meaning more than 50 years ago, so I am more than a little surprised that anyone today would comment on the distinction between “can” and “may” in making polite requests, or even think that such a distinction exists.  Nonetheless the post has received tens of thousands of likes and shares.


After nearly 40 years of teaching English Language and Literature, I think I can say, based on my own authority and that of most grammar books published in the last three or four decades, that if there ever was a polite-request distinction between “can” and “may” it disappeared at least 40 years ago. 

I have a strong suspicion that the reason this kind of false distinction persists is that if you press a less-than-fully-competent teacher of English or an uninformed speaker of the language to explain the difference between, say, “going to” and “will,” or between “have seen” and “saw,” or between “a few” and “a number of,” they will retreat into the pure nonsense claim that one is more “polite and formal” than the other.  Euphemisms and dysphemisms notwithstanding (see the Sour Glossary for definitions), the distinction between polite and less polite grammar disappeared from English usage in the 16th century with the abandonment of the distinction between “you” and “thee” (the equivalents of the French “vous” and “tu”).  “Thee,” “thou” and “thine” persisted in prayers, bible translations and poetry precisely because these words had disappeared from common usage and, being rarefied, seemed special and more poetic.  Ironically, “thee,” “thou” and “thine” are the less polite, less formal, less deferential forms, yet they are the most typical way for Christians to address God.

Register is an important concept in linguistics.  It reminds us that different words and expressions fit better in particular contexts.  The language you use with your grandparents will likely be different from the language you use with your friends.  If the context is legal or scientific or formal, we expect to hear words and expressions that fit that particular context, but the idea that certain common grammatical expressions, particularly modal auxiliaries like “can,” “should,” “may,” “might,” “will” and so on, can be distinguished from one another on the basis of politeness is simply wrong.

This being said, the truth about “correct” English grammar is that it is simply a collection of the most recently accepted errors.  Just about anything that is now considered “good English” was a mistake at some point in history.  You know all those words in English that end in “e” (like “bite,” “kite,” “courage,” “wide”):  we were supposed to pronounce those final “e”s, but since most people in the 16th century were mistakenly leaving them silent, the silent final “e” became the correct thing to do.  It’s an idea worth keeping in mind if you have ever had your self-esteem battered by a know-it-all grammar maven, and based on the vitriolic responses to the above post, it seems like a lot of people have.

Based on these comments, you might think I am opposed to grammar.  On the contrary, I think the posting above proves that we need to re-introduce grammar instruction into the school system.  The problem with the above posting (beyond satirizing a problem that hasn't existed for 40 years) is that it mistakenly describes "Can I borrow . . ." as a "colloquial" irregularity.  It isn't.  Grammar is constantly changing and at this point in time, "Can I borrow a pencil?" is correct standard English.

Tuesday 1 September 2015

Will the Government Use C-51, Anti-Terrorism Legislation, to Track Canadian University Students with Outstanding Loans?

Ottawa has instructed the Canada Revenue Agency (CRA) to be more aggressive in collecting outstanding student loans.  According to the Globe and Mail:

The Government annually has to write off some of the $16 billion owing in student loans for a number of reasons:  a debtor may file for bankruptcy, the debt passes a six-year legal limit on collection, or the debtor can’t be found.  (B2, 31 Aug 2015)
For more detail on how the government has barred university graduates from declaring bankruptcy and extended the 6-year limit to 15 years, see my earlier post When Should You Repay Your Student Loan? How about . . . Never!  However, the real cause (“90% of cases”) of non-payment is that the CRA has lost track of student borrowers because “the CRA wasn’t allowed to ask other departments for help because of privacy laws” (B2, 31 Aug 2015).

What the Globe article doesn’t mention is the possibility of using C-51, anti-terrorism legislation, to solve the problem.  In case you have forgotten, the official title of the legislation is the “Security of Canada Information Sharing Act” and the purpose of the Act is 

TO ENCOURAGE AND FACILITATE INFORMATION SHARING BETWEEN GOVERNMENT OF CANADA INSTITUTIONS IN ORDER TO PROTECT CANADA AGAINST ACTIVITIES THAT UNDERMINE THE SECURITY OF CANADA

You might not think of a Canadian University grad with a student loan as a terrorist, but that’s because you have forgotten how the Conservative Government has used C-51 to re-define terrorism.  Here, unabridged, is how C-51, which is now law, defines terrorism:

The following definitions apply in this Act.

“activity that undermines the security of Canada” means any activity, including any of the following activities, if it undermines the sovereignty, security or territorial integrity of Canada or the lives or the security of the people of Canada:

interference with the capability of the Government of Canada in relation to intelligence, defence, border operations, public safety, the administration of justice, diplomatic or consular relations, or the economic or financial stability of Canada;
With students currently owing $16 billion in loans, and a good chunk of them refusing to pay up (28% in 2004, 13% in 2014), guess what!  They are potentially interfering with “the economic or financial stability of Canada” and therefore qualify as terrorists under C-51.



Both the Conservative and Liberal Parties are in favour of C-51; only the NDP has promised to repeal this legislation.


Sunday 2 August 2015

“Be Yourself!” Is This Really Good Advice?

I’m not sure telling people to be themselves is good advice, but my saying so never seemed to have much purchase with undergraduates in my Intro to Lit course.   The sadist, the homicidal maniac, the pedophile--aren’t they “being themselves” when they commit their crimes?  Shouldn’t we tell people, and ourselves, to “be better”?


The context of the discussion was H.G. Wells’s short story, “The Country of the Blind.”  Nunez, a mountain climber in the Andes, tumbles in an avalanche into the Country of the Blind--a society cut off from the world for over 14 generations, one which has adapted to the fact that everyone living there is blind.  The concept and all memory of sight have disappeared.  Nunez struggles and fails to explain to the people that he is a superior being because he can see.  They perceive him as inferior, unformed, a missing link in the evolutionary chain, stricken with occasional delusions and bouts of violent madness.  Nonetheless this world is prepared to offer him an idyllic life: peace, sustenance, acceptance, the requited love of a beautiful woman.  But in exchange Nunez must agree to surrender his eyes (which doctors have concluded are tumors causing his madness).


The story is an allegory of cultural blindness; my challenge to students was always to recognize the various perspectives from which the allegory might be applied.  In general, students were a bit too quick to condemn the people of the Country of the Blind for their refusal to “see” beyond their own culture.  Some students remained adamant in their refusal to abandon this position, insisting that the story should be viewed from only one perspective, that of Nunez looking down on the inferiority of the Country of the Blind. 

In the face of this conviction, I pointed out some of the merit of this position.  The Nunez story was typical in our time:  an immigrant arrives in a new culture bringing with her the baggage of her own culture and a host of superior skills associated with that culture.  We, the receiving, settler culture, having been in place for generations, are the people of the Country of the Blind.  

For some students this was a tough pill to swallow, but I invited them to consider how we would react, as individuals and as a society, if someone ragged, ill-kempt, poorly spoken and perceptibly alien aggressively insisted that he was superior, explaining this superiority with words that had no meaning to us.  Our prisons, asylums, homeless shelters and streets are filled with such people.

It was fairly easy to win a few adherents to this perspective.  For students who had lived the immigrant experience the allegory was beyond obvious.  Others I invited to generalize the experience by imagining that they brought skills, talents and abilities to a new social group, and to think about how quickly and easily any social group (club, team, neighbourhood, school, peer group, etc.) would accept a new member expecting to be “king.”  (Nunez’s mantra in the story is “In the country of the blind the one-eyed man is king.”)  Above all, we expect immigrants and newbies to be humble in fact and manner.

And so the discussion progressed, with students slowly, tentatively approaching the realization that we are all culturally blind.  We all view our own culture not only as what is best but what is normal, real, and paradoxically “natural.”  The one culture that we can never see clearly is our own.  Trying to understand your own culture is like a fish trying to understand water.

As we reached the end of the story, I would hit a wall.  In the final line of the story it appears that Nunez commits suicide rather than surrender his sight.  (In fact, a coherent and typical reading of the story is that Nunez does not survive the avalanche at the opening of the story, and the entire narrative is his imaginings in the moments before death.  This is an interpretation that I never presented or pursued for the simple reason that it would preempt much of the productive exchange that the story provokes.)

When I criticized and even mocked Nunez’s apparent suicide, the backlash response from students was tidal:  “he was being true to himself,” “true to who he was,”  “true to his beliefs and principles,” “true to what he loved and found beautiful--seeing,” “he was refusing to give up who he was,”  “he was being himself!”  Yeah, maybe!

In the first place I wasn’t going to surrender to a romantic advocacy of suicide (see Do No Harm).  In passing I would mention that the child psychologist Piaget defined intelligence as the ability to adapt.  Then I would underline the illogic of Nunez’s apparent decision to choose “seeing” over “living”--a decision that was neither intelligent nor admirable.  Finally, I would point out that advocating Nunez’s decision was itself an example of cultural blindness--that we live in a visual culture, one that emphasizes and exaggerates the importance of seeing (see Ong, Havelock, McLuhan and Falling in Love is Unprofessional), and that like Nunez we fail to “see” beyond the visual culture that we have come to accept as no less than life itself.

To further the point, I would typically bookend the course with Kurt Vonnegut’s short story “Who Am I This Time?” collected in the anthology Welcome to the Monkey House.


The satiric short story was adapted into a made-for-TV romantic comedy on PBS, also called “Who Am I This Time?”  The story focuses on a young couple who defy “being themselves” by finding happiness in acting and playing roles with one another.


Monday 13 July 2015

Postmodern Shibboleths


In contemporary usage a “shibboleth” is a word or style or behaviour or custom which identifies you as being part of an in-group--or not.  Postmodern shibboleths are numerous.  If you encounter people who consistently say “discourse” when they mean “theme,”   “the signified” when they mean “the meaning,”  or “deconstruct” when they mean “analyze,” you can be sure you are dealing with postmodernists.  



In the not-too-distant past the ultimate identifier of a postmodernist was the frequency with which s/he used the word “postmodern”--although this might be taken more as the team cheer than as a shibboleth.  Upon encountering anything that was kitsch, ironic, self-referential, or lacking unity, coherence and conclusion, the postmodernist would loudly declare, in the hope that someone might overhear, that it was postmodern.

The irony of postmodernism is that its only redeeming social value has been the promotion of tolerance, yet the postmodern catchphrase “political correctness”--a hallmark of intolerance--promises to outlive postmodernism itself.  A postmodernist is someone who can tell you, with conviction, to shut up, while arguing in favour of the right to free speech.


One of the postmodern concepts which I have found to be occasionally useful is “the subject.”  In postmodern speak “the subject” stands in for a variety of possibilities:  the self, the individual, the person, the ego, the “I,” and is a strong counterpoint to the soul, the personality, character and spirit.  In attempting to use “the subject” in my writing, I discovered the other side of employing postmodern shibboleths.  Once you have used an established postmodern catchphrase, you are pretty well locked in, by reader expectation, to following with a typical, well-worn postmodern argument about how the victims of power suffer and the terrible things we already thought about power are even worse than we imagined--which is why most postmodern essays turn out to be convoluted on the surface, obvious underneath, disingenuous overall, and incredibly tedious to read.

Sunday 5 July 2015

Binary Thinking Versus the Other Kind

I still remember from my first-year-undergraduate “Philosophy of Mind” course that the human brain is incapable of thinking, imagining or understanding one thing in isolation without bringing in another, a background, a difference, an opposite.  You can test yourself by trying to think of just one thing.  The notion of a dialectic is based on the binary functioning of the mind; every concept contains its opposite:  the notion “long” requires “short,” “big” invokes “small.”  In an even more rudimentary fashion, in order to know a “thing,” you must be able to say what is “not that thing.”

If you have ever found yourself in a debate with a postmodernist, chances are the postmodernist turned on you at some point to announce dismissively, “oh, that’s binary thinking!”  The postmodernist’s gambit depends on the assumption that binary thinking can be surpassed.  The bluff works because you find yourself thinking, “Gee, there must be a superior, more advanced form of thinking that isn’t binary.”  Is there?

No, there isn’t, but the trickle-down effect of postmodern intellectualizing results in something like this claim from the online “Postmodern Literature Dictionary”:

“If you use ‘binary thinking,’ you are a person who sees no gray, no fuzziness between your categories. Everything is black or white.”

In postmodern speak “binary thinking” has become a synonym for the already well-known and understood idea of “simplistic thinking,” again with the implication that those “non-binary” thinkers must be smarter than the rest of us. How did we arrive at this “two legs bad” juncture?  

The cause is rooted in “poststructuralism,” the theoretical backbone of postmodernism.  In order to understand “poststructuralism” (literally “after structuralism,” therefore a continuation and improvement of) it is necessary to have some grasp of structuralism.  Structuralism is closely aligned with “semiotics,” a term coined by the linguist Saussure meaning the science of signs.  John Fiske offers a clear, accessible and succinct description of semiotics/structuralism in his Introduction to Communications Studies.

Semiotics is a form of structuralism, for it argues that we cannot know the world on its own terms, but only through the conceptual and linguistic structures of our culture. [. . . .] While structuralism does not deny the existence of an external, universal reality, it does deny the possibility of human beings having access to this reality in an objective, universal, non-culturally-determined manner. Structuralism’s enterprise is to discover how people make sense of the world, not what the world is.  (Fiske, 115)



Fiske’s description anticipates the core dispute in the feud which would eventually take place between postmodernists and empirical scientists like Sokal, as I have described in my post The Postmodern Hoax.  Current repudiations of “binary thinking” find their origin in a paper delivered by Jacques Derrida at a structuralism conference at Johns Hopkins University in 1966, entitled “Structure, Sign and Play in the Discourse of the Human Sciences.”  (The French-language original, "La structure, le signe et le jeu dans le discours des sciences humaines," is slightly more readable than the English translation.)


In this essay, Derrida dismantles (Derrida uses the term "deconstructs") the work of anthropologist Claude Lévi-Strauss, in particular Lévi-Strauss's The Raw and the Cooked.  Although Derrida never explicitly refers to "binary thinking" or "binary opposition" in his essay, it is understood that the structure Lévi-Strauss uses--derived from the linguists Saussure and Jakobson and from all of structuralism--is the binary functioning of human thought, and that this structure is the target of Derrida's critical inquiry into the "structurality of structure" (“Structure, Sign and Play in the Discourse of the Human Sciences”).

The Longman anthology Contemporary Literary Criticism, in addition to a translation of Derrida's paper, offers as an addendum a transcription/translation of the discussion which took place between Derrida and the leading lights of structuralism immediately after his presentation.  It's interesting to see some of the finest minds in structuralism struggling to understand what the hell Derrida was talking about and, at the same time, to see Derrida cornered into giving a straightforward definition of "deconstruction."  Okay, "straightforward" is never a word that can be applied to Derrida, but with my ellipses eliminating all the asides and parentheses, this is what he said:  "déconstruction [. . .] is simply a question [. . .] of being alert [. . .] to the historical sedimentation of the language which we use [. . .]" (497).  This is the definition of "deconstruction" that I typically gave students, and, at the same time, I pointed out that even though "deconstruction" was supposed to be something innovative, radical and distinctly postmodern, the Oxford English Dictionary has been "deconstructing" the English language for literally hundreds of years--meaning that the OED gives you the multiple meanings of a word and the year ("the historical sedimentation") in which a particular meaning/definition can be proven to have come into usage.

Back to structuralist anthropology.  As Fiske explains:
The construction of binary oppositions is, according to Lévi-Strauss, the fundamental, universal sense-making process. It is universal because it is a product of the physical structure of the human brain and is therefore specific to the species and not to any one culture or society. (116)
Contrary to popular understandings of "binary thinking,"  the whole point of structuralist anthropology (the binary approach) is to understand how societies, through their mythologies for example, deal with the failures of and exceptions to binary opposition.  Fiske applies the Lévi-Strauss approach to a Western and concomitantly demonstrates how the approach teases out subtextual themes at play in the movie, and how this particular interpretation of the film might stretch credibility.  Even today, 50 years later, it is difficult to fathom exactly what new, radical, distinctly postmodern objection Derrida is raising.  

Certainly it makes sense to challenge how binary thinking is applied in a particular case.  The objection isn't to binary thinking but to a particular application.  If you are going to launch a campaign against food on the grounds that it causes obesity, you should at the same time be ready to present an alternative to eating food, something that goes beyond the absurd claim that "eating food is bad."





Friday 26 June 2015

Falling in Love is Unprofessional

"Falling in Love and Crying in the Academic Workplace"

In the wake of Nobel laureate Professor Tim Hunt’s ironic comments on women in science, a draft article entitled “Falling in love and crying in the academic workplace: ‘Professionalism’, gender and emotion” has been circulating in social media.  

Do We Need Gender?

The challenge that this type of article faces, and that this one doesn’t quite overcome, is that such articles end up reinforcing the gender stereotypes they ostensibly set out to oppose.



I used to challenge students to imagine a world where the words (and concepts) “man” and “woman” didn’t exist, and we were all just people: some of us with brown eyes, some with blue, some of us left-handed, some of us right, some with vulvas, others with penises, some capable of bearing children, some better at lifting heavy objects--no absolute, mutually exclusive binary categories necessary.  Intellectually speaking we don’t “need” the categories “men” and “women.”  The intent of this “thought experiment” was to show the intellectual ease with which gender difference could be erased and to demonstrate how, in the abstract, gender is a fragile and superficial concept.  

However, the fact that students never show much interest in the project of gender erasure shows how culturally attached we are to this dichotomy.  If I pushed the discussion, eventually a fastidious female would vociferously declare: “There is no way I want to share a bathroom with a bunch of smelly guys!”  End of discussion.

Stereotypes and Prejudices

The problem isn’t that gender differences and stereotypes exist, the problem, as Judith Butler would point out, is that these differences and stereotypes are policed and enforced.  There is a difference between a stereotype and a prejudice.  A stereotype is an extreme or rigid form of assigning type (“stereo” means “hard” or “firm”), but it usually has some basis in fact when applied in general to a large group of people. A prejudice is assuming and insisting that a stereotype applies to any and all individuals of a type or category.  It is a gender stereotype that men are physically stronger than women.  It is a scientifically verifiable correlation that, on average, people with penises enjoy more muscle mass than do those endowed with vulvas. 

Enforcing Stereotypes

The problem begins when this generalization is enforced on an individual:  we tell John that he is failing as a man because he is not stronger than the average woman; we suspect Mary of not being a real woman because she is stronger than the average man; and, of course, John and Mary cannot be a couple because she is stronger than he is--though John could get a construction job, and Mary can’t, etc., etc.  As a society, we extrapolate, police and enforce these stereotypes.

Solving Prejudice

How do we get beyond stereotypes and prevent them from devolving into prejudices?  It is too easy to say that stereotypes and prejudices are products of ignorance.  We are all ignorant and prejudiced in varying degrees.  In a world of Twitter, instant messaging and an up-to-the-minute news cycle, we are constantly being called upon to “pre-judge,” our sympathies and outrage solicited long before anything approaching a comprehensive knowledge of the facts is possible.  The only solution is to question and to withhold judgment until a sufficient number of facts have come our way; to apply our reading skills and logic rigorously to the facts available; and then to cut the world some slack without slipping into apathy.

The other solution when facing stereotypical differences is to consider other possible paradigms, other axes of comparison.  I admired that in “Falling in Love and Crying in the Academic Workplace” the author, Rachel Moss, at least temporarily shifted the discussion to “professionalism.”  Falling in love is unprofessional, if only because the root of the word “amateur” is “amour,” love.  Even in the study of theatre and drama, I have found ample reason to prefer amateur productions and performances over the professional, though the value system runs in the other direction.  It is not without reason that we describe prostitution as a profession.  It has its rules, and one of them is not falling in love.

How to Talk about Cultural Differences

In my research I have tried to talk about some of the same differences that Rachel Moss discusses in her article.  I tried to talk about them as the differences between oral and visual cultures (following from Havelock, Ong and McLuhan), and when that didn’t quite work I turned to what John Vernon called “garden” and “map” culture.   Ultimately we have to admit that what we are talking about is “human” culture versus “machine” culture and our society shows an ever-increasing admiration for humans who behave like machines.

"You Fall in Love with Them, They Fall in Love with You"

On that note, a concluding word about Tim Hunt.  Apparently, he has two daughters who love his cooking, but I’ll bet he’s seen the girls cry when he criticized them.   His wife, Professor Mary Collins, was once his student.  So when he said the trouble with girls in the lab is that “you fall in love with them, they fall in love with you” could he have been thinking about himself and his wife?  What an amateur!




Tuesday 23 June 2015

After “the Death of the Author” It Only Takes 39 Words to End an Academic Career

39 Words versus curing cancer

It only takes 39 words to end an academic career, even if you are a Nobel laureate in physiology . . . or maybe it’s because you are a Nobel laureate.  The sexist comments of the average schmuck don’t go viral on Twitter.

I can’t help imagining some futuristic Wikipedia article on “the cure of cancer.”  It would go something like this:  “Professor Tim Hunt’s work on cell division proved instrumental in developing the cure for cancer; however, his career ended in 2015 when he became notorious for off-the-cuff remarks on women in science at a conference in Korea.”

The 39 words in question

According to The Guardian these are the 39 words which Professor Hunt uttered:


The Danger of irony

His wife, Professor Mary Collins, an immunologist, concurs with most of the critical commentary that “It was an unbelievably stupid thing to say.”  Hunt himself confessed apologetically,  “I was very nervous and a bit confused but, yes, I made those remarks – which were inexcusable – but I made them in a totally jocular, ironic way.”  (I’ve already covered the problems with irony but if you need a refresher see  Do No Harm Part II: Avoid Irony).



The Context is the meaning

No-one is denying that Professor Hunt said what he said, but my reason for commenting is that his words are being so widely reported and repeated out of context.  The context is the meaning.  The only way to understand what an action or an utterance means is to consider the context.  In saying this I know I am indirectly defending “the bad guys” (and "girls"):  the politician who complains of being quoted “out of context” and the adulterer who claims that the sex “didn’t mean anything.”  The truth is that politicians are frequently quoted out of context and their words attributed meanings that are different from, worse than or in complete opposition to their intentions.  And yes, a single act of coitus can be as meaningless as friction.  The only way to know what sex means is to consider the context, and the spectrum of possibilities ranges from criminal sadism to love.

To Read is to put a text in its proper context

For at least a generation now (the Twitter generation?), we have been training university students to read out of context.  As a professor of literature I thought of my job as teaching my students to be the best possible readers, to be able to analyze and re-synthesize some of the best works that have ever been written.  Reading well meant having a thorough understanding and appreciation of the various contexts within which a work could be read.  As time marches on, the meanings of old works are constantly changing, but if we care about meaning, we have to consider the many contexts within which literature is, and was, written and read.

The "Death of the author" is the death of meaning

However, I noted with chagrin that many of my postmodernist professors and colleagues were quickly and firmly attached to Roland Barthes’ proclamation of “the Death of the Author.”  Fixed meanings were no longer possible, according to Barthes, because professional readers (i.e., postmodern professors) no longer considered the author (who she was, her context or intentions) when interpreting a literary work.  Looking at the author to determine the meaning of a text simply wasn’t done. Whether Barthes was reporting what he witnessed around him or was announcing what should and had to be, on the ground in university classrooms the idea of considering the life of the author as part of the study of a literary work had become so passé that it would be radical to consider this approach.

The "Death of the author" is power grab by pro readers

To my knowledge no-one has ever pointed out how self-serving the “Death of the Author” was for university professors.  In the new postmodern context, meaning no longer resided with the author but with the reader, and if you wanted to know what a literary work “really” meant (even though such an absolute was never possible), you had to turn to a professional reader--a professor of literature.  It was clearly a power grab, but no-one seemed to notice--or was it that no-one cared?

The precedents and procedures for quoting Professor Hunt out of context have been established and taught.  Everyone is invited to posture self-righteously by attacking him and his un-contextualized utterances.

Tim Hunt is the context of his remarks

When that gets old, we might consider challenging the “Death of the Author” and taking to heart Professor Collins’ observation that what her husband said “could be taken as offensive if you didn’t know Tim,” and her assurance that “he is not sexist. I am a feminist, and I would not have put up with him if he were sexist.”

What are the proper contexts within which we should read Professor Hunt’s utterance?  My counsel is that we need to be conscious that we are reading different contexts and that, in this case, Tim Hunt is one important context of the utterance, not the other way around.  We won’t get the meaning of Tim Hunt by reading the 39 words he uttered in Korea.

"Three Days of the Condor" and the Tenth Anniversary of "The Sour Grapevine"

Sharing Intelligence I'm still obsessing over " sharing intelligence ."  May 15th was the tenth anniversary of this blog.  I w...