Wednesday 26 March 2014

“Critical Thinking Skills” and “Family Values”

“Critical thinking skills” and “family values”:  these days it is typical to imagine that these concepts are dichotomous.  In the binary thinking of those who espouse strident opposition to binary thinking, the two expressions are mutually exclusive.  In other words, it is assumed that if you have any “critical thinking skills” you cannot believe in “family values.”  What strikes me is how much these phrases have in common.

What these locutions share is the fact that their literal, obvious, word-for-word, face-value meanings are no longer what they mean.  “Family values” doesn’t mean that you value family.  “Critical thinking skills” as taught in most universities aren’t skills and rarely show signs of clear thinking, though they are invariably critical.  In both cases, these expressions have taken on a level of meaning that the essayist Roland Barthes calls “mythology.”  In simpler terms, their connotations (what these phrases suggest) have become more important, more widely and significantly understood, than their denotations (the literal meanings of the words).

These days the expression “family values” tends to suggest (more than anything else) the value system associated with the evangelical, religious right in the USA.  This domination and precedence of connotation over denotation is confirmation of the theory associated with Mikhail Bakhtin that how words are used over time affects their meaning as much as the dictionary definition.  In fact, how they are used eventually becomes the dictionary definition.  What “family values” has come to mean is a result of the fact that the expression has historically been used to oppose family planning (at the turn of the 20th century it was a crime to send contraceptive devices through the mail, for example) and as justification for denying employment to women.  “Family values” was another, nicer way of saying “a woman’s place is in the home.”  “Family values” could be used as a basis to attack not only abortion, but homosexuality, lesbianism and various forms of non-procreational sex.

Just as the expression “family values” has come to signal an attitude more than what the words themselves mean, “critical thinking” has become code for left-wing, materialist, feminist thinking and attitudes.  As it happens, I have always been of the opinion that if you exercise critical thinking skills they will eventually lead you to left-wing, materialist, feminist thinking and attitudes.  The problem, of course, is that if I as a professor profess my left-wing, materialist, feminist leanings and conclusions to my students and they follow along and agree with me, at no point are they actually exercising their own critical thinking skills.  I am understating the case.  In fact, university students are measured by the degree to which they reject and rebel against right-wing ideologies, patriarchy and idealism or dualism.  The problem isn’t with the conclusions, but with the process, which is basically that they are being taught a series of opinions as if they were religious dogma.  Having absorbed this teaching, students are encouraged to expect good marks for having the “right” opinions without having demonstrated the logical reasoning skills which led them to these conclusions.

The causes of this malaise are not abstract or purely academic.  The demise of what “critical thinking” should be was provoked by the rise of deconstruction and the concomitant, haphazard decline of university departments of philosophy.   Most of the theory which paraded under the banner of deconstruction was nonsensical.  I saw Jacques Derrida being interviewed on French television a couple of years before his death, and he seemed honestly embarrassed to be the father of deconstruction.  He insisted that it was not a theory of any importance, not even a theory, not even a word that he used anymore.  However, in true Derridean, deconstructionist fashion he subsequently used the word at least a half dozen times in answering the final question of the interview.  I came to understand what “deconstruction” was (and more importantly what it wasn’t) by reading John Ellis’s succinct monograph Against Deconstruction, published in 1989. 



As Ellis points out, when the promoters attempted to define it, they typically defined deconstruction as an attack on or opposition to “logocentrism.”  The challenge then became to try to understand what “logocentrism” was, only to discover that deconstructionists were as foggy and obscure about defining logocentrism as they were about deconstruction itself.  Here is Derrida’s comment on logocentrism from the opening sentence of his seminal work, Of Grammatology:

[ . . .]  “le logocentrisme  : métaphysique de l’écriture phonétique (par exemple de l’alphabet) qui n’a été en son fond -- pour des raisons énigmatiques mais essentielles et inaccessibles à un simple relativisme historique -- que l’ethnocentrisme le plus original et le plus puissant, [. . .].”

In English, without the multiple parentheses:  “logocentrism:  the metaphysics of phonetic writing [. . .] which was at its base [. . .] nothing but the most original [meaning earliest] and most powerful ethnocentrism [. . .].”

I have done my best not to play games with the translation.  It is clear that “logocentrism” is like “ethnocentrism” and, therefore, to people like me who live in and admire multicultural society, logocentrism must be something bad.  The single sentence from which I have taken this quotation runs for 400 words.  (Okay, I only counted the first 175 and estimated the rest.)  No, I still don’t know what logocentrism is, but I do know that “logos” is the Greek word for “reason” and “logic,”  and that in the opening sentence of Of Grammatology, as run-on and gobbledygook-ish as it is, Derrida, by attacking reason and writing as Western prejudice, digs himself a hole that neither he nor anyone else can dig out of.


At exactly the same moment that Derrida was turning logic, reason and clear writing into objects of suspicion, universities were following the established business model and downsizing the study of philosophy on the grounds of a lack of sex appeal.  Logic and reason, of which departments of philosophy were the crucibles, were being hammered from both sides.  The remnants of the collision are the glib or purple descriptions of “critical reasoning skills” on university web sites which bury logic and reason somewhere in the hinterland of a third paragraph or fourth priority.

Tuesday 18 March 2014

When Should You Repay Your Student Loan? How about . . . Never!

  To Owe Is to Own

Do you remember this line:  “So Romeo would, were he not Romeo called, / Retain that dear perfection which he owes / Without that title”?  “Owes” used to mean “owns.”

I’ve been reading David Graeber’s book Debt:  The First 5,000 Years.  It begins with the American proverb:  “If you owe the bank a hundred thousand dollars, the bank owns you.  If you owe the bank a hundred million dollars, you own the bank.”  I can remember my father telling me this.  I thought he made it up.

Avoiding Debt

I’ve always had an aversion to debt.  I think it has something to do with my mother, as she was leaving for work when I was five, telling me, “If there is a knock at the door, just sit on the floor and be quiet.  Don’t answer the door; they might be bailiffs.”  Of course, like everyone else, I’ve understood that it is impossible to get on in the world without a car loan, a mortgage, a credit card and a line of credit.  Nonetheless, I’ve always been fairly obsessive about paying my debts as soon as possible.  You too, I imagine.


Why Must Debts Be Paid?

Graeber is an anthropologist and he must have been a good teacher because the book is full of those “dumb questions” that a student might ask which turn out to be really profound, epiphanic, inspiring and unanswerable.  For example:  “Why should we pay our debts?”  And the corollary:  “Why are we so absolutely convinced that we should pay our debts?” Or, “What is money?” 



Is Barter Really the Root of Economics?

Graeber’s ambition in the book is to dispel the preconceived notions that come to us through the study of economics--that discipline created by Adam Smith in 1776 at the University of Glasgow--such as the notion that “barter,” people exchanging one commodity or service for another, is a primordial, primeval human activity as well as the historical basis of economics, and that we are morally obliged to pay debts.  It intrigued and amused me to learn that economists (and anthropologists) are unable to trace the historical origins of money or agree upon a definition of what it is.  The camps divide into those who think of money as a commodity (meaning that it is worth something, like gold and silver coins) and those who think money is an IOU (a way to contract and measure debt).  These days it seems obvious that money is either paper or pixels, and not worth anything in itself.  Graeber argues that money and debt are pretty much the same thing:  money is a measuring system (like meters and feet) and debt is what money measures.  If you have a twenty-dollar bill on you, it means that the government of the country that issued it owes you twenty dollars worth of something.  The problem these days is “of what?”

The "Yellow-Brick Road":  The Gold Standard

In the old days, the government was supposed to have enough gold in storage so that all the money it issued could, in theory if not practice, be exchanged for gold--what was known as “the gold standard.”  It was interesting for me (I’m a lit prof, remember) to discover that L. Frank Baum’s The Wonderful Wizard of Oz, published in 1900, was an allegory in opposition to the gold standard.  Farmers in Kansas needed government loans, but the treasury refused them because the USA didn’t have sufficient gold reserves.  The “yellow-brick road” was the illusion of the gold standard leading to a fraud, the Wizard of Oz, who was just an ordinary old man (Oz is the abbreviation for ounce, the typical way gold was measured).  The gold standard came to an absolute end in the USA in 1971 when Richard Nixon announced that American dollars could no longer be redeemed for gold.


Money Is a Debt that Can Never Be Paid

Even today there are people who believe that gold is the only real money.  However, according to a CBC documentary I watched not so long ago, a lot of individuals and countries hold papers that say they own gold, but the gold doesn’t actually exist.  So if you have an American twenty, the American government owes you twenty dollars worth of something, but not gold.  No need to worry about gold, because it is fairly obvious that the American government has no intention of paying its IOUs, period.  Economics really does become the purview of literary critics, I mean people skilled in analyzing works of fantasy and imagination, when you consider that the USA currently has a debt of over 17 and a half trillion dollars.  That’s American dollars, so it owes trillions of its own IOUs.

You Can Create Your Own Money

Think about it.  You’re out having a beer with your friend George and he runs out of money.  You pay the bar bill and George gives you an IOU.  The same thing happens a week later and the week after that.  Instead of getting George to pay, when you go out to dinner with Rosemary, you pay your half of the bill by giving her a couple of George’s IOUs.  George keeps writing IOUs and pretty soon everyone in your and George’s circle of friends has George’s IOUs.  When does it end?  It would end when people start refusing to accept George’s IOUs, but why would they?  If they keep getting what they want (beer and meals; maybe they can even get the bar and restaurant owners to start using George IOUs), what motivation is there to stop accepting George’s IOUs?  At a certain point, everyone knows that George is never going to pay off all his IOUs, but it is in everyone’s interest to pretend that he can and will.  It will all end when George has given out so many IOUs (like say 17 trillion) that people can no longer pretend to believe in them or George himself refuses to write any more IOUs, even though he is only being asked to write IOUs to pay a debt that he owes in his own IOUs (because at a certain point George began to receive and repay in his own IOUs).
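The mechanics of the George scenario can be made concrete with a toy ledger.  Everything here (the names, the amounts) is invented for illustration; the point is simply that transferring an IOU between holders leaves the total outstanding unchanged, while issuing a new one increases it:

```python
# A toy model of the George scenario: IOUs circulating as money.
# All names and amounts are invented for illustration.

from collections import defaultdict

class IOUEconomy:
    def __init__(self, issuer):
        self.issuer = issuer
        self.holdings = defaultdict(int)  # who holds how many of the issuer's IOUs

    def issue(self, to, amount):
        """The issuer pays a debt by writing new IOUs: total outstanding grows."""
        self.holdings[to] += amount

    def transfer(self, frm, to, amount):
        """A holder spends the IOUs; total outstanding is unchanged."""
        assert self.holdings[frm] >= amount
        self.holdings[frm] -= amount
        self.holdings[to] += amount

    def outstanding(self):
        return sum(self.holdings.values())

economy = IOUEconomy("George")
economy.issue("you", 40)                  # George covers two bar bills with IOUs
economy.transfer("you", "Rosemary", 20)   # you pay Rosemary with George's IOUs
economy.issue("Rosemary", 30)             # George keeps writing them

# The IOUs changed hands, but what George owes grows only when he issues more.
print(economy.outstanding())  # 70
```

In this sketch George’s IOUs behave exactly like money: what circulates is a claim on George, and the “money supply” grows only when he writes new IOUs--which is the essay’s point about government-issued currency.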

USA:  The Most Indebted Country in the World

Both of these scenarios have taken place in the US government recently.  The USA has the biggest debt in the world at 17 and a half trillion dollars, or over $55,000 per citizen, but every single country in the world is doing the same thing and is in debt.  The only debate is about those countries that we know will never be able to pay off their debts.  Even in these cases, it seems like they “own” the banks and the countries that they owe money to, and we hear constantly that they can’t be allowed to go bankrupt.  Governments are considered conservative and fiscally responsible if they announce the intention to balance their budgets and eliminate deficits, meaning to stop going further into debt, some year in the future.

Canadian and Québécois Debt Levels

In this context, should a recent university graduate pay back her student loan?  Jeez, I don’t know!  But here are some of the facts of the Canadian/Quebec case.  The Canadian debt is currently approaching $700 billion, which means we each (every man, woman and child) owe over $20,000.  (Yep, if you were born yesterday, you are already in hock for $20,000.)  The Quebec debt is at $265 billion.
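The per-citizen figures can be checked with back-of-the-envelope arithmetic.  The population estimates below are my own rough assumptions (circa 2014), not figures from the post or any official source:

```python
# Rough check of the debt-per-citizen figures; populations are assumptions.

canada_debt = 700e9            # approaching $700 billion
canada_population = 35e6       # roughly 35 million people (assumed)

quebec_debt = 265e9            # $265 billion
quebec_population = 8e6        # roughly 8 million people (assumed)

print(canada_debt / canada_population)  # 20000.0 -- over $20,000 each
print(quebec_debt / quebec_population)  # roughly $33,000 each
```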

History of Non-repayment of Canadian Student Loans

Beginning in the 1990s Canadian students started to borrow a lot more money and had increasing difficulty paying back their loans when they graduated.  In 1980 around 9% of graduates were unable to repay their student loans; by 1990 the level of non-payment was at 17%.  By 1997, non-payment of student loans reached a total of 70 million dollars.  The federal government passed a law making it illegal for a student to declare bankruptcy until 10 years after graduation.  The Canadian Federation of Students took the government to court claiming discrimination under the Canadian Charter of Rights and Freedoms but lost the case on the grounds that student borrowers were not considered a social group.  The government later reduced the length of time before a student could declare bankruptcy to 7 years (thereby falling into line with what Graeber identifies as the ancient Judaic tradition of forgiving loans after the sabbath--or seventh--year).

In 2003-2004 Canadian student loans amounted to $1.63 billion.  The same year, 28% of student borrowers (43,600 former students) defaulted on their loans, totaling 331 million dollars.  The government’s solution to the student debt crisis has been to create a Repayment Assistance Plan whereby students’ loan payments will be capped at 20% of their gross family income and the maximum repayment period will be 15 years.  For 15 years (or until the loan is paid) you will have to fill out a form every 6 months requesting permission to pay only one-fifth of your gross family income toward your student loan.  I’m not sure that I would find myself jumping for joy at these conditions, and it seems clear that the real objective is to ensure that the government gets its money back (or at least more than it was getting when 28% of former students were defaulting).
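A sketch of what that cap means in practice.  The income figure is hypothetical; the 20% rate and the 15-year limit are the plan’s terms as described above:

```python
# Repayment Assistance Plan cap, sketched with an invented income figure.

def max_annual_payment(gross_family_income, cap_rate=0.20):
    """Loan payments are capped at 20% of gross family income."""
    return gross_family_income * cap_rate

income = 40_000                     # hypothetical gross family income
annual_cap = max_annual_payment(income)
print(annual_cap)                   # 8000.0 -- the most payable per year

# Over the 15-year maximum repayment period, the cap alone still allows
# up to 15 * 8000 = 120,000 dollars to be collected.
```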

Canadian Student Debt Is Equivalent to a Small Country

The Canadian Federation of Students has begun to maintain its own “debt clock” showing how much Canadian students owe in Canada Student Loans.  The amount is now over 15 billion.  If they keep going, eventually they will be able to declare themselves a country (they are currently between Jamaica and Guatemala in debt size), or maybe the Federation will come to the realization that its members own this one.

Sunday 16 March 2014

What Is the Relationship Between University Education and Employment?

What Is the Relationship Between University Education and Employment?  The official answer is always absolute:  you need the diploma to get a decent job.  At ground level the answer is a matter of degree--in both senses of the word.  With some degrees the answer is redundant:  you get an accounting degree to become an accountant, a medical degree to become a doctor, an engineering degree to become an engineer.  More or less.  To be honest, most of the engineers I know work in sales.  Outside the obvious cases, the relationship between a particular degree and employment is a matter of debate.  On the other hand, if a graduate from a BA in English becomes a news broadcaster on local TV, you can be sure that “television journalist” will be added to the list of employment outcomes for that degree in the university calendar and on the web site.



Inside humanities programs, the adamant answer is that a university degree is not job training.  It’s hard not to angle your nose toward the sky while saying this.  Holding a university degree proves that you know how to learn, not that you have learned any particular X or Y.  The average person will hold seven different jobs during a working career.  The degree has to transcend the specifics of any one job.

I don’t remember where the “seven different jobs” claim comes from, but I can remember using it fairly frequently.  And I believe that the degree should prove that you know how to learn, but how is it possible for a professor to know that a student knows how to learn without confirming that the student has learned a particular X or Y? We claim that our students learn “critical thinking skills” but how can we verify those “critical thinking skills”?  Do we ask the student to write an essay to demonstrate critical thinking skills?  If the student mimics the critical thinking skills the professor has demonstrated, does she get an A?  Or should she fail because she didn’t disagree with the professor and thereby failed to be critical?  I have found answers to these rhetorical questions in my own teaching but I have no idea how other professors dealt with them.  In my entire university career, not once did I address these questions with a colleague, or in a department or faculty meeting.  Pedagogy is just not one of the subjects that university professors talk about.

Early in my career at the University I was associated with a BA in Professional Writing.  I taught a few courses on writing and applied grammar, and was part of the committee that evaluated the program with external experts and business people.  This was a program closely aligned, at least in theory, with employment opportunities and the business community.  Students in the program went on paid internships during their studies, and the most typical reason that students didn’t graduate was that employers offered them jobs before they had finished the degree.  On an ongoing basis we had twice as many available internships as students.  However, we were never able to attract enough students to justify the program’s existence.  The experience taught me that “a job” was not the most profound attraction when students were choosing their undergraduate field of studies.  The cachet, the possibilities, however unlikely, that a degree suggested were far more attractive than the guarantee of work.

My colleagues who didn’t teach writing made it very clear to me both directly and indirectly, that a degree in Professional Writing was something to be looked down upon.  Teaching writing skills was the bottom of the prestige ladder at universities, something to be assigned to the least qualified, non-ascending personnel.  Students were supposed to learn how to write in high school.  Writing skills were simply below the level of university education.  These colleagues had a point, except that they were the first people to complain bitterly about their own students’ lack of writing skills.  It seemed inappropriate to me to denigrate the courses and the people who were trying to solve the problem, but that is what happened.

Professional Writing had another strike against it within the academy.  Whether it was right-wing snobbery or left-wing ideology didn’t matter; it was clear that a degree that kowtowed to business and/or was closely tied to students’ getting jobs was considered beneath the ethos of university studies.  I was susceptible to these pressures and prejudices, and as time wore on I came to teach, almost exclusively, the more prestigious literature courses (which were of course disparaged by the faculties teaching the professions) while remaining nostalgic for my old writing and grammar courses.  My dilemma was solved when the University closed down Professional Writing because of low enrollment.

On the other hand, I came to understand the shortcomings of attempting to tie a university degree to a particular form of employment, and to the business community in general, when I served on the committee to evaluate our Professional Writing program.  I remember one internet entrepreneur being very insistent that a professional writer should know at least three computer coding languages.  Outside the university, Professional Writing tended to mean Technical Writing, which implied a degree in science or engineering prior to the writer honing his writing skills.  This example notwithstanding, the general rule is that a university education has to supply a much greater knowledge base than any entry-level work position will require, but it also has to be a guarantee that a graduate has full control of a portfolio of requisite skills.


My misgiving concerns the growing tendency that I have witnessed in universities to abandon any responsibility for skills training and to show only minimal concern for exactly what knowledge a student is acquiring.  I have witnessed and been a participant in the lengthy processes involved in attempting to develop a program of studies at both the undergraduate and graduate levels.  However, once a program exists, an entirely different level of forces comes into play which will determine exactly what any individual student is going to study and learn in a particular program:


  • the popularity of certain courses (students do get to “choose” courses, but the truth is that in any given semester the choices are likely to be very limited; courses and programs that don’t attract students get cancelled);
  • budgets (courses that require extra funds or have low enrolments get cancelled);
  • available teaching personnel (as lecturers unionize, they gain collective agreements which give them priority to teach courses that have been assigned to them in the past; if a particular lecturer is deemed not up to the job, the easiest and perhaps only solution is to cancel the course, and courses are cancelled when no one deemed qualified can be found);
  • what tenured faculty feel like teaching (tenured faculty have a very strong if not absolute influence on the courses they themselves teach; a professor might, for example, insist on only teaching courses directly related to his research--and be accommodated; the most heated conflict I ever witnessed first-hand was between two professors over which would teach the graduate seminars).


Programs do, of course, specify “required” and “optional” courses, but these requirements tend to be very flexible.  Professors, administrators, and students themselves can get around requirements with equivalences, reading courses, and exemptions according to the exigencies of the moment.  In the end, what an individual student ends up studying (within the very loose confines of a program’s design) is left to the student’s inclinations and to chance.  As a professor, and even as a program director, I never once sat down with a student’s complete transcript at the end of her degree to consider whether the courses a particular student had actually done, as a whole, made sense.  There was never any discussion of what a student had actually done, how it related to the designed objectives of the program or how it might relate to employability.  This situation, which verges on haphazard, is celebrated in university calendars as students’ being able to “customize their undergraduate studies.”

Thursday 13 March 2014

How Universities Have Promoted the Unemployment Crisis

Some examples of how universities have promoted an unemployment crisis are already well known.  The faculties of education in Ontario producing thousands more teachers than the school system can absorb is an egregious example.  Universities are also responsible for the glut of PhDs on the market because universities have a vertical monopoly, being both the exclusive producers and major employers of PhDs.  It is hard to argue that the universities have handled their monopoly in a more enlightened fashion than the robber barons of the past.

It is worth stopping to consider what it takes to get a PhD.  Steps one and two are a four-year BA and a two-year MA in order to apply for admission to a PhD.  Universities have lots of strategies on the books to shorten or even eliminate the MA, but most students go through the MA process and take longer than the two years suggested in university calendars--six years is a conservative estimate of how long you need to study just to apply to a PhD program.  Once you have been accepted into a PhD program, there is a better than 50% possibility that you won’t finish. If you do finish, the average length of time the PhD takes is seven years, the mode (the number which occurs most frequently) is ten years.  If you are part of that happy minority that actually finishes the PhD, there is a better than 50% chance that you won’t get the tenure-track university teaching position which was your original purpose in starting the PhD in the first place.  If you are part of that lucky minority that finished the degree and got a tenure-track position, chances are you are in your late thirties.  The people you went to high school with have been working for twenty years by the time you have nailed down your first permanent, full-time job.  If you happened to have gone to school with Steve Jobs and Bill Gates (who both dropped out as undergrads), you will probably have noticed that the time it took you to do your PhD was about as long as it took them to develop Apple and Microsoft respectively.

Who would choose to do graduate studies under these conditions?  Well . . . someone like me, slightly idealistic, overly optimistic and drawn to the intellectual environment of the university, in part because it was secure and comfortable.  The university offered the comforting illusion that I was heading somewhere and improving myself.  In keeping with the boy scout (and girl guide) motto, I was getting prepared, even though I didn’t know for what.  The hardcore facts were that I wasn’t turning down any solid job offers in order to continue studying.  I viewed language teaching as a stop-gap measure.  It took me ten years to accept that teaching was my profession.  My alternative to a PhD was working as a waiter in LA while I tried to flog film scripts.  Studying was my best percentage option even if it might turn out to be no more than a hobby while I earned a living as a language teacher.  And if you are really lucky--as I was--it’s hard to imagine a better job than teaching in a university.

I tried to be upfront with my own graduate students about the prospects, but I kept hearing, and I keep hearing, that demographics will solve the problem soon.  Nonetheless, let’s consider a few stats.  According to the Canadian Association of University Teachers (CAUT) Almanac there were 41,934 university teachers in Canada in 2010.  Let’s imagine the number is 50,000 now.  In 2011, Canadian universities gave out 5,736 PhDs.  Let’s round that number down to 5,000 to simplify the math.  If we produce 5,000 PhDs every year, within 10 years we will have produced enough new candidates to fill all of the 50,000 university teaching positions which now exist.  The idea that the entire university teaching corps would replace itself every 10 years isn’t part of anyone’s rosiest daydreams.  It’s not happening, and it’s not going to happen.  Of course, the problem would be solved (in Canada) if the number of university teaching positions increased every year by 10%.  We know that hasn’t happened and isn’t going to happen either.  So what is happening?
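The back-of-the-envelope math in the paragraph above, made explicit (50,000 and 5,000 are the rounded figures used there, not precise counts):

```python
# Replacement arithmetic for the Canadian professoriate,
# using the rounded figures from the text above.

positions = 50_000         # approximate university teaching positions in Canada
phds_per_year = 5_000      # rounded annual PhD production

# Years of PhD output needed to fill every existing position once:
print(positions / phds_per_year)           # 10.0

# Annual growth in positions needed to absorb every new PhD:
print(f"{phds_per_year / positions:.0%}")  # 10%
```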

University professors who could retire aren’t retiring.  According to the CAUT Almanac:  “As mandatory retirement laws have been rescinded in a number of provinces in recent years, the proportion of full-time university teachers in Canada, employed as teachers beyond the common retirement age of 65, more than quadrupled between 2001 and 2011.”  A few weeks ago there was a PBS documentary on exactly this subject:  university professors refusing to retire.  The theme which ran through various professors’ explanations of why they were staying on was that occupying their university positions made them feel good.  The purpose of a university is not to make university professors feel better about themselves, and it is astounding that it should be necessary to say so.

However, the number of positions occupied by professors over 65 is probably a drop in the bucket.  According to a Services Canada report on future employment as a university teacher in the province of Quebec, “the number of university professors is forecasted to rise slightly over the next few years.”  Three causes of this increase, according to the report (which all sound questionable to me), are:  the expansion of universities, professors’ leaving to pursue other careers, and retirements.  According to the report:
 “Openings will arise first from the need to replace university professors, a relatively large number of whom will be retiring. In fact, the average age of these professors is considerably higher than in the work force as a whole. The proportion of university professors aged 55 and over in 2006 was much higher than that of all occupations (32% compared with 15%, according to census data)." 
This is a claim that I have repeated to my graduate students over the years.  However, as I read this report which is based on information from MELS (the French-language acronym for the Quebec ministry of education, recreation--"loisir" in French--and sport) which in turn comes from Quebec universities themselves, I am struck by how a fairly negative situation (from the point of view of Canadian /Quebec job seekers to whom the report is addressed) is given a hyperbolically positive spin.
For example, the report claims:  “This occupation [university professor] posts serious shortages in certain disciplines, including engineering, computer science and medicine.” “Including” is the tricky word in this sentence.  In common discourse what usually follows the word “including” is either a fairly comprehensive list or, in the other direction, some surprising inclusion.  We all know that engineers, computer scientists and doctors are in demand.  The sentence (and the report as a whole) encourages us to believe  that “university professor” is an in-demand category of profession in all areas and that there is a huge shortage of Canadian PhDs.  The report avoids any mention of the “certain disciplines” where there are no shortages and, in fact, there is an over supply. 
Rather than addressing the issue of over-supply, the report focusses on the causes of the shortage of university professors:  “competition from universities in the United States [and . . . ] competition from the rest of the labour market, which often pays these highly qualified professionals better.”  Anyone who holds a PhD and has been unable to get a tenure-track university teaching position will interpret the basic facts of the report in direct opposition to the report’s tenor and conclusions.
For example, the report notes that “only 37% of employed PhDs were working in universities in 2006.”  The report’s interpretation of this fact is, as we have seen, that PhDs are being lured to the USA and/or the private sector.  The interpretation for those people holding a PhD in the Humanities and Social Sciences (the area which accounts for more than two thirds of all university students and graduates) is that there are no university positions available in their fields and they have been forced to look elsewhere to earn a living.
Continuing on the theme that there is a shortage of PhDs in Quebec, the report announces the good news that “the doctoral student body jumped by close to 60% between 2000-2001 and 2009-2010. This labour pool is expected to increase 3,1% [3.1%] per year during our forecast period (2012-2016), according to Quebec Department of Education, Recreation and Sport's [sic] (MELS).”  Once again, if you happen to hold a PhD right now and are looking for work, this “good news” is really bad news for you because the supply of PhDs has increased in recent years and is going to keep increasing in the years ahead.  
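The compounding behind that forecast is easy to check. Here is a minimal sketch of the arithmetic, assuming a hypothetical base index of 100 for 2012 (the report cites only the 3.1% growth rate, not an absolute pool size):

```python
# Sketch of the report's forecast arithmetic: a labour pool growing
# 3.1% per year over the forecast period 2012-2016.
# The base value of 100 is an illustrative index, not a figure
# from the source.

base = 100.0   # hypothetical index value for 2012
rate = 0.031   # 3.1% annual growth cited in the report

pool = base
for year in range(2012, 2017):
    pool *= 1 + rate  # compound one year of growth

# Five annual increments at 3.1% grow the pool about 16.5% overall.
print(round(pool, 1))  # ~116.5
```

In other words, on top of the 60% jump already recorded, the forecast adds roughly another sixth to the supply of doctoral candidates.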
This report, which is supposed to be informing Canadians looking for work about their job prospects, continues to emphasize the shortage of qualified candidates by noting that “30% of university professors in 2006 had received their degrees abroad” and that “in 2006, the percentage of immigrants in this occupation was three times higher than in all occupations (33% compared with 12%).”  There is nothing untoward in this information, unless you happen to be a Canadian with a PhD from a Canadian university unable to get a job.
While I am on this theme, the report on “Imbalances Between Labour Demand and Supply - 2011-2020,” published online by Employment and Social Development Canada, offers these projected shortages and surpluses (among many others):

Shortages are projected over the next 10 years in some high-skilled occupations
  • Business, Finance and Administration Occupations -- Occupations in Shortage: Human Resources and Business Service Professionals (NOC 112), Administrative and Regulatory Occupations (NOC 122)
  • Natural and Applied Sciences and Related Occupations -- Occupations in Shortage: Other Engineers (NOC 214), Architects, Urban Planners and Land Surveyors (NOC 215), Mathematicians, Statisticians and Actuaries (NOC 216)
  • Health Occupations -- Occupations in Shortage: Managers in Health, Education, Social and Community Services (NOC 031), Physicians, Dentists and Veterinarians (NOC 311), Optometrists, Chiropractors and Other Health Diagnosing and Treating Professionals (NOC 312), Therapy and Assessment Professionals (NOC 314), Nurse Supervisors and Registered Nurses (NOC 315), Medical Technologists and Technicians (NOC 321), Assisting Occupations in Support of Health Services (NOC 341)
  • Occupations in Social Science, Education, Government Service and Religion -- Occupations in Shortage: Managers in Health, Education, Social and Community Services (NOC 041), Judges, Lawyers and Quebec Notaries (NOC 411), College and Other Vocational Instructors (NOC 413), Policy and Program Officers, Researchers and Consultants (NOC 416)


Surpluses over the next 10 years are projected in low-skilled occupations, manufacturing and in trades and transportation
  • Business, Finance and Administration Occupations -- Occupations in Excess Supply: Managers in Communication (NOC 013), Secretaries, Recorders and Transcriptionists (NOC 124), Clerical Occupations, General Office Skills (NOC 141), Office Equipment Operators (NOC 142), Library, Correspondence and Related Information Clerks (NOC 145), Recording, Scheduling and Distributing Occupations (NOC 147)
  • Natural and Applied Sciences and Related Occupations -- Occupations in Excess Supply: Computer and Information Systems Professionals (NOC 217), Technical Occupations in Physical Sciences (NOC 221)
  • Occupations in Art, Culture, Recreation and Sport -- Occupations in Excess Supply: Managers in Art, Culture, Recreation and Sport (NOC 051), Technical Occupations in Libraries, Archives, Museums and Art Galleries (NOC 521), Athletes, Coaches, Referees and Related Occupations (NOC 525)


Take note of what you will from these predictions, but I noticed the projected need for college and vocational teachers--yes--but university professors--no!  Did you notice in the Job Futures report that Quebec will need more professors of computer science?  Did you also notice the prediction (above) that we are on the verge of an oversupply of “Computer and Information Systems Professionals”?
How have universities promoted the unemployment crisis?  Thus far, the summary answer to the question is:  1) little effort is being made to tie university programs to the employment market (more on this question in a future posting), 2) universities are simply not being straightforward with students on what the employment possibilities are for the degrees they are pursuing, 3) universities are knowingly producing more PhDs (and other degrees) in certain fields than the employment market can absorb, 4) professors are no longer being required or encouraged to retire, 5) Canadian universities are not hiring the PhDs that they themselves produce.

All of this is a partial, fragmentary answer to the question, but the global answer is that universities have been increasingly compelled to adopt, and have with varying degrees of willingness embraced, a vision of themselves as independent businesses.  The basic business model is that you make a profit by keeping your labour costs low.  Like the robber barons of the past, universities have taken advantage of their monopoly on PhD production in order to create a surplus of labour and a shortage of tenure-track university teaching positions.  The result is a cheap labour force of part-time, occasional and sessional lecturers and adjunct faculty which does most of the actual university teaching in Quebec, in Canada and in North America.  The upshot works on the balance sheet, but it doesn’t work where it really matters: in the lives of PhD graduates who merit tenure-track university teaching positions.  And it doesn't work for students, who deserve access to professors fully committed to teaching in their fields.

Sunday 9 March 2014

You get the degree, then you get the job--right?

You get the degree, then you get the job--right?  When did that ever happen?  Not in my lifetime.  Well, not quite never.  I remember one of my senior colleagues at his retirement party talking about when he had just completed his PhD and had to decide which of three university offers he was going to accept.  As he sagely pointed out, that was a time that must seem like fantasy land to young PhDs these days.  Many of the professors who taught me in my first year as an undergrad in 1970 got their jobs on the basis of a PhD in progress.  Without any statistical back-up, my intuition is that the days of “you get the degree and then you get the job” ended around 1970.

The reason is pretty simple, straightforward demographics--“the baby boom.”  If you have read Boom, Bust and Echo, you can fine-tune the demographics to your own case.  I can remember when the accepted wisdom was that two years of high school was enough to get you into the labour market.  Smart people graduated from high school, and post-secondary education was considered really heady stuff--mixed with a good dose of cynicism.  The standard joke where I grew up was “you went to university to get a BS degree, then you got a PhD--Piled Higher and Deeper.”  Jokes notwithstanding, I was convinced that my only chance of a good life was getting into university, and I was far from convinced it would ever be possible--but I did get in.  Except it seemed that so did everyone else, and the new truism was that a university degree, in terms of employment, was the equivalent of Grade 10 (two years of high school) in the old days, only maybe not quite as good because there weren’t as many job openings as there used to be.

I remember the TA (Teaching Assistant, a PhD student--we only got to see the Prof on closed-circuit TV) in my first-year psychology course telling me, when he heard that I was doing a BA in English, that I would never get a job. I don’t remember exactly how I responded but no doubt I said something like “I’m not here to get a job; I’m here to get an education” which I had heard said a dozen times before, and a hundred times since.   In the back of my mind I was of course praying he was wrong.  

As it turned out, in my case, he was wrong.  I got my first teaching job while I was in the 4th year of my BA and I’ve never been unemployed since.  This sounds good, but keep in mind I’ve never said "no" to a job offer and every job I’ve ever held I’ve felt like I just managed to squeeze in and was lucky to have it.  My early jobs teaching ESL had little if anything to do with my education but I was always struggling to make a connection.  In fact, I wasn’t able to make substantial use of what I learned in my BA until twenty-five years later--I was lucky.

The disconnection between university education and employment isn’t new; in fact, I suspect its origins trace all the way back to the beginnings of universities, but recently the situation has gotten acute.  One study suggests that unemployment rates for recent university graduates have reached 25%; another claims that university graduates are actually earning less than high-school graduates.  An American study claimed that 60% of university graduates are either unemployed or under-employed (working part time, at a McJob, etc). 

Last week I read a “good news” article published online by one of my former students vaunting the fact that, based on a longitudinal study conducted by Statistics Canada on the “Labour Market Premiums Associated with a Postsecondary Education,” a BA degree correlates to an additional $724,000 in earnings.  The study followed university students from the 70s and concluded that, over the 20-year period analyzed, men with a BA earned $724,000 more than high-school graduates.  Women earned $442,000 more.  Since the men under study were in their 50s in the 90s (roughly my cohort), the study really doesn’t have anything to say about the current crisis, except perhaps the implied wishful thinking that history will repeat itself.

More than anything else, the article made me think about how the author, my former student, was an interesting case study of the Byzantine twists and turns between point A and point B, between university education and employment.  He seems to be doing very well, but I remember him being very straightforward that he wasn’t interested in “a job”; he wanted to be “a writer” and there was “no plan B.”  He works as a writer and editor for a headhunting company.  He gained some notoriety as an undergraduate by publishing a poetry chapbook.  The signature poem of the book was a sardonic account of his interview with the placement officer in charge of finding internships for students.  My former student is a smart guy, as well as a good writer, and is bound to be conscious of the irony that he now works in the field he once satirized.  My point is simply that if we had told him during his BA that he was being prepared for a career in work placement or human resources, he would have run in the opposite direction.  Similar cases are numerous, my own included.  When I ask myself why I never became a lawyer, the first answer that comes to mind is that this was the field my father recommended to me.  As if blindly determined by the fates, I, in turn, recommended law school to my son, who is also not a lawyer.  To this list of incongruities I can add a pile of anecdotes about lawyers who became construction contractors, an engineer who became a literary critic, and a dentist who became the owner of a used book store.


I am not attempting to whitewash the university for its part in the current crisis; on the contrary, I think the institution, my colleagues and I have a lot to answer for.  We have promoted this crisis in many active ways, but mostly our crime is not caring, or at least not caring enough.

Monday 3 March 2014

Why Teachers Should Read ''The Origins of AIDS''


Why learn about AIDS?

I know AIDS has taken the lives of a lot of good people, millions in fact.  I have contributed to charities raising funds for AIDS research, but I have never felt personally concerned about this disease more than about any other (an ancillary benefit of having been a one-woman man for the last 31 years, I suppose).  I’ve never been particularly interested in medicine or biology for that matter.  So what compelled me to read an extensive, detailed study of the history of the virus known as HIV?

I had read a lot of good reports about this book, how it was the truth about AIDS that nobody wanted to talk about.  How many times had I heard that before? Although I’d never read a complete book on the subject, I had a hunch that this might be a book worth reading.  Not because I was interested in AIDS per se, but because its spread had become such a cultural and educational phenomenon.  

Has AIDS education failed?

How many times have you heard it said that what we need is “AIDS education”?  So after 30 years of AIDS education and an intense media blitz, how is it that someone like me, who can read and pay attention, is still so ignorant about this disease?  AIDS, described as an epidemic since 1981, is an example of how the population we are all part of is educated on a mass level.  My conclusion: very poorly.




How the media covers AIDS

Over the years, every time I encountered a discussion of AIDS it was invariably someone announcing that someone else was wrong about its etiology.  The news media were only interested in an AIDS story if it involved a celebrity, a scandal or a surprising and dramatic turn of events.  It was only news if someone was claiming an unexpected breakthrough or a cover-up.  Almost as soon as I had learned that AIDS was caused by the Human Immunodeficiency Virus, I heard someone claiming that AIDS (Acquired Immune Deficiency Syndrome) was not caused by HIV.  I remember “learning” that the source of AIDS was homosexual men.  In fact, in the early 80s a homosexual flight attendant from Quebec was identified as “patient zero.”  This guy not only had AIDS, and spread it everywhere his airline company flew, but he was reported to have had 200 to 300 different sex partners per year.  Great fodder for homophobic evangelicals.

This book aims to teach!

So why should teachers in particular read this book?  I have to invent a word to answer this question:  because it’s teacherly.  “Pedantic,” which literally means “like a male teacher,” has become a strictly derogatory term.  “Educational” and “informative” are the kinds of descriptors that can be applied to any book.  “Pedagogical” would be misleading in that the word would imply that the book is about education and teaching (and etymologically about children).  By teacherly, I mean that the book is an obvious, careful and patient attempt to teach the reader.  It worked for me.  I learned a lot.  In fact everything I know about AIDS and HIV--and by this I mean everything that isn’t muddled, foggy and contradictory in my brain--I learned from this book.

What we need to learn

I’m not saying that the book answered every question about AIDS; in fact, the author Jacques Pepin (not to be confused with the chef) sounded almost apologetic that the book was about the early history and origins of the disease.  Like the author, I agree that in order to understand AIDS we need to know where it came from and how it evolved.  Pepin’s prose style isn’t literary or poetic, and he expects you to hang in there while he talks statistics, divisions and percentages and does the math, but every step of the way he tells you clearly and frankly what he is doing, and how certain and precise his conclusions are and aren’t.  Every time a concept or procedure is introduced that a lay reader might not understand, he takes the time to clearly explain and lay out the groundwork of the methodologies used to  reach his conclusions.  So yes, dear reader you are going to learn about “iatrogenic” and “nosocomial” diseases (meaning those caused by doctors and treatment, and in a hospital), and “molecular clocks” used to tell us how long a virus has been around, and “phylogenetics” (the study of the evolutionary diversification of organisms).  The book has a lot to say (I mean teach) about colonial and neo-colonial Africa and, in his admittedly most hypothetical and controversial claim, about how the spread of HIV from Africa to Haiti to North America was significantly enhanced by the establishment of plasma banks where poor people and prisoners could sell the plasma extracted from their blood.

So where did AIDS (Acquired Immune Deficiency Syndrome) come from?


It is the result of a virus, specifically the Human Immunodeficiency Virus (HIV).

Aren’t there some people who have HIV and never develop AIDS?


This is where the story starts to get complicated, but at the same time it's pretty simple.  To answer this question we need to answer the next question first.

Where did HIV (Human Immunodeficiency Virus) come from?  Answer: SIV.


HIV is the human form of SIV; that is, Simian Immunodeficiency Virus.  “Simian” refers to monkeys and apes.  Long story short:  AIDS comes from monkeys.  No, not homosexual, drug-addicted monkeys.  In fact, the form of HIV which has killed millions can be traced to the common chimpanzees of central Africa.  SIV has existed in ape and monkey populations for hundreds of years.  The divergent forms of HIV which have now been identified (yes, there are many kinds of HIV) can be traced back (using “molecular clocks” and “phylogenetics”) to different times and places.  HIV-2 came from a specific variety of SIV found in apes in eastern Africa.  HIV-2 is a slow-developing form of the virus.  People seldom die from HIV-2.  HIV-1 is the killer virus; the typical time span between contracting the virus and the development of AIDS is 10 years.  There are 8 identified, divergent strains of HIV-1.  HIV-1 group M (for Main) accounts for 99% of infections and is the cause of the pandemic.  Every strain of HIV is different in terms of rate of mortality and ease and type of transmission.  In general, the chance of sexual transmission of HIV is about 1 in 1000.  (In other words, the statistical expectation is that if a man with HIV had sex 1000 times, the virus would be transmitted once.  However, if the man also had an STD and/or was uncircumcised the possibility of transmission would increase.)  The transmission rate through blood (transfusions, shared needles, etc.) is 1 in 10.
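A small caveat on the arithmetic: "transmitted once per 1000 acts" is an expected value, not a guarantee. Assuming (as a simplification) that exposures are independent, the chance of at least one transmission follows from the complement rule, as this sketch using the per-exposure rates quoted above shows:

```python
# Sketch: what a per-act transmission probability implies over
# repeated exposures, assuming independent acts (a simplification).
# The 1/1000 and 1/10 rates are the figures quoted in the text.

p_sexual = 1 / 1000   # per-act sexual transmission rate
p_blood = 1 / 10      # per-exposure blood transmission rate

def prob_at_least_one(p, n):
    """Probability of at least one transmission in n independent exposures."""
    return 1 - (1 - p) ** n

# Over 1000 sexual exposures the *expected* number of transmissions is 1,
# but the probability of at least one transmission is only about 63%.
print(round(prob_at_least_one(p_sexual, 1000), 3))  # ~0.632
# Ten blood exposures at 1/10 give a similar overall risk, ~65%.
print(round(prob_at_least_one(p_blood, 10), 3))     # ~0.651
```

The point is that even a "1 in 1000" per-act rate compounds into a very substantial risk over enough exposures, which is part of why the frequency figures reported for "patient zero" mattered so much.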

How did HIV-SIV enter the human population?


The simple answer is that African tribesmen hunted and cooked monkeys.  It is easy enough to imagine people being bitten and scratched by monkeys.  Back-tracing the various strains of HIV now in existence shows that the virus (which was SIV and became HIV) entered the human population at 8 different points/occasions some time between 1900 and 1930.  For convenience of reference Pepin gives 1921 as year one of HIV.  Under “normal” circumstances the 8 people who had contracted the virus would have died within 10 years and that would have been the end of the virus and the disease--which brings us to the next question.

How did HIV-AIDS become an epidemic?


This is really the question that The Origins of AIDS sets out to answer.  The answer lies in the geo-politics of colonial and neo-colonial Africa, the sex trade, African customs and traditions, industrialization and urbanization, and, in a detailed exposé of ‘how the road to hell is paved with good intentions,’ how attempts to combat sleeping sickness, malaria, tuberculosis, leprosy and various STDs through mass inoculations with unsterile needles led to HIV-1 M, which infected only one person in 1921, being transmitted to millions worldwide.

No doubt if he stumbled across this blog, Dr. Pepin would tear his hair out reading my clumsy, facile attempt to summarize his work.  I recommend you don't depend on this trailer.  If I've piqued your curiosity, read the book.   

Saturday 1 March 2014

Testing, Teaching and "Negative Capability"

Teaching for the test

I believe in testing.  Some years back, I was even certified as a Government of Canada Language Tester.  On the other hand, my experience as both teacher and tester confirmed my (and everyone else’s) misgivings about standardized testing.  The problems emerge when “the test” becomes the objective rather than one of the means at an educator’s disposal.  Nothing undermines the educational process more thoroughly and renders what is being taught more meaningless than when teachers are forced to teach for “the test.”

To Teach is to connect the unknown to the known

“To teach,” “to educate,” means to connect something new and meaningful to what students already know.  Meaning is context.  To learn something means that you are able to understand what it means or at least give that thing a meaning, which in turn means that you are able to place that thing in a context, to connect it to something that you already know.  That’s what good teachers do. They help students connect something new with what the students already know.  

The Opposite of teaching/learning

If you don’t believe me, consider the opposite of what I am describing.  You are sitting in a classroom, a “teacher” enters and begins talking in a language you don’t know and can’t identify.  The “teacher” continues for an hour and then leaves.  What have you learned?

"Negative capability"

The connection of new and old knowledge which defines teaching and learning rarely happens immediately and doesn’t come easily, which is why in the first class of my first-year undergraduate course I always introduced my students to the concept that the poet John Keats called “negative capability.”  “Negative capability,” which Keats described as the ability that all great poets have and I describe as what students need to have, is the capacity and willingness to hold onto information even when those facts and data may not immediately or completely make sense.  Students need to have confidence in the knowledge and ability of their teachers.  Students need to know and feel that their teachers will eventually help them make sense of what they have learned, help them connect the dots, but also connect all those dots to something that the student already knows about, giving them a fuller context and a meaning.  Teaching for the test means that what is being taught is likely to remain meaningless, to be un-connected from any meaningful context.  

But it gets worse.  

The Wire

If you haven’t had the experience (as I have), consider season four of my favourite television series, The Wire.  Yes, it’s fiction, but it does a good job of demonstrating what can and does happen when funding and teachers’ jobs are tied to students’ performance on a standardized test.  Schools (in this case a school in an underprivileged neighbourhood of Baltimore) will abandon their students’ interests and, by my definition, their education to a total focus on preparing for the test.



 

The Polarization of testing

Testing has become a polarized issue.  Macro-educators (specialists, administrators, institutions, ministries and governments) give too much importance to standardized testing, and micro-educators (teachers, especially university teachers) abjure anything that comes close to a sit-down exam.

Traditionally a "discipline" means "an examination is possible"

I was involved in a protracted debate at my university about PhD Comprehensive Exams.  I was in favour of a traditional, three-or-four-hour sit-down exam.  The majority of my colleagues and the students preferred a take-home style of exam.  The single most compelling argument I could offer in favour of the traditional style of exam was that it would require that students study.  In the course of the debate, it came to me that the concept of “studying” had all but disappeared from the field in which I taught.

The Definition of "a test"

My definition of “a test” is that it is something that students have to study for.  A test should be based on what is taught, not the other way around, and not on something else--you’d be surprised how many teachers test something they haven’t really taught (or maybe you wouldn’t).  In addition to causing a student to study (by which I mean to review and reflect upon the course material), the test gives feedback to both the teacher and the student about what has been learned and what hasn’t.


A Test requires attendance

That’s what I believe, but the truth is the original reason I adopted the habit of testing my undergraduates on a regular basis was to be sure they showed up.  I’ve seen other professors’ syllabi in which they specify that a student who misses two classes would have to drop the course.  This always sounded like a bluff to me, and if it wasn’t it would require taking attendance in every single class.  Not only does that seem un-university-like to me, but do you know how much time you would waste every single class taking the attendance of 60 students?  I wanted my students to show up because my lectures were so brilliant and stimulating that they wouldn’t want to miss one.  On the other hand, I remembered all the really good reasons I came up with for missing classes when I was an undergrad. So I started giving my classes little quizzes every two or three weeks or so.  Students who missed the class would, of course, miss the quiz, and if they missed the class after the quiz they wouldn’t be there to pick up the corrected copy.  This was my original intention, but something strange happened and I never did use the quizzes to check attendance.

Students want to be tested

As it turned out, attendance never proved to be a big enough issue to disturb me.  Students who didn’t show up usually failed or did poorly, and if a student was brilliant enough to do well without attending regularly, more power to her.  Even in a class of 60, I gave 5% of the mark for participation which, of course, required that I be able to identify every student in the room by the end of the semester--not as hard as it sounds.  The strange thing about the quizzes is, as I came to discover, that students really liked them.  

Students like being tested

I remember turning up at the classroom one day around 20 minutes before class (which was my habit) and being surprised to discover that most of the students were already there.  One of the students came up to me to announce that they were studying, had even formed study groups and mine was “the only course that people had to study for.”  At first I thought she was complaining, but she seemed so cheerful about it that I took her announcement as a compliment.  As I got to know the students better, especially those that had more than one class with me, I suggested that we could drop the quizzes, but the students wanted to keep them.  I started analyzing my evaluation process and informed students that overall their marks were lower on the quizzes than on the other forms of evaluation--the essay outline, the essay and exams.  Still students asked to maintain the quizzes.  

Testing is teaching

The quizzes were painless little things, multiple choice, circle the correct answer which could be done in less than ten minutes at the beginning of the class.  (There is a sample at the end of this post.) They were closely tied to the lectures and to notes that I put up on the course web site.  I understood that students appreciated and even enjoyed being tested, and the tests gave me the chance to go over the material a second time (or more) that a number of students hadn’t gotten the first time.  It was also a source of endless curiosity for me why students found some questions easy and others hard.  In fact, the quizzes confirmed the theories of teaching and learning that I’ve been talking about in this post.  

Students learn what connects to what they already know and think about

Let me explain.  When I taught American Literature, I always had a few quiz questions on Tennessee Williams’ A Streetcar Named Desire.  It didn’t surprise me that students always got the answer right to the question:  “Why was Blanche Dubois fired from her job as a high-school teacher?”  The answer, “that she had had an affair with one of her students,” was bound to have caught the attention of students not that far out of high school themselves--not to mention that anything sexual or scandalous is liable to stick in the mind.  Still it surprised me that students always seemed to know that the hotel where Blanche went with various men was called “the Flamingo,” until one day I was driving down Main Street past the pub which I knew to be the favourite hangout of students from the university and I noticed for the first time that the run-down hotel next door was called “the Flamingo.”  It is so obvious.  Students hold onto information that they can connect to, that has meaning/context for them; that’s what learning is.

The Evils of standardized testing

I believe that testing can facilitate the learning process, but it can also have the opposite effect.  The motivation/inspiration for this posting was the photograph (below) that one of my former students who now has school-age children shared on Facebook.  






The woman who took this photograph of her daughter in tears as she tried to correct her homework wrote a short piece explaining the image and telling the horror story of her daughter’s struggle to complete a standardized test that American schools are now imposing.  I have never read so many heartfelt responses to a single posting.  Even for someone like me, a career educator with a super-bright child, turning my kid over to the educational system felt like surrendering him to kidnappers.  If I made one false move the system could punish my child in retaliation.


This photograph of a little girl in tears is a perfect icon of an educational system gone terribly wrong.  One not governed by teachers and parents but by a Wall-Street mentality that sees pain and suffering as evidence of austerity, productivity and good business.  This image made me think about how that terrible, moving photograph of a Vietnamese girl running down the road naked and burned after a napalm attack helped to turn the hearts and minds of Americans against the Vietnam War.  It also made me think about another famous photograph of a young Black man being attacked by a German Shepherd, which Malcolm Gladwell (in David and Goliath) describes as provoking a turning point in the civil rights movement in the States.  I’d like to think that this image of a little girl’s sadness could provoke some positive change.


In Quebec we talk a lot about “values” these days.  Any society which would wittingly put pressure on and cause stress for five-year-olds for motives as feeble as standardized testing and statistics gathering has a serious problem with its values.



PS:  Here’s an example of a literature quiz for first-year students:

First Quiz   
Instructions:  Circle the letter of the best answer or completion to each of the following questions or statements.
1.  The word “quaint” in the phrase “your quaint honour” in the poem “To His Coy Mistress” is  . .
a.  a synonym for “great.”
b.  a metaphor for “cute” or “old fashioned.”
c.  a metonym for virginity.
d.  a pun on the word “queynte.”
e.  a hyperbole.
2.  The Latin expression “carpe diem” means . . .
a.  “god is dead.”
b.  “broken by the gods.”
c.  “I think therefore I am.”
d.  “buyer beware.”
e.  “seize the day.”
3.  The relationship between a sign and its referent can be . . . 
a.  discursive, non-discursive or logical.
b.  iconic, motivated or arbitrary.
c.  cultural, natural or ecological.
d.  physical, biological or neurological.
e.  phonetic, syntactic or grammatical.
4.  The idea that words get their meanings from referents, that is, from things in the world, is called . . .
a.  constructionism.
b.  anthropologism.
c.  semiotics.
d.  linguistics.
e.  essentialism.
5.  How did the people of the Country of the Blind explain Nunez?
a.  He came from a strange and mystical place called Bogota.
b.  He came from rocks and was still unformed.
c.  He was a messenger from God.
d.  He was an alien from another world.
e.  He was a mountain climber who had fallen in an avalanche.
6.  Three traditional forms of irony are . . . 
a.  non-discursive, non-referential and aesthetic.
b.  poetry, prose and drama.
c.  verbal, situational and dramatic.
d.  Greek, Latin and Christian.
e.  textual, sociological and psychological.              
7.  Jacques Derrida defined “deconstruction” as . . .
a.  recognizing literature as the best writing that a society has produced.
b.  being true to one’s principles, beliefs and convictions.
c.  being conscious of the historical sedimentation of language.
d.  acknowledging that truth is beauty and beauty truth.
e.  the analysis of tropes and figures of speech in a literary text.
8.  “Vegetable love” is an example of . . .
a.  an oxymoron
b.  personification.
c.  a simile.
d.  an allusion.
e.  hyperbole.
9.  A “feminist” reading of “To His Coy Mistress” would be  . . .
a.  a sociological and resistant reading.
b.  a psychological and psychoanalytic reading.
c.  a formal and textual reading.
d.  a literal and historical reading.
e.  a reading of the poem as being ironic.
10.  How did the short story “The Country of the Blind” end?
a.  Nunez returned to Bogota.
b.  Nunez and Medina-Saroté were married.
c.  Nunez was accepted as the one-eyed King.
d.  Nunez lay down in the mountains, staring at the stars.
e.  Nunez was locked up because he was insane.
