
Sunday 1 May 2016

This Professor Should Be Fired for Defending What I Believe In

I call it the “ad hominem dilemma.”  Just to remind you, an “ad hominem argument” is a logical fallacy defined as trying to win an argument by attacking a person rather than the ideas that person is presenting or representing in a debate.  The dilemma I have just coined occurs when you like an idea but not the person presenting it, or you like a person but not the idea or argument.  In an ideal world the dilemma disappears because you always agree with the ideas of the people you like—though you might want to have your intellectual rigour checked.

So you might feel torn when you discover that Hitler liked apple pie, and you like apple pie, but you don’t want to be identified as one of those apple-pie-eating Nazis.  Like me, you might have wanted to tear out your hair when Wayne Gretzky announced he was supporting Stephen Harper in the last federal election—you remember, the election Gretzky couldn’t vote in because of Conservative policy preventing non-residents from voting.  Tells you what years in the California sun can do to an otherwise sane Canadian hockey player.

Then there’s the Donald Trump (aka Drumpf) phenomenon.  You may have heard the claim that an infinite number of monkeys pounding on the keys of an infinite number of typewriters (i.e., keyboards without computers) would eventually type the complete works of Shakespeare.  Trump gets so much media coverage, without ever spelling out the details of his proposals, that eventually he is bound to make some vague promise that you agree with, and there you are facing the “ad hominem dilemma.”

Many women were dismayed by the outcome of the Jian Ghomeshi trial.  It seems pretty obvious that consensual sex does not mean you are consenting to be choked and punched in the head, but how the obvious was represented at trial was anything but clear.  Ultimately, the acute “ad hominem dilemma” has been provoked not by Ghomeshi himself (okay, being an anus is not a provable crime, but still he has been proven an anus) or by his accusers, but by Marie Henein, Ghomeshi’s lawyer.




Marie Henein should be a feminist icon, a heroine for all womankind, a tough, skilled, astute defence lawyer at the peak of her profession.  In fact, she is all those things and has become them by defending people accused of some pretty heinous crimes, including crimes against women--because that's what defence lawyers do.  Both Michelle Hauser in the Whig ("Mansbridge hit journalistic low point") and Tabatha Southey in the Globe ("Upset about the Jian Ghomeshi verdict? Don’t get mad – get informed") have broached the dilemma which Henein has provoked.

The issue of my concern will seem trivial, insignificant and certainly pedantic compared to the justice system's futile struggles to prosecute sexual assault.  The object of my obsession is the course plan, what is usually referred to in colleges and universities as the syllabus (the “silly bus” that carries students from the beginning to the end of the course?).  Who cares about syllabi?  Well, I guess people of my ilk who know how to pluralize "hippopotamus"--pedants (which is generally an insult even though it just means "male teachers").

I used to really care about course plans . . . a lot.  I didn't call them course plans or syllabi, I used to call them "the contract" and I would do this really pumped-up, earnest presentation in the first class explaining that this document was a contract between me and my students, that they had the right to object and make changes if they could persuasively argue that something I was requesting was unreasonable or there were better alternatives.  If the first class and "the contract" went well, chances of the course as a whole going well were vastly improved.

Then the worst happened. University administrators began to agree with me that course plans were really important.  The Chair of our department announced a new policy. In the name of providing the best possible education to our students, in future we would all submit our course plans for review at the beginning of each semester.  My colleagues and I objected to this new policy on three grounds:  1) it was redundant; the information that might concern the department was already available in the form of course descriptions which were regularly updated, 2) the requirement to submit a more detailed description of what we would be doing with students to an administrator seemed more like surveillance than pedagogy, and 3) it would lead to bureaucratization, the uniformisation and rigidification of all course plans.  Redundancy was undeniable, but we were assured that in no way did this new policy suggest increased surveillance or bureaucratization.  The new policy was implemented.

The first time I submitted a course plan, the department Chair took me aside--at the department Christmas party--to tell me she had reviewed my course plan and determined that I hadn't scheduled enough classes for one of my courses.  I had been teaching the course for ten years and the number of classes had always been the same.  How was this not surveillance, I wondered? A year later, under a new Chair, I was notified that the same course plan contained one too many classes.  Luckily for me, as a tenured professor, I could and did blithely ignore the instructions in both cases.  

A more damaging outcome for me was the bureaucratization of the course plan.  With each passing semester I received increasingly insistent and precise instructions on the form and content of each course plan circulated through the Faculty of Education and seconded by my own faculty. The upshot was that as I presented my course plan to students I realized that what they saw before them was a replica of every other course plan that had been presented to them that week. The chances that I could credibly describe the plan as a mutual contract were nil. Even the possibility that I might convince the students there was something distinctive in the syllabus, something worthy of their concentration and interest, was minute at best.  They would view the course plan as bureaucratic red tape, imposed as much upon me as it was upon them, and they weren't wrong.  In the name of "providing the best possible education for students," I was deprived of a useful pedagogical tool.



In recent weeks, reading reports online about Robert T. Dillen Jr., an associate professor of "genetics and evolutionary biology at the College of Charleston," who was facing suspension for refusing to change his course plan for the university's suggested course "outcomes," I thought "a messiah, a Prometheus willing to sacrifice himself to give fire to university teachers everywhere!"  I read the article in which his Dean accused him of playing "Silly, Sanctimonious Games" and described complaints against Dillen Jr., including his self-confessed, impish penchant for deliberately misinforming students and refusing to answer their questions. Then I read Dillen Jr.'s defense of his resistance: "Why I’m Sticking to My ‘Noncompliant’ Learning Outcomes."

My ad-hominem dilemma:  despite my conviction that course plans should be the purview of teachers not administrators, everything that I have read (especially his own words) leads me to the conclusion that this Robert T. Dillen Jr. is really an ass.  His only motivation seems to be that he likes being an ass and his pleasure was redoubled by the fact that he could get away with it.   As a tenured professor he can be an obfuscating, obstreperous lump of inertia who doesn't even have to logically defend himself and no-one can do anything about it, or so he thought.

Dillen Jr. has been teaching for 34 years.  He was consulted, advised, warned, and presented with alternative "outcomes" which he rejected. Still he manages to feign bewilderment, as if he were the only calm rational mind in this brouhaha rather than its provocateur, and asks rhetorically:  "How could such an apparently minor disagreement escalate so far, so fast?"

I am irked, in the first place, because Dillen Jr. could not have done a better job of undermining all university teachers in their efforts to control the presentation of their own courses.  When university administrators argue that the syllabus must be administered by the university and not left in the hands of eccentric egg heads, Dillen Jr. will be the precedent they cite.

But I am also outraged by a university professor's vain display of elitist, aloof, opinionated incoherence.  In lieu of "course outcomes," Dillen Jr. inserted into his syllabus a quotation from a speech given by Woodrow Wilson at Princeton University in 1896.  In his apologia, Dillen Jr. offered three justifications for using this quotation as the learning outcome of a biology course:  1) he and Woodrow Wilson were born 10 miles apart, 2) both he and Wilson "were Presbyterian professors," and 3) Wilson "seems to be so universally despised."

Here is the Wilson quotation which Dillen Jr. used as his "course outcomes" and cannibalized for his rhetorical self-defence:
Explicit Learning Outcome. "It is the business of a University to impart to the rank and file of the men whom it trains the right thought of the world, the thought which it has tested and established, the principles which have stood through the seasons and become at length part of the immemorial wisdom of the race. The object of education is not merely to draw out the powers of the individual mind: it is rather its right object to draw all minds to a proper adjustment to the physical and social world in which they are to have their life and their development: to enlighten, strengthen, and make fit. The business of the world is not individual success, but its own betterment, strengthening, and growth in spiritual insight. ‘So teach us to number our days, that we may apply our hearts unto wisdom’ is its right prayer and aspiration."— Woodrow Wilson, 1896
Beyond the ludicrousness of his justifications, the gross absurdity of Dillen Jr.'s using this quote as the cornerstone of his refusal to accept and adjust to authority is that the quote and the Princeton Commencement speech from which it is taken and even the Bible quote which it cites (and Dillen Jr. re-cites) are all explicit refrains of the theme that the individual must accept and submit to the direction of higher authorities, including "the social world in which they are to have their life"--exactly what Dillen Jr. is refusing to do.

Nowhere in his exposition does Dillen Jr. show any interest in what his students might (or might not) be gaining from his stubbornly repeated use of Wilson's quote (encouraging Princeton grads to enlist for the Spanish-American War) as his "course outcomes."  The university's decision that Associate Professor Robert T. Dillen Jr. "would be suspended without pay for the fall 2016 academic term" strikes me as a setback for all good teachers and a gift to the students of genetics and evolutionary biology at the College of Charleston.


Addendum

Princeton University decides to remove Woodrow Wilson's name from its building because of racist history.



Monday 21 March 2016

The Art of Complaining

“Complain, complain, that’s all you do
Ever since we lost!
If it’s not the crucifixion
It’s the Holocaust.”
L. Cohen

In my brief (five years) and tiny tenure as an administrator responsible for an array of university programs, one of my duties was to receive student complaints.  Students usually had real or at least honestly perceived grounds for complaint.  The typical complaint was about the quality of instruction or the instructor of a particular course.  Frequently, the student would announce a shift of discourse with the phrase “It’s not the reason I’m here, but . . . .”

The irony of the situation was that if a student wanted to complain about a grade or even the evaluation of a particular assignment, that was a situation I could easily deal with--and that was the point students would take twenty minutes to get to.  The university had rules and procedures in place for reassessing a mark.  As I discovered the hard way, the university provided no legal means for dealing with a lackluster or incompetent teacher.  Like the psychoanalyst of the how-many joke trying to change a lightbulb, I could only change an instructor if he/she wanted to change.

Being faced with complaining students reminded me of early days as a steward in my ESL teachers’ union.  The principal duty of a steward was to represent and counsel teachers through the grievance procedure, and we were given a weekend-long course on how to grieve (the legalistic verb for “to complain,” not a synonym for “to mourn”). Step one and rule one of the grievance process was to know what my brother and sister union members wanted; that is, what outcome, of a limited number of possibilities, they were looking for from the grievance.  Sounds simple, right?   I found this advice to be logical, compelling and useful, but the objective is what people most frequently lose track of in the process of complaining.  This lack of focus is, I believe, what gives complaining a bad name.



Decades later at a university department meeting, one after another, my colleagues were complaining bitterly about how prolific and quick students were to complain.  I interrupted the brouhaha to suggest that complaining was a good sign; it meant students cared and, furthermore, I was thinking of preparing a module on “how to complain” for one of my courses.  My colleagues were not amused.

I really believe that complaining is beneficial, that we all benefit from those who have the wherewithal and courage to complain.  They are the whistle-blowers of everyday life, but the problem with complaining is one of degree, of frequency, of being identified with the “boy who cried wolf” once too often.  The conundrum for the would-be complainant then becomes the proverbial “separating of the dancer from the dance”: how to complain without being a complainer.  Although I was told when travelling in Europe that it paid to pretend to be French because the French were known to complain and would, therefore, get good service, I was never able to empirically verify this hypothesis--but it makes sense.

I have also been warned that my attitude toward complaining as being about outcomes was masculinist.  (Excuse the gender stereotyping here.  I’m just the messenger.)  I have been informed that when a woman is complaining, a man’s suggesting a (peremptory and perfunctory) solution to a problem simply compounds a woman’s frustration and irritation.  It took me a while to understand this instruction, but I have come to recognize the universal principle that the less you know about a problem the easier it is to imagine a solution.  If you (and I) immediately see an obvious, quick and easy solution to a problem being presented to us, chances are we have failed to understand the details and recognize the complexity and intricacy of the issue.

There is a phenomenon that is usually identified as complaining but is really “self-expression”—as vague as that locution is.  Sometimes it is necessary or at least healthful to decompress, to vent, to exhale with expletives.  What passes for complaining is often just thinking out loud.  Sometimes we just need to hear our own words (in my case read them) in order to clarify our own thinking to ourselves.



I used to be a fan of the television series House.  Dr. Gregory House, out of context, always sounded like he was complaining, but he was carrying out a process of “differential diagnosis.”  I didn’t quite know what that meant until I read Crilly’s definition of “differential calculus.”  Both cases are studies of change:  what has changed, what needs to change, the speed of change, the meaning of change, the prognosis and prescription for change.   Complaining is a differential science and a differential art.



Thursday 17 March 2016

“Let’s End the Myth that PhDs Are Only Suited for the Ivory Tower.” Really! Why?

“Let’s End the Myth that PhDs Are Only Suited for the Ivory Tower.”  This was the headline for an opinion piece in the Globe and Mail written by Queen’s University’s Dean of Graduate Studies.  The article reminded me of meetings our tiny caucus of English teachers used to have once or twice a year with our faculty’s Dean.  Invariably, at some point in the meeting, the Dean would turn to my colleague who was responsible for our section’s graduate programs and ask:  “How many new admissions do you have for next semester?”



Everyone in the room knew, with the possible exception of the Dean (and I suspect he may have known as well) that, at this point, we had two or maybe three new admissions.   Invariably my colleague would look surprised and begin to shuffle papers.  Having briefly occupied his position before he did, I had a rough idea of the bafflegab he was preparing to deliver, but I was never as good or practised at it as he was.

“Well,” he would begin, “in order to give the most up-to-date numbers I would have to include the inquiries that the secretary passed on today. With those six, and two students from France who emailed, and of course we have identified eight of our own BA students, and two MAs, as well as the returning students, and that’s right, Theresa Somebody, a really excellent candidate will be coming in as soon as she gets confirmation of her funding request . . . .”

Thanks to the miracle of my colleague’s loaves-and-fishes rhetoric,  the Dean could claim that our two new admissions appeared to be somewhere in the neighbourhood of twenty-two.  As long as no-one insisted on knowing accurate numbers, we all had plausible deniability when, next semester, our graduate seminars turned out to be the size of tutorials.

The “Let’s End the Myth” piece in the Globe is an extension of the same desperate smoke-screen rhetoric designed to dissuade anyone from asking for accurate numbers.  

Sure, let’s end the myth that PhDs want to be professors, and at the same time we can get rid of the myth that people who go to medical school want to be doctors, or that people who go to law school want to be lawyers, or that people who study accounting want to be accountants.  Or, we could go the other route, and take a look at accurate numbers for the occupational outcomes of PhDs and deal with the situation we know exists.

Before we get to the big question, we should stop to consider that—at best—only 50% of people who start a PhD in the humanities actually finish.  The average length of time to complete a PhD is seven years, the mode is ten.  Of the lucky 50%, how many, after five, seven, or ten years of hard work and study, get the tenure-track university positions which at least 86% of them have declared they covet?   And the answer is . . . wait for it . . . we don’t know!

How can we not know something as basic as how many PhDs get tenured jobs?  Just like my colleague who had to hide the fact that we only had two new students, universities in general have to hide the dismal outcomes for PhDs.  To reveal the numbers would put courses, programs, prestige, credibility, funding and ultimately positions at risk.

What do we know?  According to the available statistics, which are somewhat out of date (from 2011) and optimistically inaccurate, 18.6% of PhD graduates got full-time teaching positions in universities.  “Full time” does not mean permanent.  Crunch those numbers!  You start with 100 PhD students; at best, only 50 of them survive the hard, five-to-ten-year slog and finish the degree.  Of those 50 graduates, we don’t know exactly how many, but we do know that fewer than nine (9 of the original 100) got the tenure-track university jobs which, for the great majority of them, were the goal of the PhD in the first place.
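The funnel arithmetic above can be sketched in a few lines of Python. This is a hypothetical back-of-the-envelope calculation using only the figures cited in this post (50% completion, and the 2011 statistic that 18.6% of graduates held full-time university teaching positions); tenure-track jobs would be some smaller slice of that full-time number.

```python
def phd_funnel(entrants=100, completion_rate=0.50, full_time_teaching_rate=0.186):
    """Estimate how many of a starting PhD cohort graduate, and how many
    of those end up in full-time (not necessarily permanent) university
    teaching, using the rates cited in the post."""
    graduates = entrants * completion_rate
    full_time = graduates * full_time_teaching_rate
    return graduates, full_time

graduates, full_time = phd_funnel()
# Of 100 entrants: 50 graduates, and roughly 9 in full-time teaching --
# tenure-track positions are fewer still.
print(f"{graduates:.0f} graduates, {full_time:.1f} in full-time teaching")
```

The point of running the numbers this way is that even the most optimistic reading of the data leaves more than 90 of every 100 entrants outside the career the degree nominally prepares them for.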

Given these depressing outcomes, you might imagine that universities are working hard to increase the number of tenure-track positions or downsize doctoral programs or a bit of both.  On the contrary, from everything I have read, the dominant priority of universities is still to maintain or increase PhD enrolments while hiring small armies of underpaid and underprivileged adjuncts, sessional and part-time lecturers to do most of the teaching.  Why? You might well ask.

Crudely put, screwing over PhDs has for decades been the national sport of academia.  For the university as well as individual departments and programs, the PhD student is a cash cow for government funding.  Additionally, PhDs are still a mark of prestige.  Universities identify themselves as either PhD-granting (one of the big boys) or non-PhD-granting (not so big) institutions.  Individual professors applying for research grants will have their prospects vastly enhanced if they can point to PhD candidates who will serve as their research assistants.  The lucky professorial few who win the grant money can use it to travel to conferences in Miami, Mexico and Honolulu, while the bull work of their research projects is carried out by their minimum-wage PhD research assistants.

Despite the foggy and evasive arguments that universities might put out suggesting that the problems and the solutions are incredibly complicated—they aren’t.  The solution isn't for McDonald's or Rogers or Walmart to hire more PhDs.  The PhD has to be the minimum requirement for teaching in a university (with rare, obvious and fully justified exceptions) and for any other significant position within the university hierarchy for that matter.  Any PhD holder who is teaching at a university should automatically qualify for tenure.  The ballooning administrative and support budgets of universities need to be transferred to pedagogical objectives.

As I pointed out in an earlier post “universities have a vertical monopoly, being both the exclusive producers and major employers of PhDs.”  It’s time universities began to acknowledge and correct their abuse of this monopoly.




Friday 11 March 2016

If You’re One of “the Good Guys,” Do You Still Have to Worry about the FBI Accessing Your iPhone? With Addendum.

In some ways, we have not completely escaped the prejudices of our oral ancestors.  There is always a lingering suspicion that someone demanding privacy must have something to hide.

Last week the Director of the FBI was on television arguing for the agency’s right to unlock the particular iPhone used by the ISIS-inspired San Bernardino terrorist—and by extension all iPhones.  His justification is that we are “the good guys” and we’re trying to catch “the bad guys.”  It’s hard to imagine a weaker a priori argument for the simple reason that in the history of governments, tyrannies, military juntas, secret police forces, and dictatorships there has never been one that announced to the world “we are not the good guys!”

Nonetheless, personally, I have nothing to hide, and I'm a Canadian with a very non-ISIS sounding name and a regular readership of less than a dozen people for this blog.  (I am proud to have a select and discriminating readership.)  The ultimate defense against being surveilled by the FBI or some other secretive police force is to remain irrelevant and insignificant.  I have nothing to fear, nor do you, right?

Still, it rubs me the wrong way that it is exactly police forces like the FBI that insist on the importance of secrecy for themselves which challenge the rights of individuals to have secrets.  I started thinking about the people who probably thought of themselves as one of "the good guys" (in the current, colloquial, gender-neutral sense of the term "guys") who were unfortunate enough to cross paths with the FBI, and then I realized that you are only one of "the good guys" until the FBI decides you're not, for whatever secret reasons they might have.

Consider some famous cases.



Ernest Hemingway, the renowned American novelist, was hospitalized for six weeks in the psychiatric section of St. Mary’s Hospital in Rochester, Minnesota, where he was receiving electroshock treatments. Hemingway was diagnosed as suffering from paranoid delusions because of his constant ranting that he was under surveillance by the FBI, that even the hospital phone was tapped, and that his nurse, named Susan, was working for the FBI. One week after he was released from hospital, Hemingway shot himself.

“Fifty years after his death, in response to a Freedom of Information petition, the FBI released its Hemingway file. It revealed that beginning in the 1940s J. Edgar Hoover had placed Ernest [Hemingway] under surveillance because he was suspicious of Ernest’s activities in Cuba. Over the following years, agents filed reports on him and tapped his phones. The surveillance continued all through his confinement at St. Mary’s Hospital. It is likely that the phone in the hall outside his room was tapped and that nurse Susan may well have been an FBI informant” (Hemingway in Love 167).



Sunil Tripathi, a 22-year-old Brown University student, committed suicide after the FBI released surveillance photos of the Boston Bombers, and Sunil was falsely identified as one of them.  His body was discovered in the Seekonk River, April 23, 2013.



Monica Lewinsky was a 23-year-old Washington intern when she engaged in various kinds of sexual activity with then President Bill Clinton.  Whatever moral compass you might bring (or not) to Lewinsky's tryst with the President, it seems obvious that the affair did not constitute a crime or a threat to public security.  Nonetheless, on January 16, 1998, Monica Lewinsky was held in a hotel room by FBI agents and threatened with 27 years of imprisonment if she did not reveal the details of her relations with the President.  She was also told that the FBI would arrest her mother who could be imprisoned for two years  (http://law2.umkc.edu/faculty/projects/ftrials/clinton/lewinskyday.html). 

(Whenever I reflect on this kind of prurient political theatre, I think of Prime Minister Pierre Trudeau's 1967 declaration in anticipation of the Omnibus Bill that "There's no place for the state in the bedrooms of the nation."  Someone needs, once and for all, to declare the converse: "There's no place for the nation in the bedrooms of the state.")

December 21, 2001, Martha Stewart propitiously sold stock in a friend's company and thereby avoided a potential loss of $45,673--a minuscule amount considering her estimated wealth at the time was $700 million.  Her friend, Sam Waksal, was being pursued by the FBI for insider trading on the stock of his own company, ImClone.  Martha Stewart was never convicted of insider trading, but she did serve five months in a federal prison and two years of probation for lying to the FBI about details of the stock sale (http://coveringbusiness.com/2012/05/15/what-martha-stewart-did-wrong/).

(I still can't figure out exactly what crime Martha Stewart committed if, in fact, she did commit a crime, but it's hard not to compare her case with Wall Street companies which lost hundreds of billions of dollars in what seemed like fairly obvious mortgage and bond fraud schemes and the result was that they were bailed out by taxpayer money, CEOs continued to receive bonuses and severance packages, and not a single Wall Street insider was ever charged with a crime.)


Addendum

Now perhaps we should include former Secretary of State and presidential candidate Hillary Clinton in this list!




Saturday 5 March 2016

Privacy Versus Security: Debating a False Dichotomy

Is privacy necessary?

Is privacy really an innate human desire?  Is it normal to want to be alone?  While it seems intuitive and logical to assume that our culture and technology have evolved in response to a basic human desire for privacy, anthropologists, as well as communication and cultural theorists, have argued that the cause and effect are the other way around.  Our habits, customs, created environments and mindsets are not a response to a primordial human need.  Technological culture created the idea of, and the need/desire for, privacy.




Oral culture

In oral societies (that is, societies which depended on direct person-to-person oral communication), the desire to be alone was immediately identified as a symptom of illness.  In a world dominated by orality, today’s millennial otaku introvert generation would have been treated either as deities or as mad demons.  They might have become the oracles living in caves at Delphi or the first monks dedicating their lives to transcribing ancient scripts, or they would have been imprisoned, starved, tortured and burned at the stake.  We should also consider, given cultural ecology’s displacement of natural environment, that the neurodiverse, digi-destined, screen-slaver generation might be the next step in the evolution of our species.

Privacy is a byproduct of visual culture

Privacy is a byproduct of the visual culture created by the development of literacy, from basic forms of writing to the phonetic alphabet, to Gutenberg’s printing press, to the digital universe we know today.  Reading meant it was possible to be alone and still be connected to the world in important, informative ways.  In fact, the most serious forms of communication and knowledge-gathering were, in this new visual/literate culture, best done in solitude.  In an oral culture being alone meant you could only be talking to yourself or a god—both of which were suspect if not dangerous activities.

Compartmentalized living

Living in spaces that have one room for cooking, another for sleeping and another for gathering might seem “natural” to us now, but our early ancestors would be mystified by our insistence on compartmentalizing our daily activities. Primitive man might have agreed with the dysphemistic adage that “You don’t shit where you eat,” but beyond the scatological, compartmentalized privacy is cultural not natural.

No doubt our primitive ancestors at times needed to be out of view, literally in hiding from enemies and predators, as a matter of security. Hence the overlap and confusion between privacy and security, between solitude and survival.

A Gun or an iPhone: Which Is More Dangerous?

Fast forward to the debate between the FBI and the Apple Corporation about unlocking the iPhone once used by the ISIS-inspired murderer who killed 14 people in San Bernardino. On the surface, the request is to access one iPhone, but the reality is clear that the FBI is asking for the ability to access all iPhones.

The debate is being couched in terms of individual privacy and public security, but this is a false dichotomy.  All things being equal (and they never quite are), security trumps privacy.  (And the pun is intended, since Republican presidential aspirant Donald Trump [a.k.a. Drumpf] has already declared that all Americans should boycott Apple.)  History has proven over and over again that this debate is between individual security and collective security, a debate closely tied to the more typical dichotomy of individual rights versus collective rights. In the American context, the priority line between collective versus individual rights and security tends to slide around like the dial on an old-fashioned radio gone wild, depending on the issue--abortion, gun ownership, medical insurance, seat belts, drugs, homosexuality, same-sex marriage, civil rights, equality for women, and so on. During debates for the Republican presidential candidates, President Obama was chastised for using the San Bernardino shootings as an opportunity to challenge the Second-Amendment rights of American citizens to "bear arms."  In this mindset, a locked cellphone poses a much greater hypothetical threat to public security than an assault rifle and thousands of rounds of ammunition.

NSA, CIA and you:  Who has the right to have secrets?

In his autobiography, Playing to the Edge: American Intelligence in the Age of Terror, Michael V. Hayden, former director of the NSA and the CIA, points out that "Stellarwind," the NSA program to gather data on Americans' telephone calls which was outed by Edward Snowden, “did indeed raise important questions about the right balance between security and liberty.”


In his review/commentary of the Hayden autobiography, "Can You Keep a Secret?", New Yorker staff writer George Packer points out that last week Hayden "sided with Apple in its privacy dispute with the F.B.I." while continuing to tacitly support the CIA's programs of torture and human-rights abuses.

Secrets and safety

In his review, Packer comments:

Spooks in general have had a lot to answer for in the past decade and a half: the 9/11 attacks themselves, Iraq’s nonexistent weapons of mass destruction, secret prisons, torture, warrantless eavesdropping, the bulk collection of Americans’ data, and targeted killings.

With this recent history in mind, it seems obvious that individuals, as a matter of personal security, need to protect themselves not just from malfeasance but also from the mistakes, the callous indifference, the questionable ethics and the politically and ideologically dictated overreach of secret and secretive police forces like the NSA, CIA and FBI.






Monday 15 February 2016

Is Your Professor a Better Grader than Moody’s or Standard & Poor's?

If you have been following the thread of my last posts you will have arrived at the question:  Why did savvy investors from around the world buy billions of dollars of worthless bonds from Wall Street companies in 2008?  The answer is that they absolutely believed the ratings.  If the ratings agencies said that a bond was AAA, they accepted that it was a guaranteed, virtually no-risk investment.  Investors in Germany, Japan, Canada, the USA and all over the globe willingly or willfully ignored the fact that the ratings agencies—Moody’s and Standard & Poor's—were being controlled and manipulated by the very companies they were supposed to be evaluating.




If this situation sounds inappropriate to you, stop and consider for a moment:  who evaluates you when you take a university course?  Yes, you are evaluated by exactly the same person who has a vested interest in demonstrating that his/her course has produced knowledgeable, skilled graduates.  On the other hand, every course needs to show a distribution of grades, but luckily there are always a few students who conspicuously “don’t give a damn” to whom it is possible to assign lower grades—always useful to take note of who sits in the back row.  Overall, non-permanent lecturers (who do most of the teaching) are likely at risk of losing their jobs if the grades are low enough to cause protest or produce too many failures.  Among lecturers it is widely assumed that if they give their students low marks, the students will retaliate with low course evaluations.

The difference between the ratings agencies on Wall Street and those who evaluate university students is, I assume, that just about everyone is aware of the situation in universities.  I’ve started asking around (on Quora and Workopolis):  Do employers seriously consider a university graduate’s grades?  I infer from the answers I’ve received that the answer is “no, they don’t.”  The answers ranged from “no, they don’t consider them” to “they shouldn’t, if they know what they are doing.”  So, unlike investors who bought worthless bonds from Wall Street, employers are not being deceived by the ratings systems applied to university graduates.  

What does this fact—the disbelief in marks—mean?  Does it matter that grades don’t matter?

In my world, I mean the world inside my head, they mattered a lot.  Grades and their accompanying justification are supposed to give students the feedback they need to progress, and to make sound educational and career decisions.  When I look back on my own experience as a student, I am shocked by how infrequently I was tested in a thorough and convincing fashion.  Grades were used as punishment in some cases; in others they were gestures of sympathy; at best they were a pat on the back.  I never felt I was being reasonably tested or justly evaluated; nevertheless, I still allowed grades to determine my path in university, and in high school for that matter.  A low grade meant that subject would be dropped next year; a high grade determined my next major.  No-one ever gave me clear instructions on what I would have to do in order to get higher grades--and it always seemed unfashionable, humiliating and whiny to ask. Besides, my grades were always high enough to get by.  I never clearly understood how my grades were being determined, what specific criteria were being used to evaluate me, and now that I have had a career as a professor, I'm quite sure my professors didn't know either.

Any experienced professor seeing where this argument is heading will be quick to tell you that what I am on the verge of suggesting is just not possible.  The system does not allow professors to evaluate students in the clear and comprehensive fashion I am trying to imagine.  Moreover, it never has.  As a consequence it is typical for professors to separate themselves from the entire business of grading.  Marking is turned over to students; marks are arrived at in some comfortable, non-judgemental fashion, or avoided in favour of pass/fail "exams" which no-one ever fails.   Many professors, myself included, feel that their job is to inform and encourage students, not judge them.

On the other hand, there is no quality control system for university degrees.  The assumption is that this work is being done by the people who teach, but at the same time these teachers are under constant pressure, from university administrations as much as from students, to give good grades.  There is no upside to teachers' diligently, conscientiously and rigorously evaluating their students--except perhaps the silent pride which comes from the conviction that you are doing your job. The periodic evaluation of students should be an important part of the educational process.  Grades are a reflection of the underlying education that students are getting (or not getting). The problem isn't grade inflation in itself (and I don't doubt that examples of unjustly harsh evaluations are numerous--exceptions which prove the rule), but that the constant growth of grade inflation has been accompanied by a corresponding devaluation in the worth of university degrees.

The discussions of the "housing bubble," the "financial bubble" and the possibility of an "education bubble" have gone on for years now.  Grade inflation in and of itself is not a great concern, but the fact that it has reached the point of making grades meaningless is a sign that the "education bubble" may have already burst.





Tuesday 19 January 2016

How Did University Degrees Become Subprime Mortgages? Part II

If you made it to the end of my last post you will have the same question I did.

Why did Finance Companies and our ersatz Mr. Finance Company Guy arrange a mortgage for Michelle, our hypothetical borrower, when he should have known quite well from experience and the numbers that she would likely be unable to pay it off?  To answer this question we have to introduce a bond trader on Wall Street in New York.  We can even give him a real name.  We can call him Howie Hubler



because Howie Hubler really exists and he is the most infamous example of what was happening in the bond markets in 2008.  A bond, like a mortgage and like money, is a kind of IOU.  It’s a piece of paper that says someone owes someone an amount of money.  Bonds have always been considered good, solid, conservative, low-risk forms of investment.  Usually bonds were IOUs from big companies or banks or even countries, but any kind of debt can be incorporated into a bond and resold to investors.  Unless you are an investment banker, you have to get used to the idea that one person’s debt is always someone else’s investment. For the most part, debt is what financial markets buy and sell.

Leading up to 2008, bonds were being created by bundling together a lot of subprime mortgages.  Imagine that Michelle’s mortgage and 99 other mortgages just like hers were put together in a bundle.  All together they would create a bond (or what was called an “asset-backed security” or "collateralized debt obligation") which, on paper, was worth 17 million dollars.  Howie Hubler and his ilk on Wall Street would turn around and sell that 17-million-dollar bond to pension funds, banks, and other investment companies around the world.  The answer to our question, the reason Mr. Finance Guy gave Michelle the mortgage in the first place, was that he could immediately turn around and sell it to a Hublerite on Wall Street, and the Hublerites would sell it inside a bond to investors all over the world.
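The bundling arithmetic above can be sketched in a few lines of Python. The per-mortgage figure of $170,000 is an assumption on my part, inferred simply by dividing the 17-million-dollar bond by its 100 equal-sized mortgages:

```python
# Sketch of the bundling arithmetic: 100 subprime mortgages packaged
# into one bond. The $170,000 per-mortgage value is an assumption
# (17 million / 100), not a figure from any actual deal.

mortgages = [170_000] * 100  # Michelle's mortgage plus 99 others like it
bond_face_value = sum(mortgages)

print(f"Bond face value: ${bond_face_value:,}")
print(f"Mortgages bundled: {len(mortgages)}")
```

On paper, the bond is worth exactly the sum of its parts; the catch, as the rest of the post argues, is that no-one checked what the parts were actually worth.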

Pause and take note that, despite what we are constantly told about capitalism, the people who were making the real money were taking no risks; in fact, as the saying goes, they had no skin in the game. Howie and Mr. Finance Company Guy got paid no matter what.  They could remain indifferent to the quality of the mortgages and bonds they were selling.  In fact, being attentive to the quality and viability of the paper they were selling would have been counter to their financial interests.  The only thing that mattered to them was that they move a lot of "paper" from Main Street to Wall Street to Global Markets.  The hypothetical 17-million-dollar bond I mentioned above was peanuts to the Hublerites on Wall Street.  Merrill Lynch, the company Howie worked for, was paying him 25 million a year and he complained of being underpaid.  Even after Howie lost 9 billion dollars on "credit default swaps" and was fired, he still walked away with a 10-million-dollar severance payment in addition to the millions he would have salted away over the years.

Confession of ignorance #2:  I hadn’t heard of Howie Hubler or “collateralized debt obligations,” nor did I understand how "credit default swaps" worked, until I read The Big Short.  My general impression of how subprime mortgages were bundled into bonds and resold was correct, but The Big Short gave me the details and explained how Mr. Finance Guy and Howie could make enormous amounts of money, for as long as the bubble lasted, by moving worthless pieces of paper around.  Actually, calling these investments “pieces of paper” gives them more substance than they had in reality.  Not only were the transactions digital, but when Main Street Finance Guys couldn't keep up with the demand on Wall Street for more mortgage paper, some Wall Street gurus, as Michael Lewis reports, figured out a way to create mortgages out of nothing.  When financial analysts went looking to see what was inside CDOs (“collateralized debt obligations”), hoping to find "Michelle's mortgage," for example, they came away saying it was impossible to say what was in a CDO.  

Think about it!  People were paying billions of dollars for CDOs, but no-one was able to say exactly what was being purchased when you bought a CDO.  No problem!  As we now know, the government of the USA stepped in and purchased hundreds of billions of dollars' worth of this bad, toxic, imaginary debt, and all the perpetrators were let off the hook--except, of course, Michelle and people like her, who were evicted from their homes, lost savings and pensions, and were forced into debt and bankruptcy. 

Parallel #2:  In both cases, Michelle’s mortgage and Michelle’s BA, a huge system has been constructed, an upside-down pyramid, based on the assumption that Michelle is going to do what is expected of her.  Despite Michelle’s good intentions and hard work, the open question remains:  was Michelle being provided with the conditions which would allow her to succeed?  In the case of her mortgage, the evidence is now clear: she had little hope of paying off the mortgage and owning a house.  The whole venture was to be a losing proposition for her.  What about her BA?

I have to hold back the vitriol in launching accusations against Mr. Finance Company Guy, who sold Michelle her mortgage, because in terms of education I am his homologue. I never consciously deceived or misled a student, and I remained convinced throughout my career that I was doing something beneficial for my students.  However, I suspect that the average Mr. Finance Guy could make the same claims about his mortgage customers.  

Like Mr. Finance Guy, I felt pressure to attract students and get them through the program.  I know lots of individuals who were conscientious and diligent, working to ensure that Michelle got the best education possible--and I count myself among them.  Overall, anecdotal evidence I’ve gleaned suggests that my undergraduates have done better than average in finding employment. Nonetheless, it seems clear to me that just as the problem with the financial markets was that no-one was paying careful attention to the details of Michelle’s mortgage, no-one is paying careful attention to the details of her BA.   As the system currently stands, there is no incentive for anyone to be particularly attentive to the make-up and quality of the education that Michelle is receiving.  

For university professors interested in advancing their careers, far from "no incentive," as I have pointed out in earlier posts, there is a significant disincentive to getting too fussy about the quality of their courses or teaching, or becoming preoccupied with the overall quality of education their students are receiving.  In universities there are only two roads to advancement: research or administration.  The cliché of "publish or perish" has more purchase than ever.  The number and salaries of university administrators have ballooned, while most of the teaching is left to lowly adjuncts and part-time lecturers who are destined to remain at the bottom of the ladder in terms of salary, job security and status.  

Are administrators and tenured professors concerned about the level of pedagogy in their universities?  The most interesting and telling aspect of this question is that we can't know the answer, because the question never gets asked.  Lots of lip service gets passed around as part of every university's sales pitch, but it is simply not something that professors ever discuss.

In keeping with a typical business model, universities are concerned with enrolment and completion rates.  The system offers no incentive for anyone to be preoccupied by what happens between registration and graduation.  The system is driven by ideology, narcissism, petty politics, turf wars and the obsessive compulsions of fastidious low-level administrators, but because there is little to no consensus about what students should be taught, there is no tracking or sharing of information about what students are supposed to have learned.  


My recent posts might create the false impression in readers' minds 1) that I view university education as somehow comparable to the financial markets, and 2) that I have some silver-bullet solution in mind for how to fix university education for all time.  This comparison of universities and the financial markets is a demonstration of how disastrous the business model is for education.  I have seen numerous panaceas proposed to cure all that ails university education and invariably come away with the impression that there is no one solution to fit all situations.  The solutions that seem clear and viable to me are the ones whose outcomes are least predictable.  We need to empower those who teach, those who can facilitate effective teaching, and those who want to learn, and then, to quote Death of a Salesman, "attention must be paid" to what is happening to students every step of the way from pre-registration to career.
