Monday 21 March 2016

The Art of Complaining

“Complain, complain, that’s all you do
Ever since we lost!
If it’s not the crucifixion
It’s the Holocaust.”
L. Cohen

In my brief (five years) and tiny tenure as an administrator responsible for an array of university programs, one of my duties was to receive student complaints.  Students usually had real or at least honestly perceived grounds for complaint.  The typical complaint was about the quality of instruction or the instructor of a particular course.  Frequently, the student would announce a shift of discourse with the phrase “It’s not the reason I’m here, but . . . .”

The irony of the situation was that if a student wanted to complain about a grade or even the evaluation of a particular assignment, that was a situation I could easily deal with--and that was the point students would take twenty minutes to get to.  The university had rules and procedures in place for reassessing a mark.  As I discovered the hard way, the university provided no legal means for dealing with a lackluster or incompetent teacher.  Like the psychoanalyst in the old lightbulb joke--who can change the lightbulb only if it really wants to change--I could only change an instructor if he/she wanted to change.

Being faced with complaining students reminded me of my early days as a steward in my ESL teachers’ union.  The principal duty of a steward was to represent and counsel teachers through the grievance procedure, and we were given a weekend-long course on how to grieve (the legalistic verb meaning “to file a complaint,” not a synonym for “to mourn”).  Step one and rule one of the grievance process was to know what my brother and sister union members wanted; that is, what outcome, from a limited number of possibilities, they were looking for from the grievance.  Sounds simple, right?  I found this advice logical, compelling and useful, but the objective is precisely what people most frequently lose track of in the process of complaining.  This lack of focus is, I believe, what gives complaining a bad name.



Decades later, at a university department meeting, my colleagues were complaining bitterly, one after another, about how quick students were to complain and how often they did.  I interrupted the brouhaha to suggest that complaining was a good sign; it meant students cared and, furthermore, I was thinking of preparing a module on “how to complain” for one of my courses.  My colleagues were not amused.

I really believe that complaining is beneficial, that we all benefit from those who have the wherewithal and courage to complain.  They are the whistle-blowers of everyday life, but the problem with complaining is one of degree, of frequency, of being identified with the “boy who cried wolf” once too often.  The conundrum for the would-be complainant then becomes the proverbial “separating of the dancer from the dance”: how to complain without being a complainer.  Although I was told when travelling in Europe that it paid to pretend to be French because the French were known to complain and would, therefore, get good service, I was never able to empirically verify this hypothesis--but it makes sense.

I have also been warned that my attitude toward complaining as being about outcomes was masculinist.  (Excuse the gender stereotyping here.  I’m just the messenger.)  I have been informed that when a woman is complaining, a man’s suggesting a (peremptory and perfunctory) solution to the problem simply compounds her frustration and irritation.  It took me a while to understand this instruction, but I have come to recognize the universal principle that the less you know about a problem, the easier it is to imagine a solution.  If you (and I) immediately see an obvious, quick and easy solution to a problem being presented to us, chances are we have failed to understand the details and recognize the complexity and intricacy of the issue.

There is a phenomenon that is usually identified as complaining but is really “self-expression”—as vague as that locution is.  Sometimes it is necessary or at least healthful to decompress, to vent, to exhale with expletives.  What passes for complaining is often just thinking out loud.  Sometimes we just need to hear our own words (in my case read them) in order to clarify our own thinking to ourselves.



I used to be a fan of the television series House.  Dr. Gregory House, out of context, always sounded like he was complaining, but he was carrying out a process of “differential diagnosis.”  I didn’t quite know what that meant until I read Crilly’s definition of “differential calculus.”  Both cases are studies of change:  what has changed, what needs to change, the speed of change, the meaning of change, the prognosis and prescription for change.   Complaining is a differential science and a differential art.
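
A quick gloss, using the standard textbook definition of the derivative rather than Crilly’s own wording: the differential calculus makes “the study of change” precise by assigning to a quantity $f$ its instantaneous rate of change,

$$ f'(t) = \lim_{h \to 0} \frac{f(t+h) - f(t)}{h}, $$

which answers, in one symbol, the diagnostician’s first questions: what is changing, how fast, and in which direction.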



Thursday 17 March 2016

“Let’s End the Myth that PhDs Are Only Suited for the Ivory Tower.” Really! Why?

“Let’s End the Myth that PhDs Are Only Suited for the Ivory Tower.”  This was the headline of an opinion piece in the Globe and Mail written by Queen’s University’s Dean of Graduate Studies.  The article reminded me of meetings our tiny caucus of English teachers used to have once or twice a year with our faculty’s Dean.  Invariably, at some point in the meeting, the Dean would turn to my colleague who was responsible for our section’s graduate programs and ask:  “How many new admissions do you have for next semester?”



Everyone in the room knew, with the possible exception of the Dean (and I suspect he may have known as well), that at this point we had two or maybe three new admissions.  Invariably, my colleague would look surprised and begin to shuffle papers.  Having briefly occupied the position before him, I had a rough idea of the bafflegab he was preparing to deliver, but I was never as good or as practised at it as he was.

“Well,” he would begin, “in order to give the most up-to-date numbers I would have to include the inquiries that the secretary passed on today. With those six, and two students from France who emailed, and of course we have identified eight of our own BA students, and two MAs, as well as the returning students, and that’s right, Theresa Somebody, a really excellent candidate will be coming in as soon as she gets confirmation of her funding request . . . .”

Thanks to the miracle of my colleague’s loaves-and-fishes rhetoric, the Dean could claim that our two new admissions appeared to be somewhere in the neighbourhood of twenty-two.  As long as no one insisted on knowing accurate numbers, we all had plausible deniability when, next semester, our graduate seminars turned out to be the size of tutorials.

The “Let’s End the Myth” piece in the Globe is an extension of the same desperate smoke-screen rhetoric designed to dissuade anyone from asking for accurate numbers.  

Sure, let’s end the myth that PhDs want to be professors, and at the same time we can get rid of the myth that people who go to medical school want to be doctors, or that people who go to law school want to be lawyers, or that people who study accounting want to be accountants.  Or, we could go the other route, and take a look at accurate numbers for the occupational outcomes of PhDs and deal with the situation we know exists.

Before we get to the big question, we should stop to consider that—at best—only 50% of people who start a PhD in the humanities actually finish.  The average length of time to complete a PhD is seven years; the mode is ten.  Of the lucky 50%, how many, after five, seven, or ten years of hard work and study, get the tenure-track university positions which at least 86% of them have declared they covet?  And the answer is . . . wait for it . . . we don’t know!

How can we not know something as basic as how many PhDs get tenured jobs?  Just like my colleague who had to hide the fact that we only had two new students, universities in general have to hide the dismal outcomes for PhDs.  To reveal the numbers would put courses, programs, prestige, credibility, funding and ultimately positions at risk.

What do we know?  According to the available statistics, which are somewhat out of date (from 2011) and optimistically inaccurate, 18.6% of PhD graduates got full-time teaching positions in universities.  “Full time” does not mean permanent.  Crunch those numbers!  You start with 100 PhD students; at best, only 50 of them survive the hard, five-to-ten-year slog and complete the degree.  Of those 50 successful PhDs, we don’t know exactly how many got the tenure-track university jobs which, for the great majority of them, were the goal of the PhD in the first place—but we do know it was fewer than nine (9 of the original 100).
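
To make the arithmetic explicit, here is a minimal back-of-the-envelope sketch in Python, using only the figures cited above (the 50% completion rate and the 2011 figure of 18.6% full-time placement; the 100-student cohort is just an illustrative round number):

```python
# Back-of-the-envelope sketch of the PhD pipeline, using the post's figures.
cohort = 100             # illustrative cohort: students who begin a humanities PhD
completion_rate = 0.50   # at best, half finish (figure cited above)
full_time_rate = 0.186   # 2011 figure: share of graduates in full-time teaching posts

graduates = cohort * completion_rate    # 50 finish the degree
full_time = graduates * full_time_rate  # about 9.3 land full-time positions

print(f"{graduates:.0f} of {cohort} finish the degree")
print(f"at most {full_time:.1f} of the original {cohort} get full-time positions")
```

Since tenure-track jobs are only a subset of “full-time” jobs, the true figure is below 9.3 in the original 100--the “fewer than nine” cited above.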

Given these depressing outcomes, you might imagine that universities are working hard to increase the number of tenure-track positions or downsize doctoral programs or a bit of both.  On the contrary, from everything I have read, the dominant priority of universities is still to maintain or increase PhD enrolments while hiring small armies of underpaid and underprivileged adjuncts, sessional and part-time lecturers to do most of the teaching.  Why? You might well ask.

Crudely put, screwing over PhDs has for decades been the national sport of academia.  For the university as well as for individual departments and programs, the PhD student is a cash cow for government funding.  Additionally, PhDs are still a mark of prestige.  Universities identify themselves as either PhD-granting (one of the big boys) or non-PhD-granting (not so big) institutions.  Individual professors applying for research grants will have their prospects vastly enhanced if they can point to PhD candidates who will serve as their research assistants.  The lucky professorial few who win the grant money can use it to travel to conferences in Miami, Mexico and Honolulu, while the bull work of their research projects is carried out by their minimum-wage PhD research assistants.

Universities may put out foggy and evasive arguments suggesting that the problems and the solutions are incredibly complicated—they aren't.  The solution isn't for McDonald's or Rogers or Walmart to hire more PhDs.  The PhD has to be the minimum requirement for teaching in a university (with rare, obvious and fully justified exceptions) and, for that matter, for any other significant position within the university hierarchy.  Any PhD holder who is teaching at a university should automatically qualify for tenure.  The ballooning administrative and support budgets of universities need to be redirected to pedagogical objectives.

As I pointed out in an earlier post, “universities have a vertical monopoly, being both the exclusive producers and major employers of PhDs.”  It’s time universities began to acknowledge and correct their abuse of this monopoly.




Friday 11 March 2016

If You’re One of “the Good Guys,” Do You Still Have to Worry about the FBI Accessing Your iPhone? With Addendum.

In some ways, we have not completely escaped the prejudices of our oral ancestors.  There is always a lingering suspicion that someone demanding privacy must have something to hide.

Last week the Director of the FBI was on television arguing for the agency’s right to unlock the particular iPhone used by the ISIS-inspired San Bernardino terrorist—and, by extension, all iPhones.  His justification is that we are “the good guys” and we’re trying to catch “the bad guys.”  It’s hard to imagine a weaker a priori argument, for the simple reason that in the history of governments, tyrannies, military juntas, secret police forces, and dictatorships there has never been one that announced to the world, “we are not the good guys!”

Nonetheless, personally, I have nothing to hide, and I'm a Canadian with a very non-ISIS-sounding name and a regular readership of fewer than a dozen people for this blog.  (I am proud to have a select and discriminating readership.)  The ultimate defense against being surveilled by the FBI or some other secretive police force is to remain irrelevant and insignificant.  I have nothing to fear, nor do you, right?

Still, it rubs me the wrong way that it is exactly police forces like the FBI, which insist on the importance of secrecy for themselves, that challenge the rights of individuals to have secrets.  When I start thinking about the people who probably thought of themselves as one of "the good guys" (in the current, colloquial, gender-neutral sense of the term "guys") but were unfortunate enough to cross paths with the FBI, I realize that you are only one of "the good guys" until the FBI decides you're not, for whatever secret reasons it might have.

Consider some famous cases.



Ernest Hemingway, the renowned American novelist, was hospitalized for six weeks in the psychiatric section of St. Mary’s Hospital in Rochester, Minnesota, where he was receiving electroshock treatments.  Hemingway was diagnosed as suffering from paranoid delusions because of his constant ranting that he was under surveillance by the FBI, that even the hospital phone was tapped, and that his nurse, Susan, was working for the FBI.  One week after he was released from hospital, Hemingway shot himself.

“Fifty years after his death, in response to a Freedom of Information petition, the FBI released its Hemingway file. It revealed that beginning in the 1940s J. Edgar Hoover had placed Ernest [Hemingway] under surveillance because he was suspicious of Ernest’s activities in Cuba. Over the following years, agents filed reports on him and tapped his phones. The surveillance continued all through his confinement at St. Mary’s Hospital. It is likely that the phone in the hall outside his room was tapped and that nurse Susan may well have been an FBI informant” (Hemingway in Love 167).



Sunil Tripathi, a 22-year-old Brown University student who had gone missing a month earlier, was falsely identified as one of the Boston Marathon bombers after the FBI released surveillance photos of the suspects.  His body was discovered in the Seekonk River on April 23, 2013; he had taken his own life.



Monica Lewinsky was a 23-year-old Washington intern when she engaged in various kinds of sexual activity with then-President Bill Clinton.  Whatever moral compass you might bring (or not) to Lewinsky's tryst with the President, it seems obvious that the affair did not constitute a crime or a threat to public security.  Nonetheless, on January 16, 1998, Monica Lewinsky was held in a hotel room by FBI agents and threatened with 27 years of imprisonment if she did not reveal the details of her relations with the President.  She was also told that the FBI would arrest her mother, who could be imprisoned for two years (http://law2.umkc.edu/faculty/projects/ftrials/clinton/lewinskyday.html).

(Whenever I reflect on this kind of prurient political theatre, I think of Pierre Trudeau's 1967 declaration, made when he was still Justice Minister, in anticipation of the Omnibus Bill: "There's no place for the state in the bedrooms of the nation."  Someone needs, once and for all, to declare the converse: "There's no place for the nation in the bedrooms of the state.")

On December 27, 2001, Martha Stewart propitiously sold stock in a friend's company and thereby avoided a potential loss of $45,673--a minuscule amount considering her estimated wealth at the time was $700 million.  Her friend, Sam Waksal, was being pursued by the FBI for insider trading in the stock of his own company, ImClone.  Martha Stewart was never convicted of insider trading, but she did serve five months in a federal prison and two years of probation for lying to the FBI about details of the stock sale (http://coveringbusiness.com/2012/05/15/what-martha-stewart-did-wrong/).

(I still can't figure out exactly what crime Martha Stewart committed if, in fact, she did commit a crime, but it's hard not to compare her case with the Wall Street companies that lost hundreds of billions of dollars in what seemed like fairly obvious mortgage and bond fraud schemes--and the result was that they were bailed out with taxpayer money, CEOs continued to receive bonuses and severance packages, and virtually no senior Wall Street insiders were ever charged with a crime.)


Addendum

Now perhaps we should include former Secretary of State and presidential candidate Hillary Clinton in this list!




Saturday 5 March 2016

Privacy Versus Security: Debating a False Dichotomy

Is privacy necessary?

Is privacy really an innate human desire?  Is it normal to want to be alone?  While it seems intuitive and logical to assume that our culture and technology have evolved in response to a basic human desire for privacy, anthropologists, as well as communication and cultural theorists, have argued that the cause and effect run the other way around.  Our habits, customs, created environments and mindsets are not a response to a primordial human need.  Technological culture created the idea of, and the need/desire for, privacy.




Oral culture

In oral societies (that is, societies which depended on direct, person-to-person oral communication), the desire to be alone was immediately identified as a symptom of illness.  In a world dominated by orality, today’s millennial otaku introvert generation would have been treated either as deities or as mad demons.  They might have become the oracles living in caves at Delphi, or the first monks dedicating their lives to transcribing ancient scripts, or they might have been imprisoned, starved, tortured and burned at the stake.  We should also consider, given cultural ecology’s displacement of the natural environment, that the neurodiverse, digi-destined, screen-slaver generation might be the next step in the evolution of our species.

Privacy is a byproduct of visual culture

Privacy is a byproduct of the visual culture created by the development of literacy: from basic forms of writing, to the phonetic alphabet, to Gutenberg’s printing press, to the digital universe we know today.  Reading meant it was possible to be alone and still be connected to the world in important, informative ways.  In fact, the most serious forms of communication and knowledge-gathering were, in this new visual/literate culture, best done in solitude.  In an oral culture, being alone meant you could only be talking to yourself or to a god--both of which were suspect if not dangerous activities.

Compartmentalized living

Living in spaces that have one room for cooking, another for sleeping and another for gathering might seem “natural” to us now, but our early ancestors would be mystified by our insistence on compartmentalizing our daily activities.  Primitive man might have agreed with the dysphemistic adage that “You don’t shit where you eat,” but beyond the scatological, compartmentalized privacy is cultural, not natural.

No doubt our primitive ancestors at times needed to be out of view, literally in hiding from enemies and predators, as a matter of security. Hence the overlap and confusion between privacy and security, between solitude and survival.

A Gun or an iPhone:  Which is more dangerous?

Fast forward to the debate between the FBI and the Apple Corporation about unlocking the iPhone once used by the ISIS-inspired murderer who killed 14 people in San Bernardino.  On the surface, the request is to access one iPhone, but in reality the FBI is asking for the ability to access all iPhones.

The debate is being couched in terms of individual privacy and public security, but this is a false dichotomy.  All things being equal (and they never quite are), security trumps privacy.  (And the pun is intended, since Republican presidential aspirant Donald Trump [a.k.a. Drumpf] has already declared that all Americans should boycott Apple.)  History has proven over and over again that this debate is between individual security and collective security, a debate closely tied to the more typical dichotomy of individual rights versus collective rights.  In the American context, the priority line between collective and individual rights and security tends to slide around like the dial on an old-fashioned radio gone wild, depending on the issue--abortion, gun ownership, medical insurance, seat belts, drugs, homosexuality, same-sex marriage, civil rights, equality for women, and so on.  During the debates among the Republican presidential candidates, President Obama was chastised for using the San Bernardino shootings as an opportunity to challenge the Second-Amendment rights of American citizens to "bear arms."  In this mindset, a locked cellphone poses a much greater hypothetical threat to public security than an assault rifle and thousands of rounds of ammunition.

NSA, CIA and you:  Who has the right to have secrets?

In his autobiography, Playing to the Edge: American Intelligence in the Age of Terror, Michael V. Hayden, former director of the NSA and the CIA, points out that "Stellarwind," the NSA program to gather data on Americans' telephone calls which was outed by Edward Snowden, “did indeed raise important questions about the right balance between security and liberty.”


In his review/commentary of the Hayden autobiography, "Can You Keep a Secret?", New Yorker staff writer George Packer points out that last week Hayden "sided with Apple in its privacy dispute with the F.B.I." while continuing to tacitly support the CIA's programs of torture and human-rights abuses.

Secrets and safety

In his review, Packer comments:

Spooks in general have had a lot to answer for in the past decade and a half: the 9/11 attacks themselves, Iraq’s nonexistent weapons of mass destruction, secret prisons, torture, warrantless eavesdropping, the bulk collection of Americans’ data, and targeted killings.

With this recent history in mind, it seems obvious that individuals, as a matter of personal security, need to protect themselves not just from malfeasance but also from the mistakes, the callous indifference, the questionable ethics and the politically/ideologically dictated overreach of secret and secretive police forces like the NSA, the CIA and the FBI.






"Three Days of the Condor" and the Tenth Anniversary of "The Sour Grapevine"

Sharing Intelligence I'm still obsessing over " sharing intelligence ."  May 15th was the tenth anniversary of this blog.  I w...