
Wednesday 26 March 2014

“Critical Thinking Skills” and “Family Values”

“Critical thinking skills” and “family values”:  these days it is typical to imagine that these concepts are dichotomous.  In the binary thinking of those people who espouse strident opposition to binary thinking, these expressions sit in mutually exclusive opposition to each other.  In other words, it is assumed that if you have any “critical thinking skills” you cannot believe in “family values.” What strikes me is how much these phrases have in common.

What these locutions share is the fact that their literal, obvious, word-for-word, face-value meanings are no longer what they mean.  “Family values” doesn’t mean that you value family.  "Critical thinking skills" as taught in most universities aren't skills and rarely show signs of clear thinking, though they are invariably critical.  In both cases, these expressions have taken on a level of meaning that the essayist Roland Barthes calls “mythology.”  In simpler terms, their connotations (what these phrases suggest) have become more important, more widely and significantly understood, than their denotations (the literal meanings of the words).  

These days the expression “family values” tends to suggest (more than anything else) the value system associated with the evangelical, religious right in the USA.  This domination and precedence of connotation over denotation is confirmation of the theory associated with Mikhail Bakhtin that how words are used over time affects their meaning as much as the dictionary definition does.  In fact, how they are used eventually becomes the dictionary definition. What “family values” has come to mean is a result of the fact that the expression has historically been used to oppose family planning (at the turn of the 20th century it was a crime to send contraceptive devices through the mail, for example) and as justification for denying employment to women.  “Family values” was another, nicer way of saying “a woman’s place is in the home.”  “Family values” could be used as a basis to attack not only abortion, but homosexuality, lesbianism and various forms of non-procreational sex.

Just as the expression “family values” has come to signal an attitude more than what the words themselves mean, “critical thinking” has become code for left-wing, materialist, feminist thinking and attitudes.  As it happens, I have always been of the opinion that if you exercise critical thinking skills they will eventually lead you to left-wing, materialist, feminist thinking and attitudes.  The problem, of course, is that if I as a professor profess my left-wing, materialist, feminist leanings and conclusions to my students and they follow along and agree with me, at no point are they actually exercising their own critical thinking skills.  I am understating the case.  In fact, university students are measured by the degree to which they reject and rebel against right-wing ideologies, patriarchy and idealism or dualism.  The problem isn’t with the conclusions, but with the process, which is basically that they are being taught a series of opinions as if they were religious dogma.  Having absorbed this teaching, students are encouraged to expect good marks for having the “right” opinions without having demonstrated the logical reasoning skills which led them to these conclusions.

The causes of this malaise are not abstract or purely academic.  The demise of what “critical thinking” should be was provoked by the rise of deconstruction and the concomitant, haphazard decline of university departments of philosophy.   Most of the theory which paraded under the banner of deconstruction was nonsensical.  I saw Jacques Derrida being interviewed on French television a couple of years before his death, and he seemed honestly embarrassed to be the father of deconstruction.  He insisted that it was not a theory of any importance, not even a theory, not even a word that he used anymore.  However, in true Derridean, deconstructionist fashion he subsequently used the word at least a half dozen times in answering the final question of the interview.  I came to understand what “deconstruction” was (and more importantly what it wasn’t) by reading John Ellis’s succinct monograph Against Deconstruction, published in 1989. 



As Ellis points out, when its promoters attempted to define it, they typically defined deconstruction as an attack on or opposition to “logocentrism.”   The challenge then became to try to understand what “logocentrism” was, only to discover that deconstructionists were as foggy and obscure about defining logocentrism as they were about deconstruction itself.  Here is Derrida’s comment on logocentrism from the opening sentence of his seminal work, Of Grammatology:

[ . . .]  “le logocentrisme  : métaphysique de l’écriture phonétique (par exemple de l’alphabet) qui n’a été en son fond -- pour des raisons énigmatiques mais essentielles et inaccessibles à un simple relativisme historique -- que l’ethnocentrisme le plus original et le plus puissant, [. . .].”

In English, without the multiple parentheses:  “logocentrism:  the metaphysics of phonetic writing [ . . .] which was at its base [ . . .] nothing but the most original [meaning earliest] and most powerful ethnocentrism [ . . .].”

I have done my best not to play games with the translation.  It is clear that “logocentrism” is like “ethnocentrism” and, therefore, to people like me who live in and admire multicultural society, logocentrism must be something bad.  The single sentence from which I have taken this quotation runs for 400 words.  (Okay, I only counted the first 175 and estimated the rest.)  No, I still don’t know what logocentrism is, but I do know that “logos” is the Greek word for “reason” and “logic,”  and that in the opening sentence of Of Grammatology, as run-on and gobbledygook-ish as it is, Derrida, by attacking reason and writing as Western prejudice, digs himself a hole that neither he nor anyone else can dig out of.


At exactly the same moment that Derrida was turning logic, reason and clear writing into objects of suspicion, universities were following the established business model and downsizing the study of philosophy on the grounds of a lack of sex appeal.  Logic and reason, of which departments of philosophy were the crucibles, were being hammered from both sides.  The remnants of the collision are the glib or purple descriptions of “critical reasoning skills” on university web sites which bury logic and reason somewhere in the hinterland of a third paragraph or fourth priority.

Wednesday 15 May 2013

Time to Blow the Whistle

What does "education" mean?

Teachers, past, present and future, and students, it's time to blow the whistle.  Complaints and confessions are needed.  Name any problem--crime, depression (economic and psychological), sexism, racism, drug abuse, the breakup of marriages and families, etc., etc.--and someone has already proposed that "education" is the solution.  Does anybody ever stop to consider what these specialists (politicians, administrators, sociologists, ecologists, psychologists, pedagogues and functionaries) mean by "education"?


New myths for old

The world-renowned literary theorist and educator Northrop Frye described education as the process of getting rid of old myths in order to replace them with new ones.  Frye was a great believer in "myth," so his declaration isn't quite as cynical as it sounds.  So let me play the cynic, although as you might guess, like most cynics, I'm really just a slightly bruised idealist.


Education versus cognitive bias, ideology and prejudice

Everywhere I look at what passes for "education," I see one group of people trying to impose their thoughts and beliefs on another group.  (Liberal-minded educators will object to the verb "impose," but whatever verb you choose--"transmit," "share," "pass"--the end result is the same.)  "Education" too often means simply replacing one set of ideas with another set that the educator likes better. Unfortunately, whenever you ask someone why one set of ideas is better than another, you very quickly find yourself running in a circle, trapped in a tautology, exhausted by a conversation that never quite takes place. 'My ideas are better because they correspond to my values.  My values are better because they correspond to my ideas.'


Critical thinking skills and postmodernism

Lots of university programs in the Humanities and Social Sciences pretend to have solved the problem by flashing "we teach critical thinking skills" on their web sites.  The sad truth is that much of what gets taught as "critical thinking" is anything but.  Far too often, what passes for "critical thinking" in universities is slavish, dogmatic adherence to the loosely reasoned ideologies of armchair socialists and armchair feminists.  (I speak as a socialist and feminist with a longstanding commitment to his armchair.)  But think about it, really:  if there were any commitment to "critical thinking" in universities, would we still be forcing students to read the bogus diatribes of junk theorists like Lacan, Kristeva, Derrida, Bhabha and their ubiquitous spawn as if they all made perfect sense?

Students cannot be called upon to effectively exercise critical thinking skills until they have amassed a bank of uncritical thinking skills and knowledge.  This is a problem that universities do not want to address, but which we need to talk about.


Dogma is the enemy of learning

Since this is my first posting, I guess I should explain what I think this blog is about.  It is dedicated to speaking openly and frankly about education without having an agenda or a dogma to defend.  Education is too important to be left in the hands of specialists.  Education is the passing on of knowledge, skills and attributes from one person to another.  It is carried on every day by millions of people, many of whom have never thought of themselves as teachers or as students.  Its practices are as diverse, unique and personal as are the relationships of all those people involved in the process.  Our collective knowledge of the field is boundless.  Everyone has something important to contribute, if we have the courage to write the truth, and the respect and sagacity to read with an open mind.



Sunday 16 March 2014

What Is the Relationship Between University Education and Employment?

What Is the Relationship Between University Education and Employment?  The official answer is always absolute:  you need the diploma to get a decent job.  At ground level the answer is a matter of degree--in both senses of the word.  With some degrees the answer is redundant:  you get an accounting degree to become an accountant, a medical degree to become a doctor, an engineering degree to become an engineer.  More or less. To be honest, most of the engineers I know work in sales.  Outside the obvious cases, the relationship between a particular degree and employment is a matter of debate.  On the other hand, if a graduate from a BA in English becomes a news broadcaster on local TV, you can be sure that “television journalist” will be added to the list of employment outcomes for that degree in the university calendar and on the web site.



Inside humanities programs, the adamant answer is that a university degree is not job training.  It’s hard not to angle your nose toward the sky while saying this.  Holding a university degree proves that you know how to learn, not that you have learned any particular X or Y.  The average person will hold seven different jobs during a working career.  The degree has to transcend the specifics of any one job.

I don’t remember where the “seven different jobs” claim comes from, but I can remember using it fairly frequently.  And I believe that the degree should prove that you know how to learn, but how is it possible for a professor to know that a student knows how to learn without confirming that the student has learned a particular X or Y? We claim that our students learn “critical thinking skills” but how can we verify those “critical thinking skills”?  Do we ask the student to write an essay to demonstrate critical thinking skills?  If the student mimics the critical thinking skills the professor has demonstrated, does she get an A?  Or should she fail because she didn’t disagree with the professor and thereby failed to be critical?  I have found answers to these rhetorical questions in my own teaching but I have no idea how other professors dealt with them.  In my entire university career, not once did I address these questions with a colleague, or in a department or faculty meeting.  Pedagogy is just not one of the subjects that university professors talk about.

Early in my career at the University I was associated with a BA in Professional Writing.  I taught a few courses on writing and applied grammar, and was part of the committee charged with evaluating the program with external experts and business people.  This was a program closely aligned, at least in theory, with employment opportunities and the business community.  Students in the program went on paid internships during their studies, and the most typical reason that students didn’t graduate was that employers offered them jobs before they had finished the degree.  On an ongoing basis we had twice as many available internships as students.  However, we were never able to attract enough students to justify the program’s existence.  The experience taught me that “a job” was not the most profound attraction when students were choosing their undergraduate field of studies.  The cachet, the possibilities, however unlikely, that a degree suggested were far more attractive than the guarantee of work.

My colleagues who didn’t teach writing made it very clear to me both directly and indirectly, that a degree in Professional Writing was something to be looked down upon.  Teaching writing skills was the bottom of the prestige ladder at universities, something to be assigned to the least qualified, non-ascending personnel.  Students were supposed to learn how to write in high school.  Writing skills were simply below the level of university education.  These colleagues had a point, except that they were the first people to complain bitterly about their own students’ lack of writing skills.  It seemed inappropriate to me to denigrate the courses and the people who were trying to solve the problem, but that is what happened.

Professional Writing had another strike against it within the academy.  Whether it was right-wing snobbery or left-wing ideology didn’t matter; it was clear that a degree that kowtowed to business and/or was closely tied to students’ getting jobs was considered beneath the ethos of university studies.  I was susceptible to these pressures and prejudices, and as time wore on I came to teach, almost exclusively, the more prestigious literature courses (which were of course disparaged by the faculties teaching the professions) while remaining nostalgic for my old writing and grammar courses.  My dilemma was solved when the University closed down Professional Writing because of low enrollment.

On the other hand, I came to understand the shortcomings of attempting to tie a university degree to a particular form of employment and to the business community in general when I served on the committee to evaluate our Professional Writing program.  I remember one internet entrepreneur being very insistent that a professional writer should know at least three computer coding languages.  Outside the university, Professional Writing tended to mean Technical Writing, which implied a degree in science or engineering prior to the writer honing his writing skills. This example notwithstanding, the general rule is that a university education has to supply a much greater knowledge base than any entry-level work position will require, but it also has to be a guarantee that a graduate has full control of a portfolio of requisite skills.


My misgiving concerns the growing tendency that I have witnessed in universities to abandon any responsibility for skills training and to show only minimal concern for exactly what knowledge a student is acquiring.  I have witnessed and been a participant in the lengthy processes involved in attempting to develop programs of study at both the undergraduate and graduate levels.  However, once a program exists, an entirely different level of forces comes into play which will determine exactly what any individual student is going to study and learn in a particular program:


  • the popularity of certain courses (students do get to “choose” courses, but the truth is in any given semester the choices are likely to be very limited; courses and programs that don’t attract students get cancelled), 
  • budgets (courses that require extra funds or have low enrolments get cancelled), 
  • available teaching personnel (as lecturers unionize they have collective agreements which give them priority to teach courses that have been assigned to them in the past. If a particular lecturer is deemed not up to the job, the easiest and perhaps only solution is to cancel the course. Courses are cancelled when no-one can be found deemed qualified.) 
  • what tenured faculty feel like teaching (Tenured faculty have a very strong if not absolute influence on the courses they themselves teach. A professor might, for example, insist on only teaching courses directly related to his research--and be accommodated.  The most heated conflict I ever witnessed first-hand was between two professors over which would teach graduate seminars).  


Programs do, of course, specify “required” and “optional” courses, but these requirements tend to be very flexible.  Professors, administrators, and students themselves can get around requirements with equivalences, reading courses, and exemptions according to the exigencies of the moment.   In the end, what an individual student ends up studying (within the very loose confines of a program's design) is left to the student’s inclinations and to chance.  As a professor and even as a program director, I never once sat down with a student’s complete transcript at the end of her degree to consider if the courses a particular student had actually done, as a whole, made sense.  There was never any discussion of what a student had actually done, how it related to the designed objectives of the program or how it might relate to employability.  This situation, which verges on haphazard, is celebrated in university calendars as students' being able to “customize their undergraduate studies.”

Sunday 5 July 2015

Binary Thinking Versus the Other Kind

I still remember from my first-year-undergraduate “Philosophy of Mind” course that the human brain is incapable of thinking, imagining or understanding one thing in isolation without bringing in another, a background, a difference, an opposite.  You can test yourself by trying to think of just one thing.  The notion of a dialectic is based on the binary functioning of the mind; every concept contains its opposite:  the notion “long” requires “short,” “big” invokes “small.”  In an even more rudimentary fashion, in order to know a “thing,” you must be able to say what is “not that thing.”

If you have ever found yourself in a debate with a postmodernist, chances are the postmodernist turned on you at some point to announce dismissively, “oh, that’s binary thinking!”  The postmodernist’s gambit trades on the assumption that binary thinking is inferior.  The bluff works because you find yourself thinking, “Gee, there must be a superior, more advanced form of thinking that isn’t binary.”  Is there?

No, there isn’t, but the trickle-down effect of postmodern intellectualizing results in something like this claim from the online “Postmodern Literature Dictionary”:

“If you use ‘binary thinking,’ you are a person who sees no gray, no fuzziness between your categories. Everything is black or white.”

In postmodern speak “binary thinking” has become a synonym for the already well-known and understood idea of “simplistic thinking,” again with the implication that those “non-binary” thinkers must be smarter than the rest of us. How did we arrive at this “two legs bad” juncture?  

The cause is rooted in “poststructuralism,” the theoretical backbone of postmodernism.  In order to understand “poststructuralism” (literally “after structuralism,” therefore a continuation and improvement of it), it is necessary to have some grasp of structuralism.  Structuralism is closely aligned with “semiotics,” a term coined by the linguist Saussure for the science of signs.  John Fiske offers a clear, accessible and succinct description of semiotics/structuralism in his Introduction to Communication Studies.

Semiotics is a form of structuralism, for it argues that we cannot know the world on its own terms, but only through the conceptual and linguistic structures of our culture. [. . . .] While structuralism does not deny the existence of an external, universal reality, it does deny the possibility of human beings having access to this reality in an objective, universal, non-culturally-determined manner. Structuralism’s enterprise is to discover how people make sense of the world, not what the world is.  (Fiske, 115)



Fiske’s description anticipates the core dispute in the feud which would eventually take place between postmodernists and empirical scientists like Sokal, as I have described in my post The Postmodern Hoax.  Current repudiations of “binary thinking” find their origin in a paper delivered by Jacques Derrida at a structuralism conference at Johns Hopkins University in 1966 entitled  “Structure, Sign and Play in the Discourse of the Human Sciences”.  (The French-language original, "La structure, le signe et le jeu dans le discours des sciences humaines", is slightly more readable than the English translation.)


In this essay, Derrida dismantles (Derrida uses the term "deconstructs") the work of anthropologist Claude Lévi-Strauss, in particular Lévi-Strauss's The Raw and the Cooked.  Although Derrida never explicitly refers to "binary thinking" or "binary opposition" in his essay, it is understood that the structure Lévi-Strauss uses, derived from the linguists Saussure and Jakobson and all of structuralism, is the binary functioning of human thought, and that this structure is the target of Derrida's critical inquiry into the "structurality of structure" (“Structure, Sign and Play in the Discourse of the Human Sciences”).

The Longman anthology Contemporary Literary Criticism, in addition to a translation of Derrida's paper, offers as an addendum a transcription/translation of the discussion which took place between Derrida and the leading lights of structuralism immediately after his presentation.  It's interesting to see some of the finest minds in structuralism struggling to understand what the hell Derrida was talking about and, at the same time, to see Derrida cornered into giving a straightforward definition of "deconstruction."   Okay, "straightforward" is never a word that can be applied to Derrida, but with my ellipses eliminating all the asides and parentheses this is what he said:  "déconstruction [. . .] is simply a question [. . .] of being alert [ . . .] to the historical sedimentation of the language which we use [. . .]" (497). This is the definition of "deconstruction" that I typically gave students, and, at the same time, I pointed out that even though "deconstruction" was supposed to be something innovative, radical and distinctly postmodern, the Oxford English Dictionary has been "deconstructing" the English language for literally hundreds of years--meaning that the OED gives you the multiple meanings of a word and the year ("the historical sedimentation") in which a particular meaning/definition can be proven to have come into usage.

 Back to structuralist anthropology. As Fiske explains:
The construction of binary oppositions is, according to Lévi-Strauss, the fundamental, universal sense-making process. It is universal because it is a product of the physical structure of the human brain and is therefore specific to the species and not to any one culture or society. (116)
Contrary to popular understandings of "binary thinking,"  the whole point of structuralist anthropology (the binary approach) is to understand how societies, through their mythologies for example, deal with the failures of and exceptions to binary opposition.  Fiske applies the Lévi-Strauss approach to a Western and concomitantly demonstrates how the approach teases out subtextual themes at play in the movie, and how this particular interpretation of the film might stretch credibility.  Even today, 50 years later, it is difficult to fathom exactly what new, radical, distinctly postmodern objection Derrida is raising.  

Certainly it makes sense to challenge how binary thinking is applied in a particular case.  The objection isn't to binary thinking but to a particular application.  If you are going to launch a campaign against food on the grounds that it causes obesity, you should at the same time be ready to present an alternative to eating food, something that goes beyond the absurd claim that "eating food is bad."





Monday 13 February 2023

On "Putin's American Cheerleaders"

Critical Thinking skills

I have to preface this post by revisiting "critical thinking skills"--that phrase used by university programs in the humanities and social sciences as a core justification for their existence.  The vast majority of university students graduate from these programs.  In theory,  millions upon millions of university-educated Americans and Canadians can claim an expertise in identifying arguments based on logic and evidence and, conversely, immediately spot logical fallacies:  the ad hominem, the straw man, guilt by association, and rhetorical obfuscation.  

"Putin's American Cheerleaders"

I read Adrian Karatnycky's Wall Street Journal article, "Putin’s American Cheerleaders: How Jeffrey Sachs, Mark Episkopos and Dimitri Simes contribute to the Russian propaganda effort" against the grain, as a string of logical fallacies light on rebuttal evidence.  The headline makes obvious the ad hominem intent to attack the authors rather than their arguments.  

We Are at war

But let's be clear:  we are at war.  The war is being fought by Ukrainians, but it is a war between Russia and the collective West, led by the USA.  The war has caused global precarity, massive destruction and the deaths of thousands.  Beyond the concrete devastation, the war in Ukraine is, above all, a propaganda war.  Arguably, propaganda will determine the outcome of this war.  In this context, we shouldn't be surprised that we are all likened to soldiers on the battlefield, and that any deviation from the Western narrative is treated as collaboration with the enemy, if not betrayal and treason. 

And yet . . .

Even if we are all conscripts in the propaganda war should we accept "to do and die" in a nuclear Crimean War without stopping "to reason why"? Is it unreasonable to invoke "thinking skills" in the midst of this war?  No-one knows the whole story of this war.  Even in Kyiv or Moscow or Washington or Berlin or London or Ottawa, even on the battlefield, even with drones and satellites, people know as much and as little as they can see and hear and read.  In a war, especially in a propaganda war like this one, enormous effort is put into controlling what is seen and heard and read. 

The Dominant Western narrative

The dominant Western narrative, primarily in the legacy media, is that escalation is the only acceptable solution to the conflict in Ukraine.  The argument is presented that Russia must be defeated because failure to defeat Russia now will lead to Russian expansionism and greater escalation somewhere down the road.  Overlaying this argument is an appeal to morality.  Russia must be defeated because the invasion and the conduct of the war are immoral, criminal and evil.  Anything less than total Russian defeat would be a victory for evil.    

Does the Western narrative hold up under scrutiny?

Under the microscope of critical reasoning skills,  the arguments for escalation do not hold up well.  Let me quickly insert that this does not mean that they are wrong or untrue.  They are simply unproven, counterfactual, hypothetical, and speculative.  We will inevitably try to imagine what Russia might do after the war, but there is a weakness in trying to be too specific and too certain about what might happen in the distant future.  We can say with fair certainty that a negotiated peace--what the Western narrative qualifies as a Russian victory--would include some sort of autonomy if not outright Russian control of Crimea and the eastern regions of Ukraine; that is, those regions with significant populations of ethnic Russians where President Viktor Yanukovych, who was overthrown in a bloody coup in 2014, had his strongest democratic support.

The Moral argument

The moral argument for escalating the war is equally weak.  The argument depends on our accepting as axiomatic that the war is between absolute evil and pure goodness.  The goal of propaganda is to promote this vision, but even cursory scrutiny of the context of the war makes this absolutist vision impossible to maintain.  Some 13,000 people were killed in the Donbas region in the aftermath of the bloody coup overthrowing President Yanukovych in 2014 and before Russia's full-scale invasion in 2022.  Even the US Congress has banned the sale of weapons to Ukraine's Azov Battalion on the grounds that the battalion openly includes neo-Nazis in its ranks.

Naming and Shaming

I first read "Putin's American Cheerleaders" because it provides a list of a half dozen Americans who question the proxy war between Russia and the West going on in Ukraine--which isn't generally easy to come by.  The article is a telling example of widespread, ham-fisted attempts to discredit, shame and silence anyone who dares to question the war. Articles of this ilk are emotionally evocative and are based on an underlying presumption of moral superiority shared by writer and reader.  The vocabulary is emotionally charged but logical consideration of risks and outcomes is avoided.  For potential outcomes, the war in Ukraine should be compared to other recent wars spearheaded by the USA--Afghanistan, Iraq, Syria, Vietnam and Korea--but these are comparisons which the dominant narrative tends to avoid.

Guilt by Association

While Mr. Karatnycky concedes that "experts are free to challenge the pro-Ukraine views held by the vast majority of Americans," he decries the fact that these American experts have appeared on a Russian program hosted by Vladimir Solovyov, whom he describes as a Russian propagandist. Karatnycky has more to say about Solovyov than about the "American cheerleaders."  The Americans' failure is guilt by association with Solovyov.  According to Karatnycky, what Jeffrey Sachs said on Russian media was

that a “massive number” of Americans “wish to exit the conflict in Ukraine,” condemned the U.S. administration for “disinformation,” and called President Volodymyr Zelensky’s conditions for peace “absolute nonsense.”

None of these claims about American attitudes are obvious errors of fact.  Zelensky's conditions for peace go beyond total Russian defeat and surrender.  They sound a lot like the "conditions" imposed upon Germany after the First World War. The Washington Post has reported that the Biden administration has been asking Zelensky to dial down his "conditions for peace." 

Framing the War as exclusively between Russia and Ukraine

Karatnycky's awkward--and therefore revealing--attempts to frame the war as between Ukraine and Russia, leaving the USA and even NATO out of the equation, are typical of the dominant narrative.  People who dare to suggest a negotiated peace are identified not as critics of the war but as "Ukraine critics." Americans who endorse escalation of the war are identified as "pro-Ukrainian."

NATO Expansion isn't a threat!  Really!?

Jeffrey Sachs is characterized as a "Putin cheerleader" because, as with a number of other "foreign policy realists," he "has long argued that the West provoked Russia into invading Ukraine in 2014 by virtue of the North Atlantic Treaty Organization’s 'threatening' expansion toward Russia."  Karatnycky's quotation marks around the word "threatening" are intended to display a tone of sarcasm.  Still, no matter what your politics, how can anyone look at the ongoing expansion of NATO to Russia's borders and logically conclude that the expansion of an inimical military alliance to a nation's very borders is not "threatening"?

What Jeffrey Sachs said . . .

Furthermore, beyond the threatening posture of NATO, as Sachs points out in an interview on Democracy Now, "[ . . .] the United States, very unwisely and very provocatively, contributed to the overthrow of Mr. Yanukovych in early 2014, setting in motion the tragedy before our eyes."  

What Cannot be said:  Ukraine is ethnically divided between east and west

One argument which the shaming of the authors is designed to preclude is that Ukraine is ethnically divided, as Sachs goes on to elaborate.

The "Minsk Accords" must also be denied

The resulting Minsk Accords, as we have seen, are quashed and denied in pro-war editorials, even when the narrative requires contradicting its own sources.  Sachs argues:

What happened — and this is crucial to understand — is that, in 2015, there were agreements to solve this problem by giving autonomy to these eastern regions that were predominantly ethnic Russian. And these are called the Minsk agreements, Minsk I and Minsk II.

John Bolton was in Ukraine in 2019 and reports that Volodymyr Zelensky, who was elected promising to end Ukrainian corruption and make peace with the eastern regions,  "was determined to get the Donbas back as soon as possible and end the war within the Minsk agreements" (The Room Where It Happened 457).  However, in the intervening years there has been consistent repudiation and denial of the Minsk Accords in Western and Ukrainian media.  It is as if they never existed.

The Zeitgeist:  Preparing for the historical dialectic

Karatnycky claims that "Most U.S. guests on Russian media come from the fringe."  He names Virginia State Sen. Richard Black and former United States Marine Corps intelligence officer and former United Nations Special Commission (UNSCOM) weapons inspector Scott Ritter.  However, the first name on his list of "Putin's American Cheerleaders" is Tulsi Gabbard, a former American Congresswoman and candidate in the 2020 Democratic Presidential Primaries.  In her interviews, she has a very simple and clear message:  "The world has never been closer to a nuclear war."

The rule of the historical dialectic is that the Zeitgeist will change over time and the dominant thesis of the age will give way to its antithesis.  If the rule of the dialectic holds in this case, those "fringe" arguments against escalation, which are everywhere on social media in blogs and vlogs and interviews but nowhere in the Wall Street Journal, New York Times, or Globe and Mail, may soon become the dominant Western narrative.



Tuesday 15 September 2020

The Case Against "The Case Against Reality"

The Case Against Reality

When a friend (thanks Fred!) lent me a copy of Donald Hoffman's The Case Against Reality:  Why Evolution Hid the Truth from our Eyes, it was number five or six on the list of books I was planning to read.  However, when I glanced at the preface, my curiosity got the better of me; the book jumped the queue and I started reading.  I had already heard the broad outlines of Hoffman's argument that what we call "reality" is a hologram filled with icons like the ones on the desktop of a computer.  I couldn't imagine that I would ever accept Hoffman's conclusions, but reading, like life, is about the journey not the destination.


Analytic Philosophy

Much of my undergraduate education was in analytic philosophy, meaning I was schooled to pay particular attention to the terms Hoffman used in constructing his argument. Immediately, my skepticism was aroused by his use of the words "truth" and "true."  (I have already written on the concept of "truth":  see Does Knowledge Require Truth?)  Hoffman challenges the notion that our perceptions are "true," but I'm uncertain what the expression "true perception" means.  (I'm equally uncertain about what "true love" really means.  In fact, it seems to me every time I have heard the expression "true love" being used, the speaker was being sarcastic.)  Presumably, Hoffman is invoking the correspondence theory of truth; i.e., that something is true because it corresponds to reality.  

The Definition of "Truth"

Tracing Hoffman's argument, I noted that "truth" eventually became "veridical perception."  "Veridical" means "corresponds to reality" or in more common language "truthful."  By the time I reached page 65, I was ready to pack it in because Hoffman was either using a version of "truth" that no-one believed ("corresponds to reality") or he was using a maybe-half-full-maybe-half-empty definition of "truth" ("truthful").  However, Hoffman saw me coming, and on page 67, he tells us:
Consider three notions of veridical perception.  The strongest is “omniscient realism”—we see reality as it is.  Next is “naive realism”—we see some, but not all, of reality as it is.  The weakest is “critical realism”—the structure of our perceptions preserves some of the structure of reality.  If the FBT Theorem targeted omniscient or naive realism, then we could indeed dismiss its conclusions—no one (save lunatics and solipsists) claim omniscience, and few espouse naive realism.  But the theorem targets critical realism, which is the weakest, and most widely accepted, notion of veridical observation in the science of perception and in science more broadly.  The FBT Theorem does not torch a straw man.  (67)
Consequently, in reviewing Hoffman's argument, we must keep in mind that "truth" = "veridical perception" = "critical realism."  Can Hoffman maintain the coherence of his theories when "truth" only means that our perceptions "preserve some of the structure of reality"?

Fitness Beats Truth in Human Evolution

Hoffman's premise, the "FBT Theorem," confirmed through game theory, is that "Fitness Beats Truth."  The theorem is a counter-argument to the claim that as the species has evolved we have become better at perceiving reality (or, "the truth").   FBT argues that we evolve by taking advantage of "fitness payoffs."  Our perceptions direct us toward what is useful, desirable, helpful, beneficial, not what is real and true.  Much of the book is comprised of fascinating experiments and examples of how our perceptions (in particular visual perception) construct reality rather than apprehend it.
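
To convey the flavour of the argument (this is my own toy illustration, not Hoffman's theorem or his actual simulations), imagine an agent that perceives the true quantity of a resource and assumes more is better, competing against an agent that perceives payoffs directly.  Whenever payoff is not a simple increasing function of the true quantity, tracking payoff wins:

    import math, random

    def payoff(r):
        # Fitness is non-monotonic in the true quantity r: a bell curve
        # peaking at r = 50 (too little or too much is equally bad).
        return math.exp(-((r - 50) ** 2) / 200.0)

    random.seed(0)
    truth_score = fitness_score = 0.0
    for _ in range(10000):
        # Two patches of resource with true quantities r1 and r2.
        r1, r2 = random.uniform(0, 100), random.uniform(0, 100)
        truth_score += payoff(max(r1, r2))            # "truth" strategy: pick the bigger quantity
        fitness_score += max(payoff(r1), payoff(r2))  # "fitness" strategy: pick the bigger payoff
    print(fitness_score > truth_score)  # True: tracking payoff beats tracking quantity

The point of the toy is only that perceiving what is useful and perceiving what is there can come apart; Hoffman's theorem makes a much stronger claim about perceptual strategies in general.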

Perception Versus Objective Reality

There is nothing new in the claims that we do not perceive objective reality or the "thing-in-itself" (in my day, we always used the German "Ding an sich" to preserve the expression's Kantian origins). For decades, I hammered away at students, telling them that the tree falling in the desert didn't make a sound; that colours only existed in human brains, not on walls; that the reason we can watch movies is that our visual processing is slow, and twenty-one frames per second looks like continuous action to us; that with perfect perception, walking into a room, we would perceive nothing identifiable, no chairs or people or walls, just infinite clumps of molecules and atoms; and that they had never perceived what they thought they knew best, themselves, because they had never seen their own bodies in their entirety or heard the distinct sound of their own voices.  What is distinctive about Hoffman's claims is the degree of disconnection and separation he proposes between objective reality and perception.

What Works Isn't Necessarily True

Ptolemy might have been absolutely wrong in his vision of the universe with the Earth holding steady at its centre but, as I lectured my students, that mistaken vision still worked to allow accurate calendars and navigation at sea.  Reading Hoffman, I was more than willing to accept that our perceptions could be useful without being accurate or truthful.  However, was it possible to make decisions or even claims about "fitness" when our perceptions were so far removed from any trace of objective reality?

Dialectics, Binary Thinking and Formal Logic

Once again, Hoffman anticipated my skepticism.  I have to say, this feature, its dialectics (the sense of debate echoing Plato's Socratic dialogues), made the book compelling reading for me.  Hoffman invokes "formal logic" as follows:

  • "Suppose I tell you that p is some particular claim and q is some particular claim, but I refuse to tell you what either claim is."
  • "Then suppose I make the further claim, 'p is true or q is true'."
  • Then suppose the "claim, 'if either p is true or q is true then it follows that p is true'. You know that this claim is false, even though you don't know the contents of p or q." (72)

Hoffman is invoking binary logic here, and I happen to be a fan of binary thinking.  (See Binary Thinking Versus the Other Kind.)  However, 'either p is true or q is true' depends on the fact that p and q are mutually exclusive and never overlap.  Lest we lose track of the underlying terms, fitness ("fitness payoffs") and truth ("critical realism") have not been proven to be mutually exclusive categories; in fact, there is good reason to imagine significant overlap between the two.  In other words, sometimes the connection between our perceptions and some aspect of objective reality produces "fitness payoffs."  To be clear, what Hoffman is trying to argue here is that if he can show we evolve through fitness payoffs then he has proven that we do not see "the truth."   My counter-argument is that this logic only works if "fitness payoffs" and "critical realism" are mutually exclusive; that is, only one can be true, but it is equally reasonable to conclude that both p and q are true, and fitness payoffs sometimes involve seeing objective reality as it is.
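
For readers who want the logic spelled out, here is a minimal truth-table sketch of my own (an illustration, not anything from Hoffman's book).  An inclusive "or" is true when p, q, or both are true, so "p or q" never licenses the inference to p on its own, and nothing stops both disjuncts from holding at once:

    from itertools import product

    # Enumerate every truth assignment for the claims p and q.
    for p, q in product([True, False], repeat=2):
        premise = p or q        # "p is true or q is true" (inclusive or)
        conclusion = p          # "it follows that p is true"
        if premise and not conclusion:
            print(f"counterexample: p={p}, q={q}")    # prints: p=False, q=True
        if p and q:
            print("overlap: p and q are both true")   # the disjuncts need not exclude each other

The row where p and q are both true is exactly the overlap argued for above:  "fitness payoffs" and "critical realism" can hold at the same time.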

What We Call "Reality" Is Like the Icons on a Computer Desktop

The next step in Hoffman's theorizing is what he calls ITP, "the interface theory of perception" (76, italics in original).  Hoffman's key metaphor is that what we perceive as "reality" is analogous to the blue file icon on the desktop of a computer.  Our perception is of "the interface--pixels and icons--[and] cannot describe the hardware and software it hides" (76).  Our interactions with the interface are helpful and useful, but they do not tell us anything about reality, about the computer's software or wiring.

Quantum Theory:  Consciousness Creates Reality

Hoffman is heading toward the conclusion that has become increasingly popular among sub-atomic physicists and quantum-theory wonks that consciousness produces reality rather than the other way around.  Citing physicist John Bell's experiments in the 1960s which are reported to prove that "an electron has no spin when no one looks" (54), and the broader claim that "no property, such as position or spin, has a definite value that is independent of how it is measured" (98), Hoffman reports the conclusion that "Quantum theory explains that measurements reveal no objective truths, just consequences for agents of their actions" (100).   In short, perception determines measurement.  Hoffman goes all the way in his hypothesis that consciousness creates reality, which he dubs "conscious realism" (184, italics in original).

According to "conscious realism," the interface which we typically call reality is "instantiated" by a network of conscious agents.  Conscious agents can start out as "two-bit" things, but "a realm of conscious agents [can] interact and instantiate higher agents" (192).

Science Has Failed to Explain Consciousness

Arguments privileging consciousness, including Hoffman's, seem, invariably, to point out the failure of science to explain consciousness.  Personally, I don't draw any particular conclusion from this "failure." Medical science, as Bill Bryson points out in The Body, has yet to explain asthma, along with hundreds of other medical and scientific phenomena.  The mind-over-matter cohort consistently disparages the claim that consciousness is an "emergent property" of the physical properties of the brain.  Although we can now observe what goes on in the brain during perception and can even manipulate the brain to cause certain perceptions, the claim remains that we fail to explain how perception happens.  Since I am still mystified by the "emergent property" called fire, I am perhaps too at ease accepting that consciousness is a similar "emergent property."  But I have to say, Hoffman's instantiated "higher agents" sound a lot like "emergent properties" to me.

What Is Knowable When Perception Has No Connection to Objective Reality?

The concluding chapters of The Case Against Reality, when Hoffman's prose becomes proselytizing and purple, are the least convincing and compelling.  For example, Hoffman writes: "We can, despite this poverty of translation, see a friend's smile and share their joy--because we are insiders, we know first hand what transpires behind the scene when a face fashions a genuine smile" (186).  Since any object before me, according to Hoffman, ceases to exist when I close my eyes, since even my own body is an icon which must cease to exist when I am not sensing it, how can I possibly conclude that my friend, my friend's smile or my friend's joy exist or that they are accessible to me in any "veridical" sense?

How Is the Hologram We Call "Reality" Created?

Of course, the big question is: if the world we perceive is a hologram, a series of computer-generated icons, who is making all this happen?  Hoffman does not dismiss the possibility that we are living in some alien kid's video game.  Hoffman is obviously a big fan of the film The Matrix.  Ultimately, he does get to the big Creator question:  "The idea of an infinite conscious agent sounds much like the religious notion of God, with the crucial difference that an infinite conscious agent admits precise mathematical description" (209).

Parallels with Descartes's Meditations

Throughout my reading of The Case Against Reality, the parallels with Descartes's Meditations seemed obvious.  Descartes gave himself the project of thinking that the world as he perceived it did not exist.  He even allowed that an evil demon might be deliberately confusing his perceptions.  Descartes concluded that his only certainty, in the first place, was that the "I" doing the thinking must exist: Cogito ergo sum, "I think therefore I am."  From this premise, he concluded that God must also exist.  As outlined in the Stanford Encyclopedia of Philosophy:

Descartes often compares the ontological argument to a geometric demonstration, arguing that necessary existence cannot be excluded from the idea of God anymore than the fact that its angles equal two right angles, for example, can be excluded from the idea of a triangle. The analogy underscores once again the argument’s supreme simplicity. God’s existence is purported to be as obvious and self-evident as the most basic mathematical truth.

Philosophy of Mind

The absence of any mention of Descartes in The Case Against Reality is a strange lacuna.  The problems of Hoffman's argument are similar to those Descartes faced almost 400 years ago.  The philosophy of mind permits three possibilities in the relationship between consciousness (what is mental) and the world (what is matter):  idealism (everything is mental), dualism (mind and matter both exist), materialism (everything is matter).  Each comes with its own particular problems but the challenge of dualism, in particular, is to explain how matter creates non-matter or vice versa or how the two co-exist.

The Problem from which There Is No Escape

Hoffman declares "conscious realism" to be a form of "monism," which implies that everything is mental.  If the world is entirely mental, how is the appearance of a physical world created?  Materialism may have a problem explaining how matter creates consciousness, but reversing the order, having consciousness create matter, doesn't really solve the problem.  Hoffman acknowledges that an objective reality exists, but he is left with the problem of how that objective reality is created, how conscious agents create the appearance of matter in a matter-less world.

Tuesday 28 November 2017

Deconstruction and “Ways of Talking”

Derrida denied deconstruction was of any importance

As I’ve mentioned previously, the last time I saw Jacques Derrida, who is credited with coining the term “deconstruction,” being interviewed, he was quite adamant that “deconstruction” was not a concept of any importance, not even a theory, not even a word that he used anymore.  (See "Critical Thinking Skills" and "Family Values".)  Nonetheless, the word has taken on a life of its own and, while it may have gone out of fashion, it is still with us and showing no signs of disappearing from the language.  (See footnotes.)

Postmodernist deconstructionist smuggery

If you have ever tried to confront a postmodernist deconstructionist by pointing out that his work was contradictory, illogical, duplicitous, nonsensical and hypocritical, you would likely find him responding with glee, “Exactly!”—as if he were personally responsible for your recent intellectual epiphany.  Given the deconstructionist stance that language is guaranteed to fail and is ultimately meaningless, you might wonder why Derrida seemed so happy with the dozens of books (meaningless books, obviously) he had published.  Why write at all?  If you asked your postmodernist deconstructionist friend that question, the conversation would inevitably lead to a tangential monologue about a recent grant application winning hundreds of thousands of dollars, an upcoming publication in a prestigious journal, a conference in Hawaii, and high expectations of promotion.

"Ways of talking" in The Big Picture: On the Origins of Life, Meaning and the Universe Itself

So how can we confront deconstruction?  How can we address the malaise of postmodernist deconstructionist smuggery?  Recently I found an answer in an unusual source, The Big Picture: On the Origins of Life, Meaning, and the Universe Itself, by a physicist named Sean Carroll.  The answer lies in an expression that Carroll uses quite frequently:  “ways of talking.”  However, before we get there we need to have a better grasp of what deconstruction is/was.



Deconstruction begins with "Structure, Sign and Play in the Discourse of the Human Sciences"

Whenever I taught deconstruction (no, I didn’t only teach the stuff I admired), I would focus on the definition that Derrida provided when he was being cross-examined after his seminal conference paper “Structure, Sign and Play in the Discourse of the Human Sciences” at Johns Hopkins University in 1966.  (Excuse all of my ellipses which follow but I find they are necessary if you want to pick out what Derrida is saying from the obfuscating verbiage.  I’ll put the full quote in a footnote, so you’ll know I’m not fudging.) Derrida said, “[. . . .] déconstruction [. . . . .] is simply a question  of being alert to the implications, to the historical sedimentation of the language which we use [ . . .  .].”* 

Deconstruction is a very old, and not very complicated, idea

“Being alert to the historical sedimentation of language” is good advice.  In fact, “being alert to the historical sedimentation of language” is exactly what generations of lexicographers and scholars have done, from Dr. Samuel Johnson's dictionary, begun in 1746, to the creation of The Oxford English Dictionary (OED) in the centuries that followed.  If you peruse the OED, you will notice that the meanings of words change over time, until every word in the language seems to have, on average, five or six different meanings. If you imagine a sentence in English with ten words and each of those words has five potentially different meanings, and the meaning of the sentence can be affected by connotation, figures of speech, interpretations, intertextuality, tone of voice and punctuation, you can begin to appreciate postmodernist deconstructionist claims that the language fails, that its meanings are “indeterminate,” “deferred,” even “infinite”—and therefore meaningless.
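
To put a rough number on that thought experiment (my own back-of-the-envelope arithmetic, nothing more), ten words with five senses each multiply out as follows:

    readings = 5 ** 10   # five candidate senses for each of ten words
    print(readings)      # 9765625 literal sense-combinations, before tone, irony or punctuation

Nearly ten million candidate readings for a single ten-word sentence: that is the kind of count which makes the deconstructionist way of talking about language look plausible.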

Deconstructionist ways of talking about language create meaninglessness

How do these claims work?  How is it possible that this deconstructionist idea that language fails to communicate seems so logical and convincing, even though I remain absolutely confident that when I read or hear a ten-word sentence in English I understand its meaning, even if it contains some ambiguity or irony?  The explanation I now see is that there are different “ways of talking” about language.

"Ways of talking" is a profound concept

Carroll’s description of that “innocuous sounding but secretly profound idea that there are many ways of talking about the world, each of which captures a different aspect of the underlying whole” helps us to understand how deconstructionist claims about the meaninglessness of language can be convincing even as we hold onto the strong conviction that we do manage to understand the meaning of language on a daily basis.  

The "way of talking" can determine meaning or meaninglessness

The easiest and most obvious way to reflect upon different “ways of talking” is to consider that the average human being is comprised of seven billion billion billion atoms (7 followed by 27 zeros).  Consider the claim that “I don’t understand Mary because she is comprised of billions of billions of billions of atoms and they are constantly changing.”  It’s pretty hard to argue with the science and the logic of this claim but, at the same time, it seems obvious that this is not an appropriate or meaningful way of talking about Mary or any human being for that matter.  

As Carroll explains, "There is one way of talking about the universe that describes it as elementary particles or quantum states [ . . . .] There is also another way of talking about it, where we zoom out a bit and introduce categories like ‘people’ and ‘choices’.”

Mary may, with scientific certainty, be several octillion atoms and be 99% oxygen, carbon, hydrogen, nitrogen, calcium, and phosphorus, but talking about her this way will certainly make her appear impossible to understand and, in fact, meaningless.  In truth, I can sometimes understand Mary and may sometimes misunderstand her, but overall I know that she is comprehensible and meaningful.

Deconstructionists' "way of talking" about language makes it meaningless

Similarly, postmodernist deconstructionists’ way of talking about language reduces it to marks on the page or collections of morphemes and phonemes.  This way of talking precludes understanding and meaning.  To get understanding and meaning you have to use these words in the way of talking in which people--who aren't just clumps of molecules--usually use them.



Footnotes

*”Here or there I have used the word déconstruction, which has nothing to do with destruction.  That is to say, it is simply a question of (and this is a necessity of criticism in the classical sense of the word) being alert to the implications, to the historical sedimentation of the language which we use—and that is not destructive”  (Derrida in Contemporary Literary Criticism 497).

http://www.dictionary.com/browse/deconstruction

https://dictionary.cambridge.org/dictionary/english/deconstruction

https://www.merriam-webster.com/dictionary/deconstruction

http://www.dictionary.com/browse/deconstruct

https://en.wikipedia.org/wiki/Deconstruction

Friday 13 November 2020

What is Comparative Literature?

From English to comparative literature

Equipped with a collection of degrees in English language and literature, for two decades I taught, researched and published in a field called "comparative literature."  As near as I can judge, the discipline got its English name in the early 20th century from a faulty translation of the French expression "littérature comparée."  The literature which comparativists study isn't comparative in any meaningful sense.   It would make some sense to call the subject "compared literatures" (a literal translation of "littératures comparées") or, even more obviously and aptly, "comparative studies of literature." However, we specialists learned to succumb, accepting without a whimper the terminology that got us tenure, until some first-year undergraduate asked us "what exactly does 'comparative literature' mean?"  Then we mumbled and grumbled about students who hadn't done enough reading.

Comparative literature = literary theory

It might be a stretch to describe comparative literature as influential, but whatever fashionable nonsense we didn't originate, we were quick to support and promulgate. Over the postmodern period, comparative literature became code for literary theory, and comparative literature never met a theory it didn't like enough to adopt. Whatever nascent passion a student might bring to the study of literature, you can be sure literary theory was ready to quell it.


 Identity crises


Comparative literature has been suffering from an identity crisis for about as long as it has existed  (see Gayatri Spivak's Death of a Discipline), as has the discipline of English literature (see Alvin Kernan's The Death of Literature). I have come to accept George Steiner's definition from his lecture/essay "What Is Comparative Literature?": "[...] comparative literature is an art of understanding centered in the eventuality and defeats of translation" (10).  There has been a turf war (more of a squabble really) between comparative literature and translation studies in recent decades.  Having done some translation work and research in translation studies, I came to the conclusion that Steiner got it right: comparative literature fills in the gaps in translation and tells us about what any translation is forced to leave out or leave behind.  We need a comparativist to tell us why a joke is funny in one language but not in another.

Comparative = 2 or more?

I think the expression "comparative study" means something because it suggests that the study is marked by "a consideration of at least two things."  I actually proposed this starting point at a meeting of comparativists once and was roundly told that my definition was "too narrow."  An additional irony (paradox? absurdity?):  for as long as I was active in the field there was a strident movement against explicit comparisons in the field of comparative literature on the grounds that such comparisons were out of date and smacked of "binary thinking."  (See Binary Thinking Versus the other Kind.)

Binary = bad!

In Comparative Literature: A Critical Introduction, Susan Bassnett traces the notion that “comparative literature should involve the study of two elements (études binaires)” (27) to Paul Van Tieghem’s La Littérature comparée (1931) and argues that “[i]t is possible to see almost all French comparative literature from the 1930s onward as coloured by the études binaires principle” (28).  Bassnett describes a binary approach as having served comparative literature “so ill for so long” (24) and cites the “narrowness of the binary distinction” as the first of a number of reasons that “[t]oday, comparative literature in one sense is dead” (47).

Studies of Canadian literatures in two languages = binary = bad!

In his introduction to Textual Studies in Canada 5: The Aux Canadas Issue, Robert K. Martin argues that the “binary model is no longer acceptable to many Canadians” (3). Claiming that “the paradigm of two founding nations leaves little place for the native peoples of Canada” (3), he invokes the need for Canada “to go beyond duality” (3) in order to remain open to other voices.  Insisting that it is not enough to “simply add a soupçon of otherness to an otherwise unchanged recipe” (3), Martin points out that “[t]he comparatist enterprise has too long sought to produce a paradigm with variations, without adequately recognizing how much the apparently descriptive paradigm becomes prescriptive.  If major Canadian works are like this, then one that is like that can’t possibly be major, or even Canadian” (4).

Major Canadian works of literature?

The problem with the counterfactual that Martin imagines is that the average Canadian scholar of literary studies would be hard-pressed to name a "major" Canadian work of literature and reluctant even to describe a literary work as Canadian.  The postmodern scholar would dismiss the concept of "major" or a canon of major literary works, and equally dismiss the notion of a national literature.  The postmodern project was the stalwart investigation of the eccentric and the minor in opposition to a major or mainstream national literature.  What Martin and Bassnett fail to acknowledge, which anyone who has ever touched the keyboard of a computer knows, is the incredible possibilities for refinement, subtlety, inclusion and advancement that a binary approach can offer.
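
A hedged aside on that last point (my own illustration, not Martin's or Bassnett's): each additional binary distinction doubles the number of available states, so two-way choices, compounded, are anything but crude:

    # Each further binary distinction doubles the available states.
    for bits in (1, 8, 32):
        print(bits, 2 ** bits)
    # 1 -> 2, 8 -> 256 (every possible byte), 32 -> 4294967296

Eight yes/no distinctions already give 256 categories; thirty-two give more than four billion.  "Binary" and "simplistic" are not synonyms.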


Everything old is new again!

Ultimately, literary studies, both English and comparative, was born out of an attempt to escape philology.  No doubt, historically speaking, philology has a lot of tedium and absurdity to answer for.  My career was spent studying the intersections of language(s), literature(s), culture(s) and disciplines which, everywhere I look, is a basic definition of philology.  In fact, Spivak's new comparative literature sounds a lot like philology to me.  As Sheldon Pollock points out in World Philology:  "The lowest common denominator of philology is [. . .] how to make sense of texts."  Turf wars aside, making sense of texts--which today means making sense of intertexts--has always been the lowest common denominator of literary studies, comparative studies, translation studies, and a host of other disciplines both new and old.

