

Monday 4 January 2021

The Mystery of the "Off" Switch . . . Solved! Sort of.

Where's the "on" switch?

Just when I was feeling so smart because I'd bought myself a new fancy-pants Mac computer, I had to spend 45 minutes looking for the power switch to turn the damn thing on.  Then came the problem of figuring out the right way to turn it off.  In the early days of personal computers, people mocked the fact that you had to choose "on" to turn your computer off.  Though I thought myself more savvy when I bought my Mac, pressing the near-invisible "on" button was apparently not the right way to turn it off.

"OK Boomer"

These "OK Boomer" moments became a motif--these days people say "meme"--in my encounters with technology.  Every time I dealt with a new "app" (why does everything have to have a nickname, abbreviation, initialism or acronym which obscures its meaning?  A sour-grape complaint for another day), I ended up asking "how do you turn it off?"   The millennial response was a look of wonderment which asked "How does one communicate with an alien life form from another planet?"  The more empathetic answer was "You don't have to."

It's called a light switch!

In my world, when you walked into a darkened room, you flicked up on a switch and the lights came on.  As you left the room, you flicked down on the switch and the lights turned off.  (Granted, some electricians don't get the whole up/down thing.)  Even if "I didn't have to," when I was finished with an application, I wanted to turn it off.  Banking applications usually displayed "log out" or "sign out" options (apparently "turn off" is verboten in the big-tech world).  Banks even recommend that after I "log out," I should "empty cache."  (I should learn how to do that someday.)

Who knew there was a plan?!

Over and over again, I went looking for the "off" switch in a computer application, only to discover that it would take three clicks beyond a particular hotlink, if I was lucky enough to choose the right hotlink in the first place.  Often there simply was no off switch.  Then the application would suddenly reappear on my screen when I rebooted my computer.  To turn off the application, I would have to delete it, but even deleting it was not enough.  I would have to "uninstall" it, which meant buying "uninstall software" or deleting six or seven files in various folders, and hoping I didn't delete a file essential to the operating system.  "These are terrible design flaws," I thought naively.  Then I watched the Netflix documentary The Social Dilemma.


The Dystopia is now!

The backbone of the film is a string of interviews with insiders from big tech companies like Facebook, Google, Twitter, Instagram, YouTube, etc., etc.  In turn, they each confessed that they had created a monster.  Then, ironically, some admitted that they themselves were being consumed by the monster they had created to consume the rest of us.  They struggled to give the monster, the underlying problem, a name.  The groundwork for a near-future dystopia has been laid, they tell us.  If, today, you are a typical teenager addicted to social media, then the dystopia is now.  The tipping point, we are told, was 2011, with dramatic increases in self-harm and suicide among tween and teenage girls.

There is no off switch!

Given the scale and the stakes, the anachronistic problem of a stubborn, aging Boomer who wants an off switch seems minor, insignificant, even silly.  But it tells us what the problem is:  there is no off switch.  The epiphany which The Social Dilemma provided for me was that social-media companies measure success by one single metric:  how much time a user spends looking at a screen.  As I clicked one hyperlink after another looking for an off switch, becoming more aimlessly lost and pissed off, then went to Google and YouTube looking for a solution, all the while thinking how terribly these tech companies and designers had failed, at the other end they were celebrating their success at having gotten me to extend my screen time.  (CGP Grey claims that getting me angry is what makes social media go viral.  See This Video Will Make You Angry.)

Calling Social Media an "addiction" may be an understatement

Not surprisingly, in The Social Dilemma, the attachment to social media, particularly among the young, is described as an addiction.  Any behaviour which you are unable to control is aptly described as an addiction.  The word "addiction" brings to mind images of someone scruffy in a hoodie lurking outside a schoolyard selling cocaine and ecstasy.  The image to imagine in the case of "social-media addiction" is the user, a teenage girl for example, at one end, and, at the other end, an army of billionaire tech execs, engineers, psychologists, neurologists, designers, sociopaths and other influencers with one objective:  getting that user to keep staring at her screen.  Unfortunately, this image only brings us to the mouth of the rabbit hole.


Free Will and Determinism

In response to my post on Free Will and Determinism, two of my readers (thanks Seb and Ken!) recommended Daniel Dennett's work on the subject.  In his lecture, entitled Herding Cats and Free Will Inflation, Dennett argues that while the laws of physics may determine beginnings and endings, in between, there are inevitable instances of "degrees of freedom" (even when talking about machines).  These instances, these degrees of freedom, offer human individuals some opportunity for independence, for autonomy, for self-control.  In the abstract debate over free will, what really matters, according to Dennett, is our ability to act independently and autonomously.  In this context, Dennett underlines that "there is now a multi-billion-dollar competition among various giant companies to pull your strings, to control your attention."

Dennett argues:

The capacity of individuals and companies to distract you and to clamp your degrees of freedom so that you just don’t think about things that you really should be thinking about because you’re so distracted by all these other things which you can’t help looking at, and thinking about instead. The competition for your attention strikes at the heart of your freedom, your ability to think for yourself.

An agent who controls your attention controls you. 

Down the rabbit hole!

Further down the rabbit hole in The Social Dilemma, we learn that the "agent" controlling us isn't an individual, a group or a company.  In the first instance it is an algorithm, and inside the company only a handful of people understand the algorithms; even they don't know what the programs responding to those algorithms are doing, or how they are doing it, in real time.  The machine (server plus software) has been instructed to get users to look at screens as long as possible.  The machine has been programmed to teach itself how best to accomplish this task.  The machine learns through trial and error.  It has billions of lab rats (that would be us) and can perform millions of tests in a relatively short period of time to figure out how (based on our personal data) to keep us looking at a screen:  what works, what doesn't, and everything in between.  Based on the machine's study of my data and past behaviour, it calculates where best to hide the off switch from a Sour Boomer like me, while littering my path with images of the elderly man I would like to look like and a series of ab exercises sponsored by a local gym franchise.
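To make that "trial and error" a little more concrete, here is a minimal sketch, in Python, of the kind of feedback loop being described:  show a user one of several content variants, measure how long they stay on screen, and gradually favour whatever keeps them watching.  The variant names, the numbers and the simulate_session() function are all invented for illustration; this is a cartoon of the idea, not any company's actual system.

```python
import random

# Illustrative only: an epsilon-greedy "bandit" that keeps showing whichever
# content variant has, so far, produced the most minutes of screen time.
# Variant names and reward numbers are made up for this sketch.

VARIANTS = ["autoplay_next", "outrage_headline", "friend_notification"]
EPSILON = 0.1  # fraction of the time we try a random variant (exploration)

stats = {v: {"shows": 0, "total_minutes": 0.0} for v in VARIANTS}

def average_minutes(variant):
    s = stats[variant]
    return s["total_minutes"] / s["shows"] if s["shows"] else 0.0

def choose_variant():
    # Mostly exploit the best-known variant, occasionally explore.
    if random.random() < EPSILON:
        return random.choice(VARIANTS)
    return max(VARIANTS, key=average_minutes)

def simulate_session(variant):
    # Stand-in for a real user session; returns minutes of screen time.
    base = {"autoplay_next": 6.0,
            "outrage_headline": 8.0,
            "friend_notification": 5.0}[variant]
    return max(0.0, random.gauss(base, 2.0))

for _ in range(10_000):
    v = choose_variant()
    minutes = simulate_session(v)
    stats[v]["shows"] += 1
    stats[v]["total_minutes"] += minutes

for v in VARIANTS:
    print(f"{v}: shown {stats[v]['shows']} times, "
          f"avg {average_minutes(v):.1f} min")
```

Run for enough simulated sessions, the loop settles on whichever variant yields the longest average screen time, with no human having decided that outcome in advance.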


Understanding the stakes

Jaron Lanier asserts, and Dennett concurs, that if you think a company like Facebook wants your data so it can sell it to a third party, you have no idea what game is being played.  Your data is too useful, too valuable, to be sold to a third party.  Certainly, Facebook has proven that your data can be successfully monetized by matching it with an advertiser's products.  But even monetization isn't the whole story; after all, we are talking about companies that are already the richest companies in the history of the world.  They can change you, me, and the teenage girl who wants plastic surgery to look more like her filtered Snapchat photo.  Lanier argues that a subtle one-percent change in the world, in how we think and feel and are, is a greater measure of power than anything that can be accomplished with a few billion dollars.  Why would they do this?  For the worst of all possible reasons:  because they can.


https://www.zdnet.com/article/apple-ceo-sounds-warning-of-algorithms-pushing-society-towards-catastrophe/



"Three Days of the Condor" and the Tenth Anniversary of "The Sour Grapevine"

Sharing Intelligence I'm still obsessing over " sharing intelligence ."  May 15th was the tenth anniversary of this blog.  I w...