The Age of Surveillance Capitalism by Shoshana Zuboff

Guest post by Jason Gagnon

There is a lot to this book, and it's difficult to unpack, but it is important, so let me give it a whirl:

First- the twin streams of internet-connected, data-collecting devices and the rise of Artificial Intelligence have dark implications for the economy, for democracy, and for autonomy and privacy.

Second- the idea that "If something is free, you are the product" is close to true, but it is not how the surveillance capitalist firms actually work.  What you are is the churn in the system.  These companies - primarily Facebook, Google, and Microsoft - digest our "behavioral surplus" to predict our future behaviors and needs, and then bundle and sell those predictions.  Ever get an advert for something you had just been thinking about, but never searched for or talked about?  That's how good these predictions are getting.  Facebook and Google promise that they don't sell our personal data- and they don't - they sell predictions about us based on their intimate knowledge of us.  They track what you type and where you go (both online and in the real world), note your dwell times on images and words, and follow the active movements you make through the various corners of the web.

And they don't just spy on you using your phone and laptop.  As "the internet of things" grows, all of these devices spy on you too.  Roomba's maker has toyed with the idea of selling the maps it has made of the interiors of our homes.  Nest and other "smart home" devices know all about your comings and goings, and sell this data to third parties.  And if you refuse to share that data, these devices just don't work.  So, we capitulate. 

Third- she calls this corporate spying "Big Other."  She was a student of B. F. Skinner's at Harvard in the seventies.  Skinner was a radical behaviorist who believed that the interior lives of beings did not matter nearly as much as their behavior, and that an outside observer, a dispassionate "other" watching an organism, could fully understand its needs and actions and control its behavior.  She uses this idea as the foundational philosophy behind Surveillance Capitalism, where dispassionate AIs use their huge data sets to understand us and make predictions about our future behaviors. 

Fourth- there is a knowledge gap between us and the surveillance capitalists.  They know things that we don't even know they know.  This has implications for contracts: they are no longer social agreements, but instruments built on the knowledge - and increasingly the power - gaps between us and them.  If you miss an insurance or car payment, they can remotely disable your car's starter so that it is essentially a brick in a parking spot, and hand its location to the repo man. 

Fifth- they wear us down.  They do the thing, they get caught, they deny the thing, they keep doing it anyway, and eventually the limited benefits of the thing become assumed parts of our lives, and we forget why we were shocked at the larger violation of our privacy in the first place.

She wants us to ask Who knows? Who decides?  Who decides who decides?  Right now, we don't know these answers.  The knowledge gap has to be shrunk, and we need to have a say in who decides.  That's democracy.  If we let that go, we let go of self-government.

She also wants us to learn to stand against "inevitabilism."  This is something I have been questioning myself for years - and being called a Luddite for it.  Inevitabilism is a characteristic of utopianism.  Marx did it, as do all other utopian visionaries.  We are supposed to accept the growth of the "Internet of Things" as "inevitable," and to embrace it as good, without questioning it. 

It has to be "inevitable" for the surveillance capitalism companies because of the way the larger neo-liberal framework has developed.  If we don't keep growing, our economy, as we have structured it over the past few decades, dies.  And while inevitabilism is a hallmark of utopianism, here is something actually inevitable about utopianism: utopias always fail.  Heck, all human institutions eventually fail.  But utopians with their millennial visions like to say "but not us."  Yes, you too.  And we need to prepare for the failure of surveillance capitalism and its fully networked, total-knowledge future. 

She doesn't talk about these, but I have come across a few ideas about the Internet of Things in other books I read over the last year.  Think about a smart trash can that keeps track of everything you throw away: RFID tags in the packaging identify what you have thrown away, and maybe automatically add it to your shopping list.  There are two ways this can go - that data can be sent out into the belly of Google and Facebook, or it can be contained within your own data network for your own personal use, and that's it.  That second idea was the goal and the vision before the dot-com bubble burst.  The first idea is what we have now, and it is hard to avoid. 

And a company that isn't data mining you today can start tomorrow with an impossible-to-avoid "Terms of Service" update, written so as to be incredibly difficult to actually read.  The average online ToS is on the screen for 14 seconds before the "Accept" button is clicked.  

She has a long section on the idea of sanctuary, and how we no longer have it.  Our homes themselves have eyeballs and ears throughout them.  What does it mean when we are always "on stage"?  How does this change who we are as people?  And when we are on stage, Facebook is writing the script.  We think we are dancing our own unique dance, but they are running experiments on us.  They have admitted to it, and because they sit outside academia and government, they don't have to follow the normal rules for running experiments on human subjects.

They have openly admitted to running experiments to see if they can have an impact on voter turnout.  And they were successful.  What if they want to push an election one way or another?  They know they can have an impact on turnout larger than the margin of narrow elections.  And on personal behavior?  What if they turned off your ability to see people liking your statuses, while still showing you likes appearing on other people's updates?  What sort of impact would that have on a fifteen-year-old girl?  Even without that sort of obvious manipulation, what does living in a fully commercialized digital milieu do to us?  Besides the people who sleep under the same roof with you, do you have more "contact" with the people you care about online, or in real life?  If you're reading this in my Facebook or Goodreads post, I'm betting it's online.

This technology is dehumanizing.  And it doesn't have to be.  A digital future doesn't have to be a dehumanized future.  We can say no to this particular path that these companies have taken.  

Takeaway:  Turn your phone off when you aren't using it.  Don't get Nest, Google Home, or Alexa devices.  "Smart" is a euphemism for "spying."  Don't buy them.  Pressure your representatives to pass laws and guidelines that push back against this encroachment.

She references Hannah Arendt a great deal in this book, and for good reason.  Political totalitarianism is inescapable, while corporate totalitarianism is, at least in theory, avoidable - but as the sensors and devices expand their reach, it will become less and less so.  And in China we see the two ideas combining in the "social credit system."  It is a "coup from the top," and we need to push back.  We need space, and time, that is not fully commercialized.  We need room to be human.
