“‘Drive,’ she said when she sat down next to Sami. ‘Where to?’
She thought for a moment. Without looking at him, she said, ‘To where the country ends.’”
Each day, something happens that is less democratic than anything we could imagine (or even couldn’t have imagined). And then the next day, something even less democratic happens. This is true in the political realm, and it’s increasingly apparent in technology, now that algorithms decide for us what we can and will see.
“‘I went,’ he told her, ‘because every day I ask myself the same question: How can this be happening in America? How can people like these be in charge of our country? If I didn’t see it with my own eyes, I’d think I was having a hallucination.’” –The Plot against America, Philip Roth
How much of what we think and see is shaped by what we are fed pre-emptively? How can we think for ourselves and discover new things when there are limits on what we see, thanks to algorithms and to our blindly, blithely feeding all of our own data into massive data-crunching and data-manipulating machines? I have been thinking about this for some time. Most of the jobs I’ve had were directly within, or at least adjacent to and dependent on, the collection, analysis and use of data about users and their behaviors and habits. Many companies exist solely because of their access to and ability to harvest data – it has created entirely new business models and applications. But it has never been mysterious what was going on, even if most people don’t consider the implications. I don’t know why people are now suddenly surprised.
Just as I was trying to figure out how to discuss this, an article appeared on HBR.org:
“The ability for an elite to instantly alter the thoughts and behavior of billions of people is unprecedented.
This is all possible because of algorithms. The personalized, curated news, information and learning feeds we consume several times a day have all been through a process of collaborative filtering. This is the principle that if I like X, and you and I are similar in some algorithmically determined sense, then you’ll probably like X too. Everyone gets their own, mass-personalized feed, rationed by the machines.
The consequences are serious and wide-ranging. Fake news and misinformation are pervasive. Young kids are being subjected to algorithmically generated, algorithmically optimized pernicious content. Perhaps the most concerning implication is that there is systemic bias in our information feeds, that we operate in and are informed by tiny echo chambers. It’s a grotesque irony that our experiences of the world wide web today are actually pretty local, despite warnings from the likes of Eli Pariser back in 2011.”
My own words – base oversimplifications – are inadequate to deconstruct and intelligently discuss the complexity of these issues. But almost every book I read contains a warning. Few are direct cautionary tales in the vein of 1984, but almost all advise us to consider what we have and how easily, without vigilance, it can be lost:
“Because civilization isn’t a thing that you build and then there it is, you have it forever. It needs to be built constantly, re-created daily. It vanishes far more quickly than he ever would have thought possible. And if he wishes to live, he must do what he can to prevent the world he wants to live in from fading away. As long as there’s war, life is a preventative measure.” –The Cellist of Sarajevo, Stephen Galloway
But then, we also need to consider that the erosions and explosions of “civilization” come about partly because not everyone agrees about what constitutes civilization – and that fundamental disagreement produces its own dangerous fragmentation.