admonishment to vigilance

“‘Drive,’ she said when she sat down next to Sami. ‘Where to?’

She thought for a moment. Without looking at him, she said, ‘To where the country ends.’

‘For me it ended a long time ago,’ he hissed.” –To the End of the Land, David Grossman

Each day, the least democratic thing we could imagine (or even couldn’t imagine) happening happens. And then the next day, something even less democratic happens. This is true in the political realm, and it’s increasingly apparent in technology (now that we have algorithms deciding for us what we can/will see).

“‘I went,’ he told her, ‘because every day I ask myself the same question: How can this be happening in America? How can people like these be in charge of our country? If I didn’t see it with my own eyes, I’d think I was having a hallucination.’” –The Plot against America, Philip Roth

How much of what we think and see is influenced by what we are fed pre-emptively? How can we think for ourselves and discover new things when there are limits on what we see (thanks to algorithms, and to our blindly, blithely feeding all of our own data into massive data-crunching and data-manipulating machines)? I have been thinking about this for some time. Most of the jobs I’ve had were directly within, or at least adjacent to and dependent on, the collection, analysis and use of data about users and their behaviors and habits. Many companies exist solely because of their access to data and their ability to harvest it – entire business models and applications have been created around it. But it has never been mysterious what was going on (even if most average people don’t consider the implications). I don’t know why people are now suddenly surprised.

Just as I was trying to figure out how to discuss this, an article appeared on HBR.org:

“The ability for an elite to instantly alter the thoughts and behavior of billions of people is unprecedented.

This is all possible because of algorithms. The personalized, curated news, information and learning feeds we consume several times a day have all been through a process of collaborative filtering. This is the principle that if I like X, and you and I are similar in some algorithmically determined sense, then you’ll probably like X too. Everyone gets their own, mass-personalized feed, rationed by the machines.

The consequences are serious and wide-ranging. Fake news and misinformation are pervasive. Young kids are being subjected to algorithmically generated, algorithmically optimized pernicious content. Perhaps the least concerning implication is that there is systemic bias in our information feeds, that we operate in and are informed by tiny echo chambers. It’s a grotesque irony that our experiences of the world wide web today are actually pretty local, despite warnings from the likes of Eli Pariser back in 2011.”
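The collaborative-filtering principle the article describes – if I like X, and you and I are similar in some algorithmically determined sense, you’ll probably like X too – can be sketched in a few lines. This is a toy illustration with invented users, items and ratings: it measures user similarity with cosine similarity over sparse rating vectors and recommends whatever the nearest neighbor liked.

```python
import math

# Hypothetical user -> {item: rating} data, invented for illustration.
ratings = {
    "alice": {"article_a": 5, "article_b": 3, "article_c": 4},
    "bob":   {"article_a": 5, "article_b": 3, "article_d": 5},
    "carol": {"article_c": 2, "article_d": 4, "article_e": 5},
}

def cosine_similarity(u, v):
    """Cosine similarity of two sparse rating vectors (missing items count as 0)."""
    shared = set(u) & set(v)
    if not shared:
        return 0.0
    dot = sum(u[i] * v[i] for i in shared)
    norm_u = math.sqrt(sum(r * r for r in u.values()))
    norm_v = math.sqrt(sum(r * r for r in v.values()))
    return dot / (norm_u * norm_v)

def recommend(user):
    """Return items the most similar other user rated highly and `user` hasn't seen."""
    others = [(cosine_similarity(ratings[user], ratings[o]), o)
              for o in ratings if o != user]
    _, nearest = max(others)
    seen = set(ratings[user])
    return [item for item, score in ratings[nearest].items()
            if item not in seen and score >= 4]

print(recommend("alice"))
```

Production recommenders use enormous matrices and learned embeddings rather than this hand-rolled similarity, but the rationing logic is the same in spirit: your feed is assembled from what people the machine deems similar to you have already consumed.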

My own words – base oversimplifications – are totally inadequate to deconstruct and intelligently discuss the complexity of these issues. But almost every book I read contains a warning. Almost none are direct cautionary tales in the vein of 1984, but almost all advise us to consider what we have and how easily, without vigilance, it can be lost:

“Because civilization isn’t a thing that you build and then there it is, you have it forever. It needs to be built constantly, re-created daily. It vanishes far more quickly than he ever would have thought possible. And if he wishes to live, he must do what he can to prevent the world he wants to live in from fading away. As long as there’s war, life is a preventative measure.” –The Cellist of Sarajevo, Stephen Galloway

But then, we also need to consider that the erosions and explosions of “civilization” also come about because not everyone agrees about what constitutes civilization – this fundamental disagreement poses its own dangerous fragmentation.

Data protection, use, rights and apathy

Do we have any idea what we are giving up in letting our data run free? Not really.

Watch the frightening documentary Terms and Conditions May Apply and you start to get the idea. In our race for speed, convenience, access and mobility – among other things – we are willing to sign away our rights, privacy and protections without even knowing it. And when we lack the attention span or interest to follow issues like privacy rights or the net neutrality debate in the US, we lose choice and transparency.

As John Oliver explained on his fantastic and revealing weekly HBO program, Last Week Tonight, in discussing the net neutrality subterfuge, companies can bury the information they are required to disclose to consumers – but don’t really want them to read or understand – in EULAs. Much of the time these terms and conditions are innocuous, but some are quite malicious and misleading, violating user privacy and leaving most users uninformed, having given only blind consent.

At 9:50:

“The cable companies have figured out the great truth of America: if you want to do something evil put it inside something boring. Apple could put the entire text of Mein Kampf inside the iTunes user agreement and you’d just click ‘Agree’.”

It’s one thing to just complain and worry about data collection and use – but what kinds of solutions may exist? Craig Mundie’s piece in Foreign Affairs addresses the issue: “The time has come for a new approach: shifting the focus from limiting the collection and retention of data to controlling data at the most important point — the moment when it is used.”

Some kind of change has to happen because “… there is hardly any part of one’s life that does not emit some sort of ‘data exhaust’ as a byproduct. And it has become virtually impossible for someone to know exactly how much of his data is out there or where it is stored. Meanwhile, ever more powerful processors and servers have made it possible to analyze all this data and to generate new insights and inferences about individual preferences and behavior.”
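Mundie’s shift from limiting collection to “controlling data at the most important point — the moment when it is used” could look, very roughly, like attaching an owner-approved usage policy to each record and enforcing it at every access. The class, function and policy names below are invented for illustration, not drawn from any real system:

```python
from dataclasses import dataclass, field

@dataclass
class WrappedRecord:
    """A data record bundled with the uses its owner has approved."""
    payload: dict
    approved_uses: set = field(default_factory=set)

def use_record(record, purpose):
    """Enforce the owner's policy at the moment of use, not at collection."""
    if purpose not in record.approved_uses:
        raise PermissionError(f"use '{purpose}' not approved by data owner")
    return record.payload

# A record whose owner consented to research use only (toy data).
record = WrappedRecord(
    payload={"patient_id": 42, "blood_pressure": "120/80"},
    approved_uses={"medical_research"},
)

use_record(record, "medical_research")   # permitted
# use_record(record, "advertising")      # would raise PermissionError
```

The point of the sketch is where the check happens: nothing prevents the data from being collected or stored, but any use outside the owner’s approved set fails at the moment of use.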

Interestingly, Mundie cites the introduction and eventual ubiquity of credit cards as the technology that opened the consumer-data floodgates. Did anyone imagine that the truly disruptive technology – well before the internet – was the credit card? Credit cards gave financial institutions the access to create credit reports and scores, and essentially to shape a person’s life based on their spending and saving habits; to keep tabs on their location, habits, tastes and propensities. It’s a gold mine of data that financial institutions could sell to retailers – so much opportunity for consumer exploitation. Consumers, though, have trusted that this would not happen because of data handling and storage regulations.

But once the floodgates were open and regulations were in place, the internet came along – and data privacy and rights have not kept pace with how industry and technology have changed.

The part that is most alarming, when I think about it, is that whole business models and companies are built on this virtually free access to data – its collection, manipulation, analysis, packaging and sale. How many of us are employed in industries whose bread and butter is somehow a link in that data collection and use chain?

Are the trade-offs of allowing all this data collection worth it? The Mundie article cites the public good as one reason not to do away with data collection entirely (but to limit and change it). One example: vast data sets have yielded key findings in medical research, which can benefit society as a whole. But does that supersede the right of the individual not to have their personal data used in some way to which they have not expressly consented? (Opting into a serpentine user agreement as a layperson does not really signify consent in my mind.)

The solutions Mundie proposes are interesting but fail to take personal laziness into account. People like complaining about having their privacy violated, but if taking control meant, as he suggests, that it “would also require people to constantly reevaluate what kinds of uses of their personal data they consider acceptable,” and one had to take personal responsibility for context and for assessing the value of how one’s data were used, almost no one would do it.

People do not want to evaluate at all – which is why they just say yes or no in the first place – expedience, convenience. Damn the consequences.